POST https://konbert.io/api/v1/pipelines/sqlify/convert/run

Converts a file from one format to another. The input format can be csv or json, and it will be automatically detected if not specified. The output format can be csv, sql or json. Column types can be string, integer, float or json.
The above command returns JSON structured like this:
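As a sketch, the convert endpoint could be called like this from Python. The payload field names ("input", "output") and the bearer-token auth header are assumptions for illustration, not names confirmed by these docs:

```python
import json
import urllib.request

# Hypothetical request body: the field names "input" and "output"
# are assumptions, not documented names.
payload = {
    "input": {"format": "csv"},   # optional: the input format is auto-detected
    "output": {"format": "sql"},  # csv, sql or json
}

req = urllib.request.Request(
    "https://konbert.io/api/v1/pipelines/sqlify/convert/run",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",  # auth scheme is an assumption
    },
    method="POST",
)
# with urllib.request.urlopen(req) as resp:  # uncomment to actually send
#     print(json.load(resp))
```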
GET https://konbert.com/api/v1/files
The above command returns JSON structured like this:
GET https://konbert.com/api/v1/files/${FILE_ID}
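For instance, the two file endpoints above could be called like this; the bearer-token auth header and the placeholder file ID are assumptions:

```python
import urllib.request

API = "https://konbert.com/api/v1"
file_id = "YOUR_FILE_ID"  # placeholder, not a real ID

def authed_get(url):
    # Build a GET request carrying the (assumed) bearer-token auth header.
    return urllib.request.Request(
        url,
        headers={"Authorization": "Bearer YOUR_API_KEY"},
        method="GET",
    )

list_files_req = authed_get(f"{API}/files")           # list all files
get_file_req = authed_get(f"{API}/files/{file_id}")   # fetch a single file
# urllib.request.urlopen(list_files_req)  # uncomment to actually send
```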
POST https://konbert.com/api/v1/files

Creates a new file. The above command returns JSON structured like this:
POST https://konbert.com/api/v1/files/${FILE_ID}/data

Uploads the file contents in chunks. This endpoint returns 201 Created when the last chunk is uploaded, and 200 Success for the rest of the chunks.

POST https://konbert.com/api/v1/files/${FILE_ID}

Updates a file. Column types can be string, integer, float or json.
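The chunked upload to /files/${FILE_ID}/data above can be sketched as follows; the chunk size, content type and auth header are assumptions, not documented values:

```python
import urllib.request

API = "https://konbert.com/api/v1"
file_id = "YOUR_FILE_ID"    # placeholder, not a real ID
CHUNK_SIZE = 1024 * 1024    # 1 MiB per chunk; the real limit is not documented here

def chunk_requests(data: bytes):
    # One POST per chunk. Per the docs, the server answers 200 Success for
    # intermediate chunks and 201 Created once the last chunk is uploaded.
    for offset in range(0, len(data), CHUNK_SIZE):
        yield urllib.request.Request(
            f"{API}/files/{file_id}/data",
            data=data[offset:offset + CHUNK_SIZE],
            headers={
                "Content-Type": "application/octet-stream",
                "Authorization": "Bearer YOUR_API_KEY",
            },
            method="POST",
        )

# Splitting 2 MiB + 10 bytes of data yields three chunk requests.
reqs = list(chunk_requests(b"x" * (2 * CHUNK_SIZE + 10)))
```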
A run can have one of four statuses: queued, running, failed or completed. The run has finished once its status is failed or completed. A successful run ends with status completed; a failed run includes an error code such as limit_exceeded.
GET https://konbert.com/api/v1/runs/${RUN_ID}

Returns the current status of a run.
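Since a run is finished only when its status reaches failed or completed, a client typically polls the run endpoint. A minimal sketch, assuming the JSON response carries a top-level "status" field:

```python
import json
import time
import urllib.request

API = "https://konbert.com/api/v1"

def wait_for_run(run_id, fetch=None, poll_interval=2.0):
    # Poll GET /runs/${RUN_ID} until the run reaches a terminal status
    # (failed or completed). `fetch` can be injected for testing.
    if fetch is None:
        def fetch(url):
            with urllib.request.urlopen(url) as resp:
                return json.load(resp)
    while True:
        run = fetch(f"{API}/runs/{run_id}")
        if run["status"] in ("failed", "completed"):
            return run
        time.sleep(poll_interval)
```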
The sqlify/detect-schema step analyzes the file and writes the schema to the specified file. The sqlify/export-* steps take the data, cast it to the schema and encode it in the desired format.

The sqlify/convert-to-csv pipeline runs sqlify/detect-schema followed by sqlify/export-csv.
The sqlify/convert-to-json pipeline runs sqlify/detect-schema followed by sqlify/export-json.
The sqlify/convert-to-sql pipeline runs sqlify/detect-schema followed by sqlify/export-sql.

sqlify/detect-schema will not overwrite an existing schema unless the rewrite_schema option is passed, so you can specify the schema in your file and run the pipeline safely.

GET https://konbert.com/api/v1/pipelines/${PIPELINE}/run
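A pipeline run request could then be built like this; the query parameter names ("file_id", "rewrite_schema") and the auth header are assumptions for illustration:

```python
import urllib.parse
import urllib.request

API = "https://konbert.com/api/v1"
pipeline = "sqlify/convert-to-json"  # or sqlify/convert-to-csv, sqlify/convert-to-sql

# Hypothetical query parameters: "file_id" and "rewrite_schema"
# are assumed names, not documented ones.
query = urllib.parse.urlencode({"file_id": "YOUR_FILE_ID", "rewrite_schema": "true"})
req = urllib.request.Request(
    f"{API}/pipelines/{pipeline}/run?{query}",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    method="GET",
)
# urllib.request.urlopen(req)  # uncomment to actually send
```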