Server path: /openai-data | Type: Application | PCID required: Yes

Tools

Tool | Description
openai_data_add_upload_part | Adds a Part to an Upload object. A Part represents a chunk of bytes from the file you are trying to upload. Each Part can be at most 64 MB, and you can add Parts until you hit the Upload maximum of 8 GB. It is possible to add multiple Parts in parallel. You can decide the intended order of the Parts when you complete the Upload.
openai_data_cancel_batch | Cancels an in-progress batch. The batch will be in status cancelling for up to 10 minutes, before changing to cancelled, where it will have partial results (if any) available in the output file.
openai_data_cancel_eval_run | Cancel an ongoing evaluation run.
openai_data_cancel_fine_tuning_job | Immediately cancel a fine-tune job.
openai_data_cancel_upload | Cancels the Upload. No Parts may be added after an Upload is cancelled.
openai_data_cancel_vector_store_file_batch | Cancel a vector store file batch. This attempts to cancel the processing of files in this batch as soon as possible.
openai_data_complete_upload | Completes the Upload. Within the returned Upload object, there is a nested File object that is ready to use in the rest of the platform. You can specify the order of the Parts by passing in an ordered list of the Part IDs. The number of bytes uploaded upon completion must match the number of bytes initially specified when creating the Upload object. No Parts may be added after an Upload is completed.
openai_data_create_batch | Creates and executes a batch from an uploaded file of requests.
openai_data_create_eval | Create the structure of an evaluation that can be used to test a model’s performance. An evaluation is a set of testing criteria and a datasource. After creating an evaluation, you can run it on different models and model parameters. We support several types of graders and datasources. For more information, see the Evals guide.
openai_data_create_eval_run | Create a new evaluation run. This is the endpoint that will kick off grading.
openai_data_create_file | Upload a file that can be used across various endpoints. Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. The Assistants API supports files up to 2 million tokens and of specific file types. See the Assistants Tools guide for details. The Fine-tuning API only supports .jsonl files. The input also has certain required formats for fine-tuning chat or completions models. The Batch API only supports .jsonl files up to 200 MB in size. The input also has a specific required format. Please contact us if you need to increase these storage limits.
openai_data_create_fine_tuning_checkpoint_permission | NOTE: Calling this endpoint requires an admin API key. This enables organization owners to share fine-tuned models with other projects in their organization.
openai_data_create_fine_tuning_job | Creates a fine-tuning job which begins the process of creating a new model from a given dataset. The response includes details of the enqueued job, including job status and the name of the fine-tuned models once complete. Learn more about fine-tuning.
openai_data_create_upload | Creates an intermediate Upload object that you can add Parts to. Currently, an Upload can accept at most 8 GB in total and expires an hour after you create it. Once you complete the Upload, we will create a File object that contains all the parts you uploaded. This File is usable in the rest of our platform as a regular File object. For certain purpose values, the correct mime_type must be specified. Please refer to the documentation for the supported MIME types for your use case. For guidance on the proper filename extensions for each purpose, please follow the documentation on creating a File.
openai_data_create_vector_store | Create a vector store.
openai_data_create_vector_store_file | Create a vector store file by attaching a File to a vector store.
openai_data_create_vector_store_file_batch | Create a vector store file batch.
openai_data_delete_eval | Delete an evaluation.
openai_data_delete_eval_run | Delete an eval run.
openai_data_delete_file | Delete a file.
openai_data_delete_fine_tuning_checkpoint_permission | NOTE: This endpoint requires an admin API key. Organization owners can use this endpoint to delete a permission for a fine-tuned model checkpoint.
openai_data_delete_vector_store | Delete a vector store.
openai_data_delete_vector_store_file | Delete a vector store file. This will remove the file from the vector store, but the file itself will not be deleted. To delete the file, use the delete file endpoint.
openai_data_download_file | Returns the contents of the specified file.
openai_data_get_eval | Get an evaluation by ID.
openai_data_get_eval_run | Get an evaluation run by ID.
openai_data_get_eval_run_output_item | Get an evaluation run output item by ID.
openai_data_get_eval_run_output_items | Get a list of output items for an evaluation run.
openai_data_get_eval_runs | Get a list of runs for an evaluation.
openai_data_get_vector_store | Retrieves a vector store.
openai_data_get_vector_store_file | Retrieves a vector store file.
openai_data_get_vector_store_file_batch | Retrieves a vector store file batch.
openai_data_list_batches | List your organization’s batches.
openai_data_list_evals | List evaluations for a project.
openai_data_list_files | Returns a list of files.
openai_data_list_files_in_vector_store_batch | Returns a list of vector store files in a batch.
openai_data_list_fine_tuning_checkpoint_permissions | NOTE: This endpoint requires an admin API key. Organization owners can use this endpoint to view all permissions for a fine-tuned model checkpoint.
openai_data_list_fine_tuning_events | Get status updates for a fine-tuning job.
openai_data_list_fine_tuning_job_checkpoints | List checkpoints for a fine-tuning job.
openai_data_list_paginated_fine_tuning_jobs | List your organization’s fine-tuning jobs.
openai_data_list_vector_store_files | Returns a list of vector store files.
openai_data_list_vector_stores | Returns a list of vector stores.
openai_data_modify_vector_store | Modifies a vector store.
openai_data_retrieve_batch | Retrieves a batch.
openai_data_retrieve_file | Returns information about a specific file.
openai_data_retrieve_fine_tuning_job | Get info about a fine-tuning job. Learn more about fine-tuning.
openai_data_retrieve_vector_store_file_content | Retrieve the parsed contents of a vector store file.
openai_data_search_vector_store | Search a vector store for relevant chunks based on a query and a file attributes filter.
openai_data_update_eval | Update certain properties of an evaluation.
openai_data_update_vector_store_file_attributes | Update attributes on a vector store file.

openai_data_add_upload_part

Adds a Part to an Upload object. A Part represents a chunk of bytes from the file you are trying to upload. Each Part can be at most 64 MB, and you can add Parts until you hit the Upload maximum of 8 GB. It is possible to add multiple Parts in parallel. You can decide the intended order of the Parts when you complete the Upload. Parameters:
Parameter | Type | Required | Description
upload_id | string | Yes | The ID of the Upload.
data | string | Yes | The chunk of bytes for this Part.

openai_data_cancel_batch

Cancels an in-progress batch. The batch will be in status cancelling for up to 10 minutes, before changing to cancelled, where it will have partial results (if any) available in the output file. Parameters:
Parameter | Type | Required | Description
batch_id | string | Yes | The ID of the batch to cancel.

openai_data_cancel_eval_run

Cancel an ongoing evaluation run. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation whose run you want to cancel.
run_id | string | Yes | The ID of the run to cancel.

openai_data_cancel_fine_tuning_job

Immediately cancel a fine-tune job. Parameters:
Parameter | Type | Required | Description
fine_tuning_job_id | string | Yes | The ID of the fine-tuning job to cancel.

openai_data_cancel_upload

Cancels the Upload. No Parts may be added after an Upload is cancelled. Parameters:
Parameter | Type | Required | Description
upload_id | string | Yes | The ID of the Upload.

openai_data_cancel_vector_store_file_batch

Cancel a vector store file batch. This attempts to cancel the processing of files in this batch as soon as possible. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store that the file batch belongs to.
batch_id | string | Yes | The ID of the file batch to cancel.

openai_data_complete_upload

Completes the Upload. Within the returned Upload object, there is a nested File object that is ready to use in the rest of the platform. You can specify the order of the Parts by passing in an ordered list of the Part IDs. The number of bytes uploaded upon completion must match the number of bytes initially specified when creating the Upload object. No Parts may be added after an Upload is completed. Parameters:
Parameter | Type | Required | Description
upload_id | string | Yes | The ID of the Upload.
md5 | string | No | The optional md5 checksum for the file contents, to verify that the bytes uploaded match what you expect.
part_ids | string[] | Yes | The ordered list of Part IDs.
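The create → add parts → complete flow above can be sketched locally. The helpers below split a payload into Parts within the 64 MB limit and assemble the arguments for openai_data_complete_upload; the function names are illustrative, and the optional md5 is computed so the server can verify the assembled bytes.

```python
import hashlib

PART_LIMIT = 64 * 1024 * 1024            # 64 MB per Part
UPLOAD_LIMIT = 8 * 1024 * 1024 * 1024    # 8 GB per Upload

def split_into_parts(data: bytes, part_size: int = PART_LIMIT) -> list[bytes]:
    """Split raw bytes into chunks no larger than part_size."""
    if len(data) > UPLOAD_LIMIT:
        raise ValueError("payload exceeds the 8 GB Upload maximum")
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]

def complete_upload_args(upload_id: str, part_ids: list[str], data: bytes) -> dict:
    """Build the arguments for openai_data_complete_upload. The order of
    part_ids fixes the final byte order of the assembled File."""
    return {
        "upload_id": upload_id,
        "part_ids": part_ids,
        "md5": hashlib.md5(data).hexdigest(),  # optional integrity check
    }
```

Parts can be uploaded in parallel; only the part_ids list passed at completion determines the final ordering.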

openai_data_create_batch

Creates and executes a batch from an uploaded file of requests. Parameters:
Parameter | Type | Required | Description
completion_window | string | Yes | The time frame within which the batch should be processed. Currently only 24h is supported.
endpoint | string | Yes | The endpoint to be used for all requests in the batch. Currently /v1/responses, /v1/chat/completions, /v1/embeddings, and /v1/completions are supported. Note that /v1/embeddings batches are also restricted to a maximum of 50,000 embedding inputs across all requests in the batch.
input_file_id | string | Yes | The ID of an uploaded file that contains requests for the new batch. See upload file for how to upload a file. Your input file must be formatted as a JSONL file, and must be uploaded with the purpose batch. The file can contain up to 50,000 requests, and can be up to 200 MB in size.
metadata | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
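As a sketch of the batch workflow: build the JSONL input (one request object per line, following the Batch API's custom_id/method/url/body line format), upload it with purpose batch, then pass the resulting file ID to openai_data_create_batch. The helper names and the model value here are illustrative.

```python
import json

def batch_input_lines(prompts: list[str], model: str) -> str:
    """Serialize one chat-completions request per JSONL line."""
    lines = []
    for i, prompt in enumerate(prompts):
        lines.append(json.dumps({
            "custom_id": f"request-{i}",      # must be unique per line
            "method": "POST",
            "url": "/v1/chat/completions",    # must match the batch endpoint
            "body": {
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            },
        }))
    return "\n".join(lines)

def create_batch_args(input_file_id: str) -> dict:
    """Arguments for openai_data_create_batch."""
    return {
        "input_file_id": input_file_id,    # file uploaded with purpose "batch"
        "endpoint": "/v1/chat/completions",
        "completion_window": "24h",        # currently the only supported value
    }
```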

openai_data_create_eval

Create the structure of an evaluation that can be used to test a model’s performance. An evaluation is a set of testing criteria and a datasource. After creating an evaluation, you can run it on different models and model parameters. We support several types of graders and datasources. For more information, see the Evals guide. Parameters:
Parameter | Type | Required | Description
data_source_config | object | Yes | The configuration for the data source used for the evaluation runs.
metadata | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
name | string | No | The name of the evaluation.
testing_criteria | any[] | Yes | A list of graders for all eval runs in this group.

openai_data_create_eval_run

Create a new evaluation run. This is the endpoint that will kick off grading. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to create a run for.
data_source | object | Yes | Details about the run’s data source.
metadata | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
name | string | No | The name of the run.

openai_data_create_file

Upload a file that can be used across various endpoints. Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. The Assistants API supports files up to 2 million tokens and of specific file types. See the Assistants Tools guide for details. The Fine-tuning API only supports .jsonl files. The input also has certain required formats for fine-tuning chat or completions models. The Batch API only supports .jsonl files up to 200 MB in size. The input also has a specific required format. Please contact us if you need to increase these storage limits. Parameters:
Parameter | Type | Required | Description
file | string | Yes | The File object (not file name) to be uploaded.
purpose | string | Yes | The intended purpose of the uploaded file. One of: assistants (used in the Assistants API), batch (used in the Batch API), fine-tune (used for fine-tuning), vision (images used for vision fine-tuning), user_data (flexible file type for any purpose), evals (used for eval data sets).

openai_data_create_fine_tuning_checkpoint_permission

NOTE: Calling this endpoint requires an admin API key. This enables organization owners to share fine-tuned models with other projects in their organization. Parameters:
Parameter | Type | Required | Description
fine_tuned_model_checkpoint | string | Yes | The ID of the fine-tuned model checkpoint to create a permission for.
project_ids | string[] | Yes | The project identifiers to grant access to.

openai_data_create_fine_tuning_job

Creates a fine-tuning job which begins the process of creating a new model from a given dataset. The response includes details of the enqueued job, including job status and the name of the fine-tuned models once complete. Learn more about fine-tuning. Parameters:
Parameter | Type | Required | Description
hyperparameters | object | No | The hyperparameters used for the fine-tuning job. This value is now deprecated in favor of method, and should be passed in under the method parameter.
integrations | object[] | No | A list of integrations to enable for your fine-tuning job.
metadata | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
method | object | No | The method used for fine-tuning.
model | object | Yes | The name of the model to fine-tune. You can select one of the supported models.
seed | integer | No | The seed controls the reproducibility of the job. Passing in the same seed and job parameters should produce the same results, but may differ in rare cases. If a seed is not specified, one will be generated for you.
suffix | string | No | A string of up to 64 characters that will be added to your fine-tuned model name. For example, a suffix of "custom-model-name" would produce a model name like ft:gpt-4o-mini:openai:custom-model-name:7p4lURel.
training_file | string | Yes | The ID of an uploaded file that contains training data. See upload file for how to upload a file. Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with the purpose fine-tune. The contents of the file should differ depending on whether the model uses the chat or completions format, or whether the fine-tuning method uses the preference format. See the fine-tuning guide for more details.
validation_file | string | No | The ID of an uploaded file that contains validation data. If you provide this file, the data is used to generate validation metrics periodically during fine-tuning. These metrics can be viewed in the fine-tuning results file. The same data should not be present in both train and validation files. Your dataset must be formatted as a JSONL file. You must upload your file with the purpose fine-tune. See the fine-tuning guide for more details.
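Before uploading a chat-format training_file, it can help to sanity-check the JSONL locally. The sketch below mirrors the format described above (one JSON object with a non-empty "messages" list per line); it is an informal check, not an official validator, and covers only the chat format.

```python
import json

def check_chat_jsonl(text: str) -> list[str]:
    """Return a list of problems found; an empty list means the
    chat-format dataset looks structurally usable."""
    problems = []
    for n, line in enumerate(text.splitlines(), start=1):
        try:
            row = json.loads(line)
        except json.JSONDecodeError:
            problems.append(f"line {n}: not valid JSON")
            continue
        msgs = row.get("messages")
        if not isinstance(msgs, list) or not msgs:
            problems.append(f"line {n}: missing non-empty 'messages' list")
            continue
        for m in msgs:
            if not isinstance(m, dict) or "role" not in m or "content" not in m:
                problems.append(f"line {n}: each message needs 'role' and 'content'")
                break
    return problems
```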

openai_data_create_upload

Creates an intermediate Upload object that you can add Parts to. Currently, an Upload can accept at most 8 GB in total and expires an hour after you create it. Once you complete the Upload, we will create a File object that contains all the parts you uploaded. This File is usable in the rest of our platform as a regular File object. For certain purpose values, the correct mime_type must be specified. Please refer to the documentation for the supported MIME types for your use case. For guidance on the proper filename extensions for each purpose, please follow the documentation on creating a File. Parameters:
Parameter | Type | Required | Description
bytes | integer | Yes | The number of bytes in the file you are uploading.
filename | string | Yes | The name of the file to upload.
mime_type | string | Yes | The MIME type of the file. This must fall within the supported MIME types for your file purpose. See the supported MIME types for assistants and vision.
purpose | string | Yes | The intended purpose of the uploaded file. See the documentation on File purposes.

openai_data_create_vector_store

Create a vector store. Parameters:
Parameter | Type | Required | Description
chunking_strategy | object | No | The chunking strategy used to chunk the file(s). If not set, will use the auto strategy. Only applicable if file_ids is non-empty.
expires_after | object | No | The expiration policy for a vector store.
file_ids | string[] | No | A list of File IDs that the vector store should use. Useful for tools like file_search that can access files.
metadata | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
name | string | No | The name of the vector store.

openai_data_create_vector_store_file

Create a vector store file by attaching a File to a vector store. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store for which to create a File.
attributes | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters, booleans, or numbers.
chunking_strategy | object | No | The chunking strategy used to chunk the file(s). If not set, will use the auto strategy.
file_id | string | Yes | A File ID that the vector store should use. Useful for tools like file_search that can access files.

openai_data_create_vector_store_file_batch

Create a vector store file batch. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store for which to create a File Batch.
attributes | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters, booleans, or numbers.
chunking_strategy | object | No | The chunking strategy used to chunk the file(s). If not set, will use the auto strategy.
file_ids | string[] | Yes | A list of File IDs that the vector store should use. Useful for tools like file_search that can access files.
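Putting the two calls together, a typical setup is: create the store, then attach already-uploaded files in one batch. The sketch below returns the (tool name, arguments) pairs in call order; the tool names match this server, but the expires_after policy shape and the placeholder IDs are assumptions for illustration.

```python
def vector_store_setup(name: str, file_ids: list[str]) -> list[tuple[str, dict]]:
    """Return (tool_name, arguments) pairs for a vector store setup."""
    return [
        ("openai_data_create_vector_store", {
            "name": name,
            # Optional policy: expire the store after 7 idle days
            # (assumed anchor value; check the expiration-policy schema).
            "expires_after": {"anchor": "last_active_at", "days": 7},
        }),
        ("openai_data_create_vector_store_file_batch", {
            # vector_store_id comes from the create call's response;
            # a placeholder stands in here.
            "vector_store_id": "vs_PLACEHOLDER",
            "file_ids": file_ids,
            "chunking_strategy": {"type": "auto"},  # the default, made explicit
        }),
    ]
```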

openai_data_delete_eval

Delete an evaluation. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to delete.

openai_data_delete_eval_run

Delete an eval run. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to delete the run from.
run_id | string | Yes | The ID of the run to delete.

openai_data_delete_file

Delete a file. Parameters:
Parameter | Type | Required | Description
file_id | string | Yes | The ID of the file to use for this request.

openai_data_delete_fine_tuning_checkpoint_permission

NOTE: This endpoint requires an admin API key. Organization owners can use this endpoint to delete a permission for a fine-tuned model checkpoint. Parameters:
Parameter | Type | Required | Description
fine_tuned_model_checkpoint | string | Yes | The ID of the fine-tuned model checkpoint to delete a permission for.
permission_id | string | Yes | The ID of the fine-tuned model checkpoint permission to delete.

openai_data_delete_vector_store

Delete a vector store. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store to delete.

openai_data_delete_vector_store_file

Delete a vector store file. This will remove the file from the vector store but the file itself will not be deleted. To delete the file, use the delete file endpoint. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store that the file belongs to.
file_id | string | Yes | The ID of the file to delete.

openai_data_download_file

Returns the contents of the specified file. Parameters:
Parameter | Type | Required | Description
file_id | string | Yes | The ID of the file to use for this request.

openai_data_get_eval

Get an evaluation by ID. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to retrieve.

openai_data_get_eval_run

Get an evaluation run by ID. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to retrieve runs for.
run_id | string | Yes | The ID of the run to retrieve.

openai_data_get_eval_run_output_item

Get an evaluation run output item by ID. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to retrieve runs for.
run_id | string | Yes | The ID of the run to retrieve.
output_item_id | string | Yes | The ID of the output item to retrieve.

openai_data_get_eval_run_output_items

Get a list of output items for an evaluation run. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to retrieve runs for.
run_id | string | Yes | The ID of the run to retrieve output items for.
after | string | No | Identifier for the last output item from the previous pagination request.
limit | integer | No | Number of output items to retrieve.
status | string | No | Filter output items by status. Use failed to filter by failed output items or pass to filter by passed output items.
order | string | No | Sort order for output items by timestamp. Use asc for ascending order or desc for descending order. Defaults to asc.

openai_data_get_eval_runs

Get a list of runs for an evaluation. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to retrieve runs for.
after | string | No | Identifier for the last run from the previous pagination request.
limit | integer | No | Number of runs to retrieve.
order | string | No | Sort order for runs by timestamp. Use asc for ascending order or desc for descending order. Defaults to asc.
status | string | No | Filter runs by status. One of queued, in_progress, failed, completed, canceled.

openai_data_get_vector_store

Retrieves a vector store. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store to retrieve.

openai_data_get_vector_store_file

Retrieves a vector store file. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store that the file belongs to.
file_id | string | Yes | The ID of the file being retrieved.

openai_data_get_vector_store_file_batch

Retrieves a vector store file batch. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store that the file batch belongs to.
batch_id | string | Yes | The ID of the file batch being retrieved.

openai_data_list_batches

List your organization’s batches. Parameters:
Parameter | Type | Required | Description
after | string | No | A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
limit | integer | No | A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
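The after-cursor pattern described above is shared by all of this server's list tools: keep passing the last object's ID as after until a page comes back short. A generic sketch, where fetch_page stands in for any real list tool call (it is purely illustrative):

```python
def list_all(fetch_page, limit: int = 100) -> list[dict]:
    """Drain a cursor-paginated list endpoint into one Python list."""
    items, after = [], None
    while True:
        page = fetch_page(limit=limit, after=after)  # e.g. openai_data_list_batches
        items.extend(page)
        if len(page) < limit:      # a short page means nothing is left
            return items
        after = page[-1]["id"]     # cursor = ID of the last object seen
```

Some responses also carry an explicit has_more flag, which can replace the short-page check when available.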

openai_data_list_evals

List evaluations for a project. Parameters:
Parameter | Type | Required | Description
after | string | No | Identifier for the last eval from the previous pagination request.
limit | integer | No | Number of evals to retrieve.
order | string | No | Sort order for evals by timestamp. Use asc for ascending order or desc for descending order.
order_by | string | No | Evals can be ordered by creation time or last updated time. Use created_at for creation time or updated_at for last updated time.

openai_data_list_files

Returns a list of files. Parameters:
Parameter | Type | Required | Description
purpose | string | No | Only return files with the given purpose.
limit | integer | No | A limit on the number of objects to be returned. Limit can range between 1 and 10,000, and the default is 10,000.
order | string | No | Sort order by the created_at timestamp of the objects. asc for ascending order and desc for descending order.
after | string | No | A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.

openai_data_list_files_in_vector_store_batch

Returns a list of vector store files in a batch. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store that the files belong to.
batch_id | string | Yes | The ID of the file batch that the files belong to.
limit | integer | No | A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
order | string | No | Sort order by the created_at timestamp of the objects. asc for ascending order and desc for descending order.
after | string | No | A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
before | string | No | A cursor for use in pagination. before is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, starting with obj_foo, your subsequent call can include before=obj_foo in order to fetch the previous page of the list.
filter | string | No | Filter by file status. One of in_progress, completed, failed, cancelled.

openai_data_list_fine_tuning_checkpoint_permissions

NOTE: This endpoint requires an admin API key. Organization owners can use this endpoint to view all permissions for a fine-tuned model checkpoint. Parameters:
Parameter | Type | Required | Description
fine_tuned_model_checkpoint | string | Yes | The ID of the fine-tuned model checkpoint to get permissions for.
project_id | string | No | The ID of the project to get permissions for.
after | string | No | Identifier for the last permission ID from the previous pagination request.
limit | integer | No | Number of permissions to retrieve.
order | string | No | The order in which to retrieve permissions.

openai_data_list_fine_tuning_events

Get status updates for a fine-tuning job. Parameters:
Parameter | Type | Required | Description
fine_tuning_job_id | string | Yes | The ID of the fine-tuning job to get events for.
after | string | No | Identifier for the last event from the previous pagination request.
limit | integer | No | Number of events to retrieve.

openai_data_list_fine_tuning_job_checkpoints

List checkpoints for a fine-tuning job. Parameters:
Parameter | Type | Required | Description
fine_tuning_job_id | string | Yes | The ID of the fine-tuning job to get checkpoints for.
after | string | No | Identifier for the last checkpoint ID from the previous pagination request.
limit | integer | No | Number of checkpoints to retrieve.

openai_data_list_paginated_fine_tuning_jobs

List your organization’s fine-tuning jobs. Parameters:
Parameter | Type | Required | Description
after | string | No | Identifier for the last job from the previous pagination request.
limit | integer | No | Number of fine-tuning jobs to retrieve.
metadata | object | No | Optional metadata filter. To filter, use the syntax metadata[k]=v. Alternatively, set metadata=null to indicate no metadata.

openai_data_list_vector_store_files

Returns a list of vector store files. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store that the files belong to.
limit | integer | No | A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
order | string | No | Sort order by the created_at timestamp of the objects. asc for ascending order and desc for descending order.
after | string | No | A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
before | string | No | A cursor for use in pagination. before is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, starting with obj_foo, your subsequent call can include before=obj_foo in order to fetch the previous page of the list.
filter | string | No | Filter by file status. One of in_progress, completed, failed, cancelled.

openai_data_list_vector_stores

Returns a list of vector stores. Parameters:
Parameter | Type | Required | Description
limit | integer | No | A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
order | string | No | Sort order by the created_at timestamp of the objects. asc for ascending order and desc for descending order.
after | string | No | A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
before | string | No | A cursor for use in pagination. before is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, starting with obj_foo, your subsequent call can include before=obj_foo in order to fetch the previous page of the list.

openai_data_modify_vector_store

Modifies a vector store. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store to modify.
expires_after | object | No | The expiration policy for the vector store.
metadata | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
name | string | No | The name of the vector store.

openai_data_retrieve_batch

Retrieves a batch. Parameters:
Parameter | Type | Required | Description
batch_id | string | Yes | The ID of the batch to retrieve.

openai_data_retrieve_file

Returns information about a specific file. Parameters:
Parameter | Type | Required | Description
file_id | string | Yes | The ID of the file to use for this request.

openai_data_retrieve_fine_tuning_job

Get info about a fine-tuning job. Learn more about fine-tuning. Parameters:
Parameter | Type | Required | Description
fine_tuning_job_id | string | Yes | The ID of the fine-tuning job.

openai_data_retrieve_vector_store_file_content

Retrieve the parsed contents of a vector store file. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store.
file_id | string | Yes | The ID of the file within the vector store.

openai_data_search_vector_store

Search a vector store for relevant chunks based on a query and a file attributes filter. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store to search.
filters | object | No | A filter to apply based on file attributes.
max_num_results | integer | No | The maximum number of results to return. This number should be between 1 and 50 inclusive.
query | string[] | Yes | A query string for the search.
ranking_options | object | No | Ranking options for search.
rewrite_query | boolean | No | Whether to rewrite the natural language query for vector search.
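As an illustration, here is a plausible argument payload for openai_data_search_vector_store that restricts results to files tagged with a given category attribute. The type/key/value comparison shape for filters is an assumption about the attribute-filter schema, and the category attribute is hypothetical; adjust both to your actual data.

```python
def search_args(vector_store_id: str, query: str, category: str) -> dict:
    """Arguments for openai_data_search_vector_store (filter shape assumed)."""
    return {
        "vector_store_id": vector_store_id,
        "query": [query],                # the parameter table types query as string[]
        "max_num_results": 10,           # must be between 1 and 50 inclusive
        "filters": {                     # match files where attributes.category == category
            "type": "eq",
            "key": "category",
            "value": category,
        },
        "rewrite_query": True,           # let the server rewrite the natural-language query
    }
```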

openai_data_update_eval

Update certain properties of an evaluation. Parameters:
Parameter | Type | Required | Description
eval_id | string | Yes | The ID of the evaluation to update.
metadata | object | No | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
name | string | No | Rename the evaluation.

openai_data_update_vector_store_file_attributes

Update attributes on a vector store file. Parameters:
Parameter | Type | Required | Description
vector_store_id | string | Yes | The ID of the vector store the file belongs to.
file_id | string | Yes | The ID of the file to update attributes for.
attributes | object | Yes | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters, booleans, or numbers.
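The attributes constraints repeated throughout these tables (at most 16 pairs; keys up to 64 characters; values that are strings up to 512 characters, booleans, or numbers) can be enforced client-side before calling the update tool. A minimal sketch of such a check:

```python
def validate_attributes(attrs: dict) -> None:
    """Raise ValueError if attrs violates the documented limits:
    at most 16 pairs, keys <= 64 chars, values string (<= 512 chars),
    boolean, or number."""
    if len(attrs) > 16:
        raise ValueError("at most 16 key-value pairs allowed")
    for key, value in attrs.items():
        if not isinstance(key, str) or len(key) > 64:
            raise ValueError(f"key {key!r} must be a string of at most 64 characters")
        if isinstance(value, str):
            if len(value) > 512:
                raise ValueError(f"value for {key!r} exceeds 512 characters")
        elif not isinstance(value, (bool, int, float)):
            raise ValueError(f"value for {key!r} must be a string, boolean, or number")
```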