Version: v1.4.1

Inference

Run predictions on deployed models.

All methods are accessed via client.inference.

predict()

POST/v1/predict

Predicts the target column of a dataset.

Parameters

filename (str, required)
Path to the file to run predictions on.
model_id (str, required)
The ID of the model.
version_id (str, required)
The ID of the model version.
threshold (float, default: 0.5)
The classification threshold. Applies to classification models only.
delimiter (str, default: ',')
The delimiter used in the file.

Returns

dict: The prediction results.

Example

result = client.inference.predict(
    filename="./data.csv",
    model_id="model_abc123",
    version_id="version_xyz789",
    threshold=0.5,
    delimiter=","
)
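The threshold parameter only matters for classification models: predicted probabilities at or above the threshold map to the positive class, the rest to the negative class. A minimal sketch of that mapping (the apply_threshold helper below is illustrative, not part of the SDK):

```python
# Illustrative only: shows how a classification threshold converts
# predicted probabilities into class labels. Not an SDK function.

def apply_threshold(probabilities, threshold=0.5):
    """Map each probability to 1 if it meets the threshold, else 0."""
    return [1 if p >= threshold else 0 for p in probabilities]

labels = apply_threshold([0.2, 0.5, 0.91], threshold=0.5)
print(labels)  # [0, 1, 1]
```

Raising the threshold trades recall for precision: fewer rows are labeled positive, but with higher confidence.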

stream_predictions()

GET/v1/predict

Stream predictions for large datasets by processing in batches.

Parameters

filename (str, required)
Path to the CSV file to stream.
model_id (str, required)
The ID of the model.
version_id (str, required)
The ID of the model version.
threshold (float, default: 0.5)
The classification threshold.
delimiter (str, default: ',')
The CSV delimiter.
batch_size (int, default: 1000)
The number of rows to process per batch.

Example

result = client.inference.stream_predictions(
    filename="./data.csv",
    model_id="model_abc123",
    version_id="version_xyz789",
    threshold=0.5,
    delimiter=",",
    batch_size=1000
)
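stream_predictions() processes the file in chunks of batch_size rows instead of loading the entire dataset into memory at once. The batching idea can be sketched as follows (iter_batches is a hypothetical helper for illustration, not the SDK's internal implementation):

```python
import csv
import io

def iter_batches(file_obj, batch_size=1000, delimiter=","):
    """Yield lists of up to batch_size CSV rows from a file object."""
    reader = csv.reader(file_obj, delimiter=delimiter)
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Small in-memory CSV: a header row plus three data rows.
data = io.StringIO("a,b\n1,2\n3,4\n5,6\n")
sizes = [len(b) for b in iter_batches(data, batch_size=2)]
print(sizes)  # [2, 2]
```

A larger batch_size reduces per-batch overhead at the cost of memory; the default of 1000 is a middle ground for typical tabular data.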