To perform a real-time prediction, construct a dictionary with the model's feature names as keys and the corresponding values as data.
```python
import os

# Construct a single sample; keys must match the model's feature schema
sample = {
    "tenure": 30,
    "contract": "One year",
    "internetservice": "DSL",
    # Add other fields as required by your model schema
}

# Inference on a single data point
prediction = client.predict(sample, model_id=os.getenv("VIEWAI_MODEL_ID"))

# Print the class probabilities
print(prediction.get_probabilities())
```
2. Inference on a Dataset
For batch predictions, load your dataset (e.g., from a CSV file) and convert it to a list of dictionaries, one per row.
```python
import os

import pandas as pd

# Read the dataset to score
scoring_dataset = pd.read_csv("datasets/score_churn.csv")

# Batch inference: one prediction per row
predictions = client.predict(
    scoring_dataset.to_dict(orient="records"), model_id=os.getenv("VIEWAI_MODEL_ID")
)
```
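For reference, `to_dict(orient="records")` is a standard pandas method that turns a DataFrame into a list of row dictionaries, which is the shape `client.predict` expects for batch input. A minimal sketch with a toy frame standing in for the churn dataset (the column names here are illustrative, not the full model schema):

```python
import pandas as pd

# A small frame standing in for scoring_dataset
df = pd.DataFrame({
    "tenure": [30, 2],
    "contract": ["One year", "Month-to-month"],
})

# One dictionary per row, keyed by column name
records = df.to_dict(orient="records")
print(records)
# e.g. [{'tenure': 30, 'contract': 'One year'}, ...]
```

Each record has the same key-value shape as the single-sample dictionary above, so the same model can score either form.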