Inference

1. Inference on a Single Data Point

To perform a real-time prediction, construct a dictionary with feature names as keys and the corresponding feature values as values.

import os

# `client` is the View AI client initialized in the Quickstart
sample = {
    "tenure": 30,
    "contract": "One year",
    "internetservice": "DSL",
    # Add other fields as required by your model schema
}

# Inference on a single data point
prediction = client.predict(sample, model_id=os.getenv("VIEWAI_MODEL_ID"))

# Print the class probabilities
print(prediction.get_probabilities())
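If your data already lives in a pandas DataFrame, a single row can be converted into the required feature dictionary with `Series.to_dict()`. A minimal sketch with illustrative column names and values (replace them with the fields your model schema expects):

```python
import pandas as pd

# Illustrative data; substitute your own dataset
df = pd.DataFrame(
    {
        "tenure": [30, 12],
        "contract": ["One year", "Month-to-month"],
        "internetservice": ["DSL", "Fiber optic"],
    }
)

# Convert the first row into a {feature: value} dictionary,
# the payload shape expected for single-point inference
sample = df.iloc[0].to_dict()
```

The resulting `sample` can then be passed to `client.predict` exactly as shown above.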

2. Inference on a Dataset

For batch predictions, load your dataset (for example, from a CSV file) and convert it to a list of dictionaries, one per row.

# Read the dataset
import os

import pandas as pd

scoring_dataset = pd.read_csv("datasets/score_churn.csv")

# Batch inference
client.predict(
    scoring_dataset.to_dict(orient="records"), model_id=os.getenv("VIEWAI_MODEL_ID")
)
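The `to_dict(orient="records")` call is what turns the DataFrame into the list-of-dictionaries payload the client expects: one dictionary per row, keyed by column name. A quick illustration with made-up data:

```python
import pandas as pd

# Illustrative data standing in for the scoring dataset
df = pd.DataFrame(
    {"tenure": [30, 12], "contract": ["One year", "Month-to-month"]}
)

# One {column: value} dictionary per row
records = df.to_dict(orient="records")
```

Here `records` has two entries, matching the two rows of the DataFrame.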

Last updated 1 year ago
