Quick start
Let's get started with BlindAI and deploy an AI model in minutes!
This section guides you through deploying your first model with the BlindAI inference server. We will use the DistilBERT model for demonstration purposes and go through three main phases:

1. Run the BlindAI Server

Launching the BlindAI inference server.

2. Prepare the model

Loading the model and exporting it in ONNX format.

3. Send and run the model

Sending the model to the BlindAI server and running the inference.
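As a preview of step 2, the ONNX export can be sketched as below. This is a minimal sketch, assuming Hugging Face `transformers` and `torch` are installed; the model name and file name are illustrative, not prescribed by BlindAI.

```python
# Sketch of step 2: export DistilBERT to ONNX.
# Assumes `transformers` and `torch` are installed; the checkpoint name
# "distilbert-base-uncased" and the output path are illustrative choices.
import torch
from transformers import DistilBertForSequenceClassification, DistilBertTokenizer

model = DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased")
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model.eval()

# Trace the model with a dummy input and serialize it to ONNX.
dummy_input = tokenizer("Hello, world!", return_tensors="pt")["input_ids"]
torch.onnx.export(
    model,
    (dummy_input,),
    "distilbert.onnx",
    input_names=["input"],
    output_names=["output"],
    # Allow variable batch size and sequence length at inference time.
    dynamic_axes={"input": {0: "batch", 1: "seq"}},
)
```

Steps 1 and 3 depend on the BlindAI server and client packages themselves; the following sections cover the exact commands.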