BlindAI
Prepare the model
Let's assume we want to deploy a DistilBERT model for classification within our confidential inference server. This could be useful, for instance, to analyze medical records in a privacy-friendly and compliant way.
BlindAI uses the ONNX format, which is an open and interoperable AI model format. PyTorch or TensorFlow models can easily be exported to ONNX.

Step 1: Load the DistilBERT model

```python
from transformers import DistilBertForSequenceClassification

# Load the model
model = DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased")
```
For simplicity, we will use a pre-trained DistilBERT without fine-tuning it, as the purpose is to show how to deploy a model with confidentiality.

Step 2: Export it in ONNX format

Because the ONNX export works by tracing the model, we need to feed it an example input.
```python
from transformers import DistilBertTokenizer
import torch

# Create dummy input for export
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
sentence = "I love AI and privacy!"
inputs = tokenizer(sentence, padding="max_length", max_length=8, return_tensors="pt")["input_ids"]

# Export the model
torch.onnx.export(
    model, inputs, "./distilbert-base-uncased.onnx",
    export_params=True, opset_version=11,
    input_names=['input'], output_names=['output'],
    dynamic_axes={'input': {0: 'batch_size'},
                  'output': {0: 'batch_size'}})
```
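One consequence of tracing is that only the operations executed for the example input are recorded: data-dependent control flow is frozen to whichever branch that input happens to take. A minimal sketch of this limitation, using `torch.jit.trace` (the same tracing mechanism the ONNX exporter relies on) on a toy module not part of this tutorial:

```python
import torch

class Flip(torch.nn.Module):
    def forward(self, x):
        # Data-dependent branch: tracing records only the path
        # taken for the example input
        if x.sum() > 0:
            return x
        return -x

# The example input is all ones, so the positive branch is baked in
traced = torch.jit.trace(Flip(), torch.ones(3))

# Negative inputs are no longer negated by the traced graph
print(traced(-torch.ones(3)))  # tensor([-1., -1., -1.])
```

This is usually harmless for feed-forward models like DistilBERT, but it is worth keeping in mind if your model contains input-dependent branches.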