Model Inferences

Inference sets include all of your model predictions for your dataset

Overview

Within Aquarium, we refer to model results and predictions as inferences. These inferences are the outputs of a model trained on the corresponding ground truth values, so they relate back to your labeled data. We can upload ground truth labels and inferences for the same image, point cloud, etc., which allows us to compare the results between our ground truth labels and our inferences.

When working with Aquarium, for each frame of model inferences we create an object called an InferencesFrame and assign it to an Inferences set.

An InferencesFrame contains all of the information related to model predictions: predicted bounding boxes, predicted classifications, and all inference metadata.

For real examples of uploading inference data, please look at our quickstart guides!

Prerequisites to Uploading Model Inferences

In order to ensure the following steps will work smoothly, this guide assumes you have already created a Project and uploaded a LabeledDataset in Aquarium.

To view your data once uploaded, you will also need to make sure you have selected and set up the appropriate data sharing method for your team.

Creating and Formatting Your Inference Data

To ingest your model inferences, there are two main objects you'll work with: Inferences and InferencesFrame.

Assuming a Project and a LabeledDataset have been created in Aquarium, let's also upload your model inferences. Inferences, like labels, must be matched to a frame within the dataset. For each LabeledFrame in your dataset, we will create an InferencesFrame and then assign the appropriate inferences to that InferencesFrame.

Then, we add those InferencesFrames to the Inferences object in order to upload your model inferences into Aquarium. This usually means looping through your data, creating an InferencesFrame for each frame, and adding it to the Inferences object.

If you have generated your own embeddings and want to use them during your inference data uploads, please also see this section for additional guidance!
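For reference, once you have created an InferencesFrame (defined below), attaching your own embeddings might look roughly like the sketch that follows. The method names add_frame_embedding and add_crop_embedding are assumptions here, so confirm them against the embeddings guide and the API docs before relying on them:

# a frame-level embedding vector produced by your own model (placeholder values)
frame_embedding = [0.12, 0.45, 0.33, 0.91]

# attach the embedding to the inference frame (method name is an assumption)
inf_frame.add_frame_embedding(embedding=frame_embedding)

# per-inference (crop) embeddings can be attached by label_id (also an assumption)
inf_frame.add_crop_embedding(label_id='abcd_inference', embedding=[0.07, 0.66, 0.21, 0.48])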

When defining an inference frame, it is important that you use the same frame_id as your labeled frame.

Aquarium associates labels with inferences by matching frame IDs. So we define a new object called InferencesFrame, but we do so using the same frame_id we used to create the LabeledFrame. Think of an InferencesFrame as a container for your model inferences.

Defining these objects looks like this:

# import the Aquarium Learning client library
import aquariumlearning as al

# defining the Inferences set object
inference_dataset = al.Inferences()

# defining an InferencesFrame object
# frame_id must be the same as the LabeledFrame's frame_id
inf_frame = al.InferencesFrame(frame_id=SAME_FRAME_ID_AS_LABELEDFRAME)

Once you've defined your frame, we need to associate some data with it!

Adding Model Inferences to Your Inference Frame

Like labeled datasets, the Inferences object is also made of frames of type InferencesFrame, which contain inferred values.

An InferencesFrame can contain zero or more inference objects. The type of prediction (for example, bounding boxes or classifications) depends on the ML task for your project.

Adding data to an InferencesFrame usually corresponds 1:1 with a label format, with the addition of a confidence parameter.

For example, this is what it looks like when you add a bounding box label to a 2D image:

frame.add_label_2d_bbox(
    sensor_id='some_camera',
    label_id='abcd_label',
    classification='dog',
    top=200,
    left=300,
    width=250,
    height=150
)

And this is what it looks like to add a bounding box that represents the model inference to a 2D image:

inference_frame.add_inference_2d_bbox(
    sensor_id='some_camera',
    label_id='abcd_inference',
    classification='cat',
    top=200,
    left=300,
    width=250,
    height=150,
    confidence=0.85
)

There is quite a bit of overlap between adding data to a LabeledFrame and an InferencesFrame, and you can see all of the different options in the API docs!
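For instance, on a classification task the analogous call adds a predicted class plus a confidence instead of a box. A minimal sketch, assuming a method along the lines of add_inference_2d_classification (check the API docs for the exact name and parameters):

inference_frame.add_inference_2d_classification(
    sensor_id='some_camera',
    label_id='abcd_inference',
    classification='cat',
    confidence=0.92
)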

Putting It All Together

Now that we've discussed the general steps for adding inference/prediction data, here is an example of what this would look like for a 2D object detection task (you can view the quickstart guide for more details on the dataset used below):

# create the inference set we will be uploading
inference_dataset = al.Inferences()

# loop through all the inferences we have
for entry in inference_entries:
    # Create a frame object, using the same frame_id as the corresponding LabeledFrame
    frame_id = entry['image_name'].split('.png')[0]
    inf_frame = al.InferencesFrame(frame_id=frame_id)

    # for each image, we need to loop through each predicted plane
    for idx, identified_plane in enumerate(entry['predictions']):
        x1 = identified_plane['bbox'][0]
        y1 = identified_plane['bbox'][1]
        x2 = identified_plane['bbox'][2]
        y2 = identified_plane['bbox'][3]

        # the inference data formats each bbox as [x1, y1, x2, y2],
        # but Aquarium needs top, left, width, and height,
        # so we need to do a little bit of math
        width = abs(x1 - x2)
        height = abs(y1 - y2)

        # construct a unique id for this inferred label
        inf_label_id = frame_id + '_inf_' + str(idx)

        # add a 2D inference bounding box to the inference frame
        inf_frame.add_inference_2d_bbox(
            label_id=inf_label_id,
            classification=identified_plane['category_name'],
            confidence=identified_plane['confidence'],
            top=y1,
            left=x1,
            width=width,
            height=height
        )

    # Add the frame to the inferences collection
    inference_dataset.add_frame(inf_frame)

Uploading Your Inferences Set

Now that we have everything all set up, let's submit your new inference dataset to Aquarium!

As a reminder, Aquarium does some processing of your data after it's submitted, like indexing metadata and possibly calculating embeddings, so you may see a delay before your inferences show up in the UI. You can view some examples of what to expect, as well as troubleshooting tips for your upload, here!

Submitting Your Dataset

You can submit your Inferences set to be uploaded into Aquarium by calling .create_inferences().

To spot check your data immediately, you can set the preview_first_frame flag to True; this prints a link in the console to a preview frame so you can make sure your data and inferences look right.

This is an example of what the create_inferences() call will look like:

INFERENCE_DATASET_NAME = 'inferences_v1'
al_client.create_inferences(
    PROJECT_NAME, 
    DATASET_NAME, 
    inferences=inference_dataset, 
    inferences_id=INFERENCE_DATASET_NAME, 
    wait_until_finish=True
)
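
If you want to spot check the first frame as mentioned above, you can pass the preview_first_frame flag in the same call. A sketch, assuming it is supplied as an additional keyword argument:

al_client.create_inferences(
    PROJECT_NAME,
    DATASET_NAME,
    inferences=inference_dataset,
    inferences_id=INFERENCE_DATASET_NAME,
    wait_until_finish=True,
    preview_first_frame=True
)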

After you kick off your inferences upload, processing can take anywhere from minutes to multiple hours depending on your dataset size.

You can monitor your uploads under the "Streaming Uploads" tab in the project view. Here is a guide on how to find that page.

Once the upload has completed, you'll be able to see your dataset details on the specific Project page within Aquarium, and on the righthand side you'll see all of your available inferences.

Quickstart Examples

For examples of how to upload labeled datasets and inferences, check out our quickstart examples.
