Aquarium offers automatic metric computation for common tasks, but you may have a unique ML task, or domain-specific metrics you care about. In these cases, you can provide your own custom metrics for your inferences, which will be indexed and searchable just like the default metrics.
Each frame of an inference set can be assigned a number for a named objective function. For example, your domain may have a business-driven score that corresponds to "correctness." You can attach that score to each frame, and work with it like a default metric such as precision or recall.
For tasks with a classification component, confusion matrices are a natural way to represent performance. For most object detection tasks, Aquarium computes confusion matrices by performing IOU-based matching between labels and detections.
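To make the matching criterion concrete, here is a minimal sketch of the IOU (intersection over union) computation between two axis-aligned boxes. This is an illustration of the general technique, not Aquarium's internal matching code; the `iou` function name and `(x_min, y_min, x_max, y_max)` box format are assumptions for the example.

```python
def iou(box_a, box_b):
    """IOU of two boxes given as (x_min, y_min, x_max, y_max) tuples."""
    # Intersection rectangle bounds.
    ix_min = max(box_a[0], box_b[0])
    iy_min = max(box_a[1], box_b[1])
    ix_max = min(box_a[2], box_b[2])
    iy_max = min(box_a[3], box_b[3])

    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A detection is typically counted as a match for a label when their IOU exceeds some threshold (0.5 is a common default).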
If your domain has more nuanced definitions of associations, or of what counts as a false positive or false negative, you can provide your own confusion matrix for each frame.
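If you already have matched label/prediction pairs under your own association rules, building the per-frame matrix is straightforward. The sketch below is a generic confusion matrix builder, not part of the Aquarium client; the `confusion_matrix` function and its arguments are hypothetical names for the example.

```python
def confusion_matrix(label_classes, pred_classes, classes):
    """Count matrix m[i][j]: pairs labeled classes[i] and predicted classes[j].

    label_classes and pred_classes are parallel lists of class names for
    matched pairs; classes fixes the row/column order of the matrix.
    """
    idx = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for label, pred in zip(label_classes, pred_classes):
        m[idx[label]][idx[pred]] += 1
    return m
```

The resulting nested list has the same shape as the matrices attached via the custom-metrics API below.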
When creating a project in Aquarium, you can provide the custom metrics you want to be indexed for its datasets:
```python
al_client.create_project(
    ...,
    custom_metrics=[
        al.CustomMetricsDefinition(name='my_score', metrics_type='objective'),
        al.CustomMetricsDefinition(name='my_conf_matrix', metrics_type='confusion_matrix'),
    ],
)
```
Then, whenever you add a new inference set to the project, you can attach your metrics to each inference frame:
```python
inferences_frame = al.InferencesFrame(frame_id=frame_id)
# ...
inferences_frame.add_custom_metric('my_score', 3.32)
inferences_frame.add_custom_metric('my_conf_matrix', [[6, 2], [1, 5]])
```