Webhooks

How to use Aquarium webhooks to communicate with your own services, including automating labeling service integrations.

Many of our processes are asynchronous, and we encourage you to check back in the app periodically to know when a dataset has been fully ingested or some other event has occurred. To support use cases where you'd prefer these events to be pushed to you via an HTTP request, we've implemented webhooks.

Configuration

Webhooks are configured per event within a project, and can be edited under the Webhooks tab on the project page:

One endpoint can service multiple events, or you can configure a unique endpoint for every event you want to be alerted about. You can also deactivate an entire webhook if you need to quickly disable deliveries to it.

You can also test an endpoint by sending a minimal test payload using the send icon; we will generate a test event and send it through the same processing pipeline that real events go through.

The test view shows the exact test payload, along with error feedback if the request fails outright or if the endpoint returns an error response.

Payloads will always be POSTed to the endpoint with the header Content-Type: application/json.

An authenticating header will also be included if one has been configured for the organization (see below).

All payloads will follow the same basic schema:

{
  event: str, // the event type that triggered the call
  project: str, // the name of the project the event originated in
  [entity]: Dict // a JSON representation of the entity the event refers to, if any
}
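As a sketch of consuming this envelope, the helper below decodes a request body and checks the common fields before dispatching on the event type. The function name `parse_payload` and the sample body are illustrative, not part of the Aquarium API:

```python
import json

# Keys that every webhook payload is documented to carry.
REQUIRED_KEYS = {"event", "project"}

def parse_payload(raw_body: bytes) -> dict:
    """Decode a webhook body and check the common envelope fields."""
    payload = json.loads(raw_body)
    missing = REQUIRED_KEYS - payload.keys()
    if missing:
        raise ValueError(f"payload missing required keys: {sorted(missing)}")
    return payload

# Example: a dataset-complete payload carries its entity under "dataset".
body = json.dumps({
    "event": "dataset-complete",
    "project": "demo-project",
    "dataset": {"id": "abc123"},
}).encode()
payload = parse_payload(body)
```

From here you would branch on `payload["event"]` to find which entity key to read.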

Authentication

Any endpoint you configure as a webhook is publicly reachable, so anyone could call it, not just Aquarium. To verify that a payload you receive on that endpoint actually originated with Aquarium, you can generate a secret key on your organization settings page. This key applies to all configured webhooks across every project in your organization. We generate and show it only once in the UI, so please remember to write it down!

Once a secret key is generated, it will be sent on every request in the X-Aquarium-Secret header, which you can verify against the secret key you were shown.

Generating a new secret key immediately replaces the previous one.
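A minimal sketch of that check in Python, assuming the header carries the secret verbatim as described above (the function name is hypothetical; `hmac.compare_digest` is used to avoid leaking information through comparison timing):

```python
import hmac

def is_from_aquarium(headers: dict, secret: str) -> bool:
    """Check the X-Aquarium-Secret header against our stored secret.

    Uses a constant-time comparison so an attacker can't probe the
    secret one character at a time via response timing.
    """
    received = headers.get("X-Aquarium-Secret", "")
    return hmac.compare_digest(received, secret)
```

Reject the request (e.g. with a 401) before doing any processing if this returns False.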

Event Schemas


Below is an exhaustive list of schema definitions for every event that Aquarium supports and the shape of the [entity] in the payload.

Dataset Completed Processing

The dataset-complete event fires when a labeled dataset or an inference set for a labeled dataset has been uploaded; the shape will differ slightly depending on what was uploaded.

Dataset
{
  event: "dataset-complete",
  project: str,
  dataset: {
    id: str,
    archived: bool,
    created_at: str,
    updated_at: str,
    frame_count: int,
    label_count: int,
    data_url: str[],
    embeddings_url?: str[],
    dataflow_status: str,
    dataflow_status_postprocess: str
  }
}
Inference Set
{
  event: "dataset-complete",
  project: str,
  inference_set: {
    id: str,
    archived: bool,
    base_dataset: str,
    created_at: str,
    updated_at: str,
    data_url: str[],
    frame_count: int,
    label_count: int,
    embeddings_url?: str[],
    cached_metrics?: float[][][],
    dataflow_status: str,
    dataflow_status_postprocess: str
  }
}
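Since both shapes share the dataset-complete event name, a handler has to look at which entity key is present. A small sketch of that dispatch (the function name is hypothetical; the keys follow the schemas above):

```python
def entity_for_dataset_event(payload: dict):
    """Return (kind, entity) for a dataset-complete/dataset-failed payload.

    Per the schemas above, the entity lives under "dataset" for a labeled
    dataset upload and under "inference_set" for an inference set upload.
    """
    if "inference_set" in payload:
        return "inference_set", payload["inference_set"]
    if "dataset" in payload:
        return "dataset", payload["dataset"]
    raise ValueError("payload carries no dataset entity")
```

The same helper works for dataset-failed payloads, which differ only in the event name.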

Dataset Failed Processing

The dataset-failed event fires when processing fails for an uploaded labeled dataset or an inference set for a labeled dataset; the shape will differ slightly depending on what was uploaded.

Dataset
{
  event: "dataset-failed",
  project: str,
  dataset: {
    id: str,
    archived: bool,
    created_at: str,
    updated_at: str,
    frame_count: int,
    label_count: int,
    data_url: str[],
    embeddings_url?: str[],
    dataflow_status: str,
    dataflow_status_postprocess: str
  }
}
Inference Set
{
  event: "dataset-failed",
  project: str,
  inference_set: {
    id: str,
    archived: bool,
    base_dataset: str,
    created_at: str,
    updated_at: str,
    data_url: str[],
    frame_count: int,
    label_count: int,
    embeddings_url?: str[],
    cached_metrics?: float[][][],
    dataflow_status: str,
    dataflow_status_postprocess: str
  }
}

Issues Created

{
  event: "issues-created",
  project: str,
  issues: [{
    id: str,
    issue_name: str,
    creation_type: "auto" | "manual",
    element_type: "frame" | "crop",
    num_elements: int,
    originating_inference_set_name?: str // the inference set used to generate this issue's elements; usually applies to auto-created issues
  }]
}
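For example, to react only to issues Aquarium opened automatically, you might filter the batch on `creation_type`. A sketch, using the field names above (the helper name is illustrative):

```python
def auto_created_issue_names(payload: dict) -> list:
    """Names of the issues in this batch that Aquarium created automatically."""
    return [
        issue["issue_name"]
        for issue in payload.get("issues", [])
        if issue.get("creation_type") == "auto"
    ]
```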

Issue Updated

{
  event: "issue-updated",
  project: str,
  issue_update: {
    id: str,
    update_type: "add" | "remove" | "move" | "elt_status" | "issue_status",
    num_elements_added?: int, // if update_type == "add"
    num_elements_removed?: int, // if update_type == "remove"
    num_elements_moved?: int, // if update_type == "move"
    num_elements_already_present?: int, // if update_type == "move"
    original_issue_name?: str, // if update_type == "move"; the issue the elements were originally in
    new_elements_status?: str, // if update_type == "elt_status"
    num_elements_updated?: int, // if update_type == "elt_status"
    previous_issue_status?: str, // if update_type == "issue_status"
    new_issue_status?: str // if update_type == "issue_status"
  }
}
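Because the optional fields are keyed off `update_type`, a handler typically branches on it first. A sketch that turns an update into a log line (the function name is hypothetical; the field access follows the conditions in the schema comments above):

```python
def summarize_issue_update(update: dict) -> str:
    """Render an issue_update entity as a one-line summary."""
    t = update["update_type"]
    if t == "add":
        return f"{update['num_elements_added']} elements added"
    if t == "remove":
        return f"{update['num_elements_removed']} elements removed"
    if t == "move":
        return (f"{update['num_elements_moved']} elements moved from "
                f"{update['original_issue_name']}")
    if t == "elt_status":
        return (f"{update['num_elements_updated']} elements set to "
                f"{update['new_elements_status']}")
    if t == "issue_status":
        return (f"status changed {update['previous_issue_status']} -> "
                f"{update['new_issue_status']}")
    raise ValueError(f"unknown update_type: {t}")
```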

Issue Exported

{
  event: "issue-exported",
  project: str,
  issue: {
    id: str,
    elements: [{
      dataset: str,
      inference_set: str,
      issue_name: str,
      element_id: str,
      element_type: "frame" | "crop",
      frame_id: str,
      frame_data: {
        coordinate_frames: [{
          coordinate_frame_id: str,
          coordinate_frame_metadata?: Dict,
          coordinate_frame_type: str
        }],
        custom_metrics: {
          [custom_metric_type]: int[][] | float
        },
        date_captured: str,
        device_id: str,
        geo_data: {
          [coordinate_field]: float
        },
        label_data: [{
          attributes: {
            confidence: float,
            ...
          },
          label: str,
          label_coordinate_frame: str,
          label_type: str,
          linked_labels: str[],
          uuid: str
        }],
        sensor_data: [{
          coordinate_frame: str,
          data_urls: {
            image_url: str
          },
          date_captured: str,
          sensor_id: str,
          sensor_metadata: Dict,
          sensor_type: str
        }],
        task_id: str,
        [user__metadata_field]: str | int | float | bool
      }
    }]
  }
}
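An export payload nests sensor data inside each element's frame_data, so forwarding it to a labeling service usually means walking that structure. A sketch that collects every sensor image URL from an exported issue (the helper name is illustrative; the keys follow the schema above):

```python
def exported_image_urls(payload: dict) -> list:
    """Collect every sensor image URL from an issue-exported payload."""
    urls = []
    for element in payload["issue"]["elements"]:
        for sensor in element["frame_data"].get("sensor_data", []):
            url = sensor.get("data_urls", {}).get("image_url")
            if url:
                urls.append(url)
    return urls
```

These URLs (plus the label_data for each frame) are typically what you would hand off to a downstream labeling integration.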