Webhooks allow you to set up integrations that subscribe to certain events occurring inside Label Studio. When an event is triggered, Label Studio sends an HTTP POST
request to the configured webhook URL. This enables you to automate workflows and seamlessly integrate Label Studio with other systems.
Label Studio supports the following webhook events:
- TASK_CREATED: Triggered when new tasks are created.
- TASK_DELETED: Triggered when tasks are deleted.
- ANNOTATION_CREATED: Triggered when new annotations are created.
- ANNOTATION_UPDATED: Triggered when an annotation is updated.
- ANNOTATION_DELETED: Triggered when an annotation is deleted.
- PROJECT_CREATED: Triggered when a project is created.
- PROJECT_UPDATED: Triggered when project settings are updated.
- PROJECT_DELETED: Triggered when a project is deleted.
- START_TRAINING: Triggered when the Start Training action is initiated for a connected model.

For more information about setting up and developing Label Studio webhooks, see our documentation.
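To subscribe to these events, you register a webhook with your Label Studio instance, either in the UI or through the REST API. The sketch below builds a webhook configuration and sends it to the /api/webhooks endpoint using only the Python standard library; the field names (send_payload, send_for_all_actions, actions, project) follow the webhook API documentation, but verify them against your Label Studio version before relying on them.

```python
import json
import urllib.request

def build_webhook_config(target_url, events, project_id=None):
    """Webhook settings for Label Studio's /api/webhooks endpoint."""
    config = {
        "url": target_url,
        "send_payload": True,           # include event data in the POST body
        "send_for_all_actions": False,  # subscribe only to `events` below
        "actions": events,
    }
    if project_id is not None:
        config["project"] = project_id  # omit for an organization-wide hook
    return config

def register_webhook(label_studio_url, api_token, config):
    """POST the config to Label Studio (shown here, not executed)."""
    req = urllib.request.Request(
        f"{label_studio_url}/api/webhooks/",
        data=json.dumps(config).encode(),
        headers={
            "Authorization": f"Token {api_token}",
            "Content-Type": "application/json",
        },
    )
    return urllib.request.urlopen(req)
```

You would call `register_webhook("http://localhost:8080", "<your-token>", build_webhook_config("https://example.com/hooks", ["ANNOTATION_CREATED"]))` once per environment; Label Studio then delivers matching events to that URL.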
When should you use webhooks rather than the SDK or other API workflows?

The SDK is ideal for scripted, on-demand operations, such as creating projects and importing or exporting data programmatically. Webhooks, by contrast, are event-driven: Label Studio sends HTTP POST requests to your configured URL whenever the events you subscribe to occur. That makes webhooks ideal for notifying other services or applications about events in Label Studio without the need for continuous polling.
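On the receiving side, all you need is an HTTP endpoint that accepts those POSTs and routes on the event name. This minimal sketch uses only the Python standard library; the payload's "action" field matches the event names listed above, but treat the exact payload shape as an assumption to confirm against your Label Studio version.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def on_annotation_created(payload):
    # Placeholder handler: kick off whatever downstream work you need.
    print("new annotation for task", payload.get("task", {}).get("id"))

# Map each subscribed Label Studio event name to a handler.
HANDLERS = {
    "ANNOTATION_CREATED": on_annotation_created,
}

def dispatch(payload):
    """Route a webhook payload to the handler for its event type."""
    handler = HANDLERS.get(payload.get("action"))
    if handler is None:
        return False  # an event we did not subscribe to, or unknown
    handler(payload)
    return True

class WebhookReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        dispatch(payload)
        self.send_response(200)  # acknowledge quickly; do heavy work async
        self.end_headers()

# To run the receiver on port 8000, uncomment:
# HTTPServer(("", 8000), WebhookReceiver).serve_forever()
```

In production you would typically put this behind a proper web framework and a queue, but the dispatch-on-"action" pattern stays the same for every use case below.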
Use webhooks with your ML backend to automate your training pipeline. For this use case, you would use the ANNOTATION_CREATED and/or ANNOTATION_UPDATED events.
Because the annotation webhook payloads include the annotation result, you can get creative with how you use them in your ML pipelines. For example, you can use webhooks to automate prompt engineering tasks: whenever a negative or incorrect review is captured, you can trigger an event to rewrite the prompt using an SME or another LLM.
As another example, you can use webhooks to send annotation data to an external machine learning system such as Amazon SageMaker. For a full tutorial of this process, see From raw data to a trained model: Automate your ML pipeline with Label Studio & Amazon SageMaker.
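As a hypothetical sketch of the training-pipeline pattern: buffer results from ANNOTATION_CREATED / ANNOTATION_UPDATED payloads and kick off retraining once enough new labels have arrived. The payload fields used here ("action", "annotation", "task") follow Label Studio's webhook body, but verify them against your instance; `start_training` is a placeholder for your ML backend call.

```python
RETRAIN_THRESHOLD = 50  # retrain after this many new/updated annotations
_buffer = []

def start_training(examples):
    # Placeholder: call your ML backend's training endpoint here.
    print(f"retraining on {len(examples)} new annotations")

def on_annotation_event(payload):
    """Collect annotation results; trigger retraining at the threshold."""
    if payload.get("action") not in ("ANNOTATION_CREATED", "ANNOTATION_UPDATED"):
        return len(_buffer)
    annotation = payload.get("annotation", {})
    _buffer.append({
        "task": payload.get("task", {}).get("data"),
        "result": annotation.get("result"),
    })
    if len(_buffer) >= RETRAIN_THRESHOLD:
        start_training(_buffer)
        _buffer.clear()
    return len(_buffer)
```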
Tip: In Label Studio Enterprise, users can identify ground truth annotations. This allows you to track model performance against ground truth annotations and trigger actions based on performance metrics.
Create versions of training data as your project progresses. Creating versioned datasets may be a requirement within your organization to support reproducibility and traceability.
For this use case, you would use the TASK_CREATED and/or ANNOTATION_UPDATED events.
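One lightweight way to build that version history is to append every incoming event to a dated JSONL log, which you can later replay or snapshot. This is a sketch under the assumption that a flat file is enough for your traceability needs; swap it for DVC, a database, or object storage as your requirements grow.

```python
import datetime
import json
from pathlib import Path

def record_event(payload, log_dir="dataset_versions"):
    """Append a webhook event to a per-day JSONL log and return its path."""
    Path(log_dir).mkdir(exist_ok=True)
    entry = {
        "received_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "action": payload.get("action"),
        "payload": payload,
    }
    path = Path(log_dir) / f"{datetime.date.today().isoformat()}.jsonl"
    with path.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return path
```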
Notify annotators when there is a new project ready for labeling.
For this use case, you would use the PROJECT_CREATED webhook.
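For example, a PROJECT_CREATED handler can forward a short message to your team's chat tool. In this sketch, `post_to_chat` is a stand-in for whatever incoming-webhook URL your messaging platform provides, and the payload's "project" fields are assumptions to verify against your instance.

```python
import json
import urllib.request

def build_notification(payload):
    """Turn a PROJECT_CREATED payload into a human-readable message."""
    project = payload.get("project", {})
    return (f"New project ready for labeling: "
            f"{project.get('title', 'untitled')} (id {project.get('id')})")

def post_to_chat(chat_webhook_url, text):
    """Send the message to a chat incoming-webhook URL (not executed here)."""
    req = urllib.request.Request(
        chat_webhook_url,
        data=json.dumps({"text": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```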
Tip: Label Studio Enterprise includes a number of built-in queue management and automated workflow features. For example, you can automatically assign annotators to a task that has a low agreement score.
Update an external database with information submitted by annotators. For example, you might use this to help build an internal knowledge base as annotators submit feedback on LLM responses.
For this use case, you would use the ANNOTATION_CREATED event.
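The database write itself can be as simple as one INSERT per event. This sketch uses SQLite for a self-contained example; the table schema and the payload fields ("task", "annotation", "completed_by", "result") are illustrative, so adapt them to your own schema and to the payload your Label Studio version sends.

```python
import sqlite3

def save_annotation(conn, payload):
    """Store an annotator's submission in an external database."""
    annotation = payload.get("annotation", {})
    conn.execute(
        "INSERT INTO feedback (task_id, annotator, result) VALUES (?, ?, ?)",
        (
            payload.get("task", {}).get("id"),
            annotation.get("completed_by"),
            str(annotation.get("result")),
        ),
    )
    conn.commit()

# Example setup; in practice, connect to your real database instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE feedback (task_id INTEGER, annotator INTEGER, result TEXT)")
```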
Trigger active learning and reorganize the labeling queue based on annotation results.
For this use case, you would use the ANNOTATION_CREATED and/or ANNOTATION_UPDATED events.
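The core of the active-learning step is re-ranking unlabeled tasks each time an annotation event arrives, so the most uncertain examples are labeled next. In this sketch, `model_confidence` is a stand-in for your model's real prediction score; how you push the new order back into the labeling queue depends on your setup.

```python
def model_confidence(task):
    # Stand-in: replace with your model's real confidence in [0, 1].
    return task.get("score", 0.5)

def reorder_queue(unlabeled_tasks):
    """Least-confident tasks first, so annotators see them next."""
    return sorted(unlabeled_tasks, key=model_confidence)
```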
We hope this article helps you better understand when to use webhooks with Label Studio and shows you just how flexible this feature is. If you're interested in learning more about how Label Studio Enterprise can give you even more power to automate and customize your labeling workflow, schedule some time to chat with one of our experts. Happy labeling!