We have made a number of improvements to task summaries.
Before:
After:
Improvements include:
When calculating agreement, control tags that are not populated in any of the annotations will now count as agreement.
Previously, agreement only considered control tags that were present in the annotation results. Going forward, all visible control tags in the labeling configuration are taken into consideration.
For example, the following result set would previously be considered 0% agreement between the two annotators, as only choices group 1 would be included in the agreement calculation.
Now it would be considered 50% agreement (choices group 1 has 0% agreement, and choices group 2 has 100% agreement).
Annotator 1: Choices group 1, Choices group 2
Annotator 2: Choices group 1, Choices group 2
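For reference, a labeling configuration with two such control tags might look like the following (a minimal sketch; the tag names and choice values are hypothetical):
<View>
  <Text name="text" value="$text"/>
  <!-- Both groups are visible, so both now count toward agreement -->
  <Choices name="choices_group_1" toName="text" choice="single">
    <Choice value="Positive"/>
    <Choice value="Negative"/>
  </Choices>
  <Choices name="choices_group_2" toName="text" choice="single">
    <Choice value="Relevant"/>
    <Choice value="Not relevant"/>
  </Choices>
</View>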
Notes:
This change only applies to new projects created after November 13th, 2025.
Only visible tags are taken into consideration. For example, you may have tags that are conditional and hidden unless certain other tags are selected. These are not included in agreement calculations as long as they remain hidden.
When you add OpenAI models to Prompts or to the organization model provider list, GPT-5.1 will now be included.
There is a new page available from Organization > Settings > Permissions that allows users in the Owner role to refine permissions across the organization.
This page is only visible to users in the Owner role.
For more information, see Customize organization permissions.
You can now find the following templates in the in-app template gallery:
Fine-Tune an Agent with an LLM
Fine-tune an Agent without an LLM
Evaluate Production Conversations for RLHF
Previously, if you were using a machine learning model with a project, you had to set up your ML backend with a legacy API token.
You can now use personal access tokens as well.
The Chat tag now supports markdown and HTML in messages.
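For example, a minimal sketch that assumes the Chat tag follows the usual name/value pattern for object tags (check the Chat tag reference for the exact attributes):
<View>
  <!-- Markdown and HTML in message content now render in the conversation -->
  <Chat name="chat" value="$messages"/>
</View>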
The Quality section of the project settings has been improved.
The onboarding checklist for projects has been improved to make it clearer which steps still need to be taken before annotators can begin working on a project:
There is a new option to select between the default column display and a more compact version.
There is a new Agreement (Selected) column that allows you to view agreement data between selected annotators, models, and ground truths.
This is different from the Agreement column, which displays the average agreement score between all annotators.
Also note that when you click a comparison between annotators in the agreement matrix, you are now taken to the Data Manager with the Agreement (Selected) column pre-filtered for those annotators and/or models.
For more information, see Agreement and Agreement (Selected) columns.
When deleting a project, users will now be asked to enter text in the confirmation window:
Activity logs are now only retained for 180 days.
The Member Performance Dashboard now includes two new graphs for Reviewer metrics:
You can also now find a Review Time column in the Data Manager:
Note that data collection for review time began on September 25, 2025. You will not be able to view review time for reviewing activity that happened before data collection began.
There is a new <Markdown> tag, which you can use to add content to your labeling interface.
For example, adding the following to your labeling interface:
<View>
<Markdown>
## Heading 2
### Heading 3
- bullet point one
- bullet point two
**Bold text** and *italic text*
`inline code`
```
code block
```
[Link](https://humansignal.com/changelog/)

</Markdown>
</View>
Produces this:
Previously, the Table tag only accepted key/value pairs, for example:
{
"data": {
"table_data": {
"user": "123456",
"nick_name": "Max Attack",
"first": "Max",
"last": "Opossom"
}
}
}
It will now accept an array of objects as well as arrays of primitives/mixed values. For example:
{
"data": {
"table_data": [
{ "id": 1, "name": "Alice", "score": 87.5, "active": "true" },
{ "id": 2, "name": "Bob", "score": 92.0, "active": "false" },
{ "id": 3, "name": "Cara", "score": null, "active": "true" }
]
}
}
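In either case, the labeling configuration that displays the data can stay the same; a minimal sketch, assuming the task field is named table_data as in the examples above:
<View>
  <!-- Renders the table_data field, whether it is a key/value object or an array -->
  <Table name="table" value="$table_data"/>
</View>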
You can now perform page-level annotation on PDF files, such as for OCR, NER, and more.
This new functionality also supports displaying PDFs natively within the labeling interface, allowing you to zoom and rotate pages as needed.
The PDF functionality is now available for all Label Studio Enterprise customers. Contact sales to request a trial.
The Video tag now has the following optional parameters:
defaultPlaybackSpeed - The default playback speed when the video is loaded.
minPlaybackSpeed - The minimum allowed playback speed.
The default value for both parameters is 1.
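For example, a minimal sketch (the video task field name is an assumption):
<View>
  <!-- Start playback at 1.5x and allow slowing down to 0.5x -->
  <Video name="video" value="$video" defaultPlaybackSpeed="1.5" minPlaybackSpeed="0.5"/>
</View>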
We have continued to add new endpoints to our SDK, including new endpoints for model and user stats.
See our SDK releases and API reference.
You can now select specific workspaces and projects when inviting new Label Studio users. Those users will automatically be added as members to the selected projects and/or workspaces:
There is also a new Invite Members action available from the Settings > Members page for projects. This is currently only available for Administrators and Owners.
This will create a new user within your organization, and also immediately add them as a member to the project:
The Annotation section of the project settings has been improved.
For more information, see Project settings - Annotation.
The Audio tag now supports the splitchannels="true" parameter, which displays each audio channel separately.
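A minimal sketch, assuming a task field named audio:
<View>
  <!-- Render each audio channel as its own waveform -->
  <Audio name="audio" value="$audio" splitchannels="true"/>
</View>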
Chat conversations are now a native data type in Label Studio, so you can annotate, automate, and measure like you already do for images, video, audio, and text.
For more information, see:
Blog - Introducing Chat: 4 Use Cases to Ship a High Quality Chatbot
There is a new cloud storage option to connect your Databricks Unity Catalog to Label Studio.
For more information, see Databricks Files (UC Volumes).
When you click the Agreement column in the Data Manager, you can see a pop-up with an inter-annotator agreement matrix. This pop-up will now also identify annotations with ground truths.
For more information about adding ground truths, see Ground truth annotations.
You can now sort regions by media start time.
Previously you could sort by time, but this would reflect the time that the region was created. The new option reflects the start time in relation to the media.
When you add Gemini or Vertex AI models to Prompts or to the organization model provider list, you will now see the latest Gemini models.
gemini-2.5-pro
gemini-2.5-flash
gemini-2.5-flash-lite
You can now search the template gallery. You can search by template title, keywords, tag names, and more.
Note that template searches can only be performed if your organization has AI features enabled.
Introducing two new tags: Vector and VectorLabels.
These tags open up a multitude of new use cases, from skeletons to polylines to Bézier curves and more.
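For example, a minimal sketch that pairs VectorLabels with an image (the field and label values are illustrative):
<View>
  <Image name="img" value="$image"/>
  <!-- Draw connected points (polylines, skeletons, curves) and label them -->
  <VectorLabels name="vectors" toName="img">
    <Label value="Road"/>
    <Label value="Power line"/>
  </VectorLabels>
</View>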
There is a new Model Providers page available at the organization level where you can configure API keys to use with LLM tasks.
If you have previously set up model providers as part of your Prompts workflow, they are automatically included in the list.
For more information, see Model provider API keys for organizations.
The Organization page (only accessible to Owner and Admin roles) has been redesigned to be more consistent with the rest of the app.
Note that as part of this change, the Access Token page has been moved under Settings.
Before:
After:
There is a new project setting available from Annotation > Annotating Options and Review > Reviewing Options called Show unused data columns to reviewers in the Data Manager.
This setting allows you to hide unused Data Manager columns from any Annotator or Reviewer who also has permission to view the Data Manager.
"Unused" Data Manager columns are columns that contain data that is not being used in the labeling configuration.
For example, you may include meta or system data that you want to view as part of a project, but you don't necessarily want to expose that data to Annotators and Reviewers.
Each user has a numeric ID that you can use in automated workflows. These IDs are now easier to quickly find through the UI.
You can find them listed on the Organization page and in the Annotation Summary table on the Members page for projects.
Managers and Reviewers will now see a link to the Annotator Dashboard from the Home page.
The Annotator Dashboard displays information about their annotation history.
Managers:
Reviewers:
We have continued to add new endpoints to our SDK, including new endpoints for bulk assign and unassign members to tasks.
See our SDK releases and API reference.
If your project is using predictions, you will now see a Show Models toggle on the Members dashboard.
This will allow you to view model agreement as compared to annotators, other models, and ground truths.
For more information, see the Members dashboard.
When duplicating a project, you will now see a modal with an updated UI and more helpful text.
We have continued to add new endpoints to our SDK. See our SDK releases.
Administrators and Owners can now opt in to get an email notification when a new user logs in who has not yet been assigned a role.
When you have a labeling configuration that includes multiple <Labels> blocks, like the following:
<View>
<Text name="text" value="$text" granularity="word"/>
<Labels name="category" toName="text" choice="single">
<Label value="Animal" background="red"/>
<Label value="Plant" background="darkorange"/>
</Labels>
<Labels name="type" toName="text" choice="single">
<Label value="Mammal" background="green"/>
<Label value="Reptile" background="gray"/>
<Label value="Bird" background="blue"/>
</Labels>
</View>
You can now choose multiple labels to apply to the selected region.
When loading the Data Manager for a project in which you have not yet imported data, you will now see a more helpful interface.
We released a new version of the SDK, with multiple functional and documentation enhancements.
All imported predictions are now validated against your project’s labeling configuration and the required prediction schema.
Predictions that are missing required fields (for example, from_name, to_name, type, value) or that don’t match the labeling configuration (for example, to_name must reference an existing object tag) will be rejected with detailed, per-task error messages to help you correct the payloads.
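For reference, a minimal prediction payload that passes validation might look like the following (the from_name and to_name values are assumed to match a control and object tag in your labeling configuration):
{
  "model_version": "my-model-v1",
  "result": [
    {
      "from_name": "label",
      "to_name": "text",
      "type": "labels",
      "value": {
        "start": 0,
        "end": 4,
        "labels": ["PER"]
      }
    }
  ]
}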
You can now filter prediction results by selecting options that correspond to control tag values.
Previously, you could only filter using an unstructured text search.
The prediction results filter also includes a nested model version filter, which (if specified) ensures that your filter returns tasks only when the selected prediction result comes from the selected model.
There is a new See Logs option for custom agreement metrics, which you can use to view log history and error messages.
leafsOnly parameter for taxonomies.
Text or Hypertext with multiple Taxonomy tags at the same time.
When using an OpenAI API key, you will now see the following models as options:
You can now connect your projects to Azure Blob Storage using Service Principal authentication.
Service Principal authentication uses Entra ID to authenticate applications rather than account keys, allowing you to grant specific permissions that can be easily revoked or rotated.
For more information, see Azure Blob Storage with Service Principal authentication.
The Organization > Usage & License page has new options to disable individual email notifications for all members in the organization.
If disabled, the notification is turned off for all users and hidden from the options on their Account & Settings page.
When adding cloud storage, the modal has now been redesigned to add clarity and additional guidance to the process.
For example, you can now preview a list of files that will be imported in order to verify your settings.
When applying an annotation results filter, you will now see a nested Annotator option. This allows you to specify that the preceding filter should apply to a specific annotator.
For example, the following filter will retrieve any tasks that have an annotation with choice "bird" selected, and also retrieve any tasks that have an annotation submitted by "Sally Opossum."
This means if you have a task where "Max Opossum" and "Sally Opossum" both submitted annotations, but only Max chose "bird", the task would be returned in your filter.
With the new nested filter, you can specify that you only want tasks in which "Sally Opossum" selected "bird":
While you can still adjust the default height in the labeling configuration, users can now drag to adjust the height as needed.
Next week, we are releasing version 2.0.0 of the Label Studio SDK, which will contain breaking changes.
If you use the Label Studio SDK package in any automated pipelines, we strongly recommend pinning your SDK version to <2.0.0.
When labeling paragraphs in dialogue format (layout="dialogue"), you can now apply labels at an utterance level.
There is a new button that you can click to apply the selected label to the entire utterance. You can also use the pre-configured Command + Shift + A hotkey:
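For example, a dialogue-style configuration might look like the following (a minimal sketch; the nameKey and textKey values depend on your data):
<View>
  <Labels name="intent" toName="dialogue">
    <Label value="Question"/>
    <Label value="Complaint"/>
  </Labels>
  <!-- layout="dialogue" renders each utterance as a separate turn -->
  <Paragraphs name="dialogue" value="$dialogue" layout="dialogue" nameKey="author" textKey="text"/>
</View>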
You can now annotate time series data on the sub-second decimal level.
Note: Your time format must include .%f to support decimals.
For example: timeFormat="%Y-%m-%d %H:%M:%S.%f"
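A fuller sketch of a configuration that parses sub-second timestamps (field and column names are assumptions):
<View>
  <TimeSeriesLabels name="label" toName="ts">
    <Label value="Spike"/>
  </TimeSeriesLabels>
  <!-- timeFormat includes .%f so sub-second values parse correctly -->
  <TimeSeries name="ts" value="$csv" valueType="url" sep="," timeColumn="time" timeFormat="%Y-%m-%d %H:%M:%S.%f">
    <Channel column="signal"/>
  </TimeSeries>
</View>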
There is a new option on the Members page to export its data to CSV:
When listing organization members via the API, you can use two new query params to exclude project or workspace members:
exclude_project_id
exclude_workspace_id
The following API endpoints have been deprecated and will be removed on September 16, 2025:
GET /api/projects/{id}/dashboard-members
GET /api/projects/{id}/export
Fixed an issue where snap="pixel" was not included in autocomplete options for RectangleLabels.
You now have the option to view the Projects page in list format rather than as a grid of cards:
In the list view, you will see a condensed version of the project information that includes fewer metrics, but more projects per page:
(Admin view)
(Annotator view)
This change also includes a new option to sort projects (available in either view):
When you are using a labeling configuration that includes <TimelineLabels>, you will now see a settings icon.
From here you can specify the following:
The <Rectangle> and <RectangleLabels> tags now include the snap parameter, allowing you to snap bounding boxes to pixels.
Tip: To see a pixel grid when zoomed in on an image, you must disable pixel smoothing. This can be done as a parameter on the <Image> tag or from the user settings.
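For example, a minimal sketch (the field and label values are illustrative):
<View>
  <Image name="img" value="$image"/>
  <!-- snap="pixel" aligns bounding box edges to the pixel grid -->
  <RectangleLabels name="bbox" toName="img" snap="pixel">
    <Label value="Car"/>
    <Label value="Sign"/>
  </RectangleLabels>
</View>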
The <Collapse> tag now includes an open parameter. You can use this to specify whether a content area should be open or collapsed by default.
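For example, a minimal sketch that expands a panel of instructions by default:
<View>
  <!-- open="true" expands this section when the task loads -->
  <Collapse open="true">
    <Panel value="Instructions">
      <Header value="Label every product mention in the text."/>
    </Panel>
  </Collapse>
</View>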
/import API calls no longer apply to GET requests.
You can now configure global hotkeys for each user account. These are available from the Account & Settings page.
Previously, the bulk annotation actions were only available to users in the Reviewer role or Manager and higher.
Now, users in the Annotator role can access these actions.
Note that this is only available when the project is using Manual distribution and annotators must have access to the Data Manager.
You can now search by project description and project ID.
You can now click a link in the project breadcrumbs to navigate back to a specific workspace.
Removed the default zoom level calculation for Audio, allowing it to render the full waveform by default.
include and filter parameters.
/api/tasks/{id} call for tasks with more than 10 annotations.
Fixed an issue where a <TextArea> field was still submitted even if the field was conditionally hidden.
We’ve introduced a new BitMask tag to support pixel-level image annotation using a brush and eraser. This new tag allows for highly detailed segmentation using brush-based regions and a cursor that reflects brush size down to single pixels for fine detail. We’ve also improved performance so it can handle more regions with ease.
Additionally, Mac users can now use two fingers to pinch zoom and pan images for all annotation tasks.
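A minimal sketch of a configuration using the new BitMask tag (the exact attributes and any companion labels control should be confirmed against the tag reference):
<View>
  <Image name="img" value="$image"/>
  <!-- Pixel-level mask drawn with the brush and eraser tools -->
  <BitMask name="mask" toName="img"/>
</View>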
Email notifications have been added for important project events, including task assignments, project publishing, and data export completion. This helps annotators and project managers stay in sync without unnecessary distractions.
Users can manage email preferences in their user settings.
All Label Studio Starter Cloud and Enterprise SaaS users, including those on a free trial, can ask inline questions of an AI trained on our docs and even use AI to quickly create projects, including configuring labeling UIs, with natural language.
Account owners can enable the AI Assistant from Settings > Usage & Licenses by toggling on “Enable AI” and “Enable Ask AI.” For more information, see the docs.
There is a new option to display audio files as spectrograms. You can further specify additional spectrogram settings such as windowing function, color scheme, dBs, mel bands, and more.
Spectrograms can provide a deeper level of audio analysis by visualizing frequency and amplitude over time, which is crucial for identifying subtle sounds (like voices or instruments) that might be missed with traditional waveform views.
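A minimal sketch; the spectrogram attribute used here is a hypothetical name for illustration only, so check the Audio tag reference for the actual parameter names:
<View>
  <!-- "spectrogram" is a hypothetical attribute name used for illustration -->
  <Audio name="audio" value="$audio" spectrogram="true"/>
</View>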
There is a new Multichannel tag for visualizing time series data. You can use this tag to combine and view multiple time series channels simultaneously on a single channel, with synchronized interactions.
The Multichannel tag significantly improves the usability and correlation of time series data, making it easier for users to analyze and pinpoint relationships across different signals.
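A minimal sketch, assuming the tag nests Channel tags inside a TimeSeries (the tag casing and nesting are assumptions to confirm against the tag reference):
<View>
  <TimeSeries name="ts" value="$csv" valueType="url" timeColumn="time">
    <!-- Hypothetical nesting: combine two channels into one synchronized view -->
    <MultiChannel>
      <Channel column="temperature"/>
      <Channel column="humidity"/>
    </MultiChannel>
  </TimeSeries>
</View>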
When using the View All action, users who are in the Reviewer role or higher can now see a summary of the annotations for a specific task. This summary includes metadata, agreements, and side-by-side comparisons of labels.
You can use this summary for a more efficient and detailed review of annotated tasks and to better understand consensus and discrepancies, especially when needing to compare the work of multiple annotators.
When applying filters, you will see new options that correspond to annotation results.
These options are identified by the results chip and correspond to control tag names. They support complex filtering for multiple annotation results; for example, you can filter by “includes all” or “does not include.”
This enhancement provides a much more direct, predictable, and reliable way to filter and analyze annotation results, saving time and reducing the chances of errors previously encountered with regex matching.
For more information, see Filter annotation results.
When deleting annotations, reviews, or assignments, you can now select a specific user for the delete action. Previously, you were only able to delete all instances.
With this change, you will have more granular control over data deletion, allowing for precise management of reviews and annotations.
This enhancement is available for the following actions:
Users can now opt in to email notifications for when they are invited to a project or workspace. These options are available from the Account & Settings page.
This ensures users are promptly aware of new project and workspace invitations, improving collaboration and onboarding workflows.
There are two UI changes related to storage proxies:
The Billing & Usage page has been renamed the Usage & License page. Previously this page was only visible to users in the Owner role. A read-only form of this page is now available to all users in the Admin role.
Organization owners can use the new Session Timeout Policies fields to control session timeout settings for all users within their organization. These fields are available from the Usage & License page.
Owners can configure both the maximum session age (total duration of a session) and the maximum time between activity (inactivity timeout).