Connecting models to generate predictions in Label Studio Enterprise has never been easier thanks to a significant update to the workflow and UI, and the addition of basic auth. Whether you want to automate labeling or evaluate models, the new ML backend connection enables you to accelerate projects with your choice of models while maintaining full control, customizability, and compliance.
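If you prefer to script the connection rather than use the UI, it can also be registered through the Label Studio API. Below is a minimal sketch, assuming a REST call to the `/api/ml` endpoint with a personal access token; the basic auth field names shown here are assumptions for illustration, so check the API reference for your version:

```python
import requests

LABEL_STUDIO_URL = "https://app.humansignal.com"   # or your on-prem host
API_TOKEN = "your-access-token"                    # from your account settings
PROJECT_ID = 1                                     # hypothetical project id

# Register an ML backend for a project. The basic auth credentials protect
# the connection between Label Studio and your model server.
response = requests.post(
    f"{LABEL_STUDIO_URL}/api/ml",
    headers={"Authorization": f"Token {API_TOKEN}"},
    json={
        "project": PROJECT_ID,
        "title": "ocr-backend",
        "url": "https://ml-backend.internal:9090",  # your model server
        # Field names below are assumptions; verify against your API reference.
        "auth_method": "BASIC_AUTH",
        "basic_auth_user": "labelstudio",
        "basic_auth_pass": "a-strong-secret",
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```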
The connection works much like loading a set of existing predictions into Label Studio, except that here the labeling task is sent to your model as context, and the predictions it returns are applied right inside Label Studio. Annotators can then review and accept predictions rather than tediously entering their own.
For example, an image annotation task may require drawing bounding boxes around building signage and then inputting the actual text of each sign. An OCR model can accelerate this task by auto-populating the text of the sign.
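To make that flow concrete, here is a minimal sketch of a custom backend for such an OCR scenario, built on the open source label-studio-ml SDK. The `run_ocr` helper and the tag names (`bbox`, `transcription`, `image`) are placeholders that would need to match your model and labeling config, and the expected return type can differ between SDK versions; the results follow Label Studio's standard prediction JSON.

```python
from label_studio_ml.model import LabelStudioMLBase


def run_ocr(image_url):
    """Placeholder for your OCR model; returns detected text regions."""
    # Each region: percent-based coordinates plus the recognized text.
    return [{"x": 12.0, "y": 40.5, "width": 30.0, "height": 8.0, "text": "OPEN 24 HOURS"}]


class OCRBackend(LabelStudioMLBase):
    def predict(self, tasks, context=None, **kwargs):
        predictions = []
        for task in tasks:
            image_url = task["data"]["image"]   # assumes an <Image name="image"> tag
            results = []
            for i, region in enumerate(run_ocr(image_url)):
                region_id = f"region-{i}"
                # Bounding box around the sign (RectangleLabels tag named "bbox").
                results.append({
                    "id": region_id,
                    "from_name": "bbox",
                    "to_name": "image",
                    "type": "rectanglelabels",
                    "value": {"x": region["x"], "y": region["y"],
                              "width": region["width"], "height": region["height"],
                              "rectanglelabels": ["Signage"]},
                })
                # Pre-filled transcription (TextArea tag named "transcription").
                results.append({
                    "id": region_id,
                    "from_name": "transcription",
                    "to_name": "image",
                    "type": "textarea",
                    "value": {"text": [region["text"]]},
                })
            predictions.append({"result": results, "score": 0.9})
        return predictions
```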
For a document summarization task, an annotator can now easily and interactively prompt an LLM to assist with crafting the summary, saving the successful prompt to be used for the next labeling task.
Once a dataset is annotated, you can evaluate your model's performance by comparing its predictions against the ground truth labels. Model evaluation lets you test models before production use, identify challenging edge cases, and discover and respond to data drift once a model is deployed.
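As a rough illustration of that comparison, the sketch below computes per-task accuracy from a Label Studio JSON export, assuming a single-choice classification project; the export path is hypothetical, and the right agreement metric will depend on your task type.

```python
import json


def first_choice(entry):
    """Pull the first Choices value out of an annotation or prediction result."""
    for item in entry.get("result", []):
        if item.get("type") == "choices":
            return item["value"]["choices"][0]
    return None


with open("export.json") as f:  # Label Studio JSON export (hypothetical path)
    tasks = json.load(f)

matched = total = 0
for task in tasks:
    annotations = [a for a in task.get("annotations", []) if not a.get("was_cancelled")]
    predictions = task.get("predictions", [])
    if not annotations or not predictions:
        continue  # skip tasks missing either ground truth or a prediction
    total += 1
    matched += int(first_choice(annotations[0]) == first_choice(predictions[0]))

if total:
    print(f"Prediction accuracy vs. ground truth: {matched / total:.2%} on {total} tasks")
```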
These new capabilities in Label Studio Enterprise make it easier to automate and speed up the labeling workflow within the already familiar annotation UI. Annotators will see AI models instantly offer predictions in the UI that they can easily review, refine, and accept. They’ll also have the tools to evaluate results, refine model prompts, and apply automated predictions across datasets much larger than what can be annotated by hand.
Enhanced productivity for your team may look appealing, but there are important questions: How do you control which model is being used, or bring your own custom models? How can you guarantee the safety and compliance of your data?
The good news is we built AI automation for Label Studio with demanding enterprise requirements in mind.
Teams have full control to select any available model to power these automations—from open source, from a commercial provider, or completely custom. In fact, after training or fine-tuning your own model, your team can now use the same automation workflows to evaluate the model's performance and compare its outputs to existing human annotations.
Enhancements to the ML backend integration in Label Studio Enterprise include a streamlined connection workflow, an updated UI, and support for basic auth when connecting to your model server.
Label Studio Enterprise runs on fully managed cloud infrastructure with HIPAA compliance and SOC 2 certification, or can be deployed on-premises. And even as you make use of automation, your data in transit between the interface and your models never touches our servers.
When it becomes faster to annotate new data and to evaluate model output, training loops run much more efficiently. You can finally run ML workflows that keep pace with the incredible volumes of data seen at scale.
If you are looking to bring ML projects to market faster, the automation capabilities in Label Studio will be a powerful asset for your team and projects. We'd love to offer guidance to help you select models, stand up an integration, and rapidly automate workflows for your entire team.