Customer Story
In Conversation With: Associate Director, OM1
At OM1, the team’s work often requires extracting structured information from unstructured clinical notes, records that frequently contain multiple related findings within the same narrative, each with its own attributes. On a recent large-scale clinical annotation project, OM1 used Label Studio Enterprise to manage complex entity relationships, scale its QA workflow, and keep review documentation organized across thousands of tasks.
Clinical notes rarely come in neatly structured formats. A single patient record might contain multiple procedure findings, each with associated attributes like anatomical location, measurements, and clinical interpretations, and those associations need to stay intact when they are extracted. Linking the wrong attribute to the wrong finding corrupts the downstream dataset.
Beyond extraction quality, projects at this scale require a rigorous review process. With thousands of notes and multiple annotations per task, reviewers need to efficiently compare outputs, resolve disagreements, and document decisions, all while maintaining context and avoiding external tracking overhead.
Clinical annotation projects live or die on data integrity and workflow efficiency.
Label Studio's Relations feature lets OM1’s annotators link attributes directly to their corresponding findings during labeling, rather than reconstructing those relationships later in downstream data processing. When a note contains multiple findings, annotators keep each finding distinct while attaching the correct attributes to it, so the exported dataset preserves the relationships between findings and their attributes without any downstream cleanup.
OM1 saw a 20–30% improvement in annotator efficiency from this approach alone.
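As a rough illustration of this kind of setup, a Label Studio labeling configuration with typed relations might look like the sketch below. The label and relation names here are hypothetical, not OM1's actual schema: annotators mark spans with the entity labels, then connect two regions and assign one of the defined relation types, so each attribute span is explicitly tied to its finding.

```xml
<!-- Hypothetical labeling config; entity and relation names are illustrative. -->
<View>
  <Labels name="entities" toName="note">
    <Label value="Finding"/>
    <Label value="AnatomicalLocation"/>
    <Label value="Measurement"/>
    <Label value="Interpretation"/>
  </Labels>
  <!-- Relation types annotators can assign when linking two labeled regions -->
  <Relations>
    <Relation value="has_location"/>
    <Relation value="has_measurement"/>
    <Relation value="has_interpretation"/>
  </Relations>
  <Text name="note" value="$text"/>
</View>
```

In the exported annotations, each relation appears as an explicit link between the two region IDs, which is what keeps a measurement attached to the correct finding when a note contains several.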
For quality assurance, OM1 ran two independent annotations on every task. The 'Review All Tasks' stream in Label Studio Enterprise gave reviewers a side-by-side view of both annotations, so they could identify differences and resolve them within a single workflow without toggling between records.
This cut OM1’s review cycle time by ~50% compared to external comparison approaches.
At scale, it’s not enough to resolve disagreements; teams also need a record of how and why decisions were made. Task-level commenting in Label Studio Enterprise lets reviewers log decisions and flag edge cases directly on the relevant task, with timestamps.
Because those flags and comments live on the tasks themselves, the team can more easily spot patterns in disagreements or guideline ambiguity across the project.
This reduced reliance on ad hoc tracking spreadsheets and made issue resolution 50–75% faster.
Label Studio gives OM1 the tooling to handle each of these challenges: preserving complex relationships during annotation, scaling the review process without adding headcount, and keeping documentation traceable without external overhead. It has become a core part of how the team operates at scale.