Join us for a live webinar on understanding annotator performance using Label Studio's updated agreement metrics. You'll walk away knowing which metrics to use when, plus the tools to derive immediate insights for improving your data quality at scale.
Pairwise agreement tells you which humans make the same choices, but to manage data quality you need to understand where annotators converge. For high-stakes data, projects with a high volume of annotators, or genAI evaluation use cases, consensus agreement is a must-have. And it's not enough to know your overall agreement scores: you need to know exactly where agreement is low so you can take the right actions to improve reliability for business outcomes.
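To make the distinction concrete, here is a minimal sketch of both ideas using simple percent agreement. This is an illustration only, not Label Studio's implementation: the product supports more robust metrics, and the function names and toy labels below are invented for the example. Note how the overall pairwise score looks reasonable while the per-item consensus view pinpoints exactly which items disagree.

```python
from itertools import combinations
from collections import Counter

def pairwise_agreement(labels_by_annotator):
    """Mean fraction of items on which each pair of annotators agrees."""
    scores = []
    for a, b in combinations(labels_by_annotator, 2):
        matches = sum(x == y for x, y in zip(a, b))
        scores.append(matches / len(a))
    return sum(scores) / len(scores)

def consensus_per_item(labels_by_annotator):
    """For each item, the fraction of annotators voting for the majority label."""
    per_item = []
    for votes in zip(*labels_by_annotator):
        majority_count = Counter(votes).most_common(1)[0][1]
        per_item.append(majority_count / len(votes))
    return per_item

# Toy data: three annotators labeling four items.
annotators = [
    ["cat", "dog", "cat", "dog"],
    ["cat", "dog", "dog", "dog"],
    ["cat", "cat", "dog", "dog"],
]

overall = pairwise_agreement(annotators)     # one aggregate number
per_item = consensus_per_item(annotators)    # where exactly agreement drops
low_agreement_items = [i for i, s in enumerate(per_item) if s < 1.0]

print(f"overall pairwise agreement: {overall:.2f}")  # prints 0.67
print(f"items needing review: {low_agreement_items}")  # prints [1, 2]
```

The aggregate score (~0.67) hides that items 0 and 3 are unanimous while items 1 and 2 split 2-to-1, which is precisely the per-item view that lets you route ambiguous tasks for review.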
In this live webinar, you’ll learn:
For anyone operating on the frontlines of data quality, this session will teach you how to use consensus agreement to align models with human judgment.
Machine Learning Evangelist, HumanSignal
Micaela Kaplan is the Machine Learning Evangelist at HumanSignal. With her background in applied data science and a master's in Computational Linguistics, she loves helping others understand AI tools and practices.
Director of Product Management, HumanSignal
Alec Harris is the Director of Product for Label Studio Enterprise. He is focused on building workflows that unlock value and meet the needs of teams operating at the frontier of AI.