Human Data Annotator – AI/LLM Evaluation (Media & News Domain) at Beam Data
Remote, British Columbia, Canada
Full Time


Start Date

Immediate

Expiry Date

30 Sep 2025

Salary

Not specified

Posted On

25 Aug 2025

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

WHO YOU ARE

  • Strong familiarity with news media, journalism, or publishing content
  • Experience working with language data, content moderation, or QA/editing tasks
  • Strong English reading and writing comprehension
  • Sharp attention to detail, nuance, and editorial quality
  • Able to follow annotation guidelines and collaborate in a remote team
  • Comfortable using web-based annotation tools, spreadsheets, or QA platforms
  • Bonus: Prior experience with RLHF, LLMs, or data labeling (e.g., in AI/ML startups, NLP projects)
Responsibilities

ABOUT THE ROLE

We’re seeking detail-oriented Human Data Annotators to support a high-impact AI/LLM (large language model) project in the news and media publishing space. Your work will directly contribute to the development of a more accurate, responsible, and human-aligned AI system by reducing bias and hallucinations and improving model outputs and responses.
As part of a QA annotation team, you will review AI-generated text outputs and provide human feedback that helps improve the system through reinforcement learning with human feedback (RLHF) and fine-tuning.
This role is ideal for professionals who are passionate about journalism, news quality, and AI ethics, and who want to shape the future of responsible AI systems in news media publishing and journalism.

WHAT YOU’LL DO

  • Review and evaluate AI-generated summaries, answers, or responses from LLMs
  • Provide simple structured feedback (e.g., thumbs up/down or brief qualitative notes)
  • Flag hallucinations, factual inaccuracies, biased content, and unclear language
  • Help ensure AI outputs align with journalistic quality and standards
  • Collaborate with team leads, professional journalists, and AI engineers
  • Contribute to and refine annotation guidelines and training docs
  • Use internal tools and industry annotation platforms to complete assignments