AI Research Engineer (Pre-training) at Tether Operations Limited
London, England, United Kingdom - Full Time

Start Date

Immediate

Expiry Date

16 Sep, 25

Salary

Not specified

Posted On

17 Jun, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Visa Sponsorship

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

JOIN TETHER AND SHAPE THE FUTURE OF DIGITAL FINANCE

At Tether, we’re not just building products; we’re pioneering a global financial revolution. Our cutting-edge solutions empower businesses—from exchanges and wallets to payment processors and ATMs—to seamlessly integrate reserve-backed tokens across blockchains. By harnessing the power of blockchain technology, Tether enables you to store, send, and receive digital tokens instantly, securely, and globally, all at a fraction of the cost. Transparency is the bedrock of everything we do, ensuring trust in every transaction.

Responsibilities
  • Pre-train AI models on large, distributed servers equipped with thousands of NVIDIA GPUs.
  • Design, prototype, and scale innovative architectures to enhance model intelligence.
  • Independently and collaboratively execute experiments, analyze results, and refine methodologies for optimal performance.
  • Investigate, debug, and improve both model efficiency and computational performance.
  • Advance training systems to ensure seamless scalability and efficiency on target platforms.

Requirements
  • A degree in Computer Science or a related field, ideally a PhD in NLP, Machine Learning, or a related discipline, complemented by a solid track record in AI R&D (including publications at A* conferences).
  • Hands-on experience contributing to large-scale LLM training runs on distributed servers equipped with thousands of NVIDIA GPUs, delivering scalable and impactful advances in model performance.
  • Familiarity and practical experience with large-scale distributed training frameworks, libraries, and tools.
  • Deep knowledge of state-of-the-art transformer and non-transformer modifications aimed at enhancing intelligence, efficiency, and scalability.
  • Strong expertise in PyTorch and the Hugging Face libraries, with practical experience in model development, continual pre-training, and deployment.