What You Will Be Doing:
- Pretraining large language models (LLMs).
- Fine-tuning and aligning LLMs on streaming data.
- Writing training pipelines, including distributed training, for LLMs.
- Accelerating the inference of LLMs.
Requirements:
- At least 6 years of commercial Python development experience.
- 5+ years of experience in NLP, with at least 2 years of hands-on experience in pretraining and fine-tuning LLMs.
- Proficiency with frameworks such as PyTorch, TensorFlow, and Hugging Face Transformers.
- Excellent understanding of the architectures of major LLMs.
- Practical experience applying LLMs to various tasks.
- Experience optimizing model inference, including techniques such as quantization.
- A degree in Computer Science, Applied Mathematics, or a related field.
- English proficiency at B2 level or higher.
Bonus Points:
- Research or publications related to NLP or LLM topics.
- Prize-winning placements in LLM-related competitions (e.g., Kaggle, Boosters).
What We Offer:
- Participation in the development of a fast-growing product operating in real-time markets.
- A competitive salary ranging from $8,000 to $15,000, based on your qualifications and interview performance.
- Opportunities to enhance your expertise by working with top-tier colleagues and learning on the job.
- A dynamic and supportive team of professionals who value integrity, honesty, and openness.
- English classes with a native speaker, health insurance after the probation period, and thoughtful holiday gifts.
- The chance to implement bold and ambitious initiatives.
- A horizontal organizational structure with no bureaucracy or "big boss" syndrome.
- A results-driven work culture with a flexible schedule and the option to work fully remotely.
If this sounds like you, apply now to join our team!