
Research Engineer, Foundation Model

On-site, Full-time




Qualifications

The ideal candidate will possess:

  • A strong foundation in machine learning and AI principles.
  • Experience with structured data and familiarity with tabular datasets.
  • Proficiency in Python and ML frameworks such as TensorFlow or PyTorch.
  • Excellent problem-solving skills and the ability to work collaboratively in a team-oriented environment.
  • A passion for innovation and a desire to make a significant impact in the field of AI.

About the job

About Us

At Prior Labs, we are pioneers in developing foundation models that effectively comprehend tabular data, which serves as the cornerstone of various fields including science and business. While foundation models have revolutionized the processing of text and images, structured data remains a largely untapped resource. Our mission is to address this $600 billion opportunity, fundamentally transforming the way organizations engage with scientific, medical, financial, and business data.

Our Achievements: We proudly stand as the leading organization in the realm of structured data machine learning (ML). Our groundbreaking TabPFN v2 model has been featured in Nature, establishing a new benchmark in tabular machine learning. Following its release, we have significantly enhanced our model capabilities, achieving over 2.5 million downloads and receiving more than 5,500 stars on GitHub. We are witnessing a rapid uptick in adoption from both research and industry sectors as we build the next generation of tabular foundation models and commercialize them with enterprises across Europe and the United States.

Our Team: Our team consists of a highly selective group of over 20 engineers and researchers, chosen from a pool of more than 5,000 applicants, with backgrounds at companies such as Google, Apple, Amazon, Microsoft, G-Research, Jane Street, Goldman Sachs, and CERN. We are led by the creators of TabPFN and advised by eminent AI researchers, including Bernhard Schölkopf and Turing Award winner Yann LeCun.

What's Next: Supported by top-tier investors and leaders from Hugging Face, DeepMind, and Silo AI, we are on a rapid growth trajectory. This is an exceptional time to join us and help shape the future of structured data AI. Explore our manifesto for further insight.

Core Areas of Impact

As a member of our engineering team, you will contribute to the development of a novel class of AI models. Our latest innovation, TabPFN, outperforms existing methods by orders of magnitude, and we are just getting started. This is a unique opportunity to:

  • Engage in groundbreaking advancements in AI, rather than just incremental enhancements.
  • Influence the future of how organizations globally manage their most critical data.
  • Join us at an opportune moment: we have secured substantial funding (with announcements imminent), achieved strong initial traction (over 100,000 downloads), and are expanding swiftly.

At Prior Labs, we prioritize collaboration and integration of research into practical applications. Our Research Engineers play a critical role in bridging the gap between innovative research and real-world implementation.

