About the job
This opportunity is exclusively available for candidates currently residing in Europe.
In this role, you will ensure that Waniwani's infrastructure remains seamless and efficient at scale.
We have pioneered the first insurance application within ChatGPT, and now numerous companies are eager to launch their own AI-driven storefronts using our platform. Each of these businesses requires its own MCP server, unique codebase, and tailored distribution, all functioning flawlessly, interconnected, and overseen by you.
About Us
Waniwani is revolutionizing the AI distribution framework for the $10 trillion services sector. We empower businesses to offer quote-based services (insurance, wealth management, etc.) directly within LLMs, where consumer decisions are increasingly being made. Envision Shopify, but tailored for services. We are constructing the foundational infrastructure for AI-native commerce.
We are at the beginning of a transformative platform shift. Our launch of ChatGPT insurance resulted in $26 billion in market cap losses for the world’s largest brokers within just 48 hours. The media dubbed it "the industry's ChatGPT moment." We refer to it as Day 1.
Purpose of This Role
Having surpassed the proof of concept stage, our product is fully operational. We now need to support hundreds of clients simultaneously, each requiring their own MCP server, bespoke data pipelines, and distribution dashboards, all while ensuring the platform operates smoothly.
This position presents a systems-level engineering challenge. You will be responsible for the architecture that allows Waniwani to scale from a handful of deployments to thousands. You'll work with multi-tenant infrastructure, orchestrate multiple codebases, and manage real-time data flowing through hundreds of interconnected services. One flawed abstraction could cost us months, while one well-designed abstraction could save years.
We anticipate that 95% of your code will be generated by AI. Your true value lies not in typing but in discerning what to build, how to architect it, and when the AI makes errors. You will operate at a pace that integrates LLMs as a fundamental part of your engineering process, not as a gimmick. The real constraint will be your judgment, not your typing speed.
Your Responsibilities
Oversee the multi-tenant platform. Design and sustain the infrastructure that accommodates numerous MCP codebases, each independent and customizable, all viewable from a unified interface (or MCP).
Construct the data backbone. Devise the pipelines and frameworks necessary for seamless data integration and analysis.