We’re seeking AI Engineers to help design, build, and deploy next-gen AI solutions across Mercer’s global ecosystem. You will work on Generative AI, RAG pipelines, and AI agents that power internal tools, chatbots, and decision-support systems used by our consultants. This hands‑on engineering role bridges full‑stack software development and advanced AI techniques.
What You’ll Do
- Design and implement AI-driven applications, copilots, and conversational agents using LLMs (OpenAI, Anthropic, Azure OpenAI).
- Build and maintain RAG pipelines using text chunking, embeddings, and vector databases (e.g., MongoDB Atlas Vector Search, FAISS, Pinecone, Weaviate).
- Develop and optimize prompting strategies and agent reasoning logic.
- Engineer LLM‑based middleware integrations using Python and Node.js.
- Contribute to full‑stack application development using the MEAN stack (MongoDB, Express, Angular, Node.js).
- Support secure architecture by implementing MFA, OAuth flows, and containerized deployments.
- Manage CI/CD pipelines using GitHub and Jenkins.
- Collaborate with cross‑functional teams (data scientists, product owners, platform engineers) to bring prototypes into production.
- Maintain code quality, observability, and robustness with proper testing and documentation.
Required Qualifications
- 2+ years of experience in AI/ML or full‑stack software engineering.
- Hands‑on experience building LLM‑based applications, RAG solutions, or AI agents.
- Proficient in Python and/or Node.js and familiar with modern backend/middleware patterns.
- Solid experience with MongoDB, including its vector search capabilities (Atlas Vector Search preferred).
- Experience with MEAN stack (MongoDB, Express, Angular, Node.js).
- Familiarity with prompt engineering, embeddings, and retrieval pipelines.
- Experience implementing or integrating Multi‑Factor Authentication (MFA).
- Comfortable using GitHub for source control and Jenkins for CI/CD automation.
- Excellent problem‑solving skills.
- Strong communication skills, including the ability to explain AI concepts to non‑technical stakeholders.
Nice to Have (But Not Required)
- Experience with Databricks, model registries, and MLOps workflows.
- Familiarity with LangChain, LlamaIndex, Dapr, or Azure AI Agent Service.
- Prior work in AI observability, testing, and platform governance.
- Understanding of cloud security, data governance, and compliance in AI systems.
Preferred Tools & Stack
- MEAN stack, Python, and comparable middleware technologies.
- OpenAI, Azure OpenAI, and Databricks (nice to have).
- LangChain, LlamaIndex, or similar frameworks.
- Git, CI/CD, Docker, and cloud platforms (especially Azure).