Amplitude’s internal AI shows why owning agents beats renting APIs

A custom assistant that reaches across Slack, Jira, Salesforce, and more turned product work into an ambient layer, not another dashboard.
On a recent episode of the How I AI podcast, Wade Chambers, Chief Engineering Officer at Amplitude, walked through how the company built Moda, a custom internal assistant that searches across enterprise data to support faster product development, decision-making, and cross-functional collaboration.
The story is simple and instructive. Instead of buying yet another point solution, Amplitude created an AI layer that lives where people already work, draws on the company’s own information, and changes how teams coordinate.
From Dashboard Fatigue to an Ambient Workflow
Moda is designed to access Amplitude’s core knowledge and communication systems. The episode cites Slack, Confluence, Jira, Salesforce, Zendesk, Google Drive, Productboard, Zoom, Asana, Dropbox, GitHub, HubSpot, and Abnormal Security as sources. In practice, that means a product manager can ask Moda in Slack to analyze customer feedback, generate a PRD, and even help kick off a prototype, all without toggling between tools.
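Amplitude has not published Moda's internals, but the pattern the episode describes, one assistant fanning a question out to many enterprise sources and synthesizing an answer where the question was asked, can be sketched at a high level. Everything below is a hypothetical illustration: the Connector protocol, the Snippet type, and the draft_prd helper are assumptions of this sketch, not Amplitude's code or any vendor's API.

```python
# Hypothetical sketch of the pattern described above: one assistant, many sources.
# None of these names come from Amplitude; they are illustrative stubs only.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Snippet:
    source: str   # e.g. "slack", "zendesk", "productboard"
    text: str     # the retrieved passage


class Connector(Protocol):
    """A read adapter over one enterprise system."""
    name: str

    def search(self, query: str, limit: int = 5) -> list[Snippet]: ...


def gather_context(connectors: list[Connector], query: str) -> list[Snippet]:
    """Fan the question out to every connected system and pool the results."""
    results: list[Snippet] = []
    for connector in connectors:
        results.extend(connector.search(query))
    return results


def draft_prd(question: str, snippets: list[Snippet], llm) -> str:
    """Ask a model to synthesize themes and a first-pass PRD from pooled context.

    `llm` is any callable that maps a prompt string to a completion string;
    swap in whichever provider or internal model you actually use.
    """
    context = "\n\n".join(f"[{s.source}] {s.text}" for s in snippets)
    prompt = (
        f"Question from the product manager: {question}\n\n"
        "Summarize the recurring customer themes in the context below, "
        "then draft a short PRD addressing the top theme.\n\n" + context
    )
    return llm(prompt)
```

In a real deployment each connector would wrap the corresponding system's search API, and the synthesized answer would be posted back into the Slack thread that asked for it.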
The assistant, demonstrated live in Slack, was built in three to four weeks of engineers' spare time, then adopted companywide in just one week through what Chambers describes as a social engineering approach.
The result is not another reporting surface. It is a workflow shift. The episode describes a sequence that compresses user research, PRD creation, and prototyping into a single meeting. Product managers use the assistant to analyze feedback across multiple data sources to identify themes.
Teams run role-swapping exercises with AI tools to increase fluency across product, design, and engineering. Engineering leaders use the same tools to address tech debt and improve development velocity. Pulling information and actions into an ambient assistant that sits inside daily communication unifies the operational surface: the work moves from reading dashboards to asking questions and taking steps in context.
Build Versus Buy in Practice
This is the core build-versus-buy question for internal AI. The episode discusses the decision explicitly and references tools like Glean, ChatGPT, Cursor, Bolt, Figma, Lovable, and v0; the ecosystem of capable off-the-shelf products is growing. Amplitude's choice to build a custom assistant that accesses enterprise data across multiple systems shows another path.
When the assistant can see Slack threads, pull from Confluence pages, open Jira tickets, and synthesize Salesforce and Zendesk notes, it can serve the actual rhythms of the company, not an abstraction of them. That level of integration is hard to rent. It is easier to own.
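Owning the integration matters as much on the write path as on the read path: a rented search layer can usually only retrieve, while an owned agent can also act inside the systems it reads. Continuing the hypothetical sketch above, the TicketingConnector protocol and its field names are illustrative assumptions, not Jira's or any other vendor's actual API.

```python
# Hypothetical continuation of the earlier sketch: owned integrations can write,
# not just read. The TicketingConnector type and its fields are illustrative.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Ticket:
    key: str   # e.g. "PROD-1234" in whatever tracker your team actually uses
    url: str


class TicketingConnector(Protocol):
    """A connector that supports actions, not only search."""

    def create_ticket(self, title: str, body: str, labels: list[str]) -> Ticket: ...


def file_followup(tracker: TicketingConnector, theme: str, evidence: list[str]) -> Ticket:
    """Turn a synthesized customer theme into a tracked work item with its sources attached."""
    body = "Theme surfaced by the internal assistant:\n\n" + theme
    body += "\n\nSupporting excerpts:\n" + "\n".join(f"- {e}" for e in evidence)
    return tracker.create_ticket(
        title=f"Follow up: {theme[:60]}",
        body=body,
        labels=["assistant-generated", "needs-triage"],
    )
```

The design choice is the point: because the connector is yours, the same assistant that surfaces a theme can also file the follow-up where engineering already tracks work.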
The speed of Moda's creation and adoption underscores the point. Built in a few weeks of spare time and adopted in a week through a social engineering approach, the tool spread because it was embedded where work already happens and because it solved immediate, cross-functional tasks. The adoption story suggests that the hardest part of internal AI is not exotic modeling but integration with the systems and behaviors teams already use. That is an organizational capability more than a procurement task.
Our take
Companies that own internal agent infrastructure will outpace those that rent APIs. Moda illustrates why. A custom assistant that taps the company's systems can compress decision cycles, reduce context switching, and make cross-functional collaboration feel native. It can put research synthesis, PRD generation, prototyping, and even tech debt triage into a single conversational loop. The build-versus-buy discussion is not only about cost or model quality; it is about whether the AI layer becomes a first-class part of the company's operating system.
Amplitude’s example is not a glossy demo. It is an internal tool, built quickly, adopted broadly, and used to change how product work gets done. For founders, developers, and operators, the lesson is to treat the assistant as infrastructure, not a plugin.
Choose the integration surface where your teams live, connect the authoritative data sources, and set a social adoption strategy that brings the tool into real meetings and real tickets. The ambient layer is here, and it looks less like a dashboard and more like a colleague in Slack.
Internal agents that you build, wire into your stack, and socialize across teams are becoming the new enterprise productivity moat. Moda shows the path.