FR
Long Talk (25 min)
Building a Reliable AI Product: The Example of Vedder, Spotify's Text-to-Insights Platform
Description
At Spotify, we launched Vedder, an internal tool that turns natural-language questions into reliable SQL queries (text-to-SQL) and is used every day by data teams.
The project was born from a simple need: making data accessible to everyone, reliably and with a genuine guarantee of answer quality. Text-to-SQL is a well-known problem, but mastering it across all of a company's data is a considerable challenge. Explaining how we built Vedder means telling the story of how to build an AI product on top of large language models (LLMs).
In this talk, I will share my experience as a PM on an AI product, along with a few key messages:
• Start small, learn, then scale: how we transformed an experiment into a widely adopted product.
• Adding certainty to LLMs: setting up continuous evaluations to ensure reliable results.
• Involving users in the learning loop: collaborative data curation by business experts.
• Measuring the ROI of an AI product: productivity gains, new types of users, and organizational adoption.
Finally, I will discuss where the product can go next and how to plan for it:
• The prospects for context retrieval and for connecting to the company's collective knowledge, in order to move from text-to-SQL to text-to-insights.
• And the strategic question: how such internal AI products can sustainably differentiate themselves from major LLM providers like OpenAI or Anthropic.

