LLMs are transforming data access with SQL automation
Experimenting with LLMs for Full Automation of Data Retrieval
Our goal was simple: to make data access easier. We wanted users to retrieve data by simply asking questions. To achieve this, we tested various large language models (LLMs), including models from OpenAI as well as Mistral, Llama 2, and WizardCoder. We aimed to see how well each could turn natural language requests into SQL commands.
Benchmarking Methods and Key Findings
We tested both API-based and locally deployed models. Our focus was on how accurately each model could understand different data contexts and create SQL statements. We grouped the models by how they performed under specific conditions. Here’s a breakdown of what we found:
The chart illustrates performance across key test scenarios:
- No Table Structure Provided: We asked the models to generate SQL without knowing the data structure.
- With Table Definitions: We supplied table definitions to guide SQL creation.
- With Sampled Queries and Table Definitions: The models received sample queries along with table details.
Each model’s performance in these categories gave us insights into strengths and trade-offs, helping us fine-tune our AI’s data interaction capabilities. Some models excelled with structured data setups, while others adapted well to iterative “pilot learning,” adjusting as they processed new information.
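The three prompting setups above can be sketched as a single prompt-builder that optionally adds schema and example queries. This is an illustrative reconstruction, not our exact benchmark prompts; `build_prompt` and the schema snippet are hypothetical names.

```python
# Sketch of the three benchmark scenarios: no schema, schema only,
# and schema plus sampled example queries. Names are illustrative.

SCHEMA = "CREATE TABLE companies (cik INTEGER PRIMARY KEY, name TEXT, sic_code TEXT);"

SAMPLE_QUERIES = [
    ("How many companies are there?", "SELECT COUNT(*) FROM companies;"),
]

def build_prompt(question, schema=None, examples=None):
    """Assemble a text-to-SQL prompt under one of the three setups."""
    parts = ["Translate the question into a single SQL statement."]
    if schema:
        parts.append("Table definitions:\n" + schema)
    if examples:
        for q, sql in examples:
            parts.append(f"Example question: {q}\nExample SQL: {sql}")
    parts.append("Question: " + question)
    return "\n\n".join(parts)

# Scenario 1: no table structure provided
p1 = build_prompt("List all companies in the software industry.")
# Scenario 2: with table definitions
p2 = build_prompt("List all companies in the software industry.", schema=SCHEMA)
# Scenario 3: with sampled queries and table definitions
p3 = build_prompt("List all companies in the software industry.",
                  schema=SCHEMA, examples=SAMPLE_QUERIES)
```

The prompts grow strictly richer from scenario 1 to 3, which is what lets the benchmark isolate how much each extra layer of context improves SQL accuracy.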
Benchmark Data Set
We used a curated subset of EDGAR records, the SEC's public repository of corporate and individual financial disclosures. With data on around 100,000 companies, it offered a rich resource for testing how well each model handled complex, real-world data.
Building a Generative User Experience: Key Lessons
As we progressed, a few key insights emerged that shaped how we think about LLMs and generative AI for enterprise solutions:
- LLMs and SQL Compatibility: Through benchmarking, we saw how effectively LLMs could translate user prompts into SQL commands. However, achieving reliable accuracy required continuous adjustments to ensure the AI could handle the diverse data structures typical in large-scale enterprise environments.
- Adaptive Learning and Metadata: To make data access more intuitive, we needed the AI to autonomously generate metadata that makes data easily searchable. This step required the AI to learn dynamically from user behavior—a central principle in creating a generative system that could grow smarter over time.
- Platform Architecture to Support Autonomy: This experience reinforced the importance of building a flexible architecture. Enabling AI to autonomously retrieve data and interpret user intent requires a platform that allows for real-time adaptation. This meant designing our system with evolutionary learning principles at its core.
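The "continuous adjustments" point above can be made concrete with a validate-and-retry loop: generated SQL is checked against the live schema before it ever runs, and any error is fed back so the model can self-correct. This is a minimal sketch; `generate_sql` is a stub standing in for the real LLM call, and `validate_sql` is a hypothetical helper, not part of our production system.

```python
import sqlite3

def validate_sql(conn, sql):
    """Check that generated SQL parses and references real tables/columns
    by asking SQLite to plan it (EXPLAIN), without executing it."""
    try:
        conn.execute("EXPLAIN " + sql)
        return True, None
    except sqlite3.Error as exc:
        return False, str(exc)

def generate_sql(question, error=None):
    # Stub for the actual LLM call; a real implementation would feed
    # `error` back into the prompt so the model can self-correct.
    if error is None:
        return "SELECT nme FROM artists"   # first attempt, with a typo
    return "SELECT name FROM artists"      # corrected retry

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE artists (id INTEGER PRIMARY KEY, name TEXT)")

sql, ok, error = None, False, None
for _ in range(3):                         # bounded retries
    sql = generate_sql("List all artists", error)
    ok, error = validate_sql(conn, sql)
    if ok:
        break
```

Using `EXPLAIN` as a dry-run compiler is one cheap way to catch hallucinated column names before a query touches user data; schema-aware linters would serve the same purpose.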
Introducing Wingman SQL Assistant
These insights led to the creation of Wingman SQL Assistant, a tool that lets users interact with data by simply asking questions, without writing SQL. Currently demoed with a sample music database, Wingman allows users to get answers on metrics like top-selling tracks or customer purchasing behavior—all through natural language.
Try It Yourself
Explore Wingman SQL Assistant and experience conversational data access firsthand. Visit Wingman SQL Assistant and see how it can transform the way you work with data. You can ask for SQL queries against a sample database of a music collection. For example, you could ask it to list all artists, or all customers and their purchased songs.
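To give a feel for what such a question maps to, here is a minimal stand-in for the sample music database and the kind of JOIN that "show all customers and their purchased songs" could become. The table and column names are illustrative, not Wingman's actual schema.

```python
import sqlite3

# Tiny stand-in for the sample music database (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE tracks    (id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE purchases (customer_id INTEGER REFERENCES customers(id),
                        track_id    INTEGER REFERENCES tracks(id));
INSERT INTO customers VALUES (1, 'Ada'), (2, 'Linus');
INSERT INTO tracks    VALUES (1, 'Song A'), (2, 'Song B');
INSERT INTO purchases VALUES (1, 1), (1, 2), (2, 2);
""")

# "Show all customers and their purchased songs" could translate to:
rows = conn.execute("""
    SELECT c.name, t.title
    FROM customers c
    JOIN purchases p ON p.customer_id = c.id
    JOIN tracks t    ON t.id = p.track_id
    ORDER BY c.name, t.title
""").fetchall()
```

The value of the assistant is that users never see this JOIN; they only see the question and the answer.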
Sharing Our Vision: A Collaborative Step Toward Generative AI
Our work on LLM-driven data retrieval is ongoing, but it’s taught us that the potential for generative AI in enterprise tech is enormous. We envision a future where AI can autonomously manage and interpret data, responding to user requests with accuracy and insight. By sharing these lessons, we hope to contribute to the broader AI community and spark discussions around best practices in data access automation.
At Salesteq, we’re building an AI that’s more than just a tool—it’s a partner in the sales journey. If you’re working on similar challenges, we invite you to join the conversation as we explore the frontiers of autonomous, adaptive intelligence in enterprise solutions.