AI SQL Query Optimization for Increased Accuracy of AI Assistants [Case Study]


AI chatbots have taken the technology world by storm. Natural-sounding, human-like responses of AI assistants provided immediate and clear benefits and prompted fast and widespread adoption in a variety of industries. 

One problem remains, however – the accuracy of answers. 

Generative AI models (such as ChatGPT) are known for hallucinating answers, which is unacceptable for business applications. On top of that, it is also remarkably difficult to make AI models systematically and correctly retrieve information from external sources, such as business-specific databases. 

In this article, we describe how our team solved this challenge by using two alternative strategies. In our case study, we built an AI chatbot for a construction company and connected it to a database containing historical data of construction projects. Then, we compared whether Retrieval-Augmented Generation or SQL query optimization is more effective at ensuring the highest accuracy of answers in our solution.  

Read further to learn how we managed to progressively increase the accuracy of AI responses with a custom AI SQL query optimizer. 

What are AI chatbots?

AI chatbots are apps that use natural language to talk with humans by text or voice. AI chatbots rely on artificial intelligence to understand, process, and respond to human queries in a manner that resembles real human interaction. 

AI bots are primarily used for two purposes – customer support and internal knowledge-sharing. 

For example, an AI bot can simplify retrieval of information from an internal database. Instead of writing a query in a database-specific language, you can just ask a question in plain language and get a precise answer from the database. This lets non-technical employees use the database and removes a tedious learning curve.

Components of AI chatbots

There are three key architectural components of AI chatbots – Natural Language Processing, Large Language Models, and the knowledge base. 

Natural Language Processing

Natural Language Processing is a technology that allows humans and computers to communicate using human language. It combines principles from computational linguistics and machine learning to enable chatbots to generate useful and meaningful text. 

NLP allows chatbots to: 

  • Understand text. Chatbots can analyze and comprehend the content and context of written text.

  • Recognize speech. Chatbots can convert spoken language into text.

  • Generate text. Chatbots can create coherent and contextually appropriate text based on given input.

  • Analyze sentiment. Chatbots can determine the emotional tone behind a body of text.

  • Translate languages. Chatbots can translate text or speech from one language to another.

NLP effectively processes unstructured human queries. This allows AI chatbots to understand the requests, find relevant information, and provide accurate and timely responses.  

LLM

Large Language Models are a type of artificial intelligence system that can perform a wide variety of tasks involving Natural Language Processing. LLMs are trained on vast volumes of text from the Internet, which allows them to interpret human language and generate answers. 

The key difference between an LLM and a standard computer program is the ability to respond to unpredictable requests. Whereas other programs can only respond to a specific set of inputs or commands in a fixed syntax, an LLM can understand virtually any query written in human language.

This makes a big difference for the end user. Suppose you run a clothes shop and keep sales statistics for different clothing items in a database. In most cases, that would be a relational database queried with SQL. Such a database, however, can only understand queries written in SQL syntax. 

This means that if you want to know how many dresses were sold last quarter, you will need to type a query similar to this:

SELECT SUM(quantity) AS total_dresses_sold
FROM sales
WHERE product_type = 'dress'
  -- start of the previous quarter (inclusive)
  AND purchase_date >= MAKEDATE(YEAR(CURDATE()), 1) + INTERVAL (QUARTER(CURDATE()) - 2) QUARTER
  -- start of the current quarter (exclusive)
  AND purchase_date < MAKEDATE(YEAR(CURDATE()), 1) + INTERVAL (QUARTER(CURDATE()) - 1) QUARTER;

If you have an LLM, you will get the same result by simply asking “How many dresses were sold last quarter?”. 

The difference is obvious – LLMs serve as a bridge between the database and the non-technical user. Instead of learning SQL commands, you can simply ask the chatbot a question in a regular language, and it will retrieve the necessary information from a database. 
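To make the "last quarter" logic above concrete, here is a minimal, self-contained sketch in Python with SQLite. The `sales` table, its rows, and the fixed "today" date are all invented for the demo; the quarter boundaries are computed explicitly in Python because SQLite lacks MySQL's date helpers:

```python
import sqlite3
from datetime import date

def last_quarter_bounds(today: date) -> tuple[str, str]:
    """Return [start, end) ISO dates covering the quarter before `today`."""
    q = (today.month - 1) // 3 + 1                    # current quarter, 1..4
    start_cur = date(today.year, 3 * (q - 1) + 1, 1)  # first day of current quarter
    if q == 1:
        start_prev = date(today.year - 1, 10, 1)      # Q4 of the previous year
    else:
        start_prev = date(today.year, 3 * (q - 2) + 1, 1)
    return start_prev.isoformat(), start_cur.isoformat()

# In-memory demo database with a hypothetical `sales` table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product_type TEXT, quantity INTEGER, purchase_date TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("dress", 3, "2024-02-10"),   # last quarter (relative to 2024-05-15)
     ("dress", 2, "2024-01-05"),   # last quarter
     ("dress", 7, "2024-04-20"),   # current quarter -> excluded
     ("shirt", 9, "2024-02-11")],  # wrong product  -> excluded
)

start, end = last_quarter_bounds(date(2024, 5, 15))
total = conn.execute(
    "SELECT COALESCE(SUM(quantity), 0) FROM sales "
    "WHERE product_type = 'dress' AND purchase_date >= ? AND purchase_date < ?",
    (start, end),
).fetchone()[0]
print(total)  # 5
```

The half-open `[start, end)` range avoids off-by-one errors at quarter boundaries, which is exactly the kind of detail a non-technical user never has to think about when asking the question in plain language.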

The database

A database, or knowledge base, is a repository of information that a chatbot uses to provide detailed and accurate responses. While large language models (LLMs) are trained on vast datasets, they are not all-knowing. For example, GPT-3.5's knowledge is limited to information available on the Internet up until September 2021. This limitation means that LLMs often struggle with answering highly specific or detailed queries.

If you need your chatbot to access specialized data from your company or other technical information—such as internal proposals, project reports, blueprints, etc.—you must create a custom database. When you connect the LLM to this tailored repository and train the chatbot to retrieve information from it, you can ensure it provides precise and relevant answers. The tricky part is choosing how to do it. 

Our developers used and compared the efficiency of two main approaches: 

  • RAG. Retrieval-augmented generation is a popular method for connecting LLMs to external databases. In a RAG pipeline, user queries are converted into numeric vectors (embeddings) by an AI model. These vectors are then compared against the vectors of documents in a knowledge base. When the system finds close matches, it retrieves the corresponding text passages and passes them to the LLM as context for generating the answer.

  • SQL query tuning. Instead of converting information into a vector database, this method stores the information in a standard SQL database. User queries are then converted into SQL queries by a specialized AI model, while the selected general-purpose LLM is thoroughly prompt-tuned to ensure higher accuracy of answers. 

We have discussed the RAG method in more detail in an article about AI chatbots. In the next section, we’ll describe the AI SQL query method, and show how it helped achieve exceptional accuracy in our solution for a construction business. 
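To make the RAG retrieval step concrete, here is a minimal sketch of vector matching in plain Python. The word-count `embed` function is a toy stand-in for a real embedding model, and the knowledge-base passages are invented for the demo:

```python
import math
import re

def embed(text: str) -> dict[str, int]:
    # Toy stand-in for a real embedding model: word-count vectors.
    # A production RAG pipeline would call an actual embedding API here.
    counts: dict[str, int] = {}
    for word in re.findall(r"[a-z0-9]+", text.lower()):
        counts[word] = counts.get(word, 0) + 1
    return counts

def cosine(a: dict[str, int], b: dict[str, int]) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical knowledge base: each passage is stored with its vector.
passages = [
    "Project Alpha finished in 2022 with a total cost of 1.2M USD",
    "Safety inspection checklist for crane operation",
    "Concrete supplier contacts for the northeast region",
]
index = [(p, embed(p)) for p in passages]

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank passages by similarity to the query and return the top k.
    qv = embed(query)
    ranked = sorted(index, key=lambda pe: cosine(qv, pe[1]), reverse=True)
    return [p for p, _ in ranked[:k]]

best = retrieve("What did Project Alpha cost?")[0]
```

In a real pipeline the retrieved passage would be appended to the LLM prompt as context; the weak link, as our case study shows, is that the closest vector match is not always the passage that actually answers the question.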

AI SQL optimizer for higher accuracy: a case study

A full life cycle construction service provider with years of experience in building large-scale projects across the U.S. contacted Apiko with a problem related to construction estimating. 

The company had collected vast amounts of data over time, yet this data remained unstructured and inaccessible. Because the data went unused, estimators often had to build estimates from scratch – a much slower and less accurate process. 

Together, we decided to develop a chatbot that would facilitate effective access to historical data within their company, including proposals, project reports, blueprints, invoices, and more. 

After developing an initial RAG version of the chatbot (read the article here), our team was disappointed by the accuracy of its responses and decided to try an alternative method. 

Components of the SQL optimizer for an AI chatbot

Here is the architecture of the developed solution: 

Architecture of Conversational AI chatbot

The solution consists of four components, or steps. 

  1. Conversion of the human question into a suggested question.
    After the user asks the chatbot a question, the question is directed to the Mistral LLM. Mistral reformulates the question using the description of the database (the list of its tables and fields) and a set of suggested questions. This is done to increase the accuracy of the response. During the testing phase, our team experimentally identified the formulations that were most effective at retrieving the right answers from the database.

  2. Conversion of the suggested question to an SQL query.
    After that, another specialized model is used to convert the suggested question to an SQL query. 

  3. Retrieval of data with an SQL query.
    After the correct SQL query is formed, the information is retrieved from the SQL database. 

  4. Conversion of the response from the SQL database into human language.
    The response from the SQL database is directed to the Mistral LLM again. The prompt instructs the LLM to write a response based on the user request, the SQL query, and the SQL response. 

As you can see, this approach extensively uses prompt-tuning – a process of feeding the AI model with prompts that would guide it to a desired result. Let’s see the results that we achieved with this approach as compared to the RAG chatbot. 
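The four steps above can be sketched as a small pipeline of functions. The two LLM calls are stubbed with canned outputs for the demo – the schema, function names, and the canned SQL are illustrative assumptions, not the production code:

```python
import sqlite3

# Hypothetical schema description fed to the LLM in step 1.
SCHEMA = "projects(name TEXT, year INTEGER, cost REAL)"

def reformulate_question(question: str) -> str:
    # Step 1: the Mistral LLM rewrites the question using the database
    # description and suggested formulations (stubbed here).
    return f"Using schema {SCHEMA}: {question}"

def to_sql(suggested: str) -> str:
    # Step 2: a specialized text-to-SQL model converts the suggested
    # question into an SQL query (stubbed with a canned query).
    return "SELECT AVG(cost) FROM projects WHERE year = 2022"

def run_query(sql: str, conn: sqlite3.Connection):
    # Step 3: retrieve the data from the SQL database.
    return conn.execute(sql).fetchone()[0]

def to_answer(question: str, sql: str, result) -> str:
    # Step 4: the LLM writes a response from the user request, the SQL
    # query, and the SQL response (stubbed as a template).
    return f"The answer to '{question}' is {result}."

# Demo database standing in for the construction company's historical data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE projects (name TEXT, year INTEGER, cost REAL)")
conn.executemany("INSERT INTO projects VALUES (?, ?, ?)",
                 [("Alpha", 2022, 1.2), ("Beta", 2022, 1.8), ("Gamma", 2021, 0.9)])

question = "What was the average project cost in 2022?"
sql = to_sql(reformulate_question(question))
answer = to_answer(question, sql, run_query(sql, conn))
```

Keeping each step a separate function is what makes prompt-tuning practical: the reformulation prompt in step 1 can be iterated on in isolation while the rest of the pipeline stays fixed.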

SQL query performance tuning: the effectiveness 

[Graph: accuracy comparison of the RAG chatbot and the SQL-based chatbot variants]

The graph shows the comparison of accuracy between the RAG chatbot (apiko-ai), and two variants of SQL chatbots – one without prompt-tuning via suggestions, and one with prompt-tuning. 

The results are clear – the RAG chatbot produces a majority of inaccurate answers. An SQL-based chatbot without prompt-tuning is accurate in 70% of cases, and the same chatbot optimized with prompt-tuning is accurate in 85% of cases. 

Our case study clearly shows the effectiveness of prompt-tuning as an SQL optimizer when working with LLMs. SQL performance tuning makes it possible to achieve higher LLM accuracy and ensure the reliability of the chatbot. 
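For readers who want to run a similar comparison, accuracy here is simply the share of test questions answered correctly. A minimal sketch follows, with invented evaluation data rather than our case-study results:

```python
def accuracy(predictions: list[str], gold: list[str]) -> float:
    # Fraction of chatbot answers that exactly match the expected answers.
    correct = sum(p == g for p, g in zip(predictions, gold))
    return correct / len(gold)

# Hypothetical evaluation set: expected answers vs. what each variant returned.
gold        = ["1.5M", "12", "Alpha", "2022", "no"]
no_tuning   = ["1.5M", "12", "Beta",  "2022", "no"]   # one wrong answer
with_tuning = ["1.5M", "12", "Alpha", "2022", "no"]   # all correct

print(accuracy(no_tuning, gold), accuracy(with_tuning, gold))
```

Exact string matching is the simplest possible scoring rule; in practice each answer pair is usually reviewed by a human or a judge model, since a correct answer can be phrased many ways.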

Additionally, we have prepared a video demo of the Conversational AI chatbot for you. Just take a few minutes to watch it: 

Conclusions

Our case study proves the advantages of an AI SQL query optimizer in improving the accuracy of AI in construction. The high accuracy compared to other approaches ensures that the user receives correct and timely responses. Aside from increasing the value of AI for construction estimating, this approach can be used for optimizing AI chatbots in other industries, including hospitality, real estate, manufacturing, and others. If you are thinking about an AI-based project for your business – don’t hesitate to reach out. Our team has extensive experience in developing bespoke AI solutions for a variety of international businesses.