In this guest post, we present an insightful case study on the innovative applications of generative AI, specifically large language models (LLMs), in the automotive industry. Adam Kozłowski, Head of Automotive R&D at Grape Up, shares his expertise on how these advanced Gen AI tools are revolutionizing knowledge management and customer support through sophisticated chatbots and knowledge graphs.
Read on to discover the transformative potential and practical implementations of LLMs in real-world scenarios.
The most common, and by now the most standard, applications of generative AI, specifically large language models (LLMs), lie in knowledge management and customer support, which in practice means chatbots. However, LLM-powered chatbots are far more advanced than the chatbots we were used to. They are capable not only of answering standard questions but also of triggering specific events.
We are currently working on several such applications for the automotive industry. One is a technical support system that searches through the documentation accumulated in the organization and, if necessary, triggers functions that help customers manage their accounts, invoicing, payment information, and purchase history. The other, which is quite similar, is intended for a business development unit.
It allows users not only to access information quickly but also to compare, for example, the terms recorded in contracts with the agreements reached during negotiations, such as on a call.
Both systems support operational work because they have access to everything in the internal communication systems and, depending on a given user’s access level, can provide information based on this knowledge. They can be used to accelerate access to information and verify agreements.
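To make the function-triggering part more concrete, below is a minimal sketch of the tool-calling pattern such a support chatbot can follow, written with the OpenAI Python SDK. The model name, the `get_purchase_history` tool, and the sample data are illustrative assumptions, not the actual system described above.

```python
# Minimal sketch of a support chatbot that can trigger account-related functions
# via tool calling. All tool names and data are illustrative placeholders.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def get_purchase_history(customer_id: str) -> str:
    """Hypothetical backend call; a real system would query internal services."""
    return json.dumps({"customer_id": customer_id, "orders": ["INV-1001", "INV-1002"]})

tools = [{
    "type": "function",
    "function": {
        "name": "get_purchase_history",
        "description": "Return the purchase history for a given customer.",
        "parameters": {
            "type": "object",
            "properties": {"customer_id": {"type": "string"}},
            "required": ["customer_id"],
        },
    },
}]

messages = [
    {"role": "system", "content": "You are a technical support assistant."},
    {"role": "user", "content": "Show me the purchase history for customer C-42."},
]

# First pass: the model decides whether to answer directly or call a tool.
response = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
message = response.choices[0].message

if message.tool_calls:
    messages.append(message)
    for call in message.tool_calls:
        args = json.loads(call.function.arguments)
        result = get_purchase_history(**args)  # dispatch to the actual function
        messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
    # Second pass: the model turns the tool output into a user-facing answer.
    final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    print(final.choices[0].message.content)
else:
    print(message.content)
```

The same loop generalizes to any account, invoicing, or payment function: the model decides which tool to call, the application executes it, and a second model call turns the result into a reply.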
Another chatbot we are working on is dedicated to programmers, but its principle of operation is similar: it streamlines the software development process by accelerating the search for information about technologies approved for use in a given project, i.e., accepted by the cybersecurity, IT, and legal departments. This may sound trivial, but in large organizations, assembling the so-called tech stack in a new project can take months.
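One way such a chatbot can search the approved-technology list is plain retrieval-augmented generation: embed the records once, find the entries closest to a developer's question, and let the model answer only from those entries. The sketch below assumes the OpenAI embeddings and chat APIs; the records and model names are made up for illustration.

```python
# Retrieval sketch: embed approved-technology records once, then answer
# developer questions from the closest matches. Records are illustrative.
import numpy as np
from openai import OpenAI

client = OpenAI()

approved_tech = [
    "PostgreSQL 15: approved by cybersecurity, IT, and legal for backend storage.",
    "React 18: approved for internal web frontends; external use requires legal review.",
    "Apache Kafka: approved for event streaming in vehicle-data pipelines.",
]

def embed(texts: list[str]) -> np.ndarray:
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in response.data])

doc_vectors = embed(approved_tech)

def answer(question: str, top_k: int = 2) -> str:
    query_vec = embed([question])[0]
    # Cosine similarity between the question and each approved-tech record.
    scores = doc_vectors @ query_vec / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query_vec)
    )
    context = "\n".join(approved_tech[i] for i in np.argsort(scores)[::-1][:top_k])
    completion = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Answer only from the approved-technology list below.\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return completion.choices[0].message.content

print(answer("Can I use Kafka for telemetry streaming?"))
```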
A less standard project, one in which we rely on more than the conversational abilities of LLMs, is our knowledge graph. In this system, we use LLMs in two places: the first analyzes text documents and builds a graph from them; that graph then serves as a source of knowledge for a second, "dialogue" LLM, so you can simply ask it what is in the graph, much as you would with vector search.
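A rough sketch of those two roles, assuming the OpenAI SDK for both LLM calls and networkx for the graph itself, might look as follows; the prompts, model name, and sample document are illustrative, not the production pipeline.

```python
# Sketch of the two LLM roles: one extracts (subject, relation, object) triples
# from documents to build a graph, the other answers questions from that graph.
import json
import networkx as nx
from openai import OpenAI

client = OpenAI()

def extract_triples(document: str) -> list[list[str]]:
    """First LLM: turn free text into graph edges (a real system would
    validate and robustly parse the model output)."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Extract facts as a JSON array of "
             "[subject, relation, object] triples. Return JSON only."},
            {"role": "user", "content": document},
        ],
    )
    return json.loads(response.choices[0].message.content)

def build_graph(documents: list[str]) -> nx.DiGraph:
    graph = nx.DiGraph()
    for doc in documents:
        for subject, relation, obj in extract_triples(doc):
            graph.add_edge(subject, obj, relation=relation)
    return graph

def ask_graph(graph: nx.DiGraph, question: str) -> str:
    """Second, "dialogue" LLM: answer using the graph's facts as context."""
    # A real system would select only the subgraph relevant to the question
    # instead of serializing every edge.
    facts = "\n".join(
        f"{s} -[{d['relation']}]-> {o}" for s, o, d in graph.edges(data=True)
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Answer using only these facts:\n" + facts},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

docs = ["The infotainment unit is supplied by Vendor X and runs on Android Automotive."]
graph = build_graph(docs)
print(ask_graph(graph, "Who supplies the infotainment unit?"))
```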
Adam Kozłowski, Head of Automotive R&D, Grape Up
Grape Up is a technology consulting company helping the world’s leading enterprises deliver their most impactful software using AI, Machine Learning, Cloud-Native Technologies, and a unique approach to software delivery.
Are you interested in more insights about Gen AI’s real-life implementations? Download our report!