Technology

What's the Big Deal with LLMs?

January 01, 2024

Large Language Models such as ChatGPT have gotten a lot of mainstream attention over the past few months. But what will LLMs actually mean for enterprises?

The Firebrand team recently polled a handful of leaders and came to a surprising conclusion: most people do not realize how impactful Large Language Models (LLMs) will be in the enterprise! This was unexpected, as the evidence so far points to vast potential for LLMs in everything from Sales & Marketing to Customer Support, and even Finance, Legal, and HR. With that in mind, we hope this post will spur your thinking on the opportunities LLMs present for the enterprise, and get you started on the road to LLM innovation.

As you've probably heard, an LLM is an artificial intelligence system that can understand and generate natural language. Examples of LLMs include GPT-4 by OpenAI, PaLM 2 by Google, Llama 2 by Meta, and BLOOM by Hugging Face. This is just a small sampling, as literally dozens of new LLMs have emerged in recent months. But LLMs go beyond language: because they are trained on large amounts of general knowledge, they also possess a certain level of basic reasoning and understanding of various topics.

This is where it gets interesting. These generalized (often termed "foundation") LLMs, with their language, reasoning, and understanding skills, are useful for a variety of enterprise purposes, especially when combined with the data the enterprise owns. Essentially, an enterprise can adopt an LLM and train it further on domain-specific and company-specific knowledge, all without sacrificing privacy. The resulting specialty or Hybrid LLM can then augment or support human knowledge workers in completing a variety of complex tasks, improving the speed and quality of their output.

What's an Example Use Case for an LLM?

An example most people can relate to is customer support. Enterprises currently spend large amounts of money on customer support, and any technology that can reduce the time spent and the cognitive load on both the agent and the customer will be welcomed. Simple chatbots have been around for years filling in gaps, but they lack the sophistication to handle more complex queries. Generative AI, on the other hand, has been shown to be up to the task of complex queries, and can even outperform humans in certain circumstances, particularly when it comes to applying policy consistently. A Generative AI assistant can also retain a memory of previous conversations with the customer, which makes for a better customer experience.

Sounds Great. What's the Catch?

As with any new technology, there is still much to learn. There are only a handful of Generative AI solutions deployed at enterprises today, and those companies are keeping their learnings a closely guarded secret. The Firebrand team has been involved with some of these solutions and can attest to the need for strong governance and clear use cases. But we can also attest to the benefits of Generative AI, and the fact that this technology delivers on its promise in many ways. Embarking on a Generative AI journey will be a learning experience, but well worth it.

Cost is the next factor to consider. Today the cost of the most advanced systems (e.g., OpenAI's GPT-4) is prohibitively high for some use cases. But costs will continue to fall, and the advanced technologies of today will come into reach for everyone tomorrow. That said, cost will always be part of the calculus in deciding which use cases to address and when.

Finally, compliance capabilities are still developing within Generative AI. For instance, because of the way an LLM stores data, there is no clear way to remove information from an LLM. That said, several companies are working on a solution to this problem, and it's expected to be in market before most companies get to the use cases that will need it.

So Where to Start?

If you're just getting started with Generative AI, it's best to start with co-pilots on internal use cases. A co-pilot is an AI that rides shotgun with a human, helping them be more effective and efficient. Here are the top 3 use cases we recommend, in order:

1. BI: Analytics leaders and other managers can use an LLM on top of their BI data to develop insights in a conversational manner. This is a great use case for an LLM and showcases many of its capabilities (a brief sketch follows this list). Refer to our Featured Offering "AI4BI" for more details.

2. Sales: Sales executives and other account leaders can use LLM co-pilots to help draft customer communications and locate technical information pertinent to a customer's situation.

3. Support: Customer support agents can use LLM co-pilots to recommend resolutions to customer problems and help the agent apply policy correctly, e.g., when working with a customer on a refund (also sketched below).
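
To make the BI co-pilot idea concrete, here is a minimal sketch of the pattern: the LLM drafts a query against the company's own data, and then explains the result in plain language. It assumes the OpenAI Python client and a local SQLite database standing in for a BI data mart; the table schema, file name, and model choice are illustrative assumptions rather than a reference implementation.

# Minimal sketch of a conversational BI co-pilot: the LLM drafts SQL for a
# natural-language question, the query runs against the company's own data,
# and the LLM then summarizes the result. Assumes the OpenAI Python client
# (v1+) with OPENAI_API_KEY set; the table, file name, and model are made up.
import sqlite3
from openai import OpenAI

client = OpenAI()
SCHEMA = "monthly_sales(region TEXT, month TEXT, revenue REAL)"  # illustrative schema

def ask_bi_copilot(question: str, db_path: str = "bi_mart.db") -> str:
    # 1. Ask the LLM to draft a SQL query, given only the schema.
    draft = client.chat.completions.create(
        model="gpt-4",  # model choice is an assumption
        messages=[
            {"role": "system",
             "content": f"Write one SQLite query for the table {SCHEMA}. "
                        "Return only the SQL, with no explanation or formatting."},
            {"role": "user", "content": question},
        ],
    )
    sql = draft.choices[0].message.content.strip().strip("`")

    # 2. Run the query locally -- the underlying data never leaves the database.
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(sql).fetchall()

    # 3. Ask the LLM to explain the result in business terms.
    summary = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": f"Question: {question}\nSQL: {sql}\nRows: {rows}\n"
                       "Summarize the answer for a business audience.",
        }],
    )
    return summary.choices[0].message.content

# Example: print(ask_bi_copilot("Which region grew fastest over the last three months?"))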
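
Similarly, for the support use case, here is a minimal sketch of a co-pilot that keeps a running conversation history (the memory mentioned earlier) and grounds its suggestions in a company policy snippet. The policy text, model choice, and use of the OpenAI Python client are assumptions for illustration; in practice, the agent reviews every suggestion before anything reaches the customer.

# Minimal sketch of a support co-pilot: it keeps a running conversation history
# (the "memory" described above) and grounds its suggestions in a policy snippet,
# so the human agent can apply policy consistently. Assumes the OpenAI Python
# client (v1+) with OPENAI_API_KEY set; the policy text and model are made up.
from openai import OpenAI

client = OpenAI()

REFUND_POLICY = (  # illustrative placeholder for real policy documents
    "Refunds are allowed within 30 days of purchase with proof of payment. "
    "After 30 days, offer store credit instead."
)

# The accumulated history is what gives the co-pilot memory of the conversation.
history = [{
    "role": "system",
    "content": "You are a co-pilot for a customer support agent. Suggest a reply "
               "the agent can review before sending. Apply this policy "
               f"consistently: {REFUND_POLICY}",
}]

def suggest_reply(customer_message: str) -> str:
    history.append({"role": "user", "content": customer_message})
    response = client.chat.completions.create(
        model="gpt-4",  # model choice is an assumption
        messages=history,
    )
    suggestion = response.choices[0].message.content
    history.append({"role": "assistant", "content": suggestion})  # remember the exchange
    return suggestion

# Example: print(suggest_reply("I bought this 45 days ago and want my money back."))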