With Generative AI, Businesses Should Listen More And Generate Less

Amit Ben is the Founder and CEO of One AI.

With the rise of advanced language generation models like OpenAI’s ChatGPT, Anthropic’s Claude and Inflection’s Pi, we are witnessing a new era in language AI. These sophisticated models, continually refining their capabilities, are efficiently generating coherent, natural language across various fields such as chatbots, language translation and creative content generation.

Epictetus, an ancient Greek Stoic philosopher, is often credited with saying, “We have two ears and one mouth so that we can listen twice as much as we speak.” This wisdom applies beyond interpersonal relationships; it holds crucial significance for businesses, reminding them of the importance of attentively listening to their stakeholders: customers, employees and partners.

This understanding serves as the foundation for the future role of natural language understanding (NLU). NLU not only enables businesses to comprehend and respond accurately to their customers’ needs and preferences, but it also offers deeper insights that can inform strategic decisions. By unlocking an understanding of language at multiple levels—operational, analytical and managerial—NLU promises to revolutionize business interactions and decision-making processes.

Comprehension And Generation: AI Versus The Human Approach

While understanding and generation of language are intimately linked in humans, supported by overlapping cognitive mechanisms, AI-based language models adopt a distinctive approach.

For these language models, understanding predominantly involves processing and analyzing input text to deduce meaning, context and relevant information. This process entails tasks like information retrieval, comprehension and semantic understanding. By training on substantial quantities of text data, these models can recognize patterns, associations and relationships between words and phrases, leading to a generalized understanding of language.

By contrast, language generation focuses on crafting coherent, contextually appropriate text from a given prompt or input. Trained to predict the most likely next word, generative models typically employ autoregressive decoding, producing one word or token at a time based on the provided context and their learned knowledge.
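The autoregressive loop described above can be sketched with a toy model. The tiny hand-written bigram table below is purely illustrative (real generative models learn billions of parameters from data), but the decoding loop is conceptually the same: repeatedly sample a next word given the context, then feed it back in.

```python
import random

# Toy "model": hand-written next-word probabilities. A real LLM learns
# these distributions over a huge vocabulary; the decoding loop below
# is the part that autoregressive generation shares with this sketch.
BIGRAMS = {
    "the": {"customer": 0.6, "product": 0.4},
    "customer": {"is": 1.0},
    "is": {"happy": 0.7, "waiting": 0.3},
}

def generate(prompt: str, max_tokens: int = 5, seed: int = 0) -> str:
    rng = random.Random(seed)
    tokens = prompt.split()
    for _ in range(max_tokens):
        choices = BIGRAMS.get(tokens[-1])
        if not choices:  # no known continuation: stop generating
            break
        words = list(choices)
        weights = [choices[w] for w in words]
        # Sample the next token in proportion to its probability
        tokens.append(rng.choices(words, weights=weights, k=1)[0])
    return " ".join(tokens)
```

Each generated word becomes part of the context for the next prediction, which is why these models are called autoregressive.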

Although understanding and generation in language models are related, their underlying processes are distinct, and models can be trained on each task independently with dedicated datasets and objectives. In practice, however, many models improve through joint training or multitask learning, in which understanding and generation tasks are learned concurrently, leveraging their synergy to boost overall language competence.


NLU and NLG (natural language generation) represent two integral subfields within the broad scope of natural language processing (NLP). While NLU centers on deciphering meaning from text or speech, NLG is concerned with crafting text or speech that resonates with human understanding.

These two aspects, though distinct, are intrinsically interwoven. The synergy between NLU and generative AI, a subset of AI that can produce human-like content, hinges on a comprehensive understanding of human language. This intricate connection aims to produce content that is not only realistic but also rich in meaning.

The merging of NLU and generative AI foreshadows a transformation in the landscape of human-computer interaction. It brings us closer to a future where conversations with computers can mirror the nuance and depth of human dialogues—and where generative AI can design highly personalized and engaging experiences for users. The combination of understanding and generating language could redefine our digital interactions, pushing the boundaries of what AI can achieve.

Large-Scale Language Analytics: Unleashing Insights From The Data Deluge

A crucial application of NLU, large-scale language analytics, broadens comprehension beyond individual pieces of content to encompass vast volumes of data in various formats, such as text, complex documents, PDFs and multimedia content.

Large-scale language analytics gives businesses a profound understanding of their users by dynamically grouping language data based on semantic meaning, unveiling recurring themes, and extracting vital insights. These insights, identifying trends over time and across topics, significantly contribute to strategic decision-making processes.
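The dynamic grouping described above can be illustrated with a minimal sketch. Here simple word-count vectors stand in for the dense neural embeddings a production system would use (an assumption made purely for a self-contained example), and a greedy pass assigns each text to the first sufficiently similar group:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words vectors
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def group_by_theme(texts, threshold=0.3):
    """Greedy semantic grouping: each text joins the first group whose
    representative it resembles, otherwise it starts a new group.
    (Real systems cluster dense embeddings, not raw word counts.)"""
    groups = []  # list of (representative vector, member texts)
    for text in texts:
        vec = Counter(text.lower().split())
        for rep, members in groups:
            if cosine(vec, rep) >= threshold:
                members.append(text)
                break
        else:
            groups.append((vec, [text]))
    return [members for _, members in groups]
```

Run over customer feedback, this kind of grouping surfaces recurring themes (shipping complaints, product praise and so on) without anyone defining the categories in advance.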

Beyond informing strategy, this analytical power also aids automation. Large-scale language analytics classifies incoming text items against existing data, enabling prompt, appropriate action, and it powers efficient data retrieval through semantic search across language collections.
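Semantic search, the retrieval side of this, can be sketched in the same spirit. As a stand-in for embedding-based retrieval (again an illustrative assumption; real systems compare neural embeddings), this snippet ranks documents by word-overlap similarity to a query:

```python
import math
from collections import Counter

def _vec(text: str) -> Counter:
    return Counter(text.lower().split())

def _cos(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def semantic_search(query: str, documents, top_k: int = 2):
    """Return the top_k documents most similar to the query."""
    q = _vec(query)
    ranked = sorted(documents, key=lambda d: _cos(q, _vec(d)), reverse=True)
    return ranked[:top_k]
```

The same similarity machinery drives classification: an incoming support ticket can be routed by finding which previously labeled items it most resembles.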

The benefits of large-scale language analytics and NLU extend across industries. In customer service, they can improve interactions and boost customer satisfaction. In education, they enable personalized learning experiences, while in HR, they can transform employee understanding, proactively addressing concerns and improving the workplace environment. And in the fast-paced e-commerce sector, NLU provides valuable insights into customer sentiment and product performance, driving data-driven decisions and enhancing the customer experience.

Future Prospects And Ethical Implications Of NLU

The potential of NLU is vast and extends even to sectors such as healthcare, promising improved efficiency and accuracy in areas like patient interaction through NLU-powered chatbots.

As NLU technology evolves and matures, it becomes increasingly imperative to consider its ethical implications. NLU could potentially be exploited to create deepfakes or manipulate emotions. Hence, establishing and strictly adhering to ethical guidelines and regulations is of paramount importance when deploying advanced NLU technology.

In an era dominated by AI, we should remember Epictetus. While generation is important, businesses need to pay close attention to listening and understanding. This equilibrium between NLU and NLG is critical for driving meaningful interactions, fostering robust relationships and enabling precise decision-making.

The implementation of large-scale language analytics promises transformative impacts across various sectors, boosting efficiency and providing insightful analysis. However, as we navigate toward a future enriched with advanced NLU, it becomes crucial to focus on the ethical implications surrounding it, ensuring that the technology is utilized in the most responsible way.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.