Articles · May 12, 2023 · 2 min read

Five Advantages of NLU


I gained an appreciation for the power of Natural Language Understanding (NLU) engines while experimenting with the predictive and classification capabilities of Large Language Models (LLMs).

NLU engines have powered chatbots and customer-facing conversations for a while now. While LLMs are mainly recognised for their generative capabilities, NLU engines remain incredibly efficient at predictive tasks, which can be trickier to get right. Here are five areas in which NLU engines are exceptionally well optimised…

Entering Data Is Simple

Chatbot development frameworks typically offer a no-code graphical user interface (GUI) for NLU data entry.

This allows users to enter data in a point-and-click fashion, eliminating the need to format data into a JSON or CSV structure for import, and avoiding the errors that come with hand-crafting those files.

Migrating between chatbot frameworks and NLU engines is made easy by the lightweight nature of NLU Engine data exports.
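As a rough illustration of how lightweight such an export can be, the sketch below writes a small intent dataset to a single JSON file. The schema, intent names, and utterances are invented for illustration; each framework has its own, similarly compact, format.

```python
import json

# A hypothetical, minimal NLU export: just labelled example utterances.
# Real frameworks (Rasa, Dialogflow, etc.) use their own schemas, but
# they are all roughly this small and this portable.
nlu_data = {
    "intents": [
        {
            "name": "check_balance",
            "examples": [
                "what is my account balance",
                "how much money do I have",
            ],
        },
        {
            "name": "transfer_funds",
            "examples": [
                "send $50 to John",
                "move money to my savings account",
            ],
        },
    ]
}

# Dump to a single JSON file that can be versioned, reviewed,
# and imported into another NLU engine.
with open("nlu_export.json", "w") as f:
    json.dump(nlu_data, f, indent=2)
```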

Flexibility Of Intents

Classes can be seen as intents: the need to assign text to one or more categories, labels, or intents will always remain.

Here are some recent developments in the field of intents:

[Figure: recent developments in the field of intents]
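To make the classification framing concrete, here is a minimal sketch that treats intents as classes using a scikit-learn pipeline. The pipeline, intent names, and utterances are stand-ins for illustration, not the internals of any particular NLU engine.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: utterances labelled with intents.
utterances = [
    "what is my account balance",
    "how much money do I have",
    "send $50 to John",
    "move money to my savings account",
]
intents = ["check_balance", "check_balance", "transfer_funds", "transfer_funds"]

# A simple bag-of-words classifier standing in for an NLU intent model.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(utterances, intents)

# Classify a new, unseen utterance into one of the intent classes.
print(model.predict(["transfer money to my savings"]))
```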

Entities Have Developed Over Time

Intents are like verbs and entities like nouns.

Accurate entity detection is important to avoid re-prompting the user for information they have already supplied, so entities need to be identified from unstructured input on the first pass.

Read more on Entities here.
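As a quick illustration using spaCy (one of the open-source tools mentioned below), entities can be pulled from a single unstructured utterance on the first pass. This assumes the small English model has been installed with `python -m spacy download en_core_web_sm`; the utterance is invented for illustration.

```python
import spacy

# Assumes: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# A single unstructured utterance containing several entities.
doc = nlp("Book me a flight from Toronto to London next Friday for $450")

# Print each detected entity with its label, e.g. GPE (place), DATE, MONEY.
for ent in doc.ents:
    print(ent.text, ent.label_)
```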

Model Training Time

I recently trained a multi-label model using Google Cloud Vertex AI.

The model training process took 4 hours and 48 minutes to complete, with 11,947 training data items.

To help reduce training times, chatbot development frameworks like Rasa have implemented incremental training, which updates an existing model with new data instead of retraining from scratch. Other chatbot frameworks have also cut training time drastically.
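The sketch below shows the general idea behind incremental training using scikit-learn's `partial_fit`; it is a generic illustration of updating a model batch by batch, not Rasa's actual mechanism, and all intent names and utterances are hypothetical.

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

# A stateless vectorizer, so new batches need no vocabulary re-fitting.
vectorizer = HashingVectorizer(n_features=2**16)
classifier = SGDClassifier(loss="log_loss")
intent_labels = ["check_balance", "transfer_funds"]

# Initial training batch.
batch_1 = ["what is my balance", "send money to John"]
labels_1 = ["check_balance", "transfer_funds"]
classifier.partial_fit(vectorizer.transform(batch_1), labels_1, classes=intent_labels)

# Later, new labelled utterances arrive: update the existing model
# incrementally instead of retraining from scratch.
batch_2 = ["how much is in my account", "transfer $20 to savings"]
labels_2 = ["check_balance", "transfer_funds"]
classifier.partial_fit(vectorizer.transform(batch_2), labels_2)

print(classifier.predict(vectorizer.transform(["move funds to my friend"])))
```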

Multiple Installation Options

Natural Language Understanding (NLU) engines are lightweight, often open source, and can easily be installed in any environment, including a workstation.

Examples of open-source NLU tools include Rasa, which is easy to configure and train, as well as other accessible options such as spaCy, Snips, and Cisco MindMeld.

With these tools, individuals can explore natural language understanding without having to invest in expensive computing, LLMs, or complicated ML pipelines.
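As a minimal sketch of how little is needed to run NLU locally, the example below builds a blank spaCy pipeline with a rule-based entity recognizer: no downloaded model, no GPU, and no external API. The labels and patterns are hypothetical.

```python
import spacy

# A blank English pipeline runs entirely locally on a workstation.
nlp = spacy.blank("en")

# Add a rule-based entity recognizer with a couple of illustrative patterns.
ruler = nlp.add_pipe("entity_ruler")
ruler.add_patterns([
    {"label": "PRODUCT", "pattern": "savings account"},
    {"label": "CURRENCY", "pattern": [{"LOWER": "dollars"}]},
])

doc = nlp("open a savings account and deposit 100 dollars")
print([(ent.text, ent.label_) for ent in doc.ents])
# [('savings account', 'PRODUCT'), ('dollars', 'CURRENCY')]
```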

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language, including NLU design, evaluation and optimisation, data-centric prompt tuning, and LLM observability, evaluation, and fine-tuning.
