
Five Advantages of NLU

Articles | May 12, 2023 | 2 min read


I gained an appreciation for the power of Natural Language Understanding (NLU) engines while experimenting with the predictive and classification capabilities of Large Language Models (LLMs).

NLU engines have been powering chatbots and customer-facing conversations for a while now. While LLMs are mainly recognised for their generative capabilities, NLU engines are incredibly efficient at predictive and classification tasks, which can be trickier to get right. Here are five areas in which NLU engines are exceptionally well optimised…

Entering Data Is Simple

Chatbot development frameworks typically offer a no-code graphical user interface (GUI) for NLU data entry.

This allows users to enter data in a point-and-click fashion, eliminating the need to format data into a JSON or CSV structure for import, and avoiding the errors that creep in when those files are prepared by hand.

Migrating between chatbot frameworks and NLU engines is made easy by the lightweight nature of NLU Engine data exports.
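To give a sense of just how lightweight such an export can be, here is a sketch of an intent-and-utterance structure serialised to JSON. The field names and utterances are illustrative rather than any specific framework's schema.

```python
# A minimal sketch of a lightweight NLU data export: intents with labelled
# example utterances. Field names are illustrative, not a vendor's schema.
import json

nlu_export = {
    "intents": [
        {
            "name": "check_balance",
            "examples": [
                "what's my account balance",
                "how much money do I have",
            ],
        },
        {
            "name": "transfer_funds",
            "examples": [
                "move 50 dollars to savings",
                "send money to my other account",
            ],
        },
    ]
}

# The whole training set is plain text, so it moves between tools easily.
with open("nlu_export.json", "w") as f:
    json.dump(nlu_export, f, indent=2)
```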

Flexibility Of Intents

Classes can be seen as intents; the need to assign text to one or more categories, labels, or intents will always remain.

Here are some recent developments in the field of intents:

[Image in original post: recent developments in the field of intents (source link)]
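To make the one-or-multiple-labels point concrete, below is a minimal multi-label intent classification sketch. scikit-learn is used purely for illustration, and the utterances and intent names are invented; a production NLU engine ships its own classifier.

```python
# A minimal sketch of multi-label intent classification: each utterance can be
# assigned one or several intents. scikit-learn is used purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

utterances = [
    "cancel my order and refund my card",
    "where is my package",
    "I want a refund",
    "cancel the subscription",
]
labels = [
    ["cancel_order", "request_refund"],  # one utterance, two intents
    ["track_order"],
    ["request_refund"],
    ["cancel_order"],
]

mlb = MultiLabelBinarizer()
y = mlb.fit_transform(labels)  # binary indicator matrix, one column per intent

clf = make_pipeline(TfidfVectorizer(), OneVsRestClassifier(LogisticRegression()))
clf.fit(utterances, y)

pred = clf.predict(["please cancel and give me my money back"])
print(mlb.inverse_transform(pred))  # tuple(s) of predicted intent labels
```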

Entities Have Developed Over Time

Intents are like verbs and entities like nouns.

Accurate entity detection is important to avoid re-prompting the user for information they have already supplied. Entities therefore need to be identified from unstructured input on the user's very first turn.
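As a sketch of what first-turn entity extraction looks like, the snippet below runs spaCy's pretrained pipeline over a single unstructured utterance (it assumes the small English model has been downloaded). A production NLU engine would use domain-specific entities, but the principle is the same.

```python
# A sketch of extracting entities from one unstructured utterance, so the user
# is not re-prompted for details already supplied. Assumes spaCy plus the small
# English model (python -m spacy download en_core_web_sm) are installed.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Book me a flight to Toronto next Friday for under 400 dollars.")

for ent in doc.ents:
    print(ent.text, ent.label_)
# Expect something like: "Toronto" GPE, "next Friday" DATE, "400 dollars" MONEY
```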

Read more on Entities here.

Model Training Time

I recently trained a multi-label model using Google Cloud Vertex AI.

The model training process took 4 hours and 48 minutes to complete, with 11,947 training data items.

To help reduce training times, chatbot development frameworks like Rasa have implemented incremental training, and other frameworks have cut training times drastically as well.

Multiple Installation Options

Natural Language Understanding (NLU) Engines are lightweight, often open-sourced, and can easily be installed in any environment, including workstations.

Examples of open-source NLU APIs include Rasa, which is easy to configure and train, along with other accessible NLU tools such as spaCy, Snips, and Cisco MindMeld.

With these tools, individuals can explore natural language understanding without having to invest in expensive computing, LLMs, or complicated ML pipelines.
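As a rough illustration of how little infrastructure this takes, the snippet below stands up an NLU pipeline entirely on a workstation using spaCy: one pip install plus a one-time model download, with no GPU, cloud account, or ML pipeline involved.

```python
# A minimal sketch of a fully local NLU setup on a workstation. Assumes only
# "pip install spacy" has been run; the small English model is fetched once
# and everything afterwards runs offline, with no GPU or cloud dependency.
import spacy
from spacy.cli import download

MODEL = "en_core_web_sm"  # small English pipeline, laptop-friendly

try:
    nlp = spacy.load(MODEL)
except OSError:
    download(MODEL)  # one-time download of the packaged model
    nlp = spacy.load(MODEL)

doc = nlp("I'd like to move my appointment to next Tuesday.")
print([(token.text, token.pos_) for token in doc])
```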

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language: NLU design, evaluation and optimisation, data-centric prompt tuning, and LLM observability, evaluation, and fine-tuning.
