This article looks at NLP, or Natural Language Processing, in simple terms, explaining firstly what it is and where you can see it in action. We look at some examples of NLP applications in use today and explain why this technical-sounding branch of AI technology matters so much to marketers.
NLP – What Is It?
So, let’s get the most confusing question out of the way. Confusing why? Because NLP is an acronym for two very different things which, unfortunately, both sound quite technical and very closely related.
The first is Neuro-Linguistic Programming, an experimental branch of neurology and psychology. The second, the subject of this article, is Natural Language Processing, a field of research in the science of Artificial Intelligence (AI).
In essence, both fields are concerned with linguistics, or language, but the former is focused on the human brain, hence the prefix “neuro”.
Natural Language Processing is a branch of artificial intelligence concerned with processing human language, both spoken and written. It's all about helping electronic brains make better sense of the way we naturally communicate – to make interacting with machines feel more natural.
Rise Of The Machines
Natural Language Processing draws on a branch of AI research known as machine learning, whereby computers are trained through constant repetition to fulfil specific tasks ranging from facial recognition to medical diagnoses.
What computers lack in intuition they make up for in sheer processing speed. This allows them to make decisions based on pre-existing data at a far greater speed than a human being can accomplish.
Mastery is slow but gains are exponential, which explains why AI systems have been able to beat grandmasters in games like chess, where everything takes place on a flat 8×8 board, but self-driving cars continue to struggle on our roads, where there are still so many unknown variables.
Natural Language Processing is the application of machine learning to human speech, in both oral and written form.
It uses artificial intelligence to first convert human language into a framework that computers can comprehend. It then processes this data, before converting it back into an output humans can understand, i.e. human language.
It might sound complicated, but you have undoubtedly encountered and used this technology countless times today already without even realising it.
Common Examples of Natural Language Processing
Your phone’s autocomplete feature is the most common example of NLP.
We use it so much nowadays that we take it for granted. After all, it's easy to forget what an incredible technological accomplishment NLP is when it autocorrects our text improperly or fails to understand our voices.
Still, those frustrations aside, I'm sure you'll agree it's a remarkable piece of technology that's improving itself every single day.
You may have also noticed similar technology being introduced in email clients, which skim the content of a message and offer one-click automated responses like "Great, thanks!" as a time-saving feature.
Meanwhile, email clients like Gmail keep your inbox free of spam thanks to NLP and other machine learning techniques, while also categorising emails to help keep your inbox organised.
Plus, the last time you required live technical support online, chances are you chatted with a machine without realising it.
Much like the automated voicemail systems used in call centres to filter phone calls, AI chatbots are used to respond to online customer queries, filtering and escalating to agents where necessary.
But their primary job is to resolve them, which they do with an average success rate of 68.9%.
AI chatbots using NLP are also hugely beneficial from a marketing standpoint, helping to increase conversions on websites by answering common product-related questions.
By now, of course, you’re probably already thinking about the most obvious application of NLP technology, digital personal assistants.
Products like Amazon's Alexa, Apple's Siri and the "Ok Google" feature on Android devices all use NLP to understand human speech in its most natural form.
We speak, the machine listens and then interprets our queries in real time.
Does it often get things wrong? Yes, of course it does, often with hilarious results. But again, we are talking about an emerging technology that, three decades ago, existed solely in the realm of science fiction.
Now its existence is a mundane fact of life, so much so that we grumble when it doesn't work properly.
But it’s important to reiterate that the main benefit of this technology is that improvements are automatic and exponential.
Take translation software. It’s easy to forget how terrible translation software was back in the early 2000s. At best it could muddle through with basic sentences, but it generally tended to struggle with some languages more than others.
You could use it to just about get the gist of a phrase or website but when it got things wrong the results were spectacularly bad.
This prompted Google to enter the space, launching its own translation product, Google Translate, in 2006. Initially, the product suffered from the same flaws as its predecessors, prompting those in the translation business to scoff that it could never be trusted to provide fully reliable translations.
And while that’s as true now as it was then, that degree of reliability has shifted enormously, as anyone who’s ever fully navigated a foreign holiday with the aid of a smartphone can attest.
That's because ten years after its inception, Google Translate switched to a neural network-based approach to NLP, greatly improving the product's accuracy. It's still far from perfect, but for most of us, it doesn't need to be.
So, while I’m sure we’d all agree that it’s best not to rely on it for professional, academic or legal purposes, it’s still a highly reliable tool that helps us to navigate and understand the world easier in multiple languages.
At the time of writing, there are 133 languages available on Google Translate and the company continues to add new ones all the time. Of course, there are thousands of languages spoken today throughout the world, not to mention a vast variety of accents, dialects and quirky cultural idioms. And yet, despite us humans having a head start of 50,000 years (at the very least) the machines are rapidly catching up.
How Does NLP Work?
Natural Language Processing works by breaking down human speech into tiny pieces which computers can understand. We tend to take language for granted, but it is actually enormously complicated. Any given sentence can be made up of several different components – nouns, verbs, adjectives, adverbs, prepositions, etc. – which can be combined in a huge variety of ways.
Since computers think in numbers, not words, the first step is to help them assign categories to each of those different components and create specific values for each individual word. Step two involves training the algorithm to detect different speech patterns and the various interrelationships between them.
The computer also needs to be aware of structure, syntax and grammar rules, while simultaneously taking into account how fluid and heterogeneous human speech can be.
The more input the computer has, the more practice the algorithm gets, and the better it becomes at understanding us through mathematical probability.
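To make those first two steps a little more concrete, here is a deliberately tiny sketch in Python. The sentences, word IDs and counts are toy examples invented for illustration, not any production NLP system, but they show the basic idea: words become numbers, and patterns get counted.

```python
from collections import Counter

def tokenize(text):
    # Lowercase the text and split it into word tokens, stripping punctuation.
    return [w.strip(".,!?\"'") for w in text.lower().split()]

sentences = [
    "The cat sat on the mat.",
    "The dog sat on the rug.",
]

# Step one: assign every distinct word a numeric ID.
vocab = {}
for sentence in sentences:
    for word in tokenize(sentence):
        vocab.setdefault(word, len(vocab))

# A sentence then becomes a sequence of numbers a machine can work with.
encoded = [vocab[w] for w in tokenize(sentences[0])]

# Step two (in miniature): tally patterns. Real systems learn rich
# statistical relationships; here we simply count word frequencies.
counts = Counter(w for s in sentences for w in tokenize(s))
```

Real systems go far beyond frequency counts, of course, but the principle is the same: the more text the algorithm sees, the sharper its statistics become.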
But when you consider how much nuance there is in the English language alone, how many ways we can say the same thing, and how many different things can share the same word or phrase, you begin to get an idea of how complex this process is.
For example, while sitting in a cafe I recently overheard an American tourist say to her friends, “I woke up this morning and I was like, literally dead.”
Taken at face value, the statement makes no sense, yet I understood what she meant instantly. A computer, however, would have to analyse the sentence and compare it against everything it knows about how those words are used to work out its true meaning.
First, it would have to know to strip out the superfluous word “like” and calculate the probability of whether “literally” was used correctly in this context, or whether it was simply a figure of speech (i.e. not literal at all), based on the other words in the sentence.
Finally, it would be able to discern, based on past data, that “dead” in this instance was referring to fatigue.
Of course, the computer wouldn’t be able to tell, like I could, that she was visibly hungover because a computer lacks real-world context.
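A probability-based guess like that can be sketched in a few lines of Python. The co-occurrence counts below are invented purely for illustration; a real system would learn these statistics from billions of sentences rather than a hand-written table.

```python
# Toy word-sense disambiguation with invented co-occurrence counts.
# Each number pretends to record how often a sense of "dead" has
# appeared near a given context word in past data.
sense_counts = {
    "exhausted": {"woke": 9, "morning": 8, "tired": 10, "battle": 1},
    "deceased":  {"woke": 0, "morning": 1, "tired": 1, "battle": 9},
}

def most_likely_sense(context_words):
    # Score each sense by summing the counts for the context words
    # actually present, then pick the highest-scoring sense.
    scores = {
        sense: sum(counts.get(w, 0) for w in context_words)
        for sense, counts in sense_counts.items()
    }
    return max(scores, key=scores.get)

sense = most_likely_sense(["woke", "morning"])
```

Given "woke" and "morning" as context, the toy model picks "exhausted" over "deceased", which is roughly the judgement an NLP system makes, just at vastly greater scale.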
But the more time an NLP algorithm spends learning about the complex interrelationships of words, the better it becomes at divining context from everyday speech. It doesn't understand the words, of course, but it knows how they all connect.
Take the word “apple”, for example.
An apple can be a computer, a company or a fruit, depending on the context. And while a computer cannot ever know what an apple tastes like, it knows, through complex semantic mapping, that an apple can be juicy. It also knows an apple can be red or green, and that "apple" can name a flavour as well as a fruit, much as "orange" names both a different kind of fruit and a colour.
And on and on it goes, joining all those linguistic dots together, all day, every day, millions of times per second.
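One common way of joining those dots is to represent each word as a vector of numbers, so that related words point in similar directions. The three-dimensional vectors below are made-up toy values; real systems learn vectors with hundreds of dimensions from enormous amounts of text, but the comparison works the same way.

```python
import math

# Toy word vectors (invented numbers, purely for illustration).
# Dimensions loosely stand for: fruit-ness, food-ness, tech-ness.
vectors = {
    "apple":  [0.9, 0.8, 0.1],
    "orange": [0.9, 0.7, 0.0],
    "laptop": [0.0, 0.1, 0.9],
}

def cosine_similarity(a, b):
    # Measure how closely two vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

fruit_score = cosine_similarity(vectors["apple"], vectors["orange"])
tech_score = cosine_similarity(vectors["apple"], vectors["laptop"])
```

In this toy space, "apple" sits much closer to "orange" than to "laptop", which is exactly the kind of relationship semantic mapping captures, without the machine ever tasting anything.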
With a growing number of people (50% in the US alone) using voice search, as opposed to simply typing, context and clarity are more important than ever. Forget mobile-first, that’s so last decade. Now is the time to prepare your website for a voice-first world.
This becomes even more apparent once you understand what Google’s currently working on…
MUM's The Word
Google has been using machine learning as part of its search product for several years now. The first such incarnation was 2015’s RankBrain. It was designed to help Google better understand the context of words and detect synonyms with more accuracy, while simultaneously downgrading sites that used spammy tactics like keyword stuffing.
The result was a more intuitive, less clunky search environment which the company built upon with their second AI system, BERT, which fully incorporated NLP technology.
In May 2021 the company announced its third major AI update, Multitask Unified Model, or MUM for short. MUM is 1,000 times more powerful than BERT and will represent the next frontier for NLP technology.
As always with Google, this update will be rolled out in phases but the technology promises nothing short of a revolution in how we use the web. It will provide multi-language support across various platforms and file formats with the ability to understand not just text, but also speech, audio recordings, videos, graphics and more.
MUM was designed to remove language barriers and make information more accessible to all. Looking for technical specifications which only appear on a German website? Google can translate it for you instantly and serve you relevant search results. And that’s just for starters.
The technology also means you might be able to pull research data directly from podcasts, regardless of the language they were recorded in. Or how about that summer hit you heard on holiday in Spain you wish you knew the name of? Simply croon it phonetically into your phone and get the resulting video up on YouTube… well, maybe.
The technology's not quite there yet, but don't forget, its growth is exponential, which of course raises the inevitable question…
Will AI Take My Job?
The short answer is no. But it will certainly change the way you work.
Because, although NLP technology is improving exponentially each day, it still only understands the words in relation to other words. It calculates context through mathematical probability but it doesn’t understand any deeper meaning.
Today software already exists that can translate, transcribe and even write content automatically. But these products should only be viewed as tools to help speed up our work processes.
There will always be a place for professional translation and transcription services, copywriters, technical writers, editors and publishers. What is changing, however, is how NLP technology is making that work less labour-intensive, allowing businesses to scale.
Therefore AI should not be seen as something that will put us out of work, but rather as a tool that helps us do our work more efficiently.
It will also help us to make more sense of the world around us, conduct research across multiple languages and respond to customer queries more efficiently, all while reducing the barriers to communication between man and machine.
But, of course, one thing a machine will never be able to do, is truly appreciate the taste of a nice juicy apple. (Or, for that matter, the “literally dead” feeling of a horrible hangover.)