2020 has been a year of significant growth for Natural Language Processing (NLP). In a recent post, I outlined the trends accelerating adoption of the technology, advances in the field, and rising IT budgets allocated to NLP-related projects. With the massive quantity of data we have across all industries, it’s hard to dispute how valuable NLP can be in helping organizations understand and act on insights from customers, patients, behaviors, trends, and just about any other patterns you’d want to unearth.

As enhancements are made and NLP becomes more broadly used, the forces driving this uptick in use will shift, too. My last post explored the trends that defined NLP in 2020, but as we enter 2021, there are additional factors that will help NLP continue on its growth trajectory—and will have an even bigger impact on business, research, and humanity. Here are the ones to keep an eye on:

NLP Breakthroughs Move from Research into Production

One of the reasons for the increased investment in NLP is that recent advances in deep learning and transfer learning are now moving from research to production. In a healthcare setting, for example, advancements in reading comprehension enable algorithms to extract facts from radiology, pathology, genomics, and lab reports as accurately as humans. We’re only starting to integrate this capability into the clinical workflow so practitioners can benefit. But despite it being early days, NLP is already being used to diagnose patients, match them to clinical trials, highlight high-risk situations, and enable faster drug discovery. This has the potential to ease the strain on an overburdened healthcare system, and it will only get better over time.

Winning Organizations Will Take a Holistic Approach to NLP

As important as technical talent is to implement and scale NLP projects, understanding how AI will work within a product from a business perspective is equally vital to success. The most accurate patient risk prediction model won’t help anyone if it’s not integrated into the clinical workflow and isn’t easily trusted and used by doctors. A system that analyzes SEC filings in real-time won’t make money if the trading team doesn’t use it regularly to make better trades. An e-discovery system has to be designed around how lawyers practice, so that it goes beyond providing search results to help win cases. AI technology is only as useful as it is well-used.

As such, all disciplines within an organization need to understand the benefits of integrating AI, and how it will affect their job function. Failing to train and actively involve product managers, designers, marketers, and sales professionals in designing AI and NLP systems is why so many projects don’t work in practice. Keeping these initiatives locked within a data science team often amounts to sprinkling the ‘AI’ buzzword onto a business and hoping for the best. It’s the investment in time, education, and practice across the entire organization that will separate the success stories from the tech laggards in the coming year.

Multilingual Offerings Will Further Democratize NLP

For the past few years, you were largely out of luck if you needed NLP support in languages other than English, Mandarin Chinese, or a handful of others. Thankfully, state-of-the-art models are now gradually being made available in dozens of languages. Cloud providers now offer support for over a hundred languages, and open-source libraries such as Spark NLP now support more than 50 languages out of the box.

With new research advances such as language-agnostic sentence embeddings, zero-shot learning, and the recent public availability of multilingual embeddings, broad multilingual support will become the norm. Greater access to code and coverage of many more languages levels the playing field globally, resulting in a more diverse and inclusive AI ecosystem.
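To make the idea of language-agnostic sentence embeddings concrete, here is a toy sketch. The vectors below are made up for illustration (real encoders such as LASER or LaBSE produce high-dimensional embeddings): the key property is that sentences with the same meaning land near each other regardless of language, so a plain cosine similarity works across languages.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up 4-dimensional "embeddings" for illustration only: a real
# language-agnostic encoder would map both sentences to nearby points.
embedding_en = [0.9, 0.1, 0.3, 0.7]          # "The patient has a fever."
embedding_es = [0.85, 0.15, 0.25, 0.72]      # "El paciente tiene fiebre."
embedding_unrelated = [0.1, 0.9, 0.8, 0.05]  # "Stocks fell sharply today."

print(cosine_similarity(embedding_en, embedding_es))         # near 1.0
print(cosine_similarity(embedding_en, embedding_unrelated))  # much lower
```

This cross-lingual geometry is what lets a model trained on labeled data in one language classify or retrieve text in another with no additional training.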

Pre-Trained NLP Models Will Lower the Barrier to Entry

Running some of the most accurate and complex deep learning models in history has been reduced to a single line of code with tools such as Python’s NLU library. Additionally, some NLP libraries provide official support for their published models, so that models and pipelines are regularly updated or replaced when a better algorithm, model, or embedding becomes available.

With the advent of NLP model hubs—by Hugging Face, TensorFlow, PyTorch, and others—thousands of free, pre-trained models are available. To help take the guesswork out of finding the right model for your use case, better faceted search, curated suggestions, and smarter ranking of search results are coming to fruition, too. This will make it easier for NLP novices to get started and for skilled data scientists to deliver results faster.
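As an illustration of how low the barrier has become, here is a minimal sketch using the Hugging Face transformers library (assuming it is installed; the default model selected for the task, and its exact scores, depend on the hub at the time you run it):

```python
from transformers import pipeline

# Downloads a pre-trained sentiment model from the Hugging Face hub;
# the default model chosen for this task may change over time.
classifier = pipeline("sentiment-analysis")

result = classifier("NLP has come a long way in the past few years.")
print(result)  # a list of dicts with 'label' and 'score' keys
```

A few lines like these stand in for what used to require collecting a labeled dataset, training a model, and deploying it yourself.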

With production-ready software libraries, broad multilingual capabilities, greater access to pre-trained models, and growing organizational buy-in, the barriers to entry have been lowered significantly. As we’ve seen with other forms of AI, as NLP technology is democratized further, real room for innovation is created. While the NLP research advances of the past few years have been truly exciting and fundamental, we are barely scratching the surface of the impact it will have once applied in the real world.