Course Introduction
You might use this technology every day without even knowing it.
For example,
Google Search knows what you’re looking for through either text or speech.
Gmail generates Smart Reply responses based on messages.
Google Translate translates between over 100 languages, and covers almost all the languages with at least 10 million speakers.
Google Assistant interprets your intention and conducts interactive conversations.
According to the Google I/O 2022 keynote, Google is launching a new auto summary function for products in Workspace, including
Google Docs and Google Chat.
Behind all these applications, natural language processing (NLP) technology plays a critical role. NLP is a marriage between linguistics and artificial intelligence.
You can apply it in a large variety of fields, just to list a few:
Text-to-Speech and Speech-to-Text: convert between text and speech.
Text classification: predict the correct labels to assign to a document. For example,
sentiment analysis (predicting a rating based on a customer's comments),
spam identification (determining whether an email is spam), and
topic classification (identifying the topic of the content).
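To make the idea of text classification concrete, here is a deliberately simple sketch: a sentiment classifier that counts cue words. The word lists are made up for illustration; real systems, like the models covered later in this course, learn these associations from labeled data instead.

```python
# Toy sentiment classifier: counts positive vs. negative cue words.
# The word lists are illustrative, not from any real system.
POSITIVE = {"great", "excellent", "love", "helpful", "fast"}
NEGATIVE = {"terrible", "slow", "broken", "hate", "useless"}

def classify_sentiment(text: str) -> str:
    """Assign a label to a document by counting cue words."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify_sentiment("Great product, fast delivery"))       # positive
print(classify_sentiment("Terrible support, totally useless"))  # negative
```

An ML-based classifier replaces the hand-written word lists with weights learned from example documents, but the input and output look the same: a document in, a label out.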
Entity extraction (or entity recognition): identify entities, or the relevant specifics, within text. For example, Google Assistant understands that you’re asking about the weather (which is the topic) for today (which is the date) in Denver (which is the location).
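The weather example above can be sketched in code. This is a toy regex-based extractor, not how Google Assistant actually works; it only shows the shape of the task: unstructured text in, labeled entities out.

```python
import re

# Toy entity extraction: pull a topic, a date word, and a location
# out of a weather query. The patterns are illustrative only.
def extract_entities(query: str) -> dict:
    entities = {"topic": None, "date": None, "location": None}
    if "weather" in query.lower():
        entities["topic"] = "weather"
    date_match = re.search(r"\b(today|tomorrow|tonight)\b", query, re.IGNORECASE)
    if date_match:
        entities["date"] = date_match.group(1).lower()
    loc_match = re.search(r"\bin ([A-Z][a-z]+)", query)
    if loc_match:
        entities["location"] = loc_match.group(1)
    return entities

print(extract_entities("What's the weather for today in Denver?"))
# {'topic': 'weather', 'date': 'today', 'location': 'Denver'}
```

Production systems use trained models rather than regexes, so they generalize to phrasings the patterns never anticipated.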
Machine translation: use tools such as Google Translate to translate between languages.
Interactive conversation: simulate human conversations such as a chatbot.
NLP is everywhere, so why should you learn it from Google?
We’d like to address two reasons among many others: First, Google has over ten years of experience developing NLP technologies and applying them to its main products such as Google Search.
Google contributes to the NLP field by developing numerous state-of-the-art technologies.
For example, Google developed word2vec in 2013, which is considered a breakthrough because it used neural networks in text representation.
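The core idea behind word2vec can be illustrated with a small sketch: each word maps to a dense vector, and semantically similar words end up with similar vectors. The 3-dimensional vectors below are made up for illustration; real word2vec learns vectors of a few hundred dimensions from large corpora.

```python
import math

# Hand-made 3-d vectors for illustration only; word2vec learns
# high-dimensional vectors like these from text data.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# "king" is closer to "queen" than to "apple" in this vector space.
print(cosine(vectors["king"], vectors["queen"]) > cosine(vectors["king"], vectors["apple"]))  # True
```

Representing words as vectors, rather than as opaque symbols, is what lets neural networks measure and exploit similarity in meaning.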
Google continued creating innovative technologies and architectures such as the attention mechanism, transformers, BERT (short for Bidirectional Encoder Representations from Transformers), and T5.
In 2022, Google achieved a breakthrough in large language models called PaLM (short for Pathways Language Model).
Second, Google provides an end-to-end development platform called Vertex AI, which empowers developers at different expertise levels to build an NLP project.
For example, you can use AutoML, which is a no-code solution, if you don't have coding experience or an ML background.
You can also use custom training, which is a code-based solution, if you're familiar with TensorFlow or Python and an expert in ML.
Even for coding, TensorFlow makes the difficult concepts accessible by providing numerous libraries, so you don't have to build an NLP model from scratch.
You might be excited already and eager to learn: How does Google do NLP?
To answer this question, you start with the NLP products and services on Google Cloud,
specifically the NLP solutions
and the pre-built APIs.
You can apply these products and services to NLP projects without in-depth ML knowledge.
Then you’ll proceed to NLP development and explore Vertex AI, the end-to-end ML development platform.
Specifically, you’ll practice with AutoML, a no-code solution, to build an NLP model that predicts the source of text.
After that, you’ll advance to the backend of NLP development in the next three modules and use TensorFlow with Vertex AI custom training, a code-based solution.
Module three answers the first question that you normally face when you build an NLP model: how do you prepare text data? Different text-preparation techniques will be introduced.
When the text data is ready, you’ll feed it to model training and prediction.
In module four, you’ll advance to the NLP models, where you’ll learn about multiple neural networks that are used in NLP, such as ANN (artificial neural network), DNN (deep neural network), RNN (recurrent neural network), LSTM (long short-term memory), and GRU (gated recurrent unit).
After learning the commonly used NLP models, you’ll finally proceed to advanced NLP models, where you’ll learn the state-of-the-art NLP technologies developed by Google, such as the encoder-decoder architecture, attention mechanism, transformers, BERT, and large language models.