Course Introduction

You might use this technology every day without even knowing it.

For example, Google Search knows what you’re looking for through either text or speech.

Gmail generates Smart Reply responses based on messages.

Google Translate translates between over 100 languages and covers almost all languages with at least 10 million speakers.

Google Assistant interprets your intention and conducts interactive conversations.

According to the Google I/O 2022 keynote, Google is launching a new automatic summarization feature for Workspace products, including Google Docs and Google Chat.

Behind all these applications, NLP technology plays a critical role.

NLP is a marriage between linguistics and artificial intelligence.

You can apply it in a wide variety of fields. To list just a few:

  • Text-to-Speech and Speech-to-Text: convert between text and speech.

  • Text classification: predict the labels that you want to assign to a document. For example,

    • sentiment analysis (deciding the rating based on the comments of a customer),

    • spam identification (determining whether an email is spam), and

    • topic classification (identifying the topic of the content).

  • Entity extraction (or entity recognition): identify entities, or the relevant specifics, within text. For example, Google Assistant understands that you’re asking about the weather (which is the topic) for today (which is the date) in Denver (which is the location).

  • Machine translation: use tools such as Google Translate to translate between languages.

  • Interactive conversation: simulate human conversations, as a chatbot does.

NLP is everywhere, so why should you learn it from Google?

We’d like to address two reasons among many others: First, Google has over ten years of experience developing NLP technologies and applying them to its main products such as Google Search.

Google contributes to the NLP field by developing numerous state-of-the-art technologies.

For example, Google developed word2vec in 2013, which is considered a breakthrough because it used neural networks to learn text representations.
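
To make the idea concrete, here is a minimal word2vec-style sketch using the open-source gensim library (its 4.x API is assumed); it is not part of the course materials, just an illustration of how similar words end up with nearby vectors.

```python
# A minimal word2vec-style sketch with the open-source gensim library
# (gensim 4.x API assumed); the toy corpus is purely illustrative.
from gensim.models import Word2Vec

corpus = [
    ["nlp", "maps", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["google", "developed", "word2vec", "in", "2013"],
]

# Train a tiny skip-gram model; real corpora contain millions of sentences.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

print(model.wv["words"].shape)          # each word becomes a 50-dimensional vector
print(model.wv.most_similar("words"))   # neighbors ranked by cosine similarity
```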

Google continued creating innovative technologies and architectures such as attention mechanism, transformers, BERT (short for Bidirectional Encoder Representations from Transformers), and T5.

In 2022, Google achieved a breakthrough with a large language model called PaLM (short for Pathways Language Model).

Second, Google provides an end-to-end development platform called Vertex AI, which empowers developers at different expertise levels to build an NLP project.

For example, you can use AutoML, which is a no-code solution, if you don’t have coding experience or an ML background.

You can also use custom training, which is a code-based solution, if you’re familiar with TensorFlow or Python and are an expert in ML.

Even when you do code, TensorFlow makes difficult concepts accessible by providing numerous libraries, so you don’t have to build an NLP model from scratch.

You might be excited already and eager to learn: How does Google do NLP?

To answer this question, you start with the NLP products and services on Google Cloud, specifically the NLP solutions and the pre-built APIs.

You can apply these products and services to NLP projects without in-depth ML knowledge.
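
As a rough illustration of what calling a pre-built API can look like, the sketch below uses the Cloud Natural Language API through the google-cloud-language Python client; it assumes the client library is installed and credentials are configured, and the sample text is made up.

```python
# A minimal sketch of calling the Cloud Natural Language API with the
# google-cloud-language Python client (language_v1 API assumed).
# Requires `pip install google-cloud-language` and configured credentials.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()

document = language_v1.Document(
    content="The service was great, but the checkout line in Denver was slow.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

# Sentiment analysis: an overall score in [-1.0, 1.0] plus a magnitude.
sentiment = client.analyze_sentiment(request={"document": document}).document_sentiment
print(sentiment.score, sentiment.magnitude)

# Entity extraction: named things such as locations, with a type and salience.
for entity in client.analyze_entities(request={"document": document}).entities:
    print(entity.name, language_v1.Entity.Type(entity.type_).name, entity.salience)
```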

Then you’ll proceed to NLP development and explore Vertex AI, the end-to-end ML development platform.

Specifically, you’ll practice with AutoML, a no-code solution, to build an NLP model and predict text source.

After that, you’ll advance to the backend of NLP development and use TensorFlow in the next three modules with Vertex AI custom training, a code-based solution.

Module three answers the first question that you normally face when you build an NLP model: how do you prepare text data for training? Different text-preparation techniques will be introduced.
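
As a small preview of what one such technique can look like, the sketch below tokenizes raw strings and turns them into padded integer sequences with tf.keras’s TextVectorization layer (TensorFlow 2.x assumed); the techniques actually covered in module three may differ.

```python
# A minimal text-preparation sketch: tokenize raw strings and map them to
# padded integer sequences with tf.keras (TensorFlow 2.x assumed).
import tensorflow as tf

texts = tf.constant([
    "the weather in denver today",
    "is it going to rain tomorrow",
])

vectorizer = tf.keras.layers.TextVectorization(
    max_tokens=10_000,          # cap on vocabulary size
    output_mode="int",          # map each token to an integer id
    output_sequence_length=8,   # pad or truncate every example to 8 tokens
)
vectorizer.adapt(texts)          # build the vocabulary from the corpus

print(vectorizer.get_vocabulary()[:10])  # e.g. ['', '[UNK]', 'the', ...]
print(vectorizer(texts).numpy())         # shape (2, 8) matrix of token ids
```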

When the text data is ready, you’ll feed it to model training and prediction.

In module four, you’ll advance to the NLP models, where you’ll learn about multiple neural networks that are used in NLP (a short code sketch follows this list), such as

  • ANN (artificial neural network),

  • DNN (deep neural network),

  • RNN (recurrent neural network),

  • LSTM (long short-term memory), and

  • GRU (gated recurrent unit).
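
As a rough sketch, here is how one of these networks, a bidirectional LSTM, might be assembled into a small binary text classifier with tf.keras; the layer sizes are illustrative rather than the course’s exact model, and swapping tf.keras.layers.GRU in for the LSTM gives a GRU variant.

```python
# A minimal sketch of an RNN-style text classifier (a bidirectional LSTM)
# in tf.keras (TensorFlow 2.x assumed); hyperparameters are illustrative.
import tensorflow as tf

VOCAB_SIZE = 10_000   # assumed vocabulary size from the text-preparation step
SEQ_LENGTH = 8        # assumed padded sequence length

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LENGTH,), dtype="int32"),         # integer token ids
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),                  # dense word vectors
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),    # read the sequence in both directions
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),             # binary label
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```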

After learning the commonly used NLP models, you’ll finally proceed to advanced NLP models, where you’ll learn the state-of-the-art NLP technologies developed by Google (a brief sketch of the attention computation follows this list), such as

  • encoder-decoder,

  • attention mechanism,

  • transformers,

  • BERT, and

  • large language models.
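
To ground one of these ideas, here is a small numeric sketch of scaled dot-product attention, the operation at the heart of transformers and BERT, written with plain TensorFlow ops; it is a single-head, unmasked simplification of what the advanced-models module covers.

```python
# A minimal sketch of scaled dot-product attention with plain TensorFlow ops
# (single head, no masking); a simplification, not the course's implementation.
import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    """q, k, v: tensors of shape (batch, seq_len, depth)."""
    scores = tf.matmul(q, k, transpose_b=True)                      # (batch, seq_q, seq_k)
    scores /= tf.math.sqrt(tf.cast(tf.shape(k)[-1], tf.float32))    # scale by sqrt(depth)
    weights = tf.nn.softmax(scores, axis=-1)                        # each row sums to 1
    return tf.matmul(weights, v), weights                           # weighted sum of values

# Toy example: batch of 1, sequence of 3 tokens, depth of 4.
q = tf.random.normal((1, 3, 4))
k = tf.random.normal((1, 3, 4))
v = tf.random.normal((1, 3, 4))
output, weights = scaled_dot_product_attention(q, k, v)
print(output.shape, weights.shape)  # (1, 3, 4) (1, 3, 3)
```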