AI Practitioner: BERT Best Practices & Design Considerations


Overview/Description
Expected Duration
Lesson Objectives
Course Number
Expertise Level



Overview/Description

Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing technique that significantly advances the capabilities of language AI systems. Google's BERT reports state-of-the-art performance on several complex natural language understanding tasks. In this course, you'll examine the fundamentals of traditional NLP and distinguish them from more advanced techniques, like BERT. You'll define the terms "attention" and "transformer" and explore how they relate to NLP. You'll then examine a series of real-life applications of BERT, such as in SEO and masking. Next, you'll work with an NLP pipeline utilizing BERT in Python for various tasks, namely, text tokenization and encoding, model definition and training, and data augmentation and prediction. Finally, you'll recognize the benefits of using BERT and TensorFlow together.
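To make the tokenization step above concrete, here is a minimal sketch of the greedy longest-match-first WordPiece algorithm that BERT uses to split words into sub-tokens (continuation pieces carry a "##" prefix). The tiny vocabulary below is purely illustrative; a real BERT model ships a vocabulary file of roughly 30,000 entries.

```python
def wordpiece_tokenize(word, vocab):
    """Split one word into WordPiece sub-tokens, greedy longest-match-first.

    Continuation pieces (anything after the first sub-token) are looked up
    with a '##' prefix, as in BERT's vocabulary. If no sub-token matches,
    the whole word maps to the unknown token, '[UNK]'.
    """
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        current = None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation of a previous sub-token
            if piece in vocab:
                current = piece  # longest matching piece found
                break
            end -= 1
        if current is None:
            return ["[UNK]"]  # no sub-token matched at this position
        tokens.append(current)
        start = end
    return tokens


# Toy vocabulary for illustration only.
vocab = {"play", "##ing", "##ed", "un", "##believ", "##able"}
print(wordpiece_tokenize("playing", vocab))       # ['play', '##ing']
print(wordpiece_tokenize("unbelievable", vocab))  # ['un', '##believ', '##able']
```

In practice you would use a pretrained tokenizer rather than rolling your own, but this captures the splitting rule the course's tokenization exercises rely on.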



Expected Duration (hours)
1.0

Lesson Objectives

AI Practitioner: BERT Best Practices & Design Considerations

  • discover the key concepts covered in this course
  • recall traditional natural language processing techniques and approaches
  • describe the limitations of traditional natural language processing techniques and list potential breakthroughs
  • define the terms "attention" and "transformer" as they relate to natural language processing
  • specify the role of natural language processing techniques like BERT
  • describe how utilizing BERT techniques helps improve search quality
  • outline how BERT techniques facilitate context specificity
  • list ways of using BERT techniques for search engine optimization
  • describe how masking is used in BERT
  • demonstrate how to do data augmentation using masking and BERT in Python
  • illustrate how to do text tokenization using BERT in Python
  • show how to do text encoding using BERT in Python
  • define a BERT model in Python and create and compile the BERT layer using TensorFlow
  • train a BERT model in Python and identify the various hyperparameters for BERT
  • demonstrate how to do data prediction using BERT in Python, load a trained BERT model, create the sample data, and predict using the model
  • describe how the use of the TensorFlow package can advance BERT techniques
  • summarize the key concepts covered in this course
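The masking objectives above can be sketched as follows. This is a minimal, self-contained illustration of BERT's masked language modeling scheme, in which roughly 15% of tokens are selected for prediction; of those, 80% are replaced with [MASK], 10% with a random token, and 10% left unchanged. The function and variable names are illustrative, not from any particular library.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """Apply BERT-style masking to a token list for MLM-style augmentation.

    Returns (masked_tokens, labels): labels holds the original token where a
    position was selected for prediction, and None elsewhere.
    """
    rng = rng or random.Random()
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # the model must recover this token
            roll = rng.random()
            if roll < 0.8:
                masked.append("[MASK]")           # 80%: mask it
            elif roll < 0.9:
                masked.append(rng.choice(vocab))  # 10%: random replacement
            else:
                masked.append(tok)                # 10%: keep unchanged
        else:
            masked.append(tok)
            labels.append(None)         # position not used for the loss
    return masked, labels


tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(tokens, vocab=tokens, rng=random.Random(0))
print(masked)
```

In the course's Python exercises, the same idea is applied with a pretrained BERT tokenizer and vocabulary rather than a toy list, but the masking logic is the same.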
Course Number
it_aibtbpdj_01_enus

Expertise Level
Expert