Fine Tune BERT for Text Classification with TensorFlow

Course Highlights

This is a guided project on fine-tuning a Bidirectional Encoder Representations from Transformers (BERT) model for text classification with TensorFlow. In this 2.5-hour project, you will learn to preprocess and tokenize data for BERT classification, build TensorFlow input pipelines for text data with the tf.data API, and train and evaluate a fine-tuned BERT model for text classification with TensorFlow 2 and TensorFlow Hub.

Prerequisites: To successfully complete this project, you should be competent in the Python programming language, be familiar with deep learning for Natural Language Processing (NLP), and have trained models with TensorFlow and its Keras API.

Note: This course works best for learners who are based in the North America region. We are currently working on providing the same experience in other regions.
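The workflow described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual notebook: the TF Hub model handles, the toy dataset, and the hyperparameters (learning rate, batch size, epochs) are placeholder assumptions you would replace with the project's own choices.

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers ops required by the BERT preprocessing model

# Illustrative TF Hub handles; the guided project may use a different BERT variant.
PREPROCESS_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_URL = "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/2"

def build_classifier(num_classes: int) -> tf.keras.Model:
    """Raw text in, class logits out: preprocessing + BERT encoder + dense head."""
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
    preprocessing = hub.KerasLayer(PREPROCESS_URL, name="preprocessing")
    encoder = hub.KerasLayer(ENCODER_URL, trainable=True, name="bert_encoder")
    outputs = encoder(preprocessing(text_input))
    pooled = outputs["pooled_output"]                 # sentence-level embedding
    x = tf.keras.layers.Dropout(0.1)(pooled)
    logits = tf.keras.layers.Dense(num_classes, name="classifier")(x)
    return tf.keras.Model(text_input, logits)

# tf.data input pipeline over in-memory texts and integer labels (placeholder data).
texts = ["great movie", "terrible plot"]
labels = [1, 0]
train_ds = (tf.data.Dataset.from_tensor_slices((texts, labels))
            .shuffle(buffer_size=len(texts))
            .batch(32)
            .prefetch(tf.data.AUTOTUNE))

model = build_classifier(num_classes=2)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),  # small LR typical for fine-tuning
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(train_ds, epochs=3)
model.evaluate(train_ds)  # in practice, evaluate on a held-out validation/test split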

About the Course Provider

Coursera provides access to more than 3,000 courses across a wide variety of subjects in partnership with different universities and organizations.

Published by

  • Pace: Self-paced
  • Duration: 3 hours
  • Domain: Personal Development
  • Subscription: Monthly subscription option not available
  • Fee: Free (Buy Now)
  • Language: English