
Transformers for natural language processing : build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more / by Denis Rothman.

By: Rothman, Denis
Material type: Computer file
Language: English
Publication details: Birmingham : Packt Publishing, Limited, 2021
Description: 1 online resource (xvi, 383 pages) : color illustrations
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 9781800565791 (e-book)
LOC classification:
  • Q336 R74 2021
Online resources:
  • https://portal.igpublish.com/iglibrary/ (access to this portal is required to read the e-book)
Contents:
1. Getting started with the model architecture of the transformer -- 2. Fine-tuning BERT models -- 3. Pretraining a RoBERTa model from scratch -- 4. Downstream NLP tasks with transformers -- 5. Machine translation with the transformer -- 6. Text generation with OpenAI GPT-2 and GPT-3 models -- 7. Applying transformers to legal and financial documents for AI text summarization -- 8. Matching tokenizers and datasets -- 9. Semantic role labeling with BERT-based transformers -- 10. Let your data do the talking : story, questions, and answers -- 11. Detecting customer emotions to make predictions -- 12. Analyzing fake news with transformers
Summary: The transformer architecture has proved revolutionary, outperforming the classical RNN and CNN models in use today. With an apply-as-you-learn approach, Transformers for Natural Language Processing investigates, in vast detail, deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers. The book takes you through NLP with Python and examines various eminent models and datasets within the transformer architecture created by pioneers such as Google, Facebook, Microsoft, OpenAI, and Hugging Face. The book trains you in three stages. The first stage introduces you to transformer architectures, starting with the original transformer before moving on to the RoBERTa, BERT, and DistilBERT models. You will discover training methods for smaller transformers that can outperform GPT-3 in some cases. In the second stage, you will apply transformers for Natural Language Understanding (NLU) and Natural Language Generation (NLG). Finally, the third stage will help you grasp advanced language understanding techniques such as optimizing social network datasets and fake news identification. By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models from tech giants to various datasets.
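To illustrate the kind of workflow the summary describes (applying a pretrained transformer model to text), here is a minimal sketch in Python using the Hugging Face transformers library. It is not code from the book; the fill-mask task and the bert-base-uncased checkpoint are assumptions chosen for brevity.

    # Illustrative sketch, not from the book: query a pretrained BERT model
    # with the Hugging Face transformers library (pip install transformers torch).
    from transformers import pipeline

    # Downloads the bert-base-uncased checkpoint and its tokenizer, then
    # builds a masked-language-modeling pipeline around them.
    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    # BERT ranks candidate tokens for the [MASK] position.
    for prediction in fill_mask("Transformers are a deep [MASK] architecture."):
        print(prediction["token_str"], round(prediction["score"], 3))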
Holdings:

Item type: Online E-Books
Current library: Ladislao N. Diwa Memorial Library Multimedia Section
Collection: Non-fiction
Call number: OEBP Q336 R74 2021
Status: Available
Notes: PAV
Barcode: OEBP000267

Item type: Compact Discs
Current library: Ladislao N. Diwa Memorial Library Multimedia Section
Collection: Non-fiction
Call number: EB Q336 R74 2021
Status: Room use only
Notes: PAV
Barcode: EB000267

Includes bibliographical references and index

Fund 164 | CE-Logic | Purchased April 14, 2022 | OEBP000267 | P. Roderno | PHP 4,223.70
