Transformers for Natural Language Processing: Build Innovative Deep Neural Network Architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and More

The first book on the market to dive deep into Transformer models, this step-by-step guide helps data and AI practitioners enhance the performance of language understanding and gain expertise in hands-on implementation of transformers using PyTorch, TensorFlow, Hugging Face, Trax, and AllenNLP.


Bibliographic Details
Main Author: Rothman, Denis
Format: eBook
Language: English
Published: Birmingham: Packt Publishing, Limited, 2021
Collection: O'Reilly (collection details: see MPG.ReNa)
LEADER 03032nmm a2200517 u 4500
001 EB001998120
003 EBX01000000000000001161021
005 00000000000000.0
007 cr|||||||||||||||||||||
008 210823 ||| eng
020 |a 1800568630 
050 4 |a Q336 
100 1 |a Rothman, Denis 
245 0 0 |a Transformers for Natural Language Processing  |h [electronic resource]  |b Build Innovative Deep Neural Network Architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and More 
260 |a Birmingham  |b Packt Publishing, Limited  |c 2021 
300 |a 385 p. 
504 |a Includes bibliographical references and index 
505 0 |a Table of Contents: Getting Started with the Model Architecture of the Transformer -- Fine-Tuning BERT Models -- Pretraining a RoBERTa Model from Scratch -- Downstream NLP Tasks with Transformers -- Machine Translation with the Transformer -- Text Generation with OpenAI GPT-2 and GPT-3 Models -- Applying Transformers to Legal and Financial Documents for AI Text Summarization -- Matching Tokenizers and Datasets -- Semantic Role Labeling with BERT-Based Transformers -- Let Your Data Do the Talking: Story, Questions, and Answers -- Detecting Customer Emotions to Make Predictions -- Analyzing Fake News with Transformers -- Appendix: Answers to the Questions 
653 |a Computers / Neural Networks / bisacsh 
653 |a Artificial intelligence / bicssc 
653 |a Artificial intelligence / Data processing / http://id.loc.gov/authorities/subjects/sh85008182 
653 |a Neural networks & fuzzy systems / bicssc 
653 |a Cloud computing / fast 
653 |a Python (Computer program language) / fast 
653 |a Infonuagique 
653 |a Python (Computer program language) / http://id.loc.gov/authorities/subjects/sh96008834 
653 |a Artificial intelligence / fast 
653 |a Computers / Natural Language Processing / bisacsh 
653 |a Cloud computing / http://id.loc.gov/authorities/subjects/sh2008004883 
653 |a Artificial intelligence / Data processing / fast 
653 |a Intelligence artificielle / Informatique 
653 |a Artificial intelligence / Software 
653 |a Computers / Intelligence (AI) & Semantics / bisacsh 
653 |a Natural language & machine translation / bicssc 
653 |a Python (Langage de programmation) 
041 0 7 |a eng  |2 ISO 639-2 
989 |b OREILLY  |a O'Reilly 
500 |a Description based upon print version of record 
015 |a GBC3F7658 
776 |z 9781800568631 
776 |z 1800565798 
776 |z 9781800565791 
856 4 0 |u https://learning.oreilly.com/library/view/~/9781800565791/?ar  |x Verlag  |3 Volltext 
082 0 |a 331 
082 0 |a 006.3 
520 |a The first book on the market to dive deep into Transformer models, this step-by-step guide helps data and AI practitioners enhance the performance of language understanding and gain expertise in hands-on implementation of transformers using PyTorch, TensorFlow, Hugging Face, Trax, and AllenNLP.