Transformers for Natural Language Processing : Build, train, and fine-tune deep neural network architectures for NLP with Python, PyTorch, 2/e (Paperback)

Rothman, Denis

Product Description

Learn how transformers work under the hood, fine-tune GPT-3 models, and explore DeBERTa, vision models, and the start of the Metaverse, using a variety of NLP platforms: Hugging Face, OpenAI API, Trax, and AllenNLP

Key Features

  • Implement models, such as BERT, Reformer, and T5, that outperform classical language models
  • Compare NLP applications using GPT-3, GPT-2, and other transformers
  • Analyze advanced use cases, including polysemy, cross-lingual learning, and computer vision

Book Description

Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence.

Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translations, speech-to-text, text-to-speech, language modeling, question-answering, and many more NLP domains with transformers.

An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP.

This book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. Also, see how transformers can create code using just a brief description.

By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.

What you will learn

  • Discover new ways of performing NLP techniques with the latest pretrained transformers
  • Grasp the workings of the original Transformer, GPT-3, BERT, T5, DeBERTa, and Reformer
  • Find out how ViT and CLIP label images (including blurry ones!) and reconstruct images using DALL-E
  • Carry out sentiment analysis, text summarization, causal language analysis, machine translation, and more using TensorFlow, PyTorch, and GPT-3
  • Measure the productivity of key transformers to define their scope, potential, and limits in production
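The second bullet above concerns the inner workings of the original Transformer. Its core operation, scaled dot-product attention, can be sketched in a few lines of plain Python (a minimal illustration for orientation, not code from the book; the toy Q, K, and V matrices are made up for demonstration):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (lists of floats)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output row = attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: 2 queries attending over 3 key/value pairs (d_k = 2).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(attention(Q, K, V))
```

Production transformers run this operation in parallel across many heads on tensors, but the arithmetic per head is exactly this weighted mixing of value vectors.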

Who this book is for

If you want to learn about and apply transformers to your natural language (and image) data, this book is for you.

A good understanding of NLP, Python, and deep learning is required to benefit most from this book. Many of the platforms covered provide interactive user interfaces, allowing readers with a general interest in NLP and AI to follow several chapters.


Table of Contents

1. What are Transformers?
2. Getting Started with the Architecture of the Transformer Model
3. Fine-Tuning BERT Models
4. Pretraining a RoBERTa Model from Scratch
5. Downstream NLP Tasks with Transformers
6. Machine Translation with the Transformer
7. The Rise of Suprahuman Transformers with GPT-3 Engines
8. Applying Transformers to Legal and Financial Documents for AI Text Summarization
9. Matching Tokenizers and Datasets
10. Semantic Role Labeling with BERT-Based Transformers
11. Let Your Data Do the Talking: Story, Questions, and Answers
12. Detecting Customer Emotions to Make Predictions
13. Analyzing Fake News with Transformers
14. Interpreting Black Box Transformer Models
15. From NLP to Task-Agnostic Transformer Models
16. The Emergence of Transformer-Driven Copilots
Appendix I ― Terminology of Transformer Models
