Publication date: July 2021
Official page: Manning
Download: Baidu Netdisk (True PDF + EPUB + MOBI)
Extraction code: x76s
Description:
Build custom NLP models in record time by adapting pretrained machine learning models to solve specialized problems.
In Transfer Learning for Natural Language Processing you will learn:
- Fine-tuning pretrained models with new domain data (see the sketch after this list)
- Picking the right model to reduce resource usage
- Transfer learning for neural network architectures
- Generating text with generative pretrained transformers
- Cross-lingual transfer learning with BERT
- Foundations for exploring NLP academic literature
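The first bullet is the core recipe the book keeps returning to: start from a pretrained checkpoint and continue training on your own labeled domain data. Below is a minimal sketch of that idea, assuming PyTorch and the Hugging Face transformers library as the toolkit; the checkpoint name and the two-example "dataset" are illustrative placeholders, not taken from the book.

```python
# Minimal sketch: fine-tune a pretrained transformer on a tiny labeled dataset.
# Assumes PyTorch + Hugging Face transformers; checkpoint and data are illustrative.
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-uncased"  # illustrative pretrained checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy in-domain data; replace with your own labeled examples.
texts = ["win a free prize now", "meeting moved to 3pm"]
labels = torch.tensor([1, 0])  # 1 = positive class, 0 = negative class

encodings = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")
dataset = TensorDataset(encodings["input_ids"], encodings["attention_mask"], labels)
loader = DataLoader(dataset, batch_size=2, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):  # a few epochs are often enough when fine-tuning
    for input_ids, attention_mask, batch_labels in loader:
        optimizer.zero_grad()
        outputs = model(input_ids=input_ids,
                        attention_mask=attention_mask,
                        labels=batch_labels)
        outputs.loss.backward()  # backprop through the pretrained weights
        optimizer.step()
```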
Training deep learning NLP models from scratch is costly, time-consuming, and requires massive amounts of data. In Transfer Learning for Natural Language Processing, DARPA researcher Paul Azunre reveals cutting-edge transfer learning techniques that apply customizable pretrained models to your own NLP architectures. You’ll learn how to use transfer learning to deliver state-of-the-art results for language comprehension, even when working with limited labeled data. Best of all, you’ll save on training time and computational costs.
about the technology
Build custom NLP models in record time, even with limited datasets! Transfer learning is a machine learning technique for adapting pretrained machine learning models to solve specialized problems. This powerful approach has revolutionized natural language processing, driving improvements in machine translation, business analytics, and natural language generation.
about the book
Transfer Learning for Natural Language Processing teaches you to create powerful NLP solutions quickly by building on existing pretrained models. This instantly useful book provides crystal-clear explanations of the concepts you need to grok transfer learning along with hands-on examples so you can practice your new skills immediately. As you go, you’ll apply state-of-the-art transfer learning methods to create a spam email classifier, a fact checker, and more real-world applications.
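The spam email classifier mentioned above is a good illustration of how far a pretrained model can go with little labeled data. Here is a minimal sketch of one such recipe, assuming the sentence-transformers and scikit-learn libraries, with a frozen pretrained encoder feeding a lightweight logistic-regression classifier; the model name and toy emails are illustrative choices, not the book's exact setup.

```python
# Minimal sketch: spam classification via a frozen pretrained sentence encoder
# plus a simple classifier. Assumes sentence-transformers and scikit-learn.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative pretrained encoder

emails = [
    "Congratulations, you won a free cruise! Click here.",
    "Can we reschedule tomorrow's standup to 10am?",
    "Limited offer: cheap meds, no prescription needed.",
    "Attached is the quarterly report you asked for.",
]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = ham

# Reuse the pretrained encoder as a frozen feature extractor,
# then train only the small classifier on top.
features = encoder.encode(emails)
clf = LogisticRegression().fit(features, labels)

print(clf.predict(encoder.encode(["You have been selected for a prize"])))
```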
what’s inside
- Fine-tuning pretrained models with new domain data
- Picking the right model to reduce resource use
- Transfer learning for neural network architectures
- Generating text with pretrained transformers (sketched below)
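For the text-generation bullet, a brief sketch using the Hugging Face pipeline API with a GPT-2 checkpoint; this is an assumed setup for illustration, and the book's own examples may use a different model or interface.

```python
# Minimal sketch: text generation with a pretrained transformer.
# Assumes the Hugging Face transformers pipeline API; "gpt2" is illustrative.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Transfer learning lets NLP practitioners",
                   max_length=40, num_return_sequences=1)
print(result[0]["generated_text"])
```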
about the reader
For machine learning engineers and data scientists with some experience in NLP.
about the author
Paul Azunre holds a PhD in Computer Science from MIT and has served as a Principal Investigator on several DARPA research programs.