Deep Learning for NLP - Part 3

seeders: 12
leechers: 10

  • Downloads: 85
  • Language: English

Files

[ FreeCourseWeb.com ] Udemy - Deep Learning for NLP - Part 3
  • Get Bonus Downloads Here.url (0.2 KB)
  • ~Get Your Files Here !
    1. Sentence Embeddings
    • 1. Introduction-en_US.srt (2.3 KB)
    • 1. Introduction.mp4 (9.9 MB)
    • 10. Multi-Task Learning MILAMSR Sentence Embeddings-en_US.srt (5.7 KB)
    • 10. Multi-Task Learning MILAMSR Sentence Embeddings.mp4 (30.0 MB)
    • 11. SentenceBERT-en_US.srt (17.1 KB)
    • 11. SentenceBERT.mp4 (89.2 MB)
    • 12. Summary-en_US.srt (3.7 KB)
    • 12. Summary.mp4 (11.8 MB)
    • 2. Bag of Words approaches-en_US.srt (12.2 KB)
    • 2. Bag of Words approaches.mp4 (60.3 MB)
    • 3. Unsupervised methods Doc2Vec-en_US.srt (13.9 KB)
    • 3. Unsupervised methods Doc2Vec.mp4 (66.6 MB)
    • 4. Unsupervised methods SkipThought and QuickThoughts-en_US.srt (13.4 KB)
    • 4. Unsupervised methods SkipThought and QuickThoughts.mp4 (77.5 MB)
    • 5. Supervised method RecNNs and Deep Averaging Networks-en_US.srt (13.1 KB)
    • 5. Supervised method RecNNs and Deep Averaging Networks.mp4 (57.1 MB)
    • 6. Supervised method InferSent-en_US.srt (15.4 KB)
    • 6. Supervised method InferSent.mp4 (78.1 MB)
    • 7. CNNs for semantic similarity DSSM-en_US.srt (6.6 KB)
    • 7. CNNs for semantic similarity DSSM.mp4 (29.9 MB)
    • 8. Multi-Task Learning USE-en_US.srt (17.1 KB)
    • 8. Multi-Task Learning USE.mp4 (81.6 MB)
    • 9. Multi-Task Learning MTDNN-en_US.srt (9.3 KB)
    • 9. Multi-Task Learning MTDNN.mp4 (40.8 MB)
    2. Generative Transformer Models
    • 1. Introduction-en_US.srt (3.8 KB)
    • 1. Introduction.mp4 (11.4 MB)
    • 2. UniLM-en_US.srt (10.6 KB)
    • 2. UniLM.mp4 (63.4 MB)
    • 3. Transformer-XL and XLNet-en_US.srt (67.5 KB)
    • 3. Transformer-XL and XLNet.mp4 (370.3 MB)
    • 4. MASS-en_US.srt (11.3 KB)
    • 4. MASS.mp4 (60.4 MB)
    • 5. BART-en_US.srt (15.7 KB)
    • 5. BART.mp4 (85.7 MB)
    • 6. CTRL-en_US.srt (13.6 KB)
    • 6. CTRL.mp4 (79.7 MB)
    • 7. T5-en_US.srt (33.2 KB)
    • 7. T5.mp4 (202.8 MB)
    • 8. ProphetNet-en_US.srt (18.3 KB)
    • 8. ProphetNet.mp4 (101.6 MB)
    • 9. Summary-en_US.srt (4.0 KB)
    • 9. Summary.mp4 (18.8 MB)
    • Bonus Resources.txt (0.3 KB)

Description

Deep Learning for NLP - Part 3



MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 1.58 GB | Duration: 3h 26m
What you'll learn
Deep Learning for Natural Language Processing
Sentence Embeddings: Bag of words, Doc2Vec, SkipThought, InferSent, DSSM, USE, MTDNN, SentenceBERT
Generative Transformer Models: UniLM, Transformer-XL and XLNet, MASS, BART, CTRL, T5, ProphetNet
DL for NLP
Requirements
Basics of machine learning
Recurrent Models: RNNs, LSTMs, GRUs and variants
Basic understanding of Transformer based models and word embeddings
This course is part of the "Deep Learning for NLP" series. In this course, I will introduce concepts such as sentence embeddings and Generative Transformer Models. These concepts form the basis for a good understanding of advanced deep learning models for modern Natural Language Generation.

The course consists of two main sections as follows.

In the first section, I will talk about sentence embeddings. We will start with basic bag-of-words methods, where sentence embeddings are obtained by aggregating the word embeddings of the constituent words. We will talk about averaged bag of words, Word Mover's Distance, SIF, and the power means method. Then we will discuss two unsupervised methods: Doc2Vec and SkipThought. Further, we will discuss supervised sentence embedding methods like recursive neural networks, deep averaging networks, and InferSent. CNNs can also be used for computing semantic similarity between two text strings; we will talk about DSSMs for this purpose. We will also discuss three multi-task learning methods, including the Universal Sentence Encoder (USE) and MT-DNN. Lastly, I will talk about SentenceBERT.
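As a minimal sketch of the averaged bag-of-words idea mentioned above (using a hypothetical toy embedding table for illustration; a real setup would load pretrained vectors such as GloVe or word2vec):

```python
import numpy as np

# Hypothetical toy word-embedding table (3-dimensional vectors).
word_vectors = {
    "deep":     np.array([0.2, 0.8, 0.1]),
    "learning": np.array([0.6, 0.4, 0.3]),
    "is":       np.array([0.1, 0.1, 0.1]),
    "fun":      np.array([0.9, 0.2, 0.5]),
}

def averaged_bow_embedding(sentence, vectors):
    """Sentence embedding as the mean of its words' embeddings,
    skipping out-of-vocabulary tokens."""
    tokens = [t for t in sentence.lower().split() if t in vectors]
    if not tokens:
        # No known words: fall back to a zero vector of the right size.
        return np.zeros(next(iter(vectors.values())).shape)
    return np.mean([vectors[t] for t in tokens], axis=0)

emb = averaged_bow_embedding("Deep learning is fun", word_vectors)
```

Methods like SIF refine this simple average with frequency-based weighting and a common-component removal step, but the aggregation-over-word-vectors core is the same.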

In the second section, I will talk about multiple Generative Transformer Models. We will start with UniLM. Then we will talk about segment-level recurrence and relative position embeddings in Transformer-XL, and move on to XLNet, which combines Transformer-XL with permutation language modeling. Next, we will understand span masking in MASS and also discuss the various noising methods in BART. We will then discuss controlled natural language generation using CTRL. We will discuss how T5 casts every learning task as a text-to-text task. Finally, we will discuss how ProphetNet extends the 2-stream attention of XLNet to n-stream attention, thereby enabling n-gram prediction.
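A rough sketch of the span-masking corruption used in MASS-style pretraining (the mask fraction, mask token name, and span placement here are illustrative assumptions, not the paper's exact recipe):

```python
import random

MASK = "[MASK]"

def mask_span(tokens, frac=0.5, seed=0):
    """Replace one contiguous span (about `frac` of the sequence)
    with [MASK] tokens; return the corrupted input and the target span
    the decoder would be trained to reconstruct."""
    rng = random.Random(seed)
    span_len = max(1, int(len(tokens) * frac))
    start = rng.randrange(len(tokens) - span_len + 1)
    target = tokens[start:start + span_len]
    corrupted = tokens[:start] + [MASK] * span_len + tokens[start + span_len:]
    return corrupted, target

corrupted, target = mask_span(["the", "cat", "sat", "on", "the", "mat"])
```

BART generalizes this kind of corruption to several noising schemes (token deletion, sentence permutation, text infilling with variable-length spans), with the decoder reconstructing the original sequence.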





Trackers

udp://tracker.torrent.eu.org:451/announce
udp://tracker.tiny-vps.com:6969/announce
http://tracker.foreverpirates.co:80/announce
udp://tracker.cyberia.is:6969/announce
udp://exodus.desync.com:6969/announce
udp://explodie.org:6969/announce
udp://tracker.opentrackr.org:1337/announce
udp://9.rarbg.to:2780/announce
udp://tracker.internetwarriors.net:1337/announce
udp://ipv4.tracker.harry.lu:80/announce
udp://open.stealth.si:80/announce
udp://9.rarbg.to:2900/announce
udp://9.rarbg.me:2720/announce
udp://opentor.org:2710/announce



Torrent hash: 630B4E604975ABE28229984D9F8C6801C22F1EF1