Set of notebooks associated with Chapter 3 of the book.
## 🔖 Outline
To be added
## 🗒️ Notebooks
Set of notebooks associated with the chapter.
1. **[One-Hot Encoding](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/01_OneHotEncoding.ipynb)**: Here we demonstrate one-hot encoding from first principles as well as scikit-learn's implementation, on our toy corpus (sketch after the list).
2. **[Bag of Words](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/02_Bag_of_Words.ipynb)**: Here we demonstrate how to arrive at the bag-of-words representation for our toy corpus (sketch after the list).
3. **[Bag of N-Grams](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/03_Bag_of_N_Grams.ipynb)**: Here we demonstrate how the bag-of-n-grams representation works, using our toy corpus (sketch after the list).
4. **[TF-IDF](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/04_TF_IDF.ipynb)**: Here we demonstrate how to obtain the TF-IDF representation of a document using sklearn's TfidfVectorizer, again on our toy corpus (sketch after the list).
5. **[Pre-trained Word Embeddings](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/05_Pre_Trained_Word_Embeddings.ipynb)**: Here we demonstrate how to represent text using pre-trained word embedding models and how to use them to get representations for the full text (sketch after the list).
6. **[Custom Word Embeddings](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/06_Training_embeddings_using_gensim.ipynb)**: Here we demonstrate how to train a custom word embedding model (word2vec) with gensim, on both our toy corpus and a subset of Wikipedia data (sketch after the list).
7. **[Vector Representations via averaging](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/07_DocVectors_using_averaging_Via_spacy.ipynb)**: Here we demonstrate averaging of document vectors using spaCy (sketch after the list).
9. **[Visualizing Embeddings Using TSNE](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/09_Visualizing_Embeddings_Using_TSNE.ipynb)**: Here we demonstrate how to use dimensionality-reduction techniques such as t-SNE to visualize embeddings (sketch after the list).
10. **[Visualizing Embeddings using Tensorboard](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/10_Visualizing_Embeddings_using_Tensorboard.ipynb)**: Here we demonstrate how to visualize embeddings using TensorBoard (sketch after the list).
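The sketches below illustrate each notebook's technique, in order, on a made-up three-sentence toy corpus; the book's actual corpus, model choices, and parameters may differ, and the notebooks remain the authoritative versions. First, one-hot encoding, built from first principles and then with scikit-learn's `OneHotEncoder`:

```python
from sklearn.preprocessing import OneHotEncoder

# Toy corpus invented for illustration.
corpus = ["dog bites man", "man bites dog", "dog eats meat"]

# From first principles: map each vocabulary word to an index, then to a one-hot vector.
vocab = {w: i for i, w in enumerate(sorted({w for doc in corpus for w in doc.split()}))}

def one_hot(word):
    vec = [0] * len(vocab)
    vec[vocab[word]] = 1
    return vec

print(one_hot("dog"))

# scikit-learn's OneHotEncoder expects a 2-D array of categorical values.
enc = OneHotEncoder()
print(enc.fit_transform([[w] for w in vocab]).toarray())
```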
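Bag of words (notebook 2): a minimal sketch using scikit-learn's `CountVectorizer`:

```python
from sklearn.feature_extraction.text import CountVectorizer

corpus = ["dog bites man", "man bites dog", "dog eats meat"]

cv = CountVectorizer()
bow = cv.fit_transform(corpus)       # document-term count matrix
print(cv.get_feature_names_out())    # learned vocabulary (scikit-learn >= 1.0)
print(bow.toarray())                 # one row of word counts per document
```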
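Bag of n-grams (notebook 3): the same vectorizer with an `ngram_range`:

```python
from sklearn.feature_extraction.text import CountVectorizer

corpus = ["dog bites man", "man bites dog"]

# ngram_range=(1, 2) counts unigrams and bigrams together;
# (2, 2) would give a pure bag of bigrams.
cv = CountVectorizer(ngram_range=(1, 2))
bong = cv.fit_transform(corpus)
print(cv.get_feature_names_out())
print(bong.toarray())
```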
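TF-IDF (notebook 4), via sklearn's `TfidfVectorizer`:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = ["dog bites man", "man bites dog", "dog eats meat"]

tfidf = TfidfVectorizer()
rep = tfidf.fit_transform(corpus)
print(tfidf.get_feature_names_out())
print(tfidf.idf_)        # learned inverse document frequencies
print(rep.toarray())     # one L2-normalized TF-IDF row per document
```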
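Pre-trained word embeddings (notebook 5): a sketch assuming gensim's downloader and its published `glove-wiki-gigaword-50` model; the notebook may load a different, larger model:

```python
import numpy as np
import gensim.downloader as api

# Downloads on first use (~66 MB for this model).
wv = api.load("glove-wiki-gigaword-50")

print(wv["beautiful"][:5])                 # first dimensions of one word vector
print(wv.most_similar("king", topn=3))

# One simple full-text representation: average the in-vocabulary word vectors.
tokens = "the king is dead".lower().split()
doc_vec = np.mean([wv[t] for t in tokens if t in wv], axis=0)
print(doc_vec.shape)
```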
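Custom word embeddings (notebook 6): training word2vec with gensim (4.x API) on the toy sentences; training on a Wikipedia subset works the same way, just with more data:

```python
from gensim.models import Word2Vec

# Pre-tokenized toy sentences.
sentences = [["dog", "bites", "man"], ["man", "bites", "dog"], ["dog", "eats", "meat"]]

# min_count=1 keeps every word of the tiny corpus; sg=1 selects skip-gram, sg=0 CBOW.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(model.wv["dog"][:5])
print(model.wv.most_similar("dog", topn=2))
```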
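Document vectors via averaging (notebook 7): spaCy's `Doc.vector` is the average of the token vectors, assuming a pipeline that ships word vectors such as `en_core_web_md`:

```python
import spacy

# Needs a pipeline with vectors, e.g. after:
#   python -m spacy download en_core_web_md
nlp = spacy.load("en_core_web_md")

doc = nlp("Dog bites man.")
print(doc.vector.shape)                      # averaged token vectors
print(doc.similarity(nlp("Man bites dog.")))
```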
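Visualizing embeddings with t-SNE (notebook 9): project trained vectors down to 2-D with scikit-learn's `TSNE`; plotting is omitted here to keep the sketch short:

```python
import numpy as np
from sklearn.manifold import TSNE
from gensim.models import Word2Vec

sentences = [["dog", "bites", "man"], ["man", "bites", "dog"], ["dog", "eats", "meat"]]
model = Word2Vec(sentences, vector_size=50, min_count=1, seed=1)

words = list(model.wv.index_to_key)
vectors = np.array([model.wv[w] for w in words])

# perplexity must stay below the number of points, hence the tiny value here.
coords = TSNE(n_components=2, perplexity=2, random_state=0).fit_transform(vectors)
for word, (x, y) in zip(words, coords):
    print(f"{word}: ({x:.2f}, {y:.2f})")
```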
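Visualizing embeddings with TensorBoard (notebook 10): one route, not necessarily the notebook's, is PyTorch's `SummaryWriter.add_embedding`, which writes projector data TensorBoard can display (requires the `tensorboard` package); the embedding matrix below is random stand-in data:

```python
import torch
from torch.utils.tensorboard import SummaryWriter

# Hypothetical random vectors standing in for a trained model's embeddings.
words = ["dog", "bites", "man", "eats", "meat"]
vectors = torch.randn(len(words), 50)

writer = SummaryWriter("runs/toy_embeddings")
writer.add_embedding(vectors, metadata=words)  # appears in TensorBoard's Projector tab
writer.close()
# Then run: tensorboard --logdir runs/toy_embeddings
```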