Commit 855ffab

Updating figures & descriptions for Ch 3 (#23)

1 parent 43596b8
File tree

1 file changed: +40 −7 lines

1 file changed

+40
-7
lines changed

Ch3/README.md (+40 −7)
# Text Representation

## 🔖 Outline

To be added

## 🗒️ Notebooks

Set of notebooks associated with the chapter.

1. **[One-Hot Encoding](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/01_OneHotEncoding.ipynb)**: Here we demonstrate one-hot encoding from first principles as well as scikit-learn's implementation, on our toy corpus.

2. **[Bag of Words](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/02_Bag_of_Words.ipynb)**: Here we demonstrate how to arrive at the bag-of-words representation for our toy corpus.

3. **[Bag of N-Grams](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/03_Bag_of_N_Grams.ipynb)**: Here we demonstrate how bag of n-grams works, using our toy corpus.
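The three representations above can be sketched with scikit-learn's `CountVectorizer` alone; the corpus below is a stand-in (the notebooks use their own toy corpus), and the `binary=True` variant is only close in spirit to the notebook's one-hot treatment:

```python
from sklearn.feature_extraction.text import CountVectorizer

# Stand-in toy corpus; the notebooks define their own example sentences.
corpus = ["dog bites man", "man bites dog", "dog eats meat", "man eats food"]

# Bag of words: raw term counts per document.
bow = CountVectorizer()
bow_matrix = bow.fit_transform(corpus)
print(sorted(bow.vocabulary_))          # the learned vocabulary, 6 terms

# Binary presence/absence variant (close in spirit to one-hot per document).
onehot = CountVectorizer(binary=True)
onehot_matrix = onehot.fit_transform(corpus)

# Bag of n-grams: unigrams and bigrams together widen the feature space.
ngrams = CountVectorizer(ngram_range=(1, 2))
ngram_matrix = ngrams.fit_transform(corpus)
print(ngram_matrix.shape)               # 6 unigrams + 8 bigrams = 14 features
```

Note how the bigram vocabulary distinguishes "dog bites" from "bites dog", which the plain bag of words cannot.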
4. **[TF-IDF](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/04_TF_IDF.ipynb)**: Here we demonstrate how to obtain the TF-IDF representation of a document using sklearn's TfidfVectorizer (we will be using our toy corpus).
5. **[Pre-trained Word Embeddings](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/05_Pre_Trained_Word_Embeddings.ipynb)**: Here we demonstrate how we can represent text using pre-trained word embedding models and how to use them to get representations for the full text.
6. **[Custom Word Embeddings](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/06_Training_embeddings_using_gensim.ipynb)**: Here we demonstrate how to train a custom word embedding model (word2vec) using gensim on both our toy corpus and a subset of Wikipedia data.
7. **[Vector Representations via averaging](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/07_DocVectors_using_averaging_Via_spacy.ipynb)**: Here we demonstrate averaging of document vectors using spaCy.
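The averaging idea itself is simple. The notebook uses spaCy's pre-trained vectors; the sketch below substitutes a tiny hand-made embedding table (the 3-d vectors are made-up values) just to show the mechanics:

```python
import numpy as np

# Hypothetical 3-d embeddings; the notebook uses spaCy's pre-trained vectors.
embeddings = {
    "dog":   np.array([0.1, 0.3, 0.5]),
    "bites": np.array([0.7, 0.1, 0.2]),
    "man":   np.array([0.2, 0.6, 0.4]),
}

def doc_vector(text, table, dim=3):
    """Average the vectors of known tokens; zero vector if none are known."""
    vecs = [table[tok] for tok in text.lower().split() if tok in table]
    if not vecs:
        return np.zeros(dim)
    return np.mean(vecs, axis=0)

dv = doc_vector("dog bites man", embeddings)
print(dv)  # element-wise mean of the three word vectors
```

Averaging ignores word order, so "dog bites man" and "man bites dog" get the same document vector; that trade-off is exactly what the later notebooks on richer representations address.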

9. **[Visualizing Embeddings Using TSNE](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/09_Visualizing_Embeddings_Using_TSNE.ipynb)**: Here we demonstrate how we can use dimensionality reduction techniques such as t-SNE to visualize embeddings.

10. **[Visualizing Embeddings using Tensorboard](https://github.com/practical-nlp/practical-nlp/blob/master/Ch3/10_Visualizing_Embeddings_using_Tensorboard.ipynb)**: Here we demonstrate how we can visualize embeddings using TensorBoard.
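The t-SNE step reduces to a single scikit-learn call. The sketch below uses random high-dimensional points as stand-in "embeddings" (the notebook projects real trained word vectors); note that perplexity must stay below the number of samples:

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Stand-in embeddings: 8 points in 50 dimensions.
vectors = rng.normal(size=(8, 50))

# Project to 2-d for plotting; perplexity must be < n_samples.
tsne = TSNE(n_components=2, perplexity=3, random_state=0, init="random")
coords = tsne.fit_transform(vectors)
print(coords.shape)  # one (x, y) pair per input vector
```

The resulting `coords` array is what gets scattered onto a 2-d plot, with each point labelled by its word.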
## 🖼️ Figures

Color figures as requested by the readers.

![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-1.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-2.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-3.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-4.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-5.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-6.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-7.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-8.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-9.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-10.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-11.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-12.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-13.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-14.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-15.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-16.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-17.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-18.png)
![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/3-19.png)
