diff --git a/EmoPy.egg-info/PKG-INFO b/EmoPy.egg-info/PKG-INFO
index a549bfb..783eb28 100644
--- a/EmoPy.egg-info/PKG-INFO
+++ b/EmoPy.egg-info/PKG-INFO
@@ -7,18 +7,18 @@ Author: ThoughtWorks Arts
Author-email: info@thoughtworksarts.io
License: UNKNOWN
Description: # EmoPy
- EmoPy is a python toolkit with deep neural net classes which aims to make accurate predictions of emotions given images of people's faces.
+ EmoPy is a Python toolkit with deep neural net classes which predicts human emotional expression classifications given images of people's faces. The goal of this project is to explore the field of [Facial Expression Recognition (FER)](https://en.wikipedia.org/wiki/Emotion_recognition) using existing public datasets, and make neural network models which are free, open, easy to research, and easy to integrate into other projects.
- 
+ 
*Figure from [@Chen2014FacialER]*
- The goal of this project is to explore the field of [Facial Expression Recognition (FER)](https://en.wikipedia.org/wiki/Emotion_recognition) using existing public datasets, and make neural network models which are free, open, easy to research, and easy to integrate into different projects. The behavior of the system is highly dependent on the available data, and the developers of EmoPy created and tested the system using only publicly-available datasets.
+ The behavior of the system is highly dependent on the available data, and the developers of EmoPy created and tested the system using only publicly-available datasets.
To get a better grounding in the project you may find these write-ups useful:
* [Recognizing human facial expressions with machine learning](https://www.thoughtworks.com/insights/blog/recognizing-human-facial-expressions-machine-learning)
* [EmoPy: a machine learning toolkit for emotional expression](https://www.thoughtworks.com/insights/blog/emopy-machine-learning-toolkit-emotional-expression)
- We aim to expand our development community, and we are open to suggestions and contributions. Usually these types of algorithms are used commercially, so we want to help open source the best possible version of them in order to improve public access and engagement in this area. Please [contact us](mailto:aperez@thoughtworks.com) to discuss.
+ We aim to expand our development community, and we are open to suggestions and contributions. Usually these types of algorithms are used commercially, so we want to help open source the best possible version of them in order to improve public access and engagement in this area. Please contact an EmoPy maintainer (see below) to discuss.
## Overview
@@ -34,7 +34,7 @@ Description: # EmoPy
The `fermodel.py` module uses pre-trained models for FER prediction, making it the easiest entry point to get a trained model up and running quickly.
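+ For example, a minimal prediction script along the lines of the bundled fermodel example (the emotion subset and sample image here are illustrative) looks like this:
+
+ ```
+ from EmoPy.src.fermodel import FERModel
+ from pkg_resources import resource_filename
+
+ # Choose the subset of emotions the pre-trained model should classify
+ target_emotions = ['calm', 'anger', 'happiness']
+ model = FERModel(target_emotions, verbose=True)
+
+ # Predict the expression in one of the sample images shipped with the examples
+ model.predict(resource_filename('EmoPy.examples', 'image_data/sample_happy_image.png'))
+ ```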
- Each of the modules contains one class, except for `neuralnets.py`, which has one interface and four subclasses. Each of these subclasses implements a different neural net architecture using the Keras framework with Tensorflow backend, allowing you to experiment and see which one performs best for your needs.
+ Each of the modules contains one class, except for `neuralnets.py`, which has one interface and five subclasses. Each of these subclasses implements a different neural net architecture using the Keras framework with Tensorflow backend, allowing you to experiment and see which one performs best for your needs.
The [EmoPy documentation](https://emopy.readthedocs.io/) contains detailed information on the classes and their interactions. Also, an overview of the different neural nets included in this project is included below.
@@ -61,16 +61,30 @@ Description: # EmoPy
## Environment Setup
- EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX. You can install [Python 3.6.6](https://www.python.org/downloads/release/python-366/) from the Python website.
Python is compatible with multiple operating systems. If you would like to use EmoPy on another OS, please convert these instructions to match your target environment. Let us know how you get on, and we will try to support you and share your results.
- If you do not have Homebrew installed run this command to install:
+ ### macOS
+
+ Before beginning, if you do not have Homebrew installed, run this command to install it:
```
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
```
+ EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on macOS.
+
+ There are two ways you can install Python 3.6.6:
+
+ 1. Directly from the [Python website](https://www.python.org/downloads/release/python-366/), or
+ 2. Using [pyenv](https://github.com/pyenv/pyenv):
+
+ ```
+ $ brew install pyenv
+ $ pyenv install 3.6.6
+ ```
+
GraphViz is required for visualisation functions.
```
@@ -86,7 +100,14 @@ Description: # EmoPy
```
python3.6 -m venv venv
```
- where the second `venv` is the name of your virtual environment. To activate, run from the same directory:
+
+ Or if using pyenv:
+
+ ```
+ $ pyenv exec python3.6 -m venv venv
+ ```
+
+ In both cases, the second `venv` is the name of your virtual environment. To activate it, run from the same directory:
```
source venv/bin/activate
```
@@ -94,6 +115,35 @@ Description: # EmoPy
(To deactivate the virtual environment run ```deactivate``` in the command line. You'll know it has been deactivated when the prefix ```(venv)``` disappears.)
+
+ ### Windows
+
+ These instructions have been tested on Windows 10 using Python 3.6.6 and 3.6.8.
+
+ Download Python directly from the [Python website](https://www.python.org/downloads/release/python-366/), or, if you are already using pyenv, install 3.6.6 with pyenv and activate it before doing the next steps.
+
+ The next step is to set up a virtual environment. The built-in `venv` module is used below, so no separate virtualenv installation is needed.
+
+ Create the virtual environment. Run:
+ ```
+ python -m venv emoPyVenv
+ ```
+
+ Here `emoPyVenv` is the name of your virtual environment. To activate it, run from the same directory:
+ ```
+ emoPyVenv\Scripts\activate
+ ```
+ Your terminal command line should now be prefixed with ```(emoPyVenv)```.
+
+ (To deactivate the virtual environment run ```deactivate``` in the command line. You'll know it has been deactivated when the prefix ```(emoPyVenv)``` disappears.)
+
+ Once you have created the virtual environment, it will use Python 3.6.6 each time you activate it.
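+
+ For example, after activating you can confirm which interpreter the environment uses:
+
+ ```
+ python --version
+ ```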
+
+ You should then follow the installation from source instructions described below. Each time you activate the virtual environment, you'll have access to that installation of EmoPy.
+
## Installation
@@ -121,6 +171,17 @@ Description: # EmoPy
Now you're ready to go!
+ ## Running tests
+
+ You can run the tests with:
+
+ ```
+ python EmoPy/tests/run_all.py
+ ```
+
+ We encourage improvements and additions to these tests!
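+
+ For example, a new test can follow the standard `unittest` pattern (the file placement, class name, and asserted behaviour here are illustrative assumptions, not part of the existing suite):
+
+ ```
+ import unittest
+
+ from EmoPy.src.fermodel import FERModel
+
+ class FERModelValidationTest(unittest.TestCase):
+     # Hypothetical check: FERModel validates its emotion subset,
+     # so an unsupported label should raise an error
+     def test_rejects_unsupported_emotion(self):
+         with self.assertRaises(ValueError):
+             FERModel(['not-a-real-emotion'])
+
+ if __name__ == '__main__':
+     unittest.main()
+ ```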
+
+
## Running the examples
You can find example code to run each of the current neural net classes in [examples](EmoPy/examples). You may either download the example directory to a location of your choice on your machine, or find the example directory included in the installation.
@@ -165,7 +226,7 @@ Description: # EmoPy
To train your own neural net, use one of our FER neural net classes to get started. You can try the convolutional_model.py example:
```
- python convolutional_example.py
+ python convolutional_model.py
```
The example first initializes the model. A summary of the model architecture will be printed out. This includes a list of all the neural net layers and the shape of their output. Our models are built using the Keras framework, which offers this visualization function.
@@ -202,9 +263,13 @@ Description: # EmoPy
This model uses a technique known as [Transfer Learning](https://www.analyticsvidhya.com/blog/2017/06/transfer-learning-the-art-of-fine-tuning-a-pre-trained-model/), where pre-trained deep neural net models are used as starting points. The pre-trained models it uses were originally trained on images to classify objects. The model then retrains them using facial expression images with emotion classifications rather than object classifications. It adds a couple of top layers to the original model to match the number of target emotions we want to classify and reruns the training algorithm with a set of facial expression images. It uses only still images, with no temporal context.
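+ As an illustration of the technique (a minimal Keras sketch, not the actual TransferLearningNN implementation; the base model choice and layer sizes are assumptions):
+
+ ```
+ from keras.applications.inception_v3 import InceptionV3
+ from keras.layers import Dense, GlobalAveragePooling2D
+ from keras.models import Model
+
+ num_emotions = 7  # size of the target emotion set
+
+ # Start from a model pre-trained to classify objects (ImageNet weights)
+ base_model = InceptionV3(weights='imagenet', include_top=False)
+
+ # Freeze the pre-trained layers so training updates only the new top layers
+ for layer in base_model.layers:
+     layer.trainable = False
+
+ # Add a couple of top layers sized to the number of target emotions
+ x = GlobalAveragePooling2D()(base_model.output)
+ x = Dense(1024, activation='relu')(x)
+ predictions = Dense(num_emotions, activation='softmax')(x)
+
+ model = Model(inputs=base_model.input, outputs=predictions)
+ model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
+ # model.fit(...) would then rerun training on facial expression images
+ ```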
+ #### ConvolutionalNNDropout
+
+ This model is the most recent addition to EmoPy. It is a 2D Convolutional Neural Network that implements dropout, batch normalization, and L2 regularization. It is currently performing with a training accuracy of 0.7045 and a validation accuracy of 0.6536 when classifying 7 emotions. Further training will be done to determine how it performs on smaller subsets of emotions.
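+
+ The building blocks it combines look roughly like this (a simplified Keras sketch; the filter counts, input shape, and rates are illustrative, not the class's actual hyperparameters):
+
+ ```
+ from keras.layers import (Activation, BatchNormalization, Conv2D, Dense,
+                           Dropout, Flatten, MaxPooling2D)
+ from keras.models import Sequential
+ from keras.regularizers import l2
+
+ num_emotions = 7  # size of the target emotion set
+
+ model = Sequential([
+     # 2D convolution with L2 weight regularization (48x48 grayscale input is assumed)
+     Conv2D(32, (3, 3), input_shape=(48, 48, 1), kernel_regularizer=l2(0.01)),
+     # Batch normalization stabilizes layer inputs during training
+     BatchNormalization(),
+     Activation('relu'),
+     MaxPooling2D(pool_size=(2, 2)),
+     # Dropout randomly disables units each step to reduce overfitting
+     Dropout(0.5),
+     Flatten(),
+     Dense(num_emotions, activation='softmax'),
+ ])
+ model.compile(optimizer='adam', loss='categorical_crossentropy')
+ ```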
+
## Performance
- Currently the ConvolutionalLstmNN model is performing best when classifying 7 emotions with a validation accuracy of 47.5%. The table below shows accuracy values of this model and the TransferLearningNN model when trained on all seven standard emotions and on a subset of three emotions (fear, happiness, neutral). They were trained on 5,000 images from the [FER+](https://github.com/Microsoft/FERPlus) dataset.
+ Before implementing the ConvolutionalNNDropout model, the ConvolutionalLstmNN model was performing best when classifying 7 emotions with a validation accuracy of 47.5%. The table below shows accuracy values of this model and the TransferLearningNN model when trained on all seven standard emotions and on a subset of three emotions (fear, happiness, neutral). They were trained on 5,000 images from the [FER+](https://github.com/Microsoft/FERPlus) dataset.
| Neural Net Model | 7 emotions | | 3 emotions | |
|---------------------|-------------------|---------------------|-------------------|---------------------|
@@ -249,10 +314,35 @@ Description: # EmoPy
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
- | [angelicaperez37](https://github.com/angelicaperez37) 💻 📝 📖 | [sbriley](https://github.com/sbriley) 💻 | [Sofia Tania](http://tania.pw) 💻 | [Andrew McWilliams](https://jahya.net) 📖 🤔 | [Webs](http://www.websonthewebs.com) 💻 | [Sara GW](https://github.com/saragw6) 💻 | [Megan Sullivan](http://www.linkedin.com/in/meganesu) 📖 |
- | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
- | [sadnantw](https://github.com/sadnantw) 💻 ⚠️ | [Julien Deswaef](http://xuv.be) 💻 📖 | [Tanushri Chakravorty](https://github.com/sinbycos) 💻 💡 | [Linas Vepštas](http://linas.org) 🔌 |
+
+ * [angelicaperez37](https://github.com/angelicaperez37) 💻 📝 📖
+ * [sbriley](https://github.com/sbriley) 💻
+ * [Sofia Tania](http://tania.pw) 💻
+ * [Andrew McWilliams](https://jahya.net) 📖 🤔
+ * [Webs](http://www.websonthewebs.com) 💻
+ * [Sara GW](https://github.com/saragw6) 💻
+ * [Megan Sullivan](http://www.linkedin.com/in/meganesu) 📖
+ * [sadnantw](https://github.com/sadnantw) 💻 ⚠️
+ * [Julien Deswaef](http://xuv.be) 💻 📖
+ * [Tanushri Chakravorty](https://github.com/sinbycos) 💻 💡
+ * [Linas Vepštas](http://linas.org) 🔌
+ * Emily Sachs 💻
+ * Diana Gamez 💻
+ * dtoakley 📖 💻
+ * Anju 🚧
+ * Satish Dash 🚧