From fd3beb77a704f7c4dc88f24820cf45d2fe5312ec Mon Sep 17 00:00:00 2001
From: DavidCooperCheyney
Date: Mon, 23 Aug 2021 16:05:44 -0400
Subject: [PATCH] Windows install

---
 EmoPy.egg-info/PKG-INFO     | 118 +++++++++++++++++++++++++++++++-----
 EmoPy.egg-info/SOURCES.txt  |   8 +++
 EmoPy.egg-info/requires.txt |  13 ++--
 README.md                   |  43 +++++++++++--
 setup.py                    |   6 +-
 5 files changed, 160 insertions(+), 28 deletions(-)

diff --git a/EmoPy.egg-info/PKG-INFO b/EmoPy.egg-info/PKG-INFO
index a549bfb..783eb28 100644
--- a/EmoPy.egg-info/PKG-INFO
+++ b/EmoPy.egg-info/PKG-INFO
@@ -7,18 +7,18 @@ Author: ThoughtWorks Arts
 Author-email: info@thoughtworksarts.io
 License: UNKNOWN
 Description: # EmoPy
- EmoPy is a python toolkit with deep neural net classes which aims to make accurate predictions of emotions given images of people's faces.
+ EmoPy is a Python toolkit with deep neural net classes which predict human emotional expression classifications given images of people's faces. The goal of this project is to explore the field of [Facial Expression Recognition (FER)](https://en.wikipedia.org/wiki/Emotion_recognition) using existing public datasets, and make neural network models which are free, open, easy to research, and easy to integrate into other projects.

- ![Labeled FER Images](readme_docs/labeled_images.png "Labeled Facial Expression Images")
+ ![Labeled FER Images](readme_docs/labeled_images_7.png "Labeled Facial Expression Images")
 *Figure from [@Chen2014FacialER]*

- The goal of this project is to explore the field of [Facial Expression Recognition (FER)](https://en.wikipedia.org/wiki/Emotion_recognition) using existing public datasets, and make neural network models which are free, open, easy to research, and easy to integrate into different projects. The behavior of the system is highly dependent on the available data, and the developers of EmoPy created and tested the system using only publicly-available datasets.
+ The behavior of the system is highly dependent on the available data, and the developers of EmoPy created and tested the system using only publicly-available datasets.

 To get a better grounding in the project you may find these write-ups useful:

 * [Recognizing human facial expressions with machine learning](https://www.thoughtworks.com/insights/blog/recognizing-human-facial-expressions-machine-learning)
 * [EmoPy: a machine learning toolkit for emotional expression](https://www.thoughtworks.com/insights/blog/emopy-machine-learning-toolkit-emotional-expression)

- We aim to expand our development community, and we are open to suggestions and contributions. Usually these types of algorithms are used commercially, so we want to help open source the best possible version of them in order to improve public access and engagement in this area. Please [contact us](mailto:aperez@thoughtworks.com) to discuss.
+ We aim to expand our development community, and we are open to suggestions and contributions. Usually these types of algorithms are used commercially, so we want to help open source the best possible version of them in order to improve public access and engagement in this area. Please contact an EmoPy maintainer (see below) to discuss.

 ## Overview
@@ -34,7 +34,7 @@ Description: # EmoPy

 The `fermodel.py` module uses pre-trained models for FER prediction, making it the easiest entry point to get a trained model up and running quickly.

- Each of the modules contains one class, except for `neuralnets.py`, which has one interface and four subclasses.
Each of these subclasses implements a different neural net architecture using the Keras framework with Tensorflow backend, allowing you to experiment and see which one performs best for your needs.
+ Each of the modules contains one class, except for `neuralnets.py`, which has one interface and five subclasses. Each of these subclasses implements a different neural net architecture using the Keras framework with Tensorflow backend, allowing you to experiment and see which one performs best for your needs.

 The [EmoPy documentation](https://emopy.readthedocs.io/) contains detailed information on the classes and their interactions. Also, an overview of the different neural nets included in this project is included below.
@@ -61,16 +61,30 @@ Description: # EmoPy

 ## Environment Setup

- EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX. You can install [Python 3.6.6](https://www.python.org/downloads/release/python-366/) from the Python website. Python is compatible with multiple operating systems. If you would like to use EmoPy on another OS, please convert these instructions to match your target environment. Let us know how you get on, and we will try to support you and share you results.
- If you do not have Homebrew installed run this command to install:
+ ### macOS
+
+ Before beginning, if you do not have Homebrew installed run this command to install:

 ```
 /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
 ```

+ EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX.
+
+ There are 2 ways you can install Python 3.6.6:
+
+ 1. Directly from the [Python website](https://www.python.org/downloads/release/python-366/), or
+ 2. Using [pyenv](https://github.com/pyenv/pyenv):
+
+ ```
+ $ brew install pyenv
+ $ pyenv install 3.6.6
+ ```
+
 GraphViz is required for visualisation functions.

 ```
@@ -86,7 +100,14 @@ Description: # EmoPy
 ```
 python3.6 -m venv venv
 ```
- where the second `venv` is the name of your virtual environment. To activate, run from the same directory:
+
+ Or if using pyenv:
+
+ ```
+ $ pyenv exec python3.6 -m venv venv
+ ```
+
+ Where the second `venv` is the name of your virtual environment. To activate, run from the same directory:
 ```
 source venv/bin/activate
 ```
@@ -94,6 +115,35 @@ Description: # EmoPy

 (To deactivate the virtual environment run ```deactivate``` in the command line. You'll know it has been deactivated when the prefix ```(venv)``` disappears.)

+
+ ### Windows
+
+ This setup works on Windows 10 using Python 3.6.6 and 3.6.8.
+
+ Download Python directly from the [Python website](https://www.python.org/downloads/release/python-366/), or, if you are already using pyenv, install 3.6.6 with pyenv and activate that version before doing the next steps.
+
+ The next step is to set up a virtual environment. If you prefer the standalone virtualenv package to Python's built-in `venv` module, install it with:
+ ```
+ pip install virtualenv
+ ```
+
+ Create and activate the virtual environment. Run:
+ ```
+ python -m venv emoPyVenv
+ ```
+
+ Where `emoPyVenv` is the name of your virtual environment. To activate it from the Windows Command Prompt, run from the same directory:
+ ```
+ emoPyVenv\Scripts\activate
+ ```
+ (If you are using Git Bash, run `source emoPyVenv/Scripts/activate` instead.)
+
+ Your terminal command line should now be prefixed with ```(emoPyVenv)```.
+
+ (To deactivate the virtual environment run ```deactivate``` in the command line. You'll know it has been deactivated when the prefix ```(emoPyVenv)``` disappears.)
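Once the virtual environment is active, it is worth confirming that it picked up the interpreter EmoPy expects before installing anything. The following is a minimal sanity-check sketch added here for illustration; it is not part of this patch:

```python
import sys

# EmoPy is tested against Python 3.6.x; warn if the active venv
# resolved to a different interpreter version.
version = sys.version.split()[0]
if sys.version_info[:2] != (3, 6):
    print("Warning: EmoPy is tested on Python 3.6.x, but this venv runs", version)
else:
    print("Python", version, "- OK for EmoPy")
```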
+
+ Once you have created the virtual environment, each time you activate it, Python 3.6.6 will be the active version.
+
+ You should do the installation from source as described below. Then each time you activate the virtual environment, you'll have access to the installation of EmoPy.
+

 ## Installation
@@ -121,6 +171,17 @@ Description: # EmoPy

 Now you're ready to go!

+ ## Running tests
+
+ You can run the tests with:
+
+ ```
+ python EmoPy/tests/run_all.py
+ ```
+
+ We encourage improvements and additions to these tests!
+
+
 ## Running the examples

 You can find example code to run each of the current neural net classes in [examples](EmoPy/examples). You may either download the example directory to a location of your choice on your machine, or find the example directory included in the installation.
@@ -165,7 +226,7 @@ Description: # EmoPy
 To train your own neural net, use one of our FER neural net classes to get started. You can try the convolutional_model.py example:

 ```
- python convolutional_example.py
+ python convolutional_model.py
 ```

 The example first initializes the model. A summary of the model architecture will be printed out. This includes a list of all the neural net layers and the shape of their output. Our models are built using the Keras framework, which offers this visualization function.
@@ -202,9 +263,13 @@ Description: # EmoPy
 This model uses a technique known as [Transfer Learning](https://www.analyticsvidhya.com/blog/2017/06/transfer-learning-the-art-of-fine-tuning-a-pre-trained-model/), where pre-trained deep neural net models are used as starting points. The pre-trained models it uses are trained on images to classify objects. The model then retrains the pre-trained models using facial expression images with emotion classifications rather than object classifications. It adds a couple of top layers to the original model to match the number of target emotions we want to classify and reruns the training algorithm with a set of facial expression images. It only uses still images, no temporal context.

+ #### ConvolutionalNNDropout
+
+ This model is the most recent addition to EmoPy. It is a 2D Convolutional Neural Network that implements dropout, batch normalization, and L2 regularization. It is currently performing with a training accuracy of 0.7045 and a validation accuracy of 0.6536 when classifying 7 emotions. Further training will be done to determine how it performs on smaller subsets of emotions.
+
 ## Performance

- Currently the ConvolutionalLstmNN model is performing best when classifying 7 emotions with a validation accuracy of 47.5%. The table below shows accuracy values of this model and the TransferLearningNN model when trained on all seven standard emotions and on a subset of three emotions (fear, happiness, neutral). They were trained on 5,000 images from the [FER+](https://github.com/Microsoft/FERPlus) dataset.
+ Before implementing the ConvolutionalNNDropout model, the ConvolutionalLstmNN model was performing best when classifying 7 emotions with a validation accuracy of 47.5%. The table below shows accuracy values of this model and the TransferLearningNN model when trained on all seven standard emotions and on a subset of three emotions (fear, happiness, neutral). They were trained on 5,000 images from the [FER+](https://github.com/Microsoft/FERPlus) dataset.
 | Neural Net Model    | 7 emotions        |                     | 3 emotions        |                     |
 |---------------------|-------------------|---------------------|-------------------|---------------------|
@@ -249,10 +314,35 @@ Description: # EmoPy

 Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):

- [removed: the previous Markdown all-contributors table; its avatar image markup is not recoverable from this copy. It credited angelicaperez37 (πŸ’» πŸ“ πŸ“–), sbriley (πŸ’»), Sofia Tania (πŸ’»), Andrew McWilliams (πŸ“– πŸ€”), Webs (πŸ’»), Sara GW (πŸ’»), Megan Sullivan (πŸ“–), sadnantw (πŸ’» ⚠️), Julien Deswaef (πŸ’» πŸ“–), Tanushri Chakravorty (πŸ’» πŸ’‘) and Linas VepΕ‘tas (πŸ”Œ).]
+ [added: an HTML all-contributors table; its cell markup is not recoverable from this copy. It credits the same contributors plus Emily Sachs (πŸ’»), Diana Gamez (πŸ’»), dtoakley (πŸ“– πŸ’»), Anju (🚧) and Satish Dash (🚧).]

 This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!

diff --git a/EmoPy.egg-info/SOURCES.txt b/EmoPy.egg-info/SOURCES.txt
index def5bb6..4676198 100644
--- a/EmoPy.egg-info/SOURCES.txt
+++ b/EmoPy.egg-info/SOURCES.txt
@@ -142,6 +142,7 @@ EmoPy/models/conv_emotion_map_25.json
 EmoPy/models/conv_model_01.json
 EmoPy/models/conv_model_012.hdf5
 EmoPy/models/conv_model_012.json
+EmoPy/models/conv_model_0123456.h5
 EmoPy/models/conv_model_02.json
 EmoPy/models/conv_model_024.json
 EmoPy/models/conv_model_025.hdf5
@@ -185,6 +186,13 @@ EmoPy/src/dataset.py
 EmoPy/src/directory_data_loader.py
 EmoPy/src/fermodel.py
 EmoPy/src/neuralnets.py
+EmoPy/tests/__init__.py
+EmoPy/tests/run_all.py
+EmoPy/tests/unittests/__init__.py
+EmoPy/tests/unittests/test_data_generator.py
+EmoPy/tests/unittests/test_dataloader.py
+EmoPy/tests/unittests/library/__init__.py
+EmoPy/tests/unittests/library/image_test.py
 EmoPy/tests/unittests/resources/dummy_data_directory/happiness/sample_image.jpg
 EmoPy/tests/unittests/resources/dummy_empty_data_directory/.gitignore
 EmoPy/tests/unittests/resources/dummy_time_series_data_directory/happiness/sample1/sample_image.jpg

diff --git a/EmoPy.egg-info/requires.txt b/EmoPy.egg-info/requires.txt
index 9bf7843..bc2397c 100644
--- a/EmoPy.egg-info/requires.txt
+++ b/EmoPy.egg-info/requires.txt
@@ -1,14 +1,15 @@
-keras>=2.2.0
+coverage==4.5.3
+keras==2.2.4
 lasagne
 pytest
-matplotlib>2.1.0
-numpy<=1.14.5,>=1.13.3
-scikit-image>=0.13.1
+numpy==1.17.4
+matplotlib==2.2.0
+scikit-image==0.13.1
 scikit-learn>=0.19.1
 scikit-neuralnetwork>=0.7
 scipy==1.0.0
-tensorflow>=1.10.1
+tensorflow==1.13.1
 opencv-python
-h5py
+h5py==2.9.0
 pydot
 graphviz

diff --git a/README.md b/README.md
index 0ac9f00..3cbf70c 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,7 @@
 # EmoPy
 EmoPy is a Python toolkit with deep neural net classes which predict human emotional expression classifications given images of people's faces. The goal of this project is to explore the field of [Facial Expression Recognition (FER)](https://en.wikipedia.org/wiki/Emotion_recognition) using existing public datasets, and make neural network models which are free, open, easy to research, and easy to integrate into other projects.

-![Labeled FER Images](readme_docs/labeled_images_7.png "Labeled Facial Expression Images")
+![Labeled FER Images](readme_docs/labeled_images_7.png "Labeled Facial Expression Images")
 *Figure from [@Chen2014FacialER]*

 The behavior of the system is highly dependent on the available data, and the developers of EmoPy created and tested the system using only publicly-available datasets.
@@ -53,15 +53,19 @@ Predictions ideally perform well on a diversity of datasets, illumination condit

 ## Environment Setup

+Python is compatible with multiple operating systems. If you would like to use EmoPy on another OS, please convert these instructions to match your target environment. Let us know how you get on, and we will try to support you and share your results.

+### macOS
+
 Before beginning, if you do not have Homebrew installed run this command to install:

 ```
 /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
 ```

-EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX.
+EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX.
There are 2 ways you can install Python 3.6.6:
@@ -71,7 +75,7 @@ There are 2 ways you can install Python 3.6.6:
 ```
 $ brew install pyenv
 $ pyenv install 3.6.6
-```
+```

 GraphViz is required for visualisation functions.
@@ -103,6 +107,35 @@ Your terminal command line should now be prefixed with ```(venv)```.

 (To deactivate the virtual environment run ```deactivate``` in the command line. You'll know it has been deactivated when the prefix ```(venv)``` disappears.)

+
+### Windows
+
+This setup works on Windows 10 using Python 3.6.6 and 3.6.8.
+
+Download Python directly from the [Python website](https://www.python.org/downloads/release/python-366/), or, if you are already using pyenv, install 3.6.6 with pyenv and activate that version before doing the next steps.
+
+The next step is to set up a virtual environment. If you prefer the standalone virtualenv package to Python's built-in `venv` module, install it with:
+```
+pip install virtualenv
+```
+
+Create and activate the virtual environment. Run:
+```
+python -m venv emoPyVenv
+```
+
+Where `emoPyVenv` is the name of your virtual environment. To activate it from the Windows Command Prompt, run from the same directory:
+```
+emoPyVenv\Scripts\activate
+```
+(If you are using Git Bash, run `source emoPyVenv/Scripts/activate` instead.)
+
+Your terminal command line should now be prefixed with ```(emoPyVenv)```.
+
+(To deactivate the virtual environment run ```deactivate``` in the command line. You'll know it has been deactivated when the prefix ```(emoPyVenv)``` disappears.)
+
+Once you have created the virtual environment, each time you activate it, Python 3.6.6 will be the active version.
+
+You should do the installation from source as described below. Then each time you activate the virtual environment, you'll have access to the installation of EmoPy.
+
 ## Installation
@@ -138,7 +171,7 @@ You can run the tests with:

 python EmoPy/tests/run_all.py
 ```

-We encourage improvements and additions to these tests!
+We encourage improvements and additions to these tests!

 ## Running the examples
@@ -186,7 +219,7 @@ To train your own neural net, use one of our FER neural net classes to get start

 ```
 python convolutional_model.py
-```
+```

 The example first initializes the model. A summary of the model architecture will be printed out. This includes a list of all the neural net layers and the shape of their output. Our models are built using the Keras framework, which offers this visualization function.

diff --git a/setup.py b/setup.py
index 7a62e2e..5fc8a91 100644
--- a/setup.py
+++ b/setup.py
@@ -1,6 +1,6 @@
 import setuptools

-with open("README.md","r") as fh:
+with open("README.md","r",encoding="UTF-8") as fh:
     long_description = fh.read()

 setuptools.setup(
@@ -24,15 +24,15 @@
         'keras==2.2.4',
         'lasagne',
         'pytest',
-        'matplotlib>2.1.0',
         'numpy==1.17.4',
+        'matplotlib==2.2.0',
         'scikit-image==0.13.1',
         'scikit-learn>=0.19.1',
         'scikit-neuralnetwork>=0.7',
         'scipy==1.0.0',
         'tensorflow==1.13.1',
         'opencv-python',
-        'h5py',
+        'h5py==2.9.0',
         'pydot',
         'graphviz',
     ]
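As a final smoke test once the package is installed in the virtual environment, a prediction can be run through the pre-trained models. The sketch below is adapted from the fermodel example referenced above; the import path follows the `EmoPy/src/fermodel.py` layout in this patch's file listing, and the emotion subset and image path are illustrative assumptions:

```python
from EmoPy.src.fermodel import FERModel

# Pick a subset of emotions covered by the bundled pre-trained models.
target_emotions = ['calm', 'anger', 'happiness']

# Load the matching pre-trained model; verbose=True prints progress.
model = FERModel(target_emotions, verbose=True)

# Classify a single face image from disk (the path is a placeholder).
model.predict('path/to/face_image.jpg')
```

If this prints a predicted emotion without errors, the TensorFlow, Keras, and h5py versions pinned in setup.py are wired up correctly.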