
Designing and constructing convolutional autoencoders using nature-inspired algorithms



Nature-Inspired Algorithm-driven Convolutional Autoencoder Architecture search: Empowered by High-Performance Computing (HPC)


Description 📝

The proposed method, NiaNetCAE, attempts to find the hyperparameters and convolutional autoencoder architecture that result in successful encoding and decoding (minimal difference between input and output). NiaNetCAE uses the collection of algorithms available in the NiaPy library to navigate this vast search space efficiently.
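
As a rough illustration of the idea (not the actual NiaNetCAE implementation), the sketch below shows how a NiaPy algorithm searches a four-dimensional space in which each candidate solution is scored by a fitness function. In NiaNetCAE the fitness would come from building, training, and evaluating the decoded convolutional autoencoder; the CAESearchProblem class and its placeholder fitness here are hypothetical.

import numpy as np
from niapy.problems import Problem
from niapy.task import Task
from niapy.algorithms.basic import ParticleSwarmAlgorithm

# Illustrative stand-in for the real NiaNetCAE problem definition.
class CAESearchProblem(Problem):
    def __init__(self):
        # Four genes in [0, 1]: layer step, number of layers,
        # activation function, optimizer algorithm.
        super().__init__(dimension=4, lower=0.0, upper=1.0)

    def _evaluate(self, genes):
        # In NiaNetCAE this would decode the genes into a CAE, train it,
        # and return its error; here it is only a placeholder value.
        return float(np.sum(genes ** 2))

task = Task(problem=CAESearchProblem(), max_evals=100)
algorithm = ParticleSwarmAlgorithm(population_size=10)
best_genes, best_fitness = algorithm.run(task)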

What can it do? 👀

  • Constructs novel convolutional autoencoder architectures using nature-inspired algorithms.
  • Selects the best hyperparameters for the given dataset.
  • Can be applied to any dataset consisting of 3D image data.
  • Applied here to depth estimation.

Our example of the search space for the depth estimation problem:

  • Search space is defined by:
    • Architecture problem:
      • x: layer step (difference between input and output dimension)
      • y: Number of layers (architecture depth)
    • Hyperparameter problem:
      • z: activation function
      • w: optimizer algorithm
  • Total solutions:
    • x: 304
    • y: 304
    • z: 8
    • w: 6
    • x * y * z * w = 4,435,968 unique solutions
  • NiaNetCAE can find the best solution in this space by leveraging the power of nature-inspired algorithms (a sketch of how a solution vector maps to these dimensions follows below).
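
A minimal sketch, assuming the four dimensions are encoded as a solution vector in [0, 1]^4, of how such a vector could be decoded into concrete choices. The decode function and its constants are illustrative; the actual mapping lives in conv_ae.py and may differ.

# Hypothetical decoding of a solution vector in [0, 1]^4 into the
# search-space dimensions above; the real mapping is in conv_ae.py.
LAYER_STEPS = 304   # x
MAX_LAYERS = 304    # y
ACTIVATIONS = 8     # z
OPTIMIZERS = 6      # w

def decode(genes):
    to_index = lambda g, n: min(int(g * n), n - 1)
    x = to_index(genes[0], LAYER_STEPS) + 1   # layer step, 1..304
    y = to_index(genes[1], MAX_LAYERS) + 1    # number of layers, 1..304
    z = to_index(genes[2], ACTIVATIONS)       # activation index, 0..7
    w = to_index(genes[3], OPTIMIZERS)        # optimizer index, 0..5
    return x, y, z, w

# 304 * 304 * 8 * 6 = 4,435,968 unique (x, y, z, w) combinations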

Installation ✅

Installing NiaNetCAE with pip3:

TODO: Publish it to PyPI

pip3 install nianetcae

Documentation 📘

The following paper provides an in-depth understanding of the NiaNetCAE approach.

TODO - Future Journal: NiaNetCAE for depth estimation

Examples

Usage examples can be found here. Currently, there is an example of finding a suitable convolutional autoencoder for depth estimation on the NYU2 dataset.

Getting started 🔨

Create your own example:
  1. Replace the dataset in the data folder.
  2. Modify the parameters in main_config.py.
  3. Adjust the dataloader logic in the dataloaders folder.
  4. Specify the search space in conv_ae.py for your problem domain.
  5. Redesign the fitness function in cae_architecture_search.py based on your optimization goal (see the sketch after this list).
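
A hedged sketch of what a reconstruction-based fitness function could look like; the actual function in cae_architecture_search.py may weight additional terms (for example model size or training time). The fitness helper below is hypothetical.

import torch
import torch.nn.functional as F

# Illustrative fitness: mean squared error over a validation set (lower is
# better). Targets can be the inputs themselves for pure reconstruction,
# or depth maps for depth estimation. Use device="cpu" if no GPU is available.
def fitness(model, val_loader, device="cuda"):
    model.to(device).eval()
    total_error, total_elements = 0.0, 0
    with torch.no_grad():
        for inputs, targets in val_loader:
            inputs, targets = inputs.to(device), targets.to(device)
            outputs = model(inputs)
            total_error += F.mse_loss(outputs, targets, reduction="sum").item()
            total_elements += targets.numel()
    return total_error / total_elements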
Changing dataset:

Once the dataset is changed, the dataloaders need to be modified so that they can forward the new data shape to the models.
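
For illustration only (the actual classes live in the dataloaders folder), a minimal PyTorch Dataset/DataLoader that yields image-depth pairs of the new shape might look like this; the DepthPairs class and tensor shapes are assumptions, not the shipped code.

import torch
from torch.utils.data import Dataset, DataLoader

# Hypothetical dataset wrapper; replace the tensors with your own data.
class DepthPairs(Dataset):
    def __init__(self, images, depths):
        self.images = images   # e.g. shape (N, 3, H, W)
        self.depths = depths   # e.g. shape (N, 1, H, W)

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        return self.images[idx], self.depths[idx]

loader = DataLoader(
    DepthPairs(torch.rand(8, 3, 240, 320), torch.rand(8, 1, 240, 320)),
    batch_size=4,
    shuffle=True,
)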

Specify the search space:

Set the boundaries of your search space with conv_ae.py.

The following dimensions can be modified:

  • x: layer step
  • y: number of layers
  • z: activation function
  • w: optimizer algorithm

You can run the NiaNetCAE script once your setup is complete.

Running the NiaNetCAE script with Docker:

docker build --tag spartan300/nianet:cae .

docker run \
  --name=nianet-cae \
  -it \
  -v $(pwd)/logs:/app/nianetcae/logs \
  -v $(pwd)/data:/app/data \
  -v $(pwd)/configs:/app/configs \
  -w="/app" \
  --shm-size 8G \
  --gpus all spartan300/nianet:cae \
  python main.py
Running the NiaNetCAE script with Poetry:
  1. Run the installation via poetry install
  2. Then run the task with poetry run poe autoinstall-torch-cuda
Running the NiaNetCAE script on an HPC cluster with SLURM:
  1. First, build the image with Docker (see the example above).
  2. Push it to Docker Hub: docker push username/nianet:cae
  3. SSH into an HPC cluster with your access credentials.
  4. Create the following nianetcae.sh script: cat > nianetcae.sh
#!/bin/bash
## Running code on SLURM cluster
##https://pytorch-lightning.readthedocs.io/en/stable/clouds/cluster_advanced.html
#SBATCH -J nianet-pso
#SBATCH -o nianet-pso-%j.out
#SBATCH -e nianet-pso-%j.err
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --partition=gpu
#SBATCH --mem-per-gpu=8GB  # memory per GPU
#SBATCH --gres=gpu:1
#SBATCH --time=72:00:00

singularity exec -e \
    --pwd /app \
    -B $(pwd)/logs:/app/logs,$(pwd)/data:/app/data,$(pwd)/configs:/app/configs \
    --nv docker://spartan300/nianet:cae \
    python main.py -alg particle_swarm
  5. Make the script executable: chmod +x nianetcae.sh
  6. Make sure that you have the following folders in your current directory: logs, data, configs
  7. Set folder permissions to 777: chmod -R 777 logs data configs
  8. Submit the script to the job scheduler: sbatch nianetcae.sh

HELP ⚠️

[email protected]

Acknowledgments 🎓

License

This package is distributed under the MIT License. This license can be found online at http://www.opensource.org/licenses/MIT.

Disclaimer

This framework is provided as-is, and there are no guarantees that it fits your purposes or that it is bug-free. Use it at your own risk!
