
Commit c925691 (1 parent: 1b79a6a)

convert readme to markdown, add note about min python versions

File tree: 2 files changed (+72 / -74 lines)


README.md

Lines changed: 72 additions & 0 deletions
# Welcome to RETURNN

[GitHub repository](https://github.com/rwth-i6/returnn),
[RETURNN paper 2016](https://arxiv.org/abs/1608.00895),
[RETURNN paper 2018](https://arxiv.org/abs/1805.05225).

RETURNN (RWTH extensible training framework for universal recurrent neural networks)
is a PyTorch/TensorFlow-based implementation of modern neural network architectures.
It is optimized for fast and reliable training of neural networks in a multi-GPU environment.

The high-level features and goals of RETURNN are:

- **Simplicity**
  - Writing a config / code is simple and straightforward (setting up an experiment, defining a model)
  - Debugging in case of problems is simple
  - Reading a config / code is simple (the defined model, training, and decoding all become clear)

- **Flexibility**
  - Allow for many different kinds of experiments / models

- **Efficiency**
  - Training speed
  - Decoding speed

All items are important for research; decoding speed is especially important for production.
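To make the simplicity point concrete: a RETURNN config is a plain Python file. The fragment below is a schematic sketch, not copied from any real setup; the key and layer names follow the style of the TF-backend net-dict configs but should be treated as illustrative.

```python
# Schematic RETURNN-style config (a plain Python file).
# All names and values here are illustrative, not a working setup.
task = "train"

num_inputs = 40     # e.g. input feature dimension
num_outputs = 100   # e.g. number of output classes

batch_size = 5000
learning_rate = 0.001

# The network is declared as a dict of layers,
# each layer referring to its inputs via "from".
network = {
    "lstm": {"class": "rec", "unit": "lstm", "n_out": 512, "from": "data"},
    "output": {"class": "softmax", "loss": "ce", "from": "lstm"},
}
```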
See our [Interspeech 2020 tutorial "Efficient and Flexible Implementation of Machine Learning for ASR and MT" video](https://www.youtube.com/watch?v=wPKdYqSOlAY)
([slides](https://www-i6.informatik.rwth-aachen.de/publications/download/1154/Zeyer--2020.pdf))
with an introduction of the core concepts.

More specific features include:

- Mini-batch training of feed-forward neural networks
- Sequence-chunking based batch training for recurrent neural networks
- Long short-term memory recurrent neural networks,
  including our own fast CUDA kernel
- Multidimensional LSTM (GPU only, there is no CPU version)
- Memory management for large data sets
- Work distribution across multiple devices
- Flexible and fast architecture which allows all kinds of encoder-attention-decoder models
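The idea behind sequence chunking, splitting long sequences into fixed-size, possibly overlapping windows so they pack efficiently into mini-batches, can be illustrated with a minimal sketch. This is not RETURNN's implementation; the chunk size and step below are arbitrary, not RETURNN defaults.

```python
def chunk_sequence(seq, chunk_size, chunk_step):
    """Split a sequence into fixed-size chunks, overlapping when
    chunk_step < chunk_size. The final shorter remainder is kept
    so no frames are dropped."""
    chunks = []
    for start in range(0, len(seq), chunk_step):
        chunks.append(seq[start:start + chunk_size])
        if start + chunk_size >= len(seq):
            break
    return chunks

# Example: a 10-frame sequence, chunk size 4, step 2 (50% overlap).
print(chunk_sequence(list(range(10)), chunk_size=4, chunk_step=2))
# → [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 7], [6, 7, 8, 9]]
```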
See the [documentation](https://returnn.readthedocs.io/).
See [basic usage](https://returnn.readthedocs.io/en/latest/basic_usage.html) and the [technological overview](https://returnn.readthedocs.io/en/latest/tech_overview.html).

[Here is the video recording of a RETURNN overview talk](https://www-i6.informatik.rwth-aachen.de/web/Software/returnn/downloads/workshop-2019-01-29/01.recording.cut.mp4)
([slides](https://www-i6.informatik.rwth-aachen.de/web/Software/returnn/downloads/workshop-2019-01-29/01.returnn-overview.session1.handout.v1.pdf),
[exercise sheet](https://www-i6.informatik.rwth-aachen.de/web/Software/returnn/downloads/workshop-2019-01-29/01.exercise_sheet.pdf); hosted by eBay).

There are [many example demos](https://github.com/rwth-i6/returnn/blob/master/demos/)
which work on artificially generated data,
i.e. they should work as-is.

There are [some real-world examples](https://github.com/rwth-i6/returnn-experiments),
such as setups for speech recognition on the Switchboard or LibriSpeech corpus.

Some benchmark setups against other frameworks
can be found [here](https://github.com/rwth-i6/returnn-benchmarks).
The results are in the [RETURNN paper 2016](https://arxiv.org/abs/1608.00895).
Performance benchmarks of our LSTM kernel vs. CuDNN and other TensorFlow kernels
are in the [TensorFlow LSTM benchmark](https://returnn.readthedocs.io/en/latest/tf_lstm_benchmark.html).

There is also [a wiki](https://github.com/rwth-i6/returnn/wiki).
Questions can also be asked on
[StackOverflow using the RETURNN tag](https://stackoverflow.com/questions/tagged/returnn).

[![CI](https://github.com/rwth-i6/returnn/workflows/CI/badge.svg)](https://github.com/rwth-i6/returnn/actions)
## Dependencies

pip dependencies are listed in `requirements.txt` and `requirements-dev`, although some parts of the code may require additional dependencies (e.g. `librosa`, `resampy`) on demand.

RETURNN supports Python >= 3.8. Bumps to the minimum Python version are listed in [`CHANGELOG.md`](https://github.com/rwth-i6/returnn/blob/master/CHANGELOG.md).
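As a quick sanity check before setting up an environment, the interpreter can be verified against this minimum version. This is a minimal sketch; `MIN_PYTHON` is just a local name for illustration, not a RETURNN variable.

```python
import sys

MIN_PYTHON = (3, 8)  # minimum version stated above

if sys.version_info < MIN_PYTHON:
    # Abort with a clear message on too-old interpreters.
    raise SystemExit("RETURNN requires Python >= %d.%d, found %s"
                     % (MIN_PYTHON + (sys.version.split()[0],)))
print("Python version OK:", sys.version.split()[0])
```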

README.rst

Lines changed: 0 additions & 74 deletions
This file was deleted.
