# Back-2-Back Translation

## Dataset

The dataset used is WMT-14 en-de.

## Model Training

See colab notebook.

## Results

# Learning from Explanations with Neural Execution Tree