


Trax – your path to advanced deep learning


Trax helps you understand and explore advanced deep learning. We focus on making Trax code clear while pushing advanced models like Reformer to their limits. Trax is actively used and maintained in the Google Brain team. Give it a try, talk to us, or open an issue if needed.

Use Trax

You can use Trax either as a library from your own Python scripts and notebooks or as a binary from the shell, which can be more convenient for training large models. Trax includes a number of deep learning models (ResNet, Transformer, RNNs, …) and has bindings to a large number of deep learning datasets, including TensorFlow Datasets. It runs without any changes on CPUs, GPUs and TPUs.
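As a rough illustration of those dataset bindings, the sketch below streams MNIST from TensorFlow Datasets through Trax's input pipeline. It assumes the trax.data combinators (Serial, TFDS, Shuffle, Batch) available in recent Trax releases; exact names and signatures may differ in your version.

    # A minimal sketch, assuming the trax.data pipeline combinators of recent Trax releases.
    from trax import data

    data_pipeline = data.Serial(
        data.TFDS('mnist', keys=('image', 'label'), train=True),  # TensorFlow Datasets binding
        data.Shuffle(),
        data.Batch(32),
    )

    train_stream = data_pipeline()       # a python generator of batches
    images, labels = next(train_stream)  # e.g. images (32, 28, 28, 1), labels (32,)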

To see how to use Trax as a library, take a look at this quick start colab, which explains how to create data in Python, connect it to a Transformer model in Trax, train it and run inference. You can select a CPU or GPU runtime, or even get a free 8-core TPU as a runtime. With TPUs in colab you need to set extra flags as demonstrated in these training and inference colabs.
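If you just want a feel for the library API without opening the colab, here is a minimal sketch of building and running a tiny model from Trax layers. The layer names follow trax.layers as used in the Trax intro notebooks, but the sizes are illustrative and exact signatures may vary between versions.

    import numpy as np
    from trax import layers as tl, shapes

    # A tiny classifier assembled from Trax layers (sizes are illustrative).
    model = tl.Serial(
        tl.Embedding(vocab_size=1000, d_feature=32),
        tl.Mean(axis=1),     # average the embeddings over the sequence dimension
        tl.Dense(2),
        tl.LogSoftmax(),
    )

    # Initialize weights from an input signature, then run a forward pass.
    x = np.arange(8, dtype=np.int32).reshape(1, 8)  # one sequence of 8 token ids
    model.init(shapes.signature(x))
    log_probs = model(x)                            # shape (1, 2)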

To use Trax as a binary and not forget all the parameters (model type, learning rate, other hyper-parameters and training settings), we recommend gin-config. Take a look at an example gin config for training a simple MLP on MNIST and run it as follows:

python -m trax.trainer --config_file=$PWD/trax/configs/mlp_mnist.gin
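A gin file is just a list of bindings from configurable parameters to values. The snippet below shows the general shape of such a config; the parameter names are illustrative, not copied from the actual mlp_mnist.gin.

    # Illustrative gin bindings; the names are hypothetical, not the real mlp_mnist.gin.
    import trax.models
    import trax.optimizers

    train.model = @trax.models.MLP           # which model class to train
    train.optimizer = @trax.optimizers.Adam  # which optimizer to use
    Adam.learning_rate = 0.01                # hyper-parameter binding
    train.steps = 2000                       # how long to train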

As a more advanced example, you can train a Reformer on Imagenet to generate images like this with the following command:

python -m trax.trainer --config_file=$PWD/trax/configs/reformer_imagenet.gin

Structure

Trax code is structured in a way that allows you to understand deep learning from scratch. We start with basic math and go through layers, models, supervised and reinforcement learning. We get to advanced deep learning results, including recent papers such as Reformer: The Efficient Transformer, selected for oral presentation at ICLR.

The main steps needed to understand deep learning correspond to sub-directories in the Trax code:

    math/ – basic math operations and ways to accelerate them on GPUs and TPUs (through JAX and TensorFlow)
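As a small taste of that layer of the code, the sketch below uses Trax's accelerated numpy and grad. Depending on the Trax version this module is called trax.math or trax.fastmath, so treat the import path as an assumption.

    # A minimal sketch, assuming trax.fastmath (called trax.math in older versions).
    from trax import fastmath
    from trax.fastmath import numpy as jnp  # numpy-style API that runs on CPU/GPU/TPU

    x = jnp.arange(10.0)
    print(jnp.sum(x * x))                   # ordinary numpy-style computation

    # Gradients of a python function, in the style of JAX:
    grad_fn = fastmath.grad(lambda v: jnp.sum(v * v))
    print(grad_fn(x))                       # equals 2 * x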
