
Differentiable Neural Computer (DNC)

This package provides an implementation of the Differentiable Neural Computer, as published in Nature.

Any publication that discloses findings arising from using this source code must cite "Hybrid computing using a neural network with dynamic external memory", Nature 538, 471–476 (October 2016) doi:10.1038/nature20101.

Introduction

The Differentiable Neural Computer is a recurrent neural network. At each timestep, its state consists of the current memory contents (plus auxiliary information such as memory usage), and it maps the input at time t to the output at time t. It is implemented as a collection of RNNCore modules, which allows different modules to be plugged together to experiment with variations on the architecture.

  • The access module is where the main DNC logic happens, as this is where memory is written to and read from. At every timestep, the input to an access module is a vector passed from the controller, and its output is the contents read from memory. It uses two further RNNCores: TemporalLinkage, which tracks the order of memory writes, and Freeness, which tracks which memory locations have been written to and not yet subsequently "freed". These are both defined in addressing.py.

  • The controller module "controls" memory access. Typically, it is just a feedforward or (possibly deep) LSTM network, whose inputs are the inputs to the overall recurrent network at that time, concatenated with the read memory output from the access module from the previous timestep.

  • The dnc module simply wraps the access module and the controller module, and forms the basic RNNCore unit of the overall architecture (see the sketch below). This is defined in dnc.py.

Figure: DNC architecture.
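As a rough illustration of how these pieces plug together, the sketch below wires a controller and an access module into a single recurrent step. All class names, state fields, and signatures here are illustrative, not the actual API of this repository; see dnc.py and addressing.py for the real definitions.

```python
import collections
import tensorflow as tf

# Illustrative state container; the real state layout is defined in dnc.py.
DNCState = collections.namedtuple(
    'DNCState', ['controller_state', 'access_state', 'read_vectors'])

class DNCStepSketch:
    """Illustrative only: one DNC timestep wiring controller + access."""

    def __init__(self, controller, access):
        self.controller = controller  # e.g. a (possibly deep) LSTM core
        self.access = access          # the memory read/write module

    def __call__(self, inputs, prev_state):
        # The controller sees the task input concatenated with the
        # memory words read on the previous timestep.
        controller_input = tf.concat(
            [inputs, prev_state.read_vectors], axis=-1)
        controller_out, controller_state = self.controller(
            controller_input, prev_state.controller_state)

        # The access module turns the controller output into read/write
        # operations on memory and returns the newly read words.
        read_vectors, access_state = self.access(
            controller_out, prev_state.access_state)

        output = tf.concat([controller_out, read_vectors], axis=-1)
        return output, DNCState(controller_state, access_state, read_vectors)
```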

Installation

make install

The above command will create a virtual environment and install the dependencies and pre-commit hooks.

Run source venv/bin/activate in the root directory of this repository to activate the installed virtual env.

Testing

make test

Run unit tests in tests/ using pytest.
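Equivalently, with the virtual environment active, the tests can be invoked with pytest directly:

$ pytest tests/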

Train

The DNC requires an installation of TensorFlow and Sonnet. An example training script is provided for the algorithmic task of repeatedly copying a given input string. It can be run with ipython:

$ ipython train.py

You can specify training options, including parameters to the model and optimizer, via flags:

$ python train.py --memory_size=64 --num_bits=8 --max_length=3

# Or with ipython:
$ ipython train.py -- --memory_size=64 --num_bits=8 --max_length=3

Periodic saving, or 'checkpointing', of the model is disabled by default. To enable it, use the checkpoint_interval flag; e.g. --checkpoint_interval=10000 creates a checkpoint every 10,000 steps. Checkpoints are written to ./logs/repeat_copy/checkpoint by default, and training can be resumed from there. To specify an alternate checkpoint directory, use the log_dir flag. Note: delete or move any existing checkpoints before resuming training with different model parameters, to avoid shape inconsistency errors.
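For example, to checkpoint every 10,000 steps into a custom directory (the directory name here is just an illustration):

$ python train.py --checkpoint_interval=10000 --log_dir=./logs/my_run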

More generally, the DNC class found in dnc.py can be used as a standard TensorFlow RNN core and unrolled with TensorFlow RNN ops, such as keras.layers.RNN, on any sequential task.
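For instance, a minimal sketch of unrolling the DNC over a batch of sequences with keras.layers.RNN might look like the following. The constructor arguments here are illustrative assumptions; check dnc.py for the actual signature.

```python
import tensorflow as tf
import dnc

# Illustrative configuration; the real constructor arguments are
# defined in dnc.py.
dnc_core = dnc.DNC(
    access_config=dict(memory_size=64, word_size=16,
                       num_reads=4, num_writes=1),
    controller_config=dict(hidden_size=128),
    output_size=8)

# keras.layers.RNN unrolls a cell-like core over the time dimension.
rnn = tf.keras.layers.RNN(dnc_core, return_sequences=True)

inputs = tf.random.normal([16, 20, 8])  # [batch, time, features]
outputs = rnn(inputs)                   # [batch, time, output_size]
```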

Model Inspection

jupyter notebook interactive.ipynb

The interactive.ipynb notebook loads a trained model from checkpoints. It provides helper functions for evaluating arbitrary input bit sequences and visualizing the output and intermediate read/write states.

tensorboard --logdir logs/repeat_copy/

TensorBoard visualization of the test/train loss and the TensorFlow graph. Test/train loss is emitted at the interval set by the report_interval flag.

Disclaimer: This is not an official Google product.
