Unverified commit 14906723 authored by Bharath Ramsundar, committed by GitHub

Merge pull request #1067 from lilleswing/docs

Docs Update
parents f3ccbc93 93f527ed
+47 −47
@@ -34,7 +34,7 @@ sys.path.insert(0, os.path.abspath("../sphinxext"))
extensions = [
    'sphinx.ext.autodoc', 'sphinx.ext.autosummary', 'sphinx.ext.doctest',
    'sphinx.ext.intersphinx', 'sphinx.ext.coverage', 'sphinx.ext.mathjax',
-    'sphinx.ext.ifconfig', 'numpydoc', 'sphinx.ext.viewcode'
+    'sphinx.ext.ifconfig', 'sphinx.ext.viewcode', 'sphinx.ext.napoleon'
]

autosummary_generate = True
+0 −104
%% Cell type:markdown id: tags:

# Using Tensorboard in DeepChem

%% Cell type:markdown id: tags:

DeepChem neural network models are built on top of [tensorflow](https://www.tensorflow.org). [Tensorboard](https://www.tensorflow.org/get_started/summaries_and_tensorboard) is a powerful visualization tool in tensorflow for viewing your model architecture and performance.

In this tutorial we will show how to turn on tensorboard logging for our models, and show the network architecture of one of our more popular models.

%% Cell type:markdown id: tags:

The first thing we have to do is load a dataset that we will monitor model performance over.

%% Cell type:code id: tags:

``` python
import deepchem as dc
from deepchem.molnet import load_tox21
from deepchem.models.tensorgraph.models.graph_models import GraphConvTensorGraph

# Load Tox21 dataset
tox21_tasks, tox21_datasets, transformers = load_tox21(featurizer='GraphConv')
train_dataset, valid_dataset, test_dataset = tox21_datasets

```

%% Output

    /home/leswing/miniconda3/envs/deepchem/lib/python3.5/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
      from ._conv import register_converters as _register_converters

    Loading dataset from disk.
    Loading dataset from disk.
    Loading dataset from disk.

%% Cell type:markdown id: tags:

Now we will create our model with tensorboard logging turned on. All we have to do is pass the `tensorboard=True` flag to the constructor of our model.

%% Cell type:code id: tags:

``` python
# Construct the model with tensorboard on
model = GraphConvTensorGraph(len(tox21_tasks), mode='classification', tensorboard=True)

# Fit the model
model.fit(train_dataset, nb_epoch=10)
```

%% Output

    /home/leswing/miniconda3/envs/deepchem/lib/python3.5/site-packages/tensorflow/python/ops/gradients_impl.py:96: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
      "Converting sparse IndexedSlices to a dense Tensor of unknown shape. "

    Starting epoch 0
    Starting epoch 1
    Starting epoch 2
    Starting epoch 3
    Starting epoch 4
    Starting epoch 5
    Starting epoch 6
    Starting epoch 7
    Starting epoch 8
    Starting epoch 9
    Ending global_step 630: Average loss 940.645
    TIMING: model fitting took 52.005 s

    940.6450127495659

%% Cell type:markdown id: tags:

### Viewing the Tensorboard output
When tensorboard is turned on, we log all of the files needed for tensorboard in `model.model_dir`. To launch the tensorboard web server, run the following in a terminal, replacing `model.model_dir` with the actual path of the model directory:
```bash
tensorboard --logdir=model.model_dir
```

This will launch the tensorboard web server on your local computer on port 6006.  Go to http://localhost:6006 in your web browser to look through tensorboard's UI.
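
As a concrete sketch, the shell command can be assembled from the model's log directory in Python. Note that `tensorboard_command` is a hypothetical helper written for this tutorial, not part of the DeepChem API; the default port of 6006 matches tensorboard's own default.

%% Cell type:code id: tags:

``` python
# Hypothetical helper (not DeepChem API): build the tensorboard launch
# command from a model's log directory.
def tensorboard_command(model_dir, port=6006):
    return "tensorboard --logdir={} --port={}".format(model_dir, port)

# For a fitted DeepChem model, you would pass model.model_dir here.
print(tensorboard_command("/tmp/graphconv_model"))
# → tensorboard --logdir=/tmp/graphconv_model --port=6006
```

%% Cell type:markdown id: tags: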

The first thing you will see is a graph of the loss vs. mini-batches. You can use this data to determine whether your model is still improving its loss over time, or to find out if your gradients are exploding!
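
For intuition, here is a minimal, hypothetical check (plain Python, not a DeepChem or tensorboard utility) of the kind you might run on a recorded list of mini-batch losses to spot a curve that is growing instead of shrinking:

%% Cell type:code id: tags:

``` python
# Illustrative sketch: compare the average of the earliest and latest
# mini-batch losses; a much larger recent average suggests divergence
# (e.g. exploding gradients).
def loss_diverging(losses, window=3, factor=2.0):
    if len(losses) < 2 * window:
        return False
    early = sum(losses[:window]) / window
    recent = sum(losses[-window:]) / window
    return recent > factor * early

print(loss_diverging([10.0, 9.0, 8.5, 8.0, 7.5, 7.0]))   # → False (loss shrinking)
print(loss_diverging([1.0, 1.2, 2.0, 5.0, 20.0, 80.0]))  # → True (loss growing)
```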

%% Cell type:markdown id: tags:

![tensorboard_landing](assets/tensorboard_landing.png)

%% Cell type:markdown id: tags:

If you click "GRAPHS" at the top, you can see a visual layout of the model. Here is what our GraphConvTensorGraph model looks like:

%% Cell type:markdown id: tags:

![Graph Conv Architecture](assets/GraphConvArch.png)

%% Cell type:markdown id: tags:

The "GraphGather" box is the "Neural Fingerprint" developed by learning features of the molecule via graph convolutions.

Using tensorboard to visualize the model is a fast way to get a high-level understanding of what a model is made of and how it works!
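
To make the gather step concrete, here is a minimal numpy sketch of the pooling idea (an illustration written for this tutorial, not DeepChem's actual implementation): per-atom feature vectors are summed into a single molecule-level fingerprint.

%% Cell type:code id: tags:

``` python
import numpy as np

# Illustrative sketch (not DeepChem's implementation): pool per-atom
# feature vectors into one fingerprint per molecule by summing over
# each molecule's atoms.
def graph_gather(atom_features, molecule_ids):
    n_mols = max(molecule_ids) + 1
    fingerprints = np.zeros((n_mols, atom_features.shape[1]))
    for feats, mol in zip(atom_features, molecule_ids):
        fingerprints[mol] += feats
    return fingerprints

atoms = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
print(graph_gather(atoms, [0, 0, 1]))
# rows: molecule 0 = [4, 6], molecule 1 = [5, 6]
```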
