https://github.com/deepmind/sonnet

Revision Message Commit Date
c2b3eda Sonnet version update produced on Tuesday, 12. February 2019 PiperOrigin-RevId: 233617272 26 February 2019, 17:34:24 UTC
9c68120 Add mkdocs yaml file. PiperOrigin-RevId: 233616884 26 February 2019, 17:34:15 UTC
bcec0d4 internal change PiperOrigin-RevId: 233565498 12 February 2019, 16:51:22 UTC
b12c680 Support lower precision inputs. PiperOrigin-RevId: 233081564 12 February 2019, 16:51:14 UTC
84e800f internal change PiperOrigin-RevId: 233030785 12 February 2019, 16:51:04 UTC
d11f9c4 A class to linearly transform the concatenation of a list of Tensors. It ensures the relative importance of all inputs is similar even if they have very different sizes. PiperOrigin-RevId: 232509624 12 February 2019, 16:50:56 UTC
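A minimal sketch of the idea described above (the class name and implementation here are assumptions, not the actual Sonnet code): applying an independently initialized `snt.Linear` to each input and summing the results is, up to the extra bias terms, the same as one linear layer on the concatenation, but every weight block is initialized relative to its own fan-in, so a very wide input does not drown out a narrow one.
```
import sonnet as snt
import tensorflow as tf

class ConcatLinearSketch(snt.AbstractModule):
  """Hypothetical sketch: linear transform of a concatenation of inputs,
  with per-input initialization so relative importance stays comparable."""

  def __init__(self, output_size, name="concat_linear_sketch"):
    super(ConcatLinearSketch, self).__init__(name=name)
    self._output_size = output_size

  def _build(self, inputs_list):
    # One Linear per input; summing the outputs is equivalent (up to bias
    # terms) to a single Linear on tf.concat(inputs_list, axis=-1), but
    # each weight block is initialized with respect to its own input size.
    return tf.add_n(
        [snt.Linear(self._output_size)(x) for x in inputs_list])

wide = tf.zeros([4, 1024])
narrow = tf.zeros([4, 8])
combined = ConcatLinearSketch(output_size=64)([wide, narrow])
```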
62b0f6e internal change PiperOrigin-RevId: 232439572 12 February 2019, 16:50:45 UTC
03d9aef Update protobuf dependency. PiperOrigin-RevId: 232317599 04 February 2019, 18:14:37 UTC
381b630 Sonnet version update produced on Tuesday, 29. January 2019 PiperOrigin-RevId: 231442829 04 February 2019, 18:14:29 UTC
e1f8283 internal change PiperOrigin-RevId: 231365216 29 January 2019, 17:35:41 UTC
8b4942f Internal change PiperOrigin-RevId: 231186348 29 January 2019, 17:35:34 UTC
ae7e001 Internal change. PiperOrigin-RevId: 231069857 29 January 2019, 17:35:28 UTC
674007e Make module/connection stacks thread local. PiperOrigin-RevId: 231065224 29 January 2019, 17:35:21 UTC
f138492 Internal Change. PiperOrigin-RevId: 230959989 29 January 2019, 17:35:15 UTC
3a9e904 internal change PiperOrigin-RevId: 230881819 29 January 2019, 17:35:08 UTC
eb46086 Allow control over max pondering steps. PiperOrigin-RevId: 230704013 29 January 2019, 17:35:02 UTC
f451e9a Add hyperlinks to documentation. PiperOrigin-RevId: 230691280 29 January 2019, 17:34:56 UTC
e478c60 Fix headers in installation instructions. PiperOrigin-RevId: 230691260 29 January 2019, 17:34:49 UTC
11e00f7 Remove Pandoc and Sphinx configuration files. PiperOrigin-RevId: 230690314 29 January 2019, 17:34:43 UTC
6c53258 internal change PiperOrigin-RevId: 230302480 29 January 2019, 17:34:36 UTC
249c817 internal change PiperOrigin-RevId: 228307386 29 January 2019, 17:34:29 UTC
675e849 Increase test size to accommodate sanitizers PiperOrigin-RevId: 228141037 29 January 2019, 17:34:19 UTC
2f7a52f Remove `@snt.experimental.reuse_vars`. Please use `@snt.reuse_variables` instead! PiperOrigin-RevId: 228063197 29 January 2019, 17:34:12 UTC
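A small usage sketch of `@snt.reuse_variables` (the `AddBias` module below is made up for illustration): variables created inside the decorated method are created once and reused on every subsequent call.
```
import sonnet as snt
import tensorflow as tf

class AddBias(snt.AbstractModule):

  def __init__(self, name="add_bias"):
    super(AddBias, self).__init__(name=name)

  @snt.reuse_variables
  def add_bias(self, inputs):
    # "b" is created on the first call and reused on every later call.
    b = tf.get_variable(
        "b", shape=inputs.get_shape()[-1:],
        initializer=tf.zeros_initializer())
    return inputs + b

  def _build(self, inputs):
    return self.add_bias(inputs)
```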
a9c6a8d Allowing Conv2D to accept unicode, in addition to str. PiperOrigin-RevId: 227748726 29 January 2019, 17:34:06 UTC
0bfb404 Add MLP MNIST example. PiperOrigin-RevId: 227704815 29 January 2019, 17:34:00 UTC
e95ba5c Add wrapt to required packages. Fixes https://github.com/deepmind/sonnet/issues/115 PiperOrigin-RevId: 227703904 29 January 2019, 17:33:53 UTC
a12ce4d internal change PiperOrigin-RevId: 227485581 29 January 2019, 17:33:46 UTC
bbdf2ae Put reused variables in _all_variables when using nested modules. It was possible before to end up in an inconsistent state if inside `_build` `ParentModule` assigned `self.child = Child()` and in your training loop you used `mod.child.variables` (or `get_all_variables()`):
```
mod = ParentModule()
for record in inputs:
  with tf.GradientTape() as tape:
    outs = mod(record)
  vars = mod.child.variables  # After first iteration this would be empty.
  grads = tape.gradient(outs, vars)
```
While I suspect this sort of thing would be more common in eager mode, the bug still exists in graph mode. PiperOrigin-RevId: 227178237 29 January 2019, 17:33:40 UTC
91bab7b internal change PiperOrigin-RevId: 227108833 29 January 2019, 17:33:33 UTC
fcf20ed Enable Python 3 in Sonnet's py_binary rules. PiperOrigin-RevId: 227011720 29 January 2019, 17:33:27 UTC
e0988b1 Remove some dependencies on internal TensorFlow symbols. PiperOrigin-RevId: 226960072 29 January 2019, 17:33:20 UTC
0400b3d internal change PiperOrigin-RevId: 226476878 29 January 2019, 17:33:14 UTC
fe13d13 internal change PiperOrigin-RevId: 226179552 29 January 2019, 17:33:03 UTC
79d920c Sonnet version update produced on Wednesday, 19. December 2018 PiperOrigin-RevId: 226148846 29 January 2019, 17:32:56 UTC
bb2e382 Update changelog for version 1.28 PiperOrigin-RevId: 226008459 29 January 2019, 17:32:49 UTC
39e817b FIX: setup.py.tmpl referenced the wrong package name. setup.py.tmpl referenced "tensor-probability-gpu" instead of "tensorflow-probability-gpu". This caused installation via pip to fail for dm-sonnet-gpu==1.25. GIT_ORIGIN_REV_ID=b18ca77dd73795d5c18dbf2a1e895471493c15f2 PiperOrigin-RevId: 226004212 29 January 2019, 17:32:43 UTC
5995647 Allow for kwargs to be forwarded to sub-modules in DeepRNN. Useful for propagating 'is_training'. PiperOrigin-RevId: 225976178 29 January 2019, 17:32:32 UTC
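A hedged sketch of what the DeepRNN kwarg forwarding above enables. `NoisyCore` is a made-up core whose `_build` accepts an `is_training` kwarg; the only point is that the kwarg passed to the DeepRNN reaches each sub-core.
```
import sonnet as snt
import tensorflow as tf

class NoisyCore(snt.RNNCore):
  """Toy core (hypothetical) whose _build accepts an is_training kwarg."""

  def __init__(self, hidden_size, name="noisy_core"):
    super(NoisyCore, self).__init__(name=name)
    self._hidden_size = hidden_size

  @property
  def state_size(self):
    return tf.TensorShape([self._hidden_size])

  @property
  def output_size(self):
    return tf.TensorShape([self._hidden_size])

  def _build(self, inputs, prev_state, is_training=False):
    output = snt.Linear(self._hidden_size)(inputs) + prev_state
    if is_training:
      output += tf.random_normal(tf.shape(output), stddev=0.01)
    return output, output

deep_rnn = snt.DeepRNN([NoisyCore(16), NoisyCore(16)])
inputs = tf.zeros([8, 32])
prev_state = deep_rnn.initial_state(batch_size=8)
# With this change, is_training is forwarded to every sub-core's _build.
output, next_state = deep_rnn(inputs, prev_state, is_training=True)
```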
8b557d8 internal change PiperOrigin-RevId: 225181012 29 January 2019, 17:32:25 UTC
a35746d ConvNet2D{Transpose}: Filter the kwargs passed to the normalizer. This makes it possible to connect the module with is_training=True even if the selected normalization module does not support that flag (e.g. LayerNorm). Normal practice would be to call the module with is_training inside a try block, catch any errors, and then reconnect without is_training; however, that will generally have created some variables internally, so global (tf.Graph-level) state has already changed, producing errors. This solution works for any normalization module whose signature does not contain **kwargs, since **kwargs makes it impossible to introspect whether the flag is supported. PiperOrigin-RevId: 224994092 29 January 2019, 17:32:14 UTC
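A sketch of the behaviour this enables. The constructor argument name `normalization_ctor` is borrowed from the "more flexible normalization" change further down this log and is an assumption here; the point is only that `is_training` no longer has to be stripped by hand when the normalizer (e.g. `snt.LayerNorm`) does not accept it.
```
import sonnet as snt
import tensorflow as tf

net = snt.nets.ConvNet2D(
    output_channels=[32, 64],
    kernel_shapes=[3],
    strides=[1],
    paddings=[snt.SAME],
    normalization_ctor=snt.LayerNorm)  # assumed argument name

images = tf.zeros([8, 32, 32, 3])
# is_training is filtered out before reaching LayerNorm instead of raising,
# so the same call site works for normalizers that do and don't support it.
logits = net(images, is_training=True)
```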
0c8a946 1. Fix the sequence reading order for backward unroll. The order should be decreasing but was increasing. 2. Fix RNN unroll by feeding correct state (the previous state) to each unroll step. The state was always the initial state. PiperOrigin-RevId: 224174351 29 January 2019, 17:32:07 UTC
a64805b Support non-2D recurrent state in pondering RNN. (Still relies on leading dimension being the batch dimension.) PiperOrigin-RevId: 223522357 29 January 2019, 17:32:01 UTC
8ba30c4 Logging utility function, to be used from inside a module. A kwarg `verbose=True` must be provided to the function for it to actually print, meaning users don't need to fill their module code with if statements. Currently just logs the provided information, preceded by full module path. PiperOrigin-RevId: 223516389 29 January 2019, 17:31:54 UTC
9b1503a Add `remove_unsupported_kwargs()` which can filter a set of potential kwargs by whether they are supported by a module. If the function has **kwargs in the signature, we assume all kwargs are valid. PiperOrigin-RevId: 223500517 29 January 2019, 17:31:48 UTC
f20e0a6 Remove references to `tf.contrib.rnn.RNNCell` in the RNNCore docstring. While RNNCore used to inherit from RNNCell, it doesn't any longer and the docstring needed updating. PiperOrigin-RevId: 223176347 29 January 2019, 17:31:42 UTC
5d2e01c Re-enable bfloat16 test PiperOrigin-RevId: 223175090 29 January 2019, 17:31:34 UTC
0b660f9 Add supports_kwargs() function. This allows testing whether some callable, either a function / method / object, supports a list of keyword args. If an object is provided, object.__call__ is checked which maps to _build for Sonnet modules. This can be used for checking whether kwargs like `is_training` are supported by a given module. PiperOrigin-RevId: 223158721 29 January 2019, 17:31:28 UTC
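An illustrative sketch of the check described above, written from scratch with `inspect` rather than showing the actual Sonnet implementation; the function name and return convention here are assumptions.
```
import inspect

def supports_kwargs_sketch(fn_or_module, kwarg_names):
  # For module instances, inspect __call__ (which maps to _build for
  # Sonnet modules); plain functions/methods are inspected directly.
  fn = fn_or_module if inspect.isroutine(fn_or_module) else fn_or_module.__call__
  params = inspect.signature(fn).parameters  # Python 3
  # A **kwargs parameter means we assume every keyword argument is valid.
  if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
    return True
  return all(name in params for name in kwarg_names)

def my_norm(inputs, is_training=False):
  return inputs

assert supports_kwargs_sketch(my_norm, ["is_training"])
```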
3216e4b gated_rnn_test: Make GRUTest also inherit from parameterized for consistency. PiperOrigin-RevId: 223151892 29 January 2019, 17:31:21 UTC
78d9437 Make recurrent dropout / zoneout tests less extreme. PiperOrigin-RevId: 223027131 29 January 2019, 17:31:14 UTC
7552f41 Use more specific assertions in util_test. PiperOrigin-RevId: 222998330 29 January 2019, 17:31:08 UTC
6d46414 Disable mod.get_variables() in eager mode when using defun. PiperOrigin-RevId: 222602705 29 January 2019, 17:31:02 UTC
7217be1 Replaced deprecated tf.create_partitioned_variables with tf.get_variable PiperOrigin-RevId: 222569043 29 January 2019, 17:30:55 UTC
436f794 Sonnet version update produced on Tuesday, 20. November 2018 PiperOrigin-RevId: 222260709 20 November 2018, 20:00:34 UTC
289e227 Update Changelog for version 1.27 PiperOrigin-RevId: 222252851 20 November 2018, 16:54:29 UTC
989e397 scale by key_size in relational memory _multihead_attention GIT_ORIGIN_REV_ID=b92d54997e7df890458b672540c4f9832da9e8aa PiperOrigin-RevId: 222243055 20 November 2018, 16:54:22 UTC
9efc432 Make SkipConnectionCore and ResidualCore call the initial_state/zero_state methods of the base core. PiperOrigin-RevId: 222085553 20 November 2018, 16:54:15 UTC
8a6d51c Rename the 'axes' argument in the snt.LayerNorm constructor to 'axis', to be consistent with Sonnet's BatchNorm constructor. Also modified the checks (and docs) so that, for convenience, axis can alternatively be a scalar int instead of requiring a list. PiperOrigin-RevId: 222064990 20 November 2018, 16:54:09 UTC
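A small sketch of the renamed argument (a rank-3 input is allowed by the ">2D LayerNorm" change further down this log); the shapes and axis values here are illustrative only.
```
import sonnet as snt
import tensorflow as tf

x = tf.zeros([8, 10, 20])
# 'axis' may now be a scalar int or a list of ints (previously 'axes', list only).
ln_single = snt.LayerNorm(axis=2)(x)
ln_multi = snt.LayerNorm(axis=[1, 2])(x)
```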
eace9e7 Copy signature of _build to __call__. PiperOrigin-RevId: 221109661 20 November 2018, 16:54:02 UTC
f0ae8ac Backwards-compatibility fix for _ConvND.padding, following cl/221079656 which introduced different padding types per dimension. layer.padding will now return a single padding type when it is the same across all dimensions; .paddings can be used to get the padding for each dimension separately. PiperOrigin-RevId: 221087200 20 November 2018, 16:53:56 UTC
4e6f863 Support FULL, CAUSAL and REVERSE_CAUSAL padding options for _ConvND and its subclasses. Also support using different padding settings for {depth,} height and width dimensions in Conv{2,3}D. This is implemented using a tf.pad where necessary before the convolution op. Deprecates snt.CausalConv1D which is now achievable via Conv1D(..., padding=CAUSAL). PiperOrigin-RevId: 221079656 20 November 2018, 16:53:50 UTC
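A sketch of the new spelling. Whether the padding constants are exposed as `snt.CAUSAL` / `snt.SAME` and the exact per-dimension syntax for Conv2D are assumptions based on the message above, which itself writes `Conv1D(..., padding=CAUSAL)`.
```
import sonnet as snt
import tensorflow as tf

x = tf.zeros([8, 100, 16])  # [batch, time, channels]
# snt.CausalConv1D is deprecated; the same behaviour is now written as:
y = snt.Conv1D(output_channels=32, kernel_shape=3, padding=snt.CAUSAL)(x)

# Assumed per-dimension form: different padding for height vs. width.
images = tf.zeros([8, 20, 20, 3])
z = snt.Conv2D(output_channels=32, kernel_shape=3,
               padding=(snt.CAUSAL, snt.SAME))(images)
```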
ee326a9 Print the type (legacy or resource) of Tensorflow variables in snt.log_variables(). PiperOrigin-RevId: 220446611 20 November 2018, 16:53:40 UTC
5f28cec Change some formatting, based on the automatic formatter. PiperOrigin-RevId: 220116426 20 November 2018, 16:53:33 UTC
667f06e Make ConvNet2D more flexible in what normalization scheme is used. The use_batch_norm and batch_norm_config flags are deprecated and will be removed in the future. Note that for backwards compatibility, the normalization modules will still be built with the name 'batch_norm_{layer_index}'. Old checkpoints will still load. PiperOrigin-RevId: 220102133 20 November 2018, 16:53:26 UTC
c646148 Use initializer with stddev=1 for sonnet.Embed. PiperOrigin-RevId: 219775902 20 November 2018, 16:53:20 UTC
3c85a7c Allow `LayerNorm` to accept >2D input. Previously >2D input would be an error, whereas now it will normalize over all non-batch dimensions. No code which previously didn't throw an error should have changed behaviour. PiperOrigin-RevId: 219461741 20 November 2018, 16:53:14 UTC
729aedf A test erroneously suggested that use_batch_norm accepts iterables of booleans. Fix the test and notify users who were unknowingly using snt.ConvNet2D incorrectly. PiperOrigin-RevId: 219278708 20 November 2018, 16:53:07 UTC
55c7214 Internal change. PiperOrigin-RevId: 219275912 20 November 2018, 16:53:01 UTC
b4a6f8a Internal change PiperOrigin-RevId: 219116281 20 November 2018, 16:52:54 UTC
d70b531 Sonnet version update produced on Monday, 22. October 2018 PiperOrigin-RevId: 218144877 22 October 2018, 11:20:33 UTC
53aac92 Check dependencies before importing rest of library PiperOrigin-RevId: 217882721 22 October 2018, 11:20:23 UTC
087cb9d Sonnet version update produced on Tuesday, 16. October 2018 PiperOrigin-RevId: 217309141 16 October 2018, 16:51:45 UTC
4fe21ce Update changelog for 1.25 PiperOrigin-RevId: 217304436 16 October 2018, 16:51:36 UTC
3f21751 Change Sonnet to depend on tensorflow_probability PiperOrigin-RevId: 216874759 16 October 2018, 13:00:36 UTC
482fafb Change dependency on tf.contrib.distributions to tfp.distributions. PiperOrigin-RevId: 216785589 16 October 2018, 13:00:23 UTC
a3f6246 Changed inputs.shape to tf.shape(inputs) to allow unknown batch dimension. PiperOrigin-RevId: 216722246 16 October 2018, 13:00:16 UTC
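For context, a minimal illustration of why this matters (standard TensorFlow 1.x behaviour, not Sonnet-specific): the static `inputs.shape` is unknown when the batch dimension is `None`, while `tf.shape(inputs)` yields a runtime tensor that still works.
```
import tensorflow as tf

inputs = tf.placeholder(tf.float32, [None, 128])  # unknown batch dimension
static_batch = inputs.shape[0]        # Dimension(None): not usable to build ops
dynamic_batch = tf.shape(inputs)[0]   # int32 Tensor resolved at run time
initial_state = tf.zeros([dynamic_batch, 32])     # works with unknown batch size
```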
ce11b92 Change the concat axis in DeepRNN when using skip_connections. Previous use cases will work (i.e. cores with shape [batch_size, feature_size] keep the same behaviour), and this enables more sensible concatenation of cores that are multidimensional. PiperOrigin-RevId: 216543007 16 October 2018, 13:00:08 UTC
058e291 Add `rate` field to the SeparableConv[1,2]D classes. PiperOrigin-RevId: 216208732 16 October 2018, 13:00:00 UTC
ac0d4f5 Describe input shape requirements for _ConvND module more accurately. PiperOrigin-RevId: 216188489 16 October 2018, 12:59:52 UTC
ef01425 Additional argument-overriding custom getter that only updates defaults, honouring any non-None argument values set in tf.get_variable (or in nested scopes' custom getters). PiperOrigin-RevId: 215360232 16 October 2018, 12:59:34 UTC
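An illustrative from-scratch sketch of such a getter (the function name and the exact Sonnet API are assumptions): defaults are filled in only when the caller left an argument as None, so explicit values from tf.get_variable or from nested custom getters win.
```
import tensorflow as tf

def override_default_args(**defaults):
  """Hypothetical sketch of a defaults-only argument-overriding getter."""
  def custom_getter(getter, *args, **kwargs):
    merged = dict(kwargs)
    for key, value in defaults.items():
      if merged.get(key) is None:  # only fill in unset (None) arguments
        merged[key] = value
    return getter(*args, **merged)
  return custom_getter

with tf.variable_scope(
    "scope",
    custom_getter=override_default_args(
        initializer=tf.truncated_normal_initializer(stddev=0.01))):
  w = tf.get_variable("w", shape=[3, 3], dtype=tf.float32)  # gets the default
  b = tf.get_variable("b", shape=[3],
                      initializer=tf.zeros_initializer())   # keeps its own
```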
fe0874e Internal changes. PiperOrigin-RevId: 215180895 16 October 2018, 12:59:26 UTC
7106507 Add dropout to sonnet's MLP class. Dropout is a very useful regularizer that isn't currently supported in sonnet's MLP class. In this CL, we add an argument to the MLP class, `use_dropout`. The `_build` method now takes optional `is_training` and `dropout_keep_probability` arguments. PiperOrigin-RevId: 214926124 16 October 2018, 12:59:18 UTC
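A short sketch of the new arguments described above (`use_dropout` at construction, `is_training` and `dropout_keep_probability` at connection time); the layer sizes and keep probability are just for illustration.
```
import sonnet as snt
import tensorflow as tf

mlp = snt.nets.MLP(output_sizes=[128, 64, 10], use_dropout=True)
images = tf.placeholder(tf.float32, [None, 784])

# Dropout is only applied when is_training=True.
train_logits = mlp(images, is_training=True, dropout_keep_probability=0.5)
test_logits = mlp(images, is_training=False)
```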
f100d0b Add Learn to Execute example for Relational Memory Core to sonnet examples. PiperOrigin-RevId: 214918399 16 October 2018, 12:59:10 UTC
c99c3f9 Adjust test sizes/tags for sanitizers PiperOrigin-RevId: 213795074 16 October 2018, 12:59:02 UTC
1c3d7a8 Adjust test sizes/tags for sanitizers PiperOrigin-RevId: 213622590 16 October 2018, 12:58:53 UTC
55061cb Replace tf.GraphKeys.VARIABLES with tf.GraphKeys.GLOBAL_VARIABLES PiperOrigin-RevId: 212790529 16 October 2018, 12:58:45 UTC
e8bd52b Fix for n-th farthest task RMC example. Corrects index reference to object. PiperOrigin-RevId: 212530863 16 October 2018, 12:58:36 UTC
f4da53a Increase size of convnet_test and dilation_test from small to medium. PiperOrigin-RevId: 211973944 16 October 2018, 12:58:28 UTC
ffa7249 Avoid same graph checks in eager mode and stop using `_graph_key`. PiperOrigin-RevId: 211439151 16 October 2018, 12:58:17 UTC
f32ee48 RNN Shakespeare test: reduce number of training steps from 10 to 5. PiperOrigin-RevId: 211337169 16 October 2018, 12:58:08 UTC
6680867 Fix docstring PiperOrigin-RevId: 211210870 16 October 2018, 12:58:00 UTC
0bdd9c3 Removes the reference to snt.SkipConnectionCore. PiperOrigin-RevId: 211061744 16 October 2018, 12:57:52 UTC
5e0234e Comment _scale_gradient_op regarding possible memoization requirements. PiperOrigin-RevId: 210785875 16 October 2018, 12:57:38 UTC
3011932 Allow Sonnet modules to defun wrap their reuse_variables methods. There was a subtle bug in `_capture_variables` where inside a `defun` we did not re-enter the Template's variable store (since `executing_eagerly` is False). By not re-entering the store we break variable re-use (since `get_variable` returns a new variable instance each time it is called). PiperOrigin-RevId: 210727568 16 October 2018, 12:57:29 UTC
f18095f Allow Sonnet modules to defun wrap their call method.
>>> mlp = snt.nets.MLP([1, 2, 3])
>>> mlp(tf.constant([[1.0]]))
Tensor("mlp_1/linear_2/add:0", shape=(1, 3), dtype=float32)
>>> mlp.defun()
>>> mlp(tf.constant([[1.0]]))
Tensor("PartitionedCall:0", shape=(1, 3), dtype=float32)
By wrapping `_call` and not the whole module, we allow properties on the module to remain accessible (without keeping references to both the "raw" and defun'd objects). A good example of this is seen in the updated test. PiperOrigin-RevId: 210528522 16 October 2018, 12:57:20 UTC
535ccdb Fix docstring typo PiperOrigin-RevId: 210092924 16 October 2018, 12:57:12 UTC
c776d06 Sonnet embed: print warning about using default initializer. Eventually we will switch to using a default initializer of stddev=1. PiperOrigin-RevId: 210082974 16 October 2018, 12:57:04 UTC
a5fcdac Add clone method to snt.nets.MLP PiperOrigin-RevId: 209902324 16 October 2018, 12:56:57 UTC
257a62c Make `snt.scale_gradient` support eager mode. Much like optimizers and other tensor-taking APIs, in eager mode we require users to pass us a callable whose gradients they want to scale.
>>> x = tf.constant(1.0)
>>> f = lambda x: tf.pow(x, 2)
>>> f = scale_gradient(f, scale=0.1)
>>> dy_scaled, = tfe.gradients_function(f)(x)
>>> print(dy_scaled.numpy())  # 0.2
PiperOrigin-RevId: 209405560 16 October 2018, 12:56:50 UTC
0e649da Implement same graph check using graph keys. tl;dr: resolves the last known issues with Sonnet and `tfe.defun`. Inside a `defun` we observe a different graph instance each time the function is traced. Sonnet asserts that each time a module is traced, its graph has not changed. This CL changes that test to check that the graph we observe each time has the same "key" (rather than being the same instance). The key is akin to a primary key for the graph: graph instances with the same key form part of the same parent graph (e.g. a sub-graph representing a function with key "a" shares variables with a regular tf.Graph with key "a" which it is a part of). DifferentGraphError used to fire undesirably in defun in the following cases (both of which are fixed in this CL): 1) `enter_variable_scope` is used in conjunction with `_build` (e.g. in the constructor or via `snt.reuse_variables`). 2) `defun` re-traces `_build` due to the input signature changing (e.g. a Tensor shape changing, or Python parameter values changing). PiperOrigin-RevId: 209131983 16 October 2018, 12:56:44 UTC
6a42160 Pass optional named arguments to the wrapped sonnet module. Named arguments can be used to change the behavior of sonnet modules. For example, it's not uncommon to use dropout to regularize a deep neural network. However, dropout should only be used during the training of the network, not afterwards. PiperOrigin-RevId: 208786599 16 October 2018, 12:56:36 UTC
1fb3f4e Make MLP/ConvNet compatible with `tfe.defun`. PiperOrigin-RevId: 208621259 16 October 2018, 12:56:29 UTC