bcec0d4 | Deepmind | 12 February 2019, 09:51:42 UTC | internal change PiperOrigin-RevId: 233565498 | 12 February 2019, 16:51:22 UTC |
b12c680 | jwrae | 08 February 2019, 18:17:33 UTC | Support lower precision inputs. PiperOrigin-RevId: 233081564 | 12 February 2019, 16:51:14 UTC |
84e800f | Deepmind | 08 February 2019, 10:59:16 UTC | internal change PiperOrigin-RevId: 233030785 | 12 February 2019, 16:51:04 UTC |
d11f9c4 | sracaniere | 05 February 2019, 18:15:48 UTC | A class to linearly transform the concatenation of a list of Tensors. It ensures the relative importance of all inputs is similar even if they have very different sizes. PiperOrigin-RevId: 232509624 | 12 February 2019, 16:50:56 UTC |
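The idea behind this commit can be sketched independently of Sonnet: when a linear layer acts on the concatenation of inputs with very different sizes, initializing each input's weight block with stddev 1/sqrt(block_size) keeps each input's contribution to the output variance comparable. The helper below is a hypothetical pure-Python illustration of that scaling rule, not Sonnet's actual class:

```python
import math

def block_weight_stddevs(input_sizes):
    """Per-block initializer stddevs so that each input block of a
    concatenated input contributes roughly unit variance to the
    output of the linear transform, regardless of its size."""
    return [1.0 / math.sqrt(n) for n in input_sizes]

# A small input and a large input get different stddevs, so the
# large input does not dominate the pre-activation variance.
print(block_weight_stddevs([4, 400]))  # [0.5, 0.05]
```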
62b0f6e | Deepmind | 05 February 2019, 09:04:59 UTC | internal change PiperOrigin-RevId: 232439572 | 12 February 2019, 16:50:45 UTC |
03d9aef | diegolascasas | 04 February 2019, 18:06:54 UTC | Update protobuf dependency. PiperOrigin-RevId: 232317599 | 04 February 2019, 18:14:37 UTC |
381b630 | diegolascasas | 29 January 2019, 19:40:35 UTC | Sonnet version update produced on Tuesday, 29. January 2019 PiperOrigin-RevId: 231442829 | 04 February 2019, 18:14:29 UTC |
e1f8283 | Deepmind | 29 January 2019, 09:21:56 UTC | internal change PiperOrigin-RevId: 231365216 | 29 January 2019, 17:35:41 UTC |
8b4942f | Deepmind | 28 January 2019, 11:00:01 UTC | Internal change PiperOrigin-RevId: 231186348 | 29 January 2019, 17:35:34 UTC |
ae7e001 | tomhennigan | 26 January 2019, 22:46:09 UTC | Internal change. PiperOrigin-RevId: 231069857 | 29 January 2019, 17:35:28 UTC |
674007e | tomhennigan | 26 January 2019, 21:01:19 UTC | Make module/connection stacks thread local. PiperOrigin-RevId: 231065224 | 29 January 2019, 17:35:21 UTC |
f138492 | Deepmind | 25 January 2019, 21:36:51 UTC | Internal Change. PiperOrigin-RevId: 230959989 | 29 January 2019, 17:35:15 UTC |
3a9e904 | Deepmind | 25 January 2019, 12:22:35 UTC | internal change PiperOrigin-RevId: 230881819 | 29 January 2019, 17:35:08 UTC |
eb46086 | Deepmind | 24 January 2019, 13:19:44 UTC | Allow control over max pondering steps. PiperOrigin-RevId: 230704013 | 29 January 2019, 17:35:02 UTC |
f451e9a | diegolascasas | 24 January 2019, 11:23:21 UTC | Add hyperlinks to documentation. PiperOrigin-RevId: 230691280 | 29 January 2019, 17:34:56 UTC |
e478c60 | diegolascasas | 24 January 2019, 11:23:06 UTC | Fix headers in installation instructions. PiperOrigin-RevId: 230691260 | 29 January 2019, 17:34:49 UTC |
11e00f7 | diegolascasas | 24 January 2019, 11:12:46 UTC | Remove Pandoc and Sphinx configuration files. PiperOrigin-RevId: 230690314 | 29 January 2019, 17:34:43 UTC |
6c53258 | Deepmind | 22 January 2019, 09:44:49 UTC | internal change PiperOrigin-RevId: 230302480 | 29 January 2019, 17:34:36 UTC |
249c817 | Deepmind | 08 January 2019, 12:08:30 UTC | internal change PiperOrigin-RevId: 228307386 | 29 January 2019, 17:34:29 UTC |
675e849 | Deepmind | 07 January 2019, 12:19:34 UTC | Increase test size to accommodate sanitizers PiperOrigin-RevId: 228141037 | 29 January 2019, 17:34:19 UTC |
2f7a52f | tomhennigan | 06 January 2019, 17:14:18 UTC | Remove `@snt.experimental.reuse_vars`. Please use `@snt.reuse_variables` instead! PiperOrigin-RevId: 228063197 | 29 January 2019, 17:34:12 UTC |
a9c6a8d | Deepmind | 03 January 2019, 22:16:42 UTC | Allowing Conv2D to accept unicode, in addition to str. PiperOrigin-RevId: 227748726 | 29 January 2019, 17:34:06 UTC |
0bfb404 | Deepmind | 03 January 2019, 17:58:37 UTC | Add MLP MNIST example. PiperOrigin-RevId: 227704815 | 29 January 2019, 17:34:00 UTC |
e95ba5c | mareynolds | 03 January 2019, 17:52:59 UTC | Add wrapt to required packages. Fixes https://github.com/deepmind/sonnet/issues/115 PiperOrigin-RevId: 227703904 | 29 January 2019, 17:33:53 UTC |
a12ce4d | Deepmind | 02 January 2019, 08:34:44 UTC | internal change PiperOrigin-RevId: 227485581 | 29 January 2019, 17:33:46 UTC |
bbdf2ae | tomhennigan | 29 December 2018, 00:05:30 UTC | Put reused variables in _all_variables when using nested modules. It was possible before to end up in an inconsistent state if inside `_build` `ParentModule` assigned `self.child = Child()` and in your training loop you used `mod.child.variables` (or `get_all_variables()`):

```
mod = ParentModule()
for record in inputs:
  with tf.GradientTape() as tape:
    outs = mod(record)
  vars = mod.child.variables  # After first iteration this would be empty.
  grads = tape.gradient(outs, vars)
```

While I suspect this sort of thing would be more common in eager mode, the bug still exists in graph mode. PiperOrigin-RevId: 227178237 | 29 January 2019, 17:33:40 UTC |
91bab7b | Deepmind | 28 December 2018, 09:12:07 UTC | internal change PiperOrigin-RevId: 227108833 | 29 January 2019, 17:33:33 UTC |
fcf20ed | tomhennigan | 27 December 2018, 11:56:34 UTC | Enable Python 3 in Sonnet's py_binary rules. PiperOrigin-RevId: 227011720 | 29 January 2019, 17:33:27 UTC |
e0988b1 | tomhennigan | 26 December 2018, 23:30:02 UTC | Remove some dependencies on internal TensorFlow symbols. PiperOrigin-RevId: 226960072 | 29 January 2019, 17:33:20 UTC |
0400b3d | Deepmind | 21 December 2018, 13:30:08 UTC | internal change PiperOrigin-RevId: 226476878 | 29 January 2019, 17:33:14 UTC |
fe13d13 | Deepmind | 19 December 2018, 16:48:03 UTC | internal change PiperOrigin-RevId: 226179552 | 29 January 2019, 17:33:03 UTC |
79d920c | mareynolds | 19 December 2018, 11:30:40 UTC | Sonnet version update produced on Wednesday, 19. December 2018 PiperOrigin-RevId: 226148846 | 29 January 2019, 17:32:56 UTC |
bb2e382 | mareynolds | 18 December 2018, 17:26:09 UTC | Update changelog for version 1.28 PiperOrigin-RevId: 226008459 | 29 January 2019, 17:32:49 UTC |
39e817b | TR | 18 December 2018, 16:56:56 UTC | FIX: setup.py.tmpl referenced wrong package name. setup.py.tmpl referenced "tensor-probability-gpu" instead of "tensorflow-probability-gpu". This caused installation via pip to fail since dm-sonnet-gpu==1.25. GIT_ORIGIN_REV_ID=b18ca77dd73795d5c18dbf2a1e895471493c15f2 PiperOrigin-RevId: 226004212 | 29 January 2019, 17:32:43 UTC |
5995647 | jwrae | 18 December 2018, 12:28:16 UTC | Allow for kwargs to be forwarded to sub-modules in DeepRNN. Useful for propagating 'is_training'. PiperOrigin-RevId: 225976178 | 29 January 2019, 17:32:32 UTC |
8b557d8 | Deepmind | 12 December 2018, 15:27:14 UTC | internal change PiperOrigin-RevId: 225181012 | 29 January 2019, 17:32:25 UTC |
a35746d | mareynolds | 11 December 2018, 14:50:30 UTC | ConvNet2D{Transpose}: Filter the kwargs passed to the normalizer. This makes it possible to connect the module with is_training=True, even if the normalization module selected is something which does not support that (e.g. LayerNorm). Normal practice would be to call the module with is_training inside a try block, catch any errors and then reconnect without is_training. However, that will generally have created some variables internally, so global (tf.Graph-level) state has been changed, producing errors. This solution works for any normalization module whose signature does not contain **kwargs, since **kwargs makes it impossible to introspect whether the flag is supported or not. PiperOrigin-RevId: 224994092 | 29 January 2019, 17:32:14 UTC |
0c8a946 | Deepmind | 05 December 2018, 18:12:02 UTC | 1. Fix the sequence reading order for backward unroll. The order should be decreasing but was increasing. 2. Fix RNN unroll by feeding correct state (the previous state) to each unroll step. The state was always the initial state. PiperOrigin-RevId: 224174351 | 29 January 2019, 17:32:07 UTC |
a64805b | Deepmind | 30 November 2018, 16:53:21 UTC | Support non-2D recurrent state in pondering RNN. (Still relies on leading dimension being the batch dimension.) PiperOrigin-RevId: 223522357 | 29 January 2019, 17:32:01 UTC |
8ba30c4 | mareynolds | 30 November 2018, 16:06:24 UTC | Logging utility function, to be used from inside a module. A kwarg `verbose=True` must be provided to the function for it to actually print, meaning users don't need to fill their module code with if statements. Currently just logs the provided information, preceded by full module path. PiperOrigin-RevId: 223516389 | 29 January 2019, 17:31:54 UTC |
9b1503a | mareynolds | 30 November 2018, 13:24:30 UTC | Add `remove_unsupported_kwargs()` which can filter a set of potential kwargs by whether they are supported by a module. If the function has **kwargs in the signature, we assume all kwargs are valid. PiperOrigin-RevId: 223500517 | 29 January 2019, 17:31:48 UTC |
f20e0a6 | Deepmind | 28 November 2018, 16:59:38 UTC | Remove references to `tf.contrib.rnn.RNNCell` in the RNNCore docstring. While RNNCore used to inherit from RNNCell, it doesn't any longer and the docstring needed updating. PiperOrigin-RevId: 223176347 | 29 January 2019, 17:31:42 UTC |
5d2e01c | mareynolds | 28 November 2018, 16:50:47 UTC | Re-enable bfloat16 test PiperOrigin-RevId: 223175090 | 29 January 2019, 17:31:34 UTC |
0b660f9 | mareynolds | 28 November 2018, 14:51:17 UTC | Add supports_kwargs() function. This allows testing whether some callable, either a function / method / object, supports a list of keyword args. If an object is provided, object.__call__ is checked which maps to _build for Sonnet modules. This can be used for checking whether kwargs like `is_training` are supported by a given module. PiperOrigin-RevId: 223158721 | 29 January 2019, 17:31:28 UTC |
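One plausible pure-Python sketch of the `supports_kwargs()` idea described above (the real Sonnet helper may differ in detail): inspect the callable's signature, treat a declared `**kwargs` as "everything is supported", and fall back to `__call__` for objects, which for a Sonnet module maps to `_build`:

```python
import inspect

def supports_kwargs(func, names):
    """Return True if `func` accepts every keyword argument in `names`.

    If the signature declares **kwargs we assume all names are
    supported, since introspection cannot rule any of them out.
    For non-function callables we check `__call__` instead.
    """
    target = func if inspect.isroutine(func) else func.__call__
    params = inspect.signature(target).parameters
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return True
    return all(name in params for name in names)

# Hypothetical module-like function used only for illustration.
def layer(inputs, is_training=False):
    return inputs

print(supports_kwargs(layer, ["is_training"]))   # True
print(supports_kwargs(layer, ["dropout_rate"]))  # False
```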
3216e4b | mareynolds | 28 November 2018, 13:40:50 UTC | gated_rnn_test: Make GRUTest also inherit from parameterized for consistency. PiperOrigin-RevId: 223151892 | 29 January 2019, 17:31:21 UTC |
78d9437 | mareynolds | 27 November 2018, 19:34:14 UTC | Make recurrent dropout / zoneout tests less extreme. PiperOrigin-RevId: 223027131 | 29 January 2019, 17:31:14 UTC |
7552f41 | mareynolds | 27 November 2018, 16:53:09 UTC | Use more specific assertions in util_test. PiperOrigin-RevId: 222998330 | 29 January 2019, 17:31:08 UTC |
6d46414 | tomhennigan | 23 November 2018, 10:13:22 UTC | Disable mod.get_variables() in eager mode when using defun. PiperOrigin-RevId: 222602705 | 29 January 2019, 17:31:02 UTC |
7217be1 | Deepmind | 22 November 2018, 23:31:22 UTC | Replaced deprecated tf.create_partitioned_variables with tf.get_variable PiperOrigin-RevId: 222569043 | 29 January 2019, 17:30:55 UTC |
436f794 | diegolascasas | 20 November 2018, 17:45:06 UTC | Sonnet version update produced on Tuesday, 20. November 2018 PiperOrigin-RevId: 222260709 | 20 November 2018, 20:00:34 UTC |
289e227 | mareynolds | 20 November 2018, 16:50:53 UTC | Update Changelog for version 1.27 PiperOrigin-RevId: 222252851 | 20 November 2018, 16:54:29 UTC |
989e397 | Dr. Kashif Rasul | 20 November 2018, 15:32:52 UTC | Scale by key_size in relational memory _multihead_attention. GIT_ORIGIN_REV_ID=b92d54997e7df890458b672540c4f9832da9e8aa PiperOrigin-RevId: 222243055 | 20 November 2018, 16:54:22 UTC |
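For context, the standard scaled dot-product attention recipe divides query-key logits by sqrt(key_size) so that logit magnitudes stay stable as the key size grows. Assuming that is the scaling this commit applies, a minimal dependency-free sketch (not the relational memory code itself):

```python
import math

def scaled_attention_logit(query, key):
    """Dot product of a query and key vector, scaled by
    1/sqrt(key_size) to keep the logit magnitude stable as
    the key size grows."""
    key_size = len(key)
    dot = sum(q * k for q, k in zip(query, key))
    return dot / math.sqrt(key_size)

# Unscaled dot product would be 4.0; scaling divides by sqrt(4) = 2.
print(scaled_attention_logit([1.0] * 4, [1.0] * 4))  # 2.0
```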
9efc432 | Deepmind | 19 November 2018, 17:06:49 UTC | Make SkipConnectionCore and ResidualCore call the initial_state/zero_state methods of the base core. PiperOrigin-RevId: 222085553 | 20 November 2018, 16:54:15 UTC |
8a6d51c | Deepmind | 19 November 2018, 14:13:35 UTC | Rename the 'axes' argument in the snt.LayerNorm constructor to 'axis' to be consistent with Sonnet's BatchNorm constructor. Also modified the checks (and docs) such that axis can alternatively be a scalar int instead of requiring a list, for convenience. PiperOrigin-RevId: 222064990 | 20 November 2018, 16:54:09 UTC |
eace9e7 | Deepmind | 12 November 2018, 17:59:34 UTC | Copy signature of _build to __call__. PiperOrigin-RevId: 221109661 | 20 November 2018, 16:54:02 UTC |
f0ae8ac | Deepmind | 12 November 2018, 15:15:48 UTC | Backwards-compatibility fix for _ConvND.padding, following cl/221079656 which introduced different padding types per dimension. layer.padding will now return a single padding type where it's the same across all dimensions; .paddings can be used to get the padding for each dimension separately. PiperOrigin-RevId: 221087200 | 20 November 2018, 16:53:56 UTC |
4e6f863 | Deepmind | 12 November 2018, 14:02:43 UTC | Support FULL, CAUSAL and REVERSE_CAUSAL padding options for _ConvND and its subclasses. Also support using different padding settings for {depth,} height and width dimensions in Conv{2,3}D. This is implemented using a tf.pad where necessary before the convolution op. Deprecates snt.CausalConv1D which is now achievable via Conv1D(..., padding=CAUSAL). PiperOrigin-RevId: 221079656 | 20 November 2018, 16:53:50 UTC |
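A CAUSAL convolution can be realized by zero-padding only on the left of the time dimension before a VALID convolution, which is presumably what the tf.pad-based implementation described above does. A hypothetical helper computing the pad amounts (with dilation `rate`), not Sonnet's implementation:

```python
def causal_padding(kernel_size, rate=1):
    """Amount of zero padding (before, after) along the time
    dimension so a 1-D convolution only sees past inputs: the
    effective kernel extent minus one, all on the left."""
    effective = (kernel_size - 1) * rate
    return (effective, 0)

print(causal_padding(3))          # (2, 0)
print(causal_padding(3, rate=2))  # (4, 0)
```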
ee326a9 | Deepmind | 07 November 2018, 13:18:48 UTC | Print the type (legacy or resource) of Tensorflow variables in snt.log_variables(). PiperOrigin-RevId: 220446611 | 20 November 2018, 16:53:40 UTC |
5f28cec | Deepmind | 05 November 2018, 17:27:20 UTC | Change some formatting, based on the automatic formatter. PiperOrigin-RevId: 220116426 | 20 November 2018, 16:53:33 UTC |
667f06e | mareynolds | 05 November 2018, 15:58:08 UTC | Make ConvNet2D more flexible in what normalization scheme is used. The use_batch_norm and batch_norm_config flags are deprecated and will be removed in the future. Note that for backwards compatibility, the normalization modules will still be built with the name 'batch_norm_{layer_index}'. Old checkpoints will still load. PiperOrigin-RevId: 220102133 | 20 November 2018, 16:53:26 UTC |
c646148 | Deepmind | 02 November 2018, 10:27:44 UTC | Use initializer with stddev=1 for sonnet.Embed. PiperOrigin-RevId: 219775902 | 20 November 2018, 16:53:20 UTC |
3c85a7c | mareynolds | 31 October 2018, 13:14:34 UTC | Allow `LayerNorm` to accept >2D input. Previously >2D input would be an error, whereas now it will normalize over all non-batch dimensions. No code which previously didn't throw an error should have changed behaviour. PiperOrigin-RevId: 219461741 | 20 November 2018, 16:53:14 UTC |
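Normalizing over all non-batch dimensions reduces, per example, to the usual zero-mean/unit-variance transform over the flattened features. A dependency-free sketch for a single flattened example (not Sonnet's LayerNorm code; `eps` is the customary numerical-stability constant):

```python
import math

def layer_norm_row(row, eps=1e-5):
    """Normalize one example over all of its (flattened) non-batch
    elements to zero mean and approximately unit variance."""
    mean = sum(row) / len(row)
    var = sum((x - mean) ** 2 for x in row) / len(row)
    return [(x - mean) / math.sqrt(var + eps) for x in row]

out = layer_norm_row([1.0, 2.0, 3.0])
print(sum(out))  # 0.0 (zero mean after normalization)
```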
729aedf | Deepmind | 30 October 2018, 11:39:52 UTC | A test erroneously suggested use_batch_norm accepts iterables of booleans. Fix the test and notify users who were unknowingly using snt.ConvNet2D incorrectly. PiperOrigin-RevId: 219278708 | 20 November 2018, 16:53:07 UTC |
55c7214 | Deepmind | 30 October 2018, 11:03:24 UTC | Internal change. PiperOrigin-RevId: 219275912 | 20 November 2018, 16:53:01 UTC |
b4a6f8a | mareynolds | 29 October 2018, 13:04:00 UTC | Internal change PiperOrigin-RevId: 219116281 | 20 November 2018, 16:52:54 UTC |
d70b531 | diegolascasas | 22 October 2018, 11:10:41 UTC | Sonnet version update produced on Monday, 22. October 2018 PiperOrigin-RevId: 218144877 | 22 October 2018, 11:20:33 UTC |
53aac92 | mareynolds | 19 October 2018, 16:39:11 UTC | Check dependencies before importing rest of library PiperOrigin-RevId: 217882721 | 22 October 2018, 11:20:23 UTC |
087cb9d | diegolascasas | 16 October 2018, 13:42:58 UTC | Sonnet version update produced on Tuesday, 16. October 2018 PiperOrigin-RevId: 217309141 | 16 October 2018, 16:51:45 UTC |
4fe21ce | mareynolds | 16 October 2018, 12:57:05 UTC | Update changelog for 1.25 PiperOrigin-RevId: 217304436 | 16 October 2018, 16:51:36 UTC |
3f21751 | mareynolds | 12 October 2018, 16:34:50 UTC | Change Sonnet to depend on tensorflow_probability PiperOrigin-RevId: 216874759 | 16 October 2018, 13:00:36 UTC |
482fafb | Deepmind | 11 October 2018, 23:59:24 UTC | Change dependency on tf.contrib.distributions to tfp.distributions. PiperOrigin-RevId: 216785589 | 16 October 2018, 13:00:23 UTC |
a3f6246 | Deepmind | 11 October 2018, 17:45:50 UTC | Changed inputs.shape to tf.shape(inputs) to allow unknown batch dimension. PiperOrigin-RevId: 216722246 | 16 October 2018, 13:00:16 UTC |
ce11b92 | adriap | 10 October 2018, 16:30:01 UTC | Change axis in concat in DeepRNN when using skip_connections. Previous use cases will work unchanged (i.e. cores with shape [batch_size, feature_size] will have the same behaviour), and the change enables more sensible concatenation of cores that are multidimensional. PiperOrigin-RevId: 216543007 | 16 October 2018, 13:00:08 UTC |
058e291 | Deepmind | 08 October 2018, 17:29:17 UTC | Add `rate` field to the SeparableConv[1,2]D classes. PiperOrigin-RevId: 216208732 | 16 October 2018, 13:00:00 UTC |
ac0d4f5 | Deepmind | 08 October 2018, 15:13:49 UTC | Describe input shape requirements for _ConvND module more accurately. PiperOrigin-RevId: 216188489 | 16 October 2018, 12:59:52 UTC |
ef01425 | Deepmind | 02 October 2018, 10:06:02 UTC | Additional argument-overriding custom getter that only updates defaults, honouring any non-None argument values set in tf.get_variable (or in nested scopes' custom getters). PiperOrigin-RevId: 215360232 | 16 October 2018, 12:59:34 UTC |
fe0874e | diegolascasas | 01 October 2018, 09:16:13 UTC | Internal changes. PiperOrigin-RevId: 215180895 | 16 October 2018, 12:59:26 UTC |
7106507 | Deepmind | 28 September 2018, 13:15:00 UTC | Add dropout to Sonnet's MLP class. Dropout is a very useful regularizer that isn't currently supported in Sonnet's MLP class. In this CL, we add an argument to the MLP class, `use_dropout`. The `_build` method now takes optional `is_training` and `dropout_keep_probability` arguments. PiperOrigin-RevId: 214926124 | 16 October 2018, 12:59:18 UTC |
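As a reminder of the mechanics being added here: inverted dropout keeps each unit with probability `keep_prob` during training (rescaling survivors by 1/keep_prob so the expected activation is unchanged) and is the identity at test time. A toy standalone sketch of that behaviour, not the MLP implementation itself:

```python
import random

def dropout(values, keep_prob, is_training, seed=0):
    """Inverted dropout on a flat list of activations.

    At test time (is_training=False), or with keep_prob=1.0, the
    input passes through unchanged; otherwise each unit is kept
    with probability keep_prob and scaled by 1/keep_prob."""
    if not is_training or keep_prob >= 1.0:
        return list(values)
    rng = random.Random(seed)
    return [v / keep_prob if rng.random() < keep_prob else 0.0
            for v in values]

print(dropout([1.0, 2.0], keep_prob=1.0, is_training=True))   # [1.0, 2.0]
print(dropout([1.0, 2.0], keep_prob=0.5, is_training=False))  # [1.0, 2.0]
```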
f100d0b | Deepmind | 28 September 2018, 11:56:45 UTC | Add Learn to Execute example for Relational Memory Core to sonnet examples. PiperOrigin-RevId: 214918399 | 16 October 2018, 12:59:10 UTC |
c99c3f9 | Deepmind | 20 September 2018, 12:36:12 UTC | Adjust test sizes/tags for sanitizers PiperOrigin-RevId: 213795074 | 16 October 2018, 12:59:02 UTC |
1c3d7a8 | Deepmind | 19 September 2018, 14:15:41 UTC | Adjust test sizes/tags for sanitizers PiperOrigin-RevId: 213622590 | 16 October 2018, 12:58:53 UTC |
55061cb | Deepmind | 13 September 2018, 11:12:32 UTC | Replace tf.GraphKeys.VARIABLES with tf.GraphKeys.GLOBAL_VARIABLES PiperOrigin-RevId: 212790529 | 16 October 2018, 12:58:45 UTC |
e8bd52b | Deepmind | 11 September 2018, 22:15:20 UTC | Fix for n-th farthest task RMC example. Corrects index reference to object. PiperOrigin-RevId: 212530863 | 16 October 2018, 12:58:36 UTC |
f4da53a | Deepmind | 07 September 2018, 14:31:24 UTC | Increase size of convnet_test and dilation_test from small to medium. PiperOrigin-RevId: 211973944 | 16 October 2018, 12:58:28 UTC |
ffa7249 | tomhennigan | 04 September 2018, 12:03:40 UTC | Avoid same graph checks in eager mode and stop using `_graph_key`. PiperOrigin-RevId: 211439151 | 16 October 2018, 12:58:17 UTC |
f32ee48 | Deepmind | 03 September 2018, 11:22:22 UTC | RNN Shakespeare test: reduce number of training steps from 10 to 5. PiperOrigin-RevId: 211337169 | 16 October 2018, 12:58:08 UTC |
6680867 | Deepmind | 01 September 2018, 14:50:23 UTC | Fix docstring PiperOrigin-RevId: 211210870 | 16 October 2018, 12:58:00 UTC |
0bdd9c3 | Deepmind | 31 August 2018, 10:19:14 UTC | Removes the reference to snt.SkipConnectionCore. PiperOrigin-RevId: 211061744 | 16 October 2018, 12:57:52 UTC |
5e0234e | fviola | 29 August 2018, 21:04:14 UTC | Comment _scale_gradient_op regarding possible memoization requirements. PiperOrigin-RevId: 210785875 | 16 October 2018, 12:57:38 UTC |
3011932 | tomhennigan | 29 August 2018, 15:44:29 UTC | Allow Sonnet modules to defun wrap their reuse_variables methods. There was a subtle bug in `_capture_variables` where inside a `defun` we did not re-enter the Template's variable store (since `executing_eagerly` is False). By not re-entering the store we break variable re-use (since `get_variable` returns a new variable instance each time it is called). PiperOrigin-RevId: 210727568 | 16 October 2018, 12:57:29 UTC |
f18095f | tomhennigan | 28 August 2018, 13:03:23 UTC | Allow Sonnet modules to defun wrap their call method.

```
>>> mlp = snt.nets.MLP([1, 2, 3])
>>> mlp(tf.constant([[1.0]]))
Tensor("mlp_1/linear_2/add:0", shape=(1, 3), dtype=float32)
>>> mlp.defun()
>>> mlp(tf.constant([[1.0]]))
Tensor("PartitionedCall:0", shape=(1, 3), dtype=float32)
```

By wrapping `_call` and not the whole module we allow properties on the module to remain accessible (without keeping a reference to the "raw" and defun'd objects). A good example of this is seen in the updated test. PiperOrigin-RevId: 210528522 | 16 October 2018, 12:57:20 UTC |
535ccdb | mareynolds | 24 August 2018, 13:57:33 UTC | Fix docstring typo PiperOrigin-RevId: 210092924 | 16 October 2018, 12:57:12 UTC |
c776d06 | Deepmind | 24 August 2018, 11:46:14 UTC | Sonnet embed: print warning about using default initializer. Eventually we will switch to using a default initializer of stddev=1. PiperOrigin-RevId: 210082974 | 16 October 2018, 12:57:04 UTC |
a5fcdac | arahuja | 23 August 2018, 09:14:11 UTC | Add clone method to snt.nets.MLP PiperOrigin-RevId: 209902324 | 16 October 2018, 12:56:57 UTC |
257a62c | tomhennigan | 20 August 2018, 13:08:46 UTC | Make `snt.scale_gradient` support eager mode. Much like optimizers and other tensor-taking APIs, in eager mode we require users to pass us a callable which they want to scale the gradients of.

```
>>> f = lambda x: tf.pow(x, 2)
>>> f = scale_gradient(f, scale=0.1)
>>> dy_scaled, = tfe.gradients_function(f)(x)
>>> print(dy_scaled.numpy())  # 0.2
```

PiperOrigin-RevId: 209405560 | 16 October 2018, 12:56:50 UTC |
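The contract of `scale_gradient` (forward pass unchanged, backward pass multiplied by `scale`) can be illustrated without TensorFlow by pairing a function with a hand-written derivative. This is purely a conceptual sketch, not how the op is implemented:

```python
def scale_gradient(f, df, scale):
    """Pair a forward function with a scaled backward function:
    the forward pass is untouched, while the reported derivative
    is the true derivative df multiplied by `scale`."""
    forward = lambda x: f(x)
    backward = lambda x: scale * df(x)
    return forward, backward

# f(x) = x^2 with hand-written derivative df(x) = 2x; gradient halved.
fwd, bwd = scale_gradient(lambda x: x ** 2, lambda x: 2.0 * x, scale=0.5)
print(fwd(3.0))  # 9.0 (forward unchanged)
print(bwd(3.0))  # 3.0 (true gradient 6.0 scaled by 0.5)
```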
0e649da | tomhennigan | 17 August 2018, 11:26:11 UTC | Implement same graph check using graph keys. tl;dr - resolves the last known issues with Sonnet and `tfe.defun`. Inside a `defun` we observe a different graph instance each time the function is traced. Sonnet asserts that each time a module is traced its graph has not changed. This CL changes that test to check that the graph we observe each time has the same "key" (rather than being the same instance). The key is akin to a primary key for the graph. Graph instances with the same key form part of the same parent graph (e.g. a sub-graph representing a function with key "a" shares variables with a regular tf.Graph with key "a" which it is a part of). DifferentGraphError used to fire undesirably in defun in the following cases (both of which are fixed in this CL): 1) `enter_variable_scope` is used in conjunction with `_build` (e.g. in the constructor or via `snt.reuse_variables`). 2) `defun` re-traces `_build` due to the input signature changing (e.g. Tensor shape changing, or Python parameter values changing). PiperOrigin-RevId: 209131983 | 16 October 2018, 12:56:44 UTC |
6a42160 | Deepmind | 15 August 2018, 08:59:31 UTC | Pass optional named arguments to the wrapped sonnet module. Named arguments can be used to change the behavior of sonnet modules. For example, it's not uncommon to use dropout to regularize a deep neural network. However, dropout should only be used during the training of the network, not afterwards. PiperOrigin-RevId: 208786599 | 16 October 2018, 12:56:36 UTC |
1fb3f4e | tomhennigan | 14 August 2018, 10:15:37 UTC | Make MLP/ConvNet compatible with `tfe.defun`. PiperOrigin-RevId: 208621259 | 16 October 2018, 12:56:29 UTC |
0452bbd | tomhennigan | 09 August 2018, 12:48:53 UTC | Make use of `variable_creator_scope` for variable tracking. Variable creators are stackable factory functions used to control variable creation. By placing a variable creator at the top of the stack we can observe all variables being created or re-requested (e.g. via `tf.get_variable`) and store them in `self._all_variables`. Additionally this works for the case where a `custom_getter` creates more than one variable (as in the "Bayes by Backprop" case). PiperOrigin-RevId: 208034949 | 16 October 2018, 12:56:22 UTC |
1f310fd | tomhennigan | 03 August 2018, 19:33:02 UTC | Simplify module stacks by removing weakref to graph. We perform same-graph checks aggressively when the module is connected, so it seems to me there is no simple way to end up with multiple graphs in the module stack. The reason I'd like to remove this is that it causes some weirdness with `defun`, since if two modules `defun` themselves you get two different capturing graphs, which would cause the module stack functionality to break down:

```
>>> @tfe.defun
... def foo():
...   print 'foo', id(tf.get_default_graph())
...   return bar()
>>> @tfe.defun
... def bar():
...   print 'bar', id(tf.get_default_graph())
...   return tf.ones([])
>>> id(tf.get_default_graph())
140336549583184
>>> foo()
foo 140336571240336
bar 140336555261456
```

There are other issues with composing defuns which mean it's not simple to add a Sonnet-specific test for this (yet!), however we have many tests which cover the functionality enabled by the module stack (pushing variables from child to parent) and those still pass :) PiperOrigin-RevId: 207307133 | 16 October 2018, 12:56:15 UTC |