https://github.com/casperkaae/parmesan

Revision Message Commit Date
8677668 Update requirements.txt 07 August 2016, 12:11:16 UTC
93209ef Merge pull request #39 from casperkaae/stl10 add STL10 + fix normalization off by one 27 July 2016, 22:24:30 UTC
0f93650 add STL10 + fix normalization off by one - add STL10 loader. - fixes normalization dividing by 256 instead of 255. 27 July 2016, 11:19:28 UTC
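The off-by-one matters because uint8 pixels span 0..255: dividing by 256 never reaches 1.0, so the data never covers the full [0, 1] range. A minimal numpy sketch of the corrected normalization (illustrative, not the actual Parmesan loader code):

```
import numpy as np

def normalize_images(x):
    # uint8 pixels span 0..255, so divide by 255.0 (not 256.0)
    # to map the data onto the full [0, 1] range.
    return x.astype(np.float32) / 255.0

print(normalize_images(np.array([0, 255], dtype=np.uint8)))  # [0. 1.]
```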
d50ed81 Update datasets.py fix cifar10 loader 05 May 2016, 22:56:01 UTC
de4e573 Merge pull request #36 from casperkaae/vimco add vimco example + bernoulli sample layers 05 May 2016, 14:59:45 UTC
8fcb6ec add vimco 02 May 2016, 11:35:03 UTC
ea1df42 Merge pull request #35 from wuaalb/helpers thanks, merged #35 08 April 2016, 08:44:23 UTC
9dbd91e revert eps to epsilon, clean up distributions 06 April 2016, 14:27:13 UTC
1c3d877 added normal-normal kld 02 April 2016, 19:59:02 UTC
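The normal-normal KLD has a closed form per dimension: KL(N(μ1, σ1²) ‖ N(μ2, σ2²)) = ½(log(σ2²/σ1²) + (σ1² + (μ1 − μ2)²)/σ2² − 1). A numpy sketch with hypothetical argument names (Parmesan's helper works on Theano expressions):

```
import numpy as np

def kl_normal_normal(mean1, log_var1, mean2, log_var2):
    # Elementwise KL( N(mean1, var1) || N(mean2, var2) ) for diagonal
    # Gaussians: 0.5 * (log(var2/var1) + (var1 + (mean1-mean2)^2)/var2 - 1)
    var1, var2 = np.exp(log_var1), np.exp(log_var2)
    return 0.5 * (log_var2 - log_var1
                  + (var1 + (mean1 - mean2) ** 2) / var2 - 1.0)

print(kl_normal_normal(0.0, 0.0, 0.0, 0.0))  # 0.0 for identical distributions
```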
ac4b940 updated examples 01 April 2016, 18:14:00 UTC
ba49e0b changed some argument names to be more consistent with lasagne, numpy 01 April 2016, 17:53:03 UTC
f288eaf added log normal parameterized with variance 01 April 2016, 17:44:41 UTC
61e3f0c added log-sum-exp util 01 April 2016, 17:18:14 UTC
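The log-sum-exp util is the standard stability trick: subtract the max before exponentiating so the largest term becomes exp(0) = 1 and nothing overflows. A numpy sketch (the real util operates on Theano tensors):

```
import numpy as np

def log_sum_exp(a, axis=0):
    # log(sum(exp(a))) computed stably: shifting by the max makes the
    # largest exponent exp(0) = 1, so nothing overflows.
    a_max = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(a_max, axis=axis) + np.log(np.sum(np.exp(a - a_max), axis=axis))

print(log_sum_exp(np.array([1000.0, 1000.0])))  # ~1000.693 instead of inf
```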
f5af671 Merge pull request #34 from casperkaae/updatedatasetloader downloading the svhn extra dataset only when needed. 31 March 2016, 20:50:29 UTC
bfc34da downloading the svhn extra dataset only when needed. 31 March 2016, 15:12:05 UTC
4ed9c02 Update vae_vanilla.py 10 March 2016, 14:54:40 UTC
01c7f24 Update iw_vae_normflow.py 07 March 2016, 10:30:19 UTC
0836cd3 Update iw_vae.py 07 March 2016, 10:29:56 UTC
6ee0b67 fix epsilon iw_vae_normflow 06 March 2016, 12:30:52 UTC
74ab56f fix epsilon iw_vae 06 March 2016, 12:29:59 UTC
c2465b9 Merge pull request #32 from thjashin/master address NaN issues 06 March 2016, 12:26:45 UTC
ebd52bd eps default to 0 03 March 2016, 13:03:42 UTC
161cb5b add eps as parameters 03 March 2016, 05:13:36 UTC
43053fb address NaN issues 02 March 2016, 07:14:44 UTC
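The NaN issue here is the usual one in Bernoulli/Gaussian log-densities: log(0) when a predicted probability saturates at exactly 0 or 1. Clipping with a small eps, exposed as a parameter and defaulting to 0 as in the commits above, keeps the logs finite. A numpy restatement of the pattern (not Parmesan's exact code):

```
import numpy as np

def log_bernoulli(x, p, eps=0.0):
    # Clipping p away from exactly 0/1 keeps log() finite; with the
    # default eps=0 this is a no-op, matching "eps default to 0" above.
    p = np.clip(p, eps, 1.0 - eps)
    return x * np.log(p) + (1.0 - x) * np.log(1.0 - p)

print(log_bernoulli(1.0, 0.0))            # -inf: the NaN/overflow source
print(log_bernoulli(1.0, 0.0, eps=1e-6))  # ~-13.8: finite with clipping
```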
2c2b99f fix mu->mean in examples 18 February 2016, 20:55:29 UTC
695f1ed Merge pull request #31 from casperkaae/add_seed add seed option to sample layers 16 February 2016, 16:15:58 UTC
16b00db add seed v2 15 February 2016, 21:17:20 UTC
d26e011 add seed method 15 February 2016, 21:15:32 UTC
230cbf0 add seed option to sample layers add seed option to SampleLayer and SimpleSampleLayer init 15 February 2016, 20:24:12 UTC
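The sample layers implement the reparameterization trick, z = μ + σ·ε with ε ~ N(0, I); seeding the random stream that draws ε makes runs reproducible. A numpy sketch (the real layers draw ε from Theano random streams):

```
import numpy as np

def sample_gaussian(mean, log_var, seed=None):
    # Reparameterization trick: z = mean + sigma * eps, eps ~ N(0, I).
    # A fixed seed makes eps, and hence training, reproducible.
    rng = np.random.RandomState(seed)
    eps = rng.standard_normal(np.shape(mean))
    return mean + np.exp(0.5 * log_var) * eps

z1 = sample_gaussian(np.zeros(3), np.zeros(3), seed=42)
z2 = sample_gaussian(np.zeros(3), np.zeros(3), seed=42)
print(np.allclose(z1, z2))  # True: same seed, same draw
```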
ca2ae31 Merge pull request #30 from casperkaae/fixdatasetloader fix datasetloader and confmat 15 February 2016, 20:10:50 UTC
08d0d5d fix line width v2 15 February 2016, 20:09:27 UTC
5b79e7c fix line widths in dataset.py 15 February 2016, 20:08:55 UTC
7880c75 fix datasetloader and confmat 1) Moves import of nonstandard packages to local functions in dataset-loaders 2) Changes interface to the ConfusionMatrix.batchadd to be consistent with cost functions 15 February 2016, 19:30:18 UTC
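For point 2, a minimal hypothetical sketch of the batch-add interface; the (targets, predictions) argument order mirroring the cost-function convention is the point, not the class itself:

```
import numpy as np

class ConfusionMatrix(object):
    # Hypothetical minimal version; only the (targets, predictions)
    # argument order, matching the cost-function convention, is the point.
    def __init__(self, n_classes):
        self.mat = np.zeros((n_classes, n_classes), dtype=np.int64)

    def batch_add(self, targets, predictions):
        for t, p in zip(targets, predictions):
            self.mat[t, p] += 1

cm = ConfusionMatrix(3)
cm.batch_add([0, 1, 2, 2], [0, 2, 2, 2])
print(cm.mat.trace(), "correct out of", cm.mat.sum())  # 3 correct out of 4
```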
f28d1c6 Merge pull request #28 from casperkaae/naming Naming 15 February 2016, 19:18:04 UTC
1b31d1a latex typo 15 February 2016, 12:28:02 UTC
0111643 naming 15 February 2016, 12:18:36 UTC
a770b01 add multinomial + naming 15 February 2016, 12:18:13 UTC
f252053 Fix error where h4 was wrongly wired to h5 and h6 14 February 2016, 23:07:06 UTC
b721d93 Merge pull request #27 from casperkaae/rotten add rotten tomatoes character loader 09 February 2016, 08:42:31 UTC
6a0ff3a loader for rotten tomatoes 08 February 2016, 23:40:30 UTC
e14aaf1 fix norb dataset loader 28 January 2016, 14:09:25 UTC
5024d81 add small norb dataset 28 January 2016, 11:36:18 UTC
1c9fa4d fix omniglot loader 27 January 2016, 09:50:57 UTC
58da87c add omniglot from Burda's repo 27 January 2016, 09:09:18 UTC
06a60e1 better description 14 January 2016, 23:19:02 UTC
4420361 add keyword argument for custom nonlinearity in sample layer 14 January 2016, 23:10:41 UTC
023e9a2 fix omniglot 04 January 2016, 15:55:06 UTC
948be10 add omniglot 04 January 2016, 14:32:34 UTC
627ca2b add omniglot 04 January 2016, 14:31:45 UTC
cae7766 add proper cifar10 loader 30 December 2015, 14:28:38 UTC
b76bb97 20newsgroup fix 30 December 2015, 10:46:44 UTC
bdce921 fix onehot 2 29 December 2015, 17:42:26 UTC
e5b359b 20newsgroup onehot fix 29 December 2015, 17:31:03 UTC
2e7a05b add 20newsgroup dataset 29 December 2015, 17:22:12 UTC
6a821cb fixed freyfaces fixed splits 25 November 2015, 14:15:27 UTC
ba18078 add download of fixed splits for lfw and freyfaces 25 November 2015, 13:58:46 UTC
d0f90c1 fix reshape error in svhn loader 25 November 2015, 13:04:51 UTC
2068e03 reshape freyfaces 25 November 2015, 12:59:06 UTC
63f2938 dequantify frey faces 25 November 2015, 12:55:17 UTC
2e397de add option to dequantify image data 25 November 2015, 12:33:58 UTC
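Here "dequantify" means adding uniform noise to the discrete 0..255 pixel counts so a continuous density model is not scored on point masses. A numpy sketch (names are illustrative):

```
import numpy as np

def dequantify(x, rng=np.random):
    # Pixels are discrete 0..255 counts; adding U[0,1) noise spreads each
    # value over a unit bin so a continuous likelihood is well-defined.
    # Note /256.0 is correct here: the max value is 255 + noise < 256.
    return (x.astype(np.float32) + rng.uniform(size=x.shape)) / 256.0

x = np.array([[0, 128, 255]], dtype=np.uint8)
print(dequantify(x))  # values jittered within their bins, all in [0, 1)
```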
7b79215 Merge pull request #23 from casperkaae/add_datasets add svhn and lfw datasets 24 November 2015, 21:46:30 UTC
57cde4f add svhn and lfw datasets 24 November 2015, 15:11:37 UTC
6a96b84 add freyfaces dataset 23 November 2015, 14:57:57 UTC
44a7dee add init to params in scalelayer fix 06 November 2015, 23:18:04 UTC
b64044e add beta and gamma inits as params 06 November 2015, 17:51:04 UTC
d6d352b fix error handling in DenoiseLayer Conflicts: parmesan/layers/ladderlayers.py 05 November 2015, 15:11:33 UTC
9bda427 fix error check in DenoiseLayer 05 November 2015, 15:01:52 UTC
ea5bbce Merge pull request #21 from wuaalb/fix-normflow Fixes problem with Normalizing Flows 02 November 2015, 14:04:00 UTC
92e714e Fixes problem with Normalizing Flows - Fixes problem where p(z0) was used instead of p(zK), see eq. 15 of Rezende NF paper - Made `NormalizingPlanarFlowLayer` layer output logdet-Jacobian instead of `psi_u` so all logic specific to planar type flows is contained in layer and other types of flows can be used more easily - Various small changes to code, comments and logging for clarity 30 October 2015, 15:43:11 UTC
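For planar flows, the logdet-Jacobian the layer now outputs is log|1 + uᵀψ(z)| with ψ(z) = (1 − tanh²(wᵀz + b))·w; accumulating these over flow steps is what lets the objective score the prior at zK instead of z0. A numpy sketch of one step (illustrative, not the layer's Theano code):

```
import numpy as np

def planar_flow(z, u, w, b):
    # One planar flow step f(z) = z + u * tanh(w.z + b). Returning
    # log|det df/dz| = log|1 + u.psi(z)| lets the caller form
    # log q_K(z_K) = log q_0(z_0) - sum_k logdet_k and evaluate the
    # prior at z_K rather than z_0.
    a = np.tanh(np.dot(w, z) + b)
    psi = (1.0 - a ** 2) * w
    logdet = np.log(np.abs(1.0 + np.dot(u, psi)))
    return z + u * a, logdet

z = np.array([0.5, -0.2])
f_z, logdet = planar_flow(z, u=np.array([0.1, 0.3]), w=np.array([1.0, 0.0]), b=0.0)
print(f_z, logdet)
```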
15dcf50 Merge pull request #17 from wuaalb/clean-examples Cleaned up examples a little 27 October 2015, 16:17:29 UTC
5053807 update usage example 27 October 2015, 14:58:52 UTC
175291d Cleaned up examples a little Fixes a number of small issues with the examples - Inconsistent output folders across different examples - Some help strings for `argparse` were not very helpful - Some options (like number of epochs) were not settable through the command line - In the `mnist_ladder` example, default value for `-initval` was `'relu'` while it should be a numeric value which depends on `-init` (changed to `'None'` meaning Lasagne default) - Some comments regarding tensor sizes were incorrect or inconsistent - Reduced batch size when estimating test log likelihood using 5k point importance sampling, to reduce memory requirements on smaller GPUs (e.g. 5000 (iw_samples) x 50 (batch_size) x 784 (num_features) x 4 bytes is approx. 784 MB) - Actual number of epochs trained would be one less than specified in some examples - Added input shuffling for `vae_vanilla` training like other examples (this makes training a little slower and train lower bound seems a little lower, but test lower bound is higher; i.e. shuffling seems to help generalization) - Some minor clean-ups 27 October 2015, 14:54:09 UTC
6260ef4 Merge pull request #20 from casperkaae/fix_examples Fix examples 27 October 2015, 14:14:17 UTC
ad97359 hp settings 27 October 2015, 14:13:56 UTC
4be5725 small changes 27 October 2015, 12:50:22 UTC
e9d8d3a add batch norm to iw_vae.py 27 October 2015, 12:44:07 UTC
04c75b5 fixes sampling bug v3 27 October 2015, 12:34:24 UTC
8033d30 fixed comment in example 27 October 2015, 12:09:39 UTC
0810cb3 fixes sampling v2 27 October 2015, 12:03:20 UTC
d18a996 fixes bug in iw_vae_normflow example 27 October 2015, 10:51:51 UTC
dd52625 Merge pull request #19 from casperkaae/fixed_iw_example fixes performance in iw examples 27 October 2015, 10:40:34 UTC
018b3d6 fixes performance in iw examples 27 October 2015, 10:35:51 UTC
c48258d Merge pull request #16 from wuaalb/vae_vanilla-def-params Better default hyper-parameters for vae_vanilla example 27 October 2015, 10:11:12 UTC
ac48403 Better default hyper-parameters for vae_vanilla example With previous parameters the test ELBO would start going up after approx. 60 epochs. Decreased learning rate and number of hidden units in deterministic layers of encoder/decoder. Set `analytic_kl_term=True` by default as it seems to improve results and is what the Kingma et al. paper does in its examples. Changed non-linearity to `softplus` for deterministic hidden layers. Also tried `tanh` and `very_leaky_rectify`, but this seemed to perform best. Results with these settings ``` *Epoch: 999 Time: 9.03 LR: 0.00030 LL Train: -90.331 LL test: -93.592 ``` 16 October 2015, 11:07:55 UTC
a2c1086 Merge pull request #14 from wuaalb/clean-text Various minor fixes of docstrings, comments, .. 15 October 2015, 10:14:56 UTC
5f28ff0 Various minor fixes of docstrings, comments, .. 14 October 2015, 19:35:12 UTC
aefaab2 Merge pull request #13 from wuaalb/analytic_kl Analytic KL term, misc example clean up 13 October 2015, 21:16:37 UTC
0cc0093 docstrings for distributions, better default hyper-parameters for vae_vanilla 13 October 2015, 19:21:19 UTC
8e4ab07 Analytic KL term, misc example clean up - Added option for analytically integrated KL term in vae_vanilla.py (lower variance estimator of lower bound compared to using Monte Carlo approximation) - Slightly modified distributions.py - Added fixed numpy random seed for reproducibility - Few minor clean-ups in examples 13 October 2015, 13:26:05 UTC
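For a diagonal Gaussian posterior and a standard normal prior, the KL term has the closed form −½ Σ(1 + log σ² − μ² − σ²), so no samples are needed and that part of the ELBO gradient carries no Monte Carlo noise. A numpy sketch with hypothetical names:

```
import numpy as np

def kl_to_unit_gaussian(mean, log_var):
    # Closed-form KL( N(mean, exp(log_var)) || N(0, I) ) summed over
    # latent dims; replaces a Monte Carlo estimate, so the ELBO's KL
    # term contributes no sampling noise to the gradients.
    return -0.5 * np.sum(1.0 + log_var - mean ** 2 - np.exp(log_var), axis=-1)

print(kl_to_unit_gaussian(np.zeros((1, 2)), np.zeros((1, 2))))  # [0.]
```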
185a391 cleanup v2 13 October 2015, 08:23:44 UTC
9b2e496 small change to gitignore 13 October 2015, 08:17:43 UTC
b44d962 small readme change 11 October 2015, 13:07:31 UTC
f39c245 Merge pull request #11 from casperkaae/rewamped_datasets Support for binarized MNIST data, closes #8 and #10 11 October 2015, 12:09:27 UTC
0a6bd6c readme changes 10 October 2015, 22:40:23 UTC
5638eea small hyperparam and readme changes 10 October 2015, 22:38:30 UTC
155ad7e small formatting changes 10 October 2015, 15:51:03 UTC
657dd39 added support for binarized MNIST 10 October 2015, 15:49:55 UTC
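"Binarized MNIST" commonly means either a fixed binarization or dynamic binarization, where each grayscale intensity in [0, 1] is treated as a Bernoulli probability and pixels are resampled; which variant this loader provides is not stated here, so the sketch below shows the dynamic form as an assumption:

```
import numpy as np

def binarize(x, rng=np.random):
    # Dynamic binarization: treat each grayscale intensity in [0, 1] as a
    # Bernoulli probability and draw binary pixels (resampled per epoch).
    return (rng.uniform(size=x.shape) < x).astype(np.float32)

x = np.array([[0.0, 0.5, 1.0]])
print(binarize(x))  # e.g. [[0. 1. 1.]]; 0.0 and 1.0 pixels are deterministic
```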
6dec0bd Merge pull request #9 from wuaalb/fix-typos Fixed some small typos 10 October 2015, 13:50:44 UTC
3394fbe Fixed some small typos 09 October 2015, 20:57:37 UTC
ffa9633 update readme 09 October 2015, 15:30:00 UTC