867916a [fix] Fix imports for FB specific files (#138) Summary: Pull Request resolved: https://github.com/fairinternal/mmf-internal/pull/138 Fix internal imports so that we can use the code from mmf-internal on the FAIR cluster. Fix all tests to run on mmf-internal. Fix black/isort mismatches. Add missing init files to FB folders. Reviewed By: apsdehal Differential Revision: D26478270 fbshipit-source-id: 291f61b25bb5ba1755e5a5f211231471e86ea8b4 24 February 2021, 18:12:20 UTC
8409361 [feat] Add feature extraction script (#764) Summary: Add a feature extraction script that uses Huggingface's Fast RCNN structure and code to extract features from an image. `extract_features_frcnn.py` is my script; the other files were taken from Huggingface, with only super minor changes by me, including rotating the RGB/BGR values in the preprocessing script. The script takes a number of flags that alter its behavior, such as exclude files, input files, output files, batch size, etc. Run `python3 mmf/tools/scripts/features/frcnn/extract_features_frcnn.py --model_file model.bin --config_file config.yaml --image_dir ./example_images --output_folder ./output_features` Pull Request resolved: https://github.com/facebookresearch/mmf/pull/764 Reviewed By: vedanuj Differential Revision: D26612531 Pulled By: brettallenyo fbshipit-source-id: aed9e127d9e102b2704cadd131e80bd594bce28e 23 February 2021, 23:54:53 UTC
f5ff2c8 [fix] Improvement to VocabDict loader (#786) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/786 Upstream callers to VocabDict tend to pass `data_dir`, which leads to inadvertently concatenating it with the vocab_file path. We now skip this step if the path already exists. Reviewed By: apsdehal Differential Revision: D26601097 fbshipit-source-id: 6d9006c66d2e6ebef503ba0f8fc43b16a4d21550 23 February 2021, 20:10:05 UTC
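The guard described in this fix reduces to a check like the following (a minimal sketch with assumed names, not MMF's actual `VocabDict` code):

```python
import os


def resolve_vocab_path(vocab_file, data_dir=None):
    """Prepend data_dir only when vocab_file does not already resolve to
    an existing path, avoiding the inadvertent concatenation."""
    if data_dir is not None and not os.path.exists(vocab_file):
        return os.path.join(data_dir, vocab_file)
    return vocab_file
```

A path that already exists is returned untouched; only bare file names get joined with `data_dir`.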
0837776 [mmf][PostRay] Make capability an arg Summary: Make capability an arg so it's easier to switch between V100 16GB and 32GB cards. Reviewed By: madian9, vedanuj, apsdehal Differential Revision: D26600874 fbshipit-source-id: b4f95c09a611fb46101cbd2043f132b1d53eb67e 23 February 2021, 17:36:44 UTC
6db9048 [refactor, fix] Fix fbl flow to use iopath, some cleanup (#785) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/785 Some cleanup of file_io file Reviewed By: apsdehal, BruceChaun Differential Revision: D26599901 fbshipit-source-id: 1979248b54ec0d5b2566d158cae4a72028b1f116 23 February 2021, 04:56:33 UTC
b1fd2a9 [refactor] Refactor output type of MMFT preprocess_output (#781) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/781 - Refactors the output type to be Dict[str, Dict[str, Tensor]] - Namedtuples cannot be inherited to include new fields easily - Dataclasses are not scriptable Reviewed By: apsdehal, xuefeicao Differential Revision: D26479261 fbshipit-source-id: d64b116fadc1cdb314e0cd66189bde54d11dc2f6 19 February 2021, 05:08:02 UTC
bb6a9b7 [mmf][site] Upgrade Docusaurus Summary: To use FbInternalOnly, you need to be on Docusaurus 61+; this upgrades it. It also fixes a typo: FBInternalOnly -> FbInternalOnly. MDX currently silently (with a warning) accepts components that haven't been imported, so this wasn't raising an error but wasn't doing anything either. Also fixes a few broken links (the new version of Docusaurus checks these for you and enforces them). Reviewed By: apsdehal Differential Revision: D26510193 fbshipit-source-id: 81871d3151edce27a2281cbcaaa7e694309205a2 18 February 2021, 19:33:46 UTC
99f8010 [feat] Add batch_size_per_device training option (#780) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/780 This will allow specifying batch size per device instead of default global batch size. If specified using `training.batch_size_per_device`, this will supersede the value specified in `training.batch_size` and information will be logged. Reviewed By: madian9, vedanuj Differential Revision: D26480897 fbshipit-source-id: 9ffb3d9b79a84c39db4000e6a3d051e3bd1b6eb5 17 February 2021, 22:54:16 UTC
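The precedence rule between the two options can be sketched as follows (a hypothetical helper; the real resolution lives inside MMF's trainer config handling):

```python
def resolve_batch_size(batch_size, batch_size_per_device=None, num_devices=1):
    """training.batch_size_per_device, when set, supersedes the global
    training.batch_size; the effective global size is per-device * devices."""
    if batch_size_per_device is not None:
        return batch_size_per_device * num_devices
    return batch_size
```

So with 8 devices and `batch_size_per_device: 32`, the effective global batch size becomes 256 regardless of what `training.batch_size` says.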
daeaa95 [fix] Don't show warning for missing is_xla in registry (#779) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/779 The warning is plaguing the logs and should not be explicitly registered. This diff fixes that. Reviewed By: ytsheng Differential Revision: D26480670 fbshipit-source-id: 9c7d0a67a1cd8fa71d8c7537eeff75c107c1117b 17 February 2021, 18:04:42 UTC
ea71345 [fix] Fix SPM Tokenizer for masked labels (#775) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/775 Fix masked SPM tokenizer inherited from MaskedBertTokenizer Differential Revision: D26432162 fbshipit-source-id: f0acb2388c4030c48573ec68b8ff6ab4e9ecd99c 16 February 2021, 17:51:22 UTC
53ee8d8 [feat] Changes to support training on TPUs (#752) Summary: Creating this new PR since pytorch-xla branch is in use for an ongoing project. I have incorporated most of the changes understood from the last comments on https://github.com/facebookresearch/mmf/pull/693 Happy to follow up on the open comments here. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/752 Reviewed By: vedanuj Differential Revision: D26218782 Pulled By: apsdehal fbshipit-source-id: 6ce6e48134153cea9e9969acac021cf2bf955fe6 12 February 2021, 22:15:46 UTC
18d685d [refactor,tests,fix] Clean up and fix a bug in MMFT; add extensive tests (#774) Summary: This diff addresses a bug in text position ids that was introduced by a recent change in https://github.com/facebookresearch/mmf/issues/760. The fix is to make single-feature-dim modalities B X 1 X D before passing them to the backend, instead of making multiple assumptions in the backend layer that tended to conflict with text input ids. This also adds some comments to better explain the stacked version of text. The PR also refactors MMFT to be more readable and to reduce cyclomatic complexity. Finally, this PR adds extensive tests to make sure a change like https://github.com/facebookresearch/mmf/issues/760 doesn't break MMFT again. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/774 Test Plan: Adds extensive tests for MMFT. Tested disney's model. Reviewed By: ytsheng Differential Revision: D26410408 Pulled By: apsdehal fbshipit-source-id: 059af500739b3463219b8594b3bb72001326a0a2 12 February 2021, 20:30:49 UTC
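The shape normalization the fix describes can be sketched over shape tuples (illustrative only; the real code operates on tensors inside MMFT):

```python
def normalize_feature_shape(shape):
    """A modality arriving as (B, D) gains a singleton position dim to
    become (B, 1, D) before reaching the transformer backend; anything
    already carrying a position dim passes through untouched."""
    if len(shape) == 2:
        batch, dim = shape
        return (batch, 1, dim)
    return shape
```

Normalizing at the modality boundary means the backend never has to guess whether a 2-dim input is a text id sequence or a single feature vector.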
95a2225 [docs] Update internal docs to latest Summary: Updates fblearner and devserver docs to point to latest content. Reviewed By: ytsheng Differential Revision: D25511541 fbshipit-source-id: 58ba9ff19a294c5c419baad50e18d2b3ed1013d7 11 February 2021, 22:04:01 UTC
2ff0062 [feat] Add MLM pretraining head for MMF Transformer (#769) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/769 1. Adds an MLM pretraining head for the MMF Transformer model 2. Fixes `MultiSentenceBertTokenizer` to also support multi-sentence masking 3. Adds `mlm_labels_list` preprocessing 4. Adds pretraining-head-specific postprocessing Note: This is not the best way we can add pretraining losses and heads, but we should tackle it properly when we start working on Pretraining Schemes support for MMFT and other models. Facebook: 1. Fixes dataset filtering to support the case when no `set_name_mapping` is available for a dataset 2. Fixes `FeaturesWithTextBatchProcessor` when no labels are available for a dataset 3. Fixes `MultiSentenceSPMTokenizer` to also support multi-sentence masking Reviewed By: apsdehal Differential Revision: D26270908 fbshipit-source-id: 84a1ae462aea8fda6e6b0c791fad7722a69dbfb2 09 February 2021, 18:39:19 UTC
0ee1127 [feat] PL mvp0: training (#748) Summary: * pytorch lightning stub, mostly involving training * Tests for the lightning trainer included * built on top of the mmf grad accumulation fix: https://github.com/facebookresearch/mmf/pull/747 - [X] MVP 0. Training: Goal - Train a model from scratch and reach similar accuracy as using mmf_trainer - [X] Setup the training pipeline: done - [X] Training on the right device: done - [X] Clip gradients: done - [X] Optimizer: done - [X] FP16 Support: done - [X] LR scheduler (incl. warmup etc): done - [X] testcase: train visual_bert on vqa from scratch for 10 iterations, compare the value: done - [x] tests included in this PR (tests are only done for pytorch lightning integration): - [X] Vanilla Training w/o grad accumulate, make sure loss for 5 iters are the same between mmf and pl - [X] Optimizer working as intended as a part of this PR. - [X] `max_updates` and `max_epochs` calculation - [x] Training with grad accumulate - [x] Training with LR schedule achieves a different value compared to without LR schedule - [x] Training with LR schedule for PL is the same as training with LR schedule for `mmf_trainer` - [x] Training with gradient clipping make sure all grads are within the `grad_clipping` threshold - [x] Training with gradient clipping is the same as training with gradient clipping for `mmf_trainer` Pull Request resolved: https://github.com/facebookresearch/mmf/pull/748 Reviewed By: apsdehal, simran2905 Differential Revision: D26192869 Pulled By: ytsheng fbshipit-source-id: 203a91e893d6b878bbed80ed84960dd059cfc90c 08 February 2021, 10:52:20 UTC
fc72ef0 [fix] fix typo in model zoo config (#715) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/715 Reviewed By: vedanuj Differential Revision: D25553078 Pulled By: ytsheng fbshipit-source-id: e24989d7118b0de3e92a381182a1a7fe4b540bc6 06 February 2021, 04:17:31 UTC
d49abdf [sdoc][codemod] Fix cert dialog popup on open-source sites Summary: Upgrades the FB-internal documentation plugin, to no longer call the internal api that requires auth, for external viewers. Context: https://fb.workplace.com/groups/654274975334601/permalink/1288672008171584/ Reviewed By: justintrudell Differential Revision: D26252469 fbshipit-source-id: 31068534ce79a7959c38c6e66e6a9cf12371e228 04 February 2021, 19:48:55 UTC
bfacca4 [feat] Changes to support bento notebook tutorial (#761) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/761 This diff aims to fix various things in MMF to provide an optimal bento notebook experience. - Fix some hasattr checks to "in" checks - Allow empty config_path for datasets and models - Also, allow missing model key or dataset key in yaml configs - Adds a dataclass for LossConfig so that losses can be initialized in the model through structured configs - Fix the usage of next(model.parameters()) to get the device; this doesn't work with DataParallel - Also, allows using arbitrary keys with text and image modality type. MMFT will now expect to find these in SampleList. - Allow passing `input_mask` as `modality_mask` for text type Reviewed By: vedanuj Differential Revision: D26170013 fbshipit-source-id: 7af3766ffcf50acc9d5d69dd1eb91da224e0783c 02 February 2021, 09:32:36 UTC
5b01c32 [feat] Support direct feature inputs with single position dim in MMFT (#760) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/760 This diff allows MMFT to support direct features which don't use any encoders and are of position dim == 1. Also, it uses BertModelJit to initialize the transformer base so that it is torchscriptable. Reviewed By: vedanuj Differential Revision: D24555077 fbshipit-source-id: ee9be44fed10ba30f1a88437ec42a01f38952347 02 February 2021, 07:11:17 UTC
28f1a81 [feat] Add config dataclasses for MMFT (#658) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/658 - Adds base dataclasses for MMFT and related encoders if something is missing - Fix usage of EncoderFactory with enums in build_model - Adds extensive test for building the model with the configs - Adds defaults to the configs as well based on the original configs - In second set of changes, need to isolate backend configs. Reviewed By: vedanuj Differential Revision: D24469282 fbshipit-source-id: c2e752e163c23b9d1142bbe2fd0b45553f33dc07 02 February 2021, 07:11:17 UTC
b8f4591 [fix] Fix accumulate tensor fields in evaluation (#757) Summary: Fixes evaluation loop after changes in https://github.com/facebookresearch/mmf/issues/747 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/757 Reviewed By: ytsheng Differential Revision: D26171041 Pulled By: vedanuj fbshipit-source-id: f0fd24ef96ef54dd7ea17af5968e05f0301e64a3 01 February 2021, 09:52:13 UTC
8237ed2 [fix] Fix windows build for iopath dependency (#758) Summary: Fixes Windows build after it was broken due to iopath dependency inclusion. Latest version of pywin is not compatible with python3.8, so we need to fix it at v225. Reference : [issue](https://github.com/mhammond/pywin32/issues/1431). Pull Request resolved: https://github.com/facebookresearch/mmf/pull/758 Reviewed By: ytsheng Differential Revision: D26171059 Pulled By: vedanuj fbshipit-source-id: 332281c7987a8ba9cdb4cc3ec38f23476a2af146 01 February 2021, 08:15:45 UTC
11b531e [fix] fix gradient accumulation when update_frequency != 1 (#747) Summary: * fix gradient accumulation when update_frequency is not equal to 1 * The issue is caused by "start_update" being called at an irregular time, so the gradients are actually not accumulated between update_frequency batches. When the optimizer's `zero_grad` is called, grad is manually set to zero, and therefore the gradients are not accumulated. This fix calls `zero_grad` at the correct interval so that grads are accumulated between updates. * fix `combined_report` not taking into account the new losses, which caused metrics calculation to be incorrect. * add test (including logging) * add test to make sure there is a final update even if the number of batches in the current update < update_frequency. * addresses issue: https://github.com/facebookresearch/mmf/issues/626 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/747 Reviewed By: apsdehal Differential Revision: D26034213 Pulled By: ytsheng fbshipit-source-id: fa0c7ad3566ce30f89b67996ac476a3d2dab779a 30 January 2021, 03:13:11 UTC
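The corrected schedule can be sketched with a toy loop, using plain numbers in place of gradient tensors (names are illustrative, not MMF's trainer code):

```python
def accumulate_and_step(batch_grads, update_frequency):
    """Toy model of the fixed schedule: gradients sum across
    update_frequency batches, and zero_grad runs only right after an
    optimizer step, never mid-accumulation."""
    grad = 0.0
    steps = []
    for i, g in enumerate(batch_grads, start=1):
        grad += g                      # backward() adds into .grad
        if i % update_frequency == 0:
            steps.append(grad)         # optimizer.step() sees the summed grad
            grad = 0.0                 # zero_grad() at the update boundary
    if grad:                           # final partial update when remaining
        steps.append(grad)             # batches < update_frequency
    return steps
```

The bug amounted to the zeroing happening inside the accumulation window, so each step saw only the last batch's gradient; the trailing branch mirrors the new test for a final partial update.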
77a931e [enhancement] replace usage of PathManager to use ioPath (#749) Summary: Replace code in file_io to use the new ioPath Path Manager instead of fvcore. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/749 Test Plan: - use existing `test_file_io.py` test suite Reviewed By: vedanuj Differential Revision: D26058703 Pulled By: brettallenyo fbshipit-source-id: 148aa502a2daed9c8ea44f38235dd19b168b8c54 27 January 2021, 23:50:40 UTC
b0e6c4a [feat] Add generic TorchvisionResnetImageEncoder in ImageEncoderFactory. (#728) Summary: This PR adds a generic image encoder named `TorchvisionResnetImageEncoder` in `mmf.modules.ImageEncoderFactory` — this encoder can instantiate any ResNet-like model from [`torchvision.models`](https://pytorch.org/docs/stable/torchvision/models.html) based on its name. The encoder returns spatial grid features same as torchvision. It can be instantiated using a minimal config as: ```yaml type: "torchvision_resnet" params: name: <name> pretrained: false ``` - Set `pretrained: true` for loading ImageNet-pretrained weights from torchvision. - Supported `<name>`s are: - `resnet18`, `resnet34`, `resnet50`, `resnet101`, `resnet152` - `resnext50_32x4d`, `resnext101_32x8d` - `wide_resnet50_2`, `wide_resnet101_2` ### Example ```python >>> import torch >>> from omegaconf import OmegaConf >>> from mmf.utils.build import build_encoder >>> >>> config = OmegaConf.load("encoder_config.yaml") >>> encoder = build_encoder(config) >>> images = torch.randn(32, 3, 224, 224) # shape: (b, c, h, w) >>> grid_feats = encoder(images) # shape: (b, 2048, 7, 7) ``` ### Backward Compatibility This PR maintains backward compatibility as it adds a new encoder without changing existing ones. However, it subsumes the functionality of [ResNet152ImageEncoder](https://github.com/facebookresearch/mmf/blob/d04914a7bbae28f33356e8b17113a81d6291a5dc/mmf/modules/encoders.py#L215), hence the latter could be later replaced and deprecated in favor of this addition. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/728 Reviewed By: ytsheng Differential Revision: D25991085 Pulled By: vedanuj fbshipit-source-id: 1c8f732ac43c0c81e6cf9cf41deca9dca3c39efe 26 January 2021, 00:43:15 UTC
cd77a4b [fix, test] Fix recall metric bug, add test (#746) Summary: - Fix infinite recursion error https://github.com/facebookresearch/mmf/issues/725 - Add back mistakenly removed code - Add test for recall metric at 1, 5, and 10 This is a response to bug https://github.com/facebookresearch/mmf/issues/725 I believe `process_ranks` may have been accidentally deleted in this commit, so I added it back: https://github.com/facebookresearch/mmf/commit/6b6f4fc221c46f8c55d55a06e110f3d8f9ca2417 I also added a testing file, which basically ranks all of the expected values in increasing order (so the last candidate is always the chosen one). Both the expected values and predicted values are (10, 100) tensors. As the index of the top-level array increases, I decrease the final value by 2, so the chosen candidates are 198 (maximum value 99*2), 191, 189, 187, ... I decrease it in this way so that each of the candidates becomes out of bounds for the increasing recall metrics. The rankings are `tensor([ 1, 4, 5, 6, 7, 8, 9, 10, 11, 12])` This makes the results vary for each metric: `RecallAt1` --> 1/10 = .1 `RecallAt5` --> 3/10 = .3 `RecallAt10` --> 8/10 = .8 I would love for people to take a look at `metrics.py` line `462` (`process_ranks`). There are print statements in there from the previous commit, but I do not see print statements anywhere else in the modules; should I delete these, leave them, or do we have a different logging style? Thanks! Pull Request resolved: https://github.com/facebookresearch/mmf/pull/746 Reviewed By: vedanuj Differential Revision: D26049706 Pulled By: brettallenyo fbshipit-source-id: 03546c76f9aed3a1c26b318eefb00fbb05d1297a 25 January 2021, 20:07:14 UTC
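With the rankings quoted in that test, the metric values follow from a recall@k definition like this (a sketch, not MMF's implementation in `metrics.py`):

```python
def recall_at_k(ranks, k):
    """Fraction of samples whose ground-truth candidate ranks within the top k
    (ranks are 1-based, as in the commit's example tensor)."""
    return sum(1 for r in ranks if r <= k) / len(ranks)
```

For ranks `[1, 4, 5, 6, 7, 8, 9, 10, 11, 12]` this yields 0.1, 0.3, and 0.8 at k = 1, 5, and 10, matching the values in the commit message.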
079f71d [fix] Allow loading checkpoint from a folder (#735) Summary: - Needed for e2e movie+mcan Pull Request resolved: https://github.com/facebookresearch/mmf/pull/735 Test Plan: - Tested locally with e2e movie+mcan handler - Adds tests for loading from a folder - Combines older file based test to main pretrained as they had sequential dep Reviewed By: ytsheng Differential Revision: D25936068 Pulled By: apsdehal fbshipit-source-id: 2c8989fcfaff2b1a2e56e4b97948809fc50bef49 21 January 2021, 01:58:09 UTC
0229af6 [enhancement] be more explicit in isort (#738) Summary: * print in ci the difference that isort would make * allow isort to be more explicit. * my pytorch lightning PR is experiencing some local/circleci inconsistencies for isort despite using the same version. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/738 Reviewed By: apsdehal Differential Revision: D25942131 Pulled By: ytsheng fbshipit-source-id: 67d33bbefca34229ee96cc815b52efdad913f68a 19 January 2021, 19:35:00 UTC
2848ba3 [feat] D2 backbone integration, Movie MCAN e2e (#700) Summary: [WIP] E2E with grid feats D2 (https://github.com/facebookresearch/mmf/commit/563a62d7bbbd89c857f711e8ca2670d33c129279) models Pull Request resolved: https://github.com/facebookresearch/mmf/pull/700 Reviewed By: apsdehal Differential Revision: D25886006 Pulled By: vedanuj fbshipit-source-id: 1fe3633dc8101ffc6f28ce48c434533d7a300d98 12 January 2021, 19:16:32 UTC
d04914a [docs] update the terms and concepts doc (#724) Summary: * Updated the terms and concepts doc to include the latest models and datasets * Fixed the links on this page * Renamed pythia to mmf * Added links to those models/datasets Pull Request resolved: https://github.com/facebookresearch/mmf/pull/724 Reviewed By: apsdehal Differential Revision: D25727819 Pulled By: ytsheng fbshipit-source-id: 5e4fb0d3d98a6222ddea84f536e4b0ae19a67618 07 January 2021, 05:55:23 UTC
e00ef78 [docs] dataset split from train feature docs (#726) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/726 Reviewed By: apsdehal Differential Revision: D25766616 Pulled By: ytsheng fbshipit-source-id: ae621b6a42b64d4387b4eb3ae5b9d2ad937cc8ec 05 January 2021, 01:23:27 UTC
51df5e5 [docs] Add LXMERT to model zoo doc (#719) Summary: Adding LXMERT to update model zoo documentation. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/719 Reviewed By: ytsheng Differential Revision: D25618345 Pulled By: vedanuj fbshipit-source-id: f87913a9ca6f52d1ac590ef6af01a2438771314c 18 December 2020, 19:26:49 UTC
421b3f3 Bump ini from 1.3.5 to 1.3.8 in /website (#713) Summary: Bumps [ini](https://github.com/isaacs/ini) from 1.3.5 to 1.3.8, picking up the fix that disallows invalid hazardous strings as section names (see the [full diff](https://github.com/isaacs/ini/compare/v1.3.5...v1.3.8)). Pull Request resolved: https://github.com/facebookresearch/mmf/pull/713 Reviewed By: ytsheng Differential Revision: D25615831 Pulled By: vedanuj fbshipit-source-id: b7d5f6e3564cd40c883846698fd0831b081bf616 18 December 2020, 19:26:48 UTC
a2314dd [staticdocs][codemod] Enable code snippets in docusaurus sites Summary: Lets you embed code in markdown with the following syntax: ```` ```js file=../../some-file.js start=start_marker end=end_marker ``` ```` Reviewed By: nikoant Differential Revision: D25460060 fbshipit-source-id: 29e242d6d896e7c8256371483d72104889b9daa9 10 December 2020, 13:55:36 UTC
14ba005 [codemod][staticdocs] Enable fine-grained internal content Summary: This preset includes a remark plugin to allow you to import markdown files from `.../fb/...` directories. Previously such imports would cause the OSS site to fail to build, but with this it's safe. This means you can embed internal-only parts of pages, by transcluding other markdown files, e.g: ``` import InternalSection from './fb/internal-section.md'; # Public Section some public stuff <InternalSection /> # Another Public Section ... ``` When building the public site variant, files imported from `../fb/..` will be skipped. More instructions [here](https://www.internalfb.com/intern/wiki/Static_Docs/Internal-Only_Content/) The preset delegates to `preset-classic` for everything else. Reviewed By: snarkmaster Differential Revision: D25370009 fbshipit-source-id: 36ea3000f0342a82749316cc8df19ce30ccf4537 08 December 2020, 12:19:50 UTC
ec5c8b5 [feat] allow passing in model's file path for from_pretrained (#677) Summary: - from_pretrained to allow a filepath - add test Pull Request resolved: https://github.com/facebookresearch/mmf/pull/677 Reviewed By: apsdehal Differential Revision: D25167229 Pulled By: ytsheng fbshipit-source-id: 2bc976e6239b4d6bd23b7e0b9f3d8495e9381d2d 08 December 2020, 02:21:10 UTC
f3d3da4 [fix, doc] add image feature extraction tutorial + small fixes (#673) Summary: - fix link errors in the documentation - add image feature extraction tutorials Pull Request resolved: https://github.com/facebookresearch/mmf/pull/673 Reviewed By: apsdehal Differential Revision: D24967971 Pulled By: ytsheng fbshipit-source-id: d7302fded6a3ed1dc2b8dd2fe65efea43c7bc11d 08 December 2020, 00:41:05 UTC
40b63e7 [fix] Typo in LMDBConversion (#699) Summary: Fix a typo when extracting image features from .mdb files to .npy files. **Fix:** On line 95, the image height was set to the image width, which is wrong: https://github.com/facebookresearch/mmf/blob/518a5a675586e4dc1b415a52a8a80c75edfc2960/tools/scripts/features/lmdb_conversion.py#L95-L96 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/699 Reviewed By: apsdehal Differential Revision: D25219613 Pulled By: vedanuj fbshipit-source-id: 3735ae890fcbba27cbd206086d582224866b051e 04 December 2020, 19:06:37 UTC
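The bug class is easy to restate in isolation (an illustrative helper over a numpy-style shape tuple; not the actual lmdb_conversion.py code):

```python
def image_dims(shape):
    """A numpy image array is (height, width, channels): shape[0] is the
    height and shape[1] the width. The typo assigned shape[1] (the width)
    to both fields."""
    return {"image_height": shape[0], "image_width": shape[1]}
```

For non-square images the swapped value silently corrupts every downstream bounding-box normalization, which is why the fix matters despite being one character.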
a221f06 [fix,feat] update the state_key based on module (#675) Summary: - Allow for module-wise state dict key updates - Make use of the `_register_load_state_dict_pre_hook` to update the keys of the state dict - Opted for this approach because recursion is already implemented in the load_state_dict function, so there is no need to re-implement it; better to make use of the PyTorch implementation. - A slightly cleaner fix compared to this one: [664](https://github.com/facebookresearch/mmf/pull/664) - Some documentation cleanup Pull Request resolved: https://github.com/facebookresearch/mmf/pull/675 Reviewed By: vedanuj Differential Revision: D24714619 Pulled By: ytsheng fbshipit-source-id: ccbf85c9aedae4bded3234d9b178e6b34241bbc3 03 December 2020, 00:01:06 UTC
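The effect of such a pre-hook can be sketched as a pure key-remapping function (names assumed; the real hook mutates the state dict in place before load_state_dict recurses into the module):

```python
def remap_state_dict_keys(state_dict, key_map):
    """Rename checkpoint keys by prefix, mimicking what a
    _register_load_state_dict_pre_hook callback does for its module."""
    remapped = {}
    for key, value in state_dict.items():
        for old_prefix, new_prefix in key_map.items():
            if key.startswith(old_prefix):
                key = new_prefix + key[len(old_prefix):]
                break
        remapped[key] = value
    return remapped
```

Because PyTorch invokes each module's pre-hooks during its own recursion, the remapping stays local to the module that registered it, which is the cleanliness the commit is after.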
6aec3eb [fix] Update README.md to fix typo (#698) Summary: * fix typo Pull Request resolved: https://github.com/facebookresearch/mmf/pull/698 Reviewed By: vedanuj Differential Revision: D25180465 Pulled By: ytsheng fbshipit-source-id: d53009df5118841f098316e9af6f4b4ab72ee500 30 November 2020, 19:26:40 UTC
518a5a6 [fix] Fix test_nucleus_sampling after pytorch multinomial fix (#684) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/684 Fix test_nucleus_sampling after pytorch multinomial fix Reviewed By: vedanuj Differential Revision: D24997596 fbshipit-source-id: 27b98d1289a36d151abf60e3a2ad0fb150da4139 19 November 2020, 19:21:18 UTC
8ff3f56 [fix] Upgrade actions workflows to use miniconda@v2 (#690) Summary: Github Actions workflows are broken due to deprecation of `set-env` and `add-path` commands. We need to upgrade to newer version of miniconda. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/690 Test Plan: Example broken tests : https://github.com/facebookresearch/mmf/actions/runs/368665121 Fixed in this PR. Reviewed By: apsdehal Differential Revision: D25052542 Pulled By: vedanuj fbshipit-source-id: 3909adba03669c8e22d7ed45bfe052153b0ea67b 18 November 2020, 20:13:57 UTC
2a0c567 [chores] Upgrade transformers dependency for mmf (#666) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/666 - Includes changes for compatibility with transformers 3.4.0 - BertLayerNorm is replaced with LayerNorm from torch.nn - Roberta classes are replicated in the latest version, so we need to monkey patch all of them. - BertEmbedding has a new position_id variable, due to which existing checkpoints have to be loaded with strict=False. This doesn't affect the models; instead of generating position ids it is just an added variable. In most of our models it's a no-op. - Additional properties need to be ignored in the `PreTrainedModel` class. Reviewed By: ytsheng Differential Revision: D24599639 fbshipit-source-id: ca915227fc8e52d7624a1caedb09c8c57764a3fb 11 November 2020, 18:51:12 UTC
d9ab064 [mmf][website] Use static docs helpers Summary: Switch to using the helper lib for better tracking, code modding, and removing redundancy. Docs: https://www.internalfb.com/intern/wiki/Static_Docs/Internal-Only_Content/ Reviewed By: mweststrate Differential Revision: D24704871 fbshipit-source-id: fc37ee18c7517eb3107a97e4d130420bca0d68e7 05 November 2020, 11:07:37 UTC
5cbda90 [feat,fix] Stronger checkpoint validation, fix missing keys (#664) Summary: - This PR aims to add stronger checkpoint validations to MMF - It will fail the run if there are missing keys in the checkpoint - It will log unexpected keys as warnings - Using this, we fix multiple missing keys in pythia, m4c Fixes https://github.com/facebookresearch/mmf/issues/661 #652 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/664 Reviewed By: vedanuj Differential Revision: D24636673 Pulled By: apsdehal fbshipit-source-id: d54b2495218984b6273276d4f5c86efba89e4d30 02 November 2020, 22:10:56 UTC
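The validation policy this commit adds (fail the run on missing keys, only warn on unexpected ones) can be sketched as a simplified stand-in for MMF's checkpoint loading:

```python
def validate_checkpoint(model_keys, checkpoint_keys):
    """Fail the run on missing keys; log unexpected keys as warnings."""
    missing = sorted(set(model_keys) - set(checkpoint_keys))
    unexpected = sorted(set(checkpoint_keys) - set(model_keys))
    if missing:
        raise RuntimeError(f"Missing keys in checkpoint: {missing}")
    for key in unexpected:
        print(f"Warning: unexpected key in checkpoint: {key}")
    return unexpected
```

The asymmetry is deliberate: a missing key means part of the model silently keeps random initialization, while an unexpected key is usually just checkpoint cruft.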
3ae71ec [feat] Add optimizer state sharding (ZeRO) (#636) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/636 Adding ZeRO optimizer state sharding to MMF from the [fairscale](https://github.com/facebookresearch/fairscale) library. Added to tutorials. It can be used with this option: `optimizer.enable_state_sharding=True` Reviewed By: apsdehal Differential Revision: D24317858 fbshipit-source-id: d3b03dd2cbdfc3be7beaa0ee441a62151813df33 30 October 2020, 03:02:03 UTC
d2f00bf [fix] use torch current device rather than using registry (#659) Summary: * Minor fix so that we remove the registry current device dependency, and instead use torch. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/659 Reviewed By: apsdehal Differential Revision: D24542722 Pulled By: ytsheng fbshipit-source-id: 29bcd4318e319a1bc29b4bf4549830b441a0008b 30 October 2020, 00:15:24 UTC
a82fcef [enhancement] Add freeze backend option to MMFT (#642) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/642 Add option to freeze backend of MMF Transformer model Reviewed By: apsdehal Differential Revision: D24381850 fbshipit-source-id: be550e25cececa3603208c6bd8df7aeabae252ae 24 October 2020, 19:53:29 UTC
540bd91 [chores] Move mmf workflows to use manifold (#656) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/656 - Remove support for Gluster, move all flows to use manifold - Enable proxy in fbl workflows - Remove unused launch params - Fix workflow naming Reviewed By: BruceChaun Differential Revision: D24471398 fbshipit-source-id: c5811ad929af75cf9fca82ba808e6440506280e2 22 October 2020, 18:10:57 UTC
f658adb [fix] Fix text encoder output if the output is tensor (#655) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/655 Some text encoder outputs return a single tensor (say with dimension `batch_size x out_dim`). Additional condition checks are added to make sure this case is not affected. Reviewed By: vedanuj Differential Revision: D24456401 fbshipit-source-id: 09986304a1bd9cecd7dc743395829e5be7497c9b 22 October 2020, 17:21:46 UTC
0206f23 [fix] Fix text encoder output for BertModelJIT (#649) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/649 Output of BertModelJit is a 3 tuple. Fix to pick the pooled output for the fusion models. Behavior changed after we fixed `bert-*` models to use `BertModelJit` in D24038872 (https://github.com/facebookresearch/mmf/commit/5be74f067382272d6f6729156659d7f5a00be187) Reviewed By: n-zhang Differential Revision: D24440904 fbshipit-source-id: a9c1e5dd7f58ef1dd1d569308844a3ab7d1e64c3 21 October 2020, 06:21:05 UTC
70f36eb [fix] Allow struct=True configs for MMFT encoders,pretrained Pythia (#644) Summary: - If a modality is missing the encoder key, MMFT won't break after this recent fix - This allows using configs without specifying the encoder key even when they are in struct mode, which is what MMF gives when run from the command line Pull Request resolved: https://github.com/facebookresearch/mmf/pull/644 Test Plan: Tested with audio video MMFT Reviewed By: vedanuj Differential Revision: D24405513 Pulled By: apsdehal fbshipit-source-id: 3a6e588264bd39426178a8f16e582262a1d7cb1d 21 October 2020, 06:11:00 UTC
d8f9592 [fix] Change format of concatenating image and text as bert input in MMBT (#622) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/622 The previous version of the concatenation of image and text tokens had the format `<cls> image_token <sep> <cls> text_tokens <sep> <pad>`, but there was a bug in the code -- the `<sep>` token behind `image_token` might be `<pad>` because it simply used the last token in `input_ids`. We can easily see `bert_processor` converts input text into tokens with the format `<cls> tokens <sep> <pad>`. Apart from resolving the aforementioned issue, this diff also converts to a simpler format `<cls> image_token <sep> text_tokens <sep> <pad>`. So based on the previous version, the modifications are 1. Extract the last non-masked token, which is `<sep>`, as `modal_end_token` 2. Remove the `<cls>` token at the beginning of the text input ids and append the last token at the end to ensure the same sequence length 3. Fix `input_mask` by truncating the first token and appending 0 at the end Reviewed By: apsdehal Differential Revision: D24164033 fbshipit-source-id: 04246d64ebd483e6809721245fac9dc0ee20cff8 20 October 2020, 04:30:14 UTC
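The three modifications described in the entry above can be sketched with plain lists, using token strings in place of ids. This is an illustration of the target format `<cls> image_token <sep> text_tokens <sep> <pad>`, not MMBT's actual code:

```python
def concat_modal_and_text(image_tokens, text_ids, text_mask):
    """Build a `<cls> image <sep> text <sep> <pad>...` sequence from a
    bert_processor-style text sequence `<cls> text <sep> <pad>...`."""
    # Drop the text's leading <cls>; pad by one at the end to keep length.
    text_wo_cls = text_ids[1:] + ["<pad>"]
    # The modal end token is the text <sep>, not whatever padding happened
    # to sit at the end of input_ids (the bug the entry above describes).
    sequence = ["<cls>"] + image_tokens + ["<sep>"] + text_wo_cls
    # Shift the mask the same way: truncate the first entry, append 0.
    mask = [1] * (1 + len(image_tokens) + 1) + text_mask[1:] + [0]
    return sequence, mask

seq, mask = concat_modal_and_text(
    ["img"], ["<cls>", "hello", "world", "<sep>", "<pad>"], [1, 1, 1, 1, 0]
)
```

The resulting sequence and mask stay the same length as the inputs combined, and the `<sep>` after the image token is always a real separator rather than a stray pad token.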
9af5b2c [enhancement] Support more modalities in MMFT (#576) Summary: - Generalizes MMFT to more modalities - Modality agnostic initialization, checks and forward for other modalities Pull Request resolved: https://github.com/facebookresearch/mmf/pull/576 Test Plan: Tested on a video-audio dataset Reviewed By: vedanuj Differential Revision: D23969975 Pulled By: apsdehal fbshipit-source-id: 9d45a2d9c50563433c230b6a9c59f38af7deaa59 19 October 2020, 19:33:59 UTC
97f3c7d [fix] Fix in_dim for movie_mcan model (#638) Summary: `in_dim` needs to be added to the config after the change in PR https://github.com/facebookresearch/mmf/issues/564. Fixing that. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/638 Reviewed By: apsdehal Differential Revision: D24388070 Pulled By: vedanuj fbshipit-source-id: dab1c2ce14ea168b8932193950bdae30ebbbf9d7 19 October 2020, 19:23:34 UTC
c0c834e [feat] Make top level loss classes torchscriptable (#631) Summary: - We weren't doing init_losses in the tests so we didn't notice this - Changes make top level loss function torchscriptable so that if losses are empty, this doesn't cause any issues Pull Request resolved: https://github.com/facebookresearch/mmf/pull/631 Test Plan: Enable init_losses in mmbt tests Reviewed By: vedanuj Differential Revision: D24328059 Pulled By: apsdehal fbshipit-source-id: 063f4938e7475f91904149771824c77eb2d22509 19 October 2020, 08:39:30 UTC
ae2c8e1 [fix] Issues with encoder update, add tests (#634) Summary: - I introduced some issues with the recent encoder update. This PR aims to fix that. - This PR also adds tests to make sure this doesn't happen in future. - Tests only cover initialization as of now; they can be built upon in future. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/634 Test Plan: Tests have been added Reviewed By: vedanuj Differential Revision: D24356702 Pulled By: apsdehal fbshipit-source-id: 815a68096db7c802178fdbe28aa6c22601a2c999 19 October 2020, 07:39:51 UTC
dcc1a95 [feat] Add fp16 support to MMFTrainer (#571) Summary: This PR adds support for fp16 to MMF. Extensive benchmarking on different models and different datasets shows that fp16 runs are comparable in performance but quite faster on longer runs. ## Task Items - [x] Save and load state_dict of scaler into/from checkpoint - [x] fp16 on validation/other loops as well - [x] grad scaling on clip grad - [x] tests - [x] benchmark on (i) TextVQA/M4C (ii) VisualBERT/VQA2 (iii) MMBT/HM - [x] docs ## Benchmarking numbers ### MMBT on Hateful Memes Reported metrics are ROC-AUC on validation set and time taken to complete the run | fp16 | Enabled | Disabled | |----------------------|----------------|----------------| | bs128.lr5e-5.mu22000 | 69.72/3h24m28s | 70.14/3h54m55s | | bs64.lr5e-5.mu22000 | 68.47/2h34m38s | 66.56/2h22m49s | ### VisualBERT on VQA2 Reported metrics are VQA accuracy on validation set and time taken to complete the run. | fp16 | Enabled | Disabled | |----------------------|-----------------|----------------| | bs128.lr5e-5.mu88000 | 66.08/05h09m22s | 65.89/7h28m46s | | bs64.lr5e-5.mu88000 | 66.82/04h14m10s | 66.69/6h04m30s | ### M4C on TextVQA Reported metrics are TextVQA accuracy on validation set and time taken to complete the run | fp16 | Enabled | Disabled | |---------------|----------------|-----------------| | bs128.mu24000 | 38.7/01h35m10s | 39.01/01h42m06s | Pull Request resolved: https://github.com/facebookresearch/mmf/pull/571 Reviewed By: vedanuj Differential Revision: D24240531 Pulled By: apsdehal fbshipit-source-id: c704602dea7f229ebee129841477ba2add74ed72 19 October 2020, 07:24:32 UTC
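The fp16 support above is built on gradient scaling (PyTorch's `torch.cuda.amp`): the loss is multiplied by a scale factor before backward, and the scale is grown after a run of overflow-free steps and shrunk (with the step skipped) on overflow. A toy, pure-Python sketch of that dynamic loss-scale logic — mimicking the behavior, not the API, of `torch.cuda.amp.GradScaler`; the constants are illustrative:

```python
class ToyGradScaler:
    """Illustrative dynamic loss scaling (concept sketch, not torch's API)."""

    def __init__(self, init_scale=2.0 ** 16, growth_factor=2.0,
                 backoff_factor=0.5, growth_interval=4):
        self.scale = init_scale
        self.growth_factor = growth_factor
        self.backoff_factor = backoff_factor
        self.growth_interval = growth_interval
        self._good_steps = 0

    def update(self, found_inf):
        """Return True if the optimizer step should be taken."""
        if found_inf:
            # Overflow in scaled grads: skip the step, shrink the scale.
            self.scale *= self.backoff_factor
            self._good_steps = 0
            return False
        self._good_steps += 1
        if self._good_steps == self.growth_interval:
            # A run of clean steps: try a larger scale for better precision.
            self.scale *= self.growth_factor
            self._good_steps = 0
        return True

scaler = ToyGradScaler(init_scale=8.0, growth_interval=2)
history = [scaler.update(found_inf=f) for f in [False, False, True, False]]
```

Gradient clipping under this scheme has to operate on unscaled gradients, which is why the task list above calls out "grad scaling on clip grad" as a separate item.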
6980fe0 [fix] Parameter list regression in build_optimizers (#630) Summary: - With the recent change of https://github.com/facebookresearch/mmf/issues/580, there is a regression for models which don't have `get_optimizer_parameters` or don't return param groups. https://github.com/facebookresearch/mmf/issues/580 specifically expects param groups to be present. - This PR fixes this regression and adds tests for optimizers so that this doesn't happen again Pull Request resolved: https://github.com/facebookresearch/mmf/pull/630 Test Plan: Tests added for two types of parameter groups returned from the model Reviewed By: ronghanghu Differential Revision: D24325727 Pulled By: apsdehal fbshipit-source-id: 9245408b19323ee6cb2adc1f1eed9182841f5dc4 16 October 2020, 01:53:43 UTC
fa195d2 [feat] Introduce QC tests for missing init and empty folders (#616) Summary: - Missing inits cause problems with user dir and packaging - The tests check for missing init in folders with python files - Also, checks for empty folders - Further, checks on all MMF folders can be added easily Pull Request resolved: https://github.com/facebookresearch/mmf/pull/616 Test Plan: Tested locally and fixed the issues found TODO: Need to test internally as well. Reviewed By: vedanuj Differential Revision: D24200765 Pulled By: apsdehal fbshipit-source-id: 40a12f3be0c48a4279f3946c3f008fa4f8dfcb85 15 October 2020, 22:20:55 UTC
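A check like the one described above — flag folders that contain Python files but no `__init__.py` — can be sketched with `os.walk`. This is an illustration of the idea, not MMF's actual QC test:

```python
import os
import tempfile

def folders_missing_init(root):
    """Return directories under root that hold .py files but no __init__.py."""
    missing = []
    for dirpath, _dirnames, filenames in os.walk(root):
        has_py = any(f.endswith(".py") for f in filenames)
        if has_py and "__init__.py" not in filenames:
            missing.append(dirpath)
    return sorted(missing)

# Demo on a throwaway tree: pkg/ has a .py file but no __init__.py.
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "pkg")
os.makedirs(pkg)
open(os.path.join(pkg, "mod.py"), "w").close()
missing = folders_missing_init(tmp)
```

A unit test can then simply assert that the list is empty over the repository root, which turns a silent packaging problem into a loud CI failure.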
5be74f0 [fix] Allow Transformer layers to be used without monkey patching Summary: This involves using BertModelJit to initialize the pretrained model instead of normal one and then also adding init functions of those methods so that all layers are jit compatible. Reviewed By: vedanuj Differential Revision: D24038872 fbshipit-source-id: c573f6b98754ce28538fe770477a8679b736e9d4 15 October 2020, 20:04:01 UTC
bd1822d [chores, refactor] Enable torchscript tests internally, refactor (#619) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/619 - Enable torchscript tests internally by enabling proxy - Remove code duplication in torchscript tests Reviewed By: apsdehal Differential Revision: D24213541 fbshipit-source-id: 48f6e8265fcccdc7a1c5b4d40c1075a5d22fae0c 15 October 2020, 08:01:09 UTC
88a836a [feat] Add encoder registry and differentiate encoder factory (#628) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/628 This diff aims to smoothen the process of adding new encoders to MMF. This is also specifically needed for FB internal encoders, which are hard to register while not being part of OSS without a registry. This will also help with keeping sanity in MMFTransformer and the generic modalities diff. This diff also converts the older feature encoders to the factory, which is what they technically are. This is a backwards-compatible change as the build API still looks the same. Though if someone is directly importing the encoder, this will break their code. But this refactor is much needed at this time. We also add a `build_encoder` function which handles both structured config and normal config patterns. We hope to make it more streamlined in future when we make efforts towards Hydra integration. Reviewed By: vedanuj Differential Revision: D24300517 fbshipit-source-id: 3bc68de398ad397fdf8fa5cf39a261691cba31d2 15 October 2020, 06:00:15 UTC
f35b266 [chores] Remove cpu and windows build from CircleCI tests (#618) Summary: - Remove CPU tests and Windows Build test from CircleCI as they are already covered in Actions - Some CircleCI CPU tests were flaky - These tests also require high resources(class `large`) that causes forked repo builds to fail Pull Request resolved: https://github.com/facebookresearch/mmf/pull/618 Test Plan: Check if tests run properly Reviewed By: apsdehal Differential Revision: D24227244 Pulled By: vedanuj fbshipit-source-id: dbac96d4abfc57468b44e211a62523737665d336 15 October 2020, 01:57:40 UTC
ee9cade [refactor] Refactor vilbert to make it compatible with torchscript (#591) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/591 Refactor to make vilbert compatible with torchscript. Similar changes are done in [our implementation of Vilbert](https://www.internalfb.com/intern/diffusion/FBS/browsefile/master/fbcode/assistant/mmu/multimodal_bert/vilbert/model.py). Reference diff: D18892981 Reviewed By: vedanuj Differential Revision: D23923842 fbshipit-source-id: a7c3736408e6da828373b1c7a4db94189169c83b 14 October 2020, 21:17:09 UTC
058e713 [staticdocs] Update docusaurus plugin for Static Docs projects Summary: Plugin update is required to fix hit counter and auto-redirect from public site on Chrome 85+. It will also enable auto-redirect from staticdocs.thefacebook.com to internalfb.com/intern/staticdocs to ensure intern sidebar is visible when documentation is browsed internally. Reviewed By: dkgi Differential Revision: D24281980 fbshipit-source-id: 2614b4228d2df164981cee437952058684575a23 14 October 2020, 11:46:59 UTC
7ce17a5 [feat] Enable torchscript on full VisualBERT model (#624) Summary: - This removes opinions on scripting individual modules rather than the full model. This will also enable us to keep the signature of Dict[str, Tensor] the same for all of the MMF models - Modifies code flow according to scripting model and pretraining model conditions - Adds helper functions around getattr - Modifies the test to play nice with the new API, which looks better Pull Request resolved: https://github.com/facebookresearch/mmf/pull/624 Test Plan: Modifies VisualBERT tests for TorchScript to script and test the whole model Reviewed By: vedanuj Differential Revision: D24267966 Pulled By: apsdehal fbshipit-source-id: c3ffa2934d64bf58739df9c4910fddd0a89684c6 14 October 2020, 01:11:16 UTC
78d908d [fix] Missing in_dim to pythia image encoder (#625) Summary: This is required and should have been passed in original commit to modify to config. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/625 Test Plan: Tested on pythia model Reviewed By: vedanuj Differential Revision: D24279546 Pulled By: apsdehal fbshipit-source-id: 711a62930f638b3c8b8e2af93aa561f6c8f8e648 13 October 2020, 18:27:35 UTC
6f14d31 [feat] Add init dataclasses for mmbt and encoders (#565) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/565 As step one of FAIM integration, we allow building our models from config so that the models are purely decoupled from configuration and users know what the model expects. We first do this for the MMBT model. - Adds configs for MMBT and respective encoders. - Also adds from_params method to the BaseModel class so that args initialization is possible - Updates build method to support passing of direct config object - Add Config class to BaseModel as well and update typings There is an issue with OmegaConf that doesn't let us use Union in structured configs. Take a look at https://github.com/omry/omegaconf/issues/144 Reviewed By: vedanuj Differential Revision: D23699688 fbshipit-source-id: 37020346ce820207eb41b7bd43c8fba579b436d5 12 October 2020, 21:29:40 UTC
c8264bd [refactor] Use config object for ImageFeatureEncoder init (#564) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/564 There has been a mismatch between ImageFeatureEncoder and ImageEncoder signatures which doesn't allow us to create same config types for them even though that's how they are used. This is confusing and should be same. This diff aims to fix this discrepancy. Reviewed By: vedanuj Differential Revision: D23712213 fbshipit-source-id: b6d3ad56a82727c08f3a53b8ee2b77e6af9e7a17 12 October 2020, 20:56:16 UTC
a5fc02b [feat] Add masked visual genome dataset (#602) Summary: - Adds visual genome masked dataset - Fixes https://github.com/facebookresearch/mmf/issues/566, for pretraining on visual genome - Fix for loading samples without `question_tokens` - Fix for loading visual genome feature files Pull Request resolved: https://github.com/facebookresearch/mmf/pull/602 Test Plan: Test run lxmert model with visual genome Reviewed By: apsdehal Differential Revision: D24108712 Pulled By: vedanuj fbshipit-source-id: 5ae041a2ce4dcdf914fdecc19466729d445237da 12 October 2020, 02:27:50 UTC
e83f626 [fix] Fix optimizer params to include all modules for transformer models (#620) Summary: Recent changes in PR https://github.com/facebookresearch/mmf/issues/568 moved the `pooler` layer outside the classifier, and it was not added to the optimizer params group. The check added in https://github.com/facebookresearch/mmf/issues/580 helped to catch this. Thanks ronghanghu ! - PR refactors the code in `get_optimizer_parameters_for_bert` - **All** layers other than the classifier will have LR of `(lr * finetune_lr_multiplier)` when the model head is not of `pretraining` type. - Remove unnecessary LR setting when `finetune_lr_multiplier == 1` Pull Request resolved: https://github.com/facebookresearch/mmf/pull/620 Test Plan: - Checked with MMF Transformer, MMBT, VisualBERT Reviewed By: apsdehal Differential Revision: D24237064 Pulled By: vedanuj fbshipit-source-id: 70f1321d6865bde2f636c46b9d7c00a7bf6667ab 11 October 2020, 02:29:53 UTC
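The scheme described above — every layer except the classifier gets `lr * finetune_lr_multiplier` — can be sketched as plain parameter grouping over names. The function and the parameter names below are hypothetical; this is not `get_optimizer_parameters_for_bert` itself:

```python
def bert_param_groups(param_names, lr, finetune_lr_multiplier):
    """Split params into a classifier group at the base lr and a
    backbone group (everything else, including the pooler) at
    lr * finetune_lr_multiplier."""
    classifier = [n for n in param_names if n.startswith("classifier.")]
    backbone = [n for n in param_names if not n.startswith("classifier.")]
    if finetune_lr_multiplier == 1:
        # No need for a separate backbone lr; one group covers everything.
        return [{"params": classifier + backbone, "lr": lr}]
    return [
        {"params": classifier, "lr": lr},
        {"params": backbone, "lr": lr * finetune_lr_multiplier},
    ]

groups = bert_param_groups(
    ["bert.embed.w", "pooler.dense.w", "classifier.out.w"],
    lr=5e-5,
    finetune_lr_multiplier=0.1,
)
```

Note how the pooler lands in the backbone group: grouping by "everything not in the classifier" is what keeps a layer moved out of the classifier from silently dropping out of the optimizer.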
bad7a4d [feat] Add DoCNN as text encoder in MMF Summary: Add DoCNN in TextEncoder. The embedding matrix uses the one in the pytext transformer, and supports customizing `conv_filters` and `mlp_sizes` in the yaml config. # Performance | concat_bert model with text encoder | test/ap | test/accuracy | | docnn+pytext | **0.8998** | **0.8407** | | docnn | 0.8930 | 0.8346 | | xlm_roberta_base | 0.8843 | 0.8336 | Reviewed By: apsdehal Differential Revision: D23933438 fbshipit-source-id: 1cda7920e737d3342a6abf0afa54ca26660496d3 09 October 2020, 19:28:06 UTC
14c0e2f [feat] check unused model parameters in optimizer (#580) Summary: This PR adds an optimizer option on whether to allow some of the model's parameters not to be used by the optimizer. Default is false to guard against missing parameters due to implementation errors in a model's `get_optimizer_parameters`. Previously `get_optimizer_parameters` was error-prone, as users needed to remember to update it whenever adding a new submodule, and the error was hard to spot when they forgot to do so. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/580 Test Plan: tested locally w/ VisualBERT, against correctly and incorrectly implemented `get_optimizer_parameters`. Reviewed By: vedanuj Differential Revision: D24201583 Pulled By: ronghanghu fbshipit-source-id: 139f9837e7bc8f5276e33c12bee7f3061f3fa6a5 09 October 2020, 02:45:14 UTC
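The guard above boils down to comparing the model's parameters against those the optimizer will actually see. A minimal sketch over parameter names — MMF operates on parameter tensors, so names here are for illustration only:

```python
def check_unused_parameters(model_param_names, optimizer_groups,
                            allow_unused=False):
    """Raise if some model parameters never reach the optimizer,
    unless allow_unused is explicitly enabled."""
    covered = {n for group in optimizer_groups for n in group["params"]}
    unused = sorted(set(model_param_names) - covered)
    if unused and not allow_unused:
        raise ValueError(
            "Model parameters missing from optimizer "
            f"(did you update get_optimizer_parameters?): {unused}"
        )
    return unused

# A pooler that get_optimizer_parameters forgot to include:
unused = check_unused_parameters(
    ["encoder.w", "pooler.w"],
    [{"params": ["encoder.w"]}],
    allow_unused=True,
)
```

Defaulting `allow_unused` to false matches the entry above: a forgotten submodule fails loudly at optimizer construction instead of silently training with frozen weights.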
6074c77 [feat] Add SentencePiece pytext tokenizer processor (#573) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/573 - Adds pytext SPM tokenizers to be used along with pytext models - Minor modifications to bert_processors to reuse similar parts in other tokenizers Reviewed By: apsdehal Differential Revision: D23782402 fbshipit-source-id: c1c6750cc55a8ab4eb477e9a463990ac9d075014 09 October 2020, 02:37:24 UTC
a769bac [fixup] don't init early_stopping from ckpt when resetting counts (#605) Summary: Since the docs in mmf/configs/defaults.yaml on `reset.count` mention that "All counts such as best_update, current_iteration etc will be reset", the early stopping criteria (`best_iteration` and `best_metric_value`) should not be initialized from the ckpt when `reset.count` is on. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/605 Test Plan: tested internally on a DETR-related model when initializing from an earlier model's checkpoint. Reviewed By: vedanuj, apsdehal Differential Revision: D24194277 Pulled By: ronghanghu fbshipit-source-id: 472b5b1b1c17b94639809026f9f3d087285986d7 09 October 2020, 02:31:35 UTC
0d0eaad [fix] Missing init in backends folder (#615) Summary: - Fixes missing imports problem for user dir Pull Request resolved: https://github.com/facebookresearch/mmf/pull/615 Test Plan: Tested locally Reviewed By: vedanuj Differential Revision: D24184482 Pulled By: apsdehal fbshipit-source-id: 52353753ec28147f640f7092a89c359928494daa 08 October 2020, 19:03:41 UTC
132116e [fix] Fix movie mcan feature reader (#604) Summary: - Fixes https://github.com/facebookresearch/mmf/issues/594 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/604 Reviewed By: apsdehal Differential Revision: D24113348 Pulled By: vedanuj fbshipit-source-id: 1c29c816e697da02370128a40a4a20b25786b25c 08 October 2020, 07:58:31 UTC
33f16ba [fix] Discrepancy in tests isort b/w oss and fbcode (#614) Summary: - fbcode treats tests as third party - This fixes the sync Pull Request resolved: https://github.com/facebookresearch/mmf/pull/614 Test Plan: Tested and compared sorting in the two Reviewed By: vedanuj Differential Revision: D24171915 Pulled By: apsdehal fbshipit-source-id: 08fda10b1f7c4ca11925b70e89f70cc5fdbe1236 07 October 2020, 22:27:54 UTC
4877374 [chores] Add highlighting to the bibtex entry in README (#603) Summary: Add highlighting to the BibTeX entry in the README. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/603 Reviewed By: vedanuj Differential Revision: D24107511 Pulled By: apsdehal fbshipit-source-id: 6d5750876a2b441bf3dc5d860b4715f734ba1a3b 06 October 2020, 05:43:58 UTC
0d80bc8 [fix,enhancement] Fix CB tutorial and fail on empty loss dict (#585) Summary: - Fix a typo in ConcatBERT tutorial. Modify it to use key "concat_bert_tutorial" to avoid conflicts with existing concat bert inside MMF. - Fail if loss dict is empty in backward with an informative error message - Fixes https://github.com/facebookresearch/mmf/issues/582 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/585 Test Plan: Tested locally Reviewed By: vedanuj Differential Revision: D24068883 Pulled By: apsdehal fbshipit-source-id: 325ada0b558e83e0c9e10051c00fa8750fc2e0cb 06 October 2020, 04:00:36 UTC
f6bb22d [fix] Issue with worker spawn under user dir (#608) Summary: - importlib should always import the module regardless of whether it is in sys.modules or not - This should fix the missing import error that happens in user dir setting for num_workers > 0. - Fixes https://github.com/facebookresearch/mmf/issues/600 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/608 Test Plan: Tested on user dir with disney and e2e tests planned in separate PR Reviewed By: vedanuj Differential Revision: D24125181 Pulled By: apsdehal fbshipit-source-id: 924f98da9206ccec71b5abbc29b3ec2074d08f68 06 October 2020, 03:54:23 UTC
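The fix above hinges on importlib semantics: `importlib.import_module` is a no-op for modules already in `sys.modules`, but a freshly spawned dataloader worker starts with a clean interpreter, so the user module must actually be imported there rather than assumed present. A sketch of the safe pattern (the helper name is hypothetical):

```python
import importlib
import sys

def ensure_user_module(module_name):
    """Import module_name unconditionally, relying on sys.modules as the cache.

    Checking `module_name in sys.modules` first and skipping the import would
    leave the module missing in a spawned worker process; always calling
    import_module is safe because it returns the cached module when one exists.
    """
    return importlib.import_module(module_name)

mod = ensure_user_module("json")  # stands in for a user-dir module
```

Calling this unconditionally at worker startup means `num_workers > 0` behaves the same as the main process, whether or not the module happened to be imported before the fork/spawn.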
785607a [feat] Added the localized narratives dataset (#563) Summary: - Add the masked language modeling task for localized narratives using just caption data. - Create Flickr30k and Coco2017 dataset builders that share the same mixin as the localized narratives dataset Pull Request resolved: https://github.com/facebookresearch/mmf/pull/563 Reviewed By: vedanuj Differential Revision: D24018522 Pulled By: ytsheng fbshipit-source-id: 862d82475a089b9a4e1069f41e51459f816c4f99 05 October 2020, 17:42:28 UTC
2bc58b5 [feat] Add torchscriptable version for MMF Transformer (#568) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/568 - Modifications to MMF transformer `preprocess_sample` to be scriptable friendly. - Refactor out backends : Added a BaseBackend abstract class. - Added a new scriptable backend for MMF Transformer that supports only Huggingface Bert, Roberta and XLMR transformer models. Here we are implementing a version of MMF Transformer that supports torchscriptable layers and works only for BERT, RoBERTa and XLMR models - Add scriptable `RobertaEmbeddingsJIT` - Add tests for MMF Transformer Reviewed By: apsdehal Differential Revision: D23569352 fbshipit-source-id: c6527beb1f8b344eb14650d0560a71584c828192 05 October 2020, 17:35:07 UTC
07b786d [fix] correct editor_config url (#592) Summary: Correct the editor_config URL. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/592 Reviewed By: vedanuj Differential Revision: D24083600 Pulled By: ytsheng fbshipit-source-id: 65ed98019e7a232f74e1bb9f92dd73ba9d5f24b4 02 October 2020, 20:23:13 UTC
0cb5bcf [chores] Update Hateful Memes stuff for Phase 2 (#595) Summary: - Updates convert command - Updates features - Updates annotations to point to unseen - Updates docs to mention how to run on seen set - Updates version number of MMF (Time to release a new version, yay) Pull Request resolved: https://github.com/facebookresearch/mmf/pull/595 Test Plan: Tested the runs and everything locally Fixes https://github.com/facebookresearch/mmf/issues/593 Reviewed By: vedanuj Differential Revision: D24069499 Pulled By: apsdehal fbshipit-source-id: 61e8dc4ea3c08327b9492651a242984556b2ea07 02 October 2020, 16:39:26 UTC
1f14f8c [codemod][staticdocs] Upgrade internaldocs plugin Summary: Codemod generated using: ``` cd fbsource/xplat/staticdocs ./update-all-users.sh '0.4.0' ``` Reviewed By: colriot Differential Revision: D24077905 fbshipit-source-id: 54e322dcb2078c65547f212c723aa227fa1b6014 02 October 2020, 15:53:12 UTC
7bd0a8f [JIT] Enable @unused syntax for ignoring properties (#45261) Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/45261 **Summary** This commit enables `unused` syntax for ignoring properties. Ignoring properties is more intuitive with this feature enabled. `ignore` is not supported because class type properties cannot be executed in Python (because they exist only as TorchScript types) like an `ignored` function, and module properties that cannot be scripted are not added to the `ScriptModule` wrapper so that they may execute in Python. **Test Plan** This commit updates the existing unit tests for class type and module properties to test properties ignored using `unused`. Test Plan: Imported from OSS Reviewed By: navahgar, Krovatkin, mannatsingh Differential Revision: D23971881 Pulled By: SplitInfinity fbshipit-source-id: 8d3cc1bbede7753d6b6f416619e4660c56311d33 29 September 2020, 17:21:34 UTC
709a696 [fix] memory leak in loss reporting (#560) Summary: Previously `meter_update_dict.update(reduced_loss_dict)` was storing the loss with the gradient and computation graph. This PR should fix various GPU and CPU OOM problems. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/560 Reviewed By: apsdehal Differential Revision: D23683638 Pulled By: vedanuj fbshipit-source-id: 733842436129426861fbab5bde3ce3b649509570 28 September 2020, 17:45:00 UTC
ba437e6 [feat, fix] Modify MMBT torchscript and other fixes for PT 1.7 torchscript (#581) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/581 - Modify MMBT to torchscript full model - Move `replace_with_jit()` to MMBTBase - Changes to `property` methods to make them scriptable. - Changes to base model Reviewed By: apsdehal Differential Revision: D23927970 fbshipit-source-id: b5da316c61597fd8ec9dde5e86ba628b6f1c65a0 25 September 2020, 23:21:32 UTC
259b81b [fix] Update CLEVR urls to latest for now to fix internal task (#579) Summary: Update URL. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/579 Reviewed By: vedanuj Differential Revision: D23907427 Pulled By: apsdehal fbshipit-source-id: 40367c4d628700c975563c07d22b8dee151bd980 24 September 2020, 16:43:29 UTC
50c325a [refactor] point to the right maskrcnn_benchmark (#555) Summary: * Update the comment to point to the correct maskrcnn-benchmark. There are many available on GitHub; this one is the correct one to use. * Added the X152 model * Tested that both models can be loaded with the updated repo. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/555 Reviewed By: vedanuj Differential Revision: D23835588 Pulled By: ytsheng fbshipit-source-id: d500588f6a14ff45f55bf12937f7f5e69c35067d 23 September 2020, 07:01:16 UTC
7882228 [fix] Black formatting in sample (#572) Summary: - Accidentally forgot to run formatter on sample.py - This fixes it Pull Request resolved: https://github.com/facebookresearch/mmf/pull/572 Reviewed By: vedanuj Differential Revision: D23790238 Pulled By: apsdehal fbshipit-source-id: 3fd4314e7f6a451dba2ac6dbd464570c2c03674e 18 September 2020, 23:58:45 UTC
ee133a0 [feat] Add prediction support for internal datasets Summary: Custom prediction processors can now be specified for the internal datasets. Specify your custom processor that takes in a report and returns a list to be dumped into json. The default processor has been set to the ArgMax prediction processor Reviewed By: vedanuj Differential Revision: D23535841 fbshipit-source-id: 43d9265b821b9a191c480a42d7040ff4afafd7e8 18 September 2020, 17:32:14 UTC
404905d [fix] Add missing if statement to check visual bert bypass_transformer (#570) Summary: Fixes https://github.com/facebookresearch/mmf/issues/569 Pull Request resolved: https://github.com/facebookresearch/mmf/pull/570 Reviewed By: apsdehal Differential Revision: D23777134 Pulled By: vedanuj fbshipit-source-id: b59be010e53be689c4eb9e0283e2d4ea51ec9189 18 September 2020, 04:21:41 UTC
2f33738 [refactor] Refactor MMF Transformer embeddings to support scriptability (#551) Summary: - Changes to support torchscript for mmf transformer derivatives - Changes to `MMFTransformerEmbeddings` by modifying the different modality embeddings to `nn.ModuleList` to make it scriptable. - Some changes to the forward function Note: It is difficult to make MMF transformer scriptable as a whole since its base can be any type of transformer and making that scriptable is dependent on other libraries. The idea is to have derived models from MMF transformer that can be scripted. For example: We will have a version of MMF transformer that works with Bert, Roberta and XLMR that will be scriptable. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/551 Test Plan: - Test with hateful memes dataset Reviewed By: apsdehal Differential Revision: D23569270 Pulled By: vedanuj fbshipit-source-id: 21d9ac80d912fcc65810a13f4facd0e828fcdd39 16 September 2020, 06:11:16 UTC
00fa16f [chores] Add linter tests to Actions (#559) Summary: Add linter tests to Actions. We can then remove CircleCI CPU tests completely. Pull Request resolved: https://github.com/facebookresearch/mmf/pull/559 Reviewed By: apsdehal Differential Revision: D23687038 Pulled By: vedanuj fbshipit-source-id: fb6ab9e627394fcf7b47bf60b07f90740f456359 16 September 2020, 05:12:22 UTC
f24c849 [fix] Fix tests with larger CPU resource, fail-fast set to false (#558) Summary: - Increase resource size for CircleCI CPU tests; change to the machine executor, which has larger RAM - Set fail-fast to false for Actions to be able to pinpoint which matrix combinations cause failures Pull Request resolved: https://github.com/facebookresearch/mmf/pull/558 Test Plan: - Run the tests and check if they pass Reviewed By: apsdehal Differential Revision: D23668047 Pulled By: vedanuj fbshipit-source-id: 47d135e784e62bb645de0227d75488478d7761b2 15 September 2020, 02:38:56 UTC
cdf1bcc [mmf] [feat] Batch Beam Search (#557) Summary: Pull Request resolved: https://github.com/facebookresearch/mmf/pull/557 Implement beam search for batch_size > 1. Current Implementation of Beam Search works only for batch size = 1. Here is the implementation : https://github.com/facebookresearch/mmf/blob/08e5416db2ffa86508af4161505213b976df2cfd/mmf/utils/text.py#L271 Reviewed By: howardhsu Differential Revision: D22860178 fbshipit-source-id: 95b70147ab8fad8ddd43dfef54b7be7c45841f3f 14 September 2020, 20:37:14 UTC
c766d82 [feat] Add torchscript support for MMBT (#550) Summary: - Adds torchscript support for MMBT model - Adds `BertEmbeddingsJit` and `BertModelJit` layers which are torchscriptable versions of `BertEmbeddings` and `BertModel` layers from transformers package - Replaces transformers Bert model layers with scriptable layers by monkey patching, function `replace_with_jit` - Adds tests Pull Request resolved: https://github.com/facebookresearch/mmf/pull/550 Test Plan: - Run tests - Run old pretrained checkpoints for MMBT and verify results Reviewed By: apsdehal Differential Revision: D23569134 Pulled By: vedanuj fbshipit-source-id: ec688aa5686812ae838a95ad203668ae06dfd1fc 12 September 2020, 08:42:42 UTC