# Vosh

This repository contains a PyTorch implementation of the paper: [Voxel-Mesh Hybrid Representation for Real-Time View Synthesis](https://arxiv.org/abs/2403.06505).

### [Project Page](https://zyyzyy06.github.io/Vosh/) | [arXiv](https://arxiv.org/abs/2403.06505) | [Paper]()

![](assets/teaser.png)

# Install

```bash
git clone https://github.com/zachzhang07/vosh.git
cd vosh
```

### Install with pip
```bash
# create and activate a conda environment
conda create -n vosh python==3.8.13
conda activate vosh

# PyTorch 1.10.1 built against CUDA 11.1
pip install torch==1.10.1+cu111 torchvision==0.11.2+cu111 torchaudio==0.10.1 -f https://download.pytorch.org/whl/cu111/torch_stable.html

# remaining Python dependencies
pip install -r requirements.txt

# nvdiffrast (NVIDIA's differentiable rasterizer)
pip install git+https://github.com/NVlabs/nvdiffrast/
```
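
If the install succeeded, a quick sanity check (not part of the original README) is to confirm that the CUDA build of PyTorch is active:

```bash
# should print "1.10.1+cu111 True" on a working GPU setup
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```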

<!-- ### Build extension (optional)
By default, we use [`load`](https://pytorch.org/docs/stable/cpp_extension.html#torch.utils.cpp_extension.load) to build the extension at runtime.
However, this may be inconvenient sometimes.
Therefore, we also provide the `setup.py` to build each extension:
```bash
# install all extension modules
bash scripts/install_ext.sh

# if you want to install manually, here is an example:
cd raymarching
python setup.py build_ext --inplace # build ext only, do not install (only can be used in the parent directory)
pip install . # install to python path (you still need the raymarching/ folder, since this only install the built extension.)
``` -->

### Tested environments
* Ubuntu 20.04 with torch 1.10.1 & CUDA 11.1 on RTX 4090 and RTX 3090.

# Usage

We mainly support COLMAP-format datasets such as [Mip-NeRF 360](http://storage.googleapis.com/gresearch/refraw360/360_v2.zip).
Please download them and place them under `../data/`.
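
For example, one way to fetch and lay out the Mip-NeRF 360 data (a sketch, assuming the official zip extracts one folder per scene at its top level):

```bash
mkdir -p ../data && cd ../data
wget http://storage.googleapis.com/gresearch/refraw360/360_v2.zip
unzip 360_v2.zip -d 360_v2   # yields ../data/360_v2/bicycle/, ../data/360_v2/garden/, ...
```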

For custom datasets:
```bash
# prepare your video or images under ../data/custom, then run COLMAP (assuming it is installed):
python scripts/colmap2nerf.py --video ../data/custom/video.mp4 --run_colmap  # if using a video
python scripts/colmap2nerf.py --images ../data/custom/images/ --run_colmap   # if using images
```
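
After COLMAP finishes, you can verify that camera poses were generated. This assumes, as in the torch-ngp-style scripts this repo derives from, that `colmap2nerf.py` writes a `transforms.json` next to the data (an assumption, not stated in the README):

```bash
# the file should exist and be non-empty if pose estimation succeeded
ls -lh ../data/custom/transforms.json
```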

### Basics
The first run will take some time to compile the CUDA extensions.
```bash
## train and eval
# mip-nerf 360
# stage 1: train the volume (voxel) model
python main_vol.py ../data/360_v2/bicycle/ --workspace ../output/bicycle --contract
# stage 2: train the mesh model from the volume checkpoint
python main_mesh.py ../data/360_v2/bicycle/ --vol_path ../output/bicycle \
  --workspace ../output/bicycle_mesh
# stage 3: train the hybrid voxel-mesh (Vosh) model, in base and light variants
python main_vosh.py ../data/360_v2/bicycle/ --vol_path ../output/bicycle_mesh \
  --workspace ../output/bicycle_base --lambda_mesh_weight 0.001 --mesh_select 0.9 \
  --keep_center 0.25 --lambda_bg_weight 0.01
python main_vosh.py ../data/360_v2/bicycle/ --vol_path ../output/bicycle_mesh \
  --workspace ../output/bicycle_light --lambda_mesh_weight 0.01 --mesh_select 1.0 \
  --keep_center 0.25 --lambda_bg_weight 0.01 --use_mesh_occ_grid --mesh_check_ratio 8
```
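
The same three stages generalize to any scene folder. A minimal sketch with the scene name factored into a shell variable (`SCENE` is illustrative, not a flag of the scripts; the hyper-parameters shown are the base-variant ones from above, not per-scene tuned values):

```bash
SCENE=garden   # any scene folder under ../data/360_v2/
python main_vol.py ../data/360_v2/$SCENE/ --workspace ../output/$SCENE --contract
python main_mesh.py ../data/360_v2/$SCENE/ --vol_path ../output/$SCENE \
  --workspace ../output/${SCENE}_mesh
python main_vosh.py ../data/360_v2/$SCENE/ --vol_path ../output/${SCENE}_mesh \
  --workspace ../output/${SCENE}_base --lambda_mesh_weight 0.001 --mesh_select 0.9 \
  --keep_center 0.25 --lambda_bg_weight 0.01
```
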
To evaluate Vosh on the 7 scenes of the Mip-NeRF 360 dataset, simply run:
```bash
python full_eval_360.py ../data/360_v2/ --workspace ../output/
```

Please check `full_eval_360.py` for the hyper-parameters used for different kinds of scenes, and `main_*.py` for all available options.
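
Assuming the entry scripts use standard argparse (an assumption; the README does not say), each one can list its full option set:

```bash
python main_vol.py --help   # likewise for main_mesh.py and main_vosh.py
```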

### Acknowledgement
This codebase borrows heavily from [torch-merf](https://github.com/ashawkey/torch-merf) and [nerf2mesh](https://github.com/ashawkey/nerf2mesh). Many thanks to Jiaxiang.


# Citation

```bibtex
@article{zhang2024vosh,
  title={Vosh: Voxel-Mesh Hybrid Representation for Real-Time View Synthesis},
  author={Zhang, Chenhao and Zhou, Yongyang and Zhang, Lei},
  journal={arXiv preprint arXiv:2403.06505},
  year={2024}
}
```
