Software Heritage Archive
Origin: https://github.com/brownvc/deep-synth
Archive visit: 31 March 2020, 06:46:23 UTC
Revision b800e11290b763b58e7d3b30329769a7b77cd12a, authored and committed by kwang-ether on 14 June 2019, 23:53:57 UTC
Commit message: remove csv
1 parent: 79eaa7f
To reference or cite the objects present in the Software Heritage archive, use permalinks based on SoftWare Hash IDentifiers (SWHIDs). The SWHIDs for the objects browsed on this page are:

  • revision: swh:1:rev:b800e11290b763b58e7d3b30329769a7b77cd12a
  • directory: swh:1:dir:87b9fce6556ee1dd9155a27d000889cbaac88195
  • content: swh:1:cnt:f237a5b4f1ce172f23926297a6f6b1c1aa9725d6
  • snapshot: swh:1:snp:0f10b5007a9962ed82323ed2242cf08ba5544645
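
For example, a SWHID can be resolved to a browsable archive URL through the Software Heritage Web API. The sketch below is illustrative only (it is not part of the archived repository) and assumes the public endpoint at archive.softwareheritage.org and the third-party requests library:

import requests

# Resolve the content SWHID listed above via the public Software Heritage Web API
# (illustrative sketch; error handling omitted).
swhid = "swh:1:cnt:f237a5b4f1ce172f23926297a6f6b1c1aa9725d6"
resp = requests.get(f"https://archive.softwareheritage.org/api/1/resolve/{swhid}/")
resp.raise_for_status()
print(resp.json().get("browse_url"))  # permalink for browsing this object in the archive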

deep-synth / rotation_dataset.py
from torch.utils import data
import torch
import numpy as np
import random
import math
from data import ObjectCategories, RenderedScene, RenderedComposite, Obj, TopDownView, ProjectionGenerator

class RotationDataset(data.Dataset):
    """
    Dataset for training/testing the instance/orientation network
    """
    pgen = ProjectionGenerator()
    possible_models = None

    def __init__(self, data_dir, data_root_dir, scene_indices=(0,4000), seed=None, ablation=None):
        """
        Parameters
        ----------
        data_root_dir (String): root dir where all data lives
        data_dir (String): directory where this dataset lives (relative to data_root_dir)
        scene_indices (tuple[int, int]): (start, end) range of scene indices (in data_dir) that belong to this split
        seed (int or None, optional): random seed, for reproducible sampling, if set
        ablation (string or None, optional): see data.RenderedComposite.get_composite, and the paper
        """
        self.category_map = ObjectCategories()
        self.seed = seed
        self.data_dir = data_dir
        self.data_root_dir = data_root_dir
        self.scene_indices = scene_indices
        self.ablation = ablation

    def __len__(self):
        return self.scene_indices[1]-self.scene_indices[0]
    
    def __getitem__(self,index):
        if self.seed:
            random.seed(self.seed)

        i = index+self.scene_indices[0]
        scene = RenderedScene(i, self.data_dir, self.data_root_dir)
        composite = scene.create_composite() #get empty composite

        object_nodes = scene.object_nodes
        #Since at least one object must be left to rotate, this differs slightly from the location dataset
        num_objects = random.randint(0, len(object_nodes)-1) 
        num_categories = len(scene.categories)

        for i in range(num_objects):
            node = object_nodes[i]
            composite.add_node(node)
        
        #Select the node we want to rotate
        node = object_nodes[num_objects]
        
        modelId = node["modelId"]
        #Hand-picked distribution of perturbation cases, focusing on 180 degrees, then 90, then finer angles:
        #20% exactly 180 degrees, 20% a multiple of 90, 20% a multiple of 22.5 degrees (all with target 0),
        #and 40% no rotation (target 1, i.e. the object keeps its original, correct orientation)
        ran = random.uniform(0,1)
        if ran < 0.2:
            r = math.pi
            target = 0
        elif ran < 0.4:
            r = math.pi / 2 * random.randint(1,3)
            target = 0
        elif ran < 0.6:
            r = math.pi / 8 * random.randint(1,15)
            target = 0
        else:
            r = 0
            target = 1

        o = Obj(modelId)
        #Get the transformation matrix from object space to scene space
        t = RotationDataset.pgen.get_projection(scene.room).to_2d(np.asarray(node["transform"]).reshape(4,4))
        #The model is already centered in object space, so it is easiest to rotate it there (about the vertical axis)
        sin, cos = math.sin(r), math.cos(r)
        t_rot = np.asarray([[cos, 0, -sin, 0],
                            [0, 1, 0, 0],
                            [sin, 0, cos, 0],
                            [0, 0, 0, 1]])
        o.transform(np.dot(t_rot,t))
        #Render the rotated view of the object
        rotated = torch.from_numpy(TopDownView.render_object_full_size(o, composite.size))
        #Recover the object's original orientation (sin, cos), offset it by the applied rotation, and composite
        sin, cos = composite.get_transformation(node["transform"])
        original_r = math.atan2(sin, cos)
        sin = math.sin(original_r + r)
        cos = math.cos(original_r + r)
        composite.add_height_map(rotated, node["category"], sin, cos)
        
        inputs = composite.get_composite(ablation=self.ablation)
        #Add an attention channel: a binary mask of the rendered (rotated) target object
        rotated[rotated>0] = 1
        inputs[-1] = rotated

        return inputs, target
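
As a rough usage sketch (not part of the archived file), the dataset plugs into a standard PyTorch DataLoader; the directory names and split below are placeholders that depend on how the deep-synth data was preprocessed:

from torch.utils import data

# Hypothetical paths/split; the real layout comes from the deep-synth preprocessing scripts.
dataset = RotationDataset(data_dir="bedroom", data_root_dir="/path/to/deep-synth-data",
                          scene_indices=(0, 4000))
loader = data.DataLoader(dataset, batch_size=16, shuffle=True, num_workers=4)

for inputs, target in loader:
    # inputs: top-down composite channels, with the rotated target object in the last (attention) channel
    # target: 1 if the object kept its original orientation, 0 if a perturbing rotation was applied
    pass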