Revision 8598e9e321cd47edfecd7e5e83482ee0e4486e94, authored and committed by Wesley Tansey on 23 June 2016, 01:57:28 UTC
Merge branch 'master' of github.com:tansey/smoothfdr
2 parents: 907bf5c + a7ab0a6
Permalinks

To reference or cite the objects present in the Software Heritage archive, permalinks based on SoftWare Hash IDentifiers (SWHIDs) must be used. The SWHIDs for the objects browsed here are:

  • revision: swh:1:rev:8598e9e321cd47edfecd7e5e83482ee0e4486e94
  • directory: swh:1:dir:a6c0fd806e00b75984345b63925b36633898d701
  • content: swh:1:cnt:6b46785019a94c6fc59097990d6fde301e21459d
Citations

This interface enables generating software citations, provided that the root directory of the browsed objects contains a citation.cff or codemeta.json file. Citations can be generated in BibTeX format for the revision, directory, or content object (the output requires the biblatex-software package).
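For projects that do not yet ship citation metadata, a minimal CITATION.cff file has the following shape. The title matches the repository name; the author name below is an illustrative placeholder, not metadata taken from the archived repository:

```yaml
# Minimal CITATION.cff (Citation File Format 1.2.0); author values are illustrative.
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "smoothfdr"
authors:
  - family-names: "Doe"
    given-names: "Jane"
```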
sfdr.py
import matplotlib as mpl
mpl.use('Agg')
from matplotlib import cm, colors
import matplotlib.pyplot as plt
from mpl_toolkits.axes_grid1 import make_axes_locatable
import numpy as np
import argparse
import csv
import sys
from scipy.sparse import csc_matrix, dia_matrix, linalg as sla
from scipy.stats import norm
from smoothed_fdr import SmoothedFdr, GaussianKnown, calc_plateaus
from normix import GridDistribution, predictive_recursion, empirical_null
import signal_distributions
from utils import *
from plotutils import *


def main():
    parser = argparse.ArgumentParser(description='Runs the smoothed FDR algorithm.')

    parser.add_argument('data_file', help='The file containing the raw z-score data.')
    parser.add_argument('signal_distribution_file', help='The file location where the estimated signal distribution will be saved.')
    parser.add_argument('--verbose', type=int, default=0, help='Print detailed progress information to the console. 0=none, 1=outer-loop only, 2=all details.')
    parser.add_argument('--data_header', action='store_true', help='Specifies that there is a header line in the data file.')

    # Predictive recursion settings
    parser.add_argument('--pr_grid_x', nargs=3, type=int, default=[-7,7,57], help='The grid parameters (min, max, points) for the predictive recursion approximate distribution.')
    parser.add_argument('--pr_sweeps', type=int, default=50, help='The number of randomized sweeps to make over the data.')
    parser.add_argument('--pr_nullprob', type=float, default=1.0, help='The initial guess for the marginal probability of coming from the null distribution.')
    parser.add_argument('--pr_decay', type=float, default=-0.67, help='The exponential decay rate for the recursive update weights.')

    # Graph structure (the original script referenced args.trails below without defining it)
    parser.add_argument('--trails', help='The file containing the trail decomposition of the underlying graph.')


    parser.set_defaults(data_header=False)

    # Get the arguments from the command line
    args = parser.parse_args()
    
    # Load the dataset from file
    data = np.loadtxt(args.data_file, delimiter=',', skiprows=1 if args.data_header else 0)

    if args.verbose:
        print('Estimating null distribution empirically via Efron\'s method.')

    null_mean, null_stdev = empirical_null(data.flatten())
    null_dist = GaussianKnown(null_mean, null_stdev)

    if args.verbose:
        print('Null: N({0}, {1}^2)'.format(null_mean, null_stdev))

    if args.verbose:
        print('Performing predictive recursion to estimate the signal distribution on [{0}, {1}] ({2} bins)'.format(args.pr_grid_x[0], args.pr_grid_x[1], args.pr_grid_x[2]))
    
    grid_x = np.linspace(args.pr_grid_x[0], args.pr_grid_x[1], args.pr_grid_x[2])
    signal_data = data.flatten()

    # Use the empirically estimated null parameters (args has no null_mean/null_stdev)
    pr_results = predictive_recursion(signal_data,
                                      args.pr_sweeps, grid_x,
                                      mu0=null_mean, sig0=null_stdev,
                                      nullprob=args.pr_nullprob, decay=args.pr_decay)

    # Get the estimated distribution
    estimated_dist = GridDistribution(pr_results['grid_x'], pr_results['y_signal'])

    # Load the trail decomposition of the graph, used to build the fused-lasso penalties
    penalties = load_trails(args.trails)

    #TODO: Fill in the rest so there's a better lightweight script


if __name__ == '__main__':
    main()
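The `predictive_recursion` call above comes from the `normix` module and implements Newton's predictive recursion for estimating the signal (non-null) density. As a rough, self-contained illustration of the idea, here is a simplified sketch; the function name, flat initialization, and step-size schedule are assumptions for exposition, not the `normix` API:

```python
import numpy as np
from scipy.stats import norm

def predictive_recursion_sketch(z, grid, sweeps=10, nullprob=0.95,
                                decay=-0.67, mu0=0.0, sig0=1.0, seed=0):
    """Sketch of Newton's predictive recursion for the two-groups model.

    Maintains pi0 (null probability) and g (signal density on `grid`) so that
    pi0 + integral(g) stays equal to 1 throughout the sweeps.
    """
    rng = np.random.default_rng(seed)
    dx = grid[1] - grid[0]
    pi0 = nullprob
    # Start from a flat signal density carrying the remaining 1 - pi0 mass.
    g = np.full(grid.shape, (1.0 - pi0) / (dx * len(grid)))
    k = 0
    for _ in range(sweeps):
        for zi in rng.permutation(z):            # randomized sweep order
            k += 1
            w = (k + 1.0) ** decay               # decaying update weight
            f0 = pi0 * norm.pdf(zi, mu0, sig0)   # null contribution
            like = norm.pdf(zi, grid, sig0)      # likelihood under each signal mean
            m1 = np.sum(like * g) * dx           # signal contribution
            denom = f0 + m1
            g = (1.0 - w) * g + w * like * g / denom
            pi0 = (1.0 - w) * pi0 + w * f0 / denom
    return g, pi0
```

Because each update is a convex combination, total probability mass (null plus signal) is conserved exactly, which is a useful sanity check on any implementation.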

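Once the null and signal distributions are estimated, the two-groups model turns them into local false discovery rates, lfdr(z) = pi0 * f0(z) / f(z). A minimal stand-alone sketch follows; the fixed wide Gaussian alternative here is an illustrative stand-in for the estimated `GridDistribution`, not the method the script itself uses:

```python
import numpy as np
from scipy.stats import norm

def local_fdr(z, null_mean=0.0, null_sd=1.0, pi0=0.9, alt_sd=3.0):
    """Local FDR under a two-groups model: lfdr(z) = pi0 * f0(z) / f(z)."""
    f0 = norm.pdf(z, null_mean, null_sd)   # null density
    f1 = norm.pdf(z, null_mean, alt_sd)    # stand-in signal density
    f = pi0 * f0 + (1.0 - pi0) * f1        # marginal (mixture) density
    return pi0 * f0 / f

z = np.array([0.1, 2.5, 5.0])
lfdr = local_fdr(z)
```

Small |z| yields an lfdr near 1 (almost surely null), while large |z| drives it toward 0; thresholding lfdr then gives the discoveries.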