Revision df716c98d203ab64cdf05f9c17fdae565b7daa1c authored by Eelco Dolstra on 23 June 2012, 04:28:35 UTC, committed by Eelco Dolstra on 23 June 2012, 04:28:35 UTC
In chroot builds, use a private network namespace
On Linux it's possible to run a process in its own network namespace,
meaning that it gets its own set of network interfaces, disjoint from
the rest of the system.  We use this to completely remove network
access to chroot builds, except that they get a private loopback
interface.  This means that:

- Builders cannot connect to the outside network or to other processes
  on the same machine, except processes within the same build.

- Vice versa, other processes cannot connect to processes in a chroot
  build, and open ports/connections do not show up in "netstat".

- If two concurrent builders try to listen on the same port (e.g. as
  part of a test), they no longer conflict with each other.

This was inspired by the "PrivateNetwork" flag in systemd.
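The port-conflict problem the namespace solves is easy to reproduce: in a shared network namespace, a second bind to an address that is already in use fails with EADDRINUSE. A minimal sketch (not part of Nix) simulating two "builders" racing for one port on the loopback interface:

```python
import errno, socket

# Two "builders" in the SAME network namespace racing for one port:
a = socket.socket()
a.bind(("127.0.0.1", 0))         # the kernel picks a free port for builder A
port = a.getsockname()[1]
a.listen(1)

b = socket.socket()
conflict = False
try:
    b.bind(("127.0.0.1", port))  # builder B wants the very same port
except OSError as e:
    conflict = e.errno == errno.EADDRINUSE
finally:
    b.close()
    a.close()

print("EADDRINUSE" if conflict else "no conflict")  # → EADDRINUSE
```

With each build in its own network namespace, both binds would land in separate port spaces and succeed.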
Parent: 2f3f413
File: scripts/nix-prefetch-url.in

To reference or cite the objects present in the Software Heritage archive, permalinks based on SoftWare Hash IDentifiers (SWHIDs) must be used:

  • revision: swh:1:rev:df716c98d203ab64cdf05f9c17fdae565b7daa1c
  • directory: swh:1:dir:d5f371433bd14235182cdc03cc8a5794b97fe4f4
  • content: swh:1:cnt:eea2b814b73326b0bf8e92f9ef0e5e2d09a37a7c

nix-prefetch-url.in
#! @perl@ -w @perlFlags@

use strict;
use File::Basename;
use File::Temp qw(tempdir);
use File::stat;
use Nix::Store;
use Nix::Config;

my $url = shift;
my $expHash = shift;
my $hashType = $ENV{'NIX_HASH_ALGO'} || "sha256";
my $cacheDir = $ENV{'NIX_DOWNLOAD_CACHE'};

if (!defined $url || $url eq "") {
    print STDERR <<EOF
Usage: nix-prefetch-url URL [EXPECTED-HASH]
EOF
    ;
    exit 1;
}

sub writeFile {
    my ($fn, $s) = @_;
    open TMP, ">$fn" or die;
    print TMP "$s" or die;
    close TMP or die;
}

sub readFile {
    local $/ = undef;
    my ($fn) = @_;
    open TMP, "<$fn" or die;
    my $s = <TMP>;
    close TMP or die;
    return $s;
}

my $tmpDir = tempdir("nix-prefetch-url.XXXXXX", CLEANUP => 1, TMPDIR => 1)
    or die "cannot create a temporary directory";

# Hack to support the mirror:// scheme from Nixpkgs.
if ($url =~ /^mirror:\/\//) {
    system("$Nix::Config::binDir/nix-build '<nixpkgs>' -A resolveMirrorURLs --argstr url '$url' -o $tmpDir/urls > /dev/null") == 0
        or die "$0: nix-build failed; maybe \$NIX_PATH is not set properly\n";
    my @expanded = split ' ', readFile("$tmpDir/urls");
    die "$0: cannot resolve ‘$url’" unless scalar @expanded > 0;
    print STDERR "$url expands to $expanded[0]\n";
    $url = $expanded[0];
}

# Handle escaped characters in the URI.  `+', `=' and `?' are the only
# characters that are valid in Nix store path names but have a special
# meaning in URIs.
my $name = basename $url;
die "cannot figure out file name for ‘$url’\n" if $name eq ""; 
$name =~ s/%2b/+/g;
$name =~ s/%3d/=/g;
$name =~ s/%3f/?/g;

my $finalPath;
my $hash;

# If the hash was given, a file with that hash may already be in the
# store.
if (defined $expHash) {
    $finalPath = makeFixedOutputPath(0, $hashType, $expHash, $name);
    if (isValidPath($finalPath)) { $hash = $expHash; } else { $finalPath = undef; }
}

# If we don't know the hash or a file with that hash doesn't exist,
# download the file and add it to the store.
if (!defined $finalPath) {

    my $tmpFile = "$tmpDir/$name";
    
    # Optionally do timestamp-based caching of the download.
    # Actually, the only thing that we cache in $NIX_DOWNLOAD_CACHE is
    # the hash and the timestamp of the file at $url.  The caching of
    # the file *contents* is done in Nix store, where it can be
    # garbage-collected independently.
    my ($cachedTimestampFN, $cachedHashFN, @cacheFlags);
    if (defined $cacheDir) {
        my $urlHash = hashString("sha256", 1, $url);
        writeFile "$cacheDir/$urlHash.url", $url;
        $cachedHashFN = "$cacheDir/$urlHash.$hashType";
        $cachedTimestampFN = "$cacheDir/$urlHash.stamp";
        @cacheFlags = ("--time-cond", $cachedTimestampFN) if -f $cachedHashFN && -f $cachedTimestampFN;
    }
    
    # Perform the download.
    my @curlFlags = ("curl", $url, "-o", $tmpFile, "--fail", "--location", "--max-redirs", "20", "--disable-epsv", "--cookie-jar", "$tmpDir/cookies", "--remote-time", (split " ", ($ENV{NIX_CURL_FLAGS} || "")));
    (system $Nix::Config::curl @curlFlags, @cacheFlags) == 0 or die "$0: download of ‘$url’ failed\n";

    if (defined $cacheDir && ! -e $tmpFile) {
        # Curl didn't create $tmpFile, so apparently there's no newer
        # file on the server.
        $hash = readFile $cachedHashFN or die;
        $finalPath = makeFixedOutputPath(0, $hashType, $hash, $name);
        unless (isValidPath $finalPath) {
            print STDERR "cached contents of ‘$url’ disappeared, redownloading...\n";
            $finalPath = undef;
            (system $Nix::Config::curl @curlFlags) == 0 or die "$0: download of ‘$url’ failed\n";
        }
    }

    if (!defined $finalPath) {
        
        # Compute the hash.
        $hash = hashFile($hashType, $hashType ne "md5", $tmpFile);

        if (defined $cacheDir) {
            writeFile $cachedHashFN, $hash;
            my $st = stat($tmpFile) or die;
            open STAMP, ">$cachedTimestampFN" or die; close STAMP;
            utime($st->atime, $st->mtime, $cachedTimestampFN) or die;
        }
    
        # Add the downloaded file to the Nix store.
        $finalPath = addToStore($tmpFile, 0, $hashType);
    }

    die "$0: hash mismatch for ‘$url’\n" if defined $expHash && $expHash ne $hash;
}

print STDERR "path is ‘$finalPath’\n" unless $ENV{'QUIET'};
print "$hash\n";
print "$finalPath\n" if $ENV{'PRINT_PATH'};
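The store-name derivation near the top of the script (take the URL's basename, then decode the few percent-escapes whose characters are valid in Nix store path names) can be sketched outside Perl like this; `store_name` is a hypothetical helper for illustration, not part of Nix:

```python
from os.path import basename

def store_name(url: str) -> str:
    """Mimic nix-prefetch-url's name derivation: take the URL's basename,
    then decode %2b, %3d and %3f -- the only escapes whose decoded
    characters (+, = and ?) are valid in Nix store path names but have a
    special meaning in URIs."""
    name = basename(url)
    for esc, ch in (("%2b", "+"), ("%3d", "="), ("%3f", "?")):
        name = name.replace(esc, ch)
    return name

print(store_name("http://example.org/hello%2bworld-1.0.tar.gz"))
# → hello+world-1.0.tar.gz
```

As in the Perl original, only these three escapes are decoded; any other percent-encoding in the basename is left alone.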

Software Heritage — Copyright (C) 2015–2026, The Software Heritage developers. License: GNU AGPLv3+.
The source code of Software Heritage itself is available on our development forge.
The source code files archived by Software Heritage are available under their own copyright and licenses.