Revision df716c98d203ab64cdf05f9c17fdae565b7daa1c authored by Eelco Dolstra on 23 June 2012, 04:28:35 UTC, committed by Eelco Dolstra on 23 June 2012, 04:28:35 UTC
In chroot builds, use a private network namespace
On Linux it's possible to run a process in its own network namespace,
meaning that it gets its own set of network interfaces, disjoint from
the rest of the system.  We use this to completely remove network
access to chroot builds, except that they get a private loopback
interface.  This means that:

- Builders cannot connect to the outside network or to other processes
  on the same machine, except processes within the same build.

- Vice versa, other processes cannot connect to processes in a chroot
  build, and open ports/connections do not show up in "netstat".

- If two concurrent builders try to listen on the same port (e.g. as
  part of a test), they no longer conflict with each other.

This was inspired by the "PrivateNetwork" flag in systemd.
1 parent 2f3f413
To reference or cite the objects present in the Software Heritage archive, permalinks based on SoftWare Hash IDentifiers (SWHIDs) must be used. The SWHIDs for the objects browsed here are:

  • revision: swh:1:rev:df716c98d203ab64cdf05f9c17fdae565b7daa1c
  • directory: swh:1:dir:d5f371433bd14235182cdc03cc8a5794b97fe4f4
  • content: swh:1:cnt:a1c02190bd6c6b2fceb8f93fc96fa0c46920711a

scripts/nix-push.in
#! @perl@ -w @perlFlags@

use strict;
use File::Temp qw(tempdir);
use File::stat;
use Nix::Config;
use Nix::Manifest;

my $hashAlgo = "sha256";

my $tmpDir = tempdir("nix-push.XXXXXX", CLEANUP => 1, TMPDIR => 1)
    or die "cannot create a temporary directory";

my $nixExpr = "$tmpDir/create-nars.nix";
my $manifest = "$tmpDir/MANIFEST";

my $curl = "$Nix::Config::curl --fail --silent";
my $extraCurlFlags = $ENV{'CURL_FLAGS'};
$curl = "$curl $extraCurlFlags" if defined $extraCurlFlags;


# Parse the command line.
my $localCopy;
my $localArchivesDir;
my $localManifestFile;

my $targetArchivesUrl;

my $archivesPutURL;
my $archivesGetURL;
my $manifestPutURL;

sub showSyntax {
    print STDERR <<EOF
Usage: nix-push --copy ARCHIVES_DIR MANIFEST_FILE PATHS...
   or: nix-push ARCHIVES_PUT_URL ARCHIVES_GET_URL MANIFEST_PUT_URL PATHS...

`nix-push' copies or uploads the closure of PATHS to the given
destination.
EOF
    ; # `
    exit 1;
}

showSyntax if scalar @ARGV < 1;

if ($ARGV[0] eq "--copy") {
    showSyntax if scalar @ARGV < 3;
    $localCopy = 1;
    shift @ARGV;
    $localArchivesDir = shift @ARGV;
    $localManifestFile = shift @ARGV;
    if ($ARGV[0] eq "--target") {
       shift @ARGV;
       $targetArchivesUrl = shift @ARGV;
    }
    else {
       $targetArchivesUrl = "file://$localArchivesDir";
    }
}
else {
    showSyntax if scalar @ARGV < 3;
    $localCopy = 0;
    $archivesPutURL = shift @ARGV;
    $archivesGetURL = shift @ARGV;
    $manifestPutURL = shift @ARGV;
}


# From the given store paths, determine the set of requisite store
# paths, i.e., the paths required to realise them.
my %storePaths;

foreach my $path (@ARGV) {
    die unless $path =~ /^\//;

    # Get the closure of the given store path, i.e. all store paths
    # that it references, directly or indirectly.
    my $pid = open(READ,
        "$Nix::Config::binDir/nix-store --query --requisites --force-realise " .
        "--include-outputs '$path'|") or die;
    
    while (<READ>) {
        chomp;
        die "bad: $_" unless /^\//;
        $storePaths{$_} = "";
    }

    close READ or die "nix-store failed: $?";
}

my @storePaths = keys %storePaths;


# For each path, create a Nix expression that turns the path into
# a Nix archive.
open NIX, ">$nixExpr";
print NIX "[";

foreach my $storePath (@storePaths) {
    die unless ($storePath =~ /\/[0-9a-z]{32}[^\"\\\$]*$/);

    # Construct a Nix expression that creates a Nix archive.
    my $nixexpr = 
        "(import <nix/nar.nix> " .
        "{ storePath = builtins.storePath \"$storePath\"; hashAlgo = \"$hashAlgo\"; }) ";
    
    print NIX $nixexpr;
}

print NIX "]";
close NIX;


# Instantiate store derivations from the Nix expression.
my @storeExprs;
print STDERR "instantiating store derivations...\n";
my $pid = open(READ, "$Nix::Config::binDir/nix-instantiate $nixExpr|")
    or die "cannot run nix-instantiate";
while (<READ>) {
    chomp;
    die unless /^\//;
    push @storeExprs, $_;
}
close READ or die "nix-instantiate failed: $?";


# Build the derivations.
print STDERR "creating archives...\n";

my @narPaths;

# Realise the derivations in batches of at most 256 store expressions,
# so a large closure does not produce an overlong command line.
my @tmp = @storeExprs;
while (scalar @tmp > 0) {
    my $n = scalar @tmp;
    $n = 256 if $n > 256;
    my @tmp2 = @tmp[0..$n - 1];
    @tmp = @tmp[$n..scalar @tmp - 1];

    my $pid = open(READ, "$Nix::Config::binDir/nix-store --realise @tmp2|")
        or die "cannot run nix-store";
    while (<READ>) {
        chomp;
        die unless (/^\//);
        push @narPaths, "$_";
    }
    close READ or die "nix-store failed: $?";
}


# Create the manifest.
print STDERR "creating manifest...\n";

my %narFiles;
my %patches;

my @narArchives;
for (my $n = 0; $n < scalar @storePaths; $n++) {
    my $storePath = $storePaths[$n];
    my $narDir = $narPaths[$n];
    
    $storePath =~ /\/([^\/]*)$/;
    my $basename = $1;
    defined $basename or die;

    open HASH, "$narDir/narbz2-hash" or die "cannot open narbz2-hash";
    my $narbz2Hash = <HASH>;
    chomp $narbz2Hash;
    $narbz2Hash =~ /^[0-9a-z]+$/ or die "invalid hash";
    close HASH;

    my $narName = "$narbz2Hash.nar.bz2";

    my $narFile = "$narDir/$narName";
    (-f $narFile) or die "narfile for $storePath not found";
    push @narArchives, $narFile;

    my $narbz2Size = stat($narFile)->size;

    my $references = `$Nix::Config::binDir/nix-store --query --references '$storePath'`;
    die "cannot query references for `$storePath'" if $? != 0;
    $references = join(" ", split(" ", $references));

    my $deriver = `$Nix::Config::binDir/nix-store --query --deriver '$storePath'`;
    die "cannot query deriver for `$storePath'" if $? != 0;
    chomp $deriver;
    $deriver = "" if $deriver eq "unknown-deriver";

    my $narHash = `$Nix::Config::binDir/nix-store --query --hash '$storePath'`;
    die "cannot query hash for `$storePath'" if $? != 0;
    chomp $narHash;

    # In some exceptional cases (such as VM tests that use the Nix
    # store of the host), the database doesn't contain the hash.  So
    # compute it.
    if ($narHash =~ /^sha256:0*$/) {
        $narHash = `$Nix::Config::binDir/nix-hash --type sha256 --base32 '$storePath'`;
        die "cannot hash `$storePath'" if $? != 0;
        chomp $narHash;
        $narHash = "sha256:$narHash";
    }

    my $narSize = `$Nix::Config::binDir/nix-store --query --size '$storePath'`;
    die "cannot query size for `$storePath'" if $? != 0;
    chomp $narSize;

    my $url;
    if ($localCopy) {
        $url = "$targetArchivesUrl/$narName";
    } else {
        $url = "$archivesGetURL/$narName";
    }
    $narFiles{$storePath} = [
        { url => $url
        , hash => "$hashAlgo:$narbz2Hash"
        , size => $narbz2Size
        , narHash => "$narHash"
        , narSize => $narSize
        , references => $references
        , deriver => $deriver
        }
    ];
}

writeManifest $manifest, \%narFiles, \%patches;


sub copyFile {
    my $src = shift;
    my $dst = shift;
    my $tmp = "$dst.tmp.$$";
    system("@coreutils@/cp", $src, $tmp) == 0 or die "cannot copy file";
    rename($tmp, $dst) or die "cannot rename file: $!";
}


# Upload/copy the archives.
print STDERR "uploading/copying archives...\n";

sub archiveExists {
    my $name = shift;
    print STDERR "  HEAD on $archivesGetURL/$name\n";
    return system("$curl --head $archivesGetURL/$name > /dev/null") == 0;
}

foreach my $narArchive (@narArchives) {

    $narArchive =~ /\/([^\/]*)$/;
    my $basename = $1;

    if ($localCopy) {
        # Since nix-push creates $dst atomically, if it exists we
        # don't have to copy again.
        my $dst = "$localArchivesDir/$basename";
        if (! -f $dst) {
            print STDERR "  $narArchive\n";
            copyFile $narArchive, $dst;
        }
    }
    else {
        if (!archiveExists("$basename")) {
            print STDERR "  $narArchive\n";
            system("$curl --show-error --upload-file " .
                   "'$narArchive' '$archivesPutURL/$basename' > /dev/null") == 0 or
                   die "curl failed on $narArchive: $?";
        }
    }
}


# Upload the manifest.
print STDERR "uploading manifest...\n";
if ($localCopy) {
    copyFile $manifest, $localManifestFile;
    copyFile "$manifest.bz2", "$localManifestFile.bz2";
} else {
    system("$curl --show-error --upload-file " .
           "'$manifest' '$manifestPutURL' > /dev/null") == 0 or
           die "curl failed on $manifest: $?";
    system("$curl --show-error --upload-file " .
           "'$manifest.bz2' '$manifestPutURL.bz2' > /dev/null") == 0 or
           die "curl failed on $manifest.bz2: $?";
}

Software Heritage — Copyright (C) 2015–2026, The Software Heritage developers. License: GNU AGPLv3+.
The source code of Software Heritage itself is available on our development forge.
The source code files archived by Software Heritage are available under their own copyright and licenses.