--- /dev/null
+Updating the package
+====================
+
+1. Run d/make_orig_multi.sh <version>.
+2. Inspect the -vendor component tarball to make sure it looks good.
+   If not, edit d/make_orig_multi.sh and the surrounding files (such as patches
+   and exclude files) and repeat the above until it does.
+3. $ git fetch upstream
+ You might have to first run:
+ $ git remote add upstream https://github.com/rust-lang/cargo
+4. $ gbp import-orig ../cargo_<version>.orig.tar.gz
+ If you get errors, check the extra default flags in d/gbp.conf
+5. Check that no old versions remain in vendor/. If there are, then your git
+ repo was messed up when you ran (4). Rewind the debian/sid, upstream, and
+ pristine-tar branches, delete the upstream/<version> tag; this reverts step
+ (4). Clean up your git repo, and then try (4) again.
+6. Update d/patches and the rest of the packaging, as normal.
+
+General info
+============
+
+The current packaging of cargo is sub-optimal because both the
+language (Rust) and its package manager (Cargo) are caught in
+self-dependency loops when bootstrapping.
+
+Moreover, cargo's current approach to modules and the registry is
+biased towards developer-friendly, always-online use.
+
+This package currently resorts to several workarounds to build cargo:
+ 1. we use a custom script (debian/bootstrap.py) to build a local
+    stage0, instead of downloading/embedding a snapshotted binary.
+ 2. we embed all dependency crates, because cargo needs external
+    modules (which themselves need cargo to build).
+ 3. we generate a .cargo/config at build-time, to override paths and
+    the registry.
+ 4. we create a temporary git repository at build-time for the
+    registry, as cargo needs one.
+
+As such, the original source is composed of two tarballs:
+ * cargo source
+ * dependency crates (under vendor/), stripped of unused embedded
+   C libraries
+
+ -- Luca Bruno <lucab@debian.org> Sat, 13 Feb 2016 17:50:59 +0100
--- /dev/null
+- later: experiment with -C prefer-dynamic
+ - see TODO in d/rules, need to update rustc
+- (possibly not needed, seems Windows-only):
+ work around https://github.com/rust-lang/rust/issues/43449
+
+Later
+=====
+
+A lot of the machinery here is shared with dh-cargo; we should use
+that instead. Specifically, d/cargohome/config and the
+override_dh_auto_* rules duplicate dh-cargo functionality.
--- /dev/null
+#!/usr/bin/env python
+"""
+NOTE: This script has not been used for a very long time and very likely won't
+work. Please read the code before attempting to run it and hoping that "just
+fixing the errors" will work. -- infinity0
+
+About
+=====
+
+This Python script is designed to do the bare minimum to compile and link the
+Cargo binary for the purposes of bootstrapping itself on a new platform for
+which cross-compiling isn't possible. I wrote this specifically to bootstrap
+Cargo on [Bitrig](https://bitrig.org). Bitrig is a fork of OpenBSD that uses
+clang/clang++ and other BSD licensed tools instead of GNU licensed software.
+Cross compiling from another platform is extremely difficult because of the
+alternative toolchain Bitrig uses.
+
+All that should be necessary to run this script is a working Rust
+toolchain, Python, and Git.
+
+This script will not set up a full cargo cache or anything. It works by
+cloning the cargo index and then starting with the cargo dependencies, it
+recursively builds the dependency tree. Once it has the dependency tree, it
+starts with the leaves of the tree, doing a breadth first traversal and for
+each dependency, it clones the repo, sets the repo's head to the correct
+revision and then executes the build command specified in the cargo config.
+
+This bootstrap script uses a temporary directory to store the built dependency
+libraries and uses that as a link path when linking dependencies and the
+cargo binary. The goal is to create a statically linked cargo binary that is
+capable of being used as a "local cargo" when running the main cargo Makefiles.
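The bottom-up traversal described above can be sketched in Python. This is an illustrative model with made-up crate names, not the script's actual resolver: each crate is emitted only after all of its dependencies.

```python
def build_order(deps):
    """Return crate names ordered so that dependencies come first."""
    order = []
    seen = set()

    def visit(crate):
        if crate in seen:
            return
        seen.add(crate)
        for dep in deps.get(crate, []):
            visit(dep)  # post-order: leaves end up first
        order.append(crate)

    for crate in deps:
        visit(crate)
    return order

# Made-up dependency graph: cargo needs toml and semver,
# toml needs rustc-serialize.
DEPS = {
    'cargo': ['toml', 'semver'],
    'toml': ['rustc-serialize'],
    'semver': [],
    'rustc-serialize': [],
}
ORDER = build_order(DEPS)
# dependencies always precede their dependents in ORDER
```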
+
+Dependencies
+============
+
+* pytoml -- used for parsing toml files.
+ https://github.com/avakar/pytoml
+
+* dulwich -- used for working with git repos.
+ https://git.samba.org/?p=jelmer/dulwich.git;a=summary
+
+Both can be installed via the pip tool:
+
+```sh
+sudo pip install pytoml dulwich
+```
+
+Command Line Options
+====================
+
+```
+--cargo-root <path> specify the path to the cargo repo root.
+--target-dir <path> specify the location to store build results.
+--crate-index <path> path to where the crates.io index should be cloned
+--no-clone don't clone crates.io index, --crate-index must point to existing clone.
+--no-clean don't remove the folders created during bootstrapping.
+--download only download the crates needed to bootstrap cargo.
+--graph output dot format graph of dependencies.
+--target <triple> build target: e.g. x86_64-unknown-bitrig
+--host <triple> host machine: e.g. x86_64-unknown-linux-gnu
+--urls-file <file> file to write crate URLs to
+--blacklist <crates> list of blacklisted crates to skip
+--include-optional <crates> list of optional crates to include
+--patchdir <dir> directory containing patches to apply to crates after fetching them
+--save-crate if set, save .crate file when downloading
+```
+
+The `--cargo-root` option defaults to the current directory if unspecified. The
+target directory defaults to the Python equivalent of `mktemp -d` if unspecified.
+The `--crate-index` option specifies where the crates.io index will be cloned;
+if you already have a clone of the index, point `--crate-index` at it and
+also specify `--no-clone`. The `--target` option is used to
+specify which platform you are bootstrapping for. The `--host` option defaults
+to the value of the `--target` option when not specified.
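As a sketch, the defaulting rules above could be wired up with argparse as follows. This is illustrative only (the flag set is trimmed and the defaults are inferred from the description), not the script's actual argument handling:

```python
import argparse
import tempfile

def parse_args(argv):
    # Trimmed, hypothetical parser for the options described above.
    p = argparse.ArgumentParser(description='bootstrap cargo')
    p.add_argument('--cargo-root', default='.')
    p.add_argument('--target-dir', default=None)
    p.add_argument('--crate-index', default=None)
    p.add_argument('--no-clone', action='store_true')
    p.add_argument('--no-clean', action='store_true')
    p.add_argument('--target', required=True)
    p.add_argument('--host', default=None)
    args = p.parse_args(argv)
    if args.target_dir is None:
        # mirrors `mktemp -d`
        args.target_dir = tempfile.mkdtemp()
    if args.host is None:
        # --host defaults to the value of --target
        args.host = args.target
    return args
```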
+
+Examples
+========
+
+To bootstrap Cargo on [Bitrig](https://bitrig.org) I followed these steps:
+
+* Cloned this [bootstrap script repo](https://github.com/dhuseby/cargo-bootstra)
+to `/tmp/bootstrap`.
+* Cloned the [crates.io index](https://github.com/rust-lang/crates.io-index)
+to `/tmp/index`.
+* Created a target folder, `/tmp/out`, for the output.
+* Cloned the [Cargo](https://github.com/rust-lang/cargo) repo to `/tmp/cargo`.
+* Copied the bootstrap.py script to the cargo repo root.
+* Ran the bootstrap.py script like so:
+```sh
+./bootstrap.py --crate-index /tmp/index --target-dir /tmp/out --no-clone --no-clean --target x86_64-unknown-bitrig
+```
+
+After the script completes, there will be a Cargo executable named `cargo-0_2_0` in
+`/tmp/out`. That executable can then be used to bootstrap Cargo from source by
+specifying it as the `--local-cargo` option to Cargo's `./configure` script.
+"""
+
+import argparse
+import cStringIO
+import hashlib
+import inspect
+import json
+import os
+import re
+import shutil
+import subprocess
+import sys
+import tarfile
+import tempfile
+import urlparse
+import socket
+# In Debian crates are already downloaded when we bootstrap cargo.
+# import requests
+import pytoml as toml
+import dulwich.porcelain as git
+from glob import glob
+
+
+TARGET = None
+HOST = None
+GRAPH = None
+URLS_FILE = None
+CRATE_CACHE = None
+CRATES_INDEX = 'git://github.com/rust-lang/crates.io-index.git'
+CARGO_REPO = 'git://github.com/rust-lang/cargo.git'
+CRATE_API_DL = 'https://crates.io/api/v1/crates/%s/%s/download'
+SV_RANGE = re.compile(r'^(?P<op>(?:\<=|\>=|=|\<|\>|\^|\~))?\s*'
+ r'(?P<major>(?:\*|0|[1-9][0-9]*))'
+ r'(\.(?P<minor>(?:\*|0|[1-9][0-9]*)))?'
+ r'(\.(?P<patch>(?:\*|0|[1-9][0-9]*)))?'
+ r'(\-(?P<prerelease>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?'
+ r'(\+(?P<build>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?$')
+SEMVER = re.compile(r'^\s*(?P<major>(?:0|[1-9][0-9]*))'
+ r'(\.(?P<minor>(?:0|[1-9][0-9]*)))?'
+ r'(\.(?P<patch>(?:0|[1-9][0-9]*)))?'
+ r'(\-(?P<prerelease>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?'
+ r'(\+(?P<build>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?$')
+BSCRIPT = re.compile(r'^cargo:(?P<key>([^\s=]+))(=(?P<value>.+))?$')
+BNAME = re.compile('^(lib)?(?P<name>([^_]+))(_.*)?$')
+BUILT = {}
+CRATES = {}
+CVER = re.compile("-([^-]+)$")
+UNRESOLVED = []
+PFX = []
+BLACKLIST = []
+INCLUDE_OPTIONAL = []
+
+def dbgCtx(f):
+ def do_dbg(self, *cargs):
+ PFX.append(self.name())
+ ret = f(self, *cargs)
+ PFX.pop()
+ return ret
+ return do_dbg
+
+def dbg(s):
+ print '%s: %s' % (':'.join(PFX), s)
+
+
+class PreRelease(object):
+
+ def __init__(self, pr):
+ self._container = []
+ if pr is not None:
+ self._container += str(pr).split('.')
+
+ def __str__(self):
+ return '.'.join(self._container)
+
+    def __repr__(self):
+        # __repr__ must return a string, not the underlying list
+        return repr(self._container)
+
+ def __getitem__(self, key):
+ return self._container[key]
+
+ def __len__(self):
+ return len(self._container)
+
+ def __gt__(self, rhs):
+ return not ((self < rhs) or (self == rhs))
+
+ def __ge__(self, rhs):
+ return not (self < rhs)
+
+ def __le__(self, rhs):
+ return not (self > rhs)
+
+ def __eq__(self, rhs):
+ return self._container == rhs._container
+
+ def __ne__(self, rhs):
+ return not (self == rhs)
+
+ def __lt__(self, rhs):
+ if self == rhs:
+ return False
+
+        # not having a pre-release is higher precedence
+        if len(self) == 0:
+            # 1.0.0 > 1.0.0-alpha (and the equal case was handled above)
+            return False
+        elif len(rhs) == 0:
+            # 1.0.0-alpha < 1.0.0
+            return True
+
+ # if both have one, then longer pre-releases are higher precedence
+ if len(self) > len(rhs):
+ # 1.0.0-alpha.1 > 1.0.0-alpha
+ return False
+ elif len(self) < len(rhs):
+ # 1.0.0-alpha < 1.0.0-alpha.1
+ return True
+
+ # if both have the same length pre-release, must check each piece
+ # numeric sub-parts have lower precedence than non-numeric sub-parts
+ # non-numeric sub-parts are compared lexically in ASCII sort order
+ for l,r in zip(self, rhs):
+ if l.isdigit():
+ if r.isdigit():
+ if int(l) < int(r):
+ # 2 > 1
+ return True
+ elif int(l) > int(r):
+ # 1 < 2
+ return False
+ else:
+ # 1 == 1
+ continue
+ else:
+ # 1 < 'foo'
+ return True
+ else:
+ if r.isdigit():
+ # 'foo' > 1
+ return False
+
+ # both are non-numeric
+ if l < r:
+ return True
+ elif l > r:
+ return False
+
+ raise RuntimeError('PreRelease __lt__ failed')
+
+
+class Semver(dict):
+
+ def __init__(self, sv):
+ match = SEMVER.match(str(sv))
+ if match is None:
+ raise ValueError('%s is not a valid semver string' % sv)
+
+ self._input = sv
+ self.update(match.groupdict())
+ self.prerelease = PreRelease(self['prerelease'])
+
+ def __str__(self):
+ major, minor, patch, prerelease, build = self.parts_raw()
+ s = ''
+ if major is None:
+ s += '0'
+ else:
+ s += major
+ s += '.'
+ if minor is None:
+ s += '0'
+ else:
+ s += minor
+ s += '.'
+ if patch is None:
+ s += '0'
+ else:
+ s += patch
+ if len(self.prerelease):
+ s += '-' + str(self.prerelease)
+ if build is not None:
+ s += '+' + build
+ return s
+
+ def __hash__(self):
+ return hash(str(self))
+
+ def as_range(self):
+ return SemverRange('=%s' % self)
+
+ def parts(self):
+ major, minor, patch, prerelease, build = self.parts_raw()
+ if major is None:
+ major = '0'
+ if minor is None:
+ minor = '0'
+ if patch is None:
+ patch = '0'
+ return (int(major),int(minor),int(patch),prerelease,build)
+
+ def parts_raw(self):
+ return (self['major'],self['minor'],self['patch'],self['prerelease'],self['build'])
+
+ def __lt__(self, rhs):
+ lmaj,lmin,lpat,lpre,_ = self.parts()
+ rmaj,rmin,rpat,rpre,_ = rhs.parts()
+ if lmaj < rmaj:
+ return True
+ if lmaj > rmaj:
+ return False
+ if lmin < rmin:
+ return True
+ if lmin > rmin:
+ return False
+ if lpat < rpat:
+ return True
+ if lpat > rpat:
+ return False
+ if lpre is not None and rpre is None:
+ return True
+ if lpre is not None and rpre is not None:
+ if self.prerelease < rhs.prerelease:
+ return True
+ return False
+
+ def __le__(self, rhs):
+ return not (self > rhs)
+
+ def __gt__(self, rhs):
+ return not ((self < rhs) or (self == rhs))
+
+ def __ge__(self, rhs):
+ return not (self < rhs)
+
+ def __eq__(self, rhs):
+ # build metadata is only considered for equality
+ lmaj,lmin,lpat,lpre,lbld = self.parts()
+ rmaj,rmin,rpat,rpre,rbld = rhs.parts()
+ return lmaj == rmaj and \
+ lmin == rmin and \
+ lpat == rpat and \
+ lpre == rpre and \
+ lbld == rbld
+
+ def __ne__(self, rhs):
+ return not (self == rhs)
+
+
+class SemverRange(object):
+
+ def __init__(self, sv):
+ self._input = sv
+ self._lower = None
+ self._upper = None
+ self._op = None
+ self._semver = None
+
+ sv = str(sv)
+ svs = [x.strip() for x in sv.split(',')]
+
+ if len(svs) > 1:
+ self._op = '^'
+ for sr in svs:
+ rang = SemverRange(sr)
+ if rang.lower() is not None:
+ if self._lower is None or rang.lower() < self._lower:
+ self._lower = rang.lower()
+ if rang.upper() is not None:
+ if self._upper is None or rang.upper() > self._upper:
+ self._upper = rang.upper()
+ op, semver = rang.op_semver()
+ if semver is not None:
+ if op == '>=':
+ if self._lower is None or semver < self._lower:
+ self._lower = semver
+ if op == '<':
+ if self._upper is None or semver > self._upper:
+ self._upper = semver
+ return
+
+ match = SV_RANGE.match(sv)
+ if match is None:
+ raise ValueError('%s is not a valid semver range string' % sv)
+
+ svm = match.groupdict()
+ op, major, minor, patch, prerelease, build = svm['op'], svm['major'], svm['minor'], svm['patch'], svm['prerelease'], svm['build']
+ prerelease = PreRelease(prerelease)
+
+ # fix up the op
+ if op is None:
+ if major == '*' or minor == '*' or patch == '*':
+ op = '*'
+ else:
+ # if no op was specified and there are no wildcards, then op
+ # defaults to '^'
+ op = '^'
+ else:
+ self._semver = Semver(sv[len(op):])
+
+ if op not in ('<=', '>=', '<', '>', '=', '^', '~', '*'):
+ raise ValueError('%s is not a valid semver operator' % op)
+
+ self._op = op
+
+ # lower bound
+ def find_lower():
+ if op in ('<=', '<', '=', '>', '>='):
+ return None
+
+ if op == '*':
+ # wildcards specify a range
+ if major == '*':
+ return Semver('0.0.0')
+ elif minor == '*':
+ return Semver(major + '.0.0')
+ elif patch == '*':
+ return Semver(major + '.' + minor + '.0')
+ elif op == '^':
+ # caret specifies a range
+ if patch is None:
+ if minor is None:
+ # ^0 means >=0.0.0 and <1.0.0
+ return Semver(major + '.0.0')
+ else:
+ # ^0.0 means >=0.0.0 and <0.1.0
+ return Semver(major + '.' + minor + '.0')
+ else:
+ # ^0.0.1 means >=0.0.1 and <0.0.2
+ # ^0.1.2 means >=0.1.2 and <0.2.0
+ # ^1.2.3 means >=1.2.3 and <2.0.0
+ if int(major) == 0:
+ if int(minor) == 0:
+ # ^0.0.1
+ return Semver('0.0.' + patch)
+ else:
+ # ^0.1.2
+ return Semver('0.' + minor + '.' + patch)
+ else:
+ # ^1.2.3
+ return Semver(major + '.' + minor + '.' + patch)
+ elif op == '~':
+ # tilde specifies a minimal range
+ if patch is None:
+ if minor is None:
+ # ~0 means >=0.0.0 and <1.0.0
+ return Semver(major + '.0.0')
+ else:
+ # ~0.0 means >=0.0.0 and <0.1.0
+ return Semver(major + '.' + minor + '.0')
+ else:
+ # ~0.0.1 means >=0.0.1 and <0.1.0
+ # ~0.1.2 means >=0.1.2 and <0.2.0
+ # ~1.2.3 means >=1.2.3 and <1.3.0
+ return Semver(major + '.' + minor + '.' + patch)
+
+ raise RuntimeError('No lower bound')
+ self._lower = find_lower()
+
+ def find_upper():
+ if op in ('<=', '<', '=', '>', '>='):
+ return None
+
+ if op == '*':
+ # wildcards specify a range
+ if major == '*':
+ return None
+ elif minor == '*':
+ return Semver(str(int(major) + 1) + '.0.0')
+ elif patch == '*':
+ return Semver(major + '.' + str(int(minor) + 1) + '.0')
+ elif op == '^':
+ # caret specifies a range
+ if patch is None:
+ if minor is None:
+ # ^0 means >=0.0.0 and <1.0.0
+ return Semver(str(int(major) + 1) + '.0.0')
+ else:
+ # ^0.0 means >=0.0.0 and <0.1.0
+ return Semver(major + '.' + str(int(minor) + 1) + '.0')
+ else:
+ # ^0.0.1 means >=0.0.1 and <0.0.2
+ # ^0.1.2 means >=0.1.2 and <0.2.0
+ # ^1.2.3 means >=1.2.3 and <2.0.0
+ if int(major) == 0:
+ if int(minor) == 0:
+ # ^0.0.1
+ return Semver('0.0.' + str(int(patch) + 1))
+ else:
+ # ^0.1.2
+ return Semver('0.' + str(int(minor) + 1) + '.0')
+ else:
+ # ^1.2.3
+ return Semver(str(int(major) + 1) + '.0.0')
+ elif op == '~':
+ # tilde specifies a minimal range
+ if patch is None:
+ if minor is None:
+ # ~0 means >=0.0.0 and <1.0.0
+ return Semver(str(int(major) + 1) + '.0.0')
+ else:
+ # ~0.0 means >=0.0.0 and <0.1.0
+ return Semver(major + '.' + str(int(minor) + 1) + '.0')
+ else:
+ # ~0.0.1 means >=0.0.1 and <0.1.0
+ # ~0.1.2 means >=0.1.2 and <0.2.0
+ # ~1.2.3 means >=1.2.3 and <1.3.0
+ return Semver(major + '.' + str(int(minor) + 1) + '.0')
+
+ raise RuntimeError('No upper bound')
+ self._upper = find_upper()
+
+ def __repr__(self):
+ return "SemverRange(%s, op=%s, semver=%s, lower=%s, upper=%s)" % (repr(self._input), self._op, self._semver, self._lower, self._upper)
+
+ def __str__(self):
+ return self._input
+
+ def lower(self):
+ return self._lower
+
+ def upper(self):
+ return self._upper
+
+ def op_semver(self):
+ return self._op, self._semver
+
+ def compare(self, sv):
+ if not isinstance(sv, Semver):
+ sv = Semver(sv)
+
+ op = self._op
+ if op == '*':
+ if self._semver is not None and self._semver['major'] == '*':
+ return sv >= Semver('0.0.0')
+ if self._lower is not None and sv < self._lower:
+ return False
+ if self._upper is not None and sv >= self._upper:
+ return False
+ return True
+ elif op == '^':
+ return (sv >= self._lower) and (sv < self._upper)
+ elif op == '~':
+ return (sv >= self._lower) and (sv < self._upper)
+ elif op == '<=':
+ return sv <= self._semver
+ elif op == '>=':
+ return sv >= self._semver
+ elif op == '<':
+ return sv < self._semver
+ elif op == '>':
+ return sv > self._semver
+ elif op == '=':
+ return sv == self._semver
+
+ raise RuntimeError('Semver comparison failed to find a matching op')
+
+
+def test_semver():
+ """
+ Tests for Semver parsing. Run using py.test: py.test bootstrap.py
+ """
+ assert str(Semver("1")) == "1.0.0"
+ assert str(Semver("1.1")) == "1.1.0"
+ assert str(Semver("1.1.1")) == "1.1.1"
+ assert str(Semver("1.1.1-alpha")) == "1.1.1-alpha"
+ assert str(Semver("1.1.1-alpha.1")) == "1.1.1-alpha.1"
+ assert str(Semver("1.1.1-alpha+beta")) == "1.1.1-alpha+beta"
+ assert str(Semver("1.1.1-alpha+beta.1")) == "1.1.1-alpha+beta.1"
+
+def test_semver_eq():
+ assert Semver("1") == Semver("1.0.0")
+ assert Semver("1.1") == Semver("1.1.0")
+ assert Semver("1.1.1") == Semver("1.1.1")
+ assert Semver("1.1.1-alpha") == Semver("1.1.1-alpha")
+ assert Semver("1.1.1-alpha.1") == Semver("1.1.1-alpha.1")
+ assert Semver("1.1.1-alpha+beta") == Semver("1.1.1-alpha+beta")
+ assert Semver("1.1.1-alpha.1+beta") == Semver("1.1.1-alpha.1+beta")
+ assert Semver("1.1.1-alpha.1+beta.1") == Semver("1.1.1-alpha.1+beta.1")
+
+def test_semver_comparison():
+ assert Semver("1") < Semver("2.0.0")
+ assert Semver("1.1") < Semver("1.2.0")
+ assert Semver("1.1.1") < Semver("1.1.2")
+ assert Semver("1.1.1-alpha") < Semver("1.1.1")
+ assert Semver("1.1.1-alpha") < Semver("1.1.1-beta")
+ assert Semver("1.1.1-alpha") < Semver("1.1.1-beta")
+ assert Semver("1.1.1-alpha") < Semver("1.1.1-alpha.1")
+ assert Semver("1.1.1-alpha.1") < Semver("1.1.1-alpha.2")
+ assert Semver("1.1.1-alpha+beta") < Semver("1.1.1+beta")
+ assert Semver("1.1.1-alpha+beta") < Semver("1.1.1-beta+beta")
+ assert Semver("1.1.1-alpha+beta") < Semver("1.1.1-beta+beta")
+ assert Semver("1.1.1-alpha+beta") < Semver("1.1.1-alpha.1+beta")
+ assert Semver("1.1.1-alpha.1+beta") < Semver("1.1.1-alpha.2+beta")
+ assert Semver("0.5") < Semver("2.0")
+ assert not (Semver("2.0") < Semver("0.5"))
+ assert not (Semver("0.5") > Semver("2.0"))
+ assert not (Semver("0.5") >= Semver("2.0"))
+ assert Semver("2.0") >= Semver("0.5")
+ assert Semver("2.0") > Semver("0.5")
+ assert not (Semver("2.0") > Semver("2.0"))
+ assert not (Semver("2.0") < Semver("2.0"))
+
+def test_semver_range():
+ def bounds(spec, lowe, high):
+ lowe = Semver(lowe) if lowe is not None else lowe
+ high = Semver(high) if high is not None else high
+ assert SemverRange(spec).lower() == lowe and SemverRange(spec).upper() == high
+ bounds('0', '0.0.0', '1.0.0')
+ bounds('0.0', '0.0.0', '0.1.0')
+ bounds('0.0.0', '0.0.0', '0.0.1')
+ bounds('0.0.1', '0.0.1', '0.0.2')
+ bounds('0.1.1', '0.1.1', '0.2.0')
+ bounds('1.1.1', '1.1.1', '2.0.0')
+ bounds('^0', '0.0.0', '1.0.0')
+ bounds('^0.0', '0.0.0', '0.1.0')
+ bounds('^0.0.0', '0.0.0', '0.0.1')
+ bounds('^0.0.1', '0.0.1', '0.0.2')
+ bounds('^0.1.1', '0.1.1', '0.2.0')
+ bounds('^1.1.1', '1.1.1', '2.0.0')
+ bounds('~0', '0.0.0', '1.0.0')
+ bounds('~0.0', '0.0.0', '0.1.0')
+ bounds('~0.0.0', '0.0.0', '0.1.0')
+ bounds('~0.0.1', '0.0.1', '0.1.0')
+ bounds('~0.1.1', '0.1.1', '0.2.0')
+ bounds('~1.1.1', '1.1.1', '1.2.0')
+ bounds('*', '0.0.0', None)
+ bounds('0.*', '0.0.0', '1.0.0')
+ bounds('0.0.*', '0.0.0', '0.1.0')
+
+
+def test_semver_multirange():
+ assert SemverRange(">= 0.5, < 2.0").compare("1.0.0")
+ assert SemverRange("*").compare("0.2.7")
+
+
+class Runner(object):
+
+ def __init__(self, c, e, cwd=None):
+ self._cmd = c
+ if not isinstance(self._cmd, list):
+ self._cmd = [self._cmd]
+ self._env = e
+ self._stdout = []
+ self._stderr = []
+ self._returncode = 0
+ self._cwd = cwd
+
+ def __call__(self, c, e):
+ cmd = self._cmd + c
+ env = dict(self._env, **e)
+ #dbg(' env: %s' % env)
+ #dbg(' cwd: %s' % self._cwd)
+ envstr = ''
+ for k, v in env.iteritems():
+ envstr += ' %s="%s"' % (k, v)
+ if self._cwd is not None:
+ dbg('cd %s && %s %s' % (self._cwd, envstr, ' '.join(cmd)))
+ else:
+ dbg('%s %s' % (envstr, ' '.join(cmd)))
+
+ proc = subprocess.Popen(cmd, env=env,
+ stdout=subprocess.PIPE, stderr=subprocess.PIPE,
+ cwd=self._cwd)
+ out, err = proc.communicate()
+
+ for lo in out.split('\n'):
+ if len(lo) > 0:
+ self._stdout.append(lo)
+ #dbg('out: %s' % lo)
+
+ for le in err.split('\n'):
+ if len(le) > 0:
+ self._stderr.append(le)
+ dbg(le)
+
+ """
+ while proc.poll() is None:
+ lo = proc.stdout.readline().rstrip('\n')
+ le = proc.stderr.readline().rstrip('\n')
+ if len(lo) > 0:
+ self._stdout.append(lo)
+ dbg(lo)
+ sys.stdout.flush()
+ if len(le) > 0:
+ self._stderr.append(le)
+ dbg('err: %s', le)
+ sys.stdout.flush()
+ """
+ self._returncode = proc.wait()
+ #dbg(' ret: %s' % self._returncode)
+ return self._stdout
+
+ def output(self):
+ return self._stdout
+
+ def returncode(self):
+ return self._returncode
+
+class RustcRunner(Runner):
+
+ def __call__(self, c, e):
+ super(RustcRunner, self).__call__(c, e)
+ return ([], {}, {})
+
+class BuildScriptRunner(Runner):
+
+ def __call__(self, c, e):
+ #dbg('XXX Running build script:');
+ #dbg(' env: %s' % e)
+ #dbg(' '.join(self._cmd + c))
+ super(BuildScriptRunner, self).__call__(c, e)
+
+ # parse the output for cargo: lines
+ cmd = []
+ env = {}
+ denv = {}
+ for l in self.output():
+ match = BSCRIPT.match(str(l))
+ if match is None:
+ continue
+ pieces = match.groupdict()
+ k = pieces['key']
+ v = pieces['value']
+
+ if k == 'rustc-link-lib':
+ #dbg('YYYYYY: adding -l %s' % v)
+ cmd += ['-l', v]
+ elif k == 'rustc-link-search':
+ #dbg("adding link search path: %s" % v)
+ cmd += ['-L', v]
+ elif k == 'rustc-cfg':
+ cmd += ['--cfg', v]
+ env['CARGO_FEATURE_%s' % v.upper().replace('-', '_')] = 1
+ else:
+ #dbg("env[%s] = %s" % (k, v));
+ denv[k] = v
+ return (cmd, env, denv)
+
+class Crate(object):
+
+ def __init__(self, crate, ver, deps, cdir, build):
+ self._crate = str(crate)
+ self._version = Semver(ver)
+ self._dep_info = deps
+ self._dir = cdir
+ # put the build scripts first
+ self._build = [x for x in build if x.get('type') == 'build_script']
+ # then add the lib/bin builds
+ self._build += [x for x in build if x.get('type') != 'build_script']
+ self._resolved = False
+ self._deps = {}
+ self._refs = []
+ self._env = {}
+ self._dep_env = {}
+ self._extra_flags = []
+
+ def name(self):
+ return self._crate
+
+ def dep_info(self):
+ return self._dep_info
+
+ def version(self):
+ return self._version
+
+ def dir(self):
+ return self._dir
+
+ def __str__(self):
+ return '%s-%s' % (self.name(), self.version())
+
+ def add_dep(self, crate, features):
+ if str(crate) in self._deps:
+ return
+
+ features = [str(x) for x in features]
+ self._deps[str(crate)] = { 'features': features }
+ crate.add_ref(self)
+
+ def add_ref(self, crate):
+ if str(crate) not in self._refs:
+ self._refs.append(str(crate))
+
+ def resolved(self):
+ return self._resolved
+
+ @dbgCtx
+ def resolve(self, tdir, idir, nodl, graph=None):
+ if self._resolved:
+ return
+ if str(self) in CRATES:
+ return
+
+ if self._dep_info is not None:
+ print ''
+ dbg('Resolving dependencies for: %s' % str(self))
+ for d in self._dep_info:
+ kind = d.get('kind', 'normal')
+ if kind not in ('normal', 'build'):
+ print ''
+ dbg('Skipping %s dep %s' % (kind, d['name']))
+ continue
+
+ optional = d.get('optional', False)
+ if optional and d['name'] not in INCLUDE_OPTIONAL:
+ print ''
+ dbg('Skipping optional dep %s' % d['name'])
+ continue
+
+ svr = SemverRange(d['req'])
+ print ''
+ deps = []
+ dbg('Looking up info for %s %s' % (d['name'], str(svr)))
+ if d.get('local', None) is None:
+                    # go through CRATES first to see if the dependency is satisfied already
+ dcrate = find_crate_by_name_and_semver(d['name'], svr)
+ if dcrate is not None:
+ #import pdb; pdb.set_trace()
+ svr = dcrate.version().as_range()
+ name, ver, ideps, ftrs, cksum = crate_info_from_index(idir, d['name'], svr)
+ if name in BLACKLIST:
+ dbg('Found in blacklist, skipping %s' % (name))
+ elif dcrate is None:
+ if nodl:
+ cdir = find_downloaded_crate(tdir, name, svr)
+ else:
+ cdir = dl_and_check_crate(tdir, name, ver, cksum)
+ _, tver, tdeps, build = crate_info_from_toml(cdir)
+ deps += ideps
+ deps += tdeps
+ else:
+ dbg('Found crate already satisfying %s %s' % (d['name'], str(svr)))
+ deps += dcrate.dep_info()
+                else:
+                    cdir = d['path']
+                    # local-path deps have no index entry
+                    dcrate = None
+                    ftrs = []
+                    name, ver, ideps, build = crate_info_from_toml(cdir)
+                    deps += ideps
+
+ if name not in BLACKLIST:
+ try:
+ if dcrate is None:
+ dcrate = Crate(name, ver, deps, cdir, build)
+ if str(dcrate) in CRATES:
+ dcrate = CRATES[str(dcrate)]
+ UNRESOLVED.append(dcrate)
+ if graph is not None:
+ print >> graph, '"%s" -> "%s";' % (str(self), str(dcrate))
+
+            except Exception:
+ dcrate = None
+
+ # clean up the list of features that are enabled
+ tftrs = d.get('features', [])
+ if isinstance(tftrs, dict):
+ tftrs = tftrs.keys()
+ else:
+ tftrs = [x for x in tftrs if len(x) > 0]
+
+ # add 'default' if default_features is true
+ if d.get('default_features', True):
+ tftrs.append('default')
+
+ features = []
+ if isinstance(ftrs, dict):
+ # add any available features that are activated by the
+ # dependency entry in the parent's dependency record,
+ # and any features they depend on recursively
+ def add_features(f):
+ if f in ftrs:
+ for k in ftrs[f]:
+ # guard against infinite recursion
+ if not k in features:
+ features.append(k)
+ add_features(k)
+ for k in tftrs:
+ add_features(k)
+ else:
+ features += [x for x in ftrs if (len(x) > 0) and (x in tftrs)]
+
+ if dcrate is not None:
+ self.add_dep(dcrate, features)
+
+ self._resolved = True
+ CRATES[str(self)] = self
+
+ @dbgCtx
+ def build(self, by, out_dir, features=[]):
+ extra_filename = '-' + str(self.version()).replace('.','_')
+ output_name = self.name().replace('-','_')
+ output = os.path.join(out_dir, 'lib%s%s.rlib' % (output_name, extra_filename))
+
+ if str(self) in BUILT:
+ return ({'name':self.name(), 'lib':output}, self._env, self._extra_flags)
+
+ externs = []
+ extra_flags = []
+ for dep,info in self._deps.iteritems():
+ if dep in CRATES:
+ extern, env, extra_flags = CRATES[dep].build(self, out_dir, info['features'])
+ externs.append(extern)
+ self._dep_env[CRATES[dep].name()] = env
+ self._extra_flags += extra_flags
+
+ if os.path.isfile(output):
+ print ''
+ dbg('Skipping %s, already built (needed by: %s)' % (str(self), str(by)))
+ BUILT[str(self)] = str(by)
+ return ({'name':self.name(), 'lib':output}, self._env, self._extra_flags)
+
+ # build the environment for subcommands
+ tenv = dict(os.environ)
+ env = {}
+ env['PATH'] = tenv['PATH']
+ env['OUT_DIR'] = out_dir
+ env['TARGET'] = TARGET
+ env['HOST'] = HOST
+ env['NUM_JOBS'] = '1'
+ env['OPT_LEVEL'] = '0'
+ env['DEBUG'] = '0'
+ env['PROFILE'] = 'release'
+ env['CARGO_MANIFEST_DIR'] = self.dir()
+ env['CARGO_PKG_VERSION_MAJOR'] = self.version()['major']
+ env['CARGO_PKG_VERSION_MINOR'] = self.version()['minor']
+ env['CARGO_PKG_VERSION_PATCH'] = self.version()['patch']
+ pre = self.version()['prerelease']
+ if pre is None:
+ pre = ''
+ env['CARGO_PKG_VERSION_PRE'] = pre
+ env['CARGO_PKG_VERSION'] = str(self.version())
+ for f in features:
+ env['CARGO_FEATURE_%s' % f.upper().replace('-','_')] = '1'
+ for l,e in self._dep_env.iteritems():
+ for k,v in e.iteritems():
+ if type(v) is not str and type(v) is not unicode:
+ v = str(v)
+                env['DEP_%s_%s' % (l.upper(), k.upper())] = v
+
+        # create the builders, build scripts are first
+ cmds = []
+ for b in self._build:
+ v = str(self._version).replace('.','_')
+ cmd = ['rustc']
+ cmd.append(os.path.join(self._dir, b['path']))
+ cmd.append('--crate-name')
+ if b['type'] == 'lib':
+ b.setdefault('name', self.name())
+ cmd.append(b['name'].replace('-','_'))
+ cmd.append('--crate-type')
+ cmd.append('lib')
+ elif b['type'] == 'build_script':
+ cmd.append('build_script_%s' % b['name'].replace('-','_'))
+ cmd.append('--crate-type')
+ cmd.append('bin')
+ else:
+ cmd.append(b['name'].replace('-','_'))
+ cmd.append('--crate-type')
+ cmd.append('bin')
+
+ for f in features:
+ cmd.append('--cfg')
+ cmd.append('feature=\"%s\"' % f)
+
+ cmd.append('-C')
+ cmd.append('extra-filename=' + extra_filename)
+
+ cmd.append('--out-dir')
+ cmd.append('%s' % out_dir)
+ cmd.append('--emit=dep-info,link')
+ cmd.append('--target')
+ cmd.append(TARGET)
+ cmd.append('-L')
+ cmd.append('%s' % out_dir)
+ cmd.append('-L')
+ cmd.append('%s/lib' % out_dir)
+
+
+ # add in the flags from dependencies
+ cmd += self._extra_flags
+
+ for e in externs:
+ cmd.append('--extern')
+ cmd.append('%s=%s' % (e['name'].replace('-','_'), e['lib']))
+
+ # get the pkg key name
+ match = BNAME.match(b['name'])
+ if match is not None:
+ match = match.groupdict()['name'].replace('-','_')
+
+ # queue up the runner
+ cmds.append({'name':b['name'], 'env_key':match, 'cmd':RustcRunner(cmd, env)})
+
+ # queue up the build script runner
+ if b['type'] == 'build_script':
+ bcmd = os.path.join(out_dir, 'build_script_%s-%s' % (b['name'], v))
+ cmds.append({'name':b['name'], 'env_key':match, 'cmd':BuildScriptRunner(bcmd, env, self._dir)})
+
+ print ''
+ dbg('Building %s (needed by: %s)' % (str(self), str(by)))
+
+ bcmd = []
+ benv = {}
+ for c in cmds:
+ runner = c['cmd']
+
+ (c1, e1, e2) = runner(bcmd, benv)
+
+ if runner.returncode() != 0:
+ raise RuntimeError('build command failed: %s' % runner.returncode())
+
+ bcmd += c1
+ benv = dict(benv, **e1)
+
+ key = c['env_key']
+ for k,v in e2.iteritems():
+ self._env['DEP_%s_%s' % (key.upper(), k.upper())] = v
+
+ #dbg('XXX cmd: %s' % bcmd)
+ #dbg('XXX env: %s' % benv)
+ #dbg('XXX denv: %s' % self._env)
+ #print ''
+
+ BUILT[str(self)] = str(by)
+ return ({'name':self.name(), 'lib':output}, self._env, bcmd)
+
+
+def dl_crate(url, depth=0):
+ if depth > 10:
+ raise RuntimeError('too many redirects')
+
+ r = requests.get(url)
+ try:
+ dbg('%sconnected to %s...%s' % ((' ' * depth), r.url, r.status_code))
+
+ if URLS_FILE is not None:
+ with open(URLS_FILE, "a") as f:
+ f.write(r.url + "\n")
+
+ return r.content
+ finally:
+ r.close()
+
+def dl_and_check_crate(tdir, name, ver, cksum):
+ cname = '%s-%s' % (name, ver)
+ cdir = os.path.join(tdir, cname)
+ if cname in CRATES:
+ dbg('skipping %s...already downloaded' % cname)
+ return cdir
+
+ def check_checksum(buf):
+ if (cksum is not None):
+ h = hashlib.sha256()
+ h.update(buf)
+ if h.hexdigest() == cksum:
+ dbg('Checksum is good...%s' % cksum)
+ else:
+ dbg('Checksum is BAD (%s != %s)' % (h.hexdigest(), cksum))
+
+ if CRATE_CACHE:
+ cachename = os.path.join(CRATE_CACHE, "%s.crate" % (cname))
+ if os.path.isfile(cachename):
+ dbg('found crate in cache...%s.crate' % (cname))
+ buf = open(cachename).read()
+ check_checksum(buf)
+ with tarfile.open(fileobj=cStringIO.StringIO(buf)) as tf:
+ dbg('unpacking result to %s...' % cdir)
+ tf.extractall(path=tdir)
+ return cdir
+
+ if not os.path.isdir(cdir):
+ dbg('Downloading %s source to %s' % (cname, cdir))
+ dl = CRATE_API_DL % (name, ver)
+ buf = dl_crate(dl)
+ check_checksum(buf)
+
+ if CRATE_CACHE:
+ dbg("saving crate to %s/%s.crate..." % (CRATE_CACHE, cname))
+ with open(os.path.join(CRATE_CACHE, "%s.crate" % (cname)), "wb") as f:
+ f.write(buf)
+
+ fbuf = cStringIO.StringIO(buf)
+ with tarfile.open(fileobj=fbuf) as tf:
+ dbg('unpacking result to %s...' % cdir)
+ tf.extractall(path=tdir)
+
+ return cdir
+
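As an aside, the checksum verification performed by `check_checksum` above boils down to comparing SHA-256 digests of the downloaded bytes against the checksum published in the registry index. A minimal standalone sketch (`sha256_matches` is an illustrative name, not part of the script):

```python
import hashlib

def sha256_matches(buf, expected):
    # hash the downloaded bytes and compare against the registry checksum
    return hashlib.sha256(buf).hexdigest() == expected

data = b'example crate bytes'
good = hashlib.sha256(data).hexdigest()
print(sha256_matches(data, good))      # True
print(sha256_matches(data, '0' * 64))  # False
```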
+
+def find_downloaded_crate(tdir, name, svr):
+ exists = glob("%s/%s-[0-9]*" % (tdir, name))
+ if not exists:
+ raise RuntimeError("crate does not exist and have --no-download: %s" % name)
+
+ # First, grok the available versions.
+ aver = sorted([Semver(CVER.search(x).group(1)) for x in exists])
+
+ # Now filter the "suitable" versions based on our version range.
+ sver = filter(svr.compare, aver)
+ if not sver:
+ raise RuntimeError("unable to satisfy dependency %s %s from %s; try running without --no-download" % (name, svr, map(str, aver)))
+
+ cver = sver[-1]
+ return "%s/%s-%s" % (tdir, name, cver)
+
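The selection logic in `find_downloaded_crate` above (keep the highest available version that satisfies the requested range) can be sketched with plain tuples; this simplification ignores the pre-release and build-metadata handling a full `Semver` implementation would need, and the names are illustrative:

```python
def parse_ver(s):
    # crude three-component version parse, e.g. '1.2.3' -> (1, 2, 3)
    return tuple(int(x) for x in s.split('.'))

def pick_newest(versions, ok):
    # filter the suitable versions, then take the highest one
    suitable = sorted(v for v in versions if ok(v))
    if not suitable:
        raise RuntimeError('unable to satisfy dependency')
    return suitable[-1]

vers = [parse_ver(v) for v in ('0.9.2', '1.0.0', '1.2.3')]
print(pick_newest(vers, lambda v: v >= (1, 0, 0)))  # (1, 2, 3)
```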
+
+def crate_info_from_toml(cdir):
+ try:
+ with open(os.path.join(cdir, 'Cargo.toml'), 'rb') as ctoml:
+ cfg = toml.load(ctoml)
+ build = []
+ p = cfg.get('package',cfg.get('project', {}))
+ name = p.get('name', None)
+ ver = p.get('version', None)
+ if (name is None) or (ver is None):
+ raise RuntimeError('invalid .toml file format')
+
+ # look for a "links" item
+ lnks = p.get('links', [])
+ if type(lnks) is not list:
+ lnks = [lnks]
+
+ # look for a "build" item
+ bf = p.get('build', None)
+
+ # if we have a 'links', there must be a 'build'
+ if len(lnks) > 0 and bf is None:
+ raise RuntimeError('cargo requires a "build" item if "links" is specified')
+
+ # there can be target specific build script overrides
+ boverrides = {}
+ for lnk in lnks:
+ boverrides.update(cfg.get('target', {}).get(TARGET, {}).get(lnk, {}))
+
+ bmain = False
+ if bf is not None:
+ build.append({'type':'build_script', \
+ 'path':[ bf ], \
+ 'name':name.replace('-','_'), \
+ 'links': lnks, \
+ 'overrides': boverrides})
+
+ # look for libs array
+ libs = cfg.get('lib', [])
+ if type(libs) is not list:
+ libs = [libs]
+ for l in libs:
+ l['type'] = 'lib'
+ l['links'] = lnks
+ if l.get('path', None) is None:
+ l['path'] = [ 'lib.rs' ]
+ build.append(l)
+ bmain = True
+
+ # look for bins array
+ bins = cfg.get('bin', [])
+ if type(bins) is not list:
+ bins = [bins]
+ for b in bins:
+ if b.get('path', None) is None:
+ b['path'] = [ os.path.join('bin', '%s.rs' % b['name']), os.path.join('bin', 'main.rs'), '%s.rs' % b['name'], 'main.rs' ]
+ build.append({'type': 'bin', \
+ 'name':b['name'], \
+ 'path':b['path'], \
+ 'links': lnks})
+ bmain = True
+
+ # if no explicit directions on what to build, then add a default
+ if not bmain:
+ build.append({'type':'lib', 'path':'lib.rs', 'name':name.replace('-','_')})
+
+ for b in build:
+ # make sure the path is a list of possible paths
+ if type(b['path']) is not list:
+ b['path'] = [ b['path'] ]
+ bin_paths = []
+ for p in b['path']:
+ bin_paths.append(os.path.join(cdir, p))
+ bin_paths.append(os.path.join(cdir, 'src', p))
+
+ found_path = None
+ for p in bin_paths:
+ if os.path.isfile(p):
+ found_path = p
+ break
+
+ if found_path is None:
+ raise RuntimeError('could not find %s to build in %s' % (build, cdir))
+ else:
+ b['path'] = found_path
+
+ d = cfg.get('build-dependencies', {})
+ d.update(cfg.get('dependencies', {}))
+ d.update(cfg.get('target', {}).get(TARGET, {}).get('dependencies', {}))
+ deps = []
+ for k,v in d.iteritems():
+ if type(v) is not dict:
+ deps.append({'name':k, 'req': v})
+ elif 'path' in v:
+ if v.get('version', None) is None:
+ deps.append({'name':k, 'path':os.path.join(cdir, v['path']), 'local':True, 'req':0})
+ else:
+ opts = v.get('optional',False)
+ ftrs = v.get('features',[])
+ deps.append({'name':k, 'path': v['path'], 'req':v['version'], 'features':ftrs, 'optional':opts})
+ else:
+ opts = v.get('optional',False)
+ ftrs = v.get('features',[])
+ deps.append({'name':k, 'req':v['version'], 'features':ftrs, 'optional':opts})
+
+ return (name, ver, deps, build)
+
+ except Exception, e:
+ dbg('failed to load toml file for: %s (%s)' % (cdir, str(e)))
+
+ return (None, None, [], 'lib.rs')
+
+
+def crate_info_from_index(idir, name, svr):
+ if len(name) == 1:
+ ipath = os.path.join(idir, '1', name)
+ elif len(name) == 2:
+ ipath = os.path.join(idir, '2', name)
+ elif len(name) == 3:
+ ipath = os.path.join(idir, '3', name[0:1], name)
+ else:
+ ipath = os.path.join(idir, name[0:2], name[2:4], name)
+
+ dbg('opening crate info: %s' % ipath)
+ dep_infos = []
+ with open(ipath, 'rb') as fin:
+ lines = fin.readlines()
+ for l in lines:
+ dep_infos.append(json.loads(l))
+
+ passed = {}
+ for info in dep_infos:
+ if 'vers' not in info:
+ continue
+ sv = Semver(info['vers'])
+ if svr.compare(sv):
+ passed[sv] = info
+
+ if not passed:
+ raise RuntimeError('no version of %s satisfies %s' % (name, svr))
+ best_match = sorted(passed.iterkeys()).pop()
+ dbg('best match is %s-%s' % (name, best_match))
+ best_info = passed[best_match]
+ name = best_info.get('name', None)
+ ver = best_info.get('vers', None)
+ deps = best_info.get('deps', [])
+ ftrs = best_info.get('features', [])
+ cksum = best_info.get('cksum', None)
+
+ # only include deps without a 'target' or ones with matching 'target'
+ deps = [x for x in deps if x.get('target', TARGET) == TARGET]
+
+ return (name, ver, deps, ftrs, cksum)
+
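For reference, the lookup above follows the crates.io registry index layout: names of length one to three go into `1/`, `2/` and `3/<first-letter>/` buckets, and longer names are sharded on their first four characters. A standalone sketch of that mapping (`index_path` is an illustrative name, not from the script):

```python
import os

def index_path(idir, name):
    # registry index layout: length buckets for short crate names,
    # two-plus-two character sharding for everything else
    if len(name) == 1:
        return os.path.join(idir, '1', name)
    if len(name) == 2:
        return os.path.join(idir, '2', name)
    if len(name) == 3:
        return os.path.join(idir, '3', name[0], name)
    return os.path.join(idir, name[0:2], name[2:4], name)

print(index_path('index', 'log'))    # index/3/l/log on POSIX
print(index_path('index', 'serde')) # index/se/rd/serde on POSIX
```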
+
+def find_crate_by_name_and_semver(name, svr):
+ for c in CRATES.itervalues():
+ if c.name() == name and svr.compare(c.version()):
+ return c
+ for c in UNRESOLVED:
+ if c.name() == name and svr.compare(c.version()):
+ return c
+ return None
+
+
+def args_parser():
+ parser = argparse.ArgumentParser(description='Cargo Bootstrap Tool')
+ parser.add_argument('--cargo-root', type=str, default=os.getcwd(),
+ help="specify the cargo repo root path")
+ parser.add_argument('--target-dir', type=str, default=tempfile.mkdtemp(),
+ help="specify the path for storing built dependency libs")
+ parser.add_argument('--crate-index', type=str, default=None,
+ help="path to where the crate index should be cloned")
+ parser.add_argument('--target', type=str, default=None,
+ help="target triple for machine we're bootstrapping for")
+ parser.add_argument('--host', type=str, default=None,
+ help="host triple for machine we're bootstrapping on")
+ parser.add_argument('--no-clone', action='store_true',
+ help="skip cloning crates index, --crate-index must point to an existing clone of the crates index")
+ parser.add_argument('--no-git', action='store_true',
+ help="don't assume that the crates index and cargo root are git repos; implies --no-clone")
+ parser.add_argument('--no-clean', action='store_true',
+ help="don't delete the target dir and crate index")
+ parser.add_argument('--download', action='store_true',
+ help="only download the crates needed to build cargo")
+ parser.add_argument('--no-download', action='store_true',
+ help="don't download any crates (fail if any do not exist)")
+ parser.add_argument('--graph', action='store_true',
+ help="output a dot graph of the dependencies")
+ parser.add_argument('--urls-file', type=str, default=None,
+ help="file to write crate URLs to")
+ parser.add_argument('--blacklist', type=str, default="",
+ help="space-separated list of crates to skip")
+ parser.add_argument('--include-optional', type=str, default="",
+ help="space-separated list of optional crates to include")
+ parser.add_argument('--patchdir', type=str,
+ help="directory with patches to apply after downloading crates. organized by crate/NNNN-description.patch")
+ parser.add_argument('--crate-cache', type=str,
+ help="download and save crates to crate cache (directory)")
+ return parser
+
+
+def open_or_clone_repo(rdir, rurl, no_clone):
+ try:
+ return git.open_repo(rdir)
+ except Exception:
+ repo = None
+
+ if repo is None and no_clone is False:
+ dbg('Cloning %s to %s' % (rurl, rdir))
+ return git.clone(rurl, rdir)
+
+ if repo is None and no_clone is True:
+ repo = rdir
+
+ return repo
+
+
+def patch_crates(targetdir, patchdir):
+ """
+ Apply patches in patchdir to downloaded crates
+ patchdir organization:
+
+ <patchdir>/
+ <crate>/
+ <patch>.patch
+ """
+ for patch in glob(os.path.join(patchdir, '*', '*.patch')):
+ crateid = os.path.basename(os.path.dirname(patch))
+ m = re.match(r'^([A-Za-z0-9_-]+?)(?:-([\d.]+))?$', crateid)
+ if m:
+ cratename = m.group(1)
+ else:
+ cratename = crateid
+ if cratename != crateid:
+ dirs = glob(os.path.join(targetdir, crateid))
+ else:
+ dirs = glob(os.path.join(targetdir, '%s-*' % (cratename)))
+ for cratedir in dirs:
+ # check if patch has been applied
+ patchpath = os.path.abspath(patch)
+ p = subprocess.Popen(['patch', '--dry-run', '-s', '-f', '-F', '10', '-p1', '-i', patchpath], cwd=cratedir)
+ rc = p.wait()
+ if rc == 0:
+ dbg("patching %s with patch %s" % (os.path.basename(cratedir), os.path.basename(patch)))
+ p = subprocess.Popen(['patch', '-s', '-F', '10', '-p1', '-i', patchpath], cwd=cratedir)
+ rc = p.wait()
+ if rc != 0:
+ dbg("%s: failed to apply %s (rc=%s)" % (os.path.basename(cratedir), os.path.basename(patch), rc))
+ else:
+ dbg("%s: %s does not apply (rc=%s)" % (os.path.basename(cratedir), os.path.basename(patch), rc))
+
+
+if __name__ == "__main__":
+ try:
+ # parse args
+ parser = args_parser()
+ args = parser.parse_args()
+
+ # clone the cargo index
+ if args.crate_index is None:
+ args.crate_index = os.path.normpath(os.path.join(args.target_dir, 'index'))
+ dbg('cargo: %s, target: %s, index: %s' % \
+ (args.cargo_root, args.target_dir, args.crate_index))
+
+ TARGET = args.target
+ HOST = args.host
+ URLS_FILE = args.urls_file
+ BLACKLIST = args.blacklist.split()
+ INCLUDE_OPTIONAL = args.include_optional.split()
+ if args.crate_cache and os.path.isdir(args.crate_cache):
+ CRATE_CACHE = os.path.abspath(args.crate_cache)
+
+ if not args.no_git:
+ index = open_or_clone_repo(args.crate_index, CRATES_INDEX, args.no_clone)
+ cargo = open_or_clone_repo(args.cargo_root, CARGO_REPO, args.no_clone)
+
+ if index is None:
+ raise RuntimeError('You must have a local clone of the crates index, ' \
+ 'omit --no-clone to allow this script to clone it for ' \
+ 'you, or pass --no-git to bypass this check.')
+ if cargo is None:
+ raise RuntimeError('You must have a local clone of the cargo repo ' \
+ 'so that this script can read the cargo toml file.')
+
+ if TARGET is None:
+ raise RuntimeError('You must specify the target triple of this machine')
+ if HOST is None:
+ HOST = TARGET
+
+ except Exception, e:
+ frame = inspect.trace()[-1]
+ print >> sys.stderr, "\nException:\n from %s, line %d:\n %s\n" % (frame[1], frame[2], e)
+ parser.print_help()
+ if not args.no_clean:
+ print "cleaning up %s" % (args.target_dir)
+ shutil.rmtree(args.target_dir)
+ sys.exit(1)
+
+ try:
+
+ # load cargo deps
+ name, ver, deps, build = crate_info_from_toml(args.cargo_root)
+ cargo_crate = Crate(name, ver, deps, args.cargo_root, build)
+ UNRESOLVED.append(cargo_crate)
+
+ if args.graph:
+ GRAPH = open(os.path.join(args.target_dir, 'deps.dot'), 'wb')
+ print >> GRAPH, "digraph %s {" % name
+
+ # resolve and download all of the dependencies
+ print ''
+ print '===================================='
+ print '===== DOWNLOADING DEPENDENCIES ====='
+ print '===================================='
+ while len(UNRESOLVED) > 0:
+ crate = UNRESOLVED.pop(0)
+ crate.resolve(args.target_dir, args.crate_index, args.no_download, GRAPH)
+
+ if args.graph:
+ print >> GRAPH, "}"
+ GRAPH.close()
+
+ if args.patchdir:
+ print ''
+ print '========================'
+ print '===== PATCH CRATES ====='
+ print '========================'
+ patch_crates(args.target_dir, args.patchdir)
+
+ if args.download:
+ print "done downloading..."
+ sys.exit(0)
+
+ # build cargo
+ print ''
+ print '=========================='
+ print '===== BUILDING CARGO ====='
+ print '=========================='
+ cargo_crate.build('bootstrap.py', args.target_dir)
+
+ # cleanup
+ if not args.no_clean:
+ print "cleaning up %s..." % (args.target_dir)
+ shutil.rmtree(args.target_dir)
+ print "done"
+
+ except Exception, e:
+ frame = inspect.trace()[-1]
+ print >> sys.stderr, "\nException:\n from %s, line %d:\n %s\n" % (frame[1], frame[2], e)
+ if not args.no_clean:
+ print "cleaning up %s..." % (args.target_dir)
+ shutil.rmtree(args.target_dir)
+ sys.exit(1)
+
+
--- /dev/null
+target/doc
--- /dev/null
+src/etc/cargo.bashcomp.sh
--- /dev/null
+src/etc/man/cargo-*.1
+src/etc/man/cargo.1
--- /dev/null
+[source.crates-io]
+replace-with = "dh-cargo-registry"
+
+[source.dh-cargo-registry]
+directory = "../vendor"
+
+[build]
+rustflags = "-g"
--- /dev/null
+cargo (0.27.0-2) unstable; urgency=medium
+
+ * Support cross-compile install (upstream PR #5614).
+
+ -- Ximin Luo <infinity0@debian.org> Wed, 06 Jun 2018 22:35:30 -0700
+
+cargo (0.27.0-1) unstable; urgency=medium
+
+ * Upload to unstable.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sun, 03 Jun 2018 20:42:13 +0530
+
+cargo (0.27.0-1~exp1) experimental; urgency=medium
+
+ [ upstream ]
+ * Cargo will now output the path to custom commands when -v is passed
+ with --list.
+ * The Cargo binary version is now the same as the Rust version.
+ * Cargo.lock files are now included in published crates.
+
+ [ Vasudev Kamath ]
+ * Update patch 2004 for the new release.
+ * Add files from clap and vec_map to unsuspicious list.
+ * debian/patches:
+ + Update path to libgit2-sys in patch 2001.
+ + Adjust file name and paths to test files to be patched in patch
+ 2002.
+ + Drop all unused imports and comment out functions not just drop
+ #[test] in patch 2002.
+ + Drop patch 1001 as it's now part of the new cargo release.
+ + Refresh patch 2007.
+ * debian/copyright:
+ + Update copyright information for new vendored crates.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sat, 02 Jun 2018 15:10:38 +0530
+
+cargo (0.26.0-1) unstable; urgency=medium
+
+ * Upload to unstable.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Tue, 01 May 2018 13:02:05 +0530
+
+cargo (0.26.0-1~exp1) experimental; urgency=medium
+
+ [upstream]
+ * cargo new now defaults to creating a binary crate instead of a
+ library crate.
+ * cargo new will no longer generate crate names starting with rust-
+ or ending with -rs.
+ * cargo doc is faster as it uses cargo check instead of a full rebuild.
+
+ [Vasudev Kamath]
+ * Refresh the patch 2004 against newer Cargo.toml
+ * Mark package compliance with Debian Policy 4.1.4
+ * debian/patches:
+ + Drop patch 2003 and 2005, the doc should be built from source using
+ mdbook.
+ + Drop patch 2006, the wasm32 related test seems to be dropped
+ upstream.
+ + Drop patch 1002, merged upstream.
+ + Add tests/generate_lock_file.rs to patch 2002 to disable
+ no_index_update test, this tries to access network.
+ + Refresh patch 1001 with new upstream release.
+ * debian/rules: disable execution of src/ci/dox.sh, this script is no
+ longer present in new release.
+ * debian/copyright:
+ + Add copyright for humantime crate.
+ + Add copyright for lazycell crate.
+ + Add copyright for quick-error crate.
+ + Add copyright for proc-macro2 crate.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sat, 21 Apr 2018 20:59:39 +0530
+
+cargo (0.25.0-3) unstable; urgency=medium
+
+ [ Ximin Luo ]
+ * Update Vcs-* fields to salsa
+
+ [ Vasudev Kamath ]
+ * Add patch to prevent incremental builds on sparc64.
+ Closes: bug#895300, Thanks to John Paul Adrian Glaubitz.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sun, 15 Apr 2018 12:28:29 +0530
+
+cargo (0.25.0-2) unstable; urgency=medium
+
+ [ Ximin Luo ]
+ * Depend on rustc 1.24 or later.
+ * Backport a patch to not require dev-dependencies when not needed.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Thu, 22 Mar 2018 20:08:17 +0530
+
+cargo (0.25.0-1) unstable; urgency=medium
+
+ * Upload to unstable.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Fri, 09 Mar 2018 21:09:38 +0530
+
+cargo (0.25.0-1~exp2) experimental; urgency=medium
+
+ * Disable test running on powerpc and powerpcspe for now. Will be
+ enabled once issue in test suites are fixed.
+ Request from John Paul Adrian Glaubitz in IRC.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sun, 25 Feb 2018 10:27:23 +0530
+
+cargo (0.25.0-1~exp1) experimental; urgency=medium
+
+ [upstream]
+ * Added a workspace.default-members config that overrides implied --all
+ in virtual workspaces.
+ * Enable incremental by default on development builds.
+
+ [ Vasudev Kamath ]
+ * debian/vendor-tarball-filter.txt: Filter out git test data from
+ libgit2-sys crate.
+ * debian/vendor-tarball-unsuspicious.txt: Audit unsuspicious files for
+ the 0.25.0 release.
+ * debian/make_orig_multi.sh: Make sure we take the filter and
+ unsuspicious lists from the debian folder.
+ * debian/patches:
+ + Drop patch 0001 it is merged upstream.
+ + Fix the typo in description of patch 2006.
+ * Drop source/lintian-override. README under the patches directory is
+ no longer considered a patch file by lintian.
+ * debian/copyright:
+ + Drop unused vendor crates copyright information.
+ + Add new crates copyright information to copyright.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sat, 24 Feb 2018 14:43:48 +0530
+
+cargo (0.24.0-1) unstable; urgency=medium
+
+ * Upload to unstable.
+
+ -- Ximin Luo <infinity0@debian.org> Sat, 27 Jan 2018 10:41:06 +0100
+
+cargo (0.24.0-1~exp1) experimental; urgency=medium
+
+ [upstream]
+ * Supports uninstallation of multiple crates.
+ * `cargo check` unit testing.
+ * Install a specific version using `cargo install --version`
+
+ [ Vasudev Kamath ]
+ * Update vendor-tarball-unsuspicious.txt vendor-tarball-filter.txt for
+ new upstream release.
+ * debian/control:
+ + Mark package compliance with Debian Policy 4.1.3.
+ * debian/patches:
+ + Update patch 2001 to work with libgit2-sys-0.6.19.
+ + Update patch 1002 to drop the url-crate-specific hunk, as it's
+ merged upstream.
+ + Add patch 0001 to fix bad_git_dependency test failure.
+ * debian/copyright:
+ + Add new vendor crates to copyright.
+ + Track rustfmt.toml in top level copyright section.
+ * Add lintian-override for ignoring README from
+ patch-file-present-but-not-mentioned-in-series tag.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Thu, 25 Jan 2018 14:57:43 +0530
+
+cargo (0.23.0-1) unstable; urgency=medium
+
+ * Upload to unstable.
+ * Mark package as compliant with Debian Policy 4.1.2.
+ No change required to source.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sun, 10 Dec 2017 15:33:55 +0530
+
+cargo (0.23.0-1~exp1) experimental; urgency=medium
+
+ * [upstream]
+ + Cargo will now build multi file examples in subdirectories of the
+ examples folder that have a main.rs file.
+ + Changed [root] to [package] in Cargo.lock. Old format packages will
+ continue to work and can be updated using cargo update.
+ + Supports vendoring git repositories.
+ * Refresh patch 2004 for new release.
+ * Audit logo.svg file from termion crate.
+ * debian/patches:
+ + Drop patch 1001, its merged upstream.
+ + Refresh patch 2002 with new upstream changes.
+ + Refresh patch 2001 with newer libgit2-sys changes.
+ + Add patch 2005 to prevent executing non-existing mdbook command
+ during build.
+ + Move part of the typo fix for the url crate from patch 1001 to
+ 1002. The url crate is not updated in the new cargo release.
+ * debian/copyright:
+ + Remove copyright for gcc crate.
+ + Add copyright information for cc, commoncrypto, crypto-hash,
+ redox_syscall. redox_termios and termion crate.
+ + Add CONTRIBUTING.md to top Files section.
+ + Drop magnet-sys from copyright.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Tue, 05 Dec 2017 22:03:49 +0530
+
+cargo (0.22.0-1~exp1) experimental; urgency=medium
+
+ * New upstream release.
+ + Can now install multiple crates with cargo install.
+ + cargo commands inside a virtual workspace will now implicitly pass
+ --all.
+ + Added [patch] section to Cargo.toml to handle prepublication
+ dependencies RFC 1969.
+ + include and exclude fields in Cargo.toml now accept gitignore like
+ patterns.
+ + Added --all-target option.
+ + Using required dependencies as a feature is now deprecated and emits
+ a warning.
+ * Put upstream PR url for patch 1001.
+ * Add conv crate file to unsuspicious files.
+ * debian/patches:
+ + Refresh patches 1001, 2002 and 2004 with new upstream release.
+ + Fix typo in cargo search command and related tests.
+ * debian/control:
+ + Mark package compliance with Debian Policy 4.1.1.
+ + Mark priority for package as optional from extra. Priority extra is
+ deprecated from Debian Policy 4.0.1.
+ * debian/copyright:
+ + Add newly added vendor copyright information.
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sun, 29 Oct 2017 19:50:42 +0530
+
+cargo (0.21.1-2) unstable; urgency=medium
+
+ * Upload to unstable.
+
+ -- Ximin Luo <infinity0@debian.org> Wed, 25 Oct 2017 23:30:46 +0200
+
+cargo (0.21.1-1) experimental; urgency=medium
+
+ * debian/control:
+ + Add myself as uploader for cargo package.
+ + Mark package compliance with Debian Policy 4.1.0.
+ * Mark tables.rs from unicode-normalization as unsuspicious.
+ * Ignore sublime workspace file from hex crate.
+ * debian/copyright:
+ + Drop wildcards representing the old crates under vendor folder.
+ + Add copyright information for newer crates under vendor
+ + Add ARCHITECTURE.* to copyright.
+ * debian/patches:
+ + Rename patches to follow patch naming guidelines mentioned in
+ debian/patches/README.
+ + Add patch 1001 to fix spelling errors in cargo output messages.
+ + Make patch 2003 DEP-3 compliant.
+ + Adjust make_orig_multi.sh to renamed clean-cargo-deps.patch
+
+ -- Vasudev Kamath <vasudev@copyninja.info> Sat, 23 Sep 2017 10:41:07 +0530
+
+cargo (0.20.0-2) unstable; urgency=medium
+
+ * Work around #865549, fixes FTBFS on ppc64el.
+
+ -- Ximin Luo <infinity0@debian.org> Thu, 14 Sep 2017 15:47:55 +0200
+
+cargo (0.20.0-1) unstable; urgency=medium
+
+ * New upstream release.
+ * Fix cross-compiling declarations, Multi-Arch: foreign => allowed
+ * Un-embed libgit2 0.25.1 again. (Closes: #860990)
+ * Update to latest Standards-Version; no changes required.
+
+ -- Ximin Luo <infinity0@debian.org> Thu, 24 Aug 2017 19:13:00 +0200
+
+cargo (0.17.0-2) unstable; urgency=medium
+
+ * Re-embed libgit2 0.25.1 due to the Debian testing freeze. It will be
+ removed again after the freeze is over, when libgit2 0.25.1 can again
+ enter Debian unstable.
+
+ -- Ximin Luo <infinity0@debian.org> Wed, 03 May 2017 16:56:03 +0200
+
+cargo (0.17.0-1) unstable; urgency=medium
+
+ * Upload to unstable so we have something to build rustc 1.17.0 with.
+
+ -- Ximin Luo <infinity0@debian.org> Wed, 03 May 2017 11:24:08 +0200
+
+cargo (0.17.0-1~exp3) experimental; urgency=medium
+
+ * Add git to Build-Depends to fix FTBFS.
+ * Mention cross-compiling in the previous changelog entry.
+
+ -- Ximin Luo <infinity0@debian.org> Tue, 02 May 2017 13:18:53 +0200
+
+cargo (0.17.0-1~exp2) experimental; urgency=medium
+
+ * Bring in some changes from Ubuntu.
+ - Rename deps/ to vendor/ as that's what upstream uses, and update
+ other files with the new paths too.
+ - Remove cargo-vendor-unpack since we no longer need to postprocess
+ cargo-vendor output in that way.
+ * Document that bootstrap.py probably doesn't work now.
+ * Include /usr/share/rustc/architecture.mk in d/rules instead of duplicating
+ awkward arch-dependent Makefile snippets.
+ * Don't embed libgit2, add a versioned B-D to libgit2-dev.
+ * Add support for cross-compiling bootstrap.
+
+ -- Ximin Luo <infinity0@debian.org> Mon, 01 May 2017 20:49:45 +0200
+
+cargo (0.17.0-1~exp1) experimental; urgency=medium
+
+ * New upstream release. (Closes: #851089, #859312)
+
+ -- Ximin Luo <infinity0@debian.org> Thu, 20 Apr 2017 03:16:04 +0200
+
+cargo (0.15.0~dev-1) unstable; urgency=medium
+
+ * New upstream snapshot (git 1877f59d6b2cb057f7ef6c6b34b926fd96a683c1)
+ - Compatible with OpenSSL 1.1.0 (Closes: #828259)
+ * rules: use new link-arg options (Closes: #834980, #837433)
+ - Requires rustc >= 1.13
+
+ -- Luca Bruno <lucab@debian.org> Fri, 25 Nov 2016 23:30:03 +0000
+
+cargo (0.11.0-2) unstable; urgency=high
+
+ * debian/rules: fix RUSTFLAGS quoting (Closes: #834980)
+
+ -- Luca Bruno <lucab@debian.org> Sun, 21 Aug 2016 18:21:21 +0000
+
+cargo (0.11.0-1) unstable; urgency=medium
+
+ [ Daniele Tricoli ]
+ * New upstream release. (Closes: #826938)
+ - Update deps tarball.
+ - Refresh patches.
+ - Drop clean-win-crates.patch since time crate is not a dependency
+ anymore.
+ - Drop deps-url-fix-toml.patch since merged upstream.
+
+ [ Luca Bruno ]
+ * Install subcommand manpages too
+ * Move to a bootstrapped (stage1) build by default
+
+ -- Luca Bruno <lucab@debian.org> Mon, 15 Aug 2016 13:59:04 +0000
+
+cargo (0.9.0-1) unstable; urgency=medium
+
+ * New upstream version
+ + Fix deprecation errors (Closes: #822178, #823652)
+ + Updated deps tarball
+ + Refreshed patches
+
+ -- Luca Bruno <lucab@debian.org> Sat, 07 May 2016 17:56:28 +0200
+
+cargo (0.8.0-2) unstable; urgency=low
+
+ * Prefer libcurl4-gnutls-dev for building (Closes: #819831)
+
+ -- Luca Bruno <lucab@debian.org> Tue, 05 Apr 2016 22:23:44 +0200
+
+cargo (0.8.0-1) unstable; urgency=medium
+
+ * New upstream version 0.8.0
+ + Updated deps tarball
+ + Refreshed patches
+ * cargo: removed unused lintian overrides
+
+ -- Luca Bruno <lucab@debian.org> Sat, 05 Mar 2016 22:39:06 +0100
+
+cargo (0.7.0-2) unstable; urgency=medium
+
+ * Bump standards version
+ * cargo:
+ + add a new stage2 profile
+ + preserve original Cargo.lock for clean
+ + clean environment to allow multiple builds
+ * cargo-doc:
+ + update docbase paths after package split
+ + do not reference remote jquery
+ + do not build under nodoc profile
+ * control: update build-deps for build-profiles
+
+ -- Luca Bruno <lucab@debian.org> Thu, 03 Mar 2016 22:18:32 +0100
+
+cargo (0.7.0-1) unstable; urgency=medium
+
+ * New upstream version 0.7.0
+ + Updated deps tarball and repack filter
+ + Refreshed patches
+ * Fixes to debian packaging
+ + Updated deps repack script
+ + index packing: use the same TAR format as cargo
+ + rules: ask cargo to build verbosely
+ * Update README.source to match current packaging
+
+ -- Luca Bruno <lucab@debian.org> Sun, 14 Feb 2016 16:12:55 +0100
+
+cargo (0.6.0-2) unstable; urgency=medium
+
+ * Introduce a cargo-doc package
+ * Fails to build when wget is installed. Force curl
+ (Closes: #809298)
+ * Add the missing VCS- fields
+
+ -- Sylvestre Ledru <sylvestre@debian.org> Tue, 26 Jan 2016 13:01:16 +0100
+
+cargo (0.6.0-1) unstable; urgency=medium
+
+ * New upstream version 0.6.0
+ + Updated deps tarball
+ + Not shipping a registry index anymore
+ * Refreshed bootstrap.py script
+ + Skip optional dependencies in stage0
+ * Added some crude pack/unpack helpers
+ * copyright: cleaned up unused entries
+ * rules: major update for new 0.6.0 bootstrap
+
+ -- Luca Bruno <lucab@debian.org> Fri, 04 Dec 2015 00:42:55 +0100
+
+cargo (0.3.0-2) unstable; urgency=medium
+
+ * Fix install target, removing arch-specific path
+
+ -- Luca Bruno <lucab@debian.org> Sat, 14 Nov 2015 19:46:57 +0100
+
+cargo (0.3.0-1) unstable; urgency=medium
+
+ * Team upload.
+ * First upload to unstable.
+ * Update gbp.conf according to git repo structure.
+ * patches: downgrade missing_docs lints to simple warnings
+ to avoid build failures on newer rustc.
+
+ -- Luca Bruno <lucab@debian.org> Sat, 14 Nov 2015 17:29:15 +0100
+
+cargo (0.3.0-0~exp1) experimental; urgency=low
+
+ * Team upload.
+ * Initial Debian release. (Closes: #786432)
+
+ -- Luca Bruno <lucab@debian.org> Tue, 11 Aug 2015 20:15:54 +0200
--- /dev/null
+Source: cargo
+Section: devel
+Maintainer: Rust Maintainers <pkg-rust-maintainers@lists.alioth.debian.org>
+Uploaders: Luca Bruno <lucab@debian.org>,
+ Angus Lees <gus@debian.org>,
+ Ximin Luo <infinity0@debian.org>,
+ Vasudev Kamath <vasudev@copyninja.info>
+Priority: optional
+# :native annotations are to support cross-compiling, see README.Debian of rustc
+Build-Depends: debhelper (>= 9.20141010),
+ dpkg-dev (>= 1.17.14),
+ python-dulwich <pkg.cargo.mkstage0>,
+ python-pytoml <pkg.cargo.mkstage0>,
+# TODO: related to https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=839145
+# the non-native lines can be deleted after all versions in unstable are M-A: allowed
+ cargo (>= 0.17.0) <!pkg.cargo.mkstage0> |
+ cargo:native (>= 0.17.0) <!pkg.cargo.mkstage0>,
+ rustc:native (>= 1.16),
+ libstd-rust-dev (>= 1.16),
+ pkg-config,
+ cmake,
+ bash-completion,
+ python3,
+ libcurl4-gnutls-dev | libcurl4-openssl-dev,
+ libssh2-1-dev,
+ libgit2-dev (>= 0.25.1.0),
+ libhttp-parser-dev,
+ libssl-dev,
+ zlib1g-dev,
+ git <!nocheck>
+Homepage: https://crates.io/
+Standards-Version: 4.1.4
+Vcs-Git: https://salsa.debian.org/rust-team/cargo.git
+Vcs-Browser: https://salsa.debian.org/rust-team/cargo
+
+Package: cargo
+Architecture: any
+Multi-Arch: allowed
+Depends: ${shlibs:Depends}, ${misc:Depends},
+ rustc (>= 1.24),
+ binutils,
+ gcc | clang | c-compiler
+Suggests: cargo-doc, python3
+Description: Rust package manager
+ Cargo is a tool that allows Rust projects to declare their various
+ dependencies, and ensure that you'll always get a repeatable build.
+ .
+ To accomplish this goal, Cargo does four things:
+ * Introduces two metadata files with various bits of project information.
+ * Fetches and builds your project's dependencies.
+ * Invokes rustc or another build tool with the correct parameters to build
+ your project.
+ * Introduces conventions, making working with Rust projects easier.
+ .
+ Cargo downloads your Rust project’s dependencies and compiles your
+ project.
+
+Package: cargo-doc
+Section: doc
+Architecture: all
+Build-Profiles: <!nodoc>
+Depends: ${misc:Depends}, libjs-jquery
+Description: Rust package manager, documentation
+ Cargo is a tool that allows Rust projects to declare their various
+ dependencies, and ensure that you'll always get a repeatable build.
+ .
+ To accomplish this goal, Cargo does four things:
+ * Introduces two metadata files with various bits of project information.
+ * Fetches and builds your project's dependencies.
+ * Invokes rustc or another build tool with the correct parameters to build
+ your project.
+ * Introduces conventions, making working with Rust projects easier.
+ .
+ Cargo downloads your Rust project’s dependencies and compiles your
+ project.
+ .
+ This package contains the documentation.
--- /dev/null
+Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
+Upstream-Name: cargo
+Source: https://github.com/rust-lang/cargo
+
+Files: src/*
+ tests/*
+ .*
+ Cargo.*
+ LICENSE-*
+ README.*
+ ARCHITECTURE.*
+ CONTRIBUTING.*
+ appveyor.yml
+ rustfmt.toml
+Copyright: 2014 The Rust Project Developers
+License: MIT or Apache-2.0
+Comment: please do not add * to the above paragraph, so we can use lintian to
+ help us update this file properly
+
+Files: vendor/bitflags-*
+ vendor/bufstream-*
+ vendor/cmake-*
+ vendor/crossbeam-*
+ vendor/env_logger-*
+ vendor/flate2-*
+ vendor/fs2-0*
+ vendor/glob-0*
+ vendor/libc-0*
+ vendor/log-0*
+ vendor/num-0*
+ vendor/num-bigint-*
+ vendor/num-complex-*
+ vendor/num-integer-*
+ vendor/num-iter-*
+ vendor/num-rational-*
+ vendor/num-traits-*
+ vendor/rand-0*
+ vendor/regex-0*
+ vendor/regex-1*
+ vendor/regex-syntax-*
+ vendor/rustc-serialize-*
+ vendor/semver-*
+ vendor/shell-escape-*
+ vendor/tempdir-0*
+ vendor/vec_map-*
+ vendor/unicode-width-*
+Copyright: 2014-2016 The Rust Project Developers
+License: MIT or Apache-2.0
+Comment:
+ This is a collection of external crates embedded here to bootstrap cargo.
+ Most of them come from the original upstream Rust project, thus share the
+ same MIT/Apache-2.0 dual-license. See https://github.com/rust-lang.
+ Exceptions are noted below.
+
+Files: vendor/backtrace-0*
+ vendor/backtrace-sys-0*
+ vendor/cfg-if-0*
+ vendor/filetime-*
+ vendor/git2-0*
+ vendor/git2-curl-*
+ vendor/jobserver-0*
+ vendor/libz-sys-*
+ vendor/libgit2-sys-*
+ vendor/libssh2-sys-*
+ vendor/miniz-sys-*
+ vendor/openssl-probe-*
+ vendor/pkg-config-*
+ vendor/rustc-demangle-0*
+ vendor/scoped-tls-0*
+ vendor/tar-0*
+ vendor/toml-0*
+ vendor/socket2-*
+Copyright: 2014-2017 Alex Crichton <alex@alexcrichton.com>
+ 2014-2017 The Rust Project Developers
+License: MIT or Apache-2.0
+Comment: see https://github.com/alexcrichton/
+
+Files: vendor/aho-corasick-*
+ vendor/docopt-*
+ vendor/memchr-*
+ vendor/utf8-ranges-*
+ vendor/wincolor-*
+ vendor/termcolor-*
+ vendor/globset-*
+ vendor/ignore-*
+ vendor/same-file-*
+ vendor/walkdir-*
+Copyright: 2015 Andrew Gallant <jamslam@gmail.com>
+License: MIT or Unlicense
+Comment: see upstream projects,
+ * https://github.com/docopt/docopt.rs
+ * https://github.com/BurntSushi/aho-corasick
+ * https://github.com/BurntSushi/rust-memchr
+ * https://github.com/BurntSushi/utf8-ranges
+ * https://github.com/BurntSushi/ripgrep/tree/master/wincolor
+ * https://github.com/BurntSushi/ripgrep/tree/master/termcolor
+ * https://github.com/BurntSushi/ripgrep/tree/master/globset
+ * https://github.com/BurntSushi/ripgrep/tree/master/ignore
+ * https://github.com/BurntSushi/same-file
+ * https://github.com/BurntSushi/walkdir
+
+Files: vendor/ucd-util-*
+Copyright: 2015 Andrew Gallant <jamslam@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/BurntSushi/rucd
+
+Files: vendor/kernel32-*
+ vendor/winapi-*
+ vendor/advapi32-sys-*
+ vendor/userenv-sys-*
+Copyright: 2014-2017 Peter Atashian <retep998@gmail.com>
+ 2014-2017 winapi-rs developers
+License: MIT
+Comment: see https://github.com/retep998/winapi-rs
+
+Files: vendor/curl-0*
+ vendor/curl-sys-0*
+Copyright: 2014-2016 Carl Lerche <me@carllerche.com>
+ 2014-2016 Alex Crichton <alex@alexcrichton.com>
+License: MIT
+Comment: see https://github.com/alexcrichton/curl-rust
+
+Files: vendor/dtoa-0*
+ vendor/itoa-0*
+ vendor/quote-0*
+Copyright: 2016-2017 David Tolnay <dtolnay@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/dtolnay
+
+Files: vendor/foreign-types-0*
+ vendor/foreign-types-shared-*
+Copyright: 2017 Steven Fackler <sfackler@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/sfackler/foreign-types
+
+Files: vendor/hamcrest-0*
+Copyright: 2014-2016 Carl Lerche <me@carllerche.com>
+ 2014-2016 Alex Crichton <alex@alexcrichton.com>
+ 2014-2016 Ben Longbons <b.r.longbons@gmail.com>
+ 2014-2016 Graham Dennis <graham.dennis@gmail.com>
+ 2014-2016 Michael Gehring <mg@ebfe.org>
+ 2014-2016 Oliver Mader <b52@reaktor42.de>
+ 2014-2016 Robin Gloster <robin@loc-com.de>
+ 2014-2016 Steve Klabnik <steve@steveklabnik.com>
+ 2014-2016 Tamir Duberstein <tamird@gmail.com>
+ 2014-2016 Thiago Pontes <thiagopnts@gmail.com>
+ 2014-2016 Urban Hafner <contact@urbanhafner.com>
+ 2014-2016 Valerii Hiora <valerii.hiora@gmail.com>
+ 2014-2016 Yehuda Katz <wycats@gmail.com>
+License: MIT or Apache-2.0
+
+Files: vendor/idna-0*
+Copyright: 2013 Simon Sapin <simon.sapin@exyr.org>
+ 2013-2014 Valentin Gosu
+ 1991-2015 Unicode, Inc
+License: MIT or Apache-2.0
+
+Files: vendor/lazycell-*
+Copyright: 2014, The Rust Project Developers
+ 2016-2017, Nikita Pekin and lazycell contributors
+License: MIT or Apache-2.0
+
+Files: vendor/idna-0*/src/IdnaMappingTable.txt
+ vendor/idna-0*/tests/IdnaTest.txt
+Copyright: 1991-2017 Unicode, Inc
+License: Unicode-terms
+
+Files: vendor/lazy_static-0*
+ vendor/lazy_static-1*
+Copyright: 2014-2016 Marvin Löbel <loebel.marvin@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/rust-lang-nursery/lazy-static.rs
+
+Files: vendor/matches-*
+Copyright: 2015 Simon Sapin <simon.sapin@exyr.org>
+License: MIT
+Comment: see https://github.com/SimonSapin/rust-std-candidates
+
+Files: vendor/miniz-sys-*/miniz.c
+Copyright: Rich Geldreich <richgel99@gmail.com>
+License: Unlicense
+
+Files: vendor/num_cpus-*
+Copyright: 2015 Sean McArthur <sean.monstar@gmail.com>
+License: MIT
+Comment: see https://github.com/seanmonstar/num_cpus
+
+Files: vendor/openssl-*
+Copyright: 2013-2015 Steven Fackler <sfackler@gmail.com>
+ 2013 Jack Lloyd
+ 2011 Google Inc.
+License: Apache-2.0
+
+Files: vendor/openssl-sys-*
+Copyright: 2015 Steven Fackler <sfackler@gmail.com>
+ 2015 Alex Crichton <alex@alexcrichton.com>
+License: MIT
+Comment: see https://github.com/sfackler/rust-openssl
+
+Files: vendor/serde-1.*
+ vendor/serde_derive-1.*
+ vendor/serde_derive_internals-0*
+ vendor/serde_ignored-0*
+ vendor/serde_json-1.*
+Copyright: 2014-2017 Erick Tryzelaar <erick.tryzelaar@gmail.com>
+ 2014-2017 David Tolnay <dtolnay@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/serde-rs
+ see https://github.com/dtolnay/serde-ignored
+
+Files: vendor/strsim-*
+Copyright: 2015 Danny Guo <dannyguo91@gmail.com>
+License: MIT
+Comment: see https://github.com/dguo/strsim-rs
+
+Files: vendor/syn-0*
+ vendor/synom-0*
+Copyright: 2016-2017 David Tolnay <dtolnay@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/dtolnay/syn
+
+Files: vendor/thread-id-2.*
+Copyright: 2016-2017 Ruud van Asseldonk <dev@veniogames.com>
+License: Apache-2.0
+Comment: see https://github.com/ruud-v-a/thread-id
+ see https://github.com/ruuda/thread-id
+
+Files: vendor/thread_local-0*
+Copyright: 2016 Amanieu d'Antras <amanieu@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/Amanieu/thread_local-rs
+
+Files: vendor/unicode-bidi-*
+Copyright: 2015 The Servo Project Developers
+License: MIT or Apache-2.0
+Comment: see https://github.com/servo/unicode-bidi
+
+Files: vendor/core-foundation-*
+ vendor/core-foundation-sys-*
+Copyright: 2012-2013, The Servo Project Developers,
+ 2012-2013, Mozilla Foundation
+License: MIT or Apache-2.0
+Comment: see https://github.com/servo/core-foundation-rs
+
+Files: vendor/unicode-normalization-*
+Copyright: 2016 kwantam <kwantam@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/unicode-rs/unicode-normalization
+
+Files: vendor/unicode-xid-0*
+Copyright: 2015-2017 erick.tryzelaar <erick.tryzelaar@gmail.com>
+ 2015-2017 kwantam <kwantam@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/unicode-rs/unicode-xid
+
+Files: vendor/unicode-normalization-0*/scripts/unicode.py
+ vendor/unicode-xid-0*/scripts/unicode.py
+Copyright: 2011-2015 The Rust Project Developers
+ 2015 The Servo Project Developers
+License: MIT or Apache-2.0
+
+Files: vendor/unicode-bidi-0*/src/char_data/tables.rs
+ vendor/unicode-normalization-0*/src/tables.rs
+ vendor/unicode-normalization-0*/src/testdata.rs
+ vendor/unicode-xid-0*/src/tables.rs
+Copyright: 2011-2015 The Rust Project Developers
+ 2015 The Servo Project Developers
+License: MIT or Apache-2.0, and Unicode-terms
+Comment:
+ These files are generated using python scripts, as indicated below, from
+ Unicode data files which are licensed under the Unicode-terms. In Debian these
+ data files are available in the unicode-data package.
+ .
+ $ git grep -i generated -- vendor/unicode-* ':(exclude)*LICENSE*'
+ vendor/unicode-bidi-0.3.4/src/char_data/tables.rs:// The following code was generated by "tools/generate.py". do not edit directly
+ vendor/unicode-normalization-0.1.5/scripts/unicode.py:// NOTE: The following code was generated by "scripts/unicode.py", do not edit directly
+ vendor/unicode-normalization-0.1.5/src/tables.rs:// NOTE: The following code was generated by "scripts/unicode.py", do not edit directly
+ vendor/unicode-normalization-0.1.5/src/testdata.rs:// NOTE: The following code was generated by "scripts/unicode.py", do not edit directly
+ vendor/unicode-xid-0.0.4/scripts/unicode.py:// NOTE: The following code was generated by "scripts/unicode.py", do not edit directly
+ vendor/unicode-xid-0.0.4/src/tables.rs:// NOTE: The following code was generated by "scripts/unicode.py", do not edit directly
+
+Files: vendor/unreachable-*
+ vendor/void-1.*
+Copyright: 2015-2017 Jonathan Reem <jonathan.reem@gmail.com>
+License: MIT
+Comment: see https://github.com/reem/
+
+Files: vendor/url-1.*
+ vendor/percent-encoding-*
+Copyright: 2015-2016 Simon Sapin <simon.sapin@exyr.org>
+ 2013-2016 The rust-url developers
+License: MIT or Apache-2.0
+Comment: see https://github.com/servo/rust-url
+ see https://github.com/servo/rust-url/tree/master/percent_encoding
+
+Files: vendor/hex-*
+Copyright: 2015, rust-hex Developers
+License: MIT or Apache-2.0
+Comment: see https://github.com/KokaKiwi/rust-hex
+
+Files: vendor/atty-*
+Copyright: 2015-2016, Doug Tangren
+License: MIT
+Comment: see https://github.com/softprops/atty
+
+Files: vendor/fnv-*
+ vendor/cc-*
+ vendor/proc-macro2-*
+Copyright: 2017-2018, Alex Crichton <alex@alexcrichton.com>
+License: MIT or Apache-2.0
+Comment:
+ see https://github.com/servo/rust-fnv/
+ see https://github.com/alexcrichton/cc-rs
+ see https://github.com/alexcrichton/proc-macro2
+
+Files: vendor/home-*
+Copyright: Brian Anderson <andersb@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/brson/home
+
+Files: vendor/fuchsia-zircon-*
+ vendor/fuchsia-zircon-sys-*
+Copyright: Raph Levien <raph@google.com>
+ 2016, The Fuchsia Authors
+License: BSD-3-clause
+Comment:
+ * see https://fuchsia.googlesource.com/magenta-rs/
+ * see https://fuchsia.googlesource.com/garnet/
+
+Files: vendor/scopeguard-*
+Copyright: 2015, The Rust Project Developers
+ bluss
+License: MIT or Apache-2.0
+Comment: see https://github.com/bluss/scopeguard
+
+Files: vendor/vcpkg-0*
+Copyright: 2017 Jim McGrath <jimmc2@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/mcgoo/vcpkg-rs
+
+Files: vendor/commoncrypto-*
+ vendor/crypto-hash-*
+Copyright: 2016, Mark Lee
+ 2015-2016, Mark Lee
+License: MIT
+Comment:
+ * see https://github.com/malept/rust-commoncrypto
+ * see https://github.com/malept/crypto-hash
+
+Files: vendor/redox_syscall-*
+ vendor/redox_termios-*
+Copyright: 2017, Redox OS Developers
+License: MIT
+
+Files: vendor/ansi_term-*
+Copyright: 2014, Benjamin Sago <ogham@bsago.me>
+ Ryan Scheel (Havvy) <ryan.havvy@gmail.com>
+ Josh Triplett <josh@joshtriplett.org>
+License: MIT
+
+Files: vendor/quick-error-*
+Copyright: 2015, The quick-error developers
+License: MIT or Apache-2.0
+
+Files: vendor/termion-*
+Copyright: 2016, Ticki
+License: MIT
+Comment: see https://github.com/ticki/termion/
+
+Files: vendor/failure-*
+ vendor/failure_derive-*
+Copyright: Without Boats <boats@mozilla.com>
+License: MIT or Apache-2.0
+Comment:
+ * see https://github.com/withoutboats/failure
+ * see https://github.com/withoutboats/failure_derive
+
+Files: vendor/remove_dir_all-*
+Copyright: 2017, Aaron Power <theaaronepower@gmail.com>
+License: MIT or Apache-2.0
+Comment: see https://github.com/Aaronepower/remove_dir_all
+
+Files: vendor/synstructure-*
+Copyright: Michael Layzell <michael@thelayzells.com>
+License: MIT
+Comment: see https://github.com/mystor/synstructure
+
+Files: src/doc/javascripts/prism.js
+ debian/missing-sources/prism.js
+Copyright: 2015 Lea Verou
+License: MIT
+
+Files: vendor/schannel-*
+Copyright: 2015, Steffen Butzer <steffen.butzer@outlook.com>
+License: MIT
+Comment: see https://github.com/steffengy/schannel-rs/
+
+Files: vendor/humantime-*
+Copyright: 2016, The humantime Developers
+License: MIT or Apache-2.0
+Comment:
+ Includes parts of HTTP date handling code (copyright 2016, Pyfisch) and
+ portions of musl libc (copyright 2005-2013, Rich Felker).
+ .
+ See https://github.com/tailhook/humantime
+
+Files: vendor/textwrap-*
+Copyright: 2016, Martin Geisler <martin@geisler.net>
+License: MIT
+
+Files: vendor/clap-*
+Copyright: 2015-2016, Kevin B. Knapp <kbknapp@gmail.com>
+License: MIT
+
+Files: vendor/tempfile-*
+Copyright: 2015, Steven Allen
+License: MIT or Apache-2.0
+
+Files: debian/*
+Copyright: 2017 Ximin Luo <infinity0@debian.org>
+ 2015-2016 Luca Bruno <lucab@debian.org>
+License: MIT or Apache-2.0
+
+Files: debian/bootstrap.py
+Copyright: 2015 David Huseby
+License: BSD-2-clause
+Comment: See LICENSE at https://github.com/dhuseby/cargo-bootstrap/
+
+License: BSD-2-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are met:
+ .
+ 1. Redistributions of source code must retain the above copyright notice,
+ this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright notice,
+ this list of conditions and the following disclaimer in the documentation
+ and/or other materials provided with the distribution.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
+ LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+ POSSIBILITY OF SUCH DAMAGE.
+
+License: BSD-3-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are
+ met:
+ .
+ 1. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in the
+ documentation and/or other materials provided with the
+ distribution.
+ 3. Neither the name of the Creytiv.com nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
+ IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
+ WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+ DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT,
+ INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
+ (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
+ HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
+ STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING
+ IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+ POSSIBILITY OF SUCH DAMAGE.
+
+License: MIT
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+ .
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
+
+License: Apache-2.0
+ On Debian systems, see /usr/share/common-licenses/Apache-2.0 for
+ the full text of the Apache License version 2.0.
+
+License: Unlicense
+ This is free and unencumbered software released into the public domain.
+ .
+ Anyone is free to copy, modify, publish, use, compile, sell, or
+ distribute this software, either in source code form or as a compiled
+ binary, for any purpose, commercial or non-commercial, and by any
+ means.
+ .
+ In jurisdictions that recognize copyright laws, the author or authors
+ of this software dedicate any and all copyright interest in the
+ software to the public domain. We make this dedication for the benefit
+ of the public at large and to the detriment of our heirs and
+ successors. We intend this dedication to be an overt act of
+ relinquishment in perpetuity of all present and future rights to this
+ software under copyright law.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+ IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
+ OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
+ ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
+ OTHER DEALINGS IN THE SOFTWARE.
+
+License: Unicode-terms
+ Distributed under the Terms of Use in http://www.unicode.org/copyright.html.
+ .
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of the Unicode data files and any associated documentation
+ (the "Data Files") or Unicode software and any associated documentation
+ (the "Software") to deal in the Data Files or Software
+ without restriction, including without limitation the rights to use,
+ copy, modify, merge, publish, distribute, and/or sell copies of
+ the Data Files or Software, and to permit persons to whom the Data Files
+ or Software are furnished to do so, provided that either
+ (a) this copyright and permission notice appear with all copies
+ of the Data Files or Software, or
+ (b) this copyright and permission notice appear in associated
+ Documentation.
+ .
+ THE DATA FILES AND SOFTWARE ARE PROVIDED "AS IS", WITHOUT WARRANTY OF
+ ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
+ WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT OF THIRD PARTY RIGHTS.
+ IN NO EVENT SHALL THE COPYRIGHT HOLDER OR HOLDERS INCLUDED IN THIS
+ NOTICE BE LIABLE FOR ANY CLAIM, OR ANY SPECIAL INDIRECT OR CONSEQUENTIAL
+ DAMAGES, OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE,
+ DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER
+ TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
+ PERFORMANCE OF THE DATA FILES OR SOFTWARE.
+ .
+ Except as contained in this notice, the name of a copyright holder
+ shall not be used in advertising or otherwise to promote the sale,
+ use or other dealings in these Data Files or Software without prior
+ written authorization of the copyright holder.
+Comment: see http://www.unicode.org/copyright.html
--- /dev/null
+[DEFAULT]
+upstream-tag = upstream/%(version)s
+debian-tag = debian/%(version)s
+pristine-tar = True
+upstream-branch = upstream
+component = vendor
+
+[buildpackage]
+submodules = True
+ignore-branch = True
+
+[import-orig]
+upstream-vcs-tag = %(version)s
+debian-branch = debian/experimental
--- /dev/null
+target/*/release/cargo usr/bin
+debian/scripts/* usr/share/cargo
--- /dev/null
+#!/bin/sh
+set -e
+echo ""
+echo "This needs a local copy of cargo-vendor, and the following packages:"
+echo "python-dulwich python-pytoml devscripts"
+echo ""
+
+TMPDIR=$(mktemp -d)
+echo "Using '${TMPDIR}'..."
+cat > "${TMPDIR}/Makefile" <<'EOF'
+include /usr/share/dpkg/pkg-info.mk
+all:
+ @echo $(DEB_VERSION_UPSTREAM)
+EOF
+WORKDIR=${PWD}
+
+if [ -z "$1" ]; then
+ USCAN_ARGS=""
+ CARGO_VER=$(make -f "${TMPDIR}/Makefile")
+else
+ USCAN_ARGS="--download-version $1"
+ CARGO_VER="$1"
+fi
+
+BOOTSTRAP_PY=$(find "${PWD}" -name bootstrap.py -type f)
+VENDOR_FILTER=$(find "${PWD}/debian" -name vendor-tarball-filter.txt -type f)
+VENDOR_SUS_WHITELIST=$(find "${PWD}/debian" -name vendor-tarball-unsuspicious.txt -type f)
+
+# Download cargo tarball
+uscan --rename ${USCAN_ARGS} --force-download --destdir "${TMPDIR}/"
+
+# Extract cargo source
+cd "${TMPDIR}"
+mkdir cargo
+tar -xaf "${TMPDIR}/cargo_${CARGO_VER}.orig.tar.gz" -C cargo --strip-components=1
+cd cargo
+
+# Trim the list of dependencies
+echo ""
+echo "Applying 2004_clean-cargo-deps.patch... If this fails, remember to refresh the patch first!"
+patch -p1 < "${WORKDIR}/debian/patches/2004_clean-cargo-deps.patch"
+
+# Download build-deps via cargo-vendor
+export GIT_AUTHOR_NAME="deb-build"
+export GIT_AUTHOR_EMAIL="<>"
+export GIT_COMMITTER_NAME="${GIT_AUTHOR_NAME}"
+export GIT_COMMITTER_EMAIL="${GIT_AUTHOR_EMAIL}"
+cargo vendor --explicit-version --verbose
+
+# Clean embedded libs and update checksums
+grep -v '^#' "${VENDOR_FILTER}" | xargs -I% sh -c 'rm -rf vendor/%'
+for i in vendor/*; do ${WORKDIR}/debian/scripts/prune-checksums "$i"; done
+
+# Report any suspicious files
+cp -R vendor vendor-scan
+grep -v '^#' "${VENDOR_SUS_WHITELIST}" | xargs -I% sh -c 'rm -rf vendor-scan/%'
+echo "Checking for suspicious files..."
+# The following shell snippet is a bit more strict than suspicious-source(1)
+find vendor-scan -type f -and -not -name '.cargo-checksum.json' -exec file '{}' \; | \
+ sed -e 's/\btext\b\(.*\), with very long lines/verylongtext\1/g' | \
+ grep -v '\b\(text\|empty\)\b' || true
+echo "The above files (if any) seem suspicious, please audit them."
+echo "If good, add them to ${VENDOR_SUS_WHITELIST}."
+echo "If bad, add them to ${VENDOR_FILTER}."
+rm -rf vendor-scan
+
+# Pack it up, reproducibly
+GZIP=-9n tar --sort=name \
+ --mtime="./Cargo.lock" \
+ --owner=root --group=root \
+ -czf "${TMPDIR}/cargo_${CARGO_VER}.orig-vendor.tar.gz" vendor
+
+# All is good, we are done!
+echo "Your files are available at:"
+echo "${TMPDIR}/cargo_${CARGO_VER}.orig.tar.gz \\"
+echo "${TMPDIR}/cargo_${CARGO_VER}.orig-vendor.tar.gz"
+echo ""
+echo "Unpacked cargo sources are available under:"
+echo "${TMPDIR}/cargo/"
--- /dev/null
+/* http://prismjs.com/download.html?themes=prism&languages=markup+css+clike+javascript */
+var _self = (typeof window !== 'undefined')
+ ? window // if in browser
+ : (
+ (typeof WorkerGlobalScope !== 'undefined' && self instanceof WorkerGlobalScope)
+ ? self // if in worker
+ : {} // if in node js
+ );
+
+/**
+ * Prism: Lightweight, robust, elegant syntax highlighting
+ * MIT license http://www.opensource.org/licenses/mit-license.php/
+ * @author Lea Verou http://lea.verou.me
+ */
+
+var Prism = (function(){
+
+// Private helper vars
+var lang = /\blang(?:uage)?-(?!\*)(\w+)\b/i;
+
+var _ = _self.Prism = {
+ util: {
+ encode: function (tokens) {
+ if (tokens instanceof Token) {
+ return new Token(tokens.type, _.util.encode(tokens.content), tokens.alias);
+ } else if (_.util.type(tokens) === 'Array') {
+ return tokens.map(_.util.encode);
+ } else {
+ return tokens.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/\u00a0/g, ' ');
+ }
+ },
+
+ type: function (o) {
+ return Object.prototype.toString.call(o).match(/\[object (\w+)\]/)[1];
+ },
+
+ // Deep clone a language definition (e.g. to extend it)
+ clone: function (o) {
+ var type = _.util.type(o);
+
+ switch (type) {
+ case 'Object':
+ var clone = {};
+
+ for (var key in o) {
+ if (o.hasOwnProperty(key)) {
+ clone[key] = _.util.clone(o[key]);
+ }
+ }
+
+ return clone;
+
+ case 'Array':
+ // Check for existence for IE8
+ return o.map && o.map(function(v) { return _.util.clone(v); });
+ }
+
+ return o;
+ }
+ },
+
+ languages: {
+ extend: function (id, redef) {
+ var lang = _.util.clone(_.languages[id]);
+
+ for (var key in redef) {
+ lang[key] = redef[key];
+ }
+
+ return lang;
+ },
+
+ /**
+ * Insert a token before another token in a language literal
+ * As this needs to recreate the object (we cannot actually insert before keys in object literals),
+ * we cannot just provide an object, we need an object and a key.
+ * @param inside The key (or language id) of the parent
+ * @param before The key to insert before. If not provided, the function appends instead.
+ * @param insert Object with the key/value pairs to insert
+ * @param root The object that contains `inside`. If equal to Prism.languages, it can be omitted.
+ */
+ insertBefore: function (inside, before, insert, root) {
+ root = root || _.languages;
+ var grammar = root[inside];
+
+ if (arguments.length == 2) {
+ insert = arguments[1];
+
+ for (var newToken in insert) {
+ if (insert.hasOwnProperty(newToken)) {
+ grammar[newToken] = insert[newToken];
+ }
+ }
+
+ return grammar;
+ }
+
+ var ret = {};
+
+ for (var token in grammar) {
+
+ if (grammar.hasOwnProperty(token)) {
+
+ if (token == before) {
+
+ for (var newToken in insert) {
+
+ if (insert.hasOwnProperty(newToken)) {
+ ret[newToken] = insert[newToken];
+ }
+ }
+ }
+
+ ret[token] = grammar[token];
+ }
+ }
+
+ // Update references in other language definitions
+ _.languages.DFS(_.languages, function(key, value) {
+ if (value === root[inside] && key != inside) {
+ this[key] = ret;
+ }
+ });
+
+ return root[inside] = ret;
+ },
+
+ // Traverse a language definition with Depth First Search
+ DFS: function(o, callback, type) {
+ for (var i in o) {
+ if (o.hasOwnProperty(i)) {
+ callback.call(o, i, o[i], type || i);
+
+ if (_.util.type(o[i]) === 'Object') {
+ _.languages.DFS(o[i], callback);
+ }
+ else if (_.util.type(o[i]) === 'Array') {
+ _.languages.DFS(o[i], callback, i);
+ }
+ }
+ }
+ }
+ },
+
+ highlightAll: function(async, callback) {
+ var elements = document.querySelectorAll('code[class*="language-"], [class*="language-"] code, code[class*="lang-"], [class*="lang-"] code');
+
+ for (var i=0, element; element = elements[i++];) {
+ _.highlightElement(element, async === true, callback);
+ }
+ },
+
+ highlightElement: function(element, async, callback) {
+ // Find language
+ var language, grammar, parent = element;
+
+ while (parent && !lang.test(parent.className)) {
+ parent = parent.parentNode;
+ }
+
+ if (parent) {
+ language = (parent.className.match(lang) || [,''])[1];
+ grammar = _.languages[language];
+ }
+
+ // Set language on the element, if not present
+ element.className = element.className.replace(lang, '').replace(/\s+/g, ' ') + ' language-' + language;
+
+ // Set language on the parent, for styling
+ parent = element.parentNode;
+
+ if (/pre/i.test(parent.nodeName)) {
+ parent.className = parent.className.replace(lang, '').replace(/\s+/g, ' ') + ' language-' + language;
+ }
+
+ if (!grammar) {
+ return;
+ }
+
+ var code = element.textContent;
+
+ if(!code) {
+ return;
+ }
+
+ code = code.replace(/^(?:\r?\n|\r)/,'');
+
+ var env = {
+ element: element,
+ language: language,
+ grammar: grammar,
+ code: code
+ };
+
+ _.hooks.run('before-highlight', env);
+
+ if (async && _self.Worker) {
+ var worker = new Worker(_.filename);
+
+ worker.onmessage = function(evt) {
+ env.highlightedCode = Token.stringify(JSON.parse(evt.data), language);
+
+ _.hooks.run('before-insert', env);
+
+ env.element.innerHTML = env.highlightedCode;
+
+ callback && callback.call(env.element);
+ _.hooks.run('after-highlight', env);
+ };
+
+ worker.postMessage(JSON.stringify({
+ language: env.language,
+ code: env.code
+ }));
+ }
+ else {
+ env.highlightedCode = _.highlight(env.code, env.grammar, env.language);
+
+ _.hooks.run('before-insert', env);
+
+ env.element.innerHTML = env.highlightedCode;
+
+ callback && callback.call(element);
+
+ _.hooks.run('after-highlight', env);
+ }
+ },
+
+ highlight: function (text, grammar, language) {
+ var tokens = _.tokenize(text, grammar);
+ return Token.stringify(_.util.encode(tokens), language);
+ },
+
+ tokenize: function(text, grammar, language) {
+ var Token = _.Token;
+
+ var strarr = [text];
+
+ var rest = grammar.rest;
+
+ if (rest) {
+ for (var token in rest) {
+ grammar[token] = rest[token];
+ }
+
+ delete grammar.rest;
+ }
+
+ tokenloop: for (var token in grammar) {
+ if(!grammar.hasOwnProperty(token) || !grammar[token]) {
+ continue;
+ }
+
+ var patterns = grammar[token];
+ patterns = (_.util.type(patterns) === "Array") ? patterns : [patterns];
+
+ for (var j = 0; j < patterns.length; ++j) {
+ var pattern = patterns[j],
+ inside = pattern.inside,
+ lookbehind = !!pattern.lookbehind,
+ lookbehindLength = 0,
+ alias = pattern.alias;
+
+ pattern = pattern.pattern || pattern;
+
+ for (var i=0; i<strarr.length; i++) { // Don’t cache length as it changes during the loop
+
+ var str = strarr[i];
+
+ if (strarr.length > text.length) {
+ // Something went terribly wrong, ABORT, ABORT!
+ break tokenloop;
+ }
+
+ if (str instanceof Token) {
+ continue;
+ }
+
+ pattern.lastIndex = 0;
+
+ var match = pattern.exec(str);
+
+ if (match) {
+ if(lookbehind) {
+ lookbehindLength = match[1].length;
+ }
+
+ var from = match.index - 1 + lookbehindLength,
+ match = match[0].slice(lookbehindLength),
+ len = match.length,
+ to = from + len,
+ before = str.slice(0, from + 1),
+ after = str.slice(to + 1);
+
+ var args = [i, 1];
+
+ if (before) {
+ args.push(before);
+ }
+
+ var wrapped = new Token(token, inside? _.tokenize(match, inside) : match, alias);
+
+ args.push(wrapped);
+
+ if (after) {
+ args.push(after);
+ }
+
+ Array.prototype.splice.apply(strarr, args);
+ }
+ }
+ }
+ }
+
+ return strarr;
+ },
+
+ hooks: {
+ all: {},
+
+ add: function (name, callback) {
+ var hooks = _.hooks.all;
+
+ hooks[name] = hooks[name] || [];
+
+ hooks[name].push(callback);
+ },
+
+ run: function (name, env) {
+ var callbacks = _.hooks.all[name];
+
+ if (!callbacks || !callbacks.length) {
+ return;
+ }
+
+ for (var i=0, callback; callback = callbacks[i++];) {
+ callback(env);
+ }
+ }
+ }
+};
+
+var Token = _.Token = function(type, content, alias) {
+ this.type = type;
+ this.content = content;
+ this.alias = alias;
+};
+
+Token.stringify = function(o, language, parent) {
+ if (typeof o == 'string') {
+ return o;
+ }
+
+ if (_.util.type(o) === 'Array') {
+ return o.map(function(element) {
+ return Token.stringify(element, language, o);
+ }).join('');
+ }
+
+ var env = {
+ type: o.type,
+ content: Token.stringify(o.content, language, parent),
+ tag: 'span',
+ classes: ['token', o.type],
+ attributes: {},
+ language: language,
+ parent: parent
+ };
+
+ if (env.type == 'comment') {
+ env.attributes['spellcheck'] = 'true';
+ }
+
+ if (o.alias) {
+ var aliases = _.util.type(o.alias) === 'Array' ? o.alias : [o.alias];
+ Array.prototype.push.apply(env.classes, aliases);
+ }
+
+ _.hooks.run('wrap', env);
+
+ var attributes = '';
+
+ for (var name in env.attributes) {
+ attributes += name + '="' + (env.attributes[name] || '') + '"';
+ }
+
+ return '<' + env.tag + ' class="' + env.classes.join(' ') + '" ' + attributes + '>' + env.content + '</' + env.tag + '>';
+
+};
+
+if (!_self.document) {
+ if (!_self.addEventListener) {
+ // in Node.js
+ return _self.Prism;
+ }
+ // In worker
+ _self.addEventListener('message', function(evt) {
+ var message = JSON.parse(evt.data),
+ lang = message.language,
+ code = message.code;
+
+ _self.postMessage(JSON.stringify(_.util.encode(_.tokenize(code, _.languages[lang]))));
+ _self.close();
+ }, false);
+
+ return _self.Prism;
+}
+
+// Get current script and highlight
+var script = document.getElementsByTagName('script');
+
+script = script[script.length - 1];
+
+if (script) {
+ _.filename = script.src;
+
+ if (document.addEventListener && !script.hasAttribute('data-manual')) {
+ document.addEventListener('DOMContentLoaded', _.highlightAll);
+ }
+}
+
+return _self.Prism;
+
+})();
+
+if (typeof module !== 'undefined' && module.exports) {
+ module.exports = Prism;
+}
+;
+Prism.languages.markup = {
+ 'comment': /<!--[\w\W]*?-->/,
+ 'prolog': /<\?[\w\W]+?\?>/,
+ 'doctype': /<!DOCTYPE[\w\W]+?>/,
+ 'cdata': /<!\[CDATA\[[\w\W]*?]]>/i,
+ 'tag': {
+ pattern: /<\/?[^\s>\/]+(?:\s+[^\s>\/=]+(?:=(?:("|')(?:\\\1|\\?(?!\1)[\w\W])*\1|[^\s'">=]+))?)*\s*\/?>/i,
+ inside: {
+ 'tag': {
+ pattern: /^<\/?[^\s>\/]+/i,
+ inside: {
+ 'punctuation': /^<\/?/,
+ 'namespace': /^[^\s>\/:]+:/
+ }
+ },
+ 'attr-value': {
+ pattern: /=(?:('|")[\w\W]*?(\1)|[^\s>]+)/i,
+ inside: {
+ 'punctuation': /[=>"']/
+ }
+ },
+ 'punctuation': /\/?>/,
+ 'attr-name': {
+ pattern: /[^\s>\/]+/,
+ inside: {
+ 'namespace': /^[^\s>\/:]+:/
+ }
+ }
+
+ }
+ },
+ 'entity': /&#?[\da-z]{1,8};/i
+};
+
+// Plugin to make entity title show the real entity, idea by Roman Komarov
+Prism.hooks.add('wrap', function(env) {
+
+ if (env.type === 'entity') {
+ env.attributes['title'] = env.content.replace(/&amp;/, '&');
+ }
+});
+;
+Prism.languages.css = {
+ 'comment': /\/\*[\w\W]*?\*\//,
+ 'atrule': {
+ pattern: /@[\w-]+?.*?(;|(?=\s*\{))/i,
+ inside: {
+ 'rule': /@[\w-]+/
+ // See rest below
+ }
+ },
+ 'url': /url\((?:(["'])(\\(?:\r\n|[\w\W])|(?!\1)[^\\\r\n])*\1|.*?)\)/i,
+ 'selector': /[^\{\}\s][^\{\};]*?(?=\s*\{)/,
+ 'string': /("|')(\\(?:\r\n|[\w\W])|(?!\1)[^\\\r\n])*\1/,
+ 'property': /(\b|\B)[\w-]+(?=\s*:)/i,
+ 'important': /\B!important\b/i,
+ 'function': /[-a-z0-9]+(?=\()/i,
+ 'punctuation': /[(){};:]/
+};
+
+Prism.languages.css['atrule'].inside.rest = Prism.util.clone(Prism.languages.css);
+
+if (Prism.languages.markup) {
+ Prism.languages.insertBefore('markup', 'tag', {
+ 'style': {
+ pattern: /<style[\w\W]*?>[\w\W]*?<\/style>/i,
+ inside: {
+ 'tag': {
+ pattern: /<style[\w\W]*?>|<\/style>/i,
+ inside: Prism.languages.markup.tag.inside
+ },
+ rest: Prism.languages.css
+ },
+ alias: 'language-css'
+ }
+ });
+
+ Prism.languages.insertBefore('inside', 'attr-value', {
+ 'style-attr': {
+ pattern: /\s*style=("|').*?\1/i,
+ inside: {
+ 'attr-name': {
+ pattern: /^\s*style/i,
+ inside: Prism.languages.markup.tag.inside
+ },
+ 'punctuation': /^\s*=\s*['"]|['"]\s*$/,
+ 'attr-value': {
+ pattern: /.+/i,
+ inside: Prism.languages.css
+ }
+ },
+ alias: 'language-css'
+ }
+ }, Prism.languages.markup.tag);
+};
+Prism.languages.clike = {
+ 'comment': [
+ {
+ pattern: /(^|[^\\])\/\*[\w\W]*?\*\//,
+ lookbehind: true
+ },
+ {
+ pattern: /(^|[^\\:])\/\/.*/,
+ lookbehind: true
+ }
+ ],
+ 'string': /("|')(\\(?:\r\n|[\s\S])|(?!\1)[^\\\r\n])*\1/,
+ 'class-name': {
+ pattern: /((?:(?:class|interface|extends|implements|trait|instanceof|new)\s+)|(?:catch\s+\())[a-z0-9_\.\\]+/i,
+ lookbehind: true,
+ inside: {
+ punctuation: /(\.|\\)/
+ }
+ },
+ 'keyword': /\b(if|else|while|do|for|return|in|instanceof|function|new|try|throw|catch|finally|null|break|continue)\b/,
+ 'boolean': /\b(true|false)\b/,
+ 'function': /[a-z0-9_]+(?=\()/i,
+ 'number': /\b-?(0x[\dA-Fa-f]+|\d*\.?\d+([Ee]-?\d+)?)\b/,
+ 'operator': /[-+]{1,2}|!|<=?|>=?|={1,3}|&{1,2}|\|?\||\?|\*|\/|~|\^|%/,
+ 'punctuation': /[{}[\];(),.:]/
+};
+;
+Prism.languages.javascript = Prism.languages.extend('clike', {
+ 'keyword': /\b(as|async|await|break|case|catch|class|const|continue|debugger|default|delete|do|else|enum|export|extends|false|finally|for|from|function|get|if|implements|import|in|instanceof|interface|let|new|null|of|package|private|protected|public|return|set|static|super|switch|this|throw|true|try|typeof|var|void|while|with|yield)\b/,
+ 'number': /\b-?(0x[\dA-Fa-f]+|0b[01]+|0o[0-7]+|\d*\.?\d+([Ee][+-]?\d+)?|NaN|Infinity)\b/,
+ 'function': /(?!\d)[a-z0-9_$]+(?=\()/i
+});
+
+Prism.languages.insertBefore('javascript', 'keyword', {
+ 'regex': {
+ pattern: /(^|[^/])\/(?!\/)(\[.+?]|\\.|[^/\\\r\n])+\/[gimyu]{0,5}(?=\s*($|[\r\n,.;})]))/,
+ lookbehind: true
+ }
+});
+
+Prism.languages.insertBefore('javascript', 'class-name', {
+ 'template-string': {
+ pattern: /`(?:\\`|\\?[^`])*`/,
+ inside: {
+ 'interpolation': {
+ pattern: /\$\{[^}]+\}/,
+ inside: {
+ 'interpolation-punctuation': {
+ pattern: /^\$\{|\}$/,
+ alias: 'punctuation'
+ },
+ rest: Prism.languages.javascript
+ }
+ },
+ 'string': /[\s\S]+/
+ }
+ }
+});
+
+if (Prism.languages.markup) {
+ Prism.languages.insertBefore('markup', 'tag', {
+ 'script': {
+ pattern: /<script[\w\W]*?>[\w\W]*?<\/script>/i,
+ inside: {
+ 'tag': {
+ pattern: /<script[\w\W]*?>|<\/script>/i,
+ inside: Prism.languages.markup.tag.inside
+ },
+ rest: Prism.languages.javascript
+ },
+ alias: 'language-javascript'
+ }
+ });
+}
+;
--- /dev/null
+Description: Always use system libgit2
+Author: Ximin Luo <infinity0@debian.org>
+Forwarded: not-needed
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+--- a/vendor/libgit2-sys-0.7.1/build.rs
++++ b/vendor/libgit2-sys-0.7.1/build.rs
+@@ -31,12 +31,11 @@
+ }
+ let has_pkgconfig = Command::new("pkg-config").output().is_ok();
+
+- if env::var("LIBGIT2_SYS_USE_PKG_CONFIG").is_ok() {
+- if pkg_config::find_library("libgit2").is_ok() {
+- return
+- }
++ if pkg_config::find_library("libgit2").is_ok() {
++ return
+ }
+
++
+ if !Path::new("libgit2/.git").exists() {
+ let _ = Command::new("git").args(&["submodule", "update", "--init"])
+ .status();
--- /dev/null
+Description: Disable network tests
+Author: Ximin Luo <infinity0@debian.org>
+Forwarded: TODO
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+--- a/tests/testsuite/build_auth.rs
++++ b/tests/testsuite/build_auth.rs
+@@ -1,269 +1,269 @@
+-use std;
+-use std::collections::HashSet;
+-use std::io::prelude::*;
+-use std::net::TcpListener;
+-use std::thread;
+-
+-use git2;
+-use bufstream::BufStream;
+-use cargotest::support::paths;
+-use cargotest::support::{execs, project};
+-use hamcrest::assert_that;
++// use std;
++// use std::collections::HashSet;
++// use std::io::prelude::*;
++// use std::net::TcpListener;
++// use std::thread;
++
++// use git2;
++// use bufstream::BufStream;
++// use cargotest::support::paths;
++// use cargotest::support::{execs, project};
++// use hamcrest::assert_that;
+
+ // Test that HTTP auth is offered from `credential.helper`
+-#[test]
+-fn http_auth_offered() {
+- let server = TcpListener::bind("127.0.0.1:0").unwrap();
+- let addr = server.local_addr().unwrap();
+-
+- fn headers(rdr: &mut BufRead) -> HashSet<String> {
+- let valid = ["GET", "Authorization", "Accept", "User-Agent"];
+- rdr.lines()
+- .map(|s| s.unwrap())
+- .take_while(|s| s.len() > 2)
+- .map(|s| s.trim().to_string())
+- .filter(|s| valid.iter().any(|prefix| s.starts_with(*prefix)))
+- .collect()
+- }
+-
+- let t = thread::spawn(move || {
+- let mut conn = BufStream::new(server.accept().unwrap().0);
+- let req = headers(&mut conn);
+- let user_agent = "User-Agent: git/2.0 (libgit2 0.27.0)";
+- conn.write_all(
+- b"\
+- HTTP/1.1 401 Unauthorized\r\n\
+- WWW-Authenticate: Basic realm=\"wheee\"\r\n
+- \r\n\
+- ",
+- ).unwrap();
+- assert_eq!(
+- req,
+- vec![
+- "GET /foo/bar/info/refs?service=git-upload-pack HTTP/1.1",
+- "Accept: */*",
+- user_agent,
+- ].into_iter()
+- .map(|s| s.to_string())
+- .collect()
+- );
+- drop(conn);
+-
+- let mut conn = BufStream::new(server.accept().unwrap().0);
+- let req = headers(&mut conn);
+- conn.write_all(
+- b"\
+- HTTP/1.1 401 Unauthorized\r\n\
+- WWW-Authenticate: Basic realm=\"wheee\"\r\n
+- \r\n\
+- ",
+- ).unwrap();
+- assert_eq!(
+- req,
+- vec![
+- "GET /foo/bar/info/refs?service=git-upload-pack HTTP/1.1",
+- "Authorization: Basic Zm9vOmJhcg==",
+- "Accept: */*",
+- user_agent,
+- ].into_iter()
+- .map(|s| s.to_string())
+- .collect()
+- );
+- });
+-
+- let script = project("script")
+- .file(
+- "Cargo.toml",
+- r#"
+- [project]
+- name = "script"
+- version = "0.0.1"
+- authors = []
+- "#,
+- )
+- .file(
+- "src/main.rs",
+- r#"
+- fn main() {
+- println!("username=foo");
+- println!("password=bar");
+- }
+- "#,
+- )
+- .build();
+-
+- assert_that(script.cargo("build").arg("-v"), execs().with_status(0));
+- let script = script.bin("script");
+-
+- let config = paths::home().join(".gitconfig");
+- let mut config = git2::Config::open(&config).unwrap();
+- config
+- .set_str("credential.helper", &script.display().to_string())
+- .unwrap();
+-
+- let p = project("foo")
+- .file(
+- "Cargo.toml",
+- &format!(
+- r#"
+- [project]
+- name = "foo"
+- version = "0.0.1"
+- authors = []
+-
+- [dependencies.bar]
+- git = "http://127.0.0.1:{}/foo/bar"
+- "#,
+- addr.port()
+- ),
+- )
+- .file("src/main.rs", "")
+- .file(
+- ".cargo/config",
+- "\
+- [net]
+- retry = 0
+- ",
+- )
+- .build();
+-
+- // This is a "contains" check because the last error differs by platform,
+- // may span multiple lines, and isn't relevant to this test.
+- assert_that(
+- p.cargo("build"),
+- execs().with_status(101).with_stderr_contains(&format!(
+- "\
+-[UPDATING] git repository `http://{addr}/foo/bar`
+-[ERROR] failed to load source for a dependency on `bar`
+-
+-Caused by:
+- Unable to update http://{addr}/foo/bar
+-
+-Caused by:
+- failed to clone into: [..]
+-
+-Caused by:
+- failed to authenticate when downloading repository
+-attempted to find username/password via `credential.helper`, but [..]
+-
+-Caused by:
+-",
+- addr = addr
+- )),
+- );
+-
+- t.join().ok().unwrap();
+-}
+-
+-// Boy, sure would be nice to have a TLS implementation in rust!
+-#[test]
+-fn https_something_happens() {
+- let server = TcpListener::bind("127.0.0.1:0").unwrap();
+- let addr = server.local_addr().unwrap();
+- let t = thread::spawn(move || {
+- let mut conn = server.accept().unwrap().0;
+- drop(conn.write(b"1234"));
+- drop(conn.shutdown(std::net::Shutdown::Write));
+- drop(conn.read(&mut [0; 16]));
+- });
+-
+- let p = project("foo")
+- .file(
+- "Cargo.toml",
+- &format!(
+- r#"
+- [project]
+- name = "foo"
+- version = "0.0.1"
+- authors = []
+-
+- [dependencies.bar]
+- git = "https://127.0.0.1:{}/foo/bar"
+- "#,
+- addr.port()
+- ),
+- )
+- .file("src/main.rs", "")
+- .file(
+- ".cargo/config",
+- "\
+- [net]
+- retry = 0
+- ",
+- )
+- .build();
+-
+- assert_that(
+- p.cargo("build").arg("-v"),
+- execs()
+- .with_status(101)
+- .with_stderr_contains(&format!(
+- "[UPDATING] git repository `https://{addr}/foo/bar`",
+- addr = addr
+- ))
+- .with_stderr_contains(&format!(
+- "\
+-Caused by:
+- {errmsg}
+-",
+- errmsg = if cfg!(windows) {
+- "[..]failed to send request: [..]"
+- } else if cfg!(target_os = "macos") {
+- // OSX is difficult to tests as some builds may use
+- // Security.framework and others may use OpenSSL. In that case let's
+- // just not verify the error message here.
+- "[..]"
+- } else {
+- "[..]SSL error: [..]"
+- }
+- )),
+- );
++//#[test]
++// fn http_auth_offered() {
++// let server = TcpListener::bind("127.0.0.1:0").unwrap();
++// let addr = server.local_addr().unwrap();
++
++// fn headers(rdr: &mut BufRead) -> HashSet<String> {
++// let valid = ["GET", "Authorization", "Accept", "User-Agent"];
++// rdr.lines()
++// .map(|s| s.unwrap())
++// .take_while(|s| s.len() > 2)
++// .map(|s| s.trim().to_string())
++// .filter(|s| valid.iter().any(|prefix| s.starts_with(*prefix)))
++// .collect()
++// }
++
++// let t = thread::spawn(move || {
++// let mut conn = BufStream::new(server.accept().unwrap().0);
++// let req = headers(&mut conn);
++// let user_agent = "User-Agent: git/2.0 (libgit2 0.27.0)";
++// conn.write_all(
++// b"\
++// HTTP/1.1 401 Unauthorized\r\n\
++// WWW-Authenticate: Basic realm=\"wheee\"\r\n
++// \r\n\
++// ",
++// ).unwrap();
++// assert_eq!(
++// req,
++// vec![
++// "GET /foo/bar/info/refs?service=git-upload-pack HTTP/1.1",
++// "Accept: */*",
++// user_agent,
++// ].into_iter()
++// .map(|s| s.to_string())
++// .collect()
++// );
++// drop(conn);
++
++// let mut conn = BufStream::new(server.accept().unwrap().0);
++// let req = headers(&mut conn);
++// conn.write_all(
++// b"\
++// HTTP/1.1 401 Unauthorized\r\n\
++// WWW-Authenticate: Basic realm=\"wheee\"\r\n
++// \r\n\
++// ",
++// ).unwrap();
++// assert_eq!(
++// req,
++// vec![
++// "GET /foo/bar/info/refs?service=git-upload-pack HTTP/1.1",
++// "Authorization: Basic Zm9vOmJhcg==",
++// "Accept: */*",
++// user_agent,
++// ].into_iter()
++// .map(|s| s.to_string())
++// .collect()
++// );
++// });
++
++// let script = project("script")
++// .file(
++// "Cargo.toml",
++// r#"
++// [project]
++// name = "script"
++// version = "0.0.1"
++// authors = []
++// "#,
++// )
++// .file(
++// "src/main.rs",
++// r#"
++// fn main() {
++// println!("username=foo");
++// println!("password=bar");
++// }
++// "#,
++// )
++// .build();
++
++// assert_that(script.cargo("build").arg("-v"), execs().with_status(0));
++// let script = script.bin("script");
++
++// let config = paths::home().join(".gitconfig");
++// let mut config = git2::Config::open(&config).unwrap();
++// config
++// .set_str("credential.helper", &script.display().to_string())
++// .unwrap();
++
++// let p = project("foo")
++// .file(
++// "Cargo.toml",
++// &format!(
++// r#"
++// [project]
++// name = "foo"
++// version = "0.0.1"
++// authors = []
++
++// [dependencies.bar]
++// git = "http://127.0.0.1:{}/foo/bar"
++// "#,
++// addr.port()
++// ),
++// )
++// .file("src/main.rs", "")
++// .file(
++// ".cargo/config",
++// "\
++// [net]
++// retry = 0
++// ",
++// )
++// .build();
++
++// // This is a "contains" check because the last error differs by platform,
++// // may span multiple lines, and isn't relevant to this test.
++// assert_that(
++// p.cargo("build"),
++// execs().with_status(101).with_stderr_contains(&format!(
++// "\
++// [UPDATING] git repository `http://{addr}/foo/bar`
++// [ERROR] failed to load source for a dependency on `bar`
++
++// Caused by:
++// Unable to update http://{addr}/foo/bar
++
++// Caused by:
++// failed to clone into: [..]
++
++// Caused by:
++// failed to authenticate when downloading repository
++// attempted to find username/password via `credential.helper`, but [..]
++
++// Caused by:
++// ",
++// addr = addr
++// )),
++// );
++
++// t.join().ok().unwrap();
++// }
++
++// // Boy, sure would be nice to have a TLS implementation in rust!
++// //#[test]
++// fn https_something_happens() {
++// let server = TcpListener::bind("127.0.0.1:0").unwrap();
++// let addr = server.local_addr().unwrap();
++// let t = thread::spawn(move || {
++// let mut conn = server.accept().unwrap().0;
++// drop(conn.write(b"1234"));
++// drop(conn.shutdown(std::net::Shutdown::Write));
++// drop(conn.read(&mut [0; 16]));
++// });
++
++// let p = project("foo")
++// .file(
++// "Cargo.toml",
++// &format!(
++// r#"
++// [project]
++// name = "foo"
++// version = "0.0.1"
++// authors = []
++
++// [dependencies.bar]
++// git = "https://127.0.0.1:{}/foo/bar"
++// "#,
++// addr.port()
++// ),
++// )
++// .file("src/main.rs", "")
++// .file(
++// ".cargo/config",
++// "\
++// [net]
++// retry = 0
++// ",
++// )
++// .build();
++
++// assert_that(
++// p.cargo("build").arg("-v"),
++// execs()
++// .with_status(101)
++// .with_stderr_contains(&format!(
++// "[UPDATING] git repository `https://{addr}/foo/bar`",
++// addr = addr
++// ))
++// .with_stderr_contains(&format!(
++// "\
++// Caused by:
++// {errmsg}
++// ",
++// errmsg = if cfg!(windows) {
++// "[..]failed to send request: [..]"
++// } else if cfg!(target_os = "macos") {
++// // OSX is difficult to tests as some builds may use
++// // Security.framework and others may use OpenSSL. In that case let's
++// // just not verify the error message here.
++// "[..]"
++// } else {
++// "[..]SSL error: [..]"
++// }
++// )),
++// );
+
+- t.join().ok().unwrap();
+-}
++// t.join().ok().unwrap();
++// }
+
+ // Boy, sure would be nice to have an SSH implementation in rust!
+-#[test]
+-fn ssh_something_happens() {
+- let server = TcpListener::bind("127.0.0.1:0").unwrap();
+- let addr = server.local_addr().unwrap();
+- let t = thread::spawn(move || {
+- drop(server.accept().unwrap());
+- });
+-
+- let p = project("foo")
+- .file(
+- "Cargo.toml",
+- &format!(
+- r#"
+- [project]
+- name = "foo"
+- version = "0.0.1"
+- authors = []
+-
+- [dependencies.bar]
+- git = "ssh://127.0.0.1:{}/foo/bar"
+- "#,
+- addr.port()
+- ),
+- )
+- .file("src/main.rs", "")
+- .build();
+-
+- assert_that(
+- p.cargo("build").arg("-v"),
+- execs()
+- .with_status(101)
+- .with_stderr_contains(&format!(
+- "[UPDATING] git repository `ssh://{addr}/foo/bar`",
+- addr = addr
+- ))
+- .with_stderr_contains(
+- "\
+-Caused by:
+- [..]failed to start SSH session: Failed getting banner[..]
+-",
+- ),
+- );
+- t.join().ok().unwrap();
+-}
++//#[test]
++// fn ssh_something_happens() {
++// let server = TcpListener::bind("127.0.0.1:0").unwrap();
++// let addr = server.local_addr().unwrap();
++// let t = thread::spawn(move || {
++// drop(server.accept().unwrap());
++// });
++
++// let p = project("foo")
++// .file(
++// "Cargo.toml",
++// &format!(
++// r#"
++// [project]
++// name = "foo"
++// version = "0.0.1"
++// authors = []
++
++// [dependencies.bar]
++// git = "ssh://127.0.0.1:{}/foo/bar"
++// "#,
++// addr.port()
++// ),
++// )
++// .file("src/main.rs", "")
++// .build();
++
++// assert_that(
++// p.cargo("build").arg("-v"),
++// execs()
++// .with_status(101)
++// .with_stderr_contains(&format!(
++// "[UPDATING] git repository `ssh://{addr}/foo/bar`",
++// addr = addr
++// ))
++// .with_stderr_contains(
++// "\
++// Caused by:
++// [..]failed to start SSH session: Failed getting banner[..]
++// ",
++// ),
++// );
++// t.join().ok().unwrap();
++// }
+--- a/tests/testsuite/net_config.rs
++++ b/tests/testsuite/net_config.rs
+@@ -1,75 +1,75 @@
+-use cargotest::support::{execs, project};
+-use hamcrest::assert_that;
++// use cargotest::support::{execs, project};
++// use hamcrest::assert_that;
+
+-#[test]
+-fn net_retry_loads_from_config() {
+- let p = project("foo")
+- .file(
+- "Cargo.toml",
+- r#"
+- [project]
+- name = "foo"
+- version = "0.0.1"
+- authors = []
+-
+- [dependencies.bar]
+- git = "https://127.0.0.1:11/foo/bar"
+- "#,
+- )
+- .file("src/main.rs", "")
+- .file(
+- ".cargo/config",
+- r#"
+- [net]
+- retry=1
+- [http]
+- timeout=1
+- "#,
+- )
+- .build();
+-
+- assert_that(
+- p.cargo("build").arg("-v"),
+- execs().with_status(101).with_stderr_contains(
+- "[WARNING] spurious network error \
+- (1 tries remaining): [..]",
+- ),
+- );
+-}
+-
+-#[test]
+-fn net_retry_git_outputs_warning() {
+- let p = project("foo")
+- .file(
+- "Cargo.toml",
+- r#"
+- [project]
+- name = "foo"
+- version = "0.0.1"
+- authors = []
+-
+- [dependencies.bar]
+- git = "https://127.0.0.1:11/foo/bar"
+- "#,
+- )
+- .file(
+- ".cargo/config",
+- r#"
+- [http]
+- timeout=1
+- "#,
+- )
+- .file("src/main.rs", "")
+- .build();
+-
+- assert_that(
+- p.cargo("build").arg("-v").arg("-j").arg("1"),
+- execs()
+- .with_status(101)
+- .with_stderr_contains(
+- "[WARNING] spurious network error \
+- (2 tries remaining): [..]",
+- )
+- .with_stderr_contains("[WARNING] spurious network error (1 tries remaining): [..]"),
+- );
+-}
++//#[test]
++// fn net_retry_loads_from_config() {
++// let p = project("foo")
++// .file(
++// "Cargo.toml",
++// r#"
++// [project]
++// name = "foo"
++// version = "0.0.1"
++// authors = []
++
++// [dependencies.bar]
++// git = "https://127.0.0.1:11/foo/bar"
++// "#,
++// )
++// .file("src/main.rs", "")
++// .file(
++// ".cargo/config",
++// r#"
++// [net]
++// retry=1
++// [http]
++// timeout=1
++// "#,
++// )
++// .build();
++
++// assert_that(
++// p.cargo("build").arg("-v"),
++// execs().with_status(101).with_stderr_contains(
++// "[WARNING] spurious network error \
++// (1 tries remaining): [..]",
++// ),
++// );
++// }
++
++//#[test]
++// fn net_retry_git_outputs_warning() {
++// let p = project("foo")
++// .file(
++// "Cargo.toml",
++// r#"
++// [project]
++// name = "foo"
++// version = "0.0.1"
++// authors = []
++
++// [dependencies.bar]
++// git = "https://127.0.0.1:11/foo/bar"
++// "#,
++// )
++// .file(
++// ".cargo/config",
++// r#"
++// [http]
++// timeout=1
++// "#,
++// )
++// .file("src/main.rs", "")
++// .build();
++
++// assert_that(
++// p.cargo("build").arg("-v").arg("-j").arg("1"),
++// execs()
++// .with_status(101)
++// .with_stderr_contains(
++// "[WARNING] spurious network error \
++// (2 tries remaining): [..]",
++// )
++// .with_stderr_contains("[WARNING] spurious network error (1 tries remaining): [..]"),
++// );
++// }
+--- a/tests/testsuite/test.rs
++++ b/tests/testsuite/test.rs
+@@ -3049,114 +3049,114 @@
+ );
+ }
+
+-#[test]
+-fn pass_correct_cfgs_flags_to_rustdoc() {
+- let p = project("foo")
+- .file(
+- "Cargo.toml",
+- r#"
+- [package]
+- name = "foo"
+- version = "0.1.0"
+- authors = []
+-
+- [features]
+- default = ["feature_a/default"]
+- nightly = ["feature_a/nightly"]
+-
+- [dependencies.feature_a]
+- path = "libs/feature_a"
+- default-features = false
+- "#,
+- )
+- .file(
+- "src/lib.rs",
+- r#"
+- #[cfg(test)]
+- mod tests {
+- #[test]
+- fn it_works() {
+- assert!(true);
+- }
+- }
+- "#,
+- )
+- .file(
+- "libs/feature_a/Cargo.toml",
+- r#"
+- [package]
+- name = "feature_a"
+- version = "0.1.0"
+- authors = []
+-
+- [features]
+- default = ["mock_serde_codegen"]
+- nightly = ["mock_serde_derive"]
+-
+- [dependencies]
+- mock_serde_derive = { path = "../mock_serde_derive", optional = true }
+-
+- [build-dependencies]
+- mock_serde_codegen = { path = "../mock_serde_codegen", optional = true }
+- "#,
+- )
+- .file(
+- "libs/feature_a/src/lib.rs",
+- r#"
+- #[cfg(feature = "mock_serde_derive")]
+- const MSG: &'static str = "This is safe";
+-
+- #[cfg(feature = "mock_serde_codegen")]
+- const MSG: &'static str = "This is risky";
+-
+- pub fn get() -> &'static str {
+- MSG
+- }
+- "#,
+- )
+- .file(
+- "libs/mock_serde_derive/Cargo.toml",
+- r#"
+- [package]
+- name = "mock_serde_derive"
+- version = "0.1.0"
+- authors = []
+- "#,
+- )
+- .file("libs/mock_serde_derive/src/lib.rs", "")
+- .file(
+- "libs/mock_serde_codegen/Cargo.toml",
+- r#"
+- [package]
+- name = "mock_serde_codegen"
+- version = "0.1.0"
+- authors = []
+- "#,
+- )
+- .file("libs/mock_serde_codegen/src/lib.rs", "");
+- let p = p.build();
+-
+- assert_that(
+- p.cargo("test")
+- .arg("--package")
+- .arg("feature_a")
+- .arg("--verbose"),
+- execs().with_status(0).with_stderr_contains(
+- "\
+-[DOCTEST] feature_a
+-[RUNNING] `rustdoc --test [..]mock_serde_codegen[..]`",
+- ),
+- );
+-
+- assert_that(
+- p.cargo("test").arg("--verbose"),
+- execs().with_status(0).with_stderr_contains(
+- "\
+-[DOCTEST] foo
+-[RUNNING] `rustdoc --test [..]feature_a[..]`",
+- ),
+- );
+-}
++//#[test]
++// fn pass_correct_cfgs_flags_to_rustdoc() {
++// let p = project("foo")
++// .file(
++// "Cargo.toml",
++// r#"
++// [package]
++// name = "foo"
++// version = "0.1.0"
++// authors = []
++
++// [features]
++// default = ["feature_a/default"]
++// nightly = ["feature_a/nightly"]
++
++// [dependencies.feature_a]
++// path = "libs/feature_a"
++// default-features = false
++// "#,
++// )
++// .file(
++// "src/lib.rs",
++// r#"
++// #[cfg(test)]
++// mod tests {
++// #[test]
++// fn it_works() {
++// assert!(true);
++// }
++// }
++// "#,
++// )
++// .file(
++// "libs/feature_a/Cargo.toml",
++// r#"
++// [package]
++// name = "feature_a"
++// version = "0.1.0"
++// authors = []
++
++// [features]
++// default = ["mock_serde_codegen"]
++// nightly = ["mock_serde_derive"]
++
++// [dependencies]
++// mock_serde_derive = { path = "../mock_serde_derive", optional = true }
++
++// [build-dependencies]
++// mock_serde_codegen = { path = "../mock_serde_codegen", optional = true }
++// "#,
++// )
++// .file(
++// "libs/feature_a/src/lib.rs",
++// r#"
++// #[cfg(feature = "mock_serde_derive")]
++// const MSG: &'static str = "This is safe";
++
++// #[cfg(feature = "mock_serde_codegen")]
++// const MSG: &'static str = "This is risky";
++
++// pub fn get() -> &'static str {
++// MSG
++// }
++// "#,
++// )
++// .file(
++// "libs/mock_serde_derive/Cargo.toml",
++// r#"
++// [package]
++// name = "mock_serde_derive"
++// version = "0.1.0"
++// authors = []
++// "#,
++// )
++// .file("libs/mock_serde_derive/src/lib.rs", "")
++// .file(
++// "libs/mock_serde_codegen/Cargo.toml",
++// r#"
++// [package]
++// name = "mock_serde_codegen"
++// version = "0.1.0"
++// authors = []
++// "#,
++// )
++// .file("libs/mock_serde_codegen/src/lib.rs", "");
++// let p = p.build();
++
++// assert_that(
++// p.cargo("test")
++// .arg("--package")
++// .arg("feature_a")
++// .arg("--verbose"),
++// execs().with_status(0).with_stderr_contains(
++// "\
++// [DOCTEST] feature_a
++// [RUNNING] `rustdoc --test [..]mock_serde_codegen[..]`",
++// ),
++// );
++
++// assert_that(
++// p.cargo("test").arg("--verbose"),
++// execs().with_status(0).with_stderr_contains(
++// "\
++// [DOCTEST] foo
++// [RUNNING] `rustdoc --test [..]feature_a[..]`",
++// ),
++// );
++// }
+
+ #[test]
+ fn test_release_ignore_panic() {
+--- a/tests/testsuite/generate_lockfile.rs
++++ b/tests/testsuite/generate_lockfile.rs
+@@ -2,8 +2,8 @@
+ use std::io::prelude::*;
+
+ use cargotest::support::{execs, project};
+-use cargotest::support::registry::Package;
+-use cargotest::ChannelChanger;
++// use cargotest::support::registry::Package;
++// use cargotest::ChannelChanger;
+ use hamcrest::{assert_that, existing_file, is_not};
+
+ #[test]
+@@ -90,38 +90,37 @@
+ assert_eq!(lock1, lock4);
+ }
+
+-#[test]
+-fn no_index_update() {
+- Package::new("serde", "1.0.0").publish();
+-
+- let p = project("foo")
+- .file(
+- "Cargo.toml",
+- r#"
+- [package]
+- name = "foo"
+- authors = []
+- version = "0.0.1"
+-
+- [dependencies]
+- serde = "1.0"
+- "#,
+- )
+- .file("src/main.rs", "fn main() {}")
+- .build();
++// fn no_index_update() {
++// Package::new("serde", "1.0.0").publish();
+
+- assert_that(
+- p.cargo("generate-lockfile"),
+- execs().with_stderr("[UPDATING] registry `[..]`"),
+- );
+-
+- assert_that(
+- p.cargo("generate-lockfile")
+- .masquerade_as_nightly_cargo()
+- .arg("-Zno-index-update"),
+- execs().with_status(0).with_stdout("").with_stderr(""),
+- );
+-}
++// let p = project("foo")
++// .file(
++// "Cargo.toml",
++// r#"
++// [package]
++// name = "foo"
++// authors = []
++// version = "0.0.1"
++
++// [dependencies]
++// serde = "1.0"
++// "#,
++// )
++// .file("src/main.rs", "fn main() {}")
++// .build();
++
++// assert_that(
++// p.cargo("generate-lockfile"),
++// execs().with_stderr("[UPDATING] registry `[..]`"),
++// );
++
++// assert_that(
++// p.cargo("generate-lockfile")
++// .masquerade_as_nightly_cargo()
++// .arg("-Zno-index-update"),
++// execs().with_status(0).with_stdout("").with_stderr(""),
++// );
++// }
+
+ #[test]
+ fn preserve_metadata() {
--- /dev/null
+Description: Remove stage1 extra dependencies (cargo 0.17)
+ Fetching extra dev-dependencies is not yet supported by bootstrap.py,
+ and they are not needed for the build. However, cargo will try to
+ download them before building stage1.
+ .
+ For patches for earlier versions of cargo, see the git history for this file.
+From: Luca Bruno <lucab@debian.org>
+From: Ximin Luo <infinity0@debian.org>
+Forwarded: not-needed
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+--- a/Cargo.toml
++++ b/Cargo.toml
+@@ -61,27 +61,27 @@
+ [target.'cfg(target_os = "macos")'.dependencies]
+ core-foundation = { version = "0.5.1", features = ["mac_os_10_7_support"] }
+
+-[target.'cfg(windows)'.dependencies]
+-miow = "0.3"
++# [target.'cfg(windows)'.dependencies]
++# miow = "0.3"
+
+-[target.'cfg(windows)'.dependencies.winapi]
+-version = "0.3"
+-features = [
+- "handleapi",
+- "jobapi",
+- "jobapi2",
+- "minwindef",
+- "ntdef",
+- "ntstatus",
+- "processenv",
+- "processthreadsapi",
+- "psapi",
+- "synchapi",
+- "winerror",
+- "winbase",
+- "wincon",
+- "winnt",
+-]
++# [target.'cfg(windows)'.dependencies.winapi]
++# version = "0.3"
++# features = [
++# "handleapi",
++# "jobapi",
++# "jobapi2",
++# "minwindef",
++# "ntdef",
++# "ntstatus",
++# "processenv",
++# "processthreadsapi",
++# "psapi",
++# "synchapi",
++# "winerror",
++# "winbase",
++# "wincon",
++# "winnt",
++# ]
+
+ [dev-dependencies]
+ bufstream = "0.1"
--- /dev/null
+Description: Disable incremental builds on sparc64
+ Incremental builds are currently unreliable on sparc64,
+ disable them by default for the time being.
+Last-Update: 2018-04-09
+
+--- a/src/cargo/core/manifest.rs
++++ b/src/cargo/core/manifest.rs
+@@ -711,6 +711,9 @@
+ debuginfo: Some(2),
+ debug_assertions: true,
+ overflow_checks: true,
++ #[cfg(target_arch = "sparc64")]
++ incremental: false,
++ #[cfg(not(target_arch = "sparc64"))]
+ incremental: true,
+ ..Profile::default()
+ }
--- /dev/null
+Bug: https://github.com/rust-lang/cargo/pull/5614
+Author: Ximin Luo <infinity0@pwned.gg>
+Date: Wed Jun 6 22:30:10 2018 -0700
+
+ Support cross-compile install
+
+diff --git a/src/bin/cargo/commands/install.rs b/src/bin/cargo/commands/install.rs
+index f0c65515..efac1a94 100644
+--- a/src/bin/commands/install.rs
++++ b/src/bin/commands/install.rs
+@@ -32,6 +32,7 @@ pub fn cli() -> App {
+ "Install only the specified example",
+ "Install all examples",
+ )
++ .arg_target_triple("Build for the target triple")
+ .arg(opt("root", "Directory to install packages into").value_name("DIR"))
+ .after_help(
+ "\
--- /dev/null
+0xxx: Grabbed from upstream development.
+1xxx: Possibly relevant for upstream adoption.
+2xxx: Only relevant for official Debian release.
--- /dev/null
+2007_sparc64_disable_incremental_build.patch
+2004_clean-cargo-deps.patch
+2001_use-system-libgit2.patch
+2002_disable-net-tests.patch
+2008_support-cross-compile-install.patch
--- /dev/null
+#!/usr/bin/make -f
+
+include /usr/share/dpkg/pkg-info.mk
+include /usr/share/dpkg/architecture.mk
+include /usr/share/dpkg/buildflags.mk
+include /usr/share/rustc/architecture.mk
+
+# For cross-compiling; perhaps we should add this to dh-cargo or debcargo
+export PKG_CONFIG = $(DEB_HOST_GNU_TYPE)-pkg-config
+export PKG_CONFIG_ALLOW_CROSS = 1
+# Needed because of https://github.com/rust-lang/cargo/issues/4133
+RUSTFLAGS += -C linker=$(DEB_HOST_GNU_TYPE)-gcc
+# TODO: we cannot enable this until dh_shlibdeps works correctly; at the moment we get:
+# dpkg-shlibdeps: warning: can't extract name and version from library name 'libstd-XXXXXXXX.so'
+# and the resulting cargo.deb does not depend on the correct version of libstd-rust-1.XX
+# We probably need to add override_dh_makeshlibs to d/rules of rustc
+#RUSTFLAGS += -C prefer-dynamic
+
+# Pass on dpkg-buildflags stuff
+RUSTFLAGS += $(foreach flag,$(LDFLAGS),-C link-arg=$(flag))
+export CFLAGS CXXFLAGS CPPFLAGS LDFLAGS RUSTFLAGS
+
+CARGO = RUST_BACKTRACE=1 cargo
+CARGOFLAGS = --release --target=$(DEB_HOST_RUST_TYPE) --verbose
+# Cargo looks for its config in, and writes its cache to, $CARGO_HOME/
+export CARGO_HOME = $(CURDIR)/debian/cargohome
+
+# Work around #865549, needed when using an older cargo that is affected by it.
+# TODO: remove after cargo/unstable is a version that was compiled with rustc
+# >= 1.18 where this bug was fixed in Debian (i.e. probably after 0.20).
+ifeq (0,$(shell test $$(uname -s) = "Linux" -a $$(getconf PAGESIZE) -gt 4096; echo $$?))
+ SYSTEM_WORKAROUNDS += ulimit -s $$(expr $$(getconf PAGESIZE) / 1024 '*' 256 + 8192);
+endif
+
+# Disable tests on powerpc and powerpcspe for now.
+# cbmuser requested this and will fix the errors later.
+# TODO: drop this once the powerpc/powerpcspe test failures are fixed
+ifeq ($(DEB_HOST_ARCH), powerpc)
+ DEB_BUILD_PROFILES = nocheck
+endif
+
+ifeq ($(DEB_HOST_ARCH), powerpcspe)
+ DEB_BUILD_PROFILES = nocheck
+endif
+
+%:
+ $(SYSTEM_WORKAROUNDS) dh $@ --with bash-completion
+
+override_dh_auto_configure:
+ # cp -a $(CURDIR)/Cargo.lock $(CURDIR)/.Cargo.lock.orig
+ifneq ($(filter pkg.cargo.mkstage0,$(DEB_BUILD_PROFILES)),)
+ # NOTE: this very likely doesn't work any more, see bootstrap.py for details
+ # Instead, you can try to bootstrap by setting PATH to a binary cargo
+ # downloaded from upstream; or by cross-compiling, see "crossbuild" below
+ # Preserved in case someone wants to resurrect it later:
+ # Bootstrap cargo stage0
+ ./debian/bootstrap.py \
+ --no-clean \
+ --no-clone \
+ --no-git \
+ --no-download \
+ --crate-index $(CURDIR)/vendor/index / \
+ --cargo-root $(CURDIR)/ \
+ --target-dir $(CURDIR)/deps \
+ --host=$(DEB_HOST_RUST_TYPE) \
+ --target=$(DEB_TARGET_RUST_TYPE)
+ # Workaround for https://github.com/rust-lang/cargo/issues/1423
+ ln -s `find $(CURDIR)/deps -name 'cargo-*' -type f -executable` $(CURDIR)/cargo-stage0
+else
+ ln -sf `which cargo` $(CURDIR)/cargo-stage0
+endif
+ debian/scripts/prune-checksums vendor/backtrace-sys-*/
+ debian/scripts/prune-checksums vendor/libgit2-sys-*/
+
+override_dh_auto_build-arch:
+ $(CARGO) build $(CARGOFLAGS)
+
+override_dh_auto_build-indep:
+ifeq (,$(findstring nodoc,$(DEB_BUILD_PROFILES)))
+ $(CARGO) doc $(CARGOFLAGS)
+ # Extra instructions from README.md, unfortunately not done by "cargo doc"
+ # sh src/ci/dox.sh
+ # Post-processing for Debian
+ cd target/doc/ && rm -f jquery.js && ln -s /usr/share/javascript/jquery/jquery.js
+endif
+
+override_dh_auto_test:
+ifeq (,$(findstring nocheck,$(DEB_BUILD_PROFILES)))
+ifeq (,$(findstring nocheck,$(DEB_BUILD_OPTIONS)))
+ CFG_DISABLE_CROSS_TESTS=1 $(CARGO) test $(CARGOFLAGS)
+endif
+endif
+
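The test override above is guarded twice: once on DEB_BUILD_PROFILES and once on DEB_BUILD_OPTIONS. A small Python mirror of that logic (variable values are hypothetical):

```python
# Mirror of the double "nocheck" guard above: tests run only when neither
# DEB_BUILD_PROFILES nor DEB_BUILD_OPTIONS contains "nocheck".
build_profiles = ""                   # hypothetical
build_options = "parallel=4 nocheck"  # hypothetical
if "nocheck" in build_profiles or "nocheck" in build_options:
    result = "tests skipped"
else:
    result = "tests run"
print(result)
```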
+override_dh_auto_install:
+ # We pick stuff directly from target/
+
+override_dh_auto_clean:
+ -mv $(CURDIR)/.Cargo.lock.orig $(CURDIR)/Cargo.lock
+ $(CARGO) clean $(CARGOFLAGS)
+ -$(RM) -r $(CURDIR)/target/ \
+ $(CURDIR)/.cargo \
+ $(CURDIR)/config.mk \
+ $(CURDIR)/config.stamp \
+ $(CURDIR)/Makefile \
+ $(CURDIR)/Cargo.lock \
+ $(CURDIR)/cargo-stage0
+
+override_dh_clean:
+ # Upstream contains a lot of these
+ dh_clean -XCargo.toml.orig
+
+CROSS_SBUILD = sbuild --profiles=nocheck \
+ --build-failed-commands '%SBUILD_SHELL' \
+ --host=$(DEB_HOST_ARCH) \
+ --no-arch-all $(EXTRA_SBUILD_FLAGS)
+
+SBUILD_REPO_EXPERIMENTAL = --extra-repository="deb http://httpredir.debian.org/debian experimental main"
+
+# Sometimes this is necessary, if the mirrors have different versions for different architectures
+ifeq (1,$(SBUILD_USE_INCOMING))
+CROSS_SBUILD += --extra-repository="deb http://incoming.debian.org/debian-buildd buildd-unstable main"
+SBUILD_REPO_EXPERIMENTAL += --extra-repository="deb http://incoming.debian.org/debian-buildd buildd-experimental main"
+endif
+
+crossbuild:
+ $(CROSS_SBUILD) .
+
+crossbuild-experimental:
+ $(CROSS_SBUILD) $(SBUILD_REPO_EXPERIMENTAL) --build-dep-resolver=aspcud .
--- /dev/null
+#!/usr/bin/python3
+# Copyright: 2015-2017 The Debian Project
+# License: MIT or Apache-2.0
+#
+# Guess the copyright of a cargo crate by looking at its git history.
+
+import datetime
+import os
+import subprocess
+import sys
+
+import pytoml
+
+this_year = datetime.datetime.now().year
+crates = sys.argv[1:]
+get_initial_commit = len(crates) == 1
+
+for crate in crates:
+ with open(os.path.join(crate, "Cargo.toml")) as fp:
+ data = pytoml.load(fp)
+ repo = data["package"].get("repository", None)
+ if get_initial_commit and repo:
+ output = subprocess.check_output(
+ """git clone -q --bare "%s" tmp.crate-copyright >&2 &&
+cd tmp.crate-copyright &&
+git log --format=%%cI --reverse | head -n1 | cut -b1-4 &&
+git log --format=%%cI | head -n1 | cut -b1-4 &&
+cd .. &&
+rm -rf tmp.crate-copyright""" % repo, shell=True).decode("utf-8")
+ first_year, last_year = output.strip().split(maxsplit=1)
+ else:
+ first_year = "20XX"
+ last_year = this_year
+ print("""Files: {0}
+Copyright: {1}
+License: {2}
+Comment: see {3}
+""".format(
+ os.path.join(crate, "*"),
+ "\n ".join("%s-%s %s" % (first_year, last_year, a.replace(" <>", "")) for a in data ["package"]["authors"]),
+ data["package"].get("license", "???").replace("/", " or "),
+ repo or "???"
+ ))
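The stanza formatting at the end of the script does three things worth spelling out: extra authors are indented onto continuation lines, empty `<>` emails are dropped, and `A/B` dual licenses are rewritten as `A or B`. A sketch with made-up author data:

```python
# Made-up sample mimicking a vendored crate's Cargo.toml [package] table.
package = {
    "authors": ["Jane Doe <jane@example.com>", "Anonymous <>"],
    "license": "MIT/Apache-2.0",
}
first_year, last_year = "2015", "2017"
# Empty "<>" emails are dropped; further authors land on continuation lines.
authors = "\n ".join("%s-%s %s" % (first_year, last_year, a.replace(" <>", ""))
                     for a in package["authors"])
# "A/B" dual licenses become the "A or B" form used in d/copyright.
license = package["license"].replace("/", " or ")
print(authors)
print(license)
```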
--- /dev/null
+#!/usr/bin/python3
+# Copyright: 2015-2017 The Debian Project
+# License: MIT or Apache-2.0
+#
+# Helper to remove removed-files from .cargo-checksum
+# TODO: rewrite to perl and add to dh-cargo, maybe?
+
+from collections import OrderedDict
+import argparse
+import json
+import os
+import sys
+
+def prune_keep(cfile):
+ with open(cfile) as fp:
+ sums = json.load(fp, object_pairs_hook=OrderedDict)
+
+ oldfiles = sums["files"]
+ newfiles = OrderedDict([entry for entry in oldfiles.items() if os.path.exists(entry[0])])
+ sums["files"] = newfiles
+
+ if len(oldfiles) == len(newfiles):
+ return
+
+ with open(cfile, "w") as fp:
+ json.dump(sums, fp, separators=(',', ':'))
+
+def prune(cfile):
+ with open(cfile, "r+") as fp:
+ sums = json.load(fp, object_pairs_hook=OrderedDict)
+ sums["files"] = {}
+ fp.seek(0)
+ json.dump(sums, fp, separators=(',', ':'))
+ fp.truncate()
+
+if __name__ == "__main__":
+ parser = argparse.ArgumentParser()
+ parser.add_argument("-k", "--keep", action="store_true", help="keep "
+ "checksums of files that still exist, and assume they haven't changed.")
+ parser.add_argument('crates', nargs=argparse.REMAINDER,
+ help="crates whose checksums to prune. (default: ./)")
+ args = parser.parse_args(sys.argv[1:])
+ crates = args.crates or ["."]
+ f = prune_keep if args.keep else prune
+ for c in crates:
+ cfile = os.path.join(c, ".cargo-checksum.json") if os.path.isdir(c) else c
+ f(cfile)
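The `--keep` path above can be sketched end-to-end on a throwaway crate directory (all file names here are invented): entries for files that no longer exist on disk are dropped from `.cargo-checksum.json`, and the rest are kept verbatim.

```python
# Self-contained sketch of the --keep behaviour on a temporary crate dir.
import json
import os
import tempfile
from collections import OrderedDict

with tempfile.TemporaryDirectory() as crate:
    open(os.path.join(crate, "kept.c"), "w").close()   # still on disk
    cfile = os.path.join(crate, ".cargo-checksum.json")
    with open(cfile, "w") as fp:
        json.dump({"files": {"kept.c": "aa", "deleted.c": "bb"},
                   "package": "cc"}, fp)

    # Prune entries whose files were removed from the vendor dir.
    with open(cfile) as fp:
        sums = json.load(fp, object_pairs_hook=OrderedDict)
    sums["files"] = OrderedDict(
        (f, h) for f, h in sums["files"].items()
        if os.path.exists(os.path.join(crate, f)))
    with open(cfile, "w") as fp:
        json.dump(sums, fp, separators=(',', ':'))

    print(open(cfile).read())  # {"files":{"kept.c":"aa"},"package":"cc"}
```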
--- /dev/null
+3.0 (quilt)
--- /dev/null
+# This is a list of files and dirs that should be filtered out of the
+# deps tarball for copyright/duplication reasons.
+curl-sys-*/curl/
+libgit2-sys-*/libgit2*/
+libssh2-sys-*/libssh2*/
+libz-sys-*/src/zlib-*/
+strsim-*/docs/
+# pre-built archive for WinAPI
+# <WindowsBunny> infinity0: So the libfoo.a in my hundreds of -sys crates are from MinGW-w64
+# <WindowsBunny> infinity0: They're import libraries that only matter when targeting pc-windows-gnu
+# <WindowsBunny> infinity0: And even then they only matter when the user doesn't have a full MinGW-w64 installation already
+# so for now we just omit them, because we don't target Windows in Debian (yet)
+# in the future we will have to figure out how to build them, as follows:
+# <infinity0> well, how do we know it's free software if we don't have the build scripts for it?
+# <WindowsBunny> infinity0: Well, the .def files are in the winapi repo
+# <WindowsBunny> infinity0: and you can uh... run whatever mingw tool to create the import libraries from those .def files
+dbghelp-sys-*/*/*.a
+
+# added in 0.24.0
+winapi-*/*/*.a
+secur32-*/*/*.a
+schannel-*/*/*.a
+
+# To clean deps before making a new tarball, run
+# grep -v '^#' debian/deps-tarball-filter.txt | xargs -I% sh -c 'rm -rf %'
+
+# We do not need the Sublime Text workspace file to build the package
+hex-*/rust-hex.sublime-workspace
+
+# added in 0.25.0
+
+# Embedded git data for test.
+libgit2-sys-*/libgit2/tests/resources/userdiff/.gitted
+libgit2-sys-*/libgit2/tests/resources/submod3/.gitted
+libgit2-sys-*/libgit2/tests/resources/win32-forbidden/.gitted
+libgit2-sys-*/libgit2/tests/resources/mergedrepo/.gitted
+libgit2-sys-*/libgit2/tests/resources/submod2_target/.gitted
+libgit2-sys-*/libgit2/tests/resources/submodules/.gitted
+libgit2-sys-*/libgit2/tests/resources/submodules/testrepo/.gitted
+libgit2-sys-*/libgit2/tests/resources/cherrypick/.gitted
+libgit2-sys-*/libgit2/tests/resources/testrepo2/.gitted
+libgit2-sys-*/libgit2/tests/resources/describe/.gitted
+libgit2-sys-*/libgit2/tests/resources/merge-recursive/.gitted
+libgit2-sys-*/libgit2/tests/resources/diff/.gitted
+libgit2-sys-*/libgit2/tests/resources/partial-testrepo/.gitted
+libgit2-sys-*/libgit2/tests/resources/nsecs/.gitted
+libgit2-sys-*/libgit2/tests/resources/issue_1397/.gitted
+libgit2-sys-*/libgit2/tests/resources/typechanges/.gitted
+libgit2-sys-*/libgit2/tests/resources/submod2/not/.gitted
+libgit2-sys-*/libgit2/tests/resources/submod2/.gitted
+libgit2-sys-*/libgit2/tests/resources/submod2/not-submodule/.gitted
+libgit2-sys-*/libgit2/tests/resources/push_src/.gitted
+libgit2-sys-*/libgit2/tests/resources/testrepo/.gitted
+libgit2-sys-*/libgit2/tests/resources/renames/.gitted
+libgit2-sys-*/libgit2/tests/resources/issue_592/.gitted
+libgit2-sys-*/libgit2/tests/resources/filemodes/.gitted
+libgit2-sys-*/libgit2/tests/resources/attr_index/.gitted
+libgit2-sys-*/libgit2/tests/resources/crlf/.gitted
+libgit2-sys-*/libgit2/tests/resources/diff_format_email/.gitted
+libgit2-sys-*/libgit2/tests/resources/submodule_with_path/.gitted
+libgit2-sys-*/libgit2/tests/resources/attr/.gitted
+libgit2-sys-*/libgit2/tests/resources/merge-resolve/.gitted
+libgit2-sys-*/libgit2/tests/resources/revert/.gitted
+libgit2-sys-*/libgit2/tests/resources/submodule_simple/.gitted
+libgit2-sys-*/libgit2/tests/resources/issue_592b/.gitted
+libgit2-sys-*/libgit2/tests/resources/super/.gitted
+libgit2-sys-*/libgit2/tests/resources/rebase-submodule/.gitted
+libgit2-sys-*/libgit2/tests/resources/indexv4/.gitted
+libgit2-sys-*/libgit2/tests/resources/merge-whitespace/.gitted
+libgit2-sys-*/libgit2/tests/resources/icase/.gitted
+libgit2-sys-*/libgit2/tests/resources/status/.gitted
+libgit2-sys-*/libgit2/tests/resources/nasty/.gitted
+libgit2-sys-*/libgit2/tests/resources/binaryunicode/.gitted
+libgit2-sys-*/libgit2/tests/resources/rebase/.gitted
+libgit2-sys-*/libgit2/tests/resources/empty_standard_repo/.gitted
+libgit2-sys-*/libgit2/tests/resources/crlf_data/
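To illustrate how patterns in this list match vendored paths, here is a rough sketch using `fnmatch` (the real cleanup command in the comment above uses shell globbing via grep/xargs, for which `fnmatch` is only an approximation: shell `*` does not cross `/`, while `fnmatch`'s does):

```python
# Approximate demonstration of pattern matching against vendored paths.
# Paths are hypothetical examples, not actual crate versions.
from fnmatch import fnmatch

patterns = ["curl-sys-*/curl/", "winapi-*/*/*.a"]
paths = [
    "curl-sys-0.3.10/curl/",             # matches first pattern
    "winapi-0.2.8/x86_64/libwinapi.a",   # matches second pattern
    "url-1.5.1/src/lib.rs",              # matches neither
]
for p in paths:
    print(p, any(fnmatch(p, pat) for pat in patterns))
```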
--- /dev/null
+# This is a list of files and dirs that are omitted from our custom
+# "suspicious files" scanner
+
+# test data
+flate2-*/tests/
+tar-*/tests/archives/
+term-*/tests/data/
+toml-*/tests/
+num-*/ci/
+openssl-*/test/
+
+# misc support data
+hamcrest-*/LICENSE-*
+*/.travis.yml
+# "build status" link-images etc take up a lot of line-length
+*/README.md
+
+# individual files, manually audited:
+idna-*/tests/IdnaTest.txt
+idna-*/src/uts46_mapping_table.rs
+regex-*/src/testdata/basic.dat
+regex-*/tests/fowler.rs
+libgit2-sys-*/libgit2/src/openssl_stream.c
+term-*/scripts/id_rsa.enc
+url-*/github.png
+num-*/doc/favicon.ico
+num-*/doc/rust-logo-128x128-blk-v2.png
+num-*/.travis/deploy.enc
+miniz-sys-*/miniz.c
+docopt-*/src/test/testcases.rs
+winapi-*/src/winnt.rs
+winapi-*/README.md
+itoa-*/performance.png
+dtoa-*/performance.png
+backtrace-sys-*/src/libbacktrace/configure
+unicode-normalization-*/src/tables.rs
+conv-*/src/errors.rs
+conv-*/src/macros.rs
+conv-*/src/impls.rs
+conv-*/src/lib.rs
+termion-*/logo.svg
+tar-*/Cargo.toml
+schannel-*/LICENSE.md
+lazy_static-*/src/lib.rs
+clap-*/.github/CONTRIBUTING.md
+clap-*/CONTRIBUTORS.md
+clap-*/CHANGELOG.md
+vec_map-*/Cargo.toml
+
+# test data for schannel
+schannel-*/test/*
+
+# libgit2 test files
+libgit2-sys-*/libgit2/tests/filter/crlf.h
+libgit2-sys-*/libgit2/tests/path/win32.c
+libgit2-sys-*/libgit2/tests/status/status_data.h
+libgit2-sys-*/libgit2/src/openssl-stream.c
+libgit2-sys-*/libgit2/src/hash/sha1dc/ubc_check.c
+
+# code of conduct files
+failure-*/CODE_OF_CONDUCT.md
+failure_derive-*/CODE_OF_CONDUCT.md
--- /dev/null
+version=3
+https://github.com/rust-lang/cargo/releases /rust-lang/cargo/archive/(\d+\.\d+\.\d+)\.tar\.gz
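A quick sanity check (with a hypothetical release number) that the watch-file pattern above captures the upstream version from a GitHub archive path:

```python
# Verify the version-capturing group of the watch-file regex.
import re

pattern = r"/rust-lang/cargo/archive/(\d+\.\d+\.\d+)\.tar\.gz"
m = re.search(pattern, "/rust-lang/cargo/archive/0.25.0.tar.gz")
print(m.group(1))  # 0.25.0
```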