--- /dev/null
+++ b/.gitignore
+/target
+Cargo.lock
+.cargo
+/config.stamp
+/Makefile
+/config.mk
+src/doc/build
+src/etc/*.pyc
+src/registry/target
+rustc
+__pycache__
+.idea/
+*.iml
+*.swp
--- /dev/null
+++ b/.travis.yml
+language: rust
+rust: stable
+sudo: required
+dist: trusty
+
+git:
+  depth: 1
+
+# Using 'cache: cargo' to cache target/ and all of $HOME/.cargo/
+# doesn't work well: the cache is large and it takes several minutes
+# to move it to and from S3. So instead we only cache the mdbook
+# binary.
+cache:
+  directories:
+    - $HOME/.cargo/bin/
+
+matrix:
+  include:
+    - env: TARGET=x86_64-unknown-linux-gnu
+           ALT=i686-unknown-linux-gnu
+    - env: TARGET=x86_64-apple-darwin
+           ALT=i686-apple-darwin
+      os: osx
+
+    - env: TARGET=x86_64-unknown-linux-gnu
+           ALT=i686-unknown-linux-gnu
+      rust: beta
+
+    - env: TARGET=x86_64-unknown-linux-gnu
+           ALT=i686-unknown-linux-gnu
+      rust: nightly
+      install:
+        - mdbook --help || cargo install mdbook --force
+      script:
+        - cargo test
+        - cargo doc --no-deps
+        - sh src/ci/dox.sh
+      after_success: |
+        [ $TRAVIS_BRANCH = master ] &&
+        [ $TRAVIS_PULL_REQUEST = false ] &&
+        [ $(uname -s) = Linux ] &&
+        pip install ghp-import --user &&
+        $HOME/.local/bin/ghp-import -n target/doc &&
+        git push -qf https://${GH_TOKEN}@github.com/${TRAVIS_REPO_SLUG}.git gh-pages 2>&1 >/dev/null
+
+  exclude:
+    - rust: stable
+
+before_script:
+  - rustup target add $ALT
+script:
+  - cargo test
+
+env:
+  global:
+    - secure: "hWheSLilMM4DXChfSy2XsDlLw338X2o+fw8bE590xxU2TzngFW8GUfq7lGfZEp/l4SNNIS6ROU/igyttCZtxZMANZ4aMQZR5E8Fp4yPOyE1pZLDH/LdQVXnROsfburQJeq+GIYIbZ01Abzh5ClpgLg5KX0H627uj063zZ7Ljo/w="
+
+notifications:
+  email:
+    on_success: never
+addons:
+  apt:
+    packages:
+      - gcc-multilib
--- /dev/null
+++ b/ARCHITECTURE.md
+# Cargo Architecture
+
+This document gives a high-level overview of Cargo's internals. You may
+find it useful if you want to contribute to Cargo or if you are
+interested in its inner workings.
+
+
+## Subcommands
+
+Cargo is organized as a set of subcommands. All subcommands live in
+the `src/bin` directory. However, only the `src/bin/cargo.rs` file
+produces an executable; the other files inside the `bin` directory are
+submodules. See `src/bin/cargo.rs` for how these subcommands get wired
+up with the main executable.
+
+A typical subcommand, such as `src/bin/build.rs`, parses command line
+options, reads the configuration files, discovers the Cargo project in
+the current directory and delegates the actual implementation to one
+of the functions in `src/cargo/ops/mod.rs`. This short file is a good
+place to find out about most of the things that Cargo can do.
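+
+For orientation, here is a deliberately simplified sketch of that
+flow. The types and function names below are illustrative stand-ins,
+not Cargo's real API; the actual signatures live in `src/bin/build.rs`
+and `src/cargo/ops/`.
+
+```rust
+// Hypothetical sketch of a subcommand; all names are stand-ins.
+struct Config;              // "global" configuration (see below)
+struct Workspace;           // the discovered Cargo project
+struct CompileOptions;      // parsed command line options
+
+fn discover_workspace(_config: &Config) -> Workspace { Workspace }
+fn compile(_ws: &Workspace, _opts: &CompileOptions) {} // an "op"
+
+// What a subcommand like `src/bin/build.rs` boils down to:
+fn execute(options: CompileOptions, config: Config) {
+    let ws = discover_workspace(&config); // walk up to find Cargo.toml
+    compile(&ws, &options);               // delegate to src/cargo/ops
+}
+
+fn main() {
+    execute(CompileOptions, Config);
+}
+```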
+
+
+## Important Data Structures
+
+There are some important data structures which are used throughout
+Cargo.
+
+`Config` is available almost everywhere and holds "global"
+information, such as `CARGO_HOME` or configuration from
+`.cargo/config` files. The `shell` method of `Config` is the entry
+point for printing status messages and other info to the console.
+
+`Workspace` is the description of the workspace for the current
+working directory. Each workspace contains at least one
+`Package`. Each package corresponds to a single `Cargo.toml`, and may
+define several `Target`s, such as the library, binaries, integration
+tests, or examples. Targets are crates (each target defines a crate
+root, like `src/lib.rs` or `examples/foo.rs`) and are what is actually
+compiled by `rustc`.
+
+A typical package defines a single library target and several
+auxiliary ones. Packages are the unit of dependency in Cargo: when
+package `foo` depends on package `bar`, that means that each target
+from `foo` needs the library target from `bar`.
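+
+A rough sketch of this containment hierarchy, with illustrative field
+and type names rather than Cargo's actual definitions (the real types
+live under `src/cargo/core/`):
+
+```rust
+use std::path::PathBuf;
+
+// Illustrative only, not Cargo's real definitions.
+struct Workspace {
+    packages: Vec<Package>,  // at least one
+}
+
+struct Package {
+    manifest: PathBuf,       // this package's Cargo.toml
+    targets: Vec<Target>,
+}
+
+struct Target {
+    kind: TargetKind,
+    crate_root: PathBuf,     // e.g. src/lib.rs or examples/foo.rs
+}
+
+enum TargetKind { Library, Binary, Test, Example }
+```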
+
+`PackageId` is the unique identifier of a (possibly remote)
+package. It consists of three components: name, version and source
+id. The source is the place where the source code for a package comes
+from. Typical sources are crates.io, a git repository or a folder on
+the local hard drive.
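+
+Conceptually, then, a `PackageId` is just a triple. The sketch below
+uses illustrative names and plain strings; the real types live under
+`src/cargo/core/` and use `semver::Version` for the version.
+
+```rust
+use std::path::PathBuf;
+
+// Illustrative sketch, not Cargo's real definitions.
+#[derive(PartialEq, Eq, Hash)]
+struct PackageId {
+    name: String,
+    version: String,      // Cargo uses semver::Version here
+    source_id: SourceId,
+}
+
+#[derive(PartialEq, Eq, Hash)]
+enum SourceId {
+    Registry(String),     // e.g. the crates.io index
+    Git(String),          // a git repository URL
+    Path(PathBuf),        // a folder on the local hard drive
+}
+```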
+
+`Resolve` is the representation of a directed acyclic graph of package
+dependencies, which uses `PackageId`s for nodes. This is the data
+structure that is saved to the lock file. If there is no lock file,
+Cargo constructs a `Resolve` by finding a graph of packages which
+matches the declared dependency specifications according to semver.
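+
+In other words, a `Resolve` is essentially a map from each package to
+the exact packages it depends on. A minimal sketch of the idea, with
+plain strings standing in for `PackageId`s:
+
+```rust
+use std::collections::HashMap;
+
+// Illustrative: the dependency DAG that gets saved to the lock file.
+struct Resolve {
+    graph: HashMap<String, Vec<String>>, // package -> its dependencies
+}
+
+fn main() {
+    let mut graph = HashMap::new();
+    graph.insert("foo 0.1.0".to_string(), vec!["bar 1.0.0".to_string()]);
+    graph.insert("bar 1.0.0".to_string(), vec![]);
+    let _resolve = Resolve { graph };
+}
+```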
+
+
+## Persistence
+
+Cargo is a non-daemon command line application, which means that all
+the information used by Cargo must be persisted on the hard drive. The
+main sources of information are the `Cargo.toml` and `Cargo.lock` files,
+`.cargo/config` configuration files and the globally shared registry
+of packages downloaded from crates.io, usually located at
+`~/.cargo/registry`. See `src/cargo/sources/registry` for the specifics of
+the registry storage format.
+
+
+## Concurrency
+
+Cargo is mostly single-threaded. The only concurrency inside a single
+instance of Cargo happens during compilation, when several instances
+of `rustc` are invoked in parallel to build independent
+targets. However, several instances of the Cargo process may run
+concurrently on the system. Cargo guarantees that this is always safe
+by using file locks when accessing potentially shared data like the
+registry or the target directory.
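+
+The locking itself can be done with the `fs2` crate, which is in
+Cargo's dependency list; Cargo wraps the pattern in its own lock
+types. A minimal sketch, with a made-up lock-file name:
+
+```rust
+extern crate fs2; // in Cargo's [dependencies]
+
+use fs2::FileExt;
+use std::fs::File;
+
+fn main() {
+    // The path is illustrative; Cargo keeps its lock files next to
+    // the data they guard.
+    let lock = File::create(".package-cache-lock").unwrap();
+    lock.lock_exclusive().unwrap(); // blocks until others release it
+    // ... read or mutate the registry / target directory ...
+    lock.unlock().unwrap();
+}
+```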
+
+
+## Tests
+
+Cargo has an impressive test suite located in the `tests` folder. Most
+of the tests are integration tests: a project structure with a
+`Cargo.toml` and Rust source code is created in a temporary directory,
+the `cargo` binary is invoked via `std::process::Command`, and then
+stdout and stderr are verified against the expected output. To
+simplify testing, several macros of the form `[MACRO]` are used in the
+expected output. For example, `[..]` matches any string and `[/]`
+matches `/` on Unixes and `\` on Windows.
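+
+A self-contained sketch of the pattern, using `std` directly; the
+real tests use helpers from the internal `cargotest` crate to do the
+same with far less noise:
+
+```rust
+use std::env;
+use std::fs::{self, File};
+use std::io::Write;
+use std::process::Command;
+
+#[test]
+fn builds_a_hello_world_project() {
+    // 1. Create a project structure in a temporary directory (a real
+    //    sandbox would use a fresh directory for every run).
+    let dir = env::temp_dir().join("cargo-sandbox-hello");
+    fs::create_dir_all(dir.join("src")).unwrap();
+    File::create(dir.join("Cargo.toml")).unwrap()
+        .write_all(b"[package]\nname = \"hello\"\nversion = \"0.1.0\"\nauthors = []\n")
+        .unwrap();
+    File::create(dir.join("src/main.rs")).unwrap()
+        .write_all(b"fn main() { println!(\"hello\"); }")
+        .unwrap();
+
+    // 2. Invoke the `cargo` binary via std::process::Command.
+    let output = Command::new("cargo")
+        .arg("build")
+        .current_dir(&dir)
+        .output()
+        .unwrap();
+
+    // 3. Verify the exit status and output. The real suite matches
+    //    stderr against patterns like "[COMPILING] hello [..]".
+    assert!(output.status.success());
+    let stderr = String::from_utf8_lossy(&output.stderr);
+    assert!(stderr.contains("Compiling hello"));
+}
+```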
--- /dev/null
+++ b/CONTRIBUTING.md
+# Contributing to Cargo
+
+Thank you for your interest in contributing to Cargo! Good places to
+start are this document, [ARCHITECTURE.md](ARCHITECTURE.md), which
+describes the high-level structure of Cargo, and [E-easy] bugs on the
+issue tracker.
+
+If you have a general question about Cargo or its internals, feel free to ask
+on [IRC].
+
+## Code of Conduct
+
+All contributors are expected to follow our [Code of Conduct].
+
+## Bug reports
+
+We can't fix what we don't know about, so please report problems liberally. This
+includes problems with understanding the documentation, unhelpful error
+messages, and unexpected behavior.
+
+**If you think that you have identified an issue with Cargo that might compromise
+its users' security, please do not open a public issue on GitHub. Instead,
+we ask you to refer to Rust's [security policy].**
+
+Opening an issue is as easy as following [this
+link][new-issues] and filling out the fields.
+Here's a template that you can use to file an issue, though it's not necessary to
+use it exactly:
+
+ <short summary of the problem>
+
+ I tried this: <minimal example that causes the problem>
+
+ I expected to see this happen: <explanation>
+
+ Instead, this happened: <explanation>
+
+ I'm using <output of `cargo --version`>
+
+All three components are important: what you did, what you expected, what
+happened instead. Please use [a gist][gist] if your examples run long.
+
+## Working on issues
+
+If you're looking for somewhere to start, check out the [E-easy] tag.
+
+Feel free to ask for guidance on how to tackle a problem on [IRC] or open a
+[new issue][new-issues]. This is especially important if you want to add new
+features to Cargo or make large changes to the existing code base.
+Cargo's core developers will do their best to provide help.
+
+If you start working on an already-filed issue, post a comment on it to
+let people know that somebody is working on it. Feel free to ask for
+comments if you are unsure about the solution you would like to submit.
+
+While Cargo does make use of some Rust features available only through the
+`nightly` toolchain, it must compile on stable Rust. Code added to Cargo
+is encouraged to make use of the latest stable features of the language and
+`stdlib`.
+
+We use the "fork and pull" model [described here][development-models], where
+contributors push changes to their personal fork and create pull requests to
+bring those changes into the source repository. This process is partly
+automated: pull requests are made against Cargo's master branch, tested and
+reviewed. Once a change is approved to be merged, a friendly bot merges the
+changes into an internal branch, runs the full test suite on that branch
+and only then merges into master. This ensures that Cargo's master branch
+passes the test suite at all times.
+
+Your basic steps to get going:
+
+* Fork Cargo and create a branch from master for the issue you are working on.
+* Please adhere to the code style that you see around the location you are
+working on.
+* [Commit as you go][githelp].
+* Include tests that cover all non-trivial code. The existing tests
+in `tests/` provide templates on how to test Cargo's behavior in a
+sandbox environment. The internal crate `cargotest` provides many
+helpers to minimize boilerplate.
+* Make sure `cargo test` passes. If you do not have the cross-compilers
+installed locally, ignore the cross-compile test failures or disable them by
+using `CFG_DISABLE_CROSS_TESTS=1 cargo test`. Note that some tests are
+enabled only on the `nightly` toolchain. If you can, test both toolchains.
+* Push your commits to GitHub and create a pull request against Cargo's
+`master` branch.
+
+## Pull requests
+
+After the pull request is made, a friendly bot will automatically assign a
+reviewer; the review process will make sure that the proposed changes are
+sound. Please give the assigned reviewer sufficient time, especially during
+weekends. If you don't get a reply, you may poke the core developers on [IRC].
+
+A merge of Cargo's master branch and your changes is immediately queued
+to be tested after the pull request is made. In case unforeseen
+problems are discovered during this step (e.g. a failure on a platform you
+originally did not develop on), you may ask for guidance. Push additional
+commits to your branch to tackle these problems.
+
+The reviewer might point out changes deemed necessary. Please add them as
+extra commits; this ensures that the reviewer can see what has changed since
+the code was previously reviewed. Large or tricky changes may require several
+passes of review and changes.
+
+Once the reviewer approves your pull request, a friendly bot picks it up
+and [merges][mergequeue] it into Cargo's `master` branch.
+
+## Contributing to the documentation
+
+To contribute to the documentation, all you need to do is change the markdown
+files in the `src/doc` directory. To view the rendered version of changes you
+have made locally, run:
+
+```sh
+sh src/ci/dox.sh
+open target/doc/index.html
+```
+
+
+## Issue Triage
+
+Sometimes an issue will stay open, even though the bug has been fixed. And
+sometimes, the original bug may go stale because something has changed in the
+meantime.
+
+It can be helpful to go through older bug reports and make sure that they are
+still valid. Load up an older issue, double check that it's still true, and
+leave a comment letting us know if it is or is not. The [least recently
+updated sort][lru] is good for finding issues like this.
+
+Contributors with sufficient permissions on the Rust repository can help by
+adding labels to triage issues:
+
+* Yellow, **A**-prefixed labels state which **area** of the project an issue
+ relates to.
+
+* Magenta, **B**-prefixed labels identify bugs which are **blockers**.
+
+* Light purple, **C**-prefixed labels represent the **category** of an issue.
+
+* Dark purple, **Command**-prefixed labels mean the issue has to do with a
+ specific cargo command.
+
+* Green, **E**-prefixed labels explain the level of **experience** or
+ **effort** necessary to fix the issue. [**E-mentor**][E-mentor] issues also
+ have some instructions on how to get started.
+
+* Red, **I**-prefixed labels indicate the **importance** of the issue. The
+ [I-nominated][inom] label indicates that an issue has been nominated for
+ prioritizing at the next triage meeting.
+
+* Purple gray, **O**-prefixed labels are the **operating system** or platform
+ that this issue is specific to.
+
+* Orange, **P**-prefixed labels indicate a bug's **priority**. These labels
+ are only assigned during triage meetings and replace the [I-nominated][inom]
+ label.
+
+* The light orange **relnotes** label marks issues that should be documented in
+ the release notes of the next release.
+
+
+[githelp]: https://dont-be-afraid-to-commit.readthedocs.io/en/latest/git/commandlinegit.html
+[development-models]: https://help.github.com/articles/about-collaborative-development-models/
+[gist]: https://gist.github.com/
+[new-issues]: https://github.com/rust-lang/cargo/issues/new
+[mergequeue]: https://buildbot2.rust-lang.org/homu/queue/cargo
+[security policy]: https://www.rust-lang.org/security.html
+[lru]: https://github.com/rust-lang/cargo/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-asc
+[E-easy]: https://github.com/rust-lang/cargo/labels/E-easy
+[E-mentor]: https://github.com/rust-lang/cargo/labels/E-mentor
+[Code of Conduct]: https://www.rust-lang.org/conduct.html
+[IRC]: https://kiwiirc.com/client/irc.mozilla.org/cargo
--- /dev/null
+++ b/Cargo.toml
+[package]
+name = "cargo"
+version = "0.24.0"
+authors = ["Yehuda Katz <wycats@gmail.com>",
+ "Carl Lerche <me@carllerche.com>",
+ "Alex Crichton <alex@alexcrichton.com>"]
+license = "MIT/Apache-2.0"
+homepage = "https://crates.io"
+repository = "https://github.com/rust-lang/cargo"
+documentation = "https://docs.rs/cargo"
+description = """
+Cargo, a package manager for Rust.
+"""
+
+[lib]
+name = "cargo"
+path = "src/cargo/lib.rs"
+
+[dependencies]
+atty = "0.2"
+crates-io = { path = "src/crates-io", version = "0.13" }
+crossbeam = "0.3"
+crypto-hash = "0.3"
+curl = "0.4.6"
+docopt = "0.8.1"
+env_logger = "0.4"
+error-chain = "0.11.0"
+filetime = "0.1"
+flate2 = "0.2"
+fs2 = "0.4"
+git2 = "0.6"
+git2-curl = "0.7"
+glob = "0.2"
+hex = "0.2"
+home = "0.3"
+ignore = "^0.2.2"
+jobserver = "0.1.6"
+libc = "0.2"
+libgit2-sys = "0.6"
+log = "0.3"
+num_cpus = "1.0"
+same-file = "0.1"
+scoped-tls = "0.1"
+semver = { version = "0.8.0", features = ["serde"] }
+serde = "1.0"
+serde_derive = "1.0"
+serde_ignored = "0.0.4"
+serde_json = "1.0"
+shell-escape = "0.1"
+tar = { version = "0.4", default-features = false }
+tempdir = "0.3"
+termcolor = "0.3"
+toml = "0.4"
+url = "1.1"
+
+[target.'cfg(target_os = "macos")'.dependencies]
+core-foundation = { version = "0.4.4", features = ["mac_os_10_7_support"] }
+
+[target.'cfg(windows)'.dependencies]
+kernel32-sys = "0.2"
+miow = "0.2"
+psapi-sys = "0.1"
+winapi = "0.2"
+
+[dev-dependencies]
+bufstream = "0.1"
+cargotest = { path = "tests/cargotest" }
+filetime = "0.1"
+hamcrest = "=0.1.1"
+
+[[bin]]
+name = "cargo"
+test = false
+doc = false
--- /dev/null
+++ b/LICENSE-APACHE
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+END OF TERMS AND CONDITIONS
+
+APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+Copyright [yyyy] [name of copyright owner]
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
--- /dev/null
+++ b/LICENSE-MIT
+Permission is hereby granted, free of charge, to any
+person obtaining a copy of this software and associated
+documentation files (the "Software"), to deal in the
+Software without restriction, including without
+limitation the rights to use, copy, modify, merge,
+publish, distribute, sublicense, and/or sell copies of
+the Software, and to permit persons to whom the Software
+is furnished to do so, subject to the following
+conditions:
+
+The above copyright notice and this permission notice
+shall be included in all copies or substantial portions
+of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
+ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
+TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
+PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
+SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
+IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+DEALINGS IN THE SOFTWARE.
--- /dev/null
+++ b/LICENSE-THIRD-PARTY
+The Cargo source code itself does not bundle any third-party libraries, but it
+depends on a number of libraries which carry their own copyright notices and
+license terms. These libraries are normally all linked statically into the
+binary distributions of Cargo:
+
+* OpenSSL - http://www.openssl.org/source/license.html
+
+ Copyright (c) 1998-2011 The OpenSSL Project. All rights reserved.
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions
+ are met:
+
+ 1. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+
+ 2. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in
+ the documentation and/or other materials provided with the
+ distribution.
+
+ 3. All advertising materials mentioning features or use of this
+ software must display the following acknowledgment:
+ "This product includes software developed by the OpenSSL Project
+ for use in the OpenSSL Toolkit. (http://www.openssl.org/)"
+
+ 4. The names "OpenSSL Toolkit" and "OpenSSL Project" must not be used to
+ endorse or promote products derived from this software without
+ prior written permission. For written permission, please contact
+ openssl-core@openssl.org.
+
+ 5. Products derived from this software may not be called "OpenSSL"
+ nor may "OpenSSL" appear in their names without prior written
+ permission of the OpenSSL Project.
+
+ 6. Redistributions of any form whatsoever must retain the following
+ acknowledgment:
+ "This product includes software developed by the OpenSSL Project
+ for use in the OpenSSL Toolkit (http://www.openssl.org/)"
+
+ THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
+ EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
+ ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
+ NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+ LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
+ HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
+ STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
+ OF THE POSSIBILITY OF SUCH DAMAGE.
+ ====================================================================
+
+ This product includes cryptographic software written by Eric Young
+ (eay@cryptsoft.com). This product includes software written by Tim
+ Hudson (tjh@cryptsoft.com).
+
+ ---
+
+ Copyright (C) 1995-1998 Eric Young (eay@cryptsoft.com)
+ All rights reserved.
+
+ This package is an SSL implementation written
+ by Eric Young (eay@cryptsoft.com).
+ The implementation was written so as to conform with Netscapes SSL.
+
+ This library is free for commercial and non-commercial use as long as
+ the following conditions are aheared to. The following conditions
+ apply to all code found in this distribution, be it the RC4, RSA,
+ lhash, DES, etc., code; not just the SSL code. The SSL documentation
+ included with this distribution is covered by the same copyright terms
+ except that the holder is Tim Hudson (tjh@cryptsoft.com).
+
+ Copyright remains Eric Young's, and as such any Copyright notices in
+ the code are not to be removed.
+ If this package is used in a product, Eric Young should be given attribution
+ as the author of the parts of the library used.
+ This can be in the form of a textual message at program startup or
+ in documentation (online or textual) provided with the package.
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions
+ are met:
+ 1. Redistributions of source code must retain the copyright
+ notice, this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in the
+ documentation and/or other materials provided with the distribution.
+ 3. All advertising materials mentioning features or use of this software
+ must display the following acknowledgement:
+ "This product includes cryptographic software written by
+ Eric Young (eay@cryptsoft.com)"
+ The word 'cryptographic' can be left out if the rouines from the library
+ being used are not cryptographic related :-).
+ 4. If you include any Windows specific code (or a derivative thereof) from
+ the apps directory (application code) you must include an acknowledgement:
+ "This product includes software written by Tim Hudson (tjh@cryptsoft.com)"
+
+ THIS SOFTWARE IS PROVIDED BY ERIC YOUNG ``AS IS'' AND
+ ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
+ OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
+ HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+ LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
+ OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
+ SUCH DAMAGE.
+
+ The licence and distribution terms for any publically available version or
+ derivative of this code cannot be changed. i.e. this code cannot simply be
+ copied and put under another distribution licence
+ [including the GNU Public Licence.]
+
+* libgit2 - https://github.com/libgit2/libgit2/blob/master/COPYING
+
+ libgit2 is Copyright (C) the libgit2 contributors,
+ unless otherwise stated. See the AUTHORS file for details.
+
+ Note that the only valid version of the GPL as far as this project
+ is concerned is _this_ particular version of the license (ie v2, not
+ v2.2 or v3.x or whatever), unless explicitly otherwise stated.
+
+ ----------------------------------------------------------------------
+
+ LINKING EXCEPTION
+
+ In addition to the permissions in the GNU General Public License,
+ the authors give you unlimited permission to link the compiled
+ version of this library into combinations with other programs,
+ and to distribute those combinations without any restriction
+ coming from the use of this file. (The General Public License
+ restrictions do apply in other respects; for example, they cover
+ modification of the file, and distribution when not linked into
+ a combined executable.)
+
+ ----------------------------------------------------------------------
+
+ GNU GENERAL PUBLIC LICENSE
+ Version 2, June 1991
+
+ Copyright (C) 1989, 1991 Free Software Foundation, Inc.
+ 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+ freedom to share and change it. By contrast, the GNU General Public
+ License is intended to guarantee your freedom to share and change free
+ software--to make sure the software is free for all its users. This
+ General Public License applies to most of the Free Software
+ Foundation's software and to any other program whose authors commit to
+ using it. (Some other Free Software Foundation software is covered by
+ the GNU Library General Public License instead.) You can apply it to
+ your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+ price. Our General Public Licenses are designed to make sure that you
+ have the freedom to distribute copies of free software (and charge for
+ this service if you wish), that you receive source code or can get it
+ if you want it, that you can change the software or use pieces of it
+ in new free programs; and that you know you can do these things.
+
+ To protect your rights, we need to make restrictions that forbid
+ anyone to deny you these rights or to ask you to surrender the rights.
+ These restrictions translate to certain responsibilities for you if you
+ distribute copies of the software, or if you modify it.
+
+ For example, if you distribute copies of such a program, whether
+ gratis or for a fee, you must give the recipients all the rights that
+ you have. You must make sure that they, too, receive or can get the
+ source code. And you must show them these terms so they know their
+ rights.
+
+ We protect your rights with two steps: (1) copyright the software, and
+ (2) offer you this license which gives you legal permission to copy,
+ distribute and/or modify the software.
+
+ Also, for each author's protection and ours, we want to make certain
+ that everyone understands that there is no warranty for this free
+ software. If the software is modified by someone else and passed on, we
+ want its recipients to know that what they have is not the original, so
+ that any problems introduced by others will not reflect on the original
+ authors' reputations.
+
+ Finally, any free program is threatened constantly by software
+ patents. We wish to avoid the danger that redistributors of a free
+ program will individually obtain patent licenses, in effect making the
+ program proprietary. To prevent this, we have made it clear that any
+ patent must be licensed for everyone's free use or not licensed at all.
+
+ The precise terms and conditions for copying, distribution and
+ modification follow.
+
+ GNU GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License applies to any program or other work which contains
+ a notice placed by the copyright holder saying it may be distributed
+ under the terms of this General Public License. The "Program", below,
+ refers to any such program or work, and a "work based on the Program"
+ means either the Program or any derivative work under copyright law:
+ that is to say, a work containing the Program or a portion of it,
+ either verbatim or with modifications and/or translated into another
+ language. (Hereinafter, translation is included without limitation in
+ the term "modification".) Each licensee is addressed as "you".
+
+ Activities other than copying, distribution and modification are not
+ covered by this License; they are outside its scope. The act of
+ running the Program is not restricted, and the output from the Program
+ is covered only if its contents constitute a work based on the
+ Program (independent of having been made by running the Program).
+ Whether that is true depends on what the Program does.
+
+ 1. You may copy and distribute verbatim copies of the Program's
+ source code as you receive it, in any medium, provided that you
+ conspicuously and appropriately publish on each copy an appropriate
+ copyright notice and disclaimer of warranty; keep intact all the
+ notices that refer to this License and to the absence of any warranty;
+ and give any other recipients of the Program a copy of this License
+ along with the Program.
+
+ You may charge a fee for the physical act of transferring a copy, and
+ you may at your option offer warranty protection in exchange for a fee.
+
+ 2. You may modify your copy or copies of the Program or any portion
+ of it, thus forming a work based on the Program, and copy and
+ distribute such modifications or work under the terms of Section 1
+ above, provided that you also meet all of these conditions:
+
+ a) You must cause the modified files to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ b) You must cause any work that you distribute or publish, that in
+ whole or in part contains or is derived from the Program or any
+ part thereof, to be licensed as a whole at no charge to all third
+ parties under the terms of this License.
+
+ c) If the modified program normally reads commands interactively
+ when run, you must cause it, when started running for such
+ interactive use in the most ordinary way, to print or display an
+ announcement including an appropriate copyright notice and a
+ notice that there is no warranty (or else, saying that you provide
+ a warranty) and that users may redistribute the program under
+ these conditions, and telling the user how to view a copy of this
+ License. (Exception: if the Program itself is interactive but
+ does not normally print such an announcement, your work based on
+ the Program is not required to print an announcement.)
+
+ These requirements apply to the modified work as a whole. If
+ identifiable sections of that work are not derived from the Program,
+ and can be reasonably considered independent and separate works in
+ themselves, then this License, and its terms, do not apply to those
+ sections when you distribute them as separate works. But when you
+ distribute the same sections as part of a whole which is a work based
+ on the Program, the distribution of the whole must be on the terms of
+ this License, whose permissions for other licensees extend to the
+ entire whole, and thus to each and every part regardless of who wrote it.
+
+ Thus, it is not the intent of this section to claim rights or contest
+ your rights to work written entirely by you; rather, the intent is to
+ exercise the right to control the distribution of derivative or
+ collective works based on the Program.
+
+ In addition, mere aggregation of another work not based on the Program
+ with the Program (or with a work based on the Program) on a volume of
+ a storage or distribution medium does not bring the other work under
+ the scope of this License.
+
+ 3. You may copy and distribute the Program (or a work based on it,
+ under Section 2) in object code or executable form under the terms of
+ Sections 1 and 2 above provided that you also do one of the following:
+
+ a) Accompany it with the complete corresponding machine-readable
+ source code, which must be distributed under the terms of Sections
+ 1 and 2 above on a medium customarily used for software interchange; or,
+
+ b) Accompany it with a written offer, valid for at least three
+ years, to give any third party, for a charge no more than your
+ cost of physically performing source distribution, a complete
+ machine-readable copy of the corresponding source code, to be
+ distributed under the terms of Sections 1 and 2 above on a medium
+ customarily used for software interchange; or,
+
+ c) Accompany it with the information you received as to the offer
+ to distribute corresponding source code. (This alternative is
+ allowed only for noncommercial distribution and only if you
+ received the program in object code or executable form with such
+ an offer, in accord with Subsection b above.)
+
+ The source code for a work means the preferred form of the work for
+ making modifications to it. For an executable work, complete source
+ code means all the source code for all modules it contains, plus any
+ associated interface definition files, plus the scripts used to
+ control compilation and installation of the executable. However, as a
+ special exception, the source code distributed need not include
+ anything that is normally distributed (in either source or binary
+ form) with the major components (compiler, kernel, and so on) of the
+ operating system on which the executable runs, unless that component
+ itself accompanies the executable.
+
+ If distribution of executable or object code is made by offering
+ access to copy from a designated place, then offering equivalent
+ access to copy the source code from the same place counts as
+ distribution of the source code, even though third parties are not
+ compelled to copy the source along with the object code.
+
+ 4. You may not copy, modify, sublicense, or distribute the Program
+ except as expressly provided under this License. Any attempt
+ otherwise to copy, modify, sublicense or distribute the Program is
+ void, and will automatically terminate your rights under this License.
+ However, parties who have received copies, or rights, from you under
+ this License will not have their licenses terminated so long as such
+ parties remain in full compliance.
+
+ 5. You are not required to accept this License, since you have not
+ signed it. However, nothing else grants you permission to modify or
+ distribute the Program or its derivative works. These actions are
+ prohibited by law if you do not accept this License. Therefore, by
+ modifying or distributing the Program (or any work based on the
+ Program), you indicate your acceptance of this License to do so, and
+ all its terms and conditions for copying, distributing or modifying
+ the Program or works based on it.
+
+ 6. Each time you redistribute the Program (or any work based on the
+ Program), the recipient automatically receives a license from the
+ original licensor to copy, distribute or modify the Program subject to
+ these terms and conditions. You may not impose any further
+ restrictions on the recipients' exercise of the rights granted herein.
+ You are not responsible for enforcing compliance by third parties to
+ this License.
+
+ 7. If, as a consequence of a court judgment or allegation of patent
+ infringement or for any other reason (not limited to patent issues),
+ conditions are imposed on you (whether by court order, agreement or
+ otherwise) that contradict the conditions of this License, they do not
+ excuse you from the conditions of this License. If you cannot
+ distribute so as to satisfy simultaneously your obligations under this
+ License and any other pertinent obligations, then as a consequence you
+ may not distribute the Program at all. For example, if a patent
+ license would not permit royalty-free redistribution of the Program by
+ all those who receive copies directly or indirectly through you, then
+ the only way you could satisfy both it and this License would be to
+ refrain entirely from distribution of the Program.
+
+ If any portion of this section is held invalid or unenforceable under
+ any particular circumstance, the balance of the section is intended to
+ apply and the section as a whole is intended to apply in other
+ circumstances.
+
+ It is not the purpose of this section to induce you to infringe any
+ patents or other property right claims or to contest validity of any
+ such claims; this section has the sole purpose of protecting the
+ integrity of the free software distribution system, which is
+ implemented by public license practices. Many people have made
+ generous contributions to the wide range of software distributed
+ through that system in reliance on consistent application of that
+ system; it is up to the author/donor to decide if he or she is willing
+ to distribute software through any other system and a licensee cannot
+ impose that choice.
+
+ This section is intended to make thoroughly clear what is believed to
+ be a consequence of the rest of this License.
+
+ 8. If the distribution and/or use of the Program is restricted in
+ certain countries either by patents or by copyrighted interfaces, the
+ original copyright holder who places the Program under this License
+ may add an explicit geographical distribution limitation excluding
+ those countries, so that distribution is permitted only in or among
+ countries not thus excluded. In such case, this License incorporates
+ the limitation as if written in the body of this License.
+
+ 9. The Free Software Foundation may publish revised and/or new versions
+ of the General Public License from time to time. Such new versions will
+ be similar in spirit to the present version, but may differ in detail to
+ address new problems or concerns.
+
+ Each version is given a distinguishing version number. If the Program
+ specifies a version number of this License which applies to it and "any
+ later version", you have the option of following the terms and conditions
+ either of that version or of any later version published by the Free
+ Software Foundation. If the Program does not specify a version number of
+ this License, you may choose any version ever published by the Free Software
+ Foundation.
+
+ 10. If you wish to incorporate parts of the Program into other free
+ programs whose distribution conditions are different, write to the author
+ to ask for permission. For software which is copyrighted by the Free
+ Software Foundation, write to the Free Software Foundation; we sometimes
+ make exceptions for this. Our decision will be guided by the two goals
+ of preserving the free status of all derivatives of our free software and
+ of promoting the sharing and reuse of software generally.
+
+ NO WARRANTY
+
+ 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
+ FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
+ OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
+ PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
+ OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
+ MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
+ TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
+ PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
+ REPAIR OR CORRECTION.
+
+ 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+ WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
+ REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
+ INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
+ OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
+ TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
+ YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
+ PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
+ POSSIBILITY OF SUCH DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+
+ How to Apply These Terms to Your New Programs
+
+ If you develop a new program, and you want it to be of the greatest
+ possible use to the public, the best way to achieve this is to make it
+ free software which everyone can redistribute and change under these terms.
+
+ To do so, attach the following notices to the program. It is safest
+ to attach them to the start of each source file to most effectively
+ convey the exclusion of warranty; and each file should have at least
+ the "copyright" line and a pointer to where the full notice is found.
+
+ <one line to give the program's name and a brief idea of what it does.>
+ Copyright (C) <year> <name of author>
+
+ This program is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with this program; if not, write to the Free Software
+ Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+
+
+ Also add information on how to contact you by electronic and paper mail.
+
+ If the program is interactive, make it output a short notice like this
+ when it starts in an interactive mode:
+
+ Gnomovision version 69, Copyright (C) year name of author
+ Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+ This is free software, and you are welcome to redistribute it
+ under certain conditions; type `show c' for details.
+
+ The hypothetical commands `show w' and `show c' should show the appropriate
+ parts of the General Public License. Of course, the commands you use may
+ be called something other than `show w' and `show c'; they could even be
+ mouse-clicks or menu items--whatever suits your program.
+
+ You should also get your employer (if you work as a programmer) or your
+ school, if any, to sign a "copyright disclaimer" for the program, if
+ necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the program
+ `Gnomovision' (which makes passes at compilers) written by James Hacker.
+
+ <signature of Ty Coon>, 1 April 1989
+ Ty Coon, President of Vice
+
+ This General Public License does not permit incorporating your program into
+ proprietary programs. If your program is a subroutine library, you may
+ consider it more useful to permit linking proprietary applications with the
+ library. If this is what you want to do, use the GNU Library General
+ Public License instead of this License.
+
+ ----------------------------------------------------------------------
+
+ The bundled ZLib code is licensed under the ZLib license:
+
+ Copyright (C) 1995-2010 Jean-loup Gailly and Mark Adler
+
+ This software is provided 'as-is', without any express or implied
+ warranty. In no event will the authors be held liable for any damages
+ arising from the use of this software.
+
+ Permission is granted to anyone to use this software for any purpose,
+ including commercial applications, and to alter it and redistribute it
+ freely, subject to the following restrictions:
+
+ 1. The origin of this software must not be misrepresented; you must not
+ claim that you wrote the original software. If you use this software
+ in a product, an acknowledgment in the product documentation would be
+ appreciated but is not required.
+ 2. Altered source versions must be plainly marked as such, and must not be
+ misrepresented as being the original software.
+ 3. This notice may not be removed or altered from any source distribution.
+
+ Jean-loup Gailly Mark Adler
+ jloup@gzip.org madler@alumni.caltech.edu
+
+ ----------------------------------------------------------------------
+
+ The Clar framework is licensed under the MIT license:
+
+ Copyright (C) 2011 by Vicent Marti
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
+
+ ----------------------------------------------------------------------
+
+ The regex library (deps/regex/) is licensed under the GNU LGPL
+
+ GNU LESSER GENERAL PUBLIC LICENSE
+ Version 2.1, February 1999
+
+ Copyright (C) 1991, 1999 Free Software Foundation, Inc.
+ 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ [This is the first released version of the Lesser GPL. It also counts
+ as the successor of the GNU Library Public License, version 2, hence
+ the version number 2.1.]
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+ freedom to share and change it. By contrast, the GNU General Public
+ Licenses are intended to guarantee your freedom to share and change
+ free software--to make sure the software is free for all its users.
+
+ This license, the Lesser General Public License, applies to some
+ specially designated software packages--typically libraries--of the
+ Free Software Foundation and other authors who decide to use it. You
+ can use it too, but we suggest you first think carefully about whether
+ this license or the ordinary General Public License is the better
+ strategy to use in any particular case, based on the explanations below.
+
+ When we speak of free software, we are referring to freedom of use,
+ not price. Our General Public Licenses are designed to make sure that
+ you have the freedom to distribute copies of free software (and charge
+ for this service if you wish); that you receive source code or can get
+ it if you want it; that you can change the software and use pieces of
+ it in new free programs; and that you are informed that you can do
+ these things.
+
+ To protect your rights, we need to make restrictions that forbid
+ distributors to deny you these rights or to ask you to surrender these
+ rights. These restrictions translate to certain responsibilities for
+ you if you distribute copies of the library or if you modify it.
+
+ For example, if you distribute copies of the library, whether gratis
+ or for a fee, you must give the recipients all the rights that we gave
+ you. You must make sure that they, too, receive or can get the source
+ code. If you link other code with the library, you must provide
+ complete object files to the recipients, so that they can relink them
+ with the library after making changes to the library and recompiling
+ it. And you must show them these terms so they know their rights.
+
+ We protect your rights with a two-step method: (1) we copyright the
+ library, and (2) we offer you this license, which gives you legal
+ permission to copy, distribute and/or modify the library.
+
+ To protect each distributor, we want to make it very clear that
+ there is no warranty for the free library. Also, if the library is
+ modified by someone else and passed on, the recipients should know
+ that what they have is not the original version, so that the original
+ author's reputation will not be affected by problems that might be
+ introduced by others.
+
+ Finally, software patents pose a constant threat to the existence of
+ any free program. We wish to make sure that a company cannot
+ effectively restrict the users of a free program by obtaining a
+ restrictive license from a patent holder. Therefore, we insist that
+ any patent license obtained for a version of the library must be
+ consistent with the full freedom of use specified in this license.
+
+ Most GNU software, including some libraries, is covered by the
+ ordinary GNU General Public License. This license, the GNU Lesser
+ General Public License, applies to certain designated libraries, and
+ is quite different from the ordinary General Public License. We use
+ this license for certain libraries in order to permit linking those
+ libraries into non-free programs.
+
+ When a program is linked with a library, whether statically or using
+ a shared library, the combination of the two is legally speaking a
+ combined work, a derivative of the original library. The ordinary
+ General Public License therefore permits such linking only if the
+ entire combination fits its criteria of freedom. The Lesser General
+ Public License permits more lax criteria for linking other code with
+ the library.
+
+ We call this license the "Lesser" General Public License because it
+ does Less to protect the user's freedom than the ordinary General
+ Public License. It also provides other free software developers Less
+ of an advantage over competing non-free programs. These disadvantages
+ are the reason we use the ordinary General Public License for many
+ libraries. However, the Lesser license provides advantages in certain
+ special circumstances.
+
+ For example, on rare occasions, there may be a special need to
+ encourage the widest possible use of a certain library, so that it becomes
+ a de-facto standard. To achieve this, non-free programs must be
+ allowed to use the library. A more frequent case is that a free
+ library does the same job as widely used non-free libraries. In this
+ case, there is little to gain by limiting the free library to free
+ software only, so we use the Lesser General Public License.
+
+ In other cases, permission to use a particular library in non-free
+ programs enables a greater number of people to use a large body of
+ free software. For example, permission to use the GNU C Library in
+ non-free programs enables many more people to use the whole GNU
+ operating system, as well as its variant, the GNU/Linux operating
+ system.
+
+ Although the Lesser General Public License is Less protective of the
+ users' freedom, it does ensure that the user of a program that is
+ linked with the Library has the freedom and the wherewithal to run
+ that program using a modified version of the Library.
+
+ The precise terms and conditions for copying, distribution and
+ modification follow. Pay close attention to the difference between a
+ "work based on the library" and a "work that uses the library". The
+ former contains code derived from the library, whereas the latter must
+ be combined with the library in order to run.
+
+ GNU LESSER GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License Agreement applies to any software library or other
+ program which contains a notice placed by the copyright holder or
+ other authorized party saying it may be distributed under the terms of
+ this Lesser General Public License (also called "this License").
+ Each licensee is addressed as "you".
+
+ A "library" means a collection of software functions and/or data
+ prepared so as to be conveniently linked with application programs
+ (which use some of those functions and data) to form executables.
+
+ The "Library", below, refers to any such software library or work
+ which has been distributed under these terms. A "work based on the
+ Library" means either the Library or any derivative work under
+ copyright law: that is to say, a work containing the Library or a
+ portion of it, either verbatim or with modifications and/or translated
+ straightforwardly into another language. (Hereinafter, translation is
+ included without limitation in the term "modification".)
+
+ "Source code" for a work means the preferred form of the work for
+ making modifications to it. For a library, complete source code means
+ all the source code for all modules it contains, plus any associated
+ interface definition files, plus the scripts used to control compilation
+ and installation of the library.
+
+ Activities other than copying, distribution and modification are not
+ covered by this License; they are outside its scope. The act of
+ running a program using the Library is not restricted, and output from
+ such a program is covered only if its contents constitute a work based
+ on the Library (independent of the use of the Library in a tool for
+ writing it). Whether that is true depends on what the Library does
+ and what the program that uses the Library does.
+
+ 1. You may copy and distribute verbatim copies of the Library's
+ complete source code as you receive it, in any medium, provided that
+ you conspicuously and appropriately publish on each copy an
+ appropriate copyright notice and disclaimer of warranty; keep intact
+ all the notices that refer to this License and to the absence of any
+ warranty; and distribute a copy of this License along with the
+ Library.
+
+ You may charge a fee for the physical act of transferring a copy,
+ and you may at your option offer warranty protection in exchange for a
+ fee.
+
+ 2. You may modify your copy or copies of the Library or any portion
+ of it, thus forming a work based on the Library, and copy and
+ distribute such modifications or work under the terms of Section 1
+ above, provided that you also meet all of these conditions:
+
+ a) The modified work must itself be a software library.
+
+ b) You must cause the files modified to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ c) You must cause the whole of the work to be licensed at no
+ charge to all third parties under the terms of this License.
+
+ d) If a facility in the modified Library refers to a function or a
+ table of data to be supplied by an application program that uses
+ the facility, other than as an argument passed when the facility
+ is invoked, then you must make a good faith effort to ensure that,
+ in the event an application does not supply such function or
+ table, the facility still operates, and performs whatever part of
+ its purpose remains meaningful.
+
+ (For example, a function in a library to compute square roots has
+ a purpose that is entirely well-defined independent of the
+ application. Therefore, Subsection 2d requires that any
+ application-supplied function or table used by this function must
+ be optional: if the application does not supply it, the square
+ root function must still compute square roots.)
+
+ These requirements apply to the modified work as a whole. If
+ identifiable sections of that work are not derived from the Library,
+ and can be reasonably considered independent and separate works in
+ themselves, then this License, and its terms, do not apply to those
+ sections when you distribute them as separate works. But when you
+ distribute the same sections as part of a whole which is a work based
+ on the Library, the distribution of the whole must be on the terms of
+ this License, whose permissions for other licensees extend to the
+ entire whole, and thus to each and every part regardless of who wrote
+ it.
+
+ Thus, it is not the intent of this section to claim rights or contest
+ your rights to work written entirely by you; rather, the intent is to
+ exercise the right to control the distribution of derivative or
+ collective works based on the Library.
+
+ In addition, mere aggregation of another work not based on the Library
+ with the Library (or with a work based on the Library) on a volume of
+ a storage or distribution medium does not bring the other work under
+ the scope of this License.
+
+ 3. You may opt to apply the terms of the ordinary GNU General Public
+ License instead of this License to a given copy of the Library. To do
+ this, you must alter all the notices that refer to this License, so
+ that they refer to the ordinary GNU General Public License, version 2,
+ instead of to this License. (If a newer version than version 2 of the
+ ordinary GNU General Public License has appeared, then you can specify
+ that version instead if you wish.) Do not make any other change in
+ these notices.
+
+ Once this change is made in a given copy, it is irreversible for
+ that copy, so the ordinary GNU General Public License applies to all
+ subsequent copies and derivative works made from that copy.
+
+ This option is useful when you wish to copy part of the code of
+ the Library into a program that is not a library.
+
+ 4. You may copy and distribute the Library (or a portion or
+ derivative of it, under Section 2) in object code or executable form
+ under the terms of Sections 1 and 2 above provided that you accompany
+ it with the complete corresponding machine-readable source code, which
+ must be distributed under the terms of Sections 1 and 2 above on a
+ medium customarily used for software interchange.
+
+ If distribution of object code is made by offering access to copy
+ from a designated place, then offering equivalent access to copy the
+ source code from the same place satisfies the requirement to
+ distribute the source code, even though third parties are not
+ compelled to copy the source along with the object code.
+
+ 5. A program that contains no derivative of any portion of the
+ Library, but is designed to work with the Library by being compiled or
+ linked with it, is called a "work that uses the Library". Such a
+ work, in isolation, is not a derivative work of the Library, and
+ therefore falls outside the scope of this License.
+
+ However, linking a "work that uses the Library" with the Library
+ creates an executable that is a derivative of the Library (because it
+ contains portions of the Library), rather than a "work that uses the
+ library". The executable is therefore covered by this License.
+ Section 6 states terms for distribution of such executables.
+
+ When a "work that uses the Library" uses material from a header file
+ that is part of the Library, the object code for the work may be a
+ derivative work of the Library even though the source code is not.
+ Whether this is true is especially significant if the work can be
+ linked without the Library, or if the work is itself a library. The
+ threshold for this to be true is not precisely defined by law.
+
+ If such an object file uses only numerical parameters, data
+ structure layouts and accessors, and small macros and small inline
+ functions (ten lines or less in length), then the use of the object
+ file is unrestricted, regardless of whether it is legally a derivative
+ work. (Executables containing this object code plus portions of the
+ Library will still fall under Section 6.)
+
+ Otherwise, if the work is a derivative of the Library, you may
+ distribute the object code for the work under the terms of Section 6.
+ Any executables containing that work also fall under Section 6,
+ whether or not they are linked directly with the Library itself.
+
+ 6. As an exception to the Sections above, you may also combine or
+ link a "work that uses the Library" with the Library to produce a
+ work containing portions of the Library, and distribute that work
+ under terms of your choice, provided that the terms permit
+ modification of the work for the customer's own use and reverse
+ engineering for debugging such modifications.
+
+ You must give prominent notice with each copy of the work that the
+ Library is used in it and that the Library and its use are covered by
+ this License. You must supply a copy of this License. If the work
+ during execution displays copyright notices, you must include the
+ copyright notice for the Library among them, as well as a reference
+ directing the user to the copy of this License. Also, you must do one
+ of these things:
+
+ a) Accompany the work with the complete corresponding
+ machine-readable source code for the Library including whatever
+ changes were used in the work (which must be distributed under
+ Sections 1 and 2 above); and, if the work is an executable linked
+ with the Library, with the complete machine-readable "work that
+ uses the Library", as object code and/or source code, so that the
+ user can modify the Library and then relink to produce a modified
+ executable containing the modified Library. (It is understood
+ that the user who changes the contents of definitions files in the
+ Library will not necessarily be able to recompile the application
+ to use the modified definitions.)
+
+ b) Use a suitable shared library mechanism for linking with the
+ Library. A suitable mechanism is one that (1) uses at run time a
+ copy of the library already present on the user's computer system,
+ rather than copying library functions into the executable, and (2)
+ will operate properly with a modified version of the library, if
+ the user installs one, as long as the modified version is
+ interface-compatible with the version that the work was made with.
+
+ c) Accompany the work with a written offer, valid for at
+ least three years, to give the same user the materials
+ specified in Subsection 6a, above, for a charge no more
+ than the cost of performing this distribution.
+
+ d) If distribution of the work is made by offering access to copy
+ from a designated place, offer equivalent access to copy the above
+ specified materials from the same place.
+
+ e) Verify that the user has already received a copy of these
+ materials or that you have already sent this user a copy.
+
+ For an executable, the required form of the "work that uses the
+ Library" must include any data and utility programs needed for
+ reproducing the executable from it. However, as a special exception,
+ the materials to be distributed need not include anything that is
+ normally distributed (in either source or binary form) with the major
+ components (compiler, kernel, and so on) of the operating system on
+ which the executable runs, unless that component itself accompanies
+ the executable.
+
+ It may happen that this requirement contradicts the license
+ restrictions of other proprietary libraries that do not normally
+ accompany the operating system. Such a contradiction means you cannot
+ use both them and the Library together in an executable that you
+ distribute.
+
+ 7. You may place library facilities that are a work based on the
+ Library side-by-side in a single library together with other library
+ facilities not covered by this License, and distribute such a combined
+ library, provided that the separate distribution of the work based on
+ the Library and of the other library facilities is otherwise
+ permitted, and provided that you do these two things:
+
+ a) Accompany the combined library with a copy of the same work
+ based on the Library, uncombined with any other library
+ facilities. This must be distributed under the terms of the
+ Sections above.
+
+ b) Give prominent notice with the combined library of the fact
+ that part of it is a work based on the Library, and explaining
+ where to find the accompanying uncombined form of the same work.
+
+ 8. You may not copy, modify, sublicense, link with, or distribute
+ the Library except as expressly provided under this License. Any
+ attempt otherwise to copy, modify, sublicense, link with, or
+ distribute the Library is void, and will automatically terminate your
+ rights under this License. However, parties who have received copies,
+ or rights, from you under this License will not have their licenses
+ terminated so long as such parties remain in full compliance.
+
+ 9. You are not required to accept this License, since you have not
+ signed it. However, nothing else grants you permission to modify or
+ distribute the Library or its derivative works. These actions are
+ prohibited by law if you do not accept this License. Therefore, by
+ modifying or distributing the Library (or any work based on the
+ Library), you indicate your acceptance of this License to do so, and
+ all its terms and conditions for copying, distributing or modifying
+ the Library or works based on it.
+
+ 10. Each time you redistribute the Library (or any work based on the
+ Library), the recipient automatically receives a license from the
+ original licensor to copy, distribute, link with or modify the Library
+ subject to these terms and conditions. You may not impose any further
+ restrictions on the recipients' exercise of the rights granted herein.
+ You are not responsible for enforcing compliance by third parties with
+ this License.
+
+ 11. If, as a consequence of a court judgment or allegation of patent
+ infringement or for any other reason (not limited to patent issues),
+ conditions are imposed on you (whether by court order, agreement or
+ otherwise) that contradict the conditions of this License, they do not
+ excuse you from the conditions of this License. If you cannot
+ distribute so as to satisfy simultaneously your obligations under this
+ License and any other pertinent obligations, then as a consequence you
+ may not distribute the Library at all. For example, if a patent
+ license would not permit royalty-free redistribution of the Library by
+ all those who receive copies directly or indirectly through you, then
+ the only way you could satisfy both it and this License would be to
+ refrain entirely from distribution of the Library.
+
+ If any portion of this section is held invalid or unenforceable under any
+ particular circumstance, the balance of the section is intended to apply,
+ and the section as a whole is intended to apply in other circumstances.
+
+ It is not the purpose of this section to induce you to infringe any
+ patents or other property right claims or to contest validity of any
+ such claims; this section has the sole purpose of protecting the
+ integrity of the free software distribution system which is
+ implemented by public license practices. Many people have made
+ generous contributions to the wide range of software distributed
+ through that system in reliance on consistent application of that
+ system; it is up to the author/donor to decide if he or she is willing
+ to distribute software through any other system and a licensee cannot
+ impose that choice.
+
+ This section is intended to make thoroughly clear what is believed to
+ be a consequence of the rest of this License.
+
+ 12. If the distribution and/or use of the Library is restricted in
+ certain countries either by patents or by copyrighted interfaces, the
+ original copyright holder who places the Library under this License may add
+ an explicit geographical distribution limitation excluding those countries,
+ so that distribution is permitted only in or among countries not thus
+ excluded. In such case, this License incorporates the limitation as if
+ written in the body of this License.
+
+ 13. The Free Software Foundation may publish revised and/or new
+ versions of the Lesser General Public License from time to time.
+ Such new versions will be similar in spirit to the present version,
+ but may differ in detail to address new problems or concerns.
+
+ Each version is given a distinguishing version number. If the Library
+ specifies a version number of this License which applies to it and
+ "any later version", you have the option of following the terms and
+ conditions either of that version or of any later version published by
+ the Free Software Foundation. If the Library does not specify a
+ license version number, you may choose any version ever published by
+ the Free Software Foundation.
+
+ 14. If you wish to incorporate parts of the Library into other free
+ programs whose distribution conditions are incompatible with these,
+ write to the author to ask for permission. For software which is
+ copyrighted by the Free Software Foundation, write to the Free
+ Software Foundation; we sometimes make exceptions for this. Our
+ decision will be guided by the two goals of preserving the free status
+ of all derivatives of our free software and of promoting the sharing
+ and reuse of software generally.
+
+ NO WARRANTY
+
+ 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
+ WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
+ EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
+ OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
+ KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
+ LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
+ THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
+
+ 16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
+ WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
+ AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
+ FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
+ CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
+ LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
+ RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
+ FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
+ SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
+ DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+
+ How to Apply These Terms to Your New Libraries
+
+ If you develop a new library, and you want it to be of the greatest
+ possible use to the public, we recommend making it free software that
+ everyone can redistribute and change. You can do so by permitting
+ redistribution under these terms (or, alternatively, under the terms of the
+ ordinary General Public License).
+
+ To apply these terms, attach the following notices to the library. It is
+ safest to attach them to the start of each source file to most effectively
+ convey the exclusion of warranty; and each file should have at least the
+ "copyright" line and a pointer to where the full notice is found.
+
+ <one line to give the library's name and a brief idea of what it does.>
+ Copyright (C) <year> <name of author>
+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+
+ Also add information on how to contact you by electronic and paper mail.
+
+ You should also get your employer (if you work as a programmer) or your
+ school, if any, to sign a "copyright disclaimer" for the library, if
+ necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the
+ library `Frob' (a library for tweaking knobs) written by James Random Hacker.
+
+ <signature of Ty Coon>, 1 April 1990
+ Ty Coon, President of Vice
+
+ That's all there is to it!
+
+ ----------------------------------------------------------------------
+
+* libssh2 - http://www.libssh2.org/license.html
+
+ Copyright (c) 2004-2007 Sara Golemon <sarag@libssh2.org>
+ Copyright (c) 2005,2006 Mikhail Gusarov <dottedmag@dottedmag.net>
+ Copyright (c) 2006-2007 The Written Word, Inc.
+ Copyright (c) 2007 Eli Fant <elifantu@mail.ru>
+ Copyright (c) 2009 Daniel Stenberg
+ Copyright (C) 2008, 2009 Simon Josefsson
+ All rights reserved.
+
+ Redistribution and use in source and binary forms,
+ with or without modification, are permitted provided
+ that the following conditions are met:
+
+ Redistributions of source code must retain the above
+ copyright notice, this list of conditions and the
+ following disclaimer.
+
+ Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials
+ provided with the distribution.
+
+ Neither the name of the copyright holder nor the names
+ of any other contributors may be used to endorse or
+ promote products derived from this software without
+ specific prior written permission.
+
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND
+ CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES,
+ INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
+ OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
+ CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
+ BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
+ WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
+ NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
+ USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
+ OF SUCH DAMAGE.
+
+* libcurl - http://curl.haxx.se/docs/copyright.html
+
+ COPYRIGHT AND PERMISSION NOTICE
+
+ Copyright (c) 1996 - 2014, Daniel Stenberg, daniel@haxx.se.
+
+ All rights reserved.
+
+ Permission to use, copy, modify, and distribute this software for any
+ purpose with or without fee is hereby granted, provided that the above
+ copyright notice and this permission notice appear in all copies.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF THIRD PARTY RIGHTS.
+ IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+ DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+ OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+ USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+ Except as contained in this notice, the name of a copyright holder shall not
+ be used in advertising or otherwise to promote the sale, use or other
+ dealings in this Software without prior written authorization of the
+ copyright holder.
+
+* flate2-rs - https://github.com/alexcrichton/flate2-rs/blob/master/LICENSE-MIT
+* link-config - https://github.com/alexcrichton/link-config/blob/master/LICENSE-MIT
+* openssl-static-sys - https://github.com/alexcrichton/openssl-static-sys/blob/master/LICENSE-MIT
+* toml-rs - https://github.com/alexcrichton/toml-rs/blob/master/LICENSE-MIT
+* libssh2-static-sys - https://github.com/alexcrichton/libssh2-static-sys/blob/master/LICENSE-MIT
+* git2-rs - https://github.com/alexcrichton/git2-rs/blob/master/LICENSE-MIT
+* tar-rs - https://github.com/alexcrichton/tar-rs/blob/master/LICENSE-MIT
+
+ Copyright (c) 2014 Alex Crichton
+
+ Permission is hereby granted, free of charge, to any
+ person obtaining a copy of this software and associated
+ documentation files (the "Software"), to deal in the
+ Software without restriction, including without
+ limitation the rights to use, copy, modify, merge,
+ publish, distribute, sublicense, and/or sell copies of
+ the Software, and to permit persons to whom the Software
+ is furnished to do so, subject to the following
+ conditions:
+
+ The above copyright notice and this permission notice
+ shall be included in all copies or substantial portions
+ of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
+ ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
+ TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
+ PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
+ SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+ CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
+ IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+ DEALINGS IN THE SOFTWARE.
+
+* glob - https://github.com/rust-lang/glob/blob/master/LICENSE-MIT
+* semver - https://github.com/rust-lang/semver/blob/master/LICENSE-MIT
+
+ Copyright (c) 2014 The Rust Project Developers
+
+ Permission is hereby granted, free of charge, to any
+ person obtaining a copy of this software and associated
+ documentation files (the "Software"), to deal in the
+ Software without restriction, including without
+ limitation the rights to use, copy, modify, merge,
+ publish, distribute, sublicense, and/or sell copies of
+ the Software, and to permit persons to whom the Software
+ is furnished to do so, subject to the following
+ conditions:
+
+ The above copyright notice and this permission notice
+ shall be included in all copies or substantial portions
+ of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
+ ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
+ TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
+ PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
+ SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+ CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
+ IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+ DEALINGS IN THE SOFTWARE.
+
+* rust-url - https://github.com/servo/rust-url/blob/master/LICENSE-MIT
+
+ Copyright (c) 2006-2009 Graydon Hoare
+ Copyright (c) 2009-2013 Mozilla Foundation
+
+ Permission is hereby granted, free of charge, to any
+ person obtaining a copy of this software and associated
+ documentation files (the "Software"), to deal in the
+ Software without restriction, including without
+ limitation the rights to use, copy, modify, merge,
+ publish, distribute, sublicense, and/or sell copies of
+ the Software, and to permit persons to whom the Software
+ is furnished to do so, subject to the following
+ conditions:
+
+ The above copyright notice and this permission notice
+ shall be included in all copies or substantial portions
+ of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
+ ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
+ TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
+ PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
+ SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+ CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
+ IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+ DEALINGS IN THE SOFTWARE.
+
+* rust-encoding - https://github.com/lifthrasiir/rust-encoding/blob/master/LICENSE.txt
+
+ The MIT License (MIT)
+
+ Copyright (c) 2013, Kang Seonghoon.
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
+
+* curl-rust - https://github.com/carllerche/curl-rust/blob/master/LICENSE
+
+ Copyright (c) 2014 Carl Lerche
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
+
+* docopt.rs - https://github.com/docopt/docopt.rs/blob/master/UNLICENSE
+
+ This is free and unencumbered software released into the public domain.
+
+ Anyone is free to copy, modify, publish, use, compile, sell, or
+ distribute this software, either in source code form or as a compiled
+ binary, for any purpose, commercial or non-commercial, and by any
+ means.
+
+ In jurisdictions that recognize copyright laws, the author or authors
+ of this software dedicate any and all copyright interest in the
+ software to the public domain. We make this dedication for the benefit
+ of the public at large and to the detriment of our heirs and
+ successors. We intend this dedication to be an overt act of
+ relinquishment in perpetuity of all present and future rights to this
+ software under copyright law.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+ IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
+ OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
+ ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
+ OTHER DEALINGS IN THE SOFTWARE.
+
+ For more information, please refer to <http://unlicense.org/>
+
--- /dev/null
+# Cargo
+
+Cargo downloads your Rust project’s dependencies and compiles your project.
+
+Learn more at http://doc.crates.io/
+
+## Code Status
+
+[](https://travis-ci.org/rust-lang/cargo)
+[](https://ci.appveyor.com/project/rust-lang-libs/cargo)
+
+## Installing Cargo
+
+Cargo is distributed by default with Rust, so if you've got `rustc` installed
+locally, you probably also have `cargo`.
+
+## Compiling from Source
+
+Cargo requires the following tools and packages to build:
+
+* `python`
+* `curl` (on Unix)
+* `cmake`
+* OpenSSL headers (Unix only; this is the `libssl-dev` package on Ubuntu)
+* `cargo` and `rustc`
+
+First, you'll want to check out this repository:
+
+```
+git clone --recursive https://github.com/rust-lang/cargo
+cd cargo
+```
+
+With `cargo` already installed, you can simply run:
+
+```
+cargo build --release
+```
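+
+The compiled `cargo` binary ends up in `target/release`.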
+
+## Adding new subcommands to Cargo
+
+Cargo is designed to be extensible with new subcommands without having to modify
+Cargo itself. See [the Wiki page][third-party-subcommands] for more details and
+a list of known community-developed subcommands.
+
+[third-party-subcommands]: https://github.com/rust-lang/cargo/wiki/Third-party-cargo-subcommands
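+
+As a quick sketch of the convention (the `cargo-hello` name below is
+hypothetical): any executable named `cargo-<name>` on your `PATH` can be
+invoked as `cargo <name>`. Cargo passes the subcommand's own name as the
+first argument, followed by the rest of the command line:
+
+```
+// cargo-hello/src/main.rs -- a minimal third-party subcommand sketch.
+// Installing the resulting `cargo-hello` binary somewhere on PATH makes
+// `cargo hello --loud` invoke it.
+use std::env;
+
+fn main() {
+    // argv is ["<path to cargo-hello>", "hello", "--loud", ...], so the
+    // user's own arguments start at index 2.
+    let args: Vec<String> = env::args().skip(2).collect();
+    println!("Hello from a custom subcommand! extra args: {:?}", args);
+}
+```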
+
+
+## Releases
+
+High level release notes are available as part of [Rust's release notes][rel].
+Cargo releases coincide with Rust releases.
+
+[rel]: https://github.com/rust-lang/rust/blob/master/RELEASES.md
+
+## Reporting issues
+
+Found a bug? We'd love to know about it!
+
+Please report all issues on the GitHub [issue tracker][issues].
+
+[issues]: https://github.com/rust-lang/cargo/issues
+
+## Contributing
+
+See [CONTRIBUTING.md](CONTRIBUTING.md). You may also find the architecture
+documentation useful ([ARCHITECTURE.md](ARCHITECTURE.md)).
+
+## License
+
+Cargo is primarily distributed under the terms of both the MIT license
+and the Apache License (Version 2.0).
+
+See LICENSE-APACHE and LICENSE-MIT for details.
+
+### Third party software
+
+This product includes software developed by the OpenSSL Project
+for use in the OpenSSL Toolkit (http://www.openssl.org/).
+
+In binary form, this product includes software that is licensed under the
+terms of the GNU General Public License, version 2, with a linking exception,
+which can be obtained from the [upstream repository][1].
+
+See LICENSE-THIRD-PARTY for details.
+
+[1]: https://github.com/libgit2/libgit2
+
--- /dev/null
+environment:
+
+ # At the time this was added, AppVeyor was having trouble checking
+ # revocation of the SSL certificates of sites like static.rust-lang.org and
+ # what we think is crates.io. The libcurl HTTP client by default checks for
+ # revocation on Windows, and according to a mailing list [1] this can be
+ # disabled.
+ #
+ # The `CARGO_HTTP_CHECK_REVOKE` env var here tells cargo to disable SSL
+ # revocation checking on Windows in libcurl. Note, though, that rustup, which
+ # we're using to download Rust here, also uses libcurl as the default backend.
+ # Unlike Cargo, however, rustup doesn't have a mechanism to disable revocation
+ # checking. To get rustup working we set `RUSTUP_USE_HYPER`, which forces it
+ # to use the Hyper backend instead of libcurl. Both Hyper and libcurl use
+ # schannel on Windows but it appears that Hyper configures it slightly
+ # differently such that revocation checking isn't turned on by default.
+ #
+ # [1]: https://curl.haxx.se/mail/lib-2016-03/0202.html
+ RUSTUP_USE_HYPER: 1
+ CARGO_HTTP_CHECK_REVOKE: false
+
+ matrix:
+ - TARGET: x86_64-pc-windows-msvc
+ OTHER_TARGET: i686-pc-windows-msvc
+ MAKE_TARGETS: test-unit-x86_64-pc-windows-msvc
+
+install:
+ - appveyor-retry appveyor DownloadFile https://win.rustup.rs/ -FileName rustup-init.exe
+ - rustup-init.exe -y --default-host x86_64-pc-windows-msvc --default-toolchain nightly
+ - set PATH=%PATH%;C:\Users\appveyor\.cargo\bin
+ - rustup target add %OTHER_TARGET%
+ - rustc -V
+ - cargo -V
+ - git submodule update --init
+
+clone_depth: 1
+
+build: false
+
+test_script:
+ - cargo test
--- /dev/null
+disable_all_formatting = true
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops::{self, MessageFormat, Packages};
+use cargo::util::{CliResult, CliError, Config, CargoErrorKind};
+use cargo::util::important_paths::{find_root_manifest_for_wd};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_no_run: bool,
+ flag_package: Vec<String>,
+ flag_jobs: Option<u32>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_no_default_features: bool,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_lib: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_example: Vec<String>,
+ flag_examples: bool,
+ flag_test: Vec<String>,
+ flag_tests: bool,
+ flag_bench: Vec<String>,
+ flag_benches: bool,
+ flag_all_targets: bool,
+ flag_no_fail_fast: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ arg_args: Vec<String>,
+ flag_all: bool,
+ flag_exclude: Vec<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Execute all benchmarks of a local package
+
+Usage:
+ cargo bench [options] [--] [<args>...]
+
+Options:
+ -h, --help Print this message
+ --lib Benchmark only this package's library
+ --bin NAME ... Benchmark only the specified binary
+ --bins Benchmark all binaries
+ --example NAME ... Benchmark only the specified example
+ --examples Benchmark all examples
+ --test NAME ... Benchmark only the specified test target
+ --tests Benchmark all tests
+ --bench NAME ... Benchmark only the specified bench target
+ --benches Benchmark all benches
+ --all-targets Benchmark all targets (default)
+ --no-run Compile, but don't run benchmarks
+ -p SPEC, --package SPEC ... Package to run benchmarks for
+ --all Benchmark all packages in the workspace
+ --exclude SPEC ... Exclude packages from the benchmark
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --features FEATURES Space-separated list of features to also build
+ --all-features Build all available features
+ --no-default-features Do not build the `default` feature
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest to build benchmarks for
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --no-fail-fast Run all benchmarks regardless of failure
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+All of the trailing arguments are passed to the generated benchmark binaries,
+both to filter which benchmarks run and to configure how they run.
+
+If the --package argument is given, then SPEC is a package id specification
+which indicates which package should be benchmarked. If it is not given, then
+the current package is benchmarked. For more information on SPEC and its format,
+see the `cargo help pkgid` command.
+
+All packages in the workspace are benchmarked if the `--all` flag is supplied. The
+`--all` flag is automatically assumed for a virtual manifest.
+Note that `--exclude` has to be specified in conjunction with the `--all` flag.
+
+The --jobs argument affects the building of the benchmark executable but does
+not affect how many jobs are used when running the benchmarks.
+
+Compilation can be customized with the `bench` profile in the manifest.
+";
+
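+/// Entry point for `cargo bench`: configures the `Config`, locates the
+/// workspace, translates the flags into `ops::TestOptions`, and delegates
+/// to `ops::run_benches`.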
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-bench; args={:?}",
+ env::args().collect::<Vec<_>>());
+
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+
+ let spec = Packages::from_flags(ws.is_virtual(),
+ options.flag_all,
+ &options.flag_exclude,
+ &options.flag_package)?;
+
+ let ops = ops::TestOptions {
+ no_run: options.flag_no_run,
+ no_fail_fast: options.flag_no_fail_fast,
+ only_doc: false,
+ compile_opts: ops::CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|s| &s[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: spec,
+ release: true,
+ mode: ops::CompileMode::Bench,
+ filter: ops::CompileFilter::new(options.flag_lib,
+ &options.flag_bin, options.flag_bins,
+ &options.flag_test, options.flag_tests,
+ &options.flag_example, options.flag_examples,
+ &options.flag_bench, options.flag_benches,
+ options.flag_all_targets),
+ message_format: options.flag_message_format,
+ target_rustdoc_args: None,
+ target_rustc_args: None,
+ },
+ };
+
+ let err = ops::run_benches(&ws, &ops, &options.arg_args)?;
+ match err {
+ None => Ok(()),
+ Some(err) => {
+ Err(match err.exit.as_ref().and_then(|e| e.code()) {
+ Some(i) => CliError::new("bench failed".into(), i),
+ None => CliError::new(CargoErrorKind::CargoTestErrorKind(err).into(), 101)
+ })
+ }
+ }
+}
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops::{self, CompileOptions, MessageFormat, Packages};
+use cargo::util::important_paths::{find_root_manifest_for_wd};
+use cargo::util::{CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_package: Vec<String>,
+ flag_jobs: Option<u32>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_no_default_features: bool,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_release: bool,
+ flag_lib: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_example: Vec<String>,
+ flag_examples: bool,
+ flag_test: Vec<String>,
+ flag_tests: bool,
+ flag_bench: Vec<String>,
+ flag_benches: bool,
+ flag_all_targets: bool,
+ flag_locked: bool,
+ flag_frozen: bool,
+ flag_all: bool,
+ flag_exclude: Vec<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Compile a local package and all of its dependencies
+
+Usage:
+ cargo build [options]
+
+Options:
+ -h, --help Print this message
+ -p SPEC, --package SPEC ... Package to build
+ --all Build all packages in the workspace
+ --exclude SPEC ... Exclude packages from the build
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --lib Build only this package's library
+ --bin NAME Build only the specified binary
+ --bins Build all binaries
+ --example NAME Build only the specified example
+ --examples Build all examples
+ --test NAME Build only the specified test target
+ --tests Build all tests
+ --bench NAME Build only the specified bench target
+ --benches Build all benches
+ --all-targets Build all targets (lib and bin targets by default)
+ --release Build artifacts in release mode, with optimizations
+ --features FEATURES Space-separated list of features to also build
+ --all-features Build all available features
+ --no-default-features Do not build the `default` feature
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest to compile
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+If the --package argument is given, then SPEC is a package id specification
+which indicates which package should be built. If it is not given, then the
+current package is built. For more information on SPEC and its format, see the
+`cargo help pkgid` command.
+
+All packages in the workspace are built if the `--all` flag is supplied. The
+`--all` flag is automatically assumed for a virtual manifest.
+Note that `--exclude` has to be specified in conjunction with the `--all` flag.
+
+Compilation can be configured via the use of profiles which are configured in
+the manifest. The default profile for this command is `dev`, but passing
+the --release flag will use the `release` profile instead.
+";
+
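+/// Entry point for `cargo build`: configures the `Config`, locates the
+/// workspace, and hands the assembled `CompileOptions` to `ops::compile`.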
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-build; args={:?}",
+ env::args().collect::<Vec<_>>());
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+
+ let spec = Packages::from_flags(ws.is_virtual(),
+ options.flag_all,
+ &options.flag_exclude,
+ &options.flag_package)?;
+
+ let opts = CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|t| &t[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: spec,
+ mode: ops::CompileMode::Build,
+ release: options.flag_release,
+ filter: ops::CompileFilter::new(options.flag_lib,
+ &options.flag_bin, options.flag_bins,
+ &options.flag_test, options.flag_tests,
+ &options.flag_example, options.flag_examples,
+ &options.flag_bench, options.flag_benches,
+ options.flag_all_targets),
+ message_format: options.flag_message_format,
+ target_rustdoc_args: None,
+ target_rustc_args: None,
+ };
+
+ ops::compile(&ws, &opts)?;
+ Ok(())
+}
--- /dev/null
+extern crate cargo;
+extern crate env_logger;
+extern crate git2_curl;
+extern crate toml;
+#[macro_use]
+extern crate log;
+#[macro_use]
+extern crate serde_derive;
+extern crate serde_json;
+
+use std::collections::BTreeSet;
+use std::collections::HashMap;
+use std::env;
+use std::fs;
+use std::path::{Path, PathBuf};
+
+use cargo::core::shell::{Shell, Verbosity};
+use cargo::util::{self, CliResult, lev_distance, Config, CargoResult, CargoError, CargoErrorKind};
+use cargo::util::CliError;
+
+#[derive(Deserialize)]
+pub struct Flags {
+ flag_list: bool,
+ flag_version: bool,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_explain: Option<String>,
+ arg_command: String,
+ arg_args: Vec<String>,
+ flag_locked: bool,
+ flag_frozen: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+const USAGE: &'static str = "
+Rust's package manager
+
+Usage:
+ cargo <command> [<args>...]
+ cargo [options]
+
+Options:
+ -h, --help Display this message
+ -V, --version Print version info and exit
+ --list List installed commands
+ --explain CODE Run `rustc --explain CODE`
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+Some common cargo commands are (see all commands with --list):
+ build Compile the current project
+ check Analyze the current project and report errors, but don't build object files
+ clean Remove the target directory
+ doc Build this project's and its dependencies' documentation
+ new Create a new cargo project
+ init Create a new cargo project in an existing directory
+ run Build and execute src/main.rs
+ test Run the tests
+ bench Run the benchmarks
+ update Update dependencies listed in Cargo.lock
+ search Search registry for crates
+ publish Package and upload this project to the registry
+ install Install a Rust binary
+ uninstall Uninstall a Rust binary
+
+See 'cargo help <command>' for more information on a specific command.
+";
+
+fn main() {
+ env_logger::init().unwrap();
+
+ let mut config = match Config::default() {
+ Ok(cfg) => cfg,
+ Err(e) => {
+ let mut shell = Shell::new();
+ cargo::exit_with_error(e.into(), &mut shell)
+ }
+ };
+
+ let result = (|| {
+ let args = env::args_os()
+ .map(|s| {
+ s.into_string().map_err(|s| {
+ CargoError::from(format!("invalid unicode in argument: {:?}", s))
+ })
+ })
+ .collect::<Result<Vec<_>, _>>()?;
+ let rest = &args;
+ cargo::call_main_without_stdin(execute, &mut config, USAGE, rest, true)
+ })();
+
+ match result {
+ Err(e) => cargo::exit_with_error(e, &mut *config.shell()),
+ Ok(()) => {}
+ }
+}
+
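+// Invokes the given macro once per built-in subcommand. Used below both to
+// declare the subcommand modules and to dispatch built-in command names.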
+macro_rules! each_subcommand{
+ ($mac:ident) => {
+ $mac!(bench);
+ $mac!(build);
+ $mac!(check);
+ $mac!(clean);
+ $mac!(doc);
+ $mac!(fetch);
+ $mac!(generate_lockfile);
+ $mac!(git_checkout);
+ $mac!(help);
+ $mac!(init);
+ $mac!(install);
+ $mac!(locate_project);
+ $mac!(login);
+ $mac!(metadata);
+ $mac!(new);
+ $mac!(owner);
+ $mac!(package);
+ $mac!(pkgid);
+ $mac!(publish);
+ $mac!(read_manifest);
+ $mac!(run);
+ $mac!(rustc);
+ $mac!(rustdoc);
+ $mac!(search);
+ $mac!(test);
+ $mac!(uninstall);
+ $mac!(update);
+ $mac!(verify_project);
+ $mac!(version);
+ $mac!(yank);
+ }
+}
+
+macro_rules! declare_mod {
+ ($name:ident) => ( pub mod $name; )
+}
+each_subcommand!(declare_mod);
+
+/**
+ The top-level `cargo` command handles configuration and project location
+ because they are fundamental (and intertwined). Other commands can rely
+ on this top-level information.
+*/
+fn execute(flags: Flags, config: &mut Config) -> CliResult {
+ config.configure(flags.flag_verbose,
+ flags.flag_quiet,
+ &flags.flag_color,
+ flags.flag_frozen,
+ flags.flag_locked,
+ &flags.flag_z)?;
+
+ init_git_transports(config);
+ let _token = cargo::util::job::setup();
+
+ if flags.flag_version {
+ let version = cargo::version();
+ println!("{}", version);
+ if flags.flag_verbose > 0 {
+ println!("release: {}.{}.{}",
+ version.major,
+ version.minor,
+ version.patch);
+ if let Some(ref cfg) = version.cfg_info {
+ if let Some(ref ci) = cfg.commit_info {
+ println!("commit-hash: {}", ci.commit_hash);
+ println!("commit-date: {}", ci.commit_date);
+ }
+ }
+ }
+ return Ok(());
+ }
+
+ if flags.flag_list {
+ println!("Installed Commands:");
+ for command in list_commands(config) {
+ println!(" {}", command);
+ }
+ return Ok(());
+ }
+
+ if let Some(ref code) = flags.flag_explain {
+ let mut process = config.rustc()?.process();
+ process.arg("--explain").arg(code).exec()?;
+ return Ok(());
+ }
+
+ let args = match &flags.arg_command[..] {
+ // For the commands `cargo` and `cargo help`, re-execute ourselves as
+ // `cargo -h` so we can go through the normal process of printing the
+ // help message.
+ "" | "help" if flags.arg_args.is_empty() => {
+ config.shell().set_verbosity(Verbosity::Verbose);
+ let args = &["cargo".to_string(), "-h".to_string()];
+ return cargo::call_main_without_stdin(execute, config, USAGE, args, false);
+ }
+
+ // For `cargo help -h` and `cargo help --help`, print out the help
+ // message for `cargo help`
+ "help" if flags.arg_args[0] == "-h" || flags.arg_args[0] == "--help" => {
+ vec!["cargo".to_string(), "help".to_string(), "-h".to_string()]
+ }
+
+ // For `cargo help foo`, print out the usage message for the specified
+ // subcommand by executing the command with the `-h` flag.
+ "help" => vec!["cargo".to_string(), flags.arg_args[0].clone(), "-h".to_string()],
+
+ // For all other invocations, we're of the form `cargo foo args...`. We
+ // use the exact environment arguments to preserve tokens like `--` for
+ // example.
+ _ => {
+ let mut default_alias = HashMap::new();
+ default_alias.insert("b", "build".to_string());
+ default_alias.insert("t", "test".to_string());
+ default_alias.insert("r", "run".to_string());
+ let mut args: Vec<String> = env::args().collect();
+ if let Some(new_command) = default_alias.get(&args[1][..]) {
+ args[1] = new_command.clone();
+ }
+ args
+ }
+ };
+
+ if let Some(r) = try_execute_builtin_command(config, &args) {
+ return r;
+ }
+
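+ // Not a built-in command: expand a configured alias if there is one
+ // (keeping argv[0] and any trailing arguments), retry built-in dispatch
+ // with the expansion, and otherwise fall through to an external
+ // `cargo-<cmd>` executable.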
+ let alias_list = aliased_command(config, &args[1])?;
+ let args = match alias_list {
+ Some(alias_command) => {
+ let chain = args.iter()
+ .take(1)
+ .chain(alias_command.iter())
+ .chain(args.iter().skip(2))
+ .map(|s| s.to_string())
+ .collect::<Vec<_>>();
+ if let Some(r) = try_execute_builtin_command(config, &chain) {
+ return r;
+ } else {
+ chain
+ }
+ }
+ None => args,
+ };
+
+ execute_external_subcommand(config, &args[1], &args)
+}
+
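+/// Runs `args` as a built-in subcommand when `args[1]` names one of the
+/// `each_subcommand!` entries; returns `None` for unknown commands so the
+/// caller can try aliases and external subcommands.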
+fn try_execute_builtin_command(config: &mut Config, args: &[String]) -> Option<CliResult> {
+ macro_rules! cmd {
+ ($name:ident) => (if args[1] == stringify!($name).replace("_", "-") {
+ config.shell().set_verbosity(Verbosity::Verbose);
+ let r = cargo::call_main_without_stdin($name::execute,
+ config,
+ $name::USAGE,
+ &args,
+ false);
+ return Some(r);
+ })
+ }
+ each_subcommand!(cmd);
+
+ None
+}
+
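+/// Looks up `alias.<command>` in the Cargo configuration, which may be
+/// written either as a whitespace-separated string or as a list of strings
+/// (e.g. `b = "build"` under an `[alias]` section of `.cargo/config`).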
+fn aliased_command(config: &Config, command: &str) -> CargoResult<Option<Vec<String>>> {
+ let alias_name = format!("alias.{}", command);
+ let mut result = Ok(None);
+ match config.get_string(&alias_name) {
+ Ok(value) => {
+ if let Some(record) = value {
+ let alias_commands = record.val
+ .split_whitespace()
+ .map(|s| s.to_string())
+ .collect();
+ result = Ok(Some(alias_commands));
+ }
+ }
+ Err(_) => {
+ let value = config.get_list(&alias_name)?;
+ if let Some(record) = value {
+ let alias_commands: Vec<String> = record.val
+ .iter()
+ .map(|s| s.0.to_string())
+ .collect();
+ result = Ok(Some(alias_commands));
+ }
+ }
+ }
+ result
+}
+
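+/// Find the installed command whose name has the smallest Levenshtein
+/// distance to `cmd`, so a typo like `cargo biuld` yields a `build`
+/// suggestion.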
+fn find_closest(config: &Config, cmd: &str) -> Option<String> {
+ let cmds = list_commands(config);
+ // Only consider candidates with a lev_distance of 3 or less so we don't
+ // suggest out-of-the-blue options.
+ let mut filtered = cmds.iter()
+ .map(|c| (lev_distance(c, cmd), c))
+ .filter(|&(d, _)| d < 4)
+ .collect::<Vec<_>>();
+ filtered.sort_by(|a, b| a.0.cmp(&b.0));
+ filtered.get(0).map(|slot| slot.1.clone())
+}
+
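+/// Execute a third-party subcommand: `cargo foo <args>` runs the first
+/// `cargo-foo` executable found in the search directories, forwarding the
+/// arguments and exporting the path of the running `cargo` binary in an
+/// environment variable so the subcommand can call back into it.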
+fn execute_external_subcommand(config: &Config, cmd: &str, args: &[String]) -> CliResult {
+ let command_exe = format!("cargo-{}{}", cmd, env::consts::EXE_SUFFIX);
+ let path = search_directories(config)
+ .iter()
+ .map(|dir| dir.join(&command_exe))
+ .find(|file| is_executable(file));
+ let command = match path {
+ Some(command) => command,
+ None => {
+ return Err(CargoError::from(match find_closest(config, cmd) {
+ Some(closest) => {
+ format!("no such subcommand: `{}`\n\n\tDid you mean `{}`?\n",
+ cmd,
+ closest)
+ }
+ None => format!("no such subcommand: `{}`", cmd),
+ })
+ .into())
+ }
+ };
+
+ let cargo_exe = config.cargo_exe()?;
+ let err = match util::process(&command)
+ .env(cargo::CARGO_ENV, cargo_exe)
+ .args(&args[1..])
+ .exec_replace() {
+ Ok(()) => return Ok(()),
+ Err(e) => e,
+ };
+
+ if let &CargoErrorKind::ProcessErrorKind(ref perr) = err.kind() {
+ if let Some(code) = perr.exit.as_ref().and_then(|c| c.code()) {
+ return Err(CliError::code(code));
+ }
+ }
+ Err(CliError::new(err, 101))
+}
+
+/// List all runnable commands: external `cargo-*` executables found in the
+/// search directories plus the built-in subcommands.
+fn list_commands(config: &Config) -> BTreeSet<String> {
+ let prefix = "cargo-";
+ let suffix = env::consts::EXE_SUFFIX;
+ let mut commands = BTreeSet::new();
+ for dir in search_directories(config) {
+ let entries = match fs::read_dir(dir) {
+ Ok(entries) => entries,
+ _ => continue,
+ };
+ for entry in entries.filter_map(|e| e.ok()) {
+ let path = entry.path();
+ let filename = match path.file_name().and_then(|s| s.to_str()) {
+ Some(filename) => filename,
+ _ => continue,
+ };
+ if !filename.starts_with(prefix) || !filename.ends_with(suffix) {
+ continue;
+ }
+            if is_executable(&path) {
+ let end = filename.len() - suffix.len();
+ commands.insert(filename[prefix.len()..end].to_string());
+ }
+ }
+ }
+
+ macro_rules! add_cmd {
+ ($cmd:ident) => ({ commands.insert(stringify!($cmd).replace("_", "-")); })
+ }
+ each_subcommand!(add_cmd);
+ commands
+}
+
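+// A file counts as executable on Unix when any execute permission bit
+// (0o111) is set. On Windows only `is_file` is checked, since callers have
+// already matched the `.exe` suffix via `env::consts::EXE_SUFFIX`.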
+#[cfg(unix)]
+fn is_executable<P: AsRef<Path>>(path: P) -> bool {
+ use std::os::unix::prelude::*;
+ fs::metadata(path)
+ .map(|metadata| metadata.is_file() && metadata.permissions().mode() & 0o111 != 0)
+ .unwrap_or(false)
+}
+#[cfg(windows)]
+fn is_executable<P: AsRef<Path>>(path: P) -> bool {
+ fs::metadata(path).map(|metadata| metadata.is_file()).unwrap_or(false)
+}
+
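+// External subcommands are searched for in `$CARGO_HOME/bin` first,
+// followed by every directory listed in `$PATH`.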
+fn search_directories(config: &Config) -> Vec<PathBuf> {
+ let mut dirs = vec![config.home().clone().into_path_unlocked().join("bin")];
+ if let Some(val) = env::var_os("PATH") {
+ dirs.extend(env::split_paths(&val));
+ }
+ dirs
+}
+
+fn init_git_transports(config: &Config) {
+ // Only use a custom transport if a proxy is configured, right now libgit2
+ // doesn't support proxies and we have to use a custom transport in this
+ // case. The custom transport, however, is not as well battle-tested.
+ match cargo::ops::http_proxy_exists(config) {
+ Ok(true) => {}
+ _ => return,
+ }
+
+ let handle = match cargo::ops::http_handle(config) {
+ Ok(handle) => handle,
+ Err(..) => return,
+ };
+
+ // The unsafety of the registration function derives from two aspects:
+ //
+ // 1. This call must be synchronized with all other registration calls as
+ // well as construction of new transports.
+ // 2. The argument is leaked.
+ //
+    // We're clear on point (1) because this is only called at the start of
+    // this binary (we know what the state of the world looks like) and we're
+    // mostly clear on point (2) because we'd only free the handle after
+    // everything is done anyway.
+ unsafe {
+ git2_curl::register(handle);
+ }
+}
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops::{self, CompileOptions, MessageFormat, Packages};
+use cargo::util::{CliResult, CliError, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+pub const USAGE: &'static str = "
+Check a local package and all of its dependencies for errors
+
+Usage:
+ cargo check [options]
+
+Options:
+ -h, --help Print this message
+ -p SPEC, --package SPEC ... Package(s) to check
+ --all Check all packages in the workspace
+ --exclude SPEC ... Exclude packages from the check
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --lib Check only this package's library
+ --bin NAME Check only the specified binary
+ --bins Check all binaries
+ --example NAME Check only the specified example
+ --examples Check all examples
+ --test NAME Check only the specified test target
+ --tests Check all tests
+ --bench NAME Check only the specified bench target
+ --benches Check all benches
+ --all-targets Check all targets (lib and bin targets by default)
+ --release Check artifacts in release mode, with optimizations
+ --profile PROFILE Profile to build the selected target for
+ --features FEATURES Space-separated list of features to also check
+ --all-features Check all available features
+ --no-default-features Do not check the `default` feature
+ --target TRIPLE Check for the target triple
+ --manifest-path PATH Path to the manifest to compile
+ -v, --verbose ... Use verbose output
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+If the --package argument is given, then SPEC is a package id specification
+which indicates which package should be built. If it is not given, then the
+current package is built. For more information on SPEC and its format, see the
+`cargo help pkgid` command.
+
+All packages in the workspace are checked if the `--all` flag is supplied. The
+`--all` flag is automatically assumed for a virtual manifest.
+Note that `--exclude` has to be specified in conjunction with the `--all` flag.
+
+Compilation can be configured via the use of profiles which are configured in
+the manifest. The default profile for this command is `dev`, but passing
+the --release flag will use the `release` profile instead.
+
+The `--profile test` flag can be used to check unit tests with the
+`#[cfg(test)]` attribute.
+";
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_package: Vec<String>,
+ flag_jobs: Option<u32>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_no_default_features: bool,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_release: bool,
+ flag_lib: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_example: Vec<String>,
+ flag_examples: bool,
+ flag_test: Vec<String>,
+ flag_tests: bool,
+ flag_bench: Vec<String>,
+ flag_benches: bool,
+ flag_all_targets: bool,
+ flag_locked: bool,
+ flag_frozen: bool,
+ flag_all: bool,
+ flag_exclude: Vec<String>,
+ flag_profile: Option<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-check; args={:?}",
+ env::args().collect::<Vec<_>>());
+
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+
+ let spec = Packages::from_flags(ws.is_virtual(),
+ options.flag_all,
+ &options.flag_exclude,
+ &options.flag_package)?;
+
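+    // `--profile test` is the only profile `cargo check` accepts: it also
+    // checks code guarded by `#[cfg(test)]`. Any other profile name is
+    // rejected below.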
+ let test = match options.flag_profile.as_ref().map(|t| &t[..]) {
+ Some("test") => true,
+ None => false,
+ Some(profile) => {
+ let err = format!("unknown profile: `{}`, only `test` is currently supported",
+ profile).into();
+ return Err(CliError::new(err, 101))
+ }
+ };
+
+ let opts = CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|t| &t[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: spec,
+        mode: ops::CompileMode::Check { test: test },
+ release: options.flag_release,
+ filter: ops::CompileFilter::new(options.flag_lib,
+ &options.flag_bin, options.flag_bins,
+ &options.flag_test, options.flag_tests,
+ &options.flag_example, options.flag_examples,
+ &options.flag_bench, options.flag_benches,
+ options.flag_all_targets),
+ message_format: options.flag_message_format,
+ target_rustdoc_args: None,
+ target_rustc_args: None,
+ };
+
+ ops::compile(&ws, &opts)?;
+ Ok(())
+}
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_package: Vec<String>,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_release: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Remove artifacts that cargo has generated in the past
+
+Usage:
+ cargo clean [options]
+
+Options:
+ -h, --help Print this message
+ -p SPEC, --package SPEC ... Package to clean artifacts for
+    --manifest-path PATH        Path to the manifest of the package to clean
+ --target TRIPLE Target triple to clean output for (default all)
+ --release Whether or not to clean release artifacts
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+If the --package argument is given, then SPEC is a package id specification
+which indicates which package's artifacts should be cleaned out. If it is not
+given, then all packages' artifacts are removed. For more information on SPEC
+and its format, see the `cargo help pkgid` command.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-clean; args={:?}", env::args().collect::<Vec<_>>());
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let opts = ops::CleanOptions {
+ config: config,
+ spec: &options.flag_package,
+ target: options.flag_target.as_ref().map(|s| &s[..]),
+ release: options.flag_release,
+ };
+ let ws = Workspace::new(&root, config)?;
+ ops::clean(&ws, &opts)?;
+ Ok(())
+}
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops::{self, MessageFormat, Packages};
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_target: Option<String>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_jobs: Option<u32>,
+ flag_manifest_path: Option<String>,
+ flag_no_default_features: bool,
+ flag_no_deps: bool,
+ flag_open: bool,
+ flag_release: bool,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_package: Vec<String>,
+ flag_lib: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ flag_all: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Build a package's documentation
+
+Usage:
+ cargo doc [options]
+
+Options:
+ -h, --help Print this message
+ --open Opens the docs in a browser after the operation
+ -p SPEC, --package SPEC ... Package to document
+ --all Document all packages in the workspace
+ --no-deps Don't build documentation for dependencies
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --lib Document only this package's library
+ --bin NAME Document only the specified binary
+ --bins Document all binaries
+ --release Build artifacts in release mode, with optimizations
+ --features FEATURES Space-separated list of features to also build
+ --all-features Build all available features
+ --no-default-features Do not build the `default` feature
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest to document
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+By default the documentation for the local package and all dependencies is
+built. The output is all placed in `target/doc` in rustdoc's usual format.
+
+All packages in the workspace are documented if the `--all` flag is
+supplied. The `--all` flag is automatically assumed for a virtual manifest.
+
+If the --package argument is given, then SPEC is a package id specification
+which indicates which package should be documented. If it is not given, then the
+current package is documented. For more information on SPEC and its format, see
+the `cargo help pkgid` command.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-check; args={:?}",
+ env::args().collect::<Vec<_>>());
+
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+
+ let spec = if options.flag_all || (ws.is_virtual() && options.flag_package.is_empty()) {
+ Packages::All
+ } else {
+ Packages::Packages(&options.flag_package)
+ };
+
+ let empty = Vec::new();
+ let doc_opts = ops::DocOptions {
+ open_result: options.flag_open,
+ compile_opts: ops::CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|t| &t[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: spec,
+ filter: ops::CompileFilter::new(options.flag_lib,
+ &options.flag_bin, options.flag_bins,
+ &empty, false,
+ &empty, false,
+ &empty, false,
+ false),
+ message_format: options.flag_message_format,
+ release: options.flag_release,
+ mode: ops::CompileMode::Doc {
+ deps: !options.flag_no_deps,
+ },
+ target_rustc_args: None,
+ target_rustdoc_args: None,
+ },
+ };
+
+ ops::doc(&ws, &doc_opts)?;
+ Ok(())
+}
--- /dev/null
+use cargo::core::Workspace;
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Fetch dependencies of a package from the network.
+
+Usage:
+ cargo fetch [options]
+
+Options:
+ -h, --help Print this message
+ --manifest-path PATH Path to the manifest to fetch dependencies for
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+If a lockfile is available, this command will ensure that all of the git
+dependencies and/or registry dependencies are downloaded and locally
+available. The network is never touched after a `cargo fetch` unless
+the lockfile changes.
+
+If the lockfile is not available, then this is the equivalent of
+`cargo generate-lockfile`. A lockfile is generated and all dependencies are
+updated as well.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+ ops::fetch(&ws)?;
+ Ok(())
+}
+
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Generate the lockfile for a project
+
+Usage:
+ cargo generate-lockfile [options]
+
+Options:
+ -h, --help Print this message
+ --manifest-path PATH Path to the manifest to generate a lockfile for
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-generate-lockfile; args={:?}", env::args().collect::<Vec<_>>());
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+
+ let ws = Workspace::new(&root, config)?;
+ ops::generate_lockfile(&ws)?;
+ Ok(())
+}
--- /dev/null
+use cargo::core::source::{Source, SourceId, GitReference};
+use cargo::sources::git::GitSource;
+use cargo::util::{Config, CliResult, ToUrl};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_url: String,
+ flag_reference: String,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Checkout a copy of a Git repository
+
+Usage:
+ cargo git-checkout [options] --url=URL --reference=REF
+ cargo git-checkout -h | --help
+
+Options:
+ -h, --help Print this message
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let Options { flag_url: url, flag_reference: reference, .. } = options;
+
+ let url = url.to_url()?;
+
+    let reference = GitReference::Branch(reference);
+ let source_id = SourceId::for_git(&url, reference)?;
+
+ let mut source = GitSource::new(&source_id, config)?;
+
+ source.update()?;
+
+ Ok(())
+}
--- /dev/null
+use cargo::util::{CliResult, CliError, Config};
+
+#[derive(Deserialize)]
+pub struct Options;
+
+pub const USAGE: &'static str = "
+Get some help with a cargo command.
+
+Usage:
+ cargo help <command>
+ cargo help -h | --help
+
+Options:
+ -h, --help Print this message
+";
+
+pub fn execute(_: Options, _: &mut Config) -> CliResult {
+ // This is a dummy command just so that `cargo help help` works.
+ // The actual delegation of help flag to subcommands is handled by the
+ // cargo command.
+ Err(CliError::new("help command should not be executed directly".into(), 101))
+}
--- /dev/null
+use std::env;
+
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_bin: bool,
+ flag_lib: bool,
+ arg_path: Option<String>,
+ flag_name: Option<String>,
+ flag_vcs: Option<ops::VersionControl>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Create a new cargo package in an existing directory
+
+Usage:
+ cargo init [options] [<path>]
+ cargo init -h | --help
+
+Options:
+ -h, --help Print this message
+ --vcs VCS Initialize a new repository for the given version
+ control system (git, hg, pijul, or fossil) or do not
+ initialize any version control at all (none), overriding
+ a global configuration.
+ --bin Use a binary (application) template
+ --lib Use a library template [default]
+ --name NAME Set the resulting package name
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-init; args={:?}", env::args().collect::<Vec<_>>());
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let Options { flag_bin, flag_lib, arg_path, flag_name, flag_vcs, .. } = options;
+
+ let path = &arg_path.unwrap_or_else(|| String::from("."));
+ let opts = ops::NewOptions::new(flag_vcs,
+ flag_bin,
+ flag_lib,
+ path,
+ flag_name.as_ref().map(|s| s.as_ref()));
+
+ let opts_lib = opts.lib;
+ ops::init(&opts, config)?;
+
+ config.shell().status("Created", format!("{} project",
+ if opts_lib { "library" }
+ else {"binary (application)"}))?;
+
+ Ok(())
+}
+
--- /dev/null
+use cargo::ops;
+use cargo::core::{SourceId, GitReference};
+use cargo::util::{CargoError, CliResult, Config, ToUrl};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_jobs: Option<u32>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_no_default_features: bool,
+ flag_debug: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_example: Vec<String>,
+ flag_examples: bool,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_root: Option<String>,
+ flag_list: bool,
+ flag_force: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+
+ arg_crate: Vec<String>,
+ flag_vers: Option<String>,
+ flag_version: Option<String>,
+
+ flag_git: Option<String>,
+ flag_branch: Option<String>,
+ flag_tag: Option<String>,
+ flag_rev: Option<String>,
+
+ flag_path: Option<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Install a Rust binary
+
+Usage:
+ cargo install [options] [<crate>...]
+ cargo install [options] --list
+
+Specifying what crate to install:
+ --vers VERSION Specify a version to install from crates.io
+ --version VERSION
+ --git URL Git URL to install the specified crate from
+ --branch BRANCH Branch to use when installing from git
+ --tag TAG Tag to use when installing from git
+ --rev SHA Specific commit to use when installing from git
+ --path PATH Filesystem path to local crate to install
+
+Build and install options:
+ -h, --help Print this message
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ -f, --force Force overwriting existing crates or binaries
+ --features FEATURES Space-separated list of features to activate
+ --all-features Build all available features
+ --no-default-features Do not build the `default` feature
+ --debug Build in debug mode instead of release mode
+ --bin NAME Install only the specified binary
+ --bins Install all binaries
+ --example NAME Install only the specified example
+ --examples Install all examples
+ --root DIR Directory to install packages into
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet Less output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+This command manages Cargo's local set of installed binary crates. Only packages
+which have [[bin]] targets can be installed, and all binaries are installed into
+the installation root's `bin` folder. The installation root is determined, in
+order of precedence, by `--root`, `$CARGO_INSTALL_ROOT`, the `install.root`
+configuration key, and finally the home directory (which is either
+`$CARGO_HOME` if set or `$HOME/.cargo` by default).
+
+There are multiple sources from which a crate can be installed. The default
+location is crates.io but the `--git` and `--path` flags can change this source.
+If the source contains more than one package (such as crates.io or a git
+repository with multiple crates) the `<crate>` argument is required to indicate
+which crate should be installed.
+
+Crates from crates.io can optionally specify the version they wish to install
+via the `--vers` flag, and similarly packages from git repositories can
+optionally specify the branch, tag, or revision that should be installed. If a
+crate has multiple binaries, the `--bin` argument can selectively install only
+one of them, and if you'd rather install examples the `--example` argument can
+be used as well.
+
+By default cargo will refuse to overwrite existing binaries. The `--force` flag
+enables overwriting existing binaries. Thus you can reinstall a crate with
+`cargo install --force <crate>`.
+
+As a special convenience, omitting the <crate> specification entirely will
+install the crate in the current directory. That is, `install` is equivalent to
+the more explicit `install --path .`.
+
+The `--list` option will list all installed packages (and their versions).
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let compile_opts = ops::CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: None,
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: ops::Packages::Packages(&[]),
+ mode: ops::CompileMode::Build,
+ release: !options.flag_debug,
+ filter: ops::CompileFilter::new(false,
+ &options.flag_bin, options.flag_bins,
+ &[], false,
+ &options.flag_example, options.flag_examples,
+ &[], false,
+ false),
+ message_format: ops::MessageFormat::Human,
+ target_rustc_args: None,
+ target_rustdoc_args: None,
+ };
+
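+    // Pick the installation source: a git repository when `--git` is given,
+    // a filesystem path for `--path` (or the current directory when no
+    // crate name was supplied at all), and crates.io otherwise.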
+ let source = if let Some(url) = options.flag_git {
+ let url = url.to_url()?;
+ let gitref = if let Some(branch) = options.flag_branch {
+ GitReference::Branch(branch)
+ } else if let Some(tag) = options.flag_tag {
+ GitReference::Tag(tag)
+ } else if let Some(rev) = options.flag_rev {
+ GitReference::Rev(rev)
+ } else {
+ GitReference::Branch("master".to_string())
+ };
+ SourceId::for_git(&url, gitref)?
+ } else if let Some(path) = options.flag_path {
+ SourceId::for_path(&config.cwd().join(path))?
+ } else if options.arg_crate.is_empty() {
+ SourceId::for_path(config.cwd())?
+ } else {
+ SourceId::crates_io(config)?
+ };
+
+ let krates = options.arg_crate.iter().map(|s| &s[..]).collect::<Vec<_>>();
+ let vers = match (&options.flag_vers, &options.flag_version) {
+ (&Some(_), &Some(_)) => return Err(CargoError::from("Invalid arguments.").into()),
+ (&Some(ref v), _) | (_, &Some(ref v)) => Some(v.as_ref()),
+ _ => None,
+ };
+ let root = options.flag_root.as_ref().map(|s| &s[..]);
+
+ if options.flag_list {
+ ops::install_list(root, config)?;
+ } else {
+ ops::install(root, krates, &source, vers, &compile_opts, options.flag_force)?;
+ }
+ Ok(())
+}
--- /dev/null
+use cargo;
+use cargo::util::{CliResult, CliError, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct LocateProjectFlags {
+ flag_manifest_path: Option<String>,
+}
+
+pub const USAGE: &'static str = "
+Print a JSON representation of a Cargo.toml file's location
+
+Usage:
+ cargo locate-project [options]
+
+Options:
+ --manifest-path PATH Path to the manifest to locate
+ -h, --help Print this message
+";
+
+#[derive(Serialize)]
+pub struct ProjectLocation {
+ root: String
+}
+
+pub fn execute(flags: LocateProjectFlags, config: &mut Config) -> CliResult {
+ let root = find_root_manifest_for_wd(flags.flag_manifest_path, config.cwd())?;
+
+ let string = root.to_str()
+ .ok_or_else(|| "Your project path contains \
+ characters not representable in \
+ Unicode".into())
+ .map_err(|e| CliError::new(e, 1))?;
+
+ let location = ProjectLocation { root: string.to_string() };
+ cargo::print_json(&location);
+ Ok(())
+}
--- /dev/null
+use std::io::prelude::*;
+use std::io;
+
+use cargo::ops;
+use cargo::core::{SourceId, Source};
+use cargo::sources::RegistrySource;
+use cargo::util::{CargoError, CliResult, CargoResultExt, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_host: Option<String>,
+ arg_token: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+ flag_registry: Option<String>,
+}
+
+pub const USAGE: &'static str = "
+Save an API token from the registry locally. If the token is not specified,
+it will be read from stdin.
+
+Usage:
+ cargo login [options] [<token>]
+
+Options:
+ -h, --help Print this message
+ --host HOST Host to set the token for
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+ --registry REGISTRY Registry to use
+
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ if options.flag_registry.is_some() && !config.cli_unstable().unstable_options {
+ return Err(CargoError::from("registry option is an unstable feature and requires -Zunstable-options to use.").into());
+ }
+
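+    // Without a token on the command line, determine the registry's API URL
+    // (from `--host` or the index's own config) and prompt for the token on
+    // stdin.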
+ let token = match options.arg_token {
+ Some(token) => token,
+ None => {
+ let host = match options.flag_registry {
+ Some(ref _registry) => {
+ return Err(CargoError::from("token must be provided when --registry is provided.").into());
+ }
+ None => {
+ let src = SourceId::crates_io(config)?;
+ let mut src = RegistrySource::remote(&src, config);
+ src.update()?;
+ let config = src.config()?.unwrap();
+ options.flag_host.clone().unwrap_or(config.api.unwrap())
+ }
+ };
+ println!("please visit {}me and paste the API Token below", host);
+ let mut line = String::new();
+ let input = io::stdin();
+ input.lock().read_line(&mut line).chain_err(|| {
+ "failed to read stdin"
+ })?;
+ line.trim().to_string()
+ }
+ };
+
+ ops::registry_login(config, token, options.flag_registry)?;
+ Ok(())
+}
--- /dev/null
+use cargo;
+use cargo::core::Workspace;
+use cargo::ops::{output_metadata, OutputMetadataOptions};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+use cargo::util::{CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_color: Option<String>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_format_version: Option<u32>,
+ flag_manifest_path: Option<String>,
+ flag_no_default_features: bool,
+ flag_no_deps: bool,
+ flag_quiet: Option<bool>,
+ flag_verbose: u32,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Output the resolved dependencies of a project in machine-readable format,
+including the concrete versions used and any overrides.
+
+Usage:
+ cargo metadata [options]
+
+Options:
+ -h, --help Print this message
+ --features FEATURES Space-separated list of features
+ --all-features Build all available features
+ --no-default-features Do not include the `default` feature
+ --no-deps Output information only about the root package
+ and don't fetch dependencies.
+ --manifest-path PATH Path to the manifest
+ --format-version VERSION Format version
+ Valid values: 1
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let manifest = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+
+ if options.flag_format_version.is_none() {
+ config.shell().warn("please specify `--format-version` flag explicitly to \
+ avoid compatibility problems")?
+ }
+
+ let options = OutputMetadataOptions {
+ features: options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ no_deps: options.flag_no_deps,
+ version: options.flag_format_version.unwrap_or(1),
+ };
+
+ let ws = Workspace::new(&manifest, config)?;
+ let result = output_metadata(&ws, &options)?;
+ cargo::print_json(&result);
+ Ok(())
+}
--- /dev/null
+use std::env;
+
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_bin: bool,
+ flag_lib: bool,
+ arg_path: String,
+ flag_name: Option<String>,
+ flag_vcs: Option<ops::VersionControl>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Create a new cargo package at <path>
+
+Usage:
+ cargo new [options] <path>
+ cargo new -h | --help
+
+Options:
+ -h, --help Print this message
+ --vcs VCS Initialize a new repository for the given version
+ control system (git, hg, pijul, or fossil) or do not
+ initialize any version control at all (none), overriding
+ a global configuration.
+ --bin Use a binary (application) template
+ --lib Use a library template [default]
+ --name NAME Set the resulting package name, defaults to the value of <path>
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-new; args={:?}", env::args().collect::<Vec<_>>());
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let Options { flag_bin, flag_lib, arg_path, flag_name, flag_vcs, .. } = options;
+
+ let opts = ops::NewOptions::new(flag_vcs,
+ flag_bin,
+ flag_lib,
+ &arg_path,
+ flag_name.as_ref().map(|s| s.as_ref()));
+
+ let opts_lib = opts.lib;
+ ops::new(&opts, config)?;
+
+ config.shell().status("Created", format!("{} `{}` project",
+ if opts_lib { "library" }
+ else {"binary (application)"},
+ arg_path))?;
+
+ Ok(())
+}
+
--- /dev/null
+use cargo::ops;
+use cargo::util::{CargoError, CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ arg_crate: Option<String>,
+ flag_token: Option<String>,
+ flag_add: Option<Vec<String>>,
+ flag_remove: Option<Vec<String>>,
+ flag_index: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_list: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+ flag_registry: Option<String>,
+}
+
+pub const USAGE: &'static str = "
+Manage the owners of a crate on the registry
+
+Usage:
+ cargo owner [options] [<crate>]
+
+Options:
+ -h, --help Print this message
+ -a, --add LOGIN Name of a user or team to add as an owner
+ -r, --remove LOGIN Name of a user or team to remove as an owner
+ -l, --list List owners of a crate
+ --index INDEX Registry index to modify owners for
+ --token TOKEN API token to use when authenticating
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+ --registry REGISTRY Registry to use
+
+This command will modify the owners for a package on the specified registry (or
+the default one). Note that owners of a package can upload new versions and
+yank old versions. Explicitly named owners can also modify the set of owners,
+so take caution!
+
+See http://doc.crates.io/crates-io.html#cargo-owner for detailed documentation
+and troubleshooting.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let opts = ops::OwnersOptions {
+ krate: options.arg_crate,
+ token: options.flag_token,
+ index: options.flag_index,
+ to_add: options.flag_add,
+ to_remove: options.flag_remove,
+ list: options.flag_list,
+ registry: options.flag_registry,
+ };
+
+ if opts.registry.is_some() && !config.cli_unstable().unstable_options {
+ return Err(CargoError::from("registry option is an unstable feature and requires -Zunstable-options to use.").into());
+ }
+
+ ops::modify_owners(config, &opts)?;
+ Ok(())
+}
+
--- /dev/null
+use cargo::core::Workspace;
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_no_verify: bool,
+ flag_no_metadata: bool,
+ flag_list: bool,
+ flag_allow_dirty: bool,
+ flag_jobs: Option<u32>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Assemble the local package into a distributable tarball
+
+Usage:
+ cargo package [options]
+
+Options:
+ -h, --help Print this message
+ -l, --list Print files included in a package without making one
+ --no-verify Don't verify the contents by building them
+ --no-metadata Ignore warnings about a lack of human-usable metadata
+ --allow-dirty Allow dirty working directories to be packaged
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest to compile
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+ ops::package(&ws, &ops::PackageOpts {
+ config: config,
+ verify: !options.flag_no_verify,
+ list: options.flag_list,
+ check_metadata: !options.flag_no_metadata,
+ allow_dirty: options.flag_allow_dirty,
+ target: options.flag_target.as_ref().map(|t| &t[..]),
+ jobs: options.flag_jobs,
+ registry: None,
+ })?;
+ Ok(())
+}
--- /dev/null
+use cargo::core::Workspace;
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ flag_package: Option<String>,
+ arg_spec: Option<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Print a fully qualified package specification
+
+Usage:
+ cargo pkgid [options] [<spec>]
+
+Options:
+ -h, --help Print this message
+ -p SPEC, --package SPEC Argument to get the package id specifier for
+    --manifest-path PATH        Path to the manifest of the package
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+Given a <spec> argument, print out the fully qualified package id specifier.
+This command will generate an error if <spec> is ambiguous as to which package
+it refers to in the dependency graph. If no <spec> is given, then the pkgid for
+the local package is printed.
+
+This command requires that a lockfile is available and dependencies have been
+fetched.
+
+Example Package IDs
+
+    pkgid                      | name | version | url
+    ---------------------------|------|---------|----------------------
+    foo                        | foo  | *       | *
+    foo:1.2.3                  | foo  | 1.2.3   | *
+    crates.io/foo              | foo  | *       | *://crates.io/foo
+    crates.io/foo#1.2.3        | foo  | 1.2.3   | *://crates.io/foo
+    crates.io/bar#foo:1.2.3    | foo  | 1.2.3   | *://crates.io/bar
+    http://crates.io/foo#1.2.3 | foo  | 1.2.3   | http://crates.io/foo
+
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let root = find_root_manifest_for_wd(options.flag_manifest_path.clone(), config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+
+    let spec = options.arg_spec.or(options.flag_package);
+ let spec = spec.as_ref().map(|s| &s[..]);
+ let spec = ops::pkgid(&ws, spec)?;
+ println!("{}", spec);
+ Ok(())
+}
+
--- /dev/null
+use cargo::core::Workspace;
+use cargo::ops;
+use cargo::util::{CargoError, CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_index: Option<String>,
+ flag_host: Option<String>, // TODO: Deprecated, remove
+ flag_token: Option<String>,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_no_verify: bool,
+ flag_allow_dirty: bool,
+ flag_jobs: Option<u32>,
+ flag_dry_run: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+ flag_registry: Option<String>,
+}
+
+pub const USAGE: &'static str = "
+Upload a package to the registry
+
+Usage:
+ cargo publish [options]
+
+Options:
+ -h, --help Print this message
+ --index INDEX Registry index to upload the package to
+ --host HOST DEPRECATED, renamed to '--index'
+ --token TOKEN Token to use when uploading
+ --no-verify Don't verify package tarball before publish
+ --allow-dirty Allow publishing with a dirty source directory
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest of the package to publish
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --dry-run Perform all checks without uploading
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+ --registry REGISTRY Registry to publish to
+
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let Options {
+ flag_token: token,
+ flag_index: index,
+ flag_host: host, // TODO: Deprecated, remove
+ flag_manifest_path,
+ flag_no_verify: no_verify,
+ flag_allow_dirty: allow_dirty,
+ flag_jobs: jobs,
+ flag_dry_run: dry_run,
+ flag_target: target,
+ flag_registry: registry,
+ ..
+ } = options;
+
+ if registry.is_some() && !config.cli_unstable().unstable_options {
+ return Err(CargoError::from("registry option is an unstable feature and requires -Zunstable-options to use.").into());
+ }
+
+ // TODO: Deprecated
+ // remove once it has been decided --host can be removed
+ // We may instead want to repurpose the host flag, as
+ // mentioned in this issue
+ // https://github.com/rust-lang/cargo/issues/4208
+ let msg = "The flag '--host' is no longer valid.
+
+Previous versions of Cargo accepted this flag, but it is being
+deprecated. The flag is being renamed to 'index', as the flag
+wants the location of the index to which to publish. Please
+use '--index' instead.
+
+This will soon become a hard error, so it's either recommended
+to update to a fixed version or contact the upstream maintainer
+about this warning.";
+
+ let root = find_root_manifest_for_wd(flag_manifest_path.clone(), config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+ ops::publish(&ws, &ops::PublishOpts {
+ config: config,
+ token: token,
+        // TODO: Deprecated, remove. Prefer `--index`; fall back to `--host`
+        // (with a warning) only when it was given and is non-empty.
+        index: if host.as_ref().map_or(true, |h| h.is_empty()) {
+            index
+        } else {
+            config.shell().warn(&msg)?;
+            host
+        },
+ verify: !no_verify,
+ allow_dirty: allow_dirty,
+ target: target.as_ref().map(|t| &t[..]),
+ jobs: jobs,
+ dry_run: dry_run,
+ registry: registry,
+ })?;
+ Ok(())
+}
--- /dev/null
+use std::env;
+
+use cargo;
+use cargo::core::Package;
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_manifest_path: Option<String>,
+ flag_color: Option<String>,
+}
+
+pub const USAGE: &'static str = "
+Deprecated, use `cargo metadata --no-deps` instead.
+Print a JSON representation of a Cargo.toml manifest.
+
+Usage:
+ cargo read-manifest [options]
+ cargo read-manifest -h | --help
+
+Options:
+ -h, --help Print this message
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ --manifest-path PATH Path to the manifest
+ --color WHEN Coloring: auto, always, never
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-read-manifest; args={:?}",
+ env::args().collect::<Vec<_>>());
+ config.shell().set_color_choice(options.flag_color.as_ref().map(|s| &s[..]))?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+
+ let pkg = Package::for_path(&root, config)?;
+ cargo::print_json(&pkg);
+ Ok(())
+}
--- /dev/null
+use std::iter::FromIterator;
+
+use cargo::core::Workspace;
+use cargo::ops::{self, MessageFormat, Packages};
+use cargo::util::{CliResult, CliError, Config, CargoErrorKind};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_bin: Option<String>,
+ flag_example: Option<String>,
+ flag_package: Option<String>,
+ flag_jobs: Option<u32>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_no_default_features: bool,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_release: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ arg_args: Vec<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Run the main binary of the local package (src/main.rs)
+
+Usage:
+ cargo run [options] [--] [<args>...]
+
+Options:
+ -h, --help Print this message
+ --bin NAME Name of the bin target to run
+ --example NAME Name of the example target to run
+ -p SPEC, --package SPEC Package with the target to run
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --release Build artifacts in release mode, with optimizations
+ --features FEATURES Space-separated list of features to also build
+ --all-features Build all available features
+ --no-default-features Do not build the `default` feature
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest to execute
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+If neither `--bin` nor `--example` is given, then if the project only has one
+bin target it will be run. Otherwise `--bin` specifies the bin target to run,
+and `--example` specifies the example target to run. At most one of `--bin` or
+`--example` can be provided.
+
+All of the trailing arguments are passed to the binary to run. If you're passing
+arguments to both Cargo and the binary, the ones after `--` go to the binary,
+the ones before go to Cargo.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+
+ let (mut examples, mut bins) = (Vec::new(), Vec::new());
+ if let Some(s) = options.flag_bin {
+ bins.push(s);
+ }
+ if let Some(s) = options.flag_example {
+ examples.push(s);
+ }
+
+ let packages = Vec::from_iter(options.flag_package.iter().cloned());
+ let spec = Packages::Packages(&packages);
+
+ let compile_opts = ops::CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|t| &t[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: spec,
+ release: options.flag_release,
+ mode: ops::CompileMode::Build,
+ filter: if examples.is_empty() && bins.is_empty() {
+ ops::CompileFilter::Default { required_features_filterable: false, }
+ } else {
+ ops::CompileFilter::new(false,
+ &bins, false,
+ &[], false,
+ &examples, false,
+ &[], false,
+ false)
+ },
+ message_format: options.flag_message_format,
+ target_rustdoc_args: None,
+ target_rustc_args: None,
+ };
+
+ let ws = Workspace::new(&root, config)?;
+ match ops::run(&ws, &compile_opts, &options.arg_args)? {
+ None => Ok(()),
+ Some(err) => {
+ // If we never actually spawned the process then that sounds pretty
+ // bad and we always want to forward that up.
+ let exit = match err.exit {
+ Some(exit) => exit,
+ None => return Err(
+ CliError::new(CargoErrorKind::ProcessErrorKind(err).into(), 101)),
+ };
+
+            // If `-q` was passed then we suppress extra error information
+            // about a failed process; we assume the process itself printed
+            // enough about why it failed, so we don't repeat it here.
+ let exit_code = exit.code().unwrap_or(101);
+ Err(if options.flag_quiet == Some(true) {
+ CliError::code(exit_code)
+ } else {
+ CliError::new(CargoErrorKind::ProcessErrorKind(err).into(), exit_code)
+ })
+ }
+ }
+}
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops::{self, CompileOptions, CompileMode, MessageFormat, Packages};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+use cargo::util::{CliResult, CliError, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ arg_opts: Option<Vec<String>>,
+ flag_package: Option<String>,
+ flag_jobs: Option<u32>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_no_default_features: bool,
+ flag_target: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_release: bool,
+ flag_lib: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_example: Vec<String>,
+ flag_examples: bool,
+ flag_test: Vec<String>,
+ flag_tests: bool,
+ flag_bench: Vec<String>,
+ flag_benches: bool,
+ flag_all_targets: bool,
+ flag_profile: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Compile a package and all of its dependencies
+
+Usage:
+ cargo rustc [options] [--] [<opts>...]
+
+Options:
+ -h, --help Print this message
+ -p SPEC, --package SPEC Package to build
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --lib Build only this package's library
+ --bin NAME Build only the specified binary
+ --bins Build all binaries
+ --example NAME Build only the specified example
+ --examples Build all examples
+ --test NAME Build only the specified test target
+ --tests Build all tests
+ --bench NAME Build only the specified bench target
+ --benches Build all benches
+ --all-targets Build all targets (lib and bin targets by default)
+ --release Build artifacts in release mode, with optimizations
+ --profile PROFILE Profile to build the selected target for
+ --features FEATURES Features to compile for the package
+ --all-features Build all available features
+ --no-default-features Do not compile default features for the package
+ --target TRIPLE Target triple which compiles will be for
+    --manifest-path PATH        Path to the manifest to compile
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+The specified target for the current package (or package specified by SPEC if
+provided) will be compiled along with all of its dependencies. The specified
+<opts>... will all be passed to the final compiler invocation, not any of the
+dependencies. Note that the compiler will still unconditionally receive
+arguments such as -L, --extern, and --crate-type, and the specified <opts>...
+will simply be added to the compiler invocation.
+
+This command requires that only one target is being compiled. If more than one
+target is available for the current package the filters of --lib, --bin, etc,
+must be used to select which target is compiled. To pass flags to all compiler
+processes spawned by Cargo, use the $RUSTFLAGS environment variable or the
+`build.rustflags` configuration option.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-rustc; args={:?}",
+ env::args().collect::<Vec<_>>());
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path,
+ config.cwd())?;
+ let mode = match options.flag_profile.as_ref().map(|t| &t[..]) {
+ Some("dev") | None => CompileMode::Build,
+ Some("test") => CompileMode::Test,
+ Some("bench") => CompileMode::Bench,
+ Some("check") => CompileMode::Check {test: false},
+ Some(mode) => {
+ let err = format!("unknown profile: `{}`, use dev, test, check, or bench",
+ mode).into();
+ return Err(CliError::new(err, 101))
+ }
+ };
+
+ let spec = options.flag_package.map_or_else(Vec::new, |s| vec![s]);
+
+ let opts = CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|t| &t[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: Packages::Packages(&spec),
+ mode: mode,
+ release: options.flag_release,
+ filter: ops::CompileFilter::new(options.flag_lib,
+ &options.flag_bin, options.flag_bins,
+ &options.flag_test, options.flag_tests,
+ &options.flag_example, options.flag_examples,
+ &options.flag_bench, options.flag_benches,
+ options.flag_all_targets),
+ message_format: options.flag_message_format,
+ target_rustdoc_args: None,
+ target_rustc_args: options.arg_opts.as_ref().map(|a| &a[..]),
+ };
+
+ let ws = Workspace::new(&root, config)?;
+ ops::compile(&ws, &opts)?;
+ Ok(())
+}
+
--- /dev/null
+use cargo::core::Workspace;
+use cargo::ops::{self, MessageFormat, Packages};
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::{find_root_manifest_for_wd};
+
+#[derive(Deserialize)]
+pub struct Options {
+ arg_opts: Vec<String>,
+ flag_target: Option<String>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_jobs: Option<u32>,
+ flag_manifest_path: Option<String>,
+ flag_no_default_features: bool,
+ flag_open: bool,
+ flag_verbose: u32,
+ flag_release: bool,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_package: Option<String>,
+ flag_lib: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_example: Vec<String>,
+ flag_examples: bool,
+ flag_test: Vec<String>,
+ flag_tests: bool,
+ flag_bench: Vec<String>,
+ flag_benches: bool,
+ flag_all_targets: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Build a package's documentation, using specified custom flags.
+
+Usage:
+ cargo rustdoc [options] [--] [<opts>...]
+
+Options:
+ -h, --help Print this message
+ --open Opens the docs in a browser after the operation
+ -p SPEC, --package SPEC Package to document
+ -j N, --jobs N Number of parallel jobs, defaults to # of CPUs
+ --lib Build only this package's library
+ --bin NAME Build only the specified binary
+ --bins Build all binaries
+ --example NAME Build only the specified example
+ --examples Build all examples
+ --test NAME Build only the specified test target
+ --tests Build all tests
+ --bench NAME Build only the specified bench target
+ --benches Build all benches
+ --all-targets Build all targets (default)
+ --release Build artifacts in release mode, with optimizations
+ --features FEATURES Space-separated list of features to also build
+ --all-features Build all available features
+ --no-default-features Do not build the `default` feature
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest to document
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+The specified target for the current package (or package specified by SPEC if
+provided) will be documented with the specified <opts>... being passed to the
+final rustdoc invocation. Dependencies will not be documented as part of this
+command. Note that rustdoc will still unconditionally receive arguments such
+as -L, --extern, and --crate-type, and the specified <opts>... will simply be
+added to the rustdoc invocation.
+
+If the --package argument is given, then SPEC is a package id specification
+which indicates which package should be documented. If it is not given, then the
+current package is documented. For more information on SPEC and its format, see
+the `cargo help pkgid` command.
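+
+For example, to document only the library, passing rustdoc's --html-in-header
+flag through with a file of your choosing:
+
+ cargo rustdoc --lib -- --html-in-header header.html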
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path,
+ config.cwd())?;
+
+ let spec = options.flag_package.map_or_else(Vec::new, |s| vec![s]);
+
+ let doc_opts = ops::DocOptions {
+ open_result: options.flag_open,
+ compile_opts: ops::CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|t| &t[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: Packages::Packages(&spec),
+ release: options.flag_release,
+ filter: ops::CompileFilter::new(options.flag_lib,
+ &options.flag_bin, options.flag_bins,
+ &options.flag_test, options.flag_tests,
+ &options.flag_example, options.flag_examples,
+ &options.flag_bench, options.flag_benches,
+ options.flag_all_targets),
+ message_format: options.flag_message_format,
+ mode: ops::CompileMode::Doc { deps: false },
+ target_rustdoc_args: Some(&options.arg_opts),
+ target_rustc_args: None,
+ },
+ };
+
+ let ws = Workspace::new(&root, config)?;
+ ops::doc(&ws, &doc_opts)?;
+
+ Ok(())
+}
--- /dev/null
+use cargo::ops;
+use cargo::util::{CargoError, CliResult, Config};
+
+use std::cmp;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_index: Option<String>,
+ flag_host: Option<String>, // TODO: Deprecated, remove
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_limit: Option<u32>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ arg_query: Vec<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+ flag_registry: Option<String>,
+}
+
+pub const USAGE: &'static str = "
+Search packages in crates.io
+
+Usage:
+ cargo search [options] <query>...
+ cargo search [-h | --help]
+
+Options:
+ -h, --help Print this message
+ --index INDEX Registry index to search in
+ --host HOST DEPRECATED, renamed to '--index'
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --limit LIMIT Limit the number of results (default: 10, max: 100)
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+ --registry REGISTRY Registry to use
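+
+For example, to show at most five results for a query:
+
+ cargo search serde --limit 5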
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let Options {
+ flag_index: index,
+ flag_host: host, // TODO: Deprecated, remove
+ flag_limit: limit,
+ arg_query: query,
+ flag_registry: registry,
+ ..
+ } = options;
+
+ if registry.is_some() && !config.cli_unstable().unstable_options {
+ return Err(CargoError::from("registry option is an unstable feature and requires -Zunstable-options to use.").into());
+ }
+
+ // TODO: Deprecated
+ // remove once it has been decided --host can be safely removed
+ // We may instead want to repurpose the host flag, as
+ // mentioned in this issue
+ // https://github.com/rust-lang/cargo/issues/4208
+
+ let msg = "The flag '--host' is no longer valid.
+
+Previous versions of Cargo accepted this flag, but it is being
+depricated. The flag is being renamed to 'index', as the flag
+wants the location of the index in which to search. Please
+use '--index' instead.
+
+This will soon become a hard error, so it's either recommended
+to update to a fixed version or contact the upstream maintainer
+about this warning.";
+
+ let index = if host.as_ref().map_or(true, |h| h.is_empty()) {
+ index
+ } else {
+ config.shell().warn(&msg)?;
+ host
+ };
+
+ ops::search(&query.join("+"), config, index, cmp::min(100, limit.unwrap_or(10)) as u8, registry)?;
+ Ok(())
+}
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops::{self, MessageFormat, Packages};
+use cargo::util::{CliResult, CliError, Config, CargoErrorKind};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ arg_args: Vec<String>,
+ flag_features: Vec<String>,
+ flag_all_features: bool,
+ flag_jobs: Option<u32>,
+ flag_manifest_path: Option<String>,
+ flag_no_default_features: bool,
+ flag_no_run: bool,
+ flag_package: Vec<String>,
+ flag_target: Option<String>,
+ flag_lib: bool,
+ flag_doc: bool,
+ flag_bin: Vec<String>,
+ flag_bins: bool,
+ flag_example: Vec<String>,
+ flag_examples: bool,
+ flag_test: Vec<String>,
+ flag_tests: bool,
+ flag_bench: Vec<String>,
+ flag_benches: bool,
+ flag_all_targets: bool,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_message_format: MessageFormat,
+ flag_release: bool,
+ flag_no_fail_fast: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ flag_all: bool,
+ flag_exclude: Vec<String>,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Execute all unit and integration tests of a local package
+
+Usage:
+ cargo test [options] [--] [<args>...]
+
+Options:
+ -h, --help Print this message
+ --lib Test only this package's library
+ --doc Test only this library's documentation
+ --bin NAME ... Test only the specified binary
+ --bins Test all binaries
+ --example NAME ... Check that the specified examples compile
+ --examples Check that all examples compile
+ --test NAME ... Test only the specified test target
+ --tests Test all tests
+ --bench NAME ... Test only the specified bench target
+ --benches Test all benches
+ --all-targets Test all targets (default)
+ --no-run Compile, but don't run tests
+ -p SPEC, --package SPEC ... Package to run tests for
+ --all Test all packages in the workspace
+ --exclude SPEC ... Exclude packages from the test
+ -j N, --jobs N Number of parallel builds, see below for details
+ --release Build artifacts in release mode, with optimizations
+ --features FEATURES Space-separated list of features to also build
+ --all-features Build all available features
+ --no-default-features Do not build the `default` feature
+ --target TRIPLE Build for the target triple
+ --manifest-path PATH Path to the manifest to build tests for
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --message-format FMT Error format: human, json [default: human]
+ --no-fail-fast Run all tests regardless of failure
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+All of the trailing arguments are passed to the test binaries, and are
+generally used for filtering tests and configuring how they run. For
+example, this will run all tests with `foo` in their name:
+
+ cargo test foo
+
+If the --package argument is given, then SPEC is a package id specification
+which indicates which package should be tested. If it is not given, then the
+current package is tested. For more information on SPEC and its format, see the
+`cargo help pkgid` command.
+
+All packages in the workspace are tested if the `--all` flag is supplied. The
+`--all` flag is automatically assumed for a virtual manifest.
+Note that `--exclude` has to be specified in conjunction with the `--all` flag.
+
+The --jobs argument affects the building of the test executable but does
+not affect how many jobs are used when running the tests. The default value
+for the --jobs argument is the number of CPUs. If you want to control the
+number of simultaneous running test cases, pass the `--test-threads` option
+to the test binaries:
+
+ cargo test -- --test-threads=1
+
+Compilation can be configured via the `test` profile in the manifest.
+
+By default the Rust test harness hides output from test execution to
+keep results readable. Test output can be recovered (e.g. for debugging)
+by passing `--nocapture` to the test binaries:
+
+ cargo test -- --nocapture
+
+To get the list of all options available for the test binaries use this:
+
+ cargo test -- --help
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-test; args={:?}",
+ env::args().collect::<Vec<_>>());
+
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+ let ws = Workspace::new(&root, config)?;
+
+ let empty = Vec::new();
+ let (mode, filter);
+ if options.flag_doc {
+ mode = ops::CompileMode::Doctest;
+ filter = ops::CompileFilter::new(true, &empty, false, &empty, false,
+ &empty, false, &empty, false,
+ false);
+ } else {
+ mode = ops::CompileMode::Test;
+ filter = ops::CompileFilter::new(options.flag_lib,
+ &options.flag_bin, options.flag_bins,
+ &options.flag_test, options.flag_tests,
+ &options.flag_example, options.flag_examples,
+ &options.flag_bench, options.flag_benches,
+ options.flag_all_targets);
+ }
+
+ let spec = Packages::from_flags(ws.is_virtual(),
+ options.flag_all,
+ &options.flag_exclude,
+ &options.flag_package)?;
+
+ let ops = ops::TestOptions {
+ no_run: options.flag_no_run,
+ no_fail_fast: options.flag_no_fail_fast,
+ only_doc: options.flag_doc,
+ compile_opts: ops::CompileOptions {
+ config: config,
+ jobs: options.flag_jobs,
+ target: options.flag_target.as_ref().map(|s| &s[..]),
+ features: &options.flag_features,
+ all_features: options.flag_all_features,
+ no_default_features: options.flag_no_default_features,
+ spec: spec,
+ release: options.flag_release,
+ mode: mode,
+ filter: filter,
+ message_format: options.flag_message_format,
+ target_rustdoc_args: None,
+ target_rustc_args: None,
+ },
+ };
+
+ let err = ops::run_tests(&ws, &ops, &options.arg_args)?;
+ match err {
+ None => Ok(()),
+ Some(err) => {
+ Err(match err.exit.as_ref().and_then(|e| e.code()) {
+ Some(i) => CliError::new(err.hint().into(), i),
+ None => CliError::new(CargoErrorKind::CargoTestErrorKind(err).into(), 101),
+ })
+ }
+ }
+}
--- /dev/null
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_bin: Vec<String>,
+ flag_root: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+
+ arg_spec: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Remove a Rust binary
+
+Usage:
+ cargo uninstall [options] <spec>...
+ cargo uninstall (-h | --help)
+
+Options:
+ -h, --help Print this message
+ --root DIR Directory to uninstall packages from
+ --bin NAME Only uninstall the binary NAME
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet Less output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+The argument SPEC is a package id specification (see `cargo help pkgid`) to
+specify which crate should be uninstalled. By default all binaries are
+uninstalled for a crate, but the `--bin` flag can be used to uninstall
+only particular binaries.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ let root = options.flag_root.as_ref().map(|s| &s[..]);
+ let specs = options.arg_spec.iter().map(|s| &s[..]).collect::<Vec<_>>();
+
+ ops::uninstall(root, specs, &options.flag_bin, config)?;
+ Ok(())
+}
+
--- /dev/null
+use std::env;
+
+use cargo::core::Workspace;
+use cargo::ops;
+use cargo::util::{CliResult, Config};
+use cargo::util::important_paths::find_root_manifest_for_wd;
+
+#[derive(Deserialize)]
+pub struct Options {
+ flag_package: Vec<String>,
+ flag_aggressive: bool,
+ flag_precise: Option<String>,
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Update dependencies as recorded in the local lock file.
+
+Usage:
+ cargo update [options]
+
+Options:
+ -h, --help Print this message
+ -p SPEC, --package SPEC ... Package to update
+ --aggressive Force updating all dependencies of <name> as well
+ --precise PRECISE Update a single dependency to exactly PRECISE
+ --manifest-path PATH Path to the crate's manifest
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+
+This command requires that a `Cargo.lock` already exists as generated by
+`cargo build` or related commands.
+
+If SPEC is given, then a conservative update of the lockfile will be
+performed. This means that only the dependency specified by SPEC will be
+updated. Its transitive dependencies will be updated only if SPEC cannot be
+updated without updating dependencies. All other dependencies will remain
+locked at their currently recorded versions.
+
+If PRECISE is specified, then --aggressive must not also be specified. The
+argument PRECISE is a string representing a precise revision that the package
+being updated should be updated to. For example, if the package comes from a git
+repository, then PRECISE would be the exact revision that the repository should
+be updated to.
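+
+For example, to update only a registry dependency named foo to an exact
+version (assuming that version exists for the package):
+
+ cargo update -p foo --precise 1.2.3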
+
+If SPEC is not given, then all dependencies will be re-resolved and
+updated.
+
+For more information about package id specifications, see `cargo help pkgid`.
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-update; args={:?}", env::args().collect::<Vec<_>>());
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+ let root = find_root_manifest_for_wd(options.flag_manifest_path, config.cwd())?;
+
+ let update_opts = ops::UpdateOptions {
+ aggressive: options.flag_aggressive,
+ precise: options.flag_precise.as_ref().map(|s| &s[..]),
+ to_update: &options.flag_package,
+ config: config,
+ };
+
+ let ws = Workspace::new(&root, config)?;
+ ops::update_lockfile(&ws, &update_opts)?;
+ Ok(())
+}
--- /dev/null
+use std::collections::HashMap;
+use std::fs::File;
+use std::io::prelude::*;
+use std::process;
+
+use cargo;
+use cargo::util::important_paths::{find_root_manifest_for_wd};
+use cargo::util::{CliResult, Config};
+use serde_json;
+use toml;
+
+#[derive(Deserialize)]
+pub struct Flags {
+ flag_manifest_path: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+}
+
+pub const USAGE: &'static str = "
+Check correctness of crate manifest
+
+Usage:
+ cargo verify-project [options]
+ cargo verify-project -h | --help
+
+Options:
+ -h, --help Print this message
+ --manifest-path PATH Path to the manifest to verify
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
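+
+The result is printed to stdout as a single JSON object, e.g.
+{\"success\":\"true\"} when the manifest is valid. On failure an object
+describing the problem (e.g. {\"invalid\":\"invalid-format\"}) is printed
+instead and Cargo exits with code 1.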
+";
+
+pub fn execute(args: Flags, config: &mut Config) -> CliResult {
+ config.configure(args.flag_verbose,
+ args.flag_quiet,
+ &args.flag_color,
+ args.flag_frozen,
+ args.flag_locked,
+ &args.flag_z)?;
+
+ let mut contents = String::new();
+ let filename = args.flag_manifest_path.unwrap_or_else(|| "Cargo.toml".into());
+ let filename = match find_root_manifest_for_wd(Some(filename), config.cwd()) {
+ Ok(manifest_path) => manifest_path,
+ Err(e) => fail("invalid", &e.to_string()),
+ };
+
+ let file = File::open(&filename);
+ match file.and_then(|mut f| f.read_to_string(&mut contents)) {
+ Ok(_) => {},
+ Err(e) => fail("invalid", &format!("error reading file: {}", e))
+ };
+ if contents.parse::<toml::Value>().is_err() {
+ fail("invalid", "invalid-format");
+ }
+
+ let mut h = HashMap::new();
+ h.insert("success".to_string(), "true".to_string());
+ cargo::print_json(&h);
+ Ok(())
+}
+
+fn fail(reason: &str, value: &str) -> ! {
+ let mut h = HashMap::new();
+ h.insert(reason.to_string(), value.to_string());
+ println!("{}", serde_json::to_string(&h).unwrap());
+ process::exit(1)
+}
--- /dev/null
+use std::env;
+
+use cargo;
+use cargo::util::{CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options;
+
+pub const USAGE: &'static str = "
+Show version information
+
+Usage:
+ cargo version [options]
+
+Options:
+ -h, --help Print this message
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ --color WHEN Coloring: auto, always, never
+";
+
+pub fn execute(_: Options, _: &mut Config) -> CliResult {
+ debug!("executing; cmd=cargo-version; args={:?}", env::args().collect::<Vec<_>>());
+
+ println!("{}", cargo::version());
+
+ Ok(())
+}
--- /dev/null
+use cargo::ops;
+use cargo::util::{CargoError, CliResult, Config};
+
+#[derive(Deserialize)]
+pub struct Options {
+ arg_crate: Option<String>,
+ flag_token: Option<String>,
+ flag_vers: Option<String>,
+ flag_index: Option<String>,
+ flag_verbose: u32,
+ flag_quiet: Option<bool>,
+ flag_color: Option<String>,
+ flag_undo: bool,
+ flag_frozen: bool,
+ flag_locked: bool,
+ #[serde(rename = "flag_Z")]
+ flag_z: Vec<String>,
+ flag_registry: Option<String>,
+}
+
+pub const USAGE: &'static str = "
+Remove a pushed crate from the index
+
+Usage:
+ cargo yank [options] [<crate>]
+
+Options:
+ -h, --help Print this message
+ --vers VERSION The version to yank or un-yank
+ --undo Undo a yank, putting a version back into the index
+ --index INDEX Registry index to yank from
+ --token TOKEN API token to use when authenticating
+ -v, --verbose ... Use verbose output (-vv very verbose/build.rs output)
+ -q, --quiet No output printed to stdout
+ --color WHEN Coloring: auto, always, never
+ --frozen Require Cargo.lock and cache are up to date
+ --locked Require Cargo.lock is up to date
+ -Z FLAG ... Unstable (nightly-only) flags to Cargo
+ --registry REGISTRY Registry to use
+
+The yank command removes a previously pushed crate's version from the server's
+index. This command does not delete any data, and the crate will still be
+available for download via the registry's download link.
+
+Note that existing crates locked to a yanked version will still be able to
+download the yanked version to use it. Cargo will, however, not allow any new
+crates to be locked to any yanked version.
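+
+For example, to yank version 1.0.1 of a crate named foo, then undo the yank:
+
+ cargo yank foo --vers 1.0.1
+ cargo yank foo --vers 1.0.1 --undo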
+";
+
+pub fn execute(options: Options, config: &mut Config) -> CliResult {
+ config.configure(options.flag_verbose,
+ options.flag_quiet,
+ &options.flag_color,
+ options.flag_frozen,
+ options.flag_locked,
+ &options.flag_z)?;
+
+ if options.flag_registry.is_some() && !config.cli_unstable().unstable_options {
+ return Err(CargoError::from("registry option is an unstable feature and requires -Zunstable-options to use.").into());
+ }
+
+ ops::yank(config,
+ options.arg_crate,
+ options.flag_vers,
+ options.flag_token,
+ options.flag_index,
+ options.flag_undo,
+ options.flag_registry)?;
+ Ok(())
+}
+
--- /dev/null
+use std::fmt;
+use std::rc::Rc;
+use std::str::FromStr;
+
+use semver::VersionReq;
+use semver::ReqParseError;
+use serde::ser;
+
+use core::{SourceId, Summary, PackageId};
+use util::{Cfg, CfgExpr, Config};
+use util::errors::{CargoResult, CargoResultExt, CargoError};
+
+/// Information about a dependency requested by a Cargo manifest.
+/// Cheap to copy.
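+/// Cloning is cheap because the data lives behind an `Rc`: a clone only bumps
+/// a reference count, and the setters below use `Rc::make_mut` to copy the
+/// data on write only when it is actually shared.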
+#[derive(PartialEq, Clone, Debug)]
+pub struct Dependency {
+ inner: Rc<Inner>,
+}
+
+/// The data underlying a Dependency.
+#[derive(PartialEq, Clone, Debug)]
+struct Inner {
+ name: String,
+ source_id: SourceId,
+ req: VersionReq,
+ specified_req: bool,
+ kind: Kind,
+ only_match_name: bool,
+
+ optional: bool,
+ default_features: bool,
+ features: Vec<String>,
+
+ // This dependency should be used only for this platform.
+ // `None` means *all platforms*.
+ platform: Option<Platform>,
+}
+
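+/// The platform a dependency applies to: either the name of a target triple
+/// (e.g. `x86_64-unknown-linux-gnu`) or a `cfg` expression such as
+/// `cfg(target_os = "linux")`. See the `FromStr` impl below for parsing.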
+#[derive(Clone, Debug, PartialEq)]
+pub enum Platform {
+ Name(String),
+ Cfg(CfgExpr),
+}
+
+#[derive(Serialize)]
+struct SerializedDependency<'a> {
+ name: &'a str,
+ source: &'a SourceId,
+ req: String,
+ kind: Kind,
+
+ optional: bool,
+ uses_default_features: bool,
+ features: &'a [String],
+ target: Option<&'a Platform>,
+}
+
+impl ser::Serialize for Dependency {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ SerializedDependency {
+ name: self.name(),
+ source: self.source_id(),
+ req: self.version_req().to_string(),
+ kind: self.kind(),
+ optional: self.is_optional(),
+ uses_default_features: self.uses_default_features(),
+ features: self.features(),
+ target: self.platform(),
+ }.serialize(s)
+ }
+}
+
+#[derive(PartialEq, Clone, Debug, Copy)]
+pub enum Kind {
+ Normal,
+ Development,
+ Build,
+}
+
+fn parse_req_with_deprecated(req: &str,
+ extra: Option<(&PackageId, &Config)>)
+ -> CargoResult<VersionReq> {
+ match VersionReq::parse(req) {
+ Err(e) => {
+ let (inside, config) = match extra {
+ Some(pair) => pair,
+ None => return Err(e.into()),
+ };
+ match e {
+ ReqParseError::DeprecatedVersionRequirement(requirement) => {
+ let msg = format!("\
+parsed version requirement `{}` is no longer valid
+
+Previous versions of Cargo accepted this malformed requirement,
+but it is being deprecated. This was found when parsing the manifest
+of {} {}, and the correct version requirement is `{}`.
+
+This will soon become a hard error, so it's either recommended to
+update to a fixed version or contact the upstream maintainer about
+this warning.
+",
+req, inside.name(), inside.version(), requirement);
+ config.shell().warn(&msg)?;
+
+ Ok(requirement)
+ }
+ e => Err(e.into()),
+ }
+ },
+ Ok(v) => Ok(v),
+ }
+}
+
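+// A normal dependency serializes as `null`, while dev- and build-dependencies
+// serialize as `"dev"` and `"build"` respectively.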
+impl ser::Serialize for Kind {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ match *self {
+ Kind::Normal => None,
+ Kind::Development => Some("dev"),
+ Kind::Build => Some("build"),
+ }.serialize(s)
+ }
+}
+
+impl Dependency {
+ /// Attempt to create a `Dependency` from an entry in the manifest.
+ pub fn parse(name: &str,
+ version: Option<&str>,
+ source_id: &SourceId,
+ inside: &PackageId,
+ config: &Config) -> CargoResult<Dependency> {
+ let arg = Some((inside, config));
+ let (specified_req, version_req) = match version {
+ Some(v) => (true, parse_req_with_deprecated(v, arg)?),
+ None => (false, VersionReq::any())
+ };
+
+ let mut ret = Dependency::new_override(name, source_id);
+ {
+ let ptr = Rc::make_mut(&mut ret.inner);
+ ptr.only_match_name = false;
+ ptr.req = version_req;
+ ptr.specified_req = specified_req;
+ }
+ Ok(ret)
+ }
+
+ /// Attempt to create a `Dependency` from an entry in the manifest.
+ pub fn parse_no_deprecated(name: &str,
+ version: Option<&str>,
+ source_id: &SourceId) -> CargoResult<Dependency> {
+ let (specified_req, version_req) = match version {
+ Some(v) => (true, parse_req_with_deprecated(v, None)?),
+ None => (false, VersionReq::any())
+ };
+
+ let mut ret = Dependency::new_override(name, source_id);
+ {
+ let ptr = Rc::make_mut(&mut ret.inner);
+ ptr.only_match_name = false;
+ ptr.req = version_req;
+ ptr.specified_req = specified_req;
+ }
+ Ok(ret)
+ }
+
+ pub fn new_override(name: &str, source_id: &SourceId) -> Dependency {
+ Dependency {
+ inner: Rc::new(Inner {
+ name: name.to_string(),
+ source_id: source_id.clone(),
+ req: VersionReq::any(),
+ kind: Kind::Normal,
+ only_match_name: true,
+ optional: false,
+ features: Vec::new(),
+ default_features: true,
+ specified_req: false,
+ platform: None,
+ }),
+ }
+ }
+
+ pub fn version_req(&self) -> &VersionReq {
+ &self.inner.req
+ }
+
+ pub fn name(&self) -> &str {
+ &self.inner.name
+ }
+
+ pub fn source_id(&self) -> &SourceId {
+ &self.inner.source_id
+ }
+
+ pub fn kind(&self) -> Kind {
+ self.inner.kind
+ }
+
+ pub fn specified_req(&self) -> bool {
+ self.inner.specified_req
+ }
+
+ /// If `None`, this dependency must be built for all platforms.
+ /// If `Some`, it must only be built for the specified platform.
+ pub fn platform(&self) -> Option<&Platform> {
+ self.inner.platform.as_ref()
+ }
+
+ pub fn set_kind(&mut self, kind: Kind) -> &mut Dependency {
+ Rc::make_mut(&mut self.inner).kind = kind;
+ self
+ }
+
+ /// Sets the list of features requested for the package.
+ pub fn set_features(&mut self, features: Vec<String>) -> &mut Dependency {
+ Rc::make_mut(&mut self.inner).features = features;
+ self
+ }
+
+ /// Sets whether the dependency requests default features of the package.
+ pub fn set_default_features(&mut self, default_features: bool) -> &mut Dependency {
+ Rc::make_mut(&mut self.inner).default_features = default_features;
+ self
+ }
+
+ /// Sets whether the dependency is optional.
+ pub fn set_optional(&mut self, optional: bool) -> &mut Dependency {
+ Rc::make_mut(&mut self.inner).optional = optional;
+ self
+ }
+
+ /// Set the source id for this dependency
+ pub fn set_source_id(&mut self, id: SourceId) -> &mut Dependency {
+ Rc::make_mut(&mut self.inner).source_id = id;
+ self
+ }
+
+ /// Set the version requirement for this dependency
+ pub fn set_version_req(&mut self, req: VersionReq) -> &mut Dependency {
+ Rc::make_mut(&mut self.inner).req = req;
+ self
+ }
+
+ pub fn set_platform(&mut self, platform: Option<Platform>) -> &mut Dependency {
+ Rc::make_mut(&mut self.inner).platform = platform;
+ self
+ }
+
+ /// Lock this dependency to depending on the specified package id
+ pub fn lock_to(&mut self, id: &PackageId) -> &mut Dependency {
+ assert_eq!(self.inner.source_id, *id.source_id());
+ assert!(self.inner.req.matches(id.version()));
+ self.set_version_req(VersionReq::exact(id.version()))
+ .set_source_id(id.source_id().clone())
+ }
+
+ /// Returns whether this is a "locked" dependency, basically whether it has
+ /// an exact version req.
+ pub fn is_locked(&self) -> bool {
+ // Kind of a hack to figure this out, but it works!
+ self.inner.req.to_string().starts_with('=')
+ }
+
+ /// Returns false if the dependency is only used to build the local package.
+ pub fn is_transitive(&self) -> bool {
+ match self.inner.kind {
+ Kind::Normal | Kind::Build => true,
+ Kind::Development => false,
+ }
+ }
+
+ pub fn is_build(&self) -> bool {
+ match self.inner.kind {
+ Kind::Build => true,
+ _ => false,
+ }
+ }
+
+ pub fn is_optional(&self) -> bool {
+ self.inner.optional
+ }
+
+ /// Returns true if the default features of the dependency are requested.
+ pub fn uses_default_features(&self) -> bool {
+ self.inner.default_features
+ }
+ /// Returns the list of features that are requested by the dependency.
+ pub fn features(&self) -> &[String] {
+ &self.inner.features
+ }
+
+ /// Returns true if the package (`sum`) can fulfill this dependency request.
+ pub fn matches(&self, sum: &Summary) -> bool {
+ self.matches_id(sum.package_id())
+ }
+
+ /// Returns true if the package (`sum`) can fulfill this dependency request,
+ /// ignoring whether the source id matches.
+ pub fn matches_ignoring_source(&self, sum: &Summary) -> bool {
+ self.name() == sum.package_id().name() &&
+ self.version_req().matches(sum.package_id().version())
+ }
+
+ /// Returns true if the package (`id`) can fulfill this dependency request.
+ pub fn matches_id(&self, id: &PackageId) -> bool {
+ self.inner.name == id.name() &&
+ (self.inner.only_match_name || (self.inner.req.matches(id.version()) &&
+ &self.inner.source_id == id.source_id()))
+ }
+
+ pub fn map_source(mut self, to_replace: &SourceId, replace_with: &SourceId)
+ -> Dependency {
+ if self.source_id() != to_replace {
+ self
+ } else {
+ self.set_source_id(replace_with.clone());
+ self
+ }
+ }
+}
+
+impl Platform {
+ pub fn matches(&self, name: &str, cfg: Option<&[Cfg]>) -> bool {
+ match *self {
+ Platform::Name(ref p) => p == name,
+ Platform::Cfg(ref p) => {
+ match cfg {
+ Some(cfg) => p.matches(cfg),
+ None => false,
+ }
+ }
+ }
+ }
+}
+
+impl ser::Serialize for Platform {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ self.to_string().serialize(s)
+ }
+}
+
+impl FromStr for Platform {
+ type Err = CargoError;
+
+ fn from_str(s: &str) -> CargoResult<Platform> {
+ if s.starts_with("cfg(") && s.ends_with(')') {
+ let s = &s[4..s.len()-1];
+ s.parse().map(Platform::Cfg).chain_err(|| {
+ format!("failed to parse `{}` as a cfg expression", s)
+ })
+ } else {
+ Ok(Platform::Name(s.to_string()))
+ }
+ }
+}
+
+impl fmt::Display for Platform {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match *self {
+ Platform::Name(ref n) => n.fmt(f),
+ Platform::Cfg(ref e) => write!(f, "cfg({})", e),
+ }
+ }
+}
--- /dev/null
+//! Support for nightly features in Cargo itself
+//!
+//! This file is Cargo's counterpart to `feature_gate.rs` in upstream Rust and
+//! is intended to be the avenue through which new features in Cargo are gated
+//! by default and then eventually stabilized. All known stable and unstable
+//! features are tracked in this file.
+//!
+//! If you're reading this then you're likely interested in adding a feature to
+//! Cargo, and the good news is that it shouldn't be too hard! To do this you'll
+//! want to follow these steps:
+//!
+//! 1. Add your feature. Do this by searching for "look here" in this file and
+//! expanding the macro invocation that lists all features with your new
+//! feature.
+//!
+//! 2. Find the appropriate place to place the feature gate in Cargo itself. If
+//! you're extending the manifest format you'll likely just want to modify
+//! the `Manifest::feature_gate` function, but otherwise you may wish to
+//! place the feature gate elsewhere in Cargo.
+//!
+//! 3. To actually perform the feature gate, you'll want to have code that looks
+//! like:
+//!
+//! ```rust,ignore
+//! use core::{Feature, Features};
+//!
+//! let feature = Feature::launch_into_space();
+//! package.manifest().features().require(feature).chain_err(|| {
+//! "launching Cargo into space right now is unstable and may result in \
+//! unintended damage to your codebase, use with caution"
+//! })?;
+//! ```
+//!
+//! Notably you'll notice the `require` function called with your `Feature`, and
+//! then you use `chain_err` to tack on more context for why the feature was
+//! required when the feature isn't activated.
+//!
+//! And hopefully that's it! Bear in mind, though, that this is, at the time of
+//! this writing, a very new feature in Cargo. If the process differs from this
+//! we'll be sure to update this documentation!
+
+use std::env;
+
+use util::errors::CargoResult;
+
+enum Status {
+ Stable,
+ Unstable,
+}
+
+macro_rules! features {
+ (
+ pub struct Features {
+ $([$stab:ident] $feature:ident: bool,)*
+ }
+ ) => (
+ #[derive(Default, Clone, Debug)]
+ pub struct Features {
+ $($feature: bool,)*
+ activated: Vec<String>,
+ }
+
+ impl Feature {
+ $(
+ pub fn $feature() -> &'static Feature {
+ fn get(features: &Features) -> bool {
+ features.$feature
+ }
+ static FEAT: Feature = Feature {
+ name: stringify!($feature),
+ get: get,
+ };
+ &FEAT
+ }
+ )*
+
+ fn is_enabled(&self, features: &Features) -> bool {
+ (self.get)(features)
+ }
+ }
+
+ impl Features {
+ fn status(&mut self, feature: &str) -> Option<(&mut bool, Status)> {
+ if feature.contains("_") {
+ return None
+ }
+ let feature = feature.replace("-", "_");
+ $(
+ if feature == stringify!($feature) {
+ return Some((&mut self.$feature, stab!($stab)))
+ }
+ )*
+ None
+ }
+ }
+ )
+}
+
+macro_rules! stab {
+ (stable) => (Status::Stable);
+ (unstable) => (Status::Unstable);
+}
+
+/// A listing of all features in Cargo
+///
+/// "look here"
+///
+/// This is the macro that lists all stable and unstable features in Cargo.
+/// You'll want to add to this macro whenever you add a feature to Cargo, also
+/// following the directions above.
+///
+/// Note that all feature names here are valid Rust identifiers, but the `_`
+/// character is translated to `-` when specified in the `cargo-features`
+/// manifest entry in `Cargo.toml`.
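+///
+/// For example, the `alternative_registries` feature declared below is enabled
+/// in a manifest with `cargo-features = ["alternative-registries"]`.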
+features! {
+ pub struct Features {
+
+ // A dummy feature that doesn't actually gate anything, but it's used in
+ // testing to ensure that we can enable stable features.
+ [stable] test_dummy_stable: bool,
+
+ // A dummy feature that gates the usage of the `im-a-teapot` manifest
+ // entry. This is basically just intended for tests.
+ [unstable] test_dummy_unstable: bool,
+
+ // Downloading packages from alternative registry indexes.
+ [unstable] alternative_registries: bool,
+ }
+}
+
+pub struct Feature {
+ name: &'static str,
+ get: fn(&Features) -> bool,
+}
+
+impl Features {
+ pub fn new(features: &[String],
+ warnings: &mut Vec<String>) -> CargoResult<Features> {
+ let mut ret = Features::default();
+ for feature in features {
+ ret.add(feature, warnings)?;
+ ret.activated.push(feature.to_string());
+ }
+ Ok(ret)
+ }
+
+ fn add(&mut self, feature: &str, warnings: &mut Vec<String>) -> CargoResult<()> {
+ let (slot, status) = match self.status(feature) {
+ Some(p) => p,
+ None => bail!("unknown cargo feature `{}`", feature),
+ };
+
+ if *slot {
+ bail!("the cargo feature `{}` has already been activated", feature);
+ }
+
+ match status {
+ Status::Stable => {
+ let warning = format!("the cargo feature `{}` is now stable \
+ and is no longer necessary to be listed \
+ in the manifest", feature);
+ warnings.push(warning);
+ }
+ Status::Unstable if !nightly_features_allowed() => {
+ bail!("the cargo feature `{}` requires a nightly version of \
+ Cargo, but this is the `{}` channel",
+ feature,
+ channel())
+ }
+ Status::Unstable => {}
+ }
+
+ *slot = true;
+
+ Ok(())
+ }
+
+ pub fn activated(&self) -> &[String] {
+ &self.activated
+ }
+
+ pub fn require(&self, feature: &Feature) -> CargoResult<()> {
+ if feature.is_enabled(self) {
+ Ok(())
+ } else {
+ let feature = feature.name.replace("_", "-");
+ let mut msg = format!("feature `{}` is required", feature);
+
+ if nightly_features_allowed() {
+ let s = format!("\n\nconsider adding `cargo-features = [\"{0}\"]` \
+ to the manifest", feature);
+ msg.push_str(&s);
+ } else {
+ let s = format!("\n\n\
+ this Cargo does not support nightly features, but if you\n\
+ switch to nightly channel you can add\n\
+ `cargo-features = [\"{}\"]` to enable this feature",
+ feature);
+ msg.push_str(&s);
+ }
+ bail!("{}", msg);
+ }
+ }
+}
+
+/// A parsed representation of all unstable flags that Cargo accepts.
+///
+/// Cargo, like `rustc`, accepts a suite of `-Z` flags which are intended for
+/// gating unstable functionality in Cargo. These flags are only available on
+/// the nightly channel of Cargo.
+///
+/// This struct doesn't have quite the same convenience macro that the features
+/// above have, but the procedure for adding a new unstable flag should still
+/// be relatively straightforward:
+///
+/// 1. First, add a field to this `CliUnstable` structure. All flags are allowed
+/// to have a value as the `-Z` flags are either of the form `-Z foo` or
+/// `-Z foo=bar`, and it's up to you how to parse `bar`.
+///
+/// 2. Add an arm to the match statement in `CliUnstable::add` below to match on
+/// your new flag. The key (`k`) is what you're matching on and the value is
+/// in `v`.
+///
+/// 3. (optional) Add a new parsing function to parse your datatype. As of now
+/// there's an example for `bool`, but more can be added!
+///
+/// 4. In Cargo use `config.cli_unstable()` to get a reference to this structure
+/// and then test for your flag or your value and act accordingly.
+///
+/// If you have any trouble with this, please let us know!
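+///
+/// As a sketch, a hypothetical `-Z my-flag=yes` flag would add a
+/// `pub my_flag: bool` field to this struct, an arm in `add` below:
+///
+/// ```rust,ignore
+/// "my-flag" => self.my_flag = parse_bool(v)?,
+/// ```
+///
+/// and a check at the point of use:
+///
+/// ```rust,ignore
+/// if config.cli_unstable().my_flag {
+///     // unstable behavior goes here
+/// }
+/// ```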
+#[derive(Default, Debug)]
+pub struct CliUnstable {
+ pub print_im_a_teapot: bool,
+ pub unstable_options: bool,
+}
+
+impl CliUnstable {
+ pub fn parse(&mut self, flags: &[String]) -> CargoResult<()> {
+ if !flags.is_empty() && !nightly_features_allowed() {
+ bail!("the `-Z` flag is only accepted on the nightly channel of Cargo")
+ }
+ for flag in flags {
+ self.add(flag)?;
+ }
+ Ok(())
+ }
+
+ fn add(&mut self, flag: &str) -> CargoResult<()> {
+ let mut parts = flag.splitn(2, '=');
+ let k = parts.next().unwrap();
+ let v = parts.next();
+
+ fn parse_bool(value: Option<&str>) -> CargoResult<bool> {
+ match value {
+ None |
+ Some("yes") => Ok(true),
+ Some("no") => Ok(false),
+ Some(s) => bail!("expected `no` or `yes`, found: {}", s),
+ }
+ }
+
+ match k {
+ "print-im-a-teapot" => self.print_im_a_teapot = parse_bool(v)?,
+ "unstable-options" => self.unstable_options = true,
+ _ => bail!("unknown `-Z` flag specified: {}", k),
+ }
+
+ Ok(())
+ }
+}
+
+fn channel() -> String {
+ env::var("__CARGO_TEST_CHANNEL_OVERRIDE_DO_NOT_USE_THIS").unwrap_or_else(|_| {
+ ::version().cfg_info.map(|c| c.release_channel)
+ .unwrap_or_else(|| String::from("dev"))
+ })
+}
+
+fn nightly_features_allowed() -> bool {
+ match &channel()[..] {
+ "nightly" | "dev" => true,
+ _ => false,
+ }
+}
--- /dev/null
+use std::collections::{HashMap, BTreeMap};
+use std::fmt;
+use std::path::{PathBuf, Path};
+use std::rc::Rc;
+
+use semver::Version;
+use serde::ser;
+use url::Url;
+
+use core::{Dependency, PackageId, Summary, SourceId, PackageIdSpec};
+use core::{WorkspaceConfig, Features, Feature};
+use util::Config;
+use util::toml::TomlManifest;
+use util::errors::*;
+
+pub enum EitherManifest {
+ Real(Manifest),
+ Virtual(VirtualManifest),
+}
+
+/// Contains all the information about a package, as loaded from a Cargo.toml.
+#[derive(Clone, Debug)]
+pub struct Manifest {
+ summary: Summary,
+ targets: Vec<Target>,
+ links: Option<String>,
+ warnings: Vec<DelayedWarning>,
+ exclude: Vec<String>,
+ include: Vec<String>,
+ metadata: ManifestMetadata,
+ profiles: Profiles,
+ publish: Option<Vec<String>>,
+ replace: Vec<(PackageIdSpec, Dependency)>,
+ patch: HashMap<Url, Vec<Dependency>>,
+ workspace: WorkspaceConfig,
+ original: Rc<TomlManifest>,
+ features: Features,
+ im_a_teapot: Option<bool>,
+}
+
+/// When parsing `Cargo.toml`, some warnings should be silenced
+/// if the manifest comes from a dependency. `DelayedWarning`
+/// allows this delayed emission of warnings.
+#[derive(Clone, Debug)]
+pub struct DelayedWarning {
+ pub message: String,
+ pub is_critical: bool
+}
+
+#[derive(Clone, Debug)]
+pub struct VirtualManifest {
+ replace: Vec<(PackageIdSpec, Dependency)>,
+ patch: HashMap<Url, Vec<Dependency>>,
+ workspace: WorkspaceConfig,
+ profiles: Profiles,
+}
+
+/// General metadata about a package which is just blindly uploaded to the
+/// registry.
+///
+/// Note that many of these fields, such as the homepage, repository,
+/// documentation, or license, can contain invalid values. These fields are not
+/// validated by Cargo itself; it is up to the registry to validate them when
+/// the package is uploaded. Cargo will accept any valid TOML for these values.
+#[derive(PartialEq, Clone, Debug)]
+pub struct ManifestMetadata {
+ pub authors: Vec<String>,
+ pub keywords: Vec<String>,
+ pub categories: Vec<String>,
+ pub license: Option<String>,
+ pub license_file: Option<String>,
+ pub description: Option<String>, // not markdown
+ pub readme: Option<String>, // file, not contents
+ pub homepage: Option<String>, // url
+ pub repository: Option<String>, // url
+ pub documentation: Option<String>, // url
+ pub badges: BTreeMap<String, BTreeMap<String, String>>,
+}
+
+#[derive(Debug, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
+pub enum LibKind {
+ Lib,
+ Rlib,
+ Dylib,
+ ProcMacro,
+ Other(String),
+}
+
+impl LibKind {
+ pub fn from_str(string: &str) -> LibKind {
+ match string {
+ "lib" => LibKind::Lib,
+ "rlib" => LibKind::Rlib,
+ "dylib" => LibKind::Dylib,
+ "proc-macro" => LibKind::ProcMacro,
+ s => LibKind::Other(s.to_string()),
+ }
+ }
+
+ /// Returns the argument suitable for `--crate-type` to pass to rustc.
+ pub fn crate_type(&self) -> &str {
+ match *self {
+ LibKind::Lib => "lib",
+ LibKind::Rlib => "rlib",
+ LibKind::Dylib => "dylib",
+ LibKind::ProcMacro => "proc-macro",
+ LibKind::Other(ref s) => s,
+ }
+ }
+
+ pub fn linkable(&self) -> bool {
+ match *self {
+ LibKind::Lib |
+ LibKind::Rlib |
+ LibKind::Dylib |
+ LibKind::ProcMacro => true,
+ LibKind::Other(..) => false,
+ }
+ }
+}
+
+#[derive(Debug, Clone, Hash, PartialEq, Eq, PartialOrd, Ord)]
+pub enum TargetKind {
+ Lib(Vec<LibKind>),
+ Bin,
+ Test,
+ Bench,
+ ExampleLib(Vec<LibKind>),
+ ExampleBin,
+ CustomBuild,
+}
+
+impl ser::Serialize for TargetKind {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ use self::TargetKind::*;
+ match *self {
+ Lib(ref kinds) => kinds.iter().map(LibKind::crate_type).collect(),
+ Bin => vec!["bin"],
+ ExampleBin | ExampleLib(_) => vec!["example"],
+ Test => vec!["test"],
+ CustomBuild => vec!["custom-build"],
+ Bench => vec!["bench"]
+ }.serialize(s)
+ }
+}
+
+
+// Note that most of the fields here are skipped when serializing because we
+// don't want to export them just yet (they would become a public API of
+// Cargo). Others though are definitely needed!
+#[derive(Clone, PartialEq, Eq, Debug, Hash, Serialize)]
+pub struct Profile {
+ pub opt_level: String,
+ #[serde(skip_serializing)]
+ pub lto: bool,
+ #[serde(skip_serializing)]
+ pub codegen_units: Option<u32>, // None = use rustc default
+ #[serde(skip_serializing)]
+ pub rustc_args: Option<Vec<String>>,
+ #[serde(skip_serializing)]
+ pub rustdoc_args: Option<Vec<String>>,
+ pub debuginfo: Option<u32>,
+ pub debug_assertions: bool,
+ pub overflow_checks: bool,
+ #[serde(skip_serializing)]
+ pub rpath: bool,
+ pub test: bool,
+ #[serde(skip_serializing)]
+ pub doc: bool,
+ #[serde(skip_serializing)]
+ pub run_custom_build: bool,
+ #[serde(skip_serializing)]
+ pub check: bool,
+ #[serde(skip_serializing)]
+ pub panic: Option<String>,
+}
+
+#[derive(Default, Clone, Debug, PartialEq, Eq)]
+pub struct Profiles {
+ pub release: Profile,
+ pub dev: Profile,
+ pub test: Profile,
+ pub test_deps: Profile,
+ pub bench: Profile,
+ pub bench_deps: Profile,
+ pub doc: Profile,
+ pub custom_build: Profile,
+ pub check: Profile,
+ pub check_test: Profile,
+ pub doctest: Profile,
+}
+
+/// Information about a binary, a library, an example, etc. that is part of the
+/// package.
+#[derive(Clone, Hash, PartialEq, Eq, Debug)]
+pub struct Target {
+ kind: TargetKind,
+ name: String,
+ src_path: PathBuf,
+ required_features: Option<Vec<String>>,
+ tested: bool,
+ benched: bool,
+ doc: bool,
+ doctest: bool,
+ harness: bool, // whether to use the test harness (--test)
+ for_host: bool, // whether this target is built for the host (e.g. build scripts, proc macros)
+}
+
+#[derive(Serialize)]
+struct SerializedTarget<'a> {
+ /// Is this a `--bin bin`, `--lib`, `--example ex`?
+ /// Serialized as a list of strings for historical reasons.
+ kind: &'a TargetKind,
+ /// Corresponds to `--crate-type` compiler attribute.
+ /// See https://doc.rust-lang.org/reference.html#linkage
+ crate_types: Vec<&'a str>,
+ name: &'a str,
+ src_path: &'a PathBuf,
+}
+
+impl ser::Serialize for Target {
+ fn serialize<S: ser::Serializer>(&self, s: S) -> Result<S::Ok, S::Error> {
+ SerializedTarget {
+ kind: &self.kind,
+ crate_types: self.rustc_crate_types(),
+ name: &self.name,
+ src_path: &self.src_path,
+ }.serialize(s)
+ }
+}
+
+impl Manifest {
+ pub fn new(summary: Summary,
+ targets: Vec<Target>,
+ exclude: Vec<String>,
+ include: Vec<String>,
+ links: Option<String>,
+ metadata: ManifestMetadata,
+ profiles: Profiles,
+ publish: Option<Vec<String>>,
+ replace: Vec<(PackageIdSpec, Dependency)>,
+ patch: HashMap<Url, Vec<Dependency>>,
+ workspace: WorkspaceConfig,
+ features: Features,
+ im_a_teapot: Option<bool>,
+ original: Rc<TomlManifest>) -> Manifest {
+ Manifest {
+ summary: summary,
+ targets: targets,
+ warnings: Vec::new(),
+ exclude: exclude,
+ include: include,
+ links: links,
+ metadata: metadata,
+ profiles: profiles,
+ publish: publish,
+ replace: replace,
+ patch: patch,
+ workspace: workspace,
+ features: features,
+ original: original,
+ im_a_teapot: im_a_teapot,
+ }
+ }
+
+ pub fn dependencies(&self) -> &[Dependency] { self.summary.dependencies() }
+ pub fn exclude(&self) -> &[String] { &self.exclude }
+ pub fn include(&self) -> &[String] { &self.include }
+ pub fn metadata(&self) -> &ManifestMetadata { &self.metadata }
+ pub fn name(&self) -> &str { self.package_id().name() }
+ pub fn package_id(&self) -> &PackageId { self.summary.package_id() }
+ pub fn summary(&self) -> &Summary { &self.summary }
+ pub fn targets(&self) -> &[Target] { &self.targets }
+ pub fn version(&self) -> &Version { self.package_id().version() }
+ pub fn warnings(&self) -> &[DelayedWarning] { &self.warnings }
+ pub fn profiles(&self) -> &Profiles { &self.profiles }
+ pub fn publish(&self) -> &Option<Vec<String>> { &self.publish }
+ pub fn replace(&self) -> &[(PackageIdSpec, Dependency)] { &self.replace }
+ pub fn original(&self) -> &TomlManifest { &self.original }
+ pub fn patch(&self) -> &HashMap<Url, Vec<Dependency>> { &self.patch }
+ pub fn links(&self) -> Option<&str> {
+ self.links.as_ref().map(|s| &s[..])
+ }
+
+ pub fn workspace_config(&self) -> &WorkspaceConfig {
+ &self.workspace
+ }
+
+ pub fn features(&self) -> &Features {
+ &self.features
+ }
+
+ pub fn add_warning(&mut self, s: String) {
+ self.warnings.push(DelayedWarning { message: s, is_critical: false })
+ }
+
+ pub fn add_critical_warning(&mut self, s: String) {
+ self.warnings.push(DelayedWarning { message: s, is_critical: true })
+ }
+
+ pub fn set_summary(&mut self, summary: Summary) {
+ self.summary = summary;
+ }
+
+ pub fn map_source(self, to_replace: &SourceId, replace_with: &SourceId)
+ -> Manifest {
+ Manifest {
+ summary: self.summary.map_source(to_replace, replace_with),
+ ..self
+ }
+ }
+
+ pub fn feature_gate(&self) -> CargoResult<()> {
+ if self.im_a_teapot.is_some() {
+ self.features.require(Feature::test_dummy_unstable()).chain_err(|| {
+ "the `im-a-teapot` manifest key is unstable and may not work \
+ properly in England"
+ })?;
+ }
+
+ Ok(())
+ }
+
+ // Just a helper function to test out `-Z` flags on Cargo
+ pub fn print_teapot(&self, config: &Config) {
+ if let Some(teapot) = self.im_a_teapot {
+ if config.cli_unstable().print_im_a_teapot {
+ println!("im-a-teapot = {}", teapot);
+ }
+ }
+ }
+}
+
+impl VirtualManifest {
+ pub fn new(replace: Vec<(PackageIdSpec, Dependency)>,
+ patch: HashMap<Url, Vec<Dependency>>,
+ workspace: WorkspaceConfig,
+ profiles: Profiles) -> VirtualManifest {
+ VirtualManifest {
+ replace: replace,
+ patch: patch,
+ workspace: workspace,
+ profiles: profiles,
+ }
+ }
+
+ pub fn replace(&self) -> &[(PackageIdSpec, Dependency)] {
+ &self.replace
+ }
+
+ pub fn patch(&self) -> &HashMap<Url, Vec<Dependency>> {
+ &self.patch
+ }
+
+ pub fn workspace_config(&self) -> &WorkspaceConfig {
+ &self.workspace
+ }
+
+ pub fn profiles(&self) -> &Profiles {
+ &self.profiles
+ }
+}
+
+impl Target {
+ fn with_path(src_path: PathBuf) -> Target {
+ assert!(src_path.is_absolute());
+ Target {
+ kind: TargetKind::Bin,
+ name: String::new(),
+ src_path: src_path,
+ required_features: None,
+ doc: false,
+ doctest: false,
+ harness: true,
+ for_host: false,
+ tested: true,
+ benched: true,
+ }
+ }
+
+ pub fn lib_target(name: &str,
+ crate_targets: Vec<LibKind>,
+ src_path: PathBuf) -> Target {
+ Target {
+ kind: TargetKind::Lib(crate_targets),
+ name: name.to_string(),
+ doctest: true,
+ doc: true,
+ ..Target::with_path(src_path)
+ }
+ }
+
+ pub fn bin_target(name: &str, src_path: PathBuf,
+ required_features: Option<Vec<String>>) -> Target {
+ Target {
+ kind: TargetKind::Bin,
+ name: name.to_string(),
+ required_features: required_features,
+ doc: true,
+ ..Target::with_path(src_path)
+ }
+ }
+
+ /// Builds a `Target` corresponding to the `build = "build.rs"` entry.
+ pub fn custom_build_target(name: &str, src_path: PathBuf) -> Target {
+ Target {
+ kind: TargetKind::CustomBuild,
+ name: name.to_string(),
+ for_host: true,
+ benched: false,
+ tested: false,
+ ..Target::with_path(src_path)
+ }
+ }
+
+ pub fn example_target(name: &str,
+ crate_targets: Vec<LibKind>,
+ src_path: PathBuf,
+ required_features: Option<Vec<String>>) -> Target {
+ let kind = if crate_targets.is_empty() {
+ TargetKind::ExampleBin
+ } else {
+ TargetKind::ExampleLib(crate_targets)
+ };
+
+ Target {
+ kind: kind,
+ name: name.to_string(),
+ required_features: required_features,
+ benched: false,
+ ..Target::with_path(src_path)
+ }
+ }
+
+ pub fn test_target(name: &str, src_path: PathBuf,
+ required_features: Option<Vec<String>>) -> Target {
+ Target {
+ kind: TargetKind::Test,
+ name: name.to_string(),
+ required_features: required_features,
+ benched: false,
+ ..Target::with_path(src_path)
+ }
+ }
+
+ pub fn bench_target(name: &str, src_path: PathBuf,
+ required_features: Option<Vec<String>>) -> Target {
+ Target {
+ kind: TargetKind::Bench,
+ name: name.to_string(),
+ required_features: required_features,
+ tested: false,
+ ..Target::with_path(src_path)
+ }
+ }
+
+ pub fn name(&self) -> &str { &self.name }
+ pub fn crate_name(&self) -> String { self.name.replace("-", "_") }
+ pub fn src_path(&self) -> &Path { &self.src_path }
+ pub fn required_features(&self) -> Option<&Vec<String>> { self.required_features.as_ref() }
+ pub fn kind(&self) -> &TargetKind { &self.kind }
+ pub fn tested(&self) -> bool { self.tested }
+ pub fn harness(&self) -> bool { self.harness }
+ pub fn documented(&self) -> bool { self.doc }
+ pub fn for_host(&self) -> bool { self.for_host }
+ pub fn benched(&self) -> bool { self.benched }
+
+ pub fn doctested(&self) -> bool {
+ self.doctest && match self.kind {
+ TargetKind::Lib(ref kinds) => {
+ kinds.iter().any(|k| {
+ *k == LibKind::Rlib ||
+ *k == LibKind::Lib ||
+ *k == LibKind::ProcMacro
+ })
+ }
+ _ => false,
+ }
+ }
+
+ pub fn allows_underscores(&self) -> bool {
+ self.is_bin() || self.is_example() || self.is_custom_build()
+ }
+
+ pub fn is_lib(&self) -> bool {
+ match self.kind {
+ TargetKind::Lib(_) => true,
+ _ => false
+ }
+ }
+
+ pub fn is_dylib(&self) -> bool {
+ match self.kind {
+ TargetKind::Lib(ref libs) => libs.iter().any(|l| *l == LibKind::Dylib),
+ _ => false
+ }
+ }
+
+ pub fn is_cdylib(&self) -> bool {
+ let libs = match self.kind {
+ TargetKind::Lib(ref libs) => libs,
+ _ => return false
+ };
+ libs.iter().any(|l| {
+ match *l {
+ LibKind::Other(ref s) => s == "cdylib",
+ _ => false,
+ }
+ })
+ }
+
+ pub fn linkable(&self) -> bool {
+ match self.kind {
+ TargetKind::Lib(ref kinds) => {
+ kinds.iter().any(|k| k.linkable())
+ }
+ _ => false
+ }
+ }
+
+ pub fn is_bin(&self) -> bool { self.kind == TargetKind::Bin }
+
+ pub fn is_example(&self) -> bool {
+ match self.kind {
+ TargetKind::ExampleBin |
+ TargetKind::ExampleLib(..) => true,
+ _ => false
+ }
+ }
+
+ pub fn is_bin_example(&self) -> bool {
+ // Needed for --all-examples in contexts where only runnable examples make sense
+ match self.kind {
+ TargetKind::ExampleBin => true,
+ _ => false
+ }
+ }
+
+ pub fn is_test(&self) -> bool { self.kind == TargetKind::Test }
+ pub fn is_bench(&self) -> bool { self.kind == TargetKind::Bench }
+ pub fn is_custom_build(&self) -> bool { self.kind == TargetKind::CustomBuild }
+
+ /// Returns the arguments suitable for `--crate-type` to pass to rustc.
+ pub fn rustc_crate_types(&self) -> Vec<&str> {
+ match self.kind {
+ TargetKind::Lib(ref kinds) |
+ TargetKind::ExampleLib(ref kinds) => {
+ kinds.iter().map(LibKind::crate_type).collect()
+ }
+ TargetKind::CustomBuild |
+ TargetKind::Bench |
+ TargetKind::Test |
+ TargetKind::ExampleBin |
+ TargetKind::Bin => vec!["bin"],
+ }
+ }
+
+ pub fn can_lto(&self) -> bool {
+ match self.kind {
+ TargetKind::Lib(ref v) => {
+ !v.contains(&LibKind::Rlib) &&
+ !v.contains(&LibKind::Dylib) &&
+ !v.contains(&LibKind::Lib)
+ }
+ _ => true,
+ }
+ }
+
+ pub fn set_tested(&mut self, tested: bool) -> &mut Target {
+ self.tested = tested;
+ self
+ }
+ pub fn set_benched(&mut self, benched: bool) -> &mut Target {
+ self.benched = benched;
+ self
+ }
+ pub fn set_doctest(&mut self, doctest: bool) -> &mut Target {
+ self.doctest = doctest;
+ self
+ }
+ pub fn set_for_host(&mut self, for_host: bool) -> &mut Target {
+ self.for_host = for_host;
+ self
+ }
+ pub fn set_harness(&mut self, harness: bool) -> &mut Target {
+ self.harness = harness;
+ self
+ }
+ pub fn set_doc(&mut self, doc: bool) -> &mut Target {
+ self.doc = doc;
+ self
+ }
+}
+
+impl fmt::Display for Target {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match self.kind {
+ TargetKind::Lib(..) => write!(f, "Target(lib)"),
+ TargetKind::Bin => write!(f, "Target(bin: {})", self.name),
+ TargetKind::Test => write!(f, "Target(test: {})", self.name),
+ TargetKind::Bench => write!(f, "Target(bench: {})", self.name),
+ TargetKind::ExampleBin |
+ TargetKind::ExampleLib(..) => write!(f, "Target(example: {})", self.name),
+ TargetKind::CustomBuild => write!(f, "Target(script)"),
+ }
+ }
+}
+
+impl Profile {
+ pub fn default_dev() -> Profile {
+ Profile {
+ debuginfo: Some(2),
+ debug_assertions: true,
+ overflow_checks: true,
+ ..Profile::default()
+ }
+ }
+
+ pub fn default_release() -> Profile {
+ Profile {
+ opt_level: "3".to_string(),
+ debuginfo: None,
+ ..Profile::default()
+ }
+ }
+
+ pub fn default_test() -> Profile {
+ Profile {
+ test: true,
+ ..Profile::default_dev()
+ }
+ }
+
+ pub fn default_bench() -> Profile {
+ Profile {
+ test: true,
+ ..Profile::default_release()
+ }
+ }
+
+ pub fn default_doc() -> Profile {
+ Profile {
+ doc: true,
+ ..Profile::default_dev()
+ }
+ }
+
+ pub fn default_custom_build() -> Profile {
+ Profile {
+ run_custom_build: true,
+ ..Profile::default_dev()
+ }
+ }
+
+ pub fn default_check() -> Profile {
+ Profile {
+ check: true,
+ ..Profile::default_dev()
+ }
+ }
+
+ pub fn default_check_test() -> Profile {
+ Profile {
+ check: true,
+ test: true,
+ ..Profile::default_dev()
+ }
+ }
+
+ pub fn default_doctest() -> Profile {
+ Profile {
+ doc: true,
+ test: true,
+ ..Profile::default_dev()
+ }
+ }
+}
+
+impl Default for Profile {
+ fn default() -> Profile {
+ Profile {
+ opt_level: "0".to_string(),
+ lto: false,
+ codegen_units: None,
+ rustc_args: None,
+ rustdoc_args: None,
+ debuginfo: None,
+ debug_assertions: false,
+ overflow_checks: false,
+ rpath: false,
+ test: false,
+ doc: false,
+ run_custom_build: false,
+ check: false,
+ panic: None,
+ }
+ }
+}
+
+impl fmt::Display for Profile {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ if self.test {
+ write!(f, "Profile(test)")
+ } else if self.doc {
+ write!(f, "Profile(doc)")
+ } else if self.run_custom_build {
+ write!(f, "Profile(run)")
+ } else if self.check {
+ write!(f, "Profile(check)")
+ } else {
+ write!(f, "Profile(build)")
+ }
+ }
+}
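+
+// Illustrative checks (a sketch, not exhaustive) of the constructors and
+// trait impls above; they rely only on items defined in this file.
+#[cfg(test)]
+mod sketch_tests {
+ use std::path::PathBuf;
+ use super::{Profile, Target};
+
+ #[test]
+ fn display_shows_target_kind_and_name() {
+ let t = Target::test_target("smoke", PathBuf::from("tests/smoke.rs"), None);
+ assert_eq!(t.to_string(), "Target(test: smoke)");
+ }
+
+ #[test]
+ fn release_profile_overrides_base_defaults() {
+ assert_eq!(Profile::default().opt_level, "0");
+ let release = Profile::default_release();
+ assert_eq!(release.opt_level, "3");
+ assert!(release.debuginfo.is_none());
+ }
+}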
--- /dev/null
+pub use self::dependency::Dependency;
+pub use self::features::{Features, Feature, CliUnstable};
+pub use self::manifest::{EitherManifest, VirtualManifest};
+pub use self::manifest::{Manifest, Target, TargetKind, Profile, LibKind, Profiles};
+pub use self::package::{Package, PackageSet};
+pub use self::package_id::PackageId;
+pub use self::package_id_spec::PackageIdSpec;
+pub use self::registry::Registry;
+pub use self::resolver::Resolve;
+pub use self::shell::{Shell, Verbosity};
+pub use self::source::{Source, SourceId, SourceMap, GitReference};
+pub use self::summary::Summary;
+pub use self::workspace::{Members, Workspace, WorkspaceConfig, WorkspaceRootConfig};
+
+pub mod source;
+pub mod package;
+pub mod package_id;
+pub mod dependency;
+pub mod manifest;
+pub mod resolver;
+pub mod summary;
+pub mod shell;
+pub mod registry;
+mod package_id_spec;
+mod workspace;
+mod features;
--- /dev/null
+use std::cell::{Ref, RefCell};
+use std::collections::{HashMap, BTreeMap};
+use std::fmt;
+use std::hash;
+use std::path::{Path, PathBuf};
+
+use semver::Version;
+use serde::ser;
+use toml;
+
+use core::{Dependency, Manifest, PackageId, SourceId, Target};
+use core::{Summary, SourceMap};
+use ops;
+use util::{Config, LazyCell, internal, lev_distance};
+use util::errors::{CargoResult, CargoResultExt};
+
+/// Information about a package that is available somewhere in the file system.
+///
+/// A package is a `Cargo.toml` file plus all the files that are part of it.
+// TODO: Is manifest_path a relic?
+#[derive(Clone, Debug)]
+pub struct Package {
+ /// The package's manifest
+ manifest: Manifest,
+ /// The path to the package's manifest
+ manifest_path: PathBuf,
+}
+
+/// A Package in a form where `Serialize` can be derived.
+#[derive(Serialize)]
+struct SerializedPackage<'a> {
+ name: &'a str,
+ version: &'a str,
+ id: &'a PackageId,
+ license: Option<&'a str>,
+ license_file: Option<&'a str>,
+ description: Option<&'a str>,
+ source: &'a SourceId,
+ dependencies: &'a [Dependency],
+ targets: &'a [Target],
+ features: &'a BTreeMap<String, Vec<String>>,
+ manifest_path: &'a str,
+}
+
+impl ser::Serialize for Package {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ let summary = self.manifest.summary();
+ let package_id = summary.package_id();
+ let manmeta = self.manifest.metadata();
+ let license = manmeta.license.as_ref().map(String::as_ref);
+ let license_file = manmeta.license_file.as_ref().map(String::as_ref);
+ let description = manmeta.description.as_ref().map(String::as_ref);
+
+ SerializedPackage {
+ name: package_id.name(),
+ version: &package_id.version().to_string(),
+ id: package_id,
+ license: license,
+ license_file: license_file,
+ description: description,
+ source: summary.source_id(),
+ dependencies: summary.dependencies(),
+ targets: self.manifest.targets(),
+ features: summary.features(),
+ manifest_path: &self.manifest_path.display().to_string(),
+ }.serialize(s)
+ }
+}
+
+impl Package {
+ /// Create a package from a manifest and its location
+ pub fn new(manifest: Manifest,
+ manifest_path: &Path) -> Package {
+ Package {
+ manifest: manifest,
+ manifest_path: manifest_path.to_path_buf(),
+ }
+ }
+
+ /// Calculate the Package from the manifest path (and cargo configuration).
+ pub fn for_path(manifest_path: &Path, config: &Config) -> CargoResult<Package> {
+ let path = manifest_path.parent().unwrap();
+ let source_id = SourceId::for_path(path)?;
+ let (pkg, _) = ops::read_package(manifest_path, &source_id, config)?;
+ Ok(pkg)
+ }
+
+ /// Get the manifest dependencies
+ pub fn dependencies(&self) -> &[Dependency] { self.manifest.dependencies() }
+ /// Get the manifest
+ pub fn manifest(&self) -> &Manifest { &self.manifest }
+ /// Get the path to the manifest
+ pub fn manifest_path(&self) -> &Path { &self.manifest_path }
+ /// Get the name of the package
+ pub fn name(&self) -> &str { self.package_id().name() }
+ /// Get the PackageId object for the package (fully defines a package)
+ pub fn package_id(&self) -> &PackageId { self.manifest.package_id() }
+ /// Get the root folder of the package
+ pub fn root(&self) -> &Path { self.manifest_path.parent().unwrap() }
+ /// Get the summary for the package
+ pub fn summary(&self) -> &Summary { self.manifest.summary() }
+ /// Get the targets specified in the manifest
+ pub fn targets(&self) -> &[Target] { self.manifest.targets() }
+ /// Get the current package version
+ pub fn version(&self) -> &Version { self.package_id().version() }
+ /// Get the package authors
+ pub fn authors(&self) -> &Vec<String> { &self.manifest.metadata().authors }
+ /// Whether the package is set to publish
+ pub fn publish(&self) -> &Option<Vec<String>> { self.manifest.publish() }
+
+ /// Whether the package uses a custom build script for any target
+ pub fn has_custom_build(&self) -> bool {
+ self.targets().iter().any(|t| t.is_custom_build())
+ }
+
+ pub fn find_closest_target(&self,
+ target: &str,
+ is_expected_kind: fn(&Target)-> bool) -> Option<&Target> {
+ let targets = self.targets();
+
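+ // Among targets of the expected kind, pick the one whose name is
+ // closest to the requested name, and only if the edit distance is
+ // small (< 4, i.e. plausibly a typo).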
+ let matches = targets.iter().filter(|t| is_expected_kind(t))
+ .map(|t| (lev_distance(target, t.name()), t))
+ .filter(|&(d, _)| d < 4);
+ matches.min_by_key(|t| t.0).map(|t| t.1)
+ }
+
+ pub fn map_source(self, to_replace: &SourceId, replace_with: &SourceId)
+ -> Package {
+ Package {
+ manifest: self.manifest.map_source(to_replace, replace_with),
+ manifest_path: self.manifest_path,
+ }
+ }
+
+ pub fn to_registry_toml(&self) -> CargoResult<String> {
+ let manifest = self.manifest().original().prepare_for_publish();
+ let toml = toml::to_string(&manifest)?;
+ Ok(format!("\
+ # THIS FILE IS AUTOMATICALLY GENERATED BY CARGO\n\
+ #\n\
+ # When uploading crates to the registry Cargo will automatically\n\
+ # \"normalize\" Cargo.toml files for maximal compatibility\n\
+ # with all versions of Cargo and also rewrite `path` dependencies\n\
+ # to registry (e.g. crates.io) dependencies\n\
+ #\n\
+ # If you believe there's an error in this file please file an\n\
+ # issue against the rust-lang/cargo repository. If you're\n\
+ # editing this file be aware that the upstream Cargo.toml\n\
+ # will likely look very different (and much more reasonable)\n\
+ \n\
+ {}\
+ ", toml))
+ }
+}
+
+impl fmt::Display for Package {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "{}", self.summary().package_id())
+ }
+}
+
+impl PartialEq for Package {
+ fn eq(&self, other: &Package) -> bool {
+ self.package_id() == other.package_id()
+ }
+}
+
+impl Eq for Package {}
+
+impl hash::Hash for Package {
+ fn hash<H: hash::Hasher>(&self, into: &mut H) {
+ self.package_id().hash(into)
+ }
+}
+
+pub struct PackageSet<'cfg> {
+ packages: HashMap<PackageId, LazyCell<Package>>,
+ sources: RefCell<SourceMap<'cfg>>,
+}
+
+impl<'cfg> PackageSet<'cfg> {
+ pub fn new(package_ids: &[PackageId],
+ sources: SourceMap<'cfg>) -> PackageSet<'cfg> {
+ PackageSet {
+ packages: package_ids.iter().map(|id| {
+ (id.clone(), LazyCell::new())
+ }).collect(),
+ sources: RefCell::new(sources),
+ }
+ }
+
+ pub fn package_ids<'a>(&'a self) -> Box<Iterator<Item=&'a PackageId> + 'a> {
+ Box::new(self.packages.keys())
+ }
+
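+ /// Lazily download the package for `id`: the first call asks the
+ /// package's `Source` to download it and caches the result; later
+ /// calls return the cached `Package`.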
+ pub fn get(&self, id: &PackageId) -> CargoResult<&Package> {
+ let slot = self.packages.get(id).ok_or_else(|| {
+ internal(format!("couldn't find `{}` in package set", id))
+ })?;
+ if let Some(pkg) = slot.borrow() {
+ return Ok(pkg)
+ }
+ let mut sources = self.sources.borrow_mut();
+ let source = sources.get_mut(id.source_id()).ok_or_else(|| {
+ internal(format!("couldn't find source for `{}`", id))
+ })?;
+ let pkg = source.download(id).chain_err(|| {
+ "unable to get packages from source"
+ })?;
+ assert!(slot.fill(pkg).is_ok());
+ Ok(slot.borrow().unwrap())
+ }
+
+ pub fn sources(&self) -> Ref<SourceMap<'cfg>> {
+ self.sources.borrow()
+ }
+}
--- /dev/null
+use std::cmp::Ordering;
+use std::fmt::{self, Formatter};
+use std::hash::Hash;
+use std::hash;
+use std::path::Path;
+use std::sync::Arc;
+
+use semver;
+use serde::de;
+use serde::ser;
+
+use util::{CargoResult, ToSemver};
+use core::source::SourceId;
+
+/// Identifier for a specific version of a package in a specific source.
+#[derive(Clone)]
+pub struct PackageId {
+ inner: Arc<PackageIdInner>,
+}
+
+#[derive(PartialEq, PartialOrd, Eq, Ord)]
+struct PackageIdInner {
+ name: String,
+ version: semver::Version,
+ source_id: SourceId,
+}
+
+impl ser::Serialize for PackageId {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer
+ {
+ s.collect_str(&format_args!("{} {} ({})",
+ self.inner.name,
+ self.inner.version,
+ self.inner.source_id.to_url()))
+ }
+}
+
+impl<'de> de::Deserialize<'de> for PackageId {
+ fn deserialize<D>(d: D) -> Result<PackageId, D::Error>
+ where D: de::Deserializer<'de>
+ {
+ let string = String::deserialize(d)?;
+ let mut s = string.splitn(3, ' ');
+ let name = s.next().unwrap();
+ let version = match s.next() {
+ Some(s) => s,
+ None => return Err(de::Error::custom("invalid serialized PackageId")),
+ };
+ let version = semver::Version::parse(version)
+ .map_err(de::Error::custom)?;
+ let url = match s.next() {
+ Some(s) => s,
+ None => return Err(de::Error::custom("invalid serialized PackageId")),
+ };
+ let url = if url.starts_with('(') && url.ends_with(')') {
+ &url[1..url.len() - 1]
+ } else {
+ return Err(de::Error::custom("invalid serialized PackageId"))
+ };
+ let source_id = SourceId::from_url(url).map_err(de::Error::custom)?;
+
+ Ok(PackageId {
+ inner: Arc::new(PackageIdInner {
+ name: name.to_string(),
+ version: version,
+ source_id: source_id,
+ }),
+ })
+ }
+}
+
+impl Hash for PackageId {
+ fn hash<S: hash::Hasher>(&self, state: &mut S) {
+ self.inner.name.hash(state);
+ self.inner.version.hash(state);
+ self.inner.source_id.hash(state);
+ }
+}
+
+impl PartialEq for PackageId {
+ fn eq(&self, other: &PackageId) -> bool {
+ (*self.inner).eq(&*other.inner)
+ }
+}
+impl PartialOrd for PackageId {
+ fn partial_cmp(&self, other: &PackageId) -> Option<Ordering> {
+ (*self.inner).partial_cmp(&*other.inner)
+ }
+}
+impl Eq for PackageId {}
+impl Ord for PackageId {
+ fn cmp(&self, other: &PackageId) -> Ordering {
+ (*self.inner).cmp(&*other.inner)
+ }
+}
+
+impl PackageId {
+ pub fn new<T: ToSemver>(name: &str, version: T,
+ sid: &SourceId) -> CargoResult<PackageId> {
+ let v = version.to_semver()?;
+ Ok(PackageId {
+ inner: Arc::new(PackageIdInner {
+ name: name.to_string(),
+ version: v,
+ source_id: sid.clone(),
+ }),
+ })
+ }
+
+ pub fn name(&self) -> &str { &self.inner.name }
+ pub fn version(&self) -> &semver::Version { &self.inner.version }
+ pub fn source_id(&self) -> &SourceId { &self.inner.source_id }
+
+ pub fn with_precise(&self, precise: Option<String>) -> PackageId {
+ PackageId {
+ inner: Arc::new(PackageIdInner {
+ name: self.inner.name.to_string(),
+ version: self.inner.version.clone(),
+ source_id: self.inner.source_id.with_precise(precise),
+ }),
+ }
+ }
+
+ pub fn with_source_id(&self, source: &SourceId) -> PackageId {
+ PackageId {
+ inner: Arc::new(PackageIdInner {
+ name: self.inner.name.to_string(),
+ version: self.inner.version.clone(),
+ source_id: source.clone(),
+ }),
+ }
+ }
+
+ pub fn stable_hash<'a>(&'a self, workspace: &'a Path) -> PackageIdStableHash<'a> {
+ PackageIdStableHash(self, workspace)
+ }
+}
+
+pub struct PackageIdStableHash<'a>(&'a PackageId, &'a Path);
+
+impl<'a> Hash for PackageIdStableHash<'a> {
+ fn hash<S: hash::Hasher>(&self, state: &mut S) {
+ self.0.inner.name.hash(state);
+ self.0.inner.version.hash(state);
+ self.0.inner.source_id.stable_hash(self.1, state);
+ }
+}
+
+impl fmt::Display for PackageId {
+ fn fmt(&self, f: &mut Formatter) -> fmt::Result {
+ write!(f, "{} v{}", self.inner.name, self.inner.version)?;
+
+ if !self.inner.source_id.is_default_registry() {
+ write!(f, " ({})", self.inner.source_id)?;
+ }
+
+ Ok(())
+ }
+}
+
+impl fmt::Debug for PackageId {
+ fn fmt(&self, f: &mut Formatter) -> fmt::Result {
+ f.debug_struct("PackageId")
+ .field("name", &self.inner.name)
+ .field("version", &self.inner.version.to_string())
+ .field("source", &self.inner.source_id.to_string())
+ .finish()
+ }
+}
+
+#[cfg(test)]
+mod tests {
+ use super::PackageId;
+ use core::source::SourceId;
+ use sources::CRATES_IO;
+ use util::ToUrl;
+
+ #[test]
+ fn invalid_version_handled_nicely() {
+ let loc = CRATES_IO.to_url().unwrap();
+ let repo = SourceId::for_registry(&loc).unwrap();
+
+ assert!(PackageId::new("foo", "1.0", &repo).is_err());
+ assert!(PackageId::new("foo", "1", &repo).is_err());
+ assert!(PackageId::new("foo", "bar", &repo).is_err());
+ assert!(PackageId::new("foo", "", &repo).is_err());
+ }
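+
+ // A small illustrative check (a sketch): `Display` prints `name vVERSION`
+ // and omits the source when it is the default registry.
+ #[test]
+ fn display_omits_default_registry_source() {
+ let loc = CRATES_IO.to_url().unwrap();
+ let repo = SourceId::for_registry(&loc).unwrap();
+ let id = PackageId::new("foo", "1.0.0", &repo).unwrap();
+ assert_eq!(id.to_string(), "foo v1.0.0");
+ }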
+}
--- /dev/null
+use std::collections::HashMap;
+use std::fmt;
+
+use semver::Version;
+use url::Url;
+
+use core::PackageId;
+use util::{ToUrl, ToSemver};
+use util::errors::{CargoError, CargoResult, CargoResultExt};
+
+#[derive(Clone, PartialEq, Eq, Debug)]
+pub struct PackageIdSpec {
+ name: String,
+ version: Option<Version>,
+ url: Option<Url>,
+}
+
+impl PackageIdSpec {
+ pub fn parse(spec: &str) -> CargoResult<PackageIdSpec> {
+ if spec.contains('/') {
+ if let Ok(url) = spec.to_url() {
+ return PackageIdSpec::from_url(url);
+ }
+ if !spec.contains("://") {
+ if let Ok(url) = Url::parse(&format!("cargo://{}", spec)) {
+ return PackageIdSpec::from_url(url);
+ }
+ }
+ }
+ let mut parts = spec.splitn(2, ':');
+ let name = parts.next().unwrap();
+ let version = match parts.next() {
+ Some(version) => Some(Version::parse(version)?),
+ None => None,
+ };
+ for ch in name.chars() {
+ if !ch.is_alphanumeric() && ch != '_' && ch != '-' {
+ bail!("invalid character in pkgid `{}`: `{}`", spec, ch)
+ }
+ }
+ Ok(PackageIdSpec {
+ name: name.to_string(),
+ version: version,
+ url: None,
+ })
+ }
+
+ pub fn query_str<'a, I>(spec: &str, i: I) -> CargoResult<&'a PackageId>
+ where I: IntoIterator<Item=&'a PackageId>
+ {
+ let spec = PackageIdSpec::parse(spec).chain_err(|| {
+ format!("invalid package id specification: `{}`", spec)
+ })?;
+ spec.query(i)
+ }
+
+ pub fn from_package_id(package_id: &PackageId) -> PackageIdSpec {
+ PackageIdSpec {
+ name: package_id.name().to_string(),
+ version: Some(package_id.version().clone()),
+ url: Some(package_id.source_id().url().clone()),
+ }
+ }
+
+ fn from_url(mut url: Url) -> CargoResult<PackageIdSpec> {
+ if url.query().is_some() {
+ bail!("cannot have a query string in a pkgid: {}", url)
+ }
+ let frag = url.fragment().map(|s| s.to_owned());
+ url.set_fragment(None);
+ let (name, version) = {
+ let mut path = url.path_segments().ok_or_else(|| {
+ CargoError::from(format!("pkgid urls must have a path: {}", url))
+ })?;
+ let path_name = path.next_back().ok_or_else(|| {
+ CargoError::from(format!("pkgid urls must have at least one path \
+ component: {}", url))
+ })?;
+ match frag {
+ Some(fragment) => {
+ let mut parts = fragment.splitn(2, ':');
+ let name_or_version = parts.next().unwrap();
+ match parts.next() {
+ Some(part) => {
+ let version = part.to_semver()?;
+ (name_or_version.to_string(), Some(version))
+ }
+ None => {
+ if name_or_version.chars().next().unwrap()
+ .is_alphabetic() {
+ (name_or_version.to_string(), None)
+ } else {
+ let version = name_or_version.to_semver()?;
+ (path_name.to_string(), Some(version))
+ }
+ }
+ }
+ }
+ None => (path_name.to_string(), None),
+ }
+ };
+ Ok(PackageIdSpec {
+ name: name,
+ version: version,
+ url: Some(url),
+ })
+ }
+
+ pub fn name(&self) -> &str { &self.name }
+ pub fn version(&self) -> Option<&Version> { self.version.as_ref() }
+ pub fn url(&self) -> Option<&Url> { self.url.as_ref() }
+
+ pub fn set_url(&mut self, url: Url) {
+ self.url = Some(url);
+ }
+
+ pub fn matches(&self, package_id: &PackageId) -> bool {
+ if self.name() != package_id.name() { return false }
+
+ if let Some(ref v) = self.version {
+ if v != package_id.version() {
+ return false;
+ }
+ }
+
+ match self.url {
+ Some(ref u) => u == package_id.source_id().url(),
+ None => true
+ }
+ }
+
+ pub fn query<'a, I>(&self, i: I) -> CargoResult<&'a PackageId>
+ where I: IntoIterator<Item=&'a PackageId>
+ {
+ let mut ids = i.into_iter().filter(|p| self.matches(*p));
+ let ret = match ids.next() {
+ Some(id) => id,
+ None => bail!("package id specification `{}` \
+ matched no packages", self),
+ };
+ return match ids.next() {
+ Some(other) => {
+ let mut msg = format!("There are multiple `{}` packages in \
+ your project, and the specification \
+ `{}` is ambiguous.\n\
+ Please re-run this command \
+ with `-p <spec>` where `<spec>` is one \
+ of the following:",
+ self.name(), self);
+ let mut vec = vec![ret, other];
+ vec.extend(ids);
+ minimize(&mut msg, &vec, self);
+ Err(msg.into())
+ }
+ None => Ok(ret)
+ };
+
+ fn minimize(msg: &mut String,
+ ids: &[&PackageId],
+ spec: &PackageIdSpec) {
+ let mut version_cnt = HashMap::new();
+ for id in ids {
+ *version_cnt.entry(id.version()).or_insert(0) += 1;
+ }
+ for id in ids {
+ if version_cnt[id.version()] == 1 {
+ msg.push_str(&format!("\n {}:{}", spec.name(),
+ id.version()));
+ } else {
+ msg.push_str(&format!("\n {}",
+ PackageIdSpec::from_package_id(*id)));
+ }
+ }
+ }
+ }
+}
+
+impl fmt::Display for PackageIdSpec {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ let mut printed_name = false;
+ match self.url {
+ Some(ref url) => {
+ if url.scheme() == "cargo" {
+ write!(f, "{}{}", url.host().unwrap(), url.path())?;
+ } else {
+ write!(f, "{}", url)?;
+ }
+ if url.path_segments().unwrap().next_back().unwrap() != self.name {
+ printed_name = true;
+ write!(f, "#{}", self.name)?;
+ }
+ }
+ None => { printed_name = true; write!(f, "{}", self.name)? }
+ }
+ if let Some(ref v) = self.version {
+ write!(f, "{}{}", if printed_name {":"} else {"#"}, v)?;
+ }
+ Ok(())
+ }
+}
+
+#[cfg(test)]
+mod tests {
+ use core::{PackageId, SourceId};
+ use super::PackageIdSpec;
+ use url::Url;
+ use semver::Version;
+
+ #[test]
+ fn good_parsing() {
+ fn ok(spec: &str, expected: PackageIdSpec) {
+ let parsed = PackageIdSpec::parse(spec).unwrap();
+ assert_eq!(parsed, expected);
+ assert_eq!(parsed.to_string(), spec);
+ }
+
+ ok("http://crates.io/foo#1.2.3", PackageIdSpec {
+ name: "foo".to_string(),
+ version: Some(Version::parse("1.2.3").unwrap()),
+ url: Some(Url::parse("http://crates.io/foo").unwrap()),
+ });
+ ok("http://crates.io/foo#bar:1.2.3", PackageIdSpec {
+ name: "bar".to_string(),
+ version: Some(Version::parse("1.2.3").unwrap()),
+ url: Some(Url::parse("http://crates.io/foo").unwrap()),
+ });
+ ok("crates.io/foo", PackageIdSpec {
+ name: "foo".to_string(),
+ version: None,
+ url: Some(Url::parse("cargo://crates.io/foo").unwrap()),
+ });
+ ok("crates.io/foo#1.2.3", PackageIdSpec {
+ name: "foo".to_string(),
+ version: Some(Version::parse("1.2.3").unwrap()),
+ url: Some(Url::parse("cargo://crates.io/foo").unwrap()),
+ });
+ ok("crates.io/foo#bar", PackageIdSpec {
+ name: "bar".to_string(),
+ version: None,
+ url: Some(Url::parse("cargo://crates.io/foo").unwrap()),
+ });
+ ok("crates.io/foo#bar:1.2.3", PackageIdSpec {
+ name: "bar".to_string(),
+ version: Some(Version::parse("1.2.3").unwrap()),
+ url: Some(Url::parse("cargo://crates.io/foo").unwrap()),
+ });
+ ok("foo", PackageIdSpec {
+ name: "foo".to_string(),
+ version: None,
+ url: None,
+ });
+ ok("foo:1.2.3", PackageIdSpec {
+ name: "foo".to_string(),
+ version: Some(Version::parse("1.2.3").unwrap()),
+ url: None,
+ });
+ }
+
+ #[test]
+ fn bad_parsing() {
+ assert!(PackageIdSpec::parse("baz:").is_err());
+ assert!(PackageIdSpec::parse("baz:*").is_err());
+ assert!(PackageIdSpec::parse("baz:1.0").is_err());
+ assert!(PackageIdSpec::parse("http://baz:1.0").is_err());
+ assert!(PackageIdSpec::parse("http://#baz:1.0").is_err());
+ }
+
+ #[test]
+ fn matching() {
+ let url = Url::parse("http://example.com").unwrap();
+ let sid = SourceId::for_registry(&url).unwrap();
+ let foo = PackageId::new("foo", "1.2.3", &sid).unwrap();
+ let bar = PackageId::new("bar", "1.2.3", &sid).unwrap();
+
+ assert!( PackageIdSpec::parse("foo").unwrap().matches(&foo));
+ assert!(!PackageIdSpec::parse("foo").unwrap().matches(&bar));
+ assert!( PackageIdSpec::parse("foo:1.2.3").unwrap().matches(&foo));
+ assert!(!PackageIdSpec::parse("foo:1.2.2").unwrap().matches(&foo));
+ }
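+
+ // An illustrative round-trip (a sketch): a spec built from a package id
+ // always matches that same package id.
+ #[test]
+ fn from_package_id_matches_itself() {
+ let url = Url::parse("http://example.com").unwrap();
+ let sid = SourceId::for_registry(&url).unwrap();
+ let foo = PackageId::new("foo", "1.2.3", &sid).unwrap();
+ let spec = PackageIdSpec::from_package_id(&foo);
+ assert!(spec.matches(&foo));
+ }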
+}
--- /dev/null
+use std::collections::HashMap;
+
+use semver::VersionReq;
+use url::Url;
+
+use core::{Source, SourceId, SourceMap, Summary, Dependency, PackageId};
+use core::PackageSet;
+use util::{Config, profile};
+use util::errors::{CargoResult, CargoResultExt};
+use sources::config::SourceConfigMap;
+
+/// Source of information about a group of packages.
+///
+/// See also `core::Source`.
+pub trait Registry {
+ /// Attempt to find the packages that match a dependency request.
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()>;
+
+ fn query_vec(&mut self, dep: &Dependency) -> CargoResult<Vec<Summary>> {
+ let mut ret = Vec::new();
+ self.query(dep, &mut |s| ret.push(s))?;
+ Ok(ret)
+ }
+
+ /// Returns whether or not this registry will return summaries with
+ /// checksums listed.
+ fn supports_checksums(&self) -> bool;
+
+ /// Returns whether or not this registry will return summaries with
+ /// the `precise` field in the source id listed.
+ fn requires_precise(&self) -> bool;
+}
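+
+// For example (a sketch, with a hypothetical `registry` implementing the
+// trait above and a `dep: Dependency` in scope):
+//
+//     let candidates = registry.query_vec(&dep)?;
+//     for summary in candidates {
+//         println!("candidate: {}", summary.package_id());
+//     }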
+
+impl<'a, T: ?Sized + Registry + 'a> Registry for Box<T> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ (**self).query(dep, f)
+ }
+
+ fn supports_checksums(&self) -> bool {
+ (**self).supports_checksums()
+ }
+
+ fn requires_precise(&self) -> bool {
+ (**self).requires_precise()
+ }
+}
+
+/// This structure represents a registry of known packages. It internally
+/// contains a number of `Box<Source>` instances which are used to load a
+/// `Package` from.
+///
+/// The resolution phase of Cargo uses this to drive knowledge about new
+/// packages as well as querying for lists of new packages. It is here that
+/// sources are updated (e.g. network operations) and overrides are
+/// handled.
+///
+/// The general idea behind this registry is that it is centered around the
+/// `SourceMap` structure, contained within which is a mapping of a `SourceId` to
+/// a `Source`. Each `Source` in the map has been updated (using network
+/// operations if necessary) and is ready to be queried for packages.
+pub struct PackageRegistry<'cfg> {
+ sources: SourceMap<'cfg>,
+
+ // A list of sources which are considered "overrides", taking precedence
+ // when querying for packages.
+ overrides: Vec<SourceId>,
+
+ // Note that each SourceId does not take into account its `precise` field
+ // when hashing or testing for equality. When adding a new `SourceId`, we
+ // want to avoid duplicates in the `SourceMap` (to prevent re-updating the
+ // same git repo twice for example), but we also want to ensure that the
+ // loaded source is always updated.
+ //
+ // Sources with a `precise` field normally don't need to be updated because
+ // their contents are already on disk, but sources without a `precise` field
+ // almost always need to be updated. If we have a cached `Source` for a
+ // precise `SourceId`, then when we add a new `SourceId` that is not precise
+ // we want to ensure that the underlying source is updated.
+ //
+ // This is basically a long-winded way of saying that we want to know
+ // precisely what the keys of `sources` are, so this is a mapping of key to
+ // what exactly the key is.
+ source_ids: HashMap<SourceId, (SourceId, Kind)>,
+
+ locked: LockedMap,
+ source_config: SourceConfigMap<'cfg>,
+ patches: HashMap<Url, Vec<Summary>>,
+}
+
+type LockedMap = HashMap<SourceId, HashMap<String, Vec<(PackageId, Vec<PackageId>)>>>;
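+
+// Conceptually (a sketch with hypothetical values), one `LockedMap` entry for
+// the crates.io source might read: "serde" -> [(serde 1.0.2, [serde_derive
+// 1.0.2])], i.e. a locked package id together with the locked ids of its
+// dependencies.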
+
+#[derive(PartialEq, Eq, Clone, Copy)]
+enum Kind {
+ Override,
+ Locked,
+ Normal,
+}
+
+impl<'cfg> PackageRegistry<'cfg> {
+ pub fn new(config: &'cfg Config) -> CargoResult<PackageRegistry<'cfg>> {
+ let source_config = SourceConfigMap::new(config)?;
+ Ok(PackageRegistry {
+ sources: SourceMap::new(),
+ source_ids: HashMap::new(),
+ overrides: Vec::new(),
+ source_config: source_config,
+ locked: HashMap::new(),
+ patches: HashMap::new(),
+ })
+ }
+
+ pub fn get(self, package_ids: &[PackageId]) -> PackageSet<'cfg> {
+ trace!("getting packages; sources={}", self.sources.len());
+ PackageSet::new(package_ids, self.sources)
+ }
+
+ fn ensure_loaded(&mut self, namespace: &SourceId, kind: Kind) -> CargoResult<()> {
+ match self.source_ids.get(namespace) {
+ // We've previously loaded this source, and we've already locked it,
+ // so we're not allowed to change it even if `namespace` has a
+ // slightly different precise version listed.
+ Some(&(_, Kind::Locked)) => {
+ debug!("load/locked {}", namespace);
+ return Ok(())
+ }
+
+ // If the previous source was not a precise source, then we can be
+ // sure that it's already been updated if we've already loaded it.
+ Some(&(ref previous, _)) if previous.precise().is_none() => {
+ debug!("load/precise {}", namespace);
+ return Ok(())
+ }
+
+ // If the previous source has the same precise version as we do,
+ // then we're done, otherwise we need to move forward
+ // updating this source.
+ Some(&(ref previous, _)) => {
+ if previous.precise() == namespace.precise() {
+ debug!("load/match {}", namespace);
+ return Ok(())
+ }
+ debug!("load/mismatch {}", namespace);
+ }
+ None => {
+ debug!("load/missing {}", namespace);
+ }
+ }
+
+ self.load(namespace, kind)?;
+ Ok(())
+ }
+
+ pub fn add_sources(&mut self, ids: &[SourceId]) -> CargoResult<()> {
+ for id in ids.iter() {
+ self.ensure_loaded(id, Kind::Locked)?;
+ }
+ Ok(())
+ }
+
+ pub fn add_preloaded(&mut self, source: Box<Source + 'cfg>) {
+ self.add_source(source, Kind::Locked);
+ }
+
+ fn add_source(&mut self, source: Box<Source + 'cfg>, kind: Kind) {
+ let id = source.source_id().clone();
+ self.sources.insert(source);
+ self.source_ids.insert(id.clone(), (id, kind));
+ }
+
+ pub fn add_override(&mut self, source: Box<Source + 'cfg>) {
+ self.overrides.push(source.source_id().clone());
+ self.add_source(source, Kind::Override);
+ }
+
+ pub fn register_lock(&mut self, id: PackageId, deps: Vec<PackageId>) {
+ trace!("register_lock: {}", id);
+ for dep in deps.iter() {
+ trace!("\t-> {}", dep);
+ }
+ let sub_map = self.locked.entry(id.source_id().clone())
+ .or_insert(HashMap::new());
+ let sub_vec = sub_map.entry(id.name().to_string())
+ .or_insert(Vec::new());
+ sub_vec.push((id, deps));
+ }
+
+ pub fn patch(&mut self, url: &Url, deps: &[Dependency]) -> CargoResult<()> {
+ let deps = deps.iter().map(|dep| {
+ let mut summaries = self.query_vec(dep)?.into_iter();
+ let summary = match summaries.next() {
+ Some(summary) => summary,
+ None => {
+ bail!("patch for `{}` in `{}` did not resolve to any crates. If this is \
+ unexpected, you may wish to consult: \
+ https://github.com/rust-lang/cargo/issues/4678",
+ dep.name(), url)
+ }
+ };
+ if summaries.next().is_some() {
+ bail!("patch for `{}` in `{}` resolved to more than one candidate",
+ dep.name(), url)
+ }
+ if summary.package_id().source_id().url() == url {
+ bail!("patch for `{}` in `{}` points to the same source, but \
+ patches must point to different sources",
+ dep.name(), url);
+ }
+ Ok(summary)
+ }).collect::<CargoResult<Vec<_>>>().chain_err(|| {
+ format!("failed to resolve patches for `{}`", url)
+ })?;
+
+ self.patches.insert(url.clone(), deps);
+
+ Ok(())
+ }
+
+ pub fn patches(&self) -> &HashMap<Url, Vec<Summary>> {
+ &self.patches
+ }
+
+ fn load(&mut self, source_id: &SourceId, kind: Kind) -> CargoResult<()> {
+ (|| {
+ let source = self.source_config.load(source_id)?;
+ assert_eq!(source.source_id(), source_id);
+
+ if kind == Kind::Override {
+ self.overrides.push(source_id.clone());
+ }
+ self.add_source(source, kind);
+
+ // Ensure the source has fetched all necessary remote data.
+ let _p = profile::start(format!("updating: {}", source_id));
+ self.sources.get_mut(source_id).unwrap().update()
+ })().chain_err(|| format!("Unable to update {}", source_id))
+ }
+
+ fn query_overrides(&mut self, dep: &Dependency)
+ -> CargoResult<Option<Summary>> {
+ for s in self.overrides.iter() {
+ let src = self.sources.get_mut(s).unwrap();
+ let dep = Dependency::new_override(dep.name(), s);
+ let mut results = src.query_vec(&dep)?;
+ if !results.is_empty() {
+ return Ok(Some(results.remove(0)))
+ }
+ }
+ Ok(None)
+ }
+
+ /// This function is used to transform a summary to another locked summary
+ /// if possible. This is where the concept of a lockfile comes into play.
+ ///
+ /// If a summary points at a package id which was previously locked, then we
+ /// override the summary's id itself, as well as all dependencies, to be
+ /// rewritten to the locked versions. This will transform the summary's
+ /// source to a precise source (listed in the locked version) as well as
+ /// transforming all of the dependencies from range requirements on
+ /// imprecise sources to exact requirements on precise sources.
+ ///
+ /// If a summary does not point at a package id which was previously locked,
+ /// or if any dependencies were added and don't have a previously listed
+ /// version, we still want to avoid updating as many dependencies as
+ /// possible to keep the graph stable. In this case we map all of the
+ /// summary's dependencies to be rewritten to a locked version wherever
+ /// possible. If we're unable to map a dependency though, we just pass it on
+ /// through.
+ pub fn lock(&self, summary: Summary) -> Summary {
+ lock(&self.locked, &self.patches, summary)
+ }
+
+ fn warn_bad_override(&self,
+ override_summary: &Summary,
+ real_summary: &Summary) -> CargoResult<()> {
+ let mut real_deps = real_summary.dependencies().iter().collect::<Vec<_>>();
+
+ let boilerplate = "\
+This is currently allowed but is known to produce buggy behavior with spurious
+recompiles and changes to the crate graph. Path overrides unfortunately were
+never intended to support this feature, so for now this message is just a
+warning. In the future, however, this message will become a hard error.
+
+To change the dependency graph via an override it's recommended to use the
+`[replace]` feature of Cargo instead of the path override feature. This is
+documented online at the url below for more information.
+
+http://doc.crates.io/specifying-dependencies.html#overriding-dependencies
+";
+
+ for dep in override_summary.dependencies() {
+ if let Some(i) = real_deps.iter().position(|d| dep == *d) {
+ real_deps.remove(i);
+ continue
+ }
+ let msg = format!("\
+ path override for crate `{}` has altered the original list of\n\
+ dependencies; the dependency on `{}` was either added or\n\
+ modified to not match the previously resolved version\n\n\
+ {}", override_summary.package_id().name(), dep.name(), boilerplate);
+ self.source_config.config().shell().warn(&msg)?;
+ return Ok(())
+ }
+
+ if let Some(id) = real_deps.get(0) {
+ let msg = format!("\
+ path override for crate `{}` has altered the original list of\n\
+ dependencies; the dependency on `{}` was removed\n\n\
+ {}", override_summary.package_id().name(), id.name(), boilerplate);
+ self.source_config.config().shell().warn(&msg)?;
+ return Ok(())
+ }
+
+ Ok(())
+ }
+}
+
+impl<'cfg> Registry for PackageRegistry<'cfg> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ let (override_summary, n, to_warn) = {
+ // Look for an override and get ready to query the real source.
+ let override_summary = self.query_overrides(dep)?;
+
+ // Next up on our list of candidates is to check the `[patch]`
+ // section of the manifest. Here we look through all patches
+ // relevant to the source that `dep` points to, and then we match
+ // name/version. Note that we don't use `dep.matches(..)` because
+ // the patches, by definition, come from a different source.
+ // This means that `dep.matches(..)` would always return false, whereas
+ // what we really care about is the name/version match.
+ let mut patches = Vec::<Summary>::new();
+ if let Some(extra) = self.patches.get(dep.source_id().url()) {
+ patches.extend(extra.iter().filter(|s| {
+ dep.matches_ignoring_source(s)
+ }).cloned());
+ }
+
+ // A crucial property of the `[patch]` feature is that we *don't*
+ // query the actual registry if we have a "locked" dependency. A
+ // locked dep basically just means a version constraint of `=a.b.c`,
+ // and because patches take priority over the actual source, if we
+ // have a candidate we're done.
+ if patches.len() == 1 && dep.is_locked() {
+ let patch = patches.remove(0);
+ match override_summary {
+ Some(summary) => (summary, 1, Some(patch)),
+ None => {
+ f(patch);
+ return Ok(())
+ }
+ }
+ } else {
+ if !patches.is_empty() {
+ debug!("found {} patches with an unlocked dep, \
+ looking at sources", patches.len());
+ }
+
+ // Ensure the requested source_id is loaded
+ self.ensure_loaded(dep.source_id(), Kind::Normal).chain_err(|| {
+ format!("failed to load source for a dependency \
+ on `{}`", dep.name())
+ })?;
+
+ let source = self.sources.get_mut(dep.source_id());
+ match (override_summary, source) {
+ (Some(_), None) => bail!("override found but no real ones"),
+ (None, None) => return Ok(()),
+
+ // If we don't have an override then we just ship
+ // everything upstairs after locking the summary
+ (None, Some(source)) => {
+ for patch in patches.iter() {
+ f(patch.clone());
+ }
+
+ // Our sources shouldn't ever come back to us with two
+ // summaries that have the same version. We could,
+ // however, have a `[patch]` section which is used
+ // to override a version in the registry. This means
+ // that if our `summary` in this loop has the same
+ // version as something in `patches` that we've
+ // already selected, then we skip this `summary`.
+ let locked = &self.locked;
+ let all_patches = &self.patches;
+ return source.query(dep, &mut |summary| {
+ for patch in patches.iter() {
+ let patch = patch.package_id().version();
+ if summary.package_id().version() == patch {
+ return
+ }
+ }
+ f(lock(locked, all_patches, summary))
+ })
+ }
+
+ // If we have an override summary then we query the source
+ // to sanity check its results. We don't actually use any of
+ // the summaries it gives us though.
+ (Some(override_summary), Some(source)) => {
+ if !patches.is_empty() {
+ bail!("found patches and a path override")
+ }
+ let mut n = 0;
+ let mut to_warn = None;
+ source.query(dep, &mut |summary| {
+ n += 1;
+ to_warn = Some(summary);
+ })?;
+ (override_summary, n, to_warn)
+ }
+ }
+ }
+ };
+
+ if n > 1 {
+ bail!("found an override with a non-locked list");
+ } else if let Some(summary) = to_warn {
+ self.warn_bad_override(&override_summary, &summary)?;
+ }
+ f(self.lock(override_summary));
+ Ok(())
+ }
+
+ fn supports_checksums(&self) -> bool {
+ false
+ }
+
+ fn requires_precise(&self) -> bool {
+ false
+ }
+}
+
+fn lock(locked: &LockedMap,
+ patches: &HashMap<Url, Vec<Summary>>,
+ summary: Summary) -> Summary {
+ let pair = locked.get(summary.source_id()).and_then(|map| {
+ map.get(summary.name())
+ }).and_then(|vec| {
+ vec.iter().find(|&&(ref id, _)| id == summary.package_id())
+ });
+
+ trace!("locking summary of {}", summary.package_id());
+
+ // Lock the summary's id if possible
+ let summary = match pair {
+ Some(&(ref precise, _)) => summary.override_id(precise.clone()),
+ None => summary,
+ };
+ summary.map_dependencies(|dep| {
+ trace!("\t{}/{}/{}", dep.name(), dep.version_req(),
+ dep.source_id());
+
+ // If we've got a known set of overrides for this summary, then
+ // one of a few cases can arise:
+ //
+ // 1. We have a lock entry for this dependency from the same
+ // source as it's listed as coming from. In this case we make
+ // sure to lock to precisely the given package id.
+ //
+ // 2. We have a lock entry for this dependency, but it's from a
+ // different source than what's listed, or the version
+ // requirement has changed. In this case we must discard the
+ // locked version because the dependency needs to be
+ // re-resolved.
+ //
+ // 3. We don't have a lock entry for this dependency, in which
+ // case it was likely an optional dependency which wasn't
+ // included previously so we just pass it through anyway.
+ //
+ // Cases 1/2 are handled by `matches_id` and case 3 is handled by
+ // falling through to the logic below.
+ if let Some(&(_, ref locked_deps)) = pair {
+ let locked = locked_deps.iter().find(|id| dep.matches_id(id));
+ if let Some(locked) = locked {
+ trace!("\tfirst hit on {}", locked);
+ let mut dep = dep.clone();
+ dep.lock_to(locked);
+ return dep
+ }
+ }
+
+ // If this dependency did not have a locked version, then we query
+ // all known locked packages to see if they match this dependency.
+ // If anything does then we lock it to that and move on.
+ let v = locked.get(dep.source_id()).and_then(|map| {
+ map.get(dep.name())
+ }).and_then(|vec| {
+ vec.iter().find(|&&(ref id, _)| dep.matches_id(id))
+ });
+ if let Some(&(ref id, _)) = v {
+ trace!("\tsecond hit on {}", id);
+ let mut dep = dep.clone();
+ dep.lock_to(id);
+ return dep
+ }
+
+ // Finally we check to see if any registered patches correspond to
+ // this dependency.
+ let v = patches.get(dep.source_id().url()).map(|vec| {
+ let dep2 = dep.clone();
+ let mut iter = vec.iter().filter(move |s| {
+ dep2.name() == s.package_id().name() &&
+ dep2.version_req().matches(s.package_id().version())
+ });
+ (iter.next(), iter)
+ });
+ if let Some((Some(summary), mut remaining)) = v {
+ assert!(remaining.next().is_none());
+ let patch_source = summary.package_id().source_id();
+ let patch_locked = locked.get(patch_source).and_then(|m| {
+ m.get(summary.package_id().name())
+ }).map(|list| {
+ list.iter().any(|&(ref id, _)| id == summary.package_id())
+ }).unwrap_or(false);
+
+ if patch_locked {
+ trace!("\tthird hit on {}", summary.package_id());
+ let req = VersionReq::exact(summary.package_id().version());
+ let mut dep = dep.clone();
+ dep.set_version_req(req);
+ return dep
+ }
+ }
+
+ trace!("\tnope, unlocked");
+ dep
+ })
+}
+
+#[cfg(test)]
+pub mod test {
+ use core::{Summary, Registry, Dependency};
+ use util::CargoResult;
+
+ pub struct RegistryBuilder {
+ summaries: Vec<Summary>,
+ overrides: Vec<Summary>
+ }
+
+ impl RegistryBuilder {
+ pub fn new() -> RegistryBuilder {
+ RegistryBuilder { summaries: vec![], overrides: vec![] }
+ }
+
+ pub fn summary(mut self, summary: Summary) -> RegistryBuilder {
+ self.summaries.push(summary);
+ self
+ }
+
+ pub fn summaries(mut self, summaries: Vec<Summary>) -> RegistryBuilder {
+ self.summaries.extend(summaries.into_iter());
+ self
+ }
+
+ pub fn add_override(mut self, summary: Summary) -> RegistryBuilder {
+ self.overrides.push(summary);
+ self
+ }
+
+ pub fn overrides(mut self, summaries: Vec<Summary>) -> RegistryBuilder {
+ self.overrides.extend(summaries.into_iter());
+ self
+ }
+
+ fn query_overrides(&self, dep: &Dependency) -> Vec<Summary> {
+ self.overrides.iter()
+ .filter(|s| s.name() == dep.name())
+ .cloned()
+ .collect()
+ }
+ }
+
+ impl Registry for RegistryBuilder {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ debug!("querying; dep={:?}", dep);
+
+ let overrides = self.query_overrides(dep);
+
+ if overrides.is_empty() {
+ for s in self.summaries.iter() {
+ if dep.matches(s) {
+ f(s.clone());
+ }
+ }
+ Ok(())
+ } else {
+ for s in overrides {
+ f(s);
+ }
+ Ok(())
+ }
+ }
+
+ fn supports_checksums(&self) -> bool {
+ false
+ }
+
+ fn requires_precise(&self) -> bool {
+ false
+ }
+ }
+}
--- /dev/null
+use std::collections::{HashMap, HashSet, BTreeMap};
+use std::fmt;
+use std::str::FromStr;
+
+use serde::ser;
+use serde::de;
+
+use core::{Package, PackageId, SourceId, Workspace, Dependency};
+use util::{Graph, Config, internal};
+use util::errors::{CargoResult, CargoResultExt, CargoError};
+
+use super::Resolve;
+
+#[derive(Serialize, Deserialize, Debug)]
+pub struct EncodableResolve {
+ package: Option<Vec<EncodableDependency>>,
+ /// `root` is optional to allow backward compatibility.
+ root: Option<EncodableDependency>,
+ metadata: Option<Metadata>,
+
+ #[serde(default, skip_serializing_if = "Patch::is_empty")]
+ patch: Patch,
+}
+
+#[derive(Serialize, Deserialize, Debug, Default)]
+struct Patch {
+ unused: Vec<EncodableDependency>,
+}
+
+pub type Metadata = BTreeMap<String, String>;
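+
+// A sketch of the `[[package]]` shape this deserializes from (illustrative
+// values only):
+//
+//     [[package]]
+//     name = "foo"
+//     version = "0.1.0"
+//     source = "registry+https://github.com/rust-lang/crates.io-index"
+//     dependencies = [
+//      "bar 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
+//     ]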
+
+impl EncodableResolve {
+ pub fn into_resolve(self, ws: &Workspace) -> CargoResult<Resolve> {
+ let path_deps = build_path_deps(ws);
+
+ let packages = {
+ let mut packages = self.package.unwrap_or_default();
+ if let Some(root) = self.root {
+ packages.insert(0, root);
+ }
+ packages
+ };
+
+ // `PackageId`s in the lock file don't include the `source` part
+ // for workspace members, so we reconstruct proper ids.
+ let (live_pkgs, all_pkgs) = {
+ let mut live_pkgs = HashMap::new();
+ let mut all_pkgs = HashSet::new();
+ for pkg in packages.iter() {
+ let enc_id = EncodablePackageId {
+ name: pkg.name.clone(),
+ version: pkg.version.clone(),
+ source: pkg.source.clone(),
+ };
+
+ if !all_pkgs.insert(enc_id.clone()) {
+ return Err(internal(format!("package `{}` is specified twice in the lockfile",
+ pkg.name)));
+ }
+ let id = match pkg.source.as_ref().or_else(|| path_deps.get(&pkg.name)) {
+ // We failed to find a local package in the workspace.
+ // It must have been removed and should be ignored.
+ None => {
+ debug!("path dependency now missing {} v{}",
+ pkg.name,
+ pkg.version);
+ continue
+ }
+ Some(source) => PackageId::new(&pkg.name, &pkg.version, source)?
+ };
+
+ assert!(live_pkgs.insert(enc_id, (id, pkg)).is_none())
+ }
+ (live_pkgs, all_pkgs)
+ };
+
+ let lookup_id = |enc_id: &EncodablePackageId| -> CargoResult<Option<PackageId>> {
+ match live_pkgs.get(enc_id) {
+ Some(&(ref id, _)) => Ok(Some(id.clone())),
+ None => if all_pkgs.contains(enc_id) {
+ // Package is found in the lockfile, but it is
+ // no longer a member of the workspace.
+ Ok(None)
+ } else {
+ Err(internal(format!("package `{}` is specified as a dependency, \
+ but is missing from the package list", enc_id)))
+ }
+ }
+ };
+
+ let g = {
+ let mut g = Graph::new();
+
+ for &(ref id, _) in live_pkgs.values() {
+ g.add(id.clone(), &[]);
+ }
+
+ for &(ref id, pkg) in live_pkgs.values() {
+ let deps = match pkg.dependencies {
+ Some(ref deps) => deps,
+ None => continue
+ };
+
+ for edge in deps.iter() {
+ if let Some(to_depend_on) = lookup_id(edge)? {
+ g.link(id.clone(), to_depend_on);
+ }
+ }
+ }
+ g
+ };
+
+ let replacements = {
+ let mut replacements = HashMap::new();
+ for &(ref id, pkg) in live_pkgs.values() {
+ if let Some(ref replace) = pkg.replace {
+ assert!(pkg.dependencies.is_none());
+ if let Some(replace_id) = lookup_id(replace)? {
+ replacements.insert(id.clone(), replace_id);
+ }
+ }
+ }
+ replacements
+ };
+
+ let mut metadata = self.metadata.unwrap_or_default();
+
+ // Parse out all package checksums. After we do this we can be in a few
+ // situations:
+ //
+ // * We parsed no checksums. In this situation we're dealing with an old
+ // lock file and we're gonna fill them all in.
+ // * We parsed some checksums, but not one for all packages listed. It
+ // could have been the case that some were listed, then an older Cargo
+ // client added more dependencies, and now we're going to fill in the
+ // missing ones.
+ // * There are too many checksums listed, indicative of an older Cargo
+ // client removing a package but not updating the checksums listed.
+ //
+ // In all of these situations they're part of normal usage, so we don't
+ // really worry about it. We just try to slurp up as many checksums as
+ // possible.
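+ //
+ // Each such key looks like (illustrative value):
+ //
+ //     "checksum foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "<sha256 hex>"
+ //
+ // where the literal string "<none>" is stored when no checksum is known.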
+ let mut checksums = HashMap::new();
+ let prefix = "checksum ";
+ let mut to_remove = Vec::new();
+ for (k, v) in metadata.iter().filter(|p| p.0.starts_with(prefix)) {
+ to_remove.push(k.to_string());
+ let k = &k[prefix.len()..];
+ let enc_id: EncodablePackageId = k.parse().chain_err(|| {
+ internal("invalid encoding of checksum in lockfile")
+ })?;
+ let id = match lookup_id(&enc_id) {
+ Ok(Some(id)) => id,
+ _ => continue,
+ };
+
+ let v = if v == "<none>" {
+ None
+ } else {
+ Some(v.to_string())
+ };
+ checksums.insert(id, v);
+ }
+
+ for k in to_remove {
+ metadata.remove(&k);
+ }
+
+ let mut unused_patches = Vec::new();
+ for pkg in self.patch.unused {
+ let id = match pkg.source.as_ref().or_else(|| path_deps.get(&pkg.name)) {
+ Some(src) => PackageId::new(&pkg.name, &pkg.version, src)?,
+ None => continue,
+ };
+ unused_patches.push(id);
+ }
+
+ Ok(Resolve {
+ graph: g,
+ empty_features: HashSet::new(),
+ features: HashMap::new(),
+ replacements: replacements,
+ checksums: checksums,
+ metadata: metadata,
+ unused_patches: unused_patches,
+ })
+ }
+}
+
+fn build_path_deps(ws: &Workspace) -> HashMap<String, SourceId> {
+ // If a crate is *not* a path source, then we're probably in a situation
+ // such as `cargo install` with a lock file from a remote dependency. In
+ // that case we don't need to fix up any path dependencies (as they're not
+ // actually path dependencies any more), so we ignore them.
+ let members = ws.members().filter(|p| {
+ p.package_id().source_id().is_path()
+ }).collect::<Vec<_>>();
+
+ let mut ret = HashMap::new();
+ let mut visited = HashSet::new();
+ for member in members.iter() {
+ ret.insert(member.package_id().name().to_string(),
+ member.package_id().source_id().clone());
+ visited.insert(member.package_id().source_id().clone());
+ }
+ for member in members.iter() {
+ build_pkg(member, ws.config(), &mut ret, &mut visited);
+ }
+ for (_, deps) in ws.root_patch() {
+ for dep in deps {
+ build_dep(dep, ws.config(), &mut ret, &mut visited);
+ }
+ }
+ for &(_, ref dep) in ws.root_replace() {
+ build_dep(dep, ws.config(), &mut ret, &mut visited);
+ }
+
+ return ret;
+
+ fn build_pkg(pkg: &Package,
+ config: &Config,
+ ret: &mut HashMap<String, SourceId>,
+ visited: &mut HashSet<SourceId>) {
+ for dep in pkg.dependencies() {
+ build_dep(dep, config, ret, visited);
+ }
+ }
+
+ fn build_dep(dep: &Dependency,
+ config: &Config,
+ ret: &mut HashMap<String, SourceId>,
+ visited: &mut HashSet<SourceId>) {
+ let id = dep.source_id();
+ if visited.contains(id) || !id.is_path() {
+ return
+ }
+ let path = match id.url().to_file_path() {
+ Ok(p) => p.join("Cargo.toml"),
+ Err(_) => return,
+ };
+ let pkg = match Package::for_path(&path, config) {
+ Ok(p) => p,
+ Err(_) => return,
+ };
+ ret.insert(pkg.name().to_string(),
+ pkg.package_id().source_id().clone());
+ visited.insert(pkg.package_id().source_id().clone());
+ build_pkg(&pkg, config, ret, visited);
+ }
+}
+
+impl Patch {
+ fn is_empty(&self) -> bool {
+ self.unused.is_empty()
+ }
+}
+
+#[derive(Serialize, Deserialize, Debug, PartialOrd, Ord, PartialEq, Eq)]
+pub struct EncodableDependency {
+ name: String,
+ version: String,
+ source: Option<SourceId>,
+ dependencies: Option<Vec<EncodablePackageId>>,
+ replace: Option<EncodablePackageId>,
+}
+
+#[derive(Debug, PartialOrd, Ord, PartialEq, Eq, Hash, Clone)]
+pub struct EncodablePackageId {
+ name: String,
+ version: String,
+ source: Option<SourceId>
+}
+
+impl fmt::Display for EncodablePackageId {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "{} {}", self.name, self.version)?;
+ if let Some(ref s) = self.source {
+ write!(f, " ({})", s.to_url())?;
+ }
+ Ok(())
+ }
+}
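+
+// For example (a sketch): the impl above renders
+// `bar 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)`;
+// `FromStr` below parses exactly that shape back.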
+
+impl FromStr for EncodablePackageId {
+ type Err = CargoError;
+
+ fn from_str(s: &str) -> CargoResult<EncodablePackageId> {
+ let mut s = s.splitn(3, ' ');
+ let name = s.next().unwrap();
+ let version = s.next().ok_or_else(|| {
+ internal("invalid serialized PackageId")
+ })?;
+ let source_id = match s.next() {
+ Some(s) => {
+ if s.starts_with('(') && s.ends_with(')') {
+ Some(SourceId::from_url(&s[1..s.len() - 1])?)
+ } else {
+ bail!("invalid serialized PackageId")
+ }
+ }
+ None => None,
+ };
+
+ Ok(EncodablePackageId {
+ name: name.to_string(),
+ version: version.to_string(),
+ source: source_id
+ })
+ }
+}
+
+impl ser::Serialize for EncodablePackageId {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ s.collect_str(self)
+ }
+}
+
+impl<'de> de::Deserialize<'de> for EncodablePackageId {
+ fn deserialize<D>(d: D) -> Result<EncodablePackageId, D::Error>
+ where D: de::Deserializer<'de>,
+ {
+ String::deserialize(d).and_then(|string| {
+ string.parse::<EncodablePackageId>()
+ .map_err(de::Error::custom)
+ })
+ }
+}
+
+pub struct WorkspaceResolve<'a, 'cfg: 'a> {
+ pub ws: &'a Workspace<'cfg>,
+ pub resolve: &'a Resolve,
+}
+
+impl<'a, 'cfg> ser::Serialize for WorkspaceResolve<'a, 'cfg> {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ let mut ids: Vec<&PackageId> = self.resolve.graph.iter().collect();
+ ids.sort();
+
+ let encodable = ids.iter().map(|&id| {
+ encodable_resolve_node(id, self.resolve)
+ }).collect::<Vec<_>>();
+
+ let mut metadata = self.resolve.metadata.clone();
+
+ for id in ids.iter().filter(|id| !id.source_id().is_path()) {
+ let checksum = match self.resolve.checksums[*id] {
+ Some(ref s) => &s[..],
+ None => "<none>",
+ };
+ let id = encodable_package_id(id);
+ metadata.insert(format!("checksum {}", id.to_string()),
+ checksum.to_string());
+ }
+
+ let metadata = if metadata.is_empty() { None } else { Some(metadata) };
+
+ let patch = Patch {
+ unused: self.resolve.unused_patches().iter().map(|id| {
+ EncodableDependency {
+ name: id.name().to_string(),
+ version: id.version().to_string(),
+ source: encode_source(id.source_id()),
+ dependencies: None,
+ replace: None,
+ }
+ }).collect(),
+ };
+ EncodableResolve {
+ package: Some(encodable),
+ root: None,
+ metadata: metadata,
+ patch: patch,
+ }.serialize(s)
+ }
+}
+
+fn encodable_resolve_node(id: &PackageId, resolve: &Resolve)
+ -> EncodableDependency {
+ let (replace, deps) = match resolve.replacement(id) {
+ Some(id) => {
+ (Some(encodable_package_id(id)), None)
+ }
+ None => {
+ let mut deps = resolve.graph.edges(id)
+ .into_iter().flat_map(|a| a)
+ .map(encodable_package_id)
+ .collect::<Vec<_>>();
+ deps.sort();
+ (None, Some(deps))
+ }
+ };
+
+ EncodableDependency {
+ name: id.name().to_string(),
+ version: id.version().to_string(),
+ source: encode_source(id.source_id()),
+ dependencies: deps,
+ replace: replace,
+ }
+}
+
+fn encodable_package_id(id: &PackageId) -> EncodablePackageId {
+ EncodablePackageId {
+ name: id.name().to_string(),
+ version: id.version().to_string(),
+ source: encode_source(id.source_id()).map(|s| s.with_precise(None)),
+ }
+}
+
+fn encode_source(id: &SourceId) -> Option<SourceId> {
+ if id.is_path() {
+ None
+ } else {
+ Some(id.clone())
+ }
+}
--- /dev/null
+//! Resolution of the entire dependency graph for a crate
+//!
+//! This module implements the core logic in taking the world of crates and
+//! constraints and creating a resolved graph with locked versions for all
+//! crates and their dependencies. This is separate from the registry module,
+//! which is concerned with discovering crates from various sources; this
+//! module just uses the Registry trait as its source of crates.
+//!
+//! Actually solving a constraint graph is an NP-hard problem. This algorithm
+//! is basically a nice heuristic to make sure we get roughly the best answer
+//! most of the time. The constraints that we're working with are:
+//!
+//! 1. Each crate can have any number of dependencies. Each dependency can
+//! declare a version range that it is compatible with.
+//! 2. Crates can be activated with multiple versions (e.g. show up in the
+//! dependency graph twice) so long as each pair of activated versions is
+//! semver-incompatible.
+//!
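+//! For example (a sketch): if one crate depends on `foo = "0.1"` and another
+//! on `foo = "0.2"`, a `0.1.x` and a `0.2.y` version of `foo` may both be
+//! activated, but two semver-compatible versions such as `foo 0.1.2` and
+//! `foo 0.1.3` may never coexist in the graph.
+//!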
+//! The algorithm employed here is fairly simple: we do a DFS, activating
+//! the "newest crate" (highest version) first and then going to the next
+//! option. The heuristics we employ are:
+//!
+//! * Never try to activate a crate version which is incompatible. This means we
+//! only try crates which will actually satisfy a dependency and we won't ever
+//! try to activate a crate that's semver compatible with something else
+//! activated (as we're only allowed to have one).
+//! * Always try to activate the highest version crate first. The default
+//! dependency in Cargo (e.g. when you write `foo = "0.1.2"`) is
+//! semver-compatible, so selecting the highest possible version will
+//! hopefully let us satisfy as many dependencies at once as possible.
+//!
+//! Beyond that, what's implemented below is just a naive backtracking version
+//! which should in theory try all possible combinations of dependencies and
+//! versions to see if one works. The first resolution that works causes
+//! everything to bail out immediately and return success, and only if *nothing*
+//! works do we actually return an error up the stack.
+//!
+//! ## Performance
+//!
+//! Note that this is a relatively performance-critical portion of Cargo. The
+//! data that we're processing is proportional to the size of the dependency
+//! graph, which can often be quite large (e.g. take a look at Servo). To make
+//! matters worse, the DFS algorithm we've implemented is inherently quite
+//! inefficient. When we add the requirement of backtracking on top it means
+//! that we're implementing something that probably shouldn't be allocating all
+//! over the place.
+
+use std::cmp::Ordering;
+use std::collections::{HashSet, HashMap, BinaryHeap, BTreeMap};
+use std::fmt;
+use std::iter::FromIterator;
+use std::ops::Range;
+use std::rc::Rc;
+use std::time::{Instant, Duration};
+
+use semver;
+use url::Url;
+
+use core::{PackageId, Registry, SourceId, Summary, Dependency};
+use core::PackageIdSpec;
+use util::config::Config;
+use util::Graph;
+use util::errors::{CargoResult, CargoError};
+use util::profile;
+use util::graph::{Nodes, Edges};
+
+pub use self::encode::{EncodableResolve, EncodableDependency, EncodablePackageId};
+pub use self::encode::{Metadata, WorkspaceResolve};
+
+mod encode;
+
+/// Represents a fully resolved package dependency graph. Each node in the graph
+/// is a package and edges represent dependencies between packages.
+///
+/// Each instance of `Resolve` also understands the full set of features used
+/// for each package.
+#[derive(PartialEq)]
+pub struct Resolve {
+ graph: Graph<PackageId>,
+ replacements: HashMap<PackageId, PackageId>,
+ empty_features: HashSet<String>,
+ features: HashMap<PackageId, HashSet<String>>,
+ checksums: HashMap<PackageId, Option<String>>,
+ metadata: Metadata,
+ unused_patches: Vec<PackageId>,
+}
+
+pub struct Deps<'a> {
+ edges: Option<Edges<'a, PackageId>>,
+ resolve: &'a Resolve,
+}
+
+pub struct DepsNotReplaced<'a> {
+ edges: Option<Edges<'a, PackageId>>,
+}
+
+#[derive(Clone, Copy)]
+pub enum Method<'a> {
+ Everything,
+ Required {
+ dev_deps: bool,
+ features: &'a [String],
+ uses_default_features: bool,
+ },
+}
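+
+// As an illustrative mapping, resolving with `--no-default-features
+// --features bar` roughly corresponds to `Method::Required { features:
+// &["bar".to_string()], uses_default_features: false, .. }`.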
+
+// Information about the dependencies for a crate, a tuple of:
+//
+// (dependency info, candidates, features activated)
+type DepInfo = (Dependency, Rc<Vec<Candidate>>, Rc<Vec<String>>);
+
+#[derive(Clone)]
+struct Candidate {
+ summary: Summary,
+ replace: Option<Summary>,
+}
+
+impl Resolve {
+ /// Resolves one of the paths from the given dependent package up to
+ /// the root.
+ pub fn path_to_top(&self, pkg: &PackageId) -> Vec<&PackageId> {
+ let mut result = Vec::new();
+ let mut pkg = pkg;
+ while let Some(pulling) = self.graph
+ .get_nodes()
+ .iter()
+ .filter_map(|(pulling, pulled)|
+ if pulled.contains(pkg) {
+ Some(pulling)
+ } else {
+ None
+ })
+ .next() {
+ result.push(pulling);
+ pkg = pulling;
+ }
+ result
+ }
+
+ pub fn register_used_patches(&mut self,
+ patches: &HashMap<Url, Vec<Summary>>) {
+ for summary in patches.values().flat_map(|v| v) {
+ if self.iter().any(|id| id == summary.package_id()) {
+ continue
+ }
+ self.unused_patches.push(summary.package_id().clone());
+ }
+ }
+
+ pub fn merge_from(&mut self, previous: &Resolve) -> CargoResult<()> {
+ // Given a previous instance of resolve, it should be forbidden to ever
+ // have checksums which *differ*. If the same package id has differing
+ // checksums, then something has gone wrong, such as:
+ //
+ // * Something got seriously corrupted
+ // * A "mirror" isn't actually a mirror, as some changes were made
+ // * A replacement source wasn't actually a replacement, as some changes
+ // were made
+ //
+ // In all of these cases, we want to report an error to indicate that
+ // something is awry. Normal execution (esp just using crates.io) should
+ // never run into this.
+ for (id, cksum) in previous.checksums.iter() {
+ if let Some(mine) = self.checksums.get(id) {
+ if mine == cksum {
+ continue
+ }
+
+ // If the previous checksum wasn't calculated, the current
+ // checksum is `Some`. This may indicate that a source was
+ // erroneously replaced or was replaced with something that
+ // desires stronger checksum guarantees than can be afforded
+ // elsewhere.
+ if cksum.is_none() {
+ bail!("\
+checksum for `{}` was not previously calculated, but a checksum could now \
+be calculated
+
+this could be indicative of a few possible situations:
+
+ * the source `{}` did not previously support checksums,
+ but was replaced with one that does
+ * newer Cargo implementations know how to checksum this source, but this
+ older implementation does not
+ * the lock file is corrupt
+", id, id.source_id())
+
+ // If our checksum hasn't been calculated, then it could mean
+ // that a future Cargo figured out how to checksum something, or,
+ // more realistically, that we were overridden with a source that does
+ // not have checksums.
+ } else if mine.is_none() {
+ bail!("\
+checksum for `{}` could not be calculated, but a checksum is listed in \
+the existing lock file
+
+this could be indicative of a few possible situations:
+
+ * the source `{}` supports checksums,
+ but was replaced with one that doesn't
+ * the lock file is corrupt
+
+unable to verify that `{0}` is the same as when the lockfile was generated
+", id, id.source_id())
+
+ // If the checksums aren't equal, and neither is None, then they
+ // must both be Some, in which case the checksum now differs.
+ // That's quite bad!
+ } else {
+ bail!("\
+checksum for `{}` changed between lock files
+
+this could be indicative of a few possible errors:
+
+ * the lock file is corrupt
+ * a replacement source in use (e.g. a mirror) returned a different checksum
+ * the source itself may be corrupt in one way or another
+
+unable to verify that `{0}` is the same as when the lockfile was generated
+", id);
+ }
+ }
+ }
+
+ // Be sure to just copy over any unknown metadata.
+ self.metadata = previous.metadata.clone();
+ Ok(())
+ }
+
+ pub fn iter(&self) -> Nodes<PackageId> {
+ self.graph.iter()
+ }
+
+ pub fn deps(&self, pkg: &PackageId) -> Deps {
+ Deps { edges: self.graph.edges(pkg), resolve: self }
+ }
+
+ pub fn deps_not_replaced(&self, pkg: &PackageId) -> DepsNotReplaced {
+ DepsNotReplaced { edges: self.graph.edges(pkg) }
+ }
+
+ pub fn replacement(&self, pkg: &PackageId) -> Option<&PackageId> {
+ self.replacements.get(pkg)
+ }
+
+ pub fn replacements(&self) -> &HashMap<PackageId, PackageId> {
+ &self.replacements
+ }
+
+ pub fn features(&self, pkg: &PackageId) -> &HashSet<String> {
+ self.features.get(pkg).unwrap_or(&self.empty_features)
+ }
+
+ pub fn features_sorted(&self, pkg: &PackageId) -> Vec<&str> {
+ let mut v = Vec::from_iter(self.features(pkg).iter().map(|s| s.as_ref()));
+ v.sort();
+ v
+ }
+
+ pub fn query(&self, spec: &str) -> CargoResult<&PackageId> {
+ PackageIdSpec::query_str(spec, self.iter())
+ }
+
+ pub fn unused_patches(&self) -> &[PackageId] {
+ &self.unused_patches
+ }
+}
+
+impl fmt::Debug for Resolve {
+ fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
+ write!(fmt, "graph: {:?}\n", self.graph)?;
+ write!(fmt, "\nfeatures: {{\n")?;
+ for (pkg, features) in &self.features {
+ write!(fmt, " {}: {:?}\n", pkg, features)?;
+ }
+ write!(fmt, "}}")
+ }
+}
+
+impl<'a> Iterator for Deps<'a> {
+ type Item = &'a PackageId;
+
+ fn next(&mut self) -> Option<&'a PackageId> {
+ self.edges.as_mut()
+ .and_then(|e| e.next())
+ .map(|id| self.resolve.replacement(id).unwrap_or(id))
+ }
+}
+
+impl<'a> Iterator for DepsNotReplaced<'a> {
+ type Item = &'a PackageId;
+
+ fn next(&mut self) -> Option<&'a PackageId> {
+ self.edges.as_mut().and_then(|e| e.next())
+ }
+}
+
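+// A persistent, immutable, singly-linked list. `push` allocates a new head
+// node in front of the existing list, so `clone` is O(1) and clones share
+// their tail. An illustrative use:
+//
+// let mut a = RcList::new();
+// a.push(1);
+// let b = a.clone(); // shares the node containing `1` with `a`
+// a.push(2); // pushing onto `a` does not affect `b`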
+struct RcList<T> {
+ head: Option<Rc<(T, RcList<T>)>>
+}
+
+impl<T> RcList<T> {
+ fn new() -> RcList<T> {
+ RcList { head: None }
+ }
+
+ fn push(&mut self, data: T) {
+ let node = Rc::new((data, RcList { head: self.head.take() }));
+ self.head = Some(node);
+ }
+}
+
+// Not derived to avoid `T: Clone`
+impl<T> Clone for RcList<T> {
+ fn clone(&self) -> RcList<T> {
+ RcList { head: self.head.clone() }
+ }
+}
+
+// Avoid stack overflows on drop by turning recursion into a loop
+impl<T> Drop for RcList<T> {
+ fn drop(&mut self) {
+ let mut cur = self.head.take();
+ while let Some(head) = cur {
+ match Rc::try_unwrap(head) {
+ Ok((_data, mut next)) => cur = next.head.take(),
+ Err(_) => break,
+ }
+ }
+ }
+}
+
+enum GraphNode {
+ Add(PackageId),
+ Link(PackageId, PackageId),
+}
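+
+// The resolver records graph mutations as a cheap `RcList<GraphNode>`
+// (`Context::resolve_graph`) and replays them in `Context::graph` once
+// resolution succeeds; e.g. `Add(foo)` followed by `Link(foo, bar)` yields a
+// graph in which package `foo` depends on `bar`.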
+
+// A `Context` is basically a bunch of local resolution information which is
+// kept around for all `BacktrackFrame` instances. As a result, this runs the
+// risk of being cloned *a lot* so we want to make this as cheap to clone as
+// possible.
+#[derive(Clone)]
+struct Context<'a> {
+ // TODO: Both this and the map below are super expensive to clone. We should
+ // switch to persistent hash maps if we can at some point or otherwise
+ // make these much cheaper to clone in general.
+ activations: Activations,
+ resolve_features: HashMap<PackageId, HashSet<String>>,
+
+ // These are two cheaply-cloneable lists (O(1) clone) which are effectively
+ // hash maps but are built up as "construction lists". We'll iterate these
+ // at the very end and actually construct the map that we're making.
+ resolve_graph: RcList<GraphNode>,
+ resolve_replacements: RcList<(PackageId, PackageId)>,
+
+ replacements: &'a [(PackageIdSpec, Dependency)],
+
+ // These warnings are printed after resolution.
+ warnings: RcList<String>,
+}
+
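+// Map of crate name -> source -> versions (summaries) activated for that
+// crate from that source, e.g. (illustrative)
+// `activations["log"][crates_io]` might hold the summaries for `log 0.3.9`
+// and `log 0.4.1`.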
+type Activations = HashMap<String, HashMap<SourceId, Vec<Summary>>>;
+
+/// Builds the list of all packages required to build the first argument.
+pub fn resolve(summaries: &[(Summary, Method)],
+ replacements: &[(PackageIdSpec, Dependency)],
+ registry: &mut Registry,
+ config: Option<&Config>,
+ print_warnings: bool) -> CargoResult<Resolve> {
+ let cx = Context {
+ resolve_graph: RcList::new(),
+ resolve_features: HashMap::new(),
+ resolve_replacements: RcList::new(),
+ activations: HashMap::new(),
+ replacements: replacements,
+ warnings: RcList::new(),
+ };
+ let _p = profile::start("resolving");
+ let cx = activate_deps_loop(cx, registry, summaries, config)?;
+
+ let mut resolve = Resolve {
+ graph: cx.graph(),
+ empty_features: HashSet::new(),
+ checksums: HashMap::new(),
+ metadata: BTreeMap::new(),
+ replacements: cx.resolve_replacements(),
+ features: cx.resolve_features.iter().map(|(k, v)| {
+ (k.clone(), v.clone())
+ }).collect(),
+ unused_patches: Vec::new(),
+ };
+
+ for summary in cx.activations.values()
+ .flat_map(|v| v.values())
+ .flat_map(|v| v.iter()) {
+ let cksum = summary.checksum().map(|s| s.to_string());
+ resolve.checksums.insert(summary.package_id().clone(), cksum);
+ }
+
+ check_cycles(&resolve, &cx.activations)?;
+ trace!("resolved: {:?}", resolve);
+
+ // If we have a shell, emit warnings about required deps used as features.
+ if let Some(config) = config {
+ if print_warnings {
+ let mut shell = config.shell();
+ let mut warnings = &cx.warnings;
+ while let Some(ref head) = warnings.head {
+ shell.warn(&head.0)?;
+ warnings = &head.1;
+ }
+ }
+ }
+
+ Ok(resolve)
+}
+
+/// Attempts to activate the summary `candidate` in the context `cx`.
+///
+/// This function will pull dependency summaries from the registry provided, and
+/// the dependencies of the package will be determined by the `method` provided.
+/// If `candidate` was activated, this function returns the dependency frame to
+/// iterate through next.
+fn activate(cx: &mut Context,
+ registry: &mut Registry,
+ parent: Option<&Summary>,
+ candidate: Candidate,
+ method: &Method)
+ -> CargoResult<Option<(DepsFrame, Duration)>> {
+ if let Some(parent) = parent {
+ cx.resolve_graph.push(GraphNode::Link(parent.package_id().clone(),
+ candidate.summary.package_id().clone()));
+ }
+
+ let activated = cx.flag_activated(&candidate.summary, method);
+
+ let candidate = match candidate.replace {
+ Some(replace) => {
+ cx.resolve_replacements.push((candidate.summary.package_id().clone(),
+ replace.package_id().clone()));
+ if cx.flag_activated(&replace, method) && activated {
+ return Ok(None);
+ }
+ trace!("activating {} (replacing {})", replace.package_id(),
+ candidate.summary.package_id());
+ replace
+ }
+ None => {
+ if activated {
+ return Ok(None)
+ }
+ trace!("activating {}", candidate.summary.package_id());
+ candidate.summary
+ }
+ };
+
+ let now = Instant::now();
+ let deps = cx.build_deps(registry, &candidate, method)?;
+ let frame = DepsFrame {
+ parent: candidate,
+ remaining_siblings: RcVecIter::new(Rc::new(deps)),
+ };
+ Ok(Some((frame, now.elapsed())))
+}
+
+struct RcVecIter<T> {
+ vec: Rc<Vec<T>>,
+ rest: Range<usize>,
+}
+
+impl<T> RcVecIter<T> {
+ fn new(vec: Rc<Vec<T>>) -> RcVecIter<T> {
+ RcVecIter {
+ rest: 0..vec.len(),
+ vec: vec,
+ }
+ }
+
+ fn cur_index(&self) -> usize {
+ self.rest.start - 1
+ }
+}
+
+// Not derived to avoid `T: Clone`
+impl<T> Clone for RcVecIter<T> {
+ fn clone(&self) -> RcVecIter<T> {
+ RcVecIter {
+ vec: self.vec.clone(),
+ rest: self.rest.clone(),
+ }
+ }
+}
+
+impl<T> Iterator for RcVecIter<T> where T: Clone {
+ type Item = (usize, T);
+
+ fn next(&mut self) -> Option<(usize, T)> {
+ self.rest.next().and_then(|i| {
+ self.vec.get(i).map(|val| (i, val.clone()))
+ })
+ }
+
+ fn size_hint(&self) -> (usize, Option<usize>) {
+ self.rest.size_hint()
+ }
+}
+
+#[derive(Clone)]
+struct DepsFrame {
+ parent: Summary,
+ remaining_siblings: RcVecIter<DepInfo>,
+}
+
+impl DepsFrame {
+ /// Returns the least number of candidates that any of this frame's siblings
+ /// has.
+ ///
+ /// The `remaining_siblings` array is already sorted with the smallest
+ /// number of candidates at the front, so we just return the number of
+ /// candidates in that entry.
+ fn min_candidates(&self) -> usize {
+ self.remaining_siblings.clone().next().map(|(_, (_, candidates, _))| {
+ candidates.len()
+ }).unwrap_or(0)
+ }
+}
+
+impl PartialEq for DepsFrame {
+ fn eq(&self, other: &DepsFrame) -> bool {
+ self.min_candidates() == other.min_candidates()
+ }
+}
+
+impl Eq for DepsFrame {}
+
+impl PartialOrd for DepsFrame {
+ fn partial_cmp(&self, other: &DepsFrame) -> Option<Ordering> {
+ Some(self.cmp(other))
+ }
+}
+
+impl Ord for DepsFrame {
+ fn cmp(&self, other: &DepsFrame) -> Ordering {
+ // the frame with the sibling that has the least number of candidates
+ // needs to get bubbled up to the top of the heap we use below, so
+ // reverse the order of the comparison here.
+ other.min_candidates().cmp(&self.min_candidates())
+ }
+}
+
+struct BacktrackFrame<'a> {
+ context_backup: Context<'a>,
+ deps_backup: BinaryHeap<DepsFrame>,
+ remaining_candidates: RemainingCandidates,
+ parent: Summary,
+ dep: Dependency,
+ features: Rc<Vec<String>>,
+}
+
+#[derive(Clone)]
+struct RemainingCandidates {
+ remaining: RcVecIter<Candidate>,
+}
+
+impl RemainingCandidates {
+ fn next(&mut self, prev_active: &[Summary]) -> Option<Candidate> {
+ // Filter the set of candidates based on the previously activated
+ // versions for this dependency. We can actually use a version if it
+ // precisely matches an activated version or if it is otherwise
+ // incompatible with all other activated versions. Note that we
+ // define "compatible" here in terms of the semver sense where if
+ // the left-most nonzero digit is the same they're considered
+ // compatible.
+ self.remaining.by_ref().map(|p| p.1).find(|b| {
+ prev_active.iter().any(|a| *a == b.summary) ||
+ prev_active.iter().all(|a| {
+ !compatible(a.version(), b.summary.version())
+ })
+ })
+ }
+}
+
+/// Recursively activates the dependencies of the given `summaries`, in
+/// depth-first order, backtracking across possible candidates for each
+/// dependency as necessary.
+///
+/// If all dependencies can be activated and resolved to a version in the
+/// dependency graph, the resulting `Context` is returned.
+fn activate_deps_loop<'a>(mut cx: Context<'a>,
+ registry: &mut Registry,
+ summaries: &[(Summary, Method)],
+ config: Option<&Config>)
+ -> CargoResult<Context<'a>> {
+ // Note that a `BinaryHeap` is used for the remaining dependencies that need
+ // activation. This heap is sorted such that the "largest value" is the most
+ // constrained dependency, or the one with the least candidates.
+ //
+ // This helps us get through super constrained portions of the dependency
+ // graph quickly and hopefully lock down what later larger dependencies can
+ // use (those with more candidates).
+ let mut backtrack_stack = Vec::new();
+ let mut remaining_deps = BinaryHeap::new();
+ for &(ref summary, ref method) in summaries {
+ debug!("initial activation: {}", summary.package_id());
+ let candidate = Candidate { summary: summary.clone(), replace: None };
+ let res = activate(&mut cx, registry, None, candidate, method)?;
+ if let Some((frame, _)) = res {
+ remaining_deps.push(frame);
+ }
+ }
+
+ let mut ticks = 0;
+ let start = Instant::now();
+ let time_to_print = Duration::from_millis(500);
+ let mut printed = false;
+ let mut deps_time = Duration::new(0, 0);
+
+ // Main resolution loop, this is the workhorse of the resolution algorithm.
+ //
+ // You'll note that a few stacks are maintained on the side, which might
+ // seem odd when this algorithm looks like it could be implemented
+ // recursively. While correct, this is implemented iteratively to avoid
+ // blowing the stack (the recursion depth is proportional to the size of the
+ // input).
+ //
+ // The general sketch of this loop is to run until there are no dependencies
+ // left to activate, and for each dependency to attempt to activate all of
+ // its own dependencies in turn. The `backtrack_stack` is a side table of
+ // backtracking states where if we hit an error we can return to in order to
+ // attempt to continue resolving.
+ while let Some(mut deps_frame) = remaining_deps.pop() {
+
+ // If we spend a lot of time here (we shouldn't in most cases) then give
+ // a bit of a visual indicator as to what we're doing. Only enable this
+ // when stderr is a tty (a human is likely to be watching) to ensure we
+ // get deterministic output otherwise when observed by tools.
+ //
+ // Also note that we hit this loop a lot, so it's fairly performance
+ // sensitive. As a result try to defer a possibly expensive operation
+ // like `Instant::now` by only checking every N iterations of this loop
+ // to amortize the cost of the current time lookup.
+ ticks += 1;
+ if let Some(config) = config {
+ if config.shell().is_err_tty() &&
+ !printed &&
+ ticks % 1000 == 0 &&
+ start.elapsed() - deps_time > time_to_print
+ {
+ printed = true;
+ config.shell().status("Resolving", "dependency graph...")?;
+ }
+ }
+
+ let frame = match deps_frame.remaining_siblings.next() {
+ Some(sibling) => {
+ let parent = Summary::clone(&deps_frame.parent);
+ remaining_deps.push(deps_frame);
+ (parent, sibling)
+ }
+ None => continue,
+ };
+ let (mut parent, (mut cur, (mut dep, candidates, mut features))) = frame;
+ assert!(!remaining_deps.is_empty());
+
+ let (next, has_another, remaining_candidates) = {
+ let prev_active = cx.prev_active(&dep);
+ trace!("{}[{}]>{} {} candidates", parent.name(), cur, dep.name(),
+ candidates.len());
+ trace!("{}[{}]>{} {} prev activations", parent.name(), cur,
+ dep.name(), prev_active.len());
+ let mut candidates = RemainingCandidates {
+ remaining: RcVecIter::new(Rc::clone(&candidates)),
+ };
+ (candidates.next(prev_active),
+ candidates.clone().next(prev_active).is_some(),
+ candidates)
+ };
+
+ // Alright, for each candidate that's gotten this far, it meets the
+ // following requirements:
+ //
+ // 1. The version matches the dependency requirement listed for this
+ // package
+ // 2. There are no activated versions for this package which are
+ // semver-compatible, or there's an activated version which is
+ // precisely equal to `candidate`.
+ //
+ // This means that we're going to attempt to activate each candidate in
+ // turn. We could possibly fail to activate each candidate, so we try
+ // each one in turn.
+ let candidate = match next {
+ Some(candidate) => {
+ // We have a candidate. Add an entry to the `backtrack_stack` so
+ // we can try the next one if this one fails.
+ if has_another {
+ backtrack_stack.push(BacktrackFrame {
+ context_backup: Context::clone(&cx),
+ deps_backup: <BinaryHeap<DepsFrame>>::clone(&remaining_deps),
+ remaining_candidates: remaining_candidates,
+ parent: Summary::clone(&parent),
+ dep: Dependency::clone(&dep),
+ features: Rc::clone(&features),
+ });
+ }
+ candidate
+ }
+ None => {
+ // This dependency has no valid candidate. Backtrack until we
+ // find a dependency that does have a candidate to try, and try
+ // to activate that one. This resets the `remaining_deps` to
+ // their state at the found level of the `backtrack_stack`.
+ trace!("{}[{}]>{} -- no candidates", parent.name(), cur,
+ dep.name());
+ match find_candidate(&mut backtrack_stack,
+ &mut cx,
+ &mut remaining_deps,
+ &mut parent,
+ &mut cur,
+ &mut dep,
+ &mut features) {
+ None => return Err(activation_error(&cx, registry, &parent,
+ &dep,
+ cx.prev_active(&dep),
+ &candidates)),
+ Some(candidate) => candidate,
+ }
+ }
+ };
+
+ let method = Method::Required {
+ dev_deps: false,
+ features: &features,
+ uses_default_features: dep.uses_default_features(),
+ };
+ trace!("{}[{}]>{} trying {}", parent.name(), cur, dep.name(),
+ candidate.summary.version());
+ let res = activate(&mut cx, registry, Some(&parent), candidate, &method)?;
+ if let Some((frame, dur)) = res {
+ remaining_deps.push(frame);
+ deps_time += dur;
+ }
+ }
+
+ Ok(cx)
+}
+
+// Searches up `backtrack_stack` until it finds a dependency with remaining
+// candidates. Resets `cx` and `remaining_deps` to that level and returns the
+// next candidate. If all candidates have been exhausted, returns None.
+fn find_candidate<'a>(backtrack_stack: &mut Vec<BacktrackFrame<'a>>,
+ cx: &mut Context<'a>,
+ remaining_deps: &mut BinaryHeap<DepsFrame>,
+ parent: &mut Summary,
+ cur: &mut usize,
+ dep: &mut Dependency,
+ features: &mut Rc<Vec<String>>) -> Option<Candidate> {
+ while let Some(mut frame) = backtrack_stack.pop() {
+ let (next, has_another) = {
+ let prev_active = frame.context_backup.prev_active(&frame.dep);
+ (frame.remaining_candidates.next(prev_active),
+ frame.remaining_candidates.clone().next(prev_active).is_some())
+ };
+ if let Some(candidate) = next {
+ if has_another {
+ *cx = frame.context_backup.clone();
+ *remaining_deps = frame.deps_backup.clone();
+ *parent = frame.parent.clone();
+ *dep = frame.dep.clone();
+ *features = Rc::clone(&frame.features);
+ backtrack_stack.push(frame);
+ } else {
+ *cx = frame.context_backup;
+ *remaining_deps = frame.deps_backup;
+ *parent = frame.parent;
+ *dep = frame.dep;
+ *features = frame.features;
+ }
+ *cur = remaining_deps.peek().unwrap().remaining_siblings.cur_index();
+ return Some(candidate)
+ }
+ }
+ None
+}
+
+fn activation_error(cx: &Context,
+ registry: &mut Registry,
+ parent: &Summary,
+ dep: &Dependency,
+ prev_active: &[Summary],
+ candidates: &[Candidate]) -> CargoError {
+ if !candidates.is_empty() {
+ let mut msg = format!("failed to select a version for `{}` \
+ (required by `{}`):\n\
+ all possible versions conflict with \
+ previously selected versions of `{}`",
+ dep.name(), parent.name(),
+ dep.name());
+ let graph = cx.graph();
+ 'outer: for v in prev_active.iter() {
+ for node in graph.iter() {
+ let edges = match graph.edges(node) {
+ Some(edges) => edges,
+ None => continue,
+ };
+ for edge in edges {
+ if edge != v.package_id() { continue }
+
+ msg.push_str(&format!("\n version {} in use by {}",
+ v.version(), edge));
+ continue 'outer;
+ }
+ }
+ msg.push_str(&format!("\n version {} in use by ??",
+ v.version()));
+ }
+
+ msg.push_str(&format!("\n possible versions to select: {}",
+ candidates.iter()
+ .map(|v| v.summary.version())
+ .map(|v| v.to_string())
+ .collect::<Vec<_>>()
+ .join(", ")));
+
+ return msg.into()
+ }
+
+ // Once we're all the way down here, we're definitely lost in the
+ // weeds! We didn't actually use any candidates above, so we need to
+ // give an error message that nothing was found.
+ //
+ // Note that we re-query the registry with a new dependency that
+ // allows any version so we can give some nicer error reporting
+ // which indicates a few versions that were actually found.
+ let all_req = semver::VersionReq::parse("*").unwrap();
+ let mut new_dep = dep.clone();
+ new_dep.set_version_req(all_req);
+ let mut candidates = match registry.query_vec(&new_dep) {
+ Ok(candidates) => candidates,
+ Err(e) => return e,
+ };
+ candidates.sort_by(|a, b| {
+ b.version().cmp(a.version())
+ });
+
+ let msg = if !candidates.is_empty() {
+ let versions = {
+ let mut versions = candidates.iter().take(3).map(|cand| {
+ cand.version().to_string()
+ }).collect::<Vec<_>>();
+
+ if candidates.len() > 3 {
+ versions.push("...".into());
+ }
+
+ versions.join(", ")
+ };
+
+ let mut msg = format!("no matching version `{}` found for package `{}` \
+ (required by `{}`)\n\
+ location searched: {}\n\
+ versions found: {}",
+ dep.version_req(),
+ dep.name(),
+ parent.name(),
+ dep.source_id(),
+ versions);
+
+ // If we have a path dependency with a locked version, then this may
+ // indicate that we updated a sub-package and forgot to run `cargo
+ // update`. In this case try to print a helpful error!
+ if dep.source_id().is_path()
+ && dep.version_req().to_string().starts_with('=') {
+ msg.push_str("\nconsider running `cargo update` to update \
+ a path dependency's locked version");
+ }
+
+ msg
+ } else {
+ format!("no matching package named `{}` found \
+ (required by `{}`)\n\
+ location searched: {}\n\
+ version required: {}",
+ dep.name(), parent.name(),
+ dep.source_id(),
+ dep.version_req())
+ };
+
+ msg.into()
+}
+
+// Returns whether `a` and `b` are compatible in the semver sense. This is a
+// commutative operation.
+//
+// Versions `a` and `b` are compatible if their left-most nonzero part (major,
+// minor, or patch) is the same.
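+//
+// For example, 1.2.3 and 1.4.0 are compatible (equal nonzero majors);
+// 0.1.2 and 0.2.0 are not (major is 0 and the minors differ); 0.0.3 and
+// 0.0.4 are not (only the patch is nonzero and the patches differ).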
+fn compatible(a: &semver::Version, b: &semver::Version) -> bool {
+ if a.major != b.major { return false }
+ if a.major != 0 { return true }
+ if a.minor != b.minor { return false }
+ if a.minor != 0 { return true }
+ a.patch == b.patch
+}
+
+struct Requirements<'a> {
+ summary: &'a Summary,
+ // The deps map is a mapping of package name to list of features enabled.
+ // Each package should be enabled, and each package should have the
+ // specified set of features enabled. The boolean indicates whether this
+ // package was specifically requested (rather than just requesting features
+ // *within* this package).
+ deps: HashMap<&'a str, (bool, Vec<String>)>,
+ // The used features set is the set of features enabled for this package,
+ // which is later passed to the compiler to tell the code which features
+ // were enabled.
+ used: HashSet<&'a str>,
+ visited: HashSet<&'a str>,
+}
+
+impl<'r> Requirements<'r> {
+ fn new<'a>(summary: &'a Summary) -> Requirements<'a> {
+ Requirements {
+ summary,
+ deps: HashMap::new(),
+ used: HashSet::new(),
+ visited: HashSet::new(),
+ }
+ }
+
+ fn require_crate_feature(&mut self, package: &'r str, feat: &'r str) {
+ self.used.insert(package);
+ self.deps.entry(package)
+ .or_insert((false, Vec::new()))
+ .1.push(feat.to_string());
+ }
+
+ fn seen(&mut self, feat: &'r str) -> bool {
+ if self.visited.insert(feat) {
+ self.used.insert(feat);
+ false
+ } else {
+ true
+ }
+ }
+
+ fn require_dependency(&mut self, pkg: &'r str) {
+ if self.seen(pkg) {
+ return;
+ }
+ self.deps.entry(pkg).or_insert((false, Vec::new())).0 = true;
+ }
+
+ fn require_feature(&mut self, feat: &'r str) -> CargoResult<()> {
+ if self.seen(feat) {
+ return Ok(());
+ }
+ for f in self.summary.features().get(feat).expect("must be a valid feature") {
+ if f == &feat {
+ bail!("Cyclic feature dependency: feature `{}` depends on itself", feat);
+ }
+ self.add_feature(f)?;
+ }
+ Ok(())
+ }
+
+ fn add_feature(&mut self, feat: &'r str) -> CargoResult<()> {
+ if feat.is_empty() { return Ok(()) }
+
+ // If this feature is of the form `foo/bar`, then we just look up package
+ // `foo` and enable its feature `bar`. Otherwise this feature is of the
+ // form `foo` and we need to recurse to enable the feature `foo` for our
+ // own package, which may end up enabling more features or just enabling
+ // a dependency.
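+ //
+ // For example, `serde/derive` enables feature `derive` of the dependency
+ // `serde` (an illustrative name), while a plain `default` either recurses
+ // into this package's own feature table or enables an optional dependency
+ // of that name.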
+ let mut parts = feat.splitn(2, '/');
+ let feat_or_package = parts.next().unwrap();
+ match parts.next() {
+ Some(feat) => {
+ self.require_crate_feature(feat_or_package, feat);
+ }
+ None => {
+ if self.summary.features().contains_key(feat_or_package) {
+ self.require_feature(feat_or_package)?;
+ } else {
+ self.require_dependency(feat_or_package);
+ }
+ }
+ }
+ Ok(())
+ }
+}
+
+// Takes requested features for a single package from the input Method and
+// recurses to find all requested features, dependencies and requested
+// dependency features in a Requirements object, returning it to the resolver.
+fn build_requirements<'a, 'b: 'a>(s: &'a Summary, method: &'b Method)
+ -> CargoResult<Requirements<'a>> {
+ let mut reqs = Requirements::new(s);
+ match *method {
+ Method::Everything => {
+ for key in s.features().keys() {
+ reqs.require_feature(key)?;
+ }
+ for dep in s.dependencies().iter().filter(|d| d.is_optional()) {
+ reqs.require_dependency(dep.name());
+ }
+ }
+ Method::Required { features: requested_features, .. } => {
+ for feat in requested_features.iter() {
+ reqs.add_feature(feat)?;
+ }
+ }
+ }
+ match *method {
+ Method::Everything |
+ Method::Required { uses_default_features: true, .. } => {
+ if s.features().get("default").is_some() {
+ reqs.require_feature("default")?;
+ }
+ }
+ Method::Required { uses_default_features: false, .. } => {}
+ }
+ Ok(reqs)
+}
+
+impl<'a> Context<'a> {
+ // Activate this summary by inserting it into our list of known activations.
+ //
+ // Returns whether this summary with the given method is already activated.
+ fn flag_activated(&mut self,
+ summary: &Summary,
+ method: &Method) -> bool {
+ let id = summary.package_id();
+ let prev = self.activations
+ .entry(id.name().to_string())
+ .or_insert_with(HashMap::new)
+ .entry(id.source_id().clone())
+ .or_insert(Vec::new());
+ if !prev.iter().any(|c| c == summary) {
+ self.resolve_graph.push(GraphNode::Add(id.clone()));
+ prev.push(summary.clone());
+ return false
+ }
+ debug!("checking if {} is already activated", summary.package_id());
+ let (features, use_default) = match *method {
+ Method::Required { features, uses_default_features, .. } => {
+ (features, uses_default_features)
+ }
+ Method::Everything => return false,
+ };
+
+ let has_default_feature = summary.features().contains_key("default");
+ match self.resolve_features.get(id) {
+ Some(prev) => {
+ features.iter().all(|f| prev.contains(f)) &&
+ (!use_default || prev.contains("default") ||
+ !has_default_feature)
+ }
+ None => features.is_empty() && (!use_default || !has_default_feature)
+ }
+ }
+
+ fn build_deps(&mut self,
+ registry: &mut Registry,
+ candidate: &Summary,
+ method: &Method) -> CargoResult<Vec<DepInfo>> {
+ // First, figure out our set of dependencies based on the requested set
+ // of features. This also calculates what features we're going to enable
+ // for our own dependencies.
+ let deps = self.resolve_features(candidate, method)?;
+
+ // Next, transform all dependencies into a list of possible candidates
+ // which can satisfy that dependency.
+ let mut deps = deps.into_iter().map(|(dep, features)| {
+ let mut candidates = self.query(registry, &dep)?;
+ // When we attempt versions for a package, we'll want to start at
+ // the maximum version and work our way down.
+ candidates.sort_by(|a, b| {
+ b.summary.version().cmp(a.summary.version())
+ });
+ Ok((dep, Rc::new(candidates), Rc::new(features)))
+ }).collect::<CargoResult<Vec<DepInfo>>>()?;
+
+ // Attempt to resolve dependencies with fewer candidates before trying
+ // dependencies with more candidates. This way if the dependency with
+ // only one candidate can't be resolved we don't have to do a bunch of
+ // work before we figure that out.
+ deps.sort_by_key(|&(_, ref a, _)| a.len());
+
+ Ok(deps)
+ }
+
+ /// Queries the `registry` to return a list of candidates for `dep`.
+ ///
+ /// This method is the location where overrides are taken into account. If
+ /// any candidates are returned which match an override then the override is
+ /// applied by performing a second query for what the override should
+ /// return.
+ fn query(&self,
+ registry: &mut Registry,
+ dep: &Dependency) -> CargoResult<Vec<Candidate>> {
+ let mut ret = Vec::new();
+ registry.query(dep, &mut |s| {
+ ret.push(Candidate { summary: s, replace: None });
+ })?;
+ for candidate in ret.iter_mut() {
+ let summary = &candidate.summary;
+
+ let mut potential_matches = self.replacements.iter()
+ .filter(|&&(ref spec, _)| spec.matches(summary.package_id()));
+
+ let &(ref spec, ref dep) = match potential_matches.next() {
+ None => continue,
+ Some(replacement) => replacement,
+ };
+ debug!("found an override for {} {}", dep.name(), dep.version_req());
+
+ let mut summaries = registry.query_vec(dep)?.into_iter();
+ let s = summaries.next().ok_or_else(|| {
+ format!("no matching package for override `{}` found\n\
+ location searched: {}\n\
+ version required: {}",
+ spec, dep.source_id(), dep.version_req())
+ })?;
+ let summaries = summaries.collect::<Vec<_>>();
+ if !summaries.is_empty() {
+ let bullets = summaries.iter().map(|s| {
+ format!(" * {}", s.package_id())
+ }).collect::<Vec<_>>();
+ bail!("the replacement specification `{}` matched \
+ multiple packages:\n * {}\n{}", spec, s.package_id(),
+ bullets.join("\n"));
+ }
+
+ // The dependency should be hard-coded to have the same name and an
+ // exact version requirement, so both of these assertions should
+ // never fail.
+ assert_eq!(s.version(), summary.version());
+ assert_eq!(s.name(), summary.name());
+
+ let replace = if s.source_id() == summary.source_id() {
+ debug!("Preventing\n{:?}\nfrom replacing\n{:?}", summary, s);
+ None
+ } else {
+ Some(s)
+ };
+ let matched_spec = spec.clone();
+
+ // Make sure no duplicates
+ if let Some(&(ref spec, _)) = potential_matches.next() {
+ bail!("overlapping replacement specifications found:\n\n \
+ * {}\n * {}\n\nboth specifications match: {}",
+ matched_spec, spec, summary.package_id());
+ }
+
+ for dep in summary.dependencies() {
+ debug!("\t{} => {}", dep.name(), dep.version_req());
+ }
+
+ candidate.replace = replace;
+ }
+ Ok(ret)
+ }
+
+ fn prev_active(&self, dep: &Dependency) -> &[Summary] {
+ self.activations.get(dep.name())
+ .and_then(|v| v.get(dep.source_id()))
+ .map(|v| &v[..])
+ .unwrap_or(&[])
+ }
+
+ /// Return all dependencies and the features we want from them.
+ fn resolve_features<'b>(&mut self,
+ s: &'b Summary,
+ method: &'b Method)
+ -> CargoResult<Vec<(Dependency, Vec<String>)>> {
+ let dev_deps = match *method {
+ Method::Everything => true,
+ Method::Required { dev_deps, .. } => dev_deps,
+ };
+
+ // First, filter by dev-dependencies
+ let deps = s.dependencies();
+ let deps = deps.iter().filter(|d| d.is_transitive() || dev_deps);
+
+ let mut reqs = build_requirements(s, method)?;
+ let mut ret = Vec::new();
+
+ // Next, collect all actually enabled dependencies and their features.
+ for dep in deps {
+ // Skip optional dependencies, but not those enabled through a feature
+ if dep.is_optional() && !reqs.deps.contains_key(dep.name()) {
+ continue
+ }
+ // So we want this dependency. Move the features we want from `reqs.deps`
+ // to `ret`.
+ let base = reqs.deps.remove(dep.name()).unwrap_or((false, vec![]));
+ if !dep.is_optional() && base.0 {
+ self.warnings.push(
+ format!("Package `{}` does not have feature `{}`. It has a required dependency \
+ with that name, but only optional dependencies can be used as features. \
+ This is currently a warning to ease the transition, but it will become an \
+ error in the future.",
+ s.package_id(), dep.name())
+ );
+ }
+ let mut base = base.1;
+ base.extend(dep.features().iter().cloned());
+ for feature in base.iter() {
+ if feature.contains("/") {
+ bail!("feature names may not contain slashes: `{}`", feature);
+ }
+ }
+ ret.push((dep.clone(), base));
+ }
+
+ // Any remaining entries in `reqs.deps` are bugs in the manifest: the package
+ // does not actually have those dependencies, and they were only classified
+ // as dependencies because no feature with that name exists either.
+ if !reqs.deps.is_empty() {
+ let unknown = reqs.deps.keys()
+ .map(|s| &s[..])
+ .collect::<Vec<&str>>();
+ let features = unknown.join(", ");
+ bail!("Package `{}` does not have these features: `{}`",
+ s.package_id(), features)
+ }
+
+ // Record what list of features is active for this package.
+ if !reqs.used.is_empty() {
+ let pkgid = s.package_id();
+
+ let set = self.resolve_features.entry(pkgid.clone())
+ .or_insert_with(HashSet::new);
+ for feature in reqs.used {
+ if !set.contains(feature) {
+ set.insert(feature.to_string());
+ }
+ }
+ }
+
+ Ok(ret)
+ }
+
+ fn resolve_replacements(&self) -> HashMap<PackageId, PackageId> {
+ let mut replacements = HashMap::new();
+ let mut cur = &self.resolve_replacements;
+ while let Some(ref node) = cur.head {
+ let (k, v) = node.0.clone();
+ replacements.insert(k, v);
+ cur = &node.1;
+ }
+ replacements
+ }
+
+ fn graph(&self) -> Graph<PackageId> {
+ let mut graph = Graph::new();
+ let mut cur = &self.resolve_graph;
+ while let Some(ref node) = cur.head {
+ match node.0 {
+ GraphNode::Add(ref p) => graph.add(p.clone(), &[]),
+ GraphNode::Link(ref a, ref b) => graph.link(a.clone(), b.clone()),
+ }
+ cur = &node.1;
+ }
+ graph
+ }
+}
+
+fn check_cycles(resolve: &Resolve, activations: &Activations)
+ -> CargoResult<()> {
+ let summaries: HashMap<&PackageId, &Summary> = activations.values()
+ .flat_map(|v| v.values())
+ .flat_map(|v| v)
+ .map(|s| (s.package_id(), s))
+ .collect();
+
+ // Sort packages to produce user friendly deterministic errors.
+ let all_packages = resolve.iter().collect::<BinaryHeap<_>>().into_sorted_vec();
+ let mut checked = HashSet::new();
+ for pkg in all_packages {
+ if !checked.contains(pkg) {
+ visit(resolve,
+ pkg,
+ &summaries,
+ &mut HashSet::new(),
+ &mut checked)?
+ }
+ }
+ return Ok(());
+
+ fn visit<'a>(resolve: &'a Resolve,
+ id: &'a PackageId,
+ summaries: &HashMap<&'a PackageId, &Summary>,
+ visited: &mut HashSet<&'a PackageId>,
+ checked: &mut HashSet<&'a PackageId>)
+ -> CargoResult<()> {
+ // See if we visited ourselves
+ if !visited.insert(id) {
+ bail!("cyclic package dependency: package `{}` depends on itself",
+ id);
+ }
+
+ // If we've already checked this node no need to recurse again as we'll
+ // just conclude the same thing as last time, so we only execute the
+ // recursive step if we successfully insert into `checked`.
+ //
+ // Note that if we hit an intransitive dependency (e.g. a dev-dependency)
+ // then we clear out the visitation list, as an intransitive dependency
+ // cannot induce a cycle.
+ if checked.insert(id) {
+ let summary = summaries[id];
+ for dep in resolve.deps_not_replaced(id) {
+ let is_transitive = summary.dependencies().iter().any(|d| {
+ d.matches_id(dep) && d.is_transitive()
+ });
+ let mut empty = HashSet::new();
+ let visited = if is_transitive {&mut *visited} else {&mut empty};
+ visit(resolve, dep, summaries, visited, checked)?;
+
+ if let Some(id) = resolve.replacement(dep) {
+ visit(resolve, id, summaries, visited, checked)?;
+ }
+ }
+ }
+
+ // Ok, we're done; we're no longer visiting this node.
+ visited.remove(id);
+ Ok(())
+ }
+}
--- /dev/null
+use std::fmt;
+use std::io::prelude::*;
+
+use atty;
+use termcolor::Color::{Green, Red, Yellow, Cyan};
+use termcolor::{self, StandardStream, Color, ColorSpec, WriteColor};
+
+use util::errors::CargoResult;
+
+/// The requested verbosity of output
+#[derive(Debug, Clone, Copy, PartialEq)]
+pub enum Verbosity {
+ Verbose,
+ Normal,
+ Quiet
+}
+
+/// An abstraction around a `Write`able object that remembers preferences for output verbosity and
+/// color.
+pub struct Shell {
+ /// the `Write`able object, either with or without color support (represented by different enum
+ /// variants)
+ err: ShellOut,
+ /// How verbose messages should be
+ verbosity: Verbosity,
+}
+
+impl fmt::Debug for Shell {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match &self.err {
+ &ShellOut::Write(_) => {
+ f.debug_struct("Shell")
+ .field("verbosity", &self.verbosity)
+ .finish()
+ }
+ &ShellOut::Stream { color_choice, .. } => {
+ f.debug_struct("Shell")
+ .field("verbosity", &self.verbosity)
+ .field("color_choice", &color_choice)
+ .finish()
+ }
+ }
+ }
+}
+
+/// A `Write`able object, either with or without color support
+enum ShellOut {
+ /// A plain write object without color support
+ Write(Box<Write>),
+ /// Color-enabled stdio, with information on whether color should be used
+ Stream {
+ stream: StandardStream,
+ tty: bool,
+ color_choice: ColorChoice,
+ },
+}
+
+/// Whether messages should use color output
+#[derive(Debug, PartialEq, Clone, Copy)]
+pub enum ColorChoice {
+ /// Force color output
+ Always,
+ /// Force disable color output
+ Never,
+ /// Intelligently guess whether to use color output
+ CargoAuto,
+}
+
+impl Shell {
+ /// Create a new shell (color choice and verbosity), defaulting to 'auto' color and verbose
+ /// output.
+ pub fn new() -> Shell {
+ Shell {
+ err: ShellOut::Stream {
+ stream: StandardStream::stderr(ColorChoice::CargoAuto.to_termcolor_color_choice()),
+ color_choice: ColorChoice::CargoAuto,
+ tty: atty::is(atty::Stream::Stderr),
+ },
+ verbosity: Verbosity::Verbose,
+ }
+ }
+
+ /// Create a shell from a plain writable object, with no color, and max verbosity.
+ pub fn from_write(out: Box<Write>) -> Shell {
+ Shell {
+ err: ShellOut::Write(out),
+ verbosity: Verbosity::Verbose,
+ }
+ }
+
+ /// Print a message, where the status will have `color` color and can be justified. The
+ /// message follows without color.
+ fn print(&mut self,
+ status: &fmt::Display,
+ message: Option<&fmt::Display>,
+ color: Color,
+ justified: bool) -> CargoResult<()> {
+ match self.verbosity {
+ Verbosity::Quiet => Ok(()),
+ _ => {
+ self.err.print(status, message, color, justified)
+ }
+ }
+ }
+
+ /// Returns the width of the terminal in columns, if any
+ pub fn err_width(&self) -> Option<usize> {
+ match self.err {
+ ShellOut::Stream { tty: true, .. } => imp::stderr_width(),
+ _ => None,
+ }
+ }
+
+ /// Returns whether stderr is a tty
+ pub fn is_err_tty(&self) -> bool {
+ match self.err {
+ ShellOut::Stream { tty, .. } => tty,
+ _ => false,
+ }
+ }
+
+ /// Get a reference to the underlying writer
+ pub fn err(&mut self) -> &mut Write {
+ self.err.as_write()
+ }
+
+ /// Shortcut to right-align and color green a status message.
+ pub fn status<T, U>(&mut self, status: T, message: U) -> CargoResult<()>
+ where T: fmt::Display, U: fmt::Display
+ {
+ self.print(&status, Some(&message), Green, true)
+ }
+
+ pub fn status_header<T>(&mut self, status: T) -> CargoResult<()>
+ where T: fmt::Display,
+ {
+ self.print(&status, None, Cyan, true)
+ }
+
+ /// Shortcut to right-align a status message.
+ pub fn status_with_color<T, U>(&mut self,
+ status: T,
+ message: U,
+ color: Color) -> CargoResult<()>
+ where T: fmt::Display, U: fmt::Display
+ {
+ self.print(&status, Some(&message), color, true)
+ }
+
+ /// Run the callback only if we are in verbose mode
+ pub fn verbose<F>(&mut self, mut callback: F) -> CargoResult<()>
+ where F: FnMut(&mut Shell) -> CargoResult<()>
+ {
+ match self.verbosity {
+ Verbosity::Verbose => callback(self),
+ _ => Ok(())
+ }
+ }
+
+ /// Run the callback if we are not in verbose mode.
+ pub fn concise<F>(&mut self, mut callback: F) -> CargoResult<()>
+ where F: FnMut(&mut Shell) -> CargoResult<()>
+ {
+ match self.verbosity {
+ Verbosity::Verbose => Ok(()),
+ _ => callback(self)
+ }
+ }
+
+ /// Print a red 'error' message
+ pub fn error<T: fmt::Display>(&mut self, message: T) -> CargoResult<()> {
+ self.print(&"error:", Some(&message), Red, false)
+ }
+
+ /// Print an amber 'warning' message
+ pub fn warn<T: fmt::Display>(&mut self, message: T) -> CargoResult<()> {
+ match self.verbosity {
+ Verbosity::Quiet => Ok(()),
+ _ => self.print(&"warning:", Some(&message), Yellow, false),
+ }
+ }
+
+ /// Update the verbosity of the shell
+ pub fn set_verbosity(&mut self, verbosity: Verbosity) {
+ self.verbosity = verbosity;
+ }
+
+ /// Get the verbosity of the shell
+ pub fn verbosity(&self) -> Verbosity {
+ self.verbosity
+ }
+
+ /// Update the color choice (always, never, or auto) from a string.
+ pub fn set_color_choice(&mut self, color: Option<&str>) -> CargoResult<()> {
+ if let ShellOut::Stream { ref mut stream, ref mut color_choice, .. } = self.err {
+ let cfg = match color {
+ Some("always") => ColorChoice::Always,
+ Some("never") => ColorChoice::Never,
+
+ Some("auto") |
+ None => ColorChoice::CargoAuto,
+
+ Some(arg) => bail!("argument for --color must be auto, always, or \
+ never, but found `{}`", arg),
+ };
+ *color_choice = cfg;
+ *stream = StandardStream::stderr(cfg.to_termcolor_color_choice());
+ }
+ Ok(())
+ }
+
+ /// Get the current color choice
+ ///
+ /// If we are not using a color stream, this will always return Never, even if the color choice
+ /// has been set to something else.
+ pub fn color_choice(&self) -> ColorChoice {
+ match self.err {
+ ShellOut::Stream { color_choice, .. } => color_choice,
+ ShellOut::Write(_) => ColorChoice::Never,
+ }
+ }
+}
+
+impl ShellOut {
+ /// Print out a message with a status. The status comes first and is bold, in the given color.
+ /// The status can be justified, in which case it is right-aligned to a width of 12 characters.
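+ ///
+ /// For example, with `justified` set, the status `Compiling` is printed
+ /// right-aligned in a 12-character field, producing Cargo's familiar
+ /// `   Compiling foo v0.1.0` style of output (package name illustrative).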
+ fn print(&mut self,
+ status: &fmt::Display,
+ message: Option<&fmt::Display>,
+ color: Color,
+ justified: bool) -> CargoResult<()> {
+ match *self {
+ ShellOut::Stream { ref mut stream, .. } => {
+ stream.reset()?;
+ stream.set_color(ColorSpec::new()
+ .set_bold(true)
+ .set_fg(Some(color)))?;
+ if justified {
+ write!(stream, "{:>12}", status)?;
+ } else {
+ write!(stream, "{}", status)?;
+ }
+ stream.reset()?;
+ match message {
+ Some(message) => write!(stream, " {}\n", message)?,
+ None => write!(stream, " ")?,
+ }
+ }
+ ShellOut::Write(ref mut w) => {
+ if justified {
+ write!(w, "{:>12}", status)?;
+ } else {
+ write!(w, "{}", status)?;
+ }
+ match message {
+ Some(message) => write!(w, " {}\n", message)?,
+ None => write!(w, " ")?,
+ }
+ }
+ }
+ Ok(())
+ }
+
+ /// Get this object as an `io::Write`.
+ fn as_write(&mut self) -> &mut Write {
+ match *self {
+ ShellOut::Stream { ref mut stream, .. } => stream,
+ ShellOut::Write(ref mut w) => w,
+ }
+ }
+}
+
+impl ColorChoice {
+ /// Convert our color choice to termcolor's version
+ fn to_termcolor_color_choice(&self) -> termcolor::ColorChoice {
+ match *self {
+ ColorChoice::Always => termcolor::ColorChoice::Always,
+ ColorChoice::Never => termcolor::ColorChoice::Never,
+ ColorChoice::CargoAuto => {
+ if atty::is(atty::Stream::Stderr) {
+ termcolor::ColorChoice::Auto
+ } else {
+ termcolor::ColorChoice::Never
+ }
+ }
+ }
+ }
+}
+
+#[cfg(any(target_os = "linux", target_os = "macos"))]
+mod imp {
+ use std::mem;
+
+ use libc;
+
+ pub fn stderr_width() -> Option<usize> {
+ unsafe {
+ let mut winsize: libc::winsize = mem::zeroed();
+ if libc::ioctl(libc::STDERR_FILENO, libc::TIOCGWINSZ, &mut winsize) < 0 {
+ return None
+ }
+ if winsize.ws_col > 0 {
+ Some(winsize.ws_col as usize)
+ } else {
+ None
+ }
+ }
+ }
+}
+
+#[cfg(all(unix, not(any(target_os = "linux", target_os = "macos"))))]
+mod imp {
+ pub fn stderr_width() -> Option<usize> {
+ None
+ }
+}
+
+#[cfg(windows)]
+mod imp {
+ use std::mem;
+
+ extern crate winapi;
+ extern crate kernel32;
+
+ pub fn stderr_width() -> Option<usize> {
+ unsafe {
+ let handle = kernel32::GetStdHandle(winapi::STD_ERROR_HANDLE);
+ let mut csbi: winapi::CONSOLE_SCREEN_BUFFER_INFO = mem::zeroed();
+ if kernel32::GetConsoleScreenBufferInfo(handle, &mut csbi) == 0 {
+ return None
+ }
+ Some((csbi.srWindow.Right - csbi.srWindow.Left) as usize)
+ }
+ }
+}
--- /dev/null
+use std::collections::hash_map::{HashMap, Values, IterMut};
+
+use core::{Package, PackageId, Registry};
+use util::CargoResult;
+
+mod source_id;
+
+pub use self::source_id::{SourceId, GitReference};
+
+/// A Source finds and downloads remote packages based on names and
+/// versions.
+pub trait Source: Registry {
+ /// Returns the `SourceId` corresponding to this source
+ fn source_id(&self) -> &SourceId;
+
+ /// The update method performs any network operations required to
+ /// get the entire list of all names, versions and dependencies of
+ /// packages managed by the Source.
+ fn update(&mut self) -> CargoResult<()>;
+
+ /// The download method fetches the full package for each name and
+ /// version specified.
+ fn download(&mut self, package: &PackageId) -> CargoResult<Package>;
+
+ /// Generates a unique string which represents the fingerprint of the
+ /// current state of the source.
+ ///
+ /// This fingerprint is used to determine the "freshness" of the source
+ /// later on. It must be guaranteed that the fingerprint of a source is
+ /// constant if and only if the output product will remain constant.
+ ///
+ /// The `pkg` argument is the package which this fingerprint should be
+ /// scoped to, for sources which may contain multiple packages.
+ fn fingerprint(&self, pkg: &Package) -> CargoResult<String>;
+
+ /// If this source supports it, verifies the source of the package
+ /// specified.
+ ///
+ /// Note that the source may also have performed other checksum-based
+ /// verification during the `download` step, but this is intended to be run
+ /// just before a crate is compiled so it may perform more expensive checks
+ /// which may not be cacheable.
+ fn verify(&self, _pkg: &PackageId) -> CargoResult<()> {
+ Ok(())
+ }
+}
+
+impl<'a, T: Source + ?Sized + 'a> Source for Box<T> {
+ /// Forwards to `Source::source_id`
+ fn source_id(&self) -> &SourceId {
+ (**self).source_id()
+ }
+
+ /// Forwards to `Source::update`
+ fn update(&mut self) -> CargoResult<()> {
+ (**self).update()
+ }
+
+ /// Forwards to `Source::download`
+ fn download(&mut self, id: &PackageId) -> CargoResult<Package> {
+ (**self).download(id)
+ }
+
+ /// Forwards to `Source::fingerprint`
+ fn fingerprint(&self, pkg: &Package) -> CargoResult<String> {
+ (**self).fingerprint(pkg)
+ }
+
+ /// Forwards to `Source::verify`
+ fn verify(&self, pkg: &PackageId) -> CargoResult<()> {
+ (**self).verify(pkg)
+ }
+}
+
+/// A `HashMap` of `SourceId` -> `Box<Source>`
+#[derive(Default)]
+pub struct SourceMap<'src> {
+ map: HashMap<SourceId, Box<Source + 'src>>,
+}
+
+/// A `std::collections::hash_map::Values` for `SourceMap`
+pub type Sources<'a, 'src> = Values<'a, SourceId, Box<Source + 'src>>;
+
+/// A `std::collections::hash_map::IterMut` for `SourceMap`
+pub struct SourcesMut<'a, 'src: 'a> {
+ inner: IterMut<'a, SourceId, Box<Source + 'src>>,
+}
+
+impl<'src> SourceMap<'src> {
+ /// Create an empty map
+ pub fn new() -> SourceMap<'src> {
+ SourceMap { map: HashMap::new() }
+ }
+
+ /// Like `HashMap::contains_key`
+ pub fn contains(&self, id: &SourceId) -> bool {
+ self.map.contains_key(id)
+ }
+
+ /// Like `HashMap::get`
+ pub fn get(&self, id: &SourceId) -> Option<&(Source + 'src)> {
+ let source = self.map.get(id);
+
+ source.map(|s| {
+ let s: &(Source + 'src) = &**s;
+ s
+ })
+ }
+
+ /// Like `HashMap::get_mut`
+ pub fn get_mut(&mut self, id: &SourceId) -> Option<&mut (Source + 'src)> {
+ self.map.get_mut(id).map(|s| {
+ let s: &mut (Source + 'src) = &mut **s;
+ s
+ })
+ }
+
+ /// Like `HashMap::get`, but first calculates the `SourceId` from a
+ /// `PackageId`
+ pub fn get_by_package_id(&self, pkg_id: &PackageId) -> Option<&(Source + 'src)> {
+ self.get(pkg_id.source_id())
+ }
+
+ /// Like `HashMap::insert`, but derives the SourceId key from the Source
+ pub fn insert(&mut self, source: Box<Source + 'src>) {
+ let id = source.source_id().clone();
+ self.map.insert(id, source);
+ }
+
+ /// Like `HashMap::is_empty`
+ pub fn is_empty(&self) -> bool {
+ self.map.is_empty()
+ }
+
+ /// Like `HashMap::len`
+ pub fn len(&self) -> usize {
+ self.map.len()
+ }
+
+ /// Like `HashMap::values`
+ pub fn sources<'a>(&'a self) -> Sources<'a, 'src> {
+ self.map.values()
+ }
+
+ /// Like `HashMap::iter_mut`
+ pub fn sources_mut<'a>(&'a mut self) -> SourcesMut<'a, 'src> {
+ SourcesMut { inner: self.map.iter_mut() }
+ }
+}
+
+impl<'a, 'src> Iterator for SourcesMut<'a, 'src> {
+ type Item = (&'a SourceId, &'a mut (Source + 'src));
+ fn next(&mut self) -> Option<(&'a SourceId, &'a mut (Source + 'src))> {
+ self.inner.next().map(|(a, b)| (a, &mut **b))
+ }
+}
+
--- /dev/null
+use std::cmp::{self, Ordering};
+use std::fmt::{self, Formatter};
+use std::hash::{self, Hash};
+use std::path::Path;
+use std::sync::Arc;
+use std::sync::atomic::{AtomicBool, ATOMIC_BOOL_INIT};
+use std::sync::atomic::Ordering::SeqCst;
+
+use serde::ser;
+use serde::de;
+use url::Url;
+
+use ops;
+use sources::git;
+use sources::{PathSource, GitSource, RegistrySource, CRATES_IO};
+use sources::DirectorySource;
+use util::{Config, CargoResult, ToUrl};
+
+/// Unique identifier for a source of packages.
+#[derive(Clone, Eq, Debug)]
+pub struct SourceId {
+ inner: Arc<SourceIdInner>,
+}
+
+#[derive(Eq, Clone, Debug)]
+struct SourceIdInner {
+ /// The source URL
+ url: Url,
+ /// `git::canonicalize_url(url)` for the url field
+ canonical_url: Url,
+ /// The source kind
+ kind: Kind,
+ /// e.g. the exact git revision of the specified branch for a Git source
+ precise: Option<String>,
+ /// Name of the registry source for alternative registries
+ name: Option<String>,
+}
+
+/// The possible kinds of code source. Along with a URL, this fully defines the
+/// source.
+#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Hash)]
+enum Kind {
+ /// Kind::Git(<git reference>) represents a git repository
+ Git(GitReference),
+ /// represents a local path
+ Path,
+ /// represents a remote registry
+ Registry,
+ /// represents a local filesystem-based registry
+ LocalRegistry,
+ /// represents a directory-based registry
+ Directory,
+}
+
+/// Information to find a specific commit in a git repository
+#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Hash)]
+pub enum GitReference {
+ /// from a tag
+ Tag(String),
+ /// from the HEAD of a branch
+ Branch(String),
+ /// from a specific revision
+ Rev(String),
+}
+
+impl SourceId {
+ /// Create a SourceId object from the kind and url.
+ ///
+ /// The canonical url will be calculated, but the precise field will not.
+ fn new(kind: Kind, url: Url) -> CargoResult<SourceId> {
+ let source_id = SourceId {
+ inner: Arc::new(SourceIdInner {
+ kind: kind,
+ canonical_url: git::canonicalize_url(&url)?,
+ url: url,
+ precise: None,
+ name: None,
+ }),
+ };
+ Ok(source_id)
+ }
+
+ /// Parses a source URL and returns the corresponding ID.
+ ///
+ /// ## Example
+ ///
+ /// ```
+ /// use cargo::core::SourceId;
+ /// SourceId::from_url("git+https://github.com/alexcrichton/\
+ /// libssh2-static-sys#80e71a3021618eb05\
+ /// 656c58fb7c5ef5f12bc747f");
+ /// ```
+ pub fn from_url(string: &str) -> CargoResult<SourceId> {
+ let mut parts = string.splitn(2, '+');
+ let kind = parts.next().unwrap();
+ let url = parts.next().ok_or_else(|| format!("invalid source `{}`", string))?;
+
+ match kind {
+ "git" => {
+ let mut url = url.to_url()?;
+ let mut reference = GitReference::Branch("master".to_string());
+ for (k, v) in url.query_pairs() {
+ match &k[..] {
+ // map older 'ref' to branch
+ "branch" |
+ "ref" => reference = GitReference::Branch(v.into_owned()),
+
+ "rev" => reference = GitReference::Rev(v.into_owned()),
+ "tag" => reference = GitReference::Tag(v.into_owned()),
+ _ => {}
+ }
+ }
+ let precise = url.fragment().map(|s| s.to_owned());
+ url.set_fragment(None);
+ url.set_query(None);
+ Ok(SourceId::for_git(&url, reference)?.with_precise(precise))
+ },
+ "registry" => {
+ let url = url.to_url()?;
+ Ok(SourceId::new(Kind::Registry, url)?
+ .with_precise(Some("locked".to_string())))
+ }
+ "path" => {
+ let url = url.to_url()?;
+ SourceId::new(Kind::Path, url)
+ }
+ kind => Err(format!("unsupported source protocol: {}", kind).into())
+ }
+ }
+
+ /// A view of the `SourceId` that can be `Display`ed as a URL
+ pub fn to_url(&self) -> SourceIdToUrl {
+ SourceIdToUrl { inner: &*self.inner }
+ }
+
+ /// Create a SourceId from a filesystem path.
+ ///
+ /// The given path should be absolute.
+ pub fn for_path(path: &Path) -> CargoResult<SourceId> {
+ let url = path.to_url()?;
+ SourceId::new(Kind::Path, url)
+ }
+
+ /// Create a SourceId from a git reference
+ pub fn for_git(url: &Url, reference: GitReference) -> CargoResult<SourceId> {
+ SourceId::new(Kind::Git(reference), url.clone())
+ }
+
+ /// Create a SourceId from a registry url
+ pub fn for_registry(url: &Url) -> CargoResult<SourceId> {
+ SourceId::new(Kind::Registry, url.clone())
+ }
+
+ /// Create a SourceId from a local registry path
+ pub fn for_local_registry(path: &Path) -> CargoResult<SourceId> {
+ let url = path.to_url()?;
+ SourceId::new(Kind::LocalRegistry, url)
+ }
+
+ /// Create a SourceId from a directory path
+ pub fn for_directory(path: &Path) -> CargoResult<SourceId> {
+ let url = path.to_url()?;
+ SourceId::new(Kind::Directory, url)
+ }
+
+ /// Returns the `SourceId` corresponding to the main repository.
+ ///
+ /// This is the main cargo registry by default, but it can be overridden in
+ /// a `.cargo/config`.
+ pub fn crates_io(config: &Config) -> CargoResult<SourceId> {
+ let cfg = ops::registry_configuration(config, None)?;
+ let url = if let Some(ref index) = cfg.index {
+ static WARNED: AtomicBool = ATOMIC_BOOL_INIT;
+ if !WARNED.swap(true, SeqCst) {
+ config.shell().warn("custom registry support via \
+ the `registry.index` configuration is \
+ being removed, this functionality \
+ will not work in the future")?;
+ }
+ &index[..]
+ } else {
+ CRATES_IO
+ };
+ let url = url.to_url()?;
+ SourceId::for_registry(&url)
+ }
+
+ pub fn alt_registry(config: &Config, key: &str) -> CargoResult<SourceId> {
+ let url = config.get_registry_index(key)?;
+ Ok(SourceId {
+ inner: Arc::new(SourceIdInner {
+ kind: Kind::Registry,
+ canonical_url: git::canonicalize_url(&url)?,
+ url: url,
+ precise: None,
+ name: Some(key.to_string()),
+ }),
+ })
+ }
+
+ /// Get this source URL
+ pub fn url(&self) -> &Url {
+ &self.inner.url
+ }
+
+ pub fn display_registry(&self) -> String {
+ format!("registry `{}`", self.url())
+ }
+
+ /// Is this source from a filesystem path
+ pub fn is_path(&self) -> bool {
+ self.inner.kind == Kind::Path
+ }
+
+ /// Is this source from a registry (either local or not)
+ pub fn is_registry(&self) -> bool {
+ match self.inner.kind {
+ Kind::Registry | Kind::LocalRegistry => true,
+ _ => false,
+ }
+ }
+
+ /// Is this source from an alternative registry
+ pub fn is_alt_registry(&self) -> bool {
+ self.is_registry() && self.inner.name.is_some()
+ }
+
+ /// Is this source from a git repository
+ pub fn is_git(&self) -> bool {
+ match self.inner.kind {
+ Kind::Git(_) => true,
+ _ => false,
+ }
+ }
+
+ /// Creates an implementation of `Source` corresponding to this ID.
+ pub fn load<'a>(&self, config: &'a Config) -> CargoResult<Box<super::Source + 'a>> {
+ trace!("loading SourceId; {}", self);
+ match self.inner.kind {
+ Kind::Git(..) => Ok(Box::new(GitSource::new(self, config)?)),
+ Kind::Path => {
+ let path = match self.inner.url.to_file_path() {
+ Ok(p) => p,
+ Err(()) => panic!("path sources cannot be remote"),
+ };
+ Ok(Box::new(PathSource::new(&path, self, config)))
+ }
+ Kind::Registry => Ok(Box::new(RegistrySource::remote(self, config))),
+ Kind::LocalRegistry => {
+ let path = match self.inner.url.to_file_path() {
+ Ok(p) => p,
+ Err(()) => panic!("path sources cannot be remote"),
+ };
+ Ok(Box::new(RegistrySource::local(self, &path, config)))
+ }
+ Kind::Directory => {
+ let path = match self.inner.url.to_file_path() {
+ Ok(p) => p,
+ Err(()) => panic!("path sources cannot be remote"),
+ };
+ Ok(Box::new(DirectorySource::new(&path, self, config)))
+ }
+ }
+ }
+
+ /// Get the value of the precise field
+ pub fn precise(&self) -> Option<&str> {
+ self.inner.precise.as_ref().map(|s| &s[..])
+ }
+
+ /// Get the git reference if this is a git source, otherwise None.
+ pub fn git_reference(&self) -> Option<&GitReference> {
+ match self.inner.kind {
+ Kind::Git(ref s) => Some(s),
+ _ => None,
+ }
+ }
+
+ /// Create a new SourceId from this source with the given `precise`
+ pub fn with_precise(&self, v: Option<String>) -> SourceId {
+ SourceId {
+ inner: Arc::new(SourceIdInner {
+ precise: v,
+ ..(*self.inner).clone()
+ })
+ }
+ }
+
+ /// Whether the remote registry is the standard https://crates.io
+ pub fn is_default_registry(&self) -> bool {
+ match self.inner.kind {
+ Kind::Registry => {}
+ _ => return false,
+ }
+ self.inner.url.to_string() == CRATES_IO
+ }
+
+ /// Hash `self`
+ ///
+ /// For paths, remove the workspace prefix so the same source will give the
+ /// same hash in different locations.
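+ ///
+ /// Illustrative sketch (hypothetical paths): a path source at
+ /// `/home/alice/ws/foo` hashed with `workspace = "/home/alice/ws"` hashes
+ /// only the relative component `foo`, so the same checkout at
+ /// `/home/bob/ws/foo` produces an identical hash.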
+ pub fn stable_hash<S: hash::Hasher>(&self, workspace: &Path, into: &mut S) {
+ if self.is_path() {
+ if let Ok(p) = self.inner.url.to_file_path().unwrap().strip_prefix(workspace) {
+ self.inner.kind.hash(into);
+ p.to_str().unwrap().hash(into);
+ return
+ }
+ }
+ self.hash(into)
+ }
+}
+
+impl PartialEq for SourceId {
+ fn eq(&self, other: &SourceId) -> bool {
+ (*self.inner).eq(&*other.inner)
+ }
+}
+
+impl PartialOrd for SourceId {
+ fn partial_cmp(&self, other: &SourceId) -> Option<Ordering> {
+ Some(self.cmp(other))
+ }
+}
+
+impl Ord for SourceId {
+ fn cmp(&self, other: &SourceId) -> Ordering {
+ self.inner.cmp(&other.inner)
+ }
+}
+
+impl ser::Serialize for SourceId {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ if self.is_path() {
+ None::<String>.serialize(s)
+ } else {
+ s.collect_str(&self.to_url())
+ }
+ }
+}
+
+impl<'de> de::Deserialize<'de> for SourceId {
+ fn deserialize<D>(d: D) -> Result<SourceId, D::Error>
+ where D: de::Deserializer<'de>,
+ {
+ let string = String::deserialize(d)?;
+ SourceId::from_url(&string).map_err(de::Error::custom)
+ }
+}
+
+impl fmt::Display for SourceId {
+ fn fmt(&self, f: &mut Formatter) -> fmt::Result {
+ match *self.inner {
+ SourceIdInner { kind: Kind::Path, ref url, .. } => {
+ fmt::Display::fmt(url, f)
+ }
+ SourceIdInner { kind: Kind::Git(ref reference), ref url,
+ ref precise, .. } => {
+ write!(f, "{}", url)?;
+ if let Some(pretty) = reference.pretty_ref() {
+ write!(f, "?{}", pretty)?;
+ }
+
+ if let Some(ref s) = *precise {
+ let len = cmp::min(s.len(), 8);
+ write!(f, "#{}", &s[..len])?;
+ }
+ Ok(())
+ }
+ SourceIdInner { kind: Kind::Registry, ref url, .. } |
+ SourceIdInner { kind: Kind::LocalRegistry, ref url, .. } => {
+ write!(f, "registry `{}`", url)
+ }
+ SourceIdInner { kind: Kind::Directory, ref url, .. } => {
+ write!(f, "dir {}", url)
+ }
+ }
+ }
+}
+
+// This custom implementation of `PartialEq` handles situations such as when
+// two git sources point at *almost* the same URL, but not quite, even though
+// they actually point to the same repository: in that case, equality falls
+// back to comparing the canonical URLs.
+impl PartialEq for SourceIdInner {
+ fn eq(&self, other: &SourceIdInner) -> bool {
+ if self.kind != other.kind {
+ return false;
+ }
+ if self.url == other.url {
+ return true;
+ }
+
+ match (&self.kind, &other.kind) {
+ (&Kind::Git(ref ref1), &Kind::Git(ref ref2)) => {
+ ref1 == ref2 && self.canonical_url == other.canonical_url
+ }
+ _ => false,
+ }
+ }
+}
+
+impl PartialOrd for SourceIdInner {
+ fn partial_cmp(&self, other: &SourceIdInner) -> Option<Ordering> {
+ Some(self.cmp(other))
+ }
+}
+
+impl Ord for SourceIdInner {
+ fn cmp(&self, other: &SourceIdInner) -> Ordering {
+ match self.kind.cmp(&other.kind) {
+ Ordering::Equal => {}
+ ord => return ord,
+ }
+ match self.url.cmp(&other.url) {
+ Ordering::Equal => {}
+ ord => return ord,
+ }
+ match (&self.kind, &other.kind) {
+ (&Kind::Git(ref ref1), &Kind::Git(ref ref2)) => {
+ (ref1, &self.canonical_url).cmp(&(ref2, &other.canonical_url))
+ }
+ _ => self.kind.cmp(&other.kind),
+ }
+ }
+}
+
+// The hash of SourceId is used in the name of some Cargo folders, so it
+// shouldn't vary. `as_str` gives the serialization of a URL (which has a
+// spec) and so insulates against possible changes in how the `url` crate
+// does hashing.
+impl Hash for SourceId {
+ fn hash<S: hash::Hasher>(&self, into: &mut S) {
+ self.inner.kind.hash(into);
+ match *self.inner {
+ SourceIdInner { kind: Kind::Git(..), ref canonical_url, .. } => {
+ canonical_url.as_str().hash(into)
+ }
+ _ => self.inner.url.as_str().hash(into),
+ }
+ }
+}
+
+/// A `Display`able view into a SourceId that will write it as a URL
+pub struct SourceIdToUrl<'a> {
+ inner: &'a SourceIdInner,
+}
+
+impl<'a> fmt::Display for SourceIdToUrl<'a> {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match *self.inner {
+ SourceIdInner { kind: Kind::Path, ref url, .. } => {
+ write!(f, "path+{}", url)
+ }
+ SourceIdInner {
+ kind: Kind::Git(ref reference), ref url, ref precise, ..
+ } => {
+ write!(f, "git+{}", url)?;
+ if let Some(pretty) = reference.pretty_ref() {
+ write!(f, "?{}", pretty)?;
+ }
+ if let Some(precise) = precise.as_ref() {
+ write!(f, "#{}", precise)?;
+ }
+ Ok(())
+ }
+ SourceIdInner { kind: Kind::Registry, ref url, .. } => {
+ write!(f, "registry+{}", url)
+ }
+ SourceIdInner { kind: Kind::LocalRegistry, ref url, .. } => {
+ write!(f, "local-registry+{}", url)
+ }
+ SourceIdInner { kind: Kind::Directory, ref url, .. } => {
+ write!(f, "directory+{}", url)
+ }
+ }
+ }
+}
+
+impl GitReference {
+ /// Returns a `Display`able view of this git reference, or None if using
+ /// the head of the "master" branch
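+ ///
+ /// For example (illustrative; this assumes `GitReference` is re-exported
+ /// at `cargo::core` like `SourceId`):
+ ///
+ /// ```
+ /// use cargo::core::GitReference;
+ /// assert!(GitReference::Branch("master".to_string()).pretty_ref().is_none());
+ /// let tag = GitReference::Tag("v0.1.0".to_string());
+ /// assert_eq!(tag.pretty_ref().unwrap().to_string(), "tag=v0.1.0");
+ /// ```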
+ pub fn pretty_ref(&self) -> Option<PrettyRef> {
+ match *self {
+ GitReference::Branch(ref s) if *s == "master" => None,
+ _ => Some(PrettyRef { inner: self }),
+ }
+ }
+}
+
+/// A git reference that can be `Display`ed
+pub struct PrettyRef<'a> {
+ inner: &'a GitReference,
+}
+
+impl<'a> fmt::Display for PrettyRef<'a> {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match *self.inner {
+ GitReference::Branch(ref b) => write!(f, "branch={}", b),
+ GitReference::Tag(ref s) => write!(f, "tag={}", s),
+ GitReference::Rev(ref s) => write!(f, "rev={}", s),
+ }
+ }
+}
+
+#[cfg(test)]
+mod tests {
+ use super::{SourceId, Kind, GitReference};
+ use util::ToUrl;
+
+ #[test]
+ fn github_sources_equal() {
+ let loc = "https://github.com/foo/bar".to_url().unwrap();
+ let master = Kind::Git(GitReference::Branch("master".to_string()));
+ let s1 = SourceId::new(master.clone(), loc).unwrap();
+
+ let loc = "git://github.com/foo/bar".to_url().unwrap();
+ let s2 = SourceId::new(master, loc.clone()).unwrap();
+
+ assert_eq!(s1, s2);
+
+ let foo = Kind::Git(GitReference::Branch("foo".to_string()));
+ let s3 = SourceId::new(foo, loc).unwrap();
+ assert!(s1 != s3);
+ }
+}
--- /dev/null
+use std::collections::BTreeMap;
+use std::mem;
+use std::rc::Rc;
+
+use semver::Version;
+use core::{Dependency, PackageId, SourceId};
+
+use util::CargoResult;
+
+/// Subset of a `Manifest`. Contains only the most important information
+/// about a package.
+///
+/// Summaries are cloned and should not be mutated after creation.
+#[derive(Debug, Clone)]
+pub struct Summary {
+ inner: Rc<Inner>,
+}
+
+#[derive(Debug, Clone)]
+struct Inner {
+ package_id: PackageId,
+ dependencies: Vec<Dependency>,
+ features: BTreeMap<String, Vec<String>>,
+ checksum: Option<String>,
+}
+
+impl Summary {
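+ /// Creates a new `Summary`, validating the feature map against the
+ /// dependency list.
+ ///
+ /// Illustrative sketch of the rules enforced below (names are
+ /// hypothetical): an entry `"ssl": ["openssl/vendored"]` names a feature
+ /// of the dependency `openssl`, which must therefore appear in
+ /// `dependencies`; a bare entry such as `"tls": ["ssl"]` must name either
+ /// another feature or an optional dependency.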
+ pub fn new(pkg_id: PackageId,
+ dependencies: Vec<Dependency>,
+ features: BTreeMap<String, Vec<String>>) -> CargoResult<Summary> {
+ for dep in dependencies.iter() {
+ if features.get(dep.name()).is_some() {
+ bail!("Features and dependencies cannot have the \
+ same name: `{}`", dep.name())
+ }
+ if dep.is_optional() && !dep.is_transitive() {
+ bail!("Dev-dependencies are not allowed to be optional: `{}`",
+ dep.name())
+ }
+ }
+ for (feature, list) in features.iter() {
+ for dep in list.iter() {
+ let mut parts = dep.splitn(2, '/');
+ let dep = parts.next().unwrap();
+ let is_reexport = parts.next().is_some();
+ if !is_reexport && features.get(dep).is_some() { continue }
+ match dependencies.iter().find(|d| d.name() == dep) {
+ Some(d) => {
+ if d.is_optional() || is_reexport { continue }
+ bail!("Feature `{}` depends on `{}` which is not an \
+ optional dependency.\nConsider adding \
+ `optional = true` to the dependency",
+ feature, dep)
+ }
+ None if is_reexport => {
+ bail!("Feature `{}` requires a feature of `{}` which is not a \
+ dependency", feature, dep)
+ }
+ None => {
+ bail!("Feature `{}` includes `{}` which is neither \
+ a dependency nor another feature", feature, dep)
+ }
+ }
+ }
+ }
+ Ok(Summary {
+ inner: Rc::new(Inner {
+ package_id: pkg_id,
+ dependencies: dependencies,
+ features: features,
+ checksum: None,
+ }),
+ })
+ }
+
+ pub fn package_id(&self) -> &PackageId { &self.inner.package_id }
+ pub fn name(&self) -> &str { self.package_id().name() }
+ pub fn version(&self) -> &Version { self.package_id().version() }
+ pub fn source_id(&self) -> &SourceId { self.package_id().source_id() }
+ pub fn dependencies(&self) -> &[Dependency] { &self.inner.dependencies }
+ pub fn features(&self) -> &BTreeMap<String, Vec<String>> { &self.inner.features }
+ pub fn checksum(&self) -> Option<&str> {
+ self.inner.checksum.as_ref().map(|s| &s[..])
+ }
+
+ pub fn override_id(mut self, id: PackageId) -> Summary {
+ Rc::make_mut(&mut self.inner).package_id = id;
+ self
+ }
+
+ pub fn set_checksum(mut self, cksum: String) -> Summary {
+ Rc::make_mut(&mut self.inner).checksum = Some(cksum);
+ self
+ }
+
+ pub fn map_dependencies<F>(mut self, f: F) -> Summary
+ where F: FnMut(Dependency) -> Dependency {
+ {
+ let slot = &mut Rc::make_mut(&mut self.inner).dependencies;
+ let deps = mem::replace(slot, Vec::new());
+ *slot = deps.into_iter().map(f).collect();
+ }
+ self
+ }
+
+ pub fn map_source(self, to_replace: &SourceId, replace_with: &SourceId)
+ -> Summary {
+ let me = if self.package_id().source_id() == to_replace {
+ let new_id = self.package_id().with_source_id(replace_with);
+ self.override_id(new_id)
+ } else {
+ self
+ };
+ me.map_dependencies(|dep| {
+ dep.map_source(to_replace, replace_with)
+ })
+ }
+}
+
+impl PartialEq for Summary {
+ fn eq(&self, other: &Summary) -> bool {
+ self.inner.package_id == other.inner.package_id
+ }
+}
--- /dev/null
+use std::collections::hash_map::{HashMap, Entry};
+use std::collections::BTreeMap;
+use std::path::{Path, PathBuf};
+use std::slice;
+
+use glob::glob;
+use url::Url;
+
+use core::{Package, VirtualManifest, EitherManifest, SourceId};
+use core::{PackageIdSpec, Dependency, Profile, Profiles};
+use util::{Config, Filesystem};
+use util::errors::{CargoResult, CargoResultExt};
+use util::paths;
+use util::toml::read_manifest;
+
+/// The core abstraction in Cargo for working with a workspace of crates.
+///
+/// A workspace is often created very early on and then threaded through all
+/// other functions. It's typically through this object that the current
+/// package is loaded and/or learned about.
+#[derive(Debug)]
+pub struct Workspace<'cfg> {
+ config: &'cfg Config,
+
+ // This path is a path to where the current cargo subcommand was invoked
+ // from. That is, this is the `--manifest-path` argument to Cargo, and
+ // points to the "main crate" that we're going to worry about.
+ current_manifest: PathBuf,
+
+ // A list of packages found in this workspace. Always includes at least the
+ // package mentioned by `current_manifest`.
+ packages: Packages<'cfg>,
+
+ // If this workspace includes more than one crate, this points to the root
+ // of the workspace. This is `None` in the case that `[workspace]` is
+ // missing, `package.workspace` is missing, and no `Cargo.toml` above
+ // `current_manifest` was found on the filesystem with `[workspace]`.
+ root_manifest: Option<PathBuf>,
+
+ // Shared target directory for all the packages of this workspace.
+ // `None` if the default path of `root/target` should be used.
+ target_dir: Option<Filesystem>,
+
+ // List of members in this workspace with a listing of all their manifest
+ // paths. The packages themselves can be looked up through the `packages`
+ // set above.
+ members: Vec<PathBuf>,
+
+ // True if this is a temporary workspace created for the purposes of
+ // `cargo install` or `cargo package`.
+ is_ephemeral: bool,
+
+ // True if this workspace should enforce optional dependencies even when
+ // not needed; false if this workspace should only enforce dependencies
+ // needed by the current configuration (such as in cargo install).
+ require_optional_deps: bool,
+}
+
+// Separate structure for tracking loaded packages (to avoid loading anything
+// twice); kept apart from `Workspace` to help appease the borrow checker.
+#[derive(Debug)]
+struct Packages<'cfg> {
+ config: &'cfg Config,
+ packages: HashMap<PathBuf, MaybePackage>,
+}
+
+#[derive(Debug)]
+enum MaybePackage {
+ Package(Package),
+ Virtual(VirtualManifest),
+}
+
+/// Configuration of a workspace in a manifest.
+#[derive(Debug, Clone)]
+pub enum WorkspaceConfig {
+ /// Indicates that `[workspace]` was present and the members were
+ /// optionally specified as well.
+ Root(WorkspaceRootConfig),
+
+ /// Indicates that `[workspace]` was not present, and the `root` field is
+ /// the optional value of `package.workspace`, if present.
+ Member { root: Option<String> },
+}
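+
+// Illustrative mapping (hypothetical manifests) for the two variants above:
+// a root `Cargo.toml` containing a `[workspace]` table (optionally with
+// `members = ["crates/*"]`) parses to `WorkspaceConfig::Root(..)`, while a
+// member `Cargo.toml` with no `[workspace]` table parses to
+// `WorkspaceConfig::Member { root }`, where `root` is `Some("../..")` when
+// the manifest specifies `workspace = "../.."` under `[package]`.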
+
+/// Intermediate configuration of a workspace root in a manifest.
+///
+/// Knows the workspace root path, as well as the `members` and `exclude`
+/// lists of path patterns, which together tell whether a given path is
+/// recognized as a member by this root or not.
+#[derive(Debug, Clone)]
+pub struct WorkspaceRootConfig {
+ root_dir: PathBuf,
+ members: Option<Vec<String>>,
+ exclude: Vec<String>,
+}
+
+/// An iterator over the member packages of a workspace, returned by
+/// `Workspace::members`
+pub struct Members<'a, 'cfg: 'a> {
+ ws: &'a Workspace<'cfg>,
+ iter: slice::Iter<'a, PathBuf>,
+}
+
+impl<'cfg> Workspace<'cfg> {
+ /// Creates a new workspace given the target manifest pointed to by
+ /// `manifest_path`.
+ ///
+ /// This function will construct the entire workspace by determining the
+ /// root and all member packages. It will then validate the workspace
+ /// before returning it, so `Ok` is only returned for valid workspaces.
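+ ///
+ /// Typical usage (an illustrative sketch; `manifest_path` and `config`
+ /// are assumed to be set up by the caller):
+ ///
+ /// ```ignore
+ /// let ws = Workspace::new(&manifest_path, &config)?;
+ /// for pkg in ws.members() {
+ ///     println!("{}", pkg.name());
+ /// }
+ /// ```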
+ pub fn new(manifest_path: &Path, config: &'cfg Config)
+ -> CargoResult<Workspace<'cfg>> {
+ let target_dir = config.target_dir()?;
+
+ let mut ws = Workspace {
+ config: config,
+ current_manifest: manifest_path.to_path_buf(),
+ packages: Packages {
+ config: config,
+ packages: HashMap::new(),
+ },
+ root_manifest: None,
+ target_dir: target_dir,
+ members: Vec::new(),
+ is_ephemeral: false,
+ require_optional_deps: true,
+ };
+ ws.root_manifest = ws.find_root(manifest_path)?;
+ ws.find_members()?;
+ ws.validate()?;
+ Ok(ws)
+ }
+
+ pub fn current_manifest(&self) -> &Path {
+ &self.current_manifest
+ }
+
+ /// Creates a "temporary workspace" from one package which only contains
+ /// that package.
+ ///
+ /// This constructor will not touch the filesystem and only creates an
+ /// in-memory workspace. That is, all configuration is ignored; it's just
+ /// intended for that one package.
+ ///
+ /// This is currently only used in niche situations like `cargo install` or
+ /// `cargo package`.
+ pub fn ephemeral(package: Package,
+ config: &'cfg Config,
+ target_dir: Option<Filesystem>,
+ require_optional_deps: bool) -> CargoResult<Workspace<'cfg>> {
+ let mut ws = Workspace {
+ config: config,
+ current_manifest: package.manifest_path().to_path_buf(),
+ packages: Packages {
+ config: config,
+ packages: HashMap::new(),
+ },
+ root_manifest: None,
+ target_dir: None,
+ members: Vec::new(),
+ is_ephemeral: true,
+ require_optional_deps: require_optional_deps,
+ };
+ {
+ let key = ws.current_manifest.parent().unwrap();
+ let package = MaybePackage::Package(package);
+ ws.packages.packages.insert(key.to_path_buf(), package);
+ ws.target_dir = if let Some(dir) = target_dir {
+ Some(dir)
+ } else {
+ ws.config.target_dir()?
+ };
+ ws.members.push(ws.current_manifest.clone());
+ }
+ Ok(ws)
+ }
+
+ /// Returns the current package of this workspace.
+ ///
+ /// Note that this can return an error if the current manifest is
+ /// actually a "virtual Cargo.toml", in which case an error is returned
+ /// indicating that something else should be passed.
+ pub fn current(&self) -> CargoResult<&Package> {
+ self.current_opt().ok_or_else(||
+ format!("manifest path `{}` is a virtual manifest, but this \
+ command requires running against an actual package in \
+ this workspace", self.current_manifest.display()).into()
+ )
+ }
+
+ pub fn current_opt(&self) -> Option<&Package> {
+ match *self.packages.get(&self.current_manifest) {
+ MaybePackage::Package(ref p) => Some(p),
+ MaybePackage::Virtual(..) => None
+ }
+ }
+
+ pub fn is_virtual(&self) -> bool {
+ match *self.packages.get(&self.current_manifest) {
+ MaybePackage::Package(..) => false,
+ MaybePackage::Virtual(..) => true
+ }
+ }
+
+ /// Returns the `Config` this workspace is associated with.
+ pub fn config(&self) -> &'cfg Config {
+ self.config
+ }
+
+ pub fn profiles(&self) -> &Profiles {
+ let root = self.root_manifest.as_ref().unwrap_or(&self.current_manifest);
+ match *self.packages.get(root) {
+ MaybePackage::Package(ref p) => p.manifest().profiles(),
+ MaybePackage::Virtual(ref vm) => vm.profiles(),
+ }
+ }
+
+ /// Returns the root path of this workspace.
+ ///
+ /// That is, this returns the path of the directory containing the
+ /// `Cargo.toml` which is the root of this workspace.
+ pub fn root(&self) -> &Path {
+ match self.root_manifest {
+ Some(ref p) => p,
+ None => &self.current_manifest
+ }.parent().unwrap()
+ }
+
+ pub fn target_dir(&self) -> Filesystem {
+ self.target_dir.clone().unwrap_or_else(|| {
+ Filesystem::new(self.root().join("target"))
+ })
+ }
+
+ /// Returns the root [replace] section of this workspace.
+ ///
+ /// This may be from a virtual crate or an actual crate.
+ pub fn root_replace(&self) -> &[(PackageIdSpec, Dependency)] {
+ let path = match self.root_manifest {
+ Some(ref p) => p,
+ None => &self.current_manifest,
+ };
+ match *self.packages.get(path) {
+ MaybePackage::Package(ref p) => p.manifest().replace(),
+ MaybePackage::Virtual(ref vm) => vm.replace(),
+ }
+ }
+
+ /// Returns the root [patch] section of this workspace.
+ ///
+ /// This may be from a virtual crate or an actual crate.
+ pub fn root_patch(&self) -> &HashMap<Url, Vec<Dependency>> {
+ let path = match self.root_manifest {
+ Some(ref p) => p,
+ None => &self.current_manifest,
+ };
+ match *self.packages.get(path) {
+ MaybePackage::Package(ref p) => p.manifest().patch(),
+ MaybePackage::Virtual(ref vm) => vm.patch(),
+ }
+ }
+
+ /// Returns an iterator over all packages in this workspace
+ pub fn members<'a>(&'a self) -> Members<'a, 'cfg> {
+ Members {
+ ws: self,
+ iter: self.members.iter(),
+ }
+ }
+
+ pub fn is_ephemeral(&self) -> bool {
+ self.is_ephemeral
+ }
+
+ pub fn require_optional_deps(&self) -> bool {
+ self.require_optional_deps
+ }
+
+ /// Finds the root of a workspace for the crate whose manifest is located
+ /// at `manifest_path`.
+ ///
+ /// This will parse the `Cargo.toml` at `manifest_path` and then interpret
+ /// the workspace configuration, optionally walking up the filesystem
+ /// looking for other workspace roots.
+ ///
+ /// Returns an error if `manifest_path` isn't actually a valid manifest or
+ /// if some other transient error happens.
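+ ///
+ /// Sketch of the search (hypothetical layout): for `ws/member/Cargo.toml`
+ /// with no workspace keys of its own, each ancestor directory (`ws`, then
+ /// its parents) is probed for a `Cargo.toml`; if `ws/Cargo.toml` contains
+ /// `[workspace]` and does not exclude the member, it is returned as the
+ /// root.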
+ fn find_root(&mut self, manifest_path: &Path)
+ -> CargoResult<Option<PathBuf>> {
+ fn read_root_pointer(member_manifest: &Path, root_link: &str) -> CargoResult<PathBuf> {
+ let path = member_manifest.parent().unwrap()
+ .join(root_link)
+ .join("Cargo.toml");
+ debug!("find_root - pointer {}", path.display());
+ Ok(paths::normalize_path(&path))
+ }
+
+ {
+ let current = self.packages.load(manifest_path)?;
+ match *current.workspace_config() {
+ WorkspaceConfig::Root(_) => {
+ debug!("find_root - is root {}", manifest_path.display());
+ return Ok(Some(manifest_path.to_path_buf()))
+ }
+ WorkspaceConfig::Member { root: Some(ref path_to_root) } => {
+ return Ok(Some(read_root_pointer(manifest_path, path_to_root)?))
+ }
+ WorkspaceConfig::Member { root: None } => {}
+ }
+ }
+
+ for path in paths::ancestors(manifest_path).skip(2) {
+ let ances_manifest_path = path.join("Cargo.toml");
+ debug!("find_root - trying {}", ances_manifest_path.display());
+ if ances_manifest_path.exists() {
+ match *self.packages.load(&ances_manifest_path)?.workspace_config() {
+ WorkspaceConfig::Root(ref ances_root_config) => {
+ debug!("find_root - found a root checking exclusion");
+ if !ances_root_config.is_excluded(&manifest_path) {
+ debug!("find_root - found!");
+ return Ok(Some(ances_manifest_path))
+ }
+ }
+ WorkspaceConfig::Member { root: Some(ref path_to_root) } => {
+ debug!("find_root - found pointer");
+ return Ok(Some(read_root_pointer(&ances_manifest_path, path_to_root)?))
+ }
+ WorkspaceConfig::Member { .. } => {}
+ }
+ }
+ }
+
+ Ok(None)
+ }
+
+ /// After the root of a workspace has been located, probes for all members
+ /// of a workspace.
+ ///
+ /// If the `workspace.members` configuration is present, then this just
+ /// verifies that those are all valid packages to point to. Otherwise, this
+ /// will transitively follow all `path` dependencies looking for members of
+ /// the workspace.
+ fn find_members(&mut self) -> CargoResult<()> {
+ let root_manifest_path = match self.root_manifest {
+ Some(ref path) => path.clone(),
+ None => {
+ debug!("find_members - only me as a member");
+ self.members.push(self.current_manifest.clone());
+ return Ok(())
+ }
+ };
+
+ let members_paths = {
+ let root_package = self.packages.load(&root_manifest_path)?;
+ match *root_package.workspace_config() {
+ WorkspaceConfig::Root(ref root_config) => root_config.members_paths()?,
+ _ => bail!("root of a workspace inferred but wasn't a root: {}",
+ root_manifest_path.display()),
+ }
+ };
+
+ for path in members_paths {
+ self.find_path_deps(&path.join("Cargo.toml"), &root_manifest_path, false)?;
+ }
+
+ self.find_path_deps(&root_manifest_path, &root_manifest_path, false)
+ }
+
+ fn find_path_deps(&mut self,
+ manifest_path: &Path,
+ root_manifest: &Path,
+ is_path_dep: bool) -> CargoResult<()> {
+ let manifest_path = paths::normalize_path(manifest_path);
+ if self.members.iter().any(|p| p == &manifest_path) {
+ return Ok(())
+ }
+ if is_path_dep
+ && !manifest_path.parent().unwrap().starts_with(self.root())
+ && self.find_root(&manifest_path)? != self.root_manifest {
+ // If `manifest_path` is a path dependency outside of the workspace,
+ // don't add it, or any of its dependencies, as members.
+ return Ok(())
+ }
+
+ match *self.packages.load(root_manifest)?.workspace_config() {
+ WorkspaceConfig::Root(ref root_config) => {
+ if root_config.is_excluded(&manifest_path) {
+ return Ok(())
+ }
+ }
+ _ => {}
+ }
+
+ debug!("find_members - {}", manifest_path.display());
+ self.members.push(manifest_path.clone());
+
+ let candidates = {
+ let pkg = match *self.packages.load(&manifest_path)? {
+ MaybePackage::Package(ref p) => p,
+ MaybePackage::Virtual(_) => return Ok(()),
+ };
+ pkg.dependencies()
+ .iter()
+ .map(|d| d.source_id())
+ .filter(|d| d.is_path())
+ .filter_map(|d| d.url().to_file_path().ok())
+ .map(|p| p.join("Cargo.toml"))
+ .collect::<Vec<_>>()
+ };
+ for candidate in candidates {
+ self.find_path_deps(&candidate, root_manifest, true)?;
+ }
+ Ok(())
+ }
+
+ /// Validates a workspace, ensuring that a number of invariants are upheld:
+ ///
+ /// 1. A workspace only has one root.
+ /// 2. All workspace members agree on this one root as the root.
+ /// 3. The current crate is a member of this workspace.
+ fn validate(&mut self) -> CargoResult<()> {
+ if self.root_manifest.is_none() {
+ return Ok(())
+ }
+
+ let mut roots = Vec::new();
+ {
+ let mut names = BTreeMap::new();
+ for member in self.members.iter() {
+ let package = self.packages.get(member);
+ match *package.workspace_config() {
+ WorkspaceConfig::Root(_) => {
+ roots.push(member.parent().unwrap().to_path_buf());
+ }
+ WorkspaceConfig::Member { .. } => {}
+ }
+ let name = match *package {
+ MaybePackage::Package(ref p) => p.name(),
+ MaybePackage::Virtual(_) => continue,
+ };
+ if let Some(prev) = names.insert(name, member) {
+ bail!("two packages named `{}` in this workspace:\n\
+ - {}\n\
+ - {}", name, prev.display(), member.display());
+ }
+ }
+ }
+
+ match roots.len() {
+ 0 => {
+ bail!("`package.workspace` configuration points to a crate \
+ which is not configured with [workspace]: \n\
+ configuration at: {}\n\
+ points to: {}",
+ self.current_manifest.display(),
+ self.root_manifest.as_ref().unwrap().display())
+ }
+ 1 => {}
+ _ => {
+ bail!("multiple workspace roots found in the same workspace:\n{}",
+ roots.iter()
+ .map(|r| format!(" {}", r.display()))
+ .collect::<Vec<_>>()
+ .join("\n"));
+ }
+ }
+
+ for member in self.members.clone() {
+ let root = self.find_root(&member)?;
+ if root == self.root_manifest {
+ continue
+ }
+
+ match root {
+ Some(root) => {
+ bail!("package `{}` is a member of the wrong workspace\n\
+ expected: {}\n\
+ actual: {}",
+ member.display(),
+ self.root_manifest.as_ref().unwrap().display(),
+ root.display());
+ }
+ None => {
+ bail!("workspace member `{}` is not hierarchically below \
+ the workspace root `{}`",
+ member.display(),
+ self.root_manifest.as_ref().unwrap().display());
+ }
+ }
+ }
+
+ if !self.members.contains(&self.current_manifest) {
+ let root = self.root_manifest.as_ref().unwrap();
+ let root_dir = root.parent().unwrap();
+ let current_dir = self.current_manifest.parent().unwrap();
+ let root_pkg = self.packages.get(root);
+
+ // FIXME: Make this more generic by using a relative path resolver between member and
+ // root.
+ let members_msg = match current_dir.strip_prefix(root_dir) {
+ Ok(rel) => {
+ format!("this may be fixable by adding `{}` to the \
+ `workspace.members` array of the manifest \
+ located at: {}",
+ rel.display(),
+ root.display())
+ }
+ Err(_) => {
+ format!("this may be fixable by adding a member to \
+ the `workspace.members` array of the \
+ manifest located at: {}", root.display())
+ }
+ };
+ let extra = match *root_pkg {
+ MaybePackage::Virtual(_) => members_msg,
+ MaybePackage::Package(ref p) => {
+ let has_members_list = match *p.manifest().workspace_config() {
+ WorkspaceConfig::Root(ref root_config) => root_config.has_members_list(),
+ WorkspaceConfig::Member { .. } => unreachable!(),
+ };
+ if !has_members_list {
+ format!("this may be fixable by ensuring that this \
+ crate is depended on by the workspace \
+ root: {}", root.display())
+ } else {
+ members_msg
+ }
+ }
+ };
+ bail!("current package believes it's in a workspace when it's not:\n\
+ current: {}\n\
+ workspace: {}\n\n{}",
+ self.current_manifest.display(),
+ root.display(),
+ extra);
+ }
+
+ if let Some(ref root_manifest) = self.root_manifest {
+ let default_profiles = Profiles {
+ release: Profile::default_release(),
+ dev: Profile::default_dev(),
+ test: Profile::default_test(),
+ test_deps: Profile::default_dev(),
+ bench: Profile::default_bench(),
+ bench_deps: Profile::default_release(),
+ doc: Profile::default_doc(),
+ custom_build: Profile::default_custom_build(),
+ check: Profile::default_check(),
+ check_test: Profile::default_check_test(),
+ doctest: Profile::default_doctest(),
+ };
+
+ for pkg in self.members().filter(|p| p.manifest_path() != root_manifest) {
+ if pkg.manifest().profiles() != &default_profiles {
+ let message = &format!("profiles for the non root package will be ignored, \
+ specify profiles at the workspace root:\n\
+ package: {}\n\
+ workspace: {}",
+ pkg.manifest_path().display(),
+ root_manifest.display());
+
+ // TODO: remove `Eq` bound from `Profiles` when the warning is removed.
+ self.config.shell().warn(&message)?;
+ }
+ }
+ }
+
+ Ok(())
+ }
+}
+
+impl<'cfg> Packages<'cfg> {
+ fn get(&self, manifest_path: &Path) -> &MaybePackage {
+ &self.packages[manifest_path.parent().unwrap()]
+ }
+
+ fn load(&mut self, manifest_path: &Path) -> CargoResult<&MaybePackage> {
+ let key = manifest_path.parent().unwrap();
+ match self.packages.entry(key.to_path_buf()) {
+ Entry::Occupied(e) => Ok(e.into_mut()),
+ Entry::Vacant(v) => {
+ let source_id = SourceId::for_path(key)?;
+ let (manifest, _nested_paths) =
+ read_manifest(manifest_path, &source_id, self.config)?;
+ Ok(v.insert(match manifest {
+ EitherManifest::Real(manifest) => {
+ MaybePackage::Package(Package::new(manifest, manifest_path))
+ }
+ EitherManifest::Virtual(vm) => {
+ MaybePackage::Virtual(vm)
+ }
+ }))
+ }
+ }
+ }
+}
+
+impl<'a, 'cfg> Members<'a, 'cfg> {
+ pub fn is_empty(self) -> bool {
+ self.count() == 0
+ }
+}
+
+impl<'a, 'cfg> Iterator for Members<'a, 'cfg> {
+ type Item = &'a Package;
+
+ fn next(&mut self) -> Option<&'a Package> {
+ loop {
+ let next = self.iter.next().map(|path| {
+ self.ws.packages.get(path)
+ });
+ match next {
+ Some(&MaybePackage::Package(ref p)) => return Some(p),
+ Some(&MaybePackage::Virtual(_)) => {}
+ None => return None,
+ }
+ }
+ }
+}
+
+impl MaybePackage {
+ fn workspace_config(&self) -> &WorkspaceConfig {
+ match *self {
+ MaybePackage::Package(ref p) => p.manifest().workspace_config(),
+ MaybePackage::Virtual(ref vm) => vm.workspace_config(),
+ }
+ }
+}
+
+impl WorkspaceRootConfig {
+ /// Create a new Intermediate Workspace Root configuration.
+ pub fn new(
+ root_dir: &Path,
+ members: &Option<Vec<String>>,
+ exclude: &Option<Vec<String>>,
+ ) -> WorkspaceRootConfig {
+ WorkspaceRootConfig {
+ root_dir: root_dir.to_path_buf(),
+ members: members.clone(),
+ exclude: exclude.clone().unwrap_or_default(),
+ }
+ }
+
+ /// Checks the path against the `exclude` list, taking the `members` list
+ /// into account: a path that is an explicit member is never considered
+ /// excluded, even if it also matches an `exclude` pattern.
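+ ///
+ /// Illustrative example (hypothetical layout): with `exclude = ["vendor"]`
+ /// and `members = ["vendor/special"]`, the manifest at
+ /// `vendor/special/Cargo.toml` is *not* excluded, while
+ /// `vendor/other/Cargo.toml` is.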
+ fn is_excluded(&self, manifest_path: &Path) -> bool {
+ let excluded = self.exclude.iter().any(|ex| {
+ manifest_path.starts_with(self.root_dir.join(ex))
+ });
+
+ let explicit_member = match self.members {
+ Some(ref members) => {
+ members.iter().any(|mem| {
+ manifest_path.starts_with(self.root_dir.join(mem))
+ })
+ }
+ None => false,
+ };
+
+ !explicit_member && excluded
+ }
+
+ fn has_members_list(&self) -> bool {
+ self.members.is_some()
+ }
+
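+ /// Expands the `members` globs into concrete paths relative to `root_dir`.
+ ///
+ /// Sketch (hypothetical values): `members = ["crates/*"]` with `crates/a`
+ /// and `crates/b` on disk yields both paths; a glob that matches nothing
+ /// falls back to the literal path for backwards compatibility.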
+ fn members_paths(&self) -> CargoResult<Vec<PathBuf>> {
+ let mut expanded_list = Vec::new();
+
+ if let Some(globs) = self.members.clone() {
+ for glob in globs {
+ let pathbuf = self.root_dir.join(glob);
+ let expanded_paths = Self::expand_member_path(&pathbuf)?;
+
+ // If glob does not find any valid paths, then put the original
+ // path in the expanded list to maintain backwards compatibility.
+ if expanded_paths.is_empty() {
+ expanded_list.push(pathbuf);
+ } else {
+ expanded_list.extend(expanded_paths);
+ }
+ }
+ }
+
+ Ok(expanded_list)
+ }
+
+ fn expand_member_path(path: &Path) -> CargoResult<Vec<PathBuf>> {
+ let path = match path.to_str() {
+ Some(p) => p,
+ None => return Ok(Vec::new()),
+ };
+ let res = glob(path).chain_err(|| {
+ format!("could not parse pattern `{}`", &path)
+ })?;
+ res.map(|p| {
+ p.chain_err(|| {
+ format!("unable to match path to pattern `{}`", &path)
+ })
+ }).collect()
+ }
+}
--- /dev/null
+#![deny(unused)]
+#![cfg_attr(test, deny(warnings))]
+#![recursion_limit="128"]
+
+#[macro_use] extern crate error_chain;
+#[macro_use] extern crate log;
+#[macro_use] extern crate scoped_tls;
+#[macro_use] extern crate serde_derive;
+#[macro_use] extern crate serde_json;
+extern crate atty;
+extern crate crates_io as registry;
+extern crate crossbeam;
+extern crate curl;
+extern crate docopt;
+extern crate filetime;
+extern crate flate2;
+extern crate fs2;
+extern crate git2;
+extern crate glob;
+extern crate hex;
+extern crate home;
+extern crate ignore;
+extern crate jobserver;
+extern crate libc;
+extern crate libgit2_sys;
+extern crate num_cpus;
+extern crate same_file;
+extern crate semver;
+extern crate serde;
+extern crate serde_ignored;
+extern crate shell_escape;
+extern crate tar;
+extern crate tempdir;
+extern crate termcolor;
+extern crate toml;
+extern crate url;
+#[cfg(target_os = "macos")]
+extern crate core_foundation;
+
+use std::fmt;
+use std::error::Error;
+
+use error_chain::ChainedError;
+use serde::Deserialize;
+use serde::ser;
+use docopt::Docopt;
+
+use core::Shell;
+use core::shell::Verbosity::Verbose;
+
+pub use util::{CargoError, CargoErrorKind, CargoResult, CliError, CliResult, Config};
+
+pub const CARGO_ENV: &'static str = "CARGO";
+
+macro_rules! bail {
+ ($($fmt:tt)*) => (
+ return Err(::util::errors::CargoError::from(format_args!($($fmt)*).to_string()))
+ )
+}
+
+pub mod core;
+pub mod ops;
+pub mod sources;
+pub mod util;
+
+pub struct CommitInfo {
+ pub short_commit_hash: String,
+ pub commit_hash: String,
+ pub commit_date: String,
+}
+
+pub struct CfgInfo {
+ // Information about the git repository we may have been built from.
+ pub commit_info: Option<CommitInfo>,
+ // The release channel we were built for.
+ pub release_channel: String,
+}
+
+pub struct VersionInfo {
+ pub major: String,
+ pub minor: String,
+ pub patch: String,
+ pub pre_release: Option<String>,
+ // Information that's only available when we were built with
+ // configure/make, rather than cargo itself.
+ pub cfg_info: Option<CfgInfo>,
+}
+
+impl fmt::Display for VersionInfo {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "cargo {}.{}.{}",
+ self.major, self.minor, self.patch)?;
+ if let Some(channel) = self.cfg_info.as_ref().map(|ci| &ci.release_channel) {
+ if channel != "stable" {
+ write!(f, "-{}", channel)?;
+ let empty = String::from("");
+ write!(f, "{}", self.pre_release.as_ref().unwrap_or(&empty))?;
+ }
+ };
+
+ if let Some(ref cfg) = self.cfg_info {
+ if let Some(ref ci) = cfg.commit_info {
+ write!(f, " ({} {})",
+ ci.short_commit_hash, ci.commit_date)?;
+ }
+ };
+ Ok(())
+ }
+}
+
+pub fn call_main_without_stdin<'de, Flags: Deserialize<'de>>(
+ exec: fn(Flags, &mut Config) -> CliResult,
+ config: &mut Config,
+ usage: &str,
+ args: &[String],
+ options_first: bool) -> CliResult
+{
+ let docopt = Docopt::new(usage).unwrap()
+ .options_first(options_first)
+ .argv(args.iter().map(|s| &s[..]))
+ .help(true);
+
+ let flags = docopt.deserialize().map_err(|e| {
+ let code = if e.fatal() {1} else {0};
+ CliError::new(e.to_string().into(), code)
+ })?;
+
+ exec(flags, config)
+}
+
+pub fn print_json<T: ser::Serialize>(obj: &T) {
+ let encoded = serde_json::to_string(&obj).unwrap();
+ println!("{}", encoded);
+}
+
+pub fn exit_with_error(err: CliError, shell: &mut Shell) -> ! {
+ debug!("exit_with_error; err={:?}", err);
+
+ let CliError { error, exit_code, unknown } = err;
+ // exit_code == 0 is a non-fatal error, e.g. docopt version info
+ let fatal = exit_code != 0;
+
+ let hide = unknown && shell.verbosity() != Verbose;
+
+ if let Some(error) = error {
+ if hide {
+ drop(shell.error("An unknown error occurred"))
+ } else if fatal {
+ drop(shell.error(&error))
+ } else {
+ drop(writeln!(shell.err(), "{}", error))
+ }
+
+ if !handle_cause(error, shell) || hide {
+ drop(writeln!(shell.err(), "\nTo learn more, run the command again \
+ with --verbose."));
+ }
+ }
+
+ std::process::exit(exit_code)
+}
+
+pub fn handle_error(err: CargoError, shell: &mut Shell) {
+ debug!("handle_error; err={:?}", &err);
+
+ let _ignored_result = shell.error(&err);
+ handle_cause(err, shell);
+}
+
+fn handle_cause<E, EKind>(cargo_err: E, shell: &mut Shell) -> bool
+ where E: ChainedError<ErrorKind=EKind> + 'static
+{
+ fn print(error: String, shell: &mut Shell) {
+ drop(writeln!(shell.err(), "\nCaused by:"));
+ drop(writeln!(shell.err(), " {}", error));
+ }
+
+ // Error inspection in non-verbose mode requires inspecting the
+ // error kind to avoid printing Internal errors. The downcasting
+ // machinery requires &(Error + 'static), but the iterator (and
+ // underlying `cause`) returns &Error. Because the borrows are
+ // constrained to this handling method, and because the original
+ // error object is constrained to be 'static, we're casting away
+ // the borrow's actual lifetime for purposes of downcasting and
+ // inspecting the error chain.
+ unsafe fn extend_lifetime(r: &Error) -> &(Error + 'static) {
+ std::mem::transmute::<&Error, &Error>(r)
+ }
+
+ let verbose = shell.verbosity();
+
+ if verbose == Verbose {
+ // The first error has already been printed to the shell.
+ // Print all remaining errors.
+ for err in cargo_err.iter().skip(1) {
+ print(err.to_string(), shell);
+ }
+ } else {
+ // The first error has already been printed to the shell.
+ // Print remaining errors until one marked as Internal appears.
+ for err in cargo_err.iter().skip(1) {
+ let err = unsafe { extend_lifetime(err) };
+ if let Some(&CargoError(CargoErrorKind::Internal(..), ..)) =
+ err.downcast_ref::<CargoError>() {
+ return false;
+ }
+
+ print(err.to_string(), shell);
+ }
+ }
+
+ true
+}
+
+pub fn version() -> VersionInfo {
+ macro_rules! env_str {
+ ($name:expr) => { env!($name).to_string() }
+ }
+ macro_rules! option_env_str {
+ ($name:expr) => { option_env!($name).map(|s| s.to_string()) }
+ }
+ match option_env!("CFG_RELEASE_CHANNEL") {
+ // We have environment variables set up from configure/make.
+ Some(_) => {
+ let commit_info =
+ option_env!("CFG_COMMIT_HASH").map(|s| {
+ CommitInfo {
+ commit_hash: s.to_string(),
+ short_commit_hash: option_env_str!("CFG_SHORT_COMMIT_HASH").unwrap(),
+ commit_date: option_env_str!("CFG_COMMIT_DATE").unwrap(),
+ }
+ });
+ VersionInfo {
+ major: env_str!("CARGO_PKG_VERSION_MAJOR"),
+ minor: env_str!("CARGO_PKG_VERSION_MINOR"),
+ patch: env_str!("CARGO_PKG_VERSION_PATCH"),
+ pre_release: option_env_str!("CARGO_PKG_VERSION_PRE"),
+ cfg_info: Some(CfgInfo {
+ release_channel: option_env_str!("CFG_RELEASE_CHANNEL").unwrap(),
+ commit_info: commit_info,
+ }),
+ }
+ },
+ // We are being compiled by Cargo itself.
+ None => {
+ VersionInfo {
+ major: env_str!("CARGO_PKG_VERSION_MAJOR"),
+ minor: env_str!("CARGO_PKG_VERSION_MINOR"),
+ patch: env_str!("CARGO_PKG_VERSION_PATCH"),
+ pre_release: option_env_str!("CARGO_PKG_VERSION_PRE"),
+ cfg_info: None,
+ }
+ }
+ }
+}
--- /dev/null
+use std::default::Default;
+use std::fs;
+use std::path::Path;
+
+use core::{Profiles, Workspace};
+use util::Config;
+use util::errors::{CargoResult, CargoResultExt};
+use ops::{self, Context, BuildConfig, Kind, Unit};
+
+pub struct CleanOptions<'a> {
+ pub spec: &'a [String],
+ pub target: Option<&'a str>,
+ pub config: &'a Config,
+ pub release: bool,
+}
+
+/// Removes build artifacts from the project.
+pub fn clean(ws: &Workspace, opts: &CleanOptions) -> CargoResult<()> {
+ let target_dir = ws.target_dir();
+
+ // If we have a spec, then we need to delete some packages; otherwise just
+ // remove the whole target directory and be done with it!
+ //
+ // Note that we don't bother grabbing a lock here as we're just going to
+ // blow it all away anyway.
+ if opts.spec.is_empty() {
+ let target_dir = target_dir.into_path_unlocked();
+ return rm_rf(&target_dir);
+ }
+
+ let (packages, resolve) = ops::resolve_ws(ws)?;
+
+ let profiles = ws.profiles();
+ let host_triple = opts.config.rustc()?.host.clone();
+ let mut cx = Context::new(ws, &resolve, &packages, opts.config,
+ BuildConfig {
+ host_triple,
+ requested_target: opts.target.map(|s| s.to_owned()),
+ release: opts.release,
+ jobs: 1,
+ ..BuildConfig::default()
+ },
+ profiles)?;
+ let mut units = Vec::new();
+
+ for spec in opts.spec {
+ // Translate the spec to a Package
+ let pkgid = resolve.query(spec)?;
+ let pkg = packages.get(pkgid)?;
+
+ // Generate all relevant `Unit` targets for this package
+ for target in pkg.targets() {
+ for kind in [Kind::Host, Kind::Target].iter() {
+ let Profiles {
+ ref release, ref dev, ref test, ref bench, ref doc,
+ ref custom_build, ref test_deps, ref bench_deps, ref check,
+ ref check_test, ref doctest,
+ } = *profiles;
+ let profiles = [release, dev, test, bench, doc, custom_build,
+ test_deps, bench_deps, check, check_test, doctest];
+ for profile in profiles.iter() {
+ units.push(Unit {
+ pkg,
+ target,
+ profile,
+ kind: *kind,
+ });
+ }
+ }
+ }
+ }
+
+ cx.probe_target_info(&units)?;
+
+ for unit in units.iter() {
+ rm_rf(&cx.fingerprint_dir(unit))?;
+ if unit.target.is_custom_build() {
+ if unit.profile.run_custom_build {
+ rm_rf(&cx.build_script_out_dir(unit))?;
+ } else {
+ rm_rf(&cx.build_script_dir(unit))?;
+ }
+ continue
+ }
+
+ for &(ref src, ref link_dst, _) in cx.target_filenames(unit)?.iter() {
+ rm_rf(src)?;
+ if let Some(ref dst) = *link_dst {
+ rm_rf(dst)?;
+ }
+ }
+ }
+
+ Ok(())
+}
+
+fn rm_rf(path: &Path) -> CargoResult<()> {
+ let m = fs::metadata(path);
+ if m.as_ref().map(|s| s.is_dir()).unwrap_or(false) {
+ fs::remove_dir_all(path).chain_err(|| {
+ "could not remove build directory"
+ })?;
+ } else if m.is_ok() {
+ fs::remove_file(path).chain_err(|| {
+ "failed to remove build artifact"
+ })?;
+ }
+ Ok(())
+}
--- /dev/null
+//!
+//! Cargo compile currently does the following steps:
+//!
+//! All configurations are already injected as environment variables via the
+//! main cargo command
+//!
+//! 1. Read the manifest
+//! 2. Shell out to `cargo-resolve` with a list of dependencies and sources as
+//! stdin
+//!
+//! a. Shell out to `--do update` and `--do list` for each source
+//! b. Resolve dependencies and return a list of name/version/source
+//!
+//! 3. Shell out to `--do download` for each source
+//! 4. Shell out to `--do get` for each source, and build up the list of paths
+//! to pass to rustc -L
+//! 5. Call `cargo-rustc` with the results of the resolver zipped together with
+//! the results of the `get`
+//!
+//! a. Topologically sort the dependencies
+//! b. Compile each dependency in order, passing in the -L's pointing at each
+//! previously compiled dependency
+//!
+
+use std::collections::{HashMap, HashSet};
+use std::default::Default;
+use std::path::PathBuf;
+use std::sync::Arc;
+
+use core::{Source, Package, Target};
+use core::{Profile, TargetKind, Profiles, Workspace, PackageId, PackageIdSpec};
+use core::resolver::Resolve;
+use ops::{self, BuildOutput, Executor, DefaultExecutor};
+use util::config::Config;
+use util::{CargoResult, profile};
+use util::errors::{CargoResultExt, CargoError};
+
+/// Contains information about how a package should be compiled.
+#[derive(Debug)]
+pub struct CompileOptions<'a> {
+ pub config: &'a Config,
+ /// Number of concurrent jobs to use.
+ pub jobs: Option<u32>,
+ /// The target platform to compile for (example: `i686-unknown-linux-gnu`).
+ pub target: Option<&'a str>,
+ /// Extra features to build for the root package
+ pub features: &'a [String],
+ /// Flag whether all available features should be built for the root package
+ pub all_features: bool,
+ /// Flag whether the default features of the root package should be
+ /// disabled (`--no-default-features`)
+ pub no_default_features: bool,
+ /// A set of packages to build.
+ pub spec: Packages<'a>,
+ /// Filter to apply to the root package to select which targets will be
+ /// built.
+ pub filter: CompileFilter<'a>,
+ /// Whether this is a release build or not
+ pub release: bool,
+ /// Mode for this compile.
+ pub mode: CompileMode,
+ /// `--error-format` flag for the compiler.
+ pub message_format: MessageFormat,
+ /// Extra arguments to be passed to rustdoc (for main crate and dependencies)
+ pub target_rustdoc_args: Option<&'a [String]>,
+ /// The specified target will be compiled with all the available arguments;
+ /// note that this only accounts for the *final* invocation of rustc
+ pub target_rustc_args: Option<&'a [String]>,
+}
+
+impl<'a> CompileOptions<'a> {
+ pub fn default(config: &'a Config, mode: CompileMode) -> CompileOptions<'a>
+ {
+ CompileOptions {
+ config: config,
+ jobs: None,
+ target: None,
+ features: &[],
+ all_features: false,
+ no_default_features: false,
+ spec: ops::Packages::Packages(&[]),
+ mode: mode,
+ release: false,
+ filter: CompileFilter::Default { required_features_filterable: false },
+ message_format: MessageFormat::Human,
+ target_rustdoc_args: None,
+ target_rustc_args: None,
+ }
+ }
+}
+
+#[derive(Clone, Copy, PartialEq, Debug)]
+pub enum CompileMode {
+ Test,
+ Build,
+ Check { test: bool },
+ Bench,
+ Doc { deps: bool },
+ Doctest,
+}
+
+#[derive(Clone, Copy, Debug, PartialEq, Eq, Deserialize)]
+pub enum MessageFormat {
+ Human,
+ Json
+}
+
+#[derive(Clone, Copy, PartialEq, Eq, Debug)]
+pub enum Packages<'a> {
+ All,
+ OptOut(&'a [String]),
+ Packages(&'a [String]),
+}
+
+impl<'a> Packages<'a> {
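+ /// Derives the package selection from command-line flags.
+ ///
+ /// Sketch of the mapping (flag values are hypothetical): `--all` maps to
+ /// `Packages::All`; `--all --exclude foo` to `Packages::OptOut(["foo"])`;
+ /// `-p foo -p bar` to `Packages::Packages(["foo", "bar"])`; and in a
+ /// virtual workspace with no `-p` flags, `--all` is implied.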
+ pub fn from_flags(virtual_ws: bool, all: bool, exclude: &'a [String], package: &'a [String])
+ -> CargoResult<Self>
+ {
+ let all = all || (virtual_ws && package.is_empty());
+
+ let packages = match (all, &exclude) {
+ (true, exclude) if exclude.is_empty() => Packages::All,
+ (true, exclude) => Packages::OptOut(exclude),
+ (false, exclude) if !exclude.is_empty() => bail!("--exclude can only be used together \
+ with --all"),
+ _ => Packages::Packages(package),
+ };
+
+ Ok(packages)
+ }
+
+ pub fn into_package_id_specs(self, ws: &Workspace) -> CargoResult<Vec<PackageIdSpec>> {
+ let specs = match self {
+ Packages::All => {
+ ws.members()
+ .map(Package::package_id)
+ .map(PackageIdSpec::from_package_id)
+ .collect()
+ }
+ Packages::OptOut(opt_out) => {
+ ws.members()
+ .map(Package::package_id)
+ .map(PackageIdSpec::from_package_id)
+ .filter(|p| opt_out.iter().position(|x| *x == p.name()).is_none())
+ .collect()
+ }
+ Packages::Packages(packages) if packages.is_empty() => {
+ ws.current_opt()
+ .map(Package::package_id)
+ .map(PackageIdSpec::from_package_id)
+ .into_iter().collect()
+ }
+ Packages::Packages(packages) => {
+ packages.iter().map(|p| PackageIdSpec::parse(p)).collect::<CargoResult<Vec<_>>>()?
+ }
+ };
+ Ok(specs)
+ }
+}
+
+#[derive(Clone, Copy, Debug)]
+pub enum FilterRule<'a> {
+ All,
+ Just(&'a [String]),
+}
+
+#[derive(Debug)]
+pub enum CompileFilter<'a> {
+ Default {
+ /// Flag whether targets can be safely skipped when required-features are not satisfied.
+ required_features_filterable: bool,
+ },
+ Only {
+ all_targets: bool,
+ lib: bool,
+ bins: FilterRule<'a>,
+ examples: FilterRule<'a>,
+ tests: FilterRule<'a>,
+ benches: FilterRule<'a>,
+ }
+}
+
+pub fn compile<'a>(ws: &Workspace<'a>, options: &CompileOptions<'a>)
+ -> CargoResult<ops::Compilation<'a>> {
+ compile_with_exec(ws, options, Arc::new(DefaultExecutor))
+}
+
+pub fn compile_with_exec<'a>(ws: &Workspace<'a>,
+ options: &CompileOptions<'a>,
+ exec: Arc<Executor>)
+ -> CargoResult<ops::Compilation<'a>> {
+ for member in ws.members() {
+ for warning in member.manifest().warnings().iter() {
+ if warning.is_critical {
+ let err: CargoResult<_> = Err(CargoError::from(warning.message.to_owned()));
+ return err.chain_err(|| {
+ format!("failed to parse manifest at `{}`", member.manifest_path().display())
+ })
+ } else {
+ options.config.shell().warn(&warning.message)?
+ }
+ }
+ }
+ compile_ws(ws, None, options, exec)
+}
+
+pub fn compile_ws<'a>(ws: &Workspace<'a>,
+ source: Option<Box<Source + 'a>>,
+ options: &CompileOptions<'a>,
+ exec: Arc<Executor>)
+ -> CargoResult<ops::Compilation<'a>> {
+ let CompileOptions { config, jobs, target, spec, features,
+ all_features, no_default_features,
+ release, mode, message_format,
+ ref filter,
+ ref target_rustdoc_args,
+ ref target_rustc_args } = *options;
+
+ let target = target.map(|s| s.to_string());
+
+ if jobs == Some(0) {
+ bail!("jobs must be at least 1")
+ }
+
+ let profiles = ws.profiles();
+
+ let specs = spec.into_package_id_specs(ws)?;
+ let resolve = ops::resolve_ws_precisely(ws,
+ source,
+ features,
+ all_features,
+ no_default_features,
+ &specs)?;
+ let (packages, resolve_with_overrides) = resolve;
+
+ if specs.is_empty() {
+ return Err(format!("manifest path `{}` contains no package: The manifest is virtual, \
+ and the workspace has no members.", ws.current_manifest().display()).into());
+ }
+
+ let to_builds = specs.iter().map(|p| {
+ let pkgid = p.query(resolve_with_overrides.iter())?;
+ let p = packages.get(pkgid)?;
+ p.manifest().print_teapot(ws.config());
+ Ok(p)
+ }).collect::<CargoResult<Vec<_>>>()?;
+
+ let mut general_targets = Vec::new();
+ let mut package_targets = Vec::new();
+
+ match (*target_rustc_args, *target_rustdoc_args) {
+ (Some(..), _) |
+ (_, Some(..)) if to_builds.len() != 1 => {
+ panic!("`rustc` and `rustdoc` should not accept multiple `-p` flags")
+ }
+ (Some(args), _) => {
+ let all_features = resolve_all_features(&resolve_with_overrides,
+ to_builds[0].package_id());
+ let targets = generate_targets(to_builds[0], profiles,
+ mode, filter, &all_features, release)?;
+ if targets.len() == 1 {
+ let (target, profile) = targets[0];
+ let mut profile = profile.clone();
+ profile.rustc_args = Some(args.to_vec());
+ general_targets.push((target, profile));
+ } else {
+ bail!("extra arguments to `rustc` can only be passed to one \
+ target, consider filtering\nthe package by passing \
+ e.g. `--lib` or `--bin NAME` to specify a single target")
+ }
+ }
+ (None, Some(args)) => {
+ let all_features = resolve_all_features(&resolve_with_overrides,
+ to_builds[0].package_id());
+ let targets = generate_targets(to_builds[0], profiles,
+ mode, filter, &all_features, release)?;
+ if targets.len() == 1 {
+ let (target, profile) = targets[0];
+ let mut profile = profile.clone();
+ profile.rustdoc_args = Some(args.to_vec());
+ general_targets.push((target, profile));
+ } else {
+ bail!("extra arguments to `rustdoc` can only be passed to one \
+ target, consider filtering\nthe package by passing e.g. \
+ `--lib` or `--bin NAME` to specify a single target")
+ }
+ }
+ (None, None) => {
+ for &to_build in to_builds.iter() {
+ let all_features = resolve_all_features(&resolve_with_overrides,
+ to_build.package_id());
+ let targets = generate_targets(to_build, profiles, mode,
+ filter, &all_features, release)?;
+ package_targets.push((to_build, targets));
+ }
+ }
+ };
+
+ for &(target, ref profile) in &general_targets {
+ for &to_build in to_builds.iter() {
+ package_targets.push((to_build, vec![(target, profile)]));
+ }
+ }
+
+ let mut ret = {
+ let _p = profile::start("compiling");
+ let mut build_config = scrape_build_config(config, jobs, target)?;
+ build_config.release = release;
+ build_config.test = mode == CompileMode::Test || mode == CompileMode::Bench;
+ build_config.json_messages = message_format == MessageFormat::Json;
+ if let CompileMode::Doc { deps } = mode {
+ build_config.doc_all = deps;
+ }
+
+ ops::compile_targets(ws,
+ &package_targets,
+ &packages,
+ &resolve_with_overrides,
+ config,
+ build_config,
+ profiles,
+ exec)?
+ };
+
+ ret.to_doc_test = to_builds.into_iter().cloned().collect();
+
+ return Ok(ret);
+
+ fn resolve_all_features(resolve_with_overrides: &Resolve,
+ package_id: &PackageId)
+ -> HashSet<String> {
+ let mut features = resolve_with_overrides.features(package_id).clone();
+
+ // Include features enabled for use by dependencies so targets can also use them with the
+ // required-features field when deciding whether to be built or skipped.
+ let deps = resolve_with_overrides.deps(package_id);
+ for dep in deps {
+ for feature in resolve_with_overrides.features(dep) {
+ features.insert(dep.name().to_string() + "/" + feature);
+ }
+ }
+
+ features
+ }
+}
+
+impl<'a> FilterRule<'a> {
+ pub fn new(targets: &'a [String], all: bool) -> FilterRule<'a> {
+ if all {
+ FilterRule::All
+ } else {
+ FilterRule::Just(targets)
+ }
+ }
+
+ fn matches(&self, target: &Target) -> bool {
+ match *self {
+ FilterRule::All => true,
+ FilterRule::Just(targets) => {
+ targets.iter().any(|x| *x == target.name())
+ },
+ }
+ }
+
+ fn is_specific(&self) -> bool {
+ match *self {
+ FilterRule::All => true,
+ FilterRule::Just(targets) => !targets.is_empty(),
+ }
+ }
+
+ pub fn try_collect(&self) -> Option<Vec<String>> {
+ match *self {
+ FilterRule::All => None,
+ FilterRule::Just(targets) => Some(targets.to_vec()),
+ }
+ }
+}
+
+impl<'a> CompileFilter<'a> {
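+ /// Builds a `CompileFilter` from the target-selection flags.
+ ///
+ /// Sketch (hypothetical flags): `--all-targets` selects everything;
+ /// `--lib` or `--bin foo` produce `CompileFilter::Only` with the matching
+ /// rules; with no target flags at all, the `Default` filter is used and
+ /// targets may be skipped based on required-features.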
+ pub fn new(lib_only: bool,
+ bins: &'a [String], all_bins: bool,
+ tsts: &'a [String], all_tsts: bool,
+ exms: &'a [String], all_exms: bool,
+ bens: &'a [String], all_bens: bool,
+ all_targets: bool) -> CompileFilter<'a> {
+ let rule_bins = FilterRule::new(bins, all_bins);
+ let rule_tsts = FilterRule::new(tsts, all_tsts);
+ let rule_exms = FilterRule::new(exms, all_exms);
+ let rule_bens = FilterRule::new(bens, all_bens);
+
+ if all_targets {
+ CompileFilter::Only {
+ all_targets: true,
+ lib: true, bins: FilterRule::All,
+ examples: FilterRule::All, benches: FilterRule::All,
+ tests: FilterRule::All,
+ }
+ } else if lib_only || rule_bins.is_specific() || rule_tsts.is_specific()
+ || rule_exms.is_specific() || rule_bens.is_specific() {
+ CompileFilter::Only {
+ all_targets: false,
+ lib: lib_only, bins: rule_bins,
+ examples: rule_exms, benches: rule_bens,
+ tests: rule_tsts,
+ }
+ } else {
+ CompileFilter::Default {
+ required_features_filterable: true,
+ }
+ }
+ }
+
+ pub fn matches(&self, target: &Target) -> bool {
+ match *self {
+ CompileFilter::Default { .. } => true,
+ CompileFilter::Only { lib, bins, examples, tests, benches, .. } => {
+ let rule = match *target.kind() {
+ TargetKind::Bin => bins,
+ TargetKind::Test => tests,
+ TargetKind::Bench => benches,
+ TargetKind::ExampleBin |
+ TargetKind::ExampleLib(..) => examples,
+ TargetKind::Lib(..) => return lib,
+ TargetKind::CustomBuild => return false,
+ };
+ rule.matches(target)
+ }
+ }
+ }
+
+ pub fn is_specific(&self) -> bool {
+ match *self {
+ CompileFilter::Default { .. } => false,
+ CompileFilter::Only { .. } => true,
+ }
+ }
+}
+
+#[derive(Clone, Copy, Debug)]
+struct BuildProposal<'a> {
+ target: &'a Target,
+ profile: &'a Profile,
+ required: bool,
+}
+
+fn generate_auto_targets<'a>(mode: CompileMode, targets: &'a [Target],
+ profile: &'a Profile,
+ dep: &'a Profile,
+ required_features_filterable: bool) -> Vec<BuildProposal<'a>> {
+ match mode {
+ CompileMode::Bench => {
+ targets.iter().filter(|t| t.benched()).map(|t| {
+ BuildProposal {
+ target: t,
+ profile: profile,
+ required: !required_features_filterable,
+ }
+ }).collect::<Vec<_>>()
+ }
+ CompileMode::Test => {
+ let mut base = targets.iter().filter(|t| {
+ t.tested()
+ }).map(|t| {
+ BuildProposal {
+ target: t,
+ profile: if t.is_example() {dep} else {profile},
+ required: !required_features_filterable,
+ }
+ }).collect::<Vec<_>>();
+
+ // Always compile the library if we're testing everything as
+ // it'll be needed for doctests
+ if let Some(t) = targets.iter().find(|t| t.is_lib()) {
+ if t.doctested() {
+ base.push(BuildProposal {
+ target: t,
+ profile: dep,
+ required: !required_features_filterable,
+ });
+ }
+ }
+ base
+ }
+ CompileMode::Build | CompileMode::Check{..} => {
+ targets.iter().filter(|t| {
+ t.is_bin() || t.is_lib()
+ }).map(|t| BuildProposal {
+ target: t,
+ profile: profile,
+ required: !required_features_filterable,
+ }).collect()
+ }
+ CompileMode::Doc { .. } => {
+ targets.iter().filter(|t| {
+ t.documented() && (
+ !t.is_bin() ||
+ !targets.iter().any(|l| l.is_lib() && l.name() == t.name())
+ )
+ }).map(|t| BuildProposal {
+ target: t,
+ profile: profile,
+ required: !required_features_filterable,
+ }).collect()
+ }
+ CompileMode::Doctest => {
+ if let Some(t) = targets.iter().find(|t| t.is_lib()) {
+ if t.doctested() {
+ return vec![BuildProposal {
+ target: t,
+ profile: profile,
+ required: !required_features_filterable,
+ }];
+ }
+ }
+
+ Vec::new()
+ }
+ }
+}
+
+/// Given a filter rule and some context, propose a list of targets
+fn propose_indicated_targets<'a>(pkg: &'a Package,
+ rule: FilterRule,
+ desc: &'static str,
+ is_expected_kind: fn(&Target) -> bool,
+ profile: &'a Profile) -> CargoResult<Vec<BuildProposal<'a>>> {
+ match rule {
+ FilterRule::All => {
+ let result = pkg.targets().iter().filter(|t| is_expected_kind(t)).map(|t| {
+ BuildProposal {
+ target: t,
+ profile: profile,
+ required: false,
+ }
+ });
+ Ok(result.collect())
+ }
+ FilterRule::Just(names) => {
+ let mut targets = Vec::new();
+ for name in names {
+ let target = pkg.targets().iter().find(|t| {
+ t.name() == *name && is_expected_kind(t)
+ });
+ let t = match target {
+ Some(t) => t,
+ None => {
+ let suggestion = pkg.find_closest_target(name, is_expected_kind);
+ match suggestion {
+ Some(s) => {
+ let suggested_name = s.name();
+ bail!("no {} target named `{}`\n\nDid you mean `{}`?",
+ desc, name, suggested_name)
+ }
+ None => bail!("no {} target named `{}`", desc, name),
+ }
+ }
+ };
+ debug!("found {} `{}`", desc, name);
+ targets.push(BuildProposal {
+ target: t,
+ profile: profile,
+ required: true,
+ });
+ }
+ Ok(targets)
+ }
+ }
+}
+
+/// Collect the targets that are libraries or have all required features available.
+fn filter_compatible_targets<'a>(mut proposals: Vec<BuildProposal<'a>>,
+ features: &HashSet<String>)
+ -> CargoResult<Vec<(&'a Target, &'a Profile)>> {
+ let mut compatible = Vec::with_capacity(proposals.len());
+ for proposal in proposals.drain(..) {
+ let unavailable_features = match proposal.target.required_features() {
+ Some(rf) => rf.iter().filter(|f| !features.contains(*f)).collect(),
+ None => Vec::new(),
+ };
+ if proposal.target.is_lib() || unavailable_features.is_empty() {
+ compatible.push((proposal.target, proposal.profile));
+ } else if proposal.required {
+ let required_features = proposal.target.required_features().unwrap();
+ let quoted_required_features: Vec<String> = required_features.iter()
+ .map(|s| format!("`{}`", s))
+ .collect();
+ bail!("target `{}` requires the features: {}\n\
+ Consider enabling them by passing e.g. `--features=\"{}\"`",
+ proposal.target.name(),
+ quoted_required_features.join(", "),
+ required_features.join(" "));
+ }
+ }
+ Ok(compatible)
+}
+
+/// Given the configuration for a build, this function will generate all
+/// target/profile combinations needed to be built.
+fn generate_targets<'a>(pkg: &'a Package,
+ profiles: &'a Profiles,
+ mode: CompileMode,
+ filter: &CompileFilter,
+ features: &HashSet<String>,
+ release: bool)
+ -> CargoResult<Vec<(&'a Target, &'a Profile)>> {
+ let build = if release { &profiles.release } else { &profiles.dev };
+ let test = if release { &profiles.bench } else { &profiles.test };
+ let profile = match mode {
+ CompileMode::Test => test,
+ CompileMode::Bench => &profiles.bench,
+ CompileMode::Build => build,
+ CompileMode::Check {test: false} => &profiles.check,
+ CompileMode::Check {test: true} => &profiles.check_test,
+ CompileMode::Doc { .. } => &profiles.doc,
+ CompileMode::Doctest => &profiles.doctest,
+ };
+
+ let test_profile = if profile.check {
+ &profiles.check_test
+ } else if mode == CompileMode::Build {
+ test
+ } else {
+ profile
+ };
+
+ let bench_profile = if profile.check {
+ &profiles.check_test
+ } else if mode == CompileMode::Build {
+ &profiles.bench
+ } else {
+ profile
+ };
+
+ let targets = match *filter {
+ CompileFilter::Default { required_features_filterable } => {
+ let deps = if release {
+ &profiles.bench_deps
+ } else {
+ &profiles.test_deps
+ };
+ generate_auto_targets(mode, pkg.targets(), profile, deps, required_features_filterable)
+ }
+ CompileFilter::Only { all_targets, lib, bins, examples, tests, benches } => {
+ let mut targets = Vec::new();
+
+ if lib {
+ if let Some(t) = pkg.targets().iter().find(|t| t.is_lib()) {
+ targets.push(BuildProposal {
+ target: t,
+ profile: profile,
+ required: true,
+ });
+ } else if !all_targets {
+ bail!("no library targets found")
+ }
+ }
+ targets.append(&mut propose_indicated_targets(
+ pkg, bins, "bin", Target::is_bin, profile)?);
+ targets.append(&mut propose_indicated_targets(
+ pkg, examples, "example", Target::is_example, profile)?);
+ // If --tests was specified, add all targets that would be
+ // generated by `cargo test`.
+ let test_filter = match tests {
+ FilterRule::All => Target::tested,
+ FilterRule::Just(_) => Target::is_test
+ };
+ targets.append(&mut propose_indicated_targets(
+ pkg, tests, "test", test_filter, test_profile)?);
+ // If --benches was specified, add all targets that would be
+ // generated by `cargo bench`.
+ let bench_filter = match benches {
+ FilterRule::All => Target::benched,
+ FilterRule::Just(_) => Target::is_bench
+ };
+ targets.append(&mut propose_indicated_targets(
+ pkg, benches, "bench", bench_filter, bench_profile)?);
+ targets
+ }
+ };
+
+ filter_compatible_targets(targets, features)
+}
+
+/// Parse all config files to learn about build configuration. Currently
+/// configured options are:
+///
+/// * build.jobs
+/// * build.target
+/// * target.$target.ar
+/// * target.$target.linker
+/// * target.$target.libfoo.metadata
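+///
+/// An illustrative (hypothetical values) `.cargo/config` exercising these
+/// keys might look like:
+///
+/// ```toml
+/// [build]
+/// jobs = 4
+/// target = "x86_64-unknown-linux-gnu"
+///
+/// [target.x86_64-unknown-linux-gnu]
+/// ar = "ar"
+/// linker = "cc"
+/// ```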
+fn scrape_build_config(config: &Config,
+ jobs: Option<u32>,
+ target: Option<String>)
+ -> CargoResult<ops::BuildConfig> {
+ if jobs.is_some() && config.jobserver_from_env().is_some() {
+ config.shell().warn("a `-j` argument was passed to Cargo but Cargo is \
+ also configured with an external jobserver in \
+ its environment, ignoring the `-j` parameter")?;
+ }
+ let cfg_jobs = match config.get_i64("build.jobs")? {
+ Some(v) => {
+ if v.val <= 0 {
+ bail!("build.jobs must be positive, but found {} in {}",
+ v.val, v.definition)
+ } else if v.val >= i64::from(u32::max_value()) {
+ bail!("build.jobs is too large: found {} in {}", v.val,
+ v.definition)
+ } else {
+ Some(v.val as u32)
+ }
+ }
+ None => None,
+ };
+ let jobs = jobs.or(cfg_jobs).unwrap_or(::num_cpus::get() as u32);
+ let cfg_target = config.get_string("build.target")?.map(|s| s.val);
+ let target = target.or(cfg_target);
+ let mut base = ops::BuildConfig {
+ host_triple: config.rustc()?.host.clone(),
+ requested_target: target.clone(),
+ jobs: jobs,
+ ..Default::default()
+ };
+ base.host = scrape_target_config(config, &base.host_triple)?;
+ base.target = match target.as_ref() {
+ Some(triple) => scrape_target_config(config, triple)?,
+ None => base.host.clone(),
+ };
+ Ok(base)
+}
+
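+/// Reads the `[target.$triple]` configuration tables, including build
+/// script override tables for individual libraries. A sketch of such an
+/// override (library name and values invented for illustration):
+///
+/// ```toml
+/// [target.x86_64-unknown-linux-gnu.foo]
+/// rustc-link-lib = ["foo"]
+/// rustc-link-search = ["/opt/foo/lib"]
+/// rustc-cfg = ["has_foo"]
+/// ```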
+fn scrape_target_config(config: &Config, triple: &str)
+ -> CargoResult<ops::TargetConfig> {
+
+ let key = format!("target.{}", triple);
+ let mut ret = ops::TargetConfig {
+ ar: config.get_path(&format!("{}.ar", key))?.map(|v| v.val),
+ linker: config.get_path(&format!("{}.linker", key))?.map(|v| v.val),
+ overrides: HashMap::new(),
+ };
+ let table = match config.get_table(&key)? {
+ Some(table) => table.val,
+ None => return Ok(ret),
+ };
+ for (lib_name, value) in table {
+ match lib_name.as_str() {
+ "ar" | "linker" | "runner" | "rustflags" => {
+ continue
+ },
+ _ => {}
+ }
+
+ let mut output = BuildOutput {
+ library_paths: Vec::new(),
+ library_links: Vec::new(),
+ cfgs: Vec::new(),
+ env: Vec::new(),
+ metadata: Vec::new(),
+ rerun_if_changed: Vec::new(),
+ rerun_if_env_changed: Vec::new(),
+ warnings: Vec::new(),
+ };
+ // We require deterministic order of evaluation, so we must sort the pairs by key first.
+ let mut pairs = Vec::new();
+ for (k, value) in value.table(&lib_name)?.0 {
+ pairs.push((k, value));
+ }
+ pairs.sort_by_key(|p| p.0);
+ for (k, value) in pairs {
+ let key = format!("{}.{}", key, k);
+ match &k[..] {
+ "rustc-flags" => {
+ let (flags, definition) = value.string(k)?;
+ let whence = format!("in `{}` (in {})", key,
+ definition.display());
+ let (paths, links) =
+ BuildOutput::parse_rustc_flags(flags, &whence)?;
+ output.library_paths.extend(paths);
+ output.library_links.extend(links);
+ }
+ "rustc-link-lib" => {
+ let list = value.list(k)?;
+ output.library_links.extend(list.iter()
+ .map(|v| v.0.clone()));
+ }
+ "rustc-link-search" => {
+ let list = value.list(k)?;
+ output.library_paths.extend(list.iter().map(|v| {
+ PathBuf::from(&v.0)
+ }));
+ }
+ "rustc-cfg" => {
+ let list = value.list(k)?;
+ output.cfgs.extend(list.iter().map(|v| v.0.clone()));
+ }
+ "rustc-env" => {
+ for (name, val) in value.table(k)?.0 {
+ let val = val.string(name)?.0;
+ output.env.push((name.clone(), val.to_string()));
+ }
+ }
+ "warning" |
+ "rerun-if-changed" |
+ "rerun-if-env-changed" => {
+ bail!("`{}` is not supported in build script overrides", k);
+ }
+ _ => {
+ let val = value.string(k)?.0;
+ output.metadata.push((k.clone(), val.to_string()));
+ }
+ }
+ }
+ ret.overrides.insert(lib_name, output);
+ }
+
+ Ok(ret)
+}
--- /dev/null
+use std::collections::HashMap;
+use std::fs;
+use std::path::Path;
+use std::process::Command;
+
+use core::Workspace;
+use ops;
+use util::CargoResult;
+
+pub struct DocOptions<'a> {
+ pub open_result: bool,
+ pub compile_opts: ops::CompileOptions<'a>,
+}
+
+pub fn doc(ws: &Workspace, options: &DocOptions) -> CargoResult<()> {
+ let specs = options.compile_opts.spec.into_package_id_specs(ws)?;
+ let resolve = ops::resolve_ws_precisely(ws,
+ None,
+ options.compile_opts.features,
+ options.compile_opts.all_features,
+ options.compile_opts.no_default_features,
+ &specs)?;
+ let (packages, resolve_with_overrides) = resolve;
+
+ if specs.is_empty() {
+ return Err(format!("manifest path `{}` contains no package: the manifest is virtual, \
+ and the workspace has no members.", ws.current_manifest().display()).into());
+ }
+
+ let pkgs = specs.iter().map(|p| {
+ let pkgid = p.query(resolve_with_overrides.iter())?;
+ packages.get(pkgid)
+ }).collect::<CargoResult<Vec<_>>>()?;
+
+ let mut lib_names = HashMap::new();
+ let mut bin_names = HashMap::new();
+ for package in &pkgs {
+ for target in package.targets().iter().filter(|t| t.documented()) {
+ if target.is_lib() {
+ if let Some(prev) = lib_names.insert(target.crate_name(), package) {
+ bail!("The library `{}` is specified by packages `{}` and \
+ `{}` but can only be documented once. Consider renaming \
+ or marking one of the targets as `doc = false`.",
+ target.crate_name(), prev, package);
+ }
+ } else {
+ if let Some(prev) = bin_names.insert(target.crate_name(), package) {
+ bail!("The binary `{}` is specified by packages `{}` and \
+ `{}` but can only be documented once. Consider renaming \
+ or marking one of the targets as `doc = false`.",
+ target.crate_name(), prev, package);
+ }
+ }
+ }
+ }
+
+ ops::compile(ws, &options.compile_opts)?;
+
+ if options.open_result {
+ let name = if pkgs.len() > 1 {
+ bail!("Passing multiple packages and `open` is not supported")
+ } else if pkgs.len() == 1 {
+ pkgs[0].name().replace("-", "_")
+ } else {
+ match lib_names.keys().chain(bin_names.keys()).next() {
+ Some(s) => s.to_string(),
+ None => return Ok(()),
+ }
+ };
+
+ // Don't bother locking here: if the target directory is being deleted
+ // there's nothing we can do about it, and if it's being overwritten
+ // then that's also ok!
+ let mut target_dir = ws.target_dir();
+ if let Some(triple) = options.compile_opts.target {
+ target_dir.push(Path::new(triple).file_stem().unwrap());
+ }
+ let path = target_dir.join("doc").join(&name).join("index.html");
+ let path = path.into_path_unlocked();
+ if fs::metadata(&path).is_ok() {
+ let mut shell = options.compile_opts.config.shell();
+ shell.status("Opening", path.display())?;
+ match open_docs(&path) {
+ Ok(m) => shell.status("Launching", m)?,
+ Err(e) => {
+ shell.warn(
+ "could not determine a browser to open docs with, tried:")?;
+ for method in e {
+ shell.warn(format!("\t{}", method))?;
+ }
+ }
+ }
+ }
+ }
+
+ Ok(())
+}
+
+#[cfg(not(any(target_os = "windows", target_os = "macos")))]
+fn open_docs(path: &Path) -> Result<&'static str, Vec<&'static str>> {
+ use std::env;
+ let mut methods = Vec::new();
+ // Try the $BROWSER environment variable first.
+ if let Ok(name) = env::var("BROWSER") {
+ match Command::new(name).arg(path).status() {
+ Ok(_) => return Ok("$BROWSER"),
+ Err(_) => methods.push("$BROWSER"),
+ }
+ }
+
+ for m in ["xdg-open", "gnome-open", "kde-open"].iter() {
+ match Command::new(m).arg(path).status() {
+ Ok(_) => return Ok(m),
+ Err(_) => methods.push(m),
+ }
+ }
+
+ Err(methods)
+}
+
+#[cfg(target_os = "windows")]
+fn open_docs(path: &Path) -> Result<&'static str, Vec<&'static str>> {
+ match Command::new("cmd").arg("/C").arg(path).status() {
+ Ok(_) => Ok("cmd /C"),
+ Err(_) => Err(vec!["cmd /C"]),
+ }
+}
+
+#[cfg(target_os = "macos")]
+fn open_docs(path: &Path) -> Result<&'static str, Vec<&'static str>> {
+ match Command::new("open").arg(path).status() {
+ Ok(_) => Ok("open"),
+ Err(_) => Err(vec!["open"]),
+ }
+}
--- /dev/null
+use core::{Resolve, PackageSet, Workspace};
+use ops;
+use util::CargoResult;
+
+/// Executes `cargo fetch`.
+pub fn fetch<'a>(ws: &Workspace<'a>) -> CargoResult<(Resolve, PackageSet<'a>)> {
+ let (packages, resolve) = ops::resolve_ws(ws)?;
+ for id in resolve.iter() {
+ packages.get(id)?;
+ }
+ Ok((resolve, packages))
+}
--- /dev/null
+use std::collections::{BTreeMap, HashSet};
+
+use termcolor::Color::{self, Cyan, Green, Red};
+
+use core::PackageId;
+use core::registry::PackageRegistry;
+use core::{Resolve, SourceId, Workspace};
+use core::resolver::Method;
+use ops;
+use util::config::Config;
+use util::CargoResult;
+
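+/// Options for `cargo update`; presumably populated from the CLI:
+/// `to_update` holds the `-p`/`--package` specs, `precise` corresponds to
+/// `--precise`, and `aggressive` to `--aggressive`.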
+pub struct UpdateOptions<'a> {
+ pub config: &'a Config,
+ pub to_update: &'a [String],
+ pub precise: Option<&'a str>,
+ pub aggressive: bool,
+}
+
+pub fn generate_lockfile(ws: &Workspace) -> CargoResult<()> {
+ let mut registry = PackageRegistry::new(ws.config())?;
+ let resolve = ops::resolve_with_previous(&mut registry, ws,
+ Method::Everything,
+ None, None, &[], true)?;
+ ops::write_pkg_lockfile(ws, &resolve)?;
+ Ok(())
+}
+
+pub fn update_lockfile(ws: &Workspace, opts: &UpdateOptions)
+ -> CargoResult<()> {
+
+ if opts.aggressive && opts.precise.is_some() {
+ bail!("cannot specify both aggressive and precise simultaneously")
+ }
+
+ if ws.members().is_empty() {
+ bail!("you can't generate a lockfile for an empty workspace.")
+ }
+
+ let previous_resolve = match ops::load_pkg_lockfile(ws)? {
+ Some(resolve) => resolve,
+ None => return generate_lockfile(ws),
+ };
+ let mut registry = PackageRegistry::new(opts.config)?;
+ let mut to_avoid = HashSet::new();
+
+ if opts.to_update.is_empty() {
+ to_avoid.extend(previous_resolve.iter());
+ } else {
+ let mut sources = Vec::new();
+ for name in opts.to_update {
+ let dep = previous_resolve.query(name)?;
+ if opts.aggressive {
+ fill_with_deps(&previous_resolve, dep, &mut to_avoid,
+ &mut HashSet::new());
+ } else {
+ to_avoid.insert(dep);
+ sources.push(match opts.precise {
+ Some(precise) => {
+ // TODO: see comment in `resolve.rs` as well, but this
+ // seems like a pretty hokey reason to single out
+ // the registry as well.
+ let precise = if dep.source_id().is_registry() {
+ format!("{}={}", dep.name(), precise)
+ } else {
+ precise.to_string()
+ };
+ dep.source_id().clone().with_precise(Some(precise))
+ }
+ None => {
+ dep.source_id().clone().with_precise(None)
+ }
+ });
+ }
+ }
+ registry.add_sources(&sources)?;
+ }
+
+ let resolve = ops::resolve_with_previous(&mut registry,
+ ws,
+ Method::Everything,
+ Some(&previous_resolve),
+ Some(&to_avoid),
+ &[],
+ true)?;
+
+ // Summarize what is changing for the user.
+ let print_change = |status: &str, msg: String, color: Color| {
+ opts.config.shell().status_with_color(status, msg, color)
+ };
+ for (removed, added) in compare_dependency_graphs(&previous_resolve, &resolve) {
+ if removed.len() == 1 && added.len() == 1 {
+ let msg = if removed[0].source_id().is_git() {
+ format!("{} -> #{}", removed[0],
+ &added[0].source_id().precise().unwrap()[..8])
+ } else {
+ format!("{} -> v{}", removed[0], added[0].version())
+ };
+ print_change("Updating", msg, Green)?;
+ } else {
+ for package in removed.iter() {
+ print_change("Removing", format!("{}", package), Red)?;
+ }
+ for package in added.iter() {
+ print_change("Adding", format!("{}", package), Cyan)?;
+ }
+ }
+ }
+
+ ops::write_pkg_lockfile(ws, &resolve)?;
+ return Ok(());
+
+ fn fill_with_deps<'a>(resolve: &'a Resolve, dep: &'a PackageId,
+ set: &mut HashSet<&'a PackageId>,
+ visited: &mut HashSet<&'a PackageId>) {
+ if !visited.insert(dep) {
+ return
+ }
+ set.insert(dep);
+ for dep in resolve.deps(dep) {
+ fill_with_deps(resolve, dep, set, visited);
+ }
+ }
+
+ fn compare_dependency_graphs<'a>(previous_resolve: &'a Resolve,
+ resolve: &'a Resolve) ->
+ Vec<(Vec<&'a PackageId>, Vec<&'a PackageId>)> {
+ fn key(dep: &PackageId) -> (&str, &SourceId) {
+ (dep.name(), dep.source_id())
+ }
+
+ // Removes all package ids in `b` from `a`. Note that this is somewhat
+ // more complicated because the equality for source ids does not take
+ // precise versions into account (e.g. git shas), but we want to take
+ // that into account here.
+ fn vec_subtract<'a>(a: &[&'a PackageId],
+ b: &[&'a PackageId]) -> Vec<&'a PackageId> {
+ a.iter().filter(|a| {
+ // If this package id is not found in `b`, then it's definitely
+ // in the subtracted set
+ let i = match b.binary_search(a) {
+ Ok(i) => i,
+ Err(..) => return true,
+ };
+
+ // If we've found `a` in `b`, then we iterate over all instances
+ // (we know `b` is sorted) and see if they all have different
+ // precise versions. If so, then `a` isn't actually in `b` so
+ // we'll let it through.
+ //
+ // Note that we only check this for non-registry sources,
+ // however, as registries contain enough version information in
+ // the package id to disambiguate
+ if a.source_id().is_registry() {
+ return false
+ }
+ b[i..].iter().take_while(|b| a == b).all(|b| {
+ a.source_id().precise() != b.source_id().precise()
+ })
+ }).cloned().collect()
+ }
+
+ // Map (package name, package source) to (removed versions, added versions).
+ let mut changes = BTreeMap::new();
+ let empty = (Vec::new(), Vec::new());
+ for dep in previous_resolve.iter() {
+ changes.entry(key(dep)).or_insert(empty.clone()).0.push(dep);
+ }
+ for dep in resolve.iter() {
+ changes.entry(key(dep)).or_insert(empty.clone()).1.push(dep);
+ }
+
+ for v in changes.values_mut() {
+ let (ref mut old, ref mut new) = *v;
+ old.sort();
+ new.sort();
+ let removed = vec_subtract(old, new);
+ let added = vec_subtract(new, old);
+ *old = removed;
+ *new = added;
+ }
+ debug!("{:#?}", changes);
+
+ changes.into_iter().map(|(_, v)| v).collect()
+ }
+}
--- /dev/null
+use std::collections::btree_map::Entry;
+use std::collections::{BTreeMap, BTreeSet};
+use std::{env, fs};
+use std::io::prelude::*;
+use std::io::SeekFrom;
+use std::path::{Path, PathBuf};
+use std::sync::Arc;
+
+use semver::{Version, VersionReq};
+use tempdir::TempDir;
+use toml;
+
+use core::{SourceId, Source, Package, Dependency, PackageIdSpec};
+use core::{PackageId, Workspace};
+use ops::{self, CompileFilter, DefaultExecutor};
+use sources::{GitSource, PathSource, SourceConfigMap};
+use util::{Config, internal};
+use util::{Filesystem, FileLock};
+use util::errors::{CargoError, CargoResult, CargoResultExt};
+
+#[derive(Deserialize, Serialize)]
+#[serde(untagged)]
+enum CrateListing {
+ V1(CrateListingV1),
+ Empty(Empty),
+}
+
+#[derive(Deserialize, Serialize)]
+#[serde(deny_unknown_fields)]
+struct Empty {}
+
+#[derive(Deserialize, Serialize)]
+struct CrateListingV1 {
+ v1: BTreeMap<PackageId, BTreeSet<String>>,
+}
+
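+/// RAII guard for freshly installed binaries: any paths still listed in
+/// `bins` when the guard is dropped are removed, so a failed installation
+/// cleans up after itself. `success()` clears the list, keeping the files.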
+struct Transaction {
+ bins: Vec<PathBuf>,
+}
+
+impl Transaction {
+ fn success(mut self) {
+ self.bins.clear();
+ }
+}
+
+impl Drop for Transaction {
+ fn drop(&mut self) {
+ for bin in self.bins.iter() {
+ let _ = fs::remove_file(bin);
+ }
+ }
+}
+
+pub fn install(root: Option<&str>,
+ krates: Vec<&str>,
+ source_id: &SourceId,
+ vers: Option<&str>,
+ opts: &ops::CompileOptions,
+ force: bool) -> CargoResult<()> {
+ let root = resolve_root(root, opts.config)?;
+ let map = SourceConfigMap::new(opts.config)?;
+
+ let (installed_anything, scheduled_error) = if krates.len() <= 1 {
+ install_one(&root, &map, krates.into_iter().next(), source_id, vers, opts,
+ force, true)?;
+ (true, false)
+ } else {
+ let mut succeeded = vec![];
+ let mut failed = vec![];
+ let mut first = true;
+ for krate in krates {
+ let root = root.clone();
+ let map = map.clone();
+ match install_one(&root, &map, Some(krate), source_id, vers,
+ opts, force, first) {
+ Ok(()) => succeeded.push(krate),
+ Err(e) => {
+ ::handle_error(e, &mut opts.config.shell());
+ failed.push(krate)
+ }
+ }
+ first = false;
+ }
+
+ let mut summary = vec![];
+ if !succeeded.is_empty() {
+ summary.push(format!("Successfully installed {}!", succeeded.join(", ")));
+ }
+ if !failed.is_empty() {
+ summary.push(format!("Failed to install {} (see error(s) above).", failed.join(", ")));
+ }
+ if !succeeded.is_empty() || !failed.is_empty() {
+ opts.config.shell().status("\nSummary:", summary.join(" "))?;
+ }
+
+ (!succeeded.is_empty(), !failed.is_empty())
+ };
+
+ if installed_anything {
+ // Print a warning that if this directory isn't in PATH that they won't be
+ // able to run these commands.
+ let dst = metadata(opts.config, &root)?.parent().join("bin");
+ let path = env::var_os("PATH").unwrap_or_default();
+ for path in env::split_paths(&path) {
+ if path == dst {
+ return Ok(())
+ }
+ }
+
+ opts.config.shell().warn(&format!("be sure to add `{}` to your PATH to be \
+ able to run the installed binaries",
+ dst.display()))?;
+ }
+
+ if scheduled_error {
+ bail!("some crates failed to install");
+ }
+
+ Ok(())
+}
+
+fn install_one(root: &Filesystem,
+ map: &SourceConfigMap,
+ krate: Option<&str>,
+ source_id: &SourceId,
+ vers: Option<&str>,
+ opts: &ops::CompileOptions,
+ force: bool,
+ is_first_install: bool) -> CargoResult<()> {
+
+ let config = opts.config;
+
+ let (pkg, source) = if source_id.is_git() {
+ select_pkg(GitSource::new(source_id, config)?,
+ krate, vers, config, is_first_install,
+ &mut |git| git.read_packages())?
+ } else if source_id.is_path() {
+ let path = source_id.url().to_file_path()
+ .map_err(|()| CargoError::from("path sources must have a valid path"))?;
+ let mut src = PathSource::new(&path, source_id, config);
+ src.update().chain_err(|| {
+ format!("`{}` is not a crate root; specify a crate to \
+ install from crates.io, or use --path or --git to \
+ specify an alternate source", path.display())
+ })?;
+ select_pkg(PathSource::new(&path, source_id, config),
+ krate, vers, config, is_first_install,
+ &mut |path| path.read_packages())?
+ } else {
+ select_pkg(map.load(source_id)?,
+ krate, vers, config, is_first_install,
+ &mut |_| Err("must specify a crate to install from \
+ crates.io, or use --path or --git to \
+ specify alternate source".into()))?
+ };
+
+ let mut td_opt = None;
+ let overridden_target_dir = if source_id.is_path() {
+ None
+ } else if let Ok(td) = TempDir::new("cargo-install") {
+ let p = td.path().to_owned();
+ td_opt = Some(td);
+ Some(Filesystem::new(p))
+ } else {
+ Some(Filesystem::new(config.cwd().join("target-install")))
+ };
+
+ let ws = match overridden_target_dir {
+ Some(dir) => Workspace::ephemeral(pkg, config, Some(dir), false)?,
+ None => Workspace::new(pkg.manifest_path(), config)?,
+ };
+ let pkg = ws.current()?;
+
+ config.shell().status("Installing", pkg)?;
+
+ // Preflight checks to check up front whether we'll overwrite something.
+ // We have to check this again afterwards, but may as well avoid building
+ // anything if we're gonna throw it away anyway.
+ {
+ let metadata = metadata(config, root)?;
+ let list = read_crate_list(&metadata)?;
+ let dst = metadata.parent().join("bin");
+ check_overwrites(&dst, pkg, &opts.filter, &list, force)?;
+ }
+
+ let compile = ops::compile_ws(&ws,
+ Some(source),
+ opts,
+ Arc::new(DefaultExecutor)).chain_err(|| {
+ if let Some(td) = td_opt.take() {
+ // preserve the temporary directory, so the user can inspect it
+ td.into_path();
+ }
+
+ CargoError::from(format!("failed to compile `{}`, intermediate artifacts can be \
+ found at `{}`", pkg, ws.target_dir().display()))
+ })?;
+ let binaries: Vec<(&str, &Path)> = compile.binaries.iter().map(|bin| {
+ let name = bin.file_name().unwrap();
+ if let Some(s) = name.to_str() {
+ Ok((s, bin.as_ref()))
+ } else {
+ bail!("Binary `{:?}` name can't be serialized into string", name)
+ }
+ }).collect::<CargoResult<_>>()?;
+ if binaries.is_empty() {
+ bail!("no binaries are available for install using the selected \
+ features");
+ }
+
+ let metadata = metadata(config, root)?;
+ let mut list = read_crate_list(&metadata)?;
+ let dst = metadata.parent().join("bin");
+ let duplicates = check_overwrites(&dst, pkg, &opts.filter,
+ &list, force)?;
+
+ fs::create_dir_all(&dst)?;
+
+ // Copy all binaries to a temporary directory under `dst` first, catching
+ // some failure modes (e.g. out of space) before touching the existing
+ // binaries. This directory will get cleaned up via RAII.
+ let staging_dir = TempDir::new_in(&dst, "cargo-install")?;
+ for &(bin, src) in binaries.iter() {
+ let dst = staging_dir.path().join(bin);
+ // Try to move if `target_dir` is transient.
+ if !source_id.is_path() && fs::rename(src, &dst).is_ok() {
+ continue
+ }
+ fs::copy(src, &dst).chain_err(|| {
+ format!("failed to copy `{}` to `{}`", src.display(),
+ dst.display())
+ })?;
+ }
+
+ let (to_replace, to_install): (Vec<&str>, Vec<&str>) =
+ binaries.iter().map(|&(bin, _)| bin)
+ .partition(|&bin| duplicates.contains_key(bin));
+
+ let mut installed = Transaction { bins: Vec::new() };
+
+ // Move the temporary copies into `dst` starting with new binaries.
+ for bin in to_install.iter() {
+ let src = staging_dir.path().join(bin);
+ let dst = dst.join(bin);
+ config.shell().status("Installing", dst.display())?;
+ fs::rename(&src, &dst).chain_err(|| {
+ format!("failed to move `{}` to `{}`", src.display(),
+ dst.display())
+ })?;
+ installed.bins.push(dst);
+ }
+
+ // Repeat for binaries which replace existing ones but don't pop the error
+ // up until after updating metadata.
+ let mut replaced_names = Vec::new();
+ let result = {
+ let mut try_install = || -> CargoResult<()> {
+ for &bin in to_replace.iter() {
+ let src = staging_dir.path().join(bin);
+ let dst = dst.join(bin);
+ config.shell().status("Replacing", dst.display())?;
+ fs::rename(&src, &dst).chain_err(|| {
+ format!("failed to move `{}` to `{}`", src.display(),
+ dst.display())
+ })?;
+ replaced_names.push(bin);
+ }
+ Ok(())
+ };
+ try_install()
+ };
+
+ // Update records of replaced binaries.
+ for &bin in replaced_names.iter() {
+ if let Some(&Some(ref p)) = duplicates.get(bin) {
+ if let Some(set) = list.v1.get_mut(p) {
+ set.remove(bin);
+ }
+ }
+ list.v1.entry(pkg.package_id().clone())
+ .or_insert_with(BTreeSet::new)
+ .insert(bin.to_string());
+ }
+
+ // Remove empty metadata lines.
+ let pkgs = list.v1.iter()
+ .filter_map(|(p, set)| if set.is_empty() { Some(p.clone()) } else { None })
+ .collect::<Vec<_>>();
+ for p in pkgs.iter() {
+ list.v1.remove(p);
+ }
+
+ // If installation was successful record newly installed binaries.
+ if result.is_ok() {
+ list.v1.entry(pkg.package_id().clone())
+ .or_insert_with(BTreeSet::new)
+ .extend(to_install.iter().map(|s| s.to_string()));
+ }
+
+ let write_result = write_crate_list(&metadata, list);
+ match write_result {
+ // Replacement error (if any) isn't actually caused by write error
+ // but this seems to be the only way to show both.
+ Err(err) => result.chain_err(|| err)?,
+ Ok(_) => result?,
+ }
+
+ // Reaching here means all actions have succeeded. Clean up.
+ installed.success();
+ if !source_id.is_path() {
+ // Don't bother grabbing a lock as we're going to blow it all away
+ // anyway.
+ let target_dir = ws.target_dir().into_path_unlocked();
+ fs::remove_dir_all(&target_dir)?;
+ }
+
+ Ok(())
+}
+
+fn select_pkg<'a, T>(mut source: T,
+ name: Option<&str>,
+ vers: Option<&str>,
+ config: &Config,
+ needs_update: bool,
+ list_all: &mut FnMut(&mut T) -> CargoResult<Vec<Package>>)
+ -> CargoResult<(Package, Box<Source + 'a>)>
+ where T: Source + 'a
+{
+ if needs_update {
+ source.update()?;
+ }
+
+ match name {
+ Some(name) => {
+ let vers = match vers {
+ Some(v) => {
+
+ // If the version begins with one of the characters <, >, =, ^, or ~,
+ // parse it as a version requirement; otherwise parse it as an exact version.
+ let first = v.chars()
+ .next()
+ .ok_or("no version provided for the `--vers` flag")?;
+
+ match first {
+ '<' | '>' | '=' | '^' | '~' => match v.parse::<VersionReq>() {
+ Ok(v) => Some(v.to_string()),
+ Err(_) => {
+ let msg = format!("the `--vers` provided, `{}`, is \
+ not a valid semver version requirement\n\n\
+ Please have a look at \
+ http://doc.crates.io/specifying-dependencies.html \
+ for the correct format", v);
+ return Err(msg.into());
+ }
+ },
+ _ => match v.parse::<Version>() {
+ Ok(v) => Some(format!("={}", v)),
+ Err(_) => {
+ let mut msg = format!("the `--vers` provided, `{}`, is \
+ not a valid semver version\n\n\
+ historically Cargo treated this \
+ as a semver version requirement \
+ accidentally\nand will continue \
+ to do so, but this behavior \
+ will be removed eventually", v);
+
+ // If it is not a valid version but it is a valid version
+ // requirement, add a note to the warning
+ if v.parse::<VersionReq>().is_ok() {
+ msg.push_str(&format!("\nif you want to specify semver range, \
+ add an explicit qualifier, like ^{}", v));
+ }
+ config.shell().warn(&msg)?;
+ Some(v.to_string())
+ }
+ }
+ }
+ }
+ None => None,
+ };
+ let vers = vers.as_ref().map(|s| &**s);
+ let dep = Dependency::parse_no_deprecated(name, vers, source.source_id())?;
+ let deps = source.query_vec(&dep)?;
+ match deps.iter().map(|p| p.package_id()).max() {
+ Some(pkgid) => {
+ let pkg = source.download(pkgid)?;
+ Ok((pkg, Box::new(source)))
+ }
+ None => {
+ let vers_info = vers.map(|v| format!(" with version `{}`", v))
+ .unwrap_or_default();
+ Err(format!("could not find `{}` in {}{}", name,
+ source.source_id(), vers_info).into())
+ }
+ }
+ }
+ None => {
+ let candidates = list_all(&mut source)?;
+ let binaries = candidates.iter().filter(|cand| {
+ cand.targets().iter().filter(|t| t.is_bin()).count() > 0
+ });
+ let examples = candidates.iter().filter(|cand| {
+ cand.targets().iter().filter(|t| t.is_example()).count() > 0
+ });
+ let pkg = match one(binaries, |v| multi_err("binaries", v))? {
+ Some(p) => p,
+ None => {
+ match one(examples, |v| multi_err("examples", v))? {
+ Some(p) => p,
+ None => bail!("no packages found with binaries or \
+ examples"),
+ }
+ }
+ };
+ return Ok((pkg.clone(), Box::new(source)));
+
+ fn multi_err(kind: &str, mut pkgs: Vec<&Package>) -> String {
+ pkgs.sort_by(|a, b| a.name().cmp(b.name()));
+ format!("multiple packages with {} found: {}", kind,
+ pkgs.iter().map(|p| p.name()).collect::<Vec<_>>()
+ .join(", "))
+ }
+ }
+ }
+}
+
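+/// Reduces an iterator to its sole element: `Ok(None)` when empty,
+/// `Ok(Some(item))` when exactly one element, and an error built by `f`
+/// from all elements when there are two or more.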
+fn one<I, F>(mut i: I, f: F) -> CargoResult<Option<I::Item>>
+ where I: Iterator,
+ F: FnOnce(Vec<I::Item>) -> String
+{
+ match (i.next(), i.next()) {
+ (Some(i1), Some(i2)) => {
+ let mut v = vec![i1, i2];
+ v.extend(i);
+ Err(f(v).into())
+ }
+ (Some(i), None) => Ok(Some(i)),
+ (None, _) => Ok(None)
+ }
+}
+
+fn check_overwrites(dst: &Path,
+ pkg: &Package,
+ filter: &ops::CompileFilter,
+ prev: &CrateListingV1,
+ force: bool) -> CargoResult<BTreeMap<String, Option<PackageId>>> {
+ // If explicit --bin or --example flags were passed then those'll
+ // get checked during cargo_compile; here we only care about the
+ // "build everything" case.
+ if !filter.is_specific() && !pkg.targets().iter().any(|t| t.is_bin()) {
+ bail!("specified package has no binaries")
+ }
+ let duplicates = find_duplicates(dst, pkg, filter, prev);
+ if force || duplicates.is_empty() {
+ return Ok(duplicates)
+ }
+ // Format the error message.
+ let mut msg = String::new();
+ for (bin, p) in duplicates.iter() {
+ msg.push_str(&format!("binary `{}` already exists in destination", bin));
+ if let Some(p) = p.as_ref() {
+ msg.push_str(&format!(" as part of `{}`\n", p));
+ } else {
+ msg.push_str("\n");
+ }
+ }
+ msg.push_str("Add --force to overwrite");
+ Err(msg.into())
+}
+
+fn find_duplicates(dst: &Path,
+ pkg: &Package,
+ filter: &ops::CompileFilter,
+ prev: &CrateListingV1) -> BTreeMap<String, Option<PackageId>> {
+ let check = |name: String| {
+ // Need to provide type, works around Rust Issue #93349
+ let name = format!("{}{}", name, env::consts::EXE_SUFFIX);
+ if fs::metadata(dst.join(&name)).is_err() {
+ None
+ } else if let Some((p, _)) = prev.v1.iter().find(|&(_, v)| v.contains(&name)) {
+ Some((name, Some(p.clone())))
+ } else {
+ Some((name, None))
+ }
+ };
+ match *filter {
+ CompileFilter::Default { .. } => {
+ pkg.targets().iter()
+ .filter(|t| t.is_bin())
+ .filter_map(|t| check(t.name().to_string()))
+ .collect()
+ }
+ CompileFilter::Only { bins, examples, .. } => {
+ let all_bins: Vec<String> = bins.try_collect().unwrap_or_else(|| {
+ pkg.targets().iter().filter(|t| t.is_bin())
+ .map(|t| t.name().to_string())
+ .collect()
+ });
+ let all_examples: Vec<String> = examples.try_collect().unwrap_or_else(|| {
+ pkg.targets().iter().filter(|t| t.is_bin_example())
+ .map(|t| t.name().to_string())
+ .collect()
+ });
+
+ all_bins.iter().chain(all_examples.iter())
+ .filter_map(|t| check(t.clone()))
+ .collect::<BTreeMap<String, Option<PackageId>>>()
+ }
+ }
+}
+
+fn read_crate_list(file: &FileLock) -> CargoResult<CrateListingV1> {
+ (|| -> CargoResult<_> {
+ let mut contents = String::new();
+ file.file().read_to_string(&mut contents)?;
+ let listing = toml::from_str(&contents).chain_err(|| {
+ internal("invalid TOML found for metadata")
+ })?;
+ match listing {
+ CrateListing::V1(v1) => Ok(v1),
+ CrateListing::Empty(_) => {
+ Ok(CrateListingV1 { v1: BTreeMap::new() })
+ }
+ }
+ })().chain_err(|| {
+ format!("failed to parse crate metadata at `{}`",
+ file.path().to_string_lossy())
+ })
+}
+
+fn write_crate_list(file: &FileLock, listing: CrateListingV1) -> CargoResult<()> {
+ (|| -> CargoResult<_> {
+ let mut file = file.file();
+ file.seek(SeekFrom::Start(0))?;
+ file.set_len(0)?;
+ let data = toml::to_string(&CrateListing::V1(listing))?;
+ file.write_all(data.as_bytes())?;
+ Ok(())
+ })().chain_err(|| {
+ format!("failed to write crate metadata at `{}`",
+ file.path().to_string_lossy())
+ })
+}
+
+pub fn install_list(dst: Option<&str>, config: &Config) -> CargoResult<()> {
+ let dst = resolve_root(dst, config)?;
+ let dst = metadata(config, &dst)?;
+ let list = read_crate_list(&dst)?;
+ for (k, v) in list.v1.iter() {
+ println!("{}:", k);
+ for bin in v {
+ println!(" {}", bin);
+ }
+ }
+ Ok(())
+}
+
+pub fn uninstall(root: Option<&str>,
+ specs: Vec<&str>,
+ bins: &[String],
+ config: &Config) -> CargoResult<()> {
+ if specs.len() > 1 && !bins.is_empty() {
+ bail!("A binary can only be associated with a single installed package; specifying multiple specs with --bin is redundant.");
+ }
+
+ let root = resolve_root(root, config)?;
+ let scheduled_error = if specs.len() == 1 {
+ uninstall_one(root, specs[0], bins, config)?;
+ false
+ } else {
+ let mut succeeded = vec![];
+ let mut failed = vec![];
+ for spec in specs {
+ let root = root.clone();
+ match uninstall_one(root, spec, bins, config) {
+ Ok(()) => succeeded.push(spec),
+ Err(e) => {
+ ::handle_error(e, &mut config.shell());
+ failed.push(spec)
+ }
+ }
+ }
+
+ let mut summary = vec![];
+ if !succeeded.is_empty() {
+ summary.push(format!("Successfully uninstalled {}!", succeeded.join(", ")));
+ }
+ if !failed.is_empty() {
+ summary.push(format!("Failed to uninstall {} (see error(s) above).", failed.join(", ")));
+ }
+
+ if !succeeded.is_empty() || !failed.is_empty() {
+ config.shell().status("\nSummary:", summary.join(" "))?;
+ }
+
+ !failed.is_empty()
+ };
+
+ if scheduled_error {
+ bail!("some packages failed to uninstall");
+ }
+
+ Ok(())
+}
+
+pub fn uninstall_one(root: Filesystem,
+ spec: &str,
+ bins: &[String],
+ config: &Config) -> CargoResult<()> {
+ let crate_metadata = metadata(config, &root)?;
+ let mut metadata = read_crate_list(&crate_metadata)?;
+ let mut to_remove = Vec::new();
+ {
+ let result = PackageIdSpec::query_str(spec, metadata.v1.keys())?
+ .clone();
+ let mut installed = match metadata.v1.entry(result.clone()) {
+ Entry::Occupied(e) => e,
+ Entry::Vacant(..) => panic!("entry not found: {}", result),
+ };
+ let dst = crate_metadata.parent().join("bin");
+ for bin in installed.get() {
+ let bin = dst.join(bin);
+ if fs::metadata(&bin).is_err() {
+ bail!("corrupt metadata, `{}` does not exist when it should",
+ bin.display())
+ }
+ }
+
+ let bins = bins.iter().map(|s| {
+ if s.ends_with(env::consts::EXE_SUFFIX) {
+ s.to_string()
+ } else {
+ format!("{}{}", s, env::consts::EXE_SUFFIX)
+ }
+ }).collect::<Vec<_>>();
+
+ for bin in bins.iter() {
+ if !installed.get().contains(bin) {
+ bail!("binary `{}` not installed as part of `{}`", bin, result)
+ }
+ }
+
+ if bins.is_empty() {
+ to_remove.extend(installed.get().iter().map(|b| dst.join(b)));
+ installed.get_mut().clear();
+ } else {
+ for bin in bins.iter() {
+ to_remove.push(dst.join(bin));
+ installed.get_mut().remove(bin);
+ }
+ }
+ if installed.get().is_empty() {
+ installed.remove();
+ }
+ }
+ write_crate_list(&crate_metadata, metadata)?;
+ for bin in to_remove {
+ config.shell().status("Removing", bin.display())?;
+ fs::remove_file(bin)?;
+ }
+
+ Ok(())
+}
+
+fn metadata(config: &Config, root: &Filesystem) -> CargoResult<FileLock> {
+ root.open_rw(Path::new(".crates.toml"), config, "crate metadata")
+}
+
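+/// Resolves the installation root, in order of precedence: the `--root`
+/// flag, the `CARGO_INSTALL_ROOT` environment variable, the `install.root`
+/// config value, and finally the Cargo home directory.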
+fn resolve_root(flag: Option<&str>,
+ config: &Config) -> CargoResult<Filesystem> {
+ let config_root = config.get_path("install.root")?;
+ Ok(flag.map(PathBuf::from).or_else(|| {
+ env::var_os("CARGO_INSTALL_ROOT").map(PathBuf::from)
+ }).or_else(move || {
+ config_root.map(|v| v.val)
+ }).map(Filesystem::new).unwrap_or_else(|| {
+ config.home().clone()
+ }))
+}
--- /dev/null
+use std::collections::BTreeMap;
+use std::env;
+use std::fs;
+use std::path::Path;
+
+use serde::{Deserialize, Deserializer};
+use serde::de;
+
+use git2::Config as GitConfig;
+use git2::Repository as GitRepository;
+
+use core::Workspace;
+use ops::is_bad_artifact_name;
+use util::{GitRepo, HgRepo, PijulRepo, FossilRepo, internal};
+use util::{Config, paths};
+use util::errors::{CargoError, CargoResult, CargoResultExt};
+
+use toml;
+
+#[derive(Clone, Copy, Debug, PartialEq)]
+pub enum VersionControl { Git, Hg, Pijul, Fossil, NoVcs }
+
+pub struct NewOptions<'a> {
+ pub version_control: Option<VersionControl>,
+ pub bin: bool,
+ pub lib: bool,
+ pub path: &'a str,
+ pub name: Option<&'a str>,
+}
+
+struct SourceFileInformation {
+ relative_path: String,
+ target_name: String,
+ bin: bool,
+}
+
+struct MkOptions<'a> {
+ version_control: Option<VersionControl>,
+ path: &'a Path,
+ name: &'a str,
+ source_files: Vec<SourceFileInformation>,
+ bin: bool,
+}
+
+impl<'de> Deserialize<'de> for VersionControl {
+ fn deserialize<D: Deserializer<'de>>(d: D) -> Result<VersionControl, D::Error> {
+ Ok(match &String::deserialize(d)?[..] {
+ "git" => VersionControl::Git,
+ "hg" => VersionControl::Hg,
+ "pijul" => VersionControl::Pijul,
+ "fossil" => VersionControl::Fossil,
+ "none" => VersionControl::NoVcs,
+ n => {
+ let value = de::Unexpected::Str(n);
+ let msg = "unsupported version control system";
+ return Err(de::Error::invalid_value(value, &msg));
+ }
+ })
+ }
+}
+
+impl<'a> NewOptions<'a> {
+ pub fn new(version_control: Option<VersionControl>,
+ bin: bool,
+ lib: bool,
+ path: &'a str,
+ name: Option<&'a str>) -> NewOptions<'a> {
+
+ // default to lib
+ let is_lib = if !bin {
+ true
+ } else {
+ lib
+ };
+
+ NewOptions {
+ version_control: version_control,
+ bin: bin,
+ lib: is_lib,
+ path: path,
+ name: name,
+ }
+ }
+}
+
+struct CargoNewConfig {
+ name: Option<String>,
+ email: Option<String>,
+ version_control: Option<VersionControl>,
+}
+
+fn get_name<'a>(path: &'a Path, opts: &'a NewOptions, config: &Config) -> CargoResult<&'a str> {
+ if let Some(name) = opts.name {
+ return Ok(name);
+ }
+
+ if path.file_name().is_none() {
+ bail!("cannot auto-detect project name from path {:?} ; use --name to override",
+ path.as_os_str());
+ }
+
+ let dir_name = path.file_name().and_then(|s| s.to_str()).ok_or_else(|| {
+ CargoError::from(format!("cannot create a project with a non-unicode name: {:?}",
+ path.file_name().unwrap()))
+ })?;
+
+ if opts.bin {
+ Ok(dir_name)
+ } else {
+ let new_name = strip_rust_affixes(dir_name);
+ if new_name != dir_name {
+ writeln!(config.shell().err(),
+ "note: package will be named `{}`; use --name to override",
+ new_name)?;
+ }
+ Ok(new_name)
+ }
+}
+
+fn check_name(name: &str, is_bin: bool) -> CargoResult<()> {
+
+ // Ban keywords + test list found at
+ // https://doc.rust-lang.org/grammar.html#keywords
+ let blacklist = ["abstract", "alignof", "as", "become", "box",
+ "break", "const", "continue", "crate", "do",
+ "else", "enum", "extern", "false", "final",
+ "fn", "for", "if", "impl", "in",
+ "let", "loop", "macro", "match", "mod",
+ "move", "mut", "offsetof", "override", "priv",
+ "proc", "pub", "pure", "ref", "return",
+ "self", "sizeof", "static", "struct",
+ "super", "test", "trait", "true", "type", "typeof",
+ "unsafe", "unsized", "use", "virtual", "where",
+ "while", "yield"];
+ if blacklist.contains(&name) || (is_bin && is_bad_artifact_name(name)) {
+ bail!("The name `{}` cannot be used as a crate name\n\
+ use --name to override crate name",
+ name)
+ }
+
+ if let Some(c) = name.chars().next() {
+ if c.is_digit(10) {
+ bail!("Package names starting with a digit cannot be used as a crate name\n\
+ use --name to override crate name")
+ }
+ }
+
+ for c in name.chars() {
+ if c.is_alphanumeric() { continue }
+ if c == '_' || c == '-' { continue }
+ bail!("Invalid character `{}` in crate name: `{}`\n\
+ use --name to override crate name",
+ c, name)
+ }
+ Ok(())
+}
+
+fn detect_source_paths_and_types(project_path: &Path,
+ project_name: &str,
+ detected_files: &mut Vec<SourceFileInformation>,
+ ) -> CargoResult<()> {
+ let path = project_path;
+ let name = project_name;
+
+ enum H {
+ Bin,
+ Lib,
+ Detect,
+ }
+
+ struct Test {
+ proposed_path: String,
+ handling: H,
+ }
+
+ let tests = vec![
+ Test { proposed_path: format!("src/main.rs"), handling: H::Bin },
+ Test { proposed_path: format!("main.rs"), handling: H::Bin },
+ Test { proposed_path: format!("src/{}.rs", name), handling: H::Detect },
+ Test { proposed_path: format!("{}.rs", name), handling: H::Detect },
+ Test { proposed_path: format!("src/lib.rs"), handling: H::Lib },
+ Test { proposed_path: format!("lib.rs"), handling: H::Lib },
+ ];
+
+ for i in tests {
+ let pp = i.proposed_path;
+
+ // path/pp does not exist or is not a file
+ if !fs::metadata(&path.join(&pp)).map(|x| x.is_file()).unwrap_or(false) {
+ continue;
+ }
+
+ let sfi = match i.handling {
+ H::Bin => {
+ SourceFileInformation {
+ relative_path: pp,
+ target_name: project_name.to_string(),
+ bin: true
+ }
+ }
+ H::Lib => {
+ SourceFileInformation {
+ relative_path: pp,
+ target_name: project_name.to_string(),
+ bin: false
+ }
+ }
+ H::Detect => {
+ let content = paths::read(&path.join(pp.clone()))?;
+ let isbin = content.contains("fn main");
+ SourceFileInformation {
+ relative_path: pp,
+ target_name: project_name.to_string(),
+ bin: isbin
+ }
+ }
+ };
+ detected_files.push(sfi);
+ }
+
+ // Check for duplicate lib attempt
+
+ let mut previous_lib_relpath: Option<&str> = None;
+ let mut duplicates_checker: BTreeMap<&str, &SourceFileInformation> = BTreeMap::new();
+
+ for i in detected_files {
+ if i.bin {
+ if let Some(x) = duplicates_checker.get(i.target_name.as_str()) {
+ bail!("\
+multiple possible binary sources found:
+ {}
+ {}
+cannot automatically generate Cargo.toml as the main target would be ambiguous",
+ &x.relative_path, &i.relative_path);
+ }
+ duplicates_checker.insert(i.target_name.as_ref(), i);
+ } else {
+ if let Some(plp) = previous_lib_relpath {
+ return Err(format!("cannot have a project with \
+ multiple libraries, \
+ found both `{}` and `{}`",
+ plp, i.relative_path).into());
+ }
+ previous_lib_relpath = Some(&i.relative_path);
+ }
+ }
+
+ Ok(())
+}
+
+fn plan_new_source_file(bin: bool, project_name: String) -> SourceFileInformation {
+ if bin {
+ SourceFileInformation {
+ relative_path: "src/main.rs".to_string(),
+ target_name: project_name,
+ bin: true,
+ }
+ } else {
+ SourceFileInformation {
+ relative_path: "src/lib.rs".to_string(),
+ target_name: project_name,
+ bin: false,
+ }
+ }
+}
+
+pub fn new(opts: &NewOptions, config: &Config) -> CargoResult<()> {
+ let path = config.cwd().join(opts.path);
+ if fs::metadata(&path).is_ok() {
+ bail!("destination `{}` already exists\n\n\
+ Use `cargo init` to initialize the directory\
+ ", path.display()
+ )
+ }
+
+ if opts.lib && opts.bin {
+ bail!("can't specify both lib and binary outputs")
+ }
+
+ let name = get_name(&path, opts, config)?;
+ check_name(name, opts.bin)?;
+
+ let mkopts = MkOptions {
+ version_control: opts.version_control,
+ path: &path,
+ name: name,
+ source_files: vec![plan_new_source_file(opts.bin, name.to_string())],
+ bin: opts.bin,
+ };
+
+ mk(config, &mkopts).chain_err(|| {
+ format!("Failed to create project `{}` at `{}`",
+ name, path.display())
+ })
+}
+
+pub fn init(opts: &NewOptions, config: &Config) -> CargoResult<()> {
+ let path = config.cwd().join(opts.path);
+
+ let cargotoml_path = path.join("Cargo.toml");
+ if fs::metadata(&cargotoml_path).is_ok() {
+ bail!("`cargo init` cannot be run on existing Cargo projects")
+ }
+
+ if opts.lib && opts.bin {
+ bail!("can't specify both lib and binary outputs");
+ }
+
+ let name = get_name(&path, opts, config)?;
+ check_name(name, opts.bin)?;
+
+ let mut src_paths_types = vec![];
+
+ detect_source_paths_and_types(&path, name, &mut src_paths_types)?;
+
+ if src_paths_types.is_empty() {
+ src_paths_types.push(plan_new_source_file(opts.bin, name.to_string()));
+ } else {
+ // The --bin option may be ignored if lib.rs or src/lib.rs is present:
+ // when running `cargo init --bin` inside a library project stub, the
+ // user may mean "initialize for a library, but also add a binary target".
+ }
+
+ let mut version_control = opts.version_control;
+
+ if version_control.is_none() {
+ let mut num_detected_vcses = 0;
+
+ if fs::metadata(&path.join(".git")).is_ok() {
+ version_control = Some(VersionControl::Git);
+ num_detected_vcses += 1;
+ }
+
+ if fs::metadata(&path.join(".hg")).is_ok() {
+ version_control = Some(VersionControl::Hg);
+ num_detected_vcses += 1;
+ }
+
+ if fs::metadata(&path.join(".pijul")).is_ok() {
+ version_control = Some(VersionControl::Pijul);
+ num_detected_vcses += 1;
+ }
+
+ if fs::metadata(&path.join(".fossil")).is_ok() {
+ version_control = Some(VersionControl::Fossil);
+ num_detected_vcses += 1;
+ }
+
+ // if none exists, maybe create git, like in `cargo new`
+
+ if num_detected_vcses > 1 {
+ bail!("more than one of .hg, .git, .pijul, .fossil configurations \
+ found and the ignore file can't be filled in as \
+ a result; specify --vcs to override detection");
+ }
+ }
+
+ let mkopts = MkOptions {
+ version_control: version_control,
+ path: &path,
+ name: name,
+ bin: src_paths_types.iter().any(|x|x.bin),
+ source_files: src_paths_types,
+ };
+
+ mk(config, &mkopts).chain_err(|| {
+ format!("Failed to create project `{}` at `{}`",
+ name, path.display())
+ })
+}
+
+fn strip_rust_affixes(name: &str) -> &str {
+ for &prefix in &["rust-", "rust_", "rs-", "rs_"] {
+ if name.starts_with(prefix) {
+ return &name[prefix.len()..];
+ }
+ }
+ for &suffix in &["-rust", "_rust", "-rs", "_rs"] {
+ if name.ends_with(suffix) {
+ return &name[..name.len()-suffix.len()];
+ }
+ }
+ name
+}
+
+fn existing_vcs_repo(path: &Path, cwd: &Path) -> bool {
+ GitRepo::discover(path, cwd).is_ok() || HgRepo::discover(path, cwd).is_ok()
+}
+
+fn mk(config: &Config, opts: &MkOptions) -> CargoResult<()> {
+ let path = opts.path;
+ let name = opts.name;
+ let cfg = global_config(config)?;
+ // Please ensure that ignore and hgignore are in sync.
+ let ignore = ["\n", "/target/\n", "**/*.rs.bk\n",
+ if !opts.bin { "Cargo.lock\n" } else { "" }]
+ .concat();
+ // Mercurial glob ignores can't be rooted, so just sticking a 'syntax: glob' at the top of the
+ // file will exclude too much. Instead, use regexp-based ignores. See 'hg help ignore' for
+ // more.
+ let hgignore = ["\n", "^target/\n", "glob:*.rs.bk\n",
+ if !opts.bin { "glob:Cargo.lock\n" } else { "" }]
+ .concat();
+
+ let vcs = opts.version_control
+ .unwrap_or_else(|| {
+ let in_existing_vcs = existing_vcs_repo(path.parent().unwrap_or(path),
+ config.cwd());
+ match (cfg.version_control, in_existing_vcs) {
+ (None, false) => VersionControl::Git,
+ (Some(opt), false) => opt,
+ (_, true) => VersionControl::NoVcs,
+ }
+ });
+
+ match vcs {
+ VersionControl::Git => {
+ if !fs::metadata(&path.join(".git")).is_ok() {
+ GitRepo::init(path, config.cwd())?;
+ }
+ paths::append(&path.join(".gitignore"), ignore.as_bytes())?;
+ },
+ VersionControl::Hg => {
+ if !fs::metadata(&path.join(".hg")).is_ok() {
+ HgRepo::init(path, config.cwd())?;
+ }
+ paths::append(&path.join(".hgignore"), hgignore.as_bytes())?;
+ },
+ VersionControl::Pijul => {
+ if !fs::metadata(&path.join(".pijul")).is_ok() {
+ PijulRepo::init(path, config.cwd())?;
+ }
+ },
+ VersionControl::Fossil => {
+ if !fs::metadata(&path.join(".fossil")).is_ok() {
+ FossilRepo::init(path, config.cwd())?;
+ }
+ },
+ VersionControl::NoVcs => {
+ fs::create_dir_all(path)?;
+ },
+ };
+
+ let (author_name, email) = discover_author()?;
+ // Hoo boy, sure glad we've got exhaustiveness checking behind us.
+ let author = match (cfg.name, cfg.email, author_name, email) {
+ (Some(name), Some(email), _, _) |
+ (Some(name), None, _, Some(email)) |
+ (None, Some(email), name, _) |
+ (None, None, name, Some(email)) => format!("{} <{}>", name, email),
+ (Some(name), None, _, None) |
+ (None, None, name, None) => name,
+ };
+
+ let mut cargotoml_path_specifier = String::new();
+
+ // Calculate which [lib] and [[bin]] sections we need to append to Cargo.toml.
+
+ for i in &opts.source_files {
+ if i.bin {
+ if i.relative_path != "src/main.rs" {
+ cargotoml_path_specifier.push_str(&format!(r#"
+[[bin]]
+name = "{}"
+path = {}
+"#, i.target_name, toml::Value::String(i.relative_path.clone())));
+ }
+ } else if i.relative_path != "src/lib.rs" {
+ cargotoml_path_specifier.push_str(&format!(r#"
+[lib]
+name = "{}"
+path = {}
+"#, i.target_name, toml::Value::String(i.relative_path.clone())));
+ }
+ }
+
+ // Create Cargo.toml file with necessary [lib] and [[bin]] sections, if needed
+
+ paths::write(&path.join("Cargo.toml"), format!(
+r#"[package]
+name = "{}"
+version = "0.1.0"
+authors = [{}]
+
+[dependencies]
+{}"#, name, toml::Value::String(author), cargotoml_path_specifier).as_bytes())?;
+
+
+ // Create all specified source files (with their parent directories)
+ // if they don't already exist.
+
+ for i in &opts.source_files {
+ let path_of_source_file = path.join(i.relative_path.clone());
+
+ if let Some(src_dir) = path_of_source_file.parent() {
+ fs::create_dir_all(src_dir)?;
+ }
+
+ let default_file_content : &[u8] = if i.bin {
+ b"\
+fn main() {
+ println!(\"Hello, world!\");
+}
+"
+ } else {
+ b"\
+#[cfg(test)]
+mod tests {
+ #[test]
+ fn it_works() {
+ assert_eq!(2 + 2, 4);
+ }
+}
+"
+ };
+
+ if !fs::metadata(&path_of_source_file).map(|x| x.is_file()).unwrap_or(false) {
+ paths::write(&path_of_source_file, default_file_content)?;
+ }
+ }
+
+ if let Err(e) = Workspace::new(&path.join("Cargo.toml"), config) {
+ let msg = format!("compiling this new crate may not work due to invalid \
+ workspace configuration\n\n{}", e);
+ config.shell().warn(msg)?;
+ }
+
+ Ok(())
+}
+
+fn get_environment_variable(variables: &[&str]) -> Option<String> {
+ variables.iter()
+ .filter_map(|var| env::var(var).ok())
+ .next()
+}
+
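+/// Discovers the author name and email: `CARGO_NAME` and the git author /
+/// committer environment variables first, then git's `user.name` /
+/// `user.email` config, then fallbacks such as `USER`, `USERNAME`, `NAME`,
+/// and `EMAIL`.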
+fn discover_author() -> CargoResult<(String, Option<String>)> {
+ let cwd = env::current_dir()?;
+ let git_config = if let Ok(repo) = GitRepository::discover(&cwd) {
+ repo.config().ok().or_else(|| GitConfig::open_default().ok())
+ } else {
+ GitConfig::open_default().ok()
+ };
+ let git_config = git_config.as_ref();
+ let name_variables = ["CARGO_NAME", "GIT_AUTHOR_NAME", "GIT_COMMITTER_NAME",
+ "USER", "USERNAME", "NAME"];
+ let name = get_environment_variable(&name_variables[0..3])
+ .or_else(|| git_config.and_then(|g| g.get_string("user.name").ok()))
+ .or_else(|| get_environment_variable(&name_variables[3..]));
+
+ let name = match name {
+ Some(name) => name,
+ None => {
+ let username_var = if cfg!(windows) {"USERNAME"} else {"USER"};
+ bail!("could not determine the current user, please set ${}",
+ username_var)
+ }
+ };
+ let email_variables = ["CARGO_EMAIL", "GIT_AUTHOR_EMAIL", "GIT_COMMITTER_EMAIL",
+ "EMAIL"];
+ let email = get_environment_variable(&email_variables[0..3])
+ .or_else(|| git_config.and_then(|g| g.get_string("user.email").ok()))
+ .or_else(|| get_environment_variable(&email_variables[3..]));
+
+ let name = name.trim().to_string();
+ let email = email.map(|s| s.trim().to_string());
+
+ Ok((name, email))
+}
+
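+/// Reads the `[cargo-new]` configuration. An illustrative (hypothetical
+/// values) config snippet covering the recognized keys:
+///
+/// ```toml
+/// [cargo-new]
+/// name = "Jane Doe"
+/// email = "jane@example.com"
+/// vcs = "git" # "git", "hg", or "none"
+/// ```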
+fn global_config(config: &Config) -> CargoResult<CargoNewConfig> {
+ let name = config.get_string("cargo-new.name")?.map(|s| s.val);
+ let email = config.get_string("cargo-new.email")?.map(|s| s.val);
+ let vcs = config.get_string("cargo-new.vcs")?;
+
+ let vcs = match vcs.as_ref().map(|p| (&p.val[..], &p.definition)) {
+ Some(("git", _)) => Some(VersionControl::Git),
+ Some(("hg", _)) => Some(VersionControl::Hg),
+ Some(("none", _)) => Some(VersionControl::NoVcs),
+ Some((s, p)) => {
+ return Err(internal(format!("invalid configuration for key \
+ `cargo-new.vcs`, unknown vcs `{}` \
+ (found in {})", s, p)))
+ }
+ None => None
+ };
+ Ok(CargoNewConfig {
+ name: name,
+ email: email,
+ version_control: vcs,
+ })
+}
+
+#[cfg(test)]
+mod tests {
+ use super::strip_rust_affixes;
+
+ #[test]
+ fn affixes_stripped() {
+ assert_eq!(strip_rust_affixes("rust-foo"), "foo");
+ assert_eq!(strip_rust_affixes("foo-rs"), "foo");
+ assert_eq!(strip_rust_affixes("rs_foo"), "foo");
+ // Only one affix is stripped
+ assert_eq!(strip_rust_affixes("rs-foo-rs"), "foo-rs");
+ assert_eq!(strip_rust_affixes("foo-rs-rs"), "foo-rs");
+ // It shouldn't touch the middle
+ assert_eq!(strip_rust_affixes("some-rust-crate"), "some-rust-crate");
+ }
+}
--- /dev/null
+use serde::ser::{self, Serialize};
+
+use core::resolver::Resolve;
+use core::{Package, PackageId, Workspace};
+use ops::{self, Packages};
+use util::CargoResult;
+
+const VERSION: u32 = 1;
+
+pub struct OutputMetadataOptions {
+ pub features: Vec<String>,
+ pub no_default_features: bool,
+ pub all_features: bool,
+ pub no_deps: bool,
+ pub version: u32,
+}
+
+/// Loads the manifest, resolves the dependencies of the project to the concrete
+/// used versions - considering overrides - and writes all dependencies in a JSON
+/// format to stdout.
+pub fn output_metadata(ws: &Workspace,
+ opt: &OutputMetadataOptions) -> CargoResult<ExportInfo> {
+ if opt.version != VERSION {
+ bail!("metadata version {} not supported, only {} is currently supported",
+ opt.version, VERSION);
+ }
+ if opt.no_deps {
+ metadata_no_deps(ws, opt)
+ } else {
+ metadata_full(ws, opt)
+ }
+}
+
+fn metadata_no_deps(ws: &Workspace,
+ _opt: &OutputMetadataOptions) -> CargoResult<ExportInfo> {
+ Ok(ExportInfo {
+ packages: ws.members().cloned().collect(),
+ workspace_members: ws.members().map(|pkg| pkg.package_id().clone()).collect(),
+ resolve: None,
+ target_directory: ws.target_dir().display().to_string(),
+ version: VERSION,
+ })
+}
+
+fn metadata_full(ws: &Workspace,
+ opt: &OutputMetadataOptions) -> CargoResult<ExportInfo> {
+ let specs = Packages::All.into_package_id_specs(ws)?;
+ let deps = ops::resolve_ws_precisely(ws,
+ None,
+ &opt.features,
+ opt.all_features,
+ opt.no_default_features,
+ &specs)?;
+ let (packages, resolve) = deps;
+
+ let packages = packages.package_ids()
+ .map(|i| packages.get(i).map(|p| p.clone()))
+ .collect::<CargoResult<Vec<_>>>()?;
+
+ Ok(ExportInfo {
+ packages: packages,
+ workspace_members: ws.members().map(|pkg| pkg.package_id().clone()).collect(),
+ resolve: Some(MetadataResolve{
+ resolve: resolve,
+ root: ws.current_opt().map(|pkg| pkg.package_id().clone()),
+ }),
+ target_directory: ws.target_dir().display().to_string(),
+ version: VERSION,
+ })
+}
+
+#[derive(Serialize)]
+pub struct ExportInfo {
+ packages: Vec<Package>,
+ workspace_members: Vec<PackageId>,
+ resolve: Option<MetadataResolve>,
+ target_directory: String,
+ version: u32,
+}
+
+/// Newtype wrapper to provide a custom `Serialize` implementation.
+/// The one from lockfile does not fit because it uses a non-standard
+/// format for `PackageId`s
+#[derive(Serialize)]
+struct MetadataResolve {
+ #[serde(rename = "nodes", serialize_with = "serialize_resolve")]
+ resolve: Resolve,
+ root: Option<PackageId>,
+}
+
+fn serialize_resolve<S>(resolve: &Resolve, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+{
+ #[derive(Serialize)]
+ struct Node<'a> {
+ id: &'a PackageId,
+ dependencies: Vec<&'a PackageId>,
+ }
+
+ resolve.iter().map(|id| {
+ Node {
+ id: id,
+ dependencies: resolve.deps(id).collect(),
+ }
+ }).collect::<Vec<_>>().serialize(s)
+}
--- /dev/null
+use std::fs::{self, File};
+use std::io::SeekFrom;
+use std::io::prelude::*;
+use std::path::{self, Path};
+use std::sync::Arc;
+
+use flate2::read::GzDecoder;
+use flate2::{GzBuilder, Compression};
+use git2;
+use tar::{Archive, Builder, Header, EntryType};
+
+use core::{Package, Workspace, Source, SourceId};
+use sources::PathSource;
+use util::{self, internal, Config, FileLock};
+use util::errors::{CargoResult, CargoResultExt};
+use ops::{self, DefaultExecutor};
+
+pub struct PackageOpts<'cfg> {
+ pub config: &'cfg Config,
+ pub list: bool,
+ pub check_metadata: bool,
+ pub allow_dirty: bool,
+ pub verify: bool,
+ pub jobs: Option<u32>,
+ pub target: Option<&'cfg str>,
+ pub registry: Option<String>,
+}
+
+pub fn package(ws: &Workspace,
+ opts: &PackageOpts) -> CargoResult<Option<FileLock>> {
+ let pkg = ws.current()?;
+ let config = ws.config();
+
+ // Allow packaging if a registry has been provided, or if there are no nightly
+ // features enabled.
+ if opts.registry.is_none() && !pkg.manifest().features().activated().is_empty() {
+ bail!("cannot package or publish crates which activate nightly-only \
+ cargo features")
+ }
+ let mut src = PathSource::new(pkg.root(),
+ pkg.package_id().source_id(),
+ config);
+ src.update()?;
+
+ if opts.check_metadata {
+ check_metadata(pkg, config)?;
+ }
+
+ verify_dependencies(pkg)?;
+
+ if opts.list {
+ let root = pkg.root();
+ let mut list: Vec<_> = src.list_files(pkg)?.iter().map(|file| {
+ util::without_prefix(file, root).unwrap().to_path_buf()
+ }).collect();
+ list.sort();
+ for file in list.iter() {
+ println!("{}", file.display());
+ }
+ return Ok(None)
+ }
+
+ if !opts.allow_dirty {
+ check_not_dirty(pkg, &src)?;
+ }
+
+ let filename = format!("{}-{}.crate", pkg.name(), pkg.version());
+ let dir = ws.target_dir().join("package");
+ let mut dst = {
+ let tmp = format!(".{}", filename);
+ dir.open_rw(&tmp, config, "package scratch space")?
+ };
+
+ // Package up and test a temporary tarball and only move it to the final
+ // location if it actually passes all our tests. Any previously existing
+    // tarball can be assumed to be corrupt or invalid, so we just blow it away if
+ // it exists.
+ config.shell().status("Packaging", pkg.package_id().to_string())?;
+ dst.file().set_len(0)?;
+ tar(ws, &src, dst.file(), &filename).chain_err(|| {
+ "failed to prepare local package for uploading"
+ })?;
+ if opts.verify {
+ dst.seek(SeekFrom::Start(0))?;
+ run_verify(ws, dst.file(), opts).chain_err(|| {
+ "failed to verify package tarball"
+ })?
+ }
+ dst.seek(SeekFrom::Start(0))?;
+ {
+ let src_path = dst.path();
+ let dst_path = dst.parent().join(&filename);
+ fs::rename(&src_path, &dst_path).chain_err(|| {
+ "failed to move temporary tarball into final location"
+ })?;
+ }
+ Ok(Some(dst))
+}
+
+// check that the package has some piece of metadata that a human can
+// use to tell what the package is about.
+fn check_metadata(pkg: &Package, config: &Config) -> CargoResult<()> {
+ let md = pkg.manifest().metadata();
+
+ let mut missing = vec![];
+
+ macro_rules! lacking {
+ ($( $($field: ident)||* ),*) => {{
+ $(
+ if $(md.$field.as_ref().map_or(true, |s| s.is_empty()))&&* {
+ $(missing.push(stringify!($field).replace("_", "-"));)*
+ }
+ )*
+ }}
+ }
+ lacking!(description, license || license_file, documentation || homepage || repository);
+
+ if !missing.is_empty() {
+ let mut things = missing[..missing.len() - 1].join(", ");
+ // things will be empty if and only if length == 1 (i.e. the only case
+ // to have no `or`).
+ if !things.is_empty() {
+ things.push_str(" or ");
+ }
+ things.push_str(missing.last().unwrap());
+
+ config.shell().warn(
+ &format!("manifest has no {things}.\n\
+ See http://doc.crates.io/manifest.html#package-metadata for more info.",
+ things = things))?
+ }
+ Ok(())
+}
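+
+// For example (illustrative output): a manifest missing its description and
+// any license information triggers:
+//
+//     warning: manifest has no description, license or license-file.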
+
+// check that the package dependencies are safe to deploy.
+fn verify_dependencies(pkg: &Package) -> CargoResult<()> {
+ for dep in pkg.dependencies() {
+ if dep.source_id().is_path() && !dep.specified_req() {
+ bail!("all path dependencies must have a version specified \
+ when packaging.\ndependency `{}` does not specify \
+ a version.", dep.name())
+ }
+ }
+ Ok(())
+}
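+
+// For example (illustrative manifest snippet): this dependency would be
+// rejected by `verify_dependencies`, while adding `version = "0.1"` to it
+// would make it acceptable:
+//
+//     [dependencies]
+//     bar = { path = "../bar" }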
+
+fn check_not_dirty(p: &Package, src: &PathSource) -> CargoResult<()> {
+ if let Ok(repo) = git2::Repository::discover(p.root()) {
+ if let Some(workdir) = repo.workdir() {
+ debug!("found a git repo at {:?}, checking if index present",
+ workdir);
+ let path = p.manifest_path();
+ let path = path.strip_prefix(workdir).unwrap_or(path);
+ if let Ok(status) = repo.status_file(path) {
+ if (status & git2::STATUS_IGNORED).is_empty() {
+ debug!("Cargo.toml found in repo, checking if dirty");
+ return git(p, src, &repo)
+ }
+ }
+ }
+ }
+
+ // No VCS recognized, we don't know if the directory is dirty or not, so we
+ // have to assume that it's clean.
+ return Ok(());
+
+ fn git(p: &Package,
+ src: &PathSource,
+ repo: &git2::Repository) -> CargoResult<()> {
+ let workdir = repo.workdir().unwrap();
+ let dirty = src.list_files(p)?.iter().filter(|file| {
+ let relative = file.strip_prefix(workdir).unwrap();
+ if let Ok(status) = repo.status_file(relative) {
+ status != git2::STATUS_CURRENT
+ } else {
+ false
+ }
+ }).map(|path| {
+ path.strip_prefix(p.root()).unwrap_or(path).display().to_string()
+ }).collect::<Vec<_>>();
+ if dirty.is_empty() {
+ Ok(())
+ } else {
+ bail!("{} files in the working directory contain changes that were \
+ not yet committed into git:\n\n{}\n\n\
+ to proceed despite this, pass the `--allow-dirty` flag",
+ dirty.len(), dirty.join("\n"))
+ }
+ }
+}
+
+fn tar(ws: &Workspace,
+ src: &PathSource,
+ dst: &File,
+ filename: &str) -> CargoResult<()> {
+ // Prepare the encoder and its header
+ let filename = Path::new(filename);
+ let encoder = GzBuilder::new().filename(util::path2bytes(filename)?)
+ .write(dst, Compression::Best);
+
+ // Put all package files into a compressed archive
+ let mut ar = Builder::new(encoder);
+ let pkg = ws.current()?;
+ let config = ws.config();
+ let root = pkg.root();
+ for file in src.list_files(pkg)?.iter() {
+ let relative = util::without_prefix(file, root).unwrap();
+ check_filename(relative)?;
+ let relative = relative.to_str().ok_or_else(|| {
+ format!("non-utf8 path in source directory: {}",
+ relative.display())
+ })?;
+ config.shell().verbose(|shell| {
+ shell.status("Archiving", &relative)
+ })?;
+ let path = format!("{}-{}{}{}", pkg.name(), pkg.version(),
+ path::MAIN_SEPARATOR, relative);
+
+ // The tar::Builder type by default will build GNU archives, but
+ // unfortunately we force it here to use UStar archives instead. The
+ // UStar format has more limitations on the length of path name that it
+ // can encode, so it's not quite as nice to use.
+ //
+ // Older cargos, however, had a bug where GNU archives were interpreted
+ // as UStar archives. This bug means that if we publish a GNU archive
+ // which has fully filled out metadata it'll be corrupt when unpacked by
+ // older cargos.
+ //
+ // Hopefully in the future after enough cargos have been running around
+ // with the bugfixed tar-rs library we'll be able to switch this over to
+ // GNU archives, but for now we'll just say that you can't encode paths
+ // in archives that are *too* long.
+ //
+ // For an instance of this in the wild, use the tar-rs 0.3.3 library to
+ // unpack the selectors 0.4.0 crate on crates.io. Either that or take a
+ // look at rust-lang/cargo#2326
+ let mut header = Header::new_ustar();
+ header.set_path(&path).chain_err(|| {
+ format!("failed to add to archive: `{}`", relative)
+ })?;
+ let mut file = File::open(file).chain_err(|| {
+ format!("failed to open for archiving: `{}`", file.display())
+ })?;
+ let metadata = file.metadata().chain_err(|| {
+ format!("could not learn metadata for: `{}`", relative)
+ })?;
+ header.set_metadata(&metadata);
+
+ if relative == "Cargo.toml" {
+ let orig = Path::new(&path).with_file_name("Cargo.toml.orig");
+ header.set_path(&orig)?;
+ header.set_cksum();
+ ar.append(&header, &mut file).chain_err(|| {
+ internal(format!("could not archive source file `{}`", relative))
+ })?;
+
+ let mut header = Header::new_ustar();
+ let toml = pkg.to_registry_toml()?;
+ header.set_path(&path)?;
+ header.set_entry_type(EntryType::file());
+ header.set_mode(0o644);
+ header.set_size(toml.len() as u64);
+ header.set_cksum();
+ ar.append(&header, toml.as_bytes()).chain_err(|| {
+ internal(format!("could not archive source file `{}`", relative))
+ })?;
+ } else {
+ header.set_cksum();
+ ar.append(&header, &mut file).chain_err(|| {
+ internal(format!("could not archive source file `{}`", relative))
+ })?;
+ }
+ }
+ let encoder = ar.into_inner()?;
+ encoder.finish()?;
+ Ok(())
+}
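+
+// The archive produced by `tar` above is laid out roughly like this
+// (illustrative package name/version):
+//
+//     foo-0.1.0/Cargo.toml        (normalized, from `to_registry_toml`)
+//     foo-0.1.0/Cargo.toml.orig   (the manifest as written by the author)
+//     foo-0.1.0/src/lib.rs
+//     ...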
+
+fn run_verify(ws: &Workspace, tar: &File, opts: &PackageOpts) -> CargoResult<()> {
+ let config = ws.config();
+ let pkg = ws.current()?;
+
+ config.shell().status("Verifying", pkg)?;
+
+ let f = GzDecoder::new(tar)?;
+ let dst = pkg.root().join(&format!("target/package/{}-{}",
+ pkg.name(), pkg.version()));
+ if fs::metadata(&dst).is_ok() {
+ fs::remove_dir_all(&dst)?;
+ }
+ let mut archive = Archive::new(f);
+ archive.unpack(dst.parent().unwrap())?;
+
+ // Manufacture an ephemeral workspace to ensure that even if the top-level
+ // package has a workspace we can still build our new crate.
+ let id = SourceId::for_path(&dst)?;
+ let mut src = PathSource::new(&dst, &id, ws.config());
+ let new_pkg = src.root_package()?;
+ let ws = Workspace::ephemeral(new_pkg, config, None, true)?;
+
+ ops::compile_ws(&ws, None, &ops::CompileOptions {
+ config: config,
+ jobs: opts.jobs,
+ target: opts.target,
+ features: &[],
+ no_default_features: false,
+ all_features: false,
+ spec: ops::Packages::Packages(&[]),
+ filter: ops::CompileFilter::Default { required_features_filterable: true },
+ release: false,
+ message_format: ops::MessageFormat::Human,
+ mode: ops::CompileMode::Build,
+ target_rustdoc_args: None,
+ target_rustc_args: None,
+ }, Arc::new(DefaultExecutor))?;
+
+ Ok(())
+}
+
+// It can often be the case that files of a particular name on one platform
+// can't actually be created on another platform. For example files with colons
+// in the name are allowed on Unix but not on Windows.
+//
+// To help out in situations like this, reject weird filenames when packaging,
+// as a "heads up" that the package may not unpack on other platforms.
+fn check_filename(file: &Path) -> CargoResult<()> {
+ let name = match file.file_name() {
+ Some(name) => name,
+ None => return Ok(()),
+ };
+ let name = match name.to_str() {
+ Some(name) => name,
+ None => {
+ bail!("path does not have a unicode filename which may not unpack \
+ on all platforms: {}", file.display())
+ }
+ };
+ let bad_chars = ['/', '\\', '<', '>', ':', '"', '|', '?', '*'];
+ if let Some(c) = bad_chars.iter().find(|c| name.contains(**c)) {
+ bail!("cannot package a filename with a special character `{}`: {}",
+ c, file.display())
+ }
+ Ok(())
+}
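+
+// A minimal sketch of the behavior above (illustrative test, not part of the
+// original source): ordinary names pass, names with special characters fail.
+#[cfg(test)]
+mod filename_tests {
+    use std::path::Path;
+
+    use super::check_filename;
+
+    #[test]
+    fn special_characters_rejected() {
+        assert!(check_filename(Path::new("src/lib.rs")).is_ok());
+        assert!(check_filename(Path::new("src/foo:bar.rs")).is_err());
+    }
+}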
--- /dev/null
+use ops;
+use core::{PackageIdSpec, Workspace};
+use util::CargoResult;
+
+pub fn pkgid(ws: &Workspace, spec: Option<&str>) -> CargoResult<PackageIdSpec> {
+ let resolve = match ops::load_pkg_lockfile(ws)? {
+ Some(resolve) => resolve,
+ None => bail!("a Cargo.lock must exist for this command"),
+ };
+
+ let pkgid = match spec {
+ Some(spec) => PackageIdSpec::query_str(spec, resolve.iter())?,
+ None => ws.current()?.package_id(),
+ };
+ Ok(PackageIdSpec::from_package_id(pkgid))
+}
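+
+// Illustrative spec strings accepted here (values hypothetical):
+//
+//     cargo pkgid foo
+//     cargo pkgid foo:1.2.3
+//     cargo pkgid https://github.com/rust-lang/crates.io-index#foo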
--- /dev/null
+use std::collections::{HashMap, HashSet};
+use std::fs;
+use std::io;
+use std::path::{Path, PathBuf};
+
+use core::{Package, SourceId, PackageId, EitherManifest};
+use util::{self, Config};
+use util::errors::{CargoResult, CargoResultExt, CargoError};
+use util::important_paths::find_project_manifest_exact;
+use util::toml::read_manifest;
+
+pub fn read_package(path: &Path, source_id: &SourceId, config: &Config)
+ -> CargoResult<(Package, Vec<PathBuf>)> {
+ trace!("read_package; path={}; source-id={}", path.display(), source_id);
+ let (manifest, nested) = read_manifest(path, source_id, config)?;
+ let manifest = match manifest {
+ EitherManifest::Real(manifest) => manifest,
+ EitherManifest::Virtual(..) => {
+ bail!("found a virtual manifest at `{}` instead of a package \
+ manifest", path.display())
+ }
+ };
+
+ Ok((Package::new(manifest, path), nested))
+}
+
+pub fn read_packages(path: &Path, source_id: &SourceId, config: &Config)
+ -> CargoResult<Vec<Package>> {
+ let mut all_packages = HashMap::new();
+ let mut visited = HashSet::<PathBuf>::new();
+ let mut errors = Vec::<CargoError>::new();
+
+ trace!("looking for root package: {}, source_id={}", path.display(), source_id);
+
+ walk(path, &mut |dir| {
+ trace!("looking for child package: {}", dir.display());
+
+ // Don't recurse into hidden/dot directories unless we're at the toplevel
+ if dir != path {
+ let name = dir.file_name().and_then(|s| s.to_str());
+ if name.map(|s| s.starts_with('.')) == Some(true) {
+ return Ok(false)
+ }
+
+ // Don't automatically discover packages across git submodules
+ if fs::metadata(&dir.join(".git")).is_ok() {
+ return Ok(false)
+ }
+ }
+
+ // Don't ever look at target directories
+ if dir.file_name().and_then(|s| s.to_str()) == Some("target") &&
+ has_manifest(dir.parent().unwrap()) {
+ return Ok(false)
+ }
+
+ if has_manifest(dir) {
+ read_nested_packages(dir, &mut all_packages, source_id, config,
+ &mut visited, &mut errors)?;
+ }
+ Ok(true)
+ })?;
+
+ if all_packages.is_empty() {
+ match errors.pop() {
+ Some(err) => Err(err),
+ None => Err(format!("Could not find Cargo.toml in `{}`", path.display()).into()),
+ }
+ } else {
+ Ok(all_packages.into_iter().map(|(_, v)| v).collect())
+ }
+}
+
+fn walk(path: &Path, callback: &mut FnMut(&Path) -> CargoResult<bool>)
+ -> CargoResult<()> {
+ if !callback(path)? {
+ trace!("not processing {}", path.display());
+ return Ok(())
+ }
+
+ // Ignore any permission denied errors because temporary directories
+ // can often have some weird permissions on them.
+ let dirs = match fs::read_dir(path) {
+ Ok(dirs) => dirs,
+ Err(ref e) if e.kind() == io::ErrorKind::PermissionDenied => {
+ return Ok(())
+ }
+ Err(e) => {
+ return Err(e).chain_err(|| {
+ format!("failed to read directory `{}`", path.display())
+ })
+ }
+ };
+ for dir in dirs {
+ let dir = dir?;
+ if dir.file_type()?.is_dir() {
+ walk(&dir.path(), callback)?;
+ }
+ }
+ Ok(())
+}
+
+fn has_manifest(path: &Path) -> bool {
+ find_project_manifest_exact(path, "Cargo.toml").is_ok()
+}
+
+fn read_nested_packages(path: &Path,
+ all_packages: &mut HashMap<PackageId, Package>,
+ source_id: &SourceId,
+ config: &Config,
+ visited: &mut HashSet<PathBuf>,
+ errors: &mut Vec<CargoError>) -> CargoResult<()> {
+ if !visited.insert(path.to_path_buf()) { return Ok(()) }
+
+ let manifest_path = find_project_manifest_exact(path, "Cargo.toml")?;
+
+ let (manifest, nested) = match read_manifest(&manifest_path, source_id, config) {
+ Err(err) => {
+            // Ignore malformed manifests found in git repositories
+            //
+            // The git source tries to find and read all manifests in the
+            // repository, but since it's not possible to exclude folders from
+            // this search it's safer to ignore malformed manifests than to
+            // abort the scan entirely.
+            //
+            // TODO: Add a way to exclude folders?
+ info!("skipping malformed package found at `{}`",
+ path.to_string_lossy());
+ errors.push(err);
+ return Ok(());
+ }
+ Ok(tuple) => tuple
+ };
+
+ let manifest = match manifest {
+ EitherManifest::Real(manifest) => manifest,
+ EitherManifest::Virtual(..) => return Ok(()),
+ };
+ let pkg = Package::new(manifest, &manifest_path);
+
+ let pkg_id = pkg.package_id().clone();
+ if !all_packages.contains_key(&pkg_id) {
+ all_packages.insert(pkg_id, pkg);
+ } else {
+ info!("skipping nested package `{}` found at `{}`",
+ pkg.name(), path.to_string_lossy());
+ }
+
+ // Registry sources are not allowed to have `path=` dependencies because
+ // they're all translated to actual registry dependencies.
+ //
+    // We normalize the path here to ensure that we don't infinitely walk around
+ // looking for crates. By normalizing we ensure that we visit this crate at
+ // most once.
+ //
+ // TODO: filesystem/symlink implications?
+ if !source_id.is_registry() {
+ for p in nested.iter() {
+ let path = util::normalize_path(&path.join(p));
+ read_nested_packages(&path, all_packages, source_id,
+ config, visited, errors)?;
+ }
+ }
+
+ Ok(())
+}
--- /dev/null
+use std::path::Path;
+
+use ops::{self, Packages};
+use util::{self, CargoResult, CargoError, ProcessError};
+use util::errors::CargoErrorKind;
+use core::Workspace;
+
+pub fn run(ws: &Workspace,
+ options: &ops::CompileOptions,
+ args: &[String]) -> CargoResult<Option<ProcessError>> {
+ let config = ws.config();
+
+ let pkg = match options.spec {
+ Packages::All => unreachable!("cargo run supports single package only"),
+ Packages::OptOut(_) => unreachable!("cargo run supports single package only"),
+ Packages::Packages(xs) => match xs.len() {
+ 0 => ws.current()?,
+ 1 => ws.members()
+ .find(|pkg| pkg.name() == xs[0])
+ .ok_or_else(||
+ CargoError::from(
+ format!("package `{}` is not a member of the workspace", xs[0]))
+ )?,
+ _ => unreachable!("cargo run supports single package only"),
+ }
+ };
+
+ let bins: Vec<_> = pkg.manifest().targets().iter().filter(|a| {
+ !a.is_lib() && !a.is_custom_build() && if !options.filter.is_specific() {
+ a.is_bin()
+ } else {
+ options.filter.matches(a)
+ }
+ })
+ .map(|bin| bin.name())
+ .collect();
+
+    if bins.is_empty() {
+ if !options.filter.is_specific() {
+ bail!("a bin target must be available for `cargo run`")
+ } else {
+ // this will be verified in cargo_compile
+ }
+ }
+ if bins.len() > 1 {
+ if !options.filter.is_specific() {
+ bail!("`cargo run` requires that a project only have one \
+ executable; use the `--bin` option to specify which one \
+ to run\navailable binaries: {}", bins.join(", "))
+ } else {
+ bail!("`cargo run` can run at most one executable, but \
+ multiple were specified")
+ }
+ }
+
+ let compile = ops::compile(ws, options)?;
+ assert_eq!(compile.binaries.len(), 1);
+ let exe = &compile.binaries[0];
+ let exe = match util::without_prefix(exe, config.cwd()) {
+ Some(path) if path.file_name() == Some(path.as_os_str())
+ => Path::new(".").join(path).to_path_buf(),
+ Some(path) => path.to_path_buf(),
+ None => exe.to_path_buf(),
+ };
+ let mut process = compile.target_process(exe, pkg)?;
+ process.args(args).cwd(config.cwd());
+
+ config.shell().status("Running", process.to_string())?;
+
+ let result = process.exec_replace();
+
+ match result {
+ Ok(()) => Ok(None),
+ Err(CargoError(CargoErrorKind::ProcessErrorKind(e), ..)) => Ok(Some(e)),
+ Err(e) => Err(e)
+ }
+}
--- /dev/null
+use std::collections::{HashMap, HashSet};
+use std::ffi::OsStr;
+use std::path::PathBuf;
+use semver::Version;
+
+use core::{PackageId, Package, Target, TargetKind};
+use util::{self, CargoResult, Config, LazyCell, ProcessBuilder, process, join_paths};
+
+/// A structure returning the result of a compilation.
+pub struct Compilation<'cfg> {
+ /// A mapping from a package to the list of libraries that need to be
+ /// linked when working with that package.
+ pub libraries: HashMap<PackageId, HashSet<(Target, PathBuf)>>,
+
+ /// An array of all tests created during this compilation.
+ pub tests: Vec<(Package, TargetKind, String, PathBuf)>,
+
+ /// An array of all binaries created.
+ pub binaries: Vec<PathBuf>,
+
+ /// All directories for the output of native build commands.
+ ///
+ /// This is currently used to drive some entries which are added to the
+ /// LD_LIBRARY_PATH as appropriate.
+ // TODO: deprecated, remove
+ pub native_dirs: HashSet<PathBuf>,
+
+ /// Root output directory (for the local package's artifacts)
+ pub root_output: PathBuf,
+
+ /// Output directory for rust dependencies.
+ /// May be for the host or for a specific target.
+ pub deps_output: PathBuf,
+
+ /// Output directory for the rust host dependencies.
+ pub host_deps_output: PathBuf,
+
+ /// The path to rustc's own libstd
+ pub host_dylib_path: Option<PathBuf>,
+
+ /// The path to libstd for the target
+ pub target_dylib_path: Option<PathBuf>,
+
+ /// Extra environment variables that were passed to compilations and should
+ /// be passed to future invocations of programs.
+ pub extra_env: HashMap<PackageId, Vec<(String, String)>>,
+
+ pub to_doc_test: Vec<Package>,
+
+ /// Features per package enabled during this compilation.
+ pub cfgs: HashMap<PackageId, HashSet<String>>,
+
+ pub target: String,
+
+ config: &'cfg Config,
+
+ target_runner: LazyCell<Option<(PathBuf, Vec<String>)>>,
+}
+
+impl<'cfg> Compilation<'cfg> {
+ pub fn new(config: &'cfg Config) -> Compilation<'cfg> {
+ Compilation {
+ libraries: HashMap::new(),
+ native_dirs: HashSet::new(), // TODO: deprecated, remove
+ root_output: PathBuf::from("/"),
+ deps_output: PathBuf::from("/"),
+ host_deps_output: PathBuf::from("/"),
+ host_dylib_path: None,
+ target_dylib_path: None,
+ tests: Vec::new(),
+ binaries: Vec::new(),
+ extra_env: HashMap::new(),
+ to_doc_test: Vec::new(),
+ cfgs: HashMap::new(),
+ config: config,
+ target: String::new(),
+ target_runner: LazyCell::new(),
+ }
+ }
+
+ /// See `process`.
+ pub fn rustc_process(&self, pkg: &Package) -> CargoResult<ProcessBuilder> {
+ self.fill_env(self.config.rustc()?.process(), pkg, true)
+ }
+
+ /// See `process`.
+ pub fn rustdoc_process(&self, pkg: &Package) -> CargoResult<ProcessBuilder> {
+ self.fill_env(process(&*self.config.rustdoc()?), pkg, false)
+ }
+
+ /// See `process`.
+ pub fn host_process<T: AsRef<OsStr>>(&self, cmd: T, pkg: &Package)
+ -> CargoResult<ProcessBuilder> {
+ self.fill_env(process(cmd), pkg, true)
+ }
+
+ fn target_runner(&self) -> CargoResult<&Option<(PathBuf, Vec<String>)>> {
+ self.target_runner.get_or_try_init(|| {
+ let key = format!("target.{}.runner", self.target);
+ Ok(self.config.get_path_and_args(&key)?.map(|v| v.val))
+ })
+ }
+
+ /// See `process`.
+ pub fn target_process<T: AsRef<OsStr>>(&self, cmd: T, pkg: &Package)
+ -> CargoResult<ProcessBuilder> {
+ let builder = if let Some((ref runner, ref args)) = *self.target_runner()? {
+ let mut builder = process(runner);
+ builder.args(args);
+ builder.arg(cmd);
+ builder
+ } else {
+ process(cmd)
+ };
+ self.fill_env(builder, pkg, false)
+ }
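+
+    // For reference (illustrative triple and command): the runner consulted by
+    // `target_process` comes from a `.cargo/config` key of this shape:
+    //
+    //     [target.armv7-unknown-linux-gnueabihf]
+    //     runner = "qemu-arm"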
+
+ /// Prepares a new process with an appropriate environment to run against
+ /// the artifacts produced by the build process.
+ ///
+ /// The package argument is also used to configure environment variables as
+ /// well as the working directory of the child process.
+ fn fill_env(&self, mut cmd: ProcessBuilder, pkg: &Package, is_host: bool)
+ -> CargoResult<ProcessBuilder> {
+
+ let mut search_path = if is_host {
+ let mut search_path = vec![self.host_deps_output.clone()];
+ search_path.extend(self.host_dylib_path.clone());
+ search_path
+ } else {
+ let mut search_path =
+ super::filter_dynamic_search_path(self.native_dirs.iter(),
+ &self.root_output);
+ search_path.push(self.root_output.clone());
+ search_path.push(self.deps_output.clone());
+ search_path.extend(self.target_dylib_path.clone());
+ search_path
+ };
+
+ search_path.extend(util::dylib_path().into_iter());
+ let search_path = join_paths(&search_path, util::dylib_path_envvar())?;
+
+ cmd.env(util::dylib_path_envvar(), &search_path);
+ if let Some(env) = self.extra_env.get(pkg.package_id()) {
+ for &(ref k, ref v) in env {
+ cmd.env(k, v);
+ }
+ }
+
+ let metadata = pkg.manifest().metadata();
+
+ let cargo_exe = self.config.cargo_exe()?;
+ cmd.env(::CARGO_ENV, cargo_exe);
+
+ // When adding new environment variables depending on
+ // crate properties which might require rebuild upon change
+ // consider adding the corresponding properties to the hash
+ // in Context::target_metadata()
+ cmd.env("CARGO_MANIFEST_DIR", pkg.root())
+ .env("CARGO_PKG_VERSION_MAJOR", &pkg.version().major.to_string())
+ .env("CARGO_PKG_VERSION_MINOR", &pkg.version().minor.to_string())
+ .env("CARGO_PKG_VERSION_PATCH", &pkg.version().patch.to_string())
+ .env("CARGO_PKG_VERSION_PRE", &pre_version_component(pkg.version()))
+ .env("CARGO_PKG_VERSION", &pkg.version().to_string())
+ .env("CARGO_PKG_NAME", &pkg.name())
+ .env("CARGO_PKG_DESCRIPTION", metadata.description.as_ref().unwrap_or(&String::new()))
+ .env("CARGO_PKG_HOMEPAGE", metadata.homepage.as_ref().unwrap_or(&String::new()))
+ .env("CARGO_PKG_AUTHORS", &pkg.authors().join(":"))
+ .cwd(pkg.root());
+ Ok(cmd)
+ }
+}
+
+fn pre_version_component(v: &Version) -> String {
+ if v.pre.is_empty() {
+ return String::new();
+ }
+
+ let mut ret = String::new();
+
+ for (i, x) in v.pre.iter().enumerate() {
+ if i != 0 { ret.push('.') };
+ ret.push_str(&x.to_string());
+ }
+
+ ret
+}
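+
+// A quick sketch of the expected output (illustrative test, not part of the
+// original source).
+#[cfg(test)]
+mod tests {
+    use semver::Version;
+
+    use super::pre_version_component;
+
+    #[test]
+    fn pre_version_component_joins_with_dots() {
+        assert_eq!(pre_version_component(&Version::parse("1.2.3").unwrap()), "");
+        assert_eq!(pre_version_component(&Version::parse("1.2.3-alpha.1").unwrap()),
+                   "alpha.1");
+    }
+}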
--- /dev/null
+#![allow(deprecated)]
+
+use std::collections::{HashSet, HashMap, BTreeSet};
+use std::collections::hash_map::Entry;
+use std::env;
+use std::fmt;
+use std::hash::{Hasher, Hash, SipHasher};
+use std::path::{Path, PathBuf};
+use std::str::{self, FromStr};
+use std::sync::Arc;
+use std::cell::RefCell;
+
+use jobserver::Client;
+
+use core::{Package, PackageId, PackageSet, Resolve, Target, Profile};
+use core::{TargetKind, Profiles, Dependency, Workspace};
+use core::dependency::Kind as DepKind;
+use util::{self, ProcessBuilder, internal, Config, profile, Cfg, CfgExpr};
+use util::errors::{CargoResult, CargoResultExt};
+
+use super::TargetConfig;
+use super::custom_build::{BuildState, BuildScripts, BuildDeps};
+use super::fingerprint::Fingerprint;
+use super::layout::Layout;
+use super::links::Links;
+use super::{Kind, Compilation, BuildConfig};
+
+/// All information needed to define a Unit.
+///
+/// A unit is an object that has enough information so that cargo knows how to build it.
+/// For example, if your project has dependencies, then every dependency will be built as a library
+/// unit. If your project is a library, then it will be built as a library unit as well, or if it
+/// is a binary with `main.rs`, then a binary will be output. There are also separate unit types
+/// for `test`ing and `check`ing, amongst others.
+///
+/// The unit also holds all of the metadata of the package itself, in `pkg`.
+///
+/// A unit needs to know extra information in addition to the type and root source file. For
+/// example, it needs to know the target architecture (OS, chip arch etc.) and it needs to know
+/// whether you want a debug or release build. There is enough information in this struct to figure
+/// all that out.
+#[derive(Clone, Copy, Eq, PartialEq, Hash)]
+pub struct Unit<'a> {
+    /// Information about available targets, which files to include/exclude, etc. Basically stuff in
+ /// `Cargo.toml`.
+ pub pkg: &'a Package,
+    /// Information about the specific target to build, out of the possible targets in `pkg`. Not
+    /// to be confused with the *target triple* (or *target architecture*), which is the
+    /// architecture a build is for.
+ pub target: &'a Target,
+ /// The profile contains information about *how* the build should be run, including debug
+ /// level, extra args to pass to rustc, etc.
+ pub profile: &'a Profile,
+ /// Whether this compilation unit is for the host or target architecture.
+ ///
+    /// For example, when cross compiling and using a custom build script, the build script needs
+    /// to be compiled for the host architecture so the host rustc can use it (when compiling to
+    /// the target architecture).
+ pub kind: Kind,
+}
+
+/// Type of each file generated by a Unit.
+#[derive(Copy, Clone, PartialEq, Eq, Debug)]
+pub enum TargetFileType {
+ /// Not a special file type.
+ Normal,
+ /// It is something you can link against (e.g. a library)
+ Linkable,
+ /// It is a piece of external debug information (e.g. *.dSYM and *.pdb)
+ DebugInfo,
+}
+
+/// The build context, containing all information about a build task
+pub struct Context<'a, 'cfg: 'a> {
+ /// The workspace the build is for
+ pub ws: &'a Workspace<'cfg>,
+ /// The cargo configuration
+ pub config: &'cfg Config,
+ /// The dependency graph for our build
+ pub resolve: &'a Resolve,
+ /// Information on the compilation output
+ pub compilation: Compilation<'cfg>,
+ pub packages: &'a PackageSet<'cfg>,
+ pub build_state: Arc<BuildState>,
+ pub build_script_overridden: HashSet<(PackageId, Kind)>,
+ pub build_explicit_deps: HashMap<Unit<'a>, BuildDeps>,
+ pub fingerprints: HashMap<Unit<'a>, Arc<Fingerprint>>,
+ pub compiled: HashSet<Unit<'a>>,
+ pub build_config: BuildConfig,
+ pub build_scripts: HashMap<Unit<'a>, Arc<BuildScripts>>,
+ pub links: Links<'a>,
+ pub used_in_plugin: HashSet<Unit<'a>>,
+ pub jobserver: Client,
+
+ /// The target directory layout for the host (and target if it is the same as host)
+ host: Layout,
+    /// The target directory layout for the target (if different from the host)
+ target: Option<Layout>,
+ target_info: TargetInfo,
+ host_info: TargetInfo,
+ profiles: &'a Profiles,
+ incremental_enabled: bool,
+
+    /// For each Unit, a list of all files produced, as triples of
+ ///
+ /// - File name that will be produced by the build process (in `deps`)
+ /// - If it should be linked into `target`, and what it should be called (e.g. without
+ /// metadata).
+ /// - Type of the file (library / debug symbol / else)
+ target_filenames: HashMap<Unit<'a>, Arc<Vec<(PathBuf, Option<PathBuf>, TargetFileType)>>>,
+ target_metadatas: HashMap<Unit<'a>, Option<Metadata>>,
+}
+
+#[derive(Clone, Default)]
+struct TargetInfo {
+ crate_type_process: Option<ProcessBuilder>,
+ crate_types: RefCell<HashMap<String, Option<(String, String)>>>,
+ cfg: Option<Vec<Cfg>>,
+}
+
+impl TargetInfo {
+ fn discover_crate_type(&self, crate_type: &str) -> CargoResult<Option<(String, String)>> {
+ let mut process = self.crate_type_process.clone().unwrap();
+
+ process.arg("--crate-type").arg(crate_type);
+
+ let output = process.exec_with_output().chain_err(|| {
+ format!("failed to run `rustc` to learn about \
+ crate-type {} information", crate_type)
+ })?;
+
+ let error = str::from_utf8(&output.stderr).unwrap();
+ let output = str::from_utf8(&output.stdout).unwrap();
+ Ok(parse_crate_type(crate_type, error, &mut output.lines())?)
+ }
+}
+
+#[derive(Clone, Hash, Eq, PartialEq, Ord, PartialOrd)]
+pub struct Metadata(u64);
+
+impl<'a, 'cfg> Context<'a, 'cfg> {
+ pub fn new(ws: &'a Workspace<'cfg>,
+ resolve: &'a Resolve,
+ packages: &'a PackageSet<'cfg>,
+ config: &'cfg Config,
+ build_config: BuildConfig,
+ profiles: &'a Profiles) -> CargoResult<Context<'a, 'cfg>> {
+
+ let dest = if build_config.release { "release" } else { "debug" };
+ let host_layout = Layout::new(ws, None, dest)?;
+ let target_layout = match build_config.requested_target.as_ref() {
+ Some(target) => Some(Layout::new(ws, Some(target), dest)?),
+ None => None,
+ };
+
+ // Enable incremental builds if the user opts in. For now,
+ // this is an environment variable until things stabilize a
+ // bit more.
+ let incremental_enabled = match env::var("CARGO_INCREMENTAL") {
+ Ok(v) => v == "1",
+ Err(_) => false,
+ };
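+
+        // (Illustrative usage, not part of the original source):
+        //
+        //     CARGO_INCREMENTAL=1 cargo build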
+
+ // -Z can only be used on nightly builds; other builds complain loudly.
+ // Since incremental builds only work on nightly anyway, we silently
+ // ignore CARGO_INCREMENTAL on anything but nightly. This allows users
+ // to always have CARGO_INCREMENTAL set without getting unexpected
+ // errors on stable/beta builds.
+ let is_nightly =
+ config.rustc()?.verbose_version.contains("-nightly") ||
+ config.rustc()?.verbose_version.contains("-dev");
+ let incremental_enabled = incremental_enabled && is_nightly;
+
+ // Load up the jobserver that we'll use to manage our parallelism. This
+ // is the same as the GNU make implementation of a jobserver, and
+ // intentionally so! It's hoped that we can interact with GNU make and
+ // all share the same jobserver.
+ //
+ // Note that if we don't have a jobserver in our environment then we
+ // create our own, and we create it with `n-1` tokens because one token
+ // is ourself, a running process.
+ let jobserver = match config.jobserver_from_env() {
+ Some(c) => c.clone(),
+ None => Client::new(build_config.jobs as usize - 1).chain_err(|| {
+ "failed to create jobserver"
+ })?,
+ };
+
+ Ok(Context {
+ ws: ws,
+ host: host_layout,
+ target: target_layout,
+ resolve: resolve,
+ packages: packages,
+ config: config,
+ target_info: TargetInfo::default(),
+ host_info: TargetInfo::default(),
+ compilation: Compilation::new(config),
+ build_state: Arc::new(BuildState::new(&build_config)),
+ build_config: build_config,
+ fingerprints: HashMap::new(),
+ profiles: profiles,
+ compiled: HashSet::new(),
+ build_scripts: HashMap::new(),
+ build_explicit_deps: HashMap::new(),
+ links: Links::new(),
+ used_in_plugin: HashSet::new(),
+ incremental_enabled: incremental_enabled,
+ jobserver: jobserver,
+ build_script_overridden: HashSet::new(),
+
+ // TODO: Pre-Calculate these with a topo-sort, rather than lazy-calculating
+ target_filenames: HashMap::new(),
+ target_metadatas: HashMap::new(),
+ })
+ }
+
+ /// Prepare this context, ensuring that all filesystem directories are in
+ /// place.
+ pub fn prepare(&mut self) -> CargoResult<()> {
+ let _p = profile::start("preparing layout");
+
+ self.host.prepare().chain_err(|| {
+ internal("couldn't prepare build directories")
+ })?;
+ if let Some(ref mut target) = self.target {
+ target.prepare().chain_err(|| {
+ internal("couldn't prepare build directories")
+ })?;
+ }
+
+ self.compilation.host_deps_output = self.host.deps().to_path_buf();
+
+ let layout = self.target.as_ref().unwrap_or(&self.host);
+ self.compilation.root_output = layout.dest().to_path_buf();
+ self.compilation.deps_output = layout.deps().to_path_buf();
+ Ok(())
+ }
+
+ /// Ensure that we've collected all target-specific information to compile
+ /// all the units mentioned in `units`.
+ pub fn probe_target_info(&mut self, units: &[Unit<'a>]) -> CargoResult<()> {
+ let mut crate_types = BTreeSet::new();
+ let mut visited_units = HashSet::new();
+ // pre-fill with `bin` for learning about tests (nothing may be
+ // explicitly `bin`) as well as `rlib` as it's the coalesced version of
+ // `lib` in the compiler and we're not sure which we'll see.
+ crate_types.insert("bin".to_string());
+ crate_types.insert("rlib".to_string());
+ for unit in units {
+ self.visit_crate_type(unit, &mut crate_types, &mut visited_units)?;
+ }
+ debug!("probe_target_info: crate_types={:?}", crate_types);
+ self.probe_target_info_kind(&crate_types, Kind::Target)?;
+ if self.requested_target().is_none() {
+ self.host_info = self.target_info.clone();
+ } else {
+ self.probe_target_info_kind(&crate_types, Kind::Host)?;
+ }
+ Ok(())
+ }
+
+    /// A recursive function that checks that all crate types (`rlib`, ...) are
+    /// in `crate_types` for this unit and its dependencies.
+ ///
+ /// Tracks visited units to avoid unnecessary work.
+ fn visit_crate_type(&self,
+ unit: &Unit<'a>,
+ crate_types: &mut BTreeSet<String>,
+ visited_units: &mut HashSet<Unit<'a>>)
+ -> CargoResult<()> {
+ if !visited_units.insert(*unit) {
+ return Ok(());
+ }
+ for target in unit.pkg.manifest().targets() {
+ crate_types.extend(target.rustc_crate_types().iter().map(|s| {
+ if *s == "lib" {
+ "rlib".to_string()
+ } else {
+ s.to_string()
+ }
+ }));
+ }
+ for dep in self.dep_targets(unit)? {
+ self.visit_crate_type(&dep, crate_types, visited_units)?;
+ }
+ Ok(())
+ }
+
+ fn probe_target_info_kind(&mut self,
+ crate_types: &BTreeSet<String>,
+ kind: Kind)
+ -> CargoResult<()> {
+ let rustflags = env_args(self.config,
+ &self.build_config,
+ self.info(&kind),
+ kind,
+ "RUSTFLAGS")?;
+ let mut process = self.config.rustc()?.process();
+ process.arg("-")
+ .arg("--crate-name").arg("___")
+ .arg("--print=file-names")
+ .args(&rustflags)
+ .env_remove("RUST_LOG");
+
+ if kind == Kind::Target {
+ process.arg("--target").arg(&self.target_triple());
+ }
+
+ let crate_type_process = process.clone();
+
+ for crate_type in crate_types {
+ process.arg("--crate-type").arg(crate_type);
+ }
+
+ let mut with_cfg = process.clone();
+ with_cfg.arg("--print=sysroot");
+ with_cfg.arg("--print=cfg");
+
+ let mut has_cfg_and_sysroot = true;
+ let output = with_cfg.exec_with_output().or_else(|_| {
+ has_cfg_and_sysroot = false;
+ process.exec_with_output()
+ }).chain_err(|| {
+ "failed to run `rustc` to learn about target-specific information"
+ })?;
+
+ let error = str::from_utf8(&output.stderr).unwrap();
+ let output = str::from_utf8(&output.stdout).unwrap();
+ let mut lines = output.lines();
+ let mut map = HashMap::new();
+ for crate_type in crate_types {
+ let out = parse_crate_type(crate_type, error, &mut lines)?;
+ map.insert(crate_type.to_string(), out);
+ }
+
+ if has_cfg_and_sysroot {
+ let line = match lines.next() {
+ Some(line) => line,
+ None => bail!("output of --print=sysroot missing when learning about \
+ target-specific information from rustc"),
+ };
+ let mut rustlib = PathBuf::from(line);
+ if kind == Kind::Host {
+ if cfg!(windows) {
+ rustlib.push("bin");
+ } else {
+ rustlib.push("lib");
+ }
+ self.compilation.host_dylib_path = Some(rustlib);
+ } else {
+ rustlib.push("lib");
+ rustlib.push("rustlib");
+ rustlib.push(self.target_triple());
+ rustlib.push("lib");
+ self.compilation.target_dylib_path = Some(rustlib);
+ }
+ }
+
+ let cfg = if has_cfg_and_sysroot {
+            Some(lines.map(Cfg::from_str).collect::<CargoResult<Vec<_>>>()?)
+ } else {
+ None
+ };
+
+ let info = match kind {
+ Kind::Target => &mut self.target_info,
+ Kind::Host => &mut self.host_info,
+ };
+ info.crate_type_process = Some(crate_type_process);
+ info.crate_types = RefCell::new(map);
+ info.cfg = cfg;
+ Ok(())
+ }
+
+    /// Builds up the `used_in_plugin` set internal to this context from the
+    /// list of top-level units.
+    ///
+    /// This will recursively walk `units` and all of their dependencies to
+    /// determine which crates are going to be used in plugins or not.
+ pub fn build_used_in_plugin_map(&mut self, units: &[Unit<'a>])
+ -> CargoResult<()> {
+ let mut visited = HashSet::new();
+ for unit in units {
+ self.walk_used_in_plugin_map(unit,
+ unit.target.for_host(),
+ &mut visited)?;
+ }
+ Ok(())
+ }
+
+ fn walk_used_in_plugin_map(&mut self,
+ unit: &Unit<'a>,
+ is_plugin: bool,
+ visited: &mut HashSet<(Unit<'a>, bool)>)
+ -> CargoResult<()> {
+ if !visited.insert((*unit, is_plugin)) {
+ return Ok(())
+ }
+ if is_plugin {
+ self.used_in_plugin.insert(*unit);
+ }
+ for unit in self.dep_targets(unit)? {
+ self.walk_used_in_plugin_map(&unit,
+ is_plugin || unit.target.for_host(),
+ visited)?;
+ }
+ Ok(())
+ }
+
+    /// Returns the directory layout for the given kind (host or target).
+ fn layout(&self, kind: Kind) -> &Layout {
+ match kind {
+ Kind::Host => &self.host,
+ Kind::Target => self.target.as_ref().unwrap_or(&self.host)
+ }
+ }
+
+ /// Returns the directories where Rust crate dependencies are found for the
+ /// specified unit.
+ pub fn deps_dir(&self, unit: &Unit) -> &Path {
+ self.layout(unit.kind).deps()
+ }
+
+ /// Returns the directory for the specified unit where fingerprint
+ /// information is stored.
+ pub fn fingerprint_dir(&mut self, unit: &Unit<'a>) -> PathBuf {
+ let dir = self.pkg_dir(unit);
+ self.layout(unit.kind).fingerprint().join(dir)
+ }
+
+    /// Returns the directory where the build script for the specified unit is compiled.
+ pub fn build_script_dir(&mut self, unit: &Unit<'a>) -> PathBuf {
+ assert!(unit.target.is_custom_build());
+ assert!(!unit.profile.run_custom_build);
+ let dir = self.pkg_dir(unit);
+ self.layout(Kind::Host).build().join(dir)
+ }
+
+    /// Returns the directory (`OUT_DIR`) where the output of running the specified unit's
+    /// build script is placed.
+ pub fn build_script_out_dir(&mut self, unit: &Unit<'a>) -> PathBuf {
+ assert!(unit.target.is_custom_build());
+ assert!(unit.profile.run_custom_build);
+ let dir = self.pkg_dir(unit);
+ self.layout(unit.kind).build().join(dir).join("out")
+ }
+
+ pub fn host_deps(&self) -> &Path {
+ self.host.deps()
+ }
+
+ /// Return the root of the build output tree
+ pub fn target_root(&self) -> &Path {
+ self.host.dest()
+ }
+
+ /// Returns the appropriate output directory for the specified package and
+ /// target.
+ pub fn out_dir(&mut self, unit: &Unit<'a>) -> PathBuf {
+ if unit.profile.doc {
+ self.layout(unit.kind).root().parent().unwrap().join("doc")
+ } else if unit.target.is_custom_build() {
+ self.build_script_dir(unit)
+ } else if unit.target.is_example() {
+ self.layout(unit.kind).examples().to_path_buf()
+ } else {
+ self.deps_dir(unit).to_path_buf()
+ }
+ }
+
+ fn pkg_dir(&mut self, unit: &Unit<'a>) -> String {
+ let name = unit.pkg.package_id().name();
+ match self.target_metadata(unit) {
+ Some(meta) => format!("{}-{}", name, meta),
+ None => format!("{}-{}", name, self.target_short_hash(unit)),
+ }
+ }
+
+ /// Return the host triple for this context
+ pub fn host_triple(&self) -> &str {
+ &self.build_config.host_triple
+ }
+
+ /// Return the target triple which this context is targeting.
+ pub fn target_triple(&self) -> &str {
+ self.requested_target().unwrap_or(self.host_triple())
+ }
+
+ /// Requested (not actual) target for the build
+ pub fn requested_target(&self) -> Option<&str> {
+ self.build_config.requested_target.as_ref().map(|s| &s[..])
+ }
+
+ /// Get the short hash based only on the PackageId
+ /// Used for the metadata when target_metadata returns None
+ pub fn target_short_hash(&self, unit: &Unit) -> String {
+ let hashable = unit.pkg.package_id().stable_hash(self.ws.root());
+ util::short_hash(&hashable)
+ }
+
+ /// Get the metadata for a target in a specific profile
+ /// We build to the path: "{filename}-{target_metadata}"
+ /// We use a linking step to link/copy to a predictable filename
+ /// like `target/debug/libfoo.{a,so,rlib}` and such.
+ pub fn target_metadata(&mut self, unit: &Unit<'a>) -> Option<Metadata> {
+ if let Some(cache) = self.target_metadatas.get(unit) {
+ return cache.clone()
+ }
+
+ let metadata = self.calc_target_metadata(unit);
+ self.target_metadatas.insert(*unit, metadata.clone());
+ metadata
+ }
+
+ fn calc_target_metadata(&mut self, unit: &Unit<'a>) -> Option<Metadata> {
+ // No metadata for dylibs because of a couple issues
+ // - OSX encodes the dylib name in the executable
+        // - on Windows rustc emits multiple files and we can't easily link all of them
+ //
+ // No metadata for bin because of an issue
+ // - wasm32 rustc/emcc encodes the .wasm name in the .js (rust-lang/cargo#4535)
+ //
+ // Two exceptions
+ // 1) Upstream dependencies (we aren't exporting + need to resolve name conflict)
+ // 2) __CARGO_DEFAULT_LIB_METADATA env var
+ //
+ // Note, though, that the compiler's build system at least wants
+        // path dependencies (e.g. libstd) to have hashes in filenames. To account for
+ // that we have an extra hack here which reads the
+ // `__CARGO_DEFAULT_LIB_METADATA` environment variable and creates a
+ // hash in the filename if that's present.
+ //
+ // This environment variable should not be relied on! It's
+        // just here for rustbuild. We need a more principled method of
+        // doing this eventually.
+ let __cargo_default_lib_metadata = env::var("__CARGO_DEFAULT_LIB_METADATA");
+ if !unit.profile.test &&
+ (unit.target.is_dylib() || unit.target.is_cdylib() ||
+ (unit.target.is_bin() && self.target_triple().starts_with("wasm32-"))) &&
+ unit.pkg.package_id().source_id().is_path() &&
+        __cargo_default_lib_metadata.is_err()
+ {
+ return None;
+ }
+
+ let mut hasher = SipHasher::new_with_keys(0, 0);
+
+ // Unique metadata per (name, source, version) triple. This'll allow us
+ // to pull crates from anywhere w/o worrying about conflicts
+ unit.pkg.package_id().stable_hash(self.ws.root()).hash(&mut hasher);
+
+ // Add package properties which map to environment variables
+ // exposed by Cargo
+ let manifest_metadata = unit.pkg.manifest().metadata();
+ manifest_metadata.authors.hash(&mut hasher);
+ manifest_metadata.description.hash(&mut hasher);
+ manifest_metadata.homepage.hash(&mut hasher);
+
+ // Also mix in enabled features to our metadata. This'll ensure that
+ // when changing feature sets each lib is separately cached.
+ self.resolve.features_sorted(unit.pkg.package_id()).hash(&mut hasher);
+
+ // Mix in the target-metadata of all the dependencies of this target
+ if let Ok(deps) = self.dep_targets(unit) {
+ let mut deps_metadata = deps.into_iter().map(|dep_unit| {
+ self.target_metadata(&dep_unit)
+ }).collect::<Vec<_>>();
+ deps_metadata.sort();
+ deps_metadata.hash(&mut hasher);
+ }
+
+ // Throw in the profile we're compiling with. This helps caching
+ // panic=abort and panic=unwind artifacts, additionally with various
+ // settings like debuginfo and whatnot.
+ unit.profile.hash(&mut hasher);
+
+ // Artifacts compiled for the host should have a different metadata
+ // piece than those compiled for the target, so make sure we throw in
+ // the unit's `kind` as well
+ unit.kind.hash(&mut hasher);
+
+ // Finally throw in the target name/kind. This ensures that concurrent
+ // compiles of targets in the same crate don't collide.
+ unit.target.name().hash(&mut hasher);
+ unit.target.kind().hash(&mut hasher);
+
+ if let Ok(rustc) = self.config.rustc() {
+ rustc.verbose_version.hash(&mut hasher);
+ }
+
+ // Seed the contents of __CARGO_DEFAULT_LIB_METADATA to the hasher if present.
+ // This should be the release channel, to get a different hash for each channel.
+ if let Ok(ref channel) = __cargo_default_lib_metadata {
+ channel.hash(&mut hasher);
+ }
+
+ Some(Metadata(hasher.finish()))
+ }
+
+ /// Returns the file stem for a given target/profile combo (with metadata)
+ pub fn file_stem(&mut self, unit: &Unit<'a>) -> String {
+ match self.target_metadata(unit) {
+ Some(ref metadata) => format!("{}-{}", unit.target.crate_name(),
+ metadata),
+ None => self.bin_stem(unit),
+ }
+ }
+
+ /// Returns the bin stem for a given target (without metadata)
+ fn bin_stem(&self, unit: &Unit) -> String {
+ if unit.target.allows_underscores() {
+ unit.target.name().to_string()
+ } else {
+ unit.target.crate_name()
+ }
+ }
+
+ /// Returns a tuple with the directory and name of the hard link we expect
+    /// our target to be copied to. E.g., file_stem may be out_dir/deps/foo-abcdef
+ /// and link_stem would be out_dir/foo
+ /// This function returns it in two parts so the caller can add prefix/suffix
+ /// to filename separately
+ ///
+ /// Returns an Option because in some cases we don't want to link
+    /// (e.g. a dependent lib).
+ pub fn link_stem(&mut self, unit: &Unit<'a>) -> Option<(PathBuf, String)> {
+ let src_dir = self.out_dir(unit);
+ let bin_stem = self.bin_stem(unit);
+ let file_stem = self.file_stem(unit);
+
+ // We currently only lift files up from the `deps` directory. If
+ // it was compiled into something like `example/` or `doc/` then
+ // we don't want to link it up.
+ if src_dir.ends_with("deps") {
+ // Don't lift up library dependencies
+ if self.ws.members().find(|&p| p == unit.pkg).is_none() &&
+ !unit.target.is_bin() {
+ None
+ } else {
+ Some((
+ src_dir.parent().unwrap().to_owned(),
+ if unit.profile.test {file_stem} else {bin_stem},
+ ))
+ }
+ } else if bin_stem == file_stem {
+ None
+ } else if src_dir.ends_with("examples")
+ || src_dir.parent().unwrap().ends_with("build") {
+ Some((src_dir, bin_stem))
+ } else {
+ None
+ }
+ }
+
+ /// Return the filenames that the given target for the given profile will
+ /// generate as a list of 3-tuples (filename, link_dst, linkable)
+ ///
+ /// - filename: filename rustc compiles to. (Often has metadata suffix).
+ /// - link_dst: Optional file to link/copy the result to (without metadata suffix)
+    /// - linkable: Whether it is possible to link against the file (e.g. it's a library)
+ pub fn target_filenames(&mut self, unit: &Unit<'a>)
+ -> CargoResult<Arc<Vec<(PathBuf, Option<PathBuf>, TargetFileType)>>> {
+ if let Some(cache) = self.target_filenames.get(unit) {
+ return Ok(Arc::clone(cache))
+ }
+
+ let result = self.calc_target_filenames(unit);
+ if let Ok(ref ret) = result {
+ self.target_filenames.insert(*unit, Arc::clone(ret));
+ }
+ result
+ }
+
+ fn calc_target_filenames(&mut self, unit: &Unit<'a>)
+ -> CargoResult<Arc<Vec<(PathBuf, Option<PathBuf>, TargetFileType)>>> {
+ let out_dir = self.out_dir(unit);
+ let stem = self.file_stem(unit);
+ let link_stem = self.link_stem(unit);
+ let info = if unit.target.for_host() {
+ &self.host_info
+ } else {
+ &self.target_info
+ };
+
+ let mut ret = Vec::new();
+ let mut unsupported = Vec::new();
+ {
+ if unit.profile.check {
+ let filename = out_dir.join(format!("lib{}.rmeta", stem));
+ let link_dst = link_stem.clone().map(|(ld, ls)| {
+ ld.join(format!("lib{}.rmeta", ls))
+ });
+ ret.push((filename, link_dst, TargetFileType::Linkable));
+ } else {
+ let mut add = |crate_type: &str, file_type: TargetFileType| -> CargoResult<()> {
+ let crate_type = if crate_type == "lib" {"rlib"} else {crate_type};
+ let mut crate_types = info.crate_types.borrow_mut();
+ let entry = crate_types.entry(crate_type.to_string());
+ let crate_type_info = match entry {
+ Entry::Occupied(o) => &*o.into_mut(),
+ Entry::Vacant(v) => {
+ let value = info.discover_crate_type(v.key())?;
+ &*v.insert(value)
+ }
+ };
+ match *crate_type_info {
+ Some((ref prefix, ref suffix)) => {
+ let suffixes = add_target_specific_suffixes(
+ &self.target_triple(),
+ &crate_type,
+ unit.target.kind(),
+ suffix,
+ file_type,
+ );
+ for (suffix, file_type, should_replace_hyphens) in suffixes {
+ // wasm bin target will generate two files in deps such as
+ // "web-stuff.js" and "web_stuff.wasm". Note the different usages of
+ // "-" and "_". should_replace_hyphens is a flag to indicate that
+ // we need to convert the stem "web-stuff" to "web_stuff", so we
+ // won't miss "web_stuff.wasm".
+ let conv = |s: String| if should_replace_hyphens {
+ s.replace("-", "_")
+ } else {
+ s
+ };
+ let filename =
+ out_dir.join(format!("{}{}{}", prefix, conv(stem.clone()), suffix));
+ let link_dst = link_stem.clone().map(|(ld, ls)| {
+ ld.join(format!("{}{}{}", prefix, conv(ls), suffix))
+ });
+ ret.push((filename, link_dst, file_type));
+ }
+ Ok(())
+ }
+ // not supported, don't worry about it
+ None => {
+ unsupported.push(crate_type.to_string());
+ Ok(())
+ }
+ }
+ };
+ //info!("{:?}", unit);
+ match *unit.target.kind() {
+ TargetKind::Bin |
+ TargetKind::CustomBuild |
+ TargetKind::ExampleBin |
+ TargetKind::Bench |
+ TargetKind::Test => {
+ add("bin", TargetFileType::Normal)?;
+ }
+ TargetKind::Lib(..) |
+ TargetKind::ExampleLib(..)
+ if unit.profile.test => {
+ add("bin", TargetFileType::Normal)?;
+ }
+ TargetKind::ExampleLib(ref kinds) |
+ TargetKind::Lib(ref kinds) => {
+ for kind in kinds {
+ add(kind.crate_type(), if kind.linkable() {
+ TargetFileType::Linkable
+ } else {
+ TargetFileType::Normal
+ })?;
+ }
+ }
+ }
+ }
+ }
+ if ret.is_empty() {
+ if !unsupported.is_empty() {
+ bail!("cannot produce {} for `{}` as the target `{}` \
+ does not support these crate types",
+ unsupported.join(", "), unit.pkg, self.target_triple())
+ }
+ bail!("cannot compile `{}` as the target `{}` does not \
+ support any of the output crate types",
+ unit.pkg, self.target_triple());
+ }
+ info!("Target filenames: {:?}", ret);
+
+ Ok(Arc::new(ret))
+ }
+
+ /// For a package, return all targets which are registered as dependencies
+ /// for that package.
+ pub fn dep_targets(&self, unit: &Unit<'a>) -> CargoResult<Vec<Unit<'a>>> {
+ if unit.profile.run_custom_build {
+ return self.dep_run_custom_build(unit)
+ } else if unit.profile.doc && !unit.profile.test {
+ return self.doc_deps(unit);
+ }
+
+ let id = unit.pkg.package_id();
+ let deps = self.resolve.deps(id);
+ let mut ret = deps.filter(|dep| {
+ unit.pkg.dependencies().iter().filter(|d| {
+ d.name() == dep.name() && d.version_req().matches(dep.version())
+ }).any(|d| {
+ // If this target is a build command, then we only want build
+ // dependencies, otherwise we want everything *other than* build
+ // dependencies.
+ if unit.target.is_custom_build() != d.is_build() {
+ return false
+ }
+
+ // If this dependency is *not* a transitive dependency, then it
+ // only applies to test/example targets
+ if !d.is_transitive() && !unit.target.is_test() &&
+ !unit.target.is_example() && !unit.profile.test {
+ return false
+ }
+
+ // If this dependency is only available for certain platforms,
+ // make sure we're only enabling it for that platform.
+ if !self.dep_platform_activated(d, unit.kind) {
+ return false
+ }
+
+ // If the dependency is optional, then we're only activating it
+ // if the corresponding feature was activated
+ if d.is_optional() && !self.resolve.features(id).contains(d.name()) {
+ return false;
+ }
+
+ // If we've gotten past all that, then this dependency is
+ // actually used!
+ true
+ })
+ }).filter_map(|id| {
+ match self.get_package(id) {
+ Ok(pkg) => {
+ pkg.targets().iter().find(|t| t.is_lib()).map(|t| {
+ let unit = Unit {
+ pkg: pkg,
+ target: t,
+ profile: self.lib_or_check_profile(unit, t),
+ kind: unit.kind.for_target(t),
+ };
+ Ok(unit)
+ })
+ }
+ Err(e) => Some(Err(e))
+ }
+ }).collect::<CargoResult<Vec<_>>>()?;
+
+ // If this target is a build script, then what we've collected so far is
+ // all we need. If this isn't a build script, then it depends on the
+ // build script if there is one.
+ if unit.target.is_custom_build() {
+ return Ok(ret)
+ }
+ ret.extend(self.dep_build_script(unit));
+
+ // If this target is a binary, test, example, etc, then it depends on
+ // the library of the same package. The call to `resolve.deps` above
+ // didn't include `pkg` in the return values, so we need to special case
+ // it here and see if we need to push `(pkg, pkg_lib_target)`.
+ if unit.target.is_lib() && !unit.profile.doc {
+ return Ok(ret)
+ }
+ ret.extend(self.maybe_lib(unit));
+
+ // Integration tests/benchmarks require binaries to be built
+ if unit.profile.test &&
+ (unit.target.is_test() || unit.target.is_bench()) {
+ ret.extend(unit.pkg.targets().iter().filter(|t| {
+ let no_required_features = Vec::new();
+
+ t.is_bin() &&
+ // Skip binaries with required features that have not been selected.
+ t.required_features().unwrap_or(&no_required_features).iter().all(|f| {
+ self.resolve.features(id).contains(f)
+ })
+ }).map(|t| {
+ Unit {
+ pkg: unit.pkg,
+ target: t,
+ profile: self.lib_or_check_profile(unit, t),
+ kind: unit.kind.for_target(t),
+ }
+ }));
+ }
+ Ok(ret)
+ }
+
+ /// Returns the dependencies needed to run a build script.
+ ///
+ /// The `unit` provided must represent an execution of a build script, and
+ /// the returned set of units must all be run before `unit` is run.
+ pub fn dep_run_custom_build(&self, unit: &Unit<'a>)
+ -> CargoResult<Vec<Unit<'a>>> {
+ // If this build script's execution has been overridden then we don't
+ // actually depend on anything, we've reached the end of the dependency
+ // chain as we've got all the info we're gonna get.
+ let key = (unit.pkg.package_id().clone(), unit.kind);
+ if self.build_script_overridden.contains(&key) {
+ return Ok(Vec::new())
+ }
+
+ // When not overridden, then the dependencies to run a build script are:
+ //
+ // 1. Compiling the build script itself
+ // 2. For each immediate dependency of our package which has a `links`
+ // key, the execution of that build script.
+ let not_custom_build = unit.pkg.targets().iter().find(|t| {
+ !t.is_custom_build()
+ }).unwrap();
+ let tmp = Unit {
+ target: not_custom_build,
+ profile: &self.profiles.dev,
+ ..*unit
+ };
+ let deps = self.dep_targets(&tmp)?;
+ Ok(deps.iter().filter_map(|unit| {
+ if !unit.target.linkable() || unit.pkg.manifest().links().is_none() {
+ return None
+ }
+ self.dep_build_script(unit)
+ }).chain(Some(Unit {
+ profile: self.build_script_profile(unit.pkg.package_id()),
+ kind: Kind::Host, // build scripts always compiled for the host
+ ..*unit
+ })).collect())
+ }
+
+ /// Returns the dependencies necessary to document a package
+ fn doc_deps(&self, unit: &Unit<'a>) -> CargoResult<Vec<Unit<'a>>> {
+ let deps = self.resolve.deps(unit.pkg.package_id()).filter(|dep| {
+ unit.pkg.dependencies().iter().filter(|d| {
+ d.name() == dep.name()
+ }).any(|dep| {
+ match dep.kind() {
+ DepKind::Normal => self.dep_platform_activated(dep,
+ unit.kind),
+ _ => false,
+ }
+ })
+ }).map(|dep| {
+ self.get_package(dep)
+ });
+
+ // To document a library, we depend on dependencies actually being
+ // built. If we're documenting *all* libraries, then we also depend on
+ // the documentation of the library being built.
+ let mut ret = Vec::new();
+ for dep in deps {
+ let dep = dep?;
+ let lib = match dep.targets().iter().find(|t| t.is_lib()) {
+ Some(lib) => lib,
+ None => continue,
+ };
+ ret.push(Unit {
+ pkg: dep,
+ target: lib,
+ profile: self.lib_profile(),
+ kind: unit.kind.for_target(lib),
+ });
+ if self.build_config.doc_all {
+ ret.push(Unit {
+ pkg: dep,
+ target: lib,
+ profile: &self.profiles.doc,
+ kind: unit.kind.for_target(lib),
+ });
+ }
+ }
+
+ // Be sure to build/run the build script for documented libraries as well.
+ ret.extend(self.dep_build_script(unit));
+
+ // If we document a binary, we need the library available
+ if unit.target.is_bin() {
+ ret.extend(self.maybe_lib(unit));
+ }
+ Ok(ret)
+ }
+
+ /// If a build script is scheduled to be run for the package specified by
+ /// `unit`, this function will return the unit to run that build script.
+ ///
+ /// Overriding a build script simply means that the running of the build
+ /// script itself doesn't have any dependencies, so even in that case a unit
+ /// of work is still returned. `None` is only returned if the package has no
+ /// build script.
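+ ///
+ /// For illustration (hypothetical caller):
+ ///
+ /// ```ignore
+ /// if let Some(script_unit) = cx.dep_build_script(unit) {
+ ///     ret.push(script_unit); // returned even when the script is overridden
+ /// }
+ /// ```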
+ fn dep_build_script(&self, unit: &Unit<'a>) -> Option<Unit<'a>> {
+ unit.pkg.targets().iter().find(|t| t.is_custom_build()).map(|t| {
+ Unit {
+ pkg: unit.pkg,
+ target: t,
+ profile: &self.profiles.custom_build,
+ kind: unit.kind,
+ }
+ })
+ }
+
+ fn maybe_lib(&self, unit: &Unit<'a>) -> Option<Unit<'a>> {
+ unit.pkg.targets().iter().find(|t| t.linkable()).map(|t| {
+ Unit {
+ pkg: unit.pkg,
+ target: t,
+ profile: self.lib_or_check_profile(unit, t),
+ kind: unit.kind.for_target(t),
+ }
+ })
+ }
+
+ fn dep_platform_activated(&self, dep: &Dependency, kind: Kind) -> bool {
+ // If this dependency is only available for certain platforms,
+ // make sure we're only enabling it for that platform.
+ let platform = match dep.platform() {
+ Some(p) => p,
+ None => return true,
+ };
+ let (name, info) = match kind {
+ Kind::Host => (self.host_triple(), &self.host_info),
+ Kind::Target => (self.target_triple(), &self.target_info),
+ };
+ platform.matches(name, info.cfg.as_ref().map(|cfg| &cfg[..]))
+ }
+
+ /// Gets a package for the given package id.
+ pub fn get_package(&self, id: &PackageId) -> CargoResult<&'a Package> {
+ self.packages.get(id)
+ }
+
+ /// Get the user-specified linker for a particular host or target
+ pub fn linker(&self, kind: Kind) -> Option<&Path> {
+ self.target_config(kind).linker.as_ref().map(|s| s.as_ref())
+ }
+
+ /// Get the user-specified `ar` program for a particular host or target
+ pub fn ar(&self, kind: Kind) -> Option<&Path> {
+ self.target_config(kind).ar.as_ref().map(|s| s.as_ref())
+ }
+
+ /// Get the list of cfg printed out from the compiler for the specified kind
+ pub fn cfg(&self, kind: Kind) -> &[Cfg] {
+ let info = match kind {
+ Kind::Host => &self.host_info,
+ Kind::Target => &self.target_info,
+ };
+ info.cfg.as_ref().map(|s| &s[..]).unwrap_or(&[])
+ }
+
+ /// Get the target configuration for a particular host or target
+ fn target_config(&self, kind: Kind) -> &TargetConfig {
+ match kind {
+ Kind::Host => &self.build_config.host,
+ Kind::Target => &self.build_config.target,
+ }
+ }
+
+ /// Number of jobs specified for this build
+ pub fn jobs(&self) -> u32 { self.build_config.jobs }
+
+ pub fn lib_profile(&self) -> &'a Profile {
+ let (normal, test) = if self.build_config.release {
+ (&self.profiles.release, &self.profiles.bench_deps)
+ } else {
+ (&self.profiles.dev, &self.profiles.test_deps)
+ };
+ if self.build_config.test {
+ test
+ } else {
+ normal
+ }
+ }
+
+ pub fn lib_or_check_profile(&self, unit: &Unit, target: &Target) -> &'a Profile {
+ if unit.profile.check && !target.is_custom_build() && !target.for_host() {
+ &self.profiles.check
+ } else {
+ self.lib_profile()
+ }
+ }
+
+ pub fn build_script_profile(&self, _pkg: &PackageId) -> &'a Profile {
+ // TODO: should build scripts always be built with the same library
+ // profile? How is this controlled at the CLI layer?
+ self.lib_profile()
+ }
+
+ pub fn incremental_args(&self, unit: &Unit) -> CargoResult<Vec<String>> {
+ if self.incremental_enabled {
+ if unit.pkg.package_id().source_id().is_path() {
+ // Only enable incremental compilation for sources the user can modify.
+ // For things that change infrequently, non-incremental builds yield
+ // better performance.
+ // (see also https://github.com/rust-lang/cargo/issues/3972)
+ return Ok(vec![format!("-Zincremental={}",
+ self.layout(unit.kind).incremental().display())]);
+ } else if unit.profile.codegen_units.is_none() {
+ // For non-incremental builds we set a higher number of
+ // codegen units so we get faster compiles. It's OK to do
+ // so because the user has already opted into slower
+ // runtime code by setting CARGO_INCREMENTAL.
+ return Ok(vec![format!("-Ccodegen-units={}", ::num_cpus::get())]);
+ }
+ }
+
+ Ok(vec![])
+ }
+
+ pub fn rustflags_args(&self, unit: &Unit) -> CargoResult<Vec<String>> {
+ env_args(self.config, &self.build_config, self.info(&unit.kind), unit.kind, "RUSTFLAGS")
+ }
+
+ pub fn rustdocflags_args(&self, unit: &Unit) -> CargoResult<Vec<String>> {
+ env_args(self.config, &self.build_config, self.info(&unit.kind), unit.kind, "RUSTDOCFLAGS")
+ }
+
+ pub fn show_warnings(&self, pkg: &PackageId) -> bool {
+ pkg.source_id().is_path() || self.config.extra_verbose()
+ }
+
+ fn info(&self, kind: &Kind) -> &TargetInfo {
+ match *kind {
+ Kind::Host => &self.host_info,
+ Kind::Target => &self.target_info,
+ }
+ }
+}
+
+/// Acquire extra flags to pass to the compiler from various locations.
+///
+/// The locations are:
+///
+/// - the `RUSTFLAGS` environment variable
+///
+/// then if this was not found
+///
+/// - `target.*.rustflags` from the config (`.cargo/config`)
+/// - `target.cfg(..).rustflags` from the config
+///
+/// then if neither of these were found
+///
+/// - `build.rustflags` from the config
+///
+/// Note that if a `target` is specified, no args will be passed to host code (plugins, build
+/// scripts, ...), even if it is the same as the target.
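+///
+/// A condensed sketch of the precedence (illustrative only; `split` stands in
+/// for the whitespace splitting performed below):
+///
+/// ```ignore
+/// // 1. The environment variable, when set, wins outright:
+/// if let Ok(flags) = env::var("RUSTFLAGS") { return Ok(split(flags)); }
+/// // 2. Otherwise collect `target.<triple>.rustflags` plus any matching
+/// //    `target.'cfg(..)'.rustflags` entries from the config.
+/// // 3. If that found nothing, fall back to `build.rustflags`.
+/// ```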
+fn env_args(config: &Config,
+ build_config: &BuildConfig,
+ target_info: &TargetInfo,
+ kind: Kind,
+ name: &str) -> CargoResult<Vec<String>> {
+ // We *want* to apply RUSTFLAGS only to builds for the
+ // requested target architecture, and not to things like build
+ // scripts and plugins, which may be for an entirely different
+ // architecture. Cargo's present architecture makes it quite
+ // hard to only apply flags to things that are not build
+ // scripts and plugins though, so we do something more hacky
+ // instead to avoid applying the same RUSTFLAGS to multiple target
+ // architectures:
+ //
+ // 1) If --target is not specified we just apply RUSTFLAGS to
+ // all builds; they are all going to have the same target.
+ //
+ // 2) If --target *is* specified then we only apply RUSTFLAGS
+ // to compilation units with the Target kind, which indicates
+ // it was chosen by the --target flag.
+ //
+ // This means that, e.g., even if the specified --target is the
+ // same as the host, build scripts and plugins won't get
+ // RUSTFLAGS.
+ let compiling_with_target = build_config.requested_target.is_some();
+ let is_target_kind = kind == Kind::Target;
+
+ if compiling_with_target && !is_target_kind {
+ // This is probably a build script or plugin and we're
+ // compiling with --target. In this scenario there are
+ // no rustflags we can apply.
+ return Ok(Vec::new());
+ }
+
+ // First try RUSTFLAGS from the environment
+ if let Ok(a) = env::var(name) {
+ let args = a.split(' ')
+ .map(str::trim)
+ .filter(|s| !s.is_empty())
+ .map(str::to_string);
+ return Ok(args.collect());
+ }
+
+ let mut rustflags = Vec::new();
+
+ let name = name.chars().flat_map(|c| c.to_lowercase()).collect::<String>();
+ // Then the target.*.rustflags value...
+ let target = build_config.requested_target.as_ref().unwrap_or(&build_config.host_triple);
+ let key = format!("target.{}.{}", target, name);
+ if let Some(args) = config.get_list_or_split_string(&key)? {
+ let args = args.val.into_iter();
+ rustflags.extend(args);
+ }
+ // ...including target.'cfg(...)'.rustflags
+ if let Some(ref target_cfg) = target_info.cfg {
+ if let Some(table) = config.get_table("target")? {
+ let cfgs = table.val.keys().filter_map(|t| {
+ if t.starts_with("cfg(") && t.ends_with(')') {
+ let cfg = &t[4..t.len() - 1];
+ CfgExpr::from_str(cfg)
+ .ok()
+ .and_then(|c| if c.matches(target_cfg) { Some(t) } else { None })
+ } else {
+ None
+ }
+ });
+ for n in cfgs {
+ let key = format!("target.{}.{}", n, name);
+ if let Some(args) = config.get_list_or_split_string(&key)? {
+ let args = args.val.into_iter();
+ rustflags.extend(args);
+ }
+ }
+ }
+ }
+
+ if !rustflags.is_empty() {
+ return Ok(rustflags);
+ }
+
+ // Then the build.rustflags value
+ let key = format!("build.{}", name);
+ if let Some(args) = config.get_list_or_split_string(&key)? {
+ let args = args.val.into_iter();
+ return Ok(args.collect());
+ }
+
+ Ok(Vec::new())
+}
+
+impl fmt::Display for Metadata {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "{:016x}", self.0)
+ }
+}
+
+/// Takes rustc output (obtained with specialized command line args) and calculates the file
+/// prefix and suffix for the given crate type (e.g. for a Rust library like libcargo.rlib,
+/// prefix = "lib" and suffix = ".rlib"), or returns None if the type is not supported.
+///
+/// The caller needs to ensure that the lines object is at the correct line for the given crate
+/// type: this is not checked.
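+///
+/// For example (assuming rustc was probed with the placeholder crate name
+/// `___`), the line `lib___.rlib` parses to prefix `"lib"` and suffix
+/// `".rlib"`.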
+// This function cannot handle more than one file per type (with wasm32-unknown-emscripten,
+// there are two files for bin: .wasm and .js).
+fn parse_crate_type(
+ crate_type: &str,
+ error: &str,
+ lines: &mut str::Lines,
+) -> CargoResult<Option<(String, String)>> {
+ let not_supported = error.lines().any(|line| {
+ (line.contains("unsupported crate type") ||
+ line.contains("unknown crate type")) &&
+ line.contains(crate_type)
+ });
+ if not_supported {
+ return Ok(None);
+ }
+ let line = match lines.next() {
+ Some(line) => line,
+ None => bail!("malformed output when learning about \
+ crate-type {} information", crate_type),
+ };
+ let mut parts = line.trim().split("___");
+ let prefix = parts.next().unwrap();
+ let suffix = match parts.next() {
+ Some(part) => part,
+ None => bail!("output of --print=file-names has changed in \
+ the compiler, cannot parse"),
+ };
+
+ Ok(Some((prefix.to_string(), suffix.to_string())))
+}
+
+// (not a rustdoc)
+// Return a list of 3-tuples (suffix, file_type, should_replace_hyphens).
+//
+// should_replace_hyphens will be used by the caller to replace "-" with "_"
+// in a bin_stem. See the caller side (calc_target_filenames()) for details.
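+//
+// For example, per the rust-lang/cargo#4500 case below, a `dylib` built for
+// `x86_64-pc-windows-msvc` with suffix ".dll" also gains a ".dll.lib" import
+// library entry.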
+fn add_target_specific_suffixes(
+ target_triple: &str,
+ crate_type: &str,
+ target_kind: &TargetKind,
+ suffix: &str,
+ file_type: TargetFileType,
+) -> Vec<(String, TargetFileType, bool)> {
+ let mut ret = vec![(suffix.to_string(), file_type, false)];
+
+ // rust-lang/cargo#4500
+ if target_triple.ends_with("pc-windows-msvc") && crate_type.ends_with("dylib") &&
+ suffix == ".dll"
+ {
+ ret.push((".dll.lib".to_string(), TargetFileType::Normal, false));
+ }
+
+ // rust-lang/cargo#4535
+ if target_triple.starts_with("wasm32-") && crate_type == "bin" &&
+ suffix == ".js"
+ {
+ ret.push((".wasm".to_string(), TargetFileType::Normal, true));
+ }
+
+ // rust-lang/cargo#4490
+ // - only uplift *.dSYM for binaries.
+ // tests are run directly from target/debug/deps/
+ // and examples are inside target/debug/examples/ which already have *.dSYM next to them
+ // so no need to do anything.
+ if target_triple.contains("-apple-") && *target_kind == TargetKind::Bin {
+ ret.push((".dSYM".to_string(), TargetFileType::DebugInfo, false));
+ }
+
+ ret
+}
--- /dev/null
+use std::collections::{HashMap, BTreeSet, HashSet};
+use std::fs;
+use std::path::{PathBuf, Path};
+use std::str;
+use std::sync::{Mutex, Arc};
+
+use core::PackageId;
+use util::{Freshness, Cfg};
+use util::errors::{CargoResult, CargoResultExt, CargoError};
+use util::{internal, profile, paths};
+use util::machine_message;
+
+use super::job::Work;
+use super::{fingerprint, Kind, Context, Unit};
+
+/// Contains the parsed output of a custom build script.
+#[derive(Clone, Debug, Hash)]
+pub struct BuildOutput {
+ /// Paths to pass to rustc with the `-L` flag
+ pub library_paths: Vec<PathBuf>,
+ /// Names and link kinds of libraries, suitable for the `-l` flag
+ pub library_links: Vec<String>,
+ /// Various `--cfg` flags to pass to the compiler
+ pub cfgs: Vec<String>,
+ /// Additional environment variables to run the compiler with.
+ pub env: Vec<(String, String)>,
+ /// Metadata to pass to the immediate dependencies
+ pub metadata: Vec<(String, String)>,
+ /// Paths to trigger a rerun of this build script.
+ pub rerun_if_changed: Vec<String>,
+ /// Environment variables which, when changed, will cause a rebuild.
+ pub rerun_if_env_changed: Vec<String>,
+ /// Warnings generated by this build script.
+ pub warnings: Vec<String>,
+}
+
+/// Map of packages to build info
+pub type BuildMap = HashMap<(PackageId, Kind), BuildOutput>;
+
+/// Build info and overrides
+pub struct BuildState {
+ pub outputs: Mutex<BuildMap>,
+ overrides: HashMap<(String, Kind), BuildOutput>,
+}
+
+#[derive(Default)]
+pub struct BuildScripts {
+ // Cargo will use this `to_link` vector to add -L flags to compiles as we
+ // propagate them upwards towards the final build. Note, however, that we
+ // need to preserve the ordering of `to_link` to be topologically sorted.
+ // This will ensure that build scripts which print their paths properly will
+ // correctly pick up the files they generated (if there are duplicates
+ // elsewhere).
+ //
+ // To preserve this ordering, the (id, kind) is stored in two places, once
+ // in the `Vec` and once in `seen_to_link` for a fast lookup. We maintain
+ // this as we're building iteratively below to ensure that the memory
+ // usage here doesn't blow up too much.
+ //
+ // For more information, see #2354
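+ //
+ // Concretely, `add_to_link` at the bottom of this file performs the
+ // de-duplicated push that maintains this invariant.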
+ pub to_link: Vec<(PackageId, Kind)>,
+ seen_to_link: HashSet<(PackageId, Kind)>,
+ pub plugins: BTreeSet<PackageId>,
+}
+
+pub struct BuildDeps {
+ pub build_script_output: PathBuf,
+ pub rerun_if_changed: Vec<String>,
+ pub rerun_if_env_changed: Vec<String>,
+}
+
+/// Prepares a `Work` that executes the target as a custom build script.
+///
+/// The given `unit` encodes the requirement which this run of the build script
+/// will prepare work for. If the requirement is specified as both the target
+/// and the host platforms it is assumed that the two are equal and the build
+/// script is only run once (not twice).
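+///
+/// A hypothetical call-site sketch, mirroring how the results feed a `Job`:
+///
+/// ```ignore
+/// let (dirty, fresh, freshness) = custom_build::prepare(&mut cx, &unit)?;
+/// let job = Job::new(dirty, fresh); // `freshness` picks which half runs
+/// ```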
+pub fn prepare<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>)
+ -> CargoResult<(Work, Work, Freshness)> {
+ let _p = profile::start(format!("build script prepare: {}/{}",
+ unit.pkg, unit.target.name()));
+
+ let key = (unit.pkg.package_id().clone(), unit.kind);
+ let overridden = cx.build_script_overridden.contains(&key);
+ let (work_dirty, work_fresh) = if overridden {
+ (Work::noop(), Work::noop())
+ } else {
+ build_work(cx, unit)?
+ };
+
+ // Now that we've prepped our work, build the work needed to manage the
+ // fingerprint and then start returning that upwards.
+ let (freshness, dirty, fresh) =
+ fingerprint::prepare_build_cmd(cx, unit)?;
+
+ Ok((work_dirty.then(dirty), work_fresh.then(fresh), freshness))
+}
+
+fn build_work<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>)
+ -> CargoResult<(Work, Work)> {
+ let dependencies = cx.dep_run_custom_build(unit)?;
+ let build_script_unit = dependencies.iter().find(|d| {
+ !d.profile.run_custom_build && d.target.is_custom_build()
+ }).expect("running a script not depending on an actual script");
+ let script_output = cx.build_script_dir(build_script_unit);
+ let build_output = cx.build_script_out_dir(unit);
+
+ // Building the command to execute
+ let to_exec = script_output.join(unit.target.name());
+
+ // Start preparing the process to execute, starting out with some
+ // environment variables. Note that the profile-related environment
+ // variables are not set from the build script's own profile but rather
+ // from the package's library profile.
+ let profile = cx.lib_profile();
+ let to_exec = to_exec.into_os_string();
+ let mut cmd = cx.compilation.host_process(to_exec, unit.pkg)?;
+ cmd.env("OUT_DIR", &build_output)
+ .env("CARGO_MANIFEST_DIR", unit.pkg.root())
+ .env("NUM_JOBS", &cx.jobs().to_string())
+ .env("TARGET", &match unit.kind {
+ Kind::Host => cx.host_triple(),
+ Kind::Target => cx.target_triple(),
+ })
+ .env("DEBUG", &profile.debuginfo.is_some().to_string())
+ .env("OPT_LEVEL", &profile.opt_level)
+ .env("PROFILE", if cx.build_config.release { "release" } else { "debug" })
+ .env("HOST", cx.host_triple())
+ .env("RUSTC", &cx.config.rustc()?.path)
+ .env("RUSTDOC", &*cx.config.rustdoc()?)
+ .inherit_jobserver(&cx.jobserver);
+
+ if let Some(links) = unit.pkg.manifest().links() {
+ cmd.env("CARGO_MANIFEST_LINKS", links);
+ }
+
+ // Be sure to pass along all enabled features for this package, this is the
+ // last piece of statically known information that we have.
+ for feat in cx.resolve.features(unit.pkg.package_id()).iter() {
+ cmd.env(&format!("CARGO_FEATURE_{}", super::envify(feat)), "1");
+ }
+
+ let mut cfg_map = HashMap::new();
+ for cfg in cx.cfg(unit.kind) {
+ match *cfg {
+ Cfg::Name(ref n) => { cfg_map.insert(n.clone(), None); }
+ Cfg::KeyPair(ref k, ref v) => {
+ match *cfg_map.entry(k.clone()).or_insert(Some(Vec::new())) {
+ Some(ref mut values) => values.push(v.clone()),
+ None => { /* key already recorded as a name-only cfg; keep it */ }
+ }
+ }
+ }
+ }
+ for (k, v) in cfg_map {
+ let k = format!("CARGO_CFG_{}", super::envify(&k));
+ match v {
+ Some(list) => { cmd.env(&k, list.join(",")); }
+ None => { cmd.env(&k, ""); }
+ }
+ }
+
+ // Gather the set of native dependencies that this package has along with
+ // some other variables to close over.
+ //
+ // This information will be used at build-time later on to figure out which
+ // sorts of variables need to be discovered at that time.
+ let lib_deps = {
+ dependencies.iter().filter_map(|unit| {
+ if unit.profile.run_custom_build {
+ Some((unit.pkg.manifest().links().unwrap().to_string(),
+ unit.pkg.package_id().clone()))
+ } else {
+ None
+ }
+ }).collect::<Vec<_>>()
+ };
+ let pkg_name = unit.pkg.to_string();
+ let build_state = Arc::clone(&cx.build_state);
+ let id = unit.pkg.package_id().clone();
+ let (output_file, err_file) = {
+ let build_output_parent = build_output.parent().unwrap();
+ let output_file = build_output_parent.join("output");
+ let err_file = build_output_parent.join("stderr");
+ (output_file, err_file)
+ };
+ let all = (id.clone(), pkg_name.clone(), Arc::clone(&build_state),
+ output_file.clone());
+ let build_scripts = super::load_build_deps(cx, unit);
+ let kind = unit.kind;
+ let json_messages = cx.build_config.json_messages;
+
+ // Check to see if the build script has already run, and if it has, keep
+ // track of whether it has told us about some explicit dependencies.
+ let prev_output = BuildOutput::parse_file(&output_file, &pkg_name).ok();
+ let deps = BuildDeps::new(&output_file, prev_output.as_ref());
+ cx.build_explicit_deps.insert(*unit, deps);
+
+ fs::create_dir_all(&script_output)?;
+ fs::create_dir_all(&build_output)?;
+
+ let root_output = cx.target_root().to_path_buf();
+
+ // Prepare the unit of "dirty work" which will actually run the custom build
+ // command.
+ //
+ // Note that this has to do some extra work just before running the command
+ // to determine extra environment variables and such.
+ let dirty = Work::new(move |state| {
+ // Make sure that OUT_DIR exists.
+ //
+ // If we have an old build directory, then just move it into place,
+ // otherwise create it!
+ if fs::metadata(&build_output).is_err() {
+ fs::create_dir(&build_output).chain_err(|| {
+ internal("failed to create script output directory for \
+ build command")
+ })?;
+ }
+
+ // For all our native lib dependencies, pick up their metadata to pass
+ // along to this custom build command. We're also careful to augment our
+ // dynamic library search path in case the build script depended on any
+ // native dynamic libraries.
+ {
+ let build_state = build_state.outputs.lock().unwrap();
+ for (name, id) in lib_deps {
+ let key = (id.clone(), kind);
+ let state = build_state.get(&key).ok_or_else(|| {
+ internal(format!("failed to locate build state for env \
+ vars: {}/{:?}", id, kind))
+ })?;
+ let data = &state.metadata;
+ for &(ref key, ref value) in data.iter() {
+ cmd.env(&format!("DEP_{}_{}", super::envify(&name),
+ super::envify(key)), value);
+ }
+ }
+ if let Some(build_scripts) = build_scripts {
+ super::add_plugin_deps(&mut cmd, &build_state,
+ &build_scripts,
+ &root_output)?;
+ }
+ }
+
+ // And now finally, run the build command itself!
+ state.running(&cmd);
+ let output = cmd.exec_with_streaming(
+ &mut |out_line| { state.stdout(out_line); Ok(()) },
+ &mut |err_line| { state.stderr(err_line); Ok(()) },
+ true,
+ ).map_err(|e| {
+ CargoError::from(
+ format!("failed to run custom build command for `{}`\n{}",
+ pkg_name, e.description()))
+
+ })?;
+
+ // After the build command has finished running, we need to be sure to
+ // remember all of its output so we can later discover precisely what it
+ // was, even if we don't run the build command again (due to freshness).
+ //
+ // This is also the location where we provide feedback into the build
+ // state informing what variables were discovered via our script as
+ // well.
+ paths::write(&output_file, &output.stdout)?;
+ paths::write(&err_file, &output.stderr)?;
+ let parsed_output = BuildOutput::parse(&output.stdout, &pkg_name)?;
+
+ if json_messages {
+ let library_paths = parsed_output.library_paths.iter().map(|l| {
+ l.display().to_string()
+ }).collect::<Vec<_>>();
+ machine_message::emit(&machine_message::BuildScript {
+ package_id: &id,
+ linked_libs: &parsed_output.library_links,
+ linked_paths: &library_paths,
+ cfgs: &parsed_output.cfgs,
+ env: &parsed_output.env,
+ });
+ }
+
+ build_state.insert(id, kind, parsed_output);
+ Ok(())
+ });
+
+ // Now that we've prepared our work-to-do, we need to prepare the fresh work
+ // itself to run when we actually end up just discarding what we calculated
+ // above.
+ let fresh = Work::new(move |_tx| {
+ let (id, pkg_name, build_state, output_file) = all;
+ let output = match prev_output {
+ Some(output) => output,
+ None => BuildOutput::parse_file(&output_file, &pkg_name)?,
+ };
+ build_state.insert(id, kind, output);
+ Ok(())
+ });
+
+ Ok((dirty, fresh))
+}
+
+impl BuildState {
+ pub fn new(config: &super::BuildConfig) -> BuildState {
+ let mut overrides = HashMap::new();
+ let i1 = config.host.overrides.iter().map(|p| (p, Kind::Host));
+ let i2 = config.target.overrides.iter().map(|p| (p, Kind::Target));
+ for ((name, output), kind) in i1.chain(i2) {
+ overrides.insert((name.clone(), kind), output.clone());
+ }
+ BuildState {
+ outputs: Mutex::new(HashMap::new()),
+ overrides: overrides,
+ }
+ }
+
+ fn insert(&self, id: PackageId, kind: Kind, output: BuildOutput) {
+ self.outputs.lock().unwrap().insert((id, kind), output);
+ }
+}
+
+impl BuildOutput {
+ pub fn parse_file(path: &Path, pkg_name: &str) -> CargoResult<BuildOutput> {
+ let contents = paths::read_bytes(path)?;
+ BuildOutput::parse(&contents, pkg_name)
+ }
+
+ // Parses the output of a script.
+ // The `pkg_name` is used for error messages.
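+ //
+ // For illustration, a line like `cargo:rustc-link-search=/path/to/lib`
+ // lands in `library_paths`, `cargo:rustc-cfg=foo` lands in `cfgs`, and an
+ // unrecognized key such as `cargo:root=/path` is kept in `metadata`.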
+ pub fn parse(input: &[u8], pkg_name: &str) -> CargoResult<BuildOutput> {
+ let mut library_paths = Vec::new();
+ let mut library_links = Vec::new();
+ let mut cfgs = Vec::new();
+ let mut env = Vec::new();
+ let mut metadata = Vec::new();
+ let mut rerun_if_changed = Vec::new();
+ let mut rerun_if_env_changed = Vec::new();
+ let mut warnings = Vec::new();
+ let whence = format!("build script of `{}`", pkg_name);
+
+ for line in input.split(|b| *b == b'\n') {
+ let line = match str::from_utf8(line) {
+ Ok(line) => line.trim(),
+ Err(..) => continue,
+ };
+ let mut iter = line.splitn(2, ':');
+ if iter.next() != Some("cargo") {
+ // skip this line since it doesn't start with "cargo:"
+ continue;
+ }
+ let data = match iter.next() {
+ Some(val) => val,
+ None => continue
+ };
+
+ // getting the `key=value` part of the line
+ let mut iter = data.splitn(2, '=');
+ let key = iter.next();
+ let value = iter.next();
+ let (key, value) = match (key, value) {
+ (Some(a), Some(b)) => (a, b.trim_right()),
+ // line started with `cargo:` but didn't match `key=value`
+ _ => bail!("Wrong output in {}: `{}`", whence, line),
+ };
+
+ match key {
+ "rustc-flags" => {
+ let (paths, links) =
+ BuildOutput::parse_rustc_flags(value, &whence)?;
+ library_links.extend(links.into_iter());
+ library_paths.extend(paths.into_iter());
+ }
+ "rustc-link-lib" => library_links.push(value.to_string()),
+ "rustc-link-search" => library_paths.push(PathBuf::from(value)),
+ "rustc-cfg" => cfgs.push(value.to_string()),
+ "rustc-env" => env.push(BuildOutput::parse_rustc_env(value, &whence)?),
+ "warning" => warnings.push(value.to_string()),
+ "rerun-if-changed" => rerun_if_changed.push(value.to_string()),
+ "rerun-if-env-changed" => rerun_if_env_changed.push(value.to_string()),
+ _ => metadata.push((key.to_string(), value.to_string())),
+ }
+ }
+
+ Ok(BuildOutput {
+ library_paths: library_paths,
+ library_links: library_links,
+ cfgs: cfgs,
+ env: env,
+ metadata: metadata,
+ rerun_if_changed: rerun_if_changed,
+ rerun_if_env_changed: rerun_if_env_changed,
+ warnings: warnings,
+ })
+ }
+
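+ /// Parses a `cargo:rustc-flags` value; only `-l` and `-L` flags are
+ /// accepted. For example, the (hypothetical) input `-L /opt/lib -l foo`
+ /// yields one library search path and one library link name.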
+ pub fn parse_rustc_flags(value: &str, whence: &str)
+ -> CargoResult<(Vec<PathBuf>, Vec<String>)> {
+ let value = value.trim();
+ let mut flags_iter = value.split(|c: char| c.is_whitespace())
+ .filter(|w| w.chars().any(|c| !c.is_whitespace()));
+ let (mut library_paths, mut library_links) = (Vec::new(), Vec::new());
+ while let Some(flag) = flags_iter.next() {
+ if flag != "-l" && flag != "-L" {
+ bail!("Only `-l` and `-L` flags are allowed in {}: `{}`",
+ whence, value)
+ }
+ let value = match flags_iter.next() {
+ Some(v) => v,
+ None => bail!("Flag in rustc-flags has no value in {}: `{}`",
+ whence, value)
+ };
+ match flag {
+ "-l" => library_links.push(value.to_string()),
+ "-L" => library_paths.push(PathBuf::from(value)),
+
+ // was already checked above
+ _ => bail!("only -l and -L flags are allowed")
+ };
+ }
+ Ok((library_paths, library_links))
+ }
+
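+ /// Parses a `cargo:rustc-env` value of the form `VAR=VALUE`; e.g. the
+ /// (hypothetical) input `GIT_HASH=abc123` yields `("GIT_HASH", "abc123")`.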
+ pub fn parse_rustc_env(value: &str, whence: &str)
+ -> CargoResult<(String, String)> {
+ let mut iter = value.splitn(2, '=');
+ let name = iter.next();
+ let val = iter.next();
+ match (name, val) {
+ (Some(n), Some(v)) => Ok((n.to_owned(), v.to_owned())),
+ _ => bail!("Variable rustc-env has no value in {}: {}", whence, value),
+ }
+ }
+}
+
+impl BuildDeps {
+ pub fn new(output_file: &Path, output: Option<&BuildOutput>) -> BuildDeps {
+ BuildDeps {
+ build_script_output: output_file.to_path_buf(),
+ rerun_if_changed: output.map(|p| &p.rerun_if_changed)
+ .cloned()
+ .unwrap_or_default(),
+ rerun_if_env_changed: output.map(|p| &p.rerun_if_env_changed)
+ .cloned()
+ .unwrap_or_default(),
+ }
+ }
+}
+
+/// Compute the `build_scripts` map in the `Context` which tracks what build
+/// scripts each package depends on.
+///
+/// The global `build_scripts` map lists for all (package, kind) tuples what set
+/// of packages' build script outputs must be considered. For example this lists
+/// all dependencies' `-L` flags which need to be propagated transitively.
+///
+/// The given set of targets to this function is the initial set of
+/// targets/profiles which are being built.
+pub fn build_map<'b, 'cfg>(cx: &mut Context<'b, 'cfg>,
+ units: &[Unit<'b>])
+ -> CargoResult<()> {
+ let mut ret = HashMap::new();
+ for unit in units {
+ build(&mut ret, cx, unit)?;
+ }
+ cx.build_scripts.extend(ret.into_iter().map(|(k, v)| {
+ (k, Arc::new(v))
+ }));
+ return Ok(());
+
+ // Recursive function to build up the map we're constructing. This function
+ // memoizes all of its return values as it goes along.
+ fn build<'a, 'b, 'cfg>(out: &'a mut HashMap<Unit<'b>, BuildScripts>,
+ cx: &mut Context<'b, 'cfg>,
+ unit: &Unit<'b>)
+ -> CargoResult<&'a BuildScripts> {
+ // Do a quick pre-flight check to see if we've already calculated the
+ // set of dependencies.
+ if out.contains_key(unit) {
+ return Ok(&out[unit])
+ }
+
+ {
+ let key = unit.pkg.manifest().links().map(|l| (l.to_string(), unit.kind));
+ let build_state = &cx.build_state;
+ if let Some(output) = key.and_then(|k| build_state.overrides.get(&k)) {
+ let key = (unit.pkg.package_id().clone(), unit.kind);
+ cx.build_script_overridden.insert(key.clone());
+ build_state
+ .outputs
+ .lock()
+ .unwrap()
+ .insert(key, output.clone());
+ }
+ }
+
+ let mut ret = BuildScripts::default();
+
+ if !unit.target.is_custom_build() && unit.pkg.has_custom_build() {
+ add_to_link(&mut ret, unit.pkg.package_id(), unit.kind);
+ }
+
+ // We want to invoke the compiler deterministically to be cache-friendly
+ // to rustc invocation caching schemes, so be sure to generate the same
+ // set of build script dependency orderings via sorting the targets that
+ // come out of the `Context`.
+ let mut targets = cx.dep_targets(unit)?;
+ targets.sort_by_key(|u| u.pkg.package_id());
+
+ for unit in targets.iter() {
+ let dep_scripts = build(out, cx, unit)?;
+
+ if unit.target.for_host() {
+ ret.plugins.extend(dep_scripts.to_link.iter()
+ .map(|p| &p.0).cloned());
+ } else if unit.target.linkable() {
+ for &(ref pkg, kind) in dep_scripts.to_link.iter() {
+ add_to_link(&mut ret, pkg, kind);
+ }
+ }
+ }
+
+ let prev = out.entry(*unit).or_insert(BuildScripts::default());
+ for (pkg, kind) in ret.to_link {
+ add_to_link(prev, &pkg, kind);
+ }
+ prev.plugins.extend(ret.plugins);
+ Ok(prev)
+ }
+
+ // When adding an entry to 'to_link' we only actually push it on if the
+ // script hasn't seen it yet (e.g. we don't push on duplicates).
+ fn add_to_link(scripts: &mut BuildScripts, pkg: &PackageId, kind: Kind) {
+ if scripts.seen_to_link.insert((pkg.clone(), kind)) {
+ scripts.to_link.push((pkg.clone(), kind));
+ }
+ }
+}
--- /dev/null
+use std::env;
+use std::fs::{self, File, OpenOptions};
+use std::hash::{self, Hasher};
+use std::io::prelude::*;
+use std::io::{BufReader, SeekFrom};
+use std::path::{Path, PathBuf};
+use std::sync::{Arc, Mutex};
+
+use filetime::FileTime;
+use serde::ser::{self, Serialize};
+use serde::de::{self, Deserialize};
+use serde_json;
+
+use core::{Package, TargetKind};
+use util;
+use util::{Fresh, Dirty, Freshness, internal, profile};
+use util::errors::{CargoResult, CargoResultExt};
+use util::paths;
+
+use super::job::Work;
+use super::context::{Context, Unit, TargetFileType};
+use super::custom_build::BuildDeps;
+
+/// A tuple result of the `prepare_foo` functions in this module.
+///
+/// The first element of the triple is whether the target in question is
+/// currently fresh or not, and the second two elements are work to perform when
+/// the target is dirty or fresh, respectively.
+///
+/// Both units of work are always generated because a fresh package may still be
+/// rebuilt if some upstream dependency changes.
+pub type Preparation = (Freshness, Work, Work);
+
+/// Prepare the necessary work for the fingerprint for a specific target.
+///
+/// When dealing with fingerprints, cargo gets to choose what granularity
+/// "freshness" is considered at. One option is considering freshness at the
+/// package level. This means that if anything in a package changes, the entire
+/// package is rebuilt, unconditionally. This simplicity comes at a cost,
+/// however, in that test-only changes will cause libraries to be rebuilt, which
+/// is quite unfortunate!
+///
+/// The cost was deemed high enough that fingerprints are now calculated at the
+/// layer of a target rather than a package. Each target can then be kept track
+/// of separately and only rebuilt as necessary. This requires cargo to
+/// understand what the inputs are to a target, so we drive rustc with the
+/// --dep-info flag to learn about all input files to a unit of compilation.
+///
+/// This function will calculate the fingerprint for a target and prepare the
+/// work necessary to either write the fingerprint or copy over all fresh files
+/// from the old directories to their new locations.
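+///
+/// A sketch of a hypothetical caller wiring the result into a job queue
+/// (`queue` is assumed to be a `JobQueue`):
+///
+/// ```ignore
+/// let (freshness, dirty, fresh) = fingerprint::prepare_target(&mut cx, &unit)?;
+/// queue.enqueue(&cx, &unit, Job::new(dirty, fresh), freshness)?;
+/// ```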
+pub fn prepare_target<'a, 'cfg>(cx: &mut Context<'a, 'cfg>,
+ unit: &Unit<'a>) -> CargoResult<Preparation> {
+ let _p = profile::start(format!("fingerprint: {} / {}",
+ unit.pkg.package_id(), unit.target.name()));
+ let new = cx.fingerprint_dir(unit);
+ let loc = new.join(&filename(cx, unit));
+
+ debug!("fingerprint at: {}", loc.display());
+
+ let fingerprint = calculate(cx, unit)?;
+ let compare = compare_old_fingerprint(&loc, &*fingerprint);
+ log_compare(unit, &compare);
+
+ // If our comparison failed (e.g. we're going to trigger a rebuild of this
+ // crate), then we also ensure the source of the crate passes all
+ // verification checks before we build it.
+ //
+ // The `Source::verify` method is intended to allow sources to execute
+ // pre-build checks to ensure that the relevant source code is all
+ // up-to-date and as expected. This is currently used primarily for
+ // directory sources which will use this hook to perform an integrity check
+ // on all files in the source to ensure they haven't changed. If they have
+ // changed then an error is issued.
+ if compare.is_err() {
+ let source_id = unit.pkg.package_id().source_id();
+ let sources = cx.packages.sources();
+ let source = sources.get(source_id).ok_or_else(|| {
+ internal("missing package source")
+ })?;
+ source.verify(unit.pkg.package_id())?;
+ }
+
+ let root = cx.out_dir(unit);
+ let mut missing_outputs = false;
+ if unit.profile.doc {
+ missing_outputs = !root.join(unit.target.crate_name())
+ .join("index.html").exists();
+ } else {
+ for &(ref src, ref link_dst, file_type) in cx.target_filenames(unit)?.iter() {
+ if file_type == TargetFileType::DebugInfo {
+ continue;
+ }
+ missing_outputs |= !src.exists();
+ if let Some(ref link_dst) = *link_dst {
+ missing_outputs |= !link_dst.exists();
+ }
+ }
+ }
+
+ let allow_failure = unit.profile.rustc_args.is_some();
+ let write_fingerprint = Work::new(move |_| {
+ match fingerprint.update_local() {
+ Ok(()) => {}
+ Err(..) if allow_failure => return Ok(()),
+ Err(e) => return Err(e)
+ }
+ write_fingerprint(&loc, &*fingerprint)
+ });
+
+ let fresh = compare.is_ok() && !missing_outputs;
+ Ok((if fresh {Fresh} else {Dirty}, write_fingerprint, Work::noop()))
+}
+
+/// A fingerprint can be considered to be a "short string" representing the
+/// state of a world for a package.
+///
+/// If a fingerprint ever changes, then the package itself needs to be
+/// recompiled. Inputs to the fingerprint include source code modifications,
+/// compiler flags, compiler version, etc. This structure is not simply a
+/// `String` due to the fact that some fingerprints cannot be calculated lazily.
+///
+/// Path sources, for example, use the mtime of the corresponding dep-info file
+/// as a fingerprint (all source files must be modified *before* this mtime).
+/// This dep-info file is not generated, however, until after the crate is
+/// compiled. As a result, this structure can be thought of as a fingerprint
+/// to-be. The actual value can be calculated via `hash()`, but the operation
+/// may fail as some files may not have been generated.
+///
+/// Note that dependencies are taken into account for fingerprints because rustc
+/// requires that whenever an upstream crate is recompiled, all downstream
+/// dependants are also recompiled. This is typically tracked through
+/// `DependencyQueue`, but it also needs to be retained here because Cargo can
+/// be interrupted while executing, losing the state of the `DependencyQueue`
+/// graph.
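+///
+/// For illustration, within this module (hypothetical locals):
+///
+/// ```ignore
+/// let fingerprint = calculate(&mut cx, &unit)?; // Arc<Fingerprint>
+/// let digest = fingerprint.hash();              // memoized 64-bit hash
+/// ```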
+#[derive(Serialize, Deserialize)]
+pub struct Fingerprint {
+ rustc: u64,
+ features: String,
+ target: u64,
+ profile: u64,
+ #[serde(serialize_with = "serialize_deps", deserialize_with = "deserialize_deps")]
+ deps: Vec<(String, Arc<Fingerprint>)>,
+ local: Vec<LocalFingerprint>,
+ #[serde(skip_serializing, skip_deserializing)]
+ memoized_hash: Mutex<Option<u64>>,
+ rustflags: Vec<String>,
+}
+
+fn serialize_deps<S>(deps: &[(String, Arc<Fingerprint>)], ser: S)
+ -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+{
+ deps.iter().map(|&(ref a, ref b)| {
+ (a, b.hash())
+ }).collect::<Vec<_>>().serialize(ser)
+}
+
+fn deserialize_deps<'de, D>(d: D) -> Result<Vec<(String, Arc<Fingerprint>)>, D::Error>
+ where D: de::Deserializer<'de>,
+{
+ let decoded = <Vec<(String, u64)>>::deserialize(d)?;
+ Ok(decoded.into_iter().map(|(name, hash)| {
+ (name, Arc::new(Fingerprint {
+ rustc: 0,
+ target: 0,
+ profile: 0,
+ local: vec![LocalFingerprint::Precalculated(String::new())],
+ features: String::new(),
+ deps: Vec::new(),
+ memoized_hash: Mutex::new(Some(hash)),
+ rustflags: Vec::new(),
+ }))
+ }).collect())
+}
+
+#[derive(Serialize, Deserialize, Hash)]
+enum LocalFingerprint {
+ Precalculated(String),
+ MtimeBased(MtimeSlot, PathBuf),
+ EnvBased(String, Option<String>),
+}
+
+struct MtimeSlot(Mutex<Option<FileTime>>);
+
+impl Fingerprint {
+ fn update_local(&self) -> CargoResult<()> {
+ let mut hash_busted = false;
+ for local in self.local.iter() {
+ match *local {
+ LocalFingerprint::MtimeBased(ref slot, ref path) => {
+ let meta = fs::metadata(path)
+ .chain_err(|| {
+ internal(format!("failed to stat `{}`", path.display()))
+ })?;
+ let mtime = FileTime::from_last_modification_time(&meta);
+ *slot.0.lock().unwrap() = Some(mtime);
+ }
+ LocalFingerprint::EnvBased(..) |
+ LocalFingerprint::Precalculated(..) => continue,
+ }
+ hash_busted = true;
+ }
+
+ if hash_busted {
+ *self.memoized_hash.lock().unwrap() = None;
+ }
+ Ok(())
+ }
+
+ fn hash(&self) -> u64 {
+ if let Some(s) = *self.memoized_hash.lock().unwrap() {
+ return s
+ }
+ let ret = util::hash_u64(self);
+ *self.memoized_hash.lock().unwrap() = Some(ret);
+ ret
+ }
+
+ fn compare(&self, old: &Fingerprint) -> CargoResult<()> {
+ if self.rustc != old.rustc {
+ bail!("rust compiler has changed")
+ }
+ if self.features != old.features {
+ bail!("features have changed: {} != {}", self.features, old.features)
+ }
+ if self.target != old.target {
+ bail!("target configuration has changed")
+ }
+ if self.profile != old.profile {
+ bail!("profile configuration has changed")
+ }
+ if self.rustflags != old.rustflags {
+ return Err(internal("RUSTFLAGS has changed"))
+ }
+ if self.local.len() != old.local.len() {
+ bail!("local lens changed");
+ }
+ for (new, old) in self.local.iter().zip(&old.local) {
+ match (new, old) {
+ (&LocalFingerprint::Precalculated(ref a),
+ &LocalFingerprint::Precalculated(ref b)) => {
+ if a != b {
+ bail!("precalculated components have changed: {} != {}",
+ a, b)
+ }
+ }
+ (&LocalFingerprint::MtimeBased(ref on_disk_mtime, ref ap),
+ &LocalFingerprint::MtimeBased(ref previously_built_mtime, ref bp)) => {
+ let on_disk_mtime = on_disk_mtime.0.lock().unwrap();
+ let previously_built_mtime = previously_built_mtime.0.lock().unwrap();
+
+ let should_rebuild = match (*on_disk_mtime, *previously_built_mtime) {
+ (None, None) => false,
+ (Some(_), None) | (None, Some(_)) => true,
+ (Some(on_disk), Some(previously_built)) => on_disk > previously_built,
+ };
+
+ if should_rebuild {
+ bail!("mtime based components have changed: previously {:?} now {:?}, \
+ paths are {:?} and {:?}",
+ *previously_built_mtime, *on_disk_mtime, ap, bp)
+ }
+ }
+ (&LocalFingerprint::EnvBased(ref akey, ref avalue),
+ &LocalFingerprint::EnvBased(ref bkey, ref bvalue)) => {
+ if *akey != *bkey {
+ bail!("env vars changed: {} != {}", akey, bkey);
+ }
+ if *avalue != *bvalue {
+ bail!("env var `{}` changed: previously {:?} now {:?}",
+ akey, bvalue, avalue)
+ }
+ }
+ _ => bail!("local fingerprint type has changed"),
+ }
+ }
+
+ if self.deps.len() != old.deps.len() {
+ bail!("number of dependencies has changed")
+ }
+ for (a, b) in self.deps.iter().zip(old.deps.iter()) {
+ if a.1.hash() != b.1.hash() {
+ bail!("new ({}) != old ({})", a.0, b.0)
+ }
+ }
+ Ok(())
+ }
+}
+
+impl hash::Hash for Fingerprint {
+ fn hash<H: Hasher>(&self, h: &mut H) {
+ let Fingerprint {
+ rustc,
+ ref features,
+ target,
+ profile,
+ ref deps,
+ ref local,
+ memoized_hash: _,
+ ref rustflags,
+ } = *self;
+ (rustc, features, target, profile, local, rustflags).hash(h);
+
+ h.write_usize(deps.len());
+ for &(ref name, ref fingerprint) in deps {
+ name.hash(h);
+ // use memoized dep hashes to avoid exponential blowup
+ h.write_u64(Fingerprint::hash(fingerprint));
+ }
+ }
+}
+
+impl hash::Hash for MtimeSlot {
+ fn hash<H: Hasher>(&self, h: &mut H) {
+ self.0.lock().unwrap().hash(h)
+ }
+}
+
+impl ser::Serialize for MtimeSlot {
+ fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ self.0.lock().unwrap().map(|ft| {
+ (ft.seconds_relative_to_1970(), ft.nanoseconds())
+ }).serialize(s)
+ }
+}
+
+impl<'de> de::Deserialize<'de> for MtimeSlot {
+ fn deserialize<D>(d: D) -> Result<MtimeSlot, D::Error>
+ where D: de::Deserializer<'de>,
+ {
+ let kind: Option<(u64, u32)> = de::Deserialize::deserialize(d)?;
+ Ok(MtimeSlot(Mutex::new(kind.map(|(s, n)| {
+ FileTime::from_seconds_since_1970(s, n)
+ }))))
+ }
+}
+
+/// Calculates the fingerprint for a package/target pair.
+///
+/// This fingerprint is used by Cargo to detect when a rebuild is needed, for example when:
+///
+/// * A non-path package changes (changes version, changes revision, etc).
+/// * Any dependency changes
+/// * The compiler changes
+/// * The set of features a package is built with changes
+/// * The profile a target is compiled with changes (e.g. opt-level changes)
+///
+/// Information like file modification time is only calculated for path
+/// dependencies and is calculated in `calculate_target_fresh`.
+fn calculate<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>)
+ -> CargoResult<Arc<Fingerprint>> {
+ if let Some(s) = cx.fingerprints.get(unit) {
+ return Ok(Arc::clone(s))
+ }
+
+ // Next, recursively calculate the fingerprint for all of our dependencies.
+ //
+ // Skip the fingerprints of build scripts as they may not always be
+ // available and the dirtiness propagation for modification is tracked
+ // elsewhere. Also skip fingerprints of binaries because they don't actually
+ // induce a recompile, they're just dependencies in the sense that they need
+ // to be built.
+ let deps = cx.dep_targets(unit)?;
+ let deps = deps.iter().filter(|u| {
+ !u.target.is_custom_build() && !u.target.is_bin()
+ }).map(|unit| {
+ calculate(cx, unit).map(|fingerprint| {
+ (unit.pkg.package_id().to_string(), fingerprint)
+ })
+ }).collect::<CargoResult<Vec<_>>>()?;
+
+ // And finally, calculate what our own local fingerprint is
+ let local = if use_dep_info(unit) {
+ let dep_info = dep_info_loc(cx, unit);
+ let mtime = dep_info_mtime_if_fresh(&dep_info)?;
+ LocalFingerprint::MtimeBased(MtimeSlot(Mutex::new(mtime)), dep_info)
+ } else {
+ let fingerprint = pkg_fingerprint(cx, unit.pkg)?;
+ LocalFingerprint::Precalculated(fingerprint)
+ };
+ let mut deps = deps;
+ deps.sort_by(|&(ref a, _), &(ref b, _)| a.cmp(b));
+ let extra_flags = if unit.profile.doc {
+ cx.rustdocflags_args(unit)?
+ } else {
+ cx.rustflags_args(unit)?
+ };
+ let fingerprint = Arc::new(Fingerprint {
+ rustc: util::hash_u64(&cx.config.rustc()?.verbose_version),
+ target: util::hash_u64(&unit.target),
+ profile: util::hash_u64(&unit.profile),
+ features: format!("{:?}", cx.resolve.features_sorted(unit.pkg.package_id())),
+ deps: deps,
+ local: vec![local],
+ memoized_hash: Mutex::new(None),
+ rustflags: extra_flags,
+ });
+ cx.fingerprints.insert(*unit, Arc::clone(&fingerprint));
+ Ok(fingerprint)
+}
+
+// We want to use the mtime for files if we're a path source, but if we're a
+// git/registry source, then the mtime of files may fluctuate, but they won't
+// change so long as the source itself remains constant (which is the
+// responsibility of the source)
+fn use_dep_info(unit: &Unit) -> bool {
+ let path = unit.pkg.summary().source_id().is_path();
+ !unit.profile.doc && path
+}
+
+/// Prepare the necessary work for the fingerprint of a build command.
+///
+/// Build commands are located on packages, not on targets. Additionally, we
+/// don't have --dep-info to drive calculation of the fingerprint of a build
+/// command. This brings up an interesting predicament which gives us a few
+/// options to figure out whether a build command is dirty or not:
+///
+/// 1. A build command is dirty if *any* file in a package changes. In theory
+/// all files are candidate for being used by the build command.
+/// 2. A build command is dirty if any file in a *specific directory* changes.
+/// This may lose information, as the build command may use files outside of
+/// that specific directory.
+/// 3. A build command must itself provide a dep-info-like file stating how it
+/// should be considered dirty or not.
+///
+/// The currently implemented solution is option (1), although it is planned to
+/// migrate to option (2) in the near future.
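+///
+/// Note that the `rerun-if-changed` and `rerun-if-env-changed` directives,
+/// handled below, already let a build script opt into option (3)-style
+/// tracking of its own inputs.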
+pub fn prepare_build_cmd<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>)
+ -> CargoResult<Preparation> {
+ let _p = profile::start(format!("fingerprint build cmd: {}",
+ unit.pkg.package_id()));
+ let new = cx.fingerprint_dir(unit);
+ let loc = new.join("build");
+
+ debug!("fingerprint at: {}", loc.display());
+
+ let (local, output_path) = build_script_local_fingerprints(cx, unit)?;
+ let mut fingerprint = Fingerprint {
+ rustc: 0,
+ target: 0,
+ profile: 0,
+ features: String::new(),
+ deps: Vec::new(),
+ local: local,
+ memoized_hash: Mutex::new(None),
+ rustflags: Vec::new(),
+ };
+ let compare = compare_old_fingerprint(&loc, &fingerprint);
+ log_compare(unit, &compare);
+
+ // When we write out the fingerprint, we may want to actually change the
+ // kind of fingerprint being recorded. If we started out using the
+ // `Precalculated` variant with the `pkg_fingerprint` (because the build
+ // script had never run, or its previous run listed no dependencies), but
+ // the build script then prints `rerun-if-changed`, we need to record
+ // what's necessary for that fingerprint.
+ //
+ // Hence, if there were any `rerun-if-changed` directives, we forcibly
+ // change the kind of fingerprint by reinterpreting the dependencies
+ // output by the build script.
+ let state = Arc::clone(&cx.build_state);
+ let key = (unit.pkg.package_id().clone(), unit.kind);
+ let root = unit.pkg.root().to_path_buf();
+ let write_fingerprint = Work::new(move |_| {
+ if let Some(output_path) = output_path {
+ let outputs = state.outputs.lock().unwrap();
+ let outputs = &outputs[&key];
+ if !outputs.rerun_if_changed.is_empty() ||
+ !outputs.rerun_if_env_changed.is_empty() {
+ let deps = BuildDeps::new(&output_path, Some(outputs));
+ fingerprint.local = local_fingerprints_deps(&deps, &root);
+ fingerprint.update_local()?;
+ }
+ }
+ write_fingerprint(&loc, &fingerprint)
+ });
+
+ Ok((if compare.is_ok() {Fresh} else {Dirty}, write_fingerprint, Work::noop()))
+}
+
+fn build_script_local_fingerprints<'a, 'cfg>(cx: &mut Context<'a, 'cfg>,
+ unit: &Unit<'a>)
+ -> CargoResult<(Vec<LocalFingerprint>, Option<PathBuf>)>
+{
+ let state = cx.build_state.outputs.lock().unwrap();
+ // First up, if this build script is entirely overridden, then we just
+ // return the hash of what we overrode it with.
+ //
+ // Note that the `None` here means that we don't want to update the local
+ // fingerprint afterwards because this is all just overridden.
+ if let Some(output) = state.get(&(unit.pkg.package_id().clone(), unit.kind)) {
+ debug!("override local fingerprints deps");
+ let s = format!("overridden build state with hash: {}",
+ util::hash_u64(output));
+ return Ok((vec![LocalFingerprint::Precalculated(s)], None))
+ }
+
+ // Next up we look at the previously listed dependencies for the build
+ // script. If there are none then we're in the "old mode" where we just
+ // assume that we're dirty if anything in the package changed. The
+ // `Some` here, though, means that we want to update our local fingerprints
+ // after we're done, as running this build script may have created more
+ // dependencies.
+ let deps = &cx.build_explicit_deps[unit];
+ let output = deps.build_script_output.clone();
+ if deps.rerun_if_changed.is_empty() && deps.rerun_if_env_changed.is_empty() {
+ debug!("old local fingerprints deps");
+ let s = pkg_fingerprint(cx, unit.pkg)?;
+ return Ok((vec![LocalFingerprint::Precalculated(s)], Some(output)))
+ }
+
+ // Ok so now we're in "new mode" where we can have files listed as
+ // dependencies as well as env vars listed as dependencies. Process them all
+ // here.
+ Ok((local_fingerprints_deps(deps, unit.pkg.root()), Some(output)))
+}
+
+fn local_fingerprints_deps(deps: &BuildDeps, root: &Path) -> Vec<LocalFingerprint> {
+ debug!("new local fingerprints deps");
+ let mut local = Vec::new();
+ if !deps.rerun_if_changed.is_empty() {
+ let output = &deps.build_script_output;
+ let deps = deps.rerun_if_changed.iter().map(|p| root.join(p));
+ let mtime = mtime_if_fresh(output, deps);
+ let mtime = MtimeSlot(Mutex::new(mtime));
+ local.push(LocalFingerprint::MtimeBased(mtime, output.clone()));
+ }
+
+ for var in deps.rerun_if_env_changed.iter() {
+ let val = env::var(var).ok();
+ local.push(LocalFingerprint::EnvBased(var.clone(), val));
+ }
+
+ local
+}
+
+fn write_fingerprint(loc: &Path, fingerprint: &Fingerprint) -> CargoResult<()> {
+ let hash = fingerprint.hash();
+ debug!("write fingerprint: {}", loc.display());
+ paths::write(loc, util::to_hex(hash).as_bytes())?;
+ paths::write(&loc.with_extension("json"),
+ &serde_json::to_vec(&fingerprint).unwrap())?;
+ Ok(())
+}
+
+/// Prepare for work when a package starts to build
+pub fn prepare_init<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>) -> CargoResult<()> {
+ let new1 = cx.fingerprint_dir(unit);
+
+ if fs::metadata(&new1).is_err() {
+ fs::create_dir(&new1)?;
+ }
+
+ Ok(())
+}
+
+pub fn dep_info_loc<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>) -> PathBuf {
+ cx.fingerprint_dir(unit).join(&format!("dep-{}", filename(cx, unit)))
+}
+
+fn compare_old_fingerprint(loc: &Path, new_fingerprint: &Fingerprint)
+ -> CargoResult<()> {
+ let old_fingerprint_short = paths::read(loc)?;
+ let new_hash = new_fingerprint.hash();
+
+ if util::to_hex(new_hash) == old_fingerprint_short {
+ return Ok(())
+ }
+
+ let old_fingerprint_json = paths::read(&loc.with_extension("json"))?;
+ let old_fingerprint = serde_json::from_str(&old_fingerprint_json)
+ .chain_err(|| internal("failed to deserialize json"))?;
+ new_fingerprint.compare(&old_fingerprint)
+}
+
+fn log_compare(unit: &Unit, compare: &CargoResult<()>) {
+ let ce = match *compare {
+ Ok(..) => return,
+ Err(ref e) => e,
+ };
+ info!("fingerprint error for {}: {}", unit.pkg, ce);
+
+ for cause in ce.iter() {
+ info!(" cause: {}", cause);
+ }
+}
+
+// Parse the dep-info into a list of paths
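+// For example, a line such as `target/debug/foo: src/main.rs src/lib.rs`
+// yields those two paths joined onto the cwd recorded at the start of the
+// file; spaces within paths arrive `\`-escaped and are unescaped here.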
+pub fn parse_dep_info(dep_info: &Path) -> CargoResult<Option<Vec<PathBuf>>> {
+ macro_rules! fs_try {
+ ($e:expr) => (match $e { Ok(e) => e, Err(..) => return Ok(None) })
+ }
+ let mut f = BufReader::new(fs_try!(File::open(dep_info)));
+ // See the comments in append_current_dir for where this cwd comes from.
+ let mut cwd = Vec::new();
+ if fs_try!(f.read_until(0, &mut cwd)) == 0 {
+ return Ok(None)
+ }
+ let cwd = util::bytes2path(&cwd[..cwd.len()-1])?;
+ let line = match f.lines().next() {
+ Some(Ok(line)) => line,
+ _ => return Ok(None),
+ };
+ let pos = line.find(": ").ok_or_else(|| {
+ internal(format!("dep-info not in an understood format: {}",
+ dep_info.display()))
+ })?;
+ let deps = &line[pos + 2..];
+
+ let mut paths = Vec::new();
+ let mut deps = deps.split(' ').map(|s| s.trim()).filter(|s| !s.is_empty());
+ while let Some(s) = deps.next() {
+ let mut file = s.to_string();
+ while file.ends_with('\\') {
+ file.pop();
+ file.push(' ');
+ file.push_str(deps.next().ok_or_else(|| {
+ internal("malformed dep-info format, trailing \\".to_string())
+ })?);
+ }
+ paths.push(cwd.join(&file));
+ }
+ Ok(Some(paths))
+}
+
+fn dep_info_mtime_if_fresh(dep_info: &Path) -> CargoResult<Option<FileTime>> {
+ if let Some(paths) = parse_dep_info(dep_info)? {
+ Ok(mtime_if_fresh(dep_info, paths.iter()))
+ } else {
+ Ok(None)
+ }
+}
+
+fn pkg_fingerprint(cx: &Context, pkg: &Package) -> CargoResult<String> {
+ let source_id = pkg.package_id().source_id();
+ let sources = cx.packages.sources();
+
+ let source = sources.get(source_id).ok_or_else(|| {
+ internal("missing package source")
+ })?;
+ source.fingerprint(pkg)
+}
+
+fn mtime_if_fresh<I>(output: &Path, paths: I) -> Option<FileTime>
+ where I: IntoIterator,
+ I::Item: AsRef<Path>,
+{
+ let meta = match fs::metadata(output) {
+ Ok(meta) => meta,
+ Err(..) => return None,
+ };
+ let mtime = FileTime::from_last_modification_time(&meta);
+
+ let any_stale = paths.into_iter().any(|path| {
+ let path = path.as_ref();
+ let meta = match fs::metadata(path) {
+ Ok(meta) => meta,
+ Err(..) => {
+ info!("stale: {} -- missing", path.display());
+ return true
+ }
+ };
+ let mtime2 = FileTime::from_last_modification_time(&meta);
+ if mtime2 > mtime {
+ info!("stale: {} -- {} vs {}", path.display(), mtime2, mtime);
+ true
+ } else {
+ false
+ }
+ });
+
+ if any_stale {
+ None
+ } else {
+ Some(mtime)
+ }
+}
+
+fn filename<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>) -> String {
+ // file_stem includes metadata hash. Thus we have a different
+ // fingerprint for every metadata hash version. This works because
+ // even if the package is fresh, we'll still link the fresh target
+ let file_stem = cx.file_stem(unit);
+ let kind = match *unit.target.kind() {
+ TargetKind::Lib(..) => "lib",
+ TargetKind::Bin => "bin",
+ TargetKind::Test => "integration-test",
+ TargetKind::ExampleBin |
+ TargetKind::ExampleLib(..) => "example",
+ TargetKind::Bench => "bench",
+ TargetKind::CustomBuild => "build-script",
+ };
+ let flavor = if unit.profile.test {
+ "test-"
+ } else if unit.profile.doc {
+ "doc-"
+ } else {
+ ""
+ };
+ format!("{}{}-{}", flavor, kind, file_stem)
+}
+
+// The dep-info files emitted by the compiler all have their listed paths
+// relative to whatever the current directory was at the time that the compiler
+// was invoked. As the current directory may change over time, we need to record
+// what that directory was at the beginning of the file so we can know about it
+// next time.
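+//
+// For illustration, after this runs the file contents look roughly like
+// `<cwd bytes>\0target/debug/foo: src/main.rs src/lib.rs`: a NUL-terminated
+// cwd prepended to rustc's own dep-info output, which is exactly what
+// `parse_dep_info` above reads back.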
+pub fn append_current_dir(path: &Path, cwd: &Path) -> CargoResult<()> {
+ debug!("appending {} <- {}", path.display(), cwd.display());
+ let mut f = OpenOptions::new().read(true).write(true).open(path)?;
+ let mut contents = Vec::new();
+ f.read_to_end(&mut contents)?;
+ f.seek(SeekFrom::Start(0))?;
+ f.write_all(util::path2bytes(cwd)?)?;
+ f.write_all(&[0])?;
+ f.write_all(&contents)?;
+ Ok(())
+}
--- /dev/null
+use std::fmt;
+
+use util::{CargoResult, Fresh, Dirty, Freshness};
+use super::job_queue::JobState;
+
+pub struct Job { dirty: Work, fresh: Work }
+
+/// A unit of work to execute. Each piece of work should send a description of
+/// itself (e.g. via `JobState::running`) before starting; it should either
+/// send it once or close the channel immediately.
+pub struct Work {
+ inner: Box<for <'a, 'b> FnBox<&'a JobState<'b>, CargoResult<()>> + Send>,
+}
+
+trait FnBox<A, R> {
+ fn call_box(self: Box<Self>, a: A) -> R;
+}
+
+impl<A, R, F: FnOnce(A) -> R> FnBox<A, R> for F {
+ fn call_box(self: Box<F>, a: A) -> R {
+ (*self)(a)
+ }
+}
+
+impl Work {
+ pub fn new<F>(f: F) -> Work
+ where F: FnOnce(&JobState) -> CargoResult<()> + Send + 'static
+ {
+ Work { inner: Box::new(f) }
+ }
+
+ pub fn noop() -> Work {
+ Work::new(|_| Ok(()))
+ }
+
+ pub fn call(self, tx: &JobState) -> CargoResult<()> {
+ self.inner.call_box(tx)
+ }
+
+ pub fn then(self, next: Work) -> Work {
+ Work::new(move |state| {
+ self.call(state)?;
+ next.call(state)
+ })
+ }
+}
+
+impl Job {
+ /// Create a new job representing a unit of work.
+ pub fn new(dirty: Work, fresh: Work) -> Job {
+ Job { dirty: dirty, fresh: fresh }
+ }
+
+ /// Consumes this job by running it, returning the result of the
+ /// computation.
+ pub fn run(self, fresh: Freshness, state: &JobState) -> CargoResult<()> {
+ match fresh {
+ Fresh => self.fresh.call(state),
+ Dirty => self.dirty.call(state),
+ }
+ }
+}
+
+impl fmt::Debug for Job {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "Job {{ ... }}")
+ }
+}
--- /dev/null
+use std::collections::HashSet;
+use std::collections::hash_map::HashMap;
+use std::fmt;
+use std::io;
+use std::mem;
+use std::sync::mpsc::{channel, Sender, Receiver};
+
+use crossbeam::{self, Scope};
+use jobserver::{Acquired, HelperThread};
+
+use core::{PackageId, Target, Profile};
+use util::{Config, DependencyQueue, Fresh, Dirty, Freshness};
+use util::{CargoResult, ProcessBuilder, profile, internal, CargoResultExt};
+use {handle_error};
+
+use super::{Context, Kind, Unit};
+use super::job::Job;
+
+/// A management structure of the entire dependency graph to compile.
+///
+/// This structure is backed by the `DependencyQueue` type and manages the
+/// actual compilation step of each package. Packages enqueue units of work and
+/// then later on the entire graph is processed and compiled.
+pub struct JobQueue<'a> {
+ queue: DependencyQueue<Key<'a>, Vec<(Job, Freshness)>>,
+ tx: Sender<Message<'a>>,
+ rx: Receiver<Message<'a>>,
+ active: usize,
+ pending: HashMap<Key<'a>, PendingBuild>,
+ compiled: HashSet<&'a PackageId>,
+ documented: HashSet<&'a PackageId>,
+ counts: HashMap<&'a PackageId, usize>,
+ is_release: bool,
+}
+
+/// A helper structure for metadata about the state of a building package.
+struct PendingBuild {
+ /// Number of jobs currently active
+ amt: usize,
+ /// Current freshness state of this package. Any dirty target within a
+ /// package will cause the entire package to become dirty.
+ fresh: Freshness,
+}
+
+#[derive(Clone, Copy, Eq, PartialEq, Hash)]
+struct Key<'a> {
+ pkg: &'a PackageId,
+ target: &'a Target,
+ profile: &'a Profile,
+ kind: Kind,
+}
+
+pub struct JobState<'a> {
+ tx: Sender<Message<'a>>,
+}
+
+enum Message<'a> {
+ Run(String),
+ Stdout(String),
+ Stderr(String),
+ Token(io::Result<Acquired>),
+ Finish(Key<'a>, CargoResult<()>),
+}
+
+impl<'a> JobState<'a> {
+ pub fn running(&self, cmd: &ProcessBuilder) {
+ let _ = self.tx.send(Message::Run(cmd.to_string()));
+ }
+
+ pub fn stdout(&self, out: &str) {
+ let _ = self.tx.send(Message::Stdout(out.to_string()));
+ }
+
+ pub fn stderr(&self, err: &str) {
+ let _ = self.tx.send(Message::Stderr(err.to_string()));
+ }
+}
+
+impl<'a> JobQueue<'a> {
+ pub fn new<'cfg>(cx: &Context<'a, 'cfg>) -> JobQueue<'a> {
+ let (tx, rx) = channel();
+ JobQueue {
+ queue: DependencyQueue::new(),
+ tx: tx,
+ rx: rx,
+ active: 0,
+ pending: HashMap::new(),
+ compiled: HashSet::new(),
+ documented: HashSet::new(),
+ counts: HashMap::new(),
+ is_release: cx.build_config.release,
+ }
+ }
+
+ pub fn enqueue<'cfg>(&mut self,
+ cx: &Context<'a, 'cfg>,
+ unit: &Unit<'a>,
+ job: Job,
+ fresh: Freshness) -> CargoResult<()> {
+ let key = Key::new(unit);
+ let deps = key.dependencies(cx)?;
+ self.queue.queue(Fresh, key, Vec::new(), &deps).push((job, fresh));
+ *self.counts.entry(key.pkg).or_insert(0) += 1;
+ Ok(())
+ }
+
+ /// Execute all jobs necessary to build the dependency graph.
+ ///
+ /// This function will spawn off `config.jobs()` workers to build all of the
+ /// necessary dependencies, in order. Freshness is propagated as far as
+ /// possible along each dependency chain.
+ pub fn execute(&mut self, cx: &mut Context) -> CargoResult<()> {
+ let _p = profile::start("executing the job graph");
+
+ // We need to give a handle to the send half of our message queue to the
+ // jobserver helper thread. Unfortunately though we need the handle to be
+ // `'static` as that's typically what's required when spawning a
+ // thread!
+ //
+ // To work around this we transmute the `Sender` to a static lifetime:
+ // we're only sending "longer living" messages, and all references to
+ // the channel are destroyed before this function exits, since the
+ // destructor for the `helper` object ensures the associated thread is
+ // no longer running.
+ //
+ // As a result, this `transmute` to a longer lifetime should be safe in
+ // practice.
+ let tx = self.tx.clone();
+ let tx = unsafe {
+ mem::transmute::<Sender<Message<'a>>, Sender<Message<'static>>>(tx)
+ };
+ let helper = cx.jobserver.clone().into_helper_thread(move |token| {
+ drop(tx.send(Message::Token(token)));
+ }).chain_err(|| {
+ "failed to create helper thread for jobserver management"
+ })?;
+
+ crossbeam::scope(|scope| {
+ self.drain_the_queue(cx, scope, &helper)
+ })
+ }
+
+ fn drain_the_queue(&mut self,
+ cx: &mut Context,
+ scope: &Scope<'a>,
+ jobserver_helper: &HelperThread)
+ -> CargoResult<()> {
+ use std::time::Instant;
+
+ let mut tokens = Vec::new();
+ let mut queue = Vec::new();
+ trace!("queue: {:#?}", self.queue);
+
+ // Iteratively execute the entire dependency graph. Each turn of the
+ // loop starts out by scheduling as much work as possible (up to the
+ // maximum number of parallel jobs we have tokens for). A local queue
+ // is maintained separately from the main dependency queue as one
+ // dequeue may actually dequeue quite a bit of work (e.g. 10 binaries
+ // in one project).
+ //
+ // After a job has finished we update our internal state if it was
+ // successful and otherwise wait for pending work to finish if it failed
+ // and then immediately return.
+ let mut error = None;
+ let start_time = Instant::now();
+ loop {
+ // Dequeue as much work as we can, learning about everything
+ // possible that can run. Note that this is also the point where we
+ // start requesting job tokens. Each job after the first needs to
+ // request a token.
+ while let Some((fresh, key, jobs)) = self.queue.dequeue() {
+ let total_fresh = jobs.iter().fold(fresh, |fresh, &(_, f)| {
+ f.combine(fresh)
+ });
+ self.pending.insert(key, PendingBuild {
+ amt: jobs.len(),
+ fresh: total_fresh,
+ });
+ for (job, f) in jobs {
+ queue.push((key, job, f.combine(fresh)));
+ if self.active + queue.len() > 1 {
+ jobserver_helper.request_token();
+ }
+ }
+ }
+
+ // Now that we've learned of all possible work that we can execute
+ // try to spawn it so long as we've got a jobserver token which says
+ // we're able to perform some parallel work.
+ while error.is_none() && self.active < tokens.len() + 1 && !queue.is_empty() {
+ let (key, job, fresh) = queue.remove(0);
+ self.run(key, fresh, job, cx.config, scope)?;
+ }
+
+ // If after all that we're not actually running anything then we're
+ // done!
+ if self.active == 0 {
+ break
+ }
+
+ // And finally, before we block waiting for the next event, drop any
+ // excess tokens we may have accidentally acquired. Due to how our
+ // jobserver interface is architected we may acquire a token that we
+ // don't actually use, and if this happens just relinquish it back
+ // to the jobserver itself.
+ tokens.truncate(self.active - 1);
+
+ match self.rx.recv().unwrap() {
+ Message::Run(cmd) => {
+ cx.config.shell().verbose(|c| c.status("Running", &cmd))?;
+ }
+ Message::Stdout(out) => {
+ if cx.config.extra_verbose() {
+ println!("{}", out);
+ }
+ }
+ Message::Stderr(err) => {
+ if cx.config.extra_verbose() {
+ writeln!(cx.config.shell().err(), "{}", err)?;
+ }
+ }
+ Message::Finish(key, result) => {
+ info!("end: {:?}", key);
+ self.active -= 1;
+ if self.active > 0 {
+ assert!(!tokens.is_empty());
+ drop(tokens.pop());
+ }
+ match result {
+ Ok(()) => self.finish(key, cx)?,
+ Err(e) => {
+ let msg = "The following warnings were emitted during compilation:";
+ self.emit_warnings(Some(msg), key, cx)?;
+
+ if self.active > 0 {
+ error = Some("build failed".into());
+ handle_error(e, &mut *cx.config.shell());
+ cx.config.shell().warn(
+ "build failed, waiting for other \
+ jobs to finish...")?;
+ } else {
+ error = Some(e);
+ }
+ }
+ }
+ }
+ Message::Token(acquired_token) => {
+ tokens.push(acquired_token.chain_err(|| {
+ "failed to acquire jobserver token"
+ })?);
+ }
+ }
+ }
+
+ let build_type = if self.is_release { "release" } else { "dev" };
+ let profile = cx.lib_profile();
+ let mut opt_type = String::from(if profile.opt_level == "0" { "unoptimized" }
+ else { "optimized" });
+ if profile.debuginfo.is_some() {
+ opt_type += " + debuginfo";
+ }
+ let duration = start_time.elapsed();
+ let time_elapsed = format!("{}.{:02} secs",
+ duration.as_secs(),
+ duration.subsec_nanos() / 10_000_000);
+ if self.queue.is_empty() {
+ let message = format!("{} [{}] target(s) in {}",
+ build_type,
+ opt_type,
+ time_elapsed);
+ cx.config.shell().status("Finished", message)?;
+ Ok(())
+ } else if let Some(e) = error {
+ Err(e)
+ } else {
+ debug!("queue: {:#?}", self.queue);
+ Err(internal("finished with jobs still left in the queue"))
+ }
+ }
+
+ /// Executes a job, spawning it onto `scope` if the job is dirty, or
+ /// running it inline on the current thread if it is fresh.
+ fn run(&mut self,
+ key: Key<'a>,
+ fresh: Freshness,
+ job: Job,
+ config: &Config,
+ scope: &Scope<'a>) -> CargoResult<()> {
+ info!("start: {:?}", key);
+
+ self.active += 1;
+ *self.counts.get_mut(key.pkg).unwrap() -= 1;
+
+ let my_tx = self.tx.clone();
+ let doit = move || {
+ let res = job.run(fresh, &JobState {
+ tx: my_tx.clone(),
+ });
+ my_tx.send(Message::Finish(key, res)).unwrap();
+ };
+ match fresh {
+ Freshness::Fresh => doit(),
+ Freshness::Dirty => { scope.spawn(doit); }
+ }
+
+ // Print out some nice progress information
+ self.note_working_on(config, &key, fresh)?;
+
+ Ok(())
+ }
+
+ fn emit_warnings(&self, msg: Option<&str>, key: Key<'a>, cx: &mut Context) -> CargoResult<()> {
+ let output = cx.build_state.outputs.lock().unwrap();
+ if let Some(output) = output.get(&(key.pkg.clone(), key.kind)) {
+ if let Some(msg) = msg {
+ if !output.warnings.is_empty() {
+ writeln!(cx.config.shell().err(), "{}\n", msg)?;
+ }
+ }
+
+ for warning in output.warnings.iter() {
+ cx.config.shell().warn(warning)?;
+ }
+
+ if !output.warnings.is_empty() && msg.is_some() {
+ // Output an empty line.
+ writeln!(cx.config.shell().err(), "")?;
+ }
+ }
+
+ Ok(())
+ }
+
+ fn finish(&mut self, key: Key<'a>, cx: &mut Context) -> CargoResult<()> {
+ if key.profile.run_custom_build && cx.show_warnings(key.pkg) {
+ self.emit_warnings(None, key, cx)?;
+ }
+
+ let state = self.pending.get_mut(&key).unwrap();
+ state.amt -= 1;
+ if state.amt == 0 {
+ self.queue.finish(&key, state.fresh);
+ }
+ Ok(())
+ }
+
+ // This isn't super trivial because we don't want to print loads and
+ // loads of information to the console, but we also want to produce a
+ // faithful representation of what's happening. This is somewhat nuanced
+ // as a package can start compiling *very* early on because of custom
+ // build commands and such.
+ //
+ // In general, we try to print "Compiling" for the first nontrivial task
+ // run for a package, regardless of when that is. We then don't print
+ // out any more information for a package after we've printed it once.
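+ // For example (illustrative): a package whose custom build script is
+ // compiled first still gets its single "Compiling" line at that point,
+ // and nothing further is printed once its library target starts later.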
+ fn note_working_on(&mut self,
+ config: &Config,
+ key: &Key<'a>,
+ fresh: Freshness) -> CargoResult<()> {
+ if (self.compiled.contains(key.pkg) && !key.profile.doc) ||
+ (self.documented.contains(key.pkg) && key.profile.doc) {
+ return Ok(())
+ }
+
+ match fresh {
+ // Any dirty stage which runs at least one command gets printed as
+ // being a compiled package
+ Dirty => {
+ if key.profile.doc {
+ if !key.profile.test {
+ self.documented.insert(key.pkg);
+ config.shell().status("Documenting", key.pkg)?;
+ }
+ } else {
+ self.compiled.insert(key.pkg);
+ config.shell().status("Compiling", key.pkg)?;
+ }
+ }
+ Fresh if self.counts[key.pkg] == 0 => {
+ self.compiled.insert(key.pkg);
+ config.shell().verbose(|c| c.status("Fresh", key.pkg))?;
+ }
+ Fresh => {}
+ }
+ Ok(())
+ }
+}
+
+impl<'a> Key<'a> {
+ fn new(unit: &Unit<'a>) -> Key<'a> {
+ Key {
+ pkg: unit.pkg.package_id(),
+ target: unit.target,
+ profile: unit.profile,
+ kind: unit.kind,
+ }
+ }
+
+ fn dependencies<'cfg>(&self, cx: &Context<'a, 'cfg>)
+ -> CargoResult<Vec<Key<'a>>> {
+ let unit = Unit {
+ pkg: cx.get_package(self.pkg)?,
+ target: self.target,
+ profile: self.profile,
+ kind: self.kind,
+ };
+ let targets = cx.dep_targets(&unit)?;
+ Ok(targets.iter().filter_map(|unit| {
+ // Binaries aren't actually needed to *compile* tests, just to run
+ // them, so we don't include this dependency edge in the job graph.
+ if self.target.is_test() && unit.target.is_bin() {
+ None
+ } else {
+ Some(Key::new(unit))
+ }
+ }).collect())
+ }
+}
+
+impl<'a> fmt::Debug for Key<'a> {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "{} => {}/{} => {:?}", self.pkg, self.target, self.profile,
+ self.kind)
+ }
+}
--- /dev/null
+//! Management of the directory layout of a build
+//!
+//! The directory layout is a little tricky at times, hence a separate file to
+//! house this logic. The current layout looks like this:
+//!
+//! ```ignore
+//! # This is the root directory for all output, the top-level package
+//! # places all of its output here.
+//! target/
+//!
+//! # This is the root directory for all output of *dependencies*
+//! deps/
+//!
+//! # Root directory for all compiled examples
+//! examples/
+//!
+//! # This is the location at which the output of all custom build
+//! # commands are rooted
+//! build/
+//!
+//! # Each package gets its own directory where its build script and
+//! # script output are placed
+//! $pkg1/
+//! $pkg2/
+//! $pkg3/
+//!
+//! # Each package directory has an `out` directory where its
+//! # build script output is placed.
+//! out/
+//!
+//! # This is the location at which the output of all old custom build
+//! # commands are rooted
+//! native/
+//!
+//! # Each package gets its own directory for where its output is
+//! # placed. We can't track exactly what's getting put in here, so
+//! # we just assume that all relevant output is in these
+//! # directories.
+//! $pkg1/
+//! $pkg2/
+//! $pkg3/
+//!
+//! # Directory used to store incremental data for the compiler (when
+//! # incremental compilation is enabled).
+//! incremental/
+//!
+//! # Hidden directory that holds all of the fingerprint files for all
+//! # packages
+//! .fingerprint/
+//! ```
+
+use std::fs;
+use std::io;
+use std::path::{PathBuf, Path};
+
+use core::Workspace;
+use util::{Config, FileLock, CargoResult, Filesystem};
+
+/// Contains the paths of all target output locations.
+///
+/// See module docs for more information.
+pub struct Layout {
+ root: PathBuf,
+ deps: PathBuf,
+ native: PathBuf,
+ build: PathBuf,
+ incremental: PathBuf,
+ fingerprint: PathBuf,
+ examples: PathBuf,
+ /// The lockfile for the build; it is unlocked when this struct is dropped.
+ _lock: FileLock,
+}
+
+pub fn is_bad_artifact_name(name: &str) -> bool {
+ ["deps", "examples", "build", "native", "incremental"]
+ .iter()
+ .any(|&reserved| reserved == name)
+}
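+
+// For example, `is_bad_artifact_name("deps")` is true: an artifact named
+// `deps` would collide with one of the reserved directories in the layout
+// above, so such names are rejected.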
+
+impl Layout {
+ /// Calculate the paths for build output, lock the build directory, and return as a Layout.
+ ///
+ /// This function will block if the directory is already locked.
+ ///
+ /// Differs from `at` in that this calculates the root path from the workspace target directory,
+ /// adding the target triple and the profile (debug, release, ...).
+ pub fn new(ws: &Workspace,
+ triple: Option<&str>,
+ dest: &str) -> CargoResult<Layout> {
+ let mut path = ws.target_dir();
+ // Flexible target specifications often point at filenames, so interpret
+ // the target triple as a Path and then just use the file stem as the
+ // component for the directory name.
+ if let Some(triple) = triple {
+ path.push(Path::new(triple).file_stem().ok_or_else(|| "target was empty")?);
+ }
+ path.push(dest);
+ Layout::at(ws.config(), path)
+ }
+
+ /// Calculate the paths for build output, lock the build directory, and return as a Layout.
+ ///
+ /// This function will block if the directory is already locked.
+ pub fn at(config: &Config, root: Filesystem) -> CargoResult<Layout> {
+ // For now we don't do any more finer-grained locking on the artifact
+ // directory, so just lock the entire thing for the duration of this
+ // compile.
+ let lock = root.open_rw(".cargo-lock", config, "build directory")?;
+ let root = root.into_path_unlocked();
+
+ Ok(Layout {
+ deps: root.join("deps"),
+ native: root.join("native"),
+ build: root.join("build"),
+ incremental: root.join("incremental"),
+ fingerprint: root.join(".fingerprint"),
+ examples: root.join("examples"),
+ root: root,
+ _lock: lock,
+ })
+ }
+
+ #[cfg(not(target_os = "macos"))]
+ fn exclude_from_backups(&self, _: &Path) {}
+
+ #[cfg(target_os = "macos")]
+ /// Marks files or directories as excluded from Time Machine on macOS
+ ///
+ /// This is recommended to prevent derived/temporary files from bloating backups.
+ fn exclude_from_backups(&self, path: &Path) {
+ use std::ptr;
+ use core_foundation::{url, number, string};
+ use core_foundation::base::TCFType;
+
+ // For compatibility with 10.7, a string is used instead of the global
+ // kCFURLIsExcludedFromBackupKey.
+ let is_excluded_key: Result<string::CFString, _> = "NSURLIsExcludedFromBackupKey".parse();
+ match (url::CFURL::from_path(path, false), is_excluded_key) {
+ (Some(path), Ok(is_excluded_key)) => unsafe {
+ url::CFURLSetResourcePropertyForKey(
+ path.as_concrete_TypeRef(),
+ is_excluded_key.as_concrete_TypeRef(),
+ number::kCFBooleanTrue as *const _,
+ ptr::null_mut(),
+ );
+ },
+ // Errors are ignored, since it's an optional feature and failure
+ // doesn't prevent Cargo from working
+ _ => {}
+ }
+ }
+
+ /// Make sure all directories stored in the Layout exist on the filesystem.
+ pub fn prepare(&mut self) -> io::Result<()> {
+ if fs::metadata(&self.root).is_err() {
+ fs::create_dir_all(&self.root)?;
+ }
+
+ self.exclude_from_backups(&self.root);
+
+ mkdir(&self.deps)?;
+ mkdir(&self.native)?;
+ mkdir(&self.incremental)?;
+ mkdir(&self.fingerprint)?;
+ mkdir(&self.examples)?;
+ mkdir(&self.build)?;
+
+ return Ok(());
+
+ fn mkdir(dir: &Path) -> io::Result<()> {
+ if fs::metadata(&dir).is_err() {
+ fs::create_dir(dir)?;
+ }
+ Ok(())
+ }
+ }
+
+ /// Fetch the destination path for final build artifacts (currently the root).
+ pub fn dest(&self) -> &Path { &self.root }
+ /// Fetch the deps path.
+ pub fn deps(&self) -> &Path { &self.deps }
+ /// Fetch the examples path.
+ pub fn examples(&self) -> &Path { &self.examples }
+ /// Fetch the root path.
+ pub fn root(&self) -> &Path { &self.root }
+ /// Fetch the incremental path.
+ pub fn incremental(&self) -> &Path { &self.incremental }
+ /// Fetch the fingerprint path.
+ pub fn fingerprint(&self) -> &Path { &self.fingerprint }
+ /// Fetch the build path.
+ pub fn build(&self) -> &Path { &self.build }
+}
--- /dev/null
+use std::collections::{HashMap, HashSet};
+use std::fmt::Write;
+
+use core::{Resolve, PackageId};
+use util::CargoResult;
+use super::Unit;
+
+pub struct Links<'a> {
+ validated: HashSet<&'a PackageId>,
+ links: HashMap<String, &'a PackageId>,
+}
+
+impl<'a> Links<'a> {
+ pub fn new() -> Links<'a> {
+ Links {
+ validated: HashSet::new(),
+ links: HashMap::new(),
+ }
+ }
+
+ pub fn validate(&mut self, resolve: &Resolve, unit: &Unit<'a>) -> CargoResult<()> {
+ if !self.validated.insert(unit.pkg.package_id()) {
+ return Ok(())
+ }
+ let lib = match unit.pkg.manifest().links() {
+ Some(lib) => lib,
+ None => return Ok(()),
+ };
+ if let Some(prev) = self.links.get(lib) {
+ let pkg = unit.pkg.package_id();
+
+ let describe_path = |pkgid: &PackageId| -> String {
+ let dep_path = resolve.path_to_top(pkgid);
+ if dep_path.is_empty() {
+ String::from("The root-package ")
+ } else {
+ let mut dep_path_desc = format!("Package `{}`\n", pkgid);
+ for dep in dep_path {
+ write!(dep_path_desc,
+ " ... which is depended on by `{}`\n",
+ dep).unwrap();
+ }
+ dep_path_desc
+ }
+ };
+
+ bail!("Multiple packages link to native library `{}`. \
+ A native library can be linked only once.\n\
+ \n\
+ {}links to native library `{}`.\n\
+ \n\
+ {}also links to native library `{}`.",
+ lib,
+ describe_path(prev), lib,
+ describe_path(pkg), lib)
+ }
+ if !unit.pkg.manifest().targets().iter().any(|t| t.is_custom_build()) {
+ bail!("package `{}` specifies that it links to `{}` but does not \
+ have a custom build script", unit.pkg.package_id(), lib)
+ }
+ self.links.insert(lib.to_string(), unit.pkg.package_id());
+ Ok(())
+ }
+}
--- /dev/null
+use std::collections::{HashMap, HashSet};
+use std::env;
+use std::ffi::{OsStr, OsString};
+use std::fs;
+use std::io::{self, Write};
+use std::path::{self, PathBuf};
+use std::sync::Arc;
+
+use same_file::is_same_file;
+use serde_json;
+
+use core::{Package, PackageId, PackageSet, Target, Resolve};
+use core::{Profile, Profiles, Workspace};
+use core::shell::ColorChoice;
+use util::{self, ProcessBuilder, machine_message};
+use util::{Config, internal, profile, join_paths};
+use util::errors::{CargoResult, CargoResultExt};
+use util::Freshness;
+
+use self::job::{Job, Work};
+use self::job_queue::JobQueue;
+
+use self::output_depinfo::output_depinfo;
+
+pub use self::compilation::Compilation;
+pub use self::context::{Context, Unit, TargetFileType};
+pub use self::custom_build::{BuildOutput, BuildMap, BuildScripts};
+pub use self::layout::is_bad_artifact_name;
+
+mod compilation;
+mod context;
+mod custom_build;
+mod fingerprint;
+mod job;
+mod job_queue;
+mod layout;
+mod links;
+mod output_depinfo;
+
+/// Whether an object is for the host arch, or the target arch.
+///
+/// These will be the same unless cross-compiling.
+#[derive(PartialEq, Eq, Hash, Debug, Clone, Copy, PartialOrd, Ord)]
+pub enum Kind { Host, Target }
+
+/// Configuration information for a rustc build.
+#[derive(Default, Clone)]
+pub struct BuildConfig {
+ /// The host arch triple
+ ///
+ /// e.g. `x86_64-unknown-linux-gnu` would be:
+ /// - machine: x86_64
+ /// - hardware-platform: unknown
+ /// - operating system: linux-gnu
+ pub host_triple: String,
+ /// Build information for the host arch
+ pub host: TargetConfig,
+ /// The target arch triple, defaults to host arch
+ pub requested_target: Option<String>,
+ /// Build information for the target
+ pub target: TargetConfig,
+ /// How many rustc jobs to run in parallel
+ pub jobs: u32,
+ /// Whether we are building for release
+ pub release: bool,
+ /// Whether we are running tests
+ pub test: bool,
+ /// Whether we are building documentation
+ pub doc_all: bool,
+ /// Whether to print standard output in JSON format (for machine reading)
+ pub json_messages: bool,
+}
+
+/// Information required to build for a target
+#[derive(Clone, Default)]
+pub struct TargetConfig {
+ /// The path of the archiver (lib builder) for this target.
+ pub ar: Option<PathBuf>,
+ /// The path of the linker for this target.
+ pub linker: Option<PathBuf>,
+ /// Special build options for any necessary input files (filename -> options)
+ pub overrides: HashMap<String, BuildOutput>,
+}
+
+pub type PackagesToBuild<'a> = [(&'a Package, Vec<(&'a Target, &'a Profile)>)];
+
+/// A glorified callback for executing calls to rustc. Rather than calling rustc
+/// directly, we'll use an Executor, giving clients an opportunity to intercept
+/// the build calls.
+pub trait Executor: Send + Sync + 'static {
+ /// Called after a rustc process invocation is prepared up-front for a given
+ /// unit of work (may still be modified for runtime-known dependencies, when
+ /// the work is actually executed).
+ fn init(&self, _cx: &Context, _unit: &Unit) {}
+
+ /// In case of an `Err`, Cargo will not continue with the build process for
+ /// this package.
+ fn exec(&self,
+ cmd: ProcessBuilder,
+ _id: &PackageId,
+ _target: &Target)
+ -> CargoResult<()> {
+ cmd.exec()?;
+ Ok(())
+ }
+
+ fn exec_json(&self,
+ cmd: ProcessBuilder,
+ _id: &PackageId,
+ _target: &Target,
+ handle_stdout: &mut FnMut(&str) -> CargoResult<()>,
+ handle_stderr: &mut FnMut(&str) -> CargoResult<()>)
+ -> CargoResult<()> {
+ cmd.exec_with_streaming(handle_stdout, handle_stderr, false)?;
+ Ok(())
+ }
+
+ /// Queried when queuing each unit of work. If it returns true, then the
+ /// unit will always be rebuilt, independent of whether it needs to be.
+ fn force_rebuild(&self, _unit: &Unit) -> bool {
+ false
+ }
+}
+
+/// A `DefaultExecutor` calls rustc without doing anything else. It is Cargo's
+/// default behaviour.
+#[derive(Copy, Clone)]
+pub struct DefaultExecutor;
+
+impl Executor for DefaultExecutor {}
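+
+// A minimal sketch of a custom executor (hypothetical type, shown purely to
+// illustrate overriding the trait's default methods; not part of Cargo):
+//
+//     struct LoggingExecutor;
+//
+//     impl Executor for LoggingExecutor {
+//         fn exec(&self, cmd: ProcessBuilder, _id: &PackageId, _target: &Target)
+//                 -> CargoResult<()> {
+//             println!("about to run: {}", cmd);
+//             cmd.exec()?;
+//             Ok(())
+//         }
+//     }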
+
+// Returns a mapping of the root package plus its immediate dependencies to
+// where the compiled libraries are all located.
+pub fn compile_targets<'a, 'cfg: 'a>(ws: &Workspace<'cfg>,
+ pkg_targets: &'a PackagesToBuild<'a>,
+ packages: &'a PackageSet<'cfg>,
+ resolve: &'a Resolve,
+ config: &'cfg Config,
+ build_config: BuildConfig,
+ profiles: &'a Profiles,
+ exec: Arc<Executor>)
+ -> CargoResult<Compilation<'cfg>> {
+ let units = pkg_targets.iter().flat_map(|&(pkg, ref targets)| {
+ let default_kind = if build_config.requested_target.is_some() {
+ Kind::Target
+ } else {
+ Kind::Host
+ };
+ targets.iter().map(move |&(target, profile)| {
+ Unit {
+ pkg: pkg,
+ target: target,
+ profile: profile,
+ kind: if target.for_host() {Kind::Host} else {default_kind},
+ }
+ })
+ }).collect::<Vec<_>>();
+
+ let mut cx = Context::new(ws, resolve, packages, config,
+ build_config, profiles)?;
+
+ let mut queue = JobQueue::new(&cx);
+
+ cx.prepare()?;
+ cx.probe_target_info(&units)?;
+ cx.build_used_in_plugin_map(&units)?;
+ custom_build::build_map(&mut cx, &units)?;
+
+ for unit in units.iter() {
+ // Build up a list of pending jobs, each of which represent
+ // compiling a particular package. No actual work is executed as
+ // part of this, that's all done next as part of the `execute`
+ // function which will run everything in order with proper
+ // parallelism.
+ compile(&mut cx, &mut queue, unit, Arc::clone(&exec))?;
+ }
+
+ // Now that we've figured out everything that we're going to do, do it!
+ queue.execute(&mut cx)?;
+
+ for unit in units.iter() {
+ for &(ref dst, ref link_dst, file_type) in cx.target_filenames(unit)?.iter() {
+ if file_type == TargetFileType::DebugInfo {
+ continue;
+ }
+
+ let bindst = match *link_dst {
+ Some(ref link_dst) => link_dst,
+ None => dst,
+ };
+
+ if unit.profile.test {
+ cx.compilation.tests.push((unit.pkg.clone(),
+ unit.target.kind().clone(),
+ unit.target.name().to_string(),
+ dst.clone()));
+ } else if unit.target.is_bin() || unit.target.is_example() {
+ cx.compilation.binaries.push(bindst.clone());
+ } else if unit.target.is_lib() {
+ let pkgid = unit.pkg.package_id().clone();
+ cx.compilation.libraries.entry(pkgid).or_insert(HashSet::new())
+ .insert((unit.target.clone(), dst.clone()));
+ }
+ }
+
+ for dep in cx.dep_targets(unit)?.iter() {
+ if !unit.target.is_lib() { continue }
+
+ if dep.profile.run_custom_build {
+ let out_dir = cx.build_script_out_dir(dep).display().to_string();
+ cx.compilation.extra_env.entry(dep.pkg.package_id().clone())
+ .or_insert(Vec::new())
+ .push(("OUT_DIR".to_string(), out_dir));
+ }
+
+ if !dep.target.is_lib() { continue }
+ if dep.profile.doc { continue }
+
+ let v = cx.target_filenames(dep)?;
+ cx.compilation.libraries
+ .entry(unit.pkg.package_id().clone())
+ .or_insert(HashSet::new())
+ .extend(v.iter().map(|&(ref f, _, _)| {
+ (dep.target.clone(), f.clone())
+ }));
+ }
+
+ let feats = cx.resolve.features(unit.pkg.package_id());
+ cx.compilation.cfgs.entry(unit.pkg.package_id().clone())
+ .or_insert_with(HashSet::new)
+ .extend(feats.iter().map(|feat| format!("feature=\"{}\"", feat)));
+
+ output_depinfo(&mut cx, unit)?;
+ }
+
+ for (&(ref pkg, _), output) in cx.build_state.outputs.lock().unwrap().iter() {
+ cx.compilation.cfgs.entry(pkg.clone())
+ .or_insert_with(HashSet::new)
+ .extend(output.cfgs.iter().cloned());
+
+ cx.compilation.extra_env.entry(pkg.clone())
+ .or_insert_with(Vec::new)
+ .extend(output.env.iter().cloned());
+
+ for dir in output.library_paths.iter() {
+ cx.compilation.native_dirs.insert(dir.clone());
+ }
+ }
+ cx.compilation.target = cx.target_triple().to_string();
+ Ok(cx.compilation)
+}
+
+fn compile<'a, 'cfg: 'a>(cx: &mut Context<'a, 'cfg>,
+ jobs: &mut JobQueue<'a>,
+ unit: &Unit<'a>,
+ exec: Arc<Executor>) -> CargoResult<()> {
+ if !cx.compiled.insert(*unit) {
+ return Ok(())
+ }
+
+ // Build up the work to be done to compile this unit, enqueuing it once
+ // we've got everything constructed.
+ let p = profile::start(format!("preparing: {}/{}", unit.pkg,
+ unit.target.name()));
+ fingerprint::prepare_init(cx, unit)?;
+ cx.links.validate(cx.resolve, unit)?;
+
+ let (dirty, fresh, freshness) = if unit.profile.run_custom_build {
+ custom_build::prepare(cx, unit)?
+ } else if unit.profile.doc && unit.profile.test {
+ // we run these targets later, so this is just a noop for now
+ (Work::noop(), Work::noop(), Freshness::Fresh)
+ } else {
+ let (mut freshness, dirty, fresh) = fingerprint::prepare_target(cx, unit)?;
+ let work = if unit.profile.doc {
+ rustdoc(cx, unit)?
+ } else {
+ rustc(cx, unit, Arc::clone(&exec))?
+ };
+ // Need to link targets on both the dirty and fresh
+ let dirty = work.then(link_targets(cx, unit, false)?).then(dirty);
+ let fresh = link_targets(cx, unit, true)?.then(fresh);
+
+ if exec.force_rebuild(unit) {
+ freshness = Freshness::Dirty;
+ }
+
+ (dirty, fresh, freshness)
+ };
+ jobs.enqueue(cx, unit, Job::new(dirty, fresh), freshness)?;
+ drop(p);
+
+ // Be sure to compile all dependencies of this target as well.
+ for unit in cx.dep_targets(unit)?.iter() {
+ compile(cx, jobs, unit, exec.clone())?;
+ }
+
+ Ok(())
+}
+
+fn rustc<'a, 'cfg>(cx: &mut Context<'a, 'cfg>,
+ unit: &Unit<'a>,
+ exec: Arc<Executor>) -> CargoResult<Work> {
+ let mut rustc = prepare_rustc(cx, &unit.target.rustc_crate_types(), unit)?;
+
+ let name = unit.pkg.name().to_string();
+
+ // If this is an upstream dep we don't want warnings from, turn off all
+ // lints.
+ if !cx.show_warnings(unit.pkg.package_id()) {
+ rustc.arg("--cap-lints").arg("allow");
+
+ // If this is an upstream dep but we *do* want warnings, make sure that they
+ // don't fail compilation.
+ } else if !unit.pkg.package_id().source_id().is_path() {
+ rustc.arg("--cap-lints").arg("warn");
+ }
+
+ let filenames = cx.target_filenames(unit)?;
+ let root = cx.out_dir(unit);
+ let kind = unit.kind;
+
+ // Prepare the native lib state (extra -L and -l flags)
+ let build_state = cx.build_state.clone();
+ let current_id = unit.pkg.package_id().clone();
+ let build_deps = load_build_deps(cx, unit);
+
+ // If we are a binary and the package also contains a library, then we
+ // don't pass the `-l` flags.
+ let pass_l_flag = unit.target.is_lib() ||
+ !unit.pkg.targets().iter().any(|t| t.is_lib());
+ let do_rename = unit.target.allows_underscores() && !unit.profile.test;
+ let real_name = unit.target.name().to_string();
+ let crate_name = unit.target.crate_name();
+
+ // XXX(Rely on target_filenames iterator as source of truth rather than rederiving filestem)
+ let rustc_dep_info_loc = if do_rename && cx.target_metadata(unit).is_none() {
+ root.join(&crate_name)
+ } else {
+ root.join(&cx.file_stem(unit))
+ }.with_extension("d");
+ let dep_info_loc = fingerprint::dep_info_loc(cx, unit);
+ let cwd = cx.config.cwd().to_path_buf();
+
+ rustc.args(&cx.incremental_args(unit)?);
+ rustc.args(&cx.rustflags_args(unit)?);
+ let json_messages = cx.build_config.json_messages;
+ let package_id = unit.pkg.package_id().clone();
+ let target = unit.target.clone();
+
+ exec.init(cx, unit);
+ let exec = exec.clone();
+
+ let root_output = cx.target_root().to_path_buf();
+
+ return Ok(Work::new(move |state| {
+ // Only at runtime have we discovered what the extra -L and -l
+ // arguments are for native libraries, so we process those here. We
+ // also need to be sure to add any -L paths for our plugins to the
+ // dynamic library load path as a plugin's dynamic library may be
+ // located somewhere in there.
+ // Finally, if custom environment variables have been produced by
+ // previous build scripts, we include them in the rustc invocation.
+ if let Some(build_deps) = build_deps {
+ let build_state = build_state.outputs.lock().unwrap();
+ add_native_deps(&mut rustc, &build_state, &build_deps,
+ pass_l_flag, &current_id)?;
+ add_plugin_deps(&mut rustc, &build_state, &build_deps,
+ &root_output)?;
+ add_custom_env(&mut rustc, &build_state, &current_id, kind)?;
+ }
+
+ for &(ref filename, ref _link_dst, _linkable) in filenames.iter() {
+ // If there is both an rmeta and rlib, rustc will prefer to use the
+ // rlib, even if it is older. Therefore, we must delete the rlib to
+ // force using the new rmeta.
+ if filename.extension() == Some(OsStr::new("rmeta")) {
+ let dst = root.join(filename).with_extension("rlib");
+ if dst.exists() {
+ fs::remove_file(&dst).chain_err(|| {
+ format!("Could not remove file: {}.", dst.display())
+ })?;
+ }
+ }
+ }
+
+ state.running(&rustc);
+ if json_messages {
+ exec.exec_json(rustc, &package_id, &target,
+ &mut |line| if !line.is_empty() {
+ Err(internal(&format!("compiler stdout is not empty: `{}`", line)))
+ } else {
+ Ok(())
+ },
+ &mut |line| {
+ // stderr from rustc can have a mix of JSON and non-JSON output
+ if line.starts_with('{') {
+ // Handle JSON lines
+ let compiler_message = serde_json::from_str(line).map_err(|_| {
+ internal(&format!("compiler produced invalid json: `{}`", line))
+ })?;
+
+ machine_message::emit(&machine_message::FromCompiler {
+ package_id: &package_id,
+ target: &target,
+ message: compiler_message,
+ });
+ } else {
+ // Forward non-JSON to stderr
+ writeln!(io::stderr(), "{}", line)?;
+ }
+ Ok(())
+ }
+ ).chain_err(|| {
+ format!("Could not compile `{}`.", name)
+ })?;
+ } else {
+ exec.exec(rustc, &package_id, &target).map_err(|e| e.into_internal()).chain_err(|| {
+ format!("Could not compile `{}`.", name)
+ })?;
+ }
+
+ if do_rename && real_name != crate_name {
+ let dst = &filenames[0].0;
+ let src = dst.with_file_name(dst.file_name().unwrap()
+ .to_str().unwrap()
+ .replace(&real_name, &crate_name));
+ if src.exists() && src.file_name() != dst.file_name() {
+ fs::rename(&src, &dst).chain_err(|| {
+ internal(format!("could not rename crate {:?}", src))
+ })?;
+ }
+ }
+
+ if fs::metadata(&rustc_dep_info_loc).is_ok() {
+ info!("Renaming dep_info {:?} to {:?}", rustc_dep_info_loc, dep_info_loc);
+ fs::rename(&rustc_dep_info_loc, &dep_info_loc).chain_err(|| {
+ internal(format!("could not rename dep info: {:?}",
+ rustc_dep_info_loc))
+ })?;
+ fingerprint::append_current_dir(&dep_info_loc, &cwd)?;
+ }
+
+ Ok(())
+ }));
+
+ // Add all relevant -L and -l flags from dependencies (now calculated and
+ // present in `state`) to the command provided
+ fn add_native_deps(rustc: &mut ProcessBuilder,
+ build_state: &BuildMap,
+ build_scripts: &BuildScripts,
+ pass_l_flag: bool,
+ current_id: &PackageId) -> CargoResult<()> {
+ for key in build_scripts.to_link.iter() {
+ let output = build_state.get(key).ok_or_else(|| {
+ internal(format!("couldn't find build state for {}/{:?}",
+ key.0, key.1))
+ })?;
+ for path in output.library_paths.iter() {
+ rustc.arg("-L").arg(path);
+ }
+ if key.0 == *current_id {
+ for cfg in &output.cfgs {
+ rustc.arg("--cfg").arg(cfg);
+ }
+ if pass_l_flag {
+ for name in output.library_links.iter() {
+ rustc.arg("-l").arg(name);
+ }
+ }
+ }
+ }
+ Ok(())
+ }
+
+ // Add all custom environment variables present in `state` (after they've
+ // been put there by one of the `build_scripts`) to the command provided.
+ fn add_custom_env(rustc: &mut ProcessBuilder,
+ build_state: &BuildMap,
+ current_id: &PackageId,
+ kind: Kind) -> CargoResult<()> {
+ let key = (current_id.clone(), kind);
+ if let Some(output) = build_state.get(&key) {
+ for &(ref name, ref value) in output.env.iter() {
+ rustc.env(name, value);
+ }
+ }
+ Ok(())
+ }
+}
+
+/// Link the compiled target (often of the form `foo-{metadata_hash}`) to the
+/// final target. This must happen during both "Fresh" and "Compile".
+fn link_targets<'a, 'cfg>(cx: &mut Context<'a, 'cfg>,
+ unit: &Unit<'a>,
+ fresh: bool) -> CargoResult<Work> {
+ let filenames = cx.target_filenames(unit)?;
+ let package_id = unit.pkg.package_id().clone();
+ let target = unit.target.clone();
+ let profile = unit.profile.clone();
+ let features = cx.resolve.features_sorted(&package_id).into_iter()
+ .map(|s| s.to_owned())
+ .collect();
+ let json_messages = cx.build_config.json_messages;
+
+ Ok(Work::new(move |_| {
+ // If we're a "root crate", e.g. the target of this compilation, then we
+ // hard link our outputs out of the `deps` directory into the directory
+ // above. This means that `cargo build` will produce binaries in
+ // `target/debug` which one probably expects.
+ let mut destinations = vec![];
+ for &(ref src, ref link_dst, _file_type) in filenames.iter() {
+ // This may have been a `cargo rustc` command which changes the
+ // output, so the source may not actually exist.
+ if !src.exists() {
+ continue
+ }
+ let dst = match link_dst.as_ref() {
+ Some(dst) => dst,
+ None => {
+ destinations.push(src.display().to_string());
+ continue;
+ }
+ };
+ destinations.push(dst.display().to_string());
+
+ debug!("linking {} to {}", src.display(), dst.display());
+ if is_same_file(src, dst).unwrap_or(false) {
+ continue
+ }
+ if dst.exists() {
+ fs::remove_file(&dst).chain_err(|| {
+ format!("failed to remove: {}", dst.display())
+ })?;
+ }
+
+ let link_result = if src.is_dir() {
+ #[cfg(unix)]
+ use std::os::unix::fs::symlink;
+ #[cfg(target_os = "redox")]
+ use std::os::redox::fs::symlink;
+ #[cfg(windows)]
+ use std::os::windows::fs::symlink_dir as symlink;
+
+ symlink(src, dst)
+ } else {
+ fs::hard_link(src, dst)
+ };
+ link_result
+ .or_else(|err| {
+ debug!("link failed {}. falling back to fs::copy", err);
+ fs::copy(src, dst).map(|_| ())
+ })
+ .chain_err(|| {
+ format!("failed to link or copy `{}` to `{}`",
+ src.display(), dst.display())
+ })?;
+ }
+
+ if json_messages {
+ machine_message::emit(&machine_message::Artifact {
+ package_id: &package_id,
+ target: &target,
+ profile: &profile,
+ features: features,
+ filenames: destinations,
+ fresh: fresh,
+ });
+ }
+ Ok(())
+ }))
+}
+
+fn load_build_deps(cx: &Context, unit: &Unit) -> Option<Arc<BuildScripts>> {
+ cx.build_scripts.get(unit).cloned()
+}
+
+// For all plugin dependencies, add their -L paths (now calculated and
+// present in `state`) to the dynamic library load path for the command to
+// execute.
+fn add_plugin_deps(rustc: &mut ProcessBuilder,
+ build_state: &BuildMap,
+ build_scripts: &BuildScripts,
+ root_output: &PathBuf)
+ -> CargoResult<()> {
+ let var = util::dylib_path_envvar();
+ let search_path = rustc.get_env(var).unwrap_or_default();
+ let mut search_path = env::split_paths(&search_path).collect::<Vec<_>>();
+ for id in build_scripts.plugins.iter() {
+ let key = (id.clone(), Kind::Host);
+ let output = build_state.get(&key).ok_or_else(|| {
+ internal(format!("couldn't find libs for plugin dep {}", id))
+ })?;
+ search_path.append(&mut filter_dynamic_search_path(output.library_paths.iter(),
+ root_output));
+ }
+ let search_path = join_paths(&search_path, var)?;
+ rustc.env(var, &search_path);
+ Ok(())
+}
+
+// Determine paths to add to the dynamic search path from -L entries
+//
+// Strip off prefixes like "native=" or "framework=" and filter out directories
+// *not* inside our output directory since they are likely spurious and can cause
+// clashes with system shared libraries (issue #3366).
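+//
+// For example (illustrative paths): `native=/t/debug/build/foo/out` becomes
+// `/t/debug/build/foo/out` and is kept when it lies under the target root,
+// while `native=/usr/lib` is stripped to `/usr/lib` and then dropped.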
+fn filter_dynamic_search_path<'a, I>(paths: I, root_output: &PathBuf) -> Vec<PathBuf>
+ where I: Iterator<Item=&'a PathBuf> {
+ let mut search_path = vec![];
+ for dir in paths {
+ let dir = match dir.to_str() {
+ Some(s) => {
+ let mut parts = s.splitn(2, '=');
+ match (parts.next(), parts.next()) {
+ (Some("native"), Some(path)) |
+ (Some("crate"), Some(path)) |
+ (Some("dependency"), Some(path)) |
+ (Some("framework"), Some(path)) |
+ (Some("all"), Some(path)) => path.into(),
+ _ => dir.clone(),
+ }
+ }
+ None => dir.clone(),
+ };
+ if dir.starts_with(&root_output) {
+ search_path.push(dir);
+ } else {
+ debug!("Not including path {} in runtime library search path because it is \
+ outside target root {}", dir.display(), root_output.display());
+ }
+ }
+ search_path
+}
+
+fn prepare_rustc<'a, 'cfg>(cx: &mut Context<'a, 'cfg>,
+ crate_types: &[&str],
+ unit: &Unit<'a>) -> CargoResult<ProcessBuilder> {
+ let mut base = cx.compilation.rustc_process(unit.pkg)?;
+ base.inherit_jobserver(&cx.jobserver);
+ build_base_args(cx, &mut base, unit, crate_types);
+ build_deps_args(&mut base, cx, unit)?;
+ Ok(base)
+}
+
+fn rustdoc<'a, 'cfg>(cx: &mut Context<'a, 'cfg>,
+ unit: &Unit<'a>) -> CargoResult<Work> {
+ let mut rustdoc = cx.compilation.rustdoc_process(unit.pkg)?;
+ rustdoc.inherit_jobserver(&cx.jobserver);
+ rustdoc.arg("--crate-name").arg(&unit.target.crate_name())
+ .cwd(cx.config.cwd())
+ .arg(&root_path(cx, unit));
+
+ if unit.kind != Kind::Host {
+ if let Some(target) = cx.requested_target() {
+ rustdoc.arg("--target").arg(target);
+ }
+ }
+
+ let doc_dir = cx.out_dir(unit);
+
+ // Create the documentation directory ahead of time as rustdoc currently has
+ // a bug where concurrent invocations will race to create this directory if
+ // it doesn't already exist.
+ fs::create_dir_all(&doc_dir)?;
+
+ rustdoc.arg("-o").arg(doc_dir);
+
+ for feat in cx.resolve.features_sorted(unit.pkg.package_id()) {
+ rustdoc.arg("--cfg").arg(&format!("feature=\"{}\"", feat));
+ }
+
+ if let Some(ref args) = unit.profile.rustdoc_args {
+ rustdoc.args(args);
+ }
+
+ build_deps_args(&mut rustdoc, cx, unit)?;
+
+ rustdoc.args(&cx.rustdocflags_args(unit)?);
+
+ let name = unit.pkg.name().to_string();
+ let build_state = cx.build_state.clone();
+ let key = (unit.pkg.package_id().clone(), unit.kind);
+
+ Ok(Work::new(move |state| {
+ if let Some(output) = build_state.outputs.lock().unwrap().get(&key) {
+ for cfg in output.cfgs.iter() {
+ rustdoc.arg("--cfg").arg(cfg);
+ }
+ for &(ref name, ref value) in output.env.iter() {
+ rustdoc.env(name, value);
+ }
+ }
+ state.running(&rustdoc);
+ rustdoc.exec().chain_err(|| format!("Could not document `{}`.", name))
+ }))
+}
+
+// The path that we pass to rustc is actually fairly important because it will
+// show up in error messages and the like. For this reason we take a few moments
+// to ensure that something shows up pretty reasonably.
+//
+// The heuristic here is fairly simple, but the key idea is that the path is
+// always "relative" to the current directory in order to be found easily. The
+// path is only actually relative if the current directory is an ancestor of it.
+// This means that non-path dependencies (git/registry) will likely be shown as
+// absolute paths instead of relative paths.
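+//
+// For example (illustrative): with a cwd of `/ws` and a path dependency at
+// `/ws/crates/foo/src/lib.rs`, rustc is handed `crates/foo/src/lib.rs`, while
+// a registry dependency's source path stays absolute.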
+fn root_path(cx: &Context, unit: &Unit) -> PathBuf {
+ let absolute = unit.pkg.root().join(unit.target.src_path());
+ let cwd = cx.config.cwd();
+ if absolute.starts_with(cwd) {
+ util::without_prefix(&absolute, cwd).map(|s| {
+ s.to_path_buf()
+ }).unwrap_or(absolute)
+ } else {
+ absolute
+ }
+}
+
+fn build_base_args<'a, 'cfg>(cx: &mut Context<'a, 'cfg>,
+ cmd: &mut ProcessBuilder,
+ unit: &Unit<'a>,
+ crate_types: &[&str]) {
+ let Profile {
+ ref opt_level, lto, codegen_units, ref rustc_args, debuginfo,
+ debug_assertions, overflow_checks, rpath, test, doc: _doc,
+ run_custom_build, ref panic, rustdoc_args: _, check,
+ } = *unit.profile;
+ assert!(!run_custom_build);
+
+ // Move to cwd so the root_path() passed below is actually correct
+ cmd.cwd(cx.config.cwd());
+
+ cmd.arg("--crate-name").arg(&unit.target.crate_name());
+
+ cmd.arg(&root_path(cx, unit));
+
+ match cx.config.shell().color_choice() {
+ ColorChoice::Always => { cmd.arg("--color").arg("always"); }
+ ColorChoice::Never => { cmd.arg("--color").arg("never"); }
+ ColorChoice::CargoAuto => {}
+ }
+
+ if cx.build_config.json_messages {
+ cmd.arg("--error-format").arg("json");
+ }
+
+ if !test {
+ for crate_type in crate_types.iter() {
+ cmd.arg("--crate-type").arg(crate_type);
+ }
+ }
+
+ if check {
+ cmd.arg("--emit=dep-info,metadata");
+ } else {
+ cmd.arg("--emit=dep-info,link");
+ }
+
+ let prefer_dynamic = (unit.target.for_host() &&
+ !unit.target.is_custom_build()) ||
+ (crate_types.contains(&"dylib") &&
+ cx.ws.members().any(|p| p != unit.pkg));
+ if prefer_dynamic {
+ cmd.arg("-C").arg("prefer-dynamic");
+ }
+
+ if opt_level != "0" {
+ cmd.arg("-C").arg(&format!("opt-level={}", opt_level));
+ }
+
+ // If a panic mode was configured *and* we're not ever going to be used in a
+ // plugin, then we can compile with that panic mode.
+ //
+ // If we're used in a plugin then we'll eventually be linked to libsyntax
+ // most likely which isn't compiled with a custom panic mode, so we'll just
+ // get an error if we actually compile with that. This fixes `panic=abort`
+ // crates which have plugin dependencies, but unfortunately means that
+ // dependencies shared between the main application and plugins must be
+ // compiled without `panic=abort`. This isn't so bad, though, as the main
+ // application will still be compiled with `panic=abort`.
+ if let Some(panic) = panic.as_ref() {
+ if !cx.used_in_plugin.contains(unit) {
+ cmd.arg("-C").arg(format!("panic={}", panic));
+ }
+ }
+
+ // Disable LTO for host builds, since LTO and `prefer-dynamic` are
+ // mutually exclusive.
+ if unit.target.can_lto() && lto && !unit.target.for_host() {
+ cmd.args(&["-C", "lto"]);
+ } else if let Some(n) = codegen_units {
+ // There are some restrictions with LTO and codegen-units, so we
+ // only add codegen units when LTO is not used.
+ cmd.arg("-C").arg(&format!("codegen-units={}", n));
+ }
+
+ if let Some(debuginfo) = debuginfo {
+ cmd.arg("-C").arg(format!("debuginfo={}", debuginfo));
+ }
+
+ if let Some(ref args) = *rustc_args {
+ cmd.args(args);
+ }
+
+ // -C overflow-checks is implied by the setting of -C debug-assertions,
+ // so we only need to provide -C overflow-checks if it differs from
+ // the value of -C debug-assertions we would provide.
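+ // Concretely (assumed defaults, per the logic below): opt-level 0 implies
+ // both debug-assertions and overflow-checks on, any other opt-level implies
+ // both off, so flags are only emitted for combinations that deviate from
+ // those defaults.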
+ if opt_level != "0" {
+ if debug_assertions {
+ cmd.args(&["-C", "debug-assertions=on"]);
+ if !overflow_checks {
+ cmd.args(&["-C", "overflow-checks=off"]);
+ }
+ } else if overflow_checks {
+ cmd.args(&["-C", "overflow-checks=on"]);
+ }
+ } else if !debug_assertions {
+ cmd.args(&["-C", "debug-assertions=off"]);
+ if overflow_checks {
+ cmd.args(&["-C", "overflow-checks=on"]);
+ }
+ } else if !overflow_checks {
+ cmd.args(&["-C", "overflow-checks=off"]);
+ }
+
+ if test && unit.target.harness() {
+ cmd.arg("--test");
+ } else if test {
+ cmd.arg("--cfg").arg("test");
+ }
+
+ // We ideally want deterministic invocations of rustc to ensure that
+ // rustc-caching strategies like sccache are able to cache more, so sort the
+ // feature list here.
+ for feat in cx.resolve.features_sorted(unit.pkg.package_id()) {
+ cmd.arg("--cfg").arg(&format!("feature=\"{}\"", feat));
+ }
+
+ match cx.target_metadata(unit) {
+ Some(m) => {
+ cmd.arg("-C").arg(&format!("metadata={}", m));
+ cmd.arg("-C").arg(&format!("extra-filename=-{}", m));
+ }
+ None => {
+ cmd.arg("-C").arg(&format!("metadata={}", cx.target_short_hash(unit)));
+ }
+ }
+
+ if rpath {
+ cmd.arg("-C").arg("rpath");
+ }
+
+ cmd.arg("--out-dir").arg(&cx.out_dir(unit));
+
+ fn opt(cmd: &mut ProcessBuilder, key: &str, prefix: &str,
+ val: Option<&OsStr>) {
+ if let Some(val) = val {
+ let mut joined = OsString::from(prefix);
+ joined.push(val);
+ cmd.arg(key).arg(joined);
+ }
+ }
+
+ if unit.kind == Kind::Target {
+ opt(cmd, "--target", "", cx.requested_target().map(|s| s.as_ref()));
+ }
+
+ opt(cmd, "-C", "ar=", cx.ar(unit.kind).map(|s| s.as_ref()));
+ opt(cmd, "-C", "linker=", cx.linker(unit.kind).map(|s| s.as_ref()));
+}
+
+fn build_deps_args<'a, 'cfg>(cmd: &mut ProcessBuilder,
+ cx: &mut Context<'a, 'cfg>,
+ unit: &Unit<'a>) -> CargoResult<()> {
+ cmd.arg("-L").arg(&{
+ let mut deps = OsString::from("dependency=");
+ deps.push(cx.deps_dir(unit));
+ deps
+ });
+
+ // Be sure that the host path is also listed. This'll ensure that proc-macro
+ // dependencies are correctly found (for reexported macros).
+ if let Kind::Target = unit.kind {
+ cmd.arg("-L").arg(&{
+ let mut deps = OsString::from("dependency=");
+ deps.push(cx.host_deps());
+ deps
+ });
+ }
+
+ for unit in cx.dep_targets(unit)?.iter() {
+ if unit.profile.run_custom_build {
+ cmd.env("OUT_DIR", &cx.build_script_out_dir(unit));
+ }
+ if unit.target.linkable() && !unit.profile.doc {
+ link_to(cmd, cx, unit)?;
+ }
+ }
+
+ return Ok(());
+
+ fn link_to<'a, 'cfg>(cmd: &mut ProcessBuilder,
+ cx: &mut Context<'a, 'cfg>,
+ unit: &Unit<'a>) -> CargoResult<()> {
+ for &(ref dst, _, file_type) in cx.target_filenames(unit)?.iter() {
+ if file_type != TargetFileType::Linkable {
+ continue
+ }
+ let mut v = OsString::new();
+ v.push(&unit.target.crate_name());
+ v.push("=");
+ v.push(cx.out_dir(unit));
+ v.push(&path::MAIN_SEPARATOR.to_string());
+ v.push(&dst.file_name().unwrap());
+ cmd.arg("--extern").arg(&v);
+ }
+ Ok(())
+ }
+}
+
+fn envify(s: &str) -> String {
+ s.chars()
+ .flat_map(|c| c.to_uppercase())
+ .map(|c| if c == '-' {'_'} else {c})
+ .collect()
+}
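+
+// Illustrative mapping: `envify("my-package")` yields `"MY_PACKAGE"`, i.e. an
+// environment-variable friendly form of a package or feature name.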
+
+impl Kind {
+ fn for_target(&self, target: &Target) -> Kind {
+ // Once we start compiling for the `Host` kind we continue doing so, but
+ // if we are a `Target` kind and then we start compiling for a target
+ // that needs to be on the host we lift ourselves up to `Host`
+ match *self {
+ Kind::Host => Kind::Host,
+ Kind::Target if target.for_host() => Kind::Host,
+ Kind::Target => Kind::Target,
+ }
+ }
+}
--- /dev/null
+use std::collections::HashSet;
+use std::io::{Write, BufWriter, ErrorKind};
+use std::fs::{self, File};
+use std::path::{Path, PathBuf};
+
+use ops::{Context, Unit};
+use util::{CargoResult, internal};
+use ops::cargo_rustc::fingerprint;
+
+fn render_filename<P: AsRef<Path>>(path: P, basedir: Option<&str>) -> CargoResult<String> {
+ let path = path.as_ref();
+ let relpath = match basedir {
+ None => path,
+ Some(base) => match path.strip_prefix(base) {
+ Ok(relpath) => relpath,
+ _ => path,
+ }
+ };
+ relpath.to_str().ok_or_else(|| internal("path not utf-8")).map(|f| f.replace(" ", "\\ "))
+}
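+
+// Illustrative: with `basedir = Some("/ws")`, the path `/ws/src/lib.rs`
+// renders as `src/lib.rs`, and any space in a path is escaped as `\ ` so the
+// Makefile-style dep-info output below stays parseable.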
+
+fn add_deps_for_unit<'a, 'b>(
+ deps: &mut HashSet<PathBuf>,
+ context: &mut Context<'a, 'b>,
+ unit: &Unit<'a>,
+ visited: &mut HashSet<Unit<'a>>,
+) -> CargoResult<()> {
+ if !visited.insert(*unit) {
+ return Ok(());
+ }
+
+ // units representing the execution of a build script don't actually
+ // generate a dep info file, so we just keep on going below
+ if !unit.profile.run_custom_build {
+ // Add dependencies from rustc dep-info output (stored in fingerprint directory)
+ let dep_info_loc = fingerprint::dep_info_loc(context, unit);
+ if let Some(paths) = fingerprint::parse_dep_info(&dep_info_loc)? {
+ for path in paths {
+ deps.insert(path);
+ }
+ } else {
+ debug!("can't find dep_info for {:?} {:?}",
+ unit.pkg.package_id(), unit.profile);
+ return Err(internal("dep_info missing"));
+ }
+ }
+
+ // Add rerun-if-changed dependencies
+ let key = (unit.pkg.package_id().clone(), unit.kind);
+ if let Some(output) = context.build_state.outputs.lock().unwrap().get(&key) {
+ for path in &output.rerun_if_changed {
+ deps.insert(path.into());
+ }
+ }
+
+ // Recursively traverse all transitive dependencies
+ for dep_unit in &context.dep_targets(unit)? {
+ let source_id = dep_unit.pkg.package_id().source_id();
+ if source_id.is_path() {
+ add_deps_for_unit(deps, context, dep_unit, visited)?;
+ }
+ }
+ Ok(())
+}
+
+pub fn output_depinfo<'a, 'b>(context: &mut Context<'a, 'b>, unit: &Unit<'a>) -> CargoResult<()> {
+ let mut deps = HashSet::new();
+ let mut visited = HashSet::new();
+ let success = add_deps_for_unit(&mut deps, context, unit, &mut visited).is_ok();
+ let basedir = None; // TODO
+ for &(_, ref link_dst, _) in context.target_filenames(unit)?.iter() {
+ if let Some(ref link_dst) = *link_dst {
+ let output_path = link_dst.with_extension("d");
+ if success {
+ let mut outfile = BufWriter::new(File::create(output_path)?);
+ let target_fn = render_filename(link_dst, basedir)?;
+ write!(outfile, "{}:", target_fn)?;
+ for dep in &deps {
+ write!(outfile, " {}", render_filename(dep, basedir)?)?;
+ }
+ writeln!(outfile, "")?;
+ } else if let Err(err) = fs::remove_file(output_path) {
+ // dep-info generation failed, so delete output file. This will usually
+ // cause the build system to always rerun the build rule, which is correct
+ // if inefficient.
+ if err.kind() != ErrorKind::NotFound {
+ return Err(err.into());
+ }
+ }
+ }
+ }
+ Ok(())
+}
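+
+// The emitted `.d` file holds one Makefile-style rule, e.g. (illustrative):
+//
+//     target/debug/libfoo.rlib: src/lib.rs src/util.rs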
--- /dev/null
+use std::ffi::{OsString, OsStr};
+
+use ops::{self, Compilation};
+use util::{self, CargoTestError, Test, ProcessError};
+use util::errors::{CargoResult, CargoErrorKind, CargoError};
+use core::Workspace;
+
+pub struct TestOptions<'a> {
+ pub compile_opts: ops::CompileOptions<'a>,
+ pub no_run: bool,
+ pub no_fail_fast: bool,
+ pub only_doc: bool,
+}
+
+pub fn run_tests(ws: &Workspace,
+ options: &TestOptions,
+ test_args: &[String]) -> CargoResult<Option<CargoTestError>> {
+ let compilation = compile_tests(ws, options)?;
+
+ if options.no_run {
+ return Ok(None)
+ }
+ let (test, mut errors) = if options.only_doc {
+ assert!(options.compile_opts.filter.is_specific());
+ run_doc_tests(options, test_args, &compilation)?
+ } else {
+ run_unit_tests(options, test_args, &compilation)?
+ };
+
+ // If we have an error and want to fail fast, return
+ if !errors.is_empty() && !options.no_fail_fast {
+ return Ok(Some(CargoTestError::new(test, errors)))
+ }
+
+ // If a specific test was requested or we're not running any tests at all,
+ // don't run any doc tests.
+ if options.compile_opts.filter.is_specific() {
+ match errors.len() {
+ 0 => return Ok(None),
+ _ => return Ok(Some(CargoTestError::new(test, errors)))
+ }
+ }
+
+ let (doctest, docerrors) = run_doc_tests(options, test_args, &compilation)?;
+ let test = if docerrors.is_empty() { test } else { doctest };
+ errors.extend(docerrors);
+ if errors.is_empty() {
+ Ok(None)
+ } else {
+ Ok(Some(CargoTestError::new(test, errors)))
+ }
+}
+
+pub fn run_benches(ws: &Workspace,
+ options: &TestOptions,
+ args: &[String]) -> CargoResult<Option<CargoTestError>> {
+ let mut args = args.to_vec();
+ args.push("--bench".to_string());
+ let compilation = compile_tests(ws, options)?;
+
+ if options.no_run {
+ return Ok(None)
+ }
+ let (test, errors) = run_unit_tests(options, &args, &compilation)?;
+ match errors.len() {
+ 0 => Ok(None),
+ _ => Ok(Some(CargoTestError::new(test, errors))),
+ }
+}
+
+fn compile_tests<'a>(ws: &Workspace<'a>,
+ options: &TestOptions<'a>)
+ -> CargoResult<Compilation<'a>> {
+ let mut compilation = ops::compile(ws, &options.compile_opts)?;
+ compilation.tests.sort_by(|a, b| {
+ (a.0.package_id(), &a.1, &a.2).cmp(&(b.0.package_id(), &b.1, &b.2))
+ });
+ Ok(compilation)
+}
+
+/// Run the unit and integration tests of a project.
+fn run_unit_tests(options: &TestOptions,
+ test_args: &[String],
+ compilation: &Compilation)
+ -> CargoResult<(Test, Vec<ProcessError>)> {
+ let config = options.compile_opts.config;
+ let cwd = options.compile_opts.config.cwd();
+
+ let mut errors = Vec::new();
+
+ for &(ref pkg, ref kind, ref test, ref exe) in &compilation.tests {
+ let to_display = match util::without_prefix(exe, cwd) {
+ Some(path) => path,
+ None => &**exe,
+ };
+ let mut cmd = compilation.target_process(exe, pkg)?;
+ cmd.args(test_args);
+ config.shell().concise(|shell| {
+ shell.status("Running", to_display.display().to_string())
+ })?;
+ config.shell().verbose(|shell| {
+ shell.status("Running", cmd.to_string())
+ })?;
+
+ let result = cmd.exec();
+
+ match result {
+ Err(CargoError(CargoErrorKind::ProcessErrorKind(e), .. )) => {
+ errors.push((kind.clone(), test.clone(), e));
+ if !options.no_fail_fast {
+ break;
+ }
+ }
+ Err(e) => {
+ //This is an unexpected Cargo error rather than a test failure
+ return Err(e)
+ }
+ Ok(()) => {}
+ }
+ }
+
+ if errors.len() == 1 {
+ let (kind, test, e) = errors.pop().unwrap();
+ Ok((Test::UnitTest(kind, test), vec![e]))
+ } else {
+ Ok((Test::Multiple, errors.into_iter().map(|(_, _, e)| e).collect()))
+ }
+}
+
+fn run_doc_tests(options: &TestOptions,
+ test_args: &[String],
+ compilation: &Compilation)
+ -> CargoResult<(Test, Vec<ProcessError>)> {
+ let mut errors = Vec::new();
+ let config = options.compile_opts.config;
+
+ // We don't build/run doctests if target != host
+ if config.rustc()?.host != compilation.target {
+ return Ok((Test::Doc, errors));
+ }
+
+ let libs = compilation.to_doc_test.iter().map(|package| {
+ (package, package.targets().iter().filter(|t| t.doctested())
+ .map(|t| (t.src_path(), t.name(), t.crate_name())))
+ });
+
+ for (package, tests) in libs {
+ for (lib, name, crate_name) in tests {
+ config.shell().status("Doc-tests", name)?;
+ let mut p = compilation.rustdoc_process(package)?;
+ p.arg("--test").arg(lib)
+ .arg("--crate-name").arg(&crate_name);
+
+ for &rust_dep in &[&compilation.deps_output] {
+ let mut arg = OsString::from("dependency=");
+ arg.push(rust_dep);
+ p.arg("-L").arg(arg);
+ }
+
+ for native_dep in compilation.native_dirs.iter() {
+ p.arg("-L").arg(native_dep);
+ }
+
+ for &host_rust_dep in &[&compilation.host_deps_output] {
+ let mut arg = OsString::from("dependency=");
+ arg.push(host_rust_dep);
+ p.arg("-L").arg(arg);
+ }
+
+ for arg in test_args {
+ p.arg("--test-args").arg(arg);
+ }
+
+ if let Some(cfgs) = compilation.cfgs.get(package.package_id()) {
+ for cfg in cfgs.iter() {
+ p.arg("--cfg").arg(cfg);
+ }
+ }
+
+ let libs = &compilation.libraries[package.package_id()];
+ for &(ref target, ref lib) in libs.iter() {
+ // Note that we can *only* doctest rlib outputs here. A
+ // staticlib output cannot be linked by the compiler (it just
+ // doesn't do that). A dylib output, however, can be linked by
+ // the compiler, but will always fail. Currently all dylibs are
+ // built as "static dylibs" where the standard library is
+ // statically linked into the dylib. The doc tests fail,
+ // however, for now as they try to link the standard library
+ // dynamically as well, causing problems. As a result we only
+ // pass `--extern` for rlib deps and skip out on all other
+ // artifacts.
+ if lib.extension() != Some(OsStr::new("rlib")) &&
+ !target.for_host() {
+ continue
+ }
+ let mut arg = OsString::from(target.crate_name());
+ arg.push("=");
+ arg.push(lib);
+ p.arg("--extern").arg(&arg);
+ }
+
+ config.shell().verbose(|shell| {
+ shell.status("Running", p.to_string())
+ })?;
+ if let Err(CargoError(CargoErrorKind::ProcessErrorKind(e), ..)) = p.exec() {
+ errors.push(e);
+ if !options.no_fail_fast {
+ return Ok((Test::Doc, errors));
+ }
+ }
+ }
+ }
+ Ok((Test::Doc, errors))
+}
--- /dev/null
+use std::io::prelude::*;
+
+use toml;
+
+use core::{Resolve, resolver, Workspace};
+use core::resolver::WorkspaceResolve;
+use util::Filesystem;
+use util::errors::{CargoResult, CargoResultExt};
+use util::toml as cargo_toml;
+
+pub fn load_pkg_lockfile(ws: &Workspace) -> CargoResult<Option<Resolve>> {
+ if !ws.root().join("Cargo.lock").exists() {
+ return Ok(None)
+ }
+
+ let root = Filesystem::new(ws.root().to_path_buf());
+ let mut f = root.open_ro("Cargo.lock", ws.config(), "Cargo.lock file")?;
+
+ let mut s = String::new();
+ f.read_to_string(&mut s).chain_err(|| {
+ format!("failed to read file: {}", f.path().display())
+ })?;
+
+ (|| -> CargoResult<Option<Resolve>> {
+ let resolve: toml::Value = cargo_toml::parse(&s, f.path(), ws.config())?;
+ let v: resolver::EncodableResolve = resolve.try_into()?;
+ Ok(Some(v.into_resolve(ws)?))
+ })().chain_err(|| {
+ format!("failed to parse lock file at: {}", f.path().display())
+ })
+}
+
+pub fn write_pkg_lockfile(ws: &Workspace, resolve: &Resolve) -> CargoResult<()> {
+ // Load the original lockfile if it exists.
+ let ws_root = Filesystem::new(ws.root().to_path_buf());
+ let orig = ws_root.open_ro("Cargo.lock", ws.config(), "Cargo.lock file");
+ let orig = orig.and_then(|mut f| {
+ let mut s = String::new();
+ f.read_to_string(&mut s)?;
+ Ok(s)
+ });
+
+ let toml = toml::Value::try_from(WorkspaceResolve { ws, resolve }).unwrap();
+
+ let mut out = String::new();
+
+ let deps = toml["package"].as_array().unwrap();
+ for dep in deps.iter() {
+ let dep = dep.as_table().unwrap();
+
+ out.push_str("[[package]]\n");
+ emit_package(dep, &mut out);
+ }
+
+ if let Some(patch) = toml.get("patch") {
+ let list = patch["unused"].as_array().unwrap();
+ for entry in list {
+ out.push_str("[[patch.unused]]\n");
+ emit_package(entry.as_table().unwrap(), &mut out);
+ out.push_str("\n");
+ }
+ }
+
+ if let Some(meta) = toml.get("metadata") {
+ out.push_str("[metadata]\n");
+ out.push_str(&meta.to_string());
+ }
+
+ // If the lockfile contents haven't changed, don't rewrite it. This is
+ // helpful on read-only filesystems.
+ if let Ok(orig) = orig {
+ if are_equal_lockfiles(orig, &out, ws) {
+ return Ok(())
+ }
+ }
+
+ if !ws.config().lock_update_allowed() {
+ let flag = if ws.config().network_allowed() {"--locked"} else {"--frozen"};
+ bail!("the lock file needs to be updated but {} was passed to \
+ prevent this", flag);
+ }
+
+ // Ok, the lockfile has changed (or didn't exist), so write it out
+ ws_root.open_rw("Cargo.lock", ws.config(), "Cargo.lock file").and_then(|mut f| {
+ f.file().set_len(0)?;
+ f.write_all(out.as_bytes())?;
+ Ok(())
+ }).chain_err(|| {
+ format!("failed to write {}",
+ ws.root().join("Cargo.lock").display())
+ })
+}
+
+fn are_equal_lockfiles(mut orig: String, current: &str, ws: &Workspace) -> bool {
+ if has_crlf_line_endings(&orig) {
+ orig = orig.replace("\r\n", "\n");
+ }
+
+ // If we want to try and avoid updating the lockfile, parse both and
+ // compare them; since this is somewhat expensive, don't do it in the
+ // common case where we can update lockfiles.
+ if !ws.config().lock_update_allowed() {
+ let res: CargoResult<bool> = (|| {
+ let old: resolver::EncodableResolve = toml::from_str(&orig)?;
+ let new: resolver::EncodableResolve = toml::from_str(current)?;
+ Ok(old.into_resolve(ws)? == new.into_resolve(ws)?)
+ })();
+ if let Ok(true) = res {
+ return true;
+ }
+ }
+
+ current == orig
+}
+
+fn has_crlf_line_endings(s: &str) -> bool {
+ // Only check the first line.
+ if let Some(lf) = s.find('\n') {
+ s[..lf].ends_with('\r')
+ } else {
+ false
+ }
+}
+
+fn emit_package(dep: &toml::value::Table, out: &mut String) {
+ out.push_str(&format!("name = {}\n", &dep["name"]));
+ out.push_str(&format!("version = {}\n", &dep["version"]));
+
+ if dep.contains_key("source") {
+ out.push_str(&format!("source = {}\n", &dep["source"]));
+ }
+
+ if let Some(s) = dep.get("dependencies") {
+ let slice = s.as_array().unwrap();
+
+ if !slice.is_empty() {
+ out.push_str("dependencies = [\n");
+
+ for child in slice.iter() {
+ out.push_str(&format!(" {},\n", child));
+ }
+
+ out.push_str("]\n");
+ }
+ out.push_str("\n");
+ } else if dep.contains_key("replace") {
+ out.push_str(&format!("replace = {}\n\n", &dep["replace"]));
+ }
+}
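+
+// For illustration, `emit_package` above renders entries of roughly this
+// shape (names, versions, and dependency strings hypothetical):
+//
+//     [[package]]
+//     name = "foo"
+//     version = "0.1.0"
+//     source = "registry+https://github.com/rust-lang/crates.io-index"
+//     dependencies = [
+//      "bar 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
+//     ]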
--- /dev/null
+pub use self::cargo_clean::{clean, CleanOptions};
+pub use self::cargo_compile::{compile, compile_with_exec, compile_ws, CompileOptions};
+pub use self::cargo_compile::{CompileFilter, CompileMode, FilterRule, MessageFormat, Packages};
+pub use self::cargo_read_manifest::{read_package, read_packages};
+pub use self::cargo_rustc::{compile_targets, Compilation, Kind, Unit};
+pub use self::cargo_rustc::{Context, is_bad_artifact_name};
+pub use self::cargo_rustc::{BuildOutput, BuildConfig, TargetConfig};
+pub use self::cargo_rustc::{Executor, DefaultExecutor};
+pub use self::cargo_run::run;
+pub use self::cargo_install::{install, install_list, uninstall};
+pub use self::cargo_new::{new, init, NewOptions, VersionControl};
+pub use self::cargo_doc::{doc, DocOptions};
+pub use self::cargo_generate_lockfile::{generate_lockfile, update_lockfile, UpdateOptions};
+pub use self::lockfile::{load_pkg_lockfile, write_pkg_lockfile};
+pub use self::cargo_test::{run_tests, run_benches, TestOptions};
+pub use self::cargo_package::{package, PackageOpts};
+pub use self::registry::{publish, registry_configuration, RegistryConfig};
+pub use self::registry::{registry_login, search, http_proxy_exists, http_handle};
+pub use self::registry::{modify_owners, yank, OwnersOptions, PublishOpts};
+pub use self::registry::configure_http_handle;
+pub use self::cargo_fetch::fetch;
+pub use self::cargo_pkgid::pkgid;
+pub use self::resolve::{resolve_ws, resolve_ws_precisely, resolve_with_previous};
+pub use self::cargo_output_metadata::{output_metadata, OutputMetadataOptions, ExportInfo};
+
+mod cargo_clean;
+mod cargo_compile;
+mod cargo_doc;
+mod cargo_fetch;
+mod cargo_generate_lockfile;
+mod cargo_install;
+mod cargo_new;
+mod cargo_output_metadata;
+mod cargo_package;
+mod cargo_pkgid;
+mod cargo_read_manifest;
+mod cargo_run;
+mod cargo_rustc;
+mod cargo_test;
+mod lockfile;
+mod registry;
+mod resolve;
--- /dev/null
+use std::env;
+use std::fs::{self, File};
+use std::iter::repeat;
+use std::time::Duration;
+
+use curl::easy::{Easy, SslOpt};
+use git2;
+use registry::{Registry, NewCrate, NewCrateDependency};
+
+use url::percent_encoding::{percent_encode, QUERY_ENCODE_SET};
+
+use version;
+use core::source::Source;
+use core::{Package, SourceId, Workspace};
+use core::dependency::Kind;
+use core::manifest::ManifestMetadata;
+use ops;
+use sources::RegistrySource;
+use util::config::{self, Config};
+use util::paths;
+use util::ToUrl;
+use util::errors::{CargoError, CargoResult, CargoResultExt};
+use util::important_paths::find_root_manifest_for_wd;
+
+pub struct RegistryConfig {
+ pub index: Option<String>,
+ pub token: Option<String>,
+}
+
+pub struct PublishOpts<'cfg> {
+ pub config: &'cfg Config,
+ pub token: Option<String>,
+ pub index: Option<String>,
+ pub verify: bool,
+ pub allow_dirty: bool,
+ pub jobs: Option<u32>,
+ pub target: Option<&'cfg str>,
+ pub dry_run: bool,
+ pub registry: Option<String>,
+}
+
+pub fn publish(ws: &Workspace, opts: &PublishOpts) -> CargoResult<()> {
+ let pkg = ws.current()?;
+
+ if let &Some(ref allowed_registries) = pkg.publish() {
+ let publish_allowed = match opts.registry {
+ Some(ref registry) => allowed_registries.contains(registry),
+ None => false,
+ };
+ if !publish_allowed {
+ bail!("some crates cannot be published.\n\
+ `{}` is marked as unpublishable", pkg.name());
+ }
+ }
+
+ if !pkg.manifest().patch().is_empty() {
+ bail!("published crates cannot contain [patch] sections");
+ }
+
+ let (mut registry, reg_id) = registry(opts.config,
+ opts.token.clone(),
+ opts.index.clone(),
+ opts.registry.clone())?;
+ verify_dependencies(pkg, ®_id)?;
+
+ // Prepare a tarball, with a non-suppressible warning if metadata
+ // is missing, since this is being put online.
+ let tarball = ops::package(ws, &ops::PackageOpts {
+ config: opts.config,
+ verify: opts.verify,
+ list: false,
+ check_metadata: true,
+ allow_dirty: opts.allow_dirty,
+ target: opts.target,
+ jobs: opts.jobs,
+ registry: opts.registry.clone(),
+ })?.unwrap();
+
+ // Upload said tarball to the specified destination
+ opts.config.shell().status("Uploading", pkg.package_id().to_string())?;
+ transmit(opts.config, pkg, tarball.file(), &mut registry, ®_id, opts.dry_run)?;
+
+ Ok(())
+}
+
+fn verify_dependencies(pkg: &Package, registry_src: &SourceId)
+ -> CargoResult<()> {
+ for dep in pkg.dependencies().iter() {
+ if dep.source_id().is_path() {
+ if !dep.specified_req() {
+ bail!("all path dependencies must have a version specified \
+ when publishing.\ndependency `{}` does not specify \
+ a version", dep.name())
+ }
+ } else if dep.source_id() != registry_src {
+ if dep.source_id().is_registry() {
+ // Block the publish if the dependency comes from another registry
+ // and we're not publishing to an alternative registry
+ if !registry_src.is_alt_registry() {
+ bail!("crates cannot be published to crates.io with dependencies sourced from other\n\
+ registries; either publish `{}` on crates.io or pull it into this repository\n\
+ and specify it with a path and version\n\
+ (crate `{}` is pulled from {})", dep.name(), dep.name(), dep.source_id());
+ }
+ } else {
+ bail!("crates cannot be published to crates.io with dependencies sourced from \
+ a repository\neither publish `{}` as its own crate on crates.io and \
+ specify a crates.io version as a dependency or pull it into this \
+ repository and specify it with a path and version\n(crate `{}` has \
+ repository path `{}`)", dep.name(), dep.name(), dep.source_id());
+ }
+ }
+ }
+ Ok(())
+}
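+
+// For example, publishing with an illustrative manifest entry like
+//
+//     [dependencies]
+//     helper = { path = "../helper" }
+//
+// fails the check above until a `version = "..."` is added alongside `path`.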
+
+fn transmit(config: &Config,
+ pkg: &Package,
+ tarball: &File,
+ registry: &mut Registry,
+ registry_id: &SourceId,
+ dry_run: bool) -> CargoResult<()> {
+
+ let deps = pkg.dependencies().iter().map(|dep| {
+
+ // If the dependency is from a different registry, then include the
+ // registry in the dependency.
+ let dep_registry = if dep.source_id() != registry_id {
+ Some(dep.source_id().url().to_string())
+ } else {
+ None
+ };
+
+ NewCrateDependency {
+ optional: dep.is_optional(),
+ default_features: dep.uses_default_features(),
+ name: dep.name().to_string(),
+ features: dep.features().to_vec(),
+ version_req: dep.version_req().to_string(),
+ target: dep.platform().map(|s| s.to_string()),
+ kind: match dep.kind() {
+ Kind::Normal => "normal",
+ Kind::Build => "build",
+ Kind::Development => "dev",
+ }.to_string(),
+ registry: dep_registry,
+ }
+ }).collect::<Vec<NewCrateDependency>>();
+ let manifest = pkg.manifest();
+ let ManifestMetadata {
+ ref authors, ref description, ref homepage, ref documentation,
+ ref keywords, ref readme, ref repository, ref license, ref license_file,
+ ref categories, ref badges,
+ } = *manifest.metadata();
+ let readme_content = match *readme {
+ Some(ref readme) => Some(paths::read(&pkg.root().join(readme))?),
+ None => None,
+ };
+ if let Some(ref file) = *license_file {
+ if fs::metadata(&pkg.root().join(file)).is_err() {
+ bail!("the license file `{}` does not exist", file)
+ }
+ }
+
+ // Do not upload if performing a dry run
+ if dry_run {
+ config.shell().warn("aborting upload due to dry run")?;
+ return Ok(());
+ }
+
+ let publish = registry.publish(&NewCrate {
+ name: pkg.name().to_string(),
+ vers: pkg.version().to_string(),
+ deps: deps,
+ features: pkg.summary().features().clone(),
+ authors: authors.clone(),
+ description: description.clone(),
+ homepage: homepage.clone(),
+ documentation: documentation.clone(),
+ keywords: keywords.clone(),
+ categories: categories.clone(),
+ readme: readme_content,
+ readme_file: readme.clone(),
+ repository: repository.clone(),
+ license: license.clone(),
+ license_file: license_file.clone(),
+ badges: badges.clone(),
+ }, tarball);
+
+ match publish {
+ Ok(warnings) => {
+ if !warnings.invalid_categories.is_empty() {
+ let msg = format!("\
+ the following are not valid category slugs and were \
+ ignored: {}. Please see https://crates.io/category_slugs \
+ for the list of all category slugs. \
+ ", warnings.invalid_categories.join(", "));
+ config.shell().warn(&msg)?;
+ }
+
+ if !warnings.invalid_badges.is_empty() {
+ let msg = format!("\
+ the following are not valid badges and were ignored: {}. \
+ Either the badge type specified is unknown or a required \
+ attribute is missing. Please see \
+ http://doc.crates.io/manifest.html#package-metadata \
+ for valid badge types and their required attributes.",
+ warnings.invalid_badges.join(", "));
+ config.shell().warn(&msg)?;
+ }
+
+ Ok(())
+ },
+ Err(e) => Err(e.into()),
+ }
+}
+
+pub fn registry_configuration(config: &Config,
+ registry: Option<String>) -> CargoResult<RegistryConfig> {
+
+ let (index, token) = match registry {
+ Some(registry) => {
+ (Some(config.get_registry_index(®istry)?.to_string()),
+ config.get_string(&format!("registry.{}.token", registry))?.map(|p| p.val))
+ }
+ None => {
+ // Fall back to the default index and token
+ (config.get_string("registry.index")?.map(|p| p.val),
+ config.get_string("registry.token")?.map(|p| p.val))
+ }
+ };
+
+ Ok(RegistryConfig {
+ index: index,
+ token: token
+ })
+}
+
+pub fn registry(config: &Config,
+ token: Option<String>,
+ index: Option<String>,
+ registry: Option<String>) -> CargoResult<(Registry, SourceId)> {
+ // Parse all configuration options
+ let RegistryConfig {
+ token: token_config,
+ index: index_config,
+ } = registry_configuration(config, registry.clone())?;
+ let token = token.or(token_config);
+ let sid = match (index_config, index, registry) {
+ (_, _, Some(registry)) => SourceId::alt_registry(config, ®istry)?,
+ (Some(index), _, _) | (None, Some(index), _) => SourceId::for_registry(&index.to_url()?)?,
+ (None, None, _) => SourceId::crates_io(config)?,
+ };
+ let api_host = {
+ let mut src = RegistrySource::remote(&sid, config);
+ src.update().chain_err(|| {
+ format!("failed to update {}", sid)
+ })?;
+ (src.config()?).unwrap().api.unwrap()
+ };
+ let handle = http_handle(config)?;
+ Ok((Registry::new_handle(api_host, token, handle), sid))
+}
+
+/// Create a new HTTP handle with appropriate global configuration for cargo.
+pub fn http_handle(config: &Config) -> CargoResult<Easy> {
+ if !config.network_allowed() {
+ bail!("attempting to make an HTTP request, but --frozen was \
+ specified")
+ }
+
+ let mut handle = Easy::new();
+ configure_http_handle(config, &mut handle)?;
+ Ok(handle)
+}
+
+/// Configure a libcurl http handle with the default options for Cargo
+pub fn configure_http_handle(config: &Config, handle: &mut Easy) -> CargoResult<()> {
+ // The timeout option for libcurl by default times out the entire transfer,
+ // but we probably don't want this. Instead we only set timeouts for the
+ // connect phase as well as a "low speed" timeout so if we don't receive
+ // many bytes in a large-ish period of time then we time out.
+ handle.connect_timeout(Duration::new(30, 0))?;
+ handle.low_speed_limit(10 /* bytes per second */)?;
+ handle.low_speed_time(Duration::new(30, 0))?;
+ handle.useragent(&version().to_string())?;
+ if let Some(proxy) = http_proxy(config)? {
+ handle.proxy(&proxy)?;
+ }
+ if let Some(cainfo) = config.get_path("http.cainfo")? {
+ handle.cainfo(&cainfo.val)?;
+ }
+ if let Some(check) = config.get_bool("http.check-revoke")? {
+ handle.ssl_options(SslOpt::new().no_revoke(!check.val))?;
+ }
+ if let Some(timeout) = http_timeout(config)? {
+ handle.connect_timeout(Duration::new(timeout as u64, 0))?;
+ handle.low_speed_time(Duration::new(timeout as u64, 0))?;
+ }
+ Ok(())
+}
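+
+// For reference, the keys consulted above correspond to a `.cargo/config`
+// section like the following (values illustrative):
+//
+//     [http]
+//     proxy = "host:port"
+//     timeout = 30          # seconds
+//     cainfo = "cert.pem"
+//     check-revoke = true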
+
+/// Find an explicit HTTP proxy if one is available.
+///
+/// Favor cargo's `http.proxy`, then git's `http.proxy`. Proxies specified
+/// via environment variables are picked up by libcurl.
+fn http_proxy(config: &Config) -> CargoResult<Option<String>> {
+ if let Some(s) = config.get_string("http.proxy")? {
+ return Ok(Some(s.val))
+ }
+ if let Ok(cfg) = git2::Config::open_default() {
+ if let Ok(s) = cfg.get_str("http.proxy") {
+ return Ok(Some(s.to_string()))
+ }
+ }
+ Ok(None)
+}
+
+/// Determine if an http proxy exists.
+///
+/// Checks the following for existence, in order:
+///
+/// * cargo's `http.proxy`
+/// * git's `http.proxy`
+/// * `http_proxy` env var
+/// * `HTTP_PROXY` env var
+/// * `https_proxy` env var
+/// * `HTTPS_PROXY` env var
+pub fn http_proxy_exists(config: &Config) -> CargoResult<bool> {
+ if http_proxy(config)?.is_some() {
+ Ok(true)
+ } else {
+ Ok(["http_proxy", "HTTP_PROXY",
+ "https_proxy", "HTTPS_PROXY"].iter().any(|v| env::var(v).is_ok()))
+ }
+}
+
+pub fn http_timeout(config: &Config) -> CargoResult<Option<i64>> {
+ if let Some(s) = config.get_i64("http.timeout")? {
+ return Ok(Some(s.val))
+ }
+ Ok(env::var("HTTP_TIMEOUT").ok().and_then(|s| s.parse().ok()))
+}
+
+pub fn registry_login(config: &Config,
+ token: String,
+ registry: Option<String>) -> CargoResult<()> {
+ let RegistryConfig {
+ token: old_token,
+ ..
+ } = registry_configuration(config, registry.clone())?;
+
+ if let Some(old_token) = old_token {
+ if old_token == token {
+ return Ok(());
+ }
+ }
+
+ config::save_credentials(config, token, registry)
+}
+
+pub struct OwnersOptions {
+ pub krate: Option<String>,
+ pub token: Option<String>,
+ pub index: Option<String>,
+ pub to_add: Option<Vec<String>>,
+ pub to_remove: Option<Vec<String>>,
+ pub list: bool,
+ pub registry: Option<String>,
+}
+
+pub fn modify_owners(config: &Config, opts: &OwnersOptions) -> CargoResult<()> {
+ let name = match opts.krate {
+ Some(ref name) => name.clone(),
+ None => {
+ let manifest_path = find_root_manifest_for_wd(None, config.cwd())?;
+ let pkg = Package::for_path(&manifest_path, config)?;
+ pkg.name().to_string()
+ }
+ };
+
+ let (mut registry, _) = registry(config,
+ opts.token.clone(),
+ opts.index.clone(),
+ opts.registry.clone())?;
+
+ if let Some(ref v) = opts.to_add {
+ let v = v.iter().map(|s| &s[..]).collect::<Vec<_>>();
+ let msg = registry.add_owners(&name, &v).map_err(|e| {
+ CargoError::from(format!("failed to invite owners to crate {}: {}", name, e))
+ })?;
+
+ config.shell().status("Owner", msg)?;
+ }
+
+ if let Some(ref v) = opts.to_remove {
+ let v = v.iter().map(|s| &s[..]).collect::<Vec<_>>();
+ config.shell().status("Owner", format!("removing {:?} from crate {}",
+ v, name))?;
+ registry.remove_owners(&name, &v).map_err(|e| {
+ CargoError::from(format!("failed to remove owners from crate {}: {}", name, e))
+ })?;
+ }
+
+ if opts.list {
+ let owners = registry.list_owners(&name).map_err(|e| {
+ CargoError::from(format!("failed to list owners of crate {}: {}", name, e))
+ })?;
+ for owner in owners.iter() {
+ print!("{}", owner.login);
+ match (owner.name.as_ref(), owner.email.as_ref()) {
+ (Some(name), Some(email)) => println!(" ({} <{}>)", name, email),
+ (Some(s), None) |
+ (None, Some(s)) => println!(" ({})", s),
+ (None, None) => println!(""),
+ }
+ }
+ }
+
+ Ok(())
+}
+
+pub fn yank(config: &Config,
+ krate: Option<String>,
+ version: Option<String>,
+ token: Option<String>,
+ index: Option<String>,
+ undo: bool,
+ reg: Option<String>) -> CargoResult<()> {
+ let name = match krate {
+ Some(name) => name,
+ None => {
+ let manifest_path = find_root_manifest_for_wd(None, config.cwd())?;
+ let pkg = Package::for_path(&manifest_path, config)?;
+ pkg.name().to_string()
+ }
+ };
+ let version = match version {
+ Some(v) => v,
+ None => bail!("a version must be specified to yank")
+ };
+
+ let (mut registry, _) = registry(config, token, index, reg)?;
+
+ if undo {
+ config.shell().status("Unyank", format!("{}:{}", name, version))?;
+ registry.unyank(&name, &version).map_err(|e| {
+ CargoError::from(format!("failed to undo a yank: {}", e))
+ })?;
+ } else {
+ config.shell().status("Yank", format!("{}:{}", name, version))?;
+ registry.yank(&name, &version).map_err(|e| {
+ CargoError::from(format!("failed to yank: {}", e))
+ })?;
+ }
+
+ Ok(())
+}
+
+pub fn search(query: &str,
+ config: &Config,
+ index: Option<String>,
+ limit: u8,
+ reg: Option<String>) -> CargoResult<()> {
+ fn truncate_with_ellipsis(s: &str, max_length: usize) -> String {
+ if s.len() < max_length {
+ s.to_string()
+ } else {
+ format!("{}…", &s[..max_length - 1])
+ }
+ }
+
+ let (mut registry, _) = registry(config, None, index, reg)?;
+ let (crates, total_crates) = registry.search(query, limit).map_err(|e| {
+ CargoError::from(format!("failed to retrieve search results from the registry: {}", e))
+ })?;
+
+ let list_items = crates.iter()
+ .map(|krate| (
+ format!("{} = \"{}\"", krate.name, krate.max_version),
+ krate.description.as_ref().map(|desc|
+ truncate_with_ellipsis(&desc.replace("\n", " "), 128))
+ ))
+ .collect::<Vec<_>>();
+ let description_margin = list_items.iter()
+ .map(|&(ref left, _)| left.len() + 4)
+ .max()
+ .unwrap_or(0);
+
+ for (name, description) in list_items.into_iter() {
+ let line = match description {
+ Some(desc) => {
+ let space = repeat(' ').take(description_margin - name.len())
+ .collect::<String>();
+ name + &space + "# " + &desc
+ }
+ None => name
+ };
+ println!("{}", line);
+ }
+
+ let search_max_limit = 100;
+ if total_crates > u32::from(limit) && limit < search_max_limit {
+ println!("... and {} crates more (use --limit N to see more)",
+ total_crates - u32::from(limit));
+ } else if total_crates > u32::from(limit) && limit >= search_max_limit {
+ println!("... and {} crates more (go to http://crates.io/search?q={} to see more)",
+ total_crates - u32::from(limit),
+ percent_encode(query.as_bytes(), QUERY_ENCODE_SET));
+ }
+
+ Ok(())
+}
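+
+// A sketch of the aligned listing printed above (crates and descriptions
+// illustrative):
+//
+//     foo = "0.1.0"         # Does foo things
+//     foo-helper = "0.2.1"  # Helpers for foo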
--- /dev/null
+use std::collections::HashSet;
+
+use core::{PackageId, PackageIdSpec, PackageSet, Source, SourceId, Workspace};
+use core::registry::PackageRegistry;
+use core::resolver::{self, Resolve, Method};
+use sources::PathSource;
+use ops;
+use util::profile;
+use util::errors::{CargoResult, CargoResultExt};
+
+/// Resolve all dependencies for the workspace using the previous
+/// lockfile as a guide if present.
+///
+/// This function will also write the result of resolution as a new
+/// lockfile.
+pub fn resolve_ws<'a>(ws: &Workspace<'a>) -> CargoResult<(PackageSet<'a>, Resolve)> {
+ let mut registry = PackageRegistry::new(ws.config())?;
+ let resolve = resolve_with_registry(ws, &mut registry, true)?;
+ let packages = get_resolved_packages(&resolve, registry);
+ Ok((packages, resolve))
+}
+
+/// Resolves dependencies for some packages of the workspace,
+/// taking into account `paths` overrides and activated features.
+pub fn resolve_ws_precisely<'a>(ws: &Workspace<'a>,
+ source: Option<Box<Source + 'a>>,
+ features: &[String],
+ all_features: bool,
+ no_default_features: bool,
+ specs: &[PackageIdSpec])
+ -> CargoResult<(PackageSet<'a>, Resolve)> {
+ let features = features.iter()
+ .flat_map(|s| s.split_whitespace())
+ .flat_map(|s| s.split(','))
+ .filter(|s| !s.is_empty())
+ .map(|s| s.to_string())
+ .collect::<Vec<String>>();
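+ // The flattening above accepts both space- and comma-separated lists,
+ // so e.g. `--features "serde json,derive"` (names illustrative) yields
+ // ["serde", "json", "derive"].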
+
+ let mut registry = PackageRegistry::new(ws.config())?;
+ if let Some(source) = source {
+ registry.add_preloaded(source);
+ }
+
+ let resolve = if ws.require_optional_deps() {
+ // First, resolve the root_package's *listed* dependencies, as well as
+ // downloading and updating all remotes and such.
+ let resolve = resolve_with_registry(ws, &mut registry, false)?;
+
+ // Second, resolve with precisely what we're doing. Filter out
+ // transitive dependencies if necessary, specify features, handle
+ // overrides, etc.
+ let _p = profile::start("resolving w/ overrides...");
+
+ add_overrides(&mut registry, ws)?;
+
+ for &(ref replace_spec, ref dep) in ws.root_replace() {
+ if !resolve.iter().any(|r| replace_spec.matches(r) && !dep.matches_id(r)) {
+ ws.config().shell().warn(
+ format!("package replacement is not used: {}", replace_spec)
+ )?
+ }
+ }
+
+ Some(resolve)
+ } else {
+ None
+ };
+
+ let method = if all_features {
+ Method::Everything
+ } else {
+ Method::Required {
+ dev_deps: true, // TODO: remove this option?
+ features: &features,
+ uses_default_features: !no_default_features,
+ }
+ };
+
+ let resolved_with_overrides =
+ ops::resolve_with_previous(&mut registry, ws,
+ method, resolve.as_ref(), None,
+ specs, true)?;
+
+ let packages = get_resolved_packages(&resolved_with_overrides, registry);
+
+ Ok((packages, resolved_with_overrides))
+}
+
+fn resolve_with_registry(ws: &Workspace, registry: &mut PackageRegistry, warn: bool)
+ -> CargoResult<Resolve> {
+ let prev = ops::load_pkg_lockfile(ws)?;
+ let resolve = resolve_with_previous(registry, ws,
+ Method::Everything,
+ prev.as_ref(), None, &[], warn)?;
+
+ if !ws.is_ephemeral() {
+ ops::write_pkg_lockfile(ws, &resolve)?;
+ }
+ Ok(resolve)
+}
+
+
+/// Resolve all dependencies for a package using an optional previous instance
+/// of resolve to guide the resolution process.
+///
+/// This also takes an optional hash set, `to_avoid`, which is a list of package
+/// ids that should be avoided when consulting the previous instance of resolve
+/// (often used in pairings with updates).
+///
+/// The previous resolve normally comes from a lockfile. This function does not
+/// read or write lockfiles from the filesystem.
+pub fn resolve_with_previous<'a>(registry: &mut PackageRegistry,
+ ws: &Workspace,
+ method: Method,
+ previous: Option<&'a Resolve>,
+ to_avoid: Option<&HashSet<&'a PackageId>>,
+ specs: &[PackageIdSpec],
+ warn: bool)
+ -> CargoResult<Resolve> {
+ // Here we place an artificial limitation that all non-registry sources
+ // cannot be locked at more than one revision. This means that if a git
+ // repository provides more than one package, they must all be updated in
+ // step when any of them are updated.
+ //
+ // TODO: This seems like a hokey reason to single out the registry as being
+ // different
+ let mut to_avoid_sources = HashSet::new();
+ if let Some(to_avoid) = to_avoid {
+ to_avoid_sources.extend(to_avoid.iter()
+ .map(|p| p.source_id())
+ .filter(|s| !s.is_registry()));
+ }
+
+ let ref keep = |p: &&'a PackageId| {
+ !to_avoid_sources.contains(&p.source_id()) && match to_avoid {
+ Some(set) => !set.contains(p),
+ None => true,
+ }
+ };
+
+ // In the case where a previous instance of resolve is available, we
+ // want to lock as many packages as possible to the previous version
+ // without disturbing the graph structure. To this end we perform
+ // two actions here:
+ //
+ // 1. We inform the package registry of all locked packages. This
+ // involves informing it of both the locked package's id as well
+ // as the versions of all locked dependencies. The registry will
+ // then take this information into account when it is queried.
+ //
+ // 2. The specified package's summary will have its dependencies
+ // modified to their precise variants. This will instruct the
+ // first step of the resolution process to not query for ranges
+ // but rather for precise dependency versions.
+ //
+ // This process must handle altered dependencies, however, as
+ // it's possible for a manifest to change over time to have
+ // dependencies added, removed, or modified to different version
+ // ranges. To deal with this, we only actually lock a dependency
+ // to the previously resolved version if the dependency listed
+ // still matches the locked version.
+ if let Some(r) = previous {
+ trace!("previous: {:?}", r);
+ for node in r.iter().filter(keep) {
+ let deps = r.deps_not_replaced(node)
+ .filter(keep)
+ .cloned().collect();
+ registry.register_lock(node.clone(), deps);
+ }
+ }
+
+ for (url, patches) in ws.root_patch() {
+ let previous = match previous {
+ Some(r) => r,
+ None => {
+ registry.patch(url, patches)?;
+ continue
+ }
+ };
+ let patches = patches.iter().map(|dep| {
+ let unused = previous.unused_patches();
+ let candidates = previous.iter().chain(unused);
+ match candidates.filter(keep).find(|id| dep.matches_id(id)) {
+ Some(id) => {
+ let mut dep = dep.clone();
+ dep.lock_to(id);
+ dep
+ }
+ None => dep.clone(),
+ }
+ }).collect::<Vec<_>>();
+ registry.patch(url, &patches)?;
+ }
+
+ let mut summaries = Vec::new();
+ for member in ws.members() {
+ registry.add_sources(&[member.package_id().source_id().clone()])?;
+ let method_to_resolve = match method {
+ // When resolving everything for a workspace we want to be sure to resolve all
+ // members in the workspace, so propagate the `Method::Everything`.
+ Method::Everything => Method::Everything,
+
+ // If we're not resolving everything though then we're constructing the
+ // exact crate graph we're going to build. Here we don't necessarily
+ // want to keep around all workspace crates as they may not all be
+ // built/tested.
+ //
+ // Additionally, the `method` specified represents command line
+ // flags, which really only matters for the current package
+ // (determined by the cwd). If other packages are specified (via
+ // `-p`) then the command line flags like features don't apply to
+ // them.
+ //
+ // As a result, if this `member` is the current member of the
+ // workspace, then we use `method` specified. Otherwise we use a
+ // base method with no features specified but using default features
+ // for any other packages specified with `-p`.
+ Method::Required { dev_deps, .. } => {
+ let base = Method::Required {
+ dev_deps: dev_deps,
+ features: &[],
+ uses_default_features: true,
+ };
+ let member_id = member.package_id();
+ match ws.current_opt() {
+ Some(current) if member_id == current.package_id() => method,
+ _ => {
+ if specs.iter().any(|spec| spec.matches(member_id)) {
+ base
+ } else {
+ continue
+ }
+ }
+ }
+ }
+ };
+
+ let summary = registry.lock(member.summary().clone());
+ summaries.push((summary, method_to_resolve));
+ }
+
+ let root_replace = ws.root_replace();
+
+ let replace = match previous {
+ Some(r) => {
+ root_replace.iter().map(|&(ref spec, ref dep)| {
+ for (key, val) in r.replacements().iter() {
+ if spec.matches(key) && dep.matches_id(val) && keep(&val) {
+ let mut dep = dep.clone();
+ dep.lock_to(val);
+ return (spec.clone(), dep)
+ }
+ }
+ (spec.clone(), dep.clone())
+ }).collect::<Vec<_>>()
+ }
+ None => root_replace.to_vec(),
+ };
+
+ let mut resolved = resolver::resolve(&summaries,
+ &replace,
+ registry,
+ Some(ws.config()),
+ warn)?;
+ resolved.register_used_patches(registry.patches());
+ if let Some(previous) = previous {
+ resolved.merge_from(previous)?;
+ }
+ Ok(resolved)
+}
+
+/// Read the `paths` configuration variable to discover all path overrides that
+/// have been configured.
+fn add_overrides<'a>(registry: &mut PackageRegistry<'a>,
+ ws: &Workspace<'a>) -> CargoResult<()> {
+ let paths = match ws.config().get_list("paths")? {
+ Some(list) => list,
+ None => return Ok(())
+ };
+
+ let paths = paths.val.iter().map(|&(ref s, ref p)| {
+ // The path listed next to the string is the config file in which the
+ // key was located, so we want to pop off the `.cargo/config` component
+ // to get the directory containing the `.cargo` folder.
+ (p.parent().unwrap().parent().unwrap().join(s), p)
+ });
+
+ for (path, definition) in paths {
+ let id = SourceId::for_path(&path)?;
+ let mut source = PathSource::new_recursive(&path, &id, ws.config());
+ source.update().chain_err(|| {
+ format!("failed to update path override `{}` \
+ (defined in `{}`)", path.display(),
+ definition.display())
+ })?;
+ registry.add_override(Box::new(source));
+ }
+ Ok(())
+}
+
+fn get_resolved_packages<'a>(resolve: &Resolve,
+ registry: PackageRegistry<'a>)
+ -> PackageSet<'a> {
+ let ids: Vec<PackageId> = resolve.iter().cloned().collect();
+ registry.get(&ids)
+}
+
--- /dev/null
+//! Implementation of configuration for various sources
+//!
+//! This module will parse the various `source.*` TOML configuration keys into a
+//! structure usable by Cargo itself. Currently this is primarily used to map
+//! sources to one another via the `replace-with` key in `.cargo/config`.
+
+use std::collections::HashMap;
+use std::path::{Path, PathBuf};
+
+use url::Url;
+
+use core::{Source, SourceId, GitReference};
+use sources::ReplacedSource;
+use util::{Config, ToUrl};
+use util::config::ConfigValue;
+use util::errors::{CargoError, CargoResult, CargoResultExt};
+
+#[derive(Clone)]
+pub struct SourceConfigMap<'cfg> {
+ cfgs: HashMap<String, SourceConfig>,
+ id2name: HashMap<SourceId, String>,
+ config: &'cfg Config,
+}
+
+/// Configuration for a particular source, found in TOML looking like:
+///
+/// ```toml
+/// [source.crates-io]
+/// registry = 'https://github.com/rust-lang/crates.io-index'
+/// replace-with = 'foo' # optional
+/// ```
+#[derive(Clone)]
+struct SourceConfig {
+ // id this source corresponds to, inferred from the various defined keys in
+ // the configuration
+ id: SourceId,
+
+ // Name of the source that this source should be replaced with. This field
+ // is a tuple of (name, path) where path is where this configuration key was
+ // defined (the literal `.cargo/config` file).
+ replace_with: Option<(String, PathBuf)>,
+}
+
+impl<'cfg> SourceConfigMap<'cfg> {
+ pub fn new(config: &'cfg Config) -> CargoResult<SourceConfigMap<'cfg>> {
+ let mut base = SourceConfigMap::empty(config)?;
+ if let Some(table) = config.get_table("source")? {
+ for (key, value) in table.val.iter() {
+ base.add_config(key, value)?;
+ }
+ }
+ Ok(base)
+ }
+
+ pub fn empty(config: &'cfg Config) -> CargoResult<SourceConfigMap<'cfg>> {
+ let mut base = SourceConfigMap {
+ cfgs: HashMap::new(),
+ id2name: HashMap::new(),
+ config: config,
+ };
+ base.add("crates-io", SourceConfig {
+ id: SourceId::crates_io(config)?,
+ replace_with: None,
+ });
+ Ok(base)
+ }
+
+ pub fn config(&self) -> &'cfg Config {
+ self.config
+ }
+
+ pub fn load(&self, id: &SourceId) -> CargoResult<Box<Source + 'cfg>> {
+ debug!("loading: {}", id);
+ let mut name = match self.id2name.get(id) {
+ Some(name) => name,
+ None => return Ok(id.load(self.config)?),
+ };
+ let mut path = Path::new("/");
+ let orig_name = name;
+ let new_id;
+ loop {
+ let cfg = match self.cfgs.get(name) {
+ Some(cfg) => cfg,
+ None => bail!("could not find a configured source with the \
+ name `{}` when attempting to lookup `{}` \
+ (configuration in `{}`)",
+ name, orig_name, path.display()),
+ };
+ match cfg.replace_with {
+ Some((ref s, ref p)) => {
+ name = s;
+ path = p;
+ }
+ None if *id == cfg.id => return Ok(id.load(self.config)?),
+ None => {
+ new_id = cfg.id.with_precise(id.precise()
+ .map(|s| s.to_string()));
+ break
+ }
+ }
+ debug!("following pointer to {}", name);
+ if name == orig_name {
+ bail!("detected a cycle of `replace-with` sources, the source \
+ `{}` is eventually replaced with itself \
+ (configuration in `{}`)", name, path.display())
+ }
+ }
+ let new_src = new_id.load(self.config)?;
+ let old_src = id.load(self.config)?;
+ if !new_src.supports_checksums() && old_src.supports_checksums() {
+ bail!("\
+cannot replace `{orig}` with `{name}`, the source `{orig}` supports \
+checksums, but `{name}` does not
+
+a lock file compatible with `{orig}` cannot be generated in this situation
+", orig = orig_name, name = name);
+ }
+
+ if old_src.requires_precise() && id.precise().is_none() {
+ bail!("\
+the source {orig} requires a lock file to be present first before it can be
+used against vendored source code
+
+remove the source replacement configuration, generate a lock file, and then
+restore the source replacement configuration to continue the build
+", orig = orig_name);
+ }
+
+ Ok(Box::new(ReplacedSource::new(id, &new_id, new_src)))
+ }
+
+ fn add(&mut self, name: &str, cfg: SourceConfig) {
+ self.id2name.insert(cfg.id.clone(), name.to_string());
+ self.cfgs.insert(name.to_string(), cfg);
+ }
+
+ fn add_config(&mut self, name: &str, cfg: &ConfigValue) -> CargoResult<()> {
+ let (table, _path) = cfg.table(&format!("source.{}", name))?;
+ let mut srcs = Vec::new();
+ if let Some(val) = table.get("registry") {
+ let url = url(val, &format!("source.{}.registry", name))?;
+ srcs.push(SourceId::for_registry(&url)?);
+ }
+ if let Some(val) = table.get("local-registry") {
+ let (s, path) = val.string(&format!("source.{}.local-registry",
+ name))?;
+ let mut path = path.to_path_buf();
+ path.pop();
+ path.pop();
+ path.push(s);
+ srcs.push(SourceId::for_local_registry(&path)?);
+ }
+ if let Some(val) = table.get("directory") {
+ let (s, path) = val.string(&format!("source.{}.directory",
+ name))?;
+ let mut path = path.to_path_buf();
+ path.pop();
+ path.pop();
+ path.push(s);
+ srcs.push(SourceId::for_directory(&path)?);
+ }
+ if let Some(val) = table.get("git") {
+ let url = url(val, &format!("source.{}.git", name))?;
+ let try = |s: &str| {
+ let val = match table.get(s) {
+ Some(s) => s,
+ None => return Ok(None),
+ };
+ let key = format!("source.{}.{}", name, s);
+ val.string(&key).map(Some)
+ };
+ let reference = match try("branch")? {
+ Some(b) => GitReference::Branch(b.0.to_string()),
+ None => match try("tag")? {
+ Some(b) => GitReference::Tag(b.0.to_string()),
+ None => match try("rev")? {
+ Some(b) => GitReference::Rev(b.0.to_string()),
+ None => GitReference::Branch("master".to_string()),
+ },
+ },
+ };
+ srcs.push(SourceId::for_git(&url, reference)?);
+ }
+ if name == "crates-io" && srcs.is_empty() {
+ srcs.push(SourceId::crates_io(self.config)?);
+ }
+
+ let mut srcs = srcs.into_iter();
+ let src = srcs.next().ok_or_else(|| {
+ CargoError::from(format!("no source location specified for `source.{}`, need \
+ `registry`, `local-registry`, `directory`, or `git` defined",
+ name))
+ })?;
+ if srcs.next().is_some() {
+ return Err(format!("more than one source URL specified for \
+ `source.{}`", name).into())
+ }
+
+ let mut replace_with = None;
+ if let Some(val) = table.get("replace-with") {
+ let (s, path) = val.string(&format!("source.{}.replace-with",
+ name))?;
+ replace_with = Some((s.to_string(), path.to_path_buf()));
+ }
+
+ self.add(name, SourceConfig {
+ id: src,
+ replace_with: replace_with,
+ });
+
+ return Ok(());
+
+ fn url(cfg: &ConfigValue, key: &str) -> CargoResult<Url> {
+ let (url, path) = cfg.string(key)?;
+ url.to_url().chain_err(|| {
+ format!("configuration key `{}` specified an invalid \
+ URL (in {})", key, path.display())
+ })
+ }
+ }
+}
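+
+// As a sketch, a typical vendoring setup exercising the `directory` and
+// `replace-with` branches above looks like this in `.cargo/config`
+// (names and paths illustrative):
+//
+//     [source.crates-io]
+//     replace-with = 'vendored-sources'
+//
+//     [source.vendored-sources]
+//     directory = 'vendor'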
--- /dev/null
+use std::collections::HashMap;
+use std::fmt::{self, Debug, Formatter};
+use std::fs::File;
+use std::io::Read;
+use std::path::{Path, PathBuf};
+
+use hex::ToHex;
+
+use serde_json;
+
+use core::{Package, PackageId, Summary, SourceId, Source, Dependency, Registry};
+use sources::PathSource;
+use util::{Config, Sha256};
+use util::errors::{CargoResult, CargoResultExt};
+use util::paths;
+
+pub struct DirectorySource<'cfg> {
+ source_id: SourceId,
+ root: PathBuf,
+ packages: HashMap<PackageId, (Package, Checksum)>,
+ config: &'cfg Config,
+}
+
+#[derive(Deserialize)]
+struct Checksum {
+ package: Option<String>,
+ files: HashMap<String, String>,
+}
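+
+// A `.cargo-checksum.json` file deserializing into `Checksum` above looks
+// roughly like this (hashes abbreviated):
+//
+//     {
+//         "package": "e5d0de...",
+//         "files": { "src/lib.rs": "a1b2c3..." }
+//     }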
+
+impl<'cfg> DirectorySource<'cfg> {
+ pub fn new(path: &Path, id: &SourceId, config: &'cfg Config)
+ -> DirectorySource<'cfg> {
+ DirectorySource {
+ source_id: id.clone(),
+ root: path.to_path_buf(),
+ config: config,
+ packages: HashMap::new(),
+ }
+ }
+}
+
+impl<'cfg> Debug for DirectorySource<'cfg> {
+ fn fmt(&self, f: &mut Formatter) -> fmt::Result {
+ write!(f, "DirectorySource {{ root: {:?} }}", self.root)
+ }
+}
+
+impl<'cfg> Registry for DirectorySource<'cfg> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ let packages = self.packages.values().map(|p| &p.0);
+ let matches = packages.filter(|pkg| dep.matches(pkg.summary()));
+ for summary in matches.map(|pkg| pkg.summary().clone()) {
+ f(summary);
+ }
+ Ok(())
+ }
+
+ fn supports_checksums(&self) -> bool {
+ true
+ }
+
+ fn requires_precise(&self) -> bool {
+ true
+ }
+}
+
+impl<'cfg> Source for DirectorySource<'cfg> {
+ fn source_id(&self) -> &SourceId {
+ &self.source_id
+ }
+
+ fn update(&mut self) -> CargoResult<()> {
+ self.packages.clear();
+ let entries = self.root.read_dir().chain_err(|| {
+ format!("failed to read root of directory source: {}",
+ self.root.display())
+ })?;
+
+ for entry in entries {
+ let entry = entry?;
+ let path = entry.path();
+
+ // Ignore hidden/dot directories as they typically don't contain
+ // crates and otherwise may conflict with a VCS
+ // (rust-lang/cargo#3414).
+ if let Some(s) = path.file_name().and_then(|s| s.to_str()) {
+ if s.starts_with('.') {
+ continue
+ }
+ }
+
+ // Vendor directories are often checked into a VCS, but throughout
+ // the lifetime of a vendor dir crates are often added and deleted.
+ // Some VCS implementations don't always fully delete the directory
+ // when a dir is removed from a different checkout. Sometimes a
+ // mostly-empty dir is left behind.
+ //
+ // To help Cargo work by default in more cases, we try to
+ // handle this situation here. If the directory looks like it only
+ // has dotfiles in it (or no files at all) then we skip it.
+ //
+ // In general, though, we don't want to silently skip malformed
+ // directories, as that would make problems harder to debug, so we
+ // don't just ignore errors from `update` below.
+ let mut only_dotfile = true;
+ for entry in path.read_dir()?.filter_map(|e| e.ok()) {
+ if let Some(s) = entry.file_name().to_str() {
+ if s.starts_with('.') {
+ continue
+ }
+ }
+ only_dotfile = false;
+ }
+ if only_dotfile {
+ continue
+ }
+
+ let mut src = PathSource::new(&path, &self.source_id, self.config);
+ src.update()?;
+ let pkg = src.root_package()?;
+
+ let cksum_file = path.join(".cargo-checksum.json");
+ let cksum = paths::read(&cksum_file).chain_err(|| {
+ format!("failed to load checksum `.cargo-checksum.json` \
+ of {} v{}",
+ pkg.package_id().name(),
+ pkg.package_id().version())
+ })?;
+ let cksum: Checksum = serde_json::from_str(&cksum).chain_err(|| {
+ format!("failed to decode `.cargo-checksum.json` of \
+ {} v{}",
+ pkg.package_id().name(),
+ pkg.package_id().version())
+ })?;
+
+ let mut manifest = pkg.manifest().clone();
+ let mut summary = manifest.summary().clone();
+ if let Some(ref package) = cksum.package {
+ summary = summary.set_checksum(package.clone());
+ }
+ manifest.set_summary(summary);
+ let pkg = Package::new(manifest, pkg.manifest_path());
+ self.packages.insert(pkg.package_id().clone(), (pkg, cksum));
+ }
+
+ Ok(())
+ }
+
+ fn download(&mut self, id: &PackageId) -> CargoResult<Package> {
+ self.packages.get(id).map(|p| &p.0).cloned().ok_or_else(|| {
+ format!("failed to find package with id: {}", id).into()
+ })
+ }
+
+ fn fingerprint(&self, pkg: &Package) -> CargoResult<String> {
+ Ok(pkg.package_id().version().to_string())
+ }
+
+ fn verify(&self, id: &PackageId) -> CargoResult<()> {
+ let (pkg, cksum) = match self.packages.get(id) {
+ Some(&(ref pkg, ref cksum)) => (pkg, cksum),
+ None => bail!("failed to find entry for `{}` in directory source",
+ id),
+ };
+
+ let mut buf = [0; 16 * 1024];
+ for (file, cksum) in cksum.files.iter() {
+ let mut h = Sha256::new();
+ let file = pkg.root().join(file);
+
+ (|| -> CargoResult<()> {
+ let mut f = File::open(&file)?;
+ loop {
+ match f.read(&mut buf)? {
+ 0 => return Ok(()),
+ n => h.update(&buf[..n]),
+ }
+ }
+ })().chain_err(|| {
+ format!("failed to calculate checksum of: {}",
+ file.display())
+ })?;
+
+ let actual = h.finish().to_hex();
+ if &*actual != cksum {
+ bail!("\
+ the listed checksum of `{}` has changed:\n\
+ expected: {}\n\
+ actual: {}\n\
+ \n\
+ directory sources are not intended to be edited, if \
+ modifications are required then it is recommended \
+ that [replace] is used with a forked copy of the \
+ source\
+ ", file.display(), cksum, actual);
+ }
+ }
+
+ Ok(())
+ }
+}
--- /dev/null
+pub use self::utils::{GitRemote, GitDatabase, GitCheckout, GitRevision, fetch};
+pub use self::source::{GitSource, canonicalize_url};
+mod utils;
+mod source;
--- /dev/null
+use std::fmt::{self, Debug, Formatter};
+
+use url::Url;
+
+use core::source::{Source, SourceId};
+use core::GitReference;
+use core::{Package, PackageId, Summary, Registry, Dependency};
+use util::Config;
+use util::errors::{CargoError, CargoResult};
+use util::hex::short_hash;
+use sources::PathSource;
+use sources::git::utils::{GitRemote, GitRevision};
+
+// TODO: Refactor GitSource to delegate to a PathSource
+pub struct GitSource<'cfg> {
+ remote: GitRemote,
+ reference: GitReference,
+ source_id: SourceId,
+ path_source: Option<PathSource<'cfg>>,
+ rev: Option<GitRevision>,
+ ident: String,
+ config: &'cfg Config,
+}
+
+impl<'cfg> GitSource<'cfg> {
+ pub fn new(source_id: &SourceId,
+ config: &'cfg Config) -> CargoResult<GitSource<'cfg>> {
+ assert!(source_id.is_git(), "id is not git, id={}", source_id);
+
+ let remote = GitRemote::new(source_id.url());
+ let ident = ident(source_id.url())?;
+
+ let reference = match source_id.precise() {
+ Some(s) => GitReference::Rev(s.to_string()),
+ None => source_id.git_reference().unwrap().clone(),
+ };
+
+ let source = GitSource {
+ remote: remote,
+ reference: reference,
+ source_id: source_id.clone(),
+ path_source: None,
+ rev: None,
+ ident: ident,
+ config: config,
+ };
+
+ Ok(source)
+ }
+
+ pub fn url(&self) -> &Url { self.remote.url() }
+
+ pub fn read_packages(&mut self) -> CargoResult<Vec<Package>> {
+ if self.path_source.is_none() {
+ self.update()?;
+ }
+ self.path_source.as_mut().unwrap().read_packages()
+ }
+}
+
+fn ident(url: &Url) -> CargoResult<String> {
+ let url = canonicalize_url(url)?;
+ let ident = url.path_segments().and_then(|mut s| s.next_back()).unwrap_or("");
+
+ let ident = if ident.is_empty() {
+ "_empty"
+ } else {
+ ident
+ };
+
+ Ok(format!("{}-{}", ident, short_hash(&url)))
+}
+
+// Some hacks and heuristics for making equivalent URLs hash the same
+pub fn canonicalize_url(url: &Url) -> CargoResult<Url> {
+ let mut url = url.clone();
+
+ // cannot-be-a-base-urls are not supported
+ // eg. github.com:rust-lang-nursery/rustfmt.git
+ if url.cannot_be_a_base() {
+ return Err(format!("invalid url `{}`: cannot-be-a-base-URLs are not supported", url).into());
+ }
+
+ // Strip a trailing slash
+ if url.path().ends_with('/') {
+ url.path_segments_mut().unwrap().pop_if_empty();
+ }
+
+ // HACKHACK: For GitHub URLs specifically, just lowercase
+ // everything. GitHub treats both the same, but they hash
+ // differently, and we're gonna be hashing them. This wants a more
+ // general solution, and also we're almost certainly not using the
+ // same case conversion rules that GitHub does. (#84)
+ if url.host_str() == Some("github.com") {
+ url.set_scheme("https").unwrap();
+ let path = url.path().to_lowercase();
+ url.set_path(&path);
+ }
+
+ // Repos generally can be accessed with or w/o '.git'
+ let needs_chopping = url.path().ends_with(".git");
+ if needs_chopping {
+ let last = {
+ let last = url.path_segments().unwrap().next_back().unwrap();
+ last[..last.len() - 4].to_owned()
+ };
+ url.path_segments_mut().unwrap().pop().push(&last);
+ }
+
+ Ok(url)
+}
+
+impl<'cfg> Debug for GitSource<'cfg> {
+ fn fmt(&self, f: &mut Formatter) -> fmt::Result {
+ write!(f, "git repo at {}", self.remote.url())?;
+
+ match self.reference.pretty_ref() {
+ Some(s) => write!(f, " ({})", s),
+ None => Ok(())
+ }
+ }
+}
+
+impl<'cfg> Registry for GitSource<'cfg> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ let src = self.path_source.as_mut()
+ .expect("BUG: update() must be called before query()");
+ src.query(dep, f)
+ }
+
+ fn supports_checksums(&self) -> bool {
+ false
+ }
+
+ fn requires_precise(&self) -> bool {
+ true
+ }
+}
+
+impl<'cfg> Source for GitSource<'cfg> {
+ fn source_id(&self) -> &SourceId {
+ &self.source_id
+ }
+
+ fn update(&mut self) -> CargoResult<()> {
+ let lock = self.config.git_path()
+ .open_rw(".cargo-lock-git", self.config, "the git checkouts")?;
+
+ let db_path = lock.parent().join("db").join(&self.ident);
+
+ // Resolve our reference to an actual revision, and check if the
+ // database already has that revision. If it does, we just load a
+ // database pinned at that revision, and if we don't we issue an update
+ // to try to find the revision.
+ let actual_rev = self.remote.rev_for(&db_path, &self.reference);
+ let should_update = actual_rev.is_err() ||
+ self.source_id.precise().is_none();
+
+ let (repo, actual_rev) = if should_update {
+ self.config.shell().status("Updating",
+ format!("git repository `{}`", self.remote.url()))?;
+
+ trace!("updating git source `{:?}`", self.remote);
+
+ let repo = self.remote.checkout(&db_path, self.config)?;
+ let rev = repo.rev_for(&self.reference).map_err(CargoError::into_internal)?;
+ (repo, rev)
+ } else {
+ (self.remote.db_at(&db_path)?, actual_rev.unwrap())
+ };
+
+ // Don’t use the full hash, to help avoid hitting the path length
+ // limit on Windows: https://github.com/servo/servo/pull/14397
+ let short_id = repo.to_short_id(actual_rev.clone())?;
+
+ let checkout_path = lock.parent().join("checkouts")
+ .join(&self.ident).join(short_id.as_str());
+
+ // Copy the database to the checkout location. After this we could drop
+ // the lock on the database as we no longer need it, but we leave it
+ // in scope so the destructors here won't tamper with too much.
+ // Checkout is immutable, so we don't need to protect it with a lock once
+ // it is created.
+ repo.copy_to(actual_rev.clone(), &checkout_path, self.config)?;
+
+ let source_id = self.source_id.with_precise(Some(actual_rev.to_string()));
+ let path_source = PathSource::new_recursive(&checkout_path,
+ &source_id,
+ self.config);
+
+ self.path_source = Some(path_source);
+ self.rev = Some(actual_rev);
+ self.path_source.as_mut().unwrap().update()
+ }
+
+ fn download(&mut self, id: &PackageId) -> CargoResult<Package> {
+ trace!("getting packages for package id `{}` from `{:?}`", id,
+ self.remote);
+ self.path_source.as_mut()
+ .expect("BUG: update() must be called before get()")
+ .download(id)
+ }
+
+ fn fingerprint(&self, _pkg: &Package) -> CargoResult<String> {
+ Ok(self.rev.as_ref().unwrap().to_string())
+ }
+}
+
+#[cfg(test)]
+mod test {
+ use url::Url;
+ use super::ident;
+ use util::ToUrl;
+
+ #[test]
+ pub fn test_url_to_path_ident_with_path() {
+ let ident = ident(&url("https://github.com/carlhuda/cargo")).unwrap();
+ assert!(ident.starts_with("cargo-"));
+ }
+
+ #[test]
+ pub fn test_url_to_path_ident_without_path() {
+ let ident = ident(&url("https://github.com")).unwrap();
+ assert!(ident.starts_with("_empty-"));
+ }
+
+ #[test]
+ fn test_canonicalize_idents_by_stripping_trailing_url_slash() {
+ let ident1 = ident(&url("https://github.com/PistonDevelopers/piston/")).unwrap();
+ let ident2 = ident(&url("https://github.com/PistonDevelopers/piston")).unwrap();
+ assert_eq!(ident1, ident2);
+ }
+
+ #[test]
+ fn test_canonicalize_idents_by_lowercasing_github_urls() {
+ let ident1 = ident(&url("https://github.com/PistonDevelopers/piston")).unwrap();
+ let ident2 = ident(&url("https://github.com/pistondevelopers/piston")).unwrap();
+ assert_eq!(ident1, ident2);
+ }
+
+ #[test]
+ fn test_canonicalize_idents_by_stripping_dot_git() {
+ let ident1 = ident(&url("https://github.com/PistonDevelopers/piston")).unwrap();
+ let ident2 = ident(&url("https://github.com/PistonDevelopers/piston.git")).unwrap();
+ assert_eq!(ident1, ident2);
+ }
+
+ #[test]
+ fn test_canonicalize_idents_different_protocols() {
+ let ident1 = ident(&url("https://github.com/PistonDevelopers/piston")).unwrap();
+ let ident2 = ident(&url("git://github.com/PistonDevelopers/piston")).unwrap();
+ assert_eq!(ident1, ident2);
+ }
+
+ #[test]
+ fn test_canonicalize_cannot_be_a_base_urls() {
+ assert!(ident(&url("github.com:PistonDevelopers/piston")).is_err());
+ assert!(ident(&url("google.com:PistonDevelopers/piston")).is_err());
+ }
+
+ fn url(s: &str) -> Url {
+ s.to_url().unwrap()
+ }
+}
--- /dev/null
+use std::env;
+use std::fmt;
+use std::fs::{self, File};
+use std::mem;
+use std::path::{Path, PathBuf};
+use std::process::Command;
+
+use curl::easy::{Easy, List};
+use git2::{self, ObjectType};
+use serde::ser::{self, Serialize};
+use url::Url;
+
+use core::GitReference;
+use util::{ToUrl, internal, Config, network, Progress};
+use util::errors::{CargoResult, CargoResultExt, CargoError};
+
+#[derive(PartialEq, Clone, Debug)]
+pub struct GitRevision(git2::Oid);
+
+impl ser::Serialize for GitRevision {
+ fn serialize<S: ser::Serializer>(&self, s: S) -> Result<S::Ok, S::Error> {
+ serialize_str(self, s)
+ }
+}
+
+fn serialize_str<T, S>(t: &T, s: S) -> Result<S::Ok, S::Error>
+ where T: fmt::Display,
+ S: ser::Serializer,
+{
+ t.to_string().serialize(s)
+}
+
+impl fmt::Display for GitRevision {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ fmt::Display::fmt(&self.0, f)
+ }
+}
+
+pub struct GitShortID(git2::Buf);
+
+impl GitShortID {
+ pub fn as_str(&self) -> &str {
+ self.0.as_str().unwrap()
+ }
+}
+
+/// `GitRemote` represents a remote repository. It gets cloned into a local
+/// `GitDatabase`.
+#[derive(PartialEq, Clone, Debug, Serialize)]
+pub struct GitRemote {
+ #[serde(serialize_with = "serialize_str")]
+ url: Url,
+}
+
+/// `GitDatabase` is a local clone of a remote repository's database. Multiple
+/// `GitCheckouts` can be cloned from this `GitDatabase`.
+#[derive(Serialize)]
+pub struct GitDatabase {
+ remote: GitRemote,
+ path: PathBuf,
+ #[serde(skip_serializing)]
+ repo: git2::Repository,
+}
+
+/// `GitCheckout` is a local checkout of a particular revision. Calling
+/// `clone_into` with a reference will resolve the reference into a revision,
+/// and return a `CargoError` if no revision for that reference was found.
+#[derive(Serialize)]
+pub struct GitCheckout<'a> {
+ database: &'a GitDatabase,
+ location: PathBuf,
+ revision: GitRevision,
+ #[serde(skip_serializing)]
+ repo: git2::Repository,
+}
+
+// Implementations
+
+impl GitRemote {
+ pub fn new(url: &Url) -> GitRemote {
+ GitRemote { url: url.clone() }
+ }
+
+ pub fn url(&self) -> &Url {
+ &self.url
+ }
+
+ pub fn rev_for(&self, path: &Path, reference: &GitReference)
+ -> CargoResult<GitRevision> {
+ let db = self.db_at(path)?;
+ db.rev_for(reference)
+ }
+
+ pub fn checkout(&self, into: &Path, cargo_config: &Config) -> CargoResult<GitDatabase> {
+ let repo = match git2::Repository::open(into) {
+ Ok(mut repo) => {
+ self.fetch_into(&mut repo, cargo_config).chain_err(|| {
+ format!("failed to fetch into {}", into.display())
+ })?;
+ repo
+ }
+ Err(..) => {
+ self.clone_into(into, cargo_config).chain_err(|| {
+ format!("failed to clone into: {}", into.display())
+ })?
+ }
+ };
+
+ Ok(GitDatabase {
+ remote: self.clone(),
+ path: into.to_path_buf(),
+ repo: repo,
+ })
+ }
+
+ pub fn db_at(&self, db_path: &Path) -> CargoResult<GitDatabase> {
+ let repo = git2::Repository::open(db_path)?;
+ Ok(GitDatabase {
+ remote: self.clone(),
+ path: db_path.to_path_buf(),
+ repo: repo,
+ })
+ }
+
+ fn fetch_into(&self, dst: &mut git2::Repository, cargo_config: &Config) -> CargoResult<()> {
+ // Create a local anonymous remote in the repository to fetch the url
+ let refspec = "refs/heads/*:refs/heads/*";
+ fetch(dst, &self.url, refspec, cargo_config)
+ }
+
+ fn clone_into(&self, dst: &Path, cargo_config: &Config) -> CargoResult<git2::Repository> {
+ if fs::metadata(&dst).is_ok() {
+ fs::remove_dir_all(dst)?;
+ }
+ fs::create_dir_all(dst)?;
+ let mut repo = git2::Repository::init_bare(dst)?;
+ fetch(&mut repo, &self.url, "refs/heads/*:refs/heads/*", cargo_config)?;
+ Ok(repo)
+ }
+}
+
+impl GitDatabase {
+ pub fn copy_to(&self, rev: GitRevision, dest: &Path, cargo_config: &Config)
+ -> CargoResult<GitCheckout> {
+ let checkout = match git2::Repository::open(dest) {
+ Ok(repo) => {
+ let mut checkout = GitCheckout::new(dest, self, rev, repo);
+ if !checkout.is_fresh() {
+ checkout.fetch(cargo_config)?;
+ checkout.reset(cargo_config)?;
+ assert!(checkout.is_fresh());
+ }
+ checkout
+ }
+ Err(..) => GitCheckout::clone_into(dest, self, rev, cargo_config)?,
+ };
+ checkout.update_submodules(cargo_config)?;
+ Ok(checkout)
+ }
+
+ pub fn rev_for(&self, reference: &GitReference) -> CargoResult<GitRevision> {
+ let id = match *reference {
+ GitReference::Tag(ref s) => {
+ (|| -> CargoResult<git2::Oid> {
+ let refname = format!("refs/tags/{}", s);
+ let id = self.repo.refname_to_id(&refname)?;
+ let obj = self.repo.find_object(id, None)?;
+ let obj = obj.peel(ObjectType::Commit)?;
+ Ok(obj.id())
+ })().chain_err(|| {
+ format!("failed to find tag `{}`", s)
+ })?
+ }
+ GitReference::Branch(ref s) => {
+ (|| {
+ let b = self.repo.find_branch(s, git2::BranchType::Local)?;
+ b.get().target().ok_or_else(|| {
+ CargoError::from(format!("branch `{}` did not have a target", s))
+ })
+ })().chain_err(|| {
+ format!("failed to find branch `{}`", s)
+ })?
+ }
+ GitReference::Rev(ref s) => {
+ let obj = self.repo.revparse_single(s)?;
+ match obj.as_tag() {
+ Some(tag) => tag.target_id(),
+ None => obj.id(),
+ }
+ }
+ };
+ Ok(GitRevision(id))
+ }
+
+ pub fn to_short_id(&self, revision: GitRevision) -> CargoResult<GitShortID> {
+ let obj = self.repo.find_object(revision.0, None)?;
+ Ok(GitShortID(obj.short_id()?))
+ }
+
+ pub fn has_ref(&self, reference: &str) -> CargoResult<()> {
+ self.repo.revparse_single(reference)?;
+ Ok(())
+ }
+}
+
+impl<'a> GitCheckout<'a> {
+ fn new(path: &Path, database: &'a GitDatabase, revision: GitRevision,
+ repo: git2::Repository)
+ -> GitCheckout<'a>
+ {
+ GitCheckout {
+ location: path.to_path_buf(),
+ database: database,
+ revision: revision,
+ repo: repo,
+ }
+ }
+
+ fn clone_into(into: &Path,
+ database: &'a GitDatabase,
+ revision: GitRevision,
+ config: &Config)
+ -> CargoResult<GitCheckout<'a>>
+ {
+ let dirname = into.parent().unwrap();
+ fs::create_dir_all(&dirname).chain_err(|| {
+ format!("Couldn't mkdir {}", dirname.display())
+ })?;
+ if fs::metadata(&into).is_ok() {
+ fs::remove_dir_all(into).chain_err(|| {
+ format!("Couldn't rmdir {}", into.display())
+ })?;
+ }
+ let repo = git2::Repository::init(into)?;
+ let mut checkout = GitCheckout::new(into, database, revision, repo);
+ checkout.fetch(config)?;
+ checkout.reset(config)?;
+ Ok(checkout)
+ }
+
+ fn is_fresh(&self) -> bool {
+ match self.repo.revparse_single("HEAD") {
+ Ok(ref head) if head.id() == self.revision.0 => {
+ // See comments in reset() for why we check this
+ fs::metadata(self.location.join(".cargo-ok")).is_ok()
+ }
+ _ => false,
+ }
+ }
+
+ fn fetch(&mut self, cargo_config: &Config) -> CargoResult<()> {
+ info!("fetch {}", self.repo.path().display());
+ let url = self.database.path.to_url()?;
+ let refspec = "refs/heads/*:refs/heads/*";
+ fetch(&mut self.repo, &url, refspec, cargo_config)?;
+ Ok(())
+ }
+
+ fn reset(&self, config: &Config) -> CargoResult<()> {
+ // If we're interrupted while performing this reset (e.g. we die because
+ // of a signal) Cargo needs to be sure to try to check out this repo
+ // again on the next go-round.
+ //
+ // To enable this we have a dummy file in our checkout, .cargo-ok, which
+ // if present means that the repo has been successfully reset and is
+ // ready to go. Hence if we start to do a reset, we make sure this file
+ // *doesn't* exist, and then once we're done we create the file.
+ let ok_file = self.location.join(".cargo-ok");
+ let _ = fs::remove_file(&ok_file);
+ info!("reset {} to {}", self.repo.path().display(), self.revision);
+ let object = self.repo.find_object(self.revision.0, None)?;
+ reset(&self.repo, &object, config)?;
+ File::create(ok_file)?;
+ Ok(())
+ }
+
+ fn update_submodules(&self, cargo_config: &Config) -> CargoResult<()> {
+ return update_submodules(&self.repo, cargo_config);
+
+ fn update_submodules(repo: &git2::Repository, cargo_config: &Config) -> CargoResult<()> {
+ info!("update submodules for: {:?}", repo.workdir().unwrap());
+
+ for mut child in repo.submodules()? {
+ update_submodule(repo, &mut child, cargo_config)
+ .map_err(CargoError::into_internal)
+ .chain_err(|| {
+ format!("failed to update submodule `{}`",
+ child.name().unwrap_or(""))
+ })?;
+ }
+ Ok(())
+ }
+
+ fn update_submodule(parent: &git2::Repository,
+ child: &mut git2::Submodule,
+ cargo_config: &Config) -> CargoResult<()> {
+ child.init(false)?;
+ let url = child.url().ok_or_else(|| {
+ internal("non-utf8 url for submodule")
+ })?;
+
+ // A submodule which is listed in .gitmodules but not actually
+ // checked out will not have a head id, so we should ignore it.
+ let head = match child.head_id() {
+ Some(head) => head,
+ None => return Ok(()),
+ };
+
+ // If the submodule hasn't been checked out yet, we need to
+ // clone it. If it has been checked out and the head is the same
+ // as the submodule's head, then we can bail out and go to the
+ // next submodule.
+ let head_and_repo = child.open().and_then(|repo| {
+ let target = repo.head()?.target();
+ Ok((target, repo))
+ });
+ let mut repo = match head_and_repo {
+ Ok((head, repo)) => {
+ if child.head_id() == head {
+ return Ok(())
+ }
+ repo
+ }
+ Err(..) => {
+ let path = parent.workdir().unwrap().join(child.path());
+ let _ = fs::remove_dir_all(&path);
+ git2::Repository::init(&path)?
+ }
+ };
+
+ // Fetch data from origin and reset to the head commit
+ let refspec = "refs/heads/*:refs/heads/*";
+ let url = url.to_url()?;
+ fetch(&mut repo, &url, refspec, cargo_config).chain_err(|| {
+ internal(format!("failed to fetch submodule `{}` from {}",
+ child.name().unwrap_or(""), url))
+ })?;
+
+ let obj = repo.find_object(head, None)?;
+ reset(&repo, &obj, cargo_config)?;
+ update_submodules(&repo, cargo_config)
+ }
+ }
+}
+
+/// Prepare the authentication callbacks for cloning a git repository.
+///
+/// The main purpose of this function is to construct the "authentication
+/// callback" which is used to clone a repository. This callback will attempt to
+/// find the right authentication on the system (without user input) and will
+/// guide libgit2 in doing so.
+///
+/// The callback is provided `allowed` types of credentials, and we try to do as
+/// much as possible based on that:
+///
+/// * Prioritize SSH keys from the local ssh agent as they're likely the most
+/// reliable. The username here is prioritized from the credential
+/// callback, then from whatever is configured in git itself, and finally
+/// we fall back to the generic user of `git`.
+///
+/// * If a username/password is allowed, then we fall back to git2-rs's
+/// implementation of the credential helper. This is what is configured
+/// with `credential.helper` in git, and is the interface for the OSX
+/// keychain, for example.
+///
+/// * After the above two have failed, we just kinda grapple attempting to
+/// return *something*.
+///
+/// If any form of authentication fails, libgit2 will repeatedly ask us for
+/// credentials until we give it a reason to not do so. To ensure we don't
+/// just sit here looping forever we keep track of authentications we've
+/// attempted and we don't try the same ones again.
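+///
+/// A minimal usage sketch (mirroring `fetch` below; the closure body is
+/// illustrative):
+///
+/// ```ignore
+/// with_authentication(url.as_str(), &repo.config()?, |creds| {
+///     let mut cb = git2::RemoteCallbacks::new();
+///     cb.credentials(creds);
+///     // ... run the network operation with `cb` attached ...
+///     Ok(())
+/// })?;
+/// ```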
+fn with_authentication<T, F>(url: &str, cfg: &git2::Config, mut f: F)
+ -> CargoResult<T>
+ where F: FnMut(&mut git2::Credentials) -> CargoResult<T>
+{
+ let mut cred_helper = git2::CredentialHelper::new(url);
+ cred_helper.config(cfg);
+
+ let mut ssh_username_requested = false;
+ let mut cred_helper_bad = None;
+ let mut ssh_agent_attempts = Vec::new();
+ let mut any_attempts = false;
+ let mut tried_sshkey = false;
+
+ let mut res = f(&mut |url, username, allowed| {
+ any_attempts = true;
+ // libgit2's "USERNAME" authentication actually means that it's just
+ // asking us for a username to keep going. This is currently only really
+ // used for SSH authentication and isn't really an authentication type.
+ // The logic currently looks like:
+ //
+ // let user = ...;
+ // if (user.is_null())
+ // user = callback(USERNAME, null, ...);
+ //
+ // callback(SSH_KEY, user, ...)
+ //
+ // So if we're being called here then we know that (a) we're using ssh
+ // authentication and (b) no username was specified in the URL that
+ // we're trying to clone. We need to guess an appropriate username here,
+ // but that may involve a few attempts. Unfortunately we can't switch
+ // usernames during one authentication session with libgit2, so to
+ // handle this we bail out of this authentication session after setting
+ // the flag `ssh_username_requested`, and then we handle this below.
+ if allowed.contains(git2::USERNAME) {
+ debug_assert!(username.is_none());
+ ssh_username_requested = true;
+ return Err(git2::Error::from_str("gonna try usernames later"))
+ }
+
+ // An "SSH_KEY" authentication indicates that we need some sort of SSH
+ // authentication. This can currently either come from the ssh-agent
+ // process or from a raw in-memory SSH key. Cargo only supports using
+ // ssh-agent currently.
+ //
+ // If we get called with this then the only way that should be possible
+ // is if a username is specified in the URL itself (e.g. `username` is
+ // Some), hence the unwrap() here. We try custom usernames down below.
+ if allowed.contains(git2::SSH_KEY) && !tried_sshkey {
+ // If ssh-agent authentication fails, libgit2 will keep
+ // calling this callback asking for other authentication
+ // methods to try. Make sure we only try ssh-agent once,
+ // to avoid looping forever.
+ tried_sshkey = true;
+ let username = username.unwrap();
+ debug_assert!(!ssh_username_requested);
+ ssh_agent_attempts.push(username.to_string());
+ return git2::Cred::ssh_key_from_agent(username)
+ }
+
+ // Sometimes libgit2 will ask for a username/password in plaintext. This
+ // is where Cargo would have an interactive prompt if we supported it,
+ // but we currently don't! Right now the only way we support fetching a
+ // plaintext password is through the `credential.helper` support, so
+ // fetch that here.
+ if allowed.contains(git2::USER_PASS_PLAINTEXT) {
+ let r = git2::Cred::credential_helper(cfg, url, username);
+ cred_helper_bad = Some(r.is_err());
+ return r
+ }
+
+ // I'm... not sure what the DEFAULT kind of authentication is, but seems
+ // easy to support?
+ if allowed.contains(git2::DEFAULT) {
+ return git2::Cred::default()
+ }
+
+ // Whelp, we tried our best
+ Err(git2::Error::from_str("no authentication available"))
+ });
+
+ // Ok, so if it looks like we're going to be doing ssh authentication, we
+ // want to try a few different usernames as one wasn't specified in the URL
+ // for us to use. In order, we'll try:
+ //
+ // * A credential helper's username for this URL, if available.
+ // * This account's username.
+ // * "git"
+ //
+ // We have to restart the authentication session each time (due to
+ // constraints in libssh2 I guess? maybe this is inherent to ssh?), so we
+ // call our callback, `f`, in a loop here.
+ if ssh_username_requested {
+ debug_assert!(res.is_err());
+ let mut attempts = Vec::new();
+ attempts.push("git".to_string());
+ if let Ok(s) = env::var("USER").or_else(|_| env::var("USERNAME")) {
+ attempts.push(s);
+ }
+ if let Some(ref s) = cred_helper.username {
+ attempts.push(s.clone());
+ }
+
+ while let Some(s) = attempts.pop() {
+ // We should get `USERNAME` first, where we just return our attempt,
+ // and then after that we should get `SSH_KEY`. If the first attempt
+ // fails we'll get called again, but we don't have another option so
+ // we bail out.
+ let mut attempts = 0;
+ res = f(&mut |_url, username, allowed| {
+ if allowed.contains(git2::USERNAME) {
+ return git2::Cred::username(&s);
+ }
+ if allowed.contains(git2::SSH_KEY) {
+ debug_assert_eq!(Some(&s[..]), username);
+ attempts += 1;
+ if attempts == 1 {
+ ssh_agent_attempts.push(s.to_string());
+ return git2::Cred::ssh_key_from_agent(&s)
+ }
+ }
+ Err(git2::Error::from_str("no authentication available"))
+ });
+
+ // If we made two attempts then that means:
+ //
+ // 1. A username was requested, we returned `s`.
+ // 2. An ssh key was requested, we returned to look up `s` in the
+ // ssh agent.
+ // 3. For whatever reason that lookup failed, so we were asked again
+ // for another mode of authentication.
+ //
+ // Essentially, if `attempts == 2` then in theory the only error was
+ // that this username failed to authenticate (e.g. no other network
+ // errors happened). Otherwise something else is funny so we bail
+ // out.
+ if attempts != 2 {
+ break
+ }
+ }
+ }
+
+ if res.is_ok() || !any_attempts {
+ return res.map_err(From::from)
+ }
+
+ // In the case of an authentication failure (where we tried something)
+ // we try to give a more helpful error message about precisely what we
+ // tried.
+ res.map_err(CargoError::from).map_err(|e| e.into_internal()).chain_err(|| {
+ let mut msg = "failed to authenticate when downloading \
+ repository".to_string();
+ if !ssh_agent_attempts.is_empty() {
+ let names = ssh_agent_attempts.iter()
+ .map(|s| format!("`{}`", s))
+ .collect::<Vec<_>>()
+ .join(", ");
+ msg.push_str(&format!("\nattempted ssh-agent authentication, but \
+ none of the usernames {} succeeded", names));
+ }
+ if let Some(failed_cred_helper) = cred_helper_bad {
+ if failed_cred_helper {
+ msg.push_str("\nattempted to find username/password via \
+ git's `credential.helper` support, but failed");
+ } else {
+ msg.push_str("\nattempted to find username/password via \
+ `credential.helper`, but maybe the found \
+ credentials were incorrect");
+ }
+ }
+ msg
+ })
+}
+
+fn reset(repo: &git2::Repository,
+ obj: &git2::Object,
+ config: &Config) -> CargoResult<()> {
+ let mut pb = Progress::new("Checkout", config);
+ let mut opts = git2::build::CheckoutBuilder::new();
+ opts.progress(|_, cur, max| {
+ drop(pb.tick(cur, max));
+ });
+ repo.reset(obj, git2::ResetType::Hard, Some(&mut opts))?;
+ Ok(())
+}
+
+pub fn fetch(repo: &mut git2::Repository,
+ url: &Url,
+ refspec: &str,
+ config: &Config) -> CargoResult<()> {
+ if !config.network_allowed() {
+ bail!("attempting to update a git repository, but --frozen \
+ was specified")
+ }
+
+ // If we're fetching from github, attempt github's special fast path for
+ // testing if we've already got an up-to-date copy of the repository
+ if url.host_str() == Some("github.com") {
+ if let Ok(oid) = repo.refname_to_id("refs/remotes/origin/master") {
+ let mut handle = config.http()?.borrow_mut();
+ debug!("attempting github fast path for {}", url);
+ if github_up_to_date(&mut handle, url, &oid) {
+ return Ok(())
+ } else {
+ debug!("fast path failed, falling back to a git fetch");
+ }
+ }
+ }
+
+ // We reuse repositories quite a lot, so before we go through and update the
+ // repo, check to see if it's a little too old and could benefit from a gc.
+ // In theory this shouldn't be too expensive compared to the network
+ // request we're about to issue.
+ maybe_gc_repo(repo)?;
+
+ debug!("doing a fetch for {}", url);
+ let mut progress = Progress::new("Fetch", config);
+ with_authentication(url.as_str(), &repo.config()?, |f| {
+ let mut cb = git2::RemoteCallbacks::new();
+ cb.credentials(f);
+
+ cb.transfer_progress(|stats| {
+ progress.tick(stats.indexed_objects(), stats.total_objects()).is_ok()
+ });
+
+ // Create a local anonymous remote in the repository to fetch the url
+ let mut remote = repo.remote_anonymous(url.as_str())?;
+ let mut opts = git2::FetchOptions::new();
+ opts.remote_callbacks(cb)
+ .download_tags(git2::AutotagOption::All);
+
+ network::with_retry(config, || {
+ debug!("initiating fetch of {} from {}", refspec, url);
+ remote.fetch(&[refspec], Some(&mut opts), None)
+ .map_err(CargoError::from)
+ })?;
+ Ok(())
+ })
+}
+
+/// Cargo has a bunch of long-lived git repositories in its global cache and
+/// some, like the index, are updated very frequently. Right now each update
+/// creates a new "pack file" inside the git database, and over time this can
+/// cause bad performance and incorrect behavior in libgit2.
+///
+/// One pathological use case today is where libgit2 opens hundreds of file
+/// descriptors, getting us dangerously close to blowing out the OS limits of
+/// how many fds we can have open. This is detailed in #4403.
+///
+/// To try to combat this problem we attempt a `git gc` here. Note, though, that
+/// we may not even have `git` installed on the system! As a result we
+/// opportunistically try a `git gc` when the pack directory looks too big, and
+/// failing that we just blow away the repository and start over.
+fn maybe_gc_repo(repo: &mut git2::Repository) -> CargoResult<()> {
+ // Here we arbitrarily declare that if you have more than 100 files in your
+ // `pack` folder then we need to do a gc.
+ let entries = match repo.path().join("objects/pack").read_dir() {
+ Ok(e) => e.count(),
+ Err(_) => {
+ debug!("skipping gc as pack dir appears gone");
+ return Ok(())
+ }
+ };
+ let max = env::var("__CARGO_PACKFILE_LIMIT").ok()
+ .and_then(|s| s.parse::<usize>().ok())
+ .unwrap_or(100);
+ if entries < max {
+ debug!("skipping gc as there's only {} pack files", entries);
+ return Ok(())
+ }
+
+ // First up, try a literal `git gc` by shelling out to git. This is pretty
+ // likely to fail though as we may not have `git` installed. Note that
+ // libgit2 doesn't currently implement the gc operation, so there's no
+ // equivalent there.
+ match Command::new("git").arg("gc").current_dir(repo.path()).output() {
+ Ok(out) => {
+ debug!("git-gc status: {}\n\nstdout ---\n{}\nstderr ---\n{}",
+ out.status,
+ String::from_utf8_lossy(&out.stdout),
+ String::from_utf8_lossy(&out.stderr));
+ if out.status.success() {
+ let new = git2::Repository::open(repo.path())?;
+ mem::replace(repo, new);
+ return Ok(())
+ }
+ }
+ Err(e) => debug!("git-gc failed to spawn: {}", e),
+ }
+
+ // Alright all else failed, let's start over.
+ //
+ // Here we want to drop the current repository object pointed to by `repo`,
+ // so we initialize a temporary repository in a sub-folder, blow away the
+ // existing git folder, and then recreate the git repo. Finally we blow away
+ // the `tmp` folder we allocated.
+ let path = repo.path().to_path_buf();
+ let tmp = path.join("tmp");
+ mem::replace(repo, git2::Repository::init(&tmp)?);
+ for entry in path.read_dir()? {
+ let entry = entry?;
+ if entry.file_name().to_str() == Some("tmp") {
+ continue
+ }
+ let path = entry.path();
+ drop(fs::remove_file(&path).or_else(|_| fs::remove_dir_all(&path)));
+ }
+ if repo.is_bare() {
+ mem::replace(repo, git2::Repository::init_bare(path)?);
+ } else {
+ mem::replace(repo, git2::Repository::init(path)?);
+ }
+ fs::remove_dir_all(&tmp).chain_err(|| {
+ format!("failed to remove {:?}", tmp)
+ })?;
+ Ok(())
+}
+
+/// Updating the index is done pretty regularly so we want it to be as fast as
+/// possible. For registries hosted on github (like the crates.io index) there's
+/// a fast path available [1] that tells us whether there are any updates to be
+/// made.
+///
+/// This function will attempt to hit that fast path and verify that the `oid`
+/// is actually the current `master` branch of the repository. If `true` is
+/// returned then no update needs to be performed, but if `false` is returned
+/// then the standard update logic still needs to happen.
+///
+/// [1]: https://developer.github.com/v3/repos/commits/#get-the-sha-1-of-a-commit-reference
+///
+/// Note that this function should never cause an actual failure because it's
+/// just a fast path. As a result all errors are ignored in this function and we
+/// just return a `bool`. Any real errors will be reported through the normal
+/// update path above.
+fn github_up_to_date(handle: &mut Easy, url: &Url, oid: &git2::Oid) -> bool {
+ macro_rules! try {
+ ($e:expr) => (match $e {
+ Some(e) => e,
+ None => return false,
+ })
+ }
+
+ // This expects github urls in the form `github.com/user/repo` and nothing
+ // else
+ let mut pieces = try!(url.path_segments());
+ let username = try!(pieces.next());
+ let repo = try!(pieces.next());
+ if pieces.next().is_some() {
+ return false
+ }
+
+ let url = format!("https://api.github.com/repos/{}/{}/commits/master",
+ username, repo);
+ try!(handle.get(true).ok());
+ try!(handle.url(&url).ok());
+ try!(handle.useragent("cargo").ok());
+ let mut headers = List::new();
+ try!(headers.append("Accept: application/vnd.github.3.sha").ok());
+ try!(headers.append(&format!("If-None-Match: \"{}\"", oid)).ok());
+ try!(handle.http_headers(headers).ok());
+ try!(handle.perform().ok());
+
+ try!(handle.response_code().ok()) == 304
+}
--- /dev/null
+pub use self::config::SourceConfigMap;
+pub use self::directory::DirectorySource;
+pub use self::git::GitSource;
+pub use self::path::PathSource;
+pub use self::registry::{RegistrySource, CRATES_IO};
+pub use self::replaced::ReplacedSource;
+
+pub mod config;
+pub mod directory;
+pub mod git;
+pub mod path;
+pub mod registry;
+pub mod replaced;
--- /dev/null
+use std::fmt::{self, Debug, Formatter};
+use std::fs;
+use std::path::{Path, PathBuf};
+
+use filetime::FileTime;
+use git2;
+use glob::Pattern;
+use ignore::Match;
+use ignore::gitignore::GitignoreBuilder;
+
+use core::{Package, PackageId, Summary, SourceId, Source, Dependency, Registry};
+use ops;
+use util::{self, CargoError, CargoResult, internal};
+use util::Config;
+
+pub struct PathSource<'cfg> {
+ source_id: SourceId,
+ path: PathBuf,
+ updated: bool,
+ packages: Vec<Package>,
+ config: &'cfg Config,
+ recursive: bool,
+}
+
+impl<'cfg> PathSource<'cfg> {
+ /// Invoked with an absolute path to a directory that contains a Cargo.toml.
+ ///
+ /// This source will only return the package at precisely the `path`
+ /// specified, and it will be an error if there's not a package at `path`.
+ pub fn new(path: &Path, id: &SourceId, config: &'cfg Config)
+ -> PathSource<'cfg> {
+ PathSource {
+ source_id: id.clone(),
+ path: path.to_path_buf(),
+ updated: false,
+ packages: Vec::new(),
+ config: config,
+ recursive: false,
+ }
+ }
+
+ /// Creates a new source which is walked recursively to discover packages.
+ ///
+ /// This is similar to the `new` method except that instead of requiring a
+ /// valid package to be present at `root`, the folder is walked entirely to
+ /// crawl for packages.
+ ///
+ /// Note that this should be used with care and likely shouldn't be chosen
+ /// by default!
+ pub fn new_recursive(root: &Path, id: &SourceId, config: &'cfg Config)
+ -> PathSource<'cfg> {
+ PathSource {
+ recursive: true,
+ .. PathSource::new(root, id, config)
+ }
+ }
+
+ pub fn root_package(&mut self) -> CargoResult<Package> {
+ trace!("root_package; source={:?}", self);
+
+ self.update()?;
+
+ match self.packages.iter().find(|p| p.root() == &*self.path) {
+ Some(pkg) => Ok(pkg.clone()),
+ None => Err(internal("no package found in source"))
+ }
+ }
+
+ pub fn read_packages(&self) -> CargoResult<Vec<Package>> {
+ if self.updated {
+ Ok(self.packages.clone())
+ } else if self.recursive {
+ ops::read_packages(&self.path, &self.source_id, self.config)
+ } else {
+ let path = self.path.join("Cargo.toml");
+ let (pkg, _) = ops::read_package(&path, &self.source_id, self.config)?;
+ Ok(vec![pkg])
+ }
+ }
+
+ /// List all files relevant to building this package inside this source.
+ ///
+ /// This function will use the appropriate methods to determine the
+ /// set of files underneath this source's directory which are relevant for
+ /// building `pkg`.
+ ///
+ /// The basic assumption of this method is that all files in the directory
+ /// are relevant for building this package, but it also contains logic to
+ /// use other methods like .gitignore to filter the list of files.
+ ///
+ /// ## Pattern matching strategy
+ ///
+ /// We are migrating from glob-like pattern matching (using the `glob`
+ /// crate) to gitignore-like pattern matching (using the `ignore` crate).
+ /// The migration stages are:
+ ///
+ /// 1) Only warn users about the future change iff their matching rules are
+ /// affected. (CURRENT STAGE)
+ ///
+ /// 2) Switch to the new strategy and update documents. Still keep warning
+ /// affected users.
+ ///
+ /// 3) Drop the old strategy and emit no more warnings.
+ ///
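+ /// For example, under the old glob strategy a leading `/` in a rule is
+ /// simply stripped (see `glob_parse` below), while the `ignore` crate
+ /// treats a leading `/` as anchoring the rule to the package root;
+ /// likewise `!` has no special meaning to `glob`, but denotes
+ /// whitelisting in gitignore syntax (which is why such rules are
+ /// rejected below).
+ ///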
+ /// See <https://github.com/rust-lang/cargo/issues/4268> for more info.
+ pub fn list_files(&self, pkg: &Package) -> CargoResult<Vec<PathBuf>> {
+ let root = pkg.root();
+ let no_include_option = pkg.manifest().include().is_empty();
+
+ // glob-like matching rules
+
+ let glob_parse = |p: &String| {
+ let pattern: &str = if p.starts_with('/') {
+ &p[1..p.len()]
+ } else {
+ p
+ };
+ Pattern::new(pattern).map_err(|e| {
+ CargoError::from(format!("could not parse glob pattern `{}`: {}", p, e))
+ })
+ };
+
+ let glob_exclude = pkg.manifest()
+ .exclude()
+ .iter()
+ .map(|p| glob_parse(p))
+ .collect::<Result<Vec<_>, _>>()?;
+
+ let glob_include = pkg.manifest()
+ .include()
+ .iter()
+ .map(|p| glob_parse(p))
+ .collect::<Result<Vec<_>, _>>()?;
+
+ let glob_should_package = |relative_path: &Path| -> bool {
+ fn glob_match(patterns: &Vec<Pattern>, relative_path: &Path) -> bool {
+ patterns.iter().any(|pattern| pattern.matches_path(relative_path))
+ }
+
+ // include and exclude options are mutually exclusive.
+ if no_include_option {
+ !glob_match(&glob_exclude, relative_path)
+ } else {
+ glob_match(&glob_include, relative_path)
+ }
+ };
+
+ // ignore-like matching rules
+
+ let mut exclude_builder = GitignoreBuilder::new(root);
+ for rule in pkg.manifest().exclude() {
+ exclude_builder.add_line(None, rule)?;
+ }
+ let ignore_exclude = exclude_builder.build()?;
+
+ let mut include_builder = GitignoreBuilder::new(root);
+ for rule in pkg.manifest().include() {
+ include_builder.add_line(None, rule)?;
+ }
+ let ignore_include = include_builder.build()?;
+
+ let ignore_should_package = |relative_path: &Path| -> CargoResult<bool> {
+ // include and exclude options are mutually exclusive.
+ if no_include_option {
+ match ignore_exclude.matched_path_or_any_parents(
+ relative_path,
+ /* is_dir */ false,
+ ) {
+ Match::None => Ok(true),
+ Match::Ignore(_) => Ok(false),
+ Match::Whitelist(pattern) => Err(CargoError::from(format!(
+ "exclude rules cannot start with `!`: {}",
+ pattern.original()
+ ))),
+ }
+ } else {
+ match ignore_include.matched_path_or_any_parents(
+ relative_path,
+ /* is_dir */ false,
+ ) {
+ Match::None => Ok(false),
+ Match::Ignore(_) => Ok(true),
+ Match::Whitelist(pattern) => Err(CargoError::from(format!(
+ "include rules cannot start with `!`: {}",
+ pattern.original()
+ ))),
+ }
+ }
+ };
+
+ // matching to paths
+
+ let mut filter = |path: &Path| -> CargoResult<bool> {
+ let relative_path = util::without_prefix(path, root).unwrap();
+ let glob_should_package = glob_should_package(relative_path);
+ let ignore_should_package = ignore_should_package(relative_path)?;
+
+ if glob_should_package != ignore_should_package {
+ if glob_should_package {
+ if no_include_option {
+ self.config
+ .shell()
+ .warn(format!(
+ "Pattern matching for Cargo's include/exclude fields is changing and \
+ file `{}` WILL be excluded in a future Cargo version.\n\
+ See https://github.com/rust-lang/cargo/issues/4268 for more info",
+ relative_path.display()
+ ))?;
+ } else {
+ self.config
+ .shell()
+ .warn(format!(
+ "Pattern matching for Cargo's include/exclude fields is changing and \
+ file `{}` WILL NOT be included in a future Cargo version.\n\
+ See https://github.com/rust-lang/cargo/issues/4268 for more info",
+ relative_path.display()
+ ))?;
+ }
+ } else if no_include_option {
+ self.config
+ .shell()
+ .warn(format!(
+ "Pattern matching for Cargo's include/exclude fields is changing and \
+ file `{}` WILL NOT be excluded in a future Cargo version.\n\
+ See https://github.com/rust-lang/cargo/issues/4268 for more info",
+ relative_path.display()
+ ))?;
+ } else {
+ self.config
+ .shell()
+ .warn(format!(
+ "Pattern matching for Cargo's include/exclude fields is changing and \
+ file `{}` WILL be included in a future Cargo version.\n\
+ See https://github.com/rust-lang/cargo/issues/4268 for more info",
+ relative_path.display()
+ ))?;
+ }
+ }
+
+ // Update to ignore_should_package for Stage 2
+ Ok(glob_should_package)
+ };
+
+ // attempt git-prepopulate only if no `include` (rust-lang/cargo#4135)
+ if no_include_option {
+ if let Some(result) = self.discover_git_and_list_files(pkg, root, &mut filter) {
+ return result;
+ }
+ }
+ self.list_files_walk(pkg, &mut filter)
+ }
+
+ // Returns Some(_) if a sibling Cargo.toml and .git folder are found;
+ // otherwise the caller should fall back on the full file list.
+ fn discover_git_and_list_files(&self,
+ pkg: &Package,
+ root: &Path,
+ filter: &mut FnMut(&Path) -> CargoResult<bool>)
+ -> Option<CargoResult<Vec<PathBuf>>> {
+ // If this package is in a git repository, then we really do want to
+ // query the git repository as it takes into account items such as
+ // .gitignore. We're not quite sure where the git repository is,
+ // however, so we do a bit of a probe.
+ //
+ // We walk this package's path upwards and look for a sibling
+ // Cargo.toml and .git folder. If we find one then we assume that we're
+ // part of that repository.
+ let mut cur = root;
+ loop {
+ if cur.join("Cargo.toml").is_file() {
+ // If we find a git repository next to this Cargo.toml, we still
+ // check to see if we are indeed part of the index. If not, then
+ // this is likely an unrelated git repo, so keep going.
+ if let Ok(repo) = git2::Repository::open(cur) {
+ let index = match repo.index() {
+ Ok(index) => index,
+ Err(err) => return Some(Err(err.into())),
+ };
+ let path = util::without_prefix(root, cur)
+ .unwrap().join("Cargo.toml");
+ if index.get_path(&path, 0).is_some() {
+ return Some(self.list_files_git(pkg, repo, filter));
+ }
+ }
+ }
+ // don't cross submodule boundaries
+ if cur.join(".git").is_dir() {
+ break
+ }
+ match cur.parent() {
+ Some(parent) => cur = parent,
+ None => break,
+ }
+ }
+ None
+ }
+
+ fn list_files_git(&self, pkg: &Package, repo: git2::Repository,
+ filter: &mut FnMut(&Path) -> CargoResult<bool>)
+ -> CargoResult<Vec<PathBuf>> {
+ warn!("list_files_git {}", pkg.package_id());
+ let index = repo.index()?;
+ let root = repo.workdir().ok_or_else(|| {
+ internal("Can't list files on a bare repository.")
+ })?;
+ let pkg_path = pkg.root();
+
+ let mut ret = Vec::<PathBuf>::new();
+
+ // We use information from the git repository to guide us in traversing
+ // its tree. The primary purpose of this is to take advantage of the
+ // .gitignore and auto-ignore files that don't matter.
+ //
+ // Here we're also careful to look at both tracked and untracked files as
+ // the untracked files are often part of a build and may become relevant
+ // as part of a future commit.
+ let index_files = index.iter().map(|entry| {
+ use libgit2_sys::GIT_FILEMODE_COMMIT;
+ let is_dir = entry.mode == GIT_FILEMODE_COMMIT as u32;
+ (join(root, &entry.path), Some(is_dir))
+ });
+ let mut opts = git2::StatusOptions::new();
+ opts.include_untracked(true);
+ if let Some(suffix) = util::without_prefix(pkg_path, root) {
+ opts.pathspec(suffix);
+ }
+ let statuses = repo.statuses(Some(&mut opts))?;
+ let untracked = statuses.iter().filter_map(|entry| {
+ match entry.status() {
+ git2::STATUS_WT_NEW => Some((join(root, entry.path_bytes()), None)),
+ _ => None,
+ }
+ });
+
+ let mut subpackages_found = Vec::new();
+
+ for (file_path, is_dir) in index_files.chain(untracked) {
+ let file_path = file_path?;
+
+ // Filter out files blatantly outside this package. This is helped a
+ // bit above via the `pathspec` function call, but we need to filter
+ // the entries in the index as well.
+ if !file_path.starts_with(pkg_path) {
+ continue
+ }
+
+ match file_path.file_name().and_then(|s| s.to_str()) {
+ // Filter out Cargo.lock and target always, we don't want to
+ // package a lock file no one will ever read and we also avoid
+ // build artifacts
+ Some("Cargo.lock") |
+ Some("target") => continue,
+
+ // Keep track of all sub-packages found and also strip out all
+ // matches we've found so far. Note, though, that if we find
+ // our own `Cargo.toml` we keep going.
+ Some("Cargo.toml") => {
+ let path = file_path.parent().unwrap();
+ if path != pkg_path {
+ warn!("subpackage found: {}", path.display());
+ ret.retain(|p| !p.starts_with(path));
+ subpackages_found.push(path.to_path_buf());
+ continue
+ }
+ }
+
+ _ => {}
+ }
+
+ // If this file is part of any other sub-package we've found so far,
+ // skip it.
+ if subpackages_found.iter().any(|p| file_path.starts_with(p)) {
+ continue
+ }
+
+ if is_dir.unwrap_or_else(|| file_path.is_dir()) {
+ warn!(" found submodule {}", file_path.display());
+ let rel = util::without_prefix(&file_path, root).unwrap();
+ let rel = rel.to_str().ok_or_else(|| {
+ CargoError::from(format!("invalid utf-8 filename: {}", rel.display()))
+ })?;
+ // Git submodules are currently only named through `/` path
+ // separators, explicitly not `\` which windows uses. Who knew?
+ let rel = rel.replace(r"\", "/");
+ match repo.find_submodule(&rel).and_then(|s| s.open()) {
+ Ok(repo) => {
+ let files = self.list_files_git(pkg, repo, filter)?;
+ ret.extend(files.into_iter());
+ }
+ Err(..) => {
+ PathSource::walk(&file_path, &mut ret, false, filter)?;
+ }
+ }
+ } else if (*filter)(&file_path)? {
+ // We found a file!
+ warn!(" found {}", file_path.display());
+ ret.push(file_path);
+ }
+ }
+ return Ok(ret);
+
+ #[cfg(unix)]
+ fn join(path: &Path, data: &[u8]) -> CargoResult<PathBuf> {
+ use std::os::unix::prelude::*;
+ use std::ffi::OsStr;
+ Ok(path.join(<OsStr as OsStrExt>::from_bytes(data)))
+ }
+ #[cfg(windows)]
+ fn join(path: &Path, data: &[u8]) -> CargoResult<PathBuf> {
+ use std::str;
+ match str::from_utf8(data) {
+ Ok(s) => Ok(path.join(s)),
+ Err(..) => Err(internal("cannot process path in git with a non \
+ unicode filename")),
+ }
+ }
+ }
+
+ fn list_files_walk(&self, pkg: &Package, filter: &mut FnMut(&Path) -> CargoResult<bool>)
+ -> CargoResult<Vec<PathBuf>> {
+ let mut ret = Vec::new();
+ PathSource::walk(pkg.root(), &mut ret, true, filter)?;
+ Ok(ret)
+ }
+
+ fn walk(path: &Path, ret: &mut Vec<PathBuf>,
+ is_root: bool, filter: &mut FnMut(&Path) -> CargoResult<bool>)
+ -> CargoResult<()>
+ {
+ if !fs::metadata(&path).map(|m| m.is_dir()).unwrap_or(false) {
+ if (*filter)(path)? {
+ ret.push(path.to_path_buf());
+ }
+ return Ok(())
+ }
+ // Don't recurse into any sub-packages that we have
+ if !is_root && fs::metadata(&path.join("Cargo.toml")).is_ok() {
+ return Ok(())
+ }
+
+ // For package integration tests, we need to sort the paths in a deterministic order to
+ // be able to match stdout warnings in the same order.
+ //
+ // TODO: Drop collect and sort after transition period and dropping warning tests.
+ // See <https://github.com/rust-lang/cargo/issues/4268>
+ // and <https://github.com/rust-lang/cargo/pull/4270>
+ let mut entries: Vec<fs::DirEntry> = fs::read_dir(path)?.map(|e| e.unwrap()).collect();
+ entries.sort_by(|a, b| a.path().as_os_str().cmp(b.path().as_os_str()));
+ for entry in entries {
+ let path = entry.path();
+ let name = path.file_name().and_then(|s| s.to_str());
+ // Skip dotfile directories
+ if name.map(|s| s.starts_with('.')) == Some(true) {
+ continue
+ } else if is_root {
+ // Skip cargo artifacts
+ match name {
+ Some("target") | Some("Cargo.lock") => continue,
+ _ => {}
+ }
+ }
+ PathSource::walk(&path, ret, false, filter)?;
+ }
+ Ok(())
+ }
+}
+
+impl<'cfg> Debug for PathSource<'cfg> {
+ fn fmt(&self, f: &mut Formatter) -> fmt::Result {
+ write!(f, "the paths source")
+ }
+}
+
+impl<'cfg> Registry for PathSource<'cfg> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ for s in self.packages.iter().map(|p| p.summary()) {
+ if dep.matches(s) {
+ f(s.clone())
+ }
+ }
+ Ok(())
+ }
+
+ fn supports_checksums(&self) -> bool {
+ false
+ }
+
+ fn requires_precise(&self) -> bool {
+ false
+ }
+}
+
+impl<'cfg> Source for PathSource<'cfg> {
+ fn source_id(&self) -> &SourceId {
+ &self.source_id
+ }
+
+ fn update(&mut self) -> CargoResult<()> {
+ if !self.updated {
+ let packages = self.read_packages()?;
+ self.packages.extend(packages.into_iter());
+ self.updated = true;
+ }
+
+ Ok(())
+ }
+
+ fn download(&mut self, id: &PackageId) -> CargoResult<Package> {
+ trace!("getting packages; id={}", id);
+
+ let pkg = self.packages.iter().find(|pkg| pkg.package_id() == id);
+ pkg.cloned().ok_or_else(|| {
+ internal(format!("failed to find {} in path source", id))
+ })
+ }
+
+ fn fingerprint(&self, pkg: &Package) -> CargoResult<String> {
+ if !self.updated {
+ return Err(internal("BUG: source was not updated"));
+ }
+
+ let mut max = FileTime::zero();
+ let mut max_path = PathBuf::from("");
+ for file in self.list_files(pkg)? {
+ // An fs::stat error here is either because path is a
+ // broken symlink, a permissions error, or a race
+ // condition where this path was rm'ed - either way,
+ // we can ignore the error and treat the path's mtime
+ // as 0.
+ let mtime = fs::metadata(&file).map(|meta| {
+ FileTime::from_last_modification_time(&meta)
+ }).unwrap_or(FileTime::zero());
+ warn!("{} {}", mtime, file.display());
+ if mtime > max {
+ max = mtime;
+ max_path = file;
+ }
+ }
+ trace!("fingerprint {}: {}", self.path.display(), max);
+ Ok(format!("{} ({})", max, max_path.display()))
+ }
+}
--- /dev/null
+use std::collections::HashMap;
+use std::path::Path;
+use std::str;
+
+use serde_json;
+use semver::Version;
+
+use core::dependency::Dependency;
+use core::{SourceId, Summary, PackageId};
+use sources::registry::{RegistryPackage, INDEX_LOCK};
+use sources::registry::RegistryData;
+use util::{CargoError, CargoResult, internal, Filesystem, Config};
+
+pub struct RegistryIndex<'cfg> {
+ source_id: SourceId,
+ path: Filesystem,
+ cache: HashMap<String, Vec<(Summary, bool)>>,
+ hashes: HashMap<String, HashMap<Version, String>>, // (name, vers) => cksum
+ config: &'cfg Config,
+ locked: bool,
+}
+
+impl<'cfg> RegistryIndex<'cfg> {
+ pub fn new(id: &SourceId,
+ path: &Filesystem,
+ config: &'cfg Config,
+ locked: bool)
+ -> RegistryIndex<'cfg> {
+ RegistryIndex {
+ source_id: id.clone(),
+ path: path.clone(),
+ cache: HashMap::new(),
+ hashes: HashMap::new(),
+ config: config,
+ locked: locked,
+ }
+ }
+
+ /// Return the hash listed for a specified PackageId.
+ pub fn hash(&mut self,
+ pkg: &PackageId,
+ load: &mut RegistryData)
+ -> CargoResult<String> {
+ let name = pkg.name();
+ let version = pkg.version();
+ if let Some(s) = self.hashes.get(name).and_then(|v| v.get(version)) {
+ return Ok(s.clone())
+ }
+ // Ok, we're missing the key, so parse the index file to load it.
+ self.summaries(name, load)?;
+ self.hashes.get(name).and_then(|v| v.get(version)).ok_or_else(|| {
+ internal(format!("no hash listed for {}", pkg))
+ }).map(|s| s.clone())
+ }
+
+ /// Parse the on-disk metadata for the package provided
+ ///
+ /// Returns a list of pairs of (summary, yanked) for the package name
+ /// specified.
+ pub fn summaries(&mut self,
+ name: &str,
+ load: &mut RegistryData)
+ -> CargoResult<&Vec<(Summary, bool)>> {
+ if self.cache.contains_key(name) {
+ return Ok(&self.cache[name]);
+ }
+ let summaries = self.load_summaries(name, load)?;
+ self.cache.insert(name.to_string(), summaries);
+ Ok(&self.cache[name])
+ }
+
+ fn load_summaries(&mut self,
+ name: &str,
+ load: &mut RegistryData)
+ -> CargoResult<Vec<(Summary, bool)>> {
+ let (root, _lock) = if self.locked {
+ let lock = self.path.open_ro(Path::new(INDEX_LOCK),
+ self.config,
+ "the registry index");
+ match lock {
+ Ok(lock) => {
+ (lock.path().parent().unwrap().to_path_buf(), Some(lock))
+ }
+ Err(_) => return Ok(Vec::new()),
+ }
+ } else {
+ (self.path.clone().into_path_unlocked(), None)
+ };
+
+ let fs_name = name.chars().flat_map(|c| {
+ c.to_lowercase()
+ }).collect::<String>();
+
+ // see module comment for why this is structured the way it is
+ let path = match fs_name.len() {
+ 1 => format!("1/{}", fs_name),
+ 2 => format!("2/{}", fs_name),
+ 3 => format!("3/{}/{}", &fs_name[..1], fs_name),
+ _ => format!("{}/{}/{}", &fs_name[0..2], &fs_name[2..4], fs_name),
+ };
+ let mut ret = Vec::new();
+ let mut hit_closure = false;
+ let err = load.load(&root, Path::new(&path), &mut |contents| {
+ hit_closure = true;
+ let contents = str::from_utf8(contents).map_err(|_| {
+ CargoError::from("registry index file was not valid utf-8")
+ })?;
+ ret.reserve(contents.lines().count());
+ let lines = contents.lines()
+ .map(|s| s.trim())
+ .filter(|l| !l.is_empty());
+
+ // Attempt forwards-compatibility on the index by ignoring
+ // everything that we ourselves don't understand; that should
+ // allow future cargo implementations to break the
+ // interpretation of each line here and older cargo will simply
+ // ignore the new lines.
+ ret.extend(lines.filter_map(|line| {
+ self.parse_registry_package(line).ok()
+ }));
+
+ Ok(())
+ });
+
+ // We ignore lookup failures as those are just crates which don't exist
+ // or we haven't updated the registry yet. If we actually ran the
+ // closure though then we care about those errors.
+ if hit_closure {
+ err?;
+ }
+
+ Ok(ret)
+ }
+
+ /// Parse a line from the registry's index file into a Summary for a
+ /// package.
+ ///
+ /// The returned boolean is whether or not the summary has been yanked.
+ fn parse_registry_package(&mut self, line: &str)
+ -> CargoResult<(Summary, bool)> {
+ let RegistryPackage {
+ name, vers, cksum, deps, features, yanked
+ } = super::DEFAULT_ID.set(&self.source_id, || {
+ serde_json::from_str::<RegistryPackage>(line)
+ })?;
+ let pkgid = PackageId::new(&name, &vers, &self.source_id)?;
+ let summary = Summary::new(pkgid, deps.inner, features)?;
+ let summary = summary.set_checksum(cksum.clone());
+ if self.hashes.contains_key(&name[..]) {
+ self.hashes.get_mut(&name[..]).unwrap().insert(vers, cksum);
+ } else {
+ self.hashes.entry(name.into_owned())
+ .or_insert_with(HashMap::new)
+ .insert(vers, cksum);
+ }
+ Ok((summary, yanked.unwrap_or(false)))
+ }
+
+ pub fn query(&mut self,
+ dep: &Dependency,
+ load: &mut RegistryData,
+ f: &mut FnMut(Summary))
+ -> CargoResult<()> {
+ let source_id = self.source_id.clone();
+ let summaries = self.summaries(dep.name(), load)?;
+ let summaries = summaries.iter().filter(|&&(_, yanked)| {
+ dep.source_id().precise().is_some() || !yanked
+ }).map(|s| s.0.clone());
+
+ // Handle `cargo update --precise` here. If specified, our own source
+ // will have a precise version listed of the form `<pkg>=<req>` where
+ // `<pkg>` is the name of a crate on this source and `<req>` is the
+ // version requested (argument to `--precise`).
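+ // For example, `cargo update -p foo --precise 1.2.3` shows up here as a
+ // precise string of `foo=1.2.3` (names illustrative).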
+ let summaries = summaries.filter(|s| {
+ match source_id.precise() {
+ Some(p) if p.starts_with(dep.name()) &&
+ p[dep.name().len()..].starts_with('=') => {
+ let vers = &p[dep.name().len() + 1..];
+ s.version().to_string() == vers
+ }
+ _ => true,
+ }
+ });
+
+ for summary in summaries {
+ if dep.matches(&summary) {
+ f(summary);
+ }
+ }
+ Ok(())
+ }
+}
--- /dev/null
+use std::io::SeekFrom;
+use std::io::prelude::*;
+use std::path::Path;
+
+use core::PackageId;
+use hex::ToHex;
+use sources::registry::{RegistryData, RegistryConfig};
+use util::FileLock;
+use util::paths;
+use util::{Config, Sha256, Filesystem};
+use util::errors::{CargoResult, CargoResultExt};
+
+pub struct LocalRegistry<'cfg> {
+ index_path: Filesystem,
+ root: Filesystem,
+ src_path: Filesystem,
+ config: &'cfg Config,
+}
+
+impl<'cfg> LocalRegistry<'cfg> {
+ pub fn new(root: &Path,
+ config: &'cfg Config,
+ name: &str) -> LocalRegistry<'cfg> {
+ LocalRegistry {
+ src_path: config.registry_source_path().join(name),
+ index_path: Filesystem::new(root.join("index")),
+ root: Filesystem::new(root.to_path_buf()),
+ config: config,
+ }
+ }
+}
+
+impl<'cfg> RegistryData for LocalRegistry<'cfg> {
+ fn index_path(&self) -> &Filesystem {
+ &self.index_path
+ }
+
+ fn load(&self,
+ root: &Path,
+ path: &Path,
+ data: &mut FnMut(&[u8]) -> CargoResult<()>) -> CargoResult<()> {
+ data(&paths::read_bytes(&root.join(path))?)
+ }
+
+ fn config(&mut self) -> CargoResult<Option<RegistryConfig>> {
+ // Local registries don't have configuration for remote APIs or anything
+ // like that
+ Ok(None)
+ }
+
+ fn update_index(&mut self) -> CargoResult<()> {
+ // Nothing to update, we just use what's on disk. Verify it actually
+ // exists though. We don't use any locks as we're just checking whether
+ // these directories exist.
+ let root = self.root.clone().into_path_unlocked();
+ if !root.is_dir() {
+ bail!("local registry path is not a directory: {}",
+ root.display())
+ }
+ let index_path = self.index_path.clone().into_path_unlocked();
+ if !index_path.is_dir() {
+ bail!("local registry index path is not a directory: {}",
+ index_path.display())
+ }
+ Ok(())
+ }
+
+ fn download(&mut self, pkg: &PackageId, checksum: &str)
+ -> CargoResult<FileLock> {
+ let crate_file = format!("{}-{}.crate", pkg.name(), pkg.version());
+ let mut crate_file = self.root.open_ro(&crate_file,
+ self.config,
+ "crate file")?;
+
+ // If we've already got an unpacked version of this crate, then skip the
+ // checksum below as it is in theory already verified.
+ let dst = format!("{}-{}", pkg.name(), pkg.version());
+ if self.src_path.join(dst).into_path_unlocked().exists() {
+ return Ok(crate_file)
+ }
+
+ self.config.shell().status("Unpacking", pkg)?;
+
+ // We don't actually need to download anything per se, we just need to
+ // verify the checksum matches the .crate file itself.
+ let mut state = Sha256::new();
+ let mut buf = [0; 64 * 1024];
+ loop {
+ let n = crate_file.read(&mut buf).chain_err(|| {
+ format!("failed to read `{}`", crate_file.path().display())
+ })?;
+ if n == 0 {
+ break
+ }
+ state.update(&buf[..n]);
+ }
+ if state.finish().to_hex() != checksum {
+ bail!("failed to verify the checksum of `{}`", pkg)
+ }
+
+ crate_file.seek(SeekFrom::Start(0))?;
+
+ Ok(crate_file)
+ }
+}
--- /dev/null
+//! A `Source` for registry-based packages.
+//!
+//! # What's a Registry?
+//!
+//! Registries are central locations where packages can be uploaded to,
+//! discovered, and searched for. The purpose of a registry is to have a
+//! location that serves as permanent storage for versions of a crate over time.
+//!
+//! Compared to git sources, a registry provides many packages as well as many
+//! versions simultaneously. Git sources can also have commits deleted through
+//! rebases, whereas registries cannot have their versions deleted.
+//!
+//! # The Index of a Registry
+//!
+//! One of the major difficulties with a registry is that hosting so many
+//! packages may quickly run into performance problems when dealing with
+//! dependency graphs. It's infeasible for cargo to download the entire contents
+//! of the registry just to resolve one package's dependencies, for example. As
+//! a result, cargo needs some efficient method of querying what packages are
+//! available on a registry, what versions are available, and what the
+//! dependencies for each version are.
+//!
+//! One method of doing so would be having the registry expose an HTTP endpoint
+//! which can be queried with a list of packages and a response of their
+//! dependencies and versions is returned. This is somewhat inefficient,
+//! however, as we may have to hit the endpoint many times and we may have
+//! already queried for much of the data locally (for other packages, for
+//! example). This also involves inventing a transport format between the
+//! registry and Cargo itself, so this route was not taken.
+//!
+//! Instead, Cargo communicates with registries through a git repository
+//! referred to as the Index. The Index of a registry is essentially an easily
+//! query-able version of the registry's database for a list of versions of a
+//! package as well as a list of dependencies for each version.
+//!
+//! Using git to host this index provides a number of benefits:
+//!
+//! * The entire index can be stored efficiently locally on disk. This means
+//! that all queries of a registry can happen locally and don't need to touch
+//! the network.
+//!
+//! * Updates of the index are quite efficient. Using git buys incremental
+//! updates, compressed transmission, etc for free. The index must be updated
+//! each time we need fresh information from a registry, but this is one
+//! update of a git repository that probably hasn't changed a whole lot so
+//! it shouldn't be too expensive.
+//!
+//! Additionally, each modification to the index is just appending a line at
+//! the end of a file (the exact format is described later). This means that
+//! the commits for an index are quite small and easily applied/compressible.
+//!
+//! ## The format of the Index
+//!
+//! The index is a store for the list of versions for all packages known, so its
+//! format on disk is optimized slightly to ensure that `ls registry` doesn't
+//! produce a list of all packages ever known. The index also wants to ensure
+//! that there's not a million files which may actually end up hitting
+//! filesystem limits at some point. To this end, a few decisions were made
+//! about the format of the registry:
+//!
+//! 1. Each crate will have one file corresponding to it. Each version for a
+//! crate will just be a line in this file.
+//! 2. There will be two tiers of directories for crate names, under which
+//! crates corresponding to those tiers will be located.
+//!
+//! As an example, here is a directory hierarchy of an index:
+//!
+//! ```notrust
+//! .
+//! ├── 3
+//! │ └── u
+//! │ └── url
+//! ├── bz
+//! │ └── ip
+//! │ └── bzip2
+//! ├── config.json
+//! ├── en
+//! │ └── co
+//! │ └── encoding
+//! └── li
+//! ├── bg
+//! │ └── libgit2
+//! └── nk
+//! └── link-config
+//! ```
+//!
+//! The root of the index contains a `config.json` file with a few entries
+//! corresponding to the registry (see `RegistryConfig` below).
+//!
+//! Otherwise, there are three numbered directories (1, 2, 3) for crates with
+//! names 1, 2, and 3 characters in length. The 1/2 directories simply have the
+//! crate files underneath them, while the 3 directory is sharded by the first
+//! letter of the crate name.
+//!
+//! For longer names, the top-level directory contains many two-letter directory names,
+//! each of which has many sub-folders with two letters. At the end of all these
+//! are the actual crate files themselves.
+//!
+//! The purpose of this layout is to hopefully cut down on `ls` sizes as well
+//! as to allow efficient lookup based on the crate name itself.
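+//!
+//! As a sketch of the mapping (crate names illustrative), index files live at
+//! paths like:
+//!
+//! ```notrust
+//! "a"     -> 1/a
+//! "ab"    -> 2/ab
+//! "url"   -> 3/u/url
+//! "cargo" -> ca/rg/cargo
+//! ```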
+//!
+//! ## Crate files
+//!
+//! Each file in the index is the history of one crate over time. Each line in
+//! the file corresponds to one version of a crate, stored in JSON format (see
+//! the `RegistryPackage` structure below).
+//!
+//! As new versions are published, new lines are appended to this file. The only
+//! modifications to this file that should happen over time are yanks of a
+//! particular version.
+//!
+//! # Downloading Packages
+//!
+//! The purpose of the Index was to provide an efficient method to resolve the
+//! dependency graph for a package. So far we only required one network
+//! interaction to update the registry's repository (yay!). After resolution has
+//! been performed, however, we need to download the contents of packages so we
+//! can read the full manifest and build the source code.
+//!
+//! To accomplish this, this source's `download` method will make an HTTP
+//! request per-package requested to download tarballs into a local cache. These
+//! tarballs will then be unpacked into a destination folder.
+//!
+//! Note that because versions uploaded to the registry are frozen forever, the
+//! HTTP download and unpacking can all be skipped if the version has
+//! already been downloaded and unpacked. This caching allows us to only
+//! download a package when absolutely necessary.
+//!
+//! # Filesystem Hierarchy
+//!
+//! Overall, the `$HOME/.cargo` directory looks like this when talking about
+//! the registry:
+//!
+//! ```notrust
+//! # A folder under which all registry metadata is hosted (similar to
+//! # $HOME/.cargo/git)
+//! $HOME/.cargo/registry/
+//!
+//! # For each registry that cargo knows about (keyed by hostname + hash)
+//! # there is a folder which is the checked out version of the index for
+//! # the registry in this location. Note that this is done so cargo can
+//! # support multiple registries simultaneously
+//! index/
+//! registry1-<hash>/
+//! registry2-<hash>/
+//! ...
+//!
+//! # This folder is a cache for all downloaded tarballs from a registry.
+//! # Once downloaded and verified, a tarball never changes.
+//! cache/
+//! registry1-<hash>/<pkg>-<version>.crate
+//! ...
+//!
+//! # Location in which all tarballs are unpacked. Each tarball is known to
+//! # be frozen after downloading, so transitively this folder is also
+//! # frozen once it's unpacked (it's never unpacked again)
+//! src/
+//! registry1-<hash>/<pkg>-<version>/...
+//! ...
+//! ```
+
+use std::borrow::Cow;
+use std::collections::BTreeMap;
+use std::fmt;
+use std::fs::File;
+use std::path::{PathBuf, Path};
+
+use flate2::read::GzDecoder;
+use semver::Version;
+use serde::de;
+use tar::Archive;
+
+use core::{Source, SourceId, PackageId, Package, Summary, Registry};
+use core::dependency::{Dependency, Kind};
+use sources::PathSource;
+use util::{CargoResult, Config, internal, FileLock, Filesystem};
+use util::errors::CargoResultExt;
+use util::hex;
+use util::to_url::ToUrl;
+
+const INDEX_LOCK: &'static str = ".cargo-index-lock";
+pub static CRATES_IO: &'static str = "https://github.com/rust-lang/crates.io-index";
+
+pub struct RegistrySource<'cfg> {
+ source_id: SourceId,
+ src_path: Filesystem,
+ config: &'cfg Config,
+ updated: bool,
+ ops: Box<RegistryData + 'cfg>,
+ index: index::RegistryIndex<'cfg>,
+ index_locked: bool,
+}
+
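+/// An example of the `config.json` stored at the root of an index; the values
+/// shown here are illustrative (they happen to match the crates.io registry):
+///
+/// ```notrust
+/// {
+///     "dl": "https://crates.io/api/v1/crates",
+///     "api": "https://crates.io"
+/// }
+/// ```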
+#[derive(Deserialize)]
+pub struct RegistryConfig {
+ /// Download endpoint for all crates. This will be appended with
+ /// `/<crate>/<version>/download` and then will be hit with an HTTP GET
+ /// request to download the tarball for a crate.
+ pub dl: String,
+
+ /// API endpoint for the registry. This is what's actually hit to perform
+ /// operations like yanks, owner modifications, publish new crates, etc.
+ pub api: Option<String>,
+}
+
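+/// A single line in an index file, deserialized from JSON. A hypothetical,
+/// abridged example line:
+///
+/// ```notrust
+/// {"name":"foo","vers":"0.1.0","deps":[],"features":{},"cksum":"...","yanked":false}
+/// ```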
+#[derive(Deserialize)]
+struct RegistryPackage<'a> {
+ name: Cow<'a, str>,
+ vers: Version,
+ deps: DependencyList,
+ features: BTreeMap<String, Vec<String>>,
+ cksum: String,
+ yanked: Option<bool>,
+}
+
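+/// A list of dependencies wrapped in a newtype so a custom `Deserialize` impl
+/// (not shown in this excerpt) can translate the registry's
+/// `RegistryDependency` encoding into real `Dependency` values.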
+struct DependencyList {
+ inner: Vec<Dependency>,
+}
+
+#[derive(Deserialize)]
+struct RegistryDependency<'a> {
+ name: Cow<'a, str>,
+ req: Cow<'a, str>,
+ features: Vec<String>,
+ optional: bool,
+ default_features: bool,
+ target: Option<Cow<'a, str>>,
+ kind: Option<Cow<'a, str>>,
+ registry: Option<String>,
+}
+
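+/// An abstraction over where registry data comes from. The two
+/// implementations in this module are `remote::RemoteRegistry` (a git clone
+/// of the index plus HTTP downloads) and `local::LocalRegistry` (everything
+/// already on disk).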
+pub trait RegistryData {
+ fn index_path(&self) -> &Filesystem;
+ fn load(&self,
+ _root: &Path,
+ path: &Path,
+ data: &mut FnMut(&[u8]) -> CargoResult<()>) -> CargoResult<()>;
+ fn config(&mut self) -> CargoResult<Option<RegistryConfig>>;
+ fn update_index(&mut self) -> CargoResult<()>;
+ fn download(&mut self,
+ pkg: &PackageId,
+ checksum: &str) -> CargoResult<FileLock>;
+}
+
+mod index;
+mod remote;
+mod local;
+
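+/// Derives the directory name for a registry from its `SourceId`: the URL's
+/// host plus a short hash, e.g. something like `github.com-1ecc6299db9ec823`
+/// for the crates.io index (hash value illustrative).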
+fn short_name(id: &SourceId) -> String {
+ let hash = hex::short_hash(id);
+ let ident = id.url().host_str().unwrap_or("").to_string();
+ format!("{}-{}", ident, hash)
+}
+
+impl<'cfg> RegistrySource<'cfg> {
+ pub fn remote(source_id: &SourceId,
+ config: &'cfg Config) -> RegistrySource<'cfg> {
+ let name = short_name(source_id);
+ let ops = remote::RemoteRegistry::new(source_id, config, &name);
+ RegistrySource::new(source_id, config, &name, Box::new(ops), true)
+ }
+
+ pub fn local(source_id: &SourceId,
+ path: &Path,
+ config: &'cfg Config) -> RegistrySource<'cfg> {
+ let name = short_name(source_id);
+ let ops = local::LocalRegistry::new(path, config, &name);
+ RegistrySource::new(source_id, config, &name, Box::new(ops), false)
+ }
+
+ fn new(source_id: &SourceId,
+ config: &'cfg Config,
+ name: &str,
+ ops: Box<RegistryData + 'cfg>,
+ index_locked: bool) -> RegistrySource<'cfg> {
+ RegistrySource {
+ src_path: config.registry_source_path().join(name),
+ config: config,
+ source_id: source_id.clone(),
+ updated: false,
+ index: index::RegistryIndex::new(source_id,
+ ops.index_path(),
+ config,
+ index_locked),
+ index_locked: index_locked,
+ ops: ops,
+ }
+ }
+
+ /// Decode the configuration stored within the registry.
+ ///
+ /// This requires that the index has been at least checked out.
+ pub fn config(&mut self) -> CargoResult<Option<RegistryConfig>> {
+ self.ops.config()
+ }
+
+ /// Unpacks a downloaded package into a location where it's ready to be
+ /// compiled.
+ ///
+ /// No action is taken if the source looks like it's already unpacked.
+ fn unpack_package(&self,
+ pkg: &PackageId,
+ tarball: &FileLock)
+ -> CargoResult<PathBuf> {
+ let dst = self.src_path.join(&format!("{}-{}", pkg.name(),
+ pkg.version()));
+ dst.create_dir()?;
+ // Note that we've already got the `tarball` locked above, and that
+ // implies a lock on the unpacked destination as well, so this access
+ // via `into_path_unlocked` should be ok.
+ let dst = dst.into_path_unlocked();
+ let ok = dst.join(".cargo-ok");
+ if ok.exists() {
+ return Ok(dst)
+ }
+
+ let gz = GzDecoder::new(tarball.file())?;
+ let mut tar = Archive::new(gz);
+ let prefix = dst.file_name().unwrap();
+ let parent = dst.parent().unwrap();
+ for entry in tar.entries()? {
+ let mut entry = entry.chain_err(|| "failed to iterate over archive")?;
+ let entry_path = entry.path()
+ .chain_err(|| "failed to read entry path")?
+ .into_owned();
+
+ // We're going to unpack this tarball into the global source
+ // directory, but we want to make sure that it doesn't accidentally
+ // (or maliciously) overwrite source code from other crates. Cargo
+ // itself should never generate a tarball that hits this error, and
+ // crates.io should also block uploads with these sorts of tarballs,
+ // but be extra sure by adding a check here as well.
+ if !entry_path.starts_with(prefix) {
+ return Err(format!("invalid tarball downloaded, contains \
+ a file at {:?} which isn't under {:?}",
+ entry_path, prefix).into())
+ }
+
+ // Once that's verified, unpack the entry as usual.
+ entry.unpack_in(parent).chain_err(|| {
+ format!("failed to unpack entry at `{}`", entry_path.display())
+ })?;
+ }
+ File::create(&ok)?;
+ Ok(dst)
+ }
+
+ fn do_update(&mut self) -> CargoResult<()> {
+ self.ops.update_index()?;
+ let path = self.ops.index_path();
+ self.index = index::RegistryIndex::new(&self.source_id,
+ path,
+ self.config,
+ self.index_locked);
+ Ok(())
+ }
+}
+
+impl<'cfg> Registry for RegistrySource<'cfg> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ // If this is a precise dependency, then it came from a lockfile and in
+ // theory the registry is known to contain this version. If, however, we
+ // come back with no summaries, then our registry may need to be
+ // updated, so we fall back to performing a lazy update.
+ if dep.source_id().precise().is_some() && !self.updated {
+ let mut called = false;
+ self.index.query(dep, &mut *self.ops, &mut |s| {
+ called = true;
+ f(s);
+ })?;
+ if called {
+ return Ok(())
+ } else {
+ self.do_update()?;
+ }
+ }
+
+ self.index.query(dep, &mut *self.ops, f)
+ }
+
+ fn supports_checksums(&self) -> bool {
+ true
+ }
+
+ fn requires_precise(&self) -> bool {
+ false
+ }
+}
+
+impl<'cfg> Source for RegistrySource<'cfg> {
+ fn source_id(&self) -> &SourceId {
+ &self.source_id
+ }
+
+ fn update(&mut self) -> CargoResult<()> {
+ // If we have an imprecise version then we don't know what we're going
+ // to look for, so we always attempt to perform an update here.
+ //
+ // If we have a precise version, then we'll update lazily during the
+ // querying phase. Note that precise in this case is only
+ // `Some("locked")` as other `Some` values indicate a `cargo update
+ // --precise` request.
+ if self.source_id.precise() != Some("locked") {
+ self.do_update()?;
+ }
+ Ok(())
+ }
+
+ fn download(&mut self, package: &PackageId) -> CargoResult<Package> {
+ let hash = self.index.hash(package, &mut *self.ops)?;
+ let path = self.ops.download(package, &hash)?;
+ let path = self.unpack_package(package, &path).chain_err(|| {
+ internal(format!("failed to unpack package `{}`", package))
+ })?;
+ let mut src = PathSource::new(&path, &self.source_id, self.config);
+ src.update()?;
+ let pkg = src.download(package)?;
+
+ // Unfortunately the index and the actual Cargo.toml in the index can
+ // differ due to historical Cargo bugs. To paper over these we overwrite
+ // the *summary* loaded from the Cargo.toml we just downloaded with the
+ // one we loaded from the index.
+ let summaries = self.index.summaries(package.name(), &mut *self.ops)?;
+ let summary = summaries.iter().map(|s| &s.0).find(|s| {
+ s.package_id() == package
+ }).expect("summary not found");
+ let mut manifest = pkg.manifest().clone();
+ manifest.set_summary(summary.clone());
+ Ok(Package::new(manifest, pkg.manifest_path()))
+ }
+
+ fn fingerprint(&self, pkg: &Package) -> CargoResult<String> {
+ Ok(pkg.package_id().version().to_string())
+ }
+}
+
+// TODO: this is pretty unfortunate, ideally we'd use `DeserializeSeed` which
+// is intended for "deserializing with context" but that means we couldn't
+// use `#[derive(Deserialize)]` on `RegistryPackage` unfortunately.
+//
+// I'm told, however, that https://github.com/serde-rs/serde/pull/909 will solve
+// all our problems here. Until that lands this thread local is just a
+// workaround in the meantime.
+//
+// If you're reading this and find this thread local funny, check to see if that
+// PR is merged. If it is then let's ditch this thread local!
+scoped_thread_local!(static DEFAULT_ID: SourceId);
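+
+// A sketch of the intended use: the index installs the source id for the
+// duration of a deserialization, and `parse_registry_dependency` below
+// reads it back for dependencies that don't name an alternative registry:
+//
+//     DEFAULT_ID.set(&source_id, || {
+//         serde_json::from_slice::<RegistryPackage>(line)
+//     })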
+
+impl<'de> de::Deserialize<'de> for DependencyList {
+ fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+ where D: de::Deserializer<'de>,
+ {
+ return deserializer.deserialize_seq(Visitor);
+
+ struct Visitor;
+
+ impl<'de> de::Visitor<'de> for Visitor {
+ type Value = DependencyList;
+
+ fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+ write!(formatter, "a list of dependencies")
+ }
+
+ fn visit_seq<A>(self, mut seq: A) -> Result<DependencyList, A::Error>
+ where A: de::SeqAccess<'de>,
+ {
+ let mut ret = Vec::new();
+ if let Some(size) = seq.size_hint() {
+ ret.reserve(size);
+ }
+ while let Some(element) = seq.next_element::<RegistryDependency>()? {
+ ret.push(parse_registry_dependency(element).map_err(|e| {
+ de::Error::custom(e)
+ })?);
+ }
+
+ Ok(DependencyList { inner: ret })
+ }
+ }
+ }
+}
+
+/// Converts an encoded dependency in the registry to a cargo dependency
+fn parse_registry_dependency(dep: RegistryDependency)
+ -> CargoResult<Dependency> {
+ let RegistryDependency {
+ name, req, features, optional, default_features, target, kind, registry
+ } = dep;
+
+ let id = if let Some(registry) = registry {
+ SourceId::for_registry(&registry.to_url()?)?
+ } else {
+ DEFAULT_ID.with(|id| {
+ id.clone()
+ })
+ };
+
+ let mut dep = Dependency::parse_no_deprecated(&name, Some(&req), &id)?;
+ let kind = match kind.as_ref().map(|s| &s[..]).unwrap_or("") {
+ "dev" => Kind::Development,
+ "build" => Kind::Build,
+ _ => Kind::Normal,
+ };
+
+ let platform = match target {
+ Some(target) => Some(target.parse()?),
+ None => None,
+ };
+
+ // Unfortunately older versions of cargo and/or the registry ended up
+ // publishing lots of entries where the features array contained the
+ // empty feature, "", inside. This confuses the resolution process much
+ // later on and these features aren't actually valid, so filter them all
+ // out here.
+ let features = features.into_iter().filter(|s| !s.is_empty()).collect();
+
+ dep.set_optional(optional)
+ .set_default_features(default_features)
+ .set_features(features)
+ .set_platform(platform)
+ .set_kind(kind);
+
+ Ok(dep)
+}
--- /dev/null
+use std::cell::{RefCell, Ref, Cell};
+use std::io::SeekFrom;
+use std::io::prelude::*;
+use std::mem;
+use std::path::Path;
+use std::str;
+
+use git2;
+use hex::ToHex;
+use serde_json;
+
+use core::{PackageId, SourceId};
+use sources::git;
+use sources::registry::{RegistryData, RegistryConfig, INDEX_LOCK};
+use util::network;
+use util::{FileLock, Filesystem, LazyCell};
+use util::{Config, Sha256, ToUrl, Progress};
+use util::errors::{CargoErrorKind, CargoResult, CargoResultExt};
+
+pub struct RemoteRegistry<'cfg> {
+ index_path: Filesystem,
+ cache_path: Filesystem,
+ source_id: SourceId,
+ config: &'cfg Config,
+ tree: RefCell<Option<git2::Tree<'static>>>,
+ repo: LazyCell<git2::Repository>,
+ head: Cell<Option<git2::Oid>>,
+}
+
+impl<'cfg> RemoteRegistry<'cfg> {
+ pub fn new(source_id: &SourceId, config: &'cfg Config, name: &str)
+ -> RemoteRegistry<'cfg> {
+ RemoteRegistry {
+ index_path: config.registry_index_path().join(name),
+ cache_path: config.registry_cache_path().join(name),
+ source_id: source_id.clone(),
+ config: config,
+ tree: RefCell::new(None),
+ repo: LazyCell::new(),
+ head: Cell::new(None),
+ }
+ }
+
+ fn repo(&self) -> CargoResult<&git2::Repository> {
+ self.repo.get_or_try_init(|| {
+ let path = self.index_path.clone().into_path_unlocked();
+
+ // Fast path without a lock
+ if let Ok(repo) = git2::Repository::open(&path) {
+ return Ok(repo)
+ }
+
+ // Ok, now we need to lock and try the whole thing over again.
+ let lock = self.index_path.open_rw(Path::new(INDEX_LOCK),
+ self.config,
+ "the registry index")?;
+ match git2::Repository::open(&path) {
+ Ok(repo) => Ok(repo),
+ Err(_) => {
+ let _ = lock.remove_siblings();
+
+ // Note that we'd actually prefer to use a bare repository
+ // here as we're not actually going to check anything out.
+ // All versions of Cargo, though, share the same CARGO_HOME,
+ // so for compatibility with older Cargo which *does* do
+ // checkouts we make sure to initialize a new full
+ // repository (not a bare one).
+ //
+ // We should change this to `init_bare` whenever we feel
+ // like enough time has passed or if we change the directory
+ // that the folder is located in, such as by changing the
+ // hash at the end of the directory.
+ Ok(git2::Repository::init(&path)?)
+ }
+ }
+ })
+ }
+
+ fn head(&self) -> CargoResult<git2::Oid> {
+ if self.head.get().is_none() {
+ let oid = self.repo()?.refname_to_id("refs/remotes/origin/master")?;
+ self.head.set(Some(oid));
+ }
+ Ok(self.head.get().unwrap())
+ }
+
+ fn tree(&self) -> CargoResult<Ref<git2::Tree>> {
+ {
+ let tree = self.tree.borrow();
+ if tree.is_some() {
+ return Ok(Ref::map(tree, |s| s.as_ref().unwrap()))
+ }
+ }
+ let repo = self.repo()?;
+ let commit = repo.find_commit(self.head()?)?;
+ let tree = commit.tree()?;
+
+ // Unfortunately in libgit2 the tree objects look like they've got a
+ // reference to the repository object which means that a tree cannot
+ // outlive the repository that it came from. Here we want to cache this
+ // tree, though, so to accomplish this we transmute it to a static
+ // lifetime.
+ //
+ // Note that we don't actually hand out the static lifetime, instead we
+ // only return a scoped one from this function. Additionally the repo
+ // we loaded from (above) lives as long as this object
+ // (`RemoteRegistry`) so we then just need to ensure that the tree is
+ // destroyed first in the destructor, hence the destructor on
+ // `RemoteRegistry` below.
+ let tree = unsafe {
+ mem::transmute::<git2::Tree, git2::Tree<'static>>(tree)
+ };
+ *self.tree.borrow_mut() = Some(tree);
+ Ok(Ref::map(self.tree.borrow(), |s| s.as_ref().unwrap()))
+ }
+}
+
+impl<'cfg> RegistryData for RemoteRegistry<'cfg> {
+ fn index_path(&self) -> &Filesystem {
+ &self.index_path
+ }
+
+ fn load(&self,
+ _root: &Path,
+ path: &Path,
+ data: &mut FnMut(&[u8]) -> CargoResult<()>) -> CargoResult<()> {
+ // Note that the index calls this method and the filesystem is locked
+ // in the index, so we don't need to worry about an `update_index`
+ // happening in a different process.
+ let repo = self.repo()?;
+ let tree = self.tree()?;
+ let entry = tree.get_path(path)?;
+ let object = entry.to_object(repo)?;
+ let blob = match object.as_blob() {
+ Some(blob) => blob,
+ None => bail!("path `{}` is not a blob in the git repo", path.display()),
+ };
+ data(blob.content())
+ }
+
+ fn config(&mut self) -> CargoResult<Option<RegistryConfig>> {
+ self.repo()?; // create intermediate dirs and initialize the repo
+ let _lock = self.index_path.open_ro(Path::new(INDEX_LOCK),
+ self.config,
+ "the registry index")?;
+ let mut config = None;
+ self.load(Path::new(""), Path::new("config.json"), &mut |json| {
+ config = Some(serde_json::from_slice(json)?);
+ Ok(())
+ })?;
+ Ok(config)
+ }
+
+ fn update_index(&mut self) -> CargoResult<()> {
+ // Ensure that we'll actually be able to acquire an HTTP handle later on
+ // once we start trying to download crates. This will weed out any
+ // problems with `.cargo/config` configuration related to HTTP.
+ //
+ // This way if there's a problem the error gets printed before we even
+ // hit the index, which may not actually read this configuration.
+ self.config.http()?;
+
+ self.repo()?;
+ self.head.set(None);
+ *self.tree.borrow_mut() = None;
+ let _lock = self.index_path.open_rw(Path::new(INDEX_LOCK),
+ self.config,
+ "the registry index")?;
+ self.config.shell().status("Updating", self.source_id.display_registry())?;
+
+ // git fetch origin master
+ let url = self.source_id.url();
+ let refspec = "refs/heads/master:refs/remotes/origin/master";
+ let repo = self.repo.borrow_mut().unwrap();
+ git::fetch(repo, url, refspec, self.config).chain_err(|| {
+ format!("failed to fetch `{}`", url)
+ })?;
+ Ok(())
+ }
+
+ fn download(&mut self, pkg: &PackageId, checksum: &str)
+ -> CargoResult<FileLock> {
+ let filename = format!("{}-{}.crate", pkg.name(), pkg.version());
+ let path = Path::new(&filename);
+
+ // Attempt to open a read-only copy first to avoid an exclusive write
+ // lock and also work with read-only filesystems. Note that we check the
+ // length of the file below to handle interrupted downloads.
+ //
+ // If this fails then we fall through to the exclusive path where we may
+ // have to redownload the file.
+ if let Ok(dst) = self.cache_path.open_ro(path, self.config, &filename) {
+ let meta = dst.file().metadata()?;
+ if meta.len() > 0 {
+ return Ok(dst)
+ }
+ }
+ let mut dst = self.cache_path.open_rw(path, self.config, &filename)?;
+ let meta = dst.file().metadata()?;
+ if meta.len() > 0 {
+ return Ok(dst)
+ }
+ self.config.shell().status("Downloading", pkg)?;
+
+ let config = self.config()?.unwrap();
+ let mut url = config.dl.to_url()?;
+ url.path_segments_mut().unwrap()
+ .push(pkg.name())
+ .push(&pkg.version().to_string())
+ .push("download");
+
+ // TODO: don't download into memory, but ensure that if a download is
+ // interrupted with ctrl-c it can be resumed, either from the start or
+ // from the middle, the next time around
+ let url = url.to_string();
+ let mut handle = self.config.http()?.borrow_mut();
+ handle.get(true)?;
+ handle.url(&url)?;
+ handle.follow_location(true)?;
+ let mut state = Sha256::new();
+ let mut body = Vec::new();
+ network::with_retry(self.config, || {
+ state = Sha256::new();
+ body = Vec::new();
+ let mut pb = Progress::new("Fetch", self.config);
+ {
+ handle.progress(true)?;
+ let mut handle = handle.transfer();
+ handle.progress_function(|dl_total, dl_cur, _, _| {
+ pb.tick(dl_cur as usize, dl_total as usize).is_ok()
+ })?;
+ handle.write_function(|buf| {
+ state.update(buf);
+ body.extend_from_slice(buf);
+ Ok(buf.len())
+ })?;
+ handle.perform()?;
+ }
+ let code = handle.response_code()?;
+ if code != 200 && code != 0 {
+ let url = handle.effective_url()?.unwrap_or(&url);
+ Err(CargoErrorKind::HttpNot200(code, url.to_string()).into())
+ } else {
+ Ok(())
+ }
+ })?;
+
+ // Verify what we just downloaded
+ if state.finish().to_hex() != checksum {
+ bail!("failed to verify the checksum of `{}`", pkg)
+ }
+
+ dst.write_all(&body)?;
+ dst.seek(SeekFrom::Start(0))?;
+ Ok(dst)
+ }
+}
+
+impl<'cfg> Drop for RemoteRegistry<'cfg> {
+ fn drop(&mut self) {
+ // Just be sure to drop this before our other fields
+ self.tree.borrow_mut().take();
+ }
+}
--- /dev/null
+use core::{Source, Registry, PackageId, Package, Dependency, Summary, SourceId};
+use util::errors::{CargoResult, CargoResultExt};
+
+pub struct ReplacedSource<'cfg> {
+ to_replace: SourceId,
+ replace_with: SourceId,
+ inner: Box<Source + 'cfg>,
+}
+
+impl<'cfg> ReplacedSource<'cfg> {
+ pub fn new(to_replace: &SourceId,
+ replace_with: &SourceId,
+ src: Box<Source + 'cfg>) -> ReplacedSource<'cfg> {
+ ReplacedSource {
+ to_replace: to_replace.clone(),
+ replace_with: replace_with.clone(),
+ inner: src,
+ }
+ }
+}
+
+impl<'cfg> Registry for ReplacedSource<'cfg> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ let (replace_with, to_replace) = (&self.replace_with, &self.to_replace);
+ let dep = dep.clone().map_source(to_replace, replace_with);
+
+ self.inner.query(&dep, &mut |summary| {
+ f(summary.map_source(replace_with, to_replace))
+ }).chain_err(|| {
+ format!("failed to query replaced source {}",
+ self.to_replace)
+ })
+ }
+
+ fn supports_checksums(&self) -> bool {
+ self.inner.supports_checksums()
+ }
+
+ fn requires_precise(&self) -> bool {
+ self.inner.requires_precise()
+ }
+}
+
+impl<'cfg> Source for ReplacedSource<'cfg> {
+ fn source_id(&self) -> &SourceId {
+ &self.to_replace
+ }
+
+ fn update(&mut self) -> CargoResult<()> {
+ self.inner.update().chain_err(|| {
+ format!("failed to update replaced source {}",
+ self.to_replace)
+ })
+ }
+
+ fn download(&mut self, id: &PackageId) -> CargoResult<Package> {
+ let id = id.with_source_id(&self.replace_with);
+ let pkg = self.inner.download(&id).chain_err(|| {
+ format!("failed to download replaced source {}",
+ self.to_replace)
+ })?;
+ Ok(pkg.map_source(&self.replace_with, &self.to_replace))
+ }
+
+ fn fingerprint(&self, id: &Package) -> CargoResult<String> {
+ self.inner.fingerprint(id)
+ }
+
+ fn verify(&self, id: &PackageId) -> CargoResult<()> {
+ let id = id.with_source_id(&self.replace_with);
+ self.inner.verify(&id)
+ }
+}
--- /dev/null
+use std::str::{self, FromStr};
+use std::iter;
+use std::fmt;
+
+use util::{CargoError, CargoResult};
+
+#[derive(Clone, PartialEq, Debug)]
+pub enum Cfg {
+ Name(String),
+ KeyPair(String, String),
+}
+
+#[derive(Clone, PartialEq, Debug)]
+pub enum CfgExpr {
+ Not(Box<CfgExpr>),
+ All(Vec<CfgExpr>),
+ Any(Vec<CfgExpr>),
+ Value(Cfg),
+}
+
+#[derive(PartialEq)]
+enum Token<'a> {
+ LeftParen,
+ RightParen,
+ Ident(&'a str),
+ Comma,
+ Equals,
+ String(&'a str),
+}
+
+struct Tokenizer<'a> {
+ s: iter::Peekable<str::CharIndices<'a>>,
+ orig: &'a str,
+}
+
+struct Parser<'a> {
+ t: iter::Peekable<Tokenizer<'a>>,
+}
+
+impl FromStr for Cfg {
+ type Err = CargoError;
+
+ fn from_str(s: &str) -> CargoResult<Cfg> {
+ let mut p = Parser::new(s);
+ let e = p.cfg()?;
+ if p.t.next().is_some() {
+ bail!("malformed cfg value or key/value pair: `{}`", s)
+ }
+ Ok(e)
+ }
+}
+
+impl fmt::Display for Cfg {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match *self {
+ Cfg::Name(ref s) => s.fmt(f),
+ Cfg::KeyPair(ref k, ref v) => write!(f, "{} = \"{}\"", k, v),
+ }
+ }
+}
+
+impl CfgExpr {
+ pub fn matches(&self, cfg: &[Cfg]) -> bool {
+ match *self {
+ CfgExpr::Not(ref e) => !e.matches(cfg),
+ CfgExpr::All(ref e) => e.iter().all(|e| e.matches(cfg)),
+ CfgExpr::Any(ref e) => e.iter().any(|e| e.matches(cfg)),
+ CfgExpr::Value(ref e) => cfg.contains(e),
+ }
+ }
+}
+
+impl FromStr for CfgExpr {
+ type Err = CargoError;
+
+ fn from_str(s: &str) -> CargoResult<CfgExpr> {
+ let mut p = Parser::new(s);
+ let e = p.expr()?;
+ if p.t.next().is_some() {
+ bail!("can only have one cfg-expression, consider using all() or \
+ any() explicitly")
+ }
+ Ok(e)
+ }
+}
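+
+// A minimal sketch of parsing and evaluating a cfg expression (the values
+// here are illustrative):
+//
+//     let expr: CfgExpr = "all(unix, target_env = \"gnu\")".parse()?;
+//     let cfg = [Cfg::Name("unix".to_string()),
+//                Cfg::KeyPair("target_env".to_string(), "gnu".to_string())];
+//     assert!(expr.matches(&cfg));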
+
+impl fmt::Display for CfgExpr {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match *self {
+ CfgExpr::Not(ref e) => write!(f, "not({})", e),
+ CfgExpr::All(ref e) => write!(f, "all({})", CommaSep(e)),
+ CfgExpr::Any(ref e) => write!(f, "any({})", CommaSep(e)),
+ CfgExpr::Value(ref e) => write!(f, "{}", e),
+ }
+ }
+}
+
+struct CommaSep<'a, T: 'a>(&'a [T]);
+
+impl<'a, T: fmt::Display> fmt::Display for CommaSep<'a, T> {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ for (i, v) in self.0.iter().enumerate() {
+ if i > 0 {
+ write!(f, ", ")?;
+ }
+ write!(f, "{}", v)?;
+ }
+ Ok(())
+ }
+}
+
+impl<'a> Parser<'a> {
+ fn new(s: &'a str) -> Parser<'a> {
+ Parser {
+ t: Tokenizer {
+ s: s.char_indices().peekable(),
+ orig: s,
+ }.peekable(),
+ }
+ }
+
+ fn expr(&mut self) -> CargoResult<CfgExpr> {
+ match self.t.peek() {
+ Some(&Ok(Token::Ident(op @ "all"))) |
+ Some(&Ok(Token::Ident(op @ "any"))) => {
+ self.t.next();
+ let mut e = Vec::new();
+ self.eat(Token::LeftParen)?;
+ while !self.try(Token::RightParen) {
+ e.push(self.expr()?);
+ if !self.try(Token::Comma) {
+ self.eat(Token::RightParen)?;
+ break
+ }
+ }
+ if op == "all" {
+ Ok(CfgExpr::All(e))
+ } else {
+ Ok(CfgExpr::Any(e))
+ }
+ }
+ Some(&Ok(Token::Ident("not"))) => {
+ self.t.next();
+ self.eat(Token::LeftParen)?;
+ let e = self.expr()?;
+ self.eat(Token::RightParen)?;
+ Ok(CfgExpr::Not(Box::new(e)))
+ }
+ Some(&Ok(..)) => self.cfg().map(CfgExpr::Value),
+ Some(&Err(..)) => {
+ Err(self.t.next().unwrap().err().unwrap())
+ }
+ None => bail!("expected start of a cfg expression, \
+ found nothing"),
+ }
+ }
+
+ fn cfg(&mut self) -> CargoResult<Cfg> {
+ match self.t.next() {
+ Some(Ok(Token::Ident(name))) => {
+ let e = if self.try(Token::Equals) {
+ let val = match self.t.next() {
+ Some(Ok(Token::String(s))) => s,
+ Some(Ok(t)) => bail!("expected a string, found {}",
+ t.classify()),
+ Some(Err(e)) => return Err(e),
+ None => bail!("expected a string, found nothing"),
+ };
+ Cfg::KeyPair(name.to_string(), val.to_string())
+ } else {
+ Cfg::Name(name.to_string())
+ };
+ Ok(e)
+ }
+ Some(Ok(t)) => bail!("expected identifier, found {}", t.classify()),
+ Some(Err(e)) => Err(e),
+ None => bail!("expected identifier, found nothing"),
+ }
+ }
+
+ fn try(&mut self, token: Token<'a>) -> bool {
+ match self.t.peek() {
+ Some(&Ok(ref t)) if token == *t => {}
+ _ => return false,
+ }
+ self.t.next();
+ true
+ }
+
+ fn eat(&mut self, token: Token<'a>) -> CargoResult<()> {
+ match self.t.next() {
+ Some(Ok(ref t)) if token == *t => Ok(()),
+ Some(Ok(t)) => bail!("expected {}, found {}", token.classify(),
+ t.classify()),
+ Some(Err(e)) => Err(e),
+ None => bail!("expected {}, but cfg expr ended", token.classify()),
+ }
+ }
+}
+
+impl<'a> Iterator for Tokenizer<'a> {
+ type Item = CargoResult<Token<'a>>;
+
+ fn next(&mut self) -> Option<CargoResult<Token<'a>>> {
+ loop {
+ match self.s.next() {
+ Some((_, ' ')) => {}
+ Some((_, '(')) => return Some(Ok(Token::LeftParen)),
+ Some((_, ')')) => return Some(Ok(Token::RightParen)),
+ Some((_, ',')) => return Some(Ok(Token::Comma)),
+ Some((_, '=')) => return Some(Ok(Token::Equals)),
+ Some((start, '"')) => {
+ while let Some((end, ch)) = self.s.next() {
+ if ch == '"' {
+ return Some(Ok(Token::String(&self.orig[start+1..end])))
+ }
+ }
+ return Some(Err("unterminated string in cfg".into()))
+ }
+ Some((start, ch)) if is_ident_start(ch) => {
+ while let Some(&(end, ch)) = self.s.peek() {
+ if !is_ident_rest(ch) {
+ return Some(Ok(Token::Ident(&self.orig[start..end])))
+ } else {
+ self.s.next();
+ }
+ }
+ return Some(Ok(Token::Ident(&self.orig[start..])))
+ }
+ Some((_, ch)) => {
+ return Some(Err(format!("unexpected character in \
+ cfg `{}`, expected parens, \
+ a comma, an identifier, or \
+ a string", ch).into()))
+ }
+ None => return None
+ }
+ }
+ }
+}
+
+fn is_ident_start(ch: char) -> bool {
+ ch == '_' || ('a' <= ch && ch <= 'z') || ('A' <= ch && ch <= 'Z')
+}
+
+fn is_ident_rest(ch: char) -> bool {
+ is_ident_start(ch) || ('0' <= ch && ch <= '9')
+}
+
+impl<'a> Token<'a> {
+ fn classify(&self) -> &str {
+ match *self {
+ Token::LeftParen => "`(`",
+ Token::RightParen => "`)`",
+ Token::Ident(..) => "an identifier",
+ Token::Comma => "`,`",
+ Token::Equals => "`=`",
+ Token::String(..) => "a string",
+ }
+ }
+}
--- /dev/null
+use std::cell::{RefCell, RefMut};
+use std::collections::HashSet;
+use std::collections::hash_map::Entry::{Occupied, Vacant};
+use std::collections::hash_map::HashMap;
+use std::env;
+use std::fmt;
+use std::fs::{self, File};
+use std::io::SeekFrom;
+use std::io::prelude::*;
+use std::mem;
+use std::path::{Path, PathBuf};
+use std::str::FromStr;
+use std::sync::{Once, ONCE_INIT};
+
+use curl::easy::Easy;
+use jobserver;
+use serde::{Serialize, Serializer};
+use toml;
+
+use core::shell::Verbosity;
+use core::{Shell, CliUnstable};
+use ops;
+use url::Url;
+use util::ToUrl;
+use util::Rustc;
+use util::errors::{CargoResult, CargoResultExt, CargoError, internal};
+use util::paths;
+use util::toml as cargo_toml;
+use util::{Filesystem, LazyCell};
+
+use self::ConfigValue as CV;
+
+/// Configuration information for cargo. This is not specific to a build, it is information
+/// relating to cargo itself.
+///
+/// Construct it with `Config::default()`, which infers all fields from the
+/// process environment.
+#[derive(Debug)]
+pub struct Config {
+ /// The location of the user's 'home' directory. OS-dependent.
+ home_path: Filesystem,
+ /// Information about how to write messages to the shell
+ shell: RefCell<Shell>,
+ /// Information on how to invoke the compiler (rustc)
+ rustc: LazyCell<Rustc>,
+ /// A collection of configuration options
+ values: LazyCell<HashMap<String, ConfigValue>>,
+ /// The current working directory of cargo
+ cwd: PathBuf,
+ /// The location of the cargo executable (path to current process)
+ cargo_exe: LazyCell<PathBuf>,
+ /// The location of the rustdoc executable
+ rustdoc: LazyCell<PathBuf>,
+ /// Whether we are printing extra verbose messages
+ extra_verbose: bool,
+ /// `frozen` is set if we shouldn't access the network
+ frozen: bool,
+ /// `locked` is set if we should not update lock files
+ locked: bool,
+ /// A global static IPC control mechanism (used for managing parallel builds)
+ jobserver: Option<jobserver::Client>,
+ /// Cli flags of the form "-Z something"
+ cli_flags: CliUnstable,
+ /// A handle on curl easy mode for http calls
+ easy: LazyCell<RefCell<Easy>>,
+}
+
+impl Config {
+ pub fn new(shell: Shell,
+ cwd: PathBuf,
+ homedir: PathBuf) -> Config {
+ static mut GLOBAL_JOBSERVER: *mut jobserver::Client = 0 as *mut _;
+ static INIT: Once = ONCE_INIT;
+
+ // This should be called early on in the process, so in theory the
+ // unsafety is ok here (we take ownership of the inherited jobserver
+ // file descriptors).
+ INIT.call_once(|| unsafe {
+ if let Some(client) = jobserver::Client::from_env() {
+ GLOBAL_JOBSERVER = Box::into_raw(Box::new(client));
+ }
+ });
+
+ Config {
+ home_path: Filesystem::new(homedir),
+ shell: RefCell::new(shell),
+ rustc: LazyCell::new(),
+ cwd: cwd,
+ values: LazyCell::new(),
+ cargo_exe: LazyCell::new(),
+ rustdoc: LazyCell::new(),
+ extra_verbose: false,
+ frozen: false,
+ locked: false,
+ jobserver: unsafe {
+ if GLOBAL_JOBSERVER.is_null() {
+ None
+ } else {
+ Some((*GLOBAL_JOBSERVER).clone())
+ }
+ },
+ cli_flags: CliUnstable::default(),
+ easy: LazyCell::new(),
+ }
+ }
+
+ pub fn default() -> CargoResult<Config> {
+ let shell = Shell::new();
+ let cwd = env::current_dir().chain_err(|| {
+ "couldn't get the current directory of the process"
+ })?;
+ let homedir = homedir(&cwd).ok_or_else(|| {
+ "Cargo couldn't find your home directory. \
+ This probably means that $HOME was not set."
+ })?;
+ Ok(Config::new(shell, cwd, homedir))
+ }
+
+ /// The user's cargo home directory (OS-dependent)
+ pub fn home(&self) -> &Filesystem { &self.home_path }
+
+ /// The cargo git directory (`<cargo_home>/git`)
+ pub fn git_path(&self) -> Filesystem {
+ self.home_path.join("git")
+ }
+
+ /// The cargo registry index directory (`<cargo_home>/registry/index`)
+ pub fn registry_index_path(&self) -> Filesystem {
+ self.home_path.join("registry").join("index")
+ }
+
+ /// The cargo registry cache directory (`<cargo_home>/registry/cache`)
+ pub fn registry_cache_path(&self) -> Filesystem {
+ self.home_path.join("registry").join("cache")
+ }
+
+ /// The cargo registry source directory (`<cargo_home>/registry/src`)
+ pub fn registry_source_path(&self) -> Filesystem {
+ self.home_path.join("registry").join("src")
+ }
+
+ /// Get a reference to the shell, for e.g. writing error messages
+ pub fn shell(&self) -> RefMut<Shell> {
+ self.shell.borrow_mut()
+ }
+
+ /// Get the path to the `rustdoc` executable
+ pub fn rustdoc(&self) -> CargoResult<&Path> {
+ self.rustdoc.get_or_try_init(|| self.get_tool("rustdoc")).map(AsRef::as_ref)
+ }
+
+ /// Get the path to the `rustc` executable
+ pub fn rustc(&self) -> CargoResult<&Rustc> {
+ self.rustc.get_or_try_init(|| Rustc::new(self.get_tool("rustc")?,
+ self.maybe_get_tool("rustc_wrapper")?))
+ }
+
+ /// Get the path to the `cargo` executable
+ pub fn cargo_exe(&self) -> CargoResult<&Path> {
+ self.cargo_exe.get_or_try_init(|| {
+ fn from_current_exe() -> CargoResult<PathBuf> {
+ // Try fetching the path to `cargo` using env::current_exe().
+ // The method varies per operating system and might fail; in particular,
+ // it depends on /proc being mounted on Linux, and some environments
+ // (like containers or chroots) may not have that available.
+ env::current_exe()
+ .and_then(|path| path.canonicalize())
+ .map_err(CargoError::from)
+ }
+
+ fn from_argv() -> CargoResult<PathBuf> {
+ // Grab argv[0] and attempt to resolve it to an absolute path.
+ // If argv[0] has one component, it must have come from a PATH lookup,
+ // so probe PATH in that case.
+ // Otherwise, it has multiple components and is either:
+ // - a relative path (e.g. `./cargo`, `target/debug/cargo`), or
+ // - an absolute path (e.g. `/usr/local/bin/cargo`).
+ // In either case, Path::canonicalize will return the full absolute path
+ // to the target if it exists
+ env::args_os()
+ .next()
+ .ok_or(CargoError::from("no argv[0]"))
+ .map(PathBuf::from)
+ .and_then(|argv0| {
+ if argv0.components().count() == 1 {
+ probe_path(argv0)
+ } else {
+ argv0.canonicalize().map_err(CargoError::from)
+ }
+ })
+ }
+
+ fn probe_path(argv0: PathBuf) -> CargoResult<PathBuf> {
+ let paths = env::var_os("PATH").ok_or(CargoError::from("no PATH"))?;
+ for path in env::split_paths(&paths) {
+ let candidate = PathBuf::from(path).join(&argv0);
+ if candidate.is_file() {
+ // PATH may have a component like "." in it, so we still need to
+ // canonicalize.
+ return candidate.canonicalize().map_err(CargoError::from);
+ }
+ }
+
+ Err(CargoError::from("no cargo executable candidate found in PATH"))
+ }
+
+ from_current_exe()
+ .or_else(|_| from_argv())
+ .chain_err(|| "couldn't get the path to cargo executable")
+ }).map(AsRef::as_ref)
+ }
+
+ pub fn values(&self) -> CargoResult<&HashMap<String, ConfigValue>> {
+ self.values.get_or_try_init(|| self.load_values())
+ }
+
+ pub fn set_values(&self, values: HashMap<String, ConfigValue>) -> CargoResult<()> {
+ if self.values.borrow().is_some() {
+ return Err("Config values already found".into());
+ }
+ match self.values.fill(values) {
+ Ok(()) => Ok(()),
+ Err(_) => Err("Could not fill values".into()),
+ }
+ }
+
+ pub fn cwd(&self) -> &Path { &self.cwd }
+
+ pub fn target_dir(&self) -> CargoResult<Option<Filesystem>> {
+ if let Some(dir) = env::var_os("CARGO_TARGET_DIR") {
+ Ok(Some(Filesystem::new(self.cwd.join(dir))))
+ } else if let Some(val) = self.get_path("build.target-dir")? {
+ let val = self.cwd.join(val.val);
+ Ok(Some(Filesystem::new(val)))
+ } else {
+ Ok(None)
+ }
+ }
+
+ fn get(&self, key: &str) -> CargoResult<Option<ConfigValue>> {
+ let vals = self.values()?;
+ let mut parts = key.split('.').enumerate();
+ let mut val = match vals.get(parts.next().unwrap().1) {
+ Some(val) => val,
+ None => return Ok(None),
+ };
+ for (i, part) in parts {
+ match *val {
+ CV::Table(ref map, _) => {
+ val = match map.get(part) {
+ Some(val) => val,
+ None => return Ok(None),
+ }
+ }
+ CV::Integer(_, ref path) |
+ CV::String(_, ref path) |
+ CV::List(_, ref path) |
+ CV::Boolean(_, ref path) => {
+ let idx = key.split('.').take(i)
+ .fold(0, |n, s| n + s.len()) + i - 1;
+ let key_so_far = &key[..idx];
+ bail!("expected table for configuration key `{}`, \
+ but found {} in {}",
+ key_so_far, val.desc(), path.display())
+ }
+ }
+ }
+ Ok(Some(val.clone()))
+ }
+
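+ /// Looks `key` up in the environment: dots and dashes are mapped to
+ /// underscores, the result is uppercased and prefixed with `CARGO_`, so
+ /// e.g. `build.target-dir` is read from `CARGO_BUILD_TARGET_DIR`.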
+ fn get_env<V: FromStr>(&self, key: &str) -> CargoResult<Option<Value<V>>>
+ where CargoError: From<V::Err>
+ {
+ let key = key.replace(".", "_")
+ .replace("-", "_")
+ .chars()
+ .flat_map(|c| c.to_uppercase())
+ .collect::<String>();
+ match env::var(&format!("CARGO_{}", key)) {
+ Ok(value) => {
+ Ok(Some(Value {
+ val: value.parse()?,
+ definition: Definition::Environment,
+ }))
+ }
+ Err(..) => Ok(None),
+ }
+ }
+
+ pub fn get_string(&self, key: &str) -> CargoResult<Option<Value<String>>> {
+ if let Some(v) = self.get_env(key)? {
+ return Ok(Some(v))
+ }
+ match self.get(key)? {
+ Some(CV::String(i, path)) => {
+ Ok(Some(Value {
+ val: i,
+ definition: Definition::Path(path),
+ }))
+ }
+ Some(val) => self.expected("string", key, val),
+ None => Ok(None),
+ }
+ }
+
+ pub fn get_bool(&self, key: &str) -> CargoResult<Option<Value<bool>>> {
+ if let Some(v) = self.get_env(key)? {
+ return Ok(Some(v))
+ }
+ match self.get(key)? {
+ Some(CV::Boolean(b, path)) => {
+ Ok(Some(Value {
+ val: b,
+ definition: Definition::Path(path),
+ }))
+ }
+ Some(val) => self.expected("bool", key, val),
+ None => Ok(None),
+ }
+ }
+
+ fn string_to_path(&self, value: String, definition: &Definition) -> PathBuf {
+ let is_path = value.contains('/') ||
+ (cfg!(windows) && value.contains('\\'));
+ if is_path {
+ definition.root(self).join(value)
+ } else {
+ // A pathless name
+ PathBuf::from(value)
+ }
+ }
+
+ pub fn get_path(&self, key: &str) -> CargoResult<Option<Value<PathBuf>>> {
+ if let Some(val) = self.get_string(key)? {
+ Ok(Some(Value {
+ val: self.string_to_path(val.val, &val.definition),
+ definition: val.definition
+ }))
+ } else {
+ Ok(None)
+ }
+ }
+
+ pub fn get_path_and_args(&self, key: &str)
+ -> CargoResult<Option<Value<(PathBuf, Vec<String>)>>> {
+ if let Some(mut val) = self.get_list_or_split_string(key)? {
+ if !val.val.is_empty() {
+ return Ok(Some(Value {
+ val: (self.string_to_path(val.val.remove(0), &val.definition), val.val),
+ definition: val.definition
+ }));
+ }
+ }
+ Ok(None)
+ }
+
+ pub fn get_list(&self, key: &str)
+ -> CargoResult<Option<Value<Vec<(String, PathBuf)>>>> {
+ match self.get(key)? {
+ Some(CV::List(i, path)) => {
+ Ok(Some(Value {
+ val: i,
+ definition: Definition::Path(path),
+ }))
+ }
+ Some(val) => self.expected("list", key, val),
+ None => Ok(None),
+ }
+ }
+
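+ /// Reads a value that may be specified either as a TOML array of strings
+ /// or as a single space-separated string, e.g. `build.rustflags` may be
+ /// written as `["-C", "lto"]` or as `"-C lto"`.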
+ pub fn get_list_or_split_string(&self, key: &str)
+ -> CargoResult<Option<Value<Vec<String>>>> {
+ match self.get_env::<String>(key) {
+ Ok(Some(value)) =>
+ return Ok(Some(Value {
+ val: value.val.split(' ').map(str::to_string).collect(),
+ definition: value.definition
+ })),
+ Err(err) => return Err(err),
+ Ok(None) => (),
+ }
+
+ match self.get(key)? {
+ Some(CV::List(i, path)) => {
+ Ok(Some(Value {
+ val: i.into_iter().map(|(s, _)| s).collect(),
+ definition: Definition::Path(path),
+ }))
+ }
+ Some(CV::String(i, path)) => {
+ Ok(Some(Value {
+ val: i.split(' ').map(str::to_string).collect(),
+ definition: Definition::Path(path),
+ }))
+ }
+ Some(val) => self.expected("list or string", key, val),
+ None => Ok(None),
+ }
+ }
+
+ pub fn get_table(&self, key: &str)
+ -> CargoResult<Option<Value<HashMap<String, CV>>>> {
+ match self.get(key)? {
+ Some(CV::Table(i, path)) => {
+ Ok(Some(Value {
+ val: i,
+ definition: Definition::Path(path),
+ }))
+ }
+ Some(val) => self.expected("table", key, val),
+ None => Ok(None),
+ }
+ }
+
+ pub fn get_i64(&self, key: &str) -> CargoResult<Option<Value<i64>>> {
+ if let Some(v) = self.get_env(key)? {
+ return Ok(Some(v))
+ }
+ match self.get(key)? {
+ Some(CV::Integer(i, path)) => {
+ Ok(Some(Value {
+ val: i,
+ definition: Definition::Path(path),
+ }))
+ }
+ Some(val) => self.expected("integer", key, val),
+ None => Ok(None),
+ }
+ }
+
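+ /// Number of times to retry a network operation, from the `net.retry`
+ /// config value (e.g. `retry = 3` under a `[net]` section in
+ /// `.cargo/config`); defaults to 2 when unset.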
+ pub fn net_retry(&self) -> CargoResult<i64> {
+ match self.get_i64("net.retry")? {
+ Some(v) => {
+ let value = v.val;
+ if value < 0 {
+ bail!("net.retry must be positive, but found {} in {}",
+ v.val, v.definition)
+ } else {
+ Ok(value)
+ }
+ }
+ None => Ok(2),
+ }
+ }
+
+ pub fn expected<T>(&self, ty: &str, key: &str, val: CV) -> CargoResult<T> {
+ val.expected(ty, key).map_err(|e| {
+ format!("invalid configuration for key `{}`\n{}", key, e).into()
+ })
+ }
+
+ pub fn configure(&mut self,
+ verbose: u32,
+ quiet: Option<bool>,
+ color: &Option<String>,
+ frozen: bool,
+ locked: bool,
+ unstable_flags: &[String]) -> CargoResult<()> {
+ let extra_verbose = verbose >= 2;
+ let verbose = if verbose == 0 {None} else {Some(true)};
+
+ // Ignore errors in the configuration files.
+ let cfg_verbose = self.get_bool("term.verbose").unwrap_or(None).map(|v| v.val);
+ let cfg_color = self.get_string("term.color").unwrap_or(None).map(|v| v.val);
+
+ let color = color.as_ref().or_else(|| cfg_color.as_ref());
+
+ let verbosity = match (verbose, cfg_verbose, quiet) {
+ (Some(true), _, None) |
+ (None, Some(true), None) => Verbosity::Verbose,
+
+ // command line takes precedence over configuration, so ignore the
+ // configuration.
+ (None, _, Some(true)) => Verbosity::Quiet,
+
+ // Can't pass both at the same time on the command line regardless
+ // of configuration.
+ (Some(true), _, Some(true)) => {
+ bail!("cannot set both --verbose and --quiet");
+ }
+
+ // Can't actually get `Some(false)` as a value from the command
+ // line, so just ignore them here to appease exhaustiveness checking
+ // in match statements.
+ (Some(false), _, _) |
+ (_, _, Some(false)) |
+
+ (None, Some(false), None) |
+ (None, None, None) => Verbosity::Normal,
+ };
+
+ self.shell().set_verbosity(verbosity);
+ self.shell().set_color_choice(color.map(|s| &s[..]))?;
+ self.extra_verbose = extra_verbose;
+ self.frozen = frozen;
+ self.locked = locked;
+ self.cli_flags.parse(unstable_flags)?;
+
+ Ok(())
+ }
+
+ pub fn cli_unstable(&self) -> &CliUnstable {
+ &self.cli_flags
+ }
+
+ pub fn extra_verbose(&self) -> bool {
+ self.extra_verbose
+ }
+
+ pub fn network_allowed(&self) -> bool {
+ !self.frozen
+ }
+
+ pub fn lock_update_allowed(&self) -> bool {
+ !self.frozen && !self.locked
+ }
+
+ /// Loads configuration from the filesystem
+ pub fn load_values(&self) -> CargoResult<HashMap<String, ConfigValue>> {
+ let mut cfg = CV::Table(HashMap::new(), PathBuf::from("."));
+
+ walk_tree(&self.cwd, |path| {
+ let mut contents = String::new();
+ let mut file = File::open(&path)?;
+ file.read_to_string(&mut contents).chain_err(|| {
+ format!("failed to read configuration file `{}`",
+ path.display())
+ })?;
+ let toml = cargo_toml::parse(&contents,
+ path,
+ self).chain_err(|| {
+ format!("could not parse TOML configuration in `{}`",
+ path.display())
+ })?;
+ let value = CV::from_toml(path, toml).chain_err(|| {
+ format!("failed to load TOML configuration from `{}`",
+ path.display())
+ })?;
+ cfg.merge(value).chain_err(|| {
+ format!("failed to merge configuration at `{}`", path.display())
+ })?;
+ Ok(())
+ }).chain_err(|| "Couldn't load Cargo configuration")?;
+
+ self.load_credentials(&mut cfg)?;
+ match cfg {
+ CV::Table(map, _) => Ok(map),
+ _ => unreachable!(),
+ }
+ }
+
+ /// Gets the index for a registry.
+ pub fn get_registry_index(&self, registry: &str) -> CargoResult<Url> {
+ Ok(match self.get_string(&format!("registries.{}.index", registry))? {
+ Some(index) => index.val.to_url()?,
+ None => return Err(CargoError::from(format!("No index found for registry: `{}`", registry)).into()),
+ })
+ }
+
+ /// Loads credentials config from the credentials file into the ConfigValue object, if present.
+ fn load_credentials(&self, cfg: &mut ConfigValue) -> CargoResult<()> {
+ let home_path = self.home_path.clone().into_path_unlocked();
+ let credentials = home_path.join("credentials");
+ if fs::metadata(&credentials).is_err() {
+ return Ok(());
+ }
+
+ let mut contents = String::new();
+ let mut file = File::open(&credentials)?;
+ file.read_to_string(&mut contents).chain_err(|| {
+ format!("failed to read configuration file `{}`", credentials.display())
+ })?;
+
+ let toml = cargo_toml::parse(&contents,
+ &credentials,
+ self).chain_err(|| {
+ format!("could not parse TOML configuration in `{}`", credentials.display())
+ })?;
+
+ let value = CV::from_toml(&credentials, toml).chain_err(|| {
+ format!("failed to load TOML configuration from `{}`", credentials.display())
+ })?;
+
+ let cfg = match *cfg {
+ CV::Table(ref mut map, _) => map,
+ _ => unreachable!(),
+ };
+
+ let registry = cfg.entry("registry".into())
+ .or_insert_with(|| CV::Table(HashMap::new(), PathBuf::from(".")));
+
+ match (registry, value) {
+ (&mut CV::Table(ref mut old, _), CV::Table(ref mut new, _)) => {
+ // Take ownership of `new` by swapping it with an empty hashmap, so we
+ // can move its contents into an iterator.
+ let new = mem::replace(new, HashMap::new());
+ for (key, value) in new {
+ old.insert(key, value);
+ }
+ }
+ _ => unreachable!(),
+ }
+
+ Ok(())
+ }
+
+ /// Look for a path for `tool` in an environment variable or config path, but return `None`
+ /// if it's not present.
+ fn maybe_get_tool(&self, tool: &str) -> CargoResult<Option<PathBuf>> {
+ let var = tool.chars().flat_map(|c| c.to_uppercase()).collect::<String>();
+ if let Some(tool_path) = env::var_os(&var) {
+ return Ok(Some(PathBuf::from(tool_path)));
+ }
+
+ let var = format!("build.{}", tool);
+ if let Some(tool_path) = self.get_path(&var)? {
+ return Ok(Some(tool_path.val));
+ }
+
+ Ok(None)
+ }
+
+ /// Look for a path for `tool` in an environment variable or config path, defaulting to `tool`
+ /// as a path.
+ fn get_tool(&self, tool: &str) -> CargoResult<PathBuf> {
+ self.maybe_get_tool(tool)
+ .map(|t| t.unwrap_or_else(|| PathBuf::from(tool)))
+ }
+
+ pub fn jobserver_from_env(&self) -> Option<&jobserver::Client> {
+ self.jobserver.as_ref()
+ }
+
+ pub fn http(&self) -> CargoResult<&RefCell<Easy>> {
+ let http = self.easy.get_or_try_init(|| {
+ ops::http_handle(self).map(RefCell::new)
+ })?;
+ {
+ let mut http = http.borrow_mut();
+ http.reset();
+ ops::configure_http_handle(self, &mut http)?;
+ }
+ Ok(http)
+ }
+}
+
+#[derive(Eq, PartialEq, Clone, Copy)]
+pub enum Location {
+ Project,
+ Global
+}
+
+#[derive(Eq, PartialEq, Clone, Deserialize)]
+pub enum ConfigValue {
+ Integer(i64, PathBuf),
+ String(String, PathBuf),
+ List(Vec<(String, PathBuf)>, PathBuf),
+ Table(HashMap<String, ConfigValue>, PathBuf),
+ Boolean(bool, PathBuf),
+}
+
+pub struct Value<T> {
+ pub val: T,
+ pub definition: Definition,
+}
+
+pub enum Definition {
+ Path(PathBuf),
+ Environment,
+}
+
+impl fmt::Debug for ConfigValue {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match *self {
+ CV::Integer(i, ref path) => write!(f, "{} (from {})", i,
+ path.display()),
+ CV::Boolean(b, ref path) => write!(f, "{} (from {})", b,
+ path.display()),
+ CV::String(ref s, ref path) => write!(f, "{} (from {})", s,
+ path.display()),
+ CV::List(ref list, ref path) => {
+ write!(f, "[")?;
+ for (i, &(ref s, ref path)) in list.iter().enumerate() {
+ if i > 0 { write!(f, ", ")?; }
+ write!(f, "{} (from {})", s, path.display())?;
+ }
+ write!(f, "] (from {})", path.display())
+ }
+ CV::Table(ref table, _) => write!(f, "{:?}", table),
+ }
+ }
+}
+
+impl Serialize for ConfigValue {
+ fn serialize<S: Serializer>(&self, s: S) -> Result<S::Ok, S::Error> {
+ match *self {
+ CV::String(ref string, _) => string.serialize(s),
+ CV::List(ref list, _) => {
+ let list: Vec<&String> = list.iter().map(|s| &s.0).collect();
+ list.serialize(s)
+ }
+ CV::Table(ref table, _) => table.serialize(s),
+ CV::Boolean(b, _) => b.serialize(s),
+ CV::Integer(i, _) => i.serialize(s),
+ }
+ }
+}
+
+impl ConfigValue {
+ fn from_toml(path: &Path, toml: toml::Value) -> CargoResult<ConfigValue> {
+ match toml {
+ toml::Value::String(val) => Ok(CV::String(val, path.to_path_buf())),
+ toml::Value::Boolean(b) => Ok(CV::Boolean(b, path.to_path_buf())),
+ toml::Value::Integer(i) => Ok(CV::Integer(i, path.to_path_buf())),
+ toml::Value::Array(val) => {
+ Ok(CV::List(val.into_iter().map(|toml| {
+ match toml {
+ toml::Value::String(val) => Ok((val, path.to_path_buf())),
+ v => Err(format!("expected string but found {} \
+ in list", v.type_str()).into()),
+ }
+ }).collect::<CargoResult<_>>()?, path.to_path_buf()))
+ }
+ toml::Value::Table(val) => {
+ Ok(CV::Table(val.into_iter().map(|(key, value)| {
+ let value = CV::from_toml(path, value).chain_err(|| {
+ format!("failed to parse key `{}`", key)
+ })?;
+ Ok((key, value))
+ }).collect::<CargoResult<_>>()?, path.to_path_buf()))
+ }
+ v => bail!("found TOML configuration value of unknown type `{}`",
+ v.type_str()),
+ }
+ }
+
+ fn into_toml(self) -> toml::Value {
+ match self {
+ CV::Boolean(s, _) => toml::Value::Boolean(s),
+ CV::String(s, _) => toml::Value::String(s),
+ CV::Integer(i, _) => toml::Value::Integer(i),
+ CV::List(l, _) => toml::Value::Array(l
+ .into_iter()
+ .map(|(s, _)| toml::Value::String(s))
+ .collect()),
+ CV::Table(l, _) => toml::Value::Table(l.into_iter()
+ .map(|(k, v)| (k, v.into_toml()))
+ .collect()),
+ }
+ }
+
+ fn merge(&mut self, from: ConfigValue) -> CargoResult<()> {
+ match (self, from) {
+ (&mut CV::String(..), CV::String(..)) |
+ (&mut CV::Integer(..), CV::Integer(..)) |
+ (&mut CV::Boolean(..), CV::Boolean(..)) => {}
+ (&mut CV::List(ref mut old, _), CV::List(ref mut new, _)) => {
+ let new = mem::replace(new, Vec::new());
+ old.extend(new.into_iter());
+ }
+ (&mut CV::Table(ref mut old, _), CV::Table(ref mut new, _)) => {
+ let new = mem::replace(new, HashMap::new());
+ for (key, value) in new {
+ match old.entry(key.clone()) {
+ Occupied(mut entry) => {
+ let path = value.definition_path().to_path_buf();
+ let entry = entry.get_mut();
+ entry.merge(value).chain_err(|| {
+ format!("failed to merge key `{}` between \
+ files:\n \
+ file 1: {}\n \
+ file 2: {}",
+ key,
+ entry.definition_path().display(),
+ path.display())
+
+ })?;
+ }
+ Vacant(entry) => { entry.insert(value); }
+ };
+ }
+ }
+ (expected, found) => {
+ return Err(internal(format!("expected {}, but found {}",
+ expected.desc(), found.desc())))
+ }
+ }
+
+ Ok(())
+ }
+
+ pub fn i64(&self, key: &str) -> CargoResult<(i64, &Path)> {
+ match *self {
+ CV::Integer(i, ref p) => Ok((i, p)),
+ _ => self.expected("integer", key),
+ }
+ }
+
+ pub fn string(&self, key: &str) -> CargoResult<(&str, &Path)> {
+ match *self {
+ CV::String(ref s, ref p) => Ok((s, p)),
+ _ => self.expected("string", key),
+ }
+ }
+
+ pub fn table(&self, key: &str)
+ -> CargoResult<(&HashMap<String, ConfigValue>, &Path)> {
+ match *self {
+ CV::Table(ref table, ref p) => Ok((table, p)),
+ _ => self.expected("table", key),
+ }
+ }
+
+ pub fn list(&self, key: &str) -> CargoResult<&[(String, PathBuf)]> {
+ match *self {
+ CV::List(ref list, _) => Ok(list),
+ _ => self.expected("list", key),
+ }
+ }
+
+ pub fn boolean(&self, key: &str) -> CargoResult<(bool, &Path)> {
+ match *self {
+ CV::Boolean(b, ref p) => Ok((b, p)),
+ _ => self.expected("bool", key),
+ }
+ }
+
+ pub fn desc(&self) -> &'static str {
+ match *self {
+ CV::Table(..) => "table",
+ CV::List(..) => "array",
+ CV::String(..) => "string",
+ CV::Boolean(..) => "boolean",
+ CV::Integer(..) => "integer",
+ }
+ }
+
+ pub fn definition_path(&self) -> &Path {
+ match *self {
+ CV::Boolean(_, ref p) |
+ CV::Integer(_, ref p) |
+ CV::String(_, ref p) |
+ CV::List(_, ref p) |
+ CV::Table(_, ref p) => p
+ }
+ }
+
+ pub fn expected<T>(&self, wanted: &str, key: &str) -> CargoResult<T> {
+ Err(format!("expected a {}, but found a {} for `{}` in {}",
+ wanted, self.desc(), key,
+ self.definition_path().display()).into())
+ }
+}
+
+impl Definition {
+ pub fn root<'a>(&'a self, config: &'a Config) -> &'a Path {
+ match *self {
+ Definition::Path(ref p) => p.parent().unwrap().parent().unwrap(),
+ Definition::Environment => config.cwd(),
+ }
+ }
+}
+
+impl fmt::Display for Definition {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ match *self {
+ Definition::Path(ref p) => p.display().fmt(f),
+ Definition::Environment => "the environment".fmt(f),
+ }
+ }
+}
+
+pub fn homedir(cwd: &Path) -> Option<PathBuf> {
+ ::home::cargo_home_with_cwd(cwd).ok()
+}
+
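+/// Visits `.cargo/config` files starting at `pwd` and walking up through
+/// its ancestor directories, then falls back to `$CARGO_HOME/config`. For a
+/// cwd of `/a/b/c` this would consider `/a/b/c/.cargo/config`,
+/// `/a/b/.cargo/config`, `/a/.cargo/config`, and `/.cargo/config`, in that
+/// order.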
+fn walk_tree<F>(pwd: &Path, mut walk: F) -> CargoResult<()>
+ where F: FnMut(&Path) -> CargoResult<()>
+{
+ let mut stash: HashSet<PathBuf> = HashSet::new();
+
+ for current in paths::ancestors(pwd) {
+ let possible = current.join(".cargo").join("config");
+ if fs::metadata(&possible).is_ok() {
+ walk(&possible)?;
+ stash.insert(possible);
+ }
+ }
+
+ // Once we're done, also be sure to check the home directory, even if it
+ // isn't an ancestor of the current directory, so we pick up configuration
+ // from that standard location as well.
+ let home = homedir(pwd).ok_or_else(|| {
+ CargoError::from("Cargo couldn't find your home directory. \
+ This probably means that $HOME was not set.")
+ })?;
+ let config = home.join("config");
+ if !stash.contains(&config) && fs::metadata(&config).is_ok() {
+ walk(&config)?;
+ }
+
+ Ok(())
+}
+
+pub fn save_credentials(cfg: &Config,
+ token: String,
+ registry: Option<String>) -> CargoResult<()> {
+ let mut file = {
+ cfg.home_path.create_dir()?;
+ cfg.home_path.open_rw(Path::new("credentials"), cfg,
+ "credentials' config file")?
+ };
+
+ let (key, value) = {
+ let key = "token".to_string();
+ let value = ConfigValue::String(token, file.path().to_path_buf());
+
+ if let Some(registry) = registry {
+ let mut map = HashMap::new();
+ map.insert(key, value);
+ (registry, CV::Table(map, file.path().to_path_buf()))
+ } else {
+ (key, value)
+ }
+ };
+
+ let mut contents = String::new();
+ file.read_to_string(&mut contents).chain_err(|| {
+ format!("failed to read configuration file `{}`", file.path().display())
+ })?;
+
+ let mut toml = cargo_toml::parse(&contents, file.path(), cfg)?;
+ toml.as_table_mut()
+ .unwrap()
+ .insert(key, value.into_toml());
+
+ let contents = toml.to_string();
+ file.seek(SeekFrom::Start(0))?;
+ file.write_all(contents.as_bytes())?;
+ file.file().set_len(contents.len() as u64)?;
+ set_permissions(file.file(), 0o600)?;
+
+ return Ok(());
+
+ #[cfg(unix)]
+ fn set_permissions(file: &File, mode: u32) -> CargoResult<()> {
+ use std::os::unix::fs::PermissionsExt;
+
+ let mut perms = file.metadata()?.permissions();
+ perms.set_mode(mode);
+ file.set_permissions(perms)?;
+ Ok(())
+ }
+
+ #[cfg(not(unix))]
+ #[allow(unused)]
+ fn set_permissions(file: &File, mode: u32) -> CargoResult<()> {
+ Ok(())
+ }
+}
--- /dev/null
+//! A graph-like structure used to represent a set of dependencies and in what
+//! order they should be built.
+//!
+//! This structure is used to store the dependency graph and dynamically update
+//! it to figure out when a dependency should be built.
+
+use std::collections::hash_map::Entry::{Occupied, Vacant};
+use std::collections::{HashMap, HashSet};
+use std::hash::Hash;
+
+pub use self::Freshness::{Fresh, Dirty};
+
+#[derive(Debug)]
+pub struct DependencyQueue<K: Eq + Hash, V> {
+ /// A list of all known keys to build.
+ ///
+ /// The value of the hash map is list of dependencies which still need to be
+ /// built before the package can be built. Note that the set is dynamically
+ /// updated as more dependencies are built.
+ dep_map: HashMap<K, (HashSet<K>, V)>,
+
+ /// A reverse mapping of a package to all packages that depend on that
+ /// package.
+ ///
+ /// This map is statically known and does not get updated throughout the
+ /// lifecycle of the DependencyQueue.
+ reverse_dep_map: HashMap<K, HashSet<K>>,
+
+ /// A set of dirty packages.
+ ///
+ /// Packages may become dirty over time if their dependencies are rebuilt.
+ dirty: HashSet<K>,
+
+ /// The packages which are currently being built, waiting for a call to
+ /// `finish`.
+ pending: HashSet<K>,
+}
+
+/// Indication of the freshness of a package.
+///
+/// A fresh package does not necessarily need to be rebuilt (unless a dependency
+/// was also rebuilt), and a dirty package must always be rebuilt.
+#[derive(PartialEq, Eq, Debug, Clone, Copy)]
+pub enum Freshness {
+ Fresh,
+ Dirty,
+}
+
+impl Freshness {
+ pub fn combine(&self, other: Freshness) -> Freshness {
+ match *self { Fresh => other, Dirty => Dirty }
+ }
+}
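+
+// For example, `Fresh.combine(Dirty)` is `Dirty`: a package only stays
+// fresh if everything combined into it is also fresh.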
+
+impl<K: Hash + Eq + Clone, V> Default for DependencyQueue<K, V> {
+ fn default() -> DependencyQueue<K, V> {
+ DependencyQueue::new()
+ }
+}
+
+impl<K: Hash + Eq + Clone, V> DependencyQueue<K, V> {
+ /// Creates a new dependency queue with 0 packages.
+ pub fn new() -> DependencyQueue<K, V> {
+ DependencyQueue {
+ dep_map: HashMap::new(),
+ reverse_dep_map: HashMap::new(),
+ dirty: HashSet::new(),
+ pending: HashSet::new(),
+ }
+ }
+
+ /// Adds a new package to this dependency queue.
+ ///
+ /// It is assumed that any dependencies of this package will eventually also
+ /// be added to the dependency queue.
+ pub fn queue(&mut self,
+ fresh: Freshness,
+ key: K,
+ value: V,
+ dependencies: &[K]) -> &mut V {
+ let slot = match self.dep_map.entry(key.clone()) {
+ Occupied(v) => return &mut v.into_mut().1,
+ Vacant(v) => v,
+ };
+
+ if fresh == Dirty {
+ self.dirty.insert(key.clone());
+ }
+
+ let mut my_dependencies = HashSet::new();
+ for dep in dependencies {
+ my_dependencies.insert(dep.clone());
+ let rev = self.reverse_dep_map.entry(dep.clone())
+ .or_insert_with(HashSet::new);
+ rev.insert(key.clone());
+ }
+ &mut slot.insert((my_dependencies, value)).1
+ }
+
+ /// Dequeues a package that is ready to be built.
+ ///
+ /// A package is ready to be built when it has 0 un-built dependencies. If
+ /// `None` is returned then no packages are ready to be built.
+ pub fn dequeue(&mut self) -> Option<(Freshness, K, V)> {
+ let key = match self.dep_map.iter()
+ .find(|&(_, &(ref deps, _))| deps.is_empty())
+ .map(|(key, _)| key.clone()) {
+ Some(key) => key,
+ None => return None
+ };
+ let (_, data) = self.dep_map.remove(&key).unwrap();
+ let fresh = if self.dirty.contains(&key) {Dirty} else {Fresh};
+ self.pending.insert(key.clone());
+ Some((fresh, key, data))
+ }
+
+ /// Returns `true` if there are no remaining packages to be built.
+ pub fn is_empty(&self) -> bool {
+ self.dep_map.is_empty() && self.pending.is_empty()
+ }
+
+ /// Returns the number of remaining packages to be built.
+ pub fn len(&self) -> usize {
+ self.dep_map.len() + self.pending.len()
+ }
+
+ /// Indicate that a package has been built.
+ ///
+ /// This function will update the dependency queue with this information,
+ /// possibly allowing the next invocation of `dequeue` to return a package.
+ pub fn finish(&mut self, key: &K, fresh: Freshness) {
+ assert!(self.pending.remove(key));
+ let reverse_deps = match self.reverse_dep_map.get(key) {
+ Some(deps) => deps,
+ None => return,
+ };
+ for dep in reverse_deps.iter() {
+ if fresh == Dirty {
+ self.dirty.insert(dep.clone());
+ }
+ assert!(self.dep_map.get_mut(dep).unwrap().0.remove(key));
+ }
+ }
+}
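+
+// A minimal sketch of driving the queue, using `&str` keys and no payload:
+//
+//     let mut queue = DependencyQueue::new();
+//     queue.queue(Fresh, "a", (), &[]);
+//     queue.queue(Fresh, "b", (), &["a"]);
+//     let (fresh, key, ()) = queue.dequeue().unwrap(); // "a" has no deps
+//     queue.finish(&key, fresh);                       // now "b" is ready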
--- /dev/null
+#![allow(unknown_lints)]
+
+use std::error::Error;
+use std::fmt;
+use std::io;
+use std::num;
+use std::process::{Output, ExitStatus};
+use std::str;
+use std::string;
+
+use core::TargetKind;
+
+use curl;
+use git2;
+use semver;
+use serde_json;
+use toml;
+use registry;
+use ignore;
+
+error_chain! {
+ types {
+ CargoError, CargoErrorKind, CargoResultExt, CargoResult;
+ }
+
+ links {
+ CrateRegistry(registry::Error, registry::ErrorKind);
+ }
+
+ foreign_links {
+ ParseSemver(semver::ReqParseError);
+ Semver(semver::SemVerError);
+ Ignore(ignore::Error);
+ Io(io::Error);
+ SerdeJson(serde_json::Error);
+ TomlSer(toml::ser::Error);
+ TomlDe(toml::de::Error);
+ ParseInt(num::ParseIntError);
+ ParseBool(str::ParseBoolError);
+ Parse(string::ParseError);
+ Git(git2::Error);
+ Curl(curl::Error);
+ }
+
+ errors {
+ Internal(err: Box<CargoErrorKind>) {
+ description(err.description())
+ display("{}", *err)
+ }
+ ProcessErrorKind(proc_err: ProcessError) {
+ description(&proc_err.desc)
+ display("{}", &proc_err.desc)
+ }
+ CargoTestErrorKind(test_err: CargoTestError) {
+ description(&test_err.desc)
+ display("{}", &test_err.desc)
+ }
+ HttpNot200(code: u32, url: String) {
+ description("failed to get a 200 response")
+ display("failed to get 200 response from `{}`, got {}", url, code)
+ }
+ }
+}
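+
+// Context is typically layered onto errors as they propagate, via
+// `chain_err`, e.g. (the path here is illustrative):
+//
+//     File::open(&path).chain_err(|| {
+//         format!("failed to open `{}`", path.display())
+//     })?;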
+
+impl CargoError {
+ pub fn into_internal(self) -> Self {
+ CargoError(CargoErrorKind::Internal(Box::new(self.0)), self.1)
+ }
+
+ fn is_human(&self) -> bool {
+ match self.0 {
+ CargoErrorKind::Msg(_) |
+ CargoErrorKind::TomlSer(_) |
+ CargoErrorKind::TomlDe(_) |
+ CargoErrorKind::Curl(_) |
+ CargoErrorKind::HttpNot200(..) |
+ CargoErrorKind::ProcessErrorKind(_) |
+ CargoErrorKind::CrateRegistry(_) => true,
+ CargoErrorKind::ParseSemver(_) |
+ CargoErrorKind::Semver(_) |
+ CargoErrorKind::Ignore(_) |
+ CargoErrorKind::Io(_) |
+ CargoErrorKind::SerdeJson(_) |
+ CargoErrorKind::ParseInt(_) |
+ CargoErrorKind::ParseBool(_) |
+ CargoErrorKind::Parse(_) |
+ CargoErrorKind::Git(_) |
+ CargoErrorKind::Internal(_) |
+ CargoErrorKind::CargoTestErrorKind(_) |
+ CargoErrorKind::__Nonexhaustive { .. } => false
+ }
+ }
+}
+
+
+// =============================================================================
+// Process errors
+#[derive(Debug)]
+pub struct ProcessError {
+ pub desc: String,
+ pub exit: Option<ExitStatus>,
+ pub output: Option<Output>,
+}
+
+// =============================================================================
+// Cargo test errors.
+
+/// Error when test cases fail
+#[derive(Debug)]
+pub struct CargoTestError {
+ pub test: Test,
+ pub desc: String,
+ pub exit: Option<ExitStatus>,
+ pub causes: Vec<ProcessError>,
+}
+
+#[derive(Debug)]
+pub enum Test {
+ Multiple,
+ Doc,
+ UnitTest(TargetKind, String)
+}
+
+impl CargoTestError {
+ pub fn new(test: Test, errors: Vec<ProcessError>) -> Self {
+ if errors.is_empty() {
+ panic!("Cannot create CargoTestError from empty Vec")
+ }
+ let desc = errors.iter().map(|error| error.desc.clone())
+ .collect::<Vec<String>>()
+ .join("\n");
+ CargoTestError {
+ test: test,
+ desc: desc,
+ exit: errors[0].exit,
+ causes: errors,
+ }
+ }
+
+ pub fn hint(&self) -> String {
+ match self.test {
+ Test::UnitTest(ref kind, ref name) => {
+ match *kind {
+ TargetKind::Bench => format!("test failed, to rerun pass '--bench {}'", name),
+ TargetKind::Bin => format!("test failed, to rerun pass '--bin {}'", name),
+ TargetKind::Lib(_) => "test failed, to rerun pass '--lib'".into(),
+ TargetKind::Test => format!("test failed, to rerun pass '--test {}'", name),
+ TargetKind::ExampleBin | TargetKind::ExampleLib(_) =>
+ format!("test failed, to rerun pass '--example {}", name),
+ _ => "test failed.".into()
+ }
+ },
+ Test::Doc => "test failed, to rerun pass '--doc'".into(),
+ _ => "test failed.".into()
+ }
+ }
+}
+
+// =============================================================================
+// CLI errors
+
+pub type CliResult = Result<(), CliError>;
+
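+/// The error type for CLI commands: an optional underlying `CargoError`,
+/// whether the error is "unknown" (i.e. not intended for humans), and the
+/// process exit code to use.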
+#[derive(Debug)]
+pub struct CliError {
+ pub error: Option<CargoError>,
+ pub unknown: bool,
+ pub exit_code: i32
+}
+
+impl Error for CliError {
+ fn description(&self) -> &str {
+ self.error.as_ref().map(|e| e.description())
+ .unwrap_or("unknown cli error")
+ }
+
+ fn cause(&self) -> Option<&Error> {
+ self.error.as_ref().and_then(|e| e.cause())
+ }
+}
+
+impl fmt::Display for CliError {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ if let Some(ref error) = self.error {
+ error.fmt(f)
+ } else {
+ self.description().fmt(f)
+ }
+ }
+}
+
+impl CliError {
+ pub fn new(error: CargoError, code: i32) -> CliError {
+ let human = &error.is_human();
+ CliError { error: Some(error), exit_code: code, unknown: !human }
+ }
+
+ pub fn code(code: i32) -> CliError {
+ CliError { error: None, exit_code: code, unknown: false }
+ }
+}
+
+impl From<CargoError> for CliError {
+ fn from(err: CargoError) -> CliError {
+ CliError::new(err, 101)
+ }
+}
+
+
+// =============================================================================
+// Construction helpers
+
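+/// Builds a `ProcessError` whose description includes the exit status (or
+/// signal, on Unix) and, when available, the captured stdout/stderr.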
+pub fn process_error(msg: &str,
+ status: Option<&ExitStatus>,
+ output: Option<&Output>) -> ProcessError
+{
+ let exit = match status {
+ Some(s) => status_to_string(s),
+ None => "never executed".to_string(),
+ };
+ let mut desc = format!("{} ({})", &msg, exit);
+
+ if let Some(out) = output {
+ match str::from_utf8(&out.stdout) {
+ Ok(s) if !s.trim().is_empty() => {
+ desc.push_str("\n--- stdout\n");
+ desc.push_str(s);
+ }
+ Ok(..) | Err(..) => {}
+ }
+ match str::from_utf8(&out.stderr) {
+ Ok(s) if !s.trim().is_empty() => {
+ desc.push_str("\n--- stderr\n");
+ desc.push_str(s);
+ }
+ Ok(..) | Err(..) => {}
+ }
+ }
+
+ return ProcessError {
+ desc: desc,
+ exit: status.cloned(),
+ output: output.cloned(),
+ };
+
+ #[cfg(unix)]
+ fn status_to_string(status: &ExitStatus) -> String {
+ use std::os::unix::process::*;
+ use libc;
+
+ if let Some(signal) = status.signal() {
+ let name = match signal as libc::c_int {
+ libc::SIGABRT => ", SIGABRT: process abort signal",
+ libc::SIGALRM => ", SIGALRM: alarm clock",
+ libc::SIGFPE => ", SIGFPE: erroneous arithmetic operation",
+ libc::SIGHUP => ", SIGHUP: hangup",
+ libc::SIGILL => ", SIGILL: illegal instruction",
+ libc::SIGINT => ", SIGINT: terminal interrupt signal",
+ libc::SIGKILL => ", SIGKILL: kill",
+ libc::SIGPIPE => ", SIGPIPE: write on a pipe with no one to read",
+ libc::SIGQUIT => ", SIGQUIT: terminal quite signal",
+ libc::SIGSEGV => ", SIGSEGV: invalid memory reference",
+ libc::SIGTERM => ", SIGTERM: termination signal",
+ libc::SIGBUS => ", SIGBUS: access to undefined memory",
+ #[cfg(not(target_os = "haiku"))]
+ libc::SIGSYS => ", SIGSYS: bad system call",
+ libc::SIGTRAP => ", SIGTRAP: trace/breakpoint trap",
+ _ => "",
+ };
+ format!("signal: {}{}", signal, name)
+ } else {
+ status.to_string()
+ }
+ }
+
+ #[cfg(windows)]
+ fn status_to_string(status: &ExitStatus) -> String {
+ status.to_string()
+ }
+}
+
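+/// Wraps any displayable value as an internal (non-"human") `CargoError`.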
+pub fn internal<S: fmt::Display>(error: S) -> CargoError {
+ _internal(&error)
+}
+
+fn _internal(error: &fmt::Display) -> CargoError {
+ CargoError::from_kind(error.to_string().into()).into_internal()
+}
--- /dev/null
+use std::fs::{self, File, OpenOptions};
+use std::io::*;
+use std::io;
+use std::path::{Path, PathBuf, Display};
+
+use termcolor::Color::Cyan;
+use fs2::{FileExt, lock_contended_error};
+#[allow(unused_imports)]
+use libc;
+
+use util::Config;
+use util::errors::{CargoResult, CargoResultExt};
+
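+/// A handle to a file whose lifetime manages an advisory lock: the lock is
+/// released when the `FileLock` is dropped.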
+pub struct FileLock {
+ f: Option<File>,
+ path: PathBuf,
+ state: State,
+}
+
+#[derive(PartialEq)]
+enum State {
+ Unlocked,
+ Shared,
+ Exclusive,
+}
+
+impl FileLock {
+ /// Returns the underlying file handle of this lock.
+ pub fn file(&self) -> &File {
+ self.f.as_ref().unwrap()
+ }
+
+ /// Returns the underlying path that this lock points to.
+ ///
+ /// Note that special care must be taken to ensure that the path is not
+ /// referenced outside the lifetime of this lock.
+ pub fn path(&self) -> &Path {
+ assert!(self.state != State::Unlocked);
+ &self.path
+ }
+
+ /// Returns the parent path containing this file
+ pub fn parent(&self) -> &Path {
+ assert!(self.state != State::Unlocked);
+ self.path.parent().unwrap()
+ }
+
+ /// Removes all sibling files to this locked file.
+ ///
+ /// This can be useful if a directory is locked with a sentinel file but it
+ /// needs to be cleared out as it may be corrupt.
+ pub fn remove_siblings(&self) -> io::Result<()> {
+ let path = self.path();
+ for entry in path.parent().unwrap().read_dir()? {
+ let entry = entry?;
+ if Some(&entry.file_name()[..]) == path.file_name() {
+ continue
+ }
+ let kind = entry.file_type()?;
+ if kind.is_dir() {
+ fs::remove_dir_all(entry.path())?;
+ } else {
+ fs::remove_file(entry.path())?;
+ }
+ }
+ Ok(())
+ }
+}
+
+impl Read for FileLock {
+ fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
+ self.file().read(buf)
+ }
+}
+
+impl Seek for FileLock {
+ fn seek(&mut self, to: SeekFrom) -> io::Result<u64> {
+ self.file().seek(to)
+ }
+}
+
+impl Write for FileLock {
+ fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
+ self.file().write(buf)
+ }
+
+ fn flush(&mut self) -> io::Result<()> {
+ self.file().flush()
+ }
+}
+
+impl Drop for FileLock {
+ fn drop(&mut self) {
+ if self.state != State::Unlocked {
+ if let Some(f) = self.f.take() {
+ let _ = f.unlock();
+ }
+ }
+ }
+}
+
+/// A "filesystem" is intended to be a globally shared, hence locked, resource
+/// in Cargo.
+///
+/// The `Path` of a filesystem cannot be learned unless it's done in a locked
+/// fashion, and otherwise functions on this structure are prepared to handle
+/// concurrent invocations across multiple instances of Cargo.
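+///
+/// A sketch of intended usage (assumes a `config: &Config` is in scope):
+///
+/// ```ignore
+/// let fs = Filesystem::new(PathBuf::from("/path/to/shared/dir"));
+/// let lock = fs.open_rw("sentinel-file", config, "shared dir")?;
+/// // The exclusive lock is held until `lock` is dropped.
+/// ```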
+#[derive(Clone, Debug)]
+pub struct Filesystem {
+ root: PathBuf,
+}
+
+impl Filesystem {
+ /// Creates a new filesystem to be rooted at the given path.
+ pub fn new(path: PathBuf) -> Filesystem {
+ Filesystem { root: path }
+ }
+
+ /// Like `Path::join`, creates a new filesystem rooted at this filesystem
+ /// joined with the given path.
+ pub fn join<T: AsRef<Path>>(&self, other: T) -> Filesystem {
+ Filesystem::new(self.root.join(other))
+ }
+
+ /// Like `Path::push`, pushes a new path component onto this filesystem.
+ pub fn push<T: AsRef<Path>>(&mut self, other: T) {
+ self.root.push(other);
+ }
+
+ /// Consumes this filesystem and returns the underlying `PathBuf`.
+ ///
+ /// Note that this is a relatively dangerous operation and should be used
+ /// with great caution!
+ pub fn into_path_unlocked(self) -> PathBuf {
+ self.root
+ }
+
+ /// Creates the directory pointed to by this filesystem.
+ ///
+ /// Handles errors where other Cargo processes are also attempting to
+ /// concurrently create this directory.
+ pub fn create_dir(&self) -> io::Result<()> {
+ create_dir_all(&self.root)
+ }
+
+ /// Returns an adaptor that can be used to print the path of this
+ /// filesystem.
+ pub fn display(&self) -> Display {
+ self.root.display()
+ }
+
+ /// Opens exclusive access to a file, returning the locked version of a
+ /// file.
+ ///
+ /// This function will create a file at `path` if it doesn't already exist
+ /// (including intermediate directories), and then it will acquire an
+ /// exclusive lock on `path`. If the process must block waiting for the
+ /// lock, the `msg` is printed to `config`.
+ ///
+ /// The returned file can be accessed to look at the path and also has
+ /// read/write access to the underlying file.
+ pub fn open_rw<P>(&self,
+ path: P,
+ config: &Config,
+ msg: &str) -> CargoResult<FileLock>
+ where P: AsRef<Path>
+ {
+ self.open(path.as_ref(),
+ OpenOptions::new().read(true).write(true).create(true),
+ State::Exclusive,
+ config,
+ msg)
+ }
+
+ /// Opens shared access to a file, returning the locked version of a file.
+ ///
+ /// This function will fail if `path` doesn't already exist, but if it does
+ /// then it will acquire a shared lock on `path`. If the process must block
+ /// waiting for the lock, the `msg` is printed to `config`.
+ ///
+ /// The returned file can be accessed to look at the path and also has read
+ /// access to the underlying file. Any writes to the file will return an
+ /// error.
+ pub fn open_ro<P>(&self,
+ path: P,
+ config: &Config,
+ msg: &str) -> CargoResult<FileLock>
+ where P: AsRef<Path>
+ {
+ self.open(path.as_ref(),
+ OpenOptions::new().read(true),
+ State::Shared,
+ config,
+ msg)
+ }
+
+ fn open(&self,
+ path: &Path,
+ opts: &OpenOptions,
+ state: State,
+ config: &Config,
+ msg: &str) -> CargoResult<FileLock> {
+ let path = self.root.join(path);
+
+ // If we want an exclusive lock then if we fail because of NotFound it's
+ // likely because an intermediate directory didn't exist, so try to
+ // create the directory and then continue.
+ let f = opts.open(&path).or_else(|e| {
+ if e.kind() == io::ErrorKind::NotFound && state == State::Exclusive {
+ create_dir_all(path.parent().unwrap())?;
+ opts.open(&path)
+ } else {
+ Err(e)
+ }
+ }).chain_err(|| {
+ format!("failed to open: {}", path.display())
+ })?;
+ match state {
+ State::Exclusive => {
+ acquire(config, msg, &path,
+ &|| f.try_lock_exclusive(),
+ &|| f.lock_exclusive())?;
+ }
+ State::Shared => {
+ acquire(config, msg, &path,
+ &|| f.try_lock_shared(),
+ &|| f.lock_shared())?;
+ }
+ State::Unlocked => {}
+
+ }
+ Ok(FileLock { f: Some(f), path: path, state: state })
+ }
+}
+
+/// Acquires a lock on a file in a "nice" manner.
+///
+/// Almost all long-running blocking actions in Cargo have a status message
+/// associated with them as we're not sure how long they'll take. Whenever a
+/// conflicted file lock happens, this is the case (we're not sure when the lock
+/// will be released).
+///
+/// This function will acquire the lock on a `path`, printing out a nice message
+/// to the console if we have to wait for it. It will first attempt to use `try`
+/// to acquire a lock on the crate, and in the case of contention it will emit a
+/// status message based on `msg` to `config`'s shell, and then use `block` to
+/// block waiting to acquire a lock.
+///
+/// Returns an error if the lock could not be acquired or if any error other
+/// than a contention error happens.
+fn acquire(config: &Config,
+ msg: &str,
+ path: &Path,
+ try: &Fn() -> io::Result<()>,
+ block: &Fn() -> io::Result<()>) -> CargoResult<()> {
+
+ // File locking on Unix is currently implemented via `flock`, which is known
+ // to be broken on NFS. We could in theory just ignore errors that happen on
+ // NFS, but apparently the failure mode [1] for `flock` on NFS is **blocking
+ // forever**, even if the nonblocking flag is passed!
+ //
+ // As a result, we just skip all file locks entirely on NFS mounts. That
+ // should avoid calling any `flock` functions at all, and it wouldn't work
+ // there anyway.
+ //
+ // [1]: https://github.com/rust-lang/cargo/issues/2615
+ if is_on_nfs_mount(path) {
+ return Ok(())
+ }
+
+ match try() {
+ Ok(()) => return Ok(()),
+
+ // In addition to ignoring NFS which is commonly not working we also
+ // just ignore locking on filesystems that look like they don't
+ // implement file locking. We detect that here via the return value of
+ // locking (e.g. inspecting errno).
+ #[cfg(unix)]
+ Err(ref e) if e.raw_os_error() == Some(libc::ENOTSUP) => return Ok(()),
+
+ #[cfg(target_os = "linux")]
+ Err(ref e) if e.raw_os_error() == Some(libc::ENOSYS) => return Ok(()),
+
+ Err(e) => {
+ if e.raw_os_error() != lock_contended_error().raw_os_error() {
+ return Err(e).chain_err(|| {
+ format!("failed to lock file: {}", path.display())
+ })
+ }
+ }
+ }
+ let msg = format!("waiting for file lock on {}", msg);
+ config.shell().status_with_color("Blocking", &msg, Cyan)?;
+
+ return block().chain_err(|| {
+ format!("failed to lock file: {}", path.display())
+ });
+
+ #[cfg(all(target_os = "linux", not(target_env = "musl")))]
+ fn is_on_nfs_mount(path: &Path) -> bool {
+ use std::ffi::CString;
+ use std::mem;
+ use std::os::unix::prelude::*;
+
+ let path = match CString::new(path.as_os_str().as_bytes()) {
+ Ok(path) => path,
+ Err(_) => return false,
+ };
+
+ unsafe {
+ let mut buf: libc::statfs = mem::zeroed();
+ let r = libc::statfs(path.as_ptr(), &mut buf);
+
+ r == 0 && buf.f_type as u32 == libc::NFS_SUPER_MAGIC as u32
+ }
+ }
+
+ #[cfg(any(not(target_os = "linux"), target_env = "musl"))]
+ fn is_on_nfs_mount(_path: &Path) -> bool {
+ false
+ }
+}
+
+fn create_dir_all(path: &Path) -> io::Result<()> {
+ match create_dir(path) {
+ Ok(()) => Ok(()),
+ Err(e) => {
+ if e.kind() == io::ErrorKind::NotFound {
+ if let Some(p) = path.parent() {
+ return create_dir_all(p).and_then(|()| create_dir(path))
+ }
+ }
+ Err(e)
+ }
+ }
+}
+
+fn create_dir(path: &Path) -> io::Result<()> {
+ match fs::create_dir(path) {
+ Ok(()) => Ok(()),
+ Err(ref e) if e.kind() == io::ErrorKind::AlreadyExists => Ok(()),
+ Err(e) => Err(e),
+ }
+}
--- /dev/null
+use std::fmt;
+use std::hash::Hash;
+use std::collections::hash_set::{HashSet, Iter};
+use std::collections::hash_map::{HashMap, Keys};
+
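+/// A directed graph over nodes of type `N`, stored as a map from each node to
+/// the set of its children.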
+pub struct Graph<N> {
+ nodes: HashMap<N, HashSet<N>>
+}
+
+enum Mark {
+ InProgress,
+ Done
+}
+
+pub type Nodes<'a, N> = Keys<'a, N, HashSet<N>>;
+pub type Edges<'a, N> = Iter<'a, N>;
+
+impl<N: Eq + Hash + Clone> Graph<N> {
+ pub fn new() -> Graph<N> {
+ Graph { nodes: HashMap::new() }
+ }
+
+ pub fn add(&mut self, node: N, children: &[N]) {
+ self.nodes.entry(node)
+ .or_insert_with(HashSet::new)
+ .extend(children.iter().cloned());
+ }
+
+ pub fn link(&mut self, node: N, child: N) {
+ self.nodes.entry(node).or_insert_with(HashSet::new).insert(child);
+ }
+
+ pub fn get_nodes(&self) -> &HashMap<N, HashSet<N>> {
+ &self.nodes
+ }
+
+ pub fn edges(&self, node: &N) -> Option<Edges<N>> {
+ self.nodes.get(node).map(|set| set.iter())
+ }
+
+ pub fn sort(&self) -> Option<Vec<N>> {
+ let mut ret = Vec::new();
+ let mut marks = HashMap::new();
+
+ for node in self.nodes.keys() {
+ self.visit(node, &mut ret, &mut marks);
+ }
+
+ Some(ret)
+ }
+
+ fn visit(&self, node: &N, dst: &mut Vec<N>, marks: &mut HashMap<N, Mark>) {
+ if marks.contains_key(node) {
+ return;
+ }
+
+ marks.insert(node.clone(), Mark::InProgress);
+
+ for child in &self.nodes[node] {
+ self.visit(child, dst, marks);
+ }
+
+ dst.push(node.clone());
+ marks.insert(node.clone(), Mark::Done);
+ }
+
+ pub fn iter(&self) -> Nodes<N> {
+ self.nodes.keys()
+ }
+}
+
+impl<N: Eq + Hash + Clone> Default for Graph<N> {
+ fn default() -> Graph<N> {
+ Graph::new()
+ }
+}
+
+impl<N: fmt::Display + Eq + Hash> fmt::Debug for Graph<N> {
+ fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
+ writeln!(fmt, "Graph {{")?;
+
+ for (n, e) in &self.nodes {
+ writeln!(fmt, " - {}", n)?;
+
+ for n in e.iter() {
+ writeln!(fmt, " - {}", n)?;
+ }
+ }
+
+ write!(fmt, "}}")?;
+
+ Ok(())
+ }
+}
+
+impl<N: Eq + Hash> PartialEq for Graph<N> {
+ fn eq(&self, other: &Graph<N>) -> bool { self.nodes.eq(&other.nodes) }
+}
+impl<N: Eq + Hash> Eq for Graph<N> {}
+
+impl<N: Eq + Hash + Clone> Clone for Graph<N> {
+ fn clone(&self) -> Graph<N> {
+ Graph { nodes: self.nodes.clone() }
+ }
+}
--- /dev/null
+#![allow(deprecated)]
+
+use hex::ToHex;
+use std::hash::{Hasher, Hash, SipHasher};
+
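+/// Formats a `u64` as 16 lowercase hex digits in little-endian byte order,
+/// e.g. `to_hex(1) == "0100000000000000"`.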
+pub fn to_hex(num: u64) -> String {
+ [
+ (num >> 0) as u8,
+ (num >> 8) as u8,
+ (num >> 16) as u8,
+ (num >> 24) as u8,
+ (num >> 32) as u8,
+ (num >> 40) as u8,
+ (num >> 48) as u8,
+ (num >> 56) as u8,
+ ].to_hex()
+}
+
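+/// Hashes a value with a `SipHasher` seeded with fixed zero keys, giving a
+/// deterministic `u64` for a given input.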
+pub fn hash_u64<H: Hash>(hashable: &H) -> u64 {
+ let mut hasher = SipHasher::new_with_keys(0, 0);
+ hashable.hash(&mut hasher);
+ hasher.finish()
+}
+
+pub fn short_hash<H: Hash>(hashable: &H) -> String {
+ to_hex(hash_u64(hashable))
+}
--- /dev/null
+use std::fs;
+use std::path::{Path, PathBuf};
+use util::errors::CargoResult;
+use util::paths;
+
+/// Iteratively search for `file` in `pwd` and its parents, returning
+/// the path of the directory.
+pub fn find_project(pwd: &Path, file: &str) -> CargoResult<PathBuf> {
+ find_project_manifest(pwd, file).map(|mut p| {
+ // remove the file, leaving just the directory
+ p.pop();
+ p
+ })
+}
+
+/// Iteratively search for `file` in `pwd` and its parents, returning
+/// the path to the file.
+pub fn find_project_manifest(pwd: &Path, file: &str) -> CargoResult<PathBuf> {
+ let mut current = pwd;
+
+ loop {
+ let manifest = current.join(file);
+ if fs::metadata(&manifest).is_ok() {
+ return Ok(manifest)
+ }
+
+ match current.parent() {
+ Some(p) => current = p,
+ None => break,
+ }
+ }
+
+ bail!("could not find `{}` in `{}` or any parent directory",
+ file, pwd.display())
+}
+
+/// Find the root `Cargo.toml`, either at an explicitly provided manifest path
+/// or by searching upwards from `cwd`.
+pub fn find_root_manifest_for_wd(manifest_path: Option<String>, cwd: &Path)
+ -> CargoResult<PathBuf> {
+ match manifest_path {
+ Some(path) => {
+ let absolute_path = paths::normalize_path(&cwd.join(&path));
+ if !absolute_path.ends_with("Cargo.toml") {
+ bail!("the manifest-path must be a path to a Cargo.toml file")
+ }
+ if !fs::metadata(&absolute_path).is_ok() {
+ bail!("manifest path `{}` does not exist", path)
+ }
+ Ok(absolute_path)
+ },
+ None => find_project_manifest(cwd, "Cargo.toml"),
+ }
+}
+
+/// Return the path to the `file` in `pwd`, if it exists.
+pub fn find_project_manifest_exact(pwd: &Path, file: &str) -> CargoResult<PathBuf> {
+ let manifest = pwd.join(file);
+
+ if fs::metadata(&manifest).is_ok() {
+ Ok(manifest)
+ } else {
+ Err(format!("Could not find `{}` in `{}`",
+ file, pwd.display()).into())
+ }
+}
--- /dev/null
+//! Job management (mostly for Windows)
+//!
+//! Most of the time when you're running cargo you expect Ctrl-C to actually
+//! terminate the entire tree of processes in play, not just the one at the top
+//! (cargo). This currently works "by default" on Unix platforms because Ctrl-C
+//! actually sends a signal to the *process group* rather than the parent
+//! process, so everything will get torn down. On Windows, however, this does
+//! not happen and Ctrl-C just kills cargo.
+//!
+//! To achieve the same semantics on Windows we use Job Objects to ensure that
+//! all processes die at the same time. Job objects have a mode of operation
+//! where when all handles to the object are closed it causes all child
+//! processes associated with the object to be terminated immediately.
+//! Conveniently whenever a process in the job object spawns a new process the
+//! child will be associated with the job object as well. This means if we add
+//! ourselves to the job object we create then everything will get torn down!
+
+pub use self::imp::Setup;
+
+pub fn setup() -> Option<Setup> {
+ unsafe { imp::setup() }
+}
+
+#[cfg(unix)]
+mod imp {
+ use std::env;
+ use libc;
+
+ pub type Setup = ();
+
+ pub unsafe fn setup() -> Option<()> {
+ // There's a test case for the behavior of
+ // when-cargo-is-killed-subprocesses-are-also-killed, but that requires
+ // one cargo spawned to become its own session leader, so we do that
+ // here.
+ if env::var("__CARGO_TEST_SETSID_PLEASE_DONT_USE_ELSEWHERE").is_ok() {
+ libc::setsid();
+ }
+ Some(())
+ }
+}
+
+#[cfg(windows)]
+mod imp {
+ extern crate kernel32;
+ extern crate winapi;
+ extern crate psapi;
+
+ use std::ffi::OsString;
+ use std::io;
+ use std::mem;
+ use std::os::windows::prelude::*;
+
+ pub struct Setup {
+ job: Handle,
+ }
+
+ pub struct Handle {
+ inner: winapi::HANDLE,
+ }
+
+ fn last_err() -> io::Error {
+ io::Error::last_os_error()
+ }
+
+ pub unsafe fn setup() -> Option<Setup> {
+ // Creates a new job object for us to use and then adds ourselves to it.
+ // Note that all errors are basically ignored in this function,
+ // intentionally. Job objects are "relatively new" in Windows,
+ // particularly the ability to support nested job objects. Older
+ // Windows installs don't support this ability. We probably don't want
+ // to force Cargo to abort in this situation or force others to *not*
+ // use job objects, so we instead just ignore errors and assume that
+ // we're otherwise part of someone else's job object in this case.
+
+ let job = kernel32::CreateJobObjectW(0 as *mut _, 0 as *const _);
+ if job.is_null() {
+ return None
+ }
+ let job = Handle { inner: job };
+
+ // Indicate that when all handles to the job object are gone, all
+ // processes in the object should be killed. Note that this includes our
+ // entire process tree by default because we've added ourselves, and
+ // our children will reside in the job once we spawn a process.
+ let mut info: winapi::JOBOBJECT_EXTENDED_LIMIT_INFORMATION;
+ info = mem::zeroed();
+ info.BasicLimitInformation.LimitFlags =
+ winapi::JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE;
+ let r = kernel32::SetInformationJobObject(job.inner,
+ winapi::JobObjectExtendedLimitInformation,
+ &mut info as *mut _ as winapi::LPVOID,
+ mem::size_of_val(&info) as winapi::DWORD);
+ if r == 0 {
+ return None
+ }
+
+ // Assign our process to this job object, meaning that our children will
+ // now live or die based on our existence.
+ let me = kernel32::GetCurrentProcess();
+ let r = kernel32::AssignProcessToJobObject(job.inner, me);
+ if r == 0 {
+ return None
+ }
+
+ Some(Setup { job: job })
+ }
+
+ impl Drop for Setup {
+ fn drop(&mut self) {
+ // This is a little subtle. By default, if we are terminated then all
+ // processes in our job object are terminated as well, but we
+ // intentionally want to whitelist some processes to outlive our job
+ // object (see below).
+ //
+ // To allow for this, we manually kill processes instead of letting
+ // the job object kill them for us. We do this in a loop to handle
+ // processes spawning other processes.
+ //
+ // Finally once this is all done we know that the only remaining
+ // ones are ourselves and the whitelisted processes. The destructor
+ // here then configures our job object to *not* kill everything on
+ // close, then closes the job object.
+ unsafe {
+ while self.kill_remaining() {
+ info!("killed some, going for more");
+ }
+
+ let mut info: winapi::JOBOBJECT_EXTENDED_LIMIT_INFORMATION;
+ info = mem::zeroed();
+ let r = kernel32::SetInformationJobObject(
+ self.job.inner,
+ winapi::JobObjectExtendedLimitInformation,
+ &mut info as *mut _ as winapi::LPVOID,
+ mem::size_of_val(&info) as winapi::DWORD);
+ if r == 0 {
+ info!("failed to configure job object to defaults: {}",
+ last_err());
+ }
+ }
+ }
+ }
+
+ impl Setup {
+ unsafe fn kill_remaining(&mut self) -> bool {
+ #[repr(C)]
+ struct Jobs {
+ header: winapi::JOBOBJECT_BASIC_PROCESS_ID_LIST,
+ list: [winapi::ULONG_PTR; 1024],
+ }
+
+ let mut jobs: Jobs = mem::zeroed();
+ let r = kernel32::QueryInformationJobObject(
+ self.job.inner,
+ winapi::JobObjectBasicProcessIdList,
+ &mut jobs as *mut _ as winapi::LPVOID,
+ mem::size_of_val(&jobs) as winapi::DWORD,
+ 0 as *mut _);
+ if r == 0 {
+ info!("failed to query job object: {}", last_err());
+ return false
+ }
+
+ let mut killed = false;
+ let list = &jobs.list[..jobs.header.NumberOfProcessIdsInList as usize];
+ assert!(list.len() > 0);
+ info!("found {} remaining processes", list.len() - 1);
+
+ let list = list.iter().filter(|&&id| {
+ // let's not kill ourselves
+ id as winapi::DWORD != kernel32::GetCurrentProcessId()
+ }).filter_map(|&id| {
+ // Open the process with the necessary rights, and if this
+ // fails then we probably raced with the process exiting so we
+ // ignore the problem.
+ let flags = winapi::PROCESS_QUERY_INFORMATION |
+ winapi::PROCESS_TERMINATE |
+ winapi::SYNCHRONIZE;
+ let p = kernel32::OpenProcess(flags,
+ winapi::FALSE,
+ id as winapi::DWORD);
+ if p.is_null() {
+ None
+ } else {
+ Some(Handle { inner: p })
+ }
+ }).filter(|p| {
+ // Test if this process was actually in the job object or not.
+ // If it's not then we likely raced with something else
+ // recycling this PID, so we just skip this step.
+ let mut res = 0;
+ let r = kernel32::IsProcessInJob(p.inner, self.job.inner, &mut res);
+ if r == 0 {
+ info!("failed to test is process in job: {}", last_err());
+ return false
+ }
+ res == winapi::TRUE
+ });
+
+
+ for p in list {
+ // Load the file which this process was spawned from. We then
+ // later use this for identification purposes.
+ let mut buf = [0; 1024];
+ let r = psapi::GetProcessImageFileNameW(p.inner,
+ buf.as_mut_ptr(),
+ buf.len() as winapi::DWORD);
+ if r == 0 {
+ info!("failed to get image name: {}", last_err());
+ continue
+ }
+ let s = OsString::from_wide(&buf[..r as usize]);
+ info!("found remaining: {:?}", s);
+
+ // And here's where we find the whole purpose for this
+ // function! Currently, our only whitelisted process is
+ // `mspdbsrv.exe`, and more details about that can be found
+ // here:
+ //
+ // https://github.com/rust-lang/rust/issues/33145
+ //
+ // The gist of it is that all builds on one machine use the
+ // same `mspdbsrv.exe` instance. If we were to kill this
+ // instance then we could erroneously cause other builds to
+ // fail.
+ if let Some(s) = s.to_str() {
+ if s.contains("mspdbsrv") {
+ info!("\toops, this is mspdbsrv");
+ continue
+ }
+ }
+
+ // Ok, this isn't mspdbsrv, let's kill the process. After we
+ // kill it we wait on it to ensure that the next time around in
+ // this function we're not going to see it again.
+ let r = kernel32::TerminateProcess(p.inner, 1);
+ if r == 0 {
+ info!("\tfailed to kill subprocess: {}", last_err());
+ info!("\tassuming subprocess is dead...");
+ } else {
+ info!("\tterminated subprocess");
+ }
+ let r = kernel32::WaitForSingleObject(p.inner, winapi::INFINITE);
+ if r != 0 {
+ info!("failed to wait for process to die: {}", last_err());
+ return false
+ }
+ killed = true;
+ }
+
+ return killed
+ }
+ }
+
+ impl Drop for Handle {
+ fn drop(&mut self) {
+ unsafe { kernel32::CloseHandle(self.inner); }
+ }
+ }
+}
--- /dev/null
+//! A lazily filled cell, but with frozen contents.
+//!
+//! With a `RefCell`, the inner contents cannot be borrowed for the lifetime of
+//! the entire object, but only for the lifetime of the returned borrows. A
+//! `LazyCell` is a variation on `RefCell` which allows borrows to be tied to
+//! the lifetime of the outer object.
+//!
+//! The limitation of a `LazyCell` is that once it has been initialized, its
+//! contents can never be modified unless you also have a `&mut` reference.
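+//!
+//! A sketch of intended usage:
+//!
+//! ```ignore
+//! let cell = LazyCell::new();
+//! assert!(cell.borrow().is_none());
+//! cell.fill(5).unwrap();
+//! assert_eq!(cell.borrow(), Some(&5));
+//! ```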
+
+use std::cell::UnsafeCell;
+
+#[derive(Debug)]
+pub struct LazyCell<T> {
+ inner: UnsafeCell<Option<T>>,
+}
+
+impl<T> LazyCell<T> {
+ /// Creates a new empty lazy cell.
+ pub fn new() -> LazyCell<T> {
+ LazyCell { inner: UnsafeCell::new(None) }
+ }
+
+ /// Put a value into this cell.
+ ///
+ /// This function will fail if the cell has already been filled.
+ pub fn fill(&self, t: T) -> Result<(), T> {
+ unsafe {
+ let slot = self.inner.get();
+ if (*slot).is_none() {
+ *slot = Some(t);
+ Ok(())
+ } else {
+ Err(t)
+ }
+ }
+ }
+
+ /// Borrows the contents of this lazy cell for the duration of the cell
+ /// itself.
+ ///
+ /// This function will return `Some` if the cell has been previously
+ /// initialized, and `None` if it has not yet been initialized.
+ pub fn borrow(&self) -> Option<&T> {
+ unsafe {
+ (*self.inner.get()).as_ref()
+ }
+ }
+
+ /// Same as `borrow`, but the mutable version
+ pub fn borrow_mut(&mut self) -> Option<&mut T> {
+ unsafe {
+ (*self.inner.get()).as_mut()
+ }
+ }
+
+ /// Consumes this `LazyCell`, returning the underlying value.
+ pub fn into_inner(self) -> Option<T> {
+ unsafe {
+ self.inner.into_inner()
+ }
+ }
+
+ /// Borrows the contents of this lazy cell, initializing it if necessary.
+ pub fn get_or_try_init<Error, F>(&self, init: F) -> Result<&T, Error>
+ where F: FnOnce() -> Result<T, Error>
+ {
+ if self.borrow().is_none() && self.fill(init()?).is_err() {
+ unreachable!();
+ }
+ Ok(self.borrow().unwrap())
+ }
+}
--- /dev/null
+use std::cmp;
+
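+/// Computes the Levenshtein edit distance between two strings, operating on
+/// `char`s rather than bytes.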
+pub fn lev_distance(me: &str, t: &str) -> usize {
+ if me.is_empty() { return t.chars().count(); }
+ if t.is_empty() { return me.chars().count(); }
+
+ let mut dcol = (0..t.len() + 1).collect::<Vec<_>>();
+ let mut t_last = 0;
+
+ for (i, sc) in me.chars().enumerate() {
+
+ let mut current = i;
+ dcol[0] = current + 1;
+
+ for (j, tc) in t.chars().enumerate() {
+
+ let next = dcol[j + 1];
+
+ if sc == tc {
+ dcol[j + 1] = current;
+ } else {
+ dcol[j + 1] = cmp::min(current, next);
+ dcol[j + 1] = cmp::min(dcol[j + 1], dcol[j]) + 1;
+ }
+
+ current = next;
+ t_last = j;
+ }
+ }
+
+ dcol[t_last + 1]
+}
+
+#[test]
+fn test_lev_distance() {
+ use std::char::{ from_u32, MAX };
+ // Test bytelength agnosticity
+ for c in (0u32..MAX as u32)
+ .filter_map(|i| from_u32(i))
+ .map(|i| i.to_string()) {
+ assert_eq!(lev_distance(&c, &c), 0);
+ }
+
+ let a = "\nMäry häd ä little lämb\n\nLittle lämb\n";
+ let b = "\nMary häd ä little lämb\n\nLittle lämb\n";
+ let c = "Mary häd ä little lämb\n\nLittle lämb\n";
+ assert_eq!(lev_distance(a, b), 1);
+ assert_eq!(lev_distance(b, a), 1);
+ assert_eq!(lev_distance(a, c), 2);
+ assert_eq!(lev_distance(c, a), 2);
+ assert_eq!(lev_distance(b, c), 1);
+ assert_eq!(lev_distance(c, b), 1);
+}
--- /dev/null
+use serde::ser;
+use serde_json::{self, Value};
+
+use core::{PackageId, Target, Profile};
+
+pub trait Message: ser::Serialize {
+ fn reason(&self) -> &str;
+}
+
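+/// Serializes `t` to JSON, injects its `reason` tag, and prints the result as
+/// a single line on stdout.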
+pub fn emit<T: Message>(t: &T) {
+ let mut json: Value = serde_json::to_value(t).unwrap();
+ json["reason"] = json!(t.reason());
+ println!("{}", json);
+}
+
+#[derive(Serialize)]
+pub struct FromCompiler<'a> {
+ pub package_id: &'a PackageId,
+ pub target: &'a Target,
+ pub message: serde_json::Value,
+}
+
+impl<'a> Message for FromCompiler<'a> {
+ fn reason(&self) -> &str {
+ "compiler-message"
+ }
+}
+
+#[derive(Serialize)]
+pub struct Artifact<'a> {
+ pub package_id: &'a PackageId,
+ pub target: &'a Target,
+ pub profile: &'a Profile,
+ pub features: Vec<String>,
+ pub filenames: Vec<String>,
+ pub fresh: bool,
+}
+
+impl<'a> Message for Artifact<'a> {
+ fn reason(&self) -> &str {
+ "compiler-artifact"
+ }
+}
+
+#[derive(Serialize)]
+pub struct BuildScript<'a> {
+ pub package_id: &'a PackageId,
+ pub linked_libs: &'a [String],
+ pub linked_paths: &'a [String],
+ pub cfgs: &'a [String],
+ pub env: &'a [(String, String)],
+}
+
+impl<'a> Message for BuildScript<'a> {
+ fn reason(&self) -> &str {
+ "build-script-executed"
+ }
+}
--- /dev/null
+pub use self::cfg::{Cfg, CfgExpr};
+pub use self::config::{Config, ConfigValue, homedir};
+pub use self::dependency_queue::{DependencyQueue, Fresh, Dirty, Freshness};
+pub use self::errors::{CargoResult, CargoResultExt, CargoError, CargoErrorKind, Test, CliResult};
+pub use self::errors::{CliError, ProcessError, CargoTestError};
+pub use self::errors::{process_error, internal};
+pub use self::flock::{FileLock, Filesystem};
+pub use self::graph::Graph;
+pub use self::hex::{to_hex, short_hash, hash_u64};
+pub use self::lazy_cell::LazyCell;
+pub use self::lev_distance::{lev_distance};
+pub use self::paths::{join_paths, path2bytes, bytes2path, dylib_path};
+pub use self::paths::{normalize_path, dylib_path_envvar, without_prefix};
+pub use self::process_builder::{process, ProcessBuilder};
+pub use self::rustc::Rustc;
+pub use self::sha256::Sha256;
+pub use self::to_semver::ToSemver;
+pub use self::to_url::ToUrl;
+pub use self::vcs::{GitRepo, HgRepo, PijulRepo, FossilRepo};
+pub use self::read2::read2;
+pub use self::progress::Progress;
+
+pub mod config;
+pub mod errors;
+pub mod graph;
+pub mod hex;
+pub mod important_paths;
+pub mod job;
+pub mod lev_distance;
+pub mod machine_message;
+pub mod network;
+pub mod paths;
+pub mod process_builder;
+pub mod profile;
+pub mod to_semver;
+pub mod to_url;
+pub mod toml;
+mod cfg;
+mod dependency_queue;
+mod rustc;
+mod sha256;
+mod vcs;
+mod lazy_cell;
+mod flock;
+mod read2;
+mod progress;
--- /dev/null
+use std;
+use std::error::Error;
+
+use error_chain::ChainedError;
+use util::Config;
+use util::errors::{CargoError, CargoErrorKind, CargoResult};
+use git2;
+
+fn maybe_spurious<E, EKind>(err: &E) -> bool
+ where E: ChainedError<ErrorKind=EKind> + 'static {
+ // Error inspection in non-verbose mode requires inspecting the
+ // error kind to avoid printing Internal errors. The downcasting
+ // machinery requires &(Error + 'static), but the iterator (and
+ // underlying `cause`) return &Error. Because the borrows are
+ // constrained to this handling method, and because the original
+ // error object is constrained to be 'static, we're casting away
+ // the borrow's actual lifetime for purposes of downcasting and
+ // inspecting the error chain.
+ unsafe fn extend_lifetime(r: &Error) -> &(Error + 'static) {
+ std::mem::transmute::<&Error, &Error>(r)
+ }
+
+ for e in err.iter() {
+ let e = unsafe { extend_lifetime(e) };
+ if let Some(cargo_err) = e.downcast_ref::<CargoError>() {
+ match cargo_err.kind() {
+ &CargoErrorKind::Git(ref git_err) => {
+ match git_err.class() {
+ git2::ErrorClass::Net |
+ git2::ErrorClass::Os => return true,
+ _ => ()
+ }
+ }
+ &CargoErrorKind::Curl(ref curl_err)
+ if curl_err.is_couldnt_connect() ||
+ curl_err.is_couldnt_resolve_proxy() ||
+ curl_err.is_couldnt_resolve_host() ||
+ curl_err.is_operation_timedout() ||
+ curl_err.is_recv_error() => {
+ return true
+ }
+ &CargoErrorKind::HttpNot200(code, ref _url) if 500 <= code && code < 600 => {
+ return true
+ }
+ _ => ()
+ }
+ }
+ }
+ false
+}
+
+/// Wrapper method for network call retry logic.
+///
+/// The retry count is taken from the `net.retry` Config value. The Config
+/// shell outputs a warning for each retry.
+///
+/// Closure must return a `CargoResult`.
+///
+/// # Examples
+///
+/// ```ignore
+/// use util::network;
+/// let cargo_result = network::with_retry(&config, || something.download());
+/// ```
+pub fn with_retry<T, F>(config: &Config, mut callback: F) -> CargoResult<T>
+ where F: FnMut() -> CargoResult<T>
+{
+ let mut remaining = config.net_retry()?;
+ loop {
+ match callback() {
+ Ok(ret) => return Ok(ret),
+ Err(ref e) if maybe_spurious(e) && remaining > 0 => {
+ let msg = format!("spurious network error ({} tries \
+ remaining): {}", remaining, e);
+ config.shell().warn(msg)?;
+ remaining -= 1;
+ }
+ // TODO: impl From
+ Err(e) => return Err(e.into()),
+ }
+ }
+}
+#[test]
+fn with_retry_repeats_the_call_then_works() {
+ // Error HTTP codes (5xx) are considered maybe_spurious and will prompt a retry.
+ let error1 = CargoErrorKind::HttpNot200(501, "Uri".to_string()).into();
+ let error2 = CargoErrorKind::HttpNot200(502, "Uri".to_string()).into();
+ let mut results: Vec<CargoResult<()>> = vec![Ok(()), Err(error1), Err(error2)];
+ let config = Config::default().unwrap();
+ let result = with_retry(&config, || results.pop().unwrap());
+ assert_eq!(result.unwrap(), ())
+}
+
+#[test]
+fn with_retry_finds_nested_spurious_errors() {
+ // Error HTTP codes (5xx) are considered maybe_spurious and will prompt a retry.
+ // String error messages are not considered spurious.
+ let error1 : CargoError = CargoErrorKind::HttpNot200(501, "Uri".to_string()).into();
+ let error1 = CargoError::with_chain(error1, "A non-spurious wrapping err");
+ let error2 = CargoError::from_kind(CargoErrorKind::HttpNot200(502, "Uri".to_string()));
+ let error2 = CargoError::with_chain(error2, "A second chained error");
+ let mut results: Vec<CargoResult<()>> = vec![Ok(()), Err(error1), Err(error2)];
+ let config = Config::default().unwrap();
+ let result = with_retry(&config, || results.pop().unwrap());
+ assert_eq!(result.unwrap(), ())
+}
--- /dev/null
+use std::env;
+use std::ffi::{OsStr, OsString};
+use std::fs::File;
+use std::fs::OpenOptions;
+use std::io::prelude::*;
+use std::path::{Path, PathBuf, Component};
+
+use util::{internal, CargoResult};
+use util::errors::CargoResultExt;
+
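+/// Joins `paths` with the platform's path-list separator (as for `$PATH`),
+/// mapping failures to an error that names the offending `env` variable.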
+pub fn join_paths<T: AsRef<OsStr>>(paths: &[T], env: &str) -> CargoResult<OsString> {
+ env::join_paths(paths.iter()).or_else(|e| {
+ let paths = paths.iter().map(Path::new).collect::<Vec<_>>();
+ Err(internal(format!("failed to join path array: {:?}", paths))).chain_err(|| {
+ format!("failed to join search paths together: {}\n\
+ Does ${} have an unterminated quote character?",
+ e, env)
+ })
+ })
+}
+
+pub fn dylib_path_envvar() -> &'static str {
+ if cfg!(windows) {"PATH"}
+ else if cfg!(target_os = "macos") {"DYLD_LIBRARY_PATH"}
+ else {"LD_LIBRARY_PATH"}
+}
+
+pub fn dylib_path() -> Vec<PathBuf> {
+ match env::var_os(dylib_path_envvar()) {
+ Some(var) => env::split_paths(&var).collect(),
+ None => Vec::new(),
+ }
+}
+
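+/// Normalizes a path lexically: strips `.` components and resolves `..`
+/// against the components seen so far, without consulting the filesystem.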
+pub fn normalize_path(path: &Path) -> PathBuf {
+ let mut components = path.components().peekable();
+ let mut ret = if let Some(c @ Component::Prefix(..)) = components.peek()
+ .cloned() {
+ components.next();
+ PathBuf::from(c.as_os_str())
+ } else {
+ PathBuf::new()
+ };
+
+ for component in components {
+ match component {
+ Component::Prefix(..) => unreachable!(),
+ Component::RootDir => { ret.push(component.as_os_str()); }
+ Component::CurDir => {}
+ Component::ParentDir => { ret.pop(); }
+ Component::Normal(c) => { ret.push(c); }
+ }
+ }
+ ret
+}
+
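+/// Returns the remainder of `long_path` after stripping `prefix`, or `None`
+/// if `prefix` is not actually a prefix of `long_path`.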
+pub fn without_prefix<'a>(long_path: &'a Path, prefix: &'a Path) -> Option<&'a Path> {
+ let mut a = long_path.components();
+ let mut b = prefix.components();
+ loop {
+ match b.next() {
+ Some(y) => match a.next() {
+ Some(x) if x == y => continue,
+ _ => return None,
+ },
+ None => return Some(a.as_path()),
+ }
+ }
+}
+
+pub fn read(path: &Path) -> CargoResult<String> {
+ match String::from_utf8(read_bytes(path)?) {
+ Ok(s) => Ok(s),
+ Err(_) => bail!("path at `{}` was not valid utf-8", path.display()),
+ }
+}
+
+pub fn read_bytes(path: &Path) -> CargoResult<Vec<u8>> {
+ (|| -> CargoResult<_> {
+ let mut ret = Vec::new();
+ let mut f = File::open(path)?;
+ if let Ok(m) = f.metadata() {
+ ret.reserve(m.len() as usize + 1);
+ }
+ f.read_to_end(&mut ret)?;
+ Ok(ret)
+ })().chain_err(|| {
+ format!("failed to read `{}`", path.display())
+ })
+}
+
+pub fn write(path: &Path, contents: &[u8]) -> CargoResult<()> {
+ (|| -> CargoResult<()> {
+ let mut f = File::create(path)?;
+ f.write_all(contents)?;
+ Ok(())
+ })().chain_err(|| {
+ format!("failed to write `{}`", path.display())
+ })
+}
+
+pub fn append(path: &Path, contents: &[u8]) -> CargoResult<()> {
+ (|| -> CargoResult<()> {
+ let mut f = OpenOptions::new()
+ .write(true)
+ .append(true)
+ .create(true)
+ .open(path)?;
+
+ f.write_all(contents)?;
+ Ok(())
+ })().chain_err(|| {
+ internal(format!("failed to write `{}`", path.display()))
+ })
+}
+
+#[cfg(unix)]
+pub fn path2bytes(path: &Path) -> CargoResult<&[u8]> {
+ use std::os::unix::prelude::*;
+ Ok(path.as_os_str().as_bytes())
+}
+#[cfg(windows)]
+pub fn path2bytes(path: &Path) -> CargoResult<&[u8]> {
+ match path.as_os_str().to_str() {
+ Some(s) => Ok(s.as_bytes()),
+ None => Err(format!("invalid non-unicode path: {}",
+ path.display()).into())
+ }
+}
+
+#[cfg(unix)]
+pub fn bytes2path(bytes: &[u8]) -> CargoResult<PathBuf> {
+ use std::os::unix::prelude::*;
+ use std::ffi::OsStr;
+ Ok(PathBuf::from(OsStr::from_bytes(bytes)))
+}
+#[cfg(windows)]
+pub fn bytes2path(bytes: &[u8]) -> CargoResult<PathBuf> {
+ use std::str;
+ match str::from_utf8(bytes) {
+ Ok(s) => Ok(PathBuf::from(s)),
+ Err(..) => Err("invalid non-unicode path".into()),
+ }
+}
+
+pub fn ancestors(path: &Path) -> PathAncestors {
+ PathAncestors::new(path)
+}
+
+pub struct PathAncestors<'a> {
+ current: Option<&'a Path>,
+ stop_at: Option<PathBuf>
+}
+
+impl<'a> PathAncestors<'a> {
+ fn new(path: &Path) -> PathAncestors {
+ PathAncestors {
+ current: Some(path),
+ // HACK: avoid reading `~/.cargo/config` when testing Cargo itself.
+ stop_at: env::var("__CARGO_TEST_ROOT").ok().map(PathBuf::from),
+ }
+ }
+}
+
+impl<'a> Iterator for PathAncestors<'a> {
+ type Item = &'a Path;
+
+ fn next(&mut self) -> Option<&'a Path> {
+ if let Some(path) = self.current {
+ self.current = path.parent();
+
+ if let Some(ref stop_at) = self.stop_at {
+ if path == stop_at {
+ self.current = None;
+ }
+ }
+
+ Some(path)
+ } else {
+ None
+ }
+ }
+}
--- /dev/null
+use std::collections::HashMap;
+use std::env;
+use std::ffi::{OsString, OsStr};
+use std::fmt;
+use std::path::Path;
+use std::process::{Command, Stdio, Output};
+
+use jobserver::Client;
+use shell_escape::escape;
+
+use util::{CargoResult, CargoResultExt, CargoError, process_error, read2};
+use util::errors::CargoErrorKind;
+
+/// A builder object for an external process, similar to `std::process::Command`.
+#[derive(Clone, Debug)]
+pub struct ProcessBuilder {
+ /// The program to execute.
+ program: OsString,
+ /// A list of arguments to pass to the program.
+ args: Vec<OsString>,
+ /// Any environment variables that should be set for the program.
+ env: HashMap<String, Option<OsString>>,
+ /// Which directory to run the program from.
+ cwd: Option<OsString>,
+ /// The `make` jobserver. See the [jobserver crate][jobserver_docs] for
+ /// more information.
+ ///
+ /// [jobserver_docs]: https://docs.rs/jobserver/0.1.6/jobserver/
+ jobserver: Option<Client>,
+}
+
+impl fmt::Display for ProcessBuilder {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "`{}", self.program.to_string_lossy())?;
+
+ for arg in &self.args {
+ write!(f, " {}", escape(arg.to_string_lossy()))?;
+ }
+
+ write!(f, "`")
+ }
+}
+
+impl ProcessBuilder {
+ /// (chainable) Set the executable for the process.
+ pub fn program<T: AsRef<OsStr>>(&mut self, program: T) -> &mut ProcessBuilder {
+ self.program = program.as_ref().to_os_string();
+ self
+ }
+
+ /// (chainable) Add an arg to the args list.
+ pub fn arg<T: AsRef<OsStr>>(&mut self, arg: T) -> &mut ProcessBuilder {
+ self.args.push(arg.as_ref().to_os_string());
+ self
+ }
+
+ /// (chainable) Add many args to the args list.
+ pub fn args<T: AsRef<OsStr>>(&mut self, arguments: &[T]) -> &mut ProcessBuilder {
+ self.args.extend(arguments.iter().map(|t| {
+ t.as_ref().to_os_string()
+ }));
+ self
+ }
+
+ /// (chainable) Replace args with new args list
+ pub fn args_replace<T: AsRef<OsStr>>(&mut self, arguments: &[T]) -> &mut ProcessBuilder {
+ self.args = arguments.iter().map(|t| {
+ t.as_ref().to_os_string()
+ }).collect();
+ self
+ }
+
+ /// (chainable) Set the current working directory of the process
+ pub fn cwd<T: AsRef<OsStr>>(&mut self, path: T) -> &mut ProcessBuilder {
+ self.cwd = Some(path.as_ref().to_os_string());
+ self
+ }
+
+ /// (chainable) Set an environment variable for the process.
+ pub fn env<T: AsRef<OsStr>>(&mut self, key: &str,
+ val: T) -> &mut ProcessBuilder {
+ self.env.insert(key.to_string(), Some(val.as_ref().to_os_string()));
+ self
+ }
+
+ /// (chainable) Unset an environment variable for the process.
+ pub fn env_remove(&mut self, key: &str) -> &mut ProcessBuilder {
+ self.env.insert(key.to_string(), None);
+ self
+ }
+
+ /// Get the executable name.
+ pub fn get_program(&self) -> &OsString {
+ &self.program
+ }
+
+ /// Get the program arguments
+ pub fn get_args(&self) -> &[OsString] {
+ &self.args
+ }
+
+ /// Get the current working directory for the process
+ pub fn get_cwd(&self) -> Option<&Path> {
+ self.cwd.as_ref().map(Path::new)
+ }
+
+ /// Get an environment variable as the process will see it (will inherit from environment
+ /// unless explicitly unset).
+ pub fn get_env(&self, var: &str) -> Option<OsString> {
+ self.env.get(var).cloned().or_else(|| Some(env::var_os(var)))
+ .and_then(|s| s)
+ }
+
+ /// Get all environment variables explicitly set or unset for the process (not inherited
+ /// vars).
+ pub fn get_envs(&self) -> &HashMap<String, Option<OsString>> { &self.env }
+
+ /// Set the `make` jobserver. See the [jobserver crate][jobserver_docs] for
+ /// more information.
+ ///
+ /// [jobserver_docs]: https://docs.rs/jobserver/0.1.6/jobserver/
+ pub fn inherit_jobserver(&mut self, jobserver: &Client) -> &mut Self {
+ self.jobserver = Some(jobserver.clone());
+ self
+ }
+
+ /// Run the process, waiting for completion, and mapping non-success exit codes to an error.
+ pub fn exec(&self) -> CargoResult<()> {
+ let mut command = self.build_command();
+ let exit = command.status().chain_err(|| {
+ CargoErrorKind::ProcessErrorKind(
+ process_error(&format!("could not execute process `{}`",
+ self.debug_string()), None, None))
+ })?;
+
+ if exit.success() {
+ Ok(())
+ } else {
+ Err(CargoErrorKind::ProcessErrorKind(process_error(
+ &format!("process didn't exit successfully: `{}`", self.debug_string()),
+ Some(&exit), None)).into())
+ }
+ }
+
+ /// On unix, executes the process using the unix syscall `execvp`, which will block this
+ /// process, and will only return if there is an error. On windows this is a synonym for
+ /// `exec`.
+ #[cfg(unix)]
+ pub fn exec_replace(&self) -> CargoResult<()> {
+ use std::os::unix::process::CommandExt;
+
+ let mut command = self.build_command();
+ let error = command.exec();
+ Err(CargoError::with_chain(error,
+ CargoErrorKind::ProcessErrorKind(process_error(
+ &format!("could not execute process `{}`", self.debug_string()), None, None))))
+ }
+
+ /// On unix, executes the process using the unix syscall `execvp`, which will block this
+ /// process, and will only return if there is an error. On windows this is a synonym for
+ /// `exec`.
+ #[cfg(windows)]
+ pub fn exec_replace(&self) -> CargoResult<()> {
+ self.exec()
+ }
+
+ /// Execute the process, returning the stdio output, or an error if non-zero exit status.
+ pub fn exec_with_output(&self) -> CargoResult<Output> {
+ let mut command = self.build_command();
+
+ let output = command.output().chain_err(|| {
+ CargoErrorKind::ProcessErrorKind(
+ process_error(
+ &format!("could not execute process `{}`", self.debug_string()),
+ None, None))
+ })?;
+
+ if output.status.success() {
+ Ok(output)
+ } else {
+ Err(CargoErrorKind::ProcessErrorKind(process_error(
+ &format!("process didn't exit successfully: `{}`", self.debug_string()),
+ Some(&output.status), Some(&output))).into())
+ }
+ }
+
+ /// Execute a command, passing each line of stdout and stderr to the supplied callbacks, which
+ /// can mutate the string data.
+ ///
+ /// If any invocation of these functions returns an error, it will be propagated.
+ ///
+ /// Optionally, the captured output can be attached to the returned error via `print_output`.
+ pub fn exec_with_streaming(&self,
+ on_stdout_line: &mut FnMut(&str) -> CargoResult<()>,
+ on_stderr_line: &mut FnMut(&str) -> CargoResult<()>,
+ print_output: bool)
+ -> CargoResult<Output> {
+ let mut stdout = Vec::new();
+ let mut stderr = Vec::new();
+
+ let mut cmd = self.build_command();
+ cmd.stdout(Stdio::piped())
+ .stderr(Stdio::piped())
+ .stdin(Stdio::null());
+
+ let mut callback_error = None;
+ let status = (|| {
+ let mut child = cmd.spawn()?;
+ let out = child.stdout.take().unwrap();
+ let err = child.stderr.take().unwrap();
+ read2(out, err, &mut |is_out, data, eof| {
+ let idx = if eof {
+ data.len()
+ } else {
+ match data.iter().rposition(|b| *b == b'\n') {
+ Some(i) => i + 1,
+ None => return,
+ }
+ };
+ let data = data.drain(..idx);
+ let dst = if is_out {&mut stdout} else {&mut stderr};
+ let start = dst.len();
+ dst.extend(data);
+ for line in String::from_utf8_lossy(&dst[start..]).lines() {
+ if callback_error.is_some() { break }
+ let callback_result = if is_out {
+ on_stdout_line(line)
+ } else {
+ on_stderr_line(line)
+ };
+ if let Err(e) = callback_result {
+ callback_error = Some(e);
+ }
+ }
+ })?;
+ child.wait()
+ })().chain_err(|| {
+ CargoErrorKind::ProcessErrorKind(
+ process_error(&format!("could not execute process `{}`",
+ self.debug_string()),
+ None, None))
+ })?;
+ let output = Output {
+ stdout: stdout,
+ stderr: stderr,
+ status: status,
+ };
+
+ {
+ let to_print = if print_output {
+ Some(&output)
+ } else {
+ None
+ };
+ if !output.status.success() {
+ return Err(CargoErrorKind::ProcessErrorKind(process_error(
+ &format!("process didn't exit successfully: `{}`", self.debug_string()),
+ Some(&output.status), to_print)).into())
+ } else if let Some(e) = callback_error {
+ return Err(CargoError::with_chain(e,
+ CargoErrorKind::ProcessErrorKind(process_error(
+ &format!("failed to parse process output: `{}`", self.debug_string()),
+ Some(&output.status), to_print))))
+ }
+ }
+
+ Ok(output)
+ }
+
+ /// Converts ProcessBuilder into a `std::process::Command`, and handles the jobserver if
+ /// present.
+ pub fn build_command(&self) -> Command {
+ let mut command = Command::new(&self.program);
+ if let Some(cwd) = self.get_cwd() {
+ command.current_dir(cwd);
+ }
+ for arg in &self.args {
+ command.arg(arg);
+ }
+ for (k, v) in &self.env {
+ match *v {
+ Some(ref v) => { command.env(k, v); }
+ None => { command.env_remove(k); }
+ }
+ }
+ if let Some(ref c) = self.jobserver {
+ c.configure(&mut command);
+ }
+ command
+ }
+
+ /// Get the command line for the process as a string.
+ fn debug_string(&self) -> String {
+ let mut program = format!("{}", self.program.to_string_lossy());
+ for arg in &self.args {
+ program.push(' ');
+ program.push_str(&format!("{}", arg.to_string_lossy()));
+ }
+ program
+ }
+}
+
+/// A helper function to create a ProcessBuilder.
+pub fn process<T: AsRef<OsStr>>(cmd: T) -> ProcessBuilder {
+ ProcessBuilder {
+ program: cmd.as_ref().to_os_string(),
+ args: Vec::new(),
+ cwd: None,
+ env: HashMap::new(),
+ jobserver: None,
+ }
+}
--- /dev/null
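+//! A minimal hierarchical profiler, enabled by setting the `CARGO_PROFILE`
+//! environment variable to the nesting depth that should be reported.
+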
+use std::env;
+use std::fmt;
+use std::mem;
+use std::time;
+use std::iter::repeat;
+use std::cell::RefCell;
+
+thread_local!(static PROFILE_STACK: RefCell<Vec<time::Instant>> = RefCell::new(Vec::new()));
+thread_local!(static MESSAGES: RefCell<Vec<Message>> = RefCell::new(Vec::new()));
+
+type Message = (usize, u64, String);
+
+pub struct Profiler {
+ desc: String,
+}
+
+fn enabled_level() -> Option<usize> {
+ env::var("CARGO_PROFILE").ok().and_then(|s| s.parse().ok())
+}
+
+pub fn start<T: fmt::Display>(desc: T) -> Profiler {
+ if enabled_level().is_none() { return Profiler { desc: String::new() } }
+
+ PROFILE_STACK.with(|stack| stack.borrow_mut().push(time::Instant::now()));
+
+ Profiler {
+ desc: desc.to_string(),
+ }
+}
+
+impl Drop for Profiler {
+ fn drop(&mut self) {
+ let enabled = match enabled_level() {
+ Some(i) => i,
+ None => return,
+ };
+
+ let start = PROFILE_STACK.with(|stack| stack.borrow_mut().pop().unwrap());
+ let duration = start.elapsed();
+ let duration_ms = duration.as_secs() * 1000 + u64::from(duration.subsec_nanos() / 1_000_000);
+
+ let stack_len = PROFILE_STACK.with(|stack| stack.borrow().len());
+ if stack_len == 0 {
+ fn print(lvl: usize, msgs: &[Message], enabled: usize) {
+ if lvl > enabled { return }
+ let mut last = 0;
+ for (i, &(l, time, ref msg)) in msgs.iter().enumerate() {
+ if l != lvl { continue }
+ println!("{} {:6}ms - {}",
+ repeat(" ").take(lvl + 1).collect::<String>(),
+ time, msg);
+
+ print(lvl + 1, &msgs[last..i], enabled);
+ last = i;
+ }
+
+ }
+ MESSAGES.with(|msgs_rc| {
+ let mut msgs = msgs_rc.borrow_mut();
+ msgs.push((0, duration_ms,
+ mem::replace(&mut self.desc, String::new())));
+ print(0, &msgs, enabled);
+ });
+ } else {
+ MESSAGES.with(|msgs| {
+ let msg = mem::replace(&mut self.desc, String::new());
+ msgs.borrow_mut().push((stack_len, duration_ms, msg));
+ });
+ }
+ }
+}
--- /dev/null
+use std::cmp;
+use std::iter;
+use std::time::{Instant, Duration};
+
+use util::{Config, CargoResult};
+
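+/// A rate-limited console progress bar; it does nothing when the shell's
+/// width cannot be determined.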
+pub struct Progress<'cfg> {
+ state: Option<State<'cfg>>,
+}
+
+struct State<'cfg> {
+ config: &'cfg Config,
+ width: usize,
+ first: bool,
+ last_update: Instant,
+ name: String,
+ done: bool,
+}
+
+impl<'cfg> Progress<'cfg> {
+ pub fn new(name: &str, cfg: &'cfg Config) -> Progress<'cfg> {
+ Progress {
+ state: cfg.shell().err_width().map(|n| {
+ State {
+ config: cfg,
+ width: cmp::min(n, 80),
+ first: true,
+ last_update: Instant::now(),
+ name: name.to_string(),
+ done: false,
+ }
+ }),
+ }
+ }
+
+ pub fn tick(&mut self, cur: usize, max: usize) -> CargoResult<()> {
+ match self.state {
+ Some(ref mut s) => s.tick(cur, max),
+ None => Ok(())
+ }
+ }
+}
+
+impl<'cfg> State<'cfg> {
+ fn tick(&mut self, cur: usize, max: usize) -> CargoResult<()> {
+ if self.done {
+ return Ok(())
+ }
+
+ // Don't update too often as it can cause excessive performance loss
+ // just putting stuff onto the terminal. We also want to avoid
+ // flickering by not drawing anything that goes away too quickly. As a
+ // result we've got two branches here:
+ //
+ // 1. If we haven't drawn anything, we wait for a period of time to
+ // actually start drawing to the console. This ensures that
+ // short-lived operations don't flicker on the console. Currently
+ // there's a 500ms delay to when we first draw something.
+ // 2. If we've drawn something, then we rate limit ourselves to only
+ // draw to the console every so often. Currently there's a 100ms
+ // delay between updates.
+ if self.first {
+ let delay = Duration::from_millis(500);
+ if self.last_update.elapsed() < delay {
+ return Ok(())
+ }
+ self.first = false;
+ } else {
+ let interval = Duration::from_millis(100);
+ if self.last_update.elapsed() < interval {
+ return Ok(())
+ }
+ }
+ self.last_update = Instant::now();
+
+ // Render the percentage at the far right and then figure how long the
+ // progress bar is
+ let pct = (cur as f64) / (max as f64);
+ let pct = if !pct.is_finite() { 0.0 } else { pct };
+ let stats = format!(" {:6.02}%", pct * 100.0);
+ let extra_len = stats.len() + 2 /* [ and ] */ + 15 /* status header */;
+ let display_width = match self.width.checked_sub(extra_len) {
+ Some(n) => n,
+ None => return Ok(()),
+ };
+ let mut string = String::from("[");
+ let hashes = display_width as f64 * pct;
+ let hashes = hashes as usize;
+
+ // Draw the `===>`
+ if hashes > 0 {
+ for _ in 0..hashes-1 {
+ string.push_str("=");
+ }
+ if cur == max {
+ self.done = true;
+ string.push_str("=");
+ } else {
+ string.push_str(">");
+ }
+ }
+
+ // Draw the empty space we have left to do
+ for _ in 0..(display_width - hashes) {
+ string.push_str(" ");
+ }
+ string.push_str("]");
+ string.push_str(&stats);
+
+ // Write out a pretty header, then the progress bar itself, and then
+ // return back to the beginning of the line for the next print.
+ self.config.shell().status_header(&self.name)?;
+ write!(self.config.shell().err(), "{}\r", string)?;
+ Ok(())
+ }
+}
+
+fn clear(width: usize, config: &Config) {
+ let blank = iter::repeat(" ").take(width).collect::<String>();
+ drop(write!(config.shell().err(), "{}\r", blank));
+}
+
+impl<'cfg> Drop for State<'cfg> {
+ fn drop(&mut self) {
+ clear(self.width, self.config);
+ }
+}
--- /dev/null
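+//! Reads the stdout and stderr of a child process concurrently without
+//! deadlocking, invoking a callback as data arrives on either stream
+//! (`poll` on Unix, IOCP named pipes on Windows).
+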
+pub use self::imp::read2;
+
+#[cfg(unix)]
+mod imp {
+ use std::io::prelude::*;
+ use std::io;
+ use std::mem;
+ use std::os::unix::prelude::*;
+ use std::process::{ChildStdout, ChildStderr};
+ use libc;
+
+ pub fn read2(mut out_pipe: ChildStdout,
+ mut err_pipe: ChildStderr,
+ data: &mut FnMut(bool, &mut Vec<u8>, bool)) -> io::Result<()> {
+ unsafe {
+ libc::fcntl(out_pipe.as_raw_fd(), libc::F_SETFL, libc::O_NONBLOCK);
+ libc::fcntl(err_pipe.as_raw_fd(), libc::F_SETFL, libc::O_NONBLOCK);
+ }
+
+ let mut out_done = false;
+ let mut err_done = false;
+ let mut out = Vec::new();
+ let mut err = Vec::new();
+
+ let mut fds: [libc::pollfd; 2] = unsafe { mem::zeroed() };
+ fds[0].fd = out_pipe.as_raw_fd();
+ fds[0].events = libc::POLLIN;
+ fds[1].fd = err_pipe.as_raw_fd();
+ fds[1].events = libc::POLLIN;
+ loop {
+ // wait for either pipe to become readable using `poll`
+ let r = unsafe { libc::poll(fds.as_mut_ptr(), 2, -1) };
+ if r == -1 {
+ let err = io::Error::last_os_error();
+ if err.kind() == io::ErrorKind::Interrupted {
+ continue
+ }
+ return Err(err)
+ }
+
+ // Read as much as we can from each pipe, ignoring EWOULDBLOCK or
+ // EAGAIN. If we hit EOF, then this will happen because the underlying
+ // reader will return Ok(0), in which case we'll see `Ok` ourselves. In
+ // this case we flip the other fd back into blocking mode and read
+ // whatever's leftover on that file descriptor.
+ let handle = |res: io::Result<_>| {
+ match res {
+ Ok(_) => Ok(true),
+ Err(e) => {
+ if e.kind() == io::ErrorKind::WouldBlock {
+ Ok(false)
+ } else {
+ Err(e)
+ }
+ }
+ }
+ };
+ if !out_done && fds[0].revents != 0 && handle(out_pipe.read_to_end(&mut out))? {
+ out_done = true;
+ }
+ data(true, &mut out, out_done);
+ if !err_done && fds[1].revents != 0 && handle(err_pipe.read_to_end(&mut err))? {
+ err_done = true;
+ }
+ data(false, &mut err, err_done);
+
+ if out_done && err_done {
+ return Ok(())
+ }
+ }
+ }
+}
+
+#[cfg(windows)]
+mod imp {
+ extern crate miow;
+ extern crate winapi;
+
+ use std::io;
+ use std::os::windows::prelude::*;
+ use std::process::{ChildStdout, ChildStderr};
+ use std::slice;
+
+ use self::miow::iocp::{CompletionPort, CompletionStatus};
+ use self::miow::pipe::NamedPipe;
+ use self::miow::Overlapped;
+ use self::winapi::ERROR_BROKEN_PIPE;
+
+ struct Pipe<'a> {
+ dst: &'a mut Vec<u8>,
+ overlapped: Overlapped,
+ pipe: NamedPipe,
+ done: bool,
+ }
+
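+ /// The same contract as the Unix version, implemented with an I/O
+ /// completion port: both pipes are registered with the port, reads are
+ /// issued as overlapped operations, and `data` is invoked as each
+ /// completion arrives until both streams report EOF.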
+ pub fn read2(out_pipe: ChildStdout,
+ err_pipe: ChildStderr,
+ data: &mut FnMut(bool, &mut Vec<u8>, bool)) -> io::Result<()> {
+ let mut out = Vec::new();
+ let mut err = Vec::new();
+
+ let port = CompletionPort::new(1)?;
+ port.add_handle(0, &out_pipe)?;
+ port.add_handle(1, &err_pipe)?;
+
+ unsafe {
+ let mut out_pipe = Pipe::new(out_pipe, &mut out);
+ let mut err_pipe = Pipe::new(err_pipe, &mut err);
+
+ out_pipe.read()?;
+ err_pipe.read()?;
+
+ let mut status = [CompletionStatus::zero(), CompletionStatus::zero()];
+
+ while !out_pipe.done || !err_pipe.done {
+ for status in port.get_many(&mut status, None)? {
+ if status.token() == 0 {
+ out_pipe.complete(status);
+ data(true, out_pipe.dst, out_pipe.done);
+ out_pipe.read()?;
+ } else {
+ err_pipe.complete(status);
+ data(false, err_pipe.dst, err_pipe.done);
+ err_pipe.read()?;
+ }
+ }
+ }
+
+ Ok(())
+ }
+ }
+
+ impl<'a> Pipe<'a> {
+ unsafe fn new<P: IntoRawHandle>(p: P, dst: &'a mut Vec<u8>) -> Pipe<'a> {
+ Pipe {
+ dst: dst,
+ pipe: NamedPipe::from_raw_handle(p.into_raw_handle()),
+ overlapped: Overlapped::zero(),
+ done: false,
+ }
+ }
+
+ unsafe fn read(&mut self) -> io::Result<()> {
+ let dst = slice_to_end(self.dst);
+ match self.pipe.read_overlapped(dst, self.overlapped.raw()) {
+ Ok(_) => Ok(()),
+ Err(e) => {
+ if e.raw_os_error() == Some(ERROR_BROKEN_PIPE as i32) {
+ self.done = true;
+ Ok(())
+ } else {
+ Err(e)
+ }
+ }
+ }
+ }
+
+ unsafe fn complete(&mut self, status: &CompletionStatus) {
+ let prev = self.dst.len();
+ self.dst.set_len(prev + status.bytes_transferred() as usize);
+ if status.bytes_transferred() == 0 {
+ self.done = true;
+ }
+ }
+ }
+
+ unsafe fn slice_to_end(v: &mut Vec<u8>) -> &mut [u8] {
+ if v.capacity() == 0 {
+ v.reserve(16);
+ }
+ if v.capacity() == v.len() {
+ v.reserve(1);
+ }
+ slice::from_raw_parts_mut(v.as_mut_ptr().offset(v.len() as isize),
+ v.capacity() - v.len())
+ }
+}
--- /dev/null
+use std::path::PathBuf;
+
+use util::{self, CargoResult, internal, ProcessBuilder};
+
+/// Information on the `rustc` executable
+#[derive(Debug)]
+pub struct Rustc {
+ /// The location of the exe
+ pub path: PathBuf,
+ /// An optional wrapper program that is invoked with the path to the
+ /// `rustc` executable as its first argument, followed by the arguments
+ /// intended for `rustc` itself.
+ pub wrapper: Option<PathBuf>,
+ /// Verbose version information (the output of `rustc -vV`)
+ pub verbose_version: String,
+ /// The host triple (arch-platform-OS); parsed from `verbose_version`.
+ pub host: String,
+}
+
+impl Rustc {
+ /// Run the compiler at `path` to learn various pieces of information about
+ /// it, with an optional wrapper.
+ ///
+ /// If successful this function returns a description of the compiler along
+ /// with a list of its capabilities.
+ pub fn new(path: PathBuf, wrapper: Option<PathBuf>) -> CargoResult<Rustc> {
+ let mut cmd = util::process(&path);
+ cmd.arg("-vV");
+
+ let output = cmd.exec_with_output()?;
+
+ let verbose_version = String::from_utf8(output.stdout).map_err(|_| {
+ internal("rustc -v didn't return utf8 output")
+ })?;
+
+ let host = {
+ let triple = verbose_version.lines().find(|l| {
+ l.starts_with("host: ")
+ }).map(|l| &l[6..]).ok_or_else(|| internal("rustc -v didn't have a line for `host:`"))?;
+ triple.to_string()
+ };
+
+ Ok(Rustc {
+ path: path,
+ wrapper: wrapper,
+ verbose_version: verbose_version,
+ host: host,
+ })
+ }
+
+ /// Get a process builder set up to use the found rustc version, with a wrapper if Some
+ pub fn process(&self) -> ProcessBuilder {
+ if let Some(ref wrapper) = self.wrapper {
+ let mut cmd = util::process(wrapper);
+ {
+ cmd.arg(&self.path);
+ }
+ cmd
+ } else {
+ util::process(&self.path)
+ }
+ }
+}
--- /dev/null
+extern crate crypto_hash;
+use self::crypto_hash::{Hasher,Algorithm};
+use std::io::Write;
+
+pub struct Sha256(Hasher);
+
+impl Sha256 {
+ pub fn new() -> Sha256 {
+ let hasher = Hasher::new(Algorithm::SHA256);
+ Sha256(hasher)
+ }
+
+ pub fn update(&mut self, bytes: &[u8]) {
+ let _ = self.0.write_all(bytes);
+ }
+
+ pub fn finish(&mut self) -> [u8; 32] {
+ let mut ret = [0u8; 32];
+ let data = self.0.finish();
+ ret.copy_from_slice(&data[..]);
+ ret
+ }
+}
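+
+// A minimal usage sketch: feed bytes incrementally, then take the digest.
+//
+//     let mut hasher = Sha256::new();
+//     hasher.update(b"hello ");
+//     hasher.update(b"world");
+//     let digest: [u8; 32] = hasher.finish();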
--- /dev/null
+use semver::Version;
+
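+/// Conversion into a semver `Version`. For example, `"1.2.3".to_semver()`
+/// parses the string, while an existing `Version` is passed through as-is.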
+pub trait ToSemver {
+ fn to_semver(self) -> Result<Version, String>;
+}
+
+impl ToSemver for Version {
+ fn to_semver(self) -> Result<Version, String> { Ok(self) }
+}
+
+impl<'a> ToSemver for &'a str {
+ fn to_semver(self) -> Result<Version, String> {
+ match Version::parse(self) {
+ Ok(v) => Ok(v),
+ Err(..) => Err(format!("cannot parse '{}' as a semver", self)),
+ }
+ }
+}
+
+impl<'a> ToSemver for &'a String {
+ fn to_semver(self) -> Result<Version, String> {
+ (**self).to_semver()
+ }
+}
+
+impl<'a> ToSemver for &'a Version {
+ fn to_semver(self) -> Result<Version, String> {
+ Ok(self.clone())
+ }
+}
--- /dev/null
+use std::path::Path;
+
+use url::Url;
+
+use util::CargoResult;
+
+/// A type that can be converted to a Url
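+///
+/// Strings are parsed with `Url::parse`, while filesystem paths go through
+/// `Url::from_file_path` (which fails for relative paths).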
+pub trait ToUrl {
+ /// Performs the conversion
+ fn to_url(self) -> CargoResult<Url>;
+}
+
+impl<'a> ToUrl for &'a str {
+ fn to_url(self) -> CargoResult<Url> {
+ Url::parse(self).map_err(|s| {
+ format!("invalid url `{}`: {}", self, s).into()
+ })
+ }
+}
+
+impl<'a> ToUrl for &'a Path {
+ fn to_url(self) -> CargoResult<Url> {
+ Url::from_file_path(self).map_err(|()| {
+ format!("invalid path url `{}`", self.display()).into()
+ })
+ }
+}
--- /dev/null
+use std::collections::{HashMap, BTreeMap, HashSet, BTreeSet};
+use std::fmt;
+use std::fs;
+use std::path::{Path, PathBuf};
+use std::rc::Rc;
+use std::str;
+
+use semver::{self, VersionReq};
+use serde::ser;
+use serde::de::{self, Deserialize};
+use serde_ignored;
+use toml;
+use url::Url;
+
+use core::{SourceId, Profiles, PackageIdSpec, GitReference, WorkspaceConfig, WorkspaceRootConfig};
+use core::{Summary, Manifest, Target, Dependency, PackageId};
+use core::{EitherManifest, VirtualManifest, Features, Feature};
+use core::dependency::{Kind, Platform};
+use core::manifest::{LibKind, Profile, ManifestMetadata};
+use sources::CRATES_IO;
+use util::paths;
+use util::{self, ToUrl, Config};
+use util::errors::{CargoError, CargoResult, CargoResultExt};
+
+mod targets;
+use self::targets::targets;
+
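+/// Parse the `Cargo.toml` at `path` into either a real or a virtual
+/// manifest, along with the nested paths (e.g. `path` dependencies)
+/// discovered while reading it.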
+pub fn read_manifest(path: &Path, source_id: &SourceId, config: &Config)
+ -> CargoResult<(EitherManifest, Vec<PathBuf>)> {
+ trace!("read_manifest; path={}; source-id={}", path.display(), source_id);
+ let contents = paths::read(path)?;
+
+ do_read_manifest(&contents, path, source_id, config).chain_err(|| {
+ format!("failed to parse manifest at `{}`", path.display())
+ })
+}
+
+fn do_read_manifest(contents: &str,
+ manifest_file: &Path,
+ source_id: &SourceId,
+ config: &Config)
+ -> CargoResult<(EitherManifest, Vec<PathBuf>)> {
+ let package_root = manifest_file.parent().unwrap();
+
+ let toml = {
+ let pretty_filename =
+ util::without_prefix(manifest_file, config.cwd()).unwrap_or(manifest_file);
+ parse(contents, pretty_filename, config)?
+ };
+
+ let mut unused = BTreeSet::new();
+ let manifest: TomlManifest = serde_ignored::deserialize(toml, |path| {
+ let mut key = String::new();
+ stringify(&mut key, &path);
+ unused.insert(key);
+ })?;
+
+ let manifest = Rc::new(manifest);
+ return match TomlManifest::to_real_manifest(&manifest,
+ source_id,
+ package_root,
+ config) {
+ Ok((mut manifest, paths)) => {
+ for key in unused {
+ manifest.add_warning(format!("unused manifest key: {}", key));
+ }
+ if !manifest.targets().iter().any(|t| !t.is_custom_build()) {
+ bail!("no targets specified in the manifest\n \
+ either src/lib.rs, src/main.rs, a [lib] section, or \
+ [[bin]] section must be present")
+ }
+ Ok((EitherManifest::Real(manifest), paths))
+ }
+ Err(e) => {
+ match TomlManifest::to_virtual_manifest(&manifest,
+ source_id,
+ package_root,
+ config) {
+ Ok((m, paths)) => Ok((EitherManifest::Virtual(m), paths)),
+ Err(..) => Err(e),
+ }
+ }
+ };
+
+ fn stringify(dst: &mut String, path: &serde_ignored::Path) {
+ use serde_ignored::Path;
+
+ match *path {
+ Path::Root => {}
+ Path::Seq { parent, index } => {
+ stringify(dst, parent);
+ if !dst.is_empty() {
+ dst.push('.');
+ }
+ dst.push_str(&index.to_string());
+ }
+ Path::Map { parent, ref key } => {
+ stringify(dst, parent);
+ if !dst.is_empty() {
+ dst.push('.');
+ }
+ dst.push_str(key);
+ }
+ Path::Some { parent } |
+ Path::NewtypeVariant { parent } |
+ Path::NewtypeStruct { parent } => stringify(dst, parent),
+ }
+ }
+}
+
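+/// Parse a string as TOML, preferring the strict parser. If strict parsing
+/// fails, retry with a parser that tolerates a missing newline after a
+/// table header; when that succeeds, downgrade the error to a deprecation
+/// warning.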
+pub fn parse(toml: &str,
+ file: &Path,
+ config: &Config) -> CargoResult<toml::Value> {
+ let first_error = match toml.parse() {
+ Ok(ret) => return Ok(ret),
+ Err(e) => e,
+ };
+
+ let mut second_parser = toml::de::Deserializer::new(toml);
+ second_parser.set_require_newline_after_table(false);
+ if let Ok(ret) = toml::Value::deserialize(&mut second_parser) {
+ let msg = format!("\
+TOML file found which contains invalid syntax and will soon not parse
+at `{}`.
+
+The TOML spec requires newlines after table definitions (e.g. `[a] b = 1` is
+invalid), but this file has a table header which does not have a newline after
+it. A newline needs to be added, and this warning will become a hard error
+in the future.", file.display());
+ config.shell().warn(&msg)?;
+ return Ok(ret)
+ }
+
+ Err(first_error).chain_err(|| {
+ "could not parse input as TOML"
+ })
+}
+
+type TomlLibTarget = TomlTarget;
+type TomlBinTarget = TomlTarget;
+type TomlExampleTarget = TomlTarget;
+type TomlTestTarget = TomlTarget;
+type TomlBenchTarget = TomlTarget;
+
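+/// A dependency as written in `Cargo.toml`: either the shorthand
+/// `foo = "0.9.8"` or a detailed table such as
+/// `foo = { version = "0.9.8", features = ["bar"] }`.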
+#[derive(Debug, Serialize)]
+#[serde(untagged)]
+pub enum TomlDependency {
+ Simple(String),
+ Detailed(DetailedTomlDependency)
+}
+
+impl<'de> de::Deserialize<'de> for TomlDependency {
+ fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+ where D: de::Deserializer<'de>
+ {
+ struct TomlDependencyVisitor;
+
+ impl<'de> de::Visitor<'de> for TomlDependencyVisitor {
+ type Value = TomlDependency;
+
+ fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+ formatter.write_str("a version string like \"0.9.8\" or a \
+ detailed dependency like { version = \"0.9.8\" }")
+ }
+
+ fn visit_str<E>(self, s: &str) -> Result<Self::Value, E>
+ where E: de::Error
+ {
+ Ok(TomlDependency::Simple(s.to_owned()))
+ }
+
+ fn visit_map<V>(self, map: V) -> Result<Self::Value, V::Error>
+ where V: de::MapAccess<'de>
+ {
+ let mvd = de::value::MapAccessDeserializer::new(map);
+ DetailedTomlDependency::deserialize(mvd).map(TomlDependency::Detailed)
+ }
+ }
+
+ deserializer.deserialize_any(TomlDependencyVisitor)
+ }
+}
+
+#[derive(Deserialize, Serialize, Clone, Debug, Default)]
+pub struct DetailedTomlDependency {
+ version: Option<String>,
+ registry: Option<String>,
+ path: Option<String>,
+ git: Option<String>,
+ branch: Option<String>,
+ tag: Option<String>,
+ rev: Option<String>,
+ features: Option<Vec<String>>,
+ optional: Option<bool>,
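+ // `default-features` is the canonical spelling, but historical versions
+ // of Cargo also accepted `default_features`, so look for both.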
+ #[serde(rename = "default-features")]
+ default_features: Option<bool>,
+ #[serde(rename = "default_features")]
+ default_features2: Option<bool>,
+}
+
+#[derive(Debug, Deserialize, Serialize)]
+pub struct TomlManifest {
+ #[serde(rename = "cargo-features")]
+ cargo_features: Option<Vec<String>>,
+ package: Option<Box<TomlProject>>,
+ project: Option<Box<TomlProject>>,
+ profile: Option<TomlProfiles>,
+ lib: Option<TomlLibTarget>,
+ bin: Option<Vec<TomlBinTarget>>,
+ example: Option<Vec<TomlExampleTarget>>,
+ test: Option<Vec<TomlTestTarget>>,
+ bench: Option<Vec<TomlTestTarget>>,
+ dependencies: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "dev-dependencies")]
+ dev_dependencies: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "dev_dependencies")]
+ dev_dependencies2: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "build-dependencies")]
+ build_dependencies: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "build_dependencies")]
+ build_dependencies2: Option<BTreeMap<String, TomlDependency>>,
+ features: Option<BTreeMap<String, Vec<String>>>,
+ target: Option<BTreeMap<String, TomlPlatform>>,
+ replace: Option<BTreeMap<String, TomlDependency>>,
+ patch: Option<BTreeMap<String, BTreeMap<String, TomlDependency>>>,
+ workspace: Option<TomlWorkspace>,
+ badges: Option<BTreeMap<String, BTreeMap<String, String>>>,
+}
+
+#[derive(Deserialize, Serialize, Clone, Debug, Default)]
+pub struct TomlProfiles {
+ test: Option<TomlProfile>,
+ doc: Option<TomlProfile>,
+ bench: Option<TomlProfile>,
+ dev: Option<TomlProfile>,
+ release: Option<TomlProfile>,
+}
+
+#[derive(Clone, Debug)]
+pub struct TomlOptLevel(String);
+
+impl<'de> de::Deserialize<'de> for TomlOptLevel {
+ fn deserialize<D>(d: D) -> Result<TomlOptLevel, D::Error>
+ where D: de::Deserializer<'de>
+ {
+ struct Visitor;
+
+ impl<'de> de::Visitor<'de> for Visitor {
+ type Value = TomlOptLevel;
+
+ fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+ formatter.write_str("an optimization level")
+ }
+
+ fn visit_i64<E>(self, value: i64) -> Result<TomlOptLevel, E>
+ where E: de::Error
+ {
+ Ok(TomlOptLevel(value.to_string()))
+ }
+
+ fn visit_str<E>(self, value: &str) -> Result<TomlOptLevel, E>
+ where E: de::Error
+ {
+ if value == "s" || value == "z" {
+ Ok(TomlOptLevel(value.to_string()))
+ } else {
+ Err(E::custom(format!("must be an integer, `z`, or `s`, \
+ but found: {}", value)))
+ }
+ }
+ }
+
+ d.deserialize_u32(Visitor)
+ }
+}
+
+impl ser::Serialize for TomlOptLevel {
+ fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ match self.0.parse::<u32>() {
+ Ok(n) => n.serialize(serializer),
+ Err(_) => self.0.serialize(serializer),
+ }
+ }
+}
+
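+/// A manifest value that may be written either as an integer or as a
+/// boolean, e.g. `debug = 2` or `debug = true`.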
+#[derive(Clone, Debug, Serialize)]
+#[serde(untagged)]
+pub enum U32OrBool {
+ U32(u32),
+ Bool(bool),
+}
+
+impl<'de> de::Deserialize<'de> for U32OrBool {
+ fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+ where D: de::Deserializer<'de>
+ {
+ struct Visitor;
+
+ impl<'de> de::Visitor<'de> for Visitor {
+ type Value = U32OrBool;
+
+ fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+ formatter.write_str("a boolean or an integer")
+ }
+
+ fn visit_i64<E>(self, u: i64) -> Result<Self::Value, E>
+ where E: de::Error,
+ {
+ Ok(U32OrBool::U32(u as u32))
+ }
+
+ fn visit_u64<E>(self, u: u64) -> Result<Self::Value, E>
+ where E: de::Error,
+ {
+ Ok(U32OrBool::U32(u as u32))
+ }
+
+ fn visit_bool<E>(self, b: bool) -> Result<Self::Value, E>
+ where E: de::Error,
+ {
+ Ok(U32OrBool::Bool(b))
+ }
+ }
+
+ deserializer.deserialize_any(Visitor)
+ }
+}
+
+#[derive(Deserialize, Serialize, Clone, Debug, Default)]
+pub struct TomlProfile {
+ #[serde(rename = "opt-level")]
+ opt_level: Option<TomlOptLevel>,
+ lto: Option<bool>,
+ #[serde(rename = "codegen-units")]
+ codegen_units: Option<u32>,
+ debug: Option<U32OrBool>,
+ #[serde(rename = "debug-assertions")]
+ debug_assertions: Option<bool>,
+ rpath: Option<bool>,
+ panic: Option<String>,
+ #[serde(rename = "overflow-checks")]
+ overflow_checks: Option<bool>,
+}
+
+#[derive(Clone, Debug, Serialize)]
+#[serde(untagged)]
+pub enum StringOrBool {
+ String(String),
+ Bool(bool),
+}
+
+impl<'de> de::Deserialize<'de> for StringOrBool {
+ fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+ where D: de::Deserializer<'de>
+ {
+ struct Visitor;
+
+ impl<'de> de::Visitor<'de> for Visitor {
+ type Value = StringOrBool;
+
+ fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+ formatter.write_str("a boolean or a string")
+ }
+
+ fn visit_str<E>(self, s: &str) -> Result<Self::Value, E>
+ where E: de::Error,
+ {
+ Ok(StringOrBool::String(s.to_string()))
+ }
+
+ fn visit_bool<E>(self, b: bool) -> Result<Self::Value, E>
+ where E: de::Error,
+ {
+ Ok(StringOrBool::Bool(b))
+ }
+ }
+
+ deserializer.deserialize_any(Visitor)
+ }
+}
+
+#[derive(Clone, Debug, Serialize)]
+#[serde(untagged)]
+pub enum VecStringOrBool {
+ VecString(Vec<String>),
+ Bool(bool),
+}
+
+impl<'de> de::Deserialize<'de> for VecStringOrBool {
+ fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+ where D: de::Deserializer<'de>
+ {
+ struct Visitor;
+
+ impl<'de> de::Visitor<'de> for Visitor {
+ type Value = VecStringOrBool;
+
+ fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+ formatter.write_str("a boolean or vector of strings")
+ }
+
+ fn visit_seq<V>(self, v: V) -> Result<Self::Value, V::Error>
+ where V: de::SeqAccess<'de>
+ {
+ let seq = de::value::SeqAccessDeserializer::new(v);
+ Vec::deserialize(seq).map(VecStringOrBool::VecString)
+ }
+
+ fn visit_bool<E>(self, b: bool) -> Result<Self::Value, E>
+ where E: de::Error,
+ {
+ Ok(VecStringOrBool::Bool(b))
+ }
+ }
+
+ deserializer.deserialize_any(Visitor)
+ }
+}
+
+#[derive(Deserialize, Serialize, Clone, Debug)]
+pub struct TomlProject {
+ name: String,
+ version: semver::Version,
+ authors: Option<Vec<String>>,
+ build: Option<StringOrBool>,
+ links: Option<String>,
+ exclude: Option<Vec<String>>,
+ include: Option<Vec<String>>,
+ publish: Option<VecStringOrBool>,
+ workspace: Option<String>,
+ #[serde(rename = "im-a-teapot")]
+ im_a_teapot: Option<bool>,
+
+ // package metadata
+ description: Option<String>,
+ homepage: Option<String>,
+ documentation: Option<String>,
+ readme: Option<String>,
+ keywords: Option<Vec<String>>,
+ categories: Option<Vec<String>>,
+ license: Option<String>,
+ #[serde(rename = "license-file")]
+ license_file: Option<String>,
+ repository: Option<String>,
+ metadata: Option<toml::Value>,
+}
+
+#[derive(Debug, Deserialize, Serialize)]
+pub struct TomlWorkspace {
+ members: Option<Vec<String>>,
+ exclude: Option<Vec<String>>,
+}
+
+impl TomlProject {
+ pub fn to_package_id(&self, source_id: &SourceId) -> CargoResult<PackageId> {
+ PackageId::new(&self.name, self.version.clone(), source_id)
+ }
+}
+
+struct Context<'a, 'b> {
+ pkgid: Option<&'a PackageId>,
+ deps: &'a mut Vec<Dependency>,
+ source_id: &'a SourceId,
+ nested_paths: &'a mut Vec<PathBuf>,
+ config: &'b Config,
+ warnings: &'a mut Vec<String>,
+ platform: Option<Platform>,
+ root: &'a Path,
+ features: &'a Features,
+}
+
+impl TomlManifest {
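+ /// Rewrite this manifest into the form uploaded to a registry: the
+ /// workspace key, `[replace]`, and `[patch]` sections are dropped, and
+ /// path dependencies are converted into plain registry dependencies.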
+ pub fn prepare_for_publish(&self) -> TomlManifest {
+ let mut package = self.package.as_ref()
+ .or_else(|| self.project.as_ref())
+ .unwrap()
+ .clone();
+ package.workspace = None;
+ return TomlManifest {
+ package: Some(package),
+ project: None,
+ profile: self.profile.clone(),
+ lib: self.lib.clone(),
+ bin: self.bin.clone(),
+ example: self.example.clone(),
+ test: self.test.clone(),
+ bench: self.bench.clone(),
+ dependencies: map_deps(self.dependencies.as_ref()),
+ dev_dependencies: map_deps(self.dev_dependencies.as_ref()
+ .or_else(|| self.dev_dependencies2.as_ref())),
+ dev_dependencies2: None,
+ build_dependencies: map_deps(self.build_dependencies.as_ref()
+ .or_else(|| self.build_dependencies2.as_ref())),
+ build_dependencies2: None,
+ features: self.features.clone(),
+ target: self.target.as_ref().map(|target_map| {
+ target_map.iter().map(|(k, v)| {
+ (k.clone(), TomlPlatform {
+ dependencies: map_deps(v.dependencies.as_ref()),
+ dev_dependencies: map_deps(v.dev_dependencies.as_ref()
+ .or_else(|| v.dev_dependencies2.as_ref())),
+ dev_dependencies2: None,
+ build_dependencies: map_deps(v.build_dependencies.as_ref()
+ .or_else(|| v.build_dependencies2.as_ref())),
+ build_dependencies2: None,
+ })
+ }).collect()
+ }),
+ replace: None,
+ patch: None,
+ workspace: None,
+ badges: self.badges.clone(),
+ cargo_features: self.cargo_features.clone(),
+ };
+
+ fn map_deps(deps: Option<&BTreeMap<String, TomlDependency>>)
+ -> Option<BTreeMap<String, TomlDependency>>
+ {
+ let deps = match deps {
+ Some(deps) => deps,
+ None => return None
+ };
+ Some(deps.iter().map(|(k, v)| (k.clone(), map_dependency(v))).collect())
+ }
+
+ fn map_dependency(dep: &TomlDependency) -> TomlDependency {
+ match *dep {
+ TomlDependency::Detailed(ref d) => {
+ let mut d = d.clone();
+ d.path.take(); // path dependencies become crates.io deps
+ TomlDependency::Detailed(d)
+ }
+ TomlDependency::Simple(ref s) => {
+ TomlDependency::Detailed(DetailedTomlDependency {
+ version: Some(s.clone()),
+ ..Default::default()
+ })
+ }
+ }
+ }
+ }
+
+ fn to_real_manifest(me: &Rc<TomlManifest>,
+ source_id: &SourceId,
+ package_root: &Path,
+ config: &Config)
+ -> CargoResult<(Manifest, Vec<PathBuf>)> {
+ let mut nested_paths = vec![];
+ let mut warnings = vec![];
+ let mut errors = vec![];
+
+ // Parse features first so they will be available when parsing other parts of the toml
+ let empty = Vec::new();
+ let cargo_features = me.cargo_features.as_ref().unwrap_or(&empty);
+ let features = Features::new(&cargo_features, &mut warnings)?;
+
+ let project = me.project.as_ref().or_else(|| me.package.as_ref());
+ let project = project.ok_or_else(|| {
+ CargoError::from("no `package` section found.")
+ })?;
+
+ let package_name = project.name.trim();
+ if package_name.is_empty() {
+ bail!("package name cannot be an empty string.")
+ }
+
+ let pkgid = project.to_package_id(source_id)?;
+
+ // If we have no lib at all, use the inferred lib if available.
+ // If we have a lib with a path, we're done.
+ // If we have a lib with no path, use the inferred lib path, falling
+ // back to the package name.
+ let targets = targets(me, package_name, package_root, &project.build,
+ &mut warnings, &mut errors)?;
+
+ if targets.is_empty() {
+ debug!("manifest has no build targets");
+ }
+
+ if let Err(e) = unique_build_targets(&targets, package_root) {
+ warnings.push(format!("file found to be present in multiple \
+ build targets: {}", e));
+ }
+
+ let mut deps = Vec::new();
+ let replace;
+ let patch;
+
+ {
+
+ let mut cx = Context {
+ pkgid: Some(&pkgid),
+ deps: &mut deps,
+ source_id: source_id,
+ nested_paths: &mut nested_paths,
+ config: config,
+ warnings: &mut warnings,
+ features: &features,
+ platform: None,
+ root: package_root,
+ };
+
+ fn process_dependencies(
+ cx: &mut Context,
+ new_deps: Option<&BTreeMap<String, TomlDependency>>,
+ kind: Option<Kind>)
+ -> CargoResult<()>
+ {
+ let dependencies = match new_deps {
+ Some(dependencies) => dependencies,
+ None => return Ok(())
+ };
+ for (n, v) in dependencies.iter() {
+ let dep = v.to_dependency(n, cx, kind)?;
+ cx.deps.push(dep);
+ }
+
+ Ok(())
+ }
+
+ // Collect the deps
+ process_dependencies(&mut cx, me.dependencies.as_ref(),
+ None)?;
+ let dev_deps = me.dev_dependencies.as_ref()
+ .or_else(|| me.dev_dependencies2.as_ref());
+ process_dependencies(&mut cx, dev_deps, Some(Kind::Development))?;
+ let build_deps = me.build_dependencies.as_ref()
+ .or_else(|| me.build_dependencies2.as_ref());
+ process_dependencies(&mut cx, build_deps, Some(Kind::Build))?;
+
+ for (name, platform) in me.target.iter().flat_map(|t| t) {
+ cx.platform = Some(name.parse()?);
+ process_dependencies(&mut cx, platform.dependencies.as_ref(),
+ None)?;
+ let build_deps = platform.build_dependencies.as_ref()
+ .or_else(|| platform.build_dependencies2.as_ref());
+ process_dependencies(&mut cx, build_deps, Some(Kind::Build))?;
+ let dev_deps = platform.dev_dependencies.as_ref()
+ .or_else(|| platform.dev_dependencies2.as_ref());
+ process_dependencies(&mut cx, dev_deps, Some(Kind::Development))?;
+ }
+
+ replace = me.replace(&mut cx)?;
+ patch = me.patch(&mut cx)?;
+ }
+
+ {
+ let mut names_sources = BTreeMap::new();
+ for dep in &deps {
+ let name = dep.name();
+ let prev = names_sources.insert(name, dep.source_id());
+ if prev.is_some() && prev != Some(dep.source_id()) {
+ bail!("Dependency '{}' has different source paths depending on the build \
+ target. Each dependency must have a single canonical source path \
+ irrespective of build target.", name);
+ }
+ }
+ }
+
+ let exclude = project.exclude.clone().unwrap_or_default();
+ let include = project.include.clone().unwrap_or_default();
+
+ let summary = Summary::new(pkgid, deps, me.features.clone()
+ .unwrap_or_else(BTreeMap::new))?;
+ let metadata = ManifestMetadata {
+ description: project.description.clone(),
+ homepage: project.homepage.clone(),
+ documentation: project.documentation.clone(),
+ readme: project.readme.clone(),
+ authors: project.authors.clone().unwrap_or_default(),
+ license: project.license.clone(),
+ license_file: project.license_file.clone(),
+ repository: project.repository.clone(),
+ keywords: project.keywords.clone().unwrap_or_default(),
+ categories: project.categories.clone().unwrap_or_default(),
+ badges: me.badges.clone().unwrap_or_default(),
+ };
+
+ let workspace_config = match (me.workspace.as_ref(),
+ project.workspace.as_ref()) {
+ (Some(config), None) => {
+ WorkspaceConfig::Root(
+ WorkspaceRootConfig::new(&package_root, &config.members, &config.exclude)
+ )
+ }
+ (None, root) => {
+ WorkspaceConfig::Member { root: root.cloned() }
+ }
+ (Some(..), Some(..)) => {
+ bail!("cannot configure both `package.workspace` and \
+ `[workspace]`, only one can be specified")
+ }
+ };
+ let profiles = build_profiles(&me.profile);
+ let publish = match project.publish {
+ Some(VecStringOrBool::VecString(ref vecstring)) => {
+ features.require(Feature::alternative_registries()).chain_err(|| {
+ "the `publish` manifest key is unstable for anything other than a value of true or false"
+ })?;
+ Some(vecstring.clone())
+ },
+ Some(VecStringOrBool::Bool(false)) => Some(vec![]),
+ None | Some(VecStringOrBool::Bool(true)) => None,
+ };
+ let mut manifest = Manifest::new(summary,
+ targets,
+ exclude,
+ include,
+ project.links.clone(),
+ metadata,
+ profiles,
+ publish,
+ replace,
+ patch,
+ workspace_config,
+ features,
+ project.im_a_teapot,
+ Rc::clone(me));
+ if project.license_file.is_some() && project.license.is_some() {
+ manifest.add_warning("only one of `license` or \
+ `license-file` is necessary".to_string());
+ }
+ for warning in warnings {
+ manifest.add_warning(warning);
+ }
+ for error in errors {
+ manifest.add_critical_warning(error);
+ }
+
+ manifest.feature_gate()?;
+
+ Ok((manifest, nested_paths))
+ }
+
+ fn to_virtual_manifest(me: &Rc<TomlManifest>,
+ source_id: &SourceId,
+ root: &Path,
+ config: &Config)
+ -> CargoResult<(VirtualManifest, Vec<PathBuf>)> {
+ if me.project.is_some() {
+ bail!("virtual manifests do not define [project]");
+ }
+ if me.package.is_some() {
+ bail!("virtual manifests do not define [package]");
+ }
+ if me.lib.is_some() {
+ bail!("virtual manifests do not specify [lib]");
+ }
+ if me.bin.is_some() {
+ bail!("virtual manifests do not specify [[bin]]");
+ }
+ if me.example.is_some() {
+ bail!("virtual manifests do not specify [[example]]");
+ }
+ if me.test.is_some() {
+ bail!("virtual manifests do not specify [[test]]");
+ }
+ if me.bench.is_some() {
+ bail!("virtual manifests do not specify [[bench]]");
+ }
+
+ let mut nested_paths = Vec::new();
+ let mut warnings = Vec::new();
+ let mut deps = Vec::new();
+ let empty = Vec::new();
+ let cargo_features = me.cargo_features.as_ref().unwrap_or(&empty);
+ let features = Features::new(&cargo_features, &mut warnings)?;
+
+ let (replace, patch) = {
+ let mut cx = Context {
+ pkgid: None,
+ deps: &mut deps,
+ source_id: source_id,
+ nested_paths: &mut nested_paths,
+ config: config,
+ warnings: &mut warnings,
+ platform: None,
+ features: &features,
+ root: root
+ };
+ (me.replace(&mut cx)?, me.patch(&mut cx)?)
+ };
+ let profiles = build_profiles(&me.profile);
+ let workspace_config = match me.workspace {
+ Some(ref config) => {
+ WorkspaceConfig::Root(
+ WorkspaceRootConfig::new(&root, &config.members, &config.exclude)
+ )
+ }
+ None => {
+ bail!("virtual manifests must be configured with [workspace]");
+ }
+ };
+ Ok((VirtualManifest::new(replace, patch, workspace_config, profiles), nested_paths))
+ }
+
+ fn replace(&self, cx: &mut Context)
+ -> CargoResult<Vec<(PackageIdSpec, Dependency)>> {
+ if self.patch.is_some() && self.replace.is_some() {
+ bail!("cannot specify both [replace] and [patch]");
+ }
+ let mut replace = Vec::new();
+ for (spec, replacement) in self.replace.iter().flat_map(|x| x) {
+ let mut spec = PackageIdSpec::parse(spec).chain_err(|| {
+ format!("replacements must specify a valid semver \
+ version to replace, but `{}` does not",
+ spec)
+ })?;
+ if spec.url().is_none() {
+ spec.set_url(CRATES_IO.parse().unwrap());
+ }
+
+ let version_specified = match *replacement {
+ TomlDependency::Detailed(ref d) => d.version.is_some(),
+ TomlDependency::Simple(..) => true,
+ };
+ if version_specified {
+ bail!("replacements cannot specify a version \
+ requirement, but found one for `{}`", spec);
+ }
+
+ let mut dep = replacement.to_dependency(spec.name(), cx, None)?;
+ {
+ let version = spec.version().ok_or_else(|| {
+ CargoError::from(format!("replacements must specify a version \
+ to replace, but `{}` does not",
+ spec))
+ })?;
+ dep.set_version_req(VersionReq::exact(version));
+ }
+ replace.push((spec, dep));
+ }
+ Ok(replace)
+ }
+
+ fn patch(&self, cx: &mut Context)
+ -> CargoResult<HashMap<Url, Vec<Dependency>>> {
+ let mut patch = HashMap::new();
+ for (url, deps) in self.patch.iter().flat_map(|x| x) {
+ let url = match &url[..] {
+ "crates-io" => CRATES_IO.parse().unwrap(),
+ _ => url.to_url()?,
+ };
+ patch.insert(url, deps.iter().map(|(name, dep)| {
+ dep.to_dependency(name, cx, None)
+ }).collect::<CargoResult<Vec<_>>>()?);
+ }
+ Ok(patch)
+ }
+
+ fn maybe_custom_build(&self,
+ build: &Option<StringOrBool>,
+ package_root: &Path)
+ -> Option<PathBuf> {
+ let build_rs = package_root.join("build.rs");
+ match *build {
+ Some(StringOrBool::Bool(false)) => None, // explicitly no build script
+ Some(StringOrBool::Bool(true)) => Some(build_rs.into()),
+ Some(StringOrBool::String(ref s)) => Some(PathBuf::from(s)),
+ None => {
+ match fs::metadata(&build_rs) {
+ // If there is a build.rs file next to the Cargo.toml, assume it is
+ // a build script
+ Ok(ref e) if e.is_file() => Some(build_rs.into()),
+ Ok(_) | Err(_) => None,
+ }
+ }
+ }
+ }
+}
+
+/// Checks a list of build targets and makes sure no two targets share the
+/// same source file. If a file is used by more than one target, its path
+/// is returned as the error.
+fn unique_build_targets(targets: &[Target], package_root: &Path) -> Result<(), String> {
+ let mut seen = HashSet::new();
+ for v in targets.iter().map(|e| package_root.join(e.src_path())) {
+ if !seen.insert(v.clone()) {
+ return Err(v.display().to_string());
+ }
+ }
+ Ok(())
+}
+
+impl TomlDependency {
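+ /// Convert this TOML dependency into Cargo's internal `Dependency`,
+ /// selecting its source (git, path, alternative registry, or crates.io)
+ /// and warning about keys that conflict or would be ignored.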
+ fn to_dependency(&self,
+ name: &str,
+ cx: &mut Context,
+ kind: Option<Kind>)
+ -> CargoResult<Dependency> {
+ let details = match *self {
+ TomlDependency::Simple(ref version) => DetailedTomlDependency {
+ version: Some(version.clone()),
+ .. Default::default()
+ },
+ TomlDependency::Detailed(ref details) => details.clone(),
+ };
+
+ if details.version.is_none() && details.path.is_none() &&
+ details.git.is_none() {
+ let msg = format!("dependency ({}) specified without \
+ providing a local path, Git repository, or \
+ version to use. This will be considered an \
+ error in future versions", name);
+ cx.warnings.push(msg);
+ }
+
+ if details.git.is_none() {
+ let git_only_keys = [
+ (&details.branch, "branch"),
+ (&details.tag, "tag"),
+ (&details.rev, "rev")
+ ];
+
+ for &(key, key_name) in &git_only_keys {
+ if key.is_some() {
+ let msg = format!("key `{}` is ignored for dependency ({}). \
+ This will be considered an error in future versions",
+ key_name, name);
+ cx.warnings.push(msg)
+ }
+ }
+ }
+
+ let new_source_id = match (details.git.as_ref(), details.path.as_ref(), details.registry.as_ref()) {
+ (Some(_), _, Some(_)) => bail!("dependency ({}) specification is ambiguous. \
+ Only one of `git` or `registry` is allowed.", name),
+ (_, Some(_), Some(_)) => bail!("dependency ({}) specification is ambiguous. \
+ Only one of `path` or `registry` is allowed.", name),
+ (Some(git), maybe_path, _) => {
+ if maybe_path.is_some() {
+ let msg = format!("dependency ({}) specification is ambiguous. \
+ Only one of `git` or `path` is allowed. \
+ This will be considered an error in future versions", name);
+ cx.warnings.push(msg)
+ }
+
+ let n_details = [&details.branch, &details.tag, &details.rev]
+ .iter()
+ .filter(|d| d.is_some())
+ .count();
+
+ if n_details > 1 {
+ let msg = format!("dependency ({}) specification is ambiguous. \
+ Only one of `branch`, `tag` or `rev` is allowed. \
+ This will be considered an error in future versions", name);
+ cx.warnings.push(msg)
+ }
+
+ let reference = details.branch.clone().map(GitReference::Branch)
+ .or_else(|| details.tag.clone().map(GitReference::Tag))
+ .or_else(|| details.rev.clone().map(GitReference::Rev))
+ .unwrap_or_else(|| GitReference::Branch("master".to_string()));
+ let loc = git.to_url()?;
+ SourceId::for_git(&loc, reference)?
+ },
+ (None, Some(path), _) => {
+ cx.nested_paths.push(PathBuf::from(path));
+ // If the source id for the package we're parsing is a path
+ // source, then we normalize the path here to get rid of
+ // components like `..`.
+ //
+ // The purpose of this is to get a canonical id for the package
+ // that we're depending on to ensure that builds of this package
+ // always end up hashing to the same value no matter where it's
+ // built from.
+ if cx.source_id.is_path() {
+ let path = cx.root.join(path);
+ let path = util::normalize_path(&path);
+ SourceId::for_path(&path)?
+ } else {
+ cx.source_id.clone()
+ }
+ },
+ (None, None, Some(registry)) => {
+ cx.features.require(Feature::alternative_registries())?;
+ SourceId::alt_registry(cx.config, registry)?
+ }
+ (None, None, None) => SourceId::crates_io(cx.config)?,
+ };
+
+ let version = details.version.as_ref().map(|v| &v[..]);
+ let mut dep = match cx.pkgid {
+ Some(id) => {
+ Dependency::parse(name, version, &new_source_id,
+ id, cx.config)?
+ }
+ None => Dependency::parse_no_deprecated(name, version, &new_source_id)?,
+ };
+ dep.set_features(details.features.unwrap_or_default())
+ .set_default_features(details.default_features
+ .or(details.default_features2)
+ .unwrap_or(true))
+ .set_optional(details.optional.unwrap_or(false))
+ .set_platform(cx.platform.clone());
+ if let Some(kind) = kind {
+ dep.set_kind(kind);
+ }
+ Ok(dep)
+ }
+}
+
+#[derive(Default, Serialize, Deserialize, Debug, Clone)]
+struct TomlTarget {
+ name: Option<String>,
+
+ // The intention was to only accept `crate-type` here but historical
+ // versions of Cargo also accepted `crate_type`, so look for both.
+ #[serde(rename = "crate-type")]
+ crate_type: Option<Vec<String>>,
+ #[serde(rename = "crate_type")]
+ crate_type2: Option<Vec<String>>,
+
+ path: Option<PathValue>,
+ test: Option<bool>,
+ doctest: Option<bool>,
+ bench: Option<bool>,
+ doc: Option<bool>,
+ plugin: Option<bool>,
+ #[serde(rename = "proc-macro")]
+ proc_macro: Option<bool>,
+ #[serde(rename = "proc_macro")]
+ proc_macro2: Option<bool>,
+ harness: Option<bool>,
+ #[serde(rename = "required-features")]
+ required_features: Option<Vec<String>>,
+}
+
+#[derive(Clone)]
+struct PathValue(PathBuf);
+
+impl<'de> de::Deserialize<'de> for PathValue {
+ fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+ where D: de::Deserializer<'de>
+ {
+ Ok(PathValue(String::deserialize(deserializer)?.into()))
+ }
+}
+
+impl ser::Serialize for PathValue {
+ fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
+ where S: ser::Serializer,
+ {
+ self.0.serialize(serializer)
+ }
+}
+
+/// Corresponds to a `[target]` table entry; named `TomlPlatform` because `TomlTarget` is already taken.
+#[derive(Serialize, Deserialize, Debug)]
+struct TomlPlatform {
+ dependencies: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "build-dependencies")]
+ build_dependencies: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "build_dependencies")]
+ build_dependencies2: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "dev-dependencies")]
+ dev_dependencies: Option<BTreeMap<String, TomlDependency>>,
+ #[serde(rename = "dev_dependencies")]
+ dev_dependencies2: Option<BTreeMap<String, TomlDependency>>,
+}
+
+impl TomlTarget {
+ fn new() -> TomlTarget {
+ TomlTarget::default()
+ }
+
+ fn name(&self) -> String {
+ match self.name {
+ Some(ref name) => name.clone(),
+ None => panic!("target name is required")
+ }
+ }
+
+ fn proc_macro(&self) -> Option<bool> {
+ self.proc_macro.or(self.proc_macro2)
+ }
+
+ fn crate_types(&self) -> Option<&Vec<String>> {
+ self.crate_type.as_ref().or_else(|| self.crate_type2.as_ref())
+ }
+}
+
+impl fmt::Debug for PathValue {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ self.0.fmt(f)
+ }
+}
+
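+/// Construct the final `Profiles` by overlaying the manifest's
+/// `[profile.*]` sections on top of Cargo's built-in defaults.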
+fn build_profiles(profiles: &Option<TomlProfiles>) -> Profiles {
+ let profiles = profiles.as_ref();
+ let mut profiles = Profiles {
+ release: merge(Profile::default_release(),
+ profiles.and_then(|p| p.release.as_ref())),
+ dev: merge(Profile::default_dev(),
+ profiles.and_then(|p| p.dev.as_ref())),
+ test: merge(Profile::default_test(),
+ profiles.and_then(|p| p.test.as_ref())),
+ test_deps: merge(Profile::default_dev(),
+ profiles.and_then(|p| p.dev.as_ref())),
+ bench: merge(Profile::default_bench(),
+ profiles.and_then(|p| p.bench.as_ref())),
+ bench_deps: merge(Profile::default_release(),
+ profiles.and_then(|p| p.release.as_ref())),
+ doc: merge(Profile::default_doc(),
+ profiles.and_then(|p| p.doc.as_ref())),
+ custom_build: Profile::default_custom_build(),
+ check: merge(Profile::default_check(),
+ profiles.and_then(|p| p.dev.as_ref())),
+ check_test: merge(Profile::default_check_test(),
+ profiles.and_then(|p| p.dev.as_ref())),
+ doctest: Profile::default_doctest(),
+ };
+ // The test/bench targets cannot have panic=abort because they'll all be
+ // compiled with --test, which currently requires the unwind runtime.
+ profiles.test.panic = None;
+ profiles.bench.panic = None;
+ profiles.test_deps.panic = None;
+ profiles.bench_deps.panic = None;
+ return profiles;
+
+ fn merge(profile: Profile, toml: Option<&TomlProfile>) -> Profile {
+ let &TomlProfile {
+ ref opt_level, lto, codegen_units, ref debug, debug_assertions, rpath,
+ ref panic, ref overflow_checks,
+ } = match toml {
+ Some(toml) => toml,
+ None => return profile,
+ };
+ let debug = match *debug {
+ Some(U32OrBool::U32(debug)) => Some(Some(debug)),
+ Some(U32OrBool::Bool(true)) => Some(Some(2)),
+ Some(U32OrBool::Bool(false)) => Some(None),
+ None => None,
+ };
+ Profile {
+ opt_level: opt_level.clone().unwrap_or(TomlOptLevel(profile.opt_level)).0,
+ lto: lto.unwrap_or(profile.lto),
+ codegen_units: codegen_units,
+ rustc_args: None,
+ rustdoc_args: None,
+ debuginfo: debug.unwrap_or(profile.debuginfo),
+ debug_assertions: debug_assertions.unwrap_or(profile.debug_assertions),
+ overflow_checks: overflow_checks.unwrap_or(profile.overflow_checks),
+ rpath: rpath.unwrap_or(profile.rpath),
+ test: profile.test,
+ doc: profile.doc,
+ run_custom_build: profile.run_custom_build,
+ check: profile.check,
+ panic: panic.clone().or(profile.panic),
+ }
+ }
+}
--- /dev/null
+//! This module implements Cargo conventions for directory layout:
+//!
+//! * `src/lib.rs` is a library
+//! * `src/main.rs` is a binary
+//! * `src/bin/*.rs` are binaries
+//! * `examples/*.rs` are examples
+//! * `tests/*.rs` are integration tests
+//! * `benches/*.rs` are benchmarks
+//!
+//! It is a bit tricky because we need to match the explicit information
+//! from `Cargo.toml` with the implicit info in the directory layout.
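+//!
+//! As a sketch of a (hypothetical) package using these conventions:
+//!
+//! * `src/lib.rs` becomes the library target, named after the package
+//! * `src/main.rs` becomes a binary target, also named after the package
+//! * `src/bin/cli.rs` becomes a binary target named `cli`
+//! * `examples/demo.rs` becomes an example target named `demo`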
+
+use std::path::{Path, PathBuf};
+use std::fs::{self, DirEntry};
+use std::collections::HashSet;
+
+use core::Target;
+use ops::is_bad_artifact_name;
+use util::errors::CargoResult;
+use super::{TomlTarget, LibKind, PathValue, TomlManifest, StringOrBool,
+ TomlLibTarget, TomlBinTarget, TomlBenchTarget, TomlExampleTarget, TomlTestTarget};
+
+
+pub fn targets(manifest: &TomlManifest,
+ package_name: &str,
+ package_root: &Path,
+ custom_build: &Option<StringOrBool>,
+ warnings: &mut Vec<String>,
+ errors: &mut Vec<String>)
+ -> CargoResult<Vec<Target>> {
+ let mut targets = Vec::new();
+
+ let has_lib;
+
+ if let Some(target) = clean_lib(manifest.lib.as_ref(), package_root, package_name, warnings)? {
+ targets.push(target);
+ has_lib = true;
+ } else {
+ has_lib = false;
+ }
+
+ targets.extend(
+ clean_bins(manifest.bin.as_ref(), package_root, package_name, warnings, has_lib)?
+ );
+
+ targets.extend(
+ clean_examples(manifest.example.as_ref(), package_root, errors)?
+ );
+
+ targets.extend(
+ clean_tests(manifest.test.as_ref(), package_root, errors)?
+ );
+
+ targets.extend(
+ clean_benches(manifest.bench.as_ref(), package_root, warnings, errors)?
+ );
+
+ // Process the custom build script, if the package has one
+ if let Some(custom_build) = manifest.maybe_custom_build(custom_build, package_root) {
+ let name = format!("build-script-{}",
+ custom_build.file_stem().and_then(|s| s.to_str()).unwrap_or(""));
+ targets.push(Target::custom_build_target(&name, package_root.join(custom_build)));
+ }
+
+ Ok(targets)
+}
+
+
+fn clean_lib(toml_lib: Option<&TomlLibTarget>,
+ package_root: &Path,
+ package_name: &str,
+ warnings: &mut Vec<String>) -> CargoResult<Option<Target>> {
+ let inferred = inferred_lib(package_root);
+ let lib = match toml_lib {
+ Some(lib) => {
+ if let Some(ref name) = lib.name {
+ // XXX: other code paths dodge this validation
+ if name.contains('-') {
+ bail!("library target names cannot contain hyphens: {}", name)
+ }
+ }
+ Some(TomlTarget {
+ name: lib.name.clone().or_else(|| Some(package_name.to_owned())),
+ ..lib.clone()
+ })
+ }
+ None => inferred.as_ref().map(|lib| {
+ TomlTarget {
+ name: Some(package_name.to_string()),
+ path: Some(PathValue(lib.clone())),
+ ..TomlTarget::new()
+ }
+ })
+ };
+
+ let lib = match lib {
+ Some(ref lib) => lib,
+ None => return Ok(None)
+ };
+
+ validate_has_name(lib, "library", "lib")?;
+
+ let path = match (lib.path.as_ref(), inferred) {
+ (Some(path), _) => package_root.join(&path.0),
+ (None, Some(path)) => path,
+ (None, None) => {
+ let legacy_path = package_root.join("src").join(format!("{}.rs", lib.name()));
+ if legacy_path.exists() {
+ warnings.push(format!(
+ "path `{}` was erroneously implicitly accepted for library `{}`,\n\
+ please rename the file to `src/lib.rs` or set lib.path in Cargo.toml",
+ legacy_path.display(), lib.name()
+ ));
+ legacy_path
+ } else {
+ bail!("can't find library `{}`, \
+ rename file to `src/lib.rs` or specify lib.path", lib.name())
+ }
+ }
+ };
+
+ // Per the Macros 1.1 RFC:
+ //
+ // > Initially if a crate is compiled with the proc-macro crate type
+ // > (and possibly others) it will forbid exporting any items in the
+ // > crate other than those functions tagged #[proc_macro_derive] and
+ // > those functions must also be placed at the crate root.
+ //
+ // A plugin requires exporting `plugin_registrar`, so a crate cannot be
+ // both a plugin and a proc-macro at once.
+ let crate_types = match (lib.crate_types(), lib.plugin, lib.proc_macro()) {
+ (_, Some(true), Some(true)) => bail!("lib.plugin and lib.proc-macro cannot both be true"),
+ (Some(kinds), _, _) => kinds.iter().map(|s| LibKind::from_str(s)).collect(),
+ (None, Some(true), _) => vec![LibKind::Dylib],
+ (None, _, Some(true)) => vec![LibKind::ProcMacro],
+ (None, _, _) => vec![LibKind::Lib],
+ };
+
+ let mut target = Target::lib_target(&lib.name(), crate_types, path);
+ configure(lib, &mut target);
+ Ok(Some(target))
+}
+
+fn clean_bins(toml_bins: Option<&Vec<TomlBinTarget>>,
+ package_root: &Path,
+ package_name: &str,
+ warnings: &mut Vec<String>,
+ has_lib: bool) -> CargoResult<Vec<Target>> {
+ let inferred = inferred_bins(package_root, package_name);
+ let bins = match toml_bins {
+ Some(bins) => bins.clone(),
+ None => inferred.iter().map(|&(ref name, ref path)| {
+ TomlTarget {
+ name: Some(name.clone()),
+ path: Some(PathValue(path.clone())),
+ ..TomlTarget::new()
+ }
+ }).collect()
+ };
+
+ for bin in &bins {
+ validate_has_name(bin, "binary", "bin")?;
+
+ let name = bin.name();
+ if is_bad_artifact_name(&name) {
+ bail!("the binary target name `{}` is forbidden", name)
+ }
+ }
+
+ validate_unique_names(&bins, "binary")?;
+
+ let mut result = Vec::new();
+ for bin in &bins {
+ let path = target_path(bin, &inferred, "bin", package_root, &mut |_| {
+ if let Some(legacy_path) = legacy_bin_path(package_root, &bin.name(), has_lib) {
+ warnings.push(format!(
+ "path `{}` was erroneously implicitly accepted for binary `{}`,\n\
+ please set bin.path in Cargo.toml",
+ legacy_path.display(), bin.name()
+ ));
+ Some(legacy_path)
+ } else {
+ None
+ }
+ });
+ let path = match path {
+ Ok(path) => path,
+ Err(e) => bail!("{}", e),
+ };
+
+ let mut target = Target::bin_target(&bin.name(), path,
+ bin.required_features.clone());
+ configure(bin, &mut target);
+ result.push(target);
+ }
+ return Ok(result);
+
+ fn legacy_bin_path(package_root: &Path, name: &str, has_lib: bool) -> Option<PathBuf> {
+ if !has_lib {
+ let path = package_root.join("src").join(format!("{}.rs", name));
+ if path.exists() {
+ return Some(path);
+ }
+ }
+ let path = package_root.join("src").join("main.rs");
+ if path.exists() {
+ return Some(path);
+ }
+
+ let path = package_root.join("src").join("bin").join("main.rs");
+ if path.exists() {
+ return Some(path);
+ }
+ None
+ }
+}
+
+fn clean_examples(toml_examples: Option<&Vec<TomlExampleTarget>>,
+ package_root: &Path,
+ errors: &mut Vec<String>)
+ -> CargoResult<Vec<Target>> {
+
+ let inferred = infer_from_directory(&package_root.join("examples"));
+
+ let targets = clean_targets("example", "example",
+ toml_examples, &inferred,
+ package_root, errors)?;
+
+ let mut result = Vec::new();
+ for (path, toml) in targets {
+ let crate_types = match toml.crate_types() {
+ Some(kinds) => kinds.iter().map(|s| LibKind::from_str(s)).collect(),
+ None => Vec::new()
+ };
+
+ let mut target = Target::example_target(&toml.name(), crate_types, path,
+ toml.required_features.clone());
+ configure(&toml, &mut target);
+ result.push(target);
+ }
+
+ Ok(result)
+}
+
+fn clean_tests(toml_tests: Option<&Vec<TomlTestTarget>>,
+ package_root: &Path,
+ errors: &mut Vec<String>) -> CargoResult<Vec<Target>> {
+
+ let inferred = infer_from_directory(&package_root.join("tests"));
+
+ let targets = clean_targets("test", "test",
+ toml_tests, &inferred,
+ package_root, errors)?;
+
+ let mut result = Vec::new();
+ for (path, toml) in targets {
+ let mut target = Target::test_target(&toml.name(), path,
+ toml.required_features.clone());
+ configure(&toml, &mut target);
+ result.push(target);
+ }
+ Ok(result)
+}
+
+fn clean_benches(toml_benches: Option<&Vec<TomlBenchTarget>>,
+ package_root: &Path,
+ warnings: &mut Vec<String>,
+ errors: &mut Vec<String>) -> CargoResult<Vec<Target>> {
+ let mut legacy_bench_path = |bench: &TomlTarget| {
+ let legacy_path = package_root.join("src").join("bench.rs");
+ if !(bench.name() == "bench" && legacy_path.exists()) {
+ return None;
+ }
+ warnings.push(format!(
+ "path `{}` was erroneously implicitly accepted for benchmark `{}`,\n\
+ please set bench.path in Cargo.toml",
+ legacy_path.display(), bench.name()
+ ));
+ Some(legacy_path)
+ };
+
+ let inferred = infer_from_directory(&package_root.join("benches"));
+
+ let targets = clean_targets_with_legacy_path("benchmark", "bench",
+ toml_benches, &inferred,
+ package_root,
+ errors,
+ &mut legacy_bench_path)?;
+
+ let mut result = Vec::new();
+ for (path, toml) in targets {
+ let mut target = Target::bench_target(&toml.name(), path,
+ toml.required_features.clone());
+ configure(&toml, &mut target);
+ result.push(target);
+ }
+
+ Ok(result)
+}
+
+fn clean_targets(target_kind_human: &str, target_kind: &str,
+ toml_targets: Option<&Vec<TomlTarget>>,
+ inferred: &[(String, PathBuf)],
+ package_root: &Path,
+ errors: &mut Vec<String>)
+ -> CargoResult<Vec<(PathBuf, TomlTarget)>> {
+ clean_targets_with_legacy_path(target_kind_human, target_kind,
+ toml_targets,
+ inferred,
+ package_root,
+ errors,
+ &mut |_| None)
+}
+
+fn clean_targets_with_legacy_path(target_kind_human: &str, target_kind: &str,
+ toml_targets: Option<&Vec<TomlTarget>>,
+ inferred: &[(String, PathBuf)],
+ package_root: &Path,
+ errors: &mut Vec<String>,
+ legacy_path: &mut FnMut(&TomlTarget) -> Option<PathBuf>)
+ -> CargoResult<Vec<(PathBuf, TomlTarget)>> {
+ let toml_targets = match toml_targets {
+ Some(targets) => targets.clone(),
+ None => inferred.iter().map(|&(ref name, ref path)| {
+ TomlTarget {
+ name: Some(name.clone()),
+ path: Some(PathValue(path.clone())),
+ ..TomlTarget::new()
+ }
+ }).collect()
+ };
+
+ for target in &toml_targets {
+ validate_has_name(target, target_kind_human, target_kind)?;
+ }
+
+ validate_unique_names(&toml_targets, target_kind)?;
+ let mut result = Vec::new();
+ for target in toml_targets {
+ let path = target_path(&target, inferred, target_kind, package_root, legacy_path);
+ let path = match path {
+ Ok(path) => path,
+ Err(e) => {
+ errors.push(e);
+ continue
+ },
+ };
+ result.push((path, target));
+ }
+ Ok(result)
+}
+
+
+fn inferred_lib(package_root: &Path) -> Option<PathBuf> {
+ let lib = package_root.join("src").join("lib.rs");
+ if fs::metadata(&lib).is_ok() {
+ Some(lib)
+ } else {
+ None
+ }
+}
+
+fn inferred_bins(package_root: &Path, package_name: &str) -> Vec<(String, PathBuf)> {
+ let main = package_root.join("src").join("main.rs");
+ let mut result = Vec::new();
+ if main.exists() {
+ result.push((package_name.to_string(), main));
+ }
+ result.extend(infer_from_directory(&package_root.join("src").join("bin")));
+
+ result
+}
+
+fn infer_from_directory(directory: &Path) -> Vec<(String, PathBuf)> {
+ let entries = match fs::read_dir(directory) {
+ Err(_) => return Vec::new(),
+ Ok(dir) => dir
+ };
+
+ entries
+ .filter_map(|e| e.ok())
+ .filter(is_not_dotfile)
+ .filter_map(|d| infer_any(&d))
+ .collect()
+}
+
+
+fn infer_any(entry: &DirEntry) -> Option<(String, PathBuf)> {
+ if entry.path().extension().and_then(|p| p.to_str()) == Some("rs") {
+ infer_file(entry)
+ } else if entry.file_type().map(|t| t.is_dir()).ok() == Some(true) {
+ infer_subdirectory(entry)
+ } else {
+ None
+ }
+}
+
+
+fn infer_file(entry: &DirEntry) -> Option<(String, PathBuf)> {
+ let path = entry.path();
+ path
+ .file_stem()
+ .and_then(|p| p.to_str())
+ .map(|p| (p.to_owned(), path.clone()))
+}
+
+
+fn infer_subdirectory(entry: &DirEntry) -> Option<(String, PathBuf)> {
+ let path = entry.path();
+ let main = path.join("main.rs");
+ let name = path.file_name().and_then(|n| n.to_str());
+ match (name, main.exists()) {
+ (Some(name), true) => Some((name.to_owned(), main)),
+ _ => None
+ }
+}
+
+
+fn is_not_dotfile(entry: &DirEntry) -> bool {
+ entry.file_name().to_str().map(|s| s.starts_with('.')) == Some(false)
+}
+
+
+fn validate_has_name(target: &TomlTarget,
+ target_kind_human: &str,
+ target_kind: &str) -> CargoResult<()> {
+ match target.name {
+ Some(ref name) => if name.trim().is_empty() {
+ bail!("{} target names cannot be empty", target_kind_human)
+ },
+ None => bail!("{} target {}.name is required", target_kind_human, target_kind)
+ }
+
+ Ok(())
+}
+
+/// Checks a list of TOML targets and makes sure the target names are unique.
+fn validate_unique_names(targets: &[TomlTarget], target_kind: &str) -> CargoResult<()> {
+ let mut seen = HashSet::new();
+ for name in targets.iter().map(|e| e.name()) {
+ if !seen.insert(name.clone()) {
+ bail!("found duplicate {target_kind} name {name}, \
+ but all {target_kind} targets must have a unique name",
+ target_kind = target_kind, name = name);
+ }
+ }
+ Ok(())
+}
+
+
+fn configure(toml: &TomlTarget, target: &mut Target) {
+ let t2 = target.clone();
+ target.set_tested(toml.test.unwrap_or_else(|| t2.tested()))
+ .set_doc(toml.doc.unwrap_or_else(|| t2.documented()))
+ .set_doctest(toml.doctest.unwrap_or_else(|| t2.doctested()))
+ .set_benched(toml.bench.unwrap_or_else(|| t2.benched()))
+ .set_harness(toml.harness.unwrap_or_else(|| t2.harness()))
+ .set_for_host(match (toml.plugin, toml.proc_macro()) {
+ (None, None) => t2.for_host(),
+ (Some(true), _) | (_, Some(true)) => true,
+ (Some(false), _) | (_, Some(false)) => false,
+ });
+}
+
+fn target_path(target: &TomlTarget,
+ inferred: &[(String, PathBuf)],
+ target_kind: &str,
+ package_root: &Path,
+ legacy_path: &mut FnMut(&TomlTarget) -> Option<PathBuf>) -> Result<PathBuf, String> {
+ if let Some(ref path) = target.path {
+ // Should we verify that this path exists here?
+ return Ok(package_root.join(&path.0));
+ }
+ let name = target.name();
+
+ let mut matching = inferred.iter()
+ .filter(|&&(ref n, _)| n == &name)
+ .map(|&(_, ref p)| p.clone());
+
+ let first = matching.next();
+ let second = matching.next();
+ match (first, second) {
+ (Some(path), None) => Ok(path),
+ (None, None) | (Some(_), Some(_)) => {
+ if let Some(path) = legacy_path(target) {
+ return Ok(path);
+ }
+ Err(format!("can't find `{name}` {target_kind}, specify {target_kind}.path",
+ name = name, target_kind = target_kind))
+ }
+ (None, Some(_)) => unreachable!()
+ }
+}
--- /dev/null
+use std::path::Path;
+use std::fs::create_dir;
+
+use git2;
+
+use util::{CargoResult, process};
+
+pub struct HgRepo;
+pub struct GitRepo;
+pub struct PijulRepo;
+pub struct FossilRepo;
+
+impl GitRepo {
+ pub fn init(path: &Path, _: &Path) -> CargoResult<GitRepo> {
+ git2::Repository::init(path)?;
+ Ok(GitRepo)
+ }
+ pub fn discover(path: &Path, _: &Path) -> Result<git2::Repository,git2::Error> {
+ git2::Repository::discover(path)
+ }
+}
+
+impl HgRepo {
+ pub fn init(path: &Path, cwd: &Path) -> CargoResult<HgRepo> {
+ process("hg").cwd(cwd).arg("init").arg(path).exec()?;
+ Ok(HgRepo)
+ }
+ pub fn discover(path: &Path, _: &Path) -> CargoResult<HgRepo> {
+ // `hg root` only succeeds when invoked inside a repository
+ process("hg").cwd(path).arg("root").exec_with_output()?;
+ Ok(HgRepo)
+ }
+}
+
+impl PijulRepo {
+ pub fn init(path: &Path, cwd: &Path) -> CargoResult<PijulRepo> {
+ process("pijul").cwd(cwd).arg("init").arg(path).exec()?;
+ Ok(PijulRepo)
+ }
+}
+
+impl FossilRepo {
+ pub fn init(path: &Path, cwd: &Path) -> CargoResult<FossilRepo> {
+ // fossil doesn't create the directory so we'll do that first
+ create_dir(path)?;
+
+ // set up the paths we'll use
+ let db_fname = ".fossil";
+ let mut db_path = path.to_owned();
+ db_path.push(db_fname);
+
+ // then create the fossil DB in that location
+ process("fossil").cwd(cwd).arg("init").arg(&db_path).exec()?;
+
+ // open it in that new directory
+ process("fossil").cwd(&path).arg("open").arg(db_fname).exec()?;
+
+ // set `target` as ignorable and cleanable
+ process("fossil").cwd(cwd).arg("settings")
+ .arg("ignore-glob")
+ .arg("target").exec()?;
+ process("fossil").cwd(cwd).arg("settings")
+ .arg("clean-glob")
+ .arg("target").exec()?;
+ Ok(FossilRepo)
+ }
+}
--- /dev/null
+set -ex
+
+DOCS="index faq config guide manifest build-script pkgid-spec crates-io \
+ environment-variables specifying-dependencies source-replacement \
+ external-tools"
+ASSETS="CNAME images/noise.png images/forkme.png images/Cargo-Logo-Small.png \
+ stylesheets/all.css stylesheets/normalize.css javascripts/prism.js \
+ javascripts/all.js stylesheets/prism.css images/circle-with-i.png \
+ images/search.png images/org-level-acl.png images/auth-level-acl.png \
+ favicon.ico policies.html"
+
+for asset in $ASSETS; do
+ mkdir -p `dirname target/doc/$asset`
+ cp src/doc/$asset target/doc/$asset
+done
+
+for doc in $DOCS; do
+ rustdoc \
+ --markdown-no-toc \
+ --markdown-css stylesheets/normalize.css \
+ --markdown-css stylesheets/all.css \
+ --markdown-css stylesheets/prism.css \
+ --html-in-header src/doc/html-headers.html \
+ --html-before-content src/doc/header.html \
+ --html-after-content src/doc/footer.html \
+ -o target/doc \
+ src/doc/$doc.md
+done
+
+# Temporary preview for mdBook docs
+cd src/doc/book
+$HOME/.cargo/bin/mdbook build --no-create --dest-dir ../../../target/doc/book
+cd ../../../
--- /dev/null
+[package]
+name = "crates-io"
+version = "0.13.0"
+authors = ["Alex Crichton <alex@alexcrichton.com>"]
+license = "MIT/Apache-2.0"
+repository = "https://github.com/rust-lang/cargo"
+description = """
+Helpers for interacting with crates.io
+"""
+
+[lib]
+name = "crates_io"
+path = "lib.rs"
+
+[dependencies]
+curl = "0.4"
+error-chain = "0.11.0-rc.2"
+serde = "1.0"
+serde_derive = "1.0"
+serde_json = "1.0"
+url = "1.0"
--- /dev/null
+#![allow(unknown_lints)]
+
+extern crate curl;
+extern crate url;
+#[macro_use]
+extern crate error_chain;
+extern crate serde_json;
+#[macro_use]
+extern crate serde_derive;
+
+use std::collections::BTreeMap;
+use std::fs::File;
+use std::io::prelude::*;
+use std::io::{self, Cursor};
+
+use curl::easy::{Easy, List};
+
+use url::percent_encoding::{percent_encode, QUERY_ENCODE_SET};
+
+error_chain! {
+ foreign_links {
+ Curl(curl::Error);
+ Io(io::Error);
+ Json(serde_json::Error);
+ }
+
+ errors {
+ NotOkResponse(code: u32, headers: Vec<String>, body: Vec<u8>) {
+ description("failed to get a 200 OK response")
+ display("failed to get a 200 OK response, got {}
+headers:
+ {}
+body:
+{}", code, headers.join("\n "), String::from_utf8_lossy(body))
+ }
+ NonUtf8Body {
+ description("response body was not utf-8")
+ display("response body was not utf-8")
+ }
+ Api(errs: Vec<String>) {
+ display("api errors: {}", errs.join(", "))
+ }
+ Unauthorized {
+ display("unauthorized API access")
+ }
+ TokenMissing {
+ display("no upload token found, please run `cargo login`")
+ }
+ NotFound {
+ display("cannot find crate")
+ }
+ }
+ }
+
+pub struct Registry {
+ host: String,
+ token: Option<String>,
+ handle: Easy,
+}
+
+#[derive(PartialEq, Clone, Copy)]
+pub enum Auth {
+ Authorized,
+ Unauthorized,
+}
+
+#[derive(Deserialize)]
+pub struct Crate {
+ pub name: String,
+ pub description: Option<String>,
+ pub max_version: String,
+}
+
+#[derive(Serialize)]
+pub struct NewCrate {
+ pub name: String,
+ pub vers: String,
+ pub deps: Vec<NewCrateDependency>,
+ pub features: BTreeMap<String, Vec<String>>,
+ pub authors: Vec<String>,
+ pub description: Option<String>,
+ pub documentation: Option<String>,
+ pub homepage: Option<String>,
+ pub readme: Option<String>,
+ pub readme_file: Option<String>,
+ pub keywords: Vec<String>,
+ pub categories: Vec<String>,
+ pub license: Option<String>,
+ pub license_file: Option<String>,
+ pub repository: Option<String>,
+ pub badges: BTreeMap<String, BTreeMap<String, String>>,
+}
+
+#[derive(Serialize)]
+pub struct NewCrateDependency {
+ pub optional: bool,
+ pub default_features: bool,
+ pub name: String,
+ pub features: Vec<String>,
+ pub version_req: String,
+ pub target: Option<String>,
+ pub kind: String,
+ #[serde(skip_serializing_if = "Option::is_none")]
+ pub registry: Option<String>,
+}
+
+#[derive(Deserialize)]
+pub struct User {
+ pub id: u32,
+ pub login: String,
+ pub avatar: Option<String>,
+ pub email: Option<String>,
+ pub name: Option<String>,
+}
+
+pub struct Warnings {
+ pub invalid_categories: Vec<String>,
+ pub invalid_badges: Vec<String>,
+}
+
+#[derive(Deserialize)] struct R { ok: bool }
+#[derive(Deserialize)] struct OwnerResponse { ok: bool, msg: String }
+#[derive(Deserialize)] struct ApiErrorList { errors: Vec<ApiError> }
+#[derive(Deserialize)] struct ApiError { detail: String }
+#[derive(Serialize)] struct OwnersReq<'a> { users: &'a [&'a str] }
+#[derive(Deserialize)] struct Users { users: Vec<User> }
+#[derive(Deserialize)] struct TotalCrates { total: u32 }
+#[derive(Deserialize)] struct Crates { crates: Vec<Crate>, meta: TotalCrates }
+
+impl Registry {
+ pub fn new(host: String, token: Option<String>) -> Registry {
+ Registry::new_handle(host, token, Easy::new())
+ }
+
+ pub fn new_handle(host: String,
+ token: Option<String>,
+ handle: Easy) -> Registry {
+ Registry {
+ host: host,
+ token: token,
+ handle: handle,
+ }
+ }
+
+ pub fn add_owners(&mut self, krate: &str, owners: &[&str]) -> Result<String> {
+ let body = serde_json::to_string(&OwnersReq { users: owners })?;
+ let body = self.put(format!("/crates/{}/owners", krate),
+ body.as_bytes())?;
+ let response = serde_json::from_str::<OwnerResponse>(&body)?;
+ assert!(response.ok);
+ Ok(response.msg)
+ }
+
+ pub fn remove_owners(&mut self, krate: &str, owners: &[&str]) -> Result<()> {
+ let body = serde_json::to_string(&OwnersReq { users: owners })?;
+ let body = self.delete(format!("/crates/{}/owners", krate),
+ Some(body.as_bytes()))?;
+ assert!(serde_json::from_str::<OwnerResponse>(&body)?.ok);
+ Ok(())
+ }
+
+ pub fn list_owners(&mut self, krate: &str) -> Result<Vec<User>> {
+ let body = self.get(format!("/crates/{}/owners", krate))?;
+ Ok(serde_json::from_str::<Users>(&body)?.users)
+ }
+
+ pub fn publish(&mut self, krate: &NewCrate, tarball: &File)
+ -> Result<Warnings> {
+ let json = serde_json::to_string(krate)?;
+ // Prepare the body. The format of the upload request is:
+ //
+ // <le u32 of json>
+ // <json request> (metadata for the package)
+ // <le u32 of tarball>
+ // <source tarball>
+ let stat = tarball.metadata()?;
+ let header = {
+ let mut w = Vec::new();
+ w.extend([
+ (json.len() >> 0) as u8,
+ (json.len() >> 8) as u8,
+ (json.len() >> 16) as u8,
+ (json.len() >> 24) as u8,
+ ].iter().map(|x| *x));
+ w.extend(json.as_bytes().iter().map(|x| *x));
+ w.extend([
+ (stat.len() >> 0) as u8,
+ (stat.len() >> 8) as u8,
+ (stat.len() >> 16) as u8,
+ (stat.len() >> 24) as u8,
+ ].iter().map(|x| *x));
+ w
+ };
+ let size = stat.len() as usize + header.len();
+ let mut body = Cursor::new(header).chain(tarball);
+
+ let url = format!("{}/api/v1/crates/new", self.host);
+
+ let token = match self.token.as_ref() {
+ Some(s) => s,
+ None => return Err(Error::from_kind(ErrorKind::TokenMissing)),
+ };
+ self.handle.put(true)?;
+ self.handle.url(&url)?;
+ self.handle.in_filesize(size as u64)?;
+ let mut headers = List::new();
+ headers.append("Accept: application/json")?;
+ headers.append(&format!("Authorization: {}", token))?;
+ self.handle.http_headers(headers)?;
+
+ let body = handle(&mut self.handle, &mut |buf| body.read(buf).unwrap_or(0))?;
+
+ let response = if !body.is_empty() {
+ body.parse::<serde_json::Value>()?
+ } else {
+ "{}".parse()?
+ };
+
+ let invalid_categories: Vec<String> = response
+ .get("warnings")
+ .and_then(|j| j.get("invalid_categories"))
+ .and_then(|j| j.as_array())
+ .map(|x| x.iter().flat_map(|j| j.as_str()).map(Into::into).collect())
+ .unwrap_or_else(Vec::new);
+
+ let invalid_badges: Vec<String> = response
+ .get("warnings")
+ .and_then(|j| j.get("invalid_badges"))
+ .and_then(|j| j.as_array())
+ .map(|x| x.iter().flat_map(|j| j.as_str()).map(Into::into).collect())
+ .unwrap_or_else(Vec::new);
+
+ Ok(Warnings {
+ invalid_categories: invalid_categories,
+ invalid_badges: invalid_badges,
+ })
+ }
+
+ pub fn search(&mut self, query: &str, limit: u8) -> Result<(Vec<Crate>, u32)> {
+ let formatted_query = percent_encode(query.as_bytes(), QUERY_ENCODE_SET);
+ let body = self.req(
+ format!("/crates?q={}&per_page={}", formatted_query, limit),
+ None, Auth::Unauthorized
+ )?;
+
+ let crates = serde_json::from_str::<Crates>(&body)?;
+ Ok((crates.crates, crates.meta.total))
+ }
+
+ pub fn yank(&mut self, krate: &str, version: &str) -> Result<()> {
+ let body = self.delete(format!("/crates/{}/{}/yank", krate, version),
+ None)?;
+ assert!(serde_json::from_str::<R>(&body)?.ok);
+ Ok(())
+ }
+
+ pub fn unyank(&mut self, krate: &str, version: &str) -> Result<()> {
+ let body = self.put(format!("/crates/{}/{}/unyank", krate, version),
+ &[])?;
+ assert!(serde_json::from_str::<R>(&body)?.ok);
+ Ok(())
+ }
+
+ fn put(&mut self, path: String, b: &[u8]) -> Result<String> {
+ self.handle.put(true)?;
+ self.req(path, Some(b), Auth::Authorized)
+ }
+
+ fn get(&mut self, path: String) -> Result<String> {
+ self.handle.get(true)?;
+ self.req(path, None, Auth::Authorized)
+ }
+
+ fn delete(&mut self, path: String, b: Option<&[u8]>) -> Result<String> {
+ self.handle.custom_request("DELETE")?;
+ self.req(path, b, Auth::Authorized)
+ }
+
+ fn req(&mut self,
+ path: String,
+ body: Option<&[u8]>,
+ authorized: Auth) -> Result<String> {
+ self.handle.url(&format!("{}/api/v1{}", self.host, path))?;
+ let mut headers = List::new();
+ headers.append("Accept: application/json")?;
+ headers.append("Content-Type: application/json")?;
+
+ if authorized == Auth::Authorized {
+ let token = match self.token.as_ref() {
+ Some(s) => s,
+ None => return Err(Error::from_kind(ErrorKind::TokenMissing)),
+ };
+ headers.append(&format!("Authorization: {}", token))?;
+ }
+ self.handle.http_headers(headers)?;
+ match body {
+ Some(mut body) => {
+ self.handle.upload(true)?;
+ self.handle.in_filesize(body.len() as u64)?;
+ handle(&mut self.handle, &mut |buf| body.read(buf).unwrap_or(0))
+ }
+ None => handle(&mut self.handle, &mut |_| 0),
+ }
+ }
+}
+
+fn handle(handle: &mut Easy,
+ read: &mut FnMut(&mut [u8]) -> usize) -> Result<String> {
+ let mut headers = Vec::new();
+ let mut body = Vec::new();
+ {
+ let mut handle = handle.transfer();
+ handle.read_function(|buf| Ok(read(buf)))?;
+ handle.write_function(|data| {
+ body.extend_from_slice(data);
+ Ok(data.len())
+ })?;
+ handle.header_function(|data| {
+ headers.push(String::from_utf8_lossy(data).into_owned());
+ true
+ })?;
+ handle.perform()?;
+ }
+
+ match handle.response_code()? {
+ 0 => {} // curl sometimes reports a response code of 0 for file upload URLs
+ 200 => {}
+ 403 => return Err(Error::from_kind(ErrorKind::Unauthorized)),
+ 404 => return Err(Error::from_kind(ErrorKind::NotFound)),
+ code => return Err(Error::from_kind(ErrorKind::NotOkResponse(code, headers, body))),
+ }
+
+ let body = match String::from_utf8(body) {
+ Ok(body) => body,
+ Err(..) => return Err(Error::from_kind(ErrorKind::NonUtf8Body)),
+ };
+ match serde_json::from_str::<ApiErrorList>(&body) {
+ Ok(errors) => {
+ return Err(Error::from_kind(ErrorKind::Api(errors
+ .errors
+ .into_iter()
+ .map(|s| s.detail)
+ .collect())))
+ }
+ Err(..) => {}
+ }
+ Ok(body)
+}
--- /dev/null
+doc.crates.io
--- /dev/null
+index.md book/src/index.md book/src/getting-started/index.md book/src/getting-started/*.md
+guide.md book/src/guide/index.md book/src/guide/*.md
+build-script.md book/src/reference/build-scripts.md
+config.md book/src/reference/config.md
+crates-io.md book/src/reference/publishing.md
+environment-variables.md book/src/reference/environment-variables.md
+external-tools.md book/src/reference/external-tools.md
+manifest.md book/src/reference/manifest.md
+pkgid-spec.md book/src/reference/pkgid-spec.md
+source-replacement.md book/src/reference/source-replacement.md
+specifying-dependencies.md book/src/reference/specifying-dependencies.md
+faq.md book/src/faq.md
--- /dev/null
+# Cargo Documentation
+
+NOTE: Cargo documentation is under migration to mdBook-based structure. All the
+`*.md` files here shall be kept in sync with the `*.md` files under `book/src/`.
+See `MIGRATION_MAP` file here and <https://github.com/rust-lang/cargo/pull/4453>
+for details.
--- /dev/null
+# The Cargo Book
+
+
+### Requirements
+
+Building the book requires [mdBook]. To get it:
+
+[mdBook]: https://github.com/azerupi/mdBook
+
+```shell
+$ cargo install mdbook
+```
+
+### Building
+
+To build the book:
+
+```shell
+$ mdbook build
+```
+
+The output will be in the `book` subdirectory. To check it out, open it in
+your web browser.
+
+_Firefox:_
+```shell
+$ firefox book/index.html # Linux
+$ open -a "Firefox" book/index.html # OS X
+$ Start-Process "firefox.exe" .\book\index.html # Windows (PowerShell)
+$ start firefox.exe .\book\index.html # Windows (Cmd)
+```
+
+_Chrome:_
+```shell
+$ google-chrome book/index.html # Linux
+$ open -a "Google Chrome" book/index.html # OS X
+$ Start-Process "chrome.exe" .\book\index.html # Windows (PowerShell)
+$ start chrome.exe .\book\index.html # Windows (Cmd)
+```
+
+
+## Contributing
+
+Given that the book is still in a draft state, we'd love your help! Please feel free to open
+issues about anything, and send in PRs for things you'd like to fix or change. If your change is
+large, please open an issue first, so we can make sure that it's something we'd accept before you
+go through the work of getting a PR together.
--- /dev/null
+title = "The Cargo Book"
+author = "Alex Crichton, Steve Klabnik and Carol Nichols, with Contributions from the Rust Community"
--- /dev/null
+# Summary
+
+[Introduction](index.md)
+
+* [Getting Started](getting-started/index.md)
+ * [Installation](getting-started/installation.md)
+ * [First Steps with Cargo](getting-started/first-steps.md)
+
+* [Cargo Guide](guide/index.md)
+ * [Why Cargo Exists](guide/why-cargo-exists.md)
+ * [Creating a New Project](guide/creating-a-new-project.md)
+ * [Working on an Existing Project](guide/working-on-an-existing-project.md)
+ * [Dependencies](guide/dependencies.md)
+ * [Project Layout](guide/project-layout.md)
+ * [Cargo.toml vs Cargo.lock](guide/cargo-toml-vs-cargo-lock.md)
+ * [Tests](guide/tests.md)
+ * [Continuous Integration](guide/continuous-integration.md)
+
+* [Cargo Reference](reference/index.md)
+ * [Specifying Dependencies](reference/specifying-dependencies.md)
+ * [The Manifest Format](reference/manifest.md)
+ * [Configuration](reference/config.md)
+ * [Environment Variables](reference/environment-variables.md)
+ * [Build Scripts](reference/build-scripts.md)
+ * [Publishing on crates.io](reference/publishing.md)
+ * [Package ID Specifications](reference/pkgid-spec.md)
+ * [Source Replacement](reference/source-replacement.md)
+ * [External Tools](reference/external-tools.md)
+
+* [FAQ](faq.md)
--- /dev/null
+## Frequently Asked Questions
+
+### Is the plan to use GitHub as a package repository?
+
+No. The plan for Cargo is to use [crates.io], like npm or Rubygems do with
+npmjs.org and rubygems.org.
+
+We plan to support git repositories as a source of packages forever,
+because they can be used for early development and temporary patches,
+even when people use the registry as the primary source of packages.
+
+### Why build crates.io rather than use GitHub as a registry?
+
+We think that it’s very important to support multiple ways to download
+packages, including downloading from GitHub and copying packages into
+your project itself.
+
+That said, we think that [crates.io] offers a number of important benefits, and
+will likely become the primary way that people download packages in Cargo.
+
+For precedent, both Node.js’s [npm][1] and Ruby’s [bundler][2] support a
+central registry model as well as a Git-based model, and most packages
+are downloaded through the registry in those ecosystems, with an
+important minority of packages making use of git-based packages.
+
+[1]: https://www.npmjs.org
+[2]: https://bundler.io
+
+Some of the advantages that make a central registry popular in other
+languages include:
+
+* **Discoverability**. A central registry provides an easy place to look
+ for existing packages. Combined with tagging, this also makes it
+ possible for a registry to provide ecosystem-wide information, such as a
+ list of the most popular or most-depended-on packages.
+* **Speed**. A central registry makes it possible to easily fetch just
+ the metadata for packages quickly and efficiently, and then to
+ efficiently download just the published package, and not other bloat
+ that happens to exist in the repository. This adds up to a significant
+ improvement in the speed of dependency resolution and fetching. As
+ dependency graphs scale up, downloading all of the git repositories bogs
+ down fast. Also remember that not everybody has a high-speed,
+ low-latency Internet connection.
+
+### Will Cargo work with C code (or other languages)?
+
+Yes!
+
+Cargo handles compiling Rust code, but we know that many Rust projects
+link against C code. We also know that there are decades of tooling
+built up around compiling languages other than Rust.
+
+Our solution: Cargo allows a package to [specify a script](reference/build-scripts.html)
+(written in Rust) to run before invoking `rustc`. Rust is leveraged to
+implement platform-specific configuration and refactor out common build
+functionality among packages.
+
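+For a sense of scale, such a script is just an ordinary Rust program named
+`build.rs`; a minimal sketch (the library name `z` is only an example) looks
+like this:
+
+```rust
+// build.rs
+//
+// Runs before rustc; each `cargo:` line on stdout is an instruction to Cargo.
+fn main() {
+    // Ask Cargo to pass `-l z` to the compiler so the crate links libz.
+    println!("cargo:rustc-link-lib=z");
+}
+```
+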
+### Can Cargo be used inside of `make` (or `ninja`, or ...)
+
+Indeed. While we intend Cargo to be useful as a standalone way to
+compile Rust projects at the top-level, we know that some people will
+want to invoke Cargo from other build tools.
+
+We have designed Cargo to work well in those contexts, paying attention
+to things like error codes and machine-readable output modes. We still
+have some work to do on those fronts, but using Cargo in the context of
+conventional scripts is something we designed for from the beginning and
+will continue to prioritize.
+
+### Does Cargo handle multi-platform projects or cross-compilation?
+
+Rust itself provides facilities for configuring sections of code based
+on the platform. Cargo also supports [platform-specific
+dependencies][target-deps], and we plan to support more per-platform
+configuration in `Cargo.toml` in the future.
+
+[target-deps]: reference/specifying-dependencies.html#platform-specific-dependencies
+
+In the longer-term, we’re looking at ways to conveniently cross-compile
+projects using Cargo.
+
+### Does Cargo support environments, like `production` or `test`?
+
+We support environments through the use of [profiles][profile], which enable:
+
+[profile]: reference/manifest.html#the-profile-sections
+
+* environment-specific flags (like `-g --opt-level=0` for development
+ and `--opt-level=3` for production).
+* environment-specific dependencies (like `hamcrest` for test assertions).
+* environment-specific `#[cfg]`
+* a `cargo test` command
+
+### Does Cargo work on Windows?
+
+Yes!
+
+All commits to Cargo are required to pass the local test suite on Windows.
+If, however, you find a Windows issue, we consider it a bug, so [please file an
+issue][3].
+
+[3]: https://github.com/rust-lang/cargo/issues
+
+### Why do binaries have `Cargo.lock` in version control, but not libraries?
+
+The purpose of a `Cargo.lock` is to describe the state of the world at the time
+of a successful build. It is then used to provide deterministic builds across
+whatever machine is building the project by ensuring that the exact same
+dependencies are being compiled.
+
+This property is most desirable for applications and projects which are at the
+very end of the dependency chain (binaries). As a result, it is recommended that
+all binaries check in their `Cargo.lock`.
+
+For libraries the situation is somewhat different. A library is not only used by
+the library developers, but also by any downstream consumers of the library. Users
+dependent on the library will not inspect the library’s `Cargo.lock` (even if it
+exists). This is precisely because a library should **not** be deterministically
+recompiled for all users of the library.
+
+If a library ends up being used transitively by several dependencies, it’s
+likely that just a single copy of the library is desired (based on semver
+compatibility). If all libraries were to check in their `Cargo.lock`, then
+multiple copies of the library would be used, and perhaps even a version
+conflict.
+
+In other words, libraries specify semver requirements for their dependencies but
+cannot see the full picture. Only end products like binaries have a full
+picture to decide what versions of dependencies should be used.
+
+### Can libraries use `*` as a version for their dependencies?
+
+**As of January 22nd, 2016, [crates.io] rejects all packages (not just libraries)
+with wildcard dependency constraints.**
+
+While libraries _can_, strictly speaking, they should not. A version requirement
+of `*` says “This will work with every version ever,” which is never going
+to be true. Libraries should always specify the range that they do work with,
+even if it’s something as general as “every 1.x.y version.”
+
+### Why `Cargo.toml`?
+
+Because the manifest is one of the most frequent interactions with Cargo, the
+question of why it is named `Cargo.toml` arises from time to time. The leading
+capital-`C` was chosen to ensure that the manifest was grouped with other
+similar configuration files in directory listings. Sorting files often puts
+capital letters before lowercase letters, ensuring files like `Makefile` and
+`Cargo.toml` are placed together. The trailing `.toml` was chosen to emphasize
+the fact that the file is in the [TOML configuration
+format](https://github.com/toml-lang/toml).
+
+Cargo does not allow other names such as `cargo.toml` or `Cargofile` to
+emphasize the ease of how a Cargo repository can be identified. An option of
+many possible names has historically led to confusion where one case was handled
+but others were accidentally forgotten.
+
+[crates.io]: https://crates.io/
+
+### How can Cargo work offline?
+
+Cargo is often used in situations with limited or no network access such as
+airplanes, CI environments, or embedded in large production deployments. Users
+are often surprised when Cargo attempts to fetch resources from the network, and
+hence the request for Cargo to work offline comes up frequently.
+
+Cargo, at its heart, will not attempt to access the network unless told to do
+so. That is, if no crates come from crates.io, a git repository, or some other
+network location, Cargo will never attempt to make a network connection. As a
+result, if Cargo attempts to touch the network, then it's because it needs to
+fetch a required resource.
+
+Cargo is also quite aggressive about caching information to minimize the amount
+of network activity. It will guarantee, for example, that if `cargo build` (or
+an equivalent) is run to completion then the next `cargo build` is guaranteed to
+not touch the network so long as `Cargo.toml` has not been modified in the
+meantime. This avoidance of the network boils down to a `Cargo.lock` existing
+and a populated cache of the crates reflected in the lock file. If either of
+these components is missing, it must be fetched remotely for the build to
+succeed.
+
+As of Rust 1.11.0 Cargo understands a new flag, `--frozen`, which is an
+assertion that it shouldn't touch the network. When passed, Cargo will
+immediately return an error if it would otherwise attempt a network request.
+The error should include contextual information about why the network request
+is being made in the first place, to help with debugging. Note that this flag *does
+not change the behavior of Cargo*, it simply asserts that Cargo shouldn't touch
+the network as a previous command has been run to ensure that network activity
+shouldn't be necessary.
+
+For more information about vendoring, see documentation on [source
+replacement][replace].
+
+[replace]: reference/source-replacement.html
--- /dev/null
+## First Steps with Cargo
+
+To start a new project with Cargo, use `cargo new`:
+
+```shell
+$ cargo new hello_world --bin
+```
+
+We’re passing `--bin` because we’re making a binary program: if we
+were making a library, we’d leave it off.
+
+Let’s check out what Cargo has generated for us:
+
+```shell
+$ cd hello_world
+$ tree .
+.
+├── Cargo.toml
+└── src
+ └── main.rs
+
+1 directory, 2 files
+```
+
+This is all we need to get started. First, let’s check out `Cargo.toml`:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+```
+
+This is called a **manifest**, and it contains all of the metadata that Cargo
+needs to compile your project.
+
+Here’s what’s in `src/main.rs`:
+
+```rust
+fn main() {
+ println!("Hello, world!");
+}
+```
+
+Cargo generated a “hello world” for us. Let’s compile it:
+
+```shell
+$ cargo build
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+And then run it:
+
+```shell
+$ ./target/debug/hello_world
+Hello, world!
+```
+
+We can also use `cargo run` to compile and then run it, all in one step:
+
+```shell
+$ cargo run
+ Fresh hello_world v0.1.0 (file:///path/to/project/hello_world)
+ Running `target/debug/hello_world`
+Hello, world!
+```
+
+### Going further
+
+For more details on using Cargo, check out the [Cargo Guide](guide/index.html).
--- /dev/null
+## Getting Started
+
+To get started with Cargo, install Cargo (and Rust) and set up your first crate.
+
+* [Installation](getting-started/installation.html)
+* [First steps with Cargo](getting-started/first-steps.html)
--- /dev/null
+## Installation
+
+### Install Stable Rust and Cargo
+
+The easiest way to get Cargo is to get the current stable release of [Rust] by
+using the `rustup` script:
+
+```shell
+$ curl -sSf https://static.rust-lang.org/rustup.sh | sh
+```
+
+After this, you can use the `rustup` command to also install `beta` or `nightly`
+channels for Rust and Cargo.
+
+### Install Nightly Cargo
+
+To install just Cargo, the current recommended installation method is through
+the official nightly builds. Note that Cargo will also require that [Rust] is
+already installed on the system.
+
+| Platform | 64-bit | 32-bit |
+|------------------|-------------------|-------------------|
+| Linux binaries | [tar.gz][linux64] | [tar.gz][linux32] |
+| MacOS binaries | [tar.gz][mac64] | [tar.gz][mac32] |
+| Windows binaries | [tar.gz][win64] | [tar.gz][win32] |
+
+### Build and Install Cargo from Source
+
+Alternatively, you can [build Cargo from source][compiling-from-source].
+
+[rust]: https://www.rust-lang.org/
+[linux64]: https://static.rust-lang.org/cargo-dist/cargo-nightly-x86_64-unknown-linux-gnu.tar.gz
+[linux32]: https://static.rust-lang.org/cargo-dist/cargo-nightly-i686-unknown-linux-gnu.tar.gz
+[mac64]: https://static.rust-lang.org/cargo-dist/cargo-nightly-x86_64-apple-darwin.tar.gz
+[mac32]: https://static.rust-lang.org/cargo-dist/cargo-nightly-i686-apple-darwin.tar.gz
+[win64]: https://static.rust-lang.org/cargo-dist/cargo-nightly-x86_64-pc-windows-gnu.tar.gz
+[win32]: https://static.rust-lang.org/cargo-dist/cargo-nightly-i686-pc-windows-gnu.tar.gz
+[compiling-from-source]: https://github.com/rust-lang/cargo#compiling-from-source
--- /dev/null
+## Cargo.toml vs Cargo.lock
+
+`Cargo.toml` and `Cargo.lock` serve two different purposes. Before we talk
+about them, here’s a summary:
+
+* `Cargo.toml` is about describing your dependencies in a broad sense, and is
+ written by you.
+* `Cargo.lock` contains exact information about your dependencies. It is
+ maintained by Cargo and should not be manually edited.
+
+If you’re building a library that other projects will depend on, put
+`Cargo.lock` in your `.gitignore`. If you’re building an executable like a
+command-line tool or an application, check `Cargo.lock` into `git`. If you're
+curious about why that is, see ["Why do binaries have `Cargo.lock` in version
+control, but not libraries?" in the
+FAQ](faq.html#why-do-binaries-have-cargolock-in-version-control-but-not-libraries).
+
+Let’s dig in a little bit more.
+
+`Cargo.toml` is a **manifest** file in which we can specify a bunch of
+different metadata about our project. For example, we can say that we depend
+on another project:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand.git" }
+```
+
+This project has a single dependency, on the `rand` library. We’ve stated in
+this case that we’re relying on a particular Git repository that lives on
+GitHub. Since we haven’t specified any other information, Cargo assumes that
+we intend to use the latest commit on the `master` branch to build our project.
+
+Sound good? Well, there’s one problem: If you build this project today, and
+then you send a copy to me, and I build this project tomorrow, something bad
+could happen. There could be more commits to `rand` in the meantime, and my
+build would include new commits while yours would not. Therefore, we would
+get different builds. This would be bad because we want reproducible builds.
+
+We could fix this problem by putting a `rev` line in our `Cargo.toml`:
+
+```toml
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand.git", rev = "9f35b8e" }
+```
+
+Now our builds will be the same. But there’s a big drawback: now we have to
+manually think about SHA-1s every time we want to update our library. This is
+both tedious and error prone.
+
+Enter the `Cargo.lock`. Because of its existence, we don’t need to manually
+keep track of the exact revisions: Cargo will do it for us. When we have a
+manifest like this:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand.git" }
+```
+
+Cargo will take the latest commit and write that information out into our
+`Cargo.lock` when we build for the first time. That file will look like this:
+
+```toml
+[[package]]
+name = "hello_world"
+version = "0.1.0"
+dependencies = [
+ "rand 0.1.0 (git+https://github.com/rust-lang-nursery/rand.git#9f35b8e439eeedd60b9414c58f389bdc6a3284f9)",
+]
+
+[[package]]
+name = "rand"
+version = "0.1.0"
+source = "git+https://github.com/rust-lang-nursery/rand.git#9f35b8e439eeedd60b9414c58f389bdc6a3284f9"
+```
+
+You can see that there’s a lot more information here, including the exact
+revision we used to build. Now when you give your project to someone else,
+they’ll use the exact same SHA, even though we didn’t specify it in our
+`Cargo.toml`.
+
+When we’re ready to opt in to a new version of the library, Cargo can
+re-calculate the dependencies and update things for us:
+
+```shell
+$ cargo update # updates all dependencies
+$ cargo update -p rand # updates just “rand”
+```
+
+This will write out a new `Cargo.lock` with the new version information. Note
+that the argument to `cargo update` is actually a
+[Package ID Specification](reference/pkgid-spec.html) and `rand` is just a short
+specification.
--- /dev/null
+## Continuous Integration
+
+### Travis CI
+
+To test your project on Travis CI, here is a sample `.travis.yml` file:
+
+```yaml
+language: rust
+rust:
+ - stable
+ - beta
+ - nightly
+matrix:
+ allow_failures:
+ - rust: nightly
+```
+
+This will test all three release channels, but any breakage in nightly
+will not fail your overall build. Please see the [Travis CI Rust
+documentation](https://docs.travis-ci.com/user/languages/rust/) for more
+information.
--- /dev/null
+## Creating a New Project
+
+To start a new project with Cargo, use `cargo new`:
+
+```shell
+$ cargo new hello_world --bin
+```
+
+We’re passing `--bin` because we’re making a binary program: if we
+were making a library, we’d leave it off. This also initializes a new `git`
+repository by default. If you don't want it to do that, pass `--vcs none`.
+
+Let’s check out what Cargo has generated for us:
+
+```shell
+$ cd hello_world
+$ tree .
+.
+├── Cargo.toml
+└── src
+ └── main.rs
+
+1 directory, 2 files
+```
+
+If we had just used `cargo new hello_world` without the `--bin` flag, then
+we would have a `lib.rs` instead of a `main.rs`. For now, however, this is all
+we need to get started. First, let’s check out `Cargo.toml`:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+```
+
+This is called a **manifest**, and it contains all of the metadata that Cargo
+needs to compile your project.
+
+Here’s what’s in `src/main.rs`:
+
+```rust
+fn main() {
+ println!("Hello, world!");
+}
+```
+
+Cargo generated a “hello world” for us. Let’s compile it:
+
+```shell
+$ cargo build
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+And then run it:
+
+```shell
+$ ./target/debug/hello_world
+Hello, world!
+```
+
+We can also use `cargo run` to compile and then run it, all in one step (you
+won't see the `Compiling` line if you have not made any changes since you last
+compiled):
+
+```shell
+$ cargo run
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+ Running `target/debug/hello_world`
+Hello, world!
+```
+
+You’ll now notice a new file, `Cargo.lock`. It contains information about our
+dependencies. Since we don’t have any yet, it’s not very interesting.
+
+Once you’re ready for release, you can use `cargo build --release` to compile
+your files with optimizations turned on:
+
+```shell
+$ cargo build --release
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+`cargo build --release` puts the resulting binary in `target/release` instead of
+`target/debug`.
+
+Compiling in debug mode is the default for development: compilation time is
+shorter since the compiler doesn't do optimizations, but the code will run
+slower. Release mode takes longer to compile, but the code will run faster.
--- /dev/null
+## Dependencies
+
+[crates.io] is the Rust community's central package registry that serves as a
+location to discover and download packages. `cargo` is configured to use it by
+default to find requested packages.
+
+To depend on a library hosted on [crates.io], add it to your `Cargo.toml`.
+
+[crates.io]: https://crates.io/
+
+### Adding a dependency
+
+If your `Cargo.toml` doesn't already have a `[dependencies]` section, add that,
+then list the crate name and version that you would like to use. This example
+adds a dependency of the `time` crate:
+
+```toml
+[dependencies]
+time = "0.1.12"
+```
+
+The version string is a [semver] version requirement. The [specifying
+dependencies](reference/specifying-dependencies.html) docs have more information about
+the options you have here.
+
+[semver]: https://github.com/steveklabnik/semver#requirements
+
+If we also wanted to add a dependency on the `regex` crate, we would not need
+to add `[dependencies]` for each crate listed. Here's what your whole
+`Cargo.toml` file would look like with dependencies on the `time` and `regex`
+crates:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+
+[dependencies]
+time = "0.1.12"
+regex = "0.1.41"
+```
+
+Re-run `cargo build`, and Cargo will fetch the new dependencies and all of
+their dependencies, compile them all, and update the `Cargo.lock`:
+
+```shell
+$ cargo build
+ Updating registry `https://github.com/rust-lang/crates.io-index`
+ Downloading memchr v0.1.5
+ Downloading libc v0.1.10
+ Downloading regex-syntax v0.2.1
+ Downloading memchr v0.1.5
+ Downloading aho-corasick v0.3.0
+ Downloading regex v0.1.41
+ Compiling memchr v0.1.5
+ Compiling libc v0.1.10
+ Compiling regex-syntax v0.2.1
+ Compiling memchr v0.1.5
+ Compiling aho-corasick v0.3.0
+ Compiling regex v0.1.41
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+Our `Cargo.lock` contains the exact information about which revision of all of
+these dependencies we used.
+
+Now, if `regex` gets updated, we will still build with the same revision until
+we choose to `cargo update`.
+
+You can now use the `regex` library using `extern crate` in `main.rs`.
+
+```rust
+extern crate regex;
+
+use regex::Regex;
+
+fn main() {
+ let re = Regex::new(r"^\d{4}-\d{2}-\d{2}$").unwrap();
+ println!("Did our date match? {}", re.is_match("2014-01-01"));
+}
+```
+
+Running it will show:
+
+```shell
+$ cargo run
+ Running `target/debug/hello_world`
+Did our date match? true
+```
--- /dev/null
+## Cargo Guide
+
+This guide will give you all that you need to know about how to use Cargo to
+develop Rust projects.
+
+* [Why Cargo Exists](guide/why-cargo-exists.html)
+* [Creating a New Project](guide/creating-a-new-project.html)
+* [Working on an Existing Cargo Project](guide/working-on-an-existing-project.html)
+* [Dependencies](guide/dependencies.html)
+* [Project Layout](guide/project-layout.html)
+* [Cargo.toml vs Cargo.lock](guide/cargo-toml-vs-cargo-lock.html)
+* [Tests](guide/tests.html)
+* [Continuous Integration](guide/continuous-integration.html)
--- /dev/null
+## Project Layout
+
+Cargo uses conventions for file placement to make it easy to dive into a new
+Cargo project:
+
+```shell
+.
+├── Cargo.lock
+├── Cargo.toml
+├── benches
+│ └── large-input.rs
+├── examples
+│ └── simple.rs
+├── src
+│ ├── bin
+│ │ └── another_executable.rs
+│ ├── lib.rs
+│ └── main.rs
+└── tests
+ └── some-integration-tests.rs
+```
+
+* `Cargo.toml` and `Cargo.lock` are stored in the root of your project (*package
+ root*).
+* Source code goes in the `src` directory.
+* The default library file is `src/lib.rs`.
+* The default executable file is `src/main.rs`.
+* Other executables can be placed in `src/bin/*.rs`.
+* Integration tests go in the `tests` directory (unit tests go in each file
+ they're testing).
+* Examples go in the `examples` directory.
+* Benchmarks go in the `benches` directory.
+
+These are explained in more detail in the [manifest
+description](reference/manifest.html#the-project-layout).
--- /dev/null
+## Tests
+
+Cargo can run your tests with the `cargo test` command. Cargo looks for tests
+to run in two places: in each of your `src` files and any tests in `tests/`.
+Tests in your `src` files should be unit tests, and tests in `tests/` should be
+integration-style tests. As such, you’ll need to import your crates into
+the files in `tests`.
+
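+For example, assuming the package also builds a library crate named
+`hello_world` with a hypothetical public function `add`, an integration test
+might look like this:
+
+```rust
+// tests/integration.rs
+//
+// Integration tests must import the crate under test explicitly.
+extern crate hello_world;
+
+#[test]
+fn adds_numbers() {
+    // Exercises the crate exactly as a downstream consumer would.
+    assert_eq!(hello_world::add(2, 2), 4);
+}
+```
+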
+Here's an example of running `cargo test` in our project, which currently has
+no tests:
+
+```shell
+$ cargo test
+ Compiling rand v0.1.0 (https://github.com/rust-lang-nursery/rand.git#9f35b8e)
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+ Running target/test/hello_world-9c2b65bbb79eabce
+
+running 0 tests
+
+test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
+```
+
+If our project had tests, we would see more output with the correct number of
+tests.
+
+You can also run a specific test by passing a filter:
+
+```shell
+$ cargo test foo
+```
+
+This will run any test with `foo` in its name.
+
+`cargo test` runs additional checks as well. For example, it will compile any
+examples you’ve included and will also test the examples in your
+documentation. Please see the [testing guide][testing] in the Rust
+documentation for more details.
+
+[testing]: https://doc.rust-lang.org/book/testing.html
--- /dev/null
+## Why Cargo Exists
+
+Cargo is a tool that allows Rust projects to declare their various
+dependencies and ensure that you’ll always get a repeatable build.
+
+To accomplish this goal, Cargo does four things:
+
+* Introduces two metadata files with various bits of project information.
+* Fetches and builds your project’s dependencies.
+* Invokes `rustc` or another build tool with the correct parameters to build
+ your project.
+* Introduces conventions to make working with Rust projects easier.
--- /dev/null
+## Working on an Existing Cargo Project
+
+If you download an existing project that uses Cargo, it’s really easy
+to get going.
+
+First, get the project from somewhere. In this example, we’ll use `rand`
+cloned from its repository on GitHub:
+
+```shell
+$ git clone https://github.com/rust-lang-nursery/rand.git
+$ cd rand
+```
+
+To build, use `cargo build`:
+
+```shell
+$ cargo build
+ Compiling rand v0.1.0 (file:///path/to/project/rand)
+```
+
+This will fetch all of the dependencies and then build them, along with the
+project.
--- /dev/null
+# The Cargo Book
+
+
+
+Cargo is the [Rust] *package manager*. Cargo downloads your Rust project’s
+dependencies, compiles your project, makes packages, and uploads them to
+[crates.io], the Rust community’s *package registry*.
+
+
+### Sections
+
+**[Getting Started](getting-started/index.html)**
+
+To get started with Cargo, install Cargo (and Rust) and set up your first crate.
+
+**[Cargo Guide](guide/index.html)**
+
+The guide will give you all you need to know about how to use Cargo to develop
+Rust projects.
+
+**[Cargo Reference](reference/index.html)**
+
+The reference covers the details of various areas of Cargo.
+
+**[Frequently Asked Questions](faq.html)**
+
+[rust]: https://www.rust-lang.org/
+[crates.io]: https://crates.io/
--- /dev/null
+## Build Scripts
+
+Some packages need to compile third-party non-Rust code, for example C
+libraries. Other packages need to link to C libraries which can either be
+located on the system or possibly need to be built from source. Others still
+need facilities for functionality such as code generation before building (think
+parser generators).
+
+Cargo does not aim to replace other tools that are well-optimized for
+these tasks, but it does integrate with them through the `build` configuration
+option.
+
+```toml
+[package]
+# ...
+build = "build.rs"
+```
+
+The Rust file designated by the `build` command (relative to the package root)
+will be compiled and invoked before anything else is compiled in the package,
+allowing your Rust code to depend on the built or generated artifacts.
+By default, Cargo looks for a `build.rs` file in the package root (even if you
+do not specify a value for `build`). Use `build = "custom_build_name.rs"` to
+specify a custom script name, or `build = false` to disable automatic detection
+of the build script.
+
+Some example use cases of the build command are:
+
+* Building a bundled C library.
+* Finding a C library on the host system.
+* Generating a Rust module from a specification.
+* Performing any platform-specific configuration needed for the crate.
+
+Each of these use cases will be detailed in full below to give examples of how
+the build command works.
+
+### Inputs to the Build Script
+
+When the build script is run, it receives a number of inputs, all passed in
+the form of [environment variables][env].
+
+In addition to environment variables, the build script’s current directory is
+the source directory of the build script’s package.
+
+[env]: reference/environment-variables.html
+
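+A build script reads these inputs like any other process reads its
+environment; for example, a sketch that inspects a few of the variables Cargo
+sets:
+
+```rust
+// build.rs
+//
+// Reading the inputs Cargo passes via environment variables.
+use std::env;
+
+fn main() {
+    let out_dir = env::var("OUT_DIR").unwrap(); // where outputs belong
+    let target = env::var("TARGET").unwrap(); // triple being compiled for
+    let profile = env::var("PROFILE").unwrap(); // e.g. "debug" or "release"
+    println!("cargo:warning=building {} ({}) into {}", target, profile, out_dir);
+}
+```
+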
+### Outputs of the Build Script
+
+All the lines printed to stdout by a build script are written to a file like
+`target/debug/build/<pkg>/output` (the precise location may depend on your
+configuration). Any line that starts with `cargo:` is interpreted directly by
+Cargo. This line must be of the form `cargo:key=value`, like the examples below:
+
+```shell
+# specially recognized by Cargo
+cargo:rustc-link-lib=static=foo
+cargo:rustc-link-search=native=/path/to/foo
+cargo:rustc-cfg=foo
+cargo:rustc-env=FOO=bar
+# arbitrary user-defined metadata
+cargo:root=/path/to/foo
+cargo:libdir=/path/to/foo/lib
+cargo:include=/path/to/foo/include
+```
+
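+From the build script’s point of view these are ordinary `println!` calls; a
+sketch emitting the specially recognized keys from the example above:
+
+```rust
+// build.rs
+//
+// Each `cargo:key=value` line printed to stdout is an instruction to Cargo.
+fn main() {
+    println!("cargo:rustc-link-lib=static=foo");
+    println!("cargo:rustc-link-search=native=/path/to/foo");
+    println!("cargo:rustc-cfg=foo");
+    println!("cargo:rustc-env=FOO=bar");
+}
+```
+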
+On the other hand, lines printed to stderr are written to a file like
+`target/debug/build/<pkg>/stderr` but are not interpreted by Cargo.
+
+There are a few special keys that Cargo recognizes, some affecting how the
+crate is built:
+
+* `rustc-link-lib=[KIND=]NAME` indicates that the specified value is a library
+ name and should be passed to the compiler as a `-l` flag. The optional `KIND`
+ can be one of `static`, `dylib` (the default), or `framework`; see
+ `rustc --help` for more details.
+* `rustc-link-search=[KIND=]PATH` indicates the specified value is a library
+ search path and should be passed to the compiler as a `-L` flag. The optional
+ `KIND` can be one of `dependency`, `crate`, `native`, `framework` or `all`
+ (the default); see `rustc --help` for more details.
+* `rustc-flags=FLAGS` is a set of flags passed to the compiler; only `-l` and
+ `-L` flags are supported.
+* `rustc-cfg=FEATURE` indicates that the specified feature will be passed as a
+ `--cfg` flag to the compiler. This is often useful for performing compile-time
+ detection of various features.
+* `rustc-env=VAR=VALUE` indicates that the specified environment variable
+ will be added to the environment in which the compiler is run. The value can
+ then be retrieved by the `env!` macro in the compiled crate (see the sketch
+ after this list). This is useful for embedding additional metadata in a
+ crate's code, such as the hash of Git HEAD or the unique identifier of a
+ continuous integration server.
+* `rerun-if-changed=PATH` is a path to a file or directory which indicates that
+ the build script should be re-run if it changes (detected by a more-recent
+ last-modified timestamp on the file). Normally build scripts are re-run if
+ any file inside the crate root changes, but this can be used to scope changes
+ to just a small set of files. (If this path points to a directory, the entire
+ directory will not be traversed for changes; only changes to the timestamp
+ of the directory itself (which corresponds to some types of changes within the
+ directory, depending on platform) will trigger a rebuild. To request a re-run
+ on any changes within an entire directory, print a line for the directory and
+ another line for everything inside it, recursively.)
+ Note that if the build script itself (or one of its dependencies) changes,
+ then it's rebuilt and rerun unconditionally, so
+ `cargo:rerun-if-changed=build.rs` is almost always redundant (unless you
+ want to ignore changes in all other files except for `build.rs`).
+* `rerun-if-env-changed=VAR` is the name of an environment variable which
+ indicates that if the environment variable's value changes the build script
+ should be rerun. This basically behaves the same as `rerun-if-changed` except
+ that it works with environment variables instead. Note that the environment
+ variables here are intended for global environment variables like `CC` and
+ such, it's not necessary to use this for env vars like `TARGET` that Cargo
+ sets. Also note that if `rerun-if-env-changed` is printed out then Cargo will
+ *only* rerun the build script if those environment variables change or if
+ files printed out by `rerun-if-changed` change.
+
+* `warning=MESSAGE` is a message that will be printed to the main console after
+ a build script has finished running. Warnings are only shown for path
+ dependencies (that is, those you're working on locally), so for example
+ warnings printed out in crates.io crates are not emitted by default.
+
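+Here is the sketch referenced in the `rustc-env` item above; the variable name
+`BUILD_ID` and its value are hypothetical:
+
+```rust
+// build.rs
+//
+// Inject a compile-time environment variable into the crate being built.
+fn main() {
+    println!("cargo:rustc-env=BUILD_ID=abc123");
+}
+
+// In the crate being compiled, the value can then be read at compile time:
+//
+//     println!("built from {}", env!("BUILD_ID"));
+```
+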
+Any other key is user-defined metadata that will be passed to
+dependents. More information about this can be found in the [`links`][links]
+section.
+
+[links]: #the-links-manifest-key
+
+### Build Dependencies
+
+Build scripts are also allowed to have dependencies on other Cargo-based crates.
+Dependencies are declared through the `build-dependencies` section of the
+manifest.
+
+```toml
+[build-dependencies]
+foo = { git = "https://github.com/your-packages/foo" }
+```
+
+The build script **does not** have access to the dependencies listed in the
+`dependencies` or `dev-dependencies` section (they’re not built yet!). Build
+dependencies are likewise not available to the package itself unless they are
+also declared as regular dependencies. A short sketch of using a build
+dependency follows.
+
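+As a sketch of the above (both the `foo` package and its `setup` function are
+hypothetical), the build script uses a build dependency like any other crate:
+
+```rust
+// build.rs
+//
+// `foo` is in scope here because it is listed in [build-dependencies].
+extern crate foo;
+
+fn main() {
+    // Hypothetical helper exported by the `foo` package.
+    foo::setup();
+}
+```
+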
+### The `links` Manifest Key
+
+In addition to the manifest key `build`, Cargo also supports a `links` manifest
+key to declare the name of a native library that is being linked to:
+
+```toml
+[package]
+# ...
+links = "foo"
+build = "build.rs"
+```
+
+This manifest states that the package links to the `libfoo` native library, and
+it also has a build script for locating and/or building the library. Cargo
+requires that a `build` command is specified if a `links` entry is also
+specified.
+
+The purpose of this manifest key is to give Cargo an understanding about the set
+of native dependencies that a package has, as well as providing a principled
+system of passing metadata between package build scripts.
+
+Primarily, Cargo requires that there is at most one package per `links` value.
+In other words, it’s forbidden to have two packages link to the same native
+library. Note, however, that there are [conventions in place][star-sys] to
+alleviate this.
+
+[star-sys]: #-sys-packages
+
+As mentioned above in the output format, each build script can generate an
+arbitrary set of metadata in the form of key-value pairs. This metadata is
+passed to the build scripts of **dependent** packages. For example, if `libbar`
+depends on `libfoo`, then if `libfoo` generates `key=value` as part of its
+metadata, then the build script of `libbar` will see the environment variable
+`DEP_FOO_KEY=value`, as sketched below.
+
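+A sketch of `libbar`’s build script consuming that metadata, following the
+`DEP_FOO_KEY` example above:
+
+```rust
+// build.rs of `libbar`
+//
+// Reading metadata emitted by libfoo's build script.
+use std::env;
+
+fn main() {
+    // Set because libfoo printed `cargo:key=value` and we depend on it.
+    if let Ok(value) = env::var("DEP_FOO_KEY") {
+        println!("cargo:warning=libfoo passed us: {}", value);
+    }
+}
+```
+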
+Note that metadata is only passed to immediate dependents, not transitive
+dependents. The motivation for this metadata passing is outlined in the linking
+to system libraries case study below.
+
+### Overriding Build Scripts
+
+If a manifest contains a `links` key, then Cargo supports overriding the build
+script specified with a custom library. The purpose of this functionality is to
+prevent running the build script in question altogether and instead supply the
+metadata ahead of time.
+
+To override a build script, place the following configuration in any acceptable
+Cargo [configuration location](reference/config.html).
+
+```toml
+[target.x86_64-unknown-linux-gnu.foo]
+rustc-link-search = ["/path/to/foo"]
+rustc-link-lib = ["foo"]
+root = "/path/to/foo"
+key = "value"
+```
+
+This section states that for the target `x86_64-unknown-linux-gnu` the library
+named `foo` has the metadata specified. This metadata is the same as the
+metadata generated as if the build script had run, providing a number of
+key/value pairs where the `rustc-flags`, `rustc-link-search`, and
+`rustc-link-lib` keys are slightly special.
+
+With this configuration, if a package declares that it links to `foo` then the
+build script will **not** be compiled or run, and the metadata specified will
+instead be used.
+
+### Case study: Code generation
+
+Some Cargo packages need to have code generated just before they are compiled
+for various reasons. Here we’ll walk through a simple example which generates a
+library call as part of the build script.
+
+First, let’s take a look at the directory structure of this package:
+
+```shell
+.
+├── Cargo.toml
+├── build.rs
+└── src
+ └── main.rs
+
+1 directory, 3 files
+```
+
+Here we can see that we have a `build.rs` build script and our binary in
+`main.rs`. Next, let’s take a look at the manifest:
+
+```toml
+# Cargo.toml
+
+[package]
+name = "hello-from-generated-code"
+version = "0.1.0"
+authors = ["you@example.com"]
+build = "build.rs"
+```
+
+Here we can see we’ve got a build script specified which we’ll use to generate
+some code. Let’s see what’s inside the build script:
+
+```rust,no_run
+// build.rs
+
+use std::env;
+use std::fs::File;
+use std::io::Write;
+use std::path::Path;
+
+fn main() {
+ let out_dir = env::var("OUT_DIR").unwrap();
+ let dest_path = Path::new(&out_dir).join("hello.rs");
+ let mut f = File::create(&dest_path).unwrap();
+
+ f.write_all(b"
+ pub fn message() -> &'static str {
+ \"Hello, World!\"
+ }
+ ").unwrap();
+}
+```
+
+There are a couple of points of note here:
+
+* The script uses the `OUT_DIR` environment variable to discover where the
+ output files should be located. It can use the process’ current working
+ directory to find where the input files should be located, but in this case we
+ don’t have any input files.
+* This script is relatively simple as it just writes out a small generated file.
+ One could imagine that other more fanciful operations could take place such as
+ generating a Rust module from a C header file or another language definition,
+ for example.
+
+Next, let’s peek at the library itself:
+
+```rust,ignore
+// src/main.rs
+
+include!(concat!(env!("OUT_DIR"), "/hello.rs"));
+
+fn main() {
+ println!("{}", message());
+}
+```
+
+This is where the real magic happens. The library is using the rustc-defined
+`include!` macro in combination with the `concat!` and `env!` macros to include
+the generated file (`hello.rs`) into the crate’s compilation.
+
+Using the structure shown here, crates can include any number of generated files
+from the build script itself.
+
+### Case study: Building some native code
+
+Sometimes it’s necessary to build some native C or C++ code as part of a
+package. This is another excellent use case of leveraging the build script to
+build a native library before the Rust crate itself. As an example, we’ll create
+a Rust library which calls into C to print “Hello, World!”.
+
+Like above, let’s first take a look at the project layout:
+
+```shell
+.
+├── Cargo.toml
+├── build.rs
+└── src
+ ├── hello.c
+ └── main.rs
+
+1 directory, 4 files
+```
+
+Pretty similar to before! Next, the manifest:
+
+```toml
+# Cargo.toml
+
+[package]
+name = "hello-world-from-c"
+version = "0.1.0"
+authors = ["you@example.com"]
+build = "build.rs"
+```
+
+For now we’re not going to use any build dependencies, so let’s take a look at
+the build script now:
+
+```rust,no_run
+// build.rs
+
+use std::process::Command;
+use std::env;
+use std::path::Path;
+
+fn main() {
+ let out_dir = env::var("OUT_DIR").unwrap();
+
+ // note that there are a number of downsides to this approach, the comments
+ // below detail how to improve the portability of these commands.
+ Command::new("gcc").args(&["src/hello.c", "-c", "-fPIC", "-o"])
+ .arg(&format!("{}/hello.o", out_dir))
+ .status().unwrap();
+ Command::new("ar").args(&["crus", "libhello.a", "hello.o"])
+ .current_dir(&Path::new(&out_dir))
+ .status().unwrap();
+
+ println!("cargo:rustc-link-search=native={}", out_dir);
+ println!("cargo:rustc-link-lib=static=hello");
+}
+```
+
+This build script starts out by compiling our C file into an object file (by
+invoking `gcc`) and then converting this object file into a static library (by
+invoking `ar`). The final step is to tell Cargo that our output is in
+`out_dir` and that the compiler should link the crate to `libhello.a`
+statically via the `-l static=hello` flag.
+
+Note that there are a number of drawbacks to this hardcoded approach:
+
+* The `gcc` command itself is not portable across platforms. For example it’s
+ unlikely that Windows platforms have `gcc`, and not all Unix platforms may
+ have `gcc` either. The `ar` command is in a similar situation.
+* These commands do not take cross-compilation into account. If we’re cross
+ compiling for a platform such as Android it’s unlikely that `gcc` will produce
+ an ARM executable.
+
+Not to fear, though, this is where a `build-dependencies` entry would help! The
+Cargo ecosystem has a number of packages to make this sort of task much easier,
+portable, and standardized. For example, the build script could be written as:
+
+```rust,ignore
+// build.rs
+
+// Bring in a dependency on an externally maintained `gcc` package which manages
+// invoking the C compiler.
+extern crate gcc;
+
+fn main() {
+ gcc::compile_library("libhello.a", &["src/hello.c"]);
+}
+```
+
+Add a build time dependency on the `gcc` crate with the following addition to
+your `Cargo.toml`:
+
+```toml
+[build-dependencies]
+gcc = "0.3"
+```
+
+The [`gcc` crate](https://crates.io/crates/gcc) abstracts a range of build
+script requirements for C code:
+
+* It invokes the appropriate compiler (MSVC for Windows, `gcc` for MinGW, `cc`
+ for Unix platforms, etc.).
+* It takes the `TARGET` variable into account by passing appropriate flags to
+ the compiler being used.
+* Other environment variables, such as `OPT_LEVEL`, `DEBUG`, etc., are all
+ handled automatically.
+* The stdout output and `OUT_DIR` locations are also handled by the `gcc`
+ library.
+
+Here we can start to see some of the major benefits of farming as much
+functionality as possible out to common build dependencies rather than
+duplicating logic across all build scripts!
+
+Back to the case study though, let’s take a quick look at the contents of the
+`src` directory:
+
+```c
+// src/hello.c
+
+#include <stdio.h>
+
+void hello() {
+ printf("Hello, World!\n");
+}
+```
+
+```rust,ignore
+// src/main.rs
+
+// Note the lack of the `#[link]` attribute. We’re delegating the responsibility
+// of selecting what to link to over to the build script rather than hardcoding
+// it in the source file.
+extern { fn hello(); }
+
+fn main() {
+ unsafe { hello(); }
+}
+```
+
+And there we go! This should complete our example of building some C code from a
+Cargo package using the build script itself. This also shows why using a build
+dependency can be crucial in many situations, and can even make the result much
+more concise!
+
+We’ve also seen a brief example of how a build script can use a crate as a
+dependency purely for the build process and not for the crate itself at runtime.
+
+### Case study: Linking to system libraries
+
+The final case study here will be investigating how a Cargo library links to a
+system library and how the build script is leveraged to support this use case.
+
+Quite frequently a Rust crate wants to link to a native library provided by
+the system, either to expose its functionality through bindings or simply to
+use it as an implementation detail. This is quite a nuanced problem when it
+comes to doing so in a platform-agnostic fashion, and the purpose of a build
+script is again to farm out as much of this work as possible to make it as
+easy as possible for consumers.
+
+As an example to follow, let’s take a look at one of [Cargo’s own
+dependencies][git2-rs], [libgit2][libgit2]. The C library has a number of
+constraints:
+
+[git2-rs]: https://github.com/alexcrichton/git2-rs/tree/master/libgit2-sys
+[libgit2]: https://github.com/libgit2/libgit2
+
+* It has an optional dependency on OpenSSL on Unix to implement the https
+ transport.
+* It has an optional dependency on libssh2 on all platforms to implement the ssh
+ transport.
+* It is often not installed on all systems by default.
+* It can be built from source using `cmake`.
+
+To visualize what’s going on here, let’s take a look at the manifest for the
+relevant Cargo package that links to the native C library.
+
+```toml
+[package]
+name = "libgit2-sys"
+version = "0.1.0"
+authors = ["..."]
+links = "git2"
+build = "build.rs"
+
+[dependencies]
+libssh2-sys = { git = "https://github.com/alexcrichton/ssh2-rs" }
+
+[target.'cfg(unix)'.dependencies]
+openssl-sys = { git = "https://github.com/alexcrichton/openssl-sys" }
+
+# ...
+```
+
+As the above manifest shows, we’ve got a `build` script specified, but it’s
+worth noting that this example has a `links` entry which indicates that the
+crate (`libgit2-sys`) links to the `git2` native library.
+
+Here we also see that the Rust crate has been given an unconditional
+dependency on `libssh2` via the `libssh2-sys` crate, as well as a
+platform-specific dependency on `openssl-sys` for \*nix (other variants elided
+for now). It may seem a little counterintuitive to express *C dependencies* in
+the *Cargo manifest*, but this is actually using one of Cargo’s conventions in
+this space.
+
+### `*-sys` Packages
+
+To ease linking to system libraries, Cargo has a *convention* of package
+naming and functionality. Any package named `foo-sys` will provide two major
+pieces of functionality:
+
+* The library crate will link to the native library `libfoo`. This will often
+ probe the current system for `libfoo` before resorting to building from
+ source.
+* The library crate will provide **declarations** for functions in `libfoo`,
+ but it does **not** provide bindings or higher-level abstractions.
+
+The set of `*-sys` packages provides a common set of dependencies for linking
+to native libraries. There are a number of benefits to having this
+convention of native-library-related packages:
+
+* Common dependencies on `foo-sys` alleviate the rule mentioned above about one
+ package per value of `links`.
+* A common dependency allows centralizing logic on discovering `libfoo` itself
+ (or building it from source).
+* These dependencies are easily overridable.
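+
+As a rough illustration of the second point, the entire `src/lib.rs` of a
+hypothetical `foo-sys` crate might be nothing more than raw declarations (the
+function names here are made up for the example):
+
+```rust,ignore
+// src/lib.rs of a hypothetical `foo-sys` crate: declarations only, no safe
+// wrappers. Locating and linking `libfoo` is the build script's job.
+extern crate libc;
+
+use libc::{c_char, c_int};
+
+extern {
+    pub fn foo_init() -> c_int;
+    pub fn foo_process(input: *const c_char) -> c_int;
+    pub fn foo_shutdown();
+}
+```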
+
+### Building libgit2
+
+Now that we’ve got libgit2’s dependencies sorted out, we need to actually write
+the build script. We’re not going to look at specific snippets of code here and
+instead only take a look at the high-level details of the build script of
+`libgit2-sys`. This is not a recommendation that all packages follow this
+strategy, but rather an outline of one specific approach.
+
+The first thing the build script should do is query whether libgit2 is
+already installed on the host system. To do this we’ll leverage the preexisting
+tool `pkg-config` (when it’s available). We’ll also use a `build-dependencies`
+section to factor out all the `pkg-config`-related code (or rather, someone’s
+already done that for us!).
+
+If `pkg-config` fails to find libgit2, or if `pkg-config` just isn’t
+installed, the next step is to build libgit2 from bundled source code
+(distributed as part of `libgit2-sys` itself). There are a few nuances when
+doing so that we need to take into account, however:
+
+* The build system of libgit2, `cmake`, needs to be able to find libgit2’s
+ optional dependency of libssh2. We’re sure we’ve already built it (it’s a
+ Cargo dependency); we just need to communicate this information. To do this
+ we leverage the metadata format to communicate information between build
+ scripts. In this example the libssh2 package printed out `cargo:root=...` to
+ tell us where libssh2 is installed, and we can then pass this along to
+ `cmake` with the `CMAKE_PREFIX_PATH` environment variable.
+
+* We’ll need to handle some `CFLAGS` values when compiling C code (and tell
+ `cmake` about this). Some flags we may want to pass are `-m64` for 64-bit
+ code, `-m32` for 32-bit code, or `-fPIC` for 64-bit code as well.
+
+* Finally, we’ll invoke `cmake` to place all output into the directory
+ specified by the `OUT_DIR` environment variable, and then we’ll print the
+ necessary metadata to instruct rustc how to link to libgit2.
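+
+Putting those steps together, a heavily simplified sketch of such a build
+script might look as follows. This is *not* the real `libgit2-sys` script: the
+`find_libgit2_via_pkg_config` helper is a made-up stand-in for the probing
+logic, and the sketch assumes `libssh2-sys` declares `links = "ssh2"` so that
+its `cargo:root` shows up as `DEP_SSH2_ROOT`:
+
+```rust,ignore
+// build.rs (sketch)
+use std::env;
+
+fn main() {
+    // Step 1: probe the system for an installed libgit2 (in practice this
+    // lives in a `pkg-config` build dependency; this helper is hypothetical).
+    if find_libgit2_via_pkg_config() {
+        return;
+    }
+
+    // Step 2: build the bundled sources with cmake. The libssh2-sys build
+    // script printed `cargo:root=...`, which Cargo re-exports to us as
+    // DEP_SSH2_ROOT (assuming its `links` key is "ssh2").
+    let ssh2_root = env::var("DEP_SSH2_ROOT").unwrap();
+    let dst = env::var("OUT_DIR").unwrap();
+    // ... run `cmake` with CMAKE_PREFIX_PATH=$ssh2_root and appropriate
+    // CFLAGS, installing into $dst ...
+
+    // Step 3: tell rustc how to link, and leave a root for our dependents.
+    println!("cargo:rustc-link-search=native={}/lib", dst);
+    println!("cargo:rustc-link-lib=static=git2");
+    println!("cargo:root={}", dst);
+}
+```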
+
+Most of the functionality of this build script is easily refactorable into
+common dependencies, so our build script isn’t quite as intimidating as this
+description makes it sound! In reality it’s expected that build scripts are
+quite succinct, farming out logic such as the above to build dependencies.
--- /dev/null
+## Configuration
+
+This document will explain how Cargo’s configuration system works, as well as
+the available configuration keys. For configuration of a project through its
+manifest, see the [manifest format](reference/manifest.html).
+
+### Hierarchical structure
+
+Cargo allows local configuration for a particular project as well as global
+configuration, like git. Cargo extends this to a hierarchical strategy.
+If, for example, Cargo were invoked in `/projects/foo/bar/baz`, then the
+following configuration files would be probed for and unified in this order:
+
+* `/projects/foo/bar/baz/.cargo/config`
+* `/projects/foo/bar/.cargo/config`
+* `/projects/foo/.cargo/config`
+* `/projects/.cargo/config`
+* `/.cargo/config`
+* `$HOME/.cargo/config`
+
+With this structure, you can specify configuration per-project, and even
+possibly check it into version control. You can also specify personal defaults
+with a configuration file in your home directory.
+
+### Configuration format
+
+All configuration is currently in the [TOML format][toml] (like the manifest),
+with simple key-value pairs inside of sections (tables) which all get merged
+together.
+
+[toml]: https://github.com/toml-lang/toml
+
+### Configuration keys
+
+All of the following keys are optional, and their defaults are listed as their
+value unless otherwise noted.
+
+Key values that specify a tool may be given as an absolute path, a relative
+path, or a pathless tool name. Absolute paths and pathless tool names are used as
+given. Relative paths are resolved relative to the parent directory of the
+`.cargo` directory of the config file that the value resides within.
+
+```toml
+# An array of paths to local repositories which are to be used as overrides for
+# dependencies. For more information see the Specifying Dependencies guide.
+paths = ["/path/to/override"]
+
+[cargo-new]
+# This is your name/email to place in the `authors` section of a new Cargo.toml
+# that is generated. If not present, then `git` will be probed, and if that is
+# not present then `$USER` and `$EMAIL` will be used.
+name = "..."
+email = "..."
+
+# By default `cargo new` will initialize a new Git repository. This key can be
+# set to `hg` to create a Mercurial repository, or `none` to disable this
+# behavior.
+vcs = "none"
+
+# For the following sections, $triple refers to any valid target triple, not the
+# literal string "$triple", and it will apply whenever that target triple is
+# being compiled to. 'cfg(...)' refers to the Rust-like `#[cfg]` syntax for
+# conditional compilation.
+[target.$triple]
+# This is the linker which is passed to rustc (via `-C linker=`) when
+# compiling for the `$triple`. By default this flag is not passed to the compiler.
+linker = ".."
+# Same but for the library archiver which is passed to rustc via `-C ar=`.
+ar = ".."
+# If a runner is provided, compiled targets for the `$triple` will be executed
+# by invoking the specified runner executable with the actual target as its first argument.
+# This applies to `cargo run`, `cargo test` and `cargo bench` commands.
+# By default compiled targets are executed directly.
+runner = ".."
+# Custom flags to pass to all compiler invocations that target $triple.
+# This value overrides build.rustflags when both are present.
+rustflags = ["..", ".."]
+
+[target.'cfg(...)']
+# Similar to the $triple configuration, but using the `cfg` syntax.
+# If several `cfg` and $triple targets are candidates, then the rustflags
+# are concatenated. The `cfg` syntax only applies to rustflags, and not to
+# linker.
+rustflags = ["..", ".."]
+
+# Configuration keys related to the registry
+[registry]
+index = "..." # URL of the registry index (defaults to the central repository)
+token = "..." # Access token (found on the central repo’s website)
+
+[http]
+proxy = "host:port" # HTTP proxy to use for HTTP requests (defaults to none)
+ # in libcurl format, e.g. "socks5h://host:port"
+timeout = 60000 # Timeout for each HTTP request, in milliseconds
+cainfo = "cert.pem" # Path to Certificate Authority (CA) bundle (optional)
+check-revoke = true # Indicates whether SSL certs are checked for revocation
+
+[build]
+jobs = 1 # number of parallel jobs, defaults to # of CPUs
+rustc = "rustc" # the rust compiler tool
+rustdoc = "rustdoc" # the doc generator tool
+target = "triple" # build for the target triple
+target-dir = "target" # path of where to place all generated artifacts
+rustflags = ["..", ".."] # custom flags to pass to all compiler invocations
+
+[term]
+verbose = false # whether cargo provides verbose output
+color = 'auto' # whether cargo colorizes output
+
+# Network configuration
+[net]
+retry = 2 # number of times a network call will automatically be retried
+
+# Alias cargo commands. The first 3 aliases are built in. If your
+# command requires grouped whitespace, use the list format.
+[alias]
+b = "build"
+t = "test"
+r = "run"
+rr = "run --release"
+space_example = ["run", "--release", "--", "\"command list\""]
+```
+
+### Environment variables
+
+Cargo can also be configured through environment variables in addition to the
+TOML syntax above. For each configuration key above of the form `foo.bar` the
+environment variable `CARGO_FOO_BAR` can also be used to define the value. For
+example the `build.jobs` key can also be defined by `CARGO_BUILD_JOBS`.
+
+Environment variables will take precedence over TOML configuration, and
+currently only integer, boolean, and string keys can be defined via
+environment variables.
+
+In addition to the system above, Cargo recognizes a few other specific
+[environment variables][env].
+
+[env]: reference/environment-variables.html
--- /dev/null
+## Environment Variables
+
+Cargo sets and reads a number of environment variables which your code can detect
+or override. Here is a list of those variables, organized by when Cargo
+interacts with them:
+
+### Environment variables Cargo reads
+
+You can override these environment variables to change Cargo's behavior on your
+system:
+
+* `CARGO_HOME` - Cargo maintains a local cache of the registry index and of git
+ checkouts of crates. By default these are stored under `$HOME/.cargo`, but
+ this variable overrides the location of this directory. Once a crate is cached
+ it is not removed by the clean command.
+* `CARGO_TARGET_DIR` - Location of where to place all generated artifacts,
+ relative to the current working directory.
+* `RUSTC` - Instead of running `rustc`, Cargo will execute this specified
+ compiler instead.
+* `RUSTC_WRAPPER` - Instead of simply running `rustc`, Cargo will execute this
+ specified wrapper instead, passing as its command-line arguments the rustc
+ invocation, with the first argument being rustc.
+* `RUSTDOC` - Instead of running `rustdoc`, Cargo will execute this specified
+ `rustdoc` instance instead.
+* `RUSTDOCFLAGS` - A space-separated list of custom flags to pass to all `rustdoc`
+ invocations that Cargo performs. In contrast with `cargo rustdoc`, this is
+ useful for passing a flag to *all* `rustdoc` instances.
+* `RUSTFLAGS` - A space-separated list of custom flags to pass to all compiler
+ invocations that Cargo performs. In contrast with `cargo rustc`, this is
+ useful for passing a flag to *all* compiler instances.
+
+Note that Cargo will also read environment variables for `.cargo/config`
+configuration values, as described in [that documentation][config-env].
+
+[config-env]: reference/config.html#environment-variables
+
+### Environment variables Cargo sets for crates
+
+Cargo exposes these environment variables to your crate when it is compiled.
+Note that this applies for test binaries as well.
+To get the value of any of these variables in a Rust program, do this:
+
+```rust
+let version = env!("CARGO_PKG_VERSION");
+```
+
+`version` will now contain the value of `CARGO_PKG_VERSION`.
+
+* `CARGO` - Path to the `cargo` binary performing the build.
+* `CARGO_MANIFEST_DIR` - The directory containing the manifest of your package.
+* `CARGO_PKG_VERSION` - The full version of your package.
+* `CARGO_PKG_VERSION_MAJOR` - The major version of your package.
+* `CARGO_PKG_VERSION_MINOR` - The minor version of your package.
+* `CARGO_PKG_VERSION_PATCH` - The patch version of your package.
+* `CARGO_PKG_VERSION_PRE` - The pre-release version of your package.
+* `CARGO_PKG_AUTHORS` - Colon separated list of authors from the manifest of your package.
+* `CARGO_PKG_NAME` - The name of your package.
+* `CARGO_PKG_DESCRIPTION` - The description of your package.
+* `CARGO_PKG_HOMEPAGE` - The home page of your package.
+* `OUT_DIR` - If the package has a build script, this is set to the folder where the build
+ script should place its output. See below for more information.
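+
+A common pattern (sketched here) is for a build script to generate Rust code
+into `OUT_DIR`, and for the crate to pull it in via `include!`; the
+`hello.rs` file name and `message()` function are assumptions for this
+illustration:
+
+```rust,ignore
+// src/main.rs
+
+// Pull in $OUT_DIR/hello.rs, which the build script is assumed to have
+// generated, and which is assumed to define a `message()` function.
+include!(concat!(env!("OUT_DIR"), "/hello.rs"));
+
+fn main() {
+    println!("{}", message());
+}
+```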
+
+### Environment variables Cargo sets for build scripts
+
+Cargo sets several environment variables when build scripts are run. Because these variables
+are not yet set when the build script is compiled, the above example using `env!` won't work
+and instead you'll need to retrieve the values when the build script is run:
+
+```rust
+use std::env;
+let out_dir = env::var("OUT_DIR").unwrap();
+```
+
+`out_dir` will now contain the value of `OUT_DIR`.
+
+* `CARGO_MANIFEST_DIR` - The directory containing the manifest for the package
+ being built (the package containing the build
+ script). Also note that this is the value of the
+ current working directory of the build script when it
+ starts.
+* `CARGO_MANIFEST_LINKS` - the manifest `links` value.
+* `CARGO_FEATURE_<name>` - For each activated feature of the package being
+ built, this environment variable will be present
+ where `<name>` is the name of the feature uppercased
+ and having `-` translated to `_`.
+* `CARGO_CFG_<cfg>` - For each [configuration option][configuration] of the
+ package being built, this environment variable will
+ contain the value of the configuration, where `<cfg>` is
+ the name of the configuration uppercased and having `-`
+ translated to `_`.
+ Boolean configurations are present if they are set, and
+ not present otherwise.
+ Configurations with multiple values are joined to a
+ single variable with the values delimited by `,`.
+* `OUT_DIR` - the folder in which all output should be placed. This folder is
+ inside the build directory for the package being built, and it is
+ unique for the package in question.
+* `TARGET` - the target triple that is being compiled for. Native code should be
+ compiled for this triple. Some more information about target
+ triples can be found in [clang’s own documentation][clang].
+* `HOST` - the host triple of the rust compiler.
+* `NUM_JOBS` - the parallelism specified as the top-level parallelism. This can
+ be useful to pass a `-j` parameter to a system like `make`. Note
+ that care should be taken when interpreting this environment
+ variable. It is provided for historical reasons; recent
+ versions of Cargo, for example, do not need to run `make
+ -j` as that happens automatically. Cargo implements its own
+ jobserver and will allow build scripts to inherit this
+ information, so programs compatible with GNU make jobservers will
+ already have appropriately configured parallelism.
+* `OPT_LEVEL`, `DEBUG` - values of the corresponding variables for the
+ profile currently being built.
+* `PROFILE` - `release` for release builds, `debug` for other builds.
+* `DEP_<name>_<key>` - For more information about this set of environment
+ variables, see build script documentation about [`links`][links].
+* `RUSTC`, `RUSTDOC` - the compiler and documentation generator that Cargo has
+ resolved to use, passed to the build script so it might
+ use it as well.
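+
+As a small example of putting these to use, a build script can branch on the
+platform being targeted by reading `CARGO_CFG_TARGET_OS` (set from the
+`target_os` configuration option) at run time; the Windows branch here is
+illustrative only:
+
+```rust,ignore
+// build.rs
+use std::env;
+
+fn main() {
+    // `target_os` is a cfg option, so Cargo exposes it as CARGO_CFG_TARGET_OS.
+    let target_os = env::var("CARGO_CFG_TARGET_OS").unwrap();
+    if target_os == "windows" {
+        // Hypothetical platform-specific linkage, purely for the example.
+        println!("cargo:rustc-link-lib=user32");
+    }
+}
+```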
+
+[links]: reference/build-scripts.html#the-links-manifest-key
+[profile]: reference/manifest.html#the-profile-sections
+[configuration]: https://doc.rust-lang.org/reference/attributes.html#conditional-compilation
+[clang]: http://clang.llvm.org/docs/CrossCompilation.html#target-triple
+
+### Environment variables Cargo sets for 3rd party subcommands
+
+Cargo exposes this environment variable to 3rd party subcommands
+(ie. programs named `cargo-foobar` placed in `$PATH`):
+
+* `CARGO` - Path to the `cargo` binary performing the build.
--- /dev/null
+## External tools
+
+One of the goals of Cargo is simple integration with third-party tools, like
+IDEs and other build systems. To make integration easier, Cargo has several
+facilities:
+
+* a `cargo metadata` command, which outputs project structure and dependencies
+ information in JSON,
+
+* a `--message-format` flag, which outputs information about a particular build,
+ and
+
+* support for custom subcommands.
+
+
+### Information about project structure
+
+You can use `cargo metadata` command to get information about project structure
+and dependencies. The output of the command looks like this:
+
+```text
+{
+ // Integer version number of the format.
+ "version": integer,
+
+ // List of packages for this workspace, including dependencies.
+ "packages": [
+ {
+ // Opaque package identifier.
+ "id": PackageId,
+
+ "name": string,
+
+ "version": string,
+
+ "source": SourceId,
+
+ // A list of declared dependencies, see `resolve` field for actual dependencies.
+ "dependencies": [ Dependency ],
+
+ "targets: [ Target ],
+
+ // Path to Cargo.toml
+ "manifest_path": string,
+ }
+ ],
+
+ "workspace_members": [ PackageId ],
+
+ // Dependencies graph.
+ "resolve": {
+ "nodes": [
+ {
+ "id": PackageId,
+ "dependencies": [ PackageId ]
+ }
+ ]
+ }
+}
+```
+
+The format is stable and versioned. When calling `cargo metadata`, you should
+pass the `--format-version` flag explicitly to avoid forward-incompatibility
+hazards.
+
+If you are using Rust, there is the [cargo_metadata] crate.
+
+[cargo_metadata]: https://crates.io/crates/cargo_metadata
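+
+If you’d rather shell out to the command directly, a minimal sketch looks like
+this (parsing the JSON is left to whichever JSON library you prefer):
+
+```rust,ignore
+use std::process::Command;
+
+fn main() {
+    // Pin the format version explicitly to stay forward compatible.
+    let output = Command::new("cargo")
+        .args(&["metadata", "--format-version", "1"])
+        .output()
+        .expect("failed to run `cargo metadata`");
+    let json = String::from_utf8(output.stdout).unwrap();
+    // `json` now holds the object described above: `packages`,
+    // `workspace_members`, and the `resolve` graph.
+    println!("{}", json);
+}
+```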
+
+
+### Information about build
+
+When passing `--message-format=json`, Cargo will output the following
+information during the build:
+
+* compiler errors and warnings,
+
+* produced artifacts,
+
+* results of the build scripts (for example, native dependencies).
+
+The output goes to stdout, one JSON object per line. The `reason` field
+distinguishes different kinds of messages.
+
+Information about dependencies in the Makefile-compatible format is stored in
+the `.d` files alongside the artifacts.
+
+
+### Custom subcommands
+
+Cargo is designed to be extensible with new subcommands without having to modify
+Cargo itself. This is achieved by translating a cargo invocation of the form
+`cargo (?<command>[^ ]+)` into an invocation of an external tool
+`cargo-${command}`, which then needs to be present in one of the user's `$PATH`
+directories.
+
+Custom subcommands may use the `CARGO` environment variable to call back to
+Cargo. Alternatively, a subcommand can link to the `cargo` crate as a library,
+but this approach has drawbacks:
+
+* Cargo as a library is unstable: the API changes without deprecation,
+
+* the versions of the Cargo library and the Cargo binary may be different.
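+
+As a minimal sketch of the environment-variable route, a hypothetical
+`cargo-foobar` is just an ordinary binary placed on the `$PATH`:
+
+```rust,ignore
+// main.rs of a hypothetical `cargo-foobar`; `cargo foobar <args>` runs it.
+use std::env;
+use std::process::Command;
+
+fn main() {
+    // Cargo invokes us as `cargo-foobar foobar <args...>`.
+    let args: Vec<String> = env::args().skip(2).collect();
+
+    // Call back into Cargo through the CARGO environment variable it sets.
+    let cargo = env::var("CARGO").unwrap_or_else(|_| "cargo".to_string());
+    let status = Command::new(cargo)
+        .arg("build")
+        .args(&args)
+        .status()
+        .expect("failed to spawn cargo");
+    std::process::exit(status.code().unwrap_or(1));
+}
+```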
--- /dev/null
+## Cargo Reference
+
+Now that you have an overview of how to use Cargo and have created your first
+crate, you may be interested in more details in the following areas.
+
+The reference covers the details of various areas of Cargo.
+
+* [Specifying Dependencies](reference/specifying-dependencies.html)
+* [The Manifest Format](reference/manifest.html)
+* [Configuration](reference/config.html)
+* [Environment Variables](reference/environment-variables.html)
+* [Build Scripts](reference/build-scripts.html)
+* [Publishing on crates.io](reference/publishing.html)
+* [Package ID Specifications](reference/pkgid-spec.html)
+* [Source Replacement](reference/source-replacement.html)
+* [External Tools](reference/external-tools.html)
--- /dev/null
+## The Manifest Format
+
+The `Cargo.toml` file for each package is called its *manifest*. Every manifest
+file consists of one or more sections.
+
+### The `[package]` section
+
+The first section in a `Cargo.toml` is `[package]`.
+
+```toml
+[package]
+name = "hello_world" # the name of the package
+version = "0.1.0" # the current version, obeying semver
+authors = ["you@example.com"]
+```
+
+All three of these fields are mandatory.
+
+#### The `version` field
+
+Cargo bakes in the concept of [Semantic
+Versioning](http://semver.org/), so make sure you follow some basic rules:
+
+* Before you reach 1.0.0, anything goes, but if you make breaking changes,
+ increment the minor version. In Rust, breaking changes include adding fields to
+ structs or variants to enums.
+* After 1.0.0, only make breaking changes when you increment the major version.
+ Don’t break the build.
+* After 1.0.0, don’t add any new public API (no new `pub` anything) in tiny
+ versions. Always increment the minor version if you add any new `pub` structs,
+ traits, fields, types, functions, methods or anything else.
+* Use version numbers with three numeric parts such as 1.0.0 rather than 1.0.
+
+#### The `build` field (optional)
+
+This field specifies a file in the project root which is a [build script][1] for
+building native code. More information can be found in the build script
+[guide][1].
+
+[1]: reference/build-scripts.html
+
+```toml
+[package]
+# ...
+build = "build.rs"
+```
+
+#### The `documentation` field (optional)
+
+This field specifies a URL to a website hosting the crate's documentation.
+If no URL is specified in the manifest file, [crates.io][cratesio] will
+automatically link your crate to the corresponding [docs.rs][docsrs] page.
+
+Documentation links from specific hosts are blacklisted. Hosts are added
+to the blacklist if they are known to not be hosting documentation and are
+possibly of malicious intent, e.g. ad tracking networks. URLs from the
+following hosts are blacklisted:
+
+* rust-ci.org
+
+Documentation URLs from blacklisted hosts will not appear on crates.io, and
+may be replaced by docs.rs links.
+
+[docsrs]: https://docs.rs/
+[cratesio]: https://crates.io/
+
+#### The `exclude` and `include` fields (optional)
+
+You can explicitly specify to Cargo that a set of [globs][globs] should be
+ignored or included for the purposes of packaging and rebuilding a package. The
+globs specified in the `exclude` field identify a set of files that are not
+included when a package is published as well as ignored for the purposes of
+detecting when to rebuild a package, and the globs in `include` specify files
+that are explicitly included.
+
+If a VCS is being used for a package, the `exclude` field will be seeded with
+the VCS’ ignore settings (`.gitignore` for git for example).
+
+```toml
+[package]
+# ...
+exclude = ["build/**/*.o", "doc/**/*.html"]
+```
+
+```toml
+[package]
+# ...
+include = ["src/**/*", "Cargo.toml"]
+```
+
+The options are mutually exclusive: setting `include` will override an
+`exclude`. Note that `include` must be an exhaustive list of files, as
+otherwise necessary source files may not be included.
+
+[globs]: http://doc.rust-lang.org/glob/glob/struct.Pattern.html
+
+#### Migrating to `gitignore`-like pattern matching
+
+The current interpretation of these configs is based on UNIX Globs, as
+implemented in the [`glob` crate](https://crates.io/crates/glob). We want
+Cargo's `include` and `exclude` configs to work as similar to `gitignore` as
+possible. [The `gitignore` specification](https://git-scm.com/docs/gitignore) is
+also based on Globs, but has a bunch of additional features that enable easier
+pattern writing and more control. Therefore, we are migrating the
+interpretation of these configs’ rules to use the [`ignore`
+crate](https://crates.io/crates/ignore), treating each rule as a single
+line in a `gitignore` file. See [the tracking
+issue](https://github.com/rust-lang/cargo/issues/4268) for more details on the
+migration.
+
+#### The `publish` field (optional)
+
+The `publish` field can be used to prevent a package from being published to a
+package registry (like *crates.io*) by mistake.
+
+```toml
+[package]
+# ...
+publish = false
+```
+
+#### The `workspace` field (optional)
+
+The `workspace` field can be used to configure the workspace that this package
+will be a member of. If not specified this will be inferred as the first
+Cargo.toml with `[workspace]` upwards in the filesystem.
+
+```toml
+[package]
+# ...
+workspace = "path/to/workspace/root"
+```
+
+For more information, see the documentation for the workspace table below.
+
+#### Package metadata
+
+There are a number of optional metadata fields also accepted under the
+`[package]` section:
+
+```toml
+[package]
+# ...
+
+# A short blurb about the package. This is not rendered in any format when
+# uploaded to crates.io (aka this is not markdown).
+description = "..."
+
+# These URLs point to more information about the package. These are
+# intended to be webviews of the relevant data, not necessarily compatible
+# with VCS tools and the like.
+documentation = "..."
+homepage = "..."
+repository = "..."
+
+# This points to a file under the package root (relative to this `Cargo.toml`).
+# The contents of this file are stored and indexed in the registry.
+# crates.io will render this file and place the result on the crate's page.
+readme = "..."
+
+# This is a list of up to five keywords that describe this crate. Keywords
+# are searchable on crates.io, and you may choose any words that would
+# help someone find this crate.
+keywords = ["...", "..."]
+
+# This is a list of up to five categories where this crate would fit.
+# Categories are a fixed list available at crates.io/category_slugs, and
+# they must match exactly.
+categories = ["...", "..."]
+
+# This is a string description of the license for this package. Currently
+# crates.io will validate the license provided against a whitelist of known
+# license identifiers from http://spdx.org/licenses/. Multiple licenses can be
+# separated with a `/`.
+license = "..."
+
+# If a project is using a nonstandard license, then this key may be specified in
+# lieu of the above key and must point to a file relative to this manifest
+# (similar to the readme key).
+license-file = "..."
+
+# Optional specification of badges to be displayed on crates.io.
+#
+# - The badges pertaining to build status that are currently available are
+# Appveyor, CircleCI, GitLab, and TravisCI.
+# - Available badges pertaining to code test coverage are Codecov and
+# Coveralls.
+# - There are also maintenance-related badges based on isitmaintained.com
+# which state the issue resolution time, percent of open issues, and future
+# maintenance intentions.
+#
+# Where a `repository` key is required, it refers to a repository in
+# `user/repo` format.
+[badges]
+
+# Appveyor: `repository` is required. `branch` is optional; default is `master`
+# `service` is optional; valid values are `github` (default), `bitbucket`, and
+# `gitlab`.
+appveyor = { repository = "...", branch = "master", service = "github" }
+
+# Circle CI: `repository` is required. `branch` is optional; default is `master`
+circle-ci = { repository = "...", branch = "master" }
+
+# GitLab: `repository` is required. `branch` is optional; default is `master`
+gitlab = { repository = "...", branch = "master" }
+
+# Travis CI: `repository` in format "<user>/<project>" is required.
+# `branch` is optional; default is `master`
+travis-ci = { repository = "...", branch = "master" }
+
+# Codecov: `repository` is required. `branch` is optional; default is `master`
+# `service` is optional; valid values are `github` (default), `bitbucket`, and
+# `gitlab`.
+codecov = { repository = "...", branch = "master", service = "github" }
+
+# Coveralls: `repository` is required. `branch` is optional; default is `master`
+# `service` is optional; valid values are `github` (default) and `bitbucket`.
+coveralls = { repository = "...", branch = "master", service = "github" }
+
+# Is it maintained resolution time: `repository` is required.
+is-it-maintained-issue-resolution = { repository = "..." }
+
+# Is it maintained percentage of open issues: `repository` is required.
+is-it-maintained-open-issues = { repository = "..." }
+
+# Maintenance: `status` is required. Available options are `actively-developed`,
+# `passively-maintained`, `as-is`, `none`, `experimental`, `looking-for-maintainer`
+# and `deprecated`.
+maintenance = { status = "..." }
+```
+
+The [crates.io](https://crates.io) registry will render the description, display
+the license, link to the three URLs and categorize by the keywords. These keys
+provide useful information to users of the registry and also influence the
+search ranking of a crate. Omitting all of this metadata in a published
+crate is highly discouraged.
+
+#### The `metadata` table (optional)
+
+Cargo by default will warn about unused keys in `Cargo.toml` to assist in
+detecting typos and such. The `package.metadata` table, however, is completely
+ignored by Cargo and will not be warned about. This section can be used for
+tools which would like to store project configuration in `Cargo.toml`. For
+example:
+
+```toml
+[package]
+name = "..."
+# ...
+
+# Metadata used when generating an Android APK, for example.
+[package.metadata.android]
+package-name = "my-awesome-android-app"
+assets = "path/to/static"
+```
+
+### Dependency sections
+
+See the [specifying dependencies page](reference/specifying-dependencies.html) for
+information on the `[dependencies]`, `[dev-dependencies]`,
+`[build-dependencies]`, and target-specific `[target.*.dependencies]` sections.
+
+### The `[profile.*]` sections
+
+Cargo supports custom configuration of how rustc is invoked through profiles at
+the top level. Any manifest may declare a profile, but only the top level
+project’s profiles are actually read. All dependencies’ profiles will be
+overridden. This is done so the top-level project has control over how its
+dependencies are compiled.
+
+There are five currently supported profile names, all of which have the same
+configuration available to them. Listed below is the configuration available,
+along with the defaults for each profile.
+
+```toml
+# The development profile, used for `cargo build`.
+[profile.dev]
+opt-level = 0 # controls the `--opt-level` the compiler builds with.
+ # 0-1 is good for debugging. 2 is well-optimized. Max is 3.
+debug = true # include debug information (debug symbols). Equivalent to
+ # `-C debuginfo=2` compiler flag.
+rpath = false # controls whether compiler should set loader paths.
+ # If true, passes `-C rpath` flag to the compiler.
+lto = false # Link Time Optimization usually reduces size of binaries
+ # and static libraries. Increases compilation time.
+ # If true, passes `-C lto` flag to the compiler.
+debug-assertions = true # controls whether debug assertions are enabled
+ # (e.g. debug_assert!() and arithmetic overflow checks)
+codegen-units = 1 # if > 1 enables parallel code generation which improves
+ # compile times, but prevents some optimizations.
+ # Passes `-C codegen-units`. Ignored when `lto = true`.
+panic = 'unwind' # panic strategy (`-C panic=...`), can also be 'abort'
+
+# The release profile, used for `cargo build --release`.
+[profile.release]
+opt-level = 3
+debug = false
+rpath = false
+lto = false
+debug-assertions = false
+codegen-units = 1
+panic = 'unwind'
+
+# The testing profile, used for `cargo test`.
+[profile.test]
+opt-level = 0
+debug = 2
+rpath = false
+lto = false
+debug-assertions = true
+codegen-units = 1
+panic = 'unwind'
+
+# The benchmarking profile, used for `cargo bench` and `cargo test --release`.
+[profile.bench]
+opt-level = 3
+debug = false
+rpath = false
+lto = false
+debug-assertions = false
+codegen-units = 1
+panic = 'unwind'
+
+# The documentation profile, used for `cargo doc`.
+[profile.doc]
+opt-level = 0
+debug = 2
+rpath = false
+lto = false
+debug-assertions = true
+codegen-units = 1
+panic = 'unwind'
+```
+
+### The `[features]` section
+
+Cargo supports features to allow expression of:
+
+* conditional compilation options (usable through `cfg` attributes);
+* optional dependencies, which enhance a package, but are not required; and
+* clusters of optional dependencies, such as `postgres`, that would include the
+ `postgres` package, the `postgres-macros` package, and possibly other packages
+ (such as development-time mocking libraries, debugging tools, etc.).
+
+A feature of a package is either an optional dependency, or a set of other
+features. The format for specifying features is:
+
+```toml
+[package]
+name = "awesome"
+
+[features]
+# The default set of optional packages. Most people will want to use these
+# packages, but they are strictly optional. Note that `session` is not a package
+# but rather another feature listed in this manifest.
+default = ["jquery", "uglifier", "session"]
+
+# A feature with no dependencies is used mainly for conditional compilation,
+# like `#[cfg(feature = "go-faster")]`.
+go-faster = []
+
+# The `secure-password` feature depends on the bcrypt package. This aliasing
+# will allow people to talk about the feature in a higher-level way and allow
+# this package to add more requirements to the feature in the future.
+secure-password = ["bcrypt"]
+
+# Features can be used to reexport features of other packages. The `session`
+# feature of package `awesome` will ensure that the `session` feature of the
+# package `cookie` is also enabled.
+session = ["cookie/session"]
+
+[dependencies]
+# These packages are mandatory and form the core of this package’s distribution.
+cookie = "1.2.0"
+oauth = "1.1.0"
+route-recognizer = "=2.1.0"
+
+# A list of all of the optional dependencies, some of which are included in the
+# above `features`. They can be opted into by apps.
+jquery = { version = "1.0.2", optional = true }
+uglifier = { version = "1.5.3", optional = true }
+bcrypt = { version = "*", optional = true }
+civet = { version = "*", optional = true }
+```
+
+To use the package `awesome`:
+
+```toml
+[dependencies.awesome]
+version = "1.3.5"
+default-features = false # do not include the default features, and optionally
+ # cherry-pick individual features
+features = ["secure-password", "civet"]
+```
+
+#### Rules
+
+The usage of features is subject to a few rules:
+
+* Feature names must not conflict with other package names in the manifest. This
+ is because they are opted into via `features = [...]`, which only has a single
+ namespace.
+* With the exception of the `default` feature, all features are opt-in. To opt
+ out of the default feature, use `default-features = false` and cherry-pick
+ individual features.
+* Feature groups are not allowed to cyclically depend on one another.
+* Dev-dependencies cannot be optional.
+* Feature groups can only reference optional dependencies.
+* When a feature is selected, Cargo will call `rustc` with `--cfg
+ feature="${feature_name}"`. If a feature group is included, it and all of its
+ individual features will be included. This can be tested in code via
+ `#[cfg(feature = "foo")]`.
+
+Note that it is explicitly allowed for features to not actually activate any
+optional dependencies. This allows packages to internally enable/disable
+features without requiring a new dependency.
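+
+Continuing the hypothetical `awesome` package above, feature-gated code then
+looks like this (the function bodies are made up for the example):
+
+```rust,ignore
+// Only compiled when the `go-faster` feature (or a feature group
+// containing it) is enabled.
+#[cfg(feature = "go-faster")]
+pub fn go_faster() {
+    println!("zoom!");
+}
+
+// Only compiled when `secure-password` is enabled, which also activates
+// the optional `bcrypt` dependency.
+#[cfg(feature = "secure-password")]
+pub fn hash_password(input: &str) -> String {
+    // ... would call into the `bcrypt` crate here ...
+    format!("hashed:{}", input)
+}
+```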
+
+#### Usage in end products
+
+One major use case for features is specifying optional functionality in
+end products. For example, the Servo project may want to include optional
+features that people can enable or disable when they build it.
+
+In that case, Servo will describe features in its `Cargo.toml` and they can be
+enabled using command-line flags:
+
+```shell
+$ cargo build --release --features "shumway pdf"
+```
+
+Default features could be excluded using `--no-default-features`.
+
+#### Usage in packages
+
+In most cases, the concept of *optional dependency* in a library is best
+expressed as a separate package that the top-level application depends on.
+
+However, high-level packages, like Iron or Piston, may want the ability to
+curate a number of packages for easy installation. The current Cargo system
+allows them to curate a number of mandatory dependencies into a single package
+for easy installation.
+
+In some cases, packages may want to provide additional curation for optional
+dependencies:
+
+* grouping a number of low-level optional dependencies together into a single
+ high-level feature;
+* specifying packages that are recommended (or suggested) to be included by
+ users of the package; and
+* including a feature (like `secure-password` in the motivating example) that
+ will only work if an optional dependency is available, and would be difficult
+ to implement as a separate package (for example, it may be overly difficult to
+ design an IO package to be completely decoupled from OpenSSL, with opt-in via
+ the inclusion of a separate package).
+
+In almost all cases, it is an antipattern to use these features outside of
+high-level packages that are designed for curation. If a feature is optional, it
+can almost certainly be expressed as a separate package.
+
+### The `[workspace]` section
+
+Projects can define a workspace, which is a set of crates that will all share the
+same `Cargo.lock` and output directory. The `[workspace]` table can be defined
+as:
+
+```toml
+[workspace]
+
+# Optional key, inferred if not present
+members = ["path/to/member1", "path/to/member2", "path/to/member3/*"]
+
+# Optional key, empty if not present
+exclude = ["path1", "path/to/dir2"]
+```
+
+Workspaces were added to Cargo as part of [RFC 1525] and have a number of
+properties:
+
+* A workspace can contain multiple crates where one of them is the *root crate*.
+* The *root crate*'s `Cargo.toml` contains the `[workspace]` table, but is not
+ required to have other configuration.
+* Whenever any crate in the workspace is compiled, output is placed in the
+ *workspace root*, i.e. next to the *root crate*'s `Cargo.toml`.
+* The lock file for all crates in the workspace resides in the *workspace root*.
+* The `[patch]` and `[replace]` sections in `Cargo.toml` are only recognized
+ in the *root crate*'s manifest, and ignored in member crates' manifests.
+
+[RFC 1525]: https://github.com/rust-lang/rfcs/blob/master/text/1525-cargo-workspace.md
+
+The *root crate* of a workspace, indicated by the presence of `[workspace]` in
+its manifest, is responsible for defining the entire workspace. All `path`
+dependencies residing in the workspace directory become members. You can add
+additional packages to the workspace by listing them in the `members` key. Note
+that members of the workspace listed explicitly will also have their path
+dependencies included in the workspace. Sometimes a project may have a lot of
+workspace members, and it can be onerous to keep the list up to date. The
+`members` list can also use [globs][globs] to match multiple paths. Finally, the `exclude`
+key can be used to blacklist paths from being included in a workspace. This can
+be useful if some path dependencies aren't desired to be in the workspace at
+all.
+
+The `package.workspace` manifest key (described above) is used in member crates
+to point at a workspace's root crate. If this key is omitted then it is inferred
+to be the first crate whose manifest contains `[workspace]` upwards in the
+filesystem.
+
+A crate may either specify `package.workspace` or specify `[workspace]`. That
+is, a crate cannot both be a root crate in a workspace (contain `[workspace]`)
+and also be a member crate of another workspace (contain `package.workspace`).
+
+Most of the time workspaces will not need to be dealt with as `cargo new` and
+`cargo init` will handle workspace configuration automatically.
+
+#### Virtual Manifest
+
+In workspace manifests, if the `package` table is present, the workspace root
+crate will be treated as a normal package, as well as a workspace. If the
+`package` table is not present in a workspace manifest, it is called a *virtual
+manifest*.
+
+When working with *virtual manifests*, package-related cargo commands, like
+`cargo build`, won't be available anymore. Most such commands, however, support
+the `--all` flag, which executes the command for all the non-virtual manifests
+in the workspace.
+
+#TODO: move this to a more appropriate place
+### The project layout
+
+If your project is an executable, name the main source file `src/main.rs`. If it
+is a library, name the main source file `src/lib.rs`.
+
+Cargo will also treat any files located in `src/bin/*.rs` as executables. If your
+executable consists of more than just one source file, you might also use a directory
+inside `src/bin` containing a `main.rs` file which will be treated as an executable
+with a name of the parent directory.
+Do note, however, that once you add a `[[bin]]` section ([see
+below](#configuring-a-target)), Cargo will no longer automatically build files
+located in `src/bin/*.rs`. Instead you must create a `[[bin]]` section for
+each file you want to build.
+
+Your project can optionally contain folders named `examples`, `tests`, and
+`benches`, which Cargo will treat as containing examples,
+integration tests, and benchmarks respectively.
+
+```shell
+▾ src/ # directory containing source files
+ lib.rs # the main entry point for libraries and packages
+ main.rs # the main entry point for projects producing executables
+ ▾ bin/ # (optional) directory containing additional executables
+ *.rs
+ ▾ */ # (optional) directories containing multi-file executables
+ main.rs
+▾ examples/ # (optional) examples
+ *.rs
+▾ tests/ # (optional) integration tests
+ *.rs
+▾ benches/ # (optional) benchmarks
+ *.rs
+```
+
+To structure your code after you've created the files and folders for your
+project, you should remember to use Rust's module system, which you can read
+about in [the book](https://doc.rust-lang.org/book/crates-and-modules.html).
+
+### Examples
+
+Files located under `examples` are example uses of the functionality provided by
+the library. When compiled, they are placed in the `target/examples` directory.
+
+They can be compiled either as executables (with a `main()` function) or as
+libraries, pulling in the library by using `extern crate <library-name>`. They
+are compiled when you run your tests to protect them from bitrotting.
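+
+For instance, for a hypothetical library named `mylib` exposing a `greet`
+function, a minimal example file could be:
+
+```rust,ignore
+// examples/demo.rs -- run with `cargo run --example demo`.
+extern crate mylib; // the library this package provides (hypothetical name)
+
+fn main() {
+    mylib::greet();
+}
+```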
+
+You can run individual executable examples with the command `cargo run --example
+<example-name>`.
+
+Specify `crate-type` to compile an example as a library:
+
+```toml
+[[example]]
+name = "foo"
+crate-type = ["staticlib"]
+```
+
+You can build individual library examples with the command `cargo build
+--example <example-name>`.
+
+### Tests
+
+When you run `cargo test`, Cargo will:
+
+* compile and run your library’s unit tests, which are in the files reachable
+ from `lib.rs` (naturally, any sections marked with `#[cfg(test)]` will be
+ considered at this stage);
+* compile and run your library’s documentation tests, which are embedded inside
+ of documentation blocks;
+* compile and run your library’s [integration tests](#integration-tests); and
+* compile your library’s examples.
+
+#### Integration tests
+
+Each file in `tests/*.rs` is an integration test. When you run `cargo test`,
+Cargo will compile each of these files as a separate crate. The crate can link
+to your library by using `extern crate <library-name>`, like any other code that
+depends on it.
+
+Cargo will not automatically compile files inside subdirectories of `tests`, but
+an integration test can import modules from these directories as usual. For
+example, if you want several integration tests to share some code, you can put
+the shared code in `tests/common/mod.rs` and then put `mod common;` in each of
+the test files.
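+
+As a minimal sketch of that layout, assume a library crate named `adder` with
+a public `add` function and a `setup` helper in the shared module:
+
+```rust,ignore
+// tests/integration_test.rs -- compiled by Cargo as its own crate.
+extern crate adder; // link to the library under test
+
+mod common; // pulls in tests/common/mod.rs
+
+#[test]
+fn adds_two() {
+    common::setup();
+    assert_eq!(adder::add(2, 2), 4);
+}
+```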
+
+### Configuring a target
+
+All of the `[[bin]]`, `[lib]`, `[[bench]]`, `[[test]]`, and `[[example]]`
+sections support similar configuration for specifying how a target should be
+built. The double-bracket sections like `[[bin]]` are TOML
+[arrays of tables](https://github.com/toml-lang/toml#array-of-tables), which means you can
+write more than one `[[bin]]` section to make several executables in your crate.
+
+The example below uses `[lib]`, but it applies to all other sections
+as well. All values listed are the defaults for that option unless otherwise
+specified.
+
+```toml
+[package]
+# ...
+
+[lib]
+# The name of a target is the name of the library that will be generated. This
+# defaults to the name of the package or project, with any dashes replaced
+# with underscores. (Rust `extern crate` declarations reference this name;
+# therefore the value must be a valid Rust identifier to be usable.)
+name = "foo"
+
+# This field points at where the crate is located, relative to the `Cargo.toml`.
+path = "src/lib.rs"
+
+# A flag for enabling unit tests for this target. This is used by `cargo test`.
+test = true
+
+# A flag for enabling documentation tests for this target. This is only relevant
+# for libraries; it has no effect on other sections. This is used by
+# `cargo test`.
+doctest = true
+
+# A flag for enabling benchmarks for this target. This is used by `cargo bench`.
+bench = true
+
+# A flag for enabling documentation of this target. This is used by `cargo doc`.
+doc = true
+
+# If the target is meant to be a compiler plugin, this field must be set to true
+# for Cargo to correctly compile it and make it available for all dependencies.
+plugin = false
+
+# If the target is meant to be a "macros 1.1" procedural macro, this field must
+# be set to true.
+proc-macro = false
+
+# If set to false, `cargo test` will omit the `--test` flag to rustc, which
+# stops it from generating a test harness. This is useful when the binary being
+# built manages the test runner itself.
+harness = true
+```
+
+#### The `required-features` field (optional)
+
+The `required-features` field specifies which features the target needs in order
+to be built. If any of the required features are not selected, the target will
+be skipped. This is only relevant for the `[[bin]]`, `[[bench]]`, `[[test]]`,
+and `[[example]]` sections; it has no effect on `[lib]`.
+
+```toml
+[features]
+# ...
+postgres = []
+sqlite = []
+tools = []
+
+[[bin]]
+# ...
+required-features = ["postgres", "tools"]
+```
+
+#### Building dynamic or static libraries
+
+If your project produces a library, you can specify which kind of library to
+build by explicitly listing the library in your `Cargo.toml`:
+
+```toml
+# ...
+
+[lib]
+name = "..."
+crate-type = ["dylib"] # could be `staticlib` as well
+```
+
+The available options are `dylib`, `rlib`, `staticlib`, `cdylib`, and
+`proc-macro`. You should only use this option in a project. Cargo will always
+compile packages (dependencies) based on the requirements of the project that
+includes them.
+
+You can read more about the different crate types in the
+[Rust Reference Manual](https://doc.rust-lang.org/reference/linkage.html).
+
+### The `[patch]` Section
+
+This section of Cargo.toml can be used to [override dependencies][replace] with
+other copies. The syntax is similar to the `[dependencies]` section:
+
+```toml
+[patch.crates-io]
+foo = { git = 'https://github.com/example/foo' }
+bar = { path = 'my/local/bar' }
+```
+
+The `[patch]` table is made of dependency-like sub-tables. Each key after
+`[patch]` is a URL of the source that's being patched, or `crates-io` if
+you're modifying the https://crates.io registry. In the example above
+`crates-io` could be replaced with a git URL such as
+`https://github.com/rust-lang-nursery/log`.
+
+Each entry in these tables is a normal dependency specification, the same as
+found in the `[dependencies]` section of the manifest. The dependencies listed
+in the `[patch]` section are resolved and used to patch the source at the
+URL specified. The above manifest snippet patches the `crates-io` source (e.g.
+crates.io itself) with the `foo` crate and `bar` crate.
+
+Sources can be patched with versions of crates that do not exist, and they can
+also be patched with versions of crates that already exist. If a source is
+patched with a crate version that already exists in the source, then the
+source's original crate is replaced.
+
+More information about overriding dependencies can be found in the [overriding
+dependencies][replace] section of the documentation and [RFC 1969] for the
+technical specification of this feature. (Note that the `[patch]` feature will
+first become available in Rust 1.21, set to be released on 2017-10-12.)
+
+[RFC 1969]: https://github.com/rust-lang/rfcs/pull/1969
+[replace]: reference/specifying-dependencies.html#overriding-dependencies
+
+### The `[replace]` Section
+
+This section of Cargo.toml can be used to [override dependencies][replace] with
+other copies. The syntax is similar to the `[dependencies]` section:
+
+```toml
+[replace]
+"foo:0.1.0" = { git = 'https://github.com/example/foo' }
+"bar:1.0.2" = { path = 'my/local/bar' }
+```
+
+Each key in the `[replace]` table is a [package id
+specification](reference/pkgid-spec.html) which allows arbitrarily choosing a node in the
+dependency graph to override. The value of each key is the same as the
+`[dependencies]` syntax for specifying dependencies, except that you can't
+specify features. Note that when a crate is overridden the copy it's overridden
+with must have both the same name and version, but it can come from a different
+source (e.g. git or a local path).
+
+More information about overriding dependencies can be found in the [overriding
+dependencies][replace] section of the documentation.
--- /dev/null
+## Package ID Specifications
+
+### Package ID specifications
+
+Subcommands of Cargo frequently need to refer to a particular package within a
+dependency graph for various operations like updating, cleaning, building, etc.
+To solve this problem, Cargo supports Package ID Specifications. A specification
+is a string which is used to uniquely refer to one package within a graph of
+packages.
+
+#### Specification grammar
+
+The formal grammar for a Package Id Specification is:
+
+```notrust
+pkgid := pkgname
+ | [ proto "://" ] hostname-and-path [ "#" ( pkgname | semver ) ]
+pkgname := name [ ":" semver ]
+
+proto := "http" | "git" | ...
+```
+
+Here, brackets indicate that the contents are optional.
+
+#### Example specifications
+
+These could all be references to a package `foo` version `1.2.3` from the
+registry at `crates.io`:
+
+| pkgid | name | version | url |
+|:-----------------------------|:-----:|:-------:|:----------------------:|
+| `foo` | `foo` | `*` | `*` |
+| `foo:1.2.3` | `foo` | `1.2.3` | `*` |
+| `crates.io/foo` | `foo` | `*` | `*://crates.io/foo` |
+| `crates.io/foo#1.2.3` | `foo` | `1.2.3` | `*://crates.io/foo` |
+| `crates.io/bar#foo:1.2.3` | `foo` | `1.2.3` | `*://crates.io/bar` |
+| `http://crates.io/foo#1.2.3` | `foo` | `1.2.3` | `http://crates.io/foo` |
+
+#### Brevity of specifications
+
+The goal of this is to enable both succinct and exhaustive syntaxes for
+referring to packages in a dependency graph. Ambiguous references may refer to
+one or more packages. Most commands generate an error if more than one package
+could be referred to with the same specification.
--- /dev/null
+## Publishing on crates.io
+
+Once you've got a library that you'd like to share with the world, it's time to
+publish it on [crates.io]! Publishing a crate is when a specific
+version is uploaded to be hosted on [crates.io].
+
+Take care when publishing a crate, because a publish is **permanent**. The
+version can never be overwritten, and the code cannot be deleted. There is no
+limit to the number of versions which can be published, however.
+
+### Before your first publish
+
+First things first, you’ll need an account on [crates.io] to acquire
+an API token. To do so, [visit the home page][crates.io] and log in via a GitHub
+account (required for now). After this, visit your [Account
+Settings](https://crates.io/me) page and run the `cargo login` command
+specified.
+
+```shell
+$ cargo login abcdefghijklmnopqrstuvwxyz012345
+```
+
+This command will inform Cargo of your API token and store it locally in your
+`~/.cargo/credentials` (previously it was `~/.cargo/config`). Note that this
+token is a **secret** and should not be shared with anyone else. If it leaks for
+any reason, you should regenerate it immediately.
+
+### Before publishing a new crate
+
+Keep in mind that crate names on [crates.io] are allocated on a
+first-come, first-served basis. Once a crate name is taken, it cannot be used
+for another crate.
+
+#### Packaging a crate
+
+The next step is to package up your crate into a format that can be uploaded to
+[crates.io]. For this we’ll use the `cargo package` subcommand. This will take
+our entire crate and package it all up into a `*.crate` file in the
+`target/package` directory.
+
+```shell
+$ cargo package
+```
+
+As an added bonus, the `*.crate` will be verified independently of the current
+source tree. After the `*.crate` is created, it’s unpacked into
+`target/package` and then built from scratch to ensure that all necessary files
+are there for the build to succeed. This behavior can be disabled with the
+`--no-verify` flag.
+
+Now’s a good time to take a look at the `*.crate` file to make sure you didn’t
+accidentally package up that 2GB video asset, or large data files used for code
+generation, integration tests, or benchmarking. There is currently a 10MB
+upload size limit on `*.crate` files. So if the `tests` and `benches`
+directories and their dependencies add up to only a couple of megabytes, you
+can keep them in your package; otherwise, it’s better to exclude them.
+
+Cargo will automatically ignore files ignored by your version control system
+when packaging, but if you want to specify an extra set of files to ignore you
+can use the `exclude` key in the manifest:
+
+```toml
+[package]
+# ...
+exclude = [
+ "public/assets/*",
+ "videos/*",
+]
+```
+
+The syntax of each element in this array is what
+[rust-lang/glob](https://github.com/rust-lang/glob) accepts. If you’d rather
+roll with a whitelist instead of a blacklist, Cargo also supports an `include`
+key, which if set, overrides the `exclude` key:
+
+```toml
+[package]
+# ...
+include = [
+ "**/*.rs",
+ "Cargo.toml",
+]
+```
+
+### Uploading the crate
+
+Now that we’ve got a `*.crate` file ready to go, it can be uploaded to
+[crates.io] with the `cargo publish` command. And that’s it, you’ve now published
+your first crate!
+
+```shell
+$ cargo publish
+```
+
+If you’d like to skip the `cargo package` step, the `cargo publish` subcommand
+will automatically package up the local crate if a copy isn’t found already.
+
+Be sure to check out the [metadata you can
+specify](reference/manifest.html#package-metadata) to ensure your crate can be
+discovered more easily!
+
+### Publishing a new version of an existing crate
+
+In order to release a new version, change the `version` value specified in your
+`Cargo.toml` manifest. Keep in mind [the semver
+rules](reference/manifest.html#the-version-field). Then optionally run `cargo package` if
+you want to inspect the `*.crate` file for the new version before publishing,
+and run `cargo publish` to upload the new version.
+
+### Managing a crates.io-based crate
+
+Management of crates is primarily done through the command line `cargo` tool
+rather than the [crates.io] web interface. For this, there are a few subcommands
+to manage a crate.
+
+#### `cargo yank`
+
+Occasions may arise where you publish a version of a crate that actually ends up
+being broken for one reason or another (syntax error, forgot to include a file,
+etc.). For situations such as this, Cargo supports a “yank” of a version of a
+crate.
+
+```shell
+$ cargo yank --vers 1.0.1
+$ cargo yank --vers 1.0.1 --undo
+```
+
+A yank **does not** delete any code. This feature is not intended for deleting
+accidentally uploaded secrets, for example. If that happens, you must reset
+those secrets immediately.
+
+The semantics of a yanked version are that no new dependencies can be created
+against that version, but all existing dependencies continue to work. One of the
+major goals of [crates.io] is to act as a permanent archive of crates that does
+not change over time, and allowing deletion of a version would go against this
+goal. Essentially a yank means that all projects with a `Cargo.lock` will not
+break, while any future `Cargo.lock` files generated will not list the yanked
+version.
+
+#### `cargo owner`
+
+A crate is often developed by more than one person, or the primary maintainer
+may change over time! The owner of a crate is the only person allowed to publish
+new versions of the crate, but an owner may designate additional owners.
+
+```shell
+$ cargo owner --add my-buddy
+$ cargo owner --remove my-buddy
+$ cargo owner --add github:rust-lang:owners
+$ cargo owner --remove github:rust-lang:owners
+```
+
+The owner IDs given to these commands must be GitHub user names or GitHub teams.
+
+If a user name is given to `--add`, that user becomes a “named” owner, with
+full rights to the crate. In addition to being able to publish or yank versions
+of the crate, they have the ability to add or remove owners, *including* the
+owner that made *them* an owner. Needless to say, you shouldn’t make people you
+don’t fully trust into a named owner. In order to become a named owner, a user
+must have logged into [crates.io] previously.
+
+If a team name is given to `--add`, that team becomes a “team” owner, with
+restricted rights to the crate. While they have permission to publish or yank
+versions of the crate, they *do not* have the ability to add or remove owners.
+In addition to being more convenient for managing groups of owners, teams are
+just a bit more secure against owners becoming malicious.
+
+The syntax for teams is currently `github:org:team` (see examples above).
+In order to add a team as an owner one must be a member of that team. No
+such restriction applies to removing a team as an owner.
+
+### GitHub permissions
+
+Team membership is not something GitHub provides simple public access to, so you
+are likely to encounter the following message when working with teams:
+
+> It looks like you don’t have permission to query a necessary property from
+GitHub to complete this request. You may need to re-authenticate on [crates.io]
+to grant permission to read GitHub org memberships. Just go to
+https://crates.io/login
+
+This is basically a catch-all for “you tried to query a team, and one of the
+five levels of membership access control denied this”. That is not an
+exaggeration. GitHub’s support for team access control is Enterprise Grade.
+
+The most likely cause of this is simply that you last logged in before this
+feature was added. We originally requested *no* permissions from GitHub when
+authenticating users, because we didn’t actually ever use the user’s token for
+anything other than logging them in. However, to query team membership on your
+behalf, we now require
+[the `read:org` scope](https://developer.github.com/v3/oauth/#scopes).
+
+You are free to deny us this scope, and everything that worked before teams
+were introduced will keep working. However you will never be able to add a team
+as an owner, or publish a crate as a team owner. If you ever attempt to do this,
+you will get the error above. You may also see this error if you ever try to
+publish a crate that you don’t own at all but that happens to have a team owner.
+
+If you ever change your mind, or just aren’t sure if [crates.io] has sufficient
+permission, you can always go to https://crates.io/login, which will prompt you
+for permission if [crates.io] doesn’t have all the scopes it would like.
+
+An additional barrier to querying GitHub is that the organization may be
+actively denying third party access. To check this, you can go to:
+
+ https://github.com/organizations/:org/settings/oauth_application_policy
+
+where `:org` is the name of the organization (e.g. rust-lang). On that page
+you may choose to explicitly remove [crates.io] from your organization’s
+blacklist, or simply press the “Remove Restrictions” button to allow all third
+party applications to access this data.
+
+Alternatively, when [crates.io] requested the `read:org` scope, you could have
+explicitly whitelisted [crates.io] querying the org in question by pressing
+the “Grant Access” button next to its name.
+
+[crates.io]: https://crates.io/
--- /dev/null
+## Source Replacement
+
+Cargo supports the ability to **replace one source with another** to express
+strategies along the lines of mirrors or vendoring dependencies. Configuration
+is currently done through the [`.cargo/config` configuration][config] mechanism,
+like so:
+
+[config]: reference/config.html
+
+```toml
+# The `source` table is where all keys related to source-replacement
+# are stored.
+[source]
+
+# Under the `source` table are a number of other tables whose keys are a
+# name for the relevant source. For example this section defines a new
+# source, called `my-awesome-source`, which comes from a directory
+# located at `vendor` relative to the directory containing this `.cargo/config`
+# file
+[source.my-awesome-source]
+directory = "vendor"
+
+# The crates.io default source for crates is available under the name
+# "crates-io", and here we use the `replace-with` key to indicate that it's
+# replaced with our source above.
+[source.crates-io]
+replace-with = "my-awesome-source"
+```
+
+With this configuration Cargo attempts to look up all crates in the directory
+"vendor" rather than querying the online registry at crates.io. Using source
+replacement Cargo can express:
+
+* Vendoring - custom sources can be defined which represent crates on the local
+ filesystem. These sources are subsets of the source that they're replacing and
+ can be checked into projects if necessary.
+
+* Mirroring - sources can be replaced with an equivalent version which acts as a
+ cache for crates.io itself.
+
+A core assumption Cargo makes about source replacement is that the source code
+is exactly the same from both sources. In our example above, Cargo assumes that
+all of the crates coming from `my-awesome-source` are exactly the same as the
+copies from `crates-io`. Note that this also means that `my-awesome-source` is
+not allowed to have crates which are not present in the `crates-io` source.
+
+As a consequence, source replacement is not appropriate for situations such as
+patching a dependency or a private registry. Cargo supports patching
+dependencies through the usage of [the `[replace]` key][replace-section], and
+private registry support is planned for a future version of Cargo.
+
+[replace-section]: reference/manifest.html#the-replace-section
+
+### Configuration
+
+Configuration of replacement sources is done through [`.cargo/config`][config]
+and the full set of available keys are:
+
+```toml
+# Each source has its own table where the key is the name of the source
+[source.the-source-name]
+
+# Indicate that `the-source-name` will be replaced with `another-source`,
+# defined elsewhere
+replace-with = "another-source"
+
+# Available kinds of sources that can be specified (described below)
+registry = "https://example.com/path/to/index"
+local-registry = "path/to/registry"
+directory = "path/to/vendor"
+```
+
+The `crates-io` source represents the crates.io online registry (the default
+source of crates) and can be replaced with:
+
+```toml
+[source.crates-io]
+replace-with = 'another-source'
+```
+
+### Registry Sources
+
+A "registry source" is one that is the same as crates.io itself. That is, it has
+an index served in a git repository which matches the format of the
+[crates.io index](https://github.com/rust-lang/crates.io-index). That repository
+then has configuration indicating where to download crates from.
+
+Currently there is not an already-available project for setting up a mirror of
+crates.io. Stay tuned though!
+
+### Local Registry Sources
+
+A "local registry source" is intended to be a subset of another registry
+source, but available on the local filesystem (aka vendoring). Local registries
+are downloaded ahead of time, typically sync'd with a `Cargo.lock`, and are
+made up of a set of `*.crate` files and an index like the normal registry is.
+
+The primary way to manage and create local registry sources is through the
+[`cargo-local-registry`][cargo-local-registry] subcommand, which is available
+on crates.io and can be installed with `cargo install cargo-local-registry`.
+
+[cargo-local-registry]: https://crates.io/crates/cargo-local-registry
+
+Local registries are contained within one directory and contain a number of
+`*.crate` files downloaded from crates.io as well as an `index` directory with
+the same format as the crates.io-index project (populated with just entries for
+the crates that are present).
+
+### Directory Sources
+
+A "directory source" is similar to a local registry source where it contains a
+number of crates available on the local filesystem, suitable for vendoring
+dependencies. Also like local registries, directory sources can primarily be
+managed by an external subcommand, [`cargo-vendor`][cargo-vendor], which can be
+installed with `cargo install cargo-vendor`.
+
+[cargo-vendor]: https://crates.io/crates/cargo-vendor
+
+Directory sources are distinct from local registries though in that they contain
+the unpacked version of `*.crate` files, making it more suitable in some
+situations to check everything into source control. A directory source is just a
+directory containing a number of other directories which contain the source code
+for crates (the unpacked version of `*.crate` files). Currently no restriction
+is placed on the name of each directory.
+
+Each crate in a directory source also has an associated metadata file indicating
+the checksum of each file in the crate to protect against accidental
+modifications.
--- /dev/null
+## Specifying Dependencies
+
+Your crates can depend on other libraries from [crates.io], `git` repositories, or
+subdirectories on your local file system. You can also temporarily override the
+location of a dependency, for example to test out a bug fix in a dependency
+that you are working on locally. You can have different
+dependencies for different platforms, and dependencies that are only used during
+development. Let's take a look at how to do each of these.
+
+### Specifying dependencies from crates.io
+
+Cargo is configured to look for dependencies on [crates.io] by default. Only
+the name and a version string are required in this case. In [the cargo
+guide](guide/index.html), we specified a dependency on the `time` crate:
+
+```toml
+[dependencies]
+time = "0.1.12"
+```
+
+The string `"0.1.12"` is a [semver] version requirement. Since this
+string does not have any operators in it, it is interpreted the same way as
+if we had specified `"^0.1.12"`, which is called a caret requirement.
+
+[semver]: https://github.com/steveklabnik/semver#requirements
+
+### Caret requirements
+
+**Caret requirements** allow SemVer compatible updates to a specified version.
+An update is allowed if the new version number does not modify the left-most
+non-zero digit in the major, minor, patch grouping. In this case, if we ran
+`cargo update -p time`, cargo would update us to version `0.1.13` if it was
+available, but would not update us to `0.2.0`. If instead we had specified the
+version string as `^1.0`, cargo would update to `1.1` but not `2.0`. The version
+`0.0.x` is not considered compatible with any other version.
+
+Here are some more examples of caret requirements and the versions that would
+be allowed with them:
+
+```notrust
+^1.2.3 := >=1.2.3 <2.0.0
+^1.2 := >=1.2.0 <2.0.0
+^1 := >=1.0.0 <2.0.0
+^0.2.3 := >=0.2.3 <0.3.0
+^0.0.3 := >=0.0.3 <0.0.4
+^0.0 := >=0.0.0 <0.1.0
+^0 := >=0.0.0 <1.0.0
+```
+
+This compatibility convention is different from SemVer in the way it treats
+versions before 1.0.0. While SemVer says there is no compatibility before
+1.0.0, Cargo considers `0.x.y` to be compatible with `0.x.z`, where `y ≥ z`
+and `x > 0`.
+
+### Tilde requirements
+
+**Tilde requirements** specify a minimal version with some ability to update.
+If you specify a major, minor, and patch version or only a major and minor
+version, only patch-level changes are allowed. If you only specify a major
+version, then minor- and patch-level changes are allowed.
+
+`~1.2.3` is an example of a tilde requirement.
+
+```notrust
+~1.2.3 := >=1.2.3 <1.3.0
+~1.2 := >=1.2.0 <1.3.0
+~1 := >=1.0.0 <2.0.0
+```
+
+### Wildcard requirements
+
+**Wildcard requirements** allow for any version where the wildcard is
+positioned.
+
+`*`, `1.*` and `1.2.*` are examples of wildcard requirements.
+
+```notrust
+* := >=0.0.0
+1.* := >=1.0.0 <2.0.0
+1.2.* := >=1.2.0 <1.3.0
+```
+
+### Inequality requirements
+
+**Inequality requirements** allow manually specifying a version range or an
+exact version to depend on.
+
+Here are some examples of inequality requirements:
+
+```notrust
+>= 1.2.0
+> 1
+< 2
+= 1.2.3
+```
+
+### Multiple requirements
+
+Multiple version requirements can also be separated with a comma, e.g. `>= 1.2,
+< 1.5`.
+
+### Specifying dependencies from `git` repositories
+
+To depend on a library located in a `git` repository, the minimum information
+you need to specify is the location of the repository with the `git` key:
+
+```toml
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand" }
+```
+
+Cargo will fetch the `git` repository at this location and then look for a
+`Cargo.toml` for the requested crate anywhere inside the `git` repository
+(not necessarily at the root).
+
+Since we haven’t specified any other information, Cargo assumes that
+we intend to use the latest commit on the `master` branch to build our project.
+You can combine the `git` key with the `rev`, `tag`, or `branch` keys to
+specify something else. Here's an example of specifying that you want to use
+the latest commit on a branch named `next`:
+
+```toml
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand", branch = "next" }
+```
+
+### Specifying path dependencies
+
+Over time, our `hello_world` project from [the guide](guide/index.html) has
+grown significantly in size! It’s gotten to the point that we probably want to
+split out a separate crate for others to use. To do this Cargo supports **path
+dependencies** which are typically sub-crates that live within one repository.
+Let’s start off by making a new crate inside of our `hello_world` project:
+
+```shell
+# inside of hello_world/
+$ cargo new hello_utils
+```
+
+This will create a new folder `hello_utils` inside of which a `Cargo.toml` and
+`src` folder are ready to be configured. In order to tell Cargo about this, open
+up `hello_world/Cargo.toml` and add `hello_utils` to your dependencies:
+
+```toml
+[dependencies]
+hello_utils = { path = "hello_utils" }
+```
+
+This tells Cargo that we depend on a crate called `hello_utils` which is found
+in the `hello_utils` folder (relative to the `Cargo.toml` it’s written in).
+
+And that’s it! The next `cargo build` will automatically build `hello_utils` and
+all of its own dependencies, and others can start using the crate as well.
+However, crates that use dependencies specified with only a path are not
+permitted on [crates.io]. If we wanted to publish our `hello_world` crate, we
+would need to publish a version of `hello_utils` to [crates.io](https://crates.io)
+and specify its version in the dependencies line as well:
+
+```toml
+[dependencies]
+hello_utils = { path = "hello_utils", version = "0.1.0" }
+```
+
+### Overriding dependencies
+
+There are a number of methods in Cargo to support overriding dependencies and
+otherwise controlling the dependency graph. These options, though, are typically
+only available at the workspace level and aren't propagated through
+dependencies. In other words, "applications" have the ability to override
+dependencies but "libraries" do not.
+
+The desire to override a dependency or otherwise alter some dependencies can
+arise through a number of scenarios. Most of them, however, boil down to the
+ability to work with a crate before it's been published to crates.io. For
+example:
+
+* A crate you're working on is also used in a much larger application you're
+ working on, and you'd like to test a bug fix to the library inside of the
+ larger application.
+* An upstream crate you don't work on has a new feature or a bug fix on the
+ master branch of its git repository which you'd like to test out.
+* You're about to publish a new major version of your crate, but you'd like to
+ do integration testing across an entire project to ensure the new major
+ version works.
+* You've submitted a fix to an upstream crate for a bug you found, but you'd
+ like to immediately have your application start depending on the fixed version
+ of the crate to avoid blocking on the bug fix getting merged.
+
+These scenarios are currently all solved with the [`[patch]` manifest
+section][patch-section]. Note that the `[patch]` feature is not yet stable and
+is scheduled to be released on 2017-08-31. Historically some of these scenarios
+have been solved with [the `[replace]` section][replace-section], but we'll
+document the `[patch]` section here.
+
+[patch-section]: reference/manifest.html#the-patch-section
+[replace-section]: reference/manifest.html#the-replace-section
+
+### Testing a bugfix
+
+Let's say you're working with the [`uuid`] crate but while you're working on it
+you discover a bug. You are, however, quite enterprising, so you decide to try
+to fix the bug as well! Originally your manifest will look like:
+
+[`uuid`]: https://crates.io/crates/uuid
+
+```toml
+[package]
+name = "my-library"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+uuid = "1.0"
+```
+
+The first thing we'll do is clone the [`uuid` repository][uuid-repository]
+locally via:
+
+```shell
+$ git clone https://github.com/rust-lang-nursery/uuid
+```
+
+Next we'll edit the manifest of `my-library` to contain:
+
+```toml
+[patch.crates-io]
+uuid = { path = "../path/to/uuid" }
+```
+
+Here we declare that we're *patching* the source `crates-io` with a new
+dependency. This will effectively add the locally checked out version of `uuid` to
+the crates.io registry for our local project.
+
+Next up we need to ensure that our lock file is updated to use this new version
+of `uuid` so our project uses the locally checked out copy instead of one from
+crates.io. The way `[patch]` works is that it'll load the dependency at
+`../path/to/uuid` and then whenever crates.io is queried for versions of `uuid`
+it'll *also* return the local version.
+
+This means that the version number of the local checkout is significant and will
+affect whether the patch is used. Our manifest declared `uuid = "1.0"` which
+means we'll only resolve to `>= 1.0.0, < 2.0.0`, and Cargo's greedy resolution
+algorithm also means that we'll resolve to the maximum version within that
+range. Typically this doesn't matter, as the version in the git repository will
+already be greater than or match the maximum version published on crates.io, but it's
+important to keep this in mind!
+
+In any case, typically all you need to do now is:
+
+```shell
+$ cargo build
+ Compiling uuid v1.0.0 (file://.../uuid)
+ Compiling my-library v0.1.0 (file://.../my-library)
+ Finished dev [unoptimized + debuginfo] target(s) in 0.32 secs
+```
+
+And that's it! You're now building with the local version of `uuid` (note the
+`file://` in the build output). If you don't see the `file://` version getting
+built then you may need to run `cargo update -p uuid --precise $version` where
+`$version` is the version of the locally checked out copy of `uuid`.
+
+Once you've fixed the bug you originally found, the next thing you'll likely
+want to do is submit it as a pull request to the `uuid` crate itself. Once
+you've done this you can also update the `[patch]` section. The listing
+inside of `[patch]` is just like the `[dependencies]` section, so once your pull
+request is merged you could change your `path` dependency to:
+
+```toml
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid' }
+```
+
+[uuid-repository]: https://github.com/rust-lang-nursery/uuid
+
+### Working with an unpublished minor version
+
+Let's now shift gears a bit from bug fixes to adding features. While working on
+`my-library` you discover that a whole new feature is needed in the `uuid`
+crate. You've implemented this feature, tested it locally above with `[patch]`,
+and submitted a pull request. Let's go over how you continue to use and test it
+before it's actually published.
+
+Let's also say that the current version of `uuid` on crates.io is `1.0.0`, but
+since then the master branch of the git repository has been updated to `1.0.1`. This
+branch includes your new feature you submitted previously. To use this
+repository we'll edit our `Cargo.toml` to look like
+
+```toml
+[package]
+name = "my-library"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+uuid = "1.0.1"
+
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid' }
+```
+
+Note that our local dependency on `uuid` has been updated to `1.0.1` as it's
+what we'll actually require once the crate is published. This version doesn't
+exist on crates.io, though, so we provide it with the `[patch]` section of the
+manifest.
+
+Now when our library is built it'll fetch `uuid` from the git repository and
+resolve to 1.0.1 inside the repository instead of trying to download a version
+from crates.io. Once 1.0.1 is published on crates.io the `[patch]` section can
+be deleted.
+
+It's also worth noting that `[patch]` applies *transitively*. Let's say you use
+`my-library` in a larger project, such as:
+
+```toml
+[package]
+name = "my-binary"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+my-library = { git = 'https://example.com/git/my-library' }
+uuid = "1.0"
+
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid' }
+```
+
+Remember that `[patch]` is only applicable at the *top level*, so consumers of
+`my-library` have to repeat the `[patch]` section if necessary. Here, though,
+the new `uuid` crate applies to *both* our dependency on `uuid` and the
+`my-library -> uuid` dependency. The `uuid` crate will be resolved to one
+version for this entire crate graph, 1.0.1, and it'll be pulled from the git
+repository.
+
+### Prepublishing a breaking change
+
+As a final scenario, let's take a look at working with a new major version of a
+crate, typically accompanied with breaking changes. Sticking with our previous
+crates, this means that we're going to be creating version 2.0.0 of the `uuid`
+crate. After we've submitted all changes upstream we can update our manifest for
+`my-library` to look like:
+
+```toml
+[dependencies]
+uuid = "2.0"
+
+[patch.crates-io]
+uuid = { git = "https://github.com/rust-lang-nursery/uuid", branch = "2.0.0" }
+```
+
+And that's it! As in the previous example, the 2.0.0 version doesn't actually
+exist on crates.io, but we can still introduce it through a git dependency with
+the `[patch]` section. As a thought exercise let's take another
+look at the `my-binary` manifest from above again as well:
+
+```toml
+[package]
+name = "my-binary"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+my-library = { git = 'https://example.com/git/my-library' }
+uuid = "1.0"
+
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid', version = '2.0.0' }
+```
+
+Note that this will actually resolve to two versions of the `uuid` crate. The
+`my-binary` crate will continue to use the 1.x.y series of the `uuid` crate but
+the `my-library` crate will use the 2.0.0 version of `uuid`. This will allow you
+to gradually roll out breaking changes to a crate through a dependency graph
+without being forced to update everything all at once.
+
+### Overriding with local dependencies
+
+Sometimes you're only temporarily working on a crate and you don't want to have
+to modify `Cargo.toml` as with the `[patch]` section above. For this use
+case Cargo offers a much more limited version of overrides called **path
+overrides**.
+
+Path overrides are specified through `.cargo/config` instead of `Cargo.toml`,
+and you can find [more documentation about this configuration][config-docs].
+Inside of `.cargo/config` you'll specify a key called `paths`:
+
+[config-docs]: reference/config.html
+
+```toml
+paths = ["/path/to/uuid"]
+```
+
+This array should be filled with directories that contain a `Cargo.toml`. In
+this instance, we’re just adding `uuid`, so it will be the only one that’s
+overridden. This path can be either absolute or relative to the directory that
+contains the `.cargo` folder.
+
+Path overrides are more restricted than the `[patch]` section, however, in
+that they cannot change the structure of the dependency graph. When a
+path replacement is used, the previous set of dependencies must all match the
+new `Cargo.toml` specification exactly. For example, this means that path
+overrides cannot be used to test out adding a dependency to a crate; instead,
+`[patch]` must be used in that situation. As a result, usage of a
+path override is typically isolated to quick bug fixes rather than larger
+changes.
+
+Note: using a local configuration to override paths will only work for crates
+that have been published to [crates.io]. You cannot use this feature to tell
+Cargo how to find local unpublished crates.
+
+### Platform-specific dependencies
+
+Platform-specific dependencies take the same format, but are listed under a
+`target` section. Normally Rust-like `#[cfg]` syntax will be used to define
+these sections:
+
+```toml
+[target.'cfg(windows)'.dependencies]
+winhttp = "0.4.0"
+
+[target.'cfg(unix)'.dependencies]
+openssl = "1.0.1"
+
+[target.'cfg(target_arch = "x86")'.dependencies]
+native = { path = "native/i686" }
+
+[target.'cfg(target_arch = "x86_64")'.dependencies]
+native = { path = "native/x86_64" }
+```
+
+Like with Rust, the syntax here supports the `not`, `any`, and `all` operators
+to combine various cfg name/value pairs. Note that the `cfg` syntax has only
+been available since Cargo 0.9.0 (Rust 1.8.0).
+
+In addition to `#[cfg]` syntax, Cargo also supports listing out the full target
+the dependencies would apply to:
+
+```toml
+[target.x86_64-pc-windows-gnu.dependencies]
+winhttp = "0.4.0"
+
+[target.i686-unknown-linux-gnu.dependencies]
+openssl = "1.0.1"
+```
+
+If you’re using a custom target specification, quote the full path and file
+name:
+
+```toml
+[target."x86_64/windows.json".dependencies]
+winhttp = "0.4.0"
+
+[target."i686/linux.json".dependencies]
+openssl = "1.0.1"
+native = { path = "native/i686" }
+
+[target."x86_64/linux.json".dependencies]
+openssl = "1.0.1"
+native = { path = "native/x86_64" }
+```
+
+### Development dependencies
+
+You can add a `[dev-dependencies]` section to your `Cargo.toml` whose format
+is equivalent to `[dependencies]`:
+
+```toml
+[dev-dependencies]
+tempdir = "0.3"
+```
+
+Dev-dependencies are not used when compiling a package for normal builds, but
+are used for compiling its tests, examples, and benchmarks.
+
+These dependencies are *not* propagated to other packages which depend on this
+package.
+
+You can also have target-specific development dependencies by using
+`dev-dependencies` in the target section header instead of `dependencies`. For
+example:
+
+```toml
+[target.'cfg(unix)'.dev-dependencies]
+mio = "0.0.1"
+```
+
+[crates.io]: https://crates.io/
+
+### Build dependencies
+
+You can depend on other Cargo-based crates for use in your build scripts.
+Dependencies are declared through the `build-dependencies` section of the
+manifest:
+
+```toml
+[build-dependencies]
+gcc = "0.3"
+```
+
+The build script **does not** have access to the dependencies listed
+in the `dependencies` or `dev-dependencies` section. Build
+dependencies will likewise not be available to the package itself
+unless listed under the `dependencies` section as well. A package
+itself and its build script are built separately, so their
+dependencies need not coincide. Cargo is kept simpler and cleaner by
+using independent dependencies for independent purposes.
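+
+As a hedged sketch of how such a build dependency is used, a `build.rs` could
+invoke the `gcc` crate declared above roughly like this (the `compile_library`
+call reflects the old `gcc` 0.3 API, and `src/foo.c` is a placeholder):
+
+```rust
+// build.rs
+
+// Declared under [build-dependencies], so available here but not to the
+// package's own code.
+extern crate gcc;
+
+fn main() {
+    // Compile the placeholder src/foo.c into a static library and emit the
+    // cargo: metadata needed to link it.
+    gcc::compile_library("libfoo.a", &["src/foo.c"]);
+}
+```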
+
+### Choosing features
+
+If a package you depend on offers conditional features, you can
+specify which to use:
+
+```toml
+[dependencies.awesome]
+version = "1.3.5"
+default-features = false # do not include the default features, and optionally
+ # cherry-pick individual features
+features = ["secure-password", "civet"]
+```
+
+More information about features can be found in the
+[manifest documentation](reference/manifest.html#the-features-section).
--- /dev/null
+% Build Script Support
+
+Some packages need to compile third-party non-Rust code, for example C
+libraries. Other packages need to link to C libraries which can either be
+located on the system or possibly need to be built from source. Others still
+need facilities for functionality such as code generation before building (think
+parser generators).
+
+Cargo does not aim to replace other tools that are well-optimized for
+these tasks, but it does integrate with them through the `build` configuration
+option.
+
+```toml
+[package]
+# ...
+build = "build.rs"
+```
+
+The Rust file designated by the `build` command (relative to the package root)
+will be compiled and invoked before anything else is compiled in the package,
+allowing your Rust code to depend on the built or generated artifacts. Note that
+if you do not specify a value for `build` but your package root contains a
+`build.rs` file, Cargo will compile and invoke this file for you.
+
+Some example use cases of the build command are:
+
+* Building a bundled C library.
+* Finding a C library on the host system.
+* Generating a Rust module from a specification.
+* Performing any platform-specific configuration needed for the crate.
+
+Each of these use cases will be detailed in full below to give examples of how
+the build command works.
+
+## Inputs to the Build Script
+
+When the build script is run, there are a number of inputs to the build script,
+all passed in the form of [environment variables][env].
+
+In addition to environment variables, the build script’s current directory is
+the source directory of the build script’s package.
+
+[env]: environment-variables.html
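+
+As a brief sketch, a build script might read a few of these inputs like so
+(`OUT_DIR` and `TARGET` are among the variables Cargo sets; see the page
+linked above for the full list):
+
+```rust
+// build.rs
+
+use std::env;
+
+fn main() {
+    let out_dir = env::var("OUT_DIR").unwrap(); // where outputs should be placed
+    let target = env::var("TARGET").unwrap();   // triple being compiled for
+    println!("cargo:warning=building for {} into {}", target, out_dir);
+}
+```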
+
+## Outputs of the Build Script
+
+All the lines printed to stdout by a build script are written to a file like
+`target/debug/build/<pkg>/output` (the precise location may depend on your
+configuration). If you would like to see such output directly in your terminal,
+invoke cargo as 'very verbose' with the `-vv` flag. Any line that starts with
+`cargo:` is interpreted directly by Cargo. This line must be of the form
+`cargo:key=value`, like the examples below:
+
+```notrust
+# specially recognized by Cargo
+cargo:rustc-link-lib=static=foo
+cargo:rustc-link-search=native=/path/to/foo
+cargo:rustc-cfg=foo
+cargo:rustc-env=FOO=bar
+# arbitrary user-defined metadata
+cargo:root=/path/to/foo
+cargo:libdir=/path/to/foo/lib
+cargo:include=/path/to/foo/include
+```
+
+On the other hand, lines printed to stderr are written to a file like
+`target/debug/build/<pkg>/stderr` but are not interpreted by cargo.
+
+There are a few special keys that Cargo recognizes, some affecting how the
+crate is built:
+
+* `rustc-link-lib=[KIND=]NAME` indicates that the specified value is a library
+ name and should be passed to the compiler as a `-l` flag. The optional `KIND`
+ can be one of `static`, `dylib` (the default), or `framework`, see
+ `rustc --help` for more details.
+* `rustc-link-search=[KIND=]PATH` indicates the specified value is a library
+ search path and should be passed to the compiler as a `-L` flag. The optional
+ `KIND` can be one of `dependency`, `crate`, `native`, `framework` or `all`
+ (the default), see `rustc --help` for more details.
+* `rustc-flags=FLAGS` is a set of flags passed to the compiler; only `-l` and
+  `-L` flags are supported.
+* `rustc-cfg=FEATURE` indicates that the specified feature will be passed as a
+ `--cfg` flag to the compiler. This is often useful for performing compile-time
+ detection of various features.
+* `rustc-env=VAR=VALUE` indicates that the specified environment variable
+ will be added to the environment which the compiler is run within.
+ The value can be then retrieved by the `env!` macro in the compiled crate.
+ This is useful for embedding additional metadata in the crate's code,
+ such as the hash of Git HEAD or the unique identifier of a continuous
+ integration server.
+* `rerun-if-changed=PATH` is a path to a file or directory which indicates that
+ the build script should be re-run if it changes (detected by a more-recent
+ last-modified timestamp on the file). Normally build scripts are re-run if
+ any file inside the crate root changes, but this can be used to scope changes
+ to just a small set of files. (If this path points to a directory the entire
+ directory will not be traversed for changes -- only changes to the timestamp
+ of the directory itself (which corresponds to some types of changes within the
+ directory, depending on platform) will trigger a rebuild. To request a re-run
+ on any changes within an entire directory, print a line for the directory and
+ another line for everything inside it, recursively.)
+ Note that if the build script itself (or one of its dependencies) changes,
+ then it's rebuilt and rerun unconditionally, so
+ `cargo:rerun-if-changed=build.rs` is almost always redundant (unless you
+ want to ignore changes in all other files except for `build.rs`).
+* `rerun-if-env-changed=VAR` is the name of an environment variable which
+ indicates that if the environment variable's value changes the build script
+ should be rerun. This basically behaves the same as `rerun-if-changed` except
+ that it works with environment variables instead. Note that the environment
+ variables here are intended for global environment variables like `CC` and
+ such, it's not necessary to use this for env vars like `TARGET` that Cargo
+ sets. Also note that if `rerun-if-env-changed` is printed out then Cargo will
+ *only* rerun the build script if those environment variables change or if
+ files printed out by `rerun-if-changed` change.
+* `warning=MESSAGE` is a message that will be printed to the main console after
+ a build script has finished running. Warnings are only shown for path
+ dependencies (that is, those you're working on locally), so for example
+ warnings printed out in crates.io crates are not emitted by default.
+
+Any other element is user-defined metadata that will be passed to
+dependents. More information about this can be found in the [`links`][links]
+section.
+
+[links]: #the-links-manifest-key
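+
+Putting a few of these keys together, a minimal build script emitting them
+might look like the following sketch (the path and variable names are
+placeholders):
+
+```rust
+// build.rs
+
+fn main() {
+    // Re-run this script only when src/spec.txt changes (placeholder path).
+    println!("cargo:rerun-if-changed=src/spec.txt");
+    // ...or when the (hypothetical) FOO_DIR environment variable changes.
+    println!("cargo:rerun-if-env-changed=FOO_DIR");
+    // Enable `#[cfg(generated)]` blocks in the crate being built.
+    println!("cargo:rustc-cfg=generated");
+    // Make env!("BUILD_INFO") available to the compiled crate.
+    println!("cargo:rustc-env=BUILD_INFO=example");
+}
+```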
+
+## Build Dependencies
+
+Build scripts are also allowed to have dependencies on other Cargo-based crates.
+Dependencies are declared through the `build-dependencies` section of the
+manifest.
+
+```toml
+[build-dependencies]
+foo = { git = "https://github.com/your-packages/foo" }
+```
+
+The build script **does not** have access to the dependencies listed in the
+`dependencies` or `dev-dependencies` section (they’re not built yet!). Build
+dependencies will likewise not be available to the package itself unless
+explicitly listed under the `dependencies` section as well.
+
+## The `links` Manifest Key
+
+In addition to the manifest key `build`, Cargo also supports a `links` manifest
+key to declare the name of a native library that is being linked to:
+
+```toml
+[package]
+# ...
+links = "foo"
+build = "build.rs"
+```
+
+This manifest states that the package links to the `libfoo` native library, and
+it also has a build script for locating and/or building the library. Cargo
+requires that a `build` command is specified if a `links` entry is also
+specified.
+
+The purpose of this manifest key is to give Cargo an understanding about the set
+of native dependencies that a package has, as well as providing a principled
+system of passing metadata between package build scripts.
+
+Primarily, Cargo requires that there is at most one package per `links` value.
+In other words, it’s forbidden to have two packages link to the same native
+library. Note, however, that there are [conventions in place][star-sys] to
+alleviate this.
+
+[star-sys]: #-sys-packages
+
+As mentioned above in the output format, each build script can generate an
+arbitrary set of metadata in the form of key-value pairs. This metadata is
+passed to the build scripts of **dependent** packages. For example, if `libbar`
+depends on `libfoo`, then if `libfoo` generates `key=value` as part of its
+metadata, then the build script of `libbar` will have the environment variable
+`DEP_FOO_KEY=value` set.
+
+Note that metadata is only passed to immediate dependents, not transitive
+dependents. The motivation for this metadata passing is outlined in the linking
+to system libraries case study below.
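+
+As a sketch of the dependent side, if `libfoo`’s build script printed
+`cargo:root=/path/to/foo`, then `libbar`’s build script could consume it
+roughly as follows (the `lib` subdirectory is an assumption about `libfoo`’s
+layout):
+
+```rust
+// build.rs of libbar
+
+use std::env;
+
+fn main() {
+    // DEP_FOO_ROOT is only set when a dependency with `links = "foo"` printed
+    // `cargo:root=...` from its own build script.
+    if let Ok(root) = env::var("DEP_FOO_ROOT") {
+        println!("cargo:rustc-link-search=native={}/lib", root);
+    }
+}
+```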
+
+## Overriding Build Scripts
+
+If a manifest contains a `links` key, then Cargo supports overriding the build
+script specified with a custom library. The purpose of this functionality is to
+prevent running the build script in question altogether and instead supply the
+metadata ahead of time.
+
+To override a build script, place the following configuration in any acceptable
+Cargo [configuration location](config.html).
+
+```toml
+[target.x86_64-unknown-linux-gnu.foo]
+rustc-link-search = ["/path/to/foo"]
+rustc-link-lib = ["foo"]
+root = "/path/to/foo"
+key = "value"
+```
+
+This section states that for the target `x86_64-unknown-linux-gnu` the library
+named `foo` has the metadata specified. This metadata is the same as the
+metadata generated as if the build script had run, providing a number of
+key/value pairs where the `rustc-flags`, `rustc-link-search`, and
+`rustc-link-lib` keys are slightly special.
+
+With this configuration, if a package declares that it links to `foo` then the
+build script will **not** be compiled or run, and the metadata specified will
+instead be used.
+
+# Case study: Code generation
+
+Some Cargo packages need to have code generated just before they are compiled
+for various reasons. Here we’ll walk through a simple example which generates a
+library call as part of the build script.
+
+First, let’s take a look at the directory structure of this package:
+
+```notrust
+.
+├── Cargo.toml
+├── build.rs
+└── src
+ └── main.rs
+
+1 directory, 3 files
+```
+
+Here we can see that we have a `build.rs` build script and our binary in
+`main.rs`. Next, let’s take a look at the manifest:
+
+```toml
+# Cargo.toml
+
+[package]
+name = "hello-from-generated-code"
+version = "0.1.0"
+authors = ["you@example.com"]
+build = "build.rs"
+```
+
+Here we can see we’ve got a build script specified which we’ll use to generate
+some code. Let’s see what’s inside the build script:
+
+```rust,no_run
+// build.rs
+
+use std::env;
+use std::fs::File;
+use std::io::Write;
+use std::path::Path;
+
+fn main() {
+ let out_dir = env::var("OUT_DIR").unwrap();
+ let dest_path = Path::new(&out_dir).join("hello.rs");
+ let mut f = File::create(&dest_path).unwrap();
+
+ f.write_all(b"
+ pub fn message() -> &'static str {
+ \"Hello, World!\"
+ }
+ ").unwrap();
+}
+```
+
+There are a couple of points of note here:
+
+* The script uses the `OUT_DIR` environment variable to discover where the
+ output files should be located. It can use the process’ current working
+ directory to find where the input files should be located, but in this case we
+ don’t have any input files.
+* This script is relatively simple as it just writes out a small generated file.
+ One could imagine that other more fanciful operations could take place such as
+ generating a Rust module from a C header file or another language definition,
+ for example.
+
+Next, let’s peek at the crate itself:
+
+```rust,ignore
+// src/main.rs
+
+include!(concat!(env!("OUT_DIR"), "/hello.rs"));
+
+fn main() {
+ println!("{}", message());
+}
+```
+
+This is where the real magic happens. The crate is using the rustc-defined
+`include!` macro in combination with the `concat!` and `env!` macros to include
+the generated file (`hello.rs`) into the crate’s compilation.
+
+Using the structure shown here, crates can include any number of generated files
+from the build script itself.
+
+# Case study: Building some native code
+
+Sometimes it’s necessary to build some native C or C++ code as part of a
+package. This is another excellent use case of leveraging the build script to
+build a native library before the Rust crate itself. As an example, we’ll create
+a Rust library which calls into C to print “Hello, World!”.
+
+Like above, let’s first take a look at the project layout:
+
+```notrust
+.
+├── Cargo.toml
+├── build.rs
+└── src
+ ├── hello.c
+ └── main.rs
+
+1 directory, 4 files
+```
+
+Pretty similar to before! Next, the manifest:
+
+```toml
+# Cargo.toml
+
+[package]
+name = "hello-world-from-c"
+version = "0.1.0"
+authors = ["you@example.com"]
+build = "build.rs"
+```
+
+For now we’re not going to use any build dependencies, so let’s take a look at
+the build script now:
+
+```rust,no_run
+// build.rs
+
+use std::process::Command;
+use std::env;
+use std::path::Path;
+
+fn main() {
+ let out_dir = env::var("OUT_DIR").unwrap();
+
+ // note that there are a number of downsides to this approach, the comments
+ // below detail how to improve the portability of these commands.
+ Command::new("gcc").args(&["src/hello.c", "-c", "-fPIC", "-o"])
+ .arg(&format!("{}/hello.o", out_dir))
+ .status().unwrap();
+ Command::new("ar").args(&["crus", "libhello.a", "hello.o"])
+ .current_dir(&Path::new(&out_dir))
+ .status().unwrap();
+
+ println!("cargo:rustc-link-search=native={}", out_dir);
+ println!("cargo:rustc-link-lib=static=hello");
+}
+```
+
+This build script starts out by compiling our C file into an object file (by
+invoking `gcc`) and then converting this object file into a static library (by
+invoking `ar`). The final step is to feed back to Cargo, indicating that our
+output is in `out_dir` and that the compiler should link the crate to `libhello.a`
+statically via the `-l static=hello` flag.
+
+Note that there are a number of drawbacks to this hardcoded approach:
+
+* The `gcc` command itself is not portable across platforms. For example it’s
+ unlikely that Windows platforms have `gcc`, and not even all Unix platforms
+ may have `gcc`. The `ar` command is also in a similar situation.
+* These commands do not take cross-compilation into account. If we’re cross
+ compiling for a platform such as Android it’s unlikely that `gcc` will produce
+ an ARM executable.
+
+Not to fear, though: this is where a `build-dependencies` entry would help! The
+Cargo ecosystem has a number of packages to make this sort of task much easier,
+portable, and standardized. For example, the build script could be written as:
+
+```rust,ignore
+// build.rs
+
+// Bring in a dependency on an externally maintained `cc` package which manages
+// invoking the C compiler.
+extern crate cc;
+
+fn main() {
+ cc::Build::new()
+ .file("src/hello.c")
+ .compile("hello");
+}
+```
+
+Add a build-time dependency on the `cc` crate with the following addition to
+your `Cargo.toml`:
+
+```toml
+[build-dependencies]
+cc = "1.0"
+```
+
+The [`cc` crate](https://crates.io/crates/cc) abstracts a range of build
+script requirements for C code:
+
+* It invokes the appropriate compiler (MSVC for Windows, `gcc` for MinGW, `cc`
+ for Unix platforms, etc.).
+* It takes the `TARGET` variable into account by passing appropriate flags to
+ the compiler being used.
+* Other environment variables, such as `OPT_LEVEL`, `DEBUG`, etc., are all
+ handled automatically.
+* The stdout output and `OUT_DIR` locations are also handled by the `cc`
+ library.
+
+Here we can start to see some of the major benefits of farming as much
+functionality as possible out to common build dependencies rather than
+duplicating logic across all build scripts!
+
+Back to the case study though, let’s take a quick look at the contents of the
+`src` directory:
+
+```c
+// src/hello.c
+
+#include <stdio.h>
+
+void hello() {
+ printf("Hello, World!\n");
+}
+```
+
+```rust,ignore
+// src/main.rs
+
+// Note the lack of the `#[link]` attribute. We’re delegating the responsibility
+// of selecting what to link to over to the build script rather than hardcoding
+// it in the source file.
+extern { fn hello(); }
+
+fn main() {
+ unsafe { hello(); }
+}
+```
+
+And there we go! This should complete our example of building some C code from a
+Cargo package using the build script itself. This also shows why using a build
+dependency can be crucial in many situations, and often much more concise!
+
+We’ve also seen a brief example of how a build script can use a crate as a
+dependency purely for the build process and not for the crate itself at runtime.
+
+# Case study: Linking to system libraries
+
+The final case study here will be investigating how a Cargo library links to a
+system library and how the build script is leveraged to support this use case.
+
+Quite frequently a Rust crate wants to link to a native library provided by the
+system, either to bind its functionality or just to use it as an implementation
+detail. Doing this in a platform-agnostic fashion is quite a nuanced problem,
+and the purpose of a build script is again to farm out as much of this work as
+possible to make this as easy as possible for consumers.
+
+As an example to follow, let’s take a look at one of [Cargo’s own
+dependencies][git2-rs], [libgit2][libgit2]. The C library has a number of
+constraints:
+
+[git2-rs]: https://github.com/alexcrichton/git2-rs/tree/master/libgit2-sys
+[libgit2]: https://github.com/libgit2/libgit2
+
+* It has an optional dependency on OpenSSL on Unix to implement the https
+ transport.
+* It has an optional dependency on libssh2 on all platforms to implement the ssh
+ transport.
+* It is often not installed on all systems by default.
+* It can be built from source using `cmake`.
+
+To visualize what’s going on here, let’s take a look at the manifest for the
+relevant Cargo package that links to the native C library.
+
+```toml
+[package]
+name = "libgit2-sys"
+version = "0.1.0"
+authors = ["..."]
+links = "git2"
+build = "build.rs"
+
+[dependencies]
+libssh2-sys = { git = "https://github.com/alexcrichton/ssh2-rs" }
+
+[target.'cfg(unix)'.dependencies]
+openssl-sys = { git = "https://github.com/alexcrichton/openssl-sys" }
+
+# ...
+```
+
+As the above manifest shows, we’ve got a `build` script specified, but it’s
+worth noting that this example has a `links` entry which indicates that the
+crate (`libgit2-sys`) links to the `git2` native library.
+
+Here we also see that we chose to have the Rust crate have an unconditional
+dependency on `libssh2` via the `libssh2-sys` crate, as well as a
+platform-specific dependency on `openssl-sys` for \*nix (other variants elided
+for now). It may seem a little counterintuitive to express *C dependencies* in
+the *Cargo manifest*, but this is actually using one of Cargo’s conventions in
+this space.
+
+## `*-sys` Packages
+
+To ease linking to system libraries, Cargo has a *convention* of package
+naming and functionality. Any package named `foo-sys` will provide two major
+pieces of functionality:
+
+* The library crate will link to the native library `libfoo`. This will often
+ probe the current system for `libfoo` before resorting to building from
+ source.
+* The library crate will provide **declarations** for functions in `libfoo`,
+ but it does **not** provide bindings or higher-level abstractions.
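+
+For a concrete feel, the library crate of a hypothetical `foo-sys` package
+might contain nothing more than raw declarations along these lines (the
+function names are made up):
+
+```rust
+// lib.rs of a hypothetical foo-sys crate: declarations only, no safe wrappers.
+extern "C" {
+    pub fn foo_init() -> i32;
+    pub fn foo_shutdown();
+}
+```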
+
+The set of `*-sys` packages provides a common set of dependencies for linking
+to native libraries. There are a number of benefits to having this
+convention of native-library-related packages:
+
+* Common dependencies on `foo-sys` alleviate the above rule about one package
+ per value of `links`.
+* A common dependency allows centralizing logic on discovering `libfoo` itself
+ (or building it from source).
+* These dependencies are easily overridable.
+
+## Building libgit2
+
+Now that we’ve got libgit2’s dependencies sorted out, we need to actually write
+the build script. We’re not going to look at specific snippets of code here and
+instead only take a look at the high-level details of the build script of
+`libgit2-sys`. This is not a recommendation that all packages follow this
+strategy, but rather just an outline of one specific strategy.
+
+The first thing the build script should do is query whether libgit2 is already
+installed on the host system. To do this we’ll leverage the preexisting tool
+`pkg-config` (when it’s available). We’ll also use a `build-dependencies`
+section to refactor out all the `pkg-config`-related code (or someone’s already
+done that!).
+
+If `pkg-config` fails to find libgit2, or if `pkg-config` just isn’t
+installed, the next step is to build libgit2 from bundled source code
+(distributed as part of `libgit2-sys` itself). There are a few nuances when
+doing so that we need to take into account, however:
+
+* The build system of libgit2, `cmake`, needs to be able to find libgit2’s
+ optional dependency of libssh2. We’re sure we’ve already built it (it’s a
+ Cargo dependency); we just need to communicate this information. To do this
+ we leverage the metadata format to communicate information between build
+ scripts. In this example the libssh2 package printed out `cargo:root=...` to
+ tell us where libssh2 is installed, and we can then pass this along to
+ cmake with the `CMAKE_PREFIX_PATH` environment variable.
+
+* We’ll need to handle some `CFLAGS` values when compiling C code (and tell
+ `cmake` about this). Some flags we may want to pass are `-m64` for 64-bit
+ code, `-m32` for 32-bit code, or `-fPIC` for 64-bit code as well.
+
+* Finally, we’ll invoke `cmake` to place all output into the directory
+  specified by the `OUT_DIR` environment variable, and then we’ll print the
+  necessary metadata to instruct rustc how to link to libgit2.
+
+Most of the functionality of this build script is easily refactorable into
+common dependencies, so our build script isn’t quite as intimidating as this
+description suggests! In reality it’s expected that build scripts are quite
+succinct, farming logic such as the above out to build dependencies.
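+
+Tying the steps above together, a heavily elided sketch of such a build script
+might look like this (assuming a `pkg-config` entry in `[build-dependencies]`;
+the cmake fallback is only outlined in comments):
+
+```rust
+// build.rs
+
+extern crate pkg_config;
+
+fn main() {
+    // Step 1: see whether the host already has libgit2. On success the
+    // pkg-config crate prints the cargo: link metadata for us.
+    if pkg_config::probe_library("libgit2").is_ok() {
+        return;
+    }
+    // Step 2 (elided): build the bundled copy with cmake, passing the libssh2
+    // location from its build metadata via CMAKE_PREFIX_PATH, then print
+    // cargo:rustc-link-search / cargo:rustc-link-lib for the result.
+}
+```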
--- /dev/null
+% Configuration
+
+This document will explain how Cargo’s configuration system works, as well as
+the available configuration keys. For configuration of a project through its
+manifest, see the [manifest format](manifest.html).
+
+# Hierarchical structure
+
+Cargo allows local configuration for a particular project as well as global
+configuration, like git. Cargo extends this to a hierarchical strategy.
+If, for example, Cargo were invoked in `/projects/foo/bar/baz`, then the
+following configuration files would be probed for and unified in this order:
+
+* `/projects/foo/bar/baz/.cargo/config`
+* `/projects/foo/bar/.cargo/config`
+* `/projects/foo/.cargo/config`
+* `/projects/.cargo/config`
+* `/.cargo/config`
+* `$HOME/.cargo/config`
+
+With this structure, you can specify configuration per-project, and even
+possibly check it into version control. You can also specify personal defaults
+with a configuration file in your home directory.
+
+# Configuration format
+
+All configuration is currently in the [TOML format][toml] (like the manifest),
+with simple key-value pairs inside of sections (tables) which all get merged
+together.
+
+[toml]: https://github.com/toml-lang/toml
+
+# Configuration keys
+
+All of the following keys are optional, and their defaults are listed as their
+value unless otherwise noted.
+
+Key values that specify a tool may be given as an absolute path, a relative path
+or as a pathless tool name. Absolute paths and pathless tool names are used as
+given. Relative paths are resolved relative to the parent directory of the
+`.cargo` directory of the config file that the value resides within.
+
+```toml
+# An array of paths to local repositories which are to be used as overrides for
+# dependencies. For more information see the Specifying Dependencies guide.
+paths = ["/path/to/override"]
+
+[cargo-new]
+# This is your name/email to place in the `authors` section of a new Cargo.toml
+# that is generated. If not present, then `git` will be probed, and if that is
+# not present then `$USER` and `$EMAIL` will be used.
+name = "..."
+email = "..."
+
+# By default `cargo new` will initialize a new Git repository. This key can be
+# set to `hg` to create a Mercurial repository, or `none` to disable this
+# behavior.
+vcs = "none"
+
+# For the following sections, $triple refers to any valid target triple, not the
+# literal string "$triple", and it will apply whenever that target triple is
+# being compiled to. 'cfg(...)' refers to the Rust-like `#[cfg]` syntax for
+# conditional compilation.
+[target.$triple]
+# This is the linker which is passed to rustc (via `-C linker=`) when the `$triple`
+# is being compiled for. By default this flag is not passed to the compiler.
+linker = ".."
+# Same but for the library archiver which is passed to rustc via `-C ar=`.
+ar = ".."
+# If a runner is provided, compiled targets for the `$triple` will be executed
+# by invoking the specified runner executable with the actual target as its first argument.
+# This applies to `cargo run`, `cargo test` and `cargo bench` commands.
+# By default compiled targets are executed directly.
+runner = ".."
+# custom flags to pass to all compiler invocations that target $triple
+# this value overrides build.rustflags when both are present
+rustflags = ["..", ".."]
+
+[target.'cfg(...)']
+# Similar to the $triple configuration, but using the `cfg` syntax.
+# If several `cfg` and $triple targets are candidates, then the rustflags
+# are concatenated. The `cfg` syntax only applies to rustflags, and not to
+# linker.
+rustflags = ["..", ".."]
+
+# Configuration keys related to the registry
+[registry]
+index = "..." # URL of the registry index (defaults to the central repository)
+token = "..." # Access token (found on the central repo’s website)
+
+[http]
+proxy = "host:port" # HTTP proxy to use for HTTP requests (defaults to none)
+ # in libcurl format, e.g. "socks5h://host:port"
+timeout = 60000 # Timeout for each HTTP request, in milliseconds
+cainfo = "cert.pem" # Path to Certificate Authority (CA) bundle (optional)
+check-revoke = true # Indicates whether SSL certs are checked for revocation
+
+[build]
+jobs = 1 # number of parallel jobs, defaults to # of CPUs
+rustc = "rustc" # the rust compiler tool
+rustdoc = "rustdoc" # the doc generator tool
+target = "triple" # build for the target triple
+target-dir = "target" # path of where to place all generated artifacts
+rustflags = ["..", ".."] # custom flags to pass to all compiler invocations
+
+[term]
+verbose = false # whether cargo provides verbose output
+color = 'auto' # whether cargo colorizes output
+
+# Network configuration
+[net]
+retry = 2 # number of times a network call will automatically be retried
+
+# Alias cargo commands. The first 3 aliases are built in. If your
+# command requires arguments containing whitespace, use the list format.
+[alias]
+b = "build"
+t = "test"
+r = "run"
+rr = "run --release"
+space_example = ["run", "--release", "--", "\"command list\""]
+```
+
+# Environment variables
+
+Cargo can also be configured through environment variables in addition to the
+TOML syntax above. For each configuration key above of the form `foo.bar` the
+environment variable `CARGO_FOO_BAR` can also be used to define the value. For
+example the `build.jobs` key can also be defined by `CARGO_BUILD_JOBS`.
+
+Environment variables will take precedence over TOML configuration, and currently
+only integer, boolean, and string keys are supported to be defined by
+environment variables.
+
+In addition to the system above, Cargo recognizes a few other specific
+[environment variables][env].
+
+[env]: environment-variables.html
--- /dev/null
+% Publishing on crates.io
+
+Once you've got a library that you'd like to share with the world, it's time to
+publish it on [crates.io]! Publishing a crate is when a specific
+version is uploaded to be hosted on [crates.io].
+
+Take care when publishing a crate, because a publish is **permanent**. The
+version can never be overwritten, and the code cannot be deleted. There is no
+limit to the number of versions which can be published, however.
+
+# Before your first publish
+
+First things first, you’ll need an account on [crates.io] to acquire
+an API token. To do so, [visit the home page][crates.io] and log in via a GitHub
+account (required for now). After this, visit your [Account
+Settings](https://crates.io/me) page and run the `cargo login` command
+specified.
+
+```notrust
+$ cargo login abcdefghijklmnopqrstuvwxyz012345
+```
+
+This command will inform Cargo of your API token and store it locally in your
+`~/.cargo/credentials` (previously it was `~/.cargo/config`). Note that this
+token is a **secret** and should not be shared with anyone else. If it leaks for
+any reason, you should regenerate it immediately.
+
+# Before publishing a new crate
+
+Keep in mind that crate names on [crates.io] are allocated on a
+first-come-first-served basis. Once a crate name is taken, it cannot be used
+for another crate.
+
+## Packaging a crate
+
+The next step is to package up your crate into a format that can be uploaded to
+[crates.io]. For this we’ll use the `cargo package` subcommand. This will take
+our entire crate and package it all up into a `*.crate` file in the
+`target/package` directory.
+
+```notrust
+$ cargo package
+```
+
+As an added bonus, the `*.crate` will be verified independently of the current
+source tree. After the `*.crate` is created, it’s unpacked into
+`target/package` and then built from scratch to ensure that all necessary files
+are there for the build to succeed. This behavior can be disabled with the
+`--no-verify` flag.
+
+Now’s a good time to take a look at the `*.crate` file to make sure you didn’t
+accidentally package up that 2GB video asset, or large data files used for code
+generation, integration tests, or benchmarking. There is currently a 10MB
+upload size limit on `*.crate` files. So, if the `tests` and `benches`
+directories and their dependencies add up to only a couple of megabytes, you
+can keep them in your package; otherwise, it’s better to exclude them.
+
+Cargo will automatically ignore files ignored by your version control system
+when packaging, but if you want to specify an extra set of files to ignore you
+can use the `exclude` key in the manifest:
+
+```toml
+[package]
+# ...
+exclude = [
+ "public/assets/*",
+ "videos/*",
+]
+```
+
+The syntax of each element in this array is what
+[rust-lang/glob](https://github.com/rust-lang/glob) accepts. If you’d rather
+roll with a whitelist instead of a blacklist, Cargo also supports an `include`
+key, which if set, overrides the `exclude` key:
+
+```toml
+[package]
+# ...
+include = [
+ "**/*.rs",
+ "Cargo.toml",
+]
+```
+
+## Uploading the crate
+
+Now that we’ve got a `*.crate` file ready to go, it can be uploaded to
+[crates.io] with the `cargo publish` command. And that’s it, you’ve now published
+your first crate!
+
+```notrust
+$ cargo publish
+```
+
+If you’d like to skip the `cargo package` step, the `cargo publish` subcommand
+will automatically package up the local crate if a copy isn’t found already.
+
+Be sure to check out the [metadata you can
+specify](manifest.html#package-metadata) to ensure your crate can be discovered
+more easily!
+
+# Publishing a new version of an existing crate
+
+In order to release a new version, change the `version` value specified in your
+`Cargo.toml` manifest. Keep in mind [the semver
+rules](manifest.html#the-version-field). Then optionally run `cargo package` if
+you want to inspect the `*.crate` file for the new version before publishing,
+and run `cargo publish` to upload the new version.
+
+# Managing a crates.io-based crate
+
+Management of crates is primarily done through the command line `cargo` tool
+rather than the [crates.io] web interface. For this, there are a few subcommands
+to manage a crate.
+
+## `cargo yank`
+
+Occasions may arise where you publish a version of a crate that actually ends up
+being broken for one reason or another (syntax error, forgot to include a file,
+etc.). For situations such as this, Cargo supports a “yank” of a version of a
+crate.
+
+```notrust
+$ cargo yank --vers 1.0.1
+$ cargo yank --vers 1.0.1 --undo
+```
+
+A yank **does not** delete any code. This feature is not intended for deleting
+accidentally uploaded secrets, for example. If that happens, you must reset
+those secrets immediately.
+
+The semantics of a yanked version are that no new dependencies can be created
+against that version, but all existing dependencies continue to work. One of the
+major goals of [crates.io] is to act as a permanent archive of crates that does
+not change over time, and allowing deletion of a version would go against this
+goal. Essentially a yank means that all projects with a `Cargo.lock` will not
+break, while any future `Cargo.lock` files generated will not list the yanked
+version.
+
+## `cargo owner`
+
+A crate is often developed by more than one person, or the primary maintainer
+may change over time! The owner of a crate is the only person allowed to publish
+new versions of the crate, but an owner may designate additional owners.
+
+```notrust
+$ cargo owner --add my-buddy
+$ cargo owner --remove my-buddy
+$ cargo owner --add github:rust-lang:owners
+$ cargo owner --remove github:rust-lang:owners
+```
+
+The owner IDs given to these commands must be GitHub user names or GitHub teams.
+
+If a user name is given to `--add`, that user becomes a “named” owner, with
+full rights to the crate. In addition to being able to publish or yank versions
+of the crate, they have the ability to add or remove owners, *including* the
+owner that made *them* an owner. Needless to say, you shouldn’t make people you
+don’t fully trust into a named owner. In order to become a named owner, a user
+must have logged into [crates.io] previously.
+
+If a team name is given to `--add`, that team becomes a “team” owner, with
+restricted rights to the crate. While they have permission to publish or yank
+versions of the crate, they *do not* have the ability to add or remove owners.
+In addition to being more convenient for managing groups of owners, teams are
+just a bit more secure against owners becoming malicious.
+
+The syntax for teams is currently `github:org:team` (see examples above).
+In order to add a team as an owner one must be a member of that team. No
+such restriction applies to removing a team as an owner.
+
+## GitHub permissions
+
+Team membership is not something GitHub provides simple public access to, and
+you are likely to encounter the following message when working with them:
+
+> It looks like you don’t have permission to query a necessary property from
+GitHub to complete this request. You may need to re-authenticate on [crates.io]
+to grant permission to read GitHub org memberships. Just go to
+https://crates.io/login
+
+This is basically a catch-all for “you tried to query a team, and one of the
+five levels of membership access control denied this”. That is not an
+exaggeration. GitHub’s support for team access control is Enterprise Grade.
+
+The most likely cause of this is simply that you last logged in before this
+feature was added. We originally requested *no* permissions from GitHub when
+authenticating users, because we didn’t actually ever use the user’s token for
+anything other than logging them in. However, to query team membership on your
+behalf, we now require
+[the `read:org` scope](https://developer.github.com/v3/oauth/#scopes).
+
+You are free to deny us this scope, and everything that worked before teams
+were introduced will keep working. However you will never be able to add a team
+as an owner, or publish a crate as a team owner. If you ever attempt to do this,
+you will get the error above. You may also see this error if you ever try to
+publish a crate that you don’t own at all, but otherwise happens to have a team.
+
+If you ever change your mind, or just aren’t sure if [crates.io] has sufficient
+permission, you can always go to https://crates.io/login, which will prompt you
+for permission if [crates.io] doesn’t have all the scopes it would like.
+
+An additional barrier to querying GitHub is that the organization may be
+actively denying third party access. To check this, you can go to:
+
+ https://github.com/organizations/:org/settings/oauth_application_policy
+
+where `:org` is the name of the organization (e.g. rust-lang). On this page
+you may choose to explicitly remove [crates.io] from your organization’s
+blacklist, or simply press the “Remove Restrictions” button to allow all third
+party applications to access this data.
+
+Alternatively, when [crates.io] requested the `read:org` scope, you could have
+explicitly whitelisted [crates.io] querying the org in question by pressing
+the “Grant Access” button next to its name.
+
+
+
+[crates.io]: https://crates.io/
--- /dev/null
+% Environment Variables
+
+Cargo sets and reads a number of environment variables which your code can detect
+or override. Here is a list of the variables Cargo sets, organized by when it interacts
+with them:
+
+# Environment variables Cargo reads
+
+You can override these environment variables to change Cargo's behavior on your
+system:
+
+* `CARGO_HOME` - Cargo maintains a local cache of the registry index and of git
+ checkouts of crates. By default these are stored under `$HOME/.cargo`, but
+ this variable overrides the location of this directory. Once a crate is cached
+ it is not removed by the clean command.
+* `CARGO_TARGET_DIR` - Location where all generated artifacts are placed,
+ relative to the current working directory.
+* `RUSTC` - Instead of running `rustc`, Cargo will execute this specified
+ compiler.
+* `RUSTC_WRAPPER` - Instead of simply running `rustc`, Cargo will execute this
+ specified wrapper, passing the full rustc invocation as its command-line
+ arguments, with the first argument being the path to `rustc`.
+* `RUSTDOC` - Instead of running `rustdoc`, Cargo will execute this specified
+ `rustdoc` instance.
+* `RUSTDOCFLAGS` - A space-separated list of custom flags to pass to all `rustdoc`
+ invocations that Cargo performs. In contrast with `cargo rustdoc`, this is
+ useful for passing a flag to *all* `rustdoc` instances.
+* `RUSTFLAGS` - A space-separated list of custom flags to pass to all compiler
+ invocations that Cargo performs. In contrast with `cargo rustc`, this is
+ useful for passing a flag to *all* compiler instances.
+
+Note that Cargo will also read environment variables for `.cargo/config`
+configuration values, as described in [that documentation][config-env]
+
+[config-env]: config.html#environment-variables
+
+# Environment variables Cargo sets for crates
+
+Cargo exposes these environment variables to your crate when it is compiled.
+Note that this applies for test binaries as well.
+To get the value of any of these variables in a Rust program, do this:
+
+```
+let version = env!("CARGO_PKG_VERSION");
+```
+
+`version` will now contain the value of `CARGO_PKG_VERSION`.
+
+* `CARGO` - Path to the `cargo` binary performing the build.
+* `CARGO_MANIFEST_DIR` - The directory containing the manifest of your package.
+* `CARGO_PKG_VERSION` - The full version of your package.
+* `CARGO_PKG_VERSION_MAJOR` - The major version of your package.
+* `CARGO_PKG_VERSION_MINOR` - The minor version of your package.
+* `CARGO_PKG_VERSION_PATCH` - The patch version of your package.
+* `CARGO_PKG_VERSION_PRE` - The pre-release version of your package.
+* `CARGO_PKG_AUTHORS` - Colon separated list of authors from the manifest of your package.
+* `CARGO_PKG_NAME` - The name of your package.
+* `CARGO_PKG_DESCRIPTION` - The description of your package.
+* `CARGO_PKG_HOMEPAGE` - The home page of your package.
+* `OUT_DIR` - If the package has a build script, this is set to the folder where the build
+ script should place its output. See below for more information.
+
+# Environment variables Cargo sets for build scripts
+
+Cargo sets several environment variables when build scripts are run. Because these variables
+are not yet set when the build script is compiled, the above example using `env!` won't work
+and instead you'll need to retrieve the values when the build script is run:
+
+```
+use std::env;
+let out_dir = env::var("OUT_DIR").unwrap();
+```
+
+`out_dir` will now contain the value of `OUT_DIR`.
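+
+For instance, a minimal build script might use `OUT_DIR` to emit a generated
+source file (a sketch: the file name and contents here are illustrative, not
+part of Cargo's API):
+
+```
+// build.rs: write a generated Rust source file into OUT_DIR.
+use std::env;
+use std::fs::File;
+use std::io::Write;
+use std::path::Path;
+
+fn main() {
+    let out_dir = env::var("OUT_DIR").unwrap();
+    let dest = Path::new(&out_dir).join("generated.rs");
+    let mut f = File::create(&dest).unwrap();
+    f.write_all(b"pub const GREETING: &'static str = \"hello\";").unwrap();
+}
+```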
+
+* `CARGO_MANIFEST_DIR` - The directory containing the manifest for the package
+ being built (the package containing the build
+ script). Also note that this is the value of the
+ current working directory of the build script when it
+ starts.
+* `CARGO_MANIFEST_LINKS` - the manifest `links` value.
+* `CARGO_FEATURE_<name>` - For each activated feature of the package being
+ built, this environment variable will be present
+ where `<name>` is the name of the feature uppercased
+ and having `-` translated to `_`.
+* `CARGO_CFG_<cfg>` - For each [configuration option][configuration] of the
+ package being built, this environment variable will
+ contain the value of the configuration, where `<cfg>` is
+ the name of the configuration uppercased and having `-`
+ translated to `_`.
+ Boolean configurations are present if they are set, and
+ not present otherwise.
+ Configurations with multiple values are joined to a
+ single variable with the values delimited by `,`.
+* `OUT_DIR` - the folder in which all output should be placed. This folder is
+ inside the build directory for the package being built, and it is
+ unique for the package in question.
+* `TARGET` - the target triple that is being compiled for. Native code should be
+ compiled for this triple. Some more information about target
+ triples can be found in [clang’s own documentation][clang].
+* `HOST` - the host triple of the rust compiler.
+* `NUM_JOBS` - the parallelism specified as the top-level parallelism. This can
+ be useful to pass a `-j` parameter to a system like `make`. Note
+ that care should be taken when interpreting this environment
+ variable. It is provided for historical reasons; recent
+ versions of Cargo, for example, do not need to run `make -j`,
+ as it happens automatically. Cargo implements its own
+ [jobserver] and will allow build scripts to inherit this
+ information, so programs compatible with GNU make jobservers will
+ already have appropriately configured parallelism.
+* `OPT_LEVEL`, `DEBUG` - values of the corresponding variables for the
+ profile currently being built.
+* `PROFILE` - `release` for release builds, `debug` for other builds.
+* `DEP_<name>_<key>` - For more information about this set of environment
+ variables, see build script documentation about [`links`][links].
+* `RUSTC`, `RUSTDOC` - the compiler and documentation generator that Cargo has
+ resolved to use, passed to the build script so it might
+ use them as well.
+
+[links]: build-script.html#the-links-manifest-key
+[profile]: manifest.html#the-profile-sections
+[configuration]: https://doc.rust-lang.org/reference/attributes.html#conditional-compilation
+[clang]: http://clang.llvm.org/docs/CrossCompilation.html#target-triple
+[jobserver]: http://make.mad-scientist.net/papers/jobserver-implementation/
+
+# Environment variables Cargo sets for 3rd party subcommands
+
+Cargo exposes this environment variable to 3rd party subcommands
+(i.e. programs named `cargo-foobar` placed in `$PATH`):
+
+* `CARGO` - Path to the `cargo` binary performing the build.
--- /dev/null
+% External tools
+
+One of the goals of Cargo is simple integration with third-party tools, like
+IDEs and other build systems. To make integration easier, Cargo has several
+facilities:
+
+* a `cargo metadata` command, which outputs project structure and dependencies
+ information in JSON,
+
+* a `--message-format` flag, which outputs information about a particular build,
+ and
+
+* support for custom subcommands.
+
+
+# Information about project structure
+
+You can use the `cargo metadata` command to get information about project structure
+and dependencies. The output of the command looks like this:
+
+```text
+{
+ // Integer version number of the format.
+ "version": integer,
+
+ // List of packages for this workspace, including dependencies.
+ "packages": [
+ {
+ // Opaque package identifier.
+ "id": PackageId,
+
+ "name": string,
+
+ "version": string,
+
+ "source": SourceId,
+
+ // A list of declared dependencies, see `resolve` field for actual dependencies.
+ "dependencies": [ Dependency ],
+
+      "targets": [ Target ],
+
+ // Path to Cargo.toml
+ "manifest_path": string,
+ }
+ ],
+
+ "workspace_members": [ PackageId ],
+
+ // Dependencies graph.
+ "resolve": {
+ "nodes": [
+ {
+ "id": PackageId,
+ "dependencies": [ PackageId ]
+ }
+ ]
+ }
+}
+```
+
+The format is stable and versioned. When calling `cargo metadata`, you should
+pass the `--format-version` flag explicitly to avoid forward-incompatibility
+hazards.
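+
+For example:
+
+```shell
+$ cargo metadata --format-version 1
+```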
+
+If you are using Rust, there is the [cargo_metadata] crate.
+
+[cargo_metadata]: https://crates.io/crates/cargo_metadata
+
+
+# Information about build
+
+When passing `--message-format=json`, Cargo will output the following
+information during the build:
+
+* compiler errors and warnings,
+
+* produced artifacts,
+
+* results of the build scripts (for example, native dependencies).
+
+The output goes to stdout, one JSON object per line. The `reason` field
+distinguishes different kinds of messages.
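+
+For example:
+
+```shell
+$ cargo build --message-format=json
+```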
+
+Information about dependencies in the Makefile-compatible format is stored in
+the `.d` files alongside the artifacts.
+
+
+# Custom subcommands
+
+Cargo is designed to be extensible with new subcommands without having to modify
+Cargo itself. This is achieved by translating a cargo invocation of the form
+`cargo (?<command>[^ ]+)` into an invocation of an external tool
+`cargo-${command}`, which then needs to be present in one of the user's `$PATH`
+directories.
+
+Custom subcommands may use the `CARGO` environment variable to call back to
+Cargo. Alternatively, they can link to the `cargo` crate as a library, but this
+approach has drawbacks:
+
+* Cargo as a library is unstable: the API changes without deprecation,
+
+* versions of the Cargo library and the Cargo binary may be different.
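+
+As an illustration, here is a minimal sketch of an external subcommand (a
+hypothetical `cargo-hello` binary; any executable named with a `cargo-` prefix
+on `$PATH` works the same way):
+
+```rust
+// main.rs of a hypothetical `cargo-hello` binary. Once installed on
+// $PATH, `cargo hello` resolves to this program.
+use std::env;
+
+fn main() {
+    // The subcommand name ("hello") arrives as the first argument.
+    let args: Vec<String> = env::args().skip(1).collect();
+    // `CARGO` points back at the invoking cargo binary.
+    let cargo = env::var("CARGO").unwrap_or_else(|_| String::from("cargo"));
+    println!("args: {:?}, invoked via: {}", args, cargo);
+}
+```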
--- /dev/null
+% Frequently Asked Questions
+
+# Is the plan to use GitHub as a package repository?
+
+No. The plan for Cargo is to use [crates.io], like npm or Rubygems do with
+npmjs.org and rubygems.org.
+
+We plan to support git repositories as a source of packages forever,
+because they can be used for early development and temporary patches,
+even when people use the registry as the primary source of packages.
+
+# Why build crates.io rather than use GitHub as a registry?
+
+We think that it’s very important to support multiple ways to download
+packages, including downloading from GitHub and copying packages into
+your project itself.
+
+That said, we think that [crates.io] offers a number of important benefits, and
+will likely become the primary way that people download packages in Cargo.
+
+For precedent, both Node.js’s [npm][1] and Ruby’s [bundler][2] support both a
+central registry model and a Git-based model, and most packages
+are downloaded through the registry in those ecosystems, with an
+important minority of packages making use of git-based packages.
+
+[1]: https://www.npmjs.org
+[2]: https://bundler.io
+
+Some of the advantages that make a central registry popular in other
+languages include:
+
+* **Discoverability**. A central registry provides an easy place to look
+ for existing packages. Combined with tagging, this also makes it
+ possible for a registry to provide ecosystem-wide information, such as a
+ list of the most popular or most-depended-on packages.
+* **Speed**. A central registry makes it possible to easily fetch just
+ the metadata for packages quickly and efficiently, and then to
+ efficiently download just the published package, and not other bloat
+ that happens to exist in the repository. This adds up to a significant
+ improvement in the speed of dependency resolution and fetching. As
+ dependency graphs scale up, downloading all of the git repositories bogs
+ down fast. Also remember that not everybody has a high-speed,
+ low-latency Internet connection.
+
+# Will Cargo work with C code (or other languages)?
+
+Yes!
+
+Cargo handles compiling Rust code, but we know that many Rust projects
+link against C code. We also know that there are decades of tooling
+built up around compiling languages other than Rust.
+
+Our solution: Cargo allows a package to [specify a script](build-script.html)
+(written in Rust) to run before invoking `rustc`. Rust is leveraged to
+implement platform-specific configuration and refactor out common build
+functionality among packages.
+
+# Can Cargo be used inside of `make` (or `ninja`, or ...)?
+
+Indeed. While we intend Cargo to be useful as a standalone way to
+compile Rust projects at the top-level, we know that some people will
+want to invoke Cargo from other build tools.
+
+We have designed Cargo to work well in those contexts, paying attention
+to things like error codes and machine-readable output modes. We still
+have some work to do on those fronts, but using Cargo in the context of
+conventional scripts is something we designed for from the beginning and
+will continue to prioritize.
+
+# Does Cargo handle multi-platform projects or cross-compilation?
+
+Rust itself provides facilities for configuring sections of code based
+on the platform. Cargo also supports [platform-specific
+dependencies][target-deps], and we plan to support more per-platform
+configuration in `Cargo.toml` in the future.
+
+[target-deps]: manifest.html#the-dependencies-section
+
+In the longer-term, we’re looking at ways to conveniently cross-compile
+projects using Cargo.
+
+# Does Cargo support environments, like `production` or `test`?
+
+We support environments through the use of [profiles][profile], which provide
+the following (an example follows the list):
+
+[profile]: manifest.html#the-profile-sections
+
+* environment-specific flags (like `-g --opt-level=0` for development
+ and `--opt-level=3` for production).
+* environment-specific dependencies (like `hamcrest` for test assertions).
+* environment-specific `#[cfg]`
+* a `cargo test` command
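+
+For example, the default profiles can be tuned in `Cargo.toml` (a sketch; the
+values shown are illustrative):
+
+```toml
+# Development builds: fast compiles, no optimizations, debug info.
+[profile.dev]
+opt-level = 0
+debug = true
+
+# Release builds: optimize for runtime speed.
+[profile.release]
+opt-level = 3
+```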
+
+# Does Cargo work on Windows?
+
+Yes!
+
+All commits to Cargo are required to pass the local test suite on Windows.
+If, however, you find a Windows issue, we consider it a bug, so [please file an
+issue][3].
+
+[3]: https://github.com/rust-lang/cargo/issues
+
+# Why do binaries have `Cargo.lock` in version control, but not libraries?
+
+The purpose of a `Cargo.lock` is to describe the state of the world at the time
+of a successful build. It is then used to provide deterministic builds across
+whatever machine is building the project by ensuring that the exact same
+dependencies are being compiled.
+
+This property is most desirable for applications and projects which are at the
+very end of the dependency chain (binaries). As a result, it is recommended that
+all binaries check in their `Cargo.lock`.
+
+For libraries the situation is somewhat different. A library is not only used by
+the library developers, but also any downstream consumers of the library. Users
+dependent on the library will not inspect the library’s `Cargo.lock` (even if it
+exists). This is precisely because a library should **not** be deterministically
+recompiled for all users of the library.
+
+If a library ends up being used transitively by several dependencies, it’s
+likely that just a single copy of the library is desired (based on semver
+compatibility). If all libraries were to check in their `Cargo.lock`, then
+multiple copies of the library would be used, and perhaps even a version
+conflict.
+
+In other words, libraries specify semver requirements for their dependencies but
+cannot see the full picture. Only end products like binaries have a full
+picture to decide what versions of dependencies should be used.
+
+# Can libraries use `*` as a version for their dependencies?
+
+**As of January 22nd, 2016, [crates.io] rejects all packages (not just libraries)
+with wildcard dependency constraints.**
+
+While libraries _can_, strictly speaking, they should not. A version requirement
+of `*` says “This will work with every version ever,” which is never going
+to be true. Libraries should always specify the range that they do work with,
+even if it’s something as general as “every 1.x.y version.”
+
+# Why `Cargo.toml`?
+
+As one of the most frequent interactions with Cargo, the question of why the
+configuration file is named `Cargo.toml` arises from time to time. The leading
+capital-`C` was chosen to ensure that the manifest was grouped with other
+similar configuration files in directory listings. Sorting files often puts
+capital letters before lowercase letters, ensuring files like `Makefile` and
+`Cargo.toml` are placed together. The trailing `.toml` was chosen to emphasize
+the fact that the file is in the [TOML configuration
+format](https://github.com/toml-lang/toml).
+
+Cargo does not allow other names such as `cargo.toml` or `Cargofile` to
+emphasize the ease of how a Cargo repository can be identified. An option of
+many possible names has historically led to confusion where one case was handled
+but others were accidentally forgotten.
+
+[crates.io]: https://crates.io/
+
+# How can Cargo work offline?
+
+Cargo is often used in situations with limited or no network access such as
+airplanes, CI environments, or embedded in large production deployments. Users
+are often surprised when Cargo attempts to fetch resources from the network, and
+hence the request for Cargo to work offline comes up frequently.
+
+Cargo, at its heart, will not attempt to access the network unless told to do
+so. That is, if no crates come from crates.io, a git repository, or some other
+network location, Cargo will never attempt to make a network connection. As a
+result, if Cargo attempts to touch the network, then it's because it needs to
+fetch a required resource.
+
+Cargo is also quite aggressive about caching information to minimize the amount
+of network activity. For example, if `cargo build` (or an equivalent) is run
+to completion, then the next `cargo build` is guaranteed not to touch the
+network so long as `Cargo.toml` has not been modified in the
+meantime. This avoidance of the network boils down to a `Cargo.lock` existing
+and a populated cache of the crates reflected in the lock file. If either of
+these components is missing, then it is required for the build to succeed and
+must be fetched remotely.
+
+As of Rust 1.11.0 Cargo understands a new flag, `--frozen`, which is an
+assertion that it shouldn't touch the network. When passed, Cargo will
+immediately return an error if it would otherwise attempt a network request.
+The error includes contextual information about why the network request is
+being made in the first place, to help with debugging. Note that this flag *does
+not change the behavior of Cargo*, it simply asserts that Cargo shouldn't touch
+the network as a previous command has been run to ensure that network activity
+shouldn't be necessary.
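+
+For example:
+
+```shell
+$ cargo build --frozen
+```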
+
+For more information about vendoring, see documentation on [source
+replacement][replace].
+
+[replace]: source-replacement.html
--- /dev/null
+</main>
+<footer>
+<a href='index.html'>Install</a>
+<span class='sep'>|</span>
+<a href='index.html'>Getting Started</a>
+<span class='sep'>|</span>
+<a href='guide.html'>Guide</a>
+</footer>
+
+<script type='text/javascript' src='javascripts/prism.js'></script>
+<script type='text/javascript' src='javascripts/all.js'></script>
--- /dev/null
+% Cargo Guide
+
+Welcome to the Cargo guide. This guide will give you all that you need to know
+about how to use Cargo to develop Rust projects.
+
+# Why Cargo exists
+
+Cargo is a tool that allows Rust projects to declare their various
+dependencies and ensure that you’ll always get a repeatable build.
+
+To accomplish this goal, Cargo does four things:
+
+* Introduces two metadata files with various bits of project information.
+* Fetches and builds your project’s dependencies.
+* Invokes `rustc` or another build tool with the correct parameters to build
+ your project.
+* Introduces conventions to make working with Rust projects easier.
+
+# Creating a new project
+
+To start a new project with Cargo, use `cargo new`:
+
+```shell
+$ cargo new hello_world --bin
+```
+
+We’re passing `--bin` because we’re making a binary program: if we
+were making a library, we’d leave it off. This also initializes a new `git`
+repository by default. If you don't want it to do that, pass `--vcs none`.
+
+Let’s check out what Cargo has generated for us:
+
+```shell
+$ cd hello_world
+$ tree .
+.
+├── Cargo.toml
+└── src
+ └── main.rs
+
+1 directory, 2 files
+```
+
+If we had just used `cargo new hello_world` without the `--bin` flag, then
+we would have a `lib.rs` instead of a `main.rs`. For now, however, this is all
+we need to get started. First, let’s check out `Cargo.toml`:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+```
+
+This is called a **manifest**, and it contains all of the metadata that Cargo
+needs to compile your project.
+
+Here’s what’s in `src/main.rs`:
+
+```rust
+fn main() {
+ println!("Hello, world!");
+}
+```
+
+Cargo generated a “hello world” for us. Let’s compile it:
+
+```shell
+$ cargo build
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+And then run it:
+
+```shell
+$ ./target/debug/hello_world
+Hello, world!
+```
+
+We can also use `cargo run` to compile and then run it, all in one step (you
+won't see the `Compiling` line if you have not made any changes since you last
+compiled):
+
+```shell
+$ cargo run
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+ Running `target/debug/hello_world`
+Hello, world!
+```
+
+You’ll now notice a new file, `Cargo.lock`. It contains information about our
+dependencies. Since we don’t have any yet, it’s not very interesting.
+
+Once you’re ready for release, you can use `cargo build --release` to compile
+your files with optimizations turned on:
+
+```shell
+$ cargo build --release
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+`cargo build --release` puts the resulting binary in `target/release` instead of
+`target/debug`.
+
+Compiling in debug mode is the default for development. Compilation time is
+shorter since the compiler doesn't perform optimizations, but the code will run
+slower. Release mode takes longer to compile, but the code will run faster.
+
+# Working on an existing Cargo project
+
+If you download an existing project that uses Cargo, it’s really easy
+to get going.
+
+First, get the project from somewhere. In this example, we’ll use `rand`
+cloned from its repository on GitHub:
+
+```shell
+$ git clone https://github.com/rust-lang-nursery/rand.git
+$ cd rand
+```
+
+To build, use `cargo build`:
+
+```shell
+$ cargo build
+ Compiling rand v0.1.0 (file:///path/to/project/rand)
+```
+
+This will fetch all of the dependencies and then build them, along with the
+project.
+
+# Adding dependencies from crates.io
+
+[crates.io] is the Rust community's central package registry that serves as a
+location to discover and download packages. `cargo` is configured to use it by
+default to find requested packages.
+
+To depend on a library hosted on [crates.io], add it to your `Cargo.toml`.
+
+[crates.io]: https://crates.io/
+
+## Adding a dependency
+
+If your `Cargo.toml` doesn't already have a `[dependencies]` section, add that,
+then list the crate name and version that you would like to use. This example
+adds a dependency of the `time` crate:
+
+```toml
+[dependencies]
+time = "0.1.12"
+```
+
+The version string is a [semver] version requirement. The [specifying
+dependencies](specifying-dependencies.html) docs have more information about
+the options you have here.
+
+[semver]: https://github.com/steveklabnik/semver#requirements
+
+If we also wanted to add a dependency on the `regex` crate, we would not need
+to add `[dependencies]` for each crate listed. Here's what your whole
+`Cargo.toml` file would look like with dependencies on the `time` and `regex`
+crates:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+
+[dependencies]
+time = "0.1.12"
+regex = "0.1.41"
+```
+
+Re-run `cargo build`, and Cargo will fetch the new dependencies and all of
+their dependencies, compile them all, and update the `Cargo.lock`:
+
+```shell
+$ cargo build
+ Updating registry `https://github.com/rust-lang/crates.io-index`
+ Downloading memchr v0.1.5
+ Downloading libc v0.1.10
+ Downloading regex-syntax v0.2.1
+ Downloading aho-corasick v0.3.0
+ Downloading regex v0.1.41
+ Compiling memchr v0.1.5
+ Compiling libc v0.1.10
+ Compiling regex-syntax v0.2.1
+ Compiling aho-corasick v0.3.0
+ Compiling regex v0.1.41
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+Our `Cargo.lock` contains the exact information about which revision of all of
+these dependencies we used.
+
+Now, if `regex` gets updated, we will still build with the same revision until
+we choose to `cargo update`.
+
+You can now use the `regex` library using `extern crate` in `main.rs`.
+
+```rust
+extern crate regex;
+
+use regex::Regex;
+
+fn main() {
+ let re = Regex::new(r"^\d{4}-\d{2}-\d{2}$").unwrap();
+ println!("Did our date match? {}", re.is_match("2014-01-01"));
+}
+```
+
+Running it will show:
+
+```shell
+$ cargo run
+ Running `target/debug/hello_world`
+Did our date match? true
+```
+
+# Project layout
+
+Cargo uses conventions for file placement to make it easy to dive into a new
+Cargo project:
+
+```shell
+.
+├── Cargo.lock
+├── Cargo.toml
+├── benches
+│ └── large-input.rs
+├── examples
+│ └── simple.rs
+├── src
+│ ├── bin
+│ │ └── another_executable.rs
+│ ├── lib.rs
+│ └── main.rs
+└── tests
+ └── some-integration-tests.rs
+```
+
+* `Cargo.toml` and `Cargo.lock` are stored in the root of your project (*package
+ root*).
+* Source code goes in the `src` directory.
+* The default library file is `src/lib.rs`.
+* The default executable file is `src/main.rs`.
+* Other executables can be placed in `src/bin/*.rs`.
+* Integration tests go in the `tests` directory (unit tests go in each file
+ they're testing).
+* Examples go in the `examples` directory.
+* Benchmarks go in the `benches` directory.
+
+These are explained in more detail in the [manifest
+description](manifest.html#the-project-layout).
+
+# Cargo.toml vs Cargo.lock
+
+`Cargo.toml` and `Cargo.lock` serve two different purposes. Before we talk
+about them, here’s a summary:
+
+* `Cargo.toml` is about describing your dependencies in a broad sense, and is
+ written by you.
+* `Cargo.lock` contains exact information about your dependencies. It is
+ maintained by Cargo and should not be manually edited.
+
+If you’re building a library that other projects will depend on, put
+`Cargo.lock` in your `.gitignore`. If you’re building an executable like a
+command-line tool or an application, check `Cargo.lock` into `git`. If you're
+curious about why that is, see ["Why do binaries have `Cargo.lock` in version
+control, but not libraries?" in the
+FAQ](faq.html#why-do-binaries-have-cargolock-in-version-control-but-not-libraries).
+
+Let’s dig in a little bit more.
+
+`Cargo.toml` is a **manifest** file in which we can specify a bunch of
+different metadata about our project. For example, we can say that we depend
+on another project:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand.git" }
+```
+
+This project has a single dependency, on the `rand` library. We’ve stated in
+this case that we’re relying on a particular Git repository that lives on
+GitHub. Since we haven’t specified any other information, Cargo assumes that
+we intend to use the latest commit on the `master` branch to build our project.
+
+Sound good? Well, there’s one problem: If you build this project today, and
+then you send a copy to me, and I build this project tomorrow, something bad
+could happen. There could be more commits to `rand` in the meantime, and my
+build would include new commits while yours would not. Therefore, we would
+get different builds. This would be bad because we want reproducible builds.
+
+We could fix this problem by putting a `rev` line in our `Cargo.toml`:
+
+```toml
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand.git", rev = "9f35b8e" }
+```
+
+Now our builds will be the same. But there’s a big drawback: now we have to
+manually think about SHA-1s every time we want to update our library. This is
+both tedious and error prone.
+
+Enter the `Cargo.lock`. Because of its existence, we don’t need to manually
+keep track of the exact revisions: Cargo will do it for us. When we have a
+manifest like this:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand.git" }
+```
+
+Cargo will take the latest commit and write that information out into our
+`Cargo.lock` when we build for the first time. That file will look like this:
+
+```toml
+[[package]]
+name = "hello_world"
+version = "0.1.0"
+dependencies = [
+ "rand 0.1.0 (git+https://github.com/rust-lang-nursery/rand.git#9f35b8e439eeedd60b9414c58f389bdc6a3284f9)",
+]
+
+[[package]]
+name = "rand"
+version = "0.1.0"
+source = "git+https://github.com/rust-lang-nursery/rand.git#9f35b8e439eeedd60b9414c58f389bdc6a3284f9"
+```
+
+You can see that there’s a lot more information here, including the exact
+revision we used to build. Now when you give your project to someone else,
+they’ll use the exact same SHA, even though we didn’t specify it in our
+`Cargo.toml`.
+
+When we’re ready to opt in to a new version of the library, Cargo can
+re-calculate the dependencies and update things for us:
+
+```shell
+$ cargo update # updates all dependencies
+$ cargo update -p rand # updates just “rand”
+```
+
+This will write out a new `Cargo.lock` with the new version information. Note
+that the argument to `cargo update` is actually a
+[Package ID Specification](pkgid-spec.html) and `rand` is just a short
+specification.
+
+# Tests
+
+Cargo can run your tests with the `cargo test` command. Cargo looks for tests
+to run in two places: in each of your `src` files and any tests in `tests/`.
+Tests in your `src` files should be unit tests, and tests in `tests/` should be
+integration-style tests. As such, you’ll need to import your crates into
+the files in `tests`.
+
+Here's an example of running `cargo test` in our project, which currently has
+no tests:
+
+```shell
+$ cargo test
+ Compiling rand v0.1.0 (https://github.com/rust-lang-nursery/rand.git#9f35b8e)
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+ Running target/test/hello_world-9c2b65bbb79eabce
+
+running 0 tests
+
+test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
+```
+
+If our project had tests, we would see more output with the correct number of
+tests.
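+
+For example, an integration test file might look like this (a minimal sketch,
+assuming the package also defines a library target named `hello_world`):
+
+```rust
+// tests/some-integration-tests.rs: integration tests are compiled as a
+// separate crate, so the library under test must be imported explicitly.
+extern crate hello_world;
+
+#[test]
+fn smoke_test() {
+    // Exercise the library's public API here.
+    assert_eq!(2 + 2, 4);
+}
+```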
+
+You can also run a specific test by passing a filter:
+
+```shell
+$ cargo test foo
+```
+
+This will run any test with `foo` in its name.
+
+`cargo test` runs additional checks as well. For example, it will compile any
+examples you’ve included and will also test the examples in your
+documentation. Please see the [testing guide][testing] in the Rust
+documentation for more details.
+
+[testing]: https://doc.rust-lang.org/book/testing.html
+
+## Travis CI
+
+To test your project on Travis CI, here is a sample `.travis.yml` file:
+
+```yaml
+language: rust
+rust:
+ - stable
+ - beta
+ - nightly
+matrix:
+ allow_failures:
+ - rust: nightly
+```
+
+This will test all three release channels, but any breakage in nightly
+will not fail your overall build. Please see the [Travis CI Rust
+documentation](https://docs.travis-ci.com/user/languages/rust/) for more
+information.
+
+
+## Build cache
+
+Cargo shares build artifacts among all the packages of a single workspace.
+Today, Cargo does not share build results across different workspaces, but
+a similar result can be achieved by using a third party tool, [sccache].
+
+To set up `sccache`, install it with `cargo install sccache` and set the
+`RUSTC_WRAPPER` environment variable to `sccache` before invoking Cargo.
+If you use bash, it makes sense to add `export RUSTC_WRAPPER=sccache` to
+`.bashrc`. Refer to the sccache documentation for more details.
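+
+For example, in a bash session:
+
+```shell
+$ cargo install sccache
+$ export RUSTC_WRAPPER=sccache
+$ cargo build
+```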
+
+[sccache]: https://github.com/mozilla/sccache
+
+
+# Further reading
+
+Now that you have an overview of how to use Cargo and have created your first
+crate, you may be interested in:
+
+* [Publishing your crate on crates.io](crates-io.html)
+* [Reading about all the possible ways of specifying dependencies](specifying-dependencies.html)
+* [Learning more details about what you can specify in your `Cargo.toml` manifest](manifest.html)
+
+Even more topics are available in the Docs menu at the top!
--- /dev/null
+<a href='https://github.com/rust-lang/cargo' class='fork-me'>
+ <img src='images/forkme.png'/>
+</a>
+
+<div id="header">
+ <a href='https://crates.io' class='logo'>
+ <img id="logo" height=100 width=100 src='images/Cargo-Logo-Small.png'/>
+ </a>
+ <a href="index.html">
+ <h1>CARGO</h1>
+ </a>
+
+ <div class="search">
+ <form action="https://crates.io/search"
+ method="GET">
+ <input name="q" class="search" placeholder="Search crates" type="text"/>
+ </form>
+ </div>
+
+ <div class="nav">
+ <a href='https://crates.io/crates'>Browse All Crates</a>
+
+ <span class='sep'>|</span>
+
+ <div class="dropdown-container">
+ <button class="dropdown">
+ Docs
+ <span class="arrow"></span>
+ </button>
+ <!-- Sync this list with
+ https://github.com/rust-lang/crates.io/blob/master/app/templates/application.hbs
+ and with Makefile.in in this repository -->
+ <ul id="current-user-links" class="dropdown" data-bindattr-503="503">
+ <li><a href='index.html'>Getting Started</a></li>
+ <li><a href='guide.html'>Guide</a></li>
+ <li><a href='specifying-dependencies.html'>Specifying Dependencies</a></li>
+ <li><a href='crates-io.html'>Publishing on crates.io</a></li>
+ <li><a href='faq.html'>FAQ</a></li>
+ <li><a href='manifest.html'>Cargo.toml Format</a></li>
+ <li><a href='build-script.html'>Build Scripts</a></li>
+ <li><a href='config.html'>Configuration</a></li>
+ <li><a href='pkgid-spec.html'>Package ID specs</a></li>
+ <li><a href='environment-variables.html'>Environment Variables</a></li>
+ <li><a href='source-replacement.html'>Source Replacement</a></li>
+ <li><a href='external-tools.html'>External Tools</a></li>
+ <li><a href='https://crates.io/policies'>Policies</a></li>
+ </ul>
+ </div>
+ </div>
+</div>
+
+<main>
--- /dev/null
+<script src="https://code.jquery.com/jquery-2.1.1.min.js"></script>
+<link rel="icon" type="image/x-icon" href="favicon.ico">
--- /dev/null
+% Cargo, Rust’s Package Manager
+
+# Installing
+
+### Install Stable Rust and Cargo
+
+The easiest way to get Cargo is to get the current stable release of [Rust] by
+using the `rustup` script:
+
+```shell
+$ curl -sSf https://static.rust-lang.org/rustup.sh | sh
+```
+
+After this, you can use the `rustup` command to also install `beta` or `nightly`
+channels for Rust and Cargo.
+
+### Install Nightly Cargo
+
+To install just Cargo, the current recommended installation method is through
+the official nightly builds. Note that Cargo will also require that [Rust] is
+already installed on the system.
+
+| Platform | 64-bit | 32-bit |
+|------------------|-------------------|-------------------|
+| Linux binaries | [tar.gz][linux64] | [tar.gz][linux32] |
+| MacOS binaries | [tar.gz][mac64] | [tar.gz][mac32] |
+| Windows binaries | [tar.gz][win64] | [tar.gz][win32] |
+
+### Build and Install Cargo from Source
+
+Alternatively, you can [build Cargo from source][compiling-from-source].
+
+[rust]: https://www.rust-lang.org/
+[linux64]: https://static.rust-lang.org/cargo-dist/cargo-nightly-x86_64-unknown-linux-gnu.tar.gz
+[linux32]: https://static.rust-lang.org/cargo-dist/cargo-nightly-i686-unknown-linux-gnu.tar.gz
+[mac64]: https://static.rust-lang.org/cargo-dist/cargo-nightly-x86_64-apple-darwin.tar.gz
+[mac32]: https://static.rust-lang.org/cargo-dist/cargo-nightly-i686-apple-darwin.tar.gz
+[win64]: https://static.rust-lang.org/cargo-dist/cargo-nightly-x86_64-pc-windows-gnu.tar.gz
+[win32]: https://static.rust-lang.org/cargo-dist/cargo-nightly-i686-pc-windows-gnu.tar.gz
+[compiling-from-source]: https://github.com/rust-lang/cargo#compiling-from-source
+
+# Let’s get started
+
+To start a new project with Cargo, use `cargo new`:
+
+```shell
+$ cargo new hello_world --bin
+```
+
+We’re passing `--bin` because we’re making a binary program: if we
+were making a library, we’d leave it off.
+
+Let’s check out what Cargo has generated for us:
+
+```shell
+$ cd hello_world
+$ tree .
+.
+├── Cargo.toml
+└── src
+ └── main.rs
+
+1 directory, 2 files
+```
+
+This is all we need to get started. First, let’s check out `Cargo.toml`:
+
+```toml
+[package]
+name = "hello_world"
+version = "0.1.0"
+authors = ["Your Name <you@example.com>"]
+```
+
+This is called a **manifest**, and it contains all of the metadata that Cargo
+needs to compile your project.
+
+Here’s what’s in `src/main.rs`:
+
+```
+fn main() {
+ println!("Hello, world!");
+}
+```
+
+Cargo generated a “hello world” for us. Let’s compile it:
+
+```shell
+$ cargo build
+ Compiling hello_world v0.1.0 (file:///path/to/project/hello_world)
+```
+
+And then run it:
+
+```shell
+$ ./target/debug/hello_world
+Hello, world!
+```
+
+We can also use `cargo run` to compile and then run it, all in one step:
+
+```shell
+$ cargo run
+ Fresh hello_world v0.1.0 (file:///path/to/project/hello_world)
+ Running `target/debug/hello_world`
+Hello, world!
+```
+
+# Going further
+
+For more details on using Cargo, check out the [Cargo Guide](guide.html)
--- /dev/null
+//= require_tree .
+
+Prism.languages.toml = {
+ // https://github.com/LeaVerou/prism/issues/307
+ 'comment': [{
+ pattern: /(^[^"]*?("[^"]*?"[^"]*?)*?[^"\\]*?)(\/\*[\w\W]*?\*\/|(^|[^:])#.*?(\r?\n|$))/g,
+ lookbehind: true
+ }],
+ 'string': /("|')(\\?.)*?\1/g,
+ 'number': /\d+/,
+ 'boolean': /true|false/,
+ 'toml-section': /\[.*\]/,
+ 'toml-key': /[\w-]+/
+};
+
+$(function() {
+ var pres = document.querySelectorAll('pre.rust');
+ for (var i = 0; i < pres.length; i++) {
+ pres[i].className += ' language-rust';
+ }
+
+ // Toggles docs menu
+ $('button.dropdown, a.dropdown').click(function(e) {
+ $(this).toggleClass('active').siblings('ul').toggleClass('open');
+
+ return false;
+ });
+
+ // A click in the page anywhere but in the menu will turn the menu off
+ $(document).on('click', function(e) {
+ // Check whether the click came from inside the dropdown menu;
+ // if it did not, we close the menu;
+ // otherwise, we do nothing and just follow the link
+ if (!$(e.target).closest('ul.dropdown').length) {
+ var toggles = $('button.dropdown.active, a.dropdown.active');
+ toggles.toggleClass('active').siblings('ul').toggleClass('open');
+
+ }
+ });
+});
--- /dev/null
+/* http://prismjs.com/download.html?themes=prism&languages=markup+css+clike+javascript */
+self="undefined"!=typeof window?window:"undefined"!=typeof WorkerGlobalScope&&self instanceof WorkerGlobalScope?self:{};var Prism=function(){var e=/\blang(?:uage)?-(?!\*)(\w+)\b/i,t=self.Prism={util:{encode:function(e){return e instanceof n?new n(e.type,t.util.encode(e.content),e.alias):"Array"===t.util.type(e)?e.map(t.util.encode):e.replace(/&/g,"&").replace(/</g,"<").replace(/\u00a0/g," ")},type:function(e){return Object.prototype.toString.call(e).match(/\[object (\w+)\]/)[1]},clone:function(e){var n=t.util.type(e);switch(n){case"Object":var a={};for(var r in e)e.hasOwnProperty(r)&&(a[r]=t.util.clone(e[r]));return a;case"Array":return e.slice()}return e}},languages:{extend:function(e,n){var a=t.util.clone(t.languages[e]);for(var r in n)a[r]=n[r];return a},insertBefore:function(e,n,a,r){r=r||t.languages;var i=r[e],l={};for(var o in i)if(i.hasOwnProperty(o)){if(o==n)for(var s in a)a.hasOwnProperty(s)&&(l[s]=a[s]);l[o]=i[o]}return r[e]=l},DFS:function(e,n){for(var a in e)n.call(e,a,e[a]),"Object"===t.util.type(e)&&t.languages.DFS(e[a],n)}},highlightAll:function(e,n){for(var a,r=document.querySelectorAll('code[class*="language-"], [class*="language-"] code, code[class*="lang-"], [class*="lang-"] code'),i=0;a=r[i++];)t.highlightElement(a,e===!0,n)},highlightElement:function(a,r,i){for(var l,o,s=a;s&&!e.test(s.className);)s=s.parentNode;if(s&&(l=(s.className.match(e)||[,""])[1],o=t.languages[l]),o){a.className=a.className.replace(e,"").replace(/\s+/g," ")+" language-"+l,s=a.parentNode,/pre/i.test(s.nodeName)&&(s.className=s.className.replace(e,"").replace(/\s+/g," ")+" language-"+l);var c=a.textContent;if(c){var g={element:a,language:l,grammar:o,code:c};if(t.hooks.run("before-highlight",g),r&&self.Worker){var u=new Worker(t.filename);u.onmessage=function(e){g.highlightedCode=n.stringify(JSON.parse(e.data),l),t.hooks.run("before-insert",g),g.element.innerHTML=g.highlightedCode,i&&i.call(g.element),t.hooks.run("after-highlight",g)},u.postMessage(JSON.stringify({language:g.language,code:g.code}))}else g.highlightedCode=t.highlight(g.code,g.grammar,g.language),t.hooks.run("before-insert",g),g.element.innerHTML=g.highlightedCode,i&&i.call(a),t.hooks.run("after-highlight",g)}}},highlight:function(e,a,r){var i=t.tokenize(e,a);return n.stringify(t.util.encode(i),r)},tokenize:function(e,n){var a=t.Token,r=[e],i=n.rest;if(i){for(var l in i)n[l]=i[l];delete n.rest}e:for(var l in n)if(n.hasOwnProperty(l)&&n[l]){var o=n[l];o="Array"===t.util.type(o)?o:[o];for(var s=0;s<o.length;++s){var c=o[s],g=c.inside,u=!!c.lookbehind,f=0,h=c.alias;c=c.pattern||c;for(var p=0;p<r.length;p++){var d=r[p];if(r.length>e.length)break e;if(!(d instanceof a)){c.lastIndex=0;var m=c.exec(d);if(m){u&&(f=m[1].length);var y=m.index-1+f,m=m[0].slice(f),v=m.length,k=y+v,b=d.slice(0,y+1),w=d.slice(k+1),N=[p,1];b&&N.push(b);var O=new a(l,g?t.tokenize(m,g):m,h);N.push(O),w&&N.push(w),Array.prototype.splice.apply(r,N)}}}}}return r},hooks:{all:{},add:function(e,n){var a=t.hooks.all;a[e]=a[e]||[],a[e].push(n)},run:function(e,n){var a=t.hooks.all[e];if(a&&a.length)for(var r,i=0;r=a[i++];)r(n)}}},n=t.Token=function(e,t,n){this.type=e,this.content=t,this.alias=n};if(n.stringify=function(e,a,r){if("string"==typeof e)return e;if("[object Array]"==Object.prototype.toString.call(e))return e.map(function(t){return n.stringify(t,a,e)}).join("");var i={type:e.type,content:n.stringify(e.content,a,r),tag:"span",classes:["token",e.type],attributes:{},language:a,parent:r};if("comment"==i.type&&(i.attributes.spellcheck="true"),e.alias){var 
l="Array"===t.util.type(e.alias)?e.alias:[e.alias];Array.prototype.push.apply(i.classes,l)}t.hooks.run("wrap",i);var o="";for(var s in i.attributes)o+=s+'="'+(i.attributes[s]||"")+'"';return"<"+i.tag+' class="'+i.classes.join(" ")+'" '+o+">"+i.content+"</"+i.tag+">"},!self.document)return self.addEventListener?(self.addEventListener("message",function(e){var n=JSON.parse(e.data),a=n.language,r=n.code;self.postMessage(JSON.stringify(t.util.encode(t.tokenize(r,t.languages[a])))),self.close()},!1),self.Prism):self.Prism;var a=document.getElementsByTagName("script");return a=a[a.length-1],a&&(t.filename=a.src,document.addEventListener&&!a.hasAttribute("data-manual")&&document.addEventListener("DOMContentLoaded",t.highlightAll)),self.Prism}();"undefined"!=typeof module&&module.exports&&(module.exports=Prism);;
+Prism.languages.markup={comment:/<!--[\w\W]*?-->/g,prolog:/<\?.+?\?>/,doctype:/<!DOCTYPE.+?>/,cdata:/<!\[CDATA\[[\w\W]*?]]>/i,tag:{pattern:/<\/?[\w:-]+\s*(?:\s+[\w:-]+(?:=(?:("|')(\\?[\w\W])*?\1|[^\s'">=]+))?\s*)*\/?>/gi,inside:{tag:{pattern:/^<\/?[\w:-]+/i,inside:{punctuation:/^<\/?/,namespace:/^[\w-]+?:/}},"attr-value":{pattern:/=(?:('|")[\w\W]*?(\1)|[^\s>]+)/gi,inside:{punctuation:/=|>|"/g}},punctuation:/\/?>/g,"attr-name":{pattern:/[\w:-]+/g,inside:{namespace:/^[\w-]+?:/}}}},entity:/\&#?[\da-z]{1,8};/gi},Prism.hooks.add("wrap",function(t){"entity"===t.type&&(t.attributes.title=t.content.replace(/&/,"&"))});;
+Prism.languages.css={comment:/\/\*[\w\W]*?\*\//g,atrule:{pattern:/@[\w-]+?.*?(;|(?=\s*{))/gi,inside:{punctuation:/[;:]/g}},url:/url\((["']?).*?\1\)/gi,selector:/[^\{\}\s][^\{\};]*(?=\s*\{)/g,property:/(\b|\B)[\w-]+(?=\s*:)/gi,string:/("|')(\\?.)*?\1/g,important:/\B!important\b/gi,punctuation:/[\{\};:]/g,"function":/[-a-z0-9]+(?=\()/gi},Prism.languages.markup&&Prism.languages.insertBefore("markup","tag",{style:{pattern:/<style[\w\W]*?>[\w\W]*?<\/style>/gi,inside:{tag:{pattern:/<style[\w\W]*?>|<\/style>/gi,inside:Prism.languages.markup.tag.inside},rest:Prism.languages.css}}});;
+Prism.languages.clike={comment:[{pattern:/(^|[^\\])\/\*[\w\W]*?\*\//g,lookbehind:!0},{pattern:/(^|[^\\:])\/\/.*?(\r?\n|$)/g,lookbehind:!0}],string:/("|')(\\?.)*?\1/g,"class-name":{pattern:/((?:(?:class|interface|extends|implements|trait|instanceof|new)\s+)|(?:catch\s+\())[a-z0-9_\.\\]+/gi,lookbehind:!0,inside:{punctuation:/(\.|\\)/}},keyword:/\b(if|else|while|do|for|return|in|instanceof|function|new|try|throw|catch|finally|null|break|continue)\b/g,"boolean":/\b(true|false)\b/g,"function":{pattern:/[a-z0-9_]+\(/gi,inside:{punctuation:/\(/}},number:/\b-?(0x[\dA-Fa-f]+|\d*\.?\d+([Ee]-?\d+)?)\b/g,operator:/[-+]{1,2}|!|<=?|>=?|={1,3}|&{1,2}|\|?\||\?|\*|\/|\~|\^|\%/g,ignore:/&(lt|gt|amp);/gi,punctuation:/[{}[\];(),.:]/g};;
+Prism.languages.javascript=Prism.languages.extend("clike",{keyword:/\b(break|case|catch|class|const|continue|debugger|default|delete|do|else|enum|export|extends|false|finally|for|function|get|if|implements|import|in|instanceof|interface|let|new|null|package|private|protected|public|return|set|static|super|switch|this|throw|true|try|typeof|var|void|while|with|yield)\b/g,number:/\b-?(0x[\dA-Fa-f]+|\d*\.?\d+([Ee]-?\d+)?|NaN|-?Infinity)\b/g}),Prism.languages.insertBefore("javascript","keyword",{regex:{pattern:/(^|[^/])\/(?!\/)(\[.+?]|\\.|[^/\r\n])+\/[gim]{0,3}(?=\s*($|[\r\n,.;})]))/g,lookbehind:!0}}),Prism.languages.markup&&Prism.languages.insertBefore("markup","tag",{script:{pattern:/<script[\w\W]*?>[\w\W]*?<\/script>/gi,inside:{tag:{pattern:/<script[\w\W]*?>|<\/script>/gi,inside:Prism.languages.markup.tag.inside},rest:Prism.languages.javascript}}});;
--- /dev/null
+% The Manifest Format
+
+# The `[package]` section
+
+The first section in a `Cargo.toml` is `[package]`.
+
+```toml
+[package]
+name = "hello_world" # the name of the package
+version = "0.1.0" # the current version, obeying semver
+authors = ["you@example.com"]
+```
+
+All three of these fields are mandatory.
+
+## The `version` field
+
+Cargo bakes in the concept of [Semantic
+Versioning](http://semver.org/), so make sure you follow some basic rules:
+
+* Before you reach 1.0.0, anything goes, but if you make breaking changes,
+ increment the minor version. In Rust, breaking changes include adding fields to
+ structs or variants to enums.
+* After 1.0.0, only make breaking changes when you increment the major version.
+ Don’t break the build.
+* After 1.0.0, don’t add any new public API (no new `pub` anything) in tiny
+ versions. Always increment the minor version if you add any new `pub` structs,
+ traits, fields, types, functions, methods or anything else.
+* Use version numbers with three numeric parts such as 1.0.0 rather than 1.0.
+
+## The `build` field (optional)
+
+This field specifies a file in the project root which is a [build script][1] for
+building native code. More information can be found in the build script
+[guide][1].
+
+[1]: build-script.html
+
+```toml
+[package]
+# ...
+build = "build.rs"
+```
+
+## The `documentation` field (optional)
+
+This field specifies a URL to a website hosting the crate's documentation.
+If no URL is specified in the manifest file, [crates.io][cratesio] will
+automatically link your crate to the corresponding [docs.rs][docsrs] page.
+
+Documentation links from specific hosts are blacklisted. Hosts are added
+to the blacklist if they are known to not be hosting documentation and are
+possibly of malicious intent e.g. ad tracking networks. URLs from the
+following hosts are blacklisted:
+
+* rust-ci.org
+
+Documentation URLs from blacklisted hosts will not appear on crates.io, and
+may be replaced by docs.rs links.
+
+[docsrs]: https://docs.rs/
+[cratesio]: https://crates.io/
+
+## The `exclude` and `include` fields (optional)
+
+You can explicitly specify to Cargo that a set of [globs][globs] should be
+ignored or included for the purposes of packaging and rebuilding a package. The
+globs specified in the `exclude` field identify a set of files that are not
+included when a package is published, and that are also ignored when detecting
+whether to rebuild a package. The globs in `include` specify files that are
+explicitly included.
+
+If a VCS is being used for a package, the `exclude` field will be seeded with
+the VCS’ ignore settings (`.gitignore` for git for example).
+
+```toml
+[package]
+# ...
+exclude = ["build/**/*.o", "doc/**/*.html"]
+```
+
+```toml
+[package]
+# ...
+include = ["src/**/*", "Cargo.toml"]
+```
+
+The options are mutually exclusive: setting `include` will override an
+`exclude`. Note that `include` must be an exhaustive list of files as otherwise
+necessary source files may not be included.
+
+[globs]: http://doc.rust-lang.org/glob/glob/struct.Pattern.html
+
+### Migrating to `gitignore`-like pattern matching
+
+The current interpretation of these configs is based on UNIX Globs, as
+implemented in the [`glob` crate](https://crates.io/crates/glob). We want
+Cargo's `include` and `exclude` configs to work as similar to `gitignore` as
+possible. [The `gitignore` specification](https://git-scm.com/docs/gitignore) is
+also based on Globs, but has a bunch of additional features that enable easier
+pattern writing and more control. Therefore, we are migrating the interpretation
+for the rules of these configs to use the [`ignore`
+crate](https://crates.io/crates/ignore), and treat each rule as a single
+line in a `gitignore` file. See [the tracking
+issue](https://github.com/rust-lang/cargo/issues/4268) for more details on the
+migration.
+
+## The `publish` field (optional)
+
+The `publish` field can be used to prevent a package from being published to a
+package registry (like *crates.io*) by mistake.
+
+```toml
+[package]
+# ...
+publish = false
+```
+
+## The `workspace` field (optional)
+
+The `workspace` field can be used to configure the workspace that this package
+will be a member of. If not specified, this is inferred to be the first
+`Cargo.toml` containing `[workspace]` found by searching upward in the
+filesystem.
+
+```toml
+[package]
+# ...
+workspace = "path/to/workspace/root"
+```
+
+For more information, see the documentation for the workspace table below.
+
+## Package metadata
+
+There are a number of optional metadata fields also accepted under the
+`[package]` section:
+
+```toml
+[package]
+# ...
+
+# A short blurb about the package. This is not rendered in any format when
+# uploaded to crates.io (i.e. this is not markdown).
+description = "..."
+
+# These URLs point to more information about the package. These are
+# intended to be webviews of the relevant data, not necessarily compatible
+# with VCS tools and the like.
+documentation = "..."
+homepage = "..."
+repository = "..."
+
+# This points to a file under the package root (relative to this `Cargo.toml`).
+# The contents of this file are stored and indexed in the registry.
+# crates.io will render this file and place the result on the crate's page.
+readme = "..."
+
+# This is a list of up to five keywords that describe this crate. Keywords
+# are searchable on crates.io, and you may choose any words that would
+# help someone find this crate.
+keywords = ["...", "..."]
+
+# This is a list of up to five categories where this crate would fit.
+# Categories are a fixed list available at crates.io/category_slugs, and
+# they must match exactly.
+categories = ["...", "..."]
+
+# This is a string description of the license for this package. Currently
+# crates.io will validate the license provided against a whitelist of known
+# license identifiers from http://spdx.org/licenses/. Multiple licenses can be
+# separated with a `/`.
+license = "..."
+
+# If a project is using a nonstandard license, then this key may be specified in
+# lieu of the above key and must point to a file relative to this manifest
+# (similar to the readme key).
+license-file = "..."
+
+# Optional specification of badges to be displayed on crates.io.
+#
+# - The badges pertaining to build status that are currently available are
+# Appveyor, CircleCI, GitLab, and TravisCI.
+# - Available badges pertaining to code test coverage are Codecov and
+# Coveralls.
+# - There are also maintenance-related badges based on isitmaintained.com
+# which state the issue resolution time, percent of open issues, and future
+# maintenance intentions.
+#
+# If a `repository` key is required, this refers to a repository in
+# `user/repo` format.
+[badges]
+
+# Appveyor: `repository` is required. `branch` is optional; default is `master`.
+# `service` is optional; valid values are `github` (default), `bitbucket`, and
+# `gitlab`. `id` is optional; you can specify the appveyor project id if you
+# want to use that instead. `project_name` is optional; use when the repository
+# name differs from the appveyor project name.
+appveyor = { repository = "...", branch = "master", service = "github" }
+
+# Circle CI: `repository` is required. `branch` is optional; default is `master`
+circle-ci = { repository = "...", branch = "master" }
+
+# GitLab: `repository` is required. `branch` is optional; default is `master`
+gitlab = { repository = "...", branch = "master" }
+
+# Travis CI: `repository` in format "<user>/<project>" is required.
+# `branch` is optional; default is `master`
+travis-ci = { repository = "...", branch = "master" }
+
+# Codecov: `repository` is required. `branch` is optional; default is `master`
+# `service` is optional; valid values are `github` (default), `bitbucket`, and
+# `gitlab`.
+codecov = { repository = "...", branch = "master", service = "github" }
+
+# Coveralls: `repository` is required. `branch` is optional; default is `master`
+# `service` is optional; valid values are `github` (default) and `bitbucket`.
+coveralls = { repository = "...", branch = "master", service = "github" }
+
+# Is it maintained resolution time: `repository` is required.
+is-it-maintained-issue-resolution = { repository = "..." }
+
+# Is it maintained percentage of open issues: `repository` is required.
+is-it-maintained-open-issues = { repository = "..." }
+
+# Maintenance: `status` is required. Available options are `actively-developed`,
+# `passively-maintained`, `as-is`, `none`, `experimental`, `looking-for-maintainer`
+# and `deprecated`.
+maintenance = { status = "..." }
+```
+
+The [crates.io](https://crates.io) registry will render the description, display
+the license, link to the three URLs, and categorize by the keywords. These keys
+provide useful information to users of the registry and also influence the
+search ranking of a crate. Omitting all of this metadata in a published crate
+is highly discouraged.
+
+## The `metadata` table (optional)
+
+Cargo by default will warn about unused keys in `Cargo.toml` to assist in
+detecting typos and such. The `package.metadata` table, however, is completely
+ignored by Cargo and will not be warned about. This section can be used for
+tools which would like to store project configuration in `Cargo.toml`. For
+example:
+
+```toml
+[package]
+name = "..."
+# ...
+
+# Metadata used when generating an Android APK, for example.
+[package.metadata.android]
+package-name = "my-awesome-android-app"
+assets = "path/to/static"
+```
+
+# Dependency sections
+
+See the [specifying dependencies page](specifying-dependencies.html) for
+information on the `[dependencies]`, `[dev-dependencies]`,
+`[build-dependencies]`, and target-specific `[target.*.dependencies]` sections.
+
+# The `[profile.*]` sections
+
+Cargo supports custom configuration of how rustc is invoked through profiles at
+the top level. Any manifest may declare a profile, but only the top level
+project’s profiles are actually read. All dependencies’ profiles will be
+overridden. This is done so the top-level project has control over how its
+dependencies are compiled.
+
+There are five currently supported profile names, all of which have the same
+configuration available to them. Listed below is the configuration available,
+along with the defaults for each profile.
+
+```toml
+# The development profile, used for `cargo build`.
+[profile.dev]
+opt-level = 0 # controls the `--opt-level` the compiler builds with.
+ # 0-1 is good for debugging. 2 is well-optimized. Max is 3.
+debug = true # include debug information (debug symbols). Equivalent to
+ # `-C debuginfo=2` compiler flag.
+rpath = false # controls whether compiler should set loader paths.
+ # If true, passes `-C rpath` flag to the compiler.
+lto = false # Link Time Optimization usually reduces size of binaries
+ # and static libraries. Increases compilation time.
+ # If true, passes `-C lto` flag to the compiler.
+debug-assertions = true # controls whether debug assertions are enabled
+ # (e.g. debug_assert!() and arithmetic overflow checks)
+codegen-units = 1 # if > 1 enables parallel code generation which improves
+ # compile times, but prevents some optimizations.
+ # Passes `-C codegen-units`. Ignored when `lto = true`.
+panic = 'unwind' # panic strategy (`-C panic=...`), can also be 'abort'
+
+# The release profile, used for `cargo build --release`.
+[profile.release]
+opt-level = 3
+debug = false
+rpath = false
+lto = false
+debug-assertions = false
+codegen-units = 1
+panic = 'unwind'
+
+# The testing profile, used for `cargo test`.
+[profile.test]
+opt-level = 0
+debug = 2
+rpath = false
+lto = false
+debug-assertions = true
+codegen-units = 1
+panic = 'unwind'
+
+# The benchmarking profile, used for `cargo bench` and `cargo test --release`.
+[profile.bench]
+opt-level = 3
+debug = false
+rpath = false
+lto = false
+debug-assertions = false
+codegen-units = 1
+panic = 'unwind'
+
+# The documentation profile, used for `cargo doc`.
+[profile.doc]
+opt-level = 0
+debug = 2
+rpath = false
+lto = false
+debug-assertions = true
+codegen-units = 1
+panic = 'unwind'
+```
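+
+For example, to turn on link time optimization only for release builds, a
+top-level project might override just the keys it cares about:
+
+```toml
+[profile.release]
+lto = true
+```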
+
+# The `[features]` section
+
+Cargo supports features to allow expression of:
+
+* conditional compilation options (usable through `cfg` attributes);
+* optional dependencies, which enhance a package, but are not required; and
+* clusters of optional dependencies, such as `postgres`, that would include the
+ `postgres` package, the `postgres-macros` package, and possibly other packages
+ (such as development-time mocking libraries, debugging tools, etc.).
+
+A feature of a package is either an optional dependency, or a set of other
+features. The format for specifying features is:
+
+```toml
+[package]
+name = "awesome"
+
+[features]
+# The default set of optional packages. Most people will want to use these
+# packages, but they are strictly optional. Note that `session` is not a package
+# but rather another feature listed in this manifest.
+default = ["jquery", "uglifier", "session"]
+
+# A feature with no dependencies is used mainly for conditional compilation,
+# like `#[cfg(feature = "go-faster")]`.
+go-faster = []
+
+# The `secure-password` feature depends on the bcrypt package. This aliasing
+# will allow people to talk about the feature in a higher-level way and allow
+# this package to add more requirements to the feature in the future.
+secure-password = ["bcrypt"]
+
+# Features can be used to reexport features of other packages. The `session`
+# feature of package `awesome` will ensure that the `session` feature of the
+# package `cookie` is also enabled.
+session = ["cookie/session"]
+
+[dependencies]
+# These packages are mandatory and form the core of this package’s distribution.
+cookie = "1.2.0"
+oauth = "1.1.0"
+route-recognizer = "=2.1.0"
+
+# A list of all of the optional dependencies, some of which are included in the
+# above `features`. They can be opted into by apps.
+jquery = { version = "1.0.2", optional = true }
+uglifier = { version = "1.5.3", optional = true }
+bcrypt = { version = "*", optional = true }
+civet = { version = "*", optional = true }
+```
+
+To use the package `awesome`:
+
+```toml
+[dependencies.awesome]
+version = "1.3.5"
+default-features = false # do not include the default features, and optionally
+ # cherry-pick individual features
+features = ["secure-password", "civet"]
+```
+
+## Rules
+
+The usage of features is subject to a few rules:
+
+* Feature names must not conflict with other package names in the manifest. This
+ is because they are opted into via `features = [...]`, which only has a single
+ namespace.
+* With the exception of the `default` feature, all features are opt-in. To opt
+ out of the default feature, use `default-features = false` and cherry-pick
+ individual features.
+* Feature groups are not allowed to cyclically depend on one another.
+* Dev-dependencies cannot be optional.
+* Feature groups can only reference optional dependencies.
+* When a feature is selected, Cargo will call `rustc` with `--cfg
+ feature="${feature_name}"`. If a feature group is included, it and all of its
+ individual features will be included. This can be tested in code via
+ `#[cfg(feature = "foo")]`.
+
+Note that it is explicitly allowed for features to not actually activate any
+optional dependencies. This allows packages to internally enable/disable
+features without requiring a new dependency.
+
+## Usage in end products
+
+One major use-case for this feature is specifying optional features in
+end-products. For example, the Servo project may want to include optional
+features that people can enable or disable when they build it.
+
+In that case, Servo will describe features in its `Cargo.toml` and they can be
+enabled using command-line flags:
+
+```
+$ cargo build --release --features "shumway pdf"
+```
+
+Default features could be excluded using `--no-default-features`.
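+
+For example, building on the `shumway`/`pdf` features above:
+
+```
+$ cargo build --release --no-default-features --features "pdf"
+```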
+
+## Usage in packages
+
+In most cases, the concept of *optional dependency* in a library is best
+expressed as a separate package that the top-level application depends on.
+
+However, high-level packages, like Iron or Piston, may want the ability to
+curate a number of packages for easy installation. The current Cargo system
+allows them to curate a number of mandatory dependencies into a single package
+for easy installation.
+
+In some cases, packages may want to provide additional curation for optional
+dependencies:
+
+* grouping a number of low-level optional dependencies together into a single
+ high-level feature;
+* specifying packages that are recommended (or suggested) to be included by
+ users of the package; and
+* including a feature (like `secure-password` in the motivating example) that
+ will only work if an optional dependency is available, and would be difficult
+ to implement as a separate package (for example, it may be overly difficult to
+ design an IO package to be completely decoupled from OpenSSL, with opt-in via
+ the inclusion of a separate package).
+
+In almost all cases, it is an antipattern to use these features outside of
+high-level packages that are designed for curation. If a feature is optional, it
+can almost certainly be expressed as a separate package.
+
+# The `[workspace]` section
+
+Projects can define a workspace, which is a set of crates that all share the
+same `Cargo.lock` and output directory. The `[workspace]` table can be defined
+as:
+
+```toml
+[workspace]
+
+# Optional key, inferred from path dependencies if not present.
+# Additional non-path dependencies that should be included must be given here.
+# In particular, for a virtual manifest, all members have to be listed.
+members = ["path/to/member1", "path/to/member2", "path/to/member3/*"]
+
+# Optional key, empty if not present.
+exclude = ["path1", "path/to/dir2"]
+```
+
+Workspaces were added to Cargo as part of [RFC 1525] and have a number of
+properties:
+
+* A workspace can contain multiple crates where one of them is the *root crate*.
+* The *root crate*'s `Cargo.toml` contains the `[workspace]` table, but is not
+ required to have other configuration.
+* Whenever any crate in the workspace is compiled, output is placed in the
+  *workspace root*, i.e. next to the *root crate*'s `Cargo.toml`.
+* The lock file for all crates in the workspace resides in the *workspace root*.
+* The `[patch]` and `[replace]` sections in `Cargo.toml` are only recognized
+ in the *root crate*'s manifest, and ignored in member crates' manifests.
+
+[RFC 1525]: https://github.com/rust-lang/rfcs/blob/master/text/1525-cargo-workspace.md
+
+The *root crate* of a workspace, indicated by the presence of `[workspace]` in
+its manifest, is responsible for defining the entire workspace. All `path`
+dependencies residing in the workspace directory become members. You can add
+additional packages to the workspace by listing them in the `members` key. Note
+that members of the workspace listed explicitly will also have their path
+dependencies included in the workspace. Sometimes a project may have a lot of
+workspace members, making the list onerous to keep up to date; the `members`
+key can also use [globs][globs] to match multiple paths. Finally, the `exclude`
+key can be used to blacklist paths from being included in a workspace. This can
+be useful if some path dependencies aren't desired to be in the workspace at
+all.
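+
+As a sketch, a workspace using the keys above might be laid out like this (the
+paths are illustrative):
+
+```notrust
+▾ my-workspace/
+  Cargo.toml         # contains [workspace] and the root crate's [package]
+  ▾ src/
+    lib.rs
+  ▾ path/to/member1/ # a `members` entry and/or path dependency
+    Cargo.toml
+    ▾ src/
+      lib.rs
+```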
+
+The `package.workspace` manifest key (described above) is used in member crates
+to point at a workspace's root crate. If this key is omitted then it is inferred
+to be the first crate whose manifest contains `[workspace]` upwards in the
+filesystem.
+
+A crate may either specify `package.workspace` or specify `[workspace]`. That
+is, a crate cannot both be a root crate in a workspace (contain `[workspace]`)
+and also be a member crate of another workspace (contain `package.workspace`).
+
+Most of the time workspaces will not need to be dealt with directly, as
+`cargo new` and `cargo init` will handle workspace configuration automatically.
+
+## Virtual Manifest
+
+In workspace manifests, if the `package` table is present, the workspace root
+crate will be treated as a normal package, as well as a workspace. If the
+`package` table is not present in a workspace manifest, it is called a *virtual
+manifest*.
+
+When working with *virtual manifests*, package-related cargo commands, like
+`cargo build`, won't be available. However, most such commands support an
+`--all` flag, which executes the command for every non-virtual manifest in the
+workspace.
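+
+A minimal *virtual manifest* is just the workspace table; note the absence of
+`[package]`:
+
+```toml
+[workspace]
+members = ["member1", "member2"]
+```
+
+With such a manifest, `cargo build --all` will build every member of the
+workspace.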
+
+# The project layout
+
+If your project is an executable, name the main source file `src/main.rs`. If it
+is a library, name the main source file `src/lib.rs`.
+
+Cargo will also treat any files located in `src/bin/*.rs` as executables. If your
+executable consists of more than just one source file, you can instead use a
+directory inside `src/bin` containing a `main.rs` file, which will be treated as
+an executable named after the parent directory.
+Do note, however, that once you add a `[[bin]]` section ([see
+below](#configuring-a-target)), Cargo will no longer automatically build files
+located in `src/bin/*.rs`. Instead you must create a `[[bin]]` section for
+each file you want to build.
+
+Your project can optionally contain folders named `examples`, `tests`, and
+`benches`, which Cargo will treat as containing examples,
+integration tests, and benchmarks respectively. Analogous to `bin` targets, they
+may be composed of single files or directories with a `main.rs` file.
+
+```notrust
+▾ src/ # directory containing source files
+ lib.rs # the main entry point for libraries and packages
+ main.rs # the main entry point for projects producing executables
+ ▾ bin/ # (optional) directory containing additional executables
+ *.rs
+ ▾ */ # (optional) directories containing multi-file executables
+ main.rs
+▾ examples/ # (optional) examples
+ *.rs
+ ▾ */ # (optional) directories containing multi-file examples
+ main.rs
+▾ tests/ # (optional) integration tests
+ *.rs
+ ▾ */ # (optional) directories containing multi-file tests
+ main.rs
+▾ benches/ # (optional) benchmarks
+ *.rs
+ ▾ */ # (optional) directories containing multi-file benchmarks
+ main.rs
+```
+
+To structure your code after you've created the files and folders for your
+project, you should remember to use Rust's module system, which you can read
+about in [the book](https://doc.rust-lang.org/book/crates-and-modules.html).
+
+# Examples
+
+Files located under `examples` are example uses of the functionality provided by
+the library. When compiled, they are placed in the `target/examples` directory.
+
+They compile either as executables (with a `main()` function) or as libraries,
+and can pull in the package's library by using `extern crate <library-name>`.
+They are compiled when you run your tests to protect them from bitrotting.
+
+You can run individual executable examples with the command `cargo run --example
+<example-name>`.
+
+Specify `crate-type` to make an example be compiled as a library:
+
+```toml
+[[example]]
+name = "foo"
+crate-type = ["staticlib"]
+```
+
+You can build individual library examples with the command `cargo build
+--example <example-name>`.
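+
+For instance, with the `foo` example above:
+
+```
+$ cargo build --example foo
+```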
+
+# Tests
+
+When you run `cargo test`, Cargo will:
+
+* compile and run your library’s unit tests, which are in the files reachable
+ from `lib.rs` (naturally, any sections marked with `#[cfg(test)]` will be
+ considered at this stage);
+* compile and run your library’s documentation tests, which are embedded inside
+ of documentation blocks;
+* compile and run your library’s [integration tests](#integration-tests); and
+* compile your library’s examples.
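+
+A test name filter can be passed to narrow what runs; for example, to run only
+the tests whose names contain `foo`:
+
+```
+$ cargo test foo
+```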
+
+## Integration tests
+
+Each file in `tests/*.rs` is an integration test. When you run `cargo test`,
+Cargo will compile each of these files as a separate crate. The crate can link
+to your library by using `extern crate <library-name>`, like any other code that
+depends on it.
+
+Cargo will not automatically compile files inside subdirectories of `tests`, but
+an integration test can import modules from these directories as usual. For
+example, if you want several integration tests to share some code, you can put
+the shared code in `tests/common/mod.rs` and then put `mod common;` in each of
+the test files.
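+
+As a sketch, assuming a library crate named `adder` with a hypothetical
+`add_two` function and a shared `setup` helper in `tests/common/mod.rs`:
+
+```rust
+// tests/integration.rs: compiled by `cargo test` as a separate crate.
+extern crate adder; // link against the library under test
+
+mod common; // pulls in the shared code from tests/common/mod.rs
+
+#[test]
+fn adds_two() {
+    common::setup(); // hypothetical shared setup helper
+    assert_eq!(adder::add_two(2), 4);
+}
+```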
+
+# Configuring a target
+
+All of the `[[bin]]`, `[lib]`, `[[bench]]`, `[[test]]`, and `[[example]]`
+sections support similar configuration for specifying how a target should be
+built. The double-bracket sections like `[[bin]]` are TOML
+[arrays of tables](https://github.com/toml-lang/toml#array-of-tables), which
+means you can write more than one `[[bin]]` section to make several executables
+in your crate.
+
+The example below uses `[lib]`, but it applies to all other sections as well.
+All values listed are the defaults for that option unless otherwise specified.
+
+```toml
+[package]
+# ...
+
+[lib]
+# The name of a target is the name of the library that will be generated. This
+# defaults to the name of the package or project, with any dashes replaced
+# with underscores. (Rust `extern crate` declarations reference this name;
+# therefore the value must be a valid Rust identifier to be usable.)
+name = "foo"
+
+# This field points at where the crate is located, relative to the `Cargo.toml`.
+path = "src/lib.rs"
+
+# A flag for enabling unit tests for this target. This is used by `cargo test`.
+test = true
+
+# A flag for enabling documentation tests for this target. This is only relevant
+# for libraries; it has no effect on other sections. This is used by
+# `cargo test`.
+doctest = true
+
+# A flag for enabling benchmarks for this target. This is used by `cargo bench`.
+bench = true
+
+# A flag for enabling documentation of this target. This is used by `cargo doc`.
+doc = true
+
+# If the target is meant to be a compiler plugin, this field must be set to true
+# for Cargo to correctly compile it and make it available for all dependencies.
+plugin = false
+
+# If the target is meant to be a "macros 1.1" procedural macro, this field must
+# be set to true.
+proc-macro = false
+
+# If set to false, `cargo test` will omit the `--test` flag to rustc, which
+# stops it from generating a test harness. This is useful when the binary being
+# built manages the test runner itself.
+harness = true
+```
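+
+Since these sections are arrays of tables, a crate with several executables
+simply repeats the section; for instance (the names are illustrative):
+
+```toml
+[[bin]]
+name = "daemon"
+path = "src/bin/daemon.rs"
+
+[[bin]]
+name = "ctl"
+path = "src/bin/ctl.rs"
+```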
+
+## The `required-features` field (optional)
+
+The `required-features` field specifies which features the target needs in order
+to be built. If any of the required features are not selected, the target will
+be skipped. This is only relevant for the `[[bin]]`, `[[bench]]`, `[[test]]`,
+and `[[example]]` sections; it has no effect on `[lib]`.
+
+```toml
+[features]
+# ...
+postgres = []
+sqlite = []
+tools = []
+
+[[bin]]
+# ...
+required-features = ["postgres", "tools"]
+```
+
+# Building dynamic or static libraries
+
+If your project produces a library, you can specify which kind of library to
+build by explicitly listing the library in your `Cargo.toml`:
+
+```toml
+# ...
+
+[lib]
+name = "..."
+crate-type = ["dylib"] # could be `staticlib` as well
+```
+
+The available options are `dylib`, `rlib`, `staticlib`, `cdylib`, and
+`proc-macro`. You should only use this option in a project. Cargo will always
+compile packages (dependencies) based on the requirements of the project that
+includes them.
+
+You can read more about the different crate types in the
+[Rust Reference Manual](https://doc.rust-lang.org/reference/linkage.html).
+
+# The `[patch]` Section
+
+This section of Cargo.toml can be used to [override dependencies][replace] with
+other copies. The syntax is similar to the `[dependencies]` section:
+
+```toml
+[patch.crates-io]
+foo = { git = 'https://github.com/example/foo' }
+bar = { path = 'my/local/bar' }
+```
+
+The `[patch]` table is made of dependency-like sub-tables. Each key after
+`[patch]` is a URL of the source that's being patched, or `crates-io` if
+you're modifying the https://crates.io registry. In the example above
+`crates-io` could be replaced with a git URL such as
+`https://github.com/rust-lang-nursery/log`.
+
+Each entry in these tables is a normal dependency specification, the same as
+found in the `[dependencies]` section of the manifest. The dependencies listed
+in the `[patch]` section are resolved and used to patch the source at the
+URL specified. The above manifest snippet patches the `crates-io` source (i.e.
+crates.io itself) with the `foo` crate and `bar` crate.
+
+Sources can be patched with versions of crates that do not exist, and they can
+also be patched with versions of crates that already exist. If a source is
+patched with a crate version that already exists in the source, then the
+source's original crate is replaced.
+
+More information about overriding dependencies can be found in the [overriding
+dependencies][replace] section of the documentation and [RFC 1969] for the
+technical specification of this feature. Note that the `[patch]` feature will
+first become available in Rust 1.21, set to be released on 2017-10-12.
+
+[RFC 1969]: https://github.com/rust-lang/rfcs/pull/1969
+[replace]: specifying-dependencies.html#overriding-dependencies
+
+# The `[replace]` Section
+
+This section of Cargo.toml can be used to [override dependencies][replace] with
+other copies. The syntax is similar to the `[dependencies]` section:
+
+```toml
+[replace]
+"foo:0.1.0" = { git = 'https://github.com/example/foo' }
+"bar:1.0.2" = { path = 'my/local/bar' }
+```
+
+Each key in the `[replace]` table is a [package id
+specification](pkgid-spec.html) which allows arbitrarily choosing a node in the
+dependency graph to override. The value of each key is the same as the
+`[dependencies]` syntax for specifying dependencies, except that you can't
+specify features. Note that when a crate is overridden the copy it's overridden
+with must have both the same name and version, but it can come from a different
+source (e.g. git or a local path).
+
+More information about overriding dependencies can be found in the [overriding
+dependencies][replace] section of the documentation.
--- /dev/null
+% Package ID Specifications
+
+Subcommands of Cargo frequently need to refer to a particular package within a
+dependency graph for various operations like updating, cleaning, building, etc.
+To solve this problem, Cargo supports Package ID Specifications. A specification
+is a string which is used to uniquely refer to one package within a graph of
+packages.
+
+## Specification grammar
+
+The formal grammar for a Package ID Specification is:
+
+```notrust
+pkgid := pkgname
+ | [ proto "://" ] hostname-and-path [ "#" ( pkgname | semver ) ]
+pkgname := name [ ":" semver ]
+
+proto := "http" | "git" | ...
+```
+
+Here, brackets indicate that the contents are optional.
+
+## Example specifications
+
+These could all be references to a package `foo` version `1.2.3` from the
+registry at `crates.io`:
+
+| pkgid | name | version | url |
+|:-----------------------------|:-----:|:-------:|:----------------------:|
+| `foo` | `foo` | `*` | `*` |
+| `foo:1.2.3` | `foo` | `1.2.3` | `*` |
+| `crates.io/foo` | `foo` | `*` | `*://crates.io/foo` |
+| `crates.io/foo#1.2.3` | `foo` | `1.2.3` | `*://crates.io/foo` |
+| `crates.io/bar#foo:1.2.3` | `foo` | `1.2.3` | `*://crates.io/bar` |
+| `http://crates.io/foo#1.2.3` | `foo` | `1.2.3` | `http://crates.io/foo` |
+
+## Brevity of specifications
+
+The goal of this is to enable both succinct and exhaustive syntaxes for
+referring to packages in a dependency graph. Ambiguous references may refer to
+one or more packages. Most commands generate an error if more than one package
+could be referred to with the same specification.
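+
+For example, `cargo update -p` accepts such a specification; if a plain name is
+ambiguous, a more exhaustive form disambiguates:
+
+```
+$ cargo update -p foo
+$ cargo update -p foo:1.2.3
+```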
--- /dev/null
+<!DOCTYPE html>
+<html lang="en">
+<head>
+ <meta charset="utf-8">
+ <meta http-equiv="refresh" content="0;URL='https://crates.io/policies'" />
+</head>
+<body class="rustdoc">
+ Redirecting to <a href="https://crates.io/policies">https://crates.io/policies</a>...
+</body>
+</html>
--- /dev/null
+% Replacing sources
+
+Cargo supports the ability to **replace one source with another** to express
+strategies along the lines of mirrors or vendoring dependencies. Configuration
+is currently done through the [`.cargo/config` configuration][config] mechanism,
+like so:
+
+[config]: config.html
+
+```toml
+# The `source` table is where all keys related to source-replacement
+# are stored.
+[source]
+
+# Under the `source` table are a number of other tables whose keys are a
+# name for the relevant source. For example this section defines a new
+# source, called `my-awesome-source`, which comes from a directory
+# located at `vendor` relative to the directory containing this `.cargo/config`
+# file.
+[source.my-awesome-source]
+directory = "vendor"
+
+# The crates.io default source for crates is available under the name
+# "crates-io", and here we use the `replace-with` key to indicate that it's
+# replaced with our source above.
+[source.crates-io]
+replace-with = "my-awesome-source"
+```
+
+With this configuration Cargo attempts to look up all crates in the directory
+"vendor" rather than querying the online registry at crates.io. Using source
+replacement Cargo can express:
+
+* Vendoring - custom sources can be defined which represent crates on the local
+ filesystem. These sources are subsets of the source that they're replacing and
+ can be checked into projects if necessary.
+
+* Mirroring - sources can be replaced with an equivalent version which acts as a
+ cache for crates.io itself.
+
+A core assumption Cargo makes about source replacement is that the source code
+is exactly the same from both sources. In our example above, Cargo assumes that
+all of the crates coming from `my-awesome-source` are the exact same as the
+copies from `crates-io`. Note that this also means that `my-awesome-source` is
+not allowed to have crates which are not present in the `crates-io` source.
+
+As a consequence, source replacement is not appropriate for situations such as
+patching a dependency or a private registry. Cargo supports patching
+dependencies through the usage of [the `[replace]` key][replace-section], and
+private registry support is planned for a future version of Cargo.
+
+[replace-section]: manifest.html#the-replace-section
+
+## Configuration
+
+Configuration of replacement sources is done through [`.cargo/config`][config],
+and the full set of available keys is:
+
+```toml
+# Each source has its own table where the key is the name of the source
+[source.the-source-name]
+
+# Indicate that `the-source-name` will be replaced with `another-source`,
+# defined elsewhere
+replace-with = "another-source"
+
+# Available kinds of sources that can be specified (described below)
+registry = "https://example.com/path/to/index"
+local-registry = "path/to/registry"
+directory = "path/to/vendor"
+
+# Git sources can optionally specify a branch/tag/rev as well
+git = "https://example.com/path/to/repo"
+# branch = "master"
+# tag = "v1.0.1"
+# rev = "313f44e8"
+```
+
+The name `crates-io` represents the crates.io online registry (the default
+source of crates) and can be replaced with:
+
+```toml
+[source.crates-io]
+replace-with = 'another-source'
+```
+
+## Registry Sources
+
+A "registry source" is one that is the same as crates.io itself. That is, it has
+an index served in a git repository which matches the format of the
+[crates.io index](https://github.com/rust-lang/crates.io-index). That repository
+then has configuration indicating where to download crates from.
+
+Currently there is not an already-available project for setting up a mirror of
+crates.io. Stay tuned though!
+
+## Local Registry Sources
+
+A "local registry source" is intended to be a subset of another registry
+source, but available on the local filesystem (aka vendoring). Local registries
+are downloaded ahead of time, typically sync'd with a `Cargo.lock`, and are
+made up of a set of `*.crate` files and an index like the normal registry is.
+
+The primary way to manage and create local registry sources is through the
+[`cargo-local-registry`][cargo-local-registry] subcommand, which is available
+on crates.io and can be installed with `cargo install cargo-local-registry`.
+
+[cargo-local-registry]: https://crates.io/crates/cargo-local-registry
+
+Local registries are contained within one directory and contain a number of
+`*.crate` files downloaded from crates.io as well as an `index` directory with
+the same format as the crates.io-index project (populated with just entries for
+the crates that are present).
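+
+As a sketch of a typical workflow (the exact flags of `cargo-local-registry`
+may differ between versions):
+
+```
+$ cargo install cargo-local-registry
+$ cargo local-registry --sync Cargo.lock path/to/registry
+```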
+
+## Directory Sources
+
+A "directory source" is similar to a local registry source where it contains a
+number of crates available on the local filesystem, suitable for vendoring
+dependencies. Also like local registries, directory sources can primarily be
+managed by an external subcommand, [`cargo-vendor`][cargo-vendor], which can be
+installed with `cargo install cargo-vendor`.
+
+[cargo-vendor]: https://crates.io/crates/cargo-vendor
+
+Directory sources are distinct from local registries, though, in that they
+contain the unpacked version of `*.crate` files, making it more suitable in some
+situations to check everything into source control. A directory source is just a
+directory containing a number of other directories which contain the source code
+for crates (the unpacked version of `*.crate` files). Currently no restriction
+is placed on the name of each directory.
+
+Each crate in a directory source also has an associated metadata file indicating
+the checksum of each file in the crate to protect against accidental
+modifications.
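+
+As with local registries, a sketch of the vendoring workflow (the exact
+behavior of `cargo-vendor` may differ between versions):
+
+```
+$ cargo install cargo-vendor
+$ cargo vendor
+```
+
+This populates a `vendor` directory that can then be configured as a
+replacement for the `crates-io` source, as shown at the top of this page.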
--- /dev/null
+% Specifying Dependencies
+
+Your crates can depend on other libraries from [crates.io], `git` repositories, or
+subdirectories on your local file system. You can also temporarily override the
+location of a dependency, for example to test out a bug fix in a dependency
+that you are working on locally. You can have different
+dependencies for different platforms, and dependencies that are only used during
+development. Let's take a look at how to do each of these.
+
+# Specifying dependencies from crates.io
+
+Cargo is configured to look for dependencies on [crates.io] by default. Only
+the name and a version string are required in this case. In [the cargo
+guide](guide.html), we specified a dependency on the `time` crate:
+
+```toml
+[dependencies]
+time = "0.1.12"
+```
+
+The string `"0.1.12"` is a [semver] version requirement. Since this
+string does not have any operators in it, it is interpreted the same way as
+if we had specified `"^0.1.12"`, which is called a caret requirement.
+
+[semver]: https://github.com/steveklabnik/semver#requirements
+
+## Caret requirements
+
+**Caret requirements** allow SemVer compatible updates to a specified version.
+An update is allowed if the new version number does not modify the left-most
+non-zero digit in the major, minor, patch grouping. In this case, if we ran
+`cargo update -p time`, cargo would update us to version `0.1.13` if it was
+available, but would not update us to `0.2.0`. If instead we had specified the
+version string as `^1.0`, cargo would update to `1.1` but not `2.0`. `0.0.x` is
+not considered compatible with any other version.
+
+Here are some more examples of caret requirements and the versions that would
+be allowed with them:
+
+```notrust
+^1.2.3 := >=1.2.3 <2.0.0
+^1.2 := >=1.2.0 <2.0.0
+^1 := >=1.0.0 <2.0.0
+^0.2.3 := >=0.2.3 <0.3.0
+^0.0.3 := >=0.0.3 <0.0.4
+^0.0 := >=0.0.0 <0.1.0
+^0 := >=0.0.0 <1.0.0
+```
+
+This compatibility convention is different from SemVer in the way it treats
+versions before 1.0.0. While SemVer says there is no compatibility before
+1.0.0, Cargo considers `0.x.y` to be compatible with `0.x.z`, where `y ≥ z`
+and `x > 0`.
+
+## Tilde requirements
+
+**Tilde requirements** specify a minimal version with some ability to update.
+If you specify a major, minor, and patch version or only a major and minor
+version, only patch-level changes are allowed. If you only specify a major
+version, then minor- and patch-level changes are allowed.
+
+`~1.2.3` is an example of a tilde requirement.
+
+```notrust
+~1.2.3 := >=1.2.3 <1.3.0
+~1.2 := >=1.2.0 <1.3.0
+~1 := >=1.0.0 <2.0.0
+```
+
+## Wildcard requirements
+
+**Wildcard requirements** allow for any version where the wildcard is
+positioned.
+
+`*`, `1.*` and `1.2.*` are examples of wildcard requirements.
+
+```notrust
+* := >=0.0.0
+1.* := >=1.0.0 <2.0.0
+1.2.* := >=1.2.0 <1.3.0
+```
+
+## Inequality requirements
+
+**Inequality requirements** allow manually specifying a version range or an
+exact version to depend on.
+
+Here are some examples of inequality requirements:
+
+```notrust
+>= 1.2.0
+> 1
+< 2
+= 1.2.3
+```
+
+## Multiple requirements
+
+Multiple version requirements can also be separated with a comma, e.g. `>= 1.2,
+< 1.5`.
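+
+For example, to accept any `time` release from the 1.2 through 1.4 series:
+
+```toml
+[dependencies]
+time = ">= 1.2, < 1.5"
+```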
+
+# Specifying dependencies from `git` repositories
+
+To depend on a library located in a `git` repository, the minimum information
+you need to specify is the location of the repository with the `git` key:
+
+```toml
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand" }
+```
+
+Cargo will fetch the `git` repository at this location and then look for a
+`Cargo.toml` for the requested crate anywhere inside the `git` repository
+(not necessarily at the root).
+
+Since we haven’t specified any other information, Cargo assumes that
+we intend to use the latest commit on the `master` branch to build our project.
+You can combine the `git` key with the `rev`, `tag`, or `branch` keys to
+specify something else. Here's an example of specifying that you want to use
+the latest commit on a branch named `next`:
+
+```toml
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand", branch = "next" }
+```
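+
+Similarly, the `tag` or `rev` keys can pin the dependency to a release tag or
+an exact revision (the values below are illustrative):
+
+```toml
+[dependencies]
+rand = { git = "https://github.com/rust-lang-nursery/rand", tag = "0.3.15" }
+# or, pinning an exact commit:
+# rand = { git = "https://github.com/rust-lang-nursery/rand", rev = "9f35b8e" }
+```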
+
+# Specifying path dependencies
+
+Over time, our `hello_world` project from [the guide](guide.html) has grown
+significantly in size! It’s gotten to the point that we probably want to
+split out a separate crate for others to use. To do this Cargo supports
+**path dependencies** which are typically sub-crates that live within one
+repository. Let’s start off by making a new crate inside of our `hello_world`
+project:
+
+```shell
+# inside of hello_world/
+$ cargo new hello_utils
+```
+
+This will create a new folder `hello_utils` inside of which a `Cargo.toml` and
+`src` folder are ready to be configured. In order to tell Cargo about this, open
+up `hello_world/Cargo.toml` and add `hello_utils` to your dependencies:
+
+```toml
+[dependencies]
+hello_utils = { path = "hello_utils" }
+```
+
+This tells Cargo that we depend on a crate called `hello_utils` which is found
+in the `hello_utils` folder (relative to the `Cargo.toml` it’s written in).
+
+And that’s it! The next `cargo build` will automatically build `hello_utils` and
+all of its own dependencies, and others can start using the crate as well.
+However, crates that use dependencies specified with only a path are not
+permitted on [crates.io]. If we wanted to publish our `hello_world` crate, we
+would need to publish a version of `hello_utils` to [crates.io](https://crates.io)
+and specify its version in the dependencies line as well:
+
+```toml
+[dependencies]
+hello_utils = { path = "hello_utils", version = "0.1.0" }
+```
+
+# Overriding dependencies
+
+There are a number of methods in Cargo to support overriding dependencies and
+otherwise controlling the dependency graph. These options, however, are
+typically only available at the workspace level and aren't propagated through
+dependencies. In other words, "applications" have the ability to override
+dependencies but "libraries" do not.
+
+The desire to override a dependency or otherwise alter some dependencies can
+arise through a number of scenarios. Most of them, however, boil down to the
+ability to work with a crate before it's been published to crates.io. For
+example:
+
+* A crate you're working on is also used in a much larger application you're
+ working on, and you'd like to test a bug fix to the library inside of the
+ larger application.
+* An upstream crate you don't work on has a new feature or a bug fix on the
+ master branch of its git repository which you'd like to test out.
+* You're about to publish a new major version of your crate, but you'd like to
+ do integration testing across an entire project to ensure the new major
+ version works.
+* You've submitted a fix to an upstream crate for a bug you found, but you'd
+ like to immediately have your application start depending on the fixed version
+ of the crate to avoid blocking on the bug fix getting merged.
+
+These scenarios are currently all solved with the [`[patch]` manifest
+section][patch-section]. (Note that the `[patch]` feature will first become
+available in Rust 1.21, set to be released on 2017-10-12.) Historically some of
+these scenarios have been solved with [the `[replace]` section][replace-section],
+but we'll document the `[patch]` section here.
+
+[patch-section]: manifest.html#the-patch-section
+[replace-section]: manifest.html#the-replace-section
+
+### Testing a bugfix
+
+Let's say you're working with the [`uuid`] crate but while you're working on it
+you discover a bug. You are, however, quite enterprising so you decide to try
+to fix the bug! Originally your manifest will look like:
+
+[`uuid`]: https://crates.io/crates/uuid
+
+```toml
+[package]
+name = "my-library"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+uuid = "1.0"
+```
+
+The first thing we'll do is clone the [`uuid` repository][uuid-repository]
+locally via:
+
+```shell
+$ git clone https://github.com/rust-lang-nursery/uuid
+```
+
+Next we'll edit the manifest of `my-library` to contain:
+
+```toml
+[patch.crates-io]
+uuid = { path = "../path/to/uuid" }
+```
+
+Here we declare that we're *patching* the source `crates-io` with a new
+dependency. This will effectively add the locally checked-out version of `uuid`
+to the crates.io registry for our local project.
+
+Next up we need to ensure that our lock file is updated to use this new version
+of `uuid` so our project uses the locally checked out copy instead of one from
+crates.io. The way `[patch]` works is that it'll load the dependency at
+`../path/to/uuid` and then whenever crates.io is queried for versions of `uuid`
+it'll *also* return the local version.
+
+This means that the version number of the local checkout is significant and will
+affect whether the patch is used. Our manifest declared `uuid = "1.0"` which
+means we'll only resolve to `>= 1.0.0, < 2.0.0`, and Cargo's greedy resolution
+algorithm also means that we'll resolve to the maximum version within that
+range. Typically this doesn't matter as the version of the git repository will
+already be greater than or match the maximum version published on crates.io,
+but it's
+important to keep this in mind!
+
+In any case, typically all you need to do now is:
+
+```shell
+$ cargo build
+ Compiling uuid v1.0.0 (file://.../uuid)
+ Compiling my-library v0.1.0 (file://.../my-library)
+ Finished dev [unoptimized + debuginfo] target(s) in 0.32 secs
+```
+
+And that's it! You're now building with the local version of `uuid` (note the
+`file://` in the build output). If you don't see the `file://` version getting
+built then you may need to run `cargo update -p uuid --precise $version` where
+`$version` is the version of the locally checked out copy of `uuid`.
+
+Once you've fixed the bug you originally found, the next thing you'll likely
+want to do is submit it as a pull request to the `uuid` crate itself. Once
+you've done this you can also update the `[patch]` section. The listing
+inside of `[patch]` is just like the `[dependencies]` section, so once your pull
+request is merged you could change your `path` dependency to:
+
+```toml
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid' }
+```
+
+[uuid-repository]: https://github.com/rust-lang-nursery/uuid
+
+### Working with an unpublished minor version
+
+Let's now shift gears a bit from bug fixes to adding features. While working on
+`my-library` you discover that a whole new feature is needed in the `uuid`
+crate. You've implemented this feature, tested it locally above with `[patch]`,
+and submitted a pull request. Let's go over how you continue to use and test it
+before it's actually published.
+
+Let's also say that the current version of `uuid` on crates.io is `1.0.0`, but
+since then the master branch of the git repository has updated to `1.0.1`. This
+branch includes the new feature you submitted previously. To use this
+repository we'll edit our `Cargo.toml` to look like:
+
+```toml
+[package]
+name = "my-library"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+uuid = "1.0.1"
+
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid' }
+```
+
+Note that our local dependency on `uuid` has been updated to `1.0.1` as it's
+what we'll actually require once the crate is published. This version doesn't
+exist on crates.io, though, so we provide it with the `[patch]` section of the
+manifest.
+
+Now when our library is built it'll fetch `uuid` from the git repository and
+resolve to 1.0.1 inside the repository instead of trying to download a version
+from crates.io. Once 1.0.1 is published on crates.io the `[patch]` section can
+be deleted.
+
+It's also worth noting that `[patch]` applies *transitively*. Let's say you use
+`my-library` in a larger project, such as:
+
+```toml
+[package]
+name = "my-binary"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+my-library = { git = 'https://example.com/git/my-library' }
+uuid = "1.0"
+
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid' }
+```
+
+Remember that `[patch]` is only applicable at the *top level*, so consumers of
+`my-library` have to repeat the `[patch]` section if necessary. Here, though,
+the new `uuid` crate applies to *both* our dependency on `uuid` and the
+`my-library -> uuid` dependency. The `uuid` crate will be resolved to one
+version for this entire crate graph, 1.0.1, and it'll be pulled from the git
+repository.
+
+### Prepublishing a breaking change
+
+As a final scenario, let's take a look at working with a new major version of a
+crate, typically accompanied with breaking changes. Sticking with our previous
+crates, this means that we're going to be creating version 2.0.0 of the `uuid`
+crate. After we've submitted all changes upstream we can update our manifest for
+`my-library` to look like:
+
+```toml
+[dependencies]
+uuid = "2.0"
+
+[patch.crates-io]
+uuid = { git = "https://github.com/rust-lang-nursery/uuid", branch = "2.0.0" }
+```
+
+And that's it! Like with the previous example the 2.0.0 version doesn't actually
+exist on crates.io but we can still provide it through a git dependency via the
+`[patch]` section. As a thought exercise let's take another look at the
+`my-binary` manifest from above:
+
+```toml
+[package]
+name = "my-binary"
+version = "0.1.0"
+authors = ["..."]
+
+[dependencies]
+my-library = { git = 'https://example.com/git/my-library' }
+uuid = "1.0"
+
+[patch.crates-io]
+uuid = { git = 'https://github.com/rust-lang-nursery/uuid', branch = '2.0.0' }
+```
+
+Note that this will actually resolve to two versions of the `uuid` crate. The
+`my-binary` crate will continue to use the 1.x.y series of the `uuid` crate but
+the `my-library` crate will use the 2.0.0 version of `uuid`. This will allow you
+to gradually roll out breaking changes to a crate through a dependency graph
+without being forced to update everything all at once.
+
+### Overriding with local dependencies
+
+Sometimes you're only temporarily working on a crate and you don't want to have
+to modify `Cargo.toml` like with the `[patch]` section above. For this use
+case Cargo offers a much more limited version of overrides called **path
+overrides**.
+
+Path overrides are specified through `.cargo/config` instead of `Cargo.toml`,
+and you can find [more documentation about this configuration][config-docs].
+Inside of `.cargo/config` you'll specify a key called `paths`:
+
+[config-docs]: config.html
+
+```toml
+paths = ["/path/to/uuid"]
+```
+
+This array should be filled with directories that contain a `Cargo.toml`. In
+this instance, we’re just adding `uuid`, so it will be the only one that’s
+overridden. This path can be either absolute or relative to the directory that
+contains the `.cargo` folder.
+
+Path overrides are more restricted than the `[patch]` section, however, in
+that they cannot change the structure of the dependency graph. When a
+path replacement is used, the previous set of dependencies must all match the
+new `Cargo.toml` specification exactly. For example, this means that path
+overrides cannot be used to test out adding a dependency to a crate; instead,
+`[patch]` must be used in that situation. As a result, usage of a path
+override is typically isolated to quick bug fixes rather than larger changes.
+
+Note: using a local configuration to override paths will only work for crates
+that have been published to [crates.io]. You cannot use this feature to tell
+Cargo how to find local unpublished crates.
+
+# Platform specific dependencies
+
+Platform-specific dependencies take the same format, but are listed under a
+`target` section. Normally Rust-like `#[cfg]` syntax will be used to define
+these sections:
+
+```toml
+[target.'cfg(windows)'.dependencies]
+winhttp = "0.4.0"
+
+[target.'cfg(unix)'.dependencies]
+openssl = "1.0.1"
+
+[target.'cfg(target_arch = "x86")'.dependencies]
+native = { path = "native/i686" }
+
+[target.'cfg(target_arch = "x86_64")'.dependencies]
+native = { path = "native/x86_64" }
+```
+
+Like with Rust, the syntax here supports the `not`, `any`, and `all` operators
+to combine various cfg name/value pairs. Note that the `cfg` syntax has only
+been available since Cargo 0.9.0 (Rust 1.8.0).
+
+In addition to `#[cfg]` syntax, Cargo also supports listing out the full target
+the dependencies would apply to:
+
+```toml
+[target.x86_64-pc-windows-gnu.dependencies]
+winhttp = "0.4.0"
+
+[target.i686-unknown-linux-gnu.dependencies]
+openssl = "1.0.1"
+```
+
+If you’re using a custom target specification, quote the full path and file
+name:
+
+```toml
+[target."x86_64/windows.json".dependencies]
+winhttp = "0.4.0"
+
+[target."i686/linux.json".dependencies]
+openssl = "1.0.1"
+native = { path = "native/i686" }
+
+[target."x86_64/linux.json".dependencies]
+openssl = "1.0.1"
+native = { path = "native/x86_64" }
+```
+
+# Development dependencies
+
+You can add a `[dev-dependencies]` section to your `Cargo.toml` whose format
+is equivalent to `[dependencies]`:
+
+```toml
+[dev-dependencies]
+tempdir = "0.3"
+```
+
+Dev-dependencies are not used when compiling a package for normal builds, but
+are used for compiling tests, examples, and benchmarks.
+
+These dependencies are *not* propagated to other packages which depend on this
+package.
+
+You can also have target-specific development dependencies by using
+`dev-dependencies` in the target section header instead of `dependencies`. For
+example:
+
+```toml
+[target.'cfg(unix)'.dev-dependencies]
+mio = "0.0.1"
+```
+
+[crates.io]: https://crates.io/
+
+# Build dependencies
+
+You can depend on other Cargo-based crates for use in your build scripts.
+Dependencies are declared through the `build-dependencies` section of the
+manifest:
+
+```toml
+[build-dependencies]
+gcc = "0.3"
+```
+
+The build script **does not** have access to the dependencies listed
+in the `dependencies` or `dev-dependencies` section. Build
+dependencies will likewise not be available to the package itself
+unless listed under the `dependencies` section as well. A package
+itself and its build script are built separately, so their
+dependencies need not coincide. Cargo is kept simpler and cleaner by
+using independent dependencies for independent purposes.
+
+# Choosing features
+
+If a package you depend on offers conditional features, you can
+specify which to use:
+
+```toml
+[dependencies.awesome]
+version = "1.3.5"
+default-features = false # do not include the default features, and optionally
+ # cherry-pick individual features
+features = ["secure-password", "civet"]
+```
+
+More information about features can be found in the
+[manifest documentation](manifest.html#the-features-section).
--- /dev/null
+html {
+ background: url("../images/noise.png");
+ background-color: #3b6837;
+}
+
+main, #header { width: 900px; }
+
+* {
+ box-sizing: border-box;
+}
+
+body {
+ display: -webkit-flex;
+ display: flex;
+ -webkit-flex-direction: column;
+ flex-direction: column;
+ -webkit-align-items: center;
+ align-items: center;
+ font-family: sans-serif;
+ font-size: 16px;
+}
+
+a { color: #00ac5b; text-decoration: none; }
+a:hover { color: #00793f; }
+
+h1 {
+ font-size: 24px;
+ margin: 20px 0 10px 0;
+ font-weight: bold;
+ color: #b64790;
+}
+
+h1 code:not(.highlight) {
+ color: #d9a700;
+ vertical-align: bottom;
+}
+h1 a, h2 a { color: #b64790; text-decoration: none; }
+h1:hover a, h2:hover a { color: #A03D7E; }
+h1:hover a:after,
+h2:hover a:after { content: '\2002\00a7\2002'; }
+:target { background: rgba(239, 242, 178, 1); padding: 5px; }
+
+h1.title { /* style rustdoc-generated title */
+ width: 100%;
+ padding: 40px 20px 40px 60px;
+ background-color: #edebdd;
+ margin-bottom: 20px;
+ -webkit-border-radius: 5px;
+ -moz-border-radius: 5px;
+ -ms-border-radius: 5px;
+ border-radius: 5px;
+ margin: 0;
+ color: #383838;
+ font-size: 2em;
+ background-image: url(../images/circle-with-i.png);
+ background-repeat: no-repeat;
+ background-position: 20px center;
+}
+
+h2 {
+ font-size: 18px;
+ margin: 15px 0 5px 0;
+ color: #b64790;
+ font-weight: bold;
+}
+
+h2 code:not(.highlight) { color: #d9a700; }
+
+code:not(.highlight) {
+ font-family: monospace;
+ color: #b64790;
+}
+
+main {
+ display: -webkit-flex;
+ display: flex;
+ -webkit-flex-direction: column;
+ flex-direction: column;
+
+ width: 100%;
+ max-width: 900px;
+ margin-bottom: 10px;
+
+ background-color: #f9f7ec;
+ padding: 15px;
+
+ -webkit-border-radius: 10px;
+ -moz-border-radius: 10px;
+ -ms-border-radius: 10px;
+ border-radius: 10px;
+ box-shadow: 0px 0px 5px 2px #3b6837;
+ border: 5px solid #62865f;
+ color: #383838;
+}
+
+main > p:first-child {
+ font-weight: 500;
+ margin-top: 3px;
+ padding-bottom: 15px;
+ border-bottom: 1px solid #62865f;
+ text-align: center;
+}
+
+main p:first-child a { color: #3b6837; }
+main p:first-child a:hover { color: #62865f; }
+
+main p, main ul {
+ /* color: #3b6837; */
+ margin: 10px 0;
+ line-height: 150%;
+}
+
+main ul { margin-left: 20px; }
+main li { list-style-type: disc; }
+main strong { font-weight: bold; }
+
+img.logo {
+ align-self: center;
+ margin-bottom: 10px;
+}
+
+pre {
+ padding: 10px;
+ margin: 10px 0;
+ /* border: 1px solid #cad0d0; */
+ border-radius: 4px;
+ max-width: calc(100vw - 45px);
+ overflow-x: auto;
+
+ background: #383838 !important;
+ color: white;
+ padding: 20px;
+
+ /* override prism.js styles */
+ font-size: 1em !important;
+ border: none !important;
+ box-shadow: none !important;
+ text-shadow: none !important;
+}
+
+pre code {
+ text-shadow: none !important;
+}
+
+footer {
+ padding: 40px;
+ width: 900px;
+}
+footer a {
+ color: white;
+}
+footer a:hover {
+ color: #e6e6e6;
+}
+footer .sep, #header .sep { color: #284725; }
+footer .sep { margin: 0 10px; }
+#header .sep { margin-left: 10px; }
+
+.headerlink {
+ display: none;
+ text-decoration: none;
+}
+.fork-me {
+ position:absolute;
+ top:0;
+ right:0;
+}
+
+.token.toml-section { color: #CB4B16; }
+.token.toml-key { color: #268BD2; }
+
+/* Rust code highlighting */
+pre.rust .kw { color: #8959A8; }
+pre.rust .kw-2, pre.rust .prelude-ty { color: #4271AE; }
+pre.rust .number, pre.rust .string { color: #718C00; }
+pre.rust .self, pre.rust .boolval, pre.rust .prelude-val,
+pre.rust .attribute, pre.rust .attribute .ident { color: #C82829; }
+pre.rust .comment { color: #8E908C; }
+pre.rust .doccomment { color: #4D4D4C; }
+pre.rust .macro, pre.rust .macro-nonterminal { color: #3E999F; }
+pre.rust .lifetime { color: #B76514; }
+code span.s1 { color: #2AA198; }
+
+table th { border-bottom: 1px solid black; }
+table td, table th { padding: 5px 10px; }
+
+#header {
+ color: white;
+ position: relative;
+ height: 100px;
+ display: -webkit-flex;
+ display: flex;
+ -webkit-align-items: center;
+ align-items: center;
+}
+#header h1 { font-size: 2em; }
+#header a, #header h1 { color: white; text-decoration: none; }
+#header a:hover { color: #d9d9d9; }
+
+#header input.search {
+ border: none;
+ color: black;
+ outline: 0;
+ margin-left: 30px;
+ padding: 5px 5px 5px 25px;
+ background-image: url(../images/search.png);
+ background-repeat: no-repeat;
+ background-position: 6px 6px;
+ -webkit-border-radius: 15px;
+ -moz-border-radius: 15px;
+ -ms-border-radius: 15px;
+ border-radius: 15px;
+}
+
+#header .nav {
+ -webkit-flex-grow: 2;
+ flex-grow: 2;
+ text-align: right;
+}
+
+button.dropdown, a.dropdown { cursor: pointer; }
+button.dropdown .arrow, a.dropdown .arrow {
+ font-size: 50%; display: inline-block; vertical-align: middle;
+}
+button.dropdown .arrow::after, a.dropdown .arrow::after { content: "▼"; }
+button.active.dropdown .arrow::after, a.active.dropdown .arrow::after {
+ content: "▲";
+}
+
+button {
+ background: none;
+ outline: 0;
+ border: 0;
+ padding: 10px;
+ color: white;
+}
+
+button.active {
+ background:#2a4f27;
+ box-shadow:inset -2px 2px 4px 0 #243d26
+}
+
+ul.dropdown {
+ display: none;
+ visibility: hidden;
+ position: absolute;
+ top: 100%;
+ right: 0;
+ width: 100%;
+ min-width: 150px;
+ opacity: 0;
+ margin: 0;
+ text-align: left;
+ padding: 0;
+ background: white;
+ border: 1px solid #d5d3cb;
+ list-style: none;
+ z-index: 10;
+ -webkit-border-radius: 5px;
+ -moz-border-radius: 5px;
+ -ms-border-radius: 5px;
+ border-radius: 5px;
+}
+
+ul.dropdown li a {
+ font-size: 90%;
+ width: 100%;
+ display: inline-block;
+ padding: 8px 10px;
+ text-decoration: none;
+ color: #383838 !important;
+}
+
+ul.dropdown li a:hover {
+ background: #5e5e5e;
+ color: white !important;
+}
+ul.dropdown li.last { border-top: 1px solid #d5d3cb; }
+ul.dropdown.open {
+ display: block;
+ visibility: visible;
+ opacity: 1;
+}
+.dropdown-container {
+ display: inline-block;
+ position: relative;
+}
+
+p > img {
+ max-width: 100%;
+}
--- /dev/null
+/*! normalize.css v2.0.1 | MIT License | git.io/normalize */
+
+/* ==========================================================================
+ HTML5 display definitions
+ ========================================================================== */
+
+/*
+ * Corrects `block` display not defined in IE 8/9.
+ */
+
+article,
+aside,
+details,
+figcaption,
+figure,
+footer,
+header,
+hgroup,
+nav,
+section,
+summary {
+ display: block;
+}
+
+/*
+ * Corrects `inline-block` display not defined in IE 8/9.
+ */
+
+audio,
+canvas,
+video {
+ display: inline-block;
+}
+
+/*
+ * Prevents modern browsers from displaying `audio` without controls.
+ * Remove excess height in iOS 5 devices.
+ */
+
+audio:not([controls]) {
+ display: none;
+ height: 0;
+}
+
+/*
+ * Addresses styling for `hidden` attribute not present in IE 8/9.
+ */
+
+[hidden] {
+ display: none;
+}
+
+/* ==========================================================================
+ Base
+ ========================================================================== */
+
+/*
+ * 1. Sets default font family to sans-serif.
+ * 2. Prevents iOS text size adjust after orientation change, without disabling
+ * user zoom.
+ */
+
+html {
+ font-family: sans-serif; /* 1 */
+ -webkit-text-size-adjust: 100%; /* 2 */
+ -ms-text-size-adjust: 100%; /* 2 */
+}
+
+/*
+ * Removes default margin.
+ */
+
+body {
+ margin: 0;
+}
+
+/* ==========================================================================
+ Links
+ ========================================================================== */
+
+/*
+ * Addresses `outline` inconsistency between Chrome and other browsers.
+ */
+
+a:focus {
+ outline: thin dotted;
+}
+
+/*
+ * Improves readability when focused and also mouse hovered in all browsers.
+ */
+
+a:active,
+a:hover {
+ outline: 0;
+}
+
+/* ==========================================================================
+ Typography
+ ========================================================================== */
+
+/*
+ * Addresses `h1` font sizes within `section` and `article` in Firefox 4+,
+ * Safari 5, and Chrome.
+ */
+
+h1 {
+ font-size: 2em;
+}
+
+/*
+ * Addresses styling not present in IE 8/9, Safari 5, and Chrome.
+ */
+
+abbr[title] {
+ border-bottom: 1px dotted;
+}
+
+/*
+ * Addresses style set to `bolder` in Firefox 4+, Safari 5, and Chrome.
+ */
+
+b,
+strong {
+ font-weight: bold;
+}
+
+/*
+ * Addresses styling not present in Safari 5 and Chrome.
+ */
+
+dfn {
+ font-style: italic;
+}
+
+/*
+ * Addresses styling not present in IE 8/9.
+ */
+
+mark {
+ background: #ff0;
+ color: #000;
+}
+
+
+/*
+ * Corrects font family set oddly in Safari 5 and Chrome.
+ */
+
+code,
+kbd,
+pre,
+samp {
+ font-family: monospace, serif;
+ font-size: 1em;
+}
+
+/*
+ * Improves readability of pre-formatted text in all browsers.
+ */
+
+pre {
+ white-space: pre;
+ white-space: pre-wrap;
+ word-wrap: break-word;
+}
+
+/*
+ * Sets consistent quote types.
+ */
+
+q {
+ quotes: "\201C" "\201D" "\2018" "\2019";
+}
+
+/*
+ * Addresses inconsistent and variable font size in all browsers.
+ */
+
+small {
+ font-size: 80%;
+}
+
+/*
+ * Prevents `sub` and `sup` affecting `line-height` in all browsers.
+ */
+
+sub,
+sup {
+ font-size: 75%;
+ line-height: 0;
+ position: relative;
+ vertical-align: baseline;
+}
+
+sup {
+ top: -0.5em;
+}
+
+sub {
+ bottom: -0.25em;
+}
+
+/* ==========================================================================
+ Embedded content
+ ========================================================================== */
+
+/*
+ * Removes border when inside `a` element in IE 8/9.
+ */
+
+img {
+ border: 0;
+}
+
+/*
+ * Corrects overflow displayed oddly in IE 9.
+ */
+
+svg:not(:root) {
+ overflow: hidden;
+}
+
+/* ==========================================================================
+ Figures
+ ========================================================================== */
+
+/*
+ * Addresses margin not present in IE 8/9 and Safari 5.
+ */
+
+figure {
+ margin: 0;
+}
+
+/* ==========================================================================
+ Forms
+ ========================================================================== */
+
+/*
+ * Define consistent border, margin, and padding.
+ */
+
+fieldset {
+ border: 1px solid #c0c0c0;
+ margin: 0 2px;
+ padding: 0.35em 0.625em 0.75em;
+}
+
+/*
+ * 1. Corrects color not being inherited in IE 8/9.
+ * 2. Remove padding so people aren't caught out if they zero out fieldsets.
+ */
+
+legend {
+ border: 0; /* 1 */
+ padding: 0; /* 2 */
+}
+
+/*
+ * 1. Corrects font family not being inherited in all browsers.
+ * 2. Corrects font size not being inherited in all browsers.
+ * 3. Addresses margins set differently in Firefox 4+, Safari 5, and Chrome
+ */
+
+button,
+input,
+select,
+textarea {
+ font-family: inherit; /* 1 */
+ font-size: 100%; /* 2 */
+ margin: 0; /* 3 */
+}
+
+/*
+ * Addresses Firefox 4+ setting `line-height` on `input` using `!important` in
+ * the UA stylesheet.
+ */
+
+button,
+input {
+ line-height: normal;
+}
+
+/*
+ * 1. Avoid the WebKit bug in Android 4.0.* where (2) destroys native `audio`
+ * and `video` controls.
+ * 2. Corrects inability to style clickable `input` types in iOS.
+ * 3. Improves usability and consistency of cursor style between image-type
+ * `input` and others.
+ */
+
+button,
+html input[type="button"], /* 1 */
+input[type="reset"],
+input[type="submit"] {
+ -webkit-appearance: button; /* 2 */
+ cursor: pointer; /* 3 */
+}
+
+/*
+ * Re-set default cursor for disabled elements.
+ */
+
+button[disabled],
+input[disabled] {
+ cursor: default;
+}
+
+/*
+ * 1. Addresses box sizing set to `content-box` in IE 8/9.
+ * 2. Removes excess padding in IE 8/9.
+ */
+
+input[type="checkbox"],
+input[type="radio"] {
+ box-sizing: border-box; /* 1 */
+ padding: 0; /* 2 */
+}
+
+/*
+ * 1. Addresses `appearance` set to `searchfield` in Safari 5 and Chrome.
+ * 2. Addresses `box-sizing` set to `border-box` in Safari 5 and Chrome
+ * (include `-moz` to future-proof).
+ */
+
+input[type="search"] {
+ -webkit-appearance: textfield; /* 1 */
+ -moz-box-sizing: content-box;
+ -webkit-box-sizing: content-box; /* 2 */
+ box-sizing: content-box;
+}
+
+/*
+ * Removes inner padding and search cancel button in Safari 5 and Chrome
+ * on OS X.
+ */
+
+input[type="search"]::-webkit-search-cancel-button,
+input[type="search"]::-webkit-search-decoration {
+ -webkit-appearance: none;
+}
+
+/*
+ * Removes inner padding and border in Firefox 4+.
+ */
+
+button::-moz-focus-inner,
+input::-moz-focus-inner {
+ border: 0;
+ padding: 0;
+}
+
+/*
+ * 1. Removes default vertical scrollbar in IE 8/9.
+ * 2. Improves readability and alignment in all browsers.
+ */
+
+textarea {
+ overflow: auto; /* 1 */
+ vertical-align: top; /* 2 */
+}
+
+/* ==========================================================================
+ Tables
+ ========================================================================== */
+
+/*
+ * Remove most spacing between table cells.
+ */
+
+table {
+ border-collapse: collapse;
+ border-spacing: 0;
+}
\ No newline at end of file
--- /dev/null
+/* http://prismjs.com/download.html?themes=prism-twilight&languages=markup+css+clike+javascript */
+/**
+ * prism.js Twilight theme
+ * Based (more or less) on the Twilight theme originally of Textmate fame.
+ * @author Remy Bach
+ */
+code[class*="language-"],
+pre[class*="language-"] {
+ color: white;
+ direction: ltr;
+ font-family: Consolas, Monaco, 'Andale Mono', monospace;
+ text-align: left;
+ text-shadow: 0 -.1em .2em black;
+ white-space: pre;
+ word-spacing: normal;
+ word-break: normal;
+ line-height: 1.5;
+
+ -moz-tab-size: 4;
+ -o-tab-size: 4;
+ tab-size: 4;
+
+ -webkit-hyphens: none;
+ -moz-hyphens: none;
+ -ms-hyphens: none;
+ hyphens: none;
+}
+
+pre[class*="language-"],
+:not(pre) > code[class*="language-"] {
+ background: hsl(0, 0%, 8%); /* #141414 */
+}
+
+/* Code blocks */
+pre[class*="language-"] {
+ border-radius: .5em;
+ border: .3em solid hsl(0, 0%, 33%); /* #545454 */
+ box-shadow: 1px 1px .5em black inset;
+ margin: .5em 0;
+ overflow: auto;
+ padding: 1em;
+}
+
+pre[class*="language-"]::selection {
+ /* Safari */
+ background: hsl(200, 4%, 16%); /* #282A2B */
+}
+
+pre[class*="language-"]::selection {
+ /* Firefox */
+ background: hsl(200, 4%, 16%); /* #282A2B */
+}
+
+/* Text Selection colour */
+pre[class*="language-"]::-moz-selection, pre[class*="language-"] ::-moz-selection,
+code[class*="language-"]::-moz-selection, code[class*="language-"] ::-moz-selection {
+ text-shadow: none;
+ background: hsla(0, 0%, 93%, 0.15); /* #EDEDED */
+}
+
+pre[class*="language-"]::selection, pre[class*="language-"] ::selection,
+code[class*="language-"]::selection, code[class*="language-"] ::selection {
+ text-shadow: none;
+ background: hsla(0, 0%, 93%, 0.15); /* #EDEDED */
+}
+
+/* Inline code */
+:not(pre) > code[class*="language-"] {
+ border-radius: .3em;
+ border: .13em solid hsl(0, 0%, 33%); /* #545454 */
+ box-shadow: 1px 1px .3em -.1em black inset;
+ padding: .15em .2em .05em;
+}
+
+.token.comment,
+.token.prolog,
+.token.doctype,
+.token.cdata {
+ color: hsl(0, 0%, 47%); /* #777777 */
+}
+
+.token.punctuation {
+ opacity: .7;
+}
+
+.namespace {
+ opacity: .7;
+}
+
+.token.tag,
+.token.boolean,
+.token.number,
+.token.deleted {
+ color: hsl(14, 58%, 55%); /* #CF6A4C */
+}
+
+.token.keyword,
+.token.property,
+.token.selector,
+.token.constant,
+.token.symbol,
+.token.builtin {
+ color: hsl(53, 89%, 79%); /* #F9EE98 */
+}
+
+.token.attr-name,
+.token.attr-value,
+.token.string,
+.token.char,
+.token.operator,
+.token.entity,
+.token.url,
+.language-css .token.string,
+.style .token.string,
+.token.variable,
+.token.inserted {
+ color: hsl(76, 21%, 52%); /* #8F9D6A */
+}
+
+.token.atrule {
+ color: hsl(218, 22%, 55%); /* #7587A6 */
+}
+
+.token.regex,
+.token.important {
+ color: hsl(42, 75%, 65%); /* #E9C062 */
+}
+
+.token.important {
+ font-weight: bold;
+}
+
+.token.entity {
+ cursor: help;
+}
+
+pre[data-line] {
+ padding: 1em 0 1em 3em;
+ position: relative;
+}
+
+/* Markup */
+.language-markup .token.tag,
+.language-markup .token.attr-name,
+.language-markup .token.punctuation {
+ color: hsl(33, 33%, 52%); /* #AC885B */
+}
+
+/* Make the tokens sit above the line highlight so the colours don't look faded. */
+.token {
+ position: relative;
+ z-index: 1;
+}
+
+.line-highlight {
+ background: -moz-linear-gradient(left, hsla(0, 0%, 33%, .1) 70%, hsla(0, 0%, 33%, 0)); /* #545454 */
+ background: -o-linear-gradient(left, hsla(0, 0%, 33%, .1) 70%, hsla(0, 0%, 33%, 0)); /* #545454 */
+ background: -webkit-linear-gradient(left, hsla(0, 0%, 33%, .1) 70%, hsla(0, 0%, 33%, 0)); /* #545454 */
+ background: hsla(0, 0%, 33%, 0.25); /* #545454 */
+ background: linear-gradient(to right, hsla(0, 0%, 33%, .1) 70%, hsla(0, 0%, 33%, 0)); /* #545454 */
+ border-bottom: 1px dashed hsl(0, 0%, 33%); /* #545454 */
+ border-top: 1px dashed hsl(0, 0%, 33%); /* #545454 */
+ left: 0;
+ line-height: inherit;
+ margin-top: 0.75em; /* Same as .prism’s padding-top */
+ padding: inherit 0;
+ pointer-events: none;
+ position: absolute;
+ right: 0;
+ white-space: pre;
+ z-index: 0;
+}
+
+.line-highlight:before,
+.line-highlight[data-end]:after {
+ background-color: hsl(215, 15%, 59%); /* #8794A6 */
+ border-radius: 999px;
+ box-shadow: 0 1px white;
+ color: hsl(24, 20%, 95%); /* #F5F2F0 */
+ content: attr(data-start);
+ font: bold 65%/1.5 sans-serif;
+ left: .6em;
+ min-width: 1em;
+ padding: 0 .5em;
+ position: absolute;
+ text-align: center;
+ text-shadow: none;
+ top: .4em;
+ vertical-align: .3em;
+}
+
+.line-highlight[data-end]:after {
+ bottom: .4em;
+ content: attr(data-end);
+ top: auto;
+}
+
--- /dev/null
+#compdef cargo
+
+autoload -U regexp-replace
+
+zstyle -T ':completion:*:*:cargo:*' tag-order && \
+ zstyle ':completion:*:*:cargo:*' tag-order 'common-commands'
+
+_cargo() {
+local context state state_descr line
+typeset -A opt_args
+
+# leading items in parentheses are an exclusion list for the arguments following that arg
+# See: http://zsh.sourceforge.net/Doc/Release/Completion-System.html#Completion-Functions
+# - => exclude all other options
+# 1 => exclude positional arg 1
+# * => exclude all other args
+# +blah => exclude +blah
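+# For example, '(- 1 *)--list[list installed commands]' below means that once
+# --list is given, no other option, subcommand, or argument is completed.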
+_arguments \
+ '(- 1 *)'{-h,--help}'[show help message]' \
+ '(- 1 *)--list[list installed commands]' \
+ '(- 1 *)'{-V,--version}'[show version information]' \
+ {-v,--verbose}'[use verbose output]' \
+ --color'[colorization option]' \
+ '(+beta +nightly)+stable[use the stable toolchain]' \
+ '(+stable +nightly)+beta[use the beta toolchain]' \
+ '(+stable +beta)+nightly[use the nightly toolchain]' \
+ '1: :->command' \
+ '*:: :->args'
+
+case $state in
+ command)
+ _alternative 'common-commands:common:_cargo_cmds' 'all-commands:all:_cargo_all_cmds'
+ ;;
+
+ args)
+ case $words[1] in
+ bench)
+ _arguments \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'[number of parallel jobs, defaults to # of CPUs]' \
+ "${command_scope_spec[@]}" \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--no-default-features[do not build the default features]' \
+ '--no-run[compile but do not run]' \
+ '(-p,--package)'{-p=,--package=}'[package to run benchmarks for]:packages:_get_package_names' \
+ '--target=[target triple]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ build)
+ _arguments \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'[number of parallel jobs, defaults to # of CPUs]' \
+ "${command_scope_spec[@]}" \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--no-default-features[do not build the default features]' \
+ '(-p,--package)'{-p=,--package=}'[package to build]:packages:_get_package_names' \
+ '--release[build in release mode]' \
+ '--target=[target triple]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ check)
+ _arguments \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'[number of parallel jobs, defaults to # of CPUs]' \
+ "${command_scope_spec[@]}" \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--no-default-features[do not check the default features]' \
+ '(-p,--package)'{-p=,--package=}'[package to check]:packages:_get_package_names' \
+ '--release[check in release mode]' \
+ '--target=[target triple]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ clean)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '(-p,--package)'{-p=,--package=}'[package to clean]:packages:_get_package_names' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--release[whether or not to clean release artifacts]' \
+ '--target=[target triple (default: all)]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ doc)
+ _arguments \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'[number of parallel jobs, defaults to # of CPUs]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--no-deps[do not build docs for dependencies]' \
+ '--no-default-features[do not build the default features]' \
+ '--open[open docs in browser after the build]' \
+ '(-p, --package)'{-p,--package}'=[package to document]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--release[build artifacts in release mode, with optimizations]' \
+ '--target=[build for the target triple]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ fetch)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ generate-lockfile)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ git-checkout)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--reference=[REF]' \
+ '--url=[URL]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ help)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '*: :_cargo_cmds' \
+ ;;
+
+ init)
+ _arguments \
+ '--bin[use binary template]' \
+ '--vcs:initialize a new repo with a given VCS:(git hg none)' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--name=[set the resulting package name]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ install)
+ _arguments \
+ '--bin=[only install the specified binary]' \
+ '--branch=[branch to use when installing from git]' \
+ '--color=:colorization option:(auto always never)' \
+ '--debug[build in debug mode instead of release mode]' \
+ '--example=[install the specified example instead of binaries]' \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '--git=[URL from which to install the crate]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'[number of parallel jobs, defaults to # of CPUs]' \
+ '--no-default-features[do not build the default features]' \
+ '--path=[local filesystem path to crate to install]: :_files -/' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--rev=[specific commit to use when installing from git]' \
+ '--root=[directory to install packages into]: :_files -/' \
+ '--tag=[tag to use when installing from git]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--vers=[version to install from crates.io]' \
+ ;;
+
+ locate-project)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ ;;
+
+ login)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--host=[Host to set the token for]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ metadata)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ "--no-deps[output information only about the root package and don't fetch dependencies]" \
+ '--no-default-features[do not include the default feature]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '--format-version=[format version (default: 1)]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ new)
+ _arguments \
+ '--bin[use binary template]' \
+ '--vcs:initialize a new repo with a given VCS:(git hg none)' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--name=[set the resulting package name]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ owner)
+ _arguments \
+ '(-a, --add)'{-a,--add}'[add owner LOGIN]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--index[registry index]' \
+ '(-l, --list)'{-l,--list}'[list owners of a crate]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-r, --remove)'{-r,--remove}'[remove owner LOGIN]' \
+ '--token[API token to use when authenticating]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ package)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-l, --list)'{-l,--list}'[print files included in a package without making one]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--no-metadata[ignore warnings about a lack of human-usable metadata]' \
+ '--no-verify[do not build to verify contents]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ pkgid)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ publish)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--host=[host to upload to]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--no-verify[do not verify the tarball before publishing]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--token[token to use when uploading]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ read-manifest)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ run)
+ _arguments \
+ '--example=[name of the example target]' \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'[number of parallel jobs, defaults to # of CPUs]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--bin=[name of the bin target]' \
+ '--no-default-features[do not build the default features]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--release[build in release mode]' \
+ '--target=[target triple]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ '*: :_normal' \
+ ;;
+
+ rustc)
+ _arguments \
+ '--color=:colorization option:(auto always never)' \
+ '--features=[features to compile for the package]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'=[number of parallel jobs, defaults to # of CPUs]' \
+ '--manifest-path=[path to the manifest to fetch dependencies for]: :_files -/' \
+ '--no-default-features[do not compile default features for the package]' \
+ '(-p, --package)'{-p,--package}'=[package to compile]' \
+ '--profile=[profile to build the selected target for]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--release[build artifacts in release mode, with optimizations]' \
+ '--target=[target triple which compiles will be for]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ "${command_scope_spec[@]}" \
+ ;;
+
+ rustdoc)
+ _arguments \
+ '--color=:colorization option:(auto always never)' \
+ '--features=[space-separated list of features to also build]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'=[number of parallel jobs, defaults to # of CPUs]' \
+ '--manifest-path=[path to the manifest to document]: :_files -/' \
+ '--no-default-features[do not build the `default` feature]' \
+ '--open[open the docs in a browser after the operation]' \
+ '(-p, --package)'{-p,--package}'=[package to document]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--release[build artifacts in release mode, with optimizations]' \
+ '--target=[build for the target triple]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ "${command_scope_spec[@]}" \
+ ;;
+
+ search)
+ _arguments \
+ '--color=:colorization option:(auto always never)' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--host=[host of a registry to search in]' \
+ '--limit=[limit the number of results]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ ;;
+
+ test)
+ _arguments \
+ '--features=[space separated feature list]' \
+ '--all-features[enable all available features]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-j, --jobs)'{-j,--jobs}'[number of parallel jobs, defaults to # of CPUs]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '--test=[test name]: :_test_names' \
+ '--no-default-features[do not build the default features]' \
+ '--no-fail-fast[run all tests regardless of failure]' \
+ '--no-run[compile but do not run]' \
+ '(-p,--package)'{-p=,--package=}'[package to run tests for]:packages:_get_package_names' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--release[build artifacts in release mode, with optimizations]' \
+ '--target=[target triple]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ '1: :_test_names' \
+ '(--doc --bin --example --test --bench)--lib[only test library]' \
+ '(--lib --bin --example --test --bench)--doc[only test documentation]' \
+ '(--lib --doc --example --test --bench)--bin=[binary name]' \
+ '(--lib --doc --bin --test --bench)--example=[example name]' \
+ '(--lib --doc --bin --example --bench)--test=[test name]' \
+ '(--lib --doc --bin --example --test)--bench=[benchmark name]' \
+ '--message-format:error format:(human json)' \
+ '--frozen[require lock and cache up to date]' \
+ '--locked[require lock up to date]'
+ ;;
+
+ uninstall)
+ _arguments \
+ '--bin=[only uninstall the binary NAME]' \
+ '--color=:colorization option:(auto always never)' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-q, --quiet)'{-q,--quiet}'[less output printed to stdout]' \
+ '--root=[directory to uninstall packages from]: :_files -/' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ ;;
+
+ update)
+ _arguments \
+ '--aggressive[force dependency update]' \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '(-p,--package)'{-p=,--package=}'[package to update]:packages:_get_package_names' \
+ '--precise=[update single dependency to PRECISE]: :' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ verify-project)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--manifest-path=[path to manifest]: :_files -/' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ version)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ ;;
+
+ yank)
+ _arguments \
+ '(-h, --help)'{-h,--help}'[show help message]' \
+ '--index[registry index]' \
+ '(-q, --quiet)'{-q,--quiet}'[no output printed to stdout]' \
+ '--token[API token to use when authenticating]' \
+ '--undo[undo a yank, putting a version back into the index]' \
+ '(-v, --verbose)'{-v,--verbose}'[use verbose output]' \
+ '--color=:colorization option:(auto always never)' \
+ '--vers[yank version]' \
+ ;;
+ esac
+ ;;
+esac
+}
+
+_cargo_cmds(){
+local -a commands;commands=(
+'bench:execute all benchmarks of a local package'
+'build:compile the current project'
+'check:check the current project without compiling'
+'clean:remove generated artifacts'
+'doc:build package documentation'
+'fetch:fetch package dependencies'
+'generate-lockfile:create lockfile'
+'git-checkout:git checkout'
+'help:get help for commands'
+'init:create new project in current directory'
+'install:install a Rust binary'
+'locate-project:print "Cargo.toml" location'
+'login:login to remote server'
+'metadata:print the metadata for a project in JSON'
+'new:create a new project'
+'owner:manage the owners of a crate on the registry'
+'package:assemble local package into a distributable tarball'
+'pkgid:print a fully qualified package specification'
+'publish:upload package to the registry'
+'read-manifest:print manifest in JSON format'
+'run:run the main binary of the local package'
+'rustc:compile a package and all of its dependencies'
+'rustdoc:build documentation for a package'
+'search:search packages on crates.io'
+'test:execute all unit and integration tests of a local package'
+'uninstall:remove a Rust binary'
+'update:update dependencies'
+'verify-project:check Cargo.toml'
+'version:show version information'
+'yank:remove a pushed crate from the index'
+)
+_describe -t common-commands 'common commands' commands
+}
+
+_cargo_all_cmds(){
+local -a commands;commands=($(cargo --list))
+_describe -t all-commands 'all commands' commands
+}
+
+
+#FIXME: Disabled until fixed
+#gets package names from the manifest file
+_get_package_names()
+{
+}
+
+#TODO: see if it makes sense for 'locate-project' to have non-JSON output.
+#Extracts the manifest path from locate-project's JSON output
+_locate_manifest(){
+local manifest=`cargo locate-project 2>/dev/null`
+regexp-replace manifest '\{"root":"|"\}' ''
+echo $manifest
+}
+
+# Extracts the values of "name" from the array given in $1 and shows them as
+# command line options for completion
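+# For example, given block_name "test" and a manifest containing this
+# (hypothetical) snippet:
+#   [[test]]
+#   name = "integration"
+# this offers "integration" as a completion.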
+_get_names_from_array()
+{
+ local -a filelist;
+ local manifest=$(_locate_manifest)
+ if [[ -z $manifest ]]; then
+ return 0
+ fi
+
+ local last_line
+ local -a names;
+ local in_block=false
+ local block_name=$1
+ names=()
+ while read line
+ do
+ if [[ $last_line == "[[$block_name]]" ]]; then
+ in_block=true
+ else
+ if [[ $last_line =~ '.*\[\[.*' ]]; then
+ in_block=false
+ fi
+ fi
+
+ if [[ $in_block == true ]]; then
+ if [[ $line =~ '.*name.*=' ]]; then
+ regexp-replace line '^.*name *= *|"' ""
+ names+=$line
+ fi
+ fi
+
+ last_line=$line
+ done < $manifest
+ _describe $block_name names
+
+}
+
+#Gets the test names from the manifest file
+_test_names()
+{
+ _get_names_from_array "test"
+}
+
+#Gets the bench names from the manifest file
+_benchmark_names()
+{
+ _get_names_from_array "bench"
+}
+
+# These flags are mutually exclusive specifiers for the scope of a command; as
+# they are used in multiple places without change, they are defined once here
+# and expanded into the appropriate command's `_arguments` spec.
+typeset -a command_scope_spec
+command_scope_spec=(
+ '(--bin --example --test --lib)--bench=[benchmark name]: :_benchmark_names'
+ '(--bench --bin --test --lib)--example=[example name]'
+ '(--bench --example --test --lib)--bin=[binary name]'
+ '(--bench --bin --example --test)--lib=[library name]'
+ '(--bench --bin --example --lib)--test=[test name]'
+)
+
+_cargo
--- /dev/null
+command -v cargo >/dev/null 2>&1 &&
+_cargo()
+{
+ local cur prev words cword cmd
+ _get_comp_words_by_ref cur prev words cword
+
+ COMPREPLY=()
+
+ cmd=${words[1]}
+
+ local vcs='git hg none'
+ local color='auto always never'
+ local msg_format='human json'
+
+ local opt_help='-h --help'
+ local opt_verbose='-v --verbose'
+ local opt_quiet='-q --quiet'
+ local opt_color='--color'
+ local opt_common="$opt_help $opt_verbose $opt_quiet $opt_color"
+ local opt_pkg='-p --package'
+ local opt_feat='--features --all-features --no-default-features'
+ local opt_mani='--manifest-path'
+ local opt_jobs='-j --jobs'
+ local opt_force='-f --force'
+ local opt_test='--test --bench'
+ local opt_lock='--frozen --locked'
+
+ local opt___nocmd="$opt_common -V --version --list"
+ local opt__bench="$opt_common $opt_pkg $opt_feat $opt_mani $opt_lock $opt_jobs $opt_test --message-format --target --lib --bin --example --no-run"
+ local opt__build="$opt_common $opt_pkg $opt_feat $opt_mani $opt_lock $opt_jobs $opt_test --message-format --target --lib --bin --example --release"
+ local opt__check="$opt_common $opt_pkg $opt_feat $opt_mani $opt_lock $opt_jobs $opt_test --message-format --target --lib --bin --example --release"
+ local opt__clean="$opt_common $opt_pkg $opt_mani $opt_lock --target --release"
+ local opt__doc="$opt_common $opt_pkg $opt_feat $opt_mani $opt_lock $opt_jobs --message-format --bin --lib --target --open --no-deps --release"
+ local opt__fetch="$opt_common $opt_mani $opt_lock"
+ local opt__generate_lockfile="${opt__fetch}"
+ local opt__git_checkout="$opt_common $opt_lock --reference --url"
+ local opt__help="$opt_help"
+ local opt__init="$opt_common $opt_lock --bin --lib --name --vcs"
+ local opt__install="$opt_common $opt_feat $opt_jobs $opt_lock $opt_force --bin --branch --debug --example --git --list --path --rev --root --tag --vers"
+ local opt__locate_project="$opt_mani -h --help"
+ local opt__login="$opt_common $opt_lock --host"
+ local opt__metadata="$opt_common $opt_feat $opt_mani $opt_lock --format-version --no-deps"
+ local opt__new="$opt_common $opt_lock --vcs --bin --lib --name"
+ local opt__owner="$opt_common $opt_lock -a --add -r --remove -l --list --index --token"
+ local opt__package="$opt_common $opt_mani $opt_lock $opt_jobs --allow-dirty -l --list --no-verify --no-metadata"
+ local opt__pkgid="${opt__fetch} $opt_pkg"
+ local opt__publish="$opt_common $opt_mani $opt_lock $opt_jobs --allow-dirty --dry-run --host --token --no-verify"
+ local opt__read_manifest="$opt_help $opt_verbose $opt_mani $opt_color --no-deps"
+ local opt__run="$opt_common $opt_feat $opt_mani $opt_lock $opt_jobs --message-format --target --bin --example --release"
+ local opt__rustc="$opt_common $opt_pkg $opt_feat $opt_mani $opt_lock $opt_jobs $opt_test --message-format --profile --target --lib --bin --example --release"
+ local opt__rustdoc="$opt_common $opt_pkg $opt_feat $opt_mani $opt_lock $opt_jobs $opt_test --message-format --target --lib --bin --example --release --open"
+ local opt__search="$opt_common $opt_lock --host --limit"
+ local opt__test="$opt_common $opt_pkg $opt_feat $opt_mani $opt_lock $opt_jobs $opt_test --message-format --all --doc --target --lib --bin --example --no-run --release --no-fail-fast"
+ local opt__uninstall="$opt_common $opt_lock --bin --root"
+ local opt__update="$opt_common $opt_pkg $opt_mani $opt_lock --aggressive --precise"
+ local opt__verify_project="${opt__fetch}"
+ local opt__version="$opt_help $opt_verbose $opt_color"
+ local opt__yank="$opt_common $opt_lock --vers --undo --index --token"
+
+ if [[ $cword -eq 1 ]]; then
+ if [[ "$cur" == -* ]]; then
+ COMPREPLY=( $( compgen -W "${opt___nocmd}" -- "$cur" ) )
+ else
+ COMPREPLY=( $( compgen -W "$__cargo_commands" -- "$cur" ) )
+ fi
+ elif [[ $cword -ge 2 ]]; then
+ case "${prev}" in
+ --vcs)
+ COMPREPLY=( $( compgen -W "$vcs" -- "$cur" ) )
+ ;;
+ --color)
+ COMPREPLY=( $( compgen -W "$color" -- "$cur" ) )
+ ;;
+ --message-format)
+ COMPREPLY=( $( compgen -W "$msg_format" -- "$cur" ) )
+ ;;
+ --manifest-path)
+ _filedir toml
+ ;;
+ --bin)
+ COMPREPLY=( $( compgen -W "$(_bin_names)" -- "$cur" ) )
+ ;;
+ --test)
+ COMPREPLY=( $( compgen -W "$(_test_names)" -- "$cur" ) )
+ ;;
+ --bench)
+ COMPREPLY=( $( compgen -W "$(_benchmark_names)" -- "$cur" ) )
+ ;;
+ --example)
+ COMPREPLY=( $( compgen -W "$(_get_examples)" -- "$cur" ) )
+ ;;
+ --target)
+ COMPREPLY=( $( compgen -W "$(_get_targets)" -- "$cur" ) )
+ ;;
+ help)
+ COMPREPLY=( $( compgen -W "$__cargo_commands" -- "$cur" ) )
+ ;;
+ *)
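+ # Indirect expansion: look up the option list for this subcommand,
+ # e.g. "generate-lockfile" resolves to $opt__generate_lockfile above.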
+ local opt_var=opt__${cmd//-/_}
+ COMPREPLY=( $( compgen -W "${!opt_var}" -- "$cur" ) )
+ ;;
+ esac
+ fi
+
+ # compopt does not work in bash version 3
+
+ return 0
+} &&
+complete -F _cargo cargo
+
+__cargo_commands=$(cargo --list 2>/dev/null | tail -n +2)
+
+_locate_manifest(){
+ local manifest=`cargo locate-project 2>/dev/null`
+ # regexp-replace manifest '\{"root":"|"\}' ''
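+ # Strip the leading '{"root":"' (9 chars) and the trailing '"}' from
+ # the JSON emitted by 'cargo locate-project'.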
+ echo ${manifest:9:-2}
+}
+
+# Extracts the values of "name" from the array given in $1 and shows them as
+# command line options for completion
+_get_names_from_array()
+{
+ local manifest=$(_locate_manifest)
+ if [[ -z $manifest ]]; then
+ return 0
+ fi
+
+ local last_line
+ local -a names
+ local in_block=false
+ local block_name=$1
+ while read line
+ do
+ if [[ $last_line == "[[$block_name]]" ]]; then
+ in_block=true
+ else
+ if [[ $last_line =~ .*\[\[.* ]]; then
+ in_block=false
+ fi
+ fi
+
+ if [[ $in_block == true ]]; then
+ if [[ $line =~ .*name.*\= ]]; then
+ line=${line##*=}
+ line=${line%%\"}
+ line=${line##*\"}
+ names+=($line)
+ fi
+ fi
+
+ last_line=$line
+ done < $manifest
+ echo "${names[@]}"
+}
+
+#Gets the bin names from the manifest file
+_bin_names()
+{
+ _get_names_from_array "bin"
+}
+
+#Gets the test names from the manifest file
+_test_names()
+{
+ _get_names_from_array "test"
+}
+
+#Gets the bench names from the manifest file
+_benchmark_names()
+{
+ _get_names_from_array "bench"
+}
+
+_get_examples(){
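+ # Example names are the basenames (minus .rs) of examples/*.rs next to
+ # the manifest.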
+ local files=($(dirname $(_locate_manifest))/examples/*.rs)
+ local names=("${files[@]##*/}")
+ local names=("${names[@]%.*}")
+ # "*" means no examples found
+ if [[ "${names[@]}" != "*" ]]; then
+ echo "${names[@]}"
+ fi
+}
+
+_get_targets(){
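+ # Collect target triples from [target.<triple>] sections in any
+ # .cargo/config found between the manifest directory and /.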
+ local CURRENT_PATH
+ if [ `uname -o` == "Cygwin" -a -f "$PWD"/Cargo.toml ]; then
+ CURRENT_PATH=$PWD
+ else
+ CURRENT_PATH=$(_locate_manifest)
+ fi
+ if [[ -z "$CURRENT_PATH" ]]; then
+ return 1
+ fi
+ local TARGETS=()
+ local FIND_PATHS=( "/" )
+ local FIND_PATH LINES LINE
+ while [[ "$CURRENT_PATH" != "/" ]]; do
+ FIND_PATHS+=( "$CURRENT_PATH" )
+ CURRENT_PATH=$(dirname $CURRENT_PATH)
+ done
+ for FIND_PATH in ${FIND_PATHS[@]}; do
+ if [[ -f "$FIND_PATH"/.cargo/config ]]; then
+ LINES=( `grep "$FIND_PATH"/.cargo/config -e "^\[target\."` )
+ for LINE in ${LINES[@]}; do
+ TARGETS+=(`sed 's/^\[target\.\(.*\)\]$/\1/' <<< $LINE`)
+ done
+ fi
+ done
+ echo "${TARGETS[@]}"
+}
+# vim:ft=sh
--- /dev/null
+.TH "CARGO\-BENCH" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-bench \- Execute benchmarks of a package
+.SH SYNOPSIS
+.PP
+\f[I]cargo bench\f[] [OPTIONS] [\-\-] [<ARGS>...]
+.SH DESCRIPTION
+.PP
+Execute all benchmarks of a local package.
+.PP
+All of the trailing arguments are passed to the benchmark binaries
+generated for filtering benchmarks and generally providing options
+configuring how they run.
+.PP
+If the \f[B]\-\-package\f[] argument is given, then \f[I]SPEC\f[] is a
+package id specification which indicates which package should be built.
+If it is not given, then the current package is built.
+For more information on \f[I]SPEC\f[] and its format, see the "cargo
+help pkgid" command.
+.PP
+The \f[B]\-\-jobs\f[] argument affects the building of the benchmark
+executable but does not affect how many jobs are used when running the
+benchmarks.
+.PP
+Compilation can be customized with the \[aq]bench\[aq] profile in the
+manifest.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-lib
+Benchmark only this package\[aq]s library.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Benchmark only the specified binary.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]NAME\f[]
+Benchmark only the specified example.
+.RS
+.RE
+.TP
+.B \-\-test \f[I]NAME\f[]
+Benchmark only the specified test target.
+.RS
+.RE
+.TP
+.B \-\-bench \f[I]NAME\f[]
+Benchmark only the specified bench target.
+.RS
+.RE
+.TP
+.B \-\-no\-run
+Compile, but don\[aq]t run benchmarks.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC ...\f[]
+Package to run benchmarks for.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-release
+Build artifacts in release mode, with optimizations.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to also build.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not build the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Build for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Execute all the benchmarks of the current package
+.IP
+.nf
+\f[C]
+$\ cargo\ bench
+\f[]
+.fi
+.PP
+Execute the BENCH benchmark
+.IP
+.nf
+\f[C]
+$\ cargo\ bench\ \-\-bench\ BENCH
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-build(1), cargo\-test(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-BUILD" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-build \- Compile the current project
+.SH SYNOPSIS
+.PP
+\f[I]cargo build\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Compile a local package and all of its dependencies.
+.PP
+If the \f[B]\-\-package\f[] argument is given, then \f[I]SPEC\f[] is a
+package id specification which indicates which package should be built.
+If it is not given, then the current package is built.
+For more information on \f[I]SPEC\f[] and its format, see the "cargo
+help pkgid" command.
+.PP
+Compilation can be configured via the use of profiles which are
+configured in the manifest.
+The default profile for this command is \f[I]dev\f[], but passing the
+\f[B]\-\-release\f[] flag will use the \f[I]release\f[] profile instead.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC ...\f[]
+Package to build.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-lib
+Build only this package\[aq]s library.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Build only the specified binary.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]NAME\f[]
+Build only the specified example.
+.RS
+.RE
+.TP
+.B \-\-test \f[I]NAME\f[]
+Build only the specified test target.
+.RS
+.RE
+.TP
+.B \-\-bench \f[I]NAME\f[]
+Build only the specified benchmark target.
+.RS
+.RE
+.TP
+.B \-\-release
+Build artifacts in release mode, with optimizations.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to also build.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not build the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Build for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Build a local package and all of its dependencies
+.IP
+.nf
+\f[C]
+$\ cargo\ build
+\f[]
+.fi
+.PP
+Build a package with optimizations
+.IP
+.nf
+\f[C]
+$\ cargo\ build\ \-\-release
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-CHECK" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-check \- Check the current project
+.SH SYNOPSIS
+.PP
+\f[I]cargo check\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Check a local package and all of its dependencies.
+.PP
+If the \f[B]\-\-package\f[] argument is given, then \f[I]SPEC\f[] is a
+package id specification which indicates which package should be checked.
+If it is not given, then the current package is checked.
+For more information on \f[I]SPEC\f[] and its format, see the "cargo
+help pkgid" command.
+.PP
+Compilation can be configured via the use of profiles which are
+configured in the manifest.
+The default profile for this command is \f[I]dev\f[], but passing the
+\f[B]\-\-release\f[] flag will use the \f[I]release\f[] profile instead.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC ...\f[]
+Package to check.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-lib
+Check only this package\[aq]s library.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Check only the specified binary.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]NAME\f[]
+Check only the specified example.
+.RS
+.RE
+.TP
+.B \-\-test \f[I]NAME\f[]
+Check only the specified test target.
+.RS
+.RE
+.TP
+.B \-\-bench \f[I]NAME\f[]
+Check only the specified benchmark target.
+.RS
+.RE
+.TP
+.B \-\-release
+Check artifacts in release mode.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Check with all available features.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to also check.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not check the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Check for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Check a local package and all of its dependencies
+.IP
+.nf
+\f[C]
+$\ cargo\ check
+\f[]
+.fi
+.PP
+Check a package in release mode
+.IP
+.nf
+\f[C]
+$\ cargo\ check\ \-\-release
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-CLEAN" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-clean \- Remove generated artifacts
+.SH SYNOPSIS
+.PP
+\f[I]cargo clean\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Remove artifacts that cargo has generated in the past.
+.PP
+If the \f[B]\-\-package\f[] argument is given, then \f[I]SPEC\f[] is a
+package id specification which indicates which package should be built.
+If it is not given, then the current package is built.
+For more information on \f[I]SPEC\f[] and its format, see the "cargo
+help pkgid" command.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC ...\f[]
+Package to clean artifacts for.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path PATH
+Path to the manifest of the package to clean.
+.RS
+.RE
+.TP
+.B \-\-target TRIPLE
+Target triple to clean output for (default all).
+.RS
+.RE
+.TP
+.B \-\-release
+Whether or not to clean release artifacts.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Remove local package generated artifacts
+.IP
+.nf
+\f[C]
+$\ cargo\ clean
+\f[]
+.fi
+.PP
+Clean release artifacts
+.IP
+.nf
+\f[C]
+$\ cargo\ clean\ \-\-release
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-build(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-DOC" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-doc \- Build a package\[aq]s documentation
+.SH SYNOPSIS
+.PP
+\f[I]cargo doc\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Build a package\[aq]s documentation.
+.PP
+By default the documentation for the local package and all dependencies
+is built.
+The output is all placed in \[aq]target/doc\[aq] in rustdoc\[aq]s usual
+format.
+.PP
+If the \f[B]\-\-package\f[] argument is given, then \f[I]SPEC\f[] is a
+package id specification which indicates which package should be built.
+If it is not given, then the current package is built.
+For more information on \f[I]SPEC\f[] and its format, see the "cargo
+help pkgid" command.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC ...\f[]
+Package to document.
+.RS
+.RE
+.TP
+.B \-\-open
+Opens the docs in a browser after the operation.
+.RS
+.RE
+.TP
+.B \-\-no\-deps
+Don\[aq]t build documentation for dependencies.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-release
+Build artifacts in release mode, with optimizations.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to also build.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not build the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Build for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Build a local package\[aq]s documentation in \[aq]target/doc\[aq]
+.IP
+.nf
+\f[C]
+$\ cargo\ doc
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-build(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-FETCH" "1" "July 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-fetch \- Fetch dependencies of a package from the network
+.SH SYNOPSIS
+.PP
+\f[I]cargo fetch\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+If a lockfile is available, this command will ensure that all of the git
+dependencies and/or registry dependencies are downloaded and locally
+available. The network is never touched after a `cargo fetch` unless
+the lockfile changes.
+.PP
+If the lockfile is not available, then this is the equivalent of
+`cargo generate-lockfile`. A lockfile is generated and all dependencies
+are updated.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-manifest-path \f[I]PATH\f[]
+Path to the manifest to fetch dependencies for.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
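+.SH EXAMPLES
+.PP
+Fetch all dependencies for the package in the current directory
+.IP
+.nf
+\f[C]
+$\ cargo\ fetch
+\f[]
+.fi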
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-update(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-GENERATE LOCKFILE" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-generate-lockfile \- Generate the lockfile for a project
+.SH SYNOPSIS
+.PP
+\f[I]cargo generate-lockfile\f[] [OPTIONS]
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-manifest-path \f[I]PATH\f[]
+Path to the manifest to generate a lockfile for.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH SEE ALSO
+.PP
+cargo(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-INIT" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-init \- Create a new cargo package in the current directory
+.SH SYNOPSIS
+.PP
+\f[I]cargo init\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Create a new cargo package in the current directory.
+.PP
+Use the \f[B]\-\-vcs\f[] option to control the version control system to
+use.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-vcs \f[I]VCS\f[]
+Initialize a new repository for the given version control system (git or
+hg) or do not initialize any version control at all (none), overriding a
+global configuration.
+.RS
+.RE
+.TP
+.B \-\-bin
+Use a binary instead of a library template.
+.RS
+.RE
+.TP
+.B \-\-name \f[I]NAME\f[]
+Set the resulting package name.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Initialize a binary cargo package in the current directory
+.IP
+.nf
+\f[C]
+$\ cargo\ init\ \-\-bin
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-new(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-INSTALL" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-install \- Install a Rust binary
+.SH SYNOPSIS
+.PP
+\f[I]cargo install\f[] [OPTIONS] <CRATE>
+.PP
+\f[I]cargo install\f[] [OPTIONS] \-\-list
+.SH DESCRIPTION
+.PP
+Install a Rust binary
+.PP
+This command manages Cargo\[aq]s local set of installed binary crates.
+Only packages which have [[bin]] targets can be installed, and all
+binaries are installed into the installation root\[aq]s \f[I]bin\f[]
+folder.
+The installation root is determined, in order of precedence, by
+\f[B]\-\-root\f[], \f[I]$CARGO_INSTALL_ROOT\f[], the
+\f[I]install.root\f[] configuration key, and finally the home directory
+(which is either \f[I]$CARGO_HOME\f[] if set or \f[I]$HOME/.cargo\f[] by
+default).
+.PP
+There are multiple sources from which a crate can be installed.
+The default location is crates.io but the \f[B]\-\-git\f[] and
+\f[B]\-\-path\f[] flags can change this source.
+If the source contains more than one package (such as \f[I]crates.io\f[]
+or a git repository with multiple crates) the \f[B]<crate>\f[] argument is
+required to indicate which crate should be installed.
+.PP
+Crates from crates.io can optionally specify the version they wish to
+install via the \f[B]\-\-vers\f[] flag, and similarly packages from git
+repositories can optionally specify the branch, tag, or revision that
+should be installed.
+If a crate has multiple binaries, the \f[B]\-\-bin\f[] argument can
+selectively install only one of them, and if you\[aq]d rather install
+examples the \f[B]\-\-example\f[] argument can be used as well.
+.PP
+As a special convenience, omitting the <crate> specification entirely
+will install the crate in the current directory.
+That is, \f[I]install\f[] is equivalent to the more explicit "install
+\-\-path .".
+.PP
+The \f[B]\-\-list\f[] option will list all installed packages (and their
+versions).
+.SH OPTIONS
+.SS Query options
+.TP
+.B \-\-list
+List all installed packages (and their versions).
+.RS
+.RE
+.SS Specifying what crate to install
+.TP
+.B \-\-vers \f[I]VERS\f[]
+Specify a version to install from crates.io.
+.RS
+.RE
+.TP
+.B \-\-git \f[I]URL\f[]
+Git URL to install the specified crate from.
+.RS
+.RE
+.TP
+.B \-\-branch \f[I]BRANCH\f[]
+Branch to use when installing from git.
+.RS
+.RE
+.TP
+.B \-\-tag \f[I]TAG\f[]
+Tag to use when installing from git.
+.RS
+.RE
+.TP
+.B \-\-rev \f[I]SHA\f[]
+Specific commit to use when installing from git.
+.RS
+.RE
+.TP
+.B \-\-path \f[I]PATH\f[]
+Filesystem path to local crate to install.
+.RS
+.RE
+.SS Build and install options
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to activate.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-f, \-\-force
+Force overwriting existing crates or binaries.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not build the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-debug
+Build in debug mode instead of release mode.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Only install the binary NAME.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]EXAMPLE\f[]
+Install the example EXAMPLE instead of binaries.
+.RS
+.RE
+.TP
+.B \-\-root \f[I]DIR\f[]
+Directory to install packages into.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
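+.SH EXAMPLES
+.PP
+Install version 1.0.0 of the foo crate from crates.io
+.IP
+.nf
+\f[C]
+$\ cargo\ install\ foo\ \-\-vers\ 1.0.0
+\f[]
+.fi
+.PP
+Install the crate in the current directory
+.IP
+.nf
+\f[C]
+$\ cargo\ install\ \-\-path\ .
+\f[]
+.fi
+.PP
+List all installed packages and their versions
+.IP
+.nf
+\f[C]
+$\ cargo\ install\ \-\-list
+\f[]
+.fi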
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-search(1), cargo\-publish(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-LOGIN" "1" "July 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-login \- Save an API token from the registry locally
+.SH SYNOPSIS
+.PP
+\f[I]cargo login\f[] [OPTIONS] [<TOKEN>]
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-host \f[I]HOST\f[]
+Host to set the token for.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
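+.SH EXAMPLES
+.PP
+Save a registry API token locally (a placeholder token is shown)
+.IP
+.nf
+\f[C]
+$\ cargo\ login\ TOKEN
+\f[]
+.fi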
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-publish(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-METADATA" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-metadata \- Machine-readable metadata about the current project
+.SH SYNOPSIS
+.PP
+\f[I]cargo metadata\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Output the resolved dependencies of a project in machine\-readable
+format, including the concrete versions in use and any overrides.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to activate.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not include the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-no\-deps
+Output information only about the root package and don\[aq]t fetch
+dependencies.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest.
+.RS
+.RE
+.TP
+.B \-\-format\-version \f[I]VERSION\f[]
+Format version [default: 1]. Valid values: 1.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
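+.SH EXAMPLES
+.PP
+Print the resolved dependencies of the current project in the version 1
+format
+.IP
+.nf
+\f[C]
+$\ cargo\ metadata\ \-\-format\-version\ 1
+\f[]
+.fi
+.PP
+Print information about the root package only, without fetching
+dependencies
+.IP
+.nf
+\f[C]
+$\ cargo\ metadata\ \-\-no\-deps
+\f[]
+.fi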
+.SH SEE ALSO
+.PP
+cargo(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-NEW" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-new \- Create a new cargo package
+.SH SYNOPSIS
+.PP
+\f[I]cargo new\f[] [OPTIONS] <PATH>
+.SH DESCRIPTION
+.PP
+Create a new cargo package at <PATH>.
+.PP
+Use the \f[B]\-\-vcs\f[] option to control the version control system to
+use.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-vcs \f[I]VCS\f[]
+Initialize a new repository for the given version control system (git or
+hg) or do not initialize any version control at all (none) overriding a
+global configuration.
+.RS
+.RE
+.TP
+.B \-\-bin
+Use a binary instead of a library template.
+.RS
+.RE
+.TP
+.B \-\-name \f[I]NAME\f[]
+Set the resulting package name.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Create a binary cargo package in the current directory
+.IP
+.nf
+\f[C]
+$\ cargo\ new\ \-\-bin\ ./
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-init(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-OWNER" "1" "July 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-owner \- Manage the owners of a crate on the registry
+.SH SYNOPSIS
+.PP
+\f[I]cargo owner\f[] [OPTIONS] [<CRATE>]
+.SH DESCRIPTION
+.PP
+This command will modify the owners for a package on the specified
+registry (or default). Note that owners of a package can upload new
+versions, and yank old versions. Explicitly named owners can also modify
+the set of owners, so take caution!
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-a, \-\-add \f[I]LOGIN\f[]
+Name of a user or team to add as an owner.
+.RS
+.RE
+.TP
+.B \-r, \-\-remove \f[I]LOGIN\f[]
+Name of a user or team to remove as an owner.
+.RS
+.RE
+.TP
+.B \-l, \-\-list
+List owners of a crate.
+.RS
+.RE
+.TP
+.B \-\-index \f[I]INDEX\f[]
+Registry index to modify owners for.
+.RS
+.RE
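+.TP
+.B \-\-token \f[I]TOKEN\f[]
+API token to use when authenticating.
+.RS
+.RE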
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Add user as an owner of the current package
+.IP
+.nf
+\f[C]
+$\ cargo\ owner\ \-\-add\ user
+\f[]
+.fi
+.PP
+Remove user as an owner of the current package
+.IP
+.nf
+\f[C]
+$\ cargo\ owner\ \-\-remove\ user
+\f[]
+.fi
+.PP
+Use a certain API token to authenticate with
+.IP
+.nf
+\f[C]
+$\ cargo\ owner\ \-\-token\ U6WHXacP3Qqwd5kze1fohr4JEOmGCuRK2
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-publish(1), cargo\-login(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-PACKAGE" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-package \- Create a distributable tarball
+.SH SYNOPSIS
+.PP
+\f[I]cargo package\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Assemble the local package into a distributable tarball.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-l, \-\-list
+Print files included in a package without making one.
+.RS
+.RE
+.TP
+.B \-\-no\-verify
+Don\[aq]t verify the contents by building them.
+.RS
+.RE
+.TP
+.B \-\-no\-metadata
+Ignore warnings about a lack of human\-usable metadata.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
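+.SH EXAMPLES
+.PP
+Assemble the local package into a distributable tarball
+.IP
+.nf
+\f[C]
+$\ cargo\ package
+\f[]
+.fi
+.PP
+List the files that would be packaged without creating the tarball
+.IP
+.nf
+\f[C]
+$\ cargo\ package\ \-\-list
+\f[]
+.fi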
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-build(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-PKGID" "1" "July 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-pkgid \- Print a fully qualified package specification
+.SH SYNOPSIS
+.PP
+\f[I]cargo pkgid\f[] [OPTIONS] [<SPEC>]
+.SH DESCRIPTION
+.PP
+Given a <SPEC> argument, print out the fully qualified package id
+specifier. This command will generate an error if <SPEC> is ambiguous as
+to which package it refers to in the dependency graph. If no <SPEC> is
+given, then the pkgid for the local package is printed.
+.PP
+This command requires that a lockfile is available and dependencies have
+been fetched.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest of the package.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Retrieve package specification for foo package
+.IP
+.nf
+\f[C]
+$\ cargo\ pkgid\ foo
+\f[]
+.fi
+.PP
+Retrieve package specification for version 1.0.0 of foo
+.IP
+.nf
+\f[C]
+$\ cargo\ pkgid\ foo:1.0.0
+\f[]
+.fi
+.PP
+Retrieve package specification for foo from crates.io
+.IP
+.nf
+\f[C]
+$\ cargo\ pkgid\ crates.io/foo
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-generate\-lockfile(1), cargo\-search(1), cargo\-metadata(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-PUBLISH" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-publish \- Upload a package to the registry
+.SH SYNOPSIS
+.PP
+\f[I]cargo publish\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Upload a package to the registry.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-host \f[I]HOST\f[]
+Host to upload the package to.
+.RS
+.RE
+.TP
+.B \-\-token \f[I]TOKEN\f[]
+Token to use when uploading.
+.RS
+.RE
+.TP
+.B \-\-no\-verify
+Don\[aq]t verify package tarball before publish.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest of the package to publish.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
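+.SH EXAMPLES
+.PP
+Upload the package in the current directory to the registry
+.IP
+.nf
+\f[C]
+$\ cargo\ publish
+\f[]
+.fi
+.PP
+Upload using a specific API token (a placeholder token is shown)
+.IP
+.nf
+\f[C]
+$\ cargo\ publish\ \-\-token\ TOKEN
+\f[]
+.fi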
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-install(1), cargo\-search(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-RUN" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-run \- Run the current project
+.SH SYNOPSIS
+.PP
+\f[I]cargo run\f[] [OPTIONS] [\-\-] [<ARGS>...]
+.SH DESCRIPTION
+.PP
+Run the main binary of the local package (src/main.rs).
+.PP
+If neither \f[B]\-\-bin\f[] nor \f[B]\-\-example\f[] are given, then if
+the project only has one bin target it will be run.
+Otherwise \f[B]\-\-bin\f[] specifies the bin target to run, and
+\f[B]\-\-example\f[] specifies the example target to run.
+At most one of \f[B]\-\-bin\f[] or \f[B]\-\-example\f[] can be provided.
+.PP
+All of the trailing arguments are passed to the binary to run.
+If you\[aq]re passing arguments to both Cargo and the binary, the ones
+after \f[B]\-\-\f[] go to the binary, the ones before go to Cargo.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Name of the bin target to run.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]NAME\f[]
+Name of the example target to run.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-release
+Build artifacts in release mode, with optimizations.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to also build.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not build the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Build for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Run the main binary of the current package
+.IP
+.nf
+\f[C]
+$\ cargo\ run
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-new(1), cargo\-init(1), cargo\-build(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-RUSTC" "1" "July 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-rustc \- Compile a package and all of its dependencies
+.SH SYNOPSIS
+.PP
+\f[I]cargo rustc\f[] [OPTIONS] [\-\-] [<OPTS>...]
+.SH DESCRIPTION
+.PP
+The specified target for the current package (or package specified by
+SPEC if provided) will be compiled along with all of its dependencies.
+The specified <OPTS>...
+will all be passed to the final compiler invocation, not any of the
+dependencies.
+Note that the compiler will still unconditionally receive arguments such
+as \-L, \-\-extern, and \-\-crate\-type, and the specified <OPTS>...
+will simply be added to the compiler invocation.
+.PP
+This command requires that only one target is being compiled.
+If more than one target is available for the current package the filters
+of \-\-lib, \-\-bin, etc, must be used to select which target is
+compiled.
+To pass flags to all compiler processes spawned by Cargo, use the
+$RUSTFLAGS environment variable or the \f[C]build.rustflags\f[]
+configuration option.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC\f[]
+Package to compile.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-lib
+Build only this package\[aq]s library.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Build only the specified binary.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]NAME\f[]
+Build only the specified example.
+.RS
+.RE
+.TP
+.B \-\-test \f[I]NAME\f[]
+Build only the specified test target.
+.RS
+.RE
+.TP
+.B \-\-bench \f[I]NAME\f[]
+Build only the specified benchmark target.
+.RS
+.RE
+.TP
+.B \-\-release
+Build artifacts in release mode, with optimizations.
+.RS
+.RE
+.TP
+.B \-\-profile \f[I]PROFILE\f[]
+Profile to build the selected target for.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to also build.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not compile default features for the package.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Build for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
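+.SH EXAMPLES
+.PP
+Pass an extra flag to the final compiler invocation for the library
+target only
+.IP
+.nf
+\f[C]
+$\ cargo\ rustc\ \-\-lib\ \-\-\ \-C\ opt\-level=3
+\f[]
+.fi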
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-run(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-RUSTDOC" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-rustdoc \- Build a package\[aq]s documentation, using specified
+custom flags.
+.SH SYNOPSIS
+.PP
+\f[I]cargo rustdoc\f[] [OPTIONS] [\-\-] [<OPTS>...]
+.SH DESCRIPTION
+.PP
+The specified target for the current package (or package specified by
+SPEC if provided) will be documented with the specified <OPTS>...
+being passed to the final rustdoc invocation.
+Dependencies will not be documented as part of this command.
+Note that rustdoc will still unconditionally receive arguments such as
+\-L, \-\-extern, and \-\-crate\-type, and the specified <OPTS>...
+will simply be added to the rustdoc invocation.
+.PP
+If the \-\-package argument is given, then SPEC is a package id
+specification which indicates which package should be documented.
+If it is not given, then the current package is documented.
+For more information on SPEC and its format, see the
+\f[C]cargo\ help\ pkgid\f[] command.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-open
+Open the docs in a browser after the operation.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC\f[]
+Package to document.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-lib
+Build only this package\[aq]s library.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Build only the specified binary.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]NAME\f[]
+Build only the specified example.
+.RS
+.RE
+.TP
+.B \-\-test \f[I]NAME\f[]
+Build only the specified test target.
+.RS
+.RE
+.TP
+.B \-\-bench \f[I]NAME\f[]
+Build only the specified benchmark target.
+.RS
+.RE
+.TP
+.B \-\-release
+Build artifacts in release mode, with optimizations.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space-separated list of features to also build.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not build the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Build for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to document.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
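+.SH EXAMPLES
+.PP
+Document the current package and open the result in a browser
+.IP
+.nf
+\f[C]
+$\ cargo\ rustdoc\ \-\-open
+\f[]
+.fi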
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-doc(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-SEARCH" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-search \- Search packages in crates.io
+.SH SYNOPSIS
+.PP
+\f[I]cargo search\f[] [OPTIONS] <QUERY>...
+.SH DESCRIPTION
+.PP
+Search packages in \f[I]crates.io\f[].
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-host \f[I]HOST\f[]
+Host of a registry to search in.
+.RS
+.RE
+.TP
+.B \-\-limit \f[I]LIMIT\f[]
+Limit the number of results (default: 10, max: 100).
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
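+.SH EXAMPLES
+.PP
+Search crates.io for packages matching foo, showing at most 20 results
+.IP
+.nf
+\f[C]
+$\ cargo\ search\ foo\ \-\-limit\ 20
+\f[]
+.fi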
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-install(1), cargo\-publish(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-TEST" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-test \- Execute unit and integration tests of a package
+.SH SYNOPSIS
+.PP
+\f[I]cargo test\f[] [OPTIONS] [\-\-] [<ARGS>...]
+.SH DESCRIPTION
+.PP
+Execute all unit and integration tests of a local package.
+.PP
+All of the trailing arguments are passed to the test binaries generated
+for filtering tests and generally providing options configuring how they
+run.
+For example, this will run all tests with \[aq]foo\[aq] in their name:
+.IP
+.nf
+\f[C]
+cargo\ test\ foo
+\f[]
+.fi
+.PP
+If the \f[B]\-\-package\f[] argument is given, then \[aq]SPEC\[aq] is a
+package id specification which indicates which package should be tested.
+If it is not given, then the current package is tested.
+For more information on \[aq]SPEC\[aq] and its format, see the "cargo
+help pkgid" command.
+.PP
+The \f[B]\-\-jobs\f[] argument affects the building of the test
+executable but does not affect how many jobs are used when running the
+tests.
+.PP
+Compilation can be configured via the \[aq]test\[aq] profile in the
+manifest.
+.PP
+By default the Rust test harness hides output from test execution to
+keep results readable.
+Test output can be recovered (e.g.
+for debugging) by passing \f[B]\-\-nocapture\f[] to the test binaries:
+.IP
+.nf
+\f[C]
+cargo\ test\ \-\-\ \-\-nocapture
+\f[]
+.fi
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-lib
+Test only this package\[aq]s library.
+.RS
+.RE
+.TP
+.B \-\-doc
+Test only this library\[aq]s documentation.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Test only the specified binary.
+.RS
+.RE
+.TP
+.B \-\-example \f[I]NAME\f[]
+Test only the specified example.
+.RS
+.RE
+.TP
+.B \-\-test \f[I]NAME\f[]
+Test only the specified integration test target.
+.RS
+.RE
+.TP
+.B \-\-bench \f[I]NAME\f[]
+Test only the specified benchmark target.
+.RS
+.RE
+.TP
+.B \-\-no\-run
+Compile, but don\[aq]t run tests.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC ...\f[]
+Package to run tests for.
+.RS
+.RE
+.TP
+.B \-j \f[I]N\f[], \-\-jobs \f[I]N\f[]
+Number of parallel jobs, defaults to # of CPUs.
+.RS
+.RE
+.TP
+.B \-\-release
+Build artifacts in release mode, with optimizations.
+.RS
+.RE
+.TP
+.B \-\-features \f[I]FEATURES\f[]
+Space\-separated list of features to also build.
+.RS
+.RE
+.TP
+.B \-\-all\-features
+Build all available features.
+.RS
+.RE
+.TP
+.B \-\-no\-default\-features
+Do not build the \f[C]default\f[] feature.
+.RS
+.RE
+.TP
+.B \-\-target \f[I]TRIPLE\f[]
+Build for the target triple.
+.RS
+.RE
+.TP
+.B \-\-manifest\-path \f[I]PATH\f[]
+Path to the manifest to compile.
+.RS
+.RE
+.TP
+.B \-\-no\-fail\-fast
+Run all tests regardless of failure.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Execute all the unit and integration tests of the current package
+.IP
+.nf
+\f[C]
+$\ cargo\ test
+\f[]
+.fi
+.PP
+Execute the BENCH benchmark
+.IP
+.nf
+\f[C]
+$\ cargo\ test\ \-\-bench\ BENCH
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-build(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-UNINSTALL" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-uninstall \- Remove a Rust binary
+.SH SYNOPSIS
+.PP
+\f[I]cargo uninstall\f[] [OPTIONS] <SPEC>
+.PP
+\f[I]cargo uninstall\f[] (\-h | \-\-help)
+.SH DESCRIPTION
+.PP
+The argument SPEC is a package id specification (see
+\f[C]cargo\ help\ pkgid\f[]) to specify which crate should be
+uninstalled.
+By default all binaries are uninstalled for a crate but the
+\f[C]\-\-bin\f[] and \f[C]\-\-example\f[] flags can be used to only
+uninstall particular binaries.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-root \f[I]DIR\f[]
+Directory to uninstall packages from.
+.RS
+.RE
+.TP
+.B \-\-bin \f[I]NAME\f[]
+Only uninstall the binary NAME.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
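+.SH EXAMPLES
+.PP
+Remove all binaries installed for the foo crate
+.IP
+.nf
+\f[C]
+$\ cargo\ uninstall\ foo
+\f[]
+.fi
+.PP
+Remove only the bar binary of the foo crate
+.IP
+.nf
+\f[C]
+$\ cargo\ uninstall\ foo\ \-\-bin\ bar
+\f[]
+.fi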
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-install(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-UPDATE" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-update \- Update the package dependencies
+.SH SYNOPSIS
+.PP
+\f[I]cargo update\f[] [OPTIONS]
+.SH DESCRIPTION
+.PP
+Update dependencies as recorded in the local lock file.
+.PP
+This command requires that a \f[I]Cargo.lock\f[] already exists as
+generated by \f[I]cargo build\f[] or related commands.
+.PP
+If \f[I]SPEC\f[] is given, then a conservative update of the
+\f[I]lockfile\f[] will be performed.
+This means that only the dependency specified by \f[I]SPEC\f[] will be
+updated.
+Its transitive dependencies will be updated only if \f[I]SPEC\f[] cannot
+be updated without updating dependencies.
+All other dependencies will remain locked at their currently recorded
+versions.
+.PP
+If \f[I]PRECISE\f[] is specified, then \f[B]\-\-aggressive\f[] must not
+also be specified.
+The argument \f[I]PRECISE\f[] is a string representing a precise
+revision that the package being updated should be updated to.
+For example, if the package comes from a git repository, then
+\f[I]PRECISE\f[] would be the exact revision that the repository should
+be updated to.
+.PP
+If \f[I]SPEC\f[] is not given, then all dependencies will be
+re\-resolved and updated.
+.PP
+For more information about package id specifications, see "cargo help
+pkgid".
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-p \f[I]SPEC\f[], \-\-package \f[I]SPEC ...\f[]
+Package to update.
+.RS
+.RE
+.TP
+.B \-\-aggressive
+Force updating all dependencies of \f[I]SPEC\f[] as well.
+.RS
+.RE
+.TP
+.B \-\-precise \f[I]PRECISE\f[]
+Update a single dependency to exactly \f[I]PRECISE\f[].
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
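+.SH EXAMPLES
+.PP
+Re\-resolve and update all dependencies in Cargo.lock
+.IP
+.nf
+\f[C]
+$\ cargo\ update
+\f[]
+.fi
+.PP
+Conservatively update only the foo dependency
+.IP
+.nf
+\f[C]
+$\ cargo\ update\ \-p\ foo
+\f[]
+.fi
+.PP
+Update foo to exactly version 1.2.3 (a placeholder revision)
+.IP
+.nf
+\f[C]
+$\ cargo\ update\ \-p\ foo\ \-\-precise\ 1.2.3
+\f[]
+.fi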
+.SH SEE ALSO
+.PP
+cargo(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-VERSION" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-version \- Show version information
+.SH SYNOPSIS
+.PP
+\f[I]cargo version\f[] [OPTIONS]
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
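+.SH EXAMPLES
+.PP
+Print cargo\[aq]s version
+.IP
+.nf
+\f[C]
+$\ cargo\ version
+\f[]
+.fi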
+.SH SEE ALSO
+.PP
+cargo(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO\-YANK" "1" "July 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo\-yank \- Remove a pushed crate from the index
+.SH SYNOPSIS
+.PP
+\f[I]cargo yank\f[] [OPTIONS] [<CRATE>]
+.SH DESCRIPTION
+.PP
+The yank command removes a previously pushed crate\[aq]s version from
+the server\[aq]s index.
+This command does not delete any data, and the crate will still be
+available for download via the registry\[aq]s download link.
+.PP
+Note that existing crates locked to a yanked version will still be able
+to download and use the yanked version.
+Cargo will, however, not allow any new crates to be locked to any yanked
+version.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Print this message.
+.RS
+.RE
+.TP
+.B \-\-vers \f[I]VERSION\f[]
+The version to yank or un\-yank.
+.RS
+.RE
+.TP
+.B \-\-undo
+Undo a yank, putting a version back into the index.
+.RS
+.RE
+.TP
+.B \-\-index \f[I]INDEX\f[]
+Registry index to yank from.
+.RS
+.RE
+.TP
+.B \-\-token \f[I]TOKEN\f[]
+API token to use when authenticating.
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-q, \-\-quiet
+No output printed to stdout.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Coloring: auto, always, never.
+.RS
+.RE
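+.SH EXAMPLES
+.PP
+Yank version 1.0.1 of the foo crate from the index
+.IP
+.nf
+\f[C]
+$\ cargo\ yank\ \-\-vers\ 1.0.1\ foo
+\f[]
+.fi
+.PP
+Undo the yank, making that version available to new projects again
+.IP
+.nf
+\f[C]
+$\ cargo\ yank\ \-\-undo\ \-\-vers\ 1.0.1\ foo
+\f[]
+.fi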
+.SH SEE ALSO
+.PP
+cargo(1), cargo\-owner(1), cargo\-version(1)
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+.TH "CARGO" "1" "May 2016" "The Rust package manager" "Cargo Manual"
+.hy
+.SH NAME
+.PP
+cargo \- The Rust package manager
+.SH SYNOPSIS
+.PP
+\f[I]cargo\f[] <COMMAND> [<ARGS>...]
+.SH DESCRIPTION
+.PP
+This program is a package manager for the Rust language, available at
+<http://rust-lang.org>.
+.SH OPTIONS
+.TP
+.B \-h, \-\-help
+Display a help message.
+.RS
+.RE
+.TP
+.B \-V, \-\-version
+Print version information and exit.
+.RS
+.RE
+.TP
+.B \-\-list
+List all available cargo commands.
+.RS
+.RE
+.TP
+.B \-\-explain \f[I]CODE\f[]
+Run \f[C]rustc\ \-\-explain\ CODE\f[].
+.RS
+.RE
+.TP
+.B \-v, \-\-verbose
+Use verbose output.
+.RS
+.RE
+.TP
+.B \-\-color \f[I]WHEN\f[]
+Configure coloring of output.
+.RS
+.RE
+.SH COMMANDS
+.PP
+To get extended information about commands, run \f[I]cargo help
+<command>\f[] or \f[I]man cargo\-command\f[].
+.TP
+.B cargo\-build(1)
+Compile the current project.
+.RS
+.RE
+.TP
+.B cargo\-clean(1)
+Remove the target directory with build output.
+.RS
+.RE
+.TP
+.B cargo\-doc(1)
+Build this project\[aq]s and its dependencies\[aq] documentation.
+.RS
+.RE
+.TP
+.B cargo\-init(1)
+Create a new cargo project in the current directory.
+.RS
+.RE
+.TP
+.B cargo\-install(1)
+Install a Rust binary.
+.RS
+.RE
+.TP
+.B cargo\-new(1)
+Create a new cargo project.
+.RS
+.RE
+.TP
+.B cargo\-run(1)
+Build and execute src/main.rs.
+.RS
+.RE
+.TP
+.B cargo\-test(1)
+Run the tests for the package.
+.RS
+.RE
+.TP
+.B cargo\-bench(1)
+Run the benchmarks for the package.
+.RS
+.RE
+.TP
+.B cargo\-update(1)
+Update dependencies in Cargo.lock.
+.RS
+.RE
+.TP
+.B cargo\-rustc(1)
+Compile the current project, and optionally pass additional rustc
+parameters.
+.RS
+.RE
+.TP
+.B cargo\-package(1)
+Generate a source tarball for the current package.
+.RS
+.RE
+.TP
+.B cargo\-publish(1)
+Package and upload this project to the registry.
+.RS
+.RE
+.TP
+.B cargo\-owner(1)
+Manage the owners of a crate on the registry.
+.RS
+.RE
+.TP
+.B cargo\-uninstall(1)
+Remove a Rust binary.
+.RS
+.RE
+.TP
+.B cargo\-search(1)
+Search registry for crates.
+.RS
+.RE
+.TP
+.B cargo\-help(1)
+Display help for a cargo command.
+.RS
+.RE
+.TP
+.B cargo\-version(1)
+Print cargo\[aq]s version and exit.
+.RS
+.RE
+.SH FILES
+.TP
+.B ~/.cargo
+Directory in which Cargo stores repository data.
+Cargo can be instructed to use a \f[I]\&.cargo\f[] subdirectory in a
+different location by setting the \f[B]CARGO_HOME\f[] environment
+variable.
+.RS
+.RE
+.SH EXAMPLES
+.PP
+Build a local package and all of its dependencies
+.IP
+.nf
+\f[C]
+$\ cargo\ build
+\f[]
+.fi
+.PP
+Build a package with optimizations
+.IP
+.nf
+\f[C]
+$\ cargo\ build\ \-\-release
+\f[]
+.fi
+.PP
+Run tests for a cross\-compiled target
+.IP
+.nf
+\f[C]
+$\ cargo\ test\ \-\-target\ i686\-unknown\-linux\-gnu
+\f[]
+.fi
+.PP
+Create a new project that builds an executable
+.IP
+.nf
+\f[C]
+$\ cargo\ new\ \-\-bin\ foobar
+\f[]
+.fi
+.PP
+Create a project in the current directory
+.IP
+.nf
+\f[C]
+$\ mkdir\ foo\ &&\ cd\ foo
+$\ cargo\ init\ .
+\f[]
+.fi
+.PP
+Learn about a command\[aq]s options and usage
+.IP
+.nf
+\f[C]
+$\ cargo\ help\ clean
+\f[]
+.fi
+.SH SEE ALSO
+.PP
+rustc(1), rustdoc(1)
+.SH BUGS
+.PP
+See <https://github.com/rust-lang/cargo/issues> for issues.
+.SH COPYRIGHT
+.PP
+This work is dual\-licensed under Apache 2.0 and MIT terms.
+See \f[I]COPYRIGHT\f[] file in the cargo source distribution.
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::ChannelChanger;
+use cargotest::support::registry::{self, Package, alt_dl_path};
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn is_feature_gated() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ version = "0.0.1"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").alternative(true).publish();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(101)
+ .with_stderr_contains(" feature `alternative-registries` is required"));
+}
+
+#[test]
+fn depend_on_alt_registry() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ version = "0.0.1"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").alternative(true).publish();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{reg}`
+[DOWNLOADING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url(),
+ reg = registry::alt_registry())));
+
+ assert_that(p.cargo("clean").masquerade_as_nightly_cargo(), execs().with_status(0));
+
+ // Don't download a second time
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+}
+
+#[test]
+fn depend_on_alt_registry_depends_on_same_registry_no_index() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ version = "0.0.1"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").alternative(true).publish();
+ Package::new("bar", "0.0.1").dep("baz", "0.0.1").alternative(true).publish();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{reg}`
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[COMPILING] baz v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url(),
+ reg = registry::alt_registry())));
+}
+
+#[test]
+fn depend_on_alt_registry_depends_on_same_registry() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ version = "0.0.1"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").alternative(true).publish();
+ Package::new("bar", "0.0.1").registry_dep("baz", "0.0.1", registry::alt_registry().as_str()).alternative(true).publish();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{reg}`
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[COMPILING] baz v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url(),
+ reg = registry::alt_registry())));
+}
+
+#[test]
+fn depend_on_alt_registry_depends_on_crates_io() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ version = "0.0.1"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").publish();
+ Package::new("bar", "0.0.1").registry_dep("baz", "0.0.1", registry::registry().as_str()).alternative(true).publish();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{alt_reg}`
+[UPDATING] registry `{reg}`
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[COMPILING] baz v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url(),
+ alt_reg = registry::alt_registry(),
+ reg = registry::registry())));
+}
+
+#[test]
+fn registry_incompatible_with_path() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = ""
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(101)
+ .with_stderr_contains(" dependency (bar) specification is ambiguous. Only one of `path` or `registry` is allowed."));
+}
+
+#[test]
+fn registry_incompatible_with_git() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ git = ""
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(101)
+ .with_stderr_contains(" dependency (bar) specification is ambiguous. Only one of `git` or `registry` is allowed."));
+}
+
+#[test]
+fn cannot_publish_to_crates_io_with_registry_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ [dependencies.bar]
+ version = "0.0.1"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").alternative(true).publish();
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--index").arg(registry::registry().to_string()),
+ execs().with_status(101));
+}
+
+#[test]
+fn publish_with_registry_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ version = "0.0.1"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").alternative(true).publish();
+
+ // Login so that we have the token available
+ assert_that(p.cargo("login").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("TOKEN").arg("-Zunstable-options"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(0));
+}
+
+#[test]
+fn alt_registry_and_crates_io_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ crates_io_dep = "0.0.1"
+
+ [dependencies.alt_reg_dep]
+ version = "0.1.0"
+ registry = "alternative"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("crates_io_dep", "0.0.1").publish();
+ Package::new("alt_reg_dep", "0.1.0").alternative(true).publish();
+
+ assert_that(p.cargo("build").masquerade_as_nightly_cargo(),
+ execs().with_status(0)
+ .with_stderr_contains(&format!("\
+[UPDATING] registry `{}`", registry::alt_registry()))
+ .with_stderr_contains(&format!("\
+[UPDATING] registry `{}`", registry::registry()))
+ .with_stderr_contains("\
+[DOWNLOADING] crates_io_dep v0.0.1 (registry `file://[..]`)")
+ .with_stderr_contains("\
+[DOWNLOADING] alt_reg_dep v0.1.0 (registry `file://[..]`)")
+ .with_stderr_contains("\
+[COMPILING] alt_reg_dep v0.1.0 (registry `file://[..]`)")
+ .with_stderr_contains("\
+[COMPILING] crates_io_dep v0.0.1")
+ .with_stderr_contains(&format!("\
+[COMPILING] foo v0.0.1 ({})", p.url()))
+ .with_stderr_contains("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs"));
+}
+
+#[test]
+fn block_publish_due_to_no_token() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ // Setup the registry by publishing a package
+ Package::new("bar", "0.0.1").alternative(true).publish();
+
+ // Now perform the actual publish
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(101)
+ .with_stderr_contains("error: no upload token found, please run `cargo login`"));
+}
+
+#[test]
+fn publish_to_alt_registry() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ // Setup the registry by publishing a package
+ Package::new("bar", "0.0.1").alternative(true).publish();
+
+ // Login so that we have the token available
+ assert_that(p.cargo("login").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("TOKEN").arg("-Zunstable-options"),
+ execs().with_status(0));
+
+ // Now perform the actual publish
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(0));
+
+ // Ensure that the crate is uploaded
+ assert!(alt_dl_path().join("api/v1/crates/new").exists());
+}
+
+#[test]
+fn publish_with_crates_io_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = ["me"]
+ license = "MIT"
+ description = "foo"
+
+ [dependencies.bar]
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").publish();
+
+ // Login so that we have the token available
+ assert_that(p.cargo("login").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("TOKEN").arg("-Zunstable-options"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs};
+use cargotest::support::registry::Package;
+use hamcrest::assert_that;
+
+#[test]
+fn bad1() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [target]
+ nonexistent-target = "foo"
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v")
+ .arg("--target=nonexistent-target"),
+ execs().with_status(101).with_stderr("\
+[ERROR] expected table for configuration key `target.nonexistent-target`, \
+but found string in [..]config
+"));
+}
+
+#[test]
+fn bad2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [http]
+ proxy = 3.0
+ "#)
+ .build();
+ assert_that(p.cargo("publish").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Couldn't load Cargo configuration
+
+Caused by:
+ failed to load TOML configuration from `[..]config`
+
+Caused by:
+ failed to parse key `http`
+
+Caused by:
+ failed to parse key `proxy`
+
+Caused by:
+ found TOML configuration value of unknown type `float`
+"));
+}
+
+#[test]
+fn bad3() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [http]
+ proxy = true
+ "#)
+ .build();
+ Package::new("foo", "1.0.0").publish();
+
+ assert_that(p.cargo("publish").arg("-v"),
+ execs().with_status(101).with_stderr("\
+error: failed to update registry [..]
+
+Caused by:
+ invalid configuration for key `http.proxy`
+expected a string, but found a boolean for `http.proxy` in [..]config
+"));
+}
+
+#[test]
+fn bad4() {
+ let p = project("foo")
+ .file(".cargo/config", r#"
+ [cargo-new]
+ name = false
+ "#)
+ .build();
+ assert_that(p.cargo("new").arg("-v").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Failed to create project `foo` at `[..]`
+
+Caused by:
+ invalid configuration for key `cargo-new.name`
+expected a string, but found a boolean for `cargo-new.name` in [..]config
+"));
+}
+
+#[test]
+fn bad5() {
+ let p = project("foo")
+ .file(".cargo/config", r#"
+ foo = ""
+ "#)
+ .file("foo/.cargo/config", r#"
+ foo = 2
+ "#)
+ .build();
+ assert_that(p.cargo("new")
+ .arg("-v").arg("foo").cwd(&p.root().join("foo")),
+ execs().with_status(101).with_stderr("\
+[ERROR] Failed to create project `foo` at `[..]`
+
+Caused by:
+ Couldn't load Cargo configuration
+
+Caused by:
+ failed to merge configuration at `[..]`
+
+Caused by:
+ failed to merge key `foo` between files:
+ file 1: [..]foo[..]foo[..]config
+ file 2: [..]foo[..]config
+
+Caused by:
+ expected integer, but found string
+"));
+}
+
+#[test]
+fn bad_cargo_config_jobs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ jobs = -1
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[ERROR] build.jobs must be positive, but found -1 in [..]
+"));
+}
+
+#[test]
+fn default_cargo_config_jobs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ jobs = 1
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn good_cargo_config_jobs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ jobs = 4
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn invalid_global_config() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file(".cargo/config", "4")
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Couldn't load Cargo configuration
+
+Caused by:
+ could not parse TOML configuration in `[..]`
+
+Caused by:
+ could not parse input as TOML
+
+Caused by:
+ expected an equals, found eof at line 1
+"));
+}
+
+#[test]
+fn bad_cargo_lock() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("Cargo.lock", "[[package]]\nfoo = 92")
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse lock file at: [..]Cargo.lock
+
+Caused by:
+ missing field `name` for key `package`
+"));
+}
+
+#[test]
+fn duplicate_packages_in_cargo_lock() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", r#"
+ [[package]]
+ name = "bar"
+ version = "0.0.1"
+ dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ ]
+
+ [[package]]
+ name = "foo"
+ version = "0.1.0"
+ source = "registry+https://github.com/rust-lang/crates.io-index"
+
+ [[package]]
+ name = "foo"
+ version = "0.1.0"
+ source = "registry+https://github.com/rust-lang/crates.io-index"
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--verbose"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse lock file at: [..]
+
+Caused by:
+ package `foo` is specified twice in the lockfile
+"));
+}
+
+#[test]
+fn bad_source_in_cargo_lock() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", r#"
+ [[package]]
+ name = "bar"
+ version = "0.0.1"
+ dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ ]
+
+ [[package]]
+ name = "foo"
+ version = "0.1.0"
+ source = "You shall not parse"
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--verbose"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse lock file at: [..]
+
+Caused by:
+ invalid source `You shall not parse` for key `package.source`
+"));
+}
+
+#[test]
+fn bad_dependency_in_lockfile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", r#"
+ [[package]]
+ name = "foo"
+ version = "0.0.1"
+ dependencies = [
+ "bar 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ ]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--verbose"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse lock file at: [..]
+
+Caused by:
+ package `bar 0.1.0 ([..])` is specified as a dependency, but is missing from the package list
+"));
+}
+
+#[test]
+fn bad_git_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ foo = { git = "file:.." }
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] git repository `file:///`
+[ERROR] failed to load source for a dependency on `foo`
+
+Caused by:
+ Unable to update file:///
+
+Caused by:
+ failed to clone into: [..]
+
+Caused by:
+ [[..]] 'file:///' is not a valid local file URI
+"));
+}
+
+#[test]
+fn bad_crate_type() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [lib]
+ crate-type = ["bad_type", "rlib"]
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr_contains("\
+error: failed to run `rustc` to learn about target-specific information
+"));
+}
+
+#[test]
+fn malformed_override() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [target.x86_64-apple-darwin.freetype]
+ native = {
+ foo: "bar"
+ }
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ could not parse input as TOML
+
+Caused by:
+ expected a table key, found a newline at line 8
+"));
+}
+
+#[test]
+fn duplicate_binary_names() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "qqq"
+ version = "0.1.0"
+ authors = ["A <a@a.a>"]
+
+ [[bin]]
+ name = "e"
+ path = "a.rs"
+
+ [[bin]]
+ name = "e"
+ path = "b.rs"
+ "#)
+ .file("a.rs", r#"fn main() -> () {}"#)
+ .file("b.rs", r#"fn main() -> () {}"#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ found duplicate binary name e, but all binary targets must have a unique name
+"));
+}
+
+#[test]
+fn duplicate_example_names() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "qqq"
+ version = "0.1.0"
+ authors = ["A <a@a.a>"]
+
+ [[example]]
+ name = "ex"
+ path = "examples/ex.rs"
+
+ [[example]]
+ name = "ex"
+ path = "examples/ex2.rs"
+ "#)
+ .file("examples/ex.rs", r#"fn main () -> () {}"#)
+ .file("examples/ex2.rs", r#"fn main () -> () {}"#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--example").arg("ex"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ found duplicate example name ex, but all example targets must have a unique name
+"));
+}
+
+#[test]
+fn duplicate_bench_names() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "qqq"
+ version = "0.1.0"
+ authors = ["A <a@a.a>"]
+
+ [[bench]]
+ name = "ex"
+ path = "benches/ex.rs"
+
+ [[bench]]
+ name = "ex"
+ path = "benches/ex2.rs"
+ "#)
+ .file("benches/ex.rs", r#"fn main () {}"#)
+ .file("benches/ex2.rs", r#"fn main () {}"#)
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ found duplicate bench name ex, but all bench targets must have a unique name
+"));
+}
+
+#[test]
+fn duplicate_deps() {
+ let p = project("foo")
+ .file("shim-bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("shim-bar/src/lib.rs", r#"
+ pub fn a() {}
+ "#)
+ .file("linux-bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("linux-bar/src/lib.rs", r#"
+ pub fn a() {}
+ "#)
+ .file("Cargo.toml", r#"
+ [package]
+ name = "qqq"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = { path = "shim-bar" }
+
+ [target.x86_64-unknown-linux-gnu.dependencies]
+ bar = { path = "linux-bar" }
+ "#)
+ .file("src/main.rs", r#"fn main () {}"#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Dependency 'bar' has different source paths depending on the build target. Each dependency must \
+have a single canonical source path irrespective of build target.
+"));
+}
+
+#[test]
+fn duplicate_deps_diff_sources() {
+ let p = project("foo")
+ .file("shim-bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("shim-bar/src/lib.rs", r#"
+ pub fn a() {}
+ "#)
+ .file("linux-bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("linux-bar/src/lib.rs", r#"
+ pub fn a() {}
+ "#)
+ .file("Cargo.toml", r#"
+ [package]
+ name = "qqq"
+ version = "0.0.1"
+ authors = []
+
+ [target.i686-unknown-linux-gnu.dependencies]
+ bar = { path = "shim-bar" }
+
+ [target.x86_64-unknown-linux-gnu.dependencies]
+ bar = { path = "linux-bar" }
+ "#)
+ .file("src/main.rs", r#"fn main () {}"#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Dependency 'bar' has different source paths depending on the build target. Each dependency must \
+have a single canonical source path irrespective of build target.
+"));
+}
+
+#[test]
+fn unused_keys() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [target.foo]
+ bar = "3"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+warning: unused manifest key: target.foo.bar
+[COMPILING] foo v0.1.0 (file:///[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ bulid = "foo"
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+warning: unused manifest key: project.bulid
+[COMPILING] foo [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+ build = "foo"
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+warning: unused manifest key: lib.build
+[COMPILING] foo [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn empty_dependencies() {
+ let p = project("empty_deps")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "empty_deps"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ foo = {}
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("foo", "0.0.1").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr_contains("\
+warning: dependency (foo) specified without providing a local path, Git repository, or version \
+to use. This will be considered an error in future versions
+"));
+}
+
+#[test]
+fn invalid_toml_historically_allowed_is_warned() {
+ let p = project("empty_deps")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "empty_deps"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file(".cargo/config", r#"
+ [foo] bar = 2
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+warning: TOML file found which contains invalid syntax and will soon not parse
+at `[..]config`.
+
+The TOML spec requires newlines after table definitions (e.g. `[a] b = 1` is
+invalid), but this file has a table header which does not have a newline after
+it. A newline needs to be added and this warning will soon become a hard error
+in the future.
+[COMPILING] empty_deps v0.0.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn ambiguous_git_reference() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.bar]
+ git = "https://127.0.0.1"
+ branch = "master"
+ tag = "some-tag"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_stderr_contains("\
+[WARNING] dependency (bar) specification is ambiguous. \
+Only one of `branch`, `tag` or `rev` is allowed. \
+This will be considered an error in future versions
+"));
+}
+
+#[test]
+fn bad_source_config1() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.foo]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: no source URL specified for `source.foo`, need [..]
+"));
+}
+
+#[test]
+fn bad_source_config2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.crates-io]
+ registry = 'http://example.com'
+ replace-with = 'bar'
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to load source for a dependency on `bar`
+
+Caused by:
+ Unable to update registry `https://[..]`
+
+Caused by:
+ could not find a configured source with the name `bar` \
+ when attempting to lookup `crates-io` (configuration in [..])
+"));
+}
+
+#[test]
+fn bad_source_config3() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.crates-io]
+ registry = 'http://example.com'
+ replace-with = 'crates-io'
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to load source for a dependency on `bar`
+
+Caused by:
+ Unable to update registry `https://[..]`
+
+Caused by:
+ detected a cycle of `replace-with` sources, [..]
+"));
+}
+
+#[test]
+fn bad_source_config4() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.crates-io]
+ registry = 'http://example.com'
+ replace-with = 'bar'
+
+ [source.bar]
+ registry = 'http://example.com'
+ replace-with = 'crates-io'
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to load source for a dependency on `bar`
+
+Caused by:
+ Unable to update registry `https://[..]`
+
+Caused by:
+ detected a cycle of `replace-with` sources, the source `crates-io` is \
+ eventually replaced with itself (configuration in [..])
+"));
+}
+
+#[test]
+fn bad_source_config5() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.crates-io]
+ registry = 'http://example.com'
+ replace-with = 'bar'
+
+ [source.bar]
+ registry = 'not a url'
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: configuration key `source.bar.registry` specified an invalid URL (in [..])
+
+Caused by:
+ invalid url `not a url`: [..]
+"));
+}
+
+#[test]
+fn both_git_and_path_specified() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.bar]
+ git = "https://127.0.0.1"
+ path = "bar"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(foo.cargo("build").arg("-v"),
+ execs().with_stderr_contains("\
+[WARNING] dependency (bar) specification is ambiguous. \
+Only one of `git` or `path` is allowed. \
+This will be considered an error in future versions
+"));
+}
+
+#[test]
+fn bad_source_config6() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.crates-io]
+ registry = 'http://example.com'
+ replace-with = ['not', 'a', 'string']
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: expected a string, but found a array for `source.crates-io.replace-with` in [..]
+"));
+}
+
+#[test]
+fn ignored_git_revision() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ branch = "spam"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(foo.cargo("build").arg("-v"),
+ execs().with_stderr_contains("\
+[WARNING] key `branch` is ignored for dependency (bar). \
+This will be considered an error in future versions"));
+}
+
+#[test]
+fn bad_source_config7() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.foo]
+ registry = 'http://example.com'
+ local-registry = 'file:///another/file'
+ "#)
+ .build();
+
+ Package::new("bar", "0.1.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: more than one source URL specified for `source.foo`
+"));
+}
+
+#[test]
+fn bad_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies]
+ bar = 3
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ invalid type: integer `3`, expected a version string like [..]
+"));
+}
+
+#[test]
+fn bad_debuginfo() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [profile.dev]
+ debug = 'a'
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ invalid type: string \"a\", expected a boolean or an integer for [..]
+"));
+}
+
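+// Note: despite its name, this test exercises an invalid `build` key
+// (an integer where a boolean or a string is expected).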
+#[test]
+fn bad_opt_level() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = 3
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ invalid type: integer `3`, expected a boolean or a string for key [..]
+"));
+}
--- /dev/null
+extern crate hamcrest;
+extern crate cargotest;
+
+use cargotest::support::{project, execs, main_file, basic_bin_manifest};
+use hamcrest::{assert_that};
+
+fn assert_not_a_cargo_toml(command: &str, manifest_path_argument: &str) {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo(command)
+ .arg("--manifest-path").arg(manifest_path_argument)
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(101)
+ .with_stderr("[ERROR] the manifest-path must be a path \
+ to a Cargo.toml file"));
+}
+
+fn assert_cargo_toml_doesnt_exist(command: &str, manifest_path_argument: &str) {
+ let p = project("foo").build();
+ let expected_path = manifest_path_argument
+ .split('/').collect::<Vec<_>>().join("[..]");
+
+ assert_that(p.cargo(command)
+ .arg("--manifest-path").arg(manifest_path_argument)
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(101)
+ .with_stderr(
+ format!("[ERROR] manifest path `{}` does not exist",
+ expected_path)
+ ));
+}
+
+#[test]
+fn bench_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("bench", "foo");
+}
+
+#[test]
+fn bench_dir_plus_file() {
+ assert_not_a_cargo_toml("bench", "foo/bar");
+}
+
+#[test]
+fn bench_dir_plus_path() {
+ assert_not_a_cargo_toml("bench", "foo/bar/baz");
+}
+
+#[test]
+fn bench_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("bench", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn build_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("build", "foo");
+}
+
+#[test]
+fn build_dir_plus_file() {
+ assert_not_a_cargo_toml("bench", "foo/bar");
+}
+
+#[test]
+fn build_dir_plus_path() {
+ assert_not_a_cargo_toml("bench", "foo/bar/baz");
+}
+
+#[test]
+fn build_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("build", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn clean_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("clean", "foo");
+}
+
+#[test]
+fn clean_dir_plus_file() {
+ assert_not_a_cargo_toml("clean", "foo/bar");
+}
+
+#[test]
+fn clean_dir_plus_path() {
+ assert_not_a_cargo_toml("clean", "foo/bar/baz");
+}
+
+#[test]
+fn clean_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("clean", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn doc_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("doc", "foo");
+}
+
+#[test]
+fn doc_dir_plus_file() {
+ assert_not_a_cargo_toml("doc", "foo/bar");
+}
+
+#[test]
+fn doc_dir_plus_path() {
+ assert_not_a_cargo_toml("doc", "foo/bar/baz");
+}
+
+#[test]
+fn doc_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("doc", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn fetch_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("fetch", "foo");
+}
+
+#[test]
+fn fetch_dir_plus_file() {
+ assert_not_a_cargo_toml("fetch", "foo/bar");
+}
+
+#[test]
+fn fetch_dir_plus_path() {
+ assert_not_a_cargo_toml("fetch", "foo/bar/baz");
+}
+
+#[test]
+fn fetch_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("fetch", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn generate_lockfile_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("generate-lockfile", "foo");
+}
+
+#[test]
+fn generate_lockfile_dir_plus_file() {
+ assert_not_a_cargo_toml("generate-lockfile", "foo/bar");
+}
+
+#[test]
+fn generate_lockfile_dir_plus_path() {
+ assert_not_a_cargo_toml("generate-lockfile", "foo/bar/baz");
+}
+
+#[test]
+fn generate_lockfile_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("generate-lockfile", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn package_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("package", "foo");
+}
+
+#[test]
+fn package_dir_plus_file() {
+ assert_not_a_cargo_toml("package", "foo/bar");
+}
+
+#[test]
+fn package_dir_plus_path() {
+ assert_not_a_cargo_toml("package", "foo/bar/baz");
+}
+
+#[test]
+fn package_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("package", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn pkgid_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("pkgid", "foo");
+}
+
+#[test]
+fn pkgid_dir_plus_file() {
+ assert_not_a_cargo_toml("pkgid", "foo/bar");
+}
+
+#[test]
+fn pkgid_dir_plus_path() {
+ assert_not_a_cargo_toml("pkgid", "foo/bar/baz");
+}
+
+#[test]
+fn pkgid_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("pkgid", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn publish_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("publish", "foo");
+}
+
+#[test]
+fn publish_dir_plus_file() {
+ assert_not_a_cargo_toml("publish", "foo/bar");
+}
+
+#[test]
+fn publish_dir_plus_path() {
+ assert_not_a_cargo_toml("publish", "foo/bar/baz");
+}
+
+#[test]
+fn publish_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("publish", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn read_manifest_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("read-manifest", "foo");
+}
+
+#[test]
+fn read_manifest_dir_plus_file() {
+ assert_not_a_cargo_toml("read-manifest", "foo/bar");
+}
+
+#[test]
+fn read_manifest_dir_plus_path() {
+ assert_not_a_cargo_toml("read-manifest", "foo/bar/baz");
+}
+
+#[test]
+fn read_manifest_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("read-manifest", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn run_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("run", "foo");
+}
+
+#[test]
+fn run_dir_plus_file() {
+ assert_not_a_cargo_toml("run", "foo/bar");
+}
+
+#[test]
+fn run_dir_plus_path() {
+ assert_not_a_cargo_toml("run", "foo/bar/baz");
+}
+
+#[test]
+fn run_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("run", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn rustc_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("rustc", "foo");
+}
+
+#[test]
+fn rustc_dir_plus_file() {
+ assert_not_a_cargo_toml("rustc", "foo/bar");
+}
+
+#[test]
+fn rustc_dir_plus_path() {
+ assert_not_a_cargo_toml("rustc", "foo/bar/baz");
+}
+
+#[test]
+fn rustc_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("rustc", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn test_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("test", "foo");
+}
+
+#[test]
+fn test_dir_plus_file() {
+ assert_not_a_cargo_toml("test", "foo/bar");
+}
+
+#[test]
+fn test_dir_plus_path() {
+ assert_not_a_cargo_toml("test", "foo/bar/baz");
+}
+
+#[test]
+fn test_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("test", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn update_dir_containing_cargo_toml() {
+ assert_not_a_cargo_toml("update", "foo");
+}
+
+#[test]
+fn update_dir_plus_file() {
+ assert_not_a_cargo_toml("update", "foo/bar");
+}
+
+#[test]
+fn update_dir_plus_path() {
+ assert_not_a_cargo_toml("update", "foo/bar/baz");
+}
+
+#[test]
+fn update_dir_to_nonexistent_cargo_toml() {
+ assert_cargo_toml_doesnt_exist("update", "foo/bar/baz/Cargo.toml");
+}
+
+#[test]
+fn verify_project_dir_containing_cargo_toml() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("verify-project")
+ .arg("--manifest-path").arg("foo")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(1)
+ .with_stdout("\
+{\"invalid\":\"the manifest-path must be a path to a Cargo.toml file\"}\
+ "));
+}
+
+#[test]
+fn verify_project_dir_plus_file() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("verify-project")
+ .arg("--manifest-path").arg("foo/bar")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(1)
+ .with_stdout("\
+{\"invalid\":\"the manifest-path must be a path to a Cargo.toml file\"}\
+ "));
+}
+
+#[test]
+fn verify_project_dir_plus_path() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("verify-project")
+ .arg("--manifest-path").arg("foo/bar/baz")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(1)
+ .with_stdout("\
+{\"invalid\":\"the manifest-path must be a path to a Cargo.toml file\"}\
+ "));
+}
+
+#[test]
+fn verify_project_dir_to_nonexistent_cargo_toml() {
+ let p = project("foo").build();
+ assert_that(p.cargo("verify-project")
+ .arg("--manifest-path").arg("foo/bar/baz/Cargo.toml")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(1)
+ .with_stdout("\
+{\"invalid\":\"manifest path `foo[..]bar[..]baz[..]Cargo.toml` does not exist\"}\
+ "));
+}
--- /dev/null
+extern crate cargotest;
+extern crate cargo;
+extern crate hamcrest;
+
+use std::str;
+
+use cargo::util::process;
+use cargotest::is_nightly;
+use cargotest::support::paths::CargoPathExt;
+use cargotest::support::{project, execs, basic_bin_manifest, basic_lib_manifest};
+use hamcrest::{assert_that, existing_file};
+
+#[test]
+fn cargo_bench_simple() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+
+ fn hello() -> &'static str {
+ "hello"
+ }
+
+ pub fn main() {
+ println!("{}", hello())
+ }
+
+ #[bench]
+ fn bench_hello(_b: &mut test::Bencher) {
+ assert_eq!(hello(), "hello")
+ }"#)
+ .build();
+
+ assert_that(p.cargo("build"), execs());
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("hello\n"));
+
+ assert_that(p.cargo("bench"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.5.0 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test bench_hello ... bench: [..]"));
+}
+
+#[test]
+fn bench_bench_implicit() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }
+ fn main() { println!("Hello main!"); }"#)
+ .file("tests/other.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run3(_ben: &mut test::Bencher) { }"#)
+ .file("benches/mybench.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run2(_ben: &mut test::Bencher) { }"#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("--benches"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]release[/]deps[/]mybench-[..][EXE]
+", dir = p.url()))
+ .with_stdout_contains("test run2 ... bench: [..]"));
+}
+
+#[test]
+fn bench_bin_implicit() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }
+ fn main() { println!("Hello main!"); }"#)
+ .file("tests/other.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run3(_ben: &mut test::Bencher) { }"#)
+ .file("benches/mybench.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run2(_ben: &mut test::Bencher) { }"#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("--bins"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]
+", dir = p.url()))
+ .with_stdout_contains("test run1 ... bench: [..]"));
+}
+
+#[test]
+fn bench_tarname() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("benches/bin1.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }"#)
+ .file("benches/bin2.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run2(_ben: &mut test::Bencher) { }"#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("--bench").arg("bin2"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]bin2-[..][EXE]
+", dir = p.url()))
+ .with_stdout_contains("test run2 ... bench: [..]"));
+}
+
+#[test]
+fn bench_multiple_targets() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("benches/bin1.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }"#)
+ .file("benches/bin2.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run2(_ben: &mut test::Bencher) { }"#)
+ .file("benches/bin3.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run3(_ben: &mut test::Bencher) { }"#)
+ .build();
+
+ assert_that(p.cargo("bench")
+ .arg("--bench").arg("bin1")
+ .arg("--bench").arg("bin2"),
+ execs()
+ .with_status(0)
+ .with_stdout_contains("test run1 ... bench: [..]")
+ .with_stdout_contains("test run2 ... bench: [..]")
+ .with_stdout_does_not_contain("run3"));
+}
+
+#[test]
+fn cargo_bench_verbose() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+ fn main() {}
+ #[bench] fn bench_hello(_b: &mut test::Bencher) {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("-v").arg("hello"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.5.0 ({url})
+[RUNNING] `rustc [..] src[/]main.rs [..]`
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `[..]target[/]release[/]deps[/]foo-[..][EXE] hello --bench`", url = p.url()))
+ .with_stdout_contains("test bench_hello ... bench: [..]"));
+}
+
+#[test]
+fn many_similar_names() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+ pub fn foo() {}
+ #[bench] fn lib_bench(_b: &mut test::Bencher) {}
+ ")
+ .file("src/main.rs", "
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate foo;
+ #[cfg(test)]
+ extern crate test;
+ fn main() {}
+ #[bench] fn bin_bench(_b: &mut test::Bencher) { foo::foo() }
+ ")
+ .file("benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate foo;
+ extern crate test;
+ #[bench] fn bench_bench(_b: &mut test::Bencher) { foo::foo() }
+ "#)
+ .build();
+
+ let output = p.cargo("bench").exec_with_output().unwrap();
+ let output = str::from_utf8(&output.stdout).unwrap();
+ assert!(output.contains("test bin_bench"), "bin_bench missing\n{}", output);
+ assert!(output.contains("test lib_bench"), "lib_bench missing\n{}", output);
+ assert!(output.contains("test bench_bench"), "bench_bench missing\n{}", output);
+}
+
+#[test]
+fn cargo_bench_failing_test() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+ fn hello() -> &'static str {
+ "hello"
+ }
+
+ pub fn main() {
+ println!("{}", hello())
+ }
+
+ #[bench]
+ fn bench_hello(_b: &mut test::Bencher) {
+ assert_eq!(hello(), "nope")
+ }"#)
+ .build();
+
+ assert_that(p.cargo("build"), execs());
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("hello\n"));
+
+ assert_that(p.cargo("bench"),
+ execs().with_stdout_contains("test bench_hello ... ")
+ .with_stderr_contains(format!("\
+[COMPILING] foo v0.5.0 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]
+thread '[..]' panicked at 'assertion failed: \
+ `(left == right)`[..]", p.url()))
+ .with_stderr_contains("[..]left: `\"hello\"`[..]")
+ .with_stderr_contains("[..]right: `\"nope\"`[..]")
+ .with_stderr_contains("[..]src[/]main.rs:15[..]")
+ .with_status(101));
+}
+
+#[test]
+fn bench_with_lib_dep() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "baz"
+ path = "src/main.rs"
+ "#)
+ .file("src/lib.rs", r#"
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+ ///
+ /// ```rust
+ /// extern crate foo;
+ /// fn main() {
+ /// println!("{}", foo::foo());
+ /// }
+ /// ```
+ ///
+ pub fn foo(){}
+ #[bench] fn lib_bench(_b: &mut test::Bencher) {}
+ "#)
+ .file("src/main.rs", "
+ #![feature(test)]
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ #[cfg(test)]
+ extern crate test;
+
+ fn main() {}
+
+ #[bench]
+ fn bin_bench(_b: &mut test::Bencher) {}
+ ")
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]release[/]deps[/]baz-[..][EXE]", p.url()))
+ .with_stdout_contains("test lib_bench ... bench: [..]")
+ .with_stdout_contains("test bin_bench ... bench: [..]"));
+}
+
+#[test]
+fn bench_with_deep_lib_dep() {
+ if !is_nightly() { return }
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.foo]
+ path = "../foo"
+ "#)
+ .file("src/lib.rs", "
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate foo;
+ #[cfg(test)]
+ extern crate test;
+ #[bench]
+ fn bar_bench(_b: &mut test::Bencher) {
+ foo::foo();
+ }
+ ")
+ .build();
+ let _p2 = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+
+ pub fn foo() {}
+
+ #[bench]
+ fn foo_bench(_b: &mut test::Bencher) {}
+ ")
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ([..])
+[COMPILING] bar v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]bar-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("test bar_bench ... bench: [..]"));
+}
+
+#[test]
+fn external_bench_explicit() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bench]]
+ name = "bench"
+ path = "src/bench.rs"
+ "#)
+ .file("src/lib.rs", r#"
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+ pub fn get_hello() -> &'static str { "Hello" }
+
+ #[bench]
+ fn internal_bench(_b: &mut test::Bencher) {}
+ "#)
+ .file("src/bench.rs", r#"
+ #![feature(test)]
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ extern crate test;
+
+ #[bench]
+ fn external_bench(_b: &mut test::Bencher) {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]release[/]deps[/]bench-[..][EXE]", p.url()))
+ .with_stdout_contains("test internal_bench ... bench: [..]")
+ .with_stdout_contains("test external_bench ... bench: [..]"));
+}
+
+#[test]
+fn external_bench_implicit() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+
+ pub fn get_hello() -> &'static str { "Hello" }
+
+ #[bench]
+ fn internal_bench(_b: &mut test::Bencher) {}
+ "#)
+ .file("benches/external.rs", r#"
+ #![feature(test)]
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ extern crate test;
+
+ #[bench]
+ fn external_bench(_b: &mut test::Bencher) {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]release[/]deps[/]external-[..][EXE]", p.url()))
+ .with_stdout_contains("test internal_bench ... bench: [..]")
+ .with_stdout_contains("test external_bench ... bench: [..]"));
+}
+
+#[test]
+fn dont_run_examples() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r"")
+ .file("examples/dont-run-me-i-will-fail.rs", r#"
+ fn main() { panic!("Examples should not be run by 'cargo test'"); }
+ "#)
+ .build();
+ assert_that(p.cargo("bench"),
+ execs().with_status(0));
+}
+
+#[test]
+fn pass_through_command_line() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+
+ #[bench] fn foo(_b: &mut test::Bencher) {}
+ #[bench] fn bar(_b: &mut test::Bencher) {}
+ ")
+ .build();
+
+ assert_that(p.cargo("bench").arg("bar"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("test bar ... bench: [..]"));
+
+ assert_that(p.cargo("bench").arg("foo"),
+ execs().with_status(0)
+ .with_stderr("[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("test foo ... bench: [..]"));
+}
+
+// Regression test for running `cargo bench` twice with
+// benches in an rlib
+#[test]
+fn cargo_bench_twice() {
+ if !is_nightly() { return }
+
+ let p = project("test_twice")
+ .file("Cargo.toml", &basic_lib_manifest("test_twice"))
+ .file("src/test_twice.rs", r#"
+ #![crate_type = "rlib"]
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+
+ #[bench]
+ fn dummy_bench(b: &mut test::Bencher) { }
+ "#)
+ .build();
+
+ p.cargo("build");
+
+ for _ in 0..2 {
+ assert_that(p.cargo("bench"),
+ execs().with_status(0));
+ }
+}
+
+#[test]
+fn lib_bin_same_name() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/lib.rs", "
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+ #[bench] fn lib_bench(_b: &mut test::Bencher) {}
+ ")
+ .file("src/main.rs", "
+ #![cfg_attr(test, feature(test))]
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ #[cfg(test)]
+ extern crate test;
+
+ #[bench]
+ fn bin_bench(_b: &mut test::Bencher) {}
+ ")
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains_n("test [..] ... bench: [..]", 2));
+}
+
+#[test]
+fn lib_with_standard_name() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+
+ /// ```
+ /// syntax::foo();
+ /// ```
+ pub fn foo() {}
+
+ #[bench]
+ fn foo_bench(_b: &mut test::Bencher) {}
+ ")
+ .file("benches/bench.rs", "
+ #![feature(test)]
+ extern crate syntax;
+ extern crate test;
+
+ #[bench]
+ fn bench(_b: &mut test::Bencher) { syntax::foo() }
+ ")
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] syntax v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]syntax-[..][EXE]
+[RUNNING] target[/]release[/]deps[/]bench-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("test foo_bench ... bench: [..]")
+ .with_stdout_contains("test bench ... bench: [..]"));
+}
+
+#[test]
+fn lib_with_standard_name2() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "syntax"
+ bench = false
+ doctest = false
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("src/main.rs", "
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate syntax;
+ #[cfg(test)]
+ extern crate test;
+
+ fn main() {}
+
+ #[bench]
+ fn bench(_b: &mut test::Bencher) { syntax::foo() }
+ ")
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] syntax v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]syntax-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("test bench ... bench: [..]"));
+}
+
+#[test]
+fn bench_dylib() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ crate_type = ["dylib"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ #![cfg_attr(test, feature(test))]
+ extern crate bar as the_bar;
+ #[cfg(test)]
+ extern crate test;
+
+ pub fn bar() { the_bar::baz(); }
+
+ #[bench]
+ fn foo(_b: &mut test::Bencher) {}
+ "#)
+ .file("benches/bench.rs", r#"
+ #![feature(test)]
+ extern crate foo as the_foo;
+ extern crate test;
+
+ #[bench]
+ fn foo(_b: &mut test::Bencher) { the_foo::bar(); }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ crate_type = ["dylib"]
+ "#)
+ .file("bar/src/lib.rs", "
+ pub fn baz() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("bench").arg("-v"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({dir}/bar)
+[RUNNING] [..] -C opt-level=3 [..]
+[COMPILING] foo v0.0.1 ({dir})
+[RUNNING] [..] -C opt-level=3 [..]
+[RUNNING] [..] -C opt-level=3 [..]
+[RUNNING] [..] -C opt-level=3 [..]
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `[..]target[/]release[/]deps[/]foo-[..][EXE] --bench`
+[RUNNING] `[..]target[/]release[/]deps[/]bench-[..][EXE] --bench`", dir = p.url()))
+ .with_stdout_contains_n("test foo ... bench: [..]", 2));
+
+ p.root().move_into_the_past();
+ assert_that(p.cargo("bench").arg("-v"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[FRESH] bar v0.0.1 ({dir}/bar)
+[FRESH] foo v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `[..]target[/]release[/]deps[/]foo-[..][EXE] --bench`
+[RUNNING] `[..]target[/]release[/]deps[/]bench-[..][EXE] --bench`", dir = p.url()))
+ .with_stdout_contains_n("test foo ... bench: [..]", 2));
+}
+
+#[test]
+fn bench_twice_with_build_cmd() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+ #[bench]
+ fn foo(_b: &mut test::Bencher) {}
+ ")
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("test foo ... bench: [..]"));
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(0)
+ .with_stderr("[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("test foo ... bench: [..]"));
+}
+
+#[test]
+fn bench_with_examples() {
+ if !is_nightly() { return }
+
+ let p = project("testbench")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "testbench"
+ version = "6.6.6"
+ authors = []
+
+ [[example]]
+ name = "teste1"
+
+ [[bench]]
+ name = "testb1"
+ "#)
+ .file("src/lib.rs", r#"
+ #![cfg_attr(test, feature(test))]
+ #[cfg(test)]
+ extern crate test;
+ #[cfg(test)]
+ use test::Bencher;
+
+ pub fn f1() {
+ println!("f1");
+ }
+
+ pub fn f2() {}
+
+ #[bench]
+ fn bench_bench1(_b: &mut Bencher) {
+ f2();
+ }
+ "#)
+ .file("benches/testb1.rs", "
+ #![feature(test)]
+ extern crate testbench;
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_bench2(_b: &mut Bencher) {
+ testbench::f2();
+ }
+ ")
+ .file("examples/teste1.rs", r#"
+ extern crate testbench;
+
+ fn main() {
+ println!("example1");
+ testbench::f1();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("-v"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] testbench v6.6.6 ({url})
+[RUNNING] `rustc [..]`
+[RUNNING] `rustc [..]`
+[RUNNING] `rustc [..]`
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `{dir}[/]target[/]release[/]deps[/]testbench-[..][EXE] --bench`
+[RUNNING] `{dir}[/]target[/]release[/]deps[/]testb1-[..][EXE] --bench`",
+ dir = p.root().display(), url = p.url()))
+ .with_stdout_contains("test bench_bench1 ... bench: [..]")
+ .with_stdout_contains("test bench_bench2 ... bench: [..]"));
+}
+
+#[test]
+fn test_a_bench() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ authors = []
+ version = "0.1.0"
+
+ [lib]
+ name = "foo"
+ test = false
+ doctest = false
+
+ [[bench]]
+ name = "b"
+ test = true
+ "#)
+ .file("src/lib.rs", "")
+ .file("benches/b.rs", r#"
+ #[test]
+ fn foo() {}
+ "#)
+ .build();
+
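+ // The bench target sets `test = true`, so `cargo test` builds and runs it
+ // even though the library itself has `test = false`.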
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]b-[..][EXE]")
+ .with_stdout_contains("test foo ... ok"));
+}
+
+#[test]
+fn test_bench_no_run() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ authors = []
+ version = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("benches/bbaz.rs", r#"
+ #![feature(test)]
+
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_baz(_: &mut Bencher) {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("--no-run"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.1.0 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+"));
+}
+
+#[test]
+fn test_bench_no_fail_fast() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", r#"
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+ fn hello() -> &'static str {
+ "hello"
+ }
+
+ pub fn main() {
+ println!("{}", hello())
+ }
+
+ #[bench]
+ fn bench_hello(_b: &mut test::Bencher) {
+ assert_eq!(hello(), "hello")
+ }
+
+ #[bench]
+ fn bench_nope(_b: &mut test::Bencher) {
+ assert_eq!("nope", hello())
+ }"#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("--no-fail-fast"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("running 2 tests")
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("test bench_hello [..]")
+ .with_stdout_contains("test bench_nope [..]"));
+}
+
+#[test]
+fn test_bench_multiple_packages() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ authors = []
+ version = "0.1.0"
+
+ [dependencies.bar]
+ path = "../bar"
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ authors = []
+ version = "0.1.0"
+
+ [[bench]]
+ name = "bbar"
+ test = true
+ "#)
+ .file("src/lib.rs", "")
+ .file("benches/bbar.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_bar(_b: &mut Bencher) {}
+ "#)
+ .build();
+
+ let _baz = project("baz")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "baz"
+ authors = []
+ version = "0.1.0"
+
+ [[bench]]
+ name = "bbaz"
+ test = true
+ "#)
+ .file("src/lib.rs", "")
+ .file("benches/bbaz.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_baz(_b: &mut Bencher) {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("-p").arg("bar").arg("-p").arg("baz"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]bbaz-[..][EXE]")
+ .with_stdout_contains("test bench_baz ... bench: [..]")
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]bbar-[..][EXE]")
+ .with_stdout_contains("test bench_bar ... bench: [..]"));
+}
+
+#[test]
+fn bench_all_workspace() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [workspace]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_foo(_: &mut Bencher) -> () { () }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .file("bar/benches/bar.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_bar(_: &mut Bencher) -> () { () }
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]bar-[..][EXE]")
+ .with_stdout_contains("test bench_bar ... bench: [..]")
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("test bench_foo ... bench: [..]"));
+}
+
+#[test]
+fn bench_all_exclude() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [workspace]
+ members = ["bar", "baz"]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #![feature(test)]
+ #[cfg(test)]
+ extern crate test;
+
+ #[bench]
+ pub fn bar(b: &mut test::Bencher) {
+ b.iter(|| {});
+ }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ "#)
+ .file("baz/src/lib.rs", r#"
+ #[test]
+ pub fn baz() {
+ break_the_build();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench")
+ .arg("--all")
+ .arg("--exclude")
+ .arg("baz"),
+ execs().with_status(0)
+ .with_stdout_contains("\
+running 1 test
+test bar ... bench: [..] ns/iter (+/- [..])"));
+}
+
+#[test]
+fn bench_all_virtual_manifest() {
+ if !is_nightly() { return }
+
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("foo/benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_foo(_: &mut Bencher) -> () { () }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .file("bar/benches/bar.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_bar(_: &mut Bencher) -> () { () }
+ "#)
+ .build();
+
+ // The order in which foo and bar are built is not guaranteed
+ assert_that(p.cargo("bench")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]bar-[..][EXE]")
+ .with_stdout_contains("test bench_bar ... bench: [..]")
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("test bench_foo ... bench: [..]"));
+}
+
+// https://github.com/rust-lang/cargo/issues/4287
+#[test]
+fn legacy_bench_name() {
+ if !is_nightly() { return }
+
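+ // A `[[bench]]` target without an explicit path is also discovered at the
+ // legacy location `src/bench.rs`; Cargo accepts it but warns that
+ // `bench.path` should be set.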
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [[bench]]
+ name = "bench"
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("src/bench.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ use test::Bencher;
+
+ #[bench]
+ fn bench_foo(_: &mut Bencher) -> () { () }
+ "#)
+ .build();
+
+ assert_that(p.cargo("bench"), execs().with_status(0).with_stderr_contains("\
+[WARNING] path `[..]src[/]bench.rs` was erroneously implicitly accepted for benchmark `bench`,
+please set bench.path in Cargo.toml"));
+}
+
+#[test]
+fn bench_virtual_manifest_all_implied() {
+ if !is_nightly() { return }
+
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("foo/benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ use test::Bencher;
+ #[bench]
+ fn bench_foo(_: &mut Bencher) -> () { () }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .file("bar/benches/bar.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ use test::Bencher;
+ #[bench]
+ fn bench_bar(_: &mut Bencher) -> () { () }
+ "#)
+ .build();
+
+ // The order in which foo and bar are built is not guaranteed
+ assert_that(p.cargo("bench"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]bar-[..][EXE]")
+ .with_stdout_contains("test bench_bar ... bench: [..]")
+ .with_stderr_contains("\
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("test bench_foo ... bench: [..]"));
+}
--- /dev/null
+extern crate bufstream;
+extern crate git2;
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::collections::HashSet;
+use std::io::prelude::*;
+use std::net::TcpListener;
+use std::thread;
+
+use bufstream::BufStream;
+use cargotest::support::paths;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+// Test that HTTP auth is offered from `credential.helper`
+#[test]
+fn http_auth_offered() {
+ let server = TcpListener::bind("127.0.0.1:0").unwrap();
+ let addr = server.local_addr().unwrap();
+
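+ // Collect the request line plus the headers we assert on, ignoring
+ // anything else the client sends.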
+ fn headers(rdr: &mut BufRead) -> HashSet<String> {
+ let valid = ["GET", "Authorization", "Accept", "User-Agent"];
+ rdr.lines().map(|s| s.unwrap())
+ .take_while(|s| s.len() > 2)
+ .map(|s| s.trim().to_string())
+ .filter(|s| {
+ valid.iter().any(|prefix| s.starts_with(*prefix))
+ })
+ .collect()
+ }
+
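+ // Fake HTTP server: answer 401 twice. The first request must arrive with
+ // no credentials; the retry must carry the Basic auth produced by the
+ // credential helper.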
+ let t = thread::spawn(move || {
+ let mut conn = BufStream::new(server.accept().unwrap().0);
+ let req = headers(&mut conn);
+ let user_agent = if cfg!(windows) {
+ "User-Agent: git/1.0 (libgit2 0.26.0)"
+ } else {
+ "User-Agent: git/2.0 (libgit2 0.26.0)"
+ };
+ conn.write_all(b"\
+ HTTP/1.1 401 Unauthorized\r\n\
+ WWW-Authenticate: Basic realm=\"wheee\"\r\n
+ \r\n\
+ ").unwrap();
+ assert_eq!(req, vec![
+ "GET /foo/bar/info/refs?service=git-upload-pack HTTP/1.1",
+ "Accept: */*",
+ user_agent,
+ ].into_iter().map(|s| s.to_string()).collect());
+ drop(conn);
+
+ let mut conn = BufStream::new(server.accept().unwrap().0);
+ let req = headers(&mut conn);
+ conn.write_all(b"\
+ HTTP/1.1 401 Unauthorized\r\n\
+ WWW-Authenticate: Basic realm=\"wheee\"\r\n
+ \r\n\
+ ").unwrap();
+ assert_eq!(req, vec![
+ "GET /foo/bar/info/refs?service=git-upload-pack HTTP/1.1",
+ "Authorization: Basic Zm9vOmJhcg==",
+ "Accept: */*",
+ user_agent,
+ ].into_iter().map(|s| s.to_string()).collect());
+ });
+
+ let script = project("script")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "script"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ println!("username=foo");
+ println!("password=bar");
+ }
+ "#)
+ .build();
+
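+ // Build the helper binary; it is registered below as git's
+ // `credential.helper` so libgit2 runs it when the server asks for auth.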
+ assert_that(script.cargo("build").arg("-v"),
+ execs().with_status(0));
+ let script = script.bin("script");
+
+ let config = paths::home().join(".gitconfig");
+ let mut config = git2::Config::open(&config).unwrap();
+ config.set_str("credential.helper",
+ &script.display().to_string()).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ git = "http://127.0.0.1:{}/foo/bar"
+ "#, addr.port()))
+ .file("src/main.rs", "")
+ .file(".cargo/config","\
+ [net]
+ retry = 0
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr(&format!("\
+[UPDATING] git repository `http://{addr}/foo/bar`
+[ERROR] failed to load source for a dependency on `bar`
+
+Caused by:
+ Unable to update http://{addr}/foo/bar
+
+Caused by:
+ failed to clone into: [..]
+
+Caused by:
+ failed to authenticate when downloading repository
+attempted to find username/password via `credential.helper`, but [..]
+
+To learn more, run the command again with --verbose.
+",
+ addr = addr)));
+
+ t.join().ok().unwrap();
+}
+
+// Boy, sure would be nice to have a TLS implementation in rust!
+#[test]
+fn https_something_happens() {
+ let server = TcpListener::bind("127.0.0.1:0").unwrap();
+ let addr = server.local_addr().unwrap();
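+ // Not a real TLS server: write a few garbage bytes and shut down, which
+ // should surface as a client-side SSL error.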
+ let t = thread::spawn(move || {
+ let mut conn = server.accept().unwrap().0;
+ drop(conn.write(b"1234"));
+ drop(conn.shutdown(std::net::Shutdown::Write));
+ drop(conn.read(&mut [0; 16]));
+ });
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ git = "https://127.0.0.1:{}/foo/bar"
+ "#, addr.port()))
+ .file("src/main.rs", "")
+ .file(".cargo/config","\
+ [net]
+ retry = 0
+ ")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr_contains(&format!("\
+[UPDATING] git repository `https://{addr}/foo/bar`
+", addr = addr))
+ .with_stderr_contains(&format!("\
+Caused by:
+ {errmsg}
+",
+ errmsg = if cfg!(windows) {
+ "[[..]] failed to send request: [..]\n"
+ } else if cfg!(target_os = "macos") {
+ // OSX is difficult to test, as some builds may use
+ // Security.framework while others use OpenSSL. In that case let's
+ // just not verify the error message here.
+ "[..]"
+ } else {
+ "[..] SSL error: [..]"
+ })));
+
+ t.join().ok().unwrap();
+}
+
+// Boy, sure would be nice to have an SSH implementation in rust!
+#[test]
+fn ssh_something_happens() {
+ let server = TcpListener::bind("127.0.0.1:0").unwrap();
+ let addr = server.local_addr().unwrap();
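+ // Accept the connection and immediately drop it, so the SSH handshake
+ // fails before any banner is received.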
+ let t = thread::spawn(move || {
+ drop(server.accept().unwrap());
+ });
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ git = "ssh://127.0.0.1:{}/foo/bar"
+ "#, addr.port()))
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr_contains(&format!("\
+[UPDATING] git repository `ssh://{addr}/foo/bar`
+", addr = addr))
+ .with_stderr_contains("\
+Caused by:
+ [[..]] failed to start SSH session: Failed getting banner
+"));
+ t.join().ok().unwrap();
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{basic_bin_manifest, execs, project, Project};
+use hamcrest::{assert_that};
+
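+// Expected `-v` stderr for compiling only the library target of `foo`.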
+fn verbose_output_for_lib(p: &Project) -> String {
+ format!("\
+[COMPILING] {name} v{version} ({url})
+[RUNNING] `rustc --crate-name {name} src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.root().display(), url = p.url(),
+ name = "foo", version = "0.0.1")
+}
+
+#[test]
+fn build_lib_only() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "foo"
+ version = "0.0.1"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--lib").arg("-v"),
+ execs()
+ .with_status(0)
+ .with_stderr(verbose_output_for_lib(&p)));
+}
+
+#[test]
+fn build_with_no_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--lib"),
+ execs().with_status(101)
+ .with_stderr("[ERROR] no library targets found"));
+}
+
+#[test]
+fn build_with_relative_cargo_home_path() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "foo"
+ version = "0.0.1"
+ authors = ["wycats@example.com"]
+
+ [dependencies]
+
+ "test-dependency" = { path = "src/test_dependency" }
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/test_dependency/src/lib.rs", r#" "#)
+ .file("src/test_dependency/Cargo.toml", r#"
+ [package]
+
+ name = "test-dependency"
+ version = "0.0.1"
+ authors = ["wycats@example.com"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").env("CARGO_HOME", "./cargo_home/"),
+ execs()
+ .with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::File;
+
+use cargotest::sleep_ms;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn rerun_if_env_changes() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rerun-if-env-changed=FOO");
+ }
+ "#)
+ .build();
+
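+ // The build script re-runs whenever FOO is set, changed, or unset;
+ // re-running with the same value should be a fresh build.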
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("build").env("FOO", "bar"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("build").env("FOO", "baz"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("build").env("FOO", "baz"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn rerun_if_env_or_file_changes() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rerun-if-env-changed=FOO");
+ println!("cargo:rerun-if-changed=foo");
+ }
+ "#)
+ .file("foo", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("build").env("FOO", "bar"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("build").env("FOO", "bar"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] [..]
+"));
+ sleep_ms(1000);
+ File::create(p.root().join("foo")).unwrap();
+ assert_that(p.cargo("build").env("FOO", "bar"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] [..]
+"));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+
+use cargotest::{rustc_host, sleep_ms};
+use cargotest::support::{project, execs};
+use cargotest::support::paths::CargoPathExt;
+use cargotest::support::registry::Package;
+use hamcrest::{assert_that, existing_file, existing_dir};
+
+#[test]
+fn custom_build_script_failed() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ std::process::exit(101);
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.5.0 ({url})
+[RUNNING] `rustc --crate-name build_script_build build.rs --crate-type bin [..]`
+[RUNNING] `[..][/]build-script-build`
+[ERROR] failed to run custom build command for `foo v0.5.0 ({url})`
+process didn't exit successfully: `[..][/]build-script-build` (exit code: 101)",
+url = p.url())));
+}
+
+#[test]
+fn custom_build_env_vars() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [features]
+ bar_feat = ["bar/foo"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ build = "build.rs"
+
+ [features]
+ foo = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn hello() {}
+ "#);
+
+ let file_content = format!(r#"
+ use std::env;
+ use std::io::prelude::*;
+ use std::path::Path;
+ use std::fs;
+
+ fn main() {{
+ let _target = env::var("TARGET").unwrap();
+ let _ncpus = env::var("NUM_JOBS").unwrap();
+ let _dir = env::var("CARGO_MANIFEST_DIR").unwrap();
+
+ let opt = env::var("OPT_LEVEL").unwrap();
+ assert_eq!(opt, "0");
+
+ let opt = env::var("PROFILE").unwrap();
+ assert_eq!(opt, "debug");
+
+ let debug = env::var("DEBUG").unwrap();
+ assert_eq!(debug, "true");
+
+ let out = env::var("OUT_DIR").unwrap();
+ assert!(out.starts_with(r"{0}"));
+ assert!(fs::metadata(&out).map(|m| m.is_dir()).unwrap_or(false));
+
+ let _host = env::var("HOST").unwrap();
+
+ let _feat = env::var("CARGO_FEATURE_FOO").unwrap();
+
+ let rustc = env::var("RUSTC").unwrap();
+ assert_eq!(rustc, "rustc");
+
+ let rustdoc = env::var("RUSTDOC").unwrap();
+ assert_eq!(rustdoc, "rustdoc");
+ }}
+ "#,
+ p.root().join("target").join("debug").join("build").display());
+
+ let p = p.file("bar/build.rs", &file_content).build();
+
+ assert_that(p.cargo("build").arg("--features").arg("bar_feat"),
+ execs().with_status(0));
+}
+
+#[test]
+fn custom_build_script_wrong_rustc_flags() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-aaa -bbb");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr_contains(&format!("\
+[ERROR] Only `-l` and `-L` flags are allowed in build script of `foo v0.5.0 ({})`: \
+`-aaa -bbb`",
+p.url())));
+}
+
+/*
+#[test]
+fn custom_build_script_rustc_flags() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.foo]
+ path = "foo"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ build = "build.rs"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ "#)
+ .file("foo/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-l nonexistinglib -L /dummy/path1 -L /dummy/path2");
+ }
+ "#)
+ .build();
+
+ // TODO: TEST FAILS BECAUSE OF WRONG STDOUT (but otherwise, the build works)
+ assert_that(p.cargo("build").arg("--verbose"),
+ execs().with_status(101)
+ .with_stderr(&format!("\
+[COMPILING] bar v0.5.0 ({url})
+[RUNNING] `rustc --crate-name test {dir}{sep}src{sep}lib.rs --crate-type lib -C debuginfo=2 \
+ -C metadata=[..] \
+ -C extra-filename=-[..] \
+ --out-dir {dir}{sep}target \
+ --emit=dep-info,link \
+ -L {dir}{sep}target \
+ -L {dir}{sep}target{sep}deps`
+", sep = path::SEP,
+dir = p.root().display(),
+url = p.url(),
+)));
+}
+*/
+
+#[test]
+fn links_no_build_cmd() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] package `foo v0.5.0 (file://[..])` specifies that it links to `a` but does \
+not have a custom build script
+"));
+}
+
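+// Two packages in the same dependency graph must not declare the same `links` key;
+// the build is rejected with an error naming both packages.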
+#[test]
+fn links_duplicates() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "a"
+ build = "build.rs"
+
+ [dependencies.a-sys]
+ path = "a-sys"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", "")
+ .file("a-sys/Cargo.toml", r#"
+ [project]
+ name = "a-sys"
+ version = "0.5.0"
+ authors = []
+ links = "a"
+ build = "build.rs"
+ "#)
+ .file("a-sys/src/lib.rs", "")
+ .file("a-sys/build.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] Multiple packages link to native library `a`. A native library can be \
+linked only once.
+
+The root-package links to native library `a`.
+
+Package `a-sys v0.5.0 (file://[..])`
+ ... which is depended on by `foo v0.5.0 (file://[..])`
+also links to native library `a`.
+"));
+}
+
+#[test]
+fn links_duplicates_deep_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "a"
+ build = "build.rs"
+
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.a-sys]
+ path = "a-sys"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", "")
+ .file("a/a-sys/Cargo.toml", r#"
+ [project]
+ name = "a-sys"
+ version = "0.5.0"
+ authors = []
+ links = "a"
+ build = "build.rs"
+ "#)
+ .file("a/a-sys/src/lib.rs", "")
+ .file("a/a-sys/build.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] Multiple packages link to native library `a`. A native library can be \
+linked only once.
+
+The root-package links to native library `a`.
+
+Package `a-sys v0.5.0 (file://[..])`
+ ... which is depended on by `a v0.5.0 (file://[..])`
+ ... which is depended on by `foo v0.5.0 (file://[..])`
+also links to native library `a`.
+"));
+}
+
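+// A `[target.<triple>.<links-name>]` section in .cargo/config overrides the
+// dependency's build script entirely: the script is never compiled or run (here it
+// isn't even valid Rust), and the metadata and rustc-flags come from the config.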
+#[test]
+fn overrides_and_links() {
+ let target = rustc_host();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+ fn main() {
+ assert_eq!(env::var("DEP_FOO_FOO").ok().expect("FOO missing"),
+ "bar");
+ assert_eq!(env::var("DEP_FOO_BAR").ok().expect("BAR missing"),
+ "baz");
+ }
+ "#)
+ .file(".cargo/config", &format!(r#"
+ [target.{}.foo]
+ rustc-flags = "-L foo -L bar"
+ foo = "bar"
+ bar = "baz"
+ "#, target))
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", "not valid rust code")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[..]
+[..]
+[..]
+[..]
+[..]
+[RUNNING] `rustc --crate-name foo [..] -L foo -L bar`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn unused_overrides() {
+ let target = rustc_host();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", "fn main() {}")
+ .file(".cargo/config", &format!(r#"
+ [target.{}.foo]
+ rustc-flags = "-L foo -L bar"
+ foo = "bar"
+ bar = "baz"
+ "#, target))
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn links_passes_env_vars() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+ fn main() {
+ assert_eq!(env::var("DEP_FOO_FOO").unwrap(), "bar");
+ assert_eq!(env::var("DEP_FOO_BAR").unwrap(), "baz");
+ }
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", r#"
+ use std::env;
+ fn main() {
+ let lib = env::var("CARGO_MANIFEST_LINKS").unwrap();
+ assert_eq!(lib, "foo");
+
+ println!("cargo:foo=bar");
+ println!("cargo:bar=baz");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn only_rerun_build_script() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ p.root().move_into_the_past();
+
+ File::create(&p.root().join("some-new-file")).unwrap();
+ p.root().move_into_the_past();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc --crate-name foo [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
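+// Metadata emitted by a dependency's build script is cached, so a dependent build
+// script still sees the DEP_* variables on rebuilds where the dependency is fresh.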
+#[test]
+fn rebuild_continues_to_pass_env_vars() {
+ let a = project("a")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::time::Duration;
+ fn main() {
+ println!("cargo:foo=bar");
+ println!("cargo:bar=baz");
+ std::thread::sleep(Duration::from_millis(500));
+ }
+ "#)
+ .build();
+ a.root().move_into_the_past();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.a]
+ path = '{}'
+ "#, a.root().display()))
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+ fn main() {
+ assert_eq!(env::var("DEP_FOO_FOO").unwrap(), "bar");
+ assert_eq!(env::var("DEP_FOO_BAR").unwrap(), "baz");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ p.root().move_into_the_past();
+
+ File::create(&p.root().join("some-new-file")).unwrap();
+ p.root().move_into_the_past();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn testing_and_such() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ println!("build");
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ p.root().move_into_the_past();
+
+ File::create(&p.root().join("src/lib.rs")).unwrap();
+ p.root().move_into_the_past();
+
+ println!("test");
+ assert_that(p.cargo("test").arg("-vj1"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc --crate-name foo [..]`
+[RUNNING] `rustc --crate-name foo [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..][/]foo-[..][EXE]`
+[DOCTEST] foo
+[RUNNING] `rustdoc --test [..]`")
+ .with_stdout_contains_n("running 0 tests", 2));
+
+ println!("doc");
+ assert_that(p.cargo("doc").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[DOCUMENTING] foo v0.5.0 (file://[..])
+[RUNNING] `rustdoc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ File::create(&p.root().join("src/main.rs")).unwrap()
+ .write_all(b"fn main() {}").unwrap();
+ println!("run");
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo[EXE]`
+"));
+}
+
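+// `-L` flags emitted through `cargo:rustc-flags` by a `links` dependency (or
+// supplied by an override) propagate upward to the rustc invocations of dependents.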
+#[test]
+fn propagation_of_l_flags() {
+ let target = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "bar"
+ build = "build.rs"
+
+ [dependencies.b]
+ path = "../b"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-L bar");
+ }
+ "#)
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("b/build.rs", "bad file")
+ .file(".cargo/config", &format!(r#"
+ [target.{}.foo]
+ rustc-flags = "-L foo"
+ "#, target))
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("-j1"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustc --crate-name a [..] -L bar[..]-L foo[..]`
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `rustc --crate-name foo [..] -L bar -L foo`
+"));
+}
+
+#[test]
+fn propagation_of_l_flags_new() {
+ let target = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "bar"
+ build = "build.rs"
+
+ [dependencies.b]
+ path = "../b"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=bar");
+ }
+ "#)
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("b/build.rs", "bad file")
+ .file(".cargo/config", &format!(r#"
+ [target.{}.foo]
+ rustc-link-search = ["foo"]
+ "#, target))
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("-j1"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustc --crate-name a [..] -L bar[..]-L foo[..]`
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `rustc --crate-name foo [..] -L bar -L foo`
+"));
+}
+
+#[test]
+fn build_deps_simple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ [build-dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate a;
+ fn main() {}
+ ")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] a v0.5.0 (file://[..])
+[RUNNING] `rustc --crate-name a [..]`
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `rustc [..] build.rs [..] --extern a=[..]`
+[RUNNING] `[..][/]foo-[..][/]build-script-build`
+[RUNNING] `rustc --crate-name foo [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
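+// Build-dependencies are linked only into the build script; the library target
+// cannot `extern crate` them, so this build must fail.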
+#[test]
+fn build_deps_not_for_normal() {
+ let target = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ [build-dependencies.aaaaa]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "#[allow(unused_extern_crates)] extern crate aaaaa;")
+ .file("build.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate aaaaa;
+ fn main() {}
+ ")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "aaaaa"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("--target").arg(&target),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[..]can't find crate for `aaaaa`[..]
+")
+ .with_stderr_contains("\
+[ERROR] Could not compile `foo`.
+
+Caused by:
+ process didn't exit successfully: [..]
+"));
+}
+
+#[test]
+fn build_cmd_with_a_build_cmd() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate a;
+ fn main() {}
+ ")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies.b]
+ path = "../b"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", "#[allow(unused_extern_crates)] extern crate b; fn main() {}")
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] b v0.5.0 (file://[..])
+[RUNNING] `rustc --crate-name b [..]`
+[COMPILING] a v0.5.0 (file://[..])
+[RUNNING] `rustc [..] a[/]build.rs [..] --extern b=[..]`
+[RUNNING] `[..][/]a-[..][/]build-script-build`
+[RUNNING] `rustc --crate-name a [..]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..]target[/]debug[/]deps \
+ -L [..]target[/]debug[/]deps`
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `rustc --crate-name build_script_build build.rs --crate-type bin \
+ --emit=dep-info,link \
+ -C debuginfo=2 -C metadata=[..] --out-dir [..] \
+ -L [..]target[/]debug[/]deps \
+ --extern a=[..]liba[..].rlib`
+[RUNNING] `[..][/]foo-[..][/]build-script-build`
+[RUNNING] `rustc --crate-name foo [..]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L [..]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
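+// The contents of OUT_DIR are preserved between builds, so a file created by one
+// run of the build script remains visible to later runs.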
+#[test]
+fn out_dir_is_preserved() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+ use std::fs::File;
+ use std::path::Path;
+ fn main() {
+ let out = env::var("OUT_DIR").unwrap();
+ File::create(Path::new(&out).join("foo")).unwrap();
+ }
+ "#)
+ .build();
+
+ // Make the file
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ p.root().move_into_the_past();
+
+    // Change the build script to assert that the file created above is still there.
+    // (Uses std::fs::File and imports Path, which the original snippet was missing.)
+    File::create(&p.root().join("build.rs")).unwrap().write_all(br#"
+        use std::env;
+        use std::fs::File;
+        use std::path::Path;
+        fn main() {
+            let out = env::var("OUT_DIR").unwrap();
+            File::open(&Path::new(&out).join("foo")).unwrap();
+        }
+    "#).unwrap();
+ p.root().move_into_the_past();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+
+    // Run a fresh build in which the file should be preserved
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+
+ // One last time to make sure it's still there.
+ File::create(&p.root().join("foo")).unwrap();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn output_separate_lines() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-L foo");
+ println!("cargo:rustc-flags=-l static=foo");
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `rustc [..] build.rs [..]`
+[RUNNING] `[..][/]foo-[..][/]build-script-build`
+[RUNNING] `rustc --crate-name foo [..] -L foo -l static=foo`
+[ERROR] could not find native static library [..]
+"));
+}
+
+#[test]
+fn output_separate_lines_new() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=foo");
+ println!("cargo:rustc-link-lib=static=foo");
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[COMPILING] foo v0.5.0 (file://[..])
+[RUNNING] `rustc [..] build.rs [..]`
+[RUNNING] `[..][/]foo-[..][/]build-script-build`
+[RUNNING] `rustc --crate-name foo [..] -L foo -l static=foo`
+[ERROR] could not find native static library [..]
+"));
+}
+
+#[cfg(not(windows))] // FIXME(#867)
+#[test]
+fn code_generation() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", r#"
+ include!(concat!(env!("OUT_DIR"), "/hello.rs"));
+
+ fn main() {
+ println!("{}", message());
+ }
+ "#)
+ .file("build.rs", r#"
+ use std::env;
+ use std::fs::File;
+ use std::io::prelude::*;
+ use std::path::PathBuf;
+
+ fn main() {
+ let dst = PathBuf::from(env::var("OUT_DIR").unwrap());
+ let mut f = File::create(&dst.join("hello.rs")).unwrap();
+ f.write_all(b"
+ pub fn message() -> &'static str {
+ \"Hello, World!\"
+ }
+ ").unwrap();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo`")
+ .with_stdout("\
+Hello, World!
+"));
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+}
+
+#[test]
+fn release_with_build_script() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("--release"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_script_only() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"fn main() {}"#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ no targets specified in the manifest
+ either src/lib.rs, src/main.rs, a [lib] section, or [[bin]] section must be present"));
+}
+
+#[test]
+fn shared_dep_with_a_build_script() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.a]
+ path = "a"
+
+ [build-dependencies.b]
+ path = "b"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", "fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("a/build.rs", "fn main() {}")
+ .file("a/src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies.a]
+ path = "../a"
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn transitive_dep_host() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies.b]
+ path = "b"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", "fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("a/build.rs", "fn main() {}")
+ .file("a/src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+
+ [lib]
+ name = "b"
+ plugin = true
+
+ [dependencies.a]
+ path = "../a"
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn test_a_lib_with_a_build_command() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", r#"
+ include!(concat!(env!("OUT_DIR"), "/foo.rs"));
+
+ /// ```
+ /// foo::bar();
+ /// ```
+ pub fn bar() {
+ assert_eq!(foo(), 1);
+ }
+ "#)
+ .file("build.rs", r#"
+ use std::env;
+ use std::io::prelude::*;
+ use std::fs::File;
+ use std::path::PathBuf;
+
+ fn main() {
+ let out = PathBuf::from(env::var("OUT_DIR").unwrap());
+ File::create(out.join("foo.rs")).unwrap().write_all(b"
+ fn foo() -> i32 { 1 }
+ ").unwrap();
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+}
+
+#[test]
+fn test_dev_dep_build_script() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dev-dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("a/build.rs", "fn main() {}")
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("test"), execs().with_status(0));
+}
+
+#[test]
+fn build_script_with_dynamic_native_dependency() {
+ let _workspace = project("ws")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["builder", "foo"]
+ "#)
+ .build();
+
+ let build = project("ws/builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "builder"
+ crate-type = ["dylib"]
+ plugin = true
+ "#)
+ .file("src/lib.rs", r#"
+ #[no_mangle]
+ pub extern fn foo() {}
+ "#)
+ .build();
+
+ let foo = project("ws/foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies.bar]
+ path = "bar"
+ "#)
+ .file("build.rs", r#"
+ extern crate bar;
+ fn main() { bar::bar() }
+ "#)
+ .file("src/lib.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("bar/build.rs", r#"
+ use std::env;
+ use std::path::PathBuf;
+
+ fn main() {
+ let src = PathBuf::from(env::var("SRC").unwrap());
+ println!("cargo:rustc-link-search=native={}/target/debug/deps",
+ src.display());
+ }
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {
+ #[cfg_attr(not(target_env = "msvc"), link(name = "builder"))]
+ #[cfg_attr(target_env = "msvc", link(name = "builder.dll"))]
+ extern { fn foo(); }
+ unsafe { foo() }
+ }
+ "#)
+ .build();
+
+ assert_that(build.cargo("build").arg("-v")
+ .env("RUST_LOG", "cargo::ops::cargo_rustc"),
+ execs().with_status(0));
+
+ assert_that(foo.cargo("build").arg("-v").env("SRC", build.root())
+ .env("RUST_LOG", "cargo::ops::cargo_rustc"),
+ execs().with_status(0));
+}
+
+#[test]
+fn profile_and_opt_level_set_correctly() {
+ let build = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+
+ fn main() {
+ assert_eq!(env::var("OPT_LEVEL").unwrap(), "3");
+ assert_eq!(env::var("PROFILE").unwrap(), "release");
+ assert_eq!(env::var("DEBUG").unwrap(), "false");
+ }
+ "#)
+ .build();
+ assert_that(build.cargo("bench"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_script_with_lto() {
+ let build = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [profile.dev]
+ lto = true
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {
+ }
+ "#)
+ .build();
+ assert_that(build.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn test_duplicate_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.bar]
+ path = "bar"
+
+ [build-dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() { bar::do_nothing() }
+ "#)
+ .file("build.rs", r#"
+ extern crate bar;
+ fn main() { bar::do_nothing() }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn do_nothing() {}")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
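+// `cargo:rustc-cfg=foo` activates `#[cfg(foo)]` items in the crate being built;
+// without it this crate would have no `main`.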
+#[test]
+fn cfg_feedback() {
+ let build = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", "
+ #[cfg(foo)]
+ fn main() {}
+ ")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=foo");
+ }
+ "#)
+ .build();
+ assert_that(build.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn cfg_override() {
+ let target = rustc_host();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "a"
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", "
+ #[cfg(foo)]
+ fn main() {}
+ ")
+ .file("build.rs", "")
+ .file(".cargo/config", &format!(r#"
+ [target.{}.a]
+ rustc-cfg = ["foo"]
+ "#, target))
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn cfg_test() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=foo");
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ ///
+ /// ```
+ /// extern crate foo;
+ ///
+ /// fn main() {
+ /// foo::foo()
+ /// }
+ /// ```
+ ///
+ #[cfg(foo)]
+ pub fn foo() {}
+
+ #[cfg(foo)]
+ #[test]
+ fn test_foo() {
+ foo()
+ }
+ "#)
+ .file("tests/test.rs", r#"
+ #[cfg(foo)]
+ #[test]
+ fn test_bar() {}
+ "#)
+ .build();
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[RUNNING] [..] build.rs [..]
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] [..] --cfg foo[..]
+[RUNNING] [..] --cfg foo[..]
+[RUNNING] [..] --cfg foo[..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..][/]foo-[..][EXE]`
+[RUNNING] `[..][/]test-[..][EXE]`
+[DOCTEST] foo
+[RUNNING] [..] --cfg foo[..]", dir = p.url()))
+ .with_stdout_contains("test test_foo ... ok")
+ .with_stdout_contains("test test_bar ... ok")
+ .with_stdout_contains_n("test [..] ... ok", 3));
+}
+
+#[test]
+fn cfg_doc() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=foo");
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ #[cfg(foo)]
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("bar/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=bar");
+ }
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[cfg(bar)]
+ pub fn bar() {}
+ "#)
+ .build();
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/foo/fn.foo.html"), existing_file());
+ assert_that(&p.root().join("target/doc/bar/fn.bar.html"), existing_file());
+}
+
+#[test]
+fn cfg_override_test() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ links = "a"
+ "#)
+ .file("build.rs", "")
+ .file(".cargo/config", &format!(r#"
+ [target.{}.a]
+ rustc-cfg = ["foo"]
+ "#, rustc_host()))
+ .file("src/lib.rs", r#"
+ ///
+ /// ```
+ /// extern crate foo;
+ ///
+ /// fn main() {
+ /// foo::foo()
+ /// }
+ /// ```
+ ///
+ #[cfg(foo)]
+ pub fn foo() {}
+
+ #[cfg(foo)]
+ #[test]
+ fn test_foo() {
+ foo()
+ }
+ "#)
+ .file("tests/test.rs", r#"
+ #[cfg(foo)]
+ #[test]
+ fn test_bar() {}
+ "#)
+ .build();
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[RUNNING] `[..]`
+[RUNNING] `[..]`
+[RUNNING] `[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..][/]foo-[..][EXE]`
+[RUNNING] `[..][/]test-[..][EXE]`
+[DOCTEST] foo
+[RUNNING] [..] --cfg foo[..]", dir = p.url()))
+ .with_stdout_contains("test test_foo ... ok")
+ .with_stdout_contains("test test_bar ... ok")
+ .with_stdout_contains_n("test [..] ... ok", 3));
+}
+
+#[test]
+fn cfg_override_doc() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ links = "a"
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file(".cargo/config", &format!(r#"
+ [target.{target}.a]
+ rustc-cfg = ["foo"]
+ [target.{target}.b]
+ rustc-cfg = ["bar"]
+ "#, target = rustc_host()))
+ .file("build.rs", "")
+ .file("src/lib.rs", r#"
+ #[cfg(foo)]
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ links = "b"
+ "#)
+ .file("bar/build.rs", "")
+ .file("bar/src/lib.rs", r#"
+ #[cfg(bar)]
+ pub fn bar() {}
+ "#)
+ .build();
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/foo/fn.foo.html"), existing_file());
+ assert_that(&p.root().join("target/doc/bar/fn.bar.html"), existing_file());
+}
+
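+// `cargo:rustc-env=FOO=foo` makes FOO visible to the compiled crate via `env!`.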
+#[test]
+fn env_build() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", r#"
+ const FOO: &'static str = env!("FOO");
+ fn main() {
+ println!("{}", FOO);
+ }
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-env=FOO=foo");
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ assert_that(p.cargo("run").arg("-v"),
+ execs().with_status(0).with_stdout("foo\n"));
+}
+
+#[test]
+fn env_test() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-env=FOO=foo");
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ pub const FOO: &'static str = env!("FOO");
+ "#)
+ .file("tests/test.rs", r#"
+ extern crate foo;
+
+ #[test]
+ fn test_foo() {
+ assert_eq!("foo", foo::FOO);
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[RUNNING] [..] build.rs [..]
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] [..] --crate-name foo[..]
+[RUNNING] [..] --crate-name foo[..]
+[RUNNING] [..] --crate-name test[..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..][/]foo-[..][EXE]`
+[RUNNING] `[..][/]test-[..][EXE]`
+[DOCTEST] foo
+[RUNNING] [..] --crate-name foo[..]", dir = p.url()))
+ .with_stdout_contains_n("running 0 tests", 2)
+ .with_stdout_contains("test test_foo ... ok"));
+}
+
+#[test]
+fn env_doc() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", r#"
+ const FOO: &'static str = env!("FOO");
+ fn main() {}
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-env=FOO=foo");
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("doc").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn flags_go_into_tests() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ b = { path = "b" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("tests/foo.rs", "")
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ a = { path = "../a" }
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=test");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("-v").arg("--test=foo"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] a v0.5.0 ([..]
+[RUNNING] `rustc [..] a[/]build.rs [..]`
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc [..] a[/]src[/]lib.rs [..] -L test[..]`
+[COMPILING] b v0.5.0 ([..]
+[RUNNING] `rustc [..] b[/]src[/]lib.rs [..] -L test[..]`
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `rustc [..] src[/]lib.rs [..] -L test[..]`
+[RUNNING] `rustc [..] tests[/]foo.rs [..] -L test[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..][/]foo-[..][EXE]`")
+ .with_stdout_contains("running 0 tests"));
+
+ assert_that(p.cargo("test").arg("-v").arg("-pb").arg("--lib"),
+ execs().with_status(0)
+ .with_stderr("\
+[FRESH] a v0.5.0 ([..]
+[COMPILING] b v0.5.0 ([..]
+[RUNNING] `rustc [..] b[/]src[/]lib.rs [..] -L test[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..][/]b-[..][EXE]`")
+ .with_stdout_contains("running 0 tests"));
+}
+
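+// In a diamond dependency graph, flags from the shared dependency's build script
+// must reach each consumer exactly once.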
+#[test]
+fn diamond_passes_args_only_once() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ b = { path = "b" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("tests/foo.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ b = { path = "../b" }
+ c = { path = "../c" }
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ c = { path = "../c" }
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("c/Cargo.toml", r#"
+ [project]
+ name = "c"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("c/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=native=test");
+ }
+ "#)
+ .file("c/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] c v0.5.0 ([..]
+[RUNNING] `rustc [..]`
+[RUNNING] `[..]`
+[RUNNING] `rustc [..]`
+[COMPILING] b v0.5.0 ([..]
+[RUNNING] `rustc [..]`
+[COMPILING] a v0.5.0 ([..]
+[RUNNING] `rustc [..]`
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `[..]rlib -L native=test`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn adding_an_override_invalidates() {
+ let target = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", "")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=native=foo");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `rustc [..]`
+[RUNNING] `[..]`
+[RUNNING] `rustc [..] -L native=foo`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ File::create(p.root().join(".cargo/config")).unwrap().write_all(format!("
+ [target.{}.foo]
+ rustc-link-search = [\"native=bar\"]
+ ", target).as_bytes()).unwrap();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `rustc [..] -L native=bar`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn changing_an_override_invalidates() {
+ let target = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", &format!("
+ [target.{}.foo]
+ rustc-link-search = [\"native=foo\"]
+ ", target))
+ .file("build.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `rustc [..] -L native=foo`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ File::create(p.root().join(".cargo/config")).unwrap().write_all(format!("
+ [target.{}.foo]
+ rustc-link-search = [\"native=bar\"]
+ ", target).as_bytes()).unwrap();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `rustc [..] -L native=bar`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn fresh_builds_possible_with_link_libs() {
+    // The bug is non-deterministic: a fresh build sometimes succeeds even when it is present.
+ let target = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "nativefoo"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", &format!("
+ [target.{}.nativefoo]
+ rustc-link-lib = [\"a\"]
+ rustc-link-search = [\"./b\"]
+ rustc-flags = \"-l z -L ./\"
+ ", target))
+ .file("build.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build")
+ .arg("-v")
+ .env("RUST_LOG", "cargo::ops::cargo_rustc::fingerprint=info"),
+ execs().with_status(0).with_stderr("\
+[FRESH] foo v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn fresh_builds_possible_with_multiple_metadata_overrides() {
+    // The bug is non-deterministic: a fresh build sometimes succeeds even when it is present.
+ let target = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ links = "foo"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", &format!("
+ [target.{}.foo]
+ a = \"\"
+ b = \"\"
+ c = \"\"
+ d = \"\"
+ e = \"\"
+ ", target))
+ .file("build.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.5.0 ([..]
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build")
+ .arg("-v")
+ .env("RUST_LOG", "cargo::ops::cargo_rustc::fingerprint=info"),
+ execs().with_status(0).with_stderr("\
+[FRESH] foo v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn rebuild_only_on_explicit_paths() {
+ let p = project("a")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rerun-if-changed=foo");
+ println!("cargo:rerun-if-changed=bar");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+
+    // the watched files don't exist yet, so the build script should rerun every time
+ println!("run without");
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] a v0.5.0 ([..])
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc [..] src[/]lib.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ sleep_ms(1000);
+ File::create(p.root().join("foo")).unwrap();
+ File::create(p.root().join("bar")).unwrap();
+
+    // now they exist, so rerun once to capture their mtimes; after that the build should be fresh
+ println!("run with");
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] a v0.5.0 ([..])
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc [..] src[/]lib.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ println!("run with2");
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[FRESH] a v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ sleep_ms(1000);
+
+ // random other files do not affect freshness
+ println!("run baz");
+ File::create(p.root().join("baz")).unwrap();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[FRESH] a v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ // but changing dependent files does
+ println!("run foo change");
+ File::create(p.root().join("foo")).unwrap();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] a v0.5.0 ([..])
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc [..] src[/]lib.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+    // ... as does deleting one of the watched files
+ println!("run foo delete");
+ fs::remove_file(p.root().join("bar")).unwrap();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] a v0.5.0 ([..])
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc [..] src[/]lib.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn doctest_receives_build_link_args() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "bar"
+ build = "build.rs"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=native=bar");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustdoc --test [..] --crate-name foo [..]-L native=bar[..]`
+"));
+}
+
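+// Link search paths follow the dependency DAG: a crate's own flags are passed
+// before those of its dependencies (`-L native=foo` before `-L native=bar`).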
+#[test]
+fn please_respect_the_dag() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [dependencies]
+ a = { path = 'a' }
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=native=foo");
+ }
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ links = "bar"
+ build = "build.rs"
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=native=bar");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustc [..] -L native=foo -L native=bar[..]`
+"));
+}
+
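+// Non-UTF-8 bytes on a build script's stdout are tolerated as long as the
+// `cargo:` directives themselves are valid UTF-8.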
+#[test]
+fn non_utf8_output() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"
+ use std::io::prelude::*;
+
+ fn main() {
+ let mut out = std::io::stdout();
+ // print something that's not utf8
+ out.write_all(b"\xff\xff\n").unwrap();
+
+ // now print some cargo metadata that's utf8
+ println!("cargo:rustc-cfg=foo");
+
+ // now print more non-utf8
+ out.write_all(b"\xff\xff\n").unwrap();
+ }
+ "#)
+ .file("src/main.rs", r#"
+ #[cfg(foo)]
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn custom_target_dir() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ target-dir = 'test'
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("a/build.rs", "fn main() {}")
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn panic_abort_with_build_scripts() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [profile.release]
+ panic = 'abort'
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "#[allow(unused_extern_crates)] extern crate a;")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies]
+ b = { path = "../b" }
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/build.rs", "#[allow(unused_extern_crates)] extern crate b; fn main() {}")
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("--release"),
+ execs().with_status(0));
+}
+
+#[test]
+fn warnings_emitted() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:warning=foo");
+ println!("cargo:warning=bar");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[RUNNING] `rustc [..]`
+[RUNNING] `[..]`
+warning: foo
+warning: bar
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
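+// `cargo:warning=` lines from a registry (non-path) dependency's build script
+// are suppressed at the default verbosity level.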
+#[test]
+fn warnings_hidden_for_upstream() {
+ Package::new("bar", "0.1.0")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:warning=foo");
+ println!("cargo:warning=bar");
+ }
+ "#)
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] bar v0.1.0 ([..])
+[COMPILING] bar v0.1.0
+[RUNNING] `rustc [..]`
+[RUNNING] `[..]`
+[RUNNING] `rustc [..]`
+[COMPILING] foo v0.5.0 ([..])
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn warnings_printed_on_vv() {
+ Package::new("bar", "0.1.0")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:warning=foo");
+ println!("cargo:warning=bar");
+ }
+ "#)
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-vv"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] bar v0.1.0 ([..])
+[COMPILING] bar v0.1.0
+[RUNNING] `rustc [..]`
+[RUNNING] `[..]`
+warning: foo
+warning: bar
+[RUNNING] `rustc [..]`
+[COMPILING] foo v0.5.0 ([..])
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn output_shows_on_vv() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::io::prelude::*;
+
+ fn main() {
+ std::io::stderr().write_all(b"stderr\n").unwrap();
+ std::io::stdout().write_all(b"stdout\n").unwrap();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-vv"),
+ execs().with_status(0)
+ .with_stdout("\
+stdout
+")
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[RUNNING] `rustc [..]`
+[RUNNING] `[..]`
+stderr
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn links_with_dots() {
+ let target = rustc_host();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ links = "a.b"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-link-search=bar")
+ }
+ "#)
+ .file(".cargo/config", &format!(r#"
+ [target.{}.'a.b']
+ rustc-link-search = ["foo"]
+ "#, target))
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustc --crate-name foo [..] [..] -L foo[..]`
+"));
+}
+
+#[test]
+fn rustc_and_rustdoc_set_correctly() {
+ let p = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+
+ fn main() {
+ assert_eq!(env::var("RUSTC").unwrap(), "rustc");
+ assert_eq!(env::var("RUSTDOC").unwrap(), "rustdoc");
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("bench"),
+ execs().with_status(0));
+}
+
+#[test]
+fn cfg_env_vars_available() {
+ let p = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+
+ fn main() {
+ let fam = env::var("CARGO_CFG_TARGET_FAMILY").unwrap();
+ if cfg!(unix) {
+ assert_eq!(fam, "unix");
+ } else {
+ assert_eq!(fam, "windows");
+ }
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("bench"),
+ execs().with_status(0));
+}
+
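+// Toggling a feature changes the CARGO_FEATURE_* environment, so the build script
+// must rerun and regenerate the included output file.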
+#[test]
+fn switch_features_rerun() {
+ let p = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [features]
+ foo = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ println!(include_str!(concat!(env!("OUT_DIR"), "/output")));
+ }
+ "#)
+ .file("build.rs", r#"
+ use std::env;
+ use std::fs::File;
+ use std::io::Write;
+ use std::path::Path;
+
+ fn main() {
+ let out_dir = env::var_os("OUT_DIR").unwrap();
+ let out_dir = Path::new(&out_dir).join("output");
+ let mut f = File::create(&out_dir).unwrap();
+
+ if env::var_os("CARGO_FEATURE_FOO").is_some() {
+ f.write_all(b"foo").unwrap();
+ } else {
+ f.write_all(b"bar").unwrap();
+ }
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-v").arg("--features=foo"),
+ execs().with_status(0).with_stdout("foo\n"));
+ assert_that(p.cargo("run").arg("-v"),
+ execs().with_status(0).with_stdout("bar\n"));
+ assert_that(p.cargo("run").arg("-v").arg("--features=foo"),
+ execs().with_status(0).with_stdout("foo\n"));
+}
+
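+// A build.rs in the package root is treated as a build script even when the
+// manifest has no explicit `build` key.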
+#[test]
+fn assume_build_script_when_build_rs_present() {
+ let p = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ if ! cfg!(foo) {
+ panic!("the build script was not run");
+ }
+ }
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=foo");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn if_build_set_to_false_dont_treat_build_rs_as_build_script() {
+ let p = project("builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+ build = false
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ if cfg!(foo) {
+ panic!("the build script was run");
+ }
+ }
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=foo");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn deterministic_rustc_dependency_flags() {
+ // This bug is non-deterministic hence the large number of dependencies
+ // in the hopes it will have a much higher chance of triggering it.
+
+ Package::new("dep1", "0.1.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "dep1"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-L native=test1");
+ }
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+ Package::new("dep2", "0.1.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "dep2"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-L native=test2");
+ }
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+ Package::new("dep3", "0.1.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "dep3"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-L native=test3");
+ }
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+ Package::new("dep4", "0.1.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "dep4"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-flags=-L native=test4");
+ }
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ dep1 = "*"
+ dep2 = "*"
+ dep3 = "*"
+ dep4 = "*"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustc --crate-name foo [..] -L native=test1 -L native=test2 \
+-L native=test3 -L native=test4`
+"));
+}
--- /dev/null
+extern crate cargo;
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+extern crate tempdir;
+
+use std::env;
+use std::fs::{self, File};
+use std::io::prelude::*;
+
+use cargo::util::paths::dylib_path_envvar;
+use cargo::util::{process, ProcessBuilder};
+use cargotest::{is_nightly, rustc_host, sleep_ms};
+use cargotest::support::paths::{CargoPathExt, root};
+use cargotest::support::ProjectBuilder;
+use cargotest::support::{project, execs, main_file, basic_bin_manifest};
+use cargotest::support::registry::Package;
+use hamcrest::{assert_that, existing_file, existing_dir, is_not};
+use tempdir::TempDir;
+
+#[test]
+fn cargo_compile_simple() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("i am foo\n"));
+}
+
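+// A failing build with `--message-format=json` should not print a raw
+// `--- stderr` section.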
+#[test]
+fn cargo_fail_with_no_stderr() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &String::from("refusal"))
+ .build();
+ assert_that(p.cargo("build").arg("--message-format=json"), execs().with_status(101)
+ .with_stderr_does_not_contain("--- stderr"));
+}
+
+/// Check that the `CARGO_INCREMENTAL` environment variable results in
+/// `rustc` getting `-Zincremental` passed to it.
+#[test]
+fn cargo_compile_incremental() {
+ if !is_nightly() {
+ return
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(
+ p.cargo("build").arg("-v").env("CARGO_INCREMENTAL", "1"),
+ execs().with_stderr_contains(
+ "[RUNNING] `rustc [..] -Zincremental=[..][/]target[/]debug[/]incremental`\n")
+ .with_status(0));
+
+ assert_that(
+ p.cargo("test").arg("-v").env("CARGO_INCREMENTAL", "1"),
+ execs().with_stderr_contains(
+ "[RUNNING] `rustc [..] -Zincremental=[..][/]target[/]debug[/]incremental`\n")
+ .with_status(0));
+}
+
+#[test]
+fn cargo_compile_manifest_path() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--manifest-path").arg("foo/Cargo.toml")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+fn cargo_compile_with_invalid_manifest() {
+ let p = project("foo")
+ .file("Cargo.toml", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ no `package` section found.
+"))
+}
+
+#[test]
+fn cargo_compile_with_invalid_manifest2() {
+ let p = project("foo")
+ .file("Cargo.toml", r"
+ [project]
+ foo = bar
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ could not parse input as TOML
+
+Caused by:
+ invalid number at line 3
+"))
+}
+
+#[test]
+fn cargo_compile_with_invalid_manifest3() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/Cargo.toml", "a = bar")
+ .build();
+
+ assert_that(p.cargo("build").arg("--manifest-path")
+ .arg("src/Cargo.toml"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ could not parse input as TOML
+
+Caused by:
+ invalid number at line 1
+"))
+}
+
+#[test]
+fn cargo_compile_duplicate_build_targets() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "main"
+ path = "src/main.rs"
+ crate-type = ["dylib"]
+
+ [dependencies]
+ "#)
+ .file("src/main.rs", r#"
+ #![allow(warnings)]
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(0)
+ .with_stderr("\
+warning: file found to be present in multiple build targets: [..]main.rs
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn cargo_compile_with_invalid_version() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ authors = []
+ version = "1.0"
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Expected dot for key `project.version`
+"))
+
+}
+
+#[test]
+fn cargo_compile_with_invalid_package_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = ""
+ authors = []
+ version = "0.0.0"
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ package name cannot be an empty string.
+"))
+}
+
+#[test]
+fn cargo_compile_with_invalid_bin_target_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [[bin]]
+ name = ""
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ binary target names cannot be empty
+"))
+}
+
+#[test]
+fn cargo_compile_with_forbidden_bin_target_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [[bin]]
+ name = "build"
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ the binary target name `build` is forbidden
+"))
+}
+
+#[test]
+fn cargo_compile_with_invalid_lib_target_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [lib]
+ name = ""
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ library target names cannot be empty
+"))
+}
+
+#[test]
+fn cargo_compile_without_manifest() {
+ let tmpdir = TempDir::new("cargo").unwrap();
+ let p = ProjectBuilder::new("foo", tmpdir.path().to_path_buf()).build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] could not find `Cargo.toml` in `[..]` or any parent directory
+"));
+}
+
+#[test]
+fn cargo_compile_with_invalid_code() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", "invalid rust code!")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_status(101)
+ .with_stderr_contains("\
+[ERROR] Could not compile `foo`.
+
+To learn more, run the command again with --verbose.\n"));
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+}
+
+#[test]
+fn cargo_compile_with_invalid_code_in_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/main.rs", "invalid rust code!")
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", &basic_bin_manifest("bar"))
+ .file("src/lib.rs", "invalid rust code!")
+ .build();
+ let _baz = project("baz")
+ .file("Cargo.toml", &basic_bin_manifest("baz"))
+ .file("src/lib.rs", "invalid rust code!")
+ .build();
+ assert_that(p.cargo("build"), execs().with_status(101));
+}
+
+#[test]
+fn cargo_compile_with_warnings_in_the_root_package() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", "fn main() {} fn dead() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr_contains("\
+[..]function is never used: `dead`[..]
+"));
+}
+
+#[test]
+fn cargo_compile_with_warnings_in_a_dep_package() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+
+ [[bin]]
+
+ name = "foo"
+ "#)
+ .file("src/foo.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "bar"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ pub fn gimme() -> &'static str {
+ "test passed"
+ }
+
+ fn dead() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr_contains("\
+[..]function is never used: `dead`[..]
+"));
+
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(
+ process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("test passed\n"));
+}
+
+#[test]
+fn cargo_compile_with_nested_deps_inferred() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = 'bar'
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/foo.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ extern crate baz;
+
+ pub fn gimme() -> String {
+ baz::gimme()
+ }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+
+ name = "baz"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("baz/src/lib.rs", r#"
+ pub fn gimme() -> String {
+ "test passed".to_string()
+ }
+ "#)
+ .build();
+
+ p.cargo("build")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("libbar.rlib"), is_not(existing_file()));
+ assert_that(&p.bin("libbaz.rlib"), is_not(existing_file()));
+
+ assert_that(
+ process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("test passed\n"));
+}
+
+#[test]
+fn cargo_compile_with_nested_deps_correct_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ extern crate baz;
+
+ pub fn gimme() -> String {
+ baz::gimme()
+ }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+
+ name = "baz"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("baz/src/lib.rs", r#"
+ pub fn gimme() -> String {
+ "test passed".to_string()
+ }
+ "#)
+ .build();
+
+ p.cargo("build")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("libbar.rlib"), is_not(existing_file()));
+ assert_that(&p.bin("libbaz.rlib"), is_not(existing_file()));
+
+ assert_that(
+ process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("test passed\n"));
+}
+
+#[test]
+fn cargo_compile_with_nested_deps_shorthand() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.baz]
+ path = "../baz"
+
+ [lib]
+
+ name = "bar"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ extern crate baz;
+
+ pub fn gimme() -> String {
+ baz::gimme()
+ }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+
+ name = "baz"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "baz"
+ "#)
+ .file("baz/src/baz.rs", r#"
+ pub fn gimme() -> String {
+ "test passed".to_string()
+ }
+ "#)
+ .build();
+
+ p.cargo("build")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("libbar.rlib"), is_not(existing_file()));
+ assert_that(&p.bin("libbaz.rlib"), is_not(existing_file()));
+
+ assert_that(
+ process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("test passed\n"));
+}
+
+#[test]
+fn cargo_compile_with_nested_deps_longhand() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+ version = "0.5.0"
+
+ [[bin]]
+
+ name = "foo"
+ "#)
+ .file("src/foo.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.baz]
+ path = "../baz"
+ version = "0.5.0"
+
+ [lib]
+
+ name = "bar"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ extern crate baz;
+
+ pub fn gimme() -> String {
+ baz::gimme()
+ }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+
+ name = "baz"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "baz"
+ "#)
+ .file("baz/src/baz.rs", r#"
+ pub fn gimme() -> String {
+ "test passed".to_string()
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"), execs());
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("libbar.rlib"), is_not(existing_file()));
+ assert_that(&p.bin("libbaz.rlib"), is_not(existing_file()));
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("test passed\n"));
+}
+
+// Check that Cargo gives a sensible error if a dependency can't be found
+// because of a name mismatch.
+#[test]
+fn cargo_compile_with_dep_name_mismatch() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "foo"
+ version = "0.0.1"
+ authors = ["wycats@example.com"]
+
+ [[bin]]
+
+ name = "foo"
+
+ [dependencies.notquitebar]
+
+ path = "bar"
+ "#)
+ .file("src/bin/foo.rs", &main_file(r#""i am foo""#, &["bar"]))
+ .file("bar/Cargo.toml", &basic_bin_manifest("bar"))
+ .file("bar/src/bar.rs", &main_file(r#""i am bar""#, &[]))
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr(&format!(
+r#"[ERROR] no matching package named `notquitebar` found (required by `foo`)
+location searched: {proj_dir}/bar
+version required: *
+"#, proj_dir = p.url())));
+}
+
+#[test]
+fn cargo_compile_with_filename() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", r#"
+ extern crate foo;
+ fn main() { println!("hello a.rs"); }
+ "#)
+ .file("examples/a.rs", r#"
+ fn main() { println!("example"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--bin").arg("bin.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no bin target named `bin.rs`"));
+
+ assert_that(p.cargo("build").arg("--bin").arg("a.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no bin target named `a.rs`
+
+Did you mean `a`?"));
+
+ assert_that(p.cargo("build").arg("--example").arg("example.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no example target named `example.rs`"));
+
+ assert_that(p.cargo("build").arg("--example").arg("a.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no example target named `a.rs`
+
+Did you mean `a`?"));
+}
+
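+// Bumping a path dependency's version without running `cargo update` should
+// fail: the locked `= 0.0.1` requirement no longer matches.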
+#[test]
+fn compile_path_dep_then_change_version() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ File::create(&p.root().join("bar/Cargo.toml")).unwrap().write_all(br#"
+ [package]
+ name = "bar"
+ version = "0.0.2"
+ authors = []
+ "#).unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no matching version `= 0.0.1` found for package `bar` (required by `foo`)
+location searched: [..]
+versions found: 0.0.2
+consider running `cargo update` to update a path dependency's locked version
+"));
+}
+
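+// Rewriting the lockfile with CRLF line endings should not break the next
+// build.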
+#[test]
+fn ignores_carriage_return_in_lockfile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", r#"
+ mod a; fn main() {}
+ "#)
+ .file("src/a.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ let lockfile = p.root().join("Cargo.lock");
+ let mut lock = String::new();
+ File::open(&lockfile).unwrap().read_to_string(&mut lock).unwrap();
+ let lock = lock.replace("\n", "\r\n");
+ File::create(&lockfile).unwrap().write_all(lock.as_bytes()).unwrap();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn cargo_default_env_metadata_env_var() {
+ // Ensure that path dep + dylib + env var gets metadata
+ // (even though path dep + dylib alone should not).
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", "// hi")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ crate_type = ["dylib"]
+ "#)
+ .file("bar/src/lib.rs", "// hello")
+ .build();
+
+ // No metadata on libbar since it's a dylib path dependency
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({url}/bar)
+[RUNNING] `rustc --crate-name bar bar[/]src[/]lib.rs --crate-type dylib \
+ --emit=dep-info,link \
+ -C prefer-dynamic -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ -C extra-filename=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps \
+ --extern bar={dir}[/]target[/]debug[/]deps[/]{prefix}bar{suffix}`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]",
+dir = p.root().display(),
+url = p.url(),
+prefix = env::consts::DLL_PREFIX,
+suffix = env::consts::DLL_SUFFIX,
+)));
+
+ assert_that(p.cargo("clean"), execs().with_status(0));
+
+ // If you set the env-var, then we expect metadata on libbar
+ assert_that(p.cargo("build").arg("-v").env("__CARGO_DEFAULT_LIB_METADATA", "stable"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({url}/bar)
+[RUNNING] `rustc --crate-name bar bar[/]src[/]lib.rs --crate-type dylib \
+ --emit=dep-info,link \
+ -C prefer-dynamic -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ -C extra-filename=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps \
+ --extern bar={dir}[/]target[/]debug[/]deps[/]{prefix}bar-[..]{suffix}`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url(),
+prefix = env::consts::DLL_PREFIX,
+suffix = env::consts::DLL_SUFFIX,
+)));
+}
+
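+// Package metadata from the manifest should be exposed to the crate at
+// compile time via the `CARGO_PKG_*` and `CARGO_MANIFEST_DIR` env vars.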
+#[test]
+fn crate_env_vars() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.1-alpha.1"
+ description = "This is foo"
+ homepage = "http://example.com"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/main.rs", r#"
+ extern crate foo;
+
+
+ static VERSION_MAJOR: &'static str = env!("CARGO_PKG_VERSION_MAJOR");
+ static VERSION_MINOR: &'static str = env!("CARGO_PKG_VERSION_MINOR");
+ static VERSION_PATCH: &'static str = env!("CARGO_PKG_VERSION_PATCH");
+ static VERSION_PRE: &'static str = env!("CARGO_PKG_VERSION_PRE");
+ static VERSION: &'static str = env!("CARGO_PKG_VERSION");
+ static CARGO_MANIFEST_DIR: &'static str = env!("CARGO_MANIFEST_DIR");
+ static PKG_NAME: &'static str = env!("CARGO_PKG_NAME");
+ static HOMEPAGE: &'static str = env!("CARGO_PKG_HOMEPAGE");
+ static DESCRIPTION: &'static str = env!("CARGO_PKG_DESCRIPTION");
+
+ fn main() {
+ let s = format!("{}-{}-{} @ {} in {}", VERSION_MAJOR,
+ VERSION_MINOR, VERSION_PATCH, VERSION_PRE,
+ CARGO_MANIFEST_DIR);
+ assert_eq!(s, foo::version());
+ println!("{}", s);
+ assert_eq!("foo", PKG_NAME);
+ assert_eq!("http://example.com", HOMEPAGE);
+ assert_eq!("This is foo", DESCRIPTION);
+ let s = format!("{}.{}.{}-{}", VERSION_MAJOR,
+ VERSION_MINOR, VERSION_PATCH, VERSION_PRE);
+ assert_eq!(s, VERSION);
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn version() -> String {
+ format!("{}-{}-{} @ {} in {}",
+ env!("CARGO_PKG_VERSION_MAJOR"),
+ env!("CARGO_PKG_VERSION_MINOR"),
+ env!("CARGO_PKG_VERSION_PATCH"),
+ env!("CARGO_PKG_VERSION_PRE"),
+ env!("CARGO_MANIFEST_DIR"))
+ }
+ "#)
+ .build();
+
+ println!("build");
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0));
+
+ println!("bin");
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout(&format!("0-5-1 @ alpha.1 in {}\n",
+ p.root().display())));
+
+ println!("test");
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
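+// Multiple authors should be joined with `:` in `CARGO_PKG_AUTHORS`.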
+#[test]
+fn crate_authors_env_vars() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.1-alpha.1"
+ authors = ["wycats@example.com", "neikos@example.com"]
+ "#)
+ .file("src/main.rs", r#"
+ extern crate foo;
+
+ static AUTHORS: &'static str = env!("CARGO_PKG_AUTHORS");
+
+ fn main() {
+ let s = "wycats@example.com:neikos@example.com";
+ assert_eq!(AUTHORS, foo::authors());
+ println!("{}", AUTHORS);
+ assert_eq!(s, AUTHORS);
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn authors() -> String {
+ format!("{}", env!("CARGO_PKG_AUTHORS"))
+ }
+ "#)
+ .build();
+
+ println!("build");
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0));
+
+ println!("bin");
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("wycats@example.com:neikos@example.com"));
+
+ println!("test");
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
+// The tester's environment may already contain LD_LIBRARY_PATH entries with
+// empty components (e.g. `::/foo/bar`), which would lead to a false positive,
+// so strip them out first.
+fn setenv_for_removing_empty_component(mut p: ProcessBuilder) -> ProcessBuilder {
+ let v = dylib_path_envvar();
+ if let Ok(search_path) = env::var(v) {
+ let new_search_path =
+ env::join_paths(env::split_paths(&search_path).filter(|e| !e.as_os_str().is_empty()))
+ .expect("join_paths");
+ p.env(v, new_search_path); // build_command() will override LD_LIBRARY_PATH accordingly
+ }
+ p
+}
+
+// Regression test for #4277
+#[test]
+fn crate_library_path_env_var() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", &format!(r##"
+ fn main() {{
+ let search_path = env!("{}");
+ let paths = std::env::split_paths(&search_path).collect::<Vec<_>>();
+ assert!(!paths.contains(&"".into()));
+ }}
+ "##, dylib_path_envvar()))
+ .build();
+
+ assert_that(setenv_for_removing_empty_component(p.cargo("run")),
+ execs().with_status(0));
+}
+
+// Regression test for #4277
+#[test]
+fn build_with_fake_libc_not_loading() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .file("libc.so.6", r#""#)
+ .build();
+
+ assert_that(setenv_for_removing_empty_component(p.cargo("build")),
+ execs().with_status(0));
+}
+
+// this is testing that src/<pkg-name>.rs still works (for now)
+#[test]
+fn many_crate_types_old_style_lib_location() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "foo"
+ crate_type = ["rlib", "dylib"]
+ "#)
+ .file("src/foo.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+ assert_that(p.cargo("build"), execs().with_status(0).with_stderr_contains("\
+[WARNING] path `[..]src[/]foo.rs` was erroneously implicitly accepted for library `foo`,
+please rename the file to `src/lib.rs` or set lib.path in Cargo.toml"));
+
+ assert_that(&p.root().join("target/debug/libfoo.rlib"), existing_file());
+ let fname = format!("{}foo{}", env::consts::DLL_PREFIX,
+ env::consts::DLL_SUFFIX);
+ assert_that(&p.root().join("target/debug").join(&fname), existing_file());
+}
+
+#[test]
+fn many_crate_types_correct() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "foo"
+ crate_type = ["rlib", "dylib"]
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ assert_that(&p.root().join("target/debug/libfoo.rlib"), existing_file());
+ let fname = format!("{}foo{}", env::consts::DLL_PREFIX,
+ env::consts::DLL_SUFFIX);
+ assert_that(&p.root().join("target/debug").join(&fname), existing_file());
+}
+
+#[test]
+fn self_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.test]
+
+ path = "."
+
+ [lib]
+ name = "test"
+ path = "src/test.rs"
+ "#)
+ .file("src/test.rs", "fn main() {}")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] cyclic package dependency: package `test v0.0.0 ([..])` depends on itself
+"));
+}
+
+#[test]
+fn ignore_broken_symlinks() {
+ // Windows and symlinks don't currently get along that well
+ if cfg!(windows) { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .symlink("Notafile", "bar")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("i am foo\n"));
+}
+
+#[test]
+fn missing_lib_and_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]Cargo.toml`
+
+Caused by:
+ no targets specified in the manifest
+ either src/lib.rs, src/main.rs, a [lib] section, or [[bin]] section must be present\n"));
+}
+
+#[test]
+fn lto_build() {
+ // FIXME: currently this hits a linker bug on 32-bit MSVC
+ if cfg!(all(target_env = "msvc", target_pointer_width = "32")) {
+ return
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [profile.release]
+ lto = true
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+ assert_that(p.cargo("build").arg("-v").arg("--release"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]main.rs --crate-type bin \
+ --emit=dep-info,link \
+ -C opt-level=3 \
+ -C lto \
+ -C metadata=[..] \
+ --out-dir {dir}[/]target[/]release[/]deps \
+ -L dependency={dir}[/]target[/]release[/]deps`
+[FINISHED] release [optimized] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url(),
+)));
+}
+
+#[test]
+fn verbose_build() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url(),
+)));
+}
+
+#[test]
+fn verbose_release_build() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v").arg("--release"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C opt-level=3 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]release[/]deps`
+[FINISHED] release [optimized] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url(),
+)));
+}
+
+#[test]
+fn verbose_release_build_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.foo]
+ path = "foo"
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [lib]
+ name = "foo"
+ crate_type = ["dylib", "rlib"]
+ "#)
+ .file("foo/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v").arg("--release"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({url}/foo)
+[RUNNING] `rustc --crate-name foo foo[/]src[/]lib.rs \
+ --crate-type dylib --crate-type rlib \
+ --emit=dep-info,link \
+ -C prefer-dynamic \
+ -C opt-level=3 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]release[/]deps`
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C opt-level=3 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]release[/]deps \
+ --extern foo={dir}[/]target[/]release[/]deps[/]{prefix}foo{suffix} \
+ --extern foo={dir}[/]target[/]release[/]deps[/]libfoo.rlib`
+[FINISHED] release [optimized] target(s) in [..]
+",
+ dir = p.root().display(),
+ url = p.url(),
+ prefix = env::consts::DLL_PREFIX,
+ suffix = env::consts::DLL_SUFFIX)));
+}
+
+#[test]
+fn explicit_examples() {
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "world"
+ version = "1.0.0"
+ authors = []
+
+ [lib]
+ name = "world"
+ path = "src/lib.rs"
+
+ [[example]]
+ name = "hello"
+ path = "examples/ex-hello.rs"
+
+ [[example]]
+ name = "goodbye"
+ path = "examples/ex-goodbye.rs"
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn get_hello() -> &'static str { "Hello" }
+ pub fn get_goodbye() -> &'static str { "Goodbye" }
+ pub fn get_world() -> &'static str { "World" }
+ "#)
+ .file("examples/ex-hello.rs", r#"
+ extern crate world;
+ fn main() { println!("{}, {}!", world::get_hello(), world::get_world()); }
+ "#)
+ .file("examples/ex-goodbye.rs", r#"
+ extern crate world;
+ fn main() { println!("{}, {}!", world::get_goodbye(), world::get_world()); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"), execs().with_status(0));
+ assert_that(process(&p.bin("examples/hello")),
+ execs().with_status(0).with_stdout("Hello, World!\n"));
+ assert_that(process(&p.bin("examples/goodbye")),
+ execs().with_status(0).with_stdout("Goodbye, World!\n"));
+}
+
+#[test]
+fn non_existing_example() {
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "world"
+ version = "1.0.0"
+ authors = []
+
+ [lib]
+ name = "world"
+ path = "src/lib.rs"
+
+ [[example]]
+ name = "hello"
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ehlo.rs", "")
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"), execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ can't find `hello` example, specify example.path"));
+}
+
+#[test]
+fn non_existing_binary() {
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "world"
+ version = "1.0.0"
+ authors = []
+
+ [[bin]]
+ name = "hello"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/ehlo.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ can't find `hello` bin, specify bin.path"));
+}
+
+#[test]
+fn legacy_binary_paths_warnings() {
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "1.0.0"
+ authors = []
+
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0).with_stderr_contains("\
+[WARNING] path `[..]src[/]main.rs` was erroneously implicitly accepted for binary `bar`,
+please set bin.path in Cargo.toml"));
+
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "1.0.0"
+ authors = []
+
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0).with_stderr_contains("\
+[WARNING] path `[..]src[/]bin[/]main.rs` was erroneously implicitly accepted for binary `bar`,
+please set bin.path in Cargo.toml"));
+
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "1.0.0"
+ authors = []
+
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/bar.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0).with_stderr_contains("\
+[WARNING] path `[..]src[/]bar.rs` was erroneously implicitly accepted for binary `bar`,
+please set bin.path in Cargo.toml"));
+}
+
+#[test]
+fn implicit_examples() {
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "world"
+ version = "1.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn get_hello() -> &'static str { "Hello" }
+ pub fn get_goodbye() -> &'static str { "Goodbye" }
+ pub fn get_world() -> &'static str { "World" }
+ "#)
+ .file("examples/hello.rs", r#"
+ extern crate world;
+ fn main() {
+ println!("{}, {}!", world::get_hello(), world::get_world());
+ }
+ "#)
+ .file("examples/goodbye.rs", r#"
+ extern crate world;
+ fn main() {
+ println!("{}, {}!", world::get_goodbye(), world::get_world());
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"), execs().with_status(0));
+ assert_that(process(&p.bin("examples/hello")),
+ execs().with_status(0).with_stdout("Hello, World!\n"));
+ assert_that(process(&p.bin("examples/goodbye")),
+ execs().with_status(0).with_stdout("Goodbye, World!\n"));
+}
+
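+// `debug_assertions` should be on in debug builds ("slow") and off in
+// release builds ("fast", see `release_build_ndebug` below).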
+#[test]
+fn standard_build_no_ndebug() {
+ let p = project("world")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", r#"
+ fn main() {
+ if cfg!(debug_assertions) {
+ println!("slow")
+ } else {
+ println!("fast")
+ }
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("slow\n"));
+}
+
+#[test]
+fn release_build_ndebug() {
+ let p = project("world")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", r#"
+ fn main() {
+ if cfg!(debug_assertions) {
+ println!("slow")
+ } else {
+ println!("fast")
+ }
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--release"),
+ execs().with_status(0));
+ assert_that(process(&p.release_bin("foo")),
+ execs().with_status(0).with_stdout("fast\n"));
+}
+
+#[test]
+fn inferred_main_bin() {
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(process(&p.bin("foo")), execs().with_status(0));
+}
+
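+// Removing the `bar` dependency from the manifest while `src/main.rs` still
+// declares `extern crate bar` should make the next build fail.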
+#[test]
+fn deletion_causes_failure() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ p.change_file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#);
+ assert_that(p.cargo("build"), execs().with_status(101));
+}
+
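+// A malformed `Cargo.toml` inside the `target/` directory must not interfere
+// with the build.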
+#[test]
+fn bad_cargo_toml_in_target_dir() {
+ let p = project("world")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("target/Cargo.toml", "bad-toml")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(process(&p.bin("foo")), execs().with_status(0));
+}
+
+#[test]
+fn lib_with_standard_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("src/main.rs", "
+ extern crate syntax;
+ fn main() { syntax::foo() }
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] syntax v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+}
+
+#[test]
+fn simple_staticlib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ crate-type = ["staticlib"]
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ // The RUST_LOG env var is a regression test for #1381
+ assert_that(p.cargo("build").env("RUST_LOG", "nekoneko=trace"),
+ execs().with_status(0));
+}
+
+#[test]
+fn staticlib_rlib_and_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ crate-type = ["staticlib", "rlib"]
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .file("src/main.rs", r#"
+ extern crate foo;
+
+ fn main() {
+ foo::foo();
+ }"#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0));
+}
+
+#[test]
+fn opt_out_of_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ bin = []
+
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "bad syntax")
+ .build();
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
+#[test]
+fn single_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ path = "src/bar.rs"
+ "#)
+ .file("src/bar.rs", "")
+ .build();
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
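+// Changes to files matched by `exclude` should not count toward freshness,
+// i.e. they should not trigger a rebuild.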
+#[test]
+fn freshness_ignores_excluded() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+ exclude = ["src/b*.rs"]
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ .build();
+ foo.root().move_into_the_past();
+
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo.url())));
+
+ // Smoke test to make sure it doesn't compile again
+ println!("first pass");
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stdout(""));
+
+ // Modify an ignored file and make sure we don't rebuild
+ println!("second pass");
+ File::create(&foo.root().join("src/bar.rs")).unwrap();
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stdout(""));
+}
+
+#[test]
+fn rebuild_preserves_out_dir() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = 'build.rs'
+ "#)
+ .file("build.rs", r#"
+ use std::env;
+ use std::fs::File;
+ use std::path::Path;
+
+ fn main() {
+ let path = Path::new(&env::var("OUT_DIR").unwrap()).join("foo");
+ if env::var_os("FIRST").is_some() {
+ File::create(&path).unwrap();
+ } else {
+ File::create(&path).unwrap();
+ }
+ }
+ "#)
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ .build();
+ foo.root().move_into_the_past();
+
+ assert_that(foo.cargo("build").env("FIRST", "1"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo.url())));
+
+ File::create(&foo.root().join("src/bar.rs")).unwrap();
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo.url())));
+}
+
+#[test]
+fn dep_no_libs() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "")
+ .build();
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn recompile_space_in_name() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [lib]
+ name = "foo"
+ path = "src/my lib.rs"
+ "#)
+ .file("src/my lib.rs", "")
+ .build();
+ assert_that(foo.cargo("build"), execs().with_status(0));
+ foo.root().move_into_the_past();
+ assert_that(foo.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[cfg(unix)]
+#[test]
+fn ignore_bad_directories() {
+ use std::os::unix::prelude::*;
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ let dir = foo.root().join("tmp");
+ fs::create_dir(&dir).unwrap();
+ let stat = fs::metadata(&dir).unwrap();
+ let mut perms = stat.permissions();
+ perms.set_mode(0o644);
+ fs::set_permissions(&dir, perms.clone()).unwrap();
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+ perms.set_mode(0o755);
+ fs::set_permissions(&dir, perms).unwrap();
+}
+
+#[test]
+fn bad_cargo_config() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ this is not valid toml
+ "#)
+ .build();
+ assert_that(foo.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Couldn't load Cargo configuration
+
+Caused by:
+ could not parse TOML configuration in `[..]`
+
+Caused by:
+ could not parse input as TOML
+
+Caused by:
+ expected an equals, found an identifier at line 2
+"));
+}
+
+#[test]
+fn cargo_platform_specific_dependency() {
+ let host = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ build = "build.rs"
+
+ [target.{host}.dependencies]
+ dep = {{ path = "dep" }}
+ [target.{host}.build-dependencies]
+ build = {{ path = "build" }}
+ [target.{host}.dev-dependencies]
+ dev = {{ path = "dev" }}
+ "#, host = host))
+ .file("src/main.rs", r#"
+ extern crate dep;
+ fn main() { dep::dep() }
+ "#)
+ .file("tests/foo.rs", r#"
+ extern crate dev;
+ #[test]
+ fn foo() { dev::dev() }
+ "#)
+ .file("build.rs", r#"
+ extern crate build;
+ fn main() { build::build(); }
+ "#)
+ .file("dep/Cargo.toml", r#"
+ [project]
+ name = "dep"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("dep/src/lib.rs", "pub fn dep() {}")
+ .file("build/Cargo.toml", r#"
+ [project]
+ name = "build"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("build/src/lib.rs", "pub fn build() {}")
+ .file("dev/Cargo.toml", r#"
+ [project]
+ name = "dev"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("dev/src/lib.rs", "pub fn dev() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+}
+
+#[test]
+fn bad_platform_specific_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [target.wrong-target.dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("bar/src/lib.rs", r#"
+ extern crate baz;
+
+ pub fn gimme() -> String {
+ format!("")
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101));
+}
+
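+// A dependency scoped to a non-matching target triple is never compiled (its
+// source here is deliberately invalid Rust), but it still lands in Cargo.lock.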
+#[test]
+fn cargo_platform_specific_dependency_wrong_platform() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [target.non-existing-triplet.dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("bar/src/lib.rs", r#"
+ invalid rust file, should not be compiled
+ "#)
+ .build();
+
+ p.cargo("build").exec_with_output().unwrap();
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0));
+
+ let loc = p.root().join("Cargo.lock");
+ let mut lockfile = String::new();
+ File::open(&loc).unwrap().read_to_string(&mut lockfile).unwrap();
+ assert!(lockfile.contains("bar"))
+}
+
+#[test]
+fn example_as_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[example]]
+ name = "ex"
+ crate-type = ["lib"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=ex"), execs().with_status(0));
+ assert_that(&p.example_lib("ex", "lib"), existing_file());
+}
+
+#[test]
+fn example_as_rlib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[example]]
+ name = "ex"
+ crate-type = ["rlib"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=ex"), execs().with_status(0));
+ assert_that(&p.example_lib("ex", "rlib"), existing_file());
+}
+
+#[test]
+fn example_as_dylib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[example]]
+ name = "ex"
+ crate-type = ["dylib"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=ex"), execs().with_status(0));
+ assert_that(&p.example_lib("ex", "dylib"), existing_file());
+}
+
+#[test]
+fn example_as_proc_macro() {
+ if !is_nightly() {
+ return;
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[example]]
+ name = "ex"
+ crate-type = ["proc-macro"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "#![feature(proc_macro)]")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=ex"), execs().with_status(0));
+ assert_that(&p.example_lib("ex", "proc-macro"), existing_file());
+}
+
+#[test]
+fn example_bin_same_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("examples/foo.rs", "fn main() {}")
+ .build();
+
+ p.cargo("test").arg("--no-run").arg("-v")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("foo"), is_not(existing_file()));
+ // We expect a file of the form bin/foo-{metadata_hash}
+ assert_that(&p.bin("examples/foo"), existing_file());
+
+ p.cargo("test").arg("--no-run").arg("-v")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("foo"), is_not(existing_file()));
+ // We expect a file of the form bin/foo-{metadata_hash}
+ assert_that(&p.bin("examples/foo"), existing_file());
+}
+
+#[test]
+fn compile_then_delete() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("run").arg("-v"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ if cfg!(windows) {
+ // On Windows, unlinking immediately after running often fails, so sleep
+ sleep_ms(100);
+ }
+ fs::remove_file(&p.bin("foo")).unwrap();
+ assert_that(p.cargo("run").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn transitive_dependencies_not_available() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.aaaaa]
+ path = "a"
+ "#)
+ .file("src/main.rs", "extern crate bbbbb; extern crate aaaaa; fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "aaaaa"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bbbbb]
+ path = "../b"
+ "#)
+ .file("a/src/lib.rs", "extern crate bbbbb;")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "bbbbb"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[..] can't find crate for `bbbbb`[..]
+"));
+}
+
+#[test]
+fn cyclic_deps_rejected() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.foo]
+ path = ".."
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] cyclic package dependency: package `a v0.0.1 ([..])` depends on itself
+"));
+}
+
+#[test]
+fn predictable_filenames() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ crate-type = ["dylib", "rlib"]
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/debug/libfoo.rlib"), existing_file());
+ let dylib_name = format!("{}foo{}", env::consts::DLL_PREFIX,
+ env::consts::DLL_SUFFIX);
+ assert_that(&p.root().join("target/debug").join(dylib_name),
+ existing_file());
+}
+
+#[test]
+fn dashes_to_underscores() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo-bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "extern crate foo_bar; fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo-bar"), existing_file());
+}
+
+#[test]
+fn dashes_in_crate_name_bad() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo-bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "extern crate foo_bar; fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101));
+}
+
+#[test]
+fn rustc_env_var() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build")
+ .env("RUSTC", "rustc-that-does-not-exist").arg("-v"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] could not execute process `rustc-that-does-not-exist -vV` ([..])
+
+Caused by:
+[..]
+"));
+ assert_that(&p.bin("a"), is_not(existing_file()));
+}
+
+#[test]
+fn filtering() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("src/bin/b.rs", "fn main() {}")
+ .file("examples/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--lib"),
+ execs().with_status(0));
+ assert_that(&p.bin("a"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").arg("--bin=a").arg("--example=a"),
+ execs().with_status(0));
+ assert_that(&p.bin("a"), existing_file());
+ assert_that(&p.bin("b"), is_not(existing_file()));
+ assert_that(&p.bin("examples/a"), existing_file());
+ assert_that(&p.bin("examples/b"), is_not(existing_file()));
+}
+
+#[test]
+fn filtering_implicit_bins() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("src/bin/b.rs", "fn main() {}")
+ .file("examples/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--bins"),
+ execs().with_status(0));
+ assert_that(&p.bin("a"), existing_file());
+ assert_that(&p.bin("b"), existing_file());
+ assert_that(&p.bin("examples/a"), is_not(existing_file()));
+ assert_that(&p.bin("examples/b"), is_not(existing_file()));
+}
+
+#[test]
+fn filtering_implicit_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("src/bin/b.rs", "fn main() {}")
+ .file("examples/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--examples"),
+ execs().with_status(0));
+ assert_that(&p.bin("a"), is_not(existing_file()));
+ assert_that(&p.bin("b"), is_not(existing_file()));
+ assert_that(&p.bin("examples/a"), existing_file());
+ assert_that(&p.bin("examples/b"), existing_file());
+}
+
+#[test]
+fn ignore_dotfile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/.a.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn ignore_dotdirs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/a.rs", "fn main() {}")
+ .file(".git/Cargo.toml", "")
+ .file(".pc/dummy-fix.patch/Cargo.toml", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn dotdir_root() {
+ let p = ProjectBuilder::new("foo", root().join(".foo"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/a.rs", "fn main() {}")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+
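+// `CARGO_TARGET_DIR` should override the default `target/` and also take
+// precedence over a `build.target-dir` config value.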
+#[test]
+fn custom_target_dir() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ let exe_name = format!("foo{}", env::consts::EXE_SUFFIX);
+
+ assert_that(p.cargo("build").env("CARGO_TARGET_DIR", "foo/target"),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/target/debug").join(&exe_name),
+ existing_file());
+ assert_that(&p.root().join("target/debug").join(&exe_name),
+ is_not(existing_file()));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/target/debug").join(&exe_name),
+ existing_file());
+ assert_that(&p.root().join("target/debug").join(&exe_name),
+ existing_file());
+
+ fs::create_dir(p.root().join(".cargo")).unwrap();
+ File::create(p.root().join(".cargo/config")).unwrap().write_all(br#"
+ [build]
+ target-dir = "foo/target"
+ "#).unwrap();
+ assert_that(p.cargo("build").env("CARGO_TARGET_DIR", "bar/target"),
+ execs().with_status(0));
+ assert_that(&p.root().join("bar/target/debug").join(&exe_name),
+ existing_file());
+ assert_that(&p.root().join("foo/target/debug").join(&exe_name),
+ existing_file());
+ assert_that(&p.root().join("target/debug").join(&exe_name),
+ existing_file());
+}
+
+#[test]
+fn rustc_no_trans() {
+ if !is_nightly() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("rustc").arg("-v").arg("--").arg("-Zno-trans"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_multiple_packages() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.d1]
+ path = "d1"
+ [dependencies.d2]
+ path = "d2"
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "d1"
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d1/src/main.rs", "fn main() { println!(\"d1\"); }")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "d2"
+ doctest = false
+ "#)
+ .file("d2/src/main.rs", "fn main() { println!(\"d2\"); }")
+ .build();
+
+ assert_that(p.cargo("build").arg("-p").arg("d1").arg("-p").arg("d2")
+ .arg("-p").arg("foo"),
+ execs().with_status(0));
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("i am foo\n"));
+
+ let d1_path = &p.build_dir().join("debug")
+ .join(format!("d1{}", env::consts::EXE_SUFFIX));
+ let d2_path = &p.build_dir().join("debug")
+ .join(format!("d2{}", env::consts::EXE_SUFFIX));
+
+ assert_that(d1_path, existing_file());
+ assert_that(process(d1_path), execs().with_status(0).with_stdout("d1"));
+
+ assert_that(d2_path, existing_file());
+ assert_that(process(d2_path),
+ execs().with_status(0).with_stdout("d2"));
+}
+
+#[test]
+fn invalid_spec() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.d1]
+ path = "d1"
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/bin/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "d1"
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d1/src/main.rs", "fn main() { println!(\"d1\"); }")
+ .build();
+
+ assert_that(p.cargo("build").arg("-p").arg("notAValidDep"),
+ execs().with_status(101).with_stderr("\
+[ERROR] package id specification `notAValidDep` matched no packages
+"));
+
+ assert_that(p.cargo("build").arg("-p").arg("d1").arg("-p").arg("notAValidDep"),
+ execs().with_status(101).with_stderr("\
+[ERROR] package id specification `notAValidDep` matched no packages
+"));
+}
+
+#[test]
+fn manifest_with_bom_is_ok() {
+ let p = project("foo")
+ .file("Cargo.toml", "\u{FEFF}
+ [package]
+ name = \"foo\"
+ version = \"0.0.1\"
+ authors = []
+ ")
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn panic_abort_compiles_with_panic_abort() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [profile.dev]
+ panic = 'abort'
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] -C panic=abort [..]"));
+}
+
+#[test]
+fn explicit_color_config_is_propagated_to_rustc() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v").arg("--color").arg("always"),
+ execs().with_status(0).with_stderr_contains(
+ "[..]rustc [..] src[/]lib.rs --color always[..]"));
+
+ assert_that(p.cargo("clean"), execs().with_status(0));
+
+ assert_that(p.cargo("build").arg("-v").arg("--color").arg("never"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] test v0.0.0 ([..])
+[RUNNING] `rustc [..] --color never [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn compiler_json_error_format() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [profile.dev]
+ debug = false # prevent the *.dSYM from affecting the test result
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", "fn main() { let unused = 92; }")
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("bar/src/lib.rs", r#"fn dead() {}"#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v")
+ .arg("--message-format").arg("json"),
+ execs().with_status(0).with_json(r#"
+ {
+ "reason":"compiler-message",
+ "package_id":"bar 0.5.0 ([..])",
+ "target":{
+ "kind":["lib"],
+ "crate_types":["lib"],
+ "name":"bar",
+ "src_path":"[..]lib.rs"
+ },
+ "message":"{...}"
+ }
+
+ {
+ "reason":"compiler-artifact",
+ "profile": {
+ "debug_assertions": true,
+ "debuginfo": null,
+ "opt_level": "0",
+ "overflow_checks": true,
+ "test": false
+ },
+ "features": [],
+ "package_id":"bar 0.5.0 ([..])",
+ "target":{
+ "kind":["lib"],
+ "crate_types":["lib"],
+ "name":"bar",
+ "src_path":"[..]lib.rs"
+ },
+ "filenames":["[..].rlib"],
+ "fresh": false
+ }
+
+ {
+ "reason":"compiler-message",
+ "package_id":"foo 0.5.0 ([..])",
+ "target":{
+ "kind":["bin"],
+ "crate_types":["bin"],
+ "name":"foo",
+ "src_path":"[..]main.rs"
+ },
+ "message":"{...}"
+ }
+
+ {
+ "reason":"compiler-artifact",
+ "package_id":"foo 0.5.0 ([..])",
+ "target":{
+ "kind":["bin"],
+ "crate_types":["bin"],
+ "name":"foo",
+ "src_path":"[..]main.rs"
+ },
+ "profile": {
+ "debug_assertions": true,
+ "debuginfo": null,
+ "opt_level": "0",
+ "overflow_checks": true,
+ "test": false
+ },
+ "features": [],
+ "filenames": ["[..]"],
+ "fresh": false
+ }
+"#));
+
+ // With a fresh build, we should repeat the artifacts,
+ // but omit compiler warnings.
+ assert_that(p.cargo("build").arg("-v")
+ .arg("--message-format").arg("json"),
+ execs().with_status(0).with_json(r#"
+ {
+ "reason":"compiler-artifact",
+ "profile": {
+ "debug_assertions": true,
+ "debuginfo": null,
+ "opt_level": "0",
+ "overflow_checks": true,
+ "test": false
+ },
+ "features": [],
+ "package_id":"bar 0.5.0 ([..])",
+ "target":{
+ "kind":["lib"],
+ "crate_types":["lib"],
+ "name":"bar",
+ "src_path":"[..]lib.rs"
+ },
+ "filenames":["[..].rlib"],
+ "fresh": true
+ }
+
+ {
+ "reason":"compiler-artifact",
+ "package_id":"foo 0.5.0 ([..])",
+ "target":{
+ "kind":["bin"],
+ "crate_types":["bin"],
+ "name":"foo",
+ "src_path":"[..]main.rs"
+ },
+ "profile": {
+ "debug_assertions": true,
+ "debuginfo": null,
+ "opt_level": "0",
+ "overflow_checks": true,
+ "test": false
+ },
+ "features": [],
+ "filenames": ["[..]"],
+ "fresh": true
+ }
+"#));
+}
+
+#[test]
+fn wrong_message_format_option() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--message-format").arg("XML"),
+ execs().with_status(1)
+ .with_stderr_contains(
+r#"[ERROR] Could not match 'xml' with any of the allowed variants: ["Human", "Json"]"#));
+}
+
+#[test]
+fn message_format_json_forward_stderr() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", "fn main() { let unused = 0; }")
+ .build();
+
+ assert_that(p.cargo("rustc").arg("--release").arg("--bin").arg("foo")
+ .arg("--message-format").arg("JSON"),
+ execs().with_status(0)
+ .with_json(r#"
+ {
+ "reason":"compiler-message",
+ "package_id":"foo 0.5.0 ([..])",
+ "target":{
+ "kind":["bin"],
+ "crate_types":["bin"],
+ "name":"foo",
+ "src_path":"[..]"
+ },
+ "message":"{...}"
+ }
+
+ {
+ "reason":"compiler-artifact",
+ "package_id":"foo 0.5.0 ([..])",
+ "target":{
+ "kind":["bin"],
+ "crate_types":["bin"],
+ "name":"foo",
+ "src_path":"[..]"
+ },
+ "profile":{
+ "debug_assertions":false,
+ "debuginfo":null,
+ "opt_level":"3",
+ "overflow_checks": false,
+ "test":false
+ },
+ "features":[],
+ "filenames":["[..]"],
+ "fresh": false
+ }
+"#));
+}
+
+#[test]
+fn no_warn_about_package_metadata() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [package.metadata]
+ foo = "bar"
+ a = true
+ b = 3
+
+ [package.metadata.another]
+ bar = 3
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("[..] foo v0.0.1 ([..])\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n"));
+}
+
+#[test]
+fn cargo_build_empty_target() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--target").arg(""),
+ execs().with_status(101)
+ .with_stderr_contains("[..] target was empty"));
+}
+
+#[test]
+fn build_all_workspace() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [workspace]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr("[..] Compiling bar v0.1.0 ([..])\n\
+ [..] Compiling foo v0.1.0 ([..])\n\
+ [..] Finished dev [unoptimized + debuginfo] target(s) in [..]\n"));
+}
+
+#[test]
+fn build_all_exclude() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [workspace]
+ members = ["bar", "baz"]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ "#)
+ .file("baz/src/lib.rs", r#"
+ pub fn baz() {
+ break_the_build();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--all")
+ .arg("--exclude")
+ .arg("baz"),
+ execs().with_status(0)
+ .with_stderr_contains("[..]Compiling foo v0.1.0 [..]")
+ .with_stderr_contains("[..]Compiling bar v0.1.0 [..]")
+ .with_stderr_does_not_contain("[..]Compiling baz v0.1.0 [..]"));
+}
+
+#[test]
+fn build_all_workspace_implicit_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [workspace]
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("src/bin/b.rs", "fn main() {}")
+ .file("examples/c.rs", "fn main() {}")
+ .file("examples/d.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .file("bar/src/bin/e.rs", "fn main() {}")
+ .file("bar/src/bin/f.rs", "fn main() {}")
+ .file("bar/examples/g.rs", "fn main() {}")
+ .file("bar/examples/h.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--all").arg("--examples"),
+ execs().with_status(0)
+ .with_stderr("[..] Compiling bar v0.1.0 ([..])\n\
+ [..] Compiling foo v0.1.0 ([..])\n\
+ [..] Finished dev [unoptimized + debuginfo] target(s) in [..]\n"));
+ assert_that(&p.bin("a"), is_not(existing_file()));
+ assert_that(&p.bin("b"), is_not(existing_file()));
+ assert_that(&p.bin("examples/c"), existing_file());
+ assert_that(&p.bin("examples/d"), existing_file());
+ assert_that(&p.bin("e"), is_not(existing_file()));
+ assert_that(&p.bin("f"), is_not(existing_file()));
+ assert_that(&p.bin("examples/g"), existing_file());
+ assert_that(&p.bin("examples/h"), existing_file());
+}
+
+#[test]
+fn build_all_virtual_manifest() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ // The order in which foo and bar are built is not guaranteed
+ assert_that(p.cargo("build")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] Compiling bar v0.1.0 ([..])")
+ .with_stderr_contains("[..] Compiling foo v0.1.0 ([..])")
+ .with_stderr("[..] Compiling [..] v0.1.0 ([..])\n\
+ [..] Compiling [..] v0.1.0 ([..])\n\
+ [..] Finished dev [unoptimized + debuginfo] target(s) in [..]\n"));
+}
+
+#[test]
+fn build_virtual_manifest_all_implied() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ // The order in which foo and bar are built is not guaranteed
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] Compiling bar v0.1.0 ([..])")
+ .with_stderr_contains("[..] Compiling foo v0.1.0 ([..])")
+ .with_stderr("[..] Compiling [..] v0.1.0 ([..])\n\
+ [..] Compiling [..] v0.1.0 ([..])\n\
+ [..] Finished dev [unoptimized + debuginfo] target(s) in [..]\n"));
+}
+
+#[test]
+fn build_virtual_manifest_one_project() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("-p").arg("foo"),
+ execs().with_status(0)
+ .with_stderr_does_not_contain("bar")
+ .with_stderr_contains("[..] Compiling foo v0.1.0 ([..])")
+ .with_stderr("[..] Compiling [..] v0.1.0 ([..])\n\
+ [..] Finished dev [unoptimized + debuginfo] target(s) in [..]\n"));
+}
+
+#[test]
+fn build_all_virtual_manifest_implicit_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("foo/src/bin/a.rs", "fn main() {}")
+ .file("foo/src/bin/b.rs", "fn main() {}")
+ .file("foo/examples/c.rs", "fn main() {}")
+ .file("foo/examples/d.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .file("bar/src/bin/e.rs", "fn main() {}")
+ .file("bar/src/bin/f.rs", "fn main() {}")
+ .file("bar/examples/g.rs", "fn main() {}")
+ .file("bar/examples/h.rs", "fn main() {}")
+ .build();
+
+ // The order in which foo and bar are built is not guaranteed
+ assert_that(p.cargo("build")
+ .arg("--all").arg("--examples"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] Compiling bar v0.1.0 ([..])")
+ .with_stderr_contains("[..] Compiling foo v0.1.0 ([..])")
+ .with_stderr("[..] Compiling [..] v0.1.0 ([..])\n\
+ [..] Compiling [..] v0.1.0 ([..])\n\
+ [..] Finished dev [unoptimized + debuginfo] target(s) in [..]\n"));
+ assert_that(&p.bin("a"), is_not(existing_file()));
+ assert_that(&p.bin("b"), is_not(existing_file()));
+ assert_that(&p.bin("examples/c"), existing_file());
+ assert_that(&p.bin("examples/d"), existing_file());
+ assert_that(&p.bin("e"), is_not(existing_file()));
+ assert_that(&p.bin("f"), is_not(existing_file()));
+ assert_that(&p.bin("examples/g"), existing_file());
+ assert_that(&p.bin("examples/h"), existing_file());
+}
+
+#[test]
+fn build_all_member_dependency_same_name() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["a"]
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+
+ [dependencies]
+ a = "0.1.0"
+ "#)
+ .file("a/src/lib.rs", r#"
+ pub fn a() {}
+ "#)
+ .build();
+
+ Package::new("a", "0.1.0").publish();
+
+ assert_that(p.cargo("build")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr("[..] Updating registry `[..]`\n\
+ [..] Downloading a v0.1.0 ([..])\n\
+ [..] Compiling a v0.1.0\n\
+ [..] Compiling a v0.1.0 ([..])\n\
+ [..] Finished dev [unoptimized + debuginfo] target(s) in [..]\n"));
+}
+
+#[test]
+fn run_proper_binary() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ [[bin]]
+ name = "main"
+ [[bin]]
+ name = "other"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/main.rs", r#"
+ fn main() {
+ panic!("This should never be run.");
+ }
+ "#)
+ .file("src/bin/other.rs", r#"
+ fn main() {
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bin").arg("other"),
+ execs().with_status(0));
+}
+
+#[test]
+fn run_proper_binary_main_rs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/main.rs", r#"
+ fn main() {
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bin").arg("foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn run_proper_alias_binary_from_src() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ [[bin]]
+ name = "foo"
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/foo.rs", r#"
+ fn main() {
+ println!("foo");
+ }
+ "#).file("src/bar.rs", r#"
+ fn main() {
+ println!("bar");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--all"),
+ execs().with_status(0)
+ );
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("foo\n"));
+ assert_that(process(&p.bin("bar")),
+ execs().with_status(0).with_stdout("bar\n"));
+}
+
+#[test]
+fn run_proper_alias_binary_main_rs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ [[bin]]
+ name = "foo"
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ println!("main");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--all"),
+ execs().with_status(0)
+ );
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("main\n"));
+ assert_that(process(&p.bin("bar")),
+ execs().with_status(0).with_stdout("main\n"));
+}
+
+#[test]
+fn run_proper_binary_main_rs_as_foo() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/foo.rs", r#"
+ fn main() {
+ panic!("This should never be run.");
+ }
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bin").arg("foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn rustc_wrapper() {
+ // We don't have /usr/bin/env on Windows.
+ if cfg!(windows) { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").env("RUSTC_WRAPPER", "/usr/bin/env"),
+ execs().with_stderr_contains(
+ "[RUNNING] `/usr/bin/env rustc --crate-name foo [..]")
+ .with_status(0));
+}
+
+#[test]
+fn cdylib_not_lifted() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ authors = []
+ version = "0.1.0"
+
+ [lib]
+ crate-type = ["cdylib"]
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ let files = if cfg!(windows) {
+ vec!["foo.dll.lib", "foo.dll.exp", "foo.dll"]
+ } else if cfg!(target_os = "macos") {
+ vec!["libfoo.dylib"]
+ } else {
+ vec!["libfoo.so"]
+ };
+
+ for file in files {
+ println!("checking: {}", file);
+ assert_that(&p.root().join("target/debug/deps").join(&file),
+ existing_file());
+ }
+}
+
+#[test]
+fn cdylib_final_outputs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo-bar"
+ authors = []
+ version = "0.1.0"
+
+ [lib]
+ crate-type = ["cdylib"]
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ let files = if cfg!(windows) {
+ vec!["foo_bar.dll.lib", "foo_bar.dll"]
+ } else if cfg!(target_os = "macos") {
+ vec!["libfoo_bar.dylib"]
+ } else {
+ vec!["libfoo_bar.so"]
+ };
+
+ for file in files {
+ println!("checking: {}", file);
+ assert_that(&p.root().join("target/debug").join(&file), existing_file());
+ }
+}
+
+#[test]
+fn wasm32_final_outputs() {
+ use cargo::core::{Shell, Target, Workspace};
+ use cargo::ops::{self, BuildConfig, Context, CompileMode, CompileOptions, Kind, Unit};
+ use cargo::util::Config;
+ use cargo::util::important_paths::find_root_manifest_for_wd;
+
+ let target_triple = "wasm32-unknown-emscripten";
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo-bar"
+ authors = []
+ version = "0.1.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ // We can't cross-compile the project to the wasm target unless emscripten
+ // is installed, so instead of running `cargo build` we just create a
+ // cargo_rustc::Context and ask it what the target file names would be.
+
+ // Create the various pieces required to build a cargo_rustc::Context.
+ let shell = Shell::new();
+ let config = Config::new(shell, p.root(), p.root());
+ let root = find_root_manifest_for_wd(None, config.cwd()).expect("Can't find the root manifest");
+ let ws = Workspace::new(&root, &config).expect("Can't create workspace");
+
+ let opts = CompileOptions {
+ target: Some(target_triple),
+ .. CompileOptions::default(&config, CompileMode::Build)
+ };
+
+ let specs = opts.spec.into_package_id_specs(&ws).expect("Can't create specs");
+
+ let (packages, resolve) = ops::resolve_ws_precisely(
+ &ws,
+ None,
+ opts.features,
+ opts.all_features,
+ opts.no_default_features,
+ &specs,
+ ).expect("Can't create resolve");
+
+ let build_config = BuildConfig {
+ requested_target: Some(target_triple.to_string()),
+ jobs: 1,
+ .. BuildConfig::default()
+ };
+
+ let pkgid = packages
+ .package_ids()
+ .filter(|id| id.name() == "foo-bar")
+ .collect::<Vec<_>>();
+ let pkg = packages.get(pkgid[0]).expect("Can't get package");
+
+ let target = Target::bin_target("foo-bar", p.root().join("src/main.rs"), None);
+
+ let unit = Unit {
+ pkg: &pkg,
+ target: &target,
+ profile: &ws.profiles().dev,
+ kind: Kind::Target,
+ };
+ let units = vec![unit];
+
+ // Finally, create the cargo_rustc::Context.
+ let mut ctx = Context::new(
+ &ws,
+ &resolve,
+ &packages,
+ &config,
+ build_config,
+ ws.profiles(),
+ ).expect("Can't create context");
+
+ // Ask the context to resolve target file names.
+ ctx.probe_target_info(&units).expect("Can't probe target info");
+ let target_filenames = ctx.target_filenames(&unit).expect("Can't get target file names");
+
+ // Verify the result.
+ let mut expected = vec!["debug/foo-bar.js", "debug/foo_bar.wasm"];
+
+ assert_eq!(target_filenames.len(), expected.len());
+
+ let mut target_filenames = target_filenames
+ .iter()
+ .map(|&(_, ref link_dst, _)| link_dst.clone().unwrap())
+ .collect::<Vec<_>>();
+ target_filenames.sort();
+ expected.sort();
+
+ for (expected, actual) in expected.iter().zip(target_filenames.iter()) {
+ assert!(
+ actual.ends_with(expected),
+ format!("{:?} does not end with {}", actual, expected)
+ );
+ }
+}
+
+#[test]
+fn deterministic_cfg_flags() {
+ // The bug this guards against was non-deterministic: the --cfg flags
+ // could be passed to rustc in a different order from build to build.
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ build = "build.rs"
+
+ [features]
+ default = ["f_a", "f_b", "f_c", "f_d"]
+ f_a = []
+ f_b = []
+ f_c = []
+ f_d = []
+ "#)
+ .file("build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=cfg_a");
+ println!("cargo:rustc-cfg=cfg_b");
+ println!("cargo:rustc-cfg=cfg_c");
+ println!("cargo:rustc-cfg=cfg_d");
+ println!("cargo:rustc-cfg=cfg_e");
+ }
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.1.0 [..]
+[RUNNING] [..]
+[RUNNING] [..]
+[RUNNING] `rustc --crate-name foo [..] \
+--cfg[..]default[..]--cfg[..]f_a[..]--cfg[..]f_b[..]\
+--cfg[..]f_c[..]--cfg[..]f_d[..] \
+--cfg cfg_a --cfg cfg_b --cfg cfg_c --cfg cfg_d --cfg cfg_e`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]"));
+}
+
+#[test]
+fn explicit_bins_without_paths() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [[bin]]
+ name = "foo"
+
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .file("src/bin/bar.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
+#[test]
+fn no_bin_in_src_with_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/foo.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ can't find `foo` bin, specify bin.path"));
+}
+
+#[test]
+fn inferred_bins() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("src/bin/bar.rs", "fn main() {}")
+ .file("src/bin/baz/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+ assert_that(&p.bin("baz"), existing_file());
+}
+
+#[test]
+fn inferred_bins_duplicate_name() {
+ // This should fail because there are two binaries with the same name.
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("src/bin/foo.rs", "fn main() {}")
+ .file("src/bin/foo/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[..]found duplicate binary name foo, but all binary targets must have a unique name[..]
+"));
+}
+
+#[test]
+fn inferred_bin_path() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [[bin]]
+ name = "bar"
+ # Note, no `path` key!
+ "#)
+ .file("src/bin/bar/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("bar"), existing_file());
+}
+
+#[test]
+fn inferred_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "fn main() {}")
+ .file("examples/bar.rs", "fn main() {}")
+ .file("examples/baz/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("test"), execs().with_status(0));
+ assert_that(&p.bin("examples/bar"), existing_file());
+ assert_that(&p.bin("examples/baz"), existing_file());
+}
+
+#[test]
+fn inferred_tests() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "fn main() {}")
+ .file("tests/bar.rs", "fn main() {}")
+ .file("tests/baz/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(
+ p.cargo("test").arg("--test=bar").arg("--test=baz"),
+ execs().with_status(0));
+}
+
+#[test]
+fn inferred_benchmarks() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "fn main() {}")
+ .file("benches/bar.rs", "fn main() {}")
+ .file("benches/baz/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(
+ p.cargo("bench").arg("--bench=bar").arg("--bench=baz"),
+ execs().with_status(0));
+}
+
+#[test]
+fn same_metadata_different_directory() {
+ // A top-level crate built in two different workspaces should have the
+ // same metadata hash.
+ let p = project("foo1")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+ let output = t!(String::from_utf8(
+ t!(p.cargo("build").arg("-v").exec_with_output())
+ .stderr,
+ ));
+ let metadata = output
+ .split_whitespace()
+ .find(|arg| arg.starts_with("metadata="))
+ .unwrap();
+
+ let p = project("foo2")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(
+ p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr_contains(
+ format!("[..]{}[..]", metadata),
+ ),
+ );
+}
+
+#[test]
+fn building_a_dependent_crate_without_bin_should_fail() {
+ Package::new("testless", "0.1.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "testless"
+ version = "0.1.0"
+
+ [[bin]]
+ name = "a_bin"
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dependencies]
+ testless = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr_contains(
+ "[..]can't find `a_bin` bin, specify bin.path"
+ ));
+}
+
+#[cfg(any(target_os = "macos", target_os = "ios"))]
+#[test]
+fn uplift_dsym_of_bin_on_mac() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("src/main.rs", "fn main() { panic!(); }")
+ .file("src/bin/b.rs", "fn main() { panic!(); }")
+ .file("examples/c.rs", "fn main() { panic!(); }")
+ .file("tests/d.rs", "fn main() { panic!(); }")
+ .build();
+
+ assert_that(
+ p.cargo("build").arg("--bins").arg("--examples").arg("--tests"),
+ execs().with_status(0)
+ );
+ assert_that(&p.bin("foo.dSYM"), existing_dir());
+ assert_that(&p.bin("b.dSYM"), existing_dir());
+ assert!(
+ p.bin("b.dSYM")
+ .symlink_metadata()
+ .expect("read metadata from b.dSYM")
+ .file_type()
+ .is_symlink()
+ );
+ assert_that(&p.bin("c.dSYM"), is_not(existing_dir()));
+ assert_that(&p.bin("d.dSYM"), is_not(existing_dir()));
+}
+
+// Make sure that `cargo build` chooses the correct profile for building
+// targets based on filters (assuming --profile is not specified).
+#[test]
+fn build_filter_infer_profile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .file("tests/t1.rs", "")
+ .file("benches/b1.rs", "")
+ .file("examples/ex1.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link[..]")
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]main.rs --crate-type bin \
+ --emit=dep-info,link[..]")
+ );
+
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("build").arg("-v").arg("--test=t1"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link[..]")
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name t1 tests[/]t1.rs --emit=dep-info,link[..]")
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]main.rs --crate-type bin \
+ --emit=dep-info,link[..]")
+ );
+
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("build").arg("-v").arg("--bench=b1"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link[..]")
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name b1 benches[/]b1.rs --emit=dep-info,link \
+ -C opt-level=3[..]")
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]main.rs --crate-type bin \
+ --emit=dep-info,link[..]")
+ );
+}
+
+#[test]
+fn all_targets_no_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+ assert_that(p.cargo("build").arg("-v").arg("--all-targets"),
+ execs().with_status(0)
+ // bin
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]main.rs --crate-type bin \
+ --emit=dep-info,link[..]")
+ // bench
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]main.rs --emit=dep-info,link \
+ -C opt-level=3 --test [..]")
+ // unit test
+ .with_stderr_contains("\
+ [RUNNING] `rustc --crate-name foo src[/]main.rs --emit=dep-info,link \
+ -C debuginfo=2 --test [..]")
+ );
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::ChannelChanger;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn feature_required() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ im-a-teapot = true
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build")
+ .masquerade_as_nightly_cargo(),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ the `im-a-teapot` manifest key is unstable and may not work properly in England
+
+Caused by:
+ feature `test-dummy-unstable` is required
+
+consider adding `cargo-features = [\"test-dummy-unstable\"]` to the manifest
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ the `im-a-teapot` manifest key is unstable and may not work properly in England
+
+Caused by:
+ feature `test-dummy-unstable` is required
+
+this Cargo does not support nightly features, but if you
+switch to nightly channel you can add
+`cargo-features = [\"test-dummy-unstable\"]` to enable this feature
+"));
+}
+
+#[test]
+fn unknown_feature() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["foo"]
+
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ unknown cargo feature `foo`
+"));
+}
+
+#[test]
+fn stable_feature_warns() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["test-dummy-stable"]
+
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+warning: the cargo feature `test-dummy-stable` is now stable and is no longer \
+necessary to be listed in the manifest
+[COMPILING] a [..]
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn nightly_feature_requires_nightly() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["test-dummy-unstable"]
+
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ im-a-teapot = true
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build")
+ .masquerade_as_nightly_cargo(),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] a [..]
+[FINISHED] [..]
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ the cargo feature `test-dummy-unstable` requires a nightly version of Cargo, \
+ but this is the `stable` channel
+"));
+}
+
+#[test]
+fn nightly_feature_requires_nightly_in_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ cargo-features = ["test-dummy-unstable"]
+
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ im-a-teapot = true
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build")
+ .masquerade_as_nightly_cargo(),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] a [..]
+[COMPILING] b [..]
+[FINISHED] [..]
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to load source for a dependency on `a`
+
+Caused by:
+ Unable to update [..]
+
+Caused by:
+ failed to parse manifest at `[..]`
+
+Caused by:
+ the cargo feature `test-dummy-unstable` requires a nightly version of Cargo, \
+ but this is the `stable` channel
+"));
+}
+
+#[test]
+fn cant_publish() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["test-dummy-unstable"]
+
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ im-a-teapot = true
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build")
+ .masquerade_as_nightly_cargo(),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] a [..]
+[FINISHED] [..]
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ the cargo feature `test-dummy-unstable` requires a nightly version of Cargo, \
+ but this is the `stable` channel
+"));
+}
+
+#[test]
+fn z_flags_rejected() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["test-dummy-unstable"]
+
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ im-a-teapot = true
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build")
+ .arg("-Zprint-im-a-teapot"),
+ execs().with_status(101)
+ .with_stderr("\
+error: the `-Z` flag is only accepted on the nightly channel of Cargo
+"));
+
+ assert_that(p.cargo("build")
+ .masquerade_as_nightly_cargo()
+ .arg("-Zarg"),
+ execs().with_status(101)
+ .with_stderr("\
+error: unknown `-Z` flag specified: arg
+"));
+
+ assert_that(p.cargo("build")
+ .masquerade_as_nightly_cargo()
+ .arg("-Zprint-im-a-teapot"),
+ execs().with_status(0)
+ .with_stdout("im-a-teapot = true\n")
+ .with_stderr("\
+[COMPILING] a [..]
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn publish_rejected() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["test-dummy-unstable"]
+
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("package")
+ .masquerade_as_nightly_cargo(),
+ execs().with_status(101)
+ .with_stderr("\
+error: cannot package or publish crates which activate nightly-only cargo features
+"));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::env;
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::path::{Path, PathBuf};
+use std::str;
+
+use cargotest::cargo_process;
+use cargotest::support::paths::{self, CargoPathExt};
+use cargotest::support::{execs, project, Project, basic_bin_manifest};
+use hamcrest::{assert_that, existing_file};
+
+#[cfg_attr(windows,allow(dead_code))]
+enum FakeKind<'a> {
+ Executable,
+ Symlink{target:&'a Path},
+}
+
+/// Add an empty file with executable flags (and a platform-dependent suffix).
+/// TODO: move this to `Project` if other use cases emerge.
+fn fake_file(proj: Project, dir: &Path, name: &str, kind: &FakeKind) -> Project {
+ let path = proj.root().join(dir).join(&format!("{}{}", name,
+ env::consts::EXE_SUFFIX));
+ path.parent().unwrap().mkdir_p();
+ match *kind {
+ FakeKind::Executable => {
+ File::create(&path).unwrap();
+ make_executable(&path);
+ },
+ FakeKind::Symlink{target} => {
+ make_symlink(&path,target);
+ }
+ }
+ return proj;
+
+ #[cfg(unix)]
+ fn make_executable(p: &Path) {
+ use std::os::unix::prelude::*;
+
+ let mut perms = fs::metadata(p).unwrap().permissions();
+ let mode = perms.mode();
+ perms.set_mode(mode | 0o111);
+ fs::set_permissions(p, perms).unwrap();
+ }
+ #[cfg(windows)]
+ fn make_executable(_: &Path) {}
+ #[cfg(unix)]
+ fn make_symlink(p: &Path, t: &Path) {
+ ::std::os::unix::fs::symlink(t,p).expect("Failed to create symlink");
+ }
+ #[cfg(windows)]
+ fn make_symlink(_: &Path, _: &Path) {
+ panic!("Not supported")
+ }
+}
+
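+/// The entries currently on `PATH`, so tests can append their own
+/// directories before re-joining and overriding the variable.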
+fn path() -> Vec<PathBuf> {
+ env::split_paths(&env::var_os("PATH").unwrap_or_default()).collect()
+}
+
+#[test]
+fn list_command_looks_at_path() {
+ let proj = project("list-non-overlapping").build();
+ let proj = fake_file(proj, Path::new("path-test"), "cargo-1", &FakeKind::Executable);
+ let mut pr = cargo_process();
+
+ let mut path = path();
+ path.push(proj.root().join("path-test"));
+ let path = env::join_paths(path.iter()).unwrap();
+ let output = pr.arg("-v").arg("--list")
+ .env("PATH", &path);
+ let output = output.exec_with_output().unwrap();
+ let output = str::from_utf8(&output.stdout).unwrap();
+ assert!(output.contains("\n 1\n"), "missing 1: {}", output);
+}
+
+// Windows and symlinks don't currently get along very well.
+#[cfg(unix)]
+#[test]
+fn list_command_resolves_symlinks() {
+ use cargotest::support::cargo_exe;
+
+ let proj = project("list-non-overlapping").build();
+ let proj = fake_file(proj, Path::new("path-test"), "cargo-2",
+ &FakeKind::Symlink{target:&cargo_exe()});
+ let mut pr = cargo_process();
+
+ let mut path = path();
+ path.push(proj.root().join("path-test"));
+ let path = env::join_paths(path.iter()).unwrap();
+ let output = pr.arg("-v").arg("--list")
+ .env("PATH", &path);
+ let output = output.exec_with_output().unwrap();
+ let output = str::from_utf8(&output.stdout).unwrap();
+ assert!(output.contains("\n 2\n"), "missing 2: {}", output);
+}
+
+#[test]
+fn find_closest_biuld_to_build() {
+ let mut pr = cargo_process();
+ pr.arg("biuld");
+
+ assert_that(pr,
+ execs().with_status(101)
+ .with_stderr("[ERROR] no such subcommand: `biuld`
+
+<tab>Did you mean `build`?
+
+"));
+}
+
+// If a subcommand is at an edit distance of more than 3, we don't suggest it.
+#[test]
+fn find_closest_dont_correct_nonsense() {
+ let mut pr = cargo_process();
+ pr.arg("there-is-no-way-that-there-is-a-command-close-to-this")
+ .cwd(&paths::root());
+
+ assert_that(pr,
+ execs().with_status(101)
+ .with_stderr("[ERROR] no such subcommand: \
+ `there-is-no-way-that-there-is-a-command-close-to-this`
+"));
+}
+
+#[test]
+fn displays_subcommand_on_error() {
+ let mut pr = cargo_process();
+ pr.arg("invalid-command");
+
+ assert_that(pr,
+ execs().with_status(101)
+ .with_stderr("[ERROR] no such subcommand: `invalid-command`
+"));
+}
+
+#[test]
+fn override_cargo_home() {
+ let root = paths::root();
+ let my_home = root.join("my_home");
+ fs::create_dir(&my_home).unwrap();
+ File::create(&my_home.join("config")).unwrap().write_all(br#"
+ [cargo-new]
+ name = "foo"
+ email = "bar"
+ git = false
+ "#).unwrap();
+
+ assert_that(cargo_process()
+ .arg("new").arg("foo")
+ .env("USER", "foo")
+ .env("CARGO_HOME", &my_home),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["foo <bar>"]"#));
+}
+
+#[test]
+fn cargo_subcommand_env() {
+ use cargotest::support::cargo_exe;
+
+ let src = format!(r#"
+ use std::env;
+
+ fn main() {{
+ println!("{{}}", env::var("{}").unwrap());
+ }}
+ "#, cargo::CARGO_ENV);
+
+ let p = project("cargo-envtest")
+ .file("Cargo.toml", &basic_bin_manifest("cargo-envtest"))
+ .file("src/main.rs", &src)
+ .build();
+
+ let target_dir = p.target_debug_dir();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("cargo-envtest"), existing_file());
+
+ let mut pr = cargo_process();
+ let cargo = cargo_exe().canonicalize().unwrap();
+ let mut path = path();
+ path.push(target_dir);
+ let path = env::join_paths(path.iter()).unwrap();
+
+ assert_that(pr.arg("envtest").env("PATH", &path),
+ execs().with_status(0).with_stdout(cargo.to_str().unwrap()));
+}
+
+#[test]
+fn cargo_help() {
+ assert_that(cargo_process(),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("help"),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("-h"),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("help").arg("build"),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("build").arg("-h"),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("help").arg("-h"),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("help").arg("help"),
+ execs().with_status(0));
+}
+
+#[test]
+fn explain() {
+ assert_that(cargo_process().arg("--explain").arg("E0001"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+use cargotest::support::{project, execs, basic_bin_manifest};
+use hamcrest::assert_that;
+
+#[test]
+fn alias_incorrect_config_type() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {
+ }"#)
+ .file(".cargo/config",r#"
+ [alias]
+ b-cargo-test = 5
+ "#)
+ .build();
+
+ assert_that(p.cargo("b-cargo-test").arg("-v"),
+ execs().with_status(101).
+ with_stderr_contains("[ERROR] invalid configuration \
+for key `alias.b-cargo-test`
+expected a list, but found a integer for [..]"));
+}
+
+#[test]
+fn alias_default_config_overrides_config() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {
+ }"#)
+ .file(".cargo/config",r#"
+ [alias]
+ b = "not_build"
+ "#)
+ .build();
+
+ assert_that(p.cargo("b").arg("-v"),
+ execs().with_status(0).
+ with_stderr_contains("[COMPILING] foo v0.5.0 [..]"));
+}
+
+#[test]
+fn alias_config() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {
+ }"#)
+ .file(".cargo/config",r#"
+ [alias]
+ b-cargo-test = "build"
+ "#)
+ .build();
+
+ assert_that(p.cargo("b-cargo-test").arg("-v"),
+ execs().with_status(0).
+ with_stderr_contains("[COMPILING] foo v0.5.0 [..]
+[RUNNING] `rustc --crate-name foo [..]"));
+}
+
+#[test]
+fn alias_list_test() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {
+ }"#)
+ .file(".cargo/config",r#"
+ [alias]
+ b-cargo-test = ["build", "--release"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("b-cargo-test").arg("-v"),
+ execs().with_status(0).
+ with_stderr_contains("[COMPILING] foo v0.5.0 [..]").
+ with_stderr_contains("[RUNNING] `rustc --crate-name [..]")
+ );
+}
+
+#[test]
+fn alias_with_flags_config() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {
+ }"#)
+ .file(".cargo/config",r#"
+ [alias]
+ b-cargo-test = "build --release"
+ "#)
+ .build();
+
+ assert_that(p.cargo("b-cargo-test").arg("-v"),
+ execs().with_status(0).
+ with_stderr_contains("[COMPILING] foo v0.5.0 [..]").
+ with_stderr_contains("[RUNNING] `rustc --crate-name foo [..]")
+ );
+}
+
+#[test]
+fn cant_shadow_builtin() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {
+ }"#)
+ .file(".cargo/config",r#"
+ [alias]
+ build = "fetch"
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
--- /dev/null
+[package]
+name = "cargotest"
+version = "0.1.0"
+authors = ["Alex Crichton <alex@alexcrichton.com>"]
+
+[lib]
+path = "lib.rs"
+
+[dependencies]
+cargo = { path = "../.." }
+filetime = "0.1"
+flate2 = "0.2"
+git2 = { version = "0.6", default-features = false }
+hamcrest = "=0.1.1"
+hex = "0.2"
+log = "0.3"
+serde_json = "1.0"
+tar = { version = "0.4", default-features = false }
+url = "1.1"
--- /dev/null
+use std::fmt;
+use std::path::{PathBuf, Path};
+
+use hamcrest::{Matcher, MatchResult, existing_file};
+use support::paths;
+
+pub use self::InstalledExe as has_installed_exe;
+
+pub fn cargo_home() -> PathBuf {
+ paths::home().join(".cargo")
+}
+
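+/// Matcher asserting that `<dir>/bin/<name>` (with the platform's
+/// executable suffix) exists, e.g.
+/// `assert_that(cargo_home(), has_installed_exe("foo"))`.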
+pub struct InstalledExe(pub &'static str);
+
+pub fn exe(name: &str) -> String {
+ if cfg!(windows) {format!("{}.exe", name)} else {name.to_string()}
+}
+
+impl<P: AsRef<Path>> Matcher<P> for InstalledExe {
+ fn matches(&self, path: P) -> MatchResult {
+ let path = path.as_ref().join("bin").join(exe(self.0));
+ existing_file().matches(&path)
+ }
+}
+
+impl fmt::Display for InstalledExe {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "installed exe `{}`", self.0)
+ }
+}
--- /dev/null
+#![deny(warnings)]
+
+extern crate cargo;
+extern crate filetime;
+extern crate flate2;
+extern crate git2;
+extern crate hamcrest;
+extern crate hex;
+#[macro_use]
+extern crate serde_json;
+extern crate tar;
+extern crate url;
+
+use std::ffi::OsStr;
+use std::time::Duration;
+
+use cargo::util::Rustc;
+use std::path::PathBuf;
+
+pub mod support;
+pub mod install;
+
+thread_local!(pub static RUSTC: Rustc = Rustc::new(PathBuf::from("rustc"), None).unwrap());
+
+pub fn rustc_host() -> String {
+ RUSTC.with(|r| r.host.clone())
+}
+
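+/// Whether the `rustc` under test is a nightly (or dev) build; tests that
+/// exercise nightly-only flags guard themselves with
+/// `if !is_nightly() { return }`.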
+pub fn is_nightly() -> bool {
+ RUSTC.with(|r| {
+ r.verbose_version.contains("-nightly") ||
+ r.verbose_version.contains("-dev")
+ })
+}
+
+pub fn process<T: AsRef<OsStr>>(t: T) -> cargo::util::ProcessBuilder {
+ _process(t.as_ref())
+}
+
+fn _process(t: &OsStr) -> cargo::util::ProcessBuilder {
+ let mut p = cargo::util::process(t);
+ p.cwd(&support::paths::root())
+ .env_remove("CARGO_HOME")
+ .env("HOME", support::paths::home())
+ .env("CARGO_HOME", support::paths::home().join(".cargo"))
+ .env("__CARGO_TEST_ROOT", support::paths::root())
+
+ // Force Cargo to think it's on the stable channel for all tests; this
+ // should hopefully not surprise us as we add Cargo features over time and
+ // Cargo rides the trains.
+ .env("__CARGO_TEST_CHANNEL_OVERRIDE_DO_NOT_USE_THIS", "stable")
+
+ .env_remove("__CARGO_DEFAULT_LIB_METADATA")
+ .env_remove("RUSTC")
+ .env_remove("RUSTDOC")
+ .env_remove("RUSTC_WRAPPER")
+ .env_remove("RUSTFLAGS")
+ .env_remove("CARGO_INCREMENTAL")
+ .env_remove("XDG_CONFIG_HOME") // see #2345
+ .env("GIT_CONFIG_NOSYSTEM", "1") // keep trying to sandbox ourselves
+ .env_remove("EMAIL")
+ .env_remove("MFLAGS")
+ .env_remove("MAKEFLAGS")
+ .env_remove("CARGO_MAKEFLAGS")
+ .env_remove("GIT_AUTHOR_NAME")
+ .env_remove("GIT_AUTHOR_EMAIL")
+ .env_remove("GIT_COMMITTER_NAME")
+ .env_remove("GIT_COMMITTER_EMAIL")
+ .env_remove("CARGO_TARGET_DIR") // we assume 'target'
+ .env_remove("MSYSTEM"); // assume cmd.exe everywhere on windows
+ return p;
+}
+
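+/// Overrides the channel Cargo believes it was released on, e.g.
+/// `p.cargo("build").masquerade_as_nightly_cargo()` lets a test use
+/// nightly-only features on any build of Cargo.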
+pub trait ChannelChanger: Sized {
+ fn masquerade_as_nightly_cargo(&mut self) -> &mut Self;
+}
+
+impl ChannelChanger for cargo::util::ProcessBuilder {
+ fn masquerade_as_nightly_cargo(&mut self) -> &mut Self {
+ self.env("__CARGO_TEST_CHANNEL_OVERRIDE_DO_NOT_USE_THIS", "nightly")
+ }
+}
+
+pub fn cargo_process() -> cargo::util::ProcessBuilder {
+ process(&support::cargo_exe())
+}
+
+pub fn sleep_ms(ms: u64) {
+ std::thread::sleep(Duration::from_millis(ms));
+}
--- /dev/null
+use std::env;
+use std::process::Command;
+use std::sync::{Once, ONCE_INIT};
+use std::sync::atomic::{AtomicBool, ATOMIC_BOOL_INIT, Ordering};
+
+use support::{project, main_file, basic_bin_manifest};
+
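+/// Whether cross-compilation tests should be skipped on this machine;
+/// callers are expected to bail out early when this returns true.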
+pub fn disabled() -> bool {
+ // First, disable if ./configure requested it.
+ match env::var("CFG_DISABLE_CROSS_TESTS") {
+ Ok(ref s) if *s == "1" => return true,
+ _ => {}
+ }
+
+ // Right now the Windows MinGW bots cannot cross-compile, so we disable
+ // ourselves everywhere except the macOS, Linux, and MSVC setups, where
+ // the rustc install script ensures both architectures are available.
+ if !(cfg!(target_os = "macos") ||
+ cfg!(target_os = "linux") ||
+ cfg!(target_env = "msvc")) {
+ return true;
+ }
+
+ // It's not particularly common to have a cross-compilation setup, so
+ // try to detect that before we fail a bunch of tests through no fault
+ // of the user.
+ static CAN_RUN_CROSS_TESTS: AtomicBool = ATOMIC_BOOL_INIT;
+ static CHECK: Once = ONCE_INIT;
+
+ let cross_target = alternate();
+
+ CHECK.call_once(|| {
+ let p = project("cross_test")
+ .file("Cargo.toml", &basic_bin_manifest("cross_test"))
+ .file("src/cross_test.rs", &main_file(r#""testing!""#, &[]))
+ .build();
+
+ let result = p.cargo("build")
+ .arg("--target").arg(&cross_target)
+ .exec_with_output();
+
+ if result.is_ok() {
+ CAN_RUN_CROSS_TESTS.store(true, Ordering::SeqCst);
+ }
+ });
+
+ if CAN_RUN_CROSS_TESTS.load(Ordering::SeqCst) {
+ // We were able to compile a simple project, so the user has the
+ // necessary std:: bits installed. Therefore, tests should not
+ // be disabled.
+ return false;
+ }
+
+ // We can't compile a simple cross project. We want to warn the user
+ // by failing a single test and having the remainder of the cross tests
+ // pass. We don't use std::sync::Once here because panicking inside its
+ // call_once method would poison the Once instance, which is not what
+ // we want.
+ static HAVE_WARNED: AtomicBool = ATOMIC_BOOL_INIT;
+
+ if HAVE_WARNED.swap(true, Ordering::SeqCst) {
+ // We are some other test and somebody else is handling the warning.
+ // Just disable the current test.
+ return true;
+ }
+
+ // We are responsible for warning the user, which we do by panicking.
+ let rustup_available = Command::new("rustup").output().is_ok();
+
+ let linux_help = if cfg!(target_os = "linux") {
+ "
+
+You may need to install runtime libraries for your Linux distribution as well.".to_string()
+ } else {
+ "".to_string()
+ };
+
+ let rustup_help = if rustup_available {
+ format!("
+
+Alternatively, you can install the necessary libraries for cross-compilation with
+
+ rustup target add {}{}", cross_target, linux_help)
+ } else {
+ "".to_string()
+ };
+
+ panic!("Cannot cross compile to {}.
+
+This failure can be safely ignored. If you would prefer to not see this
+failure, you can set the environment variable CFG_DISABLE_CROSS_TESTS to \"1\".{}
+", cross_target, rustup_help);
+}
+
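+/// The alternate target triple: same OS, other pointer width. For example,
+/// on an x86_64 Linux host this returns "i686-unknown-linux-gnu".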
+pub fn alternate() -> String {
+ let platform = match env::consts::OS {
+ "linux" => "unknown-linux-gnu",
+ "macos" => "apple-darwin",
+ "windows" => "pc-windows-msvc",
+ _ => unreachable!(),
+ };
+ let arch = match env::consts::ARCH {
+ "x86" => "x86_64",
+ "x86_64" => "i686",
+ _ => unreachable!(),
+ };
+ format!("{}-{}", arch, platform)
+}
+
+pub fn alternate_arch() -> &'static str {
+ match env::consts::ARCH {
+ "x86" => "x86_64",
+ "x86_64" => "x86",
+ _ => unreachable!(),
+ }
+}
+
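+/// The triple of the host itself, e.g. "x86_64-unknown-linux-gnu" on
+/// 64-bit Linux.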
+pub fn host() -> String {
+ let platform = match env::consts::OS {
+ "linux" => "unknown-linux-gnu",
+ "macos" => "apple-darwin",
+ "windows" => "pc-windows-msvc",
+ _ => unreachable!(),
+ };
+ let arch = match env::consts::ARCH {
+ "x86" => "i686",
+ "x86_64" => "x86_64",
+ _ => unreachable!(),
+ };
+ format!("{}-{}", arch, platform)
+}
--- /dev/null
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::path::{Path, PathBuf};
+
+use url::Url;
+use git2;
+
+use cargo::util::ProcessError;
+use support::{ProjectBuilder, Project, project, path2url};
+
+#[must_use]
+pub struct RepoBuilder {
+ repo: git2::Repository,
+ files: Vec<PathBuf>,
+}
+
+pub struct Repository(git2::Repository);
+
+pub fn repo(p: &Path) -> RepoBuilder { RepoBuilder::init(p) }
+
+impl RepoBuilder {
+ pub fn init(p: &Path) -> RepoBuilder {
+ t!(fs::create_dir_all(p.parent().unwrap()));
+ let repo = t!(git2::Repository::init(p));
+ {
+ let mut config = t!(repo.config());
+ t!(config.set_str("user.name", "name"));
+ t!(config.set_str("user.email", "email"));
+ }
+ RepoBuilder { repo: repo, files: Vec::new() }
+ }
+
+ pub fn file(self, path: &str, contents: &str) -> RepoBuilder {
+ let mut me = self.nocommit_file(path, contents);
+ me.files.push(PathBuf::from(path));
+ me
+ }
+
+ pub fn nocommit_file(self, path: &str, contents: &str) -> RepoBuilder {
+ let dst = self.repo.workdir().unwrap().join(path);
+ t!(fs::create_dir_all(dst.parent().unwrap()));
+ t!(t!(File::create(&dst)).write_all(contents.as_bytes()));
+ self
+ }
+
+ pub fn build(self) -> Repository {
+ {
+ let mut index = t!(self.repo.index());
+ for file in self.files.iter() {
+ t!(index.add_path(file));
+ }
+ t!(index.write());
+ let id = t!(index.write_tree());
+ let tree = t!(self.repo.find_tree(id));
+ let sig = t!(self.repo.signature());
+ t!(self.repo.commit(Some("HEAD"), &sig, &sig,
+ "Initial commit", &tree, &[]));
+ }
+ let RepoBuilder{ repo, .. } = self;
+ Repository(repo)
+ }
+}
+
+impl Repository {
+ pub fn root(&self) -> &Path {
+ self.0.workdir().unwrap()
+ }
+
+ pub fn url(&self) -> Url {
+ path2url(self.0.workdir().unwrap().to_path_buf())
+ }
+}
+
+pub fn new<F>(name: &str, callback: F) -> Result<Project, ProcessError>
+ where F: FnOnce(ProjectBuilder) -> ProjectBuilder
+{
+ let mut git_project = project(name);
+ git_project = callback(git_project);
+ let git_project = git_project.build();
+
+ let repo = t!(git2::Repository::init(&git_project.root()));
+ let mut cfg = t!(repo.config());
+ t!(cfg.set_str("user.email", "foo@bar.com"));
+ t!(cfg.set_str("user.name", "Foo Bar"));
+ drop(cfg);
+ add(&repo);
+ commit(&repo);
+ Ok(git_project)
+}
+
+pub fn add(repo: &git2::Repository) {
+ // FIXME(libgit2/libgit2#2514): apparently add_all will add all submodules
+ // as well, and then fail because they're directories. As a stopgap, we
+ // just ignore all submodules.
+ let mut s = t!(repo.submodules());
+ for submodule in s.iter_mut() {
+ t!(submodule.add_to_index(false));
+ }
+ let mut index = t!(repo.index());
+ t!(index.add_all(["*"].iter(), git2::ADD_DEFAULT,
+ Some(&mut (|a, _b| {
+ if s.iter().any(|s| a.starts_with(s.path())) {1} else {0}
+ }))));
+ t!(index.write());
+}
+
+pub fn add_submodule<'a>(repo: &'a git2::Repository, url: &str,
+ path: &Path) -> git2::Submodule<'a>
+{
+ let path = path.to_str().unwrap().replace(r"\", "/");
+ let mut s = t!(repo.submodule(url, Path::new(&path), false));
+ let subrepo = t!(s.open());
+ t!(subrepo.remote_add_fetch("origin", "refs/heads/*:refs/heads/*"));
+ let mut origin = t!(subrepo.find_remote("origin"));
+ t!(origin.fetch(&[], None, None));
+ t!(subrepo.checkout_head(None));
+ t!(s.add_finalize());
+ s
+}
+
+pub fn commit(repo: &git2::Repository) -> git2::Oid {
+ let tree_id = t!(t!(repo.index()).write_tree());
+ let sig = t!(repo.signature());
+ let mut parents = Vec::new();
+ if let Some(parent) = repo.head().ok().map(|h| h.target().unwrap()) {
+ parents.push(t!(repo.find_commit(parent)));
+ }
+ let parents = parents.iter().collect::<Vec<_>>();
+ t!(repo.commit(Some("HEAD"), &sig, &sig, "test",
+ &t!(repo.find_tree(tree_id)),
+ &parents))
+}
+
+pub fn tag(repo: &git2::Repository, name: &str) {
+ let head = repo.head().unwrap().target().unwrap();
+ t!(repo.tag(name,
+ &t!(repo.find_object(head, None)),
+ &t!(repo.signature()),
+ "make a new tag",
+ false));
+}
--- /dev/null
+use std::env;
+use std::error::Error;
+use std::ffi::OsStr;
+use std::fmt;
+use std::fs;
+use std::io::prelude::*;
+use std::os;
+use std::path::{Path, PathBuf};
+use std::process::Output;
+use std::str;
+use std::usize;
+
+use serde_json::{self, Value};
+use url::Url;
+use hamcrest as ham;
+use cargo::util::ProcessBuilder;
+use cargo::util::{CargoError, CargoErrorKind, ProcessError};
+
+use support::paths::CargoPathExt;
+
+#[macro_export]
+macro_rules! t {
+ ($e:expr) => (match $e {
+ Ok(e) => e,
+ Err(e) => panic!("{} failed with {}", stringify!($e), e),
+ })
+}
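+
+// e.g. `let f = t!(File::open("Cargo.toml"));` panics with the failing
+// expression and its error instead of a bare `unwrap` message.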
+
+pub mod paths;
+pub mod git;
+pub mod registry;
+pub mod cross_compile;
+pub mod publish;
+
+/*
+ *
+ * ===== Builders =====
+ *
+ */
+
+#[derive(PartialEq,Clone)]
+struct FileBuilder {
+ path: PathBuf,
+ body: String
+}
+
+impl FileBuilder {
+ pub fn new(path: PathBuf, body: &str) -> FileBuilder {
+ FileBuilder { path, body: body.to_string() }
+ }
+
+ fn mk(&self) {
+ self.dirname().mkdir_p();
+
+ let mut file = fs::File::create(&self.path).unwrap_or_else(|e| {
+ panic!("could not create file {}: {}", self.path.display(), e)
+ });
+
+ t!(file.write_all(self.body.as_bytes()));
+ }
+
+ fn dirname(&self) -> &Path {
+ self.path.parent().unwrap()
+ }
+}
+
+#[derive(PartialEq,Clone)]
+struct SymlinkBuilder {
+ dst: PathBuf,
+ src: PathBuf,
+}
+
+impl SymlinkBuilder {
+ pub fn new(dst: PathBuf, src: PathBuf) -> SymlinkBuilder {
+ SymlinkBuilder { dst, src }
+ }
+
+ #[cfg(unix)]
+ fn mk(&self) {
+ self.dirname().mkdir_p();
+ t!(os::unix::fs::symlink(&self.dst, &self.src));
+ }
+
+ #[cfg(windows)]
+ fn mk(&self) {
+ self.dirname().mkdir_p();
+ t!(os::windows::fs::symlink_file(&self.dst, &self.src));
+ }
+
+ fn dirname(&self) -> &Path {
+ self.src.parent().unwrap()
+ }
+}
+
+#[derive(PartialEq,Clone)]
+pub struct Project{
+ root: PathBuf,
+}
+
+#[must_use]
+#[derive(PartialEq,Clone)]
+pub struct ProjectBuilder {
+ name: String,
+ root: Project,
+ files: Vec<FileBuilder>,
+ symlinks: Vec<SymlinkBuilder>,
+}
+
+impl ProjectBuilder {
+ pub fn root(&self) -> PathBuf {
+ self.root.root()
+ }
+
+ pub fn build_dir(&self) -> PathBuf {
+ self.root.build_dir()
+ }
+
+ pub fn target_debug_dir(&self) -> PathBuf {
+ self.root.target_debug_dir()
+ }
+
+ pub fn new(name: &str, root: PathBuf) -> ProjectBuilder {
+ ProjectBuilder {
+ name: name.to_string(),
+ root: Project{ root },
+ files: vec![],
+ symlinks: vec![],
+ }
+ }
+
+ pub fn file<B: AsRef<Path>>(mut self, path: B,
+ body: &str) -> Self {
+ self._file(path.as_ref(), body);
+ self
+ }
+
+ fn _file(&mut self, path: &Path, body: &str) {
+ self.files.push(FileBuilder::new(self.root.root.join(path), body));
+ }
+
+ pub fn symlink<T: AsRef<Path>>(mut self, dst: T,
+ src: T) -> Self {
+ self.symlinks.push(SymlinkBuilder::new(self.root.root.join(dst),
+ self.root.root.join(src)));
+ self
+ }
+
+ pub fn build(self) -> Project {
+ // First, clean the directory if it already exists
+ self.rm_root();
+
+ // Create the empty directory
+ self.root.root.mkdir_p();
+
+ for file in self.files.iter() {
+ file.mk();
+ }
+
+ for symlink in self.symlinks.iter() {
+ symlink.mk();
+ }
+
+ let ProjectBuilder { root, .. } = self;
+ root
+ }
+
+ fn rm_root(&self) {
+ self.root.root.rm_rf()
+ }
+}
+
+impl Project {
+ pub fn root(&self) -> PathBuf {
+ self.root.clone()
+ }
+
+ pub fn build_dir(&self) -> PathBuf {
+ self.root.join("target")
+ }
+
+ pub fn target_debug_dir(&self) -> PathBuf {
+ self.build_dir().join("debug")
+ }
+
+ pub fn url(&self) -> Url { path2url(self.root()) }
+
+ pub fn example_lib(&self, name: &str, kind: &str) -> PathBuf {
+ let prefix = Project::get_lib_prefix(kind);
+
+ let extension = Project::get_lib_extension(kind);
+
+ let lib_file_name = format!("{}{}.{}",
+ prefix,
+ name,
+ extension);
+
+ self.target_debug_dir()
+ .join("examples")
+ .join(&lib_file_name)
+ }
+
+ pub fn bin(&self, b: &str) -> PathBuf {
+ self.build_dir().join("debug").join(&format!("{}{}", b,
+ env::consts::EXE_SUFFIX))
+ }
+
+ pub fn release_bin(&self, b: &str) -> PathBuf {
+ self.build_dir().join("release").join(&format!("{}{}", b,
+ env::consts::EXE_SUFFIX))
+ }
+
+ pub fn target_bin(&self, target: &str, b: &str) -> PathBuf {
+ self.build_dir().join(target).join("debug")
+ .join(&format!("{}{}", b, env::consts::EXE_SUFFIX))
+ }
+
+ pub fn change_file(&self, path: &str, body: &str) {
+ FileBuilder::new(self.root.join(path), body).mk()
+ }
+
+ pub fn process<T: AsRef<OsStr>>(&self, program: T) -> ProcessBuilder {
+ let mut p = ::process(program);
+ p.cwd(self.root());
+ p
+ }
+
+ pub fn cargo(&self, cmd: &str) -> ProcessBuilder {
+ let mut p = self.process(&cargo_exe());
+ p.arg(cmd);
+ p
+ }
+
+ pub fn read_lockfile(&self) -> String {
+ let mut buffer = String::new();
+ fs::File::open(self.root().join("Cargo.lock")).unwrap()
+ .read_to_string(&mut buffer).unwrap();
+ buffer
+ }
+
+ fn get_lib_prefix(kind: &str) -> &str {
+ match kind {
+ "lib" | "rlib" => "lib",
+ "staticlib" | "dylib" | "proc-macro" => {
+ if cfg!(windows) {
+ ""
+ } else {
+ "lib"
+ }
+ }
+ _ => unreachable!()
+ }
+ }
+
+ fn get_lib_extension(kind: &str) -> &str {
+ match kind {
+ "lib" | "rlib" => "rlib",
+ "staticlib" => {
+ if cfg!(windows) {
+ "lib"
+ } else {
+ "a"
+ }
+ }
+ "dylib" | "proc-macro" => {
+ if cfg!(windows) {
+ "dll"
+ } else if cfg!(target_os="macos") {
+ "dylib"
+ } else {
+ "so"
+ }
+ }
+ _ => unreachable!()
+ }
+ }
+}
+
+// Generates a project layout
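+// e.g. (sketch): `project("foo").file("src/lib.rs", "").build()` lays the
+// package out under the per-test root directory.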
+pub fn project(name: &str) -> ProjectBuilder {
+ ProjectBuilder::new(name, paths::root().join(name))
+}
+
+// Generates a project layout inside our fake home dir
+pub fn project_in_home(name: &str) -> ProjectBuilder {
+ ProjectBuilder::new(name, paths::home().join(name))
+}
+
+// === Helpers ===
+
+pub fn main_file(println: &str, deps: &[&str]) -> String {
+ let mut buf = String::new();
+
+ for dep in deps.iter() {
+ buf.push_str(&format!("extern crate {};\n", dep));
+ }
+
+ buf.push_str("fn main() { println!(");
+ buf.push_str(println);
+ buf.push_str("); }\n");
+
+ buf
+}
+
+trait ErrMsg<T> {
+ fn with_err_msg(self, val: String) -> Result<T, String>;
+}
+
+impl<T, E: fmt::Display> ErrMsg<T> for Result<T, E> {
+ fn with_err_msg(self, val: String) -> Result<T, String> {
+ match self {
+ Ok(val) => Ok(val),
+ Err(err) => Err(format!("{}; original={}", val, err))
+ }
+ }
+}
+
+// Path to cargo executables
+pub fn cargo_dir() -> PathBuf {
+ env::var_os("CARGO_BIN_PATH").map(PathBuf::from).or_else(|| {
+ env::current_exe().ok().map(|mut path| {
+ path.pop();
+ if path.ends_with("deps") {
+ path.pop();
+ }
+ path
+ })
+ }).unwrap_or_else(|| {
+ panic!("CARGO_BIN_PATH wasn't set. Cannot continue running test")
+ })
+}
+
+pub fn cargo_exe() -> PathBuf {
+ cargo_dir().join(format!("cargo{}", env::consts::EXE_SUFFIX))
+}
+
+/*
+ *
+ * ===== Matchers =====
+ *
+ */
+
+#[derive(Clone)]
+pub struct Execs {
+ expect_stdout: Option<String>,
+ expect_stdin: Option<String>,
+ expect_stderr: Option<String>,
+ expect_exit_code: Option<i32>,
+ expect_stdout_contains: Vec<String>,
+ expect_stderr_contains: Vec<String>,
+ expect_stdout_contains_n: Vec<(String, usize)>,
+ expect_stdout_not_contains: Vec<String>,
+ expect_stderr_not_contains: Vec<String>,
+ expect_json: Option<Vec<Value>>,
+}
+
+impl Execs {
+ pub fn with_stdout<S: ToString>(mut self, expected: S) -> Execs {
+ self.expect_stdout = Some(expected.to_string());
+ self
+ }
+
+ pub fn with_stderr<S: ToString>(mut self, expected: S) -> Execs {
+ self._with_stderr(&expected);
+ self
+ }
+
+ fn _with_stderr(&mut self, expected: &ToString) {
+ self.expect_stderr = Some(expected.to_string());
+ }
+
+ pub fn with_status(mut self, expected: i32) -> Execs {
+ self.expect_exit_code = Some(expected);
+ self
+ }
+
+ pub fn with_stdout_contains<S: ToString>(mut self, expected: S) -> Execs {
+ self.expect_stdout_contains.push(expected.to_string());
+ self
+ }
+
+ pub fn with_stderr_contains<S: ToString>(mut self, expected: S) -> Execs {
+ self.expect_stderr_contains.push(expected.to_string());
+ self
+ }
+
+ pub fn with_stdout_contains_n<S: ToString>(mut self, expected: S, number: usize) -> Execs {
+ self.expect_stdout_contains_n.push((expected.to_string(), number));
+ self
+ }
+
+ pub fn with_stdout_does_not_contain<S: ToString>(mut self, expected: S) -> Execs {
+ self.expect_stdout_not_contains.push(expected.to_string());
+ self
+ }
+
+ pub fn with_stderr_does_not_contain<S: ToString>(mut self, expected: S) -> Execs {
+ self.expect_stderr_not_contains.push(expected.to_string());
+ self
+ }
+
+ pub fn with_json(mut self, expected: &str) -> Execs {
+ self.expect_json = Some(expected.split("\n\n").map(|obj| {
+ obj.parse().unwrap()
+ }).collect());
+ self
+ }
+
+ fn match_output(&self, actual: &Output) -> ham::MatchResult {
+ self.match_status(actual)
+ .and(self.match_stdout(actual))
+ .and(self.match_stderr(actual))
+ }
+
+ fn match_status(&self, actual: &Output) -> ham::MatchResult {
+ match self.expect_exit_code {
+ None => ham::success(),
+ Some(code) => {
+ ham::expect(
+ actual.status.code() == Some(code),
+ format!("exited with {}\n--- stdout\n{}\n--- stderr\n{}",
+ actual.status,
+ String::from_utf8_lossy(&actual.stdout),
+ String::from_utf8_lossy(&actual.stderr)))
+ }
+ }
+ }
+
+ fn match_stdout(&self, actual: &Output) -> ham::MatchResult {
+ self.match_std(self.expect_stdout.as_ref(), &actual.stdout,
+ "stdout", &actual.stderr, MatchKind::Exact)?;
+ for expect in self.expect_stdout_contains.iter() {
+ self.match_std(Some(expect), &actual.stdout, "stdout",
+ &actual.stderr, MatchKind::Partial)?;
+ }
+ for expect in self.expect_stderr_contains.iter() {
+ self.match_std(Some(expect), &actual.stderr, "stderr",
+ &actual.stdout, MatchKind::Partial)?;
+ }
+ for &(ref expect, number) in self.expect_stdout_contains_n.iter() {
+ self.match_std(Some(&expect), &actual.stdout, "stdout",
+ &actual.stderr, MatchKind::PartialN(number))?;
+ }
+ for expect in self.expect_stdout_not_contains.iter() {
+ self.match_std(Some(expect), &actual.stdout, "stdout",
+ &actual.stderr, MatchKind::NotPresent)?;
+ }
+ for expect in self.expect_stderr_not_contains.iter() {
+ self.match_std(Some(expect), &actual.stderr, "stderr",
+ &actual.stdout, MatchKind::NotPresent)?;
+ }
+
+ if let Some(ref objects) = self.expect_json {
+ let stdout = str::from_utf8(&actual.stdout)
+ .map_err(|_| "stdout was not utf8 encoded".to_owned())?;
+ let lines = stdout.lines().collect::<Vec<_>>();
+ if lines.len() != objects.len() {
+ return Err(format!("expected {} json lines, got {}, stdout:\n{}",
+ objects.len(), lines.len(), stdout));
+ }
+ for (obj, line) in objects.iter().zip(lines) {
+ self.match_json(obj, line)?;
+ }
+ }
+ Ok(())
+ }
+
+ fn match_stderr(&self, actual: &Output) -> ham::MatchResult {
+ self.match_std(self.expect_stderr.as_ref(), &actual.stderr,
+ "stderr", &actual.stdout, MatchKind::Exact)
+ }
+
+ fn match_std(&self, expected: Option<&String>, actual: &[u8],
+ description: &str, extra: &[u8],
+ kind: MatchKind) -> ham::MatchResult {
+ let out = match expected {
+ Some(out) => out,
+ None => return ham::success(),
+ };
+ let actual = match str::from_utf8(actual) {
+ Err(..) => return Err(format!("{} was not utf8 encoded",
+ description)),
+ Ok(actual) => actual,
+ };
+ // Let's not deal with \r\n vs \n on windows...
+ let actual = actual.replace("\r", "");
+ let actual = actual.replace("\t", "<tab>");
+
+ match kind {
+ MatchKind::Exact => {
+ let a = actual.lines();
+ let e = out.lines();
+
+ let diffs = self.diff_lines(a, e, false);
+ ham::expect(diffs.is_empty(),
+ format!("differences:\n\
+ {}\n\n\
+ other output:\n\
+ `{}`", diffs.join("\n"),
+ String::from_utf8_lossy(extra)))
+ }
+ MatchKind::Partial => {
+ let mut a = actual.lines();
+ let e = out.lines();
+
+ let mut diffs = self.diff_lines(a.clone(), e.clone(), true);
+ while let Some(..) = a.next() {
+ let a = self.diff_lines(a.clone(), e.clone(), true);
+ if a.len() < diffs.len() {
+ diffs = a;
+ }
+ }
+ ham::expect(diffs.is_empty(),
+ format!("expected to find:\n\
+ {}\n\n\
+ did not find in output:\n\
+ {}", out,
+ actual))
+ }
+ MatchKind::PartialN(number) => {
+ let mut a = actual.lines();
+ let e = out.lines();
+
+ let mut matches = 0;
+
+ while let Some(..) = {
+ if self.diff_lines(a.clone(), e.clone(), true).is_empty() {
+ matches += 1;
+ }
+ a.next()
+ } {}
+
+ ham::expect(matches == number,
+ format!("expected to find {} occurences:\n\
+ {}\n\n\
+ did not find in output:\n\
+ {}", number, out,
+ actual))
+ }
+ MatchKind::NotPresent => {
+ ham::expect(!actual.contains(out),
+ format!("expected not to find:\n\
+ {}\n\n\
+ but found in output:\n\
+ {}", out,
+ actual))
+ }
+ }
+ }
+
+ fn match_json(&self, expected: &Value, line: &str) -> ham::MatchResult {
+ let actual = match line.parse() {
+ Err(e) => return Err(format!("invalid json, {}:\n`{}`", e, line)),
+ Ok(actual) => actual,
+ };
+
+ match find_mismatch(expected, &actual) {
+ Some((expected_part, actual_part)) => Err(format!(
+ "JSON mismatch\nExpected:\n{}\nWas:\n{}\nExpected part:\n{}\nActual part:\n{}\n",
+ serde_json::to_string_pretty(expected).unwrap(),
+ serde_json::to_string_pretty(&actual).unwrap(),
+ serde_json::to_string_pretty(expected_part).unwrap(),
+ serde_json::to_string_pretty(actual_part).unwrap(),
+ )),
+ None => Ok(()),
+ }
+ }
+
+ fn diff_lines<'a>(&self, actual: str::Lines<'a>, expected: str::Lines<'a>,
+ partial: bool) -> Vec<String> {
+ let actual = actual.take(if partial {
+ expected.clone().count()
+ } else {
+ usize::MAX
+ });
+ zip_all(actual, expected).enumerate().filter_map(|(i, (a,e))| {
+ match (a, e) {
+ (Some(a), Some(e)) => {
+ if lines_match(&e, &a) {
+ None
+ } else {
+ Some(format!("{:3} - |{}|\n + |{}|\n", i, e, a))
+ }
+ },
+ (Some(a), None) => {
+ Some(format!("{:3} -\n + |{}|\n", i, a))
+ },
+ (None, Some(e)) => {
+ Some(format!("{:3} - |{}|\n +\n", i, e))
+ },
+ (None, None) => panic!("Cannot get here")
+ }
+ }).collect()
+ }
+}
+
+#[derive(Debug, PartialEq, Eq, Clone, Copy)]
+enum MatchKind {
+ Exact,
+ Partial,
+ PartialN(usize),
+ NotPresent,
+}
+
+pub fn lines_match(expected: &str, mut actual: &str) -> bool {
+ let expected = substitute_macros(expected);
+ for (i, part) in expected.split("[..]").enumerate() {
+ match actual.find(part) {
+ Some(j) => {
+ if i == 0 && j != 0 {
+ return false
+ }
+ actual = &actual[j + part.len()..];
+ }
+ None => {
+ return false
+ }
+ }
+ }
+ actual.is_empty() || expected.ends_with("[..]")
+}
+
+#[test]
+fn lines_match_works() {
+ assert!(lines_match("a b", "a b"));
+ assert!(lines_match("a[..]b", "a b"));
+ assert!(lines_match("a[..]", "a b"));
+ assert!(lines_match("[..]", "a b"));
+ assert!(lines_match("[..]b", "a b"));
+
+ assert!(!lines_match("[..]b", "c"));
+ assert!(!lines_match("b", "c"));
+ assert!(!lines_match("b", "cb"));
+}
+
+// Compares JSON objects for approximate equality. You can use the `[..]`
+// wildcard in strings (useful for OS-dependent things such as paths), and
+// the string literal `"{...}"` as a wildcard for arbitrary nested JSON
+// (useful for parts of objects emitted by other programs, e.g. rustc,
+// rather than by Cargo itself). Arrays are compared ignoring element order.
+fn find_mismatch<'a>(expected: &'a Value, actual: &'a Value)
+ -> Option<(&'a Value, &'a Value)> {
+ use serde_json::Value::*;
+ match (expected, actual) {
+ (&Number(ref l), &Number(ref r)) if l == r => None,
+ (&Bool(l), &Bool(r)) if l == r => None,
+ (&String(ref l), &String(ref r)) if lines_match(l, r) => None,
+ (&Array(ref l), &Array(ref r)) => {
+ if l.len() != r.len() {
+ return Some((expected, actual));
+ }
+
+ let mut l = l.iter().collect::<Vec<_>>();
+ let mut r = r.iter().collect::<Vec<_>>();
+
+ l.retain(|l| {
+ match r.iter().position(|r| find_mismatch(l, r).is_none()) {
+ Some(i) => {
+ r.remove(i);
+ false
+ }
+ None => true
+ }
+ });
+
+ if !l.is_empty() {
+ assert!(!r.is_empty());
+ Some((&l[0], &r[0]))
+ } else {
+ assert!(r.is_empty());
+ None
+ }
+ }
+ (&Object(ref l), &Object(ref r)) => {
+ let same_keys = l.len() == r.len() && l.keys().all(|k| r.contains_key(k));
+ if !same_keys {
+ return Some((expected, actual));
+ }
+
+ l.values().zip(r.values())
+ .filter_map(|(l, r)| find_mismatch(l, r))
+ .nth(0)
+ }
+ (&Null, &Null) => None,
+ // magic string literal "{...}" acts as wildcard for any sub-JSON
+ (&String(ref l), _) if l == "{...}" => None,
+ _ => Some((expected, actual)),
+ }
+}
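+
+// A minimal sketch (not part of the original suite) exercising the wildcard
+// rules documented above: `[..]` inside a string and the `"{...}"` literal.
+#[test]
+fn find_mismatch_wildcards() {
+ let expected: Value = r#"{"path": "[..]foo", "extra": "{...}"}"#.parse().unwrap();
+ let actual: Value = r#"{"path": "/tmp/foo", "extra": [1, 2, 3]}"#.parse().unwrap();
+ assert!(find_mismatch(&expected, &actual).is_none());
+}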
+
+struct ZipAll<I1: Iterator, I2: Iterator> {
+ first: I1,
+ second: I2,
+}
+
+impl<T, I1: Iterator<Item=T>, I2: Iterator<Item=T>> Iterator for ZipAll<I1, I2> {
+ type Item = (Option<T>, Option<T>);
+ fn next(&mut self) -> Option<(Option<T>, Option<T>)> {
+ let first = self.first.next();
+ let second = self.second.next();
+
+ match (first, second) {
+ (None, None) => None,
+ (a, b) => Some((a, b))
+ }
+ }
+}
+
+fn zip_all<T, I1: Iterator<Item=T>, I2: Iterator<Item=T>>(a: I1, b: I2) -> ZipAll<I1, I2> {
+ ZipAll {
+ first: a,
+ second: b,
+ }
+}
+
+impl fmt::Display for Execs {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "execs")
+ }
+}
+
+impl ham::Matcher<ProcessBuilder> for Execs {
+ fn matches(&self, mut process: ProcessBuilder) -> ham::MatchResult {
+ self.matches(&mut process)
+ }
+}
+
+impl<'a> ham::Matcher<&'a mut ProcessBuilder> for Execs {
+ fn matches(&self, process: &'a mut ProcessBuilder) -> ham::MatchResult {
+ println!("running {}", process);
+ let res = process.exec_with_output();
+
+ match res {
+ Ok(out) => self.match_output(&out),
+ Err(CargoError(CargoErrorKind::ProcessErrorKind(
+ ProcessError { output: Some(ref out), .. }), ..)) => {
+ self.match_output(out)
+ }
+ Err(e) => {
+ let mut s = format!("could not exec process {}: {}", process, e);
+ if let Some(cause) = e.cause() {
+ s.push_str(&format!("\ncaused by: {}", cause.description()));
+ }
+ Err(s)
+ }
+ }
+ }
+}
+
+impl ham::Matcher<Output> for Execs {
+ fn matches(&self, output: Output) -> ham::MatchResult {
+ self.match_output(&output)
+ }
+}
+
+pub fn execs() -> Execs {
+ Execs {
+ expect_stdout: None,
+ expect_stderr: None,
+ expect_stdin: None,
+ expect_exit_code: None,
+ expect_stdout_contains: Vec::new(),
+ expect_stderr_contains: Vec::new(),
+ expect_stdout_contains_n: Vec::new(),
+ expect_stdout_not_contains: Vec::new(),
+ expect_stderr_not_contains: Vec::new(),
+ expect_json: None,
+ }
+}
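+
+// Typical usage (sketch): `assert_that(p.cargo("build"), execs().with_status(0));`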
+
+#[derive(Clone)]
+pub struct ShellWrites {
+ expected: String
+}
+
+impl fmt::Display for ShellWrites {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "`{}` written to the shell", self.expected)
+ }
+}
+
+impl<'a> ham::Matcher<&'a [u8]> for ShellWrites {
+ fn matches(&self, actual: &[u8])
+ -> ham::MatchResult
+ {
+ let actual = String::from_utf8_lossy(actual);
+ let actual = actual.to_string();
+ ham::expect(actual == self.expected, actual)
+ }
+}
+
+pub fn shell_writes<T: fmt::Display>(string: T) -> ShellWrites {
+ ShellWrites { expected: string.to_string() }
+}
+
+pub trait Tap {
+ fn tap<F: FnOnce(&mut Self)>(self, callback: F) -> Self;
+}
+
+impl<T> Tap for T {
+ fn tap<F: FnOnce(&mut Self)>(mut self, callback: F) -> T {
+ callback(&mut self);
+ self
+ }
+}
+
+pub fn basic_bin_manifest(name: &str) -> String {
+ format!(r#"
+ [package]
+
+ name = "{}"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [[bin]]
+
+ name = "{}"
+ "#, name, name)
+}
+
+pub fn basic_lib_manifest(name: &str) -> String {
+ format!(r#"
+ [package]
+
+ name = "{}"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "{}"
+ "#, name, name)
+}
+
+pub fn path2url(p: PathBuf) -> Url {
+ Url::from_file_path(&*p).ok().unwrap()
+}
+
+fn substitute_macros(input: &str) -> String {
+ let macros = [
+ ("[RUNNING]", " Running"),
+ ("[COMPILING]", " Compiling"),
+ ("[CREATED]", " Created"),
+ ("[FINISHED]", " Finished"),
+ ("[ERROR]", "error:"),
+ ("[WARNING]", "warning:"),
+ ("[DOCUMENTING]", " Documenting"),
+ ("[FRESH]", " Fresh"),
+ ("[UPDATING]", " Updating"),
+ ("[ADDING]", " Adding"),
+ ("[REMOVING]", " Removing"),
+ ("[DOCTEST]", " Doc-tests"),
+ ("[PACKAGING]", " Packaging"),
+ ("[DOWNLOADING]", " Downloading"),
+ ("[UPLOADING]", " Uploading"),
+ ("[VERIFYING]", " Verifying"),
+ ("[ARCHIVING]", " Archiving"),
+ ("[INSTALLING]", " Installing"),
+ ("[REPLACING]", " Replacing"),
+ ("[UNPACKING]", " Unpacking"),
+ ("[EXE]", if cfg!(windows) {".exe"} else {""}),
+ ("[/]", if cfg!(windows) {"\\"} else {"/"}),
+ ];
+ let mut result = input.to_owned();
+ for &(pat, subst) in macros.iter() {
+ result = result.replace(pat, subst)
+ }
+ result
+}
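+
+// A small sketch (not part of the original suite): patterns are
+// macro-expanded before matching, so `[ERROR]` stands for the literal
+// `error:` prefix in expected output.
+#[test]
+fn substitute_macros_in_patterns() {
+ assert!(lines_match("[ERROR] failed to parse [..]",
+ "error: failed to parse manifest"));
+}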
--- /dev/null
+use std::env;
+use std::cell::Cell;
+use std::fs;
+use std::io::{self, ErrorKind};
+use std::path::{Path, PathBuf};
+use std::sync::{Once, ONCE_INIT};
+use std::sync::atomic::{AtomicUsize, ATOMIC_USIZE_INIT, Ordering};
+
+use filetime::{self, FileTime};
+
+static CARGO_INTEGRATION_TEST_DIR: &'static str = "cit";
+static NEXT_ID: AtomicUsize = ATOMIC_USIZE_INIT;
+
+thread_local!(static TASK_ID: usize = NEXT_ID.fetch_add(1, Ordering::SeqCst));
+
+fn init() {
+ static GLOBAL_INIT: Once = ONCE_INIT;
+ thread_local!(static LOCAL_INIT: Cell<bool> = Cell::new(false));
+ GLOBAL_INIT.call_once(|| {
+ global_root().mkdir_p();
+ });
+ LOCAL_INIT.with(|i| {
+ if i.get() {
+ return
+ }
+ i.set(true);
+ root().rm_rf();
+ home().mkdir_p();
+ })
+}
+
+fn global_root() -> PathBuf {
+ let mut path = t!(env::current_exe());
+ path.pop(); // chop off exe name
+ path.pop(); // chop off 'debug'
+
+ // If `cargo test` is run manually then our path looks like
+ // `target/debug/foo`, in which case our `path` is already pointing at
+ // `target`. If, however, `cargo test --target $target` is used then the
+ // output is `target/$target/debug/foo`, so our path is pointing at
+ // `target/$target`. Here we conditionally pop the `$target` name.
+ if path.file_name().and_then(|s| s.to_str()) != Some("target") {
+ path.pop();
+ }
+
+ path.join(CARGO_INTEGRATION_TEST_DIR)
+}
+
+pub fn root() -> PathBuf {
+ init();
+ global_root().join(&TASK_ID.with(|my_id| format!("t{}", my_id)))
+}
+
+pub fn home() -> PathBuf {
+ root().join("home")
+}
+
+pub trait CargoPathExt {
+ fn rm_rf(&self);
+ fn mkdir_p(&self);
+
+ fn move_into_the_past(&self) {
+ self.move_in_time(|sec, nsec| (sec - 3600, nsec))
+ }
+
+ fn move_into_the_future(&self) {
+ self.move_in_time(|sec, nsec| (sec + 3600, nsec))
+ }
+
+ fn move_in_time<F>(&self, travel_amount: F)
+ where F: Fn(u64, u32) -> (u64, u32);
+}
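+
+// Example (sketch): `p.root().move_into_the_past()` rewinds every mtime in
+// the tree (skipping `target/`) by an hour, giving tests deterministic
+// freshness comparisons.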
+
+impl CargoPathExt for Path {
+ // Technically there is a potential race condition, but we don't
+ // care all that much for our tests.
+ fn rm_rf(&self) {
+ if !self.exists() {
+ return
+ }
+
+ for file in t!(fs::read_dir(self)) {
+ let file = t!(file);
+ if file.file_type().map(|m| m.is_dir()).unwrap_or(false) {
+ file.path().rm_rf();
+ } else {
+ // On windows we can't remove a readonly file, and git will
+ // often clone files as readonly. As a result, we have some
+ // special logic to remove readonly files on windows.
+ do_op(&file.path(), "remove file", |p| fs::remove_file(p));
+ }
+ }
+ do_op(self, "remove dir", |p| fs::remove_dir(p));
+ }
+
+ fn mkdir_p(&self) {
+ fs::create_dir_all(self).unwrap_or_else(|e| {
+ panic!("failed to mkdir_p {}: {}", self.display(), e)
+ })
+ }
+
+ fn move_in_time<F>(&self, travel_amount: F)
+ where F: Fn(u64, u32) -> (u64, u32),
+ {
+ if self.is_file() {
+ time_travel(self, &travel_amount);
+ } else {
+ recurse(self, &self.join("target"), &travel_amount);
+ }
+
+ fn recurse<F>(p: &Path, bad: &Path, travel_amount: &F)
+ where F: Fn(u64, u32) -> (u64, u32),
+ {
+ if p.is_file() {
+ time_travel(p, travel_amount)
+ } else if !p.starts_with(bad) {
+ for f in t!(fs::read_dir(p)) {
+ let f = t!(f).path();
+ recurse(&f, bad, travel_amount);
+ }
+ }
+ }
+
+ fn time_travel<F>(path: &Path, travel_amount: &F)
+ where F: Fn(u64, u32) -> (u64, u32),
+ {
+ let stat = t!(path.metadata());
+
+ let mtime = FileTime::from_last_modification_time(&stat);
+
+ let (sec, nsec) = travel_amount(mtime.seconds_relative_to_1970(), mtime.nanoseconds());
+ let newtime = FileTime::from_seconds_since_1970(sec, nsec);
+
+ // Sadly change_file_times has a failure mode where a readonly file
+ // cannot have its times changed on windows.
+ do_op(path, "set file times",
+ |path| filetime::set_file_times(path, newtime, newtime));
+ }
+ }
+}
+
+fn do_op<F>(path: &Path, desc: &str, mut f: F)
+ where F: FnMut(&Path) -> io::Result<()>
+{
+ match f(path) {
+ Ok(()) => {}
+ Err(ref e) if cfg!(windows) &&
+ e.kind() == ErrorKind::PermissionDenied => {
+ let mut p = t!(path.metadata()).permissions();
+ p.set_readonly(false);
+ t!(fs::set_permissions(path, p));
+ f(path).unwrap_or_else(|e| {
+ panic!("failed to {} {}: {}", desc, path.display(), e);
+ })
+ }
+ Err(e) => {
+ panic!("failed to {} {}: {}", desc, path.display(), e);
+ }
+ }
+}
--- /dev/null
+use std::path::PathBuf;
+use std::io::prelude::*;
+use std::fs::{self, File};
+
+use support::paths;
+use support::git::{repo, Repository};
+
+use url::Url;
+
+pub fn setup() -> Repository {
+ let config = paths::root().join(".cargo/config");
+ t!(fs::create_dir_all(config.parent().unwrap()));
+ t!(t!(File::create(&config)).write_all(format!(r#"
+ [registry]
+ token = "api-token"
+
+ [registries.alternative]
+ index = "{registry}"
+ "#, registry = registry().to_string()).as_bytes()));
+
+ let credentials = paths::root().join("home/.cargo/credentials");
+ t!(fs::create_dir_all(credentials.parent().unwrap()));
+ t!(t!(File::create(&credentials)).write_all(br#"
+ [alternative]
+ token = "api-token"
+ "#));
+
+ t!(fs::create_dir_all(&upload_path().join("api/v1/crates")));
+
+ repo(&registry_path())
+ .file("config.json", &format!(r#"{{
+ "dl": "{0}",
+ "api": "{0}"
+ }}"#, upload()))
+ .build()
+}
+
+fn registry_path() -> PathBuf { paths::root().join("registry") }
+pub fn registry() -> Url { Url::from_file_path(&*registry_path()).ok().unwrap() }
+pub fn upload_path() -> PathBuf { paths::root().join("upload") }
+fn upload() -> Url { Url::from_file_path(&*upload_path()).ok().unwrap() }
--- /dev/null
+use std::collections::HashMap;
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::path::{PathBuf, Path};
+
+use flate2::Compression::Default;
+use flate2::write::GzEncoder;
+use git2;
+use hex::ToHex;
+use tar::{Builder, Header};
+use url::Url;
+
+use support::paths;
+use support::git::repo;
+use cargo::util::Sha256;
+
+pub fn registry_path() -> PathBuf { paths::root().join("registry") }
+pub fn registry() -> Url { Url::from_file_path(&*registry_path()).ok().unwrap() }
+pub fn dl_path() -> PathBuf { paths::root().join("dl") }
+pub fn dl_url() -> Url { Url::from_file_path(&*dl_path()).ok().unwrap() }
+pub fn alt_registry_path() -> PathBuf { paths::root().join("alternative-registry") }
+pub fn alt_registry() -> Url { Url::from_file_path(&*alt_registry_path()).ok().unwrap() }
+pub fn alt_dl_path() -> PathBuf { paths::root().join("alt_dl") }
+pub fn alt_dl_url() -> Url { Url::from_file_path(&*alt_dl_path()).ok().unwrap() }
+
+pub struct Package {
+ name: String,
+ vers: String,
+ deps: Vec<Dependency>,
+ files: Vec<(String, String)>,
+ extra_files: Vec<(String, String)>,
+ yanked: bool,
+ features: HashMap<String, Vec<String>>,
+ local: bool,
+ alternative: bool,
+}
+
+struct Dependency {
+ name: String,
+ vers: String,
+ kind: String,
+ target: Option<String>,
+ features: Vec<String>,
+ registry: Option<String>,
+}
+
+pub fn init() {
+ let config = paths::home().join(".cargo/config");
+ t!(fs::create_dir_all(config.parent().unwrap()));
+ if fs::metadata(&config).is_ok() {
+ return
+ }
+ t!(t!(File::create(&config)).write_all(format!(r#"
+ [registry]
+ token = "api-token"
+
+ [source.crates-io]
+ registry = 'https://wut'
+ replace-with = 'dummy-registry'
+
+ [source.dummy-registry]
+ registry = '{reg}'
+
+ [registries.alternative]
+ index = '{alt}'
+ "#, reg = registry(), alt = alt_registry()).as_bytes()));
+
+ // Init a new registry
+ let _ = repo(&registry_path())
+ .file("config.json", &format!(r#"
+ {{"dl":"{0}","api":"{0}"}}
+ "#, dl_url()))
+ .build();
+ fs::create_dir_all(dl_path().join("api/v1/crates")).unwrap();
+
+ // Init an alt registry
+ repo(&alt_registry_path())
+ .file("config.json", &format!(r#"
+ {{"dl":"{0}","api":"{0}"}}
+ "#, alt_dl_url()))
+ .build();
+ fs::create_dir_all(alt_dl_path().join("api/v1/crates")).unwrap();
+}
+
+impl Package {
+ pub fn new(name: &str, vers: &str) -> Package {
+ init();
+ Package {
+ name: name.to_string(),
+ vers: vers.to_string(),
+ deps: Vec::new(),
+ files: Vec::new(),
+ extra_files: Vec::new(),
+ yanked: false,
+ features: HashMap::new(),
+ local: false,
+ alternative: false,
+ }
+ }
+
+ pub fn local(&mut self, local: bool) -> &mut Package {
+ self.local = local;
+ self
+ }
+
+ pub fn alternative(&mut self, alternative: bool) -> &mut Package {
+ self.alternative = alternative;
+ self
+ }
+
+ pub fn file(&mut self, name: &str, contents: &str) -> &mut Package {
+ self.files.push((name.to_string(), contents.to_string()));
+ self
+ }
+
+ pub fn extra_file(&mut self, name: &str, contents: &str) -> &mut Package {
+ self.extra_files.push((name.to_string(), contents.to_string()));
+ self
+ }
+
+ pub fn dep(&mut self, name: &str, vers: &str) -> &mut Package {
+ self.full_dep(name, vers, None, "normal", &[], None)
+ }
+
+ pub fn feature_dep(&mut self,
+ name: &str,
+ vers: &str,
+ features: &[&str]) -> &mut Package {
+ self.full_dep(name, vers, None, "normal", features, None)
+ }
+
+ pub fn target_dep(&mut self,
+ name: &str,
+ vers: &str,
+ target: &str) -> &mut Package {
+ self.full_dep(name, vers, Some(target), "normal", &[], None)
+ }
+
+ pub fn registry_dep(&mut self,
+ name: &str,
+ vers: &str,
+ registry: &str) -> &mut Package {
+ self.full_dep(name, vers, None, "normal", &[], Some(registry))
+ }
+
+ pub fn dev_dep(&mut self, name: &str, vers: &str) -> &mut Package {
+ self.full_dep(name, vers, None, "dev", &[], None)
+ }
+
+ fn full_dep(&mut self,
+ name: &str,
+ vers: &str,
+ target: Option<&str>,
+ kind: &str,
+ features: &[&str],
+ registry: Option<&str>) -> &mut Package {
+ self.deps.push(Dependency {
+ name: name.to_string(),
+ vers: vers.to_string(),
+ kind: kind.to_string(),
+ target: target.map(|s| s.to_string()),
+ features: features.iter().map(|s| s.to_string()).collect(),
+ registry: registry.map(|s| s.to_string()),
+ });
+ self
+ }
+
+ pub fn yanked(&mut self, yanked: bool) -> &mut Package {
+ self.yanked = yanked;
+ self
+ }
+
+ pub fn publish(&self) -> String {
+ self.make_archive();
+
+ // Figure out what we're going to write into the index
+ let deps = self.deps.iter().map(|dep| {
+ json!({
+ "name": dep.name,
+ "req": dep.vers,
+ "features": dep.features,
+ "default_features": true,
+ "target": dep.target,
+ "optional": false,
+ "kind": dep.kind,
+ "registry": dep.registry,
+ })
+ }).collect::<Vec<_>>();
+ let cksum = {
+ let mut c = Vec::new();
+ t!(t!(File::open(&self.archive_dst())).read_to_end(&mut c));
+ cksum(&c)
+ };
+ let line = json!({
+ "name": self.name,
+ "vers": self.vers,
+ "deps": deps,
+ "cksum": cksum,
+ "features": self.features,
+ "yanked": self.yanked,
+ }).to_string();
+
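+ // The index file path follows the registry layout: 1- and 2-character
+ // names live in `1/` and `2/`, 3-character names in `3/<first char>/`,
+ // and longer names under `<first two>/<next two>/`.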
+ let file = match self.name.len() {
+ 1 => format!("1/{}", self.name),
+ 2 => format!("2/{}", self.name),
+ 3 => format!("3/{}/{}", &self.name[..1], self.name),
+ _ => format!("{}/{}/{}", &self.name[0..2], &self.name[2..4], self.name),
+ };
+
+ let registry_path = if self.alternative { alt_registry_path() } else { registry_path() };
+
+ // Write file/line in the index
+ let dst = if self.local {
+ registry_path.join("index").join(&file)
+ } else {
+ registry_path.join(&file)
+ };
+ let mut prev = String::new();
+ let _ = File::open(&dst).and_then(|mut f| f.read_to_string(&mut prev));
+ t!(fs::create_dir_all(dst.parent().unwrap()));
+ t!(t!(File::create(&dst))
+ .write_all((prev + &line[..] + "\n").as_bytes()));
+
+ // Add the new file to the index
+ if !self.local {
+ let repo = t!(git2::Repository::open(&registry_path));
+ let mut index = t!(repo.index());
+ t!(index.add_path(Path::new(&file)));
+ t!(index.write());
+ let id = t!(index.write_tree());
+
+ // Commit this change
+ let tree = t!(repo.find_tree(id));
+ let sig = t!(repo.signature());
+ let parent = t!(repo.refname_to_id("refs/heads/master"));
+ let parent = t!(repo.find_commit(parent));
+ t!(repo.commit(Some("HEAD"), &sig, &sig,
+ "Another commit", &tree,
+ &[&parent]));
+ }
+
+ cksum
+ }
+
+ fn make_archive(&self) {
+ let mut manifest = format!(r#"
+ [package]
+ name = "{}"
+ version = "{}"
+ authors = []
+ "#, self.name, self.vers);
+ for dep in self.deps.iter() {
+ let target = match dep.target {
+ None => String::new(),
+ Some(ref s) => format!("target.'{}'.", s),
+ };
+ let kind = match &dep.kind[..] {
+ "build" => "build-",
+ "dev" => "dev-",
+ _ => ""
+ };
+ manifest.push_str(&format!(r#"
+ [{}{}dependencies.{}]
+ version = "{}"
+ "#, target, kind, dep.name, dep.vers));
+ }
+
+ let dst = self.archive_dst();
+ t!(fs::create_dir_all(dst.parent().unwrap()));
+ let f = t!(File::create(&dst));
+ let mut a = Builder::new(GzEncoder::new(f, Default));
+ self.append(&mut a, "Cargo.toml", &manifest);
+ if self.files.is_empty() {
+ self.append(&mut a, "src/lib.rs", "");
+ } else {
+ for &(ref name, ref contents) in self.files.iter() {
+ self.append(&mut a, name, contents);
+ }
+ }
+ for &(ref name, ref contents) in self.extra_files.iter() {
+ self.append_extra(&mut a, name, contents);
+ }
+ }
+
+ fn append<W: Write>(&self, ar: &mut Builder<W>, file: &str, contents: &str) {
+ self.append_extra(ar,
+ &format!("{}-{}/{}", self.name, self.vers, file),
+ contents);
+ }
+
+ fn append_extra<W: Write>(&self, ar: &mut Builder<W>, path: &str, contents: &str) {
+ let mut header = Header::new_ustar();
+ header.set_size(contents.len() as u64);
+ t!(header.set_path(path));
+ header.set_cksum();
+ t!(ar.append(&header, contents.as_bytes()));
+ }
+
+ pub fn archive_dst(&self) -> PathBuf {
+ if self.local {
+ registry_path().join(format!("{}-{}.crate", self.name,
+ self.vers))
+ } else {
+ let dl_path = if self.alternative { alt_dl_path() } else { dl_path() };
+ dl_path.join(&self.name).join(&self.vers).join("download")
+ }
+ }
+}
+
+pub fn cksum(s: &[u8]) -> String {
+ let mut sha = Sha256::new();
+ sha.update(s);
+ sha.finish().to_hex()
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::str::FromStr;
+use std::fmt;
+
+use cargo::util::{Cfg, CfgExpr};
+use cargotest::rustc_host;
+use cargotest::support::registry::Package;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+macro_rules! c {
+ ($a:ident) => (
+ Cfg::Name(stringify!($a).to_string())
+ );
+ ($a:ident = $e:expr) => (
+ Cfg::KeyPair(stringify!($a).to_string(), $e.to_string())
+ );
+}
+
+macro_rules! e {
+ (any($($t:tt),*)) => (CfgExpr::Any(vec![$(e!($t)),*]));
+ (all($($t:tt),*)) => (CfgExpr::All(vec![$(e!($t)),*]));
+ (not($($t:tt)*)) => (CfgExpr::Not(Box::new(e!($($t)*))));
+ (($($t:tt)*)) => (e!($($t)*));
+ ($($t:tt)*) => (CfgExpr::Value(c!($($t)*)));
+}
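+
+// `c!` builds a `Cfg` and `e!` a `CfgExpr` from cfg-like syntax, so e.g.
+// `e!(any(foo, bar))` mirrors the manifest expression `cfg(any(foo, bar))`.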
+
+fn good<T>(s: &str, expected: T)
+ where T: FromStr + PartialEq + fmt::Debug,
+ T::Err: fmt::Display
+{
+ let c = match T::from_str(s) {
+ Ok(c) => c,
+ Err(e) => panic!("failed to parse `{}`: {}", s, e),
+ };
+ assert_eq!(c, expected);
+}
+
+fn bad<T>(s: &str, err: &str)
+ where T: FromStr + fmt::Display, T::Err: fmt::Display
+{
+ let e = match T::from_str(s) {
+ Ok(cfg) => panic!("expected `{}` to not parse but got {}", s, cfg),
+ Err(e) => e.to_string(),
+ };
+ assert!(e.contains(err), "when parsing `{}`,\n\"{}\" not contained \
+ inside: {}", s, err, e);
+}
+
+#[test]
+fn cfg_syntax() {
+ good("foo", c!(foo));
+ good("_bar", c!(_bar));
+ good(" foo", c!(foo));
+ good(" foo ", c!(foo));
+ good(" foo = \"bar\"", c!(foo = "bar"));
+ good("foo=\"\"", c!(foo = ""));
+ good(" foo=\"3\" ", c!(foo = "3"));
+ good("foo = \"3 e\"", c!(foo = "3 e"));
+}
+
+#[test]
+fn cfg_syntax_bad() {
+ bad::<Cfg>("", "found nothing");
+ bad::<Cfg>(" ", "found nothing");
+ bad::<Cfg>("\t", "unexpected character");
+ bad::<Cfg>("7", "unexpected character");
+ bad::<Cfg>("=", "expected identifier");
+ bad::<Cfg>(",", "expected identifier");
+ bad::<Cfg>("(", "expected identifier");
+ bad::<Cfg>("foo (", "malformed cfg value");
+ bad::<Cfg>("bar =", "expected a string");
+ bad::<Cfg>("bar = \"", "unterminated string");
+ bad::<Cfg>("foo, bar", "malformed cfg value");
+}
+
+#[test]
+fn cfg_expr() {
+ good("foo", e!(foo));
+ good("_bar", e!(_bar));
+ good(" foo", e!(foo));
+ good(" foo ", e!(foo));
+ good(" foo = \"bar\"", e!(foo = "bar"));
+ good("foo=\"\"", e!(foo = ""));
+ good(" foo=\"3\" ", e!(foo = "3"));
+ good("foo = \"3 e\"", e!(foo = "3 e"));
+
+ good("all()", e!(all()));
+ good("all(a)", e!(all(a)));
+ good("all(a, b)", e!(all(a, b)));
+ good("all(a, )", e!(all(a)));
+ good("not(a = \"b\")", e!(not(a = "b")));
+ good("not(all(a))", e!(not(all(a))));
+}
+
+#[test]
+fn cfg_expr_bad() {
+ bad::<CfgExpr>(" ", "found nothing");
+ bad::<CfgExpr>(" all", "expected `(`");
+ bad::<CfgExpr>("all(a", "expected `)`");
+ bad::<CfgExpr>("not", "expected `(`");
+ bad::<CfgExpr>("not(a", "expected `)`");
+ bad::<CfgExpr>("a = ", "expected a string");
+ bad::<CfgExpr>("all(not())", "expected identifier");
+ bad::<CfgExpr>("foo(a)", "consider using all() or any() explicitly");
+}
+
+#[test]
+fn cfg_matches() {
+ assert!(e!(foo).matches(&[c!(bar), c!(foo), c!(baz)]));
+ assert!(e!(any(foo)).matches(&[c!(bar), c!(foo), c!(baz)]));
+ assert!(e!(any(foo, bar)).matches(&[c!(bar)]));
+ assert!(e!(any(foo, bar)).matches(&[c!(foo)]));
+ assert!(e!(all(foo, bar)).matches(&[c!(foo), c!(bar)]));
+ assert!(e!(all(foo, bar)).matches(&[c!(foo), c!(bar)]));
+ assert!(e!(not(foo)).matches(&[c!(bar)]));
+ assert!(e!(not(foo)).matches(&[]));
+ assert!(e!(any((not(foo)), (all(foo, bar)))).matches(&[c!(bar)]));
+ assert!(e!(any((not(foo)), (all(foo, bar)))).matches(&[c!(foo), c!(bar)]));
+
+ assert!(!e!(foo).matches(&[]));
+ assert!(!e!(foo).matches(&[c!(bar)]));
+ assert!(!e!(foo).matches(&[c!(fo)]));
+ assert!(!e!(any(foo)).matches(&[]));
+ assert!(!e!(any(foo)).matches(&[c!(bar)]));
+ assert!(!e!(any(foo)).matches(&[c!(bar), c!(baz)]));
+ assert!(!e!(all(foo)).matches(&[c!(bar), c!(baz)]));
+ assert!(!e!(all(foo, bar)).matches(&[c!(bar)]));
+ assert!(!e!(all(foo, bar)).matches(&[c!(foo)]));
+ assert!(!e!(all(foo, bar)).matches(&[]));
+ assert!(!e!(not(bar)).matches(&[c!(bar)]));
+ assert!(!e!(not(bar)).matches(&[c!(baz), c!(bar)]));
+ assert!(!e!(any((not(foo)), (all(foo, bar)))).matches(&[c!(foo)]));
+}
+
+#[test]
+fn cfg_easy() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target.'cfg(unix)'.dependencies]
+ b = { path = 'b' }
+ [target."cfg(windows)".dependencies]
+ b = { path = 'b' }
+ "#)
+ .file("src/lib.rs", "extern crate b;")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn dont_include() {
+ let other_family = if cfg!(unix) {"windows"} else {"unix"};
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target.'cfg({})'.dependencies]
+ b = {{ path = 'b' }}
+ "#, other_family))
+ .file("src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] a v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn works_through_the_registry() {
+ Package::new("foo", "0.1.0").publish();
+ Package::new("bar", "0.1.0")
+ .target_dep("foo", "0.1.0", "cfg(unix)")
+ .target_dep("foo", "0.1.0", "cfg(windows)")
+ .publish();
+
+ let p = project("a")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "0.1.0"
+ "#)
+ .file("src/lib.rs", "#[allow(unused_extern_crates)] extern crate bar;")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry [..]
+[DOWNLOADING] [..]
+[DOWNLOADING] [..]
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.1.0
+[COMPILING] a v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn ignore_version_from_other_platform() {
+ let this_family = if cfg!(unix) {"unix"} else {"windows"};
+ let other_family = if cfg!(unix) {"windows"} else {"unix"};
+ Package::new("foo", "0.1.0").publish();
+ Package::new("foo", "0.2.0").publish();
+
+ let p = project("a")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target.'cfg({})'.dependencies]
+ foo = "0.1.0"
+
+ [target.'cfg({})'.dependencies]
+ foo = "0.2.0"
+ "#, this_family, other_family))
+ .file("src/lib.rs", "#[allow(unused_extern_crates)] extern crate foo;")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry [..]
+[DOWNLOADING] [..]
+[COMPILING] foo v0.1.0
+[COMPILING] a v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn bad_target_spec() {
+ let p = project("a")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target.'cfg(4)'.dependencies]
+ bar = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ failed to parse `4` as a cfg expression
+
+Caused by:
+ unexpected character in cfg `4`, [..]
+"));
+}
+
+#[test]
+fn bad_target_spec2() {
+ let p = project("a")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target.'cfg(foo =)'.dependencies]
+ bar = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ failed to parse `foo =` as a cfg expression
+
+Caused by:
+ expected a string, found nothing
+"));
+}
+
+#[test]
+fn multiple_match_ok() {
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target.'cfg(unix)'.dependencies]
+ b = {{ path = 'b' }}
+ [target.'cfg(target_family = "unix")'.dependencies]
+ b = {{ path = 'b' }}
+ [target."cfg(windows)".dependencies]
+ b = {{ path = 'b' }}
+ [target.'cfg(target_family = "windows")'.dependencies]
+ b = {{ path = 'b' }}
+ [target."cfg(any(windows, unix))".dependencies]
+ b = {{ path = 'b' }}
+
+ [target.{}.dependencies]
+ b = {{ path = 'b' }}
+ "#, rustc_host()))
+ .file("src/lib.rs", "extern crate b;")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn any_ok() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target."cfg(any(windows, unix))".dependencies]
+ b = { path = 'b' }
+ "#)
+ .file("src/lib.rs", "extern crate b;")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
--- /dev/null
+echo "checking for lines over 100 characters..."
+find src tests -name '*.rs' | xargs grep '.\{101,\}' && exit 1
+echo "ok"
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+extern crate glob;
+
+use cargotest::is_nightly;
+use cargotest::support::{execs, project};
+use cargotest::support::registry::Package;
+use hamcrest::prelude::*;
+use cargotest::support::paths::CargoPathExt;
+use cargotest::install::exe;
+use glob::glob;
+
+const SIMPLE_MANIFEST: &str = r#"
+[package]
+name = "foo"
+version = "0.0.1"
+authors = []
+"#;
+
+#[test]
+fn check_success() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ ::bar::baz();
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("check"),
+ execs().with_status(0));
+}
+
+#[test]
+fn check_fail() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ ::bar::baz(42);
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("check"),
+ execs().with_status(101));
+}
+
+#[test]
+fn custom_derive() {
+ if !is_nightly() {
+ return
+ }
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+#![feature(proc_macro)]
+
+#[macro_use]
+extern crate bar;
+
+trait B {
+ fn b(&self);
+}
+
+#[derive(B)]
+struct A;
+
+fn main() {
+ let a = A;
+ a.b();
+}
+"#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ [lib]
+ proc-macro = true
+ "#)
+ .file("src/lib.rs", r#"
+#![feature(proc_macro, proc_macro_lib)]
+#![crate_type = "proc-macro"]
+
+extern crate proc_macro;
+
+use proc_macro::TokenStream;
+
+#[proc_macro_derive(B)]
+pub fn derive(_input: TokenStream) -> TokenStream {
+ format!("impl B for A {{ fn b(&self) {{}} }}").parse().unwrap()
+}
+"#)
+ .build();
+
+ assert_that(foo.cargo("check"),
+ execs().with_status(0));
+}
+
+#[test]
+fn check_build() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ ::bar::baz();
+ }
+ "#)
+ .build();
+
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("check"),
+ execs().with_status(0));
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_check() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ ::bar::baz();
+ }
+ "#)
+ .build();
+
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+ assert_that(foo.cargo("check"),
+ execs().with_status(0));
+}
+
+// Checks that when a project has both a lib and a bin, the lib is only
+// checked, not built.
+#[test]
+fn issue_3418() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(foo.cargo("check").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] --emit=dep-info,metadata [..]"));
+}
+
+// Regression test: a crate that is built as well as checked, here with a
+// custom derive in the mix (#3419).
+#[test]
+fn issue_3419() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ rustc-serialize = "*"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate rustc_serialize;
+
+ use rustc_serialize::Decodable;
+
+ pub fn take<T: Decodable>() {}
+ "#)
+ .file("src/main.rs", r#"
+ extern crate rustc_serialize;
+
+ extern crate foo;
+
+ #[derive(RustcDecodable)]
+ pub struct Foo;
+
+ fn main() {
+ foo::take::<Foo>();
+ }
+ "#)
+ .build();
+
+ Package::new("rustc-serialize", "1.0.0")
+ .file("src/lib.rs",
+ r#"pub trait Decodable: Sized {
+ fn decode<D: Decoder>(d: &mut D) -> Result<Self, D::Error>;
+ }
+ pub trait Decoder {
+ type Error;
+ fn read_struct<T, F>(&mut self, s_name: &str, len: usize, f: F)
+ -> Result<T, Self::Error>
+ where F: FnOnce(&mut Self) -> Result<T, Self::Error>;
+ } "#).publish();
+
+ assert_that(p.cargo("check"),
+ execs().with_status(0));
+}
+
+// test `cargo rustc --profile check`
+#[test]
+fn rustc_check() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ ::bar::baz();
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("rustc")
+ .arg("--profile")
+ .arg("check")
+ .arg("--")
+ .arg("--emit=metadata"),
+ execs().with_status(0));
+}
+
+#[test]
+fn rustc_check_err() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ ::bar::qux();
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("rustc")
+ .arg("--profile")
+ .arg("check")
+ .arg("--")
+ .arg("--emit=metadata"),
+ execs().with_status(101));
+}
+
+#[test]
+fn check_all() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [workspace]
+ [dependencies]
+ b = { path = "b" }
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("examples/a.rs", "fn main() {}")
+ .file("tests/a.rs", "")
+ .file("src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/main.rs", "fn main() {}")
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("check").arg("--all").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] --crate-name foo src[/]lib.rs [..]")
+ .with_stderr_contains("[..] --crate-name foo src[/]main.rs [..]")
+ .with_stderr_contains("[..] --crate-name b b[/]src[/]lib.rs [..]")
+ .with_stderr_contains("[..] --crate-name b b[/]src[/]main.rs [..]")
+ );
+}
+
+#[test]
+fn check_virtual_all_implied() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("check").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] --crate-name foo foo[/]src[/]lib.rs [..]")
+ .with_stderr_contains("[..] --crate-name bar bar[/]src[/]lib.rs [..]")
+ );
+}
+
+#[test]
+fn check_all_targets() {
+ let foo = project("foo")
+ .file("Cargo.toml", SIMPLE_MANIFEST)
+ .file("src/main.rs", "fn main() {}")
+ .file("src/lib.rs", "pub fn smth() {}")
+ .file("examples/example1.rs", "fn main() {}")
+ .file("tests/test2.rs", "#[test] fn t() {}")
+ .file("benches/bench3.rs", "")
+ .build();
+
+ assert_that(foo.cargo("check").arg("--all-targets").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] --crate-name foo src[/]lib.rs [..]")
+ .with_stderr_contains("[..] --crate-name foo src[/]main.rs [..]")
+ .with_stderr_contains("[..] --crate-name example1 examples[/]example1.rs [..]")
+ .with_stderr_contains("[..] --crate-name test2 tests[/]test2.rs [..]")
+ .with_stderr_contains("[..] --crate-name bench3 benches[/]bench3.rs [..]")
+ );
+}
+
+#[test]
+fn check_unit_test_profile() {
+ let foo = project("foo")
+ .file("Cargo.toml", SIMPLE_MANIFEST)
+ .file("src/lib.rs", r#"
+ #[cfg(test)]
+ mod tests {
+ #[test]
+ fn it_works() {
+ badtext
+ }
+ }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("check"),
+ execs().with_status(0));
+ assert_that(foo.cargo("check").arg("--profile").arg("test"),
+ execs().with_status(101)
+ .with_stderr_contains("[..]badtext[..]"));
+}
+
+// Verify what is checked with various command-line filters.
+#[test]
+fn check_filters() {
+ let p = project("foo")
+ .file("Cargo.toml", SIMPLE_MANIFEST)
+ .file("src/lib.rs", r#"
+ fn unused_normal_lib() {}
+ #[cfg(test)]
+ mod tests {
+ fn unused_unit_lib() {}
+ }
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ fn unused_normal_bin() {}
+ #[cfg(test)]
+ mod tests {
+ fn unused_unit_bin() {}
+ }
+ "#)
+ .file("tests/t1.rs", r#"
+ fn unused_normal_t1() {}
+ #[cfg(test)]
+ mod tests {
+ fn unused_unit_t1() {}
+ }
+ "#)
+ .file("examples/ex1.rs", r#"
+ fn main() {}
+ fn unused_normal_ex1() {}
+ #[cfg(test)]
+ mod tests {
+ fn unused_unit_ex1() {}
+ }
+ "#)
+ .file("benches/b1.rs", r#"
+ fn unused_normal_b1() {}
+ #[cfg(test)]
+ mod tests {
+ fn unused_unit_b1() {}
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("check"),
+ execs().with_status(0)
+ .with_stderr_contains("[..]unused_normal_lib[..]")
+ .with_stderr_contains("[..]unused_normal_bin[..]")
+ .with_stderr_does_not_contain("unused_normal_t1")
+ .with_stderr_does_not_contain("unused_normal_ex1")
+ .with_stderr_does_not_contain("unused_normal_b1")
+ .with_stderr_does_not_contain("unused_unit_"));
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--tests").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] --crate-name foo src[/]lib.rs [..] --test [..]")
+ .with_stderr_contains("[..] --crate-name foo src[/]lib.rs --crate-type lib [..]")
+ .with_stderr_contains("[..] --crate-name foo src[/]main.rs [..] --test [..]")
+ .with_stderr_contains("[..] --crate-name foo src[/]main.rs --crate-type bin [..]")
+ .with_stderr_contains("[..]unused_unit_lib[..]")
+ .with_stderr_contains("[..]unused_unit_bin[..]")
+ .with_stderr_contains("[..]unused_normal_lib[..]")
+ .with_stderr_contains("[..]unused_normal_bin[..]")
+ .with_stderr_contains("[..]unused_unit_t1[..]")
+ .with_stderr_contains("[..]unused_normal_ex1[..]")
+ .with_stderr_contains("[..]unused_unit_ex1[..]")
+ .with_stderr_does_not_contain("unused_normal_b1")
+ .with_stderr_does_not_contain("unused_unit_b1"));
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--test").arg("t1").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..]unused_normal_lib[..]")
+ .with_stderr_contains("[..]unused_normal_bin[..]")
+ .with_stderr_contains("[..]unused_unit_t1[..]")
+ .with_stderr_does_not_contain("unused_unit_lib")
+ .with_stderr_does_not_contain("unused_unit_bin")
+ .with_stderr_does_not_contain("unused_normal_ex1")
+ .with_stderr_does_not_contain("unused_normal_b1")
+ .with_stderr_does_not_contain("unused_unit_ex1")
+ .with_stderr_does_not_contain("unused_unit_b1"));
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--all-targets").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("[..]unused_normal_lib[..]")
+ .with_stderr_contains("[..]unused_normal_bin[..]")
+ .with_stderr_contains("[..]unused_normal_t1[..]")
+ .with_stderr_contains("[..]unused_normal_ex1[..]")
+ .with_stderr_contains("[..]unused_normal_b1[..]")
+ .with_stderr_contains("[..]unused_unit_b1[..]")
+ .with_stderr_contains("[..]unused_unit_t1[..]")
+ .with_stderr_contains("[..]unused_unit_lib[..]")
+ .with_stderr_contains("[..]unused_unit_bin[..]")
+ .with_stderr_contains("[..]unused_unit_ex1[..]"));
+}
+
+#[test]
+fn check_artifacts() {
+ // Verify which artifacts are created when running check (#4059).
+ let p = project("foo")
+ .file("Cargo.toml", SIMPLE_MANIFEST)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .file("tests/t1.rs", "")
+ .file("examples/ex1.rs", "fn main() {}")
+ .file("benches/b1.rs", "")
+ .build();
+ assert_that(p.cargo("check"), execs().with_status(0));
+ assert_that(&p.root().join("target/debug/libfoo.rmeta"),
+ existing_file());
+ assert_that(&p.root().join("target/debug/libfoo.rlib"),
+ is_not(existing_file()));
+ assert_that(&p.root().join("target/debug").join(exe("foo")),
+ is_not(existing_file()));
+
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--lib"), execs().with_status(0));
+ assert_that(&p.root().join("target/debug/libfoo.rmeta"),
+ existing_file());
+ assert_that(&p.root().join("target/debug/libfoo.rlib"),
+ is_not(existing_file()));
+ assert_that(&p.root().join("target/debug").join(exe("foo")),
+ is_not(existing_file()));
+
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--bin").arg("foo"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/debug/libfoo.rmeta"),
+ existing_file());
+ assert_that(&p.root().join("target/debug/libfoo.rlib"),
+ is_not(existing_file()));
+ assert_that(&p.root().join("target/debug").join(exe("foo")),
+ is_not(existing_file()));
+
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--test").arg("t1"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/debug/libfoo.rmeta"),
+ existing_file());
+ assert_that(&p.root().join("target/debug/libfoo.rlib"),
+ is_not(existing_file()));
+ assert_that(&p.root().join("target/debug").join(exe("foo")),
+ is_not(existing_file()));
+ assert_that(glob(&p.root().join("target/debug/t1-*").to_str().unwrap())
+ .unwrap().count(),
+ is(equal_to(0)));
+
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--example").arg("ex1"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/debug/libfoo.rmeta"),
+ existing_file());
+ assert_that(&p.root().join("target/debug/libfoo.rlib"),
+ is_not(existing_file()));
+ assert_that(&p.root().join("target/debug/examples").join(exe("ex1")),
+ is_not(existing_file()));
+
+ p.root().join("target").rm_rf();
+ assert_that(p.cargo("check").arg("--bench").arg("b1"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/debug/libfoo.rmeta"),
+ existing_file());
+ assert_that(&p.root().join("target/debug/libfoo.rlib"),
+ is_not(existing_file()));
+ assert_that(&p.root().join("target/debug").join(exe("foo")),
+ is_not(existing_file()));
+ assert_that(glob(&p.root().join("target/debug/b1-*").to_str().unwrap())
+ .unwrap().count(),
+ is(equal_to(0)));
+}
--- /dev/null
+extern crate hamcrest;
+extern crate cargotest;
+
+use std::env;
+
+use cargotest::support::{git, project, execs, main_file, basic_bin_manifest};
+use cargotest::support::registry::Package;
+use hamcrest::{assert_that, existing_dir, existing_file, is_not};
+
+#[test]
+fn cargo_clean_simple() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.build_dir(), existing_dir());
+
+ assert_that(p.cargo("clean"),
+ execs().with_status(0));
+ assert_that(&p.build_dir(), is_not(existing_dir()));
+}
+
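+// `cargo clean` run from a subdirectory should still remove the project's
+// entire target directory.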
+#[test]
+fn different_dir() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .file("src/bar/a.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.build_dir(), existing_dir());
+
+ assert_that(p.cargo("clean").cwd(&p.root().join("src")),
+ execs().with_status(0).with_stdout(""));
+ assert_that(&p.build_dir(), is_not(existing_dir()));
+}
+
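+// `cargo clean -p` accepts multiple packages and removes only their
+// artifacts, leaving the rest of the target directory in place.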
+#[test]
+fn clean_multiple_packages() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.d1]
+ path = "d1"
+ [dependencies.d2]
+ path = "d2"
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "d1"
+ "#)
+ .file("d1/src/main.rs", "fn main() { println!(\"d1\"); }")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "d2"
+ "#)
+ .file("d2/src/main.rs", "fn main() { println!(\"d2\"); }")
+ .build();
+
+ assert_that(p.cargo("build").arg("-p").arg("d1").arg("-p").arg("d2")
+ .arg("-p").arg("foo"),
+ execs().with_status(0));
+
+ let d1_path = &p.build_dir().join("debug")
+ .join(format!("d1{}", env::consts::EXE_SUFFIX));
+ let d2_path = &p.build_dir().join("debug")
+ .join(format!("d2{}", env::consts::EXE_SUFFIX));
+
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(d1_path, existing_file());
+ assert_that(d2_path, existing_file());
+
+ assert_that(p.cargo("clean").arg("-p").arg("d1").arg("-p").arg("d2")
+ .cwd(&p.root().join("src")),
+ execs().with_status(0).with_stdout(""));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(d1_path, is_not(existing_file()));
+ assert_that(d2_path, is_not(existing_file()));
+}
+
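+// Cleaning a single package should be profile-aware: `--release` is needed
+// to force the release artifacts to be rebuilt.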
+#[test]
+fn clean_release() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--release"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("clean").arg("-p").arg("foo"),
+ execs().with_status(0));
+ assert_that(p.cargo("build").arg("--release"),
+ execs().with_status(0).with_stdout(""));
+
+ assert_that(p.cargo("clean").arg("-p").arg("foo").arg("--release"),
+ execs().with_status(0));
+ assert_that(p.cargo("build").arg("--release"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+"));
+}
+
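+// Cleaning a package also removes its build script output, so OUT_DIR is
+// empty again on the next build.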
+#[test]
+fn build_script() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("build.rs", r#"
+ use std::path::PathBuf;
+ use std::env;
+
+ fn main() {
+ let out = PathBuf::from(env::var_os("OUT_DIR").unwrap());
+ if env::var("FIRST").is_ok() {
+ std::fs::File::create(out.join("out")).unwrap();
+ } else {
+ assert!(std::fs::metadata(out.join("out")).is_err());
+ }
+ }
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").env("FIRST", "1"),
+ execs().with_status(0));
+ assert_that(p.cargo("clean").arg("-p").arg("foo"),
+ execs().with_status(0));
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] build.rs [..]`
+[RUNNING] `[..]build-script-build`
+[RUNNING] `rustc [..] src[/]main.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
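+// `cargo clean -p` should also work for a git dependency.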
+#[test]
+fn clean_git() {
+ let git = git::new("dep", |project| {
+ project.file("Cargo.toml", r#"
+ [project]
+ name = "dep"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ dep = {{ git = '{}' }}
+ "#, git.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("clean").arg("-p").arg("dep"),
+ execs().with_status(0).with_stdout(""));
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
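+// `cargo clean -p` should also work for a registry dependency.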
+#[test]
+fn registry() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.1.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("clean").arg("-p").arg("bar"),
+ execs().with_status(0).with_stdout(""));
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate git2;
+extern crate hamcrest;
+
+use std::{env, str};
+use std::fs::{self, File};
+use std::io::Write;
+use std::net::TcpListener;
+use std::process::Stdio;
+use std::thread;
+use std::sync::mpsc::channel;
+use std::time::Duration;
+
+use cargotest::install::{has_installed_exe, cargo_home};
+use cargotest::support::git;
+use cargotest::support::registry::Package;
+use cargotest::support::{execs, project};
+use hamcrest::{assert_that, existing_file};
+
+fn pkg(name: &str, vers: &str) {
+ Package::new(name, vers)
+ .file("src/main.rs", "fn main() {{}}")
+ .publish();
+}
+
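+// Two `cargo install`s of different packages run in parallel should both
+// succeed and both binaries should land in CARGO_HOME.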
+#[test]
+fn multiple_installs() {
+ let p = project("foo")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ "#)
+ .file("a/src/main.rs", "fn main() {}")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.0"
+ "#)
+ .file("b/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ let mut a = p.cargo("install").cwd(p.root().join("a")).build_command();
+ let mut b = p.cargo("install").cwd(p.root().join("b")).build_command();
+
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ assert_that(a, execs().with_status(0));
+ assert_that(b, execs().with_status(0));
+
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(cargo_home(), has_installed_exe("bar"));
+}
+
+#[test]
+fn concurrent_installs() {
+ const LOCKED_BUILD: &'static str = "waiting for file lock on build directory";
+
+ pkg("foo", "0.0.1");
+ pkg("bar", "0.0.1");
+
+ let mut a = cargotest::cargo_process().arg("install").arg("foo").build_command();
+ let mut b = cargotest::cargo_process().arg("install").arg("bar").build_command();
+
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ assert!(!str::from_utf8(&a.stderr).unwrap().contains(LOCKED_BUILD));
+ assert!(!str::from_utf8(&b.stderr).unwrap().contains(LOCKED_BUILD));
+
+ assert_that(a, execs().with_status(0));
+ assert_that(b, execs().with_status(0));
+
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(cargo_home(), has_installed_exe("bar"));
+}
+
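+// When two parallel installs race to install the same binary name, exactly
+// one should fail with an "already exists" error while the other wins.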
+#[test]
+fn one_install_should_be_bad() {
+ let p = project("foo")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ "#)
+ .file("a/src/main.rs", "fn main() {}")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ "#)
+ .file("b/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ let mut a = p.cargo("install").cwd(p.root().join("a")).build_command();
+ let mut b = p.cargo("install").cwd(p.root().join("b")).build_command();
+
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ let (bad, good) = if a.status.code() == Some(101) {(a, b)} else {(b, a)};
+ assert_that(bad, execs().with_status(101).with_stderr_contains("\
+[ERROR] binary `foo[..]` already exists in destination as part of `[..]`
+"));
+ assert_that(good, execs().with_status(0).with_stderr_contains("\
+warning: be sure to add `[..]` to your PATH [..]
+"));
+
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
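+// Two concurrent builds that both need to fetch the same registry packages
+// should each complete successfully.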
+#[test]
+fn multiple_registry_fetches() {
+ let mut pkg = Package::new("bar", "1.0.2");
+ for i in 0..10 {
+ let name = format!("foo{}", i);
+ Package::new(&name, "1.0.0").publish();
+ pkg.dep(&name, "*");
+ }
+ pkg.publish();
+
+ let p = project("foo")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("a/src/main.rs", "fn main() {}")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.0"
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("b/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ let mut a = p.cargo("build").cwd(p.root().join("a")).build_command();
+ let mut b = p.cargo("build").cwd(p.root().join("b")).build_command();
+
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ assert_that(a, execs().with_status(0));
+ assert_that(b, execs().with_status(0));
+
+ let suffix = env::consts::EXE_SUFFIX;
+ assert_that(&p.root().join("a/target/debug").join(format!("foo{}", suffix)),
+ existing_file());
+ assert_that(&p.root().join("b/target/debug").join(format!("bar{}", suffix)),
+ existing_file());
+}
+
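+// Concurrent builds checking out different tags of the same git repository
+// should not interfere with each other.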
+#[test]
+fn git_same_repo_different_tags() {
+ let a = git::new("dep", |project| {
+ project.file("Cargo.toml", r#"
+ [project]
+ name = "dep"
+ version = "0.5.0"
+ authors = []
+ "#).file("src/lib.rs", "pub fn tag1() {}")
+ }).unwrap();
+
+ let repo = git2::Repository::open(&a.root()).unwrap();
+ git::tag(&repo, "tag1");
+
+ File::create(a.root().join("src/lib.rs")).unwrap()
+ .write_all(b"pub fn tag2() {}").unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+ git::tag(&repo, "tag2");
+
+ let p = project("foo")
+ .file("a/Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [dependencies]
+ dep = {{ git = '{}', tag = 'tag1' }}
+ "#, a.url()))
+ .file("a/src/main.rs", "extern crate dep; fn main() { dep::tag1(); }")
+ .file("b/Cargo.toml", &format!(r#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.0"
+
+ [dependencies]
+ dep = {{ git = '{}', tag = 'tag2' }}
+ "#, a.url()))
+ .file("b/src/main.rs", "extern crate dep; fn main() { dep::tag2(); }");
+ let p = p.build();
+
+ let mut a = p.cargo("build").arg("-v").cwd(p.root().join("a")).build_command();
+ let mut b = p.cargo("build").arg("-v").cwd(p.root().join("b")).build_command();
+
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ assert_that(a, execs().with_status(0));
+ assert_that(b, execs().with_status(0));
+}
+
+#[test]
+fn git_same_branch_different_revs() {
+ let a = git::new("dep", |project| {
+ project.file("Cargo.toml", r#"
+ [project]
+ name = "dep"
+ version = "0.5.0"
+ authors = []
+ "#).file("src/lib.rs", "pub fn f1() {}")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("a/Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [dependencies]
+ dep = {{ git = '{}' }}
+ "#, a.url()))
+ .file("a/src/main.rs", "extern crate dep; fn main() { dep::f1(); }")
+ .file("b/Cargo.toml", &format!(r#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.0"
+
+ [dependencies]
+ dep = {{ git = '{}' }}
+ "#, a.url()))
+ .file("b/src/main.rs", "extern crate dep; fn main() { dep::f2(); }");
+ let p = p.build();
+
+ // Generate a Cargo.lock pointing at the current rev, then clear out the
+ // target directory
+ assert_that(p.cargo("build").cwd(p.root().join("a")),
+ execs().with_status(0));
+ fs::remove_dir_all(p.root().join("a/target")).unwrap();
+
+ // Make a new commit on the master branch
+ let repo = git2::Repository::open(&a.root()).unwrap();
+ File::create(a.root().join("src/lib.rs")).unwrap()
+ .write_all(b"pub fn f2() {}").unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ // Now run both builds in parallel. The build of `b` should pick up the
+ // newest commit while the build of `a` should use the locked old commit.
+ let mut a = p.cargo("build").cwd(p.root().join("a")).build_command();
+ let mut b = p.cargo("build").cwd(p.root().join("b")).build_command();
+
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ assert_that(a, execs().with_status(0));
+ assert_that(b, execs().with_status(0));
+}
+
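+// Two concurrent builds of the very same project should serialize on the
+// target directory lock and both succeed.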
+#[test]
+fn same_project() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("src/lib.rs", "");
+ let p = p.build();
+
+ let mut a = p.cargo("build").build_command();
+ let mut b = p.cargo("build").build_command();
+
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ assert_that(a, execs().with_status(0));
+ assert_that(b, execs().with_status(0));
+}
+
+// Make sure that if Cargo dies while holding a lock that it's released and the
+// next Cargo to come in will take over cleanly.
+// Older Windows versions don't support job objects, so the test is skipped there.
+#[test]
+#[cfg_attr(target_os = "windows", ignore)]
+fn killing_cargo_releases_the_lock() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ build = "build.rs"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("build.rs", r#"
+ use std::net::TcpStream;
+
+ fn main() {
+ if std::env::var("A").is_ok() {
+ TcpStream::connect(&std::env::var("ADDR").unwrap()[..])
+ .unwrap();
+ std::thread::sleep(std::time::Duration::new(10, 0));
+ }
+ }
+ "#);
+ let p = p.build();
+
+ // Our build script will connect to our local TCP socket to inform us that
+ // it's started and that's how we know that `a` will have the lock
+ // when we kill it.
+ let l = TcpListener::bind("127.0.0.1:0").unwrap();
+ let mut a = p.cargo("build").build_command();
+ let mut b = p.cargo("build").build_command();
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+ a.env("ADDR", l.local_addr().unwrap().to_string()).env("A", "a");
+ b.env("ADDR", l.local_addr().unwrap().to_string()).env_remove("A");
+
+ // Spawn `a`, wait for it to get to the build script (at which point the
+ // lock is held), then kill it.
+ let mut a = a.spawn().unwrap();
+ l.accept().unwrap();
+ a.kill().unwrap();
+
+ // Spawn `b`, then just finish the output of a/b the same way the above
+ // tests do.
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ // We killed `a`, so it shouldn't succeed, but `b` should have succeeded.
+ assert!(!a.status.success());
+ assert_that(b, execs().with_status(0));
+}
+
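+// A debug build and a release build of the same project can run in
+// parallel, each finishing with its own expected output.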
+#[test]
+fn debug_release_ok() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ fs::remove_dir_all(p.root().join("target")).unwrap();
+
+ let mut a = p.cargo("build").build_command();
+ let mut b = p.cargo("build").arg("--release").build_command();
+ a.stdout(Stdio::piped()).stderr(Stdio::piped());
+ b.stdout(Stdio::piped()).stderr(Stdio::piped());
+ let a = a.spawn().unwrap();
+ let b = b.spawn().unwrap();
+ let a = thread::spawn(move || a.wait_with_output().unwrap());
+ let b = b.wait_with_output().unwrap();
+ let a = a.join().unwrap();
+
+ assert_that(a, execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.0 [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(b, execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.0 [..]
+[FINISHED] release [optimized] target(s) in [..]
+"));
+}
+
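+// Several builds of the same project running concurrently must not deadlock
+// while fetching the shared git dependencies.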
+#[test]
+fn no_deadlock_with_git_dependencies() {
+ let dep1 = git::new("dep1", |project| {
+ project.file("Cargo.toml", r#"
+ [project]
+ name = "dep1"
+ version = "0.5.0"
+ authors = []
+ "#).file("src/lib.rs", "")
+ }).unwrap();
+
+ let dep2 = git::new("dep2", |project| {
+ project.file("Cargo.toml", r#"
+ [project]
+ name = "dep2"
+ version = "0.5.0"
+ authors = []
+ "#).file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [dependencies]
+ dep1 = {{ git = '{}' }}
+ dep2 = {{ git = '{}' }}
+ "#, dep1.url(), dep2.url()))
+ .file("src/main.rs", "fn main() { }");
+ let p = p.build();
+
+ let n_concurrent_builds = 5;
+
+ let (tx, rx) = channel();
+ for _ in 0..n_concurrent_builds {
+ let cmd = p.cargo("build").build_command()
+ .stdout(Stdio::piped())
+ .stderr(Stdio::piped())
+ .spawn();
+ let tx = tx.clone();
+ thread::spawn(move || {
+ let result = cmd.unwrap().wait_with_output().unwrap();
+ tx.send(result).unwrap()
+ });
+ }
+
+ // TODO: use `Receiver::recv_timeout` once it is stable.
+ let recv_timeout = |chan: &::std::sync::mpsc::Receiver<_>| {
+ for _ in 0..3000 {
+ if let Ok(x) = chan.try_recv() {
+ return x
+ }
+ thread::sleep(Duration::from_millis(10));
+ }
+ chan.try_recv().expect("Deadlock!")
+ };
+
+ for _ in 0..n_concurrent_builds {
+ let result = recv_timeout(&rx);
+ assert_that(result, execs().with_status(0));
+ }
+}
--- /dev/null
+extern crate hamcrest;
+extern crate cargotest;
+
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
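+// `CARGO_BUILD_JOBS` should be read as the `build.jobs` configuration value
+// and surfaced to build scripts via the `NUM_JOBS` environment variable.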
+#[test]
+fn read_env_vars_for_config() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ use std::env;
+ fn main() {
+ assert_eq!(env::var("NUM_JOBS").unwrap(), "100");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").env("CARGO_BUILD_JOBS", "100"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargo::util::process;
+use cargotest::{is_nightly, rustc_host};
+use cargotest::support::{project, execs, basic_bin_manifest, cross_compile};
+use hamcrest::{assert_that, existing_file};
+
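+// A basic `--target` cross build: the build script observes the alternate
+// TARGET and the resulting binary runs on the alternate architecture.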
+#[test]
+fn simple_cross() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", &format!(r#"
+ fn main() {{
+ assert_eq!(std::env::var("TARGET").unwrap(), "{}");
+ }}
+ "#, cross_compile::alternate()))
+ .file("src/main.rs", &format!(r#"
+ use std::env;
+ fn main() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("build").arg("--target").arg(&target).arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.target_bin(&target, "foo"), existing_file());
+
+ assert_that(process(&p.target_bin(&target, "foo")),
+ execs().with_status(0));
+}
+
+#[test]
+fn simple_cross_config() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file(".cargo/config", &format!(r#"
+ [build]
+ target = "{}"
+ "#, cross_compile::alternate()))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", &format!(r#"
+ fn main() {{
+ assert_eq!(std::env::var("TARGET").unwrap(), "{}");
+ }}
+ "#, cross_compile::alternate()))
+ .file("src/main.rs", &format!(r#"
+ use std::env;
+ fn main() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.target_bin(&target, "foo"), existing_file());
+
+ assert_that(process(&p.target_bin(&target, "foo")),
+ execs().with_status(0));
+}
+
+#[test]
+fn simple_deps() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() { bar::bar(); }
+ "#)
+ .build();
+ let _p2 = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn bar() {}")
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("build").arg("--target").arg(&target),
+ execs().with_status(0));
+ assert_that(&p.target_bin(&target, "foo"), existing_file());
+
+ assert_that(process(&p.target_bin(&target, "foo")),
+ execs().with_status(0));
+}
+
+#[test]
+fn plugin_deps() {
+ if cross_compile::disabled() { return }
+ if !is_nightly() { return }
+
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(plugin)]
+ #![plugin(bar)]
+ extern crate baz;
+ fn main() {
+ assert_eq!(bar!(), baz::baz());
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ plugin = true
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(plugin_registrar, quote, rustc_private)]
+
+ extern crate rustc_plugin;
+ extern crate syntax;
+
+ use rustc_plugin::Registry;
+ use syntax::tokenstream::TokenTree;
+ use syntax::codemap::Span;
+ use syntax::ext::base::{ExtCtxt, MacEager, MacResult};
+
+ #[plugin_registrar]
+ pub fn foo(reg: &mut Registry) {
+ reg.register_macro("bar", expand_bar);
+ }
+
+ fn expand_bar(cx: &mut ExtCtxt, sp: Span, tts: &[TokenTree])
+ -> Box<MacResult + 'static> {
+ MacEager::expr(quote_expr!(cx, 1))
+ }
+ "#)
+ .build();
+ let _baz = project("baz")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn baz() -> i32 { 1 }")
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(foo.cargo("build").arg("--target").arg(&target),
+ execs().with_status(0));
+ assert_that(&foo.target_bin(&target, "foo"), existing_file());
+
+ assert_that(process(&foo.target_bin(&target, "foo")),
+ execs().with_status(0));
+}
+
+#[test]
+fn plugin_to_the_max() {
+ if cross_compile::disabled() { return }
+ if !is_nightly() { return }
+
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(plugin)]
+ #![plugin(bar)]
+ extern crate baz;
+ fn main() {
+ assert_eq!(bar!(), baz::baz());
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ plugin = true
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(plugin_registrar, quote, rustc_private)]
+
+ extern crate rustc_plugin;
+ extern crate syntax;
+ extern crate baz;
+
+ use rustc_plugin::Registry;
+ use syntax::tokenstream::TokenTree;
+ use syntax::codemap::Span;
+ use syntax::ext::base::{ExtCtxt, MacEager, MacResult};
+
+ #[plugin_registrar]
+ pub fn foo(reg: &mut Registry) {
+ reg.register_macro("bar", expand_bar);
+ }
+
+ fn expand_bar(cx: &mut ExtCtxt, sp: Span, tts: &[TokenTree])
+ -> Box<MacResult + 'static> {
+ MacEager::expr(quote_expr!(cx, baz::baz()))
+ }
+ "#)
+ .build();
+ let _baz = project("baz")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn baz() -> i32 { 1 }")
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(foo.cargo("build").arg("--target").arg(&target).arg("-v"),
+ execs().with_status(0));
+ println!("second");
+ assert_that(foo.cargo("build").arg("-v")
+ .arg("--target").arg(&target),
+ execs().with_status(0));
+ assert_that(&foo.target_bin(&target, "foo"), existing_file());
+
+ assert_that(process(&foo.target_bin(&target, "foo")),
+ execs().with_status(0));
+}
+
+#[test]
+fn linker_and_ar() {
+ if cross_compile::disabled() { return }
+
+ let target = cross_compile::alternate();
+ let p = project("foo")
+ .file(".cargo/config", &format!(r#"
+ [target.{}]
+ ar = "my-ar-tool"
+ linker = "my-linker-tool"
+ "#, target))
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &format!(r#"
+ use std::env;
+ fn main() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ assert_that(p.cargo("build").arg("--target").arg(&target)
+ .arg("-v"),
+ execs().with_status(101)
+ .with_stderr_contains(&format!("\
+[COMPILING] foo v0.5.0 ({url})
+[RUNNING] `rustc --crate-name foo src[/]foo.rs --crate-type bin \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir {dir}[/]target[/]{target}[/]debug[/]deps \
+ --target {target} \
+ -C ar=my-ar-tool -C linker=my-linker-tool \
+ -L dependency={dir}[/]target[/]{target}[/]debug[/]deps \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+",
+ dir = p.root().display(),
+ url = p.url(),
+ target = target,
+ )));
+}
+
+#[test]
+fn plugin_with_extra_dylib_dep() {
+ if cross_compile::disabled() { return }
+ if !is_nightly() { return }
+
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(plugin)]
+ #![plugin(bar)]
+
+ fn main() {}
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ plugin = true
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(plugin_registrar, rustc_private)]
+
+ extern crate rustc_plugin;
+ extern crate baz;
+
+ use rustc_plugin::Registry;
+
+ #[plugin_registrar]
+ pub fn foo(reg: &mut Registry) {
+ println!("{}", baz::baz());
+ }
+ "#)
+ .build();
+ let _baz = project("baz")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "baz"
+ crate_type = ["dylib"]
+ "#)
+ .file("src/lib.rs", "pub fn baz() -> i32 { 1 }")
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(foo.cargo("build").arg("--target").arg(&target),
+ execs().with_status(0));
+}
+
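+// `cargo test --target` should build and run the unit and binary tests for
+// the alternate target.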
+#[test]
+fn cross_tests() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/bin/bar.rs", &format!(r#"
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ use std::env;
+ fn main() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ #[test] fn test() {{ main() }}
+ "#, cross_compile::alternate_arch()))
+ .file("src/lib.rs", &format!(r#"
+ use std::env;
+ pub fn foo() {{ assert_eq!(env::consts::ARCH, "{}"); }}
+ #[test] fn test_foo() {{ foo() }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("test").arg("--target").arg(&target),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({foo})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]{triple}[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]{triple}[/]debug[/]deps[/]bar-[..][EXE]", foo = p.url(), triple = target))
+ .with_stdout_contains("test test_foo ... ok")
+ .with_stdout_contains("test test ... ok"));
+}
+
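+// Doctests only run for the host; a cross `cargo test` should skip them.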
+#[test]
+fn no_cross_doctests() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ authors = []
+ version = "0.0.0"
+ "#)
+ .file("src/lib.rs", r#"
+ //! ```
+ //! extern crate foo;
+ //! assert!(true);
+ //! ```
+ "#)
+ .build();
+
+ let host_output = format!("\
+[COMPILING] foo v0.0.0 ({foo})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo
+", foo = p.url());
+
+ println!("a");
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&host_output));
+
+ println!("b");
+ let target = cross_compile::host();
+ assert_that(p.cargo("test").arg("--target").arg(&target),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({foo})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]{triple}[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo
+", foo = p.url(), triple = target)));
+
+ println!("c");
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("test").arg("--target").arg(&target),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({foo})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]{triple}[/]debug[/]deps[/]foo-[..][EXE]
+", foo = p.url(), triple = target)));
+}
+
+#[test]
+fn simple_cargo_run() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/main.rs", &format!(r#"
+ use std::env;
+ fn main() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("run").arg("--target").arg(&target),
+ execs().with_status(0));
+}
+
+#[test]
+fn cross_with_a_build_script() {
+ if cross_compile::disabled() { return }
+
+ let target = cross_compile::alternate();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = 'build.rs'
+ "#)
+ .file("build.rs", &format!(r#"
+ use std::env;
+ use std::path::PathBuf;
+ fn main() {{
+ assert_eq!(env::var("TARGET").unwrap(), "{0}");
+ let mut path = PathBuf::from(env::var_os("OUT_DIR").unwrap());
+ assert_eq!(path.file_name().unwrap().to_str().unwrap(), "out");
+ path.pop();
+ assert!(path.file_name().unwrap().to_str().unwrap()
+ .starts_with("foo-"));
+ path.pop();
+ assert_eq!(path.file_name().unwrap().to_str().unwrap(), "build");
+ path.pop();
+ assert_eq!(path.file_name().unwrap().to_str().unwrap(), "debug");
+ path.pop();
+ assert_eq!(path.file_name().unwrap().to_str().unwrap(), "{0}");
+ path.pop();
+ assert_eq!(path.file_name().unwrap().to_str().unwrap(), "target");
+ }}
+ "#, target))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--target").arg(&target).arg("-v"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 (file://[..])
+[RUNNING] `rustc [..] build.rs [..] --out-dir {dir}[/]target[/]debug[/]build[/]foo-[..]`
+[RUNNING] `{dir}[/]target[/]debug[/]build[/]foo-[..][/]build-script-build`
+[RUNNING] `rustc [..] src[/]main.rs [..] --target {target} [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", target = target,
+ dir = p.root().display())));
+}
+
+#[test]
+fn build_script_needed_for_host_and_target() {
+ if cross_compile::disabled() { return }
+
+ let target = cross_compile::alternate();
+ let host = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = 'build.rs'
+
+ [dependencies.d1]
+ path = "d1"
+ [build-dependencies.d2]
+ path = "d2"
+ "#)
+
+ .file("build.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate d2;
+ fn main() { d2::d2(); }
+ "#)
+ .file("src/main.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate d1;
+ fn main() { d1::d1(); }
+ ")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.0"
+ authors = []
+ build = 'build.rs'
+ "#)
+ .file("d1/src/lib.rs", "
+ pub fn d1() {}
+ ")
+ .file("d1/build.rs", r#"
+ use std::env;
+ fn main() {
+ let target = env::var("TARGET").unwrap();
+ println!("cargo:rustc-flags=-L /path/to/{}", target);
+ }
+ "#)
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.d1]
+ path = "../d1"
+ "#)
+ .file("d2/src/lib.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate d1;
+ pub fn d2() { d1::d1(); }
+ ")
+ .build();
+
+ assert_that(p.cargo("build").arg("--target").arg(&target).arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains(&format!("\
+[COMPILING] d1 v0.0.0 ({url}/d1)", url = p.url()))
+ .with_stderr_contains(&format!("\
+[RUNNING] `rustc [..] d1[/]build.rs [..] --out-dir {dir}[/]target[/]debug[/]build[/]d1-[..]`",
+ dir = p.root().display()))
+ .with_stderr_contains(&format!("\
+[RUNNING] `{dir}[/]target[/]debug[/]build[/]d1-[..][/]build-script-build`",
+ dir = p.root().display()))
+ .with_stderr_contains("\
+[RUNNING] `rustc [..] d1[/]src[/]lib.rs [..]`")
+ .with_stderr_contains(&format!("\
+[COMPILING] d2 v0.0.0 ({url}/d2)", url = p.url()))
+ .with_stderr_contains(&format!("\
+[RUNNING] `rustc [..] d2[/]src[/]lib.rs [..] \
+ -L /path/to/{host}`", host = host))
+ .with_stderr_contains(&format!("\
+[COMPILING] foo v0.0.0 ({url})", url = p.url()))
+ .with_stderr_contains(&format!("\
+[RUNNING] `rustc [..] build.rs [..] --out-dir {dir}[/]target[/]debug[/]build[/]foo-[..] \
+ -L /path/to/{host}`", dir = p.root().display(), host = host))
+ .with_stderr_contains(&format!("\
+[RUNNING] `rustc [..] src[/]main.rs [..] --target {target} [..] \
+ -L /path/to/{target}`", target = target)));
+}
+
+#[test]
+fn build_deps_for_the_right_arch() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.d2]
+ path = "d2"
+ "#)
+ .file("src/main.rs", "extern crate d2; fn main() {}")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("d1/src/lib.rs", "
+ pub fn d1() {}
+ ")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies.d1]
+ path = "../d1"
+ "#)
+ .file("d2/build.rs", "extern crate d1; fn main() {}")
+ .file("d2/src/lib.rs", "")
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("build").arg("--target").arg(&target).arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_script_only_host() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies.d1]
+ path = "d1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("build.rs", "extern crate d1; fn main() {}")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("d1/src/lib.rs", "
+ pub fn d1() {}
+ ")
+ .file("d1/build.rs", r#"
+ use std::env;
+
+ fn main() {
+ assert!(env::var("OUT_DIR").unwrap().replace("\\", "/")
+ .contains("target/debug/build/d1-"),
+ "bad: {:?}", env::var("OUT_DIR"));
+ }
+ "#)
+ .build();
+
+ let target = cross_compile::alternate();
+ assert_that(p.cargo("build").arg("--target").arg(&target).arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn plugin_build_script_right_arch() {
+ if cross_compile::disabled() { return }
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [lib]
+ name = "foo"
+ plugin = true
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("--target").arg(cross_compile::alternate()),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] build.rs [..]`
+[RUNNING] `[..][/]build-script-build`
+[RUNNING] `rustc [..] src[/]lib.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn build_script_with_platform_specific_dependencies() {
+ if cross_compile::disabled() { return }
+
+ let target = cross_compile::alternate();
+ let host = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [build-dependencies.d1]
+ path = "d1"
+ "#)
+ .file("build.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate d1;
+ fn main() {}
+ ")
+ .file("src/lib.rs", "")
+ .file("d1/Cargo.toml", &format!(r#"
+ [package]
+ name = "d1"
+ version = "0.0.0"
+ authors = []
+
+ [target.{}.dependencies]
+ d2 = {{ path = "../d2" }}
+ "#, host))
+ .file("d1/src/lib.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate d2;
+ ")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("d2/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("--target").arg(&target),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] d2 v0.0.0 ([..])
+[RUNNING] `rustc [..] d2[/]src[/]lib.rs [..]`
+[COMPILING] d1 v0.0.0 ([..])
+[RUNNING] `rustc [..] d1[/]src[/]lib.rs [..]`
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] build.rs [..]`
+[RUNNING] `{dir}[/]target[/]debug[/]build[/]foo-[..][/]build-script-build`
+[RUNNING] `rustc [..] src[/]lib.rs [..] --target {target} [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.root().display(), target = target)));
+}
+
+#[test]
+fn platform_specific_dependencies_do_not_leak() {
+ if cross_compile::disabled() { return }
+
+ let target = cross_compile::alternate();
+ let host = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [dependencies.d1]
+ path = "d1"
+
+ [build-dependencies.d1]
+ path = "d1"
+ "#)
+ .file("build.rs", "extern crate d1; fn main() {}")
+ .file("src/lib.rs", "")
+ .file("d1/Cargo.toml", &format!(r#"
+ [package]
+ name = "d1"
+ version = "0.0.0"
+ authors = []
+
+ [target.{}.dependencies]
+ d2 = {{ path = "../d2" }}
+ "#, host))
+ .file("d1/src/lib.rs", "extern crate d2;")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("d2/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("--target").arg(&target),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[..] can't find crate for `d2`[..]"));
+}
+
+#[test]
+fn platform_specific_variables_reflected_in_build_scripts() {
+ if cross_compile::disabled() { return }
+
+ let target = cross_compile::alternate();
+ let host = rustc_host();
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [target.{host}.dependencies]
+ d1 = {{ path = "d1" }}
+
+ [target.{target}.dependencies]
+ d2 = {{ path = "d2" }}
+ "#, host = host, target = target))
+ .file("build.rs", &format!(r#"
+ use std::env;
+
+ fn main() {{
+ let platform = env::var("TARGET").unwrap();
+ let (expected, not_expected) = match &platform[..] {{
+ "{host}" => ("DEP_D1_VAL", "DEP_D2_VAL"),
+ "{target}" => ("DEP_D2_VAL", "DEP_D1_VAL"),
+ _ => panic!("unknown platform")
+ }};
+
+ env::var(expected).ok()
+ .expect(&format!("missing {{}}", expected));
+ env::var(not_expected).err()
+ .expect(&format!("found {{}}", not_expected));
+ }}
+ "#, host = host, target = target))
+ .file("src/lib.rs", "")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.0"
+ authors = []
+ links = "d1"
+ build = "build.rs"
+ "#)
+ .file("d1/build.rs", r#"
+ fn main() { println!("cargo:val=1") }
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.0"
+ authors = []
+ links = "d2"
+ build = "build.rs"
+ "#)
+ .file("d2/build.rs", r#"
+ fn main() { println!("cargo:val=1") }
+ "#)
+ .file("d2/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0));
+ assert_that(p.cargo("build").arg("-v").arg("--target").arg(&target),
+ execs().with_status(0));
+}
+
+#[test]
+fn cross_test_dylib() {
+ if cross_compile::disabled() { return }
+
+ let target = cross_compile::alternate();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ crate_type = ["dylib"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar as the_bar;
+
+ pub fn bar() { the_bar::baz(); }
+
+ #[test]
+ fn foo() { bar(); }
+ "#)
+ .file("tests/test.rs", r#"
+ extern crate foo as the_foo;
+
+ #[test]
+ fn foo() { the_foo::bar(); }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ crate_type = ["dylib"]
+ "#)
+ .file("bar/src/lib.rs", &format!(r#"
+ use std::env;
+ pub fn baz() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ assert_that(p.cargo("test").arg("--target").arg(&target),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({dir}/bar)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]{arch}[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]{arch}[/]debug[/]deps[/]test-[..][EXE]",
+ dir = p.url(), arch = cross_compile::alternate()))
+ .with_stdout_contains_n("test foo ... ok", 2));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+extern crate flate2;
+extern crate tar;
+
+use std::fs::File;
+use std::path::PathBuf;
+use std::io::prelude::*;
+
+use cargotest::support::{project, execs, cross_compile, publish};
+use hamcrest::{assert_that, contains};
+use flate2::read::GzDecoder;
+use tar::Archive;
+
+#[test]
+fn simple_cross_package() {
+ if cross_compile::disabled() { return }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ repository = "bar"
+ "#)
+ .file("src/main.rs", &format!(r#"
+ use std::env;
+ fn main() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ let target = cross_compile::alternate();
+
+ assert_that(p.cargo("package").arg("--target").arg(&target),
+ execs().with_status(0).with_stderr(&format!(
+" Packaging foo v0.0.0 ({dir})
+ Verifying foo v0.0.0 ({dir})
+ Compiling foo v0.0.0 ({dir}/target/package/foo-0.0.0)
+ Finished dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+
+ // Check that the tarball contains the expected files.
+ let f = File::open(&p.root().join("target/package/foo-0.0.0.crate")).unwrap();
+ let mut rdr = GzDecoder::new(f).unwrap();
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ let entries = ar.entries().unwrap();
+ let entry_paths = entries.map(|entry| {
+ entry.unwrap().path().unwrap().into_owned()
+ }).collect::<Vec<PathBuf>>();
+ assert_that(&entry_paths, contains(vec![PathBuf::from("foo-0.0.0/Cargo.toml")]));
+ assert_that(&entry_paths, contains(vec![PathBuf::from("foo-0.0.0/Cargo.toml.orig")]));
+ assert_that(&entry_paths, contains(vec![PathBuf::from("foo-0.0.0/src/main.rs")]));
+}
+
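+// `cargo publish --target` should verify the package for the requested
+// target before uploading it.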
+#[test]
+fn publish_with_target() {
+ if cross_compile::disabled() { return }
+
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ repository = "bar"
+ "#)
+ .file("src/main.rs", &format!(r#"
+ use std::env;
+ fn main() {{
+ assert_eq!(env::consts::ARCH, "{}");
+ }}
+ "#, cross_compile::alternate_arch()))
+ .build();
+
+ let target = cross_compile::alternate();
+
+ assert_that(p.cargo("publish")
+ .arg("--index").arg(publish::registry().to_string())
+ .arg("--target").arg(&target),
+ execs().with_status(0).with_stderr(&format!(
+" Updating registry `{registry}`
+ Packaging foo v0.0.0 ({dir})
+ Verifying foo v0.0.0 ({dir})
+ Compiling foo v0.0.0 ({dir}/target/package/foo-0.0.0)
+ Finished dev [unoptimized + debuginfo] target(s) in [..]
+ Uploading foo v0.0.0 ({dir})
+", dir = p.url(), registry = publish::registry())));
+}
--- /dev/null
+extern crate cargotest;
+extern crate libc;
+
+use std::fs;
+use std::io::{self, Read};
+use std::net::TcpListener;
+use std::process::{Stdio, Child};
+use std::thread;
+use std::time::Duration;
+
+use cargotest::support::project;
+
+#[cfg(unix)]
+fn enabled() -> bool {
+ true
+}
+
+// On Windows support for these tests is only enabled through the usage of job
+// objects. Support for nested job objects, however, was added in recent-ish
+// versions of Windows, so this test may not always be able to succeed.
+//
+// As a result, we try to add ourselves to a job object here to learn whether
+// or not the tests can succeed.
+#[cfg(windows)]
+fn enabled() -> bool {
+ extern crate kernel32;
+ extern crate winapi;
+ unsafe {
+ // If we're not currently in a job, then we can definitely run these
+ // tests.
+ let me = kernel32::GetCurrentProcess();
+ let mut ret = 0;
+ let r = kernel32::IsProcessInJob(me, 0 as *mut _, &mut ret);
+ assert!(r != 0);
+ if ret == winapi::FALSE {
+ return true
+ }
+
+ // If we are in a job, then we can run these tests if we can be added to
+ // a nested job (as we're going to create a nested job no matter what as
+ // part of these tests).
+ //
+ // If we can't be added to a nested job, then these tests will
+ // definitely fail, and there's not much we can do about that.
+ let job = kernel32::CreateJobObjectW(0 as *mut _, 0 as *const _);
+ assert!(!job.is_null());
+ let r = kernel32::AssignProcessToJobObject(job, me);
+ kernel32::CloseHandle(job);
+ r != 0
+ }
+}
+
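+// Hitting Ctrl-C while Cargo runs should take down the whole process group,
+// including a build script that is still executing.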
+#[test]
+fn ctrl_c_kills_everyone() {
+ if !enabled() {
+ return
+ }
+
+ let listener = TcpListener::bind("127.0.0.1:0").unwrap();
+ let addr = listener.local_addr().unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", &format!(r#"
+ use std::net::TcpStream;
+ use std::io::Read;
+
+ fn main() {{
+ let mut socket = TcpStream::connect("{}").unwrap();
+ let _ = socket.read(&mut [0; 10]);
+ panic!("that read should never return");
+ }}
+ "#, addr))
+ .build();
+
+ let mut cargo = p.cargo("build").build_command();
+ cargo.stdin(Stdio::piped())
+ .stdout(Stdio::piped())
+ .stderr(Stdio::piped())
+ .env("__CARGO_TEST_SETSID_PLEASE_DONT_USE_ELSEWHERE", "1");
+ let mut child = cargo.spawn().unwrap();
+
+ let mut sock = listener.accept().unwrap().0;
+ ctrl_c(&mut child);
+
+ assert!(!child.wait().unwrap().success());
+ match sock.read(&mut [0; 10]) {
+ Ok(n) => assert_eq!(n, 0),
+ Err(e) => assert_eq!(e.kind(), io::ErrorKind::ConnectionReset),
+ }
+
+ // Ok so what we just did was spawn cargo that spawned a build script, then
+ // we killed cargo in hopes of it killing the build script as well. If all
+ // went well the build script is now dead. On Windows, however, this is
+ // enforced with job objects which means that it may actually be in the
+ // *process* of being torn down at this point.
+ //
+ // Now on Windows we can't completely remove a file until all handles to it
+ // have been closed. Including those that represent running processes. So if
+ // we were to return here then there may still be an open reference to some
+ // file in the build directory. What we want to actually do is wait for the
+ // build script to *completely* exit. Take care of that by blowing away the
+ // build directory here, and panicking if we eventually spin too long
+ // without being able to.
+ for i in 0..10 {
+ match fs::remove_dir_all(&p.root().join("target")) {
+ Ok(()) => return,
+ Err(e) => println!("attempt {}: {}", i, e),
+ }
+ thread::sleep(Duration::from_millis(100));
+ }
+
+ panic!("couldn't remove build directory after a few tries, seems like \
+ we won't be able to!");
+}
+
+#[cfg(unix)]
+fn ctrl_c(child: &mut Child) {
+ use libc;
+
+ let r = unsafe { libc::kill(-(child.id() as i32), libc::SIGINT) };
+ if r < 0 {
+ panic!("failed to kill: {}", io::Error::last_os_error());
+ }
+}
+
+#[cfg(windows)]
+fn ctrl_c(child: &mut Child) {
+ child.kill().unwrap();
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{basic_bin_manifest, main_file, execs, project};
+use hamcrest::{assert_that, existing_file};
+
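+// A build should leave a dep-info (`.d`) file next to the compiled binary.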
+#[test]
+fn build_dep_info() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ let depinfo_bin_path = &p.bin("foo").with_extension("d");
+
+ assert_that(depinfo_bin_path, existing_file());
+}
+
+#[test]
+fn build_dep_info_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[example]]
+ name = "ex"
+ crate-type = ["lib"]
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=ex"), execs().with_status(0));
+ assert_that(&p.example_lib("ex", "lib").with_extension("d"), existing_file());
+}
+
+#[test]
+fn build_dep_info_rlib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[example]]
+ name = "ex"
+ crate-type = ["rlib"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=ex"), execs().with_status(0));
+ assert_that(&p.example_lib("ex", "rlib").with_extension("d"), existing_file());
+}
+
+#[test]
+fn build_dep_info_dylib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[example]]
+ name = "ex"
+ crate-type = ["dylib"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=ex"), execs().with_status(0));
+ assert_that(&p.example_lib("ex", "dylib").with_extension("d"), existing_file());
+}
--- /dev/null
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+#[macro_use]
+extern crate serde_derive;
+extern crate serde_json;
+
+use std::collections::HashMap;
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::str;
+
+use cargotest::cargo_process;
+use cargotest::support::git;
+use cargotest::support::paths;
+use cargotest::support::registry::{Package, cksum};
+use cargotest::support::{project, execs, ProjectBuilder};
+use hamcrest::assert_that;
+
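+// Replace crates.io with a local directory source so the tests below resolve
+// their dependencies from vendored packages on disk.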
+fn setup() {
+ let root = paths::root();
+ t!(fs::create_dir(&root.join(".cargo")));
+ t!(t!(File::create(root.join(".cargo/config"))).write_all(br#"
+ [source.crates-io]
+ replace-with = 'my-awesome-local-registry'
+
+ [source.my-awesome-local-registry]
+ directory = 'index'
+ "#));
+}
+
+struct VendorPackage {
+ p: Option<ProjectBuilder>,
+ cksum: Checksum,
+}
+
+#[derive(Serialize)]
+struct Checksum {
+ package: Option<String>,
+ files: HashMap<String, String>,
+}
+
+impl VendorPackage {
+ fn new(name: &str) -> VendorPackage {
+ VendorPackage {
+ p: Some(project(&format!("index/{}", name))),
+ cksum: Checksum {
+ package: Some(String::new()),
+ files: HashMap::new(),
+ },
+ }
+ }
+
+ fn file(&mut self, name: &str, contents: &str) -> &mut VendorPackage {
+ self.p = Some(self.p.take().unwrap().file(name, contents));
+ self.cksum.files.insert(name.to_string(), cksum(contents.as_bytes()));
+ self
+ }
+
+ fn disable_checksum(&mut self) -> &mut VendorPackage {
+ self.cksum.package = None;
+ self
+ }
+
+ fn build(&mut self) {
+ let p = self.p.take().unwrap();
+ let json = serde_json::to_string(&self.cksum).unwrap();
+ let p = p.file(".cargo-checksum.json", &json);
+ let _ = p.build();
+ }
+}
+
+#[test]
+fn simple() {
+ setup();
+
+ VendorPackage::new("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+
+ pub fn bar() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.1.0 ([..]bar)
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn simple_install() {
+ setup();
+
+ VendorPackage::new("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ VendorPackage::new("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate foo;
+
+ pub fn main() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(cargo_process().arg("install").arg("bar"),
+ execs().with_status(0).with_stderr(
+" Installing bar v0.1.0
+ Compiling foo v0.1.0
+ Compiling bar v0.1.0
+ Finished release [optimized] target(s) in [..] secs
+ Installing [..]bar[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+"));
+}
+
+#[test]
+fn simple_install_fail() {
+ setup();
+
+ VendorPackage::new("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ VendorPackage::new("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ baz = "9.8.7"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate foo;
+
+ pub fn main() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(cargo_process().arg("install").arg("bar"),
+ execs().with_status(101).with_stderr(
+" Installing bar v0.1.0
+error: failed to compile `bar v0.1.0`, intermediate artifacts can be found at `[..]`
+
+Caused by:
+ no matching package named `baz` found (required by `bar`)
+location searched: registry `https://github.com/rust-lang/crates.io-index`
+version required: ^9.8.7
+"));
+}
+
+#[test]
+fn install_without_feature_dep() {
+ setup();
+
+ VendorPackage::new("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ VendorPackage::new("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ baz = { version = "9.8.7", optional = true }
+
+ [features]
+ wantbaz = ["baz"]
+ "#)
+ .file("src/main.rs", r#"
+ extern crate foo;
+
+ pub fn main() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(cargo_process().arg("install").arg("bar"),
+ execs().with_status(0).with_stderr(
+" Installing bar v0.1.0
+ Compiling foo v0.1.0
+ Compiling bar v0.1.0
+ Finished release [optimized] target(s) in [..] secs
+ Installing [..]bar[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+"));
+}
+
+#[test]
+fn not_there() {
+ setup();
+
+ let _ = project("index").build();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+
+ pub fn bar() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: no matching package named `foo` found (required by `bar`)
+location searched: [..]
+version required: ^0.1.0
+"));
+}
+
+#[test]
+fn multiple() {
+ setup();
+
+ VendorPackage::new("foo-0.1.0")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .file(".cargo-checksum", "")
+ .build();
+
+ VendorPackage::new("foo-0.2.0")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .file(".cargo-checksum", "")
+ .build();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+
+ pub fn bar() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.1.0 ([..]bar)
+[FINISHED] [..]
+"));
+}
+
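+// Build against crates.io first, then switch to a directory source whose
+// package checksum matches the one already recorded in Cargo.lock.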
+#[test]
+fn crates_io_then_directory() {
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+
+ pub fn bar() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ let cksum = Package::new("foo", "0.1.0")
+ .file("src/lib.rs", "pub fn foo() -> u32 { 0 }")
+ .publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] foo v0.1.0 ([..])
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.1.0 ([..]bar)
+[FINISHED] [..]
+"));
+
+ setup();
+
+ let mut v = VendorPackage::new("foo");
+ v.file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#);
+ v.file("src/lib.rs", "pub fn foo() -> u32 { 1 }");
+ v.cksum.package = Some(cksum);
+ v.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.1.0 ([..]bar)
+[FINISHED] [..]
+"));
+}
+
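+// A vendored copy whose checksum differs from the one in Cargo.lock must be
+// rejected.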
+#[test]
+fn crates_io_then_bad_checksum() {
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ Package::new("foo", "0.1.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ setup();
+
+ VendorPackage::new("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: checksum for `foo v0.1.0` changed between lock files
+
+this could be indicative of a few possible errors:
+
+ * the lock file is corrupt
+ * a replacement source in use (e.g. a mirror) returned a different checksum
+ * the source itself may be corrupt in one way or another
+
+unable to verify that `foo v0.1.0` is the same as when the lockfile was generated
+
+"));
+}
+
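+// Editing a file inside a directory source breaks its recorded file checksum.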
+#[test]
+fn bad_file_checksum() {
+ setup();
+
+ VendorPackage::new("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let mut f = t!(File::create(paths::root().join("index/foo/src/lib.rs")));
+ t!(f.write_all(b"fn foo() -> u32 { 0 }"));
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: the listed checksum of `[..]lib.rs` has changed:
+expected: [..]
+actual: [..]
+
+directory sources are not intended to be edited, if modifications are \
+required then it is recommended that [replace] is used with a forked copy of \
+the source
+"));
+}
+
+#[test]
+fn only_dot_files_ok() {
+ setup();
+
+ VendorPackage::new("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ VendorPackage::new("bar")
+ .file(".foo", "")
+ .build();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
+#[test]
+fn git_lock_file_doesnt_change() {
+ let git = git::new("git", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "git"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ VendorPackage::new("git")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "git"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .disable_checksum()
+ .build();
+
+ let p = project("bar")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ git = {{ git = '{0}' }}
+ "#, git.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ let mut lock1 = String::new();
+ t!(t!(File::open(p.root().join("Cargo.lock"))).read_to_string(&mut lock1));
+
+ let root = paths::root();
+ t!(fs::create_dir(&root.join(".cargo")));
+ t!(t!(File::create(root.join(".cargo/config"))).write_all(&format!(r#"
+ [source.my-git-repo]
+ git = '{}'
+ replace-with = 'my-awesome-local-registry'
+
+ [source.my-awesome-local-registry]
+ directory = 'index'
+ "#, git.url()).as_bytes()));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] [..]
+[COMPILING] [..]
+[FINISHED] [..]
+"));
+
+ let mut lock2 = String::new();
+ t!(t!(File::open(p.root().join("Cargo.lock"))).read_to_string(&mut lock2));
+ assert!(lock1 == lock2, "lock files changed");
+}
+
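+// Replacing a git source with a directory source needs an existing lock file:
+// the vendored copy carries no git revision to resolve against.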
+#[test]
+fn git_override_requires_lockfile() {
+ VendorPackage::new("git")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "git"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .disable_checksum()
+ .build();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ git = { git = 'https://example.com/' }
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let root = paths::root();
+ t!(fs::create_dir(&root.join(".cargo")));
+ t!(t!(File::create(root.join(".cargo/config"))).write_all(br#"
+ [source.my-git-repo]
+ git = 'https://example.com/'
+ replace-with = 'my-awesome-local-registry'
+
+ [source.my-awesome-local-registry]
+ directory = 'index'
+ "#));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to load source for a dependency on `git`
+
+Caused by:
+ Unable to update [..]
+
+Caused by:
+ the source my-git-repo requires a lock file to be present first before it can be
+used against vendored source code
+
+remove the source replacement configuration, generate a lock file, and then
+restore the source replacement configuration to continue the build
+
+"));
+}
--- /dev/null
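+// Tests for `cargo doc`: output locations, dependency handling, target-name
+// collisions, and interaction with features and workspaces.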
+extern crate cargotest;
+extern crate hamcrest;
+extern crate cargo;
+
+use std::str;
+use std::fs::{self, File};
+use std::io::Read;
+
+use cargotest::rustc_host;
+use cargotest::support::{project, execs, path2url};
+use cargotest::support::registry::Package;
+use hamcrest::{assert_that, existing_file, existing_dir, is_not};
+use cargo::util::{CargoError, CargoErrorKind};
+
+#[test]
+fn simple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0).with_stderr(&format!("\
+[..] foo v0.0.1 ({dir})
+[..] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = path2url(p.root()))));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+}
+
+#[test]
+fn doc_no_libs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "foo"
+ doc = false
+ "#)
+ .file("src/main.rs", r#"
+ bad code
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+}
+
+#[test]
+fn doc_twice() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0).with_stderr(&format!("\
+[DOCUMENTING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = path2url(p.root()))));
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0).with_stdout(""))
+}
+
+#[test]
+fn doc_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar;
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0).with_stderr(&format!("\
+[..] bar v0.0.1 ({dir}/bar)
+[..] bar v0.0.1 ({dir}/bar)
+[DOCUMENTING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = path2url(p.root()))));
+
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+ assert_that(&p.root().join("target/doc/bar/index.html"), existing_file());
+
+ assert_that(p.cargo("doc")
+ .env("RUST_LOG", "cargo::ops::cargo_rustc::fingerprint"),
+ execs().with_status(0).with_stdout(""));
+
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+ assert_that(&p.root().join("target/doc/bar/index.html"), existing_file());
+}
+
+#[test]
+fn doc_no_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar;
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc").arg("--no-deps"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({dir}/bar)
+[DOCUMENTING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = path2url(p.root()))));
+
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+ assert_that(&p.root().join("target/doc/bar/index.html"), is_not(existing_file()));
+}
+
+#[test]
+fn doc_only_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc").arg("-v"),
+ execs().with_status(0));
+
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/bar/index.html"), existing_file());
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+}
+
+#[test]
+fn doc_multiple_targets_same_name_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ [lib]
+ name = "foo_lib"
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ [lib]
+ name = "foo_lib"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("doc").arg("--all"),
+ execs()
+ .with_status(101)
+ .with_stderr_contains("[..] library `foo_lib` is specified [..]")
+ .with_stderr_contains("[..] `foo v0.1.0[..]` [..]")
+ .with_stderr_contains("[..] `bar v0.1.0[..]` [..]"));
+}
+
+#[test]
+fn doc_multiple_targets_same_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ [[bin]]
+ name = "foo_lib"
+ path = "src/foo_lib.rs"
+ "#)
+ .file("foo/src/foo_lib.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ [lib]
+ name = "foo_lib"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ let root = path2url(p.root());
+
+ assert_that(p.cargo("doc").arg("--all"),
+ execs()
+ .with_status(0)
+ .with_stderr_contains(&format!("[DOCUMENTING] foo v0.1.0 ({}/foo)", root))
+ .with_stderr_contains(&format!("[DOCUMENTING] bar v0.1.0 ({}/bar)", root))
+ .with_stderr_contains("[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]"));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ let doc_file = p.root().join("target/doc/foo_lib/index.html");
+ assert_that(&doc_file, existing_file());
+}
+
+#[test]
+fn doc_multiple_targets_same_name_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ [[bin]]
+ name = "foo-cli"
+ "#)
+ .file("foo/src/foo-cli.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ [[bin]]
+ name = "foo-cli"
+ "#)
+ .file("bar/src/foo-cli.rs", "")
+ .build();
+
+ assert_that(p.cargo("doc").arg("--all"),
+ execs()
+ .with_status(101)
+ .with_stderr_contains("[..] binary `foo_cli` is specified [..]")
+ .with_stderr_contains("[..] `foo v0.1.0[..]` [..]")
+ .with_stderr_contains("[..] `bar v0.1.0[..]` [..]"));
+}
+
+#[test]
+fn doc_multiple_targets_same_name_undoced() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ [[bin]]
+ name = "foo-cli"
+ "#)
+ .file("foo/src/foo-cli.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ [[bin]]
+ name = "foo-cli"
+ doc = false
+ "#)
+ .file("bar/src/foo-cli.rs", "")
+ .build();
+
+ assert_that(p.cargo("doc").arg("--all"),
+ execs().with_status(0));
+}
+
+#[test]
+fn doc_lib_bin_same_name_documents_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ //! Binary documentation
+ extern crate foo;
+ fn main() {
+ foo::foo();
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ //! Library documentation
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0).with_stderr(&format!("\
+[DOCUMENTING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ let doc_file = p.root().join("target/doc/foo/index.html");
+ assert_that(&doc_file, existing_file());
+ let mut doc_html = String::new();
+ File::open(&doc_file).unwrap().read_to_string(&mut doc_html).unwrap();
+ assert!(doc_html.contains("Library"));
+ assert!(!doc_html.contains("Binary"));
+}
+
+#[test]
+fn doc_lib_bin_same_name_documents_lib_when_requested() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ //! Binary documentation
+ extern crate foo;
+ fn main() {
+ foo::foo();
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ //! Library documentation
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc").arg("--lib"),
+ execs().with_status(0).with_stderr(&format!("\
+[DOCUMENTING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ let doc_file = p.root().join("target/doc/foo/index.html");
+ assert_that(&doc_file, existing_file());
+ let mut doc_html = String::new();
+ File::open(&doc_file).unwrap().read_to_string(&mut doc_html).unwrap();
+ assert!(doc_html.contains("Library"));
+ assert!(!doc_html.contains("Binary"));
+}
+
+#[test]
+fn doc_lib_bin_same_name_documents_named_bin_when_requested() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ //! Binary documentation
+ extern crate foo;
+ fn main() {
+ foo::foo();
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ //! Library documentation
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc").arg("--bin").arg("foo"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[DOCUMENTING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ let doc_file = p.root().join("target/doc/foo/index.html");
+ assert_that(&doc_file, existing_file());
+ let mut doc_html = String::new();
+ File::open(&doc_file).unwrap().read_to_string(&mut doc_html).unwrap();
+ assert!(!doc_html.contains("Library"));
+ assert!(doc_html.contains("Binary"));
+}
+
+#[test]
+fn doc_lib_bin_same_name_documents_bins_when_requested() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ //! Binary documentation
+ extern crate foo;
+ fn main() {
+ foo::foo();
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ //! Library documentation
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc").arg("--bins"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[DOCUMENTING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ let doc_file = p.root().join("target/doc/foo/index.html");
+ assert_that(&doc_file, existing_file());
+ let mut doc_html = String::new();
+ File::open(&doc_file).unwrap().read_to_string(&mut doc_html).unwrap();
+ assert!(!doc_html.contains("Library"));
+ assert!(doc_html.contains("Binary"));
+}
+
+#[test]
+fn doc_dash_p() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "extern crate a;")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.b]
+ path = "../b"
+ "#)
+ .file("a/src/lib.rs", "extern crate b;")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("doc").arg("-p").arg("a"),
+ execs().with_status(0)
+ .with_stderr("\
+[..] b v0.0.1 (file://[..])
+[..] b v0.0.1 (file://[..])
+[DOCUMENTING] a v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn doc_same_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/main.rs", "fn main() {}")
+ .file("examples/main.rs", "fn main() {}")
+ .file("tests/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+}
+
+#[test]
+fn doc_target() {
+ const TARGET: &'static str = "arm-unknown-linux-gnueabihf";
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(no_core)]
+ #![no_core]
+
+ extern {
+ pub static A: u32;
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc").arg("--target").arg(TARGET).arg("--verbose"),
+ execs().with_status(0));
+ assert_that(&p.root().join(&format!("target/{}/doc", TARGET)), existing_dir());
+ assert_that(&p.root().join(&format!("target/{}/doc/foo/index.html", TARGET)), existing_file());
+}
+
+#[test]
+fn target_specific_not_documented() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [target.foo.dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "not rust")
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+}
+
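+// rustdoc's stderr must pass through to the user: the broken doc-test below
+// should surface both the snowman and rustdoc's error message.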
+#[test]
+fn output_not_captured() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "
+ /// ```
+ /// ☃
+ /// ```
+ pub fn foo() {}
+ ")
+ .build();
+
+ let error = p.cargo("doc").exec_with_output().err().unwrap();
+ if let CargoError(CargoErrorKind::ProcessErrorKind(perr), ..) = error {
+ let output = perr.output.unwrap();
+ let stderr = str::from_utf8(&output.stderr).unwrap();
+
+ assert!(stderr.contains("☃"), "no snowman\n{}", stderr);
+ assert!(stderr.contains("unknown start of token"), "no message{}", stderr);
+ } else {
+ panic!("an error kind other than ProcessErrorKind was encountered");
+ }
+}
+
+#[test]
+fn target_specific_documented() {
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [target.foo.dependencies]
+ a = {{ path = "a" }}
+ [target.{}.dependencies]
+ a = {{ path = "a" }}
+ "#, rustc_host()))
+ .file("src/lib.rs", "
+ extern crate a;
+
+ /// test
+ pub fn foo() {}
+ ")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "
+ /// test
+ pub fn foo() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+}
+
+#[test]
+fn no_document_build_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [build-dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "
+ /// ```
+ /// ☃
+ /// ```
+ pub fn foo() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+}
+
+#[test]
+fn doc_release() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--release"),
+ execs().with_status(0));
+ assert_that(p.cargo("doc").arg("--release").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[DOCUMENTING] foo v0.0.1 ([..])
+[RUNNING] `rustdoc [..] src[/]lib.rs [..]`
+[FINISHED] release [optimized] target(s) in [..]
+"));
+}
+
+#[test]
+fn doc_multiple_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+
+ [dependencies.baz]
+ path = "baz"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar;
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("baz/src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc")
+ .arg("-p").arg("bar")
+ .arg("-p").arg("baz")
+ .arg("-v"),
+ execs().with_status(0));
+
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/bar/index.html"), existing_file());
+ assert_that(&p.root().join("target/doc/baz/index.html"), existing_file());
+}
+
+#[test]
+fn features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+
+ [features]
+ foo = ["bar/bar"]
+ "#)
+ .file("src/lib.rs", r#"
+ #[cfg(feature = "foo")]
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ bar = []
+ "#)
+ .file("bar/build.rs", r#"
+ fn main() {
+ println!("cargo:rustc-cfg=bar");
+ }
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[cfg(feature = "bar")]
+ pub fn bar() {}
+ "#)
+ .build();
+ assert_that(p.cargo("doc").arg("--features").arg("foo"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/doc"), existing_dir());
+ assert_that(&p.root().join("target/doc/foo/fn.foo.html"), existing_file());
+ assert_that(&p.root().join("target/doc/bar/fn.bar.html"), existing_file());
+}
+
+#[test]
+fn rerun_when_dir_removed() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ /// dox
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+
+ fs::remove_dir_all(p.root().join("target/doc/foo")).unwrap();
+
+ assert_that(p.cargo("doc"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+}
+
+#[test]
+fn document_only_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ /// dox
+ pub fn foo() {}
+ "#)
+ .file("src/bin/bar.rs", r#"
+ /// ```
+ /// ☃
+ /// ```
+ pub fn foo() {}
+ fn main() { foo(); }
+ "#)
+ .build();
+ assert_that(p.cargo("doc").arg("--lib"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target/doc/foo/index.html"), existing_file());
+}
+
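+// proc-macro crates are always built for the host, so documenting one with a
+// foreign `--target` should still succeed.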
+#[test]
+fn plugins_no_use_target() {
+ if !cargotest::is_nightly() {
+ return
+ }
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ proc-macro = true
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("doc")
+ .arg("--target=x86_64-unknown-openbsd")
+ .arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn doc_all_workspace() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [workspace]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ // The order in which bar is compiled or documented is not deterministic
+ assert_that(p.cargo("doc")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] Documenting bar v0.1.0 ([..])")
+ .with_stderr_contains("[..] Compiling bar v0.1.0 ([..])")
+ .with_stderr_contains("[..] Documenting foo v0.1.0 ([..])"));
+}
+
+#[test]
+fn doc_all_virtual_manifest() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ // The order in which foo and bar are documented is not guaranteed
+ assert_that(p.cargo("doc")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] Documenting bar v0.1.0 ([..])")
+ .with_stderr_contains("[..] Documenting foo v0.1.0 ([..])"));
+}
+
+#[test]
+fn doc_virtual_manifest_all_implied() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo", "bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ // The order in which foo and bar are documented is not guaranteed
+ assert_that(p.cargo("doc"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] Documenting bar v0.1.0 ([..])")
+ .with_stderr_contains("[..] Documenting foo v0.1.0 ([..])"));
+}
+
+#[test]
+fn doc_all_member_dependency_same_name() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["a"]
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+
+ [dependencies]
+ a = "0.1.0"
+ "#)
+ .file("a/src/lib.rs", r#"
+ pub fn a() {}
+ "#)
+ .build();
+
+ Package::new("a", "0.1.0").publish();
+
+ assert_that(p.cargo("doc")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stderr_contains("[..] Updating registry `[..]`")
+ .with_stderr_contains("[..] Documenting a v0.1.0 ([..])"));
+}
--- /dev/null
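+// Tests for Cargo features: manifest validation, activation from the command
+// line, feature unification, and freshness across feature changes.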
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::File;
+use std::io::prelude::*;
+
+use cargotest::support::paths::CargoPathExt;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn invalid1() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ bar = ["baz"]
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Feature `bar` includes `baz` which is neither a dependency nor another feature
+"));
+}
+
+#[test]
+fn invalid2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ bar = ["baz"]
+
+ [dependencies.bar]
+ path = "foo"
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Features and dependencies cannot have the same name: `bar`
+"));
+}
+
+#[test]
+fn invalid3() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ bar = ["baz"]
+
+ [dependencies.baz]
+ path = "foo"
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Feature `bar` depends on `baz` which is not an optional dependency.
+Consider adding `optional = true` to the dependency
+"));
+}
+
+#[test]
+fn invalid4() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ features = ["bar"]
+ "#)
+ .file("src/main.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Package `bar v0.0.1 ([..])` does not have these features: `bar`
+"));
+
+ p.change_file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#);
+
+ assert_that(p.cargo("build").arg("--features").arg("test"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Package `foo v0.0.1 ([..])` does not have these features: `test`
+"));
+}
+
+#[test]
+fn invalid5() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dev-dependencies.bar]
+ path = "bar"
+ optional = true
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Dev-dependencies are not allowed to be optional: `bar`
+"));
+}
+
+#[test]
+fn invalid6() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ foo = ["bar/baz"]
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Feature `foo` requires a feature of `bar` which is not a dependency
+"));
+}
+
+#[test]
+fn invalid7() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ foo = ["bar/baz"]
+ bar = []
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ Feature `foo` requires a feature of `bar` which is not a dependency
+"));
+}
+
+#[test]
+fn invalid8() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ features = ["foo/bar"]
+ "#)
+ .file("src/main.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] feature names may not contain slashes: `foo/bar`
+"));
+}
+
+#[test]
+fn invalid9() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("bar"),
+ execs().with_status(0).with_stderr("\
+warning: Package `foo v0.0.1 ([..])` does not have feature `bar`. It has a required dependency with \
+that name, but only optional dependencies can be used as features. [..]
+ Compiling bar v0.0.1 ([..])
+ Compiling foo v0.0.1 ([..])
+ Finished dev [unoptimized + debuginfo] target(s) in [..] secs
+"));
+}
+
+#[test]
+fn invalid10() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ features = ["baz"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.baz]
+ path = "baz"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .file("bar/baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/baz/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+warning: Package `bar v0.0.1 ([..])` does not have feature `baz`. It has a required dependency with \
+that name, but only optional dependencies can be used as features. [..]
+ Compiling baz v0.0.1 ([..])
+ Compiling bar v0.0.1 ([..])
+ Compiling foo v0.0.1 ([..])
+ Finished dev [unoptimized + debuginfo] target(s) in [..] secs
+"));
+}
+
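+// Nested feature paths like `derived/bar/qux`, which reach through a
+// dependency into its own dependency, are rejected.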
+#[test]
+fn no_transitive_dep_feature_requirement() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.derived]
+ path = "derived"
+
+ [features]
+ default = ["derived/bar/qux"]
+ "#)
+ .file("src/main.rs", r#"
+ extern crate derived;
+ fn main() { derived::test(); }
+ "#)
+ .file("derived/Cargo.toml", r#"
+ [package]
+ name = "derived"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("derived/src/lib.rs", r#"
+ extern crate bar;
+ pub use bar::test;
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ qux = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[cfg(feature = "qux")]
+ pub fn test() { print!("test"); }
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] feature names may not contain slashes: `bar/qux`
+"));
+}
+
+#[test]
+fn no_feature_doesnt_build() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ optional = true
+ "#)
+ .file("src/main.rs", r#"
+ #[cfg(feature = "bar")]
+ extern crate bar;
+ #[cfg(feature = "bar")]
+ fn main() { bar::bar(); println!("bar") }
+ #[cfg(not(feature = "bar"))]
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+ assert_that(p.process(&p.bin("foo")),
+ execs().with_status(0).with_stdout(""));
+
+ assert_that(p.cargo("build").arg("--features").arg("bar"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] bar v0.0.1 ({dir}/bar)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+ assert_that(p.process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("bar\n"));
+}
+
+#[test]
+fn default_feature_pulled_in() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["bar"]
+
+ [dependencies.bar]
+ path = "bar"
+ optional = true
+ "#)
+ .file("src/main.rs", r#"
+ #[cfg(feature = "bar")]
+ extern crate bar;
+ #[cfg(feature = "bar")]
+ fn main() { bar::bar(); println!("bar") }
+ #[cfg(not(feature = "bar"))]
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] bar v0.0.1 ({dir}/bar)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+ assert_that(p.process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("bar\n"));
+
+ assert_that(p.cargo("build").arg("--no-default-features"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+ assert_that(p.process(&p.bin("foo")),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn cyclic_feature() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["default"]
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Cyclic feature dependency: feature `default` depends on itself
+"));
+}
+
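+// A `foo` <-> `bar` feature cycle is tolerated as long as neither feature is
+// actually requested.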
+#[test]
+fn cyclic_feature2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ foo = ["bar"]
+ bar = ["foo"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn groups_on_groups_on_groups() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["f1"]
+ f1 = ["f2", "bar"]
+ f2 = ["f3", "f4"]
+ f3 = ["f5", "f6", "baz"]
+ f4 = ["f5", "f7"]
+ f5 = ["f6"]
+ f6 = ["f7"]
+ f7 = ["bar"]
+
+ [dependencies.bar]
+ path = "bar"
+ optional = true
+
+ [dependencies.baz]
+ path = "baz"
+ optional = true
+ "#)
+ .file("src/main.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate bar;
+ #[allow(unused_extern_crates)]
+ extern crate baz;
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .file("baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("baz/src/lib.rs", "pub fn baz() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+}
+
+#[test]
+fn many_cli_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ optional = true
+
+ [dependencies.baz]
+ path = "baz"
+ optional = true
+ "#)
+ .file("src/main.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate bar;
+ #[allow(unused_extern_crates)]
+ extern crate baz;
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .file("baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("baz/src/lib.rs", "pub fn baz() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("bar baz"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+}
+
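+// Two dependants enabling different features of `d2` get the union: `d2` is
+// compiled once with both `f1` and `f2` active.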
+#[test]
+fn union_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.d1]
+ path = "d1"
+ features = ["f1"]
+ [dependencies.d2]
+ path = "d2"
+ features = ["f2"]
+ "#)
+ .file("src/main.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate d1;
+ extern crate d2;
+ fn main() {
+ d2::f1();
+ d2::f2();
+ }
+ "#)
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ f1 = ["d2"]
+
+ [dependencies.d2]
+ path = "../d2"
+ features = ["f1"]
+ optional = true
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ f1 = []
+ f2 = []
+ "#)
+ .file("d2/src/lib.rs", r#"
+ #[cfg(feature = "f1")] pub fn f1() {}
+ #[cfg(feature = "f2")] pub fn f2() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] d2 v0.0.1 ({dir}/d2)
+[COMPILING] d1 v0.0.1 ({dir}/d1)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+}
+
+#[test]
+fn many_features_no_rebuilds() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies.a]
+ path = "a"
+ features = ["fall"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ ftest = []
+ ftest2 = []
+ fall = ["ftest", "ftest2"]
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] a v0.1.0 ({dir}/a)
+[COMPILING] b v0.1.0 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+ p.root().move_into_the_past();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[FRESH] a v0.1.0 ([..]/a)
+[FRESH] b v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+// Tests that all cmd lines work with `--features ""`
+#[test]
+fn empty_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg(""),
+ execs().with_status(0));
+}
+
+// Tests that a feature can transitively enable a feature of a dependency (`bar/baz`)
+#[test]
+fn transitive_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ foo = ["bar/baz"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", "
+ extern crate bar;
+ fn main() { bar::baz(); }
+ ")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ baz = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[cfg(feature = "baz")]
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn everything_in_the_lockfile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ f1 = ["d1/f1"]
+ f2 = ["d2"]
+
+ [dependencies.d1]
+ path = "d1"
+ [dependencies.d2]
+ path = "d2"
+ optional = true
+ [dependencies.d3]
+ path = "d3"
+ optional = true
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ f1 = []
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.2"
+ authors = []
+ "#)
+ .file("d2/src/lib.rs", "")
+ .file("d3/Cargo.toml", r#"
+ [package]
+ name = "d3"
+ version = "0.0.3"
+ authors = []
+
+ [features]
+ f3 = []
+ "#)
+ .file("d3/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("fetch"), execs().with_status(0));
+ let loc = p.root().join("Cargo.lock");
+ let mut lockfile = String::new();
+ t!(t!(File::open(&loc)).read_to_string(&mut lockfile));
+ assert!(lockfile.contains(r#"name = "d1""#), "d1 not found\n{}", lockfile);
+ assert!(lockfile.contains(r#"name = "d2""#), "d2 not found\n{}", lockfile);
+ assert!(lockfile.contains(r#"name = "d3""#), "d3 not found\n{}", lockfile);
+}
+
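+// `a` resolves to one feature set for the whole graph, so repeated builds
+// stay fresh even though `b` disables `a`'s default features.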
+#[test]
+fn no_rebuild_when_frobbing_default_feature() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ b = { path = "b" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ a = { path = "../a", features = ["f1"], default-features = false }
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ default = ["f1"]
+ f1 = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn unions_work_with_no_default_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ b = { path = "b" }
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate a;
+ pub fn foo() { a::a(); }
+ "#)
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ a = { path = "../a", features = [], default-features = false }
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ default = ["f1"]
+ f1 = []
+ "#)
+ .file("a/src/lib.rs", r#"
+ #[cfg(feature = "f1")]
+ pub fn a() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn optional_and_dev_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "test"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = { path = "foo", optional = true }
+ [dev-dependencies]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] test v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn activating_feature_activates_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "test"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = { path = "foo", optional = true }
+
+ [features]
+ a = ["foo/a"]
+ "#)
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() {
+ foo::bar();
+ }
+ ")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ a = []
+ "#)
+ .file("foo/src/lib.rs", r#"
+ #[cfg(feature = "a")]
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("a").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn dep_feature_in_cmd_line() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.derived]
+ path = "derived"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate derived;
+ fn main() { derived::test(); }
+ "#)
+ .file("derived/Cargo.toml", r#"
+ [package]
+ name = "derived"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+
+ [features]
+ default = []
+ derived-feat = ["bar/some-feat"]
+ "#)
+ .file("derived/src/lib.rs", r#"
+ extern crate bar;
+ pub use bar::test;
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ some-feat = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[cfg(feature = "some-feat")]
+ pub fn test() { print!("test"); }
+ "#)
+ .build();
+
+ // The foo binary requires that the "some-feat" feature of "bar" be enabled.
+ // Building without any features enabled should fail:
+ assert_that(p.cargo("build"),
+ execs().with_status(101));
+
+ // Enabling "derived-feat" on the command line turns on "some-feat"
+ // transitively, so the build should succeed:
+ assert_that(p.cargo("build").arg("--features").arg("derived/derived-feat"),
+ execs().with_status(0));
+
+ // Trying to enable features of transitive dependencies is an error
+ assert_that(p.cargo("build").arg("--features").arg("bar/some-feat"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Package `foo v0.0.1 ([..])` does not have these features: `bar`
+"));
+
+ // Hierarchical feature specification should still be disallowed
+ assert_that(p.cargo("build").arg("--features").arg("derived/bar/some-feat"),
+ execs().with_status(101).with_stderr("\
+[ERROR] feature names may not contain slashes: `bar/some-feat`
+"));
+}
+
+#[test]
+fn all_features_flag_enables_all_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ foo = []
+ bar = []
+
+ [dependencies.baz]
+ path = "baz"
+ optional = true
+ "#)
+ .file("src/main.rs", r#"
+ #[cfg(feature = "foo")]
+ pub fn foo() {}
+
+ #[cfg(feature = "bar")]
+ pub fn bar() {
+ extern crate baz;
+ baz::baz();
+ }
+
+ fn main() {
+ foo();
+ bar();
+ }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("baz/src/lib.rs", "pub fn baz() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--all-features"),
+ execs().with_status(0));
+}
+
+#[test]
+fn many_cli_features_comma_delimited() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ optional = true
+
+ [dependencies.baz]
+ path = "baz"
+ optional = true
+ "#)
+ .file("src/main.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate bar;
+ #[allow(unused_extern_crates)]
+ extern crate baz;
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .file("baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("baz/src/lib.rs", "pub fn baz() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("bar,baz"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+}
+
+#[test]
+fn many_cli_features_comma_and_space_delimited() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ optional = true
+
+ [dependencies.baz]
+ path = "baz"
+ optional = true
+
+ [dependencies.bam]
+ path = "bam"
+ optional = true
+
+ [dependencies.bap]
+ path = "bap"
+ optional = true
+ "#)
+ .file("src/main.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate bar;
+ #[allow(unused_extern_crates)]
+ extern crate baz;
+ #[allow(unused_extern_crates)]
+ extern crate bam;
+ #[allow(unused_extern_crates)]
+ extern crate bap;
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .file("baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("baz/src/lib.rs", "pub fn baz() {}")
+ .file("bam/Cargo.toml", r#"
+ [package]
+ name = "bam"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bam/src/lib.rs", "pub fn bam() {}")
+ .file("bap/Cargo.toml", r#"
+ [package]
+ name = "bap"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bap/src/lib.rs", "pub fn bap() {}")
+ .build();
+
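+ // Commas and spaces may be mixed as separators within a single --features value.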
+ assert_that(p.cargo("build").arg("--features").arg("bar,baz bam bap"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] ba[..] v0.0.1 ({dir}/ba[..])
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn no_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", r#"
+ mod a; fn main() {}
+ "#)
+ .file("src/a.rs", "")
+ .build();
+
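+ // With no remote dependencies there is nothing to download, so fetch prints nothing.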
+ assert_that(p.cargo("fetch"),
+ execs().with_status(0).with_stdout(""));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+
+use cargotest::sleep_ms;
+use cargotest::support::{project, execs, path2url};
+use cargotest::support::paths::CargoPathExt;
+use hamcrest::{assert_that, existing_file};
+
+#[test]
+fn modifying_and_moving() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", r#"
+ mod a; fn main() {}
+ "#)
+ .file("src/a.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+ p.root().move_into_the_past();
+ p.root().join("target").move_into_the_past();
+
+ File::create(&p.root().join("src/a.rs")).unwrap()
+ .write_all(b"#[allow(unused)]fn main() {}").unwrap();
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+
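+ // Renaming the module file breaks `mod a;` in src/main.rs, so this build fails.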
+ fs::rename(&p.root().join("src/a.rs"), &p.root().join("src/b.rs")).unwrap();
+ assert_that(p.cargo("build"),
+ execs().with_status(101));
+}
+
+#[test]
+fn modify_only_some_files() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "mod a;")
+ .file("src/a.rs", "")
+ .file("src/main.rs", r#"
+ mod b;
+ fn main() {}
+ "#)
+ .file("src/b.rs", "")
+ .file("tests/test.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+ sleep_ms(1000);
+
+ assert_that(&p.bin("foo"), existing_file());
+
+ let lib = p.root().join("src/lib.rs");
+ let bin = p.root().join("src/b.rs");
+
+ File::create(&lib).unwrap().write_all(b"invalid rust code").unwrap();
+ File::create(&bin).unwrap().write_all(b"#[allow(unused)]fn foo() {}").unwrap();
+ lib.move_into_the_past();
+
+ // Make sure the binary is rebuilt, not the lib
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = path2url(p.root()))));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+fn rebuild_sub_package_then_whole_package() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [dependencies.a]
+ path = "a"
+ [dependencies.b]
+ path = "b"
+ "#)
+ .file("src/lib.rs", "extern crate a; extern crate b;")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ authors = []
+ version = "0.0.1"
+ [dependencies.b]
+ path = "../b"
+ "#)
+ .file("a/src/lib.rs", "extern crate b;")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ File::create(&p.root().join("b/src/lib.rs")).unwrap().write_all(br#"
+ pub fn b() {}
+ "#).unwrap();
+
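+ // Rebuild just the sub-package `b` first; the whole-package build afterwards
+ // must pick up the rebuilt `b` without error.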
+ assert_that(p.cargo("build").arg("-pb"),
+ execs().with_status(0));
+
+ File::create(&p.root().join("src/lib.rs")).unwrap().write_all(br#"
+ extern crate a;
+ extern crate b;
+ pub fn toplevel() {}
+ "#).unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn changing_lib_features_caches_targets() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [features]
+ foo = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[..]Compiling foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build").arg("--features").arg("foo"),
+ execs().with_status(0)
+ .with_stderr("\
+[..]Compiling foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ /* Targets should be cached from the first build */
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stdout(""));
+
+ assert_that(p.cargo("build").arg("--features").arg("foo"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn changing_profiles_caches_targets() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [profile.dev]
+ panic = "abort"
+
+ [profile.test]
+ panic = "unwind"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[..]Compiling foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr("\
+[..]Compiling foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[..]debug[..]deps[..]foo-[..][EXE]
+[DOCTEST] foo
+"));
+
+ /* Targets should be cached from the first build */
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("test").arg("foo"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[..]debug[..]deps[..]foo-[..][EXE]
+[DOCTEST] foo
+"));
+}
+
+#[test]
+fn changing_bin_paths_common_target_features_caches_targets() {
+ // Make sure the dep_crate crate is built once per feature set
+ let p = project("foo")
+ .file(".cargo/config", r#"
+ [build]
+ target-dir = "./target"
+ "#)
+ .file("dep_crate/Cargo.toml", r#"
+ [package]
+ name = "dep_crate"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ ftest = []
+ "#)
+ .file("dep_crate/src/lib.rs", r#"
+ #[cfg(feature = "ftest")]
+ pub fn yo() {
+ println!("ftest on")
+ }
+ #[cfg(not(feature = "ftest"))]
+ pub fn yo() {
+ println!("ftest off")
+ }
+ "#)
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ dep_crate = {path = "../dep_crate", features = []}
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("a/src/main.rs", r#"
+ extern crate dep_crate;
+ use dep_crate::yo;
+ fn main() {
+ yo();
+ }
+ "#)
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ dep_crate = {path = "../dep_crate", features = ["ftest"]}
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("b/src/main.rs", r#"
+ extern crate dep_crate;
+ use dep_crate::yo;
+ fn main() {
+ yo();
+ }
+ "#)
+ .build();
+
+ /* Build and rebuild a/. Ensure dep_crate only builds once */
+ assert_that(p.cargo("run").cwd(p.root().join("a")),
+ execs().with_status(0)
+ .with_stdout("ftest off")
+ .with_stderr("\
+[..]Compiling dep_crate v0.0.1 ([..])
+[..]Compiling a v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]target[/]debug[/]a[EXE]`
+"));
+ assert_that(p.cargo("clean").arg("-p").arg("a").cwd(p.root().join("a")),
+ execs().with_status(0));
+ assert_that(p.cargo("run").cwd(p.root().join("a")),
+ execs().with_status(0)
+ .with_stdout("ftest off")
+ .with_stderr("\
+[..]Compiling a v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]target[/]debug[/]a[EXE]`
+"));
+
+ /* Build and rebuild b/. Ensure dep_crate only builds once */
+ assert_that(p.cargo("run").cwd(p.root().join("b")),
+ execs().with_status(0)
+ .with_stdout("ftest on")
+ .with_stderr("\
+[..]Compiling dep_crate v0.0.1 ([..])
+[..]Compiling b v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]target[/]debug[/]b[EXE]`
+"));
+ assert_that(p.cargo("clean").arg("-p").arg("b").cwd(p.root().join("b")),
+ execs().with_status(0));
+ assert_that(p.cargo("run").cwd(p.root().join("b")),
+ execs().with_status(0)
+ .with_stdout("ftest on")
+ .with_stderr("\
+[..]Compiling b v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]target[/]debug[/]b[EXE]`
+"));
+
+ /* Build a/ package again. If we cache different feature dep builds correctly,
+ * this should not cause a rebuild of dep_crate */
+ assert_that(p.cargo("clean").arg("-p").arg("a").cwd(p.root().join("a")),
+ execs().with_status(0));
+ assert_that(p.cargo("run").cwd(p.root().join("a")),
+ execs().with_status(0)
+ .with_stdout("ftest off")
+ .with_stderr("\
+[..]Compiling a v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]target[/]debug[/]a[EXE]`
+"));
+
+ /* Build b/ package again. If we cache different feature dep builds correctly,
+ * this should not cause a rebuild */
+ assert_that(p.cargo("clean").arg("-p").arg("b").cwd(p.root().join("b")),
+ execs().with_status(0));
+ assert_that(p.cargo("run").cwd(p.root().join("b")),
+ execs().with_status(0)
+ .with_stdout("ftest on")
+ .with_stderr("\
+[..]Compiling b v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]target[/]debug[/]b[EXE]`
+"));
+}
+
+#[test]
+fn changing_bin_features_caches_targets() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [features]
+ foo = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ let msg = if cfg!(feature = "foo") { "feature on" } else { "feature off" };
+ println!("{}", msg);
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stdout("feature off")
+ .with_stderr("\
+[..]Compiling foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo[EXE]`
+"));
+
+ assert_that(p.cargo("run").arg("--features").arg("foo"),
+ execs().with_status(0)
+ .with_stdout("feature on")
+ .with_stderr("\
+[..]Compiling foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo[EXE]`
+"));
+
+ /* Targets should be cached from the first build */
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stdout("feature off")
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo[EXE]`
+"));
+
+ assert_that(p.cargo("run").arg("--features").arg("foo"),
+ execs().with_status(0)
+ .with_stdout("feature on")
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo[EXE]`
+"));
+}
+
+#[test]
+fn rebuild_tests_if_lib_changes() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .file("tests/foo.rs", r#"
+ extern crate foo;
+ #[test]
+ fn test() { foo::foo(); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+
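+ // Truncate src/lib.rs: the library still builds, but the integration test,
+ // which calls the now-missing `foo::foo`, fails to compile.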
+ sleep_ms(1000);
+ File::create(&p.root().join("src/lib.rs")).unwrap();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(101));
+}
+
+#[test]
+fn no_rebuild_transitive_target_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ [dev-dependencies]
+ b = { path = "b" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("tests/foo.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [target.foo.dependencies]
+ c = { path = "../c" }
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ c = { path = "../c" }
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("c/Cargo.toml", r#"
+ [package]
+ name = "c"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("c/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("test").arg("--no-run"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] c v0.0.1 ([..])
+[COMPILING] b v0.0.1 ([..])
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn rerun_if_changed_in_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("a/build.rs", r#"
+ fn main() {
+ println!("cargo:rerun-if-changed=build.rs");
+ }
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
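+ // `cargo:rerun-if-changed=build.rs` restricts re-running the build script to
+ // changes in build.rs itself, so the second build is a complete no-op.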
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn same_build_dir_cached_packages() {
+ let p = project("foo")
+ .file("a1/Cargo.toml", r#"
+ [package]
+ name = "a1"
+ version = "0.0.1"
+ authors = []
+ [dependencies]
+ b = { path = "../b" }
+ "#)
+ .file("a1/src/lib.rs", "")
+ .file("a2/Cargo.toml", r#"
+ [package]
+ name = "a2"
+ version = "0.0.1"
+ authors = []
+ [dependencies]
+ b = { path = "../b" }
+ "#)
+ .file("a2/src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ [dependencies]
+ c = { path = "../c" }
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("c/Cargo.toml", r#"
+ [package]
+ name = "c"
+ version = "0.0.1"
+ authors = []
+ [dependencies]
+ d = { path = "../d" }
+ "#)
+ .file("c/src/lib.rs", "")
+ .file("d/Cargo.toml", r#"
+ [package]
+ name = "d"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("d/src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ target-dir = "./target"
+ "#)
+ .build();
+
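+ // a1 and a2 share `./target` via the config above, so the b -> c -> d chain is
+ // compiled once for a1 and simply reused when building a2.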
+ assert_that(p.cargo("build").cwd(p.root().join("a1")),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] d v0.0.1 ({dir}/d)
+[COMPILING] c v0.0.1 ({dir}/c)
+[COMPILING] b v0.0.1 ({dir}/b)
+[COMPILING] a1 v0.0.1 ({dir}/a1)
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+ assert_that(p.cargo("build").cwd(p.root().join("a2")),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] a2 v0.0.1 ({dir}/a2)
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+}
+
+#[test]
+fn no_rebuild_if_build_artifacts_move_backwards_in_time() {
+ let p = project("backwards_in_time")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "backwards_in_time"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ p.root().move_into_the_past();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout("").with_stderr("\
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn rebuild_if_build_artifacts_move_forward_in_time() {
+ let p = project("forwards_in_time")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "forwards_in_time"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ p.root().move_into_the_future();
+
+ assert_that(p.cargo("build").env("RUST_LOG", ""),
+ execs().with_status(0).with_stdout("").with_stderr("\
+[COMPILING] a v0.0.1 ([..])
+[COMPILING] forwards_in_time v0.0.1 ([..])
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn rebuild_if_environment_changes() {
+ let p = project("env_change")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "env_change"
+ description = "old desc"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ println!("{}", env!("CARGO_PKG_DESCRIPTION"));
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stdout("old desc").with_stderr(&format!("\
+[COMPILING] env_change v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]env_change[EXE]`
+", dir = p.url())));
+
+ File::create(&p.root().join("Cargo.toml")).unwrap().write_all(br#"
+ [package]
+ name = "env_change"
+ description = "new desc"
+ version = "0.0.1"
+ authors = []
+ "#).unwrap();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stdout("new desc").with_stderr(&format!("\
+[COMPILING] env_change v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]env_change[EXE]`
+", dir = p.url())));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+
+use cargotest::support::{project, execs};
+use hamcrest::{assert_that, existing_file, is_not};
+
+#[test]
+fn adding_and_removing_packages() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+
+ let toml = p.root().join("Cargo.toml");
+ let lock1 = p.read_lockfile();
+
+ // add a dep
+ File::create(&toml).unwrap().write_all(br#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+
+ [dependencies.bar]
+ path = "bar"
+ "#).unwrap();
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+ let lock2 = p.read_lockfile();
+ assert!(lock1 != lock2);
+
+ // change the dep
+ File::create(&p.root().join("bar/Cargo.toml")).unwrap().write_all(br#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.2"
+ "#).unwrap();
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+ let lock3 = p.read_lockfile();
+ assert!(lock1 != lock3);
+ assert!(lock2 != lock3);
+
+ // remove the dep
+ println!("lock4");
+ File::create(&toml).unwrap().write_all(br#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#).unwrap();
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+ let lock4 = p.read_lockfile();
+ assert_eq!(lock1, lock4);
+}
+
+#[test]
+fn preserve_metadata() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+
+ let metadata = r#"
+[metadata]
+bar = "baz"
+foo = "bar"
+"#;
+ let lockfile = p.root().join("Cargo.lock");
+ let lock = p.read_lockfile();
+ let data = lock + metadata;
+ File::create(&lockfile).unwrap().write_all(data.as_bytes()).unwrap();
+
+ // Build and make sure the metadata is still there
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ let lock = p.read_lockfile();
+ assert!(lock.contains(metadata.trim()), "{}", lock);
+
+ // Update and make sure the metadata is still there
+ assert_that(p.cargo("update"),
+ execs().with_status(0));
+ let lock = p.read_lockfile();
+ assert!(lock.contains(metadata.trim()), "{}", lock);
+}
+
+#[test]
+fn preserve_line_endings_issue_2076() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ let lockfile = p.root().join("Cargo.lock");
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+ assert_that(&lockfile,
+ existing_file());
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+
+ let lock0 = p.read_lockfile();
+
+ assert!(lock0.starts_with("[[package]]\n"));
+
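+ // Rewrite the lockfile with CRLF line endings; regenerating it should keep them.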
+ let lock1 = lock0.replace("\n", "\r\n");
+ {
+ File::create(&lockfile).unwrap().write_all(lock1.as_bytes()).unwrap();
+ }
+
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+
+ let lock2 = p.read_lockfile();
+
+ assert!(lock2.starts_with("[[package]]\r\n"));
+ assert_eq!(lock1, lock2);
+}
+
+#[test]
+fn cargo_update_generate_lockfile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ let lockfile = p.root().join("Cargo.lock");
+ assert_that(&lockfile, is_not(existing_file()));
+ assert_that(p.cargo("update"), execs().with_status(0).with_stdout(""));
+ assert_that(&lockfile, existing_file());
+
+ fs::remove_file(p.root().join("Cargo.lock")).unwrap();
+
+ assert_that(&lockfile, is_not(existing_file()));
+ assert_that(p.cargo("update"), execs().with_status(0).with_stdout(""));
+ assert_that(&lockfile, existing_file());
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate git2;
+extern crate hamcrest;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::path::Path;
+
+use cargo::util::process;
+use cargotest::sleep_ms;
+use cargotest::support::paths::{self, CargoPathExt};
+use cargotest::support::{git, project, execs, main_file, path2url};
+use hamcrest::{assert_that,existing_file};
+
+#[test]
+fn cargo_compile_simple_git_dep() {
+ let project = project("foo");
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+
+ name = "dep1"
+ "#)
+ .file("src/dep1.rs", r#"
+ pub fn hello() -> &'static str {
+ "hello world"
+ }
+ "#)
+ }).unwrap();
+
+ let project = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ git = '{}'
+ "#, git_project.url()))
+ .file("src/main.rs", &main_file(r#""{}", dep1::hello()"#, &["dep1"]))
+ .build();
+
+ let root = project.root();
+ let git_root = git_project.root();
+
+ assert_that(project.cargo("build"),
+ execs()
+ .with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [COMPILING] dep1 v0.5.0 ({}#[..])\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n",
+ path2url(git_root.clone()),
+ path2url(git_root),
+ path2url(root))));
+
+ assert_that(&project.bin("foo"), existing_file());
+
+ assert_that(
+ process(&project.bin("foo")),
+ execs().with_stdout("hello world\n"));
+}
+
+#[test]
+fn cargo_compile_git_dep_branch() {
+ let project = project("foo");
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+
+ name = "dep1"
+ "#)
+ .file("src/dep1.rs", r#"
+ pub fn hello() -> &'static str {
+ "hello world"
+ }
+ "#)
+ }).unwrap();
+
+ // Make a new branch based on the current HEAD commit
+ let repo = git2::Repository::open(&git_project.root()).unwrap();
+ let head = repo.head().unwrap().target().unwrap();
+ let head = repo.find_commit(head).unwrap();
+ repo.branch("branchy", &head, true).unwrap();
+
+ let project = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ git = '{}'
+ branch = "branchy"
+
+ "#, git_project.url()))
+ .file("src/main.rs", &main_file(r#""{}", dep1::hello()"#, &["dep1"]))
+ .build();
+
+ let root = project.root();
+ let git_root = git_project.root();
+
+ assert_that(project.cargo("build"),
+ execs()
+ .with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [COMPILING] dep1 v0.5.0 ({}?branch=branchy#[..])\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n",
+ path2url(git_root.clone()),
+ path2url(git_root),
+ path2url(root))));
+
+ assert_that(&project.bin("foo"), existing_file());
+
+ assert_that(
+ process(&project.bin("foo")),
+ execs().with_stdout("hello world\n"));
+}
+
+#[test]
+fn cargo_compile_git_dep_tag() {
+ let project = project("foo");
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+
+ name = "dep1"
+ "#)
+ .file("src/dep1.rs", r#"
+ pub fn hello() -> &'static str {
+ "hello world"
+ }
+ "#)
+ }).unwrap();
+
+ // Make a tag corresponding to the current HEAD
+ let repo = git2::Repository::open(&git_project.root()).unwrap();
+ let head = repo.head().unwrap().target().unwrap();
+ repo.tag("v0.1.0",
+ &repo.find_object(head, None).unwrap(),
+ &repo.signature().unwrap(),
+ "make a new tag",
+ false).unwrap();
+
+ let project = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ git = '{}'
+ tag = "v0.1.0"
+ "#, git_project.url()))
+ .file("src/main.rs", &main_file(r#""{}", dep1::hello()"#, &["dep1"]))
+ .build();
+
+ let root = project.root();
+ let git_root = git_project.root();
+
+ assert_that(project.cargo("build"),
+ execs()
+ .with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [COMPILING] dep1 v0.5.0 ({}?tag=v0.1.0#[..])\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n",
+ path2url(git_root.clone()),
+ path2url(git_root),
+ path2url(root))));
+
+ assert_that(&project.bin("foo"), existing_file());
+
+ assert_that(process(&project.bin("foo")),
+ execs().with_stdout("hello world\n"));
+
+ assert_that(project.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn cargo_compile_with_nested_paths() {
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [dependencies.dep2]
+
+ version = "0.5.0"
+ path = "vendor/dep2"
+
+ [lib]
+
+ name = "dep1"
+ "#)
+ .file("src/dep1.rs", r#"
+ extern crate dep2;
+
+ pub fn hello() -> &'static str {
+ dep2::hello()
+ }
+ "#)
+ .file("vendor/dep2/Cargo.toml", r#"
+ [project]
+
+ name = "dep2"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+
+ name = "dep2"
+ "#)
+ .file("vendor/dep2/src/dep2.rs", r#"
+ pub fn hello() -> &'static str {
+ "hello world"
+ }
+ "#)
+ }).unwrap();
+
+ let p = project("parent")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "parent"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ version = "0.5.0"
+ git = '{}'
+
+ [[bin]]
+
+ name = "parent"
+ "#, git_project.url()))
+ .file("src/parent.rs",
+ &main_file(r#""{}", dep1::hello()"#, &["dep1"]))
+ .build();
+
+ p.cargo("build")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("parent"), existing_file());
+
+ assert_that(process(&p.bin("parent")),
+ execs().with_stdout("hello world\n"));
+}
+
+#[test]
+fn cargo_compile_with_malformed_nested_paths() {
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+
+ name = "dep1"
+ "#)
+ .file("src/dep1.rs", r#"
+ pub fn hello() -> &'static str {
+ "hello world"
+ }
+ "#)
+ .file("vendor/dep2/Cargo.toml", r#"
+ !INVALID!
+ "#)
+ }).unwrap();
+
+ let p = project("parent")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "parent"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ version = "0.5.0"
+ git = '{}'
+
+ [[bin]]
+
+ name = "parent"
+ "#, git_project.url()))
+ .file("src/parent.rs",
+ &main_file(r#""{}", dep1::hello()"#, &["dep1"]))
+ .build();
+
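+ // The malformed vendor/dep2/Cargo.toml is never read, since dep1 does not
+ // declare a dependency on it, so the build still succeeds.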
+ p.cargo("build")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("parent"), existing_file());
+
+ assert_that(process(&p.bin("parent")),
+ execs().with_stdout("hello world\n"));
+}
+
+#[test]
+fn cargo_compile_with_meta_package() {
+ let git_project = git::new("meta-dep", |project| {
+ project
+ .file("dep1/Cargo.toml", r#"
+ [project]
+
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+
+ name = "dep1"
+ "#)
+ .file("dep1/src/dep1.rs", r#"
+ pub fn hello() -> &'static str {
+ "this is dep1"
+ }
+ "#)
+ .file("dep2/Cargo.toml", r#"
+ [project]
+
+ name = "dep2"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+
+ name = "dep2"
+ "#)
+ .file("dep2/src/dep2.rs", r#"
+ pub fn hello() -> &'static str {
+ "this is dep2"
+ }
+ "#)
+ }).unwrap();
+
+ let p = project("parent")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "parent"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ version = "0.5.0"
+ git = '{}'
+
+ [dependencies.dep2]
+
+ version = "0.5.0"
+ git = '{}'
+
+ [[bin]]
+
+ name = "parent"
+ "#, git_project.url(), git_project.url()))
+ .file("src/parent.rs",
+ &main_file(r#""{} {}", dep1::hello(), dep2::hello()"#, &["dep1", "dep2"]))
+ .build();
+
+ p.cargo("build")
+ .exec_with_output()
+ .unwrap();
+
+ assert_that(&p.bin("parent"), existing_file());
+
+ assert_that(process(&p.bin("parent")),
+ execs().with_stdout("this is dep1 this is dep2\n"));
+}
+
+#[test]
+fn cargo_compile_with_short_ssh_git() {
+ let url = "git@github.com:a/dep";
+
+ let project = project("project")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep]
+
+ git = "{}"
+
+ [[bin]]
+
+ name = "foo"
+ "#, url))
+ .file("src/foo.rs", &main_file(r#""{}", dep1::hello()"#, &["dep1"]))
+ .build();
+
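+ // SCP-like `git@host:path` shorthand is not a valid URL, so manifest parsing
+ // fails; an explicit `ssh://` URL would be required instead.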
+ assert_that(project.cargo("build"),
+ execs()
+ .with_stdout("")
+ .with_stderr(&format!("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ invalid url `{}`: relative URL without a base
+", url)));
+}
+
+#[test]
+fn two_revs_same_deps() {
+ let bar = git::new("meta-dep", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ }).unwrap();
+
+ let repo = git2::Repository::open(&bar.root()).unwrap();
+ let rev1 = repo.revparse_single("HEAD").unwrap().id();
+
+ // Commit a change so that rev1 and rev2 name two different revisions
+ File::create(&bar.root().join("src/lib.rs")).unwrap().write_all(br#"
+ pub fn bar() -> i32 { 2 }
+ "#).unwrap();
+ git::add(&repo);
+ let rev2 = git::commit(&repo);
+
+ let foo = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.bar]
+ git = '{}'
+ rev = "{}"
+
+ [dependencies.baz]
+ path = "../baz"
+ "#, bar.url(), rev1))
+ .file("src/main.rs", r#"
+ extern crate bar;
+ extern crate baz;
+
+ fn main() {
+ assert_eq!(bar::bar(), 1);
+ assert_eq!(baz::baz(), 2);
+ }
+ "#)
+ .build();
+
+ let _baz = project("baz")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "baz"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.bar]
+ git = '{}'
+ rev = "{}"
+ "#, bar.url(), rev2))
+ .file("src/lib.rs", r#"
+ extern crate bar;
+ pub fn baz() -> i32 { bar::bar() }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("build").arg("-v"),
+ execs().with_status(0));
+ assert_that(&foo.bin("foo"), existing_file());
+ assert_that(foo.process(&foo.bin("foo")), execs().with_status(0));
+}
+
+#[test]
+fn recompilation() {
+ let git_project = git::new("bar", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("src/bar.rs", r#"
+ pub fn bar() {}
+ "#)
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+
+ version = "0.5.0"
+ git = '{}'
+ "#, git_project.url()))
+ .file("src/main.rs",
+ &main_file(r#""{:?}", bar::bar()"#, &["bar"]))
+ .build();
+
+ // First time around we should compile both foo and bar
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [COMPILING] bar v0.5.0 ({}#[..])\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ git_project.url(),
+ git_project.url(),
+ p.url())));
+
+ // Don't recompile the second time
+ assert_that(p.cargo("build"),
+ execs().with_stdout(""));
+
+ // Modify a file manually, shouldn't trigger a recompile
+ File::create(&git_project.root().join("src/bar.rs")).unwrap().write_all(br#"
+ pub fn bar() { println!("hello!"); }
+ "#).unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_stdout(""));
+
+ assert_that(p.cargo("update"),
+ execs().with_stderr(&format!("[UPDATING] git repository `{}`",
+ git_project.url())));
+
+ assert_that(p.cargo("build"),
+ execs().with_stdout(""));
+
+ // Commit the changes and make sure we don't trigger a recompile, since the
+ // lockfile still pins the dependency to the old revision
+ let repo = git2::Repository::open(&git_project.root()).unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ println!("compile after commit");
+ assert_that(p.cargo("build"),
+ execs().with_stdout(""));
+ p.root().move_into_the_past();
+
+ // Update the dependency and carry on!
+ assert_that(p.cargo("update"),
+ execs().with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [UPDATING] bar v0.5.0 ([..]) -> #[..]\n\
+ ",
+ git_project.url())));
+ println!("going for the last compile");
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] bar v0.5.0 ({}#[..])\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ git_project.url(),
+ p.url())));
+
+ // Make sure clean only cleans one dep
+ assert_that(p.cargo("clean")
+ .arg("-p").arg("foo"),
+ execs().with_stdout(""));
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url())));
+}
+
+#[test]
+fn update_with_shared_deps() {
+ let git_project = git::new("bar", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("src/bar.rs", r#"
+ pub fn bar() {}
+ "#)
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+ path = "dep1"
+ [dependencies.dep2]
+ path = "dep2"
+ "#)
+ .file("src/main.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate dep1;
+ #[allow(unused_extern_crates)]
+ extern crate dep2;
+ fn main() {}
+ "#)
+ .file("dep1/Cargo.toml", &format!(r#"
+ [package]
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ version = "0.5.0"
+ git = '{}'
+ "#, git_project.url()))
+ .file("dep1/src/lib.rs", "")
+ .file("dep2/Cargo.toml", &format!(r#"
+ [package]
+ name = "dep2"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ version = "0.5.0"
+ git = '{}'
+ "#, git_project.url()))
+ .file("dep2/src/lib.rs", "")
+ .build();
+
+ // First time around we should compile both foo and bar
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("\
+[UPDATING] git repository `{git}`
+[COMPILING] bar v0.5.0 ({git}#[..])
+[COMPILING] [..] v0.5.0 ([..])
+[COMPILING] [..] v0.5.0 ([..])
+[COMPILING] foo v0.5.0 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n",
+git = git_project.url(), dir = p.url())));
+
+ // Modify a file manually, and commit it
+ File::create(&git_project.root().join("src/bar.rs")).unwrap().write_all(br#"
+ pub fn bar() { println!("hello!"); }
+ "#).unwrap();
+ let repo = git2::Repository::open(&git_project.root()).unwrap();
+ let old_head = repo.head().unwrap().target().unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ sleep_ms(1000);
+
+ // By default, updates are not transitive: updating dep1 leaves bar untouched
+ println!("dep1 update");
+ assert_that(p.cargo("update")
+ .arg("-p").arg("dep1"),
+ execs().with_stdout(""));
+
+ // Don't do anything bad on a weird --precise argument
+ println!("bar bad precise update");
+ assert_that(p.cargo("update")
+ .arg("-p").arg("bar")
+ .arg("--precise").arg("0.1.2"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] git repository [..]
+[ERROR] Unable to update [..]
+
+To learn more, run the command again with --verbose.
+"));
+
+ // Specifying a precise rev to the old rev shouldn't actually update
+ // anything because we already have the rev in the db.
+ println!("bar precise update");
+ assert_that(p.cargo("update")
+ .arg("-p").arg("bar")
+ .arg("--precise").arg(&old_head.to_string()),
+ execs().with_stdout(""));
+
+ // Updating aggressively should, however, update the repo.
+ println!("dep1 aggressive update");
+ assert_that(p.cargo("update")
+ .arg("-p").arg("dep1")
+ .arg("--aggressive"),
+ execs().with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [UPDATING] bar v0.5.0 ([..]) -> #[..]\n\
+ ", git_project.url())));
+
+ // Make sure we still only compile one version of the git repo
+ println!("build");
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("\
+[COMPILING] bar v0.5.0 ({git}#[..])
+[COMPILING] [..] v0.5.0 ({dir}[..]dep[..])
+[COMPILING] [..] v0.5.0 ({dir}[..]dep[..])
+[COMPILING] foo v0.5.0 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n",
+ git = git_project.url(), dir = p.url())));
+
+ // We should be able to update transitive deps
+ assert_that(p.cargo("update").arg("-p").arg("bar"),
+ execs().with_stderr(&format!("[UPDATING] git repository `{}`",
+ git_project.url())));
+}
+
+#[test]
+fn dep_with_submodule() {
+ let project = project("foo");
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [package]
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+ "#)
+ }).unwrap();
+ let git_project2 = git::new("dep2", |project| {
+ project.file("lib.rs", "pub fn dep() {}")
+ }).unwrap();
+
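+ // Add git_project2 as the `src` submodule of dep1, so dep1's sources live in
+ // the submodule.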
+ let repo = git2::Repository::open(&git_project.root()).unwrap();
+ let url = path2url(git_project2.root()).to_string();
+ git::add_submodule(&repo, &url, Path::new("src"));
+ git::commit(&repo);
+
+ let project = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ git = '{}'
+ "#, git_project.url()))
+ .file("src/lib.rs", "
+ extern crate dep1;
+ pub fn foo() { dep1::dep() }
+ ")
+ .build();
+
+ assert_that(project.cargo("build"),
+ execs().with_stderr("\
+[UPDATING] git repository [..]
+[COMPILING] dep1 [..]
+[COMPILING] foo [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n").with_status(0));
+}
+
+#[test]
+fn dep_with_bad_submodule() {
+ let project = project("foo");
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [package]
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+ "#)
+ }).unwrap();
+ let git_project2 = git::new("dep2", |project| {
+ project.file("lib.rs", "pub fn dep() {}")
+ }).unwrap();
+
+ let repo = git2::Repository::open(&git_project.root()).unwrap();
+ let url = path2url(git_project2.root()).to_string();
+ git::add_submodule(&repo, &url, Path::new("src"));
+ git::commit(&repo);
+
+ // Now amend the first commit on git_project2 so that the submodule's recorded
+ // ref points to a commit that no longer exists
+ let repo = git2::Repository::open(&git_project2.root()).unwrap();
+ let original_submodule_ref = repo.refname_to_id("refs/heads/master").unwrap();
+ let commit = repo.find_commit(original_submodule_ref).unwrap();
+ commit.amend(
+ Some("refs/heads/master"),
+ None,
+ None,
+ None,
+ Some("something something"),
+ None).unwrap();
+
+ let p = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ git = '{}'
+ "#, git_project.url()))
+ .file("src/lib.rs", "
+ extern crate dep1;
+ pub fn foo() { dep1::dep() }
+ ")
+ .build();
+
+ let expected = format!("\
+[UPDATING] git repository [..]
+[ERROR] failed to load source for a dependency on `dep1`
+
+Caused by:
+ Unable to update {}
+
+Caused by:
+ failed to update submodule `src`
+
+To learn more, run the command again with --verbose.\n", path2url(git_project.root()));
+
+ assert_that(p.cargo("build"),
+ execs().with_stderr(expected).with_status(101));
+}
+
+#[test]
+fn two_deps_only_update_one() {
+ let project = project("foo");
+ let git1 = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [package]
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+ let git2 = git::new("dep2", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [package]
+ name = "dep2"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+ git = '{}'
+ [dependencies.dep2]
+ git = '{}'
+ "#, git1.url(), git2.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_stderr(&format!("[UPDATING] git repository `[..]`\n\
+ [UPDATING] git repository `[..]`\n\
+ [COMPILING] [..] v0.5.0 ([..])\n\
+ [COMPILING] [..] v0.5.0 ([..])\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n",
+ p.url())));
+
+ File::create(&git1.root().join("src/lib.rs")).unwrap().write_all(br#"
+ pub fn foo() {}
+ "#).unwrap();
+ let repo = git2::Repository::open(&git1.root()).unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ assert_that(p.cargo("update")
+ .arg("-p").arg("dep1"),
+ execs()
+ .with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [UPDATING] dep1 v0.5.0 ([..]) -> #[..]\n\
+ ", git1.url())));
+}
+
+#[test]
+fn stale_cached_version() {
+ let bar = git::new("meta-dep", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ }).unwrap();
+
+ // Update the git database in the cache with the current state of the git
+ // repo
+ let foo = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [dependencies.bar]
+ git = '{}'
+ "#, bar.url()))
+ .file("src/main.rs", r#"
+ extern crate bar;
+
+ fn main() { assert_eq!(bar::bar(), 1) }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("build"), execs().with_status(0));
+ assert_that(foo.process(&foo.bin("foo")), execs().with_status(0));
+
+ // Update the repo, and simulate someone else updating the lockfile and then
+ // us pulling it down.
+ File::create(&bar.root().join("src/lib.rs")).unwrap().write_all(br#"
+ pub fn bar() -> i32 { 1 + 0 }
+ "#).unwrap();
+ let repo = git2::Repository::open(&bar.root()).unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ sleep_ms(1000);
+
+ let rev = repo.revparse_single("HEAD").unwrap().id();
+
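+ // Hand-write a lockfile pinning `bar` to the new revision, as if a lockfile
+ // updated elsewhere had just been pulled down.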
+ File::create(&foo.root().join("Cargo.lock")).unwrap().write_all(format!(r#"
+ [[package]]
+ name = "foo"
+ version = "0.0.0"
+ dependencies = [
+ 'bar 0.0.0 (git+{url}#{hash})'
+ ]
+
+ [[package]]
+ name = "bar"
+ version = "0.0.0"
+ source = 'git+{url}#{hash}'
+ "#, url = bar.url(), hash = rev).as_bytes()).unwrap();
+
+ // Now build!
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[UPDATING] git repository `{bar}`
+[COMPILING] bar v0.0.0 ({bar}#[..])
+[COMPILING] foo v0.0.0 ({foo})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", bar = bar.url(), foo = foo.url())));
+ assert_that(foo.process(&foo.bin("foo")), execs().with_status(0));
+}
+
+#[test]
+fn dep_with_changed_submodule() {
+ let project = project("foo");
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [package]
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+ "#)
+ }).unwrap();
+
+ let git_project2 = git::new("dep2", |project| {
+ project
+ .file("lib.rs", "pub fn dep() -> &'static str { \"project2\" }")
+ }).unwrap();
+
+ let git_project3 = git::new("dep3", |project| {
+ project
+ .file("lib.rs", "pub fn dep() -> &'static str { \"project3\" }")
+ }).unwrap();
+
+ let repo = git2::Repository::open(&git_project.root()).unwrap();
+ let mut sub = git::add_submodule(&repo, &git_project2.url().to_string(),
+ Path::new("src"));
+ git::commit(&repo);
+
+ let p = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ [dependencies.dep1]
+ git = '{}'
+ "#, git_project.url()))
+ .file("src/main.rs", "
+ extern crate dep1;
+ pub fn main() { println!(\"{}\", dep1::dep()) }
+ ")
+ .build();
+
+ println!("first run");
+ assert_that(p.cargo("run"), execs()
+ .with_stderr("[UPDATING] git repository `[..]`\n\
+ [COMPILING] dep1 v0.5.0 ([..])\n\
+ [COMPILING] foo v0.5.0 ([..])\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in \
+ [..]\n\
+ [RUNNING] `target[/]debug[/]foo[EXE]`\n")
+ .with_stdout("project2\n")
+ .with_status(0));
+
+ File::create(&git_project.root().join(".gitmodules")).unwrap()
+ .write_all(format!("[submodule \"src\"]\n\tpath = src\n\turl={}",
+ git_project3.url()).as_bytes()).unwrap();
+
+ // Sync the submodule and reset it to the new remote.
+ sub.sync().unwrap();
+ {
+ let subrepo = sub.open().unwrap();
+ subrepo.remote_add_fetch("origin",
+ "refs/heads/*:refs/heads/*").unwrap();
+ subrepo.remote_set_url("origin",
+ &git_project3.url().to_string()).unwrap();
+ let mut origin = subrepo.find_remote("origin").unwrap();
+ origin.fetch(&[], None, None).unwrap();
+ let id = subrepo.refname_to_id("refs/remotes/origin/master").unwrap();
+ let obj = subrepo.find_object(id, None).unwrap();
+ subrepo.reset(&obj, git2::ResetType::Hard, None).unwrap();
+ }
+ sub.add_to_index(true).unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ sleep_ms(1000);
+ // Update the dependency and carry on!
+ println!("update");
+ assert_that(p.cargo("update").arg("-v"),
+ execs()
+ .with_stderr("")
+ .with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [UPDATING] dep1 v0.5.0 ([..]) -> #[..]\n\
+ ", git_project.url())));
+
+ println!("last run");
+ assert_that(p.cargo("run"), execs()
+ .with_stderr("[COMPILING] dep1 v0.5.0 ([..])\n\
+ [COMPILING] foo v0.5.0 ([..])\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in \
+ [..]\n\
+ [RUNNING] `target[/]debug[/]foo[EXE]`\n")
+ .with_stdout("project3\n")
+ .with_status(0));
+}
+
+#[test]
+fn dev_deps_with_testing() {
+ let p2 = git::new("bar", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn gimme() -> &'static str { "zoidberg" }
+ "#)
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dev-dependencies.bar]
+ version = "0.5.0"
+ git = '{}'
+ "#, p2.url()))
+ .file("src/main.rs", r#"
+ fn main() {}
+
+ #[cfg(test)]
+ mod tests {
+ extern crate bar;
+ #[test] fn foo() { bar::gimme(); }
+ }
+ "#)
+ .build();
+
+ // Generate a lockfile: the build does not compile the dev-dependency `bar`,
+ // but `bar` still has to be updated to produce the lockfile
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("\
+[UPDATING] git repository `{bar}`
+[COMPILING] foo v0.5.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = p.url(), bar = p2.url())));
+
+ // Make sure we use the previous resolution of `bar` instead of updating it
+ // a second time.
+ assert_that(p.cargo("test"),
+ execs().with_stderr("\
+[COMPILING] [..] v0.5.0 ([..])
+[COMPILING] [..] v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("test tests::foo ... ok"));
+}
+
+#[test]
+fn git_build_cmd_freshness() {
+ let foo = git::new("foo", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ .file(".gitignore", "
+ src/bar.rs
+ ")
+ }).unwrap();
+ foo.root().move_into_the_past();
+
+ sleep_ms(1000);
+
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo.url())));
+
+ // Smoke test to make sure it doesn't compile again
+ println!("first pass");
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stdout(""));
+
+ // Modify an ignored file and make sure we don't rebuild
+ println!("second pass");
+ File::create(&foo.root().join("src/bar.rs")).unwrap();
+ assert_that(foo.cargo("build"),
+ execs().with_status(0)
+ .with_stdout(""));
+}
+
+#[test]
+fn git_name_not_always_needed() {
+ let p2 = git::new("bar", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn gimme() -> &'static str { "zoidberg" }
+ "#)
+ }).unwrap();
+
+ let repo = git2::Repository::open(&p2.root()).unwrap();
+ let mut cfg = repo.config().unwrap();
+ let _ = cfg.remove("user.name");
+ let _ = cfg.remove("user.email");
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dev-dependencies.bar]
+ git = '{}'
+ "#, p2.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ // Generate a lockfile: the build does not compile the dev-dependency `bar`,
+ // but `bar` still has to be updated to produce the lockfile
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("\
+[UPDATING] git repository `{bar}`
+[COMPILING] foo v0.5.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = p.url(), bar = p2.url())));
+}
+
+#[test]
+fn git_repo_changing_no_rebuild() {
+ let bar = git::new("bar", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ }).unwrap();
+
+ // Lock p1 to the first rev in the git repo
+ let p1 = project("p1")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "p1"
+ version = "0.5.0"
+ authors = []
+ build = 'build.rs'
+ [dependencies.bar]
+ git = '{}'
+ "#, bar.url()))
+ .file("src/main.rs", "fn main() {}")
+ .file("build.rs", "fn main() {}")
+ .build();
+ p1.root().move_into_the_past();
+ assert_that(p1.cargo("build"),
+ execs().with_stderr(&format!("\
+[UPDATING] git repository `{bar}`
+[COMPILING] [..]
+[COMPILING] [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", bar = bar.url())));
+
+ // Make a commit to lock p2 to a different rev
+ File::create(&bar.root().join("src/lib.rs")).unwrap().write_all(br#"
+ pub fn bar() -> i32 { 2 }
+ "#).unwrap();
+ let repo = git2::Repository::open(&bar.root()).unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ // Lock p2 to the second rev
+ let p2 = project("p2")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "p2"
+ version = "0.5.0"
+ authors = []
+ [dependencies.bar]
+ git = '{}'
+ "#, bar.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+ assert_that(p2.cargo("build"),
+ execs().with_stderr(&format!("\
+[UPDATING] git repository `{bar}`
+[COMPILING] [..]
+[COMPILING] [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", bar = bar.url())));
+
+ // And now for the real test! Make sure that p1 doesn't get rebuilt
+ // even though the git repo has changed.
+ assert_that(p1.cargo("build"),
+ execs().with_stdout(""));
+}
+
+#[test]
+fn git_dep_build_cmd() {
+ let p = git::new("foo", |project| {
+ project.file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+
+ version = "0.5.0"
+ path = "bar"
+
+ [[bin]]
+
+ name = "foo"
+ "#)
+ .file("src/foo.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ build = "build.rs"
+
+ [lib]
+ name = "bar"
+ path = "src/bar.rs"
+ "#)
+ .file("bar/src/bar.rs.in", r#"
+ pub fn gimme() -> i32 { 0 }
+ "#)
+ .file("bar/build.rs", r#"
+ use std::fs;
+ fn main() {
+ fs::copy("src/bar.rs.in", "src/bar.rs").unwrap();
+ }
+ "#)
+ }).unwrap();
+
+ p.root().join("bar").move_into_the_past();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("0\n"));
+
+ // Touching bar.rs.in should cause the `build` command to run again.
+ fs::File::create(&p.root().join("bar/src/bar.rs.in")).unwrap()
+ .write_all(b"pub fn gimme() -> i32 { 1 }").unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("1\n"));
+}
+
+#[test]
+fn fetch_downloads() {
+ let bar = git::new("bar", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "pub fn bar() -> i32 { 1 }")
+ }).unwrap();
+
+ let p = project("p1")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "p1"
+ version = "0.5.0"
+ authors = []
+ [dependencies.bar]
+ git = '{}'
+ "#, bar.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+ assert_that(p.cargo("fetch"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] git repository `{url}`
+", url = bar.url())));
+
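+ // A second fetch finds the repository already cached and prints nothing.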
+ assert_that(p.cargo("fetch"),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn warnings_in_git_dep() {
+ let bar = git::new("bar", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "fn unused() {}")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ [dependencies.bar]
+ git = '{}'
+ "#, bar.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs()
+ .with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ [COMPILING] bar v0.5.0 ({}#[..])\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n",
+ bar.url(),
+ bar.url(),
+ p.url())));
+}
+
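+// Two distinct `foo` packages end up in the lockfile, so a bare
+// `cargo update -p foo` must fail and ask for a fully qualified spec.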
+#[test]
+fn update_ambiguous() {
+ let foo1 = git::new("foo1", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+ let foo2 = git::new("foo2", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.6.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+ let bar = git::new("bar", |project| {
+ project.file("Cargo.toml", &format!(r#"
+ [package]
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.foo]
+ git = '{}'
+ "#, foo2.url()))
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("project")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "project"
+ version = "0.5.0"
+ authors = []
+ [dependencies.foo]
+ git = '{}'
+ [dependencies.bar]
+ git = '{}'
+ "#, foo1.url(), bar.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("generate-lockfile"), execs().with_status(0));
+ assert_that(p.cargo("update")
+ .arg("-p").arg("foo"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] There are multiple `foo` packages in your project, and the specification `foo` \
+is ambiguous.
+Please re-run this command with `-p <spec>` where `<spec>` is one of the \
+following:
+ foo:0.[..].0
+ foo:0.[..].0
+"));
+}
+
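+// One git repository provides both `foo` and `a`; `cargo update -p foo`
+// must update that shared source exactly once.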
+#[test]
+fn update_one_dep_in_repo_with_many_deps() {
+ let foo = git::new("foo", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("a/src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("project")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "project"
+ version = "0.5.0"
+ authors = []
+ [dependencies.foo]
+ git = '{}'
+ [dependencies.a]
+ git = '{}'
+ "#, foo.url(), foo.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("generate-lockfile"), execs().with_status(0));
+ assert_that(p.cargo("update")
+ .arg("-p").arg("foo"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[UPDATING] git repository `{}`
+", foo.url())));
+}
+
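+// Switching the direct dependency from one git repo to another must not
+// re-update the transitive git dependency that both repos share.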
+#[test]
+fn switch_deps_does_not_update_transitive() {
+ let transitive = git::new("transitive", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "transitive"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+ let dep1 = git::new("dep1", |project| {
+ project.file("Cargo.toml", &format!(r#"
+ [package]
+ name = "dep"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.transitive]
+ git = '{}'
+ "#, transitive.url()))
+ .file("src/lib.rs", "")
+ }).unwrap();
+ let dep2 = git::new("dep2", |project| {
+ project.file("Cargo.toml", &format!(r#"
+ [package]
+ name = "dep"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.transitive]
+ git = '{}'
+ "#, transitive.url()))
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("project")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "project"
+ version = "0.5.0"
+ authors = []
+ [dependencies.dep]
+ git = '{}'
+ "#, dep1.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[UPDATING] git repository `{}`
+[UPDATING] git repository `{}`
+[COMPILING] transitive [..]
+[COMPILING] dep [..]
+[COMPILING] project [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dep1.url(), transitive.url())));
+
+ // Update the dependency to point to the second repository, but this
+ // shouldn't update the transitive dependency which is the same.
+ File::create(&p.root().join("Cargo.toml")).unwrap().write_all(format!(r#"
+ [project]
+ name = "project"
+ version = "0.5.0"
+ authors = []
+ [dependencies.dep]
+ git = '{}'
+ "#, dep2.url()).as_bytes()).unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[UPDATING] git repository `{}`
+[COMPILING] dep [..]
+[COMPILING] project [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dep2.url())));
+}
+
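+// Updating a single package from a git source repins the whole source,
+// so the old revision must disappear from the lockfile.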
+#[test]
+fn update_one_source_updates_all_packages_in_that_git_source() {
+ let dep = git::new("dep", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "dep"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("project")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "project"
+ version = "0.5.0"
+ authors = []
+ [dependencies.dep]
+ git = '{}'
+ "#, dep.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ let repo = git2::Repository::open(&dep.root()).unwrap();
+ let rev1 = repo.revparse_single("HEAD").unwrap().id();
+
+ // Modify a file and commit so the repository gains a new revision
+ File::create(&dep.root().join("src/lib.rs")).unwrap().write_all(br#"
+ pub fn bar() -> i32 { 2 }
+ "#).unwrap();
+ git::add(&repo);
+ git::commit(&repo);
+
+ assert_that(p.cargo("update").arg("-p").arg("dep"),
+ execs().with_status(0));
+ let mut lockfile = String::new();
+ File::open(&p.root().join("Cargo.lock")).unwrap()
+ .read_to_string(&mut lockfile).unwrap();
+ assert!(!lockfile.contains(&rev1.to_string()),
+ "{} in {}", rev1, lockfile);
+}
+
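+// Re-pointing a git dependency at a different repository (carrying a newer
+// version of the same package) must pick up the new source on rebuild.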
+#[test]
+fn switch_sources() {
+ let a1 = git::new("a1", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+ let a2 = git::new("a2", |project| {
+ project.file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.5.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("project")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "project"
+ version = "0.5.0"
+ authors = []
+ [dependencies.b]
+ path = "b"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("b/Cargo.toml", &format!(r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ [dependencies.a]
+ git = '{}'
+ "#, a1.url()))
+ .file("b/src/lib.rs", "pub fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] git repository `file://[..]a1`
+[COMPILING] a v0.5.0 ([..]a1#[..]
+[COMPILING] b v0.5.0 ([..])
+[COMPILING] project v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ File::create(&p.root().join("b/Cargo.toml")).unwrap().write_all(format!(r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ [dependencies.a]
+ git = '{}'
+ "#, a2.url()).as_bytes()).unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] git repository `file://[..]a2`
+[COMPILING] a v0.5.1 ([..]a2#[..]
+[COMPILING] b v0.5.0 ([..])
+[COMPILING] project v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
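+// Building from a fresh clone whose repository contains a submodule that
+// was never checked out must still succeed.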
+#[test]
+fn dont_require_submodules_are_checked_out() {
+ let p = project("foo").build();
+ let git1 = git::new("dep1", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "")
+ .file("a/foo", "")
+ }).unwrap();
+ let git2 = git::new("dep2", |p| p).unwrap();
+
+ let repo = git2::Repository::open(&git1.root()).unwrap();
+ let url = path2url(git2.root()).to_string();
+ git::add_submodule(&repo, &url, Path::new("a/submodule"));
+ git::commit(&repo);
+
+ git2::Repository::init(&p.root()).unwrap();
+ let url = path2url(git1.root()).to_string();
+ let dst = paths::home().join("foo");
+ git2::Repository::clone(&url, &dst).unwrap();
+
+ assert_that(git1.cargo("build").arg("-v").cwd(&dst),
+ execs().with_status(0));
+}
+
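+// Two git dependencies are both named `a` (one depending on the other);
+// running doctests via `cargo test` must cope with the name collision.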
+#[test]
+fn doctest_same_name() {
+ let a2 = git::new("a2", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn a2() {}")
+ }).unwrap();
+
+ let a1 = git::new("a1", |p| {
+ p.file("Cargo.toml", &format!(r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ a = {{ git = '{}' }}
+ "#, a2.url()))
+ .file("src/lib.rs", "extern crate a; pub fn a1() {}")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = {{ git = '{}' }}
+ "#, a1.url()))
+ .file("src/lib.rs", r#"
+ #[macro_use]
+ extern crate a;
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
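+// Lint warnings inside a git dependency (an unused import here) are
+// suppressed; the build output stays clean.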
+#[test]
+fn lints_are_suppressed() {
+ let a = git::new("a", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ use std::option;
+ ")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = {{ git = '{}' }}
+ "#, a.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] git repository `[..]`
+[COMPILING] a v0.5.0 ([..])
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
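+// Even `#![deny(warnings)]` inside a git dependency must not fail the
+// build; lints in upstream dependencies are capped to "allow".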
+#[test]
+fn denied_lints_are_allowed() {
+ let a = git::new("a", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #![deny(warnings)]
+ use std::option;
+ ")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = {{ git = '{}' }}
+ "#, a.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] git repository `[..]`
+[COMPILING] a v0.5.0 ([..])
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
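+// A git dependency added to an existing path dependency's manifest after
+// the first build must be picked up by the next build.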
+#[test]
+fn add_a_git_dep() {
+ let git = git::new("git", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "git"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = {{ path = 'a' }}
+ git = {{ git = '{}' }}
+ "#, git.url()))
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ File::create(p.root().join("a/Cargo.toml")).unwrap().write_all(format!(r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ git = {{ git = '{}' }}
+ "#, git.url()).as_bytes()).unwrap();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
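+// Two packages from the same git repo, both pinned via `rev =` naming a
+// tag; locking and building must resolve them to the same revision.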
+#[test]
+fn two_at_rev_instead_of_tag() {
+ let git = git::new("git", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "git1"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "git2"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ }).unwrap();
+
+ // Make a tag corresponding to the current HEAD
+ let repo = git2::Repository::open(&git.root()).unwrap();
+ let head = repo.head().unwrap().target().unwrap();
+ repo.tag("v0.1.0",
+ &repo.find_object(head, None).unwrap(),
+ &repo.signature().unwrap(),
+ "make a new tag",
+ false).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ git1 = {{ git = '{0}', rev = 'v0.1.0' }}
+ git2 = {{ git = '{0}', rev = 'v0.1.0' }}
+ "#, git.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("generate-lockfile"), execs().with_status(0));
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0));
+}
+
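+// Regression test for rust-lang/cargo#4135: a file listed in `include`
+// must participate in freshness checks even when `.gitignore` matches it.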
+#[test]
+#[ignore] // accesses crates.io
+fn include_overrides_gitignore() {
+ let p = git::new("reduction", |repo| {
+ repo.file("Cargo.toml", r#"
+ [package]
+ name = "reduction"
+ version = "0.5.0"
+ authors = ["pnkfelix"]
+ build = "tango-build.rs"
+ include = ["src/lib.rs", "src/incl.rs", "src/mod.md", "tango-build.rs", "Cargo.toml"]
+
+ [build-dependencies]
+ filetime = "0.1"
+ "#)
+ .file(".gitignore", r#"
+ target
+ Cargo.lock
+ # Below files represent generated code, thus not managed by `git`
+ src/incl.rs
+ src/not_incl.rs
+ "#)
+ .file("tango-build.rs", r#"
+ extern crate filetime;
+ use filetime::FileTime;
+ use std::fs::{self, File};
+
+ fn main() {
+ // generate files, or bring their timestamps into sync.
+ let source = "src/mod.md";
+
+ let metadata = fs::metadata(source).unwrap();
+ let mtime = FileTime::from_last_modification_time(&metadata);
+ let atime = FileTime::from_last_access_time(&metadata);
+
+ // Sync the timestamps of the generated files with that of the source file.
+
+ let files = ["src/not_incl.rs", "src/incl.rs"];
+ for file in files.iter() {
+ File::create(file).unwrap();
+ filetime::set_file_times(file, atime, mtime).unwrap();
+ }
+ }
+ "#)
+ .file("src/lib.rs", r#"
+ mod not_incl;
+ mod incl;
+ "#)
+ .file("src/mod.md", r#"
+ (The content of this file does not matter since we are not doing real codegen.)
+ "#)
+ }).unwrap();
+
+ println!("build 1: all is new");
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] filetime [..]
+[DOWNLOADING] libc [..]
+[COMPILING] libc [..]
+[RUNNING] `rustc --crate-name libc [..]`
+[COMPILING] filetime [..]
+[RUNNING] `rustc --crate-name filetime [..]`
+[COMPILING] reduction [..]
+[RUNNING] `rustc --crate-name build_script_tango_build tango-build.rs --crate-type bin [..]`
+[RUNNING] `[..][/]build-script-tango-build`
+[RUNNING] `rustc --crate-name reduction src[/]lib.rs --crate-type lib [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ println!("build 2: nothing changed; file timestamps reset by build script");
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[FRESH] libc [..]
+[FRESH] filetime [..]
+[FRESH] reduction [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ println!("build 3: touch `src/not_incl.rs`; expect build script *not* re-run");
+ sleep_ms(1000);
+ File::create(p.root().join("src").join("not_incl.rs")).unwrap();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[FRESH] libc [..]
+[FRESH] filetime [..]
+[COMPILING] reduction [..]
+[RUNNING] `rustc --crate-name reduction src[/]lib.rs --crate-type lib [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ // This final case models the bug from rust-lang/cargo#4135: an
+ // explicitly included file should cause a build-script re-run,
+ // even if that same file is matched by `.gitignore`.
+ println!("build 4: touch `src/incl.rs`; expect build script re-run");
+ sleep_ms(1000);
+ File::create(p.root().join("src").join("incl.rs")).unwrap();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[FRESH] libc [..]
+[FRESH] filetime [..]
+[COMPILING] reduction [..]
+[RUNNING] `[..][/]build-script-tango-build`
+[RUNNING] `rustc --crate-name reduction src[/]lib.rs --crate-type lib [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
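+// A git dependency whose manifest fails to parse (duplicate `categories`
+// key) must surface the full chain of "Caused by" errors.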
+#[test]
+fn invalid_git_dependency_manifest() {
+ let project = project("foo");
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "dep1"
+ version = "0.5.0"
+ authors = ["carlhuda@example.com"]
+ categories = ["algorithms"]
+ categories = ["algorithms"]
+
+ [lib]
+
+ name = "dep1"
+ "#)
+ .file("src/dep1.rs", r#"
+ pub fn hello() -> &'static str {
+ "hello world"
+ }
+ "#)
+ }).unwrap();
+
+ let project = project
+ .file("Cargo.toml", &format!(r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.dep1]
+
+ git = '{}'
+ "#, git_project.url()))
+ .file("src/main.rs", &main_file(r#""{}", dep1::hello()"#, &["dep1"]))
+ .build();
+
+ let git_root = git_project.root();
+
+ assert_that(project.cargo("build"),
+ execs()
+ .with_stderr(&format!("[UPDATING] git repository `{}`\n\
+ error: failed to load source for a dependency on `dep1`\n\
+ \n\
+ Caused by:\n \
+ Unable to update {}\n\
+ \n\
+ Caused by:\n \
+ failed to parse manifest at `[..]`\n\
+ \n\
+ Caused by:\n \
+ could not parse input as TOML\n\
+ \n\
+ Caused by:\n \
+ duplicate key: `categories` for key `project`",
+ path2url(git_root.clone()),
+ path2url(git_root),
+ )));
+}
--- /dev/null
+extern crate cargotest;
+extern crate cargo;
+extern crate tempdir;
+extern crate hamcrest;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::env;
+
+use cargo::util::ProcessBuilder;
+use cargotest::support::{execs, paths, cargo_exe};
+use hamcrest::{assert_that, existing_file, existing_dir, is_not};
+use tempdir::TempDir;
+
+fn cargo_process(s: &str) -> ProcessBuilder {
+ let mut p = cargotest::process(&cargo_exe());
+ p.arg(s).cwd(&paths::root()).env("HOME", &paths::home());
+ p
+}
+
+#[test]
+fn simple_lib() {
+ assert_that(cargo_process("init").arg("--lib").arg("--vcs").arg("none")
+ .env("USER", "foo"),
+ execs().with_status(0).with_stderr("\
+[CREATED] library project
+"));
+
+ assert_that(&paths::root().join("Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("src/lib.rs"), existing_file());
+ assert_that(&paths::root().join(".gitignore"), is_not(existing_file()));
+
+ assert_that(cargo_process("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn simple_bin() {
+ let path = paths::root().join("foo");
+ fs::create_dir(&path).unwrap();
+ assert_that(cargo_process("init").arg("--bin").arg("--vcs").arg("none")
+ .env("USER", "foo").cwd(&path),
+ execs().with_status(0).with_stderr("\
+[CREATED] binary (application) project
+"));
+
+ assert_that(&paths::root().join("foo/Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("foo/src/main.rs"), existing_file());
+
+ assert_that(cargo_process("build").cwd(&path),
+ execs().with_status(0));
+ assert_that(&paths::root().join(&format!("foo/target/debug/foo{}",
+ env::consts::EXE_SUFFIX)),
+ existing_file());
+}
+
+#[test]
+fn both_lib_and_bin() {
+ let td = TempDir::new("cargo").unwrap();
+ assert_that(cargo_process("init").arg("--lib").arg("--bin").cwd(td.path())
+ .env("USER", "foo"),
+ execs().with_status(101).with_stderr(
+ "[ERROR] can't specify both lib and binary outputs"));
+}
+
+fn bin_already_exists(explicit: bool, rel_location: &str) {
+ let path = paths::root().join("foo");
+ fs::create_dir_all(&path.join("src")).unwrap();
+
+ let sourcefile_path = path.join(rel_location);
+
+ let content = br#"
+ fn main() {
+ println!("Hello, world 2!");
+ }
+ "#;
+
+ File::create(&sourcefile_path).unwrap().write_all(content).unwrap();
+
+ if explicit {
+ assert_that(cargo_process("init").arg("--bin").arg("--vcs").arg("none")
+ .env("USER", "foo").cwd(&path),
+ execs().with_status(0));
+ } else {
+ assert_that(cargo_process("init").arg("--vcs").arg("none")
+ .env("USER", "foo").cwd(&path),
+ execs().with_status(0));
+ }
+
+ assert_that(&paths::root().join("foo/Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("foo/src/lib.rs"), is_not(existing_file()));
+
+ // Check that our file is not overwritten
+ let mut new_content = Vec::new();
+ File::open(&sourcefile_path).unwrap().read_to_end(&mut new_content).unwrap();
+ assert_eq!(Vec::from(content as &[u8]), new_content);
+}
+
+#[test]
+fn bin_already_exists_explicit() {
+ bin_already_exists(true, "src/main.rs")
+}
+
+#[test]
+fn bin_already_exists_implicit() {
+ bin_already_exists(false, "src/main.rs")
+}
+
+#[test]
+fn bin_already_exists_explicit_nosrc() {
+ bin_already_exists(true, "main.rs")
+}
+
+#[test]
+fn bin_already_exists_implicit_nosrc() {
+ bin_already_exists(false, "main.rs")
+}
+
+#[test]
+fn bin_already_exists_implicit_namenosrc() {
+ bin_already_exists(false, "foo.rs")
+}
+
+#[test]
+fn bin_already_exists_implicit_namesrc() {
+ bin_already_exists(false, "src/foo.rs")
+}
+
+#[test]
+fn confused_by_multiple_lib_files() {
+ let path = paths::root().join("foo");
+ fs::create_dir_all(&path.join("src")).unwrap();
+
+ let sourcefile_path1 = path.join("src/lib.rs");
+
+ File::create(&sourcefile_path1).unwrap().write_all(br#"
+ fn qqq() {
+ println!("Hello, world 2!");
+ }
+ "#).unwrap();
+
+ let sourcefile_path2 = path.join("lib.rs");
+
+ File::create(&sourcefile_path2).unwrap().write_all(br#"
+ fn qqq() {
+ println!("Hello, world 3!");
+ }
+ "#).unwrap();
+
+ assert_that(cargo_process("init").arg("--vcs").arg("none")
+ .env("USER", "foo").cwd(&path),
+ execs().with_status(101).with_stderr("\
+[ERROR] cannot have a project with multiple libraries, found both `src/lib.rs` and `lib.rs`
+"));
+
+ assert_that(&paths::root().join("foo/Cargo.toml"), is_not(existing_file()));
+}
+
+#[test]
+fn multibin_project_name_clash() {
+ let path = paths::root().join("foo");
+ fs::create_dir(&path).unwrap();
+
+ let sourcefile_path1 = path.join("foo.rs");
+
+ File::create(&sourcefile_path1).unwrap().write_all(br#"
+ fn main() {
+ println!("Hello, world 2!");
+ }
+ "#).unwrap();
+
+ let sourcefile_path2 = path.join("main.rs");
+
+ File::create(&sourcefile_path2).unwrap().write_all(br#"
+ fn main() {
+ println!("Hello, world 3!");
+ }
+ "#).unwrap();
+
+ assert_that(cargo_process("init").arg("--lib").arg("--vcs").arg("none")
+ .env("USER", "foo").cwd(&path),
+ execs().with_status(101).with_stderr("\
+[ERROR] multiple possible binary sources found:
+ main.rs
+ foo.rs
+cannot automatically generate Cargo.toml as the main target would be ambiguous
+"));
+
+ assert_that(&paths::root().join("foo/Cargo.toml"), is_not(existing_file()));
+}
+
+fn lib_already_exists(rel_location: &str) {
+ let path = paths::root().join("foo");
+ fs::create_dir_all(&path.join("src")).unwrap();
+
+ let sourcefile_path = path.join(rel_location);
+
+ let content = br#"
+ pub fn qqq() {}
+ "#;
+
+ File::create(&sourcefile_path).unwrap().write_all(content).unwrap();
+
+ assert_that(cargo_process("init").arg("--vcs").arg("none")
+ .env("USER", "foo").cwd(&path),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join("foo/Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("foo/src/main.rs"), is_not(existing_file()));
+
+ // Check that our file is not overwritten
+ let mut new_content = Vec::new();
+ File::open(&sourcefile_path).unwrap().read_to_end(&mut new_content).unwrap();
+ assert_eq!(Vec::from(content as &[u8]), new_content);
+}
+
+#[test]
+fn lib_already_exists_src() {
+ lib_already_exists("src/lib.rs")
+}
+
+#[test]
+fn lib_already_exists_nosrc() {
+ lib_already_exists("lib.rs")
+}
+
+#[test]
+fn simple_git() {
+ assert_that(cargo_process("init").arg("--lib")
+ .arg("--vcs")
+ .arg("git")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join("Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("src/lib.rs"), existing_file());
+ assert_that(&paths::root().join(".git"), existing_dir());
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+}
+
+#[test]
+fn auto_git() {
+ let td = TempDir::new("cargo").unwrap();
+ let foo = &td.path().join("foo");
+ fs::create_dir_all(&foo).unwrap();
+ assert_that(cargo_process("init").arg("--lib")
+ .cwd(foo.clone())
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&foo.join("Cargo.toml"), existing_file());
+ assert_that(&foo.join("src/lib.rs"), existing_file());
+ assert_that(&foo.join(".git"), existing_dir());
+ assert_that(&foo.join(".gitignore"), existing_file());
+}
+
+#[test]
+fn invalid_dir_name() {
+ let foo = &paths::root().join("foo.bar");
+ fs::create_dir_all(&foo).unwrap();
+ assert_that(cargo_process("init").cwd(foo.clone())
+ .env("USER", "foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] Invalid character `.` in crate name: `foo.bar`
+use --name to override crate name
+"));
+
+ assert_that(&foo.join("Cargo.toml"), is_not(existing_file()));
+}
+
+#[test]
+fn reserved_name() {
+ let test = &paths::root().join("test");
+ fs::create_dir_all(&test).unwrap();
+ assert_that(cargo_process("init").cwd(test.clone())
+ .env("USER", "foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] The name `test` cannot be used as a crate name\n\
+use --name to override crate name
+"));
+
+ assert_that(&test.join("Cargo.toml"), is_not(existing_file()));
+}
+
+#[test]
+fn git_autodetect() {
+ fs::create_dir(&paths::root().join(".git")).unwrap();
+
+ assert_that(cargo_process("init").arg("--lib")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join("Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("src/lib.rs"), existing_file());
+ assert_that(&paths::root().join(".git"), existing_dir());
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+}
+
+#[test]
+fn mercurial_autodetect() {
+ fs::create_dir(&paths::root().join(".hg")).unwrap();
+
+ assert_that(cargo_process("init").arg("--lib")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join("Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("src/lib.rs"), existing_file());
+ assert_that(&paths::root().join(".git"), is_not(existing_dir()));
+ assert_that(&paths::root().join(".hgignore"), existing_file());
+}
+
+#[test]
+fn gitignore_appended_not_replaced() {
+ fs::create_dir(&paths::root().join(".git")).unwrap();
+
+ File::create(&paths::root().join(".gitignore")).unwrap().write_all(b"qqqqqq\n").unwrap();
+
+ assert_that(cargo_process("init").arg("--lib")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join("Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("src/lib.rs"), existing_file());
+ assert_that(&paths::root().join(".git"), existing_dir());
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+
+ let mut contents = String::new();
+ File::open(&paths::root().join(".gitignore")).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"qqqqqq"#));
+}
+
+#[test]
+fn gitignore_added_newline_if_required() {
+ fs::create_dir(&paths::root().join(".git")).unwrap();
+
+ File::create(&paths::root().join(".gitignore")).unwrap().write_all(b"first").unwrap();
+
+ assert_that(cargo_process("init").arg("--lib")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+
+ let mut contents = String::new();
+ File::open(&paths::root().join(".gitignore")).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.starts_with("first\n"));
+}
+
+#[test]
+fn mercurial_added_newline_if_required() {
+ fs::create_dir(&paths::root().join(".hg")).unwrap();
+
+ File::create(&paths::root().join(".hgignore")).unwrap().write_all(b"first").unwrap();
+
+ assert_that(cargo_process("init").arg("--lib")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join(".hgignore"), existing_file());
+
+ let mut contents = String::new();
+ File::open(&paths::root().join(".hgignore")).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.starts_with("first\n"));
+}
+
+#[test]
+fn cargo_lock_gitignored_if_lib1() {
+ fs::create_dir(&paths::root().join(".git")).unwrap();
+
+ assert_that(cargo_process("init").arg("--lib").arg("--vcs").arg("git")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+
+ let mut contents = String::new();
+ File::open(&paths::root().join(".gitignore")).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"Cargo.lock"#));
+}
+
+#[test]
+fn cargo_lock_gitignored_if_lib2() {
+ fs::create_dir(&paths::root().join(".git")).unwrap();
+
+ File::create(&paths::root().join("lib.rs")).unwrap().write_all(br#""#).unwrap();
+
+ assert_that(cargo_process("init").arg("--vcs").arg("git")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+
+ let mut contents = String::new();
+ File::open(&paths::root().join(".gitignore")).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"Cargo.lock"#));
+}
+
+#[test]
+fn cargo_lock_not_gitignored_if_bin1() {
+ fs::create_dir(&paths::root().join(".git")).unwrap();
+
+ assert_that(cargo_process("init").arg("--vcs").arg("git")
+ .arg("--bin")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+
+ let mut contents = String::new();
+ File::open(&paths::root().join(".gitignore")).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(!contents.contains(r#"Cargo.lock"#));
+}
+
+#[test]
+fn cargo_lock_not_gitignored_if_bin2() {
+ fs::create_dir(&paths::root().join(".git")).unwrap();
+
+ File::create(&paths::root().join("main.rs")).unwrap().write_all(br#""#).unwrap();
+
+ assert_that(cargo_process("init").arg("--vcs").arg("git")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join(".gitignore"), existing_file());
+
+ let mut contents = String::new();
+ File::open(&paths::root().join(".gitignore")).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(!contents.contains(r#"Cargo.lock"#));
+}
+
+#[test]
+fn with_argument() {
+ assert_that(cargo_process("init").arg("foo").arg("--vcs").arg("none")
+ .env("USER", "foo"),
+ execs().with_status(0));
+ assert_that(&paths::root().join("foo/Cargo.toml"), existing_file());
+}
+
+#[test]
+fn unknown_flags() {
+ assert_that(cargo_process("init").arg("foo").arg("--flag"),
+ execs().with_status(1)
+ .with_stderr("\
+[ERROR] Unknown flag: '--flag'
+
+Usage:
+ cargo init [options] [<path>]
+ cargo init -h | --help
+"));
+}
+
+#[cfg(not(windows))]
+#[test]
+fn no_filename() {
+ assert_that(cargo_process("init").arg("/"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] cannot auto-detect project name from path \"/\" ; use --name to override
+".to_string()));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::{self, File, OpenOptions};
+use std::io::prelude::*;
+
+use cargo::util::ProcessBuilder;
+use cargotest::install::{cargo_home, has_installed_exe};
+use cargotest::support::git;
+use cargotest::support::paths;
+use cargotest::support::registry::Package;
+use cargotest::support::{project, execs};
+use hamcrest::{assert_that, is_not};
+
+fn cargo_process(s: &str) -> ProcessBuilder {
+ let mut p = cargotest::cargo_process();
+ p.arg(s);
+ p
+}
+
+fn pkg(name: &str, vers: &str) {
+ Package::new(name, vers)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", &format!("
+ extern crate {};
+ fn main() {{}}
+ ", name))
+ .publish();
+}
+
+#[test]
+fn simple() {
+ pkg("foo", "0.0.1");
+
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] foo v0.0.1 (registry [..])
+[INSTALLING] foo v0.0.1
+[COMPILING] foo v0.0.1
+[FINISHED] release [optimized] target(s) in [..]
+[INSTALLING] {home}[..]bin[..]foo[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+",
+ home = cargo_home().display())));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+
+ assert_that(cargo_process("uninstall").arg("foo"),
+ execs().with_status(0).with_stderr(&format!("\
+[REMOVING] {home}[..]bin[..]foo[..]
+",
+ home = cargo_home().display())));
+ assert_that(cargo_home(), is_not(has_installed_exe("foo")));
+}
+
+#[test]
+fn multiple_pkgs() {
+ pkg("foo", "0.0.1");
+ pkg("bar", "0.0.2");
+
+ assert_that(cargo_process("install").args(&["foo", "bar", "baz"]),
+ execs().with_status(101).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] foo v0.0.1 (registry `file://[..]`)
+[INSTALLING] foo v0.0.1
+[COMPILING] foo v0.0.1
+[FINISHED] release [optimized] target(s) in [..]
+[INSTALLING] {home}[..]bin[..]foo[..]
+[DOWNLOADING] bar v0.0.2 (registry `file://[..]`)
+[INSTALLING] bar v0.0.2
+[COMPILING] bar v0.0.2
+[FINISHED] release [optimized] target(s) in [..]
+[INSTALLING] {home}[..]bin[..]bar[..]
+error: could not find `baz` in registry `[..]`
+
+Summary: Successfully installed foo, bar! Failed to install baz (see error(s) above).
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+error: some crates failed to install
+",
+ home = cargo_home().display())));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(cargo_home(), has_installed_exe("bar"));
+
+ assert_that(cargo_process("uninstall").args(&["foo", "bar"]),
+ execs().with_status(0).with_stderr(&format!("\
+[REMOVING] {home}[..]bin[..]foo[..]
+[REMOVING] {home}[..]bin[..]bar[..]
+
+Summary: Successfully uninstalled foo, bar!
+",
+ home = cargo_home().display())));
+
+ assert_that(cargo_home(), is_not(has_installed_exe("foo")));
+ assert_that(cargo_home(), is_not(has_installed_exe("bar")));
+}
+
+#[test]
+fn pick_max_version() {
+ pkg("foo", "0.0.1");
+ pkg("foo", "0.0.2");
+
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] foo v0.0.2 (registry [..])
+[INSTALLING] foo v0.0.2
+[COMPILING] foo v0.0.2
+[FINISHED] release [optimized] target(s) in [..]
+[INSTALLING] {home}[..]bin[..]foo[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+",
+ home = cargo_home().display())));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn missing() {
+ pkg("foo", "0.0.1");
+ assert_that(cargo_process("install").arg("bar"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[ERROR] could not find `bar` in registry `[..]`
+"));
+}
+
+#[test]
+fn bad_version() {
+ pkg("foo", "0.0.1");
+ assert_that(cargo_process("install").arg("foo").arg("--vers=0.2.0"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[ERROR] could not find `foo` in registry `[..]` with version `=0.2.0`
+"));
+}
+
+#[test]
+fn no_crate() {
+ assert_that(cargo_process("install"),
+ execs().with_status(101).with_stderr("\
+[ERROR] `[..]` is not a crate root; specify a crate to install [..]
+
+Caused by:
+ failed to read `[..]Cargo.toml`
+
+Caused by:
+ [..] (os error [..])
+"));
+}
+
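+// Install root precedence: `--root` beats `CARGO_INSTALL_ROOT`, which beats
+// the `install.root` config value, which beats the default cargo home.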
+#[test]
+fn install_location_precedence() {
+ pkg("foo", "0.0.1");
+
+ let root = paths::root();
+ let t1 = root.join("t1");
+ let t2 = root.join("t2");
+ let t3 = root.join("t3");
+ let t4 = cargo_home();
+
+ fs::create_dir(root.join(".cargo")).unwrap();
+ File::create(root.join(".cargo/config")).unwrap().write_all(format!("\
+ [install]
+ root = '{}'
+ ", t3.display()).as_bytes()).unwrap();
+
+ println!("install --root");
+
+ assert_that(cargo_process("install").arg("foo")
+ .arg("--root").arg(&t1)
+ .env("CARGO_INSTALL_ROOT", &t2),
+ execs().with_status(0));
+ assert_that(&t1, has_installed_exe("foo"));
+ assert_that(&t2, is_not(has_installed_exe("foo")));
+
+ println!("install CARGO_INSTALL_ROOT");
+
+ assert_that(cargo_process("install").arg("foo")
+ .env("CARGO_INSTALL_ROOT", &t2),
+ execs().with_status(0));
+ assert_that(&t2, has_installed_exe("foo"));
+ assert_that(&t3, is_not(has_installed_exe("foo")));
+
+ println!("install install.root");
+
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0));
+ assert_that(&t3, has_installed_exe("foo"));
+ assert_that(&t4, is_not(has_installed_exe("foo")));
+
+ fs::remove_file(root.join(".cargo/config")).unwrap();
+
+ println!("install cargo home");
+
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0));
+ assert_that(&t4, has_installed_exe("foo"));
+}
+
+#[test]
+fn install_path() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(cargo_process("install").arg("--path").arg(".").cwd(p.root()),
+ execs().with_status(101).with_stderr("\
+[INSTALLING] foo v0.1.0 [..]
+[ERROR] binary `foo[..]` already exists in destination as part of `foo v0.1.0 [..]`
+Add --force to overwrite
+"));
+}
+
+#[test]
+fn multiple_crates_error() {
+ let p = git::repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("a/src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--git").arg(p.url().to_string()),
+ execs().with_status(101).with_stderr("\
+[UPDATING] git repository [..]
+[ERROR] multiple packages with binaries found: bar, foo
+"));
+}
+
+#[test]
+fn multiple_crates_select() {
+ let p = git::repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("a/src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--git").arg(p.url().to_string())
+ .arg("foo"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(cargo_home(), is_not(has_installed_exe("bar")));
+
+ assert_that(cargo_process("install").arg("--git").arg(p.url().to_string())
+ .arg("bar"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("bar"));
+}
+
+#[test]
+fn multiple_crates_auto_binaries() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "a" }
+ "#)
+ .file("src/main.rs", "extern crate bar; fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn multiple_crates_auto_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "a" }
+ "#)
+ .file("src/lib.rs", "extern crate bar;")
+ .file("examples/foo.rs", "
+ extern crate bar;
+ extern crate foo;
+ fn main() {}
+ ")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root())
+ .arg("--example=foo"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn no_binaries_or_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(101).with_stderr("\
+[ERROR] no packages found with binaries or examples
+"));
+}
+
+#[test]
+fn no_binaries() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/foo.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()).arg("foo"),
+ execs().with_status(101).with_stderr("\
+[INSTALLING] foo [..]
+[ERROR] specified package has no binaries
+"));
+}
+
+#[test]
+fn examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/foo.rs", "extern crate foo; fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root())
+ .arg("--example=foo"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn install_twice() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/bin/foo-bin1.rs", "fn main() {}")
+ .file("src/bin/foo-bin2.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0));
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(101).with_stderr("\
+[INSTALLING] foo v0.1.0 [..]
+[ERROR] binary `foo-bin1[..]` already exists in destination as part of `foo v0.1.0 ([..])`
+binary `foo-bin2[..]` already exists in destination as part of `foo v0.1.0 ([..])`
+Add --force to overwrite
+"));
+}
+
+#[test]
+fn install_force() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0));
+
+ let p = project("foo2")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--force").arg("--path").arg(p.root()),
+ execs().with_status(0).with_stderr(&format!("\
+[INSTALLING] foo v0.2.0 ([..])
+[COMPILING] foo v0.2.0 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+[REPLACING] {home}[..]bin[..]foo[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+",
+ home = cargo_home().display())));
+
+ assert_that(cargo_process("install").arg("--list"),
+ execs().with_status(0).with_stdout("\
+foo v0.2.0 ([..]):
+ foo[..]
+"));
+}
+
+#[test]
+fn install_force_partial_overlap() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/bin/foo-bin1.rs", "fn main() {}")
+ .file("src/bin/foo-bin2.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0));
+
+ let p = project("foo2")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("src/bin/foo-bin2.rs", "fn main() {}")
+ .file("src/bin/foo-bin3.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--force").arg("--path").arg(p.root()),
+ execs().with_status(0).with_stderr(&format!("\
+[INSTALLING] foo v0.2.0 ([..])
+[COMPILING] foo v0.2.0 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+[INSTALLING] {home}[..]bin[..]foo-bin3[..]
+[REPLACING] {home}[..]bin[..]foo-bin2[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+",
+ home = cargo_home().display())));
+
+ assert_that(cargo_process("install").arg("--list"),
+ execs().with_status(0).with_stdout("\
+foo v0.1.0 ([..]):
+ foo-bin1[..]
+foo v0.2.0 ([..]):
+ foo-bin2[..]
+ foo-bin3[..]
+"));
+}
+
+#[test]
+fn install_force_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/bin/foo-bin1.rs", "fn main() {}")
+ .file("src/bin/foo-bin2.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0));
+
+ let p = project("foo2")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("src/bin/foo-bin1.rs", "fn main() {}")
+ .file("src/bin/foo-bin2.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--force")
+ .arg("--bin")
+ .arg("foo-bin2")
+ .arg("--path")
+ .arg(p.root()),
+ execs().with_status(0).with_stderr(&format!("\
+[INSTALLING] foo v0.2.0 ([..])
+[COMPILING] foo v0.2.0 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+[REPLACING] {home}[..]bin[..]foo-bin2[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+",
+ home = cargo_home().display())));
+
+ assert_that(cargo_process("install").arg("--list"),
+ execs().with_status(0).with_stdout("\
+foo v0.1.0 ([..]):
+ foo-bin1[..]
+foo v0.2.0 ([..]):
+ foo-bin2[..]
+"));
+}
+
+#[test]
+fn compile_failure() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(101).with_stderr_contains("\
+[ERROR] failed to compile `foo v0.1.0 ([..])`, intermediate artifacts can be \
+ found at `[..]target`
+
+Caused by:
+ Could not compile `foo`.
+
+To learn more, run the command again with --verbose.
+"));
+}
+
+#[test]
+fn git_repo() {
+ let p = git::repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ // use `--locked` to test that we don't even try to write a lockfile
+ assert_that(cargo_process("install").arg("--locked").arg("--git").arg(p.url().to_string()),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] git repository `[..]`
+[INSTALLING] foo v0.1.0 ([..])
+[COMPILING] foo v0.1.0 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+[INSTALLING] {home}[..]bin[..]foo[..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+",
+ home = cargo_home().display())));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn list() {
+ pkg("foo", "0.0.1");
+ pkg("bar", "0.2.1");
+ pkg("bar", "0.2.2");
+
+ assert_that(cargo_process("install").arg("--list"),
+ execs().with_status(0).with_stdout(""));
+
+ assert_that(cargo_process("install").arg("bar").arg("--vers").arg("=0.2.1"),
+ execs().with_status(0));
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0));
+ assert_that(cargo_process("install").arg("--list"),
+ execs().with_status(0).with_stdout("\
+bar v0.2.1:
+ bar[..]
+foo v0.0.1:
+ foo[..]
+"));
+}
+
+#[test]
+fn list_error() {
+ pkg("foo", "0.0.1");
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0));
+ assert_that(cargo_process("install").arg("--list"),
+ execs().with_status(0).with_stdout("\
+foo v0.0.1:
+ foo[..]
+"));
+ let mut worldfile_path = cargo_home();
+ worldfile_path.push(".crates.toml");
+ let mut worldfile = OpenOptions::new()
+ .write(true)
+ .open(worldfile_path)
+ .expect(".crates.toml should be there");
+ worldfile.write_all(b"\x00").unwrap();
+ drop(worldfile);
+ assert_that(cargo_process("install").arg("--list").arg("--verbose"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse crate metadata at `[..]`
+
+Caused by:
+ invalid TOML found for metadata
+
+Caused by:
+ unexpected character[..]
+"));
+}
+
+#[test]
+fn uninstall_pkg_does_not_exist() {
+ assert_that(cargo_process("uninstall").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] package id specification `foo` matched no packages
+"));
+}
+
+#[test]
+fn uninstall_bin_does_not_exist() {
+ pkg("foo", "0.0.1");
+
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0));
+ assert_that(cargo_process("uninstall").arg("foo").arg("--bin=bar"),
+ execs().with_status(101).with_stderr("\
+[ERROR] binary `bar[..]` not installed as part of `foo v0.0.1`
+"));
+}
+
+#[test]
+fn uninstall_piecemeal() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/bin/foo.rs", "fn main() {}")
+ .file("src/bin/bar.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(cargo_home(), has_installed_exe("bar"));
+
+ assert_that(cargo_process("uninstall").arg("foo").arg("--bin=bar"),
+ execs().with_status(0).with_stderr("\
+[REMOVING] [..]bar[..]
+"));
+
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(cargo_home(), is_not(has_installed_exe("bar")));
+
+ assert_that(cargo_process("uninstall").arg("foo").arg("--bin=foo"),
+ execs().with_status(0).with_stderr("\
+[REMOVING] [..]foo[..]
+"));
+ assert_that(cargo_home(), is_not(has_installed_exe("foo")));
+
+ assert_that(cargo_process("uninstall").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] package id specification `foo` matched no packages
+"));
+}
+
+#[test]
+fn subcommand_works_out_of_the_box() {
+ Package::new("cargo-foo", "1.0.0")
+ .file("src/main.rs", r#"
+ fn main() {
+ println!("bar");
+ }
+ "#)
+ .publish();
+ assert_that(cargo_process("install").arg("cargo-foo"),
+ execs().with_status(0));
+ assert_that(cargo_process("foo"),
+ execs().with_status(0).with_stdout("bar\n"));
+ assert_that(cargo_process("--list"),
+ execs().with_status(0).with_stdout_contains(" foo\n"));
+}
+
+#[test]
+fn installs_from_cwd_by_default() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").cwd(p.root()),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn does_not_rebuild_on_local_install() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--release"),
+ execs().with_status(0));
+ assert_that(cargo_process("install").arg("--path").arg(p.root()),
+ execs().with_status(0).with_stderr("[INSTALLING] [..]
+[FINISHED] release [optimized] target(s) in [..]
+[INSTALLING] [..]
+warning: be sure to add `[..]` to your PATH to be able to run the installed binaries
+"));
+
+ assert!(p.build_dir().exists());
+ assert!(p.release_bin("foo").exists());
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn reports_unsuccessful_subcommand_result() {
+ Package::new("cargo-fail", "1.0.0")
+ .file("src/main.rs", r#"
+ fn main() {
+ panic!();
+ }
+ "#)
+ .publish();
+ assert_that(cargo_process("install").arg("cargo-fail"),
+ execs().with_status(0));
+ assert_that(cargo_process("--list"),
+ execs().with_status(0).with_stdout_contains(" fail\n"));
+ assert_that(cargo_process("fail"),
+ execs().with_status(101).with_stderr_contains("\
+thread '[..]' panicked at 'explicit panic', [..]
+"));
+}
+
+#[test]
+fn git_with_lockfile() {
+ let p = git::repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "fn main() {}")
+ .file("Cargo.lock", r#"
+ [[package]]
+ name = "foo"
+ version = "0.1.0"
+ dependencies = [ "bar 0.1.0" ]
+
+ [[package]]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .build();
+
+ assert_that(cargo_process("install").arg("--git").arg(p.url().to_string()),
+ execs().with_status(0));
+}
+
+#[test]
+fn q_silences_warnings() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(cargo_process("install").arg("-q").arg("--path").arg(p.root()),
+ execs().with_status(0).with_stderr(""));
+}
+
+#[test]
+fn readonly_dir() {
+ pkg("foo", "0.0.1");
+
+ let root = paths::root();
+ let dir = &root.join("readonly");
+ fs::create_dir(root.join("readonly")).unwrap();
+ let mut perms = fs::metadata(dir).unwrap().permissions();
+ perms.set_readonly(true);
+ fs::set_permissions(dir, perms).unwrap();
+
+ assert_that(cargo_process("install").arg("foo").cwd(dir),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+}
+
+#[test]
+fn use_path_workspace() {
+ Package::new("foo", "1.0.0").publish();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["baz"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("baz/Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "1"
+ "#)
+ .file("baz/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ let lock = p.read_lockfile();
+ assert_that(p.cargo("install"), execs().with_status(0));
+ let lock2 = p.read_lockfile();
+ assert_eq!(lock, lock2, "different lockfiles");
+}
+
+#[test]
+fn vers_precise() {
+ pkg("foo", "0.1.1");
+ pkg("foo", "0.1.2");
+
+ assert_that(cargo_process("install").arg("foo").arg("--vers").arg("0.1.1"),
+ execs().with_status(0).with_stderr_contains("\
+[DOWNLOADING] foo v0.1.1 (registry [..])
+"));
+}
+
+#[test]
+fn version_too() {
+ pkg("foo", "0.1.1");
+ pkg("foo", "0.1.2");
+
+ assert_that(cargo_process("install").arg("foo").arg("--version").arg("0.1.1"),
+ execs().with_status(0).with_stderr_contains("\
+ [DOWNLOADING] foo v0.1.1 (registry [..])
+"));
+}
+
+#[test]
+fn not_both_vers_and_version() {
+ pkg("foo", "0.1.1");
+ pkg("foo", "0.1.2");
+
+ assert_that(cargo_process("install").arg("foo").arg("--version").arg("0.1.1").arg("--vers").arg("0.1.2"),
+ execs().with_status(101).with_stderr_contains("\
+error: Invalid arguments.
+"));
+}
+
+#[test]
+fn legacy_version_requirement() {
+ pkg("foo", "0.1.1");
+
+ assert_that(cargo_process("install").arg("foo").arg("--vers").arg("0.1"),
+ execs().with_status(0).with_stderr_contains("\
+warning: the `--vers` provided, `0.1`, is not a valid semver version
+
+historically Cargo treated this as a semver version requirement accidentally
+and will continue to do so, but this behavior will be removed eventually
+"));
+}
+
+#[test]
+fn test_install_git_cannot_be_a_base_url() {
+ assert_that(cargo_process("install").arg("--git").arg("github.com:rust-lang-nursery/rustfmt.git"),
+ execs().with_status(101).with_stderr("\
+error: invalid url `github.com:rust-lang-nursery/rustfmt.git`: cannot-be-a-base-URLs are not supported
+"));
+}
+
+#[test]
+fn uninstall_multiple_and_specifying_bin() {
+ assert_that(cargo_process("uninstall").args(&["foo", "bar"]).arg("--bin").arg("baz"),
+ execs().with_status(101).with_stderr("\
+error: A binary can only be associated with a single installed package, specifying multiple specs with --bin is redundant.
+"));
+}
+
+#[test]
+fn uninstall_multiple_and_some_pkg_does_not_exist() {
+ pkg("foo", "0.0.1");
+
+ assert_that(cargo_process("install").arg("foo"),
+ execs().with_status(0));
+
+ assert_that(cargo_process("uninstall").args(&["foo", "bar"]),
+ execs().with_status(101).with_stderr(&format!("\
+[REMOVING] {home}[..]bin[..]foo[..]
+error: package id specification `bar` matched no packages
+
+Summary: Successfully uninstalled foo! Failed to uninstall bar (see error(s) above).
+error: some packages failed to uninstall
+",
+ home = cargo_home().display())));
+
+ assert_that(cargo_home(), is_not(has_installed_exe("foo")));
+ assert_that(cargo_home(), is_not(has_installed_exe("bar")));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::net::TcpListener;
+use std::thread;
+use std::process::Command;
+
+use cargotest::support::{project, execs, cargo_exe};
+use hamcrest::assert_that;
+
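+// Every build script runs with a jobserver configured: `CARGO_MAKEFLAGS`
+// must carry a `--jobserver` argument whose file descriptors are usable.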
+#[test]
+fn jobserver_exists() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("build.rs", r#"
+ use std::env;
+
+ fn main() {
+ let var = env::var("CARGO_MAKEFLAGS").unwrap();
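+ // CARGO_MAKEFLAGS embeds the jobserver handle as a flag starting
+ // with `--jobserver`; its value here is two comma-separated file
+ // descriptors (read end, write end), which `validate` checks.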
+ let arg = var.split(' ')
+ .find(|p| p.starts_with("--jobserver"))
+ .unwrap();
+ let val = &arg[arg.find('=').unwrap() + 1..];
+ validate(val);
+ }
+
+ #[cfg(unix)]
+ fn validate(s: &str) {
+ use std::fs::File;
+ use std::io::*;
+ use std::os::unix::prelude::*;
+
+ let fds = s.split(',').collect::<Vec<_>>();
+ println!("{}", s);
+ assert_eq!(fds.len(), 2);
+ unsafe {
+ let mut read = File::from_raw_fd(fds[0].parse().unwrap());
+ let mut write = File::from_raw_fd(fds[1].parse().unwrap());
+
+ let mut buf = [0];
+ assert_eq!(read.read(&mut buf).unwrap(), 1);
+ assert_eq!(write.write(&buf).unwrap(), 1);
+ }
+ }
+
+ #[cfg(windows)]
+ fn validate(_: &str) {
+ // a little too complicated for a test...
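+ // (A faithful check would need platform APIs: on Windows the
+ // jobserver handle is a named semaphore, not a pair of pipe fds.)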
+ }
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn makes_jobserver_used() {
+ let make = if cfg!(windows) { "mingw32-make" } else { "make" };
+ if Command::new(make).arg("--version").output().is_err() {
+ return
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ d1 = { path = "d1" }
+ d2 = { path = "d2" }
+ d3 = { path = "d3" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+ build = "../dbuild.rs"
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.1"
+ authors = []
+ build = "../dbuild.rs"
+ "#)
+ .file("d2/src/lib.rs", "")
+ .file("d3/Cargo.toml", r#"
+ [package]
+ name = "d3"
+ version = "0.0.1"
+ authors = []
+ build = "../dbuild.rs"
+ "#)
+ .file("d3/src/lib.rs", "")
+ .file("dbuild.rs", r#"
+ use std::net::TcpStream;
+ use std::env;
+ use std::io::Read;
+
+ fn main() {
+ let addr = env::var("ADDR").unwrap();
+ let mut stream = TcpStream::connect(addr).unwrap();
+ let mut v = Vec::new();
+ stream.read_to_end(&mut v).unwrap();
+ }
+ "#)
+ .file("Makefile", "\
+all:
+\t+$(CARGO) build
+")
+ .build();
+
+ let l = TcpListener::bind("127.0.0.1:0").unwrap();
+ let addr = l.local_addr().unwrap();
+
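+ // With `make -j2` the jobserver should allow at most two of the three
+ // build scripts to run concurrently: accept two connections, verify no
+ // third arrives while both are held, then release one so the last
+ // build script can connect.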
+ let child = thread::spawn(move || {
+ let a1 = l.accept().unwrap();
+ let a2 = l.accept().unwrap();
+ l.set_nonblocking(true).unwrap();
+
+ for _ in 0..1000 {
+ assert!(l.accept().is_err());
+ thread::yield_now();
+ }
+
+ drop(a1);
+ l.set_nonblocking(false).unwrap();
+ let a3 = l.accept().unwrap();
+
+ drop((a2, a3));
+ });
+
+ assert_that(p.process(make)
+ .env("CARGO", cargo_exe())
+ .env("ADDR", addr.to_string())
+ .arg("-j2"),
+ execs().with_status(0));
+ child.join().unwrap();
+}
+
+#[test]
+fn jobserver_and_j() {
+ let make = if cfg!(windows) { "mingw32-make" } else { "make" };
+ if Command::new(make).arg("--version").output().is_err() {
+ return
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("Makefile", "\
+all:
+\t+$(CARGO) build -j2
+")
+ .build();
+
+ assert_that(p.process(make)
+ .env("CARGO", cargo_exe())
+ .arg("-j2"),
+ execs().with_status(0).with_stderr("\
+warning: a `-j` argument was passed to Cargo but Cargo is also configured \
+with an external jobserver in its environment, ignoring the `-j` parameter
+[COMPILING] [..]
+[FINISHED] [..]
+"));
+}
--- /dev/null
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+
+use cargotest::support::paths::{self, CargoPathExt};
+use cargotest::support::registry::Package;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+fn setup() {
+ let root = paths::root();
+ t!(fs::create_dir(&root.join(".cargo")));
+ t!(t!(File::create(root.join(".cargo/config"))).write_all(br#"
+ [source.crates-io]
+ registry = 'https://wut'
+ replace-with = 'my-awesome-local-registry'
+
+ [source.my-awesome-local-registry]
+ local-registry = 'registry'
+ "#));
+}
+
+#[test]
+fn simple() {
+ setup();
+ Package::new("foo", "0.0.1")
+ .local(true)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.0.1"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UNPACKING] foo v0.0.1 ([..])
+[COMPILING] foo v0.0.1
+[COMPILING] bar v0.0.1 ({dir})
+[FINISHED] [..]
+",
+ dir = p.url())));
+ assert_that(p.cargo("build"), execs().with_status(0).with_stderr("\
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("test"), execs().with_status(0));
+}
+
+#[test]
+fn multiple_versions() {
+ setup();
+ Package::new("foo", "0.0.1").local(true).publish();
+ Package::new("foo", "0.1.0")
+ .local(true)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UNPACKING] foo v0.1.0 ([..])
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.0.1 ({dir})
+[FINISHED] [..]
+",
+ dir = p.url())));
+
+ Package::new("foo", "0.2.0")
+ .local(true)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .publish();
+
+ assert_that(p.cargo("update").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] foo v0.1.0 -> v0.2.0
+"));
+}
+
+#[test]
+fn multiple_names() {
+ setup();
+ Package::new("foo", "0.0.1")
+ .local(true)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .publish();
+ Package::new("bar", "0.1.0")
+ .local(true)
+ .file("src/lib.rs", "pub fn bar() {}")
+ .publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ bar = "*"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+ extern crate bar;
+ pub fn local() {
+ foo::foo();
+ bar::bar();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UNPACKING] [..]
+[UNPACKING] [..]
+[COMPILING] [..]
+[COMPILING] [..]
+[COMPILING] local v0.0.1 ({dir})
+[FINISHED] [..]
+",
+ dir = p.url())));
+}
+
+#[test]
+fn interdependent() {
+ setup();
+ Package::new("foo", "0.0.1")
+ .local(true)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .publish();
+ Package::new("bar", "0.1.0")
+ .local(true)
+ .dep("foo", "*")
+ .file("src/lib.rs", "extern crate foo; pub fn bar() {}")
+ .publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ bar = "*"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+ extern crate bar;
+ pub fn local() {
+ foo::foo();
+ bar::bar();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UNPACKING] [..]
+[UNPACKING] [..]
+[COMPILING] foo v0.0.1
+[COMPILING] bar v0.1.0
+[COMPILING] local v0.0.1 ({dir})
+[FINISHED] [..]
+",
+ dir = p.url())));
+}
+
+#[test]
+fn path_dep_rewritten() {
+ setup();
+ Package::new("foo", "0.0.1")
+ .local(true)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .publish();
+ Package::new("bar", "0.1.0")
+ .local(true)
+ .dep("foo", "*")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = { path = "foo", version = "*" }
+ "#)
+ .file("src/lib.rs", "extern crate foo; pub fn bar() {}")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "pub fn foo() {}")
+ .publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ bar = "*"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+ extern crate bar;
+ pub fn local() {
+ foo::foo();
+ bar::bar();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UNPACKING] [..]
+[UNPACKING] [..]
+[COMPILING] foo v0.0.1
+[COMPILING] bar v0.1.0
+[COMPILING] local v0.0.1 ({dir})
+[FINISHED] [..]
+",
+ dir = p.url())));
+}
+
+#[test]
+fn invalid_dir_bad() {
+ setup();
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [source.crates-io]
+ registry = 'https://wut'
+ replace-with = 'my-awesome-local-directory'
+
+ [source.my-awesome-local-directory]
+ local-registry = '/path/to/nowhere'
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to load source for a dependency on `foo`
+
+Caused by:
+ Unable to update registry `https://[..]`
+
+Caused by:
+ failed to update replaced source registry `https://[..]`
+
+Caused by:
+ local registry path is not a directory: [..]path[..]to[..]nowhere
+"));
+}
+
+#[test]
+fn different_directory_replacing_the_registry_is_bad() {
+ setup();
+
+ // Move our test's .cargo/config to a temporary location and publish a
+ // registry package we're going to use first.
+ let config = paths::root().join(".cargo");
+ let config_tmp = paths::root().join(".cargo-old");
+ t!(fs::rename(&config, &config_tmp));
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ // Generate a lock file against the crates.io registry
+ Package::new("foo", "0.0.1").publish();
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ // Switch back to our directory source, and now that we're replacing
+ // crates.io make sure that this fails because we're replacing with a
+ // different checksum
+ config.rm_rf();
+ t!(fs::rename(&config_tmp, &config));
+ Package::new("foo", "0.0.1")
+ .file("src/lib.rs", "invalid")
+ .local(true)
+ .publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[ERROR] checksum for `foo v0.0.1` changed between lock files
+
+this could be indicative of a few possible errors:
+
+ * the lock file is corrupt
+ * a replacement source in use (e.g. a mirror) returned a different checksum
+ * the source itself may be corrupt in one way or another
+
+unable to verify that `foo v0.0.1` is the same as when the lockfile was generated
+
+"));
+}
+
+#[test]
+fn crates_io_registry_url_is_optional() {
+ let root = paths::root();
+ t!(fs::create_dir(&root.join(".cargo")));
+ t!(t!(File::create(root.join(".cargo/config"))).write_all(br#"
+ [source.crates-io]
+ replace-with = 'my-awesome-local-registry'
+
+ [source.my-awesome-local-registry]
+ local-registry = 'registry'
+ "#));
+
+ Package::new("foo", "0.0.1")
+ .local(true)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.0.1"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UNPACKING] foo v0.0.1 ([..])
+[COMPILING] foo v0.0.1
+[COMPILING] bar v0.0.1 ({dir})
+[FINISHED] [..]
+",
+ dir = p.url())));
+ assert_that(p.cargo("build"), execs().with_status(0).with_stderr("\
+[FINISHED] [..]
+"));
+ assert_that(p.cargo("test"), execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::git;
+use cargotest::support::registry::Package;
+use cargotest::support::{execs, project, lines_match};
+use hamcrest::assert_that;
+
+#[test]
+fn oldest_lockfile_still_works() {
+ let cargo_commands = vec![
+ "build",
+ "update"
+ ];
+ for cargo_command in cargo_commands {
+ oldest_lockfile_still_works_with_command(cargo_command);
+ }
+}
+
+fn oldest_lockfile_still_works_with_command(cargo_command: &str) {
+ Package::new("foo", "0.1.0").publish();
+
+ let expected_lockfile =
+r#"[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[[package]]
+name = "zzz"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[metadata]
+"checksum foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "[..]"
+"#;
+
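+ // Very old Cargos wrote the local package into a dedicated `[root]`
+ // section; modern Cargo records it as a plain `[[package]]` entry, as
+ // in `expected_lockfile` above.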
+ let old_lockfile =
+r#"[root]
+name = "zzz"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+"#;
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "zzz"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", old_lockfile)
+ .build();
+
+ assert_that(p.cargo(cargo_command),
+ execs().with_status(0));
+
+ let lock = p.read_lockfile();
+ for (l, r) in expected_lockfile.lines().zip(lock.lines()) {
+ assert!(lines_match(l, r), "Lines differ:\n{}\n\n{}", l, r);
+ }
+
+ assert_eq!(lock.lines().count(), expected_lockfile.lines().count());
+}
+
+#[test]
+fn frozen_flag_preserves_old_lockfile() {
+ let cksum = Package::new("foo", "0.1.0").publish();
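+ // `publish()` returns the package's checksum so it can be embedded in
+ // the handwritten lockfile below.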
+
+ let old_lockfile = format!(
+ r#"[root]
+name = "zzz"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[metadata]
+"checksum foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "{}"
+"#,
+ cksum,
+ );
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "zzz"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", &old_lockfile)
+ .build();
+
+ assert_that(p.cargo("build").arg("--locked"),
+ execs().with_status(0));
+
+ let lock = p.read_lockfile();
+ for (l, r) in old_lockfile.lines().zip(lock.lines()) {
+ assert!(lines_match(l, r), "Lines differ:\n{}\n\n{}", l, r);
+ }
+
+ assert_eq!(lock.lines().count(), old_lockfile.lines().count());
+}
+
+#[test]
+fn totally_wild_checksums_works() {
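+ // Checksum entries for packages or versions that aren't in the lock
+ // graph shouldn't break the build.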
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", r#"
+[[package]]
+name = "bar"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[metadata]
+"checksum baz 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)" = "checksum"
+"checksum foo 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)" = "checksum"
+"#);
+
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ let lock = p.read_lockfile();
+ assert!(lock.starts_with(r#"
+[[package]]
+name = "bar"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[metadata]
+"#.trim()));
+}
+
+#[test]
+fn wrong_checksum_is_an_error() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", r#"
+[[package]]
+name = "bar"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[metadata]
+"checksum foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "checksum"
+"#);
+
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry `[..]`
+error: checksum for `foo v0.1.0` changed between lock files
+
+this could be indicative of a few possible errors:
+
+ * the lock file is corrupt
+ * a replacement source in use (e.g. a mirror) returned a different checksum
+ * the source itself may be corrupt in one way or another
+
+unable to verify that `foo v0.1.0` is the same as when the lockfile was generated
+
+"));
+}
+
+// If the checksum is unlisted in the lockfile (e.g. `<none>`) yet we can
+// calculate it (e.g. it's a registry dep), then in theory we could just fill
+// it in; for now, though, Cargo reports an error, as this test verifies.
+#[test]
+fn unlisted_checksum_is_bad_if_we_calculate() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", r#"
+[[package]]
+name = "bar"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[metadata]
+"checksum foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "<none>"
+"#);
+ let p = p.build();
+
+ assert_that(p.cargo("fetch"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry `[..]`
+error: checksum for `foo v0.1.0` was not previously calculated, but a checksum \
+could now be calculated
+
+this could be indicative of a few possible situations:
+
+ * the source `[..]` did not previously support checksums,
+ but was replaced with one that does
+ * newer Cargo implementations know how to checksum this source, but this
+ older implementation does not
+ * the lock file is corrupt
+
+"));
+}
+
+// If the checksum is listed in the lockfile yet we cannot calculate it (e.g.
+// git dependencies as of today), then make sure we choke.
+#[test]
+fn listed_checksum_bad_if_we_cannot_compute() {
+ let git = git::new("foo", |p| {
+ p.file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ }).unwrap();
+
+ let p = project("bar")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = {{ git = '{}' }}
+ "#, git.url()))
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", &format!(r#"
+[[package]]
+name = "bar"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (git+{0})"
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "git+{0}"
+
+[metadata]
+"checksum foo 0.1.0 (git+{0})" = "checksum"
+"#, git.url()));
+
+ let p = p.build();
+
+ assert_that(p.cargo("fetch"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] git repository `[..]`
+error: checksum for `foo v0.1.0 ([..])` could not be calculated, but a \
+checksum is listed in the existing lock file[..]
+
+this could be indicative of a few possible situations:
+
+ * the source `[..]` supports checksums,
+ but was replaced with one that doesn't
+ * the lock file is corrupt
+
+unable to verify that `foo v0.1.0 ([..])` is the same as when the lockfile was generated
+
+"));
+}
+
+#[test]
+fn current_lockfile_format() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ let actual = p.read_lockfile();
+
+ let expected = "\
+[[package]]
+name = \"bar\"
+version = \"0.0.1\"
+dependencies = [
+ \"foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)\",
+]
+
+[[package]]
+name = \"foo\"
+version = \"0.1.0\"
+source = \"registry+https://github.com/rust-lang/crates.io-index\"
+
+[metadata]
+\"checksum foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)\" = \"[..]\"";
+
+ for (l, r) in expected.lines().zip(actual.lines()) {
+ assert!(lines_match(l, r), "Lines differ:\n{}\n\n{}", l, r);
+ }
+
+ assert_eq!(actual.lines().count(), expected.lines().count());
+}
+
+#[test]
+fn lockfile_without_root() {
+ Package::new("foo", "0.1.0").publish();
+
+ let lockfile = r#"[[package]]
+name = "bar"
+version = "0.0.1"
+dependencies = [
+ "foo 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "foo"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+"#;
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("Cargo.lock", lockfile);
+
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ let lock = p.read_lockfile();
+ assert!(lock.starts_with(lockfile.trim()));
+}
+
+#[test]
+fn locked_correct_error() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("build").arg("--locked"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry `[..]`
+error: the lock file needs to be updated but --locked was passed to prevent this
+"));
+}
--- /dev/null
+#[macro_use]
+extern crate cargotest;
+extern crate cargo;
+extern crate hamcrest;
+extern crate toml;
+
+use std::io::prelude::*;
+use std::fs::{self, File};
+
+use cargotest::{ChannelChanger, cargo_process};
+use cargotest::support::execs;
+use cargotest::support::registry::registry;
+use cargotest::install::cargo_home;
+use cargo::util::config::Config;
+use cargo::core::Shell;
+use hamcrest::{assert_that, existing_file, is_not};
+
+const TOKEN: &str = "test-token";
+const ORIGINAL_TOKEN: &str = "api-token";
+const CONFIG_FILE: &str = r#"
+ [registry]
+ token = "api-token"
+
+ [registries.test-reg]
+ index = "http://dummy_index/"
+"#;
+
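+// The API token originally lived in `$CARGO_HOME/config` under `[registry]`;
+// newer Cargo stores it in a separate `credentials` file. These helpers set
+// up one layout each so the tests can exercise the migration.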
+fn setup_old_credentials() {
+ let config = cargo_home().join("config");
+ t!(fs::create_dir_all(config.parent().unwrap()));
+ t!(t!(File::create(&config)).write_all(CONFIG_FILE.as_bytes()));
+}
+
+fn setup_new_credentials() {
+ let config = cargo_home().join("credentials");
+ t!(fs::create_dir_all(config.parent().unwrap()));
+ t!(t!(File::create(&config)).write_all(format!(r#"
+ token = "{token}"
+ "#, token = ORIGINAL_TOKEN)
+ .as_bytes()));
+}
+
+fn check_token(expected_token: &str, registry: Option<&str>) -> bool {
+ let credentials = cargo_home().join("credentials");
+ assert_that(&credentials, existing_file());
+
+ let mut contents = String::new();
+ File::open(&credentials).unwrap().read_to_string(&mut contents).unwrap();
+ let toml: toml::Value = contents.parse().unwrap();
+
+ let token = match (registry, toml) {
+ // A registry has been provided, so check that the token exists in a
+ // table for the registry.
+ (Some(registry), toml::Value::Table(table)) => {
+ table.get(registry).and_then(|registry_table| {
+ match registry_table.get("token") {
+ Some(&toml::Value::String(ref token)) => Some(token.as_str().to_string()),
+ _ => None,
+ }
+ })
+ },
+ // There is no registry provided, so check the global token instead.
+ (None, toml::Value::Table(table)) => {
+ table.get("token").and_then(|v| {
+ match v {
+ &toml::Value::String(ref token) => Some(token.as_str().to_string()),
+ _ => None,
+ }
+ })
+ }
+ _ => None
+ };
+
+ token.map_or(false, |t| t == expected_token)
+}
+
+#[test]
+fn login_with_old_credentials() {
+ setup_old_credentials();
+
+ assert_that(cargo_process().arg("login")
+ .arg("--host").arg(registry().to_string()).arg(TOKEN),
+ execs().with_status(0));
+
+ let config = cargo_home().join("config");
+ assert_that(&config, existing_file());
+
+ let mut contents = String::new();
+ File::open(&config).unwrap().read_to_string(&mut contents).unwrap();
+ assert_eq!(CONFIG_FILE, contents);
+
+ // Ensure that we get the new token for the registry
+ assert!(check_token(TOKEN, None));
+}
+
+#[test]
+fn login_with_new_credentials() {
+ setup_new_credentials();
+
+ assert_that(cargo_process().arg("login")
+ .arg("--host").arg(registry().to_string()).arg(TOKEN),
+ execs().with_status(0));
+
+ let config = cargo_home().join("config");
+ assert_that(&config, is_not(existing_file()));
+
+ // Ensure that we get the new token for the registry
+ assert!(check_token(TOKEN, None));
+}
+
+#[test]
+fn login_with_old_and_new_credentials() {
+ setup_new_credentials();
+ login_with_old_credentials();
+}
+
+#[test]
+fn login_without_credentials() {
+ assert_that(cargo_process().arg("login")
+ .arg("--host").arg(registry().to_string()).arg(TOKEN),
+ execs().with_status(0));
+
+ let config = cargo_home().join("config");
+ assert_that(&config, is_not(existing_file()));
+
+ // Ensure that we get the new token for the registry
+ assert!(check_token(TOKEN, None));
+}
+
+#[test]
+fn new_credentials_is_used_instead_old() {
+ setup_old_credentials();
+ setup_new_credentials();
+
+ assert_that(cargo_process().arg("login")
+ .arg("--host").arg(registry().to_string()).arg(TOKEN),
+ execs().with_status(0));
+
+ let config = Config::new(Shell::new(), cargo_home(), cargo_home());
+
+ let token = config.get_string("registry.token").unwrap().map(|p| p.val);
+ assert_eq!(token.unwrap(), TOKEN);
+}
+
+#[test]
+fn registry_credentials() {
+ setup_old_credentials();
+ setup_new_credentials();
+
+ let reg = "test-reg";
+
+ assert_that(cargo_process().arg("login").masquerade_as_nightly_cargo()
+ .arg("--registry").arg(reg).arg(TOKEN).arg("-Zunstable-options"),
+ execs().with_status(0));
+
+ // Ensure that we have not updated the default token
+ assert!(check_token(ORIGINAL_TOKEN, None));
+
+ // Also ensure that we get the new token for the registry
+ assert!(check_token(TOKEN, Some(reg)));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use hamcrest::assert_that;
+use cargotest::support::registry::Package;
+use cargotest::support::{project, execs, basic_bin_manifest, basic_lib_manifest, main_file};
+
+#[test]
+fn cargo_metadata_simple() {
+ let p = project("foo")
+ .file("src/foo.rs", "")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .build();
+
+ assert_that(p.cargo("metadata"), execs().with_json(r#"
+ {
+ "packages": [
+ {
+ "name": "foo",
+ "version": "0.5.0",
+ "id": "foo[..]",
+ "source": null,
+ "dependencies": [],
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [
+ "bin"
+ ],
+ "crate_types": [
+ "bin"
+ ],
+ "name": "foo",
+ "src_path": "[..][/]foo[/]src[/]foo.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]Cargo.toml"
+ }
+ ],
+ "workspace_members": ["foo 0.5.0 (path+file:[..]foo)"],
+ "resolve": {
+ "nodes": [
+ {
+ "dependencies": [],
+ "id": "foo 0.5.0 (path+file:[..]foo)"
+ }
+ ],
+ "root": "foo 0.5.0 (path+file:[..]foo)"
+ },
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+ }"#));
+}
+
+#[test]
+fn cargo_metadata_warns_on_implicit_version() {
+ let p = project("foo")
+ .file("src/foo.rs", "")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .build();
+
+ assert_that(p.cargo("metadata"),
+ execs().with_stderr("\
+[WARNING] please specify `--format-version` flag explicitly to avoid compatibility problems"));
+
+ assert_that(p.cargo("metadata").arg("--format-version").arg("1"),
+ execs().with_stderr(""));
+}
+
+#[test]
+fn library_with_several_crate_types() {
+ let p = project("foo")
+ .file("src/lib.rs", "")
+ .file("Cargo.toml", r#"
+[package]
+name = "foo"
+version = "0.5.0"
+
+[lib]
+crate-type = ["lib", "staticlib"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("metadata"), execs().with_json(r#"
+ {
+ "packages": [
+ {
+ "name": "foo",
+ "version": "0.5.0",
+ "id": "foo[..]",
+ "source": null,
+ "dependencies": [],
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [
+ "lib",
+ "staticlib"
+ ],
+ "crate_types": [
+ "lib",
+ "staticlib"
+ ],
+ "name": "foo",
+ "src_path": "[..][/]foo[/]src[/]lib.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]Cargo.toml"
+ }
+ ],
+ "workspace_members": ["foo 0.5.0 (path+file:[..]foo)"],
+ "resolve": {
+ "nodes": [
+ {
+ "dependencies": [],
+ "id": "foo 0.5.0 (path+file:[..]foo)"
+ }
+ ],
+ "root": "foo 0.5.0 (path+file:[..]foo)"
+ },
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+ }"#));
+}
+
+#[test]
+fn cargo_metadata_with_deps_and_version() {
+ let p = project("foo")
+ .file("src/foo.rs", "")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ license = "MIT"
+ description = "foo"
+
+ [[bin]]
+ name = "foo"
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .build();
+ Package::new("baz", "0.0.1").publish();
+ Package::new("bar", "0.0.1").dep("baz", "0.0.1").publish();
+
+ assert_that(p.cargo("metadata")
+ .arg("-q")
+ .arg("--format-version").arg("1"),
+ execs().with_json(r#"
+ {
+ "packages": [
+ {
+ "dependencies": [],
+ "features": {},
+ "id": "baz 0.0.1 (registry+[..])",
+ "manifest_path": "[..]Cargo.toml",
+ "name": "baz",
+ "source": "registry+[..]",
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [
+ "lib"
+ ],
+ "crate_types": [
+ "lib"
+ ],
+ "name": "baz",
+ "src_path": "[..]lib.rs"
+ }
+ ],
+ "version": "0.0.1"
+ },
+ {
+ "dependencies": [
+ {
+ "features": [],
+ "kind": null,
+ "name": "baz",
+ "optional": false,
+ "req": "^0.0.1",
+ "source": "registry+[..]",
+ "target": null,
+ "uses_default_features": true
+ }
+ ],
+ "features": {},
+ "id": "bar 0.0.1 (registry+[..])",
+ "manifest_path": "[..]Cargo.toml",
+ "name": "bar",
+ "source": "registry+[..]",
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [
+ "lib"
+ ],
+ "crate_types": [
+ "lib"
+ ],
+ "name": "bar",
+ "src_path": "[..]lib.rs"
+ }
+ ],
+ "version": "0.0.1"
+ },
+ {
+ "dependencies": [
+ {
+ "features": [],
+ "kind": null,
+ "name": "bar",
+ "optional": false,
+ "req": "*",
+ "source": "registry+[..]",
+ "target": null,
+ "uses_default_features": true
+ }
+ ],
+ "features": {},
+ "id": "foo 0.5.0 (path+file:[..]foo)",
+ "manifest_path": "[..]Cargo.toml",
+ "name": "foo",
+ "source": null,
+ "license": "MIT",
+ "license_file": null,
+ "description": "foo",
+ "targets": [
+ {
+ "kind": [
+ "bin"
+ ],
+ "crate_types": [
+ "bin"
+ ],
+ "name": "foo",
+ "src_path": "[..]foo.rs"
+ }
+ ],
+ "version": "0.5.0"
+ }
+ ],
+ "workspace_members": ["foo 0.5.0 (path+file:[..]foo)"],
+ "resolve": {
+ "nodes": [
+ {
+ "dependencies": [
+ "bar 0.0.1 (registry+[..])"
+ ],
+ "id": "foo 0.5.0 (path+file:[..]foo)"
+ },
+ {
+ "dependencies": [
+ "baz 0.0.1 (registry+[..])"
+ ],
+ "id": "bar 0.0.1 (registry+[..])"
+ },
+ {
+ "dependencies": [],
+ "id": "baz 0.0.1 (registry+[..])"
+ }
+ ],
+ "root": "foo 0.5.0 (path+file:[..]foo)"
+ },
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+ }"#));
+}
+
+#[test]
+fn example() {
+ let p = project("foo")
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .file("Cargo.toml", r#"
+[package]
+name = "foo"
+version = "0.1.0"
+
+[[example]]
+name = "ex"
+ "#)
+ .build();
+
+ assert_that(p.cargo("metadata"), execs().with_json(r#"
+ {
+ "packages": [
+ {
+ "name": "foo",
+ "version": "0.1.0",
+ "id": "foo[..]",
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "source": null,
+ "dependencies": [],
+ "targets": [
+ {
+ "kind": [ "lib" ],
+ "crate_types": [ "lib" ],
+ "name": "foo",
+ "src_path": "[..][/]foo[/]src[/]lib.rs"
+ },
+ {
+ "kind": [ "example" ],
+ "crate_types": [ "bin" ],
+ "name": "ex",
+ "src_path": "[..][/]foo[/]examples[/]ex.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]Cargo.toml"
+ }
+ ],
+ "workspace_members": [
+ "foo 0.1.0 (path+file:[..]foo)"
+ ],
+ "resolve": {
+ "root": "foo 0.1.0 (path+file://[..]foo)",
+ "nodes": [
+ {
+ "id": "foo 0.1.0 (path+file:[..]foo)",
+ "dependencies": []
+ }
+ ]
+ },
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+ }"#));
+}
+
+#[test]
+fn example_lib() {
+ let p = project("foo")
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "")
+ .file("Cargo.toml", r#"
+[package]
+name = "foo"
+version = "0.1.0"
+
+[[example]]
+name = "ex"
+crate-type = ["rlib", "dylib"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("metadata"), execs().with_json(r#"
+ {
+ "packages": [
+ {
+ "name": "foo",
+ "version": "0.1.0",
+ "id": "foo[..]",
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "source": null,
+ "dependencies": [],
+ "targets": [
+ {
+ "kind": [ "lib" ],
+ "crate_types": [ "lib" ],
+ "name": "foo",
+ "src_path": "[..][/]foo[/]src[/]lib.rs"
+ },
+ {
+ "kind": [ "example" ],
+ "crate_types": [ "rlib", "dylib" ],
+ "name": "ex",
+ "src_path": "[..][/]foo[/]examples[/]ex.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]Cargo.toml"
+ }
+ ],
+ "workspace_members": [
+ "foo 0.1.0 (path+file:[..]foo)"
+ ],
+ "resolve": {
+ "root": "foo 0.1.0 (path+file://[..]foo)",
+ "nodes": [
+ {
+ "id": "foo 0.1.0 (path+file:[..]foo)",
+ "dependencies": []
+ }
+ ]
+ },
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+ }"#));
+}
+
+#[test]
+fn workspace_metadata() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["bar", "baz"]
+ "#)
+ .file("bar/Cargo.toml", &basic_lib_manifest("bar"))
+ .file("bar/src/lib.rs", "")
+ .file("baz/Cargo.toml", &basic_lib_manifest("baz"))
+ .file("baz/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("metadata"), execs().with_status(0).with_json(r#"
+ {
+ "packages": [
+ {
+ "name": "bar",
+ "version": "0.5.0",
+ "id": "bar[..]",
+ "source": null,
+ "dependencies": [],
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [ "lib" ],
+ "crate_types": [ "lib" ],
+ "name": "bar",
+ "src_path": "[..]bar[/]src[/]lib.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]bar[/]Cargo.toml"
+ },
+ {
+ "name": "baz",
+ "version": "0.5.0",
+ "id": "baz[..]",
+ "source": null,
+ "dependencies": [],
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [ "lib" ],
+ "crate_types": [ "lib" ],
+ "name": "baz",
+ "src_path": "[..]baz[/]src[/]lib.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]baz[/]Cargo.toml"
+ }
+ ],
+ "workspace_members": ["baz 0.5.0 (path+file:[..]baz)", "bar 0.5.0 (path+file:[..]bar)"],
+ "resolve": {
+ "nodes": [
+ {
+ "dependencies": [],
+ "id": "baz 0.5.0 (path+file:[..]baz)"
+ },
+ {
+ "dependencies": [],
+ "id": "bar 0.5.0 (path+file:[..]bar)"
+ }
+ ],
+ "root": null
+ },
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+ }"#))
+}
+
+#[test]
+fn workspace_metadata_no_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["bar", "baz"]
+ "#)
+ .file("bar/Cargo.toml", &basic_lib_manifest("bar"))
+ .file("bar/src/lib.rs", "")
+ .file("baz/Cargo.toml", &basic_lib_manifest("baz"))
+ .file("baz/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--no-deps"), execs().with_status(0).with_json(r#"
+ {
+ "packages": [
+ {
+ "name": "bar",
+ "version": "0.5.0",
+ "id": "bar[..]",
+ "source": null,
+ "dependencies": [],
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [ "lib" ],
+ "crate_types": [ "lib" ],
+ "name": "bar",
+ "src_path": "[..]bar[/]src[/]lib.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]bar[/]Cargo.toml"
+ },
+ {
+ "name": "baz",
+ "version": "0.5.0",
+ "id": "baz[..]",
+ "source": null,
+ "dependencies": [],
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets": [
+ {
+ "kind": [ "lib" ],
+ "crate_types": ["lib"],
+ "name": "baz",
+ "src_path": "[..]baz[/]src[/]lib.rs"
+ }
+ ],
+ "features": {},
+ "manifest_path": "[..]baz[/]Cargo.toml"
+ }
+ ],
+ "workspace_members": ["baz 0.5.0 (path+file:[..]baz)", "bar 0.5.0 (path+file:[..]bar)"],
+ "resolve": null,
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+ }"#))
+}
+
+#[test]
+fn cargo_metadata_with_invalid_manifest() {
+ let p = project("foo")
+ .file("Cargo.toml", "")
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--format-version").arg("1"),
+ execs().with_status(101).with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ no `package` section found."))
+}
+
+const MANIFEST_OUTPUT: &'static str =
+ r#"
+{
+ "packages": [{
+ "name":"foo",
+ "version":"0.5.0",
+ "id":"foo[..]0.5.0[..](path+file://[..]/foo)",
+ "source":null,
+ "dependencies":[],
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "targets":[{
+ "kind":["bin"],
+ "crate_types":["bin"],
+ "name":"foo",
+ "src_path":"[..][/]foo[/]src[/]foo.rs"
+ }],
+ "features":{},
+ "manifest_path":"[..]Cargo.toml"
+ }],
+ "workspace_members": [ "foo 0.5.0 (path+file:[..]foo)" ],
+ "resolve": null,
+ "target_directory": "[..]foo[/]target",
+ "version": 1
+}"#;
+
+#[test]
+fn cargo_metadata_no_deps_path_to_cargo_toml_relative() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--no-deps")
+ .arg("--manifest-path").arg("foo/Cargo.toml")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(0)
+ .with_json(MANIFEST_OUTPUT));
+}
+
+#[test]
+fn cargo_metadata_no_deps_path_to_cargo_toml_absolute() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--no-deps")
+ .arg("--manifest-path").arg(p.root().join("Cargo.toml"))
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(0)
+ .with_json(MANIFEST_OUTPUT));
+}
+
+#[test]
+fn cargo_metadata_no_deps_path_to_cargo_toml_parent_relative() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--no-deps")
+ .arg("--manifest-path").arg("foo")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(101)
+ .with_stderr("[ERROR] the manifest-path must be \
+ a path to a Cargo.toml file"));
+}
+
+#[test]
+fn cargo_metadata_no_deps_path_to_cargo_toml_parent_absolute() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--no-deps")
+ .arg("--manifest-path").arg(p.root())
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(101)
+ .with_stderr("[ERROR] the manifest-path must be \
+ a path to a Cargo.toml file"));
+}
+
+#[test]
+fn cargo_metadata_no_deps_cwd() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--no-deps")
+ .cwd(p.root()),
+ execs().with_status(0)
+ .with_json(MANIFEST_OUTPUT));
+}
+
+#[test]
+fn cargo_metadata_bad_version() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("metadata").arg("--no-deps")
+ .arg("--format-version").arg("2")
+ .cwd(p.root()),
+ execs().with_status(101)
+ .with_stderr("[ERROR] metadata version 2 not supported, only 1 is currently supported"));
+}
+
+#[test]
+fn multiple_features() {
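+ // Multiple features can be passed as a single space-separated value.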
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ a = []
+ b = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("metadata")
+ .arg("--features").arg("a b"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn net_retry_loads_from_config() {
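+ // Port 11 should refuse connections outright, so the git fetch fails
+ // quickly and exercises the retry count configured below.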
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ git = "https://127.0.0.1:11/foo/bar"
+ "#)
+ .file("src/main.rs", "").file(".cargo/config", r#"
+ [net]
+ retry=1
+ [http]
+ timeout=1
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101)
+ .with_stderr_contains("[WARNING] spurious network error \
+(1 tries remaining): [..]"));
+}
+
+#[test]
+fn net_retry_git_outputs_warning() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ git = "https://127.0.0.1:11/foo/bar"
+ "#)
+ .file(".cargo/config", r#"
+ [http]
+ timeout=1
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v").arg("-j").arg("1"),
+ execs().with_status(101)
+ .with_stderr_contains("[WARNING] spurious network error \
+(2 tries remaining): [..]")
+ .with_stderr_contains("\
+[WARNING] spurious network error (1 tries remaining): [..]"));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+extern crate tempdir;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::env;
+
+use cargo::util::ProcessBuilder;
+use cargotest::process;
+use cargotest::support::{execs, paths};
+use hamcrest::{assert_that, existing_file, existing_dir, is_not};
+use tempdir::TempDir;
+
+fn cargo_process(s: &str) -> ProcessBuilder {
+ let mut p = cargotest::cargo_process();
+ p.arg(s);
+ p
+}
+
+fn create_empty_gitconfig() {
+ // This helps on Windows where libgit2 is very aggressive in attempting to
+ // find a git config file.
+ let gitconfig = paths::home().join(".gitconfig");
+ File::create(gitconfig).unwrap();
+}
+
+#[test]
+fn simple_lib() {
+ assert_that(cargo_process("new").arg("--lib").arg("foo").arg("--vcs").arg("none")
+ .env("USER", "foo"),
+ execs().with_status(0).with_stderr("\
+[CREATED] library `foo` project
+"));
+
+ assert_that(&paths::root().join("foo"), existing_dir());
+ assert_that(&paths::root().join("foo/Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("foo/src/lib.rs"), existing_file());
+ assert_that(&paths::root().join("foo/.gitignore"), is_not(existing_file()));
+
+ let lib = paths::root().join("foo/src/lib.rs");
+ let mut contents = String::new();
+ File::open(&lib).unwrap().read_to_string(&mut contents).unwrap();
+ assert_eq!(contents, r#"#[cfg(test)]
+mod tests {
+ #[test]
+ fn it_works() {
+ assert_eq!(2 + 2, 4);
+ }
+}
+"#);
+
+ assert_that(cargo_process("build").cwd(&paths::root().join("foo")),
+ execs().with_status(0));
+}
+
+#[test]
+fn simple_bin() {
+ assert_that(cargo_process("new").arg("--bin").arg("foo")
+ .env("USER", "foo"),
+ execs().with_status(0).with_stderr("\
+[CREATED] binary (application) `foo` project
+"));
+
+ assert_that(&paths::root().join("foo"), existing_dir());
+ assert_that(&paths::root().join("foo/Cargo.toml"), existing_file());
+ assert_that(&paths::root().join("foo/src/main.rs"), existing_file());
+
+ assert_that(cargo_process("build").cwd(&paths::root().join("foo")),
+ execs().with_status(0));
+ assert_that(&paths::root().join(&format!("foo/target/debug/foo{}",
+ env::consts::EXE_SUFFIX)),
+ existing_file());
+}
+
+#[test]
+fn both_lib_and_bin() {
+ assert_that(cargo_process("new").arg("--lib").arg("--bin").arg("foo")
+ .env("USER", "foo"),
+ execs().with_status(101).with_stderr(
+ "[ERROR] can't specify both lib and binary outputs"));
+}
+
+#[test]
+fn simple_git() {
+ // Run inside a temp directory so that cargo will initialize a git repo.
+ // If this ran inside paths::root() it would detect that we are already
+ // inside a git repo and skip the initialization.
+ let td = TempDir::new("cargo").unwrap();
+ assert_that(cargo_process("new").arg("--lib").arg("foo").cwd(td.path())
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(td.path(), existing_dir());
+ assert_that(&td.path().join("foo/Cargo.toml"), existing_file());
+ assert_that(&td.path().join("foo/src/lib.rs"), existing_file());
+ assert_that(&td.path().join("foo/.git"), existing_dir());
+ assert_that(&td.path().join("foo/.gitignore"), existing_file());
+
+ assert_that(cargo_process("build").cwd(&td.path().join("foo")),
+ execs().with_status(0));
+}
+
+#[test]
+fn no_argument() {
+ assert_that(cargo_process("new"),
+ execs().with_status(1)
+ .with_stderr("\
+[ERROR] Invalid arguments.
+
+Usage:
+ cargo new [options] <path>
+ cargo new -h | --help
+"));
+}
+
+#[test]
+fn existing() {
+ let dst = paths::root().join("foo");
+ fs::create_dir(&dst).unwrap();
+ assert_that(cargo_process("new").arg("foo"),
+ execs().with_status(101)
+ .with_stderr(format!("[ERROR] destination `{}` already exists\n\n\
+ Use `cargo init` to initialize the directory",
+ dst.display())));
+}
+
+#[test]
+fn invalid_characters() {
+ assert_that(cargo_process("new").arg("foo.rs"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] Invalid character `.` in crate name: `foo.rs`
+use --name to override crate name"));
+}
+
+#[test]
+fn reserved_name() {
+ assert_that(cargo_process("new").arg("test"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] The name `test` cannot be used as a crate name\n\
+use --name to override crate name"));
+}
+
+#[test]
+fn reserved_binary_name() {
+ assert_that(cargo_process("new").arg("--bin").arg("incremental"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] The name `incremental` cannot be used as a crate name\n\
+use --name to override crate name"));
+}
+
+#[test]
+fn keyword_name() {
+ assert_that(cargo_process("new").arg("pub"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] The name `pub` cannot be used as a crate name\n\
+use --name to override crate name"));
+}
+
+#[test]
+fn rust_prefix_stripped() {
+ assert_that(cargo_process("new").arg("--lib").arg("rust-foo").env("USER", "foo"),
+ execs().with_status(0)
+ .with_stderr_contains("note: package will be named `foo`; use --name to override"));
+ let toml = paths::root().join("rust-foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"name = "foo""#));
+}
+
+#[test]
+fn bin_disables_stripping() {
+ assert_that(cargo_process("new").arg("rust-foo").arg("--bin").env("USER", "foo"),
+ execs().with_status(0));
+ let toml = paths::root().join("rust-foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"name = "rust-foo""#));
+}
+
+#[test]
+fn explicit_name_not_stripped() {
+ assert_that(cargo_process("new").arg("foo").arg("--name").arg("rust-bar").env("USER", "foo"),
+ execs().with_status(0));
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"name = "rust-bar""#));
+}
+
+#[test]
+fn finds_author_user() {
+ create_empty_gitconfig();
+ assert_that(cargo_process("new").arg("foo").env("USER", "foo"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["foo"]"#));
+}
+
+#[test]
+fn finds_author_user_escaped() {
+ create_empty_gitconfig();
+ assert_that(cargo_process("new").arg("foo").env("USER", "foo \"bar\""),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["foo \"bar\""]"#));
+}
+
+#[test]
+fn finds_author_username() {
+ create_empty_gitconfig();
+ assert_that(cargo_process("new").arg("foo")
+ .env_remove("USER")
+ .env("USERNAME", "foo"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["foo"]"#));
+}
+
+#[test]
+fn finds_author_priority() {
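+ // CARGO_NAME/CARGO_EMAIL should take precedence over the USER/EMAIL
+ // fallbacks.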
+ assert_that(cargo_process("new").arg("foo")
+ .env("USER", "bar2")
+ .env("EMAIL", "baz2")
+ .env("CARGO_NAME", "bar")
+ .env("CARGO_EMAIL", "baz"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["bar <baz>"]"#));
+}
+
+#[test]
+fn finds_author_email() {
+ create_empty_gitconfig();
+ assert_that(cargo_process("new").arg("foo")
+ .env("USER", "bar")
+ .env("EMAIL", "baz"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["bar <baz>"]"#));
+}
+
+#[test]
+fn finds_author_git() {
+ process("git").args(&["config", "--global", "user.name", "bar"])
+ .exec().unwrap();
+ process("git").args(&["config", "--global", "user.email", "baz"])
+ .exec().unwrap();
+ assert_that(cargo_process("new").arg("foo").env("USER", "foo"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["bar <baz>"]"#));
+}
+
+#[test]
+fn finds_local_author_git() {
+ process("git").args(&["init"])
+ .exec().unwrap();
+ process("git").args(&["config", "--global", "user.name", "foo"])
+ .exec().unwrap();
+ process("git").args(&["config", "--global", "user.email", "foo@bar"])
+ .exec().unwrap();
+
+ // Set local git user config
+ process("git").args(&["config", "user.name", "bar"])
+ .exec().unwrap();
+ process("git").args(&["config", "user.email", "baz"])
+ .exec().unwrap();
+ assert_that(cargo_process("init").env("USER", "foo"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["bar <baz>"]"#));
+}
+
+#[test]
+fn finds_git_email() {
+ assert_that(cargo_process("new").arg("foo")
+ .env("GIT_AUTHOR_NAME", "foo")
+ .env("GIT_AUTHOR_EMAIL", "gitfoo"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["foo <gitfoo>"]"#), "{}", contents);
+}
+
+#[test]
+fn finds_git_author() {
+ create_empty_gitconfig();
+ assert_that(cargo_process("new").arg("foo")
+ .env_remove("USER")
+ .env("GIT_COMMITTER_NAME", "gitfoo"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["gitfoo"]"#));
+}
+
+#[test]
+fn author_prefers_cargo() {
+ process("git").args(&["config", "--global", "user.name", "foo"])
+ .exec().unwrap();
+ process("git").args(&["config", "--global", "user.email", "bar"])
+ .exec().unwrap();
+ let root = paths::root();
+ fs::create_dir(&root.join(".cargo")).unwrap();
+ File::create(&root.join(".cargo/config")).unwrap().write_all(br#"
+ [cargo-new]
+ name = "new-foo"
+ email = "new-bar"
+ vcs = "none"
+ "#).unwrap();
+
+ assert_that(cargo_process("new").arg("foo").env("USER", "foo"),
+ execs().with_status(0));
+
+ let toml = paths::root().join("foo/Cargo.toml");
+ let mut contents = String::new();
+ File::open(&toml).unwrap().read_to_string(&mut contents).unwrap();
+ assert!(contents.contains(r#"authors = ["new-foo <new-bar>"]"#));
+ assert!(!root.join("foo/.gitignore").exists());
+}
+
+#[test]
+fn git_prefers_command_line() {
+ let root = paths::root();
+ fs::create_dir(&root.join(".cargo")).unwrap();
+ File::create(&root.join(".cargo/config")).unwrap().write_all(br#"
+ [cargo-new]
+ vcs = "none"
+ name = "foo"
+ email = "bar"
+ "#).unwrap();
+
+ assert_that(cargo_process("new").arg("foo").arg("--vcs").arg("git")
+ .env("USER", "foo"),
+ execs().with_status(0));
+ assert!(paths::root().join("foo/.gitignore").exists());
+}
+
+#[test]
+fn subpackage_no_git() {
+ assert_that(cargo_process("new").arg("foo").env("USER", "foo"),
+ execs().with_status(0));
+
+ let subpackage = paths::root().join("foo").join("components");
+ fs::create_dir(&subpackage).unwrap();
+ assert_that(cargo_process("new").arg("foo/components/subcomponent")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join("foo/components/subcomponent/.git"),
+ is_not(existing_file()));
+ assert_that(&paths::root().join("foo/components/subcomponent/.gitignore"),
+ is_not(existing_file()));
+}
+
+#[test]
+fn subpackage_git_with_vcs_arg() {
+ assert_that(cargo_process("new").arg("foo").env("USER", "foo"),
+ execs().with_status(0));
+
+ let subpackage = paths::root().join("foo").join("components");
+ fs::create_dir(&subpackage).unwrap();
+ assert_that(cargo_process("new").arg("foo/components/subcomponent")
+ .arg("--vcs").arg("git")
+ .env("USER", "foo"),
+ execs().with_status(0));
+
+ assert_that(&paths::root().join("foo/components/subcomponent/.git"),
+ existing_dir());
+ assert_that(&paths::root().join("foo/components/subcomponent/.gitignore"),
+ existing_file());
+}
+
+#[test]
+fn unknown_flags() {
+ assert_that(cargo_process("new").arg("foo").arg("--flag"),
+ execs().with_status(1)
+ .with_stderr("\
+[ERROR] Unknown flag: '--flag'
+
+Usage:
+ cargo new [..]
+ cargo new [..]
+"));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::git;
+use cargotest::support::paths;
+use cargotest::support::registry::Package;
+use cargotest::support::{execs, project};
+use hamcrest::assert_that;
+
+#[test]
+fn override_simple() {
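+ // `[replace]` should redirect the registry's `foo` to the git checkout
+ // of the same version.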
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[UPDATING] git repository `[..]`
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] local v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn missing_version() {
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ foo = { git = 'https://example.com' }
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ replacements must specify a version to replace, but `[..]foo` does not
+"));
+}
+
+#[test]
+fn invalid_semver_version() {
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+
+ [replace]
+ "foo:*" = { git = 'https://example.com' }
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr_contains("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ replacements must specify a valid semver version to replace, but `foo:*` does not
+"));
+}
+
+#[test]
+fn different_version() {
+ Package::new("foo", "0.2.0").publish();
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = "0.2.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ replacements cannot specify a version requirement, but found one for [..]
+"));
+}
+
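+// `[replace]` also applies when the replaced crate is only reached
+// transitively (here via `bar`).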
+#[test]
+fn transitive() {
+ Package::new("foo", "0.1.0").publish();
+ Package::new("bar", "0.2.0")
+ .dep("foo", "0.1.0")
+ .file("src/lib.rs", "extern crate foo; fn bar() { foo::foo(); }")
+ .publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "0.2.0"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[UPDATING] git repository `[..]`
+[DOWNLOADING] bar v0.2.0 (registry [..])
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] bar v0.2.0
+[COMPILING] local v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn persists_across_rebuilds() {
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[UPDATING] git repository `file://[..]`
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] local v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+}
+
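+// A registry dependency can be replaced with a local path as well as a git repository.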
+#[test]
+fn replace_registry_with_path() {
+ Package::new("foo", "0.1.0").publish();
+
+ let _ = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = { path = "../foo" }
+ "#)
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] local v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
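+// With foo 0.1 and foo 0.2 both in the graph, the `foo:0.2.0` spec replaces
+// only the 0.2 copy; foo 0.1 still comes from the registry.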
+#[test]
+fn use_a_spec_to_select() {
+ Package::new("foo", "0.1.1")
+ .file("src/lib.rs", "pub fn foo1() {}")
+ .publish();
+ Package::new("foo", "0.2.0").publish();
+ Package::new("bar", "0.1.1")
+ .dep("foo", "0.2")
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() { foo::foo3(); }
+ ")
+ .publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo3() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "0.1"
+ foo = "0.1"
+
+ [replace]
+ "foo:0.2.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "
+ extern crate foo;
+ extern crate bar;
+
+ pub fn local() {
+ foo::foo1();
+ bar::bar();
+ }
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[UPDATING] git repository `[..]`
+[DOWNLOADING] [..]
+[DOWNLOADING] [..]
+[COMPILING] [..]
+[COMPILING] [..]
+[COMPILING] [..]
+[COMPILING] local v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
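+// The git override of `bar` adds a dependency on `foo` that the registry copy
+// lacks; targeted `cargo update -p` runs must each touch only their own source.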
+#[test]
+fn override_adds_some_deps() {
+ Package::new("foo", "0.1.1").publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "0.1"
+
+ [replace]
+ "bar:0.1.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[UPDATING] git repository `[..]`
+[DOWNLOADING] foo v0.1.1 (registry [..])
+[COMPILING] foo v0.1.1
+[COMPILING] bar v0.1.0 ([..])
+[COMPILING] local v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+
+ Package::new("foo", "0.1.2").publish();
+ assert_that(p.cargo("update").arg("-p").arg(&format!("{}#bar", foo.url())),
+ execs().with_status(0).with_stderr("\
+[UPDATING] git repository `file://[..]`
+"));
+ assert_that(p.cargo("update")
+ .arg("-p")
+ .arg("https://github.com/rust-lang/crates.io-index#bar"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+"));
+
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn locked_means_locked_yes_no_seriously_i_mean_locked() {
+ // This, in theory, exercises issue #2041.
+ Package::new("foo", "0.1.0").publish();
+ Package::new("foo", "0.2.0").publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+ bar = "0.1"
+
+ [replace]
+ "bar:0.1.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+ assert_that(p.cargo("build"), execs().with_status(0).with_stdout(""));
+}
+
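+// The override repository only contains a package named `bar`, so no match
+// for `foo:0.1.0` is found.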
+#[test]
+fn override_wrong_name() {
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[UPDATING] git repository [..]
+error: no matching package for override `[..]foo:0.1.0` found
+location searched: file://[..]
+version required: = 0.1.0
+"));
+}
+
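+// The override repository has no Cargo.toml at all, so the source fails to load.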
+#[test]
+fn override_with_nothing() {
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("src/lib.rs", "")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[UPDATING] git repository [..]
+[ERROR] failed to load source for a dependency on `foo`
+
+Caused by:
+ Unable to update file://[..]
+
+Caused by:
+ Could not find Cargo.toml in `[..]`
+"));
+}
+
+#[test]
+fn override_wrong_version() {
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [replace]
+ "foo:0.1.0" = { git = 'https://example.com', version = '0.2.0' }
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ replacements cannot specify a version requirement, but found one for `[..]foo:0.1.0`
+"));
+}
+
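+// Two `[replace]` entries (short form and fully-qualified form) matching the
+// same package are reported as overlapping.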
+#[test]
+fn multiple_specs() {
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{0}' }}
+
+ [replace."https://github.com/rust-lang/crates.io-index#foo:0.1.0"]
+ git = '{0}'
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[UPDATING] git repository [..]
+error: overlapping replacement specifications found:
+
+ * [..]
+ * [..]
+
+both specifications match: foo v0.1.0
+"));
+}
+
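+// Both the original `foo` and its replacement are in the resolve graph, so a
+// bare `cargo test -p foo` is ambiguous.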
+#[test]
+fn test_override_dep() {
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{0}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("test").arg("-p").arg("foo"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+error: There are multiple `foo` packages in your project, and the [..]
+Please re-run this command with [..]
+ [..]#foo:0.1.0
+ [..]#foo:0.1.0
+"));
+}
+
+#[test]
+fn update() {
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{0}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+ assert_that(p.cargo("update"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] git repository `[..]`
+"));
+}
+
+// local -> near -> far
+// near is overridden with itself
+#[test]
+fn no_override_self() {
+ let deps = git::repo(&paths::root().join("override"))
+
+ .file("far/Cargo.toml", r#"
+ [package]
+ name = "far"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("far/src/lib.rs", "")
+
+ .file("near/Cargo.toml", r#"
+ [package]
+ name = "near"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ far = { path = "../far" }
+ "#)
+ .file("near/src/lib.rs", r#"
+ #![no_std]
+ pub extern crate far;
+ "#)
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ near = {{ git = '{0}' }}
+
+ [replace]
+ "near:0.1.0" = {{ git = '{0}' }}
+ "#, deps.url()))
+ .file("src/lib.rs", r#"
+ #![no_std]
+ pub extern crate near;
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--verbose"),
+ execs().with_status(0));
+}
+
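+// A `paths` override whose dependency list diverges from the original package
+// currently only warns; the message below says it will become a hard error.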
+#[test]
+fn broken_path_override_warns() {
+ Package::new("foo", "0.1.0").publish();
+ Package::new("foo", "0.2.0").publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a1" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a1/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+ "#)
+ .file("a1/src/lib.rs", "")
+ .file("a2/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.2"
+ "#)
+ .file("a2/src/lib.rs", "")
+ .file(".cargo/config", r#"
+ paths = ["a2"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] [..]
+warning: path override for crate `a` has altered the original list of
+dependencies; the dependency on `foo` was either added or
+modified to not match the previously resolved version
+
+This is currently allowed but is known to produce buggy behavior with spurious
+recompiles and changes to the crate graph. Path overrides unfortunately were
+never intended to support this feature, so for now this message is just a
+warning. In the future, however, this message will become a hard error.
+
+To change the dependency graph via an override it's recommended to use the
+`[replace]` feature of Cargo instead of the path override feature. This is
+documented online at the url below for more information.
+
+http://doc.crates.io/specifying-dependencies.html#overriding-dependencies
+
+[DOWNLOADING] [..]
+[COMPILING] [..]
+[COMPILING] [..]
+[COMPILING] [..]
+[FINISHED] [..]
+"));
+}
+
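+// Driven by a hand-written Cargo.lock: the replaced `chrono` links against
+// serde 0.7 while the top-level crate uses the replaced serde 0.8.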
+#[test]
+fn override_an_override() {
+ Package::new("chrono", "0.2.0").dep("serde", "< 0.9").publish();
+ Package::new("serde", "0.7.0")
+ .file("src/lib.rs", "pub fn serde07() {}")
+ .publish();
+ Package::new("serde", "0.8.0")
+ .file("src/lib.rs", "pub fn serde08() {}")
+ .publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ chrono = "0.2"
+ serde = "0.8"
+
+ [replace]
+ "chrono:0.2.0" = { path = "chrono" }
+ "serde:0.8.0" = { path = "serde" }
+ "#)
+ .file("Cargo.lock", r#"
+ [[package]]
+ name = "local"
+ version = "0.0.1"
+ dependencies = [
+ "chrono 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ "serde 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ ]
+
+ [[package]]
+ name = "chrono"
+ version = "0.2.0"
+ source = "registry+https://github.com/rust-lang/crates.io-index"
+ replace = "chrono 0.2.0"
+
+ [[package]]
+ name = "chrono"
+ version = "0.2.0"
+ dependencies = [
+ "serde 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ ]
+
+ [[package]]
+ name = "serde"
+ version = "0.7.0"
+ source = "registry+https://github.com/rust-lang/crates.io-index"
+
+ [[package]]
+ name = "serde"
+ version = "0.8.0"
+ source = "registry+https://github.com/rust-lang/crates.io-index"
+ replace = "serde 0.8.0"
+
+ [[package]]
+ name = "serde"
+ version = "0.8.0"
+ "#)
+ .file("src/lib.rs", "
+ extern crate chrono;
+ extern crate serde;
+
+ pub fn local() {
+ chrono::chrono();
+ serde::serde08_override();
+ }
+ ")
+ .file("chrono/Cargo.toml", r#"
+ [package]
+ name = "chrono"
+ version = "0.2.0"
+ authors = []
+
+ [dependencies]
+ serde = "< 0.9"
+ "#)
+ .file("chrono/src/lib.rs", "
+ extern crate serde;
+ pub fn chrono() {
+ serde::serde07();
+ }
+ ")
+ .file("serde/Cargo.toml", r#"
+ [package]
+ name = "serde"
+ version = "0.8.0"
+ authors = []
+ "#)
+ .file("serde/src/lib.rs", "
+ pub fn serde08_override() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
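+// A `[replace]` entry that never matches anything in the graph produces a
+// warning on the next build rather than a spurious recompile.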
+#[test]
+fn overriding_nonexistent_no_spurious() {
+ Package::new("foo", "0.1.0").dep("bar", "0.1").publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ let p = project("local")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = {{ git = '{url}' }}
+ "bar:0.1.0" = {{ git = '{url}' }}
+ "#, url = foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[WARNING] package replacement is not used: [..]bar:0.1.0
+[FINISHED] [..]
+").with_stdout(""));
+}
+
+#[test]
+fn no_warnings_when_replace_is_used_in_another_workspace_member() {
+ Package::new("foo", "0.1.0").publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("ws")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = [ "first_crate", "second_crate"]
+
+ [replace]
+ "foo:0.1.0" = { path = "local_foo" }"#)
+ .file("first_crate/Cargo.toml", r#"
+ [package]
+ name = "first_crate"
+ version = "0.1.0"
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("first_crate/src/lib.rs", "")
+ .file("second_crate/Cargo.toml", r#"
+ [package]
+ name = "second_crate"
+ version = "0.1.0"
+ "#)
+ .file("second_crate/src/lib.rs", "")
+ .file("local_foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("local_foo/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("first_crate")),
+ execs().with_status(0)
+ .with_stdout("")
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[COMPILING] foo v0.1.0 ([..])
+[COMPILING] first_crate v0.1.0 ([..])
+[FINISHED] [..]"));
+
+ assert_that(p.cargo("build").cwd(p.root().join("second_crate")),
+ execs().with_status(0)
+ .with_stdout("")
+ .with_stderr("\
+[COMPILING] second_crate v0.1.0 ([..])
+[FINISHED] [..]"));
+}
+
+#[test]
+fn override_to_path_dep() {
+ Package::new("foo", "0.1.0").dep("bar", "0.1").publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("foo/bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("foo/bar/src/lib.rs", "")
+ .file(".cargo/config", r#"
+ paths = ["foo"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn replace_to_path_dep() {
+ Package::new("foo", "0.1.0").dep("bar", "0.1").publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [replace]
+ "foo:0.1.0" = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "extern crate foo;")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("foo/src/lib.rs", "
+ extern crate bar;
+
+ pub fn foo() {
+ bar::bar();
+ }
+ ")
+ .file("foo/bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/bar/src/lib.rs", "pub fn bar() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
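+// An optional dependency declared identically by the original and the `paths`
+// override leaves the dependency list unchanged, so no warning is issued.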
+#[test]
+fn paths_ok_with_optional() {
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { version = "0.1", optional = true }
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("foo2/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { version = "0.1", optional = true }
+ "#)
+ .file("foo2/src/lib.rs", "")
+ .file(".cargo/config", r#"
+ paths = ["foo2"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.1.0 ([..]foo2)
+[COMPILING] local v0.0.1 ([..])
+[FINISHED] [..]
+"));
+}
+
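+// But an optional dependency that only the override declares still counts as
+// altering the dependency list.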
+#[test]
+fn paths_add_optional_bad() {
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("foo2/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { version = "0.1", optional = true }
+ "#)
+ .file("foo2/src/lib.rs", "")
+ .file(".cargo/config", r#"
+ paths = ["foo2"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr_contains("\
+warning: path override for crate `foo` has altered the original list of
+dependencies; the dependency on `bar` was either added or\
+"));
+}
+
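+// The replaced `bar` still ends up built with its `default` feature, pulled
+// in via `another` 0.1.1, which depends on it with default features on.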
+#[test]
+fn override_with_default_feature() {
+ Package::new("another", "0.1.0").publish();
+ Package::new("another", "0.1.1")
+ .dep("bar", "0.1")
+ .publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("local")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "local"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar", default-features = false }
+ another = "0.1"
+ another2 = { path = "another2" }
+
+ [replace]
+ 'bar:0.1.0' = { path = "bar" }
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+
+ fn main() {
+ bar::bar();
+ }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ default = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[cfg(feature = "default")]
+ pub fn bar() {}
+ "#)
+ .file("another2/Cargo.toml", r#"
+ [package]
+ name = "another2"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { version = "0.1", default-features = false }
+ "#)
+ .file("another2/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0));
+}
+
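+// A replacement that depends back on the package doing the replacing forms a
+// dependency cycle.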
+#[test]
+fn override_plus_dep() {
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "0.1"
+
+ [replace]
+ 'bar:0.1.0' = { path = "bar" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = { path = ".." }
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr_contains("\
+error: cyclic package dependency: [..]
+"));
+}
--- /dev/null
+#[macro_use]
+extern crate cargotest;
+extern crate flate2;
+extern crate git2;
+extern crate hamcrest;
+extern crate tar;
+
+use std::fs::File;
+use std::io::prelude::*;
+use std::path::{Path, PathBuf};
+
+use cargotest::{cargo_process, process};
+use cargotest::support::{project, execs, paths, git, path2url, cargo_exe};
+use cargotest::support::registry::Package;
+use flate2::read::GzDecoder;
+use hamcrest::{assert_that, existing_file, contains, equal_to};
+use tar::Archive;
+
+#[test]
+fn simple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ exclude = ["*.txt"]
+ license = "MIT"
+ description = "foo"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .file("src/bar.txt", "") // should be ignored when packaging
+ .build();
+
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stderr(&format!("\
+[WARNING] manifest has no documentation[..]
+See [..]
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[COMPILING] foo v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+ assert_that(&p.root().join("target/package/foo-0.0.1.crate"), existing_file());
+ assert_that(p.cargo("package").arg("-l"),
+ execs().with_status(0).with_stdout("\
+Cargo.toml
+src[/]main.rs
+"));
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stdout(""));
+
+ let f = File::open(&p.root().join("target/package/foo-0.0.1.crate")).unwrap();
+ let mut rdr = GzDecoder::new(f).unwrap();
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ for f in ar.entries().unwrap() {
+ let f = f.unwrap();
+ let fname = f.header().path_bytes();
+ let fname = &*fname;
+ assert!(fname == b"foo-0.0.1/Cargo.toml" ||
+ fname == b"foo-0.0.1/Cargo.toml.orig" ||
+ fname == b"foo-0.0.1/src/main.rs",
+ "unexpected filename: {:?}", f.header().path())
+ }
+}
+
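+// The warning lists exactly the metadata fields still missing and disappears
+// once description, license and repository are all set.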
+#[test]
+fn metadata_warning() {
+ let p = project("all")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stderr(&format!("\
+warning: manifest has no description, license, license-file, documentation, \
+homepage or repository.
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[COMPILING] foo v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+
+ let p = project("one")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stderr(&format!("\
+warning: manifest has no description, documentation, homepage or repository.
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[COMPILING] foo v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+
+ let p = project("all")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ repository = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stderr(&format!("\
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[COMPILING] foo v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+}
+
+#[test]
+fn package_verbose() {
+ let root = paths::root().join("all");
+ let p = git::repo(&root)
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+ let mut cargo = cargo_process();
+ cargo.cwd(p.root());
+ assert_that(cargo.clone().arg("build"), execs().with_status(0));
+
+ println!("package main repo");
+ assert_that(cargo.clone().arg("package").arg("-v").arg("--no-verify"),
+ execs().with_status(0).with_stderr("\
+[WARNING] manifest has no description[..]
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] foo v0.0.1 ([..])
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+"));
+
+ println!("package sub-repo");
+ assert_that(cargo.arg("package").arg("-v").arg("--no-verify")
+ .cwd(p.root().join("a")),
+ execs().with_status(0).with_stderr("\
+[WARNING] manifest has no description[..]
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] a v0.0.1 ([..])
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+"));
+}
+
+#[test]
+fn package_verification() {
+ let p = project("all")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stderr(&format!("\
+[WARNING] manifest has no description[..]
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[COMPILING] foo v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+}
+
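+// Packaging requires every path dependency to also specify a registry version.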
+#[test]
+fn path_dependency_no_version() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("package"),
+ execs().with_status(101).with_stderr("\
+[WARNING] manifest has no documentation, homepage or repository.
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[ERROR] all path dependencies must have a version specified when packaging.
+dependency `bar` does not specify a version.
+"));
+}
+
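+// Exercises `exclude` glob semantics; each pattern below is annotated with
+// whether its current behavior is stable (NO_CHANGE) or migrating (CHANGING).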
+#[test]
+fn exclude() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ exclude = [
+ "*.txt",
+ # file in root
+ "file_root_1", # NO_CHANGE (ignored)
+ "/file_root_2", # CHANGING (packaged -> ignored)
+ "file_root_3/", # NO_CHANGE (packaged)
+ "file_root_4/*", # NO_CHANGE (packaged)
+ "file_root_5/**", # NO_CHANGE (packaged)
+ # file in sub-dir
+ "file_deep_1", # CHANGING (packaged -> ignored)
+ "/file_deep_2", # NO_CHANGE (packaged)
+ "file_deep_3/", # NO_CHANGE (packaged)
+ "file_deep_4/*", # NO_CHANGE (packaged)
+ "file_deep_5/**", # NO_CHANGE (packaged)
+ # dir in root
+ "dir_root_1", # CHANGING (packaged -> ignored)
+ "/dir_root_2", # CHANGING (packaged -> ignored)
+ "dir_root_3/", # CHANGING (packaged -> ignored)
+ "dir_root_4/*", # NO_CHANGE (ignored)
+ "dir_root_5/**", # NO_CHANGE (ignored)
+ # dir in sub-dir
+ "dir_deep_1", # CHANGING (packaged -> ignored)
+ "/dir_deep_2", # NO_CHANGE
+ "dir_deep_3/", # CHANGING (packaged -> ignored)
+ "dir_deep_4/*", # CHANGING (packaged -> ignored)
+ "dir_deep_5/**", # CHANGING (packaged -> ignored)
+ ]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .file("bar.txt", "")
+ .file("src/bar.txt", "")
+ // file in root
+ .file("file_root_1", "")
+ .file("file_root_2", "")
+ .file("file_root_3", "")
+ .file("file_root_4", "")
+ .file("file_root_5", "")
+ // file in sub-dir
+ .file("some_dir/file_deep_1", "")
+ .file("some_dir/file_deep_2", "")
+ .file("some_dir/file_deep_3", "")
+ .file("some_dir/file_deep_4", "")
+ .file("some_dir/file_deep_5", "")
+ // dir in root
+ .file("dir_root_1/some_dir/file", "")
+ .file("dir_root_2/some_dir/file", "")
+ .file("dir_root_3/some_dir/file", "")
+ .file("dir_root_4/some_dir/file", "")
+ .file("dir_root_5/some_dir/file", "")
+ // dir in sub-dir
+ .file("some_dir/dir_deep_1/some_dir/file", "")
+ .file("some_dir/dir_deep_2/some_dir/file", "")
+ .file("some_dir/dir_deep_3/some_dir/file", "")
+ .file("some_dir/dir_deep_4/some_dir/file", "")
+ .file("some_dir/dir_deep_5/some_dir/file", "")
+ .build();
+
+ assert_that(p.cargo("package").arg("--no-verify").arg("-v"),
+ execs().with_status(0).with_stdout("").with_stderr("\
+[WARNING] manifest has no description[..]
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] foo v0.0.1 ([..])
+[WARNING] [..] file `dir_root_1[/]some_dir[/]file` WILL be excluded [..]
+See [..]
+[WARNING] [..] file `dir_root_2[/]some_dir[/]file` WILL be excluded [..]
+See [..]
+[WARNING] [..] file `dir_root_3[/]some_dir[/]file` WILL be excluded [..]
+See [..]
+[WARNING] [..] file `some_dir[/]dir_deep_1[/]some_dir[/]file` WILL be excluded [..]
+See [..]
+[WARNING] [..] file `some_dir[/]dir_deep_3[/]some_dir[/]file` WILL be excluded [..]
+See [..]
+[WARNING] [..] file `some_dir[/]dir_deep_4[/]some_dir[/]file` WILL be excluded [..]
+See [..]
+[WARNING] [..] file `some_dir[/]dir_deep_5[/]some_dir[/]file` WILL be excluded [..]
+See [..]
+[WARNING] [..] file `some_dir[/]file_deep_1` WILL be excluded [..]
+See [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+"));
+
+ assert_that(&p.root().join("target/package/foo-0.0.1.crate"), existing_file());
+
+ assert_that(p.cargo("package").arg("-l"),
+ execs().with_status(0).with_stdout("\
+Cargo.toml
+dir_root_1[/]some_dir[/]file
+dir_root_2[/]some_dir[/]file
+dir_root_3[/]some_dir[/]file
+file_root_3
+file_root_4
+file_root_5
+some_dir[/]dir_deep_1[/]some_dir[/]file
+some_dir[/]dir_deep_2[/]some_dir[/]file
+some_dir[/]dir_deep_3[/]some_dir[/]file
+some_dir[/]dir_deep_4[/]some_dir[/]file
+some_dir[/]dir_deep_5[/]some_dir[/]file
+some_dir[/]file_deep_1
+some_dir[/]file_deep_2
+some_dir[/]file_deep_3
+some_dir[/]file_deep_4
+some_dir[/]file_deep_5
+src[/]main.rs
+"));
+}
+
+#[test]
+fn include() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ exclude = ["*.txt"]
+ include = ["foo.txt", "**/*.rs", "Cargo.toml"]
+ "#)
+ .file("foo.txt", "")
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .file("src/bar.txt", "") // should be ignored when packaging
+ .build();
+
+ assert_that(p.cargo("package").arg("--no-verify").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[WARNING] manifest has no description[..]
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] foo v0.0.1 ([..])
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+[ARCHIVING] [..]
+"));
+}
+
+#[test]
+fn package_lib_with_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ extern crate foo;
+ fn main() {}
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("package").arg("-v"),
+ execs().with_status(0));
+}
+
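+// Files tracked through a git submodule are archived along with the parent repository.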
+#[test]
+fn package_git_submodule() {
+ let project = git::new("foo", |project| {
+ project.file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = ["foo@example.com"]
+ license = "MIT"
+ description = "foo"
+ repository = "foo"
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ }).unwrap();
+ let library = git::new("bar", |library| {
+ library.file("Makefile", "all:")
+ }).unwrap();
+
+ let repository = git2::Repository::open(&project.root()).unwrap();
+ let url = path2url(library.root()).to_string();
+ git::add_submodule(&repository, &url, Path::new("bar"));
+ git::commit(&repository);
+
+ let repository = git2::Repository::open(&project.root().join("bar")).unwrap();
+ repository.reset(&repository.revparse_single("HEAD").unwrap(),
+ git2::ResetType::Hard, None).unwrap();
+
+ assert_that(cargo_process().arg("package").cwd(project.root())
+ .arg("--no-verify").arg("-v"),
+ execs().with_status(0).with_stderr_contains("[ARCHIVING] bar/Makefile"));
+}
+
+#[test]
+fn no_duplicates_from_modified_tracked_files() {
+ let root = paths::root().join("all");
+ let p = git::repo(&root)
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+ File::create(p.root().join("src/main.rs")).unwrap().write_all(br#"
+ fn main() { println!("A change!"); }
+ "#).unwrap();
+ let mut cargo = cargo_process();
+ cargo.cwd(p.root());
+ assert_that(cargo.clone().arg("build"), execs().with_status(0));
+ assert_that(cargo.arg("package").arg("--list"),
+ execs().with_status(0).with_stdout("\
+Cargo.toml
+src/main.rs
+"));
+}
+
+#[test]
+fn ignore_nested() {
+ let cargo_toml = r#"
+ [project]
+ name = "nested"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "nested"
+ "#;
+ let main_rs = r#"
+ fn main() { println!("hello"); }
+ "#;
+ let p = project("nested")
+ .file("Cargo.toml", cargo_toml)
+ .file("src/main.rs", main_rs)
+ // If a project happens to contain a copy of itself, we should
+ // ignore it.
+ .file("a_dir/nested/Cargo.toml", cargo_toml)
+ .file("a_dir/nested/src/main.rs", main_rs)
+ .build();
+
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stderr(&format!("\
+[WARNING] manifest has no documentation[..]
+See http://doc.crates.io/manifest.html#package-metadata for more info.
+[PACKAGING] nested v0.0.1 ({dir})
+[VERIFYING] nested v0.0.1 ({dir})
+[COMPILING] nested v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+ assert_that(&p.root().join("target/package/nested-0.0.1.crate"), existing_file());
+ assert_that(p.cargo("package").arg("-l"),
+ execs().with_status(0).with_stdout("\
+Cargo.toml
+src[..]main.rs
+"));
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stdout(""));
+
+ let f = File::open(&p.root().join("target/package/nested-0.0.1.crate")).unwrap();
+ let mut rdr = GzDecoder::new(f).unwrap();
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ for f in ar.entries().unwrap() {
+ let f = f.unwrap();
+ let fname = f.header().path_bytes();
+ let fname = &*fname;
+ assert!(fname == b"nested-0.0.1/Cargo.toml" ||
+ fname == b"nested-0.0.1/Cargo.toml.orig" ||
+ fname == b"nested-0.0.1/src/main.rs",
+ "unexpected filename: {:?}", f.header().path())
+ }
+}
+
+#[cfg(unix)] // windows doesn't allow these characters in filenames
+#[test]
+fn package_weird_characters() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .file("src/:foo", "")
+ .build();
+
+ assert_that(p.cargo("package"),
+ execs().with_status(101).with_stderr("\
+warning: [..]
+See [..]
+[PACKAGING] foo [..]
+[ERROR] failed to prepare local package for uploading
+
+Caused by:
+ cannot package a filename with a special character `:`: src/:foo
+"));
+}
+
+#[test]
+fn repackage_on_source_change() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("package"),
+ execs().with_status(0));
+
+ // Add another source file
+ let mut file = File::create(p.root().join("src").join("foo.rs")).unwrap_or_else(|e| {
+ panic!("could not create file {}: {}", p.root().join("src/foo.rs").display(), e)
+ });
+
+ file.write_all(br#"
+ fn main() { println!("foo"); }
+ "#).unwrap();
+ std::mem::drop(file);
+
+ let mut pro = process(&cargo_exe());
+ pro.arg("package").cwd(p.root());
+
+ // Check that cargo rebuilds the tarball
+ assert_that(pro, execs().with_status(0).with_stderr(&format!("\
+[WARNING] [..]
+See [..]
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[COMPILING] foo v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+
+ // Check that the tarball contains the added file
+ let f = File::open(&p.root().join("target/package/foo-0.0.1.crate")).unwrap();
+ let mut rdr = GzDecoder::new(f).unwrap();
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ let entries = ar.entries().unwrap();
+ let entry_paths = entries.map(|entry| {
+ entry.unwrap().path().unwrap().into_owned()
+ }).collect::<Vec<PathBuf>>();
+ assert_that(&entry_paths, contains(vec![PathBuf::from("foo-0.0.1/src/foo.rs")]));
+}
+
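+// A dangling symlink in the package directory fails archiving with a chained error.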
+#[test]
+#[cfg(unix)]
+fn broken_symlink() {
+ use std::os::unix::fs;
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = 'foo'
+ documentation = 'foo'
+ homepage = 'foo'
+ repository = 'foo'
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .build();
+ t!(fs::symlink("nowhere", &p.root().join("src/foo.rs")));
+
+ assert_that(p.cargo("package").arg("-v"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+error: failed to prepare local package for uploading
+
+Caused by:
+ failed to open for archiving: `[..]foo.rs`
+
+Caused by:
+ [..]
+"));
+}
+
+#[test]
+fn do_not_package_if_repository_is_dirty() {
+ let p = project("foo").build();
+
+ // Create a Git repository containing a minimal Rust project.
+ let _ = git::repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ // Modify Cargo.toml without committing the change.
+ p.change_file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ # change
+ "#);
+
+ assert_that(p.cargo("package"),
+ execs().with_status(101)
+ .with_stderr("\
+error: 1 files in the working directory contain changes that were not yet \
+committed into git:
+
+Cargo.toml
+
+to proceed despite this, pass the `--allow-dirty` flag
+"));
+}
+
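+// The manifest bundled into the `.crate` is a normalized rewrite: path
+// dependencies become registry dependencies and entries are sorted.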
+#[test]
+fn generated_manifest() {
+ Package::new("abc", "1.0.0").publish();
+ Package::new("def", "1.0.0").publish();
+ Package::new("ghi", "1.0.0").publish();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ exclude = ["*.txt"]
+ license = "MIT"
+ description = "foo"
+
+ [project.metadata]
+ foo = 'bar'
+
+ [workspace]
+
+ [dependencies]
+ bar = { path = "bar", version = "0.1" }
+ def = "1.0"
+ ghi = "1.0"
+ abc = "1.0"
+ "#)
+ .file("src/main.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("package").arg("--no-verify"),
+ execs().with_status(0));
+
+ let f = File::open(&p.root().join("target/package/foo-0.0.1.crate")).unwrap();
+ let mut rdr = GzDecoder::new(f).unwrap();
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ let mut entry = ar.entries().unwrap()
+ .map(|f| f.unwrap())
+ .find(|e| e.path().unwrap().ends_with("Cargo.toml"))
+ .unwrap();
+ let mut contents = String::new();
+ entry.read_to_string(&mut contents).unwrap();
+ // BTreeMap makes the order of dependencies in the generated file deterministic
+ // by sorting alphabetically
+ assert_that(&contents[..], equal_to(
+r#"# THIS FILE IS AUTOMATICALLY GENERATED BY CARGO
+#
+# When uploading crates to the registry Cargo will automatically
+# "normalize" Cargo.toml files for maximal compatibility
+# with all versions of Cargo and also rewrite `path` dependencies
+# to registry (e.g. crates.io) dependencies
+#
+# If you believe there's an error in this file please file an
+# issue against the rust-lang/cargo repository. If you're
+# editing this file be aware that the upstream Cargo.toml
+# will likely look very different (and much more reasonable)
+
+[package]
+name = "foo"
+version = "0.0.1"
+authors = []
+exclude = ["*.txt"]
+description = "foo"
+license = "MIT"
+
+[package.metadata]
+foo = "bar"
+[dependencies.abc]
+version = "1.0"
+
+[dependencies.bar]
+version = "0.1"
+
+[dependencies.def]
+version = "1.0"
+
+[dependencies.ghi]
+version = "1.0"
+"#));
+}
+
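+// The `workspace = ".."` key is dropped from the normalized manifest.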
+#[test]
+fn ignore_workspace_specifier() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+
+ authors = []
+
+ [workspace]
+
+ [dependencies]
+ bar = { path = "bar", version = "0.1" }
+ "#)
+ .file("src/main.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = ".."
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("package").arg("--no-verify").cwd(p.root().join("bar")),
+ execs().with_status(0));
+
+ let f = File::open(&p.root().join("target/package/bar-0.1.0.crate")).unwrap();
+ let mut rdr = GzDecoder::new(f).unwrap();
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ let mut entry = ar.entries().unwrap()
+ .map(|f| f.unwrap())
+ .find(|e| e.path().unwrap().ends_with("Cargo.toml"))
+ .unwrap();
+ let mut contents = String::new();
+ entry.read_to_string(&mut contents).unwrap();
+ assert_that(&contents[..], equal_to(
+r#"# THIS FILE IS AUTOMATICALLY GENERATED BY CARGO
+#
+# When uploading crates to the registry Cargo will automatically
+# "normalize" Cargo.toml files for maximal compatibility
+# with all versions of Cargo and also rewrite `path` dependencies
+# to registry (e.g. crates.io) dependencies
+#
+# If you believe there's an error in this file please file an
+# issue against the rust-lang/cargo repository. If you're
+# editing this file be aware that the upstream Cargo.toml
+# will likely look very different (and much more reasonable)
+
+[package]
+name = "bar"
+version = "0.1.0"
+authors = []
+"#));
+}
+
+#[test]
+fn package_two_kinds_of_deps() {
+ Package::new("other", "1.0.0").publish();
+ Package::new("other1", "1.0.0").publish();
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ other = "1.0"
+ other1 = { version = "1.0" }
+ "#)
+ .file("src/main.rs", "")
+ .build();
+
+ assert_that(p.cargo("package").arg("--no-verify"),
+ execs().with_status(0));
+}
--- /dev/null
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+extern crate toml;
+
+use std::fs::{self, File};
+use std::io::{Read, Write};
+
+use cargotest::support::git;
+use cargotest::support::paths;
+use cargotest::support::registry::Package;
+use cargotest::support::{execs, project};
+use hamcrest::assert_that;
+
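+// `[patch.crates-io]` redirects every use of `foo`, whether direct or via
+// `deep-foo`, to the local copy.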
+#[test]
+fn replace() {
+ Package::new("foo", "0.1.0").publish();
+ Package::new("deep-foo", "0.1.0")
+ .file("src/lib.rs", r#"
+ extern crate foo;
+ pub fn deep() {
+ foo::foo();
+ }
+ "#)
+ .dep("foo", "0.1.0")
+ .publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ deep-foo = "0.1.0"
+
+ [patch.crates-io]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "
+ extern crate foo;
+ extern crate deep_foo;
+ pub fn bar() {
+ foo::foo();
+ deep_foo::deep();
+ }
+ ")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[DOWNLOADING] deep-foo v0.1.0 ([..])
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] deep-foo v0.1.0
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build"),//.env("RUST_LOG", "trace"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+}
+
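+// A patch can supply a crate that the registry does not contain at all.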
+#[test]
+fn nonexistent() {
+ Package::new("baz", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [patch.crates-io]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ ")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+}
+
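+// `[patch.'<git-url>']` patches a git dependency with a local path.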
+#[test]
+fn patch_git() {
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let p = project("bar")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = {{ git = '{}' }}
+
+ [patch.'{0}']
+ foo = {{ path = "foo" }}
+ "#, foo.url()))
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ ")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#"
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] git repository `file://[..]`
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+}
+
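+// And the reverse: a crates-io dependency patched with a git source.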
+#[test]
+fn patch_to_git() {
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "pub fn foo() {}")
+ .build();
+
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+
+ [patch.crates-io]
+ foo = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "
+ extern crate foo;
+ pub fn bar() {
+ foo::foo();
+ }
+ ")
+ .build();
+
+ assert_that(p.cargo("build"),//.env("RUST_LOG", "cargo=trace"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] git repository `file://[..]`
+[UPDATING] registry `file://[..]`
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+}
+
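+// The 0.2.0 patch cannot satisfy `foo = "0.1.0"`, so it is ignored (its
+// deliberately invalid source is never compiled) but still recorded in the lock file.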
+#[test]
+fn unused() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [patch.crates-io]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#"
+ not rust code
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[DOWNLOADING] foo v0.1.0 [..]
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+
+ // unused patch should be in the lock file
+ let mut lock = String::new();
+ File::open(p.root().join("Cargo.lock")).unwrap()
+ .read_to_string(&mut lock).unwrap();
+ let toml: toml::Value = toml::from_str(&lock).unwrap();
+ assert_eq!(toml["patch"]["unused"].as_array().unwrap().len(), 1);
+ assert_eq!(toml["patch"]["unused"][0]["name"].as_str(), Some("foo"));
+ assert_eq!(toml["patch"]["unused"][0]["version"].as_str(), Some("0.2.0"));
+}
+
+#[test]
+fn unused_git() {
+ Package::new("foo", "0.1.0").publish();
+
+ let foo = git::repo(&paths::root().join("override"))
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let p = project("bar")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+
+ [patch.crates-io]
+ foo = {{ git = '{}' }}
+ "#, foo.url()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] git repository `file://[..]`
+[UPDATING] registry `file://[..]`
+[DOWNLOADING] foo v0.1.0 [..]
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+}
+
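+// Adding a patch once a lock file exists switches to the local copy; since it
+// is the same 0.1.0 version, no registry update is needed.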
+#[test]
+fn add_patch() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[DOWNLOADING] foo v0.1.0 [..]
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+
+ t!(t!(File::create(p.root().join("Cargo.toml"))).write_all(br#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ "#));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.1.0 (file://[..])
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+}
+
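+// Here the late-added patch offers 0.1.1, but the lock file already pins
+// 0.1.0, so the patch stays dormant.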
+#[test]
+fn add_ignored_patch() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.1"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[DOWNLOADING] foo v0.1.0 [..]
+[COMPILING] foo v0.1.0
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+
+ t!(t!(File::create(p.root().join("Cargo.toml"))).write_all(br#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ "#));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("[FINISHED] [..]"));
+}
+
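+// On a fresh resolve, a patch offering a compatible 0.1.1 wins over the
+// registry's 0.1.0.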
+#[test]
+fn new_minor() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.1"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[COMPILING] foo v0.1.1 [..]
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn transitive_new_minor() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ subdir = { path = 'subdir' }
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ "#)
+ .file("src/lib.rs", "")
+ .file("subdir/Cargo.toml", r#"
+ [package]
+ name = "subdir"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = '0.1.0'
+ "#)
+ .file("subdir/src/lib.rs", r#""#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.1"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[COMPILING] foo v0.1.1 [..]
+[COMPILING] subdir v0.1.0 [..]
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
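+// A patch can provide an entirely new major version (0.2.0). Once the
+// registry publishes 0.2.0 and the patch is dropped, the registry copy is
+// downloaded and used instead.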
+#[test]
+fn new_major() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.2.0"
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[COMPILING] foo v0.2.0 [..]
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ Package::new("foo", "0.2.0").publish();
+ assert_that(p.cargo("update"),
+ execs().with_status(0));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ t!(t!(File::create(p.root().join("Cargo.toml"))).write_all(br#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.2.0"
+ "#));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[DOWNLOADING] foo v0.2.0 [..]
+[COMPILING] foo v0.2.0
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn transitive_new_major() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ subdir = { path = 'subdir' }
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ "#)
+ .file("src/lib.rs", "")
+ .file("subdir/Cargo.toml", r#"
+ [package]
+ name = "subdir"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = '0.2.0'
+ "#)
+ .file("subdir/src/lib.rs", r#""#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.2.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `file://[..]`
+[COMPILING] foo v0.2.0 [..]
+[COMPILING] subdir v0.1.0 [..]
+[COMPILING] bar v0.0.1 (file://[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
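+// Dropping an unused `[patch]` entry rewrites the lock file, and the result
+// must match a lock file generated from scratch.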
+#[test]
+fn remove_patch() {
+ Package::new("foo", "0.1.0").publish();
+ Package::new("bar", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ bar = { path = 'bar' }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", r#""#)
+ .build();
+
+ // Generate a lock file where `bar` is unused
+ assert_that(p.cargo("build"), execs().with_status(0));
+ let mut lock_file1 = String::new();
+ File::open(p.root().join("Cargo.lock")).unwrap()
+ .read_to_string(&mut lock_file1).unwrap();
+
+ // Remove `bar` and generate a new lock file from the old one
+ File::create(p.root().join("Cargo.toml")).unwrap().write_all(r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+
+ [patch.crates-io]
+ foo = { path = 'foo' }
+ "#.as_bytes()).unwrap();
+ assert_that(p.cargo("build"), execs().with_status(0));
+ let mut lock_file2 = String::new();
+ File::open(p.root().join("Cargo.lock")).unwrap()
+ .read_to_string(&mut lock_file2).unwrap();
+
+ // Remove the lock file and build from scratch
+ fs::remove_file(p.root().join("Cargo.lock")).unwrap();
+ assert_that(p.cargo("build"), execs().with_status(0));
+ let mut lock_file3 = String::new();
+ File::open(p.root().join("Cargo.lock")).unwrap()
+ .read_to_string(&mut lock_file3).unwrap();
+
+ assert!(lock_file1.contains("bar"));
+ assert_eq!(lock_file2, lock_file3);
+ assert!(lock_file1 != lock_file2);
+}
+
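+// Keys under `[patch]` must name a source URL; a bare string like
+// `some-other-source` fails manifest parsing.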
+#[test]
+fn non_crates_io() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [patch.some-other-source]
+ foo = { path = 'foo' }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ invalid url `some-other-source`: relative URL without a base
+"));
+}
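+// A patch has to point at a different source than the one being patched;
+// patching crates-io with a crates-io version is rejected.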
+
+#[test]
+fn replace_with_crates_io() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [patch.crates-io]
+ foo = "0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[UPDATING] [..]
+error: failed to resolve patches for `[..]`
+
+Caused by:
+ patch for `foo` in `[..]` points to the same source, but patches must point \
+ to different sources
+"));
+}
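+// `[patch]` also works when declared in a virtual manifest, i.e. a workspace
+// root with no `[package]` section of its own.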
+
+#[test]
+fn patch_in_virtual() {
+ Package::new("foo", "0.1.0").publish();
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["bar"]
+
+ [patch.crates-io]
+ foo = { path = "foo" }
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", r#""#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+ "#)
+ .file("bar/src/lib.rs", r#""#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[FINISHED] [..]
+"));
+}
--- /dev/null
+extern crate cargo;
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+
+use cargo::util::process;
+use cargotest::sleep_ms;
+use cargotest::support::paths::{self, CargoPathExt};
+use cargotest::support::{project, execs, main_file};
+use cargotest::support::registry::Package;
+use hamcrest::{assert_that, existing_file};
+
+#[test]
+#[cfg(not(windows))] // I have no idea why this is failing spuriously on
+ // Windows; for more info see #3466.
+fn cargo_compile_with_nested_deps_shorthand() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+
+ version = "0.5.0"
+ path = "bar"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.baz]
+
+ version = "0.5.0"
+ path = "baz"
+
+ [lib]
+
+ name = "bar"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ extern crate baz;
+
+ pub fn gimme() -> String {
+ baz::gimme()
+ }
+ "#)
+ .file("bar/baz/Cargo.toml", r#"
+ [project]
+
+ name = "baz"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "baz"
+ "#)
+ .file("bar/baz/src/baz.rs", r#"
+ pub fn gimme() -> String {
+ "test passed".to_string()
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr(&format!("[COMPILING] baz v0.5.0 ({}/bar/baz)\n\
+ [COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url(),
+ p.url(),
+ p.url())));
+
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("test passed\n").with_status(0));
+
+ println!("cleaning");
+ assert_that(p.cargo("clean").arg("-v"),
+ execs().with_stdout("").with_status(0));
+ println!("building baz");
+ assert_that(p.cargo("build").arg("-p").arg("baz"),
+ execs().with_status(0)
+ .with_stderr(&format!("[COMPILING] baz v0.5.0 ({}/bar/baz)\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url())));
+ println!("building foo");
+ assert_that(p.cargo("build")
+ .arg("-p").arg("foo"),
+ execs().with_status(0)
+ .with_stderr(&format!("[COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url(),
+ p.url())));
+}
+
+#[test]
+fn cargo_compile_with_root_dev_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dev-dependencies.bar]
+
+ version = "0.5.0"
+ path = "../bar"
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .build();
+ let _p2 = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn gimme() -> &'static str {
+ "zoidberg"
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101))
+}
+
+#[test]
+fn cargo_compile_with_root_dev_deps_with_testing() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dev-dependencies.bar]
+
+ version = "0.5.0"
+ path = "../bar"
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .build();
+ let _p2 = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn gimme() -> &'static str {
+ "zoidberg"
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_stderr("\
+[COMPILING] [..] v0.5.0 ([..])
+[COMPILING] [..] v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]")
+ .with_stdout_contains("running 0 tests"));
+}
+
+#[test]
+fn cargo_compile_with_transitive_dev_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+
+ version = "0.5.0"
+ path = "bar"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dev-dependencies.baz]
+
+ git = "git://example.com/path/to/nowhere"
+
+ [lib]
+
+ name = "bar"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ pub fn gimme() -> &'static str {
+ "zoidberg"
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in \
+ [..]\n",
+ p.url(),
+ p.url())));
+
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("zoidberg\n"));
+}
+
+#[test]
+fn no_rebuild_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() { bar::bar() }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+ // First time around we should compile both foo and bar
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url(),
+ p.url())));
+
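+ // Freshness is tracked via mtimes, so wait long enough that the rewrite
+ // below lands on a strictly newer timestamp.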
+ sleep_ms(1000);
+ p.change_file("src/main.rs", r#"
+ extern crate bar;
+ fn main() { bar::bar(); }
+ "#);
+ // Don't compile bar, but do recompile foo.
+ assert_that(p.cargo("build"),
+ execs().with_stderr("\
+ [COMPILING] foo v0.5.0 ([..])\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n"));
+}
+
+#[test]
+fn deep_dependencies_trigger_rebuild() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() { bar::bar() }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+ name = "bar"
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ extern crate baz;
+ pub fn bar() { baz::baz() }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+
+ name = "baz"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+ name = "baz"
+ "#)
+ .file("baz/src/baz.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] baz v0.5.0 ({}/baz)\n\
+ [COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url(),
+ p.url(),
+ p.url())));
+ assert_that(p.cargo("build"),
+ execs().with_stdout(""));
+
+ // Make sure an update to baz triggers a rebuild of bar
+ //
+ // We base recompilation on mtimes, so sleep for at least a second to ensure
+ // that this write will change the mtime.
+ sleep_ms(1000);
+ File::create(&p.root().join("baz/src/baz.rs")).unwrap().write_all(br#"
+ pub fn baz() { println!("hello!"); }
+ "#).unwrap();
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] baz v0.5.0 ({}/baz)\n\
+ [COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url(),
+ p.url(),
+ p.url())));
+
+ // Make sure an update to bar doesn't trigger a rebuild of baz
+ sleep_ms(1000);
+ File::create(&p.root().join("bar/src/bar.rs")).unwrap().write_all(br#"
+ extern crate baz;
+ pub fn bar() { println!("hello!"); baz::baz(); }
+ "#).unwrap();
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url(),
+ p.url())));
+}
+
+#[test]
+fn no_rebuild_two_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+ [dependencies.baz]
+ path = "baz"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() { bar::bar() }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+ name = "bar"
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ pub fn bar() {}
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+
+ name = "baz"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+ name = "baz"
+ "#)
+ .file("baz/src/baz.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] baz v0.5.0 ({}/baz)\n\
+ [COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url(),
+ p.url(),
+ p.url())));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(p.cargo("build"),
+ execs().with_stdout(""));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+fn nested_deps_recompile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+
+ version = "0.5.0"
+ path = "src/bar"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("src/bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [lib]
+
+ name = "bar"
+ "#)
+ .file("src/bar/src/bar.rs", "pub fn gimme() -> i32 { 92 }")
+ .build();
+ let bar = p.url();
+
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] bar v0.5.0 ({}/src/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ bar,
+ p.url())));
+ sleep_ms(1000);
+
+ File::create(&p.root().join("src/main.rs")).unwrap().write_all(br#"
+ fn main() {}
+ "#).unwrap();
+
+ // This shouldn't recompile `bar`
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url())));
+}
+
+#[test]
+fn error_message_for_missing_manifest() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+
+ path = "src/bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bar/not-a-manifest", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to load source for a dependency on `bar`
+
+Caused by:
+ Unable to update file://[..]
+
+Caused by:
+ failed to read `[..]bar[/]Cargo.toml`
+
+Caused by:
+ [..] (os error [..])
+"));
+}
+
+#[test]
+fn override_relative() {
+ let bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ fs::create_dir(&paths::root().join(".cargo")).unwrap();
+ File::create(&paths::root().join(".cargo/config")).unwrap()
+ .write_all(br#"paths = ["bar"]"#).unwrap();
+
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = '{}'
+ "#, bar.root().display()))
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"), execs().with_status(0));
+}
+
+#[test]
+fn override_self() {
+ let bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let p = project("foo");
+ let root = p.root().clone();
+ let p = p
+ .file(".cargo/config", &format!(r#"
+ paths = ['{}']
+ "#, root.display()))
+ .file("Cargo.toml", &format!(r#"
+ [package]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+ path = '{}'
+
+ "#, bar.root().display()))
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
+#[test]
+fn override_path_dep() {
+ let bar = project("bar")
+ .file("p1/Cargo.toml", r#"
+ [package]
+ name = "p1"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies.p2]
+ path = "../p2"
+ "#)
+ .file("p1/src/lib.rs", "")
+ .file("p2/Cargo.toml", r#"
+ [package]
+ name = "p2"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("p2/src/lib.rs", "")
+ .build();
+
+ let p = project("foo")
+ .file(".cargo/config", &format!(r#"
+ paths = ['{}', '{}']
+ "#, bar.root().join("p1").display(),
+ bar.root().join("p2").display()))
+ .file("Cargo.toml", &format!(r#"
+ [package]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.p2]
+ path = '{}'
+
+ "#, bar.root().join("p2").display()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn path_dep_build_cmd() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+
+ name = "foo"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+
+ [dependencies.bar]
+
+ version = "0.5.0"
+ path = "bar"
+ "#)
+ .file("src/main.rs",
+ &main_file(r#""{}", bar::gimme()"#, &["bar"]))
+ .file("bar/Cargo.toml", r#"
+ [project]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ build = "build.rs"
+
+ [lib]
+ name = "bar"
+ path = "src/bar.rs"
+ "#)
+ .file("bar/build.rs", r#"
+ use std::fs;
+ fn main() {
+ fs::copy("src/bar.rs.in", "src/bar.rs").unwrap();
+ }
+ "#)
+ .file("bar/src/bar.rs.in", r#"
+ pub fn gimme() -> i32 { 0 }
+ "#).build();
+ p.root().join("bar").move_into_the_past();
+
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in \
+ [..]\n",
+ p.url(),
+ p.url())));
+
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("0\n"));
+
+ // Touching bar.rs.in should cause the `build` command to run again.
+ {
+ let file = fs::File::create(&p.root().join("bar/src/bar.rs.in"));
+ file.unwrap().write_all(br#"pub fn gimme() -> i32 { 1 }"#).unwrap();
+ }
+
+ assert_that(p.cargo("build"),
+ execs().with_stderr(&format!("[COMPILING] bar v0.5.0 ({}/bar)\n\
+ [COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) in \
+ [..]\n",
+ p.url(),
+ p.url())));
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_stdout("1\n"));
+}
+
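+// Dev-dependencies should only be built for `cargo test`; a plain
+// `cargo build` of the library must not compile `bar`.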
+#[test]
+fn dev_deps_no_rebuild_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dev-dependencies.bar]
+ path = "bar"
+
+ [lib]
+ name = "foo"
+ doctest = false
+ "#)
+ .file("src/lib.rs", r#"
+ #[cfg(test)] #[allow(unused_extern_crates)] extern crate bar;
+ #[cfg(not(test))] pub fn foo() { env!("FOO"); }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+
+ name = "bar"
+ version = "0.5.0"
+ authors = ["wycats@example.com"]
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .build();
+ assert_that(p.cargo("build")
+ .env("FOO", "bar"),
+ execs().with_status(0)
+ .with_stderr(&format!("[COMPILING] foo v0.5.0 ({})\n\
+ [FINISHED] dev [unoptimized + debuginfo] target(s) \
+ in [..]\n",
+ p.url())));
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] [..] v0.5.0 ({url}[..])
+[COMPILING] [..] v0.5.0 ({url}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", url = p.url()))
+ .with_stdout_contains("running 0 tests"));
+}
+
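+// Artifacts should stay fresh when the target directory is moved and
+// `CARGO_TARGET_DIR` is pointed at the new location: only `b` compiles.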
+#[test]
+fn custom_target_no_rebuild() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ a = { path = "a" }
+ [workspace]
+ members = ["a", "b"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ a = { path = "../a" }
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] a v0.5.0 ([..])
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ t!(fs::rename(p.root().join("target"), p.root().join("target_moved")));
+ assert_that(p.cargo("build")
+ .arg("--manifest-path=b/Cargo.toml")
+ .env("CARGO_TARGET_DIR", "target_moved"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] b v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn override_and_depend() {
+ let p = project("foo")
+ .file("a/a1/Cargo.toml", r#"
+ [project]
+ name = "a1"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ a2 = { path = "../a2" }
+ "#)
+ .file("a/a1/src/lib.rs", "")
+ .file("a/a2/Cargo.toml", r#"
+ [project]
+ name = "a2"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("a/a2/src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.5.0"
+ authors = []
+ [dependencies]
+ a1 = { path = "../a/a1" }
+ a2 = { path = "../a/a2" }
+ "#)
+ .file("b/src/lib.rs", "")
+ .file("b/.cargo/config", r#"
+ paths = ["../a"]
+ "#)
+ .build();
+ assert_that(p.cargo("build").cwd(p.root().join("b")),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] a2 v0.5.0 ([..])
+[COMPILING] a1 v0.5.0 ([..])
+[COMPILING] b v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn missing_path_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ paths = ["../whoa-this-does-not-exist"]
+ "#)
+ .build();
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to update path override `[..]../whoa-this-does-not-exist` \
+(defined in `[..]`)
+
+Caused by:
+ failed to read directory `[..]`
+
+Caused by:
+ [..] (os error [..])
+"));
+}
+
+#[test]
+fn invalid_path_dep_in_workspace_with_lockfile() {
+ Package::new("bar", "1.0.0").publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "top"
+ version = "0.5.0"
+ authors = []
+
+ [workspace]
+
+ [dependencies]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("foo/src/lib.rs", "")
+ .build();
+
+ // Generate a lock file
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ // Change the dependency on `bar` to an invalid path
+ File::create(&p.root().join("foo/Cargo.toml")).unwrap().write_all(br#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "" }
+ "#).unwrap();
+
+ // Make sure we get a nice error. In the past this actually overflowed
+ // the stack!
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: no matching package named `bar` found (required by `foo`)
+location searched: [..]
+version required: *
+"));
+}
+
+#[test]
+fn workspace_produces_rlib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "top"
+ version = "0.5.0"
+ authors = []
+
+ [workspace]
+
+ [dependencies]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ assert_that(&p.root().join("target/debug/libtop.rlib"), existing_file());
+ assert_that(&p.root().join("target/debug/libfoo.rlib"), existing_file());
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs;
+use std::env;
+
+use cargotest::{is_nightly, rustc_host};
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn plugin_to_the_max() {
+ if !is_nightly() { return }
+
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo_lib"
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(plugin)]
+ #![plugin(bar)]
+ extern crate foo_lib;
+
+ fn main() { foo_lib::foo(); }
+ "#)
+ .file("src/foo_lib.rs", r#"
+ #![feature(plugin)]
+ #![plugin(bar)]
+
+ pub fn foo() {}
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ plugin = true
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(plugin_registrar, rustc_private)]
+
+ extern crate rustc_plugin;
+ extern crate baz;
+
+ use rustc_plugin::Registry;
+
+ #[plugin_registrar]
+ pub fn foo(_reg: &mut Registry) {
+ println!("{}", baz::baz());
+ }
+ "#)
+ .build();
+ let _baz = project("baz")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "baz"
+ crate_type = ["dylib"]
+ "#)
+ .file("src/lib.rs", "pub fn baz() -> i32 { 1 }")
+ .build();
+
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+ assert_that(foo.cargo("doc"),
+ execs().with_status(0));
+}
+
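+// A compiler plugin linking against a native dynamic library: `builder` is
+// compiled to a dylib first, and its location is passed to `bar`'s build
+// script through the `SRC` env var so it can emit a matching `-L` flag.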
+#[test]
+fn plugin_with_dynamic_native_dependency() {
+ if !is_nightly() { return }
+
+ let workspace = project("ws")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["builder", "foo"]
+ "#)
+ .build();
+
+ let build = project("ws/builder")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "builder"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "builder"
+ crate-type = ["dylib"]
+ "#)
+ .file("src/lib.rs", r#"
+ #[no_mangle]
+ pub extern fn foo() {}
+ "#)
+ .build();
+
+ let foo = project("ws/foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(plugin)]
+ #![plugin(bar)]
+
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ build = 'build.rs'
+
+ [lib]
+ name = "bar"
+ plugin = true
+ "#)
+ .file("bar/build.rs", r#"
+ use std::path::PathBuf;
+ use std::env;
+
+ fn main() {
+ let src = PathBuf::from(env::var("SRC").unwrap());
+ println!("cargo:rustc-flags=-L {}/deps", src.parent().unwrap().display());
+ }
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #![feature(plugin_registrar, rustc_private)]
+ extern crate rustc_plugin;
+
+ use rustc_plugin::Registry;
+
+ #[cfg_attr(not(target_env = "msvc"), link(name = "builder"))]
+ #[cfg_attr(target_env = "msvc", link(name = "builder.dll"))]
+ extern { fn foo(); }
+
+ #[plugin_registrar]
+ pub fn bar(_reg: &mut Registry) {
+ unsafe { foo() }
+ }
+ "#)
+ .build();
+
+ assert_that(build.cargo("build"),
+ execs().with_status(0));
+
+ let src = workspace.root().join("target/debug");
+ let lib = fs::read_dir(&src).unwrap().map(|s| s.unwrap().path()).find(|lib| {
+ let lib = lib.file_name().unwrap().to_str().unwrap();
+ lib.starts_with(env::consts::DLL_PREFIX) &&
+ lib.ends_with(env::consts::DLL_SUFFIX)
+ }).unwrap();
+
+ assert_that(foo.cargo("build").env("SRC", &lib).arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn plugin_integration() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+
+ [lib]
+ name = "foo"
+ plugin = true
+ doctest = false
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "")
+ .file("tests/it_works.rs", "")
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn doctest_a_plugin() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("src/lib.rs", r#"
+ #[macro_use]
+ extern crate bar;
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ plugin = true
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn bar() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
+// See #1515
+#[test]
+fn native_plugin_dependency_with_custom_ar_linker() {
+ let target = rustc_host();
+
+ let _foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ plugin = true
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ let bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.foo]
+ path = "../foo"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", &format!(r#"
+ [target.{}]
+ ar = "nonexistent-ar"
+ linker = "nonexistent-linker"
+ "#, target))
+ .build();
+
+ assert_that(bar.cargo("build").arg("--verbose"),
+ execs().with_stderr_contains("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] -C ar=nonexistent-ar -C linker=nonexistent-linker [..]`
+[ERROR] could not exec the linker [..]
+"));
+}
+
+#[test]
+fn panic_abort_plugins() {
+ if !is_nightly() {
+ return
+ }
+
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [profile.dev]
+ panic = 'abort'
+
+ [dependencies]
+ foo = { path = "foo" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ plugin = true
+ "#)
+ .file("foo/src/lib.rs", r#"
+ #![feature(rustc_private)]
+ extern crate syntax;
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn shared_panic_abort_plugins() {
+ if !is_nightly() {
+ return
+ }
+
+ let p = project("top")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "top"
+ version = "0.0.1"
+ authors = []
+
+ [profile.dev]
+ panic = 'abort'
+
+ [dependencies]
+ foo = { path = "foo" }
+ bar = { path = "bar" }
+ "#)
+ .file("src/lib.rs", "
+ extern crate bar;
+ ")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ plugin = true
+
+ [dependencies]
+ bar = { path = "../bar" }
+ "#)
+ .file("foo/src/lib.rs", r#"
+ #![feature(rustc_private)]
+ extern crate syntax;
+ extern crate bar;
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::is_nightly;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
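+// A proc-macro dependency behind a target `cfg()` expression; the cfg has to
+// be evaluated before `noop`'s crate type is discovered (hence the name).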
+#[test]
+fn probe_cfg_before_crate_type_discovery() {
+ if !is_nightly() {
+ return;
+ }
+
+ let client = project("client")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "client"
+ version = "0.0.1"
+ authors = []
+
+ [target.'cfg(not(stage300))'.dependencies.noop]
+ path = "../noop"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(proc_macro)]
+
+ #[macro_use]
+ extern crate noop;
+
+ #[derive(Noop)]
+ struct X;
+
+ fn main() {}
+ "#)
+ .build();
+ let _noop = project("noop")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "noop"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ proc-macro = true
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(proc_macro, proc_macro_lib)]
+
+ extern crate proc_macro;
+ use proc_macro::TokenStream;
+
+ #[proc_macro_derive(Noop)]
+ pub fn noop(_input: TokenStream) -> TokenStream {
+ "".parse().unwrap()
+ }
+ "#)
+ .build();
+
+ assert_that(client.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn noop() {
+ if !is_nightly() {
+ return;
+ }
+
+ let client = project("client")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "client"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.noop]
+ path = "../noop"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(proc_macro)]
+
+ #[macro_use]
+ extern crate noop;
+
+ #[derive(Noop)]
+ struct X;
+
+ fn main() {}
+ "#)
+ .build();
+ let _noop = project("noop")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "noop"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ proc-macro = true
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(proc_macro, proc_macro_lib)]
+
+ extern crate proc_macro;
+ use proc_macro::TokenStream;
+
+ #[proc_macro_derive(Noop)]
+ pub fn noop(_input: TokenStream) -> TokenStream {
+ "".parse().unwrap()
+ }
+ "#)
+ .build();
+
+ assert_that(client.cargo("build"),
+ execs().with_status(0));
+ assert_that(client.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn impl_and_derive() {
+ if !is_nightly() {
+ return;
+ }
+
+ let client = project("client")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "client"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.transmogrify]
+ path = "../transmogrify"
+ "#)
+ .file("src/main.rs", r#"
+ #![feature(proc_macro)]
+
+ #[macro_use]
+ extern crate transmogrify;
+
+ trait ImplByTransmogrify {
+ fn impl_by_transmogrify(&self) -> bool;
+ }
+
+ #[derive(Transmogrify, Debug)]
+ struct X { success: bool }
+
+ fn main() {
+ let x = X::new();
+ assert!(x.impl_by_transmogrify());
+ println!("{:?}", x);
+ }
+ "#)
+ .build();
+ let _transmogrify = project("transmogrify")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "transmogrify"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ proc-macro = true
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(proc_macro, proc_macro_lib)]
+
+ extern crate proc_macro;
+ use proc_macro::TokenStream;
+
+ #[proc_macro_derive(Transmogrify)]
+ #[doc(hidden)]
+ pub fn transmogrify(input: TokenStream) -> TokenStream {
+ "
+ impl X {
+ fn new() -> Self {
+ X { success: true }
+ }
+ }
+
+ impl ImplByTransmogrify for X {
+ fn impl_by_transmogrify(&self) -> bool {
+ true
+ }
+ }
+ ".parse().unwrap()
+ }
+ "#)
+ .build();
+
+ assert_that(client.cargo("build"),
+ execs().with_status(0));
+ assert_that(client.cargo("run"),
+ execs().with_status(0).with_stdout("X { success: true }"));
+}
+
+#[test]
+fn plugin_and_proc_macro() {
+ if !is_nightly() {
+ return;
+ }
+
+ let questionable = project("questionable")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "questionable"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ plugin = true
+ proc-macro = true
+ "#)
+ .file("src/lib.rs", r#"
+ #![feature(plugin_registrar, rustc_private)]
+ #![feature(proc_macro, proc_macro_lib)]
+
+ extern crate rustc_plugin;
+ use rustc_plugin::Registry;
+
+ extern crate proc_macro;
+ use proc_macro::TokenStream;
+
+ #[plugin_registrar]
+ pub fn plugin_registrar(_reg: &mut Registry) {}
+
+ #[proc_macro_derive(Questionable)]
+ pub fn questionable(input: TokenStream) -> TokenStream {
+ input
+ }
+ "#)
+ .build();
+
+ let msg = " lib.plugin and lib.proc-macro cannot both be true";
+ assert_that(questionable.cargo("build"),
+ execs().with_status(101).with_stderr_contains(msg));
+}
+
+#[test]
+fn proc_macro_doctest() {
+ if !is_nightly() {
+ return
+ }
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ [lib]
+ proc-macro = true
+ "#)
+ .file("src/lib.rs", r#"
+#![feature(proc_macro, proc_macro_lib)]
+#![crate_type = "proc-macro"]
+
+extern crate proc_macro;
+
+use proc_macro::TokenStream;
+
+/// ```
+/// assert!(true);
+/// ```
+#[proc_macro_derive(Bar)]
+pub fn derive(_input: TokenStream) -> TokenStream {
+ "".parse().unwrap()
+}
+
+#[test]
+fn a() {
+ assert!(true);
+}
+"#)
+ .build();
+
+ assert_that(foo.cargo("test"),
+ execs().with_status(0)
+ .with_stdout_contains("test a ... ok")
+ .with_stdout_contains_n("test [..] ... ok", 2));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::env;
+
+use cargotest::is_nightly;
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
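+// With `debug = false` no `-C debuginfo` flag is emitted, while dev builds
+// still pass `-C debug-assertions=on`.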
+#[test]
+fn profile_overrides() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [profile.dev]
+ opt-level = 1
+ debug = false
+ rpath = true
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C opt-level=1 \
+ -C debug-assertions=on \
+ -C metadata=[..] \
+ -C rpath \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [optimized] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url(),
+)));
+}
+
+#[test]
+fn opt_level_override_0() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [profile.dev]
+ opt-level = 0
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] [..] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url()
+)));
+}
+
+#[test]
+fn debug_override_1() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [profile.dev]
+ debug = 1
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C debuginfo=1 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] [..] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url()
+)));
+}
+
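+// Helper: sets `opt-level` in `[profile.dev]` and checks the matching
+// `-C opt-level` flag shows up on the rustc command line.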
+fn check_opt_level_override(profile_level: &str, rustc_level: &str) {
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [profile.dev]
+ opt-level = {level}
+ "#, level = profile_level))
+ .file("src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C opt-level={level} \
+ -C debuginfo=2 \
+ -C debug-assertions=on \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] [..] target(s) in [..]
+",
+dir = p.root().display(),
+url = p.url(),
+level = rustc_level
+)));
+}
+
+#[test]
+fn opt_level_overrides() {
+ if !is_nightly() { return }
+
+ for &(profile_level, rustc_level) in &[
+ ("1", "1"),
+ ("2", "2"),
+ ("3", "3"),
+ ("\"s\"", "s"),
+ ("\"z\"", "z"),
+ ] {
+ check_opt_level_override(profile_level, rustc_level)
+ }
+}
+
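+// The top-level crate's `[profile.release]` wins: `foo` is built with
+// opt-level 1 and debuginfo even though its own profile says otherwise.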
+#[test]
+fn top_level_overrides_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+
+ name = "test"
+ version = "0.0.0"
+ authors = []
+
+ [profile.release]
+ opt-level = 1
+ debug = true
+
+ [dependencies.foo]
+ path = "foo"
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [package]
+
+ name = "foo"
+ version = "0.0.0"
+ authors = []
+
+ [profile.release]
+ opt-level = 0
+ debug = false
+
+ [lib]
+ name = "foo"
+ crate_type = ["dylib", "rlib"]
+ "#)
+ .file("foo/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("build").arg("-v").arg("--release"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] foo v0.0.0 ({url}/foo)
+[RUNNING] `rustc --crate-name foo foo[/]src[/]lib.rs \
+ --crate-type dylib --crate-type rlib \
+ --emit=dep-info,link \
+ -C prefer-dynamic \
+ -C opt-level=1 \
+ -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir {dir}[/]target[/]release[/]deps \
+ -L dependency={dir}[/]target[/]release[/]deps`
+[COMPILING] test v0.0.0 ({url})
+[RUNNING] `rustc --crate-name test src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C opt-level=1 \
+ -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]release[/]deps \
+ --extern foo={dir}[/]target[/]release[/]deps[/]\
+ {prefix}foo[..]{suffix} \
+ --extern foo={dir}[/]target[/]release[/]deps[/]libfoo.rlib`
+[FINISHED] release [optimized + debuginfo] target(s) in [..]
+",
+ dir = p.root().display(),
+ url = p.url(),
+ prefix = env::consts::DLL_PREFIX,
+ suffix = env::consts::DLL_SUFFIX)));
+}
+
+#[test]
+fn profile_in_non_root_manifest_triggers_a_warning() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["bar"]
+
+ [profile.dev]
+ debug = false
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = ".."
+
+ [profile.dev]
+ opt-level = 1
+ "#)
+ .file("bar/src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")).arg("-v"),
+ execs().with_status(0).with_stderr("\
+[WARNING] profiles for the non root package will be ignored, specify profiles at the workspace root:
+package: [..]
+workspace: [..]
+[COMPILING] bar v0.1.0 ([..])
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized] target(s) in [..]"));
+}
+
+#[test]
+fn profile_in_virtual_manifest_works() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["bar"]
+
+ [profile.dev]
+ opt-level = 1
+ debug = false
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = ".."
+ "#)
+ .file("bar/src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")).arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] bar v0.1.0 ([..])
+[RUNNING] `rustc [..]`
+[FINISHED] dev [optimized] target(s) in [..]"));
+}
--- /dev/null
+extern crate cargotest;
+extern crate flate2;
+extern crate hamcrest;
+extern crate tar;
+
+use std::io::prelude::*;
+use std::fs::File;
+use std::io::SeekFrom;
+
+use cargotest::ChannelChanger;
+use cargotest::support::git::repo;
+use cargotest::support::paths;
+use cargotest::support::{project, execs, publish};
+use flate2::read::GzDecoder;
+use hamcrest::assert_that;
+use tar::Archive;
+
+#[test]
+fn simple() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").arg("--no-verify")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{reg}`
+[WARNING] manifest has no documentation, [..]
+See [..]
+[PACKAGING] foo v0.0.1 ({dir})
+[UPLOADING] foo v0.0.1 ({dir})
+",
+ dir = p.url(),
+ reg = publish::registry())));
+
+ let mut f = File::open(&publish::upload_path().join("api/v1/crates/new")).unwrap();
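+ // The upload body is framed as a little-endian u32 length followed by the
+ // JSON metadata, then another u32 length followed by the crate tarball.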
+ // Skip the metadata payload and the size of the tarball
+ let mut sz = [0; 4];
+ assert_eq!(f.read(&mut sz).unwrap(), 4);
+ let sz = ((sz[0] as u32) << 0) |
+ ((sz[1] as u32) << 8) |
+ ((sz[2] as u32) << 16) |
+ ((sz[3] as u32) << 24);
+ f.seek(SeekFrom::Current(sz as i64 + 4)).unwrap();
+
+ // Verify the tarball
+ let mut rdr = GzDecoder::new(f).unwrap();
+ assert_eq!(rdr.header().filename().unwrap(), b"foo-0.0.1.crate");
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ for file in ar.entries().unwrap() {
+ let file = file.unwrap();
+ let fname = file.header().path_bytes();
+ let fname = &*fname;
+ assert!(fname == b"foo-0.0.1/Cargo.toml" ||
+ fname == b"foo-0.0.1/Cargo.toml.orig" ||
+ fname == b"foo-0.0.1/src/main.rs",
+ "unexpected filename: {:?}", file.header().path());
+ }
+}
+
+// TODO: deprecated; remove once it has been decided that `--host` can be
+// removed.
+#[test]
+fn simple_with_host() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").arg("--no-verify")
+ .arg("--host").arg(publish::registry().to_string()),
+ execs().with_status(0).with_stderr(&format!("\
+[WARNING] The flag '--host' is no longer valid.
+
+Previous versions of Cargo accepted this flag, but it is being
+deprecated. The flag is being renamed to 'index', as the flag
+wants the location of the index to which to publish. Please
+use '--index' instead.
+
+This will soon become a hard error, so it's either recommended
+to update to a fixed version or contact the upstream maintainer
+about this warning.
+[UPDATING] registry `{reg}`
+[WARNING] manifest has no documentation, [..]
+See [..]
+[PACKAGING] foo v0.0.1 ({dir})
+[UPLOADING] foo v0.0.1 ({dir})
+",
+ dir = p.url(),
+ reg = publish::registry())));
+
+ let mut f = File::open(&publish::upload_path().join("api/v1/crates/new")).unwrap();
+ // Skip the metadata payload and the size of the tarball
+ let mut sz = [0; 4];
+ assert_eq!(f.read(&mut sz).unwrap(), 4);
+ let sz = ((sz[0] as u32) << 0) |
+ ((sz[1] as u32) << 8) |
+ ((sz[2] as u32) << 16) |
+ ((sz[3] as u32) << 24);
+ f.seek(SeekFrom::Current(sz as i64 + 4)).unwrap();
+
+ // Verify the tarball
+ let mut rdr = GzDecoder::new(f).unwrap();
+ assert_eq!(rdr.header().filename().unwrap(), b"foo-0.0.1.crate");
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ for file in ar.entries().unwrap() {
+ let file = file.unwrap();
+ let fname = file.header().path_bytes();
+ let fname = &*fname;
+ assert!(fname == b"foo-0.0.1/Cargo.toml" ||
+ fname == b"foo-0.0.1/Cargo.toml.orig" ||
+ fname == b"foo-0.0.1/src/main.rs",
+ "unexpected filename: {:?}", file.header().path());
+ }
+}
+
+// TODO: deprecated; remove once it has been decided that `--host` can be
+// removed.
+#[test]
+fn simple_with_index_and_host() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").arg("--no-verify")
+ .arg("--index").arg(publish::registry().to_string())
+ .arg("--host").arg(publish::registry().to_string()),
+ execs().with_status(0).with_stderr(&format!("\
+[WARNING] The flag '--host' is no longer valid.
+
+Previous versions of Cargo accepted this flag, but it is being
+deprecated. The flag is being renamed to 'index', as the flag
+wants the location of the index to which to publish. Please
+use '--index' instead.
+
+This will soon become a hard error, so it's either recommended
+to update to a fixed version or contact the upstream maintainer
+about this warning.
+[UPDATING] registry `{reg}`
+[WARNING] manifest has no documentation, [..]
+See [..]
+[PACKAGING] foo v0.0.1 ({dir})
+[UPLOADING] foo v0.0.1 ({dir})
+",
+ dir = p.url(),
+ reg = publish::registry())));
+
+ let mut f = File::open(&publish::upload_path().join("api/v1/crates/new")).unwrap();
+ // Skip the metadata payload and the size of the tarball
+ let mut sz = [0; 4];
+ assert_eq!(f.read(&mut sz).unwrap(), 4);
+ let sz = ((sz[0] as u32) << 0) |
+ ((sz[1] as u32) << 8) |
+ ((sz[2] as u32) << 16) |
+ ((sz[3] as u32) << 24);
+ f.seek(SeekFrom::Current(sz as i64 + 4)).unwrap();
+
+ // Verify the tarball
+ let mut rdr = GzDecoder::new(f).unwrap();
+ assert_eq!(rdr.header().filename().unwrap(), b"foo-0.0.1.crate");
+ let mut contents = Vec::new();
+ rdr.read_to_end(&mut contents).unwrap();
+ let mut ar = Archive::new(&contents[..]);
+ for file in ar.entries().unwrap() {
+ let file = file.unwrap();
+ let fname = file.header().path_bytes();
+ let fname = &*fname;
+ assert!(fname == b"foo-0.0.1/Cargo.toml" ||
+ fname == b"foo-0.0.1/Cargo.toml.orig" ||
+ fname == b"foo-0.0.1/src/main.rs",
+ "unexpected filename: {:?}", file.header().path());
+ }
+}
+
+#[test]
+fn git_deps() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+
+ [dependencies.foo]
+ git = "git://path/to/nowhere"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").arg("-v").arg("--no-verify")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[ERROR] crates cannot be published to crates.io with dependencies sourced from \
+a repository\neither publish `foo` as its own crate on crates.io and \
+specify a crates.io version as a dependency or pull it into this \
+repository and specify it with a path and version\n\
+(crate `foo` has repository path `git://path/to/nowhere`)\
+"));
+}
+
+#[test]
+fn path_dependency_no_version() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("publish")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[ERROR] all path dependencies must have a version specified when publishing.
+dependency `bar` does not specify a version
+"));
+}
+
+#[test]
+fn unpublishable_crate() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ publish = false
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(101).with_stderr("\
+[ERROR] some crates cannot be published.
+`foo` is marked as unpublishable
+"));
+}
+
+#[test]
+fn dont_publish_dirty() {
+ publish::setup();
+ let p = project("foo")
+ .file("bar", "")
+ .build();
+
+ let _ = repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry `[..]`
+error: 1 files in the working directory contain changes that were not yet \
+committed into git:
+
+bar
+
+to proceed despite this, pass the `--allow-dirty` flag
+"));
+}
+
+#[test]
+fn publish_clean() {
+ publish::setup();
+
+ let p = project("foo").build();
+
+ let _ = repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(0));
+}
+
+#[test]
+fn publish_in_sub_repo() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("baz", "")
+ .build();
+
+ let _ = repo(&paths::root().join("foo"))
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ "#)
+ .file("bar/src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").cwd(p.root().join("bar"))
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(0));
+}
+
+#[test]
+fn publish_when_ignored() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("baz", "")
+ .build();
+
+ let _ = repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file(".gitignore", "baz")
+ .build();
+
+ assert_that(p.cargo("publish")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(0));
+}
+
+#[test]
+fn ignore_when_crate_ignored() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("bar/baz", "")
+ .build();
+
+ let _ = repo(&paths::root().join("foo"))
+ .file(".gitignore", "bar")
+ .nocommit_file("bar/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ "#)
+ .nocommit_file("bar/src/main.rs", "fn main() {}");
+ assert_that(p.cargo("publish").cwd(p.root().join("bar"))
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(0));
+}
+
+#[test]
+fn new_crate_rejected() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("baz", "")
+ .build();
+
+ let _ = repo(&paths::root().join("foo"))
+ .nocommit_file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ repository = "foo"
+ "#)
+ .nocommit_file("src/main.rs", "fn main() {}");
+ assert_that(p.cargo("publish")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(101));
+}
+
+#[test]
+fn dry_run() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").arg("--dry-run")
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[WARNING] manifest has no documentation, [..]
+See [..]
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[COMPILING] foo v0.0.1 [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[UPLOADING] foo v0.0.1 ({dir})
+[WARNING] aborting upload due to dry run
+",
+ dir = p.url())));
+
+ // Ensure the API request wasn't actually made
+ assert!(!publish::upload_path().join("api/v1/crates/new").exists());
+}
+
+#[test]
+fn block_publish_feature_not_enabled() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ publish = [
+ "test"
+ ]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(101).with_stderr("\
+error: failed to parse manifest at `[..]`
+
+Caused by:
+ the `publish` manifest key is unstable for anything other than a value of true or false
+
+Caused by:
+ feature `alternative-registries` is required
+
+consider adding `cargo-features = [\"alternative-registries\"]` to the manifest
+"));
+}
+
+#[test]
+fn registry_not_in_publish_list() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ publish = [
+ "test"
+ ]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(101).with_stderr("\
+[ERROR] some crates cannot be published.
+`foo` is marked as unpublishable
+"));
+}
+
+#[test]
+fn publish_empty_list() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ publish = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(101).with_stderr("\
+[ERROR] some crates cannot be published.
+`foo` is marked as unpublishable
+"));
+}
+
+#[test]
+fn publish_allowed_registry() {
+ publish::setup();
+
+ let p = project("foo").build();
+
+ let _ = repo(&paths::root().join("foo"))
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ documentation = "foo"
+ homepage = "foo"
+ publish = ["alternative"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--registry").arg("alternative").arg("-Zunstable-options"),
+ execs().with_status(0));
+}
+
+#[test]
+fn block_publish_no_registry() {
+ publish::setup();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ cargo-features = ["alternative-registries"]
+
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ publish = []
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("publish").masquerade_as_nightly_cargo()
+ .arg("--index").arg(publish::registry().to_string()),
+ execs().with_status(101).with_stderr("\
+[ERROR] some crates cannot be published.
+`foo` is marked as unpublishable
+"));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs, main_file, basic_bin_manifest};
+use hamcrest::assert_that;
+
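+// Expected JSON from `cargo read-manifest`; the `[..]` segments are
+// wildcards so the comparison is independent of the absolute project path.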
+static MANIFEST_OUTPUT: &'static str = r#"
+{
+ "name":"foo",
+ "version":"0.5.0",
+ "id":"foo[..]0.5.0[..](path+file://[..]/foo)",
+ "license": null,
+ "license_file": null,
+ "description": null,
+ "source":null,
+ "dependencies":[],
+ "targets":[{
+ "kind":["bin"],
+ "crate_types":["bin"],
+ "name":"foo",
+ "src_path":"[..][/]foo[/]src[/]foo.rs"
+ }],
+ "features":{},
+ "manifest_path":"[..]Cargo.toml"
+}"#;
+
+#[test]
+fn cargo_read_manifest_path_to_cargo_toml_relative() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("read-manifest")
+ .arg("--manifest-path").arg("foo/Cargo.toml")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(0)
+ .with_json(MANIFEST_OUTPUT));
+}
+
+#[test]
+fn cargo_read_manifest_path_to_cargo_toml_absolute() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("read-manifest")
+ .arg("--manifest-path").arg(p.root().join("Cargo.toml"))
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(0)
+ .with_json(MANIFEST_OUTPUT));
+}
+
+#[test]
+fn cargo_read_manifest_path_to_cargo_toml_parent_relative() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("read-manifest")
+ .arg("--manifest-path").arg("foo")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(101)
+ .with_stderr("[ERROR] the manifest-path must be \
+ a path to a Cargo.toml file"));
+}
+
+#[test]
+fn cargo_read_manifest_path_to_cargo_toml_parent_absolute() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("read-manifest")
+ .arg("--manifest-path").arg(p.root())
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(101)
+ .with_stderr("[ERROR] the manifest-path must be \
+ a path to a Cargo.toml file"));
+}
+
+#[test]
+fn cargo_read_manifest_cwd() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("read-manifest")
+ .cwd(p.root()),
+ execs().with_status(0)
+ .with_json(MANIFEST_OUTPUT));
+}
--- /dev/null
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+extern crate url;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::path::PathBuf;
+
+use cargotest::cargo_process;
+use cargotest::support::git;
+use cargotest::support::paths::{self, CargoPathExt};
+use cargotest::support::registry::{self, Package};
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+use url::Url;
+
+fn registry_path() -> PathBuf { paths::root().join("registry") }
+fn registry() -> Url { Url::from_file_path(&*registry_path()).ok().unwrap() }
+
+#[test]
+fn simple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = ">= 0.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{reg}`
+[DOWNLOADING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url(),
+ reg = registry::registry())));
+
+ assert_that(p.cargo("clean"), execs().with_status(0));
+
+ // Don't download a second time
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] bar v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+}
+
+#[test]
+fn deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = ">= 0.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").publish();
+ Package::new("bar", "0.0.1").dep("baz", "*").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{reg}`
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[COMPILING] baz v0.0.1
+[COMPILING] bar v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url(),
+ reg = registry::registry())));
+}
+
+#[test]
+fn nonexistent() {
+ Package::new("init", "0.0.1").publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ nonexistent = ">= 0.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[ERROR] no matching package named `nonexistent` found (required by `foo`)
+location searched: registry [..]
+version required: >= 0.0.0
+"));
+}
+
+#[test]
+fn wrong_version() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = ">= 1.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("foo", "0.0.1").publish();
+ Package::new("foo", "0.0.2").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr_contains("\
+[ERROR] no matching version `>= 1.0.0` found for package `foo` (required by `foo`)
+location searched: registry [..]
+versions found: 0.0.2, 0.0.1
+"));
+
+ Package::new("foo", "0.0.3").publish();
+ Package::new("foo", "0.0.4").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr_contains("\
+[ERROR] no matching version `>= 1.0.0` found for package `foo` (required by `foo`)
+location searched: registry [..]
+versions found: 0.0.4, 0.0.3, 0.0.2, ...
+"));
+}
+
+#[test]
+fn bad_cksum() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bad-cksum = ">= 0.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ let pkg = Package::new("bad-cksum", "0.0.1");
+ pkg.publish();
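+ // Truncate the published archive to zero bytes so its checksum no longer
+ // matches the one recorded in the registry index.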
+ t!(File::create(&pkg.archive_dst()));
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[UPDATING] registry [..]
+[DOWNLOADING] bad-cksum [..]
+[ERROR] unable to get packages from source
+
+Caused by:
+ failed to download replaced source registry `https://[..]`
+
+Caused by:
+ failed to verify the checksum of `bad-cksum v0.0.1 (registry `file://[..]`)`
+"));
+}
+
+#[test]
+fn update_registry() {
+ Package::new("init", "0.0.1").publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ notyet = ">= 0.0.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr_contains("\
+[ERROR] no matching package named `notyet` found (required by `foo`)
+location searched: registry [..]
+version required: >= 0.0.0
+"));
+
+ Package::new("notyet", "0.0.1").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `{reg}`
+[DOWNLOADING] notyet v0.0.1 (registry `file://[..]`)
+[COMPILING] notyet v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url(),
+ reg = registry::registry())));
+}
+
+#[test]
+fn package_with_path_deps() {
+ Package::new("init", "0.0.1").publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license = "MIT"
+ description = "foo"
+ repository = "bar"
+
+ [dependencies.notyet]
+ version = "0.0.1"
+ path = "notyet"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("notyet/Cargo.toml", r#"
+ [package]
+ name = "notyet"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("notyet/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("package").arg("-v"),
+ execs().with_status(101).with_stderr_contains("\
+[ERROR] failed to verify package tarball
+
+Caused by:
+ no matching package named `notyet` found (required by `foo`)
+location searched: registry [..]
+version required: ^0.0.1
+"));
+
+ Package::new("notyet", "0.0.1").publish();
+
+ assert_that(p.cargo("package"),
+ execs().with_status(0).with_stderr(format!("\
+[PACKAGING] foo v0.0.1 ({dir})
+[VERIFYING] foo v0.0.1 ({dir})
+[UPDATING] registry `[..]`
+[DOWNLOADING] notyet v0.0.1 (registry `file://[..]`)
+[COMPILING] notyet v0.0.1
+[COMPILING] foo v0.0.1 ({dir}[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+", dir = p.url())));
+}
+
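+// Once Cargo.lock exists, a plain `cargo build` must keep using the locked
+// version even after newer versions are published.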
+#[test]
+fn lockfile_locks() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+
+ p.root().move_into_the_past();
+ Package::new("bar", "0.0.2").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn lockfile_locks_transitively() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").publish();
+ Package::new("bar", "0.0.1").dep("baz", "*").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[COMPILING] baz v0.0.1
+[COMPILING] bar v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+
+ p.root().move_into_the_past();
+ Package::new("baz", "0.0.2").publish();
+ Package::new("bar", "0.0.2").dep("baz", "*").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+}
+
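+// Yanked versions are never selected when resolving fresh dependencies.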
+#[test]
+fn yanks_are_not_used() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").publish();
+ Package::new("baz", "0.0.2").yanked(true).publish();
+ Package::new("bar", "0.0.1").dep("baz", "*").publish();
+ Package::new("bar", "0.0.2").dep("baz", "*").yanked(true).publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[COMPILING] baz v0.0.1
+[COMPILING] bar v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+}
+
+#[test]
+fn relying_on_a_yank_is_bad() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").publish();
+ Package::new("baz", "0.0.2").yanked(true).publish();
+ Package::new("bar", "0.0.1").dep("baz", "=0.0.2").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101).with_stderr_contains("\
+[ERROR] no matching version `= 0.0.2` found for package `baz` (required by `bar`)
+location searched: registry [..]
+versions found: 0.0.1
+"));
+}
+
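+// A version that is already in Cargo.lock may still be used after it is
+// yanked; only a fresh resolution rejects it.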
+#[test]
+fn yanks_in_lockfiles_are_ok() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
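+ // The registry index stores three-letter crate names under `3/`; wipe
+ // bar's entry so it can be re-published below as a yanked version.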
+ registry::registry_path().join("3").rm_rf();
+
+ Package::new("bar", "0.0.1").yanked(true).publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+
+ assert_that(p.cargo("update"),
+ execs().with_status(101).with_stderr_contains("\
+[ERROR] no matching package named `bar` found (required by `foo`)
+location searched: registry [..]
+version required: *
+"));
+}
+
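+// If the downloaded packages are removed, a locked build re-downloads the
+// exact locked versions without changing Cargo.lock.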
+#[test]
+fn update_with_lockfile_if_packages_missing() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.0.1").publish();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ p.root().move_into_the_past();
+
+ paths::home().join(".cargo/registry").rm_rf();
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] bar v0.0.1 (registry `file://[..]`)
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+"));
+}
+
+#[test]
+fn update_lockfile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ println!("0.0.1");
+ Package::new("bar", "0.0.1").publish();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ Package::new("bar", "0.0.2").publish();
+ Package::new("bar", "0.0.3").publish();
+ paths::home().join(".cargo/registry").rm_rf();
+ println!("0.0.2 update");
+ assert_that(p.cargo("update")
+ .arg("-p").arg("bar").arg("--precise").arg("0.0.2"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] bar v0.0.1 -> v0.0.2
+"));
+
+ println!("0.0.2 build");
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[DOWNLOADING] [..] v0.0.2 (registry `file://[..]`)
+[COMPILING] bar v0.0.2
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+
+ println!("0.0.3 update");
+ assert_that(p.cargo("update")
+ .arg("-p").arg("bar"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] bar v0.0.2 -> v0.0.3
+"));
+
+ println!("0.0.3 build");
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[DOWNLOADING] [..] v0.0.3 (registry `file://[..]`)
+[COMPILING] bar v0.0.3
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+
+ println!("new dependencies update");
+ Package::new("bar", "0.0.4").dep("spam", "0.2.5").publish();
+ Package::new("spam", "0.2.5").publish();
+ assert_that(p.cargo("update")
+ .arg("-p").arg("bar"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] bar v0.0.3 -> v0.0.4
+[ADDING] spam v0.2.5
+"));
+
+ println!("new dependencies update");
+ Package::new("bar", "0.0.5").publish();
+ assert_that(p.cargo("update")
+ .arg("-p").arg("bar"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] bar v0.0.4 -> v0.0.5
+[REMOVING] spam v0.2.5
+"));
+}
+
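+// Dev-dependencies of a registry dependency are not downloaded or built
+// for a normal `cargo build`.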
+#[test]
+fn dev_dependency_not_used() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("baz", "0.0.1").publish();
+ Package::new("bar", "0.0.1").dev_dep("baz", "*").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] [..] v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+}
+
+#[test]
+fn login_with_no_cargo_dir() {
+ let home = paths::home().join("new-home");
+ t!(fs::create_dir(&home));
+ assert_that(cargo_process().arg("login").arg("foo").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn login_with_differently_sized_token() {
+ // Verify that the configuration file gets properly truncated.
+ let home = paths::home().join("new-home");
+ t!(fs::create_dir(&home));
+ assert_that(cargo_process().arg("login").arg("lmaolmaolmao").arg("-v"),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("login").arg("lmao").arg("-v"),
+ execs().with_status(0));
+ assert_that(cargo_process().arg("login").arg("lmaolmaolmao").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn bad_license_file() {
+ Package::new("foo", "1.0.0").publish();
+ let p = project("all")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ license-file = "foo"
+ description = "bar"
+ repository = "baz"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+ assert_that(p.cargo("publish")
+ .arg("-v")
+ .arg("--index").arg(registry().to_string()),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[ERROR] the license file `foo` does not exist"));
+}
+
+#[test]
+fn updating_a_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.a]
+ path = "a"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ Package::new("bar", "0.0.1").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] bar v0.0.1 (registry `file://[..]`)
+[COMPILING] bar v0.0.1
+[COMPILING] a v0.0.1 ({dir}/a)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+
+ t!(t!(File::create(&p.root().join("a/Cargo.toml"))).write_all(br#"
+ [project]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "0.1.0"
+ "#));
+ Package::new("bar", "0.1.0").publish();
+
+ println!("second");
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] bar v0.1.0 (registry `file://[..]`)
+[COMPILING] bar v0.1.0
+[COMPILING] a v0.0.1 ({dir}/a)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+}
+
+#[test]
+fn git_and_registry_dep() {
+ let b = git::repo(&paths::root().join("b"))
+ .file("Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = "0.0.1"
+
+ [dependencies.b]
+ git = '{}'
+ "#, b.url()))
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("a", "0.0.1").publish();
+
+ p.root().move_into_the_past();
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] [..]
+[UPDATING] [..]
+[DOWNLOADING] a v0.0.1 (registry `file://[..]`)
+[COMPILING] a v0.0.1
+[COMPILING] b v0.0.1 ([..])
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+ p.root().move_into_the_past();
+
+ println!("second");
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn update_publish_then_update() {
+ // First generate a Cargo.lock and a clone of the registry index at the
+ // "head" of the current registry.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ a = "0.1.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+ Package::new("a", "0.1.0").publish();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ // Next, publish a new package and back up the copy of the registry we just
+ // created.
+ Package::new("a", "0.1.1").publish();
+ let registry = paths::home().join(".cargo/registry");
+ let backup = paths::root().join("registry-backup");
+ t!(fs::rename(&registry, &backup));
+
+ // Generate a Cargo.lock with the newer version, and then move the old copy
+ // of the registry back into place.
+ let p2 = project("foo2")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ a = "0.1.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+ assert_that(p2.cargo("build"),
+ execs().with_status(0));
+ registry.rm_rf();
+ t!(fs::rename(&backup, &registry));
+ t!(fs::rename(p2.root().join("Cargo.lock"), p.root().join("Cargo.lock")));
+
+ // Finally, build the first project again (with our newer Cargo.lock) which
+ // should force an update of the old registry, download the new crate, and
+ // then build everything again.
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr(&format!("\
+[UPDATING] [..]
+[DOWNLOADING] a v0.1.1 (registry `file://[..]`)
+[COMPILING] a v0.1.1
+[COMPILING] foo v0.5.0 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+",
+ dir = p.url())));
+}
+
+#[test]
+fn fetch_downloads() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ a = "0.1.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("a", "0.1.0").publish();
+
+ assert_that(p.cargo("fetch"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] a v0.1.0 (registry [..])
+"));
+}
+
+#[test]
+fn update_transitive_dependency() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ a = "0.1.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("a", "0.1.0").dep("b", "*").publish();
+ Package::new("b", "0.1.0").publish();
+
+ assert_that(p.cargo("fetch"),
+ execs().with_status(0));
+
+ Package::new("b", "0.1.1").publish();
+
+ assert_that(p.cargo("update").arg("-pb"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] b v0.1.0 -> v0.1.1
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[DOWNLOADING] b v0.1.1 (registry `file://[..]`)
+[COMPILING] b v0.1.1
+[COMPILING] a v0.1.0
+[COMPILING] foo v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+"));
+}
+
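+// `cargo update -p hyper` must succeed even when resolution has to
+// backtrack over the candidate versions.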
+#[test]
+fn update_backtracking_ok() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ webdriver = "0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("webdriver", "0.1.0").dep("hyper", "0.6").publish();
+ Package::new("hyper", "0.6.5").dep("openssl", "0.1")
+ .dep("cookie", "0.1")
+ .publish();
+ Package::new("cookie", "0.1.0").dep("openssl", "0.1").publish();
+ Package::new("openssl", "0.1.0").publish();
+
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0));
+
+ Package::new("openssl", "0.1.1").publish();
+ Package::new("hyper", "0.6.6").dep("openssl", "0.1.1")
+ .dep("cookie", "0.1.0")
+ .publish();
+
+ assert_that(p.cargo("update").arg("-p").arg("hyper"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+"));
+}
+
+#[test]
+fn update_multiple_packages() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ a = "*"
+ b = "*"
+ c = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("a", "0.1.0").publish();
+ Package::new("b", "0.1.0").publish();
+ Package::new("c", "0.1.0").publish();
+
+ assert_that(p.cargo("fetch"),
+ execs().with_status(0));
+
+ Package::new("a", "0.1.1").publish();
+ Package::new("b", "0.1.1").publish();
+ Package::new("c", "0.1.1").publish();
+
+ assert_that(p.cargo("update").arg("-pa").arg("-pb"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] a v0.1.0 -> v0.1.1
+[UPDATING] b v0.1.0 -> v0.1.1
+"));
+
+ assert_that(p.cargo("update").arg("-pb").arg("-pc"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[UPDATING] c v0.1.0 -> v0.1.1
+"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[DOWNLOADING] a v0.1.1 (registry `file://[..]`)")
+ .with_stderr_contains("\
+[DOWNLOADING] b v0.1.1 (registry `file://[..]`)")
+ .with_stderr_contains("\
+[DOWNLOADING] c v0.1.1 (registry `file://[..]`)")
+ .with_stderr_contains("\
+[COMPILING] a v0.1.1")
+ .with_stderr_contains("\
+[COMPILING] b v0.1.1")
+ .with_stderr_contains("\
+[COMPILING] c v0.1.1")
+ .with_stderr_contains("\
+[COMPILING] foo v0.5.0 ([..])"));
+}
+
+#[test]
+fn bundled_crate_in_registry() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ bar = "0.1"
+ baz = "0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("bar", "0.1.0").publish();
+ Package::new("baz", "0.1.0")
+ .dep("bar", "0.1.0")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar", version = "0.1.0" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .publish();
+
+ assert_that(p.cargo("run"), execs().with_status(0));
+}
+
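+// Regression test: `cargo update -pfoobar` must not be confused by the
+// shared `foo` name prefix.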
+#[test]
+fn update_same_prefix_oh_my_how_was_this_a_bug() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "ugh"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("foobar", "0.2.0").publish();
+ Package::new("foo", "0.1.0")
+ .dep("foobar", "0.2.0")
+ .publish();
+
+ assert_that(p.cargo("generate-lockfile"), execs().with_status(0));
+ assert_that(p.cargo("update").arg("-pfoobar").arg("--precise=0.2.0"),
+ execs().with_status(0));
+}
+
+#[test]
+fn use_semver() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ foo = "1.2.3-alpha.0"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("foo", "1.2.3-alpha.0").publish();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
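+// Only dependencies relevant to this build are downloaded: `foo` is behind
+// a non-matching target and `bar` is a dev-dependency, so just `baz` is
+// fetched.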
+#[test]
+fn only_download_relevant() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [target.foo.dependencies]
+ foo = "*"
+ [dev-dependencies]
+ bar = "*"
+ [dependencies]
+ baz = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("foo", "0.1.0").publish();
+ Package::new("bar", "0.1.0").publish();
+ Package::new("baz", "0.1.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0).with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] baz v0.1.0 ([..])
+[COMPILING] baz v0.1.0
+[COMPILING] bar v0.5.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..] secs
+"));
+}
+
+#[test]
+fn resolve_and_backtracking() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("foo", "0.1.1")
+ .feature_dep("bar", "0.1", &["a", "b"])
+ .publish();
+ Package::new("foo", "0.1.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn upstream_warnings_on_extra_verbose() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("foo", "0.1.0")
+ .file("src/lib.rs", "fn unused() {}")
+ .publish();
+
+ assert_that(p.cargo("build").arg("-vv"),
+ execs().with_status(0).with_stderr_contains("\
+[..]warning: function is never used[..]
+"));
+}
+
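+// With `--frozen`, Cargo must error out instead of touching the network to
+// resolve a missing dependency.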
+#[test]
+fn disallow_network() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--frozen"),
+ execs().with_status(101).with_stderr("\
+error: failed to load source for a dependency on `foo`
+
+Caused by:
+ Unable to update registry [..]
+
+Caused by:
+ attempting to make an HTTP request, but --frozen was specified
+"));
+}
+
+#[test]
+fn add_dep_dont_update_registry() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ baz = { path = "baz" }
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ remote = "0.3"
+ "#)
+ .file("baz/src/lib.rs", "")
+ .build();
+
+ Package::new("remote", "0.3.4").publish();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ t!(t!(File::create(p.root().join("Cargo.toml"))).write_all(br#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ baz = { path = "baz" }
+ remote = "0.3"
+ "#));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] bar v0.5.0 ([..])
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn bump_version_dont_update_registry() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ baz = { path = "baz" }
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ remote = "0.3"
+ "#)
+ .file("baz/src/lib.rs", "")
+ .build();
+
+ Package::new("remote", "0.3.4").publish();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+
+ t!(t!(File::create(p.root().join("Cargo.toml"))).write_all(br#"
+ [project]
+ name = "bar"
+ version = "0.6.0"
+ authors = []
+
+ [dependencies]
+ baz = { path = "baz" }
+ "#));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] bar v0.6.0 ([..])
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn old_version_req() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ remote = "0.2*"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("remote", "0.2.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+warning: parsed version requirement `0.2*` is no longer valid
+
+Previous versions of Cargo accepted this malformed requirement,
+but it is being deprecated. This was found when parsing the manifest
+of bar 0.5.0, and the correct version requirement is `0.2.*`.
+
+This will soon become a hard error, so it's either recommended to
+update to a fixed version or contact the upstream maintainer about
+this warning.
+
+warning: parsed version requirement `0.2*` is no longer valid
+
+Previous versions of Cargo accepted this malformed requirement,
+but it is being deprecated. This was found when parsing the manifest
+of bar 0.5.0, and the correct version requirement is `0.2.*`.
+
+This will soon become a hard error, so it's either recommended to
+update to a fixed version or contact the upstream maintainer about
+this warning.
+
+[UPDATING] [..]
+[DOWNLOADING] [..]
+[COMPILING] [..]
+[COMPILING] [..]
+[FINISHED] [..]
+"));
+}
+
+#[test]
+fn old_version_req_upstream() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ remote = "0.3"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ Package::new("remote", "0.3.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "remote"
+ version = "0.3.0"
+ authors = []
+
+ [dependencies]
+ bar = "0.2*"
+ "#)
+ .file("src/lib.rs", "")
+ .publish();
+ Package::new("bar", "0.2.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] [..]
+[DOWNLOADING] [..]
+warning: parsed version requirement `0.2*` is no longer valid
+
+Previous versions of Cargo accepted this malformed requirement,
+but it is being deprecated. This was found when parsing the manifest
+of remote 0.3.0, and the correct version requirement is `0.2.*`.
+
+This will soon become a hard error, so it's either recommended to
+update to a fixed version or contact the upstream maintainer about
+this warning.
+
+[COMPILING] [..]
+[COMPILING] [..]
+[FINISHED] [..]
+"));
+}
+
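+// The dependency requirement recorded in the registry index (`foo = 0.2.0`)
+// takes precedence over the stale one in the packaged Cargo.toml.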
+#[test]
+fn toml_lies_but_index_is_truth() {
+ Package::new("foo", "0.2.0").publish();
+ Package::new("bar", "0.3.0")
+ .dep("foo", "0.2.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.3.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.1.0"
+ "#)
+ .file("src/lib.rs", "extern crate foo;")
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ bar = "0.3"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn vv_prints_warnings() {
+ Package::new("foo", "0.2.0")
+ .file("src/lib.rs", r#"
+ #![deny(warnings)]
+
+ fn foo() {} // unused function
+ "#)
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "fo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.2"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-vv"),
+ execs().with_status(0));
+}
+
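+// Archives containing files outside their own `name-version` directory are
+// rejected when unpacked.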
+#[test]
+fn bad_and_or_malicious_packages_rejected() {
+ Package::new("foo", "0.2.0")
+ .extra_file("foo-0.1.0/src/lib.rs", "")
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "fo"
+ version = "0.5.0"
+ authors = []
+
+ [dependencies]
+ foo = "0.2"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-vv"),
+ execs().with_status(101)
+ .with_stderr("\
+[UPDATING] [..]
+[DOWNLOADING] [..]
+error: unable to get packages from source
+
+Caused by:
+ failed to download [..]
+
+Caused by:
+ failed to unpack [..]
+
+Caused by:
+ [..] contains a file at \"foo-0.1.0/src/lib.rs\" which isn't under \"foo-0.2.0\"
+"));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::is_nightly;
+use cargotest::install::{cargo_home, has_installed_exe};
+use cargotest::support::{project, execs};
+use hamcrest::{assert_that, existing_file, not};
+
+#[test]
+fn build_bin_default_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a"]
+ a = []
+
+ [[bin]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("src/main.rs", r#"
+ extern crate foo;
+
+ #[cfg(feature = "a")]
+ fn test() {
+ foo::foo();
+ }
+
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#"
+ #[cfg(feature = "a")]
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(p.cargo("build").arg("--no-default-features"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("build").arg("--bin=foo"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(p.cargo("build").arg("--bin=foo").arg("--no-default-features"),
+ execs().with_status(101).with_stderr("\
+error: target `foo` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+"));
+}
+
+#[test]
+fn build_bin_arg_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+
+ [[bin]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--features").arg("a"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+fn build_bin_multiple_required_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a", "b"]
+ a = []
+ b = ["a"]
+ c = []
+
+ [[bin]]
+ name = "foo_1"
+ path = "src/foo_1.rs"
+ required-features = ["b", "c"]
+
+ [[bin]]
+ name = "foo_2"
+ path = "src/foo_2.rs"
+ required-features = ["a"]
+ "#)
+ .file("src/foo_1.rs", "fn main() {}")
+ .file("src/foo_2.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ assert_that(&p.bin("foo_1"), not(existing_file()));
+ assert_that(&p.bin("foo_2"), existing_file());
+
+ assert_that(p.cargo("build").arg("--features").arg("c"),
+ execs().with_status(0));
+
+ assert_that(&p.bin("foo_1"), existing_file());
+ assert_that(&p.bin("foo_2"), existing_file());
+
+ assert_that(p.cargo("build").arg("--no-default-features"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_example_default_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a"]
+ a = []
+
+ [[example]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("examples/foo.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=foo"),
+ execs().with_status(0));
+ assert_that(&p.bin("examples/foo"), existing_file());
+
+ assert_that(p.cargo("build").arg("--example=foo").arg("--no-default-features"),
+ execs().with_status(101).with_stderr("\
+error: target `foo` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+"));
+}
+
+#[test]
+fn build_example_arg_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+
+ [[example]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("examples/foo.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=foo").arg("--features").arg("a"),
+ execs().with_status(0));
+ assert_that(&p.bin("examples/foo"), existing_file());
+}
+
+#[test]
+fn build_example_multiple_required_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a", "b"]
+ a = []
+ b = ["a"]
+ c = []
+
+ [[example]]
+ name = "foo_1"
+ required-features = ["b", "c"]
+
+ [[example]]
+ name = "foo_2"
+ required-features = ["a"]
+ "#)
+ .file("examples/foo_1.rs", "fn main() {}")
+ .file("examples/foo_2.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("--example=foo_1"),
+ execs().with_status(101).with_stderr("\
+error: target `foo_1` requires the features: `b`, `c`
+Consider enabling them by passing e.g. `--features=\"b c\"`
+"));
+ assert_that(p.cargo("build").arg("--example=foo_2"),
+ execs().with_status(0));
+
+ assert_that(&p.bin("examples/foo_1"), not(existing_file()));
+ assert_that(&p.bin("examples/foo_2"), existing_file());
+
+ assert_that(p.cargo("build").arg("--example=foo_1")
+ .arg("--features").arg("c"),
+ execs().with_status(0));
+ assert_that(p.cargo("build").arg("--example=foo_2")
+ .arg("--features").arg("c"),
+ execs().with_status(0));
+
+ assert_that(&p.bin("examples/foo_1"), existing_file());
+ assert_that(&p.bin("examples/foo_2"), existing_file());
+
+ assert_that(p.cargo("build").arg("--example=foo_1")
+ .arg("--no-default-features"),
+ execs().with_status(101).with_stderr("\
+error: target `foo_1` requires the features: `b`, `c`
+Consider enabling them by passing e.g. `--features=\"b c\"`
+"));
+ assert_that(p.cargo("build").arg("--example=foo_2")
+ .arg("--no-default-features"),
+ execs().with_status(101).with_stderr("\
+error: target `foo_2` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+"));
+}
+
+#[test]
+fn test_default_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a"]
+ a = []
+
+ [[test]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("tests/foo.rs", "#[test]\nfn test() {}")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test test ... ok"));
+
+ assert_that(p.cargo("test").arg("--no-default-features"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]"))
+ .with_stdout(""));
+
+ assert_that(p.cargo("test").arg("--test=foo"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]"))
+ .with_stdout_contains("test test ... ok"));
+
+ assert_that(p.cargo("test").arg("--test=foo").arg("--no-default-features"),
+ execs().with_status(101).with_stderr("\
+error: target `foo` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+"));
+}
+
+#[test]
+fn test_arg_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+
+ [[test]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("tests/foo.rs", "#[test]\nfn test() {}")
+ .build();
+
+ assert_that(p.cargo("test").arg("--features").arg("a"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test test ... ok"));
+}
+
+#[test]
+fn test_multiple_required_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a", "b"]
+ a = []
+ b = ["a"]
+ c = []
+
+ [[test]]
+ name = "foo_1"
+ required-features = ["b", "c"]
+
+ [[test]]
+ name = "foo_2"
+ required-features = ["a"]
+ "#)
+ .file("tests/foo_1.rs", "#[test]\nfn test() {}")
+ .file("tests/foo_2.rs", "#[test]\nfn test() {}")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo_2-[..][EXE]", p.url()))
+ .with_stdout_contains("test test ... ok"));
+
+ assert_that(p.cargo("test").arg("--features").arg("c"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo_1-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]foo_2-[..][EXE]", p.url()))
+ .with_stdout_contains_n("test test ... ok", 2));
+
+ assert_that(p.cargo("test").arg("--no-default-features"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]"))
+ .with_stdout(""));
+}
+
+#[test]
+fn bench_default_features() {
+ if !is_nightly() {
+ return;
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a"]
+ a = []
+
+ [[bench]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ #[bench]
+ fn bench(_: &mut test::Bencher) {
+ }"#)
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test bench ... bench: [..]"));
+
+ assert_that(p.cargo("bench").arg("--no-default-features"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] release [optimized] target(s) in [..]"))
+ .with_stdout(""));
+
+ assert_that(p.cargo("bench").arg("--bench=foo"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]"))
+ .with_stdout_contains("test bench ... bench: [..]"));
+
+ assert_that(p.cargo("bench").arg("--bench=foo").arg("--no-default-features"),
+ execs().with_status(101).with_stderr("\
+error: target `foo` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+"));
+}
+
+#[test]
+fn bench_arg_features() {
+ if !is_nightly() {
+ return;
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+
+ [[bench]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ #[bench]
+ fn bench(_: &mut test::Bencher) {
+ }"#)
+ .build();
+
+ assert_that(p.cargo("bench").arg("--features").arg("a"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test bench ... bench: [..]"));
+}
+
+#[test]
+fn bench_multiple_required_features() {
+ if !is_nightly() {
+ return;
+ }
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a", "b"]
+ a = []
+ b = ["a"]
+ c = []
+
+ [[bench]]
+ name = "foo_1"
+ required-features = ["b", "c"]
+
+ [[bench]]
+ name = "foo_2"
+ required-features = ["a"]
+ "#)
+ .file("benches/foo_1.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ #[bench]
+ fn bench(_: &mut test::Bencher) {
+ }"#)
+ .file("benches/foo_2.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ #[bench]
+ fn bench(_: &mut test::Bencher) {
+ }"#)
+ .build();
+
+ assert_that(p.cargo("bench"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo_2-[..][EXE]", p.url()))
+ .with_stdout_contains("test bench ... bench: [..]"));
+
+ assert_that(p.cargo("bench").arg("--features").arg("c"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo_1-[..][EXE]
+[RUNNING] target[/]release[/]deps[/]foo_2-[..][EXE]", p.url()))
+ .with_stdout_contains_n("test bench ... bench: [..]", 2));
+
+ assert_that(p.cargo("bench").arg("--no-default-features"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] release [optimized] target(s) in [..]"))
+ .with_stdout(""));
+}
+
+#[test]
+fn install_default_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a"]
+ a = []
+
+ [[bin]]
+ name = "foo"
+ required-features = ["a"]
+
+ [[example]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("examples/foo.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("install"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("install").arg("--no-default-features"),
+ execs().with_status(101).with_stderr(format!("\
+[INSTALLING] foo v0.0.1 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+[ERROR] no binaries are available for install using the selected features
+")));
+ assert_that(cargo_home(), not(has_installed_exe("foo")));
+
+ assert_that(p.cargo("install").arg("--bin=foo"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("install").arg("--bin=foo").arg("--no-default-features"),
+ execs().with_status(101).with_stderr(format!("\
+[INSTALLING] foo v0.0.1 ([..])
+[ERROR] failed to compile `foo v0.0.1 ([..])`, intermediate artifacts can be found at \
+ `[..]target`
+
+Caused by:
+ target `foo` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+")));
+ assert_that(cargo_home(), not(has_installed_exe("foo")));
+
+ assert_that(p.cargo("install").arg("--example=foo"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("install").arg("--example=foo").arg("--no-default-features"),
+ execs().with_status(101).with_stderr(format!("\
+[INSTALLING] foo v0.0.1 ([..])
+[ERROR] failed to compile `foo v0.0.1 ([..])`, intermediate artifacts can be found at \
+ `[..]target`
+
+Caused by:
+ target `foo` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+")));
+ assert_that(cargo_home(), not(has_installed_exe("foo")));
+}
+
+#[test]
+fn install_arg_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+
+ [[bin]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("install").arg("--features").arg("a"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn install_multiple_required_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a", "b"]
+ a = []
+ b = ["a"]
+ c = []
+
+ [[bin]]
+ name = "foo_1"
+ path = "src/foo_1.rs"
+ required-features = ["b", "c"]
+
+ [[bin]]
+ name = "foo_2"
+ path = "src/foo_2.rs"
+ required-features = ["a"]
+ "#)
+ .file("src/foo_1.rs", "fn main() {}")
+ .file("src/foo_2.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("install"),
+ execs().with_status(0));
+ assert_that(cargo_home(), not(has_installed_exe("foo_1")));
+ assert_that(cargo_home(), has_installed_exe("foo_2"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("install").arg("--features").arg("c"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo_1"));
+ assert_that(cargo_home(), has_installed_exe("foo_2"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("install").arg("--no-default-features"),
+ execs().with_status(101).with_stderr("\
+[INSTALLING] foo v0.0.1 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+[ERROR] no binaries are available for install using the selected features
+"));
+ assert_that(cargo_home(), not(has_installed_exe("foo_1")));
+ assert_that(cargo_home(), not(has_installed_exe("foo_2")));
+}
+
+#[test]
+fn dep_feature_in_toml() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar", features = ["a"] }
+
+ [[bin]]
+ name = "foo"
+ required-features = ["bar/a"]
+
+ [[example]]
+ name = "foo"
+ required-features = ["bar/a"]
+
+ [[test]]
+ name = "foo"
+ required-features = ["bar/a"]
+
+ [[bench]]
+ name = "foo"
+ required-features = ["bar/a"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("examples/foo.rs", "fn main() {}")
+ .file("tests/foo.rs", "#[test]\nfn test() {}")
+ .file("benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ #[bench]
+ fn bench(_: &mut test::Bencher) {
+ }"#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ // bin
+ assert_that(p.cargo("build").arg("--bin=foo"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ // example
+ assert_that(p.cargo("build").arg("--example=foo"),
+ execs().with_status(0));
+ assert_that(&p.bin("examples/foo"), existing_file());
+
+ // test
+ assert_that(p.cargo("test").arg("--test=foo"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test test ... ok"));
+
+ // bench
+ if is_nightly() {
+ assert_that(p.cargo("bench").arg("--bench=foo"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] bar v0.0.1 ({0}/bar)
+[COMPILING] foo v0.0.1 ({0})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test bench ... bench: [..]"));
+ }
+
+ // install
+ assert_that(p.cargo("install"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn dep_feature_in_cmd_line() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [[bin]]
+ name = "foo"
+ required-features = ["bar/a"]
+
+ [[example]]
+ name = "foo"
+ required-features = ["bar/a"]
+
+ [[test]]
+ name = "foo"
+ required-features = ["bar/a"]
+
+ [[bench]]
+ name = "foo"
+ required-features = ["bar/a"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("examples/foo.rs", "fn main() {}")
+ .file("tests/foo.rs", "#[test]\nfn test() {}")
+ .file("benches/foo.rs", r#"
+ #![feature(test)]
+ extern crate test;
+
+ #[bench]
+ fn bench(_: &mut test::Bencher) {
+ }"#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ // bin
+ assert_that(p.cargo("build").arg("--bin=foo"),
+ execs().with_status(101).with_stderr("\
+error: target `foo` requires the features: `bar/a`
+Consider enabling them by passing e.g. `--features=\"bar/a\"`
+"));
+
+ assert_that(p.cargo("build").arg("--bin=foo").arg("--features").arg("bar/a"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ // example
+ assert_that(p.cargo("build").arg("--example=foo"),
+ execs().with_status(101).with_stderr("\
+error: target `foo` requires the features: `bar/a`
+Consider enabling them by passing e.g. `--features=\"bar/a\"`
+"));
+
+ assert_that(p.cargo("build").arg("--example=foo").arg("--features").arg("bar/a"),
+ execs().with_status(0));
+ assert_that(&p.bin("examples/foo"), existing_file());
+
+ // test
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]"))
+ .with_stdout(""));
+
+ assert_that(p.cargo("test").arg("--test=foo").arg("--features").arg("bar/a"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test test ... ok"));
+
+ // bench
+ if is_nightly() {
+ assert_that(p.cargo("bench"),
+ execs().with_status(0).with_stderr(format!("\
+[FINISHED] release [optimized] target(s) in [..]"))
+ .with_stdout(""));
+
+ assert_that(p.cargo("bench").arg("--bench=foo").arg("--features").arg("bar/a"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] bar v0.0.1 ({0}/bar)
+[COMPILING] foo v0.0.1 ({0})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test bench ... bench: [..]"));
+ }
+
+ // install
+ assert_that(p.cargo("install"),
+ execs().with_status(101).with_stderr(format!("\
+[INSTALLING] foo v0.0.1 ([..])
+[FINISHED] release [optimized] target(s) in [..]
+[ERROR] no binaries are available for install using the selected features
+")));
+ assert_that(cargo_home(), not(has_installed_exe("foo")));
+
+ assert_that(p.cargo("install").arg("--features").arg("bar/a"),
+ execs().with_status(0));
+ assert_that(cargo_home(), has_installed_exe("foo"));
+ assert_that(p.cargo("uninstall").arg("foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn test_skips_compiling_bin_with_missing_required_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ a = []
+
+ [[bin]]
+ name = "bin_foo"
+ path = "src/bin/foo.rs"
+ required-features = ["a"]
+ "#)
+ .file("src/bin/foo.rs", "extern crate bar; fn main() {}")
+ .file("tests/foo.rs", "")
+ .file("benches/foo.rs", "")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("running 0 tests"));
+
+ assert_that(p.cargo("test").arg("--features").arg("a").arg("-j").arg("1"),
+ execs().with_status(101).with_stderr_contains(format!("\
+[COMPILING] foo v0.0.1 ({})
+error[E0463]: can't find crate for `bar`", p.url())));
+
+ if is_nightly() {
+ assert_that(p.cargo("bench"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] target[/]release[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("running 0 tests"));
+
+ assert_that(p.cargo("bench").arg("--features").arg("a").arg("-j").arg("1"),
+ execs().with_status(101).with_stderr_contains(format!("\
+[COMPILING] foo v0.0.1 ({})
+error[E0463]: can't find crate for `bar`", p.url())));
+ }
+}
+
+#[test]
+fn run_default() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = []
+ a = []
+
+ [[bin]]
+ name = "foo"
+ required-features = ["a"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "extern crate foo; fn main() {}")
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(101).with_stderr("\
+error: target `foo` requires the features: `a`
+Consider enabling them by passing e.g. `--features=\"a\"`
+"));
+
+ assert_that(p.cargo("run").arg("--features").arg("a"),
+ execs().with_status(0));
+}
+
+#[test]
+fn run_default_multiple_required_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [features]
+ default = ["a"]
+ a = []
+ b = []
+
+ [[bin]]
+ name = "foo1"
+ path = "src/foo1.rs"
+ required-features = ["a"]
+
+ [[bin]]
+ name = "foo2"
+ path = "src/foo2.rs"
+ required-features = ["b"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/foo1.rs", "extern crate foo; fn main() {}")
+ .file("src/foo2.rs", "extern crate foo; fn main() {}")
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(101).with_stderr("\
+error: `cargo run` requires that a project only have one executable; \
+use the `--bin` option to specify which one to run\navailable binaries: foo1, foo2"));
+}
--- /dev/null
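+//! Unit tests for Cargo's dependency resolver, run against a small
+//! in-memory registry of `Summary` values.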
+#![deny(warnings)]
+
+extern crate hamcrest;
+extern crate cargo;
+
+use std::collections::BTreeMap;
+
+use hamcrest::{assert_that, equal_to, contains, not};
+
+use cargo::core::source::{SourceId, GitReference};
+use cargo::core::dependency::Kind::{self, Development};
+use cargo::core::{Dependency, PackageId, Summary, Registry};
+use cargo::util::{CargoResult, ToUrl};
+use cargo::core::resolver::{self, Method};
+
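+// Test helper: resolves `deps` for `pkg` against a slice of summaries
+// acting as the registry, returning the resolved set of package ids.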
+fn resolve(pkg: &PackageId, deps: Vec<Dependency>, registry: &[Summary])
+ -> CargoResult<Vec<PackageId>>
+{
+ struct MyRegistry<'a>(&'a [Summary]);
+ impl<'a> Registry for MyRegistry<'a> {
+ fn query(&mut self,
+ dep: &Dependency,
+ f: &mut FnMut(Summary)) -> CargoResult<()> {
+ for summary in self.0.iter() {
+ if dep.matches(summary) {
+ f(summary.clone());
+ }
+ }
+ Ok(())
+ }
+ fn supports_checksums(&self) -> bool { false }
+ fn requires_precise(&self) -> bool { false }
+ }
+ let mut registry = MyRegistry(registry);
+ let summary = Summary::new(pkg.clone(), deps, BTreeMap::new()).unwrap();
+ let method = Method::Everything;
+ let resolve = resolver::resolve(&[(summary, method)], &[], &mut registry, None, false)?;
+ let res = resolve.iter().cloned().collect();
+ Ok(res)
+}
+
+trait ToDep {
+ fn to_dep(self) -> Dependency;
+}
+
+impl ToDep for &'static str {
+ fn to_dep(self) -> Dependency {
+ let url = "http://example.com".to_url().unwrap();
+ let source_id = SourceId::for_registry(&url).unwrap();
+ Dependency::parse_no_deprecated(self, Some("1.0.0"), &source_id).unwrap()
+ }
+}
+
+impl ToDep for Dependency {
+ fn to_dep(self) -> Dependency {
+ self
+ }
+}
+
+trait ToPkgId {
+ fn to_pkgid(&self) -> PackageId;
+}
+
+impl ToPkgId for &'static str {
+ fn to_pkgid(&self) -> PackageId {
+ PackageId::new(*self, "1.0.0", &registry_loc()).unwrap()
+ }
+}
+
+impl ToPkgId for (&'static str, &'static str) {
+ fn to_pkgid(&self) -> PackageId {
+ let (name, vers) = *self;
+ PackageId::new(name, vers, &registry_loc()).unwrap()
+ }
+}
+
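+// `pkg!` builds a `Summary` for the registry: `pkg!("foo")` has no
+// dependencies, while `pkg!(("foo", "1.0.1") => [dep("bar")])` takes an
+// explicit version and a dependency list. Bare names default to "1.0.0".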
+macro_rules! pkg {
+ ($pkgid:expr => [$($deps:expr),+]) => ({
+ let d: Vec<Dependency> = vec![$($deps.to_dep()),+];
+
+ Summary::new($pkgid.to_pkgid(), d, BTreeMap::new()).unwrap()
+ });
+
+ ($pkgid:expr) => (
+ Summary::new($pkgid.to_pkgid(), Vec::new(), BTreeMap::new()).unwrap()
+ )
+}
+
+fn registry_loc() -> SourceId {
+ let remote = "http://example.com".to_url().unwrap();
+ SourceId::for_registry(&remote).unwrap()
+}
+
+fn pkg(name: &str) -> Summary {
+ Summary::new(pkg_id(name), Vec::new(), BTreeMap::new()).unwrap()
+}
+
+fn pkg_id(name: &str) -> PackageId {
+ PackageId::new(name, "1.0.0", ®istry_loc()).unwrap()
+}
+
+fn pkg_id_loc(name: &str, loc: &str) -> PackageId {
+ let remote = loc.to_url();
+ let master = GitReference::Branch("master".to_string());
+ let source_id = SourceId::for_git(&remote.unwrap(), master).unwrap();
+
+ PackageId::new(name, "1.0.0", &source_id).unwrap()
+}
+
+fn pkg_loc(name: &str, loc: &str) -> Summary {
+ Summary::new(pkg_id_loc(name, loc), Vec::new(), BTreeMap::new()).unwrap()
+}
+
+fn dep(name: &str) -> Dependency { dep_req(name, "1.0.0") }
+fn dep_req(name: &str, req: &str) -> Dependency {
+ let url = "http://example.com".to_url().unwrap();
+ let source_id = SourceId::for_registry(&url).unwrap();
+ Dependency::parse_no_deprecated(name, Some(req), &source_id).unwrap()
+}
+
+fn dep_loc(name: &str, location: &str) -> Dependency {
+ let url = location.to_url().unwrap();
+ let master = GitReference::Branch("master".to_string());
+ let source_id = SourceId::for_git(&url, master).unwrap();
+ Dependency::parse_no_deprecated(name, Some("1.0.0"), &source_id).unwrap()
+}
+fn dep_kind(name: &str, kind: Kind) -> Dependency {
+ dep(name).set_kind(kind).clone()
+}
+
+fn registry(pkgs: Vec<Summary>) -> Vec<Summary> {
+ pkgs
+}
+
+fn names<P: ToPkgId>(names: &[P]) -> Vec<PackageId> {
+ names.iter().map(|name| name.to_pkgid()).collect()
+}
+
+fn loc_names(names: &[(&'static str, &'static str)]) -> Vec<PackageId> {
+ names.iter()
+ .map(|&(name, loc)| pkg_id_loc(name, loc)).collect()
+}
+
+#[test]
+fn test_resolving_empty_dependency_list() {
+ let res = resolve(&pkg_id("root"), Vec::new(),
+ &registry(vec![])).unwrap();
+
+ assert_that(&res, equal_to(&names(&["root"])));
+}
+
+#[test]
+fn test_resolving_only_package() {
+ let reg = registry(vec![pkg("foo")]);
+ let res = resolve(&pkg_id("root"), vec![dep("foo")], &reg);
+
+ assert_that(&res.unwrap(), contains(names(&["root", "foo"])).exactly());
+}
+
+#[test]
+fn test_resolving_one_dep() {
+ let reg = registry(vec![pkg("foo"), pkg("bar")]);
+ let res = resolve(&pkg_id("root"), vec![dep("foo")], &reg);
+
+ assert_that(&res.unwrap(), contains(names(&["root", "foo"])).exactly());
+}
+
+#[test]
+fn test_resolving_multiple_deps() {
+ let reg = registry(vec![pkg!("foo"), pkg!("bar"), pkg!("baz")]);
+ let res = resolve(&pkg_id("root"), vec![dep("foo"), dep("baz")],
+ &reg).unwrap();
+
+ assert_that(&res, contains(names(&["root", "foo", "baz"])).exactly());
+}
+
+#[test]
+fn test_resolving_transitive_deps() {
+ let reg = registry(vec![pkg!("foo"), pkg!("bar" => ["foo"])]);
+ let res = resolve(&pkg_id("root"), vec![dep("bar")], &reg).unwrap();
+
+ assert_that(&res, contains(names(&["root", "foo", "bar"])));
+}
+
+#[test]
+fn test_resolving_common_transitive_deps() {
+ let reg = registry(vec![pkg!("foo" => ["bar"]), pkg!("bar")]);
+ let res = resolve(&pkg_id("root"), vec![dep("foo"), dep("bar")],
+ &reg).unwrap();
+
+ assert_that(&res, contains(names(&["root", "foo", "bar"])));
+}
+
+#[test]
+fn test_resolving_with_same_name() {
+ let list = vec![pkg_loc("foo", "http://first.example.com"),
+ pkg_loc("bar", "http://second.example.com")];
+
+ let reg = registry(list);
+ let res = resolve(&pkg_id("root"),
+ vec![dep_loc("foo", "http://first.example.com"),
+ dep_loc("bar", "http://second.example.com")],
+ &reg);
+
+ let mut names = loc_names(&[("foo", "http://first.example.com"),
+ ("bar", "http://second.example.com")]);
+
+ names.push(pkg_id("root"));
+
+ assert_that(&res.unwrap(), contains(names).exactly());
+}
+
+#[test]
+fn test_resolving_with_dev_deps() {
+ let reg = registry(vec![
+ pkg!("foo" => ["bar", dep_kind("baz", Development)]),
+ pkg!("baz" => ["bat", dep_kind("bam", Development)]),
+ pkg!("bar"),
+ pkg!("bat")
+ ]);
+
+ let res = resolve(&pkg_id("root"),
+ vec![dep("foo"), dep_kind("baz", Development)],
+ &reg).unwrap();
+
+ assert_that(&res, contains(names(&["root", "foo", "bar", "baz"])));
+}
+
+#[test]
+fn resolving_with_many_versions() {
+ let reg = registry(vec![
+ pkg!(("foo", "1.0.1")),
+ pkg!(("foo", "1.0.2")),
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![dep("foo")], &reg).unwrap();
+
+ assert_that(&res, contains(names(&[("root", "1.0.0"),
+ ("foo", "1.0.2")])));
+}
+
+#[test]
+fn resolving_with_specific_version() {
+ let reg = registry(vec![
+ pkg!(("foo", "1.0.1")),
+ pkg!(("foo", "1.0.2")),
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![dep_req("foo", "=1.0.1")],
+ &reg).unwrap();
+
+ assert_that(&res, contains(names(&[("root", "1.0.0"),
+ ("foo", "1.0.1")])));
+}
+
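+// The resolver should pick the greatest version satisfying every
+// requirement: util 1.2.2 matches both "1.0.0" and ">=1.0.1".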
+#[test]
+fn test_resolving_maximum_version_with_transitive_deps() {
+ let reg = registry(vec![
+ pkg!(("util", "1.2.2")),
+ pkg!(("util", "1.0.0")),
+ pkg!(("util", "1.1.1")),
+ pkg!("foo" => [dep_req("util", "1.0.0")]),
+ pkg!("bar" => [dep_req("util", ">=1.0.1")]),
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![dep_req("foo", "1.0.0"), dep_req("bar", "1.0.0")],
+ &reg).unwrap();
+
+ assert_that(&res, contains(names(&[("root", "1.0.0"),
+ ("foo", "1.0.0"),
+ ("bar", "1.0.0"),
+ ("util", "1.2.2")])));
+ assert_that(&res, not(contains(names(&[("util", "1.0.1")]))));
+ assert_that(&res, not(contains(names(&[("util", "1.1.1")]))));
+}
+
+#[test]
+fn resolving_incompat_versions() {
+ let reg = registry(vec![
+ pkg!(("foo", "1.0.1")),
+ pkg!(("foo", "1.0.2")),
+ pkg!("bar" => [dep_req("foo", "=1.0.2")]),
+ ]);
+
+ assert!(resolve(&pkg_id("root"), vec![
+ dep_req("foo", "=1.0.1"),
+ dep("bar"),
+ ], &reg).is_err());
+}
+
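+// `bar` needs a `foo` version that does not exist, so the resolver must
+// drop foo 1.0.2 (which pulls in `bar`) and backtrack to foo 1.0.1.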
+#[test]
+fn resolving_backtrack() {
+ let reg = registry(vec![
+ pkg!(("foo", "1.0.2") => [dep("bar")]),
+ pkg!(("foo", "1.0.1") => [dep("baz")]),
+ pkg!("bar" => [dep_req("foo", "=2.0.2")]),
+ pkg!("baz"),
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![
+ dep_req("foo", "^1"),
+ ], &reg).unwrap();
+
+ assert_that(&res, contains(names(&[("root", "1.0.0"),
+ ("foo", "1.0.1"),
+ ("baz", "1.0.0")])));
+}
+
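+// Semver-incompatible releases of one package (0.1, 0.2, 1.0, 2.0) may
+// all coexist in a single resolution graph.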
+#[test]
+fn resolving_allows_multiple_compatible_versions() {
+ let reg = registry(vec![
+ pkg!(("foo", "1.0.0")),
+ pkg!(("foo", "2.0.0")),
+ pkg!(("foo", "0.1.0")),
+ pkg!(("foo", "0.2.0")),
+
+ pkg!("bar" => ["d1", "d2", "d3", "d4"]),
+ pkg!("d1" => [dep_req("foo", "1")]),
+ pkg!("d2" => [dep_req("foo", "2")]),
+ pkg!("d3" => [dep_req("foo", "0.1")]),
+ pkg!("d4" => [dep_req("foo", "0.2")]),
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![
+ dep("bar"),
+ ], &reg).unwrap();
+
+ assert_that(&res, contains(names(&[("root", "1.0.0"),
+ ("foo", "1.0.0"),
+ ("foo", "2.0.0"),
+ ("foo", "0.1.0"),
+ ("foo", "0.2.0"),
+ ("d1", "1.0.0"),
+ ("d2", "1.0.0"),
+ ("d3", "1.0.0"),
+ ("d4", "1.0.0"),
+ ("bar", "1.0.0")])));
+}
+
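+// foo 1.0.1 leads to bar 1.0.0, whose `other` requirement cannot be
+// satisfied (no `other` package is registered), so the resolver has to
+// unwind two levels and settle on foo 1.0.0 => bar 2.0.0 => baz 1.0.1.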
+#[test]
+fn resolving_with_deep_backtracking() {
+ let reg = registry(vec![
+ pkg!(("foo", "1.0.1") => [dep_req("bar", "1")]),
+ pkg!(("foo", "1.0.0") => [dep_req("bar", "2")]),
+
+ pkg!(("bar", "1.0.0") => [dep_req("baz", "=1.0.2"),
+ dep_req("other", "1")]),
+ pkg!(("bar", "2.0.0") => [dep_req("baz", "=1.0.1")]),
+
+ pkg!(("baz", "1.0.2") => [dep_req("other", "2")]),
+ pkg!(("baz", "1.0.1")),
+
+ pkg!(("dep_req", "1.0.0")),
+ pkg!(("dep_req", "2.0.0")),
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![
+ dep_req("foo", "1"),
+ ], &reg).unwrap();
+
+ assert_that(&res, contains(names(&[("root", "1.0.0"),
+ ("foo", "1.0.0"),
+ ("bar", "2.0.0"),
+ ("baz", "1.0.1")])));
+}
+
+#[test]
+fn resolving_but_no_exists() {
+ let reg = registry(vec![
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![
+ dep_req("foo", "1"),
+ ], &reg);
+ assert!(res.is_err());
+
+ assert_eq!(res.err().unwrap().to_string(), "\
+no matching package named `foo` found (required by `root`)
+location searched: registry `http://example.com/`
+version required: ^1\
+");
+}
+
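+// A package that depends on itself must not hang the resolver; the
+// outcome itself is deliberately not asserted.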
+#[test]
+fn resolving_cycle() {
+ let reg = registry(vec![
+ pkg!("foo" => ["foo"]),
+ ]);
+
+ let _ = resolve(&pkg_id("root"), vec![
+ dep_req("foo", "1"),
+ ], &reg);
+}
+
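+// The exact `=1.0.0` requirement at the root has to unify with the "1.0.0"
+// requirement coming through `bar`, forcing foo 1.0.0 over 1.0.1.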
+#[test]
+fn hard_equality() {
+ let reg = registry(vec![
+ pkg!(("foo", "1.0.1")),
+ pkg!(("foo", "1.0.0")),
+
+ pkg!(("bar", "1.0.0") => [dep_req("foo", "1.0.0")]),
+ ]);
+
+ let res = resolve(&pkg_id("root"), vec![
+ dep_req("bar", "1"),
+ dep_req("foo", "=1.0.0"),
+ ], &reg).unwrap();
+
+ assert_that(&res, contains(names(&[("root", "1.0.0"),
+ ("foo", "1.0.0"),
+ ("bar", "1.0.0")])));
+}
--- /dev/null
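+//! Integration tests for `cargo run`: target selection, argument
+//! forwarding, exit codes, and library path handling.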
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargo::util::paths::dylib_path_envvar;
+use cargotest::support::{project, execs, path2url};
+use hamcrest::{assert_that, existing_file};
+
+#[test]
+fn simple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo[EXE]`", dir = path2url(p.root())))
+ .with_stdout("\
+hello
+"));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+#[ignore]
+fn simple_implicit_main() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello world"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bins"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]foo[EXE]`", dir = path2url(p.root())))
+ .with_stdout("\
+hello world
+"));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+fn simple_quiet() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-q"),
+ execs().with_status(0).with_stdout("\
+hello
+")
+ );
+}
+
+#[test]
+fn simple_quiet_and_verbose() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-q").arg("-v"),
+ execs().with_status(101).with_stderr("\
+[ERROR] cannot set both --verbose and --quiet
+"));
+}
+
+#[test]
+fn quiet_and_verbose_config() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file(".cargo/config", r#"
+ [term]
+ verbose = true
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-q"),
+ execs().with_status(0));
+}
+
+#[test]
+fn simple_with_args() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ assert_eq!(std::env::args().nth(1).unwrap(), "hello");
+ assert_eq!(std::env::args().nth(2).unwrap(), "world");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("hello").arg("world"),
+ execs().with_status(0));
+}
+
+#[test]
+fn exit_code() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { std::process::exit(2); }
+ "#)
+ .build();
+
+ let mut output = String::from("\
+[COMPILING] foo v0.0.1 (file[..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[..]`
+");
+ if !cfg!(unix) {
+ output.push_str("\
+[ERROR] process didn't exit successfully: `target[..]foo[..]` (exit code: 2)
+");
+ }
+ assert_that(p.cargo("run"),
+ execs().with_status(2).with_stderr(output));
+}
+
+#[test]
+fn exit_code_verbose() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { std::process::exit(2); }
+ "#)
+ .build();
+
+ let mut output = String::from("\
+[COMPILING] foo v0.0.1 (file[..])
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[..]`
+");
+ if !cfg!(unix) {
+ output.push_str("\
+[ERROR] process didn't exit successfully: `target[..]foo[..]` (exit code: 2)
+");
+ }
+
+ assert_that(p.cargo("run").arg("-v"),
+ execs().with_status(2).with_stderr(output));
+}
+
+#[test]
+fn no_main_file() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(101)
+ .with_stderr("[ERROR] a bin target must be available \
+ for `cargo run`\n"));
+}
+
+#[test]
+#[ignore]
+fn no_main_file_implicit() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("run").arg("--bins"),
+ execs().with_status(101)
+ .with_stderr("[ERROR] a bin target must be available \
+ for `cargo run`\n"));
+}
+
+#[test]
+fn too_many_bins() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "")
+ .file("src/bin/b.rs", "")
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(101)
+ .with_stderr("[ERROR] `cargo run` requires that a project only \
+ have one executable; use the `--bin` option \
+ to specify which one to run\navailable binaries: [..]\n"));
+}
+
+#[test]
+#[ignore]
+fn too_many_bins_implicit() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "")
+ .file("src/bin/b.rs", "")
+ .build();
+
+ assert_that(p.cargo("run").arg("--bins"),
+ execs().with_status(101)
+ .with_stderr("[ERROR] `cargo run` requires that a project only \
+ have one executable; use the `--bin` option \
+ to specify which one to run\navailable binaries: [..]\n"));
+}
+
+#[test]
+fn specify_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ fn main() { println!("hello a.rs"); }
+ "#)
+ .file("src/bin/b.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ fn main() { println!("hello b.rs"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bin").arg("a").arg("-v"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[RUNNING] `rustc [..] src[/]lib.rs [..]`
+[RUNNING] `rustc [..] src[/]bin[/]a.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]a[EXE]`", dir = path2url(p.root())))
+ .with_stdout("\
+hello a.rs
+"));
+
+ assert_that(p.cargo("run").arg("--bin").arg("b").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] src[/]bin[/]b.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]b[EXE]`")
+ .with_stdout("\
+hello b.rs
+"));
+}
+
+#[test]
+fn run_example() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/a.rs", r#"
+ fn main() { println!("example"); }
+ "#)
+ .file("src/bin/a.rs", r#"
+ fn main() { println!("bin"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--example").arg("a"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]examples[/]a[EXE]`", dir = path2url(p.root())))
+ .with_stdout("\
+example
+"));
+}
+
+#[test]
+#[ignore]
+fn run_bin_implicit() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/a.rs", r#"
+ fn main() { println!("example"); }
+ "#)
+ .file("src/bin/a.rs", r#"
+ fn main() { println!("bin"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bins"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]examples[/]a[EXE]`", dir = path2url(p.root())))
+ .with_stdout("\
+bin
+"));
+}
+
+#[test]
+#[ignore]
+fn run_example_implicit() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/a.rs", r#"
+ fn main() { println!("example"); }
+ "#)
+ .file("src/bin/a.rs", r#"
+ fn main() { println!("bin"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--examples"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]examples[/]a[EXE]`", dir = path2url(p.root())))
+ .with_stdout("\
+example
+"));
+}
+
+#[test]
+fn run_with_filename() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", r#"
+ extern crate foo;
+ fn main() { println!("hello a.rs"); }
+ "#)
+ .file("examples/a.rs", r#"
+ fn main() { println!("example"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bin").arg("bin.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no bin target named `bin.rs`"));
+
+ assert_that(p.cargo("run").arg("--bin").arg("a.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no bin target named `a.rs`
+
+Did you mean `a`?"));
+
+ assert_that(p.cargo("run").arg("--example").arg("example.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no example target named `example.rs`"));
+
+ assert_that(p.cargo("run").arg("--example").arg("a.rs"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no example target named `a.rs`
+
+Did you mean `a`?"));
+}
+
+#[test]
+fn either_name_or_example() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/a.rs", r#"
+ fn main() { println!("hello a.rs"); }
+ "#)
+ .file("examples/b.rs", r#"
+ fn main() { println!("hello b.rs"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--bin").arg("a").arg("--example").arg("b"),
+ execs().with_status(101)
+ .with_stderr("[ERROR] `cargo run` can run at most one \
+ executable, but multiple were \
+ specified"));
+}
+
+#[test]
+fn one_bin_multiple_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/main.rs", r#"
+ fn main() { println!("hello main.rs"); }
+ "#)
+ .file("examples/a.rs", r#"
+ fn main() { println!("hello a.rs"); }
+ "#)
+ .file("examples/b.rs", r#"
+ fn main() { println!("hello b.rs"); }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]main[EXE]`", dir = path2url(p.root())))
+ .with_stdout("\
+hello main.rs
+"));
+}
+
+#[test]
+fn example_with_release_flag() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ version = "*"
+ path = "bar"
+ "#)
+ .file("examples/a.rs", r#"
+ extern crate bar;
+
+ fn main() {
+ if cfg!(debug_assertions) {
+ println!("slow1")
+ } else {
+ println!("fast1")
+ }
+ bar::baz();
+ }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("bar/src/bar.rs", r#"
+ pub fn baz() {
+ if cfg!(debug_assertions) {
+ println!("slow2")
+ } else {
+ println!("fast2")
+ }
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-v").arg("--release").arg("--example").arg("a"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({url}/bar)
+[RUNNING] `rustc --crate-name bar bar[/]src[/]bar.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C opt-level=3 \
+ -C metadata=[..] \
+ --out-dir {dir}[/]target[/]release[/]deps \
+ -L dependency={dir}[/]target[/]release[/]deps`
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name a examples[/]a.rs --crate-type bin \
+ --emit=dep-info,link \
+ -C opt-level=3 \
+ -C metadata=[..] \
+ --out-dir {dir}[/]target[/]release[/]examples \
+ -L dependency={dir}[/]target[/]release[/]deps \
+ --extern bar={dir}[/]target[/]release[/]deps[/]libbar-[..].rlib`
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `target[/]release[/]examples[/]a[EXE]`
+",
+ dir = p.root().display(),
+ url = path2url(p.root()),
+ ))
+ .with_stdout("\
+fast1
+fast2"));
+
+ assert_that(p.cargo("run").arg("-v").arg("--example").arg("a"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({url}/bar)
+[RUNNING] `rustc --crate-name bar bar[/]src[/]bar.rs --crate-type lib \
+ --emit=dep-info,link \
+ -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir {dir}[/]target[/]debug[/]deps \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name a examples[/]a.rs --crate-type bin \
+ --emit=dep-info,link \
+ -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir {dir}[/]target[/]debug[/]examples \
+ -L dependency={dir}[/]target[/]debug[/]deps \
+ --extern bar={dir}[/]target[/]debug[/]deps[/]libbar-[..].rlib`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `target[/]debug[/]examples[/]a[EXE]`
+",
+ dir = p.root().display(),
+ url = path2url(p.root()),
+ ))
+ .with_stdout("\
+slow1
+slow2"));
+}
+
+#[test]
+fn run_dylib_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() { bar::bar(); }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ crate-type = ["dylib"]
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .build();
+
+ assert_that(p.cargo("run").arg("hello").arg("world"),
+ execs().with_status(0));
+}
+
+#[test]
+fn release_works() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { if cfg!(debug_assertions) { panic!() } }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--release"),
+ execs().with_status(0).with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `target[/]release[/]foo[EXE]`
+",
+ dir = path2url(p.root()),
+ )));
+ assert_that(&p.release_bin("foo"), existing_file());
+}
+
+#[test]
+fn run_bin_different_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/bar.rs", r#"
+ fn main() { }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run"), execs().with_status(0));
+}
+
+#[test]
+fn dashes_are_forwarded() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "bar"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ let s: Vec<String> = std::env::args().collect();
+ assert_eq!(s[1], "a");
+ assert_eq!(s[2], "--");
+ assert_eq!(s[3], "b");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("--").arg("a").arg("--").arg("b"),
+ execs().with_status(0));
+}
+
+#[test]
+fn run_from_executable_folder() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() { println!("hello"); }
+ "#)
+ .build();
+
+ let cwd = p.root().join("target").join("debug");
+ p.cargo("build").exec_with_output().unwrap();
+
+ assert_that(p.cargo("run").cwd(cwd),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]\n\
+[RUNNING] `.[/]foo[EXE]`")
+ .with_stdout("\
+hello
+"));
+}
+
+#[test]
+fn run_with_library_paths() {
+ let p = project("foo");
+
+ // Only link search directories within the target output directory are
+ // propagated through to dylib_path_envvar() (see #3366).
+ let mut dir1 = p.target_debug_dir();
+ dir1.push("foo\\backslash");
+
+ let mut dir2 = p.target_debug_dir();
+ dir2.push("dir=containing=equal=signs");
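+ // The directory names deliberately contain a backslash and `=` characters
+ // to check that such paths survive the build script and environment
+ // round trip intact.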
+
+ let p = p
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", &format!(r##"
+ fn main() {{
+ println!(r#"cargo:rustc-link-search=native={}"#);
+ println!(r#"cargo:rustc-link-search={}"#);
+ }}
+ "##, dir1.display(), dir2.display()))
+ .file("src/main.rs", &format!(r##"
+ fn main() {{
+ let search_path = std::env::var_os("{}").unwrap();
+ let paths = std::env::split_paths(&search_path).collect::<Vec<_>>();
+ assert!(paths.contains(&r#"{}"#.into()));
+ assert!(paths.contains(&r#"{}"#.into()));
+ }}
+ "##, dylib_path_envvar(), dir1.display(), dir2.display()))
+ .build();
+
+ assert_that(p.cargo("run"), execs().with_status(0));
+}
+
+#[test]
+fn fail_no_extra_verbose() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ std::process::exit(1);
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("run").arg("-q"),
+ execs().with_status(1)
+ .with_stdout("")
+ .with_stderr(""));
+}
+
+#[test]
+fn run_multiple_packages() {
+ let p = project("foo")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [workspace]
+
+ [dependencies]
+ d1 = { path = "d1" }
+ d2 = { path = "d2" }
+ d3 = { path = "../d3" } # outside of the workspace
+
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("foo/src/foo.rs", "fn main() { println!(\"foo\"); }")
+ .file("foo/d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "d1"
+ "#)
+ .file("foo/d1/src/lib.rs", "")
+ .file("foo/d1/src/main.rs", "fn main() { println!(\"d1\"); }")
+ .file("foo/d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "d2"
+ "#)
+ .file("foo/d2/src/main.rs", "fn main() { println!(\"d2\"); }")
+ .file("d3/Cargo.toml", r#"
+ [package]
+ name = "d3"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("d3/src/main.rs", "fn main() { println!(\"d2\"); }")
+ .build();
+
+ let cargo = || {
+ let mut process_builder = p.cargo("run");
+ process_builder.cwd(p.root().join("foo"));
+ process_builder
+ };
+
+ assert_that(cargo().arg("-p").arg("d1"),
+ execs().with_status(0).with_stdout("d1"));
+
+ assert_that(cargo().arg("-p").arg("d2").arg("--bin").arg("d2"),
+ execs().with_status(0).with_stdout("d2"));
+
+ assert_that(cargo(),
+ execs().with_status(0).with_stdout("foo"));
+
+ assert_that(cargo().arg("-p").arg("d1").arg("-p").arg("d2"),
+ execs()
+ .with_status(1)
+ .with_stderr_contains("[ERROR] Invalid arguments."));
+
+ assert_that(cargo().arg("-p").arg("d3"),
+ execs()
+ .with_status(101)
+ .with_stderr_contains("[ERROR] package `d3` is not a member of the workspace"));
+}
--- /dev/null
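+//! Integration tests for `cargo rustc`, which forwards extra compiler
+//! flags to a single selected target.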
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{execs, project};
+use hamcrest::assert_that;
+
+const CARGO_RUSTC_ERROR: &'static str =
+"[ERROR] extra arguments to `rustc` can only be passed to one target, consider filtering
+the package by passing e.g. `--lib` or `--bin NAME` to specify a single target";
+
+#[test]
+fn build_lib_for_foo() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustc").arg("--lib").arg("-v"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.root().display(), url = p.url())));
+}
+
+#[test]
+fn lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustc").arg("--lib").arg("-v")
+ .arg("--").arg("-C").arg("debug-assertions=off"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C debug-assertions=off \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.root().display(), url = p.url())))
+}
+
+#[test]
+fn build_main_and_allow_unstable_options() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustc").arg("-v").arg("--bin").arg("foo")
+ .arg("--").arg("-C").arg("debug-assertions"),
+ execs()
+ .with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] {name} v{version} ({url})
+[RUNNING] `rustc --crate-name {name} src[/]lib.rs --crate-type lib \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[RUNNING] `rustc --crate-name {name} src[/]main.rs --crate-type bin \
+ --emit=dep-info,link -C debuginfo=2 \
+ -C debug-assertions \
+ -C metadata=[..] \
+ --out-dir [..] \
+ -L dependency={dir}[/]target[/]debug[/]deps \
+ --extern {name}={dir}[/]target[/]debug[/]deps[/]lib{name}-[..].rlib`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.root().display(), url = p.url(),
+ name = "foo", version = "0.0.1")));
+}
+
+#[test]
+fn fails_when_trying_to_build_main_and_lib_with_args() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustc").arg("-v")
+ .arg("--").arg("-C").arg("debug-assertions"),
+ execs()
+ .with_status(101)
+ .with_stderr(CARGO_RUSTC_ERROR));
+}
+
+#[test]
+fn build_with_args_to_one_of_multiple_binaries() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/foo.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/bin/bar.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/bin/baz.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustc").arg("-v").arg("--bin").arg("bar")
+ .arg("--").arg("-C").arg("debug-assertions"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib --emit=dep-info,link \
+ -C debuginfo=2 -C metadata=[..] \
+ --out-dir [..]`
+[RUNNING] `rustc --crate-name bar src[/]bin[/]bar.rs --crate-type bin --emit=dep-info,link \
+ -C debuginfo=2 -C debug-assertions [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = p.url())));
+}
+
+#[test]
+fn fails_with_args_to_all_binaries() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/foo.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/bin/bar.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/bin/baz.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustc").arg("-v")
+ .arg("--").arg("-C").arg("debug-assertions"),
+ execs()
+ .with_status(101)
+ .with_stderr(CARGO_RUSTC_ERROR));
+}
+
+#[test]
+fn build_with_args_to_one_of_multiple_tests() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("tests/foo.rs", r#" "#)
+ .file("tests/bar.rs", r#" "#)
+ .file("tests/baz.rs", r#" "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustc").arg("-v").arg("--test").arg("bar")
+ .arg("--").arg("-C").arg("debug-assertions"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc --crate-name foo src[/]lib.rs --crate-type lib --emit=dep-info,link \
+ -C debuginfo=2 -C metadata=[..] \
+ --out-dir [..]`
+[RUNNING] `rustc --crate-name bar tests[/]bar.rs --emit=dep-info,link -C debuginfo=2 \
+ -C debug-assertions [..]--test[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = p.url())));
+}
+
+#[test]
+fn build_foo_with_bar_dependency() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ bar::baz()
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("rustc").arg("-v").arg("--").arg("-C").arg("debug-assertions"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[COMPILING] bar v0.1.0 ([..])
+[RUNNING] `[..] -C debuginfo=2 [..]`
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `[..] -C debuginfo=2 -C debug-assertions [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo.url())));
+}
+
+#[test]
+fn build_only_bar_dependency() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ bar::baz()
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("rustc").arg("-v").arg("-p").arg("bar")
+ .arg("--").arg("-C").arg("debug-assertions"),
+ execs()
+ .with_status(0)
+ .with_stderr("\
+[COMPILING] bar v0.1.0 ([..])
+[RUNNING] `rustc --crate-name bar [..] --crate-type lib [..] -C debug-assertions [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn fail_with_multiple_packages() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+
+ [dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .build();
+
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ if cfg!(flag = "1") { println!("Yeah from bar!"); }
+ }
+ "#)
+ .build();
+
+ let _baz = project("baz")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {
+ if cfg!(flag = "1") { println!("Yeah from baz!"); }
+ }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("rustc").arg("-v").arg("-p").arg("bar")
+ .arg("-p").arg("baz"),
+ execs().with_status(1).with_stderr("\
+[ERROR] Invalid arguments.
+
+Usage:
+ cargo rustc [options] [--] [<opts>...]"));
+}
+
+#[test]
+fn rustc_with_other_profile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dev-dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/main.rs", r#"
+ #[cfg(test)] extern crate a;
+
+ #[test]
+ fn foo() {}
+ "#)
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("rustc").arg("--profile").arg("test"),
+ execs().with_status(0));
+}
--- /dev/null
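+//! Integration tests for `cargo rustdoc`, which forwards extra flags to
+//! `rustdoc` for a single target.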
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{execs, project};
+use hamcrest::assert_that;
+
+#[test]
+fn rustdoc_simple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustdoc").arg("-v"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[DOCUMENTING] foo v0.0.1 ({url})
+[RUNNING] `rustdoc --crate-name foo src[/]lib.rs \
+ -o {dir}[/]target[/]doc \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.root().display(), url = p.url())));
+}
+
+#[test]
+fn rustdoc_args() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustdoc").arg("-v").arg("--").arg("--cfg=foo"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[DOCUMENTING] foo v0.0.1 ({url})
+[RUNNING] `rustdoc --crate-name foo src[/]lib.rs \
+ -o {dir}[/]target[/]doc \
+ --cfg=foo \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.root().display(), url = p.url())));
+}
+
+#[test]
+fn rustdoc_foo_with_bar_dependency() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar;
+ pub fn foo() {}
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("rustdoc").arg("-v").arg("--").arg("--cfg=foo"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[COMPILING] bar v0.0.1 ([..])
+[RUNNING] `rustc [..]bar[/]src[/]lib.rs [..]`
+[DOCUMENTING] foo v0.0.1 ({url})
+[RUNNING] `rustdoc --crate-name foo src[/]lib.rs \
+ -o {dir}[/]target[/]doc \
+ --cfg=foo \
+ -L dependency={dir}[/]target[/]debug[/]deps \
+ --extern [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = foo.root().display(), url = foo.url())));
+}
+
+#[test]
+fn rustdoc_only_bar_dependency() {
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/main.rs", r#"
+ extern crate bar;
+ fn main() {
+ bar::baz()
+ }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn baz() {}
+ "#)
+ .build();
+
+ assert_that(foo.cargo("rustdoc").arg("-v").arg("-p").arg("bar")
+ .arg("--").arg("--cfg=foo"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[DOCUMENTING] bar v0.0.1 ([..])
+[RUNNING] `rustdoc --crate-name bar [..]bar[/]src[/]lib.rs \
+ -o {dir}[/]target[/]doc \
+ --cfg=foo \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = foo.root().display())));
+}
+
+#[test]
+fn rustdoc_same_name_documents_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("src/lib.rs", r#" "#)
+ .build();
+
+ assert_that(p.cargo("rustdoc").arg("-v")
+ .arg("--").arg("--cfg=foo"),
+ execs()
+ .with_status(0)
+ .with_stderr(format!("\
+[DOCUMENTING] foo v0.0.1 ([..])
+[RUNNING] `rustdoc --crate-name foo src[/]lib.rs \
+ -o {dir}[/]target[/]doc \
+ --cfg=foo \
+ -L dependency={dir}[/]target[/]debug[/]deps`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.root().display())));
+}
--- /dev/null
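+//! Tests for the `RUSTDOCFLAGS` environment variable and the
+//! `build.rustdocflags` config value.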
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn parses_env() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("doc").env("RUSTDOCFLAGS", "--cfg=foo").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustdoc [..] --cfg=foo[..]`
+"));
+}
+
+#[test]
+fn parses_config() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ rustdocflags = ["--cfg", "foo"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("doc").arg("-v"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] `rustdoc [..] --cfg foo[..]`
+"));
+}
+
+#[test]
+fn bad_flags() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("doc").env("RUSTDOCFLAGS", "--bogus"),
+ execs().with_status(101));
+}
+
+#[test]
+fn rerun() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("doc").env("RUSTDOCFLAGS", "--cfg=foo"),
+ execs().with_status(0));
+ assert_that(p.cargo("doc").env("RUSTDOCFLAGS", "--cfg=foo"),
+ execs().with_status(0).with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("doc").env("RUSTDOCFLAGS", "--cfg=bar"),
+ execs().with_status(0).with_stderr("\
+[DOCUMENTING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
--- /dev/null
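+//! Tests for `RUSTFLAGS`: the flags apply to normal targets, and to build
+//! scripts and plugins only when `--target` is not specified.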
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::io::Write;
+use std::fs::{self, File};
+
+use cargotest::rustc_host;
+use cargotest::support::{project, project_in_home, execs, paths};
+use hamcrest::assert_that;
+
+#[test]
+fn env_rustflags_normal_source() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .file("tests/c.rs", "#[test] fn f() { }")
+ .file("benches/d.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }"#)
+ .build();
+
+ // Use RUSTFLAGS to pass an argument that will generate an error
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus")
+ .arg("--lib"),
+ execs().with_status(101));
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus")
+ .arg("--bin=a"),
+ execs().with_status(101));
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus")
+ .arg("--example=b"),
+ execs().with_status(101));
+ assert_that(p.cargo("test").env("RUSTFLAGS", "-Z bogus"),
+ execs().with_status(101));
+ assert_that(p.cargo("bench").env("RUSTFLAGS", "-Z bogus"),
+ execs().with_status(101));
+}
+
+#[test]
+fn env_rustflags_build_script() {
+ // RUSTFLAGS should be passed to rustc for build scripts
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ #[cfg(not(foo))]
+ fn main() { }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_build_script_dep() {
+ // RUSTFLAGS should be passed to rustc for build scripts
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+
+ [build-dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(not(foo))]
+ fn bar() { }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_plugin() {
+ // RUSTFLAGS should be passed to rustc for plugins
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+ "#)
+ .file("src/lib.rs", r#"
+ fn main() { }
+ #[cfg(not(foo))]
+ fn main() { }
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_plugin_dep() {
+ // RUSTFLAGS should be passed to rustc for plugins
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn foo() { }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(not(foo))]
+ fn bar() { }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_normal_source_with_target() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .file("tests/c.rs", "#[test] fn f() { }")
+ .file("benches/d.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }"#)
+ .build();
+
+ let host = &rustc_host();
+
+ // Use RUSTFLAGS to pass an argument that will generate an error
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus")
+ .arg("--lib").arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus")
+ .arg("--bin=a").arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus")
+ .arg("--example=b").arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("test").env("RUSTFLAGS", "-Z bogus")
+ .arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("bench").env("RUSTFLAGS", "-Z bogus")
+ .arg("--target").arg(host),
+ execs().with_status(101));
+}
+
+#[test]
+fn env_rustflags_build_script_with_target() {
+ // RUSTFLAGS should not be passed to rustc for build scripts
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
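+ // (With an explicit --target, host artifacts such as build scripts are
+ // compiled separately and deliberately do not receive RUSTFLAGS.)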
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ #[cfg(foo)]
+ fn main() { }
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_build_script_dep_with_target() {
+ // RUSTFLAGS should not be passed to rustc for build scripts
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+
+ [build-dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(foo)]
+ fn bar() { }
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(foo.cargo("build").env("RUSTFLAGS", "--cfg foo")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_plugin_with_target() {
+ // RUSTFLAGS should not be passed to rustc for plugins
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+ "#)
+ .file("src/lib.rs", r#"
+ fn main() { }
+ #[cfg(foo)]
+ fn main() { }
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_plugin_dep_with_target() {
+ // RUSTFLAGS should not be passed to rustc for plugins
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn foo() { }
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(foo)]
+ fn bar() { }
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(foo.cargo("build").env("RUSTFLAGS", "--cfg foo")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn env_rustflags_recompile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ // Setting RUSTFLAGS forces a recompile
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus"),
+ execs().with_status(101));
+}
+
+#[test]
+fn env_rustflags_recompile2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+ // Changing RUSTFLAGS forces a recompile
+ assert_that(p.cargo("build").env("RUSTFLAGS", "-Z bogus"),
+ execs().with_status(101));
+}
+
+#[test]
+fn env_rustflags_no_recompile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_stdout("").with_status(0));
+}
+
+#[test]
+fn build_rustflags_normal_source() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .file("tests/c.rs", "#[test] fn f() { }")
+ .file("benches/d.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }"#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["-Z", "bogus"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--lib"),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--bin=a"),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--example=b"),
+ execs().with_status(101));
+ assert_that(p.cargo("test"),
+ execs().with_status(101));
+ assert_that(p.cargo("bench"),
+ execs().with_status(101));
+}
+
+#[test]
+fn build_rustflags_build_script() {
+ // RUSTFLAGS should be passed to rustc for build scripts
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ #[cfg(not(foo))]
+ fn main() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_build_script_dep() {
+ // RUSTFLAGS should be passed to rustc for build scripts
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+
+ [build-dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(not(foo))]
+ fn bar() { }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_plugin() {
+ // RUSTFLAGS should be passed to rustc for plugins
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+ "#)
+ .file("src/lib.rs", r#"
+ fn main() { }
+ #[cfg(not(foo))]
+ fn main() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_plugin_dep() {
+ // RUSTFLAGS should be passed to rustc for plugins
+ // when --target is not specified.
+ // In this test if --cfg foo is not passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn foo() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(not(foo))]
+ fn bar() { }
+ "#)
+ .build();
+
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_normal_source_with_target() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .file("tests/c.rs", "#[test] fn f() { }")
+ .file("benches/d.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }"#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["-Z", "bogus"]
+ "#)
+ .build();
+
+ let host = rustc_host();
+
+ // Use build.rustflags to pass an argument that will generate an error
+ assert_that(p.cargo("build")
+ .arg("--lib").arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--bin=a").arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--example=b").arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("test")
+ .arg("--target").arg(host),
+ execs().with_status(101));
+ assert_that(p.cargo("bench")
+ .arg("--target").arg(host),
+ execs().with_status(101));
+}
+
+#[test]
+fn build_rustflags_build_script_with_target() {
+ // RUSTFLAGS should not be passed to rustc for build scripts
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ #[cfg(foo)]
+ fn main() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(p.cargo("build")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_build_script_dep_with_target() {
+ // RUSTFLAGS should not be passed to rustc for build scripts
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ build = "build.rs"
+
+ [build-dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", "")
+ .file("build.rs", r#"
+ fn main() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(foo)]
+ fn bar() { }
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(foo.cargo("build")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_plugin_with_target() {
+ // RUSTFLAGS should not be passed to rustc for plugins
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+ "#)
+ .file("src/lib.rs", r#"
+ fn main() { }
+ #[cfg(foo)]
+ fn main() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(p.cargo("build")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_plugin_dep_with_target() {
+ // RUSTFLAGS should not be passed to rustc for plugins
+ // when --target is specified.
+ // In this test if --cfg foo is passed the build will fail.
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+
+ [lib]
+ name = "foo"
+ plugin = true
+
+ [dependencies.bar]
+ path = "../bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn foo() { }
+ "#)
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+ let _bar = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+
+ [lib]
+ name = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ fn bar() { }
+ #[cfg(foo)]
+ fn bar() { }
+ "#)
+ .build();
+
+ let host = rustc_host();
+ assert_that(foo.cargo("build")
+ .arg("--target").arg(host),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_rustflags_recompile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ // Adding build.rustflags forces a recompile
+ let config = r#"
+ [build]
+ rustflags = ["-Z", "bogus"]
+ "#;
+ let config_file = paths::root().join("foo/.cargo/config");
+ fs::create_dir_all(config_file.parent().unwrap()).unwrap();
+ let mut config_file = File::create(config_file).unwrap();
+ config_file.write_all(config.as_bytes()).unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101));
+}
+
+#[test]
+fn build_rustflags_recompile2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+
+ // Adding build.rustflags forces a recompile
+ let config = r#"
+ [build]
+ rustflags = ["-Z", "bogus"]
+ "#;
+ let config_file = paths::root().join("foo/.cargo/config");
+ fs::create_dir_all(config_file.parent().unwrap()).unwrap();
+ let mut config_file = File::create(config_file).unwrap();
+ config_file.write_all(config.as_bytes()).unwrap();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101));
+}
+
+#[test]
+fn build_rustflags_no_recompile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_status(0));
+ assert_that(p.cargo("build").env("RUSTFLAGS", "--cfg foo"),
+ execs().with_stdout("").with_status(0));
+}
+
+#[test]
+fn build_rustflags_with_home_config() {
+ // We need a config file inside the home directory
+ let home = paths::home();
+ let home_config = home.join(".cargo");
+ fs::create_dir(&home_config).unwrap();
+ File::create(&home_config.join("config")).unwrap().write_all(br#"
+ [build]
+ rustflags = ["-Cllvm-args=-x86-asm-syntax=intel"]
+ "#).unwrap();
+
+ // And we need the project to be inside the home directory
+ // so the walking process finds the home config twice.
+ let p = project_in_home("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn target_rustflags_normal_source() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .file("tests/c.rs", "#[test] fn f() { }")
+ .file("benches/d.rs", r#"
+ #![feature(test)]
+ extern crate test;
+ #[bench] fn run1(_ben: &mut test::Bencher) { }"#)
+ .file(".cargo/config", &format!("
+ [target.{}]
+ rustflags = [\"-Z\", \"bogus\"]
+ ", rustc_host()))
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--lib"),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--bin=a"),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--example=b"),
+ execs().with_status(101));
+ assert_that(p.cargo("test"),
+ execs().with_status(101));
+ assert_that(p.cargo("bench"),
+ execs().with_status(101));
+}
+
+// target.{}.rustflags takes precedence over build.rustflags
+#[test]
+fn target_rustflags_precedence() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", &format!("
+ [build]
+ rustflags = [\"--cfg\", \"foo\"]
+
+ [target.{}]
+ rustflags = [\"-Z\", \"bogus\"]
+ ", rustc_host()))
+ .build();
+
+ assert_that(p.cargo("build")
+ .arg("--lib"),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--bin=a"),
+ execs().with_status(101));
+ assert_that(p.cargo("build")
+ .arg("--example=b"),
+ execs().with_status(101));
+ assert_that(p.cargo("test"),
+ execs().with_status(101));
+ assert_that(p.cargo("bench"),
+ execs().with_status(101));
+}
+
+#[test]
+fn cfg_rustflags_normal_source() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "pub fn t() {}")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .file("tests/c.rs", "#[test] fn f() { }")
+ .file(".cargo/config", &format!(r#"
+ [target.'cfg({})']
+ rustflags = ["--cfg", "bar"]
+ "#, if rustc_host().contains("-windows-") {"windows"} else {"not(windows)"}))
+ .build();
+
+ assert_that(p.cargo("build").arg("--lib").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build").arg("--bin=a").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build").arg("--example=b").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("test").arg("--no-run").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("bench").arg("--no-run").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] release [optimized] target(s) in [..]
+"));
+}
+
+// target.'cfg(...)'.rustflags takes precedence over build.rustflags
+#[test]
+fn cfg_rustflags_precedence() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "pub fn t() {}")
+ .file("src/bin/a.rs", "fn main() {}")
+ .file("examples/b.rs", "fn main() {}")
+ .file("tests/c.rs", "#[test] fn f() { }")
+ .file(".cargo/config", &format!(r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+
+ [target.'cfg({})']
+ rustflags = ["--cfg", "bar"]
+ "#, if rustc_host().contains("-windows-") { "windows" } else { "not(windows)" }))
+ .build();
+
+ assert_that(p.cargo("build").arg("--lib").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build").arg("--bin=a").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build").arg("--example=b").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("test").arg("--no-run").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("bench").arg("--no-run").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[RUNNING] `rustc [..] --cfg bar[..]`
+[FINISHED] release [optimized] target(s) in [..]
+"));
+}
+
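+// build.rustflags (and target rustflags) accept either a TOML array of
+// arguments or a single whitespace-separated string; the projects below
+// should behave identically with either form.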
+#[test]
+fn target_rustflags_string_and_array_form1() {
+ let p1 = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = ["--cfg", "foo"]
+ "#)
+ .build();
+
+ assert_that(p1.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg foo[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ let p2 = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", r#"
+ [build]
+ rustflags = "--cfg foo"
+ "#)
+ .build();
+
+ assert_that(p2.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg foo[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn target_rustflags_string_and_array_form2() {
+ let p1 = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file(".cargo/config", &format!(r#"
+ [target.{}]
+ rustflags = ["--cfg", "foo"]
+ "#, rustc_host()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p1.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg foo[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ let p2 = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file(".cargo/config", &format!(r#"
+ [target.{}]
+ rustflags = "--cfg foo"
+ "#, rustc_host()))
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p2.cargo("build").arg("-v"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] --cfg foo[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+extern crate url;
+
+use std::fs::{self, File};
+use std::io::prelude::*;
+use std::path::PathBuf;
+
+use cargo::util::ProcessBuilder;
+use cargotest::support::execs;
+use cargotest::support::git::repo;
+use cargotest::support::paths;
+use hamcrest::assert_that;
+use url::Url;
+
+fn registry_path() -> PathBuf { paths::root().join("registry") }
+fn registry() -> Url { Url::from_file_path(&*registry_path()).ok().unwrap() }
+fn api_path() -> PathBuf { paths::root().join("api") }
+fn api() -> Url { Url::from_file_path(&*api_path()).ok().unwrap() }
+
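+// Test setup: create the .cargo directory and a stub registry, a git repo
+// whose config.json points the download and API endpoints at local paths.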
+fn setup() {
+ let config = paths::root().join(".cargo/config");
+ fs::create_dir_all(config.parent().unwrap()).unwrap();
+ fs::create_dir_all(&api_path().join("api/v1")).unwrap();
+
+ let _ = repo(&registry_path())
+ .file("config.json", &format!(r#"{{
+ "dl": "{0}",
+ "api": "{0}"
+ }}"#, api()))
+ .build();
+}
+
+fn cargo_process(s: &str) -> ProcessBuilder {
+ let mut b = cargotest::cargo_process();
+ b.arg(s);
+ b
+}
+
+#[test]
+fn simple() {
+ setup();
+
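+ // A canned crates.io-style response for the search query below.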
+ let contents = r#"{
+ "crates": [{
+ "created_at": "2014-11-16T20:17:35Z",
+ "description": "Design by contract style assertions for Rust",
+ "documentation": null,
+ "downloads": 2,
+ "homepage": null,
+ "id": "hoare",
+ "keywords": [],
+ "license": null,
+ "links": {
+ "owners": "/api/v1/crates/hoare/owners",
+ "reverse_dependencies": "/api/v1/crates/hoare/reverse_dependencies",
+ "version_downloads": "/api/v1/crates/hoare/downloads",
+ "versions": "/api/v1/crates/hoare/versions"
+ },
+ "max_version": "0.1.1",
+ "name": "hoare",
+ "repository": "https://github.com/nick29581/libhoare",
+ "updated_at": "2014-11-20T21:49:21Z",
+ "versions": null
+ }],
+ "meta": {
+ "total": 1
+ }
+ }"#;
+ let base = api_path().join("api/v1/crates");
+
+ // Older versions of curl don't peel off query parameters when looking for
+ // filenames, so just make both files.
+ //
+ // On Windows, though, `?` is an invalid character, but we always build curl
+ // from source there anyway!
+ File::create(&base).unwrap().write_all(contents.as_bytes()).unwrap();
+ if !cfg!(windows) {
+ File::create(&base.with_file_name("crates?q=postgres&per_page=10")).unwrap()
+ .write_all(contents.as_bytes()).unwrap();
+ }
+
+ assert_that(cargo_process("search").arg("postgres")
+ .arg("--index").arg(registry().to_string()),
+ execs().with_status(0)
+ .with_stdout_contains("\
+hoare = \"0.1.1\" # Design by contract style assertions for Rust"));
+}
+
+// TODO: Deprecated
+// remove once it has been decided '--host' can be safely removed
+#[test]
+fn simple_with_host() {
+ setup();
+
+ let contents = r#"{
+ "crates": [{
+ "created_at": "2014-11-16T20:17:35Z",
+ "description": "Design by contract style assertions for Rust",
+ "documentation": null,
+ "downloads": 2,
+ "homepage": null,
+ "id": "hoare",
+ "keywords": [],
+ "license": null,
+ "links": {
+ "owners": "/api/v1/crates/hoare/owners",
+ "reverse_dependencies": "/api/v1/crates/hoare/reverse_dependencies",
+ "version_downloads": "/api/v1/crates/hoare/downloads",
+ "versions": "/api/v1/crates/hoare/versions"
+ },
+ "max_version": "0.1.1",
+ "name": "hoare",
+ "repository": "https://github.com/nick29581/libhoare",
+ "updated_at": "2014-11-20T21:49:21Z",
+ "versions": null
+ }],
+ "meta": {
+ "total": 1
+ }
+ }"#;
+ let base = api_path().join("api/v1/crates");
+
+ // Older versions of curl don't peel off query parameters when looking for
+ // filenames, so just make both files.
+ //
+ // On Windows, though, `?` is an invalid character, but we always build curl
+ // from source there anyway!
+ File::create(&base).unwrap().write_all(contents.as_bytes()).unwrap();
+ if !cfg!(windows) {
+ File::create(&base.with_file_name("crates?q=postgres&per_page=10")).unwrap()
+ .write_all(contents.as_bytes()).unwrap();
+ }
+
+ assert_that(cargo_process("search").arg("postgres")
+ .arg("--host").arg(registry().to_string()),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[WARNING] The flag '--host' is no longer valid.
+
+Previous versions of Cargo accepted this flag, but it is being
+depricated. The flag is being renamed to 'index', as the flag
+wants the location of the index in which to search. Please
+use '--index' instead.
+
+This will soon become a hard error, so it's either recommended
+to update to a fixed version or contact the upstream maintainer
+about this warning.
+[UPDATING] registry `{reg}`
+",
+ reg = registry()))
+ .with_stdout_contains("\
+hoare = \"0.1.1\" # Design by contract style assertions for Rust"));
+}
+
+// TODO: Deprecated
+// remove once it has been decided '--host' can be safely removed
+#[test]
+fn simple_with_index_and_host() {
+ setup();
+
+ let contents = r#"{
+ "crates": [{
+ "created_at": "2014-11-16T20:17:35Z",
+ "description": "Design by contract style assertions for Rust",
+ "documentation": null,
+ "downloads": 2,
+ "homepage": null,
+ "id": "hoare",
+ "keywords": [],
+ "license": null,
+ "links": {
+ "owners": "/api/v1/crates/hoare/owners",
+ "reverse_dependencies": "/api/v1/crates/hoare/reverse_dependencies",
+ "version_downloads": "/api/v1/crates/hoare/downloads",
+ "versions": "/api/v1/crates/hoare/versions"
+ },
+ "max_version": "0.1.1",
+ "name": "hoare",
+ "repository": "https://github.com/nick29581/libhoare",
+ "updated_at": "2014-11-20T21:49:21Z",
+ "versions": null
+ }],
+ "meta": {
+ "total": 1
+ }
+ }"#;
+ let base = api_path().join("api/v1/crates");
+
+ // Older versions of curl don't peel off query parameters when looking for
+ // filenames, so just make both files.
+ //
+ // On Windows, though, `?` is an invalid character, but we always build curl
+ // from source there anyway!
+ File::create(&base).unwrap().write_all(contents.as_bytes()).unwrap();
+ if !cfg!(windows) {
+ File::create(&base.with_file_name("crates?q=postgres&per_page=10")).unwrap()
+ .write_all(contents.as_bytes()).unwrap();
+ }
+
+ assert_that(cargo_process("search").arg("postgres")
+ .arg("--index").arg(registry().to_string())
+ .arg("--host").arg(registry().to_string()),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[WARNING] The flag '--host' is no longer valid.
+
+Previous versions of Cargo accepted this flag, but it is being
+depricated. The flag is being renamed to 'index', as the flag
+wants the location of the index in which to search. Please
+use '--index' instead.
+
+This will soon become a hard error, so it's either recommended
+to update to a fixed version or contact the upstream maintainer
+about this warning.
+[UPDATING] registry `{reg}`
+",
+ reg = registry()))
+ .with_stdout_contains("\
+hoare = \"0.1.1\" # Design by contract style assertions for Rust"));
+}
+
+#[test]
+fn multiple_query_params() {
+ setup();
+
+ let contents = r#"{
+ "crates": [{
+ "created_at": "2014-11-16T20:17:35Z",
+ "description": "Design by contract style assertions for Rust",
+ "documentation": null,
+ "downloads": 2,
+ "homepage": null,
+ "id": "hoare",
+ "keywords": [],
+ "license": null,
+ "links": {
+ "owners": "/api/v1/crates/hoare/owners",
+ "reverse_dependencies": "/api/v1/crates/hoare/reverse_dependencies",
+ "version_downloads": "/api/v1/crates/hoare/downloads",
+ "versions": "/api/v1/crates/hoare/versions"
+ },
+ "max_version": "0.1.1",
+ "name": "hoare",
+ "repository": "https://github.com/nick29581/libhoare",
+ "updated_at": "2014-11-20T21:49:21Z",
+ "versions": null
+ }],
+ "meta": {
+ "total": 1
+ }
+ }"#;
+ let base = api_path().join("api/v1/crates");
+
+ // Older versions of curl don't peel off query parameters when looking for
+ // filenames, so just make both files.
+ //
+ // On Windows, though, `?` is an invalid character, but we always build curl
+ // from source there anyway!
+ File::create(&base).unwrap().write_all(contents.as_bytes()).unwrap();
+ if !cfg!(windows) {
+ File::create(&base.with_file_name("crates?q=postgres+sql&per_page=10")).unwrap()
+ .write_all(contents.as_bytes()).unwrap();
+ }
+
+ assert_that(cargo_process("search").arg("postgres").arg("sql")
+ .arg("--index").arg(registry().to_string()),
+ execs().with_status(0)
+ .with_stdout_contains("\
+hoare = \"0.1.1\" # Design by contract style assertions for Rust"));
+}
+
+#[test]
+fn help() {
+ assert_that(cargo_process("search").arg("-h"),
+ execs().with_status(0));
+ assert_that(cargo_process("help").arg("search"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate git2;
+extern crate hamcrest;
+extern crate url;
+
+use std::env;
+use std::ffi::OsStr;
+use std::path::PathBuf;
+use std::process::Command;
+
+use cargotest::support::{execs, project};
+use cargotest::support::registry::Package;
+use cargotest::support::paths;
+use cargotest::support::git;
+use hamcrest::assert_that;
+
+use url::Url;
+
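+// Only one registry is used by these tests, so the sole hash-named
+// directory under .cargo/registry/index is the index we want.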
+fn find_index() -> PathBuf {
+ let dir = paths::home().join(".cargo/registry/index");
+ dir.read_dir().unwrap().next().unwrap().unwrap().path()
+}
+
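+// Publishes a crate, then fetches into the registry index N times so that
+// .git/objects/pack accumulates more than N packfiles. With
+// __CARGO_PACKFILE_LIMIT lowered, the following `cargo update` should shrink
+// the packfile count again, via `git gc` when git is on PATH;
+// `avoid_using_git` below exercises the no-git fallback.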
+fn run_test(path_env: Option<&OsStr>) {
+ const N: usize = 50;
+
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+ Package::new("bar", "0.1.0").publish();
+
+ assert_that(foo.cargo("build"),
+ execs().with_status(0));
+
+ let index = find_index();
+ let path = paths::home().join("tmp");
+ let url = Url::from_file_path(&path).unwrap().to_string();
+ let repo = git2::Repository::init(&path).unwrap();
+ let index = git2::Repository::open(&index).unwrap();
+ let mut cfg = repo.config().unwrap();
+ cfg.set_str("user.email", "foo@bar.com").unwrap();
+ cfg.set_str("user.name", "Foo Bar").unwrap();
+ let mut cfg = index.config().unwrap();
+ cfg.set_str("user.email", "foo@bar.com").unwrap();
+ cfg.set_str("user.name", "Foo Bar").unwrap();
+
+ for _ in 0..N {
+ git::commit(&repo);
+ index.remote_anonymous(&url).unwrap()
+ .fetch(&["refs/heads/master:refs/remotes/foo/master"],
+ None,
+ None).unwrap();
+ }
+ drop((repo, index));
+ Package::new("bar", "0.1.1").publish();
+
+ let before = find_index().join(".git/objects/pack")
+ .read_dir().unwrap()
+ .count();
+ assert!(before > N);
+
+ let mut cmd = foo.cargo("update");
+ cmd.env("__CARGO_PACKFILE_LIMIT", "10");
+ if let Some(path) = path_env {
+ cmd.env("PATH", path);
+ }
+ cmd.env("RUST_LOG", "trace");
+ assert_that(cmd, execs().with_status(0));
+ let after = find_index().join(".git/objects/pack")
+ .read_dir().unwrap()
+ .count();
+ assert!(after < before,
+ "packfiles before: {}\n\
+ packfiles after: {}", before, after);
+}
+
+#[test]
+fn use_git_gc() {
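+ // Skip when git isn't installed; this variant relies on shelling out
+ // to `git gc`.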
+ if Command::new("git").arg("--version").output().is_err() {
+ return
+ }
+ run_test(None);
+}
+
+#[test]
+// It looks like this test passes on some Windows machines but not others,
+// notably not on AppVeyor's machines. Sounds like another bug for another day.
+#[cfg_attr(windows, ignore)]
+fn avoid_using_git() {
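+ // Drop the directory containing the git binary from PATH so Cargo
+ // cannot shell out to it and must use its internal fallback.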
+ let path = env::var_os("PATH").unwrap_or_default();
+ let mut paths = env::split_paths(&path).collect::<Vec<_>>();
+ let idx = paths.iter().position(|p| {
+ p.join("git").exists() || p.join("git.exe").exists()
+ });
+ match idx {
+ Some(i) => { paths.remove(i); }
+ None => return,
+ }
+ run_test(Some(&env::join_paths(&paths).unwrap()));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::fs::File;
+use std::io::prelude::*;
+use std::str;
+
+use cargotest::{sleep_ms, is_nightly, rustc_host};
+use cargotest::support::{project, execs, basic_bin_manifest, basic_lib_manifest, cargo_exe};
+use cargotest::support::paths::CargoPathExt;
+use cargotest::support::registry::Package;
+use hamcrest::{assert_that, existing_file, is_not};
+use cargo::util::process;
+
+#[test]
+fn cargo_test_simple() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn hello() -> &'static str {
+ "hello"
+ }
+
+ pub fn main() {
+ println!("{}", hello())
+ }
+
+ #[test]
+ fn test_hello() {
+ assert_eq!(hello(), "hello")
+ }"#)
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("hello\n"));
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.5.0 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", p.url()))
+ .with_stdout_contains("test test_hello ... ok"));
+}
+
+#[test]
+fn cargo_test_release() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.1.0"
+
+ [dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar;
+ pub fn foo() { bar::bar(); }
+
+ #[test]
+ fn test() { foo(); }
+ "#)
+ .file("tests/test.rs", r#"
+ extern crate foo;
+
+ #[test]
+ fn test() { foo::foo(); }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn bar() {}")
+ .build();
+
+ assert_that(p.cargo("test").arg("-v").arg("--release"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] bar v0.0.1 ({dir}/bar)
+[RUNNING] [..] -C opt-level=3 [..]
+[COMPILING] foo v0.1.0 ({dir})
+[RUNNING] [..] -C opt-level=3 [..]
+[RUNNING] [..] -C opt-level=3 [..]
+[RUNNING] [..] -C opt-level=3 [..]
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `[..]target[/]release[/]deps[/]foo-[..][EXE]`
+[RUNNING] `[..]target[/]release[/]deps[/]test-[..][EXE]`
+[DOCTEST] foo
+[RUNNING] `rustdoc --test [..]lib.rs[..]`", dir = p.url()))
+ .with_stdout_contains_n("test test ... ok", 2)
+ .with_stdout_contains("running 0 tests"));
+}
+
+#[test]
+fn cargo_test_overflow_checks() {
+ if !is_nightly() {
+ return;
+ }
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.5.0"
+ authors = []
+
+ [[bin]]
+ name = "foo"
+
+ [profile.release]
+ overflow-checks = true
+ "#)
+ .file("src/foo.rs", r#"
+ use std::panic;
+ pub fn main() {
+ let r = panic::catch_unwind(|| {
+ [1, i32::max_value()].iter().sum::<i32>();
+ });
+ assert!(r.is_err());
+ }"#)
+ .build();
+
+ assert_that(p.cargo("build").arg("--release"),
+ execs().with_status(0));
+ assert_that(&p.release_bin("foo"), existing_file());
+
+ assert_that(process(&p.release_bin("foo")),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn cargo_test_verbose() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn main() {}
+ #[test] fn test_hello() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("-v").arg("hello"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.5.0 ({url})
+[RUNNING] `rustc [..] src[/]main.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]target[/]debug[/]deps[/]foo-[..][EXE] hello`", url = p.url()))
+ .with_stdout_contains("test test_hello ... ok"));
+}
+
+#[test]
+fn many_similar_names() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ #[test] fn lib_test() {}
+ ")
+ .file("src/main.rs", "
+ extern crate foo;
+ fn main() {}
+ #[test] fn bin_test() { foo::foo() }
+ ")
+ .file("tests/foo.rs", r#"
+ extern crate foo;
+ #[test] fn test_test() { foo::foo() }
+ "#)
+ .build();
+
+ let output = p.cargo("test").arg("-v").exec_with_output().unwrap();
+ let output = str::from_utf8(&output.stdout).unwrap();
+ assert!(output.contains("test bin_test"), "bin_test missing\n{}", output);
+ assert!(output.contains("test lib_test"), "lib_test missing\n{}", output);
+ assert!(output.contains("test test_test"), "test_test missing\n{}", output);
+}
+
+#[test]
+fn cargo_test_failing_test_in_bin() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ fn hello() -> &'static str {
+ "hello"
+ }
+
+ pub fn main() {
+ println!("{}", hello())
+ }
+
+ #[test]
+ fn test_hello() {
+ assert_eq!(hello(), "nope")
+ }"#)
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("hello\n"));
+
+ assert_that(p.cargo("test"),
+ execs().with_stderr(format!("\
+[COMPILING] foo v0.5.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[ERROR] test failed, to rerun pass '--bin foo'", url = p.url()))
+ .with_stdout_contains("
+running 1 test
+test test_hello ... FAILED
+
+failures:
+
+---- test_hello stdout ----
+<tab>thread 'test_hello' panicked at 'assertion failed:[..]")
+ .with_stdout_contains("[..]`(left == right)`[..]")
+ .with_stdout_contains("[..]left: `\"hello\"`,[..]")
+ .with_stdout_contains("[..]right: `\"nope\"`[..]")
+ .with_stdout_contains("[..]src[/]main.rs:12[..]")
+ .with_stdout_contains("\
+failures:
+ test_hello
+")
+ .with_status(101));
+}
+
+#[test]
+fn cargo_test_failing_test_in_test() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/main.rs", r#"
+ pub fn main() {
+ println!("hello");
+ }"#)
+ .file("tests/footest.rs", r#"
+ #[test]
+ fn test_hello() {
+ assert!(false)
+ }"#)
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ assert_that(process(&p.bin("foo")),
+ execs().with_status(0).with_stdout("hello\n"));
+
+ assert_that(p.cargo("test"),
+ execs().with_stderr(format!("\
+[COMPILING] foo v0.5.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]footest-[..][EXE]
+[ERROR] test failed, to rerun pass '--test footest'", url = p.url()))
+ .with_stdout_contains("running 0 tests")
+ .with_stdout_contains("\
+running 1 test
+test test_hello ... FAILED
+
+failures:
+
+---- test_hello stdout ----
+<tab>thread 'test_hello' panicked at 'assertion failed: false', \
+ tests[/]footest.rs:4[..]
+")
+ .with_stdout_contains("\
+failures:
+ test_hello
+")
+ .with_status(101));
+}
+
+#[test]
+fn cargo_test_failing_test_in_lib() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_lib_manifest("foo"))
+ .file("src/lib.rs", r#"
+ #[test]
+ fn test_hello() {
+ assert!(false)
+ }"#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_stderr(format!("\
+[COMPILING] foo v0.5.0 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[ERROR] test failed, to rerun pass '--lib'", url = p.url()))
+ .with_stdout_contains("\
+test test_hello ... FAILED
+
+failures:
+
+---- test_hello stdout ----
+<tab>thread 'test_hello' panicked at 'assertion failed: false', \
+ src[/]lib.rs:4[..]
+")
+ .with_stdout_contains("\
+failures:
+ test_hello
+")
+ .with_status(101));
+}
+
+#[test]
+fn test_with_lib_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "baz"
+ path = "src/main.rs"
+ "#)
+ .file("src/lib.rs", r#"
+ ///
+ /// ```rust
+ /// extern crate foo;
+ /// fn main() {
+ /// println!("{:?}", foo::foo());
+ /// }
+ /// ```
+ ///
+ pub fn foo(){}
+ #[test] fn lib_test() {}
+ "#)
+ .file("src/main.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+
+ fn main() {}
+
+ #[test]
+ fn bin_test() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]baz-[..][EXE]
+[DOCTEST] foo", p.url()))
+ .with_stdout_contains("test lib_test ... ok")
+ .with_stdout_contains("test bin_test ... ok")
+ .with_stdout_contains_n("test [..] ... ok", 3));
+}
+
+#[test]
+fn test_with_deep_lib_dep() {
+ let p = project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.foo]
+ path = "../foo"
+ "#)
+ .file("src/lib.rs", "
+ #[cfg(test)]
+ extern crate foo;
+ /// ```
+ /// bar::bar();
+ /// ```
+ pub fn bar() {}
+
+ #[test]
+ fn bar_test() {
+ foo::foo();
+ }
+ ")
+ .build();
+ let _p2 = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+
+ #[test]
+ fn foo_test() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ([..])
+[COMPILING] bar v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[..]
+[DOCTEST] bar", dir = p.url()))
+ .with_stdout_contains("test bar_test ... ok")
+ .with_stdout_contains_n("test [..] ... ok", 2));
+}
+
+#[test]
+fn external_test_explicit() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[test]]
+ name = "test"
+ path = "src/test.rs"
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn get_hello() -> &'static str { "Hello" }
+
+ #[test]
+ fn internal_test() {}
+ "#)
+ .file("src/test.rs", r#"
+ extern crate foo;
+
+ #[test]
+ fn external_test() { assert_eq!(foo::get_hello(), "Hello") }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]test-[..][EXE]
+[DOCTEST] foo", p.url()))
+ .with_stdout_contains("test internal_test ... ok")
+ .with_stdout_contains("test external_test ... ok")
+ .with_stdout_contains("running 0 tests"));
+}
+
+#[test]
+fn external_test_named_test() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[test]]
+ name = "test"
+ "#)
+ .file("src/lib.rs", "")
+ .file("tests/test.rs", r#"
+ #[test]
+ fn foo() { }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0))
+}
+
+#[test]
+fn external_test_implicit() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn get_hello() -> &'static str { "Hello" }
+
+ #[test]
+ fn internal_test() {}
+ "#)
+ .file("tests/external.rs", r#"
+ extern crate foo;
+
+ #[test]
+ fn external_test() { assert_eq!(foo::get_hello(), "Hello") }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]external-[..][EXE]
+[DOCTEST] foo", p.url()))
+ .with_stdout_contains("test internal_test ... ok")
+ .with_stdout_contains("test external_test ... ok")
+ .with_stdout_contains("running 0 tests"));
+}
+
+#[test]
+fn dont_run_examples() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ "#)
+ .file("examples/dont-run-me-i-will-fail.rs", r#"
+ fn main() { panic!("Examples should not be run by 'cargo test'"); }
+ "#)
+ .build();
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+}
+
+#[test]
+fn pass_through_command_line() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #[test] fn foo() {}
+ #[test] fn bar() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test").arg("bar"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo", dir = p.url()))
+ .with_stdout_contains("test bar ... ok")
+ .with_stdout_contains("running 0 tests"));
+
+ assert_that(p.cargo("test").arg("foo"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo")
+ .with_stdout_contains("test foo ... ok")
+ .with_stdout_contains("running 0 tests"));
+}
+
+// Regression test for running `cargo test` twice with
+// tests in an rlib
+#[test]
+fn cargo_test_twice() {
+ let p = project("test_twice")
+ .file("Cargo.toml", &basic_lib_manifest("test_twice"))
+ .file("src/test_twice.rs", r#"
+ #![crate_type = "rlib"]
+
+ #[test]
+ fn dummy_test() { }
+ "#)
+ .build();
+
+ p.cargo("build");
+
+ for _ in 0..2 {
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+ }
+}
+
+#[test]
+fn lib_bin_same_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ [[bin]]
+ name = "foo"
+ "#)
+ .file("src/lib.rs", "
+ #[test] fn lib_test() {}
+ ")
+ .file("src/main.rs", "
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+
+ #[test]
+ fn bin_test() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo", p.url()))
+ .with_stdout_contains_n("test [..] ... ok", 2)
+ .with_stdout_contains("running 0 tests"));
+}
+
+#[test]
+fn lib_with_standard_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ /// ```
+ /// syntax::foo();
+ /// ```
+ pub fn foo() {}
+
+ #[test]
+ fn foo_test() {}
+ ")
+ .file("tests/test.rs", "
+ extern crate syntax;
+
+ #[test]
+ fn test() { syntax::foo() }
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] syntax v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]syntax-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]test-[..][EXE]
+[DOCTEST] syntax", dir = p.url()))
+ .with_stdout_contains("test foo_test ... ok")
+ .with_stdout_contains("test test ... ok")
+ .with_stdout_contains_n("test [..] ... ok", 3));
+}
+
+#[test]
+fn lib_with_standard_name2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "syntax"
+ test = false
+ doctest = false
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("src/main.rs", "
+ extern crate syntax;
+
+ fn main() {}
+
+ #[test]
+ fn test() { syntax::foo() }
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] syntax v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]syntax-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("test test ... ok"));
+}
+
+#[test]
+fn lib_without_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ test = false
+ doctest = false
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("src/main.rs", "
+ extern crate syntax;
+
+ fn main() {}
+
+ #[test]
+ fn test() { syntax::foo() }
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] syntax v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]syntax-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("test test ... ok"));
+}
+
+#[test]
+fn bin_without_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ test = false
+ doctest = false
+
+ [[bin]]
+ path = "src/main.rs"
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("src/main.rs", "
+ extern crate syntax;
+
+ fn main() {}
+
+ #[test]
+ fn test() { syntax::foo() }
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ binary target bin.name is required"));
+}
+
+#[test]
+fn bench_without_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ test = false
+ doctest = false
+
+ [[bench]]
+ path = "src/bench.rs"
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("src/main.rs", "
+ extern crate syntax;
+
+ fn main() {}
+
+ #[test]
+ fn test() { syntax::foo() }
+ ")
+ .file("src/bench.rs", "
+ #![feature(test)]
+ extern crate syntax;
+ extern crate test;
+
+ #[bench]
+ fn external_bench(_b: &mut test::Bencher) {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ benchmark target bench.name is required"));
+}
+
+#[test]
+fn test_without_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ test = false
+ doctest = false
+
+ [[test]]
+ path = "src/test.rs"
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn foo() {}
+ pub fn get_hello() -> &'static str { "Hello" }
+ "#)
+ .file("src/main.rs", "
+ extern crate syntax;
+
+ fn main() {}
+
+ #[test]
+ fn test() { syntax::foo() }
+ ")
+ .file("src/test.rs", r#"
+ extern crate syntax;
+
+ #[test]
+ fn external_test() { assert_eq!(syntax::get_hello(), "Hello") }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ test target test.name is required"));
+}
+
+#[test]
+fn example_without_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "syntax"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ test = false
+ doctest = false
+
+ [[example]]
+ path = "examples/example.rs"
+ "#)
+ .file("src/lib.rs", "
+ pub fn foo() {}
+ ")
+ .file("src/main.rs", "
+ extern crate syntax;
+
+ fn main() {}
+
+ #[test]
+ fn test() { syntax::foo() }
+ ")
+ .file("examples/example.rs", r#"
+ extern crate syntax;
+
+ fn main() {
+ println!("example1");
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(101)
+ .with_stderr("\
+[ERROR] failed to parse manifest at `[..]`
+
+Caused by:
+ example target example.name is required"));
+}
+
+#[test]
+fn bin_there_for_integration() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/main.rs", "
+ fn main() { std::process::exit(101); }
+ #[test] fn main_test() {}
+ ")
+ .file("tests/foo.rs", r#"
+ use std::process::Command;
+ #[test]
+ fn test_test() {
+ let status = Command::new("target/debug/foo").status().unwrap();
+ assert_eq!(status.code(), Some(101));
+ }
+ "#)
+ .build();
+
+ let output = p.cargo("test").arg("-v").exec_with_output().unwrap();
+ let output = str::from_utf8(&output.stdout).unwrap();
+ assert!(output.contains("main_test ... ok"), "no main_test\n{}", output);
+ assert!(output.contains("test_test ... ok"), "no test_test\n{}", output);
+}
+
+#[test]
+fn test_dylib() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ crate_type = ["dylib"]
+
+ [dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar as the_bar;
+
+ pub fn bar() { the_bar::baz(); }
+
+ #[test]
+ fn foo() { bar(); }
+ "#)
+ .file("tests/test.rs", r#"
+ extern crate foo as the_foo;
+
+ #[test]
+ fn foo() { the_foo::bar(); }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "bar"
+ crate_type = ["dylib"]
+ "#)
+ .file("bar/src/lib.rs", "
+ pub fn baz() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] bar v0.0.1 ({dir}/bar)
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]test-[..][EXE]", dir = p.url()))
+ .with_stdout_contains_n("test foo ... ok", 2));
+
+ p.root().move_into_the_past();
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]test-[..][EXE]")
+ .with_stdout_contains_n("test foo ... ok", 2));
+}
+
+#[test]
+fn test_twice_with_build_cmd() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("build.rs", "fn main() {}")
+ .file("src/lib.rs", "
+ #[test]
+ fn foo() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo", dir = p.url()))
+ .with_stdout_contains("test foo ... ok")
+ .with_stdout_contains("running 0 tests"));
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo")
+ .with_stdout_contains("test foo ... ok")
+ .with_stdout_contains("running 0 tests"));
+}
+
+#[test]
+fn test_then_build() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #[test]
+ fn foo() {}
+ ")
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[DOCTEST] foo", dir = p.url()))
+ .with_stdout_contains("test foo ... ok")
+ .with_stdout_contains("running 0 tests"));
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stdout(""));
+}
+
+#[test]
+fn test_no_run() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "
+ #[test]
+ fn foo() { panic!() }
+ ")
+ .build();
+
+ assert_that(p.cargo("test").arg("--no-run"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+",
+ dir = p.url())));
+}
+
+#[test]
+fn test_run_specific_bin_target() {
+ let prj = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name="bin1"
+ path="src/bin1.rs"
+
+ [[bin]]
+ name="bin2"
+ path="src/bin2.rs"
+ "#)
+ .file("src/bin1.rs", "#[test] fn test1() { }")
+ .file("src/bin2.rs", "#[test] fn test2() { }")
+ .build();
+
+ assert_that(prj.cargo("test").arg("--bin").arg("bin2"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]bin2-[..][EXE]", dir = prj.url()))
+ .with_stdout_contains("test test2 ... ok"));
+}
+
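+// The next several tests share one layout (a bin, an integration test, a
+// bench, and an example, each containing a unit test) and check which
+// targets `--bins`, `--tests`, `--benches`, and `--examples` build and run.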
+#[test]
+fn test_run_implicit_bin_target() {
+ let prj = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name="mybin"
+ path="src/mybin.rs"
+ "#)
+ .file("src/mybin.rs", "#[test] fn test_in_bin() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .file("tests/mytest.rs", "#[test] fn test_in_test() { }")
+ .file("benches/mybench.rs", "#[test] fn test_in_bench() { }")
+ .file("examples/myexm.rs", "#[test] fn test_in_exm() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .build();
+
+ assert_that(prj.cargo("test").arg("--bins"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]mybin-[..][EXE]", dir = prj.url()))
+ .with_stdout_contains("test test_in_bin ... ok"));
+}
+
+#[test]
+fn test_run_specific_test_target() {
+ let prj = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/a.rs", "fn main() { }")
+ .file("src/bin/b.rs", "#[test] fn test_b() { } fn main() { }")
+ .file("tests/a.rs", "#[test] fn test_a() { }")
+ .file("tests/b.rs", "#[test] fn test_b() { }")
+ .build();
+
+ assert_that(prj.cargo("test").arg("--test").arg("b"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]b-[..][EXE]", dir = prj.url()))
+ .with_stdout_contains("test test_b ... ok"));
+}
+
+#[test]
+fn test_run_implicit_test_target() {
+ let prj = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name="mybin"
+ path="src/mybin.rs"
+ "#)
+ .file("src/mybin.rs", "#[test] fn test_in_bin() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .file("tests/mytest.rs", "#[test] fn test_in_test() { }")
+ .file("benches/mybench.rs", "#[test] fn test_in_bench() { }")
+ .file("examples/myexm.rs", "#[test] fn test_in_exm() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .build();
+
+ assert_that(prj.cargo("test").arg("--tests"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]mybin-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]mytest-[..][EXE]
+[RUNNING] target[/]debug[/]examples[/]myexm-[..][EXE]", dir = prj.url()))
+ .with_stdout_contains("test test_in_test ... ok"));
+}
+
+#[test]
+fn test_run_implicit_bench_target() {
+ let prj = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name="mybin"
+ path="src/mybin.rs"
+ "#)
+ .file("src/mybin.rs", "#[test] fn test_in_bin() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .file("tests/mytest.rs", "#[test] fn test_in_test() { }")
+ .file("benches/mybench.rs", "#[test] fn test_in_bench() { }")
+ .file("examples/myexm.rs", "#[test] fn test_in_exm() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .build();
+
+ assert_that(prj.cargo("test").arg("--benches"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]mybin-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]mybench-[..][EXE]", dir = prj.url()))
+ .with_stdout_contains("test test_in_bench ... ok"));
+}
+
+#[test]
+fn test_run_implicit_example_target() {
+ let prj = project("foo")
+ .file("Cargo.toml" , r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name="mybin"
+ path="src/mybin.rs"
+ "#)
+ .file("src/mybin.rs", "#[test] fn test_in_bin() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .file("tests/mytest.rs", "#[test] fn test_in_test() { }")
+ .file("benches/mybench.rs", "#[test] fn test_in_bench() { }")
+ .file("examples/myexm.rs", "#[test] fn test_in_exm() { }
+ fn main() { panic!(\"Don't execute me!\"); }")
+ .build();
+
+ assert_that(prj.cargo("test").arg("--examples"),
+ execs().with_status(0)
+ .with_stderr(format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]examples[/]myexm-[..][EXE]", dir = prj.url())));
+}
+
+#[test]
+fn test_no_harness() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [[bin]]
+ name = "foo"
+ test = false
+
+ [[test]]
+ name = "bar"
+ path = "foo.rs"
+ harness = false
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("foo.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("test").arg("--").arg("--nocapture"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]bar-[..][EXE]
+",
+ dir = p.url())));
+}
+
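+// `cargo test -p d1` compiles and tests only the selected path dependency;
+// the root package `foo` is never built.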
+#[test]
+fn selective_testing() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.d1]
+ path = "d1"
+ [dependencies.d2]
+ path = "d2"
+
+ [lib]
+ name = "foo"
+ doctest = false
+ "#)
+ .file("src/lib.rs", "")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "d1"
+ doctest = false
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d1/src/main.rs", "#[allow(unused_extern_crates)] extern crate d1; fn main() {}")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "d2"
+ doctest = false
+ "#)
+ .file("d2/src/lib.rs", "")
+ .file("d2/src/main.rs", "#[allow(unused_extern_crates)] extern crate d2; fn main() {}");
+ let p = p.build();
+
+ println!("d1");
+ assert_that(p.cargo("test").arg("-p").arg("d1"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] d1 v0.0.1 ({dir}/d1)
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]d1-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]d1-[..][EXE]", dir = p.url()))
+ .with_stdout_contains_n("running 0 tests", 2));
+
+ println!("d2");
+ assert_that(p.cargo("test").arg("-p").arg("d2"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] d2 v0.0.1 ({dir}/d2)
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]d2-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]d2-[..][EXE]", dir = p.url()))
+ .with_stdout_contains_n("running 0 tests", 2));
+
+ println!("whole");
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]", dir = p.url()))
+ .with_stdout_contains("running 0 tests"));
+}
+
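+// Dev-dependency cycles are legal: `a` dev-depends on `b`, which depends
+// back on `a`. Dev-dependencies only apply to `a`'s own tests, so no true
+// build cycle exists.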
+#[test]
+fn almost_cyclic_but_not_quite() {
+ let p = project("a")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dev-dependencies.b]
+ path = "b"
+ [dev-dependencies.c]
+ path = "c"
+ "#)
+ .file("src/lib.rs", r#"
+ #[cfg(test)] extern crate b;
+ #[cfg(test)] extern crate c;
+ "#)
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.a]
+ path = ".."
+ "#)
+ .file("b/src/lib.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate a;
+ "#)
+ .file("c/Cargo.toml", r#"
+ [package]
+ name = "c"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("c/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+}
+
+#[test]
+fn build_then_selective_test() {
+ let p = project("a")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.b]
+ path = "b"
+ "#)
+ .file("src/lib.rs", "#[allow(unused_extern_crates)] extern crate b;")
+ .file("src/main.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate b;
+ #[allow(unused_extern_crates)]
+ extern crate a;
+ fn main() {}
+ "#)
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ p.root().move_into_the_past();
+ assert_that(p.cargo("test").arg("-p").arg("b"),
+ execs().with_status(0));
+}
+
+#[test]
+fn example_dev_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dev-dependencies.bar]
+ path = "bar"
+ "#)
+ .file("src/lib.rs", r#"
+ "#)
+ .file("examples/e1.rs", r#"
+ extern crate bar;
+ fn main() { }
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", r#"
+ // make sure this file takes a while to compile
+ macro_rules! f0( () => (1) );
+ macro_rules! f1( () => ({(f0!()) + (f0!())}) );
+ macro_rules! f2( () => ({(f1!()) + (f1!())}) );
+ macro_rules! f3( () => ({(f2!()) + (f2!())}) );
+ macro_rules! f4( () => ({(f3!()) + (f3!())}) );
+ macro_rules! f5( () => ({(f4!()) + (f4!())}) );
+ macro_rules! f6( () => ({(f5!()) + (f5!())}) );
+ macro_rules! f7( () => ({(f6!()) + (f6!())}) );
+ macro_rules! f8( () => ({(f7!()) + (f7!())}) );
+ pub fn bar() {
+ f8!();
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+ assert_that(p.cargo("run")
+ .arg("--example").arg("e1").arg("--release").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn selective_testing_with_docs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.d1]
+ path = "d1"
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// not valid rust
+ /// ```
+ pub fn foo() {}
+ "#)
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "d1"
+ path = "d1.rs"
+ "#)
+ .file("d1/d1.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("test").arg("-p").arg("d1"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] d1 v0.0.1 ({dir}/d1)
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]d1[..][EXE]
+[DOCTEST] d1", dir = p.url()))
+ .with_stdout_contains_n("running 0 tests", 2));
+}
+
+#[test]
+fn example_bin_same_name() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/foo.rs", r#"fn main() { println!("bin"); }"#)
+ .file("examples/foo.rs", r#"fn main() { println!("example"); }"#)
+ .build();
+
+ assert_that(p.cargo("test").arg("--no-run").arg("-v"),
+ execs().with_status(0)
+ .with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({dir})
+[RUNNING] `rustc [..]`
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", dir = p.url())));
+
+ assert_that(&p.bin("foo"), is_not(existing_file()));
+ assert_that(&p.bin("examples/foo"), existing_file());
+
+ assert_that(p.process(&p.bin("examples/foo")),
+ execs().with_status(0).with_stdout("example\n"));
+
+ assert_that(p.cargo("run"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] [..]")
+ .with_stdout("\
+bin
+"));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+fn test_with_example_twice() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/bin/foo.rs", r#"fn main() { println!("bin"); }"#)
+ .file("examples/foo.rs", r#"fn main() { println!("example"); }"#)
+ .build();
+
+ println!("first");
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.bin("examples/foo"), existing_file());
+ println!("second");
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.bin("examples/foo"), existing_file());
+}
+
+#[test]
+fn example_with_dev_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ test = false
+ doctest = false
+
+ [dev-dependencies.a]
+ path = "a"
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/ex.rs", "#[allow(unused_extern_crates)] extern crate a; fn main() {}")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0)
+ .with_stderr("\
+[..]
+[..]
+[..]
+[..]
+[RUNNING] `rustc --crate-name ex [..] --extern a=[..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn bin_is_preserved() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .build();
+
+ assert_that(p.cargo("build").arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+
+ println!("testing");
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+}
+
+#[test]
+fn bad_example() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("run").arg("--example").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no example target named `foo`
+"));
+ assert_that(p.cargo("run").arg("--bin").arg("foo"),
+ execs().with_status(101).with_stderr("\
+[ERROR] no bin target named `foo`
+"));
+}
+
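+// Feature flags given on the command line also reach doc tests: the doc
+// test below only exists (and passes) when `--features bar` compiles
+// `foo()` in.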
+#[test]
+fn doctest_feature() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ [features]
+ bar = []
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```rust
+ /// assert_eq!(foo::foo(), 1);
+ /// ```
+ #[cfg(feature = "bar")]
+ pub fn foo() -> i32 { 1 }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("--features").arg("bar"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo [..]
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo[..][EXE]
+[DOCTEST] foo")
+ .with_stdout_contains("running 0 tests")
+ .with_stdout_contains("test [..] ... ok"));
+}
+
+#[test]
+fn dashes_to_underscores() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo-bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// assert_eq!(foo_bar::foo(), 1);
+ /// ```
+ pub fn foo() -> i32 { 1 }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn doctest_dev_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dev-dependencies]
+ b = { path = "b" }
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// extern crate b;
+ /// ```
+ pub fn foo() {}
+ "#)
+ .file("b/Cargo.toml", r#"
+ [package]
+ name = "b"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("b/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn filter_no_doc_tests() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// extern crate b;
+ /// ```
+ pub fn foo() {}
+ "#)
+ .file("tests/foo.rs", "")
+ .build();
+
+ assert_that(p.cargo("test").arg("--test=foo"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo[..][EXE]")
+ .with_stdout_contains("running 0 tests"));
+}
+
+#[test]
+fn dylib_doctest() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ crate-type = ["rlib", "dylib"]
+ test = false
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// foo::foo();
+ /// ```
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[DOCTEST] foo")
+ .with_stdout_contains("test [..] ... ok"));
+}
+
+#[test]
+fn dylib_doctest2() {
+ // Can't doctest a dylib-only library, as doctests are statically
+ // linked; `cargo test` therefore runs no tests here.
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ crate-type = ["dylib"]
+ test = false
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// foo::foo();
+ /// ```
+ pub fn foo() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stdout(""));
+}
+
+#[test]
+fn cyclic_dev_dep_doc_test() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dev-dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("src/lib.rs", r#"
+ //! ```
+ //! extern crate bar;
+ //! ```
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = { path = ".." }
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[allow(unused_extern_crates)]
+ extern crate foo;
+ "#)
+ .build();
+ assert_that(p.cargo("test"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[COMPILING] bar v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo[..][EXE]
+[DOCTEST] foo")
+ .with_stdout_contains("running 0 tests")
+ .with_stdout_contains("test [..] ... ok"));
+}
+
+#[test]
+fn dev_dep_with_build_script() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dev-dependencies]
+ bar = { path = "bar" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("examples/foo.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ build = "build.rs"
+ "#)
+ .file("bar/src/lib.rs", "")
+ .file("bar/build.rs", "fn main() {}")
+ .build();
+ assert_that(p.cargo("test"),
+ execs().with_status(0));
+}
+
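+// With `--no-fail-fast`, a failing test binary doesn't abort the run: the
+// remaining test targets and the doc tests still execute, while the exit
+// status still reports the failure.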
+#[test]
+fn no_fail_fast() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ pub fn add_one(x: i32) -> i32{
+ x + 1
+ }
+
+ /// ```rust
+ /// use foo::sub_one;
+ /// assert_eq!(sub_one(101), 100);
+ /// ```
+ pub fn sub_one(x: i32) -> i32{
+ x - 1
+ }
+ "#)
+ .file("tests/test_add_one.rs", r#"
+ extern crate foo;
+ use foo::*;
+
+ #[test]
+ fn add_one_test() {
+ assert_eq!(add_one(1), 2);
+ }
+
+ #[test]
+ fn fail_add_one_test() {
+ assert_eq!(add_one(1), 1);
+ }
+ "#)
+ .file("tests/test_sub_one.rs", r#"
+ extern crate foo;
+ use foo::*;
+
+ #[test]
+ fn sub_one_test() {
+ assert_eq!(sub_one(1), 0);
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("test").arg("--no-fail-fast"),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] target[/]debug[/]deps[/]foo-[..][EXE]
+[RUNNING] target[/]debug[/]deps[/]test_add_one-[..][EXE]")
+ .with_stdout_contains("running 0 tests")
+ .with_stderr_contains("\
+[RUNNING] target[/]debug[/]deps[/]test_sub_one-[..][EXE]
+[DOCTEST] foo")
+ .with_stdout_contains("test result: FAILED. [..]")
+ .with_stdout_contains("test sub_one_test ... ok")
+ .with_stdout_contains_n("test [..] ... ok", 3));
+}
+
+#[test]
+fn test_multiple_packages() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies.d1]
+ path = "d1"
+ [dependencies.d2]
+ path = "d2"
+
+ [lib]
+ name = "foo"
+ doctest = false
+ "#)
+ .file("src/lib.rs", "")
+ .file("d1/Cargo.toml", r#"
+ [package]
+ name = "d1"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "d1"
+ doctest = false
+ "#)
+ .file("d1/src/lib.rs", "")
+ .file("d2/Cargo.toml", r#"
+ [package]
+ name = "d2"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "d2"
+ doctest = false
+ "#)
+ .file("d2/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("test").arg("-p").arg("d1").arg("-p").arg("d2"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[RUNNING] target[/]debug[/]deps[/]d1-[..][EXE]")
+ .with_stderr_contains("\
+[RUNNING] target[/]debug[/]deps[/]d2-[..][EXE]")
+ .with_stdout_contains_n("running 0 tests", 2));
+}
+
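+// Touching src/main.rs should rebuild only the binary (once as a plain bin
+// and once as a test), not the unchanged integration test.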
+#[test]
+fn bin_does_not_rebuild_tests() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", "")
+ .file("src/main.rs", "fn main() {}")
+ .file("tests/foo.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+
+ sleep_ms(1000);
+ File::create(&p.root().join("src/main.rs")).unwrap()
+ .write_all(b"fn main() { 3; }").unwrap();
+
+ assert_that(p.cargo("test").arg("-v").arg("--no-run"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..] src[/]main.rs [..]`
+[RUNNING] `rustc [..] src[/]main.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn selective_test_wonky_profile() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [profile.release]
+ opt-level = 2
+
+ [dependencies]
+ a = { path = "a" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("test").arg("-v").arg("--no-run").arg("--release")
+ .arg("-p").arg("foo").arg("-p").arg("a"),
+ execs().with_status(0));
+}
+
+#[test]
+fn selective_test_optional_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a", optional = true }
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("test").arg("-v").arg("--no-run")
+ .arg("--features").arg("a").arg("-p").arg("a"),
+ execs().with_status(0).with_stderr("\
+[COMPILING] a v0.0.1 ([..])
+[RUNNING] `rustc [..] a[/]src[/]lib.rs [..]`
+[RUNNING] `rustc [..] a[/]src[/]lib.rs [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
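+// `cargo test --doc` builds and runs only the doc tests; the unit test with
+// a type error and the invalid integration test are never compiled.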
+#[test]
+fn only_test_docs() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("src/lib.rs", r#"
+ #[test]
+ fn foo() {
+ let a: u32 = "hello";
+ }
+
+ /// ```
+ /// foo::bar();
+ /// println!("ok");
+ /// ```
+ pub fn bar() {
+ }
+ "#)
+ .file("tests/foo.rs", "this is not rust");
+ let p = p.build();
+
+ assert_that(p.cargo("test").arg("--doc"),
+ execs().with_status(0)
+ .with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[DOCTEST] foo")
+ .with_stdout_contains("test [..] ... ok"));
+}
+
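+// `cargo test` must still succeed when the dev profile sets
+// `panic = 'abort'`, since the libtest harness itself requires unwinding.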
+#[test]
+fn test_panic_abort_with_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [profile.dev]
+ panic = 'abort'
+ "#)
+ .file("src/lib.rs", r#"
+ extern crate bar;
+
+ #[test]
+ fn foo() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0));
+}
+
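+// Even with `harness = false`, the library is still compiled with
+// `cfg(test)` during `cargo test`, so the `main` below runs and prints.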
+#[test]
+fn cfg_test_even_with_no_harness() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ harness = false
+ doctest = false
+ "#)
+ .file("src/lib.rs", r#"
+ #[cfg(test)]
+ fn main() {
+ println!("hello!");
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("test").arg("-v"),
+ execs().with_status(0)
+ .with_stdout("hello!\n")
+ .with_stderr("\
+[COMPILING] foo v0.0.1 ([..])
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `[..]`
+"));
+}
+
+#[test]
+fn panic_abort_multiple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+
+ [profile.release]
+ panic = 'abort'
+ "#)
+ .file("src/lib.rs", "#[allow(unused_extern_crates)] extern crate a;")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+ assert_that(p.cargo("test")
+ .arg("--release").arg("-v")
+ .arg("-p").arg("foo")
+ .arg("-p").arg("a"),
+ execs().with_status(0));
+}
+
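+// rustdoc must be invoked with the same `--cfg feature=...` flags the crate
+// was compiled with, so doc tests see the features that were actually
+// selected.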
+#[test]
+fn pass_correct_cfgs_flags_to_rustdoc() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ default = ["feature_a/default"]
+ nightly = ["feature_a/nightly"]
+
+ [dependencies.feature_a]
+ path = "libs/feature_a"
+ default-features = false
+ "#)
+ .file("src/lib.rs", r#"
+ #[cfg(test)]
+ mod tests {
+ #[test]
+ fn it_works() {
+ assert!(true);
+ }
+ }
+ "#)
+ .file("libs/feature_a/Cargo.toml", r#"
+ [package]
+ name = "feature_a"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ default = ["mock_serde_codegen"]
+ nightly = ["mock_serde_derive"]
+
+ [dependencies]
+ mock_serde_derive = { path = "../mock_serde_derive", optional = true }
+
+ [build-dependencies]
+ mock_serde_codegen = { path = "../mock_serde_codegen", optional = true }
+ "#)
+ .file("libs/feature_a/src/lib.rs", r#"
+ #[cfg(feature = "mock_serde_derive")]
+ const MSG: &'static str = "This is safe";
+
+ #[cfg(feature = "mock_serde_codegen")]
+ const MSG: &'static str = "This is risky";
+
+ pub fn get() -> &'static str {
+ MSG
+ }
+ "#)
+ .file("libs/mock_serde_derive/Cargo.toml", r#"
+ [package]
+ name = "mock_serde_derive"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("libs/mock_serde_derive/src/lib.rs", "")
+ .file("libs/mock_serde_codegen/Cargo.toml", r#"
+ [package]
+ name = "mock_serde_codegen"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("libs/mock_serde_codegen/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("test")
+ .arg("--package").arg("feature_a")
+ .arg("--verbose"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[DOCTEST] feature_a
+[RUNNING] `rustdoc --test [..]mock_serde_codegen[..]`"));
+
+ assert_that(p.cargo("test")
+ .arg("--verbose"),
+ execs().with_status(0)
+ .with_stderr_contains("\
+[DOCTEST] foo
+[RUNNING] `rustdoc --test [..]feature_a[..]`"));
+}
+
+#[test]
+fn test_release_ignore_panic() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+
+ [profile.test]
+ panic = 'abort'
+ [profile.release]
+ panic = 'abort'
+ "#)
+ .file("src/lib.rs", "#[allow(unused_extern_crates)] extern crate a;")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "");
+ let p = p.build();
+ println!("test");
+ assert_that(p.cargo("test").arg("-v"), execs().with_status(0));
+ println!("bench");
+ assert_that(p.cargo("bench").arg("-v"), execs().with_status(0));
+}
+
+#[test]
+fn test_many_with_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ a = { path = "a" }
+
+ [features]
+ foo = []
+
+ [workspace]
+ "#)
+ .file("src/lib.rs", "")
+ .file("a/Cargo.toml", r#"
+ [package]
+ name = "a"
+ version = "0.0.1"
+ authors = []
+ "#)
+ .file("a/src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("test").arg("-v")
+ .arg("-p").arg("a")
+ .arg("-p").arg("foo")
+ .arg("--features").arg("foo"),
+ execs().with_status(0));
+}
+
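+// `cargo test --all` runs the tests of every workspace member, not just
+// the package in the current directory.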
+#[test]
+fn test_all_workspace() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [workspace]
+ "#)
+ .file("src/main.rs", r#"
+ #[test]
+ fn foo_test() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[test]
+ fn bar_test() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stdout_contains("test foo_test ... ok")
+ .with_stdout_contains("test bar_test ... ok"));
+}
+
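+// `--exclude` removes a member from `--all`: baz's failing test is skipped.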
+#[test]
+fn test_all_exclude() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [workspace]
+ members = ["bar", "baz"]
+ "#)
+ .file("src/main.rs", r#"
+ fn main() {}
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ #[test]
+ pub fn bar() {}
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ "#)
+ .file("baz/src/lib.rs", r#"
+ #[test]
+ pub fn baz() {
+ assert!(false);
+ }
+ "#)
+ .build();
+
+ assert_that(p.cargo("test")
+ .arg("--all")
+ .arg("--exclude")
+ .arg("baz"),
+ execs().with_status(0)
+ .with_stdout_contains("running 1 test
+test bar ... ok"));
+}
+
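+// In a virtual workspace (a root manifest with only `[workspace]`), `--all`
+// tests every member.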
+#[test]
+fn test_all_virtual_manifest() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["a", "b"]
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+ "#)
+ .file("a/src/lib.rs", r#"
+ #[test]
+ fn a() {}
+ "#)
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.1.0"
+ "#)
+ .file("b/src/lib.rs", r#"
+ #[test]
+ fn b() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stdout_contains("test a ... ok")
+ .with_stdout_contains("test b ... ok"));
+}
+
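+// With a virtual manifest, plain `cargo test` behaves as if `--all` were
+// passed.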
+#[test]
+fn test_virtual_manifest_all_implied() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["a", "b"]
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+ "#)
+ .file("a/src/lib.rs", r#"
+ #[test]
+ fn a() {}
+ "#)
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.1.0"
+ "#)
+ .file("b/src/lib.rs", r#"
+ #[test]
+ fn b() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test"),
+ execs().with_status(0)
+ .with_stdout_contains("test a ... ok")
+ .with_stdout_contains("test b ... ok"));
+}
+
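+// A workspace member may depend on a registry crate bearing its own name;
+// `--all` must test the member rather than the registry `a`.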
+#[test]
+fn test_all_member_dependency_same_name() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["a"]
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+
+ [dependencies]
+ a = "0.1.0"
+ "#)
+ .file("a/src/lib.rs", r#"
+ #[test]
+ fn a() {}
+ "#)
+ .build();
+
+ Package::new("a", "0.1.0").publish();
+
+ assert_that(p.cargo("test")
+ .arg("--all"),
+ execs().with_status(0)
+ .with_stdout_contains("test a ... ok"));
+}
+
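+// Doc tests may link against dev-dependencies, so `cargo test --doc` has to
+// build `b` before running the doc test that externs it.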
+#[test]
+fn doctest_only_with_dev_dep() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+
+ [dev-dependencies]
+ b = { path = "b" }
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// extern crate b;
+ ///
+ /// b::b();
+ /// ```
+ pub fn a() {}
+ "#)
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.1.0"
+ "#)
+ .file("b/src/lib.rs", r#"
+ pub fn b() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("--doc").arg("-v"),
+ execs().with_status(0));
+}
+
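+// Explicit `--bin`/`--example`/`--test` filters select exactly the named
+// targets; the deliberately broken `c` targets are never built or run.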
+#[test]
+fn test_many_targets() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("src/bin/a.rs", r#"
+ fn main() {}
+ #[test] fn bin_a() {}
+ "#)
+ .file("src/bin/b.rs", r#"
+ fn main() {}
+ #[test] fn bin_b() {}
+ "#)
+ .file("src/bin/c.rs", r#"
+ fn main() {}
+ #[test] fn bin_c() { panic!(); }
+ "#)
+ .file("examples/a.rs", r#"
+ fn main() {}
+ #[test] fn example_a() {}
+ "#)
+ .file("examples/b.rs", r#"
+ fn main() {}
+ #[test] fn example_b() {}
+ "#)
+ .file("examples/c.rs", r#"
+ #[test] fn example_c() { panic!(); }
+ "#)
+ .file("tests/a.rs", r#"
+ #[test] fn test_a() {}
+ "#)
+ .file("tests/b.rs", r#"
+ #[test] fn test_b() {}
+ "#)
+ .file("tests/c.rs", r#"
+ does not compile
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("--verbose")
+ .arg("--bin").arg("a").arg("--bin").arg("b")
+ .arg("--example").arg("a").arg("--example").arg("b")
+ .arg("--test").arg("a").arg("--test").arg("b"),
+ execs()
+ .with_status(0)
+ .with_stdout_contains("test bin_a ... ok")
+ .with_stdout_contains("test bin_b ... ok")
+ .with_stdout_contains("test test_a ... ok")
+ .with_stdout_contains("test test_b ... ok")
+ .with_stderr_contains("[RUNNING] `rustc --crate-name a examples[/]a.rs [..]`")
+ .with_stderr_contains("[RUNNING] `rustc --crate-name b examples[/]b.rs [..]`"))
+}
+
+#[test]
+fn doctest_and_registry() {
+ let p = project("workspace")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+
+ [dependencies]
+ b = { path = "b" }
+ c = { path = "c" }
+
+ [workspace]
+ "#)
+ .file("src/lib.rs", "")
+ .file("b/Cargo.toml", r#"
+ [project]
+ name = "b"
+ version = "0.1.0"
+ "#)
+ .file("b/src/lib.rs", "
+ /// ```
+ /// b::foo();
+ /// ```
+ pub fn foo() {}
+ ")
+ .file("c/Cargo.toml", r#"
+ [project]
+ name = "c"
+ version = "0.1.0"
+
+ [dependencies]
+ b = "0.1"
+ "#)
+ .file("c/src/lib.rs", "")
+ .build();
+
+ Package::new("b", "0.1.0").publish();
+
+ assert_that(p.cargo("test").arg("--all").arg("-v"),
+ execs().with_status(0));
+}
+
+#[test]
+fn cargo_test_env() {
+ let src = format!(r#"
+ #![crate_type = "rlib"]
+
+ #[test]
+ fn env_test() {{
+ use std::env;
+ println!("{{}}", env::var("{}").unwrap());
+ }}
+ "#, cargo::CARGO_ENV);
+
+ let p = project("env_test")
+ .file("Cargo.toml", &basic_lib_manifest("env_test"))
+ .file("src/lib.rs", &src)
+ .build();
+
+ let mut pr = p.cargo("test");
+ let cargo = cargo_exe().canonicalize().unwrap();
+ assert_that(pr.args(&["--lib", "--", "--nocapture"]),
+ execs().with_status(0)
+ .with_stdout_contains(format!("\
+{}
+test env_test ... ok
+", cargo.to_str().unwrap())));
+}
+
+#[test]
+fn test_order() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("src/lib.rs", r#"
+ #[test] fn test_lib() {}
+ "#)
+ .file("tests/a.rs", r#"
+ #[test] fn test_a() {}
+ "#)
+ .file("tests/z.rs", r#"
+ #[test] fn test_z() {}
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("--all"),
+ execs().with_status(0)
+ .with_stdout_contains("
+running 1 test
+test test_lib ... ok
+
+test result: ok. [..]
+
+
+running 1 test
+test test_a ... ok
+
+test result: ok. [..]
+
+
+running 1 test
+test test_z ... ok
+
+test result: ok. [..]
+"));
+}
+
+#[test]
+fn cyclic_dev() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dev-dependencies]
+ foo = { path = "." }
+ "#)
+ .file("src/lib.rs", r#"
+ #[test] fn test_lib() {}
+ "#)
+ .file("tests/foo.rs", r#"
+ extern crate foo;
+ "#)
+ .build();
+
+ assert_that(p.cargo("test").arg("--all"),
+ execs().with_status(0));
+}
+
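+// A published crate may declare a `[[test]]` target whose source file was
+// excluded from the .crate; a consumer's `cargo test` must not choke on the
+// missing file.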
+#[test]
+fn publish_a_crate_without_tests() {
+ Package::new("testless", "0.1.0")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "testless"
+ version = "0.1.0"
+ exclude = ["tests/*"]
+
+ [[test]]
+ name = "a_test"
+ "#)
+ .file("src/lib.rs", "")
+
+ // In real life the package would have a test file, which the
+ // `exclude` field would keep out of the published .crate file.
+ // Our test harness does not honor `exclude`, though, so we simply
+ // don't add the file:
+ // .file("tests/a_test.rs", "")
+
+ .publish();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+
+ [dependencies]
+ testless = "0.1.0"
+ "#)
+ .file("src/lib.rs", "")
+ .build();
+
+ assert_that(p.cargo("test"), execs().with_status(0));
+ assert_that(p.cargo("test").arg("--package").arg("testless"),
+ execs().with_status(0));
+}
+
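+// Proc-macro crates always build for the host, so when testing with an
+// explicit `--target` the dependencies of a proc-macro dependency must
+// still be resolved and built.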
+#[test]
+fn find_dependency_of_proc_macro_dependency_with_target() {
+ let workspace = project("workspace")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["root", "proc_macro_dep"]
+ "#)
+ .file("root/Cargo.toml", r#"
+ [project]
+ name = "root"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ proc_macro_dep = { path = "../proc_macro_dep" }
+ "#)
+ .file("root/src/lib.rs", r#"
+ #[macro_use]
+ extern crate proc_macro_dep;
+
+ #[derive(Noop)]
+ pub struct X;
+ "#)
+ .file("proc_macro_dep/Cargo.toml", r#"
+ [project]
+ name = "proc_macro_dep"
+ version = "0.1.0"
+ authors = []
+
+ [lib]
+ proc-macro = true
+
+ [dependencies]
+ bar = "^0.1"
+ "#)
+ .file("proc_macro_dep/src/lib.rs", r#"
+ extern crate bar;
+ extern crate proc_macro;
+ use proc_macro::TokenStream;
+
+ #[proc_macro_derive(Noop)]
+ pub fn noop(_input: TokenStream) -> TokenStream {
+ "".parse().unwrap()
+ }
+ "#)
+ .build();
+ Package::new("foo", "0.1.0").publish();
+ Package::new("bar", "0.1.0")
+ .dep("foo", "0.1")
+ .file("src/lib.rs", "extern crate foo;")
+ .publish();
+ assert_that(workspace.cargo("test").arg("--all").arg("--target").arg(rustc_host()),
+ execs().with_status(0));
+}
+
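+// The "to rerun pass '--test integ'" hint for a failed integration test
+// must still be printed even though the doc tests that run afterwards pass.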
+#[test]
+fn test_hint_not_masked_by_doctest() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/lib.rs", r#"
+ /// ```
+ /// assert_eq!(1, 1);
+ /// ```
+ pub fn this_works() {}
+ "#)
+ .file("tests/integ.rs", r#"
+ #[test]
+ fn this_fails() {
+ panic!();
+ }
+ "#)
+ .build();
+ assert_that(p.cargo("test")
+ .arg("--no-fail-fast"),
+ execs()
+ .with_status(101)
+ .with_stdout_contains("test this_fails ... FAILED")
+ .with_stdout_contains("[..]this_works (line [..]ok")
+ .with_stderr_contains("[ERROR] test failed, to rerun pass \
+ '--test integ'"));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::rustc_host;
+use cargotest::support::{path2url, project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn pathless_tools() {
+ let target = rustc_host();
+
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", &format!(r#"
+ [target.{}]
+ ar = "nonexistent-ar"
+ linker = "nonexistent-linker"
+ "#, target))
+ .build();
+
+ assert_that(foo.cargo("build").arg("--verbose"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc [..] -C ar=nonexistent-ar -C linker=nonexistent-linker [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo.url())))
+}
+
+#[test]
+fn absolute_tools() {
+ let target = rustc_host();
+
+ // Escaped as they appear within a TOML config file
+ let config = if cfg!(windows) {
+ (r#"C:\\bogus\\nonexistent-ar"#, r#"C:\\bogus\\nonexistent-linker"#)
+ } else {
+ (r#"/bogus/nonexistent-ar"#, r#"/bogus/nonexistent-linker"#)
+ };
+
+ let foo = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ "#)
+ .file("src/lib.rs", "")
+ .file(".cargo/config", &format!(r#"
+ [target.{target}]
+ ar = "{ar}"
+ linker = "{linker}"
+ "#, target = target, ar = config.0, linker = config.1))
+ .build();
+
+ let output = if cfg!(windows) {
+ (r#"C:\bogus\nonexistent-ar"#, r#"C:\bogus\nonexistent-linker"#)
+ } else {
+ (r#"/bogus/nonexistent-ar"#, r#"/bogus/nonexistent-linker"#)
+ };
+
+ assert_that(foo.cargo("build").arg("--verbose"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc [..] -C ar={ar} -C linker={linker} [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo.url(), ar = output.0, linker = output.1)))
+}
+
+#[test]
+fn relative_tools() {
+ let target = rustc_host();
+
+ // Escaped as they appear within a TOML config file
+ let config = if cfg!(windows) {
+ (r#".\\nonexistent-ar"#, r#".\\tools\\nonexistent-linker"#)
+ } else {
+ (r#"./nonexistent-ar"#, r#"./tools/nonexistent-linker"#)
+ };
+
+ // Funky directory structure to test that relative tool paths are made absolute
+ // by reference to the `.cargo/..` directory and not to (for example) the CWD.
+ let origin = project("origin")
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ authors = []
+
+ [lib]
+ name = "foo"
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file(".cargo/config", &format!(r#"
+ [target.{target}]
+ ar = "{ar}"
+ linker = "{linker}"
+ "#, target = target, ar = config.0, linker = config.1))
+ .build();
+
+ let foo_path = origin.root().join("foo");
+ let foo_url = path2url(foo_path.clone());
+ let prefix = origin.root().into_os_string().into_string().unwrap();
+ let output = if cfg!(windows) {
+ (format!(r#"{}\.\nonexistent-ar"#, prefix),
+ format!(r#"{}\.\tools\nonexistent-linker"#, prefix))
+ } else {
+ (format!(r#"{}/./nonexistent-ar"#, prefix),
+ format!(r#"{}/./tools/nonexistent-linker"#, prefix))
+ };
+
+ assert_that(origin.cargo("build").cwd(foo_path).arg("--verbose"),
+ execs().with_stderr(&format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc [..] -C ar={ar} -C linker={linker} [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+", url = foo_url, ar = output.0, linker = output.1)))
+}
+
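+// A `runner` configured for the host target wraps every executable Cargo
+// launches: `cargo run`, test binaries, and bench binaries alike.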
+#[test]
+fn custom_runner() {
+ let target = rustc_host();
+
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.0.1"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("tests/test.rs", "")
+ .file("benches/bench.rs", "")
+ .file(".cargo/config", &format!(r#"
+ [target.{}]
+ runner = "nonexistent-runner -r"
+ "#, target))
+ .build();
+
+ assert_that(p.cargo("run").args(&["--", "--param"]),
+ execs().with_stderr_contains(&format!("\
+[COMPILING] foo v0.0.1 ({url})
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `nonexistent-runner -r target[/]debug[/]foo[EXE] --param`
+", url = p.url())));
+
+ assert_that(p.cargo("test").args(&["--test", "test", "--verbose", "--", "--param"]),
+ execs().with_stderr_contains(&format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc [..]`
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+[RUNNING] `nonexistent-runner -r [..][/]target[/]debug[/]deps[/]test-[..][EXE] --param`
+", url = p.url())));
+
+ assert_that(p.cargo("bench").args(&["--bench", "bench", "--verbose", "--", "--param"]),
+ execs().with_stderr_contains(&format!("\
+[COMPILING] foo v0.0.1 ({url})
+[RUNNING] `rustc [..]`
+[RUNNING] `rustc [..]`
+[FINISHED] release [optimized] target(s) in [..]
+[RUNNING] `nonexistent-runner -r [..][/]target[/]release[/]deps[/]bench-[..][EXE] --param --bench`
+", url = p.url())));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs, main_file, basic_bin_manifest};
+use hamcrest::{assert_that};
+
+fn verify_project_success_output() -> String {
+ r#"{"success":"true"}"#.into()
+}
+
+#[test]
+fn cargo_verify_project_path_to_cargo_toml_relative() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("verify-project")
+ .arg("--manifest-path").arg("foo/Cargo.toml")
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(0)
+ .with_stdout(verify_project_success_output()));
+}
+
+#[test]
+fn cargo_verify_project_path_to_cargo_toml_absolute() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("verify-project")
+ .arg("--manifest-path").arg(p.root().join("Cargo.toml"))
+ .cwd(p.root().parent().unwrap()),
+ execs().with_status(0)
+ .with_stdout(verify_project_success_output()));
+}
+
+#[test]
+fn cargo_verify_project_cwd() {
+ let p = project("foo")
+ .file("Cargo.toml", &basic_bin_manifest("foo"))
+ .file("src/foo.rs", &main_file(r#""i am foo""#, &[]))
+ .build();
+
+ assert_that(p.cargo("verify-project")
+ .cwd(p.root()),
+ execs().with_status(0)
+ .with_stdout(verify_project_success_output()));
+}
--- /dev/null
+extern crate cargo;
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs};
+use hamcrest::assert_that;
+
+#[test]
+fn simple() {
+ let p = project("foo").build();
+
+ assert_that(p.cargo("version"),
+ execs().with_status(0).with_stdout(&format!("{}\n",
+ cargo::version())));
+
+ assert_that(p.cargo("--version"),
+ execs().with_status(0).with_stdout(&format!("{}\n",
+ cargo::version())));
+}
+
+#[test]
+#[cfg_attr(target_os = "windows", ignore)]
+fn version_works_without_rustc() {
+ let p = project("foo").build();
+ assert_that(p.cargo("version").env("PATH", ""),
+ execs().with_status(0));
+}
+
+#[test]
+fn version_works_with_bad_config() {
+ let p = project("foo")
+ .file(".cargo/config", "this is not toml")
+ .build();
+ assert_that(p.cargo("version"),
+ execs().with_status(0));
+}
+
+#[test]
+fn version_works_with_bad_target_dir() {
+ let p = project("foo")
+ .file(".cargo/config", r#"
+ [build]
+ target-dir = 4
+ "#)
+ .build();
+ assert_that(p.cargo("version"),
+ execs().with_status(0));
+}
--- /dev/null
+extern crate cargotest;
+extern crate hamcrest;
+
+use cargotest::support::{project, execs, Project};
+use cargotest::support::registry::Package;
+use hamcrest::assert_that;
+
+static WARNING1: &'static str = "Hello! I'm a warning. :)";
+static WARNING2: &'static str = "And one more!";
+
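+// `make_lib` publishes a registry crate whose build script emits two
+// `cargo:warning` lines plus hidden stdout/stderr output; `make_upstream`
+// creates a bin crate that depends on it. Build-script warnings from a
+// dependency should surface only when the dependency itself fails to build.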
+fn make_lib(lib_src: &str) {
+ Package::new("foo", "0.0.1")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "foo"
+ authors = []
+ version = "0.0.1"
+ build = "build.rs"
+ "#)
+ .file("build.rs", &format!(r#"
+ fn main() {{
+ use std::io::Write;
+ println!("cargo:warning={{}}", "{}");
+ println!("hidden stdout");
+ write!(&mut ::std::io::stderr(), "hidden stderr");
+ println!("cargo:warning={{}}", "{}");
+ }}
+ "#, WARNING1, WARNING2))
+ .file("src/lib.rs", &format!("fn f() {{ {} }}", lib_src))
+ .publish();
+}
+
+fn make_upstream(main_src: &str) -> Project {
+ project("bar")
+ .file("Cargo.toml", r#"
+ [package]
+ name = "bar"
+ version = "0.0.1"
+ authors = []
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/main.rs", &format!("fn main() {{ {} }}", main_src))
+ .build()
+}
+
+#[test]
+fn no_warning_on_success() {
+ make_lib("");
+ let upstream = make_upstream("");
+ assert_that(upstream.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] foo v0.0.1 ([..])
+[COMPILING] foo v0.0.1
+[COMPILING] bar v0.0.1 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn no_warning_on_bin_failure() {
+ make_lib("");
+ let upstream = make_upstream("hi()");
+ assert_that(upstream.cargo("build"),
+ execs().with_status(101)
+ .with_stdout_does_not_contain("hidden stdout")
+ .with_stderr_does_not_contain("hidden stderr")
+ .with_stderr_does_not_contain(&format!("[WARNING] {}", WARNING1))
+ .with_stderr_does_not_contain(&format!("[WARNING] {}", WARNING2))
+ .with_stderr_contains("[UPDATING] registry `[..]`")
+ .with_stderr_contains("[DOWNLOADING] foo v0.0.1 ([..])")
+ .with_stderr_contains("[COMPILING] foo v0.0.1")
+ .with_stderr_contains("[COMPILING] bar v0.0.1 ([..])"));
+}
+
+#[test]
+fn warning_on_lib_failure() {
+ make_lib("err()");
+ let upstream = make_upstream("");
+ assert_that(upstream.cargo("build"),
+ execs().with_status(101)
+ .with_stdout_does_not_contain("hidden stdout")
+ .with_stderr_does_not_contain("hidden stderr")
+ .with_stderr_does_not_contain("[COMPILING] bar v0.0.1 ([..])")
+ .with_stderr_contains("[UPDATING] registry `[..]`")
+ .with_stderr_contains("[DOWNLOADING] foo v0.0.1 ([..])")
+ .with_stderr_contains("[COMPILING] foo v0.0.1")
+ .with_stderr_contains(&format!("[WARNING] {}", WARNING1))
+ .with_stderr_contains(&format!("[WARNING] {}", WARNING2)));
+}
--- /dev/null
+#[macro_use]
+extern crate cargotest;
+extern crate hamcrest;
+
+use std::io::{Read, Write};
+use std::fs::File;
+
+use cargotest::sleep_ms;
+use cargotest::support::{project, execs, git};
+use cargotest::support::registry::Package;
+use hamcrest::{assert_that, existing_file, existing_dir, is_not};
+
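+// A member pointing back at the root via `workspace = ".."`: both packages
+// share the root target directory and a single Cargo.lock at the root.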
+#[test]
+fn simple_explicit() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = ".."
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+}
+
+#[test]
+fn inferred_root() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+}
+
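+// Path dependencies of the workspace root are implicit members even when
+// the `members` list doesn't name them.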
+#[test]
+fn inferred_path_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [workspace]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}")
+ .file("bar/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+}
+
+#[test]
+fn transitive_path_dep() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "bar" }
+
+ [workspace]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ baz = { path = "../baz" }
+ "#)
+ .file("bar/src/main.rs", "fn main() {}")
+ .file("bar/src/lib.rs", "")
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("baz/src/main.rs", "fn main() {}")
+ .file("baz/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), is_not(existing_file()));
+ assert_that(&p.bin("baz"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+ assert_that(&p.bin("baz"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").cwd(p.root().join("baz")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+ assert_that(&p.bin("baz"), existing_file());
+
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("baz/Cargo.lock"), is_not(existing_file()));
+}
+
+#[test]
+fn parent_pointer_works() {
+ let p = project("foo")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "../bar" }
+
+ [workspace]
+ "#)
+ .file("foo/src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = "../foo"
+ "#)
+ .file("bar/src/main.rs", "fn main() {}")
+ .file("bar/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("foo")),
+ execs().with_status(0));
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/Cargo.lock"), existing_file());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+}
+
+#[test]
+fn same_names_in_workspace() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ workspace = ".."
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: two packages named `foo` in this workspace:
+- [..]Cargo.toml
+- [..]Cargo.toml
+"));
+}
+
+#[test]
+fn parent_doesnt_point_to_child() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(101)
+ .with_stderr("\
+error: current package believes it's in a workspace when it's not:
+current: [..]Cargo.toml
+workspace: [..]Cargo.toml
+
+this may be fixable [..]
+"));
+}
+
+#[test]
+fn invalid_parent_pointer() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ workspace = "foo"
+ "#)
+ .file("src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to read `[..]Cargo.toml`
+
+Caused by:
+ [..]
+"));
+}
+
+#[test]
+fn invalid_members() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["foo"]
+ "#)
+ .file("src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to read `[..]Cargo.toml`
+
+Caused by:
+ [..]
+"));
+}
+
+#[test]
+fn bare_workspace_ok() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ "#)
+ .file("src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+}
+
+#[test]
+fn two_roots() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = [".."]
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: multiple workspace roots found in the same workspace:
+ [..]
+ [..]
+"));
+}
+
+#[test]
+fn workspace_isnt_root() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ workspace = "bar"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: root of a workspace inferred but wasn't a root: [..]
+"));
+}
+
+#[test]
+fn dangling_member() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = "../baz"
+ "#)
+ .file("bar/src/main.rs", "fn main() {}")
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+ workspace = "../baz"
+ "#)
+ .file("baz/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: package `[..]` is a member of the wrong workspace
+expected: [..]
+actual: [..]
+"));
+}
+
+#[test]
+fn cycle() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ workspace = "bar"
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = ".."
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101));
+}
+
+#[test]
+fn share_dependencies() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ dep1 = "0.1"
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ dep1 = "< 0.1.5"
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ Package::new("dep1", "0.1.3").publish();
+ Package::new("dep1", "0.1.8").publish();
+
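+ // foo asks for any 0.1.x while bar caps it at `< 0.1.5`; the shared
+ // workspace lockfile should resolve a single dep1 (0.1.3) satisfying both.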
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] dep1 v0.1.3 ([..])
+[COMPILING] dep1 v0.1.3
+[COMPILING] foo v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn fetch_fetches_all() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ dep1 = "*"
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ Package::new("dep1", "0.1.3").publish();
+
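+ // `cargo fetch` at the root should download dependencies for every
+ // workspace member, including bar's dep1.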
+ assert_that(p.cargo("fetch"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+[DOWNLOADING] dep1 v0.1.3 ([..])
+"));
+}
+
+#[test]
+fn lock_works_for_everyone() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ dep2 = "0.1"
+
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ dep1 = "0.1"
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ Package::new("dep1", "0.1.0").publish();
+ Package::new("dep2", "0.1.0").publish();
+
+ assert_that(p.cargo("generate-lockfile"),
+ execs().with_status(0)
+ .with_stderr("\
+[UPDATING] registry `[..]`
+"));
+
+ Package::new("dep1", "0.1.1").publish();
+ Package::new("dep2", "0.1.1").publish();
+
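+ // Newer releases exist now, but the lockfile generated above should keep
+ // both workspace members pinned to the 0.1.0 versions.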
+ assert_that(p.cargo("build"),
+ execs().with_status(0)
+ .with_stderr("\
+[DOWNLOADING] dep2 v0.1.0 ([..])
+[COMPILING] dep2 v0.1.0
+[COMPILING] foo v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(0)
+ .with_stderr("\
+[DOWNLOADING] dep1 v0.1.0 ([..])
+[COMPILING] dep1 v0.1.0
+[COMPILING] bar v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+#[test]
+fn virtual_works() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
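+ // Building from inside a member of a virtual workspace should still place
+ // Cargo.lock and the compiled binary at the workspace root.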
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(0));
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+}
+
+#[test]
+fn explicit_package_argument_works_with_virtual_manifest() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+ assert_that(p.cargo("build").cwd(p.root()).args(&["--package", "bar"]),
+ execs().with_status(0));
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+}
+
+#[test]
+fn virtual_misconfigure() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(101)
+ .with_stderr("\
+error: current package believes it's in a workspace when it's not:
+current: [..]bar[..]Cargo.toml
+workspace: [..]Cargo.toml
+
+this may be fixable by adding `bar` to the `workspace.members` array of the \
+manifest located at: [..]
+"));
+}
+
+#[test]
+fn virtual_build_all_implied() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn virtual_build_no_members() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ "#);
+ let p = p.build();
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: manifest path `[..]` contains no package: The manifest is virtual, \
+and the workspace has no members.
+"));
+}
+
+#[test]
+fn include_virtual() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ [workspace]
+ members = ["bar"]
+ "#)
+ .file("src/main.rs", "")
+ .file("bar/Cargo.toml", r#"
+ [workspace]
+ "#);
+ let p = p.build();
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: multiple workspace roots found in the same workspace:
+ [..]
+ [..]
+"));
+}
+
+#[test]
+fn members_include_path_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["p1"]
+
+ [dependencies]
+ p3 = { path = "p3" }
+ "#)
+ .file("src/lib.rs", "")
+ .file("p1/Cargo.toml", r#"
+ [project]
+ name = "p1"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ p2 = { path = "../p2" }
+ "#)
+ .file("p1/src/lib.rs", "")
+ .file("p2/Cargo.toml", r#"
+ [project]
+ name = "p2"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("p2/src/lib.rs", "")
+ .file("p3/Cargo.toml", r#"
+ [project]
+ name = "p3"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("p3/src/lib.rs", "");
+ let p = p.build();
+
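+ // p2 and p3 are only path dependencies, not listed members, yet they are
+ // pulled into the workspace and share the root target dir.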
+ assert_that(p.cargo("build").cwd(p.root().join("p1")),
+ execs().with_status(0));
+ assert_that(p.cargo("build").cwd(p.root().join("p2")),
+ execs().with_status(0));
+ assert_that(p.cargo("build").cwd(p.root().join("p3")),
+ execs().with_status(0));
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ assert_that(&p.root().join("target"), existing_dir());
+ assert_that(&p.root().join("p1/target"), is_not(existing_dir()));
+ assert_that(&p.root().join("p2/target"), is_not(existing_dir()));
+ assert_that(&p.root().join("p3/target"), is_not(existing_dir()));
+}
+
+#[test]
+fn new_warns_you_this_will_not_work() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ "#)
+ .file("src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("new").arg("--lib").arg("bar").env("USER", "foo"),
+ execs().with_status(0)
+ .with_stderr("\
+warning: compiling this new crate may not work due to invalid workspace \
+configuration
+
+current package believes it's in a workspace when it's not:
+current: [..]
+workspace: [..]
+
+this may be fixable by ensuring that this crate is depended on by the workspace \
+root: [..]
+[CREATED] library `bar` project
+"));
+}
+
+#[test]
+fn lock_doesnt_change_depending_on_crate() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ['baz']
+
+ [dependencies]
+ foo = "*"
+ "#)
+ .file("src/lib.rs", "")
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = "*"
+ "#)
+ .file("baz/src/lib.rs", "");
+ let p = p.build();
+
+ Package::new("foo", "1.0.0").publish();
+ Package::new("bar", "1.0.0").publish();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+
+ let mut lockfile = String::new();
+ t!(t!(File::open(p.root().join("Cargo.lock"))).read_to_string(&mut lockfile));
+
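+ // Building from inside baz must not rewrite the shared lockfile.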
+ assert_that(p.cargo("build").cwd(p.root().join("baz")),
+ execs().with_status(0));
+
+ let mut lockfile2 = String::new();
+ t!(t!(File::open(p.root().join("Cargo.lock"))).read_to_string(&mut lockfile2));
+
+ assert_eq!(lockfile, lockfile2);
+}
+
+#[test]
+fn rebuild_please() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ['lib', 'bin']
+ "#)
+ .file("lib/Cargo.toml", r#"
+ [package]
+ name = "lib"
+ version = "0.1.0"
+ "#)
+ .file("lib/src/lib.rs", r#"
+ pub fn foo() -> u32 { 0 }
+ "#)
+ .file("bin/Cargo.toml", r#"
+ [package]
+ name = "bin"
+ version = "0.1.0"
+
+ [dependencies]
+ lib = { path = "../lib" }
+ "#)
+ .file("bin/src/main.rs", r#"
+ extern crate lib;
+
+ fn main() {
+ assert_eq!(lib::foo(), 0);
+ }
+ "#);
+ let p = p.build();
+
+ assert_that(p.cargo("run").cwd(p.root().join("bin")),
+ execs().with_status(0));
+
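+ // Sleep before rewriting the file so its mtime is strictly newer than the
+ // previous build; some filesystems only track one-second granularity.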
+ sleep_ms(1000);
+
+ t!(t!(File::create(p.root().join("lib/src/lib.rs"))).write_all(br#"
+ pub fn foo() -> u32 { 1 }
+ "#));
+
+ assert_that(p.cargo("build").cwd(p.root().join("lib")),
+ execs().with_status(0));
+
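+ // Running bin again rebuilds it against the edited lib, so the
+ // `assert_eq!` in main now fails and the run exits nonzero.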
+ assert_that(p.cargo("run").cwd(p.root().join("bin")),
+ execs().with_status(101));
+}
+
+#[test]
+fn workspace_in_git() {
+ let git_project = git::new("dep1", |project| {
+ project
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [package]
+ name = "foo"
+ version = "0.1.0"
+ "#)
+ .file("foo/src/lib.rs", "")
+ }).unwrap();
+ let p = project("foo")
+ .file("Cargo.toml", &format!(r#"
+ [package]
+ name = "lib"
+ version = "0.1.0"
+
+ [dependencies.foo]
+ git = '{}'
+ "#, git_project.url()))
+ .file("src/lib.rs", r#"
+ pub fn foo() -> u32 { 0 }
+ "#);
+ let p = p.build();
+
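+ // foo comes from a git repo where it is a workspace member; depending on
+ // it from outside that workspace should still work.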
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+}
+
+#[test]
+fn lockfile_can_specify_nonexistent_members() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["a"]
+ "#)
+ .file("a/Cargo.toml", r#"
+ [project]
+ name = "a"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("a/src/main.rs", "fn main() {}")
+ .file("Cargo.lock", r#"
+ [[package]]
+ name = "a"
+ version = "0.1.0"
+
+ [[package]]
+ name = "b"
+ version = "0.1.0"
+ "#);
+
+ let p = p.build();
+
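+ // The stale `b` entry in Cargo.lock names a package that doesn't exist;
+ // the build should simply tolerate it.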
+ assert_that(p.cargo("build").cwd(p.root().join("a")), execs().with_status(0));
+}
+
+#[test]
+fn you_cannot_generate_lockfile_for_empty_workspaces() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ "#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("update"),
+ execs().with_status(101)
+ .with_stderr("\
+error: you can't generate a lockfile for an empty workspace.
+"));
+}
+
+#[test]
+fn workspace_with_transitive_dev_deps() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.5.0"
+ authors = ["mbrubeck@example.com"]
+
+ [dependencies.bar]
+ path = "bar"
+
+ [workspace]
+ "#)
+ .file("src/main.rs", r#"fn main() {}"#)
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.5.0"
+ authors = ["mbrubeck@example.com"]
+
+ [dev-dependencies.baz]
+ path = "../baz"
+ "#)
+ .file("bar/src/lib.rs", r#"
+ pub fn init() {}
+
+ #[cfg(test)]
+ #[test]
+ fn test() {
+ extern crate baz;
+ baz::do_stuff();
+ }
+ "#)
+ .file("baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.5.0"
+ authors = ["mbrubeck@example.com"]
+ "#)
+ .file("baz/src/lib.rs", r#"pub fn do_stuff() {}"#);
+ let p = p.build();
+
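+ // baz is only a dev-dependency of bar, but `cargo test -p bar` from the
+ // workspace root should still build and link it.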
+ assert_that(p.cargo("test").args(&["-p", "bar"]),
+ execs().with_status(0));
+}
+
+#[test]
+fn error_if_parent_cargo_toml_is_invalid() {
+ let p = project("foo")
+ .file("Cargo.toml", "Totally not a TOML file")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("bar")),
+ execs().with_status(101)
+ .with_stderr_contains("\
+[ERROR] failed to parse manifest at `[..]`"));
+}
+
+#[test]
+fn relative_path_for_member_works() {
+ let p = project("foo")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["../bar"]
+ "#)
+ .file("foo/src/main.rs", "fn main() {}")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = "../foo"
+ "#)
+ .file("bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("foo")), execs().with_status(0));
+ assert_that(p.cargo("build").cwd(p.root().join("bar")), execs().with_status(0));
+}
+
+#[test]
+fn relative_path_for_root_works() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+
+ [dependencies]
+ subproj = { path = "./subproj" }
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("subproj/Cargo.toml", r#"
+ [project]
+ name = "subproj"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("subproj/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root())
+ .arg("--manifest-path").arg("./Cargo.toml"),
+ execs().with_status(0));
+
+ assert_that(p.cargo("build").cwd(p.root().join("subproj"))
+ .arg("--manifest-path").arg("../Cargo.toml"),
+ execs().with_status(0));
+}
+
+#[test]
+fn path_dep_outside_workspace_is_not_member() {
+ let p = project("foo")
+ .file("ws/Cargo.toml", r#"
+ [project]
+ name = "ws"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = { path = "../foo" }
+
+ [workspace]
+ "#)
+ .file("ws/src/lib.rs", r"extern crate foo;")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("ws")),
+ execs().with_status(0));
+}
+
+#[test]
+fn test_in_and_out_of_workspace() {
+ let p = project("foo")
+ .file("ws/Cargo.toml", r#"
+ [project]
+ name = "ws"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = { path = "../foo" }
+
+ [workspace]
+ members = [ "../bar" ]
+ "#)
+ .file("ws/src/lib.rs", r"extern crate foo; pub fn f() { foo::f() }")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "../bar" }
+ "#)
+ .file("foo/src/lib.rs", "extern crate bar; pub fn f() { bar::f() }")
+ .file("bar/Cargo.toml", r#"
+ [project]
+ workspace = "../ws"
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("bar/src/lib.rs", "pub fn f() { }");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("ws")),
+ execs().with_status(0));
+
+ assert_that(&p.root().join("ws/Cargo.lock"), existing_file());
+ assert_that(&p.root().join("ws/target"), existing_dir());
+ assert_that(&p.root().join("foo/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("foo/target"), is_not(existing_dir()));
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("bar/target"), is_not(existing_dir()));
+
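+ // foo lives outside the workspace root's directory and isn't a listed
+ // member, so building it directly creates its own lockfile and target dir.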
+ assert_that(p.cargo("build").cwd(p.root().join("foo")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/Cargo.lock"), existing_file());
+ assert_that(&p.root().join("foo/target"), existing_dir());
+ assert_that(&p.root().join("bar/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("bar/target"), is_not(existing_dir()));
+}
+
+#[test]
+fn test_path_dependency_under_member() {
+ let p = project("foo")
+ .file("ws/Cargo.toml", r#"
+ [project]
+ name = "ws"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ foo = { path = "../foo" }
+
+ [workspace]
+ "#)
+ .file("ws/src/lib.rs", r"extern crate foo; pub fn f() { foo::f() }")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ workspace = "../ws"
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "./bar" }
+ "#)
+ .file("foo/src/lib.rs", "extern crate bar; pub fn f() { bar::f() }")
+ .file("foo/bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/bar/src/lib.rs", "pub fn f() { }");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("ws")),
+ execs().with_status(0));
+
+ assert_that(&p.root().join("foo/bar/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("foo/bar/target"), is_not(existing_dir()));
+
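+ // bar is a transitive path dependency living under a workspace package, so
+ // even a direct build treats it as part of the ws workspace.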
+ assert_that(p.cargo("build").cwd(p.root().join("foo/bar")),
+ execs().with_status(0));
+
+ assert_that(&p.root().join("foo/bar/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("foo/bar/target"), is_not(existing_dir()));
+}
+
+#[test]
+fn excluded_simple() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "ws"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ exclude = ["foo"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "");
+ let p = p.build();
+
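+ // foo is excluded: the workspace build skips it, and a direct build gives
+ // it its own target dir.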
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target"), existing_dir());
+ assert_that(p.cargo("build").cwd(p.root().join("foo")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/target"), existing_dir());
+}
+
+#[test]
+fn exclude_members_preferred() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "ws"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["foo/bar"]
+ exclude = ["foo"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("foo/bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/bar/src/lib.rs", "");
+ let p = p.build();
+
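+ // An explicit members entry wins over exclude: foo/bar stays a workspace
+ // member (no foo/bar/target below) even though its parent foo is excluded.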
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target"), existing_dir());
+ assert_that(p.cargo("build").cwd(p.root().join("foo")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/target"), existing_dir());
+ assert_that(p.cargo("build").cwd(p.root().join("foo/bar")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/bar/target"), is_not(existing_dir()));
+}
+
+#[test]
+fn exclude_but_also_depend() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "ws"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ bar = { path = "foo/bar" }
+
+ [workspace]
+ exclude = ["foo"]
+ "#)
+ .file("src/lib.rs", "")
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("foo/bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/bar/src/lib.rs", "");
+ let p = p.build();
+
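+ // bar sits under the excluded foo directory, so despite being a path
+ // dependency of the root it builds standalone with its own target dir.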
+ assert_that(p.cargo("build"),
+ execs().with_status(0));
+ assert_that(&p.root().join("target"), existing_dir());
+ assert_that(p.cargo("build").cwd(p.root().join("foo")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/target"), existing_dir());
+ assert_that(p.cargo("build").cwd(p.root().join("foo/bar")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/bar/target"), existing_dir());
+}
+
+#[test]
+fn glob_syntax() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["crates/*"]
+ exclude = ["crates/qux"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("crates/bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = "../.."
+ "#)
+ .file("crates/bar/src/main.rs", "fn main() {}")
+ .file("crates/baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+ workspace = "../.."
+ "#)
+ .file("crates/baz/src/main.rs", "fn main() {}")
+ .file("crates/qux/Cargo.toml", r#"
+ [project]
+ name = "qux"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("crates/qux/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), is_not(existing_file()));
+ assert_that(&p.bin("baz"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").cwd(p.root().join("crates/bar")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+
+ assert_that(p.cargo("build").cwd(p.root().join("crates/baz")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("baz"), existing_file());
+
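+ // qux matched crates/* but is excluded, so it builds standalone: no binary
+ // in the shared target dir, and its own Cargo.lock (checked below).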
+ assert_that(p.cargo("build").cwd(p.root().join("crates/qux")),
+ execs().with_status(0));
+ assert_that(&p.bin("qux"), is_not(existing_file()));
+
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.root().join("crates/bar/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("crates/baz/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("crates/qux/Cargo.lock"), existing_file());
+}
+
+/*FIXME: This fails because of how workspace.exclude and workspace.members interact.
+#[test]
+fn glob_syntax_2() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["crates/b*"]
+ exclude = ["crates/q*"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("crates/bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ workspace = "../.."
+ "#)
+ .file("crates/bar/src/main.rs", "fn main() {}")
+ .file("crates/baz/Cargo.toml", r#"
+ [project]
+ name = "baz"
+ version = "0.1.0"
+ authors = []
+ workspace = "../.."
+ "#)
+ .file("crates/baz/src/main.rs", "fn main() {}")
+ .file("crates/qux/Cargo.toml", r#"
+ [project]
+ name = "qux"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("crates/qux/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"), execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), is_not(existing_file()));
+ assert_that(&p.bin("baz"), is_not(existing_file()));
+
+ assert_that(p.cargo("build").cwd(p.root().join("crates/bar")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("bar"), existing_file());
+
+ assert_that(p.cargo("build").cwd(p.root().join("crates/baz")),
+ execs().with_status(0));
+ assert_that(&p.bin("foo"), existing_file());
+ assert_that(&p.bin("baz"), existing_file());
+
+ assert_that(p.cargo("build").cwd(p.root().join("crates/qux")),
+ execs().with_status(0));
+ assert_that(&p.bin("qux"), is_not(existing_file()));
+
+ assert_that(&p.root().join("Cargo.lock"), existing_file());
+ assert_that(&p.root().join("crates/bar/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("crates/baz/Cargo.lock"), is_not(existing_file()));
+ assert_that(&p.root().join("crates/qux/Cargo.lock"), existing_file());
+}
+*/
+
+#[test]
+fn glob_syntax_invalid_members() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+
+ [workspace]
+ members = ["crates/*"]
+ "#)
+ .file("src/main.rs", "fn main() {}")
+ .file("crates/bar/src/main.rs", "fn main() {}");
+ let p = p.build();
+
+ assert_that(p.cargo("build"),
+ execs().with_status(101)
+ .with_stderr("\
+error: failed to read `[..]Cargo.toml`
+
+Caused by:
+ [..]
+"));
+}
+
+/// This is a freshness test for feature use with workspaces.
+///
+/// feat_lib is used by both caller1 and caller2, but with different features
+/// enabled. This test ensures that alternately building caller1 and caller2
+/// doesn't force a recompile of feat_lib.
+///
+/// Ideally, once we solve https://github.com/rust-lang/cargo/issues/3620, a
+/// single `cargo build` at the top level will be enough.
+#[test]
+fn dep_used_with_separate_features() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["feat_lib", "caller1", "caller2"]
+ "#)
+ .file("feat_lib/Cargo.toml", r#"
+ [project]
+ name = "feat_lib"
+ version = "0.1.0"
+ authors = []
+
+ [features]
+ myfeature = []
+ "#)
+ .file("feat_lib/src/lib.rs", "")
+ .file("caller1/Cargo.toml", r#"
+ [project]
+ name = "caller1"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ feat_lib = { path = "../feat_lib" }
+ "#)
+ .file("caller1/src/main.rs", "fn main() {}")
+ .file("caller1/src/lib.rs", "")
+ .file("caller2/Cargo.toml", r#"
+ [project]
+ name = "caller2"
+ version = "0.1.0"
+ authors = []
+
+ [dependencies]
+ feat_lib = { path = "../feat_lib", features = ["myfeature"] }
+ caller1 = { path = "../caller1" }
+ "#)
+ .file("caller2/src/main.rs", "fn main() {}")
+ .file("caller2/src/lib.rs", "");
+ let p = p.build();
+
+ // Build the entire workspace
+ assert_that(p.cargo("build").arg("--all"),
+ execs().with_status(0)
+ .with_stderr("\
+[..]Compiling feat_lib v0.1.0 ([..])
+[..]Compiling caller1 v0.1.0 ([..])
+[..]Compiling caller2 v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(&p.bin("caller1"), existing_file());
+ assert_that(&p.bin("caller2"), existing_file());
+
+ // Build caller1. The dep library is rebuilt because its features differ
+ // from those used by the full workspace build above.
+ // Ideally, once we solve https://github.com/rust-lang/cargo/issues/3620, a
+ // single `cargo build` at the top level will be enough.
+ assert_that(p.cargo("build").cwd(p.root().join("caller1")),
+ execs().with_status(0)
+ .with_stderr("\
+[..]Compiling feat_lib v0.1.0 ([..])
+[..]Compiling caller1 v0.1.0 ([..])
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+
+ // Alternate building caller2 and caller1 a few times to make sure the
+ // per-feature artifacts stay separate. Nothing should be rebuilt.
+ assert_that(p.cargo("build").cwd(p.root().join("caller2")),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build").cwd(p.root().join("caller1")),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+ assert_that(p.cargo("build").cwd(p.root().join("caller2")),
+ execs().with_status(0)
+ .with_stderr("\
+[FINISHED] dev [unoptimized + debuginfo] target(s) in [..]
+"));
+}
+
+/*FIXME: This fails because of how workspace.exclude and workspace.members interact.
+#[test]
+fn include_and_exclude() {
+ let p = project("foo")
+ .file("Cargo.toml", r#"
+ [workspace]
+ members = ["foo"]
+ exclude = ["foo/bar"]
+ "#)
+ .file("foo/Cargo.toml", r#"
+ [project]
+ name = "foo"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/src/lib.rs", "")
+ .file("foo/bar/Cargo.toml", r#"
+ [project]
+ name = "bar"
+ version = "0.1.0"
+ authors = []
+ "#)
+ .file("foo/bar/src/lib.rs", "");
+ let p = p.build();
+
+ assert_that(p.cargo("build").cwd(p.root().join("foo")),
+ execs().with_status(0));
+ assert_that(&p.root().join("target"), existing_dir());
+ assert_that(&p.root().join("foo/target"), is_not(existing_dir()));
+ assert_that(p.cargo("build").cwd(p.root().join("foo/bar")),
+ execs().with_status(0));
+ assert_that(&p.root().join("foo/bar/target"), existing_dir());
+}
+*/