--- /dev/null
+julia (0.3.12-1) unstable; urgency=medium
+
+ Starting with this release, the dynamic library loader only
+ considers files with a .so extension when given a library name
+ without an extension. With previous releases, the loader would
+ fall back to files with a .so.VERSION extension, which could
+ result in undefined behaviour when multiple versions of the
+ same library were installed.
+
+ Before you install an external Julia package using Pkg.add() that
+ depends on a shared library, make sure to install the matching
+ lib*-dev package, which provides a symlink with a .so extension.
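+
+ For example, assuming a package that loads libgmp (libgmp is only an
+ illustration here), the corresponding -dev package is what provides the
+ unversioned symlink:
+
+ $ sudo apt-get install libgmp-dev
+ $ ls -l /usr/lib/*/libgmp.so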
+
+ -- Peter Colberg <peter@colberg.org> Tue, 10 Nov 2015 23:52:57 -0500
--- /dev/null
+Starting Julia REPL
+-------------------
+
+ Simply type `julia' in a shell to launch an interactive Julia session.
+
+ Full documentation and examples are available in the julia-doc package:
+ /usr/share/doc/julia/html/en/index.html
+
+ An Emacs mode for editing Julia source files and managing interactive sessions
+ is available in the ess package.
+
+ -- Sébastien Villemot <sebastien@debian.org>, Wed, 16 Oct 2013 16:07:45 +0200
+
+
+Julia and Math Kernel Library (Intel-MKL)
+-----------------------------------------
+
+ There are two ways to switch the BLAS/LAPACK implementation to MKL:
+ 1. Switch the libblas.so.3 and liblapack.so.3 candidates with Debian's
+ alternatives system.
+ 2. Rebuild Julia against MKL.
+
+ Alternatives System
+ ^^^^^^^^^^^^^^^^^^^
+
+ You can switch e.g. libblas.so.3-x86_64-linux-gnu with galternatives.
+
+ $ sudo apt install galternatives
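+
+ If you prefer the command line, the same switch can be done with
+ update-alternatives (shown here for amd64; adjust the multiarch
+ triplet for other architectures):
+
+ $ sudo update-alternatives --config libblas.so.3-x86_64-linux-gnu
+ $ sudo update-alternatives --config liblapack.so.3-x86_64-linux-gnu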
+
+ When not using OpenBLAS, you might encounter the following warning,
+ but it is harmless:
+
+ WARNING: Error during initialization of module LinearAlgebra:
+ ErrorException("could not load symbol "openblas_get_config":
+ /usr/bin/../lib/x86_64-linux-gnu/julia/libblas.so: undefined symbol: openblas_get_config")
+
+ Rebuild against MKL
+ ^^^^^^^^^^^^^^^^^^^
+
+ To rebuild Julia against MKL, set the variable CUSTOM_MKL to 1 in
+ debian/rules, and rebuild the package. Please make sure that you
+ have intel-mkl installed before doing the custom build.
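+
+ On Debian, the MKL packages live in non-free and can be pulled in
+ with, for example:
+
+ $ sudo apt install intel-mkl libmkl-dev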
+
+ Brief instructions for doing a custom build against MKL:
+
+ 0. Fetch the source package of julia and enter the source tree.
+
+ 1. Install the build dependencies and helper scripts
+
+ $ sudo apt install devscripts
+ $ sudo apt build-dep julia
+
+ 2. Modify debian/rules, setting CUSTOM_MKL to 1.
+
+ 3. Build, check, and install.
+
+ $ debuild -j4
+ $ debc
+ $ sudo debi
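+
+ To verify which BLAS the rebuilt Julia actually loaded (an optional
+ sanity check; it should report mkl if the custom build picked up MKL):
+
+ $ julia -e 'using LinearAlgebra; println(LinearAlgebra.BLAS.vendor())'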
+
+ Known Problems with MKL
+ ^^^^^^^^^^^^^^^^^^^^^^^
+
+ 1. When MKL is installed in the build environment, this test failure
+ may appear: https://github.com/JuliaLang/julia/issues/23264
+
+ -- Mo Zhou <cdluminate@gmail.com> Thu, 22 Sep 2018 09:00:00 +0000
--- /dev/null
+julia (1.0.4+dfsg-1) unstable; urgency=medium
+
+ * New upstream version 1.0.4+dfsg
+ * Uscan: Monitor the 1.0.X series.
+ * Upgrade embedded Pkg.jl -> 1609a05aee5d5960670738d8d834d91235bd6b1e
+
+ -- Mo Zhou <cdluminate@gmail.com> Fri, 17 May 2019 09:01:01 +0000
+
+julia (1.0.3+dfsg-4) unstable; urgency=medium
+
+ [ Mo Zhou ]
+ * Set JULIA_CPU_TARGET="pwr8" for ppc64el architecture
+
+ [ Graham Inggs ]
+ * Avoid baseline violation on armhf, thanks Adrian Bunk
+ (Closes: #919183)
+ * Build with GCC 8 on armhf again
+
+ -- Graham Inggs <ginggs@debian.org> Tue, 22 Jan 2019 20:19:55 +0000
+
+julia (1.0.3+dfsg-3) unstable; urgency=medium
+
+ * Switch to use embedded LLVM-6.0.0 with upstream patches.
+ + Embed llvm-6.0.0.src.tar.xz to debian/embedded.
+ + Register new tarball in source/include-binaries.
+ + Set USE_SYSTEM_LLVM=0 in rules.
+ + Add patch do-not-download-llvm.patch to redirect downloading.
+ + Add symlinks in debian/embedded to avoid "download" errors.
+ + Add cmake, python3 to B-D for the embedded LLVM.
+ + Update copyright for the embedded LLVM.
+ + Patch Makefile and update rules to amend installation.
+ + Install the embedded LLVM to Julia's private library directory.
+ + dh_makeshlibs: Don't track symbols for the private LLVM library.
+ - Drop B-D on Debian's LLVM 6.0
+ - Remove makefile options for system LLVM.
+ * Remove the DESTDIR definition from make flags. This definition
+ screws up installation of embedded LLVM.
+ * Add hack for embedded LLVM to let it install shlibs correctly.
+ * Refresh patches (quilt push -a --refresh).
+ * Skip more DNS tests for Sockets.jl module.
+
+ -- Mo Zhou <cdluminate@gmail.com> Thu, 17 Jan 2019 15:27:52 +0000
+
+julia (1.0.3+dfsg-2) unstable; urgency=medium
+
+ [ Graham Inggs ]
+ * Skip DNS tests which fail on Ubuntu autopkgtest infrastructure
+
+ [ Mo Zhou ]
+ * Add d/gitlab-ci.yml for enabling Salsa CI pipeline.
+ * Delete the unused DEBIAN_FORCE_NONET env var from rules and autopkgtest.
+ * Update copyright for embedded .jl packages.
+ * Uscan: add dversionmangle option to append +dfsg postfix.
+ * Add headers for headless patches.
+ * Add patch (disabled by default) to append "@distro" to default LOAD_PATH.
+ * Bump Standards-Version to 4.3.0 (no change).
+ * Roll back embedded Documenter to 0.20.0, DocStringExtensions to 0.5.0 .
+ * Refresh the list of included-binaries.
+ * Bump B-D-I pygmentize to Py3 version.
+ + Append python3-pkg-resources to B-D-I to workaround #918401
+ * Add repacksuffix option to uscan.
+ * Add openssl to B-D (more test coverage) and Recommends.
+
+ -- Mo Zhou <cdluminate@gmail.com> Sun, 13 Jan 2019 10:53:28 +0000
+
+julia (1.0.3+dfsg-1) unstable; urgency=medium
+
+ * Users who want to use MKL as Julia's BLAS/LAPACK backend should
+ rebuild this package after turning on the CUSTOM_MKL flag in d/rules.
+ I temporarily reverted the use of alternatives mechanism because the
+ code of stdlib/LinearAlgebra is not ready for such a feature.
+
+ * DFSG: Exclude contrib/windows/7zS.sfx (Closes: #916957)
+ * Remove the aforementioned incremental patch since the
+ source tarball has been refreshed.
+ * Revert "Link libjulia.so.X against libblas.so.3 to take
+ advantage from alternatives." (Closes: #916991)
+
+ -- Mo Zhou <cdluminate@gmail.com> Fri, 21 Dec 2018 14:47:47 +0000
+
+julia (1.0.3-2) unstable; urgency=medium
+
+ * Cherry-pick incremental patch from upstream's force-pushed 1.0.3 tag.
+
+ -- Mo Zhou <cdluminate@gmail.com> Tue, 18 Dec 2018 06:25:16 +0000
+
+julia (1.0.3-1) unstable; urgency=medium
+
+ * New upstream version 1.0.3
+ * Strip shared object libjulia.so.1 , whilst sys.so is still unstripped.
+ * Update embedded tarball for Pkg.jl .
+ * Refresh and update patches, including patches for embedded source.
+ * Refresh doc-silent.patch on new embedded source.
+ * Upgrade embedded DocStringExtensions.jl to v0.6.0
+ * Upgrade embedded Documenter to v0.21.0
+ * Remove embedded Compat.jl, not used by Documenter.jl anymore.
+
+ -- Mo Zhou <cdluminate@gmail.com> Mon, 17 Dec 2018 10:15:03 +0000
+
+julia (1.0.2-1) unstable; urgency=medium
+
+ * New upstream version 1.0.2
+ * Remove test-precision.patch, merged upstream.
+ * Import Pkg.jl tarball to d/embedded directory.
+ Its sha512sum matches with the one provided by upstream.
+ + Register the Pkg.jl tarball in include-binaries.
+ + Patch makefile to avoid downloading Pkg.jl tarball.
+ * Remove LLVM_DISABLE_ABI_BREAKING_CHECKS_ENFORCING definition from rules.
+ * Refresh Patches.
+
+ -- Mo Zhou <cdluminate@gmail.com> Thu, 29 Nov 2018 06:50:23 +0000
+
+julia (1.0.1-2) unstable; urgency=medium
+
+ * Extend JULIA_CPU_TARGET to include optimized targets as well,
+ apart from the generic target. (Closes: #910784)
+
+ -- Mo Zhou <cdluminate@gmail.com> Sat, 13 Oct 2018 06:16:18 +0000
+
+julia (1.0.1-1) unstable; urgency=medium
+
+ * New upstream stable release 1.0.1 (Sept 2018)
+ * Remove hundreds of patches picked from WIP Julia 1.0.1 release.
+ * Elaborate on why debug info should not be stripped from sys.so in rules.
+ * Drop deps:openspecfun which is not used anymore since 0.7 release.
+ * Add CUSTOM_NATIVE variable in rules for custom builds.
+ * Drop test related patches, because they are not needed anymore:
+ - Drop test-add-envvar-for-skipping-network-tests.patch .
+ - Drop test-moredetail.patch, test-skip-ppc64el-spec.patch
+ - Drop test-remove-powermod.patch, test-aggressive-gc.patch .
+
+ -- Mo Zhou <cdluminate@gmail.com> Sun, 30 Sep 2018 15:22:20 +0000
+
+julia (1.0.0-3) unstable; urgency=medium
+
+ * Handle floating point precision issue during test. (test-precision.patch)
+ * Patch runtests.jl to let it print more detail. (test-moredetail.patch)
+ * Make Documenter.jl quiet. (pre-apply, doc-silent.patch)
+ * Autopkgtest: Skip network related tests for Ubuntu.
+ * Link libjulia.so.X against netlib LAPACK (libblas.so.3/liblapack.so.3)
+ in order to take advantage from alternatives system. (Closes: #905826)
+ * Shared object libjulia.so.X Depends on high performance BLAS/LAPACK
+ implementations when available, i.e. OpenBLAS | Atlas | Intel-MKL.
+ * Add 167 patches from work-in-progress Julia 1.0.1 release.
+ * Autopkgtest: Execute runtests.jl instead of calling Base.runtests()
+ * README.Debian: Julia's BLAS/LAPACK backend is switchable henceforth.
+ * Also remove the "net_on" guard in test/runtests.jl.
+ (update, test-add-envvar-for-skipping-network-tests.patch)
+ * Downgrade GCC version to 7 for armhf in order to circumvent FTBFS.
+
+ -- Mo Zhou <cdluminate@gmail.com> Sun, 23 Sep 2018 03:11:52 +0000
+
+julia (1.0.0-2) unstable; urgency=medium
+
+ * Run test on all architectures instead of only amd64 and i386,
+ but allow non-x86 tests to fail. (Closes: #906754)
+ * Binary package julia Suggests vim-julia. (Closes: #907050)
+ * Enable the build on s390x
+ - Disable libunwind on s390x (unsatisfiable dependency).
+ * shlibdeps.mk: don't link against libunwind on s390x architecture.
+ * Fix logic error in base/Makefile. (+ make-unwind-logic-error.patch)
+ * Skip some tests unconditionally or conditionally according to Sys.ARCH
+ * Skip "powermod" related tests on non-x86 architecture.
+ * Skip several tests specifically on i386. (+ test-skip-i386-spec.patch)
+ * Skip several tests on ppc64el. (+ test-skip-ppc64el-spec.patch)
+ * Add patch test-add-envvar-for-skipping-network-tests.patch.
+ + rules: Export DEBIAN_FORCE_NONET to skip network tests during build.
+ * Add patch test-aggressive-gc.patch: make garbage collection aggressive.
+ * Re-arrange patches and refresh them.
+ * Disable thumb instruction set for armhf by adding -marm to cflags.
+ * Bump Standards-Version to 4.2.1 (no change).
+ * rules: Don't install extra license files.
+ * Override false-positive lintian error: wrong-path-for-interpreter.
+ * Upload to unstable.
+
+ -- Mo Zhou <cdluminate@gmail.com> Fri, 14 Sep 2018 11:09:12 +0000
+
+julia (1.0.0-1) experimental; urgency=medium
+
+ * New upstream version 1.0.0
+ * Bump SOVERSION from 0.7.0 to 1.0.0, update debian directory accordingly.
+ * Package libjulia1.0 Breaks and Replaces libjulia0.7 .
+ * Refresh symbols list for libjulia.so.1.0 .
+ * Update the embedded Documenter:
+ + Documenter.jl to v0.19.4
+ + Compat.jl to v1.0.1 .
+ + DocStringExtensions.jl to v0.4.6 .
+ * Update debian/source/include-binaries accordingly.
+ * Update patch privacy-breach.patch and pre-apply the patch.
+ * Add patch doc-unicode-data-path.patch and update deb-make-doc.patch
+ to avoid UnicodeData.txt File-Not-Found during documentation build.
+ * Refresh debian/copyright.
+ * Also build the PDF documentation apart from the HTML doc.
+ + Add latex related Build-Depends-Indep packages for building PDF doc.
+ + Override dh_auto_build-indep to build both HTML and PDF document.
+ + Install the generated PDF doc doc/_build/pdf/en/TheJuliaLanguage.pdf
+ + Register the PDF documentation in doc-base.
+ * Remove source/local-options , not useful anymore.
+ * Update appstream.patch from upstream pull request #28020.
+
+ -- Mo Zhou <cdluminate@gmail.com> Thu, 16 Aug 2018 07:48:17 +0000
+
+julia (0.7.0-2) unstable; urgency=medium
+
+ [ Peter Colberg ]
+ * Drop Build-Depends on libfftw3-dev, Bindings to the FFTW library
+ have been removed from Base. (Closes: #905849)
+
+ [ Mo Zhou ]
+ * Replace Sys.CPU_CORES with Sys.CPU_THREADS in test script.
+ * Add patch test-skip-sigint.patch to temporarily avoid a random
+ test failure in test/stress.jl, which sometimes fail to catch SIGINT.
+
+ -- Mo Zhou <cdluminate@gmail.com> Sun, 12 Aug 2018 01:00:36 +0000
+
+julia (0.7.0-1) unstable; urgency=medium
+
+ * New upstream version 0.7.0
+ * Update embedded libuv tarball to ed3700c849289ed01fe04273a7bf865340b2bd7e.
+ * Refresh all the patches.
+ * Move privacy-breach.patch to quilt directory, and pre-apply the patch.
+ * Require libgit2-dev (>= 0.27.0~) as B-D.
+ * Bump Standards-Version to 4.2.0 (no change).
+ * Three new symbols for libjulia0.7 0.7.0 .
+ * Only traverse within debian directory when fixing shebang. (Avoid FTBFSx2)
+ * README.Debian: Add brief instructions for rebuilding Julia against MKL.
+ * Update file matching expressions in copyright.
+ * Require llvm-6.0-dev (>= 1:6.0.1-3) in the B-D field.
+
+ -- Mo Zhou <cdluminate@gmail.com> Thu, 09 Aug 2018 16:43:05 +0000
+
+julia (0.7.0~beta2-1) experimental; urgency=medium
+
+ * New upstream BETA version 0.7.0~beta2
+ * Changes related to embedded sources:
+ + Embed newer libuv tarball which is specified by upstream.
+ - Remove the old libuv-d8ab1c6a33e77bf155facb54215dd8798e13825d.tar.gz
+ + Embed libwhich-81e9723c0273d78493dc8c8ed570f68d9ce7e89e.tar.gz
+ - Remove embedded arpack tarball and corresponding b-d.
+ * Update binary file registration in source/include-binaries .
+ * Updates related to patch stack:
+ * Refresh patches for 0.7.0~beta2
+ + Add patch: do-not-download-libwhich.patch
+ + Update patch deb-make-doc.patch to fix document build failure.
+ + Update no-debug-version.patch to adapt to upstream Makefile change.
+ - Remove do-not-download-arpack.patch, not used anymore.
+ - Remove do-not-query-ldconfig.patch, due to upstream's commit:
+ github.com/JuliaLang/julia/commit/28f3c06f2d30dd1ae2c189cf8fb54d625ecc26ad
+ - Remove patch: unversioned-system-load-path.patch (not applied).
+ We don't need it anymore because it may result in confusing behaviour.
+ - Remove unused tests patch: test_libgit2.patch, unicode-test.patch,
+ test_replcompletions.patch. No longer needed since the tests don't fail.
+ * Control-related changes:
+ * Bump B-D llvm version to 6 (Closes: #873408)
+ * Bump SOVERSION, and update corresponding file names.
+ * Update .install files: install several new files.
+ * Update symbols control file.
+ * libjulia0.7 breaks and replaces libjulia0.6 .
+ * symbols: Mark jl_cpuidex as {amd64,i386}-only.
+ * Mark usr/share/doc/julia/html/* as not-installed to silence dh_missing.
+ * Build-related changes:
+ + Run dh_missing --list-missing during build.
+ + Suppress compiler warning by explicitly defining the following CXX flag:
+ -DLLVM_DISABLE_ABI_BREAKING_CHECKS_ENFORCING=0
+ * JULIA_CPU_CORES is deprecated in favor of JULIA_CPU_THREADS.
+ * Fix shebang and mode bits during dh_fixperms:
+ 's@#!/usr/bin/env julia@#!/usr/bin/julia@g'
+ * Delete the deprecated make flag USE_SYSTEM_ARPACK.
+ * Policy-related changes:
+ + Update SOVERSION in lintian overrides
+ * julia-common: Override one more lintian Info:
+ package-contains-documentation-outside-usr-share-doc
+ * Miscellaneous changes:
+ - Delete comment from the watch file.
+ + Install an example config which tweaks the default prompt string.
+ * Update copyright for Julia 0.7.X and embedded sources.
+ * autopkgtest: Remove the arpack-sanity test, because arpack was split out
+ from Julia 0.7.X by upstream.
+
+ -- Mo Zhou <cdluminate@gmail.com> Mon, 23 Jul 2018 08:52:51 +0000
+
+julia (0.6.4-2) unstable; urgency=medium
+
+ * Let julia depend on libjulia0.6 (= ${binary:Version})
+ * libjulia0.6 breaks julia (<< 0.5.0~) due to file overwrite.
+ This fixes stable->sid upgrade. (Closes: #903498)
+ * Remove unused patch version-git.patch since we have NO_GIT=1 in rules.
+ * Fill in patches' header with DEP-3 alike information.
+
+ -- Mo Zhou <cdluminate@gmail.com> Wed, 11 Jul 2018 09:23:52 +0000
+
+julia (0.6.4-1) unstable; urgency=medium
+
+ [ Mo Zhou ]
+ * New upstream version 0.6.4
+ * Refresh patches after importing 0.6.4 .
+ * Replace the placeholder in watch file, which was left unchanged by mistake.
+ * Import embedded copy of libuv (checksum identical to upstream's).
+ libuv-d8ab1c6a33e77bf155facb54215dd8798e13825d.tar.gz
+ + Register the new embedded tarball in source/include-binaries
+ * Refresh copyright for the tarball.
+ * Deprecate the get-orig-tarball target in rules.
+ * Split the autopkgtest test command into individual script.
+ * Replace the old do-not-download-libuv.patch patch with a new one.
+ * Patch: Let jldownload be verbose and download from file:// URI.
+ + debian/patches/jldownload-verbose-fakedownload.patch
+ * rules: Don't install any .gitignore file!
+ * rules: Update the TAGGED_RELEASE_BANNER to avoid confusion.
+ * Update the matching pattern in lintian overrides.
+ * Patch: disable unversioned-system-load-path.patch .
+ * Autopkgtest: One more script to test arpack sanity.
+
+ [ Graham Inggs ]
+ * Disable unaligned access on armhf
+ * Enable ARM NEON extensions, since #842142 is fixed
+
+ -- Mo Zhou <cdluminate@gmail.com> Tue, 10 Jul 2018 16:08:48 +0000
+
+julia (0.6.3-6) unstable; urgency=medium
+
+ * Bump B-D on libutf8proc-dev to (>= 2.1.1) and enable
+ unicode/utf8proc test, since #902902 is fixed
+ * Drop libjulia0.6's explicit dependency on libgit2-26
+
+ -- Graham Inggs <ginggs@debian.org> Mon, 09 Jul 2018 20:29:42 +0000
+
+julia (0.6.3-5) unstable; urgency=medium
+
+ * Switch B-D LLVM version from 3.9 -> 4.0 .
+ * Skip more tests in test/libgit2.jl to avoid FTBFS.
+
+ -- Mo Zhou <cdluminate@gmail.com> Mon, 09 Jul 2018 12:29:34 +0000
+
+julia (0.6.3-4) unstable; urgency=medium
+
+ * Don't strip sys.so and libjulia.so as suggested upstream. This fixes
+ several tests that failed during autopkgtest but not during the build.
+ https://github.com/JuliaLang/julia/issues/23115#issuecomment-320715030
+ * Override lintian E: libjulia0.6: unstripped-binary-or-object.
+ * Enable more tests from libgit2.jl and replcompletions.jl .
+ * Let libjulia0.6 depend on libgit2-27 (>= 0.27.0+dfsg.1-0.2) explicitly.
+ * Add missing symlinks to libmbedcrypto and libmbedx509 .
+ * Drop unneeded B-D libarpack2-dev, libjs-mathjax.
+ Drop libjs-underscore from julia-doc Depends.
+ * Revert "control: Stick to unicode-data version 11 to avoid surprise."
+ * Loosen libgit2 requirement to (>= 0.26.0+dfsg.1-1.1) .
+ * Explicitly Build-Depends on libmbedtls-dev.
+ * Upload to unstable.
+
+ -- Mo Zhou <cdluminate@gmail.com> Mon, 09 Jul 2018 09:08:35 +0000
+
+julia (0.6.3-3) experimental; urgency=medium
+
+ * rules: Wrap-and-sort the build flags.
+ * Fix a typo in shlibdeps.mk which could cause dh_link failure.
+ * Update shlibdeps.mk for libuv1 .
+ * rules: Comment that we cannot use the libuv1 provided in archive.
+ * Skip test "replcompletions" to avoid FTBFS.
+ See: https://github.com/JuliaLang/julia/issues/27958
+ * Bump Standards-Version to 4.1.5 (no change).
+ * Add the missing B-D libcurl4-gnutls-dev | libcurl-dev .
+
+ -- Mo Zhou <cdluminate@gmail.com> Sat, 07 Jul 2018 08:39:35 +0000
+
+julia (0.6.3-2) experimental; urgency=medium
+
+ * Deal with symlinks for embedded julia documenter in a more graceful way.
+ * Remove unused symlinks shipped in julia-doc .
+ * Install upstream NEWS, HISTORY, DISTRIBUTING, etc to julia.
+ * Merge debian/NOTES into debian/README.Debian
+ * Clean up old/unused parts in rules.
+ * Stick to unicode-data version 11 to avoid surprise.
+ * Update shlibdeps.mk according to upstream Make.inc .
+ * Move all private shared object files to package libjulia0.6 .
+ * Don't generate symbols for the private libarpack.so .
+ * Add a custom option to enable building Julia with MKL.
+
+ -- Mo Zhou <cdluminate@gmail.com> Fri, 06 Jul 2018 11:18:49 +0000
+
+julia (0.6.3-1) experimental; urgency=medium
+
+ [ Peter Colberg ]
+ * Update Vcs-* fields for move to salsa.debian.org
+ * New upstream version 0.6.0
+ * Refresh patches
+ * Move manpages and examples back to julia package
+ * Add libjulia0.6 with runtime library
+ * Add libjulia-dev with runtime headers
+ * Build-Depends on llvm-3.9-dev
+ * Drop Build-Depends on python3-sphinx (Closes: #896622)
+ * Build-Depends on libgit2-dev (>= 0.23)
+ * Build-Depends on libutf8proc-dev (>= 2.1.0)
+ * Substitute deprecated parameters in autopkgtest command
+ * Update debian/copyright
+ * Drop embedded copy of Rmath from upstream tarball
+ * Drop embedded copy of juliadoc from upstream tarball
+
+ [ Mo Zhou ]
+ * Add myself to Uploaders.
+ * Refresh patch after rebasing Peter's work to master (0.4.7-7).
+ * New upstream version 0.6.3 (Closes: #839668)
+ * Refresh existing patches based on Julia 0.6.3 .
+ * Household changes:
+ * Bump Standards-Version to 4.1.4 .
+ * Change Priority from extra to optional. (4.0.0 -> 4.0.1)
+ * Bump debhelper compat level to 11 .
+ * Remove --parallel option in favor of debhelper compat 11.
+ * Embed several convenient code copies to debian/embedded/ :
+ * These embedded Julia package copies are used to build Julia doc.
+ + Compat.jl-0.69.0
+ + DocStringExtensions.jl-0.4.4
+ + Documenter.jl-0.18.0
+ * The embedded arpack source is used to avoid #902914.
+ + arpack-ng-3.3.0.tar.gz
+ * source: don't complain about binary files from embedded source.
+ * Patch upstream doc to avoid downloading anything.
+ + Build-Depends on unicode-data. (used during doc build)
+ + Prepare symlinks for embedded julia packages during dh_auto_configure.
+ + Patch upstream makefile to prevent it from downloading arpack source.
+ * Add upstream metadata including citation information.
+ * Upgrade watch file to uscan version 4.
+ * Move libjulia0.6 to libs section.
+ * Requires libgit2-dev >= 0.27.0+dfsg.1-0.5 for Build-Depends.
+ The specified version ships working TLS support.
+ * Add symbols control file for libjulia0.6 .
+ * rules: Don't use system Arpack. It causes "eigs(spdiagm(1:25))" failure.
+ * Patch upstream tester to skip utf8proc and libgit2 tests.
+ * Add missing Build-Depends for embedded arpack.
+ * Switch the default downloader dependency from wget to curl.
+ See https://github.com/JuliaLang/julia/issues/22783 . wget doesn't
+ throw the expected error, which would cause test failure.
+ * Patch embedded documenter code to prevent privacy-breach-generic .
+ + Replace google font css URL with customized css to avoid privacy breach.
+ + Use the "Inconsolata" font instead of "Roboto Mono".
+ + Document package depends on inconsolata font.
+ * Move AppStream xml file to new location /usr/share/metainfo .
+ * Patch upstream's outdated appstream xml file to prevent lintian Error.
+ * Update HTML documentation registration path in doc-base.
+ * Don't ship debug files e.g. libccalltest.so.debug .
+ * Don't trigger ldconfig for binary package "julia" because it ships libs
+ in private directory. This is accomplished by appending --no-scripts
+ option to dh_makeshlibs . An ldconfig trigger is manually added for
+ "libjulia0.6" package.
+ * Export HOME environment variable to really fix the mkdir permission issue.
+ * Mark symbol jl_cpuid as (amd64, i386)-only.
+ * Add NOTES to debian/ directory. (MKL is causing test failure)
+ * Note related to LLVM-4.0 : julia-0.6.3, built with llvm-4.0, is able to
+ pass the tests. However, llvm-3.9 is still preferred since upstream
+ sticks to llvm-3.9 .
+ * Update copyright for Julia 0.6.3 and embedded sources.
+ * Upload to experimental.
+
+ -- Mo Zhou <cdluminate@gmail.com> Thu, 05 Jul 2018 09:26:56 +0000
+
+julia (0.4.7-7) unstable; urgency=medium
+
+ * Add missing documentation option to fix build with Sphinx 1.5
+ * Switch to debhelper 10
+ * Use https in debian/control and debian/copyright
+ * Bump Standards-Version to 4.0.0
+ * Drop override_dh_strip-arch, ddeb migration is complete
+ * Fix more Lintian warnings spelling-error-in-binary
+ * Mark all binary packages Multi-Arch: foreign
+ * Enable all hardening flags
+
+ -- Graham Inggs <ginggs@debian.org> Wed, 12 Jul 2017 14:50:13 +0200
+
+julia (0.4.7-6) unstable; urgency=medium
+
+ * Use openlibm instead of libm on mips, mips64el and mipsel
+
+ -- Graham Inggs <ginggs@debian.org> Wed, 25 Jan 2017 07:39:32 +0200
+
+julia (0.4.7-5) unstable; urgency=medium
+
+ * Use openlibm instead of libm on armhf and ppc64
+ * Use openblas instead of blas and lapack on mips64el and ppc64
+ * Update debian/copyright
+ * Do not override ARM options in Make.inc
+ * Do not print warning if unable to determine host CPU name
+ * Add src/*.dwo to debian/clean
+ * Explicitly set USE_SYSTEM_LIBM where needed
+
+ -- Graham Inggs <ginggs@debian.org> Mon, 23 Jan 2017 08:29:33 +0200
+
+julia (0.4.7-4) unstable; urgency=medium
+
+ * Use DEB_VENDOR in place of lsb_release -si
+
+ -- Peter Colberg <peter@colberg.org> Mon, 02 Jan 2017 22:28:51 -0500
+
+julia (0.4.7-3) unstable; urgency=medium
+
+ * Set TAGGED_RELEASE_BANNER to distribution and source package version
+ (Closes: #849815)
+
+ -- Peter Colberg <peter@colberg.org> Sat, 31 Dec 2016 13:43:20 -0500
+
+julia (0.4.7-2) unstable; urgency=medium
+
+ * Ensure JULIA_CPU_CORES >= 2 to respawn test workers reaching memory limit
+ (Closes: #848506)
+
+ -- Peter Colberg <peter@colberg.org> Tue, 20 Dec 2016 23:57:28 -0500
+
+julia (0.4.7-1) unstable; urgency=medium
+
+ * New upstream release
+ * Refresh patches
+ * Drop install-sh-exit-status.patch, applied upstream
+ * Drop verbose-build.patch since libuv is already built with V=1
+ * Drop do-not-use-home-directory-in-tests.patch in favour of setting HOME
+ * Drop disable-download-test.patch since wget is not actually invoked
+ * Remove unused lintian override configure-generated-file-in-source
+ * Add debian/gbp.conf for pristine-tar
+
+ -- Peter Colberg <peter@colberg.org> Sun, 18 Sep 2016 23:55:05 -0400
+
+julia (0.4.6-1) unstable; urgency=medium
+
+ [ Peter Colberg ]
+ * New upstream release
+ * Refresh patches
+ * Drop unneeded build dependency on libdouble-conversion-dev
+ * Bump Standards-Version to 3.9.8, no further changes
+
+ [ Graham Inggs ]
+ * Use libopenlibm instead of libm on arm64
+ * Fix inconsistent use of GNU_SOURCE in embedded libuv (Closes: #748573)
+ * Drop arm-rec_backtrace.patch, no longer needed since ARM ABI backport
+
+ -- Peter Colberg <peter@colberg.org> Thu, 23 Jun 2016 22:20:54 -0400
+
+julia (0.4.5-3) unstable; urgency=medium
+
+ * Disable ARM NEON extensions. (Closes: #820220)
+
+ -- Graham Inggs <ginggs@debian.org> Sun, 17 Apr 2016 13:35:42 +0200
+
+julia (0.4.5-2) unstable; urgency=medium
+
+ * Make rec_backtrace() always return 0 on ARM and PPC64,
+ this avoids a FTBFS on armhf with recent GCC.
+
+ -- Graham Inggs <ginggs@debian.org> Thu, 07 Apr 2016 14:32:28 +0200
+
+julia (0.4.5-1) unstable; urgency=medium
+
+ * New upstream release.
+ * Refresh patches, new fix-spelling-error-in-binary.patch.
+ * Use libopenlibm instead of libm on powerpc and ppc64el.
+
+ -- Graham Inggs <ginggs@debian.org> Wed, 30 Mar 2016 17:56:56 +0200
+
+julia (0.4.3-4) unstable; urgency=medium
+
+ * Drop versioned build dependency on llvm-3.8-dev and
+ build dependencies on libllvm3.x (no longer needed).
+ * Upload to unstable.
+
+ -- Graham Inggs <ginggs@debian.org> Sun, 13 Mar 2016 11:06:39 +0200
+
+julia (0.4.3-3) experimental; urgency=medium
+
+ * Build depend on libllvm3.x as well (experimental).
+
+ -- Graham Inggs <ginggs@debian.org> Mon, 08 Feb 2016 13:06:53 +0200
+
+julia (0.4.3-2) experimental; urgency=medium
+
+ * Optionally build depend on llvm-3.8-dev (>= 1:3.8~+rc1).
+ * Add debian/clean to fix FTBFSx2.
+ * Migrate from julia-dbg to ddebs, bump debhelper build-dependency.
+ * Bump Standards-Version to 3.9.7, no further changes.
+ * Enable tests on i386 again.
+ * Improve generated debug info to fix FTBFS on i386
+ (see upstream issue #13754).
+
+ -- Graham Inggs <ginggs@debian.org> Mon, 08 Feb 2016 10:02:00 +0200
+
+julia (0.4.3-1) unstable; urgency=medium
+
+ [ Graham Inggs ]
+ * Ensure pcre_h.jl and errno_h.jl are sorted reproducibly.
+
+ [ Peter Colberg ]
+ * Imported Upstream version 0.4.3
+ * Refresh patches.
+ * Drop patch fix-arm64-ftbfs.patch, no longer needed.
+ * Update Vcs-Git and Vcs-Browser fields.
+ * Fix lintian warning dh-exec-useless-usage.
+ * Fix lintian warning spelling-error-in-binary.
+
+ -- Peter Colberg <peter@colberg.org> Thu, 14 Jan 2016 07:56:18 -0500
+
+julia (0.4.2-3) unstable; urgency=medium
+
+ * Fix FTBFS on arm64 (thanks to Edmund Grimley Evans). (Closes: #807701)
+ * Disable tests on i386 until performance issue with LLVM >= 3.4
+ is resolved (see upstream issue #14191).
+
+ -- Graham Inggs <ginggs@debian.org> Sat, 12 Dec 2015 11:27:26 +0200
+
+julia (0.4.2-2) unstable; urgency=medium
+
+ * Set number of parallel workers for tests.
+ + Restart workers exceeding maximum resident memory size of 500 MiB.
+ * Drop build depends on llvm-3.8-dev to permit migration to testing.
+ (Closes: #803644)
+
+ -- Peter Colberg <peter@colberg.org> Thu, 10 Dec 2015 06:50:01 -0500
+
+julia (0.4.2-1) unstable; urgency=medium
+
+ * Imported Upstream version 0.4.2
+ * Refresh patches.
+ * Upload to unstable. (Closes: #803644)
+
+ -- Peter Colberg <peter@colberg.org> Mon, 07 Dec 2015 06:49:01 -0500
+
+julia (0.4.1-2) experimental; urgency=medium
+
+ * Build depend on libpcre2-dev >= 10.20-3~ to fix FTBFS on ppc64el.
+ * Optionally build depend on llvm-3.8-dev to reduce performance regression.
+ * Optionally build depend on llvm-3.6-dev to ease backporting.
+
+ -- Peter Colberg <peter@colberg.org> Wed, 02 Dec 2015 07:26:24 -0500
+
+julia (0.4.1-1) experimental; urgency=medium
+
+ * Imported Upstream version 0.4.1
+ * Refresh patches.
+ * Update debian/copyright.
+ * Build depend on libsuitesparse-dev >= 1:4.4.5
+ * Build depend on llvm-3.7-dev.
+ * Build depend on libpcre2-dev (thanks to Matthew Vernon).
+ * Disable libgit2 test to avoid dependency on libgit2.
+ * Disable download test to avoid dependency on curl or wget.
+ * Do not use home directory in tests.
+ * Fix backtrace test with MCJIT.
+ * Fix documentation spelling errors.
+ * Install examples to doc directory in place of symlink.
+
+ -- Peter Colberg <peter@colberg.org> Wed, 25 Nov 2015 08:00:20 -0500
+
+julia (0.3.12-2) unstable; urgency=medium
+
+ [ Peter Colberg ]
+ * Fix automated package testing with autopkgtest.
+ * Test REPL in dumb mode.
+ * Build depend on libopenlibm-dev (>= 0.4.1+dfsg-4~) to fix FTBFS on i386.
+ * Fix arch-all-only build with dpkg-buildpackage -A.
+
+ [ Sébastien Villemot ]
+ * Remove myself from Uploaders.
+
+ -- Graham Inggs <ginggs@debian.org> Wed, 25 Nov 2015 10:46:18 +0200
+
+julia (0.3.12-1) unstable; urgency=medium
+
+ [ Peter Colberg ]
+ * Imported Upstream version 0.3.12
+ * Add julia-common package with standard library and test suite.
+ * Remove embedded libraries:
+ + Build depend on libdsfmt-dev.
+ + Build depend on libutf8proc-dev.
+ * Generate version_git.jl from upstream commit.
+ * Build documentation using python3-sphinx.
+ * Fix potential privacy breach in documentation:
+ + Use libjs-modernizr package in place of external link.
+ + Strip external link to Google Fonts API.
+ * Query SSE2 extension on i386 using x86 CPUID opcode.
+ * Do not query ldconfig for library sonames.
+ * Generate package dependencies for dynamically loaded libraries.
+ * Symlink dynamically loaded libraries to private library path.
+ * Add missing examples test for Base.runtests().
+ * Fix hanging socket test for Base.runtests().
+ * Enable parallel test.
+
+ [ Graham Inggs ]
+ * Build in the multiarch lib directories as well,
+ so that patchelf is no longer required. (Closes: #799099)
+ * Build everywhere. (Closes: #802583)
+ + Use libopenlibm where available, libm elsewhere.
+ + Use libopenblas where available, libblas and liblapack elsewhere.
+ + Cherry-pick patches from upstream 0.4.0 for armhf and ppc64el.
+
+ -- Graham Inggs <ginggs@debian.org> Fri, 13 Nov 2015 15:56:34 +0200
+
+julia (0.3.11-1) unstable; urgency=medium
+
+ * Imported Upstream version 0.3.11
+ * d/p/mcjit-llvm-ftbfs.patch: dropped, applied upstream.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 06 Sep 2015 18:51:09 +0200
+
+julia (0.3.10-1) unstable; urgency=medium
+
+ * Imported Upstream version 0.3.10
+ * d/p/inject-ldflags.patch: drop patch, no longer needed.
+ * d/p/mcjit-llvm-ftbfs.patch: new patch, taken from upstream.
+
+ -- Sébastien Villemot <sebastien@debian.org> Fri, 17 Jul 2015 23:14:01 +0200
+
+julia (0.3.9-1) unstable; urgency=medium
+
+ * Imported Upstream version 0.3.9
+ * repl-test.patch: new patch, prevents a test failure.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 21 Jun 2015 14:57:41 +0200
+
+julia (0.3.8-1) unstable; urgency=medium
+
+ * Imported Upstream version 0.3.8.
+ For the time being, manually embed juliadoc python package.
+ In the longer run, a separate Debian package should be created.
+ * sphinx-build.patch: new patch, needed for building doc without virtualenv.
+ * unicode-test.patch: new patch to workaround a test failure in unicode.jl.
+ * Set a hard dependency on OpenBLAS. (Closes: #778912)
+ * Ship tests in main package, so that they can be run on the compiled binary.
+ * Add autopkgtest (DEP8) support.
+
+ -- Sébastien Villemot <sebastien@debian.org> Mon, 25 May 2015 15:48:34 +0200
+
+julia (0.3.5-1) experimental; urgency=low
+
+ * Imported Upstream version 0.3.5. (Closes: #776069)
+
+ -- Sébastien Villemot <sebastien@debian.org> Fri, 13 Feb 2015 23:31:19 +0100
+
+julia (0.3.2-1) unstable; urgency=medium
+
+ * Imported Upstream version 0.3.2
+ * Ship new desktop file, SVG icon and appdata file.
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 22 Oct 2014 18:42:04 +0200
+
+julia (0.3.1-1) unstable; urgency=medium
+
+ * Imported Upstream version 0.3.1
+ * Fix get-orig-source rule with respect to embedded utf8proc.
+ * No longer embed sphinx-rtd-theme.
+ + Add build-dependency on python-sphinx-rtd-theme.
+ + Drop patch embed-sphinx-rtd-theme.patch.
+ * Bump Standards-Version to 3.9.6, no changes needed.
+ * No longer try the "parallel" test at build time.
+ It always fails in chroots.
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 24 Sep 2014 11:56:26 +0200
+
+julia (0.3.0-1) unstable; urgency=medium
+
+ * New upstream release.
+ - no longer breaks on PCRE upgrades. (Closes: #755576)
+ - command-line option "-p" works as expected. (Closes: #758783)
+ * Rewrite get-orig-source in d/rules. In particular no longer embed openlibm,
+ since it is now a separate package.
+ * debian/copyright: reflect upstream changes.
+ * New patches:
+ + do-not-download-utf8proc.patch
+ + inject-ldflags.patch
+ + install-sh-exit-status.patch
+ * Dropped patches:
+ + do-not-download-patchelf.patch (instead build depend on patchelf)
+ + fix-cpu-detection.patch
+ + ld-library-path-for-testing.patch
+ + make-4.0.patch
+ + no-git.patch (instead use "make -C base version_git.jl.phony")
+ + readline-6.3.patch (Julia no longer uses readline)
+ + sysconfdir-install.patch (instead use new make variable)
+ + use-sonames-with-dlopen.patch (not really needed, too difficult to
+ maintain)
+ * Ship the cached binary system image (sys.so).
+ * Compile with MARCH=x86-64 on amd64, and with MARCH=pentium4. In particular,
+ this means that SSE2 is now required on i386, because x87 FPU computing is
+ not supported by upstream and is buggy. Abort nicely if the CPU does not
+ have SSE2, and also activate SSE2 in dSFMT (require-sse2-on-i386.patch).
+ As a consequence, drop the now unneeded testsuite-i386.patch.
+ * Bump to LLVM 3.5. (Closes: #753971)
+ * Add libopenblas-base to Recommends. Having it first in the BLAS
+ dependency alternative is not enough to ensure that it is installed by
+ default.
+ * Use OpenBLAS for both BLAS and LAPACK, since the Debian package now ships
+ both.
+ * Documentation package (julia-doc):
+ + use dh_sphinxdoc
+ + use packaged Mathjax instead of online version (with
+ use_packaged_mathjax.patch, symbolic links in d/julia-doc.links and
+ appropriate logic in d/rules)
+ + use packaged awesome font
+ + embed sphinx-rtd-theme, since the corresponding package has not yet been
+ accepted in sid (embed-sphinx-rtd-theme.patch)
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 20 Aug 2014 10:51:46 +0000
+
+julia (0.2.1+dfsg-3) unstable; urgency=medium
+
+ * make-4.0.patch: fix FTBFS against make 4.0.
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 07 May 2014 15:17:37 +0200
+
+julia (0.2.1+dfsg-2) unstable; urgency=medium
+
+ * readline-6.3.patch: new patch, fixes FTBFS against readline 6.3.
+ (Closes: #741824)
+ * Restrict supported archs to amd64 and i386; it never compiled elsewhere.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 16 Mar 2014 16:25:21 +0100
+
+julia (0.2.1+dfsg-1) unstable; urgency=medium
+
+ * New upstream release.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sat, 15 Feb 2014 21:31:41 +0100
+
+julia (0.2.0+dfsg-6) unstable; urgency=medium
+
+ * Transition to libunwind8-dev. (Closes: #730464)
+
+ -- Sébastien Villemot <sebastien@debian.org> Sat, 01 Feb 2014 10:18:04 +0100
+
+julia (0.2.0+dfsg-5) unstable; urgency=low
+
+ * Transition to suitesparse 4.2.1.
+
+ -- Sébastien Villemot <sebastien@debian.org> Mon, 02 Dec 2013 18:38:37 +0100
+
+julia (0.2.0+dfsg-4) unstable; urgency=low
+
+ * Make the parallel.jl test non-fatal.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 24 Nov 2013 15:14:50 +0100
+
+julia (0.2.0+dfsg-3) unstable; urgency=low
+
+ * testsuite-i386.patch: loosen the numerical precision for yet another test.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 17 Nov 2013 19:32:52 +0100
+
+julia (0.2.0+dfsg-2) unstable; urgency=low
+
+ * testsuite-i386.patch: new patch, fixes FTBFS on i386.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 17 Nov 2013 17:51:13 +0100
+
+julia (0.2.0+dfsg-1) unstable; urgency=low
+
+ * New upstream release.
+ * debian/copyright: reflect upstream changes
+ * Update patches:
+ + remove patches applied upstream:
+ - suitesparse-3.4.patch
+ - testsuite-i386.patch
+ + remove fhs.patch, and replace it by the SYSCONFDIR build option.
+ + sysconfdir-install.patch: new patch to make the SYSCONFDIR option work
+ for us.
+ + refresh other patches.
+ * Bump Standards-Version to 3.9.5, no changes needed.
+ * Dependency of julia-dbg on julia is now versioned.
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 17 Nov 2013 12:10:10 +0100
+
+julia (0.2.0~rc2+dfsg-2) unstable; urgency=low
+
+ * Use (older) suitesparse 3.4.
+ + d/control: downgrade (build-)dependencies
+ + use-sonames-with-dlopen.patch: update sonames
+ + suitesparse-3.4.patch: new patch
+ * Downgrade build-dependency to libunwind7-dev (was libunwind8-dev).
+ libunwind8-dev is unlikely to merge to testing soon.
+ * unversioned-system-load-path.patch: new patch.
+ Drops version number from system load path. Versioning unnecessary since
+ this path is managed by dpkg/apt. Moreover, it would make transitions to
+ higher versions needlessly complicated.
+ * testsuite-i386.patch: new patch, fixes testsuite crash on i386.
+ * Use canonical URL for Vcs-* fields
+
+ -- Sébastien Villemot <sebastien@debian.org> Sun, 03 Nov 2013 16:00:08 +0100
+
+julia (0.2.0~rc2+dfsg-1) experimental; urgency=low
+
+ * New upstream release candidate
+ * Link dynamically against LLVM, this seems to now work correctly
+
+ -- Sébastien Villemot <sebastien@debian.org> Sat, 26 Oct 2013 04:33:44 +0000
+
+julia (0.2.0~rc1+dfsg-1) experimental; urgency=low
+
+ * New upstream release candidate
+ * Removed patches:
+ + do-not-download-jquery.patch
+ + fix-version.patch
+ + no-webrepl.patch
+ + suitesparse-3.4.patch
+ + use-system-double-conversion.patch
+ + zlib-1.2.8.patch
+ * Added patches:
+ + fhs.patch
+ + no-debug-version.patch
+ + verbose-build.patch
+ * Ship upstream manpage
+ * Ship NEWS.md
+ * Build depend on llvm-3.3-dev
+ * Build depend on libsuitesparse-dev >= 1:4.2.1
+ * Stop distributing PDF documentation, it currently does not build
+ * debian/copyright: reflect upstream changes
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 16 Oct 2013 15:54:13 +0200
+
+julia (0.1.2+dfsg-3) unstable; urgency=low
+
+ * Bump the B-D on dpkg-dev.
+ Support for source:Package and source:Version fields was added in dpkg-dev
+ 1.16.2 (Closes: #706470)
+ * Add support for DEB_BUILD_OPTIONS=nocheck (Closes: #706472)
+ * Add julia-dbg package
+ * Fix FTBFS with zlib >= 1.2.8 (Closes: #707962)
+ - zlib-1.2.8.patch: new patch, fixes the gzip.jl test
+ - tighten B-D on zlib1g-dev to >= 1:1.2.8
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 15 May 2013 12:44:14 +0200
+
+julia (0.1.2+dfsg-2) unstable; urgency=low
+
+ * Statically link against LLVM.
+ With dynamic linking, strange bugs appear when the runtime library is not
+ the same as the one used for building. At this stage it is not clear
+ whether it is a Julia or LLVM bug. Reported as Julia issue #2494.
+ - use-shared-llvm.patch: remove patch
+ - add Built-Using field for julia binary package
+ - keep a dependency of julia on libllvm3.2, since strpack.jl dlopen's it
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 03 Apr 2013 17:03:39 +0200
+
+julia (0.1.2+dfsg-1) unstable; urgency=low
+
+ * Imported Upstream version 0.1.2+dfsg.
+ Contains hotfix for package manager.
+ * debian/patches/fix-version.patch: new patch
+ * Refresh other patches
+
+ -- Sébastien Villemot <sebastien@debian.org> Thu, 07 Mar 2013 20:50:18 +0100
+
+julia (0.1.1+dfsg-1) unstable; urgency=low
+
+ * Imported Upstream version 0.1.1+dfsg
+ * debian/copyright: reflect upstream changes
+ * Refresh patches
+ * Disable tk-wrapper, since it is no longer built upstream
+ * Update README.Debian
+
+ -- Sébastien Villemot <sebastien@debian.org> Thu, 07 Mar 2013 12:14:59 +0100
+
+julia (0.1+dfsg-1) unstable; urgency=low
+
+ * First upstream release!
+ * debian/copyright: document how to recreate orig tarball
+ * Add a debian/watch file
+ * d/rules, d/p/no-git.patch: adapt for numbered releases.
+ In particular, add a COMMITSHA file in the orig tarball containing the
+ SHA of the release tag.
+ * Promote zlib1g and libarpack2 to Depends
+ * Fix typo in manpage
+
+ -- Sébastien Villemot <sebastien@debian.org> Thu, 14 Feb 2013 12:00:05 +0100
+
+julia (0.1~20130213.git4bc33bbc-1) unstable; urgency=low
+
+ * New upstream release candidate.
+ Hopefully obviates the need for ugly hacks in libuv in order to build on
+ build daemons.
+ * Remove obsolete patches:
+ + bump-version-0.1.patch
+ + disable-final-uv-loop.patch
+ + revert-stdin-file-iostream.patch
+ * Refresh other patches
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 13 Feb 2013 10:58:23 +0100
+
+julia (0.1~20130212.gitf375d4bb-1) unstable; urgency=low
+
+ * New upstream snapshot.
+ * no-git.patch: simplify and improve patch
+ * gsvddense_blasint.patch: remove patch, applied upstream
+ * New patches:
+ + bump-version-0.1.patch
+ + disable-final-uv-loop.patch
+ + revert-stdin-file-iostream.patch
+ * Refresh other patches
+
+ -- Sébastien Villemot <sebastien@debian.org> Tue, 12 Feb 2013 12:02:24 +0100
+
+julia (0.1~20130211.git86fbe98d-1) unstable; urgency=low
+
+ * New upstream snapshot.
+ * debian/control:
+ + add git to Recommends (for Julia package manager)
+ + remove dependencies on libglpk-dev (it moved to its own package)
+ + add explicit dependency on libgmp10 (there is no more a wrapper)
+ * fix-clean-rules.patch: remove patch, applied upstream
+ * gsvddense_blasint.patch: new patch
+ * Refresh other patches
+
+ -- Sébastien Villemot <sebastien@debian.org> Mon, 11 Feb 2013 03:51:26 +0100
+
+julia (0.0.0+20130206.git32ff5759-1) unstable; urgency=low
+
+ * New upstream snapshot.
+ * debian/copyright: reflect upstream changes
+ * debian/rules: update get-orig-source to reflect upstream changes
+ + Don't ship nginx
+ + Adapt for new configure-random target in deps/Makefile
+ * Enable build of Tk wrapper.
+ + debian/control: add build dependency on tk-dev
+ + debian/rules: add tk rule to build-arch
+ * debian/julia.install: install VERSION and COMMIT files
+ * no-webrepl.patch: new patch
+ * Refresh other patches
+ * Add source override for config.status file under deps/random/
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 06 Feb 2013 17:54:29 +0100
+
+julia (0.0.0+20130107.gitd9656f41-2) unstable; urgency=low
+
+ * fix-cpu-detection.patch: do not use -momit-leaf-frame-pointer on non-x86
+ archs
+ * Upgrade to LLVM 3.2.
+ + debian/control: bump build-dependency
+ + debian/rules: use llvm-config-3.2
+ + debian/patches/use-shared-llvm.patch
+ debian/patches/use-sonames-with-dlopen.patch: update patches
+
+ -- Sébastien Villemot <sebastien@debian.org> Sat, 19 Jan 2013 15:33:43 +0100
+
+julia (0.0.0+20130107.gitd9656f41-1) unstable; urgency=low
+
+ * New upstream snapshot
+ * No longer try to rebuild helpdb.jl.
+ + debian/rules: remove helpdb.jl from build-arch rule
+ + debian/control: move back python-sphinx to Build-Depends-Indep
+ * debian/copyright: reflect upstream changes
+ * Add Build-Conflicts on libatlas3-base (makes linalg tests fail)
+ * debian/rules: replace obsolete USE_DEBIAN makeflag by a list of
+ USE_SYSTEM_* flags
+ * debian/rules: on non-x86 systems, use libm instead of openlibm
+ * dpkg-buildflags.patch: remove patch, applied upstream
+ * Refreshed other patches
+
+ -- Sébastien Villemot <sebastien@debian.org> Wed, 16 Jan 2013 12:29:42 +0100
+
+julia (0.0.0+20121214.gitdced1f7-1) unstable; urgency=low
+
+ * New upstream snapshot.
+ * debian/copyright: reflect upstream changes
+ * Embedded libuv no longer uses libc-ares and libev.
+ + debian/control: remove them from build-dependencies
+ + debian/rules: no longer strip them from upstream tarball
+ + use-system-{ev,c-ares}.patch: remove patches
+ * Remove other patches merged upstream or no longer necessary
+ + fhs-multiarch.patch
+ + linalg-test-tolerance.patch
+ + remove-rpath.patch
+ + verbose-build.patch
+ * New patches
+ + do-not-download-patchelf.patch
+ + ld-library-path-for-testing.patch
+ + dpkg-multiarch.patch
+ + libjulia-release-drop-soname.patch
+ * Refresh other patches
+ * debian/rules:
+ + compile with MULTIARCH_INSTALL=1
+ + build helpdb.jl as part of the build-arch rule
+ + abort on failures in extra tests
+ * debian/control: move python-sphinx to Build-Depends (now used in build-arch)
+
+ -- Sébastien Villemot <sebastien@debian.org> Tue, 18 Dec 2012 14:42:23 +0100
+
+julia (0.0.0+20121102.git63e93f2-1) unstable; urgency=low
+
+ * Initial release. (Closes: #691912)
+
+ -- Sébastien Villemot <sebastien@debian.org> Fri, 02 Nov 2012 16:29:29 +0100
--- /dev/null
+deps/libuv/Makefile
+deps/libuv/config.log
+deps/libuv/config.status
+deps/libuv/libtool
+deps/libuv/libuv.pc
+src/*.dwo
+base/version_git.jl
--- /dev/null
+Source: julia
+Section: science
+Homepage: https://julialang.org
+Priority: optional
+Standards-Version: 4.3.0
+Vcs-Git: https://salsa.debian.org/julia-team/julia.git
+Vcs-Browser: https://salsa.debian.org/julia-team/julia
+Maintainer: Debian Julia Team <pkg-julia-devel@lists.alioth.debian.org>
+Uploaders: Peter Colberg <peter@colberg.org>,
+ Graham Inggs <ginggs@debian.org>,
+ Mo Zhou <cdluminate@gmail.com>,
+Build-Depends:
+ cmake,
+ python3,
+ curl,
+ debhelper (>= 11~),
+ dpkg-dev (>= 1.16.2~),
+ libblas-dev | libblas.so,
+ libcurl4-gnutls-dev | libcurl-dev,
+ libdsfmt-dev (>= 2.2.3),
+ libgit2-dev (>= 0.27.0~),
+ libgmp-dev,
+ liblapack-dev | liblapack.so,
+ libmbedtls-dev,
+ libmpfr-dev,
+ libopenblas-dev [amd64 arm64 armhf i386 kfreebsd-amd64 kfreebsd-i386 mips64el ppc64el s390x],
+ libopenlibm-dev (>= 0.4.1+dfsg-4~) [any-i386 any-amd64 arm64 armhf mips mips64el mipsel powerpc ppc64 ppc64el],
+ libpcre2-dev (>= 10.20-3~),
+ libsuitesparse-dev (>= 1:4.4.5),
+ libunwind8-dev [!s390x],
+ libutf8proc-dev (>= 2.1.1),
+ openssl,
+ unicode-data,
+Build-Depends-Indep:
+ latexmk,
+ texlive,
+ texlive-extra-utils,
+ texlive-fonts-extra,
+ texlive-latex-base,
+ texlive-latex-extra,
+ texlive-latex-recommended,
+ texlive-luatex,
+ texlive-plain-generic,
+ python3-pygments,
+ python3-pkg-resources,
+ fonts-lato,
+
+Package: julia
+Architecture: any
+Multi-Arch: foreign
+Pre-Depends: ${misc:Pre-Depends}
+Depends: julia-common (= ${source:Version}),
+ libjulia1 (= ${binary:Version}),
+ ${misc:Depends},
+ ${shlibs:Depends},
+Replaces: julia-common (<< 0.5.0~)
+Breaks: julia-common (<< 0.5.0~)
+Suggests: ess (>= 12.09-1~), julia-doc, vim-julia,
+Recommends: git, openssl,
+Description: high-performance programming language for technical computing
+ Julia is a high-level, high-performance dynamic programming language for
+ technical computing, with syntax that is familiar to users of other technical
+ computing environments. It provides a sophisticated compiler, distributed
+ parallel execution, numerical accuracy, and an extensive mathematical function
+ library. The library, mostly written in Julia itself, also integrates mature,
+ best-of-breed C and Fortran libraries for linear algebra, random number
+ generation, FFTs, and string processing. Julia programs are organized around
+ defining functions, and overloading them for different combinations of
+ argument types (which can also be user-defined).
+ .
+ This package provides a complete Julia installation (JIT compiler, standard
+ library, text-based user interface).
+
+Package: libjulia1
+Section: libs
+Architecture: any
+Pre-Depends: ${misc:Pre-Depends}
+Replaces: julia (<< 0.5.0~), libjulia0.6, libjulia0.7
+Breaks: julia (<< 0.5.0~), libjulia0.6, libjulia0.7
+Depends: ${misc:Depends}, ${shlibs:Depends},
+ libopenblas-base [amd64 arm64 armhf i386 kfreebsd-amd64 kfreebsd-i386 mips64el ppc64el s390x]
+ | libatlas3-base [amd64 arm64 armel armhf hurd-i386 i386 kfreebsd-amd64 kfreebsd-i386 mips mips64el mipsel ppc64el s390x]
+ | libmkl-rt [amd64 i386],
+ libopenblas-dev [amd64 arm64 armhf i386 kfreebsd-amd64 kfreebsd-i386 mips64el ppc64el s390x]
+ | libatlas-base-dev [amd64 arm64 armel armhf hurd-i386 i386 kfreebsd-amd64 kfreebsd-i386 mips mips64el mipsel ppc64el s390x]
+ | libmkl-dev [amd64 i386],
+Description: high-performance programming language for technical computing (runtime library)
+ Julia is a high-level, high-performance dynamic programming language for
+ technical computing, with syntax that is familiar to users of other technical
+ computing environments. It provides a sophisticated compiler, distributed
+ parallel execution, numerical accuracy, and an extensive mathematical function
+ library. The library, mostly written in Julia itself, also integrates mature,
+ best-of-breed C and Fortran libraries for linear algebra, random number
+ generation, FFTs, and string processing. Julia programs are organized around
+ defining functions, and overloading them for different combinations of
+ argument types (which can also be user-defined).
+ .
+ This package provides the Julia runtime library.
+
+Package: julia-common
+Architecture: all
+Multi-Arch: foreign
+Depends: ${misc:Depends}
+Replaces: julia (<< 0.4.1-1~)
+Breaks: julia (<< 0.4.1-1~)
+Recommends: julia
+Description: high-performance programming language for technical computing (common files)
+ Julia is a high-level, high-performance dynamic programming language for
+ technical computing, with syntax that is familiar to users of other technical
+ computing environments. It provides a sophisticated compiler, distributed
+ parallel execution, numerical accuracy, and an extensive mathematical function
+ library. The library, mostly written in Julia itself, also integrates mature,
+ best-of-breed C and Fortran libraries for linear algebra, random number
+ generation, FFTs, and string processing. Julia programs are organized around
+ defining functions, and overloading them for different combinations of
+ argument types (which can also be user-defined).
+ .
+ This package contains the Julia standard library and test suite.
+
+Package: libjulia-dev
+Section: libdevel
+Architecture: any
+Depends: libjulia1 (= ${binary:Version}), ${misc:Depends}
+Description: high-performance programming language for technical computing (development)
+ Julia is a high-level, high-performance dynamic programming language for
+ technical computing, with syntax that is familiar to users of other technical
+ computing environments. It provides a sophisticated compiler, distributed
+ parallel execution, numerical accuracy, and an extensive mathematical function
+ library. The library, mostly written in Julia itself, also integrates mature,
+ best-of-breed C and Fortran libraries for linear algebra, random number
+ generation, FFTs, and string processing. Julia programs are organized around
+ defining functions, and overloading them for different combinations of
+ argument types (which can also be user-defined).
+ .
+ This package provides the Julia runtime headers.
+
+Package: julia-doc
+Architecture: all
+Multi-Arch: foreign
+Section: doc
+Depends: ${misc:Depends},
+ fonts-font-awesome,
+ fonts-inconsolata,
+ libjs-jquery,
+ libjs-jquery-ui,
+ libjs-mathjax,
+ libjs-highlight.js,
+ libjs-requirejs,
+ libjs-lodash,
+ node-normalize.css,
+ node-highlight.js,
+Suggests: julia
+Description: high-performance programming language for technical computing (documentation)
+ Julia is a high-level, high-performance dynamic programming language for
+ technical computing, with syntax that is familiar to users of other technical
+ computing environments. It provides a sophisticated compiler, distributed
+ parallel execution, numerical accuracy, and an extensive mathematical function
+ library. The library, mostly written in Julia itself, also integrates mature,
+ best-of-breed C and Fortran libraries for linear algebra, random number
+ generation, FFTs, and string processing. Julia programs are organized around
+ defining functions, and overloading them for different combinations of
+ argument types (which can also be user-defined).
+ .
+ This package contains the Julia manual, which describes the language and its
+ standard library. It also contains example Julia programs.
--- /dev/null
+Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
+Upstream-Name: Julia
+Source: https://julialang.org
+Files-Excluded:
+ contrib/windows/7zS.sfx
+
+Files: *
+Copyright: 2009-2016 Jeff Bezanson, Stefan Karpinski, Viral B. Shah
+ and other contributors: https://github.com/JuliaLang/julia/contributors
+License: Expat
+
+Files: base/grisu/bignum.jl
+ base/grisu/bignums.jl
+ base/grisu/fastfixed.jl
+ base/grisu/fastprecision.jl
+ base/grisu/fastshortest.jl
+ base/grisu/float.jl
+Copyright: 2006-2014, the V8 project authors.
+License: BSD-3-clause
+
+Files: base/special/exp.jl
+ base/special/hyperbolic.jl
+ base/special/rem_pio2.jl
+ base/special/trig.jl
+Copyright: 1993, 2004, Sun Microsystems, Inc.
+ 2009-2016 Jeff Bezanson, Stefan Karpinski, Viral B. Shah
+ and other contributors: https://github.com/JuliaLang/julia/contributors
+License: Expat
+
+Files: stdlib/REPL/src/TerminalMenus/*
+Copyright: 2017 Nick Paul
+License: Expat
+
+Files: stdlib/SHA/*
+Copyright: 2014 Elliot Saba
+License: Expat
+
+Files: contrib/julia.appdata.xml
+Copyright: 2014, Paul Lange <palango@gmx.de>
+License: CC-BY-SA-3.0
+
+Files: src/abi_llvm.cpp
+ src/abi_ppc64le.cpp
+ src/abi_win32.cpp
+ src/abi_win64.cpp
+ src/abi_x86.cpp
+ src/abi_x86_64.cpp
+Copyright: 2007-2012, LDC Team.
+License: BSD-3-clause
+
+Files: src/support/END.h
+ src/support/ENTRY.amd64.h
+ src/support/ENTRY.i387.h
+ src/support/_longjmp.win32.S
+ src/support/_setjmp.win32.S
+Copyright: 1990, The Regents of the University of California. / William Jolitz
+License: BSD-3-clause
+
+Files: src/support/MurmurHash3.c
+ src/support/MurmurHash3.h
+Copyright: none
+License: public-domain-murmurhash
+ MurmurHash3 was written by Austin Appleby, and is placed in the public
+ domain. The author hereby disclaims copyright to this source code.
+
+Files: src/support/asprintf.c
+Copyright: 1997 Todd C. Miller <Todd.Miller AT courtesan.com>
+ 2004 Darren Tucker
+License: ISC
+
+Files: src/flisp/flisp.c
+ src/flisp/system.lsp
+Copyright: 2009 Jeff Bezanson
+License: BSD-3-clause
+
+Files: src/support/dirname.c
+Copyright: 2012, 2013 MinGW.org project
+License: Expat
+
+Files: src/getopt.c
+ src/getopt.h
+Copyright: 2005-2014, Rich Felker, et al.
+License: Expat
+
+Files: src/support/strptime.c
+Copyright: 1997-1998, 2005, 2008, The NetBSD Foundation, Inc.
+License: BSD-2-clause
+
+Files: src/disasm.cpp
+Copyright: 2009-2016 Jeff Bezanson, Stefan Karpinski, Viral B. Shah
+License: Expat
+Comment: Original code comes from The LLVM Compiler Infrastructure.
+ Modified by Julia developers.
+
+Files: src/support/strtod.c
+Copyright: 2009-2016 Jeff Bezanson, Stefan Karpinski, Viral B. Shah
+License: Expat
+Comment: Portions derived from the Python function _PyOS_ascii_strtod
+ Copyright 2001-2014 Python Software Foundation
+
+Files: src/crc32c.c
+Copyright: 2013, Mark Adler <madler@alumni.caltech.edu>
+License: Zlib
+
+Files: src/support/tzfile.h
+Copyright: NONE
+License: public-domain-tzfile
+ This file is in the public domain,
+ so clarified as of 1996-06-05 by Arthur David Olson.
+
+Files: deps/gfortblas.c
+Copyright: 2013 JuliaLang Project / Jameson Nash
+License: Expat
+
+Files: deps/valgrind/valgrind.h
+Copyright: 2000-2013, Julian Seward.
+License: BSD-3-clause
+
+Files: deps/patches/llvm-D31524-sovers_4.0.patch
+Copyright: 2017 Rebecca N. Palmer <rebecca_palmer@zoho.com>
+ 2017 Lisandro Damián Nicanor Pérez Meyer <lisandro@debian.org>
+ 2017 Sylvestre Ledru <sylvestre@debian.org>
+License: U-OF-I-BSD-LIKE
+Comment: License copied from Sylvestre Ledru <sylvestre@debian.org>'s
+ copyright file from llvm-toolchain-4.0 source.
+
+Files: debian/*
+Copyright: 2012-2015 Sébastien Villemot <sebastien@debian.org>
+ 2015-2017 Graham Inggs <ginggs@debian.org>
+ 2015-2017 Peter Colberg <peter@colberg.org>
+ 2018 Mo Zhou <cdluminate@gmail.com>
+License: Expat
+
+Files: debian/embedded/Pkg*
+Copyright: 2017 Stefan Karpinski
+ 2017 SimonDanisch
+ 2016 Art Wild
+License: Expat
+
+Files: debian/embedded/DocStringExtensions.jl*/*
+Copyright: 2016 Michael Hatherly
+License: Expat
+
+Files: debian/embedded/Documenter.jl*/*
+Copyright: 2016 Michael Hatherly
+License: Expat
+Comment:
+ Files: debian/embedded/Documenter.jl*/assets/html/search.js
+ Copyright: Steven Levithan <stevenlevithan.com>
+ License: Expat
+
+# Just a tiny library written for Julia. Maybe no need to package separately.
+Files: debian/embedded/libwhich-81e9723c0273d78493dc8c8ed570f68d9ce7e89e.tar.gz
+Copyright: 2017 Jameson Nash
+License: Expat
+
+Files: debian/embedded/llvm*
+Copyright: 2003-2017 University of Illinois at Urbana-Champaign.
+License: U-OF-I-BSD-LIKE
+
+Files: debian/embedded/libuv-ed3700c849289ed01fe04273a7bf865340b2bd7e.tar.gz
+Copyright: Joyent, Inc. and other Node contributors
+ 2013 Ben Noordhuis <info@bnoordhuis.nl>
+ StrongLoop, Inc.
+License: Expat
+Comment: Copyright for files contained in this tarball
+ Files: docs/src/sphinx-plugins/manpage.py
+ Copyright: 2013, Dariusz Dwornikowski.
+ License: Apache-2.0
+ .
+ Files: config.guess
+ config.sub
+ Copyright: 1992-2014, Free Software Foundation, Inc.
+ License: GPL-3+
+ .
+ Files: include/pthread-fixes.h
+ src/unix/pthread-fixes.c
+ Copyright: 2012, Google Inc.
+ 2013, Sony Mobile Communications AB
+ License: BSD-3-clause
+ .
+ Files: Makefile.am
+ Makefile.mingw
+ autogen.sh
+ checksparse.sh
+ configure.ac
+ src/heap-inl.h
+ src/queue.h
+ src/unix/atomic-ops.h
+ src/unix/spinlock.h
+ test/test-loop-configure.c
+ test/test-pipe-set-non-blocking.c
+ Copyright: 2013-2015, Ben Noordhuis <info@bnoordhuis.nl>
+ License: ISC
+ .
+ Files: samples/socks5-proxy/Makefile
+ samples/socks5-proxy/build.gyp
+ samples/socks5-proxy/client.c
+ samples/socks5-proxy/defs.h
+ samples/socks5-proxy/main.c
+ samples/socks5-proxy/s5.c
+ samples/socks5-proxy/s5.h
+ samples/socks5-proxy/server.c
+ samples/socks5-proxy/util.c
+ Copyright: StrongLoop, Inc.
+ License: Expat
+ .
+ Files: test/test-pipe-connect-multiple.c
+ test/test-pipe-connect-prepare.c
+ test/test-pipe-pending-instances.c
+ test/test-tcp-create-socket-early.c
+ test/test-udp-create-socket-early.c
+ Copyright: 2015 Saúl Ibarra Corretgé <saghul@gmail.com>.
+ License: Expat
+ .
+ Files: ar-lib
+ compile
+ depcomp
+ ltmain.sh
+ missing
+ Copyright: 1996-2014, Free Software Foundation, Inc.
+ License: GPL-2+
+ .
+ Files: m4/ltoptions.m4
+ m4/ltsugar.m4
+ m4/lt~obsolete.m4
+ Copyright: 2004-2015, Free Software Foundation, Inc.
+ License: FSFULLR
+ .
+ Files: src/inet.c
+ Copyright: 2004 Internet Systems Consortium, Inc. ("ISC")
+ 1996-1999 Internet Software Consortium
+ License: ISC
+ .
+ Files: include/stdint-msvc2008.h
+ Copyright: 2006-2008 Alexander Chemeris
+ License: BSD-3-clause
+ .
+ Files: samples/socks5-proxy/getopt.c
+ Copyright: 1987, 1993, 1994 The Regents of the University of California / NetBSD
+ License: BSD-3-clause
+ .
+ Files: include/tree.h
+ Copyright: 2002 Niels Provos <provos@citi.umich.edu>
+ License: BSD-2-clause
+ .
+ Files: include/android-ifaddrs.h
+ src/unix/android-ifaddrs.c
+ Copyright: 1995, 1999 Berkeley Software Design, Inc.
+ 2013 Kenneth MacKay
+ 2014 Emergya (Cloud4all, FP7/2007-2013 grant agreement n° 289016)
+ License: BSD-2-clause
+
+License: Expat
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+ .
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+ LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+ WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+License: BSD-2-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are
+ met:
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ .
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+ DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+ OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+License: BSD-3-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions
+ are met:
+ 1. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in the
+ documentation and/or other materials provided with the distribution.
+ 3. Neither the name of the author nor the names of any contributors
+ may be used to endorse or promote products derived from this software
+ without specific prior written permission.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS ``AS IS''
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+ DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+ OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+License: ISC
+ Permission to use, copy, modify, and distribute this software for any
+ purpose with or without fee is hereby granted, provided that the above
+ copyright notice and this permission notice appear in all copies.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+
+License: FSFUL
+ This configure script is free software; the Free Software Foundation
+ gives unlimited permission to copy, distribute and modify it.
+
+License: FSFULLR
+ This file is free software; the Free Software Foundation gives
+ unlimited permission to copy and/or distribute it, with or without
+ modifications, as long as this notice is preserved.
+
+License: CC-BY-SA-3.0
+ CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE LEGAL
+ SERVICES. DISTRIBUTION OF THIS LICENSE DOES NOT CREATE AN ATTORNEY-CLIENT
+ RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS INFORMATION ON AN "AS-IS"
+ BASIS. CREATIVE COMMONS MAKES NO WARRANTIES REGARDING THE INFORMATION
+ PROVIDED, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM ITS USE.
+ .
+ License
+ .
+ THE WORK (AS DEFINED BELOW) IS PROVIDED UNDER THE TERMS OF THIS CREATIVE
+ COMMONS PUBLIC LICENSE ("CCPL" OR "LICENSE"). THE WORK IS PROTECTED BY
+ COPYRIGHT AND/OR OTHER APPLICABLE LAW. ANY USE OF THE WORK OTHER THAN AS
+ AUTHORIZED UNDER THIS LICENSE OR COPYRIGHT LAW IS PROHIBITED.
+ .
+ BY EXERCISING ANY RIGHTS TO THE WORK PROVIDED HERE, YOU ACCEPT AND AGREE TO
+ BE BOUND BY THE TERMS OF THIS LICENSE. TO THE EXTENT THIS LICENSE MAY BE
+ CONSIDERED TO BE A CONTRACT, THE LICENSOR GRANTS YOU THE RIGHTS CONTAINED
+ HERE IN CONSIDERATION OF YOUR ACCEPTANCE OF SUCH TERMS AND CONDITIONS.
+ .
+ 1. Definitions
+ "Adaptation" means a work based upon the Work, or upon the Work and
+ other pre-existing works, such as a translation, adaptation, derivative
+ work, arrangement of music or other alterations of a literary or
+ artistic work, or phonogram or performance and includes cinematographic
+ adaptations or any other form in which the Work may be recast,
+ transformed, or adapted including in any form recognizably derived from
+ the original, except that a work that constitutes a Collection will not
+ be considered an Adaptation for the purpose of this License. For the
+ avoidance of doubt, where the Work is a musical work, performance or
+ phonogram, the synchronization of the Work in timed-relation with a
+ moving image ("synching") will be considered an Adaptation for the
+ purpose of this License.
+ "Collection" means a collection of literary or artistic works, such as
+ encyclopedias and anthologies, or performances, phonograms or
+ broadcasts, or other works or subject matter other than works listed in
+ Section 1(f) below, which, by reason of the selection and arrangement of
+ their contents, constitute intellectual creations, in which the Work is
+ included in its entirety in unmodified form along with one or more other
+ contributions, each constituting separate and independent works in
+ themselves, which together are assembled into a collective whole. A work
+ that constitutes a Collection will not be considered an Adaptation (as
+ defined below) for the purposes of this License.
+ "Creative Commons Compatible License" means a license that is listed at
+ http://creativecommons.org/compatiblelicenses that has been approved by
+ Creative Commons as being essentially equivalent to this License,
+ including, at a minimum, because that license: (i) contains terms that
+ have the same purpose, meaning and effect as the License Elements of
+ this License; and, (ii) explicitly permits the relicensing of
+ adaptations of works made available under that license under this
+ License or a Creative Commons jurisdiction license with the same License
+ Elements as this License.
+ "Distribute" means to make available to the public the original and
+ copies of the Work or Adaptation, as appropriate, through sale or other
+ transfer of ownership.
+ "License Elements" means the following high-level license attributes as
+ selected by Licensor and indicated in the title of this License:
+ Attribution, ShareAlike.
+ "Licensor" means the individual, individuals, entity or entities that
+ offer(s) the Work under the terms of this License.
+ "Original Author" means, in the case of a literary or artistic work, the
+ individual, individuals, entity or entities who created the Work or if
+ no individual or entity can be identified, the publisher; and in
+ addition (i) in the case of a performance the actors, singers,
+ musicians, dancers, and other persons who act, sing, deliver, declaim,
+ play in, interpret or otherwise perform literary or artistic works or
+ expressions of folklore; (ii) in the case of a phonogram the producer
+ being the person or legal entity who first fixes the sounds of a
+ performance or other sounds; and, (iii) in the case of broadcasts, the
+ organization that transmits the broadcast.
+ "Work" means the literary and/or artistic work offered under the terms
+ of this License including without limitation any production in the
+ literary, scientific and artistic domain, whatever may be the mode or
+ form of its expression including digital form, such as a book, pamphlet
+ and other writing; a lecture, address, sermon or other work of the same
+ nature; a dramatic or dramatico-musical work; a choreographic work or
+ entertainment in dumb show; a musical composition with or without words;
+ a cinematographic work to which are assimilated works expressed by a
+ process analogous to cinematography; a work of drawing, painting,
+ architecture, sculpture, engraving or lithography; a photographic work
+ to which are assimilated works expressed by a process analogous to
+ photography; a work of applied art; an illustration, map, plan, sketch
+ or three-dimensional work relative to geography, topography,
+ architecture or science; a performance; a broadcast; a phonogram; a
+ compilation of data to the extent it is protected as a copyrightable
+ work; or a work performed by a variety or circus performer to the extent
+ it is not otherwise considered a literary or artistic work.
+ "You" means an individual or entity exercising rights under this License
+ who has not previously violated the terms of this License with respect
+ to the Work, or who has received express permission from the Licensor to
+ exercise rights under this License despite a previous violation.
+ "Publicly Perform" means to perform public recitations of the Work and
+ to communicate to the public those public recitations, by any means or
+ process, including by wire or wireless means or public digital
+ performances; to make available to the public Works in such a way that
+ members of the public may access these Works from a place and at a place
+ individually chosen by them; to perform the Work to the public by any
+ means or process and the communication to the public of the performances
+ of the Work, including by public digital performance; to broadcast and
+ rebroadcast the Work by any means including signs, sounds or images.
+ "Reproduce" means to make copies of the Work by any means including
+ without limitation by sound or visual recordings and the right of
+ fixation and reproducing fixations of the Work, including storage of a
+ protected performance or phonogram in digital form or other electronic
+ medium.
+ .
+ 2. Fair Dealing Rights. Nothing in this License is intended to reduce,
+ limit, or restrict any uses free from copyright or rights arising from
+ limitations or exceptions that are provided for in connection with the
+ copyright protection under copyright law or other applicable laws.
+ .
+ 3. License Grant. Subject to the terms and conditions of this License,
+ Licensor hereby grants You a worldwide, royalty-free, non-exclusive,
+ perpetual (for the duration of the applicable copyright) license to
+ exercise the rights in the Work as stated below:
+ .
+ to Reproduce the Work, to incorporate the Work into one or more
+ Collections, and to Reproduce the Work as incorporated in the
+ Collections;
+ to create and Reproduce Adaptations provided that any such Adaptation,
+ including any translation in any medium, takes reasonable steps to
+ clearly label, demarcate or otherwise identify that changes were made to
+ the original Work. For example, a translation could be marked "The
+ original work was translated from English to Spanish," or a modification
+ could indicate "The original work has been modified.";
+ to Distribute and Publicly Perform the Work including as incorporated in
+ Collections; and,
+ to Distribute and Publicly Perform Adaptations.
+ .
+ For the avoidance of doubt:
+ Non-waivable Compulsory License Schemes. In those jurisdictions in
+ which the right to collect royalties through any statutory or
+ compulsory licensing scheme cannot be waived, the Licensor reserves
+ the exclusive right to collect such royalties for any exercise by
+ You of the rights granted under this License;
+ Waivable Compulsory License Schemes. In those jurisdictions in which
+ the right to collect royalties through any statutory or compulsory
+ licensing scheme can be waived, the Licensor waives the exclusive
+ right to collect such royalties for any exercise by You of the
+ rights granted under this License; and,
+ Voluntary License Schemes. The Licensor waives the right to collect
+ royalties, whether individually or, in the event that the Licensor
+ is a member of a collecting society that administers voluntary
+ licensing schemes, via that society, from any exercise by You of the
+ rights granted under this License.
+ .
+ The above rights may be exercised in all media and formats whether now
+ known or hereafter devised. The above rights include the right to make such
+ modifications as are technically necessary to exercise the rights in other
+ media and formats. Subject to Section 8(f), all rights not expressly
+ granted by Licensor are hereby reserved.
+ .
+ 4. Restrictions. The license granted in Section 3 above is expressly made
+ subject to and limited by the following restrictions:
+ .
+ You may Distribute or Publicly Perform the Work only under the terms of
+ this License. You must include a copy of, or the Uniform Resource
+ Identifier (URI) for, this License with every copy of the Work You
+ Distribute or Publicly Perform. You may not offer or impose any terms on
+ the Work that restrict the terms of this License or the ability of the
+ recipient of the Work to exercise the rights granted to that recipient
+ under the terms of the License. You may not sublicense the Work. You
+ must keep intact all notices that refer to this License and to the
+ disclaimer of warranties with every copy of the Work You Distribute or
+ Publicly Perform. When You Distribute or Publicly Perform the Work, You
+ may not impose any effective technological measures on the Work that
+ restrict the ability of a recipient of the Work from You to exercise the
+ rights granted to that recipient under the terms of the License. This
+ Section 4(a) applies to the Work as incorporated in a Collection, but
+ this does not require the Collection apart from the Work itself to be
+ made subject to the terms of this License. If You create a Collection,
+ upon notice from any Licensor You must, to the extent practicable,
+ remove from the Collection any credit as required by Section 4(c), as
+ requested. If You create an Adaptation, upon notice from any Licensor
+ You must, to the extent practicable, remove from the Adaptation any
+ credit as required by Section 4(c), as requested.
+ You may Distribute or Publicly Perform an Adaptation only under the
+ terms of: (i) this License; (ii) a later version of this License with
+ the same License Elements as this License; (iii) a Creative Commons
+ jurisdiction license (either this or a later license version) that
+ contains the same License Elements as this License (e.g.,
+ Attribution-ShareAlike 3.0 US)); (iv) a Creative Commons Compatible
+ License. If you license the Adaptation under one of the licenses
+ mentioned in (iv), you must comply with the terms of that license. If
+ you license the Adaptation under the terms of any of the licenses
+ mentioned in (i), (ii) or (iii) (the "Applicable License"), you must
+ comply with the terms of the Applicable License generally and the
+ following provisions: (I) You must include a copy of, or the URI for,
+ the Applicable License with every copy of each Adaptation You Distribute
+ or Publicly Perform; (II) You may not offer or impose any terms on the
+ Adaptation that restrict the terms of the Applicable License or the
+ ability of the recipient of the Adaptation to exercise the rights
+ granted to that recipient under the terms of the Applicable License;
+ (III) You must keep intact all notices that refer to the Applicable
+ License and to the disclaimer of warranties with every copy of the Work
+ as included in the Adaptation You Distribute or Publicly Perform; (IV)
+ when You Distribute or Publicly Perform the Adaptation, You may not
+ impose any effective technological measures on the Adaptation that
+ restrict the ability of a recipient of the Adaptation from You to
+ exercise the rights granted to that recipient under the terms of the
+ Applicable License. This Section 4(b) applies to the Adaptation as
+ incorporated in a Collection, but this does not require the Collection
+ apart from the Adaptation itself to be made subject to the terms of the
+ Applicable License.
+ If You Distribute, or Publicly Perform the Work or any Adaptations or
+ Collections, You must, unless a request has been made pursuant to
+ Section 4(a), keep intact all copyright notices for the Work and
+ provide, reasonable to the medium or means You are utilizing: (i) the
+ name of the Original Author (or pseudonym, if applicable) if supplied,
+ and/or if the Original Author and/or Licensor designate another party or
+ parties (e.g., a sponsor institute, publishing entity, journal) for
+ attribution ("Attribution Parties") in Licensor's copyright notice,
+ terms of service or by other reasonable means, the name of such party or
+ parties; (ii) the title of the Work if supplied; (iii) to the extent
+ reasonably practicable, the URI, if any, that Licensor specifies to be
+ associated with the Work, unless such URI does not refer to the
+ copyright notice or licensing information for the Work; and (iv) ,
+ consistent with Ssection 3(b), in the case of an Adaptation, a credit
+ identifying the use of the Work in the Adaptation (e.g., "French
+ translation of the Work by Original Author," or "Screenplay based on
+ original Work by Original Author"). The credit required by this Section
+ 4(c) may be implemented in any reasonable manner; provided, however,
+ that in the case of a Adaptation or Collection, at a minimum such credit
+ will appear, if a credit for all contributing authors of the Adaptation
+ or Collection appears, then as part of these credits and in a manner at
+ least as prominent as the credits for the other contributing authors.
+ For the avoidance of doubt, You may only use the credit required by this
+ Section for the purpose of attribution in the manner set out above and,
+ by exercising Your rights under this License, You may not implicitly or
+ explicitly assert or imply any connection with, sponsorship or
+ endorsement by the Original Author, Licensor and/or Attribution Parties,
+ as appropriate, of You or Your use of the Work, without the separate,
+ express prior written permission of the Original Author, Licensor and/or
+ Attribution Parties.
+ Except as otherwise agreed in writing by the Licensor or as may be
+ otherwise permitted by applicable law, if You Reproduce, Distribute or
+ Publicly Perform the Work either by itself or as part of any Adaptations
+ or Collections, You must not distort, mutilate, modify or take other
+ derogatory action in relation to the Work which would be prejudicial to
+ the Original Author's honor or reputation. Licensor agrees that in those
+ jurisdictions (e.g. Japan), in which any exercise of the right granted
+ in Section 3(b) of this License (the right to make Adaptations) would be
+ deemed to be a distortion, mutilation, modification or other derogatory
+ action prejudicial to the Original Author's honor and reputation, the
+ Licensor will waive or not assert, as appropriate, this Section, to the
+ fullest extent permitted by the applicable national law, to enable You
+ to reasonably exercise Your right under Section 3(b) of this License
+ (right to make Adaptations) but not otherwise.
+ .
+ 5. Representations, Warranties and Disclaimer
+ .
+ UNLESS OTHERWISE MUTUALLY AGREED TO BY THE PARTIES IN WRITING, LICENSOR
+ OFFERS THE WORK AS-IS AND MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY
+ KIND CONCERNING THE WORK, EXPRESS, IMPLIED, STATUTORY OR OTHERWISE,
+ INCLUDING, WITHOUT LIMITATION, WARRANTIES OF TITLE, MERCHANTIBILITY,
+ FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT, OR THE ABSENCE OF LATENT
+ OR OTHER DEFECTS, ACCURACY, OR THE PRESENCE OF ABSENCE OF ERRORS, WHETHER
+ OR NOT DISCOVERABLE. SOME JURISDICTIONS DO NOT ALLOW THE EXCLUSION OF
+ IMPLIED WARRANTIES, SO SUCH EXCLUSION MAY NOT APPLY TO YOU.
+ .
+ 6. Limitation on Liability. EXCEPT TO THE EXTENT REQUIRED BY APPLICABLE
+ LAW, IN NO EVENT WILL LICENSOR BE LIABLE TO YOU ON ANY LEGAL THEORY FOR ANY
+ SPECIAL, INCIDENTAL, CONSEQUENTIAL, PUNITIVE OR EXEMPLARY DAMAGES ARISING
+ OUT OF THIS LICENSE OR THE USE OF THE WORK, EVEN IF LICENSOR HAS BEEN
+ ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+ .
+ 7. Termination
+ .
+ This License and the rights granted hereunder will terminate
+ automatically upon any breach by You of the terms of this License.
+ Individuals or entities who have received Adaptations or Collections
+ from You under this License, however, will not have their licenses
+ terminated provided such individuals or entities remain in full
+ compliance with those licenses. Sections 1, 2, 5, 6, 7, and 8 will
+ survive any termination of this License.
+ Subject to the above terms and conditions, the license granted here is
+ perpetual (for the duration of the applicable copyright in the Work).
+ Notwithstanding the above, Licensor reserves the right to release the
+ Work under different license terms or to stop distributing the Work at
+ any time; provided, however that any such election will not serve to
+ withdraw this License (or any other license that has been, or is
+ required to be, granted under the terms of this License), and this
+ License will continue in full force and effect unless terminated as
+ stated above.
+ .
+ 8. Miscellaneous
+ .
+ Each time You Distribute or Publicly Perform the Work or a Collection,
+ the Licensor offers to the recipient a license to the Work on the same
+ terms and conditions as the license granted to You under this License.
+ Each time You Distribute or Publicly Perform an Adaptation, Licensor
+ offers to the recipient a license to the original Work on the same terms
+ and conditions as the license granted to You under this License.
+ If any provision of this License is invalid or unenforceable under
+ applicable law, it shall not affect the validity or enforceability of
+ the remainder of the terms of this License, and without further action
+ by the parties to this agreement, such provision shall be reformed to
+ the minimum extent necessary to make such provision valid and
+ enforceable.
+ No term or provision of this License shall be deemed waived and no
+ breach consented to unless such waiver or consent shall be in writing
+ and signed by the party to be charged with such waiver or consent.
+ This License constitutes the entire agreement between the parties with
+ respect to the Work licensed here. There are no understandings,
+ agreements or representations with respect to the Work not specified
+ here. Licensor shall not be bound by any additional provisions that may
+ appear in any communication from You. This License may not be modified
+ without the mutual written agreement of the Licensor and You.
+ The rights granted under, and the subject matter referenced, in this
+ License were drafted utilizing the terminology of the Berne Convention
+ for the Protection of Literary and Artistic Works (as amended on
+ September 28, 1979), the Rome Convention of 1961, the WIPO Copyright
+ Treaty of 1996, the WIPO Performances and Phonograms Treaty of 1996 and
+ the Universal Copyright Convention (as revised on July 24, 1971). These
+ rights and subject matter take effect in the relevant jurisdiction in
+ which the License terms are sought to be enforced according to the
+ corresponding provisions of the implementation of those treaty
+ provisions in the applicable national law. If the standard suite of
+ rights granted under applicable copyright law includes additional rights
+ not granted under this License, such additional rights are deemed to be
+ included in the License; this License is not intended to restrict the
+ license of any rights under applicable law.
+
+License: Apache-2.0
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+ .
+ http://www.apache.org/licenses/LICENSE-2.0
+ .
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ .
+ On Debian systems, the complete text of the Apache version 2.0 license
+ can be found in "/usr/share/common-licenses/Apache-2.0".
+
+License: GPL-2+
+ This package is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+ .
+ This package is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+ .
+ You should have received a copy of the GNU General Public License
+ along with this program. If not, see <http://www.gnu.org/licenses/>
+ .
+ On Debian systems, the complete text of the GNU General
+ Public License version 2 can be found in "/usr/share/common-licenses/GPL-2".
+
+License: GPL-3+
+ This program is free software: you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation, either version 3 of the License, or
+ (at your option) any later version.
+ .
+ This package is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+ .
+ You should have received a copy of the GNU General Public License
+ along with this program. If not, see <http://www.gnu.org/licenses/>.
+ .
+ On Debian systems, the complete text of the GNU General
+ Public License version 3 can be found in "/usr/share/common-licenses/GPL-3".
+
+License: Zlib
+ This software is provided 'as-is', without any express or implied
+ warranty. In no event will the authors be held liable for any damages
+ arising from the use of this software.
+ .
+ Permission is granted to anyone to use this software for any purpose,
+ including commercial applications, and to alter it and redistribute it
+ freely, subject to the following restrictions:
+ .
+ 1. The origin of this software must not be misrepresented; you must not
+ claim that you wrote the original software. If you use this software
+ in a product, an acknowledgment in the product documentation would be
+ appreciated but is not required.
+ 2. Altered source versions must be plainly marked as such, and must not be
+ misrepresented as being the original software.
+ 3. This notice may not be removed or altered from any source distribution.
+
+License: U-OF-I-BSD-LIKE
+ ==============================================================================
+ LLVM Release License
+ ==============================================================================
+ University of Illinois/NCSA
+ Open Source License
+ .
+ Copyright (c) 2003-2017 University of Illinois at Urbana-Champaign.
+ All rights reserved.
+ .
+ Developed by:
+ .
+ LLVM Team
+ .
+ University of Illinois at Urbana-Champaign
+ .
+ http://llvm.org
+ .
+ Permission is hereby granted, free of charge, to any person obtaining a copy of
+ this software and associated documentation files (the "Software"), to deal with
+ the Software without restriction, including without limitation the rights to
+ use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
+ of the Software, and to permit persons to whom the Software is furnished to do
+ so, subject to the following conditions:
+ .
+ * Redistributions of source code must retain the above copyright notice,
+ this list of conditions and the following disclaimers.
+ .
+ * Redistributions in binary form must reproduce the above copyright notice,
+ this list of conditions and the following disclaimers in the
+ documentation and/or other materials provided with the distribution.
+ .
+ * Neither the names of the LLVM Team, University of Illinois at
+ Urbana-Champaign, nor the names of its contributors may be used to
+ endorse or promote products derived from this Software without specific
+ prior written permission.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
+ FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ CONTRIBUTORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS WITH THE
+ SOFTWARE.
--- /dev/null
+.
\ No newline at end of file
--- /dev/null
+.
\ No newline at end of file
--- /dev/null
+DocStringExtensions.jl-0.5.0
\ No newline at end of file
--- /dev/null
+comment: false
--- /dev/null
+language: julia
+
+os:
+ - linux
+ - osx
+
+sudo: false
+
+julia:
+ - 0.7
+ - 1.0
+ - nightly
+
+notifications:
+ email: false
+
+after_success:
+ - julia test/coverage.jl
+
+jobs:
+ include:
+ - stage: "Deploy docs"
+ julia: 1.0
+ os: linux
+ script:
+ - julia -e 'using Pkg; Pkg.add([PackageSpec("Documenter"), PackageSpec(path=pwd())])'
+ - julia docs/make.jl
+ after_success: skip
--- /dev/null
+The DocStringExtensions.jl package is licensed under the MIT "Expat" License:
+
+> Copyright (c) 2016: Michael Hatherly.
+>
+> Permission is hereby granted, free of charge, to any person obtaining a copy
+> of this software and associated documentation files (the "Software"), to deal
+> in the Software without restriction, including without limitation the rights
+> to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+> copies of the Software, and to permit persons to whom the Software is
+> furnished to do so, subject to the following conditions:
+>
+> The above copyright notice and this permission notice shall be included in all
+> copies or substantial portions of the Software.
+>
+> THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+> IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+> FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+> AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+> LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+> OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+> SOFTWARE.
+>
--- /dev/null
+# DocStringExtensions
+
+*Extensions for Julia's docsystem.*
+
+| **Documentation** | **Build Status** |
+|:-------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------:|
+| [![][docs-stable-img]][docs-stable-url] [![][docs-latest-img]][docs-latest-url] | [![][travis-img]][travis-url] [![][appveyor-img]][appveyor-url] [![][codecov-img]][codecov-url] |
+
+## Installation
+
+The package can be added using the Julia package manager. From the Julia REPL, type `]`
+to enter the Pkg REPL mode and run
+
+```
+pkg> add DocStringExtensions
+```
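+
+Once installed, an abbreviation such as `SIGNATURES` is used by interpolating it into a
+docstring. A minimal sketch, mirroring the example from the package's module docstring:
+
+```julia
+using DocStringExtensions
+
+"""
+A short summary of `func`...
+
+$(SIGNATURES)
+"""
+func(x, y) = x + y
+```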
+
+## Documentation
+
+- [**STABLE**][docs-stable-url] — **most recently tagged version of the documentation.**
+- [**LATEST**][docs-latest-url] — *in-development version of the documentation.*
+
+## Project Status
+
+The package is tested and developed against Julia `0.7` and `1.0` on Linux, OS X, and Windows,
+but there are versions of the package that work on older versions of Julia.
+
+## Contributing and Questions
+
+Contributions are very welcome, as are feature requests and suggestions. Please open an [issue][issues-url] if you encounter any problems. If you have a question then feel free to ask for help in the [Gitter chat room][gitter-url].
+
+[gitter-url]: https://gitter.im/juliadocs/users
+
+[docs-latest-img]: https://img.shields.io/badge/docs-latest-blue.svg
+[docs-latest-url]: https://juliadocs.github.io/DocStringExtensions.jl/latest
+
+[docs-stable-img]: https://img.shields.io/badge/docs-stable-blue.svg
+[docs-stable-url]: https://juliadocs.github.io/DocStringExtensions.jl/stable
+
+[travis-img]: https://travis-ci.org/JuliaDocs/DocStringExtensions.jl.svg?branch=master
+[travis-url]: https://travis-ci.org/JuliaDocs/DocStringExtensions.jl
+
+[appveyor-img]: https://ci.appveyor.com/api/projects/status/7bixd69chxps91wx/branch/master?svg=true
+[appveyor-url]: https://ci.appveyor.com/project/JuliaDocs/docstringextensions-jl/branch/master
+
+[codecov-img]: https://codecov.io/gh/JuliaDocs/DocStringExtensions.jl/branch/master/graph/badge.svg
+[codecov-url]: https://codecov.io/gh/JuliaDocs/DocStringExtensions.jl
+
+[issues-url]: https://github.com/JuliaDocs/DocStringExtensions.jl/issues
--- /dev/null
+environment:
+ matrix:
+ - julia_version: 0.7
+ - julia_version: 1
+ - julia_version: nightly
+
+platform:
+ - x86 # 32-bit
+ - x64 # 64-bit
+
+# # Uncomment the following lines to allow failures on nightly julia
+# # (tests will run but not make your overall status red)
+# matrix:
+# allow_failures:
+# - julia_version: nightly
+
+branches:
+ only:
+ - master
+ - /release-.*/
+
+notifications:
+ - provider: Email
+ on_build_success: false
+ on_build_failure: false
+ on_build_status_changed: false
+
+install:
+ - ps: iex ((new-object net.webclient).DownloadString("https://raw.githubusercontent.com/JuliaCI/Appveyor.jl/version-1/bin/install.ps1"))
+
+build_script:
+ - echo "%JL_BUILD_SCRIPT%"
+ - C:\julia\bin\julia -e "%JL_BUILD_SCRIPT%"
+
+test_script:
+ - echo "%JL_TEST_SCRIPT%"
+ - C:\julia\bin\julia -e "%JL_TEST_SCRIPT%"
+
+# # Uncomment to support code coverage upload. Should only be enabled for packages
+# # which would have coverage gaps without running on Windows
+# on_success:
+# - echo "%JL_CODECOV_SCRIPT%"
+# - C:\julia\bin\julia -e "%JL_CODECOV_SCRIPT%"
--- /dev/null
+using Documenter, DocStringExtensions
+
+makedocs(
+ sitename = "DocStringExtensions.jl",
+ modules = [DocStringExtensions],
+ format = :html,
+ clean = false,
+ pages = Any["Home" => "index.md"],
+)
+
+deploydocs(
+ target = "build",
+ deps = nothing,
+ make = nothing,
+ repo = "github.com/JuliaDocs/DocStringExtensions.jl.git",
+ julia = "1.0",
+)
+
--- /dev/null
+# DocStringExtensions
+
+```@docs
+DocStringExtensions
+```
+
+## Index
+
+```@index
+Modules = [DocStringExtensions]
+```
+
+## Reference
+
+```@autodocs
+Modules = [DocStringExtensions]
+Order = [:constant, :function, :macro, :type]
+```
+
--- /dev/null
+__precompile__(true)
+
+"""
+*Extensions for the Julia docsystem.*
+
+# Introduction
+
+This package provides a collection of useful extensions for Julia's built-in docsystem.
+These are features that are still regarded as "experimental" and not yet mature enough to be
+considered for inclusion in `Base`, or that have sufficiently niche use cases that including
+them with the default Julia installation is not seen as valuable enough at this time.
+
+Currently `DocStringExtensions.jl` exports a collection of so-called "abbreviations", which
+can be used to add useful automatically generated information to docstrings. These include
+information such as:
+
+ * simplified method signatures;
+ * documentation for the individual fields of composite types;
+ * import and export lists for modules;
+ * and source-linked lists of methods applicable to a particular docstring.
+
+Users are most welcome to suggest additional abbreviation ideas, or implement and submit
+them themselves. Julia's strong support for program introspection makes this a reasonably
+straightforward process.
+
+Details of the currently available abbreviations can be viewed in their individual
+docstrings listed below in the "Exports" section.
+
+# Examples
+
+In simple terms an abbreviation can be used by interpolating it into a suitable
+docstring. For example:
+
+```julia
+using DocStringExtensions
+
+\"""
+A short summary of `func`...
+
+\$(SIGNATURES)
+
+where `x` and `y` should both be positive.
+
+# Details
+
+Some details about `func`...
+\"""
+func(x, y) = x + y
+```
+
+`\$(SIGNATURES)` will be replaced in the above docstring with
+
+````markdown
+# Signatures
+
+```julia
+func(x, y)
+```
+````
+
+The resulting generated content can be viewed via Julia's `?` mode or, if `Documenter.jl` is
+set up, in the generated external documentation.
+
+The advantage of using [`SIGNATURES`](@ref) (and other abbreviations) is that docstrings are
+less likely to become out-of-sync with the surrounding code. Note though that references to
+the argument names `x` and `y` that have been manually embedded within the docstring are, of
+course, not updated automatically.
+
+# Exports
+$(EXPORTS)
+
+# Imports
+$(IMPORTS)
+
+"""
+module DocStringExtensions
+
+# Exports.
+
+export @template, FIELDS, EXPORTS, METHODLIST, IMPORTS, SIGNATURES, TYPEDEF, DOCSTRING, FUNCTIONNAME
+
+# Includes.
+
+include("utilities.jl")
+include("abbreviations.jl")
+include("templates.jl")
+
+#
+# Bootstrap abbreviations.
+#
+# Within the package itself we would like to be able to use the abbreviations that have been
+# implemented. To do this we need to delay evaluation of the interpolated abbreviations
+# until they have all been defined. We use `Symbol`s in place of the actual constants, such
+# as `METHODLIST` which is written as `:METHODLIST` instead.
+#
+# The docstring for the module itself, defined at the start of the file, does not need to
+# use `Symbol`s since with the way `@doc` works the module docstring gets inserted at the
+# end of the module definition and so has all the definitions already defined.
+#
+let λ = s -> isa(s, Symbol) ? getfield(DocStringExtensions, s) : s
+ for (binding, multidoc) in Docs.meta(DocStringExtensions)
+ for (typesig, docstr) in multidoc.docs
+ docstr.text = Core.svec(map(λ, docstr.text)...)
+ end
+ end
+end
+
+__init__() = (hook!(template_hook); nothing)
+
+end # module
--- /dev/null
+
+#
+# Abstract Interface.
+#
+
+"""
+Abbreviation objects are used to automatically generate context-dependent markdown content
+within documentation strings. Objects of this type interpolated into docstrings will be
+expanded automatically before parsing the text to markdown.
+
+$(:FIELDS)
+"""
+abstract type Abbreviation end
+
+"""
+$(:SIGNATURES)
+
+Expand the [`Abbreviation`](@ref) `abbr` in the context of the `DocStr` `doc` and write
+the resulting markdown-formatted text to the `IOBuffer` `buf`.
+"""
+format(abbr, buf, doc) = error("`format` not implemented for `$(typeof(abbr))`.")
+
+# Only extend `formatdoc` once with our abstract type. Within the package use a different
+# `format` function instead to keep things decoupled from `Base` as much as possible.
+Docs.formatdoc(buf::IOBuffer, doc::Docs.DocStr, part::Abbreviation) = format(part, buf, doc)
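+
+# A minimal sketch of how a custom abbreviation could be defined (the `TODAY` name and
+# behaviour below are hypothetical, not part of this package), following the same
+# subtype-plus-`format` pattern used by the implementations below:
+#
+#     using Dates
+#
+#     struct Today <: Abbreviation end
+#     const TODAY = Today()
+#
+#     format(::Today, buf, doc) = print(buf, Dates.today())
+#
+# Interpolating `$(TODAY)` into a docstring would then splice the current date into the
+# generated documentation.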
+
+
+#
+# Implementations.
+#
+
+
+#
+# `TypeFields`
+#
+
+"""
+The singleton type for [`FIELDS`](@ref) abbreviations.
+
+$(:FIELDS)
+"""
+struct TypeFields <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) to include the names of the fields of a type as well as any
+documentation that may be attached to the fields.
+
+# Examples
+
+The generated markdown text should look similar to the following example where a
+type has three fields (`x`, `y`, and `z`) and the last two have documentation
+attached.
+
+```markdown
+
+ - `x`
+
+ - `y`
+
+ Unlike the `x` field this field has been documented.
+
+ - `z`
+
+ Another documented field.
+```
+"""
+const FIELDS = TypeFields()
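+
+# A minimal usage sketch (the `Point` type below is hypothetical): interpolating the
+# abbreviation into a struct docstring produces the field list shown above.
+#
+#     """
+#     A point in the plane.
+#
+#     $(FIELDS)
+#     """
+#     struct Point
+#         "Horizontal coordinate."
+#         x::Float64
+#         "Vertical coordinate."
+#         y::Float64
+#     end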
+
+function format(::TypeFields, buf, doc)
+ local docs = get(doc.data, :fields, Dict())
+ local binding = doc.data[:binding]
+ local object = Docs.resolve(binding)
+ # On 0.7 fieldnames() on an abstract type throws an error. We then explicitly return
+ # an empty vector to be consistent with the behaviour on v0.6.
+ local fields = isabstracttype(object) ? Symbol[] : fieldnames(object)
+ if !isempty(fields)
+ println(buf)
+ for field in fields
+ print(buf, " - `", field, "`\n")
+ # Print the field docs if they exist and aren't a `doc"..."` docstring.
+ if haskey(docs, field) && isa(docs[field], AbstractString)
+ println(buf)
+ for line in split(docs[field], "\n")
+ println(buf, isempty(line) ? "" : " ", rstrip(line))
+ end
+ end
+ println(buf)
+ end
+ println(buf)
+ end
+ return nothing
+end
+
+
+#
+# `ModuleExports`
+#
+
+"""
+The singleton type for [`EXPORTS`](@ref) abbreviations.
+
+$(:FIELDS)
+"""
+struct ModuleExports <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) to include all the exported names of a module in a sorted list of
+`Documenter.jl`-style `@ref` links.
+
+!!! note
+
+ The names are sorted alphabetically and ignore leading `@` characters so that macros are
+ *not* sorted before other names.
+
+# Examples
+
+The markdown text generated by the `EXPORTS` abbreviation looks similar to the following:
+
+```markdown
+
+ - [`bar`](@ref)
+ - [`@baz`](@ref)
+ - [`foo`](@ref)
+
+```
+"""
+const EXPORTS = ModuleExports()
+
+function format(::ModuleExports, buf, doc)
+ local binding = doc.data[:binding]
+ local object = Docs.resolve(binding)
+ local exports = names(object)
+ if !isempty(exports)
+ println(buf)
+ # Sorting ignores the `@` in macro names and sorts them in with others.
+ for sym in sort(exports, by = s -> lstrip(string(s), '@'))
+ # Skip the module itself, since that's always exported.
+ sym === nameof(object) && continue
+ # We print linked names using Documenter.jl cross-reference syntax
+ # for ease of integration with that package.
+ println(buf, " - [`", sym, "`](@ref)")
+ end
+ println(buf)
+ end
+ return nothing
+end
+
+
+#
+# `ModuleImports`
+#
+
+"""
+The singleton type for [`IMPORTS`](@ref) abbreviations.
+
+$(:FIELDS)
+"""
+struct ModuleImports <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) to include all the imported modules in a sorted list.
+
+# Examples
+
+The markdown text generated by the `IMPORTS` abbreviation looks similar to the following:
+
+```markdown
+
+ - `Bar`
+ - `Baz`
+ - `Foo`
+
+```
+"""
+const IMPORTS = ModuleImports()
+
+function format(::ModuleImports, buf, doc)
+ local binding = doc.data[:binding]
+ local object = Docs.resolve(binding)
+ local imports = unique(ccall(:jl_module_usings, Any, (Any,), object))
+ if !isempty(imports)
+ println(buf)
+ for mod in sort(imports, by = string)
+ println(buf, " - `", mod, "`")
+ end
+ println(buf)
+ end
+end
+
+
+#
+# `MethodList`
+#
+
+"""
+The singleton type for [`METHODLIST`](@ref) abbreviations.
+
+$(:FIELDS)
+"""
+struct MethodList <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) for including a list of all the methods that match a documented
+`Method`, `Function`, or `DataType` within the current module.
+
+# Examples
+
+The generated markdown text will look similar to the following example where a function
+`f` defines two different methods (one that takes a number, and the other a string):
+
+````markdown
+```julia
+f(num)
+```
+
+defined at [`<path>:<line>`](<github-url>).
+
+```julia
+f(str)
+```
+
+defined at [`<path>:<line>`](<github-url>).
+````
+"""
+const METHODLIST = MethodList()
+
+function format(::MethodList, buf, doc)
+ local binding = doc.data[:binding]
+ local typesig = doc.data[:typesig]
+ local modname = doc.data[:module]
+ local func = Docs.resolve(binding)
+ local groups = methodgroups(func, typesig, modname; exact = false)
+ if !isempty(groups)
+ println(buf)
+ for group in groups
+ println(buf, "```julia")
+ for method in group
+ printmethod(buf, binding, func, method)
+ println(buf)
+ end
+ println(buf, "```\n")
+ if !isempty(group)
+ local method = group[1]
+ local file = string(method.file)
+ local line = method.line
+ local path = cleanpath(file)
+ local URL = url(method)
+ isempty(URL) || println(buf, "defined at [`$path:$line`]($URL).")
+ end
+ println(buf)
+ end
+ println(buf)
+ end
+ return nothing
+end
+
+
+#
+# `MethodSignatures`
+#
+
+"""
+The singleton type for [`SIGNATURES`](@ref) abbreviations.
+
+$(:FIELDS)
+"""
+struct MethodSignatures <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) for including a simplified representation of all the method
+signatures that match the given docstring. See [`printmethod`](@ref) for details on
+the simplifications that are applied.
+
+# Examples
+
+The generated markdown text will look similar to the following example where a function `f`
+defines a method taking two positional arguments, `x` and `y`, and two keywords, `a` and `b`.
+
+````markdown
+```julia
+f(x, y; a, b...)
+```
+````
+"""
+const SIGNATURES = MethodSignatures()
+
+function format(::MethodSignatures, buf, doc)
+ local binding = doc.data[:binding]
+ local typesig = doc.data[:typesig]
+ local modname = doc.data[:module]
+ local func = Docs.resolve(binding)
+ local groups = methodgroups(func, typesig, modname)
+ if !isempty(groups)
+ println(buf)
+ println(buf, "```julia")
+ for group in groups
+ for method in group
+ printmethod(buf, binding, func, method)
+ println(buf)
+ end
+ end
+ println(buf, "\n```\n")
+ end
+end
+
+#
+# `FunctionName`
+#
+
+"""
+The singleton type for [`FUNCTIONNAME`](@ref) abbreviations.
+
+$(:FIELDS)
+"""
+struct FunctionName <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) for including the function name matching the method of
+the docstring.
+
+# Usage
+
+This is mostly useful for not repeating the function name in docstrings where
+the user wants to retain full control of the argument list, or where the latter
+does not exist (e.g. generic functions).
+
+Note that the generated docstring snippet is not quoted; use indentation or
+explicit quoting.
+
+# Example
+
+```julia
+\"""
+ \$(FUNCTIONNAME)(d, θ)
+
+Calculate the logdensity `d` at `θ`.
+
+Users should define their own methods for `$(FUNCTIONNAME)`.
+\"""
+function logdensity end
+```
+"""
+const FUNCTIONNAME = FunctionName()
+
+format(::FunctionName, buf, doc) = print(buf, doc.data[:binding].var)
+
+#
+# `TypeSignature`
+#
+
+"""
+The singleton type for [`TYPEDEF`](@ref) abbreviations.
+"""
+struct TypeDefinition <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) for including a summary of the signature of a type definition.
+Some of the following information may be included in the output:
+
+ * whether the object is an `abstract type` or a `primitive type`;
+ * mutability (either `mutable struct` or `struct` is printed);
+ * the unqualified name of the type;
+ * any type parameters;
+ * the supertype of the type if it is not `Any`.
+
+# Examples
+
+The generated output for a type definition such as:
+
+```julia
+\"""
+\$(TYPEDEF)
+\"""
+struct MyType{S, T <: Integer} <: AbstractArray
+ # ...
+end
+```
+
+will look similar to the following:
+
+````markdown
+```julia
+struct MyType{S, T<:Integer} <: AbstractArray
+```
+````
+
+!!! note
+
+ No information about the fields of the type is printed. Use the [`FIELDS`](@ref)
+ abbreviation to include information about the fields of a type.
+"""
+const TYPEDEF = TypeDefinition()
+
+function print_supertype(buf, object)
+ super = supertype(object)
+ super != Any && print(buf, " <: ", super)
+end
+
+function print_params(buf, object)
+ if !isempty(object.parameters)
+ print(buf, "{")
+ join(buf, object.parameters, ", ")
+ print(buf, "}")
+ end
+end
+
+function print_primitive_type(buf, object)
+ print(buf, "primitive type ", object.name.name)
+ print_supertype(buf, object)
+ print(buf, " ", sizeof(object) * 8)
+ println(buf)
+end
+
+function print_abstract_type(buf, object)
+ print(buf, "abstract type ", object.name.name)
+ print_supertype(buf, object)
+ println(buf)
+end
+
+function print_mutable_struct_or_struct(buf, object)
+ object.mutable && print(buf, "mutable ")
+ print(buf, "struct ", object.name.name)
+ print_params(buf, object)
+ print_supertype(buf, object)
+ println(buf)
+end
+
+@static if VERSION < v"0.7.0"
+ isprimitivetype(x) = isbitstype(x)
+end
+
+function format(::TypeDefinition, buf, doc)
+ local binding = doc.data[:binding]
+ local object = gettype(Docs.resolve(binding))
+ if isa(object, DataType)
+ println(buf, "\n```julia")
+ if isprimitivetype(object)
+ print_primitive_type(buf, object)
+ elseif isabstracttype(object)
+ print_abstract_type(buf, object)
+ else
+ print_mutable_struct_or_struct(buf, object)
+ end
+ println(buf, "```\n")
+ end
+end
+
+#
+# `DocStringTemplate`
+#
+
+"""
+The singleton type for [`DOCSTRING`](@ref) abbreviations.
+"""
+struct DocStringTemplate <: Abbreviation end
+
+"""
+An [`Abbreviation`](@ref) used in [`@template`](@ref) definitions to represent the location
+of the docstring body that should be spliced into a template.
+
+!!! warning
+
+ This abbreviation must only ever be used in template strings; never normal docstrings.
+"""
+const DOCSTRING = DocStringTemplate()
+
+# NOTE: no `format` needed for this 'mock' abbreviation.
--- /dev/null
+const expander = Core.atdoc
+const setter! = Core.atdoc!
+
+"""
+$(:SIGNATURES)
+
+Set the docstring expander function to first call `func` before calling the default expander.
+
+To remove a hook that has been applied using this method, call [`hook!()`](@ref).
+"""
+hook!(func) = setter!((args...) -> expander(func(args...)...))
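+
+# A minimal sketch of a hook (hypothetical, for illustration only). On Julia 0.7 and
+# later the expander receives `(source, mod, docstr, expr)`, as noted for
+# `template_hook` below, and the hook must return the (possibly modified) arguments:
+#
+#     hook!(function (source, mod, docstr, expr)
+#         @info "documenting an expression in $mod"
+#         return (source, mod, docstr, expr)
+#     end)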
+
+"""
+$(:SIGNATURES)
+
+Reset the docstring expander to only call the default expander function. This clears any
+'hook' that has been set using [`hook!(func)`](@ref).
+"""
+hook!() = setter!(expander)
+
+"""
+$(:SIGNATURES)
+
+Defines a docstring template that will be applied to all docstrings in a module that match
+the specified category or tuple of categories.
+
+# Examples
+
+```julia
+@template DEFAULT =
+ \"""
+ \$(SIGNATURES)
+ \$(DOCSTRING)
+ \"""
+```
+
+`DEFAULT` is the default template that is applied to a docstring if no other template
+definitions match the documented expression. The `DOCSTRING` abbreviation is used to mark
+the location in the template where the actual docstring body will be spliced into each
+docstring.
+
+```julia
+@template (FUNCTIONS, METHODS, MACROS) =
+ \"""
+ \$(SIGNATURES)
+ \$(DOCSTRING)
+ \$(METHODLIST)
+ \"""
+```
+
+A tuple of categories can be specified when a docstring template should be used for several
+different categories.
+
+```julia
+@template MODULES = ModName
+```
+
+The template definition above will define a template for module docstrings based on the
+template for modules found in module `ModName`.
+
+!!! note
+
+ Supported categories are `DEFAULT`, `FUNCTIONS`, `METHODS`, `MACROS`, `TYPES`,
+ `MODULES`, and `CONSTANTS`.
+
+"""
+macro template(ex)
+ # JuliaLang/julia#22064 introduced the __module__ variable and deprecated current_module()
+ @static if VERSION >= v"0.7.0-DEV.484"
+ template(__source__, __module__, ex)
+ else
+ template(LineNumberNode(0), current_module(), ex)
+ end
+end
+
+const TEMP_SYM = gensym("templates")
+
+function template(src::LineNumberNode, mod::Module, ex::Expr)
+ Meta.isexpr(ex, :(=), 2) || error("invalid `@template` syntax.")
+ template(src, mod, ex.args[1], ex.args[2])
+end
+
+function template(source::LineNumberNode, mod::Module, tuple::Expr, docstr::Union{Symbol, Expr})
+ Meta.isexpr(tuple, :tuple) || error("invalid `@template` syntax on LHS.")
+ isdefined(mod, TEMP_SYM) || Core.eval(mod, :(const $(TEMP_SYM) = $(Dict{Symbol, Vector}())))
+ local block = Expr(:block)
+ for category in tuple.args
+ local key = Meta.quot(category)
+ local vec = Meta.isexpr(docstr, :string) ?
+ Expr(:vect, docstr.args...) : :($(docstr).$(TEMP_SYM)[$(key)])
+ push!(block.args, :($(TEMP_SYM)[$(key)] = $(vec)))
+ end
+ push!(block.args, nothing)
+ return esc(block)
+end
+
+function template(src::LineNumberNode, mod::Module, sym::Symbol, docstr::Union{Symbol, Expr})
+ template(src, mod, Expr(:tuple, sym), docstr)
+end
+
+# The signature for the atdocs() calls changed in v0.7
+# On v0.6 and below it seems it was assumed to be (docstr::String, expr::Expr), but on v0.7
+# it is (source::LineNumberNode, mod::Module, docstr::String, expr::Expr)
+function template_hook(source::LineNumberNode, mod::Module, docstr, expr::Expr)
+ local docex = interp_string(docstr)
+ if isdefined(mod, TEMP_SYM) && Meta.isexpr(docex, :string)
+ local templates = getfield(mod, TEMP_SYM)
+ local template = get_template(templates, expression_type(expr))
+ local out = Expr(:string)
+ for t in template
+ t == DOCSTRING ? append!(out.args, docex.args) : push!(out.args, t)
+ end
+ return (source, mod, out, expr)
+ else
+ return (source, mod, docstr, expr)
+ end
+end
+
+function template_hook(docstr, expr::Expr)
+    source, mod, docstr, expr = template_hook(LineNumberNode(0), current_module(), docstr, expr)
+ docstr, expr
+end
+
+template_hook(args...) = args
+
+interp_string(str::AbstractString) = Expr(:string, str)
+interp_string(other) = other
+
+get_template(t::Dict, k::Symbol) = haskey(t, k) ? t[k] : get(t, :DEFAULT, Any[DOCSTRING])
+
+function expression_type(ex::Expr)
+ # Expression heads changed in JuliaLang/julia/pull/23157 to match the new keyword syntax.
+ if VERSION < v"0.7.0-DEV.1263" && Meta.isexpr(ex, [:type, :bitstype])
+ :TYPES
+ elseif Meta.isexpr(ex, :module)
+ :MODULES
+ elseif Meta.isexpr(ex, [:struct, :abstract, :typealias, :primitive])
+ :TYPES
+ elseif Meta.isexpr(ex, :macro)
+ :MACROS
+ elseif Meta.isexpr(ex, [:function, :(=)]) && Meta.isexpr(ex.args[1], :call) || (Meta.isexpr(ex.args[1], :where) && Meta.isexpr(ex.args[1].args[1], :call))
+ :METHODS
+ elseif Meta.isexpr(ex, :function)
+ :FUNCTIONS
+ elseif Meta.isexpr(ex, [:const, :(=)])
+ :CONSTANTS
+ else
+ :DEFAULT
+ end
+end
+expression_type(other) = :DEFAULT
--- /dev/null
+
+#
+# Utilities.
+#
+
+#
+# Method grouping.
+#
+
+"""
+$(:SIGNATURES)
+
+Group all methods of function `func` with type signatures `typesig` in module `modname`.
+Keyword argument `exact = true` matches signatures "exactly" with `==` rather than `<:`.
+
+# Examples
+
+```julia
+groups = methodgroups(f, Union{Tuple{Any}, Tuple{Any, Integer}}, Main; exact = false)
+```
+"""
+function methodgroups(func, typesig, modname; exact = true)
+ # Group methods by file and line number.
+ local methods = getmethods(func, typesig)
+ local groups = groupby(Tuple{Symbol, Int}, Vector{Method}, methods) do m
+ (m.file, m.line), m
+ end
+
+ # Filter out methods from other modules and with non-matching signatures.
+ local typesigs = alltypesigs(typesig)
+ local results = Vector{Method}[]
+ for (key, group) in groups
+ filter!(group) do m
+ local ismod = m.module == modname
+ exact ? (ismod && Base.rewrap_unionall(Base.tuple_type_tail(m.sig), m.sig) in typesigs) : ismod
+ end
+ isempty(group) || push!(results, group)
+ end
+
+ # Sort the groups by file and line.
+ sort!(results, lt = comparemethods, by = first)
+
+ return results
+end
+
+"""
+$(:SIGNATURES)
+
+Compare methods `a` and `b` by file and line number.
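+
+# Examples
+
+Suitable as the `lt` argument to `sort!`, for example:
+
+```julia
+ms = collect(methods(sin))
+sort!(ms, lt = comparemethods)
+```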
+"""
+function comparemethods(a::Method, b::Method)
+ comp = a.file < b.file ? -1 : a.file > b.file ? 1 : 0
+ comp == 0 ? a.line < b.line : comp < 0
+end
+
+if isdefined(Base, :UnionAll)
+ uniontypes(T) = uniontypes!(Any[], T)
+ function uniontypes!(out, T)
+ if isa(T, Union)
+ push!(out, T.a)
+ uniontypes!(out, T.b)
+ else
+ push!(out, T)
+ end
+ return out
+ end
+ gettype(T::UnionAll) = gettype(T.body)
+else
+ uniontypes(T) = collect(T.types)
+end
+gettype(other) = other
+
+"""
+$(:SIGNATURES)
+
+A helper method for [`getmethods`](@ref) that collects methods in `results`.
+"""
+function getmethods!(results, f, sig)
+ if sig == Union{}
+ append!(results, methods(f))
+ elseif isa(sig, Union)
+ for each in uniontypes(sig)
+ getmethods!(results, f, each)
+ end
+ elseif isa(sig, UnionAll)
+ getmethods!(results, f, Base.unwrap_unionall(sig))
+ else
+ append!(results, methods(f, sig))
+ end
+ return results
+end
+"""
+$(:SIGNATURES)
+
+Collect and return all methods of function `f` matching signature `sig`.
+
+This is similar to `methods(f, sig)`, but handles type signatures found in `DocStr` objects
+more consistently than `methods`.
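+
+# Examples
+
+A small illustrative call; `f` here is a stand-in function:
+
+```julia
+f(x) = x
+getmethods(f, Tuple{Any})   # the single matching method
+getmethods(f, Union{})      # `Union{}` collects all methods of `f`
+```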
+"""
+getmethods(f, sig) = unique(getmethods!(Method[], f, sig))
+
+
+"""
+$(:SIGNATURES)
+
+Is the type `t` a `bitstype`?
+"""
+isbitstype(@nospecialize(t)) = isconcretetype(t) && sizeof(t) > 0 && isbits(t)
+
+"""
+$(:SIGNATURES)
+
+Is the type `t` an `abstract` type?
+"""
+isabstracttype(@nospecialize(t)) = isa(t, DataType) && getfield(t, :abstract)
+
+
+"""
+$(:SIGNATURES)
+
+Returns a `Vector` of the `Tuple` types contained in `sig`.
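+
+# Examples
+
+A few illustrative calls:
+
+```julia
+alltypesigs(Union{})                     # -> Any[]
+alltypesigs(Tuple{Any})                  # -> Any[Tuple{Any}]
+alltypesigs(Union{Tuple{}, Tuple{Any}})  # -> the individual tuple types
+```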
+"""
+function alltypesigs(sig)::Vector{Any}
+ if sig == Union{}
+ Any[]
+ elseif isa(sig, Union)
+ uniontypes(sig)
+ elseif isa(sig, UnionAll)
+ Base.rewrap_unionall.(uniontypes(Base.unwrap_unionall(sig)), sig)
+ else
+ Any[sig]
+ end
+end
+
+"""
+$(:SIGNATURES)
+
+A helper method for [`groupby`](@ref) that uses a pre-allocated `groups` `Dict`.
+"""
+function groupby!(f, groups, data)
+ for each in data
+ key, value = f(each)
+ push!(get!(groups, key, []), value)
+ end
+ return sort!(collect(groups), by = first)
+end
+
+"""
+$(:SIGNATURES)
+
+Group `data` using function `f` where key type is specified by `K` and group type by `V`.
+
+The function `f` takes a single argument, an element of `data`, and should return a 2-tuple
+of `(computed_key, element)`. See the example below for details.
+
+# Examples
+
+```julia
+groupby(Int, Vector{Int}, collect(1:10)) do num
+ mod(num, 3), num
+end
+```
+"""
+groupby(f, K, V, data) = groupby!(f, Dict{K, V}(), data)
+
+"""
+$(:SIGNATURES)
+
+Remove the package depot (`DEPOT_PATH`) prefix from a file `path` if present.
+"""
+function cleanpath(path::AbstractString)
+ for depot in DEPOT_PATH
+ pkgdir = joinpath(depot, "")
+ startswith(path, pkgdir) && return first(split(path, pkgdir, keepempty=false))
+ end
+ return path
+end
+
+"""
+$(:SIGNATURES)
+
+Parse all docstrings defined within a module `mod`.
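+
+# Examples
+
+For example, to force parsing of this package's own docstrings:
+
+```julia
+parsedocs(DocStringExtensions)
+```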
+"""
+function parsedocs(mod::Module)
+ for (binding, multidoc) in Docs.meta(mod)
+ for (typesig, docstr) in multidoc.docs
+ Docs.parsedoc(docstr)
+ end
+ end
+end
+
+
+"""
+$(:SIGNATURES)
+
+Print a simplified representation of a method signature to `buffer`. Some of these
+simplifications include:
+
+ * no `TypeVar`s;
+ * no types;
+ * no keyword default values;
+ * `?` printed where `#unused#` arguments are found.
+
+# Examples
+
+```julia
+f(x; a = 1, b...) = x
+sig = printmethod(Docs.Binding(Main, :f), f, first(methods(f)))
+```
+"""
+function printmethod(buffer::IOBuffer, binding::Docs.Binding, func, method::Method)
+ # TODO: print qualified?
+ print(buffer, binding.var)
+ print(buffer, "(")
+ join(buffer, arguments(method), ", ")
+ local kws = keywords(func, method)
+ if !isempty(kws)
+ print(buffer, "; ")
+ join(buffer, kws, ", ")
+ end
+ print(buffer, ")")
+ return buffer
+end
+
+printmethod(b, f, m) = String(take!(printmethod(IOBuffer(), b, f, m)))
+
+get_method_source(m::Method) = Base.uncompressed_ast(m)
+nargs(m::Method) = m.nargs
+
+
+"""
+$(:SIGNATURES)
+
+Returns the list of keywords for a particular method `m` of a function `func`.
+
+# Examples
+
+```julia
+f(x; a = 1, b...) = x
+kws = keywords(f, first(methods(f)))
+```
+"""
+function keywords(func, m::Method)
+ local table = methods(func).mt
+ # table is a MethodTable object. For some reason, the :kwsorter field is not always
+ # defined. An undefined kwsorter seems to imply that there are no methods in the
+ # MethodTable with keyword arguments.
+ if isdefined(table, :kwsorter)
+ # Fetching method keywords stolen from base/replutil.jl:572-576 (commit 3b45cdc9aab0):
+ local kwargs = Base.kwarg_decl(m, typeof(table.kwsorter))
+ if isa(kwargs, Vector) && length(kwargs) > 0
+ filter!(arg -> !occursin("#", string(arg)), kwargs)
+ # Keywords *may* not be sorted correctly. We move the vararg one to the end.
+ local index = findfirst(arg -> endswith(string(arg), "..."), kwargs)
+ if index != nothing
+ kwargs[index], kwargs[end] = kwargs[end], kwargs[index]
+ end
+ return kwargs
+ end
+ end
+ return Symbol[]
+end
+
+
+"""
+$(:SIGNATURES)
+
+Returns the list of arguments for a particular method `m`.
+
+# Examples
+
+```julia
+f(x; a = 1, b...) = x
+args = arguments(first(methods(f)))
+```
+"""
+function arguments(m::Method)
+ local template = get_method_source(m)
+ if isdefined(template, :slotnames)
+ local args = map(template.slotnames[1:nargs(m)]) do arg
+ arg === Symbol("#unused#") ? "?" : arg
+ end
+ return filter(arg -> arg !== Symbol("#self#"), args)
+ end
+ return Symbol[]
+end
+
+#
+# Source URLs.
+#
+# Based on code from https://github.com/JuliaLang/julia/blob/master/base/methodshow.jl.
+#
+# Customised to handle URLs on travis since the directory is not a Git repo and we must
+# instead rely on `TRAVIS_REPO_SLUG` to get the remote repo.
+#
+
+"""
+$(:SIGNATURES)
+
+Get the URL (file and line number) where a method `m` is defined.
+
+Note that this is based on the implementation of `Base.url`, but handles URLs correctly
+on TravisCI as well.
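+
+# Examples
+
+For example:
+
+```julia
+url(first(methods(sin)))
+```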
+"""
+url(m::Method) = url(m.module, string(m.file), m.line)
+
+import LibGit2
+
+function url(mod::Module, file::AbstractString, line::Integer)
+ file = Sys.iswindows() ? replace(file, '\\' => '/') : file
+ if Base.inbase(mod) && !isabspath(file)
+ local base = "https://github.com/JuliaLang/julia/tree"
+ if isempty(Base.GIT_VERSION_INFO.commit)
+ return "$base/v$VERSION/base/$file#L$line"
+ else
+ local commit = Base.GIT_VERSION_INFO.commit
+ return "$base/$commit/base/$file#L$line"
+ end
+ else
+ if isfile(file)
+ local d = dirname(file)
+ return LibGit2.with(LibGit2.GitRepoExt(d)) do repo
+ LibGit2.with(LibGit2.GitConfig(repo)) do cfg
+ local u = LibGit2.get(cfg, "remote.origin.url", "")
+ local m = match(LibGit2.GITHUB_REGEX, u)
+ u = m === nothing ? get(ENV, "TRAVIS_REPO_SLUG", "") : m.captures[1]
+ local commit = string(LibGit2.head_oid(repo))
+ local root = LibGit2.path(repo)
+ if startswith(file, root) || startswith(realpath(file), root)
+ local base = "https://github.com/$u/tree"
+ local filename = file[(length(root) + 1):end]
+ return "$base/$commit/$filename#L$line"
+ else
+ return ""
+ end
+ end
+ end
+ else
+ return ""
+ end
+ end
+end
--- /dev/null
+# Only run coverage from linux nightly build on travis.
+get(ENV, "TRAVIS_OS_NAME", "") == "linux" || exit()
+get(ENV, "TRAVIS_JULIA_VERSION", "") == "nightly" || exit()
+using Pkg
+Pkg.add("Coverage")
+using Coverage
+
+cd(joinpath(dirname(@__FILE__), "..")) do
+ Codecov.submit(Codecov.process_folder())
+end
--- /dev/null
+using DocStringExtensions
+using Test
+import Markdown
+
+include("tests.jl")
--- /dev/null
+module TemplateTests
+
+using DocStringExtensions
+
+@template DEFAULT =
+ """
+ (DEFAULT)
+
+ $(DOCSTRING)
+ """
+
+@template TYPES =
+ """
+ (TYPES)
+
+ $(TYPEDEF)
+
+ $(DOCSTRING)
+ """
+
+@template (METHODS, MACROS) =
+ """
+ (METHODS, MACROS)
+
+ $(SIGNATURES)
+
+ $(DOCSTRING)
+
+ $(METHODLIST)
+ """
+
+"constant `K`"
+const K = 1
+
+"mutable struct `T`"
+mutable struct T end
+
+"method `f`"
+f(x) = x
+
+"method `g`"
+g(::Type{T}) where {T} = T # Issue 32
+
+"macro `@m`"
+macro m(x) end
+
+module InnerModule
+
+ import ..TemplateTests
+
+ using DocStringExtensions
+
+ @template DEFAULT = TemplateTests
+
+ @template METHODS = TemplateTests
+
+ @template MACROS =
+ """
+ (MACROS)
+
+ $(DOCSTRING)
+
+ $(SIGNATURES)
+ """
+
+ "constant `K`"
+ const K = 1
+
+ "mutable struct `T`"
+ mutable struct T end
+
+ "method `f`"
+ f(x) = x
+
+ "macro `@m`"
+ macro m(x) end
+end
+
+module OtherModule
+
+ import ..TemplateTests
+
+ using DocStringExtensions
+
+ @template TYPES = TemplateTests
+ @template MACROS = TemplateTests.InnerModule
+
+ "mutable struct `T`"
+ mutable struct T end
+
+ "macro `@m`"
+ macro m(x) end
+
+ "method `f`"
+ f(x) = x
+end
+
+end
--- /dev/null
+const DSE = DocStringExtensions
+
+include("templates.jl")
+
+module M
+
+export f
+
+f(x) = x
+
+g(x = 1, y = 2, z = 3; kwargs...) = x
+
+const A{T} = Union{Vector{T}, Matrix{T}}
+
+h_1(x::A) = x
+h_2(x::A{Int}) = x
+h_3(x::A{T}) where {T} = x
+
+i_1(x; y = x) = x * y
+i_2(x::Int; y = x) = x * y
+i_3(x::T; y = x) where {T} = x * y
+i_4(x; y::T = zero(T), z::U = zero(U)) where {T, U} = x + y + z
+
+j_1(x, y) = x * y # two arguments, no keyword arguments
+j_1(x; y = x) = x * y # one argument, one keyword argument
+
+mutable struct T
+ a
+ b
+ c
+end
+
+struct K
+ K(; a = 1) = new()
+end
+
+
+abstract type AbstractType <: Integer end
+
+struct CustomType{S, T <: Integer} <: Integer
+end
+
+primitive type BitType8 8 end
+
+primitive type BitType32 <: Real 32 end
+
+end
+
+@testset "DocStringExtensions" begin
+ @testset "Base assumptions" begin
+ # The package heavily relies on type and docsystem-related methods and types from
+ # Base, which are generally undocumented and their behaviour might change at any
+        # time. This set of tests documents and verifies the assumptions the package makes
+ # about them.
+ #
+ # The testset is not comprehensive -- i.e. DocStringExtensions makes use of
+ # undocumented features that are not tested here. Should you come across anything
+ # like that, please add a test here.
+ #
+
+ # Getting keyword arguments of a method.
+ #
+ # Used in src/utilities.jl for the keywords() function.
+ #
+ # The methodology is based on a snippet in Base at base/replutil.jl:572-576
+ # (commit 3b45cdc9aab0). It uses the undocumented Base.kwarg_decl() function.
+ @test isdefined(Base, :kwarg_decl)
+ # Its signature is kwarg_decl(m::Method, kwtype::DataType). The second argument
+ # should be the type of the kwsorter from the corresponding MethodTable.
+ @test isa(methods(M.j_1), Base.MethodList)
+ @test isdefined(methods(M.j_1), :mt)
+ local mt = methods(M.j_1).mt
+ @test isa(mt, Core.MethodTable)
+ @test isdefined(mt, :kwsorter)
+        # .kwsorter is not always defined -- it seems to be missing when none of the
+        # methods have keyword arguments:
+ @test isdefined(methods(M.f).mt, :kwsorter) === false
+ # M.j_1 has two methods. Fetch the single argument one..
+ local m = which(M.j_1, (Any,))
+ @test isa(m, Method)
+ # .. which should have a single keyword argument, :y
+ # Base.kwarg_decl returns a Vector{Any} of the keyword arguments.
+ local kwargs = Base.kwarg_decl(m, typeof(mt.kwsorter))
+ @test isa(kwargs, Vector{Any})
+ @test kwargs == [:y]
+        # Base.kwarg_decl will return a Tuple{} for some reason when called on a method
+        # that does not have any keyword arguments
+ m = which(M.j_1, (Any,Any)) # fetch the no-keyword method
+ @test Base.kwarg_decl(m, typeof(methods(M.j_1).mt.kwsorter)) == Tuple{}()
+ end
+ @testset "format" begin
+ # Setup.
+ doc = Docs.DocStr(Core.svec(), nothing, Dict())
+ buf = IOBuffer()
+
+ # Errors.
+ @test_throws ErrorException DSE.format(nothing, buf, doc)
+
+ @testset "imports & exports" begin
+ # Module imports.
+ doc.data = Dict(
+ :binding => Docs.Binding(Main, :M),
+ :typesig => Union{},
+ )
+ DSE.format(IMPORTS, buf, doc)
+ str = String(take!(buf))
+ @test occursin("\n - `Base`\n", str)
+ @test occursin("\n - `Core`\n", str)
+
+ # Module exports.
+ DSE.format(EXPORTS, buf, doc)
+ str = String(take!(buf))
+ @test occursin("\n - [`f`](@ref)\n", str)
+ end
+
+ @testset "type fields" begin
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :T),
+ :fields => Dict(
+ :a => "one",
+ :b => "two",
+ ),
+ )
+ DSE.format(FIELDS, buf, doc)
+ str = String(take!(buf))
+ @test occursin(" - `a`", str)
+ @test occursin(" - `b`", str)
+ @test occursin(" - `c`", str)
+ @test occursin("one", str)
+ @test occursin("two", str)
+ end
+
+ @testset "method lists" begin
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :f),
+ :typesig => Tuple{Any},
+ :module => M,
+ )
+ DSE.format(METHODLIST, buf, doc)
+ str = String(take!(buf))
+ @test occursin("```julia", str)
+ @test occursin("f(x)", str)
+ @test occursin(joinpath("test", "tests.jl"), str)
+ end
+
+ @testset "method signatures" begin
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :f),
+ :typesig => Tuple{Any},
+ :module => M,
+ )
+ DSE.format(SIGNATURES, buf, doc)
+ str = String(take!(buf))
+ @test occursin("\n```julia\n", str)
+ @test occursin("\nf(x)\n", str)
+ @test occursin("\n```\n", str)
+
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :g),
+ :typesig => Union{Tuple{}, Tuple{Any}},
+ :module => M,
+ )
+ DSE.format(SIGNATURES, buf, doc)
+ str = String(take!(buf))
+ @test occursin("\n```julia\n", str)
+ @test occursin("\ng()\n", str)
+ @test occursin("\ng(x)\n", str)
+ @test occursin("\n```\n", str)
+
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :g),
+ :typesig => Union{Tuple{}, Tuple{Any}, Tuple{Any, Any}, Tuple{Any, Any, Any}},
+ :module => M,
+ )
+ DSE.format(SIGNATURES, buf, doc)
+ str = String(take!(buf))
+ @test occursin("\n```julia\n", str)
+ @test occursin("\ng()\n", str)
+ @test occursin("\ng(x)\n", str)
+ @test occursin("\ng(x, y)\n", str)
+ @test occursin("\ng(x, y, z; kwargs...)\n", str)
+ @test occursin("\n```\n", str)
+ end
+
+ @testset "function names" begin
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :f),
+ :typesig => Tuple{Any},
+ :module => M,
+ )
+ DSE.format(FUNCTIONNAME, buf, doc)
+ str = String(take!(buf))
+ @test str == "f"
+ end
+
+ @testset "type definitions" begin
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :AbstractType),
+ :typesig => Union{},
+ :module => M,
+ )
+ DSE.format(TYPEDEF, buf, doc)
+ str = String(take!(buf))
+ @test str == "\n```julia\nabstract type AbstractType <: Integer\n```\n\n"
+
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :CustomType),
+ :typesig => Union{},
+ :module => M,
+ )
+ DSE.format(TYPEDEF, buf, doc)
+ str = String(take!(buf))
+ @test str == "\n```julia\nstruct CustomType{S, T<:Integer} <: Integer\n```\n\n"
+
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :BitType8),
+ :typesig => Union{},
+ :module => M,
+ )
+ DSE.format(TYPEDEF, buf, doc)
+ str = String(take!(buf))
+ @test str == "\n```julia\nprimitive type BitType8 8\n```\n\n"
+
+ doc.data = Dict(
+ :binding => Docs.Binding(M, :BitType32),
+ :typesig => Union{},
+ :module => M,
+ )
+ DSE.format(TYPEDEF, buf, doc)
+ str = String(take!(buf))
+ @test str == "\n```julia\nprimitive type BitType32 <: Real 32\n```\n\n"
+ end
+ end
+ @testset "templates" begin
+ let fmt = expr -> Markdown.plain(eval(:(@doc $expr)))
+ @test occursin("(DEFAULT)", fmt(:(TemplateTests.K)))
+ @test occursin("(TYPES)", fmt(:(TemplateTests.T)))
+ @test occursin("(METHODS, MACROS)", fmt(:(TemplateTests.f)))
+ @test occursin("(METHODS, MACROS)", fmt(:(TemplateTests.g)))
+ @test occursin("(METHODS, MACROS)", fmt(:(TemplateTests.@m)))
+
+ @test occursin("(DEFAULT)", fmt(:(TemplateTests.InnerModule.K)))
+ @test occursin("(DEFAULT)", fmt(:(TemplateTests.InnerModule.T)))
+ @test occursin("(METHODS, MACROS)", fmt(:(TemplateTests.InnerModule.f)))
+ @test occursin("(MACROS)", fmt(:(TemplateTests.InnerModule.@m)))
+
+ @test occursin("(TYPES)", fmt(:(TemplateTests.OtherModule.T)))
+ @test occursin("(MACROS)", fmt(:(TemplateTests.OtherModule.@m)))
+ @test fmt(:(TemplateTests.OtherModule.f)) == "method `f`\n"
+ end
+ end
+ @testset "utilities" begin
+ @testset "keywords" begin
+ @test DSE.keywords(M.T, first(methods(M.T))) == Symbol[]
+ @test DSE.keywords(M.K, first(methods(M.K))) == [:a]
+ @test DSE.keywords(M.f, first(methods(M.f))) == Symbol[]
+ let f = (() -> ()),
+ m = first(methods(f))
+ @test DSE.keywords(f, m) == Symbol[]
+ end
+ let f = ((a) -> ()),
+ m = first(methods(f))
+ @test DSE.keywords(f, m) == Symbol[]
+ end
+ let f = ((; a = 1) -> ()),
+ m = first(methods(f))
+ @test DSE.keywords(f, m) == [:a]
+ end
+ let f = ((; a = 1, b = 2) -> ()),
+ m = first(methods(f))
+ @test DSE.keywords(f, m) == [:a, :b]
+ end
+ let f = ((; a...) -> ()),
+ m = first(methods(f))
+ @test DSE.keywords(f, m) == [Symbol("a...")]
+ end
+ # Tests for #42
+ let f = M.i_1, m = first(methods(f))
+ @test DSE.keywords(f, m) == [:y]
+ end
+ let f = M.i_2, m = first(methods(f))
+ @test DSE.keywords(f, m) == [:y]
+ end
+ let f = M.i_3, m = first(methods(f))
+ @test DSE.keywords(f, m) == [:y]
+ end
+ let f = M.i_4, m = first(methods(f))
+ @test DSE.keywords(f, m) == [:y, :z]
+ end
+ end
+ @testset "arguments" begin
+ @test DSE.arguments(first(methods(M.T))) == [:a, :b, :c]
+ @test DSE.arguments(first(methods(M.K))) == Symbol[]
+ @test DSE.arguments(first(methods(M.f))) == [:x]
+ let m = first(methods(() -> ()))
+ @test DSE.arguments(m) == Symbol[]
+ end
+ let m = first(methods((a) -> ()))
+ @test DSE.arguments(m) == [:a]
+ end
+ let m = first(methods((; a = 1) -> ()))
+ @test DSE.arguments(m) == Symbol[]
+ end
+ let m = first(methods((x; a = 1, b = 2) -> ()))
+ @test DSE.arguments(m) == Symbol[:x]
+ end
+ let m = first(methods((; a...) -> ()))
+ @test DSE.arguments(m) == Symbol[]
+ end
+ end
+ @testset "printmethod" begin
+ let b = Docs.Binding(M, :T),
+ f = M.T,
+ m = first(methods(f))
+ @test DSE.printmethod(b, f, m) == "T(a, b, c)"
+ end
+ let b = Docs.Binding(M, :K),
+ f = M.K,
+ m = first(methods(f))
+ @test DSE.printmethod(b, f, m) == "K(; a)"
+ end
+ let b = Docs.Binding(M, :f),
+ f = M.f,
+ m = first(methods(f))
+ @test DSE.printmethod(b, f, m) == "f(x)"
+ end
+ let b = Docs.Binding(Main, :f),
+ f = () -> (),
+ m = first(methods(f))
+ @test DSE.printmethod(b, f, m) == "f()"
+ end
+ let b = Docs.Binding(Main, :f),
+ f = (a) -> (),
+ m = first(methods(f))
+ @test DSE.printmethod(b, f, m) == "f(a)"
+ end
+ let b = Docs.Binding(Main, :f),
+ f = (; a = 1) -> (),
+ m = first(methods(f))
+ @test DSE.printmethod(b, f, m) == "f(; a)"
+ end
+ let b = Docs.Binding(Main, :f),
+ f = (; a = 1, b = 2) -> (),
+ m = first(methods(f))
+ # Keywords are not ordered, so check for both combinations.
+ @test DSE.printmethod(b, f, m) in ("f(; a, b)", "f(; b, a)")
+ end
+ let b = Docs.Binding(Main, :f),
+ f = (; a...) -> (),
+ m = first(methods(f))
+ @test DSE.printmethod(b, f, m) == "f(; a...)"
+ end
+ let b = Docs.Binding(Main, :f),
+ f = (; a = 1, b = 2, c...) -> (),
+ m = first(methods(f))
+ # Keywords are not ordered, so check for both combinations.
+ @test DSE.printmethod(b, f, m) in ("f(; a, b, c...)", "f(; b, a, c...)")
+ end
+ end
+ @testset "getmethods" begin
+ @test length(DSE.getmethods(M.f, Union{})) == 1
+ @test length(DSE.getmethods(M.f, Tuple{})) == 0
+ @test length(DSE.getmethods(M.f, Union{Tuple{}, Tuple{Any}})) == 1
+ @test length(DSE.getmethods(M.h_3, Tuple{M.A{Int}})) == 1
+ @test length(DSE.getmethods(M.h_3, Tuple{Vector{Int}})) == 1
+ @test length(DSE.getmethods(M.h_3, Tuple{Array{Int, 3}})) == 0
+ end
+ @testset "methodgroups" begin
+ @test length(DSE.methodgroups(M.f, Tuple{Any}, M)) == 1
+ @test length(DSE.methodgroups(M.f, Tuple{Any}, M)[1]) == 1
+ @test length(DSE.methodgroups(M.h_1, Tuple{M.A}, M)) == 1
+ @test length(DSE.methodgroups(M.h_1, Tuple{M.A}, M)[1]) == 1
+ @test length(DSE.methodgroups(M.h_2, Tuple{M.A{Int}}, M)) == 1
+ @test length(DSE.methodgroups(M.h_2, Tuple{M.A{Int}}, M)[1]) == 1
+ @test length(DSE.methodgroups(M.h_3, Tuple{M.A}, M)[1]) == 1
+ end
+ @testset "alltypesigs" begin
+ @test DSE.alltypesigs(Union{}) == Any[]
+ @test DSE.alltypesigs(Union{Tuple{}}) == Any[Tuple{}]
+ @test DSE.alltypesigs(Tuple{}) == Any[Tuple{}]
+
+ # TODO: Clean me up
+ T = Type{T} where {T}
+ @test DSE.alltypesigs(T) ==
+ Base.rewrap_unionall.(DSE.uniontypes(Base.unwrap_unionall(T)), T)
+ end
+ @testset "groupby" begin
+ let groups = DSE.groupby(Int, Vector{Int}, collect(1:10)) do each
+ mod(each, 3), each
+ end
+ @test groups == Pair{Int, Vector{Int}}[
+ 0 => [3, 6, 9],
+ 1 => [1, 4, 7, 10],
+ 2 => [2, 5, 8],
+ ]
+ end
+ end
+ @testset "url" begin
+ @test !isempty(DSE.url(first(methods(sin))))
+ @test !isempty(DSE.url(first(methods(DSE.parsedocs))))
+ @test !isempty(DSE.url(first(methods(M.f))))
+ @test !isempty(DSE.url(first(methods(M.K))))
+ end
+ @testset "comparemethods" begin
+ let f = first(methods(M.f)),
+ g = first(methods(M.g))
+ @test !DSE.comparemethods(f, f)
+ @test DSE.comparemethods(f, g)
+ @test !DSE.comparemethods(g, f)
+ end
+ end
+ end
+end
+
+DSE.parsedocs(DSE)
--- /dev/null
+https://github.com/JuliaDocs/DocStringExtensions.jl/releases
--- /dev/null
+Documenter.jl-0.20.0
\ No newline at end of file
--- /dev/null
+comment: false
--- /dev/null
+language: julia
+
+os:
+ - linux
+ - osx
+
+julia:
+ - 0.7
+ - 1.0
+ - nightly
+
+notifications:
+ email: false
+
+after_success:
+ - julia --project=coverage/ -e 'using Pkg; Pkg.instantiate()'
+ - julia --project=coverage/ coverage/coverage.jl
+
+jobs:
+ include:
+ - stage: "Documentation"
+ julia: 1.0
+ os: linux
+ script:
+ - julia --project=docs/ -e 'using Pkg; Pkg.instantiate(); Pkg.develop(PackageSpec(path=pwd()))'
+ - julia --project=docs/ docs/make.jl
+ after_success: skip
--- /dev/null
+# Documenter.jl changelog
+
+## Version `v0.20.0`
+
+* Documenter v0.20 requires at least Julia v0.7. ([#795][github-795])
+
+* ![BREAKING][badge-breaking] Documentation deployment via the `deploydocs` function has changed considerably.
+
+ - The user-facing directories (URLs) of different versions and what gets displayed in the version selector have changed. By default, Documenter now creates the `stable/` directory (as before) and a directory for every minor version (`vX.Y/`). The `release-X.Y` directories are no longer created. ([#706][github-706], [#813][github-813], [#817][github-817])
+
+ Technically, Documenter now deploys actual files only to `dev/` and `vX.Y.Z/` directories. The directories (URLs) that change from version to version (e.g. `latest/`, `vX.Y`) are implemented as symlinks on the `gh-pages` branch.
+
+ The version selector will only display `vX.Y/`, `stable/` and `dev/` directories by default. This behavior can be customized with the `versions` keyword of `deploydocs`.
+
+ - Documentation from the development branch (e.g. `master`) now deploys to `dev/` by default (instead of `latest/`). This can be customized with the `devurl` keyword. ([#802][github-802])
+
+ - The `latest` keyword to `deploydocs` has been deprecated and renamed to `devbranch`. ([#802][github-802])
+
+ - The `julia` and `osname` keywords to `deploydocs` are now deprecated. ([#816][github-816])
+
+ - The default values of the `target`, `deps` and `make` keywords to `deploydocs` have been changed. See the default format change below for more information. ([#826][github-826])
+
+ **For upgrading:**
+
+  - If you are using the `latest` keyword, then just use `devbranch` with the same value instead (see the sketch after this list).
+
+ - Update links that point to `latest/` to point to `dev/` instead (e.g. in the README).
+
+ - Remove any links to the `release-X.Y` branches and remove the directories from your `gh-pages` branch.
+
+ - The operating system and Julia version should be specified in the Travis build stage configuration (via `julia:` and `os:` options, see "Hosting Documentation" in the manual for more details) or by checking the `TRAVIS_JULIA_VERSION` and `TRAVIS_OS_NAME` environment variables in `make.jl` yourself.
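+
+  For example, a `deploydocs` call using the renamed keyword might look like this
+  (sketch; the repository slug is a placeholder):
+
+  ```julia
+  deploydocs(
+      repo = "github.com/USER/Package.jl.git",
+      devbranch = "master",  # previously: latest = "master"
+  )
+  ```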
+
+* ![BREAKING][badge-breaking] `makedocs` will now build Documenter's native HTML output by default and `deploydocs`' defaults now assume the HTML output. ([#826][github-826])
+
+ - The default value of the `format` keyword of `makedocs` has been changed to `:html`.
+
+ - The default value of the `target` keyword to `deploydocs` has been changed to `"build"`.
+
+ - The default value of the `make` and `deps` keywords to `deploydocs` have been changed to `nothing`.
+
+ **For upgrading:** If you are relying on the Markdown/MkDocs output, you now need to:
+
+ - In `makedocs`, explicitly set `format = :markdown`
+
+ - In `deploydocs`, explicitly set
+
+ ```julia
+ target = "site"
+ deps = Deps.pip("pygments", "mkdocs")
+ make = () -> run(`mkdocs build`)
+ ```
+
+ - Explicitly import `DocumenterMarkdown` in `make.jl`. See the `MarkdownWriter` deprecation below.
+
+ If you already specify any of the changed keywords, then you do not need to make any changes to those keywords you already set.
+
+ However, if you are setting any of the values to the new defaults (e.g. when you are already using the HTML output), you may now rely on the new defaults.
+
+* ![Deprecation][badge-deprecation] The Markdown/MkDocs (`format = :markdown`) and PDF/LaTeX (`format = :latex`) outputs now require an external package to be loaded ([DocumenterMarkdown](https://github.com/JuliaDocs/DocumenterMarkdown.jl) and [DocumenterLaTeX](https://github.com/JuliaDocs/DocumenterLaTeX.jl), respectively). ([#833][github-833])
+
+ **For upgrading:** Make sure that the respective extra package is installed and then just add `using DocumenterMarkdown` or `using DocumenterLaTeX` to `make.jl`.
+
+* ![BREAKING][badge-breaking] "Pretty URLs" are now enabled by default for the HTML output. The default value of the `html_prettyurls` keyword has been changed to `true`.
+
+ For a page `foo/page.md` Documenter now generates `foo/page/index.html`, instead of `foo/page.html`.
+ On GitHub pages deployments it means that your URLs look like `foo/page/` instead of `foo/page.html`.
+
+  For local builds, where the pages are browsed directly from the filesystem, you should explicitly set `html_prettyurls = false`.
+
+ **For upgrading:** If you wish to retain the old behavior, set `html_prettyurls = false` in `makedocs`. If you already set `html_prettyurls`, you do not need to change anything.
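+
+  For example, a `makedocs` call that keeps the old behaviour might look like this
+  (sketch; other keywords elided):
+
+  ```julia
+  makedocs(
+      sitename = "Example.jl",
+      html_prettyurls = false,  # emit foo/page.html instead of foo/page/index.html
+  )
+  ```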
+
+* ![BREAKING][badge-breaking] The `Travis.genkeys` and `Documenter.generate` functions have been moved to a separate [DocumenterTools.jl package](https://github.com/JuliaDocs/DocumenterTools.jl). ([#789][github-789])
+
+* ![Enhancement][badge-enhancement] If Documenter is not able to determine which Git hosting service is being used to host the source, the "Edit on XXX" links become "Edit source" with a generic icon. ([#804][github-804])
+
+* ![Enhancement][badge-enhancement] The at-blocks now support `MIME"text/html"` rendering of objects (e.g. for interactive plots). That is, if a type has `show(io, ::MIME"text/html", x)` defined, Documenter now uses that when rendering the objects in the document. ([#764][github-764])
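+
+  A type can opt in by defining an HTML `show` method, e.g. (illustrative; `MyPlot`
+  is a placeholder type):
+
+  ```julia
+  struct MyPlot end
+  Base.show(io::IO, ::MIME"text/html", ::MyPlot) = print(io, "<div>...</div>")
+  ```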
+
+* ![Enhancement][badge-enhancement] Enhancements to the sidebar. When loading a page, the sidebar now jumps to the current page. Also, the scrollbar in WebKit-based browsers looks less intrusive now. ([#792][github-792], [#854][github-854], [#863][github-863])
+
+* ![Enhancement][badge-enhancement] Minor style enhancements to admonitions. ([#841][github-841])
+
+* ![Bugfix][badge-bugfix] The at-blocks that execute code can now handle `include` statements. ([#793][github-793], [#794][github-794])
+
+* ![Bugfix][badge-bugfix] At-docs blocks no longer give an error when containing empty lines. ([#823][github-823], [#824][github-824])
+
+[github-706]: https://github.com/JuliaDocs/Documenter.jl/pull/706
+[github-764]: https://github.com/JuliaDocs/Documenter.jl/pull/764
+[github-789]: https://github.com/JuliaDocs/Documenter.jl/pull/789
+[github-792]: https://github.com/JuliaDocs/Documenter.jl/pull/792
+[github-793]: https://github.com/JuliaDocs/Documenter.jl/pull/793
+[github-794]: https://github.com/JuliaDocs/Documenter.jl/pull/794
+[github-795]: https://github.com/JuliaDocs/Documenter.jl/pull/795
+[github-802]: https://github.com/JuliaDocs/Documenter.jl/pull/802
+[github-804]: https://github.com/JuliaDocs/Documenter.jl/pull/804
+[github-813]: https://github.com/JuliaDocs/Documenter.jl/pull/813
+[github-816]: https://github.com/JuliaDocs/Documenter.jl/pull/816
+[github-817]: https://github.com/JuliaDocs/Documenter.jl/pull/817
+[github-823]: https://github.com/JuliaDocs/Documenter.jl/pull/823
+[github-824]: https://github.com/JuliaDocs/Documenter.jl/pull/824
+[github-826]: https://github.com/JuliaDocs/Documenter.jl/pull/826
+[github-833]: https://github.com/JuliaDocs/Documenter.jl/pull/833
+[github-841]: https://github.com/JuliaDocs/Documenter.jl/pull/841
+[github-854]: https://github.com/JuliaDocs/Documenter.jl/pull/854
+[github-863]: https://github.com/JuliaDocs/Documenter.jl/pull/863
+
+
+[badge-breaking]: https://img.shields.io/badge/BREAKING-red.svg
+[badge-deprecation]: https://img.shields.io/badge/deprecation-orange.svg
+[badge-feature]: https://img.shields.io/badge/feature-green.svg
+[badge-enhancement]: https://img.shields.io/badge/enhancement-blue.svg
+[badge-bugfix]: https://img.shields.io/badge/bugfix-purple.svg
+
+<!--
+# Badges
+
+![BREAKING][badge-breaking]
+![Deprecation][badge-deprecation]
+![Feature][badge-feature]
+![Enhancement][badge-enhancement]
+![Bugfix][badge-bugfix]
+-->
--- /dev/null
+The Documenter.jl package is licensed under the MIT "Expat" License:
+
+> Copyright (c) 2016: Michael Hatherly.
+>
+> Permission is hereby granted, free of charge, to any person obtaining a copy
+> of this software and associated documentation files (the "Software"), to deal
+> in the Software without restriction, including without limitation the rights
+> to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+> copies of the Software, and to permit persons to whom the Software is
+> furnished to do so, subject to the following conditions:
+>
+> The above copyright notice and this permission notice shall be included in all
+> copies or substantial portions of the Software.
+>
+> THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+> IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+> FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+> AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+> LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+> OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+> SOFTWARE.
+>
--- /dev/null
+
+# Documenter
+
+*A documentation generator for Julia.*
+
+| **Documentation** | **Build Status** |
+|:-------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------:|
+| [![][docs-stable-img]][docs-stable-url] [![][docs-dev-img]][docs-dev-url] | [![][travis-img]][travis-url] [![][appveyor-img]][appveyor-url] [![][codecov-img]][codecov-url] |
+
+
+## Installation
+
+The package can be installed with the Julia package manager.
+From the Julia REPL, type `]` to enter the Pkg REPL mode and run:
+
+```
+pkg> add Documenter
+```
+
+Or, equivalently, via the `Pkg` API:
+
+```julia
+julia> import Pkg; Pkg.add("Documenter")
+```
+
+## Documentation
+
+- [**STABLE**][docs-stable-url] — **documentation of the most recently tagged version.**
+- [**DEVEL**][docs-dev-url] — *documentation of the in-development version.*
+
+## Project Status
+
+The package is tested against Julia `0.7`, `1.0` and the nightly builds of the Julia `master` branch on Linux, macOS, and Windows.
+
+Support for Julia `0.4`, `0.5` and `0.6` has been dropped in the latest version, but those Julia versions may still work with older releases of Documenter (versions `0.8`, `0.11` and `0.19`, respectively).
+
+## Questions and Contributions
+
+Usage questions can be posted on the [Julia Discourse forum][discourse-tag-url] under the `documenter` tag, in the #documentation channel of the [Julia Slack](https://julialang.org/community/) and/or in the [JuliaDocs Gitter chat room][gitter-url].
+
+Contributions are very welcome, as are feature requests and suggestions. Please open an [issue][issues-url] if you encounter any problems. The [contributing page][contrib-url] has a few guidelines that should be followed when opening pull requests and contributing code.
+
+[contrib-url]: https://juliadocs.github.io/Documenter.jl/latest/man/contributing/
+[discourse-tag-url]: https://discourse.julialang.org/tags/documenter
+[gitter-url]: https://gitter.im/juliadocs/users
+
+[docs-dev-img]: https://img.shields.io/badge/docs-dev-blue.svg
+[docs-dev-url]: https://juliadocs.github.io/Documenter.jl/latest
+
+[docs-stable-img]: https://img.shields.io/badge/docs-stable-blue.svg
+[docs-stable-url]: https://juliadocs.github.io/Documenter.jl/stable
+
+[travis-img]: https://travis-ci.org/JuliaDocs/Documenter.jl.svg?branch=master
+[travis-url]: https://travis-ci.org/JuliaDocs/Documenter.jl
+
+[appveyor-img]: https://ci.appveyor.com/api/projects/status/xx7nimfpnl1r4gx0?svg=true
+[appveyor-url]: https://ci.appveyor.com/project/JuliaDocs/documenter-jl
+
+[codecov-img]: https://codecov.io/gh/JuliaDocs/Documenter.jl/branch/master/graph/badge.svg
+[codecov-url]: https://codecov.io/gh/JuliaDocs/Documenter.jl
+
+[issues-url]: https://github.com/JuliaDocs/Documenter.jl/issues
--- /dev/null
+julia 0.7
+DocStringExtensions 0.2
--- /dev/null
+environment:
+ matrix:
+ - julia_version: 0.7
+ - julia_version: 1.0
+ - julia_version: latest
+
+platform:
+ - x86 # 32-bit
+ - x64 # 64-bit
+
+## uncomment the following lines to allow failures on nightly julia
+## (tests will run but not make your overall status red)
+#matrix:
+# allow_failures:
+# - julia_version: latest
+
+branches:
+ only:
+ - master
+ - /release-.*/
+
+notifications:
+ - provider: Email
+ on_build_success: false
+ on_build_failure: false
+ on_build_status_changed: false
+
+install:
+ - ps: iex ((new-object net.webclient).DownloadString("https://raw.githubusercontent.com/JuliaCI/Appveyor.jl/version-1/bin/install.ps1"))
+
+build_script:
+ - echo "%JL_BUILD_SCRIPT%"
+ - C:\julia\bin\julia -e "%JL_BUILD_SCRIPT%"
+
+test_script:
+ - echo "%JL_TEST_SCRIPT%"
+ - C:\julia\bin\julia -e "%JL_TEST_SCRIPT%"
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!-- Created with Inkscape (http://www.inkscape.org/) -->
+
+<svg
+ xmlns:dc="http://purl.org/dc/elements/1.1/"
+ xmlns:cc="http://creativecommons.org/ns#"
+ xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
+ xmlns:svg="http://www.w3.org/2000/svg"
+ xmlns="http://www.w3.org/2000/svg"
+ xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
+ xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
+ width="16.5mm"
+ height="8.6603003mm"
+ viewBox="0 0 58.464567 30.686103"
+ id="svg2"
+ version="1.1"
+ inkscape:version="0.91 r13725"
+ sodipodi:docname="arrow.svg">
+ <defs
+ id="defs4" />
+ <sodipodi:namedview
+ id="base"
+ pagecolor="#ffffff"
+ bordercolor="#666666"
+ borderopacity="1.0"
+ inkscape:pageopacity="0.0"
+ inkscape:pageshadow="2"
+ inkscape:zoom="11.2"
+ inkscape:cx="14.209234"
+ inkscape:cy="29.780479"
+ inkscape:document-units="px"
+ inkscape:current-layer="layer1"
+ showgrid="false"
+ inkscape:window-width="1920"
+ inkscape:window-height="1053"
+ inkscape:window-x="0"
+ inkscape:window-y="27"
+ inkscape:window-maximized="1" />
+ <metadata
+ id="metadata7">
+ <rdf:RDF>
+ <cc:Work
+ rdf:about="">
+ <dc:format>image/svg+xml</dc:format>
+ <dc:type
+ rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
+ <dc:title></dc:title>
+ </cc:Work>
+ </rdf:RDF>
+ </metadata>
+ <g
+ inkscape:label="Layer 1"
+ inkscape:groupmode="layer"
+ id="layer1"
+ transform="translate(0,-1021.6761)">
+ <path
+ style="fill:#000000;fill-opacity:1;fill-rule:evenodd;stroke:#000000;stroke-width:0;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ d="m 0,1021.6761 35.433071,0 -17.716536,30.6861 z"
+ id="path4140"
+ inkscape:connector-curvature="0"
+ sodipodi:nodetypes="cccc" />
+ </g>
+</svg>
--- /dev/null
+/*
+ * The default CSS style for Documenter.jl generated sites
+ *
+ * Heavily inspired by the Julia Sphinx theme
+ * https://github.com/JuliaLang/JuliaDoc
+ * which extends the sphinx_rtd_theme
+ * https://github.com/snide/sphinx_rtd_theme
+ *
+ * Part of Documenter.jl
+ * https://github.com/JuliaDocs/Documenter.jl
+ *
+ * License: MIT
+ */
+
+/* fonts */
+body, input {
+ font-family: 'Lato', 'Helvetica Neue', Arial, sans-serif;
+ font-size: 16px;
+ color: #222;
+ text-rendering: optimizeLegibility;
+}
+
+pre, code, kbd {
+ font-family: Inconsolata, Monaco, courier, monospace;
+ font-size: 0.90em;
+}
+
+pre code {
+ font-size: 1em;
+}
+
+a {
+ color: #2980b9;
+ text-decoration: none;
+}
+
+a:hover {
+ color: #3091d1;
+}
+
+a:visited {
+ color: #9b59b6;
+}
+
+body {
+ line-height: 1.5;
+}
+
+h1 {
+ font-size: 1.75em;
+}
+
+/* Unless the <h1> is the very first thing on the page (i.e. the second element
+ * in the <article>, after the <header>), we add some additional styling to it
+ * to make it stand out a bit more. This way we get a reasonable fallback if CSS3
+ * selectors are not supported in the browser.
+ */
+article > h1:not(:nth-child(2)) {
+ margin: 2.5em 0 0;
+ padding-bottom: 0.30em;
+ border-bottom: 1px solid #e5e5e5;
+}
+h2 {
+ font-size: 1.50em;
+ margin: 2.3em 0 0;
+ padding-bottom: 0.25em;
+ border-bottom: 1px solid #e5e5e5;
+}
+h3 {
+ font-size: 1.25em;
+ margin: 2.0em 0 0;
+}
+h4 { font-size: 1.15em; }
+h5 { font-size: 1.10em; }
+h6 { font-size: 1em; }
+
+h4, h5, h6 {
+ margin-top: 1.5em;
+ margin-bottom: 1em;
+}
+
+img {
+ max-width: 100%;
+}
+
+table {
+ border-collapse: collapse;
+ margin: 1em 0;
+}
+
+th, td {
+ border: 1px solid #e1e4e5;
+ padding: 0.5em 1em;
+}
+
+th {
+ border-bottom-width: 2px;
+}
+
+tr:nth-child(even) {
+ background-color: #f3f6f6;
+}
+
+hr {
+ border: 0;
+ border-top: 1px solid #e5e5e5;
+}
+
+/* Inline code and code blocks */
+
+code {
+ padding: 0.1em;
+ background-color: rgba(0,0,0,.04);
+ border-radius: 3px;
+}
+
+pre {
+ background-color: #f5f5f5;
+ border: 1px solid #dddddd;
+ border-radius: 3px;
+ padding: 0.5em;
+ overflow: auto;
+}
+
+pre code {
+ padding: 0;
+ background-color: initial;
+}
+
+kbd {
+ font-size: 0.70em;
+ display: inline-block;
+ padding: 0.1em 0.5em 0.4em 0.5em;
+ line-height: 1.0em;
+ color: #444d56;
+ vertical-align: middle;
+ background-color: #fafbfc;
+ border: solid 1px #c6cbd1;
+ border-bottom-color: #959da5;
+ border-radius: 3px;
+ box-shadow: inset 0 -1px 0 #959da5;
+}
+
+/* Headers in admonitions and docstrings */
+.admonition h1,
+article section.docstring h1 {
+ font-size: 1.25em;
+}
+
+.admonition h2,
+article section.docstring h2 {
+ font-size: 1.10em;
+}
+
+.admonition h3,
+.admonition h4,
+.admonition h5,
+.admonition h6,
+article section.docstring h3,
+article section.docstring h4,
+article section.docstring h5,
+article section.docstring h6 {
+ font-size: 1em;
+}
+
+/* Navigation */
+nav.toc {
+ position: fixed;
+ top: 0;
+ left: 0;
+ bottom: 0;
+ width: 20em;
+ display: flex;
+ flex-flow: column nowrap;
+ overflow-y: auto;
+ padding: 1em 0 0 0;
+ background-color: #fcfcfc;
+ box-shadow: inset -14px 0px 5px -12px rgb(210,210,210);
+}
+
+nav.toc .logo {
+ margin: 0 auto;
+ display: block;
+ max-height: 6em;
+ max-width: 18em;
+}
+
+nav.toc h1 {
+ text-align: center;
+ margin-top: .57em;
+ margin-bottom: 0;
+}
+
+nav.toc select {
+ display: block;
+ height: 2em;
+ flex-shrink: 0;
+ padding: 0 1.6em 0 1em;
+ min-width: 7em;
+ max-width: 90%;
+ max-width: calc(100% - 5em);
+ margin: 0 auto;
+ font-size: .83em;
+ border: 1px solid #c9c9c9;
+ border-radius: 1em;
+
+ /* TODO: doesn't seem to be centered on Safari */
+ text-align: center;
+ text-align-last: center;
+
+ appearance: none;
+ -moz-appearance: none;
+ -webkit-appearance: none;
+
+ background: white url("arrow.svg");
+ background-size: 1.155em;
+ background-repeat: no-repeat;
+ background-position: right;
+}
+
+nav.toc select:hover {
+ border: 1px solid #a0a0a0;
+}
+
+nav.toc select option {
+ text-align: center;
+}
+
+nav.toc input {
+ display: block;
+ height: 2em;
+ width: 90%;
+ width: calc(100% - 5em);
+ margin: 1.2em auto;
+ padding: 0 1em;
+ border: 1px solid #c9c9c9;
+ border-radius: 1em;
+ font-size: .83em;
+}
+
+nav.toc > ul * {
+ margin: 0;
+}
+
+nav.toc > ul {
+ min-height: 2em;
+ overflow-y: auto;
+ margin: 0;
+}
+
+nav.toc > ul > li:last-child {
+ padding-bottom: 1em;
+}
+
+nav.toc ul {
+ color: #404040;
+ padding: 0;
+ list-style: none;
+}
+
+nav.toc ul .toctext {
+ color: inherit;
+ display: block;
+}
+
+nav.toc ul a:hover {
+ color: #fcfcfc;
+ background-color: #4e4a4a;
+}
+
+nav.toc ul.internal a {
+ color: inherit;
+ display: block;
+}
+
+nav.toc ul.internal a:hover {
+ background-color: #d6d6d6;
+}
+
+nav.toc ul.internal {
+ background-color: #e3e3e3;
+ box-shadow: inset -14px 0px 5px -12px rgb(210,210,210);
+ list-style: none;
+}
+
+nav.toc ul.internal li.toplevel {
+ border-top: 1px solid #909090;
+ font-weight: bold;
+}
+
+nav.toc ul.internal li.toplevel:first-child {
+ border-top: none;
+}
+
+nav.toc .toctext {
+ padding-top: 0.3em;
+ padding-bottom: 0.3em;
+ padding-right: 1em;
+}
+
+nav.toc ul .toctext {
+ padding-left: 1em;
+}
+
+nav.toc ul ul .toctext {
+ padding-left: 2em;
+}
+
+nav.toc ul ul ul .toctext {
+ padding-left: 3em;
+}
+
+nav.toc li.current > .toctext {
+ border-top: 1px solid #c9c9c9;
+ border-bottom: 1px solid #c9c9c9;
+ color: #404040;
+ font-weight: bold;
+ background-color: white;
+}
+
+nav.toc ul::-webkit-scrollbar {
+ width: .4em;
+ background: none;
+}
+
+nav.toc ul::-webkit-scrollbar-thumb {
+ border-radius: 5px;
+ background: #c9c9c9;
+}
+
+nav.toc ul::-webkit-scrollbar-thumb:hover {
+ border-radius: 5px;
+ background: #aaaaaa;
+}
+
+article {
+ margin-left: 20em;
+ min-width: 20em;
+ max-width: 48em;
+ padding: 2em;
+}
+
+article > header {}
+
+article > header div#topbar {
+ display: none;
+}
+
+article > header nav ul {
+ display: inline-block;
+ list-style: none;
+ margin: 0;
+ padding: 0;
+}
+
+article > header nav li {
+ display: inline-block;
+ padding-right: 0.2em;
+}
+
+article > header nav li:before {
+ content: "»";
+ padding-right: 0.2em;
+}
+
+article > header .edit-page {
+ float: right;
+}
+
+article > footer {}
+
+article > footer a.prev {
+ float: left;
+}
+article > footer a.next {
+ float: right;
+}
+
+article > footer a .direction:after {
+ content: ": ";
+}
+
+article hr {
+ margin: 1em 0;
+}
+
+article section.docstring {
+ border: 1px solid #ddd;
+ margin: 0.5em 0;
+ padding: 0.5em;
+ border-radius: 3px;
+}
+
+article section.docstring .docstring-header {
+ margin-bottom: 1em;
+}
+
+article section.docstring .docstring-binding {
+ color: #333;
+ font-weight: bold;
+}
+
+article section.docstring .docstring-category {
+ font-style: italic;
+}
+
+article section.docstring a.source-link {
+ display: block;
+ font-weight: bold;
+}
+
+.nav-anchor,
+.nav-anchor:hover,
+.nav-anchor:visited {
+ color: #333;
+}
+
+/*
+ * Admonitions
+ *
+ * Colors (title, body)
+ * warning: #f0b37e #ffedcc (orange)
+ * note: #6ab0de #e7f2fa (blue)
+ * tip: #1abc9c #dbfaf4 (green)
+*/
+.admonition {
+ border-radius: 3px;
+ background-color: #eeeeee;
+ margin: 1em 0;
+}
+
+.admonition-title {
+ border-radius: 3px 3px 0 0;
+ background-color: #9b9b9b;
+ padding: 0.15em 0.5em;
+}
+
+.admonition-text {
+ padding: 0.5em;
+}
+
+.admonition-text > :first-child {
+ margin-top: 0;
+}
+
+.admonition-text > :last-child {
+ margin-bottom: 0;
+}
+
+.admonition > .admonition-title:before {
+ font-family: "FontAwesome";
+ margin-right: 5px;
+ content: "\f06a";
+}
+
+.admonition.warning > .admonition-title {
+ background-color: #f0b37e;
+}
+
+.admonition.warning {
+ background-color: #ffedcc;
+}
+
+.admonition.note > .admonition-title {
+ background-color: #6ab0de;
+}
+
+.admonition.note {
+ background-color: #e7f2fa;
+}
+
+.admonition.tip > .admonition-title {
+ background-color: #1abc9c;
+}
+
+.admonition.tip {
+ background-color: #dbfaf4;
+}
+
+
+/* footnotes */
+.footnote {
+ padding-left: 0.8em;
+ border-left: 2px solid #ccc;
+}
+
+/* Search page */
+#search-results .category {
+ font-size: smaller;
+}
+
+/* Overriding the <code> block style of highlight.js.
+ * We have to override the padding and the background-color, since we style this
+ * part ourselves. Specifically, we style the <pre> surrounding the <code>, while
+ * highlight.js applies the .hljs style directly to the <code> tag.
+ */
+.hljs {
+ background-color: transparent;
+ padding: 0;
+}
+
+@media only screen and (max-width: 768px) {
+ nav.toc {
+ position: fixed;
+ width: 16em;
+ left: -16em;
+ -webkit-overflow-scrolling: touch;
+ -webkit-transition-property: left; /* Safari */
+ -webkit-transition-duration: 0.3s; /* Safari */
+ transition-property: left;
+ transition-duration: 0.3s;
+ -webkit-transition-timing-function: ease-out; /* Safari */
+ transition-timing-function: ease-out;
+ z-index: 2;
+ box-shadow: 5px 0px 5px 0px rgb(210,210,210);
+ }
+
+ nav.toc.show {
+ left: 0;
+ }
+
+ article {
+ margin-left: 0;
+ padding: 3em 0.9em 0 0.9em; /* top right bottom left */
+ overflow-wrap: break-word;
+ }
+
+ article > header {
+ position: fixed;
+ left: 0;
+ z-index: 1;
+ }
+
+ article > header nav, hr {
+ display: none;
+ }
+
+ article > header div#topbar {
+ display: block; /* is mobile */
+ position: fixed;
+ width: 100%;
+ height: 1.5em;
+ padding-top: 1em;
+ padding-bottom: 1em;
+ background-color: #fcfcfc;
+ box-shadow: 0 1px 3px rgba(0,0,0,.26);
+ top: 0;
+ -webkit-transition-property: top; /* Safari */
+ -webkit-transition-duration: 0.3s; /* Safari */
+ transition-property: top;
+ transition-duration: 0.3s;
+ }
+
+ article > header div#topbar.headroom--unpinned.headroom--not-top.headroom--not-bottom {
+ top: -4em;
+ -webkit-transition-property: top; /* Safari */
+ -webkit-transition-duration: 0.7s; /* Safari */
+ transition-property: top;
+ transition-duration: 0.7s;
+ }
+
+ article > header div#topbar span {
+ width: 80%;
+ height: 1.5em;
+ margin-top: -0.1em;
+ margin-left: 0.9em;
+ font-size: 1.2em;
+ overflow: hidden;
+ }
+
+ article > header div#topbar a.fa-bars {
+ float: right;
+ padding: 0.6em;
+ margin-top: -0.6em;
+ margin-right: 0.3em;
+ font-size: 1.5em;
+ }
+
+ article > header div#topbar a.fa-bars:visited {
+ color: #3091d1;
+ }
+
+ article table {
+ overflow-x: auto;
+ display: block;
+ }
+
+ article div.MathJax_Display {
+ overflow: scroll;
+ }
+
+ article span.MathJax {
+ overflow: hidden;
+ }
+}
+
+@media only screen and (max-width: 320px) {
+ body {
+ font-size: 15px;
+ }
+}
--- /dev/null
+/*
+ * Part of Documenter.jl
+ * https://github.com/JuliaDocs/Documenter.jl
+ *
+ * License: MIT
+ */
+
+requirejs.config({
+ paths: {
+ 'jquery': 'file:///usr/share/javascript/jquery/jquery.min.js',
+ 'jqueryui': 'file:///usr/share/javascript/jquery-ui/jquery-ui.min.js',
+ //'headroom': 'https://cdnjs.cloudflare.com/ajax/libs/headroom/0.9.3/headroom.min',
+ 'headroom': 'http://localhost',
+ 'mathjax': 'file:///usr/share/javascript/mathjax/MathJax.js?config=TeX-AMS_HTML',
+ 'highlight': 'file:///usr/lib/nodejs/highlight.js/highlight.js',
+        'highlight-julia': 'file:///usr/lib/nodejs/highlight.js/languages/julia.js',
+        'highlight-julia-repl': 'file:///usr/lib/nodejs/highlight.js/languages/julia-repl.js',
+ },
+ shim: {
+ 'mathjax' : {
+ exports: "MathJax"
+ },
+ 'highlight-julia': ['highlight'],
+ 'highlight-julia-repl': ['highlight'],
+ }
+});
+
+// Load MathJax
+require(['mathjax'], function(MathJax) {
+ MathJax.Hub.Config({
+ "tex2jax": {
+ inlineMath: [['$','$'], ['\\(','\\)']],
+ processEscapes: true
+ }
+ });
+ MathJax.Hub.Config({
+ config: ["MMLorHTML.js"],
+ jax: [
+ "input/TeX",
+ "output/HTML-CSS",
+ "output/NativeMML"
+ ],
+ extensions: [
+ "MathMenu.js",
+ "MathZoom.js",
+ "TeX/AMSmath.js",
+ "TeX/AMSsymbols.js",
+ "TeX/autobold.js",
+ "TeX/autoload-all.js"
+ ]
+ });
+ MathJax.Hub.Config({
+ TeX: { equationNumbers: { autoNumber: "AMS" } }
+ });
+})
+
+require(['jquery', 'highlight', 'highlight-julia', 'highlight-julia-repl'], function($, hljs) {
+ $(document).ready(function() {
+ hljs.initHighlighting();
+ })
+
+})
+
+// update the version selector with info from the siteinfo.js and ../versions.js files
+require(['jquery'], function($) {
+ $(document).ready(function() {
+ var version_selector = $("#version-selector");
+
+ // add the current version to the selector based on siteinfo.js, but only if the selector is empty
+ if (typeof DOCUMENTER_CURRENT_VERSION !== 'undefined' && $('#version-selector > option').length == 0) {
+ var option = $("<option value='#' selected='selected'>" + DOCUMENTER_CURRENT_VERSION + "</option>");
+ version_selector.append(option);
+ }
+
+ if (typeof DOC_VERSIONS !== 'undefined') {
+ var existing_versions = $('#version-selector > option');
+ var existing_versions_texts = existing_versions.map(function(i,x){return x.text});
+ DOC_VERSIONS.forEach(function(each) {
+ var version_url = documenterBaseURL + "/../" + each;
+ var existing_id = $.inArray(each, existing_versions_texts);
+ // if not already in the version selector, add it as a new option,
+ // otherwise update the old option with the URL and enable it
+ if (existing_id == -1) {
+ var option = $("<option value='" + version_url + "'>" + each + "</option>");
+ version_selector.append(option);
+ } else {
+ var option = existing_versions[existing_id];
+ option.value = version_url;
+ option.disabled = false;
+ }
+ });
+ }
+
+ // only show the version selector if the selector has been populated
+ if ($('#version-selector > option').length > 0) {
+ version_selector.css("visibility", "visible");
+ }
+
+ // Scroll the navigation bar to the currently selected menu item
+ $("nav.toc > ul").get(0).scrollTop = $(".current").get(0).offsetTop - $("nav.toc > ul").get(0).offsetTop;
+ })
+
+})
+
+// mobile
+require(['jquery', 'headroom'], function($, Headroom) {
+ $(document).ready(function() {
+ var navtoc = $("nav.toc");
+ $("nav.toc li.current a.toctext").click(function() {
+ navtoc.toggleClass('show');
+ });
+ $("article > header div#topbar a.fa-bars").click(function(ev) {
+ ev.preventDefault();
+ navtoc.toggleClass('show');
+ if (navtoc.hasClass('show')) {
+ var title = $("article > header div#topbar span").text();
+ $("nav.toc ul li a:contains('" + title + "')").focus();
+ }
+ });
+ $("article#docs").bind('click', function(ev) {
+ if ($(ev.target).is('div#topbar a.fa-bars')) {
+ return;
+ }
+ if (navtoc.hasClass('show')) {
+ navtoc.removeClass('show');
+ }
+ });
+ if ($("article > header div#topbar").css('display') == 'block') {
+ var headroom = new Headroom(document.querySelector("article > header div#topbar"), {"tolerance": {"up": 10, "down": 10}});
+ headroom.init();
+ }
+ })
+})
--- /dev/null
+/*
+ * Part of Documenter.jl
+ * https://github.com/JuliaDocs/Documenter.jl
+ *
+ * License: MIT
+ */
+
+// parseUri 1.2.2
+// (c) Steven Levithan <stevenlevithan.com>
+// MIT License
+function parseUri (str) {
+ var o = parseUri.options,
+ m = o.parser[o.strictMode ? "strict" : "loose"].exec(str),
+ uri = {},
+ i = 14;
+
+ while (i--) uri[o.key[i]] = m[i] || "";
+
+ uri[o.q.name] = {};
+ uri[o.key[12]].replace(o.q.parser, function ($0, $1, $2) {
+ if ($1) uri[o.q.name][$1] = $2;
+ });
+
+ return uri;
+};
+parseUri.options = {
+ strictMode: false,
+ key: ["source","protocol","authority","userInfo","user","password","host","port","relative","path","directory","file","query","anchor"],
+ q: {
+ name: "queryKey",
+ parser: /(?:^|&)([^&=]*)=?([^&]*)/g
+ },
+ parser: {
+ strict: /^(?:([^:\/?#]+):)?(?:\/\/((?:(([^:@]*)(?::([^:@]*))?)?@)?([^:\/?#]*)(?::(\d*))?))?((((?:[^?#\/]*\/)*)([^?#]*))(?:\?([^#]*))?(?:#(.*))?)/,
+ loose: /^(?:(?![^:@]+:[^:@\/]*@)([^:\/?#.]+):)?(?:\/\/)?((?:(([^:@]*)(?::([^:@]*))?)?@)?([^:\/?#]*)(?::(\d*))?)(((\/(?:[^?#](?![^?#\/]*\.[^?#\/.]+(?:[?#]|$)))*\/?)?([^?#\/]*))(?:\?([^#]*))?(?:#(.*))?)/
+ }
+};
+
+requirejs.config({
+ paths: {
+ 'jquery': 'https://cdnjs.cloudflare.com/ajax/libs/jquery/3.1.1/jquery.min',
+ 'lunr': 'https://cdnjs.cloudflare.com/ajax/libs/lunr.js/2.3.1/lunr.min',
+ 'lodash': 'https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.4/lodash.min',
+ }
+});
+
+var currentScript = document.currentScript;
+
+require(["jquery", "lunr", "lodash"], function($, lunr, _) {
+ $("#search-form").submit(function(e) {
+ e.preventDefault()
+ })
+
+ // list below is the lunr 2.1.3 list minus the intersect with names(Base)
+ // (all, any, get, in, is, which) and (do, else, for, let, where, while, with)
+ // ideally we'd just filter the original list but it's not available as a variable
+ lunr.stopWordFilter = lunr.generateStopWordFilter([
+ 'a',
+ 'able',
+ 'about',
+ 'across',
+ 'after',
+ 'almost',
+ 'also',
+ 'am',
+ 'among',
+ 'an',
+ 'and',
+ 'are',
+ 'as',
+ 'at',
+ 'be',
+ 'because',
+ 'been',
+ 'but',
+ 'by',
+ 'can',
+ 'cannot',
+ 'could',
+ 'dear',
+ 'did',
+ 'does',
+ 'either',
+ 'ever',
+ 'every',
+ 'from',
+ 'got',
+ 'had',
+ 'has',
+ 'have',
+ 'he',
+ 'her',
+ 'hers',
+ 'him',
+ 'his',
+ 'how',
+ 'however',
+ 'i',
+ 'if',
+ 'into',
+ 'it',
+ 'its',
+ 'just',
+ 'least',
+ 'like',
+ 'likely',
+ 'may',
+ 'me',
+ 'might',
+ 'most',
+ 'must',
+ 'my',
+ 'neither',
+ 'no',
+ 'nor',
+ 'not',
+ 'of',
+ 'off',
+ 'often',
+ 'on',
+ 'only',
+ 'or',
+ 'other',
+ 'our',
+ 'own',
+ 'rather',
+ 'said',
+ 'say',
+ 'says',
+ 'she',
+ 'should',
+ 'since',
+ 'so',
+ 'some',
+ 'than',
+ 'that',
+ 'the',
+ 'their',
+ 'them',
+ 'then',
+ 'there',
+ 'these',
+ 'they',
+ 'this',
+ 'tis',
+ 'to',
+ 'too',
+ 'twas',
+ 'us',
+ 'wants',
+ 'was',
+ 'we',
+ 'were',
+ 'what',
+ 'when',
+ 'who',
+ 'whom',
+ 'why',
+ 'will',
+ 'would',
+ 'yet',
+ 'you',
+ 'your'
+ ])
+
+ // add . as a separator, because otherwise "title": "Documenter.Anchors.add!"
+ // would not find anything if searching for "add!", only for the entire qualification
+ lunr.tokenizer.separator = /[\s\-\.]+/
+
+ // custom trimmer that doesn't strip @ and !, which are used in julia macro and function names
+ lunr.trimmer = function (token) {
+ return token.update(function (s) {
+ return s.replace(/^[^a-zA-Z0-9@!]+/, '').replace(/[^a-zA-Z0-9@!]+$/, '')
+ })
+ }
+
+ lunr.Pipeline.registerFunction(lunr.stopWordFilter, 'juliaStopWordFilter')
+ lunr.Pipeline.registerFunction(lunr.trimmer, 'juliaTrimmer')
+
+ var index = lunr(function () {
+ this.ref('location')
+ this.field('title')
+ this.field('text')
+ documenterSearchIndex['docs'].forEach(function(e) {
+ this.add(e)
+ }, this)
+ })
+ var store = {}
+
+ documenterSearchIndex['docs'].forEach(function(e) {
+ store[e.location] = {title: e.title, category: e.category}
+ })
+
+ $(function(){
+ function update_search(querystring) {
+ tokens = lunr.tokenizer(querystring)
+ results = index.query(function (q) {
+ tokens.forEach(function (t) {
+ q.term(t.toString(), {
+ fields: ["title"],
+ boost: 100,
+ usePipeline: false,
+ editDistance: 0,
+ wildcard: lunr.Query.wildcard.NONE
+ })
+ q.term(t.toString(), {
+ fields: ["title"],
+ boost: 10,
+ usePipeline: false,
+ editDistance: 2,
+ wildcard: lunr.Query.wildcard.NONE
+ })
+ q.term(t.toString(), {
+ fields: ["text"],
+ boost: 1,
+ usePipeline: true,
+ editDistance: 0,
+ wildcard: lunr.Query.wildcard.NONE
+ })
+ })
+ })
+ $('#search-info').text("Number of results: " + results.length)
+ $('#search-results').empty()
+ results.forEach(function(result) {
+ data = store[result.ref]
+ link = $('<a>')
+ link.text(data.title)
+ link.attr('href', documenterBaseURL+'/'+result.ref)
+ cat = $('<span class="category">('+data.category+')</span>')
+ li = $('<li>').append(link).append(" ").append(cat)
+ $('#search-results').append(li)
+ })
+ }
+
+ function update_search_box() {
+ querystring = $('#search-query').val()
+ update_search(querystring)
+ }
+
+ $('#search-query').keyup(_.debounce(update_search_box, 250))
+ $('#search-query').change(update_search_box)
+
+ search_query_uri = parseUri(window.location).queryKey["q"]
+ if(search_query_uri !== undefined) {
+ search_query = decodeURIComponent(search_query_uri.replace(/\+/g, '%20'))
+ $("#search-query").val(search_query)
+ }
+ update_search_box();
+ })
+})
--- /dev/null
+% font settings
+\usepackage{fontspec, newunicodechar, polyglossia}
+
+\setsansfont{Lato}[Scale=MatchLowercase, Ligatures=TeX]
+\setmonofont{Roboto Mono}[Scale=MatchLowercase]
+\renewcommand{\familydefault}{\sfdefault}
+%
+
+% colours
+\usepackage{xcolor}
+
+\definecolor{light-blue}{HTML}{6b85dd}
+\definecolor{dark-blue}{HTML}{4266d5}
+\definecolor{light-red}{HTML}{d66661}
+\definecolor{dark-red}{HTML}{c93d39}
+\definecolor{light-green}{HTML}{6bab5b}
+\definecolor{dark-green}{HTML}{3b972e}
+\definecolor{light-purple}{HTML}{aa7dc0}
+\definecolor{dark-purple}{HTML}{945bb0}
+%
+
+% maths
+\usepackage{amsmath, amssymb}
+%
+
+% listings
+\usepackage{listings, minted}
+
+\lstset{
+ basicstyle = \small\ttfamily,
+ breaklines = true,
+ columns = fullflexible,
+ frame = leftline,
+ keepspaces = true,
+ showstringspaces = false,
+ xleftmargin = 3pt,
+}
+
+\setminted{
+ breaklines = true,
+ fontsize = \small,
+ frame = leftline,
+}
+%
+
+% tables
+\usepackage{tabulary}
+%
+
+% hyperref
+\usepackage{hyperref}
+\hypersetup{
+ pdfpagelabels,
+ bookmarks,
+ hyperindex,
+ unicode = true,
+ linkcolor = dark-blue,
+ urlcolor = dark-purple,
+ colorlinks = true,
+}
+%
+
+% table of contents
+\maxtocdepth{subsection}
+%
+
+% paragraphs
+\setlength{\parindent}{0pt}
+\nonzeroparskip
+%
+
+% adjust margins
+\setulmarginsandblock{1.5in}{1in}{*}
+\setlrmarginsandblock{1.5in}{1in}{*}
+\setheaderspaces{1in}{*}{*}
+\checkandfixthelayout
+%
--- /dev/null
+div.wy-menu-vertical ul.current li.toctree-l3 a {
+ font-weight: bold;
+}
+
+a.documenter-source {
+ float: right;
+}
+
+.documenter-methodtable pre {
+ margin-left: 0px;
+ margin-right: 0px;
+ margin-top: 0px;
+ padding: 0px;
+}
+
+.documenter-methodtable pre.documenter-inline {
+ display: inline;
+}
--- /dev/null
+MathJax.Hub.Config({
+ "tex2jax": {
+ inlineMath: [['$','$'], ['\\(','\\)']],
+ processEscapes: true
+ }
+});
+MathJax.Hub.Config({
+ config: ["MMLorHTML.js"],
+ jax: [
+ "input/TeX",
+ "output/HTML-CSS",
+ "output/NativeMML"
+ ],
+ extensions: [
+ "MathMenu.js",
+ "MathZoom.js",
+ "TeX/AMSmath.js",
+ "TeX/AMSsymbols.js",
+ "TeX/autobold.js",
+ "TeX/autoload-all.js"
+ ]
+});
+MathJax.Hub.Config({
+ TeX: { equationNumbers: { autoNumber: "AMS" } }
+});
--- /dev/null
+[deps]
+Coverage = "a2441757-f6aa-5fb2-8edb-039e3f45d037"
--- /dev/null
+# Only run coverage from linux nightly build on travis.
+get(ENV, "TRAVIS_OS_NAME", "") == "linux" || exit()
+get(ENV, "TRAVIS_JULIA_VERSION", "") == "nightly" || exit()
+
+using Coverage
+
+cd(joinpath(dirname(@__FILE__), "..")) do
+ Codecov.submit(Codecov.process_folder())
+end
--- /dev/null
+[[Base64]]
+uuid = "2a0f44e3-6c83-55bd-87e4-b1978d98bd5f"
+
+[[Dates]]
+deps = ["Printf"]
+uuid = "ade2ca70-3891-5945-98fb-dc099432e06a"
+
+[[Distributed]]
+deps = ["LinearAlgebra", "Random", "Serialization", "Sockets"]
+uuid = "8ba89e20-285c-5b6f-9357-94700520ee1b"
+
+[[DocStringExtensions]]
+deps = ["LibGit2", "Markdown", "Pkg", "Test"]
+git-tree-sha1 = "a016e0bfe98a748c4488e2248c2ef4c67d6fdd35"
+uuid = "ffbed154-4ef7-542d-bbb7-c09d3a79fcae"
+version = "0.5.0"
+
+[[DocumenterTools]]
+deps = ["Base64", "DocStringExtensions", "LibGit2"]
+git-tree-sha1 = "519f8026e8f719ec018b70cfb394ae3756b427b1"
+repo-rev = "master"
+repo-url = "https://github.com/JuliaDocs/DocumenterTools.jl"
+uuid = "46bb7b6e-9a58-11e8-21f8-9f2f45a0fcba"
+version = "0.0.0"
+
+[[InteractiveUtils]]
+deps = ["LinearAlgebra", "Markdown"]
+uuid = "b77e0a4c-d291-57a0-90e8-8db25a27a240"
+
+[[LibGit2]]
+uuid = "76f85450-5226-5b5a-8eaa-529ad045b433"
+
+[[Libdl]]
+uuid = "8f399da3-3557-5675-b5ff-fb832c97cbdb"
+
+[[LinearAlgebra]]
+deps = ["Libdl"]
+uuid = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
+
+[[Logging]]
+uuid = "56ddb016-857b-54e1-b83d-db4d58db5568"
+
+[[Markdown]]
+deps = ["Base64"]
+uuid = "d6f4376e-aef5-505a-96c1-9c027394607a"
+
+[[Pkg]]
+deps = ["Dates", "LibGit2", "Markdown", "Printf", "REPL", "Random", "SHA", "UUIDs"]
+uuid = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
+
+[[Printf]]
+deps = ["Unicode"]
+uuid = "de0858da-6303-5e67-8744-51eddeeeb8d7"
+
+[[REPL]]
+deps = ["InteractiveUtils", "Markdown", "Sockets"]
+uuid = "3fa0cd96-eef1-5676-8a61-b3b8758bbffb"
+
+[[Random]]
+deps = ["Serialization"]
+uuid = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
+
+[[SHA]]
+uuid = "ea8e919c-243c-51af-8825-aaa63cd721ce"
+
+[[Serialization]]
+uuid = "9e88b42a-f829-5b0c-bbe9-9e923198166b"
+
+[[Sockets]]
+uuid = "6462fe0b-24de-5631-8697-dd941f90decc"
+
+[[Test]]
+deps = ["Distributed", "InteractiveUtils", "Logging", "Random"]
+uuid = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
+
+[[UUIDs]]
+deps = ["Random"]
+uuid = "cf7118a7-6976-5b1a-9a39-7adc72f591a4"
+
+[[Unicode]]
+uuid = "4ec0a83e-493e-50e2-b9ac-8f72acf5a8f5"
--- /dev/null
+[deps]
+DocumenterTools = "46bb7b6e-9a58-11e8-21f8-9f2f45a0fcba"
--- /dev/null
+using Documenter, DocumenterTools
+
+makedocs(
+ modules = [Documenter, DocumenterTools],
+ clean = false,
+ assets = ["assets/favicon.ico"],
+ sitename = "Documenter.jl",
+ authors = "Michael Hatherly, Morten Piibeleht, and contributors.",
+ analytics = "UA-89508993-1",
+ linkcheck = !("skiplinks" in ARGS),
+ pages = [
+ "Home" => "index.md",
+ "Manual" => Any[
+ "Guide" => "man/guide.md",
+ "man/examples.md",
+ "man/syntax.md",
+ "man/doctests.md",
+ "man/latex.md",
+ hide("man/hosting.md", [
+ "man/hosting/walkthrough.md"
+ ]),
+ "man/other-formats.md",
+ ],
+ "Library" => Any[
+ "Public" => "lib/public.md",
+ hide("Internals" => "lib/internals.md", Any[
+ "lib/internals/anchors.md",
+ "lib/internals/builder.md",
+ "lib/internals/cross-references.md",
+ "lib/internals/docchecks.md",
+ "lib/internals/docsystem.md",
+ "lib/internals/doctests.md",
+ "lib/internals/documenter.md",
+ "lib/internals/documentertools.md",
+ "lib/internals/documents.md",
+ "lib/internals/dom.md",
+ "lib/internals/expanders.md",
+ "lib/internals/formats.md",
+ "lib/internals/mdflatten.md",
+ "lib/internals/selectors.md",
+ "lib/internals/textdiff.md",
+ "lib/internals/utilities.md",
+ "lib/internals/writers.md",
+ ])
+ ],
+ "contributing.md",
+ ],
+ # Use clean URLs, unless built as a "local" build
+ html_prettyurls = !("local" in ARGS),
+ html_canonical = "https://juliadocs.github.io/Documenter.jl/stable/",
+)
+
+deploydocs(
+ repo = "github.com/JuliaDocs/Documenter.jl.git",
+ target = "build",
+)
--- /dev/null
+# Contributing
+
+This page details some of the guidelines that should be followed when contributing to this package.
+
+
+## Branches
+
+From `Documenter` version `0.3` onwards `release-*` branches are used for tagged minor versions of this package. This follows the same approach used in the main Julia repository, albeit on a much more modest scale.
+
+Please open pull requests against the `master` branch rather than any of the `release-*` branches whenever possible.
+
+### Backports
+
+Bug fixes are backported to the `release-*` branches using `git cherry-pick -x` by a JuliaDocs member and will become available in point releases of that particular minor version of the package.
+
+Feel free to nominate commits that should be backported by opening an issue. Requests for new point releases to be tagged in `METADATA.jl` can also be made in the same way.
+
+### `release-*` branches
+
+ * Each new minor version `x.y.0` gets a branch called `release-x.y` (a [protected branch](https://help.github.com/articles/about-protected-branches/)).
+ * New versions are usually tagged only from the `release-x.y` branches.
+ * For patch releases, changes get backported to the `release-x.y` branch via a single PR with the standard name "Backports for x.y.z" and label ["Type: Backport"](https://github.com/JuliaDocs/Documenter.jl/pulls?q=label%3A%22Type%3A+Backport%22). The PR message links to all the PRs that are providing commits to the backport. The PR gets merged as a merge commit (i.e. not squashed).
+ * The old `release-*` branches may be removed once they have outlived their usefulness.
+ * Patch version [milestones](https://github.com/JuliaDocs/Documenter.jl/milestones) are used to keep track of which PRs get backported etc.
+
+
+## Style Guide
+
+Follow the style of the surrounding text when making changes. When adding new features, please try to stick to the following points whenever applicable; a short illustrative sketch follows the Julia list below.
+
+### Julia
+
+ * 4-space indentation;
+ * modules spanning entire files should not be indented, but modules that have surrounding code should;
+ * no blank lines at the start or end of files;
+ * do not manually align syntax such as `=` or `::` over adjacent lines;
+ * use `function ... end` when a method definition contains more than one toplevel expression;
+ * related short-form method definitions don't need a new line between them;
+ * unrelated or long-form method definitions must have a blank line separating each one;
+ * surround all binary operators with whitespace except for `::`, `^`, and `:`;
+ * files containing a single `module ... end` must be named after the module;
+ * method arguments should be ordered based on the amount of usage within the method body;
+ * methods extended from other modules must follow their inherited argument order, not the above rule;
+ * explicit `return` should be preferred except in short-form method definitions;
+ * avoid dense expressions where possible e.g. prefer nested `if`s over complex nested `?`s;
+ * include a trailing `,` in vectors, tuples, or method calls that span several lines;
+ * do not use multiline comments (`#=` and `=#`);
+ * wrap long lines as near to 92 characters as possible, this includes docstrings;
+ * follow the standard naming conventions used in `Base`.
+
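+As a brief, hypothetical sketch (the `clamp_scores` function below is made up purely for
+illustration and is not part of any package), several of these points -- four-space
+indentation, an explicit `return` in a long-form definition, whitespace around binary
+operators but not `::`, and a trailing `,` in a call spanning several lines -- might look
+like this:
+
+```julia
+function clamp_scores(scores::Vector{Float64}, lo::Float64, hi::Float64)
+    # Long-form definition with an explicit `return`; binary operators are
+    # surrounded by whitespace, but `::` is not.
+    return [min(max(s, lo), hi) for s in scores]
+end
+
+clamp_scores(
+    [0.2, 1.4, -0.3],
+    0.0,
+    1.0,
+)
+```
+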
+### Markdown
+
+ * Use unbalanced `#` headers, i.e. no `#` on the right hand side of the header text;
+ * include a single blank line between toplevel blocks;
+ * unordered lists must use `*` bullets with two preceding spaces;
+ * do *not* hard wrap lines;
+ * use emphasis (`*`) and bold (`**`) sparingly;
+ * always use fenced code blocks instead of indented blocks;
+ * follow the conventions outlined in the Julia documentation page on documentation.
--- /dev/null
+# Documenter.jl
+
+*A documentation generator for Julia.*
+
+A package for building documentation from docstrings and markdown files.
+
+!!! note
+
+ Please read through the
+ [Documentation](https://docs.julialang.org/en/latest/manual/documentation/) section
+ of the main Julia manual if this is your first time using Julia's documentation system.
+ Once you've read through how to write documentation for your code then come back here.
+
+## Package Features
+
+- Write all your documentation in [Markdown](https://en.wikipedia.org/wiki/Markdown).
+- Minimal configuration.
+- Supports Julia `0.7` and `1.0`.
+- Doctests Julia code blocks.
+- Cross references for docs and section headers.
+- [``\LaTeX`` syntax](@ref latex_syntax) support.
+- Checks for missing docstrings and incorrect cross references.
+- Generates tables of contents and docstring indexes.
+- Automatically builds and deploys docs from Travis to GitHub Pages.
+
+The [Package Guide](@ref) provides a tutorial explaining how to get started using Documenter.
+
+Some examples of packages using Documenter can be found on the [Examples](@ref) page.
+
+See the [Index](@ref main-index) for the complete list of documented functions and types.
+
+## Manual Outline
+
+```@contents
+Pages = [
+ "man/guide.md",
+ "man/examples.md",
+ "man/syntax.md",
+ "man/doctests.md",
+ "man/hosting.md",
+ "man/latex.md",
+ "man/contributing.md",
+]
+Depth = 1
+```
+
+## Library Outline
+
+```@contents
+Pages = ["lib/public.md", "lib/internals.md"]
+```
+
+### [Index](@id main-index)
+
+```@index
+Pages = ["lib/public.md"]
+```
--- /dev/null
+# Internal Documentation
+
+This page lists all the documented internals of the `Documenter` module and submodules.
+
+## Contents
+
+```@contents
+Pages = [joinpath("internals", f) for f in readdir("internals")]
+```
+
+## Index
+
+A list of all internal documentation sorted by module.
+
+```@index
+Pages = [joinpath("internals", f) for f in readdir("internals")]
+```
--- /dev/null
+# Anchors
+
+```@autodocs
+Modules = [Documenter.Anchors]
+```
--- /dev/null
+# Builder
+
+```@autodocs
+Modules = [Documenter.Builder]
+```
--- /dev/null
+# CrossReferences
+
+```@autodocs
+Modules = [Documenter.CrossReferences]
+```
--- /dev/null
+# DocChecks
+
+```@autodocs
+Modules = [Documenter.DocChecks]
+```
--- /dev/null
+# DocSystem
+
+```@autodocs
+Modules = [Documenter.DocSystem]
+```
--- /dev/null
+# DocTests
+
+```@autodocs
+Modules = [Documenter.DocTests]
+```
--- /dev/null
+# Documenter
+
+```@docs
+Documenter.gitrm_copy
+Documenter.git_push
+```
--- /dev/null
+# DocumenterTools
+
+```@docs
+DocumenterTools.package_devpath
+```
+
+## Generator
+
+```@autodocs
+Modules = [DocumenterTools.Generator]
+```
--- /dev/null
+# Documents
+
+```@autodocs
+Modules = [Documenter.Documents]
+```
--- /dev/null
+# DOM
+
+```@autodocs
+Modules = [Documenter.Utilities.DOM]
+```
--- /dev/null
+# Expanders
+
+```@autodocs
+Modules = [Documenter.Expanders]
+```
--- /dev/null
+# Formats
+
+```@autodocs
+Modules = [Documenter.Formats]
+```
--- /dev/null
+# MDFlatten
+
+```@autodocs
+Modules = [Documenter.Utilities.MDFlatten]
+```
--- /dev/null
+# Selectors
+
+```@autodocs
+Modules = [Documenter.Selectors]
+```
--- /dev/null
+# TextDiff
+
+```@autodocs
+Modules = [Documenter.Utilities.TextDiff]
+```
--- /dev/null
+# Utilities
+
+```@autodocs
+Modules = [Documenter.Utilities]
+```
--- /dev/null
+# Writers
+
+```@autodocs
+Modules = [
+ Documenter.Writers,
+ Documenter.Writers.MarkdownWriter,
+ Documenter.Writers.HTMLWriter,
+ Documenter.Writers.LaTeXWriter,
+]
+```
--- /dev/null
+# Public Documentation
+
+Documentation for `Documenter.jl`'s public interface.
+
+See [Internal Documentation](@ref) for internal package docs covering all submodules.
+
+## Contents
+
+```@contents
+Pages = ["public.md"]
+```
+
+## Index
+
+```@index
+Pages = ["public.md"]
+```
+
+## Public Interface
+
+```@docs
+Documenter
+makedocs
+hide
+deploydocs
+Deps
+Deps.pip
+```
+
+## DocumenterTools
+
+```@docs
+DocumenterTools.generate
+DocumenterTools.Travis.genkeys
+DocumenterTools.Travis
+```
--- /dev/null
+# Doctests
+
+Documenter will, by default, try to run `jldoctest` code blocks that it finds in the generated
+documentation. This helps to keep documentation examples from becoming outdated,
+incorrect, or misleading. It's recommended that as many of a package's examples as
+possible be runnable by Documenter's doctests.
+
+This section of the manual outlines how to go about enabling doctests for code blocks in
+your package's documentation.
+
+## "Script" Examples
+
+The first of the two types of doctests is the "script" code block. To make Documenter detect
+this kind of code block the following format must be used:
+
+````markdown
+```jldoctest
+a = 1
+b = 2
+a + b
+
+# output
+
+3
+```
+````
+
+The code block's "language" must be `jldoctest` and must include a line containing the text `#
+output`. The text before this line is the contents of the script which is run. The text that
+appears after `# output` is the textual representation that would be shown in the Julia REPL
+if the script had been `include`d.
+
+The actual output produced by running the "script" is compared to the expected result and
+any difference will result in [`makedocs`](@ref) throwing an error and terminating.
+
+Note that the amount of whitespace appearing above and below the `# output` line is not
+significant and can be increased or decreased if desired.
+
+It is possible to suppress the output from the doctest by setting the `output` keyword
+argument to `false`, for example
+
+````markdown
+```jldoctest; output = false
+a = 1
+b = 2
+a + b
+
+# output
+
+3
+```
+````
+
+Note that the output of the script will still be compared to the expected result,
+i.e. what is in the `# output` section, but the `# output` section will be suppressed in
+the rendered documentation.
+
+## REPL Examples
+
+The other kind of doctest is a simulated Julia REPL session. The following format is
+detected by Documenter as a REPL doctest:
+
+````markdown
+```jldoctest
+julia> a = 1
+1
+
+julia> b = 2;
+
+julia> c = 3; # comment
+
+julia> a + b + c
+6
+```
+````
+
+As with script doctests, the code block must have its language set to `jldoctest`. When a code
+block contains one or more `julia> ` prompts at the start of a line, it is assumed to be a REPL
+doctest. Semi-colons, `;`, at the end of a line work in the same way as in the Julia REPL
+and will suppress the output, although the line is still evaluated.
+
+Note that not all features of the REPL are supported, such as the shell and help modes.
+
+## Exceptions
+
+Doctests can also test for thrown exceptions and their stacktraces. The actual and expected
+results are compared by checking whether the expected result matches the start of
+the actual result. Hence, both of the following errors will match the actual result.
+
+````markdown
+```jldoctest
+julia> div(1, 0)
+ERROR: DivideError: integer division error
+ in div(::Int64, ::Int64) at ./int.jl:115
+
+julia> div(1, 0)
+ERROR: DivideError: integer division error
+```
+````
+
+If instead the first `div(1, 0)` error was written as
+
+````markdown
+```jldoctest
+julia> div(1, 0)
+ERROR: DivideError: integer division error
+ in div(::Int64, ::Int64) at ./int.jl:114
+```
+````
+
+where line `115` is replaced with `114` then the doctest will fail.
+
+In the second `div(1, 0)`, where no stacktrace is shown, readers may get the impression
+that no stacktrace will be displayed when they try to recreate the error themselves. To
+indicate that the output is truncated and does not show the entire stacktrace (or any of
+it), you may write `[...]` on the line where checking should stop, i.e.
+
+````markdown
+```jldoctest
+julia> div(1, 0)
+ERROR: DivideError: integer division error
+[...]
+```
+````
+
+## Preserving Definitions Between Blocks
+
+Every doctest block is evaluated inside its own `module`. This means that definitions
+(types, variables, functions etc.) from a block can *not* be used in the next block.
+For example:
+
+````markdown
+```jldoctest
+julia> foo = 42
+42
+```
+````
+
+The variable `foo` will not be defined in the next block:
+
+````markdown
+```jldoctest
+julia> println(foo)
+ERROR: UndefVarError: foo not defined
+```
+````
+
+To preserve definitions it is possible to label blocks in order to collect several blocks
+into the same module. All blocks with the same label (in the same file) will be evaluated
+in the same module, and hence share scope. This can be useful if the same definitions are
+used in more than one block, with for example text, or other doctest blocks, in between.
+Example:
+
+````markdown
+```jldoctest mylabel
+julia> foo = 42
+42
+```
+````
+
+Now, since the block below has the same label as the block above, the variable `foo` can
+be used:
+
+````markdown
+```jldoctest mylabel
+julia> println(foo)
+42
+```
+````
+
+!!! note
+
+    Labeled doctest blocks do not need to be consecutive (as in the example above) to be
+    included in the same module. They can be interspersed with unlabeled blocks or blocks
+    with another label.
+
+## Setup Code
+
+Doctests may require some setup code that must be evaluated prior to the actual
+example, but that should not be displayed in the final documentation. For this purpose a
+`@meta` block containing a `DocTestSetup = ...` value can be used. In the example below,
+the function `foo` is defined inside a `@meta` block. This block will be evaluated at
+the start of the following doctest blocks:
+
+````markdown
+```@meta
+DocTestSetup = quote
+ function foo(x)
+ return x^2
+ end
+end
+```
+
+```jldoctest
+julia> foo(2)
+4
+```
+
+```@meta
+DocTestSetup = nothing
+```
+````
+
+The `DocTestSetup = nothing` is not strictly necessary, but good practice nonetheless to
+help avoid unintentional definitions in following doctest blocks.
+
+Another option is to use the `setup` keyword argument, which is convenient for short definitions,
+and for setups needed in inline docstrings.
+
+````markdown
+```jldoctest; setup = :(foo(x) = x^2)
+julia> foo(2)
+4
+```
+````
+
+!!! note
+
+ The `DocTestSetup` and the `setup` values are **re-evaluated** at the start of *each* doctest block
+ and no state is shared between any code blocks.
+ To preserve definitions see [Preserving Definitions Between Blocks](@ref).
+
+## Filtering Doctests
+
+A part of the output of a doctest might be non-deterministic, e.g. pointer addresses and timings.
+It is therefore possible to filter a doctest so that the deterministic part can still be tested.
+
+A filter takes the form of a regular expression.
+In a doctest, each match in the expected output and the actual output is removed before the two outputs are compared.
+Filters are added globally, i.e. applied to all doctests in the documentation, by passing a list of regular expressions to
+`makedocs` with the keyword `doctestfilters`.
+
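+For example, a minimal sketch of such a call (the filter regex here is only illustrative and
+mirrors the `@time` example further down) might look like:
+
+```julia
+makedocs(
+    # ... other options ...
+    doctestfilters = [
+        r"[0-9\.]+ seconds \(.*\)",  # strip non-deterministic timing output
+    ],
+)
+```
+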
+For more fine-grained control it is possible to define filters in `@meta` blocks by assigning them
+to the `DocTestFilters` variable, either as a single regular expression (`DocTestFilters = r"foo"`)
+or as a vector of several regexes (`DocTestFilters = [r"foo", r"bar"]`).
+
+An example is given below where some of the non-deterministic output from `@time` is filtered.
+
+````markdown
+```@meta
+DocTestFilters = r"[0-9\.]+ seconds \(.*\)"
+```
+
+```jldoctest
+julia> @time [1,2,3,4]
+ 0.000003 seconds (5 allocations: 272 bytes)
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+
+```@meta
+DocTestFilters = nothing
+```
+````
+
+The `DocTestFilters = nothing` is not strictly necessary, but good practice nonetheless to
+help avoid unintentional filtering in following doctest blocks.
+
+Another option is to use the `filter` keyword argument. This defines a doctest-local filter
+which is only active for the specific doctest. Note that such filters are not shared between
+named doctests either. It is possible to define a filter with a single regex (`filter = r"foo"`)
+or a list of regexes (`filter = [r"foo", r"bar"]`). Example:
+
+````markdown
+```jldoctest; filter = r"[0-9\.]+ seconds \(.*\)"
+julia> @time [1,2,3,4]
+ 0.000003 seconds (5 allocations: 272 bytes)
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+````
+
+!!! note
+
+ The global filters, filters defined in `@meta` blocks, and filters defined with the `filter`
+ keyword argument are all applied to each doctest.
+
+## Fixing Outdated Doctests
+
+To fix outdated doctests, the `doctest` flag to [`makedocs`](@ref) can be set to
+`doctest = :fix`. This will run the doctests, and overwrite the old results with
+the new output.
+
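+For example, a small sketch (using the hypothetical `Example` package from the Package
+Guide as a stand-in) might be:
+
+```julia
+using Documenter, Example
+
+makedocs(sitename = "Example.jl", doctest = :fix)
+```
+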
+!!! note
+
+    The `:fix` option currently only works for LF line endings (`'\n'`).
+
+!!! note
+
+ It is recommended to `git commit` any code changes before running the doctest fixing.
+ That way it is simple to restore to the previous state if the fixing goes wrong.
+
+!!! note
+
+ There are some corner cases where the fixing algorithm may replace the wrong code snippet.
+ It is therefore recommended to manually inspect the result of the fixing before committing.
+
+
+## Skipping Doctests
+
+Doctesting can be disabled by setting the [`makedocs`](@ref) keyword `doctest = false`.
+This should only be done when initially laying out the structure of a package's
+documentation, after which it's encouraged to always run doctests when building docs.
--- /dev/null
+# Examples
+
+Sometimes the best way to learn how to use a new package is to look for
+examples of what others have already built with it.
+
+The following packages use Documenter to build their documentation and so
+should give a good overview of what this package is currently able to do.
+
+!!! note
+
+ Packages are listed alphabetically. If you have a package that uses Documenter then
+ please open a PR that adds it to the appropriate list below; a simple way to do so
+ is to navigate to
+ https://github.com/JuliaDocs/Documenter.jl/edit/master/docs/src/man/examples.md.
+
+ The `make.jl` file for all listed packages will be tested to check for potential
+ regressions prior to tagging new Documenter releases whenever possible.
+
+## Registered
+
+Packages that have tagged versions available in `METADATA.jl`.
+
+- [Augmentor.jl](https://evizero.github.io/Augmentor.jl/)
+- [BanditOpt.jl](https://v-i-s-h.github.io/BanditOpt.jl/latest/)
+- [BeaData.jl](https://stephenbnicar.github.io/BeaData.jl/latest/)
+- [Bio.jl](https://biojulia.net/Bio.jl/latest/)
+- [ControlSystems.jl](http://juliacontrol.github.io/ControlSystems.jl/latest/)
+- [Currencies.jl](https://juliafinance.github.io/Currencies.jl/latest/)
+- [DiscretePredictors.jl](https://github.com/v-i-s-h/DiscretePredictors.jl)
+- [Documenter.jl](https://juliadocs.github.io/Documenter.jl/latest/)
+- [EvolvingGraphs.jl](https://etymoio.github.io/EvolvingGraphs.jl/latest/)
+- [ExtractMacro.jl](https://carlobaldassi.github.io/ExtractMacro.jl/latest/)
+- [EzXML.jl](https://bicycle1885.github.io/EzXML.jl/latest/)
+- [FourierFlows.jl](https://FourierFlows.github.io/FourierFlows.jl/latest/)
+- [Gadfly.jl](http://gadflyjl.org/stable/)
+- [GeoStats.jl](http://juliohm.github.io/GeoStats.jl/latest/)
+- [Highlights.jl](https://juliadocs.github.io/Highlights.jl/latest/)
+- [IntervalConstraintProgramming.jl](https://juliaintervals.github.io/IntervalConstraintProgramming.jl/latest/)
+- [Luxor.jl](https://juliagraphics.github.io/Luxor.jl/stable/)
+- [MergedMethods.jl](https://michaelhatherly.github.io/MergedMethods.jl/latest/)
+- [Mimi.jl](http://anthofflab.berkeley.edu/Mimi.jl/stable/)
+- [NumericSuffixes.jl](https://michaelhatherly.github.io/NumericSuffixes.jl/latest/)
+- [OhMyREPL.jl](https://github.com/KristofferC/OhMyREPL.jl)
+- [OnlineStats.jl](http://joshday.github.io/OnlineStats.jl/latest/)
+- [POMDPs.jl](http://juliapomdp.github.io/POMDPs.jl/latest/)
+- [PhyloNetworks.jl](http://crsl4.github.io/PhyloNetworks.jl/latest/)
+- [PrivateModules.jl](https://michaelhatherly.github.io/PrivateModules.jl/latest/)
+- [Query.jl](http://www.queryverse.org/Query.jl/stable/)
+- [TaylorSeries.jl](http://www.juliadiff.org/TaylorSeries.jl/latest/)
+- [Weave.jl](http://weavejl.mpastell.com/stable/)
+
+## Documentation repositories
+
+Some projects or organizations maintain dedicated documentation repositories that are
+separate from specific packages.
+
+- [DifferentialEquations.jl](http://docs.juliadiffeq.org/latest/)
+- [JuliaDocs landing page](https://juliadocs.github.io/latest/)
+- [JuliaMusic](https://juliamusic.github.io/JuliaMusic_documentation.jl/latest/)
+- [Plots.jl](https://docs.juliaplots.org/latest/)
+
+## Unregistered
+
+Packages that are not available in `METADATA.jl` and may be works-in-progress.
+Please do take that into consideration when browsing this list.
+
+- [AnonymousTypes.jl](https://michaelhatherly.github.io/AnonymousTypes.jl/latest/)
--- /dev/null
+# Package Guide
+
+Documenter is designed to do one thing -- combine markdown files and inline docstrings from
+Julia's docsystem into a single inter-linked document. What follows is a step-by-step guide
+to creating a simple document.
+
+
+## Installation
+
+Documenter can be installed using the Julia package manager.
+From the Julia REPL, type `]` to enter the Pkg REPL mode and run
+
+```
+pkg> add Documenter
+```
+
+
+## Setting up the Folder Structure
+
+!!! note
+ The function [`DocumenterTools.generate`](@ref) from the `DocumenterTools` package
+ can generate the basic structure that Documenter expects.
+
+Firstly, we need a Julia module to document. This could be a package generated via
+`PkgDev.generate` or a single `.jl` script accessible via Julia's `LOAD_PATH`. For this
+guide we'll be using a package called `Example.jl` that has the following directory layout:
+
+```
+Example/
+ src/
+ Example.jl
+ ...
+```
+
+Note that the `...` just represent unimportant files and folders.
+
+We must decide on a location where we'd like to store the documentation for this package.
+It's recommended to use a folder named `docs/` in the toplevel of the package, like so
+
+```
+Example/
+ docs/
+ ...
+ src/
+ Example.jl
+ ...
+```
+
+Inside the `docs/` folder we need to add two things: a source folder which will contain the
+markdown files used to build the finished document, and a Julia script that will
+be used to control the build process. The following names are recommended
+
+```
+docs/
+ src/
+ make.jl
+```
+
+
+## Building an Empty Document
+
+With our `docs/` directory now set up, we're going to build our first document. It'll just be
+a single empty file at the moment, but we'll be adding to it later on.
+
+Add the following to your `make.jl` file
+
+```julia
+using Documenter, Example
+
+makedocs(sitename="My Documentation")
+```
+
+This assumes you've installed Documenter as discussed in [Installation](@ref) and that your
+`Example.jl` package can be found by Julia.
+
+!!! note
+
+ If your source directory is not accessible through Julia's LOAD_PATH, you might wish to
+ add the following line at the top of make.jl
+
+ ```julia
+ push!(LOAD_PATH,"../src/")
+ ```
+
+Now add an `index.md` file to the `src/` directory. The name has no particular significance
+though, and you may name it whatever you like. We'll stick to `index.md` for this guide.
+
+Leave the newly added file empty and then run the following command from the `docs/` directory
+
+```sh
+$ julia make.jl
+```
+
+Note that `$` just represents the prompt character. You don't need to type that.
+
+If you'd like to see the output from this command in color use
+
+```sh
+$ julia --color=yes make.jl
+```
+
+When you run that you should see the following output
+
+```
+Documenter: setting up build directory.
+Documenter: expanding markdown templates.
+Documenter: building cross-references.
+Documenter: running document checks.
+ > checking for missing docstrings.
+ > running doctests.
+ > checking footnote links.
+Documenter: populating indices.
+Documenter: rendering document.
+```
+
+The `docs/` folder should contain a new directory -- called `build/`. Its structure should
+look like the following
+
+```
+build/
+ assets/
+ arrow.svg
+ documenter.css
+ documenter.js
+ search.js
+ index.html
+ search.html
+ search_index.js
+```
+
+!!! warning
+
+ **Never** `git commit` the contents of `build` (or any other content generated by
+ Documenter) to your repository's `master` branch. Always commit generated files to the
+ `gh-pages` branch of your repository. This helps to avoid including unnecessary changes
+ for anyone reviewing commits that happen to include documentation changes.
+
+ See the [Hosting Documentation](@ref) section for details regarding how you should go
+ about setting this up correctly.
+
+At this point `build/index.html` should be an empty page since `src/index.md` is empty. You
+can try adding some text to `src/index.md` and re-running the `make.jl` file to see the
+changes.
+
+
+## Adding Some Docstrings
+
+Next we'll splice a docstring defined in the `Example` module into the `index.md` file. To
+do this first document a function in that module:
+
+```julia
+module Example
+
+export func
+
+"""
+ func(x)
+
+Returns double the number `x` plus `1`.
+"""
+func(x) = 2x + 1
+
+end
+```
+
+Then in the `src/index.md` file add the following
+
+````markdown
+# Example.jl Documentation
+
+```@docs
+func(x)
+```
+````
+
+When we next run `make.jl` the docstring for `Example.func(x)` should appear in place of
+the `@docs` block in `build/index.md`. Note that *more than one* object can be referenced
+inside a `@docs` block -- just place each one on a separate line.
+
+Note that a `@docs` block is evaluated in the `Main` module. This means that each object
+listed in the block must be visible there. The module can be changed to something else on
+a per-page basis with a `@meta` block as in the following
+
+````markdown
+# Example.jl Documentation
+
+```@meta
+CurrentModule = Example
+```
+
+```@docs
+func(x)
+```
+````
+
+### Filtering included docstrings
+
+In some cases you may want to include a docstring for a `Method` that extends a
+`Function` from a different module -- such as `Base`. In the following example we extend
+`Base.length` with a new definition for the struct `T` and also add a docstring:
+
+```julia
+struct T
+ # ...
+end
+
+"""
+Custom `length` docs for `T`.
+"""
+Base.length(::T) = 1
+```
+
+When trying to include this docstring with
+
+````markdown
+```@docs
+length
+```
+````
+
+all the docs for `length` will be included -- even those from other modules. There are two
+ways to solve this problem. Either include the type in the signature with
+
+````markdown
+```@docs
+length(::T)
+```
+````
+
+or declare the specific modules that [`makedocs`](@ref) should include with
+
+```julia
+makedocs(
+ # options
+ modules = [MyModule]
+)
+```
+
+
+## Cross Referencing
+
+It may be necessary to refer to a particular docstring or section of your document from
+elsewhere in the document. To do this we can make use of Documenter's cross-referencing
+syntax which looks pretty similar to normal markdown link syntax. Replace the contents of
+`src/index.md` with the following
+
+````markdown
+# Example.jl Documentation
+
+```@docs
+func(x)
+```
+
+- link to [Example.jl Documentation](@ref)
+- link to [`func(x)`](@ref)
+````
+
+So we just have to replace each link's url with `@ref` and write the name of the thing we'd
+like to cross-reference. For document headers it's just plain text that matches the name of
+the header, and for docstrings enclose the object in backticks.
+
+This also works across different pages in the same way. Note that these sections and
+docstrings must be unique within a document.
+
+
+## Navigation
+
+Documenter can auto-generate tables of contents and docstring indexes for your document with
+the following syntax. We'll illustrate these features using our `index.md` file from the
+previous sections. Add the following to that file
+
+````markdown
+# Example.jl Documentation
+
+```@contents
+```
+
+## Functions
+
+```@docs
+func(x)
+```
+
+## Index
+
+```@index
+```
+````
+
+The `@contents` block will generate a nested list of links to all the section headers in
+the document. By default it will gather all the level 1 and 2 headers from every page in the
+document, but this can be adjusted using `Pages` and `Depth` settings as in the following
+
+````markdown
+```@contents
+Pages = ["foo.md", "bar.md"]
+Depth = 3
+```
+````
+
+The `@index` block will generate a flat list of links to all the docs that have been
+spliced into the document using `@docs` blocks. As with the `@contents` block the pages to
+be included can be set with a `Pages = [...]` line. Since the list is not nested `Depth` is
+not supported for `@index`.
+
+
+## Pages in the Sidebar
+
+By default all the pages (`.md` files) in your source directory get added to the sidebar,
+sorted by their filenames. However, in most cases you want to use the `pages` argument to
+[`makedocs`](@ref) to control what the sidebar looks like. The basic usage is as follows:
+
+```julia
+makedocs(
+ ...,
+ pages = [
+ "page.md",
+ "Page title" => "page2.md",
+ "Subsection" => [
+ ...
+ ]
+ ]
+)
+```
+
+Using the `pages` argument you can organize your pages into subsections and hide some pages
+from the sidebar with the help of the [`hide`](@ref) function.
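+
+For example, building on the `hide` calls used in this package's own `make.jl`, a sketch
+(with hypothetical page names) might look like:
+
+```julia
+makedocs(
+    # ... other options ...
+    pages = [
+        "Home" => "index.md",
+        "Manual" => [
+            "man/guide.md",
+            # Built and cross-referenceable, but hidden from the sidebar:
+            hide("man/internals.md"),
+        ],
+    ],
+)
+```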
--- /dev/null
+# Hosting Documentation
+
+After going through the [Package Guide](@ref) and [Doctests](@ref) page you will need to
+host the generated documentation somewhere for potential users to read. This guide will
+describe how to set up automatic updates for your package docs using the Travis build service
+and GitHub Pages. This is the same approach used by this package to host its own docs --
+the docs you're currently reading.
+
+!!! note
+
+ Following this guide should be the *final* step you take after you are comfortable with
+ the syntax and build process used by `Documenter.jl`. It is recommended that you only
+ proceed with the steps outlined here once you have successfully managed to build your
+ documentation locally with Documenter.
+
+ This guide assumes that you already have [GitHub](https://github.com/) and
+ [Travis](https://travis-ci.com/) accounts setup. If not then go set those up first and
+ then return here.
+
+
+## Overview
+
+Once set up correctly, the following will happen each time you push new updates to your
+package repository:
+
+- Travis buildbots will start up and run your package tests in a "Test" stage.
+- After the Test stage completes, a single bot will run a new "Documentation" stage, which
+ will build the documentation.
+- If the documentation is built successfully, the bot will attempt to push the generated
+ HTML pages back to GitHub.
+
+Note that the hosted documentation does not update when you make pull requests; you see
+updates only when you merge to `master` or push new tags.
+
+The following sections outline how to enable this for your own package.
+
+
+## SSH Deploy Keys
+
+Deploy keys provide push access to a *single* repository, to allow secure deployment of
+generated documentation from Travis to GitHub. The SSH keys can be generated with the
+`Travis.genkeys` function from the [DocumenterTools](https://github.com/JuliaDocs/DocumenterTools.jl)
+package.
+
+!!! note
+
+    You will need several command line programs (`which`, `git`, and `ssh-keygen`) to be
+    installed for the following steps to work. If DocumenterTools fails, please see the
+    [SSH Deploy Keys Walkthrough](@ref) section for instructions on how to generate the keys
+    manually (including on Windows).
+
+
+Install and load DocumenterTools with
+
+```
+pkg> add DocumenterTools
+```
+```julia-repl
+julia> using DocumenterTools
+```
+
+Then call the [`Travis.genkeys`](@ref) function as follows:
+
+```julia-repl
+julia> using MyPackage
+julia> Travis.genkeys(MyPackage)
+```
+
+where `MyPackage` is the name of the package you would like to create deploy keys for. The
+output will look similar to the text below:
+
+```
+INFO: add the public key below to https://github.com/USER/REPO/settings/keys
+ with read/write access:
+
+[SSH PUBLIC KEY HERE]
+
+INFO: add a secure environment variable named 'DOCUMENTER_KEY' to
+ https://travis-ci.org/USER/REPO/settings with value:
+
+[LONG BASE64 ENCODED PRIVATE KEY]
+```
+
+Follow the instructions that are printed out, namely:
+
+ 1. Add the public ssh key to your settings page for the GitHub repository that you are
+ setting up by following the `.../settings/key` link provided. Click on **`Add deploy
+ key`**, enter the name **`documenter`** as the title, and copy the public key into the
+ **`Key`** field. Check **`Allow write access`** to allow Documenter to commit the
+ generated documentation to the repo.
+
+ 2. Next add the long private key to the Travis settings page using the provided link. Again
+ note that you should include **no whitespace** when copying the key. In the **`Environment
+ Variables`** section add a key with the name `DOCUMENTER_KEY` and the value that was printed
+ out. **Do not** set the variable to be displayed in the build log. Then click **`Add`**.
+
+ !!! warning "Security warning"
+
+ To reiterate: make sure that the "Display value in build log" option is **OFF** for
+ the variable, so that it does not get printed when the tests run. This
+ base64-encoded string contains the *unencrypted* private key that gives full write
+ access to your repository, so it must be kept safe. Also, make sure that you never
+ expose this variable in your tests, nor merge any code that does. You can read more
+ about Travis environment variables in [Travis User Documentation](https://docs.travis-ci.com/user/environment-variables/#Defining-Variables-in-Repository-Settings).
+
+!!! note
+
+ There are more explicit instructions for adding the keys to GitHub and Travis in the
+ [SSH Deploy Keys Walkthrough](@ref) section of the manual.
+
+## `.travis.yml` Configuration
+
+To tell Travis that we want a new build stage we can add the following to the `.travis.yml`
+file:
+
+```yaml
+jobs:
+ include:
+ - stage: "Documentation"
+ julia: 1.0
+ os: linux
+ script:
+ - julia --project=docs/ -e 'using Pkg; Pkg.instantiate();
+ Pkg.develop(PackageSpec(path=pwd()))'
+ - julia --project=docs/ docs/make.jl
+ after_success: skip
+```
+
+where the `julia:` and `os:` entries decide the worker from which the docs are built and
+deployed. In the example above we will thus build and deploy the documentation from a Linux
+worker running Julia 1.0. For more information on how to set up a build stage, see the Travis
+manual for [Build Stages](https://docs.travis-ci.com/user/build-stages).
+
+The three lines in the `script:` section do the following:
+
+ 1. Instantiate the doc-building environment (i.e. `docs/Project.toml`, see below).
+ 2. Install your package in the doc-build environment.
+ 3. Run the docs/make.jl script, which builds and deploys the documentation.
+
+The doc-build environment `docs/Project.toml` includes Documenter and other doc-build
+dependencies your package might have. If Documenter is the only dependency, then the
+`Project.toml` should include the following:
+
+```toml
+[deps]
+Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
+
+[compat]
+Documenter = "~0.20"
+```
+
+Note that it is recommended that you have a `[compat]` section, like the one above, in your
+`Project.toml` file, which restricts the Documenter version that gets installed when the
+build runs. This is to make sure that your builds do not start failing suddenly due to a new
+major release of Documenter, which may include breaking changes. However, it also means that
+you will not get updates to Documenter automatically, and hence need to upgrade Documenter's
+major version yourself.
+
+
+## The `deploydocs` Function
+
+At the moment your `docs/make.jl` file probably only contains
+
+```julia
+using Documenter, PACKAGE_NAME
+
+makedocs()
+```
+
+We'll need to add an additional function call to this file after [`makedocs`](@ref), which
+will perform the deployment of the docs to the `gh-pages` branch.
+Add the following at the end of the file:
+
+```julia
+deploydocs(
+ repo = "github.com/USER_NAME/PACKAGE_NAME.jl.git",
+)
+```
+
+where `USER_NAME` and `PACKAGE_NAME` must be set to the appropriate names.
+Note that `repo` should not specify any protocol, i.e. it should not begin with `https://`
+or `git@`.
+
+See the [`deploydocs`](@ref) function documentation for more details.
+
+
+
+## `.gitignore`
+
+Add the following to your package's `.gitignore` file
+
+```
+docs/build/
+```
+
+This is needed to avoid committing generated content to your repository.
+
+## `gh-pages` Branch
+
+By default, Documenter pushes documentation to the `gh-pages` branch. If the branch does not
+exist, it will be created automatically by [`deploydocs`](@ref). If it does exist, then
+Documenter simply adds an additional commit with the built documentation. You should be
+aware that Documenter may overwrite existing content without warning.
+
+If you wish to create the `gh-pages` branch manually, that can be done by following
+[these instructions](https://coderwall.com/p/0n3soa/create-a-disconnected-git-branch).
+
+## Documentation Versions
+
+The documentation is deployed as follows:
+
+- Documentation built for a tag `vX.Y.Z` will be stored in a folder `vX.Y.Z`.
+
+- Documentation built from the `devbranch` branch (`master` by default) is stored in a folder
+ determined by the `devurl` keyword to [`deploydocs`](@ref) (`dev` by default).
+
+Which versions will show up in the version selector is determined by the
+`versions` argument to [`deploydocs`](@ref).
+
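+As a rough sketch (the repository URL is a placeholder, and the two keyword values shown are
+simply the defaults mentioned above), such a call might look like:
+
+```julia
+deploydocs(
+    repo = "github.com/USER_NAME/PACKAGE_NAME.jl.git",
+    devbranch = "master",  # branch to build the development docs from (the default)
+    devurl = "dev",        # folder/URL for the development docs (the default)
+)
+```
+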
+Unless a custom domain is being used, the pages are found at:
+
+```
+https://USER_NAME.github.io/PACKAGE_NAME.jl/vX.Y.Z
+https://USER_NAME.github.io/PACKAGE_NAME.jl/dev
+```
+
+By default Documenter will create a link called `stable` that points to the latest release
+
+```
+https://USER_NAME.github.io/PACKAGE_NAME.jl/stable
+```
+
+It is recommended to use this link, rather than the versioned links, since it will be updated
+with new releases.
+
+Once your documentation has been pushed to the `gh-pages` branch you should add links to
+your `README.md` pointing to the `stable` (and perhaps `dev`) documentation URLs. It is common
+practice to make use of "badges" similar to those used for Travis and AppVeyor build
+statuses or code coverage. Adding the following to your package `README.md` should be all
+that is necessary:
+
+```markdown
+[](https://USER_NAME.github.io/PACKAGE_NAME.jl/stable)
+[](https://USER_NAME.github.io/PACKAGE_NAME.jl/dev)
+```
+
+`PACKAGE_NAME` and `USER_NAME` should be replaced with their appropriate values. The colour
+and text of the image can be changed by altering `docs-stable-blue` as described on
+[shields.io](https://shields.io), though it is recommended that package authors follow this
+standard to make it easier for potential users to find documentation links across multiple
+package README files.
+
+---
+
+**Final Remarks**
+
+That should be all that is needed to enable automatic documentation building. Pushing new
+commits to your `master` branch should trigger doc builds. **Note that other branches do not
+trigger these builds and neither do pull requests by potential contributors.**
+
+If you would like to see a more complete example of how this process is setup then take a
+look at this package's repository for some inspiration.
--- /dev/null
+# SSH Deploy Keys Walkthrough
+
+If the instructions in [SSH Deploy Keys](@ref) did not work for you (for example,
+`ssh-keygen` is not installed), don't worry! This walkthrough will guide you through the
+process. There are three main steps:
+
+1. [Generating an SSH Key](@ref)
+2. [Adding the Public Key to GitHub](@ref)
+3. [Adding the Private Key to Travis](@ref)
+
+## Generating an SSH Key
+
+The first step is to generate an SSH key. An SSH key is made up of two components: a
+*public* key, which can be shared publicly, and a *private* key, which you should ensure is
+**never** shared publicly.
+
+The public key usually looks something like this
+
+```
+ssh-rsa [base64-encoded-key] [optional-comment]
+```
+
+And the private key usually looks something like this
+
+```
+-----BEGIN RSA PRIVATE KEY-----
+ ... base64-encoded key over several lines ...
+-----END RSA PRIVATE KEY-----
+```
+
+### If you have `ssh-keygen` installed
+
+If you have `ssh-keygen` installed, but `Travis.genkeys()` didn't work, you can generate an
+SSH key as follows. First, generate a key using `ssh-keygen` and save it to the file
+`privatekey`:
+
+```julia
+shell> ssh-keygen -N "" -f privatekey
+```
+
+Next, we need to encode the private key in Base64. Run the following command:
+
+```julia
+julia> read("privatekey", String) |> base64encode |> println
+```
+
+Copy and paste the output somewhere. This is your *private key* and is required for the step
+[Adding the Private Key to Travis](@ref).
+
+Now we need to get the public key. Run the following command:
+
+```julia
+julia> read("privatekey.pub", String) |> println
+```
+
+Copy and paste the output somewhere. This is your *public key* and is required for the step
+[Adding the Public Key to GitHub](@ref).
+
+### If you do not have `ssh-keygen`
+
+If you're using Windows, you probably don't have `ssh-keygen` installed. Instead, we're
+going to use a program called PuTTY. The first step in the process to generate a new SSH key
+is to download PuTTY:
+
+* Download and install [PuTTY](https://www.chiark.greenend.org.uk/~sgtatham/putty/)
+
+PuTTY is actually a collection of a few different programs. We need to use PuTTYgen. Open
+it, and you should get a window that looks like:
+
+
+
+Now we need to generate a key.
+
+* Click the "Generate" button, then follow the instructions and move the mouse around to
+ create randomness.
+
+Once you've moved the mouse enough, the window should look like:
+
+
+
+Now we need to save the public key somewhere.
+
+* Copy the text in the box titled "Public key for pasting into OpenSSH authorized_keys file"
+ and paste it somewhere for later. This is your *public key* and is required for the step
+ [Adding the Public Key to GitHub](@ref)
+
+Finally, we need to save the private key somewhere.
+
+* Click the "Conversions" tab, and then click "Export OpenSSH key". Save that file
+ somewhere. That file is your *private key* and is required for the [Adding the Private Key
+ to Travis](@ref) step.
+
+ 
+
+ !!! note
+
+ Don't save your key via the "Save private key" button as this will save the key in the
+ wrong format.
+
+If you made it this far, congratulations! You now have the private and public keys needed to
+set up automatic deployment of your documentation. The next steps are to add the keys to
+GitHub and Travis.
+
+
+## Adding the Public Key to GitHub
+
+In this section, we explain how to upload a public SSH key to GitHub. By this point, you
+should have generated a public key and saved it to a file. If you haven't done this, go read
+[Generating an SSH Key](@ref).
+
+Go to `https://github.com/[YOUR_USER_NAME]/[YOUR_REPO_NAME]/settings/keys` and click "Add
+deploy key". You should get to a page that looks like:
+
+
+
+Now we need to fill in three pieces of information.
+
+1. Have "Title" be e.g. "Documenter".
+2. Copy and paste the *public key* that we generated in the [Generating an SSH Key](@ref)
+ step into the "Key" field.
+3. Make sure that the "Allow write access" box is checked.
+
+Once you're done, click "Add key". Congratulations! You've added the public key
+to GitHub. The next step is to add the private key to Travis.
+
+
+## Adding the Private Key to Travis
+
+In this section, we explain how to upload a private SSH key to Travis. By this point, you
+should have generated a private key and saved it to a file. If you haven't done this, go
+read [Generating an SSH Key](@ref).
+
+First, we need to Base64 encode the private key. Open Julia, and run the command
+
+```julia
+julia> read("path/to/private/key", String) |> Documenter.base64encode |> println
+```
+
+Copy the resulting output.
+
+Next, go to `https://travis-ci.org/[YOUR_USER_NAME]/[YOUR_REPO_NAME]/settings`. Scroll down
+to the "Environment Variables" section. It should look like this:
+
+
+
+Now, add a new environment variable called `DOCUMENTER_KEY`, and set its value to the output
+from the Julia command above (make sure to remove the surrounding quotes).
+
+Finally, check that the "Display value in build log" is switched off and then click "Add".
+Congratulations! You've added the private key to Travis.
+
+!!! warning "Security warning"
+
+ To reiterate: make sure that the "Display value in build log" option is **OFF** for
+ the variable, so that it does not get printed when the tests run. This
+ base64-encoded string contains the *unencrypted* private key that gives full write
+ access to your repository, so it must be kept safe. Also, make sure that you never
+ expose this variable in your tests, nor merge any code that does. You can read more
+ about Travis environment variables in [Travis User Documentation](https://docs.travis-ci.com/user/environment-variables/#Defining-Variables-in-Repository-Settings).
+
+---
+
+**Final Remarks**
+
+You should now be able to continue on with the [Hosting Documentation](@ref) instructions.
--- /dev/null
+# [``\LaTeX`` Syntax](@id latex_syntax)
+
+The following section describes how to add equations written using ``\LaTeX`` to your
+documentation.
+
+## Escaping Characters in Docstrings
+
+Since some characters used in ``\LaTeX`` syntax are treated differently in docstrings, they
+need to be escaped using a `\` character as in the following example:
+
+```julia
+"""
+Here's some inline maths: \$\\sqrt[n]{1 + x + x^2 + \\ldots}\$.
+
+Here's an equation:
+
+\$\\frac{n!}{k!(n - k)!} = \\binom{n}{k}\$
+
+This is the binomial coefficient.
+"""
+func(x) = # ...
+```
+
+To avoid needing to escape the special characters the `doc""` string macro can be used:
+
+```julia
+doc"""
+Here's some inline maths: $\sqrt[n]{1 + x + x^2 + \ldots}$.
+
+Here's an equation:
+
+$\frac{n!}{k!(n - k)!} = \binom{n}{k}$
+
+This is the binomial coefficient.
+"""
+func(x) = # ...
+```
+
+A related issue is how to add dollar signs to a docstring. They need to be
+double-escaped as follows:
+```julia
+"""
+The cost was \\\$1.
+"""
+```
+
+## Inline Equations
+
+```markdown
+Here's some inline maths: ``\sqrt[n]{1 + x + x^2 + \ldots}``.
+```
+
+which will be displayed as
+
+---
+
+Here's some inline maths: ``\sqrt[n]{1 + x + x^2 + \ldots}``.
+
+---
+
+## Display Equations
+
+````markdown
+Here's an equation:
+
+```math
+\frac{n!}{k!(n - k)!} = \binom{n}{k}
+```
+
+This is the binomial coefficient.
+````
+
+which will be displayed as
+
+---
+
+Here's an equation:
+
+```math
+\frac{n!}{k!(n - k)!} = \binom{n}{k}
+```
+
+This is the binomial coefficient.
--- /dev/null
+# Other Output Formats
+
+In addition to the default native HTML output, plugin packages enable Documenter to generate
+output in other formats. Once the corresponding package is loaded, the output format can be
+specified using the `format` option in [`makedocs`](@ref).
+
+
+## Markdown & MkDocs
+
+Markdown output requires the [`DocumenterMarkdown`](https://github.com/JuliaDocs/DocumenterMarkdown.jl)
+package to be available and loaded.
+For Travis setups, add the package to the `docs/Project.toml` environment as a dependency.
+You also need to import the package in `make.jl`:
+
+```
+using DocumenterMarkdown
+```
+
+When `DocumenterMarkdown` is loaded, you can specify `format = :markdown` in [`makedocs`](@ref).
+Documenter will then output a set of Markdown files to the `build` directory that can then
+further be processed with [MkDocs](https://www.mkdocs.org/) into HTML pages.
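+
+For example, a minimal `docs/make.jl` for Markdown output might look like the following
+sketch (the site name is a placeholder):
+
+```julia
+using Documenter, DocumenterMarkdown
+
+makedocs(
+    sitename = "PACKAGE_NAME.jl",
+    format = :markdown,
+)
+```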
+
+MkDocs, of course, is not the only option you have -- any markdown to HTML converter should
+work fine with some amount of setting up.
+
+!!! note
+
+ Markdown output used to be the default option (i.e. when leaving the `format` option
+ unspecified). The default now is the HTML output.
+
+### The MkDocs `mkdocs.yml` file
+
+A MkDocs build is controlled by the `mkdocs.yml` configuration file. Add the file with the
+following content to the `docs/` directory:
+
+```yaml
+site_name: PACKAGE_NAME.jl
+repo_url: https://github.com/USER_NAME/PACKAGE_NAME.jl
+site_description: Description...
+site_author: USER_NAME
+
+theme: readthedocs
+
+extra_css:
+ - assets/Documenter.css
+
+extra_javascript:
+ - https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS_HTML
+ - assets/mathjaxhelper.js
+
+markdown_extensions:
+ - extra
+ - tables
+ - fenced_code
+ - mdx_math
+
+docs_dir: 'build'
+
+pages:
+ - Home: index.md
+```
+
+If you have run Documenter and it has generated a `build/` directory, you can now try running
+`mkdocs build` -- this should generate the `site/` directory.
+You should also add the `docs/site/` directory to your `.gitignore` file, which should then
+look like:
+
+```
+docs/build/
+docs/site/
+```
+
+This is only a basic skeleton. Read through the MkDocs documentation if you would like to
+know more about the available settings.
+
+
+### Deployment with MkDocs
+
+To deploy MkDocs on Travis, you also need to provide additional keyword arguments to
+[`deploydocs`](@ref). Your [`deploydocs`](@ref) call should look something like
+
+```julia
+deploydocs(
+    repo = "github.com/USER_NAME/PACKAGE_NAME.jl.git",
+    deps = Deps.pip("mkdocs", "pygments", "python-markdown-math"),
+    make = () -> run(`mkdocs build`),
+    target = "site"
+)
+```
+
+* `deps` serves to provide the required Python dependencies to build the documentation
+* `make` specifies the function that calls `mkdocs` to perform the second build step
+* `target`, which specifies which files get copied to `gh-pages`, needs to point to the
+ `site/` directory
+
+In the example above we include the dependencies [mkdocs](https://www.mkdocs.org),
+[Pygments](http://pygments.org/) and
+[`python-markdown-math`](https://github.com/mitya57/python-markdown-math).
+The first makes sure that MkDocs is installed to deploy the documentation, Pygments
+provides syntax highlighting for code blocks, and `python-markdown-math` provides the
+`mdx_math` markdown extension that enables MathJax rendering of LaTeX equations in
+markdown. Any other Python dependencies needed by the build should be included here as well.
+
+
+### ``\LaTeX``: MkDocs and MathJax
+
+To get MkDocs to display ``\LaTeX`` equations correctly we need to update several of the
+configuration files described in the [Package Guide](@ref).
+
+`docs/make.jl` should add the `python-markdown-math` dependency to allow for equations to
+be rendered correctly.
+
+```julia
+# ...
+
+deploydocs(
+ deps = Deps.pip("pygments", "mkdocs", "python-markdown-math"),
+ # ...
+)
+```
+
+This package should also be installed locally so that you can preview the generated
+documentation prior to pushing new commits to a repository.
+
+```sh
+$ pip install python-markdown-math
+```
+
+The `docs/mkdocs.yml` file must add the `python-markdown-math` extension, called `mdx_math`,
+as well as two MathJax JavaScript files:
+
+```yaml
+# ...
+markdown_extensions:
+ - mdx_math
+ # ...
+
+extra_javascript:
+ - https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS_HTML
+ - assets/mathjaxhelper.js
+# ...
+```
+
+**Final Remarks**
+
+Following this guide and adding the necessary changes to the configuration files should
+enable properly rendered mathematical equations within your documentation both locally and
+when built and deployed using the Travis build service.
+
+
+## PDF Output via LaTeX
+
+LaTeX/PDF output requires the [`DocumenterLaTeX`](https://github.com/JuliaDocs/DocumenterLaTeX.jl)
+package to be available and loaded in `make.jl` with
+
+```julia
+using DocumenterLaTeX
+```
+
+When `DocumenterLaTeX` is loaded, you can set `format = :latex` in [`makedocs`](@ref),
+and Documenter will generate a PDF version of the documentation using LaTeX.
+
+* You need the `pdflatex` command to be installed and available to Documenter.
+* You need the [minted](https://ctan.org/pkg/minted) LaTeX package and its backend source
+ highlighter [Pygments](http://pygments.org/) installed.
+* You need the [Lato](http://www.latofonts.com/lato-free-fonts/) and
+ [Roboto Mono](https://fonts.google.com/specimen/Roboto+Mono) fonts installed.
+
+You should also specify the `sitename` and `authors` keywords for `makedocs` when using the
+LaTeX output.
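+
+For example, a minimal `make.jl` set up for PDF output might look like the following
+sketch (the name placeholders are not part of Documenter):
+
+```julia
+using Documenter, DocumenterLaTeX
+
+makedocs(
+    sitename = "PACKAGE_NAME.jl",
+    authors = "AUTHOR_NAME",
+    format = :latex,
+)
+```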
--- /dev/null
+# Syntax
+
+This section of the manual describes the syntax used by Documenter to build documentation.
+
+```@contents
+Pages = ["syntax.md"]
+```
+
+## `@docs` block
+
+Splice one or more docstrings into a document in place of the code block, i.e.
+
+````markdown
+```@docs
+Documenter
+makedocs
+deploydocs
+```
+````
+
+This block type is evaluated within the `CurrentModule` module if defined, otherwise within
+`Main`, and so each object listed in the block should be visible from that
+module. Undefined objects will raise warnings during documentation generation and cause the
+code block to be rendered in the final document unchanged.
+
+Objects may not be listed more than once within the document. When duplicate objects are
+detected an error will be raised and the build process will be terminated.
+
+To ensure that all docstrings from a module are included in the final document the `modules`
+keyword for [`makedocs`](@ref) can be set to the desired module or modules, i.e.
+
+```julia
+makedocs(
+ modules = [Documenter],
+)
+```
+
+which will cause any unlisted docstrings to raise warnings when [`makedocs`](@ref) is
+called. If `modules` is not defined then no warnings are printed, even if a document has
+missing docstrings.
+
+## `@autodocs` block
+
+Automatically splices all docstrings from the provided modules in place of the code block.
+This is equivalent to manually adding all the docstrings in a `@docs` block.
+
+````markdown
+```@autodocs
+Modules = [Foo, Bar]
+Order = [:function, :type]
+```
+````
+
+The above `@autodocs` block adds all the docstrings found in modules `Foo` and `Bar` that
+refer to functions or types to the document.
+
+Each module is added in order and so all docs from `Foo` will appear before those of `Bar`.
+Possible values for the `Order` vector are
+
+- `:module`
+- `:constant`
+- `:type`
+- `:function`
+- `:macro`
+
+If no `Order` is provided then the order listed above is used.
+
+When a potential docstring is found in one of the listed modules but does not match any
+value from `Order`, it will be omitted from the document. Hence `Order` acts as a basic
+filter as well as a sorter.
+
+In addition to `Order`, a `Pages` vector may be included in `@autodocs` to filter docstrings
+based on the source file in which they are defined:
+
+````markdown
+```@autodocs
+Modules = [Foo]
+Pages = ["a.jl", "b.jl"]
+```
+````
+
+In the above example docstrings from module `Foo` found in source files that end in `a.jl`
+and `b.jl` are included. The page order provided by `Pages` is also used to sort the
+docstrings. Note that page matching is done using the end of the provided strings and so
+`a.jl` will be matched by *any* source file that ends in `a.jl`, i.e. `src/a.jl` or
+`src/foo/a.jl`.
+
+To include only the exported names from the modules listed in `Modules` use `Private = false`.
+In a similar way `Public = false` can be used to only show the unexported names. By
+default both of these are set to `true` so that all names will be shown.
+
+````markdown
+Functions exported from `Foo`:
+
+```@autodocs
+Modules = [Foo]
+Private = false
+Order = [:function]
+```
+
+Private types in module `Foo`:
+
+```@autodocs
+Modules = [Foo]
+Public = false
+Order = [:type]
+```
+````
+
+!!! note
+
+    When more complex sorting and filtering is needed, use `@docs` to define it
+    explicitly.
+
+## `@ref` link
+
+Used in markdown links as the URL to tell Documenter to generate a cross-reference
+automatically. The text part of the link can be a docstring, header name, or GitHub PR/Issue
+number.
+
+````markdown
+# Syntax
+
+... [`makedocs`](@ref) ...
+
+# Functions
+
+```@docs
+makedocs
+```
+
+... [Syntax](@ref) ...
+
+... [#42](@ref) ...
+````
+
+Plain text in the "text" part of a link will either cross-reference a header, or, when it is
+a number preceded by a `#`, a GitHub issue/pull request. Text wrapped in backticks will
+cross-reference a docstring from a `@docs` block.
+
+`@ref`s may refer to docstrings or headers on different pages as well as the current page
+using the same syntax.
+
+Note that depending on what the `CurrentModule` is set to, a docstring `@ref` may need to
+be prefixed by the module which defines it.
+
+### Duplicate Headers
+
+In some cases a document may contain multiple headers with the same name, but on different
+pages or of different levels. To allow `@ref` to cross-reference a duplicate header it must
+be given a name as in the following example
+
+```markdown
+# [Header](@id my_custom_header_name)
+
+...
+
+## Header
+
+... [Custom Header](@ref my_custom_header_name) ...
+```
+
+The link that wraps the named header is removed in the final document. The text for a named
+`@ref ...` does not need to match the header that it references. Named `@ref ...`s may refer
+to headers on different pages in the same way as unnamed ones do.
+
+Duplicate docstring references do not occur since splicing the same docstring into a
+document more than once is disallowed.
+
+### Named doc `@ref`s
+
+Docstring `@ref`s can also be "named" in a similar way to headers as shown in the
+[Duplicate Headers](@ref) section above. For example
+
+```julia
+module Mod
+
+"""
+Both of the following references point to `g` found in module `Main.Other`:
+
+ * [`Main.Other.g`](@ref)
+ * [`g`](@ref Main.Other.g)
+
+"""
+f(args...) = # ...
+
+end
+```
+
+This can be useful to avoid having to write fully qualified names for references that
+are not imported into the current module, or when the text displayed in the link is
+used to add additional meaning to the surrounding text, such as
+
+```markdown
+Use [`for i = 1:10 ...`](@ref for) to loop over all the numbers from 1 to 10.
+```
+
+!!! note
+
+ Named doc `@ref`s should be used sparingly since writing unqualified names may, in some
+ cases, make it difficult to tell *which* function is being referred to in a particular
+ docstring if there happen to be several modules that provide definitions with the same
+ name.
+
+## `@meta` block
+
+This block type is used to define metadata key/value pairs that can be used elsewhere in the
+page. Currently recognised keys:
+- `CurrentModule`: module where Documenter evaluates, for example, [`@docs`-block](@ref)
+ and [`@ref`-link](@ref)s.
+- `DocTestSetup`: code to be evaluated before a doctest, see the [Setup Code](@ref)
+ section under [Doctests](@ref).
+- `DocTestFilters`: filters to deal with, for example, unpredictable output from doctests,
+ see the [Filtering Doctests](@ref) section under [Doctests](@ref).
+- `EditURL`: link to where the page can be edited. This defaults to the `.md` page itself,
+ but if the source is something else (for example if the `.md` page is generated as part of
+ the doc build) this can be set, either as a local link, or an absolute url.
+
+Example:
+
+````markdown
+```@meta
+CurrentModule = FooBar
+DocTestSetup = quote
+ using MyPackage
+end
+DocTestFilters = [r"Stacktrace:[\s\S]+"]
+EditURL = "link/to/source/file"
+```
+````
+
+Note that `@meta` blocks are always evaluated in `Main`.
+
+## `@index` block
+
+Generates a list of links to docstrings that have been spliced into a document. Valid
+settings are `Pages`, `Modules`, and `Order`. For example:
+
+````markdown
+```@index
+Pages = ["foo.md"]
+Modules = [Foo, Bar]
+Order = [:function, :type]
+```
+````
+
+When `Pages` or `Modules` are not provided then all pages or modules are included. `Order`
+defaults to
+
+```julia
+[:module, :constant, :type, :function, :macro]
+```
+
+if not specified. `Order` and `Modules` behave the same way as in [`@autodocs` block](@ref)s
+and filter out docstrings that do not match one of the modules or categories specified.
+
+Note that the values assigned to `Pages`, `Modules`, and `Order` may be any valid Julia code
+and thus can be something more complex than an array literal if required, i.e.
+
+````markdown
+```@index
+Pages = map(file -> joinpath("man", file), readdir("man"))
+```
+````
+
+It should be noted though that in this case `Pages` may not be sorted in the order that is
+expected by the user. Try to stick to array literals as much as possible.
+
+## `@contents` block
+
+Generates a nested list of links to document sections. Valid settings are `Pages` and `Depth`.
+
+````markdown
+```@contents
+Pages = ["foo.md"]
+Depth = 5
+```
+````
+
+As with `@index` if `Pages` is not provided then all pages are included. The default
+`Depth` value is `2`.
+
+## `@example` block
+
+Evaluates the code block and inserts the result into the final document along with the
+original source code.
+
+````markdown
+```@example
+a = 1
+b = 2
+a + b
+```
+````
+
+The above `@example` block will splice the following into the final document
+
+````markdown
+```julia
+a = 1
+b = 2
+a + b
+```
+
+```
+3
+```
+````
+
+Leading and trailing newlines are removed from the rendered code blocks. Trailing whitespace
+on each line is also removed.
+
+**Hiding Source Code**
+
+Code blocks may have some content that does not need to be displayed in the final document.
+`# hide` comments can be appended to lines that should not be rendered, i.e.
+
+````markdown
+```@example
+import Random # hide
+Random.seed!(1) # hide
+A = rand(3, 3)
+b = [1, 2, 3]
+A \ b
+```
+````
+
+Note that appending `# hide` to every line in an `@example` block will result in the block
+being hidden in the rendered document. The results block will still be rendered though.
+`@setup` blocks are a convenient shorthand for hiding an entire block, including the output.
+
+**`stdout` and `stderr`**
+
+The Julia output streams are redirected to the results block when evaluating `@example`
+blocks in the same way as when running doctest code blocks.
+
+**`nothing` Results**
+
+When the `@example` block evaluates to `nothing` then the second block is not displayed.
+Only the source code block will be shown in the rendered document. Note that if any output
+from either `stdout` or `stderr` is captured then the results block will be displayed even
+if `nothing` is returned.
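+
+For instance, the following hypothetical block returns `nothing` but still produces a
+results block, because the `println` output is captured:
+
+````markdown
+```@example
+println("This is printed to standard output")
+nothing
+```
+````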
+
+**Named `@example` Blocks**
+
+By default `@example` blocks are run in their own anonymous `Module`s to avoid side-effects
+between blocks. To share the same module between different blocks on a page, the `@example`
+blocks can be named with the following syntax
+
+````markdown
+```@example 1
+a = 1
+```
+
+```@example 1
+println(a)
+```
+````
+
+The name can be any text, not just integers as in the example above, i.e. `@example foo`.
+
+Named `@example` blocks can be useful when generating documentation that requires
+intermediate explanation or multimedia such as plots as illustrated in the following example
+
+````markdown
+First we define some functions
+
+```@example 1
+using PyPlot # hide
+f(x) = sin(2x) + 1
+g(x) = cos(x) - x
+```
+
+and then we plot `f` over the interval from ``-π`` to ``π``
+
+```@example 1
+x = range(-π, π; length = 100)
+plot(x, f.(x), color = "red")
+savefig("f-plot.svg"); nothing # hide
+```
+
+![](f-plot.svg)
+
+and then we do the same with `g`
+
+```@example 1
+plot(x, g.(x), color = "blue")
+savefig("g-plot.svg"); nothing # hide
+```
+
+![](g-plot.svg)
+````
+
+Note that `@example` blocks are evaluated within the directory of `build` where the file
+will be rendered. This means that in the above example `savefig` will output the `.svg`
+files into that directory. This allows the images to be easily referenced without needing to
+worry about relative paths.
+
+`@example` blocks automatically define `ans` which, as in the Julia REPL, is bound to the
+value of the last evaluated expression. This can be useful in situations such as the
+following one where binding the object returned by `plot` to a named variable would
+look out of place in the final rendered documentation:
+
+````markdown
+```@example
+using Gadfly # hide
+plot([sin, x -> 2sin(x) + x], -2π, 2π)
+draw(SVG("plot.svg", 6inch, 4inch), ans); nothing # hide
+```
+
+![](plot.svg)
+````
+
+**Delayed Execution of `@example` Blocks**
+
+`@example` blocks accept a keyword argument `continued` which can be set to `true` or `false`
+(defaults to `false`). When `continued = true` the execution of the code is delayed until the
+next `continued = false` `@example`-block. This is needed for example when the expression in
+a block is not complete. Example:
+
+````markdown
+```@example half-loop; continued = true
+for i in 1:3
+ j = i^2
+```
+Some text explaining what we should do with `j`
+```@example half-loop
+ println(j)
+end
+```
+````
+
+Here the first block is not complete -- the loop is missing the `end`. Thus, by setting
+`continued = true` here we delay the evaluation of the first block, until we reach the
+second block. A block with `continued = true` does not have any output.
+
+## `@repl` block
+
+These are similar to `@example` blocks, but add a `julia> ` prompt before each toplevel
+expression. `;` and `# hide` syntax may be used in `@repl` blocks in the same way as in the
+Julia REPL and `@example` blocks.
+
+````markdown
+```@repl
+a = 1
+b = 2
+a + b
+```
+````
+
+will generate
+
+````markdown
+```julia
+julia> a = 1
+1
+
+julia> b = 2
+2
+
+julia> a + b
+3
+```
+````
+
+Named `@repl <name>` blocks behave in the same way as named `@example <name>` blocks.
+
+## `@setup <name>` block
+
+These are similar to `@example` blocks, but both the input and output are hidden from the
+final document. This can be convenient if there are several lines of setup code that need to be
+hidden.
+
+!!! note
+
+ Unlike `@example` and `@repl` blocks, `@setup` requires a `<name>` attribute to associate it
+ with downstream `@example <name>` and `@repl <name>` blocks.
+
+````markdown
+```@setup abc
+using RDatasets
+using DataFrames
+iris = dataset("datasets", "iris")
+```
+
+```@example abc
+println(iris)
+```
+````
+
+
+## `@eval` block
+
+Evaluates the contents of the block and inserts the resulting value into the final document.
+
+In the following example we use the PyPlot package to generate a plot and display it in the
+final document.
+
+````markdown
+```@eval
+using PyPlot
+
+x = range(-π, π; length = 100)
+y = sin.(x)
+
+plot(x, y, color = "red")
+savefig("plot.svg")
+
+nothing
+```
+
+![](plot.svg)
+````
+
+Note that each `@eval` block evaluates its contents within a separate module. When
+evaluating each block the present working directory, `pwd`, is set to the directory in
+`build` where the file will be written to.
+
+Also, instead of returning `nothing` in the example above we could have returned a new
+`Markdown.MD` object through `Markdown.parse`. This can be more appropriate when the
+filename is not known until evaluation of the block itself.
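+
+For instance, a sketch of an `@eval` block that computes the output file name at build time
+and returns the corresponding image link as parsed Markdown (the file name here is
+hypothetical):
+
+````markdown
+```@eval
+using PyPlot
+import Markdown
+
+x = range(-π, π; length = 100)
+plot(x, sin.(x), color = "red")
+
+filename = "plot-$(VERSION).svg"  # name only known when the block runs
+savefig(filename)
+
+Markdown.parse("![]($(filename))")
+```
+````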
+
+!!! note
+
+ In most cases `@example` is preferred over `@eval`. Just like in normal Julia code where
+ `eval` should be only be considered as a last resort, `@eval` should be treated in the
+ same way.
+
+
+## `@raw <format>` block
+
+Allows code to be inserted into the final document verbatim. E.g. to insert custom HTML or
+LaTeX code into the output.
+
+The `format` argument is mandatory and Documenter uses it to determine whether a particular
+block should be copied over to the output or not. Currently supported formats are `html`
+and `latex`, used by the respective writers. A `@raw` block whose `format` is not
+recognized is usually ignored, so it is possible to have a raw block for each output format
+without the blocks being duplicated in the output.
+
+The following example shows how SVG code with custom styling can be included into documents
+using the `@raw` block.
+
+````markdown
+```@raw html
+<svg style="display: block; margin: 0 auto;" width="5em" height="5em">
+ <circle cx="2.5em" cy="2.5em" r="2em" stroke="black" stroke-width=".1em" fill="red" />
+</svg>
+```
+````
+
+It will show up as follows, with code having been copied over verbatim to the HTML file.
+
+```@raw html
+<svg style="display: block; margin: 0 auto;" width="5em" height="5em">
+ <circle cx="2.5em" cy="2.5em" r="2em" stroke="black" stroke-width=".1em" fill="red" />
+</svg>
+```
--- /dev/null
+"""
+Defines the [`Anchor`](@ref) and [`AnchorMap`](@ref) types.
+
+`Anchor`s and `AnchorMap`s are used to represent links between objects within a document.
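+
+A minimal usage sketch (illustrative only; the id and file names are placeholders):
+
+```julia
+m = AnchorMap()
+a = add!(m, "some object", "my-id", "index.md")  # wraps the object in an `Anchor`
+exists(m, "my-id")        # true
+anchor(m, "my-id") === a  # true, since the id is unique
+```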
+"""
+module Anchors
+
+using DocStringExtensions
+
+# Types.
+# ------
+
+"""
+Stores an arbitrary object called `.object` and its location within a document.
+
+**Fields**
+
+- `object` -- the stored object.
+- `order` -- ordering of `object` within the entire document.
+- `file` -- the destination file, in `build`, where the object will be written to.
+- `id` -- the generated "slug" identifying the object.
+- `nth` -- integer that unique-ifies anchors with the same `id`.
+"""
+mutable struct Anchor
+ object :: Any
+ order :: Int
+ file :: String
+ id :: String
+ nth :: Int
+ Anchor(object) = new(object, 0, "", "", 1)
+end
+
+"""
+Tree structure representing anchors in a document and their relationships with each other.
+
+**Object Hierarchy**
+
+ id -> file -> anchors
+
+Each `id` maps to a `file` which in turn maps to a vector of `Anchor` objects.
+"""
+mutable struct AnchorMap
+ map :: Dict{String, Dict{String, Vector{Anchor}}}
+ count :: Int
+ AnchorMap() = new(Dict(), 0)
+end
+
+# Add anchor.
+# -----------
+
+"""
+$(SIGNATURES)
+
+Adds a new [`Anchor`](@ref) to the [`AnchorMap`](@ref) for a given `id` and `file`.
+
+Either an actual [`Anchor`](@ref) object may be provided or any other object which is
+automatically wrapped in an [`Anchor`](@ref) before being added to the [`AnchorMap`](@ref).
+"""
+function add!(m::AnchorMap, anchor::Anchor, id, file)
+ filemap = get!(m.map, id, Dict{String, Vector{Anchor}}())
+ anchors = get!(filemap, file, Anchor[])
+ push!(anchors, anchor)
+ anchor.order = m.count += 1
+ anchor.file = file
+ anchor.id = id
+ anchor.nth = length(anchors)
+ anchor
+end
+add!(m::AnchorMap, object, id, file) = add!(m, Anchor(object), id, file)
+
+# Anchor existence.
+# -----------------
+
+"""
+$(SIGNATURES)
+
+Does the given `id` exist within the [`AnchorMap`](@ref)? A `file` and integer `n` may also
+be provided to narrow the search for existence.
+"""
+exists(m::AnchorMap, id, file, n) = exists(m, id, file) && 1 ≤ n ≤ length(m.map[id][file])
+exists(m::AnchorMap, id, file) = exists(m, id) && haskey(m.map[id], file)
+exists(m::AnchorMap, id) = haskey(m.map, id)
+
+# Anchor uniqueness.
+# ------------------
+
+"""
+$(SIGNATURES)
+
+Is the `id` unique within the given [`AnchorMap`](@ref)? May also specify the `file`.
+"""
+function isunique(m::AnchorMap, id)
+ exists(m, id) &&
+ length(m.map[id]) === 1 &&
+ isunique(m, id, first(first(m.map[id])))
+end
+function isunique(m::AnchorMap, id, file)
+ exists(m, id, file) &&
+ length(m.map[id][file]) === 1
+end
+
+# Get anchor.
+# -----------
+
+"""
+$(SIGNATURES)
+
+Returns the [`Anchor`](@ref) object matching `id`. `file` and `n` may also be provided. An
+`Anchor` is returned, or `nothing` in case of no match.
+"""
+function anchor(m::AnchorMap, id)
+ isunique(m, id) ?
+ anchor(m, id, first(first(m.map[id])), 1) :
+ nothing
+end
+function anchor(m::AnchorMap, id, file)
+ isunique(m, id, file) ?
+ anchor(m, id, file, 1) :
+ nothing
+end
+function anchor(m::AnchorMap, id, file, n)
+ exists(m, id, file, n) ?
+ m.map[id][file][n] :
+ nothing
+end
+
+end
--- /dev/null
+"""
+Defines the `Documenter.jl` build "pipeline" named [`DocumentPipeline`](@ref).
+
+Each stage of the pipeline performs an action on a [`Documents.Document`](@ref) object.
+These actions may involve creating directory structures, expanding templates, running
+doctests, etc.
+"""
+module Builder
+
+import ..Documenter:
+ Anchors,
+ Documents,
+ Documenter,
+ Utilities
+
+import .Utilities: Selectors
+
+using DocStringExtensions
+
+# Document Pipeline.
+# ------------------
+
+"""
+The default document processing "pipeline", which consists of the following actions:
+
+- [`SetupBuildDirectory`](@ref)
+- [`ExpandTemplates`](@ref)
+- [`CrossReferences`](@ref)
+- [`CheckDocument`](@ref)
+- [`Populate`](@ref)
+- [`RenderDocument`](@ref)
+
+"""
+abstract type DocumentPipeline <: Selectors.AbstractSelector end
+
+"""
+Creates the correct directory layout within the `build` folder and parses markdown files.
+"""
+abstract type SetupBuildDirectory <: DocumentPipeline end
+
+"""
+Executes a sequence of actions on each node of the parsed markdown files in turn.
+"""
+abstract type ExpandTemplates <: DocumentPipeline end
+
+"""
+Finds and sets URLs for each `@ref` link in the document to the correct destinations.
+"""
+abstract type CrossReferences <: DocumentPipeline end
+
+"""
+Checks that all documented objects are included in the document and runs doctests on all
+valid Julia code blocks.
+"""
+abstract type CheckDocument <: DocumentPipeline end
+
+"""
+Populates the `ContentsNode`s and `IndexNode`s with links.
+"""
+abstract type Populate <: DocumentPipeline end
+
+"""
+Writes the document tree to the `build` directory.
+"""
+abstract type RenderDocument <: DocumentPipeline end
+
+Selectors.order(::Type{SetupBuildDirectory}) = 1.0
+Selectors.order(::Type{ExpandTemplates}) = 2.0
+Selectors.order(::Type{CrossReferences}) = 3.0
+Selectors.order(::Type{CheckDocument}) = 4.0
+Selectors.order(::Type{Populate}) = 5.0
+Selectors.order(::Type{RenderDocument}) = 6.0
+
+Selectors.matcher(::Type{T}, doc::Documents.Document) where {T <: DocumentPipeline} = true
+
+Selectors.strict(::Type{T}) where {T <: DocumentPipeline} = false
+
+function Selectors.runner(::Type{SetupBuildDirectory}, doc::Documents.Document)
+ Utilities.log(doc, "setting up build directory.")
+
+ # Frequently used fields.
+ build = doc.user.build
+ source = doc.user.source
+
+ # The .user.source directory must exist.
+ isdir(source) || error("source directory '$(abspath(source))' is missing.")
+
+ # We create the .user.build directory.
+ # If .user.clean is set, we first clean the existing directory.
+ doc.user.clean && isdir(build) && rm(build; recursive = true)
+ isdir(build) || mkpath(build)
+
+ # We'll walk over all the files in the .user.source directory.
+ # The directory structure is copied over to .user.build. All files, with
+ # the exception of markdown files (identified by the extension) are copied
+ # over as well, since they're assumed to be images, data files etc.
+ # Markdown files, however, get added to the document and also stored into
+ # `mdpages`, to be used later.
+ mdpages = String[]
+ for (root, dirs, files) in walkdir(source)
+ for dir in dirs
+ d = normpath(joinpath(build, relpath(root, source), dir))
+ isdir(d) || mkdir(d)
+ end
+ for file in files
+ src = normpath(joinpath(root, file))
+ dst = normpath(joinpath(build, relpath(root, source), file))
+ if endswith(file, ".md")
+ push!(mdpages, Utilities.srcpath(source, root, file))
+ Documents.addpage!(doc, src, dst)
+ else
+ cp(src, dst; force = true)
+ end
+ end
+ end
+
+ # If the user hasn't specified the page list, then we'll just default to a
+ # flat list of all the markdown files we found, sorted by the filesystem
+    # path (which, among other things, groups them by subdirectory).
+ userpages = isempty(doc.user.pages) ? sort(mdpages) : doc.user.pages
+
+ # Populating the .navtree and .navlist.
+ # We need the for loop because we can't assign to the fields of the immutable
+ # doc.internal.
+ for navnode in walk_navpages(userpages, nothing, doc)
+ push!(doc.internal.navtree, navnode)
+ end
+
+ # Finally we populate the .next and .prev fields of the navnodes that point
+ # to actual pages.
+ local prev::Union{Documents.NavNode, Nothing} = nothing
+ for navnode in doc.internal.navlist
+ navnode.prev = prev
+ if prev !== nothing
+ prev.next = navnode
+ end
+ prev = navnode
+ end
+end
+
+"""
+$(SIGNATURES)
+
+Recursively walks through the [`Documents.Document`](@ref)'s `.user.pages` field,
+generating [`Documents.NavNode`](@ref)s and related data structures in the
+process.
+
+This implementation is the de facto specification for the `.user.pages` field.
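+
+For illustration, a hypothetical `pages` value mixing the accepted forms (plain paths,
+`"Title" => "path"` pairs, and nested vectors):
+
+```julia
+pages = [
+    "index.md",
+    "Manual" => [
+        "man/guide.md",
+        "Syntax" => "man/syntax.md",
+    ],
+]
+```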
+"""
+function walk_navpages(visible, title, src, children, parent, doc)
+ # parent can also be nothing (for top-level elements)
+ parent_visible = (parent === nothing) || parent.visible
+ if src !== nothing
+ src = normpath(src)
+ src in keys(doc.internal.pages) || error("'$src' is not an existing page!")
+ end
+ nn = Documents.NavNode(src, title, parent)
+ (src === nothing) || push!(doc.internal.navlist, nn)
+ nn.visible = parent_visible && visible
+ nn.children = walk_navpages(children, nn, doc)
+ nn
+end
+
+function walk_navpages(hps::Tuple, parent, doc)
+ @assert length(hps) == 4
+ walk_navpages(hps..., parent, doc)
+end
+
+walk_navpages(title::String, children::Vector, parent, doc) = walk_navpages(true, title, nothing, children, parent, doc)
+walk_navpages(title::String, page, parent, doc) = walk_navpages(true, title, page, [], parent, doc)
+
+walk_navpages(p::Pair, parent, doc) = walk_navpages(p.first, p.second, parent, doc)
+walk_navpages(ps::Vector, parent, doc) = [walk_navpages(p, parent, doc)::Documents.NavNode for p in ps]
+walk_navpages(src::String, parent, doc) = walk_navpages(true, nothing, src, [], parent, doc)
+
+
+function Selectors.runner(::Type{ExpandTemplates}, doc::Documents.Document)
+ Utilities.log(doc, "expanding markdown templates.")
+ Documenter.Expanders.expand(doc)
+end
+
+function Selectors.runner(::Type{CrossReferences}, doc::Documents.Document)
+ Utilities.log(doc, "building cross-references.")
+ Documenter.CrossReferences.crossref(doc)
+end
+
+function Selectors.runner(::Type{CheckDocument}, doc::Documents.Document)
+ Utilities.log(doc, "running document checks.")
+ Documenter.DocChecks.missingdocs(doc)
+ Documenter.DocTests.doctest(doc)
+ Documenter.DocChecks.footnotes(doc)
+ Documenter.DocChecks.linkcheck(doc)
+end
+
+function Selectors.runner(::Type{Populate}, doc::Documents.Document)
+    Utilities.log(doc, "populating indices.")
+ Documents.doctest_replace!(doc)
+ Documents.populate!(doc)
+end
+
+function Selectors.runner(::Type{RenderDocument}, doc::Documents.Document)
+ count = length(doc.internal.errors)
+ if doc.user.strict && count > 0
+ error("`makedocs` encountered $(count > 1 ? "errors" : "an error"). Terminating build")
+ else
+ Utilities.log(doc, "rendering document.")
+ Documenter.Writers.render(doc)
+ end
+end
+
+Selectors.runner(::Type{DocumentPipeline}, doc::Documents.Document) = nothing
+
+end
--- /dev/null
+"""
+Provides the [`crossref`](@ref) function used to automatically calculate link URLs.
+"""
+module CrossReferences
+
+import ..Documenter:
+ Anchors,
+ Builder,
+ Documents,
+ Expanders,
+ Formats,
+ Documenter,
+ Utilities
+
+using DocStringExtensions
+import Markdown
+
+"""
+$(SIGNATURES)
+
+Traverses a [`Documents.Document`](@ref) and replaces links containg `@ref` URLs with
+their real URLs.
+"""
+function crossref(doc::Documents.Document)
+ for (src, page) in doc.internal.pages
+ empty!(page.globals.meta)
+ for element in page.elements
+ crossref(page.mapping[element], page, doc)
+ end
+ end
+end
+
+function crossref(elem, page, doc)
+ Documents.walk(page.globals.meta, elem) do link
+ xref(link, page.globals.meta, page, doc)
+ end
+end
+
+# Dispatch to `namedxref` / `docsxref`.
+# -------------------------------------
+
+const NAMED_XREF = r"^@ref (.+)$"
+
+function xref(link::Markdown.Link, meta, page, doc)
+ link.url == "@ref" ? basicxref(link, meta, page, doc) :
+ occursin(NAMED_XREF, link.url) ? namedxref(link, meta, page, doc) : nothing
+ return false # Stop `walk`ing down this `link` element.
+end
+xref(other, meta, page, doc) = true # Continue to `walk` through element `other`.
+
+function basicxref(link::Markdown.Link, meta, page, doc)
+ if length(link.text) === 1 && isa(link.text[1], Markdown.Code)
+ docsxref(link, link.text[1].code, meta, page, doc)
+ elseif isa(link.text, Vector)
+        # No `name` was provided (the URL is just `@ref`), so slugify the `.text` instead.
+ text = strip(sprint(Markdown.plain, Markdown.Paragraph(link.text)))
+ if occursin(r"#[0-9]+", text)
+ issue_xref(link, lstrip(text, '#'), meta, page, doc)
+ else
+ name = Utilities.slugify(text)
+ namedxref(link, name, meta, page, doc)
+ end
+ end
+end
+
+# Cross referencing headers.
+# --------------------------
+
+function namedxref(link::Markdown.Link, meta, page, doc)
+ # Extract the `name` from the `(@ref ...)`.
+ slug = match(NAMED_XREF, link.url)[1]
+ if isempty(slug)
+ text = sprint(Markdown.plaininline, link)
+ push!(doc.internal.errors, :cross_references)
+ Utilities.warn(page.source, "'$text' missing a name after '#'.")
+ else
+ if Anchors.exists(doc.internal.headers, slug)
+ namedxref(link, slug, meta, page, doc)
+ elseif length(link.text) === 1 && isa(link.text[1], Markdown.Code)
+ docsxref(link, slug, meta, page, doc)
+ else
+ namedxref(link, slug, meta, page, doc)
+ end
+ end
+end
+
+function namedxref(link::Markdown.Link, slug, meta, page, doc)
+ headers = doc.internal.headers
+    # Add the link to the list of unchecked local links.
+ doc.internal.locallinks[link] = link.url
+ # Error checking: `slug` should exist and be unique.
+ # TODO: handle non-unique slugs.
+ if Anchors.exists(headers, slug)
+ if Anchors.isunique(headers, slug)
+ # Replace the `@ref` url with a path to the referenced header.
+ anchor = Anchors.anchor(headers, slug)
+ path = relpath(anchor.file, dirname(page.build))
+ link.url = string(path, '#', slug, '-', anchor.nth)
+ else
+ push!(doc.internal.errors, :cross_references)
+ Utilities.warn(page.source, "'$slug' is not unique.")
+ end
+ else
+ push!(doc.internal.errors, :cross_references)
+ Utilities.warn(page.source, "Reference for '$slug' could not be found.")
+ end
+end
+
+# Cross referencing docstrings.
+# -----------------------------
+
+function docsxref(link::Markdown.Link, code, meta, page, doc)
+    # Add the link to the list of unchecked local links.
+ doc.internal.locallinks[link] = link.url
+ # Parse the link text and find current module.
+ keyword = Symbol(strip(code))
+ local ex
+ if haskey(Docs.keywords, keyword)
+ ex = QuoteNode(keyword)
+ else
+ try
+ ex = Meta.parse(code)
+ catch err
+ !isa(err, Meta.ParseError) && rethrow(err)
+ push!(doc.internal.errors, :cross_references)
+ Utilities.warn(page.source, "Unable to parse the reference '[`$code`](@ref)'.")
+ return
+ end
+ end
+ mod = get(meta, :CurrentModule, Main)
+
+ # Find binding and type signature associated with the link.
+ local binding
+ try
+ binding = Documenter.DocSystem.binding(mod, ex)
+ catch err
+ push!(doc.internal.errors, :cross_references)
+ Utilities.warn(page.source, "Unable to get the binding for '[`$code`](@ref)'.", err, ex, mod)
+ return
+ end
+
+ local typesig
+ try
+ typesig = Core.eval(mod, Documenter.DocSystem.signature(ex, rstrip(code)))
+ catch err
+ push!(doc.internal.errors, :cross_references)
+ Utilities.warn(page.source, "Unable to evaluate the type signature for '[`$code`](@ref)'.", err, ex, mod)
+ return
+ end
+
+ # Try to find a valid object that we can cross-reference.
+ object = find_object(doc, binding, typesig)
+ if object !== nothing
+ # Replace the `@ref` url with a path to the referenced docs.
+ docsnode = doc.internal.objects[object]
+ path = relpath(docsnode.page.build, dirname(page.build))
+ slug = Utilities.slugify(object)
+ link.url = string(path, '#', slug)
+ else
+ push!(doc.internal.errors, :cross_references)
+ Utilities.warn(page.source, "No doc found for reference '[`$code`](@ref)'.")
+ end
+end
+
+"""
+$(SIGNATURES)
+
+Find the included `Object` in the `doc` matching `binding` and `typesig`. The matching
+heuristic isn't too picky about what matches and will only fail when no `Binding`s matching
+`binding` have been included.
+"""
+function find_object(doc::Documents.Document, binding, typesig)
+ object = Utilities.Object(binding, typesig)
+ if haskey(doc.internal.objects, object)
+ # Exact object matching the requested one.
+ return object
+ else
+ objects = get(doc.internal.bindings, binding, Utilities.Object[])
+ if isempty(objects)
+ # No bindings match the requested object == FAILED.
+ return nothing
+ elseif length(objects) == 1
+ # Only one possible choice. Use it even if the signature doesn't match.
+ return objects[1]
+ else
+ candidate = find_object(binding, typesig)
+ if candidate in objects
+ # We've found an actual match out of the possible choices! Use it.
+ return candidate
+ else
+ # No match in the possible choices. Use the one that was first included.
+ return objects[1]
+ end
+ end
+ end
+end
+function find_object(binding, typesig)
+ if Documenter.DocSystem.defined(binding)
+ local λ = Documenter.DocSystem.resolve(binding)
+ return find_object(λ, binding, typesig)
+ else
+ return Utilities.Object(binding, typesig)
+ end
+end
+function find_object(λ::Union{Function, DataType}, binding, typesig)
+ if hasmethod(λ, typesig)
+ signature = getsig(λ, typesig)
+ return Utilities.Object(binding, signature)
+ else
+ return Utilities.Object(binding, typesig)
+ end
+end
+find_object(::Union{Function, DataType}, binding, ::Union{Union,Type{Union{}}}) = Utilities.Object(binding, Union{})
+find_object(other, binding, typesig) = Utilities.Object(binding, typesig)
+
+getsig(λ::Union{Function, DataType}, typesig) = Base.tuple_type_tail(which(λ, typesig).sig)
+
+
+# Issues/PRs cross referencing.
+# -----------------------------
+
+function issue_xref(link::Markdown.Link, num, meta, page, doc)
+ link.url = isempty(doc.internal.remote) ? link.url :
+ "https://github.com/$(doc.internal.remote)/issues/$num"
+end
+
+end
--- /dev/null
+"""
+Exported module that provides build and deploy dependencies and related functions.
+
+Currently only [`pip`](@ref) is implemented.
+"""
+module Deps
+
+export pip
+
+using DocStringExtensions
+
+"""
+$(SIGNATURES)
+
+Installs (as non-root user) all python packages listed in `deps`.
+
+# Examples
+
+```julia
+using Documenter
+
+makedocs(
+ # ...
+)
+
+deploydocs(
+ deps = Deps.pip("pygments", "mkdocs", "mkdocs-material"),
+ # ...
+)
+```
+"""
+function pip(deps...)
+ for dep in deps
+ run(`pip install --user $(dep)`)
+ end
+end
+
+
+function localbin()
+ Sys.islinux() ? joinpath(homedir(), ".local", "bin") :
+ Sys.isapple() ? joinpath(homedir(), "Library", "Python", "2.7", "bin") : ""
+end
+
+function updatepath!(p = localbin())
+ if occursin(p, ENV["PATH"])
+ ENV["PATH"]
+ else
+ ENV["PATH"] = "$p:$(ENV["PATH"])"
+ end
+end
+
+end
--- /dev/null
+"""
+Provides the [`missingdocs`](@ref), [`footnotes`](@ref) and [`linkcheck`](@ref) functions
+for checking docs.
+"""
+module DocChecks
+
+import ..Documenter:
+ Documenter,
+ Documents,
+ Utilities
+
+using DocStringExtensions
+import Markdown
+
+# Missing docstrings.
+# -------------------
+
+"""
+$(SIGNATURES)
+
+Checks that a [`Documents.Document`](@ref) contains all available docstrings that are
+defined in the `modules` keyword passed to [`Documenter.makedocs`](@ref).
+
+Prints out the name of each object that has not had its docs spliced into the document.
+"""
+function missingdocs(doc::Documents.Document)
+ doc.user.checkdocs === :none && return
+ println(" > checking for missing docstrings.")
+ bindings = allbindings(doc.user.checkdocs, doc.user.modules)
+ for object in keys(doc.internal.objects)
+ if haskey(bindings, object.binding)
+ signatures = bindings[object.binding]
+ if object.signature ≡ Union{} || length(signatures) ≡ 1
+ delete!(bindings, object.binding)
+ elseif object.signature in signatures
+ delete!(signatures, object.signature)
+ end
+ end
+ end
+ n = reduce(+, map(length, values(bindings)), init=0)
+ if n > 0
+ b = IOBuffer()
+ println(b, "$n docstring$(n ≡ 1 ? "" : "s") potentially missing:\n")
+ for (binding, signatures) in bindings
+ for sig in signatures
+ println(b, " $binding", sig ≡ Union{} ? "" : " :: $sig")
+ end
+ end
+ push!(doc.internal.errors, :missing_docs)
+ Utilities.warn(String(take!(b)))
+ end
+end
+
+function allbindings(checkdocs::Symbol, mods)
+ out = Dict{Utilities.Binding, Set{Type}}()
+ for m in mods
+ allbindings(checkdocs, m, out)
+ end
+ out
+end
+
+function allbindings(checkdocs::Symbol, mod::Module, out = Dict{Utilities.Binding, Set{Type}}())
+ for (obj, doc) in meta(mod)
+ isa(obj, IdDict{Any,Any}) && continue
+ name = nameof(obj)
+ isexported = Base.isexported(mod, name)
+ if checkdocs === :all || (isexported && checkdocs === :exports)
+ out[Utilities.Binding(mod, name)] = Set(sigs(doc))
+ end
+ end
+ out
+end
+
+meta(m) = Docs.meta(m)
+
+nameof(b::Base.Docs.Binding) = b.var
+nameof(x) = Base.nameof(x)
+
+sigs(x::Base.Docs.MultiDoc) = x.order
+sigs(::Any) = Type[Union{}]
+
+
+# Footnote checks.
+# ----------------
+"""
+$(SIGNATURES)
+
+Checks footnote links in a [`Documents.Document`](@ref).
+"""
+function footnotes(doc::Documents.Document)
+ println(" > checking footnote links.")
+ # A mapping of footnote ids to a tuple counter of how many footnote references and
+ # footnote bodies have been found.
+ #
+    # For all ids the final result should be `(N, 1)` where `N ≥ 1`, i.e. one or more
+ # footnote references and a single footnote body.
+ footnotes = Dict{Documents.Page, Dict{String, Tuple{Int, Int}}}()
+ for (src, page) in doc.internal.pages
+ empty!(page.globals.meta)
+ orphans = Dict{String, Tuple{Int, Int}}()
+ for element in page.elements
+ Documents.walk(page.globals.meta, page.mapping[element]) do block
+ footnote(block, orphans)
+ end
+ end
+ footnotes[page] = orphans
+ end
+ for (page, orphans) in footnotes
+ for (id, (ids, bodies)) in orphans
+ # Multiple footnote bodies.
+ if bodies > 1
+ push!(doc.internal.errors, :footnote)
+ Utilities.warn(page.source, "Footnote '$id' has $bodies bodies.")
+ end
+ # No footnote references for an id.
+ if ids === 0
+ push!(doc.internal.errors, :footnote)
+ Utilities.warn(page.source, "Unused footnote named '$id'.")
+ end
+ # No footnote bodies for an id.
+ if bodies === 0
+ push!(doc.internal.errors, :footnote)
+ Utilities.warn(page.source, "No footnotes found for '$id'.")
+ end
+ end
+ end
+end
+
+function footnote(fn::Markdown.Footnote, orphans::Dict)
+ ids, bodies = get(orphans, fn.id, (0, 0))
+ if fn.text === nothing
+ # Footnote references: syntax `[^1]`.
+ orphans[fn.id] = (ids + 1, bodies)
+ return false # No more footnotes inside footnote references.
+ else
+ # Footnote body: syntax `[^1]:`.
+ orphans[fn.id] = (ids, bodies + 1)
+ return true # Might be footnotes inside footnote bodies.
+ end
+end
+
+footnote(other, orphans::Dict) = true
+
+# Link Checks.
+# ------------
+
+hascurl() = (try; success(`curl --version`); catch err; false; end)
+
+"""
+$(SIGNATURES)
+
+Checks external links using curl.
+"""
+function linkcheck(doc::Documents.Document)
+ if doc.user.linkcheck
+ if hascurl()
+ println(" > checking external URLs:")
+ for (src, page) in doc.internal.pages
+ println(" - ", src)
+ for element in page.elements
+ Documents.walk(page.globals.meta, page.mapping[element]) do block
+ linkcheck(block, doc)
+ end
+ end
+ end
+ else
+ push!(doc.internal.errors, :linkcheck)
+ Utilities.warn("linkcheck requires `curl`.")
+ end
+ end
+ return nothing
+end
+
+function linkcheck(link::Markdown.Link, doc::Documents.Document)
+ INDENT = " "^6
+
+ # first, make sure we're not supposed to ignore this link
+ for r in doc.user.linkcheck_ignore
+ if linkcheck_ismatch(r, link.url)
+ printstyled(INDENT, "--- ", link.url, "\n", color=:normal)
+ return false
+ end
+ end
+
+ if !haskey(doc.internal.locallinks, link)
+ local result
+ try
+ result = read(`curl -sI $(link.url) --max-time 10`, String)
+ catch err
+ push!(doc.internal.errors, :linkcheck)
+ Utilities.warn("`curl -sI $(link.url)` failed:\n\n$(err)")
+ return false
+ end
+ local STATUS_REGEX = r"^HTTP/(1.1|2) (\d+) (.+)$"m
+ if occursin(STATUS_REGEX, result)
+ status = parse(Int, match(STATUS_REGEX, result).captures[2])
+ if status < 300
+ printstyled(INDENT, "$(status) ", link.url, "\n", color=:green)
+ elseif status < 400
+ LOCATION_REGEX = r"^Location: (.+)$"m
+ if occursin(LOCATION_REGEX, result)
+ location = strip(match(LOCATION_REGEX, result).captures[1])
+ printstyled(INDENT, "$(status) ", link.url, "\n", color=:yellow)
+ printstyled(INDENT, " -> ", location, "\n\n", color=:yellow)
+ else
+ printstyled(INDENT, "$(status) ", link.url, "\n", color=:yellow)
+ end
+ else
+ push!(doc.internal.errors, :linkcheck)
+ printstyled(INDENT, "$(status) ", link.url, "\n", color=:red)
+ end
+ else
+ push!(doc.internal.errors, :linkcheck)
+ Utilities.warn("invalid result returned by `curl -sI $(link.url)`:\n\n$(result)")
+ end
+ end
+ return false
+end
+linkcheck(other, doc::Documents.Document) = true
+
+linkcheck_ismatch(r::String, url) = (url == r)
+linkcheck_ismatch(r::Regex, url) = occursin(r, url)
+
+end
--- /dev/null
+"""
+Provides a consistent interface for retrieving `DocStr` objects from the Julia
+docsystem in both `0.4` and `0.5`.
+"""
+module DocSystem
+
+using DocStringExtensions
+import Markdown
+import Base.Docs: MultiDoc, parsedoc, formatdoc, DocStr
+
+## Bindings ##
+
+"""
+Converts an object to a `Base.Docs.Binding` object.
+
+$(SIGNATURES)
+
+Supported inputs are:
+
+- `Binding`
+- `DataType`
+- `Function`
+- `Module`
+- `Symbol`
+
+Note that unsupported objects will throw an `ArgumentError`.
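+
+For example (a sketch; the results are shown as comments):
+
+```julia
+binding(Base.sum)           # Binding for `Base.sum`
+binding(Main, :(Base.sum))  # the same binding, derived from an expression
+```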
+"""
+binding(any::Any) = throw(ArgumentError("cannot convert `$(repr(any))` to a `Binding`."))
+
+#
+# The simple definitions.
+#
+binding(b::Docs.Binding) = binding(b.mod, b.var)
+binding(d::DataType) = binding(d.name.module, d.name.name)
+binding(m::Module) = binding(m, nameof(m))
+binding(s::Symbol) = binding(Main, s)
+binding(f::Function) = binding(typeof(f).name.module, typeof(f).name.mt.name)
+
+#
+# We need a lookup table for `IntrinsicFunction`s since they do not track their
+# own name and defining module.
+#
+# Note that `IntrinsicFunction` is exported from `Base` in `0.4`, but not in `0.5`.
+#
+let INTRINSICS = Dict(map(s -> getfield(Core.Intrinsics, s) => s, names(Core.Intrinsics, all=true)))
+ global binding(i::Core.IntrinsicFunction) = binding(Core.Intrinsics, INTRINSICS[i]::Symbol)
+end
+
+#
+# Normalise the parent module.
+#
+# This is done within the `Binding` constructor on `0.5`, but not on `0.4`.
+#
+function binding(m::Module, v::Symbol)
+ m = nameof(m) === v ? parentmodule(m) : m
+ Docs.Binding(m, v)
+end
+
+#
+# Pseudo-eval of `Expr`s to find their equivalent `Binding`.
+#
+binding(m::Module, x::Expr) =
+ Meta.isexpr(x, :.) ? binding(getmod(m, x.args[1]), x.args[2].value) :
+ Meta.isexpr(x, [:call, :macrocall, :curly]) ? binding(m, x.args[1]) :
+ Meta.isexpr(x, :where) ? binding(m, x.args[1].args[1]) :
+ error("`binding` cannot understand expression `$x`.")
+
+# Helper methods for the above `binding` method.
+getmod(m::Module, x::Expr) = getfield(getmod(m, x.args[1]), x.args[2].value)
+getmod(m::Module, s::Symbol) = getfield(m, s)
+
+binding(m::Module, q::QuoteNode) = binding(Main, q.value)
+
+binding(m::Module, λ::Any) = binding(λ)
+
+## Signatures. ##
+
+function signature(x, str::AbstractString)
+ ts = Base.Docs.signature(x)
+ (Meta.isexpr(x, :macrocall, 2) && !endswith(strip(str), "()")) ? :(Union{}) : ts
+end
+
+## Docstring containers. ##
+
+
+"""
+Construct a `MultiDoc` object from the provided argument.
+
+Valid inputs are:
+
+- `Markdown.MD`
+- `Docs.FuncDoc`
+- `Docs.TypeDoc`
+
+"""
+function multidoc end
+
+function multidoc(markdown::Markdown.MD)
+ md = MultiDoc()
+ sig = Union{}
+ push!(md.order, sig)
+ md.docs[sig] = docstr(markdown)
+ md
+end
+
+
+
+"""
+$(SIGNATURES)
+
+Construct a `DocStr` object from a `Markdown.MD` object.
+
+The optional keyword arguments are used to add new data to the `DocStr`'s
+`.data` dictionary.
+"""
+function docstr(md::Markdown.MD; kws...)
+ data = Dict{Symbol, Any}(
+ :path => md.meta[:path],
+ :module => md.meta[:module],
+ :linenumber => 0,
+ )
+ doc = DocStr(Core.svec(), md, data)
+ for (key, value) in kws
+ doc.data[key] = value
+ end
+ doc
+end
+docstr(other) = other
+
+
+## Formatting `DocStr`s. ##
+
+
+
+
+## Converting docstring caches. ##
+
+"""
+$(SIGNATURES)
+
+Converts a `0.4`-style docstring cache into a `0.5` one.
+
+The original docstring cache is not modified.
+"""
+function convertmeta(meta::IdDict{Any,Any})
+ if !haskey(CACHED, meta)
+ docs = IdDict{Any,Any}()
+ for (k, v) in meta
+ if !isa(k, Union{Number, AbstractString, IdDict{Any,Any}})
+ docs[binding(k)] = multidoc(v)
+ end
+ end
+ CACHED[meta] = docs
+ end
+ CACHED[meta]::IdDict{Any,Any}
+end
+const CACHED = IdDict{Any,Any}()
+
+
+## Get docs from modules.
+
+"""
+$(SIGNATURES)
+
+Find all `DocStr` objects that match the provided arguments:
+
+- `binding`: the name of the object.
+- `typesig`: the signature of the object. Default: `Union{}`.
+- `compare`: how to compare signatures? Exact (`==`) or subtypes (`<:`). Default: `<:`.
+- `modules`: which modules to search through. Default: *all modules*.
+- `aliases`: check aliases of `binding` when nothing is found. Default: `true`.
+
+Returns a `Vector{DocStr}` ordered by definition order in `0.5` and by
+`type_morespecific` in `0.4`.
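+
+A hypothetical call (illustrative only):
+
+```julia
+b = binding(Main, :(Base.sum))
+getdocs(b)   # Vector{DocStr} collected for `Base.sum`
+```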
+"""
+function getdocs(
+ binding::Docs.Binding,
+ typesig::Type = Union{};
+ compare = (==),
+ modules = Docs.modules,
+ aliases = true,
+ )
+ # Fall back to searching all modules if user provides no modules.
+ modules = isempty(modules) ? Docs.modules : modules
+ # Keywords are special-cased within the docsystem. Handle those first.
+ iskeyword(binding) && return [docstr(Base.Docs.keywords[binding.var])]
+ # Handle all the other possible bindings.
+ results = DocStr[]
+ for mod in modules
+ meta = getmeta(mod)
+ if haskey(meta, binding)
+ multidoc = meta[binding]::MultiDoc
+ for signature in multidoc.order
+ if compare(typesig, signature)
+ doc = multidoc.docs[signature]
+ doc.data[:binding] = binding
+ doc.data[:typesig] = signature
+ push!(results, doc)
+ end
+ end
+ end
+ end
+ if compare == (==)
+ # Exact matching of signatures:
+ #
+    # When we get a single match from using `==` as the comparison then we just return
+ # that one result.
+ #
+ # Otherwise we fallback to comparing signatures using `<:` to match, hopefully, a
+ # wider range of possible docstrings.
+ if length(results) == 1
+ results
+ else
+ getdocs(binding, typesig; compare = (<:), modules = modules, aliases = aliases)
+ end
+ else
+ # When nothing is found we check whether the `binding` is an alias of some other
+ # `Binding`. If so then we redo the search using that `Binding` instead.
+ if aliases && isempty(results) && (b = aliasof(binding)) != binding
+ getdocs(b, typesig; compare = compare, modules = modules)
+ else
+ results
+ end
+ end
+end
+
+"""
+$(SIGNATURES)
+
+Accepts objects of any type and tries to convert them to `Binding`s before
+searching for the `Binding` in the docsystem.
+
+Note that when conversion fails this method returns an empty `Vector{DocStr}`.
+"""
+function getdocs(object::Any, typesig::Type = Union{}; kws...)
+ binding = aliasof(object, object)
+ binding === object ? DocStr[] : getdocs(binding, typesig; kws...)
+end
+
+#
+# Helper methods used by the `getdocs` function above.
+#
+
+getmeta(m::Module) = Docs.meta(m)
+
+import Base.Docs: aliasof, resolve, defined
+
+
+aliasof(s::Symbol, b) = binding(s)
+
+iskeyword(b::Docs.Binding) = b.mod === Main && haskey(Base.Docs.keywords, b.var)
+ismacro(b::Docs.Binding) = startswith(string(b.var), '@')
+
+
+function category(b::Docs.Binding)
+ if iskeyword(b)
+ :keyword
+ elseif ismacro(b)
+ :macro
+ else
+ category(resolve(b))
+ end
+end
+category(::Function) = :function
+category(::DataType) = :type
+category(x::UnionAll) = category(Base.unwrap_unionall(x))
+category(::Module) = :module
+category(::Any) = :constant
+
+end
--- /dev/null
+"""
+Provides the [`doctest`](@ref) function that makes sure that the `jldoctest` code blocks
+in the documents and docstrings run and are up to date.
+"""
+module DocTests
+using DocStringExtensions
+
+import ..Documenter:
+ Documenter,
+ Documents,
+ Expanders,
+ Utilities
+
+import Markdown, REPL
+
+# Julia code block testing.
+# -------------------------
+
+# escape characters that have a meaning in regex
+regex_escape(str) = sprint(escape_string, str, "\\^\$.|?*+()[{")
+
+# helper to display linerange for error printing
+function find_codeblock_in_file(code, file)
+ content = read(Base.find_source_file(file), String)
+ content = replace(content, "\r\n" => "\n")
+ # make a regex of the code that matches leading whitespace
+ rcode = "\\h*" * replace(regex_escape(code), "\\n" => "\\n\\h*")
+ blockidx = findfirst(Regex(rcode), content)
+ if blockidx !== nothing
+ startline = countlines(IOBuffer(content[1:prevind(content, first(blockidx))]))
+ endline = startline + countlines(IOBuffer(code)) + 1 # +1 to include the closing ```
+ return ":$(startline)-$(endline)"
+ else
+ return ""
+ end
+end
+
+"""
+$(SIGNATURES)
+
+Traverses the document tree and tries to run each Julia code block encountered. Will abort
+the document generation when an error is thrown. Use the `doctest = false` keyword in
+[`Documenter.makedocs`](@ref) to disable doctesting.
+"""
+function doctest(doc::Documents.Document)
+ if doc.user.doctest === :fix || doc.user.doctest
+ println(" > running doctests.")
+ for (src, page) in doc.internal.pages
+ empty!(page.globals.meta)
+ for element in page.elements
+ page.globals.meta[:CurrentFile] = page.source
+ Documents.walk(page.globals.meta, page.mapping[element]) do block
+ doctest(block, page.globals.meta, doc, page)
+ end
+ end
+ end
+ else
+ Utilities.warn("Skipped doctesting.")
+ end
+end
+
+function doctest(block::Markdown.Code, meta::Dict, doc::Documents.Document, page)
+ lang = block.language
+ if startswith(lang, "jldoctest")
+ # Define new module or reuse an old one from this page if we have a named doctest.
+ name = match(r"jldoctest[ ]?(.*)$", split(lang, ';', limit = 2)[1])[1]
+ sym = isempty(name) ? gensym("doctest-") : Symbol("doctest-", name)
+ sandbox = get!(() -> Expanders.get_new_sandbox(sym), page.globals.meta, sym)
+
+ # Normalise line endings.
+ block.code = replace(block.code, "\r\n" => "\n")
+
+ # parse keyword arguments to doctest
+ d = Dict()
+ idx = findfirst(c -> c == ';', lang)
+ if idx !== nothing
+ kwargs = Meta.parse("($(lang[nextind(lang, idx):end]),)")
+ for kwarg in kwargs.args
+ if !(isa(kwarg, Expr) && kwarg.head === :(=) && isa(kwarg.args[1], Symbol))
+ file = meta[:CurrentFile]
+ lines = find_codeblock_in_file(block.code, file)
+ Utilities.warn(
+ """
+ Invalid syntax for doctest keyword arguments in $(file)$(lines)
+ Use ```jldoctest name; key1 = value1, key2 = value2
+
+ ```$(lang)
+ $(block.code)
+ ```
+ """
+ )
+ return false
+ end
+ d[kwarg.args[1]] = Core.eval(sandbox, kwarg.args[2])
+ end
+ end
+ meta[:LocalDocTestArguments] = d
+
+ for expr in [get(meta, :DocTestSetup, []); get(meta[:LocalDocTestArguments], :setup, [])]
+ Meta.isexpr(expr, :block) && (expr.head = :toplevel)
+ try
+ Core.eval(sandbox, expr)
+ catch e
+ push!(doc.internal.errors, :doctest)
+ @error("could not evaluate expression from doctest setup.",
+ expression = expr, exception = e)
+ return false
+ end
+ end
+ if occursin(r"^julia> "m, block.code)
+ eval_repl(block, sandbox, meta, doc, page)
+ elseif occursin(r"^# output$"m, block.code)
+ eval_script(block, sandbox, meta, doc, page)
+ else
+ push!(doc.internal.errors, :doctest)
+ file = meta[:CurrentFile]
+ lines = find_codeblock_in_file(block.code, file)
+ Utilities.warn(
+ """
+ Invalid doctest block in $(file)$(lines)
+ Requires `julia> ` or `# output`
+
+ ```$(lang)
+ $(block.code)
+ ```
+ """
+ )
+ end
+ delete!(meta, :LocalDocTestArguments)
+ end
+ false
+end
+doctest(block, meta::Dict, doc::Documents.Document, page) = true
+
+function doctest(block::Markdown.MD, meta::Dict, doc::Documents.Document, page)
+ haskey(block.meta, :path) && (meta[:CurrentFile] = block.meta[:path])
+ return true
+end
+
+# Doctest evaluation.
+
+mutable struct Result
+ block :: Markdown.Code # The entire code block that is being tested.
+ input :: String # Part of `block.code` representing the current input.
+ output :: String # Part of `block.code` representing the current expected output.
+ file :: String # File in which the doctest is written. Either `.md` or `.jl`.
+ value :: Any # The value returned when evaluating `input`.
+ hide :: Bool # Semi-colon suppressing the output?
+ stdout :: IOBuffer # Redirected stdout/stderr gets sent here.
+ bt :: Vector # Backtrace when an error is thrown.
+
+ function Result(block, input, output, file)
+ new(block, input, rstrip(output, '\n'), file, nothing, false, IOBuffer())
+ end
+end
+
+function eval_repl(block, sandbox, meta::Dict, doc::Documents.Document, page)
+ for (input, output) in repl_splitter(block.code)
+ result = Result(block, input, output, meta[:CurrentFile])
+ for (ex, str) in Utilities.parseblock(input, doc, page; keywords = false)
+ # Input containing a semi-colon gets suppressed in the final output.
+ result.hide = REPL.ends_with_semicolon(str)
+ (value, success, backtrace, text) = Utilities.withoutput() do
+ disable_color() do
+ Core.eval(sandbox, ex)
+ end
+ end
+ result.value = value
+ print(result.stdout, text)
+ if success
+ # Redefine the magic `ans` binding available in the REPL.
+ __ans__!(sandbox, result.value)
+ else
+ result.bt = backtrace
+ end
+ end
+ checkresult(sandbox, result, meta, doc)
+ end
+end
+
+function __ans__!(m::Module, value)
+ isdefined(m, :__ans__!) || Core.eval(m, :(__ans__!(value) = global ans = value))
+ return Core.eval(m, Expr(:call, () -> m.__ans__!(value)))
+end
+
+function eval_script(block, sandbox, meta::Dict, doc::Documents.Document, page)
+ # TODO: decide whether to keep `# output` syntax for this. It's a bit ugly.
+ # Maybe use double blank lines, i.e.
+ #
+ #
+ # to mark `input`/`output` separation.
+ input, output = split(block.code, "# output\n", limit = 2)
+ input = rstrip(input, '\n')
+ output = lstrip(output, '\n')
+ result = Result(block, input, output, meta[:CurrentFile])
+ for (ex, str) in Utilities.parseblock(input, doc, page; keywords = false)
+ (value, success, backtrace, text) = Utilities.withoutput() do
+ Core.eval(sandbox, ex)
+ end
+ result.value = value
+ print(result.stdout, text)
+ if !success
+ result.bt = backtrace
+ break
+ end
+ end
+ checkresult(sandbox, result, meta, doc)
+end
+
+function filter_doctests(strings::NTuple{2, AbstractString},
+ doc::Documents.Document, meta::Dict)
+ meta_block_filters = get(meta, :DocTestFilters, [])
+ meta_block_filters === nothing && (meta_block_filters = [])
+ doctest_local_filters = get(meta[:LocalDocTestArguments], :filter, [])
+ for r in [doc.user.doctestfilters; meta_block_filters; doctest_local_filters]
+ if all(occursin.((r,), strings))
+ strings = replace.(strings, (r => "",))
+ end
+ end
+ return strings
+end
+
+# Regex used here to replace gensym'd module names could probably use improvements.
+function checkresult(sandbox::Module, result::Result, meta::Dict, doc::Documents.Document)
+ sandbox_name = nameof(sandbox)
+ mod_regex = Regex("(Main\\.)?(Symbol\\(\"$(sandbox_name)\"\\)|$(sandbox_name))[,.]")
+ mod_regex_nodot = Regex(("(Main\\.)?$(sandbox_name)"))
+ outio = IOContext(result.stdout, :module => sandbox)
+ if isdefined(result, :bt) # An error was thrown and we have a backtrace.
+ # To avoid dealing with path/line number issues in backtraces we use `[...]` to
+ # mark ignored output from an error message. Only the text prior to it is used to
+ # test for doctest success/failure.
+ head = replace(split(result.output, "\n[...]"; limit = 2)[1], mod_regex => "")
+ head = replace(head, mod_regex_nodot => "Main")
+ str = replace(error_to_string(outio, result.value, result.bt), mod_regex => "")
+ str = replace(str, mod_regex_nodot => "Main")
+
+ str, head = filter_doctests((str, head), doc, meta)
+ # Since checking for the prefix of an error won't catch the empty case we need
+ # to check that manually with `isempty`.
+ if isempty(head) || !startswith(str, head)
+ if doc.user.doctest === :fix
+ fix_doctest(result, str, doc)
+ else
+ report(result, str, doc)
+ end
+ end
+ else
+ value = result.hide ? nothing : result.value # `;` hides output.
+ output = replace(rstrip(sanitise(IOBuffer(result.output))), mod_regex => "")
+ str = replace(result_to_string(outio, value), mod_regex => "")
+ # Replace a standalone module name with `Main`.
+ str = rstrip(replace(str, mod_regex_nodot => "Main"))
+ filteredstr, filteredoutput = filter_doctests((str, output), doc, meta)
+ if filteredstr != filteredoutput
+ if doc.user.doctest === :fix
+ fix_doctest(result, str, doc)
+ else
+ report(result, str, doc)
+ end
+ end
+ end
+ return nothing
+end
+
+# Display doctesting results.
+
+function result_to_string(buf, value)
+ dis = text_display(buf)
+ value === nothing || disable_color() do
+ Core.eval(Main, Expr(:call, display, dis, QuoteNode(value)))
+ end
+ sanitise(buf)
+end
+
+text_display(buf) = TextDisplay(IOContext(buf, :limit => true))
+
+funcsym() = CAN_INLINE[] ? :disable_color : :eval
+
+function error_to_string(buf, er, bt)
+ fs = funcsym()
+ # Remove unimportant backtrace info.
+ index = findlast(ptr -> Base.ip_matches_func(ptr, fs), bt)
+ # Print a REPL-like error message.
+ disable_color() do
+ print(buf, "ERROR: ")
+ Base.invokelatest(showerror, buf, er, index === nothing ? bt : bt[1:(index - 1)])
+ end
+ sanitise(buf)
+end
+
+# Strip trailing whitespace and remove terminal colors.
+function sanitise(buffer)
+ out = IOBuffer()
+ for line in eachline(seekstart(Base.unwrapcontext(buffer)[1]))
+ println(out, rstrip(line))
+ end
+ remove_term_colors(rstrip(String(take!(out)), '\n'))
+end
+
+import .Utilities.TextDiff
+
+function report(result::Result, str, doc::Documents.Document)
+ iob = IOBuffer()
+ ioc = IOContext(iob, :color => Base.have_color)
+ println(ioc, "=====[Test Error]", "="^30)
+ println(ioc)
+ printstyled(ioc, "> Location: ", result.file, color=:cyan)
+ printstyled(ioc, find_codeblock_in_file(result.block.code, result.file), color=:cyan)
+ printstyled(ioc, "\n\n> Code block:\n", color=:cyan)
+ println(ioc, "\n```$(result.block.language)")
+ println(ioc, result.block.code)
+ println(ioc, "```")
+ if !isempty(result.input)
+ printstyled(ioc, "\n> Subexpression:\n", color=:cyan)
+ print_indented(ioc, result.input; indent = 4)
+ end
+ warning = Base.have_color ? "" : " (REQUIRES COLOR)"
+ printstyled(ioc, "\n> Output Diff", warning, ":\n\n", color=:cyan)
+ diff = TextDiff.Diff{TextDiff.Words}(result.output, rstrip(str))
+ Utilities.TextDiff.showdiff(ioc, diff)
+ println(ioc, "\n\n", "=====[End Error]=", "="^30)
+ push!(doc.internal.errors, :doctest)
+ printstyled(String(take!(iob)), color=:normal)
+end
+
+function print_indented(buffer::IO, str::AbstractString; indent = 4)
+ println(buffer)
+ for line in split(str, '\n')
+ println(buffer, " "^indent, line)
+ end
+end
+
+function fix_doctest(result::Result, str, doc::Documents.Document)
+ code = result.block.code
+ filename = Base.find_source_file(result.file)
+ # read the file containing the code block
+ content = read(filename, String)
+ # output stream
+ io = IOBuffer(sizehint = sizeof(content))
+ # first look for the entire code block
+ # make a regex of the code that matches leading whitespace
+ rcode = "(\\h*)" * replace(regex_escape(code), "\\n" => "\\n\\h*")
+ r = Regex(rcode)
+ codeidx = findfirst(r, content)
+ if codeidx === nothing
+ Utilities.warn("Could not find code block in source file")
+ return
+ end
+ # use the capture group to identify indentation
+ indent = match(r, content).captures[1]
+ # write everything up until the code block
+ write(io, content[1:prevind(content, first(codeidx))])
+ # next look for the particular input string in the given code block
+ # make a regex of the input that matches leading whitespace (for multiline input)
+ rinput = "\\h*" * replace(regex_escape(result.input), "\\n" => "\\n\\h*")
+ r = Regex(rinput)
+ inputidx = findfirst(r, code)
+ if inputidx === nothing
+ Utilities.warn("Could not find input line in code block")
+ return
+ end
+ # construct the new code-snippet (without indent)
+ # first part: everything up until the last index of the input string
+ newcode = code[1:last(inputidx)]
+ isempty(result.output) && (newcode *= '\n') # issue #772
+ # second part: the rest, with the old output replaced with the new one
+ newcode *= replace(code[nextind(code, last(inputidx)):end], result.output => str, count = 1)
+ # replace internal code block with the non-indented new code, needed if we come back
+ # looking to replace output in the same code block later
+ result.block.code = newcode
+ # write the new code snippet to the stream, with indent
+ newcode = replace(newcode, r"^(.+)$"m => Base.SubstitutionString(indent * "\\1"))
+ write(io, newcode)
+ # write rest of the file
+ write(io, content[nextind(content, last(codeidx)):end])
+ # write to file
+ write(filename, seekstart(io))
+ return
+end
+
+# Remove terminal colors.
+
+const TERM_COLOR_REGEX = r"\e\[[0-9;]*m"
+remove_term_colors(s) = replace(s, TERM_COLOR_REGEX => "")
+
+# REPL doctest splitter.
+
+const PROMPT_REGEX = r"^julia> (.*)$"
+const SOURCE_REGEX = r"^ (.*)$"
+const ANON_FUNC_DECLARATION = r"#[0-9]+ \(generic function with [0-9]+ method(s)?\)"
+
+function repl_splitter(code)
+ lines = split(string(code, "\n"), '\n')
+ input = String[]
+ output = String[]
+ buffer = IOBuffer()
+ while !isempty(lines)
+ line = popfirst!(lines)
+ # REPL code blocks may contain leading lines with comments. Drop them.
+ # TODO: handle multiline comments?
+ # ANON_FUNC_DECLARATION deals with `x->x` -> `#1 (generic function ....)` on 0.7
+ # TODO: Remove this special case and just disallow lines with comments?
+ startswith(line, '#') && !occursin(ANON_FUNC_DECLARATION, line) && continue
+ prompt = match(PROMPT_REGEX, line)
+ if prompt === nothing
+ source = match(SOURCE_REGEX, line)
+ if source === nothing
+ savebuffer!(input, buffer)
+ println(buffer, line)
+ takeuntil!(PROMPT_REGEX, buffer, lines)
+ else
+ println(buffer, source[1])
+ end
+ else
+ savebuffer!(output, buffer)
+ println(buffer, prompt[1])
+ end
+ end
+ savebuffer!(output, buffer)
+ zip(input, output)
+end
+
+function savebuffer!(out, buf)
+ n = bytesavailable(seekstart(buf))
+ n > 0 ? push!(out, rstrip(String(take!(buf)))) : out
+end
+
+function takeuntil!(r, buf, lines)
+ while !isempty(lines)
+ line = lines[1]
+ if !occursin(r, line)
+ println(buf, popfirst!(lines))
+ else
+ break
+ end
+ end
+end
+
+function disable_color(func)
+ color = Base.have_color
+ try
+ @eval Base have_color = false
+ func()
+ finally
+ @eval Base have_color = $color
+ end
+end
+
+const CAN_INLINE = Ref(true)
+function __init__()
+ CAN_INLINE[] = Base.JLOptions().can_inline == 0 ? false : true
+end
+
+end
--- /dev/null
+"""
+Main module for `Documenter.jl` -- a documentation generation package for Julia.
+
+Two functions are exported from this module for public use:
+
+- [`makedocs`](@ref). Generates documentation from docstrings and templated markdown files.
+- [`deploydocs`](@ref). Deploys generated documentation from *Travis-CI* to *GitHub Pages*.
+
+# Exports
+
+$(EXPORTS)
+
+"""
+module Documenter
+
+using DocStringExtensions
+import Base64: base64decode
+import Pkg
+
+# Submodules
+# ----------
+
+include("Utilities/Utilities.jl")
+include("DocSystem.jl")
+include("Formats.jl")
+include("Anchors.jl")
+include("Documents.jl")
+include("Builder.jl")
+include("Expanders.jl")
+include("CrossReferences.jl")
+include("DocTests.jl")
+include("DocChecks.jl")
+include("Writers/Writers.jl")
+include("Deps.jl")
+
+import .Utilities: Selectors
+
+
+# User Interface.
+# ---------------
+
+export Deps, makedocs, deploydocs, hide
+
+"""
+ makedocs(
+ root = "<current-directory>",
+ source = "src",
+ build = "build",
+ clean = true,
+ doctest = true,
+ modules = Module[],
+ repo = "",
+ )
+
+Combines markdown files and inline docstrings into an interlinked document.
+In most cases [`makedocs`](@ref) should be run from a `make.jl` file:
+
+```julia
+using Documenter
+makedocs(
+ # keywords...
+)
+```
+
+which is then run from the command line with:
+
+```sh
+\$ julia make.jl
+```
+
+The folder structure that [`makedocs`](@ref) expects looks like:
+
+ docs/
+     build/
+     src/
+     make.jl
+
+# Keywords
+
+**`root`** is the directory from which `makedocs` should run. When run from a `make.jl` file
+this keyword does not need to be set. It is, for the most part, needed when repeatedly
+running `makedocs` from the Julia REPL like so:
+
+ julia> makedocs(root = Pkg.dir("MyPackage", "docs"))
+
+**`source`** is the directory, relative to `root`, where the markdown source files are read
+from. By convention this folder is called `src`. Note that any non-markdown files stored
+in `source` are copied over to the build directory when [`makedocs`](@ref) is run.
+
+**`build`** is the directory, relative to `root`, into which generated files and folders are
+written when [`makedocs`](@ref) is run. The name of the build directory is, by convention,
+called `build`, though, like with `source`, users are free to change this to anything else
+to better suit their project needs.
+
+**`clean`** tells [`makedocs`](@ref) whether to remove all the content from the `build`
+folder prior to generating new content from `source`. By default this is set to `true`.
+
+**`doctest`** instructs [`makedocs`](@ref) on whether to try to test Julia code blocks
+that are encountered in the generated document. By default this keyword is set to `true`.
+Doctesting should only ever be disabled when initially setting up a newly developed package
+where the developer is just trying to get their package and documentation structure correct.
+After that, it's encouraged to always make sure that documentation examples are runnable and
+produce the expected results. See the [Doctests](@ref) manual section for details about
+running doctests.
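+
+For example, a minimal sketch of the two common settings (the `:fix` value makes
+[`makedocs`](@ref) rewrite outdated doctest output in the source files instead of failing
+the build):
+
+```julia
+makedocs(doctest = false) # skip doctests entirely while drafting the docs
+makedocs(doctest = :fix)  # run doctests and fix outdated output in place
+```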
+
+**`modules`** specifies a vector of modules that should be documented in `source`. If any
+inline docstrings from those modules are seen to be missing from the generated content then
+a warning will be printed during execution of [`makedocs`](@ref). By default no modules are
+passed to `modules` and so no warnings will appear. This setting can be used as an indicator
+of the "coverage" of the generated documentation.
+For example Documenter's `make.jl` file contains:
+
+```julia
+makedocs(
+ modules = [Documenter],
+ # ...
+)
+```
+
+and so any docstring from the module `Documenter` that is not spliced into the generated
+documentation in `build` will raise a warning.
+
+**`repo`** specifies a template for the "link to source" feature. If you are
+using GitHub, this is automatically generated from the remote. If you are using
+a different host, you can use this option to tell Documenter how URLs should be
+generated. The following placeholders will be replaced with the respective
+value of the generated link:
+
+ - `{commit}` Git branch or tag name, or commit hash
+ - `{path}` Path to the file in the repository
+ - `{line}` Line (or range of lines) in the source file
+
+For example if you are using GitLab.com, you could use
+
+```julia
+makedocs(repo = \"https://gitlab.com/user/project/blob/{commit}{path}#{line}\")
+```
+
+# Experimental keywords
+
+In addition to standard arguments there is a set of non-finalized experimental keyword
+arguments. The behaviour of these may change or they may be removed without deprecation
+when a minor version changes (i.e. except in patch releases).
+
+**`checkdocs`** instructs [`makedocs`](@ref) to check whether all names within the modules
+defined in the `modules` keyword that have a docstring attached have the docstring also
+listed in the manual (e.g. there's a `@docs` block with that docstring). Possible values
+are `:all` (check all names) and `:exports` (check only exported names). The default value
+is `:none`, in which case no checks are performed. If `strict` is also enabled then the
+build will fail if any missing docstrings are encountered.
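+
+A sketch of how these keywords might be combined (`MyPackage` is a hypothetical module):
+
+```julia
+makedocs(
+ modules = [MyPackage],
+ checkdocs = :exports, # warn about exported names whose docstrings are not in the manual
+ strict = true,        # turn those warnings into a build failure
+)
+```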
+
+**`linkcheck`** -- if set to `true` [`makedocs`](@ref) uses `curl` to check the status codes
+of external-pointing links, to make sure that they are up-to-date. The links and their
+status codes are printed to the standard output. If `strict` is also enabled then the build
+will fail if there are any broken (400+ status code) links. Default: `false`.
+
+**`linkcheck_ignore`** allows certain URLs to be ignored in `linkcheck`. The values should
+be a list of strings (which get matched exactly) or `Regex` objects. By default nothing is
+ignored.
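+
+For example (the URLs below are placeholders):
+
+```julia
+makedocs(
+ linkcheck = true,
+ linkcheck_ignore = [
+ "https://example.com/exact-match",     # strings are matched exactly
+ r"^https://internal[.]example[.]org/", # Regex objects are matched as patterns
+ ],
+)
+```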
+
+**`strict`** -- [`makedocs`](@ref) fails the build right before rendering if it encountered
+any errors with the document in the previous build phases.
+
+## Output formats
+
+**`format`** allows the output format to be specified. Possible values are `:html` (default),
+`:latex` and `:markdown`.
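+
+For example (a minimal sketch):
+
+```julia
+makedocs(format = :html)     # the default HTML output
+makedocs(format = :markdown) # experimental Markdown output
+```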
+
+Documenter is designed to support multiple output formats. By default it creates a set of
+HTML files, but the output format can be controlled with the `format` keyword. The different
+output formats may require additional keywords to be specified. The keywords for the default
+HTML output are documented in the [`Writers.HTMLWriter`](@ref) module.
+
+Documenter also has (experimental) support for Markdown and LaTeX / PDF outputs. See the
+[Other Output Formats](@ref) section of the manual for more information.
+
+!!! warning
+
+ The Markdown and LaTeX output formats will be moved to a separate package in future
+ versions of Documenter. Automatic documentation deployments should not rely on them unless
+ they pin Documenter to a specific minor version.
+
+# See Also
+
+A guide detailing how to document a package using Documenter's [`makedocs`](@ref) is provided
+in the [setup guide in the manual](@ref Package-Guide).
+"""
+function makedocs(; debug = false, args...)
+ document = Documents.Document(; args...)
+ cd(document.user.root) do
+ Selectors.dispatch(Builder.DocumentPipeline, document)
+ end
+ debug ? document : nothing
+end
+
+"""
+$(SIGNATURES)
+
+Allows a page to be hidden in the navigation menu. It will only show up if it happens to be
+the current page. The hidden page will still be present in the linear page list that can be
+accessed via the previous and next page links. The title of the hidden page can be overridden
+using the `=>` operator as usual.
+
+# Usage
+
+```julia
+makedocs(
+ ...,
+ pages = [
+ ...,
+ hide("page1.md"),
+ hide("Title" => "page2.md")
+ ]
+)
+```
+"""
+hide(page::Pair) = (false, page.first, page.second, [])
+hide(page::AbstractString) = (false, nothing, page, [])
+
+"""
+$(SIGNATURES)
+
+Allows a subsection of pages to be hidden from the navigation menu. `root` will be linked
+to in the navigation menu, with the title determined as usual. `children` should be a list
+of pages (note that it **cannot** be hierarchical).
+
+# Usage
+
+```julia
+makedocs(
+ ...,
+ pages = [
+ ...,
+ hide("Hidden section" => "hidden_index.md", [
+ "hidden1.md",
+ "Hidden 2" => "hidden2.md"
+ ]),
+ hide("hidden_index.md", [...])
+ ]
+)
+```
+"""
+hide(root::Pair, children) = (true, root.first, root.second, map(hide, children))
+hide(root::AbstractString, children) = (true, nothing, root, map(hide, children))
+
+"""
+ deploydocs(
+ root = "<current-directory>",
+ target = "build",
+ repo = "<required>",
+ branch = "gh-pages",
+ deps = nothing | <Function>,
+ make = nothing | <Function>,
+ devbranch = "master",
+ devurl = "dev",
+ versions = ["stable" => "v^", "v#.#", devurl => devurl]
+ )
+
+Converts markdown files generated by [`makedocs`](@ref) to HTML and pushes them to `repo`.
+This function should be called from within a package's `docs/make.jl` file after the call to
+[`makedocs`](@ref), like so
+
+```julia
+using Documenter, PACKAGE_NAME
+makedocs(
+ # options...
+)
+deploydocs(
+ repo = "github.com/..."
+)
+```
+
+When building the docs for a tag (i.e. a release) the documentation is deployed to
+a directory with the tag name (i.e. `vX.Y.Z`) and to the `stable` directory.
+Otherwise the docs are deployed to the directory determined by the `devurl` argument.
+
+# Required keyword arguments
+
+**`repo`** is the remote repository where generated HTML content should be pushed to. Do not
+specify any protocol - "https://" or "git@" should not be present. This keyword *must*
+be set and will throw an error when left undefined. For example this package uses the
+following `repo` value:
+
+```julia
+repo = "github.com/JuliaDocs/Documenter.jl.git"
+```
+
+# Optional keyword arguments
+
+**`root`** has the same purpose as the `root` keyword for [`makedocs`](@ref).
+
+**`target`** is the directory, relative to `root`, where generated content that should be
+deployed to `gh-pages` is written to. It should generally be the same as
+[`makedocs`](@ref)'s `build` and defaults to `"build"`.
+
+**`branch`** is the branch where the generated documentation is pushed. If the branch does
+not exist, a new orphaned branch is created automatically. It defaults to `"gh-pages"`.
+
+**`deps`** is the function used to install any additional dependencies needed to build the
+documentation. By default nothing is installed.
+
+It can be used e.g. for a Markdown build. The following example installs the `pygments` and
+`mkdocs` Python packages using the [`Deps.pip`](@ref) function:
+
+```julia
+deps = Deps.pip("pygments", "mkdocs")
+```
+
+**`make`** is the function used to specify an additional build phase. By default, nothing gets
+executed.
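+
+For example, a sketch of a Markdown-based deployment (the repository name and the build
+command are placeholders):
+
+```julia
+deploydocs(
+ repo = "github.com/USER/PACKAGE.jl.git",
+ deps = Deps.pip("pygments", "mkdocs"),
+ make = () -> run(`mkdocs build`), # hypothetical extra build step
+)
+```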
+
+**`devbranch`** is the branch that "tracks" the in-development version of the generated
+documentation. By default this value is set to `"master"`.
+
+**`devurl`** is the folder where the in-development version of the docs will be deployed.
+Defaults to `"dev"`.
+
+**`versions`** determines content and order of the resulting version selector in
+the generated HTML. The following entries are valid in the `versions` vector (see the example after the list):
+ - `"v#"`: includes links to the latest documentation for each major release cycle
+ (e.g. `v2.0`, `v1.1`).
+ - `"v#.#"`: includes links to the latest documentation for each minor release cycle
+ (e.g. `v2.0`, `v1.1`, `v1.0`, `v0.1`).
+ - `"v#.#.#"`: includes links to all released versions.
+ - `"v^"`: includes a link to the docs for the maximum version
+ (i.e. a link `vX.Y` pointing to `vX.Y.Z` for highest `X`, `Y`, `Z`, respectively).
+ - A pair, e.g. `"first" => "second"`, which will put `"first"` in the selector,
+ and generate a url from which `"second"` can be accessed.
+ The second argument can be `"v^"`, to point to the maximum version docs
+ (as in e.g. `"stable" => "v^"`).
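+
+For example, a sketch with a placeholder repository that reproduces the default selector
+and adds a `"v#"` entry:
+
+```julia
+deploydocs(
+ repo = "github.com/USER/PACKAGE.jl.git",
+ devurl = "dev",
+ versions = ["stable" => "v^", "v#", "v#.#", "dev" => "dev"],
+)
+```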
+
+# See Also
+
+The [Hosting Documentation](@ref) section of the manual provides a step-by-step guide to
+using the [`deploydocs`](@ref) function to automatically generate docs and push them to
+GitHub.
+"""
+function deploydocs(;
+ root = Utilities.currentdir(),
+ target = "build",
+ dirname = "",
+
+ repo = error("no 'repo' keyword provided."),
+ branch = "gh-pages",
+ latest::Union{String,Nothing} = nothing, # deprecated
+
+ osname::Union{String,Nothing} = nothing, # deprecated
+ julia::Union{String,Nothing} = nothing, # deprecated
+
+ deps = nothing,
+ make = nothing,
+
+ devbranch = "master",
+ devurl = "dev",
+ versions = ["stable" => "v^", "v#.#", devurl => devurl]
+ )
+ # deprecation of latest kwarg (renamed to devbranch)
+ if latest !== nothing
+ Base.depwarn("The `latest` keyword argument has been renamed to `devbranch`.", :deploydocs)
+ devbranch = latest
+ @info("setting `devbranch` to `$(devbranch)`.")
+ end
+ # deprecation/removal of `julia` and `osname` kwargs
+ if julia !== nothing
+ Base.depwarn("the `julia` keyword argument to `Documenter.deploydocs` is " *
+ "removed. Use Travis Build Stages for determining from where to deploy instead. " *
+ "See the section about Hosting in the Documenter manual for more details.", :deploydocs)
+ @info("skipping docs deployment.")
+ return
+ end
+ if osname !== nothing
+ Base.depwarn("the `osname` keyword argument to `Documenter.deploydocs` is " *
+ "removed. Use Travis Build Stages for determining from where to deploy instead. " *
+ "See the section about Hosting in the Documenter manual for more details.", :deploydocs)
+ @info("skipping docs deployment.")
+ return
+ end
+
+ # Get environment variables.
+ documenter_key = get(ENV, "DOCUMENTER_KEY", "")
+ travis_branch = get(ENV, "TRAVIS_BRANCH", "")
+ travis_pull_request = get(ENV, "TRAVIS_PULL_REQUEST", "")
+ travis_repo_slug = get(ENV, "TRAVIS_REPO_SLUG", "")
+ travis_tag = get(ENV, "TRAVIS_TAG", "")
+
+
+ # Other variables.
+ sha = cd(root) do
+ # We'll make sure we run the git commands in the source directory (root), in case
+ # the working directory has been changed (e.g. if the makedocs' build argument is
+ # outside root).
+ try
+ readchomp(`git rev-parse --short HEAD`)
+ catch
+ # git rev-parse will throw an error and return code 128 if it is not being
+ # run in a git repository, which will make run/readchomp throw an exception.
+ # We'll assume that if readchomp fails it is due to this and set the sha
+ # variable accordingly.
+ "(not-git-repo)"
+ end
+ end
+
+ # Sanity checks
+ if !isempty(travis_repo_slug) && !occursin(travis_repo_slug, repo)
+ @warn("repo $repo does not match $travis_repo_slug")
+ end
+
+ # When should a deploy be attempted?
+ should_deploy =
+ occursin(travis_repo_slug, repo) &&
+ travis_pull_request == "false" &&
+ (
+ travis_branch == devbranch ||
+ travis_tag != ""
+ )
+
+ # check that the tag is valid
+ if should_deploy && !isempty(travis_tag) && !occursin(Base.VERSION_REGEX, travis_tag)
+ @warn("tag `$(travis_tag)` is not a valid VersionNumber")
+ should_deploy = false
+ end
+
+ # check DOCUMENTER_KEY only if the branch, Julia version etc. check out
+ if should_deploy && isempty(documenter_key)
+ @warn("""
+ DOCUMENTER_KEY environment variable missing, unable to deploy.
+ Note that in Documenter v0.9.0 old deprecated authentication methods were removed.
+ DOCUMENTER_KEY is now the only option. See the documentation for more information.""")
+ should_deploy = false
+ end
+
+ if get(ENV, "DOCUMENTER_DEBUG", "") == "true"
+ Utilities.debug("TRAVIS_REPO_SLUG = \"$travis_repo_slug\"")
+ Utilities.debug(" should occur in \"$repo\" (kwarg: repo)")
+ Utilities.debug("TRAVIS_PULL_REQUEST = \"$travis_pull_request\"")
+ Utilities.debug(" deploying if equal to \"false\"")
+ Utilities.debug("TRAVIS_BRANCH = \"$travis_branch\"")
+ Utilities.debug("TRAVIS_TAG = \"$travis_tag\"")
+ Utilities.debug(" deploying if branch equal to \"$devbranch\" (kwarg: devbranch) or tag is set")
+ Utilities.debug("git commit SHA = $sha")
+ Utilities.debug("DOCUMENTER_KEY exists = $(!isempty(documenter_key))")
+ Utilities.debug("should_deploy = $should_deploy")
+ end
+
+ if should_deploy
+ # Add local bin path if needed.
+ Deps.updatepath!()
+ # Install dependencies when applicable.
+ if deps !== nothing
+ Utilities.log("installing dependencies.")
+ deps()
+ end
+ # Change to the root directory and try to deploy the docs.
+ cd(root) do
+ Utilities.log("setting up target directory.")
+ isdir(target) || mkpath(target)
+ # Run extra build steps defined in `make` if required.
+ if make !== nothing
+ Utilities.log("running extra build steps.")
+ make()
+ end
+ Utilities.log("pushing new documentation to remote: $repo:$branch.")
+ mktempdir() do temp
+ git_push(
+ root, temp, repo;
+ branch=branch, dirname=dirname, target=target,
+ tag=travis_tag, key=documenter_key, sha=sha,
+ devurl = devurl, versions = versions,
+ )
+ end
+ end
+ else
+ Utilities.log("skipping docs deployment.")
+ if get(ENV, "DOCUMENTER_DEBUG", "") != "true"
+ Utilities.log("You can set DOCUMENTER_DEBUG to \"true\" in Travis to see more information.")
+ end
+ end
+end
+
+"""
+ git_push(
+ root, tmp, repo;
+ branch="gh-pages", dirname="", target="site", tag="", key="", sha="", devurl="dev"
+ )
+
+Handles pushing changes to the remote documentation branch.
+When `tag` is empty the docs are deployed to the `devurl` directory,
+and when building docs for a tag they are deployed to a `vX.Y.Z` directory.
+"""
+function git_push(
+ root, temp, repo;
+ branch="gh-pages", dirname="", target="site", tag="", key="", sha="", devurl="dev",
+ versions
+ )
+ dirname = isempty(dirname) ? temp : joinpath(temp, dirname)
+ isdir(dirname) || mkpath(dirname)
+
+ keyfile = abspath(joinpath(root, ".documenter"))
+ target_dir = abspath(target)
+
+ # The upstream URL to which we push new content and the ssh decryption commands.
+ upstream = "git@$(replace(repo, "github.com/" => "github.com:"))"
+
+ write(keyfile, String(base64decode(key)))
+ chmod(keyfile, 0o600)
+
+ try
+ # Use a custom SSH config file to avoid overwriting the default user config.
+ withfile(joinpath(homedir(), ".ssh", "config"),
+ """
+ Host github.com
+ StrictHostKeyChecking no
+ HostName github.com
+ IdentityFile $keyfile
+ """
+ ) do
+ cd(temp) do
+ # Setup git.
+ run(`git init`)
+ run(`git config user.name "autodocs"`)
+ run(`git config user.email "autodocs"`)
+
+ # Fetch from remote and checkout the branch.
+ run(`git remote add upstream $upstream`)
+ run(`git fetch upstream`)
+
+ try
+ run(`git checkout -b $branch upstream/$branch`)
+ catch e
+ Utilities.log("Checking out $branch failed with error: $e")
+ Utilities.log("Creating a new local $branch branch.")
+ run(`git checkout --orphan $branch`)
+ run(`git commit --allow-empty -m "Initial empty commit for docs"`)
+ end
+
+ # Copy docs to `devurl`, or `stable`, `<release>`, and `<version>` directories.
+ if isempty(tag)
+ devurl_dir = joinpath(dirname, devurl)
+ gitrm_copy(target_dir, devurl_dir)
+ Writers.HTMLWriter.generate_siteinfo_file(devurl_dir, devurl)
+ # symlink "latest" to devurl to preserve links (remove in some future release)
+ if devurl != "latest"
+ rm(joinpath(dirname, "latest"); recursive = true, force = true)
+ @warn(string("creating symlink from `latest` to `$(devurl)` for backwards ",
+ "compatibility with old links. In future Documenter versions this symlink ",
+ "will not be created. Please update any links that point to `latest`."))
+ cd(dirname) do; rm_and_add_symlink(devurl, "latest"); end
+ end
+ else
+ tagged_dir = joinpath(dirname, tag)
+ gitrm_copy(target_dir, tagged_dir)
+ Writers.HTMLWriter.generate_siteinfo_file(tagged_dir, tag)
+ end
+
+ # Expand the users `versions` vector
+ entries, symlinks = Writers.HTMLWriter.expand_versions(dirname, versions)
+
+ # Create the versions.js file containing a list of `entries`.
+ # This must always happen after the folder copying.
+ Writers.HTMLWriter.generate_version_file(joinpath(dirname, "versions.js"), entries)
+
+ # generate the symlinks, make sure we don't overwrite devurl
+ cd(dirname) do
+ for kv in symlinks
+ i = findfirst(x -> x.first == devurl, symlinks)
+ if i === nothing
+ rm_and_add_symlink(kv.second, kv.first)
+ else
+ throw(ArgumentError(string("link `$(kv)` cannot overwrite ",
+ "`devurl = $(devurl)` with the same name.")))
+ end
+ end
+ end
+
+ # Add, commit, and push the docs to the remote.
+ run(`git add -A .`)
+ if !success(`git diff --cached --exit-code`)
+ run(`git commit -m "build based on $sha"`)
+ run(`git push -q upstream HEAD:$branch`)
+ else
+ Utilities.log("New docs identical to the old -- not committing nor pushing.")
+ end
+ end
+ end
+ finally
+ # Remove the unencrypted private key.
+ isfile(keyfile) && rm(keyfile)
+ end
+end
+
+function rm_and_add_symlink(target, link)
+ if ispath(link)
+ @warn "removing `$(link)` and linking `$(link)` to `$(target)`."
+ rm(link; force = true, recursive = true)
+ end
+ symlink(target, link)
+end
+
+"""
+ gitrm_copy(src, dst)
+
+Uses `git rm -r` to remove `dst` and then copies `src` to `dst`. Assumes that the working
+directory is within the git repository of `dst` when the function is called.
+
+This is to get around [#507](https://github.com/JuliaDocs/Documenter.jl/issues/507) on
+filesystems that are case-insensitive (e.g. on OS X, Windows). Without doing a `git rm`
+first, `git add -A` will not detect case changes in filenames.
+"""
+function gitrm_copy(src, dst)
+ # --ignore-unmatch so that we wouldn't get errors if dst does not exist
+ run(`git rm -rf --ignore-unmatch $(dst)`)
+ cp(src, dst; force=true)
+end
+
+function withfile(func, file::AbstractString, contents::AbstractString)
+ hasfile = isfile(file)
+ original = hasfile ? read(file, String) : ""
+ open(file, "w") do stream
+ print(stream, contents)
+ flush(stream) # Make sure file is written before continuing.
+ end
+ try
+ func()
+ finally
+ if hasfile
+ open(file, "w") do stream
+ print(stream, original)
+ end
+ else
+ rm(file)
+ end
+ end
+end
+
+function getenv(regex::Regex)
+ for (key, value) in ENV
+ occursin(regex, key) && return value
+ end
+ error("could not find key/iv pair.")
+end
+
+end # module
--- /dev/null
+"""
+Defines [`Document`](@ref) and its supporting types
+
+- [`Page`](@ref)
+- [`User`](@ref)
+- [`Internal`](@ref)
+- [`Globals`](@ref)
+
+"""
+module Documents
+
+import ..Documenter:
+ Anchors,
+ Formats,
+ Utilities
+
+using DocStringExtensions
+import Markdown
+using Unicode
+
+# Pages.
+# ------
+
+"""
+[`Page`](@ref)-local values, such as the current module, that are shared between nodes in a page.
+"""
+mutable struct Globals
+ mod :: Module
+ meta :: Dict{Symbol, Any}
+end
+Globals() = Globals(Main, Dict())
+
+"""
+Represents a single markdown file.
+"""
+struct Page
+ source :: String
+ build :: String
+ """
+ Ordered list of raw toplevel markdown nodes from the parsed page contents. This vector
+ should be considered immutable.
+ """
+ elements :: Vector
+ """
+ Each element in `.elements` maps to an "expanded" element. This may be itself if the
+ element does not need expanding or some other object, such as a `DocsNode` in the case
+ of `@docs` code blocks.
+ """
+ mapping :: IdDict{Any,Any}
+ globals :: Globals
+end
+function Page(source::AbstractString, build::AbstractString)
+ elements = Markdown.parse(read(source, String)).content
+ Page(source, build, elements, IdDict{Any,Any}(), Globals())
+end
+
+# Document Nodes.
+# ---------------
+
+## IndexNode.
+
+struct IndexNode
+ pages :: Vector{String} # Which pages to include in the index? Set by user.
+ modules :: Vector{Module} # Which modules to include? Set by user.
+ order :: Vector{Symbol} # What order should docs be listed in? Set by user.
+ build :: String # Path to the file where this index will appear.
+ source :: String # Path to the file where this index was written.
+ elements :: Vector # (object, doc, page, mod, cat)-tuple for constructing links.
+
+ function IndexNode(;
+ # TODO: Fix difference between uppercase and lowercase naming of keys.
+ # Perhaps deprecate the uppercase versions? Same with `ContentsNode`.
+ Pages = [],
+ Modules = [],
+ Order = [:module, :constant, :type, :function, :macro],
+ build = error("missing value for `build` in `IndexNode`."),
+ source = error("missing value for `source` in `IndexNode`."),
+ others...
+ )
+ new(Pages, Modules, Order, build, source, [])
+ end
+end
+
+## ContentsNode.
+
+struct ContentsNode
+ pages :: Vector{String} # Which pages should be included in contents? Set by user.
+ depth :: Int # Down to which level should headers be displayed? Set by user.
+ build :: String # Same as for `IndexNode`s.
+ source :: String # Same as for `IndexNode`s.
+ elements :: Vector # (order, page, anchor)-tuple for constructing links.
+
+ function ContentsNode(;
+ Pages = [],
+ Depth = 2,
+ build = error("missing value for `build` in `ContentsNode`."),
+ source = error("missing value for `source` in `ContentsNode`."),
+ others...
+ )
+ new(Pages, Depth, build, source, [])
+ end
+end
+
+## Other nodes
+
+struct MetaNode
+ dict :: Dict{Symbol, Any}
+end
+
+struct MethodNode
+ method :: Method
+ visible :: Bool
+end
+
+struct DocsNode
+ docstr :: Any
+ anchor :: Anchors.Anchor
+ object :: Utilities.Object
+ page :: Documents.Page
+end
+
+struct DocsNodes
+ nodes :: Vector{DocsNode}
+end
+
+struct EvalNode
+ code :: Markdown.Code
+ result :: Any
+end
+
+struct RawHTML
+ code::String
+end
+
+struct RawNode
+ name::Symbol
+ text::String
+end
+
+# Navigation
+# ----------------------
+
+"""
+Element in the navigation tree of a document, containing navigation references
+to other pages, a reference to the [`Page`](@ref) object, etc.
+"""
+mutable struct NavNode
+ """
+ `nothing` if the `NavNode` is a non-page node of the navigation tree, otherwise
+ the string should be a valid key in `doc.internal.pages`
+ """
+ page :: Union{String, Nothing}
+ """
+ If not `nothing`, specifies the text that should be displayed in navigation
+ links etc. instead of the automatically determined text.
+ """
+ title_override :: Union{String, Nothing}
+ parent :: Union{NavNode, Nothing}
+ children :: Vector{NavNode}
+ visible :: Bool
+ prev :: Union{NavNode, Nothing}
+ next :: Union{NavNode, Nothing}
+end
+NavNode(page, title_override, parent) = NavNode(page, title_override, parent, [], true, nothing, nothing)
+
+"""
+Constructs a list of the ancestors of the `navnode` (including the `navnode` itself),
+ordered so that the root of the navigation tree is the first and `navnode` itself
+is the last item.
+"""
+navpath(navnode::NavNode) = navnode.parent === nothing ? [navnode] :
+ push!(navpath(navnode.parent), navnode)
+
+
+# Inner Document Fields.
+# ----------------------
+
+"""
+User-specified values used to control the generation process.
+"""
+struct User
+ root :: String # An absolute path to the root directory of the document.
+ source :: String # Parent directory is `.root`. Where files are read from.
+ build :: String # Parent directory is also `.root`. Where files are written to.
+ format :: Vector{Symbol} # What format to render the final document with?
+ clean :: Bool # Empty the `build` directory before starting a new build?
+ doctest :: Union{Bool,Symbol} # Run doctests?
+ linkcheck::Bool # Check external links..
+ linkcheck_ignore::Vector{Union{String,Regex}} # ..and then ignore (some of) them.
+ checkdocs::Symbol # Check objects missing from `@docs` blocks. `:none`, `:exports`, or `:all`.
+ doctestfilters::Vector{Regex} # Filtering for doctests
+ strict::Bool # Throw an exception when any warnings are encountered.
+ modules :: Set{Module} # Which modules to check for missing docs?
+ pages :: Vector{Any} # Ordering of document pages specified by the user.
+ assets :: Vector{String}
+ repo :: String # Template for URL to source code repo
+ sitename:: String
+ authors :: String
+ analytics::String
+ version :: String # version string used in the version selector by default
+ html_prettyurls :: Bool # Use pretty URLs in the HTML build?
+ html_disable_git :: Bool # Don't call git when exporting HTML
+ html_edit_branch :: Union{String, Nothing} # Change how the "Edit on GitHub" links are handled
+ html_canonical :: Union{String, Nothing} # Set a canonical url, if desired (https://en.wikipedia.org/wiki/Canonical_link_element)
+end
+
+"""
+Private state used to control the generation process.
+"""
+struct Internal
+ assets :: String # Path where asset files will be copied to.
+ remote :: String # The remote repo on github where this package is hosted.
+ pages :: Dict{String, Page} # Markdown files only.
+ navtree :: Vector{NavNode} # A vector of top-level navigation items.
+ navlist :: Vector{NavNode} # An ordered list of `NavNode`s that point to actual pages
+ headers :: Anchors.AnchorMap # See `modules/Anchors.jl`. Tracks `Markdown.Header` objects.
+ docs :: Anchors.AnchorMap # See `modules/Anchors.jl`. Tracks `@docs` docstrings.
+ bindings:: IdDict{Any,Any} # Tracks insertion order of object per-binding.
+ objects :: IdDict{Any,Any} # Tracks which `Utilities.Objects` are included in the `Document`.
+ contentsnodes :: Vector{ContentsNode}
+ indexnodes :: Vector{IndexNode}
+ locallinks :: Dict{Markdown.Link, String}
+ errors::Set{Symbol}
+end
+
+# Document.
+# ---------
+
+"""
+Represents an entire document.
+"""
+struct Document
+ user :: User # Set by the user via `makedocs`.
+ internal :: Internal # Computed values.
+end
+
+function Document(;
+ root :: AbstractString = Utilities.currentdir(),
+ source :: AbstractString = "src",
+ build :: AbstractString = "build",
+ format :: Any = :html,
+ clean :: Bool = true,
+ doctest :: Union{Bool,Symbol} = true,
+ linkcheck:: Bool = false,
+ linkcheck_ignore :: Vector = [],
+ checkdocs::Symbol = :all,
+ doctestfilters::Vector{Regex}= Regex[],
+ strict::Bool = false,
+ modules :: Utilities.ModVec = Module[],
+ pages :: Vector = Any[],
+ assets :: Vector = String[],
+ repo :: AbstractString = "",
+ sitename :: AbstractString = "",
+ authors :: AbstractString = "",
+ analytics :: AbstractString = "",
+ version :: AbstractString = "",
+ html_prettyurls :: Bool = true,
+ html_disable_git :: Bool = false,
+ html_edit_branch :: Union{String, Nothing} = "master",
+ html_canonical :: Union{String, Nothing} = nothing,
+ others...
+ )
+ Utilities.check_kwargs(others)
+
+ fmt = Formats.fmt(format)
+ @assert !isempty(fmt) "No formats provided."
+
+ if version == "git-commit"
+ version = "git:$(Utilities.get_commit_short(root))"
+ end
+
+ user = User(
+ root,
+ source,
+ build,
+ fmt,
+ clean,
+ doctest,
+ linkcheck,
+ linkcheck_ignore,
+ checkdocs,
+ doctestfilters,
+ strict,
+ Utilities.submodules(modules),
+ pages,
+ assets,
+ repo,
+ sitename,
+ authors,
+ analytics,
+ version,
+ html_prettyurls,
+ html_disable_git,
+ html_edit_branch,
+ html_canonical,
+ )
+ internal = Internal(
+ Utilities.assetsdir(),
+ Utilities.getremote(root),
+ Dict{String, Page}(),
+ [],
+ [],
+ Anchors.AnchorMap(),
+ Anchors.AnchorMap(),
+ IdDict{Any,Any}(),
+ IdDict{Any,Any}(),
+ [],
+ [],
+ Dict{Markdown.Link, String}(),
+ Set{Symbol}(),
+ )
+ Document(user, internal)
+end
+
+## Methods
+
+function addpage!(doc::Document, src::AbstractString, dst::AbstractString)
+ page = Page(src, dst)
+ # page's identifier is the path relative to the `doc.user.source` directory
+ name = normpath(relpath(src, doc.user.source))
+ doc.internal.pages[name] = page
+end
+
+"""
+$(SIGNATURES)
+
+Populates the `ContentsNode`s and `IndexNode`s of the `document` with links.
+
+This can only be done after all the blocks have been expanded (and nodes constructed),
+because the items have to exist before we can gather the links to those items.
+"""
+function populate!(document::Document)
+ for node in document.internal.contentsnodes
+ populate!(node, document)
+ end
+ for node in document.internal.indexnodes
+ populate!(node, document)
+ end
+end
+
+function populate!(index::IndexNode, document::Document)
+ # Filtering valid index links.
+ for (object, doc) in document.internal.objects
+ page = relpath(doc.page.build, dirname(index.build))
+ mod = object.binding.mod
+ # Include *all* signatures, whether they are `Union{}` or not.
+ cat = Symbol(lowercase(Utilities.doccat(object.binding, Union{})))
+ if _isvalid(page, index.pages) && _isvalid(mod, index.modules) && _isvalid(cat, index.order)
+ push!(index.elements, (object, doc, page, mod, cat))
+ end
+ end
+ # Sorting index links.
+ pagesmap = precedence(index.pages)
+ modulesmap = precedence(index.modules)
+ ordermap = precedence(index.order)
+ comparison = function(a, b)
+ (x = _compare(pagesmap, 3, a, b)) == 0 || return x < 0 # page
+ (x = _compare(modulesmap, 4, a, b)) == 0 || return x < 0 # module
+ (x = _compare(ordermap, 5, a, b)) == 0 || return x < 0 # category
+ string(a[1].binding) < string(b[1].binding) # object name
+ end
+ sort!(index.elements, lt = comparison)
+ return index
+end
+
+function populate!(contents::ContentsNode, document::Document)
+ # Filtering valid contents links.
+ for (id, filedict) in document.internal.headers.map
+ for (file, anchors) in filedict
+ for anchor in anchors
+ page = relpath(anchor.file, dirname(contents.build))
+ if _isvalid(page, contents.pages) && Utilities.header_level(anchor.object) ≤ contents.depth
+ push!(contents.elements, (anchor.order, page, anchor))
+ end
+ end
+ end
+ end
+ # Sorting contents links.
+ pagesmap = precedence(contents.pages)
+ comparison = function(a, b)
+ (x = _compare(pagesmap, 2, a, b)) == 0 || return x < 0 # page
+ a[1] < b[1] # anchor order
+ end
+ sort!(contents.elements, lt = comparison)
+ return contents
+end
+
+# some replacements for jldoctest blocks
+function doctest_replace!(doc::Documents.Document)
+ for (src, page) in doc.internal.pages
+ empty!(page.globals.meta)
+ for element in page.elements
+ page.globals.meta[:CurrentFile] = page.source
+ walk(page.globals.meta, page.mapping[element]) do block
+ doctest_replace!(block)
+ end
+ end
+ end
+end
+function doctest_replace!(block::Markdown.Code)
+ startswith(block.language, "jldoctest") || return false
+ # suppress output for `#output`-style doctests with `output=false` kwarg
+ if occursin(r"^# output$"m, block.code) && occursin(r";.*output\h*=\h*false", block.language)
+ input = first(split(block.code, "# output\n", limit = 2))
+ block.code = rstrip(input)
+ end
+ # correct the language field
+ block.language = occursin(r"^julia> "m, block.code) ? "julia-repl" : "julia"
+ return false
+end
+doctest_replace!(block) = true
+
+## Utilities.
+
+function buildnode(T::Type, block, doc, page)
+ mod = get(page.globals.meta, :CurrentModule, Main)
+ dict = Dict{Symbol, Any}(:source => page.source, :build => page.build)
+ for (ex, str) in Utilities.parseblock(block.code, doc, page)
+ if Utilities.isassign(ex)
+ cd(dirname(page.source)) do
+ dict[ex.args[1]] = Core.eval(mod, ex.args[2])
+ end
+ end
+ end
+ T(; dict...)
+end
+
+function _compare(col, ind, a, b)
+ x, y = a[ind], b[ind]
+ haskey(col, x) && haskey(col, y) ? _compare(col[x], col[y]) : 0
+end
+_compare(a, b) = a < b ? -1 : a == b ? 0 : 1
+_isvalid(x, xs) = isempty(xs) || x in xs
+precedence(vec) = Dict(zip(vec, 1:length(vec)))
+
+##############################################
+# walk (previously in the Walkers submodule) #
+##############################################
+"""
+$(SIGNATURES)
+
+Calls `f` on `element` and any of its child elements. `meta` is a `Dict` containing metadata
+such as the current module.
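+
+For instance, a minimal sketch (not part of the package) that counts the code blocks in a
+markdown tree:
+
+```julia
+md = Markdown.MD([Markdown.Paragraph(["Some text."]), Markdown.Code("julia", "1 + 1")])
+ncode = Ref(0)
+walk(Dict{Symbol, Any}(), md) do node
+ node isa Markdown.Code && (ncode[] += 1)
+ true # returning `true` tells `walk` to keep descending into container nodes
+end
+# ncode[] == 1
+```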
+"""
+walk(f, meta, element) = (f(element); nothing)
+
+# Change to the docstring's defining module if it has one. Change back afterwards.
+function walk(f, meta, block::Markdown.MD)
+ tmp = get(meta, :CurrentModule, nothing)
+ mod = get(block.meta, :module, nothing)
+ mod ≡ nothing || (meta[:CurrentModule] = mod)
+ f(block) && walk(f, meta, block.content)
+ tmp ≡ nothing ? delete!(meta, :CurrentModule) : (meta[:CurrentModule] = tmp)
+ nothing
+end
+
+function walk(f, meta, block::Vector)
+ for each in block
+ walk(f, meta, each)
+ end
+end
+
+const MDContentElements = Union{
+ Markdown.BlockQuote,
+ Markdown.Paragraph,
+ Markdown.MD,
+}
+walk(f, meta, block::MDContentElements) = f(block) ? walk(f, meta, block.content) : nothing
+
+const MDTextElements = Union{
+ Markdown.Bold,
+ Markdown.Header,
+ Markdown.Italic,
+}
+walk(f, meta, block::MDTextElements) = f(block) ? walk(f, meta, block.text) : nothing
+walk(f, meta, block::Markdown.Footnote) = f(block) ? walk(f, meta, block.text) : nothing
+walk(f, meta, block::Markdown.Admonition) = f(block) ? walk(f, meta, block.content) : nothing
+walk(f, meta, block::Markdown.Image) = f(block) ? walk(f, meta, block.alt) : nothing
+walk(f, meta, block::Markdown.Table) = f(block) ? walk(f, meta, block.rows) : nothing
+walk(f, meta, block::Markdown.List) = f(block) ? walk(f, meta, block.items) : nothing
+walk(f, meta, block::Markdown.Link) = f(block) ? walk(f, meta, block.text) : nothing
+walk(f, meta, block::RawHTML) = nothing
+walk(f, meta, block::DocsNodes) = walk(f, meta, block.nodes)
+walk(f, meta, block::DocsNode) = walk(f, meta, block.docstr)
+walk(f, meta, block::EvalNode) = walk(f, meta, block.result)
+walk(f, meta, block::MetaNode) = (merge!(meta, block.dict); nothing)
+walk(f, meta, block::Anchors.Anchor) = walk(f, meta, block.object)
+
+end
--- /dev/null
+"""
+Defines node "expanders" that transform nodes from the parsed markdown files.
+"""
+module Expanders
+
+import ..Documenter:
+ Anchors,
+ Builder,
+ Documents,
+ Formats,
+ Documenter,
+ Utilities
+
+import .Documents:
+ MethodNode,
+ DocsNode,
+ DocsNodes,
+ EvalNode,
+ MetaNode
+
+import .Utilities: Selectors
+
+import Markdown, REPL
+import Base64: stringmime
+
+
+function expand(doc::Documents.Document)
+ for (src, page) in doc.internal.pages
+ empty!(page.globals.meta)
+ for element in page.elements
+ Selectors.dispatch(ExpanderPipeline, element, page, doc)
+ end
+ pagecheck(page)
+ end
+end
+
+# run some checks after expanding the page
+function pagecheck(page)
+ # make sure there is no "continued code" lingering around
+ if haskey(page.globals.meta, :ContinuedCode) && !isempty(page.globals.meta[:ContinuedCode])
+ Utilities.warn(page.source, "Code from a continued @example block unused.")
+ end
+end
+
+
+# Expander Pipeline.
+# ------------------
+
+"""
+The default node expander "pipeline", which consists of the following expanders:
+
+- [`TrackHeaders`](@ref)
+- [`MetaBlocks`](@ref)
+- [`DocsBlocks`](@ref)
+- [`AutoDocsBlocks`](@ref)
+- [`EvalBlocks`](@ref)
+- [`IndexBlocks`](@ref)
+- [`ContentsBlocks`](@ref)
+- [`ExampleBlocks`](@ref)
+- [`SetupBlocks`](@ref)
+- [`REPLBlocks`](@ref)
+
+"""
+abstract type ExpanderPipeline <: Selectors.AbstractSelector end
+
+"""
+Tracks all `Markdown.Header` nodes found in the parsed markdown files and stores an
+[`Anchors.Anchor`](@ref) object for each one.
+"""
+abstract type TrackHeaders <: ExpanderPipeline end
+
+"""
+Parses each code block where the language is `@meta` and evaluates the key/value pairs found
+within the block, i.e.
+
+````markdown
+```@meta
+CurrentModule = Documenter
+DocTestSetup = quote
+ using Documenter
+end
+```
+````
+"""
+abstract type MetaBlocks <: ExpanderPipeline end
+
+"""
+Parses each code block where the language is `@docs` and evaluates the expressions found
+within the block. Replaces the block with the docstrings associated with each expression.
+
+````markdown
+```@docs
+Documenter
+makedocs
+deploydocs
+```
+````
+"""
+abstract type DocsBlocks <: ExpanderPipeline end
+
+"""
+Parses each code block where the language is `@autodocs` and replaces it with all the
+docstrings that match the provided key/value pairs `Modules = ...` and `Order = ...`.
+
+````markdown
+```@autodocs
+Modules = [Foo, Bar]
+Order = [:function, :type]
+```
+````
+"""
+abstract type AutoDocsBlocks <: ExpanderPipeline end
+
+"""
+Parses each code block where the language is `@eval` and evaluates its contents. Replaces
+the block with the value resulting from the evaluation. This can be useful for inserting
+generated content into a document such as plots.
+
+````markdown
+```@eval
+using PyPlot
+x = range(-π, stop = π, length = 100)
+y = sin(x)
+plot(x, y, color = "red")
+savefig("plot.svg")
+Markdown.parse("![Plot](plot.svg)")
+```
+````
+"""
+abstract type EvalBlocks <: ExpanderPipeline end
+
+abstract type RawBlocks <: ExpanderPipeline end
+
+"""
+Parses each code block where the language is `@index` and replaces it with an index of all
+docstrings spliced into the document. The pages that are included can be set using a
+key/value pair `Pages = [...]` such as
+
+````markdown
+```@index
+Pages = ["foo.md", "bar.md"]
+```
+````
+"""
+abstract type IndexBlocks <: ExpanderPipeline end
+
+"""
+Parses each code block where the language is `@contents` and replaces it with a nested list
+of all `Header` nodes in the generated document. The pages and depth of the list can be set
+using `Pages = [...]` and `Depth = N` where `N` is an integer.
+
+````markdown
+```@contents
+Pages = ["foo.md", "bar.md"]
+Depth = 1
+```
+````
+The default `Depth` value is `2`.
+"""
+abstract type ContentsBlocks <: ExpanderPipeline end
+
+"""
+Parses each code block where the language is `@example` and evaluates the parsed Julia code
+found within. The resulting value is then inserted into the final document after the source
+code.
+
+````markdown
+```@example
+a = 1
+b = 2
+a + b
+```
+````
+"""
+abstract type ExampleBlocks <: ExpanderPipeline end
+
+"""
+Similar to the [`ExampleBlocks`](@ref) expander, but inserts a Julia REPL prompt before each
+toplevel expression in the final document.
+"""
+abstract type REPLBlocks <: ExpanderPipeline end
+
+"""
+Similar to the [`ExampleBlocks`](@ref) expander, but hides all output in the final document.
+"""
+abstract type SetupBlocks <: ExpanderPipeline end
+
+Selectors.order(::Type{TrackHeaders}) = 1.0
+Selectors.order(::Type{MetaBlocks}) = 2.0
+Selectors.order(::Type{DocsBlocks}) = 3.0
+Selectors.order(::Type{AutoDocsBlocks}) = 4.0
+Selectors.order(::Type{EvalBlocks}) = 5.0
+Selectors.order(::Type{IndexBlocks}) = 6.0
+Selectors.order(::Type{ContentsBlocks}) = 7.0
+Selectors.order(::Type{ExampleBlocks}) = 8.0
+Selectors.order(::Type{REPLBlocks}) = 9.0
+Selectors.order(::Type{SetupBlocks}) = 10.0
+Selectors.order(::Type{RawBlocks}) = 11.0
+
+Selectors.matcher(::Type{TrackHeaders}, node, page, doc) = isa(node, Markdown.Header)
+Selectors.matcher(::Type{MetaBlocks}, node, page, doc) = iscode(node, "@meta")
+Selectors.matcher(::Type{DocsBlocks}, node, page, doc) = iscode(node, "@docs")
+Selectors.matcher(::Type{AutoDocsBlocks}, node, page, doc) = iscode(node, "@autodocs")
+Selectors.matcher(::Type{EvalBlocks}, node, page, doc) = iscode(node, "@eval")
+Selectors.matcher(::Type{IndexBlocks}, node, page, doc) = iscode(node, "@index")
+Selectors.matcher(::Type{ContentsBlocks}, node, page, doc) = iscode(node, "@contents")
+Selectors.matcher(::Type{ExampleBlocks}, node, page, doc) = iscode(node, r"^@example")
+Selectors.matcher(::Type{REPLBlocks}, node, page, doc) = iscode(node, r"^@repl")
+Selectors.matcher(::Type{SetupBlocks}, node, page, doc) = iscode(node, r"^@setup")
+Selectors.matcher(::Type{RawBlocks}, node, page, doc) = iscode(node, r"^@raw")
+
+# Default Expander.
+
+Selectors.runner(::Type{ExpanderPipeline}, x, page, doc) = page.mapping[x] = x
+
+# Track Headers.
+# --------------
+
+function Selectors.runner(::Type{TrackHeaders}, header, page, doc)
+ # Get the header slug.
+ text =
+ if namedheader(header)
+ url = header.text[1].url
+ header.text = header.text[1].text
+ match(NAMEDHEADER_REGEX, url)[1]
+ else
+ sprint(Markdown.plain, Markdown.Paragraph(header.text))
+ end
+ slug = Utilities.slugify(text)
+ # Add the header to the document's header map.
+ anchor = Anchors.add!(doc.internal.headers, header, slug, page.build)
+ # Map the header element to the generated anchor and the current anchor count.
+ page.mapping[header] = anchor
+end
+
+# @meta
+# -----
+
+function Selectors.runner(::Type{MetaBlocks}, x, page, doc)
+ meta = page.globals.meta
+ for (ex, str) in Utilities.parseblock(x.code, doc, page)
+ if Utilities.isassign(ex)
+ try
+ meta[ex.args[1]] = Core.eval(Main, ex.args[2])
+ catch err
+ push!(doc.internal.errors, :meta_block)
+ Utilities.warn(doc, page, "Failed to evaluate `$(strip(str))` in `@meta` block.", err)
+ end
+ end
+ end
+ page.mapping[x] = MetaNode(copy(meta))
+end
+
+# @docs
+# -----
+
+function Selectors.runner(::Type{DocsBlocks}, x, page, doc)
+ failed = false
+ nodes = DocsNode[]
+ curmod = get(page.globals.meta, :CurrentModule, Main)
+ for (ex, str) in Utilities.parseblock(x.code, doc, page)
+ binding = try
+ Documenter.DocSystem.binding(curmod, ex)
+ catch err
+ push!(doc.internal.errors, :docs_block)
+ Utilities.warn(page.source, "Unable to get the binding for '$(strip(str))'.", err, ex, curmod)
+ failed = true
+ continue
+ end
+ # Undefined `Bindings` get discarded.
+ if !Documenter.DocSystem.iskeyword(binding) && !Documenter.DocSystem.defined(binding)
+ push!(doc.internal.errors, :docs_block)
+ Utilities.warn(page.source, "Undefined binding '$(binding)'.")
+ failed = true
+ continue
+ end
+ typesig = Core.eval(curmod, Documenter.DocSystem.signature(ex, str))
+
+ object = Utilities.Object(binding, typesig)
+ # We can't include the same object more than once in a document.
+ if haskey(doc.internal.objects, object)
+ push!(doc.internal.errors, :docs_block)
+ Utilities.warn(page.source, "Duplicate docs found for '$(strip(str))'.")
+ failed = true
+ continue
+ end
+
+ # Find the docs matching `binding` and `typesig`. Only search within the provided modules.
+ docs = Documenter.DocSystem.getdocs(binding, typesig; modules = doc.user.modules)
+
+ # Include only docstrings from user-provided modules if provided.
+ if !isempty(doc.user.modules)
+ filter!(d -> d.data[:module] in doc.user.modules, docs)
+ end
+
+ # Check that we aren't printing an empty docs list. Skip block when empty.
+ if isempty(docs)
+ push!(doc.internal.errors, :docs_block)
+ Utilities.warn(page.source, "No docs found for '$(strip(str))'.")
+ failed = true
+ continue
+ end
+
+ # Concatenate found docstrings into a single `MD` object.
+ docstr = Markdown.MD(map(Documenter.DocSystem.parsedoc, docs))
+ docstr.meta[:results] = docs
+
+ # Generate a unique name to be used in anchors and links for the docstring.
+ slug = Utilities.slugify(object)
+ anchor = Anchors.add!(doc.internal.docs, object, slug, page.build)
+ docsnode = DocsNode(docstr, anchor, object, page)
+
+ # Track the order of insertion of objects per-binding.
+ push!(get!(doc.internal.bindings, binding, Utilities.Object[]), object)
+
+ doc.internal.objects[object] = docsnode
+ push!(nodes, docsnode)
+ end
+ # When a `@docs` block fails we need to remove the `.language` since some markdown
+ # parsers have trouble rendering it correctly.
+ page.mapping[x] = failed ? (x.language = ""; x) : DocsNodes(nodes)
+end
+
+# @autodocs
+# ---------
+
+const AUTODOCS_DEFAULT_ORDER = [:module, :constant, :type, :function, :macro]
+
+function Selectors.runner(::Type{AutoDocsBlocks}, x, page, doc)
+ curmod = get(page.globals.meta, :CurrentModule, Main)
+ fields = Dict{Symbol, Any}()
+ for (ex, str) in Utilities.parseblock(x.code, doc, page)
+ if Utilities.isassign(ex)
+ try
+ fields[ex.args[1]] = Core.eval(curmod, ex.args[2])
+ catch err
+ push!(doc.internal.errors, :autodocs_block)
+ Utilities.warn(doc, page, "Failed to evaluate `$(strip(str))` in `@autodocs` block.", err)
+ end
+ end
+ end
+ if haskey(fields, :Modules)
+ # Gather and filter docstrings.
+ modules = fields[:Modules]
+ order = get(fields, :Order, AUTODOCS_DEFAULT_ORDER)
+ pages = map(normpath, get(fields, :Pages, []))
+ public = get(fields, :Public, true)
+ private = get(fields, :Private, true)
+ results = []
+ for mod in modules
+ for (binding, multidoc) in Documenter.DocSystem.getmeta(mod)
+ # Which bindings should be included?
+ isexported = Base.isexported(mod, binding.var)
+ included = (isexported && public) || (!isexported && private)
+ # What category does the binding belong to?
+ category = Documenter.DocSystem.category(binding)
+ if category in order && included
+ for (typesig, docstr) in multidoc.docs
+ path = normpath(docstr.data[:path])
+ object = Utilities.Object(binding, typesig)
+ if isempty(pages)
+ push!(results, (mod, path, category, object, isexported, docstr))
+ else
+ for p in pages
+ if endswith(path, p)
+ push!(results, (mod, p, category, object, isexported, docstr))
+ break
+ end
+ end
+ end
+ end
+ end
+ end
+ end
+
+ # Sort docstrings.
+ modulemap = Documents.precedence(modules)
+ pagesmap = Documents.precedence(pages)
+ ordermap = Documents.precedence(order)
+ comparison = function (a, b)
+ local t
+ (t = Documents._compare(modulemap, 1, a, b)) == 0 || return t < 0 # module
+ a[5] == b[5] || return a[5] > b[5] # exported bindings before unexported ones.
+ (t = Documents._compare(pagesmap, 2, a, b)) == 0 || return t < 0 # page
+ (t = Documents._compare(ordermap, 3, a, b)) == 0 || return t < 0 # category
+ string(a[4]) < string(b[4]) # name
+ end
+ sort!(results; lt = comparison)
+
+ # Finalise docstrings.
+ nodes = DocsNode[]
+ for (mod, path, category, object, isexported, docstr) in results
+ if haskey(doc.internal.objects, object)
+ push!(doc.internal.errors, :autodocs_block)
+ Utilities.warn(page.source, "Duplicate docs found for '$(object.binding)'.")
+ continue
+ end
+ markdown = Markdown.MD(Documenter.DocSystem.parsedoc(docstr))
+ markdown.meta[:results] = [docstr]
+ slug = Utilities.slugify(object)
+ anchor = Anchors.add!(doc.internal.docs, object, slug, page.build)
+ docsnode = DocsNode(markdown, anchor, object, page)
+
+ # Track the order of insertion of objects per-binding.
+ push!(get!(doc.internal.bindings, object.binding, Utilities.Object[]), object)
+
+ doc.internal.objects[object] = docsnode
+ push!(nodes, docsnode)
+ end
+ page.mapping[x] = DocsNodes(nodes)
+ else
+ push!(doc.internal.errors, :autodocs_block)
+ Utilities.warn(page.source, "'@autodocs' missing 'Modules = ...'.")
+ page.mapping[x] = x
+ end
+end
+
+# @eval
+# -----
+
+function Selectors.runner(::Type{EvalBlocks}, x, page, doc)
+ sandbox = Module(:EvalBlockSandbox)
+ cd(dirname(page.build)) do
+ result = nothing
+ for (ex, str) in Utilities.parseblock(x.code, doc, page)
+ try
+ result = Core.eval(sandbox, ex)
+ catch err
+ push!(doc.internal.errors, :eval_block)
+ Utilities.warn(doc, page, "Failed to evaluate `@eval` block.", err)
+ end
+ end
+ page.mapping[x] = EvalNode(x, result)
+ end
+end
+
+# @index
+# ------
+
+function Selectors.runner(::Type{IndexBlocks}, x, page, doc)
+ node = Documents.buildnode(Documents.IndexNode, x, doc, page)
+ push!(doc.internal.indexnodes, node)
+ page.mapping[x] = node
+end
+
+# @contents
+# ---------
+
+function Selectors.runner(::Type{ContentsBlocks}, x, page, doc)
+ node = Documents.buildnode(Documents.ContentsNode, x, doc, page)
+ push!(doc.internal.contentsnodes, node)
+ page.mapping[x] = node
+end
+
+# @example
+# --------
+
+function Selectors.runner(::Type{ExampleBlocks}, x, page, doc)
+ # The sandboxed module -- either a new one or a cached one from this page.
+ name = match(r"^@example[ ]?(.*)$", first(split(x.language, ';', limit = 2)))[1]
+ sym = isempty(name) ? gensym("ex-") : Symbol("ex-", name)
+ mod = get!(() -> get_new_sandbox(sym), page.globals.meta, sym)
+
+ # "parse" keyword arguments to example (we only need to look for continued = true)
+ continued = occursin(r"continued\s*=\s*true", x.language)
+
+ # Evaluate the code block. We redirect stdout/stderr to `buffer`.
+ result, buffer = nothing, IOBuffer()
+ if !continued # run the code
+ # check if there is any continued code waiting to be prepended
+ if haskey(page.globals.meta, :ContinuedCode) && haskey(page.globals.meta[:ContinuedCode], sym)
+ code = page.globals.meta[:ContinuedCode][sym] * '\n' * x.code
+ delete!(page.globals.meta[:ContinuedCode], sym)
+ else
+ code = x.code
+ end
+ for (ex, str) in Utilities.parseblock(code, doc, page)
+ (value, success, backtrace, text) = Utilities.withoutput() do
+ cd(dirname(page.build)) do
+ Core.eval(mod, Expr(:(=), :ans, ex))
+ end
+ end
+ result = value
+ print(buffer, text)
+ if !success
+ push!(doc.internal.errors, :example_block)
+ Utilities.warn(page.source, "failed to run code block.\n\n$(value)")
+ page.mapping[x] = x
+ return
+ end
+ end
+ else # store the continued code
+ CC = get!(page.globals.meta, :ContinuedCode, Dict())
+ CC[sym] = get(CC, sym, "") * '\n' * x.code
+ end
+ # Splice the input and output into the document.
+ content = []
+ input = droplines(x.code)
+
+ # Special-case support for displaying HTML and image (SVG, PNG, WebP, GIF, JPEG) output. TODO: make this more general.
+ output = if showable(MIME"text/html"(), result)
+ Documents.RawHTML(Base.invokelatest(stringmime, MIME"text/html"(), result))
+ elseif showable(MIME"image/svg+xml"(), result)
+ Documents.RawHTML(Base.invokelatest(stringmime, MIME"image/svg+xml"(), result))
+ elseif showable(MIME"image/png"(), result)
+ Documents.RawHTML(string("<img src=\"data:image/png;base64,", Base.invokelatest(stringmime, MIME"image/png"(), result), "\" />"))
+ elseif showable(MIME"image/webp"(), result)
+ Documents.RawHTML(string("<img src=\"data:image/webp;base64,", Base.invokelatest(stringmime, MIME"image/webp"(), result), "\" />"))
+ elseif showable(MIME"image/gif"(), result)
+ Documents.RawHTML(string("<img src=\"data:image/gif;base64,", Base.invokelatest(stringmime, MIME"image/gif"(), result), "\" />"))
+ elseif showable(MIME"image/jpeg"(), result)
+ Documents.RawHTML(string("<img src=\"data:image/jpeg;base64,", Base.invokelatest(stringmime, MIME"image/jpeg"(), result), "\" />"))
+ else
+ Markdown.Code(Documenter.DocTests.result_to_string(buffer, result))
+ end
+
+ # Only add content when there's actually something to add.
+ isempty(input) || push!(content, Markdown.Code("julia", input))
+ isempty(output.code) || push!(content, output)
+ # ... and finally map the original code block to the newly generated ones.
+ page.mapping[x] = Markdown.MD(content)
+end
+
+# @repl
+# -----
+
+function Selectors.runner(::Type{REPLBlocks}, x, page, doc)
+ matched = match(r"^@repl[ ]?(.*)$", x.language)
+ matched === nothing && error("invalid '@repl' syntax: $(x.language)")
+ name = matched[1]
+ sym = isempty(name) ? gensym("ex-") : Symbol("ex-", name)
+ mod = get!(() -> get_new_sandbox(sym), page.globals.meta, sym)
+ code = split(x.code, '\n'; limit = 2)[end]
+ result, out = nothing, IOBuffer()
+ for (ex, str) in Utilities.parseblock(x.code, doc, page)
+ buffer = IOBuffer()
+ input = droplines(str)
+ (value, success, backtrace, text) = Utilities.withoutput() do
+ cd(dirname(page.build)) do
+ Core.eval(mod, :(ans = $(Core.eval(mod, ex))))
+ end
+ end
+ result = value
+ output = if success
+ hide = REPL.ends_with_semicolon(input)
+ Documenter.DocTests.result_to_string(buffer, hide ? nothing : value)
+ else
+ Documenter.DocTests.error_to_string(buffer, value, [])
+ end
+ isempty(input) || println(out, prepend_prompt(input))
+ print(out, text)
+ if isempty(input) || isempty(output)
+ println(out)
+ else
+ println(out, output, "\n")
+ end
+ end
+ page.mapping[x] = Markdown.Code("julia-repl", rstrip(String(take!(out))))
+end
+
+# @setup
+# ------
+
+function Selectors.runner(::Type{SetupBlocks}, x, page, doc)
+ matched = match(r"^@setup[ ](.+)$", x.language)
+ matched === nothing && error("invalid '@setup <name>' syntax: $(x.language)")
+ # The sandboxed module -- either a new one or a cached one from this page.
+ name = matched[1]
+ sym = isempty(name) ? gensym("ex-") : Symbol("ex-", name)
+ mod = get!(page.globals.meta, sym, Module(sym))::Module
+
+ # Evaluate the whole @setup block at once instead of piecewise.
+ try
+ cd(dirname(page.build)) do
+ include_string(mod, x.code)
+ end
+ catch err
+ push!(doc.internal.errors, :setup_block)
+ Utilities.warn(page.source, "failed to run `@setup` block.\n\n$(err)")
+ end
+ # ... and finally map the original code block to an empty node so that it is hidden in the output.
+ page.mapping[x] = Markdown.MD([])
+end
+
+# @raw
+# ----
+
+function Selectors.runner(::Type{RawBlocks}, x, page, doc)
+ m = match(r"@raw[ ](.+)$", x.language)
+ m === nothing && error("invalid '@raw <name>' syntax: $(x.language)")
+ page.mapping[x] = Documents.RawNode(Symbol(m[1]), x.code)
+end
+
+# Utilities.
+# ----------
+
+iscode(x::Markdown.Code, r::Regex) = occursin(r, x.language)
+iscode(x::Markdown.Code, lang) = x.language == lang
+iscode(x, lang) = false
+
+const NAMEDHEADER_REGEX = r"^@id (.+)$"
+
+function namedheader(h::Markdown.Header)
+ if isa(h.text, Vector) && length(h.text) === 1 && isa(h.text[1], Markdown.Link)
+ url = h.text[1].url
+ occursin(NAMEDHEADER_REGEX, url)
+ else
+ false
+ end
+end
+
+# Remove any `# hide` lines, leading/trailing blank lines, and trailing whitespace.
+function droplines(code; skip = 0)
+ buffer = IOBuffer()
+ for line in split(code, '\n')[(skip + 1):end]
+ occursin(r"^(.*)#\s*hide$", line) && continue
+ println(buffer, rstrip(line))
+ end
+ strip(String(take!(buffer)), '\n')
+end
+
+function prepend_prompt(input)
+ prompt = "julia> "
+ padding = " "^length(prompt)
+ out = IOBuffer()
+ for (n, line) in enumerate(split(input, '\n'))
+ line = rstrip(line)
+ println(out, n == 1 ? prompt : padding, line)
+ end
+ rstrip(String(take!(out)))
+end
+
+function get_new_sandbox(name::Symbol)
+ m = Module(name)
+ # eval(expr) is available in the REPL (i.e. Main) so we emulate that for the sandbox
+ Core.eval(m, :(eval(x) = Core.eval($m, x)))
+ # modules created with Module() do not have `include` defined
+ Core.eval(m, :(include(x) = Base.include($m, x)))
+ return m
+end
+
+end
--- /dev/null
+"""
+Filetypes used to decide which rendering methods in [`Documenter.Writers`](@ref) are called.
+
+The supported formats are `Markdown`, `LaTeX`, and `HTML`.
+"""
+module Formats
+
+import ..Documenter
+
+using DocStringExtensions
+
+"""
+Represents the output format. Possible values are `Markdown`, `LaTeX`, and `HTML`.
+"""
+@enum(
+ Format,
+ Markdown,
+ LaTeX,
+ HTML,
+)
+
+"""
+$(SIGNATURES)
+
+Converts a format `Symbol` (`:markdown`, `:latex`, or `:html`) to a `MIME` type.
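+
+For illustration:
+
+```julia
+mimetype(:html)   # -> MIME"text/html"()
+```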
+"""
+function mimetype(f::Symbol)
+ f ≡ :markdown ? MIME"text/plain"() :
+ f ≡ :latex ? MIME"text/latex"() :
+ f ≡ :html ? MIME"text/html"() :
+ error("unexpected format.")
+end
+
+function extension(f::Symbol, file)
+ path, _ = splitext(file)
+ string(path, extension(f))
+end
+
+function extension(f::Symbol)
+ f ≡ :markdown ? ".md" :
+ f ≡ :latex ? ".tex" :
+ f ≡ :html ? ".html" :
+ error("unexpected format.")
+end
+
+# `fmt` -- convert a format spec to a vector of symbols.
+const DEPRECATION_MAPPING = Dict(
+ Markdown => :markdown,
+ LaTeX => :latex,
+ HTML => :html,
+)
+function _fmt(f::Format)
+ s = DEPRECATION_MAPPING[f]
+ Base.depwarn("`$(f)` is deprecated, use `:$(s)` for `format = ...`.", :fmt)
+ return s
+end
+fmt(f::Format) = [_fmt(f)]
+fmt(v::Vector{Format}) = map(_fmt, v)
+fmt(s::Symbol) = [s]
+fmt(v::Vector{Symbol}) = v
+
+end
--- /dev/null
+"""
+Provides a domain specific language for representing HTML documents.
+
+# Examples
+
+```julia
+using Documenter.Utilities.DOM
+
+# `DOM` does not export any HTML tags. Define the ones we actually need.
+@tags div p em strong ul li
+
+div(
+ p("This ", em("is"), " a ", strong("paragraph."),
+ p("And this is ", strong("another"), " one"),
+ ul(
+ li("and"),
+ li("an"),
+ li("unordered"),
+ li("list")
+ )
+)
+```
+
+*Notes*
+
+All the arguments passed to a node are flattened into a single vector rather
+than preserving any nested structure. This means that passing two vectors of
+nodes to a `div` will result in a `div` node with a single vector of children
+(the concatenation of the two vectors) rather than two vector children. The
+only arguments that are not flattened are nested nodes.
+
+String arguments are automatically converted into text nodes. Text nodes do not
+have any children or attributes and when displayed the string is escaped using
+[`escapehtml`](@ref).
+
+# Attributes
+
+As well as plain nodes shown in the previous example, nodes can have attributes
+added to them using the following syntax.
+
+```julia
+div[".my-class"](
+ img[:src => "foo.jpg"],
+ input[\"#my-id\", :disabled]
+)
+```
+
+In the above example we add a `class = "my-class"` attribute to the `div` node,
+a `src = "foo.jpg"` to the `img`, and `id = "my-id" disabled` attributes to the
+`input` node.
+
+The following syntax is supported within `[...]`:
+
+```julia
+tag[\"#id\"]
+tag[".class"]
+tag[\".class#id\"]
+tag[:disabled]
+tag[:src => "foo.jpg"]
+# ... or any combination of the above arguments.
+```
+
+# Internal Representation
+
+The [`@tags`](@ref) macro defines named [`Tag`](@ref) objects as follows
+
+```julia
+@tags div p em strong
+```
+
+expands to
+
+```julia
+const div, p, em, strong = Tag(:div), Tag(:p), Tag(:em), Tag(:strong)
+```
+
+These [`Tag`](@ref) objects are lightweight representations of empty HTML
+elements without any attributes and cannot be used to represent a complete
+document. To create an actual tree of HTML elements that can be rendered we
+need to add some attributes and/or child elements using `getindex` or `call`
+syntax. Applying either to a [`Tag`](@ref) object will construct a new
+[`Node`](@ref) object.
+
+```julia
+tag(...) # No attributes.
+tag[...] # No children.
+tag[...](...) # Has both attributes and children.
+```
+
+All three of the above syntaxes return a new [`Node`](@ref) object. Printing of
+`Node` objects is defined using the standard Julia display functions, so a call
+to `print` is enough to output a valid HTML document with all necessary text
+escaped.
+"""
+module DOM
+
+import ..Utilities
+
+tostr(p::Pair) = p
+
+export @tags
+
+#
+# The following sets are based on:
+#
+# - https://developer.mozilla.org/en/docs/Web/HTML/Block-level_elements
+# - https://developer.mozilla.org/en-US/docs/Web/HTML/Inline_elements
+# - https://developer.mozilla.org/en-US/docs/Glossary/empty_element
+#
+const BLOCK_ELEMENTS = Set([
+ :address, :article, :aside, :blockquote, :canvas, :dd, :div, :dl,
+ :fieldset, :figcaption, :figure, :footer, :form, :h1, :h2, :h3, :h4, :h5,
+ :h6, :header, :hgroup, :hr, :li, :main, :nav, :noscript, :ol, :output, :p,
+ :pre, :section, :table, :tfoot, :ul, :video,
+])
+const INLINE_ELEMENTS = Set([
+ :a, :abbr, :acronym, :b, :bdo, :big, :br, :button, :cite, :code, :dfn, :em,
+ :i, :img, :input, :kbd, :label, :map, :object, :q, :samp, :script, :select,
+ :small, :span, :strong, :sub, :sup, :textarea, :time, :tt, :var,
+])
+const VOID_ELEMENTS = Set([
+ :area, :base, :br, :col, :command, :embed, :hr, :img, :input, :keygen,
+ :link, :meta, :param, :source, :track, :wbr,
+])
+const ALL_ELEMENTS = union(BLOCK_ELEMENTS, INLINE_ELEMENTS, VOID_ELEMENTS)
+
+#
+# Empty string as a constant to make equality checks slightly cheaper.
+#
+const EMPTY_STRING = ""
+const TEXT = Symbol(EMPTY_STRING)
+
+"""
+Represents an empty and attribute-less HTML element.
+
+Use [`@tags`](@ref) to define instances of this type rather than manually
+creating them via `Tag(:tagname)`.
+"""
+struct Tag
+ name :: Symbol
+end
+
+Base.show(io::IO, t::Tag) = print(io, "<", t.name, ">")
+
+"""
+Define a collection of [`Tag`](@ref) objects and bind them to constants
+with the same names.
+
+# Examples
+
+Defined globally within a module:
+
+```julia
+@tags div ul li
+```
+
+Defined within the scope of a function to avoid cluttering the global namespace:
+
+```julia
+function template(args...)
+ @tags div ul li
+ # ...
+end
+```
+"""
+macro tags(args...) esc(tags(args)) end
+tags(s) = :(($(s...),) = $(map(Tag, s)))
+
+const Attributes = Vector{Pair{Symbol, String}}
+
+"""
+Represents an element within an HTML document including any textual content,
+children `Node`s, and attributes.
+
+This type should not be constructed directly, but instead via `(...)` and
+`[...]` applied to a [`Tag`](@ref) or another [`Node`](@ref) object.
+"""
+struct Node
+ name :: Symbol
+ text :: String
+ attributes :: Attributes
+ nodes :: Vector{Node}
+
+ Node(name::Symbol, attr::Attributes, data::Vector{Node}) = new(name, EMPTY_STRING, attr, data)
+ Node(text::AbstractString) = new(TEXT, text)
+end
+
+#
+# Syntax for defining `Node` objects from `Tag`s and other `Node` objects.
+#
+(t::Tag)(args...) = Node(t.name, Attributes(), data(args))
+(n::Node)(args...) = Node(n.name, n.attributes, data(args))
+Base.getindex(t::Tag, args...) = Node(t.name, attr(args), Node[])
+Base.getindex(n::Node, args...) = Node(n.name, attr(args), n.nodes)
+
+#
+# Helper methods for the above `Node` "pseudo-constructors".
+#
+data(args) = flatten!(nodes!, Node[], args)
+attr(args) = flatten!(attributes!, Attributes(), args)
+
+#
+# Types that must not be flattened when constructing a `Node`'s child vector.
+#
+const Atom = Union{AbstractString, Node, Pair, Symbol}
+
+"""
+# Signatures
+
+```julia
+flatten!(f!, out, x::Atom)
+flatten!(f!, out, xs)
+```
+
+Flatten the contents of the third argument into the second after applying the
+function `f!` to each element.
+"""
+flatten!(f!, out, x::Atom) = f!(out, x)
+flatten!(f!, out, xs) = (for x in xs; flatten!(f!, out, x); end; out)
+
+#
+# Helper methods for handling flattening children elements in `Node` construction.
+#
+nodes!(out, s::AbstractString) = push!(out, Node(s))
+nodes!(out, n::Node) = push!(out, n)
+
+#
+# Helper methods for handling flattening in construction of attribute vectors.
+#
+function attributes!(out, s::AbstractString)
+ class, id = IOBuffer(), IOBuffer()
+ for x in eachmatch(r"[#|\.]([\w\-]+)", s)
+ print(startswith(x.match, '.') ? class : id, x.captures[1], ' ')
+ end
+ position(class) === 0 || push!(out, tostr(:class => rstrip(String(take!(class)))))
+ position(id) === 0 || push!(out, tostr(:id => rstrip(String(take!(id)))))
+ return out
+end
+attributes!(out, s::Symbol) = push!(out, tostr(s => ""))
+attributes!(out, p::Pair) = push!(out, tostr(p))
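+
+# For illustration only, a rough sketch of the shorthand handled above: the string form
+# parses CSS-like class/id syntax, while symbols and pairs are passed through, e.g.
+#
+#     attr((".docstring#main", :disabled))
+#     # -> [:class => "docstring", :id => "main", :disabled => ""]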
+
+function Base.show(io::IO, n::Node)
+ if n.name === Symbol("#RAW#")
+ print(io, n.nodes[1].text)
+ elseif n.name === TEXT
+ print(io, escapehtml(n.text))
+ else
+ print(io, '<', n.name)
+ for (name, value) in n.attributes
+ print(io, ' ', name)
+ isempty(value) || print(io, '=', repr(escapehtml(value)))
+ end
+ if n.name in VOID_ELEMENTS
+ print(io, "/>")
+ else
+ print(io, '>')
+ if n.name === :script || n.name === :style
+ isempty(n.nodes) || print(io, n.nodes[1].text)
+ else
+ for each in n.nodes
+ show(io, each)
+ end
+ end
+ print(io, "</", n.name, '>')
+ end
+ end
+end
+
+Base.show(io::IO, ::MIME"text/html", n::Node) = print(io, n)
+
+"""
+Escape characters in the provided string. This converts the following characters:
+
+- `<` to `&lt;`
+- `>` to `&gt;`
+- `&` to `&amp;`
+- `'` to `&#39;`
+- `\"` to `&quot;`
+
+When no escaping is needed then the same object is returned, otherwise a new
+string is constructed with the characters escaped. The returned object should
+always be treated as an immutable copy and compared using `==` rather than `===`.
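+
+For illustration:
+
+```julia
+escapehtml("<Tom & 'Jerry'>")   # -> "&lt;Tom &amp; &#39;Jerry&#39;&gt;"
+```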
+"""
+function escapehtml(text::AbstractString)
+ if occursin(r"[<>&'\"]", text)
+ buffer = IOBuffer()
+ for char in text
+ char === '<' ? write(buffer, "&lt;") :
+ char === '>' ? write(buffer, "&gt;") :
+ char === '&' ? write(buffer, "&amp;") :
+ char === '\'' ? write(buffer, "&#39;") :
+ char === '"' ? write(buffer, "&quot;") : write(buffer, char)
+ end
+ String(take!(buffer))
+ else
+ text
+ end
+end
+
+"""
+An HTML node that wraps around the root node of the document and adds a DOCTYPE
+to it.
+"""
+mutable struct HTMLDocument
+ doctype :: String
+ root :: Node
+end
+HTMLDocument(root) = HTMLDocument("html", root)
+
+function Base.show(io::IO, doc::HTMLDocument)
+ println(io, "<!DOCTYPE $(doc.doctype)>")
+ println(io, doc.root)
+end
+
+end
--- /dev/null
+"""
+Provides the [`mdflatten`](@ref) function that can "flatten" Markdown objects into
+a string, with formatting etc. stripped.
+
+Note that the tests in `test/mdflatten.jl` should be considered to be the spec
+for the output (number of newlines, indents, formatting, etc.).
+"""
+module MDFlatten
+
+export mdflatten
+
+import ..Utilities
+
+import Markdown:
+ MD, BlockQuote, Bold, Code, Header, HorizontalRule,
+ Image, Italic, LaTeX, LineBreak, Link, List, Paragraph, Table,
+ Footnote, Admonition
+
+"""
+Convert a Markdown object to a `String` of only text (i.e. not formatting info).
+
+It drops most of the extra information (e.g. the language of a code block, URLs)
+and formatting (e.g. emphasis, headers). This "flattened" representation can
+then be used as input for search engines.
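+
+For illustration, a minimal sketch (the exact spacing rules are specified by the tests):
+
+```julia
+import Markdown
+md = Markdown.parse("Use **bold** and *italic* text.")
+mdflatten(md)   # -> "Use bold and italic text." followed by two trailing newlines
+```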
+"""
+function mdflatten(md)
+ io = IOBuffer()
+ mdflatten(io, md)
+ String(take!(io))
+end
+
+mdflatten(io, md) = mdflatten(io, md, md)
+mdflatten(io, md::MD, parent) = mdflatten(io, md.content, md)
+mdflatten(io, vec::Vector, parent) = map(x -> mdflatten(io, x, parent), vec)
+function mdflatten(io, vec::Vector, parent::MD)
+ # this special case separates top level blocks with newlines
+ for md in vec
+ mdflatten(io, md, parent)
+ print(io, "\n\n")
+ end
+end
+
+# Block level MD nodes
+mdflatten(io, h::Header{N}, parent) where {N} = mdflatten(io, h.text, h)
+mdflatten(io, p::Paragraph, parent) = mdflatten(io, p.content, p)
+mdflatten(io, bq::BlockQuote, parent) = mdflatten(io, bq.content, bq)
+mdflatten(io, ::HorizontalRule, parent) = nothing
+function mdflatten(io, list::List, parent)
+ for (idx, li) in enumerate(list.items)
+ for (jdx, x) in enumerate(li)
+ mdflatten(io, x, list)
+ jdx == length(li) || print(io, '\n')
+ end
+ idx == length(list.items) || print(io, '\n')
+ end
+end
+function mdflatten(io, t::Table, parent)
+ for (idx, row) = enumerate(t.rows)
+ for (jdx, x) in enumerate(row)
+ mdflatten(io, x, t)
+ jdx == length(row) || print(io, ' ')
+ end
+ idx == length(t.rows) || print(io, '\n')
+ end
+end
+
+# Inline nodes
+mdflatten(io, text::AbstractString, parent) = print(io, text)
+mdflatten(io, link::Link, parent) = mdflatten(io, link.text, link)
+mdflatten(io, b::Bold, parent) = mdflatten(io, b.text, b)
+mdflatten(io, i::Italic, parent) = mdflatten(io, i.text, i)
+mdflatten(io, i::Image, parent) = print(io, "(Image: $(i.alt))")
+mdflatten(io, m::LaTeX, parent) = print(io, replace(m.formula, r"[^()+\-*^=\w\s]" => ""))
+mdflatten(io, ::LineBreak, parent) = print(io, '\n')
+
+# Is both inline and block
+mdflatten(io, c::Code, parent) = print(io, c.code)
+
+# Special (inline) "node" -- due to JuliaMark's interpolations
+mdflatten(io, expr::Union{Symbol,Expr}, parent) = print(io, expr)
+
+mdflatten(io, f::Footnote, parent) = footnote(io, f.id, f.text, f)
+footnote(io, id, text::Nothing, parent) = print(io, "[$id]")
+function footnote(io, id, text, parent)
+ print(io, "[$id]: ")
+ mdflatten(io, text, parent)
+end
+
+function mdflatten(io, a::Admonition, parent)
+ println(io, "$(a.category): $(a.title)")
+ mdflatten(io, a.content, a)
+end
+
+end
--- /dev/null
+"""
+An extensible code selection interface.
+
+The `Selectors` module provides an extensible way to write code that has to dispatch on
+different predicates without hardcoding the control flow into a single chain of `if`
+statements.
+
+In the following example a selector for a simple condition is implemented and the generated
+selector code is described:
+
+```julia
+abstract type MySelector <: Selectors.AbstractSelector end
+
+# The different cases we want to test.
+abstract type One <: MySelector end
+abstract type NotOne <: MySelector end
+
+# The order in which to test the cases.
+Selectors.order(::Type{One}) = 0.0
+Selectors.order(::Type{NotOne}) = 1.0
+
+# The predicate to test against.
+Selectors.matcher(::Type{One}, x) = x === 1
+Selectors.matcher(::Type{NotOne}, x) = x !== 1
+
+# What to do when a test is successful.
+Selectors.runner(::Type{One}, x) = println("found one")
+Selectors.runner(::Type{NotOne}, x) = println("not found")
+
+# Test our selector with some numbers.
+for i in 0:5
+ Selectors.dispatch(MySelector, i)
+end
+```
+
+`Selectors.dispatch(MySelector, i)` will behave equivalently to the following:
+
+```julia
+function dispatch(::Type{MySelector}, i::Int)
+ if matcher(One, i)
+ runner(One, i)
+ elseif matcher(NotOne, i)
+ runner(NotOne, i)
+ end
+end
+```
+
+and further to
+
+```julia
+function dispatch(::Type{MySelector}, i::Int)
+ if i === 1
+ println("found one")
+ elseif i !== 1
+ println("not found")
+ end
+end
+```
+
+The module provides the following interface for creating selectors:
+
+- [`order`](@ref)
+- [`matcher`](@ref)
+- [`runner`](@ref)
+- [`strict`](@ref)
+- [`disable`](@ref)
+- [`dispatch`](@ref)
+
+"""
+module Selectors
+
+import InteractiveUtils: subtypes
+
+"""
+Root selector type. Each user-defined selector must subtype from this, i.e.
+
+```julia
+abstract type MySelector <: Selectors.AbstractSelector end
+
+abstract type First <: MySelector end
+abstract type Second <: MySelector end
+```
+"""
+abstract type AbstractSelector end
+
+"""
+Define the precedence of each case in a selector, i.e.
+
+```julia
+Selectors.order(::Type{First}) = 1.0
+Selectors.order(::Type{Second}) = 2.0
+```
+
+Note that the return type must be `Float64`. Defining multiple case types to have the same
+order will result in undefined behaviour.
+"""
+function order end
+
+"""
+Define the matching test for each case in a selector, i.e.
+
+```julia
+Selectors.matcher(::Type{First}, x) = x == 1
+Selectors.matcher(::Type{Second}, x) = true
+```
+
+Note that the return type must be `Bool`.
+
+To match against multiple cases use the [`Selectors.strict`](@ref) function.
+"""
+function matcher end
+
+"""
+Define the code that will run when a particular [`Selectors.matcher`](@ref) test returns
+`true`, i.e.
+
+```julia
+Selectors.runner(::Type{First}, x) = println("`x` is equal to `1`.")
+Selectors.runner(::Type{Second}, x) = println("`x` is not equal to `1`.")
+```
+"""
+function runner end
+
+"""
+Define whether a selector case will "fall through" or not when successfully matched against.
+By default matching is strict and does not fall through to subsequent selector cases.
+
+```julia
+# Adding a debugging selector case.
+abstract type Debug <: MySelector end
+
+# Insert prior to all other cases.
+Selectors.order(::Type{Debug}) = 0.0
+
+# Fallthrough to the next case on success.
+Selectors.strict(::Type{Debug}) = false
+
+# We always match, regardless of the value of `x`.
+Selectors.matcher(::Type{Debug}, x) = true
+
+# Print some debugging info.
+Selectors.runner(::Type{Debug}, x) = @show x
+```
+"""
+strict(::Type{T}) where {T <: AbstractSelector} = true
+
+"""
+Disable a particular case in a selector so that it is never used.
+
+```julia
+Selectors.disable(::Type{Debug}) = true
+```
+"""
+disable(::Type{T}) where {T <: AbstractSelector} = false
+
+"""
+Call `Selectors.runner(T, args...)` where `T` is a subtype of
+`MySelector` for which `matcher(T, args...)` is `true`.
+
+```julia
+Selectors.dispatch(MySelector, args...)
+```
+"""
+function dispatch(::Type{T}, x...) where T <: AbstractSelector
+ for t in (sort(subtypes(T); by = order))
+ if !disable(t) && matcher(t, x...)
+ runner(t, x...)
+ strict(t) && return
+ end
+ end
+ runner(T, x...)
+end
+
+end
--- /dev/null
+module TextDiff
+
+using DocStringExtensions
+
+# Utilities.
+
+function lcs(old_tokens::Vector, new_tokens::Vector)
+ m = length(old_tokens)
+ n = length(new_tokens)
+ weights = zeros(Int, m + 1, n + 1)
+ for i = 2:(m + 1), j = 2:(n + 1)
+ weights[i, j] = old_tokens[i - 1] == new_tokens[j - 1] ?
+ weights[i - 1, j - 1] + 1 : max(weights[i, j - 1], weights[i - 1, j])
+ end
+ return weights
+end
+
+function makediff(weights::Matrix, old_tokens::Vector, new_tokens::Vector)
+ m = length(old_tokens)
+ n = length(new_tokens)
+ diff = Vector{Pair{Symbol, SubString{String}}}()
+ makediff!(diff, weights, old_tokens, new_tokens, m + 1, n + 1)
+ return diff
+end
+
+function makediff!(out, weights, X, Y, i, j)
+ if i > 1 && j > 1 && X[i - 1] == Y[j - 1]
+ makediff!(out, weights, X, Y, i - 1, j - 1)
+ push!(out, :normal => X[i - 1])
+ else
+ if j > 1 && (i == 1 || weights[i, j - 1] >= weights[i - 1, j])
+ makediff!(out, weights, X, Y, i, j - 1)
+ push!(out, :green => Y[j - 1])
+ elseif i > 1 && (j == 1 || weights[i, j - 1] < weights[i - 1, j])
+ makediff!(out, weights, X, Y, i - 1, j)
+ push!(out, :red => X[i - 1])
+ end
+ end
+ return out
+end
+
+"""
+$(SIGNATURES)
+
+Splits `text` at `regex` matches, returning an array of substrings. The parts of the string
+that match the regular expression are also included at the ends of the returned strings.
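+
+For illustration (the matched separators stay attached to the preceding substring):
+
+```julia
+splitby(r", ", "one, two, three")   # -> ["one, ", "two, ", "three"]
+```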
+"""
+function splitby(reg::Regex, text::AbstractString)
+ out = SubString{String}[]
+ token_first = 1
+ for each in eachmatch(reg, text)
+ token_last = each.offset + lastindex(each.match) - 1
+ push!(out, SubString(text, token_first, token_last))
+ token_first = nextind(text, token_last)
+ end
+ laststr = SubString(text, token_first)
+ isempty(laststr) || push!(out, laststr)
+ return out
+end
+
+# Diff Type.
+
+struct Lines end
+struct Words end
+
+splitter(::Type{Lines}) = r"\n"
+splitter(::Type{Words}) = r"\s+"
+
+struct Diff{T}
+ old_tokens::Vector{SubString{String}}
+ new_tokens::Vector{SubString{String}}
+ weights::Matrix{Int}
+ diff::Vector{Pair{Symbol, SubString{String}}}
+
+ function Diff{T}(old_text::AbstractString, new_text::AbstractString) where T
+ reg = splitter(T)
+ old_tokens = splitby(reg, old_text)
+ new_tokens = splitby(reg, new_text)
+ weights = lcs(old_tokens, new_tokens)
+ diff = makediff(weights, old_tokens, new_tokens)
+ return new{T}(old_tokens, new_tokens, weights, diff)
+ end
+end
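+
+# For illustration only, a rough usage sketch: a `Diff` is constructed from two strings and
+# rendered with `show`, e.g.
+#
+#     diff = Diff{Words}("the quick brown fox", "the quick red fox")
+#     show(stdout, diff)   # "brown" is printed in red (removed), "red" in green (added)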
+
+# Display.
+
+prefix(::Diff{Lines}, s::Symbol) = s === :green ? "+ " : s === :red ? "- " : " "
+prefix(::Diff{Words}, ::Symbol) = ""
+
+function showdiff(io::IO, diff::Diff)
+ for (color, text) in diff.diff
+ printstyled(io, prefix(diff, color), text, color=color)
+ end
+end
+
+function Base.show(io::IO, diff::Diff)
+ printstyled(io, color=:normal) # Reset colors.
+ showdiff(io, diff)
+end
+
+end
--- /dev/null
+"""
+Provides a collection of utility functions and types that are used in other submodules.
+"""
+module Utilities
+
+using Base.Meta
+import Base: isdeprecated, Docs.Binding
+using DocStringExtensions
+import Markdown, LibGit2
+import Base64: stringmime
+
+# Logging output.
+
+const __log__ = Ref(true)
+"""
+ logging(flag::Bool)
+
+Enable or disable logging output for [`log`](@ref) and [`warn`](@ref).
+"""
+logging(flag::Bool) = __log__[] = flag
+
+"""
+Format and print a message to the user.
+"""
+log(msg) = __log__[] ? printstyled(stdout, "Documenter: ", msg, "\n", color=:magenta) : nothing
+
+# Print logging output to the "real" stdout.
+function log(doc, msg)
+ __log__[] && printstyled(stdout, "Documenter: ", msg, "\n", color=:magenta)
+ return nothing
+end
+
+debug(msg) = printstyled(" ?? ", msg, "\n", color=:green)
+
+"""
+ warn(file, msg)
+ warn(msg)
+
+Format and print a warning message to the user. Passing a `file` will include the filename
+where the warning was raised.
+"""
+function warn(file, msg)
+ if __log__[]
+ msg = string(" !! ", msg, " [", file, "]\n")
+ printstyled(stdout, msg, color=:red)
+ else
+ nothing
+ end
+end
+warn(msg) = __log__[] ? printstyled(stdout, " !! ", msg, "\n", color=:red) : nothing
+
+function warn(file, msg, err, ex, mod)
+ if __log__[]
+ warn(file, msg)
+ printstyled(stdout, "\nERROR: $err\n\nexpression '$(repr(ex))' in module '$mod'\n\n", color=:red)
+ else
+ nothing
+ end
+end
+
+function warn(doc, page, msg, err)
+ file = page.source
+ printstyled(stdout, " !! Warning in $(file):\n\n$(msg)\n\nERROR: $(err)\n\n", color=:red)
+end
+
+# Directory paths.
+
+"""
+Returns the current directory.
+"""
+function currentdir()
+ d = Base.source_dir()
+ d === nothing ? pwd() : d
+end
+
+"""
+Returns the path to the Documenter `assets` directory.
+"""
+assetsdir() = normpath(joinpath(dirname(@__FILE__), "..", "..", "assets"))
+
+cleandir(d::AbstractString) = (isdir(d) && rm(d, recursive = true); mkdir(d))
+
+"""
+Find the path of a file relative to the `source` directory. `root` is the path
+to the directory containing the file `file`.
+
+It is meant to be used with `walkdir(source)`.
+"""
+srcpath(source, root, file) = normpath(joinpath(relpath(root, source), file))
+
+# Slugify text.
+
+"""
+Slugify a string into a form suitable for use in a URL (e.g. as an anchor name).
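+
+For illustration:
+
+```julia
+slugify("Installation & Setup")   # -> "Installation-and-Setup"
+```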
+"""
+function slugify(s::AbstractString)
+ s = replace(s, r"\s+" => "-")
+ s = replace(s, r"^\d+" => "")
+ s = replace(s, r"&" => "-and-")
+ s = replace(s, r"[^\p{L}\p{P}\d\-]+" => "")
+ s = strip(replace(s, r"\-\-+" => "-"), '-')
+end
+slugify(object) = string(object) # Non-string slugifying doesn't do anything.
+
+# Parse code blocks.
+
+"""
+Returns a vector of parsed expressions and their corresponding raw strings.
+
+The result is a `Vector` of tuples `(expr, code)`, where `expr` is the parsed expression
+(e.g. an `Expr` or `Symbol` object) and `code` is the string of code the expression was
+parsed from.
+
+The keyword argument `skip = N` drops the leading `N` lines from the input string.
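+
+For illustration (the `doc` and `page` arguments are only consulted when reporting parse
+errors, so `nothing` suffices for well-formed input in this sketch):
+
+```julia
+parseblock("x = 1", nothing, nothing)   # -> [(:(x = 1), "x = 1\\n")]
+```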
+"""
+function parseblock(code::AbstractString, doc, page; skip = 0, keywords = true)
+ # Drop `skip` leading lines from the code block. Needed for deprecated `{docs}` syntax.
+ code = string(code, '\n')
+ code = last(split(code, '\n', limit = skip + 1))
+ endofstr = lastindex(code)
+ results = []
+ cursor = 1
+ while cursor < endofstr
+ # Check for keywords first since they will throw parse errors if we `parse` them.
+ line = match(r"^(.*)\r?\n"m, SubString(code, cursor)).match
+ keyword = Symbol(strip(line))
+ (ex, ncursor) =
+ # TODO: On 0.7 Symbol("") is in Docs.keywords, remove that check when dropping 0.6
+ if keywords && (haskey(Docs.keywords, keyword) || keyword == Symbol(""))
+ (QuoteNode(keyword), cursor + lastindex(line))
+ else
+ try
+ Meta.parse(code, cursor)
+ catch err
+ push!(doc.internal.errors, :parse_error)
+ Utilities.warn(doc, page, "Failed to parse expression.", err)
+ break
+ end
+ end
+ str = SubString(code, cursor, prevind(code, ncursor))
+ if !isempty(strip(str))
+ push!(results, (ex, str))
+ end
+ cursor = ncursor
+ end
+ results
+end
+isassign(x) = isexpr(x, :(=), 2) && isa(x.args[1], Symbol)
+
+# Checking arguments.
+
+"""
+Prints a formatted warning to the user listing unrecognised keyword arguments.
+"""
+function check_kwargs(kws)
+ isempty(kws) && return
+ out = IOBuffer()
+ println(out, "Unknown keywords:\n")
+ for (k, v) in kws
+ println(out, " ", k, " = ", v)
+ end
+ warn(String(take!(out)))
+end
+
+# Finding submodules.
+
+const ModVec = Union{Module, Vector{Module}}
+
+"""
+Returns the set of submodules of the given root module or modules.
+"""
+function submodules(modules::Vector{Module})
+ out = Set{Module}()
+ for each in modules
+ submodules(each, out)
+ end
+ out
+end
+function submodules(root::Module, seen = Set{Module}())
+ push!(seen, root)
+ for name in names(root, all=true)
+ if Base.isidentifier(name) && isdefined(root, name) && !isdeprecated(root, name)
+ object = getfield(root, name)
+ if isa(object, Module) && !(object in seen) && parentmodule(object::Module) == root
+ submodules(object, seen)
+ end
+ end
+ end
+ return seen
+end
+
+
+
+## objects
+## =======
+
+
+
+"""
+Represents an object stored in the docsystem by its binding and signature.
+"""
+struct Object
+ binding :: Binding
+ signature :: Type
+
+ function Object(b::Binding, signature::Type)
+ m = nameof(b.mod) === b.var ? parentmodule(b.mod) : b.mod
+ new(Binding(m, b.var), signature)
+ end
+end
+
+function splitexpr(x::Expr)
+ isexpr(x, :macrocall) ? splitexpr(x.args[1]) :
+ isexpr(x, :.) ? (x.args[1], x.args[2]) :
+ error("Invalid @var syntax `$x`.")
+end
+splitexpr(s::Symbol) = :(Main), quot(s)
+splitexpr(other) = error("Invalid @var syntax `$other`.")
+
+"""
+ object(ex, str)
+
+Returns an expression that, when evaluated, returns an [`Object`](@ref) representing `ex`.
+"""
+function object(ex::Union{Symbol, Expr}, str::AbstractString)
+ binding = Expr(:call, Binding, splitexpr(Docs.namify(ex))...)
+ signature = Base.Docs.signature(ex)
+ isexpr(ex, :macrocall, 2) && !endswith(str, "()") && (signature = :(Union{}))
+ Expr(:call, Object, binding, signature)
+end
+
+function object(qn::QuoteNode, str::AbstractString)
+ if haskey(Base.Docs.keywords, qn.value)
+ binding = Expr(:call, Binding, Main, qn)
+ Expr(:call, Object, binding, Union{})
+ else
+ error("'$(qn.value)' is not a documented keyword.")
+ end
+end
+
+function Base.print(io::IO, obj::Object)
+ print(io, obj.binding)
+ print_signature(io, obj.signature)
+end
+print_signature(io::IO, signature::Union{Union, Type{Union{}}}) = nothing
+print_signature(io::IO, signature) = print(io, '-', signature)
+
+## docs
+## ====
+
+"""
+ docs(ex, str)
+
+Returns an expression that, when evaluated, returns the docstrings associated with `ex`.
+"""
+function docs end
+
+# Macro representation changed between 0.4 and 0.5.
+function docs(ex::Union{Symbol, Expr}, str::AbstractString)
+ isexpr(ex, :macrocall, 2) && !endswith(rstrip(str), "()") && (ex = quot(ex))
+ :(Base.Docs.@doc $ex)
+end
+docs(qn::QuoteNode, str::AbstractString) = :(Base.Docs.@doc $(qn.value))
+
+"""
+Returns the category name of the provided [`Object`](@ref).
+"""
+doccat(obj::Object) = startswith(string(obj.binding.var), '@') ?
+ "Macro" : doccat(obj.binding, obj.signature)
+
+function doccat(b::Binding, ::Union{Union, Type{Union{}}})
+ if b.mod === Main && haskey(Base.Docs.keywords, b.var)
+ "Keyword"
+ elseif startswith(string(b.var), '@')
+ "Macro"
+ else
+ doccat(getfield(b.mod, b.var))
+ end
+end
+
+doccat(b::Binding, ::Type) = "Method"
+
+doccat(::Function) = "Function"
+doccat(::DataType) = "Type"
+doccat(x::UnionAll) = doccat(Base.unwrap_unionall(x))
+doccat(::Module) = "Module"
+doccat(::Any) = "Constant"
+
+"""
+ filterdocs(doc, modules)
+
+Remove docstrings from the markdown object, `doc`, that are not from one of `modules`.
+"""
+function filterdocs(doc::Markdown.MD, modules::Set{Module})
+ if isempty(modules)
+ # When no modules are specified in `makedocs` then don't filter anything.
+ doc
+ else
+ if haskey(doc.meta, :module)
+ doc.meta[:module] ∈ modules ? doc : nothing
+ else
+ if haskey(doc.meta, :results)
+ out = []
+ results = []
+ for (each, result) in zip(doc.content, doc.meta[:results])
+ r = filterdocs(each, modules)
+ if r !== nothing
+ push!(out, r)
+ push!(results, result)
+ end
+ end
+ if isempty(out)
+ nothing
+ else
+ md = Markdown.MD(out)
+ md.meta[:results] = results
+ md
+ end
+ else
+ out = []
+ for each in doc.content
+ r = filterdocs(each, modules)
+ r === nothing || push!(out, r)
+ end
+ isempty(out) ? nothing : Markdown.MD(out)
+ end
+ end
+ end
+end
+# Non-markdown docs won't have a `.meta` field so always just accept those.
+filterdocs(other, modules::Set{Module}) = other
+
+"""
+Does the given docstring represent actual documentation or a no docs error message?
+"""
+nodocs(x) = occursin("No documentation found.", stringmime("text/plain", x))
+nodocs(::Nothing) = false
+
+header_level(::Markdown.Header{N}) where {N} = N
+
+"""
+ repo_root(file; dbdir=".git")
+
+Tries to determine the root directory of the repository containing `file`. If the file is
+not in a repository, the function returns `nothing`.
+
+The `dbdir` keyword argument specifies the name of the directory we are searching for to
+determine if this is a repostory or not. If there is a file called `dbdir`, then it's
+contents is checked under the assumption that it is a Git worktree.
+"""
+function repo_root(file; dbdir=".git")
+ parent_dir, parent_dir_last = dirname(abspath(file)), ""
+ while parent_dir != parent_dir_last
+ dbdir_path = joinpath(parent_dir, dbdir)
+ isdir(dbdir_path) && return parent_dir
+ # Let's see if this is a worktree checkout
+ if isfile(dbdir_path)
+ contents = chomp(read(dbdir_path, String))
+ if startswith(contents, "gitdir: ")
+ if isdir(contents[9:end])
+ return parent_dir
+ end
+ end
+ end
+ parent_dir, parent_dir_last = dirname(parent_dir), parent_dir
+ end
+ return nothing
+end
+
+"""
+ $(SIGNATURES)
+
+Returns the path of `file`, relative to the root of the Git repository, or `nothing` if the
+file is not in a Git repository.
+"""
+function relpath_from_repo_root(file)
+ cd(dirname(file)) do
+ root = repo_root(file)
+ root !== nothing && startswith(file, root) ? relpath(file, root) : nothing
+ end
+end
+
+function repo_commit(file)
+ cd(dirname(file)) do
+ readchomp(`git rev-parse HEAD`)
+ end
+end
+
+function url(repo, file; commit=nothing)
+ file = realpath(abspath(file))
+ remote = getremote(dirname(file))
+ isempty(repo) && (repo = "https://github.com/$remote/blob/{commit}{path}")
+ path = relpath_from_repo_root(file)
+ if path === nothing
+ nothing
+ else
+ repo = replace(repo, "{commit}" => commit === nothing ? repo_commit(file) : commit)
+ # Note: replacing any backslashes in path (e.g. if building the docs on Windows)
+ repo = replace(repo, "{path}" => string("/", replace(path, '\\' => '/')))
+ repo = replace(repo, "{line}" => "")
+ repo
+ end
+end
+
+url(remote, repo, doc) = url(remote, repo, doc.data[:module], doc.data[:path], linerange(doc))
+
+function url(remote, repo, mod, file, linerange)
+ file === nothing && return nothing # needed on julia v0.6, see #689
+ remote = getremote(dirname(file))
+ isabspath(file) && isempty(remote) && isempty(repo) && return nothing
+
+ # make sure we get the true path, as otherwise we will get different paths when we compute `root` below
+ if isfile(file)
+ file = realpath(abspath(file))
+ end
+
+ # Format the line range.
+ line = format_line(linerange, LineRangeFormatting(repo_host_from_url(repo)))
+ # Macro-generated methods such as those produced by `@deprecate` list their file as
+ # `deprecated.jl` since that is where the macro is defined. Use that to help
+ # determine the correct URL.
+ if inbase(mod) || !isabspath(file)
+ file = replace(file, '\\' => '/')
+ base = "https://github.com/JuliaLang/julia/blob"
+ dest = "base/$file#$line"
+ if isempty(Base.GIT_VERSION_INFO.commit)
+ "$base/v$VERSION/$dest"
+ else
+ commit = Base.GIT_VERSION_INFO.commit
+ "$base/$commit/$dest"
+ end
+ else
+ path = relpath_from_repo_root(file)
+ if isempty(repo)
+ repo = "https://github.com/$remote/blob/{commit}{path}#{line}"
+ end
+ if path === nothing
+ nothing
+ else
+ repo = replace(repo, "{commit}" => repo_commit(file))
+ # Note: replacing any backslashes in path (e.g. if building the docs on Windows)
+ repo = replace(repo, "{path}" => string("/", replace(path, '\\' => '/')))
+ repo = replace(repo, "{line}" => line)
+ repo
+ end
+ end
+end
+
+function getremote(dir::AbstractString)
+ remote =
+ try
+ cd(() -> readchomp(`git config --get remote.origin.url`), dir)
+ catch err
+ ""
+ end
+ m = match(LibGit2.GITHUB_REGEX, remote)
+ if m === nothing
+ travis = get(ENV, "TRAVIS_REPO_SLUG", "")
+ isempty(travis) ? "" : travis
+ else
+ m[1]
+ end
+end
+
+"""
+$(SIGNATURES)
+
+Returns the first 5 characters of the current git commit hash of the directory `dir`.
+"""
+function get_commit_short(dir)
+ commit = cd(dir) do
+ readchomp(`git rev-parse HEAD`)
+ end
+ (length(commit) > 5) ? commit[1:5] : commit
+end
+
+function inbase(m::Module)
+ if m ≡ Base
+ true
+ else
+ parent = parentmodule(m)
+ parent ≡ m ? false : inbase(parent)
+ end
+end
+
+# Repository hosts
+# RepoUnknown denotes that the repository type could not be determined automatically
+@enum RepoHost RepoGithub RepoBitbucket RepoGitlab RepoUnknown
+
+# Repository host from repository url
+# i.e. "https://github.com/something" => RepoGithub
+# "https://bitbucket.org/xxx" => RepoBitbucket
+# If no match, returns RepoUnknown
+function repo_host_from_url(repoURL::String)
+ if occursin("bitbucket", repoURL)
+ return RepoBitbucket
+ elseif occursin("github", repoURL) || isempty(repoURL)
+ return RepoGithub
+ elseif occursin("gitlab", repoURL)
+ return RepoGitlab
+ else
+ return RepoUnknown
+ end
+end
+
+# Find line numbers.
+# ------------------
+
+linerange(doc) = linerange(doc.text, doc.data[:linenumber])
+
+function linerange(text, from)
+ lines = sum([isodd(n) ? newlines(s) : 0 for (n, s) in enumerate(text)])
+ return lines > 0 ? (from:(from + lines + 1)) : (from:from)
+end
+
+struct LineRangeFormatting
+ prefix::String
+ separator::String
+
+ function LineRangeFormatting(host::RepoHost)
+ if host == RepoBitbucket
+ new("", ":")
+ elseif host == RepoGitlab
+ new("L", "-")
+ else
+ # default is github-style
+ new("L", "-L")
+ end
+ end
+end
+
+function format_line(range::AbstractRange, format::LineRangeFormatting)
+ if length(range) <= 1
+ string(format.prefix, first(range))
+ else
+ string(format.prefix, first(range), format.separator, last(range))
+ end
+end
+
+newlines(s::AbstractString) = count(c -> c === '\n', s)
+newlines(other) = 0
+
+
+# Output redirection.
+# -------------------
+using Logging
+
+"""
+Call a function and capture all `stdout` and `stderr` output.
+
+ withoutput(f) --> (result, success, backtrace, output)
+
+where
+
+ * `result` is the value returned from calling function `f`.
+ * `success` is `true` if `f` ran without throwing an error; otherwise `result` stores the
+ `Exception` that was raised.
+ * `backtrace` is a `Vector{Ptr{Cvoid}}` produced by `catch_backtrace()` if an error is thrown.
+ * `output` is the combined output of `stdout` and `stderr` during execution of `f`.
+
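+For illustration, a minimal sketch:
+
+```julia
+result, success, backtrace, output = withoutput(() -> (println("hello"); 1 + 1))
+# result == 2, success == true, and `output` contains "hello"
+```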
+"""
+function withoutput(f)
+ # Save the default output streams.
+ default_stdout = stdout
+ default_stderr = stderr
+
+ # Redirect both the `stdout` and `stderr` streams to a single `Pipe` object.
+ pipe = Pipe()
+ Base.link_pipe!(pipe; reader_supports_async = true, writer_supports_async = true)
+ redirect_stdout(pipe.in)
+ redirect_stderr(pipe.in)
+ # Also redirect logging stream to the same pipe
+ logger = ConsoleLogger(pipe.in)
+
+ # Bytes written to the `pipe` are captured in `output` and converted to a `String`.
+ output = UInt8[]
+
+ # Run the function `f`, capturing all output that it might have generated.
+ # Success signals whether the function `f` did or did not throw an exception.
+ result, success, backtrace = with_logger(logger) do
+ try
+ f(), true, Vector{Ptr{Cvoid}}()
+ catch err
+ # InterruptException should never happen during normal doc-testing
+ # and not being able to abort the doc-build is annoying (#687).
+ isa(err, InterruptException) && rethrow(err)
+
+ err, false, catch_backtrace()
+ finally
+ # Force at least a single write to `pipe`, otherwise `readavailable` blocks.
+ println()
+ # Restore the original output streams.
+ redirect_stdout(default_stdout)
+ redirect_stderr(default_stderr)
+ # NOTE: `close` must always be called *after* `readavailable`.
+ append!(output, readavailable(pipe))
+ close(pipe)
+ end
+ end
+ return result, success, backtrace, chomp(String(output))
+end
+
+
+"""
+ issubmodule(sub, mod)
+
+Checks whether `sub` is a submodule of `mod`. A module is also considered to be
+its own submodule.
+
+E.g. `A.B.C` is a submodule of `A`, `A.B` and `A.B.C`, but it is not a submodule
+of `D`, `A.D` nor `A.B.C.D`.
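+
+For illustration:
+
+```julia
+issubmodule(Base.Math, Base)   # true
+issubmodule(Base, Base.Math)   # false
+```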
+"""
+function issubmodule(sub, mod)
+ if (sub === Main) && (mod !== Main)
+ return false
+ end
+ (sub === mod) || issubmodule(parentmodule(sub), mod)
+end
+
+"""
+ isabsurl(url)
+
+Checks whether `url` is an absolute URL (as opposed to a relative one).
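+
+For illustration:
+
+```julia
+isabsurl("https://julialang.org/")   # true
+isabsurl("foo/bar.html")             # false
+```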
+"""
+isabsurl(url) = occursin(ABSURL_REGEX, url)
+const ABSURL_REGEX = r"^[[:alpha:]+-.]+://"
+
+include("DOM.jl")
+include("MDFlatten.jl")
+include("TextDiff.jl")
+include("Selectors.jl")
+
+end
--- /dev/null
+"""
+A module for rendering `Document` objects to HTML.
+
+# Keywords
+
+[`HTMLWriter`](@ref) uses the following additional keyword arguments that can be passed to
+[`Documenter.makedocs`](@ref): `assets`, `sitename`, `analytics`, `authors`, `pages`,
+`version`, `html_prettyurls`, `html_disable_git`, `html_edit_branch`, `html_canonical`.
+
+**`sitename`** is the site's title displayed in the title bar and at the top of the
+navigation menu. This argument is mandatory for [`HTMLWriter`](@ref).
+
+**`pages`** defines the hierarchy of the navigation menu.
+
+**`assets`** can be used to include additional assets (JS, CSS, ICO etc. files). See below
+for more information.
+
+# Experimental keywords
+
+**`analytics`** can be used to specify the Google Analytics tracking ID.
+
+**`version`** specifies the version string of the current version which will be the
+selected option in the version selector. If this is left empty (default) the version
+selector will be hidden. The special value `git-commit` sets the value in the output to
+`git:{commit}`, where `{commit}` is the first few characters of the current commit hash.
+
+**`html_prettyurls`** (default `true`) -- allows disabling the pretty URLs feature, which
+generates an output directory structure that hides the `.html` suffixes from the URLs (e.g.
+by default `src/foo.md` becomes `src/foo/index.html`). This does not work when browsing
+documentation in local files since browsers do not resolve `foo/` to `foo/index.html`
+for local files. If `html_prettyurls = false`, then Documenter generates `src/foo.html`
+instead and sets up the internal links accordingly, suitable for local documentation builds.
+
+**`html_disable_git`** can be used to disable calls to `git` when the document is not
+in a Git-controlled repository. Without setting this to `true`, Documenter will throw
+an error and exit if any of the Git commands fail. The calls to Git are mainly used to
+gather information about the current commit hash and file paths, necessary for constructing
+the links to the remote repository.
+
+**`html_edit_branch`** specifies which branch, tag or commit the "Edit on GitHub" links
+point to. It defaults to `master`. If it is set to `nothing`, the current commit will be used.
+
+**`html_canonical`** specifies the canonical URL for your documentation. We recommend
+you set this to the base URL of your stable documentation, e.g. `https://juliadocs.github.io/Documenter.jl/stable`.
+This allows search engines to know which version to send their users to. [See
+Wikipedia for more information](https://en.wikipedia.org/wiki/Canonical_link_element).
+Default is `nothing`, in which case no canonical link is set.
+
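+For illustration, a minimal `makedocs` call using these keywords might look as follows
+(`MyPackage` and the URLs are placeholders):
+
+```julia
+using Documenter, MyPackage
+
+makedocs(
+ format   = :html,
+ sitename = "MyPackage.jl",
+ modules  = [MyPackage],
+ pages    = ["Home" => "index.md"],
+ html_prettyurls = false,   # emit plain .html files, convenient for local browsing
+ html_canonical  = "https://example.org/MyPackage.jl/stable/",
+)
+```
+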
+# Page outline
+
+The [`HTMLWriter`](@ref) makes use of the page outline that is determined by the
+headings. It is assumed that if the very first block of a page is a level 1 heading,
+then it is intended as the page title. This has two consequences:
+
+1. It is then used to automatically determine the page title in the navigation menu
+ and in the `<title>` tag, unless specified in the `.pages` option.
+2. If the first heading is interpreted as being the page title, it is not displayed
+ in the navigation sidebar.
+
+# Default and custom assets
+
+Documenter copies all files under the source directory (e.g. `/docs/src/`) over
+to the compiled site. It also copies a set of default assets from `/assets/html/`
+to the site's `assets/` directory, unless the user already has a file with the
+same name, in which case the user's file overrides Documenter's file.
+This could, in principle, be used for customizing the site's style and scripting.
+
+The HTML output also links certain custom assets to the generated HTML documents,
+specifically a logo and additional JavaScript files.
+The asset files that should be linked must be placed in `assets/`, under the source
+directory (e.g. `/docs/src/assets`), and must be at the top level (i.e. files in
+the subdirectories of `assets/` are not linked).
+
+For the **logo**, Documenter checks for the existence of `assets/logo.png`.
+If that's present, it gets displayed in the navigation bar.
+
+Additional JS, ICO, and CSS assets can be included in the generated pages using the
+`assets` keyword for `makedocs`. `assets` must be a `Vector{String}` and will include
+each listed asset in the `<head>` of every page in the order in which they are listed.
+The type of the asset (i.e. whether it is going to be included with a `<script>` or a
+`<link>` tag) is determined by the file's extension -- either `.js`, `.ico`, or `.css`.
+Adding an ICO asset is primarily useful for setting a custom `favicon`.
+"""
+module HTMLWriter
+
+import Markdown
+
+import ...Documenter:
+ Anchors,
+ Builder,
+ Documents,
+ Expanders,
+ Formats,
+ Documenter,
+ Utilities,
+ Writers
+
+import ...Utilities.DOM: DOM, Tag, @tags
+using ...Utilities.MDFlatten
+
+const requirejs_cdn = "file:///usr/share/javascript/requirejs/require.min.js"
+const normalize_css = "file:///usr/share/javascript/normalize.css/normalize.min.css"
+const google_fonts = "file:///usr/share/doc/julia-doc/fontface.css"
+const fontawesome_css = "file:///usr/share/fonts-font-awesome/css/font-awesome.min.css"
+const highlightjs_css = "file:///usr/share/javascript/highlight.js/styles/default.css"
+
+
+"""
+[`HTMLWriter`](@ref)-specific globals that are passed to [`domify`](@ref) and
+other recursive functions.
+"""
+mutable struct HTMLContext
+ doc :: Documents.Document
+ logo :: String
+ scripts :: Vector{String}
+ documenter_js :: String
+ search_js :: String
+ search_index :: IOBuffer
+ search_index_js :: String
+ search_navnode :: Documents.NavNode
+ local_assets :: Vector{String}
+end
+HTMLContext(doc) = HTMLContext(doc, "", [], "", "", IOBuffer(), "", Documents.NavNode("search", "Search", nothing), [])
+
+"""
+Returns a page (as a [`Documents.Page`](@ref) object) using the [`HTMLContext`](@ref).
+"""
+getpage(ctx, path) = ctx.doc.internal.pages[path]
+getpage(ctx, navnode::Documents.NavNode) = getpage(ctx, navnode.page)
+
+
+function render(doc::Documents.Document)
+ !isempty(doc.user.sitename) || error("HTML output requires `sitename`.")
+
+ ctx = HTMLContext(doc)
+ ctx.search_index_js = "search_index.js"
+
+ copy_asset("arrow.svg", doc)
+
+ let logo = joinpath("assets", "logo.png")
+ if isfile(joinpath(doc.user.build, logo))
+ ctx.logo = logo
+ end
+ end
+
+ ctx.documenter_js = copy_asset("documenter.js", doc)
+ ctx.search_js = copy_asset("search.js", doc)
+
+ push!(ctx.local_assets, copy_asset("documenter.css", doc))
+ append!(ctx.local_assets, doc.user.assets)
+
+ for navnode in doc.internal.navlist
+ render_page(ctx, navnode)
+ end
+
+ render_search(ctx)
+
+ open(joinpath(doc.user.build, ctx.search_index_js), "w") do io
+ println(io, "var documenterSearchIndex = {\"docs\": [\n")
+ write(io, String(take!(ctx.search_index)))
+ println(io, "]}")
+ end
+end
+
+"""
+Copies an asset from Documenter's `assets/html/` directory to `doc.user.build`.
+Returns the path of the copied asset relative to `.build`.
+"""
+function copy_asset(file, doc)
+ src = joinpath(Utilities.assetsdir(), "html", file)
+ alt_src = joinpath(doc.user.source, "assets", file)
+ dst = joinpath(doc.user.build, "assets", file)
+ isfile(src) || error("Asset '$file' not found at $(abspath(src))")
+
+ # Since the user's alternative assets are already copied over in a previous build
+ # step and they should override Documenter's original assets, we only actually
+ # perform the copy if <source>/assets/<file> does not exist. Note that checking
+ # the existence of <build>/assets/<file> is not sufficient since the <build>
+ # directory might be dirty from a previous build.
+ if isfile(alt_src)
+ Utilities.warn("Not copying '$src', provided by the user.")
+ else
+ ispath(dirname(dst)) || mkpath(dirname(dst))
+ ispath(dst) && Utilities.warn("Overwriting '$dst'.")
+ cp(src, dst, force=true)
+ end
+ assetpath = normpath(joinpath("assets", file))
+ # Replace any backslashes in links, if building the docs on Windows
+ return replace(assetpath, '\\' => '/')
+end
+
+# Page
+# ------------------------------------------------------------------------------
+
+"""
+Constructs and writes the page referred to by the `navnode` to `.build`.
+"""
+function render_page(ctx, navnode)
+ @tags html body
+
+ page = getpage(ctx, navnode)
+
+ head = render_head(ctx, navnode)
+ navmenu = render_navmenu(ctx, navnode)
+ article = render_article(ctx, navnode)
+
+ htmldoc = DOM.HTMLDocument(
+ html[:lang=>"en"](
+ head,
+ body(navmenu, article)
+ )
+ )
+
+ open_output(ctx, navnode) do io
+ print(io, htmldoc)
+ end
+end
+
+function render_head(ctx, navnode)
+ @tags head meta link script title
+ src = get_url(ctx, navnode)
+
+ page_title = "$(mdflatten(pagetitle(ctx, navnode))) · $(ctx.doc.user.sitename)"
+ css_links = [
+ normalize_css,
+ google_fonts,
+ fontawesome_css,
+ highlightjs_css,
+ ]
+ head(
+ meta[:charset=>"UTF-8"],
+ meta[:name => "viewport", :content => "width=device-width, initial-scale=1.0"],
+ title(page_title),
+
+ analytics_script(ctx.doc.user.analytics),
+
+ canonical_link_element(ctx.doc.user.html_canonical, src),
+
+ # Stylesheets.
+ map(css_links) do each
+ link[:href => each, :rel => "stylesheet", :type => "text/css"]
+ end,
+
+ script("documenterBaseURL=\"$(relhref(src, "."))\""),
+ script[
+ :src => requirejs_cdn,
+ Symbol("data-main") => relhref(src, ctx.documenter_js)
+ ],
+
+ script[:src => relhref(src, "siteinfo.js")],
+ script[:src => relhref(src, "../versions.js")],
+
+ # Custom user-provided assets.
+ asset_links(src, ctx.local_assets)
+ )
+end
+
+function asset_links(src::AbstractString, assets::Vector)
+ @tags link script
+ links = DOM.Node[]
+ for each in assets
+ ext = splitext(each)[end]
+ url = relhref(src, each)
+ node =
+ ext == ".ico" ? link[:href => url, :rel => "icon", :type => "image/x-icon"] :
+ ext == ".css" ? link[:href => url, :rel => "stylesheet", :type => "text/css"] :
+ ext == ".js" ? script[:src => url] : continue # Skip non-js/css files.
+ push!(links, node)
+ end
+ return links
+end
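+
+# Illustrative example, not part of the upstream source: for a page at
+# "man/guide.html" with `assets = ["assets/custom.css", "assets/favicon.ico"]`,
+# this yields a stylesheet <link> with href "../assets/custom.css" and an icon
+# <link> with href "../assets/favicon.ico"; assets with other extensions are
+# skipped by the `continue` above.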
+
+analytics_script(tracking_id::AbstractString) =
+ isempty(tracking_id) ? Tag(Symbol("#RAW#"))("") : Tag(:script)(
+ """
+ (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
+ (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
+ m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
+ })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
+
+ ga('create', '$(tracking_id)', 'auto');
+ ga('send', 'pageview');
+ """
+ )
+
+function canonical_link_element(canonical_link, src)
+ @tags link
+ if canonical_link === nothing
+ return Tag(Symbol("#RAW#"))("")
+ else
+ canonical_link_stripped = rstrip(canonical_link, '/')
+ href = "$canonical_link_stripped/$src"
+ return link[:rel => "canonical", :href => href]
+ end
+end
+
+## Search page
+# ------------
+
+function render_search(ctx)
+ @tags article body h1 header hr html li nav p span ul script
+
+ src = get_url(ctx, ctx.search_navnode)
+
+ head = render_head(ctx, ctx.search_navnode)
+ navmenu = render_navmenu(ctx, ctx.search_navnode)
+ article = article(
+ header(
+ nav(ul(li("Search"))),
+ hr(),
+ render_topbar(ctx, ctx.search_navnode),
+ ),
+ h1("Search"),
+ p["#search-info"]("Number of results: ", span["#search-results-number"]("loading...")),
+ ul["#search-results"]
+ )
+
+ htmldoc = DOM.HTMLDocument(
+ html[:lang=>"en"](
+ head,
+ body(navmenu, article),
+ script[:src => relhref(src, ctx.search_index_js)],
+ script[:src => relhref(src, ctx.search_js)],
+ )
+ )
+ open_output(ctx, ctx.search_navnode) do io
+ print(io, htmldoc)
+ end
+end
+
+# Navigation menu
+# ------------------------------------------------------------------------------
+
+function render_navmenu(ctx, navnode)
+ @tags a form h1 img input nav div select option
+
+ src = get_url(ctx, navnode)
+
+ navmenu = nav[".toc"]
+ if !isempty(ctx.logo)
+ push!(navmenu.nodes,
+ a[:href => relhref(src, "index.html")](
+ img[
+ ".logo",
+ :src => relhref(src, ctx.logo),
+ :alt => "$(ctx.doc.user.sitename) logo"
+ ]
+ )
+ )
+ end
+ push!(navmenu.nodes, h1(ctx.doc.user.sitename))
+ let version_selector = select["#version-selector", :onChange => "window.location.href=this.value"]()
+ if isempty(ctx.doc.user.version)
+ push!(version_selector.attributes, :style => "visibility: hidden")
+ else
+ push!(version_selector.nodes,
+ option[
+ :value => "#",
+ :selected => "selected",
+ ](ctx.doc.user.version)
+ )
+ end
+ push!(navmenu.nodes, version_selector)
+ end
+ push!(navmenu.nodes,
+ form[".search#search-form", :action => navhref(ctx, ctx.search_navnode, navnode)](
+ input[
+ "#search-query",
+ :name => "q",
+ :type => "text",
+ :placeholder => "Search docs",
+ ],
+ )
+ )
+ push!(navmenu.nodes, navitem(ctx, navnode))
+ navmenu
+end
+
+"""
+[`navitem`](@ref) returns the lists and list items of the navigation menu.
+It gets called recursively to construct the whole tree.
+
+It always returns a [`DOM.Node`](@ref). If there's nothing to display (e.g. the node is set
+to be invisible), it returns an empty text node (`DOM.Node("")`).
+"""
+navitem(ctx, current) = navitem(ctx, current, ctx.doc.internal.navtree)
+function navitem(ctx, current, nns::Vector)
+ nodes = map(nn -> navitem(ctx, current, nn), nns)
+ filter!(node -> node.name !== DOM.TEXT, nodes)
+ isempty(nodes) ? DOM.Node("") : DOM.Tag(:ul)(nodes)
+end
+function navitem(ctx, current, nn::Documents.NavNode)
+ @tags ul li span a
+
+ # We'll do the children first, primarily to determine if this node has any that are
+ # visible. If it has none, is not the current node, and is not marked visible itself,
+ # then we hide this node as well by returning an empty text node.
+ children = navitem(ctx, current, nn.children)
+ if nn !== current && !nn.visible && children.name === DOM.TEXT
+ return DOM.Node("")
+ end
+
+ # construct this item
+ title = mdconvert(pagetitle(ctx, nn); droplinks=true)
+ link = if nn.page === nothing
+ span[".toctext"](title)
+ else
+ a[".toctext", :href => navhref(ctx, nn, current)](title)
+ end
+ item = (nn === current) ? li[".current"](link) : li(link)
+
+ # add the subsections (2nd level headings) from the page
+ if (nn === current) && current.page !== nothing
+ subs = collect_subsections(ctx.doc.internal.pages[current.page])
+ internal_links = map(subs) do s
+ istoplevel, anchor, text = s
+ _li = istoplevel ? li[".toplevel"] : li[]
+ _li(a[".toctext", :href => anchor](mdconvert(text; droplinks=true)))
+ end
+ push!(item.nodes, ul[".internal"](internal_links))
+ end
+
+ # add the visible subsections, if any, as a single list
+ (children.name === DOM.TEXT) || push!(item.nodes, children)
+
+ item
+end
+
+
+# Article (page contents)
+# ------------------------------------------------------------------------------
+
+function render_article(ctx, navnode)
+ @tags article header footer nav ul li hr span a
+
+ header_links = map(Documents.navpath(navnode)) do nn
+ title = mdconvert(pagetitle(ctx, nn); droplinks=true)
+ nn.page === nothing ? li(title) : li(a[:href => navhref(ctx, nn, navnode)](title))
+ end
+
+ topnav = nav(ul(header_links))
+
+ # Set the logo and name for the "Edit on.." button.
+ host_type = Utilities.repo_host_from_url(ctx.doc.user.repo)
+ if host_type == Utilities.RepoGitlab
+ host = "GitLab"
+ logo = "\uf296"
+ elseif host_type == Utilities.RepoGithub
+ host = "GitHub"
+ logo = "\uf09b"
+ elseif host_type == Utilities.RepoBitbucket
+ host = "BitBucket"
+ logo = "\uf171"
+ else
+ host = ""
+ logo = "\uf15c"
+ end
+ hoststring = isempty(host) ? " source" : " on $(host)"
+
+ if !ctx.doc.user.html_disable_git
+ pageurl = get(getpage(ctx, navnode).globals.meta, :EditURL, getpage(ctx, navnode).source)
+ if Utilities.isabsurl(pageurl)
+ url = pageurl
+ else
+ if !(pageurl == getpage(ctx, navnode).source)
+ # need to set users path relative the page itself
+ pageurl = joinpath(first(splitdir(getpage(ctx, navnode).source)), pageurl)
+ end
+ url = Utilities.url(ctx.doc.user.repo, pageurl, commit=ctx.doc.user.html_edit_branch)
+ end
+ if url !== nothing
+ edit_verb = (ctx.doc.user.html_edit_branch === nothing) ? "View" : "Edit"
+ push!(topnav.nodes, a[".edit-page", :href => url](span[".fa"](logo), " $(edit_verb)$hoststring"))
+ end
+ end
+ art_header = header(topnav, hr(), render_topbar(ctx, navnode))
+
+ # build the footer with nav links
+ art_footer = footer(hr())
+ if navnode.prev !== nothing
+ direction = span[".direction"]("Previous")
+ title = span[".title"](mdconvert(pagetitle(ctx, navnode.prev); droplinks=true))
+ link = a[".previous", :href => navhref(ctx, navnode.prev, navnode)](direction, title)
+ push!(art_footer.nodes, link)
+ end
+
+ if navnode.next !== nothing
+ direction = span[".direction"]("Next")
+ title = span[".title"](mdconvert(pagetitle(ctx, navnode.next); droplinks=true))
+ link = a[".next", :href => navhref(ctx, navnode.next, navnode)](direction, title)
+ push!(art_footer.nodes, link)
+ end
+
+ pagenodes = domify(ctx, navnode)
+ article["#docs"](art_header, pagenodes, art_footer)
+end
+
+function render_topbar(ctx, navnode)
+ @tags a div span
+ page_title = string(mdflatten(pagetitle(ctx, navnode)))
+ return div["#topbar"](span(page_title), a[".fa .fa-bars", :href => "#"])
+end
+
+# expand the versions argument from the user
+# and return entries and needed symlinks
+function expand_versions(dir, versions)
+ # output: entries and symlinks
+ entries = String[]
+ symlinks = Pair{String,String}[]
+
+ # read folders and filter out symlinks
+ available_folders = readdir(dir)
+ cd(() -> filter!(!islink, available_folders), dir)
+
+ # filter and sort release folders
+ vnum(x) = VersionNumber(x)
+ version_folders = [x for x in available_folders if occursin(Base.VERSION_REGEX, x)]
+ sort!(version_folders, lt = (x, y) -> vnum(x) < vnum(y), rev = true)
+ release_folders = filter(x -> (v = vnum(x); v.prerelease == () && v.build == ()), version_folders)
+ # pre_release_folders = filter(x -> (v = vnum(x); v.prerelease != () || v.build != ()), version_folders)
+ major_folders = filter!(x -> (v = vnum(x); v.major != 0),
+ unique(x -> (v = vnum(x); v.major), release_folders))
+ minor_folders = filter!(x -> (v = vnum(x); !(v.major == 0 && v.minor == 0)),
+ unique(x -> (v = vnum(x); (v.major, v.minor)), release_folders))
+ patch_folders = unique(x -> (v = vnum(x); (v.major, v.minor, v.patch)), release_folders)
+
+ filter!(x -> vnum(x) !== 0, major_folders)
+
+ # populate output
+ for entry in versions
+ if entry == "v#" # one doc per major release
+ for x in major_folders
+ vstr = "v$(vnum(x).major).$(vnum(x).minor)"
+ push!(entries, vstr)
+ push!(symlinks, vstr => x)
+ end
+ elseif entry == "v#.#" # one doc per minor release
+ for x in minor_folders
+ vstr = "v$(vnum(x).major).$(vnum(x).minor)"
+ push!(entries, vstr)
+ push!(symlinks, vstr => x)
+ end
+ elseif entry == "v#.#.#" # one doc per patch release
+ for x in patch_folders
+ vstr = "v$(vnum(x).major).$(vnum(x).minor).$(vnum(x).patch)"
+ push!(entries, vstr)
+ push!(symlinks, vstr => x)
+ end
+ elseif entry == "v^" || (entry isa Pair && entry.second == "v^")
+ if !isempty(release_folders)
+ x = first(release_folders)
+ vstr = isa(entry, Pair) ? entry.first : "v$(vnum(x).major).$(vnum(x).minor)"
+ push!(entries, vstr)
+ push!(symlinks, vstr => x)
+ end
+ elseif entry isa Pair
+ k, v = entry
+ i = findfirst(==(v), available_folders)
+ if i === nothing
+ @info("no match for `versions` entry `$(repr(entry))`")
+ else
+ push!(entries, k)
+ push!(symlinks, k => v)
+ end
+ else
+ @info("no match for `versions` entry `$(repr(entry))`")
+ end
+ end
+ unique!(entries) # remove any duplicates
+
+ # generate remaining symlinks
+ foreach(x -> push!(symlinks, "v$(vnum(x).major)" => x), major_folders)
+ foreach(x -> push!(symlinks, "v$(vnum(x).major).$(vnum(x).minor)" => x), minor_folders)
+ foreach(x -> push!(symlinks, "v$(vnum(x).major).$(vnum(x).minor).$(vnum(x).patch)" => x), patch_folders)
+ filter!(x -> x.first != x.second, unique!(symlinks))
+
+ # assert that none of the links point to another link
+ for link in symlinks
+ i = findfirst(x -> link.first == x.second, symlinks)
+ if i !== nothing
+ throw(ArgumentError("link `$(link)` incompatible with link `$(symlinks[i])`."))
+ end
+ end
+
+ return entries, symlinks
+end
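+
+# Illustrative sketch, not part of the upstream source. Assuming `dir` contains
+# the folders "1.0.1", "1.1.0" and "dev", a call such as
+#
+#   expand_versions(dir, ["stable" => "v^", "v#.#", "dev" => "dev"])
+#
+# would return entries like ["stable", "v1.1", "v1.0", "dev"] and symlinks like
+# ["stable" => "1.1.0", "v1.1" => "1.1.0", "v1.0" => "1.0.1", "v1" => "1.1.0",
+#  "v1.1.0" => "1.1.0", "v1.0.1" => "1.0.1"]; the "dev" => "dev" pair is dropped
+# because the link name and target coincide.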
+
+# write version file
+function generate_version_file(versionfile::AbstractString, entries)
+ open(versionfile, "w") do buf
+ println(buf, "var DOC_VERSIONS = [")
+ for folder in entries
+ println(buf, " \"", folder, "\",")
+ end
+ println(buf, "];")
+ end
+end
+
+function generate_siteinfo_file(dir::AbstractString, version::AbstractString)
+ open(joinpath(dir, "siteinfo.js"), "w") do buf
+ println(buf, "var DOCUMENTER_CURRENT_VERSION = \"$(version)\";")
+ end
+end
+
+## domify(...)
+# ------------
+
+"""
+Recursively converts a [`Documents.Page`](@ref), `Markdown`, or Documenter
+`*Node` object into an HTML DOM.
+"""
+function domify(ctx, navnode)
+ page = getpage(ctx, navnode)
+ sib = SearchIndexBuffer(ctx, navnode)
+ ret = map(page.elements) do elem
+ search_append(sib, elem)
+ domify(ctx, navnode, page.mapping[elem])
+ end
+ search_flush(sib)
+ ret
+end
+
+mutable struct SearchIndexBuffer
+ ctx :: HTMLContext
+ src :: String
+ page :: Documents.Page
+ loc :: String
+ category :: Symbol
+ title :: String
+ page_title :: String
+ buffer :: IOBuffer
+ function SearchIndexBuffer(ctx, navnode)
+ page_title = mdflatten(pagetitle(ctx, navnode))
+ new(
+ ctx,
+ pretty_url(ctx, get_url(ctx, navnode.page)),
+ getpage(ctx, navnode),
+ "",
+ :page,
+ page_title,
+ page_title,
+ IOBuffer()
+ )
+ end
+end
+
+function search_append(sib, node::Markdown.Header)
+ search_flush(sib)
+ sib.category = :section
+ sib.title = mdflatten(node)
+ a = sib.page.mapping[node]
+ sib.loc = "$(a.id)-$(a.nth)"
+end
+
+search_append(sib, node) = mdflatten(sib.buffer, node)
+
+function search_flush(sib)
+ # Replace any backslashes in links, if building the docs on Windows
+ src = replace(sib.src, '\\' => '/')
+ ref = "$(src)#$(sib.loc)"
+ text = String(take!(sib.buffer))
+ println(sib.ctx.search_index, """
+ {
+ "location": "$(jsescape(ref))",
+ "page": "$(jsescape(sib.page_title))",
+ "title": "$(jsescape(sib.title))",
+ "category": "$(jsescape(lowercase(string(sib.category))))",
+ "text": "$(jsescape(text))"
+ },
+ """)
+end
+
+"""
+Replaces some of the characters in the string with escape sequences so that the string
+becomes a valid JS string literal, as per the
+[ECMAScript® 2017 standard](https://www.ecma-international.org/ecma-262/8.0/index.html#sec-literals-string-literals).
+
+Note that it always escapes both potential `"` and `'` closing quotes.
+"""
+function jsescape(s)
+ b = IOBuffer()
+ # From the ECMAScript® 2017 standard:
+ #
+ # > All code points may appear literally in a string literal except for the closing
+ # > quote code points, U+005C (REVERSE SOLIDUS), U+000D (CARRIAGE RETURN), U+2028 (LINE
+ # > SEPARATOR), U+2029 (PARAGRAPH SEPARATOR), and U+000A (LINE FEED).
+ #
+ # https://www.ecma-international.org/ecma-262/8.0/index.html#sec-literals-string-literals
+ for c in s
+ if c === '\u000a' # LINE FEED, i.e. \n
+ write(b, "\\n")
+ elseif c === '\u000d' # CARRIAGE RETURN, i.e. \r
+ write(b, "\\r")
+ elseif c === '\u005c' # REVERSE SOLIDUS, i.e. \
+ write(b, "\\\\")
+ elseif c === '\u0022' # QUOTATION MARK, i.e. "
+ write(b, "\\\"")
+ elseif c === '\u0027' # APOSTROPHE, i.e. '
+ write(b, "\\'")
+ elseif c === '\u2028' # LINE SEPARATOR
+ write(b, "\\u2028")
+ elseif c === '\u2029' # PARAGRAPH SEPARATOR
+ write(b, "\\u2029")
+ else
+ write(b, c)
+ end
+ end
+ String(take!(b))
+end
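+
+# Illustrative example, not part of the upstream source: a title such as
+#
+#   It's a "quoted"
+#   title
+#
+# comes out as `It\'s a \"quoted\"\ntitle`, i.e. with both quote characters
+# escaped and the line break turned into a literal `\n`, so it can be embedded
+# safely in the JS string literals written by `search_flush`.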
+
+function domify(ctx, navnode, node)
+ fixlinks!(ctx, navnode, node)
+ mdconvert(node, Markdown.MD())
+end
+
+function domify(ctx, navnode, anchor::Anchors.Anchor)
+ @tags a
+ aid = "$(anchor.id)-$(anchor.nth)"
+ if isa(anchor.object, Markdown.Header)
+ h = anchor.object
+ fixlinks!(ctx, navnode, h)
+ DOM.Tag(Symbol("h$(Utilities.header_level(h))"))(
+ a[".nav-anchor", :id => aid, :href => "#$aid"](mdconvert(h.text, h))
+ )
+ else
+ a[".nav-anchor", :id => aid, :href => "#$aid"](domify(ctx, navnode, anchor.object))
+ end
+end
+
+
+struct ListBuilder
+ es::Vector
+end
+ListBuilder() = ListBuilder([])
+
+import Base: push!
+function push!(lb::ListBuilder, level, node)
+ @assert level >= 1
+ if level == 1
+ push!(lb.es, node)
+ else
+ if isempty(lb.es) || typeof(last(lb.es)) !== ListBuilder
+ push!(lb.es, ListBuilder())
+ end
+ push!(last(lb.es), level-1, node)
+ end
+end
+
+function domify(lb::ListBuilder)
+ @tags ul li
+ ul(map(e -> isa(e, ListBuilder) ? domify(e) : li(e), lb.es))
+end
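+
+# Illustrative sketch, not part of the upstream source: pushing nodes at
+# increasing levels produces nested lists, e.g.
+#
+#   lb = ListBuilder()
+#   push!(lb, 1, "Section")
+#   push!(lb, 2, "Subsection")
+#   push!(lb, 1, "Another section")
+#   domify(lb)
+#
+# yields ul(li("Section"), ul(li("Subsection")), li("Another section")).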
+
+function domify(ctx, navnode, contents::Documents.ContentsNode)
+ @tags a
+ navnode_dir = dirname(navnode.page)
+ navnode_url = get_url(ctx, navnode)
+ lb = ListBuilder()
+ for (count, path, anchor) in contents.elements
+ path = joinpath(navnode_dir, path) # links in ContentsNodes are relative to current page
+ path = pretty_url(ctx, relhref(navnode_url, get_url(ctx, path)))
+ header = anchor.object
+ url = string(path, '#', anchor.id, '-', anchor.nth)
+ node = a[:href=>url](mdconvert(header.text; droplinks=true))
+ level = Utilities.header_level(header)
+ push!(lb, level, node)
+ end
+ domify(lb)
+end
+
+function domify(ctx, navnode, index::Documents.IndexNode)
+ @tags a code li ul
+ navnode_dir = dirname(navnode.page)
+ navnode_url = get_url(ctx, navnode)
+ lis = map(index.elements) do el
+ object, doc, path, mod, cat = el
+ path = joinpath(navnode_dir, path) # links in IndexNodes are relative to current page
+ path = pretty_url(ctx, relhref(navnode_url, get_url(ctx, path)))
+ url = string(path, "#", Utilities.slugify(object))
+ li(a[:href=>url](code("$(object.binding)")))
+ end
+ ul(lis)
+end
+
+function domify(ctx, navnode, docs::Documents.DocsNodes)
+ [domify(ctx, navnode, node) for node in docs.nodes]
+end
+
+function domify(ctx, navnode, node::Documents.DocsNode)
+ @tags a code div section span
+
+ # push to search index
+ sib = SearchIndexBuffer(ctx, navnode)
+ sib.loc = node.anchor.id
+ sib.title = string(node.object.binding)
+ sib.category = Symbol(Utilities.doccat(node.object))
+ mdflatten(sib.buffer, node.docstr)
+ search_flush(sib)
+
+ section[".docstring"](
+ div[".docstring-header"](
+ a[".docstring-binding", :id=>node.anchor.id, :href=>"#$(node.anchor.id)"](code("$(node.object.binding)")),
+ " — ", # —
+ span[".docstring-category"]("$(Utilities.doccat(node.object))"),
+ "."
+ ),
+ domify_doc(ctx, navnode, node.docstr)
+ )
+end
+
+function domify_doc(ctx, navnode, md::Markdown.MD)
+ @tags a
+ if haskey(md.meta, :results)
+ # The `:results` field contains a vector of `Docs.DocStr` objects associated with
+ # each markdown object. The `DocStr` contains data such as file and line info that
+ # we need for generating correct source links.
+ map(zip(md.content, md.meta[:results])) do md
+ markdown, result = md
+ ret = Any[domify(ctx, navnode, Writers.MarkdownWriter.dropheaders(markdown))]
+ # When a source link is available then print the link.
+ if !ctx.doc.user.html_disable_git
+ url = Utilities.url(ctx.doc.internal.remote, ctx.doc.user.repo, result)
+ if url !== nothing
+ push!(ret, a[".source-link", :target=>"_blank", :href=>url]("source"))
+ end
+ end
+ ret
+ end
+ else
+ # Docstrings with no `:results` metadata won't contain source locations so we don't
+ # try to print them out. Just print the basic docstring.
+ domify(ctx, navnode, Writers.MarkdownWriter.dropheaders(md))
+ end
+end
+
+function domify(ctx, navnode, node::Documents.EvalNode)
+ node.result === nothing ? DOM.Node[] : domify(ctx, navnode, node.result)
+end
+
+# nothing to show for MetaNodes, so we just return an empty list
+domify(ctx, navnode, node::Documents.MetaNode) = DOM.Node[]
+
+function domify(ctx, navnode, raw::Documents.RawNode)
+ raw.name === :html ? Tag(Symbol("#RAW#"))(raw.text) : DOM.Node[]
+end
+
+
+# Utilities
+# ------------------------------------------------------------------------------
+
+"""
+Opens the output file of the `navnode` in write mode. If necessary, the path to the output
+file is created before opening the file.
+"""
+function open_output(f, ctx, navnode)
+ path = joinpath(ctx.doc.user.build, get_url(ctx, navnode))
+ isdir(dirname(path)) || mkpath(dirname(path))
+ open(f, path, "w")
+end
+
+"""
+Get the relative hyperlink between two [`Documents.NavNode`](@ref)s. Assumes that both
+[`Documents.NavNode`](@ref)s have an associated [`Documents.Page`](@ref) (i.e. `.page`
+is not `nothing`).
+"""
+navhref(ctx, to, from) = pretty_url(ctx, relhref(get_url(ctx, from), get_url(ctx, to)))
+
+"""
+Calculates a relative HTML link from one path to another.
+"""
+function relhref(from, to)
+ pagedir = dirname(from)
+ # The regex separator replacement is necessary since otherwise building the docs on
+ # Windows will result in paths that have `//` separators which break asset inclusion.
+ replace(relpath(to, isempty(pagedir) ? "." : pagedir), r"[/\\]+" => "/")
+end
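+
+# For example (illustrative, not part of the upstream source),
+# relhref("man/guide.html", "assets/search.js") returns "../assets/search.js",
+# while relhref("index.html", "assets/search.js") returns "assets/search.js".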
+
+"""
+Returns the full path corresponding to a path of a `.md` page file. Both the input and output
+paths are assumed to be relative to `src/`.
+"""
+function get_url(ctx, path::AbstractString)
+ if ctx.doc.user.html_prettyurls
+ d = if basename(path) == "index.md"
+ dirname(path)
+ else
+ first(splitext(path))
+ end
+ isempty(d) ? "index.html" : "$d/index.html"
+ else
+ Formats.extension(:html, path)
+ end
+end
+
+"""
+Returns the full path of a [`Documents.NavNode`](@ref) relative to `src/`.
+"""
+get_url(ctx, navnode::Documents.NavNode) = get_url(ctx, navnode.page)
+
+"""
+If `html_prettyurls` is enabled, returns a "pretty" version of the `path` which can then be
+used in links in the resulting HTML file.
+"""
+function pretty_url(ctx, path::AbstractString)
+ if ctx.doc.user.html_prettyurls
+ dir, file = splitdir(path)
+ if file == "index.html"
+ return length(dir) == 0 ? "" : "$(dir)/"
+ end
+ end
+ return path
+end
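+
+# Illustrative example, not part of the upstream source, assuming
+# `html_prettyurls = true`:
+#
+#   get_url(ctx, "man/guide.md")             # -> "man/guide/index.html"
+#   get_url(ctx, "index.md")                 # -> "index.html"
+#   pretty_url(ctx, "man/guide/index.html")  # -> "man/guide/"
+#
+# With `html_prettyurls = false` the page keeps a flat path such as
+# "man/guide.html" (via `Formats.extension`).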
+
+"""
+Tries to guess the page title by looking at the `<h1>` headers and returns the
+header contents of the first `<h1>` on a page (or `nothing` if the algorithm
+was unable to find any `<h1>` headers).
+"""
+function pagetitle(page::Documents.Page)
+ title = nothing
+ for element in page.elements
+ if isa(element, Markdown.Header{1})
+ title = element.text
+ break
+ end
+ end
+ title
+end
+
+function pagetitle(ctx, navnode::Documents.NavNode)
+ if navnode.title_override !== nothing
+ # parse title_override as markdown
+ md = Markdown.parse(navnode.title_override)
+ # Markdown.parse results in a paragraph so we need to strip that
+ if !(length(md.content) === 1 && isa(first(md.content), Markdown.Paragraph))
+ error("Bad Markdown provided for page title: '$(navnode.title_override)'")
+ end
+ return first(md.content).content
+ end
+
+ if navnode.page !== nothing
+ title = pagetitle(getpage(ctx, navnode))
+ title === nothing || return title
+ end
+
+ "-"
+end
+
+"""
+Returns an ordered list of tuples, `(toplevel, anchor, text)`, corresponding to level 1 and 2
+headings on the `page`. Note that if the first header on the `page` is a level 1 header then
+it is not included -- it is assumed to be the page title and so does not need to be included
+in the navigation menu twice.
+"""
+function collect_subsections(page::Documents.Page)
+ sections = []
+ title_found = false
+ for element in page.elements
+ if isa(element, Markdown.Header) && Utilities.header_level(element) < 3
+ toplevel = Utilities.header_level(element) === 1
+ # Don't include the first header if it is `h1`.
+ if toplevel && isempty(sections) && !title_found
+ title_found = true
+ continue
+ end
+ anchor = page.mapping[element]
+ push!(sections, (toplevel, "#$(anchor.id)-$(anchor.nth)", element.text))
+ end
+ end
+ return sections
+end
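+
+# Illustrative sketch, not part of the upstream source: for a page containing
+#
+#   # Guide
+#   ## Installation
+#   ## Usage
+#
+# the leading `# Guide` is skipped as the page title and the result is roughly
+# [(false, "#Installation-1", text), (false, "#Usage-1", text)], where the exact
+# anchor fragments depend on the ids assigned by `Anchors`.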
+
+
+# mdconvert
+# ------------------------------------------------------------------------------
+
+const md_block_nodes = [Markdown.MD, Markdown.BlockQuote]
+push!(md_block_nodes, Markdown.List)
+push!(md_block_nodes, Markdown.Admonition)
+
+"""
+[`MDBlockContext`](@ref) is a union of all the Markdown nodes whose children should
+be blocks. It can be used to dispatch on all the block-context nodes at once.
+"""
+const MDBlockContext = Union{md_block_nodes...}
+
+"""
+Convert a markdown object to a `DOM.Node` object.
+
+The `parent` argument is passed to allow for context-dependent conversions.
+"""
+mdconvert(md; kwargs...) = mdconvert(md, md; kwargs...)
+
+mdconvert(text::AbstractString, parent; kwargs...) = DOM.Node(text)
+
+mdconvert(vec::Vector, parent; kwargs...) = [mdconvert(x, parent; kwargs...) for x in vec]
+
+mdconvert(md::Markdown.MD, parent; kwargs...) = Tag(:div)(mdconvert(md.content, md; kwargs...))
+
+mdconvert(b::Markdown.BlockQuote, parent; kwargs...) = Tag(:blockquote)(mdconvert(b.content, b; kwargs...))
+
+mdconvert(b::Markdown.Bold, parent; kwargs...) = Tag(:strong)(mdconvert(b.text, parent; kwargs...))
+
+function mdconvert(c::Markdown.Code, parent::MDBlockContext; kwargs...)
+ @tags pre code
+ language = isempty(c.language) ? "none" : c.language
+ pre(code[".language-$(language)"](c.code))
+end
+mdconvert(c::Markdown.Code, parent; kwargs...) = Tag(:code)(c.code)
+
+mdconvert(h::Markdown.Header{N}, parent; kwargs...) where {N} = DOM.Tag(Symbol("h$N"))(mdconvert(h.text, h; kwargs...))
+
+mdconvert(::Markdown.HorizontalRule, parent; kwargs...) = Tag(:hr)()
+
+mdconvert(i::Markdown.Image, parent; kwargs...) = Tag(:img)[:src => i.url, :alt => i.alt]
+
+mdconvert(i::Markdown.Italic, parent; kwargs...) = Tag(:em)(mdconvert(i.text, i; kwargs...))
+
+mdconvert(m::Markdown.LaTeX, ::MDBlockContext; kwargs...) = Tag(:div)(string("\\[", m.formula, "\\]"))
+mdconvert(m::Markdown.LaTeX, parent; kwargs...) = Tag(:span)(string('$', m.formula, '$'))
+
+mdconvert(::Markdown.LineBreak, parent; kwargs...) = Tag(:br)()
+
+function mdconvert(link::Markdown.Link, parent; droplinks=false, kwargs...)
+ link_text = mdconvert(link.text, link; droplinks=droplinks, kwargs...)
+ droplinks ? link_text : Tag(:a)[:href => link.url](link_text)
+end
+
+mdconvert(list::Markdown.List, parent; kwargs...) = (Markdown.isordered(list) ? Tag(:ol) : Tag(:ul))(map(Tag(:li), mdconvert(list.items, list; kwargs...)))
+
+mdconvert(paragraph::Markdown.Paragraph, parent; kwargs...) = Tag(:p)(mdconvert(paragraph.content, paragraph; kwargs...))
+
+# For compatibility with Julia versions before Markdown.List got the `loose` field (Julia PR #26598).
+const list_has_loose_field = :loose in fieldnames(Markdown.List)
+function mdconvert(paragraph::Markdown.Paragraph, parent::Markdown.List; kwargs...)
+ content = mdconvert(paragraph.content, paragraph; kwargs...)
+ return (list_has_loose_field && !parent.loose) ? content : Tag(:p)(content)
+end
+
+mdconvert(t::Markdown.Table, parent; kwargs...) = Tag(:table)(
+ Tag(:tr)(map(x -> Tag(:th)(mdconvert(x, t; kwargs...)), t.rows[1])),
+ map(x -> Tag(:tr)(map(y -> Tag(:td)(mdconvert(y, x; kwargs...)), x)), t.rows[2:end])
+)
+
+mdconvert(expr::Union{Expr,Symbol}, parent; kwargs...) = string(expr)
+
+mdconvert(f::Markdown.Footnote, parent; kwargs...) = footnote(f.id, f.text, parent; kwargs...)
+footnote(id, text::Nothing, parent; kwargs...) = Tag(:a)[:href => "#footnote-$(id)"]("[$id]")
+function footnote(id, text, parent; kwargs...)
+ Tag(:div)[".footnote#footnote-$(id)"](
+ Tag(:a)[:href => "#footnote-$(id)"](Tag(:strong)("[$id]")),
+ mdconvert(text, parent; kwargs...),
+ )
+end
+
+function mdconvert(a::Markdown.Admonition, parent; kwargs...)
+ @tags div
+ div[".admonition.$(a.category)"](
+ div[".admonition-title"](a.title),
+ div[".admonition-text"](mdconvert(a.content, a; kwargs...))
+ )
+end
+
+mdconvert(html::Documents.RawHTML, parent; kwargs...) = Tag(Symbol("#RAW#"))(html.code)
+
+
+# fixlinks!
+# ------------------------------------------------------------------------------
+
+"""
+Replaces URLs in `Markdown.Link` elements (if they point to a local `.md` page) with the
+actual URLs.
+"""
+function fixlinks!(ctx, navnode, link::Markdown.Link)
+ fixlinks!(ctx, navnode, link.text)
+ Utilities.isabsurl(link.url) && return
+
+ # links starting with a # are references within the same file -- there's nothing to fix
+ # for such links
+ startswith(link.url, '#') && return
+
+ s = split(link.url, "#", limit = 2)
+ if Sys.iswindows() && ':' in first(s)
+ Utilities.warn("Invalid local link: colons not allowed in paths on Windows\n '$(link.url)' in $(navnode.page)")
+ return
+ end
+ path = normpath(joinpath(dirname(navnode.page), first(s)))
+
+ if endswith(path, ".md") && path in keys(ctx.doc.internal.pages)
+ # make sure that links to different valid pages are correct
+ path = pretty_url(ctx, relhref(get_url(ctx, navnode), get_url(ctx, path)))
+ elseif isfile(joinpath(ctx.doc.user.build, path))
+ # update links to other files that are present in build/ (e.g. either user
+ # provided files or generated by code examples)
+ path = relhref(get_url(ctx, navnode), path)
+ else
+ Utilities.warn("Invalid local link: unresolved path\n '$(link.url)' in $(navnode.page)")
+ end
+
+ # Replace any backslashes in links, if building the docs on Windows
+ path = replace(path, '\\' => '/')
+ link.url = (length(s) > 1) ? "$path#$(last(s))" : String(path)
+end
+
+function fixlinks!(ctx, navnode, img::Markdown.Image)
+ Utilities.isabsurl(img.url) && return
+
+ if Sys.iswindows() && ':' in img.url
+ Utilities.warn("Invalid local image: colons not allowed in paths on Windows\n '$(img.url)' in $(navnode.page)")
+ return
+ end
+
+ path = joinpath(dirname(navnode.page), img.url)
+ if isfile(joinpath(ctx.doc.user.build, path))
+ path = relhref(get_url(ctx, navnode), path)
+ # Replace any backslashes in links, if building the docs on Windows
+ img.url = replace(path, '\\' => '/')
+ else
+ Utilities.warn("Invalid local image: unresolved path\n '$(img.url)' in `$(navnode.page)`")
+ end
+end
+
+fixlinks!(ctx, navnode, md::Markdown.MD) = fixlinks!(ctx, navnode, md.content)
+function fixlinks!(ctx, navnode, a::Markdown.Admonition)
+ fixlinks!(ctx, navnode, a.title)
+ fixlinks!(ctx, navnode, a.content)
+end
+fixlinks!(ctx, navnode, b::Markdown.BlockQuote) = fixlinks!(ctx, navnode, b.content)
+fixlinks!(ctx, navnode, b::Markdown.Bold) = fixlinks!(ctx, navnode, b.text)
+fixlinks!(ctx, navnode, f::Markdown.Footnote) = fixlinks!(ctx, navnode, f.text)
+fixlinks!(ctx, navnode, h::Markdown.Header) = fixlinks!(ctx, navnode, h.text)
+fixlinks!(ctx, navnode, i::Markdown.Italic) = fixlinks!(ctx, navnode, i.text)
+fixlinks!(ctx, navnode, list::Markdown.List) = fixlinks!(ctx, navnode, list.items)
+fixlinks!(ctx, navnode, p::Markdown.Paragraph) = fixlinks!(ctx, navnode, p.content)
+fixlinks!(ctx, navnode, t::Markdown.Table) = fixlinks!(ctx, navnode, t.rows)
+
+fixlinks!(ctx, navnode, mds::Vector) = map(md -> fixlinks!(ctx, navnode, md), mds)
+fixlinks!(ctx, navnode, md) = nothing
+
+# TODO: do some regex-magic in raw HTML blocks? Currently ignored.
+#fixlinks!(ctx, navnode, md::Documents.RawHTML) = ...
+
+end
--- /dev/null
+"""
+A module for rendering `Document` objects to LaTeX and PDF.
+
+# Keywords
+
+[`LaTeXWriter`](@ref) uses the following additional keyword arguments that can be passed to
+[`Documenter.makedocs`](@ref): `authors`, `sitename`.
+
+**`sitename`** is the site's title displayed in the title bar and at the top of the
+navigation menu. It goes into the `\\title` LaTeX command.
+
+**`authors`** can be used to specify the authors of the document. It goes into the `\\author` LaTeX command.
+
+"""
+module LaTeXWriter
+
+import ...Documenter:
+ Anchors,
+ Builder,
+ Documents,
+ Expanders,
+ Formats,
+ Documenter,
+ Utilities,
+ Writers
+
+import Markdown
+
+mutable struct Context{I <: IO} <: IO
+ io::I
+ in_header::Bool
+ footnotes::Dict{String, Int}
+ depth::Int
+ filename::String # currently active source file
+end
+Context(io) = Context{typeof(io)}(io, false, Dict(), 1, "")
+
+_print(c::Context, args...) = Base.print(c.io, args...)
+_println(c::Context, args...) = Base.println(c.io, args...)
+
+
+const STYLE = joinpath(dirname(@__FILE__), "..", "..", "assets", "latex", "documenter.sty")
+
+hastex() = (try; success(`latexmk -version`); catch; false; end)
+
+const DOCUMENT_STRUCTURE = (
+ "part",
+ "chapter",
+ "section",
+ "subsection",
+ "subsubsection",
+ "paragraph",
+ "subparagraph",
+)
+
+function render(doc::Documents.Document)
+ mktempdir() do path
+ cp(joinpath(doc.user.root, doc.user.build), joinpath(path, "build"))
+ cd(joinpath(path, "build")) do
+ file = replace("$(doc.user.sitename).tex", " " => "")
+ open(file, "w") do io
+ context = Context(io)
+ writeheader(context, doc)
+ for (title, filename, depth) in files(doc.user.pages)
+ context.filename = filename
+ empty!(context.footnotes)
+ if 1 <= depth <= length(DOCUMENT_STRUCTURE)
+ header_type = DOCUMENT_STRUCTURE[depth]
+ header_text = "\n\\$(header_type){$(title)}\n"
+ if isempty(filename)
+ _println(context, header_text)
+ else
+ path = normpath(filename)
+ page = doc.internal.pages[path]
+ if get(page.globals.meta, :IgnorePage, :none) !== :latex
+ context.depth = depth + (isempty(title) ? 0 : 1)
+ context.depth > depth && _println(context, header_text)
+ latex(context, page, doc)
+ end
+ end
+ end
+ end
+ writefooter(context, doc)
+ end
+ cp(STYLE, "documenter.sty")
+ if hastex()
+ outdir = joinpath(doc.user.root, doc.user.build)
+ pdf = replace("$(doc.user.sitename).pdf", " " => "")
+ try
+ run(`latexmk -silent -f -interaction=nonstopmode -view=none -lualatex -shell-escape $file`)
+ catch err
+ Utilities.warn("failed to compile. Check generated LaTeX file.")
+ cp(file, joinpath(outdir, file); force = true)
+ end
+ cp(pdf, joinpath(outdir, pdf); force = true)
+ else
+ Utilities.warn("`latexmk` and `lualatex` required for PDF generation.")
+ end
+ end
+ end
+end
+
+function writeheader(io::IO, doc::Documents.Document)
+ custom = joinpath(doc.user.root, doc.user.source, "assets", "custom.sty")
+ isfile(custom) ? cp(custom, "custom.sty"; force = true) : touch("custom.sty")
+ preamble =
+ """
+ \\documentclass{memoir}
+
+ \\usepackage{./documenter}
+ \\usepackage{./custom}
+
+ \\title{$(doc.user.sitename)}
+ \\author{$(doc.user.authors)}
+
+ \\begin{document}
+
+ \\frontmatter
+ \\maketitle
+ \\tableofcontents
+
+ \\mainmatter
+
+ """
+ _println(io, preamble)
+end
+
+function writefooter(io::IO, doc::Documents.Document)
+ _println(io, "\n\\end{document}")
+end
+
+function latex(io::IO, page::Documents.Page, doc::Documents.Document)
+ for element in page.elements
+ latex(io, page.mapping[element], page, doc)
+ end
+end
+
+function latex(io::IO, vec::Vector, page, doc)
+ for each in vec
+ latex(io, each, page, doc)
+ end
+end
+
+function latex(io::IO, anchor::Anchors.Anchor, page, doc)
+ id = string(hash(string(anchor.id, "-", anchor.nth)))
+ _println(io, "\n\\hypertarget{", id, "}{}\n")
+ latex(io, anchor.object, page, doc)
+end
+
+
+## Documentation Nodes.
+
+function latex(io::IO, node::Documents.DocsNodes, page, doc)
+ for node in node.nodes
+ latex(io, node, page, doc)
+ end
+end
+
+function latex(io::IO, node::Documents.DocsNode, page, doc)
+ id = string(hash(string(node.anchor.id)))
+ # Docstring header based on the name of the binding and its category.
+ _println(io, "\\hypertarget{", id, "}{} ")
+ _print(io, "\\hyperlink{", id, "}{\\texttt{")
+ latexesc(io, string(node.object.binding))
+ _print(io, "}} ")
+ _println(io, " -- {", Utilities.doccat(node.object), ".}\n")
+ # Body. May contain several concatenated docstrings.
+ _println(io, "\\begin{adjustwidth}{2em}{0pt}")
+ latexdoc(io, node.docstr, page, doc)
+ _println(io, "\n\\end{adjustwidth}")
+end
+
+function latexdoc(io::IO, md::Markdown.MD, page, doc)
+ if haskey(md.meta, :results)
+ # The `:results` field contains a vector of `Docs.DocStr` objects associated with
+ # each markdown object. The `DocStr` contains data such as file and line info that
+ # we need for generating correct source links.
+ for (markdown, result) in zip(md.content, md.meta[:results])
+ latex(io, Writers.MarkdownWriter.dropheaders(markdown), page, doc)
+ # When a source link is available then print the link.
+ url = Utilities.url(doc.internal.remote, doc.user.repo, result)
+ if url !== nothing
+ link = "\\href{$url}{\\texttt{source}}"
+ _println(io, "\n", link, "\n")
+ end
+ end
+ else
+ # Docstrings with no `:results` metadata won't contain source locations so we don't
+ # try to print them out. Just print the basic docstring.
+ latex(io, Writers.MarkdownWriter.dropheaders(md), page, doc)
+ end
+end
+
+function latexdoc(io::IO, other, page, doc)
+ # TODO: properly support non-markdown docstrings at some point.
+ latex(io, other, page, doc)
+end
+
+
+## Index, Contents, and Eval Nodes.
+
+function latex(io::IO, index::Documents.IndexNode, page, doc)
+ _println(io, "\\begin{itemize}")
+ for (object, _, page, mod, cat) in index.elements
+ id = string(hash(string(Utilities.slugify(object))))
+ text = string(object.binding)
+ _print(io, "\\item \\hyperlink{")
+ _print(io, id, "}{\\texttt{")
+ latexesc(io, text)
+ _println(io, "}}")
+ end
+ _println(io, "\\end{itemize}\n")
+end
+
+function latex(io::IO, contents::Documents.ContentsNode, page, doc)
+ depth = 1
+ needs_end = false
+ _println(io, "\\begin{itemize}")
+ for (count, path, anchor) in contents.elements
+ header = anchor.object
+ level = Utilities.header_level(header)
+ id = string(hash(string(anchor.id, "-", anchor.nth)))
+ level < depth && _println(io, "\\end{itemize}")
+ level > depth && (_println(io, "\\begin{itemize}"); needs_end = true)
+ _print(io, "\\item \\hyperlink{", id, "}{")
+ latexinline(io, header.text)
+ _println(io, "}")
+ depth = level
+ end
+ needs_end && _println(io, "\\end{itemize}")
+ _println(io, "\\end{itemize}")
+ _println(io)
+end
+
+function latex(io::IO, node::Documents.EvalNode, page, doc)
+ node.result === nothing ? nothing : latex(io, node.result, page, doc)
+end
+
+
+## Basic Nodes. AKA: any other content that hasn't been handled yet.
+
+latex(io::IO, str::AbstractString, page, doc) = _print(io, str)
+
+function latex(io::IO, other, page, doc)
+ _println(io)
+ latex(io, other)
+ _println(io)
+end
+
+latex(io::IO, md::Markdown.MD) = latex(io, md.content)
+
+function latex(io::IO, content::Vector)
+ for c in content
+ latex(io, c)
+ end
+end
+
+function latex(io::IO, h::Markdown.Header{N}) where N
+ tag = DOCUMENT_STRUCTURE[min(io.depth + N - 1, length(DOCUMENT_STRUCTURE))]
+ _print(io, "\\", tag, "{")
+ io.in_header = true
+ latexinline(io, h.text)
+ io.in_header = false
+ _println(io, "}\n")
+end
+
+# Whitelisted lexers.
+const LEXER = Set([
+ "julia",
+ "jlcon",
+])
+
+function latex(io::IO, code::Markdown.Code)
+ language = isempty(code.language) ? "none" : code.language
+ # the julia-repl is called "jlcon" in Pygments
+ language = (language == "julia-repl") ? "jlcon" : language
+ if language in LEXER
+ _print(io, "\n\\begin{minted}")
+ _println(io, "{", language, "}")
+ _println(io, code.code)
+ _println(io, "\\end{minted}\n")
+ else
+ _println(io, "\n\\begin{lstlisting}")
+ _println(io, code.code)
+ _println(io, "\\end{lstlisting}\n")
+ end
+end
+
+function latexinline(io::IO, code::Markdown.Code)
+ _print(io, "\\texttt{")
+ latexesc(io, code.code)
+ _print(io, "}")
+end
+
+function latex(io::IO, md::Markdown.Paragraph)
+ for md in md.content
+ latexinline(io, md)
+ end
+ _println(io, "\n")
+end
+
+function latex(io::IO, md::Markdown.BlockQuote)
+ wrapblock(io, "quote") do
+ latex(io, md.content)
+ end
+end
+
+function latex(io::IO, md::Markdown.Admonition)
+ wrapblock(io, "quote") do
+ wrapinline(io, "textbf") do
+ _print(io, md.title)
+ end
+ _println(io, "\n")
+ latex(io, md.content)
+ end
+end
+
+function latex(io::IO, f::Markdown.Footnote)
+ id = get(io.footnotes, f.id, 1)
+ _print(io, "\\footnotetext[", id, "]{")
+ latex(io, f.text)
+ _println(io, "}")
+end
+
+function latex(io::IO, md::Markdown.List)
+ # `\begin{itemize}` is used here for both ordered and unordered lists since providing
+ # custom starting numbers for enumerated lists is simpler to do by manually assigning
+ # each number to `\item` ourselves rather than using `\setcounter{enumi}{<start>}`.
+ #
+ # For an ordered list starting at 5 the following will be generated:
+ #
+ # \begin{itemize}
+ # \item[5. ] ...
+ # \item[6. ] ...
+ # ...
+ # \end{itemize}
+ #
+ pad = ndigits(md.ordered + length(md.items)) + 2
+ fmt = n -> (Markdown.isordered(md) ? "[$(rpad("$(n + md.ordered - 1).", pad))]" : "")
+ wrapblock(io, "itemize") do
+ for (n, item) in enumerate(md.items)
+ _print(io, "\\item$(fmt(n)) ")
+ latex(io, item)
+ n < length(md.items) && _println(io)
+ end
+ end
+end
+
+
+function latex(io::IO, hr::Markdown.HorizontalRule)
+ _println(io, "{\\rule{\\textwidth}{1pt}}")
+end
+
+# This (equation*, split) math env seems to be the only way to correctly
+# render all the equations in the Julia manual.
+function latex(io::IO, math::Markdown.LaTeX)
+ _print(io, "\\begin{equation*}\n\\begin{split}")
+ _print(io, math.formula)
+ _println(io, "\\end{split}\\end{equation*}")
+end
+
+function latex(io::IO, md::Markdown.Table)
+ _println(io, "\n\\begin{table}[h]")
+ _print(io, "\n\\begin{tabulary}{\\linewidth}")
+ _println(io, "{|", uppercase(join(md.align, '|')), "|}")
+ for (i, row) in enumerate(md.rows)
+ i === 1 && _println(io, "\\hline")
+ for (j, cell) in enumerate(row)
+ j === 1 || _print(io, " & ")
+ latexinline(io, cell)
+ end
+ _println(io, " \\\\")
+ _println(io, "\\hline")
+ end
+ _println(io, "\\end{tabulary}\n")
+ _println(io, "\\end{table}\n")
+end
+
+function latex(io::IO, raw::Documents.RawNode)
+ raw.name === :latex ? _println(io, "\n", raw.text, "\n") : nothing
+end
+
+# Inline Elements.
+
+function latexinline(io::IO, md::Vector)
+ for c in md
+ latexinline(io, c)
+ end
+end
+
+function latexinline(io::IO, md::AbstractString)
+ latexesc(io, md)
+end
+
+function latexinline(io::IO, md::Markdown.Bold)
+ wrapinline(io, "textbf") do
+ latexinline(io, md.text)
+ end
+end
+
+function latexinline(io::IO, md::Markdown.Italic)
+ wrapinline(io, "emph") do
+ latexinline(io, md.text)
+ end
+end
+
+function latexinline(io::IO, md::Markdown.Image)
+ wrapblock(io, "figure") do
+ _println(io, "\\centering")
+ url = if Utilities.isabsurl(md.url)
+ Utilities.warn("Images with absolute URLs not supported in LaTeX output.\n in $(io.filename)\n url: $(md.url)")
+ # We nevertheless output an \includegraphics with the URL. The LaTeX build will
+ # then give an error, indicating to the user that something is wrong. A warning
+ # alone would be drowned out by all the output from LaTeX.
+ md.url
+ elseif startswith(md.url, '/')
+ # URLs starting with a / are assumed to be relative to the document's root
+ normpath(lstrip(md.url, '/'))
+ else
+ normpath(joinpath(dirname(io.filename), md.url))
+ end
+ wrapinline(io, "includegraphics") do
+ _print(io, url)
+ end
+ _println(io)
+ wrapinline(io, "caption") do
+ latexinline(io, md.alt)
+ end
+ _println(io)
+ end
+end
+
+function latexinline(io::IO, f::Markdown.Footnote)
+ id = get!(io.footnotes, f.id, length(io.footnotes) + 1)
+ _print(io, "\\footnotemark[", id, "]")
+end
+
+function latexinline(io::IO, md::Markdown.Link)
+ if io.in_header
+ latexinline(io, md.text)
+ else
+ if occursin(".md#", md.url)
+ file, target = split(md.url, ".md#"; limit = 2)
+ id = string(hash(target))
+ wrapinline(io, "hyperlink") do
+ _print(io, id)
+ end
+ _print(io, "{")
+ latexinline(io, md.text)
+ _print(io, "}")
+ else
+ wrapinline(io, "href") do
+ latexesc(io, md.url)
+ end
+ _print(io, "{")
+ latexinline(io, md.text)
+ _print(io, "}")
+ end
+ end
+end
+
+function latexinline(io, math::Markdown.LaTeX)
+ # Handle MathJax and TeX inconsistency since the first wants `\LaTeX` wrapped
+ # in math delims, whereas actual TeX fails when that is done.
+ math.formula == "\\LaTeX" ? _print(io, math.formula) : _print(io, "\\(", math.formula, "\\)")
+end
+
+function latexinline(io, hr::Markdown.HorizontalRule)
+ _println(io, "\\rule{\\textwidth}{1pt}}")
+end
+
+
+# Metadata Nodes get dropped from the final output for every format but are needed throughout
+# the rest of the build, so we just leave them in place and print a blank line in their place.
+latex(io::IO, node::Documents.MetaNode, page, doc) = _println(io, "\n")
+
+# Utilities.
+
+const _latexescape_chars = Dict{Char, AbstractString}(
+ '~' => "{\\textasciitilde}",
+ '^' => "{\\textasciicircum}",
+ '\\' => "{\\textbackslash}",
+ '\'' => "{\\textquotesingle}",
+ '"' => "{\\textquotedbl}",
+ '_' => "{\\_}",
+)
+for ch in "&%\$#_{}"
+ _latexescape_chars[ch] = "\\$ch"
+end
+
+function latexesc(io, s::AbstractString)
+ for ch in s
+ _print(io, get(_latexescape_chars, ch, ch))
+ end
+end
+
+latexesc(s) = sprint(latexesc, s)
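+
+# Illustrative example, not part of the upstream source: latexesc("~50% of x_1")
+# produces the text {\textasciitilde}50\% of x\_1 (note that the loop above
+# re-registers '_' as "\_", overriding the "{\_}" entry from the Dict literal).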
+
+function wrapblock(f, io, env)
+ _println(io, "\\begin{", env, "}")
+ f()
+ _println(io, "\\end{", env, "}")
+end
+
+function wrapinline(f, io, cmd)
+ _print(io, "\\", cmd, "{")
+ f()
+ _print(io, "}")
+end
+
+
+function files!(out::Vector, v::Vector, depth)
+ for each in v
+ files!(out, each, depth + 1)
+ end
+ return out
+end
+
+# Tuples come from `hide(page)` with either
+# (visible, nothing, page, children) or
+# (visible, page.first, page.second, children)
+function files!(out::Vector, v::Tuple, depth)
+ files!(out, v[2] == nothing ? v[3] : v[2] => v[3], depth)
+ files!(out, v[4], depth)
+end
+
+files!(out, s::AbstractString, depth) = push!(out, ("", s, depth))
+
+function files!(out, p::Pair{S, T}, depth) where {S <: AbstractString, T <: AbstractString}
+ push!(out, (p.first, p.second, depth))
+end
+
+function files!(out, p::Pair{S, V}, depth) where {S <: AbstractString, V}
+ push!(out, (p.first, "", depth))
+ files!(out, p.second, depth)
+end
+
+files(v::Vector) = files!(Tuple{String, String, Int}[], v, 0)
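+
+# Illustrative example, not part of the upstream source:
+#
+#   files(["index.md", "Manual" => ["man/guide.md"]])
+#
+# returns [("", "index.md", 1), ("Manual", "", 1), ("", "man/guide.md", 2)],
+# i.e. every entry is reduced to a (title, filename, depth) tuple with an empty
+# title or filename where none applies.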
+
+end
--- /dev/null
+"""
+A module for rendering `Document` objects to markdown.
+"""
+module MarkdownWriter
+
+import ...Documenter:
+ Anchors,
+ Builder,
+ Documents,
+ Expanders,
+ Formats,
+ Documenter,
+ Utilities
+
+import Markdown
+
+function render(doc::Documents.Document)
+ copy_assets(doc)
+ mime = Formats.mimetype(:markdown)
+ for (src, page) in doc.internal.pages
+ open(Formats.extension(:markdown, page.build), "w") do io
+ for elem in page.elements
+ node = page.mapping[elem]
+ render(io, mime, node, page, doc)
+ end
+ end
+ end
+end
+
+function copy_assets(doc::Documents.Document)
+ Utilities.log(doc, "copying assets to build directory.")
+ assets = joinpath(doc.internal.assets, "mkdocs")
+ if isdir(assets)
+ builddir = joinpath(doc.user.build, "assets")
+ isdir(builddir) || mkdir(builddir)
+ for each in readdir(assets)
+ src = joinpath(assets, each)
+ dst = joinpath(builddir, each)
+ ispath(dst) && Utilities.warn("Overwriting '$dst'.")
+ cp(src, dst; force = true)
+ end
+ else
+ error("assets directory '$(abspath(assets))' is missing.")
+ end
+end
+
+function render(io::IO, mime::MIME"text/plain", vec::Vector, page, doc)
+ for each in vec
+ render(io, mime, each, page, doc)
+ end
+end
+
+function render(io::IO, mime::MIME"text/plain", anchor::Anchors.Anchor, page, doc)
+ println(io, "\n<a id='", anchor.id, "-", anchor.nth, "'></a>")
+ render(io, mime, anchor.object, page, doc)
+end
+
+
+## Documentation Nodes.
+
+function render(io::IO, mime::MIME"text/plain", node::Documents.DocsNodes, page, doc)
+ for node in node.nodes
+ render(io, mime, node, page, doc)
+ end
+end
+
+function render(io::IO, mime::MIME"text/plain", node::Documents.DocsNode, page, doc)
+ # Docstring header based on the name of the binding and its category.
+ anchor = "<a id='$(node.anchor.id)' href='#$(node.anchor.id)'>#</a>"
+ header = "**`$(node.object.binding)`** — *$(Utilities.doccat(node.object))*."
+ println(io, anchor, "\n", header, "\n\n")
+ # Body. May contain several concatenated docstrings.
+ renderdoc(io, mime, node.docstr, page, doc)
+end
+
+function renderdoc(io::IO, mime::MIME"text/plain", md::Markdown.MD, page, doc)
+ if haskey(md.meta, :results)
+ # The `:results` field contains a vector of `Docs.DocStr` objects associated with
+ # each markdown object. The `DocStr` contains data such as file and line info that
+ # we need for generating correct source links.
+ for (markdown, result) in zip(md.content, md.meta[:results])
+ render(io, mime, dropheaders(markdown), page, doc)
+ # When a source link is available then print the link.
+ url = Utilities.url(doc.internal.remote, doc.user.repo, result)
+ if url !== nothing
+ link = "<a target='_blank' href='$url' class='documenter-source'>source</a><br>"
+ println(io, "\n", link, "\n")
+ end
+ end
+ else
+ # Docstrings with no `:results` metadata won't contain source locations so we don't
+ # try to print them out. Just print the basic docstring.
+ render(io, mime, dropheaders(md), page, doc)
+ end
+end
+
+function renderdoc(io::IO, mime::MIME"text/plain", other, page, doc)
+ # TODO: properly support non-markdown docstrings at some point.
+ render(io, mime, other, page, doc)
+end
+
+
+## Index, Contents, and Eval Nodes.
+
+function render(io::IO, ::MIME"text/plain", index::Documents.IndexNode, page, doc)
+ for (object, _, page, mod, cat) in index.elements
+ page = Formats.extension(:markdown, page)
+ url = string(page, "#", Utilities.slugify(object))
+ println(io, "- [`", object.binding, "`](", url, ")")
+ end
+ println(io)
+end
+
+function render(io::IO, ::MIME"text/plain", contents::Documents.ContentsNode, page, doc)
+ for (count, path, anchor) in contents.elements
+ path = Formats.extension(:markdown, path)
+ header = anchor.object
+ url = string(path, '#', anchor.id, '-', anchor.nth)
+ link = Markdown.Link(header.text, url)
+ level = Utilities.header_level(header)
+ print(io, " "^(level - 1), "- ")
+ Markdown.plaininline(io, link)
+ println(io)
+ end
+ println(io)
+end
+
+function render(io::IO, mime::MIME"text/plain", node::Documents.EvalNode, page, doc)
+ node.result === nothing ? nothing : render(io, mime, node.result, page, doc)
+end
+
+
+## Basic Nodes. AKA: any other content that hasn't been handled yet.
+
+function render(io::IO, ::MIME"text/plain", other, page, doc)
+ println(io)
+ Markdown.plain(io, other)
+ println(io)
+end
+
+render(io::IO, ::MIME"text/plain", str::AbstractString, page, doc) = print(io, str)
+
+# Metadata Nodes get dropped from the final output for every format but are needed throughout
+# the rest of the build, so we just leave them in place and print a blank line in their place.
+render(io::IO, ::MIME"text/plain", node::Documents.MetaNode, page, doc) = println(io, "\n")
+
+function render(io::IO, ::MIME"text/plain", raw::Documents.RawNode, page, doc)
+ raw.name === :html ? println(io, "\n", raw.text, "\n") : nothing
+end
+
+
+## Markdown Utilities.
+
+# Remove all header nodes from a markdown object and replace them with bold font.
+# Only for use in `text/plain` output, since we'll use some css to make these less obtrusive
+# in the HTML rendering instead of using this hack.
+function dropheaders(md::Markdown.MD)
+ out = Markdown.MD()
+ out.meta = md.meta
+ out.content = map(dropheaders, md.content)
+ out
+end
+dropheaders(h::Markdown.Header) = Markdown.Paragraph([Markdown.Bold(h.text)])
+dropheaders(v::Vector) = map(dropheaders, v)
+dropheaders(other) = other
+
+end
--- /dev/null
+"""
+A module that provides several renderers for `Document` objects. The supported
+formats are currently:
+
+ * `:markdown` -- the default format.
+ * `:html` -- generates a complete HTML site with navigation and search included.
+ * `:latex` -- generates a PDF using LuaLaTeX.
+
+"""
+module Writers
+
+import ..Documenter:
+ Anchors,
+ Builder,
+ Documents,
+ Expanders,
+ Formats,
+ Documenter,
+ Utilities
+
+import .Utilities: Selectors
+
+#
+# Format selector definitions.
+#
+
+abstract type FormatSelector <: Selectors.AbstractSelector end
+
+abstract type MarkdownFormat <: FormatSelector end
+abstract type LaTeXFormat <: FormatSelector end
+abstract type HTMLFormat <: FormatSelector end
+
+Selectors.order(::Type{MarkdownFormat}) = 1.0
+Selectors.order(::Type{LaTeXFormat}) = 2.0
+Selectors.order(::Type{HTMLFormat}) = 3.0
+
+Selectors.matcher(::Type{MarkdownFormat}, fmt, _) = fmt === :markdown
+Selectors.matcher(::Type{LaTeXFormat}, fmt, _) = fmt === :latex
+Selectors.matcher(::Type{HTMLFormat}, fmt, _) = fmt === :html
+
+Selectors.runner(::Type{MarkdownFormat}, _, doc) = MarkdownWriter.render(doc)
+Selectors.runner(::Type{LaTeXFormat}, _, doc) = LaTeXWriter.render(doc)
+Selectors.runner(::Type{HTMLFormat}, _, doc) = HTMLWriter.render(doc)
+
+"""
+Writes a [`Documents.Document`](@ref) object to `.user.build` directory in
+the formats specified in the `.user.format` vector.
+
+Adding additional formats requires adding new `Selector` definitions as follows:
+
+```julia
+abstract type CustomFormat <: FormatSelector end
+
+Selectors.order(::Type{CustomFormat}) = 4.0 # or a higher number.
+Selectors.matcher(::Type{CustomFormat}, fmt, _) = fmt === :custom
+Selectors.runner(::Type{CustomFormat}, _, doc) = CustomWriter.render(doc)
+
+# Definition of `CustomWriter` module below...
+```
+"""
+function render(doc::Documents.Document)
+ # Render each format. Additional formats must define an `order`, `matcher`, `runner`, as
+ # well as their own rendering methods in a separate module.
+ for each in doc.user.format
+ if each === :markdown && !backends_enabled[:markdown]
+ @warn """Deprecated format value :markdown
+
+ The Markdown/MkDocs backend must now be imported from a separate package.
+ Add DocumenterMarkdown to your documentation dependencies and add
+
+ using DocumenterMarkdown
+
+ to your make.jl script.
+
+ Built-in support for format=:markdown will be removed completely in a future
+ Documenter version, causing builds to fail completely.
+
+ See the Output Backends section in the manual for more information.
+ """
+ elseif each === :latex && !backends_enabled[:latex]
+ @warn """Deprecated format value :markdown
+
+ The LaTeX/PDF backend must now be imported from a separate package.
+ Add DocumenterLaTeX to your documentation dependencies and add
+
+ using DocumenterLaTeX
+
+ to your make.jl script.
+
+ Built-in support for format=:latex will be removed completely in a future
+ Documenter version, causing builds to fail completely.
+
+ See the Output Backends section in the manual for more information.
+ """
+ end
+ Selectors.dispatch(FormatSelector, each, doc)
+ end
+ # Revert all local links to their original URLs.
+ for (link, url) in doc.internal.locallinks
+ link.url = url
+ end
+end
+
+include("MarkdownWriter.jl")
+include("HTMLWriter.jl")
+include("LaTeXWriter.jl")
+
+# This is a hack to enable shell packages that behave as if the supplementary Writer
+# modules had been moved out of Documenter.
+#
+# External packages DocumenterMarkdown and DocumenterLaTeX can use the enable_backend
+# function to mark that a certain backend is loaded in backends_enabled. That is used to
+# determine whether a deprecation warning should be printed in the render method above.
+#
+# enable_backend() is not part of the API and will be removed as soon as LaTeXWriter and
+# MarkdownWriter are actually moved out into a separate module (TODO).
+backends_enabled = Dict(
+ :markdown => false,
+ :latex => false
+)
+
+function enable_backend(backend::Symbol)
+ global backends_enabled
+ if backend in keys(backends_enabled)
+ backends_enabled[backend] = true
+ else
+ @error "Unknown backend. Expected one of:" keys(backends_enabled)
+ throw(ArgumentError("Unknown backend $backend."))
+ end
+end
+
+end
--- /dev/null
+module DocSystemTests
+
+using Test
+
+import Documenter: Documenter, DocSystem
+
+const alias_of_getdocs = DocSystem.getdocs # NOTE: won't get docstrings if in a @testset
+
+@testset "DocSystem" begin
+ ## Bindings.
+ @test_throws ArgumentError DocSystem.binding(9000)
+ let b = Docs.Binding(@__MODULE__, :DocSystem)
+ @test DocSystem.binding(b) == b
+ end
+ let b = DocSystem.binding(Documenter.Documents.Document)
+ @test b.mod === Documenter.Documents
+ @test b.var === :Document
+ end
+ let b = DocSystem.binding(Documenter)
+ @test b.mod === (Documenter)
+ @test b.var === :Documenter
+ end
+ let b = DocSystem.binding(:Main)
+ # @test b.mod === Main
+ @test b.var === :Main
+ end
+ let b = DocSystem.binding(DocSystem.binding)
+ @test b.mod === DocSystem
+ @test b.var === :binding
+ end
+ let b = DocSystem.binding(Documenter, :Documenter)
+ @test b.mod === (Documenter)
+ @test b.var === :Documenter
+ end
+
+ ## `MultiDoc` object.
+ @test isdefined(DocSystem, :MultiDoc)
+ @test (fieldnames(DocSystem.MultiDoc)...,) == (:order, :docs)
+
+ ## `DocStr` object.
+ @test isdefined(DocSystem, :DocStr)
+ @test (fieldnames(DocSystem.DocStr)...,) == (:text, :object, :data)
+ ## `getdocs`.
+ let b = DocSystem.binding(DocSystem, :getdocs),
+ d_0 = DocSystem.getdocs(b, Tuple{}),
+ d_1 = DocSystem.getdocs(b),
+ d_2 = DocSystem.getdocs(b, Union{Tuple{Any}, Tuple{Any, Type}}; compare = (==)),
+ d_3 = DocSystem.getdocs(b; modules = Module[Main]),
+ d_4 = DocSystem.getdocs(DocSystem.binding(@__MODULE__, :alias_of_getdocs)),
+ d_5 = DocSystem.getdocs(DocSystem.binding(@__MODULE__, :alias_of_getdocs); aliases = false)
+
+ @test length(d_0) == 0
+ @test length(d_1) == 2
+ @test length(d_2) == 1
+ @test length(d_3) == 0
+ @test length(d_4) == 2
+ @test length(d_5) == 0
+
+ @test d_1[1].data[:binding] == b
+ @test d_1[2].data[:binding] == b
+ @test d_1[1].data[:typesig] == Union{Tuple{Docs.Binding}, Tuple{Docs.Binding, Type}}
+ @test d_1[2].data[:typesig] == Union{Tuple{Any}, Tuple{Any, Type}}
+ @test d_1[1].data[:module] == DocSystem
+ @test d_1[2].data[:module] == DocSystem
+
+ @test d_2[1].data[:binding] == b
+ @test d_2[1].data[:typesig] == Union{Tuple{Any}, Tuple{Any, Type}}
+ @test d_2[1].data[:module] == DocSystem
+
+ @test d_1 == d_4
+ @test d_1 != d_5
+ end
+
+ ## `UnionAll`
+ let b = DocSystem.binding(@__MODULE__, Meta.parse("f(x::T) where T"))
+ @test b.var == :f
+ end
+end
+
+end
--- /dev/null
+module Foo
+"""
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> begin
+ Int64[1, 2, 3, 4] * 2
+ end
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+Int64[1, 2, 3, 4] * 2
+
+# output
+
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest; filter = r"foo"
+julia> println(" foobar")
+ foobaz
+```
+```jldoctest
+julia> 1 + 2
+
+julia> 3 + 4
+```
+"""
+foo() = 1
+
+ """
+ ```jldoctest
+ julia> begin
+ Int64[1, 2, 3, 4] * 2
+ end
+ 4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+ ```
+ ```jldoctest
+ julia> println(); println("foo")
+
+ bar
+ ```
+ """
+ foo(x) = 1
+
+end # module
--- /dev/null
+```@docs
+DocTestsTest.Foo.foo
+```
+
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> begin
+ Int64[1, 2, 3, 4] * 2
+ end
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+Int64[1, 2, 3, 4] * 2
+
+# output
+
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest; filter = r"foo"
+julia> println(" foobar")
+ foobaz
+```
+```jldoctest
+julia> 1 + 2
+
+julia> 3 + 4
+```
\ No newline at end of file
--- /dev/null
+module DocTestsTest
+using Documenter, Test
+
+println("="^50)
+@info("Testing `doctest = :fix`")
+mktempdir(@__DIR__) do dir
+ srcdir = mktempdir(dir)
+ builddir = mktempdir(dir)
+ cp(joinpath(@__DIR__, "broken.md"), joinpath(srcdir, "index.md"))
+ cp(joinpath(@__DIR__, "broken.jl"), joinpath(srcdir, "src.jl"))
+ include(joinpath(srcdir, "src.jl"))
+ @eval using .Foo
+ # fix up
+ makedocs(sitename="-", modules = [Foo], source = srcdir, build = builddir, doctest = :fix)
+ # test that strict = true works
+ makedocs(sitename="-", modules = [Foo], source = srcdir, build = builddir, strict = true)
+ # also test that we obtain the expected output
+ @test read(joinpath(srcdir, "index.md"), String) ==
+ read(joinpath(@__DIR__, "fixed.md"), String)
+ @test read(joinpath(srcdir, "src.jl"), String) ==
+ read(joinpath(@__DIR__, "fixed.jl"), String)
+end
+@info("Done testing `doctest = :fix`")
+println("="^50)
+
+end # module
--- /dev/null
+module Foo
+"""
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest
+julia> begin
+ Int64[1, 2, 3, 4] * 2
+ end
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest
+Int64[1, 2, 3, 4] * 2
+
+# output
+
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest; filter = r"foo"
+julia> println(" foobar")
+ foobar
+```
+```jldoctest
+julia> 1 + 2
+3
+
+julia> 3 + 4
+7
+```
+"""
+foo() = 1
+
+ """
+ ```jldoctest
+ julia> begin
+ Int64[1, 2, 3, 4] * 2
+ end
+ 4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+ ```
+ ```jldoctest
+ julia> println(); println("foo")
+
+ foo
+ ```
+ """
+ foo(x) = 1
+
+end # module
--- /dev/null
+```@docs
+DocTestsTest.Foo.foo
+```
+
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4]
+4-element Array{Int64,1}:
+ 1
+ 2
+ 3
+ 4
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+
+julia> Int64[1, 2, 3, 4] * 2
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest
+julia> begin
+ Int64[1, 2, 3, 4] * 2
+ end
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest
+Int64[1, 2, 3, 4] * 2
+
+# output
+
+4-element Array{Int64,1}:
+ 2
+ 4
+ 6
+ 8
+```
+```jldoctest; filter = r"foo"
+julia> println(" foobar")
+ foobar
+```
+```jldoctest
+julia> 1 + 2
+3
+
+julia> 3 + 4
+7
+```
\ No newline at end of file
--- /dev/null
+module DOMTests
+
+using Test
+
+import Documenter.Utilities.DOM: DOM, @tags, HTMLDocument
+
+@tags div ul li p
+
+@testset "DOM" begin
+ for tag in (:div, :ul, :li, :p)
+ TAG = @eval $tag
+ @test isa(TAG, DOM.Tag)
+ @test TAG.name === tag
+ end
+
+ @test div().name === :div
+ @test div().text == ""
+ @test isempty(div().nodes)
+ @test isempty(div().attributes)
+
+ @test div("...").name === :div
+ @test div("...").text == ""
+ @test length(div("...").nodes) === 1
+ @test div("...").nodes[1].text == "..."
+ @test div("...").nodes[1].name === Symbol("")
+ @test isempty(div("...").attributes)
+
+ @test div[".class"]("...").name === :div
+ @test div[".class"]("...").text == ""
+ @test length(div[".class"]("...").nodes) === 1
+ @test div[".class"]("...").nodes[1].text == "..."
+ @test div[".class"]("...").nodes[1].name === Symbol("")
+ @test length(div[".class"]("...").attributes) === 1
+ @test div[".class"]("...").attributes[1] == (:class => "class")
+ @test div[:attribute].attributes[1] == (:attribute => "")
+ @test div[:attribute => "value"].attributes[1] == (:attribute => "value")
+
+ let d = div(ul(map(li, [string(n) for n = 1:10])))
+ @test d.name === :div
+ @test d.text == ""
+ @test isempty(d.attributes)
+ @test length(d.nodes) === 1
+ let u = d.nodes[1]
+ @test u.name === :ul
+ @test u.text == ""
+ @test isempty(u.attributes)
+ @test length(u.nodes) === 10
+ for n = 1:10
+ let v = u.nodes[n]
+ @test v.name === :li
+ @test v.text == ""
+ @test isempty(v.attributes)
+ @test length(v.nodes) === 1
+ @test v.nodes[1].name === Symbol("")
+ @test v.nodes[1].text == string(n)
+ @test !isdefined(v.nodes[1], :attributes)
+ @test !isdefined(v.nodes[1], :nodes)
+ end
+ end
+ end
+ end
+
+ @tags script style img
+
+ @test string(div(p("one"), p("two"))) == "<div><p>one</p><p>two</p></div>"
+ @test string(div[:key => "value"]) == "<div key=\"value\"></div>"
+ @test string(p(" < > & ' \" ")) == "<p> &lt; &gt; &amp; &#39; &quot; </p>"
+ @test string(img[:src => "source"]) == "<img src=\"source\"/>"
+ @test string(img[:none]) == "<img none/>"
+ @test string(script(" < > & ' \" ")) == "<script> < > & ' \" </script>"
+ @test string(style(" < > & ' \" ")) == "<style> < > & ' \" </style>"
+ @test string(script) == "<script>"
+
+ function locally_defined()
+ @tags button
+ @test try
+ x = button
+ true
+ catch err
+ false
+ end
+ end
+ @test !isdefined(@__MODULE__, :button)
+ locally_defined()
+ @test !isdefined(@__MODULE__, :button)
+
+ # HTMLDocument
+ @test string(HTMLDocument(div())) == "<!DOCTYPE html>\n<div></div>\n"
+ @test string(HTMLDocument("custom doctype", div())) == "<!DOCTYPE custom doctype>\n<div></div>\n"
+end
+
+end
--- /dev/null
+module ErrorsModule
+
+"""
+```jldoctest
+julia> a = 1
+2
+
+```
+
+```jldoctest
+```
+"""
+func(x) = x
+
+end
+
+using Documenter
+
+makedocs(sitename="-", modules = [ErrorsModule])
+
+@test_throws ErrorException makedocs(modules = [ErrorsModule], strict = true)
--- /dev/null
+
+```@docs
+missing_doc
+```
+
+```@docs
+parse error
+```
+
+```@meta
+CurrentModule = NonExistantModule
+```
+
+```@autodocs
+Modules = [NonExistantModule]
+```
+
+```@eval
+NonExistantModule
+```
+
+```@docs
+# comment in a @docs block
+```
+
+[`foo(x::Foo)`](@ref) creates an [`UndefVarError`](@ref) when `eval`d
+for the type signature, since `Foo` is not defined.
+
+Numeric literals don't have bindings: [`1`](@ref). Nor [`"strings"`](@ref).
+[`:symbols`] do, however.
+
+Some syntax errors in references will fail with a `ParseError`: [`foo+*bar`](@ref).
+Others, like [`foo(x`](@ref), will give an `:incomplete` expression.
+
+This is the footnote [^1]. And [^another] [^another].
+
+[^1]: one
+
+ [^nested]: a nested footnote
+
+[^another_one]:
+
+ Things! [^1]. [^2].
+
+[^nested]
+
+[^nested]:
+
+ Duplicate [^1] nested footnote.
+
+```@docs
+ErrorsModule.func
+```
+
+```jldoctest
+julia> b = 1
+2
+
+julia> x
+
+julia> x
+ERROR: UndefVarError: x not defined
+
+julia> x
+```
+
+```jldoctest; setup
+julia> 1+1
+2
+```
+```jldoctest invalidkwarg1; setup
+julia> 1+1
+2
+```
+```jldoctest; setup == 1
+julia> 1+1
+2
+```
+```jldoctest invalidkwarg2; setup == 1
+julia> 1+1
+2
+```
+
+```jldoctest; output = false
+foo(a, b) = a * b
+foo(2, 3)
+
+# output
+
+1
+```
+```jldoctest; output = true
+foo(a, b) = a * b
+foo(2, 3)
+
+# output
+
+1
+```
--- /dev/null
+# Defines the modules referred to in the example docs (under src/) and then builds them.
+# It can be called separately to build the examples/, or as part of the test suite.
+#
+# It defines a set of variables (`examples_*`) that can be used in the tests.
+# The `examples_root` should be used to check whether this file has already been included
+# or not and should be kept unique.
+isdefined(@__MODULE__, :examples_root) && error("examples_root is already defined\n$(@__FILE__) included multiple times?")
+
+# The `Mod` and `AutoDocs` modules are assumed to exist in the Main module.
+(@__MODULE__) === Main || error("$(@__FILE__) must be included into Main.")
+
+# Modules `Mod` and `AutoDocs`
+module Mod
+ """
+ func(x)
+
+ [`T`](@ref)
+ """
+ func(x) = x
+
+ """
+ T
+
+ [`func(x)`](@ref)
+ """
+ mutable struct T end
+end
+
+"`AutoDocs` module."
+module AutoDocs
+ module Pages
+ include("pages/a.jl")
+ include("pages/b.jl")
+ include("pages/c.jl")
+ include("pages/d.jl")
+ include("pages/e.jl")
+ end
+
+ "Function `f`."
+ f(x) = x
+
+ "Constant `K`."
+ const K = 1
+
+ "Type `T`."
+ mutable struct T end
+
+ "Macro `@m`."
+ macro m() end
+
+ "Module `A`."
+ module A
+ "Function `A.f`."
+ f(x) = x
+
+ "Constant `A.K`."
+ const K = 1
+
+ "Type `B.T`."
+ mutable struct T end
+
+ "Macro `B.@m`."
+ macro m() end
+ end
+
+ "Module `B`."
+ module B
+ "Function `B.f`."
+ f(x) = x
+
+ "Constant `B.K`."
+ const K = 1
+
+ "Type `B.T`."
+ mutable struct T end
+
+ "Macro `B.@m`."
+ macro m() end
+ end
+end
+
+# Build example docs
+using Documenter
+
+const examples_root = dirname(@__FILE__)
+
+@info("Building mock package docs: MarkdownWriter")
+examples_markdown_doc = makedocs(
+ format = :markdown,
+ debug = true,
+ root = examples_root,
+ build = "builds/markdown",
+ doctest = false,
+)
+
+
+htmlbuild_pages = Any[
+ "**Home**" => "index.md",
+ "Manual" => [
+ "man/tutorial.md",
+ ],
+ hide("hidden.md"),
+ "Library" => [
+ "lib/functions.md",
+ "lib/autodocs.md",
+ ],
+ hide("Hidden Pages" => "hidden/index.md", Any[
+ "Page X" => "hidden/x.md",
+ "hidden/y.md",
+ "hidden/z.md",
+ ])
+]
+
+@info("Building mock package docs: HTMLWriter / local build")
+examples_html_local_doc = makedocs(
+ debug = true,
+ root = examples_root,
+ build = "builds/html-local",
+ html_prettyurls = false,
+ doctestfilters = [r"Ptr{0x[0-9]+}"],
+ assets = ["assets/custom.css"],
+ sitename = "Documenter example",
+ pages = htmlbuild_pages,
+
+ linkcheck = true,
+ linkcheck_ignore = [r"(x|y).md", "z.md", r":func:.*"],
+
+ html_edit_branch = nothing,
+)
+
+# Build with pretty URLs and canonical links
+@info("Building mock package docs: HTMLWriter / deployment build")
+examples_html_deploy_doc = makedocs(
+ debug = true,
+ root = examples_root,
+ build = "builds/html-deploy",
+ html_prettyurls = true,
+ html_canonical = "https://example.com/stable",
+ doctestfilters = [r"Ptr{0x[0-9]+}"],
+ assets = [
+ "assets/favicon.ico",
+ "assets/custom.css"
+ ],
+ sitename = "Documenter example",
+ pages = htmlbuild_pages,
+ doctest = false,
+)
--- /dev/null
+site_name: Documenter.jl
+repo_url: https://github.com/JuliaDocs/Documenter.jl
+site_description: Julia package documentation generator.
+site_author: Michael Hatherly
+
+theme: material
+
+extra:
+ palette:
+ primary: 'indigo'
+ accent: 'blue'
+
+extra_css:
+ - assets/Documenter.css
+
+markdown_extensions:
+ - codehilite
+ - extra
+ - tables
+ - fenced_code
+
+extra_javascript:
+ - https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS_HTML
+ - assets/mathjaxhelper.js
+
+docs_dir: 'build'
+
+pages:
+- Home: index.md
+- Manual:
+ - Tutorial: man/tutorial.md
+- Library:
+ - Functions: lib/functions.md
+ - AutoDocs: lib/autodocs.md
--- /dev/null
+
+"""
+`f` from page `a.jl`.
+
+Links:
+
+- [`ccall`](@ref)
+- [`while`](@ref)
+- [`@time(x)`](@ref)
+- [`T(x)`](@ref)
+- [`T(x, y)`](@ref)
+- [`f(::Integer)`](@ref)
+- [`f(::Any)`](@ref)
+- [`f(::Any, ::Any)`](@ref)
+- [`f(x, y, z)`](@ref)
+
+[^footnote]:
+
+ Footnote contents. [^footnote]
+
+"""
+f(x) = x
--- /dev/null
+
+"""
+`f` from page `b.jl`.
+
+Links:
+
+- [`ccall`](@ref)
+- [`while`](@ref)
+- [`@time`](@ref)
+- [`T`](@ref)
+- [`f`](@ref)
+- [`f(::Any)`](@ref)
+- [`f(::Any, ::Any)`](@ref)
+- [`f(::Any, ::Any, ::Any)`](@ref)
+
+"""
+f(x, y) = x + y
--- /dev/null
+
+"""
+`f` from page `c.jl`.
+
+Links:
+
+- [`ccall`](@ref)
+- [`while`](@ref)
+- [`@time`](@ref)
+- [`T`](@ref)
+- [`f`](@ref)
+- [`f(::Any)`](@ref)
+- [`f(::Any, ::Any)`](@ref)
+- [`f(::Any, ::Any, ::Any)`](@ref)
+
+"""
+f(x, y, z) = x + y + z
--- /dev/null
+
+"""
+`T` from page `d.jl`.
+
+Links:
+
+- [`ccall`](@ref)
+- [`while`](@ref)
+- [`@time`](@ref)
+- [`T`](@ref)
+- [`f`](@ref)
+- [`f(x)`](@ref)
+- [`f(x, y)`](@ref)
+- [`f(::Any, ::Any, ::Any)`](@ref)
+
+"""
+mutable struct T end
+
+
+"""
+`T` from page `d.jl`.
+
+Links:
+
+- [`ccall`](@ref)
+- [`while`](@ref)
+- [`@time`](@ref)
+- [`T(x)`](@ref)
+- [`T(x, y)`](@ref)
+- [`T(x, y, z)`](@ref)
+- [`f`](@ref)
+- [`f(x)`](@ref)
+- [`f(x, y)`](@ref)
+- [`f(::Any, ::Any, ::Any)`](@ref)
+
+"""
+T(x) = T()
+
+"""
+`T` from page `d.jl`.
+
+Links:
+
+- [`ccall`](@ref)
+- [`while`](@ref)
+- [`@time`](@ref)
+- [`T()`](@ref)
+- [`T(x)`](@ref)
+- [`T(x, y)`](@ref)
+- [`T(x, y, z)`](@ref)
+- [`f`](@ref)
+- [`f(x)`](@ref)
+- [`f(x, y)`](@ref)
+- [`f(::Any, ::Any, ::Any)`](@ref)
+
+"""
+T(x, y) = T()
--- /dev/null
+module E
+
+export f_1, f_2, f_3
+
+"f_1"
+f_1(x) = x
+
+"f_2"
+f_2(x) = x
+
+"f_3"
+f_3(x) = x
+
+
+"g_1"
+g_1(x) = x
+
+"g_2"
+g_2(x) = x
+
+"g_3"
+g_3(x) = x
+
+export T_1
+
+"T_1"
+mutable struct T_1 end
+
+"T_2"
+mutable struct T_2 end
+
+"T_3"
+struct T_3{T} end
+
+end
--- /dev/null
+/* custom.css */
+
+center.raw-html-block {
+ color: red;
+}
--- /dev/null
+// custom javascript
--- /dev/null
+// overrides Documenter's search.js
--- /dev/null
+# Hidden (toplevel)
+
+## Section
--- /dev/null
+# Hidden pages
+
+Pages can be hidden with the [`hide`](@ref) function.
+
+## List of hidden pages
+
+- [Hidden page 1](x.md)
+- [Hidden page 2](y.md)
+- [Hidden page 3](z.md)
+
+## Docs for `hide`
+
+```@docs
+hide
+```
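+
+A rough sketch of how `hide` is passed to `makedocs` (the file names below mirror
+this example build and are only illustrative):
+
+```julia
+makedocs(
+    sitename = "Documenter example",
+    pages = [
+        "index.md",
+        hide("hidden.md"),                      # built, but hidden from navigation
+        hide("Hidden Pages" => "hidden/index.md", Any[
+            "Page X" => "hidden/x.md",
+            "hidden/y.md",
+        ]),
+    ],
+)
+```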
--- /dev/null
+# Hidden 1
+
+## First heading
+## Second heading
--- /dev/null
+# Hidden 2
--- /dev/null
+# Hidden 3
--- /dev/null
+# Documentation
+
+## Index Page
+
+```@contents
+Pages = ["index.md"]
+Depth = 5
+```
+
+## Functions Contents
+
+```@contents
+Pages = ["lib/functions.md"]
+Depth = 3
+```
+
+## Tutorial Contents
+
+```@contents
+Pages = ["man/tutorial.md"]
+```
+
+## Index
+
+```@index
+```
+
+### Embedded `@ref` links in headers: [`ccall`](@ref)
+
+[#60](@ref) [#61](@ref)
+
+```@repl
+zeros(5, 5)
+zeros(50, 50)
+```
+
+```@meta
+DocTestSetup = quote
+ using Base
+ x = -3:0.01:3
+ y = -4:0.02:5
+ z = [Float64((xi^2 + yi^2)) for xi in x, yi in y]
+end
+```
+
+```jldoctest
+julia> [1.0, 2.0, 3.0]
+3-element Array{Float64,1}:
+ 1.0
+ 2.0
+ 3.0
+
+```
+
+```jldoctest
+julia> println(" "^5)
+
+julia> "\nfoo\n\nbar\n\n\nbaz"
+"\nfoo\n\nbar\n\n\nbaz"
+
+julia> println(ans)
+
+foo
+
+bar
+
+
+baz
+```
+
+ * `one` two three
+ * four `five` six
+
+ * ```
+ one
+ ```
+
+## Raw Blocks
+
+```@raw html
+<center class="raw-html-block">
+ <strong>CENTER</strong>
+</center>
+```
+
+```@raw latex
+\begin{verbatim}
+```
+
+```@raw latex
+\end{verbatim}
+```
+
+# Symbols in doctests
+
+```jldoctest
+julia> a = :undefined
+:undefined
+
+julia> a
+:undefined
+```
+
+# Named doctests
+
+```jldoctest test-one
+julia> a = 1
+1
+```
+
+```jldoctest test-one
+julia> a + 1
+2
+```
+
+# Filtered doctests
+
+## Global
+
+```jldoctest
+julia> print("Ptr{0x123456}")
+Ptr{0x654321}
+```
+
+## Local
+```@meta
+DocTestFilters = [r"foo[a-z]+"]
+```
+
+```jldoctest
+julia> print("foobar")
+foobuu
+```
+
+```@meta
+DocTestFilters = [r"foo[a-z]+", r"[0-9]+"]
+```
+
+```jldoctest
+julia> print("foobar123")
+foobuu456
+```
+
+```@meta
+DocTestFilters = r"foo[a-z]+"
+```
+
+```jldoctest
+julia> print("foobar")
+foobuu
+```
+
+```@meta
+DocTestFilters = []
+```
+
+## Errors
+
+```@meta
+DocTestFilters = [r"Stacktrace:\n \[1\][\s\S]+"]
+```
+
+```jldoctest
+julia> error()
+ERROR:
+Stacktrace:
+ [1] error() at ./thisfiledoesnotexist.jl:123456789
+```
+
+
+```jldoctest
+julia> error()
+ERROR:
+Stacktrace:
+[...]
+```
+
+```@meta
+DocTestFilters = []
+```
+
+# Doctest keyword arguments
+
+```jldoctest; setup = :(f(x) = x^2; g(x) = x)
+julia> f(2)
+4
+
+julia> g(2)
+2
+```
+```jldoctest
+julia> f(2)
+ERROR: UndefVarError: f not defined
+```
+
+```jldoctest PR650; setup = :(f(x) = x^2; g(x) = x)
+julia> f(2)
+4
+
+julia> g(2)
+2
+```
+```jldoctest PR650
+julia> f(2)
+4
+
+julia> g(2)
+2
+```
+
+```jldoctest; filter = [r"foo[a-z]+"]
+julia> print("foobar")
+foobuu
+```
+
+```jldoctest; filter = [r"foo[a-z]+", r"[0-9]+"]
+julia> print("foobar123")
+foobuu456
+```
+
+```jldoctest; filter = r"foo[a-z]+"
+julia> print("foobar")
+foobuu
+```
+
+```jldoctest; filter = r"foo[a-z]+", setup = :(f() = print("foobar"))
+julia> f()
+foobuu
+```
+
+```jldoctest; output = false
+foo(a, b) = a * b
+foo(2, 3)
+
+# output
+
+6
+```
+
+
+# Sanitise module names
+
+```jldoctest
+julia> struct T end
+
+julia> t = T()
+T()
+
+julia> fullname(@__MODULE__)
+(:Main,)
+
+julia> fullname(Base.Broadcast)
+(:Base, :Broadcast)
+
+julia> @__MODULE__
+Main
+```
+
+# Issue398
+
+```@meta
+DocTestSetup = quote
+ module Issue398
+
+ struct TestType{T} end
+
+ function _show end
+ Base.show(io::IO, t::TestType) = _show(io, t)
+
+ macro define_show_and_make_object(x, y)
+ z = Expr(:quote, x)
+ esc(quote
+ $(Issue398)._show(io::IO, t::$(Issue398).TestType{$z}) = print(io, $y)
+ const $x = $(Issue398).TestType{$z}()
+ end)
+ end
+
+ export @define_show_and_make_object
+
+ end # module
+
+ using .Issue398
+end
+```
+
+```jldoctest
+julia> @define_show_and_make_object q "abcd"
+abcd
+```
+
+```@meta
+DocTestSetup = nothing
+```
+
+# Issue653
+
+```jldoctest
+julia> struct MyException <: Exception
+ msg::AbstractString
+ end
+
+julia> function Base.showerror(io::IO, err::MyException)
+ print(io, "MyException: ")
+ print(io, err.msg)
+ end
+
+julia> err = MyException("test exception")
+MyException("test exception")
+
+julia> sprint(showerror, err)
+"MyException: test exception"
+
+julia> throw(MyException("test exception"))
+ERROR: MyException: test exception
+```
+
+# Issue418
+
+```jldoctest
+julia> f(x::Float64) = x
+f (generic function with 1 method)
+
+julia> f("")
+ERROR: MethodError: no method matching f(::String)
+Closest candidates are:
+ f(!Matched::Float64) at none:1
+```
+
+
+```jldoctest
+julia> a = 1
+1
+
+julia> b = 2
+2
+
+julia> ex = :(a + b)
+:(a + b)
+
+julia> eval(ex)
+3
+```
+
+```@repl
+ex = :(1 + 5)
+eval(ex)
+```
+
+```@example
+ex = :(1 + 5)
+eval(ex)
+```
+
+# Issue #793
+```jldoctest
+julia> write("issue793.jl", "\"Hello!\"");
+
+julia> include("issue793.jl")
+"Hello!"
+
+julia> rm("issue793.jl");
+```
+```@repl
+write("issue793.jl", "\"Hello!\"")
+include("issue793.jl")
+rm("issue793.jl")
+```
+```@example
+write("issue793.jl", "\"Hello!\"")
+r = include("issue793.jl")
+rm("issue793.jl")
+r
+```
+
+
+```jldoctest
+julia> a = 1
+1
+
+julia> ans
+1
+```
+
+# Issue513
+
+```jldoctest named
+julia> a = 1
+1
+
+julia> ans
+1
+```
+
+# Filtering of `Main.`
+
+```jldoctest
+julia> struct Point end;
+
+julia> println(Point)
+Point
+
+julia> sqrt(100)
+10.0
+
+julia> sqrt = 4
+ERROR: cannot assign variable Base.sqrt from module Main
+```
+
+```jldoctest
+julia> g(x::Float64, y) = 2x + y
+g (generic function with 1 method)
+
+julia> g(x, y::Float64) = x + 2y
+g (generic function with 2 methods)
+
+julia> g(2.0, 3)
+7.0
+
+julia> g(2, 3.0)
+8.0
+
+julia> g(2.0, 3.0)
+ERROR: MethodError: g(::Float64, ::Float64) is ambiguous. Candidates:
+ g(x, y::Float64) in Main at none:1
+ g(x::Float64, y) in Main at none:1
+Possible fix, define
+ g(::Float64, ::Float64)
+```
+
+# Anonymous function declaration
+
+```jldoc
+julia> x->x # ignore error on 0.7
+#1 (generic function with 1 method)
+```
+
+# Assigning symbols example
+
+```@example
+r = :a
+```
+
+# Bad links (Windows)
+
+* [Colons not allowed on Windows -- `some:path`](some:path)
+* [No "drive" -- `:path`](:path)
+* [Absolute Windows paths -- `X:\some\path`](X:\some\path)
--- /dev/null
+# `@autodocs` tests
+
+```@meta
+CurrentModule = Main
+```
+
+## Public
+
+Should include docs for
+
+ * [`AutoDocs.Pages.E.f_1`](@ref)
+ * [`AutoDocs.Pages.E.f_2`](@ref)
+ * [`AutoDocs.Pages.E.f_3`](@ref)
+
+in that order.
+
+```@autodocs
+Modules = [AutoDocs.Pages.E]
+Private = false
+Order = [:function]
+```
+
+## Private
+
+Should include docs for
+
+ * [`AutoDocs.Pages.E.g_1`](@ref)
+ * [`AutoDocs.Pages.E.g_2`](@ref)
+ * [`AutoDocs.Pages.E.g_3`](@ref)
+
+in that order.
+
+```@autodocs
+Modules = [AutoDocs.Pages.E]
+Public = false
+Order = [:function]
+```
+
+## Ordering of Public and Private
+
+Should include docs for
+
+ * [`AutoDocs.Pages.E.T_1`](@ref)
+ * [`AutoDocs.Pages.E.T_2`](@ref)
+
+in that order.
+
+```@autodocs
+Modules = [AutoDocs.Pages.E]
+Order = [:type]
+```
+
--- /dev/null
+!!! warning\r
+\r
+ This file contains windows line endings. Do not edit.\r
+\r
+```jldoctest\r
+a = 1\r
+b = 2\r
+a + b\r
+\r
+# output\r
+\r
+3\r
+```\r
--- /dev/null
+
+```@meta
+CurrentModule = Main.Mod
+```
+
+# Function Index
+
+```@index
+Pages = ["lib/functions.md"]
+```
+
+# Functions
+
+[`ccall`](@ref), [`func(x)`](@ref), [`T`](@ref), [`for`](@ref), and [`while`](@ref).
+
+```@docs
+func(x)
+T
+ccall
+for
+while
+@time
+@assert
+```
+
+# Foo
+
+```@example
+@show pwd()
+a = 1
+```
+
+...
+
+```@example
+@isdefined a
+```
+
+```@example 1
+f(x) = 2x
+g(x) = 3x
+nothing # hide
+```
+
+```@example 2
+x, y = 1, 2
+println(x, y)
+```
+
+```@example 3
+struct T end
+t = T()
+```
+
+```@example hide-all-the-things
+a = 1#hide
+b = 2# hide
+c = 3# hide
+d = 4 #hide
+e = 5 # hide
+f = 6 # hide
+a + b + c + d + e + f
+```
+
+## Foo
+
+```@example 3
+@isdefined T
+@show @isdefined t # hide
+@show typeof(T)
+typeof(t) # hide
+```
+
+```@example 2
+x + y
+```
+
+```@example 1
+f(2), g(2)
+```
+
+### Foo
+
+```@example 2
+x - y
+```
+
+```@example 1
+f(1), g(1)
+```
+
+```@example 3
+using InteractiveUtils
+@which T()
+```
+
+```@example continued-code
+A = 1
+```
+```@example continued-code; continued = true
+for i in 1:3
+```
+```@example
+A = 2
+```
+```@example continued-code; continued = true
+ println(A + i)
+```
+```@example continued-code
+end
+```
+```@example continued-code
+A + 1
+```
+
+#### Foo
+
+```@example
+a = 1
+b = ans
+@assert a === b
+```
+
+```@repl
+using Random # hide
+Random.seed!(1) # hide
+nothing
+rand()
+a = 1
+println(a)
+b = 2
+a + b
+struct T
+ x :: Int
+ y :: Vector
+end
+x = T(1, [1])
+x.y
+x.x
+```
+
+```@repl 1
+d = 1
+```
+
+```@repl 1
+println(d)
+```
+
+Test setup function
+
+```@setup testsetup
+w = 5
+```
+
+```@example testsetup
+@assert w === 5
+```
+
+```@repl testsetup
+@assert w === 5
+```
+
+# Autodocs
+
+```@meta
+CurrentModule = Main
+```
+
+## AutoDocs Module
+
+```@autodocs
+Modules = [AutoDocs]
+```
+
+## Functions, Modules, and Macros
+
+```@autodocs
+Modules = [AutoDocs.A, AutoDocs.B]
+Order = [:function, :module, :macro]
+```
+
+## Constants and Types
+
+```@autodocs
+Modules = [AutoDocs.A, AutoDocs.B]
+Order = [:constant, :type]
+```
+
+## Autodocs by Page
+
+```@autodocs
+Modules = [AutoDocs.Pages]
+Pages = ["a.jl", "b.jl"]
+```
+
+```@autodocs
+Modules = [AutoDocs.Pages]
+Pages = ["c.jl", "d.jl"]
+```
+
+A footnote reference [^footnote].
+
+# Named docstring `@ref`s
+
+ * a normal docstring `@ref` link: [`AutoDocs.Pages.f`](@ref);
+ * a named docstring `@ref` link: [`f`](@ref AutoDocs.Pages.f);
+ * and a link with custom text: [`@time 1 + 2`](@ref @time);
+ * some invalid syntax: [`for i = 1:10; ...`](@ref for).
--- /dev/null
+csv,data,file
+1,2,3
--- /dev/null
+# Tutorial
+
+[Documentation](@ref)
+
+[Index](@ref)
+
+[Functions](@ref)
+
+[`Main.Mod.func(x)`](@ref)
+
+[`Main.Mod.T`](@ref)
+
+```jldoctest
+julia> using Base.Meta # `nothing` shouldn't be displayed.
+
+julia> Meta
+Base.Meta
+
+julia> a = 1
+1
+
+julia> b = 2;
+
+julia> a + b
+3
+```
+
+```jldoctest
+a = 1
+b = 2
+a + b
+
+# output
+
+3
+```
+
+```@meta
+DocTestSetup =
+ quote
+ using Documenter
+ using Random
+ Random.seed!(1)
+ end
+```
+
+```jldoctest
+a = 1
+b = 2
+a / b
+
+# output
+
+0.5
+```
+
+```jldoctest
+julia> a = 1;
+
+julia> b = 2
+2
+
+julia> a / b
+0.5
+```
+
+```@eval
+import Markdown
+code = string(sprint(Base.banner), "julia>")
+Markdown.Code(code)
+```
+
+```jldoctest
+julia> # First definition.
+ function f(x, y)
+ x + y
+ end
+ #
+ # Second definition.
+ #
+ struct T
+ x
+ end
+
+julia> @isdefined(f), @isdefined(T) # Check for both definitions.
+(true, true)
+
+julia> import Base
+
+julia> using Base.Meta
+
+julia> r = isexpr(:(using Base.Meta), :using); # Discarded result.
+
+julia> !r
+false
+```
+
+```jldoctest
+julia> for i = 1:5
+ println(i)
+ end
+1
+2
+3
+4
+5
+
+julia> println("Printing with semi-colon ending.");
+Printing with semi-colon ending.
+
+julia> div(1, 0)
+ERROR: DivideError: integer division error
+[...]
+
+julia> println("a"); # Semi-colons *not* on the last expression shouldn't suppress output.
+ println(1) # ...
+ 2 # ...
+a
+1
+2
+
+julia> println("a"); # Semi-colons *not* on the last expression shouldn't suppress output.
+ println(1) # ...
+ 2; # Only those in the last expression.
+a
+1
+```
+
+```jldoctest
+a = 1
+b = 2; # Semi-colons don't affect script doctests.
+
+# output
+
+2
+```
+
+```@repl 1
+f(x) = (sleep(x); x)
+@time f(0.1);
+```
+
+```@repl 1
+f(0.01)
+div(1, 0)
+```
+
+Make sure that stdout is in the right place (#484):
+
+```@repl 1
+println("---") === nothing
+versioninfo()
+```
+
+```@eval
+1 + 2
+nothing
+```
+
+## Including images with `MIME`
+
+If `show(io, ::MIME, x)` is overloaded for a particular type, then `@example` blocks can render the result as an SVG, HTML/JS/CSS, PNG, JPEG, GIF, or WebP image in the output, as appropriate.
+
+Assuming the following type and method live in the `InlineSVG` module
+
+```julia
+struct SVG
+ code :: String
+end
+Base.show(io, ::MIME"image/svg+xml", svg::SVG) = write(io, svg.code)
+```
+
+.. then we can invoke and show them with an `@example` block:
+
+```@setup inlinesvg
+module InlineSVG
+export SVG
+mutable struct SVG
+ code :: String
+end
+Base.show(io, ::MIME"image/svg+xml", svg::SVG) = write(io, svg.code)
+end # module
+```
+
+```@example inlinesvg
+using .InlineSVG
+SVG("""
+<svg width="82" height="76">
+ <g style="stroke-width: 3">
+ <circle cx="20" cy="56" r="16" style="stroke: #cb3c33; fill: #d5635c" />
+ <circle cx="41" cy="20" r="16" style="stroke: #389826; fill: #60ad51" />
+ <circle cx="62" cy="56" r="16" style="stroke: #9558b2; fill: #aa79c1" />
+ </g>
+</svg>
+""")
+```
+
+_Note: we can't define the `show` method in the `@example` block due to the world age
+counter in Julia 0.6 (Documenter's `makedocs` is not aware of any of the new method
+definitions happening in `eval`s)._
+
+We can also show SVG images with interactivity via the `text/html` MIME, which displays output that combines HTML, JS, and CSS. Assuming the following type and method live in the `InlineHTML` module
+
+```julia
+struct HTML
+ code::String
+end
+Base.show(io, ::MIME"text/html", html::HTML) = write(io, html.code)
+```
+
+.. then we can invoke and show them with an `@example` block (try mousing over the circles to see the applied style):
+
+```@setup inlinehtml
+module InlineHTML
+mutable struct HTML
+ code::String
+end
+Base.show(io, ::MIME"text/html", html::HTML) = write(io, html.code)
+end # module
+```
+
+```@example inlinehtml
+using .InlineHTML
+InlineHTML.HTML("""
+<script>
+ function showStyle(e) {
+ document.querySelector("#inline-html-style").innerHTML = e.getAttribute('style');
+ }
+</script>
+<svg width="100%" height="76">
+ <g style="stroke-width: 3">
+ <circle cx="20" cy="56" r="16" style="stroke: #cb3c33; fill: #d5635c" onmouseover="showStyle(this)"/>
+ <circle cx="41" cy="20" r="16" style="stroke: #389826; fill: #60ad51" onmouseover="showStyle(this)"/>
+ <circle cx="62" cy="56" r="16" style="stroke: #9558b2; fill: #aa79c1" onmouseover="showStyle(this)"/>
+ <text id="inline-html-style" x="90", y="20"></text>
+ </g>
+</svg>
+""")
+```
+
+The same mechanism also works for PNG files. Assuming again the following
+type and method live in the `InlinePNG` module
+
+```julia
+struct PNG
+ filename::String
+end
+Base.show(io, ::MIME"image/png", png::PNG) = write(io, read(png.filename))
+```
+
+.. then we can invoke and show them with an `@example` block:
+
+```@setup inlinepng
+module InlinePNG
+export PNG
+mutable struct PNG
+ filename::String
+end
+Base.show(io, ::MIME"image/png", png::PNG) = write(io, read(png.filename))
+end # module
+```
+
+```@example inlinepng
+using Documenter
+using .InlinePNG
+PNG(joinpath(dirname(pathof(Documenter)), "..", "test", "examples", "images", "logo.png"))
+```
+
+
+.. and JPEG, GIF and WebP files:
+
+```@setup inlinewebpgifjpeg
+module InlineWEBPGIFJPEG
+export WEBP, GIF, JPEG
+mutable struct WEBP
+ filename :: String
+end
+Base.show(io, ::MIME"image/webp", image::WEBP) = write(io, read(image.filename))
+mutable struct GIF
+ filename :: String
+end
+Base.show(io, ::MIME"image/gif", image::GIF) = write(io, read(image.filename))
+mutable struct JPEG
+ filename :: String
+end
+Base.show(io, ::MIME"image/jpeg", image::JPEG) = write(io, read(image.filename))
+end # module
+```
+
+```@example inlinewebpgifjpeg
+using Documenter
+using .InlineWEBPGIFJPEG
+WEBP(joinpath(dirname(pathof(Documenter)), "..", "test", "examples", "images", "logo.webp"))
+```
+
+```@example inlinewebpgifjpeg
+GIF(joinpath(dirname(pathof(Documenter)), "..", "test", "examples", "images", "logo.gif"))
+```
+
+```@example inlinewebpgifjpeg
+JPEG(joinpath(dirname(pathof(Documenter)), "..", "test", "examples", "images", "logo.jpg"))
+```
+
+## Interacting with external files
+
+You can also write output files and then refer to them in the document:
+
+```@example
+open("julia.svg", "w") do io
+ write(io, """
+ <svg width="82" height="76" xmlns="http://www.w3.org/2000/svg">
+ <g style="stroke-width: 3">
+ <circle cx="20" cy="56" r="16" style="stroke: #cb3c33; fill: #d5635c" />
+ <circle cx="41" cy="20" r="16" style="stroke: #389826; fill: #60ad51" />
+ <circle cx="62" cy="56" r="16" style="stroke: #9558b2; fill: #aa79c1" />
+ </g>
+ </svg>
+ """)
+end
+```
+
+
+
+Download [`data.csv`](data.csv).
+
+
+## [Links](../index.md) in headers
+
+... are dropped in the navigation links.
+
+
+## Embedding raw HTML
+
+Below is a nicely rendered version of `^D`:
+
+```@raw html
+<kbd>Ctrl</kbd> + <kbd>D</kbd>
+```
--- /dev/null
+using Test
+
+# When this file is run separately we need to include make.jl, which actually builds
+# the docs and defines a few modules that are referred to in the docs. The make.jl
+# script has to be evaluated in the context of the Main module.
+if (@__MODULE__) === Main && !@isdefined examples_root
+ include("make.jl")
+elseif (@__MODULE__) !== Main && isdefined(Main, :examples_root)
+ using Documenter
+ const examples_root = Main.examples_root
+elseif (@__MODULE__) !== Main && !isdefined(Main, :examples_root)
+ error("examples/make.jl has not been loaded into Main.")
+end
+
+@testset "Examples" begin
+ @testset "Markdown" begin
+ doc = Main.examples_markdown_doc
+
+ @test isa(doc, Documenter.Documents.Document)
+
+ let build_dir = joinpath(examples_root, "builds", "markdown"),
+ source_dir = joinpath(examples_root, "src")
+
+ @test isdir(build_dir)
+ @test isdir(joinpath(build_dir, "assets"))
+ @test isdir(joinpath(build_dir, "lib"))
+ @test isdir(joinpath(build_dir, "man"))
+
+ @test isfile(joinpath(build_dir, "index.md"))
+ @test isfile(joinpath(build_dir, "assets", "mathjaxhelper.js"))
+ @test isfile(joinpath(build_dir, "assets", "Documenter.css"))
+ @test isfile(joinpath(build_dir, "assets", "custom.css"))
+ @test isfile(joinpath(build_dir, "assets", "custom.js"))
+ @test isfile(joinpath(build_dir, "lib", "functions.md"))
+ @test isfile(joinpath(build_dir, "man", "tutorial.md"))
+ @test isfile(joinpath(build_dir, "man", "data.csv"))
+ @test isfile(joinpath(build_dir, "man", "julia.svg"))
+
+ @test (==)(
+ read(joinpath(source_dir, "man", "data.csv"), String),
+ read(joinpath(build_dir, "man", "data.csv"), String),
+ )
+ end
+
+ @test doc.user.root == examples_root
+ @test doc.user.source == "src"
+ @test doc.user.build == "builds/markdown"
+ @test doc.user.clean == true
+ @test doc.user.format == [:markdown]
+
+ @test realpath(doc.internal.assets) == realpath(joinpath(dirname(@__FILE__), "..", "..", "assets"))
+
+ @test length(doc.internal.pages) == 10
+
+ let headers = doc.internal.headers
+ @test Documenter.Anchors.exists(headers, "Documentation")
+ @test Documenter.Anchors.exists(headers, "Documentation")
+ @test Documenter.Anchors.exists(headers, "Index-Page")
+ @test Documenter.Anchors.exists(headers, "Functions-Contents")
+ @test Documenter.Anchors.exists(headers, "Tutorial-Contents")
+ @test Documenter.Anchors.exists(headers, "Index")
+ @test Documenter.Anchors.exists(headers, "Tutorial")
+ @test Documenter.Anchors.exists(headers, "Function-Index")
+ @test Documenter.Anchors.exists(headers, "Functions")
+ @test Documenter.Anchors.isunique(headers, "Functions")
+ @test Documenter.Anchors.isunique(headers, "Functions", joinpath("builds", "markdown", "lib", "functions.md"))
+ let name = "Foo", path = joinpath("builds", "markdown", "lib", "functions.md")
+ @test Documenter.Anchors.exists(headers, name, path)
+ @test !Documenter.Anchors.isunique(headers, name)
+ @test !Documenter.Anchors.isunique(headers, name, path)
+ @test length(headers.map[name][path]) == 4
+ end
+ end
+
+ @test length(doc.internal.objects) == 38
+ end
+
+ @testset "HTML: local" begin
+ doc = Main.examples_html_local_doc
+
+ @test isa(doc, Documenter.Documents.Document)
+
+ # TODO: test the HTML build
+ end
+
+ @testset "HTML: deploy" begin
+ doc = Main.examples_html_deploy_doc
+
+ @test isa(doc, Documenter.Documents.Document)
+
+ # TODO: test the HTML build with pretty URLs
+ end
+end
--- /dev/null
+module LaTeXFormatTests
+
+using Test
+
+using Documenter
+
+# Documenter package docs
+@info("Building Documenter's docs with LaTeX.")
+const Documenter_root = normpath(joinpath(dirname(@__FILE__), "..", "..", "docs"))
+doc = makedocs(
+ debug = true,
+ root = Documenter_root,
+ modules = [Documenter],
+ clean = false,
+ format = :latex,
+ sitename = "Documenter.jl",
+ authors = "Michael Hatherly, Morten Piibeleht, and contributors.",
+ pages = [
+ "Home" => "index.md",
+ "Manual" => Any[
+ "Guide" => "man/guide.md",
+ "man/examples.md",
+ "man/syntax.md",
+ "man/doctests.md",
+ "man/latex.md",
+ "man/hosting.md",
+ "man/other-formats.md",
+ ],
+ "Library" => Any[
+ "Public" => "lib/public.md",
+ hide("Internals" => "lib/internals.md", Any[
+ "lib/internals/anchors.md",
+ "lib/internals/builder.md",
+ "lib/internals/cross-references.md",
+ "lib/internals/docchecks.md",
+ "lib/internals/docsystem.md",
+ "lib/internals/doctests.md",
+ "lib/internals/documenter.md",
+ "lib/internals/documentertools.md",
+ "lib/internals/documents.md",
+ "lib/internals/dom.md",
+ "lib/internals/expanders.md",
+ "lib/internals/formats.md",
+ "lib/internals/mdflatten.md",
+ "lib/internals/selectors.md",
+ "lib/internals/textdiff.md",
+ "lib/internals/utilities.md",
+ "lib/internals/writers.md",
+ ])
+ ],
+ "contributing.md",
+ ]
+)
+
+@testset "LaTeX" begin
+ @test isa(doc, Documenter.Documents.Document)
+end
+
+end
--- /dev/null
+module MarkdownFormatTests
+
+using Test
+using Random
+
+using Documenter
+
+# Documenter package docs
+@info("Building Documenter's docs with Markdown.")
+const Documenter_root = normpath(joinpath(@__DIR__, "..", "..", "docs"))
+build_dir_relpath = relpath(joinpath(@__DIR__, "builds/markdown"), Documenter_root)
+doc = makedocs(
+ format = :markdown,
+ debug = true,
+ root = Documenter_root,
+ modules = Documenter,
+ build = build_dir_relpath,
+)
+
+@testset "Markdown" begin
+ @test isa(doc, Documenter.Documents.Document)
+
+ let build_dir = joinpath(Documenter_root, build_dir_relpath),
+ source_dir = joinpath(Documenter_root, "src")
+ @test isdir(build_dir)
+ @test isdir(joinpath(build_dir, "assets"))
+ @test isdir(joinpath(build_dir, "lib"))
+ @test isdir(joinpath(build_dir, "man"))
+
+ @test isfile(joinpath(build_dir, "index.md"))
+ @test isfile(joinpath(build_dir, "assets", "mathjaxhelper.js"))
+ @test isfile(joinpath(build_dir, "assets", "Documenter.css"))
+ end
+
+ @test doc.user.root == Documenter_root
+ @test doc.user.source == "src"
+ @test doc.user.build == build_dir_relpath
+ @test doc.user.clean == true
+end
+
+end
--- /dev/null
+module HTMLWriterTests
+
+using Test
+
+import Documenter.Writers.HTMLWriter: jsescape, generate_version_file, expand_versions
+
+function verify_version_file(versionfile, entries)
+ @test isfile(versionfile)
+ content = read(versionfile, String)
+ idx = 1
+ for entry in entries
+ i = findnext(entry, content, idx)
+ @test i !== nothing
+ idx = last(i)
+ end
+end
+
+@testset "HTMLWriter" begin
+ @test jsescape("abc123") == "abc123"
+ @test jsescape("▶αβγ") == "▶αβγ"
+ @test jsescape("") == ""
+
+ @test jsescape("a\nb") == "a\\nb"
+ @test jsescape("\r\n") == "\\r\\n"
+ @test jsescape("\\") == "\\\\"
+
+ @test jsescape("\"'") == "\\\"\\'"
+
+ # Ref: #639
+ @test jsescape("\u2028") == "\\u2028"
+ @test jsescape("\u2029") == "\\u2029"
+ @test jsescape("policy to
delete.") == "policy to\\u2028 delete."
+
+ mktempdir() do tmpdir
+ versionfile = joinpath(tmpdir, "versions.js")
+ versions = ["stable", "dev",
+ "2.1.1", "v2.1.0", "v2.0.1", "v2.0.0",
+ "1.1.1", "v1.1.0", "v1.0.1", "v1.0.0",
+ "0.1.1", "v0.1.0"] # note no `v` on first ones
+ cd(tmpdir) do
+ for version in versions
+ mkdir(version)
+ end
+ end
+
+ # expanding versions
+ versions = ["stable" => "v^", "v#.#", "dev" => "dev"] # default to makedocs
+ entries, symlinks = expand_versions(tmpdir, versions)
+ @test entries == ["stable", "v2.1", "v2.0", "v1.1", "v1.0", "v0.1", "dev"]
+ @test symlinks == ["stable"=>"2.1.1", "v2.1"=>"2.1.1", "v2.0"=>"v2.0.1",
+ "v1.1"=>"1.1.1", "v1.0"=>"v1.0.1", "v0.1"=>"0.1.1",
+ "v2"=>"2.1.1", "v1"=>"1.1.1", "v2.1.1"=>"2.1.1",
+ "v1.1.1"=>"1.1.1", "v0.1.1"=>"0.1.1"]
+ generate_version_file(versionfile, entries)
+ verify_version_file(versionfile, entries)
+
+ versions = ["v#"]
+ entries, symlinks = expand_versions(tmpdir, versions)
+ @test entries == ["v2.1", "v1.1"]
+ @test symlinks == ["v2.1"=>"2.1.1", "v1.1"=>"1.1.1", "v2"=>"2.1.1", "v1"=>"1.1.1",
+ "v2.0"=>"v2.0.1", "v1.0"=>"v1.0.1", "v0.1"=>"0.1.1",
+ "v2.1.1"=>"2.1.1", "v1.1.1"=>"1.1.1", "v0.1.1"=>"0.1.1"]
+ generate_version_file(versionfile, entries)
+ verify_version_file(versionfile, entries)
+
+ versions = ["v#.#.#"]
+ entries, symlinks = expand_versions(tmpdir, versions)
+ @test entries == ["v2.1.1", "v2.1.0", "v2.0.1", "v2.0.0", "v1.1.1", "v1.1.0",
+ "v1.0.1", "v1.0.0", "v0.1.1", "v0.1.0"]
+ @test symlinks == ["v2.1.1"=>"2.1.1", "v1.1.1"=>"1.1.1", "v0.1.1"=>"0.1.1",
+ "v2"=>"2.1.1", "v1"=>"1.1.1", "v2.1"=>"2.1.1",
+ "v2.0"=>"v2.0.1", "v1.1"=>"1.1.1", "v1.0"=>"v1.0.1", "v0.1"=>"0.1.1"]
+ generate_version_file(versionfile, entries)
+ verify_version_file(versionfile, entries)
+
+ versions = ["v^", "devel" => "dev", "foobar", "foo" => "bar"]
+ entries, symlinks = expand_versions(tmpdir, versions)
+ @test entries == ["v2.1", "devel"]
+ @test ("v2.1" => "2.1.1") in symlinks
+ @test ("devel" => "dev") in symlinks
+ generate_version_file(versionfile, entries)
+ verify_version_file(versionfile, entries)
+
+ versions = ["stable" => "v^", "dev" => "stable"]
+ @test_throws ArgumentError expand_versions(tmpdir, versions)
+ end
+end
+
+end
--- /dev/null
+module MDFlattenTests
+
+using Test
+
+import Markdown
+using Documenter.Utilities.MDFlatten
+
+@testset "MDFlatten" begin
+ @test mdflatten(Markdown.Paragraph("...")) == "..."
+ @test mdflatten(Markdown.Header{1}("...")) == "..."
+
+ # a simple test for top-level blocks (each gets two newlines appended to it)
+ @test mdflatten(Markdown.parse("# Test\nTest")) == "Test\n\nTest\n\n"
+ block_md = Markdown.parse("""
+ # MDFlatten test
+
+
+ ^^^ Ignoring extra whitespace.
+
+ ```markdown
+ code
+ is forwarded as **is**
+ ```
+ """)
+ block_text = """
+ MDFlatten test
+
+ ^^^ Ignoring extra whitespace.
+
+ code
+ is forwarded as **is**
+
+ """
+ @test mdflatten(block_md) == block_text
+
+ # blocks
+ @test mdflatten(Markdown.parse("> Test\n> Test\n\n> Test")) == "Test Test\n\nTest\n\n"
+ @test mdflatten(Markdown.parse("HRs\n\n---\n\nto whitespace")) == "HRs\n\n\n\nto whitespace\n\n"
+ @test mdflatten(Markdown.parse("HRs\n\n---\n\nto whitespace")) == "HRs\n\n\n\nto whitespace\n\n"
+ @test mdflatten(Markdown.parse("HRs\n\n---\n\nto whitespace")) == "HRs\n\n\n\nto whitespace\n\n"
+
+ # test some inline blocks
+ @test mdflatten(Markdown.parse("`code` *em* normal **strong**")) == "code em normal strong\n\n"
+ @test mdflatten(Markdown.parse("[link text *parsed*](link/itself/ignored)")) == "link text parsed\n\n"
+ @test mdflatten(Markdown.parse("- a\n- b\n- c")) == "a\nb\nc\n\n"
+ @test mdflatten(Markdown.parse("A | B\n---|---\naa|bb\ncc | dd")) == "A B\naa bb\ncc dd\n\n"
+
+ # Math
+ @test mdflatten(Markdown.parse("\$e=mc^2\$")) == "e=mc^2\n\n"
+ # backticks and blocks for math only in 0.5, i.e. these fail on 0.4
+ @test mdflatten(Markdown.parse("``e=mc^2``")) == "e=mc^2\n\n"
+ @test mdflatten(Markdown.parse("```math\n\\(m+n)(m-n)\nx=3\\sin(x)\n```")) == "(m+n)(m-n)\nx=3sin(x)\n\n"
+
+ # symbols in markdown
+ @test mdflatten(Markdown.parse("A \$B C")) == "A B C\n\n"
+
+ # linebreaks
+ @test mdflatten(Markdown.parse("A\\\nB")) == "A\nB\n\n"
+
+ # footnotes
+ @test mdflatten(Markdown.parse("[^name]")) == "[name]\n\n"
+ @test mdflatten(Markdown.parse("[^name]:**Strong** text.")) == "[name]: Strong text.\n\n"
+
+ # admonitions
+ @test mdflatten(Markdown.parse("!!! note \"Admonition Title\"\n Test")) == "note: Admonition Title\nTest\n\n"
+end
+
+end
--- /dev/null
+module MissingDocs
+ export f
+
+ "exported"
+ f(x) = x
+
+ "unexported"
+ g(x) = x
+end
+
+using Documenter
+
+for sym in [:none, :exports]
+ makedocs(
+ root = dirname(@__FILE__),
+ source = joinpath("src", string(sym)),
+ build = joinpath("build", string(sym)),
+ modules = MissingDocs,
+ checkdocs = sym,
+ sitename = "MissingDocs Checks",
+ )
+end
--- /dev/null
+# MissingDocs Exports
+
+```@docs
+Main.MissingDocs.f
+```
--- /dev/null
+# MissingDocs None
+
+```@docs
+```
--- /dev/null
+module NavNodeTests
+
+using Test
+
+import Documenter: Documents, Builder
+import Documenter.Documents: NavNode
+
+mutable struct FakeDocumentInternal
+ pages :: Dict{String, Nothing}
+ navlist :: Vector{NavNode}
+ FakeDocumentInternal() = new(Dict(), [])
+end
+mutable struct FakeDocument
+ internal :: FakeDocumentInternal
+ FakeDocument() = new(FakeDocumentInternal())
+end
+
+@testset "NavNode" begin
+ @test fieldtype(FakeDocumentInternal, :navlist) == fieldtype(Documents.Internal, :navlist)
+
+ pages = [
+ "page1.md",
+ "Page2" => "page2.md",
+ "Section" => [
+ "page3.md",
+ "Page4" => "page4.md",
+ "Subsection" => [
+ "page5.md",
+ ],
+ ],
+ "page6.md",
+ ]
+ doc = FakeDocument()
+ doc.internal.pages = Dict(map(i -> "page$i.md" => nothing, 1:8))
+ navtree = Builder.walk_navpages(pages, nothing, doc)
+ navlist = doc.internal.navlist
+
+ @test length(navlist) == 6
+ for (i,navnode) in enumerate(navlist)
+ @test navnode.page == "page$i.md"
+ end
+
+ @test isa(navtree, Vector{NavNode})
+ @test length(navtree) == 4
+ @test navtree[1] === navlist[1]
+ @test navtree[2] === navlist[2]
+ @test navtree[4] === navlist[6]
+
+ section = navtree[3]
+ @test section.title_override == "Section"
+ @test section.page === nothing
+ @test length(section.children) == 3
+
+ navpath = Documents.navpath(navlist[5])
+ @test length(navpath) == 3
+ @test navpath[1] === section
+ @test navpath[3] === navlist[5]
+end
+
+end
--- /dev/null
+using Documenter
+
+makedocs(
+ debug = true,
+ doctestfilters = [r"Ptr{0x[0-9]+}"],
+ sitename = "Documenter example",
+ pages = ["index.md"],
+)
--- /dev/null
+# Test
+
+....
--- /dev/null
+mktempdir() do tmpdir
+ @info("Building 'nongit' in $tmpdir")
+ cp(joinpath(@__DIR__, "docs"), joinpath(tmpdir, "docs"))
+ include(joinpath(tmpdir, "docs/make.jl"))
+ # Copy the build/ directory back so that it would be possible to inspect the output.
+ cp(joinpath(tmpdir, "docs/build"), joinpath(@__DIR__, "build"); force=true)
+end
--- /dev/null
+using Test
+
+# Build the example docs
+include("examples/make.jl")
+
+# Test missing docs
+include("missingdocs/make.jl")
+
+# Primary @testset
+
+# Error reporting.
+println("="^50)
+@info("The following errors are expected output.")
+include(joinpath("errors", "make.jl"))
+@info("END of expected error output.")
+println("="^50)
+
+@testset "Documenter" begin
+ # Unit tests for module internals.
+ include("utilities.jl")
+
+ # NavNode tests.
+ include("navnode.jl")
+
+ # DocSystem unit tests.
+ include("docsystem.jl")
+
+ # DocTest unit tests.
+ include("doctests/doctests.jl")
+
+ # DOM Tests.
+ include("dom.jl")
+
+ # MDFlatten tests.
+ include("mdflatten.jl")
+
+ # HTMLWriter
+ include("htmlwriter.jl")
+
+ # Mock package docs.
+ include("examples/tests.jl")
+
+ # Documenter package docs with other formats.
+ include("formats/markdown.jl")
+ include("formats/latex.jl")
+
+ # A simple build outside of a Git repository
+ include("nongit/tests.jl")
+end
+
+# Additional tests
+
+## `Markdown.MD` to `DOM.Node` conversion tests.
+module MarkdownToNode
+ import Documenter.DocSystem
+ import Documenter.Writers.HTMLWriter: mdconvert
+
+ # Exhaustive Conversion from Markdown to Nodes.
+ for mod in Base.Docs.modules
+ for (binding, multidoc) in DocSystem.getmeta(mod)
+ for (typesig, docstr) in multidoc.docs
+ md = DocSystem.parsedoc(docstr)
+ string(mdconvert(md))
+ end
+ end
+ end
+end
--- /dev/null
+module UtilitiesTests
+
+using Test
+import Base64: stringmime
+
+import Documenter
+
+module UnitTests
+ module SubModule end
+
+ # Does `submodules` collect *all* the submodules?
+ module A
+ module B
+ module C
+ module D end
+ end
+ end
+ end
+
+ mutable struct T end
+ mutable struct S{T} end
+
+ "Documenter unit tests."
+ Base.length(::T) = 1
+
+ f(x) = x
+
+ const pi = 3.0
+end
+
+module OuterModule
+module InnerModule
+import ..OuterModule
+export OuterModule
+end
+end
+
+module ExternalModule end
+module ModuleWithAliases
+using ..ExternalModule
+Y = ExternalModule
+module A
+ module B
+ const X = Main
+ end
+end
+end
+
+@testset "Utilities" begin
+ let doc = @doc(length)
+ a = Documenter.Utilities.filterdocs(doc, Set{Module}())
+ b = Documenter.Utilities.filterdocs(doc, Set{Module}([UnitTests]))
+ c = Documenter.Utilities.filterdocs(doc, Set{Module}([Base]))
+ d = Documenter.Utilities.filterdocs(doc, Set{Module}([UtilitiesTests]))
+
+ @test a !== nothing
+ @test a === doc
+ @test b !== nothing
+ @test occursin("Documenter unit tests.", stringmime("text/plain", b))
+ @test c !== nothing
+ @test !occursin("Documenter unit tests.", stringmime("text/plain", c))
+ @test d === nothing
+ end
+
+ # Documenter.Utilities.issubmodule
+ @test Documenter.Utilities.issubmodule(Main, Main) === true
+ @test Documenter.Utilities.issubmodule(UnitTests, UnitTests) === true
+ @test Documenter.Utilities.issubmodule(UnitTests.SubModule, Main) === true
+ @test Documenter.Utilities.issubmodule(UnitTests.SubModule, UnitTests) === true
+ @test Documenter.Utilities.issubmodule(UnitTests.SubModule, Base) === false
+ @test Documenter.Utilities.issubmodule(UnitTests, UnitTests.SubModule) === false
+
+ @test UnitTests.A in Documenter.Utilities.submodules(UnitTests.A)
+ @test UnitTests.A.B in Documenter.Utilities.submodules(UnitTests.A)
+ @test UnitTests.A.B.C in Documenter.Utilities.submodules(UnitTests.A)
+ @test UnitTests.A.B.C.D in Documenter.Utilities.submodules(UnitTests.A)
+ @test OuterModule in Documenter.Utilities.submodules(OuterModule)
+ @test OuterModule.InnerModule in Documenter.Utilities.submodules(OuterModule)
+ @test length(Documenter.Utilities.submodules(OuterModule)) == 2
+ @test Documenter.Utilities.submodules(ModuleWithAliases) == Set([ModuleWithAliases, ModuleWithAliases.A, ModuleWithAliases.A.B])
+
+ @test Documenter.Utilities.isabsurl("file.md") === false
+ @test Documenter.Utilities.isabsurl("../file.md") === false
+ @test Documenter.Utilities.isabsurl(".") === false
+ @test Documenter.Utilities.isabsurl("https://example.org/file.md") === true
+ @test Documenter.Utilities.isabsurl("http://example.org") === true
+ @test Documenter.Utilities.isabsurl("ftp://user:pw@example.org") === true
+ @test Documenter.Utilities.isabsurl("/fs/absolute/path") === false
+
+ @test Documenter.Utilities.doccat(UnitTests) == "Module"
+ @test Documenter.Utilities.doccat(UnitTests.T) == "Type"
+ @test Documenter.Utilities.doccat(UnitTests.S) == "Type"
+ @test Documenter.Utilities.doccat(UnitTests.f) == "Function"
+ @test Documenter.Utilities.doccat(UnitTests.pi) == "Constant"
+
+ # repo type
+ @test Documenter.Utilities.repo_host_from_url("https://bitbucket.org/somerepo") == Documenter.Utilities.RepoBitbucket
+ @test Documenter.Utilities.repo_host_from_url("https://www.bitbucket.org/somerepo") == Documenter.Utilities.RepoBitbucket
+ @test Documenter.Utilities.repo_host_from_url("http://bitbucket.org/somethingelse") == Documenter.Utilities.RepoBitbucket
+ @test Documenter.Utilities.repo_host_from_url("http://github.com/Whatever") == Documenter.Utilities.RepoGithub
+ @test Documenter.Utilities.repo_host_from_url("https://github.com/Whatever") == Documenter.Utilities.RepoGithub
+ @test Documenter.Utilities.repo_host_from_url("https://www.github.com/Whatever") == Documenter.Utilities.RepoGithub
+ @test Documenter.Utilities.repo_host_from_url("https://gitlab.com/Whatever") == Documenter.Utilities.RepoGitlab
+
+ # line range
+ let formatting = Documenter.Utilities.LineRangeFormatting(Documenter.Utilities.RepoGithub)
+ @test Documenter.Utilities.format_line(1:1, formatting) == "L1"
+ @test Documenter.Utilities.format_line(123:123, formatting) == "L123"
+ @test Documenter.Utilities.format_line(2:5, formatting) == "L2-L5"
+ @test Documenter.Utilities.format_line(100:9999, formatting) == "L100-L9999"
+ end
+
+ let formatting = Documenter.Utilities.LineRangeFormatting(Documenter.Utilities.RepoGitlab)
+ @test Documenter.Utilities.format_line(1:1, formatting) == "L1"
+ @test Documenter.Utilities.format_line(123:123, formatting) == "L123"
+ @test Documenter.Utilities.format_line(2:5, formatting) == "L2-5"
+ @test Documenter.Utilities.format_line(100:9999, formatting) == "L100-9999"
+ end
+
+ let formatting = Documenter.Utilities.LineRangeFormatting(Documenter.Utilities.RepoBitbucket)
+ @test Documenter.Utilities.format_line(1:1, formatting) == "1"
+ @test Documenter.Utilities.format_line(123:123, formatting) == "123"
+ @test Documenter.Utilities.format_line(2:5, formatting) == "2:5"
+ @test Documenter.Utilities.format_line(100:9999, formatting) == "100:9999"
+ end
+
+ # URL building
+ filepath = string(first(methods(Documenter.Utilities.url)).file)
+ Sys.iswindows() && (filepath = replace(filepath, "/" => "\\")) # work around JuliaLang/julia#26424
+ let expected_filepath = "/src/Utilities/Utilities.jl"
+ Sys.iswindows() && (expected_filepath = replace(expected_filepath, "/" => "\\"))
+ @test endswith(filepath, expected_filepath)
+ @show filepath expected_filepath
+ end
+
+ mktempdir() do path
+ cd(path) do
+ # Create a simple mock repo in a temporary directory with a single file.
+ @test success(`git init`)
+ @test success(`git config user.email "tester@example.com"`)
+ @test success(`git config user.name "Test Committer"`)
+ @test success(`git remote add origin git@github.com:JuliaDocs/Documenter.jl.git`)
+ mkpath("src")
+ filepath = abspath(joinpath("src", "SourceFile.jl"))
+ write(filepath, "X")
+ @test success(`git add -A`)
+ @test success(`git commit -m"Initial commit."`)
+
+ # Run tests
+ commit = Documenter.Utilities.repo_commit(filepath)
+
+ @test Documenter.Utilities.url("//blob/{commit}{path}#{line}", filepath) == "//blob/$(commit)/src/SourceFile.jl#"
+ @test Documenter.Utilities.url(nothing, "//blob/{commit}{path}#{line}", Documenter.Utilities, filepath, 10:20) == "//blob/$(commit)/src/SourceFile.jl#L10-L20"
+
+ # repo_root & relpath_from_repo_root
+ @test Documenter.Utilities.repo_root(filepath) == dirname(abspath(joinpath(dirname(filepath), ".."))) # abspath() keeps trailing /, hence dirname()
+ @test Documenter.Utilities.repo_root(filepath; dbdir=".svn") == nothing
+ @test Documenter.Utilities.relpath_from_repo_root(filepath) == joinpath("src", "SourceFile.jl")
+ # We assume that a temporary file is not in a repo
+ @test Documenter.Utilities.repo_root(tempname()) == nothing
+ @test Documenter.Utilities.relpath_from_repo_root(tempname()) == nothing
+ end
+ end
+
+ import Documenter.Documents: Document, Page, Globals
+ let page = Page("source", "build", [], IdDict{Any,Any}(), Globals()), doc = Document()
+ code = """
+ x += 3
+ γγγ_γγγ
+ γγγ
+ """
+ exprs = Documenter.Utilities.parseblock(code, doc, page)
+
+ @test isa(exprs, Vector)
+ @test length(exprs) === 3
+
+ @test isa(exprs[1][1], Expr)
+ @test exprs[1][1].head === :+=
+ @test exprs[1][2] == "x += 3\n"
+
+ @test exprs[2][2] == "γγγ_γγγ\n"
+
+ @test exprs[3][1] === :γγγ
+ @test exprs[3][2] == "γγγ\n"
+ end
+
+ @testset "TextDiff" begin
+ import Documenter.Utilities.TextDiff: splitby
+ @test splitby(r"\s+", "X Y Z") == ["X ", "Y ", "Z"]
+ @test splitby(r"[~]", "X~Y~Z") == ["X~", "Y~", "Z"]
+ @test splitby(r"[▶]", "X▶Y▶Z") == ["X▶", "Y▶", "Z"]
+ @test splitby(r"[▶]+", "X▶▶Y▶Z▶") == ["X▶▶", "Y▶", "Z▶"]
+ @test splitby(r"[▶]+", "▶▶Y▶Z▶") == ["▶▶", "Y▶", "Z▶"]
+ @test splitby(r"[▶]+", "Ω▶▶Y▶Z▶") == ["Ω▶▶", "Y▶", "Z▶"]
+ @test splitby(r"[▶]+", "Ω▶▶Y▶Z▶κ") == ["Ω▶▶", "Y▶", "Z▶", "κ"]
+ end
+
+ @static if isdefined(Base, :with_logger)
+ @testset "withoutput" begin
+ _, _, _, output = Documenter.Utilities.withoutput() do
+ println("println")
+ @info "@info"
+ f() = (Base.depwarn("depwarn", :f); nothing)
+ f()
+ end
+ @test startswith(output, "println\n[ Info: @info\n┌ Warning: depwarn\n")
+ end
+ end
+
+ @testset "issues #749, #790, #823" begin
+ let parse(x) = Documenter.Utilities.parseblock(x, nothing, nothing)
+ for LE in ("\r\n", "\n")
+ l1, l2 = parse("x = Int[]$(LE)$(LE)push!(x, 1)$(LE)")
+ @test l1[1] == :(x = Int[])
+ @test l1[2] == "x = Int[]$(LE)"
+ @test l2[1] == :(push!(x, 1))
+ @test l2[2] == "push!(x, 1)$(LE)"
+ end
+ end
+ end
+end
+
+end
--- /dev/null
+https://github.com/JuliaDocs/Documenter.jl/releases
--- /dev/null
+https://api.github.com/repos/JuliaLang/Pkg.jl/tarball/1609a05aee5d5960670738d8d834d91235bd6b1e
--- /dev/null
+https://api.github.com/repos/JuliaLang/libuv/tarball/d8ab1c6a33e77bf155facb54215dd8798e13825d
--- /dev/null
+https://api.github.com/repos/vtjnash/libwhich/tarball/81e9723c0273d78493dc8c8ed570f68d9ce7e89e
--- /dev/null
+llvm-6.0.0.src.tar.xz
\ No newline at end of file
--- /dev/null
+/* This file is provided by Debian as a replacement for
+ * https://fonts.googleapis.com/css?family=Lato|Roboto+Mono
+ * MIT License, 2018, Mo Zhou <cdluminate@gmail.com>
+ */
+@font-face {
+ font-family: 'Lato';
+ font-style: normal;
+ font-weight: 400;
+ src: local('Lato Regular'),
+ local('Lato-Regular'),
+ url(file:///usr/share/fonts/truetype/lato/Lato-Regular.ttf) format('truetype');
+}
+@font-face {
+ font-family: 'Inconsolata';
+ font-style: normal;
+ font-weight: 400;
+ src: local('Inconsolata'),
+ url(file:///usr/share/fonts/truetype/inconsolata/Inconsolata.otf) format('opentype');
+}
--- /dev/null
+[DEFAULT]
+pristine-tar = True
--- /dev/null
+include: https://salsa.debian.org/salsa-ci-team/pipeline/raw/master/salsa-ci.yml
+
+build:
+ extends: .build-unstable
+
+reprotest:
+ extends: .test-reprotest
+
+lintian:
+ extends: .test-lintian
+
+autopkgtest:
+ extends: .test-autopkgtest
+
+piuparts:
+ extends: .test-piuparts
--- /dev/null
+etc/julia/
+usr/share/julia/base
+usr/share/julia/test
+usr/share/julia/stdlib
+usr/share/julia/julia-config.jl
+usr/share/julia/build_sysimg.jl
--- /dev/null
+# Docs are deliberately put there.
+julia-common: package-contains-documentation-outside-usr-share-doc usr/share/julia/stdlib/*
+# This is a false positive
+julia-common: wrong-path-for-interpreter
--- /dev/null
+Document: julia-manual
+Title: Julia Language Manual
+Abstract: Describes the Julia language and its standard library
+Section: Programming
+
+Format: HTML
+Index: /usr/share/doc/julia/html/en/index.html
+Files: /usr/share/doc/julia/html/*
--- /dev/null
+Document: julia-manual-pdf
+Title: Julia Language Manual
+Abstract: Describes the Julia language and its standard library
+Section: Programming
+
+Format: PDF
+Files: /usr/share/doc/julia/TheJuliaLanguage.pdf.gz
--- /dev/null
+doc/_build/html
+doc/_build/pdf/en/TheJuliaLanguage.pdf
--- /dev/null
+debian/fontface.css usr/share/doc/julia-doc/
--- /dev/null
+README.md
+CONTRIBUTING.md
+DISTRIBUTING.md
+HISTORY.md
+NEWS.md
+README.arm.md
+README.windows.md
+VERSION
--- /dev/null
+debian/prompt.example.jl
--- /dev/null
+usr/bin/julia
+usr/share/appdata/julia.appdata.xml usr/share/metainfo/
+usr/share/applications/julia.desktop
+usr/share/icons/hicolor/scalable/apps/julia.svg
+usr/share/julia/base/build_h.jl
+usr/share/julia/base.cache
--- /dev/null
+debian/tmp/usr/share/man/man1/julia.1
--- /dev/null
+usr/include/julia
+usr/lib/*/libjulia.so
--- /dev/null
+usr/lib/*/libjulia.so.*
+usr/lib/*/julia
--- /dev/null
+# Deliberately keeping debug info https://github.com/JuliaLang/julia/issues/23115#issuecomment-320715030
+libjulia1: unstripped-binary-or-object usr/lib/*/julia/sys.so
+# Deliberately keeping debug info https://github.com/JuliaLang/julia/issues/23115#issuecomment-320715030
+libjulia1: unstripped-binary-or-object usr/lib/*/libjulia.so.*
--- /dev/null
+libjulia.so.1 libjulia1 #MINVER#
+ LLVMExtraAddInternalizePassWithExportList@Base 0.7.0~beta2
+ LLVMExtraAddMVVMReflectPass@Base 0.7.0~beta2
+ LLVMExtraAddPass@Base 0.7.0~beta2
+ LLVMExtraAddTargetLibraryInfoByTiple@Base 0.7.0~beta2
+ LLVMExtraCreateBasicBlockPass@Base 0.7.0~beta2
+ LLVMExtraCreateFunctionPass@Base 0.7.0~beta2
+ LLVMExtraCreateModulePass@Base 0.7.0~beta2
+ LLVMExtraGetDebugMDVersion@Base 0.7.0~beta2
+ LLVMExtraGetSourceLocation@Base 0.7.0~beta2
+ LLVMExtraGetValueContext@Base 0.7.0~beta2
+ LLVMExtraInitializeAllAsmParsers@Base 0.7.0~beta2
+ LLVMExtraInitializeAllAsmPrinters@Base 0.7.0~beta2
+ LLVMExtraInitializeAllDisassemblers@Base 0.7.0~beta2
+ LLVMExtraInitializeAllTargetInfos@Base 0.7.0~beta2
+ LLVMExtraInitializeAllTargetMCs@Base 0.7.0~beta2
+ LLVMExtraInitializeAllTargets@Base 0.7.0~beta2
+ LLVMExtraInitializeNativeAsmParser@Base 0.7.0~beta2
+ LLVMExtraInitializeNativeAsmPrinter@Base 0.7.0~beta2
+ LLVMExtraInitializeNativeDisassembler@Base 0.7.0~beta2
+ LLVMExtraInitializeNativeTarget@Base 0.7.0~beta2
+ __stack_chk_guard@Base 0.6.3
+ bitvector_get@Base 0.6.3
+ bitvector_new@Base 0.6.3
+ bitvector_resize@Base 0.6.3
+ bitvector_set@Base 0.6.3
+ int32hash@Base 0.6.3
+ int64hash@Base 0.6.3
+ int64to32hash@Base 0.6.3
+ ios_bufmode@Base 0.6.3
+ ios_close@Base 0.6.3
+ ios_copy@Base 0.6.3
+ ios_copyall@Base 0.6.3
+ ios_copyuntil@Base 0.6.3
+ ios_eof@Base 0.6.3
+ ios_eof_blocking@Base 0.6.3
+ ios_fd@Base 0.6.3
+ ios_file@Base 0.6.3
+ ios_flush@Base 0.6.3
+ ios_get_readable@Base 0.6.3
+ ios_get_writable@Base 0.6.3
+ ios_getc@Base 0.6.3
+ ios_getutf8@Base 0.6.3
+ ios_isopen@Base 0.6.3
+ ios_mem@Base 0.6.3
+ ios_mkstemp@Base 0.6.3
+ ios_nchomp@Base 0.6.3
+ ios_peekc@Base 0.6.3
+ ios_peekutf8@Base 0.6.3
+ ios_pos@Base 0.6.3
+ ios_printf@Base 0.6.3
+ ios_purge@Base 0.6.3
+ ios_putc@Base 0.6.3
+ ios_pututf8@Base 0.6.3
+ ios_read@Base 0.6.3
+ ios_readall@Base 0.6.3
+ ios_readline@Base 0.6.3
+ ios_readprep@Base 0.6.3
+ ios_seek@Base 0.6.3
+ ios_seek_end@Base 0.6.3
+ ios_set_readonly@Base 0.6.3
+ ios_setbuf@Base 0.6.3
+ ios_skip@Base 0.6.3
+ ios_stderr@Base 0.6.3
+ ios_stdin@Base 0.6.3
+ ios_stdout@Base 0.6.3
+ ios_take_buffer@Base 0.6.3
+ ios_trunc@Base 0.6.3
+ ios_vprintf@Base 0.6.3
+ ios_write@Base 0.6.3
+ ios_write_direct@Base 0.6.3
+ jl_@Base 0.6.3
+ jl_LLVMContext@Base 0.6.3
+ jl_LLVMCreateDisasm@Base 0.6.3
+ jl_LLVMDisasmInstruction@Base 0.6.3
+ jl_LLVMFlipSign@Base 0.6.3
+ jl_LLVMSMod@Base 0.6.3
+ jl_RTLD_DEFAULT_handle@Base 0.6.3
+ jl_SC_CLK_TCK@Base 0.6.3
+ jl_abs_float@Base 0.6.3
+ jl_abs_float_withtype@Base 0.6.3
+ jl_abstractarray_type@Base 0.6.3
+ jl_abstractslot_type@Base 0.6.3
+ jl_abstractstring_type@Base 0.6.3
+ jl_add_float@Base 0.6.3
+ jl_add_int@Base 0.6.3
+ jl_add_optimization_passes@Base 0.7.0~beta2
+ jl_add_ptr@Base 0.7.0~beta2
+ jl_add_standard_imports@Base 0.6.3
+ jl_alignment@Base 0.6.3
+ jl_alloc_array_1d@Base 0.6.3
+ jl_alloc_array_2d@Base 0.6.3
+ jl_alloc_array_3d@Base 0.6.3
+ jl_alloc_string@Base 0.6.3
+ jl_alloc_svec@Base 0.6.3
+ jl_alloc_svec_uninit@Base 0.6.3
+ jl_alloc_vec_any@Base 0.6.3
+ jl_an_empty_vec_any@Base 0.6.3
+ jl_and_int@Base 0.6.3
+ jl_any_type@Base 0.6.3
+ jl_anytuple_type@Base 0.6.3
+ jl_anytuple_type_type@Base 0.6.3
+ jl_apply_2va@Base 0.6.3
+ jl_apply_array_type@Base 0.6.3
+ jl_apply_generic@Base 0.6.3
+ jl_apply_tuple_type@Base 0.6.3
+ jl_apply_tuple_type_v@Base 0.6.3
+ jl_apply_type1@Base 0.6.3
+ jl_apply_type2@Base 0.6.3
+ jl_apply_type@Base 0.6.3
+ jl_apply_with_saved_exception_state@Base 0.6.3
+ jl_argument_datatype@Base 0.7.0~beta2
+ jl_argumenterror_type@Base 0.6.3
+ jl_array_any_type@Base 0.6.3
+ jl_array_cconvert_cstring@Base 0.6.3
+ jl_array_copy@Base 0.6.3
+ jl_array_data_owner@Base 0.6.3
+ jl_array_del_at@Base 0.6.3
+ jl_array_del_beg@Base 0.6.3
+ jl_array_del_end@Base 0.6.3
+ jl_array_eltype@Base 0.6.3
+ jl_array_grow_at@Base 0.6.3
+ jl_array_grow_beg@Base 0.6.3
+ jl_array_grow_end@Base 0.6.3
+ jl_array_int32_type@Base 0.7.0~beta2
+ jl_array_isassigned@Base 0.6.3
+ jl_array_ptr@Base 0.6.3
+ jl_array_ptr_1d_append@Base 0.6.3
+ jl_array_ptr_1d_push@Base 0.6.3
+ jl_array_ptr_copy@Base 0.6.3
+ jl_array_rank@Base 0.6.3
+ jl_array_size@Base 0.6.3
+ jl_array_sizehint@Base 0.6.3
+ jl_array_store_unboxed@Base 0.7.0~beta2
+ jl_array_symbol_type@Base 0.6.3
+ jl_array_to_string@Base 0.6.3
+ jl_array_type@Base 0.6.3
+ jl_array_typename@Base 0.6.3
+ jl_array_typetagdata@Base 0.7.0
+ jl_array_uint8_type@Base 0.6.3
+ jl_arraylen@Base 0.6.3
+ jl_arrayref@Base 0.6.3
+ jl_arrayset@Base 0.6.3
+ jl_arrayunset@Base 0.6.3
+ jl_ashr_int@Base 0.6.3
+ jl_ast_flag_inferred@Base 0.6.3
+ jl_ast_flag_inlineable@Base 0.6.3
+ jl_ast_flag_pure@Base 0.6.3
+ jl_astaggedvalue@Base 0.6.3
+ jl_atexit_hook@Base 0.6.3
+ jl_backtrace_from_here@Base 0.6.3
+ jl_base_module@Base 0.6.3
+ jl_base_relative_to@Base 0.6.3
+ jl_binding_owner@Base 0.7.0~beta2
+ jl_binding_resolved_p@Base 0.6.3
+ jl_bitcast@Base 0.6.3
+ jl_bool_type@Base 0.6.3
+ jl_bottom_type@Base 0.6.3
+ jl_boundp@Base 0.6.3
+ jl_bounds_error@Base 0.6.3
+ jl_bounds_error_int@Base 0.6.3
+ jl_bounds_error_ints@Base 0.6.3
+ jl_bounds_error_tuple_int@Base 0.6.3
+ jl_bounds_error_unboxed_int@Base 0.6.3
+ jl_bounds_error_v@Base 0.6.3
+ jl_boundserror_type@Base 0.6.3
+ jl_box_bool@Base 0.6.3
+ jl_box_char@Base 0.6.3
+ jl_box_float32@Base 0.6.3
+ jl_box_float64@Base 0.6.3
+ jl_box_int16@Base 0.6.3
+ jl_box_int32@Base 0.6.3
+ jl_box_int64@Base 0.6.3
+ jl_box_int8@Base 0.6.3
+ jl_box_slotnumber@Base 0.6.3
+ jl_box_ssavalue@Base 0.6.3
+ jl_box_uint16@Base 0.6.3
+ jl_box_uint32@Base 0.6.3
+ jl_box_uint64@Base 0.6.3
+ jl_box_uint8@Base 0.6.3
+ jl_box_voidpointer@Base 0.6.3
+ jl_breakpoint@Base 0.6.3
+ jl_bswap_int@Base 0.6.3
+ jl_builtin_type@Base 0.6.3
+ jl_call0@Base 0.6.3
+ jl_call1@Base 0.6.3
+ jl_call2@Base 0.6.3
+ jl_call3@Base 0.6.3
+ jl_call@Base 0.6.3
+ jl_calloc@Base 0.6.3
+ jl_capture_interp_frame@Base 0.7.0~beta2
+ jl_ceil_llvm@Base 0.6.3
+ jl_ceil_llvm_withtype@Base 0.6.3
+ jl_cglobal@Base 0.6.3
+ jl_cglobal_auto@Base 0.6.3
+ jl_char_type@Base 0.6.3
+ jl_checked_assignment@Base 0.6.3
+ jl_checked_sadd_int@Base 0.6.3
+ jl_checked_sdiv_int@Base 0.6.3
+ jl_checked_smul_int@Base 0.6.3
+ jl_checked_srem_int@Base 0.6.3
+ jl_checked_ssub_int@Base 0.6.3
+ jl_checked_uadd_int@Base 0.6.3
+ jl_checked_udiv_int@Base 0.6.3
+ jl_checked_umul_int@Base 0.6.3
+ jl_checked_urem_int@Base 0.6.3
+ jl_checked_usub_int@Base 0.6.3
+ jl_clear_malloc_data@Base 0.6.3
+ jl_clock_now@Base 0.6.3
+ jl_close_uv@Base 0.6.3
+ jl_code_for_staged@Base 0.6.3
+ jl_code_info_type@Base 0.6.3
+ jl_compile_hint@Base 0.6.3
+ jl_compress_ast@Base 0.6.3
+ jl_connect_raw@Base 0.6.3
+ jl_copy_ast@Base 0.6.3
+ jl_copy_code_info@Base 0.6.3
+ jl_copysign_float@Base 0.6.3
+ jl_core_module@Base 0.6.3
+ jl_cpu_pause@Base 0.6.3
+ jl_cpu_threads@Base 0.7.0~beta2
+ jl_cpu_wake@Base 0.6.3
+ (arch=amd64 i386)jl_cpuid@Base 0.6.3
+ (arch=amd64 i386)jl_cpuidex@Base 0.7.0~beta2
+ jl_crc32c@Base 0.6.3
+ jl_crc32c_sw@Base 0.7.0~beta2
+ jl_create_system_image@Base 0.6.3
+ jl_cstr_to_string@Base 0.6.3
+ jl_ctlz_int@Base 0.6.3
+ jl_ctpop_int@Base 0.6.3
+ jl_cttz_int@Base 0.6.3
+ jl_datatype_type@Base 0.6.3
+ jl_declare_constant@Base 0.6.3
+ jl_default_cgparams@Base 0.6.3
+ jl_defines_or_exports_p@Base 0.6.3
+ jl_densearray_type@Base 0.6.3
+ jl_deprecate_binding@Base 0.6.3
+ jl_div_float@Base 0.6.3
+ jl_diverror_exception@Base 0.6.3
+ jl_dl_handle@Base 0.6.3
+ jl_dlclose@Base 0.6.3
+ jl_dlopen@Base 0.6.3
+ jl_dlsym@Base 0.6.3
+ jl_dlsym_e@Base 0.6.3
+ jl_domain_exception@Base 0.6.3
+ jl_dump_compiles@Base 0.6.3
+ jl_dump_fptr_asm@Base 0.7.0~beta2
+ jl_dump_function_asm@Base 0.6.3
+ jl_dump_function_ir@Base 0.6.3
+ jl_dump_host_cpu@Base 0.7.0~beta2
+ jl_egal@Base 0.6.3
+ jl_emptysvec@Base 0.6.3
+ jl_emptytuple@Base 0.6.3
+ jl_emptytuple_type@Base 0.6.3
+ jl_enter_handler@Base 0.6.3
+ jl_environ@Base 0.6.3
+ jl_eof_error@Base 0.6.3
+ jl_eq_float@Base 0.6.3
+ jl_eq_int@Base 0.6.3
+ jl_eqtable_get@Base 0.6.3
+ jl_eqtable_nextind@Base 0.6.3
+ jl_eqtable_pop@Base 0.6.3
+ jl_eqtable_put@Base 0.6.3
+ jl_errno@Base 0.6.3
+ jl_error@Base 0.6.3
+ jl_errorexception_type@Base 0.6.3
+ jl_errorf@Base 0.6.3
+ jl_eval_string@Base 0.6.3
+ jl_exception_clear@Base 0.6.3
+ jl_exception_occurred@Base 0.6.3
+ jl_exceptionf@Base 0.6.3
+ jl_exe_handle@Base 0.6.3
+ jl_exit@Base 0.6.3
+ jl_exit_on_sigint@Base 0.6.3
+ jl_expand@Base 0.6.3
+ jl_expand_stmt@Base 0.7.0~beta2
+ jl_expr_type@Base 0.6.3
+ jl_extern_c@Base 0.6.3
+ jl_f__apply@Base 0.6.3
+ jl_f__apply_latest@Base 0.6.3
+ jl_f__apply_pure@Base 0.6.3
+ jl_f__expr@Base 0.6.3
+ jl_f_applicable@Base 0.6.3
+ jl_f_apply_type@Base 0.6.3
+ jl_f_arrayref@Base 0.6.3
+ jl_f_arrayset@Base 0.6.3
+ jl_f_arraysize@Base 0.6.3
+ jl_f_fieldtype@Base 0.6.3
+ jl_f_getfield@Base 0.6.3
+ jl_f_ifelse@Base 0.7.0~beta2
+ jl_f_intrinsic_call@Base 0.6.3
+ jl_f_invoke@Base 0.6.3
+ jl_f_invoke_kwsorter@Base 0.6.3
+ jl_f_is@Base 0.6.3
+ jl_f_isa@Base 0.6.3
+ jl_f_isdefined@Base 0.6.3
+ jl_f_issubtype@Base 0.6.3
+ jl_f_new_module@Base 0.6.3
+ jl_f_nfields@Base 0.6.3
+ jl_f_setfield@Base 0.6.3
+ jl_f_sizeof@Base 0.6.3
+ jl_f_svec@Base 0.6.3
+ jl_f_throw@Base 0.6.3
+ jl_f_tuple@Base 0.6.3
+ jl_f_typeassert@Base 0.6.3
+ jl_f_typeof@Base 0.6.3
+ jl_false@Base 0.6.3
+ jl_field_index@Base 0.6.3
+ jl_field_isdefined@Base 0.6.3
+ jl_filename@Base 0.6.3
+ jl_fill_argnames@Base 0.6.3
+ jl_finalize@Base 0.6.3
+ jl_finalize_th@Base 0.6.3
+ jl_find_free_typevars@Base 0.6.3
+ jl_first_argument_datatype@Base 0.6.3
+ jl_flipsign_int@Base 0.6.3
+ jl_float16_type@Base 0.6.3
+ jl_float32_type@Base 0.6.3
+ jl_float64_type@Base 0.6.3
+ jl_floatingpoint_type@Base 0.6.3
+ jl_floor_llvm@Base 0.6.3
+ jl_floor_llvm_withtype@Base 0.6.3
+ jl_flush_cstdio@Base 0.6.3
+ jl_fma_float@Base 0.6.3
+ jl_forceclose_uv@Base 0.6.3
+ jl_fpext@Base 0.6.3
+ jl_fpiseq@Base 0.6.3
+ jl_fpislt@Base 0.6.3
+ jl_fptosi@Base 0.6.3
+ jl_fptoui@Base 0.6.3
+ jl_fptr_args@Base 0.7.0~beta2
+ jl_fptr_const_return@Base 0.7.0~beta2
+ jl_fptr_interpret_call@Base 0.7.0~beta2
+ jl_fptr_sparam@Base 0.7.0~beta2
+ jl_fptr_trampoline@Base 0.7.0~beta2
+ jl_fptrunc@Base 0.6.3
+ jl_free@Base 0.6.3
+ jl_fs_chmod@Base 0.6.3
+ jl_fs_chown@Base 0.6.3
+ jl_fs_close@Base 0.6.3
+ jl_fs_read@Base 0.6.3
+ jl_fs_read_byte@Base 0.6.3
+ jl_fs_rename@Base 0.6.3
+ jl_fs_sendfile@Base 0.6.3
+ jl_fs_symlink@Base 0.6.3
+ jl_fs_unlink@Base 0.6.3
+ jl_fs_write@Base 0.6.3
+ jl_fstat@Base 0.6.3
+ jl_ftruncate@Base 0.6.3
+ jl_function_ptr@Base 0.6.3
+ jl_function_ptr_by_llvm_name@Base 0.6.3
+ jl_function_type@Base 0.6.3
+ jl_gc_add_finalizer@Base 0.6.3
+ jl_gc_add_finalizer_th@Base 0.6.3
+ jl_gc_add_ptr_finalizer@Base 0.6.3
+ jl_gc_alloc@Base 0.6.3
+ jl_gc_alloc_0w@Base 0.6.3
+ jl_gc_alloc_1w@Base 0.6.3
+ jl_gc_alloc_2w@Base 0.6.3
+ jl_gc_alloc_3w@Base 0.6.3
+ jl_gc_allocobj@Base 0.6.3
+ jl_gc_big_alloc@Base 0.6.3
+ jl_gc_collect@Base 0.6.3
+ jl_gc_counted_calloc@Base 0.6.3
+ jl_gc_counted_free@Base 0.6.3
+ jl_gc_counted_free_with_size@Base 0.7.0~beta2
+ jl_gc_counted_malloc@Base 0.6.3
+ jl_gc_counted_realloc_with_old_size@Base 0.6.3
+ jl_gc_diff_total_bytes@Base 0.6.3
+ jl_gc_enable@Base 0.6.3
+ jl_gc_enable_finalizers@Base 0.6.3
+ jl_gc_find_taggedvalue_pool@Base 0.6.3
+ jl_gc_is_enabled@Base 0.6.3
+ jl_gc_managed_malloc@Base 0.6.3
+ jl_gc_managed_realloc@Base 0.6.3
+ jl_gc_new_weakref@Base 0.6.3
+ jl_gc_new_weakref_th@Base 0.6.3
+ jl_gc_num@Base 0.6.3
+ jl_gc_pool_alloc@Base 0.6.3
+ jl_gc_queue_root@Base 0.6.3
+ jl_gc_safe_enter@Base 0.6.3
+ jl_gc_safe_leave@Base 0.6.3
+ jl_gc_safepoint@Base 0.6.3
+ jl_gc_total_bytes@Base 0.6.3
+ jl_gc_total_hrtime@Base 0.6.3
+ jl_gc_unsafe_enter@Base 0.6.3
+ jl_gc_unsafe_leave@Base 0.6.3
+ jl_gc_use@Base 0.7.0~beta2
+ jl_gdblookup@Base 0.6.3
+ jl_generating_output@Base 0.6.3
+ jl_generic_function_def@Base 0.6.3
+ jl_gensym@Base 0.6.3
+ jl_get_ARCH@Base 0.6.3
+ jl_get_JIT@Base 0.6.3
+ jl_get_LLVM_VERSION@Base 0.6.3
+ jl_get_UNAME@Base 0.6.3
+ jl_get_backtrace@Base 0.6.3
+ jl_get_binding@Base 0.6.3
+ jl_get_binding_for_method_def@Base 0.6.3
+ jl_get_binding_or_error@Base 0.6.3
+ jl_get_binding_wr@Base 0.6.3
+ jl_get_cfunction_trampoline@Base 0.7.0~beta2
+ jl_get_cpu_name@Base 0.6.3
+ jl_get_current_task@Base 0.6.3
+ jl_get_default_sysimg_path@Base 0.6.3
+ jl_get_dobj_data@Base 0.6.3
+ jl_get_fenv_consts@Base 0.6.3
+ jl_get_field@Base 0.6.3
+ jl_get_field_offset@Base 0.6.3
+ jl_get_global@Base 0.6.3
+ jl_get_image_file@Base 0.6.3
+ jl_get_invoke_lambda@Base 0.6.3
+ jl_get_julia_bin@Base 0.6.3
+ jl_get_julia_bindir@Base 0.7.0~beta2
+ jl_get_keyword_sorter@Base 0.6.3
+ jl_get_kwsorter@Base 0.6.3
+ jl_get_llvm_fptr@Base 0.6.3
+ jl_get_llvmf_decl@Base 0.6.3
+ jl_get_llvmf_defn@Base 0.6.3
+ jl_get_module_of_binding@Base 0.6.3
+ jl_get_nth_field@Base 0.6.3
+ jl_get_nth_field_checked@Base 0.6.3
+ jl_get_nth_field_noalloc@Base 0.7.0~beta2
+ jl_get_ptls_states@Base 0.6.3
+ jl_get_root_symbol@Base 0.6.3
+ jl_get_section_start@Base 0.6.3
+ jl_get_size@Base 0.6.3
+ jl_get_spec_lambda@Base 0.6.3
+ jl_get_tls_world_age@Base 0.6.3
+ jl_get_world_counter@Base 0.6.3
+ jl_get_zero_subnormals@Base 0.6.3
+ jl_getaddrinfo@Base 0.6.3
+ jl_getallocationgranularity@Base 0.6.3
+ jl_getnameinfo6@Base 0.7.0~beta2
+ jl_getnameinfo@Base 0.7.0~beta2
+ jl_getpagesize@Base 0.6.3
+ jl_getpid@Base 0.6.3
+ jl_gettimeofday@Base 0.6.3
+ jl_getutf8@Base 0.6.3
+ jl_gf_invoke_lookup@Base 0.6.3
+ jl_git_branch@Base 0.6.3
+ jl_git_commit@Base 0.6.3
+ jl_global_event_loop@Base 0.6.3
+ jl_globalref_type@Base 0.6.3
+ jl_gotonode_type@Base 0.6.3
+ jl_has_call_ambiguities@Base 0.6.3
+ jl_has_empty_intersection@Base 0.6.3
+ jl_has_free_typevars@Base 0.6.3
+ jl_has_so_reuseport@Base 0.7.0~beta2
+ jl_has_typevar@Base 0.6.3
+ jl_has_typevar_from_unionall@Base 0.6.3
+ jl_hrtime@Base 0.6.3
+ jl_id_char@Base 0.6.3
+ jl_id_start_char@Base 0.6.3
+ jl_idtable_rehash@Base 0.6.3
+ jl_incomplete_sym@Base 0.6.3
+ jl_infer_thunk@Base 0.7.0~beta2
+ jl_init__threading@Base 0.7.0~beta2
+ jl_init_restored_modules@Base 0.7.0
+ jl_init_with_image__threading@Base 0.7.0~beta2
+ jl_initerror_type@Base 0.6.3
+ jl_install_sigint_handler@Base 0.6.3
+ jl_instantiate_type_in_env@Base 0.6.3
+ jl_instantiate_unionall@Base 0.6.3
+ jl_int16_type@Base 0.6.3
+ jl_int32_type@Base 0.6.3
+ jl_int64_type@Base 0.6.3
+ jl_int8_type@Base 0.6.3
+ jl_internal_main_module@Base 0.6.3
+ jl_interrupt_exception@Base 0.6.3
+ jl_intersect_types@Base 0.6.3
+ jl_intrinsic_name@Base 0.6.3
+ jl_intrinsic_type@Base 0.6.3
+ jl_invoke@Base 0.6.3
+ jl_invoke_api@Base 0.7.0~beta2
+ jl_ios_fd@Base 0.6.3
+ jl_ios_get_nbyte_int@Base 0.6.3
+ jl_is_binding_deprecated@Base 0.6.3
+ jl_is_call_ambiguous@Base 0.7.0~beta2
+ jl_is_char_signed@Base 0.6.3
+ jl_is_const@Base 0.6.3
+ jl_is_debugbuild@Base 0.6.3
+ jl_is_enter_interpreter_frame@Base 0.7.0~beta2
+ jl_is_identifier@Base 0.7.0~beta2
+ jl_is_imported@Base 0.6.3
+ jl_is_in_pure_context@Base 0.6.3
+ jl_is_initialized@Base 0.6.3
+ jl_is_interpreter_frame@Base 0.7.0~beta2
+ jl_is_memdebug@Base 0.6.3
+ jl_is_not_broken_subtype@Base 0.7.0~beta2
+ jl_is_operator@Base 0.6.3
+ jl_is_task_started@Base 0.6.3
+ jl_is_unary_and_binary_operator@Base 0.7.0~beta2
+ jl_is_unary_operator@Base 0.7.0~beta2
+ jl_isa@Base 0.6.3
+ jl_isa_compileable_sig@Base 0.7.0~beta2
+ jl_islayout_inline@Base 0.7.0~beta2
+ jl_istopmod@Base 0.6.3
+ jl_le_float@Base 0.6.3
+ jl_lineinfonode_type@Base 0.7.0~beta2
+ jl_lineno@Base 0.6.3
+ jl_linenumbernode_type@Base 0.6.3
+ jl_lisp_prompt@Base 0.6.3
+ jl_load@Base 0.6.3
+ jl_load_@Base 0.6.3
+ jl_load_and_lookup@Base 0.6.3
+ jl_load_dynamic_library@Base 0.6.3
+ jl_load_dynamic_library_e@Base 0.6.3
+ jl_load_file_string@Base 0.6.3
+ jl_loaderror_type@Base 0.6.3
+ jl_lookup_code_address@Base 0.6.3
+ jl_lseek@Base 0.6.3
+ jl_lshr_int@Base 0.6.3
+ jl_lstat@Base 0.6.3
+ jl_lt_float@Base 0.6.3
+ jl_macroexpand1@Base 0.7.0~beta2
+ jl_macroexpand@Base 0.6.3
+ jl_main_module@Base 0.6.3
+ jl_malloc@Base 0.6.3
+ jl_matching_methods@Base 0.6.3
+ jl_maxrss@Base 0.6.3
+ jl_memory_exception@Base 0.6.3
+ jl_method_def@Base 0.6.3
+ jl_method_exists@Base 0.6.3
+ jl_method_instance_add_backedge@Base 0.6.3
+ jl_method_instance_type@Base 0.6.3
+ jl_method_table_add_backedge@Base 0.6.3
+ jl_method_table_disable@Base 0.7.0~beta2
+ jl_method_table_insert@Base 0.6.3
+ jl_method_type@Base 0.6.3
+ jl_methoderror_type@Base 0.6.3
+ jl_methtable_lookup@Base 0.6.3
+ jl_methtable_type@Base 0.6.3
+ jl_mmap@Base 0.6.3
+ jl_module_build_id@Base 0.7.0~beta2
+ jl_module_export@Base 0.6.3
+ jl_module_exports_p@Base 0.6.3
+ jl_module_globalref@Base 0.6.3
+ jl_module_import@Base 0.6.3
+ jl_module_name@Base 0.6.3
+ jl_module_names@Base 0.6.3
+ jl_module_parent@Base 0.6.3
+ jl_module_type@Base 0.6.3
+ jl_module_use@Base 0.6.3
+ jl_module_using@Base 0.6.3
+ jl_module_usings@Base 0.6.3
+ jl_module_uuid@Base 0.6.3
+ jl_mul_float@Base 0.6.3
+ jl_mul_int@Base 0.6.3
+ jl_muladd_float@Base 0.6.3
+ jl_n_threads@Base 0.6.3
+ jl_namedtuple_type@Base 0.7.0~beta2
+ jl_namedtuple_typename@Base 0.7.0~beta2
+ jl_native_alignment@Base 0.6.3
+ jl_nb_available@Base 0.6.3
+ jl_ne_float@Base 0.6.3
+ jl_ne_int@Base 0.6.3
+ jl_neg_float@Base 0.6.3
+ jl_neg_float_withtype@Base 0.6.3
+ jl_neg_int@Base 0.6.3
+ jl_new_array@Base 0.6.3
+ jl_new_bits@Base 0.6.3
+ jl_new_code_info_uninit@Base 0.6.3
+ jl_new_datatype@Base 0.6.3
+ jl_new_main_module@Base 0.6.3
+ jl_new_method_instance_uninit@Base 0.6.3
+ jl_new_method_table@Base 0.6.3
+ jl_new_method_uninit@Base 0.6.3
+ jl_new_module@Base 0.6.3
+ jl_new_primitivetype@Base 0.6.3
+ jl_new_struct@Base 0.6.3
+ jl_new_struct_uninit@Base 0.6.3
+ jl_new_structv@Base 0.6.3
+ jl_new_task@Base 0.6.3
+ jl_new_typename_in@Base 0.6.3
+ jl_new_typevar@Base 0.6.3
+ jl_newvarnode_type@Base 0.6.3
+ jl_next_from_addrinfo@Base 0.6.3
+ jl_no_exc_handler@Base 0.6.3
+ jl_not_int@Base 0.6.3
+ jl_nothing@Base 0.6.3
+ jl_number_type@Base 0.6.3
+ jl_object_id@Base 0.6.3
+ jl_op_suffix_char@Base 0.7.0~beta2
+ jl_operator_precedence@Base 0.6.3
+ jl_options@Base 0.6.3
+ jl_or_int@Base 0.6.3
+ jl_overflow_exception@Base 0.6.3
+ jl_parse_input_line@Base 0.6.3
+ jl_parse_opts@Base 0.6.3
+ jl_parse_string@Base 0.6.3
+ jl_pathname_for_handle@Base 0.6.3
+ jl_pchar_to_array@Base 0.6.3
+ jl_pchar_to_string@Base 0.6.3
+ jl_phicnode_type@Base 0.7.0~beta2
+ jl_phinode_type@Base 0.7.0~beta2
+ jl_pinode_type@Base 0.7.0~beta2
+ jl_pipe_open@Base 0.7.0~beta2
+ jl_pointer_type@Base 0.6.3
+ jl_pointer_typename@Base 0.6.3
+ jl_pointerref@Base 0.6.3
+ jl_pointerset@Base 0.6.3
+ jl_pop_handler@Base 0.6.3
+ jl_preload_sysimg_so@Base 0.6.3
+ jl_prepend_cwd@Base 0.7.0~beta2
+ jl_printf@Base 0.6.3
+ jl_process_events@Base 0.6.3
+ jl_profile_clear_data@Base 0.6.3
+ jl_profile_delay_nsec@Base 0.6.3
+ jl_profile_get_data@Base 0.6.3
+ jl_profile_init@Base 0.6.3
+ jl_profile_is_running@Base 0.6.3
+ jl_profile_len_data@Base 0.6.3
+ jl_profile_maxlen_data@Base 0.6.3
+ jl_profile_start_timer@Base 0.6.3
+ jl_profile_stop_timer@Base 0.6.3
+ jl_ptr_to_array@Base 0.6.3
+ jl_ptr_to_array_1d@Base 0.6.3
+ jl_ptrarrayref@Base 0.7.0~beta2
+ jl_pwrite@Base 0.6.3
+ jl_queue_work@Base 0.6.3
+ jl_quotenode_type@Base 0.6.3
+ jl_raise_debugger@Base 0.6.3
+ jl_read_verify_header@Base 0.6.3
+ jl_readonlymemory_exception@Base 0.6.3
+ jl_readuntil@Base 0.6.3
+ jl_realloc@Base 0.6.3
+ jl_ref_type@Base 0.6.3
+ jl_register_linfo_tracer@Base 0.6.3
+ jl_register_method_tracer@Base 0.6.3
+ jl_register_newmeth_tracer@Base 0.6.3
+ jl_rem_float@Base 0.6.3
+ jl_repl_raise_sigtstp@Base 0.7.0~beta2
+ jl_reshape_array@Base 0.6.3
+ jl_restore_incremental@Base 0.6.3
+ jl_restore_incremental_from_buf@Base 0.6.3
+ jl_restore_system_image@Base 0.6.3
+ jl_restore_system_image_data@Base 0.6.3
+ jl_rethrow@Base 0.6.3
+ jl_rethrow_other@Base 0.6.3
+ jl_rint_llvm@Base 0.6.3
+ jl_rint_llvm_withtype@Base 0.6.3
+ jl_run_event_loop@Base 0.6.3
+ jl_run_once@Base 0.6.3
+ jl_running_on_valgrind@Base 0.6.3
+ jl_safe_printf@Base 0.6.3
+ jl_save_incremental@Base 0.6.3
+ jl_save_system_image@Base 0.6.3
+ jl_sdiv_int@Base 0.6.3
+ jl_set_ARGS@Base 0.6.3
+ jl_set_const@Base 0.6.3
+ jl_set_errno@Base 0.6.3
+ jl_set_global@Base 0.6.3
+ jl_set_istopmod@Base 0.6.3
+ jl_set_method_inferred@Base 0.6.3
+ jl_set_module_nospecialize@Base 0.7.0
+ jl_set_module_uuid@Base 0.7.0~beta2
+ jl_set_nth_field@Base 0.6.3
+ jl_set_ptls_states_getter@Base 0.6.3
+ jl_set_sysimg_so@Base 0.6.3
+ jl_set_typeinf_func@Base 0.6.3
+ jl_set_zero_subnormals@Base 0.6.3
+ jl_sext_int@Base 0.6.3
+ jl_shl_int@Base 0.6.3
+ jl_sigatomic_begin@Base 0.6.3
+ jl_sigatomic_end@Base 0.6.3
+ jl_signal_pending@Base 0.6.3
+ jl_signed_type@Base 0.6.3
+ jl_simplevector_type@Base 0.6.3
+ jl_sitofp@Base 0.6.3
+ jl_sizeof_ios_t@Base 0.6.3
+ jl_sizeof_jl_options@Base 0.7.0~beta2
+ jl_sizeof_mode_t@Base 0.6.3
+ jl_sizeof_off_t@Base 0.6.3
+ jl_sizeof_stat@Base 0.6.3
+ jl_sizeof_uv_fs_t@Base 0.6.3
+ jl_sizeof_uv_mutex@Base 0.6.3
+ jl_sle_int@Base 0.6.3
+ jl_slotnumber_type@Base 0.6.3
+ jl_slt_int@Base 0.6.3
+ jl_smod_int@Base 0.6.3
+ jl_sockaddr_from_addrinfo@Base 0.6.3
+ jl_sockaddr_host4@Base 0.6.3
+ jl_sockaddr_host6@Base 0.6.3
+ jl_sockaddr_in_is_ip4@Base 0.6.3
+ jl_sockaddr_in_is_ip6@Base 0.6.3
+ jl_sockaddr_is_ip4@Base 0.6.3
+ jl_sockaddr_is_ip6@Base 0.6.3
+ jl_sockaddr_set_port@Base 0.6.3
+ jl_spawn@Base 0.6.3
+ jl_specializations_get_linfo@Base 0.6.3
+ jl_specializations_lookup@Base 0.6.3
+ jl_sqrt_llvm@Base 0.6.3
+ jl_sqrt_llvm_withtype@Base 0.6.3
+ jl_srem_int@Base 0.6.3
+ jl_ssavalue_type@Base 0.6.3
+ jl_stackovf_exception@Base 0.6.3
+ jl_stat@Base 0.6.3
+ jl_stat_blksize@Base 0.6.3
+ jl_stat_blocks@Base 0.6.3
+ jl_stat_ctime@Base 0.6.3
+ jl_stat_dev@Base 0.6.3
+ jl_stat_gid@Base 0.6.3
+ jl_stat_ino@Base 0.6.3
+ jl_stat_mode@Base 0.6.3
+ jl_stat_mtime@Base 0.6.3
+ jl_stat_nlink@Base 0.6.3
+ jl_stat_rdev@Base 0.6.3
+ jl_stat_size@Base 0.6.3
+ jl_stat_uid@Base 0.6.3
+ jl_static_show@Base 0.6.3
+ jl_static_show_func_sig@Base 0.6.3
+ jl_stderr_obj@Base 0.6.3
+ jl_stderr_stream@Base 0.6.3
+ jl_stdin_stream@Base 0.6.3
+ jl_stdout_obj@Base 0.6.3
+ jl_stdout_stream@Base 0.6.3
+ jl_string_ptr@Base 0.6.3
+ jl_string_to_array@Base 0.6.3
+ jl_string_type@Base 0.6.3
+ jl_strtod_c@Base 0.6.3
+ jl_strtof_c@Base 0.6.3
+ jl_sub_float@Base 0.6.3
+ jl_sub_int@Base 0.6.3
+ jl_sub_ptr@Base 0.7.0~beta2
+ jl_substrtod@Base 0.6.3
+ jl_substrtof@Base 0.6.3
+ jl_subtype@Base 0.6.3
+ jl_subtype_env@Base 0.6.3
+ jl_subtype_env_size@Base 0.6.3
+ jl_svec1@Base 0.6.3
+ jl_svec2@Base 0.6.3
+ jl_svec@Base 0.6.3
+ jl_svec_copy@Base 0.6.3
+ jl_svec_fill@Base 0.6.3
+ jl_switchto@Base 0.6.3
+ jl_sym_type@Base 0.6.3
+ jl_symbol@Base 0.6.3
+ jl_symbol_lookup@Base 0.6.3
+ jl_symbol_n@Base 0.6.3
+ jl_symbol_name@Base 0.6.3
+ jl_symbol_type@Base 0.6.3
+ jl_tagged_gensym@Base 0.6.3
+ jl_take_buffer@Base 0.6.3
+ jl_task_type@Base 0.6.3
+ jl_tcp4_connect@Base 0.6.3
+ jl_tcp6_connect@Base 0.6.3
+ jl_tcp_bind6@Base 0.6.3
+ jl_tcp_bind@Base 0.6.3
+ jl_tcp_getpeername@Base 0.6.3
+ jl_tcp_getsockname@Base 0.6.3
+ jl_tcp_quickack@Base 0.6.3
+ jl_tcp_reuseport@Base 0.6.3
+ jl_threadid@Base 0.6.3
+ jl_threading_enabled@Base 0.6.3
+ jl_threading_profile@Base 0.6.3
+ jl_threading_run@Base 0.6.3
+ jl_throw@Base 0.6.3
+ jl_too_few_args@Base 0.6.3
+ jl_too_many_args@Base 0.6.3
+ jl_top_module@Base 0.6.3
+ jl_toplevel_eval@Base 0.6.3
+ jl_toplevel_eval_in@Base 0.6.3
+ jl_trace_linfo@Base 0.6.3
+ jl_trace_method@Base 0.6.3
+ jl_true@Base 0.6.3
+ jl_trunc_int@Base 0.6.3
+ jl_trunc_llvm@Base 0.6.3
+ jl_trunc_llvm_withtype@Base 0.6.3
+ jl_try_substrtod@Base 0.6.3
+ jl_try_substrtof@Base 0.6.3
+ jl_tty_set_mode@Base 0.6.3
+ jl_tuple_typename@Base 0.6.3
+ jl_tupletype_fill@Base 0.6.3
+ jl_tvar_type@Base 0.6.3
+ jl_type_error@Base 0.6.3
+ jl_type_error_new_expr@Base 0.7.0~beta2
+ jl_type_error_rt@Base 0.6.3
+ jl_type_intersection@Base 0.6.3
+ jl_type_intersection_with_env@Base 0.7.0~beta2
+ jl_type_morespecific@Base 0.6.3
+ jl_type_morespecific_no_subtype@Base 0.6.3
+ jl_type_type@Base 0.6.3
+ jl_type_typename@Base 0.6.3
+ jl_type_union@Base 0.6.3
+ jl_type_unionall@Base 0.6.3
+ jl_typeassert@Base 0.6.3
+ jl_typedslot_type@Base 0.6.3
+ jl_typeerror_type@Base 0.6.3
+ jl_typeinf_begin@Base 0.6.3
+ jl_typeinf_end@Base 0.6.3
+ jl_typemap_entry_type@Base 0.6.3
+ jl_typemap_level_type@Base 0.6.3
+ jl_typemax_uint@Base 0.7.0~beta2
+ jl_typename_str@Base 0.6.3
+ jl_typename_type@Base 0.6.3
+ jl_typeof@Base 0.6.3
+ jl_typeof_str@Base 0.6.3
+ jl_typeofbottom_type@Base 0.6.3
+ jl_types_equal@Base 0.6.3
+ jl_typetype_type@Base 0.6.3
+ jl_udiv_int@Base 0.6.3
+ jl_udp_bind6@Base 0.6.3
+ jl_udp_bind@Base 0.6.3
+ jl_udp_send6@Base 0.6.3
+ jl_udp_send@Base 0.6.3
+ jl_uint16_type@Base 0.6.3
+ jl_uint32_type@Base 0.6.3
+ jl_uint64_type@Base 0.6.3
+ jl_uint8_type@Base 0.6.3
+ jl_uitofp@Base 0.6.3
+ jl_ule_int@Base 0.6.3
+ jl_ult_int@Base 0.6.3
+ jl_unbox_bool@Base 0.6.3
+ jl_unbox_float32@Base 0.6.3
+ jl_unbox_float64@Base 0.6.3
+ jl_unbox_int16@Base 0.6.3
+ jl_unbox_int32@Base 0.6.3
+ jl_unbox_int64@Base 0.6.3
+ jl_unbox_int8@Base 0.6.3
+ jl_unbox_uint16@Base 0.6.3
+ jl_unbox_uint32@Base 0.6.3
+ jl_unbox_uint64@Base 0.6.3
+ jl_unbox_uint8@Base 0.6.3
+ jl_unbox_voidpointer@Base 0.6.3
+ jl_uncompress_ast@Base 0.6.3
+ jl_undefined_var_error@Base 0.6.3
+ jl_undefref_exception@Base 0.6.3
+ jl_undefvarerror_type@Base 0.6.3
+ jl_unionall_type@Base 0.6.3
+ jl_uniontype_type@Base 0.6.3
+ jl_untrace_linfo@Base 0.6.3
+ jl_untrace_method@Base 0.6.3
+ jl_upsilonnode_type@Base 0.7.0~beta2
+ jl_urem_int@Base 0.6.3
+ jl_uv_associate_julia_struct@Base 0.6.3
+ jl_uv_buf_base@Base 0.6.3
+ jl_uv_buf_len@Base 0.6.3
+ jl_uv_buf_set_base@Base 0.6.3
+ jl_uv_buf_set_len@Base 0.6.3
+ jl_uv_connect_handle@Base 0.6.3
+ jl_uv_disassociate_julia_struct@Base 0.6.3
+ jl_uv_file_handle@Base 0.6.3
+ jl_uv_fs_req_cleanup@Base 0.6.3
+ jl_uv_fs_result@Base 0.6.3
+ jl_uv_fs_t_ptr@Base 0.6.3
+ jl_uv_handle@Base 0.6.3
+ jl_uv_handle_data@Base 0.6.3
+ jl_uv_handle_type@Base 0.6.3
+ jl_uv_interface_address_is_internal@Base 0.6.3
+ jl_uv_interface_address_sockaddr@Base 0.6.3
+ jl_uv_interface_addresses@Base 0.6.3
+ jl_uv_process_data@Base 0.6.3
+ jl_uv_putb@Base 0.6.3
+ jl_uv_putc@Base 0.6.3
+ jl_uv_puts@Base 0.6.3
+ jl_uv_req_data@Base 0.6.3
+ jl_uv_req_set_data@Base 0.6.3
+ jl_uv_sizeof_interface_address@Base 0.6.3
+ jl_uv_stderr@Base 0.6.3
+ jl_uv_stdin@Base 0.6.3
+ jl_uv_stdout@Base 0.6.3
+ jl_uv_unix_fd_is_watched@Base 0.6.3
+ jl_uv_write@Base 0.6.3
+ jl_uv_write_handle@Base 0.6.3
+ jl_uv_writecb@Base 0.6.3
+ jl_value_ptr@Base 0.6.3
+ jl_valueof@Base 0.6.3
+ jl_vararg_type@Base 0.6.3
+ jl_vararg_typename@Base 0.6.3
+ jl_vecelement_typename@Base 0.6.3
+ jl_ver_is_release@Base 0.6.3
+ jl_ver_major@Base 0.6.3
+ jl_ver_minor@Base 0.6.3
+ jl_ver_patch@Base 0.6.3
+ jl_ver_string@Base 0.6.3
+ jl_void_type@Base 0.6.3
+ jl_voidpointer_type@Base 0.6.3
+ jl_vprintf@Base 0.6.3
+ jl_weakref_type@Base 0.6.3
+ jl_world_counter@Base 0.6.3
+ jl_xor_int@Base 0.6.3
+ jl_yield@Base 0.6.3
+ jl_zext_int@Base 0.6.3
+ jlbacktrace@Base 0.6.3
+ julia_init__threading@Base 0.7.0~beta2
+ julia_type_to_llvm@Base 0.6.3
+ libsupport_init@Base 0.6.3
+ memhash32@Base 0.6.3
+ memhash32_seed@Base 0.6.3
+ memhash@Base 0.6.3
+ memhash_seed@Base 0.6.3
+ u8_charnum@Base 0.6.3
+ u8_isvalid@Base 0.6.3
+ u8_offset@Base 0.6.3
+ u8_strwidth@Base 0.6.3
+ uv__accept4@Base 0.7.0~beta2
+ uv__accept@Base 0.7.0~beta2
+ uv__async_close@Base 0.7.0~beta2
+ uv__async_stop@Base 0.7.0~beta2
+ uv__calloc@Base 0.7.0~beta2
+ uv__check_close@Base 0.7.0~beta2
+ uv__cloexec_fcntl@Base 0.7.0~beta2
+ uv__cloexec_ioctl@Base 0.7.0~beta2
+ uv__close@Base 0.7.0~beta2
+ uv__close_nocheckstdio@Base 0.7.0~beta2
+ uv__count_bufs@Base 0.7.0~beta2
+ uv__dup2_cloexec@Base 0.7.0~beta2
+ uv__dup3@Base 0.7.0~beta2
+ uv__dup@Base 0.7.0~beta2
+ uv__epoll_create1@Base 0.7.0~beta2
+ uv__epoll_create@Base 0.7.0~beta2
+ uv__epoll_ctl@Base 0.7.0~beta2
+ uv__epoll_pwait@Base 0.7.0~beta2
+ uv__epoll_wait@Base 0.7.0~beta2
+ uv__eventfd2@Base 0.7.0~beta2
+ uv__eventfd@Base 0.7.0~beta2
+ uv__free@Base 0.7.0~beta2
+ uv__fs_event_close@Base 0.7.0~beta2
+ uv__fs_poll_close@Base 0.7.0~beta2
+ uv__fs_scandir_cleanup@Base 0.7.0~beta2
+ uv__getaddrinfo_translate_error@Base 0.7.0~beta2
+ uv__getiovmax@Base 0.7.0~beta2
+ uv__getpwuid_r@Base 0.7.0~beta2
+ uv__handle_type@Base 0.7.0~beta2
+ uv__hrtime@Base 0.7.0~beta2
+ uv__idle_close@Base 0.7.0~beta2
+ uv__inotify_add_watch@Base 0.7.0~beta2
+ uv__inotify_init1@Base 0.7.0~beta2
+ uv__inotify_init@Base 0.7.0~beta2
+ uv__inotify_rm_watch@Base 0.7.0~beta2
+ uv__io_active@Base 0.7.0~beta2
+ uv__io_check_fd@Base 0.7.0~beta2
+ uv__io_close@Base 0.7.0~beta2
+ uv__io_feed@Base 0.7.0~beta2
+ uv__io_init@Base 0.7.0~beta2
+ uv__io_poll@Base 0.7.0~beta2
+ uv__io_start@Base 0.7.0~beta2
+ uv__io_stop@Base 0.7.0~beta2
+ uv__loop_close@Base 0.7.0~beta2
+ uv__loop_configure@Base 0.7.0~beta2
+ uv__make_close_pending@Base 0.7.0~beta2
+ uv__make_pipe@Base 0.7.0~beta2
+ uv__malloc@Base 0.7.0~beta2
+ uv__next_timeout@Base 0.7.0~beta2
+ uv__nonblock_fcntl@Base 0.7.0~beta2
+ uv__nonblock_ioctl@Base 0.7.0~beta2
+ uv__open_cloexec@Base 0.7.0~beta2
+ uv__open_file@Base 0.7.0~beta2
+ uv__pipe2@Base 0.7.0~beta2
+ uv__pipe_close@Base 0.7.0~beta2
+ uv__platform_invalidate_fd@Base 0.7.0~beta2
+ uv__platform_loop_delete@Base 0.7.0~beta2
+ uv__platform_loop_init@Base 0.7.0~beta2
+ uv__poll_close@Base 0.7.0~beta2
+ uv__preadv@Base 0.7.0~beta2
+ uv__prepare_close@Base 0.7.0~beta2
+ uv__process_close@Base 0.7.0~beta2
+ uv__pwritev@Base 0.7.0~beta2
+ uv__realloc@Base 0.7.0~beta2
+ uv__recvmmsg@Base 0.7.0~beta2
+ uv__recvmsg@Base 0.7.0~beta2
+ uv__run_check@Base 0.7.0~beta2
+ uv__run_idle@Base 0.7.0~beta2
+ uv__run_prepare@Base 0.7.0~beta2
+ uv__run_timers@Base 0.7.0~beta2
+ uv__sendmmsg@Base 0.7.0~beta2
+ uv__server_io@Base 0.7.0~beta2
+ uv__set_process_title@Base 0.7.0~beta2
+ uv__signal_close@Base 0.7.0~beta2
+ uv__signal_global_once_init@Base 0.7.0~beta2
+ uv__signal_loop_cleanup@Base 0.7.0~beta2
+ uv__socket@Base 0.7.0~beta2
+ uv__socket_sockopt@Base 0.7.0~beta2
+ uv__strdup@Base 0.7.0~beta2
+ uv__stream_close@Base 0.7.0~beta2
+ uv__stream_destroy@Base 0.7.0~beta2
+ uv__stream_flush_write_queue@Base 0.7.0~beta2
+ uv__stream_init@Base 0.7.0~beta2
+ uv__stream_open@Base 0.7.0~beta2
+ uv__strndup@Base 0.7.0~beta2
+ uv__tcp_bind@Base 0.7.0~beta2
+ uv__tcp_close@Base 0.7.0~beta2
+ uv__tcp_connect@Base 0.7.0~beta2
+ uv__tcp_keepalive@Base 0.7.0~beta2
+ uv__tcp_nodelay@Base 0.7.0~beta2
+ uv__timer_close@Base 0.7.0~beta2
+ uv__udp_bind@Base 0.7.0~beta2
+ uv__udp_close@Base 0.7.0~beta2
+ uv__udp_finish_close@Base 0.7.0~beta2
+ uv__udp_recv_start@Base 0.7.0~beta2
+ uv__udp_recv_stop@Base 0.7.0~beta2
+ uv__udp_send@Base 0.7.0~beta2
+ uv__udp_try_send@Base 0.7.0~beta2
+ uv__utimesat@Base 0.7.0~beta2
+ uv__work_done@Base 0.7.0~beta2
+ uv__work_submit@Base 0.7.0~beta2
+ uv_accept@Base 0.6.3
+ uv_async_init@Base 0.6.3
+ uv_async_send@Base 0.6.3
+ uv_backend_fd@Base 0.6.3
+ uv_backend_timeout@Base 0.6.3
+ uv_barrier_destroy@Base 0.6.3
+ uv_barrier_init@Base 0.6.3
+ uv_barrier_wait@Base 0.6.3
+ uv_buf_init@Base 0.6.3
+ uv_cancel@Base 0.6.3
+ uv_chdir@Base 0.6.3
+ uv_check_init@Base 0.6.3
+ uv_check_start@Base 0.6.3
+ uv_check_stop@Base 0.6.3
+ uv_close@Base 0.6.3
+ uv_cond_broadcast@Base 0.6.3
+ uv_cond_destroy@Base 0.6.3
+ uv_cond_init@Base 0.6.3
+ uv_cond_signal@Base 0.6.3
+ uv_cond_timedwait@Base 0.6.3
+ uv_cond_wait@Base 0.6.3
+ uv_cpu_info@Base 0.6.3
+ uv_cpumask_size@Base 0.7.0~beta2
+ uv_cwd@Base 0.6.3
+ uv_default_loop@Base 0.6.3
+ uv_disable_stdio_inheritance@Base 0.6.3
+ uv_dlclose@Base 0.6.3
+ uv_dlerror@Base 0.6.3
+ uv_dlopen@Base 0.6.3
+ uv_dlsym@Base 0.6.3
+ uv_err_name@Base 0.6.3
+ uv_exepath@Base 0.6.3
+ uv_fileno@Base 0.6.3
+ uv_free_cpu_info@Base 0.6.3
+ uv_free_interface_addresses@Base 0.6.3
+ uv_freeaddrinfo@Base 0.6.3
+ uv_fs_access@Base 0.6.3
+ uv_fs_chmod@Base 0.6.3
+ uv_fs_chown@Base 0.6.3
+ uv_fs_close@Base 0.6.3
+ uv_fs_event_getpath@Base 0.6.3
+ uv_fs_event_init@Base 0.6.3
+ uv_fs_event_start@Base 0.6.3
+ uv_fs_event_stop@Base 0.6.3
+ uv_fs_fchmod@Base 0.6.3
+ uv_fs_fchown@Base 0.6.3
+ uv_fs_fdatasync@Base 0.6.3
+ uv_fs_fstat@Base 0.6.3
+ uv_fs_fsync@Base 0.6.3
+ uv_fs_ftruncate@Base 0.6.3
+ uv_fs_futime@Base 0.6.3
+ uv_fs_link@Base 0.6.3
+ uv_fs_lstat@Base 0.6.3
+ uv_fs_mkdir@Base 0.6.3
+ uv_fs_mkdtemp@Base 0.6.3
+ uv_fs_open@Base 0.6.3
+ uv_fs_poll_getpath@Base 0.6.3
+ uv_fs_poll_init@Base 0.6.3
+ uv_fs_poll_start@Base 0.6.3
+ uv_fs_poll_stop@Base 0.6.3
+ uv_fs_read@Base 0.6.3
+ uv_fs_readlink@Base 0.6.3
+ uv_fs_realpath@Base 0.6.3
+ uv_fs_rename@Base 0.6.3
+ uv_fs_req_cleanup@Base 0.6.3
+ uv_fs_rmdir@Base 0.6.3
+ uv_fs_scandir@Base 0.6.3
+ uv_fs_scandir_next@Base 0.6.3
+ uv_fs_sendfile@Base 0.6.3
+ uv_fs_stat@Base 0.6.3
+ uv_fs_symlink@Base 0.6.3
+ uv_fs_unlink@Base 0.6.3
+ uv_fs_utime@Base 0.6.3
+ uv_fs_write@Base 0.6.3
+ uv_get_free_memory@Base 0.6.3
+ uv_get_process_title@Base 0.6.3
+ uv_get_total_memory@Base 0.6.3
+ uv_getaddrinfo@Base 0.6.3
+ uv_getnameinfo@Base 0.6.3
+ uv_getrusage@Base 0.6.3
+ uv_guess_handle@Base 0.6.3
+ uv_handle_size@Base 0.6.3
+ uv_has_ref@Base 0.6.3
+ uv_hrtime@Base 0.6.3
+ uv_idle_init@Base 0.6.3
+ uv_idle_start@Base 0.6.3
+ uv_idle_stop@Base 0.6.3
+ uv_inet_ntop@Base 0.6.3
+ uv_inet_pton@Base 0.6.3
+ uv_interface_addresses@Base 0.6.3
+ uv_ip4_addr@Base 0.6.3
+ uv_ip4_name@Base 0.6.3
+ uv_ip6_addr@Base 0.6.3
+ uv_ip6_name@Base 0.6.3
+ uv_is_active@Base 0.6.3
+ uv_is_closing@Base 0.6.3
+ uv_is_readable@Base 0.6.3
+ uv_is_writable@Base 0.6.3
+ uv_key_create@Base 0.6.3
+ uv_key_delete@Base 0.6.3
+ uv_key_get@Base 0.6.3
+ uv_key_set@Base 0.6.3
+ uv_kill@Base 0.6.3
+ uv_listen@Base 0.6.3
+ uv_loadavg@Base 0.6.3
+ uv_loop_alive@Base 0.6.3
+ uv_loop_close@Base 0.6.3
+ uv_loop_configure@Base 0.6.3
+ uv_loop_delete@Base 0.6.3
+ uv_loop_init@Base 0.6.3
+ uv_loop_new@Base 0.6.3
+ uv_loop_size@Base 0.6.3
+ uv_mutex_destroy@Base 0.6.3
+ uv_mutex_init@Base 0.6.3
+ uv_mutex_lock@Base 0.6.3
+ uv_mutex_trylock@Base 0.6.3
+ uv_mutex_unlock@Base 0.6.3
+ uv_now@Base 0.6.3
+ uv_once@Base 0.6.3
+ uv_os_free_passwd@Base 0.6.3
+ uv_os_get_passwd@Base 0.6.3
+ uv_os_getenv@Base 0.7.0~beta2
+ uv_os_gethostname@Base 0.7.0~beta2
+ uv_os_homedir@Base 0.6.3
+ uv_os_setenv@Base 0.7.0~beta2
+ uv_os_tmpdir@Base 0.6.3
+ uv_os_unsetenv@Base 0.7.0~beta2
+ uv_pipe@Base 0.7.0~beta2
+ uv_pipe_bind@Base 0.6.3
+ uv_pipe_connect@Base 0.6.3
+ uv_pipe_getpeername@Base 0.6.3
+ uv_pipe_getsockname@Base 0.6.3
+ uv_pipe_init@Base 0.6.3
+ uv_pipe_listen@Base 0.7.0~beta2
+ uv_pipe_open@Base 0.6.3
+ uv_pipe_pending_count@Base 0.6.3
+ uv_pipe_pending_instances@Base 0.6.3
+ uv_pipe_pending_type@Base 0.6.3
+ uv_poll_init@Base 0.6.3
+ uv_poll_start@Base 0.6.3
+ uv_poll_stop@Base 0.6.3
+ uv_prepare_init@Base 0.6.3
+ uv_prepare_start@Base 0.6.3
+ uv_prepare_stop@Base 0.6.3
+ uv_print_active_handles@Base 0.6.3
+ uv_print_all_handles@Base 0.6.3
+ uv_process_kill@Base 0.6.3
+ uv_queue_work@Base 0.6.3
+ uv_read_start@Base 0.6.3
+ uv_read_stop@Base 0.6.3
+ uv_recv_buffer_size@Base 0.6.3
+ uv_ref@Base 0.6.3
+ uv_replace_allocator@Base 0.6.3
+ uv_req_size@Base 0.6.3
+ uv_resident_set_memory@Base 0.6.3
+ uv_run@Base 0.6.3
+ uv_rwlock_destroy@Base 0.6.3
+ uv_rwlock_init@Base 0.6.3
+ uv_rwlock_rdlock@Base 0.6.3
+ uv_rwlock_rdunlock@Base 0.6.3
+ uv_rwlock_tryrdlock@Base 0.6.3
+ uv_rwlock_trywrlock@Base 0.6.3
+ uv_rwlock_wrlock@Base 0.6.3
+ uv_rwlock_wrunlock@Base 0.6.3
+ uv_sem_destroy@Base 0.6.3
+ uv_sem_init@Base 0.6.3
+ uv_sem_post@Base 0.6.3
+ uv_sem_trywait@Base 0.6.3
+ uv_sem_wait@Base 0.6.3
+ uv_send_buffer_size@Base 0.6.3
+ uv_set_process_title@Base 0.6.3
+ uv_setup_args@Base 0.6.3
+ uv_shutdown@Base 0.6.3
+ uv_signal_init@Base 0.6.3
+ uv_signal_start@Base 0.6.3
+ uv_signal_start_oneshot@Base 0.7.0~beta2
+ uv_signal_stop@Base 0.6.3
+ uv_socketpair@Base 0.7.0~beta2
+ uv_spawn@Base 0.6.3
+ uv_stop@Base 0.6.3
+ uv_stream_set_blocking@Base 0.6.3
+ uv_strerror@Base 0.6.3
+ uv_tcp_bind@Base 0.6.3
+ uv_tcp_connect@Base 0.6.3
+ uv_tcp_getpeername@Base 0.6.3
+ uv_tcp_getsockname@Base 0.6.3
+ uv_tcp_init@Base 0.6.3
+ uv_tcp_init_ex@Base 0.6.3
+ uv_tcp_keepalive@Base 0.6.3
+ uv_tcp_listen@Base 0.7.0~beta2
+ uv_tcp_nodelay@Base 0.6.3
+ uv_tcp_open@Base 0.6.3
+ uv_tcp_simultaneous_accepts@Base 0.6.3
+ uv_thread_create@Base 0.6.3
+ uv_thread_detach@Base 0.6.3
+ uv_thread_equal@Base 0.6.3
+ uv_thread_getaffinity@Base 0.6.3
+ uv_thread_join@Base 0.6.3
+ uv_thread_self@Base 0.6.3
+ uv_thread_setaffinity@Base 0.6.3
+ uv_timer_again@Base 0.6.3
+ uv_timer_get_repeat@Base 0.6.3
+ uv_timer_init@Base 0.6.3
+ uv_timer_set_repeat@Base 0.6.3
+ uv_timer_start@Base 0.6.3
+ uv_timer_stop@Base 0.6.3
+ uv_translate_sys_error@Base 0.7.0~beta2
+ uv_try_write@Base 0.6.3
+ uv_try_write_cb@Base 0.7.0~beta2
+ uv_tty_get_winsize@Base 0.6.3
+ uv_tty_init@Base 0.6.3
+ uv_tty_reset_mode@Base 0.6.3
+ uv_tty_set_mode@Base 0.6.3
+ uv_udp_bind@Base 0.6.3
+ uv_udp_getsockname@Base 0.6.3
+ uv_udp_init@Base 0.6.3
+ uv_udp_init_ex@Base 0.6.3
+ uv_udp_open@Base 0.6.3
+ uv_udp_recv_start@Base 0.6.3
+ uv_udp_recv_stop@Base 0.6.3
+ uv_udp_send@Base 0.6.3
+ uv_udp_set_broadcast@Base 0.6.3
+ uv_udp_set_membership@Base 0.6.3
+ uv_udp_set_multicast_interface@Base 0.6.3
+ uv_udp_set_multicast_loop@Base 0.6.3
+ uv_udp_set_multicast_ttl@Base 0.6.3
+ uv_udp_set_ttl@Base 0.6.3
+ uv_udp_try_send@Base 0.6.3
+ uv_unref@Base 0.6.3
+ uv_update_time@Base 0.6.3
+ uv_uptime@Base 0.6.3
+ uv_version@Base 0.6.3
+ uv_version_string@Base 0.6.3
+ uv_walk@Base 0.6.3
+ uv_write2@Base 0.6.3
+ uv_write@Base 0.6.3
--- /dev/null
+activate-noawait ldconfig
--- /dev/null
+usr/share/doc/julia/html/*
--- /dev/null
+Description: fix lintian error due to the ancient AppStream metadata format
+Reference: https://www.freedesktop.org/software/appstream/docs/chap-Metadata.html
+Author: Mo Zhou
+Last-Update: 2018-08-16
+Forwarded: [Merged] https://github.com/JuliaLang/julia/pull/28020
+
+Index: julia/contrib/julia.appdata.xml
+===================================================================
+--- julia.orig/contrib/julia.appdata.xml
++++ julia/contrib/julia.appdata.xml
+@@ -1,9 +1,15 @@
+ <?xml version="1.0" encoding="UTF-8"?>
+ <!-- Copyright 2014 Paul Lange <palango@gmx.de> -->
+-<application>
+- <id type="desktop">julia.desktop</id>
++<component>
++ <id>org.julialang.julia</id>
++ <name>Julia</name>
++ <launchable type="desktop-id">julia.desktop</launchable>
+ <metadata_license>CC-BY-SA-3.0</metadata_license>
+ <project_license>MIT and LGPL-2.1+ and GPL-2.0+</project_license>
++ <summary>High-performance programming language for technical computing</summary>
++ <provides>
++ <binary>julia</binary>
++ </provides>
+ <description>
+ <p>
+ Julia is a high-level, high-performance dynamic programming language for
+@@ -21,8 +27,9 @@
+ </p>
+ </description>
+ <screenshots>
+- <screenshot type="default">https://julialang.org/images/julia-gnome.png</screenshot>
++ <screenshot type="default">
++ <image>https://julialang.org/images/julia-gnome.png</image>
++ </screenshot>
+ </screenshots>
+ <url type="homepage">https://julialang.org/</url>
+- <updatecontact>julia-dev@googlegroups.com</updatecontact>
+-</application>
++</component>
--- /dev/null
+Description: enable distro-packaged Julia packages so that they are immediately importable after installation
+See: https://github.com/cdluminate/DistroHelper.jl
+ https://github.com/JuliaLang/julia/issues/30528
+Author: Mo Zhou
+
+diff --git a/base/initdefs.jl b/base/initdefs.jl
+index f95c22b4..e63d395b 100644
+--- a/base/initdefs.jl
++++ b/base/initdefs.jl
+@@ -61,6 +61,7 @@ end
+ # entries starting with `@` are named environments:
+ # - the first three `#`s in a named environment are replaced with version numbers
+ # - `@stdlib` is a special name for the standard library and expands to its path
++# - `@distro` is a special name for distributions' vendored julia packages
+
+ # if you want a current env setup, use direnv and
+ # have your .envrc do something like this:
+@@ -70,7 +71,7 @@ end
+ # this will inherit an existing JULIA_LOAD_PATH value or if there is none, leave
+ # a trailing empty entry in JULIA_LOAD_PATH which will be replaced with defaults.
+
+-const DEFAULT_LOAD_PATH = ["@", "@v#.#", "@stdlib"]
++const DEFAULT_LOAD_PATH = ["@", "@v#.#", "@stdlib", "@distro"]
+
+ """
+ LOAD_PATH
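
The effect of the extra `@distro` entry can be checked from a patched julia
(a minimal sketch; the `@distro` environment itself is assumed to be shipped
separately, e.g. by a DistroHelper.jl-style package, and is simply ignored
when it does not exist):

    # Show the raw entries; with the patch applied the list ends with "@distro".
    @show LOAD_PATH

    # Base.load_path() expands the named environments ("@", "@v#.#", "@stdlib",
    # "@distro") into concrete project paths; named environments that do not
    # exist on this system are dropped from the expansion.
    foreach(println, Base.load_path())
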
--- /dev/null
+Description: Let the build system download the embedded libuv source tarball through a file:// URI
+Author: Mo Zhou
+Last-Update: 2018-07-11
+Forwarded: no need
+
+Index: julia/deps/libuv.mk
+===================================================================
+--- julia.orig/deps/libuv.mk
++++ julia/deps/libuv.mk
+@@ -1,6 +1,5 @@
+ ## LIBUV ##
+-LIBUV_GIT_URL:=git://github.com/JuliaLang/libuv.git
+-LIBUV_TAR_URL=https://api.github.com/repos/JuliaLang/libuv/tarball/$1
++LIBUV_TAR_URL=file://$(CURDIR)/../debian/embedded/libuv-$1.tar.gz
+ $(eval $(call git-external,libuv,LIBUV,configure,,$(SRCCACHE)))
+
+ UV_CFLAGS := -D_GNU_SOURCE
--- /dev/null
+Description: prevent the upstream build system from accessing the internet
+Author: Mo Zhou
+Forwarded: no need
+Last-Update: 2018-07-19
+Index: julia/deps/libwhich.mk
+===================================================================
+--- julia.orig/deps/libwhich.mk
++++ julia/deps/libwhich.mk
+@@ -1,6 +1,5 @@
+ ## LIBWHICH ##
+-LIBWHICH_GIT_URL := git://github.com/vtjnash/libwhich.git
+-LIBWHICH_TAR_URL = https://api.github.com/repos/vtjnash/libwhich/tarball/$1
++LIBWHICH_TAR_URL = file://$(CURDIR)/../debian/embedded/libwhich-$1.tar.gz
+ $(eval $(call git-external,libwhich,LIBWHICH,,,$(BUILDDIR)))
+
+ LIBWHICH_OBJ_LIB := $(build_depsbindir)/libwhich
--- /dev/null
+Index: julia/deps/llvm.mk
+===================================================================
+--- julia.orig/deps/llvm.mk
++++ julia/deps/llvm.mk
+@@ -161,7 +161,7 @@ ifeq ($(BUILD_LLDB),0)
+ LLVM_CMAKE += -DLLVM_TOOL_LLDB_BUILD=OFF
+ endif
+
+-LLVM_SRC_URL := http://releases.llvm.org/$(LLVM_VER)
++LLVM_SRC_URL := file://$(shell pwd)/../debian/embedded/$(LLVM_VER)
+
+ ifneq ($(LLVM_CLANG_TAR),)
+ $(LLVM_CLANG_TAR): | $(SRCCACHE)
--- /dev/null
+Description: Prevent the upstream build system from downloading Pkg.jl
+Forwarded: no need
+Author: Mo Zhou
+Index: julia/stdlib/Makefile
+===================================================================
+--- julia.orig/stdlib/Makefile
++++ julia/stdlib/Makefile
+@@ -22,7 +22,7 @@ STDLIBS-EXT = Pkg
+
+ # Download and extract Pkg
+ PKG_GIT_URL := git://github.com/JuliaLang/Pkg.jl.git
+-PKG_TAR_URL = https://api.github.com/repos/JuliaLang/Pkg.jl/tarball/$1
++PKG_TAR_URL = file://$(CURDIR)/../debian/embedded/$1
+ $(eval $(call git-external,Pkg,PKG,,,$(BUILDDIR)))
+ $(BUILDDIR)/$(PKG_SRC_DIR)/build-compiled: $(BUILDDIR)/$(PKG_SRC_DIR)/source-extracted
+ @# no build steps
--- /dev/null
+Description: Prevent the build system from downloading anything while building docs.
+Forwarded: no need.
+Author: Mo Zhou
+Last-Update: 2018-07-03
+
+Index: julia/doc/make.jl
+===================================================================
+--- julia.orig/doc/make.jl
++++ julia/doc/make.jl
+@@ -3,8 +3,7 @@ empty!(LOAD_PATH)
+ push!(LOAD_PATH, @__DIR__, "@stdlib")
+ empty!(DEPOT_PATH)
+ pushfirst!(DEPOT_PATH, joinpath(@__DIR__, "deps"))
+-using Pkg
+-Pkg.instantiate()
++push!(LOAD_PATH, "../debian/embedded")
+
+ using Documenter
+
+Index: julia/doc/Makefile
+===================================================================
+--- julia.orig/doc/Makefile
++++ julia/doc/Makefile
+@@ -23,10 +23,8 @@ help:
+ DOCUMENTER_OPTIONS := linkcheck=$(linkcheck) doctest=$(doctest) buildroot=$(call cygpath_w,$(BUILDROOT))
+
+ UnicodeData.txt:
+- $(JLDOWNLOAD) http://www.unicode.org/Public/9.0.0/ucd/UnicodeData.txt
+-
+-deps: UnicodeData.txt
+- $(JLCHECKSUM) UnicodeData.txt
++deps:
++ true
+
+ clean:
+ -rm -rf _build/* deps/* docbuild.log UnicodeData.txt
+Index: julia/doc/Project.toml
+===================================================================
+--- julia.orig/doc/Project.toml
++++ julia/doc/Project.toml
+@@ -1,2 +0,0 @@
+-[deps]
+-Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
--- /dev/null
+diff --git a/debian/embedded/Documenter.jl-0.20.0/src/Writers/LaTeXWriter.jl b/debian/embedded/Documenter.jl-0.20.0/src/Writers/LaTeXWriter.jl
+index f442d959..51ce92e7 100644
+--- a/debian/embedded/Documenter.jl-0.20.0/src/Writers/LaTeXWriter.jl
++++ b/debian/embedded/Documenter.jl-0.20.0/src/Writers/LaTeXWriter.jl
+@@ -87,7 +87,7 @@ function render(doc::Documents.Document)
+ outdir = joinpath(doc.user.root, doc.user.build)
+ pdf = replace("$(doc.user.sitename).pdf", " " => "")
+ try
+- run(`latexmk -f -interaction=nonstopmode -view=none -lualatex -shell-escape $file`)
++ run(`latexmk -silent -f -interaction=nonstopmode -view=none -lualatex -shell-escape $file`)
+ catch err
+ Utilities.warn("failed to compile. Check generated LaTeX file.")
+ cp(file, joinpath(outdir, file); force = true)
+
+diff --git a/debian/embedded/Documenter.jl-0.21.0/src/Writers/LaTeXWriter.jl b/debian/embedded/Documenter.jl-0.21.0/src/Writers/LaTeXWriter.jl
+index 9b98b34e..21727866 100644
+--- a/debian/embedded/Documenter.jl-0.21.0/src/Writers/LaTeXWriter.jl
++++ b/debian/embedded/Documenter.jl-0.21.0/src/Writers/LaTeXWriter.jl
+@@ -139,7 +139,7 @@ function compile_tex(doc::Documents.Document, settings::LaTeX, texfile::String)
+ Sys.which("latexmk") === nothing && (@error "LaTeXWriter: latexmk command not found."; return false)
+ @info "LaTeXWriter: using latexmk to compile tex."
+ try
+- piperun(`latexmk -f -interaction=nonstopmode -view=none -lualatex -shell-escape $texfile`)
++ piperun(`latexmk -silent -f -interaction=nonstopmode -view=none -lualatex -shell-escape $texfile`)
+ return true
+ catch err
+ logs = cp(pwd(), mktempdir(); force=true)
--- /dev/null
+Description: build the documentation using Debian's copy of UnicodeData.txt
+ instead of a downloaded one
+Forwarded: no need
+Author: Mo Zhou
+Index: julia/doc/src/manual/unicode-input.md
+===================================================================
+--- julia.orig/doc/src/manual/unicode-input.md
++++ julia/doc/src/manual/unicode-input.md
+@@ -31,7 +31,7 @@ function tab_completions(symbols...)
+ end
+
+ function unicode_data()
+- file = normpath(@__DIR__, "..", "..", "..", "..", "..", "doc", "UnicodeData.txt")
++ file = "/usr/share/unicode/UnicodeData.txt"
+ names = Dict{UInt32, String}()
+ open(file) do unidata
+ for line in readlines(unidata)
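
As a rough illustration of what the patched helper does with Debian's copy of
the data (a self-contained sketch; it assumes the unicode-data package is
installed and relies on the standard semicolon-separated UnicodeData.txt layout):

    # Build a code point -> character name table from Debian's UnicodeData.txt.
    function unicode_names(file="/usr/share/unicode/UnicodeData.txt")
        names = Dict{UInt32,String}()
        for line in eachline(file)
            fields = split(line, ";")
            # field 1 is the code point in hex, field 2 the character name
            names[parse(UInt32, fields[1], base=16)] = fields[2]
        end
        return names
    end

    table = unicode_names()
    println(table[UInt32(0x03B1)])   # GREEK SMALL LETTER ALPHA
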
--- /dev/null
+Description: LLVM's cmake does not install its files into the multiarch
+ directory, which breaks the Julia installation when the MULTIARCH* options in
+ Make.inc are turned on. This hack works around the resulting installation failure.
+See: https://github.com/JuliaLang/julia/issues/27922
+Forwarded: no need
+Author: Mo Zhou
+Index: julia/deps/llvm.mk
+===================================================================
+--- julia.orig/deps/llvm.mk
++++ julia/deps/llvm.mk
+@@ -560,6 +560,7 @@ ifeq ($(OS),Darwin)
+ # https://github.com/JuliaLang/julia/issues/29981
+ LLVM_INSTALL += && ln -s libLLVM.dylib $2$$(build_shlibdir)/libLLVM-$$(LLVM_VER_SHORT).dylib
+ endif
++LLVM_INSTALL += && find $2 -type f,l -name 'libLLVM*so*' -print -exec install -Dv '{}' $(shell pwd)/../$$(libdir)/julia/ \;
+
+ $(eval $(call staged-install,llvm,llvm-$$(LLVM_VER)/build_$$(LLVM_BUILDTYPE), \
+ LLVM_INSTALL,,,))
--- /dev/null
+If DESTDIR is defined in d/rules, the embedded LLVM installation breaks.
+If DESTDIR is not defined in d/rules, the embedded LLVM installation works, but Julia breaks.
+This hack keeps both LLVM and Julia happy.
+
+Index: julia/Makefile
+===================================================================
+--- julia.orig/Makefile
++++ julia/Makefile
+@@ -328,6 +328,7 @@ JL_PRIVATE_LIBS-0 += libgfortran libgcc_
+ endif
+
+
++override DESTDIR=./debian/tmp/
+ install: $(build_depsbindir)/stringreplace $(BUILDROOT)/doc/_build/html/en/index.html
+ @$(MAKE) $(QUIET_MAKE) all
+ @for subdir in $(bindir) $(datarootdir)/julia/stdlib/$(VERSDIR) $(docdir) $(man1dir) $(includedir)/julia $(libdir) $(private_libdir) $(sysconfdir); do \
--- /dev/null
+Description: Let the upstream downloader used during the build be verbose,
+ and prevent it from accessing the internet.
+Author: Mo Zhou
+Last-Update: 2018-07-11
+Forwarded: no need
+
+Index: julia/deps/tools/jldownload
+===================================================================
+--- julia.orig/deps/tools/jldownload
++++ julia/deps/tools/jldownload
+@@ -2,6 +2,7 @@
+ #
+ # usage: jldownload [<output-filename>] <url>
+ #
++set -x
+
+ CACHE_HOST=https://cache.julialang.org
+
+@@ -44,4 +45,5 @@ fi
+ # forward to the original URL if it has not cached this download yet, or
+ # if the URL is not cacheable. We fallback to directly querying the
+ # uncached URL to protect against cache service downtime
+-$GETURL $CACHE_URL || $GETURL $URL
++#$GETURL $CACHE_URL || $GETURL $URL
++$GETURL $URL
--- /dev/null
+Description: Avoid baseline violation on armhf
+Forwarded: not-yet
+Bug-Debian: https://bugs.debian.org/919183
+Author: Graham Inggs <ginggs@debian.org>
+Last-Update: 2019-01-22
+
+--- a/deps/llvm.mk
++++ b/deps/llvm.mk
+@@ -508,6 +508,7 @@
+ $(eval $(call LLVM_PATCH,llvm-D50167-scev-umin))
+ $(eval $(call LLVM_PATCH,llvm-windows-race))
+ $(eval $(call LLVM_PATCH,llvm-rL326967-aligned-load)) # remove for 7.0
++$(eval $(call LLVM_PATCH,clang-arm-default-vfp3-on-armv7a))
+ endif # LLVM_VER
+
+ # Remove hardcoded OS X requirements in compilter-rt cmake build
+--- /dev/null
++++ b/deps/patches/clang-arm-default-vfp3-on-armv7a.patch
+@@ -0,0 +1,24 @@
++--- a/include/llvm/Support/ARMTargetParser.def
+++++ b/include/llvm/Support/ARMTargetParser.def
++@@ -75,7 +75,7 @@ ARM_ARCH("armv6kz", ARMV6KZ, "6KZ", "v6k
++ ARM_ARCH("armv6-m", ARMV6M, "6-M", "v6m", ARMBuildAttrs::CPUArch::v6_M,
++ FK_NONE, ARM::AEK_NONE)
++ ARM_ARCH("armv7-a", ARMV7A, "7-A", "v7", ARMBuildAttrs::CPUArch::v7,
++- FK_NEON, ARM::AEK_DSP)
+++ FK_VFPV3_D16, ARM::AEK_DSP)
++ ARM_ARCH("armv7ve", ARMV7VE, "7VE", "v7ve", ARMBuildAttrs::CPUArch::v7,
++ FK_NEON, (ARM::AEK_SEC | ARM::AEK_MP | ARM::AEK_VIRT |
++ ARM::AEK_HWDIVARM | ARM::AEK_HWDIVTHUMB | ARM::AEK_DSP))
++--- a/lib/Target/ARM/ARM.td
+++++ b/lib/Target/ARM/ARM.td
++@@ -530,7 +530,8 @@ def ARMv6sm : Architecture<"armv6s-m",
++ FeatureMClass]>;
++
++ def ARMv7a : Architecture<"armv7-a", "ARMv7a", [HasV7Ops,
++- FeatureNEON,
+++ FeatureVFP3,
+++ FeatureD16,
++ FeatureDB,
++ FeatureDSP,
++ FeatureAClass]>;
++
--- /dev/null
+Description: don't try to touch libunwind when it's disabled
+Author: Mo Zhou
+Forwarded: https://github.com/JuliaLang/julia/issues/29163
+Index: julia/base/Makefile
+===================================================================
+--- julia.orig/base/Makefile
++++ julia/base/Makefile
+@@ -192,7 +192,7 @@ $(eval $(call symlink_system_library,lib
+ $(eval $(call symlink_system_library,libumfpack,SUITESPARSE))
+ $(eval $(call symlink_system_library,libspqr,SUITESPARSE))
+ $(eval $(call symlink_system_library,libsuitesparseconfig,SUITESPARSE))
+-ifneq ($(DISABLE_LIBUNWIND),0)
++ifneq ($(DISABLE_LIBUNWIND), 1)
+ $(eval $(call symlink_system_library,libunwind,LIBUNWIND))
+ endif
+ endif # WINNT
--- /dev/null
+Description: Do not compile and install the debugging version of Julia
+Author: Sébastien Villemot <sebastien@debian.org>
+Author: Mo Zhou <cdluminate@gmail.com>
+Forwarded: not-needed
+Last-Update: 2018-07-19
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+Index: julia/Makefile
+===================================================================
+--- julia.orig/Makefile
++++ julia/Makefile
+@@ -4,7 +4,7 @@ include $(JULIAHOME)/Make.inc
+ VERSDIR := v`cut -d. -f1-2 < $(JULIAHOME)/VERSION`
+
+ default: $(JULIA_BUILD_MODE) # contains either "debug" or "release"
+-all: debug release
++all: release
+
+ # sort is used to remove potential duplicates
+ DIRS := $(sort $(build_bindir) $(build_depsbindir) $(build_libdir) $(build_private_libdir) $(build_libexecdir) $(build_includedir) $(build_includedir)/julia $(build_sysconfdir)/julia $(build_datarootdir)/julia $(build_datarootdir)/julia/stdlib $(build_man1dir))
--- /dev/null
+Description: fix lintian privacy-breach errors caused by the embedded Documenter assets
+Author: Mo Zhou <cdluminate@gmail.com>
+
+--- a/debian/embedded/Documenter/assets/html/documenter.css
++++ b/debian/embedded/Documenter/assets/html/documenter.css
+@@ -21,7 +21,7 @@ body, input {
+ }
+
+ pre, code, kbd {
+- font-family: 'Roboto Mono', Monaco, courier, monospace;
++ font-family: Inconsolata, Monaco, courier, monospace;
+ font-size: 0.90em;
+ }
+
+--- a/debian/embedded/Documenter/assets/html/documenter.js
++++ b/debian/embedded/Documenter/assets/html/documenter.js
+@@ -7,13 +7,14 @@
+
+ requirejs.config({
+ paths: {
+- 'jquery': 'https://cdnjs.cloudflare.com/ajax/libs/jquery/3.1.1/jquery.min',
+- 'jqueryui': 'https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.0/jquery-ui.min',
+- 'headroom': 'https://cdnjs.cloudflare.com/ajax/libs/headroom/0.9.3/headroom.min',
+- 'mathjax': 'https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS_HTML',
+- 'highlight': 'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.12.0/highlight.min',
+- 'highlight-julia': 'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.12.0/languages/julia.min',
+- 'highlight-julia-repl': 'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.12.0/languages/julia-repl.min',
++ 'jquery': 'file:///usr/share/javascript/jquery/jquery.min.js',
++ 'jqueryui': 'file:///usr/share/javascript/jquery-ui/jquery-ui.min.js',
++ //'headroom': 'https://cdnjs.cloudflare.com/ajax/libs/headroom/0.9.3/headroom.min',
++ 'headroom': 'http://localhost',
++ 'mathjax': 'file:///usr/share/javascript/mathjax/MathJax.js?config=TeX-AMS_HTML',
++ 'highlight': 'file:///usr/lib/nodejs/highlight.js/highlight.js',
++ 'highlight-julia': 'file:////usr/lib/nodejs/highlight.js/languages/julia.js',
++ 'highlight-julia-repl': 'file:////usr/lib/nodejs/highlight.js/languages/julia-repl.js',
+ },
+ shim: {
+ 'mathjax' : {
+
+--- a/debian/embedded/Documenter/src/Writers/HTMLWriter.jl
++++ b/debian/embedded/Documenter/src/Writers/HTMLWriter.jl
+@@ -98,11 +98,12 @@ import ...Documenter:
+ import ...Utilities.DOM: DOM, Tag, @tags
+ using ...Utilities.MDFlatten
+
+-const requirejs_cdn = "https://cdnjs.cloudflare.com/ajax/libs/require.js/2.2.0/require.min.js"
+-const normalize_css = "https://cdnjs.cloudflare.com/ajax/libs/normalize/4.2.0/normalize.min.css"
+-const google_fonts = "https://fonts.googleapis.com/css?family=Lato|Roboto+Mono"
+-const fontawesome_css = "https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.6.3/css/font-awesome.min.css"
+-const highlightjs_css = "https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.12.0/styles/default.min.css"
++const requirejs_cdn = "file:///usr/share/javascript/requirejs/require.min.js"
++const normalize_css = "file:///usr/share/javascript/normalize.css/normalize.min.css"
++const google_fonts = "file:///usr/share/doc/julia-doc/fontface.css"
++const fontawesome_css = "file:///usr/share/fonts-font-awesome/css/font-awesome.min.css"
++const highlightjs_css = "file:///usr/share/javascript/highlight.js/styles/default.css"
++
+
+ """
+ [`HTMLWriter`](@ref)-specific globals that are passed to [`domify`](@ref) and
--- /dev/null
+Description: Julia requires SSE2 on i386
+ This patch adds an explicit error message if the CPU does not support SSE2.
+ The CPU features are queried using the x86 CPUID opcode. The wrapper function
+ __get_cpuid() is provided by GCC and Clang. See <cpuid.h> for the list of
+ supported CPU extension flags.
+Author: Sébastien Villemot <sebastien@debian.org>
+Author: Peter Colberg <peter@colberg.org>
+Bug: https://github.com/JuliaLang/julia/issues/7185
+Forwarded: no
+Last-Update: 2015-11-01
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+Index: julia/src/codegen.cpp
+===================================================================
+--- julia.orig/src/codegen.cpp
++++ julia/src/codegen.cpp
+@@ -26,6 +26,9 @@
+ #endif
+
+ #include <setjmp.h>
++#ifdef __i386__
++#include <cpuid.h>
++#endif
+ #include <string>
+ #include <sstream>
+ #include <fstream>
+@@ -7378,6 +7381,15 @@ static void init_julia_llvm_env(Module *
+
+ extern "C" void *jl_init_llvm(void)
+ {
++#ifdef __i386__
++ unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;
++ __get_cpuid(1, &eax, &ebx, &ecx, &edx);
++ if (!(edx & bit_SSE2)) {
++ fprintf(stderr, "Your processor does not support the SSE2 extension, which is required by Julia. Exiting.\n");
++ exit(EXIT_FAILURE);
++ }
++#endif
++
+ const char *const argv_tailmerge[] = {"", "-enable-tail-merge=0"}; // NOO TOUCHIE; NO TOUCH! See #922
+ cl::ParseCommandLineOptions(sizeof(argv_tailmerge)/sizeof(argv_tailmerge[0]), argv_tailmerge, "disable-tail-merge\n");
+ #if defined(_OS_WINDOWS_) && defined(_CPU_X86_64_)
--- /dev/null
+# Build system and feature
+jldownload-verbose-fakedownload.patch
+support-noopt.patch
+require-sse2-on-i386.patch
+no-debug-version.patch
+do-not-download-libuv.patch
+do-not-download-libwhich.patch
+do-not-download-pkgjl.patch
+do-not-download-llvm.patch
+doc-make.patch
+doc-unicode-data-path.patch
+make-unwind-logic-error.patch
+install-embedded-llvm-hack.patch
+install-julia-hack.patch
+llvm-armhf-baseline.patch
+
+# distro integration, wip, experimental, disabled by default
+#default-load-path-distro.patch
+
+# Misc enhancement
+appstream.patch
+
+# Test
+test-skip-sigint.patch
+test-skip-i386-spec.patch
+test-skip-dns-ubuntu.patch
+
+# This patch is for embedded sources. Already pre-applied.
+# privacy-breach.patch
+# doc-silent.patch
--- /dev/null
+Description: Remove hardcoded GCC optimization flags
+ This is necessary in order to make DEB_BUILD_OPTIONS=noopt work as expected.
+ .
+ Note that the hack on llvm-config --cxxflags is not strictly needed, because
+ the -O2 it brings comes before the -O0 brought by dpkg-buildflags; it is kept
+ for clarity.
+Author: Sébastien Villemot <sebastien@debian.org>
+Forwarded: not-needed
+Last-Update: 2015-11-17
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+Index: julia/Make.inc
+===================================================================
+--- julia.orig/Make.inc
++++ julia/Make.inc
+@@ -413,7 +413,7 @@ ifneq ($(OS), WINNT)
+ JCXXFLAGS += -pedantic
+ endif
+ DEBUGFLAGS := -O0 -ggdb2 -DJL_DEBUG_BUILD -fstack-protector-all
+-SHIPFLAGS := -O3 -ggdb2 -falign-functions
++SHIPFLAGS := -ggdb2 -falign-functions
+ endif
+
+ ifeq ($(USECLANG),1)
+Index: julia/deps/suitesparse.mk
+===================================================================
+--- julia.orig/deps/suitesparse.mk
++++ julia/deps/suitesparse.mk
+@@ -111,7 +111,7 @@ endif
+
+ $(build_shlibdir)/libsuitesparse_wrapper.$(SHLIB_EXT): $(SRCDIR)/SuiteSparse_wrapper.c
+ mkdir -p $(build_shlibdir)
+- $(CC) $(CPPFLAGS) $(CFLAGS) $(LDFLAGS) -O2 -shared $(fPIC) $(SUITESPARSE_INC) $< -o $@ $(SUITESPARSE_LIB)
++ $(CC) $(CPPFLAGS) $(CFLAGS) $(LDFLAGS) -shared $(fPIC) $(SUITESPARSE_INC) $< -o $@ $(SUITESPARSE_LIB)
+ $(INSTALL_NAME_CMD)libsuitesparse_wrapper.$(SHLIB_EXT) $@
+ touch -c $@
+
+Index: julia/src/Makefile
+===================================================================
+--- julia.orig/src/Makefile
++++ julia/src/Makefile
+@@ -136,7 +136,7 @@ $(BUILDDIR)/%.o: $(SRCDIR)/%.c $(HEADERS
+ $(BUILDDIR)/%.dbg.obj: $(SRCDIR)/%.c $(HEADERS) | $(BUILDDIR)
+ @$(call PRINT_CC, $(CC) $(CPPFLAGS) $(CFLAGS) $(DEBUGFLAGS) -c $< -o $@)
+ $(BUILDDIR)/%.o: $(SRCDIR)/%.cpp $(SRCDIR)/llvm-version.h $(HEADERS) $(LLVM_CONFIG_ABSOLUTE) | $(BUILDDIR)
+- @$(call PRINT_CC, $(CXX) $(shell $(LLVM_CONFIG_HOST) --cxxflags) $(CPPFLAGS) $(CXXFLAGS) $(SHIPFLAGS) $(CXX_DISABLE_ASSERTION) -c $< -o $@)
++ @$(call PRINT_CC, $(CXX) $(shell $(LLVM_CONFIG_HOST) --cxxflags | sed 's/^/ /;s/$$/ /;s/\s-O.\s/ /') $(CPPFLAGS) $(CXXFLAGS) $(SHIPFLAGS) $(CXX_DISABLE_ASSERTION) -c $< -o $@)
+ $(BUILDDIR)/%.dbg.obj: $(SRCDIR)/%.cpp $(SRCDIR)/llvm-version.h $(HEADERS) $(LLVM_CONFIG_ABSOLUTE) | $(BUILDDIR)
+ @$(call PRINT_CC, $(CXX) $(shell $(LLVM_CONFIG_HOST) --cxxflags) $(CPPFLAGS) $(CXXFLAGS) $(DEBUGFLAGS) -c $< -o $@)
+
--- /dev/null
+Description: Skip DNS tests which fail on Ubuntu autopkgtest infrastructure
+Author: Graham Inggs <ginggs@debian.org>
+Last-Update: 2018-10-01
+
+Index: julia/stdlib/Sockets/test/runtests.jl
+===================================================================
+--- julia.orig/stdlib/Sockets/test/runtests.jl
++++ julia/stdlib/Sockets/test/runtests.jl
+@@ -151,25 +151,25 @@ defaultport = rand(2000:4000)
+ end
+
+ @testset "getnameinfo on some unroutable IP addresses (RFC 5737)" begin
+- @test getnameinfo(ip"192.0.2.1") == "192.0.2.1"
+- @test getnameinfo(ip"198.51.100.1") == "198.51.100.1"
+- @test getnameinfo(ip"203.0.113.1") == "203.0.113.1"
+- @test getnameinfo(ip"0.1.1.1") == "0.1.1.1"
+- @test getnameinfo(ip"::ffff:0.1.1.1") == "::ffff:0.1.1.1"
+- @test getnameinfo(ip"::ffff:192.0.2.1") == "::ffff:192.0.2.1"
+- @test getnameinfo(ip"2001:db8::1") == "2001:db8::1"
++# @test getnameinfo(ip"192.0.2.1") == "192.0.2.1"
++# @test getnameinfo(ip"198.51.100.1") == "198.51.100.1"
++# @test getnameinfo(ip"203.0.113.1") == "203.0.113.1"
++# @test getnameinfo(ip"0.1.1.1") == "0.1.1.1"
++# @test getnameinfo(ip"::ffff:0.1.1.1") == "::ffff:0.1.1.1"
++# @test getnameinfo(ip"::ffff:192.0.2.1") == "::ffff:192.0.2.1"
++# @test getnameinfo(ip"2001:db8::1") == "2001:db8::1"
+ end
+
+ @testset "getnameinfo on some valid IP addresses" begin
+ @test !isempty(getnameinfo(ip"::")::String)
+- @test !isempty(getnameinfo(ip"0.0.0.0")::String)
+- @test !isempty(getnameinfo(ip"10.1.0.0")::String)
+- @test !isempty(getnameinfo(ip"10.1.0.255")::String)
+- @test !isempty(getnameinfo(ip"10.1.255.1")::String)
+- @test !isempty(getnameinfo(ip"255.255.255.255")::String)
+- @test !isempty(getnameinfo(ip"255.255.255.0")::String)
+- @test !isempty(getnameinfo(ip"192.168.0.1")::String)
+- @test !isempty(getnameinfo(ip"::1")::String)
++# @test !isempty(getnameinfo(ip"0.0.0.0")::String)
++# @test !isempty(getnameinfo(ip"10.1.0.0")::String)
++# @test !isempty(getnameinfo(ip"10.1.0.255")::String)
++# @test !isempty(getnameinfo(ip"10.1.255.1")::String)
++# @test !isempty(getnameinfo(ip"255.255.255.255")::String)
++# @test !isempty(getnameinfo(ip"255.255.255.0")::String)
++# @test !isempty(getnameinfo(ip"192.168.0.1")::String)
++# @test !isempty(getnameinfo(ip"::1")::String)
+ end
+
+ @testset "getaddrinfo" begin
--- /dev/null
+Description: Skip some tests specifically on i386
+Author: Mo Zhou
+Forwarded: not-needed
+Index: julia/test/compiler/compiler.jl
+===================================================================
+--- julia.orig/test/compiler/compiler.jl
++++ julia/test/compiler/compiler.jl
+@@ -879,11 +879,15 @@ function break_21369()
+ end
+ i += 1
+ end
++ if !(Sys.ARCH in [:i386, :i686])
+ @test fr.func === :break_21369
++ end
+ rethrow()
+ end
+ end
++if !(Sys.ARCH in [:i386, :i686])
+ @test_throws ErrorException break_21369() # not TypeError
++end
+
+ # issue #17003
+ abstract type AArray_17003{T,N} end
+Index: julia/test/stacktraces.jl
+===================================================================
+--- julia.orig/test/stacktraces.jl
++++ julia/test/stacktraces.jl
+@@ -76,10 +76,12 @@ let ct = current_task()
+ @test try_stacktrace()[1] == StackFrame(:try_stacktrace, @__FILE__, line_numbers[2])
+
+ # Test try...catch with catch_backtrace
++ if !(Sys.ARCH in [:i386, :i686])
+ @test try_catch()[1:2] == [
+ StackFrame(:bad_function, @__FILE__, line_numbers[1]),
+ StackFrame(:try_catch, @__FILE__, line_numbers[3])
+ ]
++ end
+ end
+
+ module inlined_test
--- /dev/null
+Description: Temporarily skip the SIGINT stress test, which causes random test failures. Further investigation needed.
+Forwarded: not-needed
+Last-Update: 2018-08-12
+Author: Mo Zhou
+
+Index: julia/test/stress.jl
+===================================================================
+--- julia.orig/test/stress.jl
++++ julia/test/stress.jl
+@@ -91,16 +91,3 @@ if !Sys.iswindows()
+ test_22566()
+ end
+ end # !Sys.iswindows
+-
+-# sig 2 is SIGINT per the POSIX.1-1990 standard
+-if !Sys.iswindows()
+- ccall(:jl_exit_on_sigint, Cvoid, (Cint,), 0)
+- @test_throws InterruptException begin
+- ccall(:kill, Cvoid, (Cint, Cint,), getpid(), 2)
+- for i in 1:10
+- Libc.systemsleep(0.1)
+- ccall(:jl_gc_safepoint, Cvoid, ()) # wait for SIGINT to arrive
+- end
+- end
+- ccall(:jl_exit_on_sigint, Cvoid, (Cint,), 1)
+-end
--- /dev/null
+# This example changes the default prompt string.
+# Save it as ~/.julia/config/startup.jl, or include() it from there.
+# For more tweaks to your REPL, please install the "OhMyREPL" package.
+import REPL
+function myrepl(repl)
+ repl.interface = REPL.setup_interface(repl)
+ repl.interface.modes[1].prompt = "⛬ >"
+ return
+end
+atreplinit(myrepl)
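+# Equivalently, the hook can be registered with a do-block; a sketch of the
+# same customization:
+#   atreplinit() do repl
+#       repl.interface = REPL.setup_interface(repl)
+#       repl.interface.modes[1].prompt = "⛬ >"
+#   end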
--- /dev/null
+#!/usr/bin/make -f
+include /usr/share/dpkg/default.mk
+export DEB_BUILD_MAINT_OPTIONS=hardening=+all
+
+# Disable unaligned access on armhf
+ifneq (,$(filter $(DEB_HOST_ARCH),armhf))
+ export DEB_CFLAGS_MAINT_APPEND += -mno-unaligned-access
+endif
+
+# Ensure pcre_h.jl and errno_h.jl are sorted reproducibly
+export LC_ALL=C.UTF-8
+# HOME is needed to avoid "mkdir(): Permission denied" while building docs.
+# Some tests also need a HOME directory.
+export HOME=/tmp
+# Set CUSTOM_MKL to 1 to do a custom build of Julia against MKL.
+export CUSTOM_MKL ?= 0
+# Set CUSTOM_NATIVE to 1 to do a custom build for the native machine architecture.
+export CUSTOM_NATIVE ?= 0
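+# Since both assignments use "?=", the values can also be supplied from the
+# environment for a one-off local rebuild, e.g. (a sketch; README.Debian
+# documents editing this file instead):
+#   CUSTOM_MKL=1 dpkg-buildpackage -us -uc -b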
+
+# NOTES:
+# 1. Some of these flags come from upstream's Make.inc.
+# 2. USE_BLAS64=1 (ILP64) would require an OpenBLAS built with INTERFACE64=1;
+#    Debian's OpenBLAS uses the LP64 interface, hence USE_BLAS64=0 below
+#    (see the check sketched after these notes).
+# 3. Julia is tightly coupled to a specific libuv1 version, so we cannot use
+#    the one provided in the archive.
+# 4. ∴ is Unicode "therefore", ⛬ is Unicode "historic site" https://unicode-table.com/en/26EC/
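+# A built julia can be used to confirm the BLAS integer width at runtime; a
+# quick sketch (not part of the build):
+#   julia -e 'println(Base.USE_BLAS64)'   # prints "false" for this LP64 build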
+COMMON_FLAGS = \
+ prefix=/usr \
+ sysconfdir=/etc \
+ MULTIARCH=$(DEB_HOST_MULTIARCH) \
+ MULTIARCH_INSTALL=1 \
+ NO_GIT=1 \
+ TAGGED_RELEASE_BANNER='$(DEB_VENDOR) ⛬ julia/$(DEB_VERSION)' \
+ USE_BLAS64=0 \
+ USE_LLVM_SHLIB=1 \
+ USE_SYSTEM_BLAS=1 \
+ USE_SYSTEM_CURL=1 \
+ USE_SYSTEM_DSFMT=1 \
+ USE_SYSTEM_GMP=1 \
+ USE_SYSTEM_LAPACK=1 \
+ USE_SYSTEM_LIBGIT2=1 \
+ USE_SYSTEM_LIBSSH2=1 \
+ USE_SYSTEM_LIBUNWIND=1 \
+ USE_SYSTEM_LIBUV=0 \
+ USE_SYSTEM_LLVM=0 \
+ USE_SYSTEM_MBEDTLS=1 \
+ USE_SYSTEM_MPFR=1 \
+ USE_SYSTEM_OPENSPECFUN=1 \
+ USE_SYSTEM_PATCHELF=1 \
+ USE_SYSTEM_PCRE=1 \
+ USE_SYSTEM_SUITESPARSE=1 \
+ USE_SYSTEM_UTF8PROC=1 \
+ VERBOSE=1
+
+# Set architecture-specific CPU targets. See: #910784
+ifneq (,$(filter $(DEB_HOST_ARCH),amd64 kfreebsd-amd64 x32))
+COMMON_FLAGS += MARCH=x86-64 \
+ JULIA_CPU_TARGET="generic;sandybridge,-xsaveopt,clone_all;haswell,-rdrnd,base(1)"
+else ifneq (,$(filter $(DEB_HOST_ARCH),i386 hurd-i386 kfreebsd-i386))
+COMMON_FLAGS += MARCH=pentium4 \
+ JULIA_CPU_TARGET="pentium4;sandybridge,-xsaveopt,clone_all"
+else ifneq (,$(filter $(DEB_HOST_ARCH),armhf))
+COMMON_FLAGS += JULIA_CPU_TARGET="armv7-a;armv7-a,neon;armv7-a,neon,vfp4"
+else ifneq (,$(filter $(DEB_HOST_ARCH),ppc64el))
+COMMON_FLAGS += JULIA_CPU_TARGET="pwr8"
+else
+COMMON_FLAGS += JULIA_CPU_TARGET="generic"
+endif
+
+# Use libopenlibm on architectures that have it
+ifneq (,$(filter $(DEB_HOST_ARCH),amd64 kfreebsd-amd64 x32))
+COMMON_FLAGS += USE_SYSTEM_OPENLIBM=1 USE_SYSTEM_LIBM=0
+else ifneq (,$(filter $(DEB_HOST_ARCH),i386 hurd-i386 kfreebsd-i386))
+COMMON_FLAGS += USE_SYSTEM_OPENLIBM=1 USE_SYSTEM_LIBM=0
+else ifneq (,$(filter $(DEB_HOST_ARCH),arm64 armhf mips mips64el mipsel powerpc ppc64 ppc64el))
+COMMON_FLAGS += USE_SYSTEM_OPENLIBM=1 USE_SYSTEM_LIBM=0
+else
+# Use libm elsewhere
+COMMON_FLAGS += USE_SYSTEM_OPENLIBM=0 USE_SYSTEM_LIBM=1
+endif
+
+# Use libopenblas on architectures that have it
+ifneq (,$(filter $(DEB_HOST_ARCH),amd64 arm64 armhf i386 kfreebsd-amd64 kfreebsd-i386 mips64el ppc64el s390x))
+COMMON_FLAGS += LIBBLAS=-lopenblas LIBBLASNAME=libopenblas \
+ LIBLAPACK=-lopenblas LIBLAPACKNAME=libopenblas
+else
+# Use libblas and liblapack elsewhere
+COMMON_FLAGS += LIBBLAS=-lblas LIBBLASNAME=libblas \
+ LIBLAPACK=-llapack LIBLAPACKNAME=liblapack
+endif
+
+# [s390x] Disable libunwind
+ifneq (,$(filter $(DEB_HOST_ARCH),s390x))
+COMMON_FLAGS += DISABLE_LIBUNWIND=1
+endif
+
+# Set number of parallel workers for tests
+ifneq (,$(filter parallel=1,$(DEB_BUILD_OPTIONS)))
+TESTS_ENV += JULIA_CPU_THREADS=2
+else ifneq (,$(filter parallel=%,$(DEB_BUILD_OPTIONS)))
+TESTS_ENV += JULIA_CPU_THREADS=$(patsubst parallel=%,%,$(filter parallel=%,$(DEB_BUILD_OPTIONS)))
+else
+TESTS_ENV += JULIA_CPU_THREADS=2
+endif
+# Restart workers exceeding maximum resident memory size (requires JULIA_CPU_THREADS >= 2)
+TESTS_ENV += JULIA_TEST_MAXRSS_MB=500
+
+ifeq (1,$(CUSTOM_MKL))
+COMMON_FLAGS += USE_INTEL_MKL=1 USE_BLAS64=1 \
+ LIBBLAS=-lmkl_rt LIBBLASNAME=libmkl_rt \
+ LIBLAPACK=-lmkl_rt LIBLAPACKNAME=libmkl_rt \
+ MKLROOT=/usr MKLLIB=/usr/lib/$(DEB_HOST_MULTIARCH)
+endif
+ifeq (1,$(CUSTOM_NATIVE))
+COMMON_FLAGS += MARCH=native JULIA_CPU_TARGET=native
+endif
+
+
+%:
+ dh $@
+
+override_dh_auto_build-arch:
+ dh_auto_build -- $(COMMON_FLAGS)
+
+override_dh_auto_build-indep:
+ dh_auto_build -- $(COMMON_FLAGS)
+ -$(MAKE) -C doc pdf
+
+override_dh_auto_test-arch:
+ifeq (,$(filter nocheck,$(DEB_BUILD_OPTIONS)))
+ifeq (,$(filter $(DEB_HOST_ARCH),amd64 i386))
+ -env $(TESTS_ENV) make -C test $(COMMON_FLAGS)
+else
+ env $(TESTS_ENV) make -C test $(COMMON_FLAGS)
+endif
+endif
+
+override_dh_auto_test-indep:
+
+override_dh_auto_clean:
+ make $(COMMON_FLAGS) distcleanall
+ make -f debian/shlibdeps.mk $(COMMON_FLAGS) clean
+
+override_dh_auto_install:
+ make $(COMMON_FLAGS) install
+ # FIXME: any better solution than this hack?
+ find deps -type f -name 'libLLVM*so*' -exec cp -v '{}' debian/tmp/usr/lib/$(shell gcc -print-multiarch)/julia/ \;
+ -ln -s libLLVM-6.0.so debian/tmp/usr/lib/$(shell gcc -print-multiarch)/julia/libLLVM-6.0.0.so
+ -ln -s libLLVM-6.0.so debian/tmp/usr/lib/$(shell gcc -print-multiarch)/julia/libLLVM.so
+ rm -rf usr # Otherwise dh_install does not see debian/tmp/usr
+ find debian -type f -name '*.so.debug' -delete
+ find debian -type f -name .gitignore -delete
+ find debian -type f -name 'LICENSE.md' -delete
+
+override_dh_missing:
+ dh_missing --list-missing
+
+override_dh_link-arch:
+ # Create *.so symlinks for dlopen'd libraries in private libdir.
+ make -f debian/shlibdeps.mk $(COMMON_FLAGS)
+ dh_link
+
+override_dh_shlibdeps:
+ # Generate dependencies for dlopen'd libraries using dummy executable.
+ # Suppress useless dependency warnings caused by unused library symbols.
+ dh_shlibdeps -- --warnings=1 debian/shlibdeps
+
+override_dh_compress:
+ dh_compress --exclude=examples/
+
+override_dh_install-indep:
+ dh_install --exclude=build_h.jl --exclude=build.h
+
+override_dh_fixperms:
+ dh_fixperms
+ # Fix shebang and mode bits
+ find debian \( -type f -name '*.jl' \
+ -exec grep -q '..usr/bin/env julia' '{}' \; \
+ -a -exec sed -i -e 's@#!/usr/bin/env julia@#!/usr/bin/julia@g' '{}' \; \
+ -a -exec chmod +x '{}' \; -print \)
+
+# Don't strip sys.so or libjulia.so.*, to keep this program working correctly.
+# https://github.com/JuliaLang/julia/issues/23115#issuecomment-320715030
+#
+# The Julia sysimage (sys.so) is NOT compiled from a compiled language such as
+# C/C++; instead, Julia precompiles the .jl scripts under the base and stdlib
+# directories into this ELF-formatted shared object to speed up program
+# startup. A glance at the symbol table (readelf -sW sys.so) illustrates that.
+# For more detail about how this debugging information is useful, please refer
+# to the official documentation:
+# https://docs.julialang.org/en/v1.0.0/manual/stacktraces/
+#
+override_dh_strip:
+ dh_strip -X"sys.so"
+
+override_dh_makeshlibs:
+ dh_makeshlibs --no-scripts -XLLVM
--- /dev/null
+/*
+ * Query soname of shared library.
+ * Copyright © 2015 Peter Colberg.
+ * Distributed under the MIT license.
+ */
+
+#define _GNU_SOURCE
+#include <dlfcn.h>
+#include <stdio.h>
+#include <stdlib.h>
+
+#define error(fmt, ...) fprintf(stderr, "%s: "fmt"\n", argv[0], ##__VA_ARGS__)
+
+/*
+ * For a given symbol name, this program prints the path relative to /
+ * of the shared library containing that symbol. The program must be
+ * linked against the shared libraries to be queried for symbols, and
+ * compiled as a position-independent executable using -fPIE.
+ */
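+/*
+ * Example (hypothetical library and output; the real build links this
+ * program against the full $(SHLIBDEPS) list from debian/shlibdeps.mk):
+ *
+ *   $ cc -fPIE -o shlibdeps shlibdeps.c -ldl -Wl,--no-as-needed -lgmp
+ *   $ ./shlibdeps __gmpz_get_str
+ *   usr/lib/x86_64-linux-gnu/libgmp.so.10
+ */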
+int main(int argc, const char *const *argv)
+{
+ Dl_info info;
+ void *sym;
+
+ if (argc != 2) {
+ fprintf(stderr, "Usage: %s SYMBOL\n", argv[0]);
+ return 1;
+ }
+ if (!(sym = dlsym(RTLD_DEFAULT, argv[1]))) {
+ fprintf(stderr, "%s\n", dlerror());
+ return 1;
+ }
+ if (dladdr(sym, &info) == 0) {
+ fprintf(stderr, "%s\n", dlerror());
+ return 1;
+ }
+ if (info.dli_saddr != sym) {
+ error("mismatching symbol name: %s", info.dli_sname);
+ return 1;
+ }
+ if (info.dli_fname[0] != '/') {
+ error("library name not an absolute path: %s", info.dli_fname);
+ return 1;
+ }
+ printf("%s", info.dli_fname + 1);
+ return 0;
+}
--- /dev/null
+# This makefile compiles a dummy executable linked against
+# those shared libraries that are dlopen'd by Julia modules
+# in Base. The dummy executable is passed to dpkg-shlibdeps
+# to generate library dependencies for the julia package.
+#
+# The dummy executable is further used to generate symbolic
+# links lib*.so => lib*.so.SOVERSION in the private library
+# path for Julia, which ensures that ccall() loads exactly
+# those library versions that the julia package was built
+# with and depends on.
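+#
+# For example (hypothetical soname and multiarch paths), a generated line in
+# debian/libjulia$(SOMAJOR).links could read:
+#   usr/lib/x86_64-linux-gnu/libgmp.so.10 /usr/lib/x86_64-linux-gnu/julia/libgmp.so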
+
+# Include Julia makefile with LIB*NAME definitions.
+JULIAHOME := $(CURDIR)
+include Make.inc
+
+TARGETS := debian/shlibdeps debian/libjulia$(SOMAJOR).links
+
+all: $(TARGETS)
+
+ifeq ($(USE_SYSTEM_ARPACK),1)
+SHLIBDEPS += -larpack
+endif
+
+ifeq ($(USE_SYSTEM_BLAS),1)
+SHLIBDEPS += $(LIBBLAS)
+# OpenBLAS library provides both BLAS and LAPACK.
+ifneq ($(LIBBLASNAME),$(LIBLAPACKNAME))
+SHLIBDEPS += $(LIBLAPACK)
+endif
+endif
+
+ifeq ($(USE_SYSTEM_CURL),1)
+SHLIBDEPS += -lcurl
+endif
+
+ifeq ($(USE_SYSTEM_DSFMT),1)
+SHLIBDEPS += -ldSFMT
+endif
+
+ifeq ($(USE_SYSTEM_GMP),1)
+SHLIBDEPS += -lgmp
+endif
+
+ifeq ($(USE_SYSTEM_LIBGIT2),1)
+SHLIBDEPS += -lgit2
+endif
+
+ifeq ($(USE_SYSTEM_LIBM),1)
+SHLIBDEPS += $(LIBM)
+else ifeq ($(USE_SYSTEM_OPENLIBM),1)
+SHLIBDEPS += $(LIBM)
+endif
+
+ifeq ($(USE_SYSTEM_LIBSSH2),1)
+SHLIBDEPS += -lssh2
+endif
+
+ifeq ($(USE_SYSTEM_LIBUNWIND),1)
+ifeq (,$(filter $(DEB_HOST_ARCH),s390x))
+SHLIBDEPS += -lunwind
+endif
+endif
+
+ifeq ($(USE_SYSTEM_LIBUV),1)
+SHLIBDEPS += -luv
+endif
+
+ifeq ($(USE_SYSTEM_LLVM),1)
+SHLIBDEPS += -lLLVM -L/usr/lib/llvm-$(LLVM_VER)/lib/
+endif
+
+ifeq ($(USE_SYSTEM_MBEDTLS),1)
+SHLIBDEPS += -lmbedtls -lmbedcrypto -lmbedx509
+endif
+
+ifeq ($(USE_SYSTEM_MPFR),1)
+SHLIBDEPS += -lmpfr
+endif
+
+ifeq ($(USE_SYSTEM_PCRE),1)
+SHLIBDEPS += -lpcre2-8
+endif
+
+ifeq ($(USE_SYSTEM_SUITESPARSE),1)
+SHLIBDEPS += -lcholmod -lspqr -lsuitesparseconfig -lumfpack
+endif
+
+ifeq ($(USE_SYSTEM_UTF8PROC),1)
+SHLIBDEPS += -lutf8proc
+endif
+
+# The dummy executable is linked with --no-as-needed to prevent
+# the linker from potentially disregarding the given libraries
+# because none of the library symbols are used at compile time.
+debian/shlibdeps: debian/shlibdeps.c
+ $(CC) -fPIE -o $@ $< -ldl -Wl,--no-as-needed $(SHLIBDEPS)
+
+# The soname for each library is looked up by invoking the
+# dummy executable with the name of an arbitrary symbol such
+# as a function exported by that library. Ideally, these
+# should be functions that are ccall'd by the Julia modules.
+debian/libjulia$(SOMAJOR).links: debian/shlibdeps
+ rm -f $@.tmp
+ifeq ($(USE_SYSTEM_ARPACK),1)
+ debian/shlibdeps snaupd_ >> $@.tmp
+ echo " $(private_libdir)/libarpack.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_BLAS),1)
+ debian/shlibdeps daxpy_ >> $@.tmp
+ echo " $(private_libdir)/$(LIBBLASNAME).so" >> $@.tmp
+ifneq ($(LIBBLASNAME),$(LIBLAPACKNAME))
+ debian/shlibdeps dggsvd_ >> $@.tmp
+ echo " $(private_libdir)/$(LIBLAPACKNAME).so" >> $@.tmp
+endif
+endif
+ifeq ($(USE_SYSTEM_CURL),1)
+ debian/shlibdeps curl_easy_recv >> $@.tmp
+ echo " $(private_libdir)/libcurl.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_DSFMT),1)
+ debian/shlibdeps dsfmt_get_idstring >> $@.tmp
+ echo " $(private_libdir)/libdSFMT.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_GMP),1)
+ debian/shlibdeps __gmpz_get_str >> $@.tmp
+ echo " $(private_libdir)/libgmp.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_LIBGIT2),1)
+ debian/shlibdeps git_libgit2_version >> $@.tmp
+ echo " $(private_libdir)/libgit2.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_LIBM),1)
+ debian/shlibdeps pow >> $@.tmp
+ echo " $(private_libdir)/$(LIBMNAME).so" >> $@.tmp
+else ifeq ($(USE_SYSTEM_OPENLIBM),1)
+ debian/shlibdeps pow >> $@.tmp
+ echo " $(private_libdir)/$(LIBMNAME).so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_MPFR),1)
+ debian/shlibdeps mpfr_init2 >> $@.tmp
+ echo " $(private_libdir)/libmpfr.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_PCRE),1)
+ debian/shlibdeps pcre2_compile_8 >> $@.tmp
+ echo " $(private_libdir)/libpcre2-8.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_LIBSSH2),1)
+ debian/shlibdeps libssh2_version >> $@.tmp
+ echo " $(private_libdir)/libssh2.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_LIBUV),1)
+ debian/shlibdeps uv_tcp_open >> $@.tmp
+ echo " $(private_libdir)/libuv.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_LIBUNWIND),1)
+ debian/shlibdeps backtrace >> $@.tmp
+ echo " $(private_libdir)/libunwind.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_LLVM),1)
+ debian/shlibdeps llvm_regexec >> $@.tmp
+ echo " $(private_libdir)/libLLVM.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_MBEDTLS),1)
+ debian/shlibdeps mbedtls_net_init >> $@.tmp
+ echo " $(private_libdir)/libmbedtls.so" >> $@.tmp
+ debian/shlibdeps mbedtls_md5 >> $@.tmp
+ echo " $(private_libdir)/libmbedcrypto.so" >> $@.tmp
+ debian/shlibdeps mbedtls_x509_get_serial >> $@.tmp
+ echo " $(private_libdir)/libmbedx509.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_SUITESPARSE),1)
+ debian/shlibdeps cholmod_version >> $@.tmp
+ echo " $(private_libdir)/libcholmod.so" >> $@.tmp
+ debian/shlibdeps SuiteSparseQR_C_free >> $@.tmp
+ echo " $(private_libdir)/libspqr.so" >> $@.tmp
+ debian/shlibdeps SuiteSparse_config >> $@.tmp
+ echo " $(private_libdir)/libsuitesparseconfig.so" >> $@.tmp
+ debian/shlibdeps umfpack_dl_report_info >> $@.tmp
+ echo " $(private_libdir)/libumfpack.so" >> $@.tmp
+endif
+ifeq ($(USE_SYSTEM_UTF8PROC),1)
+ debian/shlibdeps utf8proc_errmsg >> $@.tmp
+ echo " $(private_libdir)/libutf8proc.so" >> $@.tmp
+endif
+ mv $@.tmp $@
+
+clean:
+ $(RM) $(TARGETS)
--- /dev/null
+3.0 (quilt)
--- /dev/null
+debian/embedded/libuv-ed3700c849289ed01fe04273a7bf865340b2bd7e.tar.gz
+debian/embedded/libwhich-81e9723c0273d78493dc8c8ed570f68d9ce7e89e.tar.gz
+debian/embedded/1609a05aee5d5960670738d8d834d91235bd6b1e
+debian/embedded/Documenter.jl-0.20.0/docs/src/man/hosting/puttygen-generated.png
+debian/embedded/Documenter.jl-0.20.0/docs/src/man/hosting/puttygen-export-private-key.png
+debian/embedded/Documenter.jl-0.20.0/docs/src/man/hosting/puttygen.png
+debian/embedded/Documenter.jl-0.20.0/docs/src/man/hosting/travis-variables.png
+debian/embedded/Documenter.jl-0.20.0/docs/src/man/hosting/github-add-deploy-key.png
+debian/embedded/Documenter.jl-0.20.0/docs/src/assets/favicon.ico
+debian/embedded/Documenter.jl-0.20.0/docs/src/assets/logo.png
+debian/embedded/Documenter.jl-0.20.0/test/examples/src/assets/favicon.ico
+debian/embedded/Documenter.jl-0.20.0/test/examples/images/logo.jpg
+debian/embedded/Documenter.jl-0.20.0/test/examples/images/logo.webp
+debian/embedded/Documenter.jl-0.20.0/test/examples/images/logo.gif
+debian/embedded/Documenter.jl-0.20.0/test/examples/images/logo.png
+debian/embedded/DocStringExtensions.jl-0.5.0/docs/src/assets/logo.png
+debian/embedded/llvm-6.0.0.src.tar.xz
+debian/embedded/llvm-6.0.src.tar.xz
--- /dev/null
+# Justification for "needs-root": https://github.com/JuliaLang/julia/issues/28023
+Tests: runtests.sh
+Restrictions: allow-stderr, needs-root
+Depends: @, curl, dpkg-dev
--- /dev/null
+#!/bin/sh
+set -ex
+
+export JULIA_TEST_MAXRSS_MB=500
+export HOME=/tmp
+
+cd $AUTOPKGTEST_TMP
+
+/usr/bin/julia \
+ --check-bounds=yes \
+ --startup-file=no \
+ /usr/share/julia/test/runtests.jl all
--- /dev/null
+Name: Julia
+Bug-Database: https://github.com/julialang/julia/issues
+Bug-Submit: https://github.com/julialang/julia/issues
+Cite-As: "Julia: A Fresh Approach to Numerical Computing. Jeff Bezanson, Alan Edelman, Stefan Karpinski and Viral B. Shah (2017) SIAM Review, 59: 65–98. doi: 10.1137/141000671. url: http://julialang.org/publications/julia-fresh-approach-BEKS.pdf."
+Documentation: https://docs.julialang.org/
+Reference:
+ Author: Jeff Bezanson, Alan Edelman, Stefan Karpinski and Viral B. Shah
+ Journal: SIAM Review
+ DOI: 10.1137/141000671
+ Volume: 59
+ Pages: 65–98
+ Title: "Julia: A Fresh Approach to Numerical Computing"
+ Year: 2017
+Repository: https://github.com/JuliaLang/julia
+Repository-Browse: https://github.com/JuliaLang/julia
--- /dev/null
+version=4
+opts="filenamemangle=s%(?:.*?)?v?(\d[\d.]*)\.tar\.gz%julia-$1.tar.gz%,\
+ dversionmangle=s/\+dfsg\d*$//,repacksuffix=+dfsg" \
+ https://github.com/JuliaLang/julia/tags \
+ (?:.*?/)?v?(1\.0[\d.]*)\.tar\.gz debian uupdate