--- /dev/null
+pandas for Debian
+-----------------
+
+For flexibility and easier interaction with upstream, the packaging VCS is
+maintained on top of upstream's Git repository hosted on GitHub:
+git://github.com/wesm/pandas.git
+
+ -- Yaroslav Halchenko <debian@onerussian.com>, Tue, 13 Sep 2011 12:25:02 -0400
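+
+For example, one possible way to set up a working tree (a sketch only; the
+URLs below are taken from the Vcs-Git field in debian/control and the Source
+field in debian/copyright, and the exact remote layout is up to you):
+
+    git clone https://salsa.debian.org/science-team/pandas.git
+    cd pandas
+    git remote add upstream https://github.com/pandas-dev/pandas.git
+    git fetch upstream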
--- /dev/null
+pandas (0.23.3+dfsg-8) unstable; urgency=medium
+
+ * Examples dependencies: re-add statsmodels and xarray;
+ also add rpy2 and feather.
+ * Use packaged intersphinx indexes. (Closes: #876417)
+ * Use https for intersphinx links.
+ * Remove cythonized-files*. (They are regenerated on each build.)
+ * Remove test xfail, as statsmodels has now been fixed.
+ * Set Rules-Requires-Root: no.
+ * Make documentation Suggest the Python 3 version.
+ * Suggest statsmodels.
+ * Only use Python 3 sphinx, and mark it -Indep/nodoc.
+ * Bump debhelper compat to 12 and use debhelper-compat and pybuild.
+ * Remove pycompat and X-Python*-Version.
+ * Add missing d/copyright item.
+ * Remove obsolete TODOs.
+ * Clarify descriptions.
+ * Stop referring to examples that no longer exist.
+ * Fix typos.
+ * Remove old (no longer used) EXCLUDE_TESTS*.
+ * Deduplicate documentation files.
+ * Use Python 3 shebangs, and fix broken shebang.
+ * Add python3-ipykernel, -ipywidgets, -seaborn to
+ Build-Depends-Indep.
+ * Disable dh_auto_test: it fails, and we run the tests elsewhere.
+ * Mark test dependencies nocheck/nodoc.
+ * Remove old minimum versions / alternative dependencies.
+ * Build-depend on dh-python.
+ * Don't build on python3.8, as it will fail tests (see #931557).
+
+ -- Rebecca N. Palmer <rebecca_palmer@zoho.com> Sun, 27 Oct 2019 11:38:37 +0000
+
+pandas (0.23.3+dfsg-7) unstable; urgency=medium
+
+ * Revert test patch and use an xfail instead.
+ * Temporarily drop statsmodels+xarray Build-Depends, as they are
+ uninstallable until this is built.
+ * Add python3-xarray to autopkgtest Depends.
+ * Drop Python 2 autopkgtest (but keep build-time test).
+ * Remove duplicate Recommends.
+
+ -- Rebecca N. Palmer <rebecca_palmer@zoho.com> Fri, 20 Sep 2019 08:01:37 +0100
+
+pandas (0.23.3+dfsg-6) unstable; urgency=medium
+
+ * Team upload
+ * Avoid FTBFS with statsmodels 0.9.0
+ * Add python3-statsmodels to autopkgtest Depends
+
+ -- Graham Inggs <ginggs@debian.org> Wed, 18 Sep 2019 13:46:01 +0000
+
+pandas (0.23.3+dfsg-5) unstable; urgency=medium
+
+ * Team upload
+ * Add locales-all to Build-Depends and autopkgtest Depends in order to
+ consistently test in all available locales
+ * Add crh_UA to skip_noencoding_locales.patch
+ * Fix wrong debian/source/options exclude, thanks Steve Langasek
+
+ -- Graham Inggs <ginggs@debian.org> Wed, 18 Sep 2019 05:57:44 +0000
+
+pandas (0.23.3+dfsg-4) unstable; urgency=medium
+
+ * Add self to Uploaders.
+ * Recommend .xls format support also in Python 3. (Closes: #880125)
+ * Tests: don't call fixtures, as this is an error in pytest 4+.
+ * Don't test datetime in locales with no encoding.
+ (These are broken by a Python stdlib bug.)
+
+ -- Rebecca N. Palmer <rebecca_palmer@zoho.com> Sat, 14 Sep 2019 16:37:43 +0100
+
+pandas (0.23.3+dfsg-3) unstable; urgency=medium
+
+ * Team upload.
+ * Make np.array @ Series act the right way round. (Closes: #923708)
+ * Replace #918206 fix with a fix that doesn't change the return type
+ and inplace-ness of np.array += DataFrame. (Closes: #923707)
+ * Fix missing page in documentation.
+
+ -- Rebecca N. Palmer <rebecca_palmer@zoho.com> Wed, 06 Mar 2019 22:19:34 +0000
+
+pandas (0.23.3+dfsg-2) unstable; urgency=medium
+
+ * Team upload.
+ * Don't fail the build on +dfsg versions.
+ * Fix another d/copyright issue.
+ * Add d/upstream/metadata.
+
+ -- Rebecca N. Palmer <rebecca_palmer@zoho.com> Sat, 02 Mar 2019 14:57:12 +0000
+
+pandas (0.23.3+dfsg-1) unstable; urgency=medium
+
+ * Team upload.
+ * Fix DataFrame @ np.array matrix multiplication. (Closes: #918206)
+ * Fix documentation build (Sphinx now defaults to Python 3).
+ (Closes: #804552, LP: #1803018)
+ * Add documentation examples dependencies.
+ * Update d/copyright.
+ * Remove unlicensed files.
+
+ -- Rebecca N. Palmer <rebecca_palmer@zoho.com> Fri, 01 Mar 2019 23:02:18 +0000
+
+pandas (0.23.3-1) unstable; urgency=medium
+
+ * New upstream release
+ * debian/patches
+ - many upstreamed patches are removed and others refreshed
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sat, 28 Jul 2018 00:39:32 -0400
+
+pandas (0.22.0-8) unstable; urgency=medium
+
+ * Team Upload.
+ * patches:
+ + Add patch: deb_dont_call_py2_in_py3_test.patch
+ During the python3 unit tests, one of the tests calls the 'python'
+ command; without a python2 installation (e.g. under autopkgtest),
+ that test would fail.
+ * Mention the conditionally applied patch in a comment in the series
+ file to avoid the lintian warning
+ patch-file-present-but-not-mentioned-in-series.
+ * Trying to fix the autopkgtest:
+ + Leave a comment in the test control file about how to run the unit tests.
+ + Synchronize B-D and autopkgtest depends.
+ + Allow output to stderr during test.
+ * Switch from nosetest to pytest.
+ * Synchronize pytest argument for rules and autopkgtest.
+ - Replace tests/unittest with a symlink to tests/unittest3.
+ That script is smart enough to tell py2 from py3, so we don't
+ need to write the same thing twice.
+ - Filter out intel tests on non-x86 architectures.
+ - Only enable "slow" tests on the (Debian + x86) tester: "slow" tests
+ may consume enough memory to cause memory errors or trigger the OOM killer.
+ * control:
+ + Add missing python3 dependencies and sort the B-D list.
+ * Point Vcs-* fields to Salsa.
+ * Update Homepage to https://pandas.pydata.org/ .
+ * rules:
+ * Reverse the architecture filtering logic.
+ * Disable "slow" tests during build for non-x86 architectures.
+ This may significantly reduce the build time on those weak architectures.
+ * Don't specify the pytest marker expression twice;
+ the first expression would be overridden.
+ * Fix hardening flags.
+ - Clean up the now-unused nosetest exclusion expressions.
+ * Update lintian overrides.
+ + Override the source-is-missing error, which is a false positive
+ triggered by insane-line-length-in-source-file.
+ + Override insane-line-length-in-source-file because we cannot do
+ anything about lengthy lines in HTML.
+ * TODO: Point out that the unittest speed can be boosted with pytest-xdist.
+
+ -- Mo Zhou <cdluminate@gmail.com> Sun, 17 Jun 2018 16:01:16 +0000
+
+pandas (0.22.0-7) unstable; urgency=medium
+
+ * Team Upload.
+
+ [ Mo Zhou ]
+ * Remove patch: deb_fix_test_failure_test_basic_indexing, which is
+ unneeded for pandas >= 0.21 . (Closes: #900061)
+
+ [ Graham Inggs ]
+ * Add riscv64 to the list of "not intel" architectures
+ * Update mark_tests_working_on_intel_armhf.patch
+
+ -- Graham Inggs <ginggs@debian.org> Tue, 29 May 2018 13:50:59 +0000
+
+pandas (0.22.0-6) unstable; urgency=medium
+
+ * Team upload
+ * Fix FTBFS with Sphinx 1.7, thanks Dmitry Shachnev!
+
+ -- Graham Inggs <ginggs@debian.org> Tue, 24 Apr 2018 19:09:20 +0000
+
+pandas (0.22.0-5) unstable; urgency=medium
+
+ * Team upload
+ * Add compatibility with Matplotlib 2.2 (Closes: #896673)
+
+ -- Graham Inggs <ginggs@debian.org> Mon, 23 Apr 2018 13:56:12 +0000
+
+pandas (0.22.0-4) unstable; urgency=medium
+
+ * Team upload
+ * Fix more tests expecting little-endian results
+ * Fix heap corruption in read_csv on 32-bit, big-endian architectures
+ (Closes: #895890)
+
+ -- Graham Inggs <ginggs@debian.org> Sun, 22 Apr 2018 21:48:27 +0000
+
+pandas (0.22.0-3) unstable; urgency=medium
+
+ * Team upload
+ * Refresh and re-enable mark_tests_working_on_intel.patch
+ * Fix test__get_dtype tests expecting little-endian results
+
+ -- Graham Inggs <ginggs@debian.org> Thu, 12 Apr 2018 11:04:21 +0000
+
+pandas (0.22.0-2) unstable; urgency=medium
+
+ * debian/patches
+ - as upstream moved from nose to pytest, there were no longer any nose
+ imports in the code. Adjusted the patches to import nose where needed
+ * debian/rules
+ - specify LC_ALL=C locale to avoid crash while building docs
+ - add the 0001-TST-pytest-deprecation-warnings-GH17197-17253-reversed.patch
+ to the series if building on a system with an old pytest
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 21 Feb 2018 23:44:58 -0500
+
+pandas (0.22.0-1) unstable; urgency=medium
+
+ * Upstream release
+ * debian/patches
+ - refreshed many
+ - updated some
+ - added
+ - up_moto_optional to skip tests requiring moto (#777089)
+ - deb_skip_difffailingtests to skip two failing tests
+ (see https://github.com/pandas-dev/pandas/issues/19774)
+ - up_xlwt_optional to skip a test requiring xlwt
+ - deb_ndsphinx_optional to make nbsphinx optional.
+ Make nbsphinx not required in build-depends on systems with
+ older python-sphinx
+ - mark_tests_failing_on_386.patch
+ see https://github.com/pandas-dev/pandas/issues/19814
+ - removed adopted upstream:
+ - dateutil-2.6.1-fixed-ambiguous-tz-dst-be.patch
+ - up_tst_np_argsort_comparison2
+ - disabled for now:
+ - mark_tests_working_on_intel.patch
+ - up_tst_dont_assert_that_a_bug_exists_in_numpy
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 21 Feb 2018 10:30:06 -0500
+
+pandas (0.20.3-11) unstable; urgency=medium
+
+ * Team upload.
+ * Cherry-pick upstream commit 5f2b96bb637f6ddeec169c5ef8ad20013a03c853
+ to workaround a numpy bug. (Closes: #884294)
+ + patches/up_tst_dont_assert_that_a_bug_exists_in_numpy
+ * Cherry-pick upstream commits to fix test failure caused by test_argsort().
+ + patches/up_tst_np_argsort_comparison2
+ * Workaround test failure of test_basic_indexing() in file
+ pandas/tests/series/test_indexing.py .
+ + patches/deb_fix_test_failure_test_basic_indexing
+
+ -- Mo Zhou <cdluminate@gmail.com> Sat, 20 Jan 2018 09:00:31 +0000
+
+pandas (0.20.3-10) unstable; urgency=medium
+
+ * Team upload.
+ * Exclude more tests failing on mips, armhf and powerpc
+
+ -- Andreas Tille <tille@debian.org> Tue, 24 Oct 2017 21:26:02 +0200
+
+pandas (0.20.3-9) unstable; urgency=medium
+
+ * Team upload.
+ * Add missing "import pytest" to two patched tests
+ * Secure URI in watch file
+
+ -- Andreas Tille <tille@debian.org> Tue, 24 Oct 2017 08:18:54 +0200
+
+pandas (0.20.3-8) unstable; urgency=medium
+
+ * Team upload.
+ * Exclude one more test and de-activate non-working ignore of test errors
+
+ -- Andreas Tille <tille@debian.org> Mon, 23 Oct 2017 21:32:24 +0200
+
+pandas (0.20.3-7) unstable; urgency=medium
+
+ * Team upload.
+ * debhelper 9
+ * Use Debian packaged mathjax
+ * Do not Recommends python3-six since it is mentioned in Depends
+ * Remove redundant/outdated XS-Testsuite: autopkgtest
+ * Exclude one more test and de-activate non-working ignore of test errors
+
+ -- Andreas Tille <tille@debian.org> Mon, 23 Oct 2017 17:33:55 +0200
+
+pandas (0.20.3-6) unstable; urgency=medium
+
+ * Team upload.
+ * Ignore test errors on some architectures
+ (Concerns bug #877419)
+ * Remove __pycache__ leftovers from testing
+ * Standards-Version: 4.1.1
+ * DEP3 for Google Analytics patch
+ * Complete Google Analytics patch
+
+ -- Andreas Tille <tille@debian.org> Mon, 23 Oct 2017 09:05:27 +0200
+
+pandas (0.20.3-5) unstable; urgency=medium
+
+ * Make sure leftovers of the nose tests will not fail. That's a pretty
+ stupid patch since the tests are not using nose any more, only some
+ remaining exceptions. Hope it will work anyway.
+ (Concerns bug #877419)
+
+ -- Andreas Tille <tille@debian.org> Mon, 16 Oct 2017 21:57:45 +0200
+
+pandas (0.20.3-4) unstable; urgency=medium
+
+ * Mark tests that pass only on Intel architectures with @pytest.mark.intel
+ * d/rules: try to exclude tests that were marked "intel"
+ (Concerns bug #877419)
+
+ -- Andreas Tille <tille@debian.org> Sat, 14 Oct 2017 19:49:01 +0200
+
+pandas (0.20.3-3) unstable; urgency=medium
+
+ * Team upload.
+ * Moved packaging from pkg-exppsy to Debian Science
+ * Exclude certain tests on certain architectures
+ (Concerns bug #877419)
+
+ -- Andreas Tille <tille@debian.org> Fri, 13 Oct 2017 20:52:53 +0200
+
+pandas (0.20.3-2) unstable; urgency=medium
+
+ * debian/control
+ - boosted policy to 4.0.0 (I think we should be ok)
+ - drop statsmodels from build-depends to altogether avoid the circular
+ build-depends (Closes: #875805)
+ * Diane Trout:
+ - Add dateutil-2.6.1-fixed-ambiguous-tz-dst-be.patch (Closes: #875807)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 21 Sep 2017 16:11:29 -0400
+
+pandas (0.20.3-1) unstable; urgency=medium
+
+ * Fresh upstream release
+ * debian/patches
+ - updated some, removed changeset*, and disabled ones possibly fixed
+ upstream
+ * debian/{control,rules}
+ - upstream switched to use pytest instead of nose
+ - re-enabled all the tests for now
+ - added python-nbsphinx to build-depends, needed for docs
+ * debian/*.install
+ - no .so files at the first level of subdirectories; they are now at the
+ third level
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Mon, 10 Jul 2017 20:00:59 -0400
+
+pandas (0.19.2-5.1) unstable; urgency=medium
+
+ * Non-maintainer upload.
+ * Apply patch by Rebecca N. Palmer
+ Closes: #858260
+
+ -- Andreas Tille <tille@debian.org> Sun, 02 Apr 2017 07:06:36 +0200
+
+pandas (0.19.2-5) unstable; urgency=medium
+
+ * And one more test to skip on non-amd64 -- test_round_trip_valid_encodings
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 12 Jan 2017 13:10:11 -0500
+
+pandas (0.19.2-4) unstable; urgency=medium
+
+ * Exclude a few more "plotting" tests on non-amd64, which cause FTBFS
+ on s390
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 12 Jan 2017 11:43:13 -0500
+
+pandas (0.19.2-3) unstable; urgency=medium
+
+ * Brought back changeset_0699c89882133a41c250abdac02796fec84512e8.diff
+ which should resolve test failures on BE platforms (it wasn't yet
+ upstreamed within the 0.19.x releases)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 12 Jan 2017 09:44:52 -0500
+
+pandas (0.19.2-2) unstable; urgency=medium
+
+ * Exclude a number of tests while running on non-amd64 platforms
+ due to bugs in numpy/pandas
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 11 Jan 2017 12:13:05 -0500
+
+pandas (0.19.2-1) unstable; urgency=medium
+
+ * Fresh upstream minor release -- supposed to be a bugfix but it interacts
+ with the current numpy beta (1:1.12.0~b1-1), leading to various failed
+ tests
+ * debian/patches
+ - changeset_ae6a0a51cf41223394b7ef1038c210045d486cc8.diff
+ to guarantee the same Series dtype from cut regardless of architecture
+ - up_buggy_overflows
+ workaround for inconsistent overflows while doing pow operation on big
+ ints
+ * debian/rules
+ - exclude more tests that fail due to known issues in the numpy beta and
+ thus are not to be addressed directly in pandas
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 04 Jan 2017 10:19:52 -0500
+
+pandas (0.19.1+git174-g81a2f79-1) experimental; urgency=medium
+
+ * New upstream snapshot from v0.19.0-174-g81a2f79
+ - lots of bugfixes since 0.19.1, so decided to test snapshot
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sat, 10 Dec 2016 22:43:19 -0500
+
+pandas (0.19.1-3) unstable; urgency=medium
+
+ * Require cython >= 0.23 or otherwise use pre-cythonized sources
+ (should resolve https://github.com/pandas-dev/pandas/issues/14699
+ on jessie)
+ * debian/control
+ - Build-Conflicts with python-tables 3.3.0-4 since that one leads to FTBFS
+ - boosted policy to 3.9.8
+ * debian/rules
+ - Exclude a few more tests which fail on big-endian and other platforms:
+ test_(msgpack|read_dta18)
+ * debian/patches
+ - changeset_0699c89882133a41c250abdac02796fec84512e8.diff
+ to compare in the tests against native endianness
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 09 Dec 2016 15:49:50 -0500
+
+pandas (0.19.1-2) unstable; urgency=medium
+
+ * debian/control
+ - Moved statsmodels build-depend (optional) under build-depends-indep
+ to break circular dependency. Thanks Stuart Prescott for the analysis
+ * debian/patches/
+ - changeset_1309346c08945cd4764a549ec63cf51089634a45.diff
+ to not mask a problem reading JSON that led to the use of an undefined
+ variable
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 27 Nov 2016 21:49:40 -0500
+
+pandas (0.19.1-1) unstable; urgency=medium
+
+ * Fresh upstream release
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 18 Nov 2016 12:19:54 -0500
+
+pandas (0.19.0+git14-ga40e185-1) unstable; urgency=medium
+
+ * New upstream post-release (includes some bugfixes) snapshot
+ * debian/patches
+ - dropped changeset_ and up_ patches adopted upstream, refreshed the rest
+ * debian/rules,patches
+ - save the debian-based version into __version.py, so it doesn't conflict
+ with upstream tests of the public API
+ - exclude for now test_expressions on python3
+ (see https://github.com/pydata/pandas/issues/14269)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 13 Oct 2016 10:26:18 -0400
+
+pandas (0.18.1-1) unstable; urgency=medium
+
+ * Fresh upstream release
+ * debian/patches/
+ - changeset_46af7cf0f8e0477f6cc7454aa786a573228f0ac3.diff
+ to also allow an AttributeError exception to be raised in the tests
+ (Closes: #827938)
+ - debian/patches/deb_skip_test_precision_i386
+ removed (upstreamed)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 13 Jul 2016 10:42:00 -0400
+
+pandas (0.18.0+git114-g6c692ae-1) unstable; urgency=medium
+
+ * debian/control
+ - added python{,3}-pkg-resources to direct Depends for the packages
+ (Closes: #821076)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 17 Apr 2016 20:49:25 -0400
+
+pandas (0.17.1-3) unstable; urgency=medium
+
+ * debian/tests/unittest*
+ - set LC_ALL=C.UTF-8 for the tests run to prevent failure of test_set_locale
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 08 Dec 2015 08:31:30 -0500
+
+pandas (0.17.1-2) unstable; urgency=medium
+
+ * debian/control
+ - make -statsmodels and -tables optional build-depends on those platforms
+ where they are currently not available. Added a build-depends on
+ python3-tables since it is now available
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 06 Dec 2015 12:58:26 -0500
+
+pandas (0.17.1-1) unstable; urgency=medium
+
+ * Fresh upstream bugfix release
+ * debian/rules
+ - fixed deletion of moved away .so files
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 27 Nov 2015 10:52:49 -0500
+
+pandas (0.17.0+git8-gcac4ad2-2) unstable; urgency=medium
+
+ * Bug fix: also install the msgpack/*.so extensions into the -lib packages
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sat, 10 Oct 2015 13:52:54 -0400
+
+pandas (0.17.0+git8-gcac4ad2-1) unstable; urgency=medium
+
+ * New upstream post-release snapshot to pick up a few bugfixes
+ - It started to trigger failures of test_constructor_compound_dtypes and
+ test_invalid_index_types -- disabled those for now, see
+ https://github.com/pydata/pandas/issues/11169
+ * debian/rules
+ - Generate pandas/version.py, if not present, from the debian/changelog
+ upstream version information (versioneer wouldn't know the version since
+ it relies on git)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 09 Oct 2015 21:35:23 -0400
+
+pandas (0.16.2+git65-g054821d-1) unstable; urgency=medium
+
+ * Fresh upstream post-release snapshot (to pick up recent fixes etc)
+ (Closes: #787432)
+ * debian/{control,rules}
+ - build -doc package (Closes: #660900)
+ - add ipython (or alternative new ones from neurodebian) into
+ Build-Depends-Indep to build docs
+ - add python{,3}-{lxml,html5lib} to Build-Depends and Recommends
+ - use LC_ALL=C.UTF-8 while running tests
+ - exclude also test_set_locale since it fails ATM
+ see https://github.com/pydata/pandas/issues/10471
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 30 Jun 2015 17:26:54 -0400
+
+pandas (0.16.0~rc1-1) experimental; urgency=medium
+
+ * New upstream release candidate
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 13 Mar 2015 14:21:39 -0400
+
+pandas (0.15.2-1) unstable; urgency=medium
+
+ * Fresh upstream release
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 11 Dec 2014 09:51:57 -0500
+
+pandas (0.15.1+git125-ge463818-1) unstable; urgency=medium
+
+ * New upstream snapshot from v0.15.1-125-ge463818.
+ * Upload to unstable during the freeze since the previous one in sid didn't
+ make it to jessie anyway
+ * debian/control
+ - remove the version requirement on cython (pre-cythonized code would be
+ used on older ones, and there is no longer a need in sid/jessie to
+ enforce a version). As a consequence -- removed all dsc patches pointing
+ to nocython3-dsc-patch, since they are no longer needed
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 30 Nov 2014 21:09:36 -0500
+
+pandas (0.15.0-2) unstable; urgency=medium
+
+ * debian/control
+ - specify minimal numpy to be 1.7
+ * debian/patches
+ - deb_skip_stata_on_bigendians skip test_stata again on BE platforms
+ - deb_skip_test_precision_i386 skip test_precision_conversion on 32bit
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 30 Oct 2014 23:09:13 -0400
+
+pandas (0.15.0-1) unstable; urgency=medium
+
+ * New upstream release
+ * debian/control
+ - restrict statsmodels and matplotlib from being required on the ports
+ which do not have them
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 26 Oct 2014 11:30:23 -0400
+
+pandas (0.14.1-2) unstable; urgency=medium
+
+ * debian/patches/changeset_314012d.diff
+ - Fix converter test for MPL1.4 (Closes: #763709)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Mon, 06 Oct 2014 11:53:42 -0400
+
+pandas (0.14.1-1) unstable; urgency=medium
+
+ * New upstream release
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 10 Jul 2014 23:38:49 -0400
+
+pandas (0.14.0+git393-g959e3e4-1) UNRELEASED; urgency=medium
+
+ * New upstream snapshot from v0.14.0-345-g8cd3dd6
+ * debian/rules
+ - disable running the disabled tests to prevent clipboard test failures
+ under kFreeBSD kernels
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Mon, 07 Jul 2014 12:29:50 -0400
+
+pandas (0.14.0+git213-g741b2fa-1) experimental; urgency=medium
+
+ * New upstream snapshot from v0.14.0-213-g741b2fa.
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 19 Jun 2014 10:30:42 -0400
+
+pandas (0.14.0+git17-g3849d5d-1) unstable; urgency=medium
+
+ * New upstream snapshot from v0.14.0-17-g3849d5d -- it has resolved a
+ number of bugs that sneaked into the 0.14.0 release and caused FTBFS on
+ some platforms and backports
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 01 Jun 2014 00:54:34 -0400
+
+pandas (0.14.0-1) unstable; urgency=medium
+
+ * New upstream release
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 30 May 2014 08:45:35 -0400
+
+pandas (0.14.0~rc1+git79-g1fa5dd4-1) experimental; urgency=medium
+
+ * New upstream snapshot from v0.14.0rc1-73-g8793356
+ * debian/patches:
+ - dropped CPed changeset_*s
+ - added deb_disable_googleanalytics
+ * debian/control:
+ - boosted policy compliance to 3.9.5
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 27 May 2014 16:00:00 -0400
+
+pandas (0.13.1-2) unstable; urgency=low
+
+ * debian/patches/changeset_6d56e7300d66d3ba76684334bbb44b6cd0ea9f61.diff
+ to fix FTBFS of statsmodels due to failing tests (Closes: #735804)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sat, 08 Feb 2014 12:46:42 -0500
+
+pandas (0.13.1-1) unstable; urgency=low
+
+ * Fresh upstream release
+ * debian/patches
+ - deb_skip_test_pytables_failure to mitigate error while testing on
+ amd64 wheezy and ubuntu 12.04
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 04 Feb 2014 12:09:29 -0500
+
+pandas (0.13.0+git464-g15a8ff7-1) experimental; urgency=low
+
+ * Fresh pre-release snapshot
+ * debian/patches
+ - removed all cherry-picked patches (should have been upstreamed)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 29 Jan 2014 21:27:45 -0500
+
+pandas (0.13.0-2) unstable; urgency=low
+
+ * debian/patches
+ - 0001-BLD-fix-cythonized-msgpack-extension-in-setup.py-GH5.patch
+ to resolve issue with building C++ Cython extension using
+ pre-generated sources
+ - 0001-Add-division-future-import-everywhere.patch
+ 0002-remove-explicit-truediv-kwarg.patch
+ to resolve compatibility issues with elderly Numexpr
+ - 0001-BUG-Yahoo-finance-changed-ichart-url.-Fixed-here.patch
+ - deb_skip_sequencelike_on_armel to prevent FTBFS on armel due to failing
+ test: https://github.com/pydata/pandas/issues/4473
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 03 Jan 2014 23:13:48 -0500
+
+pandas (0.13.0-1) unstable; urgency=low
+
+ * Fresh upstream release
+ - resolved compatibility with matplotlib 1.3 (Closes: #733848)
+ * debian/{control,rules}
+ - use xvfb (added to build-depends together with xauth, and xclip)
+ for tests
+ - define http*_proxy to prevent downloads
+ - install .md files not .rst for docs -- were renamed upstream
+ - include .cpp Cython generated files into debian/cythonized-files*
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 01 Jan 2014 18:08:22 -0500
+
+pandas (0.12.0-2) unstable; urgency=low
+
+ [ Dmitry Shachnev ]
+ * DEP-8 tests improvements:
+ - Use Xvfb for running tests.
+ - Increase verbosity using -v flag.
+ - Fix printing interpreter version in unittests3.
+ * Fix indentation in debian/control.
+
+ [ Yaroslav Halchenko ]
+ * debian/control
+ - place python3-matplotlib ahead of the elderly python-matplotlib (which
+ lacks python3 support), since python3-matplotlib is now in sid
+ * debian/copyright
+ - go through reported missing copyright/license statements (Closes:
+ #700564) Thanks Luca Falavigna for the report
+ * debian/rules,patches
+ - exclude the test_bar_log test due to incompatibility with matplotlib
+ 1.3.0 (the test was adjusted upstream and will be re-enabled with the new
+ release).
+ - debian/patches/changeset_952c5f0bc433622d21df20ed761ee4cb728370eb.diff
+ adds matplotlib 1.3.0 compatibility
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sat, 14 Sep 2013 20:02:58 -0400
+
+pandas (0.12.0-1) unstable; urgency=low
+
+ * New upstream release:
+ - should address failed tests on 32bit platforms
+ * debian/patches
+ - neurodebian: allow building for jessie with an outdated cython
+ * debian/control
+ - build for Python2 >= 2.7 due to some (probably temporary) incompatibilities
+ in tests with 2.6
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 24 Jul 2013 23:29:03 -0400
+
+pandas (0.12.0~rc1+git127-gec8920a-1) experimental; urgency=low
+
+ * New upstream snapshot from origin/master at v0.12.0rc1-127-gec8920a
+ - should address FTBFS due to failing tests on big endians
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sat, 20 Jul 2013 09:23:04 -0400
+
+pandas (0.12.0~rc1+git112-gb79996c-1) experimental; urgency=low
+
+ * Fresh git snapshot of upstream candidate release. Experimental build
+ to verify functioning across the ports.
+ * debian/control
+ - dedented last "paragraph" to break it away from the 2nd one.
+ Thanks Beatrice Torracca for the detailed report (Closes: #712260)
+ - Depends on python-six now
+ * debian/{,tests/}control
+ - added python{,3}-bs4, python-html5lib to Build-Depends for more
+ thorough testing
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 18 Jul 2013 13:15:19 -0400
+
+pandas (0.11.0-2) unstable; urgency=low
+
+ [ Yaroslav Halchenko ]
+ * Upload to unstable -- this upstream release addressed Cython 0.19
+ compatibility issue (Closes: #710608)
+ * Recommends numexpr
+ * Re-cythonized using Cython 0.19
+
+ [ Dmitry Shachnev ]
+ * debian/tests/unittests3: use nosetests3 instead of nosetests-3.x.
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Mon, 03 Jun 2013 11:57:43 -0400
+
+pandas (0.11.0-1) experimental; urgency=low
+
+ * New upstream release
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 23 Apr 2013 22:40:15 -0400
+
+pandas (0.10.1-1) experimental; urgency=low
+
+ * New upstream release
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 22 Jan 2013 13:07:31 -0500
+
+pandas (0.10.0-1) experimental; urgency=low
+
+ * New upstream release
+ - drops python 2.5 support (we are dropping pyversions in favor of
+ X-Python-Version)
+ * debian/patches:
+ - all previous patches are now upstream, dropped locally
+ - added -dsc-patch'es for systems without cython3
+ * debian/control:
+ - added python-statsmodels for the extended tests coverage
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Mon, 17 Dec 2012 12:27:25 -0500
+
+pandas (0.9.1-2) unstable; urgency=low
+
+ [ Julian Taylor ]
+ * Provide python3 packages
+ * Add autopkgtests
+ * debian/patches:
+ - relax-float-tests.patch:
+ replace float equality tests with almost equal
+ - fix-endian-tests.patch:
+ patch from upstream to fix the test failure on big endian machines
+
+ [ Yaroslav Halchenko ]
+ * Upload to unstable
+ * Dropping pysupport
+ * debian/rules:
+ - slight reduction of code duplication between python 2 and 3
+ - cythonize for both python 2 and 3 into separate directories
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sat, 01 Dec 2012 22:57:47 -0500
+
+pandas (0.9.1-1) experimental; urgency=low
+
+ * New upstream release
+ * Boosted policy to 3.9.3 (no due changes)
+ * debian/rules
+ - Fixed up cleaning up of cythonized files
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Wed, 14 Nov 2012 09:44:14 -0500
+
+pandas (0.9.0-1) experimental; urgency=low
+
+ * New upstream release
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 07 Oct 2012 21:26:23 -0400
+
+pandas (0.9.0~rc2-1) experimental; urgency=low
+
+ * New upstream release candidate
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 21 Sep 2012 10:27:52 -0400
+
+pandas (0.8.1-1) unstable; urgency=low
+
+ * Primarily a bugfix upstream release.
+ * up_tag_yahoo_test_requiring_network patch removed.
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 22 Jul 2012 20:13:16 -0400
+
+pandas (0.8.0-2) unstable; urgency=medium
+
+ * up_tag_yahoo_test_requiring_network patch cherry-picked from upstream
+ Git so that the tests would not be exercised at package build time
+ (Closes: #681449)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 13 Jul 2012 08:54:41 -0400
+
+pandas (0.8.0-1) unstable; urgency=low
+
+ * Fresh upstream release
+ * debian/control
+ - drop python-statsmodels from Build-Depends since it might not yet be
+ available on some architectures and is not critical for the test
+ - recommend python-statsmodels instead of deprecated
+ python-scikits.statsmodels
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Fri, 29 Jun 2012 13:02:28 -0400
+
+pandas (0.8.0~rc2+git26-g76c6351-1) experimental; urgency=low
+
+ * Fresh upstream release candidate
+ - all patches dropped (upstreamed)
+ - requires numpy >= 1.6
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 12 Jun 2012 13:23:27 -0400
+
+pandas (0.7.3-1) unstable; urgency=low
+
+ * Fresh upstream release
+ - a few post-release patches (submitted upstream) to exclude unit tests
+ requiring network access
+ * debian/control:
+ - python-openpyxl, python-xlwt, python-xlrd into Build-Depends
+ and Recommends
+ * debian/rules:
+ - exclude running tests marked with @network
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 12 Apr 2012 11:27:31 -0400
+
+pandas (0.7.1+git1-ga2e86c2-1) unstable; urgency=low
+
+ * New upstream release with a bugfix which followed
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Thu, 01 Mar 2012 22:28:10 -0500
+
+pandas (0.7.0-1) unstable; urgency=low
+
+ * New upstream release
+ * Updated pre-cythonized .c files for older Debian/Ubuntu releases.
+ Added a stamp file with the upstream version to ensure the generated
+ files stay up to date
+ * Dropped all exclusions of unittests and patches -- shouldn't be necessary
+ any longer
+ * Build only for requested versions (not all supported) of Python
+ * Do nothing for build operation, rely on overloaded install
+ (to avoid undesired re-cythonization on elderly Ubuntus)
+ * Adjusted url in watch due to migration of repository under pydata
+ organization
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Mon, 16 Jan 2012 19:31:50 -0500
+
+pandas (0.6.1-1) UNRELEASED; urgency=low
+
+ * New upstream release
+ * python-tk into Build-Depends
+ * Create a matplotlibrc with "backend: Agg" to allow tests to run without
+ $DISPLAY
+ * Carry pre-cythonized .c files for systems with older Cython
+ * Skip a few tests known to fail
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 13 Dec 2011 18:36:11 -0500
+
+pandas (0.5.0+git7-gcf32be2-1) unstable; urgency=low
+
+ * New upstream release with post-release fixes
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 01 Nov 2011 21:15:06 -0400
+
+pandas (0.4.3-1) unstable; urgency=low
+
+ * New upstream release(s): primarily bugfixes and optimizations but also
+ with some minor API changes and new functionality
+ * Adjusted debian/watch to match new layout on github
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 18 Oct 2011 11:27:50 -0400
+
+pandas (0.4.1-1) unstable; urgency=low
+
+ * New upstream bugfix release
+ - incorporated all debian/patches
+ * debian/rules: 'clean' removes generated pandas/version.py
+ * debian/copyright: adjusted to become DEP-5 compliant
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Sun, 25 Sep 2011 21:48:30 -0400
+
+pandas (0.4.0-1) unstable; urgency=low
+
+ * Initial Debian release (Closes: #641464)
+
+ -- Yaroslav Halchenko <debian@onerussian.com> Tue, 13 Sep 2011 12:24:05 -0400
--- /dev/null
+Source: pandas
+Section: python
+Priority: optional
+Maintainer: Debian Science Team <debian-science-maintainers@lists.alioth.debian.org>
+Uploaders: Yaroslav Halchenko <debian@onerussian.com>,
+ Michael Hanke <michael.hanke@gmail.com>,
+ Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Build-Depends: debhelper-compat (= 12),
+ dh-python,
+ locales-all,
+ quilt,
+ python-all-dev (>= 2.5),
+ python-setuptools,
+ cython,
+ python-bs4 <!nocheck>,
+ python-dateutil,
+ python-html5lib <!nocheck>,
+ python-lxml <!nocheck>,
+ python-matplotlib [!hurd-i386],
+ python-nose <!nocheck>,
+ python-numpy,
+ python-openpyxl <!nocheck>,
+ python-pytest <!nocheck>,
+ python-scipy,
+ python-six,
+ python-tables [!m68k !sh4 !x32] <!nocheck>,
+ python-tk <!nocheck>,
+ python-tz <!nocheck>,
+ python-xlsxwriter <!nocheck>,
+ python-xlrd <!nocheck>,
+ python-xlwt <!nocheck>,
+ python3-all-dev,
+ python3-setuptools,
+ cython3,
+ python3-bs4 <!nocheck> <!nodoc>,
+ python3-dateutil,
+ python3-html5lib <!nocheck> <!nodoc>,
+ python3-lxml <!nocheck> <!nodoc>,
+ python3-matplotlib [!hurd-i386],
+ python3-nose <!nocheck> <!nodoc>,
+ python3-numpy,
+ python3-openpyxl <!nocheck> <!nodoc>,
+ python3-pytest <!nocheck> <!nodoc>,
+ python3-scipy,
+ python3-six,
+ python3-tables [!m68k !sh4 !x32] <!nocheck> <!nodoc>,
+ python3-tk <!nocheck> <!nodoc>,
+ python3-tz <!nocheck> <!nodoc>,
+ python3-xlsxwriter <!nocheck> <!nodoc>,
+ python3-xlrd <!nocheck> <!nodoc>,
+ python3-xlwt <!nocheck> <!nodoc>,
+ xvfb <!nocheck>,
+ xauth <!nocheck>,
+ xclip <!nocheck>,
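+# Note: <!nocheck> / <!nodoc> mark dependencies that are only needed to run
+# the test suite or to build the documentation; they are skipped when the
+# corresponding build profile (nocheck / nodoc) is active.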
+# TODO: python3-pytest-xdist for parallel testing?
+Build-Depends-Indep:
+ python3-sphinx <!nodoc>,
+ python3-nbsphinx <!nodoc>,
+ python3-ipykernel <!nodoc>,
+ ipython3 <!nodoc>,
+ jdupes <!nodoc>,
+# for style.ipynb
+ pandoc <!nodoc>,
+# for intersphinx inventories
+ python3-doc <!nodoc>,
+ python-numpy-doc <!nodoc>,
+ python-scipy-doc <!nodoc>,
+ python-matplotlib-doc <!nodoc>,
+ python-statsmodels-doc <!nodoc>,
+# these are for not having (as many) exception messages in documentation examples
+# so may be temporarily removed if they are broken or to break bootstrap cycles
+ python3-feather-format <!nodoc>,
+# not in Debian python3-pyarrow <!nodoc> | python3-fastparquet <!nodoc>,
+ python3-rpy2 <!nodoc>,
+ python3-sqlalchemy <!nodoc>,
+ python3-statsmodels <!nodoc>,
+ python3-xarray <!nodoc>,
+ python3-ipywidgets <!nodoc>,
+ python3-seaborn <!nodoc>
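+# python-tables 3.3.0-4 led to FTBFS (see the 0.19.1-3 changelog entry)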
+Build-Conflicts: python-tables (= 3.3.0-4), python3-tables (= 3.3.0-4)
+Standards-Version: 4.1.1
+# TODO for 4.4.1: release notes install (Policy 12.7)
+Rules-Requires-Root: no
+Homepage: https://pandas.pydata.org/
+Vcs-Browser: https://salsa.debian.org/science-team/pandas
+Vcs-Git: https://salsa.debian.org/science-team/pandas.git
+
+Package: python-pandas
+Architecture: all
+Depends: ${misc:Depends}, ${python:Depends},
+ python-numpy (>= 1:1.7~),
+ python-dateutil,
+ python-pandas-lib (>= ${source:Version}),
+ python-pkg-resources,
+ python-six,
+Recommends: python-scipy,
+ python-matplotlib,
+ python-tables,
+ python-numexpr,
+ python-tz,
+ python-xlrd,
+ python-openpyxl, python-xlwt,
+ python-bs4,
+ python-html5lib,
+ python-lxml,
+Provides: ${python:Provides}
+Suggests: python-pandas-doc,
+ python-statsmodels
+Description: data structures for "relational" or "labeled" data - Python 2
+ pandas is a Python package providing fast, flexible, and expressive
+ data structures designed to make working with "relational" or
+ "labeled" data both easy and intuitive. It aims to be the fundamental
+ high-level building block for doing practical, real world data
+ analysis in Python. pandas is well suited for many different kinds of
+ data:
+ .
+ - Tabular data with heterogeneously-typed columns, as in an SQL
+ table or Excel spreadsheet
+ - Ordered and unordered (not necessarily fixed-frequency) time
+ series data.
+ - Arbitrary matrix data (homogeneously typed or heterogeneous) with
+ row and column labels
+ - Any other form of observational / statistical data sets. The data
+ actually need not be labeled at all to be placed into a pandas
+ data structure
+ .
+ This package contains the Python 2 version.
+
+Package: python3-pandas
+Architecture: all
+Depends: ${misc:Depends}, ${python3:Depends},
+ python3-numpy (>= 1:1.7~),
+ python3-dateutil,
+ python3-pandas-lib (>= ${source:Version}),
+ python3-pkg-resources,
+ python3-six,
+Recommends: python3-scipy,
+ python3-matplotlib,
+ python3-numexpr,
+ python3-tables,
+ python3-tz,
+ python3-xlrd,
+ python3-openpyxl, python3-xlwt,
+ python3-bs4,
+ python3-html5lib,
+ python3-lxml,
+Suggests: python-pandas-doc,
+ python3-statsmodels
+Description: data structures for "relational" or "labeled" data - Python 3
+ pandas is a Python package providing fast, flexible, and expressive
+ data structures designed to make working with "relational" or
+ "labeled" data both easy and intuitive. It aims to be the fundamental
+ high-level building block for doing practical, real world data
+ analysis in Python. pandas is well suited for many different kinds of
+ data:
+ .
+ - Tabular data with heterogeneously-typed columns, as in an SQL
+ table or Excel spreadsheet
+ - Ordered and unordered (not necessarily fixed-frequency) time
+ series data.
+ - Arbitrary matrix data (homogeneously typed or heterogeneous) with
+ row and column labels
+ - Any other form of observational / statistical data sets. The data
+ actually need not be labeled at all to be placed into a pandas
+ data structure
+ .
+ This package contains the Python 3 version.
+
+Package: python-pandas-doc
+Architecture: all
+Section: doc
+Depends: ${misc:Depends},
+ libjs-jquery,
+ libjs-mathjax
+Suggests: python3-pandas
+Description: data structures for "relational" or "labeled" data - documentation
+ pandas is a Python package providing fast, flexible, and expressive
+ data structures designed to make working with "relational" or
+ "labeled" data both easy and intuitive. It aims to be the fundamental
+ high-level building block for doing practical, real world data
+ analysis in Python. pandas is well suited for many different kinds of
+ data:
+ .
+ - Tabular data with heterogeneously-typed columns, as in an SQL
+ table or Excel spreadsheet
+ - Ordered and unordered (not necessarily fixed-frequency) time
+ series data.
+ - Arbitrary matrix data (homogeneously typed or heterogeneous) with
+ row and column labels
+ - Any other form of observational / statistical data sets. The data
+ actually need not be labeled at all to be placed into a pandas
+ data structure
+ .
+ This package contains the documentation.
+
+Package: python-pandas-lib
+Architecture: any
+Depends: ${misc:Depends}, ${shlibs:Depends}, ${python:Depends}, python-numpy (>= 1:1.7~)
+Provides: ${python:Provides}
+XB-Python-Version: ${python:Versions}
+Description: low-level implementations and bindings for pandas - Python 2
+ This is a low-level package for python-pandas providing
+ architecture-dependent extensions.
+ .
+ Users should not need to install it directly.
+
+Package: python3-pandas-lib
+Architecture: any
+Depends: ${misc:Depends}, ${shlibs:Depends}, ${python3:Depends}, python3-numpy (>= 1:1.7~)
+Description: low-level implementations and bindings for pandas - Python 3
+ This is a low-level package for python3-pandas providing
+ architecture-dependent extensions.
+ .
+ Users should not need to install it directly.
--- /dev/null
+Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
+Upstream-Name: pandas
+Upstream-Contact: pandas-dev@python.org
+Source: https://github.com/pandas-dev/pandas
+Files-Excluded: pandas/tests/io/data/computer_sales_page.html
+ pandas/tests/io/data/macau.html
+ pandas/tests/io/data/nyse_wsj.html
+ scripts/find_commits_touching_func.py
+ scripts/merge-pr.py
+Comment: I am not certain whether the above are actually a problem, but this close to freeze it's easiest to just remove them
+
+
+Files: *
+Copyright: 2008-2011 AQR Capital Management, LLC
+ 2011 Wes McKinney and pandas developers
+ 2011-2018 Lambda Foundry, Inc. and PyData Development Team
+License: BSD-3
+
+Files: pandas/_libs/src/datetime/*
+Copyright: 2005-2013, NumPy Developers
+License: BSD-3
+Origin: numpy
+Comment: Listed as derived from Numpy 1.7
+
+Files: pandas/_libs/skiplist.pyx
+ pandas/_libs/src/skiplist.h
+Copyright: 2009, Raymond Hettinger
+ 2011-2018 Wes McKinney and PyData Development Team
+License: Expat and BSD-3
+Origin: http://code.activestate.com/recipes/576930/
+Comment: it is a Cython code "inspired" by the original Python code by Raymond
+
+Files: pandas/_libs/src/headers/ms_*
+Copyright: 2006-2008 Alexander Chemeris
+License: BSD-3
+
+Files: pandas/_libs/src/klib/*
+Copyright: 2008, 2009, 2011 by Attractive Chaos <attractor@live.co.uk>
+License: Expat
+
+Files: pandas/_libs/src/msgpack/*
+Copyright: 2008-2011 FURUHASHI Sadayuki and Naoki INADA
+License: Apache
+
+Files: pandas/_libs/src/parser/tokenizer.*
+Copyright: 2002 Michael Ringgaard
+ 2011-2012 Warren Weckesser
+ 2001-2012 Python Software Foundation and Python contributors
+ 2012-2018 Lambda Foundry, Inc. and PyData Development Team
+License: Python and BSD-3
+Origin: csv (Python standard library), github.com/WarrenWeckesser/textreader
+
+Files: pandas/_libs/src/ujson/*
+Copyright: 1988-1993 The Regents of the University of California
+ 1994 Sun Microsystems, Inc.
+ 2007 Nick Galbreath
+ 2011-2013 ESN Social Software AB and Jonas Tarnstrom
+ 2012-2018 Lambda Foundry, Inc. and PyData Development Team
+License: BSD-3 and Expat
+Origin: ultrajson
+
+Files: pandas/compat/*
+Copyright: 2010-2013 Benjamin Peterson
+ 2012-2018 Lambda Foundry, Inc. and PyData Development Team
+License: Expat and BSD-3
+Origin: six
+
+Files: pandas/core/window.py
+Copyright: 2010-2012 Archipel Asset Management AB
+ 2011-2018 Lambda Foundry, Inc. and PyData Development Team
+License: BSD-3
+Comment: unclear whether actual copying from bottleneck has taken place; assuming it has, to be safe. The original was BSD-2, but BSD-2 combined with BSD-3 is effectively BSD-3
+
+Files: pandas/io/packers.py
+ pandas/tests/io/test_packers.py
+Copyright: 2013 Lev Givon
+ 2013-2018 Lambda Foundry, Inc. and PyData Development Team
+License: BSD-3
+
+Files: pandas/io/sas/sas7bdat.py
+Copyright: 2015 Jared Hobbs
+ 2016-2018 Lambda Foundry, Inc. and PyData Development Team
+Origin: https://bitbucket.org/jaredhobbs/sas7bdat
+License: Expat
+
+Files: pandas/io/clipboard/*
+Copyright: 2010-2017 Albert Sweigart and Pyperclip contributors
+ 2016-2018 Lambda Foundry, Inc. and PyData Development Team
+License: BSD-3
+Origin: Pyperclip
+
+Files: pandas/tests/io/data/banklist.html
+ pandas/tests/io/data/banklist.csv
+ pandas/tests/io/data/spam.html
+Copyright: None; by Federal Deposit Insurance Corporation and US Department of Agriculture
+License: public-domain
+
+Files: pandas/tests/io/data/wikipedia_states.html
+Copyright: 2002-2014 Wikipedia contributors (full list: https://en.wikipedia.org/w/index.php?title=List_of_U.S._states_and_territories_by_area&offset=20140630&action=history)
+License: CC-BY-SA-3.0
+
+Files: scripts/announce.py
+Copyright: 2001-2017 Enthought, Inc. and SciPy Developers.
+ 2017-2018 Lambda Foundry, Inc. and PyData Development Team
+License: BSD-3
+Comment: it is possible that other code was also taken from Scipy
+
+Files: setup.py
+Copyright: 2009-2012, Brian Granger, Min Ragan-Kelley (from pyzmq)
+ 2004 Infrae (from lxml)
+ 2008-2018, AQR Capital Management, LLC, Lambda Foundry, Inc. and PyData Development Team
+License: BSD-3
+
+Files: doc/source/themes/nature_with_gtoc/*
+Copyright: 2007-2011 by the Sphinx team
+License: BSD-2
+
+Files: doc/sphinxext/*
+Copyright: 2008, Stefan van der Walt <stefan@mentat.za.net>, Pauli Virtanen <pav@iki.fi>
+License: BSD-2
+
+Files: debian/*
+Copyright: 2011-2018, Yaroslav Halchenko <debian@onerussian.com>
+License: BSD-3
+
+License: BSD-2
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are
+ met:
+ .
+ 1. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in
+ the documentation and/or other materials provided with the
+ distribution.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+License: BSD-3
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are
+ met:
+ .
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of the copyright holder nor the names of any
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDER AND CONTRIBUTORS
+ "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+License: Expat
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+ .
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
+ BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
+ ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
+ CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
+
+License: Apache
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+ .
+ http://www.apache.org/licenses/LICENSE-2.0
+ .
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ .
+ On Debian systems, the full text of the Apache License 2.0 can be found
+ in /usr/share/common-licenses/Apache-2.0 .
+
+License: Python
+ PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
+ --------------------------------------------
+ .
+ 1. This LICENSE AGREEMENT is between the Python Software Foundation
+ ("PSF"), and the Individual or Organization ("Licensee") accessing and
+ otherwise using this software ("Python") in source or binary form and
+ its associated documentation.
+ .
+ 2. Subject to the terms and conditions of this License Agreement, PSF hereby
+ grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
+ analyze, test, perform and/or display publicly, prepare derivative works,
+ distribute, and otherwise use Python alone or in any derivative version,
+ provided, however, that PSF's License Agreement and PSF's notice of copyright,
+ i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010
+ Python Software Foundation; All Rights Reserved" are retained in Python alone or
+ in any derivative version prepared by Licensee.
+ .
+ 3. In the event Licensee prepares a derivative work that is based on
+ or incorporates Python or any part thereof, and wants to make
+ the derivative work available to others as provided herein, then
+ Licensee hereby agrees to include in any such work a brief summary of
+ the changes made to Python.
+ .
+ 4. PSF is making Python available to Licensee on an "AS IS"
+ basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
+ IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
+ DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
+ FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
+ INFRINGE ANY THIRD PARTY RIGHTS.
+ .
+ 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
+ FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
+ A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
+ OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
+ .
+ 6. This License Agreement will automatically terminate upon a material
+ breach of its terms and conditions.
+ .
+ 7. Nothing in this License Agreement shall be deemed to create any
+ relationship of agency, partnership, or joint venture between PSF and
+ Licensee. This License Agreement does not grant permission to use PSF
+ trademarks or trade name in a trademark sense to endorse or promote
+ products or services of Licensee, or any third party.
+ .
+ 8. By copying, installing or otherwise using Python, Licensee
+ agrees to be bound by the terms and conditions of this License
+ Agreement.
+
+License: public-domain
+ US federal government works
+
+License: CC-BY-SA-3.0
+ Creative Commons Attribution-ShareAlike 3.0 Unported
+ .
+ CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE LEGAL
+ SERVICES. DISTRIBUTION OF THIS LICENSE DOES NOT CREATE AN ATTORNEY-CLIENT
+ RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS INFORMATION ON AN "AS-IS"
+ BASIS. CREATIVE COMMONS MAKES NO WARRANTIES REGARDING THE INFORMATION
+ PROVIDED, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM ITS USE.
+ .
+ License
+ .
+ THE WORK (AS DEFINED BELOW) IS PROVIDED UNDER THE TERMS OF THIS CREATIVE
+ COMMONS PUBLIC LICENSE ("CCPL" OR "LICENSE"). THE WORK IS PROTECTED BY
+ COPYRIGHT AND/OR OTHER APPLICABLE LAW. ANY USE OF THE WORK OTHER THAN AS
+ AUTHORIZED UNDER THIS LICENSE OR COPYRIGHT LAW IS PROHIBITED.
+ .
+ BY EXERCISING ANY RIGHTS TO THE WORK PROVIDED HERE, YOU ACCEPT AND AGREE TO
+ BE BOUND BY THE TERMS OF THIS LICENSE. TO THE EXTENT THIS LICENSE MAY BE
+ CONSIDERED TO BE A CONTRACT, THE LICENSOR GRANTS YOU THE RIGHTS CONTAINED
+ HERE IN CONSIDERATION OF YOUR ACCEPTANCE OF SUCH TERMS AND CONDITIONS.
+ .
+ 1. Definitions
+ .
+ a. "Adaptation" means a work based upon the Work, or upon the Work and
+ other pre-existing works, such as a translation, adaptation, derivative
+ work, arrangement of music or other alterations of a literary or artistic
+ work, or phonogram or performance and includes cinematographic adaptations
+ or any other form in which the Work may be recast, transformed, or adapted
+ including in any form recognizably derived from the original, except that a
+ work that constitutes a Collection will not be considered an Adaptation for
+ the purpose of this License. For the avoidance of doubt, where the Work is
+ a musical work, performance or phonogram, the synchronization of the Work
+ in timed-relation with a moving image ("synching") will be considered an
+ Adaptation for the purpose of this License.
+ .
+ b. "Collection" means a collection of literary or artistic works, such as
+ encyclopedias and anthologies, or performances, phonograms or broadcasts,
+ or other works or subject matter other than works listed in Section 1(f)
+ below, which, by reason of the selection and arrangement of their contents,
+ constitute intellectual creations, in which the Work is included in its
+ entirety in unmodified form along with one or more other contributions,
+ each constituting separate and independent works in themselves, which
+ together are assembled into a collective whole. A work that constitutes a
+ Collection will not be considered an Adaptation (as defined below) for the
+ purposes of this License.
+ .
+ c. "Creative Commons Compatible License" means a license that is listed at
+ http://creativecommons.org/compatiblelicenses that has been approved by
+ Creative Commons as being essentially equivalent to this License,
+ including, at a minimum, because that license: (i) contains terms that have
+ the same purpose, meaning and effect as the License Elements of this
+ License; and, (ii) explicitly permits the relicensing of adaptations of
+ works made available under that license under this License or a Creative
+ Commons jurisdiction license with the same License Elements as this
+ License.
+ .
+ d. "Distribute" means to make available to the public the original and
+ copies of the Work or Adaptation, as appropriate, through sale or other
+ transfer of ownership.
+ .
+ e. "License Elements" means the following high-level license attributes as
+ selected by Licensor and indicated in the title of this License:
+ Attribution, ShareAlike.
+ .
+ f. "Licensor" means the individual, individuals, entity or entities that
+ offer(s) the Work under the terms of this License.
+ .
+ g. "Original Author" means, in the case of a literary or artistic work, the
+ individual, individuals, entity or entities who created the Work or if no
+ individual or entity can be identified, the publisher; and in addition (i)
+ in the case of a performance the actors, singers, musicians, dancers, and
+ other persons who act, sing, deliver, declaim, play in, interpret or
+ otherwise perform literary or artistic works or expressions of folklore;
+ (ii) in the case of a phonogram the producer being the person or legal
+ entity who first fixes the sounds of a performance or other sounds; and,
+ (iii) in the case of broadcasts, the organization that transmits the
+ broadcast.
+ .
+ h. "Work" means the literary and/or artistic work offered under the terms
+ of this License including without limitation any production in the
+ literary, scientific and artistic domain, whatever may be the mode or form
+ of its expression including digital form, such as a book, pamphlet and
+ other writing; a lecture, address, sermon or other work of the same nature;
+ a dramatic or dramatico-musical work; a choreographic work or entertainment
+ in dumb show; a musical composition with or without words; a
+ cinematographic work to which are assimilated works expressed by a process
+ analogous to cinematography; a work of drawing, painting, architecture,
+ sculpture, engraving or lithography; a photographic work to which are
+ assimilated works expressed by a process analogous to photography; a work
+ of applied art; an illustration, map, plan, sketch or three-dimensional
+ work relative to geography, topography, architecture or science; a
+ performance; a broadcast; a phonogram; a compilation of data to the extent
+ it is protected as a copyrightable work; or a work performed by a variety
+ or circus performer to the extent it is not otherwise considered a literary
+ or artistic work.
+ .
+ i. "You" means an individual or entity exercising rights under this License
+ who has not previously violated the terms of this License with respect to
+ the Work, or who has received express permission from the Licensor to
+ exercise rights under this License despite a previous violation.
+ .
+ j. "Publicly Perform" means to perform public recitations of the Work and
+ to communicate to the public those public recitations, by any means or
+ process, including by wire or wireless means or public digital
+ performances; to make available to the public Works in such a way that
+ members of the public may access these Works from a place and at a place
+ individually chosen by them; to perform the Work to the public by any means
+ or process and the communication to the public of the performances of the
+ Work, including by public digital performance; to broadcast and rebroadcast
+ the Work by any means including signs, sounds or images.
+ .
+ k. "Reproduce" means to make copies of the Work by any means including
+ without limitation by sound or visual recordings and the right of fixation
+ and reproducing fixations of the Work, including storage of a protected
+ performance or phonogram in digital form or other electronic medium.
+ .
+ 2. Fair Dealing Rights. Nothing in this License is intended to reduce,
+ limit, or restrict any uses free from copyright or rights arising from
+ limitations or exceptions that are provided for in connection with the
+ copyright protection under copyright law or other applicable laws.
+ .
+ 3. License Grant. Subject to the terms and conditions of this License,
+ Licensor hereby grants You a worldwide, royalty-free, non-exclusive,
+ perpetual (for the duration of the applicable copyright) license to
+ exercise the rights in the Work as stated below:
+ .
+ a. to Reproduce the Work, to incorporate the Work into one or more
+ Collections, and to Reproduce the Work as incorporated in the Collections;
+ .
+ b. to create and Reproduce Adaptations provided that any such Adaptation,
+ including any translation in any medium, takes reasonable steps to clearly
+ label, demarcate or otherwise identify that changes were made to the
+ original Work. For example, a translation could be marked "The original
+ work was translated from English to Spanish," or a modification could
+ indicate "The original work has been modified.";
+ .
+ c. to Distribute and Publicly Perform the Work including as incorporated in
+ Collections; and,
+ .
+ d. to Distribute and Publicly Perform Adaptations.
+ .
+ e. For the avoidance of doubt:
+ .
+ i.
+ Non-waivable Compulsory License Schemes. In those jurisdictions in which the right to
+ collect royalties through any statutory or compulsory licensing scheme cannot be
+ waived, the Licensor reserves the exclusive right to collect such royalties for any
+ exercise by You of the rights granted under this License;
+ .
+ .
+ .
+ ii.
+ Waivable Compulsory License Schemes. In those jurisdictions in which the right to collect
+ royalties through any statutory or compulsory licensing scheme can be waived, the
+ Licensor waives the exclusive right to collect such royalties for any exercise by You
+ of the rights granted under this License; and,
+ .
+ .
+ .
+ iii.
+ Voluntary License Schemes. The Licensor waives the right to collect royalties, whether
+ individually or, in the event that the Licensor is a member of a collecting society
+ that administers voluntary licensing schemes, via that society, from any exercise by
+ You of the rights granted under this License.
+ .
+ .
+ .
+ .
+ The above rights may be exercised in all media and formats whether now known or hereafter
+ devised. The above rights include the right to make such modifications as are
+ technically necessary to exercise the rights in other media and formats. Subject to
+ Section 8(f), all rights not expressly granted by Licensor are hereby reserved.
+ .
+ .
+ .
+ .
+ 4.
+ Restrictions. The license granted in Section 3 above is expressly made subject to and limited by
+ the following restrictions:
+ .
+ .
+ .
+ .
+ a.
+ You may Distribute or Publicly Perform the Work only under the terms of this License. You
+ must include a copy of, or the Uniform Resource Identifier (URI) for, this License with
+ every copy of the Work You Distribute or Publicly Perform. You may not offer or impose any
+ terms on the Work that restrict the terms of this License or the ability of the recipient
+ of the Work to exercise the rights granted to that recipient under the terms of the
+ License. You may not sublicense the Work. You must keep intact all notices that refer to
+ this License and to the disclaimer of warranties with every copy of the Work You
+ Distribute or Publicly Perform. When You Distribute or Publicly Perform the Work, You may
+ not impose any effective technological measures on the Work that restrict the ability of a
+ recipient of the Work from You to exercise the rights granted to that recipient under the
+ terms of the License. This Section 4(a) applies to the Work as incorporated in a
+ Collection, but this does not require the Collection apart from the Work itself to be made
+ subject to the terms of this License. If You create a Collection, upon notice from any
+ Licensor You must, to the extent practicable, remove from the Collection any credit as
+ required by Section 4(c), as requested. If You create an Adaptation, upon notice from any
+ Licensor You must, to the extent practicable, remove from the Adaptation any credit as
+ required by Section 4(c), as requested.
+ .
+ .
+ .
+ b.
+ You may Distribute or Publicly Perform an Adaptation only under the terms of: (i) this
+ License; (ii) a later version of this License with the same License Elements as this
+ License; (iii) a Creative Commons jurisdiction license (either this or a later license
+ version) that contains the same License Elements as this License (e.g.,
+ Attribution-ShareAlike 3.0 US)); (iv) a Creative Commons Compatible License. If you
+ license the Adaptation under one of the licenses mentioned in (iv), you must comply with
+ the terms of that license. If you license the Adaptation under the terms of any of the
+ licenses mentioned in (i), (ii) or (iii) (the "Applicable License"), you must
+ comply with the terms of the Applicable License generally and the following provisions:
+ (I) You must include a copy of, or the URI for, the Applicable License with every copy of
+ each Adaptation You Distribute or Publicly Perform; (II) You may not offer or impose any
+ terms on the Adaptation that restrict the terms of the Applicable License or the ability
+ of the recipient of the Adaptation to exercise the rights granted to that recipient under
+ the terms of the Applicable License; (III) You must keep intact all notices that refer to
+ the Applicable License and to the disclaimer of warranties with every copy of the Work as
+ included in the Adaptation You Distribute or Publicly Perform; (IV) when You Distribute or
+ Publicly Perform the Adaptation, You may not impose any effective technological measures
+ on the Adaptation that restrict the ability of a recipient of the Adaptation from You to
+ exercise the rights granted to that recipient under the terms of the Applicable License.
+ This Section 4(b) applies to the Adaptation as incorporated in a Collection, but this does
+ not require the Collection apart from the Adaptation itself to be made subject to the
+ terms of the Applicable License.
+ .
+ .
+ .
+ c.
+ If You Distribute, or Publicly Perform the Work or any Adaptations or Collections, You must,
+ unless a request has been made pursuant to Section 4(a), keep intact all copyright notices
+ for the Work and provide, reasonable to the medium or means You are utilizing: (i) the
+ name of the Original Author (or pseudonym, if applicable) if supplied, and/or if the
+ Original Author and/or Licensor designate another party or parties (e.g., a sponsor
+ institute, publishing entity, journal) for attribution ("Attribution Parties")
+ in Licensor's copyright notice, terms of service or by other reasonable means, the
+ name of such party or parties; (ii) the title of the Work if supplied; (iii) to the extent
+ reasonably practicable, the URI, if any, that Licensor specifies to be associated with the
+ Work, unless such URI does not refer to the copyright notice or licensing information for
+ the Work; and (iv), consistent with Section 3(b), in the case of an Adaptation, a credit
+ identifying the use of the Work in the Adaptation (e.g., "French translation of the
+ Work by Original Author," or "Screenplay based on original Work by Original
+ Author"). The credit required by this Section 4(c) may be implemented in any
+ reasonable manner; provided, however, that in the case of a Adaptation or Collection, at a
+ minimum such credit will appear, if a credit for all contributing authors of the
+ Adaptation or Collection appears, then as part of these credits and in a manner at least
+ as prominent as the credits for the other contributing authors. For the avoidance of
+ doubt, You may only use the credit required by this Section for the purpose of attribution
+ in the manner set out above and, by exercising Your rights under this License, You may not
+ implicitly or explicitly assert or imply any connection with, sponsorship or endorsement
+ by the Original Author, Licensor and/or Attribution Parties, as appropriate, of You or
+ Your use of the Work, without the separate, express prior written permission of the
+ Original Author, Licensor and/or Attribution Parties.
+ .
+ .
+ .
+ d.
+ Except as otherwise agreed in writing by the Licensor or as may be otherwise permitted by
+ applicable law, if You Reproduce, Distribute or Publicly Perform the Work either by itself
+ or as part of any Adaptations or Collections, You must not distort, mutilate, modify or
+ take other derogatory action in relation to the Work which would be prejudicial to the
+ Original Author's honor or reputation. Licensor agrees that in those jurisdictions
+ (e.g. Japan), in which any exercise of the right granted in Section 3(b) of this License
+ (the right to make Adaptations) would be deemed to be a distortion, mutilation,
+ modification or other derogatory action prejudicial to the Original Author's honor
+ and reputation, the Licensor will waive or not assert, as appropriate, this Section, to
+ the fullest extent permitted by the applicable national law, to enable You to reasonably
+ exercise Your right under Section 3(b) of this License (right to make Adaptations) but not
+ otherwise.
+ .
+ .
+ .
+ .
+ .
+ 5.
+ Representations, Warranties and Disclaimer
+ UNLESS OTHERWISE MUTUALLY AGREED TO BY THE PARTIES IN WRITING, LICENSOR OFFERS THE WORK AS-IS AND
+ MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND CONCERNING THE WORK, EXPRESS, IMPLIED,
+ STATUTORY OR OTHERWISE, INCLUDING, WITHOUT LIMITATION, WARRANTIES OF TITLE, MERCHANTIBILITY,
+ FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT, OR THE ABSENCE OF LATENT OR OTHER DEFECTS,
+ ACCURACY, OR THE PRESENCE OF ABSENCE OF ERRORS, WHETHER OR NOT DISCOVERABLE. SOME
+ JURISDICTIONS DO NOT ALLOW THE EXCLUSION OF IMPLIED WARRANTIES, SO SUCH EXCLUSION MAY NOT
+ APPLY TO YOU.
+ .
+ .
+ .
+ .
+ 6.
+ Limitation on Liability. EXCEPT TO THE EXTENT REQUIRED BY APPLICABLE LAW, IN NO EVENT WILL
+ LICENSOR BE LIABLE TO YOU ON ANY LEGAL THEORY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL,
+ PUNITIVE OR EXEMPLARY DAMAGES ARISING OUT OF THIS LICENSE OR THE USE OF THE WORK, EVEN IF
+ LICENSOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+ .
+ .
+ .
+ 7.
+ Termination
+ .
+ .
+ .
+ .
+ a.
+ This License and the rights granted hereunder will terminate automatically upon any breach by
+ You of the terms of this License. Individuals or entities who have received Adaptations or
+ Collections from You under this License, however, will not have their licenses terminated
+ provided such individuals or entities remain in full compliance with those licenses.
+ Sections 1, 2, 5, 6, 7, and 8 will survive any termination of this License.
+ .
+ .
+ .
+ b.
+ Subject to the above terms and conditions, the license granted here is perpetual (for the
+ duration of the applicable copyright in the Work). Notwithstanding the above, Licensor
+ reserves the right to release the Work under different license terms or to stop
+ distributing the Work at any time; provided, however that any such election will not serve
+ to withdraw this License (or any other license that has been, or is required to be,
+ granted under the terms of this License), and this License will continue in full force and
+ effect unless terminated as stated above.
+ .
+ .
+ .
+ .
+ .
+ 8.
+ Miscellaneous
+ .
+ .
+ .
+ .
+ a.
+ Each time You Distribute or Publicly Perform the Work or a Collection, the Licensor offers to
+ the recipient a license to the Work on the same terms and conditions as the license
+ granted to You under this License.
+ .
+ .
+ .
+ b.
+ Each time You Distribute or Publicly Perform an Adaptation, Licensor offers to the recipient
+ a license to the original Work on the same terms and conditions as the license granted to
+ You under this License.
+ .
+ .
+ .
+ c.
+ If any provision of this License is invalid or unenforceable under applicable law, it shall
+ not affect the validity or enforceability of the remainder of the terms of this License,
+ and without further action by the parties to this agreement, such provision shall be
+ reformed to the minimum extent necessary to make such provision valid and enforceable.
+ .
+ .
+ .
+ d.
+ No term or provision of this License shall be deemed waived and no breach consented to unless
+ such waiver or consent shall be in writing and signed by the party to be charged with such
+ waiver or consent.
+ .
+ .
+ .
+ e.
+ This License constitutes the entire agreement between the parties with respect to the Work
+ licensed here. There are no understandings, agreements or representations with respect to
+ the Work not specified here. Licensor shall not be bound by any additional provisions that
+ may appear in any communication from You. This License may not be modified without the
+ mutual written agreement of the Licensor and You.
+ .
+ .
+ .
+ f.
+ The rights granted under, and the subject matter referenced, in this License were drafted
+ utilizing the terminology of the Berne Convention for the Protection of Literary and
+ Artistic Works (as amended on September 28, 1979), the Rome Convention of 1961, the WIPO
+ Copyright Treaty of 1996, the WIPO Performances and Phonograms Treaty of 1996 and the
+ Universal Copyright Convention (as revised on July 24, 1971). These rights and subject
+ matter take effect in the relevant jurisdiction in which the License terms are sought to
+ be enforced according to the corresponding provisions of the implementation of those
+ treaty provisions in the applicable national law. If the standard suite of rights granted
+ under applicable copyright law includes additional rights not granted under this License,
+ such additional rights are deemed to be included in the License; this License is not
+ intended to restrict the license of any rights under applicable law.
+ .
+ .
+ .
+ .
+ Creative Commons Notice
+ .
+ Creative Commons is not a party to this License, and makes no warranty whatsoever in connection with the
+ Work. Creative Commons will not be liable to You or any party on any legal theory for any damages
+ whatsoever, including without limitation any general, special, incidental or consequential damages
+ arising in connection to this license. Notwithstanding the foregoing two (2) sentences, if Creative
+ Commons has expressly identified itself as the Licensor hereunder, it shall have all rights and
+ obligations of Licensor.
+ .
+ Except for the limited purpose of indicating to the public that the Work is licensed under the CCPL,
+ Creative Commons does not authorize the use by either party of the trademark "Creative
+ Commons" or any related trademark or logo of Creative Commons without the prior written consent
+ of Creative Commons. Any permitted use will be in compliance with Creative Commons' then-current
+ trademark usage guidelines, as may be published on its website or otherwise made available upon
+ request from time to time. For the avoidance of doubt, this trademark restriction does not form part
+ of the License.
+ .
+ Creative Commons may be contacted at http://creativecommons.org/.
--- /dev/null
+[DEFAULT]
+# the default branch for upstream sources:
+upstream-branch = master
+# the default branch for the debian patch:
+debian-branch = debian
+# the default tag formats used:
+upstream-tag = v%(version)s
+debian-tag = debian/%(version)s
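+# Note (added for illustration; the version numbers below are hypothetical):
+# with these formats, gbp expects upstream release 1.2.3 at git tag "v1.2.3"
+# and the Debian release 1.2.3-1 at git tag "debian/1.2.3-1".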
+
+
--- /dev/null
+reverted:
+--- a/pandas/tests/computation/test_eval.py
++++ b/pandas/tests/computation/test_eval.py
+@@ -38,14 +38,13 @@ _scalar_skip = 'in', 'not in'
+
+
+ @pytest.fixture(params=(
+- pytest.param(engine,
+- marks=pytest.mark.skipif(
+- engine == 'numexpr' and not _USE_NUMEXPR,
+- reason='numexpr enabled->{enabled}, '
+- 'installed->{installed}'.format(
+- enabled=_USE_NUMEXPR,
+- installed=_NUMEXPR_INSTALLED)))
+- for engine in _engines)) # noqa
++ pytest.mark.skipif(engine == 'numexpr' and not _USE_NUMEXPR,
++ reason='numexpr enabled->{enabled}, '
++ 'installed->{installed}'.format(
++ enabled=_USE_NUMEXPR,
++ installed=_NUMEXPR_INSTALLED))(engine)
++ for engine in _engines # noqa
++))
+ def engine(request):
+ return request.param
+
+--- a/pandas/tests/io/parser/test_network.py
++++ b/pandas/tests/io/parser/test_network.py
+@@ -16,10 +16,8 @@ from pandas.compat import BytesIO
+ @pytest.mark.parametrize(
+ "compression,extension",
+ [('gzip', '.gz'), ('bz2', '.bz2'), ('zip', '.zip'),
+- pytest.param('xz', '.xz',
+- marks=pytest.mark.skipif(not tm._check_if_lzma(),
+- reason='need backports.lzma '
+- 'to run'))])
++ pytest.mark.skipif(not tm._check_if_lzma(),
++ reason='need backports.lzma to run')(('xz', '.xz'))])
+ @pytest.mark.parametrize('mode', ['explicit', 'infer'])
+ @pytest.mark.parametrize('engine', ['python', 'c'])
+ def test_compressed_urls(salaries_table, compression, extension, mode, engine):
+--- a/pandas/tests/io/test_excel.py
++++ b/pandas/tests/io/test_excel.py
+@@ -2426,10 +2426,8 @@ class TestExcelWriterEngineTests(object)
+
+
+ @pytest.mark.parametrize('engine', [
+- pytest.param('xlwt',
+- marks=pytest.mark.xfail(reason='xlwt does not support '
+- 'openpyxl-compatible '
+- 'style dicts')),
++ pytest.mark.xfail('xlwt', reason='xlwt does not support '
++ 'openpyxl-compatible style dicts'),
+ 'xlsxwriter',
+ 'openpyxl',
+ ])
+--- a/pandas/tests/io/test_parquet.py
++++ b/pandas/tests/io/test_parquet.py
+@@ -27,14 +27,10 @@ except ImportError:
+
+ # setup engines & skips
+ @pytest.fixture(params=[
+- pytest.param('fastparquet',
+- marks=pytest.mark.skipif(not _HAVE_FASTPARQUET,
+- reason='fastparquet is '
+- 'not installed')),
+- pytest.param('pyarrow',
+- marks=pytest.mark.skipif(not _HAVE_PYARROW,
+- reason='pyarrow is '
+- 'not installed'))])
++ pytest.mark.skipif(not _HAVE_FASTPARQUET,
++ reason='fastparquet is not installed')('fastparquet'),
++ pytest.mark.skipif(not _HAVE_PYARROW,
++ reason='pyarrow is not installed')('pyarrow')])
+ def engine(request):
+ return request.param
+
+--- a/pandas/tests/test_window.py
++++ b/pandas/tests/test_window.py
+@@ -552,9 +552,8 @@ class TestExpanding(Base):
+
+ @pytest.mark.parametrize(
+ 'expander',
+- [1, pytest.param('ls', marks=pytest.mark.xfail(
+- reason='GH 16425 expanding with '
+- 'offset not supported'))])
++ [1, pytest.mark.xfail(
++ reason='GH 16425 expanding with offset not supported')('1s')])
+ def test_empty_df_expanding(self, expander):
+ # GH 15819 Verifies that datetime and integer expanding windows can be
+ # applied to empty DataFrames
--- /dev/null
+Author: Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Bug-Debian: https://bugs.debian.org/858260
+Last-Update: Sat, 1 Apr 2017 23:21:31 +0100
+Description: Use tzinfo correctly
+ The underlying issue (not strictly a bug, as the documentation
+ specifically says not to do that -
+ http://sources.debian.net/src/python-tz/2016.7-0.2/pytz/tzinfo.py/#L247
+ ) is that passing a pytz tzinfo to the datetime constructor uses its
+ first listed offset, not its correct offset for that date:
+ .
+ >>> datetime.datetime(2017,4,1,tzinfo=pytz.timezone('Europe/London'))
+ datetime.datetime(2017, 4, 1, 0, 0, tzinfo=<DstTzInfo 'Europe/London'
+ GMT0:00:00 STD>)
+ >>> pytz.timezone('Europe/London').localize(datetime.datetime(2017,4,1))
+ datetime.datetime(2017, 4, 1, 0, 0, tzinfo=<DstTzInfo 'Europe/London'
+ BST+1:00:00 DST>)
+
+--- a/pandas/tests/test_multilevel.py
++++ b/pandas/tests/test_multilevel.py
+@@ -84,9 +84,9 @@ class TestMultiLevel(tm.TestCase):
+ # GH 7112
+ import pytz
+ tz = pytz.timezone('Asia/Tokyo')
+- expected_tuples = [(1.1, datetime.datetime(2011, 1, 1, tzinfo=tz)),
+- (1.2, datetime.datetime(2011, 1, 2, tzinfo=tz)),
+- (1.3, datetime.datetime(2011, 1, 3, tzinfo=tz))]
++ expected_tuples = [(1.1, tz.localize(datetime.datetime(2011, 1, 1))),
++ (1.2, tz.localize(datetime.datetime(2011, 1, 2))),
++ (1.3, tz.localize(datetime.datetime(2011, 1, 3)))]
+ expected = Index([1.1, 1.2, 1.3] + expected_tuples)
+ self.assert_index_equal(result, expected)
+
+@@ -104,9 +104,9 @@ class TestMultiLevel(tm.TestCase):
+
+ result = midx_lv3.append(midx_lv2)
+ expected = Index._simple_new(
+- np.array([(1.1, datetime.datetime(2011, 1, 1, tzinfo=tz), 'A'),
+- (1.2, datetime.datetime(2011, 1, 2, tzinfo=tz), 'B'),
+- (1.3, datetime.datetime(2011, 1, 3, tzinfo=tz), 'C')] +
++ np.array([(1.1, tz.localize(datetime.datetime(2011, 1, 1)), 'A'),
++ (1.2, tz.localize(datetime.datetime(2011, 1, 2)), 'B'),
++ (1.3, tz.localize(datetime.datetime(2011, 1, 3)), 'C')] +
+ expected_tuples), None)
+ self.assert_index_equal(result, expected)
+
--- /dev/null
+Description: Fix np.array @ DataFrame matrix multiplication
+
+We use this instead of upstream's __array_priority__ fix
+https://github.com/pandas-dev/pandas/commit/ad2a14f4bec8a004b2972c12f12ed3e4ce37ff52
+so that np.array += DataFrame remains in-place (same object ID; other
+views of the array are also affected) and returns an array (not a DataFrame).
+
+Author: jbrockmendel, Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Origin: upstream
+Bug-Debian: https://bugs.debian.org/918206 https://bugs.debian.org/923707
+Forwarded: not-needed
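+
+Rough sketch of the behaviour this aims for (illustrative session, not part
+of the original patch; the values are arbitrary):
+
+ >>> import numpy as np, pandas as pd
+ >>> df = pd.DataFrame([[1., 2.], [3., 4.]], columns=['a', 'b'])
+ >>> arr = np.ones((2, 2))
+ >>> arr @ df      # a DataFrame with df's columns and a fresh default index
+ >>> arr += df     # stays the same ndarray object, modified in place
+ >>> type(arr)     # still numpy.ndarray, not DataFrame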
+
+--- a/pandas/core/generic.py
++++ b/pandas/core/generic.py
+@@ -1607,6 +1607,8 @@ class NDFrame(PandasObject, SelectionMixin):
+
+ def __array_wrap__(self, result, context=None):
+ d = self._construct_axes_dict(self._AXIS_ORDERS, copy=False)
++ if context is not None and context[0]==np.matmul and not hasattr(context[1][0],'index'):
++ d.pop('index',None)
+ return self._constructor(result, **d).__finalize__(self)
+
+ # ideally we would define this to avoid the getattr checks, but
+--- a/pandas/tests/frame/test_analytics.py
++++ b/pandas/tests/frame/test_analytics.py
+@@ -2283,8 +2283,11 @@ class TestDataFrameAnalytics(TestData):
+
+ # np.array @ DataFrame
+ result = operator.matmul(a.values, b)
++ assert isinstance(result, DataFrame)
++ assert result.columns.equals(b.columns)
++ assert result.index.equals(pd.Index(range(3)))
+ expected = np.dot(a.values, b.values)
+- tm.assert_almost_equal(result, expected)
++ tm.assert_almost_equal(result.values, expected)
+
+ # nested list @ DataFrame (__rmatmul__)
+ result = operator.matmul(a.values.tolist(), b)
--- /dev/null
+Description: Fix ordering of np.array @ Series
+
+Author: Ming Li
+Origin: upstream
+Bug-Debian: https://bugs.debian.org/923708
+Forwarded: not-needed
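+
+Rough sketch of the intended behaviour (illustrative, not part of the
+original patch):
+
+ >>> import numpy as np, pandas as pd
+ >>> m = np.array([[1., 2.], [3., 4.]])
+ >>> s = pd.Series([1., 2.])
+ >>> m @ s         # dispatches to Series.__rmatmul__ and should now match
+ >>> np.dot(m, s.values)   # ... this, i.e. array([ 5., 11.]), not the reversed product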
+
+--- pandas-0.23.3+dfsg.orig/pandas/core/series.py
++++ pandas-0.23.3+dfsg/pandas/core/series.py
+@@ -2058,7 +2058,7 @@ class Series(base.IndexOpsMixin, generic
+
+ def __rmatmul__(self, other):
+ """ Matrix multiplication using binary `@` operator in Python>=3.5 """
+- return self.dot(other)
++ return self.dot(np.transpose(other)).T
+
+ @Substitution(klass='Series')
+ @Appender(base._shared_docs['searchsorted'])
+--- pandas-0.23.3+dfsg.orig/pandas/tests/series/test_analytics.py
++++ pandas-0.23.3+dfsg/pandas/tests/series/test_analytics.py
+@@ -950,6 +950,11 @@ class TestSeriesAnalytics(TestData):
+ expected = np.dot(a.values, a.values)
+ assert_almost_equal(result, expected)
+
++ # np.array (matrix) @ Series (__rmatmul__)
++ result = operator.matmul(b.T.values, a)
++ expected = np.dot(b.T.values, a.values)
++ assert_almost_equal(result, expected)
++
+ # mixed dtype DataFrame @ Series
+ a['p'] = int(a.p)
+ result = operator.matmul(b.T, a)
--- /dev/null
+Author: Yaroslav Halchenko <debian@onerussian.com>
+ Andreas Tille <tille@debian.org>
+Last-Update: Mon, 23 Oct 2017 08:55:28 +0200
+Description: Avoid privacy breach by Google Analytics
+
+--- a/pandas/tests/io/data/spam.html
++++ b/pandas/tests/io/data/spam.html
+@@ -27,45 +27,9 @@
+
+ <link rel="stylesheet" href="/ndb/static/css/main.css" />
+
+- <script type="text/JavaScript">
+- var _gaq = _gaq || [];
+- // NAL
+- _gaq.push(['_setAccount', 'UA-28627214-1']);
+- _gaq.push(['_setDomainName', 'nal.usda.gov']);
+- _gaq.push(['_setAllowLinker', true]);
+- _gaq.push(['_trackPageview']);
+- //
+- // _gaq.push(['_setAccount', 'UA-3876418-1']);
+- // _gaq.push(['_trackPageview']);
+- // for NDB
+- _gaq.push(['_setAccount', 'UA-36442725-1']);
+- _gaq.push(['_trackPageview']);
+- // USDA servers
+- _gaq.push(['_setAccount', 'UA-466807-3']);
+- _gaq.push(['_setDomainName', 'usda.gov']);
+- _gaq.push(['_setAllowLinker', true]);
+- _gaq.push(['_trackPageview']);
+- //
+- _gaq.push(['a._setAccount', 'UA-27627304-18']);
+- _gaq.push(['a._setDomainName', 'usda.gov']);
+- _gaq.push(['a._setAllowLinker', true]);
+- _gaq.push(['a._trackPageview']);
+- //
+- _gaq.push(['b._setAccount', 'UA-27627304-1']);
+- _gaq.push(['b._setDomainName', 'usda.gov']);
+- _gaq.push(['b._setAllowLinker', true]);
+- _gaq.push(['b._trackPageview']);
+-
+- (function() {
+- var ga = document.createElement('script'); ga.type =
+- 'text/javascript'; ga.async = true;
+- ga.src = ('https:' == document.location.protocol ? 'https://ssl' :
+- 'http://www') + '.google-analytics.com/ga.js';
+- var s = document.getElementsByTagName('script')[0];
+- s.parentNode.insertBefore(ga, s);
+- })();
+- </script>
+-
++<!-- google analytics snippet was completely removed by Debian maintainers.
++ See http://lintian.debian.org/tags/privacy-breach-google-adsense.html
++ for more information -->
+
+
+ <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
+@@ -794,4 +758,4 @@ handler: function() {this.cancel();},
+ Software v.1.2.2
+ </div>
+ </body>
+-</html>
+\ No newline at end of file
++</html>
+--- a/doc/source/themes/nature_with_gtoc/layout.html
++++ b/doc/source/themes/nature_with_gtoc/layout.html
+@@ -94,15 +94,4 @@ $(document).ready(function() {
+ });
+ });
+ </script>
+-<script type="text/javascript">
+- var _gaq = _gaq || [];
+- _gaq.push(['_setAccount', 'UA-27880019-2']);
+- _gaq.push(['_trackPageview']);
+-
+- (function() {
+- var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
+- ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
+- var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
+- })();
+-</script>
+-{% endblock %}
+\ No newline at end of file
++{% endblock %}
--- /dev/null
+--- a/doc/make.py
++++ b/doc/make.py
+@@ -346,8 +346,9 @@ def main():
+ # external libraries (namely Sphinx) to compile this module and resolve
+ # the import of `python_path` correctly. The latter is used to resolve
+ # the import within the module, injecting it into the global namespace
+- os.environ['PYTHONPATH'] = args.python_path
+- sys.path.append(args.python_path)
++ # Debian: we set it outside
++ #os.environ['PYTHONPATH'] = args.python_path
++ #sys.path.append(args.python_path)
+ globals()['pandas'] = importlib.import_module('pandas')
+
+ builder = DocBuilder(args.num_jobs, not args.no_api, args.single,
--- /dev/null
+--- a/doc/source/conf.py
++++ b/doc/source/conf.py
+@@ -72,10 +72,14 @@ extensions = ['sphinx.ext.autodoc',
+ 'sphinx.ext.mathjax',
+ 'sphinx.ext.ifconfig',
+ 'sphinx.ext.linkcode',
+- 'nbsphinx',
+ ]
+
+ mathjax_path='MathJax.js'
++try:
++ import nbsphinx
++ extensions += ['nbsphinx']
++except:
++ pass # survive without
+
+ exclude_patterns = ['**.ipynb_checkpoints']
+
--- /dev/null
+--- a/setup.cfg
++++ b/setup.cfg
+@@ -32,5 +32,6 @@ markers =
+ slow: mark a test as slow
+ network: mark a test as network
+ high_memory: mark a test as a high-memory only
+-addopts = --strict-data-files
+-doctest_optionflags= NORMALIZE_WHITESPACE IGNORE_EXCEPTION_DETAIL
+\ No newline at end of file
++# Disabled for Debian build
++# addopts = --strict-data-files
++doctest_optionflags= NORMALIZE_WHITESPACE IGNORE_EXCEPTION_DETAIL
--- /dev/null
+--- a/pandas/__init__.py
++++ b/pandas/__init__.py
+@@ -80,11 +80,7 @@ tslib = _DeprecatedModule(deprmod='panda
+ 'NaTType': 'type(pandas.NaT)',
+ 'OutOfBoundsDatetime': 'pandas.errors.OutOfBoundsDatetime'})
+
+-# use the closest tagged version if possible
+-from ._version import get_versions
+-v = get_versions()
+-__version__ = v.get('closest-tag', v['version'])
+-del get_versions, v
++from .__version import version as __version__
+
+ # module level doc-string
+ __doc__ = """
--- /dev/null
+From: Yaroslav Halchenko <debian@onerussian.com>
+Subject: Skip two tests which fail when run as part of the full test battery during package build
+
+Origin: (Neuro)Debian
+Bug: https://github.com/pandas-dev/pandas/issues/19774
+Last-Update: 2018-02-20
+
+--- a/pandas/tests/io/formats/test_to_csv.py
++++ b/pandas/tests/io/formats/test_to_csv.py
+@@ -41,6 +41,7 @@ class TestToCSV(object):
+ with open(path, 'r') as f:
+ assert f.read() == expected2
+
++ @pytest.mark.skipif(True, reason="see https://github.com/pandas-dev/pandas/issues/19774")
+ def test_to_csv_defualt_encoding(self):
+ # GH17097
+ df = DataFrame({'col': [u"AAAAA", u"ÄÄÄÄÄ", u"ßßßßß", u"聞聞聞聞聞"]})
+--- a/pandas/tests/io/test_pytables.py
++++ b/pandas/tests/io/test_pytables.py
+@@ -4885,6 +4885,7 @@ class TestHDFStore(Base):
+ df_loaded = read_hdf(path, 'df', columns=cols2load) # noqa
+ assert cols2load_original == cols2load
+
++ @pytest.mark.skipif(True, reason="see https://github.com/pandas-dev/pandas/issues/19774")
+ def test_to_hdf_with_object_column_names(self):
+ # GH9057
+ # Writing HDF5 table format should only work for string-like
--- /dev/null
+--- a/pandas/tests/frame/test_constructors.py
++++ b/pandas/tests/frame/test_constructors.py
+@@ -1988,6 +1988,10 @@ class TestDataFrameConstructors(TestData
+ tm.assert_frame_equal(result, expected)
+
+ def test_from_records_sequencelike(self):
++ import platform
++ if platform.uname()[4].startswith('armv'):
++ import nose
++ raise nose.SkipTest("Fails on Debian arm boxes due to locales or whatelse")
+ df = DataFrame({'A': np.array(np.random.randn(6), dtype=np.float64),
+ 'A1': np.array(np.random.randn(6), dtype=np.float64),
+ 'B': np.array(np.arange(6), dtype=np.int64),
--- /dev/null
+--- a/pandas/tests/io/test_stata.py
++++ b/pandas/tests/io/test_stata.py
+@@ -23,6 +23,11 @@ from pandas.io.parsers import read_csv
+ from pandas.io.stata import (InvalidColumnName, PossiblePrecisionLoss,
+ StataMissingValue, StataReader, read_stata)
+
++from pandas.compat import is_platform_little_endian
++if not is_platform_little_endian():
++ import nose
++ raise nose.SkipTest("known failure of test_stata on non-little endian")
++
+
+ @pytest.fixture
+ def dirpath(datapath):
--- /dev/null
+From: Yaroslav Halchenko <debian@onerussian.com>
+Subject: Swallow the error from pytables
+
+This happens on wheezy and ubuntu 12.04, only on amd64, and only if the entire
+test battery is run -- difficult to troubleshoot, and definitely resolved on
+later releases of Debian/Ubuntu. Thus skipping for now -- must be some glitch
+in pytables.
+
+Origin: NeuroDebian
+Last-Update: 2014-02-04
+
+--- a/pandas/tests/io/test_pytables.py
++++ b/pandas/tests/io/test_pytables.py
+@@ -3318,9 +3318,13 @@ class TestHDFStore(Base, tm.TestCase):
+
+ # big selector along the columns
+ selector = ['a', 'b', 'c'] + ['a%03d' % i for i in range(60)]
+- result = store.select(
+- 'df', [Term("ts>=Timestamp('2012-02-01')"),
+- Term('users=selector')])
++ try:
++ result = store.select(
++ 'df', [Term("ts>=Timestamp('2012-02-01')"),
++ Term('users=selector')])
++ except KeyError as e:
++ if "No object named df in" in str(e):
++ raise nose.SkipTest("Skipping the test due to catching known %s" % e)
+ expected = df[(df.ts >= Timestamp('2012-02-01')) &
+ df.users.isin(selector)]
+ tm.assert_frame_equal(expected, result)
--- /dev/null
+Description: Use Python 3 shebangs, and fix shebang bug
+
+The bug was in pandas/tests/io/generate_legacy_storage_files.py
+
+Author: Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Forwarded: no
+
+--- pandas-0.23.3+dfsg.orig/ci/print_skipped.py
++++ pandas-0.23.3+dfsg/ci/print_skipped.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+
+ import sys
+ import math
+--- pandas-0.23.3+dfsg.orig/ci/print_versions.py
++++ pandas-0.23.3+dfsg/ci/print_versions.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+
+
+ def show_versions(as_json=False):
+--- pandas-0.23.3+dfsg.orig/doc/make.py
++++ pandas-0.23.3+dfsg/doc/make.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+ """
+ Python script for building documentation.
+
+--- pandas-0.23.3+dfsg.orig/pandas/core/computation/eval.py
++++ pandas-0.23.3+dfsg/pandas/core/computation/eval.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+
+ """Top level ``eval`` module.
+ """
+--- pandas-0.23.3+dfsg.orig/pandas/tests/io/generate_legacy_storage_files.py
++++ pandas-0.23.3+dfsg/pandas/tests/io/generate_legacy_storage_files.py
+@@ -1,4 +1,4 @@
+-#!/usr/env/bin python
++#!/usr/bin/env python3
+
+ """
+ self-contained to write legacy storage (pickle/msgpack) files
+--- pandas-0.23.3+dfsg.orig/pandas/tests/plotting/common.py
++++ pandas-0.23.3+dfsg/pandas/tests/plotting/common.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+ # coding: utf-8
+
+ import pytest
+--- pandas-0.23.3+dfsg.orig/scripts/announce.py
++++ pandas-0.23.3+dfsg/scripts/announce.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+ # -*- encoding:utf-8 -*-
+ """
+ Script to generate contributor and pull request lists
+--- pandas-0.23.3+dfsg.orig/scripts/download_wheels.py
++++ pandas-0.23.3+dfsg/scripts/download_wheels.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+ """Fetch wheels from wheels.scipy.org for a pandas version."""
+ import argparse
+ import pathlib
+--- pandas-0.23.3+dfsg.orig/scripts/find_undoc_args.py
++++ pandas-0.23.3+dfsg/scripts/find_undoc_args.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+ # -*- coding: utf-8 -*-
+ """
+ Script that compares the signature arguments with the ones in the docsting
+--- pandas-0.23.3+dfsg.orig/scripts/validate_docstrings.py
++++ pandas-0.23.3+dfsg/scripts/validate_docstrings.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+ """
+ Analyze docstrings to detect errors.
+
+--- pandas-0.23.3+dfsg.orig/setup.py
++++ pandas-0.23.3+dfsg/setup.py
+@@ -1,4 +1,4 @@
+-#!/usr/bin/env python
++#!/usr/bin/env python3
+
+ """
+ Parts of this file were taken from the pyzmq project
--- /dev/null
+--- a/pandas/tests/plotting/test_series.py
++++ b/pandas/tests/plotting/test_series.py
+@@ -10,7 +10,7 @@ from datetime import datetime
+
+ import pandas as pd
+ from pandas import Series, DataFrame, date_range
+-from pandas.compat import range, lrange
++from pandas.compat import range, lrange, is_platform_32bit
+ import pandas.util.testing as tm
+ import pandas.util._test_decorators as td
+
+@@ -718,6 +718,8 @@ class TestSeriesPlots(TestPlotBase):
+ with pytest.raises(TypeError):
+ s.plot(kind=kind, ax=ax)
+
++ @pytest.mark.skipif(is_platform_32bit,
++ reason="https://github.com/pandas-dev/pandas/issues/19814")
+ @pytest.mark.slow
+ def test_valid_object_plot(self):
+ s = Series(lrange(10), dtype=object)
--- /dev/null
+Description: Mark tests that pass only on Intel architectures with @pytest.mark.intel
+Author: Andreas Tille <tille@debian.org>
+Last-Update: Sat, 14 Oct 2017 19:42:59 +0200
+Bug-Debian: https://bugs.debian.org/877419
+Author: Graham Inggs <ginggs@debian.org>
+Last-Update: 2018-04-11
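+
+Tests carrying this marker can then be deselected with pytest's standard
+marker expressions on the affected architectures, e.g. (illustrative; the
+exact invocation used in debian/rules may differ):
+
+ pytest -m "not intel" pandas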
+
+--- a/pandas/tests/test_algos.py
++++ b/pandas/tests/test_algos.py
+@@ -759,6 +759,7 @@ class TestValueCounts(object):
+ expected = Series([2, 1, 1], index=[5., 10.3, np.nan])
+ tm.assert_series_equal(result, expected)
+
++ @pytest.mark.intel
+ def test_value_counts_normalized(self):
+ # GH12558
+ s = Series([1, 2, np.nan, np.nan, np.nan])
+--- a/pandas/tests/test_resample.py
++++ b/pandas/tests/test_resample.py
+@@ -2110,6 +2110,7 @@ class TestDatetimeIndex(Base):
+
+ assert_frame_equal(frame.resample('60s').mean(), frame_3s)
+
++ @pytest.mark.intel
+ def test_resample_timedelta_values(self):
+ # GH 13119
+ # check that timedelta dtype is preserved when NaT values are
+@@ -2127,6 +2128,7 @@ class TestDatetimeIndex(Base):
+ res = df['time'].resample('2D').first()
+ tm.assert_series_equal(res, exp)
+
++ @pytest.mark.intel
+ def test_resample_datetime_values(self):
+ # GH 13119
+ # check that datetime dtype is preserved when NaT values are
+--- a/pandas/tests/dtypes/test_cast.py
++++ b/pandas/tests/dtypes/test_cast.py
+@@ -81,6 +81,7 @@ class TestMaybeDowncast(object):
+ tm.assert_almost_equal(result, np.array([], dtype=np.int64))
+ assert result.dtype == np.int64
+
++ @pytest.mark.intel
+ def test_datetimelikes_nan(self):
+ arr = np.array([1, 2, np.nan])
+ exp = np.array([1, 2, np.datetime64('NaT')], dtype='datetime64[ns]')
+--- a/pandas/tests/frame/test_indexing.py
++++ b/pandas/tests/frame/test_indexing.py
+@@ -2709,6 +2709,7 @@ class TestDataFrameIndexing(TestData):
+ result = a.where(do_not_replace, b)
+ assert_frame_equal(result, expected)
+
++ @pytest.mark.intel
+ def test_where_datetime(self):
+
+ # GH 3311
+--- a/pandas/tests/frame/test_operators.py
++++ b/pandas/tests/frame/test_operators.py
+@@ -178,6 +178,7 @@ class TestDataFrameOperators(TestData):
+ df)), 'b': date_range('20100101', periods=len(df))})
+ check(df, df2)
+
++ @pytest.mark.intel
+ def test_timestamp_compare(self):
+ # make sure we can compare Timestamps on the right AND left hand side
+ # GH4982
+--- a/pandas/tests/series/test_constructors.py
++++ b/pandas/tests/series/test_constructors.py
+@@ -1050,6 +1050,7 @@ class TestSeriesConstructors(TestData):
+ series[2] = val
+ assert isna(series[2])
+
++ @pytest.mark.intel
+ def test_NaT_cast(self):
+ # GH10747
+ result = Series([np.nan]).astype('M8[ns]')
+--- a/pandas/tests/series/test_period.py
++++ b/pandas/tests/series/test_period.py
+@@ -83,6 +83,7 @@ class TestSeriesPeriod(object):
+ series[2] = val
+ assert isna(series[2])
+
++ @pytest.mark.intel
+ def test_NaT_cast(self):
+ result = Series([np.nan]).astype('period[D]')
+ expected = Series([NaT])
+--- a/pandas/tests/frame/test_analytics.py
++++ b/pandas/tests/frame/test_analytics.py
+@@ -1013,6 +1013,7 @@ class TestDataFrameAnalytics(TestData):
+ expected = pd.Series(result, index=['A', 'B'])
+ tm.assert_series_equal(result, expected)
+
++ @pytest.mark.intel
+ def test_sum_nanops_timedelta(self):
+ # prod isn't defined on timedeltas
+ idx = ['a', 'b', 'c']
+--- a/pandas/tests/indexes/timedeltas/test_arithmetic.py
++++ b/pandas/tests/indexes/timedeltas/test_arithmetic.py
+@@ -908,6 +908,7 @@ class TestTimedeltaIndexArithmetic(objec
+ tm.assert_series_equal(s + pd.Timedelta('00:30:00'), exp)
+ tm.assert_series_equal(pd.Timedelta('00:30:00') + s, exp)
+
++ @pytest.mark.intel
+ def test_timedelta_ops_with_missing_values(self):
+ # setup
+ s1 = pd.to_timedelta(Series(['00:00:01']))
+--- a/pandas/tests/groupby/aggregate/test_other.py
++++ b/pandas/tests/groupby/aggregate/test_other.py
+@@ -87,6 +87,7 @@ def test_agg_period_index():
+ list(grouped)
+
+
++@pytest.mark.intel
+ def test_agg_dict_parameter_cast_result_dtypes():
+ # GH 12821
+
--- /dev/null
+Description: Mark tests that pass only on Intel architectures with @pytest.mark.intel
+ There was another test failing on armhf.
+Author: Andreas Tille <tille@debian.org>
+Last-Update: Tue, 24 Oct 2017 21:19:06 +0200
+Bug-Debian: https://bugs.debian.org/877419
+Author: Graham Inggs <ginggs@debian.org>
+Last-Update: 2018-05-29
+
+--- a/pandas/tests/io/test_pytables.py
++++ b/pandas/tests/io/test_pytables.py
+@@ -1026,6 +1026,7 @@ class TestHDFStore(Base):
+ with catch_warnings(record=True):
+ check('fixed', index)
+
++ @pytest.mark.intel
+ @pytest.mark.skipif(not is_platform_little_endian(),
+ reason="reason platform is not little endian")
+ def test_encoding(self):
+@@ -1042,6 +1043,7 @@ class TestHDFStore(Base):
+ result = store.select('df', Term('columns=A', encoding='ascii'))
+ tm.assert_frame_equal(result, expected)
+
++ @pytest.mark.intel
+ def test_latin_encoding(self):
+
+ if compat.PY2:
+@@ -1232,6 +1234,7 @@ class TestHDFStore(Base):
+ reloaded_panel = read_hdf(path, 'panel_with_missing')
+ tm.assert_panel_equal(panel_with_missing, reloaded_panel)
+
++ @pytest.mark.intel
+ def test_append_frame_column_oriented(self):
+
+ with ensure_clean_store(self.path) as store:
+@@ -4245,6 +4248,7 @@ class TestHDFStore(Base):
+ with pytest.raises(NotImplementedError):
+ store.select('dfs', start=0, stop=5)
+
++ @pytest.mark.intel
+ def test_select_filter_corner(self):
+
+ df = DataFrame(np.random.randn(50, 100))
+--- a/pandas/tests/io/test_stata.py
++++ b/pandas/tests/io/test_stata.py
+@@ -474,6 +474,7 @@ class TestStata(object):
+ tm.assert_frame_equal(
+ written_and_read_again.set_index('index'), parsed_114)
+
++ @pytest.mark.intel
+ @pytest.mark.parametrize(
+ 'file', ['dta15_113', 'dta15_114', 'dta15_115', 'dta15_117'])
+ def test_read_write_reread_dta15(self, file):
+@@ -1160,6 +1161,7 @@ class TestStata(object):
+ tm.assert_frame_equal(from_frame, chunk, check_dtype=False)
+ pos += chunksize
+
++ @pytest.mark.intel
+ @pytest.mark.parametrize('version', [114, 117])
+ def test_write_variable_labels(self, version):
+ # GH 13631, add support for writing variable labels
+@@ -1240,6 +1242,7 @@ class TestStata(object):
+ with tm.ensure_clean() as path:
+ original.to_stata(path, variable_labels=variable_labels_long)
+
++ @pytest.mark.intel
+ def test_default_date_conversion(self):
+ # GH 12259
+ dates = [dt.datetime(1999, 12, 31, 12, 12, 12, 12000),
--- /dev/null
+Description: Mark tests that pass only on Intel architectures with @pytest.mark.intel
+ There was another test failing on mips and powerpc.
+Author: Andreas Tille <tille@debian.org>
+Last-Update: Tue, 24 Oct 2017 21:19:06 +0200
+Bug-Debian: https://bugs.debian.org/877419
+
+--- a/pandas/tests/io/parser/skiprows.py
++++ b/pandas/tests/io/parser/skiprows.py
+@@ -9,6 +9,8 @@ from datetime import datetime
+
+ import numpy as np
+
++import pytest
++
+ import pandas.util.testing as tm
+
+ from pandas import DataFrame
+@@ -200,6 +202,7 @@ line 22",2
+ df = self.read_csv(StringIO(data), skiprows=2)
+ tm.assert_frame_equal(df, expected)
+
++ @pytest.mark.intel
+ def test_skiprows_callable(self):
+ data = 'a\n1\n2\n3\n4\n5'
+
--- /dev/null
+Description: Mark tests that pass only on Intel architectures with @pytest.mark.intel
+ There was another test failing on s390x (and armhf, mips, hppa, powerpc, ppc64, sparc64).
+Author: Andreas Tille <tille@debian.org>
+Last-Update: Mon, 23 Oct 2017 14:18:56 +0200
+Bug-Debian: https://bugs.debian.org/877419
+
+--- a/pandas/tests/io/test_packers.py
++++ b/pandas/tests/io/test_packers.py
+@@ -60,6 +60,7 @@ def all_packers_data():
+ return create_data()
+
+
++@pytest.mark.intel
+ def check_arbitrary(a, b):
+
+ if isinstance(a, (list, tuple)) and isinstance(b, (list, tuple)):
+@@ -921,6 +922,7 @@ TestPackers
+ else:
+ tm.assert_frame_equal(result, expected)
+
++ @pytest.mark.intel
+ def test_msgpacks_legacy(self, current_packers_data, all_packers_data,
+ legacy_packer, datapath):
+
+--- a/pandas/tests/indexes/datetimes/test_formats.py
++++ b/pandas/tests/indexes/datetimes/test_formats.py
+@@ -6,10 +6,11 @@ import dateutil.tz
+ import pytz
+ import pytest
+
++import pytest
+ import pandas.util.testing as tm
+ import pandas as pd
+
+-
++@pytest.mark.intel
+ def test_to_native_types():
+ index = DatetimeIndex(freq='1D', periods=3, start='2017-01-01')
+
--- /dev/null
+Author: Andreas Tille <tille@debian.org>
+Last-Update: Mon, 23 Oct 2017 14:18:56 +0200
+Description: Use the Debian-packaged MathJax
+
+--- a/doc/source/conf.py
++++ b/doc/source/conf.py
+@@ -75,6 +75,8 @@ extensions = ['sphinx.ext.autodoc',
+ 'nbsphinx',
+ ]
+
++mathjax_path='MathJax.js'
++
+ exclude_patterns = ['**.ipynb_checkpoints']
+
+ with open("index.rst") as f:
--- /dev/null
+Description: Stop directly calling pytest fixtures
+
+This is not allowed in recent pytest:
+https://tests.reproducible-builds.org/debian/rbuild/unstable/amd64/pandas_0.23.3+dfsg-3.rbuild.log.gz
+https://ci.debian.net/data/autopkgtest/testing/amd64/p/pandas/2954485/log.gz
+
+Origin: upstream b7e5704be86cb44707ae9be372129e521c35e0d0 + 1a12c41d201f56439510e683fadfed1218ea9067
+Author: alimcmaster1, Tom Augspurger
+Bug: https://github.com/pandas-dev/pandas/issues/22338
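+
+Minimal sketch of the pattern change (hypothetical names, not taken from the
+pandas test suite):
+
+ import pytest
+ import pandas as pd
+
+ @pytest.fixture
+ def sample_df():
+     return pd.DataFrame({'a': [1, 2]})
+
+ # Calling the fixture function directly, e.g. in a parametrize list
+ # (@pytest.mark.parametrize("obj", [sample_df()])), is an error in pytest 4+.
+ # Instead, request the fixture (or a fixture derived from it) as an argument:
+ def test_sample(sample_df):
+     assert len(sample_df) == 2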
+
+--- a/pandas/tests/groupby/test_whitelist.py
++++ b/pandas/tests/groupby/test_whitelist.py
+@@ -123,11 +123,15 @@ def df_letters():
+ return df
+
+
+-@pytest.mark.parametrize(
+- "obj, whitelist", zip((df_letters(), df_letters().floats),
+- (df_whitelist, s_whitelist)))
+-def test_groupby_whitelist(df_letters, obj, whitelist):
++@pytest.mark.parametrize("whitelist", [df_whitelist, s_whitelist])
++def test_groupby_whitelist(df_letters, whitelist):
+ df = df_letters
++ if whitelist == df_whitelist:
++ # dataframe
++ obj = df_letters
++ else:
++ obj = df_letters['floats']
++
+
+ # these are aliases so ok to have the alias __name__
+ alias = {'bfill': 'backfill',
+--- a/pandas/tests/indexes/datetimes/test_tools.py
++++ b/pandas/tests/indexes/datetimes/test_tools.py
+@@ -1491,12 +1491,25 @@ def units_from_epochs():
+ return list(range(5))
+
+
+-@pytest.fixture(params=[epoch_1960(),
+- epoch_1960().to_pydatetime(),
+- epoch_1960().to_datetime64(),
+- str(epoch_1960())])
+-def epochs(request):
+- return request.param
++@pytest.fixture(params=['timestamp', 'pydatetime', 'datetime64', 'str_1960'])
++def epochs(epoch_1960, request):
++ """Timestamp at 1960-01-01 in various forms.
++
++ * pd.Timestamp
++ * datetime.datetime
++ * numpy.datetime64
++ * str
++ """
++ assert request.param in {'timestamp', 'pydatetime', 'datetime64',
++ "str_1960"}
++ if request.param == 'timestamp':
++ return epoch_1960
++ elif request.param == 'pydatetime':
++ return epoch_1960.to_pydatetime()
++ elif request.param == "datetime64":
++ return epoch_1960.to_datetime64()
++ else:
++ return str(epoch_1960)
+
+
+ @pytest.fixture
+--- a/pandas/tests/series/test_analytics.py
++++ b/pandas/tests/series/test_analytics.py
+@@ -1850,8 +1850,35 @@ class TestSeriesAnalytics(TestData):
+ tm.assert_series_equal(idx.value_counts(normalize=True), exp)
+
+
++main_dtypes = [
++ 'datetime',
++ 'datetimetz',
++ 'timedelta',
++ 'int8',
++ 'int16',
++ 'int32',
++ 'int64',
++ 'float32',
++ 'float64',
++ 'uint8',
++ 'uint16',
++ 'uint32',
++ 'uint64'
++]
++
++
+ @pytest.fixture
+ def s_main_dtypes():
++ """A DataFrame with many dtypes
++
++ * datetime
++ * datetimetz
++ * timedelta
++ * [u]int{8,16,32,64}
++ * float{32,64}
++
++ The columns are the name of the dtype.
++ """
+ df = pd.DataFrame(
+ {'datetime': pd.to_datetime(['2003', '2002',
+ '2001', '2002',
+@@ -1871,6 +1898,12 @@ def s_main_dtypes():
+ return df
+
+
++@pytest.fixture(params=main_dtypes)
++def s_main_dtypes_split(request, s_main_dtypes):
++ """Each series in s_main_dtypes."""
++ return s_main_dtypes[request.param]
++
++
+ def assert_check_nselect_boundary(vals, dtype, method):
+ # helper function for 'test_boundary_{dtype}' tests
+ s = Series(vals, dtype=dtype)
+@@ -1900,12 +1933,10 @@ class TestNLargestNSmallest(object):
+ with tm.assert_raises_regex(TypeError, msg):
+ method(arg)
+
+- @pytest.mark.parametrize(
+- "s",
+- [v for k, v in s_main_dtypes().iteritems()])
+- def test_nsmallest_nlargest(self, s):
++ def test_nsmallest_nlargest(self, s_main_dtypes_split):
+ # float, int, datetime64 (use i8), timedelts64 (same),
+ # object that are numbers, object that are strings
++ s = s_main_dtypes_split
+
+ assert_series_equal(s.nsmallest(2), s.iloc[[2, 1]])
+ assert_series_equal(s.nsmallest(2, keep='last'), s.iloc[[2, 3]])
+--- a/pandas/tests/io/test_sql.py
++++ b/pandas/tests/io/test_sql.py
+@@ -254,9 +254,13 @@ class PandasSQLTest(object):
+ else:
+ return self.conn.cursor()
+
+- def _load_iris_data(self, datapath):
++ @pytest.fixture(params=[('io', 'data', 'iris.csv')])
++ def load_iris_data(self, datapath, request):
+ import io
+- iris_csv_file = datapath('io', 'data', 'iris.csv')
++ iris_csv_file = datapath(*request.param)
++
++ if not hasattr(self, 'conn'):
++ self.setup_connect()
+
+ self.drop_table('iris')
+ self._get_exec().execute(SQL_STRINGS['create_iris'][self.flavor])
+@@ -504,10 +508,14 @@ class _TestSQLApi(PandasSQLTest):
+ flavor = 'sqlite'
+ mode = None
+
+- @pytest.fixture(autouse=True)
+- def setup_method(self, datapath):
++ def setup_connect(self):
+ self.conn = self.connect()
+- self._load_iris_data(datapath)
++
++ @pytest.fixture(autouse=True)
++ def setup_method(self, load_iris_data):
++ self.load_test_data_and_sql()
++
++ def load_test_data_and_sql(self):
+ self._load_iris_view()
+ self._load_test1_data()
+ self._load_test2_data()
+@@ -1028,8 +1036,8 @@ class _EngineToConnMixin(object):
+ """
+
+ @pytest.fixture(autouse=True)
+- def setup_method(self, datapath):
+- super(_EngineToConnMixin, self).setup_method(datapath)
++ def setup_method(self, load_iris_data):
++ super(_EngineToConnMixin, self).load_test_data_and_sql()
+ engine = self.conn
+ conn = engine.connect()
+ self.__tx = conn.begin()
+@@ -1154,14 +1162,14 @@ class _TestSQLAlchemy(SQLAlchemyMixIn, P
+ msg = "{0} - can't connect to {1} server".format(cls, cls.flavor)
+ pytest.skip(msg)
+
+- @pytest.fixture(autouse=True)
+- def setup_method(self, datapath):
+- self.setup_connect()
+-
+- self._load_iris_data(datapath)
++ def load_test_data_and_sql(self):
+ self._load_raw_sql()
+ self._load_test1_data()
+
++ @pytest.fixture(autouse=True)
++ def setup_method(self, load_iris_data):
++ self.load_test_data_and_sql()
++
+ @classmethod
+ def setup_import(cls):
+ # Skip this test if SQLAlchemy not available
+@@ -1926,15 +1934,17 @@ class TestSQLiteFallback(SQLiteMixIn, Pa
+ def connect(cls):
+ return sqlite3.connect(':memory:')
+
+- @pytest.fixture(autouse=True)
+- def setup_method(self, datapath):
++ def setup_connect(self):
+ self.conn = self.connect()
+- self.pandasSQL = sql.SQLiteDatabase(self.conn)
+-
+- self._load_iris_data(datapath)
+
++ def load_test_data_and_sql(self):
++ self.pandasSQL = sql.SQLiteDatabase(self.conn)
+ self._load_test1_data()
+
++ @pytest.fixture(autouse=True)
++ def setup_method(self, load_iris_data):
++ self.load_test_data_and_sql()
++
+ def test_read_sql(self):
+ self._read_sql_iris()
+
+@@ -2147,6 +2157,12 @@ class TestXSQLite(SQLiteMixIn):
+ self.method = request.function
+ self.conn = sqlite3.connect(':memory:')
+
++ # In some test cases we may close db connection
++ # Re-open conn here so we can perform cleanup in teardown
++ yield
++ self.method = request.function
++ self.conn = sqlite3.connect(':memory:')
++
+ def test_basic(self):
+ frame = tm.makeTimeDataFrame()
+ self._check_roundtrip(frame)
+@@ -2223,7 +2239,7 @@ class TestXSQLite(SQLiteMixIn):
+ with pytest.raises(Exception):
+ sql.execute('INSERT INTO test VALUES("foo", "bar", 7)', self.conn)
+
+- def test_execute_closed_connection(self, request, datapath):
++ def test_execute_closed_connection(self):
+ create_sql = """
+ CREATE TABLE test
+ (
+@@ -2242,9 +2258,6 @@ class TestXSQLite(SQLiteMixIn):
+ with pytest.raises(Exception):
+ tquery("select * from test", con=self.conn)
+
+- # Initialize connection again (needed for tearDown)
+- self.setup_method(request, datapath)
+-
+ def test_na_roundtrip(self):
+ pass
+
--- /dev/null
+up_skip_testrepr
+deb_nonversioneer_version
+deb_doc_donotoverride_PYTHONPATH
+deb_skip_stata_on_bigendians
+deb_disable_googleanalytics
+deb_skip_sequencelike_on_armel
+deb_no_strict_data
+# Try to skip -- might have been addressed upstream
+# deb_skip_test_pytables_failure
+# up_buggy_overflows
+# 858260.patch
+up_print_versions
+# does not apply to 0.22 but kept around since next one might have it
+# up_tst_dont_assert_that_a_bug_exists_in_numpy
+mark_tests_working_on_intel.patch
+mark_tests_working_on_intel_s390x.patch
+mark_tests_working_on_intel_mips.patch
+mark_tests_working_on_intel_armhf.patch
+mark_tests_failing_on_386.patch
+mathjax-path.patch
+deb_ndsphinx_optional
+deb_skip_difffailingtests
+918206.patch
+
+# lintian: patch-file-present-but-not-mentioned-in-series
+# Don't remove this comment, so that we can avoid a lintian warning.
+# This patch is conditionally applied via d/rules.
+# 0001-TST-pytest-deprecation-warnings-GH17197-17253-reversed.patch
+skip_tests_copyright.patch
+array_series_matmul.patch
+pytest_fixtures.patch
+skip_noencoding_locales.patch
+use_system_intersphinx.patch
+spelling.patch
+fix_shebangs.patch
--- /dev/null
+Description: Don't test datetime in locales with no encoding
+
+Some datetime tests are run in every available locale.
+If this set includes locales without an encoding (currently dsb_DE,
+sah_RU and crh_UA), they fail due to Python bug
+https://bugs.python.org/issue20088
+
+Failure log
+https://tests.reproducible-builds.org/debian/rbuild/buster/amd64/pandas_0.23.3+dfsg-3.rbuild.log.gz
+
+Author: Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Bug: https://github.com/pandas-dev/pandas/issues/20957
+Forwarded: no
+
+--- a/pandas/util/testing.py
++++ b/pandas/util/testing.py
+@@ -407,6 +407,8 @@
+ except subprocess.CalledProcessError as e:
+ raise type(e)("{exception}, the 'locale -a' command cannot be found "
+ "on your system".format(exception=e))
++ # skip locales without encoding, to avoid Python bug https://bugs.python.org/issue20088
++ raw_locales = raw_locales.replace(b'\ndsb_DE\n',b'\n').replace(b'\nsah_RU\n',b'\n').replace(b'\ncrh_UA\n',b'\n')
+ return raw_locales
+
+
--- /dev/null
+Description: Skip tests removed for copyright reasons
+
+Also revert an accidental change to _version.py.
+
+Author: Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Forwarded: no
+
+--- a/pandas/_version.py
++++ b/pandas/_version.py
+@@ -20,8 +20,8 @@ def get_keywords():
+ # setup.py/versioneer.py will grep for the variable names, so they must
+ # each be defined on a line of their own. _version.py will just call
+ # get_keywords().
++ git_refnames = "$Format:%d$"
++ git_full = "$Format:%H$"
+- git_refnames = " (tag: v0.23.3)"
+- git_full = "edb71fda022c6a155717e7a25679040ee0476639"
+ keywords = {"refnames": git_refnames, "full": git_full}
+ return keywords
+
+--- pandas-0.23.3+dfsg.orig/pandas/tests/io/test_html.py
++++ pandas-0.23.3+dfsg/pandas/tests/io/test_html.py
+@@ -365,6 +365,7 @@ class TestReadHtml(object):
+ assert sorted(zz) == sorted(['Repo', 'What'])
+
+ @pytest.mark.slow
++ @pytest.mark.skip(reason='test data removed for copyright reasons')
+ def test_thousands_macau_stats(self, datapath):
+ all_non_nan_table_index = -2
+ macau_data = datapath("io", "data", "macau.html")
+@@ -375,6 +376,7 @@ class TestReadHtml(object):
+ assert not any(s.isna().any() for _, s in df.iteritems())
+
+ @pytest.mark.slow
++ @pytest.mark.skip(reason='test data removed for copyright reasons')
+ def test_thousands_macau_index_col(self, datapath):
+ all_non_nan_table_index = -2
+ macau_data = datapath('io', 'data', 'macau.html')
+@@ -531,6 +533,7 @@ class TestReadHtml(object):
+ res2 = self.read_html(data2, header=0)
+ assert_framelist_equal(res1, res2)
+
++ @pytest.mark.skip(reason='test data removed for copyright reasons')
+ def test_nyse_wsj_commas_table(self, datapath):
+ data = datapath('io', 'data', 'nyse_wsj.html')
+ df = self.read_html(data, index_col=0, header=0,
+@@ -671,6 +674,7 @@ class TestReadHtml(object):
+ newdf = DataFrame({'datetime': raw_dates})
+ tm.assert_frame_equal(newdf, res[0])
+
++ @pytest.mark.skip(reason='test data removed for copyright reasons')
+ def test_computer_sales_page(self, datapath):
+ data = datapath('io', 'data', 'computer_sales_page.html')
+ with tm.assert_raises_regex(ParserError,
--- /dev/null
+Description: Fix typos
+
+Author: Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Origin: lintian
+Forwarded: no
+
+--- pandas-0.23.3+dfsg.orig/doc/source/whatsnew/v0.18.1.txt
++++ pandas-0.23.3+dfsg/doc/source/whatsnew/v0.18.1.txt
+@@ -605,7 +605,7 @@ Bug Fixes
+ - Bug in ``.astype()`` of a ``Float64Inde/Int64Index`` to an ``Int64Index`` (:issue:`12881`)
+ - Bug in roundtripping an integer based index in ``.to_json()/.read_json()`` when ``orient='index'`` (the default) (:issue:`12866`)
+ - Bug in plotting ``Categorical`` dtypes cause error when attempting stacked bar plot (:issue:`13019`)
+-- Compat with >= ``numpy`` 1.11 for ``NaT`` comparions (:issue:`12969`)
++- Compat with >= ``numpy`` 1.11 for ``NaT`` comparisons (:issue:`12969`)
+ - Bug in ``.drop()`` with a non-unique ``MultiIndex``. (:issue:`12701`)
+ - Bug in ``.concat`` of datetime tz-aware and naive DataFrames (:issue:`12467`)
+ - Bug in correctly raising a ``ValueError`` in ``.resample(..).fillna(..)`` when passing a non-string (:issue:`12952`)
+--- pandas-0.23.3+dfsg.orig/pandas/_libs/lib.pyx
++++ pandas-0.23.3+dfsg/pandas/_libs/lib.pyx
+@@ -417,7 +417,7 @@ def maybe_booleans_to_slice(ndarray[uint
+ @cython.wraparound(False)
+ @cython.boundscheck(False)
+ cpdef bint array_equivalent_object(object[:] left, object[:] right):
+- """ perform an element by element comparion on 1-d object arrays
++ """ perform an element by element comparison on 1-d object arrays
+ taking into account nan positions """
+ cdef:
+ Py_ssize_t i, n = left.shape[0]
--- /dev/null
+From: Yaroslav Halchenko <debian@onerussian.com>
+Subject: avoid overflows for now
+ Due to a bug in the current numpy beta (or numexpr), tests would fail
+ if an operation on ints leads to overflow (e.g. the pow operation).
+
+ As a workaround for now, avoid big ints.
+
+Origin: Debian
+Bug: https://github.com/pandas-dev/pandas/issues/15046
+Last-Update: 2017-01-04
+
+--- a/pandas/tests/test_expressions.py
++++ b/pandas/tests/test_expressions.py
+@@ -40,7 +40,7 @@ _mixed2 = DataFrame({'A': _frame2['A'].c
+ 'C': _frame2['C'].astype('int64'),
+ 'D': _frame2['D'].astype('int32')})
+ _integer = DataFrame(
+- np.random.randint(1, 100,
++ np.random.randint(1, 10,
+ size=(10001, 4)), columns=list('ABCD'), dtype='int64')
+ _integer2 = DataFrame(np.random.randint(1, 100, size=(101, 4)),
+ columns=list('ABCD'), dtype='int64')
--- /dev/null
+--- a/ci/print_versions.py
++++ b/ci/print_versions.py
+@@ -9,7 +9,7 @@ def show_versions(as_json=False):
+ pandas_dir = os.path.abspath(os.path.join(this_dir, ".."))
+ sv_path = os.path.join(pandas_dir, 'pandas', 'util')
+ mod = imp.load_module(
+- 'pvmod', *imp.find_module('print_versions', [sv_path]))
++ 'pvmod', *imp.find_module('_print_versions', [sv_path]))
+ return mod.show_versions(as_json)
+
+
--- /dev/null
+From: Yaroslav Halchenko <debian@onerussian.com>
+Subject: skip the assert for now - remove this patch once the upstream issue is closed
+Origin: Debian
+Bug: https://github.com/pandas-dev/pandas/issues/21746
+Last-Update: 2018-07-18
+
+--- a/pandas/tests/io/formats/test_format.py
++++ b/pandas/tests/io/formats/test_format.py
+@@ -1598,7 +1598,8 @@ c 10 11 12 13 14\
+ # Wide
+ h, w = max_rows - 1, max_cols + 1
+ df = DataFrame({k: np.arange(1, 1 + h) for k in np.arange(w)})
+- assert has_horizontally_truncated_repr(df)
++ # TEMP: https://github.com/pandas-dev/pandas/issues/21746
++ # assert has_horizontally_truncated_repr(df)
+ with option_context('display.large_repr', 'info',
+ 'display.max_columns', max_cols):
+ assert has_info_repr(df)
--- /dev/null
+From 5f2b96bb637f6ddeec169c5ef8ad20013a03c853 Mon Sep 17 00:00:00 2001
+From: Eric Wieser <wieser.eric@gmail.com>
+Date: Sat, 15 Jul 2017 13:30:03 +0100
+Subject: [PATCH] TST: Don't assert that a bug exists in numpy (#16940)
+
+Better to ignore the warning from the bug, rather than assert the bug is still there
+
+After this change, numpy/numpy#9412 _could_ be backported to fix the bug
+---
+ pandas/tests/test_algos.py | 3 ++-
+ 1 file changed, 2 insertions(+), 1 deletion(-)
+
+diff --git a/pandas/tests/test_algos.py b/pandas/tests/test_algos.py
+index 9504d2a9426..993dcc4f527 100644
+--- a/pandas/tests/test_algos.py
++++ b/pandas/tests/test_algos.py
+@@ -2,6 +2,7 @@
+
+ import numpy as np
+ import pytest
++import warnings
+
+ from numpy.random import RandomState
+ from numpy import nan
+@@ -127,7 +128,7 @@ def test_unsortable(self):
+ arr = np.array([1, 2, datetime.now(), 0, 3], dtype=object)
+ if compat.PY2 and not pd._np_version_under1p10:
+ # RuntimeWarning: tp_compare didn't return -1 or -2 for exception
+- with tm.assert_produces_warning(RuntimeWarning):
++ with warnings.catch_warnings():
+ pytest.raises(TypeError, algos.safe_sort, arr)
+ else:
+ pytest.raises(TypeError, algos.safe_sort, arr)
--- /dev/null
+Description: Use packaged intersphinx indexes, and use https links
+
+Author: Rebecca N. Palmer <rebecca_palmer@zoho.com>
+Bug-Debian: https://bugs.debian.org/876417
+Forwarded: no
+
+--- pandas-0.23.3+dfsg.orig/doc/source/conf.py
++++ pandas-0.23.3+dfsg/doc/source/conf.py
+@@ -357,13 +357,13 @@ latex_documents = [
+
+
+ intersphinx_mapping = {
+- 'statsmodels': ('http://www.statsmodels.org/devel/', None),
+- 'matplotlib': ('http://matplotlib.org/', None),
+- 'pandas-gbq': ('https://pandas-gbq.readthedocs.io/en/latest/', None),
+- 'python': ('https://docs.python.org/3/', None),
+- 'numpy': ('https://docs.scipy.org/doc/numpy/', None),
+- 'scipy': ('https://docs.scipy.org/doc/scipy/reference/', None),
+- 'py': ('https://pylib.readthedocs.io/en/latest/', None)
++ 'statsmodels': ('https://www.statsmodels.org/devel/', '/usr/share/doc/python-statsmodels-doc/html/objects.inv'),
++ 'matplotlib': ('https://matplotlib.org/', '/usr/share/doc/python-matplotlib-doc/html/objects.inv'),
++ 'pandas-gbq': ('https://pandas-gbq.readthedocs.io/en/latest/', None), # not in Debian
++ 'python': ('https://docs.python.org/3/', '/usr/share/doc/python3-doc/html/objects.inv'),
++ 'numpy': ('https://docs.scipy.org/doc/numpy/', '/usr/share/doc/python-numpy-doc/html/objects.inv'),
++ 'scipy': ('https://docs.scipy.org/doc/scipy/reference/', ('/usr/share/doc/python-scipy-doc/html/objects.inv','/usr/share/doc/python-scipy/html/objects.inv')),
++ 'py': ('https://pylib.readthedocs.io/en/latest/', None) # no -doc in Debian
+ }
+ import glob
+ autosummary_generate = glob.glob("*.rst")
--- /dev/null
+doc/build/html
--- /dev/null
+usr/share/javascript/jquery/jquery.js usr/share/doc/python-pandas-doc/html/_static/jquery.js
--- /dev/null
+usr/lib/python2*/dist-packages/pandas/*/*/*.so
+usr/lib/python2*/dist-packages/pandas/*/*.so
--- /dev/null
+usr/lib/python2*/
--- /dev/null
+usr/lib/python3/dist-packages/pandas/*/*/*.so
+usr/lib/python3/dist-packages/pandas/*/*.so
--- /dev/null
+usr/lib/python3/
--- /dev/null
+#!/usr/bin/make -f
+# -*- mode: makefile; coding: utf-8 -*-
+
+export DEB_BUILD_MAINT_OPTIONS = hardening=+all
+
+# Pass hardening flags into distutils, explicitly
+export CFLAGS = $(shell dpkg-buildflags --get CFLAGS)
+export CPPFLAGS = $(shell dpkg-buildflags --get CPPFLAGS)
+export CXXFLAGS = $(shell dpkg-buildflags --get CXXFLAGS)
+
+PACKAGE2_NAME = python-pandas
+PACKAGE3_NAME = python3-pandas
+PACKAGE2_ROOT_DIR = debian/${PACKAGE2_NAME}
+PACKAGE3_ROOT_DIR = debian/${PACKAGE3_NAME}
+
+PYVERS = $(shell pyversions -vr)
+PYVER = $(shell pyversions -vd)
+#broken on python3.8 for now: https://objectstorage.prodstack4-5.canonical.com/v1/AUTH_77e2ada1e7a84929a74ba3b87153c0ac/autopkgtest-focal/focal/amd64/p/pandas/20191024_181815_7c017@/log.gz
+#plan for fixing this: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=931557#10
+PY3VERS = 3.7
+PY3VER = $(shell py3versions -vd)
+
+UVER := $(shell LC_ALL=C dpkg-parsechangelog | awk '/^Version:/{print $$2;}' | sed -e 's,-[^-]*$$,,g' | sed -e 's,+dfsg,,g')
+# Python doesn't use ~ for rc
+UVER_PY := $(shell echo $(UVER) | sed -e 's,[~],,g')
+UVER_PYSHORT := $(shell echo $(UVER_PY) | sed -e 's,+git.*,,g')
+
+MIN_CYTHONVER = 0.23
+
+# Filter out tests with "marker expressions" and "keyword expressions". Ref: pytest(1)
+ifeq ($(DEB_HOST_ARCH),$(filter $(DEB_HOST_ARCH), amd64 i386 kfreebsd-amd64 kfreebsd-i386 x32))
+ PYTEST_MARKER_ARCH :=
+ PYTEST_KEYWORD_ARCH :=
+else
+ PYTEST_MARKER_ARCH := and not intel and not slow
+ PYTEST_KEYWORD_ARCH :=
+endif
+
+PYTEST_MARKER := not single and not network and not disabled $(PYTEST_MARKER_ARCH)
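+# Illustration only (the real invocation is in the python-test% rule below):
+# this marker expression is passed to pytest's -m option, e.g.
+#   python3 -m pytest -m "not single and not network and not disabled" pandas/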
+
+ADDPATCHES :=
+# testing for version became fragile: https://github.com/pytest-dev/pytest/issues/3692
+# PYTESTVER := $(shell python -c 'import pytest; print(pytest.__version__)')
+# and there are way too many uses of pytest.param now, so we can't just easily patch for it... THINK!
+# ADDPATCHES += $(shell dpkg --compare-versions $(PYTESTVER) ge 3.1.0 || echo "0001-TST-pytest-deprecation-warnings-GH17197-17253-reversed.patch" )
+
+# MPLVER := $(shell dpkg -l python-matplotlib | awk '/^ii/{print $$3;}' || echo 0)
+# $(shell dpkg --compare-versions $(MPLVER) lt 1.0 && echo '|test_hist|test_plot|test_boxplot|test_corr_rank' || echo '')
+
+# try to prevent unsanctioned downloads
+export http_proxy=http://127.0.0.1:9/
+export https_proxy=http://127.0.0.1:9/
+
+export SHELL=/bin/bash
+
+# Mega rule
+%:
+ : # Explicit build system to avoid use of all-in-1 Makefile
+ dh $@ --buildsystem=pybuild --with python2,python3
+
+clean_generated:
+ find pandas/ -regex '.*\.c\(\|pp\)' | xargs grep -l -e 'Generated by Cython' | xargs -r rm -f
+
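+# _cythonize% regenerates the Cython C/C++ sources with the current cython
+# and stores them (plus a VERSION stamp) under debian/cythonized-files*,
+# so that _uncythonize% can fall back to them when cython is too old.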
+_cythonize%:
+ debian/rules clean_generated # force removal of previous copies
+ python$(*:2=) setup.py cython
+ D=debian/cythonized-files$(*:2=) && \
+ git rm -rf $$D; \
+ find pandas/ -regex '.*\.c\(\|pp\)' | while read f; do \
+ grep -q 'Generated by Cython' "$$f" || continue; \
+ mkdir -p "$$D/$$(dirname $$f)"; \
+ cp "$$f" "$$D/$$(dirname $$f)"; \
+ git add -f "$$D/$$f"; \
+ done; \
+ echo "$(UVER)" >| $$D/VERSION; git add $$D/VERSION
+
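+# _uncythonize% checks the installed cython/cython3 version; if it is older
+# than MIN_CYTHONVER, the pre-generated sources from debian/cythonized-files*
+# are copied back over the source tree instead of re-cythonizing.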
+_uncythonize%:
+ echo "$*" | grep -q '^3' && PY=3 || PY= ; \
+ CYTHONVER=$$(dpkg -l cython$$PY 2>/dev/null | awk '/^ii/{print $$3;}' || echo 0); \
+ dpkg --compare-versions "$$CYTHONVER" lt "$(MIN_CYTHONVER)" && { \
+ echo "I: Using pre-Cython-ed files for Python $*"; \
+ cd debian/cythonized-files$$PY/ ; \
+ find . -regex '.*\.c\(\|pp\)' | while read f; do cp $$f ../../$$f; done; } || :
+
+cythonize: _cythonize2 _cythonize3
+
+override_dh_clean: clean_generated
+ : # Make sure that cythonized sources are up-to-date
+ [ ! -e debian/cythonized-files3/VERSION ] || [ "$(UVER)" = "`cat debian/cythonized-files3/VERSION`" ]
+ rm -rf build doc/_build *-stamp # pandas.egg-info pandas/datasets/__config__.py
+ dh_clean
+
+version_py:
+ [ -e pandas/__version.py ] || \
+ echo -e "version = '$(UVER_PY)'\nshort_version = '$(UVER_PYSHORT)'" > pandas/__version.py
+
+override_dh_auto_build: version_py debian/patch-stamp
+ # Override default build operation which --force's re-cythonization
+ # on elderly ubuntus
+ # Just build the version.py file
+ :
+
+debian/patch-stamp:
+ if echo "${ADDPATCHES}" | sed -e 's,\s,,g' | grep '.' ; then \
+ echo ${ADDPATCHES} >> debian/patches/series; \
+ quilt push -a; \
+ fi
+ touch $@
+
+
+override_dh_auto_install: ${PYVERS:%=python-install%} ${PY3VERS:%=python-install%} ${PYVERS:%=python-test%} ${PY3VERS:%=python-test%}
+# Per-Python-version logic -- install, test, remove .so (installed into -lib)
+python-install%: _uncythonize%
+ python$* setup.py install --install-layout=deb --root=$(CURDIR)/debian/tmp
+
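+# python-test% runs the test suite against the copy just installed into
+# debian/tmp, under xvfb with the Agg matplotlib backend; it is skipped
+# when DEB_BUILD_OPTIONS contains nocheck.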
+python-test%: python-install%
+ifeq (,$(filter nocheck,$(DEB_BUILD_OPTIONS)))
+ echo "backend : Agg" >| $(CURDIR)/build/matplotlibrc
+ : # Run unittests here against installed pandas
+ echo "$*" | grep -q '^3' && PY=3 || PY=$*; \
+ export PYTHONPATH=`/bin/ls -d $$PWD/debian/tmp/usr/lib/python$$PY/*/`; \
+ export MPLCONFIGDIR=$(CURDIR)/build HOME=$(CURDIR)/build; \
+ python$* ci/print_versions.py; \
+ cd build/; LOCALE_OVERRIDE=C xvfb-run -a -s "-screen 0 1280x1024x24 -noreset" \
+ python$* -m pytest -s -v -m "$(PYTEST_MARKER)" $$PYTHONPATH/pandas;
+else
+ : # Skip unittests due to nocheck
+endif
+
+override_dh_installdocs:
+ : # Build Documentation using installed pandas
+ifeq (,$(filter nodoc,$(DEB_BUILD_OPTIONS)))
+ifneq (,$(findstring -a,$(DH_INTERNAL_OPTIONS)))
+ : # not building documentation in -a
+else
+ cd doc && PYTHONPATH=$(CURDIR)/$(PACKAGE3_ROOT_DIR)-lib/usr/lib/python3/dist-packages:$(CURDIR)/$(PACKAGE3_ROOT_DIR)/usr/lib/python3/dist-packages MPLCONFIGDIR=$(CURDIR)/build HOME=$(CURDIR)/build LC_ALL=C python3 make.py html
+endif
+endif
+ : # Use jquery from Debian package, so prune shipped one
+ #TODO -rm doc/_build/html/_static/jquery.js
+ dh_installdocs -A *.md
+ # deduplicate files - the ||true is because we only build-depend on jdupes if we're building documentation
+ jdupes -r -l debian/python-pandas-doc/usr/share/doc || true
+
+override_dh_install:
+ dh_install
+ find debian -name __pycache__ | xargs rm -rf
+
+## remove .so libraries from main package, and call dh_numpy*
+## while removing 2 if not present
+_dh_python%:
+ [ -e /usr/bin/dh_numpy$(*:2=) ] && dh_numpy$(*:2=) -p$(PACKAGE$*_NAME)-lib || :
+ dh_python$*
+ -find debian/python*-pandas -name "*.so" -delete
+
+## "Instantiate" both rules so dh sees them
+override_dh_python2: _dh_python2
+override_dh_python3: _dh_python3
+
+## immediately useable documentation and exemplar scripts/data
+override_dh_compress:
+ dh_compress -X.py -X.html -X.pdf -X.css -X.jpg -X.txt -X.js -X.json -X.rtc -Xobjects.inv
+
+override_dh_auto_test:
+ # do nothing here, we run tests in python-test% instead
+ true
\ No newline at end of file
--- /dev/null
+3.0 (quilt)
--- /dev/null
+# Nothing we can do about the HTML files with lengthy lines.
+insane-line-length-in-source-file
+# False positive triggered by insane-line-length-in-source-file.
+# https://lintian.debian.org/tags/source-is-missing.html
+# Anyway let's override this "feature".
+source-is-missing
--- /dev/null
+extend-diff-ignore="^[^/]+\.egg-info/|pandas/__version.py"
--- /dev/null
+# According to pandas/doc/source/install.rst, running the unit tests looks like:
+# `py.test-3 --skip-slow --skip-network /usr/lib/python3/dist-packages/pandas/ -v -rs`
+# Or simply `python3 -c "import pandas as pd; pd.test()"`, which doesn't require
+# us to specify the path (pandas.__path__) in command line.
+# See: pandas/util/_tester.py
+
+Tests: unittests3
+Depends: locales-all,
+ python3-all,
+ python3-arrow,
+ python3-bs4,
+ python3-dateutil,
+ python3-html5lib,
+ python3-lxml,
+ python3-matplotlib [!hurd-i386] | python-matplotlib (<< 1.2.0~) [!hurd-i386],
+ python3-nose,
+ python3-numpy (>= 1:1.7~),
+ python3-openpyxl,
+ python3-pandas,
+ python3-pytest,
+ python3-scipy,
+ python3-six,
+ python3-sphinx (>= 1.0~),
+ python3-statsmodels,
+ python3-tables [!m68k !sh4 !x32],
+ python3-tk,
+ python3-tz,
+ python3-xarray,
+ python3-xlrd,
+ python3-xlsxwriter,
+ python3-xlwt,
+ xauth,
+ xvfb,
+ xclip,
+Restrictions: allow-stderr
--- /dev/null
+unittests3
\ No newline at end of file
--- /dev/null
+#!/bin/sh
+set -efu
+set -x
+
+arch=$(dpkg --print-architecture)
+
+# Let's filter some tests based on observations
+kw2='test_spam_url'
+kw3='test_spam_url'
+if [ "amd64" = $arch ]; then
+ kw2="$kw2
+ test_register_by_default
+ test_locale
+ "
+ kw3="$kw3
+ test_memory_leak
+ "
+elif [ "arm64" = $arch ]; then
+ kw2="$kw2
+ test_value_counts_normalized
+ test_resample_timedelta_values
+ test_datetimelikes_nan
+ test_sum_nanops_timedelta
+ test_agg_dict_parameter_cast_result_dtypes
+ test_timedelta_ops_with_missing_values
+ test_register_by_default
+ test_NaT_cast
+ test_locale
+ "
+ kw3="$kw3
+ test_value_counts_normalized
+ test_resample_timedelta_values
+ test_resample_datetime_values
+ test_datetimelikes_nan
+ test_sum_nanops_timedelta
+ test_agg_dict_parameter_cast_result_dtypes
+ test_timedelta_ops_with_missing_values
+ test_memory_leak
+ test_NaT_cast
+ "
+elif [ "armhf" = $arch ]; then
+ kw2="$kw2"
+ kw3="$kw3"
+elif [ "i386" = $arch ]; then
+ kw2="$kw2"
+ kw3="$kw3
+ test_memory_leak
+ "
+elif [ "ppc64el" = $arch ]; then
+ kw2="$kw2
+ test_register_by_default
+ test_locale
+ "
+ kw3="$kw3
+ test_memory_leak
+ "
+elif [ "s390x" = $arch ]; then
+ kw2="$kw2
+ test_msgpacks_legacy
+ test_locale
+ "
+ kw3="$kw3
+ test_msgpacks_legacy
+ "
+else
+ kw2="$kw2"
+ kw3="$kw3"
+fi
+
+if (basename $0 | grep "3" >/dev/null); then
+ keyword=$(python3 -c "print(' and '.join('not ' + x for x in '''$kw3'''.split()))")
+ pys="$(py3versions -r 2>/dev/null)"
+else
+ keyword=$(python -c "print(' and '.join('not ' + x for x in '''$kw2'''.split()))")
+ pys="$(pyversions -r 2>/dev/null)"
+fi
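+# e.g. kw3="test_spam_url test_memory_leak" yields the pytest -k expression
+# "not test_spam_url and not test_memory_leak"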
+
+# Debian: Enable "slow" tests on x86 to keep the code coverage.
+# Ubuntu: Disable "slow" tests on ALL architectures.
+if (echo amd64 i386 | grep $arch >/dev/null) && [ "Debian" = $(dpkg-vendor --query vendor) ]; then
+ marker='not single and not network and not disabled'
+elif (echo amd64 i386 | grep $arch >/dev/null) && [ "Ubuntu" = $(dpkg-vendor --query vendor) ]; then
+ marker='not single and not network and not disabled and not slow'
+else
+ marker='not single and not network and not disabled and not intel and not slow'
+fi
+
+cd "$ADTTMP"
+
+for py in $pys; do
+ echo "=== $py ==="
+ modpath=$($py -c 'import pandas as pd; print(pd.__path__[0])')
+ LC_ALL=C.UTF-8 xvfb-run --auto-servernum --server-args="-screen 0 1024x768x24" \
+ $py -m pytest --tb=long -s -v -m "$marker" -k "$keyword" $modpath 2>&1
+done
--- /dev/null
+Name: pandas
+Repository: https://github.com/pydata/pandas
+Documentation: https://pandas.pydata.org/pandas-docs/stable
+Bug-Database: https://github.com/pydata/pandas/issues
+Contact: https://pandas.pydata.org/community.html
+Reference:
+ Title: "pandas: a Foundational Python Library for Data Analysis and Statistics"
+ Eprint: https://www.scribd.com/doc/71048089/pandas-a-Foundational-Python-Library-for-Data-Analysis-and-Statistics
+ Author: McKinney, Wes
+ Booktitle: presented at PyHPC
+ Year: 2011
+Other-References: https://pandas.pydata.org/talks.html
--- /dev/null
+version=4
+opts="dversionmangle=s/.dfsg$//,uversionmangle=s/v//,filenamemangle=s/.*\/(.*)/pandas-$1\.tar\.gz/" \
+ https://github.com/pydata/pandas/tags .*archive/v?([\d\.rc]+).tar.gz