Compare commits

30 Commits

| SHA1 |
|---|
| e84c00efae |
| 53021ea264 |
| 01d067ec2b |
| cdd25c9512 |
| 898b63b665 |
| 080dfb9841 |
| 02421790bf |
| 48d91def7e |
| eb73db56c7 |
| 26f590babe |
| f90b2f845c |
| 9d4e0365da |
| 0431b8bb74 |
| 2653024409 |
| 923174718e |
| 73f37d0989 |
| 0722b95e53 |
| ae89436d97 |
| af77a23501 |
| 2a1424e563 |
| 1871d526ac |
| 9346e18d8c |
| 1db5c95414 |
| 0c05b906d4 |
| f04e01f55f |
| ca44e88e54 |
| b5fd3cfb84 |
| dc727832a0 |
| bf837164b4 |
| f6d589caa1 |
**.hgtags** (1 changed line)

```diff
@@ -74,3 +74,4 @@ a4f25c5e649892b5cc746d21be971e4773478af9 2.6.2
 2967aa416a4f3cdb65fc75073a2a148e1f372742 2.6.3
 f03b6de8325f5b6c35cea7c3de092f134ea8ef07 2.6.4
 7ed701fa2fb554bfc0618d447dfec700cc697407 2.7.0
+edc1d080bab5a970da8f6c776be50768829a7b09 2.7.1
```
**.travis.yml** (28 changed lines)

```diff
@@ -1,8 +1,32 @@
 sudo: false
 language: python
 # command to install dependencies
-install: "pip install -U detox"
+install: "pip install -U tox"
 # # command to run tests
-script: detox --recreate -i ALL=https://devpi.net/hpk/dev/
+env:
+  matrix:
+    - TESTENV=flakes
+    - TESTENV=py26
+    - TESTENV=py27
+    - TESTENV=py34
+    - TESTENV=pypy
+    - TESTENV=py27-pexpect
+    - TESTENV=py33-pexpect
+    - TESTENV=py27-nobyte
+    - TESTENV=py33
+    - TESTENV=py27-xdist
+    - TESTENV=py33-xdist
+    - TESTENV=py27
+    - TESTENV=py27-trial
+    - TESTENV=py33
+    - TESTENV=py33-trial
+    # inprocess tests by default were introduced in 2.8 only;
+    # this TESTENV should be enabled when merged back to master
+    #- TESTENV=py27-subprocess
+    - TESTENV=doctesting
+    - TESTENV=py27-cxfreeze
+    - TESTENV=coveralls
+
+script: tox --recreate -i ALL=https://devpi.net/hpk/dev/ -e $TESTENV
 
 notifications:
   irc:
```
**AUTHORS** (77 changed lines)

```diff
@@ -3,48 +3,51 @@ merlinux GmbH, Germany, office at merlinux eu
 
 Contributors include::
 
-Ronny Pfannschmidt
-Benjamin Peterson
-Floris Bruynooghe
-Jason R. Coombs
-Wouter van Ackooy
-Samuele Pedroni
 Anatoly Bubenkoff
+Andreas Zeidler
+Andy Freeland
+Anthon van der Neut
+Armin Rigo
+Aron Curzon
+Benjamin Peterson
+Bob Ippolito
+Brian Dorsey
+Brian Okken
 Brianna Laugher
 Carl Friedrich Bolz
-Armin Rigo
-Maho
-Jaap Broekhuizen
-Maciek Fijalkowski
-Guido Wesdorp
-Brian Dorsey
-Ross Lawley
-Ralf Schmitt
+Charles Cloud
 Chris Lamb
-Harald Armin Massa
-Martijn Faassen
-Ian Bicking
-Jan Balster
-Grig Gheorghiu
-Bob Ippolito
-Christian Tismer
-Daniel Nuri
-Graham Horler
-Andreas Zeidler
-Brian Okken
-Katarzyna Jachim
 Christian Theunert
-Anthon van der Neut
-Mark Abramowitz
-Piotr Banaszkiewicz
-Jurko Gospodnetić
-Marc Schlaich
+Christian Tismer
 Christopher Gilling
 Daniel Grana
-Andy Freeland
-Trevor Bekolay
-David Mohr
-Nicolas Delaby
-Tom Viner
+Daniel Nuri
 Dave Hunt
-Charles Cloud
+David Mohr
+Edison Gustavo Muenz
+Floris Bruynooghe
+Graham Horler
+Grig Gheorghiu
+Guido Wesdorp
+Harald Armin Massa
+Ian Bicking
+Jaap Broekhuizen
+Jan Balster
+Jason R. Coombs
+Jurko Gospodnetić
+Katarzyna Jachim
+Maciek Fijalkowski
+Maho
+Marc Schlaich
+Mark Abramowitz
+Martijn Faassen
+Nicolas Delaby
+Piotr Banaszkiewicz
+Punyashloka Biswal
+Ralf Schmitt
+Ronny Pfannschmidt
+Ross Lawley
+Samuele Pedroni
+Tom Viner
+Trevor Bekolay
+Wouter van Ackooy
```
**CHANGELOG** (29 changed lines)

```diff
@@ -1,3 +1,32 @@
+2.7.2 (compared to 2.7.1)
+-----------------------------
+
+- fix issue767: pytest.raises value attribute does not contain the exception
+  instance on Python 2.6. Thanks Eric Siegerman for providing the test
+  case and Bruno Oliveira for PR.
+
+- Automatically create directory for junitxml and results log.
+  Thanks Aron Curzon.
+
+- fix issue713: JUnit XML reports for doctest failures.
+  Thanks Punyashloka Biswal.
+
+- fix issue735: assertion failures on debug versions of Python 3.4+
+  Thanks Benjamin Peterson.
+
+- fix issue114: skipif marker reports to internal skipping plugin;
+  Thanks Floris Bruynooghe for reporting and Bruno Oliveira for the PR.
+
+- fix issue748: unittest.SkipTest reports to internal pytest unittest plugin.
+  Thanks Thomas De Schampheleire for reporting and Bruno Oliveira for the PR.
+
+- fix issue718: failed to create representation of sets containing unsortable
+  elements in python 2. Thanks Edison Gustavo Muenz
+
+- fix issue756, fix issue752 (and similar issues): depend on py-1.4.29
+  which has a refined algorithm for traceback generation.
+
+
 2.7.1 (compared to 2.7.0)
 -----------------------------
```
```diff
@@ -11,7 +11,7 @@ How to release pytest (draft)
 
 4. use devpi for uploading a release tarball to a staging area:
    - ``devpi use https://devpi.net/USER/dev``
-   - ``devpi upload``
+   - ``devpi upload --formats sdist,bdist_wheel``
 
 5. run from multiple machines:
    - ``devpi use https://devpi.net/USER/dev``
@@ -35,7 +35,10 @@ How to release pytest (draft)
     cd docs/en
     make html
 
-9. Upload the docs using docs/en/Makefile::
+9. Tag the release::
+
+    hg tag VERSION
+
+10. Upload the docs using docs/en/Makefile::
     cd docs/en
     make install  # or "installall" if you have LaTeX installed
 
 This requires ssh-login permission on pytest.org because it uses
@@ -43,12 +46,12 @@ How to release pytest (draft)
    Note that the "install" target of doc/en/Makefile defines where the
    rsync goes to, typically to the "latest" section of pytest.org.
 
-10. publish to pypi "devpi push pytest-2.6.2 pypi:NAME" where NAME
+11. publish to pypi "devpi push pytest-VERSION pypi:NAME" where NAME
    is the name of pypi.python.org as configured in your
    ~/.pypirc file -- it's the same you would use with
    "setup.py upload -r NAME"
 
-11. send release announcement to mailing lists:
+12. send release announcement to mailing lists:
 
    pytest-dev
    testing-in-python
```
```diff
@@ -1,2 +1,2 @@
 #
-__version__ = '2.7.1'
+__version__ = '2.7.2'
```
```diff
@@ -442,6 +442,13 @@ binop_map = {
     ast.NotIn: "not in"
 }
 
+# Python 3.4+ compatibility
+if hasattr(ast, "NameConstant"):
+    _NameConstant = ast.NameConstant
+else:
+    def _NameConstant(c):
+        return ast.Name(str(c), ast.Load())
+
 
 def set_location(node, lineno, col_offset):
     """Set node location information recursively."""
@@ -680,7 +687,7 @@ class AssertionRewriter(ast.NodeVisitor):
         if self.variables:
             variables = [ast.Name(name, ast.Store())
                          for name in self.variables]
-            clear = ast.Assign(variables, ast.Name("None", ast.Load()))
+            clear = ast.Assign(variables, _NameConstant(None))
             self.statements.append(clear)
         # Fix line numbers.
         for stmt in self.statements:
```
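The `_NameConstant` compatibility shim in the hunk above can be exercised standalone. This is an illustrative sketch (the `clear_to_none` helper is hypothetical, not part of the diff) showing that an assignment built with the shim compiles and runs:

```python
import ast

# Same compatibility shim as in the assertion-rewriter hunk: Python 3.4
# added ast.NameConstant for None/True/False; older versions spelled these
# as ast.Name nodes with a string id.
if hasattr(ast, "NameConstant"):
    _NameConstant = ast.NameConstant
else:
    def _NameConstant(c):
        return ast.Name(str(c), ast.Load())

def clear_to_none(varname):
    # Build the statement `<varname> = None` the way the rewriter clears
    # its temporary variables, then compile and execute it.
    assign = ast.Assign([ast.Name(varname, ast.Store())], _NameConstant(None))
    module = ast.parse("")      # empty module with version-correct fields
    module.body = [assign]
    ast.fix_missing_locations(module)
    namespace = {}
    exec(compile(module, "<ast>", "exec"), namespace)
    return namespace[varname]
```

On modern Pythons `ast.NameConstant` still exists (as a deprecated alias of `ast.Constant`), so the first branch is taken and the shim behaves identically.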
```diff
@@ -225,10 +225,18 @@ def _compare_eq_iterable(left, right, verbose=False):
     # dynamic import to speedup pytest
     import difflib
 
-    left = pprint.pformat(left).splitlines()
-    right = pprint.pformat(right).splitlines()
-    explanation = [u('Full diff:')]
-    explanation.extend(line.strip() for line in difflib.ndiff(left, right))
+    try:
+        left_formatting = pprint.pformat(left).splitlines()
+        right_formatting = pprint.pformat(right).splitlines()
+        explanation = [u('Full diff:')]
+    except Exception:
+        # hack: PrettyPrinter.pformat() in python 2 fails when formatting items that can't be sorted(), ie, calling
+        # sorted() on a list would raise. See issue #718.
+        # As a workaround, the full diff is generated by using the repr() string of each item of each container.
+        left_formatting = sorted(repr(x) for x in left)
+        right_formatting = sorted(repr(x) for x in right)
+        explanation = [u('Full diff (fallback to calling repr on each item):')]
+    explanation.extend(line.strip() for line in difflib.ndiff(left_formatting, right_formatting))
     return explanation
```
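The try/except fallback above is easy to demonstrate in isolation. A minimal standalone sketch (the `full_diff` helper is illustrative, not the actual pytest function):

```python
import difflib
import pprint

def full_diff(left, right):
    # Mirror of the fallback strategy from the hunk above: prefer
    # pprint.pformat(), but if it raises (as it can on Python 2 for sets
    # of unsortable items, see issue #718) fall back to diffing the
    # sorted repr() of each element instead.
    try:
        left_lines = pprint.pformat(left).splitlines()
        right_lines = pprint.pformat(right).splitlines()
        header = 'Full diff:'
    except Exception:
        left_lines = sorted(repr(x) for x in left)
        right_lines = sorted(repr(x) for x in right)
        header = 'Full diff (fallback to calling repr on each item):'
    return [header] + [line.strip() for line in difflib.ndiff(left_lines, right_lines)]
```

On Python 3, `pformat` handles unsortable elements itself, so the fallback branch rarely triggers there.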
```diff
@@ -123,10 +123,12 @@ class LogXML(object):
                 Junit.skipped(message="xfail-marked test passes unexpectedly"))
             self.skipped += 1
         else:
-            if isinstance(report.longrepr, (unicode, str)):
+            if hasattr(report.longrepr, "reprcrash"):
+                message = report.longrepr.reprcrash.message
+            elif isinstance(report.longrepr, (unicode, str)):
                 message = report.longrepr
             else:
-                message = report.longrepr.reprcrash.message
+                message = str(report.longrepr)
             message = bin_xml_escape(message)
             fail = Junit.failure(message=message)
             fail.append(bin_xml_escape(report.longrepr))
@@ -203,6 +205,9 @@ class LogXML(object):
         self.suite_start_time = time.time()
 
     def pytest_sessionfinish(self):
+        dirname = os.path.dirname(os.path.abspath(self.logfile))
+        if not os.path.isdir(dirname):
+            os.makedirs(dirname)
         logfile = open(self.logfile, 'w', encoding='utf-8')
         suite_stop_time = time.time()
         suite_time_delta = suite_stop_time - self.suite_start_time
```
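The directory-creation pattern added in `pytest_sessionfinish` (and mirrored in the resultlog hunk below) is simple to sketch on its own; `ensure_parent_dir` is an illustrative name, not part of the diff:

```python
import os
import tempfile

def ensure_parent_dir(path):
    # Create the parent directory chain for `path` if it does not exist
    # yet -- the same pattern the diff adds before opening the
    # junitxml/resultlog output file.
    dirname = os.path.dirname(os.path.abspath(path))
    if not os.path.isdir(dirname):
        os.makedirs(dirname)
    return dirname

# Example: a log path several levels below a fresh temporary directory.
root = tempfile.mkdtemp()
target = os.path.join(root, "path", "to", "results.xml")
created = ensure_parent_dir(target)
```

This is what lets `--junitxml=path/to/results.xml` work even when `path/to` does not exist yet.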
```diff
@@ -1099,6 +1099,13 @@ class RaisesContext(object):
         __tracebackhide__ = True
         if tp[0] is None:
             pytest.fail("DID NOT RAISE")
+        if sys.version_info < (2, 7):
+            # py26: on __exit__() exc_value often does not contain the
+            # exception value.
+            # http://bugs.python.org/issue7853
+            if not isinstance(tp[1], BaseException):
+                exc_type, value, traceback = tp
+                tp = exc_type, exc_type(value), traceback
         self.excinfo.__init__(tp)
         return issubclass(self.excinfo.type, self.ExpectedException)
```
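The py26 workaround above re-wraps a bare exception value into a proper instance. A standalone sketch of that normalization (the `normalize_exc_tuple` name is illustrative, not from the diff):

```python
def normalize_exc_tuple(tp):
    # On Python 2.6, __exit__() could receive the raw exception *value*
    # (e.g. the string passed to raise) instead of an exception instance
    # (http://bugs.python.org/issue7853); re-wrap it so callers can rely
    # on tp[1] being a BaseException.
    if not isinstance(tp[1], BaseException):
        exc_type, value, traceback = tp
        tp = exc_type, exc_type(value), traceback
    return tp

exc_type, value, tb = normalize_exc_tuple((ValueError, "boom", None))
```

This is why, after the fix, `excinfo.value` is guaranteed to be the exception instance even on Python 2.6 (issue767).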
```diff
@@ -3,6 +3,7 @@ text file.
 """
 
 import py
+import os
 
 def pytest_addoption(parser):
     group = parser.getgroup("terminal reporting", "resultlog plugin options")
@@ -14,6 +15,9 @@ def pytest_configure(config):
     resultlog = config.option.resultlog
     # prevent opening resultlog on slave nodes (xdist)
     if resultlog and not hasattr(config, 'slaveinput'):
+        dirname = os.path.dirname(os.path.abspath(resultlog))
+        if not os.path.isdir(dirname):
+            os.makedirs(dirname)
         logfile = open(resultlog, 'w', 1) # line buffered
         config._resultlog = ResultLog(config, logfile)
         config.pluginmanager.register(config._resultlog)
```
```diff
@@ -137,6 +137,7 @@ class MarkEvaluator:
 def pytest_runtest_setup(item):
     evalskip = MarkEvaluator(item, 'skipif')
     if evalskip.istrue():
+        item._evalskip = evalskip
         pytest.skip(evalskip.getexplanation())
     item._evalxfail = MarkEvaluator(item, 'xfail')
     check_xfail_no_run(item)
@@ -156,6 +157,7 @@ def pytest_runtest_makereport(item, call):
     outcome = yield
     rep = outcome.get_result()
     evalxfail = getattr(item, '_evalxfail', None)
+    evalskip = getattr(item, '_evalskip', None)
     # unitttest special case, see setting of _unexpectedsuccess
     if hasattr(item, '_unexpectedsuccess') and rep.when == "call":
         # we need to translate into how pytest encodes xpass
@@ -177,6 +179,13 @@ def pytest_runtest_makereport(item, call):
         elif call.when == "call":
             rep.outcome = "failed"  # xpass outcome
             rep.wasxfail = evalxfail.getexplanation()
+    elif evalskip is not None and rep.skipped and type(rep.longrepr) is tuple:
+        # skipped by mark.skipif; change the location of the failure
+        # to point to the item definition, otherwise it will display
+        # the location of where the skip exception was raised within pytest
+        filename, line, reason = rep.longrepr
+        filename, line = item.location[:2]
+        rep.longrepr = filename, line, reason
 
 # called by terminalreporter progress reporting
 def pytest_report_teststatus(report):
```
```diff
@@ -9,6 +9,7 @@ import py
 
 # for transfering markers
 from _pytest.python import transfer_markers
+from _pytest.skipping import MarkEvaluator
 
 
 def pytest_pycollect_makeitem(collector, name, obj):
@@ -113,6 +114,8 @@ class TestCaseFunction(pytest.Function):
         try:
             pytest.skip(reason)
         except pytest.skip.Exception:
+            self._evalskip = MarkEvaluator(self, 'SkipTest')
+            self._evalskip.result = True
             self._addexcinfo(sys.exc_info())
 
     def addExpectedFailure(self, testcase, rawexcinfo, reason=""):
```
**doc/en/announce/release-2.7.2.txt** (new file, 58 lines)

```diff
@@ -0,0 +1,58 @@
+pytest-2.7.2: bug fixes
+=======================
+
+pytest is a mature Python testing tool with more than 1100 tests
+against itself, passing on many different interpreters and platforms.
+This release is supposed to be drop-in compatible with 2.7.1.
+
+See below for the changes and see docs at:
+
+    http://pytest.org
+
+As usual, you can upgrade from pypi via::
+
+    pip install -U pytest
+
+Thanks to all who contributed to this release, among them:
+
+    Bruno Oliveira
+    Floris Bruynooghe
+    Punyashloka Biswal
+    Aron Curzon
+    Benjamin Peterson
+    Thomas De Schampheleire
+    Edison Gustavo Muenz
+    Holger Krekel
+
+Happy testing,
+The py.test Development Team
+
+2.7.2 (compared to 2.7.1)
+-----------------------------
+
+- fix issue767: pytest.raises value attribute does not contain the exception
+  instance on Python 2.6. Thanks Eric Siegerman for providing the test
+  case and Bruno Oliveira for PR.
+
+- Automatically create directory for junitxml and results log.
+  Thanks Aron Curzon.
+
+- fix issue713: JUnit XML reports for doctest failures.
+  Thanks Punyashloka Biswal.
+
+- fix issue735: assertion failures on debug versions of Python 3.4+
+  Thanks Benjamin Peterson.
+
+- fix issue114: skipif marker reports to internal skipping plugin;
+  Thanks Floris Bruynooghe for reporting and Bruno Oliveira for the PR.
+
+- fix issue748: unittest.SkipTest reports to internal pytest unittest plugin.
+  Thanks Thomas De Schampheleire for reporting and Bruno Oliveira for the PR.
+
+- fix issue718: failed to create representation of sets containing unsortable
+  elements in python 2. Thanks Edison Gustavo Muenz
+
+- fix issue756, fix issue752 (and similar issues): depend on py-1.4.29
+  which has a refined algorithm for traceback generation.
```
```diff
@@ -10,7 +10,7 @@ Pass different values to a test function, depending on command line options
 .. regendoc:wipe
 
 Suppose we want to write a test that depends on a command line option.
-Here is a basic pattern how to achieve this::
+Here is a basic pattern to achieve this::
 
     # content of test_sample.py
     def test_answer(cmdopt):
```
```diff
@@ -41,9 +41,9 @@ Let's run this without supplying our new option::
     F
     ================================= FAILURES =================================
     _______________________________ test_answer ________________________________
 
     cmdopt = 'type1'
 
         def test_answer(cmdopt):
             if cmdopt == "type1":
                 print ("first")
```
```diff
@@ -51,7 +51,7 @@ Let's run this without supplying our new option::
                 print ("second")
     >       assert 0 # to see what was printed
     E       assert 0
 
     test_sample.py:6: AssertionError
     --------------------------- Captured stdout call ---------------------------
     first
```
```diff
@@ -109,9 +109,9 @@ directory with the above conftest.py::
     $ py.test
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 0 items
 
     ============================= in 0.00 seconds =============================
 
 .. _`excontrolskip`:
```
```diff
@@ -154,13 +154,13 @@ and when running it will see a skipped "slow" test::
     $ py.test -rs    # "-rs" means report details on the little 's'
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 2 items
 
     test_module.py .s
     ========================= short test summary info ==========================
     SKIP [1] /tmp/doc-exec-162/conftest.py:9: need --runslow option to run
 
     =================== 1 passed, 1 skipped in 0.01 seconds ====================
 
 Or run it including the ``slow`` marked test::
```
```diff
@@ -168,11 +168,11 @@ Or run it including the ``slow`` marked test::
     $ py.test --runslow
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 2 items
 
     test_module.py ..
 
     ========================= 2 passed in 0.01 seconds =========================
 
 Writing well integrated assertion helpers
```
```diff
@@ -205,11 +205,11 @@ Let's run our little function::
     F
     ================================= FAILURES =================================
     ______________________________ test_something ______________________________
 
         def test_something():
     >       checkconfig(42)
     E       Failed: not configured: 42
 
     test_checkconfig.py:8: Failed
     1 failed in 0.02 seconds
```
```diff
@@ -260,10 +260,10 @@ which will add the string to the test header accordingly::
     $ py.test
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     project deps: mylib-1.1
     collected 0 items
 
     ============================= in 0.00 seconds =============================
 
 .. regendoc:wipe
```
```diff
@@ -284,11 +284,11 @@ which will add info only when run with "--v"::
     $ py.test -v
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1 -- /tmp/sandbox/pytest/.tox/regen/bin/python3.4
     rootdir: /tmp/doc-exec-162, inifile:
     info1: did you know that ...
     did you?
     collecting ... collected 0 items
 
     ============================= in 0.00 seconds =============================
 
 and nothing when run plainly::
```
```diff
@@ -296,9 +296,9 @@ and nothing when run plainly::
     $ py.test
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 0 items
 
     ============================= in 0.00 seconds =============================
 
 profiling test duration
```
```diff
@@ -329,11 +329,11 @@ Now we can profile which test functions execute the slowest::
     $ py.test --durations=3
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 3 items
 
     test_some_are_slow.py ...
 
     ========================= slowest 3 test durations =========================
     0.20s call     test_some_are_slow.py::test_funcslow2
     0.10s call     test_some_are_slow.py::test_funcslow1
```
```diff
@@ -391,20 +391,20 @@ If we run this::
     $ py.test -rx
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 4 items
 
     test_step.py .Fx.
 
     ================================= FAILURES =================================
     ____________________ TestUserHandling.test_modification ____________________
 
     self = <test_step.TestUserHandling object at 0x7ff60bbb83c8>
 
         def test_modification(self):
     >       assert 0
     E       assert 0
 
     test_step.py:9: AssertionError
     ========================= short test summary info ==========================
     XFAIL test_step.py::TestUserHandling::()::test_deletion
```
```diff
@@ -462,14 +462,14 @@ We can run this::
     $ py.test
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 7 items
 
     test_step.py .Fx.
     a/test_db.py F
     a/test_db2.py F
     b/test_error.py E
 
     ================================== ERRORS ==================================
     _______________________ ERROR at setup of test_root ________________________
     file /tmp/doc-exec-162/b/test_error.py, line 1
```
```diff
@@ -477,37 +477,37 @@ We can run this::
           fixture 'db' not found
           available fixtures: pytestconfig, capsys, recwarn, monkeypatch, tmpdir, capfd
           use 'py.test --fixtures [testpath]' for help on them.
 
     /tmp/doc-exec-162/b/test_error.py:1
     ================================= FAILURES =================================
     ____________________ TestUserHandling.test_modification ____________________
 
     self = <test_step.TestUserHandling object at 0x7f8ecd5b87f0>
 
         def test_modification(self):
     >       assert 0
     E       assert 0
 
     test_step.py:9: AssertionError
     _________________________________ test_a1 __________________________________
 
     db = <conftest.DB object at 0x7f8ecdc11470>
 
         def test_a1(db):
     >       assert 0, db  # to show value
     E       AssertionError: <conftest.DB object at 0x7f8ecdc11470>
     E       assert 0
 
     a/test_db.py:2: AssertionError
     _________________________________ test_a2 __________________________________
 
     db = <conftest.DB object at 0x7f8ecdc11470>
 
         def test_a2(db):
     >       assert 0, db  # to show value
     E       AssertionError: <conftest.DB object at 0x7f8ecdc11470>
     E       assert 0
 
     a/test_db2.py:2: AssertionError
     ========== 3 failed, 2 passed, 1 xfailed, 1 error in 0.05 seconds ==========
```
```diff
@@ -565,27 +565,27 @@ and run them::
     $ py.test test_module.py
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 2 items
 
     test_module.py FF
 
     ================================= FAILURES =================================
     ________________________________ test_fail1 ________________________________
 
     tmpdir = local('/tmp/pytest-22/test_fail10')
 
         def test_fail1(tmpdir):
     >       assert 0
     E       assert 0
 
     test_module.py:2: AssertionError
     ________________________________ test_fail2 ________________________________
 
         def test_fail2():
     >       assert 0
     E       assert 0
 
     test_module.py:4: AssertionError
     ========================= 2 failed in 0.02 seconds =========================
```
```diff
@@ -656,38 +656,38 @@ and run it::
     $ py.test -s test_module.py
     =========================== test session starts ============================
     platform linux -- Python 3.4.1 -- py-1.4.27 -- pytest-2.7.1
     rootdir: /tmp/doc-exec-162, inifile:
     collected 3 items
 
     test_module.py Esetting up a test failed! test_module.py::test_setup_fails
     Fexecuting test failed test_module.py::test_call_fails
     F
 
     ================================== ERRORS ==================================
     ____________________ ERROR at setup of test_setup_fails ____________________
 
         @pytest.fixture
         def other():
     >       assert 0
     E       assert 0
 
     test_module.py:6: AssertionError
     ================================= FAILURES =================================
     _____________________________ test_call_fails ______________________________
 
     something = None
 
         def test_call_fails(something):
     >       assert 0
     E       assert 0
 
     test_module.py:12: AssertionError
     ________________________________ test_fail2 ________________________________
 
         def test_fail2():
     >       assert 0
     E       assert 0
 
     test_module.py:15: AssertionError
     ==================== 2 failed, 1 error in 0.02 seconds =====================
```
```diff
@@ -1,7 +1,7 @@
 .. highlightlang:: python
 .. _`goodpractises`:
 
-Good Integration Practises
+Good Integration Practices
 =================================================
 
 Work with virtual environments
```
**setup.py** (10 changed lines)

```diff
@@ -31,12 +31,12 @@ def get_version():
 def has_environment_marker_support():
     """
     Tests that setuptools has support for PEP-426 environment marker support.
 
     The first known release to support it is 0.7 (and the earliest on PyPI seems to be 0.7.2
     so we're using that), see: http://pythonhosted.org/setuptools/history.html#id142
 
     References:
 
     * https://wheel.readthedocs.org/en/latest/index.html#defining-conditional-dependencies
     * https://www.python.org/dev/peps/pep-0426/#environment-markers
     """
@@ -48,7 +48,7 @@ def has_environment_marker_support():
 
 def main():
-    install_requires = ['py>=1.4.25']
+    install_requires = ['py>=1.4.29']
     extras_require = {}
     if has_environment_marker_support():
         extras_require[':python_version=="2.6" or python_version=="3.0" or python_version=="3.1"'] = ['argparse']
```
```diff
@@ -3,17 +3,19 @@ import sys
 
 pytest_plugins = "pytester",
 
-import os, py
+import os, py, gc
 
 class LsofFdLeakChecker(object):
     def get_open_files(self):
+        gc.collect()
         out = self._exec_lsof()
         open_files = self._parse_lsof_output(out)
         return open_files
 
     def _exec_lsof(self):
         pid = os.getpid()
-        return py.process.cmdexec("lsof -Ffn0 -p %d" % pid)
+        #return py.process.cmdexec("lsof -Ffn0 -p %d" % pid)
+        return py.process.cmdexec("lsof -p %d" % pid)
 
     def _parse_lsof_output(self, out):
         def isopen(line):
```
```diff
@@ -46,6 +46,7 @@ class TestRaises:
                 1/0
             print (excinfo)
             assert excinfo.type == ZeroDivisionError
+            assert isinstance(excinfo.value, ZeroDivisionError)
 
     def test_noraise(self):
         with pytest.raises(pytest.raises.Exception):
```
```diff
@@ -569,3 +569,39 @@ def test_AssertionError_message(testdir):
         *assert 0, (x,y)*
         *AssertionError: (1, 2)*
     """)
+
+@pytest.mark.skipif(PY3, reason='This bug does not exist on PY3')
+def test_set_with_unsortable_elements():
+    # issue #718
+    class UnsortableKey(object):
+        def __init__(self, name):
+            self.name = name
+
+        def __lt__(self, other):
+            raise RuntimeError()
+
+        def __repr__(self):
+            return 'repr({0})'.format(self.name)
+
+        def __eq__(self, other):
+            return self.name == other.name
+
+        def __hash__(self):
+            return hash(self.name)
+
+    left_set = set(UnsortableKey(str(i)) for i in range(1, 3))
+    right_set = set(UnsortableKey(str(i)) for i in range(2, 4))
+    expl = callequal(left_set, right_set, verbose=True)
+    # skip first line because it contains the "construction" of the set, which does not have a guaranteed order
+    expl = expl[1:]
+    dedent = textwrap.dedent("""
+    Extra items in the left set:
+    repr(1)
+    Extra items in the right set:
+    repr(3)
+    Full diff (fallback to calling repr on each item):
+    - repr(1)
+    repr(2)
+    + repr(3)
+    """).strip()
+    assert '\n'.join(expl) == dedent
```
```diff
@@ -171,6 +171,7 @@ def test_conftest_confcutdir(testdir):
     """))
     result = testdir.runpytest("-h", "--confcutdir=%s" % x, x)
     result.stdout.fnmatch_lines(["*--xyz*"])
+    assert 'warning: could not load initial' not in result.stdout.str()
 
 def test_conftest_existing_resultlog(testdir):
     x = testdir.mkdir("tests")
```
```diff
@@ -354,3 +354,19 @@ class TestDoctests:
         reprec = testdir.inline_run(p, "--doctest-modules",
                                     "--doctest-ignore-import-errors")
         reprec.assertoutcome(skipped=1, failed=1, passed=0)
+
+    def test_junit_report_for_doctest(self, testdir):
+        """
+        #713: Fix --junit-xml option when used with --doctest-modules.
+        """
+        p = testdir.makepyfile("""
+            def foo():
+                '''
+                >>> 1 + 1
+                3
+                '''
+                pass
+        """)
+        reprec = testdir.inline_run(p, "--doctest-modules",
+                                    "--junit-xml=junit.xml")
+        reprec.assertoutcome(failed=1)
```
```diff
@@ -474,6 +474,16 @@ def test_logxml_changingdir(testdir):
     assert result.ret == 0
     assert testdir.tmpdir.join("a/x.xml").check()
 
+def test_logxml_makedir(testdir):
+    """--junitxml should automatically create directories for the xml file"""
+    testdir.makepyfile("""
+        def test_pass():
+            pass
+    """)
+    result = testdir.runpytest("--junitxml=path/to/results.xml")
+    assert result.ret == 0
+    assert testdir.tmpdir.join("path/to/results.xml").check()
+
 def test_escaped_parametrized_names_xml(testdir):
     testdir.makepyfile("""
         import pytest
```
@@ -180,6 +180,21 @@ def test_generic(testdir, LineMatcher):
         "x *:test_xfail_norun",
     ])

+def test_makedir_for_resultlog(testdir, LineMatcher):
+    """--resultlog should automatically create directories for the log file"""
+    testdir.plugins.append("resultlog")
+    testdir.makepyfile("""
+        import pytest
+        def test_pass():
+            pass
+    """)
+    testdir.runpytest("--resultlog=path/to/result.log")
+    lines = testdir.tmpdir.join("path/to/result.log").readlines(cr=0)
+    LineMatcher(lines).fnmatch_lines([
+        ". *:test_pass",
+    ])
+
+
 def test_no_resultlog_on_slaves(testdir):
     config = testdir.parseconfig("-p", "resultlog", "--resultlog=resultlog")
@@ -396,7 +396,7 @@ class TestSkipif:


     def test_skipif_reporting(self, testdir):
-        p = testdir.makepyfile("""
+        p = testdir.makepyfile(test_foo="""
             import pytest
             @pytest.mark.skipif("hasattr(sys, 'platform')")
             def test_that():
@@ -404,7 +404,7 @@ class TestSkipif:
         """)
         result = testdir.runpytest(p, '-s', '-rs')
         result.stdout.fnmatch_lines([
-            "*SKIP*1*platform*",
+            "*SKIP*1*test_foo.py*platform*",
             "*1 skipped*"
         ])
         assert result.ret == 0
@@ -700,4 +700,17 @@ def test_issue333_result_clearing(testdir):
     reprec = testdir.inline_run()
     reprec.assertoutcome(failed=1)

+@pytest.mark.skipif("sys.version_info < (2,7)")
+def test_unittest_raise_skip_issue748(testdir):
+    testdir.makepyfile(test_foo="""
+        import unittest
+
+        class MyTestCase(unittest.TestCase):
+            def test_one(self):
+                raise unittest.SkipTest('skipping due to reasons')
+    """)
+    result = testdir.runpytest("-v", '-rs')
+    result.stdout.fnmatch_lines("""
+        *SKIP*[1]*test_foo.py*skipping due to reasons*
+        *1 skipped*
+    """)
tox.ini (81 lines changed)
@@ -1,84 +1,71 @@
 [tox]
 minversion=2.0
 distshare={homedir}/.tox/distshare
-envlist=flakes,py26,py27,py34,pypy,py27-pexpect,py33-pexpect,py27-nobyte,py33,py27-xdist,py33-xdist,py27-trial,py33-trial,doctesting,py27-cxfreeze
+envlist=
+    flakes,py26,py27,py33,py34,pypy,
+    {py27,py34}-{pexpect,xdist,trial},
+    py27-nobyte,doctesting,py27-cxfreeze

 [testenv]
 changedir=testing
-commands= py.test --lsof -rfsxX --junitxml={envlogdir}/junit-{envname}.xml []
+commands= py.test --lsof -rfsxX {posargs:testing}
 deps=
     nose
     mock

 [testenv:genscript]
 changedir=.
 commands= py.test --genscript=pytest1

 [testenv:flakes]
 changedir=
 deps = pytest-flakes>=0.2
 commands = py.test --flakes -m flakes _pytest testing

 [testenv:py27-xdist]
 changedir=.
 basepython=python2.7
 deps=pytest-xdist
     mock
     nose
 commands=
-  py.test -n1 -rfsxX \
-        --junitxml={envlogdir}/junit-{envname}.xml {posargs:testing}
+  py.test -n1 -rfsxX {posargs:testing}

-[testenv:py33-xdist]
-changedir=.
-basepython=python3.3
+[testenv:py34-xdist]
 deps={[testenv:py27-xdist]deps}
 commands=
-  py.test -n3 -rfsxX \
-        --junitxml={envlogdir}/junit-{envname}.xml testing
+  py.test -n3 -rfsxX testing

 [testenv:py27-pexpect]
 changedir=testing
 basepython=python2.7
 platform=linux|darwin
 deps=pexpect
 commands=
   py.test -rfsxX test_pdb.py test_terminal.py test_unittest.py

-[testenv:py33-pexpect]
+[testenv:py34-pexpect]
 changedir=testing
-basepython=python3.3
 platform=linux|darwin
 deps={[testenv:py27-pexpect]deps}
 commands=
   py.test -rfsxX test_pdb.py test_terminal.py test_unittest.py

 [testenv:py27-nobyte]
 changedir=.
 basepython=python2.7
 deps=pytest-xdist
 distribute=true
 setenv=
   PYTHONDONTWRITEBYTECODE=1
 commands=
-  py.test -n3 -rfsxX \
-        --junitxml={envlogdir}/junit-{envname}.xml {posargs:testing}
+  py.test -n3 -rfsxX {posargs:testing}

 [testenv:py27-trial]
 changedir=.
 basepython=python2.7
 deps=twisted
 commands=
-  py.test -rsxf \
-        --junitxml={envlogdir}/junit-{envname}.xml {posargs:testing/test_unittest.py}
+  py.test -rsxf {posargs:testing/test_unittest.py}

-[testenv:py33-trial]
-changedir=.
-basepython=python3.3
+[testenv:py34-trial]
+# py34-trial does not work
 platform=linux|darwin
 deps={[testenv:py27-trial]deps}
 commands=
-  py.test -rsxf \
-        --junitxml={envlogdir}/junit-{envname}.xml {posargs:testing/test_unittest.py}
+  py.test -rsxf {posargs:testing/test_unittest.py}

 [testenv:doctest]
 changedir=.
 commands=py.test --doctest-modules _pytest
 deps=

@@ -94,13 +81,11 @@ commands=
     make html

 [testenv:doctesting]
-basepython=python3.3
 changedir=doc/en
 deps=PyYAML
-commands= py.test -rfsxX --junitxml={envlogdir}/junit-{envname}.xml []
+commands= py.test -rfsxX {posargs}

 [testenv:regen]
 basepython=python3.4
 changedir=doc/en
 deps=sphinx
     PyYAML
@@ -109,28 +94,30 @@ commands=
     #pip install pytest==2.3.4
     make regen

-[testenv:py31]
-deps=nose>=1.0
-
-[testenv:py31-xdist]
-deps=pytest-xdist
-commands=
-  py.test -n3 -rfsxX \
-        --junitxml={envlogdir}/junit-{envname}.xml []
-
 [testenv:jython]
 changedir=testing
 commands=
-    {envpython} {envbindir}/py.test-jython \
-        -rfsxX --junitxml={envlogdir}/junit-{envname}2.xml []
+    {envpython} {envbindir}/py.test-jython -rfsxX {posargs}

 [testenv:py27-cxfreeze]
 changedir=testing/cx_freeze
 basepython=python2.7
 platform=linux|darwin
 commands=
   {envpython} install_cx_freeze.py
   {envpython} runtests_setup.py build --build-exe build
   {envpython} tox_run.py


+[testenv:coveralls]
+changedir=testing
+deps =
+    {[testenv]deps}
+    coveralls
+commands=
+    coverage run --source=_pytest {envdir}/bin/py.test
+    coverage report -m
+    coveralls
+passenv=COVERALLS_REPO_TOKEN

 [pytest]
 minversion=2.0
@@ -142,4 +129,4 @@ python_files=test_*.py *_test.py testing/*/*.py
 python_classes=Test Acceptance
 python_functions=test
 pep8ignore = E401 E225 E261 E128 E124 E302
-norecursedirs = .tox ja .hg
+norecursedirs = .tox ja .hg cx_freeze_source
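The new generative `envlist` relies on tox's factor syntax. As an illustrative sketch (not part of this changeset; settings here are hypothetical), each brace group multiplies out into separate environments, and factor-prefixed `deps` lines apply only to environments containing that factor:

```ini
[tox]
# {py27,py34}-{pexpect,xdist} expands to four environments:
#   py27-pexpect, py27-xdist, py34-pexpect, py34-xdist
envlist = {py27,py34}-{pexpect,xdist}

[testenv]
# factor-conditional dependencies: "pexpect:" lines install only
# in *-pexpect environments, "xdist:" lines only in *-xdist ones
deps =
    pexpect: pexpect
    xdist: pytest-xdist
commands = py.test -rfsxX {posargs:testing}
```

This is why the diff above can collapse the per-interpreter `py33-*`/`py27-*` sections: a single `[testenv]` block serves every generated environment.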