I'm trying to use TDD (test-driven development) with pytest. pytest will not print to the console when I use print.
Use the -s option:
pytest -s
From the docs:
During test execution any output sent to stdout and stderr is captured. If a test or a setup method fails, its corresponding captured output will usually be shown along with the failure traceback.
pytest has the option --capture=method, where method is the per-test capturing method and can be one of the following: fd, sys or no. pytest also has the option -s, which is a shortcut for --capture=no, and this is the option that will allow you to see your print statements in the console.
pytest --capture=no # show print statements in console
pytest -s # equivalent to previous command
There are two ways in which pytest can perform capturing:

file descriptor (FD) level capturing (default): All writes going to the operating system file descriptors 1 and 2 will be captured.

sys level capturing: Only writes to Python files sys.stdout and sys.stderr will be captured. No capturing of writes to file descriptors is performed.
pytest -s # disable all capturing
pytest --capture=sys # replace sys.stdout/stderr with in-mem files
pytest --capture=fd # also point filedescriptors 1 and 2 to temp file
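To see the practical difference between the two modes, here is an illustrative test of my own (the file name and the os.write call are not from the original question):

# test_capture_demo.py -- illustrative sketch only
import os

def test_two_capture_layers():
    # goes through sys.stdout, so both --capture=sys and --capture=fd catch it
    print("via sys.stdout")
    # writes straight to OS file descriptor 1, bypassing sys.stdout;
    # only --capture=fd (the default) catches this, with --capture=sys
    # it still reaches the terminal
    os.write(1, b"via file descriptor 1\n")

With pytest --capture=sys test_capture_demo.py the os.write line should leak through to the terminal, while the default fd capturing hides both unless the test fails or -s is given.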
According to the pytest docs, pytest --capture=sys should work. If you want to capture standard out inside a test, refer to the capsys fixture.
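For completeness, here is a minimal capsys sketch (the test body and the printed text are invented for illustration):

def test_prints_greeting(capsys):
    print("hello, world")
    out, err = capsys.readouterr()   # everything captured up to this point
    assert out == "hello, world\n"   # stdout that pytest collected
    assert err == ""                 # nothing was written to stderr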
I originally came here to find out how to make PyTest print in VSCode's console while running/debugging the unit test from there. This can be done with the following launch.json configuration, assuming .venv is the virtual environment folder.
"version": "0.2.0",
"configurations": [
{
"name": "PyTest",
"type": "python",
"request": "launch",
"stopOnEntry": false,
"pythonPath": "${config:python.pythonPath}",
"module": "pytest",
"args": [
"-sv"
],
"cwd": "${workspaceRoot}",
"env": {},
"envFile": "${workspaceRoot}/.venv",
"debugOptions": [
"WaitOnAbnormalExit",
"WaitOnNormalExit",
"RedirectOutput"
]
}
]
}
Using the -s option will print the output of all functions, which may be too much.

If you need particular output, the doc page you mentioned offers a few suggestions:
Insert assert False, "dumb assert to make PyTest print my stuff" at the end of your function, and you will see your output due to the failed test.
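For instance, a tiny sketch of that first suggestion (the loop is invented just to produce some output):

def test_show_my_stuff():
    for i in range(5):
        print(i)   # normally swallowed by pytest's output capturing
    # deliberately fail so pytest reports the captured output above
    assert False, "dumb assert to make PyTest print my stuff"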
You have a special object passed to you by PyTest, and you can write the output into a file to inspect it later, like

def test_good1(capsys):
    for i in range(5):
        print(i)
    out, err = capsys.readouterr()
    open("err.txt", "w").write(err)
    open("out.txt", "w").write(out)
You can open the out.txt and err.txt files in separate tabs and let the editor automatically refresh them for you, or do a simple py.test; cat out.txt shell command to run your test.

That is a rather hackish way to do stuff, but maybe it is the stuff you need: after all, TDD means you mess with stuff and leave it clean and silent when it's ready :-).
I needed to print an important warning about skipped tests exactly when PyTest muted literally everything.

I didn't want to fail a test to send a signal, so I did a hack as follows:
import sys

def test_2_YellAboutBrokenAndMutedTests():
    import atexit

    def report():
        # C_patch is the author's own module; tidy_text() re-wraps the message
        print(C_patch.tidy_text("""
        In silent mode PyTest breaks low level stream structure I work with, so
        I cannot test if my functionality work fine. I skipped corresponding tests.
        Run `py.test -s` to make sure everything is tested."""))

    # only register the warning when pytest has replaced stdout, i.e. capturing is on
    if sys.stdout != sys.__stdout__:
        atexit.register(report)
The atexit module allows me to print stuff after PyTest has released the output streams. The output looks as follows:
============================= test session starts ==============================
platform linux2 -- Python 2.7.3, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
rootdir: /media/Storage/henaro/smyth/Alchemist2-git/sources/C_patch, inifile:
collected 15 items
test_C_patch.py .....ssss....s.
===================== 10 passed, 5 skipped in 0.15 seconds =====================
In silent mode PyTest breaks low level stream structure I work with, so
I cannot test if my functionality work fine. I skipped corresponding tests.
Run `py.test -s` to make sure everything is tested.
~/.../sources/C_patch$
The message is printed even when PyTest is in silent mode, and is not printed if you run stuff with py.test -s, so everything is tested nicely already.
By default, py.test captures the result of standard out so that it can control how it prints it out. If it didn't do this, it would spew out a lot of text without the context of what test printed that text.
However, if a test fails, it will include a section in the resulting report that shows what was printed to standard out in that particular test.
For example,
def test_good():
    for i in range(1000):
        print(i)

def test_bad():
    print('this should fail!')
    assert False
Results in the following output:
>>> py.test tmp.py
============================= test session starts ==============================
platform darwin -- Python 2.7.6 -- py-1.4.20 -- pytest-2.5.2
plugins: cache, cov, pep8, xdist
collected 2 items
tmp.py .F
=================================== FAILURES ===================================
___________________________________ test_bad ___________________________________
def test_bad():
print('this should fail!')
> assert False
E assert False
tmp.py:7: AssertionError
------------------------------- Captured stdout --------------------------------
this should fail!
====================== 1 failed, 1 passed in 0.04 seconds ======================
Note the Captured stdout
section.
If you would like to see print statements as they are executed, you can pass the -s flag to py.test. However, note that this can sometimes be difficult to parse.
>>> py.test tmp.py -s
============================= test session starts ==============================
platform darwin -- Python 2.7.6 -- py-1.4.20 -- pytest-2.5.2
plugins: cache, cov, pep8, xdist
collected 2 items
tmp.py 0
1
2
3
... and so on ...
997
998
999
.this should fail!
F
=================================== FAILURES ===================================
___________________________________ test_bad ___________________________________
def test_bad():
print('this should fail!')
> assert False
E assert False
tmp.py:7: AssertionError
====================== 1 failed, 1 passed in 0.02 seconds ======================