pytest

pytest to insert caplog fixture in test method

◇◆丶佛笑我妖孽 submitted on 2019-12-22 08:13:04
Question: I have the following test class for pytest:

    class TestConnection(AsyncTestCase):
        '''Integration test'''

        @gen_test
        def test_connecting_to_server(self):
            '''Connecting to the TCP server'''
            client = server = None
            try:
                sock, port = bind_unused_port()
                with NullContext():
                    server = EchoServer()
                    server.add_socket(sock)
                client = IOStream(socket.socket())
                #### HERE I WANT TO HAVE THE caplog FIXTURE
                with ExpectLog(app_log, '.*decode.*'):
                    yield client.connect(('localhost', port))
                    yield client.write(b'hello
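The caplog fixture is function-scoped and cannot be requested as an argument of a unittest-style test method such as those on Tornado's AsyncTestCase. A common workaround is an autouse fixture that stashes caplog on the test instance. A minimal sketch, using plain unittest.TestCase as a stand-in for AsyncTestCase; the logger name and message are illustrative:

```python
import logging
import unittest

import pytest


class TestConnection(unittest.TestCase):  # stand-in for tornado's AsyncTestCase
    @pytest.fixture(autouse=True)
    def inject_caplog(self, caplog):
        # pytest applies autouse fixtures even to unittest-style classes,
        # so the caplog fixture can be attached to the instance here
        self.caplog = caplog

    def test_decode_warning_is_logged(self):
        logging.getLogger("app").warning("could not decode payload")
        assert any("decode" in r.message for r in self.caplog.records)
```

Inside the original test body, self.caplog.records could then be inspected alongside (or instead of) ExpectLog.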

Using a command-line option in a pytest skip-if condition

Deadly submitted on 2019-12-22 07:58:25
Question: Long story short, I want to be able to skip some tests if the session is being run against our production API. The environment that the tests are run against is set with a command-line option. I came across the idea of using pytest_namespace to track global variables, so I set that up in my conftest.py file:

    def pytest_namespace():
        return {'global_env': ''}

I take in the command-line option and set various API URLs (from a config.ini file) in a fixture in conftest.py:

    @pytest.fixture
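Note that pytest_namespace was later deprecated and removed. A more durable pattern for skipping tests based on a command-line option is to register the option and apply a skip marker at collection time. A sketch for conftest.py; the option name --env, the marker name skip_in_production, and the environment values are assumptions, not from the question:

```python
import pytest


def pytest_addoption(parser):
    # register the command-line option that selects the target environment
    parser.addoption("--env", action="store", default="staging",
                     help="environment the tests are run against")


def pytest_configure(config):
    # declare the marker so pytest --strict-markers does not complain
    config.addinivalue_line(
        "markers", "skip_in_production: skip this test against the production API")


def pytest_collection_modifyitems(config, items):
    if config.getoption("--env") != "production":
        return
    skip_prod = pytest.mark.skip(reason="not run against the production API")
    for item in items:
        if "skip_in_production" in item.keywords:
            item.add_marker(skip_prod)
```

Individual tests would then opt in with @pytest.mark.skip_in_production.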

How to mock an imported object with pytest-mock or MagicMock

穿精又带淫゛_ submitted on 2019-12-22 05:23:29
Question: I am trying to understand the mock/monkeypatch/pytest-mock capabilities. Let me know if this is possible; if not, could you please suggest how I can test this code. My code structure:

    /
    ./app
    ../__init__.py
    ../some_module1
    .../__init__.py
    ../some_module2
    .../__init__.py
    ./tests
    ../test_db.py

The /app/__init__.py is where my application (a Flask application, if it helps) is started, along with initializing a database connection object to a MongoDB database:

    # ...
    def create_app():
        # ...
        return
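Whatever the exact layout, the usual technique is to patch the object in the namespace where it is looked up, not where it is defined. A self-contained sketch; the Database class, db object, and get_user function are stand-ins for the real MongoDB connection built in app/__init__.py:

```python
from unittest import mock


class Database:  # stand-in for the real MongoDB connection object
    def find_user(self, name):
        raise RuntimeError("would hit the real database")


db = Database()  # module-level object, like the one created in create_app()


def get_user(name):
    # application code that uses the shared connection object
    return db.find_user(name)


# in a test, replace the method for the duration of the with-block;
# the original is restored automatically on exit
with mock.patch.object(Database, "find_user", return_value={"name": "alice"}):
    assert get_user("alice") == {"name": "alice"}
```

pytest-mock exposes the same machinery as mocker.patch.object, with the patch undone automatically at test teardown instead of at the end of a with-block.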

py.test patch on fixture

对着背影说爱祢 submitted on 2019-12-22 05:07:55
Question: I use the following to mock constant values for a test with py.test:

    @patch('ConstantsModule.ConstantsClass.DELAY_TIME', 10)
    def test_PowerUp():
        ...
        thing = Thing.Thing()
        assert thing.a == 1

This mocks DELAY_TIME as used in both the test and in Thing, which is what I expected. I wanted to do this for all the tests in this file, so I tried:

    @patch('ConstantsModule.ConstantsClass.DELAY_TIME', 10)
    @pytest.fixture(autouse=True)
    def NoDelay():
        pass

But that doesn't seem to have the same effect.

With py.test, database is not reset after LiveServerTestCase

假如想象 submitted on 2019-12-22 05:01:32
Question: I have a number of Django tests and typically run them using py.test. I recently added a new test case in a new file, test_selenium.py. This test case uses the LiveServerTestCase and StaticLiveServerTestCase classes (which is a first for me; usually I use just TestCase). Adding this new batch of tests in this new file has caused subsequent tests to start failing in py.test (when before they all passed). It appears that the database is not being "reset" after the LiveServerTestCase
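For context: LiveServerTestCase inherits from TransactionTestCase, which cleans up by flushing (truncating) tables rather than rolling back a transaction, so data loaded by migrations or initial fixtures can be missing for tests that run afterwards. Django's serialized_rollback option asks it to snapshot that data and restore it after the flush; Django's own runner also orders plain TestCase tests before TransactionTestCase tests for this reason. A sketch, with a local stand-in class so it runs without Django installed:

```python
class LiveServerTestCase:  # stand-in for django.test.LiveServerTestCase
    serialized_rollback = False


class SeleniumTests(LiveServerTestCase):
    # ask Django to serialize the database state and restore it after the
    # post-test flush, so later tests see the expected data
    serialized_rollback = True
```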

How to debug py.test in PyCharm when coverage is enabled

妖精的绣舞 submitted on 2019-12-22 01:51:35
Question: How do I debug py.test in PyCharm when coverage is enabled? Coverage is enabled using --cov=project --cov-report=term-missing; with that removed, breakpoints are hit. Versions: PyCharm 5.0.3, pytest==2.8.5, pytest-cache==1.0, pytest-cov==2.2.0, pytest-pep8==1.0.6, pytest-xdist==1.13.1, python-coveralls==2.6.0. (Thanks to jon for advice on further diagnosing the issue.)

Answer 1: There is now a flag in py.test to disable coverage which you can activate when running tests from PyCharm. The flag to use
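The flag the answer refers to is --no-cov from pytest-cov: appended after the existing coverage options it disables coverage for that run, so a PyCharm debug configuration can keep the usual options and still hit breakpoints. A sketch of the "Additional Arguments" for the debug run configuration (the project name is the one from the question):

```
--cov=project --cov-report=term-missing --no-cov
```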

pytest (py.test) very slow startup in cygwin

倖福魔咒の submitted on 2019-12-22 00:05:45
Question: In Cygwin, py.test starts up very slowly. It does not look like a collection issue, for two reasons: the same test starts up quickly in Linux, and sometimes, if I rerun the same test quickly enough in Cygwin, it starts up in less than 1 second. Running it through the time command shows that it starts up in either 0.4 seconds or 11.7 seconds when supplied with the --collect-only option to avoid running the actual tests. I also added a print to the hooks pytest_configure() and pytest_ignore_collect

In pytest, how to skip or xfail certain fixtures?

和自甴很熟 submitted on 2019-12-21 09:23:20
Question: I have a heavily-fixtured test function which fails (as it should) with certain fixture inputs. How can I indicate this? This is what I'm doing now; maybe there's a better way. I'm pretty new to py.test, so I'd appreciate any tips. The next part is all the input fixtures. FYI, example_datapackage_path is defined in conftest.py:

    @pytest.fixture(params=[None, 'pooled_col', 'phenotype_col'])
    def metadata_key(self, request):
        return request.param

    @pytest.fixture(params=[None, 'feature_rename_col']
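One way to flag particular fixture parameters as expected failures is pytest.param(..., marks=...) inside the params list: every test that receives that parameter value is then xfailed (or skipped, with pytest.mark.skip) automatically. A sketch reusing the parameter names from the question; the reason string and the test body are illustrative:

```python
import pytest


@pytest.fixture(params=[
    None,
    "pooled_col",
    # every test that receives this parameter is marked as an expected failure
    pytest.param("phenotype_col",
                 marks=pytest.mark.xfail(reason="known-bad input")),
])
def metadata_key(request):
    return request.param


def test_metadata(metadata_key):
    assert metadata_key != "phenotype_col"
```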
