Using a command-line option in a pytest skip-if condition

Anonymous (unverified), submitted 2019-12-03 01:39:01

Question:

Long story short, I want to be able to skip some tests if the session is being run against our production API. The environment that the tests are run against is set with a command-line option.

I came across the idea of using the pytest_namespace to track global variables, so I set that up in my conftest.py file.

    def pytest_namespace():
        return {'global_env': ''}

I take in the command line option and set various API urls (from a config.ini file) in a fixture in conftest.py.

    @pytest.fixture(scope='session', autouse=True)
    def configInfo(pytestconfig):
        global data
        environment = pytestconfig.getoption('--ENV')
        print(environment)
        environment = str.lower(environment)

        pytest.global_env = environment

        config = configparser.ConfigParser()
        config.read('config.ini')  # local config file
        configData = config['QA-CONFIG']
        if environment == 'qa':
            configData = config['QA-CONFIG']
        if environment == 'prod':
            configData = config['PROD-CONFIG']
        (...)

Then I've got the test I want to skip, and it's decorated like so:

    @pytest.mark.skipif(pytest.global_env in 'prod',
                        reason="feature not in Prod yet")

However, whenever I run the tests against prod, they don't get skipped. I did some fiddling around, and found that:

a) the global_env variable is accessible through another fixture

    @pytest.fixture(scope="session", autouse=True)
    def mod_header(request):
        log.info('\n-----\n| ' + pytest.global_env + ' |\n-----\n')

displays correctly in my logs

b) the global_env variable is accessible in a test, correctly logging the env.

c) pytest_namespace is deprecated

So, I'm assuming this has to do with when the skipif accesses that global_env vs. when the fixtures do in the test session. I also find it non-ideal to use a deprecated functionality.

My question is:

  • how do I get a value from the pytest command line option into a skipif?
  • Is there a better way to be trying this than the pytest_namespace?

Answer 1:

It looks like the right way to control skipping of tests according to a command-line option is to mark the tests as skipped dynamically:

  1. Add an option using the pytest_addoption hook:

    def pytest_addoption(parser):
        parser.addoption(
            "--runslow", action="store_true", default=False, help="run slow tests"
        )

  2. Use the pytest_collection_modifyitems hook to add the marker:

    def pytest_collection_modifyitems(config, items):
        if config.getoption("--runslow"):
            # --runslow given on the CLI: do not skip slow tests
            return
        skip_slow = pytest.mark.skip(reason="need --runslow option to run")
        for item in items:
            if "slow" in item.keywords:
                item.add_marker(skip_slow)

  3. Add the mark to your test:

    @pytest.mark.slow
    def test_func_slow():
        pass
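The logic of the collection hook above can be exercised standalone; a minimal sketch that mimics what pytest_collection_modifyitems does (FakeConfig and FakeItem are illustrative stand-ins, not real pytest objects):

```python
class FakeConfig:
    """Stand-in for pytest's config object."""
    def __init__(self, runslow):
        self._runslow = runslow

    def getoption(self, name):
        assert name == "--runslow"
        return self._runslow


class FakeItem:
    """Stand-in for a collected test item."""
    def __init__(self, keywords):
        self.keywords = keywords
        self.markers = []

    def add_marker(self, marker):
        self.markers.append(marker)


def modify_items(config, items, skip_marker="skip(need --runslow option to run)"):
    # Mirrors the hook: with --runslow given, leave every item alone;
    # otherwise attach a skip marker to items marked "slow".
    if config.getoption("--runslow"):
        return
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_marker)


items = [FakeItem({"slow"}), FakeItem(set())]
modify_items(FakeConfig(runslow=False), items)
print([len(i.markers) for i in items])  # → [1, 0]: only the slow test is skipped
```

With runslow=True, no marker is added and both tests run.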

If you want to use data from the CLI in a test (for example, credentials), it is enough to specify the skip option when retrieving it from pytestconfig:

  1. Add an option using the pytest_addoption hook:

    def pytest_addoption(parser):
        parser.addoption(
            "--credentials", action="store", default=None, help="credentials to ..."
        )

  2. Use the skip option when getting the value from pytestconfig:

    @pytest.fixture(scope="session")
    def super_secret_fixture(pytestconfig):
        credentials = pytestconfig.getoption('--credentials', skip=True)
        ...

  3. Use the fixture as usual in your test:

    def test_with_fixture(super_secret_fixture):
        ...

In this case you will get something like the following if you do not pass the --credentials option on the CLI: Skipped: no 'credentials' option found
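A rough standalone sketch of what getoption(..., skip=True) does under the hood (the real implementation raises pytest's internal Skipped exception; a plain exception class stands in here):

```python
class Skipped(Exception):
    """Stand-in for pytest's internal Skipped outcome exception."""


def getoption(known_options, name, skip=False):
    # Simplified: pytest looks the option up and, with skip=True,
    # skips the requesting test when no value was supplied.
    value = known_options.get(name)
    if value is None and skip:
        raise Skipped(f"no {name.lstrip('-')!r} option found")
    return value


try:
    getoption({}, '--credentials', skip=True)
except Skipped as e:
    print(e)  # → no 'credentials' option found

# When the option was passed, the value comes through unchanged:
print(getoption({'--credentials': 'user:pass'}, '--credentials', skip=True))
```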

It is better to use _pytest.config.get_config instead of the deprecated pytest.config if you still want to use pytest.mark.skipif, like this:

    @pytest.mark.skipif(not _pytest.config.get_config().getoption('--credentials'),
                        reason="--credentials was not specified")



Answer 2:

The problem with putting global code in fixtures is that markers are evaluated before fixtures, so when skipif is evaluated, configInfo hasn't run yet and pytest.global_env will be empty. I'd suggest moving the configuration code from the fixture to the pytest_configure hook:

    # conftest.py
    import configparser

    import pytest


    def pytest_addoption(parser):
        parser.addoption('--ENV')


    def pytest_configure(config):
        environment = config.getoption('--ENV')
        pytest.global_env = environment
        ...

The configuration hook is guaranteed to execute before the tests are collected and the markers are evaluated.
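One more subtlety in the question's decorator: the condition pytest.global_env in 'prod' is a substring check on strings, not an equality check, so it also matches partial and empty values. A quick illustration:

```python
# 'in' on strings tests for a substring, so a condition like
# pytest.global_env in 'prod' behaves in surprising ways:
print('prod' in 'prod')  # True
print('pr' in 'prod')    # True: a partial value matches
print('' in 'prod')      # True: an unset/empty env matches too!

# An equality check expresses the intent directly:
env = 'qa'
print(env == 'prod')     # False
```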

Is there a better way to be trying this than the pytest_namespace?

Some ways I know of:

  1. Simply assign a module variable in pytest_configure (pytest.foo = 'bar', like I did in the example above).
  2. Use the config object as it is shared throughout the test session:

    def pytest_configure(config):
        config.foo = 'bar'

    @pytest.fixture
    def somefixture(pytestconfig):
        assert pytestconfig.foo == 'bar'

    def test_foo(pytestconfig):
        assert pytestconfig.foo == 'bar'

    Outside of the fixtures/tests, you can access the config via pytest.config, for example:

    @pytest.mark.skipif(pytest.config.foo == 'bar', reason='foo is bar')
    def test_baz():
        ...
  3. Use caching; this has the additional benefit of persisting data between test runs:

    def pytest_configure(config):
        config.cache.set('foo', 'bar')

    @pytest.fixture
    def somefixture(pytestconfig):
        assert pytestconfig.cache.get('foo', None)

    def test_foo(pytestconfig):
        assert pytestconfig.cache.get('foo', None)

    @pytest.mark.skipif(pytest.config.cache.get('foo', None) == 'bar', reason='foo is bar')
    def test_baz():
        assert True

When using 1. or 2., make sure you don't unintentionally overwrite pytest stuff with your own data; prefixing your own variables with a unique name is a good idea. When using caching, you don't have this problem.
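pytest's cache is backed by JSON files on disk (under .pytest_cache), which is what makes values survive between runs. A simplified standalone sketch of the get/set semantics (MiniCache is an illustrative stand-in, not pytest's actual implementation):

```python
import json
import pathlib
import tempfile


class MiniCache:
    """Simplified stand-in for config.cache: values persist as JSON files."""

    def __init__(self, root):
        self.root = pathlib.Path(root)

    def set(self, key, value):
        path = self.root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(json.dumps(value))

    def get(self, key, default):
        path = self.root / key
        try:
            return json.loads(path.read_text())
        except FileNotFoundError:
            return default


root = tempfile.mkdtemp()
cache = MiniCache(root)
cache.set('foo', 'bar')

# A fresh instance (standing in for a later test run) still sees the value:
print(MiniCache(root).get('foo', None))   # → bar
print(cache.get('missing', 'fallback'))   # → fallback
```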


