How to skip a pytest using an external fixture?


It seems py.test doesn't use test fixtures when evaluating the expression for skipif. In your example, test_ios actually passes because the expression compares the fixture function platform found in the module's namespace to the string "ios"; that evaluates to False, so the test is executed and succeeds. If pytest were inserting the fixture value into the evaluation as you expect, that test would have been skipped.
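
For reference, here is a minimal reconstruction of the kind of test from the question (names are assumptions, not the asker's exact code):

import pytest

@pytest.fixture
def platform():
    return "ios"

# The string condition is evaluated against the module globals, where
# `platform` is the fixture function object, not "ios", so the condition is
# False and the test runs anyway.
@pytest.mark.skipif("platform == 'ios'", reason="not supported on ios")
def test_ios(platform):
    assert platform == "ios"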

A solution to your problem (though not to your question) is to implement an autouse fixture that inspects markers on the tests and skips them accordingly:

# conftest.py
import pytest

@pytest.fixture
def platform():
    return "ios"

@pytest.fixture(autouse=True)
def skip_by_platform(request, platform):
    marker = request.node.get_closest_marker('skip_platform')
    if marker and marker.args[0] == platform:
        pytest.skip('skipped on this platform: {}'.format(platform))

A key point is the autouse parameter, which makes the fixture automatically apply to every test. Your tests can then mark which platforms to skip like this:

@pytest.mark.skip_platform('ios')
def test_ios(platform, request):
    assert 0, 'should be skipped' 

Hope that helps!

Taking inspiration from an answer to another SO question, I am using the following approach, which works well:

import pytest

@pytest.fixture(scope='session')
def requires_something(request):
    something = 'a_thing'
    if request.param != something:
        pytest.skip(f"Test requires {request.param} but environment has {something}")


@pytest.mark.parametrize('requires_something', ('something_else',), indirect=True)
def test_indirect(requires_something):
    print("Executing test: test_indirect")

I had a similar problem and I don't know if this is still relevant for you, but I might have found a workaround that would do what you want.

The idea is to extend the MarkEvaluator class and override the _getglobals method to force the fixture values into the globals used by the evaluator:

conftest.py

import pytest
from _pytest.skipping import MarkEvaluator

class ExtendedMarkEvaluator(MarkEvaluator):
    def _getglobals(self):
        d = super()._getglobals()
        d.update(self.item._request._fixture_values)
        return d

Then add a pytest_runtest_call hook:

def pytest_runtest_call(item):
    evalskipif = ExtendedMarkEvaluator(item, "skipif_call")
    if evalskipif.istrue():
        pytest.skip('[CANNOT RUN]' + evalskipif.getexplanation())
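
On recent pytest versions you would typically also register the custom marker so it does not trigger unknown-marker warnings; a minimal sketch, assuming pytest >= 3.5:

def pytest_configure(config):
    config.addinivalue_line(
        "markers",
        "skipif_call(expr): skip the test if expr, evaluated with fixture values in scope, is true",
    )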

Then you can use the skipif_call marker in your test cases:

test_example.py

import pytest

class Machine:
    def __init__(self, state):
        self.state = state

@pytest.fixture
def myfixture(request):
    return Machine("running")

@pytest.mark.skipif_call('myfixture.state != "running"')
def test_my_fixture_running_success(myfixture):
    print(myfixture.state)
    myfixture.state = "stopped"
    assert True

@pytest.mark.skipif_call('myfixture.state != "running"')
def test_my_fixture_running_fail(myfixture):
    print(myfixture.state)
    assert False

@pytest.mark.skipif_call('myfixture.state != "stopped"')
def test_my_fixture_stopped_success(myfixture):
    print(myfixture.state)
    myfixture.state = "running"

@pytest.mark.skipif_call('myfixture.state != "stopped"')
def test_my_fixture_stopped_fail(myfixture):
    print(myfixture.state)
    assert False

Run

pytest -v --tb=line
============================= test session starts =============================
[...]
collected 4 items

test_example.py::test_my_fixture_running_success PASSED
test_example.py::test_my_fixture_running_fail FAILED
test_example.py::test_my_fixture_stopped_success PASSED
test_example.py::test_my_fixture_stopped_fail FAILED

================================== FAILURES ===================================
C:\test_example.py:21: assert False
C:\test_example.py:31: assert False
===================== 2 failed, 2 passed in 0.16 seconds ======================

Problem

Unfortunately, this works only once per expression: MarkEvaluator caches evaluation results keyed on the expression string, so the next time the same expression is evaluated, the cached result is returned instead of re-evaluating it against the current fixture values.

Solution

The expression is evaluated in the _istrue method, and there is no way to configure the evaluator to avoid caching. The workaround is to override _istrue so it does not use the cached_eval function:

class ExtendedMarkEvaluator(MarkEvaluator):
    def _getglobals(self):
        d = super()._getglobals()
        d.update(self.item._request._fixture_values)
        return d

    def _istrue(self):
        if self.holder:
            import _pytest._code
            self.result = False
            for expr in self.holder.args:
                self.expr = expr
                d = self._getglobals()
                # Non-cached eval so the expression is re-evaluated against
                # the current fixture values on every call
                exprcode = _pytest._code.compile(expr, mode="eval")
                result = eval(exprcode, d)
                if result:
                    self.result = True
                    self.reason = expr
                    break
            return self.result
        return False

Run

pytest -v --tb=line
============================= test session starts =============================
[...]
collected 4 items

test_example.py::test_my_fixture_running_success PASSED
test_example.py::test_my_fixture_running_fail SKIPPED
test_example.py::test_my_fixture_stopped_success PASSED
test_example.py::test_my_fixture_stopped_fail SKIPPED

===================== 2 passed, 2 skipped in 0.10 seconds =====================

Now the tests are skipped because the current value of 'myfixture' is used each time the expression is evaluated.

Hope it helps.

Cheers

Alex

The solution from Bruno Oliveira works, but for newer pytest (>= 3.5.0) you also need to register the marker with pytest_configure:


# conftest.py
import pytest

@pytest.fixture
def platform():
    return "ios"

@pytest.fixture(autouse=True)
def skip_by_platform(request, platform):
    marker = request.node.get_closest_marker('skip_platform')
    if marker and marker.args[0] == platform:
        pytest.skip('skipped on this platform: {}'.format(platform))

def pytest_configure(config):
    config.addinivalue_line(
        "markers", "skip_platform(platform): skip test for the given platform",
    )

Use:

@pytest.mark.skip_platform('ios')
def test_ios(platform, request):
    assert 0, 'should be skipped' 