Parametrizing test with multiple fixtures that accept arguments [duplicate]


Question


I am trying to test a math function that I wrote. I would like to supply data to it from a number of different fixtures. The issue is that all of the fixtures accept different fixture parameters of their own.

The test that I run is always the same (test_myfunc in the example), and the fixtures that I want to plug into it all have the same compatible return values (clean_data and noisy_data in the code). I would therefore like to "chain" these two fixtures together so that one or the other will provide inputs to the test.

Here is what the setup looks like:

import numpy as np
import pytest
from scipy import stats

def myfunc(x, y):
    return True

_noises = {
    'normal': lambda scale, n: np.random.normal(scale=scale, size=n),
    'uniform': lambda scale, n: np.random.uniform(-scale, scale, size=n),
    'triangle': lambda scale, n: np.random.triangular(-scale, 0, scale, size=n),
}

@pytest.fixture(params=[10**x for x in range(1, 4)])
def x_data(request):
    """ Run the test on a few different densities """
    return np.linspace(-10, 10, request.param)

@pytest.fixture(params=[0, 1, 0xABCD, 0x1234])
def random_seed(request):
    """ Run the test for a bunch of datasets, but reporoducibly """
    np.random.seed(request.param)

@pytest.fixture(params=np.arange(0.5, 5.5, 0.5))
def shape(request):
    """ Run the test with a bunch of different curve shapes """
    return request.param

@pytest.fixture()
def clean_data(x_data, shape):
    """ Get a datset with no noise """
    return shape, stats.gamma.pdf(x_data, shape)

@pytest.fixture(params=["triangle", "uniform", "normal"])
def noisy_data(request, clean_data, random_seed):
    shape, base = clean_data
    noise = _noises[request.param](10, base.shape)
    return shape, base + noise

# `data` is a placeholder for whichever fixture (clean_data or noisy_data) supplies the y-values
def test_myfunc(x_data, data):
    shape, y_data = data
    assert myfunc(x_data, y_data)

The reason I am using so many fixtures is that I want to run the complete matrix of tests, with the ability to enable, disable, or xfail any of them at will (a sketch of per-parameter marking follows below).
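As an aside, that kind of per-parameter control can be expressed with pytest.param marks inside a fixture's params list. A minimal sketch, not part of the original setup, which reuses the shape fixture name purely for illustration (the chosen values and reasons are hypothetical):

import pytest

# Minimal sketch: individual parameter values can carry marks, so single
# entries in the test matrix can be skipped or xfailed without touching
# the test function itself.
@pytest.fixture(params=[
    0.5,
    pytest.param(1.0, marks=pytest.mark.skip(reason="temporarily disabled")),
    pytest.param(5.0, marks=pytest.mark.xfail(reason="known bad shape")),
])
def shape(request):
    """ Run the test with a few curve shapes, some individually marked """
    return request.param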

Since the clean_data and noisy_data fixtures return the same type of result, I would like to be able to use both of them for my test, one after the other. How do I run a single test with multiple fixtures that accept arguments?

If possible, I would like to avoid test generation. I am familiar with the idea of indirectly parametrizing the test, for example as in Running the same test on two different fixtures. I have tried to create a meta-fixture that can execute the y-data providers by name:

@pytest.fixture()
def data(request):
    """ Get the appropriate datset based on the request """
    return request.getfuncargvalue(request.param)

@pytest.mark.parametrize('data', ['clean_data', 'noisy_data'], indirect=True)
def test_myfunc(x_data, data):
    shape, y_data = data
    assert myfunc(x_data, y_data)

When I run the tests with

pytest -v pytest-parametrized.py

I get a slew of errors, all of which point to the fact that the indirectly requested fixtures require parameters that aren't supplied:

_________________ ERROR at setup of test_myfunc[10-clean_data] _________________

self = <_pytest.python.CallSpec2 object at 0x7f8a4ff06518>, name = 'shape'

    def getparam(self, name):
        try:
>           return self.params[name]
E           KeyError: 'shape'

/usr/lib/python3.6/site-packages/_pytest/python.py:684: KeyError

During handling of the above exception, another exception occurred:

self = <SubRequest 'clean_data' for <Function 'test_myfunc[10-clean_data]'>>
fixturedef = <FixtureDef name='shape' scope='function' baseid='pytest-parametrized.py' >

    def _compute_fixture_value(self, fixturedef):
        """
            Creates a SubRequest based on "self" and calls the execute method of the given fixturedef object. This will
            force the FixtureDef object to throw away any previous results and compute a new fixture value, which
            will be stored into the FixtureDef object itself.

            :param FixtureDef fixturedef:
            """
        # prepare a subrequest object before calling fixture function
        # (latter managed by fixturedef)
        argname = fixturedef.argname
        funcitem = self._pyfuncitem
        scope = fixturedef.scope
        try:
>           param = funcitem.callspec.getparam(argname)

/usr/lib/python3.6/site-packages/_pytest/fixtures.py:484: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <_pytest.python.CallSpec2 object at 0x7f8a4ff06518>, name = 'shape'

    def getparam(self, name):
        try:
            return self.params[name]
        except KeyError:
            if self._globalparam is NOTSET:
>               raise ValueError(name)
E               ValueError: shape

/usr/lib/python3.6/site-packages/_pytest/python.py:687: ValueError

During handling of the above exception, another exception occurred:

request = <SubRequest 'data' for <Function 'test_myfunc[10-clean_data]'>>

    @pytest.fixture()
    def data(request):
        """ Get the appropriate datset based on the request """
>       return request.getfuncargvalue(request.param)

pytest-parametrized.py:55: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.6/site-packages/_pytest/fixtures.py:439: in getfuncargvalue
    return self.getfixturevalue(argname)
/usr/lib/python3.6/site-packages/_pytest/fixtures.py:430: in getfixturevalue
    return self._get_active_fixturedef(argname).cached_result[0]
/usr/lib/python3.6/site-packages/_pytest/fixtures.py:455: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
/usr/lib/python3.6/site-packages/_pytest/fixtures.py:526: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
/usr/lib/python3.6/site-packages/_pytest/fixtures.py:778: in execute
    fixturedef = request._get_active_fixturedef(argname)
/usr/lib/python3.6/site-packages/_pytest/fixtures.py:455: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <SubRequest 'clean_data' for <Function 'test_myfunc[10-clean_data]'>>
fixturedef = <FixtureDef name='shape' scope='function' baseid='pytest-parametrized.py' >

    def _compute_fixture_value(self, fixturedef):
        """
            Creates a SubRequest based on "self" and calls the execute method of the given fixturedef object. This will
            force the FixtureDef object to throw away any previous results and compute a new fixture value, which
            will be stored into the FixtureDef object itself.

            :param FixtureDef fixturedef:
            """
        # prepare a subrequest object before calling fixture function
        # (latter managed by fixturedef)
        argname = fixturedef.argname
        funcitem = self._pyfuncitem
        scope = fixturedef.scope
        try:
            param = funcitem.callspec.getparam(argname)
        except (AttributeError, ValueError):
            param = NOTSET
            param_index = 0
            if fixturedef.params is not None:
                frame = inspect.stack()[3]
                frameinfo = inspect.getframeinfo(frame[0])
                source_path = frameinfo.filename
                source_lineno = frameinfo.lineno
                source_path = py.path.local(source_path)
                if source_path.relto(funcitem.config.rootdir):
                    source_path = source_path.relto(funcitem.config.rootdir)
                msg = (
                    "The requested fixture has no parameter defined for the "
                    "current test.\n\nRequested fixture '{0}' defined in:\n{1}"
                    "\n\nRequested here:\n{2}:{3}".format(
                        fixturedef.argname,
                        getlocation(fixturedef.func, funcitem.config.rootdir),
                        source_path,
                        source_lineno,
                    )
                )
>               fail(msg)
E               Failed: The requested fixture has no parameter defined for the current test.
E               
E               Requested fixture 'shape' defined in:
E               pytest-parametrized.py:27
E               
E               Requested here:
E               /usr/lib/python3.6/site-packages/_pytest/fixtures.py:526

/usr/lib/python3.6/site-packages/_pytest/fixtures.py:506: Failed

If supplying the missing parameters somehow is the answer, that's great, but I don't want to ask the question that way because I think I may be running into a massive XY problem here.


Answer 1:


Passing fixtures as parameters in the parametrize marker is not supported by pytest; see issue #349, Using fixtures in pytest.mark.parametrize, for more details. When I need to parametrize a test with fixtures, I usually resort to creating an auxiliary fixture that accepts all of the candidate fixtures and then parametrizing that auxiliary fixture indirectly in the test. Your example would thus become:

@pytest.fixture
def data(request, clean_data, noisy_data):
    type = request.param
    if type == 'clean':
        return clean_data
    elif type == 'noisy':
        return noisy_data
    else:
        raise ValueError('unknown type')

@pytest.mark.parametrize('data', ['clean', 'noisy'], indirect=True)
def test_myfunc(x_data, data):
    shape, y_data = data
    assert myfunc(x_data, y_data)
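Note that the data fixture above requests both clean_data and noisy_data, so both datasets (together with the full noise-type and seed matrix behind noisy_data) are set up for every test, including the 'clean' cases; this keeps the fixture simple at the cost of some redundant setup and extra test variants.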


Source: https://stackoverflow.com/questions/52307317/parametrizing-test-with-multiple-fixtures-that-accept-arguments
