
It is natural to write my test in terms of three separate numpy arrays, but the first dimension of each array must have the same length. As a hack, I can simply ask for one larger numpy array:

import numpy as np
from hypothesis import given
from hypothesis.strategies import floats, integers, tuples
from hypothesis.extra.numpy import arrays, array_shapes

@given(
    arrays=arrays(
        dtype=float,
        shape=tuples(
            integers(3, 3),
            array_shapes(max_dims=1).map(lambda t: t[0]),
            array_shapes(max_dims=1).map(lambda t: t[0]),
        ),
        elements=floats(width=16, allow_nan=False, allow_infinity=False),
    ),
)
def test(arrays: np.ndarray):
    a, b, c = arrays[0], arrays[1], arrays[2]
    ...

but this obscures what I'm really trying to generate, and makes it impossible to use a separate element strategy for each of the arrays. Is there any way to generate these arrays while maintaining the constraint on the size of the first dimension? I imagine I would want something like:

@given(
  (a, b, c) = batched_arrays(
    n_arrays=3,
    shared_sizes=array_sizes(max_dims=1),
    unshared_sizes=array_sizes(),
    dtypes=[float, int, float],
    elements=[floats(), integers(0), floats(0, 1)])
)
def test(a: np.ndarray, b: np.ndarray, c: np.ndarray):
  assert a.shape[0] == b.shape[0] and a.shape[0] == c.shape[0]
  ...

1 Answer


Sorry to answer my own question. It turns out you can get this with shared:

from hypothesis import given
from hypothesis.strategies import shared
from hypothesis.extra.numpy import arrays, array_shapes

@given(
    # Both shape strategies share key="dim1", so they draw the same 1-D shape.
    a=arrays(float, shared(array_shapes(max_dims=1), key="dim1")),
    b=arrays(float, shared(array_shapes(max_dims=1), key="dim1")),
)
def test_shared(a, b):
    assert a.shape[0] == b.shape[0]
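
For the original three-array case, the same trick can be applied to just the first dimension instead of the whole shape, so each array keeps its own dtype, extra dimensions, and element strategy. Below is a minimal sketch, assuming recent versions of hypothesis and numpy; the dimension and element bounds and the names first_dim and test_batched are arbitrary choices for illustration:

import numpy as np
from hypothesis import given
from hypothesis.strategies import floats, integers, shared, tuples
from hypothesis.extra.numpy import arrays

# Every strategy built from first_dim draws the same value within a single
# test case, because they all use the same shared key.
first_dim = shared(integers(1, 10), key="first_dim")

@given(
    a=arrays(np.float64, first_dim.map(lambda n: (n,)),
             elements=floats(allow_nan=False, allow_infinity=False)),
    b=arrays(np.int64, tuples(first_dim, integers(1, 5)),
             elements=integers(0, 100)),
    c=arrays(np.float64, first_dim.map(lambda n: (n,)),
             elements=floats(0, 1)),
)
def test_batched(a, b, c):
    assert a.shape[0] == b.shape[0] == c.shape[0]

Sharing an integer rather than a full shape is what lets each array vary independently beyond the first axis, which is roughly what the imagined batched_arrays above was asking for.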