Tests

This page gives you an overview of all tests for this plugin.

Global conftest.py

Module: tests.conftest

Project global fixtures, plugins etc.

tests.conftest.pytest_plugins = ['pytester']

Load the fixture pytester for all tests. Although it is only needed for the plugin tests, this fixture must be loaded in the topmost conftest module.
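A minimal top-level conftest.py doing just that could look like this (a sketch, not necessarily the project's exact file):

```python
# conftest.py at the project root -- sketch
# Loading "pytester" here makes the fixture available to all test modules,
# including the plugin tests further down the tree.
pytest_plugins = ["pytester"]
```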

Standard Tests

Test for header cleaning

Module: tests.test_clean_headers

tests.test_clean_headers.test_header_cleaning(current_headers, replacing, expect_exception, expect_message, expect_result)

This test case mainly tests the _ptcsvp.parametrize.clean_headers() method.

There are many single test cases built with parametrization.

Parameters:
  • current_headers (List[str]) – List of headers before cleaning

  • replacing (Optional[Dict[str, str]]) – A replacement dictionary

  • expect_exception (Optional[Type[ValueError]]) – Exception to expect during method call

  • expect_message (Optional[str]) – Exception message to be expected

  • expect_result (List[str]) – Expected cleaned headers

Return type:

None
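Since clean_headers() is internal to the plugin, here is a hedged sketch of what such a header cleaning could do: apply the replacement dictionary, sanitize the remaining characters, and reject duplicates. The names and rules below are assumptions, not the plugin's actual implementation.

```python
import re
from typing import Dict, List, Optional


def clean_headers_sketch(
    current_headers: List[str], replacing: Optional[Dict[str, str]] = None
) -> List[str]:
    """Turn CSV headers into valid, unique argument names (illustrative only)."""
    replacing = replacing or {}
    cleaned = []
    for header in current_headers:
        # Apply an explicit replacement first, if one is configured
        header = replacing.get(header, header)
        # Replace everything that is not a valid identifier character,
        # and prefix a leading digit with an underscore
        header = re.sub(r"\W|^(?=\d)", "_", header)
        cleaned.append(header)
    if len(set(cleaned)) != len(cleaned):
        raise ValueError("Header names are not unique after cleaning")
    return cleaned
```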

Test for the internal parametrization feature

Module: tests.test_parametrize

tests.test_parametrize.test_parametrization(csv_file, id_col, result, ids, expect_exception, expect_message)

This test case mainly tests the internal _ptcsvp.parametrize.add_parametrization() method, which is the backbone of the public csv_params() decorator.

The test is heavily parametrized. See the source code for details.

Parameters:
  • csv_file (str) – CSV file for the test

  • id_col (Optional[str]) – The ID column name of the CSV file

  • result (Optional[Tuple[List[str], List[List[str]]]]) – Expected result, as it would be handed over to the pytest.mark.parametrize() mark decorator

  • ids (Optional[List[str]]) – Expected test case IDs

  • expect_exception (Optional[Type[Exception]]) – Expected exception during call

  • expect_message (Optional[str]) – Expected exception message during call

Return type:

None
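The result parameter mirrors the data shape that pytest.mark.parametrize() consumes: argument names plus value rows, with optional test IDs. A hypothetical example of that shape:

```python
# Hypothetical data shape, as described for the `result` parameter above
argnames = ["fruit", "amount"]
argvalues = [["apple", "3"], ["banana", "5"]]
ids = ["Order-1", "Order-2"]

result = (argnames, argvalues)

# In a test module, this would be consumed roughly as:
#   @pytest.mark.parametrize(argnames, argvalues, ids=ids)
#   def test_fruit(fruit, amount): ...

# Every value row must match the number of argument names
assert all(len(row) == len(argnames) for row in argvalues)
```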

Test the reading of CSV files

Module: tests.test_read_csv

tests.test_read_csv.test_csv_reader(csv_file, base_dir, expect_lines, expect_exception, expect_message)

This test case tests several CSV reading scenarios (by parametrization). CSV test files are in the tests/assets folder. The tests target the _ptcsvp.parametrize.read_csv() method.

Parameters:
  • csv_file (str) – The file to test with

  • base_dir (Optional[str]) – The base dir to load the csv_file from

  • expect_lines (Optional[int]) – Expected read lines from the CSV file

  • expect_exception (Optional[Type[Exception]]) – Expected exception when running the method

  • expect_message (Optional[str]) – Expected exception message when running the method

Return type:

None

Test the header name handling

Module: tests.test_varname

The tests in this module aim at testing the validation and cleaning of header/column names of CSV files. Those names serve as arguments to test methods, and must therefore be valid and must not shadow builtin names. Reserved names are also checked.

tests.test_varname.test_310_names(name)

There are a few names that are not valid when using Python 3.10 and above. This parametrized test checks that they are marked as invalid by the _ptcsvp.varname.is_valid_name() method.

This test is skipped on Python versions below 3.10.

Parameters:

name (str) – A name that is invalid since Python 3.10

Return type:

None
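Python 3.10 introduced the soft keywords match, case and _, which are presumably the names this test targets. A sketch of a validity check that rejects them (assumed behavior, not the plugin's exact code):

```python
import keyword


def is_valid_name_sketch(name: str) -> bool:
    """Check whether a name can safely serve as a test argument (illustrative)."""
    return (
        name.isidentifier()
        and not keyword.iskeyword(name)
        # Soft keywords like "match" and "case" (Python >= 3.10)
        and name not in getattr(keyword, "softkwlist", [])
    )
```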

tests.test_varname.test_is_valid_name(var_name, is_valid)

This test case checks that the method _ptcsvp.varname.is_valid_name() returns the correct results. The test method is parametrized.

Parameters:
  • var_name (str) – The name to check

  • is_valid (bool) – Expectation if this is a valid name

Return type:

None

tests.test_varname.test_make_name_valid(var_name, valid_var_name, raises_error)

This test case checks that the method _ptcsvp.varname.make_name_valid() builds valid names, or raises a matching exception if that is not possible. It is therefore parametrized.

Parameters:
  • var_name (str) – The variable name to try to make valid

  • valid_var_name (Optional[str]) – The name as expected after made valid

  • raises_error (bool) – Expect an error?

Return type:

None
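A hedged sketch of how such a name repair could work (the replacement rules are assumptions):

```python
import re


def make_name_valid_sketch(var_name: str) -> str:
    """Try to turn an arbitrary string into a valid identifier (illustrative)."""
    # Replace invalid characters with underscores
    candidate = re.sub(r"\W", "_", var_name)
    # Prefix a leading digit with an underscore
    if candidate and candidate[0].isdigit():
        candidate = f"_{candidate}"
    if not candidate.isidentifier():
        raise ValueError(f"Cannot make a valid name from {var_name!r}")
    return candidate
```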

Test the checks for required versions

Module: tests.test_version_check

Checking the versions of the software this plugin depends on is crucial for its correct functioning.

tests.test_version_check.build_version(p_version)

Test helper method: Build a version tuple from a given version string. It is used by the test_python_version() test case.

Parameters:

p_version (str) – Version string

Return type:

Tuple[Union[int, str], ...]

Returns:

The version as tuple
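A sketch of such a helper, following the common pattern of splitting on dots and keeping non-numeric parts as strings:

```python
from typing import Tuple, Union


def build_version_sketch(p_version: str) -> Tuple[Union[int, str], ...]:
    """Build a version tuple from a version string (illustrative only)."""
    # Numeric parts become ints, anything else (e.g. "0rc1") stays a string
    return tuple(
        int(part) if part.isdigit() else part for part in p_version.split(".")
    )
```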

tests.test_version_check.test_pytest_version(mocker, p_version, expect_error)

Test if the pytest version is correctly recognized and if an outdated version raises an exception. This test focuses on the _ptcsvp.version.check_pytest_version() method and is parametrized with many different version strings.

This test uses mocking.

Parameters:
  • mocker (MockerFixture) – Mocking fixture

  • p_version (str) – Version string

  • expect_error (Optional[Tuple[Type[Exception], str]]) – Expected error and error message

Return type:

None

tests.test_version_check.test_python_version(mocker, p_version, expect_error)

Test if the Python version is correctly recognized and if old versions raise an exception. This mainly tests the _ptcsvp.version.check_python_version() method and is parametrized with many different version combinations.

This test uses mocking, and the build_version() helper method.

Parameters:
  • mocker (MockerFixture) – Mocking fixture

  • p_version (str) – Version string

  • expect_error (Optional[Tuple[Type[Exception], str]]) – Expected error for the given version

Return type:

None

Plugin Tests

These tests exercise the plugin code by installing the plugin into a test pytest instance.

Plugin conftest.py

Module: tests.plugin.conftest

Local configuration and fixtures for the plugin tests

tests.plugin.conftest.bad_test_csv()

Test fixture: Bad CSV

Return type:

str

Returns:

Bad CSV data as string

tests.plugin.conftest.get_csv(csv)

Helper method: Read a CSV file from the test assets directory tests/plugin/assets.

Parameters:

csv (str) – Name of the CSV file, without the .csv extension

Return type:

str

Returns:

CSV data as string

tests.plugin.conftest.install_plugin_locally(pytestconfig)

Auto-use test fixture to install our plugin into the test environment, so that it can be used with the pytester fixture. The package is removed automatically after the test session.

Parameters:

pytestconfig (Config) – Fixture from pytest that contains the test configuration

Return type:

Generator[None, None, None]

Returns:

An empty generator (from the yield) to let the tests run and clean up afterwards

tests.plugin.conftest.simple_fruit_test()

Test Fixture: Template of a simple test case

Return type:

Union[Callable[[str], str], Callable[[str, str], str]]

Returns:

A method into which a data file name can be filled, and which returns a valid pytest test case that can be saved to a .py file

tests.plugin.conftest.simple_test_csv()

Test fixture: Good simple CSV

Return type:

str

Returns:

CSV data as string

tests.plugin.conftest.simple_text_test()

Test Fixture: Template of a simple text test case

Return type:

Callable[[str], str]

Returns:

A method into which a data file name can be filled, and which returns a valid pytest test case that can be saved to a .py file

tests.plugin.conftest.text_test_csv()

Test Fixture: Text-only CSV

Return type:

str

Returns:

Text-only CSV data as string

Tests

Command line argument handling

Module: tests.plugin.test_cmd_line

tests.plugin.test_cmd_line.test_base_dir_param(pytester, base_dir, simple_test_csv, simple_fruit_test)

Test if the --csv-params-base-dir command line argument is evaluated. For simplicity, it uses a minimal parametrization.

Return type:

None

tests.plugin.test_cmd_line.test_help(pytester)

Test that the pytest help now contains our command line argument with our help text.

Parameters:

pytester (Pytester) – Pytester fixture

Return type:

None

Plugin Calls

Module: tests.plugin.test_plugin

tests.plugin.test_plugin.test_plugin_all_tests_at_once(pytester, text_test_csv, bad_test_csv, simple_test_csv, simple_fruit_test, simple_text_test)

This is a meta test to check that multiple files also work. Basically, it is a combination of all the other plugin invocation tests in the module tests.plugin.test_plugin.

We can’t include the erroring test here, because it would stop all tests.

Return type:

None

tests.plugin.test_plugin.test_plugin_test_error(pytester, bad_test_csv, simple_fruit_test)

Test if a test error is correctly recognized

Return type:

None

tests.plugin.test_plugin.test_plugin_test_multiplication(pytester, simple_test_csv, simple_fruit_test)

Test a simple round trip (positive test case)

Return type:

None

tests.plugin.test_plugin.test_plugin_test_text_shorthand(pytester, text_test_csv, simple_text_test)

Test the shorthand version of the plugin’s decorator

Return type:

None

POC Tests

POC conftest.py

Module: tests.poc.conftest

Local configuration and fixture providing for POC tests

class tests.poc.conftest.CheapCounter

A simple, cheap counter that is required for counting test executions

counter: Dict[str, int] = {}

classmethod get_value(counter)

Get the value of the counter

Parameters:

counter (str) – Name of the counter

Return type:

int

Returns:

Value of the counter

classmethod increment(counter)

Increment the value of the counter

Parameters:

counter (str) – Name of the counter to increment

Return type:

None

tests.poc.conftest.cheap_counter()

Deliver a simple counter as fixture

Return type:

Type[CheapCounter]

Returns:

The Cheap Counter Class
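Based on the documented API, the counter can be sketched as a class with a class-level dictionary:

```python
from typing import Dict


class CheapCounterSketch:
    """A minimal execution counter (sketch of the documented class)."""

    counter: Dict[str, int] = {}

    @classmethod
    def increment(cls, counter: str) -> None:
        # Missing counters implicitly start at zero
        cls.counter[counter] = cls.counter.get(counter, 0) + 1

    @classmethod
    def get_value(cls, counter: str) -> int:
        return cls.counter.get(counter, 0)
```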

Tests

Pytest feature: Parametrization

Module: tests.poc.test_parametrize_with_generator

We are making heavy use of a pytest feature: parametrization. These tests make sure this feature still works as expected.

Tests in this module run in a predefined order!

tests.poc.test_parametrize_with_generator.data_generator()

Helper method: Create test data, but keep it as a generator

This helper is used by test_2_generator_parametrize().

Return type:

Generator[List[str], None, None]

Returns:

A bunch of test data as generator
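A sketch of such a generator; the interesting property is that pytest must materialize it exactly once at collection time:

```python
from typing import Generator, List


def data_generator_sketch() -> Generator[List[str], None, None]:
    """Yield test data rows lazily (illustrative only)."""
    for index in range(3):
        yield [f"a{index}", f"b{index}", f"c{index}"]


# pytest.mark.parametrize() accepts such a generator directly, e.g.:
#   @pytest.mark.parametrize("val_a,val_b,val_c", data_generator_sketch())
```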

tests.poc.test_parametrize_with_generator.test_1_simple_parametrize(val_a, val_b, val_c, cheap_counter)

Test the simple parametrization from pytest.

Return type:

None

tests.poc.test_parametrize_with_generator.test_2_generator_parametrize(val_a, val_b, val_c, cheap_counter)

Test the generator parametrization from pytest.

Return type:

None

tests.poc.test_parametrize_with_generator.test_3_evaluation(cheap_counter)

Evaluate the values of the cheap_counter() fixture.

Parameters:

cheap_counter (Type[CheapCounter]) – Fixture cheap_counter()

Return type:

None

Examples

Example Code for a test case for the documentation site

Module: tests.test_docs_example

This is the test code for the documentation site’s User guide. It contains everything that’s needed to follow the example – and makes sure the code example is working.

tests.test_docs_example.get_dimensions(dimensions_str)

Read the dimensions from a string. A helper method to build the dimensions tuple.

Parameters:

dimensions_str (str) – The dimensions from the CSV file

Raises:

ValueError – When the dimensions cannot be converted

Return type:

Tuple[int, int, int]

Returns:

The dimensions as int tuple
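A sketch of such a parser, assuming a "30x50x70"-style input format (the exact format used in the CSV file is an assumption):

```python
from typing import Tuple


def get_dimensions_sketch(dimensions_str: str) -> Tuple[int, int, int]:
    """Parse a dimensions string like "30x50x70" into a tuple (illustrative)."""
    parts = dimensions_str.split("x")
    if len(parts) != 3:
        raise ValueError(f"Cannot convert dimensions: {dimensions_str!r}")
    try:
        width, height, depth = (int(part.strip()) for part in parts)
    except ValueError as error:
        raise ValueError(f"Cannot convert dimensions: {dimensions_str!r}") from error
    return width, height, depth
```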

tests.test_docs_example.get_smallest_possible_container(number_of_items, dimensions_of_item, available_container_sizes=(1000, 2500, 7500))

This is the method to be tested. It searches for the smallest possible container after calculating the volume of the things to be loaded into the container. A container can only contain items of one product, so it is enough to know about the size of a single product, and how many of them need to be stored in a container.

The method raises a ValueError when the items do not fit any container.

Parameters:
  • number_of_items (int) – Number of items to be packed

  • dimensions_of_item (Tuple[int, int, int]) – Edge lengths of a single item

  • available_container_sizes (Union[List[int], Tuple[int, ...]]) – What container sizes are available? This parameter has a default value (1000, 2500, 7500).

Raises:

ValueError – When no matching container can be found

Return type:

int
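Following the description above, a hedged sketch: compute the total volume of all items, then pick the smallest container that holds it (the units and the exact fitting rule are assumptions):

```python
from typing import List, Tuple, Union


def smallest_container_sketch(
    number_of_items: int,
    dimensions_of_item: Tuple[int, int, int],
    available_container_sizes: Union[List[int], Tuple[int, ...]] = (1000, 2500, 7500),
) -> int:
    """Find the smallest container that fits all items (illustrative only)."""
    width, height, depth = dimensions_of_item
    required_volume = number_of_items * width * height * depth
    # Containers hold only one product, so the total volume decides
    for size in sorted(available_container_sizes):
        if size >= required_volume:
            return size
    raise ValueError("No container size matches")
```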

tests.test_docs_example.test_get_smallest_possible_container(number_of_items, dimensions_of_item, expected_container_size, expect_exception, expected_message)

This is the test method for the documentation.

Parameters:
  • number_of_items (int) – The number of items

  • dimensions_of_item (Tuple[int, int, int]) – The dimensions of a single item

  • expected_container_size (int) – Expected container size

  • expect_exception (bool) – Expect a ValueError

  • expected_message (str) – Message of the exception

Return type:

None

Example Code for a blog post on juergen.rocks

Module: tests.test_blog_example

This is a test example for a blog post on juergen.rocks.

The example consists of several helper methods and a lot of configuration for the csv_params() decorator.

The CSV file looks like this:

"Order-Ref #", "Anz. Schrauben-Päck.", "Dim. Schrauben-Päck.", "Anz. Scheiben-Päck.", "Dim. Scheiben-Päck.", "Volumen Container"
"221-12-A-24", "670", "30 x 50 x 70 mm", "150", "40 x 50 x 70 mm", "1 m³"
"281-13-C-15", "5000", "30 x 50 x 70 mm", "10000", "40 x 50 x 70 mm", "5 m³"
"281-13-C-76", "50000", "35 x 55 x 75 mm", "5000", "50 x 60 x 90 mm", "10 m³"

You find the CSV file in tests/assets/blog-example.csv.

tests.test_blog_example.get_container_volume(container_size)

Get the container volume in mm³ (removing the unit).

Helper method, will be used as a data caster.

Parameters:

container_size (str) – String from the CSV file.

Return type:

int

Returns:

Volume of the container in mm³

Raises:

ValueError: When the test data cannot be converted

tests.test_blog_example.get_volume(size_data)

Get the volume from size data, return it as mm³.

Helper method, will be used as a data caster.

Parameters:

size_data (str) – String from the CSV file.

Return type:

int

Returns:

Volume in mm³

Raises:

ValueError: When the test data cannot be converted
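Based on the CSV sample above, the two casters can be sketched as follows (assuming 1 m³ = 1 000 000 000 mm³; the real helpers may differ):

```python
def get_container_volume_sketch(container_size: str) -> int:
    """Cast a value like "1 m³" into mm³ (illustrative only)."""
    value = container_size.replace("m³", "").strip()
    # int() raises ValueError when the test data cannot be converted
    return int(value) * 1_000_000_000


def get_volume_sketch(size_data: str) -> int:
    """Cast a value like "30 x 50 x 70 mm" into a volume in mm³ (illustrative)."""
    width, height, depth = (
        int(part) for part in size_data.replace("mm", "").strip().split("x")
    )
    return width * height * depth
```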

tests.test_blog_example.test_does_it_fit(anz_schrauben, vol_schrauben, anz_scheiben, vol_scheiben, vol_container)

A test example that tries to figure out if all the Schrauben and Scheiben fit in the container, and if the smallest possible container is chosen.

Parameters:
  • anz_schrauben (int) – Number of Schraubenpäckchen

  • vol_schrauben (int) – Volume (mm³) of a single Schraubenpäckchen

  • anz_scheiben (int) – Number of Scheibenpäckchen

  • vol_scheiben (int) – Volume (mm³) of a single Scheibenpäckchen

  • vol_container (int) – Volume (mm³) of the selected container

Return type:

None

Example Code for a test case for the README.md documentation

Module: tests.test_complex_example

This module contains a quite simple, yet elaborately configured test to show what’s possible with the plugin.

The example uses this CSV data, as found under tests/assets/example.csv:

"Test ID","Bananas shipped","Single Banana Weight","Apples shipped","Single Apple Weight","Container Size"
"Order-7","1503","0.5","2545","0.25","1500"
"Order-15","101","0.55","1474","0.33","550"

The test idea here is much the same as the tests.test_blog_example test case.

Why is such a test case here? That’s simple: to make sure the code samples in the documentation still work as designed.

tests.test_complex_example.test_container_size_is_big_enough(bananas_shipped, banana_weight, apples_shipped, apple_weight, container_size)

This is just an example test case for the documentation.

Parameters:
  • bananas_shipped (int) – How many bananas were shipped?

  • banana_weight (float) – What’s the weight of one banana?

  • apples_shipped (int) – How many apples were shipped?

  • apple_weight (float) – What’s the weight of one apple?

  • container_size (int) – How large was the container?

Return type:

None
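The test idea can be illustrated with the first row of the CSV data above ("Order-7"): the total shipped weight must not exceed the container size. A hedged sketch of that check (the real test's assertion logic is an assumption):

```python
def container_is_big_enough_sketch(
    bananas_shipped: int,
    banana_weight: float,
    apples_shipped: int,
    apple_weight: float,
    container_size: int,
) -> bool:
    """Check whether the shipped fruit fits the container (illustrative)."""
    total_weight = bananas_shipped * banana_weight + apples_shipped * apple_weight
    return total_weight <= container_size


# First CSV row ("Order-7"): 1503 * 0.5 + 2545 * 0.25 = 1387.75 <= 1500
```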