Testing Code

A guide for testing code prior to submitting a pull request.

Testing LamAna comes in two flavors:

  1. Unit-testing with nose
  2. Regression testing of API with Jupyter or runipy

Testing code with nose

The current testing utility is nose. From the root directory, you can run all test files prefixed with “test_” by running:

$ nosetests

There are three types of tests contained in the lamana source directory:

  1. module tests: general test files located in the “./tests” directory
  2. model tests: test files specific to custom models, located in “./models/tests”
  3. controls: .csv files located in “./tests/controls_LT”

Model tests are separated to support an extensible design for author contributions. This design enables authors to submit models and their tests together in a single pull request to the standard module directory.

Tests for the utils module write and remove temporary files in a root directory called “export”. If this directory does not exist, it is created. These tests check that writing and reading files are consistent. Temporary files are prefixed with “temp” and should be removed by the test functions themselves.
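The write-then-read consistency check that those tests perform can be sketched with the standard library alone. This is an illustrative pattern only, not LamAna’s actual test code; the file names and data here are made up:

```python
import csv
import os
import tempfile

def roundtrip_csv(rows, prefix="temp"):
    """Write rows to a temporary .csv file, read them back, and clean up.

    Returns the rows as read, so a test can assert they match the input.
    """
    # mkstemp stands in for the "export" directory the real tests use;
    # the "temp" prefix mirrors their naming convention.
    fd, path = tempfile.mkstemp(prefix=prefix, suffix=".csv")
    os.close(fd)
    try:
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(rows)
        with open(path, newline="") as f:
            return [row for row in csv.reader(f)]
    finally:
        os.remove(path)                     # tests remove their own temp files

rows = [["layer", "stress"], ["1", "0.25"]]
assert roundtrip_csv(rows) == rows          # read data matches written data
```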

Note

The locations for tests may change in future releases.

Control files

LamAna maintains .csv files with expected data for different laminate configurations. These files are tested by the test_controls module, which reads each control file and parses information such as the number of layers, the number of points per layer, and the geometry. Control files are named after these variables.

Control files can be created manually, but it may be simpler to generate a starter file and then edit it. This process can be expedited for multiple files by passing LaminateModels into the utils.tools.write_csv() function. This function creates a .csv file for every LaminateModel, which can be altered as desired and then tested by copying it into the “lamana/tests/controls_LT” folder.
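To illustrate how variables can be recovered from a control file’s name, here is a minimal parsing sketch. The filename format shown is hypothetical (the real naming scheme is defined by test_controls and may differ):

```python
import re

def parse_control_name(filename):
    """Extract (nplies, points, geometry) from a hypothetical control-file
    name such as '5ply_p4_400-200-800.csv'."""
    # <layers>ply_p<points>_<geometry>.csv -- an assumed encoding
    match = re.match(r"(\d+)ply_p(\d+)_([\d.-]+)\.csv$", filename)
    if match is None:
        raise ValueError("unrecognized control file name: %s" % filename)
    nplies, points, geometry = match.groups()
    return int(nplies), int(points), geometry

assert parse_control_name("5ply_p4_400-200-800.csv") == (5, 4, "400-200-800")
```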

Coverage

We use the following tools and commands to assess test coverage. nose-cov helps to combine coverage reports for sub-packages automatically. The remaining flags report missing lines for the source directory.

$ pip install coverage nose-cov
$ nosetests --with-cov --cov lamana

or

$ nosetests --with-cov --cov-report term-missing --cov lamana

LamAna aims for the highest “reasonable” coverage for core modules. A separate approach must be developed for testing output_, as plots are tricky to test fully.

Regression Tests

Prior to a release, it is fitting to run API regression tests on any demonstration notebooks in a development virtual environment on the release branch (see docs/demo.ipynb). These notebooks run code using the lamana package. If any notebook cell fails, a regression has occurred and must be resolved before release.

Testing Dependency Regression

Dependency changes are beyond a package maintainer’s control. If a dependency fails to install, the package may fail as well. However, a successfully deployed package often relies on a number of components working:

  • the package has minimal bugs
  • dependencies do not conflict
  • independent deprecations in dependencies do not break the package
  • the package manager (e.g. pip) can resolve dependencies
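One lightweight guard against the deprecation point above is to compare installed dependency versions against pinned minimums before doing real work. The helpers below are illustrative only (LamAna does not ship them); the version-string handling is a simplified sketch, not a full PEP 440 parser:

```python
import itertools

def version_tuple(version):
    """'1.10.2' -> (1, 10, 2); stops at the first non-numeric segment,
    so '1.10.2rc1' -> (1, 10, 2)."""
    parts = []
    for piece in version.split("."):
        digits = "".join(itertools.takewhile(str.isdigit, piece))
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)

def meets_minimum(installed, minimum):
    """True if an installed version string satisfies a pinned minimum."""
    return version_tuple(installed) >= version_tuple(minimum)

assert meets_minimum("1.10.2", "1.9")      # tuples compare numerically, not as strings
assert not meets_minimum("0.17.1", "0.18")
```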

Testing in a development environment is very different from testing a package installed from PyPI. The development environment may have a number of sub-dependencies that are cross-required by other packages. In other words, on another system devoid of such a setup, installation behavior may vary dramatically and possibly break an installation.

To catch this type of bug, particularly for notebook regression testing, we need to make a clean environment with minimal dependencies that rarely change and is fairly consistent between release cycles. We say “fairly consistent” because, with backports and other sub-dependency updates, it is nearly impossible to pin all of jupyter’s dependencies. We need to pin as much as we can to limit cross-dependency contamination.

Solution: this proposed workflow uses `nb_conda_kernels <https://github.com/Anaconda-Platform/nb_conda_kernels>`__ to help rebuild a consistent jupyter environment in which to test notebooks and closely mimic the behavior of a fresh installation. This extension comes pre-installed with Anaconda 4.1. It magically generates kernelspecs for easy access to environment kernels from the notebook dropdown menu. This also implies the dependencies are isolated per environment, which is critical for reliably testing PyPI builds.

Testing with testpypi

We start by handcrafting a custom environment.yaml file with python for jupyter. This file drifts between versions and is only updated as needed. All jupyter notebook dependencies are included, and all entries are pinned.

To determine the jupyter dependencies and the versions to pin, you can start by copying your environment_py<version>.yaml file, running conda install notebook=<version>, and then removing unnecessary entries from the yaml file. Here is an example with the fewest elements required to work with the Anaconda extension (this file may vary for different jupyter versions):

# environment_example.yaml
name: nbregtest
dependencies:
- python=3.5.1=4
- notebook=4.1.0=py35_0
- ...                                 # other dependencies

Note

Since it is impossible to pin all jupyter dependencies, only update the yaml file as needed. It is not so important to have the latest jupyter version; we just need one that works consistently most of the time. If something breaks due to an updated sub-dependency, you will find out during the testing phases and can selectively update the file as needed.

Given that Anaconda > 4.1 is installed and a yaml file is created with the “name” parameter set to “nbregtest”:

> conda env update -f environment_jupyter.yaml
> activate nbregtest
> pip install --verbose --extra-index-url https://testpypi.python.org/pypi lamana
> jupyter notebook
> # conda install failed dependencies if needed
> # run notebook tests
> # shutdown jupyter
> deactivate
> conda env remove -n nbregtest

Notice that the “name” in the yaml file sets both the environment name and the kernelspec name. The notebooks have now been tested in a controlled environment (with minimal jupyter dependencies), and the environment/kernelspec has been removed.