
pytest-mpl

pytest-mpl is a pytest plugin to facilitate image comparison for Matplotlib figures.

For each test figure, pytest-mpl generates an image and subtracts it from an existing reference (baseline) image. If the root-mean-square (RMS) of the residual exceeds a user-specified tolerance, the test fails. Alternatively, the generated image can be hashed and compared against an expected hash value.
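As a rough illustration only (this is not pytest-mpl's internal implementation), the RMS comparison amounts to something like the following, where rms_difference is a hypothetical helper:

import numpy as np
from matplotlib.image import imread

def rms_difference(result_path, baseline_path):
    # Load both images as float arrays (assumed to be the same size).
    result = imread(result_path).astype(float)
    baseline = imread(baseline_path).astype(float)
    # Root-mean-square of the pixel-wise residual.
    return np.sqrt(np.mean((result - baseline) ** 2))

# The test fails when the RMS exceeds the tolerance, e.g.:
# assert rms_difference("result.png", "baseline.png") <= tolerance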

For more information, see the pytest-mpl documentation.

Installation

pip install pytest-mpl

For detailed instructions, see the installation guide in the pytest-mpl docs.

Usage

First, write test functions that create a figure. Decorate each image comparison test with @pytest.mark.mpl_image_compare and return the figure to be compared:

import matplotlib.pyplot as plt
import pytest

@pytest.mark.mpl_image_compare
def test_plot():
    fig, ax = plt.subplots()
    ax.plot([1, 2])
    return fig
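
Continuing the example above, the decorator also accepts per-test options such as tolerance (the allowed RMS) and baseline_dir (where reference images are looked up); the values below are purely illustrative:

@pytest.mark.mpl_image_compare(tolerance=5, baseline_dir="baseline")
def test_styled_plot():
    fig, ax = plt.subplots()
    ax.plot([1, 2, 3], [1, 4, 9])
    return fig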

Then, generate reference images by running the test suite with the --mpl-generate-path option:

pytest --mpl-generate-path=baseline

Once the baseline images exist, run the test suite as usual, passing --mpl to compare the returned figures against them:

pytest --mpl

Passing --mpl-generate-summary=html as well generates an HTML summary of the image comparison results:
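
pytest --mpl --mpl-generate-summary=html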

(Screenshots: the HTML summary showing all results, a filtered view, and an individual test result.)

For more information on how to configure and use pytest-mpl, see the pytest-mpl documentation.

Contributing

pytest-mpl is a community project maintained for and by its users. There are many ways you can help!

  • Report a bug or request a feature on GitHub
  • Improve the documentation or code
