Add comprehensive SolaraViz testing framework #2741

Open · wants to merge 2 commits into main

Conversation

Tejasv-Singh

Comprehensive Testing Framework for Mesa's SolaraViz Components

Overview

This pull request introduces a comprehensive testing framework for Mesa's SolaraViz visualization components, integrated into the CI pipeline. It addresses GitHub issue #2734 by going beyond basic model tests to verify that SolaraViz components function correctly across all example models.


Key Features Implemented

Testing Categories

1. Basic Component Tests

  • Validates initialization and rendering of all SolaraViz components
  • Tests component properties and attributes
  • Ensures components handle edge cases gracefully

2. Integration Tests

  • Verifies visualization components correctly interact with model data
  • Tests that model changes are properly reflected in visualizations
  • Ensures app components can initialize and update models

3. Performance Benchmarks

  • Measures rendering performance across components
  • Benchmarks model step performance with visualizations attached
  • Provides comparison data for performance optimization

4. Regression Tests

  • Prevents regressions in visualization functionality
  • Verifies component interfaces remain stable
  • Tests error handling capabilities
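As a rough sketch of what tests in these categories look like, the following checks initialization, rendering, and edge-case handling against a stand-in component. The `MockGridComponent` class and its attributes are illustrative assumptions, not the PR's actual classes:

```python
# Hypothetical sketch of a basic component test. MockGridComponent is
# an assumed stand-in, not Mesa's real SolaraViz grid component.

class MockGridComponent:
    """Minimal stand-in for a SolaraViz grid component."""

    def __init__(self, model=None, width=400, height=400):
        # Edge case: reject non-positive dimensions up front.
        if width <= 0 or height <= 0:
            raise ValueError("dimensions must be positive")
        self.model = model
        self.width = width
        self.height = height
        self.rendered = False

    def render(self):
        # A real component would draw the grid; the mock just records
        # that rendering happened and returns a marker string.
        self.rendered = True
        return f"<grid {self.width}x{self.height}>"


def test_initialization_and_rendering():
    comp = MockGridComponent(width=300, height=200)
    assert comp.render() == "<grid 300x200>"
    assert comp.rendered


def test_rejects_invalid_dimensions():
    try:
        MockGridComponent(width=0)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for zero width")


test_initialization_and_rendering()
test_rejects_invalid_dimensions()
```

With pytest, the two functions would be collected automatically; the explicit calls at the end just make the sketch runnable on its own.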

Framework Architecture

  • Mock Implementation: Created mock Solara components that enable testing without actual Solara dependencies
  • Parametrized Tests: Tests run against all example models automatically
  • Flexible Fixtures: Shared test fixtures simplify writing new tests
  • Web Interface: User-friendly interface for running and monitoring tests
  • Command-line Tools: Support for running tests via CLI
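The parametrized-tests and shared-fixtures ideas above can be sketched as a registry of model factories that every check iterates over. The `StubModel` class and the registry contents are assumptions for illustration, not Mesa's real example models:

```python
# Illustrative sketch: run the same check against every example model.
# StubModel stands in for a real Mesa model here.

class StubModel:
    """Tiny stand-in for a Mesa model: just counts steps."""

    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


# A shared "fixture": a registry mapping example-model names to
# factories, so new tests iterate over all models without repeating
# setup code.
EXAMPLE_MODELS = {
    "BoltzmannWealth": StubModel,
    "Schelling": StubModel,
    "WolfSheep": StubModel,
    "BoidFlockers": StubModel,
}


def check_steps_advance(model, n=5):
    for _ in range(n):
        model.step()
    assert model.steps == n


# With pytest this would be @pytest.mark.parametrize over the registry;
# a plain loop shows the same idea without the pytest dependency.
for name, factory in EXAMPLE_MODELS.items():
    check_steps_advance(factory())
```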

Technical Implementation

  • All tests follow pytest's best practices with proper fixture usage
  • Testing framework is isolated and independent, minimizing dependencies
  • Comprehensive docstrings explain test purpose and methodology
  • Graceful handling of edge cases and failures
  • CI integration through GitHub Actions workflow
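The CI integration could look roughly like the workflow below (the author later mentions `.github/workflows/viz_tests.yml`). The job steps, action versions, install command, and test selector are assumptions, not the PR's actual workflow:

```yaml
# Hypothetical sketch of .github/workflows/viz_tests.yml; exact steps
# and commands are assumptions.
name: SolaraViz tests

on: [push, pull_request]

jobs:
  viz-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install Mesa with dev dependencies
        run: pip install -e ".[dev]"
      - name: Run SolaraViz tests
        run: pytest tests/ -k "solaraviz"
```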

Usage

The framework supports multiple testing methods:

  • Web interface for visual test monitoring
  • Command-line interface for scripted testing
  • Direct pytest execution for development
  • Automated execution via GitHub Actions

Future Improvements

  • Expand visual regression testing with screenshot comparisons
  • Add more performance metrics for detailed benchmarking
  • Improve test coverage for advanced Solara features
  • Enhance documentation with usage examples

Related Issues


Reviewers

Please focus on:

  • Test coverage completeness
  • Framework architecture and extensibility
  • Integration with Mesa's existing testing infrastructure
  • Documentation clarity

Tejasv-Singh and others added 2 commits March 30, 2025 16:41
- Implements testing framework for Mesa's SolaraViz visualization components
- Adds tests for component initialization, rendering, and data binding
- Creates mock Solara components for isolated testing
- Addresses issue projectmesa#2734

Performance benchmarks:

| Model | Size | Init time [95% CI] | Run time [95% CI] |
|---|---|---|---|
| BoltzmannWealth | small | 🔵 -0.9% [-2.1%, +0.3%] | 🔵 -0.4% [-1.2%, +0.5%] |
| BoltzmannWealth | large | 🔵 +0.7% [-0.7%, +2.0%] | 🔵 +0.6% [-0.7%, +1.9%] |
| Schelling | small | 🔵 +0.8% [+0.0%, +1.6%] | 🔵 +1.6% [+0.8%, +2.4%] |
| Schelling | large | 🔵 +0.6% [-1.7%, +2.6%] | 🔵 +0.5% [-0.6%, +1.7%] |
| WolfSheep | small | 🔵 +0.6% [-0.3%, +1.6%] | 🔵 +1.0% [+0.3%, +1.8%] |
| WolfSheep | large | 🔵 +2.3% [+0.3%, +4.4%] | 🔵 +2.8% [+1.5%, +4.1%] |
| BoidFlockers | small | 🔵 -2.4% [-4.5%, -0.3%] | 🔵 -0.5% [-1.2%, +0.1%] |
| BoidFlockers | large | 🔵 -5.7% [-7.8%, -2.8%] | 🟢 -4.8% [-5.8%, -3.8%] |

@EwoutH added the testing, ci, and visualisation labels on Mar 30, 2025
@EwoutH
Member

EwoutH commented Mar 30, 2025

Thanks for the PR!

I don't know if you noticed, but 3 days ago @Ya-shh opened a similar PR:

Have you coordinated with him?

@EwoutH (Member) left a comment:

Anyway, if you want you can continue working on this one, and we'll merge the one that's best.

Three comments:

  1. All other tests are in our tests directory. Is there a reason these are not?
  2. I would like all new tests to be added to the CI. In this case, I would recommend adding a new workflow file (in .github/workflows).
  3. 1600 lines of tests is a lot. Not necessarily a problem, but could you explain where the bulk is and why that's needed?

@Tejasv-Singh
Author

Thanks for your review!

Test Location: All test files are in the /tests directory. The supporting files outside (mock_solara_components.py, example_tests.py, etc.) are not tests but necessary mocks, utilities, and web components. If preferred, I can relocate them for consistency.

CI Integration: I've added a GitHub Actions workflow (.github/workflows/viz_tests.yml) to run all SolaraViz tests, generate benchmarks, store artifacts, and report failures.

Test Length (1600 lines): The bulk comes from:

  • Mocks (300 lines): needed to run tests without Solara dependencies.
  • Visualization coverage (500 lines): ensures all Grid, Chart, and Network scenarios are tested.
  • Test categories (800 lines): component, integration, performance, and regression tests for full coverage.
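The mock approach described above (running tests without Solara installed) can be sketched by registering a fake `solara` module before any import resolves. The module contents below are illustrative assumptions, not the PR's actual `mock_solara_components.py`:

```python
# Hypothetical sketch: install a minimal fake "solara" module so tests
# import and run without the real library. Names are assumptions.
import sys
import types


def install_mock_solara():
    """Register a minimal fake 'solara' module in sys.modules."""
    mock = types.ModuleType("solara")

    def component(func):
        # The real solara.component wraps a render function; the mock
        # returns it unchanged so it stays a plain callable in tests.
        return func

    mock.component = component
    # Override any real installation so tests are fully isolated.
    sys.modules["solara"] = mock
    return mock


install_mock_solara()
import solara  # resolves to the mock registered above


@solara.component
def Hello():
    return "hello"


assert Hello() == "hello"
```

In a pytest suite this installation would typically live in a session-scoped fixture or `conftest.py` so every test module sees the mock.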

Successfully merging this pull request may close these issues.

Integrate Visualization Testing in Mesa CI