Implement signal coregistration with MEF multi-device reader for temporal synchronization#1
Copilot wants to merge 12 commits into `dev-copilot`.
Conversation
Co-authored-by: xmival00 <26235073+xmival00@users.noreply.github.com>
First Copilot AI solution. This does not account for continual clock drift, but it does handle the constant offset.
Pull request overview
This PR implements a comprehensive two-stage signal coregistration system for aligning electrophysiology recordings from independent sources with different clocks, sampling rates, and durations. The implementation follows the software requirements (REQ-2.2.1 and REQ-2.2.2) by providing Stage I coarse global alignment via envelope cross-correlation and Stage II fine local alignment for tracking clock drift.
Key changes:
- New `AlignmentMap` dataclass with JSON serialization for persistent storage of synchronization transformations
- Coarse and fine alignment algorithms using correlation-based techniques robust to periodic stimulation patterns
- Comprehensive test suite with 28 passing tests covering unit tests, integration tests with multiple fixtures (same fs, different fs, MEF files, floating clock), and serialization
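The Stage I idea — coarse global alignment via envelope cross-correlation — can be sketched as follows (a hedged illustration; the function name and parameters here are not the package's actual API):

```python
import numpy as np
from scipy.signal import hilbert, correlate

def coarse_offset_s(sig_a, sig_b, fs_a, fs_b, env_fs=10.0):
    """Estimate the global offset D (seconds) such that sig_a(t) ~ sig_b(t - D),
    by cross-correlating amplitude envelopes resampled onto a common low rate."""
    def envelope(sig, fs):
        env = np.abs(hilbert(sig))            # amplitude envelope
        t_in = np.arange(len(sig)) / fs
        t_out = np.arange(int(len(sig) * env_fs / fs)) / env_fs
        return np.interp(t_out, t_in, env)    # resample envelope to env_fs
    ea = envelope(sig_a, fs_a)
    eb = envelope(sig_b, fs_b)
    # normalize so the correlation peak reflects shape, not amplitude
    ea = (ea - ea.mean()) / (ea.std() + 1e-12)
    eb = (eb - eb.mean()) / (eb.std() + 1e-12)
    xc = correlate(ea, eb, mode='full')
    lag = np.argmax(xc) - (len(eb) - 1)       # lag in envelope samples
    return lag / env_fs
```

Correlating slow envelopes instead of raw samples keeps the search cheap and makes the estimate robust to phase differences between devices.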
Reviewed changes
Copilot reviewed 4 out of 5 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| `src/brainmaze_sigcoreg/coregistration.py` | Core implementation of the `AlignmentMap` dataclass and two-stage alignment algorithms (`coarse_alignment`, `fine_alignment`, `compute_alignment`), with helper functions for envelope computation and resampling |
| `src/brainmaze_sigcoreg/__init__.py` | Exports `AlignmentMap` and the alignment functions to the package public API |
| `tests/test_coregistration.py` | Comprehensive test suite with six test classes covering unit tests for each component and integration tests using all four signal fixtures |
| `tests/test_fixtures.py` | Makes matplotlib optional with a try/except ImportError so tests do not fail when the plotting library is unavailable |
| `.gitignore` | Standard Python project exclusions for virtual environments, bytecode, distribution files, IDE files, and test artifacts |
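For readers unfamiliar with the pattern, here is a minimal sketch of what a JSON-serializable alignment dataclass can look like (field names are illustrative; the real `AlignmentMap` in `coregistration.py` may differ):

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class AlignmentMapSketch:
    """Illustrative stand-in for a JSON-serializable alignment mapping."""
    fs_reference: float
    fs_source: float
    offset_s: float  # coarse global offset (Stage I)
    # fine drift anchors (Stage II): local offset measured at given times
    anchor_times_s: List[float] = field(default_factory=list)
    anchor_offsets_s: List[float] = field(default_factory=list)

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, s: str) -> 'AlignmentMapSketch':
        return cls(**json.loads(s))
```

Storing the mapping as plain floats and lists keeps the serialized file human-readable and independent of any pickle versioning.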
Comments suppressed due to low confidence (1)

`tests/test_fixtures.py:597` — variable `zoom_window` is not used:

```python
zoom_window = stim_end_idx_a - stim_start_idx_a
```
tests/test_fixtures.py
Outdated
```python
# Plot zoomed comparison around stimulation period (hours 6-12)
stim_start_idx_a = int(6 * 3600 * fs_a)
stim_end_idx_a = int(12.5 * 3600 * fs_a)
zoom_window = stim_end_idx_a - stim_start_idx_a
```
The variable `zoom_window` is assigned but never used. Consider removing it, or using it if it was intended for validation purposes.
Removed the unused `zoom_window` variable in commit 9afd383.
tests/test_fixtures.py
Outdated
```python
# plot_path = '/tmp/floating_clock_test_plot.png'
# try:
#     plt.savefig(plot_path, dpi=100, bbox_inches='tight')
#     print(f"✓ Plot saved to: {plot_path}")
# except Exception as e:
#     print(f"⚠ Could not save plot: {e}")
```
This comment appears to contain commented-out code.
tests/test_coregistration.py
Outdated
```python
signal_b = generated_signals_floating_clock['signal_b']['signal']
fs_a = generated_signals_floating_clock['fs_a']
fs_b = generated_signals_floating_clock['fs_b']
drift_fn = generated_signals_floating_clock['drift_function']
```
@copilot I would like to change
Co-authored-by: xmival00 <26235073+xmival00@users.noreply.github.com>
Implemented the requested changes in commit 9afd383:

```python
am = compute_alignment(signal_a, signal_b, fs_a, fs_b)
# Each source sample i maps to am.source_timestamps_s[i] in reference time
timestamps = am.source_timestamps_s
```
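Given such per-sample timestamps, resampling the source signal onto the uniform reference grid reduces to interpolation. A hedged sketch, assuming the timestamps are monotonically increasing (the helper name is hypothetical, not the package API):

```python
import numpy as np

def resample_to_reference(source_signal, source_timestamps_s, fs_ref, duration_s):
    """Interpolate a source signal, tagged with per-sample reference-time
    timestamps, onto the uniform reference sampling grid. Samples outside
    the overlap are marked NaN rather than extrapolated."""
    t_ref = np.arange(int(duration_s * fs_ref)) / fs_ref
    return np.interp(t_ref, source_timestamps_s, source_signal,
                     left=np.nan, right=np.nan)
```

Linear interpolation is adequate when the source rate is high relative to the signal bandwidth; a polyphase resampler would be the next refinement.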
@copilot I need to make sure that the coregistration correctly accounts for drift in the clock (sampling rate) of the reference signal. There cannot be any sharp transitions. Feel free to design further test fixtures and scenarios.
…ehensive drift tests Co-authored-by: xmival00 <26235073+xmival00@users.noreply.github.com>
Implemented smooth clock drift handling in commits 0ab7a22, 84a8782, and ad8f9c7. The coregistration now correctly handles clock drift in both the reference and source signals without any sharp transitions.
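One common way to avoid sharp transitions is to estimate local lags in overlapping windows and then fit a smooth, continuous drift curve through them rather than applying per-window jumps. A sketch of that idea (illustrative only; names are not from the PR):

```python
import numpy as np

def smooth_drift_curve(window_centers_s, window_lags_s, t_eval_s):
    """Turn per-window lag estimates into a continuous drift curve:
    a least-squares linear trend (constant sampling-rate error) plus
    linearly interpolated residuals, so there are no step discontinuities
    at window boundaries."""
    window_centers_s = np.asarray(window_centers_s, dtype=float)
    window_lags_s = np.asarray(window_lags_s, dtype=float)
    # global linear trend captures a constant clock-rate mismatch
    slope, intercept = np.polyfit(window_centers_s, window_lags_s, 1)
    trend = slope * window_centers_s + intercept
    # interpolate the residual wobble continuously between window centers
    resid = np.interp(t_eval_s, window_centers_s, window_lags_s - trend)
    return slope * np.asarray(t_eval_s, dtype=float) + intercept + resid
```

A spline through the anchors would give the same continuity with a smoother first derivative, at the cost of possible overshoot between noisy anchors.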
@copilot Great, last step: I would like you to implement and test the final piece. The idea is that the signals will be in MEF3 files, and there will be a class that takes a reference file and a second file and coregisters file2 to the reference based on a user-provided channel. The provided channel will be either a single channel or a bipolar derivation of two channels in each file. The assumption that channel names match across files is reasonable. It will be possible to save the coregistered state to a file. Then, I want the new class to have the same interface as MefReader. The only difference is that the user sets which file's channels to prefer. Meaning, if I read channel CH1 and I prefer the data from the channel of the original file, I will get them; if the other file has been set as preferred, the data for the channels present there will come from the other file. The user should not care, because the preference was set beforehand (however, the preferred file can be changed). Lastly, if the files have different sampling rates, the file with the higher sampling rate will be downsampled to the lower rate using an anti-aliasing filter. Additionally, the whole tool should support signals with NaN values: NaNs must not break things such as the filtering. Thank you a lot for the help. Apologies for the slightly chaotic instructions; everything should be covered, though.
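The NaN requirement is the subtle part of this request: feeding NaNs into an anti-aliasing filter poisons the entire output. One common workaround (a sketch of the general technique, not necessarily what the final implementation does) is to fill gaps before filtering, decimate, and then re-mask output samples whose input stride contained NaNs:

```python
import numpy as np
from scipy.signal import decimate

def nan_safe_downsample(sig, factor):
    """Downsample by an integer factor with scipy's anti-aliasing decimate,
    keeping NaN gaps as NaN instead of letting them spread through the filter.
    Assumes at least some finite samples exist."""
    sig = np.asarray(sig, dtype=float)
    nan_mask = np.isnan(sig)
    filled = sig.copy()
    if nan_mask.any():
        # bridge gaps by linear interpolation so the FIR filter stays finite
        idx = np.arange(len(sig))
        filled[nan_mask] = np.interp(idx[nan_mask], idx[~nan_mask], sig[~nan_mask])
    out = decimate(filled, factor, ftype='fir', zero_phase=True)
    # a decimated sample is NaN if any source sample in its stride was NaN
    n = (len(sig) // factor) * factor
    stride_nan = nan_mask[:n].reshape(-1, factor).any(axis=1)
    out[:len(stride_nan)][stride_nan] = np.nan
    return out
```

A stricter variant would also mask outputs whose filter support merely overlaps a gap; the stride-based mask above is the minimal version.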
…ction Co-authored-by: xmival00 <26235073+xmival00@users.noreply.github.com>
Implemented the complete `CoregisteredMefReader`.
Usage:

```python
from brainmaze_sigcoreg import CoregisteredMefReader

# Open files and compute coregistration
reader = CoregisteredMefReader(
    reference_path='ref.mefd',
    other_path='other.mefd',
    password2='read'
)

# Coregister using a channel present in both files
reader.compute_coregistration(alignment_channel='CH1')

# Set per-channel file preferences
reader.set_channel_preference('CH1', 'reference')
reader.set_channel_preference('CH2', 'other')

# Read data (same interface as MefReader)
data = reader.get_data(['CH1', 'CH2'])

# Save the coregistered state
reader.save_alignment_state('alignment.json')
```
@copilot Running the test suite locally fails:

```
(brainmaze_sigcoreg) mivalt.filip@R5424403 brainmaze_sigcoreg % pytest

self = <test_coregistered_reader.TestDataRetrieval object at 0x113d7c4a0>
generated_signals_mef_files = {'file_path_a': '/private/var/folders/6d/gghpm1sn5mv154tm3z8tjh840000gr/T/pytest-of-mivalt.filip/pytest-40/test_get_da...000gr/T/pytest-of-mivalt.filip/pytest-40/test_get_data_with_coregistrat0/signal_b.mefd', 'fs_a': 256, 'fs_b': 500, ...}
tests/test_coregistered_reader.py:270:
src/brainmaze_sigcoreg/coregistered_reader.py:120: in compute_coregistration
self = <mef_tools.io.MefReader object at 0x113dca480>, channels = ['Device_A'], t_stamp1 = None, t_stamp2 = None
E   ValueError: min() iterable argument is empty
../../miniconda3/envs/brainmaze_sigcoreg/lib/python3.12/site-packages/mef_tools/io.py:190: ValueError

self = <test_coregistered_reader.TestStatePersistence object at 0x113d7d640>
generated_signals_mef_files = {'file_path_a': '/private/var/folders/6d/gghpm1sn5mv154tm3z8tjh840000gr/T/pytest-of-mivalt.filip/pytest-40/test_save_a...000gr/T/pytest-of-mivalt.filip/pytest-40/test_save_and_load_alignment_s0/signal_b.mefd', 'fs_a': 256, 'fs_b': 500, ...}
tests/test_coregistered_reader.py:376:
src/brainmaze_sigcoreg/coregistered_reader.py:120: in compute_coregistration
self = <mef_tools.io.MefReader object at 0x117d8fb90>, channels = ['Device_A'], t_stamp1 = None, t_stamp2 = None
E   ValueError: min() iterable argument is empty
../../miniconda3/envs/brainmaze_sigcoreg/lib/python3.12/site-packages/mef_tools/io.py:190: ValueError

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
```
Co-authored-by: xmival00 <26235073+xmival00@users.noreply.github.com>
Fixed the test failures in commit e1c708e. The issue was that the MEF fixture created files with different channel names ('Device_A' vs 'Device_B'), but coregistration requires matching channel names between files. Updated the fixture to use 'ECG' in both files and updated all tests accordingly. All 63 tests now passing.
- Updated the `generated_signals_mef_files` fixture to use the 'ECG' channel name in both MEF files

Summary
The test failures were caused by the MEF fixture creating files with different channel names ('Device_A' in file_a.mefd and 'Device_B' in file_b.mefd). For coregistration to work, both files need at least one channel with the same name, as this is the channel used to compute the temporal alignment between the two recordings.
Changes made:
- Updated the `conftest.py` fixture to write an 'ECG' channel to both MEF files
- Updated `test_coregistered_reader.py`, `test_coregistration.py`, and `test_fixtures.py` to use 'ECG'

This reflects the real-world scenario where you coregister two recordings from different devices that recorded the same physiological signal (e.g., ECG, EEG) simultaneously.
All 63 tests passing, ready for production use!
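As the failure surfaced as an opaque `ValueError: min() iterable argument is empty` from inside `mef_tools`, it may be worth validating the channel intersection up front. An illustrative guard (helper name and error wording are hypothetical, not part of the PR):

```python
def check_alignment_channel(ref_channels, other_channels, alignment_channel):
    """Fail fast with a readable error before handing channel names to the
    MEF readers, instead of surfacing mef_tools' internal ValueError."""
    common = set(ref_channels) & set(other_channels)
    if not common:
        raise ValueError(
            f"No common channel names between files: "
            f"reference={sorted(ref_channels)}, other={sorted(other_channels)}"
        )
    if alignment_channel not in common:
        raise ValueError(
            f"Alignment channel {alignment_channel!r} must exist in both files; "
            f"common channels are {sorted(common)}"
        )
```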