Better error message in model initialization for unusable dataset #8956
📝 Walkthrough
Removes a translation key from frontend translations, adds an internal helper to validate dataset usability and centralize improved error messages (including dataset id/name/status and TRACE annotation prefix) during viewer initialization, and adds a changelog entry documenting the error-message improvements.
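For orientation, the guard described above boils down to a check-then-toast-then-abort flow. The following is a minimal, self-contained sketch of that pattern; all identifiers here are invented for illustration and differ from the actual helper, which is shown in the review comments below.

```typescript
// Rough sketch only: guardDatasetUsable, MinimalDataset, HANDLED_ERROR, and Toast
// are stand-ins, not the names used in model_initialization.ts.
const Toast = { error: (msg: string) => console.error(msg) };
const HANDLED_ERROR = new Error("handled: already reported to the user via toast");

type MinimalDataset = {
  id: string;
  name: string;
  dataSource: { dataLayers?: unknown[]; status?: string };
};

function guardDatasetUsable(dataset: MinimalDataset | null, annotationId?: string): void {
  // When loading an annotation (TRACE mode), prefix the message with its id.
  const prefix = annotationId ? `Failed to load annotation ${annotationId}: ` : "";

  if (dataset == null) {
    Toast.error(`${prefix}The requested dataset does not exist.`);
    throw HANDLED_ERROR;
  }
  if (!dataset.dataSource.dataLayers || dataset.dataSource.dataLayers.length === 0) {
    const statusNote = dataset.dataSource.status ? `: ${dataset.dataSource.status}` : ".";
    Toast.error(`${prefix}Dataset "${dataset.name}" (${dataset.id}) is not available${statusNote}`);
    throw HANDLED_ERROR;
  }
  // Otherwise the dataset is usable and initialization continues.
}
```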
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks and finishing touches: ✅ Passed checks (5 passed)
📜 Recent review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)
⏰ Context from checks skipped due to timeout of 90000ms (3). You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms).
Tip
👮 Agentic pre-merge checks are now available in preview! Pro plan users can now enable pre-merge checks in their settings to enforce checklists before merging PRs. Please see the documentation for more information. Example:

```yaml
reviews:
  pre_merge_checks:
    custom_checks:
      - name: "Undocumented Breaking Changes"
        mode: "warning"
        instructions: |
          Pass/fail criteria: All breaking changes to public APIs, CLI flags, environment variables, configuration keys, database schemas, or HTTP/GraphQL endpoints must be documented in the "Breaking Change" section of the PR description and in CHANGELOG.md. Exclude purely internal or private changes (e.g., code not exported from package entry points or explicitly marked as internal).
```
Actionable comments posted: 0
🧹 Nitpick comments (2)
frontend/javascripts/viewer/model_initialization.ts (2)
216-218: Pass datasetId and accept a nullable dataset to improve the missing-dataset error
Include the datasetId in the error when the dataset is missing, and avoid the unsafe cast by allowing a nullable APIDataset.

```diff
- assertUsableDataset(apiDataset as StoreDataset, initialCommandType);
+ assertUsableDataset(apiDataset, initialCommandType, datasetId);
```
472-493: Harden assertUsableDataset: accept a nullable APIDataset, add datasetId, and use null-safe access
Verified: the function is at frontend/javascripts/viewer/model_initialization.ts (~472–493) and is called at line 216 with a cast (assertUsableDataset(apiDataset as StoreDataset, initialCommandType)); messages["dataset.does_not_exist"] exists (frontend/javascripts/messages.tsx:359); messages["dataset.not_imported"] is unused.

```diff
-function assertUsableDataset(dataset: StoreDataset, initialCommandType: TraceOrViewCommand) {
+function assertUsableDataset(
+  dataset: APIDataset | null | undefined,
+  initialCommandType: TraceOrViewCommand,
+  datasetId?: string,
+) {
   let error;
-  let annotationNote = "";
+  let annotationNote = "";
   if (initialCommandType.type === ControlModeEnum.TRACE) {
     annotationNote = `Failed to load annotation ${initialCommandType.annotationId}: `;
   }
-  if (!dataset) {
-    error = `${annotationNote}${messages["dataset.does_not_exist"]}`;
-  } else if (!dataset.dataSource.dataLayers) {
-    let statusNote = ".";
-    if (dataset.dataSource.status) {
-      statusNote = `: ${dataset.dataSource.status}`;
-    }
-    error = `${annotationNote}Dataset ‘${dataset.name}’ (${dataset.id}) is not available${statusNote}`;
+  if (dataset == null) {
+    const idNote = datasetId ? ` (id: ${datasetId})` : "";
+    error = `${annotationNote}${messages["dataset.does_not_exist"]}${idNote}`;
+  } else if (!dataset.dataSource?.dataLayers) {
+    const statusNote = dataset.dataSource?.status ? `: ${dataset.dataSource.status}` : ".";
+    error = `${annotationNote}Dataset ‘${dataset.name}’ (${dataset.id}) is not available${statusNote}`;
   }
   if (error) {
     Toast.error(error);
     throw HANDLED_ERROR;
   }
 }
```

Update the call site at model_initialization.ts:216 to drop the cast and pass datasetId (available nearby): `assertUsableDataset(apiDataset, initialCommandType, datasetId);`
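For reviewers, this is roughly what the toasts would read if the suggestion were applied. The dataset name, ids, and status below are made up, and the exact wording behind messages["dataset.does_not_exist"] is only assumed:

```typescript
// Hypothetical inputs and the messages the suggested helper would produce:
//
// assertUsableDataset(null, viewCommand, "ds-123")
//   -> toast: "<text of messages['dataset.does_not_exist']> (id: ds-123)"
//
// assertUsableDataset(
//   { name: "my_ds", id: "ds-123", dataSource: { status: "No layers found" } } as APIDataset,
//   { type: ControlModeEnum.TRACE, annotationId: "abc" },
// )
//   -> toast: "Failed to load annotation abc: Dataset ‘my_ds’ (ds-123) is not available: No layers found"
```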
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- frontend/javascripts/messages.tsx (0 hunks)
- frontend/javascripts/viewer/model_initialization.ts (2 hunks)
- unreleased_changes/8956.md (1 hunks)
💤 Files with no reviewable changes (1)
- frontend/javascripts/messages.tsx
🧰 Additional context used
🧬 Code graph analysis (1)
frontend/javascripts/viewer/model_initialization.ts (1)
frontend/javascripts/viewer/store.ts (2)
- StoreDataset (566-573)
- TraceOrViewCommand (229-242)
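For readers without the codebase open, the two referenced types can be thought of roughly as in the sketch below (simplified and assumed; the real definitions live in viewer/store.ts at the lines listed above). The TRACE variant of the command union is what carries the annotationId used for the error prefix.

```typescript
// Simplified, assumed shapes; see frontend/javascripts/viewer/store.ts for the real definitions.
enum ControlModeEnum {
  TRACE = "TRACE",
  VIEW = "VIEW",
}

type TraceOrViewCommand =
  | { type: ControlModeEnum.TRACE; annotationId: string }
  | { type: ControlModeEnum.VIEW; datasetId: string };

// Only the fields assertUsableDataset touches are sketched here.
type StoreDataset = {
  id: string;
  name: string;
  dataSource: { dataLayers: unknown[]; status?: string };
};
```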
⏰ Context from checks skipped due to timeout of 90000ms (2). You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms).
- GitHub Check: build-smoketest-push
- GitHub Check: backend-tests
🔇 Additional comments (3)
frontend/javascripts/viewer/model_initialization.ts (2)
495-511: LGTM: dataset initialization flow unchanged
Type and assertions look consistent with preprocess + Store usage.
472-487: Confirm policy on exposing raw dataset IDs in user-facing errors
The new message includes the dataset’s internal id. Please confirm this is acceptable per UX/privacy guidelines (especially with shared links).
unreleased_changes/8956.md (1)
1-2: Changelog LGTM
Accurately describes the improvement without implying functional changes.
sorry for the late review! I already read the code but wanted to test it too (which then slipped through somehow).
Code looks good 👍. During testing I got this when deleting a DS on the dev instance:
```
[Server Time 2025-09-29T06:34:14.541Z] Failed RPC 97: DELETE https://initializatonerrortoast.webknossos.xyz/data/datasets/68d3a936010000b50097f657/deleteOnDisk. Response: 400 '{"messages":[{"error":"Could not delete the dataset on disk."},{"chain":"[Server Time 2025-09-29T06:34:14.540Z] Failed to remake symlinks <~ Could not update datasource-properties.json <~ Could not back up old contents: java.io.FileNotFoundException: /webknossos/binaryData/sample_organization/data_types/datasource-properties-backups.log (Permission denied)"}]}' 400
```
might be dev instance specific, though?
Co-authored-by: Philipp Otto <[email protected]>
Thanks! Yes, the dev deployments don’t have full write access on their data directory. Testing locally might be better suited for this PR.
works great 👍
Before: (screenshot of the previous error message)
After: (screenshot of the new error message)
Since I’m stitching together this error string, I’m not using messages.tsx for it. Not sure what the policy is for that, hope it’s ok?
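On the open question: one option would be to keep the template in messages.tsx as a function-valued entry, so the string stays centralized while still interpolating the dataset details. This is only a sketch of that idea; the key name and signature are invented here and not an existing convention of the project:

```typescript
// Hypothetical messages.tsx entry (key and signature invented for illustration):
export const datasetMessages = {
  "dataset.not_available": (name: string, id: string, status?: string): string =>
    `Dataset "${name}" (${id}) is not available${status ? `: ${status}` : "."}`,
};

// Hypothetical call site in model_initialization.ts:
//   Toast.error(
//     annotationNote +
//       datasetMessages["dataset.not_available"](dataset.name, dataset.id, dataset.dataSource.status),
//   );
```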
Steps to test:
Issues:
`$PR_NUMBER.md` file in `unreleased_changes` or use `./tools/create-changelog-entry.py`)