Conversation
Summary of Changes

Hello @eliandoran, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a significant security and usability enhancement to the Trilium import functionality. It gives users a detailed preview of .trilium archive contents, highlighting potentially dangerous elements such as scripts or widgets before the import is finalized. This lets users make an informed decision about how to proceed, with options to import safely (disabling active content) or to trust the source. The changes involve both frontend UI work for the preview dialog and backend API extensions to process and analyze the archive contents.
Code Review
This pull request introduces a new import preview feature, which is a significant and well-implemented addition. My review focuses on improving performance by parallelizing network requests, ensuring proper error handling for asynchronous operations, and addressing potential resource leaks on the server. I've also suggested some minor changes to improve code robustness on the client side.
```ts
const taskId = utils.randomString(10);
let counter = 0;

for (const file of files) {
    counter++;

    server.post(
        `notes/${parentNoteId}/execute-import`,
        {
            ...options,
            id: file.id,
            taskId,
            last: counter === files.length ? "true" : "false"
        }
    );
}
```
The server.post calls are not awaited, which means the function will not wait for the import to complete and any errors will result in unhandled promise rejections. It's better to await all promises to ensure proper execution flow and error handling. Using map with an index is also cleaner than an external counter.
```ts
const taskId = utils.randomString(10);
const promises = files.map((file, i) => server.post(
    `notes/${parentNoteId}/execute-import`,
    {
        ...options,
        id: file.id,
        taskId,
        last: i === files.length - 1 ? "true" : "false"
    }
));
await Promise.all(promises);
```
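If partial failures should not abort the whole batch, `Promise.allSettled` collects every outcome instead of rejecting on the first error. A minimal sketch, with a hypothetical `importFile` standing in for the real `server.post` request:

```typescript
// Hypothetical stand-in for the real `server.post` import request.
async function importFile(id: string): Promise<string> {
    if (id === "bad") throw new Error(`import failed for ${id}`);
    return `imported ${id}`;
}

// Run all imports in parallel and report which files failed, instead of
// letting the first rejection hide the results of the others.
async function importAll(ids: string[]): Promise<{ ok: string[]; failed: string[] }> {
    const settled = await Promise.allSettled(ids.map(id => importFile(id)));
    const ok: string[] = [];
    const failed: string[] = [];
    settled.forEach((result, i) => {
        if (result.status === "fulfilled") {
            ok.push(result.value);
        } else {
            failed.push(ids[i]);
        }
    });
    return { ok, failed };
}
```

Whether this is preferable to plain `Promise.all` depends on whether a single failed chunk should cancel the import or merely be reported.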
apps/server/src/routes/api/import.ts (Outdated)
```ts
const buffer = readFileSync(importRecord.path);
const note = await zipImportService.importZip(taskContext, buffer, parentNote);
onImportDone(note, last, taskContext, parentNoteId);
```
The temporary file and the entry in importStore are not cleaned up after the import is executed. This can lead to unnecessary disk space usage and memory consumption over time. It's a good practice to clean up these resources after they are used.
To fix this, you'll also need to import unlink from the fs module:
```ts
import { readFileSync, unlink } from "fs";
```

Suggested change:

```ts
const buffer = readFileSync(importRecord.path);
unlink(importRecord.path, err => {
    if (err) {
        log.error(`Could not delete temporary file ${importRecord.path}`, err);
    }
});
delete importStore[id];
const note = await zipImportService.importZip(taskContext, buffer, parentNote);
onImportDone(note, last, taskContext, parentNoteId);
```
```ts
const taskId = utils.randomString(10);
const results: ImportPreviewResponse[] = [];
for (const file of files) {
    const formData = new FormData();
    formData.append("upload", file);
    formData.append("taskId", taskId);

    results.push(await $.ajax({
        url: `${window.glob.baseApiUrl}notes/${parentNoteId}/preview-import`,
        headers: await server.getHeaders(),
        data: formData,
        dataType: "json",
        type: "POST",
        timeout: 60 * 60 * 1000,
        error (xhr) {
            toastService.showError(t("import.failed", { message: xhr.responseText }));
        },
        contentType: false, // NEEDED, DON'T REMOVE THIS
        processData: false // NEEDED, DON'T REMOVE THIS
    }));
}
return results;
```
The file uploads for the preview are performed sequentially due to await inside the for...of loop. This can be slow if multiple files are being imported. You can parallelize these requests using Promise.all to improve performance.
```ts
const taskId = utils.randomString(10);
const promises = files.map(async file => {
    const formData = new FormData();
    formData.append("upload", file);
    formData.append("taskId", taskId);
    return $.ajax({
        url: `${window.glob.baseApiUrl}notes/${parentNoteId}/preview-import`,
        headers: await server.getHeaders(),
        data: formData,
        dataType: "json",
        type: "POST",
        timeout: 60 * 60 * 1000,
        error (xhr) {
            toastService.showError(t("import.failed", { message: xhr.responseText }));
        },
        contentType: false, // NEEDED, DON'T REMOVE THIS
        processData: false // NEEDED, DON'T REMOVE THIS
    });
});
return Promise.all(promises);
```

```tsx
return (
    <Card
        title={preview.fileName}
        className={DANGEROUS_CATEGORIES_MAPPINGS[categories[0]]?.category ?? "safe"}
```
Accessing categories[0] can be undefined if the categories array is empty. While DANGEROUS_CATEGORIES_MAPPINGS[undefined] might not throw a runtime error in JavaScript (it accesses the property named 'undefined'), it's not robust and can lead to unexpected behavior. It's better to explicitly handle the case where categories is empty.
```diff
-className={DANGEROUS_CATEGORIES_MAPPINGS[categories[0]]?.category ?? "safe"}
+className={categories.length > 0 ? (DANGEROUS_CATEGORIES_MAPPINGS[categories[0] as DangerousAttributeCategory]?.category ?? "safe") : "safe"}
```
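A small demonstration of the pitfall (the names here are illustrative, not the component's real tables): indexing an object with `undefined` looks up the property literally named `"undefined"`, so an empty `categories` array can silently match the wrong entry instead of falling back to `"safe"`:

```typescript
// Illustrative mapping; a property literally named "undefined" shows how the
// coerced lookup key can match an unintended entry.
const mappings: Record<string, { category: string }> = {
    script: { category: "danger" },
    undefined: { category: "oops" }
};

const categories: string[] = [];

// categories[0] is undefined; the key is coerced to the string "undefined".
const fragile = mappings[categories[0]]?.category ?? "safe";

// Checking the length first never performs the bogus lookup.
const robust = categories.length > 0
    ? (mappings[categories[0]]?.category ?? "safe")
    : "safe";
```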
```ts
return categories.toSorted((a, b) => {
    const aLevel = DANGEROUS_CATEGORIES_MAPPINGS[a].category;
    const bLevel = DANGEROUS_CATEGORIES_MAPPINGS[b].category;
    return SEVERITY_ORDER[aLevel] - SEVERITY_ORDER[bLevel];
});
```
Accessing DANGEROUS_CATEGORIES_MAPPINGS[a] could potentially lead to a runtime error if a is not a valid key, which would crash the component. While the data comes from the backend and should be valid, adding a defensive check would make this more robust.
```ts
return categories.toSorted((a, b) => {
    const aMapping = DANGEROUS_CATEGORIES_MAPPINGS[a as DangerousAttributeCategory];
    const bMapping = DANGEROUS_CATEGORIES_MAPPINGS[b as DangerousAttributeCategory];
    if (!aMapping) return 1;
    if (!bMapping) return -1;
    const aLevel = aMapping.category;
    const bLevel = bMapping.category;
    return SEVERITY_ORDER[aLevel] - SEVERITY_ORDER[bLevel];
});
```
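A self-contained sketch of that defensive sort, with hypothetical mapping and severity tables (the real `DANGEROUS_CATEGORIES_MAPPINGS` and `SEVERITY_ORDER` live in the component); it uses a copying `sort` for compatibility with runtimes that predate `Array.prototype.toSorted`:

```typescript
type Severity = "danger" | "warning";

// Hypothetical stand-ins for the component's tables.
const MAPPINGS: Record<string, { category: Severity }> = {
    script: { category: "danger" },
    widget: { category: "warning" }
};
const SEVERITY_ORDER: Record<Severity, number> = { danger: 0, warning: 1 };

// Sort by severity; unknown keys sort last instead of crashing the comparator.
function sortBySeverity(categories: string[]): string[] {
    return [...categories].sort((a, b) => {
        const aMapping = MAPPINGS[a];
        const bMapping = MAPPINGS[b];
        if (!aMapping && !bMapping) return 0;
        if (!aMapping) return 1;
        if (!bMapping) return -1;
        return SEVERITY_ORDER[aMapping.category] - SEVERITY_ORDER[bMapping.category];
    });
}
```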
```ts
public async onRecordRemoved(record: ImportRecord): Promise<void> {
    try {
        await rm(record.path);
```
Check failure — Code scanning / CodeQL: Uncontrolled data used in path expression (High)

Copilot Autofix (AI, about 15 hours ago)
General approach: ensure that any filesystem path derived from request data is constrained to a specific safe root directory. For deletion, we should only call rm() on paths that are inside DiskImportStore.uploadDir. The robust pattern is: (1) resolve/normalize the candidate path relative to the safe root, (2) resolve any symlinks, and (3) check that the final path starts with the safe root path. If the check fails, we should log and skip deletion instead of deleting.
Best fix here: harden DiskImportStore.onRecordRemoved so it never deletes outside its upload directory. We can do this by importing resolve from "path" and realpath from "fs/promises", then computing a normalized real path and comparing it to the real path of this.uploadDir. Since we cannot change surrounding code or add new fields outside what is shown, we will perform this verification directly inside onRecordRemoved. This keeps existing functionality (cleanup of uploaded files) but adds a safety boundary. We will not change how record.path is set in import_preview; instead we validate before deleting.
Concretely:
- In apps/server/src/services/import/import_store.ts:
  - Extend the fs/promises import to include realpath.
  - Extend the path import to include resolve.
  - Update onRecordRemoved to:
    - Resolve the upload directory once per call: `const uploadRoot = await realpath(this.uploadDir);`
    - Resolve the record path against that root: `const candidate = await realpath(resolve(this.uploadDir, record.path));`
    - Verify `candidate.startsWith(uploadRoot + pathSep)` or `candidate === uploadRoot` (we'll use a simple startsWith check with a trailing separator).
    - Only then call `rm(candidate)`; otherwise log a warning and skip deletion.
We must also consider that existing code may already be storing an absolute path in record.path. To avoid breaking functionality, the safest compromise in this snippet-only context is: if record.path is absolute, we still check that it lies under uploadRoot by resolving it via realpath(record.path) and comparing; if it’s relative, resolve it relative to uploadDir. Given we cannot add much surrounding helper code, we’ll implement a simple check using path.isAbsolute. However, isAbsolute is not currently imported; since we’re limited to modifying only shown snippets and want to keep imports minimal, we’ll instead always resolve relative to this.uploadDir via resolve(this.uploadDir, record.path). Because resolve ignores the base when record.path is absolute, this will behave correctly for both absolute and relative paths, and our startsWith(uploadRoot) check will still enforce containment.
All changes are confined to apps/server/src/services/import/import_store.ts; apps/server/src/routes/api/import.ts doesn’t need modifications.
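The containment logic described above, minus the `realpath` step (which needs a live filesystem), can be sketched as a pure path computation; note how `resolve` ignores the base when the candidate is already absolute, so relative and absolute `record.path` values flow through the same check:

```typescript
import { resolve, sep } from "node:path";

// Returns true only when `candidate` (relative or absolute) lands inside
// `root`. resolve() ignores `rootAbs` when `candidate` is already absolute,
// and the trailing-separator check blocks prefix tricks like "/root-evil".
function isInside(root: string, candidate: string): boolean {
    const rootAbs = resolve(root);
    const target = resolve(rootAbs, candidate);
    return target === rootAbs || target.startsWith(rootAbs + sep);
}
```

In production the symlink-resolving `realpath` pass is still needed on top of this, since a symlink inside the upload directory can point anywhere.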
```diff
@@ -1,5 +1,5 @@
-import { mkdir, readdir, rm, } from "fs/promises";
-import { join } from "path";
+import { mkdir, readdir, rm, realpath } from "fs/promises";
+import { join, resolve } from "path";

 import dataDirs from "../data_dir";

@@ -79,7 +79,19 @@
     public async onRecordRemoved(record: ImportRecord): Promise<void> {
         try {
-            await rm(record.path);
+            // Resolve the upload directory and the target path, then ensure the file
+            // to delete is contained within the upload directory.
+            const uploadRoot = await realpath(this.uploadDir);
+            const targetPath = await realpath(resolve(this.uploadDir, record.path));
+
+            if (!targetPath.startsWith(uploadRoot)) {
+                console.error(
+                    `Refusing to delete path outside upload dir. uploadRoot=${uploadRoot}, targetPath=${targetPath}`
+                );
+                return;
+            }
+
+            await rm(targetPath);
         } catch (e) {
             console.error(`Unable to delete file from import store: ${record.path}.`, e);
         }
```
```ts
try {
    await rm(record.path);
} catch (e) {
    console.error(`Unable to delete file from import store: ${record.path}.`, e);
```
Check failure — Code scanning / CodeQL: Use of externally-controlled format string (High)
Copilot Autofix (AI, about 15 hours ago)

Copilot could not generate an autofix suggestion for this alert. Try pushing a new commit or, if the problem persists, contact support.
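To make the alert concrete: `console.error` treats its first argument as a printf-style format string (the same rules `util.format` applies), so %-tokens inside an attacker-influenced `record.path` are interpreted. A minimal demonstration with a hypothetical path value:

```typescript
import { format } from "node:util";

// Hypothetical attacker-chosen file name containing a format token.
const userPath = "evil%sfile";
const errDetail = "ENOENT";

// Interpolating the path into the format string lets %s swallow errDetail:
const unsafe = format(`Unable to delete file: ${userPath}.`, errDetail);

// Passing user data as an argument instead keeps its %-tokens inert:
const safe = format("Unable to delete file: %s.", userPath);
```

The usual remediation is to move user-controlled strings out of the format-string position, e.g. `console.error("Unable to delete file from import store: %s.", record.path, e)`.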
No description provided.