
Trilium preview import #8654

Draft
eliandoran wants to merge 38 commits into main from feature/trilium_extension

Conversation

@eliandoran
Contributor

No description provided.

@gemini-code-assist
Contributor

Summary of Changes

Hello @eliandoran, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a significant security and usability enhancement to the Trilium import functionality. It provides users with a detailed preview of .trilium archive contents, specifically highlighting any potentially dangerous elements like scripts or widgets before the import is finalized. This allows users to make an informed decision on how to proceed with the import, offering options for safe import (disabling active content) or trusting the source. The changes involve both frontend UI development for the preview dialog and backend API extensions to process and analyze the archive contents.

Highlights

  • Trilium Import Preview Feature: A new feature has been added to provide a preview of Trilium archives (.trilium files) before importing them. This preview displays information such as the number of notes, attributes, and attachments, and crucially, identifies any 'dangerous' content (e.g., scripts, widgets, icon packs) within the archive.
  • Dangerous Content Identification: The system now categorizes and highlights potentially dangerous attributes within imported files, such as client-side scripting, server-side scripting, code execution, icon packs, and web views. This helps users make informed decisions about importing potentially malicious content.
  • Import Flow Enhancement: The import process for .trilium files has been updated. When a .trilium file is dragged and dropped, a preview dialog is shown, allowing users to choose between a 'safe' import (disabling active content) and an 'unsafe' import (trusting the source and enabling active content); for dangerous imports, the import button only becomes active after a short countdown. (A rough sketch of this flow follows the list.)
  • Backend API for Import Preview: New API endpoints (/api/notes/:parentNoteId/preview-import and /api/notes/:parentNoteId/execute-import) have been introduced to handle the preview and execution of .trilium file imports.
  • Refactoring and UI Improvements: Several UI components related to dialogs and modals have been refactored, including the extraction of a generic Card component and reordering of imports for consistency. New translation keys have been added for the import preview dialog.
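
As a rough illustration of the drag-and-drop flow described above (a sketch only, not this PR's code: the handler name, import paths, signatures, and the command payload shape are assumptions; uploadFiles, uploadFilesWithPreview, and showImportPreviewDialog are the only names taken from the changelog below):

// Hypothetical sketch of the note_tree drop handling; import paths, signatures
// and the command payload are assumptions, not the PR's actual code.
import appContext from "../components/app_context";
import { uploadFiles, uploadFilesWithPreview } from "../services/import";

async function onFilesDropped(parentNoteId: string, files: File[]) {
    const triliumArchives = files.filter((f) => f.name.endsWith(".trilium"));

    if (triliumArchives.length === 0) {
        // Regular files keep using the existing import path.
        await uploadFiles(parentNoteId, files);
        return;
    }

    // .trilium archives are first uploaded for analysis only...
    const previews = await uploadFilesWithPreview(parentNoteId, triliumArchives);

    // ...and the preview dialog lets the user choose a safe import
    // (active content disabled) or an unsafe, trusted import.
    appContext.triggerCommand("showImportPreviewDialog", { parentNoteId, previews });
}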


Changelog
  • apps/client/src/components/app_context.ts
    • Added ImportPreviewData import.
    • Added showImportPreviewDialog command mapping.
  • apps/client/src/layouts/layout_commons.tsx
    • Added ImportPreviewDialog import.
    • Added ImportPreviewDialog to the list of modals applied to the root container.
    • Reordered imports for consistency.
  • apps/client/src/services/import.ts
    • Added ImportPreviewResponse import from @triliumnext/commons.
    • Changed BooleanLike type to exclude boolean.
    • Added uploadFilesWithPreview function to handle preview uploads.
    • Added executeUploadWithPreview function to finalize imports after preview.
    • Changed error callback in uploadFiles to use shorthand syntax.
  • apps/client/src/services/utils.ts
    • Added boolToString utility function.
  • apps/client/src/translations/en/translation.json
    • Added new translation keys for the import preview dialog, including titles, counts, and descriptions for dangerous categories.
  • apps/client/src/widgets/dialogs/help.tsx
    • Reordered imports for consistency.
    • Removed local Card component definition, likely replaced by a shared component.
  • apps/client/src/widgets/dialogs/import.tsx
    • Moved boolToString function to services/utils.ts.
    • Reordered imports for consistency.
  • apps/client/src/widgets/dialogs/import_preview.css
    • Added new CSS for the import preview dialog, defining styles for stats, cards, and dangerous categories.
  • apps/client/src/widgets/dialogs/import_preview.tsx
    • Added new component ImportPreviewDialog to display import previews.
    • Implemented logic for displaying dangerous attributes and categories.
    • Included a countdown for dangerous imports before the import button becomes active.
    • Added SinglePreview component for individual file previews.
    • Defined DANGEROUS_CATEGORIES_MAPPINGS and SEVERITY_ORDER for dangerous attribute handling.
  • apps/client/src/widgets/note_tree.ts
    • Modified drag-and-drop logic to differentiate between regular file imports and .trilium archive imports.
    • For .trilium files, it now triggers the showImportPreviewDialog command.
  • apps/client/src/widgets/react/Card.tsx
    • Added new shared Card component.
  • apps/server/src/routes/api/import.ts
    • Added ImportPreviewResponse import.
    • Added readFileSync import.
    • Added previewZipForImport import.
    • Introduced onImportDone helper function.
    • Added importPreview function to handle preview requests for .trilium files.
    • Added importExecute function to handle the final execution of imports after preview.
    • Created importStore to temporarily hold paths of uploaded files for preview.
  • apps/server/src/routes/route_api.ts
    • Added mkdirSync, join, dataDirs, randomString imports.
    • Refactored apiResultHandler to simplify if/else structure.
    • Refactored send function to simplify if/else structure.
    • Added createImportUploadMiddleware for handling .trilium file uploads specifically for preview.
    • Added importMiddlewareWithErrorHandling to use the new import upload middleware.
  • apps/server/src/routes/routes.ts
    • Updated import for importMiddlewareWithErrorHandling.
    • Added new API routes for preview-import and execute-import.
  • apps/server/src/services/builtin_attributes.ts
    • Refactored builtinAttributes to include dangerCategory for dangerous attributes.
    • Exported DangerousAttributeInfo type.
  • apps/server/src/services/import/zip.ts
    • Modified readZipFile to accept Buffer | string for file path or buffer.
    • Exported normalizeFilePath function.
    • Minor code style changes (e.g., function (xhr) to error (xhr)).
  • apps/server/src/services/import/zip_preview.spec.ts
    • Added new test file for zip_preview service.
    • Tests previewMeta function for identifying dangerous attributes and categories.
  • apps/server/src/services/import/zip_preview.ts
    • Added new service previewZipForImport to analyze .trilium zip files and extract preview information.
    • Implemented previewMeta and previewMetaInternal to recursively scan NoteMetaFile for dangerous attributes.
    • Identifies dangerous attributes based on builtin_attributes.ts.
  • packages/commons/src/lib/server_api.ts
    • Added DangerousAttributeCategory type definition.
    • Added ImportPreviewResponse interface definition.
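
For orientation, the new commons types described in the last entry might look roughly like the following; the field names and category values are guesses, and only the type names come from the changelog:

// Hypothetical shapes only; the actual definitions in server_api.ts will differ.
export type DangerousAttributeCategory =
    | "clientScripting"
    | "serverScripting"
    | "codeExecution"
    | "iconPack"
    | "webView";

export interface ImportPreviewResponse {
    id: string;             // server-side handle later passed to execute-import
    fileName: string;
    noteCount: number;
    attributeCount: number;
    attachmentCount: number;
    dangerousAttributes: {
        name: string;
        category: DangerousAttributeCategory;
    }[];
}
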
Activity
  • The pull request introduces a new import preview feature for Trilium archives.
  • This involves significant changes across both the client and server.
  • On the client side, a new ImportPreviewDialog component has been added, along with associated CSS and translation keys.
  • The app_context and layout_commons have been updated to integrate this new dialog and command.
  • The import service now includes functions for previewing and executing imports with preview data.
  • The note_tree widget has been modified to trigger this preview dialog when .trilium files are dragged and dropped.
  • On the server side, new API endpoints have been created to handle the preview and execution of these imports.
  • The builtin_attributes service has been enhanced to categorize dangerous attributes, and a new zip_preview service (with tests) has been implemented to analyze .trilium files for potential risks.
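
A minimal sketch of the kind of recursive scan that previewMeta/previewMetaInternal perform, assuming simplified NoteMeta shapes and a name-to-category lookup derived from builtin_attributes.ts; the actual implementation in zip_preview.ts differs in detail:

// Simplified, assumed shapes; the real NoteMetaFile/NoteMeta types carry more fields.
interface NoteMeta {
    title?: string;
    attributes?: { type: string; name: string; value?: string }[];
    children?: NoteMeta[];
}

interface NoteMetaFile {
    files: NoteMeta[];
}

type DangerousAttributeCategory = string; // see the commons sketch above

// Assumed lookup built from builtin_attributes.ts: attribute name -> danger category.
const DANGEROUS_ATTRIBUTES: Record<string, DangerousAttributeCategory> = {
    widget: "clientScripting",
    run: "codeExecution"
    // ...
};

function previewMetaInternal(note: NoteMeta, found: Map<string, DangerousAttributeCategory>) {
    for (const attr of note.attributes ?? []) {
        const category = DANGEROUS_ATTRIBUTES[attr.name];
        if (category) {
            found.set(attr.name, category);
        }
    }
    // Recurse into child notes so nested dangerous attributes are also reported.
    for (const child of note.children ?? []) {
        previewMetaInternal(child, found);
    }
}

function previewMeta(meta: NoteMetaFile): Map<string, DangerousAttributeCategory> {
    const found = new Map<string, DangerousAttributeCategory>();
    for (const root of meta.files) {
        previewMetaInternal(root, found);
    }
    return found;
}
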

@gemini-code-assist bot left a comment (Contributor)

Code Review

This pull request introduces a new import preview feature, which is a significant and well-implemented addition. My review focuses on improving performance by parallelizing network requests, ensuring proper error handling for asynchronous operations, and addressing potential resource leaks on the server. I've also suggested some minor changes to improve code robustness on the client side.

Comment on lines +95 to +110
const taskId = utils.randomString(10);
let counter = 0;

for (const file of files) {
    counter++;

    server.post(
        `notes/${parentNoteId}/execute-import`,
        {
            ...options,
            id: file.id,
            taskId,
            last: counter === files.length ? "true" : "false"
        }
    );
}

high

The server.post calls are not awaited, which means the function will not wait for the import to complete and any errors will result in unhandled promise rejections. It's better to await all promises to ensure proper execution flow and error handling. Using map with an index is also cleaner than an external counter.

    const taskId = utils.randomString(10);

    const promises = files.map((file, i) => server.post(
        `notes/${parentNoteId}/execute-import`,
        {
            ...options,
            id: file.id,
            taskId,
            last: i === files.length - 1 ? "true" : "false"
        }
    ));

    await Promise.all(promises);

Comment on lines 211 to 213
const buffer = readFileSync(importRecord.path);
const note = await zipImportService.importZip(taskContext, buffer, parentNote);
onImportDone(note, last, taskContext, parentNoteId);

high

The temporary file and the entry in importStore are not cleaned up after the import is executed. This can lead to unnecessary disk space usage and memory consumption over time. It's a good practice to clean up these resources after they are used.

To fix this, you'll also need to import unlink from the fs module:

import { readFileSync, unlink } from "fs";
Suggested change
-const buffer = readFileSync(importRecord.path);
-const note = await zipImportService.importZip(taskContext, buffer, parentNote);
-onImportDone(note, last, taskContext, parentNoteId);
+const buffer = readFileSync(importRecord.path);
+unlink(importRecord.path, err => {
+    if (err) {
+        log.error(`Could not delete temporary file ${importRecord.path}`, err);
+    }
+});
+delete importStore[id];
+const note = await zipImportService.importZip(taskContext, buffer, parentNote);
+onImportDone(note, last, taskContext, parentNoteId);

Comment on lines 66 to 87
const taskId = utils.randomString(10);
const results: ImportPreviewResponse[] = [];
for (const file of files) {
    const formData = new FormData();
    formData.append("upload", file);
    formData.append("taskId", taskId);

    results.push(await $.ajax({
        url: `${window.glob.baseApiUrl}notes/${parentNoteId}/preview-import`,
        headers: await server.getHeaders(),
        data: formData,
        dataType: "json",
        type: "POST",
        timeout: 60 * 60 * 1000,
        error (xhr) {
            toastService.showError(t("import.failed", { message: xhr.responseText }));
        },
        contentType: false, // NEEDED, DON'T REMOVE THIS
        processData: false // NEEDED, DON'T REMOVE THIS
    }));
}
return results;

medium

The file uploads for the preview are performed sequentially due to await inside the for...of loop. This can be slow if multiple files are being imported. You can parallelize these requests using Promise.all to improve performance.

    const taskId = utils.randomString(10);

    const promises = files.map(async file => {
        const formData = new FormData();
        formData.append("upload", file);
        formData.append("taskId", taskId);

        return $.ajax({
            url: `${window.glob.baseApiUrl}notes/${parentNoteId}/preview-import`,
            headers: await server.getHeaders(),
            data: formData,
            dataType: "json",
            type: "POST",
            timeout: 60 * 60 * 1000,
            error (xhr) {
                toastService.showError(t("import.failed", { message: xhr.responseText }));
            },
            contentType: false, // NEEDED, DON'T REMOVE THIS
            processData: false // NEEDED, DON'T REMOVE THIS
        });
    });

    return Promise.all(promises);

return (
    <Card
        title={preview.fileName}
        className={DANGEROUS_CATEGORIES_MAPPINGS[categories[0]]?.category ?? "safe"}

medium

Accessing categories[0] can be undefined if the categories array is empty. While DANGEROUS_CATEGORIES_MAPPINGS[undefined] might not throw a runtime error in JavaScript (it accesses the property named 'undefined'), it's not robust and can lead to unexpected behavior. It's better to explicitly handle the case where categories is empty.

Suggested change
-className={DANGEROUS_CATEGORIES_MAPPINGS[categories[0]]?.category ?? "safe"}
+className={categories.length > 0 ? (DANGEROUS_CATEGORIES_MAPPINGS[categories[0] as DangerousAttributeCategory]?.category ?? "safe") : "safe"}

Comment on lines +211 to +215
return categories.toSorted((a, b) => {
    const aLevel = DANGEROUS_CATEGORIES_MAPPINGS[a].category;
    const bLevel = DANGEROUS_CATEGORIES_MAPPINGS[b].category;
    return SEVERITY_ORDER[aLevel] - SEVERITY_ORDER[bLevel];
});

medium

Accessing DANGEROUS_CATEGORIES_MAPPINGS[a] could potentially lead to a runtime error if a is not a valid key, which would crash the component. While the data comes from the backend and should be valid, adding a defensive check would make this more robust.

    return categories.toSorted((a, b) => {
        const aMapping = DANGEROUS_CATEGORIES_MAPPINGS[a as DangerousAttributeCategory];
        const bMapping = DANGEROUS_CATEGORIES_MAPPINGS[b as DangerousAttributeCategory];

        if (!aMapping) return 1;
        if (!bMapping) return -1;

        const aLevel = aMapping.category;
        const bLevel = bMapping.category;
        return SEVERITY_ORDER[aLevel] - SEVERITY_ORDER[bLevel];
    });


public async onRecordRemoved(record: ImportRecord): Promise<void> {
    try {
        await rm(record.path);

Check failure

Code scanning / CodeQL

Uncontrolled data used in path expression (High)

This path depends on a user-provided value.

Copilot Autofix

AI about 15 hours ago

General approach: ensure that any filesystem path derived from request data is constrained to a specific safe root directory. For deletion, we should only call rm() on paths that are inside DiskImportStore.uploadDir. The robust pattern is: (1) resolve/normalize the candidate path relative to the safe root, (2) resolve any symlinks, and (3) check that the final path starts with the safe root path. If the check fails, we should log and skip deletion instead of deleting.

Best fix here: harden DiskImportStore.onRecordRemoved so it never deletes outside its upload directory. We can do this by importing resolve from "path" and realpath from "fs/promises", then computing a normalized real path and comparing it to the real path of this.uploadDir. Since we cannot change surrounding code or add new fields outside what is shown, we will perform this verification directly inside onRecordRemoved. This keeps existing functionality (cleanup of uploaded files) but adds a safety boundary. We will not change how record.path is set in import_preview; instead we validate before deleting.

Concretely:

  • In apps/server/src/services/import/import_store.ts:
    • Extend the fs/promises import to include realpath.
    • Extend the path import to include resolve.
    • Update onRecordRemoved to:
      • Resolve the upload directory once per call: const uploadRoot = await realpath(this.uploadDir);
      • Resolve the record path against that root: const candidate = await realpath(resolve(this.uploadDir, record.path));
      • Verify candidate.startsWith(uploadRoot + pathSep) or candidate === uploadRoot (we’ll use a simple startsWith check with a trailing separator).
      • Only then call rm(candidate); otherwise log a warning and skip deletion.

We must also consider that existing code may already be storing an absolute path in record.path. To avoid breaking functionality, the safest compromise in this snippet-only context is to handle both cases: an absolute path must still be verified to lie under uploadRoot (by resolving it via realpath and comparing), while a relative path should be resolved against uploadDir. A check based on path.isAbsolute would work, but isAbsolute is not currently imported; since we are limited to modifying only the shown snippets and want to keep imports minimal, we instead always resolve via resolve(this.uploadDir, record.path). Because resolve ignores the base when record.path is absolute, this behaves correctly for both absolute and relative paths, and the startsWith(uploadRoot) check still enforces containment.

All changes are confined to apps/server/src/services/import/import_store.ts; apps/server/src/routes/api/import.ts doesn’t need modifications.


Suggested changeset 1
apps/server/src/services/import/import_store.ts

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/apps/server/src/services/import/import_store.ts b/apps/server/src/services/import/import_store.ts
--- a/apps/server/src/services/import/import_store.ts
+++ b/apps/server/src/services/import/import_store.ts
@@ -1,5 +1,5 @@
-import { mkdir, readdir, rm,  } from "fs/promises";
-import { join } from "path";
+import { mkdir, readdir, rm, realpath } from "fs/promises";
+import { join, resolve } from "path";
 
 import dataDirs from "../data_dir";
 
@@ -79,7 +79,19 @@
 
     public async onRecordRemoved(record: ImportRecord): Promise<void> {
         try {
-            await rm(record.path);
+            // Resolve the upload directory and the target path, then ensure the file
+            // to delete is contained within the upload directory.
+            const uploadRoot = await realpath(this.uploadDir);
+            const targetPath = await realpath(resolve(this.uploadDir, record.path));
+
+            if (!targetPath.startsWith(uploadRoot)) {
+                console.error(
+                    `Refusing to delete path outside upload dir. uploadRoot=${uploadRoot}, targetPath=${targetPath}`
+                );
+                return;
+            }
+
+            await rm(targetPath);
         } catch (e) {
             console.error(`Unable to delete file from import store: ${record.path}.`, e);
         }
EOF
try {
    await rm(record.path);
} catch (e) {
    console.error(`Unable to delete file from import store: ${record.path}.`, e);

Check failure

Code scanning / CodeQL

Use of externally-controlled format string (High)

Format string depends on a user-provided value.

Copilot Autofix

AI about 15 hours ago

Copilot could not generate an autofix suggestion for this alert. Try pushing a new commit or, if the problem persists, contact support.
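
For reference, a common way to satisfy this CodeQL rule (a sketch, assuming the flagged call is the console.error shown above) is to keep user-derived data out of the first argument, which console.error passes through util.format:

// Sketch only: log record.path as a separate argument instead of
// interpolating it into the first (format-string) parameter.
try {
    await rm(record.path);
} catch (e) {
    console.error("Unable to delete file from import store:", record.path, e);
}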
