Commit da9ec60

document deserialization errors
1 parent fbfa75a commit da9ec60

File tree: 4 files changed, +101 -27 lines changed

docs/collections/powersync-collection.md

Lines changed: 27 additions & 3 deletions
@@ -99,9 +99,19 @@ const documentsCollection = createCollection(
     table: APP_SCHEMA.props.documents,
   })
 )
-```
 
-#### Option 2: SQLite Types with Schema Validation
+/** Note: The input and output types are defined as follows. */
+// Used for mutations like `insert` or `update`
+type DocumentCollectionInput = {
+  id: string
+  name: string | null
+  author: string | null
+  created_at: string | null // SQLite TEXT
+  archived: number | null // SQLite INTEGER
+}
+// The type of query/data results
+type DocumentCollectionOutput = DocumentCollectionInput
+```
 
 The standard PowerSync SQLite types map to these TypeScript types:
 
@@ -111,13 +121,17 @@ The standard PowerSync SQLite types map to these TypeScript types:
 | `column.integer` | `number \| null` | Integer values, also used for booleans (0/1) |
 | `column.real` | `number \| null` | Floating point numbers |
 
-Note: All PowerSync column types are nullable by default, as SQLite allows null values in any column. Your schema should always handle null values appropriately by using `.nullable()` in your Zod schemas and handling null cases in your transformations.
+Note: All PowerSync column types are nullable by default.
+
+#### Option 2: SQLite Types with Schema Validation
 
 Additional validations for collection mutations can be performed with a custom schema. The schema below asserts that
 the `name`, `author` and `created_at` fields are required as input. `name` also has an additional string length check.
 
 Note: The input and output types specified in this example still satisfy the underlying SQLite types. An additional `deserializationSchema` is required if the typing differs. See the examples below for more details.
 
+The application logic (including the backend) should enforce that all incoming synced data passes validation with the `deserializationSchema`. Failing to validate data results in inconsistent collection data. This is a fatal error! An `onDeserializationError` handler must be provided to react to this case.
+
 ```ts
 import { createCollection } from "@tanstack/react-db"
 import { powerSyncCollectionOptions } from "@tanstack/powersync-db-collection"
@@ -137,6 +151,9 @@ const documentsCollection = createCollection(
     database: db,
     table: APP_SCHEMA.props.documents,
     schema,
+    onDeserializationError: (error) => {
+      // Present fatal error
+    },
   })
 )
 
@@ -161,6 +178,10 @@ Note: The Transformed types are provided by TanStackDB to the PowerSync SQLite p
 order to be persisted to SQLite. Most types are converted by default. For custom types, override the serialization by providing a
 `serializer` param.
 
+The example below uses `nullable` columns; this is not a requirement.
+
+The application logic (including the backend) should enforce that all incoming synced data passes validation with the `deserializationSchema`. Failing to validate data results in inconsistent collection data. This is a fatal error! An `onDeserializationError` handler must be provided to react to this case.
+
 ```ts
 const schema = z.object({
   id: z.string(),
@@ -180,6 +201,9 @@ const documentsCollection = createCollection(
     database: db,
     table: APP_SCHEMA.props.documents,
     schema,
+    onDeserializationError: (error) => {
+      // Present fatal error
+    },
     // Optional: custom column serialization
     serializer: {
       // Dates are serialized by default; this is just an example
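
The docs above note that a `deserializationSchema` is required whenever the collection's output types differ from the raw SQLite row. Below is a minimal sketch of that case, combining the options shown in this diff. It assumes a `PowerSyncDatabase` instance named `db` and the `APP_SCHEMA` from the docs; the transform details are illustrative, not taken from this commit.

```ts
import { createCollection } from "@tanstack/react-db"
import { powerSyncCollectionOptions } from "@tanstack/powersync-db-collection"
import { z } from "zod"

// Output types differ from SQLite: created_at becomes a Date, archived a boolean.
const schema = z.object({
  id: z.string(),
  name: z.string(),
  author: z.string(),
  created_at: z.date(),
  archived: z.boolean(),
})

// Validates raw SQLite rows from the sync stream and converts them to the
// output types above. Synced rows that fail this validation are fatal.
const deserializationSchema = z.object({
  id: z.string(),
  name: z.string(),
  author: z.string(),
  created_at: z.string().transform((text) => new Date(text)),
  archived: z.number().transform((flag) => flag === 1),
})

const documentsCollection = createCollection(
  powerSyncCollectionOptions({
    database: db, // assumed PowerSyncDatabase instance
    table: APP_SCHEMA.props.documents,
    schema,
    deserializationSchema,
    onDeserializationError: (error) => {
      // Fatal: the local collection no longer reflects the synced data
      console.error(error.issues)
    },
  })
)
```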

packages/powersync-db-collection/src/definitions.ts

Lines changed: 7 additions & 7 deletions
@@ -85,6 +85,13 @@ export type SerializerConfig<
    * ```
    */
   serializer?: CustomSQLiteSerializer<TOutput, TSQLite>
+
+  /**
+   * Application logic should ensure that incoming synced data is always valid.
+   * Failing to deserialize and apply incoming changes results in data inconsistency - which is a fatal error.
+   * Use this callback to react to deserialization errors.
+   */
+  onDeserializationError: (error: StandardSchemaV1.FailureResult) => void
 }
 
 /**
@@ -154,13 +161,6 @@ export type ConfigWithArbitraryCollectionTypes<
     ExtractedTable<TTable>,
     StandardSchemaV1.InferOutput<TSchema>
   >
-
-  /**
-   * Application logic should ensure that incoming synced data is always valid.
-   * Failing to deserialize and apply incoming changes results in data inconsistency - which is a fatal error.
-   * Use this callback to react to deserialization errors.
-   */
-  onDeserializationError: (error: StandardSchemaV1.FailureResult) => void
 }
 export type BasePowerSyncCollectionConfig<
   TTable extends Table = Table,
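
With this change, `onDeserializationError` moves from `ConfigWithArbitraryCollectionTypes` onto the shared config type (alongside `serializer`), so every collection must now supply a handler. A hedged sketch of one possible handler shape follows; the escalation strategy is app-specific and not prescribed by this commit.

```ts
import type { StandardSchemaV1 } from "@standard-schema/spec"

// Log every issue with its path, then escalate. Once this fires, the local
// collection state can no longer be trusted, so treat it as fatal.
const onDeserializationError = (
  error: StandardSchemaV1.FailureResult
): void => {
  for (const issue of error.issues) {
    const path = issue.path
      ?.map((segment) =>
        typeof segment === `object` ? String(segment.key) : String(segment)
      )
      .join(`.`)
    console.error(
      `Deserialization failed at ${path ?? `<root>`}: ${issue.message}`
    )
  }
  // e.g. prompt the user to clear local data and trigger a fresh sync
}
```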

packages/powersync-db-collection/src/powersync.ts

Lines changed: 14 additions & 16 deletions
@@ -248,23 +248,21 @@ export function powerSyncCollectionOptions<
    * Deserializes data from the incoming sync stream
    */
   const deserializeSyncRow = (value: TableType): OutputType => {
-    if (deserializationSchema) {
-      const validation = deserializationSchema[`~standard`].validate(value)
-      if (`value` in validation) {
-        return validation.value
-      } else if (`issues` in validation) {
-        const issueMessage = `Failed to validate incoming data for ${viewName}. Issues: ${validation.issues.map((issue) => `${issue.path} - ${issue.message}`)}`
-        database.logger.error(issueMessage)
-        onDeserializationError!(validation)
-        throw new Error(issueMessage)
-      } else {
-        const unknownErrorMessage = `Unknown deserialization error for ${viewName}`
-        database.logger.error(unknownErrorMessage)
-        onDeserializationError!({ issues: [{ message: unknownErrorMessage }] })
-        throw new Error(unknownErrorMessage)
-      }
+    const validationSchema = deserializationSchema || schema
+    const validation = validationSchema[`~standard`].validate(value)
+    if (`value` in validation) {
+      return validation.value
+    } else if (`issues` in validation) {
+      const issueMessage = `Failed to validate incoming data for ${viewName}. Issues: ${validation.issues.map((issue) => `${issue.path} - ${issue.message}`)}`
+      database.logger.error(issueMessage)
+      onDeserializationError!(validation)
+      throw new Error(issueMessage)
+    } else {
+      const unknownErrorMessage = `Unknown deserialization error for ${viewName}`
+      database.logger.error(unknownErrorMessage)
+      onDeserializationError!({ issues: [{ message: unknownErrorMessage }] })
+      throw new Error(unknownErrorMessage)
     }
-    return value as OutputType
   }
 
   // We can do basic runtime validations for columns if no explicit schema has been provided
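
Two behavioral changes fall out of this refactor: validation now always runs, falling back to the collection `schema` when no `deserializationSchema` is given, and the old unvalidated pass-through (`return value as OutputType`) is gone. For reference, here is a minimal sketch of the Standard Schema `~standard` contract that `deserializeSyncRow` relies on, shown with Zod (which implements Standard Schema v1 as of 3.24); this snippet is illustrative and not part of the commit.

```ts
import { z } from "zod"

const row = z.object({ id: z.string(), archived: z.number().nullable() })

// Any Standard Schema v1 library exposes validate() under the `~standard` key.
const result = row[`~standard`].validate({ id: `doc-1`, archived: null })

// validate() may return a Promise for async schemas; Zod validates synchronously.
if (result instanceof Promise) {
  throw new Error(`async validation is not expected for sync rows`)
}
if (result.issues) {
  // FailureResult: a list of { message, path? } issues
  console.error(result.issues.map((issue) => issue.message).join(`, `))
} else {
  // SuccessResult: the (possibly transformed) output value
  console.log(result.value)
}
```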

packages/powersync-db-collection/tests/collection-schema.test.ts

Lines changed: 53 additions & 1 deletion
@@ -2,9 +2,10 @@ import { randomUUID } from "node:crypto"
 import { tmpdir } from "node:os"
 import { PowerSyncDatabase, Schema, Table, column } from "@powersync/node"
 import { SchemaValidationError, createCollection } from "@tanstack/db"
-import { describe, expect, it, onTestFinished } from "vitest"
+import { describe, expect, it, onTestFinished, vi } from "vitest"
 import { z } from "zod"
 import { powerSyncCollectionOptions } from "../src"
+import type { StandardSchemaV1 } from "@standard-schema/spec"
 
 const APP_SCHEMA = new Schema({
   documents: new Table({
@@ -108,6 +109,7 @@ describe(`PowerSync Schema Integration`, () => {
         database: db,
         table: APP_SCHEMA.props.documents,
         schema,
+        onDeserializationError: () => {},
       })
     )
     onTestFinished(() => collection.cleanup())
@@ -167,6 +169,7 @@ describe(`PowerSync Schema Integration`, () => {
         database: db,
         table: APP_SCHEMA.props.documents,
         schema,
+        onDeserializationError: () => {},
       })
     )
     onTestFinished(() => collection.cleanup())
@@ -306,5 +309,54 @@ describe(`PowerSync Schema Integration`, () => {
     expect(item?.name instanceof MyDataClass).true
     expect(item?.name?.options.value).eq(`document`)
   })
+
+  /**
+   * We sync data which cannot be validated by the schema. This is a fatal error.
+   */
+  it(`should catch deserialization errors`, async () => {
+    const db = await createDatabase()
+
+    /**
+     * This schema requires every column to be non-null, so a synced row
+     * with null columns will fail validation.
+     */
+    const schema = z.object({
+      id: z.string(),
+      name: z.string(),
+      archived: z.number(),
+      author: z.string(),
+      created_at: z.string(),
+    })
+
+    const onError = vi.fn((() => {}) as (
+      error: StandardSchemaV1.FailureResult
+    ) => void)
+
+    const collection = createCollection(
+      powerSyncCollectionOptions({
+        database: db,
+        table: APP_SCHEMA.props.documents,
+        schema,
+        onDeserializationError: onError,
+      })
+    )
+    onTestFinished(() => collection.cleanup())
+
+    await collection.stateWhenReady()
+
+    // The columns are not nullable in the schema.
+    // Write invalid data directly to SQLite; this simulates a sync.
+    await db.execute(`INSERT INTO documents(id) VALUES(uuid())`)
+
+    await vi.waitFor(
+      () => {
+        const issues = onError.mock.lastCall?.[0]?.issues
+        expect(issues).toBeDefined()
+        // One issue per column that should have been defined
+        expect(issues?.length).eq(4)
+      },
+      { timeout: 1000 }
+    )
+  })
 })
 })
