
0.14.0

@oskardudycz oskardudycz released this 12 Sep 11:22

🚀 What's New

1. Added pongo CLI tool.

You can install it globally with:

npm install -g @event-driven-io/pongo

And run it with:

pongo

Or run it without installing it globally by using npx:

npx @event-driven-io/pongo

by @oskardudycz in 78

2. Added strongly-typed client.

Now, if you define a schema like:

type User = {
  _id?: string;
  name: string;
  age: number;
  address?: Address;
  tags?: string[];
};

const schema = pongoSchema.client({
  database: pongoSchema.db({
    users: pongoSchema.collection<User>('users'),
    customers: pongoSchema.collection<Customer>('customers'),
  }),
});

Then pass it to the client to get the typed version:

const typedClient = pongoClient(postgresConnectionString, {
  schema: { definition: schema },
});
// 👇 the client has the same database and collections as defined above
const users = typedClient.database.users;

const doc: User = {
  _id: randomUUID(), // from 'node:crypto'
  name: 'Anita',
  age: 25,
};
const inserted = await users.insertOne(doc);

// 👇 yup, the collection is fully typed!
const pongoDoc = await users.findOne({
  name: 'Anita'
});

You can generate the sample config by calling:

npx @event-driven-io/pongo config sample --generate --file ./src/pongoConfig.ts --collection users --collection orders

Or just print it with:

npx @event-driven-io/pongo config sample --print --collection users --collection customers

Then you can use the generated typing as-is or adjust it, and import it into your application.
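The exact contents of the generated file depend on the Pongo version, but given the pongoSchema API shown above, a file generated for the users and orders collections should look roughly like this (the document types are placeholders to replace with your own):

```typescript
// Hypothetical shape of the generated ./src/pongoConfig.ts;
// the real generator output may differ in details.
import { pongoSchema } from '@event-driven-io/pongo';

// placeholder document types, to be replaced with your own
type User = { _id?: string; name: string };
type Order = { _id?: string; total: number };

export default {
  schema: pongoSchema.client({
    database: pongoSchema.db({
      users: pongoSchema.collection<User>('users'),
      orders: pongoSchema.collection<Order>('orders'),
    }),
  }),
};
```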

by @oskardudycz in 73

3. Added capability to run database migrations

You can run it based on your config file with:

npx @event-driven-io/pongo migrate run --config ./pongoConfig.ts \
--connectionString postgresql://postgres:postgres@localhost:5432/postgres

It'll automatically run the migrations based on the defined collections. Running it multiple times is safe, as each migration will only be run once.

Instead of passing the connection string as a parameter, you can also set the DB_CONNECTION_STRING environment variable and run:

npx @event-driven-io/pongo migrate run --config ./pongoConfig.ts

You can also run it by providing a collections list:

npx @event-driven-io/pongo migrate run --collection users --collection customers \
--connectionString postgresql://postgres:postgres@localhost:5432/postgres

To try it out safely, first add the --dryRun parameter: the migration will run inside a transaction and be rolled back, making no changes.

You can also see what will be generated by calling:

npx @event-driven-io/pongo migrate sql --print --collection users --collection customers

4. Added the option to disable generating the Pongo schema upfront

If you run migrations manually, you can disable the automated migration in the Pongo client and get a performance boost:

const typedClient = pongoClient(postgresConnectionString, {
  schema: { autoMigration: 'None', definition: schema },
});

by @oskardudycz in 71, 72

5. Added option to define Schema Components

Schema components define the database schema as a tree structure. They're used for database collections, enabling migration through code, and are exposed via the schema property. In the longer term, it'll be possible to add your own components, such as indexes, migrations, etc.
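As an illustration of the idea (a conceptual sketch, not Pongo's actual types), a schema-component tree can be modelled as nodes that know their children, so a migration can walk the whole structure:

```typescript
// Conceptual sketch only: Pongo's real schema-component types will differ.
type SchemaComponent = {
  type: string;
  name: string;
  components: SchemaComponent[];
};

const collection = (name: string): SchemaComponent => ({
  type: 'collection',
  name,
  components: [],
});

const db = (name: string, collections: SchemaComponent[]): SchemaComponent => ({
  type: 'database',
  name,
  components: collections,
});

// walking the tree yields every component that may need a migration
const flatten = (c: SchemaComponent): SchemaComponent[] => [
  c,
  ...c.components.flatMap(flatten),
];

const database = db('postgres', [collection('users'), collection('customers')]);
console.log(flatten(database).map((c) => `${c.type}:${c.name}`));
// → ['database:postgres', 'collection:users', 'collection:customers']
```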

by @oskardudycz in 75, 77

6. Added PostgreSQL AdvisoryLocks in Dumbo

Now you can also use them in your application:

import { AdvisoryLock, dumbo, rawSql, single } from '@event-driven-io/dumbo';

const pool = dumbo({ connectionString });

const carName = await AdvisoryLock.withAcquire(
  pool.execute,
  () =>
    single(
      pool.execute.query<{ name: string }>(rawSql(`SELECT name FROM cars LIMIT 1;`)),
    ),
  { lockId: 1023 },
);

Internally they're used to ensure that there are no parallel migrations being run.
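Conceptually, withAcquire makes callers with the same lockId take turns. The following toy in-memory analogue (not Dumbo's implementation, which delegates to PostgreSQL's advisory lock functions) shows the acquire/run/release pattern:

```typescript
// Toy in-memory analogue of an advisory lock: callers with the same lockId
// are serialized; different lockIds don't block each other.
const locks = new Map<number, Promise<void>>();

const withAdvisoryLock = async <T>(
  lockId: number,
  handler: () => Promise<T>,
): Promise<T> => {
  // queue behind whoever currently holds (or is waiting for) this lockId
  const previous = locks.get(lockId) ?? Promise.resolve();
  let release!: () => void;
  locks.set(lockId, new Promise<void>((resolve) => (release = resolve)));

  await previous; // wait until the previous holder releases
  try {
    return await handler();
  } finally {
    release(); // let the next waiter in, even if the handler threw
  }
};
```

Two concurrent migrations guarded by the same lockId would run one after the other, which is exactly the guarantee needed to avoid parallel schema changes.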

by @oskardudycz in 74

Full Changelog: 0.13.1...0.14.0