diff --git a/.agents/README.md b/.agents/README.md
new file mode 100644
index 0000000..f2ee7da
--- /dev/null
+++ b/.agents/README.md
@@ -0,0 +1,34 @@

# `.agents/`

This directory hosts [Agent Skills](https://agentskills.io) for the
Airfoil TypeScript SDK. Skills provide AI coding agents with focused,
progressively disclosed playbooks for specific, repeatable tasks against
this codebase.

## Available skills

- [`skills/airfoil-kit/`](./skills/airfoil-kit/) — end-to-end playbook
  for implementing a new Airfoil producer connector from scratch using the
  `templates/producer-template/` scaffold, with docs-first API research and
  deterministic validation gates.

## Directory layout

Each skill follows the standard layout:

```
.agents/skills/<skill-name>/
├── SKILL.md       # lean orchestrator (YAML frontmatter + body)
├── references/    # deep docs (tier-3, loaded on demand)
└── assets/        # copy-paste templates and checklists
```

`SKILL.md` is the entry point. Its frontmatter (`name`, `description`) is
what agent hosts index on, and the body points at the detailed files under
`references/` and `assets/`.

## Where is this referenced?

- [`AGENTS.md`](../AGENTS.md) at the repo root has a short pointer section.
- Editor-agent tooling (Claude, Cursor, Codex, OpenCode, etc.) reads the
  YAML frontmatter of each `SKILL.md` to decide when to activate the skill.

diff --git a/.agents/skills/airfoil-kit/README.md b/.agents/skills/airfoil-kit/README.md
new file mode 100644
index 0000000..7a7002b
--- /dev/null
+++ b/.agents/skills/airfoil-kit/README.md
@@ -0,0 +1,58 @@

# airfoil-kit skill

Agent skill that walks an AI coding assistant through implementing a new
Airfoil producer connector end-to-end.

## What this skill does

- Confirms no existing implementation is being copied.
- Copies `templates/producer-template/` into `connectors/producer-<name>/`.
- Helps you research the target API and derive schemas from recorded traffic.
- Wires Effect v4 `Config`, API clients, `WebhookRoute`, and streams.
- Guides deterministic replay testing (VCR for REST/GraphQL, fixtures/mocks for gRPC).
- Enforces a Definition of Done before declaring the task complete.

Primary intent: enforce a docs-first, evidence-based process that adapts to
any target platform without hardcoding provider-specific assumptions.

## Entry point

Start at [`SKILL.md`](./SKILL.md). It contains the hard rules and a pointer
index; load the `references/<topic>.md` files on demand as you progress.

Canonical process docs:

- `SKILL.md`
- `references/playbook.md`
- `references/api-research.md`
- `references/definition-of-done.md`

Example-oriented docs are optional aids, not normative contracts.

## Files

```
SKILL.md                             # orchestrator
references/
├── playbook.md                      # numbered end-to-end flow
├── template-walkthrough.md          # file-by-file tour of producer-template
├── connector-archetypes.md          # generic capability classification framework
├── api-mode-graphql.md              # GraphQL implementation contract
├── api-mode-grpc.md                 # gRPC implementation contract
├── connector-kit-api.md             # exhaustive @useairfoil/connector-kit docs
├── effect-vcr-api.md                # exhaustive @useairfoil/effect-vcr docs
├── effect-v4-essentials.md          # Effect v4 idioms relevant to connectors
├── patterns.md                      # shared patterns (cursor, cutoff, streams)
├── webhooks.md                      # WebhookRoute + signature verification
├── vcr-workflow.md                  # record/replay + ACK_DISABLE_VCR
├── api-research.md                  # how to learn a real API's shape
├── anti-cheat.md                    # pre-flight checks
├── test-data.md                     # sandbox creds, seeding, coverage
├── definition-of-done.md            # gates before marking complete
├── example-producer-polar.md        # kitchen-sink reference walkthrough
├── example-pagination.md            # optional pagination pattern catalog
├── example-auth.md                  # optional auth implementation patterns
└── example-webhook-verification.md  # optional verification examples
assets/
└── rename-checklist.md              # exact
find/replace list after cp -R +``` diff --git a/.agents/skills/airfoil-kit/SKILL.md b/.agents/skills/airfoil-kit/SKILL.md new file mode 100644 index 0000000..6b08184 --- /dev/null +++ b/.agents/skills/airfoil-kit/SKILL.md @@ -0,0 +1,197 @@ +--- +name: airfoil-kit +description: Implement a new Airfoil producer connector end-to-end. Use when the user asks to build, add, scaffold, create, or port a connector/producer/integration for any SaaS API (Stripe, Shopify, GitHub, Intercom, HubSpot, Linear, Polar, custom, etc.) in this monorepo. Copies templates/producer-template/, researches the real API, wires Effect v4 Config + HttpClient + streams + WebhookRoute, and finishes with deterministic replay tests (VCR for REST/GraphQL, fixtures/mocks for gRPC). +--- + +# airfoil-kit + +You are implementing a new producer connector for the Airfoil Connector Kit (ACK) +inside this monorepo. Work in small, verified steps. Use the template as your +starting point, never guess API shapes, and keep changes aligned with the +existing patterns in `connectors/producer-polar/`. + +--- + +## Hard rules (do not violate) + +1. **Copy the template. Do not invent a new structure.** The canonical scaffold + is `templates/producer-template/`. Start every new connector with + `cp -R templates/producer-template connectors/producer-` and adapt from + there. See [`assets/rename-checklist.md`](./assets/rename-checklist.md). +2. **No pre-existing connector for the target service.** Before writing any + code, run the pre-flight checks in [`references/anti-cheat.md`](./references/anti-cheat.md). + If an implementation exists, stop and report it — do not copy, rename, or + refactor it. +3. **Use Effect v4 only** (`effect@4.x`, `@effect/vitest@4.x`, `@effect/platform-*@4.x`). + No legacy `@effect/platform`, `@effect/schema`, or Effect v2/v3 patterns. + Read [`references/effect-v4-essentials.md`](./references/effect-v4-essentials.md) + whenever you reach for a new Effect module. +4. 
**No `process.env` reads in connector code or tests.** Use + `Config`/`ConfigProvider` everywhere. Sandbox/runtime layers attach + `ConfigProvider.fromEnv()`; tests attach `ConfigProvider.fromUnknown({ ... })` + or equivalent Effect config providers. +5. **Never edit cassette files by hand.** `test/__cassettes__/**` is write-only + via record/replay flow. If replay mismatches, re-record or adjust matcher / + redaction config — never patch cassette JSON directly. +6. **Schemas must be derived from real, observed API traffic.** For REST/GraphQL: + record a VCR cassette against the real (sandbox) API and define `Schema.Struct` + fields from the cassette. For gRPC: use deterministic proto fixtures or a mock + server. Never hand-fabricate field names from memory. See + [`references/vcr-workflow.md`](./references/vcr-workflow.md) and + [`references/api-research.md`](./references/api-research.md). +7. **You must pass the implementation gate before writing connector code.** + Before scaffolding or editing files under `connectors/producer-/`, + produce an API-facts artifact with: API mode (`rest`/`graphql`/`grpc`), + source evidence URLs + access date, and a pinned API version rationale. + Default artifact path: `connectors/producer-/api-facts.md`. + If the user requests non-persistence, keep it ephemeral but include the same + facts in the final report. +8. **Webhook verification must follow platform docs exactly.** If the upstream + service signs events, implement verification using the provider-documented + contract (inputs, canonicalization, algorithm, encoding, tolerance). Use raw + request bytes whenever the platform requires them. See + [`references/webhooks.md`](./references/webhooks.md). +9. **Signed webhook verification must fail closed.** If signature verification + is enabled and required verification inputs are missing (for example raw + request bytes or signature headers), fail with a typed connector error. + Never silently skip verification in this state. 
+10. **`test` and `test:ci` must load config equivalently.** If tests rely on + env vars, both scripts must provide the same config-loading behavior + (for example both loading `.env` through script/runtime flags). +11. **Pagination behavior must come from official platform docs.** Do not infer + continuation semantics from memory or examples. Validate your implementation + against recorded traffic and deterministic tests. +12. **Expected failures must use typed error channels.** Do not throw inside + `Effect.sync` for recoverable connector errors. Map failures to + `ConnectorError` (or connector-specific tagged errors mapped to it). +13. **The Definition of Done is a gate.** Do not declare complete until every + item in [`references/definition-of-done.md`](./references/definition-of-done.md) + passes (lint, typecheck, build, test:ci, and mode-appropriate deterministic + replay: VCR for REST/GraphQL, fixtures or mock servers for gRPC). + +--- + +## High-level flow + +1. **Pre-flight** — confirm no existing implementation. → [`references/anti-cheat.md`](./references/anti-cheat.md) +2. **Archetype + mode** — classify the target API (sandbox URL? test keys? + OAuth? webhook-only? polling-only?) and choose one implementation mode: + `rest`, `graphql`, or `grpc`. → + [`references/connector-archetypes.md`](./references/connector-archetypes.md) +3. **API research + evidence** — collect real endpoint + auth + pagination + + webhook docs and write an API-facts artifact (required during implementation). + Default path is `connectors/producer-/api-facts.md`; if user asks not + to persist it, keep the same facts in ephemeral notes and final report. + Include source URLs, access date, selected version, and why. + → [`references/api-research.md`](./references/api-research.md) +4. **Mode-specific standards** — read the one mode doc you selected and treat + it as the implementation contract. 
Keep decisions evidence-based, and adapt + abstractions to the target platform rather than copying one provider's shape. +5. **Credentials / test data** — ask the user for sandbox credentials and + seed data; set up `.env`. → [`references/test-data.md`](./references/test-data.md) +6. **Scaffold** — `cp -R templates/producer-template connectors/producer-` + and run the rename checklist. → [`assets/rename-checklist.md`](./assets/rename-checklist.md) + and [`references/template-walkthrough.md`](./references/template-walkthrough.md) +7. **Implement API client (mode-specific)** — use your selected mode contract + and validate auth + pagination behavior against real docs and captured + traffic. Also cross-check kit contracts in + [`references/connector-kit-api.md`](./references/connector-kit-api.md). +8. **Define schemas from real traffic** — use mode-appropriate evidence: + - REST/GraphQL: record a cassette in `record` mode, then derive + `Schema.Struct` fields from observed responses. + - gRPC: use deterministic proto fixtures and/or mock server outputs. + → [`references/vcr-workflow.md`](./references/vcr-workflow.md), + [`references/api-mode-grpc.md`](./references/api-mode-grpc.md) +9. **Wire entities + streams + webhook route** — follow the template's + `makeEntityStreams` / `defineConnector` / `WebhookRoute` pattern. Add + webhook signature verification if the service signs events. → + [`references/patterns.md`](./references/patterns.md), + [`references/webhooks.md`](./references/webhooks.md) +10. **Update the sandbox runner** — rename config names and port, keep the + telemetry + console publisher boilerplate. +11. **Write tests** — + - REST/GraphQL: `api.vcr.test.ts` replays the backfill path. + - gRPC: deterministic fixture/mock-server tests cover equivalent paths. + - `webhook.test.ts` exercises webhook endpoint behavior in-memory. + Switch to replay mode (or fixture-only deterministic mode) before + committing. +12. 
**Run the CI gate locally** — `pnpm run lint && pnpm run typecheck && pnpm run build && pnpm run test:ci`. + Every one must pass. → [`references/definition-of-done.md`](./references/definition-of-done.md) + +A detailed, numbered version of this flow lives at +[`references/playbook.md`](./references/playbook.md). Read it on every run. + +--- + +## Files you will almost always need to edit + +After `cp -R templates/producer-template connectors/producer-`: + +- `package.json` — rename, bump version, add service SDK / crypto deps. +- `.env.example` — rename env vars, list required sandbox credentials. +- `src/schemas.ts` — replace `PostSchema` with real entities. +- `src/api.ts` — replace `/posts` endpoint, adjust pagination + auth. +- `src/streams.ts` — keep the shape, adjust `cursorField`, cutoff logic. +- `src/connector.ts` — rename service tags, wire entities, implement webhook + signature verification, rename `TEMPLATE_*` env vars. +- `src/sandbox.ts` — rename env vars and service name for logging + telemetry. +- `src/index.ts` — update exports. +- `test/api.vcr.test.ts` — REST/GraphQL replay test from real recorded cassette. +- `test/__fixtures__/**` and/or gRPC mock-server tests — gRPC deterministic + replay artifacts. +- `test/webhook.test.ts` — adjust payload fixtures. +- `README.md` — describe the connector, required env, and test flow. + +[`references/template-walkthrough.md`](./references/template-walkthrough.md) +explains each file line-by-line. + +--- + +## When stuck + +- For "what does this Effect symbol do?" → [`references/effect-v4-essentials.md`](./references/effect-v4-essentials.md). +- For GraphQL-mode implementation details → [`references/api-mode-graphql.md`](./references/api-mode-graphql.md). +- For gRPC-mode implementation details → [`references/api-mode-grpc.md`](./references/api-mode-grpc.md). +- For "what exports does the kit give me?" 
→ [`references/connector-kit-api.md`](./references/connector-kit-api.md) + and [`references/effect-vcr-api.md`](./references/effect-vcr-api.md). +- For "how did the Polar connector solve X?" → [`references/example-producer-polar.md`](./references/example-producer-polar.md). +- If truly blocked by missing API facts → **ask the user** (sandbox URL, test + key format, webhook header name, pagination style). Never guess. + +## If MCP tools are unavailable + +Do not block on Context7/DeepWiki availability. + +Fallback order: + +1. Local repo source of truth: + - `AGENTS.md` + - `packages/connector-kit/src/**` + - `packages/effect-vcr/src/**` + - `connectors/producer-polar/**` + - `templates/producer-template/**` +2. Official public docs via normal web fetch/search. +3. Ask the user only for missing, material facts (credentials, webhook + signing details, v1 scope). + +--- + +## Output expectations + +- Small, additive commits. Minimize edits outside + `connectors/producer-/`; if cross-package changes are needed, keep them + narrowly scoped and explicitly justify why. +- All generated code must typecheck, lint, and build. +- Final message must summarize: entities delivered, deterministic test evidence + recorded (VCR or fixtures/mocks), commands you ran, and any follow-ups. +- Final message must include an **Environment Setup Guide** for the user: + where each env var is obtained, required scopes/permissions, exact setup + steps, and a quick "verify config" checklist. + +Use this output shape: + +1. `ENV_VAR_NAME` +2. Where to obtain it (dashboard/API flow + link/path) +3. Required scope/permission +4. Setup step (`cp .env.example .env`, paste value) +5. 
Verification command and expected signal

diff --git a/.agents/skills/airfoil-kit/assets/rename-checklist.md b/.agents/skills/airfoil-kit/assets/rename-checklist.md
new file mode 100644
index 0000000..67093bf
--- /dev/null
+++ b/.agents/skills/airfoil-kit/assets/rename-checklist.md
@@ -0,0 +1,128 @@

# rename-checklist

After copying `templates/producer-template/` to
`connectors/producer-<name>/`, apply every find-and-replace below
before writing any new code. Missing a rename is the most common reason
tests or builds fail.

Replace `<name>`, `<Name>`, `<NAME>` with the three casing
variants of the target service name:

- `<name>` — lowercase kebab (e.g. `stripe`, `shopify-admin`).
- `<Name>` — PascalCase (e.g. `Stripe`, `ShopifyAdmin`).
- `<NAME>` — SCREAMING_SNAKE (e.g. `STRIPE`, `SHOPIFY_ADMIN`).

## File-name renames

The copy target already renamed the directory. Check inside the new
package for any file that still references `template`:

```bash
# Should return no results after rename pass
rg -l "template" connectors/producer-<name> --glob '!**/__cassettes__' --glob '!**/dist' --glob '!**/node_modules'
```

## Identifier map

| Find | Replace with |
| --- | --- |
| `producer-template` | `producer-<name>` |
| `@useairfoil/producer-template` | `@useairfoil/producer-<name>` |
| `TEMPLATE_` (env prefix) | `<NAME>_` |
| `TemplateApiClient` | `<Name>ApiClient` |
| `TemplateApiClientConfig` | `<Name>ApiClientConfig` |
| `TemplateApiClientService` | `<Name>ApiClientService` |
| `TemplateListPage` | `<Name>ListPage` |
| `TemplateConfig` (type) | `<Name>Config` |
| `TemplateConfigConfig` (Config value) | `<Name>ConfigConfig` |
| `TemplateConnector` (service tag) | `<Name>Connector` |
| `TemplateConnectorConfig` (layer factory) | `<Name>ConnectorConfig` |
| `TemplateConnectorRuntime` | `<Name>ConnectorRuntime` |
| `makeTemplateConnector` | `make<Name>Connector` |
| `Template` (any other identifier prefix) | `<Name>` |
| `template` (lowercase in strings / URNs) | `<name>` |
| `@useairfoil/producer-template/TemplateApiClient` | `@useairfoil/producer-<name>/<Name>ApiClient` |
| `@useairfoil/producer-template/TemplateConnector` | `@useairfoil/producer-<name>/<Name>Connector` |
| `"producer-template"` (connector name string) | `"producer-<name>"` |
| `"/webhooks/template"` (route path) | `"/webhooks/<name>"` |

Env vars from `.env.example`:

| Find | Replace with |
| --- | --- |
| `TEMPLATE_API_BASE_URL` | `<NAME>_API_BASE_URL` |
| `TEMPLATE_API_TOKEN` | `<NAME>_API_TOKEN` (or whatever the service calls it) |
| `TEMPLATE_WEBHOOK_PORT` | `<NAME>_WEBHOOK_PORT` |
| `TEMPLATE_WEBHOOK_SECRET` | `<NAME>_WEBHOOK_SECRET` |

## Entity-name renames

The template ships one toy entity `posts`. For your v1 entity list:

| Find | Replace with |
| --- | --- |
| `posts` (entity name in `defineEntity`) | `<entities>` |
| `PostSchema` | `<Entity>Schema` |
| `Post` (type alias) | `<Entity>` |
| `"/posts"` (API path) | the real endpoint path |
| `post.created` / `post.updated` (webhook types) | the real event types |

Add further entities by duplicating the `makeEntityStreams` + `defineEntity`
block.

## URLs and base paths

- `https://jsonplaceholder.typicode.com` → the real API base URL.
- If the real base URL depends on sandbox vs prod, set the default in
  `Config.withDefault(...)` to the sandbox URL.

## Cassettes

Delete the copied cassette before re-recording against the real API:

```bash
rm -rf connectors/producer-<name>/test/__cassettes__
```

The next `pnpm run test` run in `mode: "record"` will
recreate it.

## README

Rewrite `connectors/producer-<name>/README.md`:

- Drop every JSONPlaceholder reference.
- Document the real API entities, auth, base URLs, env vars.
- List known limitations specific to the target (rate limits, missing
  historical data, sandbox quirks).
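Deriving the three casing variants used throughout this checklist can be scripted instead of done by hand. A minimal sketch (hypothetical helper, not part of the template):

```typescript
// Hypothetical helper (not part of the template): derive the three
// casing variants this checklist uses from a kebab-case service name.
const caseVariants = (kebab: string) => {
  const parts = kebab.split("-");
  return {
    kebab, // e.g. "shopify-admin"
    pascal: parts
      .map((p) => p.charAt(0).toUpperCase() + p.slice(1))
      .join(""), // e.g. "ShopifyAdmin"
    screaming: parts.join("_").toUpperCase(), // e.g. "SHOPIFY_ADMIN"
  };
};
```

An editor's case-preserving replace covers the same ground; the helper just makes the mapping explicit.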
## Verification

After all renames, these should return zero hits:

```bash
rg -n "template|TEMPLATE|Template" connectors/producer-<name> \
  --glob '!**/__cassettes__' --glob '!**/dist' --glob '!**/node_modules'

rg -n "jsonplaceholder" connectors/producer-<name> \
  --glob '!**/__cassettes__'
```

If either has hits, investigate before moving on. A stray identifier
will break the build or (worse) silently run against the template's toy
endpoint.

## Global search/replace shortcut

In most editors, a case-preserving search-and-replace across the new
package handles 95% of the work:

```
Find: Template
Replace: <Name>
Case: preserve (Template→<Name>, template→<name>, TEMPLATE→<NAME>)
Scope: connectors/producer-<name>/
```

Then manually verify the remaining hits against the identifier table
above.

diff --git a/.agents/skills/airfoil-kit/references/anti-cheat.md b/.agents/skills/airfoil-kit/references/anti-cheat.md
new file mode 100644
index 0000000..a7ee75c
--- /dev/null
+++ b/.agents/skills/airfoil-kit/references/anti-cheat.md
@@ -0,0 +1,132 @@

# anti-cheat

Pre-flight checks the agent MUST run before writing any connector code.
The purpose is to guarantee the agent demonstrates competence against the
target API rather than paraphrasing a pre-existing implementation.

## Why this matters

If a previous `producer-<name>` already exists in the repo, the agent
could trivially "build" it by copying that code. That hides whether the
skill itself actually teaches the agent to design connectors. Worse, it
lets stale implementations slip into new work unnoticed.

## Pre-flight checks (run in order)

Run all of these from the repo root. If **any** surfaces the target
service, STOP and surface to the user.

### 1. Direct name match in source

```bash
rg -n "<name>" connectors packages --glob '!**/node_modules' --glob '!**/dist' -i
```

Searches for the service name in source files. A hit means someone has
already referenced this service.

### 2.
Connector directory exists

```bash
ls connectors/ | rg -i "<name>"
```

Hit => a full connector already exists; stop.

### 3. Build outputs leak implementation

```bash
ls packages/*/dist 2>/dev/null | rg -i "<name>" || true
ls connectors/*/dist 2>/dev/null | rg -i "<name>" || true
```

Hit => stale build output likely from an earlier implementation.

### 4. Node modules leak implementation

```bash
ls node_modules 2>/dev/null | rg -i "producer-<name>" || true
ls node_modules/@useairfoil 2>/dev/null | rg -i "<name>" || true
```

Hit => a published package exists; do not install or inspect it.

### 5. Git history mentions the target

```bash
git log --oneline --all 2>/dev/null | rg -i "<name>" | head -n 20
```

Hits are informational. If they describe a prior attempt, ask the user
whether that attempt should be resumed/rebased or whether this is a
clean rebuild.

## What "STOP" means

STOP means:

1. Do not read the pre-existing files.
2. Do not `cat`, `Read`, or `Grep` inside the matched paths.
3. Surface the finding to the user with:
   - Exact paths that matched.
   - A short summary ("prior `connectors/producer-<name>/` present").
   - A decision prompt: "Should I delete and rebuild from scratch, or
     continue the existing work (which bypasses this skill)?"

The user decides. Do not assume.

## External anti-cheat rule

Do not inspect external repositories, gists, or branches that already
implement `producer-<name>` for the same service just to copy schemas,
endpoints, or dispatch logic. The goal is to derive implementation details
from official docs plus recorded traffic, not from prior connector code.

## Safe to read

These are fine to read regardless of matches:

- `packages/connector-kit/src/**` (the framework itself).
- `packages/effect-vcr/src/**`.
- `templates/producer-template/**` (this skill's starting point).
- `connectors/producer-polar/**` (the kitchen-sink reference).
- This `.agents/` directory.
+ +These are the "allowed references" and form the only body of prior +connector code you should study. + +## False-positive handling + +If the matches are obviously unrelated (e.g., searching for "github" hits +`.github/workflows/build.yaml`), note the finding and continue. Use your +judgment — the intent is to block cribbing, not to block mentions. + +Rule of thumb: if the match is **your target's source code or schemas**, +STOP. If the match is **ambient tooling with the same word**, continue +but note it. + +## Example + +``` +$ rg -n "stripe" connectors packages --glob '!**/node_modules' -i +(no output) +$ ls connectors/ | rg -i "stripe" +(no output) +$ ls packages/*/dist 2>/dev/null | rg -i "stripe" || true +(no output) +$ ls node_modules 2>/dev/null | rg -i "producer-stripe" || true +(no output) +$ git log --oneline --all | rg -i "stripe" | head -n 20 +(no output) + +Anti-cheat clear. Proceeding with producer-stripe. +``` + +## Documenting the outcome + +Log the result in your scratch notes or in the PR description: + +> Anti-cheat pre-flight: clean. No prior `producer-stripe` artifacts in +> `connectors/`, `packages/*/dist`, `node_modules`, or git history. + +This is a cheap trust signal for reviewers. diff --git a/.agents/skills/airfoil-kit/references/api-mode-graphql.md b/.agents/skills/airfoil-kit/references/api-mode-graphql.md new file mode 100644 index 0000000..fbbced2 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/api-mode-graphql.md @@ -0,0 +1,167 @@ +# api-mode-graphql + +Implementation contract for connectors whose upstream API is GraphQL. + +Use this file when `api-facts.md` declares `mode: graphql`. + +## Hard rules + +1. **Effect HTTP is the default transport.** Use `HttpClient` from + `effect/unstable/http` and keep GraphQL over normal HTTP POST. +2. **No inline query strings in stream/connector logic.** Put operations in + `src/graphql/operations.ts` (or equivalent) and import them. +3. 
**Do not ignore GraphQL `errors`.** Handle `{ data, errors }` explicitly and
   map to typed `ConnectorError`.
4. **Decode only at typed boundaries.** Validate response envelopes and entity
   payloads with `Schema`.
5. **Pin API version once** (header/path/config), then mirror that same value
   in code, tests, `.env.example`, and README.

## Suggested file layout

```text
src/
  graphql/
    operations.ts   # query/mutation constants + operation names
    envelopes.ts    # optional: shared GraphQL envelope schemas
  api.ts            # HttpClient + request helpers
  schemas.ts        # entity schemas
  streams.ts
  connector.ts
```

Use `src/graphql/envelopes.ts` only when multiple operations share envelope
types. For small connectors, keep the envelope schema in `api.ts`.

## Request pattern

Use a single helper for GraphQL requests in `api.ts`:

- Build a POST request to the endpoint (`/graphql` or `/graphql.json`).
- Set auth/version headers centrally.
- Set body `{ query, variables }`.
- Execute in `Effect.scoped(...)`.
- Parse JSON once and decode through a response envelope schema.

This keeps pagination/auth/error behavior in one place.

Minimal skeleton:

```ts
const GraphQLEnvelope = Schema.Struct({
  data: Schema.optional(Schema.Any),
  errors: Schema.optional(Schema.Array(Schema.Struct({ message: Schema.String }))),
});

const requestGraphql = <A>(options: {
  readonly query: string;
  readonly variables?: Record<string, unknown>;
  readonly decodeData: (data: unknown) => Effect.Effect<A, ConnectorError>;
}) =>
  Effect.scoped(
    client
      .execute(
        HttpClientRequest.post("/graphql").pipe(
          HttpClientRequest.bodyJsonUnsafe({
            query: options.query,
            variables: options.variables,
          }),
        ),
      )
      .pipe(
        Effect.flatMap(HttpClientResponse.filterStatusOk),
        Effect.flatMap((response) => response.json),
        Effect.flatMap(Schema.decodeUnknownEffect(GraphQLEnvelope)),
        Effect.flatMap((envelope) => {
          if ((envelope.errors?.length ??
0) > 0) { + return Effect.fail( + new ConnectorError({ + message: "GraphQL returned errors", + cause: envelope.errors, + }), + ); + } + if (envelope.data == null) { + return Effect.fail(new ConnectorError({ message: "GraphQL response missing data" })); + } + return options.decodeData(envelope.data); + }), + Effect.mapError( + (cause) => + new ConnectorError({ + message: "GraphQL request failed", + cause, + }), + ), + ), + ); +``` + +## Response envelope standard + +Model the envelope as: + +```ts +Schema.Struct({ + data: Schema.optional(Schema.Any), + errors: Schema.optional( + Schema.Array( + Schema.Struct({ + message: Schema.String, + path: Schema.optional(Schema.Array(Schema.Union(Schema.String, Schema.Number))), + }), + ), + ), +}); +``` + +Then enforce: + +- if `errors` is non-empty: fail typed +- if `data` missing: fail typed +- else decode `data.` with concrete schema + +## Pagination rules + +For connection-based APIs (`edges/pageInfo`): + +- map `edges[].node` to connector rows +- use `pageInfo.hasNextPage` + `pageInfo.endCursor` for continuation +- keep cursor mapping in one helper, not scattered + +If the API uses non-connection patterns, document exact continuation fields in +`api-facts.md` and implement one deterministic mapper. + +## Error mapping contract + +Map failures into `ConnectorError` with specific messages: + +- request/transport failure +- non-OK HTTP status +- GraphQL `errors` present +- schema decode failure + +Never expose raw unknown failures directly from connector boundaries. + +## Library policy + +- **Default:** no GraphQL runtime client library (stay on Effect HttpClient). +- **Optional:** GraphQL code generation (`@graphql-codegen/*`) for typed + operation results when introspection/schema tooling is stable. + +If codegen is introduced, document how to regenerate and keep generated files +out of handwritten logic modules. + +## Required tests + +1. VCR-backed list/backfill replay test from real GraphQL responses. +2. 
Pagination boundary test (hasNextPage false or empty edges). +3. GraphQL error path test (`errors` present => fail typed). +4. Webhook tests when applicable (including signature failure path). + +## Anti-patterns + +- Writing raw query strings inline in `streams.ts`/`connector.ts`. +- Treating HTTP 200 as success while ignoring GraphQL `errors`. +- Decoding directly to `Schema.Any` for shipped entities. +- Choosing query fields from memory instead of cassette-observed payloads. diff --git a/.agents/skills/airfoil-kit/references/api-mode-grpc.md b/.agents/skills/airfoil-kit/references/api-mode-grpc.md new file mode 100644 index 0000000..b604f5c --- /dev/null +++ b/.agents/skills/airfoil-kit/references/api-mode-grpc.md @@ -0,0 +1,125 @@ +# api-mode-grpc + +Implementation contract for connectors whose upstream API is gRPC. + +Use this file when `api-facts.md` declares `mode: grpc`. + +## Default stack + +- Client transport: `nice-grpc`, `nice-grpc-common` +- Proto workflow: `@bufbuild/buf` +- Protobuf runtime default: `@bufbuild/protobuf` (repo-aligned) +- Optional compatibility/runtime tooling: `protobufjs` + +## Hard rules + +1. **Use buf for proto lifecycle.** Generation/lint/breaking checks are + mandatory for gRPC connectors. +2. **Use generated service definitions with `nice-grpc`.** Avoid ad-hoc + dynamic method wiring in connector runtime code. +3. **Wrap RPC calls in Effect.** Use `Effect.tryPromise` and map failures into + typed `ConnectorError` (or connector-specific tagged errors mapped to it). +4. **Centralize call options.** Metadata/auth, deadline, and retry policy must + be configured in one API-layer place, not per-call copy-paste. +5. **Pin API/proto version deterministically.** Track selected version in + `api-facts.md`, docs, tests, and generation config. 
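Rules 3 and 4 above reduce to one generic wrapper per call shape. A minimal sketch with a stubbed `ConnectorError` standing in for the kit's typed error; the real connector would return `Effect.tryPromise(...)` rather than a bare `Promise`, and merge `CallOptions` (deadline, metadata) centrally:

```typescript
// Hypothetical sketch: a single generic wrapper for unary RPC calls.
// ConnectorError here is a local stub; the real code maps into the
// kit's typed error channel instead of throwing.
class ConnectorError extends Error {
  readonly _tag = "ConnectorError";
  constructor(message: string, readonly cause?: unknown) {
    super(message);
  }
}

const callUnary = async <Req, Res>(
  label: string,
  method: (request: Req) => Promise<Res>,
  request: Req,
): Promise<Res> => {
  try {
    return await method(request);
  } catch (cause) {
    // Map transport/status failures into the typed error, tagged with
    // which RPC failed so stream logic never sees raw unknown errors.
    throw new ConnectorError(`gRPC call failed: ${label}`, cause);
  }
};
```

Keeping the wrapper generic means deadline, metadata, and retry policy attach in one place rather than at every call site.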
+ +## Directory layout + +```text +connectors/producer-/ + buf.yaml # buf lint/generate config (connector root) + src/ + proto/ # .proto source files + api.ts # gRPC client + Effect wrapper + schemas.ts # domain entity schemas + streams.ts + connector.ts +``` + +Place all `.proto` files under `src/proto/` and `buf.yaml` at the connector +package root. Generated TypeScript goes into `src/proto/gen/` (add to +`.gitignore`; regenerate via `buf generate`). + +## Proto workflow + +Minimum required commands for gRPC mode: + +```bash +buf lint +buf breaking --against .git#branch=main +buf generate +``` + +If `breaking` cannot run in the local environment, document why and keep +`lint` + `generate` mandatory. + +## Effect integration contract + +Use a dedicated API service tag (for example `XGrpcApiClient`) built with +`Layer.effect(...)`, and structure call wrappers like this: + +- convert domain request -> proto request +- invoke gRPC client method with merged `CallOptions` +- convert proto response -> domain response +- map transport/status failures to typed errors + +Prefer one generic helper for repeated unary call patterns. + +## Retry and timeout policy + +- Set sensible default deadlines for all calls. +- Mark retryable statuses explicitly (e.g. transient unavailability). +- Do not retry non-retryable statuses (auth, permission, validation). +- Keep retry policy in API layer middleware/options, not business logic. + +## Metadata/auth policy + +- Inject auth metadata centrally in client construction or middleware. +- Never spread token/header construction across many call sites. +- If multiple auth modes exist, encode auth choice in config and keep one + resolver in the API layer. 
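The retry policy above can be made explicit with a tiny classifier. Numeric codes follow the standard gRPC status enumeration; treating `DEADLINE_EXCEEDED`, `RESOURCE_EXHAUSTED`, and `UNAVAILABLE` as the retryable set is an assumption to tune per API:

```typescript
// Hypothetical classifier for the API-layer retry middleware.
// Codes: 3 INVALID_ARGUMENT, 4 DEADLINE_EXCEEDED, 7 PERMISSION_DENIED,
// 8 RESOURCE_EXHAUSTED, 14 UNAVAILABLE, 16 UNAUTHENTICATED.
const RETRYABLE_STATUS = new Set([4, 8, 14]);

const isRetryable = (statusCode: number): boolean =>
  RETRYABLE_STATUS.has(statusCode);
```

Auth, permission, and validation failures (7, 16, 3) stay outside the set, so they surface immediately instead of burning retry budget.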
+ +## Streaming policy + +For server-streaming RPCs: + +- bridge async iterator responses into connector-kit batch streams +- preserve ordering guarantees per stream +- ensure cancellation closes resources cleanly + +For unary-only APIs, keep pagination/cursor continuation deterministic and +document continuation fields in `api-facts.md`. + +## Protobuf runtime guidance + +- **Default:** `@bufbuild/protobuf` to stay aligned with existing repo + generation/runtime. +- **Optional:** `protobufjs` for cases requiring dynamic loading or + compatibility with external generated assets. + +If `protobufjs` is used, document exactly why and where it is required. + +## Required tests + +1. API integration tests covering at least one successful RPC path per shipped + entity/event source. +2. Typed error mapping tests for at least one retryable and one + non-retryable gRPC failure. +3. Deterministic fixture strategy for serialized payloads used in schema + mapping tests. +4. Webhook tests where applicable (for hybrid connectors). + +### VCR applicability + +`vcr-workflow.md` is HTTP-client cassette guidance and does not apply directly +to binary gRPC traffic. For gRPC mode, use deterministic proto fixtures and/or +mock test servers instead of HTTP VCR cassettes. + +## Anti-patterns + +- Calling gRPC directly from `streams.ts`/`connector.ts` without API wrapper. +- Trying to apply HTTP VCR cassette workflow directly to gRPC binary traffic. +- Inconsistent runtime choice without documentation. +- Per-method auth/deadline logic duplicated throughout code. +- Reporting done without proto generation evidence and deterministic tests. 
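The async-iterator bridging described under Streaming policy can be sketched without any framework: collect items from a server stream into bounded, order-preserving batches. Names here are illustrative, not connector-kit APIs:

```typescript
// Collect items from a server-streaming response into bounded batches,
// preserving arrival order. `batchSize` caps memory held per batch.
export async function* toBatches<T>(
  source: AsyncIterable<T>,
  batchSize: number,
): AsyncGenerator<T[]> {
  let buffer: T[] = [];
  for await (const item of source) {
    buffer.push(item);
    if (buffer.length >= batchSize) {
      yield buffer;
      buffer = [];
    }
  }
  if (buffer.length > 0) yield buffer; // flush the final partial batch
}
```

Because the generator only advances when the consumer pulls, cancelling the consumer stops pulling from the RPC, which is the clean-cancellation property the policy above asks for.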
diff --git a/.agents/skills/airfoil-kit/references/api-research.md b/.agents/skills/airfoil-kit/references/api-research.md new file mode 100644 index 0000000..477c08c --- /dev/null +++ b/.agents/skills/airfoil-kit/references/api-research.md @@ -0,0 +1,235 @@ +# api-research + +Gathering the minimum set of facts about a target API before you write +any code. The output of this phase is a short API-facts artifact +(`api-facts.md` by default) kept in the connector directory during +development (required during implementation), plus a clear picture of +which archetype applies. + +## Hard rule + +**Never fabricate API behavior or response shape.** Treat official platform +docs and version/changelog pages as the contract source of truth, and use +recorded traffic to validate implementation and capture real payload details. +If you can't record, you can't ship the connector — stop and ask the user for +creds. + +## Tool availability fallback + +If MCP tools (Context7, DeepWiki, service MCPs) are unavailable, continue. +They are accelerators, not requirements. + +Fallback stack: + +1. Local repo source of truth (`AGENTS.md`, connector-kit, effect-vcr, + producer-polar, producer-template). +2. Official vendor docs using normal web fetch/search. +3. User clarification for missing secrets/scope decisions. + +Do not pause implementation solely because MCPs are missing. + +## Research order + +Source-of-truth precedence (always apply this order): + +1. Official vendor docs + changelog/version policy. +2. Official vendor SDK docs/examples. +3. Internal repo patterns (`connector-kit`, `effect-vcr`, + `producer-polar`, template). +4. Community posts/issues/videos (non-authoritative; use only as hints). + +Tooling order (how you gather the above sources): + +1. **Context7 MCP**. Use it first for any well-known + library or SaaS with public SDK docs. 
Example: + + ``` + server: plugin-context7-plugin-context7 + tool: get-library-docs (or whatever the server exposes) + args: { library: "stripe" } + ``` + + Always read the tool descriptor first; tool names and argument shapes + can change. + + If you need direct API details (auth, endpoints, rate limits), use: + `https://context7.com/docs/api-guide`. + +2. **WebFetch** the platform's official docs pages for specifics Context7 + doesn't cover. Example URLs to grab: + - Auth docs (`/docs/api/authentication`). + - Pagination docs (`/docs/api/pagination`). + - Webhook catalog (`/docs/webhooks/events`). + - Reference schemas for the entities you plan to ingest. + +3. **Targeted web searches** for stack-specific gotchas: rate limits, + retry semantics, undocumented headers, known quirks. + +4. **Context7 for Effect v4 docs** using `effect-ts/effect-smol` + (plus DeepWiki as optional fallback) for service tags, layers, Config + idioms, and HTTP module locations. + +5. **Ask the user** when anything material is ambiguous: + - Which tenancy model? (single-tenant vs per-tenant URL). + - Which auth flow is acceptable for MVP? (static token vs full OAuth2). + - Do they have sandbox/test-mode creds, or is this live-only? + - Are there MCP seeders available (e.g. Stripe MCP) for test-data? + - Which entities matter **for v1**? + +## Evidence block (required) + +Your API-facts artifact must include a short evidence block before +implementation: + +```markdown +## Evidence + +- URL: +- Retrieved: YYYY-MM-DD +- Used for: +- Decision: +``` + +Minimum requirement: one evidence entry each for auth, pagination, webhook +contract (or explicit no-webhook), and selected API version. + +## Version pin checklist (required) + +When you choose an API version, update all of these in one pass: + +1. Connector config default or required version field. +2. API client path/header/query parameter carrying version. +3. `.env.example` variable/value guidance. +4. README version notes. +5. 
Tests/cassettes targeting that versioned endpoint. + +Do not leave mixed versions across code/tests/docs. + +## Shape of the API-facts artifact + +Keep the summary small and concrete: + +```markdown +# api-facts: + +## Mode + +- mode: rest | graphql | grpc +- rationale: + +## Evidence + +- URL: +- Retrieved: YYYY-MM-DD +- Used for: +- Decision: + +## Base URL + +- Production: https://api..com/v2 +- Sandbox: https://sandbox..com/v2 (same shape) + +## Auth + +- Scheme: Bearer token +- Header: `Authorization: Bearer ` +- Token lifetime: long-lived API key +- Env var: `_API_TOKEN` + +## Pagination + +- Style: cursor-based +- Request: `?starting_after=&limit=` +- Response: `{ data: [...], has_more: boolean }` +- Last item's `id` becomes the next cursor. + +## Entities (v1) + +- `users`: GET /users (list), GET /users/:id (get) +- `orders`: GET /orders (list), webhook on `order.created` + +## Webhooks + +- Endpoint we host: POST /webhooks/ +- Signature header: `-Signature` +- Scheme: HMAC-SHA256 of raw body, hex lowercase +- Timestamp tolerance: 5 min + +## Rate limits + +- 100 req/sec per API key, 429 w/ Retry-After. +``` + +## What to actually record (and why) + +Record **minimum sufficient** cassettes: + +- **One page of each list endpoint** you plan to back-fill. +- **One detail fetch** per entity if you use it. +- **One webhook payload of each type** you dispatch (captured separately + — copy from the platform's dashboard or webhook debugger; webhooks + aren't captured by the HTTP client layer). +- **One auth error** (401) if your error handling depends on distinguishing + it. + +For gRPC mode, replace HTTP cassettes with deterministic proto fixtures and/or +mock gRPC server recordings. + +Do NOT record hundreds of pages. Large cassettes: + +- Bloat the repo. +- Slow replay tests. +- Leak real tenant data even when redacted. + +After recording, re-open the payloads and reconcile schema depth with observed +nested fields required by v1 entities/events. 
Do not stop at a minimal schema +if downstream logic or users need nested data. + +## Minimum required facts before coding + +Do not start connector implementation until the API-facts artifact contains all +of: + +1. Base URL(s) and environment model (sandbox/prod or key-mode). +2. Auth scheme and exact header names. +3. Pagination contract (request params + response continuation fields). +4. Concrete v1 entity list and list endpoints. +5. Webhook signature contract (header names, canonical signed string, + encoding, timestamp tolerance) or explicit "no signed webhooks". +6. Rate-limit behavior (`429`, `Retry-After`, burst/sustained limits if known). + +## Decision points the user owns + +If any of these are unclear, **stop and ask**: + +- **Which entities to ingest in v1.** Be explicit; listing 20 entities + and shipping 3 is a scope problem. +- **How to handle tenant-specific URLs.** Are multiple tenants + multiplexed into one connector instance, or does each tenant run its + own instance? +- **Retention of historical data.** Does backfill need to go back 1 year + or all-time? Affects `initialCutoff` and pagination strategy. +- **Credentials scope.** Read-only vs read-write. Prefer the minimum. + +## Anti-patterns + +- Reading an existing `producer-` repo to crib its schemas — + this is anti-cheat territory. See `anti-cheat.md`. +- Writing the schema from "I remember Stripe has a `created` field". + Memory is not ground truth; the cassette is. +- Assuming the sandbox and production APIs have identical shapes. Most + do, but not all (e.g., Shopify dev stores expose extra debug fields). +- Asking a dozen questions before doing any research. Research first, + then surface a focused question list. 
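For the cursor-based pagination shown in the example artifact, next-cursor derivation is worth pinning down as a pure function. This sketch assumes the illustrative `data` / `has_more` / `id` field names from the artifact above; the real names must come from recorded traffic, not from this sketch:

```typescript
// Shapes mirror the illustrative api-facts example; verify against cassettes.
type ListPage<T extends { id: string }> = {
  data: T[];
  has_more: boolean;
};

// Derive the next request's `starting_after` value: the last item's `id`,
// or null when pagination is exhausted.
export function nextCursor<T extends { id: string }>(
  page: ListPage<T>,
): string | null {
  if (!page.has_more || page.data.length === 0) return null;
  return page.data[page.data.length - 1].id;
}
```

A pure function like this is exactly what the "pagination transition proof" test in the Definition of Done exercises against a recorded page-1/page-2 pair.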
+ +## When to use which tool + +| Need | Tool | +| -------------------------------------------- | ---------------------------------- | +| SDK setup, quickstart, library usage | Context7 MCP | +| Endpoint catalogs, response fields | Official docs (`WebFetch`) | +| Effect runtime/library patterns | Context7 (`effect-ts/effect-smol`) | +| Edge cases, community gotchas | Targeted web search | +| Ground truth of response body (REST/GraphQL) | VCR recording | +| Ground truth of response body (gRPC) | Proto fixtures / mock server | +| Architectural choice | Ask user | diff --git a/.agents/skills/airfoil-kit/references/connector-archetypes.md b/.agents/skills/airfoil-kit/references/connector-archetypes.md new file mode 100644 index 0000000..ea53cd6 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/connector-archetypes.md @@ -0,0 +1,76 @@ +# connector-archetypes + +Classify the target platform by capability dimensions before writing code. +This is a decision framework, not a list of provider-specific recipes. + +For each dimension, record: + +- what the official docs say, +- what your v1 scope needs, +- what you will implement now, +- what you defer as follow-up. + +## Core capability dimensions + +1. **Transport mode** + - REST, GraphQL, gRPC, or mixed. + - This selects your primary API client contract and test strategy. + +2. **Authentication model** + - Static token/key, OAuth2, signed requests, or another documented scheme. + - Mirror provider docs exactly; do not assume another platform's model. + +3. **Tenancy model** + - Single global base URL vs tenant/region/account-specific endpoints. + - Decide whether one connector instance handles one tenant or many. + +4. **Data acquisition model** + - Polling only, webhook only, push+pull hybrid, or job/bulk export. + - This determines how `live` and `backfill` streams are wired. + +5. **Pagination / continuation contract** + - Derive request and response continuation semantics from official docs. 
+ - Validate behavior with recorded traffic and deterministic tests. + +6. **Webhook verification model (if applicable)** + - Determine signed input, canonicalization, algorithm, encoding, + replay protection, and tolerance directly from docs. + - Fail closed when verification is enabled but prerequisites are missing. + +7. **Versioning model** + - Header-based, path-based, date-based, or implicit version policy. + - Pin intentionally and keep code/tests/docs aligned to one version. + +8. **Rate limit and retry semantics** + - Capture throttling behavior (`429`, backoff hints, retry headers). + - Decide minimal retry/backoff behavior required for v1. + +## Decision outputs (required) + +Before implementation, your API-facts artifact must include a concise decision +for each dimension above plus links to official sources. + +If a dimension is unknown or undocumented, mark it explicitly and ask the user +for the minimum missing inputs needed to proceed. + +## Implementation mapping + +Use this mapping after classification: + +- **Transport mode** -> which mode contract to load (`rest`, `graphql`, `grpc`). +- **Auth model** -> `Config` fields + request middleware strategy. +- **Tenancy model** -> base URL construction and runtime configuration shape. +- **Acquisition model** -> how `live`, `backfill`, and webhooks are composed. +- **Pagination model** -> `fetchList` and cursor handling implementation. +- **Webhook model** -> verifier implementation and webhook test cases. +- **Versioning model** -> config defaults, request paths/headers, README notes. +- **Rate limits** -> retry and error mapping behavior. + +## Anti-patterns + +- Starting from provider examples without proving applicability to the target. +- Assuming auth, pagination, or webhook verification works like another service. +- Leaving capability dimensions implicit and deciding ad hoc during coding. 
+
+Always anchor decisions in official docs first, then validate with observed
+traffic and deterministic tests.
diff --git a/.agents/skills/airfoil-kit/references/connector-kit-api.md b/.agents/skills/airfoil-kit/references/connector-kit-api.md
new file mode 100644
index 0000000..688b26d
--- /dev/null
+++ b/.agents/skills/airfoil-kit/references/connector-kit-api.md
@@ -0,0 +1,455 @@
+# connector-kit-api
+
+Exhaustive reference for every export of
+[`@useairfoil/connector-kit`](../../../packages/connector-kit/).
+
+Import from the package root:
+
+```ts
+import {
+  ConnectorError,
+  defineConnector,
+  defineEntity,
+  defineEvent,
+  makePullStream,
+  makeWebhookQueue,
+  Publisher,
+  runConnector,
+  StateStore,
+  StateStoreInMemory,
+  WingsPublisherLayer,
+  ConnectorRuntimeContext,
+  ConnectorRuntimeContextLayer,
+  buildWebhookRouter,
+} from "@useairfoil/connector-kit";
+
+import type {
+  BackfillStream,
+  Batch,
+  ConnectorDefinition,
+  Cursor,
+  EntityDefinition,
+  EntityKey,
+  EntityRow,
+  EntitySchema,
+  EntityType,
+  EventDefinition,
+  IngestionState,
+  LiveSource,
+  LiveStream,
+  RunConnectorOptions,
+  StreamState,
+  Transform,
+  WebhookRoute,
+  WebhookStream,
+} from "@useairfoil/connector-kit";
+```
+
+All items are re-exported from
+[`packages/connector-kit/src/index.ts`](../../../packages/connector-kit/src/index.ts).
+
+---
+
+## Core types
+
+### `Cursor`
+
+```ts
+type Cursor = string | number | bigint | Date;
+```
+
+Opaque watermark emitted by a stream. Use the same shape across a stream's
+live and backfill branches so `IngestionState` stays consistent.
+
+### `Batch`
+
+```ts
+type Batch<T> = {
+  readonly cursor: Cursor;
+  readonly rows: ReadonlyArray<T>;
+};
+```
+
+Unit of ingestion. The engine publishes one batch at a time, then persists
+the cursor after a successful publish.
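To make the shape concrete, here is an illustrative batch value. The aliases are restated locally (and simplified) so the snippet stands alone, and the row fields are invented for the example:

```typescript
// Local, simplified restatement of the shapes above so this compiles alone.
type Cursor = string | number | bigint | Date;
type Batch<T> = { readonly cursor: Cursor; readonly rows: ReadonlyArray<T> };

// The cursor is the watermark for the whole batch: after a successful
// publish, ingestion resumes from here rather than re-fetching these rows.
const batch: Batch<{ id: string; updated_at: string }> = {
  cursor: new Date("2024-05-01T00:00:00Z"),
  rows: [
    { id: "cus_1", updated_at: "2024-04-30T23:58:00Z" },
    { id: "cus_2", updated_at: "2024-04-30T23:59:30Z" },
  ],
};
```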
+
+### `StreamState` / `IngestionState`
+
+```ts
+type StreamState<C extends Cursor> = {
+  readonly cutoff: C;
+  readonly current?: C;
+};
+
+type IngestionState<C extends Cursor> = {
+  readonly backfill: StreamState<C>;
+  readonly live: StreamState<C>;
+};
+```
+
+Persisted by `StateStore`. `cutoff` is the watermark that delimits live vs
+backfill. `current` advances each time a batch is published.
+
+### `Transform`
+
+```ts
+type Transform<T> = (row: T) => Effect.Effect<T, ConnectorError>;
+```
+
+Optional per-row transformer. Applied after decoding, before publish. Use
+it to enrich rows with joined data.
+
+### `LiveStream` / `BackfillStream`
+
+Both are type aliases for `Stream.Stream<Batch<T>, ConnectorError>`.
+
+### `WebhookStream`
+
+```ts
+type WebhookStream<T> = {
+  readonly queue: Queue.Queue<Batch<T>>;
+  readonly stream: Stream.Stream<Batch<T>, ConnectorError>;
+};
+```
+
+Returned by `makeWebhookQueue`. The webhook handler calls
+`Queue.offer(stream.queue, ...)`; the engine consumes from `stream.stream`.
+
+### `LiveSource`
+
+```ts
+type LiveSource<T> = LiveStream<T> | WebhookStream<T>;
+```
+
+An entity's `live` field accepts either a regular `Stream` (polling) or a
+`WebhookStream` (event-driven). The engine detects the webhook shape by
+checking for `queue` + `stream` fields.
+
+### Schema utility types
+
+```ts
+type EntitySchema = Schema.Schema<any>;
+type EntityType<S extends EntitySchema> = Schema.Schema.Type<S>;
+type EntityKey<S extends EntitySchema> = ...; // "id" | "email" | ...
+type EntityRow<S extends EntitySchema> = ...; // intersect with Record
+```
+
+Use `EntityType` to derive row types
+(`type Customer = EntityType<typeof CustomerSchema>`).
+
+### `EntityDefinition`
+
+```ts
+type EntityDefinition<S extends EntitySchema> = {
+  readonly name: string;
+  readonly schema: S;
+  readonly primaryKey: EntityKey<S>;
+  readonly live: LiveSource<EntityRow<S>>;
+  readonly backfill: BackfillStream<EntityRow<S>>;
+  readonly transform?: Transform<EntityRow<S>>;
+};
+```
+
+Entities are upserts: live and backfill can overlap. The engine tracks a
+`Set` of primary keys already emitted so backfill does not re-publish
+rows seen live.
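The overlap bookkeeping described above can be sketched as a plain filter. Names are illustrative, not the connector-kit internals:

```typescript
// Skip backfill rows whose primary key was already published from the live
// stream, so overlapping live/backfill windows do not produce duplicates.
export function dedupeBackfill<T extends Record<string, unknown>>(
  seenLiveKeys: Set<string>,
  primaryKey: keyof T & string,
  backfillRows: T[],
): T[] {
  return backfillRows.filter(
    (row) => !seenLiveKeys.has(String(row[primaryKey])),
  );
}
```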
+
+### `EventDefinition`
+
+Same shape as `EntityDefinition` but:
+
+- `primaryKey` is absent.
+- `backfill` is optional.
+- The engine runs `backfill` **to completion** before starting `live`.
+
+Use for append-only log streams where order matters and upserts do not apply.
+
+### `ConnectorDefinition`
+
+```ts
+type ConnectorDefinition = {
+  readonly name: string;
+  readonly entities: ReadonlyArray<EntityDefinition<any>>;
+  readonly events: ReadonlyArray<EventDefinition<any>>;
+};
+```
+
+---
+
+## Errors
+
+### `ConnectorError`
+
+```ts
+class ConnectorError extends Data.TaggedError("ConnectorError")<{
+  readonly message: string;
+  readonly cause?: unknown;
+}> {}
+```
+
+Single error channel for connector code. Wrap upstream errors with
+`Effect.mapError((cause) => new ConnectorError({ message, cause }))`.
+
+---
+
+## Builders
+
+### `defineConnector(definition)`
+
+```ts
+const connector = defineConnector({
+  name: "producer-foo",
+  entities: [defineEntity({ ... })],
+  events: [],
+});
+```
+
+Identity function with inference hints; `const` generics preserve literal
+names and entity arrays. Always use it rather than object literals.
+
+### `defineEntity(definition)`
+
+Returns the input with correct inference. `S` is inferred from
+`definition.schema`, so `primaryKey` autocompletes from the schema's decoded
+shape.
+
+### `defineEvent(definition)`
+
+Same, for events (`EventDefinition`).
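The identity-with-inference trick can be sketched in a few lines: the function returns its input unchanged, and a `const` type parameter (TypeScript 5.0+) keeps literal types such as the connector name intact. Simplified; the real `defineConnector` constrains the full definition shape:

```typescript
// Identity function whose only job is inference: the `const` modifier on T
// prevents literal widening, so "producer-foo" stays a literal type.
function defineConnector<const T extends { name: string }>(definition: T): T {
  return definition;
}

const connector = defineConnector({
  name: "producer-foo",
  entities: [],
  events: [],
});
// typeof connector.name is the literal "producer-foo", not just string
```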
+
+---
+
+## Runtime
+
+### `runConnector(connector, options?)`
+
+Two overloads:
+
+```ts
+// No webhook: requires StateStore + Publisher
+runConnector(
+  connector,
+  options?: { initialCutoff?: Cursor; webhook?: undefined },
+): Effect.Effect<void, ConnectorError, StateStore | Publisher>;
+
+// With webhook: also requires HttpServer
+runConnector(
+  connector,
+  options: {
+    initialCutoff?: Cursor;
+    webhook: {
+      routes: ReadonlyArray<WebhookRoute<any>>;
+      healthPath?: HttpRouter.PathInput; // default "/health"
+      disableHttpLogger?: boolean; // default true
+    };
+  },
+): Effect.Effect<void, ConnectorError, StateStore | Publisher | HttpServer>;
+```
+
+Internally:
+
+- Provides `ConnectorRuntimeContextLayer(connector)` so downstream spans can
+  tag metrics with `connector.name`.
+- Wraps the whole run in an `Effect.withSpan("connector.run", ...)`.
+- Emits `connector_batches_total`, `connector_rows_total`, and
+  `connector_batch_size` via `effect/Metric`.
+- For webhooks, composes `buildWebhookRouter(routes)` with a `/health`
+  route and serves it via `HttpRouter.serve(app, { disableLogger })`.
+
+### `RunConnectorOptions`
+
+Exposed type for callers who build options programmatically.
+
+---
+
+## State persistence
+
+### `StateStore` (service tag)
+
+```ts
+class StateStore extends Context.Service<
+  StateStore,
+  {
+    readonly getState: (
+      key: string,
+    ) => Effect.Effect<IngestionState<Cursor> | undefined, ConnectorError>;
+    readonly setState: (
+      key: string,
+      state: IngestionState<Cursor>,
+    ) => Effect.Effect<void, ConnectorError>;
+  }
+>()("StateStore") {}
+```
+
+Keyed by entity/event name. One row per stream.
+
+### `StateStoreInMemory`
+
+In-process `Map`-backed `StateStore` layer. Use for
+the sandbox runner and tests. Production deployments provide a durable
+implementation (e.g. backed by a key-value store).
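A dependency-free sketch of the `StateStore` contract: one state record per stream name. The real `StateStoreInMemory` wraps the same idea in an Effect service; this version only shows the keying semantics:

```typescript
// Simplified local restatement of the persisted shapes.
type StreamState<C> = { cutoff: C; current?: C };
type IngestionState<C> = { backfill: StreamState<C>; live: StreamState<C> };

// One entry per entity/event name; getState of an unknown key is undefined,
// which is how the engine detects a first run.
export function makeMemoryStateStore<C>() {
  const states = new Map<string, IngestionState<C>>();
  return {
    getState: (key: string): IngestionState<C> | undefined => states.get(key),
    setState: (key: string, state: IngestionState<C>): void => {
      states.set(key, state);
    },
  };
}
```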
+
+---
+
+## Publishing
+
+### `Publisher` (service tag)
+
+```ts
+class Publisher extends Context.Service<
+  Publisher,
+  {
+    readonly publish: (options: {
+      readonly name: string;
+      readonly source: "live" | "backfill";
+      readonly batch: Batch<Record<string, unknown>>;
+    }) => Effect.Effect<PublishAck, ConnectorError>;
+  }
+>()("Publisher") {}
+```
+
+`PublishAck = { readonly success: boolean }`. The engine fails the stream
+if `publish` fails.
+
+### `WingsPublisherLayer(config)`
+
+```ts
+WingsPublisherLayer({
+  connector,
+  topics: { customers: customerTopic, orders: orderTopic },
+  partitionValues: { customers: "account_id" },
+}): Layer.Layer<Publisher>;
+```
+
+Production-grade publisher that fans each entity into a Wings topic. For
+the sandbox / tests, use a hand-written console publisher instead.
+
+---
+
+## Streams
+
+### `makeWebhookQueue(options?)`
+
+```ts
+makeWebhookQueue<T>({ capacity?: number }): Effect.Effect<WebhookStream<T>>;
+```
+
+Creates a bounded `Queue` (default capacity 1024) and its `Stream.fromQueue`
+view. Always keep the queue bounded — unbounded queues can let a noisy
+webhook drown the publisher.
+
+### `makePullStream(options)`
+
+```ts
+makePullStream<T, R>({
+  initialCursor?: Cursor,
+  fetchPage: (cursor: Cursor | undefined) => Effect.Effect<PullPage<T>, ConnectorError, R>,
+}): Stream.Stream<Batch<T>, ConnectorError, R>;
+
+type PullPage<T> = {
+  readonly cursor: Cursor;
+  readonly rows: ReadonlyArray<T>;
+  readonly hasMore: boolean;
+};
+```
+
+Paging unfold. Skips empty pages automatically (keeps fetching until rows
+arrive or `hasMore: false`). Use for every backfill that pages through a
+list endpoint.
+
+---
+
+## Webhooks
+
+### `WebhookRoute`
+
+```ts
+type WebhookRoute<TPayload> = {
+  readonly path: HttpRouter.PathInput;
+  readonly schema: Schema.Schema<TPayload>;
+  readonly handle: (
+    payload: TPayload,
+    request: HttpServerRequest.HttpServerRequest,
+    rawBody?: Uint8Array,
+  ) => Effect.Effect<void, ConnectorError>;
+};
+```
+
+The framework decodes the request body, validates against `schema`, and
+invokes `handle(payload, request, rawBody)`.
Use `rawBody` for HMAC +verification; use `payload` for dispatch. + +### `buildWebhookRouter(routes)` + +Low-level helper that turns an array of routes into an `HttpRouter` Layer. +`runConnector(...)` uses this internally; you rarely call it directly. + +--- + +## Runtime context + +### `ConnectorRuntimeContext` + +Service tag exposing `{ connector: ConnectorDefinition }`. The engine sets +this via `ConnectorRuntimeContextLayer(connector)`. Metrics attributes use +it to tag batches with `connector.name`. + +### `ConnectorRuntimeContextLayer(connector)` + +Returns a `Layer.succeed(ConnectorRuntimeContext)({ connector })`. Call this +in custom test harnesses if you bypass `runConnector`. + +--- + +## Observability (provided by the engine) + +### Spans + +- `connector.run` wraps the whole connector (attributes: `connector.name`, + `connector.entities.count`, `connector.events.count`). +- `connector.batch.process` wraps each batch publish (attributes: + `connector.name`, `connector.stream.name`, `connector.stream.source`, + `connector.batch.rows`). + +### Metrics + +- `connector_batches_total` (counter). +- `connector_rows_total` (counter). +- `connector_batch_size` (histogram, + `boundaries: [1, 5, 10, 25, 50, 100, 250, 500, 1000]`). + +All three carry `connector`, `stream`, and `source` (`live` | `backfill`) +attributes. + +To export telemetry, provide an Effect observability layer (the Polar +sandbox uses `Observability.Otlp.layerJson({ baseUrl, resource })` from +`effect/unstable/observability` plus `Metric.enableRuntimeMetricsLayer`). + +--- + +## Typical composition recipe + +```ts +const runtimeLayer = Layer.mergeAll( + StateStoreInMemory, + ConsolePublisherLayer, // or WingsPublisherLayer(...) 
+ MyConnectorConfig(), // Layer + Logger.layer([Logger.consolePretty()]), + TelemetryLayer, // optional + Layer.mergeAll( + FetchHttpClient.layer, + Layer.succeed(ConfigProvider.ConfigProvider, ConfigProvider.fromEnv()), + ), +); + +const program = Effect.gen(function* () { + const { connector, routes } = yield* MyConnector; + return yield* runConnector(connector, { + initialCutoff: new Date(), + webhook: { routes }, + }).pipe(Effect.provide(NodeHttpServer.layer(createServer, { port: 8080 }))); +}); + +Effect.runPromise(Effect.scoped(program).pipe(Effect.provide(runtimeLayer))); +``` + +See `connectors/producer-polar/src/sandbox.ts` for the live reference. diff --git a/.agents/skills/airfoil-kit/references/definition-of-done.md b/.agents/skills/airfoil-kit/references/definition-of-done.md new file mode 100644 index 0000000..229d542 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/definition-of-done.md @@ -0,0 +1,199 @@ +# definition-of-done + +The gates every new connector must clear before declaring the task +complete. Treat this as a literal checklist; skip nothing. + +## Completion states (report explicitly) + +Use these states in status updates and PR descriptions: + +1. **Code Complete** + - Connector code compiles and package-level lint/typecheck/build/tests pass. + - Runtime and docs are wired, but real cassettes may still be pending. +2. **Verified with Real Cassettes** + - Every shipped entity/event has replayable tests backed by real recorded + traffic for that API mode. + - Schemas are hardened from observed payloads, not inferred from memory. +3. **CI Complete** + - Root CI baseline passes locally (`lint`, `build`, `typecheck`, `test:ci`). + +Do not claim "done" without stating the current completion state. 
+ +## Package-level gates (state gate: Code Complete) + +Run from the repo root, substituting `` for the connector slug: + +```bash +pnpm run lint +pnpm --filter @useairfoil/producer- run typecheck +pnpm --filter @useairfoil/producer- run test:ci +pnpm --filter @useairfoil/producer- run build +``` + +All four must exit 0. + +### Notes + +- `lint` runs oxlint at the repo root. +- `typecheck` runs `tsc --noEmit`. It needs dependent packages' `dist/` + to exist; run `pnpm run build` at the repo root first if you see + "Cannot find module '@useairfoil/...'" errors. +- `test:ci` runs Vitest in `run` mode with `CI=true`. In VCR `auto` mode, + missing cassettes fail in CI instead of recording. All committed cassettes + must exist and match tests. +- `build` produces `dist/` via `tsdown`. Re-run if you change the entry + file or `tsdown.config.ts`. + +## Determinism gates (state gate: Verified with Real Cassettes) + +- **Schema proof from traffic**: each shipped entity has at least one + deterministic replay test that decodes with its concrete schema (not + `Schema.Any`). REST/GraphQL: VCR replay test. gRPC: deterministic proto + fixture or mock-server test. +- **Webhook verification tests**: if webhooks are signed, ship both: + 1. valid signature test (200 + publish), and + 2. invalid signature test (non-2xx + no publish). +- **Fail-closed verification behavior**: if webhook verification is enabled, + missing verification inputs (such as raw body bytes or required signature + headers) must fail explicitly. +- **Pagination transition proof**: for paginated entities, tests must cover at + least one continuation step (for example page-1 to page-2 or cursor-token + handoff), not just the initial page. +- **Typed recoverable failures**: expected runtime failures are represented in + the typed error channel (no untyped defects for normal contract failures). +- **No manual cassette edits**: replay artifacts are generated by recording + flows, not hand-edited JSON patches. 
+- **No template stubs left**: after rename/porting, no placeholder + verification code remains (e.g. template `verifyWebhookSignature` + pass-through). + +## Sandbox boot check (state gate: Code Complete) + +```bash +pnpm --filter @useairfoil/producer- run sandbox +``` + +This boots the connector against the configured `.env`. Verify: + +- Startup reaches webhook/server-ready logs without missing-layer errors. +- No unhandled promise rejections in the first ~5 seconds. +- The health path returns `"ok"`: + + ```bash + curl http://localhost:/health + ``` + +- At least one batch appears on stdout (via `ConsolePublisherLayer`) if + the API has data. If it's an empty account, note that in the PR. + +Stop the sandbox (`Ctrl+C`) before moving on. + +## README requirements (state gate: Code Complete) + +`connectors/producer-/README.md` must include: + +1. One-line description ("Producer connector for "). +2. Supported entities and events, with a short sentence each. +3. Environment variables (copied from `.env.example`). +4. `pnpm install` + local dev commands. +5. How to run in production (Node entry; mention Bun equivalent if supported). +6. Credentials pointer: link to the service's API key docs, a sentence + on which scope/role is sufficient. +7. Known limitations (e.g., "backfill limited to last 30 days because + the API does not expose older records"). +8. Credential acquisition steps: exact click/path or API flow used to + obtain credentials for this service. +9. Required scopes/permissions for v1 entities/events. +10. Common setup errors and fixes (at minimum: auth failure, missing scope, + and webhook signature mismatch when relevant). + +If you copied `templates/producer-template/README.md` as a starting +point, make sure you've replaced every `Template`, `template`, and +JSONPlaceholder reference. 
+ +## Root CI baseline (state gate: CI Complete) + +From the repo root: + +```bash +pnpm run lint +pnpm run build +pnpm run typecheck +pnpm run test:ci +``` + +All must exit 0. This matches the `.github/workflows/build.yaml` +pipeline. + +## Hands-off checks + +- **Scope discipline**: default to changes under `connectors/producer-/`. + If cross-package changes are required (`packages/**`, `templates/**`, + `.agents/**`), keep them minimal and explicitly document rationale + impact. +- **No opportunistic refactors**: avoid broad cleanup unrelated to connector + correctness, determinism, or required framework wiring. +- **`api-facts.md` lifecycle is explicit**: either delete + `connectors/producer-/api-facts.md` before merge, or keep it intentionally + with a short README note explaining why it is retained. + +## Cassettes or fixtures committed (state gate: Verified with Real Cassettes) + +```bash +git status connectors/producer-/test/__cassettes__ +``` + +REST/GraphQL: every cassette file your `test:ci` depends on must be tracked. +gRPC: deterministic proto fixtures or mock-server assets must be tracked. +These artifacts are infrastructure — do not gitignore them. + +## No template placeholders + +Run a final check for template placeholders before PR: + +```bash +rg -n "Template intentionally accepts everything|producer-template|/webhooks/template|TEMPLATE_" \ + connectors/producer- --glob '!**/__cassettes__' --glob '!**/dist' +``` + +Expected result: no hits (except intentionally documented migration notes). + +## Credentials hygiene + +```bash +git status | rg "\\.env$" +# should be empty — .env must not be staged + +rg -n "sk_live_|sk_test_|AKIA|" \ + connectors/producer- --glob '!**/__cassettes__' +# should be empty — no creds baked into source +``` + +If you scrubbed a cassette of real creds, double-check before committing. +Cassettes are public-data territory. + +## The "does it work end-to-end" gut check + +Before PR: + +1. 
Delete `node_modules/` and `dist/` directories. +2. Run `pnpm install` at the repo root. +3. Run `pnpm run build && pnpm run test:ci` at the repo root. +4. Smoke-test the sandbox with real creds one more time. + +If all four pass, the connector is portable, deterministic, and +functional. Ship it. + +## When a gate legitimately can't pass + +Only the following are acceptable reasons to skip a gate, and each must +be surfaced in the PR description: + +- **Test can't be recorded**: API refuses programmatic access (see + `test-data.md`). Document which entity is uncovered. +- **Sandbox not bootable**: missing creds for the target environment in + the reviewer's account. Include a sandbox screencast or log output to + prove it works in your environment. +- **Kit missing a feature**: surfaced to the user during implementation. + Task is paused, not marked done. + +Missing a gate because "I didn't run it" is not acceptable. diff --git a/.agents/skills/airfoil-kit/references/effect-v4-essentials.md b/.agents/skills/airfoil-kit/references/effect-v4-essentials.md new file mode 100644 index 0000000..6ab8497 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/effect-v4-essentials.md @@ -0,0 +1,325 @@ +# effect-v4-essentials + +The SDK is pinned to **Effect v4 beta** (`effect@4.0.0-beta.54`). Many +patterns changed from v2/v3. This file is the short list of idioms you +**must** use in connector code. + +For deep dives, read: + +- Effect v4 source/docs repo only: `https://github.com/effect-ts/effect-smol` +- Context7 Effect v4 LLM docs: + `https://context7.com/effect-ts/effect-smol/llms.txt?tokens=10000` +- Context7 API guide (if using API directly): + `https://context7.com/docs/api-guide` + +Do **not** treat legacy Effect docs/repositories as Effect v4 source of truth +for this repo. They may reflect older API generations. 
+ +## Prerequisite check (Effect source mirror) + +Before Effect-related implementation or refactors, verify a local mirror exists +at `.temp/effect-smol` (repo-local, disposable) and points to `effect-smol`. + +If missing, clone it: + +```bash +git clone https://github.com/Effect-TS/effect-smol.git .temp/effect-smol +``` + +If present, refresh it before deep API lookups: + +```bash +git -C .temp/effect-smol pull --ff-only +``` + +Use this mirror as local, greppable ground truth when MCP tools are flaky. +It is disposable and can be deleted any time: + +```bash +rm -rf .temp/effect-smol +``` + +Context7 quick use (recommended for Effect v4 content): + +1. Resolve library id for Effect docs (`effect-smol`) and query docs. +2. Ask focused questions (service tags, Config patterns, HTTP paths). +3. Cross-check answers against local package APIs before coding. + +DeepWiki MCP quick use (optional fallback): + +1. Ensure the repo is indexed in DeepWiki (open + `https://deepwiki.com/effect-ts/effect-smol` once if needed). +2. Read available topics: `deepwiki_read_wiki_structure({ repoName: "effect-ts/effect-smol" })`. +3. Ask focused questions: `deepwiki_ask_question({ repoName: "effect-ts/effect-smol", question: "..." })`. +4. Cross-check answers against local package APIs before coding. + +If Context7/DeepWiki are unavailable, fall back to: + +1. Local source in this repo (especially `packages/connector-kit/src/**` and + `packages/effect-vcr/src/**`). +2. Official Effect docs + GitHub source. + +Never block implementation on Context7/DeepWiki availability. + +--- + +## API integration contract (checklist) + +Apply this checklist for REST, GraphQL, and gRPC connectors: + +1. **Config-only runtime/test inputs:** no `process.env` reads in connector + code or tests; use `Config` and `ConfigProvider`. +2. **Service-layer clients:** build API clients as `Context.Service` + + `Layer.effect(...)`, not ad-hoc singleton objects. +3. 
**Boundary decode:** parse external payloads with `Schema` at API boundaries
+   before they enter stream/entity logic.
+4. **Typed errors:** map unknown/transport/decode failures to tagged domain
+   errors (`ConnectorError` and/or connector-specific tagged errors).
+5. **Central transport policy:** retries, timeouts/deadlines, auth headers, and
+   rate-limit behavior are configured in the API client layer, not scattered
+   across connector orchestration code.
+
+---
+
+## 1. Imports you will use
+
+```ts
+import {
+  Config,
+  ConfigProvider,
+  Context,
+  DateTime,
+  Deferred,
+  Effect,
+  Layer,
+  Logger,
+  Metric,
+  Option,
+  Queue,
+  Ref,
+  Stream,
+} from "effect";
+
+import * as Schema from "effect/Schema";
+import * as Observability from "effect/unstable/observability";
+
+import {
+  FetchHttpClient,
+  HttpClient,
+  HttpClientRequest,
+  HttpClientResponse,
+  type HttpRouter,
+  HttpServer,
+  HttpServerRequest,
+  HttpServerResponse,
+  type Headers,
+} from "effect/unstable/http";
+
+import { NodeHttpServer, NodeHttpClient, NodeFileSystem } from "@effect/platform-node";
+
+import { describe, expect, it } from "@effect/vitest";
+```
+
+Notes:
+
+- HTTP lives under `effect/unstable/http` in v4. Do not import from
+  `@effect/platform` (that was the v2/v3 location).
+- `Schema` lives at `effect/Schema`, not `@effect/schema`.
+- `Context.Service` replaces `ServiceMap.Service` patterns from older versions.
+
+## 2. Defining services
+
+```ts
+// The second type parameter is the service's shape; the method shown
+// here is illustrative.
+export interface MyApiClientShape {
+  readonly fetchJson: (path: string) => Effect.Effect<unknown>;
+}
+
+export class MyApiClient extends Context.Service<MyApiClient, MyApiClientShape>()(
+  "@useairfoil/producer-foo/MyApiClient",
+) {}
+```
+
+- The string tag must be unique across all services.
+- Use `yield* MyApiClient` inside `Effect.gen(function* () { ... })` to
+  access the service.
+
+## 3. Defining typed errors
+
+```ts
+import { Data } from "effect";
+
+export class MyError extends Data.TaggedError("MyError")<{
+  readonly message: string;
+  readonly cause?: unknown;
+}> {}
+```
+
+`ConnectorError` is defined this way. 
Prefer tagged errors over plain
+classes; they play well with `Effect.catchTag`.
+
+## 4. Config and ConfigProvider
+
+```ts
+export const MyConfig = Config.all({
+  apiToken: Config.string("FOO_API_TOKEN"),
+  apiBaseUrl: Config.string("FOO_API_BASE_URL").pipe(Config.withDefault("https://api.foo.com")),
+  webhookSecret: Config.option(Config.string("FOO_WEBHOOK_SECRET")),
+});
+```
+
+- `Config.option(...)` returns `Option.Option<A>` — check with
+  `Option.isSome` / `Option.isNone`.
+- `Config.withDefault(v)` makes a field optional with a fallback.
+- `Config.port(name)` parses a port number (integer).
+- `Config.boolean(name)` parses `"true"` / `"false"`.
+
+Runtime wiring:
+
+```ts
+Layer.succeed(
+  ConfigProvider.ConfigProvider,
+  ConfigProvider.fromEnv(), // or fromUnknown({ FOO_API_TOKEN: "..." })
+);
+```
+
+Never read `process.env` directly in library code; always go through
+`Config`.
+
+## 5. Layers
+
+- `Layer.succeed(Tag)(impl)` — constant service.
+- `Layer.effect(Tag)(effect)` — service built from an Effect.
+- `Layer.mergeAll(a, b, c)` — combine two or more layers.
+- `Layer.provide(layer)` — provide a sub-layer that the outer layer depends on.
+- `Layer.unwrap(Effect.gen(function* () { return Layer.mergeAll(...) }))` —
+  dynamically decide which layers to build based on config.
+- `Layer.empty` — the no-op layer, useful in `Layer.unwrap` branches.
+
+## 6. Effect.gen is the default style
+
+```ts
+Effect.gen(function* () {
+  const config = yield* MyConfig;
+  const api = yield* MyApiClient;
+  const rows = yield* api.fetchList(schema, path, options);
+  return rows;
+});
+```
+
+- Use `yield*` for every Effect, never `await`.
+- Mapping simple values: `Effect.map`, `Effect.andThen`.
+- Mapping errors: `Effect.mapError` or `Effect.catchTag`.
+
+## 7. 
HttpClient pipeline
+
+```ts
+// Inside Effect.gen(function* () { ... }):
+const client = (yield* HttpClient.HttpClient).pipe(
+  HttpClient.mapRequest(HttpClientRequest.prependUrl(baseUrl)),
+  HttpClient.mapRequest(HttpClientRequest.bearerToken(token)),
+  HttpClient.mapRequest(HttpClientRequest.acceptJson),
+);
+
+const request = HttpClientRequest.get("/v1/things").pipe(
+  HttpClientRequest.setUrlParams({ page: "1" }),
+);
+
+const rows = yield* Effect.scoped(
+  client.execute(request).pipe(
+    Effect.flatMap(HttpClientResponse.filterStatusOk),
+    Effect.flatMap((response) => response.json),
+    Effect.flatMap(Schema.decodeUnknownEffect(schema)),
+  ),
+);
+```
+
+- Always `Effect.scoped(...)` around `client.execute(...)` unless the
+  surrounding context is already scoped.
+- `HttpClient.transform((effect, request) => ...)` lets you wrap requests
+  (that's how VCR is built).
+
+## 8. Streams
+
+- `Stream.fromEffect(e)` — single-element stream.
+- `Stream.fromQueue(q)` — stream that emits whatever is pushed to the queue.
+- `Stream.unfold(state, step)` — the building block behind `makePullStream`.
+- `Stream.merge(a, b)` — run two streams concurrently.
+- `Stream.map(s, f)`, `Stream.mapEffect(s, f)` — transform batches.
+- `Stream.tap(s, f)` — side effect on each element.
+- `Stream.runForEach(s, f)` — drain.
+
+## 9. Concurrency primitives
+
+- `Ref.make(value)`, `Ref.get(ref)`, `Ref.update(ref, fn)`,
+  `Ref.updateAndGet(ref, fn)`.
+- `Deferred.make()`, `Deferred.succeed(d, v)`, `Deferred.await(d)`.
+- `Queue.bounded(capacity)`, `Queue.offer(q, v)`, `Queue.take(q)`.
+- `Effect.forkScoped(effect)` — spawn in the current scope; the fiber is
+  interrupted when the scope closes.
+- `Effect.all([a, b], { concurrency: "unbounded" })` — run in parallel.
+
+## 10. 
Schema
+
+```ts
+const Post = Schema.Struct({
+  id: Schema.Number,
+  title: Schema.String,
+  body: Schema.NullOr(Schema.String),
+  tags: Schema.Array(Schema.String),
+  status: Schema.Literals(["draft", "published"]),
+  metadata: Schema.Record(Schema.String, Schema.Any),
+  nested: Schema.optional(Schema.Struct({ foo: Schema.String })),
+});
+
+type Post = Schema.Schema.Type<typeof Post>;
+```
+
+- `Schema.decodeUnknownEffect(schema)(value)` returns an `Effect` that
+  succeeds with the decoded value and fails with a decode error.
+- Use `Schema.Any` for fields you don't want to validate (common for Polar's
+  `product` / `discount` fields which are large nested objects).
+
+## 11. Observability
+
+- `Effect.withSpan("span.name", { attributes: {...} })` — wrap an effect
+  in a tracing span.
+- `Metric.counter("name", { description })`, `Metric.histogram("name", { boundaries })`,
+  `Metric.update(metric, value)`, `Metric.withAttributes(metric, attrs)`.
+- Provide telemetry via `Observability.Otlp.layerJson({ baseUrl, resource })`
+  from `effect/unstable/observability`.
+
+Avoid high-cardinality labels (user ids, request ids, timestamps).
+
+## 12. Vitest + Effect
+
+```ts
+import { describe, expect, it } from "@effect/vitest";
+
+describe("things", () => {
+  it.effect("works", () =>
+    Effect.gen(function* () {
+      const result = yield* something;
+      expect(result).toBe(42);
+    }).pipe(Effect.provide(TestLayer)),
+  );
+});
+```
+
+- `it.effect` expects an Effect. The framework runs it with a default
+  runtime and fails the test on any unhandled defect.
+- To run your own scoped effect, wrap with `Effect.scoped`.
+
+---
+
+## What **not** to do
+
+- `import { ... } from "@effect/platform"` — v2/v3 only.
+- `import * as Schema from "@effect/schema"` — v2/v3 only.
+- `ServiceMap.Service` examples — use `Context.Service` instead.
+- `process.env.FOO` in library code — always `Config.string("FOO")`.
+- `Effect.die(new Error(...))` for expected failures — use tagged errors.
+- `async/await` inside `Effect.gen` — use `yield*`. 
+- Mutating a `Ref` without `Ref.update` — the whole point is atomic updates. +- `Stream.bracket`, `Stream.ensuring` from v2 — v4 uses `Effect.scoped` + and `Scope` instead. + +When a pattern you find online doesn't match what's in the repo, trust +the repo: `connectors/producer-polar/` and `packages/connector-kit/` +are the ground truth. diff --git a/.agents/skills/airfoil-kit/references/effect-vcr-api.md b/.agents/skills/airfoil-kit/references/effect-vcr-api.md new file mode 100644 index 0000000..78abe55 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/effect-vcr-api.md @@ -0,0 +1,80 @@ +# effect-vcr-api + +Reference notes for `@useairfoil/effect-vcr`. + +## Package exports + +From `packages/effect-vcr/src/index.ts`: + +- `CassetteStore` namespace (`cassette-store.ts`) +- `FileSystemCassetteStore` namespace (`file-system-cassette-store.ts`) +- `VcrHttpClient` namespace (`vcr-http-client.ts`) + +## Core types and services + +- `CassetteStore.CassetteStore` service tag +- `CassetteStore.CassetteStoreError` +- `CassetteStore.createEmptyCassette()` +- `CassetteStore.createEmptyCassetteFile()` + +`FileSystemCassetteStore.layer()` provides a filesystem-backed cassette store. + +## VCR HTTP layer + +Use `VcrHttpClient.layer({ ... })` to wrap `HttpClient.HttpClient` with +record/replay behavior. + +Common config fields: + +- `vcrName?: string` +- `cassetteName?: string` +- `mode?: "record" | "replay" | "auto"` +- `redact?: { requestHeaders?, responseHeaders?, requestBodyKeys?, responseBodyKeys? }` +- `matchIgnore?: { requestHeaders?, requestBodyKeys? 
}`
+- `match?: (request, entry) => boolean`
+
+## Typical test wiring
+
+```ts
+import { NodeServices } from "@effect/platform-node";
+import { FileSystemCassetteStore, VcrHttpClient } from "@useairfoil/effect-vcr";
+import { Effect, Layer } from "effect";
+import { FetchHttpClient } from "effect/unstable/http";
+
+const vcrLayer = VcrHttpClient.layer({
+  vcrName: "producer-",
+  mode: "replay",
+}).pipe(
+  Layer.provideMerge(FileSystemCassetteStore.layer()),
+  Layer.provideMerge(FetchHttpClient.layer),
+  Layer.provideMerge(NodeServices.layer),
+);
+
+const program = Effect.gen(function* () {
+  // call connector runtime / stream logic that needs HttpClient
+});
+
+const runnable = program.pipe(Effect.provide(vcrLayer));
+```
+
+## Cassette naming
+
+Under Vitest, inferred cassette names follow `<test file name>.cassette`.
+
+Example:
+
+- test file: `test/api.vcr.test.ts`
+- cassette file: `test/__cassettes__/api.vcr.test.cassette`
+
+## Environment behavior
+
+- `ACK_DISABLE_VCR` disables VCR for matching `vcrName`s (a comma-separated
+  list, or `*` for all).
+- In `auto` mode, missing cassette behavior is CI-sensitive (fails in CI,
+  records locally).
+
+## Source of truth
+
+- `packages/effect-vcr/src/types.ts`
+- `packages/effect-vcr/src/cassette-store.ts`
+- `packages/effect-vcr/src/file-system-cassette-store.ts`
+- `packages/effect-vcr/src/vcr-http-client.ts`
diff --git a/.agents/skills/airfoil-kit/references/example-auth.md b/.agents/skills/airfoil-kit/references/example-auth.md
new file mode 100644
index 0000000..84b1b6a
--- /dev/null
+++ b/.agents/skills/airfoil-kit/references/example-auth.md
@@ -0,0 +1,219 @@
+# example-auth
+
+Auth patterns expressed as Effect `Config` + `HttpClient.mapRequest`.
+All patterns plug into the `XApiClientConfig(config)` factory layer from
+`api.ts`. Nothing here requires changes to the connector kit.
+
+These are illustrative implementation patterns, not a protocol contract. 
+Always implement authentication according to official platform docs for the
+target service.
+
+## Bearer token (Polar, Stripe, GitHub v4, most modern APIs)
+
+**Config**:
+
+```ts
+export const XConfigConfig = Config.all({
+  apiBaseUrl: Config.string("X_API_BASE_URL"),
+  accessToken: Config.string("X_ACCESS_TOKEN"),
+});
+```
+
+**HttpClient wiring** (inside `api.ts`):
+
+```ts
+import { HttpClient, HttpClientRequest } from "effect/unstable/http";
+import { Redacted } from "effect";
+
+export const XApiClientConfig = (config: XConfig) =>
+  Layer.effect(XApiClient)(
+    Effect.gen(function* () {
+      const httpClient = yield* HttpClient.HttpClient;
+      const client = httpClient.pipe(
+        HttpClient.mapRequest(HttpClientRequest.prependUrl(config.apiBaseUrl)),
+        HttpClient.mapRequest(HttpClientRequest.bearerToken(Redacted.make(config.accessToken))),
+        HttpClient.mapRequest(HttpClientRequest.acceptJson),
+      );
+      // ... fetchJson, fetchList built from client
+    }),
+  );
+```
+
+Notes:
+
+- `Redacted.make(...)` wraps the token so it doesn't appear in logs.
+- `HttpClientRequest.bearerToken` sets `Authorization: Bearer <token>`.
+  The VCR layer redacts this header by default.
+
+## API key in a custom header
+
+For services that send the token in a custom header, such as Anthropic
+(`x-api-key`): the wiring matches the bearer pattern, only the header
+name changes. (SendGrid uses plain bearer tokens; Twilio uses basic
+auth, covered below.) 
+
+```ts
+const client = httpClient.pipe(
+  HttpClient.mapRequest(HttpClientRequest.prependUrl(config.apiBaseUrl)),
+  HttpClient.mapRequest(HttpClientRequest.setHeader("x-api-key", config.apiKey)),
+  HttpClient.mapRequest(HttpClientRequest.acceptJson),
+);
+```
+
+**Redact** the custom header in tests:
+
+```ts
+VcrHttpClient.layer({
+  vcrName: "producer-x",
+  redact: { requestHeaders: ["authorization", "x-api-key"] },
+});
+```
+
+## Basic auth (Twilio, Jira on-prem)
+
+```ts
+HttpClient.mapRequest(
+  HttpClientRequest.basicAuth(config.accountSid, Redacted.make(config.authToken)),
+);
+```
+
+Config:
+
+```ts
+export const XConfigConfig = Config.all({
+  apiBaseUrl: Config.string("X_API_BASE_URL"),
+  accountSid: Config.string("X_ACCOUNT_SID"),
+  authToken: Config.string("X_AUTH_TOKEN"),
+});
+```
+
+## OAuth2 with long-lived refresh token
+
+The MVP pattern: store a refresh token in `.env`, exchange it on startup,
+keep the access token in a `Ref`.
+
+```ts
+type TokenState = { accessToken: string; expiresAt: number };
+
+export class XOAuthTokens extends Context.Service<XOAuthTokens, Ref.Ref<TokenState>>()(
+  "@useairfoil/producer-x/XOAuthTokens",
+) {}
+
+export const XOAuthTokensConfig = (config: XConfig) =>
+  Layer.effect(XOAuthTokens)(
+    Effect.gen(function* () {
+      const httpClient = yield* HttpClient.HttpClient;
+      const initial = yield* exchangeRefreshToken(httpClient, config);
+      return yield* Ref.make(initial);
+    }),
+  );
+```
+
+On 401, refresh and retry. 
The cleanest way is to wrap the client:
+
+```ts
+// Assumes `config: XConfig` and `exchangeRefreshToken` are in scope
+// (e.g. parameters of the surrounding factory, as in `api.ts`).
+const clientWithRefresh = Effect.gen(function* () {
+  const tokens = yield* XOAuthTokens;
+  const base = yield* HttpClient.HttpClient;
+  const withBearer = (accessToken: string) =>
+    base.pipe(HttpClient.mapRequest(HttpClientRequest.bearerToken(Redacted.make(accessToken))));
+  return {
+    execute: (request: HttpClientRequest.HttpClientRequest) =>
+      Effect.gen(function* () {
+        const current = yield* Ref.get(tokens);
+        const first = yield* withBearer(current.accessToken).execute(request);
+        if (first.status !== 401) return first;
+        const fresh = yield* exchangeRefreshToken(base, config);
+        yield* Ref.set(tokens, fresh);
+        return yield* withBearer(fresh.accessToken).execute(request);
+      }),
+  };
+});
+```
+
+Surface to the user: "For the MVP I'm using a static refresh token from
+`.env`. If you need full OAuth2 with user redirect, I need to know your
+redirect URI and where to store tokens."
+
+## Signed-request auth (AWS SigV4, some enterprise APIs)
+
+Use the platform's signing library (`@aws-sdk/signature-v4`, etc.) and
+wrap as an `HttpClient.mapRequestEffect`:
+
+```ts
+HttpClient.mapRequestEffect((request) =>
+  Effect.tryPromise({
+    try: () => signer.sign(request),
+    catch: (cause) => new ConnectorError({ message: "SigV4 signing failed", cause }),
+  }),
+);
+```
+
+- Use `mapRequestEffect` (not `mapRequest`) because signing is async.
+- Never roll your own SigV4 — use the vendor library.
+
+## Per-tenant credentials
+
+When a single connector instance serves multiple tenants (rare but
+possible):
+
+```ts
+export const XConfigConfig = Config.all({
+  apiBaseUrl: Config.string("X_API_BASE_URL"),
+  tenantTokens: Config.hashMap(Config.string(), "X_TENANT_TOKENS"), // "alice=t1,bob=t2"
+});
+```
+
+- Choose the token at request time based on some tenant context.
+- Requires a per-request layer wrapping the HttpClient. Complex — push
+  back on the user if you can scope to one tenant per connector
+  instance. 
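+
+## Proactive token refresh (optional)
+
+The refresh-on-401 pattern above is reactive. If the token exchange also
+returns an expiry, you can refresh proactively before a request goes out.
+The predicate below is an illustrative, stdlib-only sketch; the 60-second
+margin for clock skew and request latency is an assumption, not a
+platform contract.
+
+```ts
+type TokenState = { accessToken: string; expiresAt: number };
+
+// True when the token should be refreshed before use, leaving a safety
+// margin so a request never departs with a nearly-expired token.
+const shouldRefresh = (
+  state: TokenState,
+  nowMs: number,
+  marginMs = 60_000,
+): boolean => nowMs >= state.expiresAt - marginMs;
+```
+
+Check `shouldRefresh` against the current `Ref` value before issuing a
+request; keep the 401 retry as a backstop for revoked tokens.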
+
+## Sandbox vs production
+
+Two common shapes:
+
+1. **Different URL, same token format** (Polar):
+
+   ```
+   X_API_BASE_URL=https://sandbox-api.x.com/v1/   # default
+   X_API_BASE_URL=https://api.x.com/v1/           # production
+   ```
+
+   One env var toggles environments.
+
+2. **Same URL, token prefix differentiates** (Stripe):
+
+   ```
+   STRIPE_API_KEY=sk_test_...   # test mode
+   STRIPE_API_KEY=sk_live_...   # live
+   ```
+
+   No URL change; the key tells the platform which mode.
+
+Document which model your connector uses in README.
+
+## Redacted logging
+
+Always wrap secrets in `Redacted.make(secret)` when passing through
+`HttpClientRequest.bearerToken` / `basicAuth` / custom headers. Effect's
+logger will render these as `<redacted>` and error messages won't leak
+them.
+
+## What NOT to do
+
+- Don't read `process.env` in `api.ts` or elsewhere. Use `Config`.
+- Don't embed tokens in cassettes. The `authorization` header is
+  redacted by default, but custom headers need explicit redaction.
+- Don't skip Bearer for "easier" querystring auth (`?api_key=...`).
+  Query strings leak in logs and cassettes.
+- Don't store tokens globally. The token is part of the `Config`
+  struct, not module-level state. 
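+
+## Detecting the environment from a key prefix
+
+For the Stripe-style model above (same URL, prefix-differentiated keys),
+a tiny helper makes the active mode explicit in startup logs and sanity
+checks. The prefixes here are illustrative; use whatever the target
+platform documents:
+
+```ts
+// Illustrative Stripe-style prefixes; substitute the platform's own.
+const inferKeyMode = (apiKey: string): "test" | "live" | "unknown" => {
+  if (apiKey.startsWith("sk_test_")) return "test";
+  if (apiKey.startsWith("sk_live_")) return "live";
+  return "unknown";
+};
+```
+
+Log the result at startup (never the key itself) so a mismatch between
+the intended environment and the supplied key is caught early.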
+
+## Decision matrix
+
+| API signal                          | Pattern                               |
+| ----------------------------------- | ------------------------------------- |
+| `Authorization: Bearer <token>`     | Bearer token                          |
+| Custom header with token            | API key header                        |
+| `user:pass` base64 in Authorization | Basic auth                            |
+| OAuth2 with refresh                 | Refresh-on-401 via Ref                |
+| AWS / GCP signed requests           | Platform library + `mapRequestEffect` |
+| Short-lived STS tokens              | Refresh-on-401 with ambient provider  |
diff --git a/.agents/skills/airfoil-kit/references/example-pagination.md b/.agents/skills/airfoil-kit/references/example-pagination.md
new file mode 100644
index 0000000..87eb4fe
--- /dev/null
+++ b/.agents/skills/airfoil-kit/references/example-pagination.md
@@ -0,0 +1,186 @@
+# example-pagination
+
+Every connector needs a historical backfill. In this repo, backfill paging
+should use `makePullStream` from `@useairfoil/connector-kit`.
+
+This document is a pattern catalog, not an exhaustive list of API-specific
+cases. Pick the pattern that matches observed platform behavior and verify it
+from docs + recorded traffic.
+
+Source of truth: `packages/connector-kit/src/streams/pull-stream.ts`.
+
+## Current `makePullStream` shape
+
+```ts
+type PullPage<Cursor, Row> = {
+  readonly cursor: Cursor;
+  readonly rows: ReadonlyArray<Row>;
+  readonly hasMore: boolean;
+};
+
+type PullStreamOptions<Cursor, Row, R> = {
+  readonly initialCursor?: Cursor;
+  readonly fetchPage: (
+    cursor: Cursor | undefined,
+  ) => Effect.Effect<PullPage<Cursor, Row>, ConnectorError, R>;
+};
+```
+
+Important behavior:
+
+- `fetchPage` receives only the previous cursor.
+- Return `{ cursor, rows, hasMore }`.
+- Empty pages are skipped automatically while `hasMore === true`.
+- Stream ends when `hasMore === false` and no rows remain.
+
+## Baseline pattern
+
+```ts
+const backfill = makePullStream({
+  initialCursor: 1,
+  fetchPage: (cursor) =>
+    Effect.gen(function* () {
+      const page = typeof cursor === "number" ? 
cursor : 1;
+      const response = yield* api.fetchList(PostSchema, "/posts", {
+        page,
+        limit: 100,
+      });
+
+      return {
+        cursor: response.hasMore ? page + 1 : page,
+        rows: response.items,
+        hasMore: response.hasMore,
+      };
+    }),
+});
+```
+
+## Page + limit
+
+Use numeric page cursors.
+
+```ts
+fetchPage: (cursor) =>
+  Effect.gen(function* () {
+    const page = typeof cursor === "number" ? cursor : 1;
+    const response = yield* api.fetchList(Schema, "/things", {
+      page,
+      limit: 100,
+    });
+
+    return {
+      cursor: response.hasMore ? page + 1 : page,
+      rows: response.items,
+      hasMore: response.hasMore,
+    };
+  });
+```
+
+## Cursor token (`starting_after`, `next_token`)
+
+Use opaque string cursors.
+
+```ts
+fetchPage: (cursor) =>
+  Effect.gen(function* () {
+    const response = yield* api.fetchCursorPage(Schema, {
+      starting_after: typeof cursor === "string" ? cursor : undefined,
+      limit: 100,
+    });
+
+    const last = response.items.at(-1);
+    const next = response.nextToken ?? last?.id;
+
+    return {
+      cursor: next ?? cursor ?? "",
+      rows: response.items,
+      hasMore: Boolean(next),
+    };
+  });
+```
+
+## Link-header pagination
+
+If the API returns `rel="next"`, parse it in `api.ts` and emit it as a cursor.
+
+Important: continuation URLs from link headers may be absolute URLs or
+relative paths. Verify which form your API returns. If your HTTP client
+prepends a base URL for relative paths, do not apply that transform to
+already-absolute continuation URLs.
+
+```ts
+fetchPage: (cursor) =>
+  Effect.gen(function* () {
+    const url =
+      typeof cursor === "string" && cursor.length > 0
+        ? cursor
+        : "/repos/org/repo/issues?per_page=100";
+
+    const { items, nextUrl } = yield* api.fetchListWithLinkHeader(Schema, url);
+
+    return {
+      cursor: nextUrl ?? url,
+      rows: items,
+      hasMore: Boolean(nextUrl),
+    };
+  });
+```
+
+## Offset + limit
+
+```ts
+fetchPage: (cursor) =>
+  Effect.gen(function* () {
+    const offset = typeof cursor === "number" ? 
cursor : 0; + const response = yield* api.fetchOffsetPage(Schema, { + offset, + limit: 100, + }); + + return { + cursor: offset + response.items.length, + rows: response.items, + hasMore: response.items.length === 100, + }; + }); +``` + +## Time-window pagination + +```ts +fetchPage: (cursor) => + Effect.gen(function* () { + const since = typeof cursor === "string" ? cursor : "1970-01-01T00:00:00Z"; + + const response = yield* api.fetchEvents(Schema, { since, limit: 500 }); + const lastTs = response.items.at(-1)?.created_at; + + return { + cursor: lastTs ?? since, + rows: response.items, + hasMore: response.items.length === 500, + }; + }); +``` + +When timestamps can tie, prefer a cursor that includes a tie-breaker +(timestamp + id), or use entity primary-key dedupe defensively. + +## `initialCutoff` with `runConnector` + +`runConnector` currently accepts: + +```ts +runConnector(connector, { + initialCutoff?: Cursor, + webhook?: { ... }, +}) +``` + +`initialCutoff` is a single cursor value (for example `new Date()` or an +ISO timestamp string), not a keyed object. + +## Practical guidance + +- Prefer server-emitted monotonic cursors (`created_at`, `updated_at`, id). +- Keep page sizes bounded. +- Map transport/parsing failures into `ConnectorError`. +- Add retry/backoff around rate-limit responses where needed. diff --git a/.agents/skills/airfoil-kit/references/example-producer-polar.md b/.agents/skills/airfoil-kit/references/example-producer-polar.md new file mode 100644 index 0000000..c945708 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/example-producer-polar.md @@ -0,0 +1,283 @@ +# example-producer-polar + +Kitchen-sink walkthrough of `connectors/producer-polar/`. This connector +exercises nearly every feature of the kit: four entities, real HMAC +verification, optional config values, the Deferred-cutoff handoff from +webhooks to backfill, a sandbox base URL, and a VCR test suite. 
Read +this after the template walkthrough to see "what good looks like" for a +real connector. + +Everything you see here is source-of-truth code. Reference the actual +files instead of re-typing blocks. + +## File inventory + +``` +connectors/producer-polar/ + src/ + api.ts # 103 lines — PolarApiClient service + layer + connector.ts # 344 lines — PolarConfigConfig, webhook dispatch, runtime + index.ts # 8 lines — public re-exports + sandbox.ts # 132 lines — local runner with telemetry toggle + schemas.ts # 234 lines — four entity schemas + webhook payload union + streams.ts # 120 lines — makeEntityStreams, dispatch helpers + test/ + api.vcr.test.ts # 58 lines — per-entity replay tests + helpers.ts # 29 lines — test publisher + webhook.test.ts # 90 lines — end-to-end webhook dispatch test + __cassettes__/ # committed recorded responses + package.json, tsconfig.json, tsdown.config.ts, vitest.config.ts, README.md +``` + +## `src/connector.ts` — the centerpiece + +```45:50:connectors/producer-polar/src/connector.ts +export class PolarConnector extends Context.Service< + PolarConnector, + PolarConnectorRuntime +>()("@useairfoil/producer-polar/PolarConnector") {} +``` + +- `PolarConnector` is the service tag. It holds the fully-assembled + `{ connector, routes }` pair. Callers inject it into `runConnector`. + +```52:59:connectors/producer-polar/src/connector.ts +export const PolarConfigConfig = Config.all({ + accessToken: Config.string("POLAR_ACCESS_TOKEN"), + apiBaseUrl: Config.string("POLAR_API_BASE_URL").pipe( + Config.withDefault("https://sandbox-api.polar.sh/v1/"), + ), + organizationId: Config.option(Config.string("POLAR_ORGANIZATION_ID")), + webhookSecret: Config.option(Config.string("POLAR_WEBHOOK_SECRET")), +}); +``` + +- `Config.all({...})` composes the four required/optional env vars. +- `Config.withDefault` points at the sandbox by default — this is the + "sandbox archetype" (see `connector-archetypes.md`). 
+- `Config.option` lets the two optional fields be absent without + failing decode. + +```62:83:connectors/producer-polar/src/connector.ts +const verifyWebhookSignature = (options: { + readonly rawBody: Uint8Array; + readonly headers: Headers.Headers; + readonly secret: string; +}): Effect.Effect => + Effect.try({ + try: () => { + validateEvent( + Buffer.from(options.rawBody), + options.headers, + options.secret, + ); + }, + catch: (error) => + new ConnectorError({ + message: + error instanceof WebhookVerificationError + ? "Invalid Polar webhook signature" + : "Failed to validate Polar webhook", + cause: error, + }), + }); +``` + +- Uses Polar's official SDK (`@polar-sh/sdk/webhooks.validateEvent`) + rather than rolling its own HMAC. Prefer official libs when the + platform ships one. +- Maps ambient errors into a typed `ConnectorError` with a meaningful + message. + +```86:220:connectors/producer-polar/src/connector.ts +const resolveWebhookDispatch = (options: { + readonly payload: WebhookPayload; + readonly customers: EntityStreams; + ... +}) => { + switch (payload.type) { + case "checkout.created": + case "checkout.updated": + ... + } +}; +``` + +- Exhaustive switch over every webhook type documented by Polar. +- Event types that fan out to the same entity are merged (e.g., all + `checkout.*` go to the `checkouts` stream). +- Types that Polar sends but we intentionally ignore (membership, + refunds, products) fall through to `Effect.void`. +- Unknown types hit the default branch and emit `logWarning` — a + deliberate trade-off: we don't fail on new webhook types, but we + make them visible. + +```223:321:connectors/producer-polar/src/connector.ts +const makePolarConnector = (config: PolarConfig) => + Effect.gen(function* () { + const api = yield* PolarApiClient; + const customerStreams = yield* makeEntityStreams({ api, schema: CustomerSchema, path: "customers/", cursorField: "created_at" }); + // ... 
three more entity streams + const connector = defineConnector({ + name: "producer-polar", + entities: [ defineEntity({...}), ... ], + events: [], + }); + const webhookRoute: WebhookRoute = { + path: "/webhooks/polar", + schema: WebhookPayloadSchema, + handle: (payload, request, rawBody) => + Effect.gen(function* () { + if (Option.isSome(config.webhookSecret) && rawBody) { + yield* verifyWebhookSignature({...}); + } + yield* resolveWebhookDispatch({...}); + }), + }; + return { connector, routes: [webhookRoute] }; + }); +``` + +- Four entities, each wired through `makeEntityStreams`. +- `cursorField: "created_at"` is Polar's monotonically-increasing field. +- A single webhook route handles all four entities. +- `Option.isSome(config.webhookSecret) && rawBody` gates verification — + missing secret in dev is a warn, not an error. + +```323:344:connectors/producer-polar/src/connector.ts +export const PolarConnectorConfig = (): Layer.Layer< + PolarConnector, + ConnectorError, + HttpClient.HttpClient +> => + Layer.effect(PolarConnector)( + Effect.gen(function* () { + const config = yield* PolarConfigConfig; + return yield* makePolarConnector(config).pipe( + Effect.provide(PolarApiClientConfig(config)), + ); + }).pipe( + Effect.mapError((error) => + error instanceof ConnectorError + ? error + : new ConnectorError({ message: "Polar config failed", cause: error }), + ), + ), + ); +``` + +- The public layer factory. Reads config, builds the API client layer + on the fly, produces the runtime. +- Narrows the error channel to `ConnectorError`. +- Requires `HttpClient.HttpClient` — callers supply this via + `FetchHttpClient.layer` or `VcrHttpClient.layer(...).pipe(Layer.provide(FetchHttpClient.layer))`. + +## `src/api.ts` — HTTP layer + +The shape is exactly the pattern described in `patterns.md` §4: + +- `PolarApiClient` service tag. +- `fetchJson(schema, path, params?)` for single-resource fetches. 
+- `fetchList(schema, path, options)` for paginated lists — Polar uses + `page`/`limit` query params and returns `{ items, pagination }`. +- Base URL + bearer header are baked into the `HttpClient` via + `HttpClient.mapRequest(HttpClientRequest.prependUrl(...))` + + `HttpClientRequest.bearerToken(accessToken)`. +- `PolarApiClientConfig(config)` factory layer provides the service, + requiring `HttpClient.HttpClient` from below. + +Use this as the template for any bearer-token + page+limit API. + +## `src/schemas.ts` — data shapes + +- Four `Schema.Struct` definitions (Customer, Checkout, Subscription, + Order). +- A `WebhookPayloadSchema = Schema.Union([...])` that tags each payload + variant with its literal `type`. +- Ignored event types appear in a second `Schema.Struct` with + `type: Schema.Literal(...)` and an open `data: Schema.Any` — this + lets decode succeed so the connector can log+skip rather than crash. + +Patterns to steal: + +- Optional fields wrapped with `Schema.NullOr(...)` when the API returns + `null` for empty values. +- Timestamp fields typed as `Schema.String` (ISO-8601) rather than + `Schema.Date`, because the cursor is a string. + +## `src/streams.ts` — stream wiring + +- `resolveCursor(row, field)` extracts the cursor value from a row. +- `setCutoff(deferred, cursor)` is idempotent — safe to call on every + incoming webhook. +- `dispatchEntityWebhook({queue, cutoff, row, cursor})` sets the cutoff + and enqueues. +- `makeBackfillStream(...)` wraps `makePullStream` with a cutoff filter. +- `makeEntityStreams(...)` creates the `{live, cutoff, backfill}` trio. + +This file is almost entirely generic — 90% of it is reusable across +connectors (and is essentially what the template ships). + +## `src/sandbox.ts` — local runner + +- Reads `ACK_TELEMETRY_ENABLED` (via Effect Config) to toggle OTLP + export. When enabled, composes + `Observability.Otlp.layerJson(...) + Metric.enableRuntimeMetricsLayer`. 
+- Mounts `NodeHttpServer.layer(createServer, { port: webhookPort })`. +- Uses `StateStoreInMemory` + `ConsolePublisherLayer` for zero + infrastructure — prints batches to stdout. +- Entry point: `Effect.runPromise(program.pipe(Effect.provide(RuntimeLayer)))`. + +Copy this sandbox shape unchanged for any connector; only the +connector-specific layer (`PolarConnectorConfig` → `XConnectorConfig`) +and env-var name change. + +## `test/api.vcr.test.ts` — replay tests + +- One cassette covers all four list endpoints. +- Each test decodes the real response through the schema. If the schema + drifts from the cassette, the test fails — this is the mechanism that + keeps schemas honest. + +## `test/webhook.test.ts` — end-to-end + +- Uses `NodeHttpServer.layerTest` for an in-process HTTP transport. +- Forks `runConnector(...)` with `Effect.forkScoped`, so the webhook + route is actually mounted. +- POSTs a realistic `customer.created` payload. +- Uses `makeTestPublisher` to capture the emitted batch, then asserts + shape. + +This is the template for every webhook test — the only thing that +changes is the payload fixture and the expected stream. + +## `test/helpers.ts` — test publisher + +- ~29 lines. Creates a `Publisher` layer that buffers batches into a + `Ref` and resolves a `Deferred` after `expected` deliveries. +- Drop-in for any connector test. + +## What NOT to copy verbatim + +- `@polar-sh/sdk` dependency — Polar-specific. +- The four entity names / cursor fields — platform-specific. +- The list of ignored webhook types — this is the Polar event catalog. +- `POLAR_*` env var names. + +## Anatomy summary + +Polar demonstrates: + +- Single-tenant sandbox-URL archetype. +- Bearer token auth. +- Page+limit pagination. +- Webhook-driven live + API-driven backfill. +- Cutoff-deferred handoff. +- Optional signing secret with friendly warning. +- Telemetry wiring toggled by one env var. +- VCR replay tests + in-process webhook tests. 
+
+If you're building a connector that matches these shapes, Polar is the
+best code to mirror. If your target differs (OAuth, per-tenant URL,
+polling-only), combine Polar's structure with the relevant archetype
+delta in `connector-archetypes.md`.
diff --git a/.agents/skills/airfoil-kit/references/example-webhook-verification.md b/.agents/skills/airfoil-kit/references/example-webhook-verification.md
new file mode 100644
index 0000000..915e7b3
--- /dev/null
+++ b/.agents/skills/airfoil-kit/references/example-webhook-verification.md
@@ -0,0 +1,243 @@
+# example-webhook-verification
+
+Illustrative webhook verification patterns for common platforms.
+Use these as implementation references only. The target platform's official
+webhook docs are the contract source of truth and may require different
+algorithms, canonicalization rules, encodings, or replay protections.
+
+All examples assume you receive `rawBody: Uint8Array` (provided by the
+kit in the webhook handler) and return
+`Effect.Effect<void, ConnectorError>`.
+
+## Stripe
+
+Header: `Stripe-Signature`
+Format: `t=<timestamp>,v1=<signature>`
+Scheme: HMAC-SHA256 of `"<timestamp>.<rawBody>"`, hex lowercase.
+
+```ts
+import { createHmac, timingSafeEqual } from "node:crypto";
+import { Effect } from "effect";
+import { ConnectorError } from "@useairfoil/connector-kit";
+
+const verifyStripeSignature = (options: {
+  readonly rawBody: Uint8Array;
+  readonly headers: Headers.Headers;
+  readonly secret: string;
+  readonly toleranceSeconds?: number;
+}): Effect.Effect<void, ConnectorError> =>
+  Effect.try({
+    try: () => {
+      const header = Headers.get(options.headers, "stripe-signature");
+      if (!header) throw new Error("Missing Stripe-Signature header");
+      const parts = Object.fromEntries(
+        header.split(",").map((kv) => kv.split("=") as [string, string]),
+      );
+      const timestamp = Number(parts.t);
+      const expected = parts.v1;
+      if (!timestamp || !expected) throw new Error("Malformed signature");
+      const tolerance = options.toleranceSeconds ?? 
300;
+      const now = Math.floor(Date.now() / 1000);
+      if (Math.abs(now - timestamp) > tolerance) {
+        throw new Error("Signature timestamp outside tolerance");
+      }
+      const payload = `${timestamp}.${Buffer.from(options.rawBody).toString("utf8")}`;
+      const hmac = createHmac("sha256", options.secret).update(payload).digest();
+      const provided = Buffer.from(expected, "hex");
+      if (provided.length !== hmac.length || !timingSafeEqual(hmac, provided)) {
+        throw new Error("Invalid signature");
+      }
+    },
+    catch: (cause) => new ConnectorError({ message: "Stripe webhook verification failed", cause }),
+  });
+```
+
+Env: `STRIPE_WEBHOOK_SECRET` (starts with `whsec_`).
+
+## Shopify
+
+Header: `X-Shopify-Hmac-Sha256`
+Format: base64-encoded HMAC-SHA256 of the raw body.
+
+```ts
+const verifyShopifySignature = (options: {
+  readonly rawBody: Uint8Array;
+  readonly headers: Headers.Headers;
+  readonly secret: string;
+}): Effect.Effect<void, ConnectorError> =>
+  Effect.try({
+    try: () => {
+      const signature = Headers.get(options.headers, "x-shopify-hmac-sha256");
+      if (!signature) throw new Error("Missing Shopify signature");
+      const hmac = createHmac("sha256", options.secret)
+        .update(Buffer.from(options.rawBody))
+        .digest();
+      const provided = Buffer.from(signature, "base64");
+      if (provided.length !== hmac.length || !timingSafeEqual(hmac, provided)) {
+        throw new Error("Invalid signature");
+      }
+    },
+    catch: (cause) => new ConnectorError({ message: "Shopify webhook verification failed", cause }),
+  });
+```
+
+Env: `SHOPIFY_WEBHOOK_SECRET`.
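+
+The HMAC round-trip itself is plain `node:crypto`, so it can be sanity-checked
+outside Effect. This standalone sketch exercises the Shopify-style scheme above
+(base64 HMAC-SHA256 over the raw body, constant-time compare); the secret and
+body values are invented for illustration:

```ts
import { createHmac, timingSafeEqual } from "node:crypto";

// Hypothetical secret and body, for illustration only.
const secret = "test_webhook_secret";
const rawBody = Buffer.from(JSON.stringify({ id: 42, title: "demo" }));

// What the platform computes and sends in X-Shopify-Hmac-Sha256.
const sent = createHmac("sha256", secret).update(rawBody).digest("base64");

// What the connector recomputes from the raw (unparsed!) request body.
const verify = (signature: string): boolean => {
  const expected = createHmac("sha256", secret).update(rawBody).digest();
  const provided = Buffer.from(signature, "base64");
  return provided.length === expected.length && timingSafeEqual(expected, provided);
};

// Flipping a single character must fail verification.
const tampered = (sent[0] === "A" ? "B" : "A") + sent.slice(1);

console.log(verify(sent)); // true
console.log(verify(tampered)); // false
```

The same recompute-and-compare shape carries over to the GitHub and Slack
variants; only the canonical string and the signature encoding change.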
+
+## GitHub
+
+Header: `X-Hub-Signature-256`
+Format: `sha256=<hex digest>`
+
+```ts
+const verifyGithubSignature = (options: {
+  readonly rawBody: Uint8Array;
+  readonly headers: Headers.Headers;
+  readonly secret: string;
+}): Effect.Effect<void, ConnectorError> =>
+  Effect.try({
+    try: () => {
+      const signature = Headers.get(options.headers, "x-hub-signature-256");
+      if (!signature?.startsWith("sha256=")) {
+        throw new Error("Missing or malformed GitHub signature");
+      }
+      const hmac = createHmac("sha256", options.secret)
+        .update(Buffer.from(options.rawBody))
+        .digest();
+      const provided = Buffer.from(signature.slice("sha256=".length), "hex");
+      if (provided.length !== hmac.length || !timingSafeEqual(hmac, provided)) {
+        throw new Error("Invalid signature");
+      }
+    },
+    catch: (cause) => new ConnectorError({ message: "GitHub webhook verification failed", cause }),
+  });
+```
+
+Env: `GITHUB_WEBHOOK_SECRET`.
+
+## Polar (reference)
+
+Polar ships an official verifier. Prefer it over rolling your own.
+
+```ts
+import { validateEvent, WebhookVerificationError } from "@polar-sh/sdk/webhooks";
+
+const verifyPolarSignature = (options: {
+  readonly rawBody: Uint8Array;
+  readonly headers: Headers.Headers;
+  readonly secret: string;
+}): Effect.Effect<void, ConnectorError> =>
+  Effect.try({
+    try: () => {
+      validateEvent(Buffer.from(options.rawBody), options.headers, options.secret);
+    },
+    catch: (error) =>
+      new ConnectorError({
+        message:
+          error instanceof WebhookVerificationError
+            ? "Invalid Polar webhook signature"
+            : "Failed to validate Polar webhook",
+        cause: error,
+      }),
+  });
+```
+
+See `connectors/producer-polar/src/connector.ts` for the live version.
+
+## Slack
+
+Header: `X-Slack-Signature` + `X-Slack-Request-Timestamp`
+Format: `v0=<hex digest>`; HMAC over `"v0:<timestamp>:<rawBody>"`.
+
+```ts
+const verifySlackSignature = (options: {
+  readonly rawBody: Uint8Array;
+  readonly headers: Headers.Headers;
+  readonly secret: string;
+}): Effect.Effect<void, ConnectorError> =>
+  Effect.try({
+    try: () => {
+      const signature = Headers.get(options.headers, "x-slack-signature");
+      const timestamp = Headers.get(options.headers, "x-slack-request-timestamp");
+      if (!signature || !timestamp) throw new Error("Missing Slack headers");
+      if (Math.abs(Date.now() / 1000 - Number(timestamp)) > 300) {
+        throw new Error("Slack timestamp outside tolerance");
+      }
+      const base = `v0:${timestamp}:${Buffer.from(options.rawBody).toString("utf8")}`;
+      const hmac = createHmac("sha256", options.secret).update(base).digest();
+      const provided = Buffer.from(signature.slice("v0=".length), "hex");
+      if (provided.length !== hmac.length || !timingSafeEqual(hmac, provided)) {
+        throw new Error("Invalid Slack signature");
+      }
+    },
+    catch: (cause) => new ConnectorError({ message: "Slack webhook verification failed", cause }),
+  });
+```
+
+## General HMAC template
+
+If your target isn't listed, figure out these three things from the
+docs:
+
+1. Which header carries the signature?
+2. What exactly is signed (raw body? body + timestamp? URL?)
+3. Is it hex or base64 encoded?
+
+Plug into:
+
+```ts
+const verifyGeneric = (options) =>
+  Effect.try({
+    try: () => {
+      const hmac = createHmac(
+        "sha256", // or "sha1" if the API is old; sha256 is the default
+        options.secret,
+      )
+        .update(Buffer.from(options.rawBody)) // or the documented canonical string
+        .digest();
+      const provided = Buffer.from(options.signature, "hex"); // or "base64"
+      if (provided.length !== hmac.length || !timingSafeEqual(hmac, provided)) {
+        throw new Error("Invalid signature");
+      }
+    },
+    catch: (cause) => new ConnectorError({ message: "...", cause }),
+  });
+```
+
+## Always
+
+- Use `createHmac` (not `crypto.createHash`) for keyed HMACs.
+- Use `timingSafeEqual` — **never** `===` or `Buffer.compare` for
+  signature comparison.
+- Verify **before** decoding the payload. +- Treat the signing secret as `Redacted.make(secret)` anywhere it + touches logs. +- Map every failure to `ConnectorError` so the handler's error channel + stays narrow. +- Gate on the raw body being present; if the transport lost it, fail + loudly. + +## Timestamp tolerance + +Most schemes include a timestamp to prevent replay attacks. Use the +platform's documented tolerance (usually 5 minutes). A sample clock-skew +check: + +```ts +if (Math.abs(Date.now() / 1000 - timestamp) > tolerance) { + throw new Error("Signature timestamp outside tolerance"); +} +``` + +Document the tolerance choice in the connector README if it's +non-default. + +## What to test + +For each connector, ship two webhook tests: + +1. **Valid signature** — assert 200 and a published batch. +2. **Invalid signature** — assert 500 (or the chosen rejection status) + and **no** published batch. + +Both tests drive `NodeHttpServer.layerTest` + `Effect.forkScoped(runConnector)` +per `webhooks.md`. diff --git a/.agents/skills/airfoil-kit/references/patterns.md b/.agents/skills/airfoil-kit/references/patterns.md new file mode 100644 index 0000000..04cb848 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/patterns.md @@ -0,0 +1,279 @@ +# patterns + +Patterns shared by `templates/producer-template/` and +`connectors/producer-polar/`. For each pattern this file explains: what it +is, when to deviate, and where to look in the existing code. + +--- + +## 1. Config struct vs individual fields + +**Pattern:** a single `Config.all({...})` that produces a flat struct. Pass +the decoded struct into downstream factories (`makeXApiClient(config)`), +never reach into `ConfigProvider` from deep inside the connector. + +**Deviate when:** none. Even for large configs, keep one struct. + +**See:** `PolarConfigConfig`, `TemplateConfigConfig`. + +## 2. Service tag per logical component + +Three service tags per connector: + +- `XApiClient` — HTTP-level operations. 
+- `XConnector` — the `{ connector, routes }` pair.
+- (Optional) `XOAuthTokens` — refreshing tokens, if applicable.
+
+Each tag lives in the file that owns the logic, with a string tag of the
+form `@useairfoil/producer-<service>/<Component>`.
+
+**Deviate when:** never. Do not merge unrelated responsibilities into one tag.
+
+## 3. Layer factories return `Layer.effect(Tag)(factory)`
+
+```ts
+export const XConnectorConfig = (): Layer.Layer<
+  XConnector,
+  ConnectorError,
+  HttpClient.HttpClient
+> =>
+  Layer.effect(XConnector)(
+    Effect.gen(function* () {
+      const config = yield* XConfigConfig;
+      return yield* makeXConnector(config).pipe(Effect.provide(XApiClientConfig(config)));
+    }),
+  );
+```
+
+- The layer **requires** whatever its factories need (`HttpClient` here).
+- It reads config itself, so callers only supply the `ConfigProvider`.
+- Error channel is narrowed to `ConnectorError` via `Effect.mapError`.
+
+## 4. API client with `fetchJson` + `fetchList`
+
+```ts
+type XApiClientService = {
+  readonly fetchJson: (schema, path, params?) => Effect.Effect<A, ConnectorError, R>;
+  readonly fetchList: (
+    schema,
+    path,
+    options,
+  ) => Effect.Effect<ListResult<A>, ConnectorError, R>;
+};
+```
+
+- `fetchJson` for detail fetches and non-list endpoints.
+- `fetchList` encapsulates the pagination convention. Return
+  `{ items, hasMore, ...maybeCursor }` — whatever your API communicates.
+- Derive pagination semantics from official platform docs and validate against
+  recorded traffic. Do not assume cursor or continuation behavior from another
+  connector.
+
+**Deviate when:** your API is GraphQL (replace GET with POST + query),
+bulk-export based (replace `fetchList` with a job runner), or returns
+protocol buffers (add a `fetchBytes` helper that decodes).
+
+## REST mode summary (default)
+
+For REST APIs, treat this file + `example-auth.md` +
+`example-pagination.md` as the mode contract.
+
+- Keep list/detail access in `fetchJson` and `fetchList` helpers.
+- Keep auth middleware in one client construction pipeline.
+- Keep pagination mapping deterministic and isolated in `fetchList`.
+- Decode response bodies at the API boundary using `Schema`.
+- Map all transport/decode failures to `ConnectorError`.
+
+If your API is not REST, switch to mode-specific docs:
+
+- GraphQL: `api-mode-graphql.md`
+- gRPC: `api-mode-grpc.md`
+
+## 5. Entity stream trio: `{ live, cutoff, backfill }`
+
+Always wire every entity with `makeEntityStreams({ api, schema, path, cursorField })`.
+The returned trio has exactly the shape the engine expects:
+
+- `live`: `WebhookStream` — pushed to by the webhook handler.
+- `cutoff`: `Deferred<Cursor>` — resolved by the first live event
+  (or by initialCutoff for polling-only connectors).
+- `backfill`: `Stream<Batch<A>, ConnectorError>` — waits on cutoff, then pages.
+
+**Deviate when:**
+
+- Pure polling — skip `WebhookStream`, use `makePullStream` as `live` and
+  point `initialCutoff` at the desired history window.
+- Webhook-only — return an empty backfill stream.
+
+## 6. First-webhook-sets-cutoff
+
+The first live event dispatched to an entity resolves its `Deferred`.
+Backfill waits on that deferred, so it can only run historical data that
+happened **before** the first live event. This guarantees there is no gap
+between backfill and live coverage.
+
+```ts
+export const dispatchEntityWebhook = (options) =>
+  Effect.gen(function* () {
+    yield* setCutoff(options.cutoff, options.cursor); // idempotent
+    yield* Queue.offer(options.queue.queue, {
+      cursor: options.cursor,
+      rows: [options.row],
+    }).pipe(Effect.asVoid);
+  });
+```
+
+**Deviate when:** your connector is polling-only (no live events);
+`initialCutoff` passed to `runConnector` becomes the canonical cutoff.
+
+## 7. Seen-set for upsert de-dupe
+
+The engine tracks a `Set` of primary keys that have already been
+published (live or backfill). Backfill filters its rows through that set
+before emitting, so overlapping windows don't re-publish the same row.
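+
+In plain TypeScript, the filtering step amounts to something like this sketch
+(the row shape and helper names are invented for illustration):

```ts
type Row = { id: string; created_at: string };

// Mirrors the engine's de-dupe: one shared seen-set across live + backfill.
const publishOnce = (seen: Set<string>, rows: ReadonlyArray<Row>): Row[] =>
  rows.filter((row) => {
    if (seen.has(row.id)) return false; // already published, drop it
    seen.add(row.id);
    return true;
  });

const seen = new Set<string>();

// A live webhook event lands first…
publishOnce(seen, [{ id: "cus_1", created_at: "2024-01-02T00:00:00Z" }]);

// …then backfill re-reads an overlapping window; only the unseen row survives.
const backfilled = publishOnce(seen, [
  { id: "cus_1", created_at: "2024-01-02T00:00:00Z" },
  { id: "cus_0", created_at: "2024-01-01T00:00:00Z" },
]);
// backfilled contains only cus_0
```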
+
+This is implemented inside `runEntity` in
+`packages/connector-kit/src/ingestion/engine.ts`. You don't need to do
+anything in connector code.
+
+## 8. Events run backfill then live (order matters)
+
+For `defineEvent` streams, the engine drains the entire backfill before
+starting live. Events are append-only logs; ordering must be preserved.
+
+**Deviate when:** you want overlap (which would violate ordering) — in
+that case, use `defineEntity` instead.
+
+## 9. Webhook handler pattern
+
+```ts
+const webhookRoute: WebhookRoute = {
+  path: "/webhooks/<service>",
+  schema: WebhookPayloadSchema,
+  handle: (payload, request, rawBody) =>
+    Effect.gen(function* () {
+      if (Option.isSome(config.webhookSecret)) {
+        if (!rawBody) {
+          return yield* Effect.fail(
+            new ConnectorError({
+              message: "Webhook raw body is required for signature verification",
+            }),
+          );
+        }
+        yield* verifyWebhookSignature({
+          rawBody,
+          headers: request.headers,
+          secret: config.webhookSecret.value,
+        });
+      }
+      yield* resolveWebhookDispatch({ payload /* ...streams */ });
+    }),
+};
+```
+
+Key points:
+
+- `Schema.Union([...])` validates the payload structure against known types.
+- Raw body is used for signature verification.
+- Verification is fail-closed when enabled: missing verification inputs are
+  explicit typed failures.
+- Dispatch logic is extracted into a pure function for testability.
+
+## 10. Explicit enumeration of ignored events
+
+`producer-polar` lists every ignored event type in a dedicated
+`Schema.Literals([...])` union. Unknown types fall through to a
+`logWarning` default. This is deliberate: silent schema failures are
+a nightmare to debug.
+
+```ts
+switch (payload.type) {
+  case "order.created":
+    return handleOrder(...);
+  case "organization.updated": // ignored on purpose
+    return Effect.void;
+  default:
+    return Effect.logWarning("Ignoring unknown webhook type").pipe(...);
+}
+```
+
+**Deviate when:** the service has hundreds of event types — then group
+into a dispatch table `const handlers: Record<string, Handler>`.
+
+## 11. Sandbox runner layer composition
+
+Always the same shape:
+
+```ts
+const RuntimeLayer = Layer.mergeAll(
+  StateStoreInMemory,
+  ConsolePublisherLayer,
+  ConnectorLayer,
+  Logger.layer([Logger.consolePretty()]),
+  TelemetryLayer,
+  EnvLayer, // FetchHttpClient.layer + ConfigProvider.fromEnv()
+);
+```
+
+Callers toggle telemetry via `ACK_TELEMETRY_ENABLED` and choose the
+publisher via which layer they merge in (console vs Wings).
+
+## 12. Test publisher
+
+Always `makeTestPublisher(expected)` that captures into a `Ref` and
+resolves a `Deferred` after `expected` batches land. Never count on
+timeouts to decide "the connector is idle now".
+
+## 13. Error mapping
+
+Wrap every non-`ConnectorError` failure:
+
+```ts
+Effect.mapError((error) =>
+  error instanceof ConnectorError
+    ? error
+    : new ConnectorError({
+        message: "<message>",
+        cause: error,
+      }),
+);
+```
+
+Without this, `Layer.effect` will complain that the error channel isn't
+narrowed, and `runConnector`'s contract (`E = ConnectorError`) won't hold.
+
+## 14. Connector config ↔ test config
+
+In sandbox/prod, `EnvLayer` provides `ConfigProvider.fromEnv()`.
+
+In tests, use either:
+
+- `ConfigProvider.fromUnknown({ ... })` for hermetic deterministic tests, or
+- `ConfigProvider.fromEnv()` for integration-style tests that intentionally use
+  environment-backed settings.
+
+Pick one deliberately and keep `test` and `test:ci` behavior equivalent.
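+
+The hermetic-vs-env distinction boils down to which lookup backs the same
+decode path. A stripped-down sketch of the idea in plain TypeScript (this is
+not the Effect `ConfigProvider` API; all names and the env-var key are
+invented):

```ts
type Lookup = (key: string) => string | undefined;

// Hermetic source: values are fixed inside the test file.
const fromRecord = (values: Record<string, string>): Lookup => (key) => values[key];

// Env-backed source: values come from the host environment.
const fromEnv = (): Lookup => (key) => process.env[key];

// The decode step only sees a Lookup, so swapping sources cannot change
// behavior beyond the values themselves.
const decodeConfig = (lookup: Lookup) => {
  const accessToken = lookup("ACCESS_TOKEN");
  if (!accessToken) throw new Error("Missing ACCESS_TOKEN");
  return { accessToken };
};

// Deterministic regardless of the host environment:
const testConfig = decodeConfig(fromRecord({ ACCESS_TOKEN: "test-token" }));
```

Under this framing, keeping `test` and `test:ci` equivalent just means both
scripts hand the same kind of lookup to the same decode path.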
+ +--- + +## Shape of a connector-kit test + +``` +┌───────────────┐ +│ Test body │ runs the Effect program +│ (Effect.gen) │ +└───────┬───────┘ + │ requires +┌───────▼───────────────────────────────────────────────┐ +│ connectorLayer = XConnectorConfig().pipe( │ +│ Layer.provide(apiLayer OR vcrLayer) │ +│ ) │ +└───────┬───────────────────────────────────────────────┘ + │ requires +┌───────▼───────────────────┐ ┌────────────────────┐ +│ apiLayer: Layer │ │ + real HttpClient │ +└───────────────────────────┘ └────────────────────┘ +``` + +Plus `ConfigProvider` and `StateStoreInMemory` / `test publisher` as +needed. Polar has working examples for both shapes. diff --git a/.agents/skills/airfoil-kit/references/playbook.md b/.agents/skills/airfoil-kit/references/playbook.md new file mode 100644 index 0000000..a2dd8b9 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/playbook.md @@ -0,0 +1,230 @@ +# Playbook: building a new producer connector + +End-to-end flow, expanded from `SKILL.md`. Follow in order; do not skip steps. + +--- + +## 0. Confirm intent + +Before touching the repo, restate what you will build: + +- Target service (e.g. Stripe). +- Entities to ingest (e.g. `customers`, `charges`, `subscriptions`). +- Event types to process (if any). +- Does the service sign webhooks? What header? +- Does the service have a sandbox/test-mode? What base URL? What credentials? + +If anything is unclear, **ask the user**. Do not guess. + +## 1. Anti-cheat pre-flight + +Run the checks in [`anti-cheat.md`](./anti-cheat.md). Abort and report if any +existing connector code for this service is found. + +## 2. Classify the API archetype + +Identify which of the archetypes in [`connector-archetypes.md`](./connector-archetypes.md) +matches. This tells you what config knobs to expose and what stream shape to +use (webhook-first vs polling-only, single-tenant vs multi-tenant). 
+
+Also choose one implementation mode for this connector:
+
+- `rest` (default for JSON HTTP APIs)
+- `graphql`
+- `grpc`
+
+Do not start implementation until this mode is explicit in `api-facts.md`.
+
+## 3. API research
+
+Follow [`api-research.md`](./api-research.md) to collect:
+
+- Base URL (sandbox + prod).
+- Auth scheme (Bearer, API key header, Basic, OAuth2).
+- Required scopes / API version headers.
+- Pagination style (see [`example-pagination.md`](./example-pagination.md)).
+- List + detail endpoint shapes for each entity.
+- Webhook event catalog and signature algorithm.
+
+Write an API-facts artifact during this step (required before coding),
+including:
+
+- selected mode (`rest`/`graphql`/`grpc`),
+- source evidence URLs + retrieval date,
+- chosen API version and rationale,
+- auth, pagination, and webhook contracts.
+
+Default path is `connectors/producer-<service>/api-facts.md`. If the user asks
+not to persist this file, keep equivalent facts in notes and include them in
+the final report.
+
+Use Context7 for Effect-specific v4 docs (`effect-ts/effect-smol`) and
+service SDK docs, and `WebFetch` for public API reference pages. DeepWiki
+is optional fallback. Capture everything you learn in a short notes file
+or in the PR description so nothing is lost.
+
+## 4. Read mode-specific contract
+
+- REST: [`patterns.md`](./patterns.md), [`example-auth.md`](./example-auth.md),
+  [`example-pagination.md`](./example-pagination.md)
+- GraphQL: [`api-mode-graphql.md`](./api-mode-graphql.md)
+- gRPC: [`api-mode-grpc.md`](./api-mode-grpc.md)
+
+Treat your selected mode doc as the implementation contract.
+
+## 5. Credentials + test data
+
+See [`test-data.md`](./test-data.md).
+
+- Ask the user for a sandbox API key (and webhook secret if webhooks are
+  signed).
+- Seed the sandbox with representative data (`mcp`, UI, or curl scripts).
+- Write `.env.example` listing every env var the connector reads.
+- Copy `.env.example` to `.env` locally (the user should fill in real values).
+
+## 6. Scaffold from the template
+
+```bash
+cp -R templates/producer-template connectors/producer-<service>
+cd connectors/producer-<service>
+```
+
+Run the search-and-replace pass from [`assets/rename-checklist.md`](../assets/rename-checklist.md).
+Verify the new package installs:
+
+```bash
+cd ../.. # back to repo root
+pnpm install
+pnpm --filter @useairfoil/producer-<service> run typecheck
+```
+
+## 7. Implement the API client (`src/api.ts`)
+
+Use your selected mode contract:
+
+- **REST:** implement auth, endpoint paths, and pagination in `fetchList`
+  based on researched docs and recorded traffic (see
+  [`example-auth.md`](./example-auth.md) and
+  [`example-pagination.md`](./example-pagination.md)).
+- **GraphQL:** implement the request helper, envelope decode, `errors` branch,
+  and pagination mapping per [`api-mode-graphql.md`](./api-mode-graphql.md).
+  Create `src/graphql/operations.ts` for query constants.
+- **gRPC:** generate client stubs via `buf generate`, build the `XGrpcApiClient`
+  service layer, and centralize deadlines/auth/retry per
+  [`api-mode-grpc.md`](./api-mode-grpc.md).
+
+For all modes: keep recoverable runtime failures in typed error channels.
+Map transport/decode/contract failures to `ConnectorError` (or a
+connector-specific tagged error mapped to it).
+
+## 8. Define schemas from recorded traffic (`src/schemas.ts`)
+
+For REST/GraphQL mode:
+
+1. Set one test's VCR `mode: "record"`, drop the real sandbox token into
+   `.env`, and run `pnpm run test`. This records the cassette against the
+   real API.
+2. Open the cassette (`test/__cassettes__/<name>.cassette`) and read the
+   actual response body.
+3. Translate the observed JSON to `Schema.Struct({...})`. Use `Schema.NullOr`
+   for nullable fields, `Schema.Array` for arrays, `Schema.Literals([...])`
+   for enums, `Schema.Record(Schema.String, Schema.Any)` for free-form maps.
+4. Flip `mode: "replay"` before committing. 
See [`vcr-workflow.md`](./vcr-workflow.md).
+
+For gRPC mode:
+
+1. VCR HTTP cassettes do not apply.
+2. Use deterministic proto fixtures and/or mock gRPC servers.
+3. Derive schemas/contracts from recorded fixture payloads and generated types.
+
+Repeat per entity + per webhook event type. Union webhook payload variants
+the same way `producer-polar` does — see
+[`example-producer-polar.md`](./example-producer-polar.md).
+
+## 9. Wire entities and streams (`src/streams.ts`, `src/connector.ts`)
+
+- For each entity, call `makeEntityStreams({ api, schema, path, cursorField, limit })`.
+- `cursorField` should be a monotonically increasing server-emitted timestamp
+  (or numeric id) that appears on every row.
+- Register each entity with `defineEntity({ name, schema, primaryKey, live, backfill })`.
+- If the service also emits append-only events (e.g. audit logs), use
+  `defineEvent` instead. Events backfill first, then go live.
+- Compose into `defineConnector({ name, entities, events })`. See
+  [`connector-kit-api.md`](./connector-kit-api.md) and [`patterns.md`](./patterns.md).
+
+## 10. Webhook route (`src/connector.ts`)
+
+- Define one `WebhookRoute` per inbound path the service
+  uses (often just one).
+- Verify signatures against `rawBody` using the documented HMAC or library
+  helper. See [`webhooks.md`](./webhooks.md) and
+  [`example-webhook-verification.md`](./example-webhook-verification.md).
+- If signature verification is enabled, treat missing verification inputs
+  (for example `rawBody` or signature header) as explicit failures. Do not
+  silently bypass verification in this state.
+- In the `handle` function, switch on `payload.type` and dispatch to the
+  right entity queue via `dispatchEntityWebhook`.
+
+## 11. Sandbox runner (`src/sandbox.ts`)
+
+- Rename service identifiers in logs and telemetry.
+- Rename env vars (`TEMPLATE_WEBHOOK_PORT` → `<SERVICE>_WEBHOOK_PORT`).
+- Keep the telemetry layer as-is; callers can enable it via `ACK_TELEMETRY_ENABLED`. 
+- Required layer checklist: `HttpClient`, `ConfigProvider`, `StateStore`, + `Publisher`, and server layer. +- Run once and confirm startup reaches webhook server ready/health output. + +## 12. Tests (`test/*`) + +- REST/GraphQL: `api.vcr.test.ts` record once, commit the cassette, then + replay. Cover at least one list endpoint + a documented pagination + transition for the target platform. +- gRPC: use deterministic proto fixtures and/or mock gRPC servers; do not rely + on HTTP VCR cassettes for gRPC traffic. +- `webhook.test.ts`: use `NodeHttpServer.layerTest` (or Bun equivalent test layer) + to POST a sample payload + and assert the publisher received one batch with the expected entity name. +- If signed webhooks are used, include both valid-signature and + invalid-signature test paths. +- Include one no-op/ignored webhook event path to confirm unknown or + unsupported events do not publish side effects. +- Optional: a second replay/fixture test for a second entity or cutoff + boundary. +- Ensure `test` and `test:ci` load configuration equivalently. + +## 13. README + +- Document install, required env, sandbox command, recording/replay flow (or + fixture flow for gRPC), and test commands. +- Satisfy the full README checklist in [`definition-of-done.md`](./definition-of-done.md) + instead of mirroring any single connector README. + +## 14. Local CI gate + +Run each of these from the repo root. Every one must pass: + +```bash +pnpm run lint +pnpm run typecheck +pnpm run build +pnpm run test:ci +``` + +If any fail, fix before proceeding. See [`definition-of-done.md`](./definition-of-done.md). + +## 15. Report back + +Final message should list: + +- Entities + events delivered. +- Endpoints/operations exercised under deterministic replay (VCR or fixtures). +- Commands you ran and their outcomes. +- Completion state (`Code Complete`, `Verified with Real Cassettes`, or + `CI Complete`). +- Known follow-ups (e.g. pagination patterns you could not record yet). 
+- Environment setup guide with, for each env var:
+  1. where to obtain it,
+  2. required scopes/permissions,
+  3. exact setup steps (`cp .env.example .env`, fill values),
+  4. verification command + expected signal.
+- Any questions for the user.
diff --git a/.agents/skills/airfoil-kit/references/template-walkthrough.md b/.agents/skills/airfoil-kit/references/template-walkthrough.md
new file mode 100644
index 0000000..0bb4c11
--- /dev/null
+++ b/.agents/skills/airfoil-kit/references/template-walkthrough.md
@@ -0,0 +1,227 @@
+# template-walkthrough
+
+File-by-file tour of `templates/producer-template/`. The template targets
+[JSONPlaceholder](https://jsonplaceholder.typicode.com) so the code runs and
+tests pass with zero credentials. Every file below has a "what to change"
+section for when you port it to a real API.
+
+---
+
+## `package.json`
+
+Minimal workspace package. Key points:
+
+- `"name": "@useairfoil/producer-template"` — rename to
+  `@useairfoil/producer-<service>`.
+- `"private": true` — keep private unless explicitly publishing.
+- `"type": "module"` — all packages in this repo are ESM.
+- `"exports"` — points `.` to `dist/index.js` and `dist/index.d.ts`.
+- `dependencies.effect: "catalog:"` — Effect and `@effect/*` versions are
+  managed at the monorepo catalog level.
+- `devDependencies['@useairfoil/effect-vcr']` — VCR lives in a devDep
+  so it does not leak into the published bundle.
+
+**What to change:** `name`, `version`, and any service-specific dependencies
+(e.g. `stripe`, `@octokit/rest`, `shopify-api-node`). Do **not** change the
+Effect, `@effect/platform-*`, or `@effect/vitest` versions — they are pinned
+at the monorepo level.
+
+## `tsconfig.json`
+
+Extends the repo root tsconfig. `strict: true`, `verbatimModuleSyntax: true`,
+`noEmit: true` (the build runs through tsdown).
+
+**What to change:** nothing.
+
+## `tsdown.config.ts`
+
+Single entry (`src/index.ts`) bundled as ESM with `.d.ts` output. Same as
+every other package in the repo. 
+
+**What to change:** nothing.
+
+## `vitest.config.ts`
+
+`fileParallelism: false` (VCR tests share cassette files, don't race them),
+60s timeout for network recording.
+
+**What to change:** nothing.
+
+## `.env.example`
+
+Template env surface. Every variable the connector reads from `Config` should
+appear here with a stubbed value.
+
+**What to change:** replace `TEMPLATE_*` with `<SERVICE>_*` and add real
+variables — API key, webhook secret, tenant id, etc.
+
+## `src/schemas.ts`
+
+Effect `Schema.Struct` for the `Post` entity + a `WebhookPayloadSchema` union
+of two event shapes (`post.created|post.updated` and an ignored `post.deleted`).
+
+**What to change:** replace `PostSchema` with the real entity schemas and
+`WebhookPayloadSchema` with the real event union. Always derive fields from a
+recorded cassette — see [`vcr-workflow.md`](./vcr-workflow.md).
+
+## `src/api.ts`
+
+Defines:
+
+- `TemplateApiClientService` — the typed service surface (`fetchJson`,
+  `fetchList`).
+- `TemplateApiClient` — `Context.Service` tag.
+- `makeTemplateApiClient` — Effect factory that obtains an `HttpClient`,
+  prepends the base URL, attaches `bearerToken`, and returns typed helpers.
+- `TemplateApiClientConfig` — `Layer.effect(...)` wrapper for composition.
+
+**What to change:**
+
+- Auth middleware on `HttpClient.mapRequest(...)` — Bearer by default, swap
+  to `setHeader`, `basicAuth`, OAuth2 refresh layer as needed. See
+  [`example-auth.md`](./example-auth.md).
+- Pagination style in `fetchList`. JSONPlaceholder uses `_page`/`_limit`;
+  your API may use `cursor`, `page_size`, `starting_after`, link headers, etc.
+  See [`example-pagination.md`](./example-pagination.md).
+- Endpoint paths — `/posts` → your real list endpoints.
+- Error mapping — keep mapping into `ConnectorError`, but add service-specific
+  error enrichment where useful.
+
+## `src/streams.ts`
+
+Entity-stream factory:
+
+- `resolveCursor(row, field)` turns a row's cursor field into a `Cursor`. 
+- `dispatchEntityWebhook({ queue, cutoff, row, cursor })` — enqueue + set + cutoff in one go. +- `makeBackfillStream(...)` — waits on the cutoff deferred, then uses + `makePullStream` to page until `hasMore` is false. Filters to + `row[cursorField] <= cutoff`. +- `makeEntityStreams(...)` — one-shot factory returning `{ live, cutoff, backfill }`. + +**What to change:** + +- `isOnOrBeforeCutoff` — tweak the cutoff comparison if your cursor is a + timestamp (`new Date(...)`) vs a numeric id. For timestamps, prefer the + Polar connector's string-compare (`new Date(value).getTime()`). +- Pagination hand-off. The JSONPlaceholder example paginates by incrementing + `_page`. For cursor-based APIs, return `cursor: next_token` and rely on the + API's own `hasMore`/`has_more` flag. +- `limit` default (10 for JSONPlaceholder; 100 is a good default for real APIs). + +## `src/connector.ts` + +The main wire-up file: + +- `TemplateConfig` — plain type describing the decoded config struct. +- `TemplateConfigConfig` — `Config.all({...})` that decodes env vars. +- `TemplateConnector` — `Context.Service` exposing + `{ connector, routes }` to callers. +- `verifyWebhookSignature` — **stub**. Replace with real HMAC verification. +- `resolveWebhookDispatch` — switch on `payload.type`, dispatch to the right + entity queue. +- `makeTemplateConnector` — builds `EntityStreams` and composes everything. +- `TemplateConnectorConfig` — `Layer.effect(TemplateConnector)(...)` for + runtime composition. + +**What to change:** + +- Rename every `Template` / `TEMPLATE_` identifier. See + [`assets/rename-checklist.md`](../assets/rename-checklist.md). +- Implement real webhook signature verification. Use the service's SDK + helper where available (e.g. `stripe.webhooks.constructEvent`, + `@polar-sh/sdk/webhooks.validateEvent`). See [`webhooks.md`](./webhooks.md). +- Add one `makeEntityStreams` call per entity. +- Add one `WebhookRoute` per inbound path. 
+- Extend `resolveWebhookDispatch` with cases for every event type you care + about. Ignored events should fall into a `.void`/`.asVoid` case to keep + them explicit. + +## `src/sandbox.ts` + +End-to-end runner for local development: + +- `SandboxConfig`, `TelemetryConfig` — Effect `Config.all({...})` for runtime + knobs. +- `ConsolePublisherLayer` — a `Publisher` that logs batches instead of + pushing them to Wings. +- `program` — obtains the connector + routes, starts a `NodeHttpServer` (or + Bun equivalent), and + calls `runConnector(connector, { initialCutoff, webhook: { routes } })`. +- `EnvLayer` — merges `FetchHttpClient.layer` and + `ConfigProvider.fromEnv()`. +- `TelemetryLayer` — opt-in OTLP export + runtime metrics. +- `RuntimeLayer` — composes every layer the program needs. +- Final `Effect.runPromise(...)` with a fatal error logger. + +**What to change:** only identifiers (`TEMPLATE_*` → `_*`, +`producer-template` → `producer-`), never the layer structure. + +## `src/index.ts` + +Re-exports the public API. Keep the shape small: service tag, config +factory, config struct type, runtime type, and schemas you want consumers +to pattern-match against. + +## `test/helpers.ts` + +Test-only `Publisher` that captures every published batch into a `Ref` and +resolves a `Deferred` after N batches land. Used by every webhook test. + +**What to change:** nothing. + +## `test/api.vcr.test.ts` + +VCR replay test. Construction order: + +1. Build `program` that uses `TemplateApiClient` directly. +2. Build an `apiLayer` that supplies `TemplateApiClient` from + `makeTemplateApiClient`. +3. Build a `cassetteLayer` from `FileSystemCassetteStore.layer()`. +4. Build a `vcrLayer` from `VcrHttpClient.layer({ vcrName, mode })`. +5. Provide everything + a `ConfigProvider.fromUnknown({ ... })` with the + minimum env needed for `TemplateConfigConfig` to decode. + +The first time you run this against a real API, set `mode: "record"`. 
After +the cassette is written, switch to `"replay"` and commit. + +## `test/webhook.test.ts` + +In-memory webhook test using `NodeHttpServer.layerTest` (or Bun equivalent): + +1. Build a test publisher via `makeTestPublisher(1)`. +2. Fork `runConnector(connector, { webhook: { routes } })`. +3. POST a synthetic payload to the webhook path via `HttpClient.execute`. +4. Wait on `Deferred.await(done)`; assert one batch was published to the + right entity name. + +**What to change:** the fixture payload object, the webhook path, and the +expected entity name. + +## `test/__cassettes__/` + +JSON cassette files, committed. One per `*.vcr.test.ts`, keyed by the Vitest +test name. See [`vcr-workflow.md`](./vcr-workflow.md) for the file format. + +## `README.md` + +Document the connector. Mirror the structure of +`connectors/producer-polar/README.md`: Install → Env → Minimal wiring → +Architecture → Testing with VCR. + +--- + +## Where the template intentionally differs from Polar + +- Only one entity (`posts`), no `events`. +- Numeric cursor (`id`) instead of a timestamp, because JSONPlaceholder does + not emit timestamps. Real connectors should prefer timestamps. +- No real webhook signing — the stub accepts everything. Polar delegates to + `@polar-sh/sdk/webhooks.validateEvent`. +- No service SDK dependency. Real connectors usually add one. +- A simpler `TemplateListPage` with `{ items, hasMore }` instead of + `{ items, pagination: { total_count, max_page } }` since JSONPlaceholder + has no totals. + +When in doubt, compare the new connector against Polar: +[`example-producer-polar.md`](./example-producer-polar.md). 
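
The numeric-versus-timestamp cursor difference above comes down to the cutoff comparison. A minimal sketch of both variants (function names are illustrative, not template exports):

```ts
// Numeric-id cursors (the JSONPlaceholder template): plain comparison.
const isOnOrBeforeCutoffNumeric = (value: number, cutoff: number): boolean =>
  value <= cutoff;

// Timestamp cursors (what real connectors should prefer): compare epoch
// millis so equivalent ISO-8601 spellings agree.
const isOnOrBeforeCutoffTimestamp = (value: string, cutoff: string): boolean =>
  new Date(value).getTime() <= new Date(cutoff).getTime();
```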
diff --git a/.agents/skills/airfoil-kit/references/test-data.md b/.agents/skills/airfoil-kit/references/test-data.md new file mode 100644 index 0000000..683984e --- /dev/null +++ b/.agents/skills/airfoil-kit/references/test-data.md @@ -0,0 +1,193 @@ +# test-data + +How to source realistic test data for a new connector, what to commit, +and what coverage is expected before declaring done. + +## Reporting state when credentials are missing + +When credentials or platform access are unavailable, report status using +`definition-of-done.md` completion states: + +- `Code Complete`: allowed while waiting on credentials. +- `Verified with Real Cassettes`: not allowed until recordings exist. +- `CI Complete`: only after root CI baseline passes. + +Do not collapse these into a binary done/not-done label. + +## Credentials: asking and handling + +1. **Ask the user explicitly.** Before recording any cassette, say: + + > I need sandbox (or test-mode) credentials to record VCR + > cassettes. Do you have: + > + > - A test/sandbox API key? (preferred) + > - A production API key with read-only scope? (acceptable) + > - Access to a seeding MCP (e.g. Stripe MCP)? (ideal) + +2. **Store creds in `.env`** at the connector package root. Example: + + ``` + STRIPE_API_KEY=sk_test_xxxxxxxxxxxx + STRIPE_WEBHOOK_SECRET=whsec_xxxxxxxx + ``` + +3. **Never commit `.env`.** Confirm the connector's `.gitignore` or the + repo root `.gitignore` excludes it. + +4. **Always commit `.env.example`** with placeholder values and a comment + per variable: + + ``` + # Stripe test-mode API key (starts with sk_test_) + STRIPE_API_KEY= + + # Stripe webhook signing secret from the dashboard (starts with whsec_) + STRIPE_WEBHOOK_SECRET= + ``` + +5. **Redact credentials in cassettes.** Authorization header is redacted + by default; extend `redact.requestHeaders` for any other auth header + the service uses. See `vcr-workflow.md`. + +6. 
**Never edit cassette JSON by hand.** If sensitive fields leak or replay + mismatches occur, update redaction/matching and re-record. + +## Seeding test data + +Order of preference: + +1. **Target-service MCP** (e.g., Stripe MCP). Ask the user if one is + available and configured. MCPs can deterministically seed a sandbox + with specific fixtures, making recordings reproducible. + +2. **Sandbox seed scripts** that the platform provides. Many SaaS + offerings have "populate my test account" helpers; document the exact + steps in the connector README. + +3. **Manual dashboard seeding** — create a handful of records by hand + through the service's UI. Document the minimum set. + +4. **Live read-only fetch**. If the user has a real account with data, + record against it (with redaction). Take special care: + - Redact emails, PII, tenant identifiers. + - Review the cassette diff before committing. + +## Required coverage per entity + +For gRPC mode, replace `test/api.vcr.test.ts` with deterministic fixture or +mock-server tests. The coverage categories below still apply. + +For each entity the connector ships: + +| Coverage | Required? 
| Test file | +| ---------------------------------------------- | ----------------------- | ----------------------- | +| Backfill: list page 1 | yes | `test/api.vcr.test.ts` | +| Backfill: list with pagination (page 2+) | yes if API paginates | `test/api.vcr.test.ts` | +| Empty page / end-of-data | recommended | `test/api.vcr.test.ts` | +| Detail fetch (one `GET /thing/:id`) | yes if used by dispatch | `test/api.vcr.test.ts` | +| Webhook: one payload per dispatched event type | yes | `test/webhook.test.ts` | +| Webhook: signature verification success | yes | `test/webhook.test.ts` | +| Webhook: signature verification failure | yes | `test/webhook.test.ts` | +| Auth failure (401) | recommended | separate test or inline | + +## Webhook payloads + +Webhook tests don't capture cassettes — they drive in-process `POST`s +against `NodeHttpServer.layerTest` (or Bun equivalent). The payloads +themselves come from: + +1. **Platform dashboards** — most SaaS providers let you trigger a test + webhook from their UI and inspect the payload. +2. **Platform CLI tools** — e.g. `stripe trigger customer.created`. +3. **Official docs** — paste in the documented example payload. + +Copy the **verbatim** payload into a fixture file or inline string in +the test. Treat it as ground truth (same principle as VCR). + +## Fixture files + +If a test needs a large payload, place it at +`test/__fixtures__/.json` and import via +`fs.readFileSync` in Node or via Bun's `import ... with { type: "json" }`. +Prefer tiny inline fixtures for common cases. + +## Test data hygiene + +- **Stable IDs.** Use short, non-PII strings (e.g., `user_123`). Avoid + UUIDs from the real platform when possible — they look real and + obscure intent. +- **Deterministic timestamps.** ISO-8601 strings like + `"2024-01-01T00:00:00Z"` keep diffs clean. +- **No emails, phone numbers, or addresses.** Use + `test+@example.com`, `+15550000000`, etc. 
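
Putting the fixture and hygiene rules together, a committed `test/__fixtures__/` payload might look like this (shape and field names are illustrative, not any real provider's contract):

```json
{
  "type": "post.created",
  "data": {
    "id": "post_123",
    "title": "hello",
    "updated_at": "2024-01-01T00:00:00Z",
    "author_email": "test+owner@example.com"
  }
}
```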
+ +## Commit what, exactly + +Commit: + +- `test/__cassettes__/*.cassette` (JSON, human-diffable). +- `test/__fixtures__/**` if used. +- `.env.example`. + +Do not commit: + +- `.env`. +- `node_modules/`. +- Any file containing actual API keys, customer emails, or internal + tenant identifiers. + +## Running tests locally vs CI + +Local (record or replay): + +```bash +pnpm --filter @useairfoil/producer- run test +``` + +CI (`CI=true`): + +```bash +pnpm --filter @useairfoil/producer- run test:ci +``` + +`test:ci` sets `CI=true` implicitly (some packages set it in their +script). In VCR `auto` mode, missing cassettes fail fast in CI instead +of recording. + +`test` and `test:ci` must load config equivalently. If one loads +`.env` via script/runtime flags, the other must too (unless tests only use +`ConfigProvider.fromUnknown`). + +## When tests can't be recorded at all + +Some APIs forbid programmatic access without paid accounts. In that case: + +1. Surface to the user and request an explicit waiver. +2. Default outcome is **pause** (do not mark connector done without VCR + evidence for shipped entities). +3. If the user explicitly accepts a temporary exception, document: + - uncovered entities/endpoints, + - why recording is impossible right now, + - exact follow-up needed to record and validate. +4. Add a TODO with a tracking link in the deterministic replay test file: + - REST/GraphQL: `TODO(vcr)` in `test/api.vcr.test.ts`. + - gRPC: `TODO(fixtures)` in fixture/mock-server test file. + +Do not ship a connector with fabricated schemas silently. + +In this situation, final reporting must explicitly say: + +- current completion state (`Code Complete` only), +- what is blocked by credentials, +- exact action required to reach `Verified with Real Cassettes`. + +## Final report env setup guide (required) + +Final report must include a short setup guide the user can apply immediately: + +1. Each required env var. +2. Where to obtain it (dashboard page / API flow link). 
+3. Required scopes/permissions. +4. Exact setup steps (`cp .env.example .env`, fill values, run sandbox/test). +5. Quick verification commands (`test:ci`, `sandbox`, `/health`). diff --git a/.agents/skills/airfoil-kit/references/vcr-workflow.md b/.agents/skills/airfoil-kit/references/vcr-workflow.md new file mode 100644 index 0000000..26309d8 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/vcr-workflow.md @@ -0,0 +1,151 @@ +# vcr-workflow + +VCR captures real HTTP interactions once, then replays them in CI. + +Source of truth: + +- `packages/effect-vcr/src/types.ts` +- `packages/effect-vcr/src/vcr-http-client.ts` + +## Real-API verification loop + +Use this loop per entity endpoint you ship. + +1. Write a schema in `src/schemas.ts` from docs as a starting point. +2. Write/update `test/api.vcr.test.ts` to call the real endpoint. +3. Set VCR mode to `"record"` temporarily. +4. Run test with real credentials from `.env`. +5. Inspect cassette response body and tighten schema fields. +6. Switch VCR mode back to `"replay"`. +7. Re-run test (replay-only) and commit cassette. + +## Correct layer wiring in tests + +```ts +import { NodeServices } from "@effect/platform-node"; +import { FileSystemCassetteStore, VcrHttpClient } from "@useairfoil/effect-vcr"; +import { Layer } from "effect"; +import { FetchHttpClient } from "effect/unstable/http"; + +const vcrLayer = VcrHttpClient.layer({ + vcrName: "producer-", + mode: "replay", // switch to "record" only when recording +}).pipe( + Layer.provideMerge(FileSystemCassetteStore.layer()), + Layer.provideMerge(FetchHttpClient.layer), + Layer.provideMerge(NodeServices.layer), +); +``` + +Do not omit `FileSystemCassetteStore.layer()`; VCR needs cassette storage. If +`FetchHttpClient.layer` and `NodeServices.layer` are already provided higher in +your test runtime, keep them there and only merge the missing dependencies. 
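
The mode flips in steps 3 and 6 of the verification loop can stay out of committed code by deriving the mode from an env var. This is a local convenience pattern, not an `@useairfoil/effect-vcr` feature:

```ts
type VcrMode = "record" | "replay" | "auto";

// Hypothetical helper: default to replay so committed tests never hit the
// network; opt into recording explicitly with VCR_RECORD=1.
const resolveVcrMode = (
  env: Record<string, string | undefined> = process.env,
): VcrMode => (env["VCR_RECORD"] === "1" ? "record" : "replay");
```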
+ +## Cassette path and export key + +Default inference under Vitest: + +- test file: `test/api.vcr.test.ts` +- cassette file: `test/__cassettes__/api.vcr.test.cassette` +- export key: current Vitest test name (`describe > it`) + +If not running under Vitest state, pass `cassetteDir` + `cassetteName` explicitly. + +## Cassette file shape (current) + +```json +{ + "exports": { + "default": { + "meta": { + "createdAt": "1970-01-01T00:00:00.000Z", + "version": "1" + }, + "entries": {} + }, + "suite > test name": { + "meta": { + "createdAt": "1970-01-01T00:00:00.000Z", + "version": "1" + }, + "entries": { + "": { + "request": { "method": "GET", "url": "..." }, + "response": { "status": 200, "body": "..." } + } + } + } + } +} +``` + +`entries` is a record keyed by a stable request key, not an array. + +## Modes + +| Mode | Behavior | +| -------- | ----------------------------------------------- | +| `record` | Always call live API, then write cassette entry | +| `replay` | Never call live API, fail if entry missing | +| `auto` | Replay when cassette exists; otherwise record | + +CI behavior: + +- `CI=true` only affects `auto` mode. +- In `auto`, missing cassette fails in CI instead of recording. +- `record` still records even in CI. + +## `ACK_DISABLE_VCR` + +Per-connector bypass uses `VcrConfig.connectorName`: + +```bash +ACK_DISABLE_VCR=producer-stripe,producer-shopify pnpm run test +``` + +Behavior: + +- Match is case-insensitive after trimming. +- If matched, VCR returns the live client directly (no cassette read/write). +- Use full connector names (`producer-stripe`, not just `stripe`). + +## Redaction and matching + +Defaults: + +- `authorization` is ignored for matching by default. +- `authorization` is redacted by default on write. 
+ +Add service-specific headers/keys when needed: + +```ts +VcrHttpClient.layer({ + vcrName: "producer-", + mode: "replay", + redact: { + requestHeaders: ["authorization", "x-api-key"], + responseHeaders: ["set-cookie"], + responseBodyKeys: ["secret", "token"], + }, +}); +``` + +## Rerecording safely + +1. Delete stale cassette file (or stale export key). +2. Switch test to `mode: "record"`. +3. Run with real credentials. +4. Inspect diff for sensitive fields. +5. Switch back to `mode: "replay"`. +6. Re-run `test:ci` and commit. + +Never manually edit cassette JSON. If secrets leak, fix redaction config, +delete the cassette, and re-record. + +## Troubleshooting + +- **Missing replay entry**: request shape changed (URL/body/header key mismatch) + or cassette not recorded yet. +- **Cassette path inference failure**: not under Vitest; provide path/name. +- **Unexpected live call**: mode is `auto` and cassette/export missing. +- **Invalid cassette format**: regenerate cassette from record mode. diff --git a/.agents/skills/airfoil-kit/references/webhooks.md b/.agents/skills/airfoil-kit/references/webhooks.md new file mode 100644 index 0000000..3269062 --- /dev/null +++ b/.agents/skills/airfoil-kit/references/webhooks.md @@ -0,0 +1,231 @@ +# webhooks + +How to wire inbound webhooks, and what to do when the target platform has +no webhooks at all. + +## Anatomy of a `WebhookRoute` + +```ts +import type { WebhookRoute } from "@useairfoil/connector-kit"; +import { Effect } from "effect"; +import * as Schema from "effect/Schema"; + +const ExamplePayloadSchema = Schema.Union([ + Schema.Struct({ type: Schema.Literal("post.created"), data: PostSchema }), + Schema.Struct({ type: Schema.Literal("post.updated"), data: PostSchema }), +]); + +const route: WebhookRoute> = { + path: "/webhooks/example", + schema: ExamplePayloadSchema, + handle: (payload, request, rawBody) => + Effect.gen(function* () { + // 1. Verify signature (if applicable) + // 2. 
Dispatch by payload.type to the correct entity/event stream
+    }),
+};
+```
+
+- `path` — relative URL mounted by `runConnector`. Prepend `/webhooks/` by
+  convention to keep the tree tidy.
+- `schema` — Effect Schema used by the kit to decode **after** the route
+  body has been read. Signature verification should use the raw body.
+- `handle(payload, request, rawBody)` — your handler.
+  - `payload`: decoded value of `schema`.
+  - `request`: `HttpServerRequest.HttpServerRequest` — use for headers.
+  - `rawBody`: `Uint8Array | undefined` — only populated when the transport
+    preserves it (the kit does).
+
+The handler returns `Effect<void, ConnectorError>`. A success maps to
+200 OK; a failure maps to 500 unless you catch and return `Effect.void`
+for idempotency cases (duplicate deliveries).
+
+## Registering routes with `runConnector`
+
+```ts
+import { NodeHttpServer } from "@effect/platform-node";
+
+yield *
+  runConnector(connector, {
+    webhook: {
+      routes: [route],
+      healthPath: "/health", // default; override if the platform requires it
+    },
+  }).pipe(Effect.provide(NodeHttpServer.layer({ port: config.webhookPort })));
+```
+
+- Provide a platform server layer separately (`NodeHttpServer.layer`,
+  `NodeHttpServer.layerTest`, or Bun equivalents) via `Effect.provide`.
+- `healthPath` — auto-mounted returning `"ok"` with 200.
+- `disableHttpLogger` — set `true` in noisy CI if you want to silence
+  the default access-log middleware.
+
+Omit the `webhook` option entirely if the connector is polling-only.
+
+## Signature verification
+
+Implement signature verification strictly from official platform docs. Do not
+reuse another provider's signing recipe.
+
+Use the raw body when the platform signs exact request bytes. Never substitute
+`JSON.stringify(payload)` unless the provider contract explicitly says so.
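
When the platform signs the exact raw bytes with plain HMAC-SHA256 (a common scheme, but confirm it in the provider's docs), the core check is small. Everything below is an assumption for illustration: the header encoding, the digest algorithm, and the helper name:

```ts
import { createHmac, timingSafeEqual } from "node:crypto";

// Hypothetical raw-body check. Assumes the provider sends a hex-encoded
// HMAC-SHA256 digest of the request bytes; many providers instead sign a
// timestamped string or prefix a scheme, so follow their recipe exactly.
const verifyHmacSha256 = (
  rawBody: Uint8Array,
  secret: string,
  signatureHeader: string,
): boolean => {
  const expected = createHmac("sha256", secret).update(rawBody).digest();
  const received = Buffer.from(signatureHeader, "hex");
  // timingSafeEqual throws on length mismatch, so guard first; the
  // constant-time compare avoids timing side channels.
  return received.length === expected.length && timingSafeEqual(received, expected);
};
```

A connector-level `verifySignature` helper would wrap a check like this and fail with `ConnectorError` on mismatch, as in the handler sketch that follows.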
+ +```ts +handle: (payload, request, rawBody) => + Effect.gen(function* () { + if (Option.isNone(config.webhookSecret)) { + yield* Effect.logWarning("Webhook secret unset; skipping verification"); + } else if (rawBody) { + yield* verifySignature({ + rawBody, + secret: config.webhookSecret.value, + signatureHeader: Headers.get(request.headers, "x-sig"), + }); + } else { + yield* Effect.fail( + new ConnectorError({ + message: "Missing raw body; cannot verify signature", + }), + ); + } + // ... dispatch + }); +``` + +See `example-webhook-verification.md` for optional illustrative patterns. +Platform docs always override examples. + +### Key rules + +- Run signature verification **before dispatching/publishing side effects**. + (`WebhookRoute.handle` receives already-decoded payload plus `rawBody`.) +- Use the comparison and verification primitives required by the provider. + For HMAC flows, use a constant-time comparison. +- When the secret is `Option.none()` (explicitly missing), **log a + warning** but do not crash — this keeps local development workable. +- Wrap verification errors into `ConnectorError` so `runConnector`'s + error channel stays narrow. + +## Dispatch by event type + +Always switch exhaustively: + +```ts +switch (payload.type) { + case "post.created": + case "post.updated": + return ( + yield * + dispatchEntityWebhook({ + queue: streams.posts.live, + cutoff: streams.posts.cutoff, + cursor: payload.data.id.toString(), + row: payload.data, + }) + ); + case "unrelated.event": + return Effect.void; // ignore intentionally + default: + return Effect.logWarning("Unknown webhook type").pipe( + Effect.annotateLogs({ type: (payload as { type: string }).type }), + ); +} +``` + +Exhaustive switches force you to look at new event types when they +appear, rather than silently dropping them. + +## Polling fallback (when no webhooks exist) + +For platforms without webhooks, replace the `live` stream with a polled +stream that repeats fetching on a schedule. 
You still use `makePullStream`
+for backfill, but the "live" side comes from `Stream.repeatEffect` or
+`Stream.schedule`.
+
+```ts
+import { Stream, Schedule, Effect } from "effect";
+
+const live: Stream.Stream<Batch<Post>, ConnectorError, TemplateApiClient> = Stream.unwrap(
+  Effect.gen(function* () {
+    const api = yield* TemplateApiClient;
+    return Stream.repeatEffect(
+      Effect.gen(function* () {
+        const page = yield* api.fetchList(PostSchema, "/posts", { page: 1 });
+        return {
+          cursor: page.items[0].id.toString(),
+          rows: page.items,
+        } satisfies Batch<Post>;
+      }),
+    ).pipe(Stream.schedule(Schedule.spaced("30 seconds")));
+  }),
+);
+```
+
+Notes for polling-only connectors:
+
+- Do **not** pass the `webhook` option to `runConnector`. The kit will
+  skip all HTTP server setup.
+- The cutoff deferred is still required by the engine. Set it via
+  `initialCutoff` in `RunConnectorOptions`, or resolve it synthetically on
+  first poll.
+- Per-poll cursor must advance — if not, you will re-publish the same
+  rows on every tick (the seen-set will dedupe, but you're wasting work).
+
+## Multiple routes
+
+Connectors with multiple event sources (e.g., Stripe sends to `/webhooks`
+but GitHub mounts `/hooks/`) list multiple routes:
+
+```ts
+routes: [postsWebhookRoute, commentsWebhookRoute],
+```
+
+Each route gets its own `schema` and `handle`. Typically they share a
+single secret; have each handler read from the same `config.webhookSecret`.
+
+## Testing webhooks
+
+```ts
+import { NodeHttpServer, NodeHttpClient } from "@effect/platform-node";
+import { HttpClientRequest, HttpClient } from "effect/unstable/http";
+
+const ServerLayer = NodeHttpServer.layerTest;
+
+it.effect("dispatches webhook", () =>
+  Effect.gen(function* () {
+    yield* Effect.forkScoped(
+      runConnector(connector, {
+        webhook: {
+          /* ... 
*/ + }, + }), + ); + + const client = yield* HttpClient.HttpClient; + const response = yield* client.execute( + HttpClientRequest.post("/webhooks/example").pipe( + HttpClientRequest.bodyJsonUnsafe({ type: "post.created", data: post }), + ), + ); + expect(response.status).toBe(200); + + const batches = yield* capturedBatches; + expect(batches).toHaveLength(1); + }).pipe(Effect.provide(layers)), +); +``` + +`NodeHttpServer.layerTest` wires server + client to an in-process +transport — no real port needed. + +## Gotchas + +- **Deliveries arrive before backfill is ready.** The kit's cutoff + Deferred handles this: first live event sets the cutoff, backfill waits. + Don't try to block deliveries on a "ready" flag — you'll lose them. +- **Idempotency**. Most platforms retry on non-2xx. Returning 200 is + sufficient; the kit's seen-set handles duplicate rows. +- **Large payloads**. Platforms cap webhook body size; for bulk imports, + receive a notification and then fetch the full object via the API. +- **Ordering**. Webhook delivery order is never guaranteed. Use the + cursor field (monotonic, usually `updated_at`) to deduplicate. 
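
The ordering gotcha is easiest to see with a tiny gate over the cursor. This is only an illustration (the kit's seen-set is the real dedup layer), assuming an ISO-8601 `updated_at` cursor:

```ts
// Hypothetical per-entity gate: drop deliveries whose cursor is at or
// before the newest cursor already seen (duplicates and stale reorders).
const makeCursorGate = () => {
  let newest = Number.NEGATIVE_INFINITY;
  return (updatedAt: string): boolean => {
    const t = new Date(updatedAt).getTime();
    if (t <= newest) return false; // duplicate or out-of-order delivery
    newest = t;
    return true;
  };
};
```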
diff --git a/.github/workflows/build.yaml b/.github/workflows/build.yaml index 4332cbb..5896459 100644 --- a/.github/workflows/build.yaml +++ b/.github/workflows/build.yaml @@ -21,8 +21,6 @@ jobs: node-version: 24 - name: Install pnpm uses: pnpm/action-setup@v6 - with: - version: 10 - name: Install Protoc uses: arduino/setup-protoc@v1 with: diff --git a/.github/workflows/release.yaml b/.github/workflows/release.yaml index 32675f6..761e3dd 100644 --- a/.github/workflows/release.yaml +++ b/.github/workflows/release.yaml @@ -31,8 +31,6 @@ jobs: node-version: 24 - name: Install pnpm uses: pnpm/action-setup@v6 - with: - version: 10 - name: Install Protoc uses: arduino/setup-protoc@v1 with: diff --git a/.gitignore b/.gitignore index 291934d..49989ea 100644 --- a/.gitignore +++ b/.gitignore @@ -11,7 +11,8 @@ dist out *.tgz .claude - +.temp/ +AGENTS.md .nx/cache -.nx/workspace-data \ No newline at end of file +.nx/workspace-data diff --git a/change/@useairfoil-connector-kit-4763d6fe-014a-4d4c-8707-0e193b6bfc3d.json b/change/@useairfoil-connector-kit-4763d6fe-014a-4d4c-8707-0e193b6bfc3d.json new file mode 100644 index 0000000..714a6ba --- /dev/null +++ b/change/@useairfoil-connector-kit-4763d6fe-014a-4d4c-8707-0e193b6bfc3d.json @@ -0,0 +1,7 @@ +{ + "type": "minor", + "comment": "feat: improve webhook handling, add telemetry support and update runConnector", + "packageName": "@useairfoil/connector-kit", + "email": "jadejajaipal5@gmail.com", + "dependentChangeType": "patch" +} diff --git a/change/@useairfoil-effect-vcr-ede5ef7e-1d02-4983-88c8-ef6ae93a3830.json b/change/@useairfoil-effect-vcr-ede5ef7e-1d02-4983-88c8-ef6ae93a3830.json new file mode 100644 index 0000000..e82b192 --- /dev/null +++ b/change/@useairfoil-effect-vcr-ede5ef7e-1d02-4983-88c8-ef6ae93a3830.json @@ -0,0 +1,7 @@ +{ + "type": "patch", + "comment": "chore: export relevant types", + "packageName": "@useairfoil/effect-vcr", + "email": "jadejajaipal5@gmail.com", + "dependentChangeType": "patch" +} diff --git 
a/change/@useairfoil-wings-8f5b8a46-771f-4444-a0e7-cba8ec2c31be.json b/change/@useairfoil-wings-8f5b8a46-771f-4444-a0e7-cba8ec2c31be.json new file mode 100644 index 0000000..bc5d3f7 --- /dev/null +++ b/change/@useairfoil-wings-8f5b8a46-771f-4444-a0e7-cba8ec2c31be.json @@ -0,0 +1,7 @@ +{ + "type": "patch", + "comment": "chore: fix tests", + "packageName": "@useairfoil/wings", + "email": "jadejajaipal5@gmail.com", + "dependentChangeType": "patch" +} diff --git a/connectors/producer-polar/README.md b/connectors/producer-polar/README.md index fc853ba..c879566 100644 --- a/connectors/producer-polar/README.md +++ b/connectors/producer-polar/README.md @@ -11,7 +11,7 @@ This section shows how to wire the connector in your own application. The built- ### Install ```bash -bun add @useairfoil/producer-polar +pnpm add @useairfoil/producer-polar ``` ### Provide config via environment @@ -35,6 +35,8 @@ POLAR_WEBHOOK_PORT=8080 ### Minimal wiring (Node + Fetch) +This example uses Node. Bun works too if you provide Bun's HttpServer layer. 
+ You must provide these runtime layers: - `PolarConnectorConfig()` @@ -44,15 +46,11 @@ You must provide these runtime layers: - `Publisher` and `StateStore` layers ```ts -import { FetchHttpClient, HttpServer } from "@effect/platform"; +import { FetchHttpClient } from "effect/unstable/http"; import { NodeHttpServer } from "@effect/platform-node"; -import { - buildWebhookRouter, - Publisher, - runConnector, - StateStoreInMemory, -} from "@useairfoil/connector-kit"; +import { Publisher, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit"; import { ConfigProvider, Effect, Layer } from "effect"; +import { createServer } from "node:http"; import { PolarConnector, PolarConnectorConfig } from "@useairfoil/producer-polar"; const ConsolePublisher = Layer.succeed(Publisher, { @@ -61,11 +59,12 @@ const ConsolePublisher = Layer.succeed(Publisher, { const program = Effect.gen(function* () { const { connector, routes } = yield* PolarConnector; - const router = buildWebhookRouter(routes); - const app = router.pipe(HttpServer.serve(), HttpServer.withLogAddress); - const serverLayer = Layer.provide(app, NodeHttpServer.layer({ port: 8080 })); + const serverLayer = NodeHttpServer.layer(createServer, { port: 8080 }); - return yield* runConnector(connector, new Date()).pipe(Effect.provide(serverLayer)); + return yield* runConnector(connector, { + initialCutoff: new Date(), + webhook: { routes }, + }).pipe(Effect.provide(serverLayer)); }).pipe( Effect.provide(StateStoreInMemory), Effect.provide(ConsolePublisher), @@ -106,18 +105,14 @@ At runtime you typically provide: - an `HttpClient` layer (Fetch or VCR) - a `Publisher` and `StateStore` layer -Minimal wiring (Bun + FetchHttpClient): +Minimal wiring (Node + FetchHttpClient): ```ts -import { FetchHttpClient, HttpServer } from "@effect/platform"; -import { BunHttpServer } from "@effect/platform-bun"; -import { - buildWebhookRouter, - Publisher, - runConnector, - StateStoreInMemory, -} from "@useairfoil/connector-kit"; 
+import { FetchHttpClient } from "effect/unstable/http"; +import { NodeHttpServer } from "@effect/platform-node"; +import { Publisher, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit"; import { ConfigProvider, Effect, Layer } from "effect"; +import { createServer } from "node:http"; import { PolarConnector, PolarConnectorConfig } from "./src/index"; const ConsolePublisher = Layer.succeed(Publisher, { @@ -126,11 +121,12 @@ const ConsolePublisher = Layer.succeed(Publisher, { const program = Effect.gen(function* () { const { connector, routes } = yield* PolarConnector; - const router = buildWebhookRouter(routes); - const app = router.pipe(HttpServer.serve(), HttpServer.withLogAddress); - const serverLayer = Layer.provide(app, BunHttpServer.layer({ port: 8080 })); + const serverLayer = NodeHttpServer.layer(createServer, { port: 8080 }); - return yield* runConnector(connector, new Date()).pipe(Effect.provide(serverLayer)); + return yield* runConnector(connector, { + initialCutoff: new Date(), + webhook: { routes }, + }).pipe(Effect.provide(serverLayer)); }).pipe( Effect.provide(StateStoreInMemory), Effect.provide(ConsolePublisher), @@ -161,29 +157,26 @@ The connector supports VCR-style record/replay for outgoing Polar API calls thro Minimal VCR wiring (Node test example): ```ts -import { FetchHttpClient } from "@effect/platform"; -import { NodeFileSystem } from "@effect/platform-node"; -import { CassetteStoreLive, VcrHttpClientLayer } from "@useairfoil/effect-http-client"; +import { FetchHttpClient } from "effect/unstable/http"; +import { FileSystemCassetteStore, VcrHttpClient } from "@useairfoil/effect-vcr"; import { ConfigProvider, Effect, Layer } from "effect"; import { PolarConnector, PolarConnectorConfig } from "../src/index"; -const cassetteLayer = CassetteStoreLive.pipe(Layer.provide(NodeFileSystem.layer)); - -const vcrLayer = VcrHttpClientLayer({ - cassetteDir: "cassettes", - cassetteName: "customers-backfill-replay", +const vcrLayer = 
VcrHttpClient.layer({ + vcrName: "producer-polar", mode: "auto", matchIgnore: { requestHeaders: ["authorization"] }, redact: { requestHeaders: ["authorization"] }, -}).pipe(Layer.provide(Layer.mergeAll(FetchHttpClient.layer, cassetteLayer))); - -const configProvider = ConfigProvider.fromMap( - new Map([ - ["POLAR_ACCESS_TOKEN", process.env.POLAR_ACCESS_TOKEN ?? "test"], - ["POLAR_API_BASE_URL", "https://sandbox-api.polar.sh/v1/"], - ]), +}).pipe( + Layer.provideMerge(FileSystemCassetteStore.layer()), + Layer.provideMerge(FetchHttpClient.layer), ); +const configProvider = ConfigProvider.fromUnknown({ + POLAR_ACCESS_TOKEN: "test", + POLAR_API_BASE_URL: "https://sandbox-api.polar.sh/v1/", +}); + const program = Effect.gen(function* () { const { connector } = yield* PolarConnector; // run connector with your publisher/state layers... @@ -198,5 +191,5 @@ Example test run from the connector directory: ```bash POLAR_API_BASE_URL=https://sandbox-api.polar.sh/v1/ \ -bun run --cwd connectors/producer-polar test +pnpm --filter @useairfoil/producer-polar run test ``` diff --git a/connectors/producer-polar/package.json b/connectors/producer-polar/package.json index 168e6e4..65e9723 100644 --- a/connectors/producer-polar/package.json +++ b/connectors/producer-polar/package.json @@ -21,8 +21,8 @@ "scripts": { "build": "tsdown", "sandbox": "tsx --env-file=.env src/sandbox.ts", - "test": "vitest", - "test:ci": "vitest run", + "test": "dotenvx run --ignore=MISSING_ENV_FILE --quiet -- vitest", + "test:ci": "dotenvx run --ignore=MISSING_ENV_FILE --quiet -- vitest run", "typecheck": "tsc --noEmit" }, "dependencies": { @@ -32,6 +32,7 @@ "effect": "catalog:" }, "devDependencies": { + "@dotenvx/dotenvx": "^1.62.0", "@effect/vitest": "catalog:", "@types/node": "catalog:", "@useairfoil/effect-vcr": "workspace:*", diff --git a/connectors/producer-polar/src/sandbox.ts b/connectors/producer-polar/src/sandbox.ts index a0224f0..f9773b9 100644 --- a/connectors/producer-polar/src/sandbox.ts +++ 
b/connectors/producer-polar/src/sandbox.ts @@ -1,14 +1,10 @@ import type { ConnectorError } from "@useairfoil/connector-kit"; import { NodeHttpServer } from "@effect/platform-node"; -import { - buildWebhookRouter, - Publisher, - runConnector, - StateStoreInMemory, -} from "@useairfoil/connector-kit"; -import { Config, ConfigProvider, DateTime, Effect, Layer, Logger } from "effect"; -import { FetchHttpClient, HttpRouter, HttpServerResponse } from "effect/unstable/http"; +import { Publisher, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit"; +import { Config, ConfigProvider, DateTime, Effect, Layer, Logger, Metric } from "effect"; +import { FetchHttpClient } from "effect/unstable/http"; +import * as Observability from "effect/unstable/observability"; import { createServer } from "node:http"; import { PolarConnector, PolarConnectorConfig } from "./index"; @@ -17,11 +13,16 @@ const SandboxConfig = Config.all({ port: Config.port("POLAR_WEBHOOK_PORT").pipe(Config.withDefault(8080)), }); +const TelemetryConfig = Config.all({ + enabled: Config.boolean("ACK_TELEMETRY_ENABLED").pipe(Config.withDefault(false)), + baseUrl: Config.string("ACK_OTLP_BASE_URL").pipe(Config.withDefault("http://localhost:4318")), + serviceName: Config.string("ACK_SERVICE_NAME").pipe(Config.withDefault("producer-polar")), +}); + const ConsolePublisherLayer = Layer.succeed(Publisher)({ - publish: ({ name, batch }) => + publish: ({ name, source, batch }) => Effect.gen(function* () { const ids = batch.rows.map((r) => r["id"]).filter(Boolean); - const source = typeof batch.cursor === "number" ? 
"backfill" : "live"; yield* Effect.logInfo(`[publisher] -> Source: ${source} | Name: ${name}`).pipe( Effect.annotateLogs({ count: batch.rows.length, @@ -38,14 +39,7 @@ const program = Effect.gen(function* () { const config = yield* SandboxConfig; const { connector, routes } = yield* PolarConnector; const routePaths = routes.map((route) => route.path); - const routerLayer = Layer.mergeAll( - buildWebhookRouter(routes), - HttpRouter.add("GET", "/health", Effect.succeed(HttpServerResponse.text("ok"))), - ); - const app = HttpRouter.serve(routerLayer, { - disableLogger: true, - }); - const serverLayer = Layer.provide(app, NodeHttpServer.layer(createServer, { port: config.port })); + const serverLayer = NodeHttpServer.layer(createServer, { port: config.port }); yield* Effect.logInfo("webhook server ready").pipe( Effect.annotateLogs({ port: config.port, routes: routePaths }), @@ -53,7 +47,14 @@ const program = Effect.gen(function* () { const now = yield* DateTime.now; - return yield* runConnector(connector, DateTime.toDate(now)).pipe(Effect.provide(serverLayer)); + return yield* runConnector(connector, { + initialCutoff: DateTime.toDate(now), + webhook: { + routes, + healthPath: "/health", + disableHttpLogger: true, + }, + }).pipe(Effect.provide(serverLayer)); }).pipe(Effect.annotateLogs({ component: "polar" })); const EnvLayer = Layer.mergeAll( @@ -61,13 +62,40 @@ const EnvLayer = Layer.mergeAll( Layer.succeed(ConfigProvider.ConfigProvider, ConfigProvider.fromEnv()), ); -const ConnectorLayer = PolarConnectorConfig().pipe(Layer.provideMerge(EnvLayer)); +const ConnectorLayer = PolarConnectorConfig(); + +const TelemetryLayer = Layer.unwrap( + Effect.gen(function* () { + const telemetry = yield* TelemetryConfig; + if (!telemetry.enabled) { + return Layer.empty; + } + + yield* Effect.logInfo("telemetry enabled").pipe( + Effect.annotateLogs({ + serviceName: telemetry.serviceName, + baseUrl: telemetry.baseUrl, + }), + ); + + return Layer.mergeAll( + 
Observability.Otlp.layerJson({ + baseUrl: telemetry.baseUrl, + resource: { + serviceName: telemetry.serviceName, + }, + }), + Metric.enableRuntimeMetricsLayer, + ); + }), +); const RuntimeLayer = Layer.mergeAll( StateStoreInMemory, ConsolePublisherLayer, ConnectorLayer, Logger.layer([Logger.consolePretty()]), + TelemetryLayer, EnvLayer, ); diff --git a/connectors/producer-polar/test/helpers.ts b/connectors/producer-polar/test/helpers.ts index bde4f80..e595f1f 100644 --- a/connectors/producer-polar/test/helpers.ts +++ b/connectors/producer-polar/test/helpers.ts @@ -5,6 +5,7 @@ import { Deferred, Effect, Layer, Ref } from "effect"; export type Published = { readonly name: string; + readonly source: "live" | "backfill"; readonly batch: Batch>; }; @@ -13,11 +14,11 @@ export const makeTestPublisher = (expected: number) => const publishedRef = yield* Ref.make>([]); const done = yield* Deferred.make(); const layer = Layer.succeed(Publisher)({ - publish: ({ name, batch }) => + publish: ({ name, source, batch }) => Effect.gen(function* () { const next = yield* Ref.updateAndGet(publishedRef, (items) => [ ...items, - { name, batch }, + { name, source, batch }, ]); if (next.length === expected) { yield* Deferred.succeed(done, next.length); diff --git a/connectors/producer-polar/test/webhook.test.ts b/connectors/producer-polar/test/webhook.test.ts index d9809dc..11f61d6 100644 --- a/connectors/producer-polar/test/webhook.test.ts +++ b/connectors/producer-polar/test/webhook.test.ts @@ -1,13 +1,8 @@ import { NodeHttpServer } from "@effect/platform-node"; import { describe, expect, it } from "@effect/vitest"; -import { - buildWebhookRouter, - ConnectorError, - runConnector, - StateStoreInMemory, -} from "@useairfoil/connector-kit"; +import { ConnectorError, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit"; import { ConfigProvider, Deferred, Effect, Layer, Ref } from "effect"; -import { HttpClient, HttpClientRequest, HttpRouter } from "effect/unstable/http"; 
+import { HttpClient, HttpClientRequest } from "effect/unstable/http"; import { PolarApiClient, type PolarApiClientService } from "../src/api"; import { PolarConnector, PolarConnectorConfig } from "../src/index"; @@ -56,15 +51,17 @@ describe("producer-polar webhook", () => { return Effect.gen(function* () { const { publishedRef, done, layer } = yield* makeTestPublisher(1); const { connector, routes } = yield* PolarConnector; - - const serverLayer = buildWebhookRouter(routes).pipe( - HttpRouter.serve, - Layer.provideMerge(runtimeLayer), - ); - const runLayer = Layer.mergeAll(StateStoreInMemory, layer, serverLayer); + const runLayer = Layer.mergeAll(StateStoreInMemory, layer, runtimeLayer); yield* Effect.gen(function* () { - yield* Effect.forkScoped(runConnector(connector, new Date())); + yield* Effect.forkScoped( + runConnector(connector, { + initialCutoff: new Date(), + webhook: { + routes, + }, + }), + ); const client = yield* HttpClient.HttpClient; const request = HttpClientRequest.post("/webhooks/polar").pipe( diff --git a/connectors/producer-shopify/.env.example b/connectors/producer-shopify/.env.example new file mode 100644 index 0000000..dd43405 --- /dev/null +++ b/connectors/producer-shopify/.env.example @@ -0,0 +1,6 @@ +# Shopify Admin REST connector configuration. + +SHOPIFY_API_BASE_URL=https://your-development-store.myshopify.com/admin/api/2026-01 +SHOPIFY_API_TOKEN=shpat_your_admin_api_access_token +SHOPIFY_WEBHOOK_SECRET=your_app_client_secret +SHOPIFY_WEBHOOK_PORT=8080 diff --git a/connectors/producer-shopify/README.md b/connectors/producer-shopify/README.md new file mode 100644 index 0000000..fee0d58 --- /dev/null +++ b/connectors/producer-shopify/README.md @@ -0,0 +1,78 @@ +# producer-shopify + +Shopify producer connector for Airfoil Connector Kit (ACK). 
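
The live source is driven by Shopify webhooks, whose `X-Shopify-Hmac-SHA256` signatures must be verified against the raw request body bytes (see the verification notes below). As a minimal standalone sketch of that scheme — plain Node `crypto`, with a hypothetical `verifyShopifyHmac` name; the connector itself wraps the same logic in an Effect — verification looks like:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Hypothetical standalone helper (not the connector's actual API surface):
// returns true when the base64 `signature` taken from the
// X-Shopify-Hmac-SHA256 header matches the HMAC-SHA256 of the raw body
// computed with the app's shared secret.
export const verifyShopifyHmac = (
  rawBody: Uint8Array,
  signature: string,
  secret: string,
): boolean => {
  // Digest is computed over the exact raw body bytes, never a re-serialized
  // JSON object, since any re-encoding would change the HMAC.
  const digest = createHmac("sha256", secret).update(rawBody).digest();
  const provided = Buffer.from(signature, "base64");
  // Length check first: timingSafeEqual throws if buffer sizes differ.
  return provided.length === digest.length && timingSafeEqual(digest, provided);
};
```

Note the constant-time comparison via `timingSafeEqual` rather than `===` on strings, which would leak timing information about the expected digest.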
+ +Current v1 scope: + +- Entity: `products` +- Backfill source: Shopify Admin REST `GET /products.json` +- Live source: Shopify webhooks on `products/create` and `products/update` + +## Architecture + +- `src/api.ts`: REST client with `X-Shopify-Access-Token` auth and Link-header pagination support +- `src/streams.ts`: cutoff-aware backfill stream plus live webhook queue +- `src/connector.ts`: connector/entity registration and webhook route/signature verification +- `src/sandbox.ts`: runnable local runtime (Node server + in-memory store + console publisher) + +## Environment variables + +Copy `.env.example` to `.env` and fill values: + +- `SHOPIFY_API_BASE_URL` - full base URL including pinned API version, for example `https://your-store.myshopify.com/admin/api/2026-01` +- `SHOPIFY_API_TOKEN` - Admin API access token (`X-Shopify-Access-Token`) +- `SHOPIFY_WEBHOOK_SECRET` - app shared secret used to validate `X-Shopify-Hmac-SHA256` +- `SHOPIFY_WEBHOOK_PORT` - local webhook server port (default `8080`) + +Recommended scope for this v1 connector: `read_products`. + +## Usage + +Run sandbox: + +```bash +pnpm --filter @useairfoil/producer-shopify run sandbox +``` + +Webhook endpoint: + +- `POST /webhooks/shopify` + +Expected headers: + +- `X-Shopify-Topic` (`products/create` or `products/update`) +- `X-Shopify-Hmac-SHA256` (verified against raw body bytes) + +## Tests + +- `test/api.vcr.test.ts`: deterministic replay of a recorded `products.json` response +- `test/webhook.test.ts`: in-memory webhook flow with HMAC signature verification + +### VCR workflow + +1. Ensure `.env` contains valid `SHOPIFY_API_BASE_URL` and `SHOPIFY_API_TOKEN`. +2. Record cassette: + +```bash +rm -rf connectors/producer-shopify/test/__cassettes__ +pnpm --filter @useairfoil/producer-shopify run test:ci -- test/api.vcr.test.ts +``` + +3. 
Replay-only verification: + +```bash +pnpm --filter @useairfoil/producer-shopify run test:ci +``` + +Run tests: + +```bash +pnpm --filter @useairfoil/producer-shopify run test:ci +``` + +## Notes + +- Shopify REST Admin API is legacy; GraphQL is recommended by Shopify for new apps. +- This connector pins REST paths by embedding the version in `SHOPIFY_API_BASE_URL`. +- Pagination follows Shopify Link header `rel="next"` URLs with `page_info` cursors. +- Inbound webhook signature validation uses `SHOPIFY_WEBHOOK_SECRET` and raw body HMAC SHA-256. diff --git a/connectors/producer-shopify/package.json b/connectors/producer-shopify/package.json new file mode 100644 index 0000000..da8aa41 --- /dev/null +++ b/connectors/producer-shopify/package.json @@ -0,0 +1,41 @@ +{ + "name": "@useairfoil/producer-shopify", + "version": "0.1.0", + "private": true, + "files": [ + "dist", + "src", + "README.md" + ], + "type": "module", + "main": "./dist/index.js", + "types": "./dist/index.d.ts", + "exports": { + ".": { + "types": "./dist/index.d.ts", + "import": "./dist/index.js", + "default": "./dist/index.js" + } + }, + "scripts": { + "build": "tsdown", + "sandbox": "tsx --env-file=.env src/sandbox.ts", + "test": "dotenvx run --ignore=MISSING_ENV_FILE --quiet -- vitest", + "test:ci": "dotenvx run --ignore=MISSING_ENV_FILE --quiet -- vitest run", + "typecheck": "tsc --noEmit" + }, + "dependencies": { + "@effect/platform-node": "catalog:", + "@useairfoil/connector-kit": "workspace:*", + "effect": "catalog:" + }, + "devDependencies": { + "@dotenvx/dotenvx": "^1.62.0", + "@effect/vitest": "catalog:", + "@types/node": "catalog:", + "@useairfoil/effect-vcr": "workspace:*", + "tsdown": "catalog:", + "tsx": "^4.21.0", + "vitest": "catalog:" + } +} diff --git a/connectors/producer-shopify/src/api.ts b/connectors/producer-shopify/src/api.ts new file mode 100644 index 0000000..cc8c92c --- /dev/null +++ b/connectors/producer-shopify/src/api.ts @@ -0,0 +1,144 @@ +import { ConnectorError } from 
"@useairfoil/connector-kit"; +import { Context, Effect, Layer, Schema } from "effect"; +import { HttpClient, HttpClientRequest, HttpClientResponse } from "effect/unstable/http"; + +import type { ShopifyConfig } from "./connector"; + +// Page of rows returned by the list helper. +export type ShopifyListPage = { + readonly items: ReadonlyArray; + readonly nextUrl: string | null; + readonly hasMore: boolean; +}; + +export type ShopifyApiClientService = { + readonly fetchJson: ( + schema: Schema.Decoder, + path: string, + params?: Record, + ) => Effect.Effect; + readonly fetchList: ( + schema: Schema.Decoder, + path: string, + options: { + readonly limit: number; + readonly nextUrl?: string; + }, + ) => Effect.Effect, ConnectorError, R>; +}; + +export class ShopifyApiClient extends Context.Service()( + "@useairfoil/producer-shopify/ShopifyApiClient", +) {} + +// Factory that resolves an HttpClient and exposes a small typed API surface. +const extractNextUrl = (linkHeader: string | undefined): string | null => { + if (!linkHeader) { + return null; + } + const match = linkHeader.match(/<([^>]+)>;\s*rel="?next"?/i); + return match?.[1] ?? null; +}; + +const inferListField = (path: string): string => { + const [firstSegment] = path.split("?"); + const finalSegment = firstSegment?.split("/").at(-1) ?? 
""; + return finalSegment.replace(/\.json$/i, ""); +}; + +const isAbsoluteUrl = (value: string): boolean => /^https?:\/\//i.test(value); + +export const makeShopifyApiClient = ( + config: ShopifyConfig, +): Effect.Effect => + Effect.gen(function* () { + const rawClient = yield* HttpClient.HttpClient; + const authAndJsonClient = rawClient.pipe( + HttpClient.mapRequest(HttpClientRequest.setHeader("X-Shopify-Access-Token", config.apiToken)), + HttpClient.mapRequest(HttpClientRequest.acceptJson), + ); + const relativePathClient = authAndJsonClient.pipe( + HttpClient.mapRequest(HttpClientRequest.prependUrl(config.apiBaseUrl)), + ); + + const fetchJson = ( + schema: Schema.Decoder, + path: string, + params?: Record, + ): Effect.Effect => { + const request = params + ? HttpClientRequest.get(path).pipe(HttpClientRequest.setUrlParams(params)) + : HttpClientRequest.get(path); + return Effect.scoped( + relativePathClient.execute(request).pipe( + Effect.flatMap(HttpClientResponse.filterStatusOk), + Effect.flatMap((response) => response.json), + Effect.flatMap(Schema.decodeUnknownEffect(schema)), + Effect.mapError( + (error) => + new ConnectorError({ + message: "Shopify API request failed", + cause: error, + }), + ), + ), + ); + }; + + const fetchList = ( + schema: Schema.Decoder, + path: string, + options: { + readonly limit: number; + readonly nextUrl?: string; + }, + ): Effect.Effect, ConnectorError, R> => { + const useAbsolute = typeof options.nextUrl === "string" && isAbsoluteUrl(options.nextUrl); + const client = useAbsolute ? authAndJsonClient : relativePathClient; + const request = options.nextUrl + ? 
HttpClientRequest.get(options.nextUrl) + : HttpClientRequest.get(`${path}?limit=${options.limit}`); + const arraySchema = Schema.Array(schema) as unknown as Schema.Decoder, R>; + const listField = inferListField(path); + + return Effect.scoped( + client.execute(request).pipe( + Effect.flatMap(HttpClientResponse.filterStatusOk), + Effect.flatMap((response) => + Effect.all({ + body: response.json, + linkHeader: Effect.succeed(response.headers["link"]), + }), + ), + Effect.flatMap(({ body, linkHeader }) => { + const unknownEnvelope = body as Record; + const unknownItems = unknownEnvelope[listField]; + return Schema.decodeUnknownEffect(arraySchema)(unknownItems).pipe( + Effect.map((items) => { + const nextUrl = extractNextUrl(linkHeader); + return { + items, + nextUrl, + hasMore: nextUrl !== null, + }; + }), + ); + }), + Effect.mapError( + (error) => + new ConnectorError({ + message: "Shopify list request failed", + cause: error, + }), + ), + ), + ); + }; + + return { fetchJson, fetchList }; + }); + +export const ShopifyApiClientConfig = ( + config: ShopifyConfig, +): Layer.Layer => + Layer.effect(ShopifyApiClient)(makeShopifyApiClient(config)); diff --git a/connectors/producer-shopify/src/connector.ts b/connectors/producer-shopify/src/connector.ts new file mode 100644 index 0000000..6928923 --- /dev/null +++ b/connectors/producer-shopify/src/connector.ts @@ -0,0 +1,191 @@ +import type { HttpClient } from "effect/unstable/http"; + +import { + type ConnectorDefinition, + ConnectorError, + defineConnector, + defineEntity, + type WebhookRoute, +} from "@useairfoil/connector-kit"; +import { Config, Context, Effect, Layer, Option } from "effect"; +import { createHmac, timingSafeEqual } from "node:crypto"; + +import { ShopifyApiClient, ShopifyApiClientConfig } from "./api"; +import { type Product, ProductSchema, type WebhookPayload, WebhookPayloadSchema } from "./schemas"; +import { + dispatchEntityWebhook, + type EntityStreams, + makeEntityStreams, + resolveCursor, +} from 
"./streams"; + +export type ShopifyConfig = { + readonly apiBaseUrl: string; + readonly apiToken: string; + readonly webhookSecret: Option.Option; +}; + +export type ShopifyConnectorRuntime = { + readonly connector: ConnectorDefinition; + readonly routes: ReadonlyArray>; +}; + +export class ShopifyConnector extends Context.Service()( + "@useairfoil/producer-shopify/ShopifyConnector", +) {} + +export const ShopifyConfigConfig = Config.all({ + apiBaseUrl: Config.string("SHOPIFY_API_BASE_URL").pipe( + Config.withDefault("https://your-development-store.myshopify.com/admin/api/2026-01"), + ), + apiToken: Config.string("SHOPIFY_API_TOKEN"), + webhookSecret: Config.option(Config.string("SHOPIFY_WEBHOOK_SECRET")), +}); + +const verifyWebhookSignature = (options: { + readonly rawBody: Uint8Array; + readonly signature: string | null; + readonly secret: string; +}): Effect.Effect => + Effect.try({ + try: () => { + if (!options.signature) { + throw new Error("Missing x-shopify-hmac-sha256 header"); + } + const digest = createHmac("sha256", options.secret) + .update(Buffer.from(options.rawBody)) + .digest(); + const provided = Buffer.from(options.signature, "base64"); + if (provided.length !== digest.length || !timingSafeEqual(digest, provided)) { + throw new Error("Invalid Shopify webhook signature"); + } + }, + catch: (cause) => + new ConnectorError({ + message: "Shopify webhook verification failed", + cause, + }), + }); + +const resolveWebhookDispatch = (options: { + readonly payload: WebhookPayload; + readonly topic: string; + readonly products: EntityStreams; +}) => { + switch (options.topic) { + case "products/create": + case "products/update": { + return Effect.logInfo(`webhook ${options.topic}`).pipe( + Effect.annotateLogs({ id: options.payload.id }), + Effect.andThen( + resolveCursor(options.payload, "updated_at").pipe( + Effect.flatMap((cursor) => + dispatchEntityWebhook({ + queue: options.products.live, + cutoff: options.products.cutoff, + row: options.payload, + 
cursor, + }), + ), + ), + ), + ); + } + default: { + return Effect.logWarning("Ignoring unknown Shopify webhook topic").pipe( + Effect.annotateLogs({ topic: options.topic }), + Effect.asVoid, + ); + } + } +}; + +const makeShopifyConnector = ( + config: ShopifyConfig, +): Effect.Effect => + Effect.gen(function* () { + const api = yield* ShopifyApiClient; + const productStreams = yield* makeEntityStreams({ + api, + schema: ProductSchema, + path: "/products.json", + cursorField: "updated_at", + limit: 50, + }); + + const connector = defineConnector({ + name: "producer-shopify", + entities: [ + defineEntity({ + name: "products", + schema: ProductSchema, + primaryKey: "id", + live: productStreams.live, + backfill: productStreams.backfill, + }), + ], + events: [], + }); + + const webhookRoute: WebhookRoute = { + path: "/webhooks/shopify", + schema: WebhookPayloadSchema, + handle: (payload, request, rawBody) => + Effect.gen(function* () { + const topic = request.headers["x-shopify-topic"] ?? ""; + + if (Option.isSome(config.webhookSecret)) { + const verifiedBody = rawBody; + if (!verifiedBody) { + return yield* Effect.fail( + new ConnectorError({ + message: "Webhook raw body is required for Shopify signature verification", + }), + ); + } + yield* verifyWebhookSignature({ + rawBody: verifiedBody, + signature: request.headers["x-shopify-hmac-sha256"] ?? null, + secret: config.webhookSecret.value, + }); + } + + yield* resolveWebhookDispatch({ + payload, + topic, + products: productStreams, + }); + }), + }; + + if (Option.isNone(config.webhookSecret)) { + yield* Effect.logWarning( + "SHOPIFY_WEBHOOK_SECRET is not set. 
Incoming webhooks will not be signature-verified.", + ); + } + + return { connector, routes: [webhookRoute] }; + }).pipe(Effect.annotateLogs({ component: "producer-shopify" })); + +export const ShopifyConnectorConfig = (): Layer.Layer< + ShopifyConnector, + ConnectorError, + HttpClient.HttpClient +> => + Layer.effect(ShopifyConnector)( + Effect.gen(function* () { + const config = yield* ShopifyConfigConfig; + return yield* makeShopifyConnector(config).pipe( + Effect.provide(ShopifyApiClientConfig(config)), + ); + }).pipe( + Effect.mapError((error) => + error instanceof ConnectorError + ? error + : new ConnectorError({ + message: "Shopify config failed", + cause: error, + }), + ), + ), + ); diff --git a/connectors/producer-shopify/src/index.ts b/connectors/producer-shopify/src/index.ts new file mode 100644 index 0000000..01e352b --- /dev/null +++ b/connectors/producer-shopify/src/index.ts @@ -0,0 +1,10 @@ +export { ShopifyApiClient, ShopifyApiClientConfig } from "./api"; +export { + type ShopifyConfig, + ShopifyConfigConfig, + ShopifyConnector, + ShopifyConnectorConfig, + type ShopifyConnectorRuntime, +} from "./connector"; +export type { Product, WebhookPayload } from "./schemas"; +export { ProductSchema, WebhookPayloadSchema } from "./schemas"; diff --git a/connectors/producer-shopify/src/sandbox.ts b/connectors/producer-shopify/src/sandbox.ts new file mode 100644 index 0000000..661d428 --- /dev/null +++ b/connectors/producer-shopify/src/sandbox.ts @@ -0,0 +1,115 @@ +import type { ConnectorError } from "@useairfoil/connector-kit"; + +import { NodeHttpServer } from "@effect/platform-node"; +import { Publisher, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit"; +import { Config, ConfigProvider, DateTime, Effect, Layer, Logger, Metric } from "effect"; +import { FetchHttpClient } from "effect/unstable/http"; +import * as Observability from "effect/unstable/observability"; +import { createServer } from "node:http"; + +import { ShopifyConnector, 
ShopifyConnectorConfig } from "./index"; + +const SandboxConfig = Config.all({ + port: Config.port("SHOPIFY_WEBHOOK_PORT").pipe(Config.withDefault(8080)), +}); + +const TelemetryConfig = Config.all({ + enabled: Config.boolean("ACK_TELEMETRY_ENABLED").pipe(Config.withDefault(false)), + baseUrl: Config.string("ACK_OTLP_BASE_URL").pipe(Config.withDefault("http://localhost:4318")), + serviceName: Config.string("ACK_SERVICE_NAME").pipe(Config.withDefault("producer-shopify")), +}); + +// Console publisher so you can see ingestion output during `pnpm run sandbox`. +// Real connectors plug in `WingsPublisherLayer` from @useairfoil/connector-kit. +const ConsolePublisherLayer = Layer.succeed(Publisher)({ + publish: ({ name, source, batch }) => + Effect.gen(function* () { + const ids = batch.rows.map((r) => r["id"]).filter((id) => id != null); + yield* Effect.logInfo(`[publisher] -> Source: ${source} | Name: ${name}`).pipe( + Effect.annotateLogs({ + count: batch.rows.length, + ids, + cursor: batch.cursor, + source, + }), + ); + return { success: true }; + }), +}); + +const program = Effect.gen(function* () { + const config = yield* SandboxConfig; + const { connector, routes } = yield* ShopifyConnector; + const routePaths = routes.map((route) => route.path); + const serverLayer = NodeHttpServer.layer(createServer, { port: config.port }); + + yield* Effect.logInfo("webhook server ready").pipe( + Effect.annotateLogs({ port: config.port, routes: routePaths }), + ); + + const now = yield* DateTime.now; + + return yield* runConnector(connector, { + initialCutoff: DateTime.toDate(now), + webhook: { + routes, + healthPath: "/health", + disableHttpLogger: true, + }, + }).pipe(Effect.provide(serverLayer)); +}).pipe(Effect.annotateLogs({ component: "producer-shopify" })); + +const EnvLayer = FetchHttpClient.layer; + +const ConnectorLayer = ShopifyConnectorConfig().pipe(Layer.provide(EnvLayer)); + +const TelemetryLayer = Layer.unwrap( + Effect.gen(function* () { + const telemetry = 
yield* TelemetryConfig; + if (!telemetry.enabled) { + return Layer.empty; + } + + yield* Effect.logInfo("telemetry enabled").pipe( + Effect.annotateLogs({ + serviceName: telemetry.serviceName, + baseUrl: telemetry.baseUrl, + }), + ); + + return Layer.mergeAll( + Observability.Otlp.layerJson({ + baseUrl: telemetry.baseUrl, + resource: { + serviceName: telemetry.serviceName, + }, + }), + Metric.enableRuntimeMetricsLayer, + ); + }), +); + +const RuntimeLayer = Layer.mergeAll( + StateStoreInMemory, + ConsolePublisherLayer, + ConnectorLayer, + Logger.layer([Logger.consolePretty()]), + TelemetryLayer, +); + +Effect.runPromise( + Effect.scoped(program).pipe( + Effect.provide(RuntimeLayer), + Effect.provideService(ConfigProvider.ConfigProvider, ConfigProvider.fromEnv()), + ) as Effect.Effect, +).catch((error) => { + void Effect.runPromise( + Effect.logError("fatal error").pipe( + Effect.annotateLogs({ + component: "producer-shopify", + error: String(error), + }), + ), + ); + process.exit(1); +}); diff --git a/connectors/producer-shopify/src/schemas.ts b/connectors/producer-shopify/src/schemas.ts new file mode 100644 index 0000000..56799b6 --- /dev/null +++ b/connectors/producer-shopify/src/schemas.ts @@ -0,0 +1,28 @@ +import * as Schema from "effect/Schema"; + +export const ProductSchema = Schema.Struct({ + id: Schema.Number, + admin_graphql_api_id: Schema.String, + body_html: Schema.NullOr(Schema.String), + created_at: Schema.String, + handle: Schema.String, + image: Schema.NullOr(Schema.Any), + images: Schema.Array(Schema.Any), + options: Schema.Array(Schema.Any), + product_type: Schema.String, + published_at: Schema.NullOr(Schema.String), + published_scope: Schema.optional(Schema.String), + status: Schema.String, + tags: Schema.String, + template_suffix: Schema.NullOr(Schema.String), + title: Schema.String, + updated_at: Schema.String, + variants: Schema.Array(Schema.Any), + vendor: Schema.String, +}); + +export type Product = Schema.Schema.Type; + +export const 
WebhookPayloadSchema = ProductSchema; + +export type WebhookPayload = Schema.Schema.Type; diff --git a/connectors/producer-shopify/src/streams.ts b/connectors/producer-shopify/src/streams.ts new file mode 100644 index 0000000..98d6f29 --- /dev/null +++ b/connectors/producer-shopify/src/streams.ts @@ -0,0 +1,147 @@ +import type * as Schema from "effect/Schema"; + +import { + type Batch, + ConnectorError, + type Cursor, + makePullStream, + makeWebhookQueue, + type WebhookStream, +} from "@useairfoil/connector-kit"; +import { Deferred, Effect, Queue, Stream } from "effect"; + +import type { ShopifyApiClientService } from "./api"; + +const toEpochMillis = (value: unknown): number | undefined => { + if (value instanceof Date) { + return value.getTime(); + } + if (typeof value === "number") { + return Number.isFinite(value) ? value : undefined; + } + if (typeof value === "string") { + const parsed = Date.parse(value); + return Number.isNaN(parsed) ? undefined : parsed; + } + return undefined; +}; + +const isOnOrBeforeCutoff = (value: unknown, cutoff: Cursor): boolean => { + const valueMillis = toEpochMillis(value); + const cutoffMillis = toEpochMillis(cutoff); + if (valueMillis == null || cutoffMillis == null) { + return false; + } + return valueMillis <= cutoffMillis; +}; + +export const resolveCursor = >( + row: T, + cursorField: keyof T & string, +): Effect.Effect => + Effect.try({ + try: () => { + const value = row[cursorField]; + if (typeof value === "string" || typeof value === "number") { + return value; + } + if (value instanceof Date) { + return value; + } + throw new Error(`Unsupported cursor value for field '${cursorField}'`); + }, + catch: (cause) => + new ConnectorError({ + message: "Failed to resolve Shopify cursor", + cause, + }), + }); + +const setCutoff = (deferred: Deferred.Deferred, cursor: Cursor) => + Deferred.succeed(deferred, cursor).pipe(Effect.asVoid); + +// Enqueue a single webhook row after recording its cursor as the backfill +// cutoff. 
This is safe to call many times — Deferred.succeed is idempotent. +export const dispatchEntityWebhook = >(options: { + readonly queue: WebhookStream; + readonly cutoff: Deferred.Deferred; + readonly row: T; + readonly cursor: Cursor; +}): Effect.Effect => + Effect.gen(function* () { + yield* setCutoff(options.cutoff, options.cursor); + yield* Queue.offer(options.queue.queue, { + cursor: options.cursor, + rows: [options.row], + }).pipe(Effect.asVoid); + }); + +// Backfill stream for a single entity. Waits for the cutoff deferred to +// resolve (set by the first live webhook or by initialCutoff), then pages +// through the list endpoint until hasMore is false. +const makeBackfillStream = >(options: { + readonly api: ShopifyApiClientService; + readonly schema: Schema.Decoder; + readonly path: string; + readonly cutoff: Deferred.Deferred; + readonly cursorField: keyof T & string; + readonly limit?: number; +}): Stream.Stream, ConnectorError> => + Stream.fromEffect(Deferred.await(options.cutoff)).pipe( + Stream.flatMap((cutoff) => + makePullStream({ + fetchPage: (cursor: Cursor | undefined) => { + const nextUrl = typeof cursor === "string" ? cursor : undefined; + return options.api + .fetchList(options.schema, options.path, { + limit: options.limit ?? 10, + nextUrl, + }) + .pipe( + Effect.map((response) => { + if (response.items.length === 0) { + return { + cursor: nextUrl ?? options.path, + rows: [], + hasMore: false, + }; + } + + const filtered = response.items.filter((row: T) => + isOnOrBeforeCutoff(row[options.cursorField], cutoff), + ); + + return { + cursor: response.nextUrl ?? nextUrl ?? options.path, + rows: filtered, + hasMore: response.hasMore, + }; + }), + ); + }, + }), + ), + ); + +export type EntityStreams> = { + readonly live: WebhookStream; + readonly cutoff: Deferred.Deferred; + readonly backfill: Stream.Stream, ConnectorError>; +}; + +// Convenience factory: creates the live webhook queue, the cutoff deferred, +// and the backfill stream all at once. 
Callers destructure the result into a +// defineEntity() call. +export const makeEntityStreams = >(options: { + readonly api: ShopifyApiClientService; + readonly schema: Schema.Decoder; + readonly path: string; + readonly cursorField: keyof T & string; + readonly limit?: number; +}): Effect.Effect, ConnectorError> => + Effect.gen(function* () { + const queue = yield* makeWebhookQueue({ capacity: 1024 }); + const cutoff = yield* Deferred.make(); + const backfill = makeBackfillStream({ ...options, cutoff }); + return { live: queue, cutoff, backfill }; + }); diff --git a/connectors/producer-shopify/test/__cassettes__/api.vcr.test.cassette b/connectors/producer-shopify/test/__cassettes__/api.vcr.test.cassette new file mode 100644 index 0000000..4e92148 --- /dev/null +++ b/connectors/producer-shopify/test/__cassettes__/api.vcr.test.cassette @@ -0,0 +1,63 @@ +{ + "exports": { + "default": { + "meta": { + "createdAt": "1970-01-01T00:00:00.000Z", + "version": "1" + }, + "entries": {} + }, + "producer-shopify api (vcr) > replays products list page with VCR": { + "meta": { + "createdAt": "1970-01-01T00:00:00.000Z", + "version": "1" + }, + "entries": { + "{\"body\":\"\",\"headers\":{\"accept\":\"application/json\"},\"method\":\"GET\",\"url\":\"https://nothing-12348377.myshopify.com/admin/api/2026-01/products.json?limit=50\"}": { + "request": { + "method": "GET", + "url": "https://nothing-12348377.myshopify.com/admin/api/2026-01/products.json?limit=50", + "headers": { + "accept": "application/json" + } + }, + "response": { + "status": 200, + "body": "{\"products\":[{\"admin_graphql_api_id\":\"gid://shopify/Product/9204512850170\",\"body_html\":\"
asdasd
\",\"created_at\":\"2026-04-24T02:54:20+05:30\",\"handle\":\"lmao\",\"id\":9204512850170,\"image\":null,\"images\":[],\"options\":[{\"id\":11563331485946,\"name\":\"Title\",\"position\":1,\"product_id\":9204512850170,\"values\":[\"Default Title\"]}],\"product_type\":\"\",\"published_at\":\"2026-04-24T02:54:22+05:30\",\"published_scope\":\"global\",\"status\":\"active\",\"tags\":\"\",\"template_suffix\":\"\",\"title\":\"lmao\",\"updated_at\":\"2026-04-24T02:54:36+05:30\",\"variants\":[{\"admin_graphql_api_id\":\"gid://shopify/ProductVariant/48749871399162\",\"barcode\":\"\",\"compare_at_price\":null,\"created_at\":\"2026-04-24T02:54:20+05:30\",\"fulfillment_service\":\"manual\",\"grams\":0,\"id\":48749871399162,\"image_id\":null,\"inventory_item_id\":50587913224442,\"inventory_management\":\"shopify\",\"inventory_policy\":\"deny\",\"inventory_quantity\":0,\"old_inventory_quantity\":0,\"option1\":\"Default Title\",\"option2\":null,\"option3\":null,\"position\":1,\"price\":\"120.00\",\"product_id\":9204512850170,\"requires_shipping\":true,\"sku\":null,\"taxable\":true,\"title\":\"Default Title\",\"updated_at\":\"2026-04-24T02:54:20+05:30\",\"weight\":0,\"weight_unit\":\"kg\"}],\"vendor\":\"nothing\"},{\"admin_graphql_api_id\":\"gid://shopify/Product/9204601323770\",\"body_html\":\"\",\"created_at\":\"2026-04-24T03:08:54+05:30\",\"handle\":\"poke\",\"id\":9204601323770,\"image\":null,\"images\":[],\"options\":[{\"id\":11563425366266,\"name\":\"Title\",\"position\":1,\"product_id\":9204601323770,\"values\":[\"Default 
Title\"]}],\"product_type\":\"\",\"published_at\":\"2026-04-24T03:08:55+05:30\",\"published_scope\":\"global\",\"status\":\"active\",\"tags\":\"\",\"template_suffix\":\"\",\"title\":\"poke\",\"updated_at\":\"2026-04-24T03:09:06+05:30\",\"variants\":[{\"admin_graphql_api_id\":\"gid://shopify/ProductVariant/48750043758842\",\"barcode\":\"\",\"compare_at_price\":null,\"created_at\":\"2026-04-24T03:08:54+05:30\",\"fulfillment_service\":\"manual\",\"grams\":0,\"id\":48750043758842,\"image_id\":null,\"inventory_item_id\":50588085584122,\"inventory_management\":\"shopify\",\"inventory_policy\":\"deny\",\"inventory_quantity\":0,\"old_inventory_quantity\":0,\"option1\":\"Default Title\",\"option2\":null,\"option3\":null,\"position\":1,\"price\":\"600.00\",\"product_id\":9204601323770,\"requires_shipping\":true,\"sku\":null,\"taxable\":true,\"title\":\"Default Title\",\"updated_at\":\"2026-04-24T03:08:54+05:30\",\"weight\":0,\"weight_unit\":\"kg\"}],\"vendor\":\"nothing\"},{\"admin_graphql_api_id\":\"gid://shopify/Product/9204512653562\",\"body_html\":\"
<p>zxczxc</p>
\",\"created_at\":\"2026-04-24T02:52:36+05:30\",\"handle\":\"something\",\"id\":9204512653562,\"image\":null,\"images\":[],\"options\":[{\"id\":11563331256570,\"name\":\"Title\",\"position\":1,\"product_id\":9204512653562,\"values\":[\"Default Title\"]}],\"product_type\":\"\",\"published_at\":\"2026-04-24T02:52:37+05:30\",\"published_scope\":\"global\",\"status\":\"active\",\"tags\":\"\",\"template_suffix\":\"\",\"title\":\"something\",\"updated_at\":\"2026-04-24T02:52:41+05:30\",\"variants\":[{\"admin_graphql_api_id\":\"gid://shopify/ProductVariant/48749870022906\",\"barcode\":\"\",\"compare_at_price\":null,\"created_at\":\"2026-04-24T02:52:36+05:30\",\"fulfillment_service\":\"manual\",\"grams\":0,\"id\":48749870022906,\"image_id\":null,\"inventory_item_id\":50587911848186,\"inventory_management\":\"shopify\",\"inventory_policy\":\"deny\",\"inventory_quantity\":0,\"old_inventory_quantity\":0,\"option1\":\"Default Title\",\"option2\":null,\"option3\":null,\"position\":1,\"price\":\"200.00\",\"product_id\":9204512653562,\"requires_shipping\":true,\"sku\":null,\"taxable\":true,\"title\":\"Default Title\",\"updated_at\":\"2026-04-24T02:52:36+05:30\",\"weight\":0,\"weight_unit\":\"kg\"}],\"vendor\":\"nothing\"},{\"admin_graphql_api_id\":\"gid://shopify/Product/9203154321658\",\"body_html\":\"
<p>x</p>
\",\"created_at\":\"2026-04-22T22:51:32+05:30\",\"handle\":\"x\",\"id\":9203154321658,\"image\":null,\"images\":[],\"options\":[{\"id\":11561390276858,\"name\":\"Title\",\"position\":1,\"product_id\":9203154321658,\"values\":[\"Default Title\"]}],\"product_type\":\"\",\"published_at\":\"2026-04-22T22:51:34+05:30\",\"published_scope\":\"global\",\"status\":\"active\",\"tags\":\"\",\"template_suffix\":\"\",\"title\":\"x\",\"updated_at\":\"2026-04-22T22:51:46+05:30\",\"variants\":[{\"admin_graphql_api_id\":\"gid://shopify/ProductVariant/48738424062202\",\"barcode\":\"\",\"compare_at_price\":null,\"created_at\":\"2026-04-22T22:51:32+05:30\",\"fulfillment_service\":\"manual\",\"grams\":0,\"id\":48738424062202,\"image_id\":null,\"inventory_item_id\":50576427483386,\"inventory_management\":\"shopify\",\"inventory_policy\":\"deny\",\"inventory_quantity\":0,\"old_inventory_quantity\":0,\"option1\":\"Default Title\",\"option2\":null,\"option3\":null,\"position\":1,\"price\":\"500.00\",\"product_id\":9203154321658,\"requires_shipping\":true,\"sku\":null,\"taxable\":true,\"title\":\"Default Title\",\"updated_at\":\"2026-04-22T22:51:32+05:30\",\"weight\":0,\"weight_unit\":\"kg\"}],\"vendor\":\"nothing\"}]}", + "headers": { + "alt-svc": "h3=\":443\"; ma=86400", + "cf-cache-status": "DYNAMIC", + "cf-ray": "9f15f7b2bc96ff68-BOM", + "connection": "keep-alive", + "content-encoding": "gzip", + "content-security-policy": "default-src 'self' data: blob: 'unsafe-inline' 'unsafe-eval' https://* shopify-pos://*; block-all-mixed-content; child-src 'self' https://* shopify-pos://*; connect-src 'self' wss://* https://*; frame-ancestors 'none'; img-src 'self' data: blob: https:; script-src https://cdn.shopify.com https://cdn.shopifycdn.net https://checkout.pci.shopifyinc.com https://checkout.pci.shopifyinc.com/build/04ed4e1/card_fields.js https://api.stripe.com https://mpsnare.iesnare.com https://appcenter.intuit.com https://www.paypal.com https://js.braintreegateway.com https://c.paypal.com 
https://maps.googleapis.com https://www.google-analytics.com https://v.shopify.com 'self' 'unsafe-inline' 'unsafe-eval'; upgrade-insecure-requests; report-uri /csp-report?source%5Baction%5D=index&source%5Bapp%5D=Shopify&source%5Bcontroller%5D=admin%2Fproducts&source%5Bsection%5D=admin_api&source%5Buuid%5D=bc9a125d-0e05-49d8-b461-0d2e29a91431-1777042721; report-to shopify-csp", + "content-type": "application/json; charset=utf-8", + "date": "Fri, 24 Apr 2026 14:58:42 GMT", + "nel": "{\"success_fraction\":0.01,\"report_to\":\"cf-nel\",\"max_age\":604800}", + "referrer-policy": "origin-when-cross-origin", + "report-to": "{\"endpoints\":[{\"url\":\"https:\\/\\/a.nel.cloudflare.com\\/report\\/v4?s=04ywOUq55uN0Ixsue67Ax%2BxhhZtUMXVbKKSmYnjK97XCVCqoZQW0Q1SU57iLg9xyemkBN2flyFMuAIwGkFjCtT0Ik%2B49nJbPTKapyro3GxoPoxyIN3myqstkhJdxR45y0nPabQO2N43eapnGChdboQ%3D%3D\"}],\"group\":\"cf-nel\",\"max_age\":604800}", + "reporting-endpoints": "shopify-csp=\"/csp-report?source%5Baction%5D=index&source%5Bapp%5D=Shopify&source%5Bcontroller%5D=admin%2Fproducts&source%5Bsection%5D=admin_api&source%5Buuid%5D=bc9a125d-0e05-49d8-b461-0d2e29a91431-1777042721\"", + "server": "cloudflare", + "server-timing": "processing;dur=76, verdict_flag_enabled;desc=\"count=7\";dur=2.521, _y;desc=\"43db08ea-20d1-4664-8e95-782e2e252bc6\", _s;desc=\"36881617-e07c-413e-9cf3-0fbb925f07bf\", cfRequestDuration;dur=386.999846", + "transfer-encoding": "chunked", + "vary": "Accept-Encoding,Sec-Fetch-Site", + "x-content-type-options": "nosniff", + "x-dc": "gcp-asia-southeast1,gcp-us-east1,gcp-us-east1", + "x-download-options": "noopen", + "x-frame-options": "DENY", + "x-permitted-cross-domain-policies": "none", + "x-request-id": "bc9a125d-0e05-49d8-b461-0d2e29a91431-1777042721", + "x-shopify-api-deprecated-reason": "https://shopify.dev/api/admin-rest/latest/resources/product", + "x-shopify-api-version": "2026-01", + "x-shopify-shop-api-call-limit": "1/40", + "x-stats-apiclientid": "352073908225", + 
"x-stats-apipermissionid": "695642423546", + "x-stats-userid": "", + "x-xss-protection": "1; mode=block" + } + } + } + } + } + } +} \ No newline at end of file diff --git a/connectors/producer-shopify/test/api.vcr.test.ts b/connectors/producer-shopify/test/api.vcr.test.ts new file mode 100644 index 0000000..3465500 --- /dev/null +++ b/connectors/producer-shopify/test/api.vcr.test.ts @@ -0,0 +1,71 @@ +import type { VcrEntry, VcrRequest } from "@useairfoil/effect-vcr"; + +import { NodeServices } from "@effect/platform-node"; +import { describe, expect, it } from "@effect/vitest"; +import { FileSystemCassetteStore, VcrHttpClient } from "@useairfoil/effect-vcr"; +import { ConfigProvider, Effect, Layer } from "effect"; +import { FetchHttpClient } from "effect/unstable/http"; + +import { makeShopifyApiClient, ShopifyApiClient } from "../src/api"; +import { ProductSchema, ShopifyConfigConfig } from "../src/index"; + +const normalizeRequestPath = (value: string): string => { + const url = new URL(value); + const pairs = Array.from(url.searchParams.entries()); + pairs.sort((a, b) => a[0].localeCompare(b[0])); + const query = pairs.map(([k, v]) => `${k}=${v}`).join("&"); + return query.length > 0 ? 
`${url.pathname}?${query}` : url.pathname; +}; + +const matchByPathAndMethod = (request: VcrRequest, entry: VcrEntry): boolean => + request.method.toUpperCase() === entry.request.method.toUpperCase() && + normalizeRequestPath(request.url) === normalizeRequestPath(entry.request.url); + +describe("producer-shopify api (vcr)", () => { + it.effect("replays products list page with VCR", () => { + const program = Effect.gen(function* () { + const api = yield* ShopifyApiClient; + const result = yield* api.fetchList(ProductSchema, "/products.json", { + limit: 50, + }); + + expect(result.items.length).toBeGreaterThan(0); + expect(typeof result.hasMore).toBe("boolean"); + }).pipe(Effect.scoped); + + const apiLayer = Layer.effect(ShopifyApiClient)( + Effect.gen(function* () { + const config = yield* ShopifyConfigConfig; + return yield* makeShopifyApiClient(config); + }), + ); + + return program.pipe( + Effect.provide(apiLayer), + Effect.provide( + VcrHttpClient.layer({ + vcrName: "producer-shopify", + mode: "auto", + match: matchByPathAndMethod, + redact: { + requestHeaders: ["x-shopify-access-token", "authorization"], + }, + matchIgnore: { + requestHeaders: ["x-shopify-access-token", "authorization"], + }, + }), + ), + Effect.provide(FileSystemCassetteStore.layer()), + Effect.provide(FetchHttpClient.layer), + Effect.provide(NodeServices.layer), + Effect.provideService( + ConfigProvider.ConfigProvider, + ConfigProvider.fromUnknown({ + SHOPIFY_API_BASE_URL: "https://nothing-12348377.myshopify.com/admin/api/2026-01", + SHOPIFY_API_TOKEN: "test-token", + }), + ), + Effect.scoped, + ); + }); +}); diff --git a/connectors/producer-shopify/test/helpers.ts b/connectors/producer-shopify/test/helpers.ts new file mode 100644 index 0000000..33088b8 --- /dev/null +++ b/connectors/producer-shopify/test/helpers.ts @@ -0,0 +1,34 @@ +import type { Batch } from "@useairfoil/connector-kit"; + +import { Publisher } from "@useairfoil/connector-kit"; +import { Deferred, Effect, Layer, Ref } from 
"effect"; + +export type Published = { + readonly name: string; + readonly source: "live" | "backfill"; + readonly batch: Batch>; +}; + +// Layers a `Publisher` service that captures each publish into a Ref and +// resolves `done` after `expected` batches land. Tests use `Deferred.await(done)` +// to synchronize on ingestion completion. +export const makeTestPublisher = (expected: number) => + Effect.gen(function* () { + const publishedRef = yield* Ref.make>([]); + const done = yield* Deferred.make(); + const layer = Layer.succeed(Publisher)({ + publish: ({ name, source, batch }) => + Effect.gen(function* () { + const next = yield* Ref.updateAndGet(publishedRef, (items) => [ + ...items, + { name, source, batch }, + ]); + if (next.length === expected) { + yield* Deferred.succeed(done, next.length); + } + return { success: true }; + }), + }); + + return { publishedRef, done, layer }; + }); diff --git a/connectors/producer-shopify/test/webhook.test.ts b/connectors/producer-shopify/test/webhook.test.ts new file mode 100644 index 0000000..ca93e12 --- /dev/null +++ b/connectors/producer-shopify/test/webhook.test.ts @@ -0,0 +1,144 @@ +import { NodeHttpServer } from "@effect/platform-node"; +import { describe, expect, it } from "@effect/vitest"; +import { ConnectorError, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit"; +import { ConfigProvider, Deferred, Effect, Layer, Ref } from "effect"; +import { HttpClient, HttpClientRequest } from "effect/unstable/http"; +import { createHmac } from "node:crypto"; + +import { ShopifyApiClient, type ShopifyApiClientService } from "../src/api"; +import { ShopifyConnector, ShopifyConnectorConfig } from "../src/index"; +import { makeTestPublisher } from "./helpers"; + +const webhookSecret = "test-shopify-webhook-secret"; + +const productWebhookPayload = { + id: 1072481062, + admin_graphql_api_id: "gid://shopify/Product/1072481062", + body_html: "Good snowboard!", + created_at: "2026-01-09T19:39:49-05:00", + handle: 
"burton-custom-freestyle-151", + image: null, + images: [], + options: [], + product_type: "Snowboard", + published_at: null, + published_scope: "web", + status: "draft", + tags: "", + template_suffix: "", + title: "Burton Custom Freestyle 151", + updated_at: "2026-01-09T19:39:49-05:00", + variants: [], + vendor: "Burton", +} as const; + +const makeApiStub = (): ShopifyApiClientService => ({ + fetchJson: (_schema) => Effect.fail(new ConnectorError({ message: "Unexpected fetchJson" })), + fetchList: (_schema) => Effect.succeed({ items: [], nextUrl: null, hasMore: false }), +}); + +const signPayload = (rawBody: string): string => + createHmac("sha256", webhookSecret).update(rawBody).digest("base64"); + +describe("producer-shopify webhook", () => { + it.effect("publishes live product webhook batches", () => { + const runtimeLayer = NodeHttpServer.layerTest; + const apiLayer = Layer.succeed(ShopifyApiClient)(makeApiStub()); + + const connectorLayer = ShopifyConnectorConfig().pipe(Layer.provide(apiLayer)); + const configProvider = ConfigProvider.fromUnknown({ + SHOPIFY_API_BASE_URL: "https://your-development-store.myshopify.com/admin/api/2026-01", + SHOPIFY_API_TOKEN: "test-token", + SHOPIFY_WEBHOOK_SECRET: webhookSecret, + }); + + return Effect.gen(function* () { + const { publishedRef, done, layer } = yield* makeTestPublisher(1); + const { connector, routes } = yield* ShopifyConnector; + const runLayer = Layer.mergeAll(StateStoreInMemory, layer, runtimeLayer); + + yield* Effect.gen(function* () { + yield* Effect.forkScoped( + runConnector(connector, { + initialCutoff: new Date(), + webhook: { + routes, + }, + }), + ); + + const rawBody = JSON.stringify(productWebhookPayload); + const signature = signPayload(rawBody); + + const client = yield* HttpClient.HttpClient; + const request = HttpClientRequest.post("/webhooks/shopify").pipe( + HttpClientRequest.setHeader("x-shopify-topic", "products/create"), + HttpClientRequest.setHeader("x-shopify-hmac-sha256", signature), + 
HttpClientRequest.bodyText(rawBody, "application/json"), + ); + const response = yield* client.execute(request); + + expect(response.status).toBe(200); + + yield* Deferred.await(done); + const published = yield* Ref.get(publishedRef); + expect(published.length).toBe(1); + expect(published[0]?.name).toBe("products"); + }).pipe(Effect.provide(runLayer)); + }).pipe( + Effect.provide(connectorLayer), + Effect.provide(runtimeLayer), + Effect.provideService(ConfigProvider.ConfigProvider, configProvider), + Effect.scoped, + ) as Effect.Effect; + }); + + it.effect("rejects invalid webhook signatures", () => { + const runtimeLayer = NodeHttpServer.layerTest; + const apiLayer = Layer.succeed(ShopifyApiClient)(makeApiStub()); + + const connectorLayer = ShopifyConnectorConfig().pipe(Layer.provide(apiLayer)); + const configProvider = ConfigProvider.fromUnknown({ + SHOPIFY_API_BASE_URL: "https://your-development-store.myshopify.com/admin/api/2026-01", + SHOPIFY_API_TOKEN: "test-token", + SHOPIFY_WEBHOOK_SECRET: webhookSecret, + }); + + return Effect.gen(function* () { + const { publishedRef, layer } = yield* makeTestPublisher(1); + const { connector, routes } = yield* ShopifyConnector; + const runLayer = Layer.mergeAll(StateStoreInMemory, layer, runtimeLayer); + + yield* Effect.gen(function* () { + yield* Effect.forkScoped( + runConnector(connector, { + initialCutoff: new Date(), + webhook: { + routes, + }, + }), + ); + + const rawBody = JSON.stringify(productWebhookPayload); + const invalidSignature = signPayload(`${rawBody}-invalid`); + + const client = yield* HttpClient.HttpClient; + const request = HttpClientRequest.post("/webhooks/shopify").pipe( + HttpClientRequest.setHeader("x-shopify-topic", "products/create"), + HttpClientRequest.setHeader("x-shopify-hmac-sha256", invalidSignature), + HttpClientRequest.bodyText(rawBody, "application/json"), + ); + const response = yield* client.execute(request); + + expect(response.status).toBe(500); + const published = yield* 
Ref.get(publishedRef); + expect(published.length).toBe(0); + }).pipe(Effect.provide(runLayer)); + }).pipe( + Effect.provide(connectorLayer), + Effect.provide(runtimeLayer), + Effect.provideService(ConfigProvider.ConfigProvider, configProvider), + Effect.scoped, + ) as Effect.Effect; + }); +}); diff --git a/connectors/producer-shopify/tsconfig.json b/connectors/producer-shopify/tsconfig.json new file mode 100644 index 0000000..df2aa9e --- /dev/null +++ b/connectors/producer-shopify/tsconfig.json @@ -0,0 +1,17 @@ +{ + "extends": "../../tsconfig.json", + "compilerOptions": { + "outDir": "dist", + "declarationDir": "dist", + "rootDir": ".", + "types": ["node"], + "moduleResolution": "bundler", + "esModuleInterop": true, + "verbatimModuleSyntax": true, + "noEmit": true, + "resolveJsonModule": true, + "skipLibCheck": true, + "strict": true + }, + "exclude": ["node_modules", "dist"] +} diff --git a/connectors/producer-shopify/tsdown.config.ts b/connectors/producer-shopify/tsdown.config.ts new file mode 100644 index 0000000..7b434c7 --- /dev/null +++ b/connectors/producer-shopify/tsdown.config.ts @@ -0,0 +1,9 @@ +import { defineConfig } from "tsdown"; + +export default defineConfig({ + entry: ["src/index.ts"], + format: ["esm"], + dts: true, + sourcemap: true, + clean: true, +}); diff --git a/connectors/producer-shopify/vitest.config.ts b/connectors/producer-shopify/vitest.config.ts new file mode 100644 index 0000000..50ca1af --- /dev/null +++ b/connectors/producer-shopify/vitest.config.ts @@ -0,0 +1,9 @@ +import { defineConfig } from "vitest/config"; + +export default defineConfig({ + test: { + fileParallelism: false, + testTimeout: 60_000, + hookTimeout: 60_000, + }, +}); diff --git a/package.json b/package.json index f7abaa8..17171ee 100644 --- a/package.json +++ b/package.json @@ -9,7 +9,8 @@ }, "workspaces": [ "connectors/*", - "packages/*" + "packages/*", + "templates/*" ], "scripts": { "build": "nx run-many --target=build --parallel", @@ -27,6 +28,7 @@ "oxlint": 
"^1.61.0", "typescript": "catalog:" }, + "packageManager": "pnpm@10.33.2+sha512.a90faf6feeab71ad6c6e57f94e0fe1a12f5dcc22cd754db40ae9593eb6a3e0b6b12e3540218bb37ae083404b1f2ce6db2a4121e979829b4aff94b99f49da1cf8", "beachball": { "disallowedChangeTypes": [ "major" diff --git a/packages/connector-kit/README.md b/packages/connector-kit/README.md index 5425ef8..353d61f 100644 --- a/packages/connector-kit/README.md +++ b/packages/connector-kit/README.md @@ -11,20 +11,25 @@ This section is for connector authors who want to build and run a connector. ### Install ```bash -bun add @useairfoil/connector-kit +pnpm add @useairfoil/connector-kit ``` ### Minimal example +This snippet uses Node. Bun is also supported by swapping in Bun's HttpServer +layer. + ```ts -import { Schema, Effect, Layer, Stream } from "effect"; +import { NodeHttpServer } from "@effect/platform-node"; +import { Schema, Effect, Layer, Queue, Stream } from "effect"; +import { createServer } from "node:http"; import { + type WebhookRoute, defineConnector, defineEntity, Publisher, runConnector, StateStoreInMemory, - WebhookServerLayer, makeWebhookQueue, } from "@useairfoil/connector-kit"; @@ -37,6 +42,18 @@ const Customer = Schema.Struct({ const program = Effect.gen(function* () { const webhook = yield* makeWebhookQueue>(); + const routes: ReadonlyArray>> = [ + { + path: "/webhook/customers", + schema: Customer, + handle: (payload) => + Queue.offer(webhook.queue, { + cursor: new Date(), + rows: [payload], + }).pipe(Effect.asVoid), + }, + ]; + const connector = defineConnector({ name: "producer-example", entities: [ @@ -44,37 +61,23 @@ const program = Effect.gen(function* () { name: "customers", schema: Customer, primaryKey: "id", - live: webhook.queue, + live: webhook, backfill: Stream.empty, }), ], events: [], }); - yield* runConnector(connector, new Date()); + yield* runConnector(connector, { + initialCutoff: new Date(), + webhook: { routes }, + }); }).pipe( + Effect.provide(NodeHttpServer.layer(createServer, 
{ port: 8080 })), Effect.provide(StateStoreInMemory), Effect.provide( Layer.succeed(Publisher, { - publish: () => Effect.succeed({ requestId: 1n }), - }), - ), - Effect.provide( - WebhookServerLayer({ - port: 8080, - routes: [ - { - path: "/webhook/customers", - schema: Customer, - dispatch: (payload) => - Effect.succeed([ - { - queue: webhook.queue, - batch: { cursor: new Date(), rows: [payload] }, - }, - ]), - }, - ], + publish: () => Effect.succeed({ success: true }), }), ), ); @@ -92,7 +95,7 @@ Effect.runPromise(program); - `defineEntity` wires live and backfill streams for each entity. - `Publisher` is the output boundary (where batches go). - `StateStore` tracks cursors and backfill state. -- `WebhookServerLayer` turns webhook routes into an HTTP server. +- `runConnector(..., { webhook: { routes } })` wires webhook routes and health endpoint into the HTTP runtime you provide. ### Layers and Effect services @@ -100,24 +103,26 @@ Connector-kit is designed around Effect services and Layers. Your application sh - a `Publisher` Layer - a `StateStore` Layer -- an HTTP server Layer (if you use webhooks) +- an HTTP server Layer (if you pass webhook routes to `runConnector`) - any custom services your connector needs (API clients, Effect Config) +`runConnector` automatically provides connector runtime context for internal +tracing/metrics annotations. + ### Testing with VCR VCR is provided via `@useairfoil/effect-vcr` as an Effect `HttpClient` layer. This keeps HTTP recording out of connector logic. 
```ts -import { FetchHttpClient } from "@effect/platform"; -import { NodeFileSystem } from "@effect/platform-node"; +import { FetchHttpClient } from "effect/unstable/http"; import { FileSystemCassetteStore, VcrHttpClient } from "@useairfoil/effect-vcr"; import { Layer } from "effect"; -const cassetteLayer = CassetteStoreLive.pipe(Layer.provide(NodeFileSystem.layer)); - -const vcrLayer = VcrHttpClientLayer({ - cassetteDir: "cassettes", - cassetteName: "example", +const vcrLayer = VcrHttpClient.layer({ + vcrName: "producer-polar", mode: "auto", -}).pipe(Layer.provide(Layer.mergeAll(FetchHttpClient.layer, cassetteLayer))); +}).pipe( + Layer.provideMerge(FileSystemCassetteStore.layer()), + Layer.provideMerge(FetchHttpClient.layer), +); ``` diff --git a/packages/connector-kit/src/index.ts b/packages/connector-kit/src/index.ts index cf4f1ea..5dade62 100644 --- a/packages/connector-kit/src/index.ts +++ b/packages/connector-kit/src/index.ts @@ -18,10 +18,12 @@ export type { Transform, WebhookStream, } from "./core/types"; +export type { RunConnectorOptions } from "./ingestion/engine"; export { runConnector } from "./ingestion/engine"; export { StateStore, StateStoreInMemory } from "./ingestion/state-store"; export { Publisher } from "./publisher/service"; export { WingsPublisherLayer } from "./publisher/wings"; +export { ConnectorRuntimeContext, ConnectorRuntimeContextLayer } from "./runtime/context"; export { makePullStream } from "./streams/pull-stream"; export { makeWebhookQueue } from "./streams/webhook-queue"; export { buildWebhookRouter } from "./webhook/server"; diff --git a/packages/connector-kit/src/ingestion/engine.ts b/packages/connector-kit/src/ingestion/engine.ts index bd312c1..985781b 100644 --- a/packages/connector-kit/src/ingestion/engine.ts +++ b/packages/connector-kit/src/ingestion/engine.ts @@ -1,4 +1,5 @@ -import { Effect, Queue, Ref, Stream } from "effect"; +import { Effect, Layer, Metric, Queue, Ref, Stream } from "effect"; +import { HttpRouter, type 
HttpServer, HttpServerResponse } from "effect/unstable/http"; import type { ConnectorError } from "../core/errors"; import type { @@ -14,8 +15,11 @@ import type { Transform, WebhookStream, } from "../core/types"; +import type { WebhookRoute } from "../webhook/types"; import { Publisher } from "../publisher/service"; +import { ConnectorRuntimeContext, ConnectorRuntimeContextLayer } from "../runtime/context"; +import { buildWebhookRouter } from "../webhook/server"; import { StateStore } from "./state-store"; type TaggedBatch = { @@ -23,19 +27,100 @@ type TaggedBatch = { readonly batch: Batch; }; -export const runConnector = ( +const connectorBatchesTotal = Metric.counter("connector_batches_total", { + description: "Total batches attempted by connector streams", +}); + +const connectorRowsTotal = Metric.counter("connector_rows_total", { + description: "Total rows attempted by connector streams", +}); + +const connectorBatchSize = Metric.histogram("connector_batch_size", { + description: "Distribution of batch row counts", + boundaries: [1, 5, 10, 25, 50, 100, 250, 500, 1000], +}); + +type RunConnectorBaseOptions = { + readonly initialCutoff?: Cursor; +}; + +export type RunConnectorOptions = RunConnectorBaseOptions & { + readonly webhook?: { + readonly routes: ReadonlyArray>; + readonly healthPath?: HttpRouter.PathInput; + readonly disableHttpLogger?: boolean; + }; +}; + +type RunConnectorNoWebhookOptions = RunConnectorBaseOptions & { + readonly webhook?: undefined; +}; + +type RunConnectorWebhookOptions = RunConnectorOptions & { + readonly webhook: NonNullable["webhook"]>; +}; + +export function runConnector( + connector: ConnectorDefinition, + options?: RunConnectorNoWebhookOptions, +): Effect.Effect; +export function runConnector( + connector: ConnectorDefinition, + options: RunConnectorWebhookOptions, +): Effect.Effect; +export function runConnector( + connector: ConnectorDefinition, + options?: RunConnectorOptions, +) { + return Effect.withSpan( + 
Effect.gen(function* () { + const initialCutoff = options?.initialCutoff ?? new Date(); + const ingestion = runIngestion(connector, initialCutoff); + + if (!options?.webhook) { + return yield* ingestion; + } + + return yield* ingestion.pipe(Effect.provide(makeWebhookServerLayer(options.webhook))); + }).pipe(Effect.provide(ConnectorRuntimeContextLayer(connector))), + "connector.run", + { + attributes: { + "connector.name": connector.name, + "connector.entities.count": connector.entities.length, + "connector.events.count": connector.events.length, + }, + }, + ); +} + +const runIngestion = ( connector: ConnectorDefinition, initialCutoff: Cursor, -): Effect.Effect => - Effect.gen(function* () { - // Start ingestion for every entity and event in parallel. - const entityRuns = connector.entities.map((entity) => runEntity(entity, initialCutoff)); - const eventRuns = connector.events.map((event) => runEvent(event, initialCutoff)); - // main runner - yield* Effect.all([...entityRuns, ...eventRuns], { - concurrency: "unbounded", - }); +): Effect.Effect => { + const entityRuns = connector.entities.map((entity) => runEntity(entity, initialCutoff)); + const eventRuns = connector.events.map((event) => runEvent(event, initialCutoff)); + + return Effect.all([...entityRuns, ...eventRuns], { + concurrency: "unbounded", + }).pipe(Effect.asVoid); +}; + +const makeWebhookServerLayer = (options: { + readonly routes: ReadonlyArray>; + readonly healthPath?: HttpRouter.PathInput; + readonly disableHttpLogger?: boolean; +}): Layer.Layer => { + const healthPath: HttpRouter.PathInput = options.healthPath ?? "/health"; + const app = Layer.mergeAll( + buildWebhookRouter(options.routes), + HttpRouter.add("GET", healthPath, Effect.succeed(HttpServerResponse.text("ok"))), + ); + + return HttpRouter.serve(app, { + disableLogger: options.disableHttpLogger ?? 
true, }); +}; const createInitialState = (cutoff: Cursor): IngestionState => ({ backfill: { cutoff }, @@ -57,7 +142,7 @@ const makeStateRef = ( const runEntity = ( entity: EntityDefinition, initialCutoff: Cursor, -): Effect.Effect => +): Effect.Effect => Effect.gen(function* () { type Row = EntityRow; const stateRef = yield* makeStateRef(entity.name, initialCutoff); @@ -134,7 +219,7 @@ const runEntity = ( const runEvent = ( event: EventDefinition, initialCutoff: Cursor, -): Effect.Effect => +): Effect.Effect => Effect.gen(function* () { type Row = EntityRow; const stateRef = yield* makeStateRef(event.name, initialCutoff); @@ -176,28 +261,61 @@ const processTaggedStream = >( name: string, transform: Transform | undefined, stateRef: Ref.Ref>, -): Effect.Effect => - Stream.runForEach(stream, ({ source, batch }) => - Effect.gen(function* () { - // Optional per-row transformation. - const rows = transform ? yield* Effect.forEach(batch.rows, transform) : batch.rows; - - // Publish before updating cursor state. - const publisher = yield* Publisher; - yield* publisher.publish({ - name, - batch: { - cursor: batch.cursor, - rows, - }, - }); +): Effect.Effect => + Effect.gen(function* () { + const runtime = yield* ConnectorRuntimeContext; + const connectorName = runtime.connector.name; - // Persist state only after publish succeeds. 
- const nextState = yield* Ref.updateAndGet(stateRef, (state) => - updateState(state, source, batch.cursor), - ); + yield* Stream.runForEach(stream, ({ source, batch }) => + Effect.withSpan( + Effect.gen(function* () { + const metric = { + connector: connectorName, + stream: name, + source, + }; - const store = yield* StateStore; - yield* store.setState(name, nextState); - }), - ); + yield* Metric.update(Metric.withAttributes(connectorBatchesTotal, metric), 1); + yield* Metric.update( + Metric.withAttributes(connectorRowsTotal, metric), + batch.rows.length, + ); + yield* Metric.update( + Metric.withAttributes(connectorBatchSize, metric), + batch.rows.length, + ); + + // Optional per-row transformation. + const rows = transform ? yield* Effect.forEach(batch.rows, transform) : batch.rows; + + // Publish before updating cursor state. + const publisher = yield* Publisher; + yield* publisher.publish({ + name, + source, + batch: { + cursor: batch.cursor, + rows, + }, + }); + + // Persist state only after publish succeeds. 
+ const nextState = yield* Ref.updateAndGet(stateRef, (state) => + updateState(state, source, batch.cursor), + ); + + const store = yield* StateStore; + yield* store.setState(name, nextState); + }), + "connector.batch.process", + { + attributes: { + "connector.name": connectorName, + "connector.stream.name": name, + "connector.stream.source": source, + "connector.batch.rows": batch.rows.length, + }, + }, + ), + ); + }); diff --git a/packages/connector-kit/src/publisher/service.ts b/packages/connector-kit/src/publisher/service.ts index 36f6813..9f8c023 100644 --- a/packages/connector-kit/src/publisher/service.ts +++ b/packages/connector-kit/src/publisher/service.ts @@ -14,6 +14,7 @@ export class Publisher extends Context.Service< { readonly publish: (options: { readonly name: string; + readonly source: "live" | "backfill"; readonly batch: Batch>; }) => Effect.Effect; } diff --git a/packages/connector-kit/src/publisher/wings.ts b/packages/connector-kit/src/publisher/wings.ts index 9de8a68..49c1edb 100644 --- a/packages/connector-kit/src/publisher/wings.ts +++ b/packages/connector-kit/src/publisher/wings.ts @@ -81,7 +81,7 @@ export const WingsPublisherLayer = ( } return { - publish: ({ name, batch }) => + publish: ({ name, source: _source, batch }) => Effect.gen(function* () { const entry = entries.get(name); if (!entry) { diff --git a/packages/connector-kit/src/runtime/context.ts b/packages/connector-kit/src/runtime/context.ts new file mode 100644 index 0000000..a0a0742 --- /dev/null +++ b/packages/connector-kit/src/runtime/context.ts @@ -0,0 +1,15 @@ +import { Context, Layer } from "effect"; + +import type { ConnectorDefinition } from "../core/types"; + +export type ConnectorRuntimeContextValue = { + readonly connector: ConnectorDefinition; +}; + +export class ConnectorRuntimeContext extends Context.Service< + ConnectorRuntimeContext, + ConnectorRuntimeContextValue +>()("@useairfoil/connector-kit/ConnectorRuntimeContext") {} + +export const 
ConnectorRuntimeContextLayer = (connector: ConnectorDefinition) => + Layer.succeed(ConnectorRuntimeContext)({ connector }); diff --git a/packages/connector-kit/test/engine.test.ts b/packages/connector-kit/test/engine.test.ts index 7c1847c..230fc76 100644 --- a/packages/connector-kit/test/engine.test.ts +++ b/packages/connector-kit/test/engine.test.ts @@ -22,7 +22,7 @@ const makeTestPublisher = ( expectedCount: number, ) => Layer.succeed(Publisher)({ - publish: ({ batch }) => + publish: ({ source: _source, batch }) => Effect.gen(function* () { const rows = batch.rows as ReadonlyArray; const next = yield* Ref.updateAndGet(publishedRef, (acc) => [...acc, ...rows]); @@ -79,7 +79,7 @@ describe("engine merging logic", () => { const publisherLayer = makeTestPublisher(publishedRef, done, 2); yield* Effect.forkScoped( - runConnector(connector, new Date()).pipe( + runConnector(connector, { initialCutoff: new Date() }).pipe( Effect.provide(StateStoreInMemory), Effect.provide(publisherLayer), ), @@ -143,7 +143,7 @@ describe("engine merging logic", () => { const publisherLayer = makeTestPublisher(publishedRef, done, 3); yield* Effect.forkScoped( - runConnector(connector, new Date()).pipe( + runConnector(connector, { initialCutoff: new Date() }).pipe( Effect.provide(StateStoreInMemory), Effect.provide(publisherLayer), ), @@ -204,7 +204,7 @@ describe("engine merging logic", () => { const publisherLayer = makeTestPublisher(publishedRef, done, 1); yield* Effect.forkScoped( - runConnector(connector, new Date()).pipe( + runConnector(connector, { initialCutoff: new Date() }).pipe( Effect.provide(StateStoreInMemory), Effect.provide(publisherLayer), ), diff --git a/packages/effect-vcr/src/index.ts b/packages/effect-vcr/src/index.ts index e8bb099..317d318 100644 --- a/packages/effect-vcr/src/index.ts +++ b/packages/effect-vcr/src/index.ts @@ -2,3 +2,12 @@ export * as CassetteStore from "./cassette-store"; export * as FileSystemCassetteStore from "./file-system-cassette-store"; export * 
as VcrHttpClient from "./vcr-http-client"; +export type { + Cassette, + CassetteFile, + Configuration, + VcrEntry, + VcrMode, + VcrRequest, + VcrResponse, +} from "./types"; diff --git a/packages/wings/test/cluster-metadata.test.ts b/packages/wings/test/cluster-metadata.test.ts index deb7f7f..72b6dab 100644 --- a/packages/wings/test/cluster-metadata.test.ts +++ b/packages/wings/test/cluster-metadata.test.ts @@ -1,4 +1,4 @@ -import { describe, expect, it } from "@effect/vitest"; +import { describe, expect, layer } from "@effect/vitest"; import { TestWings } from "@useairfoil/wings-testing"; import { Effect, Exit, Layer } from "effect"; import { customAlphabet } from "nanoid"; @@ -18,7 +18,10 @@ const wingsLayer = Layer.effect(WingsClusterMetadata.ClusterMetadata)( }), ); -describe("ClusterMetadata", () => { +// One container is shared across all tests +const testLayer = wingsLayer.pipe(Layer.provide(TestWings.container)); + +layer(testLayer, { timeout: "60 seconds" })("ClusterMetadata", (it) => { describe("Layer Configuration", () => { it.effect("should create layer with direct config", () => Effect.gen(function* () { @@ -28,7 +31,7 @@ describe("ClusterMetadata", () => { expect(result).toHaveProperty("tenants"); expect(Array.isArray(result.tenants)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); }); @@ -42,7 +45,7 @@ describe("ClusterMetadata", () => { }); expect(tenant.name).toBe(`tenants/${tenantId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should get a tenant", () => @@ -56,7 +59,7 @@ describe("ClusterMetadata", () => { }); expect(tenant.name).toBe(`tenants/${tenantId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should list tenants", () => @@ -67,7 +70,7 @@ describe("ClusterMetadata", () => { expect(response).toHaveProperty("tenants"); expect(Array.isArray(response.tenants)).toBe(true); - 
}).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should delete a tenant", () => @@ -87,7 +90,7 @@ describe("ClusterMetadata", () => { ); expect(Exit.isFailure(exit)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should handle tenant not found error", () => @@ -99,7 +102,7 @@ describe("ClusterMetadata", () => { ); expect(Exit.isFailure(exit)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); }); @@ -149,7 +152,7 @@ describe("ClusterMetadata", () => { }); expect(namespace.name).toBe(`tenants/${tenantId}/namespaces/${namespaceId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should get a namespace", () => @@ -200,7 +203,7 @@ describe("ClusterMetadata", () => { }); expect(namespace.name).toBe(`tenants/${tenantId}/namespaces/${namespaceId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should list namespaces", () => @@ -212,7 +215,7 @@ describe("ClusterMetadata", () => { expect(response).toHaveProperty("namespaces"); expect(Array.isArray(response.namespaces)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); }); @@ -236,7 +239,7 @@ describe("ClusterMetadata", () => { }); expect(topic.name).toBe(`tenants/default/namespaces/default/topics/${topicId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should get a topic", () => @@ -260,7 +263,7 @@ describe("ClusterMetadata", () => { expect(topic.name).toBe(`tenants/default/namespaces/default/topics/${topicId}`); expect(topic.schema.fields.length).toBeGreaterThan(0); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should list topics", () => @@ -272,7 +275,7 @@ describe("ClusterMetadata", () => { 
expect(response).toHaveProperty("topics"); expect(Array.isArray(response.topics)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should delete a topic", () => @@ -302,7 +305,7 @@ describe("ClusterMetadata", () => { ); expect(Exit.isFailure(exit)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); }); @@ -331,7 +334,7 @@ describe("ClusterMetadata", () => { }); expect(objectStore.name).toBe(`tenants/${tenantId}/object-stores/${objectStoreId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should get an object store", () => @@ -362,7 +365,7 @@ describe("ClusterMetadata", () => { }); expect(objectStore.name).toBe(`tenants/${tenantId}/object-stores/${objectStoreId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should list object stores", () => @@ -374,7 +377,7 @@ describe("ClusterMetadata", () => { expect(response).toHaveProperty("objectStores"); expect(Array.isArray(response.objectStores)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); }); @@ -396,7 +399,7 @@ describe("ClusterMetadata", () => { }); expect(dataLake.name).toBe(`tenants/${tenantId}/data-lakes/${dataLakeId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should create an Iceberg data lake", () => @@ -416,7 +419,7 @@ describe("ClusterMetadata", () => { }); expect(dataLake.name).toBe(`tenants/${tenantId}/data-lakes/${dataLakeId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should get a data lake", () => @@ -440,7 +443,7 @@ describe("ClusterMetadata", () => { }); expect(dataLake.name).toBe(`tenants/${tenantId}/data-lakes/${dataLakeId}`); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); it.effect("should list data 
lakes", () => @@ -452,13 +455,13 @@ describe("ClusterMetadata", () => { expect(response).toHaveProperty("dataLakes"); expect(Array.isArray(response.dataLakes)).toBe(true); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); }); describe("Error Handling", () => { it.effect("should handle connection errors gracefully", () => { - const layer = WingsClusterMetadata.layer({ + const errorLayer = WingsClusterMetadata.layer({ host: "localhost:9999", // Non-existent port }); return Effect.gen(function* () { @@ -469,7 +472,7 @@ describe("ClusterMetadata", () => { ); expect(Exit.isFailure(exit)).toBe(true); - }).pipe(Effect.provide(layer)); + }).pipe(Effect.provide(errorLayer)); }); it.effect("should catch ClusterMetadataError with Effect.catchTag", () => @@ -483,7 +486,7 @@ describe("ClusterMetadata", () => { ); expect(result).toHaveProperty("error"); - }).pipe(Effect.provide(wingsLayer), Effect.provide(TestWings.container)), + }), ); }); diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 9b01ee0..8bfad80 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -84,6 +84,43 @@ importers: specifier: 'catalog:' version: 4.0.0-beta.54 devDependencies: + '@dotenvx/dotenvx': + specifier: ^1.62.0 + version: 1.62.0 + '@effect/vitest': + specifier: 'catalog:' + version: 4.0.0-beta.54(effect@4.0.0-beta.54)(vitest@3.2.4(@types/node@24.12.2)(jiti@2.6.1)(tsx@4.21.0)(yaml@2.8.3)) + '@types/node': + specifier: 'catalog:' + version: 24.12.2 + '@useairfoil/effect-vcr': + specifier: workspace:* + version: link:../../packages/effect-vcr + tsdown: + specifier: 'catalog:' + version: 0.15.12(@emnapi/core@1.10.0)(@emnapi/runtime@1.10.0)(typescript@5.9.3) + tsx: + specifier: ^4.21.0 + version: 4.21.0 + vitest: + specifier: 'catalog:' + version: 3.2.4(@types/node@24.12.2)(jiti@2.6.1)(tsx@4.21.0)(yaml@2.8.3) + + connectors/producer-shopify: + dependencies: + '@effect/platform-node': + specifier: 'catalog:' + version: 4.0.0-beta.54(effect@4.0.0-beta.54)(ioredis@5.10.1) + 
'@useairfoil/connector-kit': + specifier: workspace:* + version: link:../../packages/connector-kit + effect: + specifier: 'catalog:' + version: 4.0.0-beta.54 + devDependencies: + '@dotenvx/dotenvx': + specifier: ^1.62.0 + version: 1.62.0 '@effect/vitest': specifier: 'catalog:' version: 4.0.0-beta.54(effect@4.0.0-beta.54)(vitest@3.2.4(@types/node@24.12.2)(jiti@2.6.1)(tsx@4.21.0)(yaml@2.8.3)) @@ -222,6 +259,10 @@ importers: nice-grpc-common: specifier: 'catalog:' version: 2.0.3 + optionalDependencies: + testcontainers: + specifier: 'catalog:' + version: 11.14.0 devDependencies: '@bufbuild/buf': specifier: 'catalog:' @@ -244,10 +285,6 @@ importers: vitest: specifier: 'catalog:' version: 3.2.4(@types/node@24.12.2)(jiti@2.6.1)(tsx@4.21.0)(yaml@2.8.3) - optionalDependencies: - testcontainers: - specifier: 'catalog:' - version: 11.14.0 packages/wings: dependencies: @@ -338,6 +375,40 @@ importers: specifier: 'catalog:' version: 3.2.4(@types/node@24.12.2)(jiti@2.6.1)(tsx@4.21.0)(yaml@2.8.3) + templates/producer-template: + dependencies: + '@effect/platform-node': + specifier: 'catalog:' + version: 4.0.0-beta.54(effect@4.0.0-beta.54)(ioredis@5.10.1) + '@useairfoil/connector-kit': + specifier: workspace:* + version: link:../../packages/connector-kit + effect: + specifier: 'catalog:' + version: 4.0.0-beta.54 + devDependencies: + '@dotenvx/dotenvx': + specifier: ^1.62.0 + version: 1.62.0 + '@effect/vitest': + specifier: 'catalog:' + version: 4.0.0-beta.54(effect@4.0.0-beta.54)(vitest@3.2.4(@types/node@24.12.2)(jiti@2.6.1)(tsx@4.21.0)(yaml@2.8.3)) + '@types/node': + specifier: 'catalog:' + version: 24.12.2 + '@useairfoil/effect-vcr': + specifier: workspace:* + version: link:../../packages/effect-vcr + tsdown: + specifier: 'catalog:' + version: 0.15.12(@emnapi/core@1.10.0)(@emnapi/runtime@1.10.0)(typescript@5.9.3) + tsx: + specifier: ^4.21.0 + version: 4.21.0 + vitest: + specifier: 'catalog:' + version: 3.2.4(@types/node@24.12.2)(jiti@2.6.1)(tsx@4.21.0)(yaml@2.8.3) + packages: 
'@babel/code-frame@7.29.0': @@ -424,6 +495,16 @@ packages: '@clack/prompts@0.11.0': resolution: {integrity: sha512-pMN5FcrEw9hUkZA4f+zLlzivQSeQf5dRGJjSUbvVYDLvpKCdQx5OaknvKzgbtXOizhP+SJJJjqEbOe55uKKfAw==} + '@dotenvx/dotenvx@1.62.0': + resolution: {integrity: sha512-dHMoiNqIyLnDxbsy16Zr55qN6a52dyocvOiVV4+ptjRIWNrBItbCNjazcv+hwKZGa7+WSKDHLTlyxzpK5yhxaQ==} + hasBin: true + + '@ecies/ciphers@0.2.6': + resolution: {integrity: sha512-patgsRPKGkhhoBjETV4XxD0En4ui5fbX0hzayqI3M8tvNMGUoUvmyYAIWwlxBc1KX5cturfqByYdj5bYGRpN9g==} + engines: {bun: '>=1', deno: '>=2.7.10', node: '>=16'} + peerDependencies: + '@noble/ciphers': ^1.0.0 + '@effect/platform-node-shared@4.0.0-beta.54': resolution: {integrity: sha512-aSHimEVevmgjm1GhvSmEM5xb8D7bUohHcf4NTvf+WqCQbt3Y58kT8xdD93EbpxZDkteY0DraCbUrtxx3cRBQ3Q==} engines: {node: '>=18.0.0'} @@ -699,6 +780,18 @@ packages: '@emnapi/core': ^1.7.1 '@emnapi/runtime': ^1.7.1 + '@noble/ciphers@1.3.0': + resolution: {integrity: sha512-2I0gnIVPtfnMw9ee9h1dJG7tp81+8Ob3OJb3Mv37rx5L40/b0i7djjCVvGOVqc9AEIQyvyu1i6ypKdFw8R8gQw==} + engines: {node: ^14.21.3 || >=16} + + '@noble/curves@1.9.7': + resolution: {integrity: sha512-gbKGcRUYIjA3/zCCNaWDciTMFI0dCkvou3TL8Zmy5Nc7sJ47a0jtOeZoTaMxkuqRo9cRhjOdZJXegxYE5FN/xw==} + engines: {node: ^14.21.3 || >=16} + + '@noble/hashes@1.8.0': + resolution: {integrity: sha512-jCs9ldd7NwzpgXDIf6P3+NrHh9/sD6CQdxHyjQI+h/6rDNo88ypBxxz45UDuZHz9r3tNz7N/VInSVoVdtXEI4A==} + engines: {node: ^14.21.3 || >=16} + '@nodelib/fs.scandir@2.1.5': resolution: {integrity: sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==} engines: {node: '>= 8'} @@ -1629,6 +1722,10 @@ packages: resolution: {integrity: sha512-85UdvzTNx/+s5CkSgBm/0hzP80RFHAa7PsfeADE5ezZF3uHz3/Tqj9gIKGT9PTtpycc3Ua64T0oVulGfKxzfqg==} engines: {node: '>=12.20.0'} + commander@11.1.0: + resolution: {integrity: sha512-yPVavfyCcRhmorC7rWlkHn15b4wDVgVmBA7kV4QVBsF7kv/9TKJAbAXVTxvTnwP8HHKjRCJDClKbciiYS7p0DQ==} + engines: {node: '>=16'} + 
compress-commons@6.0.2: resolution: {integrity: sha512-6FqVXeETqWPoGcfzrXb37E50NP0LXT8kAMu5ooZayhWWdgEY4lBEEcbQNXtkuKQsGduxiIcI4gOTsxTmuq/bSg==} engines: {node: '>= 14'} @@ -1735,6 +1832,10 @@ packages: resolution: {integrity: sha512-47qPchRCykZC03FhkYAhrvwU4xDBFIj1QPqaarj6mdM/hgUzfPHcpkHJOn3mJAufFeeAxAzeGsr5X0M4k6fLZQ==} engines: {node: '>=12'} + dotenv@17.4.2: + resolution: {integrity: sha512-nI4U3TottKAcAD9LLud4Cb7b2QztQMUEfHbvhTH09bqXTxnSie8WnjPALV/WMCrJZ6UV/qHJ6L03OqO3LcdYZw==} + engines: {node: '>=12'} + dts-resolver@2.1.3: resolution: {integrity: sha512-bihc7jPC90VrosXNzK0LTE2cuLP6jr0Ro8jk+kMugHReJVLIpHz/xadeq3MhuwyO4TD4OA3L1Q8pBBFRc08Tsw==} engines: {node: '>=20.19.0'} @@ -1751,6 +1852,10 @@ packages: eastasianwidth@0.2.0: resolution: {integrity: sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==} + eciesjs@0.4.18: + resolution: {integrity: sha512-wG99Zcfcys9fZux7Cft8BAX/YrOJLJSZ3jyYPfhZHqN2E+Ffx+QXBDsv3gubEgPtV6dTzJMSQUwk1H98/t/0wQ==} + engines: {bun: '>=1', deno: '>=2', node: '>=16'} + effect@4.0.0-beta.54: resolution: {integrity: sha512-hbnBnyGt2bftwlZPtjXnFfnCQwDYCEYbHYiNs5MvYL+NgnQvAhjZugKaTxZXul2tT8RaNQZyqXX0XBhUIWkcFw==} @@ -2003,6 +2108,10 @@ packages: ieee754@1.2.1: resolution: {integrity: sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==} + ignore@5.3.2: + resolution: {integrity: sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g==} + engines: {node: '>= 4'} + ignore@7.0.5: resolution: {integrity: sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg==} engines: {node: '>= 4'} @@ -2074,6 +2183,10 @@ packages: isexe@2.0.0: resolution: {integrity: sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==} + isexe@3.1.5: + resolution: {integrity: sha512-6B3tLtFqtQS4ekarvLVMZ+X+VlvQekbe4taUkf/rhVO3d/h0M2rfARm/pXLcPEsjjMsFgrFgSrhQIxcSVrBz8w==} 
+ engines: {node: '>=18'} + iter-ops@3.5.0: resolution: {integrity: sha512-0/DU67w30nLAyyYETTAVOwtts871N6jjQEY9NAzSXqtlAGuo8CX6O+/Hx11eRZIchACaAMUn6alEndINx27j/Q==} engines: {node: '>=18'} @@ -2301,6 +2414,10 @@ packages: resolution: {integrity: sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA==} engines: {node: '>= 0.4'} + object-treeify@1.1.33: + resolution: {integrity: sha512-EFVjAYfzWqWsBMRHPMAXLCDIJnpMhdWAqR7xG6M6a2cs6PMFpl/+Z20w9zDW4vkxOFfddegBKq9Rehd0bxWE7A==} + engines: {node: '>= 10'} + obug@2.1.1: resolution: {integrity: sha512-uTqF9MuPraAQ+IsnPf366RG4cP9RtUi7MLO1N3KEc+wb0a6yKpeL0lmk2IB1jY5KHPAlTc6T/JRdC/YqxHNwkQ==} @@ -2877,6 +2994,11 @@ packages: engines: {node: '>= 8'} hasBin: true + which@4.0.0: + resolution: {integrity: sha512-GlaYyEb07DPxYCKhKzplCWBJtvxZcZMrL+4UkrTSJHHPyZU4mYYTv3qaOe77H7EODLSSopAUFAc6W8U4yqvscg==} + engines: {node: ^16.13.0 || >=18.0.0} + hasBin: true + why-is-node-running@2.3.0: resolution: {integrity: sha512-hUrmaWBdVDcxvYqnyh09zunKzROWjbZTiNy8dBEjkS7ehEDQibXJ7XvlmtbwuTclUiIyN+CyXQD4Vmko8fNm8w==} engines: {node: '>=8'} @@ -2933,6 +3055,14 @@ packages: resolution: {integrity: sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==} engines: {node: '>=10'} + yocto-spinner@1.1.0: + resolution: {integrity: sha512-/BY0AUXnS7IKO354uLLA2eRcWiqDifEbd6unXCsOxkFDAkhgUL3PH9X2bFoaU0YchnDXsF+iKleeTLJGckbXfA==} + engines: {node: '>=18.19'} + + yoctocolors@2.1.2: + resolution: {integrity: sha512-CzhO+pFNo8ajLM2d2IW/R93ipy99LWjtwblvC1RsoSUMZgyLbYFr221TnSNT7GjGdYui6P459mw9JH/g/zW2ug==} + engines: {node: '>=18'} + zip-stream@6.0.1: resolution: {integrity: sha512-zK7YHHz4ZXpW89AHXUPbQVGKI7uvkd3hzusTdotCg1UxyaVtg0zFJSTfW/Dq5f7OBBVnq6cZIaC8Ti4hb6dtCA==} engines: {node: '>= 14'} @@ -3015,6 +3145,23 @@ snapshots: picocolors: 1.1.1 sisteransi: 1.0.5 + '@dotenvx/dotenvx@1.62.0': + dependencies: + commander: 11.1.0 + dotenv: 17.4.2 + eciesjs: 0.4.18 + execa: 5.1.1 + fdir: 
6.5.0(picomatch@4.0.4) + ignore: 5.3.2 + object-treeify: 1.1.33 + picomatch: 4.0.4 + which: 4.0.0 + yocto-spinner: 1.1.0 + + '@ecies/ciphers@0.2.6(@noble/ciphers@1.3.0)': + dependencies: + '@noble/ciphers': 1.3.0 + '@effect/platform-node-shared@4.0.0-beta.54(effect@4.0.0-beta.54)': dependencies: '@types/ws': 8.18.1 @@ -3222,6 +3369,14 @@ snapshots: '@tybys/wasm-util': 0.10.1 optional: true + '@noble/ciphers@1.3.0': {} + + '@noble/curves@1.9.7': + dependencies: + '@noble/hashes': 1.8.0 + + '@noble/hashes@1.8.0': {} + '@nodelib/fs.scandir@2.1.5': dependencies: '@nodelib/fs.stat': 2.0.5 @@ -3948,6 +4103,8 @@ snapshots: table-layout: 4.1.1 typical: 7.3.0 + commander@11.1.0: {} + compress-commons@6.0.2: dependencies: crc-32: 1.2.2 @@ -4054,6 +4211,8 @@ snapshots: dotenv@16.4.7: {} + dotenv@17.4.2: {} + dts-resolver@2.1.3: {} dunder-proto@1.0.1: @@ -4064,6 +4223,13 @@ snapshots: eastasianwidth@0.2.0: {} + eciesjs@0.4.18: + dependencies: + '@ecies/ciphers': 0.2.6(@noble/ciphers@1.3.0) + '@noble/ciphers': 1.3.0 + '@noble/curves': 1.9.7 + '@noble/hashes': 1.8.0 + effect@4.0.0-beta.54: dependencies: '@standard-schema/spec': 1.1.0 @@ -4323,6 +4489,8 @@ snapshots: ieee754@1.2.1: {} + ignore@5.3.2: {} + ignore@7.0.5: {} import-fresh@3.3.1: @@ -4382,6 +4550,8 @@ snapshots: isexe@2.0.0: {} + isexe@3.1.5: {} + iter-ops@3.5.0: {} jackspeak@3.4.3: @@ -4616,6 +4786,8 @@ snapshots: object-keys@1.1.1: {} + object-treeify@1.1.33: {} + obug@2.1.1: {} once@1.4.0: @@ -5316,6 +5488,10 @@ snapshots: dependencies: isexe: 2.0.0 + which@4.0.0: + dependencies: + isexe: 3.1.5 + why-is-node-running@2.3.0: dependencies: siginfo: 2.0.0 @@ -5366,6 +5542,12 @@ snapshots: yocto-queue@0.1.0: {} + yocto-spinner@1.1.0: + dependencies: + yoctocolors: 2.1.2 + + yoctocolors@2.1.2: {} + zip-stream@6.0.1: dependencies: archiver-utils: 5.0.2 diff --git a/pnpm-workspace.yaml b/pnpm-workspace.yaml index 3396031..5e097d4 100644 --- a/pnpm-workspace.yaml +++ b/pnpm-workspace.yaml @@ -1,6 +1,7 @@ packages: - 
"connectors/*"
   - "packages/*"
+  - "templates/*"
 
 catalog:
   effect: "4.0.0-beta.54"
diff --git a/templates/producer-template/.env.example b/templates/producer-template/.env.example
new file mode 100644
index 0000000..d25fc02
--- /dev/null
+++ b/templates/producer-template/.env.example
@@ -0,0 +1,10 @@
+# Template connector configuration (JSONPlaceholder).
+#
+# JSONPlaceholder is a free public REST API that does NOT require auth.
+# We still thread an API token through the Config so you can see exactly
+# where credentials plug in when you rename the template for a real API.
+
+TEMPLATE_API_BASE_URL=https://jsonplaceholder.typicode.com
+TEMPLATE_API_TOKEN=anonymous
+# TEMPLATE_WEBHOOK_SECRET= # only needed when verifying inbound webhooks
+TEMPLATE_WEBHOOK_PORT=8080
diff --git a/templates/producer-template/README.md b/templates/producer-template/README.md
new file mode 100644
index 0000000..6ac0ded
--- /dev/null
+++ b/templates/producer-template/README.md
@@ -0,0 +1,73 @@
+# producer-template
+
+A minimal, **buildable**, **CI-verified** Airfoil Connector Kit (ACK) connector template.
+It targets [JSONPlaceholder](https://jsonplaceholder.typicode.com) (a free public
+REST API) so the template can be compiled, typechecked, and tested without any
+external credentials or sandbox setup.
+
+Use it as the starting point for any new producer connector. See
+[`.agents/skills/airfoil-kit/SKILL.md`](../../.agents/skills/airfoil-kit/SKILL.md)
+for the end-to-end playbook.
+
+---
+
+## What this template demonstrates
+
+- `defineConnector` with a single entity (`posts`).
+- `defineEntity` with a paginated backfill stream and a live webhook stream.
+- A small Effect `HttpClient`-based API client (bearer-token stubbed).
+- Effect v4 `Config` composition for credentials, base URL, webhook port,
+  and webhook secret (optional).
+- A `WebhookRoute` with `Schema`-validated payload and optional raw-body
+  signature verification hook.
+- VCR tests: one recorded cassette for the backfill happy path + one in-memory
+  webhook test using `NodeHttpServer.layerTest`.
+- `sandbox.ts` runner using `NodeHttpServer` (or Bun equivalent), `FetchHttpClient`, an in-memory
+  `StateStore`, a console `Publisher`, and optional OTLP telemetry.
+
+## Files
+
+```
+src/
+├── schemas.ts   - entity + webhook payload schemas (Effect Schema)
+├── api.ts       - HttpClient-based API service
+├── streams.ts   - backfill + live stream helpers
+├── connector.ts - defineConnector wiring + webhook route
+├── sandbox.ts   - local dev runner (Node example, Bun-compatible)
+└── index.ts     - public exports
+
+test/
+├── helpers.ts      - test publisher layer
+├── api.vcr.test.ts - VCR replay of the backfill path
+└── webhook.test.ts - in-memory webhook round trip
+```
+
+## Using the template
+
+This package is meant to be **copied**, not installed. The agent workflow is:
+
+1. `cp -R templates/producer-template connectors/producer-<name>`
+2. Replace `TEMPLATE_` / `template` identifiers with your service name.
+3. Replace the JSONPlaceholder endpoint / schemas with real API calls.
+4. Re-record VCR cassettes against the real sandbox.
+5. Run `pnpm run lint && pnpm run typecheck && pnpm run build && pnpm run test:ci`
+   from the repo root.
+
+See [`.agents/skills/airfoil-kit/assets/rename-checklist.md`](../../.agents/skills/airfoil-kit/assets/rename-checklist.md)
+for the exact search-and-replace list.
+
+## Local development
+
+```bash
+cd templates/producer-template
+cp .env.example .env
+pnpm run sandbox # starts the webhook server on :8080
+```
+
+## Scripts
+
+- `pnpm run build` — bundle `src/` via `tsdown`.
+- `pnpm run test` — vitest (the template tests do not require `.env`).
+- `pnpm run test:ci` — vitest `run` mode.
+- `pnpm run typecheck` — `tsc --noEmit`.
+- `pnpm run sandbox` — local end-to-end runner.
diff --git a/templates/producer-template/package.json b/templates/producer-template/package.json
new file mode 100644
index 0000000..c16a102
--- /dev/null
+++ b/templates/producer-template/package.json
@@ -0,0 +1,41 @@
+{
+  "name": "@useairfoil/producer-template",
+  "version": "0.0.1",
+  "private": true,
+  "files": [
+    "dist",
+    "src",
+    "README.md"
+  ],
+  "type": "module",
+  "main": "./dist/index.js",
+  "types": "./dist/index.d.ts",
+  "exports": {
+    ".": {
+      "types": "./dist/index.d.ts",
+      "import": "./dist/index.js",
+      "default": "./dist/index.js"
+    }
+  },
+  "scripts": {
+    "build": "tsdown",
+    "sandbox": "tsx --env-file=.env src/sandbox.ts",
+    "test": "dotenvx run --ignore=MISSING_ENV_FILE --quiet -- vitest",
+    "test:ci": "dotenvx run --ignore=MISSING_ENV_FILE --quiet -- vitest run",
+    "typecheck": "tsc --noEmit"
+  },
+  "dependencies": {
+    "@effect/platform-node": "catalog:",
+    "@useairfoil/connector-kit": "workspace:*",
+    "effect": "catalog:"
+  },
+  "devDependencies": {
+    "@dotenvx/dotenvx": "^1.62.0",
+    "@effect/vitest": "catalog:",
+    "@types/node": "catalog:",
+    "@useairfoil/effect-vcr": "workspace:*",
+    "tsdown": "catalog:",
+    "tsx": "^4.21.0",
+    "vitest": "catalog:"
+  }
+}
diff --git a/templates/producer-template/src/api.ts b/templates/producer-template/src/api.ts
new file mode 100644
index 0000000..e0479f0
--- /dev/null
+++ b/templates/producer-template/src/api.ts
@@ -0,0 +1,101 @@
+import { ConnectorError } from "@useairfoil/connector-kit";
+import { Context, Effect, Layer, Schema } from "effect";
+import { HttpClient, HttpClientRequest, HttpClientResponse } from "effect/unstable/http";
+
+import type { TemplateConfig } from "./connector";
+
+// Page of rows returned by the list helper. When porting to a real API, prefer
+// returning whatever the API returns (total count, next token, link header)
+// and handle pagination inside the stream layer rather than here.
+export type TemplateListPage<Row>
= {
+  readonly items: ReadonlyArray<Row>;
+  readonly hasMore: boolean;
+};
+
+export type TemplateApiClientService = {
+  readonly fetchJson: <A, R>(
+    schema: Schema.Decoder<A, R>,
+    path: string,
+    params?: Record<string, string>,
+  ) => Effect.Effect<A, ConnectorError, R>;
+  readonly fetchList: <Row, R>(
+    schema: Schema.Decoder<Row, R>,
+    path: string,
+    options: {
+      readonly page: number;
+      readonly limit: number;
+    },
+  ) => Effect.Effect<TemplateListPage<Row>, ConnectorError, R>;
+};
+
+export class TemplateApiClient extends Context.Service<
+  TemplateApiClient,
+  TemplateApiClientService
+>()("@useairfoil/producer-template/TemplateApiClient") {}
+
+// Factory that resolves an HttpClient via the layer it is provided into and
+// returns a small typed API surface. The auth header is Bearer by default;
+// swap it out for `setHeader("X-Api-Key", ...)`, Basic auth, or OAuth2 as
+// required by your upstream API.
+export const makeTemplateApiClient = (
+  config: TemplateConfig,
+): Effect.Effect<TemplateApiClientService, never, HttpClient.HttpClient> =>
+  Effect.gen(function* () {
+    const client = (yield* HttpClient.HttpClient).pipe(
+      HttpClient.mapRequest(HttpClientRequest.prependUrl(config.apiBaseUrl)),
+      HttpClient.mapRequest(HttpClientRequest.bearerToken(config.apiToken)),
+      HttpClient.mapRequest(HttpClientRequest.acceptJson),
+    );
+
+    const fetchJson = <A, R>(
+      schema: Schema.Decoder<A, R>,
+      path: string,
+      params?: Record<string, string>,
+    ): Effect.Effect<A, ConnectorError, R> => {
+      const request = params
+        ?
HttpClientRequest.get(path).pipe(HttpClientRequest.setUrlParams(params))
+        : HttpClientRequest.get(path);
+      return Effect.scoped(
+        client.execute(request).pipe(
+          Effect.flatMap(HttpClientResponse.filterStatusOk),
+          Effect.flatMap((response) => response.json),
+          Effect.flatMap(Schema.decodeUnknownEffect(schema)),
+          Effect.mapError(
+            (error) =>
+              new ConnectorError({
+                message: "Template API request failed",
+                cause: error,
+              }),
+          ),
+        ),
+      );
+    };
+
+    const fetchList = <Row, R>(
+      schema: Schema.Decoder<Row, R>,
+      path: string,
+      options: {
+        readonly page: number;
+        readonly limit: number;
+      },
+    ): Effect.Effect<TemplateListPage<Row>, ConnectorError, R> => {
+      const params: Record<string, string> = {
+        _page: String(options.page),
+        _limit: String(options.limit),
+      };
+      const arraySchema = Schema.Array(schema) as unknown as Schema.Decoder<ReadonlyArray<Row>, R>;
+      return fetchJson(arraySchema, path, params).pipe(
+        Effect.map((items) => ({
+          items,
+          hasMore: items.length === options.limit,
+        })),
+      );
+    };
+
+    return { fetchJson, fetchList };
+  });
+
+export const TemplateApiClientConfig = (
+  config: TemplateConfig,
+): Layer.Layer<TemplateApiClient, never, HttpClient.HttpClient> =>
+  Layer.effect(TemplateApiClient)(makeTemplateApiClient(config));
diff --git a/templates/producer-template/src/connector.ts b/templates/producer-template/src/connector.ts
new file mode 100644
index 0000000..980e398
--- /dev/null
+++ b/templates/producer-template/src/connector.ts
@@ -0,0 +1,173 @@
+import type { HttpClient } from "effect/unstable/http";
+
+import {
+  type ConnectorDefinition,
+  ConnectorError,
+  defineConnector,
+  defineEntity,
+  type WebhookRoute,
+} from "@useairfoil/connector-kit";
+import { Config, Context, Effect, Layer, Option } from "effect";
+
+import { TemplateApiClient, TemplateApiClientConfig } from "./api";
+import { type Post, PostSchema, type WebhookPayload, WebhookPayloadSchema } from "./schemas";
+import {
+  dispatchEntityWebhook,
+  type EntityStreams,
+  makeEntityStreams,
+  resolveCursor,
+} from "./streams";
+
+export type TemplateConfig = {
+  readonly apiBaseUrl:
string;
+  readonly apiToken: string;
+  readonly webhookSecret: Option.Option<string>;
+};
+
+export type TemplateConnectorRuntime = {
+  readonly connector: ConnectorDefinition;
+  readonly routes: ReadonlyArray<WebhookRoute<WebhookPayload>>;
+};
+
+export class TemplateConnector extends Context.Service<
+  TemplateConnector,
+  TemplateConnectorRuntime
+>()("@useairfoil/producer-template/TemplateConnector") {}
+
+// Effect Config surface. Callers supply a ConfigProvider (fromEnv, fromUnknown,
+// or layered) and this struct is decoded from it at runtime.
+export const TemplateConfigConfig = Config.all({
+  apiBaseUrl: Config.string("TEMPLATE_API_BASE_URL").pipe(
+    Config.withDefault("https://jsonplaceholder.typicode.com"),
+  ),
+  apiToken: Config.string("TEMPLATE_API_TOKEN").pipe(Config.withDefault("anonymous")),
+  webhookSecret: Config.option(Config.string("TEMPLATE_WEBHOOK_SECRET")),
+});
+
+// Replace this stub with the real verification for the upstream service (e.g.
+// Stripe, Shopify, GitHub HMAC-SHA256 variants). The signature MUST be
+// computed against the raw request body, not a re-serialized JSON string.
+const verifyWebhookSignature = (_options: {
+  readonly rawBody: Uint8Array;
+  readonly signature: string | null;
+  readonly secret: string;
+}): Effect.Effect<void, ConnectorError> =>
+  // Template intentionally accepts everything. When you port this to a real
+  // service, compare the header against the HMAC of `rawBody` using the
+  // shared secret and fail with a ConnectorError on mismatch.
+  Effect.void;
+
+const resolveWebhookDispatch = (options: {
+  readonly payload: WebhookPayload;
+  readonly posts: EntityStreams<Post>;
+}) => {
+  const { payload } = options;
+  switch (payload.type) {
+    case "post.created":
+    case "post.updated": {
+      return Effect.logInfo(`webhook ${payload.type}`).pipe(
+        Effect.annotateLogs({ id: payload.data.id }),
+        Effect.andThen(
+          resolveCursor(payload.data, "id").pipe(
+            Effect.flatMap((cursor) =>
+              dispatchEntityWebhook({
+                queue: options.posts.live,
+                cutoff: options.posts.cutoff,
+                row: payload.data,
+                cursor,
+              }),
+            ),
+          ),
+        ),
+      );
+    }
+    case "post.deleted": {
+      return Effect.void;
+    }
+    default: {
+      return Effect.logWarning("Ignoring unknown webhook type").pipe(
+        Effect.annotateLogs({ type: (payload as { type: string }).type }),
+        Effect.asVoid,
+      );
+    }
+  }
+};
+
+const makeTemplateConnector = (
+  config: TemplateConfig,
+): Effect.Effect<TemplateConnectorRuntime, ConnectorError, TemplateApiClient> =>
+  Effect.gen(function* () {
+    const api = yield* TemplateApiClient;
+    const postStreams = yield* makeEntityStreams({
+      api,
+      schema: PostSchema,
+      path: "/posts",
+      cursorField: "id",
+      limit: 10,
+    });
+
+    const connector = defineConnector({
+      name: "producer-template",
+      entities: [
+        defineEntity({
+          name: "posts",
+          schema: PostSchema,
+          primaryKey: "id",
+          live: postStreams.live,
+          backfill: postStreams.backfill,
+        }),
+      ],
+      events: [],
+    });
+
+    const webhookRoute: WebhookRoute<WebhookPayload> = {
+      path: "/webhooks/template",
+      schema: WebhookPayloadSchema,
+      handle: (payload, request, rawBody) =>
+        Effect.gen(function* () {
+          if (Option.isSome(config.webhookSecret) && rawBody) {
+            yield* verifyWebhookSignature({
+              rawBody,
+              signature: request.headers["x-template-signature"] ?? null,
+              secret: config.webhookSecret.value,
+            });
+          }
+
+          yield* resolveWebhookDispatch({
+            payload,
+            posts: postStreams,
+          });
+        }),
+    };
+
+    if (Option.isNone(config.webhookSecret)) {
+      yield* Effect.logWarning(
+        "TEMPLATE_WEBHOOK_SECRET is not set.
Incoming webhooks will not be signature-verified.",
+      );
+    }
+
+    return { connector, routes: [webhookRoute] };
+  }).pipe(Effect.annotateLogs({ component: "producer-template" }));
+
+export const TemplateConnectorConfig = (): Layer.Layer<
+  TemplateConnector,
+  ConnectorError,
+  HttpClient.HttpClient
+> =>
+  Layer.effect(TemplateConnector)(
+    Effect.gen(function* () {
+      const config = yield* TemplateConfigConfig;
+      return yield* makeTemplateConnector(config).pipe(
+        Effect.provide(TemplateApiClientConfig(config)),
+      );
+    }).pipe(
+      Effect.mapError((error) =>
+        error instanceof ConnectorError
+          ? error
+          : new ConnectorError({
+              message: "Template config failed",
+              cause: error,
+            }),
+      ),
+    ),
+  );
diff --git a/templates/producer-template/src/index.ts b/templates/producer-template/src/index.ts
new file mode 100644
index 0000000..1a103a3
--- /dev/null
+++ b/templates/producer-template/src/index.ts
@@ -0,0 +1,10 @@
+export { TemplateApiClient, TemplateApiClientConfig } from "./api";
+export {
+  type TemplateConfig,
+  TemplateConfigConfig,
+  TemplateConnector,
+  TemplateConnectorConfig,
+  type TemplateConnectorRuntime,
+} from "./connector";
+export type { Post, WebhookPayload } from "./schemas";
+export { PostSchema, WebhookPayloadSchema } from "./schemas";
diff --git a/templates/producer-template/src/sandbox.ts b/templates/producer-template/src/sandbox.ts
new file mode 100644
index 0000000..1c00e1d
--- /dev/null
+++ b/templates/producer-template/src/sandbox.ts
@@ -0,0 +1,119 @@
+import type { ConnectorError } from "@useairfoil/connector-kit";
+
+import { NodeHttpServer } from "@effect/platform-node";
+import { Publisher, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit";
+import { Config, ConfigProvider, DateTime, Effect, Layer, Logger, Metric } from "effect";
+import { FetchHttpClient } from "effect/unstable/http";
+import * as Observability from "effect/unstable/observability";
+import { createServer } from "node:http";
+
+import {
TemplateConnector, TemplateConnectorConfig } from "./index"; + +const SandboxConfig = Config.all({ + port: Config.port("TEMPLATE_WEBHOOK_PORT").pipe(Config.withDefault(8080)), +}); + +const TelemetryConfig = Config.all({ + enabled: Config.boolean("ACK_TELEMETRY_ENABLED").pipe(Config.withDefault(false)), + baseUrl: Config.string("ACK_OTLP_BASE_URL").pipe(Config.withDefault("http://localhost:4318")), + serviceName: Config.string("ACK_SERVICE_NAME").pipe(Config.withDefault("producer-template")), +}); + +// Console publisher so you can see ingestion output during `pnpm run sandbox`. +// Real connectors plug in `WingsPublisherLayer` from @useairfoil/connector-kit. +const ConsolePublisherLayer = Layer.succeed(Publisher)({ + publish: ({ name, source, batch }) => + Effect.gen(function* () { + const ids = batch.rows.map((r) => r["id"]).filter((id) => id != null); + yield* Effect.logInfo(`[publisher] -> Source: ${source} | Name: ${name}`).pipe( + Effect.annotateLogs({ + count: batch.rows.length, + ids, + cursor: batch.cursor, + source, + }), + ); + return { success: true }; + }), +}); + +const program = Effect.gen(function* () { + const config = yield* SandboxConfig; + const { connector, routes } = yield* TemplateConnector; + const routePaths = routes.map((route) => route.path); + const serverLayer = NodeHttpServer.layer(createServer, { port: config.port }); + + yield* Effect.logInfo("webhook server ready").pipe( + Effect.annotateLogs({ port: config.port, routes: routePaths }), + ); + + const now = yield* DateTime.now; + + return yield* runConnector(connector, { + initialCutoff: DateTime.toDate(now), + webhook: { + routes, + healthPath: "/health", + disableHttpLogger: true, + }, + }).pipe(Effect.provide(serverLayer)); +}).pipe(Effect.annotateLogs({ component: "producer-template" })); + +const EnvLayer = Layer.mergeAll( + FetchHttpClient.layer, + Layer.succeed(ConfigProvider.ConfigProvider, ConfigProvider.fromEnv()), +); + +const ConnectorLayer = TemplateConnectorConfig(); + 
+const TelemetryLayer = Layer.unwrap( + Effect.gen(function* () { + const telemetry = yield* TelemetryConfig; + if (!telemetry.enabled) { + return Layer.empty; + } + + yield* Effect.logInfo("telemetry enabled").pipe( + Effect.annotateLogs({ + serviceName: telemetry.serviceName, + baseUrl: telemetry.baseUrl, + }), + ); + + return Layer.mergeAll( + Observability.Otlp.layerJson({ + baseUrl: telemetry.baseUrl, + resource: { + serviceName: telemetry.serviceName, + }, + }), + Metric.enableRuntimeMetricsLayer, + ); + }), +); + +const RuntimeLayer = Layer.mergeAll( + StateStoreInMemory, + ConsolePublisherLayer, + ConnectorLayer, + Logger.layer([Logger.consolePretty()]), + TelemetryLayer, + EnvLayer, +); + +Effect.runPromise( + Effect.scoped(program).pipe(Effect.provide(RuntimeLayer)) as Effect.Effect< + void, + Config.ConfigError | ConnectorError + >, +).catch((error) => { + void Effect.runPromise( + Effect.logError("fatal error").pipe( + Effect.annotateLogs({ + component: "producer-template", + error: String(error), + }), + ), + ); + process.exit(1); +}); diff --git a/templates/producer-template/src/schemas.ts b/templates/producer-template/src/schemas.ts new file mode 100644 index 0000000..ec4c193 --- /dev/null +++ b/templates/producer-template/src/schemas.ts @@ -0,0 +1,33 @@ +import * as Schema from "effect/Schema"; + +// Entity schema for a JSONPlaceholder post. JSONPlaceholder does not return a +// created_at timestamp, so we cursor on the numeric `id` field. When porting +// this template to a real API, replace `PostSchema` with your own struct and +// prefer a monotonically increasing cursor field (e.g. created_at). +export const PostSchema = Schema.Struct({ + id: Schema.Number, + userId: Schema.Number, + title: Schema.String, + body: Schema.String, +}); + +export type Post = Schema.Schema.Type<typeof PostSchema>; + +// Webhook payload union. JSONPlaceholder does not emit real webhooks, but the +// shape below mirrors what most SaaS APIs send.
The handler in connector.ts +// uses the `type` discriminator to fan out to the right entity queue. +const PostEventSchema = Schema.Struct({ + type: Schema.Literals(["post.created", "post.updated"]), + timestamp: Schema.String, + data: PostSchema, +}); + +const IgnoredEventSchema = Schema.Struct({ + type: Schema.Literals(["post.deleted"]), + timestamp: Schema.String, + data: Schema.Any, +}); + +export const WebhookPayloadSchema = Schema.Union([PostEventSchema, IgnoredEventSchema]); + +export type WebhookPayload = Schema.Schema.Type<typeof WebhookPayloadSchema>; diff --git a/templates/producer-template/src/streams.ts b/templates/producer-template/src/streams.ts new file mode 100644 index 0000000..ec662b3 --- /dev/null +++ b/templates/producer-template/src/streams.ts @@ -0,0 +1,116 @@ +import type * as Schema from "effect/Schema"; + +import { + type Batch, + type ConnectorError, + type Cursor, + makePullStream, + makeWebhookQueue, + type WebhookStream, +} from "@useairfoil/connector-kit"; +import { Deferred, Effect, Queue, Stream } from "effect"; + +import type { TemplateApiClientService } from "./api"; + +// JSONPlaceholder has no timestamps, so we cursor on the numeric `id` field. +// For a real API prefer a monotonically increasing, server-emitted timestamp. +const toNumber = (cursor: Cursor): number => (typeof cursor === "number" ? cursor : Number(cursor)); + +const isOnOrBeforeCutoff = (value: unknown, cutoff: Cursor): boolean => { + if (typeof value !== "number") return false; + return value <= toNumber(cutoff); +}; + +export const resolveCursor = <T extends Record<string, unknown>>( + row: T, + cursorField: keyof T & string, +): Effect.Effect<Cursor> => + Effect.sync(() => { + const value = row[cursorField]; + return typeof value === "number" ? value : Number(value); + }); + +const setCutoff = (deferred: Deferred.Deferred<Cursor>, cursor: Cursor) => + Deferred.succeed(deferred, cursor).pipe(Effect.asVoid); + +// Enqueue a single webhook row after recording its cursor as the backfill +// cutoff.
This is safe to call many times — Deferred.succeed is idempotent. +export const dispatchEntityWebhook = <T extends Record<string, unknown>>(options: { + readonly queue: WebhookStream<T>; + readonly cutoff: Deferred.Deferred<Cursor>; + readonly row: T; + readonly cursor: Cursor; +}): Effect.Effect<void> => + Effect.gen(function* () { + yield* setCutoff(options.cutoff, options.cursor); + yield* Queue.offer(options.queue.queue, { + cursor: options.cursor, + rows: [options.row], + }).pipe(Effect.asVoid); + }); + +// Backfill stream for a single entity. Waits for the cutoff deferred to +// resolve (set by the first live webhook or by initialCutoff), then pages +// through the list endpoint until hasMore is false. +const makeBackfillStream = <T extends Record<string, unknown>>(options: { + readonly api: TemplateApiClientService; + readonly schema: Schema.Decoder<T>; + readonly path: string; + readonly cutoff: Deferred.Deferred<Cursor>; + readonly cursorField: keyof T & string; + readonly limit?: number; +}): Stream.Stream<Batch<T>, ConnectorError> => + Stream.fromEffect(Deferred.await(options.cutoff)).pipe( + Stream.flatMap((cutoff) => + makePullStream({ + fetchPage: (cursor: Cursor | undefined) => { + const page = cursor ? Number(cursor) : 1; + return options.api + .fetchList(options.schema, options.path, { + page, + limit: options.limit ?? 10, + }) + .pipe( + Effect.map((response) => { + if (response.items.length === 0) { + return { cursor: page, rows: [], hasMore: false }; + } + + const filtered = response.items.filter((row: T) => + isOnOrBeforeCutoff(row[options.cursorField], cutoff), + ); + + return { + cursor: response.hasMore ? page + 1 : page, + rows: filtered, + hasMore: response.hasMore, + }; + }), + ); + }, + }), + ), + ); + +export type EntityStreams<T extends Record<string, unknown>> = { + readonly live: WebhookStream<T>; + readonly cutoff: Deferred.Deferred<Cursor>; + readonly backfill: Stream.Stream<Batch<T>, ConnectorError>; +}; + +// Convenience factory: creates the live webhook queue, the cutoff deferred, +// and the backfill stream all at once. Callers destructure the result into a +// defineEntity() call.
+export const makeEntityStreams = >(options: { + readonly api: TemplateApiClientService; + readonly schema: Schema.Decoder; + readonly path: string; + readonly cursorField: keyof T & string; + readonly limit?: number; +}): Effect.Effect, ConnectorError> => + Effect.gen(function* () { + const queue = yield* makeWebhookQueue({ capacity: 1024 }); + const cutoff = yield* Deferred.make(); + const backfill = makeBackfillStream({ ...options, cutoff }); + return { live: queue, cutoff, backfill }; + }); diff --git a/templates/producer-template/test/__cassettes__/api.vcr.test.cassette b/templates/producer-template/test/__cassettes__/api.vcr.test.cassette new file mode 100644 index 0000000..8850fcb --- /dev/null +++ b/templates/producer-template/test/__cassettes__/api.vcr.test.cassette @@ -0,0 +1,62 @@ +{ + "exports": { + "default": { + "meta": { + "createdAt": "1970-01-01T00:00:00.000Z", + "version": "1" + }, + "entries": {} + }, + "producer-template api (vcr) > replays posts list page with VCR": { + "meta": { + "createdAt": "1970-01-01T00:00:00.000Z", + "version": "1" + }, + "entries": { + "{\"body\":\"\",\"headers\":{\"accept\":\"application/json\"},\"method\":\"GET\",\"url\":\"https://jsonplaceholder.typicode.com/posts\"}": { + "request": { + "method": "GET", + "url": "https://jsonplaceholder.typicode.com/posts", + "headers": { + "accept": "application/json" + } + }, + "response": { + "status": 200, + "body": "[{\"body\":\"quia et suscipit\\nsuscipit recusandae consequuntur expedita et cum\\nreprehenderit molestiae ut ut quas totam\\nnostrum rerum est autem sunt rem eveniet architecto\",\"id\":1,\"title\":\"sunt aut facere repellat provident occaecati excepturi optio reprehenderit\",\"userId\":1},{\"body\":\"est rerum tempore vitae\\nsequi sint nihil reprehenderit dolor beatae ea dolores neque\\nfugiat blanditiis voluptate porro vel nihil molestiae ut reiciendis\\nqui aperiam non debitis possimus qui neque nisi nulla\",\"id\":2,\"title\":\"qui est 
esse\",\"userId\":1},{\"body\":\"et iusto sed quo iure\\nvoluptatem occaecati omnis eligendi aut ad\\nvoluptatem doloribus vel accusantium quis pariatur\\nmolestiae porro eius odio et labore et velit aut\",\"id\":3,\"title\":\"ea molestias quasi exercitationem repellat qui ipsa sit aut\",\"userId\":1},{\"body\":\"ullam et saepe reiciendis voluptatem adipisci\\nsit amet autem assumenda provident rerum culpa\\nquis hic commodi nesciunt rem tenetur doloremque ipsam iure\\nquis sunt voluptatem rerum illo velit\",\"id\":4,\"title\":\"eum et est occaecati\",\"userId\":1},{\"body\":\"repudiandae veniam quaerat sunt sed\\nalias aut fugiat sit autem sed est\\nvoluptatem omnis possimus esse voluptatibus quis\\nest aut tenetur dolor neque\",\"id\":5,\"title\":\"nesciunt quas odio\",\"userId\":1},{\"body\":\"ut aspernatur corporis harum nihil quis provident sequi\\nmollitia nobis aliquid molestiae\\nperspiciatis et ea nemo ab reprehenderit accusantium quas\\nvoluptate dolores velit et doloremque molestiae\",\"id\":6,\"title\":\"dolorem eum magni eos aperiam quia\",\"userId\":1},{\"body\":\"dolore placeat quibusdam ea quo vitae\\nmagni quis enim qui quis quo nemo aut saepe\\nquidem repellat excepturi ut quia\\nsunt ut sequi eos ea sed quas\",\"id\":7,\"title\":\"magnam facilis autem\",\"userId\":1},{\"body\":\"dignissimos aperiam dolorem qui eum\\nfacilis quibusdam animi sint suscipit qui sint possimus cum\\nquaerat magni maiores excepturi\\nipsam ut commodi dolor voluptatum modi aut vitae\",\"id\":8,\"title\":\"dolorem dolore est ipsam\",\"userId\":1},{\"body\":\"consectetur animi nesciunt iure dolore\\nenim quia ad\\nveniam autem ut quam aut nobis\\net est aut quod aut provident voluptas autem voluptas\",\"id\":9,\"title\":\"nesciunt iure omnis dolorem tempora et accusantium\",\"userId\":1},{\"body\":\"quo et expedita modi cum officia vel magni\\ndoloribus qui repudiandae\\nvero nisi sit\\nquos veniam quod sed accusamus veritatis error\",\"id\":10,\"title\":\"optio molestias 
id quia eum\",\"userId\":1}]", + "headers": { + "access-control-allow-credentials": "true", + "access-control-expose-headers": "X-Total-Count, Link", + "age": "10776", + "alt-svc": "h3=\":443\"; ma=86400", + "cache-control": "max-age=43200", + "cf-cache-status": "HIT", + "cf-ray": "9f045b20f9c61658-SIN", + "connection": "keep-alive", + "content-encoding": "gzip", + "content-type": "application/json; charset=utf-8", + "date": "Wed, 22 Apr 2026 11:40:50 GMT", + "etag": "W/\"aa6-j2NSH739l9uq40OywFMn7Y0C/iY\"", + "expires": "-1", + "link": "; rel=\"first\", ; rel=\"next\", ; rel=\"last\"", + "nel": "{\"report_to\":\"heroku-nel\",\"response_headers\":[\"Via\"],\"max_age\":3600,\"success_fraction\":0.01,\"failure_fraction\":0.1}", + "pragma": "no-cache", + "report-to": "{\"group\":\"heroku-nel\",\"endpoints\":[{\"url\":\"https://nel.heroku.com/reports?s=aglXxOYpKMBNwQnafaCSANksQ%2BU9VRXY68DBNPzzvNA%3D\\u0026sid=e11707d5-02a7-43ef-b45e-2cf4d2036f7d\\u0026ts=1776234278\"}],\"max_age\":3600}", + "reporting-endpoints": "heroku-nel=\"https://nel.heroku.com/reports?s=aglXxOYpKMBNwQnafaCSANksQ%2BU9VRXY68DBNPzzvNA%3D&sid=e11707d5-02a7-43ef-b45e-2cf4d2036f7d&ts=1776234278\"", + "server": "cloudflare", + "transfer-encoding": "chunked", + "vary": "Origin, Accept-Encoding", + "via": "2.0 heroku-router", + "x-content-type-options": "nosniff", + "x-powered-by": "Express", + "x-ratelimit-limit": "1000", + "x-ratelimit-remaining": "999", + "x-ratelimit-reset": "1776234334", + "x-total-count": "100" + } + } + } + } + } + } +} \ No newline at end of file diff --git a/templates/producer-template/test/api.vcr.test.ts b/templates/producer-template/test/api.vcr.test.ts new file mode 100644 index 0000000..df98450 --- /dev/null +++ b/templates/producer-template/test/api.vcr.test.ts @@ -0,0 +1,54 @@ +import { NodeServices } from "@effect/platform-node"; +import { describe, expect, it } from "@effect/vitest"; +import { FileSystemCassetteStore, VcrHttpClient } from "@useairfoil/effect-vcr"; 
+import { ConfigProvider, Effect, Layer } from "effect"; +import { FetchHttpClient } from "effect/unstable/http"; + +import { makeTemplateApiClient, TemplateApiClient } from "../src/api"; +import { PostSchema, TemplateConfigConfig } from "../src/index"; + +// Replays a single page of JSONPlaceholder /posts from a recorded cassette. +// This mirrors the producer-polar VCR setup: the connector-level flow is +// covered by webhook.test.ts, and this test exercises only the API surface. +describe("producer-template api (vcr)", () => { + it.effect("replays posts list page with VCR", () => { + const program = Effect.gen(function* () { + const api = yield* TemplateApiClient; + const result = yield* api.fetchList(PostSchema, "/posts", { + page: 1, + limit: 10, + }); + + expect(result.items.length).toBeGreaterThan(0); + expect(result.hasMore).toBe(true); + }).pipe(Effect.scoped); + + const apiLayer = Layer.effect(TemplateApiClient)( + Effect.gen(function* () { + const config = yield* TemplateConfigConfig; + return yield* makeTemplateApiClient(config); + }), + ); + + return program.pipe( + Effect.provide(apiLayer), + Effect.provide( + VcrHttpClient.layer({ + vcrName: "producer-template", + mode: "replay", + }), + ), + Effect.provide(FileSystemCassetteStore.layer()), + Effect.provide(FetchHttpClient.layer), + Effect.provide(NodeServices.layer), + Effect.provideService( + ConfigProvider.ConfigProvider, + ConfigProvider.fromUnknown({ + TEMPLATE_API_BASE_URL: "https://jsonplaceholder.typicode.com", + TEMPLATE_API_TOKEN: "test", + }), + ), + Effect.scoped, + ); + }); +}); diff --git a/templates/producer-template/test/helpers.ts b/templates/producer-template/test/helpers.ts new file mode 100644 index 0000000..33088b8 --- /dev/null +++ b/templates/producer-template/test/helpers.ts @@ -0,0 +1,34 @@ +import type { Batch } from "@useairfoil/connector-kit"; + +import { Publisher } from "@useairfoil/connector-kit"; +import { Deferred, Effect, Layer, Ref } from "effect"; + +export type 
Published = { + readonly name: string; + readonly source: "live" | "backfill"; + readonly batch: Batch<Record<string, unknown>>; +}; + +// Builds a `Publisher` test layer that captures each publish into a Ref and +// resolves `done` after `expected` batches land. Tests use `Deferred.await(done)` +// to synchronize on ingestion completion. +export const makeTestPublisher = (expected: number) => + Effect.gen(function* () { + const publishedRef = yield* Ref.make<Array<Published>>([]); + const done = yield* Deferred.make<number>(); + const layer = Layer.succeed(Publisher)({ + publish: ({ name, source, batch }) => + Effect.gen(function* () { + const next = yield* Ref.updateAndGet(publishedRef, (items) => [ + ...items, + { name, source, batch }, + ]); + if (next.length === expected) { + yield* Deferred.succeed(done, next.length); + } + return { success: true }; + }), + }); + + return { publishedRef, done, layer }; + }); diff --git a/templates/producer-template/test/webhook.test.ts b/templates/producer-template/test/webhook.test.ts new file mode 100644 index 0000000..beda058 --- /dev/null +++ b/templates/producer-template/test/webhook.test.ts @@ -0,0 +1,75 @@ +import { NodeHttpServer } from "@effect/platform-node"; +import { describe, expect, it } from "@effect/vitest"; +import { ConnectorError, runConnector, StateStoreInMemory } from "@useairfoil/connector-kit"; +import { ConfigProvider, Deferred, Effect, Layer, Ref } from "effect"; +import { HttpClient, HttpClientRequest } from "effect/unstable/http"; + +import { TemplateApiClient, type TemplateApiClientService } from "../src/api"; +import { TemplateConnector, TemplateConnectorConfig } from "../src/index"; +import { makeTestPublisher } from "./helpers"; + +const postWebhookPayload = { + type: "post.created", + timestamp: "2026-01-01T00:00:00Z", + data: { + id: 1, + userId: 1, + title: "sunt aut facere", + body: "quia et suscipit", + }, +} as const; + +// API stub — the webhook test does not exercise any backfill, so fetchList +// returns an empty page and fetchJson is
never expected to be called. +const makeApiStub = (): TemplateApiClientService => ({ + fetchJson: (_schema) => Effect.fail(new ConnectorError({ message: "Unexpected fetchJson" })), + fetchList: (_schema) => Effect.succeed({ items: [], hasMore: false }), +}); + +describe("producer-template webhook", () => { + it.effect("publishes live webhook batches", () => { + const runtimeLayer = NodeHttpServer.layerTest; + const apiLayer = Layer.succeed(TemplateApiClient)(makeApiStub()); + + const connectorLayer = TemplateConnectorConfig().pipe(Layer.provide(apiLayer)); + const configProvider = ConfigProvider.fromUnknown({ + TEMPLATE_API_BASE_URL: "https://jsonplaceholder.typicode.com", + TEMPLATE_API_TOKEN: "test", + }); + + return Effect.gen(function* () { + const { publishedRef, done, layer } = yield* makeTestPublisher(1); + const { connector, routes } = yield* TemplateConnector; + const runLayer = Layer.mergeAll(StateStoreInMemory, layer, runtimeLayer); + + yield* Effect.gen(function* () { + yield* Effect.forkScoped( + runConnector(connector, { + initialCutoff: new Date(), + webhook: { + routes, + }, + }), + ); + + const client = yield* HttpClient.HttpClient; + const request = HttpClientRequest.post("/webhooks/template").pipe( + HttpClientRequest.bodyJsonUnsafe(postWebhookPayload), + ); + const response = yield* client.execute(request); + + expect(response.status).toBe(200); + + yield* Deferred.await(done); + const published = yield* Ref.get(publishedRef); + expect(published.length).toBe(1); + expect(published[0]?.name).toBe("posts"); + }).pipe(Effect.provide(runLayer)); + }).pipe( + Effect.provide(connectorLayer), + Effect.provide(runtimeLayer), + Effect.provideService(ConfigProvider.ConfigProvider, configProvider), + Effect.scoped, + ) as Effect.Effect; + }); +}); diff --git a/templates/producer-template/tsconfig.json b/templates/producer-template/tsconfig.json new file mode 100644 index 0000000..df2aa9e --- /dev/null +++ b/templates/producer-template/tsconfig.json @@ -0,0 
+1,17 @@ +{ + "extends": "../../tsconfig.json", + "compilerOptions": { + "outDir": "dist", + "declarationDir": "dist", + "rootDir": ".", + "types": ["node"], + "moduleResolution": "bundler", + "esModuleInterop": true, + "verbatimModuleSyntax": true, + "noEmit": true, + "resolveJsonModule": true, + "skipLibCheck": true, + "strict": true + }, + "exclude": ["node_modules", "dist"] +} diff --git a/templates/producer-template/tsdown.config.ts b/templates/producer-template/tsdown.config.ts new file mode 100644 index 0000000..7b434c7 --- /dev/null +++ b/templates/producer-template/tsdown.config.ts @@ -0,0 +1,9 @@ +import { defineConfig } from "tsdown"; + +export default defineConfig({ + entry: ["src/index.ts"], + format: ["esm"], + dts: true, + sourcemap: true, + clean: true, +}); diff --git a/templates/producer-template/vitest.config.ts b/templates/producer-template/vitest.config.ts new file mode 100644 index 0000000..50ca1af --- /dev/null +++ b/templates/producer-template/vitest.config.ts @@ -0,0 +1,9 @@ +import { defineConfig } from "vitest/config"; + +export default defineConfig({ + test: { + fileParallelism: false, + testTimeout: 60_000, + hookTimeout: 60_000, + }, +});
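The cursor/cutoff mechanics that `streams.ts` builds on Effect primitives (numeric `id` cursor, backfill rows kept only when at or before the cutoff recorded by the first live webhook) can be sketched in plain, dependency-free TypeScript. The names below (`filterPage`, the sample rows) are illustrative only, not part of the template:

```typescript
// Minimal sketch of the backfill cutoff logic, without Effect or connector-kit.
type Cursor = number | string;

const toNumber = (cursor: Cursor): number =>
  typeof cursor === "number" ? cursor : Number(cursor);

// A row participates in the backfill only if its cursor value is numeric
// and does not exceed the cutoff set by the first live webhook.
const isOnOrBeforeCutoff = (value: unknown, cutoff: Cursor): boolean =>
  typeof value === "number" && value <= toNumber(cutoff);

// One backfill page: drop rows past the cutoff, and advance the page
// cursor only while the API reports more pages.
const filterPage = <T extends Record<string, unknown>>(
  items: readonly T[],
  cursorField: keyof T & string,
  cutoff: Cursor,
  page: number,
  hasMore: boolean,
) => ({
  cursor: hasMore ? page + 1 : page,
  rows: items.filter((row) => isOnOrBeforeCutoff(row[cursorField], cutoff)),
  hasMore,
});

const posts = [
  { id: 1, title: "a" },
  { id: 2, title: "b" },
  { id: 3, title: "c" },
];
const result = filterPage(posts, "id", 2, 1, true);
console.log(result.rows.map((r) => r.id)); // rows with id 1 and 2 survive the cutoff
console.log(result.cursor); // next page to fetch
```

This mirrors why the deduplication story in the template stays simple: live webhooks own everything after the cutoff, and the backfill filter guarantees it never re-emits those rows.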