
Commit 9588707

Authored Jan 9, 2025
Merge pull request #7374 from segmentio/develop
Release 25.02.1
2 parents: 09226d7 + 20606d6

File tree

23 files changed, +981 -944 lines changed

‎src/_data/catalog/destination_categories.yml

+1-1
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# destination categories last updated 2024-12-19
+# destination categories last updated 2025-01-09
 items:
 - display_name: A/B Testing
   slug: a-b-testing

‎src/_data/catalog/destinations.yml

+599-842
Large diffs are not rendered by default.

‎src/_data/catalog/destinations_private.yml

+1-1
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# destination data last updated 2024-12-19
+# destination data last updated 2025-01-09
 items:
 - id: 54521fd925e721e32a72eee1
   display_name: Pardot

‎src/_data/catalog/regional-supported.yml

+9
@@ -78,6 +78,15 @@ sources:
   - us
   endpoints:
   - us
+- id: WXNgKpZMsd
+  display_name: Antavo
+  hidden: false
+  slug: antavo
+  url: connections/sources/catalog/cloud-apps/antavo
+  regions:
+  - us
+  endpoints:
+  - us
 - id: dZeHygTSD4
   display_name: Apple
   hidden: false

‎src/_data/catalog/source_categories.yml

+1-1
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# source categories last updated 2024-12-19
+# source categories last updated 2025-01-09
 items:
 - display_name: A/B Testing
   slug: a-b-testing

‎src/_data/catalog/sources.yml

+20-1
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# sources last updated 2024-12-19
+# sources last updated 2025-01-09
 items:
 - id: 8HWbgPTt3k
   display_name: .NET
@@ -175,6 +175,25 @@ items:
   - Analytics
   status: PUBLIC
   partnerOwned: false
+- id: WXNgKpZMsd
+  display_name: Antavo
+  isCloudEventSource: true
+  slug: antavo
+  url: connections/sources/catalog/cloud-apps/antavo
+  hidden: false
+  regions:
+  - us
+  endpoints:
+  - us
+  source_type: cloud-app
+  description: AI Loyalty Platform
+  logo:
+    url: >-
+      https://cdn-devcenter.segment.com/9d26b38a-0f7a-4a24-b89f-2abd17fbdbbb.svg
+  categories:
+  - Marketing Automation
+  status: PUBLIC_BETA
+  partnerOwned: false
 - id: dZeHygTSD4
   display_name: Apple
   isCloudEventSource: false

‎src/connections/destinations/actions.md

+9-1
@@ -163,7 +163,11 @@ You can also test within the mapping itself. To test the mapping:
 1. Navigate to the **Mappings** tab of your destination.
 2. Select a mapping and click the **...** and select **Edit Mapping**.
 3. In step 2 of the mappings edit page, click **Load Test Event from Source** to add a test event from the source, or you can add your own sample event.
-4. Scroll to step 4 on the page, and click **Test Mapping** to test the mapping and view the response from the destination.
+4. Scroll to step 4 on the page, and click **Test Mapping** to test the mapping and view the response from the destination.
+
+
+> info "Test Mapping might not return the events you're looking for"
+> Segment only surfaces a small subset of events for the Test Mapping feature and might not always return the event you're looking for. If you'd like to test with a specific event, copy a specific event from your [Source Debugger](/docs/connections/sources/debugger/) and paste it into the **Add test event** interface.
 
 ## Customize mappings
 
@@ -207,6 +211,10 @@ The coalesce function takes a primary value and uses it if it is available. If t
 
 The replace function allows you to replace a string, integer, or boolean with a new value. You have the option to replace up to two values within a single field.
 
+### Flatten function
+
+The flatten function allows you to flatten a nested object to an object with a depth of 1. Keys are delimited by the configured separator. For example, an object like `{ a: { b: { c: 1 }, d: 2 } }` will be converted to `{ 'a.b.c': 1, 'a.d': 2 }`.
+
 ### Conditions
 
 > info ""
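To illustrate the flatten function added above, here's a minimal sketch of the documented behavior, assuming a `.` separator (this is illustrative, not Segment's internal implementation):

```js
// Illustrative sketch of the flatten behavior with a '.' separator.
function flatten(obj, separator = '.', prefix = '') {
  return Object.entries(obj).reduce((acc, [key, value]) => {
    const path = prefix ? `${prefix}${separator}${key}` : key;
    if (value && typeof value === 'object' && !Array.isArray(value)) {
      Object.assign(acc, flatten(value, separator, path)); // recurse into nested objects
    } else {
      acc[path] = value; // leaf value: record under its delimited path
    }
    return acc;
  }, {});
}

console.log(flatten({ a: { b: { c: 1 }, d: 2 } }));
// => { 'a.b.c': 1, 'a.d': 2 }
```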
‎src/connections/destinations/catalog/actions-google-campaign-manager-360/index.md

+106

@@ -0,0 +1,106 @@
---
title: Google Campaign Manager 360
strat: google
hide-boilerplate: true
hide-dossier: false
id: 66e97a37a8f396642c0bd33c
hidden: true
private: true
versions:
  - name: "Google Campaign Manager 360"
    link: '/docs/connections/destinations/catalog/actions-google-campaign-manager-360/'
---

The Google Campaign Manager 360 destination allows users to upload [conversions](https://developers.google.com/doubleclick-advertisers/guides/conversions_upload){:target="_blank"} and [conversion enhancements](https://developers.google.com/doubleclick-advertisers/guides/conversions_ec){:target="_blank"} to Google Campaign Manager 360. Marketers can use this integration to attribute conversions to specific campaigns, ad groups, and ads.

## Getting Started

> info ""
> You can connect the Google Campaign Manager 360 Destination to an event source, Reverse ETL source, or Engage space.

### Prerequisites

Before you begin, you need a Google Campaign Manager 360 account with a Profile ID and a Floodlight Configuration ID. These are necessary to configure the Floodlight activities you want to track.

### Connect to Google Campaign Manager 360

1. From the Segment web app, navigate to **Catalog > Destinations**.
2. Search for “Google Campaign Manager 360” in the Destinations Catalog, and select it.
3. Click **Add destination**.
4. Select the source that will send data to Google Campaign Manager 360.
   * If you select an Engage space, you'll be redirected to Engage to complete the following steps.
   * If you select a Reverse ETL source, you must enter a name for your destination and click **Create destination**.
5. On the **Settings** tab for your Google Campaign Manager destination:
   * Enter your **Profile ID**. You can also provide your default **Floodlight Configuration ID** and/or **Floodlight Activity ID**. If provided, these values are used as defaults for all events sent to Google Campaign Manager 360; otherwise, you can set or override them in your mappings.
6. Click **Save**.
7. Follow the steps in the Destinations Actions documentation to [customize your mappings](/docs/connections/destinations/actions/#customize-mappings).

## Available actions

The Google Campaign Manager 360 Action Destination supports the following actions:

* [Conversion Upload](#conversion-upload)
* [Conversion Adjustment Upload](#conversion-adjustment-upload)

### Conversion Upload

The Conversion Upload action allows you to send conversion data to Google Campaign Manager 360. This action is useful for tracking conversions that occur on your website or app.

#### Fields

The Google Campaign Manager 360 destination requires the following fields for the Conversion Upload action:

* **Required ID**: The identifier that identifies a user for the conversion. Only one value at a time can be provided, from the following fields:
  * Google Click ID (gclid)
  * Display Click ID (dclid)
  * Encrypted User ID
  * Mobile Device ID
  * Match ID
  * Impression ID
  * Encrypted User ID Candidates
* **Timestamp**: The time the conversion occurred.
* **Value**: The value of the conversion.
* **Ordinal**: The ordinal of the conversion. This field is used to control how conversions of the same user and day are de-duplicated.
### Conversion Adjustment Upload
66+
67+
The Conversion Adjustment Upload action allows you to send conversion adjustment data to Google Campaign Manager 360. This action is useful for adjustments to conversions that have already been uploaded, as well as enhancing conversions.
68+
69+
#### Fields
70+
71+
The Google Campaign Manager 360 destination requires the following fields for the Conversion Adjustment Upload action:
72+
73+
* **Required ID**: The identifier that identifies a user for the conversion. Only one value at a time can be provided, from the following fields:
74+
* Google Click ID (gclid);
75+
* Display Click ID (dclid);
76+
* Encrypted User ID;
77+
* Mobile Device ID;
78+
* Match ID;
79+
* Impression ID;
80+
* **Timestamp**: The time the conversion occurred.
81+
* **Value**: The value of the conversion.
82+
* **Ordinal**: The ordinal of the conversion. This field is used to control how conversions of the same user and day are de-duplicated.
83+
84+
## Hashing
85+
86+
Google requires you to hash all PII before sending it to the Google API.
87+
88+
The Google Campaign Manager 360 destination supports hashing for the following fields:
89+
90+
* Email
91+
* Phone
92+
* First Name
93+
* Last Name
94+
* Street Address
95+
96+
The hashing algorithm used is SHA-256. If incoming data arrives already hashed, the destination will not hash it again. The values will be sent as-is to Google.
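As a rough sketch of the behavior described above (not the destination's actual code, and the trim/lowercase normalization step is an assumption), a SHA-256 pass might look like this:

```js
const { createHash } = require('crypto');

// Assumption for illustration: treat a 64-character hex string as already hashed.
const isSha256Hex = (value) => /^[a-f0-9]{64}$/i.test(value);

function hashPii(value) {
  if (isSha256Hex(value)) return value; // already hashed: send as-is
  return createHash('sha256').update(value.trim().toLowerCase()).digest('hex');
}

console.log(hashPii('User@Example.com'));           // hashed once
console.log(hashPii(hashPii('User@Example.com')));  // unchanged on a second pass
```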
{% include components/actions-fields.html settings="true"%}

## FAQ and troubleshooting

### Refreshing access tokens

When you use OAuth to authenticate into the Google Campaign Manager 360 destination, Segment stores an access token and refresh token. Access tokens for Google Campaign Manager 360 expire after one hour. Once expired, Segment receives an error and then uses the refresh token to fetch a new access token. This results in two API requests to Google Campaign Manager 360: one failure and one success.

Because of the duplicate API requests, you may see a warning in Google for unprocessed conversions due to incorrect or missing OAuth credentials. This warning is expected and does not indicate data loss. Google has confirmed that conversions are being processed, and OAuth retry behavior will not cause any issues for your web conversions. Whenever possible, Segment caches access tokens to reduce the total number of requests made to Google Campaign Manager 360.

‎src/connections/destinations/catalog/actions-marketo-static-lists/index.md

+3
@@ -57,6 +57,9 @@ In this step, you'll create an API-Only Marketo user with both Access API and Le
 > warning "Warning:"
 > Do not create a list in the folder for the audience. Segment creates the list for you!
 
+### Using Marketo Static Lists (Actions) with the Event Tester
+This destination keeps track of a `List Id` field for you on the backend. That field is added to payloads as Segment processes them. This means that the Event Tester can't be used out-of-the-box as it can with most destinations. To test an event using the Event Tester for Marketo Static Lists (Actions), you need to add a valid `List Id` to the payload at the `context.personas.external_audience_id` key.
+
 ### Using Marketo Static Lists (Actions) destination with Engage
 
 1. From your Segment workspace, go to **Engage → Engage Settings → Destinations → Add Destination**, and then search for Marketo Static Lists (Actions).
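As a sketch, a test event payload for the Event Tester section above might look like this. The `external_audience_id` value and the other fields are placeholders; substitute a valid List ID from your own workspace:

```json
{
  "type": "track",
  "event": "Audience Entered",
  "userId": "test-user-123",
  "context": {
    "personas": {
      "external_audience_id": "1234"
    }
  }
}
```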

‎src/connections/destinations/catalog/actions-podscribe/index.md

-2
@@ -5,8 +5,6 @@ id: 643fdecd5675b7a6780d0d67
 
 [Podscribe](https://podscribe.com/){:target="\_blank”} measures the effectiveness of podcast advertising. Through integrations with podcast hosting providers, Podscribe matches downloads with on-site actions, providing advertisers with household-level attribution.
 
-{% include content/beta-note.md %}
-
 ## Getting started
 
 1. From the Segment web app, navigate to **Connections > Catalog**.

‎src/connections/destinations/catalog/actions-sendgrid-audiences/index.md

+6-3
@@ -79,7 +79,10 @@ At least one of the following identifier types is required when syncing members
 - Phone Number ID (must be in [E.164](https://www.twilio.com/docs/glossary/what-e164){:target="_blank”} format)
 - External ID
 
-To sync Engage users to a list using Anonymous ID, Phone Number ID, and External ID identifier types, complete the following configuration steps:
+> warning ""
+> If you provide more than one type of identifier for each user in your initial sync, you must send all of those identifier types for any future updates to that Contact.
 
-1. Configure [ID Sync](/docs/engage/trait-activation/id-sync/) to include Anonymous ID, Phone Number ID, or External ID identifiers when syncing users from an Engage Audience to the SendGrid List.
-2. Map the Anonymous ID, Phone Number ID, and External ID identifiers using the [Sync Audience ](#sync-audience-action) Action's Anonymous ID, Phone Number ID, and External ID fields.
+To sync Engage users to a SendGrid list using an identifier type other than email, complete the following additional steps:
+
+1. Configure [ID Sync](/docs/engage/trait-activation/id-sync/) to include a value for the identifier when syncing users from an Engage Audience to the SendGrid List.
+2. Map the identifier using the [Sync Audience Action](#sync-audience-action)'s mapping field.

‎src/connections/destinations/catalog/appsflyer/index.md

+3
@@ -265,6 +265,9 @@ For example, an attribution event coming from an attribution partner would look
 }];
 ```
 
+> info "Attribution and install counts might differ between Segment and attribution providers like AppsFlyer"
+> For more information about the factors that contribute to these differences, see the [Segment's Role in Attribution](/docs/guides/how-to-guides/segment-and-attribution/) documentation.
+
 ## Other Features
 
 ### Revenue Tracking

‎src/connections/destinations/destination-filters.md

-1
@@ -37,7 +37,6 @@ Keep the following limitations in mind when you use destination filters:
 - [Swift](/docs/connections/sources/catalog/libraries/mobile/apple/swift-destination-filters/){:target="_blank"}
 - [React Native](/docs/connections/sources/catalog/libraries/mobile/react-native/react-native-destination-filters/){:target="_blank"}
 - Destination Filters don't apply to events that send through the destination Event Tester.
-- Destination Filters within the UI and [FQL](/docs/api/public-api/fql/) do not currently support matching on event fields containing '.$' or '.$.', which references fields with an array type.
 
 [Contact Segment](https://segment.com/help/contact/){:target="_blank"} if these limitations impact your use case.
‎src/connections/sources/catalog/cloud-apps/antavo/index.md

+82

@@ -0,0 +1,82 @@
---
title: Antavo Source
id: WXNgKpZMsd
---

[Antavo](http://www.antavo.com){:target="_blank"} allows you to synchronize loyalty events and profile updates into Segment.

The Antavo Source allows you to sync profile updates and loyalty events into Segment Destination apps and your Segment warehouse.

This source is maintained by Antavo. For any issues with the source, [contact the Antavo support team](mailto:support@antavo.com).

## Getting started

1. From your workspace's Sources catalog page, click **Add Source**.
2. Search for "Antavo" in the Sources Catalog, select Antavo, and click **Add Source**.
3. On the next screen, you can name the Source (for example, Antavo or Loyalty Engine).
   1. The name is used as a label in the Segment app, and Segment creates a related schema name in your warehouse.
   2. The name can be anything, but we recommend using something that reflects the source and distinguishes among your environments.
4. Click **Add Source** to save your settings.
5. Copy the write key from the Segment UI.
6. Log into your Antavo account.
7. Select the Twilio Segment integration in the Antavo platform.

![Enable Twilio Segment extension](images/1-antavo-enable_segment_extension.png)
8. Insert the Segment write key and select which attribute contains the userID to use as the user identifier when syncing events.

![Configure Twilio Segment extension](images/2-antavo-configure_segment_extension.png)
9. Go to the Outbound settings page and select:
   - The events you want to sync to Segment.
   - The customer attribute updates you want to sync to Segment.

![Configure event synchronization](images/3-antavo-configure_event_sync.png)

## Events

Antavo syncs two main types of events to Segment: Profile Updates and Loyalty Events. Profile Updates are sent as Segment Identify events, while Loyalty Events are sent as Segment Track events.

Both event types include a `userId`, which can be configured in Antavo. You can designate any customer attribute as the "external customer ID" to use as the Segment `userId`.

### Profile updates

Profile Updates occur when a customer attribute, added to the Antavo **Customer field sync**, updates. Customer attributes are included in the traits object.

```json
{
  "traits": {
    "first_name": "New",
    "last_name": "Name"
  },
  "userId": "antavo-customer-id",
  "timestamp": "2024-11-26T15:19:14.000Z",
  "type": "identify"
}
```

### Loyalty events

Loyalty Events occur when a built-in or custom event, added to the Antavo **Event sync**, is triggered. The event data is then sent to the Segment Antavo Source. Event properties are included in the properties object.

```json
{
  "properties": {
    "points": 5000
  },
  "type": "track",
  "event": "point_add",
  "userId": "antavo-customer-id",
  "timestamp": "2024-11-26T15:15:49.000Z"
}
```

### Integrations Object

Antavo automatically filters data from being sent to Salesforce destinations ([Salesforce (Actions)](https://segment.com/docs/connections/destinations/catalog/actions-salesforce){:target="_blank"}, [Salesforce Marketing Cloud (Actions)](https://segment.com/docs/connections/destinations/catalog/actions-salesforce-marketing-cloud){:target="_blank"}) and the [Antavo](https://segment.com/docs/connections/destinations/catalog/antavo){:target="_blank"} destination. This is achieved by adding these destinations to the [Integrations object](https://segment.com/docs/guides/filtering-data/#filtering-with-the-integrations-object){:target="_blank"} in the event payloads. Since Antavo has a dedicated Salesforce integration, this filtering helps prevent infinite loops.
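For example, a loyalty event from Antavo might arrive with an integrations object along these lines (a sketch based on the description above; the exact keys in a real payload may differ):

```json
{
  "type": "track",
  "event": "point_add",
  "userId": "antavo-customer-id",
  "integrations": {
    "Salesforce (Actions)": false,
    "Salesforce Marketing Cloud (Actions)": false,
    "Antavo": false
  }
}
```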
## Adding Destinations

As the last step of the Antavo Source setup, you can select Destinations to receive data.

Log into your downstream tools and check to see that your events appear as expected, and that they contain all of the properties you expect. If your events and properties don't appear, check the [Event Delivery](https://segment.com/docs/connections/event-delivery/){:target="_blank"} tool, and refer to the Destination docs for each tool for troubleshooting.

If there are any issues with how the events are arriving to Segment, [contact the Antavo support team](mailto:support@antavo.com).

‎src/connections/sources/catalog/libraries/server/http-api/index.md

+3-2
@@ -462,8 +462,9 @@ When sending a HTTP call from a user's device, you can collect the IP address by
 
 Segment returns a `200` response for all API requests except errors caused by large payloads and JSON errors (which return `400` responses). To debug events that return `200` responses but aren't accepted by Segment, use the Segment Debugger.
 
-Common reasons events are not accepted by Segment include:
-- **Payload is too large:** The HTTP API can handle API requests that are 32KB or smaller. The batch API endpoint accepts a maximum of 500KB per request, with a limit of 32KB per event in the batch. If these limits are exceeded, Segment returns a 400 Bad Request error.
+Common reasons that events are not accepted by Segment:
+- **Payload is too large:** Most HTTP API routes can handle API requests that are 32KB or smaller. If this limit is exceeded, Segment returns a `400 Bad Request` error.
+- **The `/batch` API endpoint:** This endpoint accepts a maximum of 500KB per batch API request. Each batch request can only have up to 2500 events, and each batched event needs to be less than 32KB. Segment returns a `200` response but rejects the event when the number of batched events exceeds the limit.
 - **Identifier is not present**: The HTTP API requires that each payload has a userId and/or anonymousId. If you send events without either the userId or anonymousId, Segment's tracking API responds with a `no_user_anon_id` error. Check the event payload and client instrumentation for more details.
 - **Track event is missing name**: All Track events sent to Segment must have an `event` field.
 - **Deduplication**: Segment deduplicates events using the `messageId` field, which is automatically added to all payloads coming into Segment. If you're setting up the HTTP API yourself, ensure all events have unique messageId values with fewer than 100 characters.
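To avoid `400` errors and silently rejected batches, a pre-flight check along these lines can enforce the limits listed above before sending. The constants and helper are illustrative, not part of the Segment API:

```js
// Illustrative pre-flight checks for the HTTP API limits described above.
const MAX_EVENT_BYTES = 32 * 1024;   // 32KB per event
const MAX_BATCH_BYTES = 500 * 1024;  // 500KB per /batch request
const MAX_BATCH_EVENTS = 2500;       // events per /batch request

function validateBatch(events) {
  const oversized = events.filter(
    (e) => Buffer.byteLength(JSON.stringify(e), 'utf8') > MAX_EVENT_BYTES
  );
  const batchBytes = Buffer.byteLength(JSON.stringify({ batch: events }), 'utf8');

  return {
    ok: oversized.length === 0
      && events.length <= MAX_BATCH_EVENTS
      && batchBytes <= MAX_BATCH_BYTES,
    oversizedEvents: oversized.length,
    eventCount: events.length,
    batchBytes,
  };
}
```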

‎src/connections/sources/catalog/libraries/server/node/index.md

+130-85
@@ -15,7 +15,7 @@ All of Segment's server-side libraries are built for high-performance, so you ca
 ## Getting Started
 
 > warning ""
-> Make sure you're using a version of Node that's 16 or higher.
+> Make sure you're using a version of Node that's 18 or higher.
 
 1. Run the relevant command to add Segment's Node library module to your `package.json`.
 
@@ -289,25 +289,105 @@
 
 See the complete `AnalyticsSettings` interface [in the analytics-next repository](https://github.com/segmentio/analytics-next/blob/master/packages/node/src/app/settings.ts){:target="_blank"}.
 
-## Usage in serverless environments
+## Usage in serverless environments and non-node runtimes
+Segment supports a variety of runtimes, including, but not limited to:
+- AWS Lambda
+- Cloudflare Workers
+- Vercel Edge Functions
+- Web Workers / Browser (no device mode destination support)
 
-When calling Track within functions in serverless runtime environments, wrap the call in a `Promise` and `await` it to avoid having the runtime exit or freeze:
+### Usage in AWS Lambda
+- [AWS lambda execution environment](https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtime-environment.html){:target="_blank"} is challenging for typically non-response-blocking async activities like tracking or logging, since the runtime terminates or freezes after a response is emitted.
 
-```js
-await new Promise((resolve) =>
-  analytics().track({ ... }, resolve)
-)
+Here is an example of using analytics.js within a handler:
+```ts
+const { Analytics } = require('@segment/analytics-node');
+
+// Preferable to create a new analytics instance per-invocation. Otherwise, we may get a warning about overlapping flush calls. Also, custom plugins have the potential to be stateful, so we prevent those kinds of race conditions.
+const createAnalytics = () => new Analytics({
+  writeKey: '<MY_WRITE_KEY>',
+}).on('error', console.error);
+
+module.exports.handler = async (event) => {
+  const analytics = createAnalytics()
+
+  analytics.identify({ ... })
+  analytics.track({ ... })
+
+  // ensure analytics events get sent before program exits
+  await analytics.flush()
+
+  return {
+    statusCode: 200,
+  };
+  ....
+};
+```
+
+### Usage in Vercel Edge Functions
+
+```ts
+import { Analytics } from '@segment/analytics-node';
+import { NextRequest, NextResponse } from 'next/server';
+
+const createAnalytics = () => new Analytics({
+  writeKey: '<MY_WRITE_KEY>',
+}).on('error', console.error)
+
+export const config = {
+  runtime: 'edge',
+};
+
+export default async (req: NextRequest) => {
+  const analytics = createAnalytics()
+
+  analytics.identify({ ... })
+  analytics.track({ ... })
+
+  // ensure analytics events get sent before program exits
+  await analytics.flush()
+
+  return NextResponse.json({ ... })
+};
 ```
 
-See the complete documentation on [Usage in AWS Lambda](https://github.com/segmentio/analytics-next/blob/master/packages/node/README.md#usage-in-aws-lambda){:target="_blank"}, [Usage in Vercel Edge Functions](https://github.com/segmentio/analytics-next/blob/master/packages/node/README.md#usage-in-vercel-edge-functions){:target="_blank"}, and [Usage in Cloudflare Workers](https://github.com/segmentio/analytics-next/blob/master/packages/node/README.md#usage-in-cloudflare-workers){:target="_blank"}
+### Usage in Cloudflare Workers
+
+```ts
+import { Analytics, Context } from '@segment/analytics-node';
+
+const createAnalytics = () => new Analytics({
+  writeKey: '<MY_WRITE_KEY>',
+}).on('error', console.error);
+
+export default {
+  async fetch(
+    request: Request,
+    env: Env,
+    ctx: ExecutionContext
+  ): Promise<Response> {
+    const analytics = createAnalytics()
+
+    analytics.identify({ ... })
+    analytics.track({ ... })
+
+    // ensure analytics events get sent before program exits
+    await analytics.flush()
+
+    return new Response(...)
+  },
+};
+```
 
 ## Graceful shutdown
-Avoid losing events after shutting down your console. Call `.closeAndFlush()` to stop collecting new events and flush all existing events. If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved.
+Avoid losing events after shutting down your console. Call `.flush({ close: true })` to stop collecting new events and flush all existing events. If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved.
 
 ```javascript
-await analytics.closeAndFlush()
+await analytics.flush({ close: true })
 // or
-await analytics.closeAndFlush({ timeout: 5000 }) // force resolve after 5000ms
+await analytics.flush({ close: true, timeout: 5000 }) // force resolve after 5000ms
 ```
 
 Here's an example of how to use graceful shutdown:
@@ -316,7 +396,7 @@ const app = express()
 const server = app.listen(3000)
 
 const onExit = async () => {
-  await analytics.closeAndFlush()
+  await analytics.flush({ close: true })
   server.close(() => {
     console.log("Gracefully closing server...")
     process.exit()
@@ -326,15 +406,15 @@ const onExit = async () => {
 ```
 
 ### Collect unflushed events
-If you need to preserve all of your events in the instance of a forced timeout, even ones that came in after analytics.closeAndFlush() was called, you can still collect those events by using:
+If you need to preserve all of your events in the instance of a forced timeout, even ones that came in after analytics.flush({ close: true }) was called, you can still collect those events by using:
 
 ```javascript
 const unflushedEvents = []
 
 analytics.on('call_after_close', (event) => unflushedEvents.push(event))
-await analytics.closeAndFlush()
+await analytics.flush({ close: true })
 
-console.log(unflushedEvents) // all events that came in after closeAndFlush was called
+console.log(unflushedEvents) // all events that came in after flush was called
 ```
 
 ## Regional configuration
@@ -362,22 +442,17 @@ analytics.on('error', (err) => console.error(err))
 
 
 ### Event emitter interface
-The event emitter interface allows you to track events, like Track and Identify calls, and it calls the function you provided with some arguments upon successful delivery. `error` emits on delivery error.
-
-```javascript
-analytics.on('error', (err) => console.error(err))
+The event emitter interface allows you to pass a callback which will be invoked whenever a specific emitter event occurs in your app, such as when a certain method call is made.
 
-analytics.on('identify', (ctx) => console.log(ctx))
+For example:
 
+```javascript
 analytics.on('track', (ctx) => console.log(ctx))
-```
-
-Use the emitter to log all HTTP Requests.
+analytics.on('error', (err) => console.error(err))
 
-```javascript
-analytics.on('http_request', (event) => console.log(event))
 
-// when triggered, emits an event of the shape:
+// when triggered, emits an event of the shape:
+analytics.on('http_request', (event) => console.log(event))
 {
   url: 'https://api.segment.io/v1/batch',
   method: 'POST',
@@ -388,32 +463,43 @@ Use the emitter to log all HTTP Requests.
   body: '...',
 }
 ```
+
+### Emitter Types
 
+The following table documents all the emitter types available in the Analytics Node.js library:
 
-## Plugin architecture
-When you develop in [Analytics.js 2.0](/docs/connections/sources/catalog/libraries/website/javascript/), the plugins you write can improve functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done.
+| Emitter Type | Description |
+|-------------------|-----------------------------------------------------------------------------|
+| `error` | Emitted when there is an error after SDK initialization. |
+| `identify` | Emitted when an Identify call is made. |
+| `track` | Emitted when a Track call is made. |
+| `page` | Emitted when a Page call is made. |
+| `group` | Emitted when a Group call is made. |
+| `alias` | Emitted when an Alias call is made. |
+| `flush` | Emitted after a batch is flushed. |
+| `http_request` | Emitted when an HTTP request is made. |
+| `register` | Emitted when a plugin is registered. |
+| `call_after_close`| Emitted when an event is received after the flush with `{ close: true }`. |
 
-Though middlewares function the same as plugins, it's best to use plugins as they are easier to implement and are more testable.
+These emitters allow you to hook into various stages of the event lifecycle and handle them accordingly.
 
-### Plugin categories
-Plugins are bound by Analytics.js 2.0 which handles operations such as observability, retries, and error handling. There are two different categories of plugins:
-* **Critical Plugins**: Analytics.js expects this plugin to be loaded before starting event delivery. Failure to load a critical plugin halts event delivery. Use this category sparingly, and only for plugins that are critical to your tracking.
-* **Non-critical Plugins**: Analytics.js can start event delivery before this plugin finishes loading. This means your plugin can fail to load independently from all other plugins. For example, every Analytics.js destination is a non-critical plugin. This makes it possible for Analytics.js to continue working if a partner destination fails to load, or if users have ad blockers turned on that are targeting specific destinations.
 
-> info ""
-> Non-critical plugins are only non-critical from a loading standpoint. For example, if the `before` plugin crashes, this can still halt the event delivery pipeline.
+## Plugin architecture
+The plugins you write can improve functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done.
+
 
-Non-critical plugins run through a timeline that executes in order of insertion based on the entry type. Segment has these five entry types of non-critical plugins:
+### Plugin categories
+Segment has these five entry types of plugins:
 
-| Type | Details
------- | --------
-| `before` | Executes before event processing begins. These are plugins that run before any other plugins run. <br><br>For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline.
-| `enrichment` | Executes as the first level of event processing. These plugins modify an event.
-| `destination` | Executes as events begin to pass off to destinations. <br><br> This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution.
-| `after` | Executes after all event processing completes. You can use this to perform cleanup operations. <br><br>An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"} which waits for destinations to succeed or fail so it can send it observability metrics.
-| `utility` | Executes once during the bootstrap, to give you an outlet to make any modifications as to how Analytics.js works internally. This allows you to augment Analytics.js functionality.
+| Type | Details |
+| ------------- | ------------- |
+| `before` | Executes before event processing begins. These are plugins that run before any other plugins run. Thrown errors here can block the event pipeline. Source middleware added using `addSourceMiddleware` is treated as a `before` plugin. No events send to destinations until the `.load()` method is resolved. |
+| `enrichment` | Executes as the first level of event processing. These plugins modify an event. Thrown errors here can block the event pipeline. No events send to destinations until the `.load()` method is resolved. |
+| `destination` | Executes as events begin to pass off to destinations. Segment.io is implemented as a destination plugin. Thrown errors here will _not_ block the event pipeline. |
+| `after` | Executes after all event processing completes. You can use this to perform cleanup operations. |
+| `utility` | Executes _only once_ during the bootstrap. Gives you access to the analytics instance using the plugin's `load()` method. This doesn't allow you to modify events. |
 
-### Example plugins
+### Example plugin
 Here's an example of a plugin that converts all track event names to lowercase before the event goes through the rest of the pipeline:
 
 ```js
@@ -430,49 +516,8 @@ export const lowercase: Plugin = {
     return ctx
   }
 }
-
-const identityStitching = () => {
-  let user
-
-  const identity = {
-    // Identifies your plugin in the Plugins stack.
-    // Access `window.analytics.queue.plugins` to see the full list of plugins
-    name: 'Identity Stitching',
-    // Defines where in the event timeline a plugin should run
-    type: 'enrichment',
-    version: '0.1.0',
-
-    // Used to signal that a plugin has been properly loaded
-    isLoaded: () => user !== undefined,
-
-    // Applies the plugin code to every `identify` call in Analytics.js
-    // You can override any of the existing types in the Segment Spec.
-    async identify(ctx) {
-      // Request some extra info to enrich your `identify` events from
-      // an external API.
-      const req = await fetch(
-        `https://jsonplaceholder.typicode.com/users/${ctx.event.userId}`
-      )
-      const userReq = await req.json()
-
-      // ctx.updateEvent can be used to update deeply nested properties
-      // in your events. It's a safe way to change events as it'll
-      // create any missing objects and properties you may require.
-      ctx.updateEvent('traits.custom', userReq)
-      user.traits(userReq)
-
-      // Every plugin must return a `ctx` object, so that the event
-      // timeline can continue processing.
-      return ctx
-    },
-  }
-
-  return identity
-}
 ```
 
-You can view Segment's [existing plugins](https://github.com/segmentio/analytics-next/tree/master/packages/browser/src/plugins){:target="_blank"} to see more examples.
-
 ### Register a plugin
 Registering plugins enables you to modify your analytics implementation to best fit your needs. You can register a plugin using this:
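A sketch of that call, based on the analytics-next `register` method (registering the `lowercase` plugin from the example above):

```js
// Register the `lowercase` plugin defined above.
// `register` resolves once the plugin has loaded.
await analytics.register(lowercase)
```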

‎src/connections/sources/catalog/libraries/server/node/migration.md

+2-2
@@ -32,14 +32,14 @@ If you're using the [classic version of Analytics Node.js](/docs/connections/sou
 
 <br> Before:
 ```javascript
-await analytics.flush(function(err, batch) {
+await analytics.flush((err, batch) => {
   console.log('Flushed, and now this program can exit!');
 });
 ```
 
 After:
 ```javascript
-await analytics.closeAndFlush()
+await analytics.flush({ close: true })
 ```
 
 ### Key differences between the classic and updated version

‎src/engage/audiences/index.md

+1-1
@@ -277,4 +277,4 @@ The audience builder accepts CSV and TSV lists.
 This error occurs when creating audiences that reference each other, meaning audience X refers to audience Y in its trigger condition, and later you attempt to modify audience Y's trigger condition to refer back to audience X. To avoid this error, ensure that the audiences do not reference each other in their conditions.
 
 ### How does the historical data flag work?
-Including historical data lets you take past information into account. You can data only exclude historical data for real-time audiences. For batch audiences, Segment includes historical data by default.
+Including historical data lets you take past information into account. You can only exclude historical data for real-time audiences. For batch audiences, Segment includes historical data by default.

‎src/protocols/faq.md

+4
@@ -177,6 +177,10 @@ Blocking events within a [Source Schema](/docs/connections/sources/schema/) or [
 
 Warehouse connectors don't use data type definitions for schema creation. The [data types](/docs/connections/storage/warehouses/schema/#data-types) for columns are inferred from the first event that comes in from the source.
 
+### Can I use schema controls to block events forwarded to my source from another source?
+
+You can only use schema controls to block events at the point that they are ingested into Segment. When you forward an event that Segment has previously ingested from another source, that event bypasses the pipeline that Segment uses to block events and cannot be blocked a second time.
+
 ## Protocols Transformations
 
 ### Do transformations work with Segment replays?

‎src/segment-app/iam/sso.md

+1-1
@@ -75,7 +75,7 @@ You can now test using IdP-initiated SSO (by clicking login to Segment from with
 
 For most customers, Segment recommends requiring SSO for all users. If you do not require SSO, users can still log in with a username and password. If some members cannot log in using SSO, Segment also supports SSO exceptions.
 
-These options are off by default, but configurable on the "Advanced Settings" page.
+These options are off by default, but you can configure them on the **Advanced Settings** page. Log in using SSO to toggle the **Require SSO** setting.
 
 ![Screenshot of the Advanced Settings page in the Authentication settings tab.](images/asset_require_sso.png)
