**src/connections/destinations/actions.md**

You can also test within the mapping itself. To test the mapping:
1. Navigate to the **Mappings** tab of your destination.
2. Select a mapping, click **...**, and select **Edit Mapping**.
3. In step 2 of the mappings edit page, click **Load Test Event from Source** to add a test event from the source, or you can add your own sample event.
4. Scroll to step 4 on the page, and click **Test Mapping** to test the mapping and view the response from the destination.
> info "Test Mapping might not return the events you're looking for"
> Segment only surfaces a small subset of events for the Test Mapping feature and might not always return the event you're looking for. If you'd like to test with a specific event, copy a specific event from your [Source Debugger](/docs/connections/sources/debugger/) and paste it into the **Add test event** interface.
## Customize mappings
The coalesce function takes a primary value and uses it if it is available. If the primary value isn't available, the function uses the fallback value.

### Replace function
The replace function allows you to replace a string, integer, or boolean with a new value. You have the option to replace up to two values within a single field.
### Flatten function
The flatten function allows you to flatten a nested object to an object with a depth of 1. Keys are delimited by the configured separator. For example, an object like `{ a: { b: { c: 1 }, d: 2 } }` is converted to `{ 'a.b.c': 1, 'a.d': 2 }`.
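As an illustration of what flattening does, here's a minimal sketch assuming a `.` separator (this is illustrative, not Segment's implementation):

```javascript
// Recursively flatten a nested object into an object of depth 1,
// joining nested keys with the given separator.
function flatten(obj, separator = '.', prefix = '') {
  const result = {};
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}${separator}${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      // nested object: merge its flattened entries into the result
      Object.assign(result, flatten(value, separator, path));
    } else {
      result[path] = value;
    }
  }
  return result;
}
```

For example, `flatten({ a: { b: { c: 1 }, d: 2 } })` returns `{ 'a.b.c': 1, 'a.d': 2 }`.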
The Google Campaign Manager 360 destination allows users to upload [conversions](https://developers.google.com/doubleclick-advertisers/guides/conversions_upload){:target="_blank"} and [conversion enhancements](https://developers.google.com/doubleclick-advertisers/guides/conversions_ec){:target="_blank"} to Google Campaign Manager 360. Marketers can use this integration to attribute conversions to specific campaigns, ad groups, and ads.
## Getting Started
> info ""
> You can connect the Google Campaign Manager 360 Destination to an event source, Reverse ETL source, or Engage space.
### Prerequisites
Before you begin, you need a Google Campaign Manager 360 account with a Profile ID and a Floodlight Configuration ID. These are necessary to configure the Floodlight activities you want to track.
### Connect to Google Campaign Manager 360
1. From the Segment web app, navigate to **Catalog > Destinations**.
2. Search for “Google Campaign Manager 360” in the Destinations Catalog, and select it.
3. Click **Add destination**.
4. Select the source that will send data to Google Campaign Manager 360.
* If you select an Engage space, you'll be redirected to Engage to complete the following steps.
* If you select a Reverse ETL source, you must enter a name for your destination and click **Create destination**.
5. On the **Settings** tab for your Google Campaign Manager destination:
* Enter your **Profile ID**. Optionally, provide a default **Floodlight Configuration ID** and/or a default **Floodlight Activity ID**. If you provide them, Segment uses them as defaults for all events sent to Google Campaign Manager 360; you can override these values in your mappings.
6. Click **Save**.
7. Follow the steps in the Destinations Actions documentation to [customize your mappings](/docs/connections/destinations/actions/#customize-mappings).
## Available actions
The Google Campaign Manager 360 Action Destination supports the following actions:

* Conversion Upload
* Conversion Adjustment Upload

### Conversion Upload

The Conversion Upload action allows you to send conversion data to Google Campaign Manager 360. This action is useful for tracking conversions that occur on your website or app.
#### Fields
The Google Campaign Manager 360 destination requires the following fields for the Conversion Upload action:
* **Required ID**: The identifier for the user associated with the conversion. Provide only one of the following fields at a time:
  * Google Click ID (`gclid`)
  * Display Click ID (`dclid`)
  * Encrypted User ID
  * Mobile Device ID
  * Match ID
  * Impression ID
  * Encrypted User ID Candidates
* **Timestamp**: The time the conversion occurred.
* **Value**: The value of the conversion.
* **Ordinal**: The ordinal of the conversion. This field controls how conversions from the same user on the same day are deduplicated.
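For illustration, a mapped Conversion Upload might carry values like the following (field names and values are hypothetical examples, not a literal API request):

```json
{
  "gclid": "EAIaIQexample",
  "timestamp": "2024-11-26T15:15:49.000Z",
  "value": 99.99,
  "ordinal": "1"
}
```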
### Conversion Adjustment Upload
The Conversion Adjustment Upload action allows you to send conversion adjustment data to Google Campaign Manager 360. This action is useful for adjustments to conversions that have already been uploaded, as well as enhancing conversions.
#### Fields
The Google Campaign Manager 360 destination requires the following fields for the Conversion Adjustment Upload action:
* **Required ID**: The identifier for the user associated with the conversion. Provide only one of the following fields at a time:
  * Google Click ID (`gclid`)
  * Display Click ID (`dclid`)
  * Encrypted User ID
  * Mobile Device ID
  * Match ID
  * Impression ID
* **Timestamp**: The time the conversion occurred.
* **Value**: The value of the conversion.
* **Ordinal**: The ordinal of the conversion. This field controls how conversions from the same user on the same day are deduplicated.
## Hashing
Google requires you to hash all PII before sending it to the Google API.
The Google Campaign Manager 360 destination supports hashing for the following fields:
* Email
* Phone
* First Name
* Last Name
* Street Address
The hashing algorithm used is SHA-256. If incoming data arrives already hashed, the destination doesn't hash it again and sends the values to Google as-is.
{% include components/actions-fields.html settings="true"%}
## FAQ and troubleshooting
### Refreshing access tokens
When you use OAuth to authenticate into the Google Campaign Manager 360 destination, Segment stores an access token and refresh token. Access tokens for Google Campaign Manager 360 expire after one hour. Once expired, Segment receives an error and then uses the refresh token to fetch a new access token. This results in two API requests to Google Campaign Manager 360, one failure and one success.
Because of the duplicate API requests, you may see a warning in Google for unprocessed conversions due to incorrect or missing OAuth credentials. This warning is expected and does not indicate data loss. Google has confirmed that conversions are being processed, and OAuth retry behavior will not cause any issues for your web conversions. Whenever possible, Segment caches access tokens to reduce the total number of requests made to Google Campaign Manager 360.
**src/connections/destinations/catalog/actions-marketo-static-lists/index.md**
> warning "Warning:"
> Do not create a list in the folder for the audience. Segment creates the list for you!
### Using Marketo Static Lists (Actions) with the Event Tester
This destination keeps track of a `List Id` field for you on the backend. That field is added to payloads as Segment processes them. This means that the Event Tester can't be used out-of-the-box as it can with most destinations. To test an event using the Event Tester for Marketo Static Lists (Actions), you need to add a valid `List Id` to the payload at the `context.personas.external_audience_id` key.
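For example, a test event might look like this (the event shape and `external_audience_id` value are illustrative; substitute a `List Id` from your own account):

```json
{
  "type": "track",
  "event": "Audience Entered",
  "userId": "test-user-123",
  "context": {
    "personas": {
      "external_audience_id": "1234"
    }
  }
}
```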
### Using Marketo Static Lists (Actions) destination with Engage
1. From your Segment workspace, go to **Engage → Engage Settings → Destinations → Add Destination**, then search for Marketo Static Lists (Actions).
**src/connections/destinations/catalog/actions-podscribe/index.md**
[Podscribe](https://podscribe.com/){:target="_blank"} measures the effectiveness of podcast advertising. Through integrations with podcast hosting providers, Podscribe matches downloads with on-site actions, providing advertisers with household-level attribution.
## Getting started
1. From the Segment web app, navigate to **Connections > Catalog**.
**src/connections/destinations/catalog/actions-sendgrid-audiences/index.md**
At least one of the following identifier types is required when syncing members:
- Phone Number ID (must be in [E.164](https://www.twilio.com/docs/glossary/what-e164){:target="_blank”} format)
- External ID
> warning ""
> If you provide more than one type of identifier for each user in your initial sync, you must send all of those identifier types for any future updates to that Contact.
To sync Engage users to a SendGrid list using an identifier type other than email, complete the following additional steps:
1. Configure [ID Sync](/docs/engage/trait-activation/id-sync/) to include a value for the identifier when syncing users from an Engage Audience to the SendGrid List.
2. Map the identifier using the [Sync Audience Action](#sync-audience-action)'s mapping field.
**src/connections/destinations/catalog/appsflyer/index.md**
> info "Attribution and install counts might differ between Segment and attribution providers like AppsFlyer"
> For more information about the factors that contribute to these differences, see the [Segment's Role in Attribution](/docs/guides/how-to-guides/segment-and-attribution/) documentation.
- Destination Filters don't apply to events sent through the destination Event Tester.
[Contact Segment](https://segment.com/help/contact/){:target="_blank"} if these limitations impact your use case.
Antavo syncs two main types of events to Segment: Profile Updates and Loyalty Events. Profile Updates are sent as Segment Identify events, while Loyalty Events are sent as Segment Track events.
Both event types include a `userId`, which can be configured in Antavo. You can designate any customer attribute as the "external customer ID" to use as the Segment `userId`.
### Profile updates
A Profile Update occurs when a customer attribute that's added to the Antavo **Customer field sync** is updated. Customer attributes are included in the `traits` object.
```json
{
  "traits": {
    "first_name": "New",
    "last_name": "Name"
  },
  "userId": "antavo-customer-id",
  "timestamp": "2024-11-26T15:19:14.000Z",
  "type": "identify"
}
```
### Loyalty events
Loyalty Events occur when a built-in or custom event that's added to the Antavo Event sync is triggered. The event data is then sent to the Segment Antavo Source. Event properties are included in the `properties` object.
```json
{
  "properties": {
    "points": 5000
  },
  "type": "track",
  "event": "point_add",
  "userId": "antavo-customer-id",
  "timestamp": "2024-11-26T15:15:49.000Z"
}
```
### Integrations Object
Antavo automatically filters data from being sent to Salesforce destinations ([Salesforce (Actions)](https://segment.com/docs/connections/destinations/catalog/actions-salesforce){:target="_blank"}, [Salesforce Marketing Cloud (Actions)](https://segment.com/docs/connections/destinations/catalog/actions-salesforce-marketing-cloud){:target="_blank"}) and the [Antavo](https://segment.com/docs/connections/destinations/catalog/antavo){:target="_blank"} destination. This is achieved by adding these destinations to the [Integrations object](https://segment.com/docs/guides/filtering-data/#filtering-with-the-integrations-object){:target="_blank"} in the event payloads. Since Antavo has a dedicated Salesforce integration, this filtering helps prevent infinite loops.
## Adding Destinations
As the last step of the Antavo Source setup, you can select Destinations to receive data.
Log into your downstream tools and check to see that your events appear as expected, and that they contain all of the properties you expect. If your events and properties don't appear, check the [Event Delivery](/docs/connections/event-delivery/) tool, and refer to the Destination docs for each tool for troubleshooting.
If there are any issues with how the events are arriving to Segment, [contact the Antavo support team](mailto:support@antavo.com).
**src/connections/sources/catalog/libraries/server/http-api/index.md**
Segment returns a `200` response for all API requests except errors caused by large payloads and JSON errors (which return `400` responses). To debug events that return `200` responses but aren't accepted by Segment, use the Segment Debugger.
Common reasons that events are not accepted by Segment:
- **Payload is too large:** Most HTTP API routes can handle API requests that are 32KB or smaller. If this limit is exceeded, Segment returns a `400 Bad Request` error.
- **The `/batch` API endpoint:** This endpoint accepts a maximum of 500KB per batch API request. Each batch request can have up to 2500 events, and each batched event needs to be less than 32KB. Segment returns a `200` response but rejects the event when the number of batched events exceeds the limit.
- **Identifier is not present**: The HTTP API requires that each payload has a `userId` and/or `anonymousId`. If you send events without either the `userId` or `anonymousId`, Segment's tracking API responds with a `no_user_anon_id` error. Check the event payload and client instrumentation for more details.
- **Track event is missing name**: All Track events sent to Segment must have an `event` field.
- **Deduplication**: Segment deduplicates events using the `messageId` field, which is automatically added to all payloads coming into Segment. If you're setting up the HTTP API yourself, ensure all events have unique `messageId` values with fewer than 100 characters.
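For reference, a `/batch` request body wraps individual events in a `batch` array, and each event carries its own `messageId` (values below are illustrative):

```json
{
  "batch": [
    {
      "type": "identify",
      "userId": "user-123",
      "traits": { "email": "jane@example.com" },
      "messageId": "example-id-1"
    },
    {
      "type": "track",
      "event": "Order Completed",
      "userId": "user-123",
      "messageId": "example-id-2"
    }
  ]
}
```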
**src/connections/sources/catalog/libraries/server/node/index.md**

All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code.
## Getting Started
> warning ""
> Make sure you're using a version of Node that's 18 or higher.
1. Run the relevant command to add Segment's Node library module to your `package.json`.
See the complete `AnalyticsSettings` interface [in the analytics-next repository](https://github.com/segmentio/analytics-next/blob/master/packages/node/src/app/settings.ts){:target="_blank"}.
## Usage in serverless environments and non-Node runtimes
Segment supports a variety of runtimes, including, but not limited to:
- AWS Lambda
- Cloudflare Workers
- Vercel Edge Functions
- Web Workers / Browser (no device mode destination support)
### Usage in AWS Lambda
- The [AWS Lambda execution environment](https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtime-environment.html){:target="_blank"} is challenging for typically non-response-blocking async activities like tracking or logging, since the runtime terminates or freezes after a response is emitted.
Here is an example of using Analytics within a handler:

```js
const { Analytics } = require('@segment/analytics-node');

// Preferable to create a new analytics instance per invocation. Otherwise,
// you may get a warning about overlapping flush calls. Custom plugins also
// have the potential to be stateful, so this prevents those kinds of
// race conditions.
const createAnalytics = () => new Analytics({
  writeKey: '<MY_WRITE_KEY>',
}).on('error', console.error);

module.exports.handler = async (event) => {
  const analytics = createAnalytics()

  analytics.identify({ ... })
  analytics.track({ ... })

  // ensure analytics events get sent before program exits
  await analytics.flush()

  return {
    statusCode: 200,
  };
};
```
### Usage in Vercel Edge Functions
```ts
import { Analytics } from '@segment/analytics-node';
import { NextRequest, NextResponse } from 'next/server';

const createAnalytics = () => new Analytics({
  writeKey: '<MY_WRITE_KEY>',
}).on('error', console.error)

export const config = {
  runtime: 'edge',
};

export default async (req: NextRequest) => {
  const analytics = createAnalytics()

  analytics.identify({ ... })
  analytics.track({ ... })

  // ensure analytics events get sent before program exits
  await analytics.flush()

  return NextResponse.json({ ... })
};
```
### Usage in Cloudflare Workers
```ts
import { Analytics, Context } from '@segment/analytics-node';

const createAnalytics = () => new Analytics({
  writeKey: '<MY_WRITE_KEY>',
}).on('error', console.error);

export default {
  async fetch(
    request: Request,
    env: Env,
    ctx: ExecutionContext
  ): Promise<Response> {
    const analytics = createAnalytics()

    analytics.identify({ ... })
    analytics.track({ ... })

    // ensure analytics events get sent before program exits
    await analytics.flush()

    return new Response(...)
  },
};
```
## Graceful shutdown
Avoid losing events after shutting down your console. Call `.flush({ close: true })` to stop collecting new events and flush all existing events. If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved.
```javascript
await analytics.flush({ close: true })
// or
await analytics.flush({ close: true, timeout: 5000 }) // force resolve after 5000ms
```
Here's an example of how to use graceful shutdown:
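A sketch of one approach, assuming `analytics` is a configured instance (the signal names and exit codes are illustrative):

```js
const exitGracefully = async (code) => {
  // stop collecting new events and flush everything already queued
  await analytics.flush({ close: true })
  process.exit(code)
}

process.on('SIGINT', () => exitGracefully(0))
process.on('SIGTERM', () => exitGracefully(0))
```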
If you need to preserve all of your events in the instance of a forced timeout, even ones that came in after `analytics.flush({ close: true })` was called, you can still collect those events by using:
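One way to do this is the `call_after_close` emitter event (a sketch, assuming `analytics` is a configured instance):

```js
const unflushedEvents = []
analytics.on('call_after_close', (event) => unflushedEvents.push(event))

await analytics.flush({ close: true, timeout: 5000 })
// unflushedEvents now holds any events that arrived after the flush began
```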
The event emitter interface allows you to pass a callback which will be invoked whenever a specific emitter event occurs in your app, such as when a certain method call is made.
Use the emitter to log all HTTP requests.
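A sketch of such a logger, assuming `analytics` is a configured instance (the logged object includes fields such as `url`, `method`, `headers`, and `body`):

```js
analytics.on('http_request', (request) => {
  console.log(request) // e.g. { url: '...', method: 'POST', headers: { ... }, body: '...' }
})
```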
### Emitter Types
The following table documents all the emitter types available in the Analytics Node.js library:
| Type | Details |
| ---- | ------- |
| `error` | Emitted when there is an error after SDK initialization. |
| `identify` | Emitted when an Identify call is made. |
| `track` | Emitted when a Track call is made. |
| `page` | Emitted when a Page call is made. |
| `group` | Emitted when a Group call is made. |
| `alias` | Emitted when an Alias call is made. |
| `flush` | Emitted after a batch is flushed. |
| `http_request` | Emitted when an HTTP request is made. |
| `register` | Emitted when a plugin is registered. |
| `call_after_close` | Emitted when an event is received after the flush with `{ close: true }`. |
These emitters allow you to hook into various stages of the event lifecycle and handle them accordingly.
## Plugin architecture
The plugins you write can improve functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done.
489
+
405
490
406
-
Non-critical plugins run through a timeline that executes in order of insertion based on the entry type. Segment has these five entry types of non-critical plugins:
491
+
### Plugin categories
492
+
Segment has these five entry types of plugins:
407
493
408
-
| Type | Details
409
-
------ | --------
410
-
| `before` | Executes before event processing begins. These are plugins that run before any other plugins run. <br><br>For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline.
411
-
| `enrichment` | Executes as the first level of event processing. These plugins modify an event.
412
-
| `destination` | Executes as events begin to pass off to destinations. <br><br> This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution.
413
-
| `after` | Executes after all event processing completes. You can use this to perform cleanup operations. <br><br>An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"} which waits for destinations to succeed or fail so it can send it observability metrics.
414
-
| `utility` | Executes once during the bootstrap, to give you an outlet to make any modifications as to how Analytics.js works internally. This allows you to augment Analytics.js functionality.
494
+
| Type | Details
495
+
|-------------| ------------- |
496
+
|`before`| Executes before event processing begins. These are plugins that run before any other plugins run. Thrown errors here can block the event pipeline. Source middleware added using `addSourceMiddleware` is treated as a `before` plugin. No events send to destinations until`.load()` method is resolved. |
497
+
|`enrichment`| Executes as the first level of event processing. These plugins modify an event. Thrown errors here can block the event pipeline. No events send to destinations until`.load()` method is resolved. |
498
+
|`destination`| Executes as events begin to pass off to destinations. Segment.io is implemented as a destination plugin. Thrown errors here will _not_ block the event pipeline. |
499
+
|`after`| Executes after all event processing completes. You can use this to perform cleanup operations. |
500
+
|`utility`| Executes _only once_ during the bootstrap. Gives you access to the analytics instance using the plugin's `load()` method. This doesn't allow you to modify events. |
415
501
416
-
### Example plugins
502
+
### Example plugin
417
503
Here's an example of a plugin that converts all track event names to lowercase before the event goes through the rest of the pipeline:
// ctx.updateEvent can be used to update deeply nested properties
459
-
// in your events. It's a safe way to change events as it'll
460
-
// create any missing objects and properties you may require.
461
-
ctx.updateEvent('traits.custom', userReq)
462
-
user.traits(userReq)
463
-
464
-
// Every plugin must return a `ctx` object, so that the event
465
-
// timeline can continue processing.
466
-
return ctx
467
-
},
468
-
}
469
-
470
-
return identity
471
-
}
472
519
```
### Register a plugin
Registering plugins enables you to modify your analytics implementation to best fit your needs. You can register a plugin using this:
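A sketch, assuming `analytics` is a configured instance and `myPlugin` is a plugin object:

```js
await analytics.register(myPlugin)
```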
**src/engage/audiences/index.md**

The audience builder accepts CSV and TSV lists.
This error occurs when creating audiences that reference each other, meaning audience X refers to audience Y in its trigger condition, and later you attempt to modify audience Y's trigger condition to refer back to audience X. To avoid this error, ensure that the audiences do not reference each other in their conditions.
### How does the historical data flag work?
Including historical data lets you take past information into account. You can only exclude historical data for real-time audiences. For batch audiences, Segment includes historical data by default.
**src/protocols/faq.md**
Warehouse connectors don't use data type definitions for schema creation. The [data types](/docs/connections/storage/warehouses/schema/#data-types) for columns are inferred from the first event that comes in from the source.
### Can I use schema controls to block events forwarded to my source from another source?
You can only use schema controls to block events at the point that they are ingested into Segment. When you forward an event that Segment has previously ingested from another source, that event bypasses the pipeline that Segment uses to block events and cannot be blocked a second time.
**src/segment-app/iam/sso.md**
For most customers, Segment recommends requiring SSO for all users. If you do not require SSO, users can still log in with a username and password. If some members cannot log in using SSO, Segment also supports SSO exceptions.
These options are off by default, but you can configure them on the **Advanced Settings** page. Log in using SSO to toggle the **Require SSO** setting.