{% capture name %}{{page.title | replace: 'Destination', ''}}{% endcapture %}
<div class="premonition info"><div class="fa fa-info-circle"></div><div class="content"><p class="header">View observability metrics about your {{title}} with Delivery Overview</p><p markdown=1>Delivery Overview, Segment's built-in observability tool, is now in public beta for storage destinations. For more information, see the [Delivery Overview](/docs/connections/delivery-overview/) documentation.</p></div></div>
**File:** src/connections/delivery-overview.md (+40 −14)
Delivery Overview is a visual observability tool designed to help Segment users diagnose event delivery issues for any cloud-streaming destination receiving events from cloud-streaming sources.
> info "Delivery Overview for RETL destinations and Engage Audience Syncs currently in development"
> This means that Segment is actively developing Delivery Overview features for RETL destinations and Engage Audience syncs. Some functionality may change before Delivery Overview for these integrations becomes generally available.
>
> Delivery Overview is generally available for streaming connections (cloud-streaming sources and cloud-streaming destinations) and in public beta for storage destinations. Some metrics specific to storage destinations, like selective syncs, failed row counts, and total rows seen, are not yet available.
> All users of Delivery Overview have access to the Event Delivery tab, and can configure delivery alerts for their destinations.
## Key features
Delivery Overview has three core features:
You can refine these tables using the time picker and the metric toggle, located under the destination header. With the time picker, you can specify a time period (last 10 minutes, 1 hour, 24 hours, 7 days, 2 weeks, or a custom date range over the last two weeks) for which you'd like to see data. With the metric toggle, you can switch between seeing metrics represented as percentages (for example, *85% of events* or *a 133% increase in events*) or as counts (*13 events* or *an increase of 145 events*). Delivery Overview shows percentages by default.
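To make the relationship between the two toggle modes concrete, here is a small arithmetic sketch (plain Python, not Segment code) using the example figures above:

```python
def as_count_and_percent(previous: int, current: int) -> tuple[int, float]:
    """Express an event-volume change both ways the metric toggle does:
    as an absolute count and as a percentage of the previous volume."""
    delta = current - previous
    percent = (delta / previous) * 100 if previous else float("inf")
    return delta, percent

# A jump from 109 to 254 events is an increase of 145 events, or roughly 133%.
delta, pct = as_count_and_percent(109, 254)
print(f"{delta} events, {pct:.0f}%")
```

The same underlying change reads very differently in the two modes, which is why the toggle is useful when comparing small and large sources.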
### Pipeline view
The pipeline view provides insight into each step your data passes through en route to the destination, with an emphasis on the steps where data can be discarded due to errors or your filter preferences. Each step provides details about counts, change rates, and event details (like the associated Event Type or Event Names), and the discard steps (Failed on ingest, Filtered at source, Filtered at destination, and Failed delivery) show the reasons events were dropped before reaching the destination. Discard steps also explain how to control or alter that outcome, when possible. The pipeline view also includes a label between the Filtered at destination and Failed delivery steps indicating how many events are currently pending retry.
#### Classic destinations
The pipeline view for classic destinations includes the following steps:
- **Successfully received**: Events that Segment ingested from your source.
- **Failed on ingest**: Events that Segment received, but were dropped due to internal data validation rules.
- **Filtered at source**: Events that were discarded due to schema settings or [Protocols](/docs/protocols/) Tracking Plans.
- **Filtered at destination**: Events that were discarded due to [Destination Filters](/docs/guides/filtering-data/#destination-filters), [filtering in the Integrations object](/docs/guides/filtering-data/#filtering-with-the-integrations-object), [Destination Insert functions](/docs/connections/functions/insert-functions/), or [per source schema integration filters](/docs/guides/filtering-data/#per-source-schema-integrations-filters). [Actions destinations](/docs/connections/destinations/actions/) also have a filtering capability: for example, if your Action is set to only send Identify events, all other event types will be filtered out. Actions destinations with incomplete triggers or disabled mappings are filtered out at this step. [Consent Management](/docs/privacy/consent-management/) users also see events discarded due to consent preferences.
- **Failed delivery**: Events that have been discarded due to errors or unmet destination requirements.
- **Successful delivery**: Events that were successfully delivered to the destination.
#### Actions destinations

The pipeline view for Actions destinations includes the following steps:
- **Successfully received**: Events that Segment ingested from your source.
- **Failed on ingest**: Events that Segment received, but were dropped due to internal data validation rules.
- **Filtered at source**: Events that were discarded due to schema settings or [Protocols](/docs/protocols/) Tracking Plans.
- **Mapping dropdown**: Select a [mapping](/docs/connections/destinations/actions/#customize-mappings) to filter the events in the Filtered at destination, Failed delivery, and Successful delivery pipeline steps.
- **Filtered at destination**: Events that were discarded due to [Destination Filters](/docs/guides/filtering-data/#destination-filters), [filtering in the Integrations object](/docs/guides/filtering-data/#filtering-with-the-integrations-object), [Destination Insert functions](/docs/connections/functions/insert-functions/), or [per source schema integration filters](/docs/guides/filtering-data/#per-source-schema-integrations-filters). [Actions destinations](/docs/connections/destinations/actions/) also have a filtering capability: for example, if your Action is set to only send Identify events, all other event types will be filtered out. Actions destinations with incomplete triggers or disabled mappings are filtered out at this step. [Consent Management](/docs/privacy/consent-management/) users also see events discarded due to consent preferences.
- **Retry count**: The number of events currently pending retry.
- **Failed delivery**: Events that have been discarded due to errors or unmet destination requirements.
- **Successful delivery**: Events that were successfully delivered to the destination.
The following image shows an Actions destination filtered to include only Track Page View events in the last three pipeline steps:

#### Storage destinations

The pipeline view for storage destinations includes the following steps:
- **Successfully received**: Events that Segment ingested from your source.
- **Failed on ingest**: Events that Segment received, but were dropped due to internal data validation rules.
- **Filtered at source**: Events that were discarded due to schema settings or [Protocols](/docs/protocols/) Tracking Plans.
- **Filtered at destination**: Events that were discarded due to [Destination Filters](/docs/guides/filtering-data/#destination-filters), [filtering in the Integrations object](/docs/guides/filtering-data/#filtering-with-the-integrations-object), [Destination Insert functions](/docs/connections/functions/insert-functions/), or [per source schema integration filters](/docs/guides/filtering-data/#per-source-schema-integrations-filters). [Actions destinations](/docs/connections/destinations/actions/) also have a filtering capability: for example, if your Action is set to only send Identify events, all other event types will be filtered out. Actions destinations with incomplete triggers or disabled mappings are filtered out at this step. [Consent Management](/docs/privacy/consent-management/) users also see events discarded due to consent preferences.
- **Events to warehouse rows**: A read-only box that shows the point in the delivery process where Segment converts events into warehouse rows.
- **Failed to sync**: Syncs that either failed to sync or were partially successful. Selecting this step takes you to a table of all syncs with one or more failed collections. Select a sync from the table to view the discard reason, any collections that failed, the status, and the number of rows that synced for each collection. For information about common errors, see the Warehouse Errors documentation.
- **Successfully synced**: A record of all successful or partially successful syncs made with your destination. To view the reason a partially successful sync was not fully successful, see the Failed to sync step.
The following image shows a storage destination with 23 partially successful syncs:

### Breakdown table
The breakdown table provides you with greater detail about the selected events.
To open the breakdown table, select either the first step in the pipeline view, the last step in the pipeline view, or select a discard step and then click on a discard reason.
The breakdown table displays the following details:
- **Event type**: The Segment spec event type (Track call vs. Identify call, for example)
You can use the Event Delivery alerting features (Delivery Alerts) by selecting…
Note that this is dependent on your [notification settings](/docs/segment-app/#segment-settings). For example, if the threshold is set to 99%, you'll be notified each time fewer than 99% of events are successfully delivered.
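A minimal sketch of that threshold check (assuming the alert compares the delivery success rate against the configured threshold; the function name and signature are illustrative, not Segment's implementation):

```python
def should_alert(delivered: int, received: int, threshold_pct: float) -> bool:
    """Fire a delivery alert when the success rate drops below the threshold."""
    if received == 0:
        return False  # no events received, nothing to measure
    success_rate = 100 * delivered / received
    return success_rate < threshold_pct

# With a 99% threshold, delivering 985 of 1,000 events (98.5%) triggers an alert.
print(should_alert(delivered=985, received=1000, threshold_pct=99))  # True
```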
You can also use [Connections Alerting](/docs/connections/alerting), a feature that allows Segment users to receive in-app, email, and Slack notifications related to the performance and throughput of an event-streaming connection.
Connections Alerting allows you to create two different alerts:
- **Source volume alerts**: These alerts notify you if your source ingests an abnormally small or large amount of data. For example, if you set a change percentage of 4%, you would be notified when your source ingests less than 96% or more than 104% of the typical event volume.
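The change-percentage arithmetic from the example above can be sketched like this (illustrative only; not how Segment computes its baselines):

```python
def volume_alert_bounds(typical_volume: float, change_pct: float) -> tuple[float, float]:
    """Return the (lower, upper) event volumes outside of which an alert fires."""
    lower = typical_volume * (1 - change_pct / 100)
    upper = typical_volume * (1 + change_pct / 100)
    return lower, upper

# With a typical volume of 10,000 events and a 4% change percentage,
# you'd be notified below 9,600 or above 10,400 events.
print(volume_alert_bounds(10_000, 4))
```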
**File:** src/connections/storage/catalog/aws-s3/index.md (+3 −1)
Functionally, the two destinations (Amazon S3 and AWS S3 with IAM Role Support) copy data in a similar manner.
## Getting started
The AWS S3 destination puts the raw logs of the data Segment receives into your S3 bucket, encrypted, no matter what region the bucket is in.
AWS S3 works differently than most destinations. Using a destinations selector like the [integrations object](/docs/connections/spec/common/#integrations) does not affect events with AWS S3.
The Segment Tracking API processes data from your sources and collects the Events in batches. Segment then uploads the batches to a secure Segment S3 bucket, from which they're securely copied to your own S3 bucket in small bursts. Individual files won't exceed 100 MB in size.
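As a rough illustration of the batching behavior described above (a greedy packing sketch, not Segment's actual copy mechanism, assuming the 100 MB cap applies per file):

```python
MAX_FILE_BYTES = 100 * 1024 * 1024  # individual files won't exceed 100 MB

def plan_files(batch_sizes: list[int], max_bytes: int = MAX_FILE_BYTES) -> list[list[int]]:
    """Greedily pack serialized event batches into files no larger than max_bytes."""
    files: list[list[int]] = []
    current: list[int] = []
    used = 0
    for size in batch_sizes:
        if current and used + size > max_bytes:
            files.append(current)  # current file is full; start a new one
            current, used = [], 0
        current.append(size)
        used += size
    if current:
        files.append(current)
    return files

mb = 1024 * 1024
# Three 60 MB batches can't share a 100 MB file, so they land in three files.
print(len(plan_files([60 * mb, 60 * mb, 60 * mb])))  # 3
```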
{% include content/storage-do-include.md %}
{% comment %}

**File:** src/connections/storage/catalog/azuresqldw/index.md (+2)
Azure's [Azure Synapse Analytics](https://azure.microsoft.com/en-us/services/synapse-analytics/){:target="_blank"}, previously known as Azure SQL Data Warehouse, is a limitless analytics service that brings together enterprise data warehousing and Big Data analytics.
{% include content/storage-do-include.md %}
## Getting Started
Complete the following prerequisites in Microsoft Azure before connecting your Azure Synapse Analytics databases to Segment:
**File:** src/connections/storage/catalog/bigquery/index.md (+2)
The Segment warehouse connector runs a periodic ETL (Extract - Transform - Load) process to pull raw events and objects from your sources and load them into your BigQuery cluster.
For more information about the ETL process, including how it works and common ETL use cases, refer to [Google Cloud's ETL documentation](https://cloud.google.com/learn/what-is-etl){:target="_blank"}.
{% include content/storage-do-include.md %}
## Getting Started
To store your Segment data in BigQuery, complete the following steps:
**File:** src/connections/storage/catalog/db2/index.md (+3 −1)
… all of your event and Cloud Source data in a warehouse built by IBM. This guide will walk through what you need to know to get up and running with Db2 Warehouse and Segment.
> info " "
> This document refers specifically to [IBM Db2 Warehouse on Cloud](https://www.ibm.com/cloud/db2-warehouse-on-cloud){:target="_blank"}, [IBM Db2 Warehouse](https://www.ibm.com/analytics/db2){:target="_blank"}, and the [IBM Integrated Analytics System](https://www.ibm.com/products/integrated-analytics-system){:target="_blank"}. For questions related to any of these products, see the [IBM Cloud Docs](https://cloud.ibm.com/docs){:target="_blank"}.
## Getting Started
To get started, you'll need to:

1. [Create a user for Segment](#create-a-user-for-segment).
2. [Grant the user sufficient permissions](#grant-the-segment-user-permissions).
3. [Create the IBM Db2 Destination in the Segment app](#create-segment-db2-destination).
{% include content/storage-do-include.md %}
### Create a User for Segment
In order to connect your IBM Db2 warehouse to Segment, you need to create a Db2 user account that Segment can assume. To create a user account for Segment:
**File:** src/connections/storage/catalog/google-cloud-storage/index.md

The Google Cloud Storage (GCS) destination puts the raw logs of the data Segment receives into your GCS bucket. The data is copied into your bucket at least every hour. You might see multiple files over a period of time depending on how much data is copied.
> warning ""
1. Create a Service Account to allow Segment to copy files into the bucket.
2. Create a bucket in your preferred region.
## Set up Service Account to give Segment access to upload to your Bucket
**File:** src/connections/storage/catalog/postgres/index.md (+3 −1)
PostgreSQL is ACID-compliant and transactional. PostgreSQL has updatable views and materialized views, triggers, and foreign keys; it supports functions, stored procedures, and other expandability. Developed by the PostgreSQL Global Development Group, it is free and open-source.
> info "Segment sources required"
> In order to add a Postgres destination to Segment, you must first add a source. To learn more about sources in Segment, check out the [Sources Overview](/docs/connections/sources) documentation.
## Getting started
Segment supports the following Postgres database providers:
- [Heroku](#heroku-postgres)
- [RDS](#rds-postgres)
{% include content/storage-do-include.md %}
Segment supported a third Postgres provider, Compose, until Compose [was deprecated on March 1, 2023](https://help.compose.com/docs/compose-deprecation){:target="_blank"}. To continue sending your Segment data to a Postgres destination, consider using either [Heroku Postgres](#heroku-postgres) or [Amazon's Relational Database Service](#rds-postgres).
**File:** src/connections/storage/catalog/redshift/index.md (+2)
Complete the following steps to provision your Redshift cluster, and connect Segment:
3. [Create a database user](#create-a-database-user)
4. [Connect Redshift to Segment](#connect-redshift-to-segment)
{% include content/storage-do-include.md %}
## Choose the best instance for your needs
While the number of events (database records) is important, the storage capacity usage of your cluster depends primarily on the number of unique tables and columns created in the cluster. Keep in mind that each unique `.track()` event creates a new table, and each property sent creates a new column in that table. To avoid storing unnecessary data, start with a detailed [tracking plan](/docs/protocols/tracking-plan/create/) before you install Segment libraries to ensure that only the necessary events are passed to Segment.
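To see why a tracking plan matters, this hypothetical helper (not part of any Segment library; the table-naming rule is a simplification) counts the tables and columns a set of `.track()` calls would create:

```python
from collections import defaultdict

def schema_footprint(events: list[tuple[str, set[str]]]) -> dict[str, list[str]]:
    """Map each unique track event name to a table and each property to a column."""
    tables: dict[str, set[str]] = defaultdict(set)
    for name, properties in events:
        table = name.lower().replace(" ", "_")  # e.g. "Order Completed" -> "order_completed"
        tables[table].update(properties)
    return {table: sorted(columns) for table, columns in tables.items()}

events = [
    ("Order Completed", {"revenue", "currency"}),
    ("Order Completed", {"revenue", "coupon"}),  # new property -> new column, same table
    ("Cart Viewed", {"cart_id"}),                # new event name -> a whole new table
]
print(schema_footprint(events))
```

Two event names produce two tables, and every distinct property accumulates as a column, so unplanned events and properties grow the cluster quickly.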
**File:** src/connections/storage/catalog/snowflake/index.md (+2)
There are six steps to get started using Snowflake with Segment.
5. [Test the user and credentials](#step-5-test-the-user-and-credentials)
6. [Connect Snowflake to Segment](#step-6-connect-snowflake-to-segment)
{% include content/storage-do-include.md %}
### Prerequisites
To set up the virtual warehouse, database, role, and user in Snowflake for Segment's Snowflake destination, you must have the `ACCOUNTADMIN` role, or, a custom role with the following [Snowflake privileges](https://docs.snowflake.com/en/user-guide/security-access-control-overview#label-access-control-overview-privileges){:target="_blank"}: