Merged
Changes from 3 commits
1 change: 1 addition & 0 deletions _partials/_not-supported-for-azure.mdx
@@ -0,0 +1 @@
This feature is on our roadmap for $CLOUD_LONG on Microsoft Azure. Stay tuned!
16 changes: 16 additions & 0 deletions migrate/livesync-for-kafka.md
Original file line number Diff line number Diff line change
Expand Up @@ -8,9 +8,14 @@ tags: [stream, connector]

import PrereqCloud from "versionContent/_partials/_prereqs-cloud-only.mdx";
import EarlyAccessNoRelease from "versionContent/_partials/_early_access.mdx";
import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";

# Stream data from Kafka

<Tabs label="Tiger Cloud on AWS and Azure" persistKey="tiger-platform-clouds">

<Tab title="Tiger Cloud on AWS" label="aws-cloud">

You use the Kafka source connector in $CLOUD_LONG to stream events from Kafka into your $SERVICE_SHORT. $CLOUD_LONG connects to your Confluent Cloud Kafka cluster and Schema Registry using SASL/SCRAM authentication and service account–based API keys. Only the Avro format is currently supported [with some limitations][limitations].
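As an illustrative sketch, the Confluent Cloud connection details the connector relies on map to standard Kafka client properties like the following. All hostnames, API keys, and secrets below are placeholders, and the exact property names depend on your client library:

```properties
# Confluent Cloud Kafka cluster (placeholder endpoint)
bootstrap.servers=pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="<CLUSTER_API_KEY>" \
  password="<CLUSTER_API_SECRET>";

# Schema Registry for Avro (placeholder endpoint and credentials)
schema.registry.url=https://psrc-xxxxx.us-east-1.aws.confluent.cloud
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=<SR_API_KEY>:<SR_API_SECRET>
```

Create the API keys against a service account in Confluent Cloud, then supply them in $CONSOLE when you set up the connector.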

This page explains how to connect $CLOUD_LONG to your Confluent Cloud Kafka cluster.
@@ -232,6 +237,17 @@ Unsupported examples:
}
```

</Tab>

<Tab title="Tiger Cloud on Azure" label="azure-cloud">

<NotSupportedAzure />

</Tab>

</Tabs>


[confluent-cloud]: https://confluent.cloud/
[connection-info]: /integrations/:currentVersion:/find-connection-details/
[confluence-signup]: https://www.confluent.io/get-started/
16 changes: 16 additions & 0 deletions migrate/livesync-for-s3.md
@@ -8,9 +8,14 @@ tags: [recovery, logical backup, replication]

import PrereqCloud from "versionContent/_partials/_prereqs-cloud-only.mdx";
import EarlyAccessNoRelease from "versionContent/_partials/_early_access.mdx";
import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";

# Sync data from S3

<Tabs label="Tiger Cloud on AWS and Azure" persistKey="tiger-platform-clouds">

<Tab title="Tiger Cloud on AWS" label="aws-cloud">

You use the $S3_CONNECTOR in $CLOUD_LONG to synchronize CSV and Parquet files from an S3 bucket to your $SERVICE_LONG in real time. The connector runs continuously, enabling you to leverage $CLOUD_LONG as your analytics database with data constantly synced from S3. This lets you take full advantage of $CLOUD_LONG's real-time analytics capabilities without having to develop or manage custom ETL solutions between S3 and $CLOUD_LONG.
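The connector needs read access to your bucket. As a sketch only (the bucket name is a placeholder, and the exact permissions $CONSOLE asks for may differ), an IAM policy granting read-only access to a bucket typically looks like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-data-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-data-bucket/*"
    }
  ]
}
```

Attach a policy like this to the IAM role or user whose credentials you provide when you configure the connector.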

![Tiger Cloud connectors overview](https://assets.timescale.com/docs/images/tiger-cloud-console/tiger-cloud-connector-overview.png)
@@ -157,6 +162,17 @@ To sync data from your S3 bucket to your $SERVICE_LONG using $CONSOLE:
That's it, the $S3_CONNECTOR is now synchronizing all the data, or specific files, from your S3 bucket to your
$SERVICE_LONG in real time.

</Tab>

<Tab title="Tiger Cloud on Azure" label="azure-cloud">

<NotSupportedAzure />

</Tab>

</Tabs>


[about-hypertables]: /use-timescale/:currentVersion:/hypertables/
[lives-sync-specify-tables]: /migrate/:currentVersion:/livesync-for-postgresql/#specify-the-tables-to-synchronize
[compression]: /use-timescale/:currentVersion:/compression/about-compression
16 changes: 16 additions & 0 deletions migrate/upload-file-using-console.md
@@ -7,9 +7,14 @@ keywords: [import]

import ImportPrerequisitesCloudNoConnection from "versionContent/_partials/_prereqs-cloud-no-connection.mdx";
import EarlyAccessGeneral from "versionContent/_partials/_early_access.mdx";
import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";

# Upload a file into your $SERVICE_SHORT using $CONSOLE_LONG

<Tabs label="Tiger Cloud on AWS and Azure" persistKey="tiger-platform-clouds">

<Tab title="Tiger Cloud on AWS" label="aws-cloud">

You can upload files into your $SERVICE_SHORT using $CONSOLE_LONG. This page explains how to upload CSV, Parquet, and text files from your local machine or from an S3 bucket.
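For example, a minimal CSV file suitable for upload can be produced with Python's standard library. The column names and values here are purely illustrative, not a required schema:

```python
import csv

# Illustrative time-series rows; any well-formed CSV with a header row works.
rows = [
    {"time": "2025-01-01T00:00:00Z", "device_id": "dev-1", "temperature": 21.5},
    {"time": "2025-01-01T00:01:00Z", "device_id": "dev-2", "temperature": 22.0},
]

with open("readings.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["time", "device_id", "temperature"])
    writer.writeheader()   # first line: column names
    writer.writerows(rows)
```

You can then select `readings.csv` in the upload dialog in $CONSOLE_LONG.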

<Tabs label="Upload files using Tiger Cloud Console" persistKey="console-import">
@@ -205,6 +210,17 @@ To import a Parquet file from an S3 bucket:

That's it, you have imported your data into your $SERVICE_LONG.

</Tab>

<Tab title="Tiger Cloud on Azure" label="azure-cloud">

<NotSupportedAzure />

</Tab>

</Tabs>


[credentials-iam]: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html#roles-creatingrole-user-console
[credentials-public]: https://docs.aws.amazon.com/AmazonS3/latest/userguide/example-bucket-policies.html#example-bucket-policies-anonymous-user
[console]: https://console.cloud.timescale.com/dashboard/services
16 changes: 16 additions & 0 deletions use-timescale/tigerlake.md
@@ -8,9 +8,14 @@ keywords: [data lake, lakehouse, s3, iceberg]

import IntegrationPrereqsCloud from "versionContent/_partials/_integration-prereqs-cloud-only.mdx";
import EarlyAccessGeneral from "versionContent/_partials/_early_access.mdx";
import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";

# Integrate data lakes with $CLOUD_LONG

<Tabs label="Tiger Cloud on AWS and Azure" persistKey="tiger-platform-clouds">

<Tab title="Tiger Cloud on AWS" label="aws-cloud">

$LAKE_LONG enables you to build real-time applications alongside efficient data pipeline management within a single
system. $LAKE_LONG unifies the $CLOUD_LONG operational architecture with data lake architectures.

@@ -333,6 +338,17 @@ data lake:
* Iceberg snapshots are pruned automatically when their number exceeds 2,500.
* The Iceberg namespace is hard-coded to `timescaledb`; support for custom namespace values is in progress.

</Tab>

<Tab title="Tiger Cloud on Azure" label="azure-cloud">

<NotSupportedAzure />

</Tab>

</Tabs>


[cmc]: https://console.aws.amazon.com/cloudformation/
[aws-athena]: https://aws.amazon.com/athena/
[apache-spark]: https://spark.apache.org/