
Design and Implementation of an Event-Driven API for Efficient Batch Data Insertion into DynamoDB #11

Open
VCauthon opened this issue Sep 1, 2024 · 0 comments

This task involves designing an event-driven API architecture to handle large batch data insertions into DynamoDB while maintaining performance during traffic spikes.

The proposed solution uses AWS services such as S3, Step Functions, Lambda, ECS on Fargate, and CloudWatch for orchestration, processing, and monitoring.

The workflow starts when an external agent uploads data to an S3 bucket.
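A minimal sketch of how that trigger could be wired up: a Lambda function subscribed to S3 `ObjectCreated` events starts the Step Functions execution that orchestrates the rest of the pipeline. The state machine ARN, environment variable, and input fields are illustrative assumptions, not details from this issue.

```python
# Hypothetical Lambda handler: fired by an S3 ObjectCreated notification, it
# starts the Step Functions state machine that orchestrates the batch insert.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]  # assumed env variable


def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Hand the uploaded object's location to the state machine, which in
        # turn launches the ECS/Fargate task that performs the insertion.
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"statusCode": 200}
```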

The data is then processed by an ECS task running on a Fargate cluster, which scales with load and is not bound by Lambda's execution time limit, before the records are inserted into DynamoDB.
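A sketch of what that ECS task could look like, assuming the uploaded object is newline-delimited JSON and that the bucket, key, and table name arrive via environment variables (all assumptions for illustration). DynamoDB's `batch_writer` handles chunking into 25-item batches and retrying unprocessed items.

```python
# Hypothetical entry point for the ECS/Fargate task: download the uploaded
# object from S3 and bulk-insert its records into DynamoDB.
import json
import os

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table(os.environ["TABLE_NAME"])


def main():
    bucket = os.environ["SOURCE_BUCKET"]
    key = os.environ["SOURCE_KEY"]
    body = s3.get_object(Bucket=bucket, Key=key)["Body"]

    # batch_writer buffers put_item calls and resubmits unprocessed items,
    # which keeps write throughput steady during large insertions.
    with table.batch_writer() as batch:
        for line in body.iter_lines():
            if line:
                batch.put_item(Item=json.loads(line))


if __name__ == "__main__":
    main()
```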

The architecture also includes mechanisms for error handling, cost optimization through S3 lifecycle policies, and monitoring through CloudWatch alerts.
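For the last two pieces, a hedged sketch of one possible setup: an S3 lifecycle rule that expires processed uploads after a retention window, and a CloudWatch alarm on DynamoDB write throttling. Bucket and table names, prefixes, and thresholds below are assumptions, not values defined in this issue.

```python
# Illustrative cost-optimization and monitoring setup (names/thresholds assumed).
import boto3

s3 = boto3.client("s3")
cloudwatch = boto3.client("cloudwatch")

# Expire processed uploads after 30 days to keep S3 storage costs down.
s3.put_bucket_lifecycle_configuration(
    Bucket="batch-upload-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-processed-uploads",
                "Filter": {"Prefix": "uploads/"},
                "Status": "Enabled",
                "Expiration": {"Days": 30},
            }
        ]
    },
)

# Alarm when DynamoDB starts throttling writes during traffic spikes.
cloudwatch.put_metric_alarm(
    AlarmName="dynamodb-write-throttles",
    Namespace="AWS/DynamoDB",
    MetricName="WriteThrottleEvents",
    Dimensions=[{"Name": "TableName", "Value": "batch-target-table"}],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
)
```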
