- Go to IoT Core
- Create a thing and name it device-sensor-1
- Create a certificate for the thing
  - Use the one-click certificate creation option
  - Download everything, including the Amazon Root CA 1 certificate
- Create a policy for the thing and attach it to the certificate
1. Subscribe and publish to a topic from the MQTT test client in the console
2. Set up a Python virtual environment for the IoT script:

   ```bash
   python3 -m venv venv
   source venv/bin/activate
   pip install AWSIoTPythonSDK
   ```

3. Publish to the same topic with the Python script (a minimal publisher sketch follows this list)
   - Edit the script (medical_script.py)
   - Change the BROKER_PATH
   - Change the Root CA, private key, and certificate paths
   - Use the script to publish to the topic and watch the messages arrive live in the browser
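A minimal publisher sketch using AWSIoTPythonSDK, assuming placeholder values for the endpoint, certificate paths, topic name, and payload fields (the real values live in medical_script.py):

```python
# publish_sketch.py -- minimal AWS IoT MQTT publisher (placeholder values).
import json
import time

from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

# Placeholder endpoint and credential paths -- replace with the values
# downloaded when the certificate was created.
ENDPOINT = "xxxxxxxxxxxxxx-ats.iot.us-east-1.amazonaws.com"
ROOT_CA = "AmazonRootCA1.pem"
PRIVATE_KEY = "device-sensor-1.private.key"
CERTIFICATE = "device-sensor-1.cert.pem"
TOPIC = "sensors/medical"  # hypothetical topic name

client = AWSIoTMQTTClient("device-sensor-1")
client.configureEndpoint(ENDPOINT, 8883)
client.configureCredentials(ROOT_CA, PRIVATE_KEY, CERTIFICATE)
client.connect()

# Publish a reading every second; watch it arrive in the MQTT test client.
for _ in range(10):
    payload = {"device_id": "device-sensor-1", "heart_rate": 72, "timestamp": time.time()}
    client.publish(TOPIC, json.dumps(payload), 1)  # QoS 1
    time.sleep(1)

client.disconnect()
```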
- Create a rule in IoT Core and add an action
  - Review the available integrations
  - Choose "Send a message to an Amazon Kinesis Firehose stream"
  - Select the stream, or choose "Create new resource" from the rule's Configure action page (a scripted alternative is sketched below)
- Create the Firehose delivery stream
  - Process records: Transform records for Glue - On
  - Destination: Glue ETL, Kinesis Analytics
  - Settings: IAM role - create a new one
  - Review and create
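The same rule can be created from code instead of the console. A sketch with boto3, in which the rule name, topic filter, delivery stream name, and role ARN are all placeholders for this project:

```python
# iot_rule_sketch.py -- create the IoT topic rule that forwards messages to Firehose.
import boto3

iot = boto3.client("iot")

iot.create_topic_rule(
    ruleName="medical_sensor_to_firehose",          # hypothetical rule name
    topicRulePayload={
        "sql": "SELECT * FROM 'sensors/medical'",   # same topic the device publishes to
        "actions": [
            {
                "firehose": {
                    "deliveryStreamName": "medical-sensor-stream",                 # placeholder
                    "roleArn": "arn:aws:iam::123456789012:role/iot-to-firehose",   # placeholder
                    "separator": "\n",
                }
            }
        ],
        "ruleDisabled": False,
    },
)
```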
- Go to Kinesis
- Create the Analytics application's streams using SQL like the following (fill in the column lists to match your record schema):

```sql
CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
    -- column definitions go here
);

CREATE OR REPLACE PUMP "STREAM_PUMP" AS INSERT INTO "DESTINATION_SQL_STREAM"
SELECT STREAM
    -- aggregated columns go here
FROM "SOURCE_SQL_STREAM_001"
GROUP BY
    FLOOR(("SOURCE_SQL_STREAM_001".ROWTIME - TIMESTAMP '1970-01-01 00:00:00') SECOND / 10 TO SECOND);
```

- The GROUP BY defines a 10-second tumbling window, so the destination stream emits aggregated data every 10 seconds.
- Go to Lambda
- Create a Lambda function from the blueprints that takes data from Firehose and creates an SNS alert (a handler sketch follows below).
- Grant the required permissions and create a role.
- Name the handler and upload the deployment package with the files.
- Use the SNS topic to send notifications from inside the Lambda function.
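A minimal handler sketch, written for a Kinesis-stream-style event (base64-encoded data under event["Records"]); if the function is instead wired as a Firehose transform or an Analytics destination, the record envelope differs slightly. The threshold, field name, and topic ARN environment variable are placeholders:

```python
# alert_lambda_sketch.py -- inspect incoming records and publish an SNS alert.
import base64
import json
import os

import boto3

sns = boto3.client("sns")
TOPIC_ARN = os.environ["ALERT_TOPIC_ARN"]  # hypothetical environment variable
HEART_RATE_LIMIT = 120                     # hypothetical alert threshold


def lambda_handler(event, context):
    for record in event["Records"]:
        # Kinesis delivers the payload base64-encoded.
        reading = json.loads(base64.b64decode(record["kinesis"]["data"]))
        if reading.get("heart_rate", 0) > HEART_RATE_LIMIT:
            sns.publish(
                TopicArn=TOPIC_ARN,
                Subject="Sensor alert",
                Message=json.dumps(reading),
            )
    return {"status": "ok"}
```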
- Go to SNS
- Create the topic for alerting the user (a boto3 setup sketch follows below)
- Set up the subscriptions with SMS or email configurations
- Or set up an HTTP notification
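The topic and subscriptions can also be created with boto3. The topic name, phone number, email address, and HTTPS endpoint below are placeholders:

```python
# sns_setup_sketch.py -- create the alert topic and its subscriptions.
import boto3

sns = boto3.client("sns")

topic = sns.create_topic(Name="medical-sensor-alerts")   # hypothetical topic name
topic_arn = topic["TopicArn"]

# SMS and email subscriptions (email subscriptions must be confirmed by the recipient).
sns.subscribe(TopicArn=topic_arn, Protocol="sms", Endpoint="+15555550123")
sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="oncall@example.com")

# Or an HTTP(S) endpoint that receives the notification payload.
sns.subscribe(TopicArn=topic_arn, Protocol="https", Endpoint="https://example.com/sns-handler")
```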
- Go to AWS Glue
- Create a new Job
- Configure Job properties
- Specify the source as the Kinesis Firehose output.
- Transform as required.
- Specify the data target as the S3 bucket created via Lake Formation.
- Set up the schema (a Glue job script sketch follows below).
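A minimal Glue ETL script sketch, assuming hypothetical catalog database/table names, field names, and an S3 output path registered through Lake Formation (it runs inside Glue, not locally):

```python
# glue_job_sketch.py -- Glue ETL job: Firehose output -> mapped schema -> S3 parquet.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source: the Firehose output catalogued by Glue (placeholder names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="medical_sensor_db", table_name="firehose_raw"
)

# Transform: rename fields to the target schema (placeholder mappings).
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("device_id", "string", "device_id", "string"),
        ("heart_rate", "int", "heart_rate", "int"),
        ("timestamp", "double", "event_time", "double"),
    ],
)

# Target: the S3 location registered with Lake Formation (placeholder path).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://medical-sensor-datalake/processed/"},
    format="parquet",
)

job.commit()
```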
- Go to Lake Formation
- Register the data lake location by creating the S3 bucket.
- Set up the IAM role for the S3 bucket.
- Create a database for storing the data and set up its IAM role.
- Grant permissions as required (a boto3 sketch of these steps follows below).
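The same registration, database creation, and grant can be scripted. Bucket name, database name, and the principal role ARN are placeholders:

```python
# lake_formation_sketch.py -- register the data lake location and database.
import boto3

lakeformation = boto3.client("lakeformation")
glue = boto3.client("glue")

# Register the (already created) S3 bucket as a data lake location.
lakeformation.register_resource(
    ResourceArn="arn:aws:s3:::medical-sensor-datalake",  # placeholder bucket
    UseServiceLinkedRole=True,
)

# Create the database that will hold the processed tables.
glue.create_database(DatabaseInput={"Name": "medical_sensor_db"})

# Grant the Glue job's role access to the database (placeholder role ARN).
lakeformation.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/glue-etl-role"},
    Resource={"Database": {"Name": "medical_sensor_db"}},
    Permissions=["ALL"],
)
```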
- Go to Lambda
- Create a Lambda function from the blueprints that takes data from S3 and sends it to Elasticsearch (a handler sketch follows below).
- Grant the required permissions and create a role.
- Name the handler and upload the deployment package with the files.
- Use the Lambda function to send the data to the Elasticsearch Service.
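A handler sketch that indexes newly written S3 objects into the Elasticsearch domain. It assumes one JSON document per line, a hypothetical index name, and that the `requests` and `requests_aws4auth` libraries are packaged with the deployment:

```python
# s3_to_es_lambda_sketch.py -- index new S3 objects into Elasticsearch.
import json
import os

import boto3
import requests
from requests_aws4auth import AWS4Auth

REGION = os.environ.get("AWS_REGION", "us-east-1")
ES_ENDPOINT = os.environ["ES_ENDPOINT"]           # e.g. https://search-xxxx.us-east-1.es.amazonaws.com
INDEX_URL = f"{ES_ENDPOINT}/medical-sensor/_doc"  # hypothetical index name

s3 = boto3.client("s3")
credentials = boto3.Session().get_credentials()
awsauth = AWS4Auth(credentials.access_key, credentials.secret_key, REGION, "es",
                   session_token=credentials.token)


def lambda_handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # Assume one JSON document per line in the object.
        for line in body.splitlines():
            if line.strip():
                requests.post(INDEX_URL, auth=awsauth, json=json.loads(line),
                              headers={"Content-Type": "application/json"})
    return {"status": "ok"}
```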
- Go to the Elasticsearch Service
- Create a domain.
- Choose the deployment type.
- Configure the cluster and access policies as necessary.
- Set up the access and VPC settings.
- Use the generated endpoints to access Elasticsearch.
- Use the generated Kibana URL to open Kibana.
- Register or log in to Kibana.
- Create an index pattern.
- Discover and visualize the data as required.
- Go to AWS Glue
- Create a new Job
- Configure Job properties
- Specify the source as the given data source.
- Transform as required.
- Specify the data target as the S3 bucket created via Lake Formation.
- Set up the schema.
- Go to Amazon Athena.
- Create a database and table.
- Specify the dataset source and name, pointing at the S3 bucket.
- Choose the data format.
- Set up the columns and partitions.
- Use SQL queries to analyse the data (a boto3 query sketch follows below).
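Queries can also be run programmatically. A sketch with boto3, where the database, table, columns, and results bucket are placeholders:

```python
# athena_query_sketch.py -- run an Athena query and print the result rows.
import time

import boto3

athena = boto3.client("athena")

execution = athena.start_query_execution(
    QueryString="SELECT device_id, AVG(heart_rate) AS avg_hr "
                "FROM sensor_readings GROUP BY device_id",   # hypothetical table/columns
    QueryExecutionContext={"Database": "medical_sensor_db"},
    ResultConfiguration={"OutputLocation": "s3://medical-sensor-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then read the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```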
- Go to Amazon Machine Learning
- Create a model
  - Specify the dataset source and name, pointing at the knowledge-base data in the S3 bucket.
  - Set up the schema.
  - Set the target column to be predicted.
  - Set the row ID, then review and launch.
  - After training, test the model.
- Get the generated endpoints for the model.
- Use the endpoints in the Lambda function for classifying the issue and finding the solution with the help of the knowledge base.
OR
- Go to SageMaker
- Create a model using the notebooks
  - Train the model.
  - Fine-tune the model by testing.
  - Store the model.
- Get the generated endpoints for the model.
- Use the endpoints in the Lambda function for classifying the issue and finding the solution with the help of the knowledge base (an endpoint invocation sketch follows below).
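A sketch of calling the deployed endpoint from a Lambda function, assuming the SageMaker path; the endpoint name and JSON payload/response format are placeholders for whatever the trained model expects. (The Amazon Machine Learning option exposes its own real-time predict API instead.)

```python
# ml_endpoint_sketch.py -- invoke the deployed model endpoint.
import json

import boto3

runtime = boto3.client("sagemaker-runtime")


def classify_issue(reading):
    """Send a sensor reading to the model endpoint and return the predicted issue class."""
    response = runtime.invoke_endpoint(
        EndpointName="sensor-issue-classifier",   # hypothetical endpoint name
        ContentType="application/json",
        Body=json.dumps(reading),
    )
    return json.loads(response["Body"].read())
```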
- Go to DynamoDB Dashboard.
- Create a new table.
- Provide the Partition key and sort key as required.
- On the Overview tab, choose Manage Stream.
- Choose the information that will be written to the stream whenever the data in the table is modified.
- After configuring, click Enable (a scripted table-and-stream setup is sketched below).
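The table and its stream can also be created with boto3. Table name, key names, and the stream view type are placeholders:

```python
# dynamodb_table_sketch.py -- create the table with a stream enabled.
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.create_table(
    TableName="sensor-issues",                                # hypothetical table name
    AttributeDefinitions=[
        {"AttributeName": "device_id", "AttributeType": "S"},
        {"AttributeName": "reported_at", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "device_id", "KeyType": "HASH"},    # partition key
        {"AttributeName": "reported_at", "KeyType": "RANGE"}, # sort key
    ],
    BillingMode="PAY_PER_REQUEST",
    StreamSpecification={
        "StreamEnabled": True,
        # Write both old and new item images to the stream on every modification.
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
```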
- Go to Lambda
- Create a Lambda function from the blueprints that reads the DynamoDB stream and generates the SNS notification.
- Grant the required permissions and create a role.
- Name the handler and upload the deployment package with the files.
- Use the Lambda function to call the ML model's API endpoints and classify the issue.
- After classifying, send the required steps document from the knowledge base using the SNS topic (a handler sketch follows below).
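A handler sketch for the DynamoDB stream trigger: classify the new item with the model endpoint and send the matching knowledge-base steps via SNS. The endpoint name, topic ARN, response fields, and the in-memory knowledge-base lookup are all placeholders:

```python
# issue_reply_lambda_sketch.py -- DynamoDB stream -> classify -> reply via SNS.
import json
import os

import boto3

runtime = boto3.client("sagemaker-runtime")
sns = boto3.client("sns")

REPLY_TOPIC_ARN = os.environ["REPLY_TOPIC_ARN"]   # hypothetical environment variable

# Hypothetical knowledge base mapping an issue class to a steps document.
KNOWLEDGE_BASE = {
    "low_battery": "Replace the sensor battery and re-pair the device.",
    "irregular_reading": "Check sensor placement and restart the device.",
}


def lambda_handler(event, context):
    for record in event["Records"]:
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        new_image = record["dynamodb"]["NewImage"]
        # Flatten the DynamoDB attribute-value format into a plain dict.
        reading = {k: list(v.values())[0] for k, v in new_image.items()}

        prediction = runtime.invoke_endpoint(
            EndpointName="sensor-issue-classifier",   # same placeholder endpoint as above
            ContentType="application/json",
            Body=json.dumps(reading),
        )
        issue = json.loads(prediction["Body"].read()).get("label", "unknown")

        steps = KNOWLEDGE_BASE.get(issue, "No matching steps found; escalate to support.")
        sns.publish(TopicArn=REPLY_TOPIC_ARN, Subject=f"Issue: {issue}", Message=steps)
    return {"status": "ok"}
```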
- Go to SNS
- Create the topic for replying to the user with the steps to perform to solve the issue.
- Set up the subscriptions with SMS or email configurations.
- Or set up an HTTP notification to the calling SDK user.
