
Build a Serverless Real-Time Data Processing App

https://aws.amazon.com/getting-started/projects/build-serverless-real-time-data-processing-app-lambda-kinesis-s3-dynamodb-cognito-athena/?trk=gs_card

Intro

Create an AWS Account

Set up AWS Cloud9 IDE

Installation

Real-time Streaming Data

Create an Amazon Kinesis stream

Produce messages into the stream

  • Switch to the tab where you have your Cloud9 environment opened.
  • In the terminal, run the producer to start emitting sensor data to the stream: ./producer

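For reference, the provided ./producer is essentially just putting JSON records onto the stream. A minimal sketch of the same idea with the AWS SDK for JavaScript, assuming the stream is named wildrydes and using the field names from the Athena table schema later in this guide (the unicorn name and values are placeholders):

// Hypothetical sketch: put one sensor reading onto the stream,
// mirroring what the provided ./producer binary does.
const AWS = require('aws-sdk');

const kinesis = new AWS.Kinesis({ region: 'us-east-1' }); // use your region

const reading = {
  Name: 'Shadowfax',                    // placeholder unicorn name
  StatusTime: new Date().toISOString(),
  Latitude: 42.26,                      // placeholder coordinates
  Longitude: -71.80,
  Distance: 175,
  HealthPoints: 150,
  MagicPoints: 120,
};

kinesis.putRecord({
  StreamName: 'wildrydes',              // assumed stream name
  PartitionKey: reading.Name,           // keeps one unicorn's records on one shard
  Data: JSON.stringify(reading),
}).promise()
  .then(() => console.log('record sent'))
  .catch(console.error);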

Read messages from the stream

  • Click New Terminal to open a new terminal tab.
  • Run the consumer to start reading sensor data from the stream: ./consumer

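Under the hood, a consumer walks a shard with a shard iterator. A rough single-shard sketch with the SDK (the real ./consumer handles multiple shards and output formatting; stream name is an assumption as above):

// Hypothetical sketch: read new records from the first shard of the stream.
const AWS = require('aws-sdk');

const kinesis = new AWS.Kinesis({ region: 'us-east-1' });

async function consume(streamName) {
  const { StreamDescription } = await kinesis.describeStream({ StreamName: streamName }).promise();
  const shardId = StreamDescription.Shards[0].ShardId; // first shard only, for brevity

  let { ShardIterator } = await kinesis.getShardIterator({
    StreamName: streamName,
    ShardId: shardId,
    ShardIteratorType: 'LATEST', // only records produced from now on
  }).promise();

  while (ShardIterator) {
    const out = await kinesis.getRecords({ ShardIterator }).promise();
    out.Records.forEach((r) => console.log(r.Data.toString('utf8')));
    ShardIterator = out.NextShardIterator;
    await new Promise((res) => setTimeout(res, 1000)); // stay under read limits
  }
}

consume('wildrydes').catch(console.error);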

Create an identity pool for the unicorn dashboard

Grant the unauthenticated role access to the stream

View unicorn status on the dashboard

Experiment with the producer

Aggregate data

Create an Amazon Kinesis stream

  • Go to the AWS Management Console, click Services then select Kinesis under Analytics.
  • Select Get started if prompted with an introductory screen.
  • Select Create data stream.
  • Enter wildrydes-summary into Kinesis stream name and 1 into Number of shards, then select Create Kinesis stream.
  • Within 60 seconds, your Kinesis stream will be ACTIVE and ready to store real-time streaming data.

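The same stream can also be created programmatically. A sketch with the SDK, using the name and shard count from the steps above:

// Hypothetical sketch: create the summary stream and poll until it is ACTIVE.
const AWS = require('aws-sdk');

const kinesis = new AWS.Kinesis({ region: 'us-east-1' }); // use your region

async function createStream(name) {
  await kinesis.createStream({ StreamName: name, ShardCount: 1 }).promise();
  for (;;) {
    const { StreamDescription } = await kinesis.describeStream({ StreamName: name }).promise();
    if (StreamDescription.StreamStatus === 'ACTIVE') return;
    await new Promise((res) => setTimeout(res, 5000)); // re-check every 5 seconds
  }
}

createStream('wildrydes-summary').then(() => console.log('stream is ACTIVE'));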

Create an Amazon Kinesis Data Analytics application

CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
  "Name"            VARCHAR(16),
  "StatusTime"      TIMESTAMP,
  "Distance"        SMALLINT,
  "MinMagicPoints"  SMALLINT,
  "MaxMagicPoints"  SMALLINT,
  "MinHealthPoints" SMALLINT,
  "MaxHealthPoints" SMALLINT
);

CREATE OR REPLACE PUMP "STREAM_PUMP" AS
  INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT STREAM "Name", "ROWTIME",
           SUM("Distance"),
           MIN("MagicPoints"), MAX("MagicPoints"),
           MIN("HealthPoints"), MAX("HealthPoints")
    FROM "SOURCE_SQL_STREAM_001"
    GROUP BY FLOOR("SOURCE_SQL_STREAM_001"."ROWTIME" TO MINUTE), "Name";

Read messages from the stream

  • Switch to the tab where you have your Cloud9 environment opened.
  • Run the consumer to start reading sensor data from the stream: ./consumer -stream wildrydes-summary


Experiment with the producer

  • Switch to the tab where you have your Cloud9 environment opened.
  • Stop the producer by pressing Control + C and notice the messages stop.
  • Start the producer again and notice the messages resume.
  • Hit the (+) button and click New Terminal to open a new terminal tab.
  • Start another instance of the producer in the new tab. Provide a specific unicorn name and notice data points for both unicorns in the dashboard.


Process streaming data

Create an Amazon DynamoDB table

Create an IAM role for your Lambda function

  • From the AWS Console, click on Services then select IAM in the Security, Identity & Compliance section.
  • Select Policies from the left navigation and then click Create policy.
  • Using the Visual editor, we’re going to create an IAM policy to allow our Lambda function access to the DynamoDB table created in the last section. To begin, select Service, begin typing DynamoDB in Find a service, and click DynamoDB.
  • Select Action, begin typing BatchWriteItem in Filter actions, and tick the BatchWriteItem checkbox.
  • In Region, enter the AWS Region in which you created the DynamoDB table in the previous section, e.g.: us-east-1.
  • In Account, enter your AWS Account ID, which is a twelve-digit number, e.g.: 123456789012. To find your AWS account ID number in the AWS Management Console, click on Support in the navigation bar in the upper-right, and then click Support Center. Your currently signed-in account ID appears in the upper-right corner below the Support menu.
  • In Table Name, enter UnicornSensorData.
  • Select Add.

  • Select Review policy.
  • Enter WildRydesDynamoDBWritePolicy in the Name field.

  • Select Create policy.
  • Select Roles from the left navigation and then select Create role.

  • Select Next: Permissions.
  • Begin typing AWSLambdaKinesisExecutionRole in the Filter text box and check the box next to that role.
  • Begin typing WildRydesDynamoDBWritePolicy in the Filter text box and check the box next to that policy.
  • Click Next: Review.
  • Enter WildRydesStreamProcessorRole for the Role name.

  • Click Create role.
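Taken together, the wizard steps above produce a policy that allows just dynamodb:BatchWriteItem on the one table. For reference, a sketch that creates the equivalent policy with the SDK (the region and account ID in the ARN are placeholders you must substitute):

// Hypothetical sketch: create the write policy the console wizard builds.
const AWS = require('aws-sdk');

const iam = new AWS.IAM();

const policyDocument = {
  Version: '2012-10-17',
  Statement: [{
    Effect: 'Allow',
    Action: 'dynamodb:BatchWriteItem',
    // Substitute your region and 12-digit account ID.
    Resource: 'arn:aws:dynamodb:us-east-1:123456789012:table/UnicornSensorData',
  }],
};

iam.createPolicy({
  PolicyName: 'WildRydesDynamoDBWritePolicy',
  PolicyDocument: JSON.stringify(policyDocument),
}).promise()
  .then((res) => console.log('created', res.Policy.Arn))
  .catch(console.error);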

Create a Lambda function to process the stream

'use strict';

const AWS = require('aws-sdk');

const dynamoDB = new AWS.DynamoDB.DocumentClient();
const tableName = process.env.TABLE_NAME;

exports.handler = function(event, context, callback) {
  const requestItems = buildRequestItems(event.Records);
  const requests = buildRequests(requestItems);

  Promise.all(requests)
    .then(() => callback(null, `Delivered ${event.Records.length} records`))
    .catch(callback);
};

// Decode each Kinesis record from base64 JSON into a DynamoDB PutRequest.
function buildRequestItems(records) {
  return records.map((record) => {
    const json = Buffer.from(record.kinesis.data, 'base64').toString('ascii');
    const item = JSON.parse(json);

    return {
      PutRequest: {
        Item: item,
      },
    };
  });
}

// Split the items into chunks of 25, the BatchWriteItem maximum.
function buildRequests(requestItems) {
  const requests = [];

  while (requestItems.length > 0) {
    const request = batchWrite(requestItems.splice(0, 25));
    requests.push(request);
  }

  return requests;
}

// Write a batch, retrying unprocessed items with exponential backoff.
function batchWrite(requestItems, attempt = 0) {
  const params = {
    RequestItems: {
      [tableName]: requestItems,
    },
  };

  let delay = 0;

  if (attempt > 0) {
    delay = 50 * Math.pow(2, attempt);
  }

  return new Promise(function(resolve, reject) {
    setTimeout(function() {
      dynamoDB.batchWrite(params).promise()
        .then(function(data) {
          if (data.UnprocessedItems.hasOwnProperty(tableName)) {
            return batchWrite(data.UnprocessedItems[tableName], attempt + 1);
          }
        })
        .then(resolve)
        .catch(reject);
    }, delay);
  });
}
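To exercise the function without live traffic, you can give it a hand-built Kinesis test event in the Lambda console; the data field of each record must be base64-encoded JSON. A small helper that prints such a payload (the record shape mirrors the DESTINATION_SQL_STREAM summary schema above; the values are made up):

// Hypothetical helper: print a base64 payload for a Lambda console test event.
const record = {
  Name: 'Shadowfax',
  StatusTime: '2018-01-01 00:00:00.000',
  Distance: 175,
  MinMagicPoints: 110, MaxMagicPoints: 120,
  MinHealthPoints: 140, MaxHealthPoints: 150,
};
console.log(Buffer.from(JSON.stringify(record)).toString('base64'));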



Monitor the Lambda function

  • Run the producer to start emitting sensor data to the stream with a unicorn name: ./producer -name Rocinante
  • Select the Monitor tab and explore the metrics available to monitor the function. Select Jump to Logs to explore the function’s log output.


Query the DynamoDB table

  • Select Services then select DynamoDB in the Database section.
  • Select Tables from the left-hand navigation.
  • Select UnicornSensorData.
  • Select the Items tab. Here you should see each per-minute data point for each unicorn for which you’re running a producer.

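The same items can be read programmatically. A sketch with the DocumentClient, assuming the table uses Name as its partition key (check your table's key schema; the unicorn name is a placeholder):

// Hypothetical sketch: fetch all stored data points for one unicorn.
const AWS = require('aws-sdk');

const dynamoDB = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

dynamoDB.query({
  TableName: 'UnicornSensorData',
  KeyConditionExpression: '#n = :name',
  ExpressionAttributeNames: { '#n': 'Name' },   // Name is a DynamoDB reserved word
  ExpressionAttributeValues: { ':name': 'Shadowfax' },
}).promise()
  .then((res) => res.Items.forEach((item) => console.log(item)))
  .catch(console.error);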

Store & query data

Create an Amazon S3 bucket

  • From the AWS Management Console select Services then select S3 under Storage.
  • Select + Create bucket.
  • Provide a globally unique name for your bucket such as wildrydes-data-yourname.

  • Select the region you’ve been using for your bucket.
  • Select Next twice, and then select Create bucket.
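Equivalently via the SDK (bucket names are globally unique, so substitute your own; the region constraint is a placeholder and is omitted entirely when creating in us-east-1):

// Hypothetical sketch: create the data bucket outside the console.
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

s3.createBucket({
  Bucket: 'wildrydes-data-yourname', // must be globally unique
  CreateBucketConfiguration: { LocationConstraint: 'us-west-2' }, // omit for us-east-1
}).promise()
  .then(() => console.log('bucket created'))
  .catch(console.error);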

Create an Amazon Kinesis Data Firehose delivery stream

Create an Amazon Athena table

  • Select Services then select Athena in the Analytics section.

  • If prompted, select Get started and exit the first-run tutorial by hitting the x in the upper right-hand corner of the modal dialog.
  • Copy and paste the following SQL statement to create the table. Replace the YOUR_BUCKET_NAME_HERE placeholder with your bucket name (e.g. wildrydes-data-johndoe) in the LOCATION clause:

CREATE EXTERNAL TABLE IF NOT EXISTS wildrydes (
  Name string,
  StatusTime timestamp,
  Latitude float,
  Longitude float,
  Distance float,
  HealthPoints int,
  MagicPoints int
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION 's3://YOUR_BUCKET_NAME_HERE/';



  • Select Run Query.
  • Verify the table wildrydes was created by ensuring it has been added to the list of tables in the left navigation.


Explore the batched data files

  • Select Services then select S3 in the Storage section.
  • Enter the bucket name you created in the first section in the Search for buckets text input.
  • Select one of the files and select Download. Open the file with a text editor and explore its content.

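You can also list the batched objects Firehose wrote without opening the console; a sketch, assuming the bucket name from earlier:

// Hypothetical sketch: list the batched data files in the bucket.
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

s3.listObjectsV2({ Bucket: 'wildrydes-data-yourname' }).promise()
  .then((res) => res.Contents.forEach((obj) => console.log(obj.Key, obj.Size)))
  .catch(console.error);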

Query the data files

  • Select Services then select Athena in the Analytics section.
  • Copy and paste the following SQL query (the query text is missing from this copy; a likely reconstruction appears after this list):
  • Select Run Query.

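The query text for this step did not survive in this copy of the guide. A query consistent with the step's intent, reading back all the rows Firehose batched into the wildrydes table defined above, would be:

SELECT * FROM wildrydes ORDER BY statustime;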

Clean Up

Source: https://github.com/Rajpratik71/Distributed-Systems-Complete/blob/master/Build%20a%20Serverless%20Real-Time%20Data%20Processing%20App.md