
Amazon Kinesis Data Streams is used to collect large streams of data records in real time. You can create data-processing applications, known as Kinesis Data Streams applications. A typical Kinesis Data Streams application reads data from a data stream as data records.

Amazon Kinesis Data Streams provides two APIs for putting data into a stream: PutRecord and PutRecords. PutRecord writes a single data record per API call, while PutRecords writes multiple data records in a single API call.
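A minimal sketch of the difference using boto3 (the stream name `demo-stream` and the record payloads are assumptions for illustration):

```python
import json

def build_entries(items):
    """Build the Records list for a PutRecords batch call,
    one entry per item, keyed by its index."""
    return [
        {"Data": json.dumps(item).encode("utf-8"), "PartitionKey": str(i)}
        for i, item in enumerate(items)
    ]

if __name__ == "__main__":
    import boto3  # the AWS SDK for Python

    kinesis = boto3.client("kinesis")

    # PutRecord: one record per API call.
    kinesis.put_record(
        StreamName="demo-stream",
        Data=b'{"event": "single"}',
        PartitionKey="key-1",
    )

    # PutRecords: up to 500 records in one API call.
    kinesis.put_records(
        StreamName="demo-stream",
        Records=build_entries([{"event": i} for i in range(3)]),
    )
```

The partition key decides which shard each record lands on, so a batch with varied keys spreads load across shards.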

Amazon Kinesis Data Firehose is a fully managed service for delivering real-time streaming data to destinations such as Amazon Simple Storage Service (Amazon S3), Amazon Redshift, and Amazon OpenSearch Service.

AWS Lambda is an event-driven, serverless computing platform provided by Amazon as part of Amazon Web Services. In AWS Lambda, code is executed in response to events in AWS services, such as adding or deleting files in an S3 bucket, or HTTP requests from Amazon API Gateway.

AWS Identity and Access Management (IAM) roles are identities you can create and assign specific permissions to perform actions in AWS. When a trusted identity assumes an IAM role, it receives only the permissions scoped by that role.

Amazon Simple Storage Service (Amazon S3) is an object storage service that offers scalability, data availability, security, and performance. You can use Amazon S3 to store and retrieve data at any time.

Now we can check how this works in practice. We need a few components to work with real-time streaming data:

  1. An IAM role
  2. A Kinesis data stream
  3. A Lambda function to write data to the stream
  4. A Lambda function to read data from the stream

To create a role, go to the IAM console, choose Roles, create a new role, and attach the required permissions.
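For this walkthrough, the role needs permission to put and get records on the stream. A minimal policy sketch (the account ID and stream name in the ARN are placeholders; your own values go there):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:PutRecord",
        "kinesis:PutRecords",
        "kinesis:GetRecords",
        "kinesis:GetShardIterator",
        "kinesis:DescribeStream",
        "kinesis:ListStreams"
      ],
      "Resource": "arn:aws:kinesis:us-east-1:123456789012:stream/demo-stream"
    }
  ]
}
```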

In the AWS console, open the Kinesis service and choose Data streams.

Enter a name for your Kinesis data stream, then click Create data stream. The stream is created right away.
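The same stream can also be created from code; a minimal sketch with boto3 (the stream name and shard count are assumptions):

```python
def build_stream_config(name, shard_count=1):
    """Parameters for kinesis.create_stream: the stream name plus how
    many shards (units of read/write capacity) it starts with."""
    return {"StreamName": name, "ShardCount": shard_count}

if __name__ == "__main__":
    import boto3

    kinesis = boto3.client("kinesis")
    kinesis.create_stream(**build_stream_config("demo-stream"))
    # Wait until the stream is ACTIVE before putting records into it.
    kinesis.get_waiter("stream_exists").wait(StreamName="demo-stream")
```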

To create a Lambda function, open the Functions page of the Lambda console and click Create function.

Let’s proceed with Author from scratch. Give the function a name and choose a runtime; supported runtimes include Node.js, Python, Go, Ruby, Java, and .NET (C#). Attach the IAM role created earlier, which grants the function its permissions. Click Create function, and your function is ready to work.

First, the writing part: write-kstream.py.
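A minimal sketch of what a write-kstream.py handler can look like (the stream name and the payload shape are assumptions):

```python
# write-kstream.py -- sketch: stream name and payload shape are assumptions.
import json
import time

STREAM_NAME = "demo-stream"

def build_record(payload, partition_key):
    """Serialize a payload into a Kinesis record entry."""
    return {
        "Data": json.dumps(payload).encode("utf-8"),
        "PartitionKey": partition_key,
    }

def lambda_handler(event, context):
    import boto3  # provided by the Lambda runtime

    kinesis = boto3.client("kinesis")
    record = build_record({"message": "hello", "ts": time.time()}, "key-1")
    resp = kinesis.put_record(StreamName=STREAM_NAME, **record)
    return {"statusCode": 200, "body": resp["SequenceNumber"]}
```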

Now, the reading part: read-kstream.py.
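And a sketch of read-kstream.py, assuming the Kinesis stream is configured as the function's event source. Kinesis delivers each record's data base64-encoded inside the event:

```python
# read-kstream.py -- sketch: assumes the Kinesis stream is the Lambda trigger.
import base64
import json

def decode_record(record):
    """Kinesis event records carry the payload base64-encoded."""
    raw = base64.b64decode(record["kinesis"]["data"])
    return json.loads(raw)

def lambda_handler(event, context):
    for record in event.get("Records", []):
        payload = decode_record(record)
        print("Received:", payload)
    return {"statusCode": 200, "records": len(event.get("Records", []))}
```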

After running that code successfully, you can check your Kinesis stream's metrics to see the flow of data.

Now we can create a delivery stream to deliver the records from the Kinesis data stream to a destination such as S3 through Kinesis Data Firehose.

To create a delivery stream, go to the Kinesis console, choose Delivery streams, and then choose the source and destination as per your requirements.

Then click Create delivery stream; it takes some time to create. Once created, you are taken to its dashboard automatically.
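The delivery stream can also be created from code; a sketch with boto3 (all ARNs are placeholders for your own stream, bucket, and role):

```python
def build_delivery_config(name, stream_arn, bucket_arn, role_arn):
    """Parameters for firehose.create_delivery_stream with a Kinesis
    data stream as the source and an S3 bucket as the destination."""
    return {
        "DeliveryStreamName": name,
        "DeliveryStreamType": "KinesisStreamAsSource",
        "KinesisStreamSourceConfiguration": {
            "KinesisStreamARN": stream_arn,
            "RoleARN": role_arn,
        },
        "ExtendedS3DestinationConfiguration": {
            "BucketARN": bucket_arn,
            "RoleARN": role_arn,
        },
    }

if __name__ == "__main__":
    import boto3

    firehose = boto3.client("firehose")
    firehose.create_delivery_stream(
        **build_delivery_config(
            "demo-delivery-stream",
            "arn:aws:kinesis:us-east-1:123456789012:stream/demo-stream",
            "arn:aws:s3:::demo-bucket",
            "arn:aws:iam::123456789012:role/demo-firehose-role",
        )
    )
```

The role referenced here must allow Firehose to read from the stream and write to the bucket.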

To store the delivered data, there is one more step required: we have to create an S3 bucket to hold those records.

To create a bucket, go to the S3 console, simply click Create bucket, choose your bucket name and region, then click Create bucket.
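Bucket creation can be scripted as well; a sketch with boto3 (the bucket name is a placeholder and must be globally unique; regions other than us-east-1 require an explicit location constraint):

```python
def build_bucket_params(name, region=None):
    """Parameters for s3.create_bucket; regions other than
    us-east-1 need a CreateBucketConfiguration."""
    params = {"Bucket": name}
    if region and region != "us-east-1":
        params["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return params

if __name__ == "__main__":
    import boto3

    s3 = boto3.client("s3")
    s3.create_bucket(**build_bucket_params("demo-kinesis-records", "ap-south-1"))
```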

We can now test our delivery stream by putting records through a Lambda function to the Kinesis data stream.