
  • Amazon Elasticsearch Service data ingestion

  • claude

    April 29, 2020 at 8:22 pm


    In the Quiz practice exams, Review Mode Set 1 – AWS Certified DevOps Engineer Professional:

    QUESTION 25:

    A fast-growing company has multiple AWS accounts which are consolidated using AWS Organizations and they expect to add new accounts soon. As the DevOps engineer, you were instructed to design a centralized logging solution to deliver all of their VPC Flow Logs and CloudWatch Logs across all of their sub-accounts to their dedicated Audit account for compliance purposes. The logs should also be properly indexed in order to perform search, retrieval, and analysis.

    Which of the following is the MOST suitable solution that you should implement to meet the above requirements?

    Supposed correct answer:

    In the Audit account, create a new stream using Kinesis Data Streams to send all of the logs to an Amazon ES cluster. Create a CloudWatch subscription filter and use Kinesis Data Streams to stream all of the VPC Flow Logs and CloudWatch Logs from the sub-accounts to the Kinesis data stream in the Audit account.

    Explanation:

    You can load streaming data into your Amazon Elasticsearch Service domain from many different sources in AWS. Some sources, like Amazon Kinesis Data Firehose and Amazon CloudWatch Logs, have built-in support for Amazon ES. Others, like Amazon S3, Amazon Kinesis Data Streams, and Amazon DynamoDB, use AWS Lambda functions as event handlers. The Lambda functions respond to new data by processing it and streaming it to your domain.
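    The Lambda event-handler pattern described above can be sketched roughly as follows. This is a minimal, illustrative sketch, assuming CloudWatch Logs data arriving through the Kinesis stream (subscription-filter payloads are gzip-compressed, base64-encoded JSON); the actual indexing call to the ES domain is left as a comment since it depends on your domain endpoint and signing setup.

    ```python
    import base64
    import gzip
    import json

    def decode_cwl_payload(kinesis_data):
        """CloudWatch Logs subscription filters deliver gzip-compressed,
        base64-encoded JSON to the Kinesis stream; unpack one record's data."""
        return json.loads(gzip.decompress(base64.b64decode(kinesis_data)))

    def handler(event, context):
        """Hypothetical Lambda event handler for the Kinesis data stream.
        It flattens each record's log events into documents; in practice you
        would then bulk-index `docs` into the Amazon ES domain (for example
        by POSTing them to the domain's /_bulk endpoint with signed requests)."""
        docs = []
        for record in event["Records"]:
            payload = decode_cwl_payload(record["kinesis"]["data"])
            for log_event in payload.get("logEvents", []):
                docs.append({
                    "log_group": payload.get("logGroup"),
                    "timestamp": log_event["timestamp"],
                    "message": log_event["message"],
                })
        return docs
    ```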

    I fully agree with the explanations, but not with the answer.

    You cannot feed Amazon ES directly from a Kinesis data stream the way you can with Kinesis Data Firehose.

    So the sentence "create a new stream using Kinesis Data Streams to send all of the logs to an Amazon ES cluster" is incorrect.

    What do you think ?



  • Jon-Bonso

    April 30, 2020 at 12:44 pm

    Hi Claude,

    Thank you for posting your question. You have a valid point: you can't send the data directly from a Kinesis data stream to Amazon ES. You have to use a Lambda function as an event handler that sends the data to your ES cluster.



    To avoid any ambiguity in the answers, we will update the correct answer to highlight the need for an additional Lambda function (serving as an event handler) for the Kinesis Data Streams.
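    For reference, the sub-account side of the updated answer could be sketched as below, using the CloudWatch Logs `PutSubscriptionFilter` API. The client is passed in so the snippet stays generic (it would be `boto3.client("logs")` in practice), and every name and ARN here is illustrative, not from the original question.

    ```python
    def subscribe_to_audit_stream(logs_client, log_group, destination_arn, role_arn):
        """Sketch: in a sub-account, forward every event of `log_group` toward
        the Kinesis data stream in the Audit account. For cross-account
        delivery, `destination_arn` is typically a CloudWatch Logs destination
        in the Audit account that fronts the Kinesis stream."""
        params = {
            "logGroupName": log_group,
            "filterName": "audit-central-logging",  # hypothetical filter name
            "filterPattern": "",                    # empty pattern matches every log event
            "destinationArn": destination_arn,
            "roleArn": role_arn,  # role letting CloudWatch Logs write to the stream
        }
        logs_client.put_subscription_filter(**params)
        return params
    ```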

    Let us know if you need further assistance. The Tutorials Dojo team is dedicated to helping you pass your AWS exam on your first try!


    Jon Bonso @ Tutorials Dojo

  • claude

    April 30, 2020 at 1:49 pm

    Hi JB,

    Thanks for the fast reply and explanations.

    It really helps with my learning.



    • Jon-Bonso

      May 1, 2020 at 9:33 am

      You’re welcome, Claude! The Tutorials Dojo team is always here to help!
