

  • Outdated question on Amazon Elastic Inference (AEI)

  • PKX01

    Member
    September 17, 2025 at 5:52 pm

    Amazon Elastic Inference (AEI) is no longer available as a standalone service; AWS now points customers to SageMaker-based options instead.

    Category: MLS – Machine Learning Implementation and Operations

    A Machine Learning Specialist has trained an Apache MXNet model using Amazon SageMaker. The Specialist wants to accelerate his inference workloads without having to pay for expensive GPU-based instances.

    Which is the MOST cost-effective solution for this problem?

    - Use a P2 or P3 instance type for inference.

    - Use Amazon Elastic Inference (marked as the correct answer)

    - Use Amazon Kinesis Data Streams to get a real-time inference.

    - Use Inference Pipeline

    Reference: https://aws.amazon.com/blogs/machine-learning/model-serving-with-amazon-elastic-inference/

  • Irene-TutorialsDojo

    Administrator
    September 18, 2025 at 12:55 pm

    Hello PKX01,

    Thank you for bringing this to our attention.

    You are correct that Amazon Elastic Inference (AEI) has been deprecated and is no longer available as a standalone option. The original intent of the question was to highlight a cost-effective way to accelerate inference without relying on GPU-based instances, but AWS now recommends using newer solutions such as SageMaker with Inferentia (Inf1/Inf2) instances, serverless inference, or multi-model endpoints for that purpose. We’ll update this item to reflect the latest AWS guidance so that it remains accurate and aligned with current best practices.
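    To make the serverless-inference option concrete, here is a minimal sketch of the endpoint configuration such a setup needs. The resource names are placeholders (not real resources), and in practice these dicts would be passed to the boto3 SageMaker client, e.g. `client.create_endpoint_config(**endpoint_config)` followed by `client.create_endpoint(...)`:

    ```python
    # Hypothetical sketch: request payload for a SageMaker serverless
    # inference endpoint, one of the AEI replacements mentioned above.
    # All names ("my-mxnet-model", etc.) are illustrative placeholders.
    endpoint_config = {
        "EndpointConfigName": "mxnet-serverless-config",
        "ProductionVariants": [
            {
                "VariantName": "AllTraffic",
                "ModelName": "my-mxnet-model",  # assumes the model is already registered
                # ServerlessConfig takes the place of the InstanceType you would
                # set for a GPU-backed variant -- billing is per request served,
                # not per instance-hour, which addresses the cost concern.
                "ServerlessConfig": {
                    "MemorySizeInMB": 2048,  # 1024-6144, in 1 GB increments
                    "MaxConcurrency": 5,     # max concurrent invocations
                },
            }
        ],
    }

    endpoint = {
        "EndpointName": "mxnet-serverless-endpoint",
        "EndpointConfigName": endpoint_config["EndpointConfigName"],
    }
    ```

    Because no instance type is specified, there is no GPU instance running (or billed) while the endpoint is idle.
    
    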

    If you have further questions or need additional clarification, please don’t hesitate to contact us.

    Best,

    Irene @ Tutorials Dojo
