Introducing storage optimized Amazon EC2 I8g instances powered by AWS Graviton4 processors and 3rd gen AWS Nitro SSDs

Today, we’re announcing the general availability of Amazon EC2 I8g instances, a new storage optimized instance type powered by AWS Graviton4 processors and third-generation AWS Nitro SSDs, delivering the highest real-time storage performance among storage optimized EC2 instances.

AWS Graviton4 is the most powerful and energy-efficient processor we have ever designed for a broad range of workloads running on EC2 instances, using a 64-bit Arm instruction set architecture. AWS Nitro SSDs are custom built by AWS and offer high I/O performance, low latency, minimal latency variability, and security with always-on encryption.

EC2 I8g instances are the first instance type to use third-generation AWS Nitro SSDs. These instances offer up to 22.5 TB of local NVMe SSD storage with up to 65 percent better real-time storage performance per TB and 60 percent lower latency variability compared to the previous generation I4g instances. Powered by AWS Graviton4 processors, I8g instances deliver up to 60 percent better compute performance and caches twice as large compared to I4g instances.

I8g instances offer up to 96 vCPUs, 768 GiB of memory, and 22.5 TB of storage to deliver more compute and storage choices compared with I4g instances.

| Instance name   | vCPUs | Memory (GiB) | Local NVMe storage (GB) | Network bandwidth (Gbps) | EBS bandwidth (Gbps) |
|-----------------|-------|--------------|-------------------------|--------------------------|----------------------|
| i8g.large       | 2     | 16           | 468                     | Up to 10                 | Up to 10             |
| i8g.xlarge      | 4     | 32           | 937                     | Up to 10                 | Up to 10             |
| i8g.2xlarge     | 8     | 64           | 1,875                   | Up to 12                 | Up to 10             |
| i8g.4xlarge     | 16    | 128          | 3,750                   | Up to 25                 | Up to 10             |
| i8g.8xlarge     | 32    | 256          | 7,500 (2 x 3,750)       | Up to 25                 | 10                   |
| i8g.12xlarge    | 48    | 384          | 11,250 (3 x 3,750)      | Up to 28.125             | 15                   |
| i8g.16xlarge    | 64    | 512          | 15,000 (4 x 3,750)      | Up to 37.5               | 20                   |
| i8g.24xlarge    | 96    | 768          | 22,500 (6 x 3,750)      | Up to 56.25              | 20                   |
| i8g.metal-24xl  | 96    | 768          | 22,500 (6 x 3,750)      | Up to 56.25              | 30                   |
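
If you prefer to inspect these specifications programmatically, the EC2 DescribeInstanceTypes API returns the vCPU, memory, instance storage, and network details for each size. The following is a minimal sketch using the AWS SDK for Python (boto3); it assumes your credentials are configured and that you query a Region that offers I8g instances.

```python
import boto3

# Assumes AWS credentials are configured and the Region offers I8g instances.
ec2 = boto3.client("ec2", region_name="us-east-1")

# List every I8g size with its vCPU, memory, local storage, and network specs.
paginator = ec2.get_paginator("describe_instance_types")
pages = paginator.paginate(Filters=[{"Name": "instance-type", "Values": ["i8g.*"]}])

for page in pages:
    for it in page["InstanceTypes"]:
        storage_gb = it.get("InstanceStorageInfo", {}).get("TotalSizeInGB", 0)
        print(
            f'{it["InstanceType"]}: '
            f'{it["VCpuInfo"]["DefaultVCpus"]} vCPUs, '
            f'{it["MemoryInfo"]["SizeInMiB"] // 1024} GiB memory, '
            f'{storage_gb} GB local NVMe storage, '
            f'{it["NetworkInfo"]["NetworkPerformance"]} network'
        )
```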

You can use I8g instances for I/O intensive workloads that require low-latency access to data, such as transactional databases (MySQL and PostgreSQL), real-time databases, NoSQL databases (Aerospike, Apache Druid, MongoDB), and real-time analytics such as Apache Spark.

Additionally, I8g instances are built on the AWS Nitro System, which offloads CPU virtualization, storage, and networking functions to dedicated hardware and software to enhance the performance and security of your workloads. The Graviton4 processors offer you enhanced security by fully encrypting all high-speed physical hardware interfaces.

Things to know
Here are some things that you should know about EC2 I8g instances:

  • Operating system – EC2 I8g instances support Amazon Linux 2023, Amazon Linux 2, CentOS Stream 8 or newer, Ubuntu 18.04 or newer, SUSE 15 SP2 or newer, Debian 11 or newer, Red Hat Enterprise Linux 8.2 or newer, CentOS 8.2 or newer, FreeBSD 13 or newer, Rocky Linux 8.4 or newer, AlmaLinux 8.4 or newer, and Alpine Linux 3.12.7 or newer. Because I8g instances run on Graviton4, you need a 64-bit Arm (arm64) build of your operating system (see the AMI lookup sketch after this list).
  • Networking – You can use I8g instances for storage intensive workloads that typically have burst network usage patterns. All I8g instance sizes have burst network bandwidth and can burst for more than 60 minutes, depending on the instance size, to support the majority of workloads that require instance storage data hydration, backup, and snapshots over the network.
  • Migration – If you’re using I4g instances today, migrating storage intensive workloads to I8g instances is straightforward because these instances offer memory and storage ratios similar to existing I4g instances.
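
As noted in the operating system item above, you need a 64-bit Arm AMI to launch I8g instances. Here is a minimal sketch using the AWS SDK for Python (boto3) that resolves the latest Amazon Linux 2023 Arm64 AMI through AWS Systems Manager public parameters; the exact parameter path shown is an assumption, so verify the name for your Region before relying on it.

```python
import boto3

# Assumes AWS credentials and a default Region are configured.
ssm = boto3.client("ssm", region_name="us-east-1")

# Public parameter for the latest Amazon Linux 2023 Arm64 AMI.
# NOTE: this path is an assumption; check the Systems Manager public
# parameters for the exact name in your Region.
PARAM = "/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-default-arm64"

ami_id = ssm.get_parameter(Name=PARAM)["Parameter"]["Value"]
print(f"Latest Amazon Linux 2023 Arm64 AMI: {ami_id}")
```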

Now available
Amazon EC2 I8g instances are now available in the US East (N. Virginia) and US West (Oregon) AWS Regions. You can purchase them as On-Demand Instances, Spot Instances, Dedicated Instances, or Dedicated Hosts, and you can reduce costs with Savings Plans.

Give EC2 I8g instances a try in the Amazon EC2 console. To learn more, visit the EC2 I8g instances page and send feedback to AWS re:Post for EC2 or through your usual AWS Support contacts.
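
If you’d rather try them from code than from the console, the following is a minimal boto3 sketch that launches a single i8g.large instance. The AMI ID, key pair name, and subnet ID are placeholders you would replace with your own values.

```python
import boto3

# Assumes AWS credentials are configured and the Region offers I8g instances.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Placeholder values: replace with your own Arm64 AMI, key pair, and subnet.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # an Arm64 AMI, e.g. Amazon Linux 2023
    InstanceType="i8g.large",
    KeyName="my-key-pair",
    SubnetId="subnet-0123456789abcdef0",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "i8g-test"}],
    }],
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")
```

Remember to terminate the instance when you’re done to avoid ongoing charges.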

Channy

