Category: AWS Events
Date Published: December 3, 2021

The AWS re:Invent Rundown – Highlights in Data, AI & ML

We’ve been checking in on the daily releases, announcements and innovations in cloud provider technology as re:Invent continues to deliver on its 10th anniversary!

What’s caught our eye at this year’s AWS re:Invent

As fast as it’s arrived, the final day of the 2021 Las Vegas AWS re:Invent is here! What a show it has been!

In today’s AWS Insight, we’ll be talking about the latest news that will certainly contribute to our direction, training and focus in 2022, covering artificial intelligence, machine learning and data-focused tools and services.

Education for all

AWS have come out of the gates swinging by making it a key initiative to inspire young learners to consider a career in tech. AWS announced the launch of the AWS AI & ML Scholarship program in collaboration with Intel and Udacity, designed to prepare underrepresented and underserved students globally for careers in ML.

The AWS AI & ML Scholarship program is launching as part of the all-new AWS DeepRacer Student service and Student League. This is a new student division of the popular AWS DeepRacer program, a cloud-based 3D car racing simulator that provides a fun way to learn about ML and reinforcement learning (RL).
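To give a flavour of what students actually build, the DeepRacer simulator calls a Python reward function on every step of a race and trains the car’s policy against it. Here’s a minimal sketch; the centre-line thresholds are purely illustrative, not a recommended racing strategy.

```python
def reward_function(params):
    """A minimal DeepRacer-style reward function: favour staying near the
    centre line. The params dict is supplied by the simulator each step."""
    track_width = params["track_width"]
    distance_from_center = params["distance_from_center"]

    # Reward tiers based on how far the car has drifted from the centre line.
    if distance_from_center <= 0.1 * track_width:
        reward = 1.0
    elif distance_from_center <= 0.25 * track_width:
        reward = 0.5
    elif distance_from_center <= 0.5 * track_width:
        reward = 0.1
    else:
        reward = 1e-3  # probably off track

    return float(reward)
```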

How to apply?

To be considered for a scholarship, you must successfully finish all AWS DeepRacer Student learning modules and achieve a score of at least 80% on all course quizzes, reach a certain lap time performance with your DeepRacer car in the Student League, and submit an essay.

Each year, 2,000 students will win a scholarship to the AI Programming with Python Udacity Nanodegree program. Udacity Nanodegrees are massive open online courses (MOOCs) designed to bridge the gap between learning and career goals.

To enroll in the AWS AI & ML Scholarship program, simply sign up at the AWS DeepRacer Student service with a valid email address.

Enrolling in the AWS DeepRacer Student ML program.

Amazon Redshift Serverless is here!

AWS have seen the use of data analytics expanding to new audiences within organizations, for example developers and business analysts who don’t have the expertise or the time to manage a traditional data warehouse.

Three days ago, AWS introduced Amazon Redshift Serverless, a new capability that makes it super easy to run analytics in the cloud with high performance at any scale. Just load your data and start querying. There is no need to set up and manage clusters. You pay for the duration in seconds when your data warehouse is in use, for example while you are querying or loading data. There is no charge when your data warehouse is idle.

Amazon Redshift Serverless automatically provisions the right compute resources for you to get started. As your demand evolves with more concurrent users and new workloads, your data warehouse scales seamlessly and automatically to adapt to the changes. You can optionally specify the base data warehouse size to have additional control on cost and application-specific SLAs.
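Because there’s no cluster to stand up first, querying can be as direct as calling the Redshift Data API. A minimal boto3 sketch along these lines shows the idea; the `default` workgroup name, `dev` database and sample `venue` table are assumptions for illustration, so swap in your own.

```python
import time
import boto3

WORKGROUP = "default"  # hypothetical serverless workgroup name
DATABASE = "dev"       # hypothetical database name

client = boto3.client("redshift-data")

# Submit a query against the serverless workgroup -- no cluster to manage.
statement = client.execute_statement(
    WorkgroupName=WORKGROUP,
    Database=DATABASE,
    Sql="SELECT venuename, venuecity FROM venue LIMIT 5;",
)

# Poll until the statement finishes, then fetch the rows.
while True:
    status = client.describe_statement(Id=statement["Id"])
    if status["Status"] in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(1)

if status["Status"] == "FINISHED":
    result = client.get_statement_result(Id=statement["Id"])
    for row in result["Records"]:
        print([col.get("stringValue") for col in row])
```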

You can learn more about Amazon Redshift Serverless here.

Setting up Redshift Serverless with default or custom settings.

Experiment with Machine Learning for free!

AWS announced the public preview of Amazon SageMaker Studio Lab, a free service that enables anyone to learn and experiment with ML without needing an AWS account, credit card or cloud configuration knowledge.

Studio Lab is based on open-source JupyterLab and gives users free access to AWS compute resources to quickly start learning and experimenting with ML. Studio Lab is also simple to set up. In fact, the only configuration you have to do is one click to choose whether you need a CPU or GPU instance for your project.
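Once a project spins up, the notebook behaves like any other JupyterLab session. As a quick sanity check, a cell like the one below confirms which runtime you chose; note that PyTorch isn’t necessarily preinstalled in the Studio Lab environment, so this assumes you’ve added it via conda or pip first.

```python
# Confirm whether the runtime you picked (CPU or GPU) exposes a CUDA device.
# Assumes PyTorch has been installed into the Studio Lab environment.
import torch

if torch.cuda.is_available():
    print("GPU runtime:", torch.cuda.get_device_name(0))
else:
    print("CPU runtime - fine for small experiments and coursework")
```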

Click the preview image below to get started with SageMaker Studio Lab (we’ll see you in there)!

Intuitive UI to look at predictive outcomes of datasets.

The 6th Pillar has arrived!

The AWS Well-Architected Framework has been helping AWS customers improve their cloud architectures since 2015. The framework consists of design principles, questions and best practices across multiple pillars:

• Operational Excellence
• Security
• Reliability
• Performance Efficiency
• Cost Optimization

Yesterday, AWS revealed the 6th Pillar – Sustainability.

The Sustainability Pillar aims to help organizations learn about, measure and improve their workloads using environmental best practices for cloud computing.

Similar to the other pillars, the Sustainability Pillar contains questions aimed at evaluating the design, architecture and implementation of your workloads to reduce their energy consumption and improve their efficiency. The pillar is designed as a tool to track progress towards policies and best practices that support a more sustainable future, not just a simple checklist.
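Because the pillar plugs into the same AWS Well-Architected Tool as the other five, its review questions can also be pulled programmatically. Here’s a hedged boto3 sketch; the workload ID is a placeholder and the `sustainability` pillar identifier is an assumption based on how the other pillar IDs are named.

```python
import boto3

WORKLOAD_ID = "your-workload-id"  # placeholder -- use one of your own reviews

wa = boto3.client("wellarchitected")

# List the review questions for the new pillar. The "sustainability"
# PillarId is an assumption, mirroring the naming of the existing pillars.
answers = wa.list_answers(
    WorkloadId=WORKLOAD_ID,
    LensAlias="wellarchitected",
    PillarId="sustainability",
)

for summary in answers["AnswerSummaries"]:
    print(summary["QuestionTitle"], "-", summary.get("Risk", "UNANSWERED"))
```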

As a Partner that prides itself on community outreach as well as hosting a wealth of internal programs for team members, sustainability is at the heart of what we do. We couldn’t be more delighted to see this new addition to the AWS pillar system and are already planning on integrations within our own Well Architected Framework Reviews.

If you’re interested in how this pillar will shape your future infrastructure builds, visit the shared model of sustainability section here.

The AWS Shared Model of Sustainability.

“Redshift Serverless and the clear direction of machine learning courses, training and advancement make for an exciting future with AWS. 2022 is sure to be yet another year where ML & AI integration into cloud infrastructures becomes the standard and best practice.”

Ahmed Nuaman

Managing Director - Firemind

Introducing SageMaker Training Compiler

Never thought you could accelerate the training of deep learning models by up to 50%? Well, now you can! Say hello to SageMaker Training Compiler, which automatically compiles your Python training code and generates GPU kernels specifically for your model.

The technicalities – SageMaker Training Compiler uses graph-level optimization (operator fusion, memory planning and algebraic simplification), data flow-level optimizations (layout transformation, common sub-expression elimination), and back-end optimizations (memory latency hiding, loop oriented optimizations) to produce an optimized model that efficiently uses hardware resources. As a result, training is accelerated by up to 50% and the returned model is the same as if SageMaker Training Compiler had not been used.

AWS have provided some helpful guidance that demystifies the technical features of this new service. Click the image below to see how adding just two lines of code enables the compiler for your models.

It’s as simple as adding two lines of code! Using Training Compiler in your training jobs.
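For context, here is a hedged sketch of what those two lines look like with the SageMaker Python SDK’s Hugging Face estimator; the import and the `compiler_config` argument are the additions, while the role ARN, script name, bucket and framework versions below are placeholders you’d replace with supported values from the SageMaker docs.

```python
from sagemaker.huggingface import HuggingFace, TrainingCompilerConfig  # line 1: import the config

# Illustrative estimator settings -- role, script, bucket and framework
# versions are placeholders for this sketch.
estimator = HuggingFace(
    entry_point="train.py",
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    transformers_version="4.11",
    pytorch_version="1.9",
    py_version="py38",
    compiler_config=TrainingCompilerConfig(),  # line 2: turn the compiler on
)

estimator.fit({"train": "s3://your-bucket/train/"})
```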

Calling all Developers – Let’s save you some time!

On Wednesday, AWS revealed Amazon DevOps Guru for RDS, a new capability for Amazon DevOps Guru that allows developers to easily detect, diagnose and resolve performance and operational issues in Amazon Aurora.

Hundreds of thousands of customers are currently using Amazon Aurora because it is highly available, scalable and durable. But as applications grow in size and complexity, it becomes more challenging for these customers to detect and resolve operational and performance issues quickly.

Since last year’s re:Invent, where DevOps Guru was first announced, the product team have made great strides in its evolution. It can now use ML to automatically identify and analyse a wide range of performance-related database issues, such as over-utilization of host resources, database bottlenecks or misbehaving SQL queries. It also recommends solutions to remediate the issues it finds.
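If you’d rather explore findings programmatically than in the console, a minimal boto3 sketch along the following lines lists the currently ongoing reactive insights and their suggested fixes; treat the exact response fields as our best understanding of the DevOps Guru API rather than gospel.

```python
import boto3

guru = boto3.client("devops-guru")

# Pull the ongoing reactive insights, i.e. issues detected right now.
insights = guru.list_insights(
    StatusFilter={"Ongoing": {"Type": "REACTIVE"}}
)

for insight in insights.get("ReactiveInsights", []):
    print(insight["Severity"], "-", insight["Name"])

    # DevOps Guru also suggests remediation steps for each insight.
    recs = guru.list_recommendations(InsightId=insight["Id"])
    for rec in recs.get("Recommendations", []):
        print("  recommendation:", rec.get("Description"))
```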

To fully understand how it works, we recommend checking out the launch article here.

This image is an example of a finding reported by DevOps Guru for RDS. The graph shows that most of the average active sessions (AAS) were waiting for access to a table or for CPU.

In Summary

With today still to go, we’re looking forward to seeing what else is on the horizon at re:Invent, especially with new data and ML services.

AWS have certainly come out strong this year with a heavy focus on tools that make data science easier, cloud management seamless, database costs lower and computing power stronger.

To see all of the top announcements so far at this year’s event, visit the AWS News Blog. If you’re already looking for a partner who can help you harness your data, reach out to us.

Jodie Rhodes - Digital Marketing Assistant