Kicking off his second AWS re:Invent keynote in Las Vegas, Adam Selipsky, CEO of Amazon Web Services, highlighted that the cloud is more of an imperative now than ever for enterprises seeking to drive efficiencies, benefit from its elasticity, innovate better, and stay agile amid the prevailing macroeconomic climate. Take, for instance, Airbnb, an AWS customer. When the pandemic hit the hospitality industry hard, Airbnb undertook a cost-reduction drive and was able to cut its cloud spend by 27% with almost no lead time. With the worst of the pandemic behind us, Airbnb has been able to build on the pent-up demand for tourism and hospitality and accelerate its growth, with the cloud emerging as a competitive differentiator.
With the cloud, data is increasingly taking center stage in enterprise decision-making; it sits at the core of every organization's digital transformation. AWS now offers more than 200 services. Throughout his keynote, Adam Selipsky underscored the importance of data and showcased new capabilities, integrations, and services from AWS that will enable enterprises to extract and analyze data more meaningfully.
These are some of the new announcements that Adam Selipsky highlighted in his re:Invent keynote.
With an intent to move toward a zero-ETL future on AWS, new integrations were announced that make it easier to connect and analyze data across data stores without having to move it between services. For instance, customers can analyze Amazon Aurora data with Amazon Redshift in near real time, eliminating the need to extract, transform, and load (ETL) data between the two services. Customers can also now easily run Apache Spark applications on Amazon Redshift data from AWS analytics and machine learning (ML) services such as Amazon EMR, AWS Glue, and Amazon SageMaker.
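Once Aurora data is replicated into Redshift through the zero-ETL integration, it can be queried like any other Redshift table. The sketch below illustrates this with the Redshift Data API via boto3; the cluster, database, user, and table names are hypothetical placeholders, not part of the announcement.

```python
# Hedged sketch: analyzing Aurora data replicated into Amazon Redshift
# (zero-ETL integration) using the Redshift Data API. All identifiers
# below -- cluster, database, schema, and table -- are hypothetical.

def build_recent_orders_sql(schema: str, table: str, days: int) -> str:
    """Compose a simple analytical query over the replicated data."""
    return (
        f"SELECT order_date, SUM(amount) AS revenue "
        f"FROM {schema}.{table} "
        f"WHERE order_date >= DATEADD(day, -{days}, CURRENT_DATE) "
        f"GROUP BY order_date ORDER BY order_date;"
    )

def run_query(sql: str, cluster: str, database: str, db_user: str) -> str:
    """Submit the statement asynchronously; returns a statement id
    that can later be polled with describe_statement /
    get_statement_result."""
    import boto3  # deferred so the sketch loads without AWS credentials
    client = boto3.client("redshift-data")
    resp = client.execute_statement(
        ClusterIdentifier=cluster,
        Database=database,
        DbUser=db_user,
        Sql=sql,
    )
    return resp["Id"]

# Example composition (the live call would require AWS credentials):
sql = build_recent_orders_sql("sales", "orders", days=30)
# statement_id = run_query(sql, "analytics-cluster", "dev", "admin")
```

Because Redshift queries run against continuously replicated data here, the same dashboard SQL that once waited on a nightly ETL job can reflect near-real-time Aurora writes.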
Amazon DataZone, a new data management service, makes it easier for enterprises to catalog, discover, share, and govern data stored across AWS, on-premises, and third-party sources. With Amazon DataZone, users throughout an organization can discover, use, and collaborate on data to derive insights. Data producers use Amazon DataZone’s web portal to set up their own business data catalog by defining their data taxonomy, configuring governance policies, and connecting to a range of AWS services (e.g., Amazon S3 and Amazon Redshift), partner solutions (e.g., Salesforce and ServiceNow), and on-premises systems.
Enterprises today have vast data pools that they use to make informed decisions, spot trends, and drive efficiencies. With new natural language querying capabilities and ML-powered forecasting in Amazon QuickSight Q, enterprises can now explore historical performance and forecast future business outcomes. QuickSight Q lets anyone explore trends and metrics without technical expertise, allowing users to derive new insights from the data that powers their dashboards and reports. For example, a sales user could ask, “Where did we sell the most items last year?” or a finance user could ask, “What is actual revenue compared to goal?”
Amazon Security Lake is a new purpose-built data lake that automatically aggregates an enterprise’s security data from cloud and on-premises sources in a standards-based format, enabling enterprises to manage that data throughout its lifecycle and act on it faster. The new service builds the security data lake on Amazon Simple Storage Service (Amazon S3) and uses AWS Lake Formation to automatically set up the data lake infrastructure in a customer’s AWS account, providing full control and ownership over security data.
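Because Security Lake lands its normalized data in S3 in the customer's own account, it can be queried with standard AWS analytics tools such as Amazon Athena. A minimal sketch of that pattern follows; the database, table, field names, and output bucket are hypothetical placeholders, not documented Security Lake schema.

```python
# Hedged sketch: querying security data that Amazon Security Lake has
# normalized into S3, using Amazon Athena. The database/table, the
# column names, and the results bucket below are all hypothetical.

def build_failed_login_sql(database: str, table: str, limit: int = 50) -> str:
    """Compose an Athena query surfacing recent failed logon events."""
    return (
        f'SELECT event_time, source_ip, activity_name '
        f'FROM "{database}"."{table}" '
        f"WHERE activity_name = 'Logon' AND status = 'Failure' "
        f"LIMIT {limit};"
    )

def start_athena_query(sql: str, database: str, output_s3: str) -> str:
    """Submit the query asynchronously; returns the execution id,
    which can be polled with get_query_execution / get_query_results."""
    import boto3  # deferred so the sketch loads without AWS credentials
    athena = boto3.client("athena")
    resp = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    return resp["QueryExecutionId"]

# Example composition (the live call would require AWS credentials):
sql = build_failed_login_sql("security_lake_db", "auth_events")
# exec_id = start_athena_query(sql, "security_lake_db", "s3://my-results/")
```

The design point is that the lake remains ordinary S3 data under the customer's control, so incident responders can bring whatever query engine they already use rather than learning a proprietary store.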
Supply chains have, in the recent past, witnessed volatile disruptions driven by geopolitical events as well as the pandemic. AWS Supply Chain enhances supply chain visibility and provides actionable insights to help businesses optimize supply chain processes and improve service levels. It automatically combines and analyzes data across multiple supply chain systems so businesses can observe their operations in real time, spot trends more quickly, and generate more accurate demand forecasts that ensure adequate inventory to meet customer expectations.
Amazon Omics is a purpose-built service that enables scientists to store, query, and analyze large genomic, transcriptomic, and other omics datasets, and then derive insights from that data to improve health and advance scientific discovery. With just a few clicks in the Omics console, one can import and normalize petabytes of data into formats optimized for analysis. Amazon Omics provides scalable workflows and integrated tools for preparing and analyzing omics data, and automatically provisions and scales the underlying cloud infrastructure.
AWS Clean Rooms, a new analytics service, enables companies across industries to easily and securely analyze and collaborate on their combined datasets without sharing the underlying raw data. With AWS Clean Rooms, customers can create a secure data clean room in minutes and collaborate with any other company in the AWS Cloud to generate unique insights about advertising campaigns, investment decisions, clinical research, and more. AWS Clean Rooms provides a broad set of built-in data access controls that protect sensitive data, including query controls, query output restrictions, query logging, and cryptographic computing tools.
When it comes to fostering sustainability, AWS will be water positive (water+) by 2030, returning more water to communities than it uses in its direct operations. AWS has been driving four key strategies in pursuit of becoming water+ by 2030. These include improving water efficiency, using sustainable water sources, returning water for community reuse, and supporting water replenishment projects.
This article has been written by Prabhu Ram, Head – Industry Intelligence Group, CyberMedia Research.