New data governance and sharing, business intelligence, supply chain management, security, AI/ML, and spatial simulation tools and capabilities – it’s been a busy week at AWS re:Invent, with AWS rolling out a slew of new services.
Here are some of the most important announcements from the annual AWS conference.
Real-world simulation
Dynamic 3D experiences help organizations across all industries (transportation, robotics, public safety) understand possible real-world outcomes and train accordingly.
For example, they can determine new workflows for a factory, run different natural-disaster response scenarios, or evaluate different combinations of road closures.
But complex spatial simulations require significant computational resources, and it can be difficult and expensive to integrate and scale simulations with millions of interacting objects across compute instances.
To help customers build, operate, and run large-scale spatial simulations, AWS has launched AWS SimSpace Weaver. The fully managed compute service lets users deploy spatial simulations that model systems with many data points, such as traffic patterns across a city, crowd flows in a venue, or the layout of a factory. These can then be used to conduct immersive training and gather critical information, according to AWS.
Users can run simulations with over a million entities (people, cars, traffic lights, roads) interacting in real time. “Like an actual city, the simulation is a vast ‘world’ unto itself,” according to AWS.
When a customer is ready to deploy, SimSpace Weaver automatically configures the environment, connects up to 10 Amazon EC2 instances in a networked cluster, and distributes the simulation across those instances. The service then manages network and memory configurations, replicating and synchronizing data across instances to create a single, unified simulation that multiple users can interact with and manipulate in real time, AWS said.
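The core idea behind distributing a simulation this way is spatial partitioning: the world is divided into regions, and each instance owns the entities inside its region. The sketch below is purely illustrative (it is not the SimSpace Weaver API); the grid size, world size, and entity names are assumptions for the example.

```python
# Illustrative sketch of spatial partitioning, the idea behind spreading a
# large simulation across instances. Each worker owns one cell of a square
# grid; an entity is routed to the worker whose cell contains its position.
from dataclasses import dataclass

@dataclass
class Entity:
    name: str
    x: float
    y: float

def assign_worker(entity: Entity, world_size: float, grid_dim: int) -> int:
    """Map an entity's (x, y) position to a worker index on a grid_dim x grid_dim grid."""
    cell = world_size / grid_dim
    col = min(int(entity.x // cell), grid_dim - 1)  # clamp entities on the far edge
    row = min(int(entity.y // cell), grid_dim - 1)
    return row * grid_dim + col

# Distribute a handful of entities across a 2x2 cluster covering a 100x100 world.
entities = [Entity("car-1", 10, 10), Entity("bus-7", 90, 20), Entity("light-3", 40, 80)]
placement = {e.name: assign_worker(e, world_size=100.0, grid_dim=2) for e in entities}
```

In a real system, entities near cell borders must also be replicated to neighboring workers so interactions across boundaries stay consistent, which is the synchronization work the managed service takes on.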
Customers include Duality Robotics, Epic Games, and Lockheed Martin; the latter worked with AWS to develop a San Francisco earthquake-recovery demo illustrating how first responders could stage a humanitarian aid mission.
“We need to be able to simulate on a real-world scale to be sure that the information we get from simulation is transferable to reality,” said Wesley Tanis, virtual prototyping engineer at Lockheed Martin.
Working with AWS, they were able to simulate more than a million objects “on a continental scale,” he said, “giving us real-world insight to improve our situational preparedness and planning in a wide range of scenarios, including natural disasters.”
Better data management
Organizations today collect petabytes, even exabytes, of data spread across multiple departments, on-premises databases, and third-party sources.
But before they can unlock the full value of this data, administrators and data stewards need to make it accessible. At the same time, they must maintain control and governance to ensure that data is only accessible to the right person and in the right context.
The new Amazon DataZone service was launched to help organizations catalog, discover, share, and manage data across AWS, on-premises, and third-party sources.
“Good governance is the foundation that makes data accessible to the entire organization,” said Swami Sivasubramanian, vice president of databases, analytics, and ML at AWS. “But customers often tell us it’s hard to find the right balance between making data visible and maintaining control.”
Using the new service’s web portal, organizations can set up their own enterprise data catalog by defining their data taxonomy, configuring governance policies, and connecting to a range of AWS services (such as Amazon S3 or Amazon Redshift), partner solutions (such as Salesforce and ServiceNow), and on-premises systems, Sivasubramanian said.
ML is used to collect and suggest metadata for each dataset; once the catalogs are configured, users can search for and discover assets through the Amazon DataZone web portal, examine metadata for context, and request access to datasets. The new tool is integrated with AWS analytics services – Amazon Redshift, Amazon Athena, Amazon QuickSight – so consumers can access it as part of their data projects.
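The catalog-and-discovery pattern described above can be sketched in a few lines. This is a hypothetical illustration of the pattern, not the DataZone API; the dataset names, sources, and tags are invented for the example.

```python
# Hypothetical sketch of a searchable data catalog: datasets are registered
# with metadata (source system, tags), then discovered by keyword search.
catalog: list[dict] = []

def register(name: str, source: str, tags: list[str]) -> None:
    """Add a dataset entry with its metadata to the catalog."""
    catalog.append({"name": name, "source": source, "tags": set(tags)})

def search(keyword: str) -> list[str]:
    """Return dataset names whose name or tags match the keyword."""
    kw = keyword.lower()
    return [d["name"] for d in catalog
            if kw in d["name"].lower() or any(kw in t for t in d["tags"])]

register("orders_2022", "Amazon Redshift", ["sales", "orders"])
register("leads", "Salesforce", ["crm", "sales"])
register("clickstream", "Amazon S3", ["web", "events"])
```

The service's advantage over a hand-rolled catalog like this is that the metadata is suggested by ML and access requests are governed by policy rather than granted ad hoc.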
As Sivasubramanian put it, the new service “liberates data across the organization, so every employee can help generate new insights to maximize their value.”
Secure data sharing
Similarly, to obtain critical information, organizations often wish to supplement their data with that of their partners. At the same time, however, they must protect sensitive consumer information and reduce or eliminate the sharing of raw data.
Historically, this has often meant sharing copies of user-level data and trusting partners to fully honor contractual agreements.
Data clean rooms can help address this challenge because they allow multiple parties to combine and analyze their data in a protected environment where participants cannot see each other’s raw data. But clean rooms can be difficult to build, requiring complex privacy controls and specialized data movement tools.
AWS Clean Rooms aims to make this process easier. Organizations can now quickly create secure clean rooms and collaborate with any other business in the AWS Cloud.
According to AWS, customers choose the partners they want to collaborate with, select their datasets, and configure restrictions for participants. They have access to configurable data access controls including query controls, query output restrictions, and query logging, while advanced cryptographic computing tools maintain data encryption.
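One way to picture a query output restriction is an aggregate-only rule with a minimum group size, so no participant can isolate individual rows. The sketch below illustrates that idea only; it is not the AWS Clean Rooms API, and the threshold and field names are assumptions.

```python
# Illustrative sketch of a clean-room style query restriction: callers get
# only aggregate counts, and groups below a minimum size are suppressed so
# near-individual records cannot be inferred from the output.
from collections import Counter

MIN_GROUP_SIZE = 2  # assumed threshold for this sketch

def aggregate_only(rows: list[dict], group_key: str) -> dict:
    """Count rows per group, dropping any group smaller than MIN_GROUP_SIZE."""
    counts = Counter(r[group_key] for r in rows)
    return {k: v for k, v in counts.items() if v >= MIN_GROUP_SIZE}

# Campaign "b" has only one matching row, so it is suppressed from the result.
rows = [{"campaign": "a"}, {"campaign": "a"}, {"campaign": "b"}]
```

Real clean rooms layer this kind of output control on top of query controls and cryptographic protections, but the suppression principle is the same.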
“Clients can collaborate on a range of tasks, such as more efficient generation of advertising campaign insights and analysis of investment data, while improving data security,” said Dilip Kumar, vice president of AWS Applications.
Proactively act on security data
Organizations want to detect and respond to security risks quickly so they can act fast to secure data and networks.
Yet the data they need for analysis is often spread across multiple sources and stored in a variety of formats.
To facilitate this process, AWS customers can now take advantage of Amazon Security Lake. This service automatically centralizes security data from cloud and on-premises sources into a purpose-built data lake in a customer’s AWS account.
According to AWS, security analysts and engineers can then aggregate, manage, and optimize large volumes of disparate log and event data to enable faster incident detection, investigation, and response.
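Centralizing disparate logs depends on normalizing them into one shared schema (Security Lake uses the Open Cybersecurity Schema Framework, OCSF). The sketch below shows the normalization step in miniature; the field names are simplified illustrations, not the actual OCSF spec, and the source formats are invented for the example.

```python
# Minimal sketch of log normalization: raw events from different sources,
# each with its own field names, are mapped into one common record shape
# so they can be queried together.
def normalize(raw: dict, source: str) -> dict:
    """Map a source-specific raw event into a shared schema (illustrative fields)."""
    if source == "vpc_flow":
        return {"source": source, "src_ip": raw["srcaddr"], "action": raw["action"]}
    if source == "firewall":
        return {"source": source, "src_ip": raw["client_ip"], "action": raw["verdict"]}
    raise ValueError(f"unknown source: {source}")

# Two differently shaped events end up directly comparable after normalization.
event = normalize({"srcaddr": "10.0.0.5", "action": "REJECT"}, "vpc_flow")
```

Doing this mapping by hand for every log source is exactly the "complex and time-consuming" work the managed service is meant to absorb.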
“Customers tell us they want to act on this data faster to improve their security, but the process of collecting, normalizing, storing and managing this data is complex and time-consuming,” said Jon Ramsey, vice president of security services at AWS.
Respond to supply chain complexity
In recent years, supply chains have experienced unprecedented volatility in supply and demand – volatility only exacerbated by widespread resource shortages, geopolitics, and natural events.
Such disruptions push companies to plan for potential supply chain uncertainty and react quickly to changes in customer demand while reducing costs.
But when companies fail to properly anticipate supply chain risks – for example, component shortages, shipping port congestion, unforeseen spikes in demand, or weather disruptions – they can face excess inventory costs or stock-outs. In turn, this can lead to poor customer experiences.
The new AWS Supply Chain simplifies this process by combining and analyzing data from multiple supply chain systems. According to AWS, companies can observe operations in real time, quickly identify trends, and generate more accurate demand forecasts.
“Clients tell us that the undifferentiated heavy lifting required to connect data between different supply chain solutions has inhibited their ability to see and react quickly to potential supply chain disruptions,” said Diego Pantoja-Navajas, vice president of supply chain at AWS.
The new service is based on nearly 30 years of Amazon.com’s logistics network experience, according to the company. It uses pre-trained ML models to understand, extract and aggregate data from ERP and supply chain management systems. The information is then contextualized in real time, highlighting the current inventory selection and quantity at each location.
ML insights indicate potential stock-outs or delays, and users are alerted when risks arise. Once an issue is identified, AWS Supply Chain provides recommended actions — moving inventory between locations, for example — based on percent of risk resolved, distance between facilities, and impact on sustainability, according to AWS.
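A simple form of the stock-out check described above compares on-hand inventory against forecast daily demand. This is an illustrative sketch of that kind of risk flag, not the AWS Supply Chain product logic; the locations, quantities, and threshold are made up.

```python
# Illustrative stock-out risk check: compute "days of cover" (how long
# current inventory lasts at the forecast demand rate) per location and
# flag any location that falls below an alert threshold.
def days_of_cover(on_hand: float, daily_demand: float) -> float:
    """Days until inventory runs out at the given demand rate."""
    return on_hand / daily_demand if daily_demand else float("inf")

def at_risk(inventory: dict, threshold_days: float = 7) -> list[str]:
    """Return locations whose days of cover fall below the threshold."""
    return [loc for loc, (on_hand, demand) in inventory.items()
            if days_of_cover(on_hand, demand) < threshold_days]

# Made-up example: Austin has 30 units against demand of 12/day (2.5 days of
# cover), so it is flagged; Seattle and Boston have comfortable cover.
inventory = {"seattle": (140, 10), "austin": (30, 12), "boston": (90, 9)}
```

A production system would replace the constant demand rate with an ML forecast and rank the recommended transfers, but the alerting principle is the same.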
“As supply chain disruptions continue for the foreseeable future, companies must remain focused on balancing profitability, sustainability and the suitability of their supply networks to support growth,” said Kris Timmermans, Head of Supply Chain and Global Operations at Accenture (an AWS Supply Chain customer).
“Executing a cloud-based digital strategy can enable an agile and resilient supply chain that responds to market changes and customer demands,” Timmermans said.
Also this week at AWS re:Invent, AWS announced five new database and analytics features, five new features for its Amazon QuickSight business intelligence tool, and eight new Amazon SageMaker features.
VentureBeat’s mission is to be a digital public square for technical decision makers to learn about transformative enterprise technology and conduct transactions.