3/13/2023

Way to the woods platforms

Matt Wood, AWS general manager of deep learning and AI, discussed advancements to the AWS AI platform, adoption trends, customer demands and ethical concerns in this interview. AWS has added a batch transform feature to its SageMaker machine learning platform to process data sets for non-real-time inferencing. But despite outlines for customers' use, the AWS AI platform is not immune to growing concerns over the potentially unethical use of these advanced systems. Civil rights advocacy groups worry that technology providers' breakneck pace in delivering AI capabilities, such as Rekognition, could lead to abuses of power in the public sector and law enforcement, among others.

How does the batch transform capability apply to customers trying to process larger data files?

Matt Wood: We support the two major ways you'd want to run predictions. You want to run predictions against fresh data as it arrives in real time; you can do that with SageMaker-hosted endpoints. But there are tons of cases in which you want to apply predictions to large amounts of data, either data that just arrives or gets exported from a data warehouse, or data that is simply too large in terms of raw size to process one record at a time.

We see a lot of customers that want to run billing reports or forecasting. They want to look at product sales at the end of a quarter or the end of a month and predict demand going forward. Another really good example is building a machine learning model and testing it out on a data set you understand really well, which is really common in oil and gas, medicine and medical imaging. These two things, real-time and batch prediction, are highly complementary.

In the keynote, you cited 100 new machine learning features or services since re:Invent last year. What feedback do you get from customers on your current slate?

Wood: What we heard very clearly was a couple of things.
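Batch transform jobs of the kind Wood describes are created through the SageMaker API. Below is a minimal sketch using boto3, AWS's Python SDK; the model name, S3 paths, and instance settings are hypothetical placeholders, not details from the interview.

```python
# Sketch: non-real-time inference over a large data set with SageMaker
# batch transform. All names and S3 URIs here are illustrative.

def build_batch_transform_request(model_name, input_s3_uri, output_s3_uri):
    """Assemble the request for the sagemaker CreateTransformJob API.

    SageMaker reads every object under input_s3_uri, runs each record
    through the already-deployed model, and writes the predictions to
    output_s3_uri, so there is no need to score records one by one.
    """
    return {
        "TransformJobName": f"{model_name}-batch",
        "ModelName": model_name,
        "TransformInput": {
            "DataSource": {
                "S3DataSource": {
                    "S3DataType": "S3Prefix",
                    "S3Uri": input_s3_uri,
                }
            },
            "ContentType": "text/csv",
            "SplitType": "Line",  # treat each CSV line as one record
        },
        "TransformOutput": {"S3OutputPath": output_s3_uri},
        "TransformResources": {
            "InstanceType": "ml.m5.xlarge",
            "InstanceCount": 1,
        },
    }


request = build_batch_transform_request(
    "demand-forecast",                        # hypothetical model name
    "s3://example-bucket/quarterly-sales/",   # hypothetical input prefix
    "s3://example-bucket/predictions/",       # hypothetical output path
)
# To actually submit the job (requires AWS credentials and a deployed model):
#   import boto3
#   boto3.client("sagemaker").create_transform_job(**request)
```

This fits the quarterly-forecasting use case from the interview: export a quarter's sales records to S3, run one transform job over the whole prefix, and read the predictions back from the output path, rather than streaming records through a hosted endpoint.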