AWS’ Roger Barga: ‘Robots Are Part Of A Company’s Digital Transformation’

‘Robots are not just useful for carrying out tasks,’ says Roger Barga, general manager and director of development of Amazon Web Services Robotics and Autonomous Services. ‘They get data digitally about what's going on in your operations, which you can then use to optimize your operations.’

Robots are not just useful for carrying out tasks, said Roger Barga, general manager and director of development of Robotics and Autonomous Services at Amazon Web Services.

“They're actually part of a company's digital transformation,” he said. “They get data digitally about what's going on in your operations, which you can then use to optimize your operations.”

Barga has been in his role for just over one and a half years, leading AWS development of tools to build, simulate, deploy and manage intelligent robotics applications at scale.

His team, which builds on the open-source Robot Operating System (ROS) middleware and contributes open-source software to the newer industrial-grade ROS2 version, released its first cloud robotics service—AWS RoboMaker—in November.

RoboMaker is DevOps for robotics, according to Barga. It provides a robotics development environment for application development, a robotics simulation service to accelerate application testing, and a robotics fleet management service for remote application deployments, updates and management. It extends ROS with connectivity to AWS cloud services including machine learning, monitoring and analytics that allow a robot to stream data, navigate, communicate, comprehend and learn.

AWS recently added two new RoboMaker features. AWS RoboMaker now supports configurable timeouts in over-the-air deployments. Developers can configure the timeout durations of deployment jobs—from one minute to seven days—to prevent undue delays or premature failures so updates and deployments can be completed.
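Those timeout bounds can be sketched with the AWS SDK for Python. The fleet ARN and rollout percentages below are illustrative assumptions, not values from the article, and the actual API call is shown only in comments since it requires AWS credentials:

```python
# Sketch: bounding a RoboMaker deployment-job timeout before submitting it.
SECONDS_PER_DAY = 86_400

def deployment_config(timeout_seconds: int) -> dict:
    """Build a RoboMaker deploymentConfig with a bounded timeout.

    RoboMaker accepts timeouts from one minute to seven days.
    """
    if not 60 <= timeout_seconds <= 7 * SECONDS_PER_DAY:
        raise ValueError("timeout must be between 1 minute and 7 days")
    return {
        "concurrentDeploymentPercentage": 20,  # update 20% of the fleet at a time (illustrative)
        "failureThresholdPercentage": 10,      # abort if more than 10% of robots fail (illustrative)
        "robotDeploymentTimeoutInSeconds": timeout_seconds,
    }

config = deployment_config(2 * SECONDS_PER_DAY)  # a two-day window

# With credentials configured, the job would then be created roughly like this:
# import boto3
# robomaker = boto3.client("robomaker")
# robomaker.create_deployment_job(
#     fleet="arn:aws:robomaker:...:fleet/example",  # placeholder ARN
#     deploymentConfig=config,
#     deploymentApplicationConfigs=[...],           # app, version, launch config
# )
```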

RoboMaker also now supports log-based simulation, event-driven simulation termination and simulation event tagging, allowing developers to easily run a large number of automated simulations for use cases including regression testing, parameter tuning and machine-learning model training.

CRN sat down with Barga this month at AWS’ Seattle headquarters for a wide-ranging conversation about robotics that covered the genesis of the AWS program; ROS and ROS2 and the need for developers who can program them; drones; Amazon’s inaugural re:MARS artificial intelligence event on machine learning, automation, robotics and space; and other topics.

AWS’ Entry Into Robotics

Barga previously was general manager and director of development of Amazon Kinesis, a fully managed streaming data service that makes it easier to collect, process and analyze video and data streams in real time.

While he and his colleagues were thinking through the various use cases of edge processing, their attention shifted to robotics, since Amazon uses robotics in its retail fulfillment centers.

“The edge is getting really interesting in terms of the processing power, like on a Tesla or any other remote device, and increasingly we're going to be having to think about how to program the edge,” Barga said. “How do you control it? How do you give it instructions? How can you do pre-processing? You have GPGPUs [general-purpose computing on graphics processing units] in a Tesla. It can actually do real intelligent processing.”

Barga’s team realized that hardware costs for robots were falling precipitously, but robots primarily were only used by very large companies.

“We dove a little deeper, trying to understand programming the edge, and we realized actually programming and managing and controlling a robot is still very, very hard,” Barga said.

Barga attended the Consumer Electronics Show two years ago, where about 100 companies were demonstrating robots, all of them running on ROS.

“Several of them looked at my badge and said, 'You could help us,’” Barga said. “We talked more about what the pain points they were experiencing were, and you realize that the reason why we don't see more robots being used by the long tail of companies is that they don't have very good tooling for programming robots—for programming the edge and managing and monitoring these and getting value out of it. It's just too hard, unless you have very deep pockets and a very deep team of expertise. This is what motivated our service. We sat down as a team, and we asked them where they spent their time. We saw there was an incredible amount of undifferentiated, heavy lifting happening across these projects. In fact, so much that they had very, very little time for adding stuff that was of unique value to their robot. No wonder we're not seeing a lot of innovation.”


If five people wanted to build a robot together, they’d have to make sure all of their laptops were configured the very same way, and they had all the right binaries. It's not easy, according to Barga, but there are many examples of commercial success.

German carmakers Audi, BMW and Daimler paid $2.9 billion in 2015 for Nokia's HERE digital navigation business that runs on ROS. In 2016, GM acquired autonomous vehicle company Cruise, which also runs on ROS, for an undisclosed amount that was reported to be as much as $1 billion.

“When we talked to [the companies], they said, 'Oh, yeah, it took us about a year to pick up this academic open-source software, test it, harden it, throw out the pieces we didn't need, but make it ready for industrial-grade,'” Barga said. “So we asked ourselves, could we, working with others in the community, build an industrial-grade version of ROS, open source, free and supported by multiple companies like Microsoft, Bosch, Toyota Research, Honda, LG? Just about every company out there that has a big stake in robotics is now contributing to … ROS2, which is industrial-grade ROS. I think of it as like the Linux, if you will, for robots,” he said.

“So we started investing in ROS2,” Barga said. “We built a cloud-based IDE [integrated development environment], so that if we all started the project, we'd start with the very same development environment. We have scalable compute and scalable storage. That way we can actually recompile our programs. We can have five different versions of them in our service. We'd have compute power when I want to compute. It's pay as you go.”

Barga talked with a couple of developers at CES who had integrated Alexa into their robots.

“I thought that was the coolest thing,” he said. “I said, 'What else could you integrate?' Polly, Lex, CloudWatch to monitor where the robot is, Kinesis Video to stream video off of the robot. Instead of making this proprietary, we thought, ‘Why don't we actually add them as ROS nodes?’ That's the open-source version of how you package code in ROS. To a ROS developer, they look very familiar in terms of their APIs. We've hidden all the magic as open source on how to handshake with AWS Cloud. Now a robotics developer is not limited by the CPU power or the software running on the robot. They can actually have a connection back to the cloud and use the cloud and the services in the cloud.”
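The node pattern Barga describes can be illustrated with a toy publish/subscribe sketch. This is a conceptual stand-in for ROS topics, not the actual AWS-provided ROS packages or the real ROS client API; the node and topic names are invented for illustration:

```python
# Conceptual sketch of "a cloud service as a ROS node": a tiny in-process
# publish/subscribe bus stands in for the ROS middleware.
from collections import defaultdict
from typing import Callable

class TopicBus:
    """Minimal stand-in for ROS's publish/subscribe transport."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[str], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: str) -> None:
        for callback in self._subscribers[topic]:
            callback(message)

class SpeechNode:
    """Plays the role of a text-to-speech cloud node (e.g. a Polly wrapper):
    it subscribes to a topic and would forward each message to the cloud."""
    def __init__(self, bus: TopicBus, topic: str = "/speech/say"):
        self.spoken = []
        bus.subscribe(topic, self.on_message)

    def on_message(self, text: str) -> None:
        # A real node would call the cloud API here; we just record the text.
        self.spoken.append(text)

bus = TopicBus()
speech = SpeechNode(bus)
bus.publish("/speech/say", "Obstacle ahead")  # any other node can publish
```

The point of the pattern is that the rest of the robot's software only sees an ordinary node on a topic; whether the work happens on the CPU or in the cloud is hidden behind that interface.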

Simulation As A Service And Over-The-Air Updates

Developers run simulations instead of physically testing robots because the latter is too expensive and can be dangerous, according to Barga. ROS comes with a simulator called Gazebo, which lets developers build 3-D worlds with robots, terrain and other objects and has a physics engine for modeling illumination, gravity and other forces.

“You can put your robot in the 3-D world and run your program, and the simulator will give sensor inputs to the robot like sight, sound, distance measures to walls,” Barga said. “And you can debug it and test it that way to see how it's going to run end to end.”
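The simulate-and-debug loop Barga describes can be reduced to a toy one-dimensional example: a simulated range sensor feeds readings to a controller, which must stop the robot before it hits a wall. This is purely conceptual and uses none of Gazebo's actual API:

```python
# Toy simulation loop: fake sensor input drives a controller decision
# on every physics step, mirroring the test-in-simulation workflow.
def simulate(wall_at: float, speed: float, stop_distance: float,
             dt: float = 0.1) -> float:
    """Step a 1-D robot toward a wall until its sensor says to stop.

    Returns the robot's final position.
    """
    position = 0.0
    while True:
        distance_to_wall = wall_at - position  # simulated range-sensor reading
        if distance_to_wall <= stop_distance:  # controller decision
            return position
        position += speed * dt                 # physics step

final = simulate(wall_at=5.0, speed=1.0, stop_distance=0.5)
assert final < 5.0  # the controller stopped short of the wall
```

Running variations of this loop (different speeds, sensor noise, stop distances) and inspecting the logs for "accidents" is, in miniature, the iterate-until-correct cycle the simulation service supports.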

But small startups said they couldn’t afford team members doing simulations when they were just scrambling to keep up with other tasks. Even large companies, including Amazon Robotics, said they had far too many people managing their servers for simulation.

“So we offer simulation as a service,” Barga said. “They can actually put the robot in simulation and see how it operates and see the log records, see if it has accidents and, if so, debug it and try it again and iterate to make sure the robot runs correctly.”

AWS was prompted to add over-the-air updates for customers’ robotics after learning how developers at CES deployed applications to their robots.

“They said, 'We put our code in GitHub, and our users have to download it to a USB, stick it in the back of the robot and press an update button,’” Barga said. “My Kindle and iPhone update smarter than that.”

A developer instead can register their physical robots with AWS.

“We install AWS Greengrass Core, which … gives a security model,” Barga said. “Only the AWS customer can push code into the robot. Similarly, when the robot decides to call back home to AWS to use some of these services … only that robot that's registered and has permissions can call back.”

AWS launched over-the-air updates at re:Invent in November.

“They can install their application over the air, and they can update it,” Barga said. “This is exactly the same way we update our IoT devices today and pretty much what we use for Kindles as well.”


Amazon held its inaugural re:MARS AI event on ML, automation, robotics and space in June in Las Vegas.

“[Amazon founder and CEO Jeff Bezos] believes, and we believe, that these areas are going to become so big and have such commercial impact, it's time to make it public and start bringing CIOs and CTOs to these conferences, expose them to where the state-of-the-art is today and get a dialogue going,” Barga said.

“Something we heard from customers as well as developers is, 'Hey, it's really hard to get hardware up and running in my robot. It doesn't come with the right software,'” Barga said. “So we announced partnerships with Qualcomm, Nvidia and Intel. That represents about 95 percent of the CPUs that people run on robots. They're integrated with RoboMaker. They've been quality-tested, so we can actually guarantee it's going to work out of the box the way it's advertised.”

Barga Evaluating Supporting Drones

Barga’s team is evaluating supporting drones, he said.

“That's as strong a comment as I'm going to make,” Barga said. “Drones require a lot of simulation, though, because today they're fly-by-wire with a human watching. But they generate so much data. I started to really appreciate the data exhaust coming off of a robot and how it can actually feed the business. In the case of a drone, it's sensor data.”

Barga pointed to an unnamed customer using a commercial drone to fly over the deadly and destructive California wildfires last year.

“They could actually plan and run models saying, 'Here's where the fire is,’ and it's amazing,” Barga said. “A lot of these fires and the associated deaths are because basically they were guessing: They'd look at the weather forecast, they'd guess where the fire was, they'd send somebody to fight it, and they were completely wrong, because the fire was moving much faster. And now they can actually run real-time monitoring every hour. So, 'You need to get out of there. This thing is moving much faster, the wind is changing. The predictive model says you have an hour before this thing is actually going to cut you off.' You start to realize that real-time latency, the data, the processing—who cares about the drones—it's the decisions you're going to make that it gives you.”

Shenzhen, China-based DJI is probably the largest maker of commercial drones coming out of China, and AWS has one of its drones running on ROS.

“The demo we could show is from an iPad or an iPhone,” Barga said. “You could choose a mission that you wanted your drone to run, like an inspection of an oil rig. You could actually run the simulation, and you can see the simulation running like 'Yep, that's exactly what I want it to do.' You could choose the machine-learning model that had been built with SageMaker and push it over the air to the drone. We are running an Nvidia chip, and we installed the machine-learning model from SageMaker on the … chip.”

As the drone physically runs its mission, and its camera detects with high accuracy a leak, the cloud service integration of Amazon CloudWatch would send an alert back to the dashboard saying, “I've detected a leak.” Kinesis Video would start to record and stream the video from the leak back up to the cloud.
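The detect-alert-stream flow can be sketched in a few lines. The function, labels and threshold below are illustrative assumptions; a real deployment would raise the alert through the CloudWatch SDK and start the stream through Kinesis Video Streams:

```python
# Hedged sketch of the on-detection flow: a high-confidence "leak" label
# triggers an alert (CloudWatch stand-in) and starts video streaming
# (Kinesis Video stand-in). Lists stand in for the real service calls.
def handle_frame(label: str, confidence: float,
                 alerts: list, streams: list,
                 threshold: float = 0.9) -> None:
    """React to one frame's classification from the on-board model."""
    if label == "leak" and confidence >= threshold:
        alerts.append(f"Leak detected (confidence {confidence:.2f})")
        streams.append("start-video-stream")

alerts, streams = [], []
handle_frame("leak", 0.97, alerts, streams)  # high-confidence leak: alert + stream
handle_frame("pipe", 0.99, alerts, streams)  # non-leak detection: no action
```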

“We're just trying to show programming the edge,” Barga said. “These are robots, they sense, they compute, they act. We can now program them. We can even reflash the drone with a completely new mission from the ground up. We can change how they operate and get data off of them and stream that back to the cloud.”

Physics Models For Robots

NASA funded Open Robotics, the organization behind Gazebo and ROS, to build an accurate simulation of the International Space Station with 0.01 gravity.

“They actually built a simulation for the moon with the correct lighting, the very bright lighting you get up on the moon. They can test rovers and other things. There are space robots on the Space Station today. They're not doing mission-critical stuff. They're taking photographs, which is what a lot of astronauts spend their time on,” he said.

“The developer who's showing us the new version, which is going up this year—all tested in simulation—he built a completely new propulsion system,” Barga said. “It kind of just sucks in a little bit of air, compresses it ever so lightly and gives it a puff of air to move around. He has completely tested it in simulation with Gazebo with the right gravity model and the right structure from the Space Station. It actually will anchor itself with one arm it has and then start taking photos and then run its mission for the day before going back to a charging station,” he said.

“To be clear, with Gazebo, it's a little bit of plug-and-play,” Barga said. “MIT [professor Russ Tedrake] has developed a robotics simulator for robotic arms called Drake. You can plug in Drake, and it's actually superior to other ones. There's MuJoCo, short for multi-joint dynamics with contact, developed at the University of Washington. All of these physics engines can be selected when you're running your simulation, so you can pick and choose. There are no good ones right now, in my humble opinion, for drones, with all of the air and wind.”

Amazon Prime Air has built such a simulator, according to Barga.

“We're talking with them about taking that physics model and putting it into Gazebo as an option to choose as your physics model,” he said.

‘Robots Are A Part Of A Company’s Digital Transformation’

Australian oil and natural gas company Woodside Energy is an AWS Robotics customer that’s running a very large mining operation Down Under.

It built a robot that would move material from Point A to Point B, so it didn’t need an individual on a Caterpillar doing it. Woodside had the robot up and running within a few months on ROS when it started to realize the value of attaching a camera to it for monitoring pipes and oil pumps to ensure oil reservoirs didn’t get too low.

“They put these images from these oil pumps, oil reservoirs in AWS Ground Truth,” Barga said. “They just got a label training set and ran it through SageMaker to build a supervised learning model. Now, as that thing goes through, it actually will scan, identify, take an image of and classify the oil percentage below some threshold, send an alert back to the operator and say, 'Oil reservoir 20 really needs to be filled pretty quickly. We have a work stoppage,'” he said.

“What's interesting to me about this is you're starting to realize robots are not just useful for carrying out tasks,” Barga said. “They're actually part of a company's digital transformation. They get data digitally about what's going on in your operations, which you can then use to optimize your operations.”

That’s had a big impact on Amazon Robotics, which didn’t know its fulfillment center picking times—how long it took to pick an item and deliver it to a box—because it had humans doing it, according to Barga.

“We didn't want to slow them down during data collection,” Barga said. “But when you have a robot going out … picking and putting it in the box, they can start to see anomalies like, ‘Why did it take this so long?’ We realized we had congestion because it was a hot item, a hot spot.”

Amazon experimented by redirecting robots when they encountered traffic, and the picking time went down.

“They ran simulations, and even some of our best-thought-out layouts for fulfillment centers actually were not what ended up actually being chosen,” Barga said. “That started to show the role of simulation to do 'what if.' Now that I've got my robot models, and the physics are all correct, I can actually do 'what if' analysis and come up with a new design. It's fascinating how this is going to be used.”

Still Early Stage

In terms of Amazon scale, AWS Robotics and Autonomous Services is still in the early stage, according to Barga, who noted that AWS doesn't disclose actual hard numbers.

“However, as a business, we try to project where we think we should be a year or two years down,” Barga said. “We're well within those bounds, which is great to see. That tells us the product is finding a market. We're seeing large companies, which previously were running on their own software stack, saying, 'We're moving to ROS,’” he said.

“One of the interesting opportunities we have here [is] we have so much in the depth and breadth of robotics experience—from drones to robotic arms to mobile bases—just to tell us what was hard, what works, where could we innovate, where's it going in the future, and what should we take advantage of,” he said.

Educational Outreach

AWS has been doing educational outreach to address the lack of developers who know how to program ROS.

“We have been making huge investments in educational outreach,” Barga said. “The numbers are there, the customers are there. What we've been surprised with is some of our enterprise customers have approached us and said, 'Love RoboMaker, great DevOps for robotics. I can't hire these people. I want a near-finished robot.’” AWS now is working with partners that will launch with RoboMaker.

“We're seeing an appetite for this in enterprise companies who will never have a big robotics team,” Barga said. “They don't want to start from scratch. Our first target customers are robotics developers who want to run faster, operate at scale. Enterprises weren't near ready, so we're augmenting our partner strategy not [only] to work with the Qualcomms, the Nvidias and the Intels, but [with] entire robotics-based companies.”

The most innovative and cutting-edge applications of robotics are taking place at universities, and that's where many startups begin, so AWS also wants to get plugged into that community, Barga said.

“But we also started hearing and sort of having a concern ourselves: If this takes off the way we think it's going to take off in the next three to five years—where every company out there…[is] going to have automation projects—where the heck are all the developers going to come from?” Barga said. “So we started an outreach program working with AWS Educate with 20 schools such as RMIT, Cambridge University in the U.K., Northeastern University, University of Texas at Austin, Arizona State University.”

AWS created a learning badge on AWS Educate with free access to anyone with a .edu account.

“They can go end to end through a series of labs, exercises, lessons on how to program a robot with ROS, all testing and simulation, but it can be deployed to a TurtleBot, which is the most commonly used educational research robot,” Barga said. “We gave each of those schools 20 TurtleBots so they can use them in their curriculum.”

In April, AWS Educate launched a RoboMaker Classroom to give students, researchers and educators a new way to accelerate robotics learning. Educators can request AWS Promotional Credits and set up virtual environments for students to use AWS RoboMaker in projects and assignments.

“To date, 100 faculty members have created Classrooms,” Barga said. “Over 4,500 students have actually been invited to join those Classrooms to learn robotics. We're now talking to faculty members about curriculum that we can build together for learners to actually push even further and deeper into robotics.”

“We have to have the ability to have thousands of people trained and programming ROS robots,” Barga said. “We need to make sure there's a pipeline of qualified engineers that can actually pick it up and adeptly work at it.”

LEA By Robot Care Systems

Netherlands-based Robot Care Systems was a launch customer for AWS RoboMaker.

The company’s LEA (Lean Empowering Assistant) is a walker robot for the elderly and disabled.

“Immediately that jars your thoughts about what a robot is—where are the arms, where are the legs?” Barga said. “LEA has 72 sensors on her, very low CPU power. All those sensors are being used to keep the patient safe. There's no CPU overhead for running a lot of extra software. They have to keep the price point low if they're going to run a productive business.”

The company was running ROS right out of the box.

“They start with ROS, they harden it, they very easily moved over to RoboMaker, they integrated our cloud service extensions,” Barga said. “They immediately started running simulation of patients with different height, different weight, different room configurations to make sure the parameters for LEA were still safe. They'd never been able to run simulations like that before. They integrated CloudWatch, they integrated Kinesis, they integrated Polly and Lex. Customers can actually talk to the robot and command LEA to come to them from across the room.”

If LEA senses a patient isn't stable, it will give verbal commands. It can even go into a dance mode for exercise and give commands and encouragement to the customer.

“They're also streaming data off of the robot on activity so a doctor or caregiver or a loved one can actually monitor the activity,” Barga said. “And Robot Care Systems is now thinking about an iPhone app for the doctor. How active was the patient over the week, over the day? They're actually looking at predictive models. If they're predicting a patient is recovering, are they recovering, are they regressing, should we step in here? What they're thinking is taking that data feed and using it to actually get insights. It's really clever.”

AWS Robotics Consulting Or Certifications?

AWS Robotics and Autonomous Services doesn’t offer consulting services, but it’s under consideration.

“We may in the future to help on-board our customers,” Barga said. “We're trying to work through partners right now.”

AWS is now talking to customers about how they can operate robots at scale.

“How can you get more value out of your robot—the operations part,” Barga said. “As we look forward, we'll be looking at more services to help you operate a robot and ask the question, 'How can we allow small startups to stand on our shoulders and run much faster? What should we be putting in our services to allow startups to move faster?’ There's a few things in that basket.”

AWS also is thinking about robotics certification.

“We're thinking about that and what it would mean, without being too onerous,” Barga said.

Reviews of ROS packages—similar to retail reviews—also are being contemplated.

“Imagine looking at all the ROS packages, and somebody says, 'Thumbs up, it worked for me,'” Barga said. “That might actually work, where someone says, 'This ROS package is trusted.' Given RoboMaker and actually seeing all these packages and how they're used, we can start making recommendations—'Eighty percent of the customers choose this ROS node package or use this sensor.’ Just trying to see if we can actually get some feedback that way. I want to help developers make quick decisions faster and better.”