You see robots jumping, bouncing, somersaulting, falling down and standing back up on social media every day. We see dreams of what life will be like in the smart homes of the future. We see in movies how autonomous cars and unmanned aerial vehicles (UAVs) can change our lives. So how is this future being built in front of us?
Software developers, entrepreneurs and corporations, persuaded by platform companies such as Sun Microsystems, Digital, IBM, Microsoft, Oracle, Google, Apple, Amazon and Facebook to believe in a shared future, are building the robots, communication networks, autonomous cars, smart homes and smart objects of the future. Some of these platform companies have since disappeared, while others have not even appeared yet; some are waiting for the right moment to transform. This shows us that no single person, institution or ecosystem builds the future alone. When the energies of ecosystems combine, the power to transform the future emerges. The passion of believing in a shared future is what changes and transforms.
Platform companies that extract simple services from complex infrastructure are transforming life through concepts we do not yet know. Not only that, of course – we must also be able to trust that these platform companies are fair, treat everyone equally, are accountable and listen well. In other words, we are talking about platform companies that are structured like countries. We can judge these platform companies by their public structures and by their accountability and transparency toward investors, regulatory agencies and the media. A platform company that constantly changes the rules of the game cannot build the future. People follow consistency; inconsistent platforms cannot survive.
AI IS EVERYWHERE
In everyday life and in the media, these technologies are woven into the advertising messages of products, some of which transform our lives even without touching the consumer directly. For example, there is artificial intelligence (AI) software behind the ability to edit a picture in seconds. Many smartphones and cameras use AI. Washing machines, televisions, dishwashers and other devices that learn with AI technology are emerging. To manage all of these devices without delay, organizations use complex cloud technology in the background. They want scalable, continuously available structures with a pay-as-you-go option to simplify their work. The people who build all of this are called software developers.
Software developers also build one-click shopping services. Platform companies or ecosystems are building the future, bringing together software developers from all over the world who transform the most complex technologies into the services we use at our fingertips.
One of these platform companies is Amazon Web Services (AWS), Amazon's most profitable division, with $28 billion in revenue in 2018 – the part of the iceberg below the waterline. AWS, which grew out of Amazon's search for solutions to its own infrastructure problems and has become the best implementer of cloud technology, now owns about 50% of the cloud market. Moreover, the cloud market has realized only 3% of its potential.
ROBOTS PAVE THE WAY FOR HUMANS
Most news reports say that robots are taking people's jobs away. Expert commentary, however, does not point to such bleak scenarios: robots help patients and the elderly walk, educate children and make life easier in general.
You have watched, hundreds of times, the social media videos of robots that open and close doors, do somersaults and keep running even after falling to the ground. You may also have seen videos featuring unmanned vehicles as the vehicles of the future. One of the companies building this future is Amazon Web Services, a technology company that operates independently of Amazon. We met Roger Barga, general manager of AWS Robotics and Autonomous Services, at re:Invent, which brought together 65,000 software developers, entrepreneurs and IT professionals who believe in that future. Barga spoke about the re:Invent meetings, the story of robots, autonomous devices and the future.
ROBOTS ARE BEING BUILT
Robots are trained with machine learning in simulation before they make the leap from simulation to reality. Today, anyone who uses DeepRacer is actually using RoboMaker, because the car's track is simulated in RoboMaker. The car, of course, crashes, breaks, fails to complete the course, receives negative reinforcement, learns and improves – all processes that happen in RoboMaker. And finally, when it is time to put the model to work, RoboMaker's deployment service pushes the updated program down into the car, and the car sets off into the real world – the intersection of interesting applications and technology.
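The negative reinforcement described above is expressed as a reward function that the simulator evaluates on every step of training. The sketch below follows the shape of the DeepRacer reward-function interface (a `params` dict describing the car's state); the specific weighting is illustrative, not the model AWS ships:

```python
# Illustrative DeepRacer-style reward function: the simulator calls this
# on every step with a dict describing the car's state on the virtual track.
def reward_function(params):
    # Subset of the state the simulator reports (names follow the
    # DeepRacer interface; treat the selection here as an assumption).
    all_wheels_on_track = params["all_wheels_on_track"]
    distance_from_center = params["distance_from_center"]
    track_width = params["track_width"]

    # Leaving the track earns a near-zero reward - the negative signal
    # through which the simulated car "learns" from its failures.
    if not all_wheels_on_track:
        return 1e-3

    # Reward staying near the center line: full reward at the center,
    # tapering off toward the track edges.
    half_width = track_width / 2.0
    reward = max(1e-3, 1.0 - distance_from_center / half_width)
    return float(reward)
```

Over many simulated laps, the model converges toward driving behavior that maximizes this cumulative reward, and only then is it deployed to the physical car.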
AUTOMOTIVE GIANTS EYE ACADEMIA-BORN STARTUPS
To give a wider perspective, AWS RoboMaker builds on a system called the Robot Operating System (ROS), which is open-source software. ROS was originally developed for academic research 10 years ago. But guess what? That ROS software was used by Cruise, which General Motors purchased for about $1 billion. Cruise's autonomous car was built entirely on ROS. HERE is another company, acquired by Daimler, BMW and Audi; again, an autonomous car and data-collection business, all built on ROS. APAC, another autonomous car company working on artificial intelligence, is a further known example of ROS use. The ROS-based code that we use in RoboMaker, and that our customers use as a base, has become a commercial application many times over. In fact, there is an organization called the ROS-Industrial Consortium, made up of companies that contribute code and money to develop commercial ROS software packages. One of the packages the consortium built this year is an open-source library for sanding and polishing; another is for drilling. When you look at ROS, you see commercially used open-source software that is protected and contributed to by large industrial companies – think of AWS, Bosch, LGE, Toyota and Honda, among many others. Companies active in robotics help preserve the essence of ROS, especially ROS 2.
ROBOTIC ECOSYSTEM HELPS PATIENTS WALK
There are consortia, such as ROS-Industrial, to which hundreds of companies contribute money and open-source software support for its packages. Developers can choose whichever packages they want. This is the basis of the service we provide, but we take it further: we add open-source software that allows the robot to connect back to the cloud, meaning to AWS services. A startup from the Netherlands has built LEA, a walker robot for the elderly and disabled. It has 72 sensors, mainly focused on detecting puddles and obstacles, and it aims to keep the patient safe while walking. LEA's motors can determine the speed at which the patient is walking and adjust themselves accordingly: they slow the patient down if they walk too fast, and they sense whether the patient's balance is off.
LEA: A ROBOT FOR THE ELDERLY
The entire operating system has to work together to keep the patient safe. But the developers also wanted LEA to support more natural voice interaction. They wanted customers to be able to talk to LEA, saying, "LEA, come here." Perhaps the patient has left the room, LEA is on the other side of it, and they cannot walk over to it. The developers wanted LEA to respond to commands such as "Walk slower" and "Shall we move?"
They actually have a dance program, because they wanted to encourage people to be active and exercise. However, they could not add these capabilities to the robot themselves, because they are not specialized in machine learning. So they used the open-source extensions that link the robot to Amazon Polly and Amazon Lex. We got them started with the solution architecture, and within four hours they could talk to LEA. They showed us that this could be done.
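In the real system, Lex performs the natural-language understanding in the cloud and Polly synthesizes the spoken replies. As a toy stand-in for that pipeline, the sketch below maps a recognized utterance to a robot action; the command phrases and action names are illustrative assumptions:

```python
# Toy stand-in for cloud-based intent resolution: map a transcribed
# utterance to a robot action. In production this mapping is what a
# service like Amazon Lex returns; the table below is illustrative.
COMMANDS = {
    "come here": "NAVIGATE_TO_USER",
    "walk slower": "REDUCE_SPEED",
    "shall we move": "START_WALKING",
}

def resolve_intent(utterance):
    # Normalize: lowercase and drop trailing punctuation.
    text = utterance.lower().strip().rstrip("?.!")
    # Strip the wake word if present ("LEA, come here" -> "come here").
    if text.startswith("lea,"):
        text = text[len("lea,"):].strip()
    return COMMANDS.get(text, "UNKNOWN")
```

The point of delegating this to the cloud is that the robot's onboard software stays simple while the language model improves independently.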
Another thing they wanted was live telemetry. Did the patient walk? Is the patient slowing down? Did the patient fall, stumble or stagger? For this, we used one of our cloud extensions to stream data from LEA over MQTT into CloudWatch. There is now a control panel for each patient, and doctors can monitor activity and incidents, meaning that they have real-time dashboards showing what is going on. And finally, they wanted visual interaction, with the option of adding a video camera and a screen. We used our Kinesis Video Streams adapter to stream video to the cloud. Nothing extra runs on the device or the robot, yet patients can now talk to their loved ones, call for help or talk to their doctors to get advice. All of this was done in a period of two weeks. It is currently in production, and you can see how the cloud expands what LEA can do. This is just one of the examples I can give.
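MQTT-based telemetry of this kind amounts to publishing small JSON messages on per-device topics, which the cloud side then aggregates into dashboards. The sketch below shows what one such message might look like; the topic layout and field names are assumptions for illustration, not the actual LEA or AWS schema:

```python
import json
import time

# Illustrative sketch of a telemetry message a walker robot might publish
# over MQTT for a per-patient dashboard. Topic scheme and field names are
# hypothetical, not the real LEA/AWS schema.
def build_telemetry_message(patient_id, event, speed_mps):
    topic = f"walkers/{patient_id}/telemetry"  # hypothetical topic layout
    payload = json.dumps({
        "patient_id": patient_id,
        "event": event,          # e.g. "walking", "slowing", "fall_detected"
        "speed_mps": speed_mps,
        "timestamp": int(time.time()),
    })
    return topic, payload
```

An MQTT client on the robot would publish these messages to the broker, and a cloud rule would route them into the monitoring service that feeds each patient's dashboard.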