Here’s how Amazon’s Alexa AI is helping NASA become smarter at work
Alexa can help NASA employees scan through 400,000 sub-contracts and get the requested copy of the contract from the data-set right on the desktop in a jiffy.
Updated: Jun 27, 2018 13:53 IST
While you are busy giving Alexa commands to play your favourite song or book an Uber, the cloud-based voice service from Amazon is helping the US space agency organise daily tasks and make sense of intricate data-sets.
According to Tom Soderstrom, IT Chief Technology and Innovation Officer at NASA's Jet Propulsion Laboratory (JPL), voice as a platform will become the next big thing once we learn to talk to digital assistants and chatbots the way we do with friends and family.
“If you have an Alexa-controlled Amazon Echo smart speaker at home, tell her to enable the ‘NASA Mars’ app. Once done, ask Alexa anything about the Red Planet and she will come back with all the right answers,” Soderstrom said during the Amazon Web Services (AWS) public sector summit in Washington.
“This enables serverless computing where we don’t need to build for scale but for real-life work cases and get the desired results in a much cheaper way. Remember that voice as a platform is poised to give 10 times faster results,” Soderstrom noted on the inaugural “Earth and Space Day”.
Serverless computing allows people to build and run applications and services without having to provision, scale or manage any servers. AWS is a market leader in this segment, providing a set of fully-managed services to clients in the public sector, thus allowing them to focus more on product innovation.
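To make the idea concrete, here is a minimal sketch of a serverless request handler written in the style of an AWS Lambda function. The event shape and the contract data are illustrative assumptions, not NASA/JPL's actual code; the point is that the function contains only business logic, with no server to provision or manage.

```python
# Minimal AWS Lambda-style handler. The cloud provider invokes
# handler(event, context) on demand and scales it automatically;
# the developer never touches a server.
# The CONTRACTS data and event fields below are invented for
# illustration only.

CONTRACTS = {
    "JPL-2018-0042": "Thermal shielding sub-contract",
    "JPL-2018-0137": "Ground antenna maintenance sub-contract",
}

def handler(event, context):
    """Look up a requested contract and return an HTTP-style response."""
    contract_id = event.get("contract_id", "")
    document = CONTRACTS.get(contract_id)
    if document is None:
        return {"statusCode": 404, "body": "Contract not found"}
    return {"statusCode": 200, "body": document}
```

Because the handler is just a function, it can be billed per invocation rather than per server-hour, which is the "much cheaper" economics Soderstrom refers to.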
Alexa, for example, can help JPL employees scan through 400,000 sub-contracts and get the requested copy of the contract from the data-set right on the desktop in a jiffy.
“It is kind of a virtual helpdesk. Alexa doesn’t need to know where the data is stored or what the passwords are to access that data. She scans and quickly provides us what we need. The only challenge now is to figure out how to communicate better with digital assistants and chatbots to make voice as a more powerful medium,” emphasised Soderstrom.
Pasadena, California-based JPL is a federally-funded research and development centre, managed for NASA by the California Institute of Technology (Caltech), that carries out key robotic space and Earth science missions.
The facility currently has nearly 6,000 employees. According to Soderstrom, there are six technology waves that will primarily force developers to create better solutions before those technologies reach everyday consumers.
These waves are “New Habits” (the always-connected workplace, gaming); “Applied AI” (machine learning, chatbots, automation, analytics); “Ubiquitous Computing” (mobile, smart devices, IoT, augmented reality); “Cyber Security Challenges” (including blockchain); “Accelerated Computing” (serverless, edge computing); and “Software-Defined Everything” (networks, DevOps, open source, and application programming interfaces or APIs).
“It is crucial to make the right combinations among these technologies and move forward. Cloud computing forms the base for these emerging technologies to work,” said Soderstrom.
JPL has integrated its conference rooms with Alexa and IoT sensors, which help staff resolve queries faster. The NASA-JPL executive stressed that AI is not going to take away jobs and will instead supplement humans in the future.
“AI will transform industries ranging from healthcare and retail to e-commerce, auto and transportation. Sectors that don’t embrace AI will be left behind,” Soderstrom added. AI-powered digital assistants will be the key to enhancing productivity.
“Humans are 80 per cent effective and machines are also 80 per cent effective. When you bring them together, they’re nearly 95 per cent effective,” he noted. The next technological tsunami, Soderstrom said, will come in the form of built-in intelligence everywhere and the world needs to prepare itself to handle that.