A peek at artificial intelligence in action at NASA Jet Propulsion Laboratory

At JPL, colleagues are chatting with intelligent digital assistants to get answers and collaborate. Here are a few examples.

At NASA’s Jet Propulsion Laboratory, we constantly assess and experiment with emerging technologies. We have identified six technology waves of the future, which together build to form one giant tsunami we call "Built-in Intelligence Everywhere."

This giant tsunami comprises Cybersecurity Challenges (e.g. blockchain); Accelerated Computing (e.g. quantum computing); Software-Defined Everything (e.g. software-defined networking and APIs); Ubiquitous Computing (e.g. the Internet of Things and augmented reality); New Habits (e.g. the always-connected workplace and the sharing economy); and Applied AI (e.g. machine learning and chatbots).

This article will explore the Applied AI technology wave.

One manifestation of this built-in intelligence is the rise of the Intelligent Digital Assistant (IDA). IDAs will benefit us in both the near and long term. To be truly useful, they need to be easy and natural to interact with, and they need to get smarter over time as we add intelligence.

[ How well do AI chatbots handle IT help desk tasks? Read also: AI chatbots for IT support: A CIO's lessons learned. ]

At JPL, we’ve built IDAs using natural user interfaces, natural language understanding, and APIs. We are using Amazon’s Alexa and Lex to let employees hold a conversation with an IDA and get answers to questions almost instantly by speaking, typing, or texting with it. These IDAs are saving us massive amounts of time and enabling collaboration in ways we’ve never experienced before.
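To make that concrete, here is a minimal sketch of what a Lex (V1) fulfillment function can look like in Python. The FindConferenceRoom intent and the find_available_room helper are hypothetical stand-ins; JPL's actual bot logic is not public.

```python
# Minimal sketch of an Amazon Lex (V1) fulfillment Lambda in Python.
# The intent name and find_available_room() are hypothetical stand-ins.

def find_available_room(building):
    # Placeholder: in practice this would query a room-scheduling API.
    return {"name": "180-101", "building": building or "180"}

def lambda_handler(event, context):
    intent = event["currentIntent"]["name"]
    slots = event["currentIntent"]["slots"]

    if intent == "FindConferenceRoom":
        room = find_available_room(slots.get("Building"))
        answer = f"Room {room['name']} in building {room['building']} is free."
    else:
        answer = "Sorry, I can't help with that yet."

    # Lex expects a dialogAction that closes the intent and carries the
    # text to speak, display, or send back over chat.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": answer},
        }
    }
```

Because Lex and Alexa can sit in front of the same fulfillment logic, one handler like this can serve voice, web chat, and text interchangeably.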

Meet JPL’s intelligent digital assistants

When we first began to experiment with digital assistants, we focused on making them easy to use. You can speak to Alexa via an Echo, type questions into websites or Slack, or text from your mobile device. That also gives you multiple ways to receive answers: you can listen to what Alexa has to say, see the answers on a display, or interact with them via a touchscreen.

In practice, these digital assistants are providing tremendous value across a variety of use cases. And we’re just at the starting line. How about a few examples?

Employees can use an IDA to quickly find available conference rooms (our most-requested IDA), enable the technology within those rooms, and, most importantly, receive instant answers to complicated, data-driven questions.

For example, once the IDA has found a conference room, an employee can ask Alexa to turn on video conferencing. This solves the problem of remembering how to operate the various video and audio inputs and remotes.

In that meeting, you might ask Alexa to show a data set, and it will instantly appear on the screen. Because this happens during a video conference, both parties can see it. You can also say, “Alexa, show me the latest security threats,” and that data will pop up on the display.
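Under the hood, requests like these can be handled by a custom Alexa skill, which is essentially an intent router. The sketch below shows one plausible shape for such a skill; the room-control endpoint, room ID, and intent names are assumptions, not JPL's actual services.

```python
# Sketch of a bare Alexa custom-skill Lambda (no SDK) routing two
# hypothetical intents: starting video conferencing and putting data on
# the room display. The room-control endpoint is an assumption.
import json
import urllib.request

ROOM_CONTROL_API = "https://rooms.example.internal"  # hypothetical endpoint

def call_room_api(room_id, action, payload=None):
    # POST a command to the (assumed) room-control REST service.
    req = urllib.request.Request(
        f"{ROOM_CONTROL_API}/{room_id}/{action}",
        data=json.dumps(payload or {}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def lambda_handler(event, context):
    intent = event["request"]["intent"]["name"]
    if intent == "StartVideoConference":
        call_room_api("180-101", "start-vtc")
        speech = "Starting the video conference."
    elif intent == "ShowSecurityThreats":
        call_room_api("180-101", "display", {"dataset": "security-threats"})
        speech = "Here are the latest security threats."
    else:
        speech = "I didn't catch that."
    # Standard Alexa Skills Kit response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```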

Beyond simplifying technology logistics, the IDAs can instantly search data for answers that would otherwise take a long time to uncover. You can ask Alexa questions about hundreds of thousands of subcontracts and drill down further by asking, “Who is the owner of subcontract XYZ? Show it to me. Zoom in. Zoom out. Combine it with other data.”
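Follow-up commands like “Show it to me” work because the assistant carries context between turns. Here is a hedged sketch of that pattern using Alexa session attributes; the toy subcontract data and intent names are invented for illustration.

```python
# Sketch of drill-down questions via Alexa session attributes. The toy
# subcontract data and intent names are invented for illustration.

SUBCONTRACTS = {"XYZ": {"owner": "J. Smith"}}  # stand-in for a real data source

def lambda_handler(event, context):
    session = event.get("session", {}).get("attributes") or {}
    intent = event["request"]["intent"]["name"]

    if intent == "GetSubcontractOwner":
        sub_id = event["request"]["intent"]["slots"]["SubcontractId"]["value"]
        record = SUBCONTRACTS.get(sub_id, {})
        session["current_subcontract"] = sub_id  # remember for follow-ups
        speech = f"The owner of subcontract {sub_id} is {record.get('owner', 'unknown')}."
    elif intent == "ShowIt":
        # "Show it to me" resolves "it" from the stored conversation context.
        sub_id = session.get("current_subcontract", "unknown")
        speech = f"Showing subcontract {sub_id} on the display."
    else:
        speech = "I didn't catch that."

    return {
        "version": "1.0",
        "sessionAttributes": session,  # carried into the next turn
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": False,  # keep the conversation open
        },
    }
```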

One key: Keeping the user interface constant

The real magic happens when you overlay a digital assistant with artificial intelligence. You can then make the IDA smarter in the background over time, without having to change the user interface.

In the example where the IDA found an available conference room (version 1), users can now ask for more: rooms in nearby buildings (version 2), how many people will fit in a room (version 3), and what equipment is in the room (version 4). Note that the iteration from version 1 through version 4 took only a few days and happened without release notes or fanfare.
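In code, each of those "versions" can be as little as one more handler registered behind the same voice interface; nothing the user sees has to change. A minimal sketch, with hypothetical handler and intent names:

```python
# Sketch of how the assistant "gets smarter" without a UI change: each
# version simply registers more intent handlers behind the same voice
# interface. All names here are hypothetical.

def find_room(slots):          # version 1
    return "Room 180-101 is available."

def find_room_nearby(slots):   # version 2
    return "Room 301-227 in the next building is available."

def room_capacity(slots):      # version 3
    return "Room 180-101 seats twelve people."

def room_equipment(slots):     # version 4
    return "Room 180-101 has a projector and video conferencing."

HANDLERS = {
    "FindConferenceRoom": find_room,
    "FindNearbyRoom": find_room_nearby,
    "GetRoomCapacity": room_capacity,
    "GetRoomEquipment": room_equipment,
}

def dispatch(intent_name, slots):
    handler = HANDLERS.get(intent_name)
    return handler(slots) if handler else "I can't help with that yet."
```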

We’re building these chatbots and intelligent digital assistants at breakneck speed because the use cases are so compelling and they provide so much value. Once you’ve figured out the recipe, building upon them isn’t difficult. One caveat: You must have access to accurate data. The IDAs are getting better and better all the time, without changing the user interface.

That last piece is important. Enterprise IT has become so complex that we need to drastically simplify the user experience. Users shouldn’t have to remember which URLs to type, whether a password is needed to log in, or a specific way to phrase a question. Instead, they just want to have a conversation with the system and get the help they need when they need it. This lets people focus on solving their problems (not the IT specifics), which makes them more effective and efficient.

Realizing the value

The benefits of IDAs are wide-reaching: They already work on behalf of the human, searching billions of records and retrieving answers from hundreds of data sources. Again, we have just begun the journey.

Traditionally, if a spacecraft on the ground needed a particular part, someone had to place a call, and someone else had to answer that call, look up the part, and report where it was. Now we can build an IDA to do that.

It’s not just us at JPL who will find value in intelligent digital assistants: They’re applicable to every business. If your help desk is bogged down answering questions, consider developing an IDA to handle your most-asked ones. This relieves your help desk personnel, freeing them to tackle more difficult questions, and can spare you the cost of hiring additional staff.

Testing intelligent digital assistants and striving for ubiquitous built-in intelligence shouldn’t intimidate you: It’s a valuable capability that builds on itself and enables businesses to get better answers more quickly and efficiently. And we will benefit from industry’s rapid advancements in natural language processing.

If you’re a CIO and you see a use case for IDAs, it’s a matter of connecting inexpensive sensors in the background, using a natural user interface in the foreground, tying in an API, and making it smarter over time by connecting more data and benefiting from artificial intelligence. Who can do it for you? Chances are there are people in your organization already implementing home automation and analytics, and that’s a great place to start.
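As a toy illustration of that recipe (every name here is invented): sensor events land in a data store in the background, while the natural user interface in the foreground answers from whatever data is freshest.

```python
# Toy end-to-end sketch of the recipe: sensors feed a data store in the
# background; the natural-language front end answers from it. All names
# are illustrative, not a real deployment.
import time

ROOM_OCCUPANCY = {}  # data store fed by sensors

def on_sensor_event(room_id, occupied):
    # Background: inexpensive occupancy sensors push events here.
    ROOM_OCCUPANCY[room_id] = {"occupied": occupied, "at": time.time()}

def answer(question_intent):
    # Foreground: the natural user interface asks for free rooms.
    if question_intent == "FindConferenceRoom":
        free = [r for r, s in ROOM_OCCUPANCY.items() if not s["occupied"]]
        return f"Free rooms right now: {', '.join(free) or 'none'}."
    return "I can't help with that yet."

on_sensor_event("180-101", occupied=False)
on_sensor_event("180-102", occupied=True)
print(answer("FindConferenceRoom"))  # -> Free rooms right now: 180-101.
```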

[ Want lessons learned from CIOs applying AI? Get the new HBR Analytic Services report, An Executive's Guide to Real-World AI. ]

Tom serves as the Chief Technology and Innovation Officer in the Office of the CIO at NASA's Jet Propulsion Laboratory, where his mission is to identify and infuse new IT technologies into JPL's environment. He has led remote teams and large-scale IT best practices development and change efforts in both small startup and large commercial companies, in international venues, and in the US Government.