When NASA debuts its massive new deep-space rocket in the coming months, a familiar voice assistant and a video teleconferencing tool will be along for the ride. A version of Amazon’s Alexa voice assistant and Cisco’s Webex videoconferencing platform will be included on the flight to space, part of a technology demonstration to see if these tools might benefit future astronauts flying to distant destinations like the Moon and Mars.
The upcoming flight is known as Artemis I, and it’s the first test mission in a series of flights planned for NASA’s Artemis program — an initiative to send the first woman and the first person of color to the surface of the Moon. Tentatively set for March, Artemis I will mark the inaugural flight of NASA’s next-generation rocket, the Space Launch System, or SLS, a gargantuan rocket that NASA has been developing for the last decade. The SLS is designed to launch people and cargo into deep space, with passengers riding on top of the vehicle in a new crew capsule called Orion, developed by Lockheed Martin.
For Artemis I, SLS will launch an Orion crew capsule around the Moon on a weeks-long flight — the first time the two vehicles will fly to space together. This is a critical test launch, so no people will be flying inside Orion, save for a mannequin. However, the fake passenger will have some machine companions. Lockheed Martin teamed up with Amazon and Cisco to mount a “human-machine interface” in the spot where Orion’s control panel will be in the future. Called Callisto after the companion of Artemis in Greek mythology, the box will have a voice-activated Alexa speaker, with its iconic blue ring light, and an iPad that runs Webex.
At some point during the Artemis I mission, people on the ground will test out the box, as if astronauts are interacting with the speaker and the screen on board Orion. Ultimately, Lockheed Martin, Amazon, and Cisco want to see if such an interface would be beneficial for future deep space travelers.
“We... envision a future in which astronauts could turn to an onboard artificial intelligence for information and for assistance and ultimately for companionship,” Aaron Rubenson, vice president of Alexa Everywhere at Amazon, said during a press briefing. “You could easily imagine astronauts turning to this onboard AI to talk about the status of a subsystem or maybe controlling the lights in the cabin or asking for a particular camera view.”
To see if these tools work, Lockheed Martin will employ “virtual crew members” on the ground. While Orion is in space, a person in NASA’s mission control center in Houston will give a command to Alexa. That person’s voice will play out over a speaker inside Orion in order to activate Alexa. The virtual crew members will ask for certain types of information, such as the speed Orion is moving through space or how long until the capsule performs its next thruster burn. Alexa is designed to pull real-time data from Orion in order to answer those questions through its speaker.
The Orion spacecraft is equipped with Wi-Fi, but since the vehicle will be hurtling through space away from Earth during the demonstration, internet connectivity is going to be limited. As a result, Alexa won’t be able to rely on the internet during the flight to answer some of the immediate questions from the virtual crew members. Instead, Amazon designed this Alexa with a system known as “local voice control,” allowing it to respond to a wide variety of predetermined commands without a cloud connection. “There are hundreds of parameters, thousands of utterances, where we’ll be able to get that real-time access,” Rob Chambers, director of commercial civil space strategy at Lockheed Martin, said during the briefing.
Virtual crew members will also ask Alexa to change the lighting inside Orion. “It is the iconic use case of Alexa, at least around my house,” said Chambers. Lockheed Martin has installed a separate LED lighting system inside the capsule behind the panel display, which Alexa should be able to control. Lockheed Martin has also mounted a few microphones and cameras throughout Orion’s cockpit, as well as a virtual reality camera, to record the demonstration and make sure the box works during the mission.
The last test will see if the Webex platform works. Virtual crew members on the ground will appear on the iPad screen inside Orion and hold a video conference during the flight. Of course, poor internet connectivity will likely be a problem with this one, too. “There will be a lot of packet loss compensation technology because your network connectivity is going to be not as reliable as what you have,” Jeetu Patel, executive vice president and general manager of security and collaboration at Cisco, said during the briefing. “And so we have to make sure that that’s factored in.” Cisco envisions this tool could be used by astronauts to videoconference with members of mission control or perhaps loved ones on the ground while traveling through space.
However, Callisto is first and foremost a technology demonstration, and there are currently no plans to fly the box on future missions with Orion. The next flight after Artemis I is Artemis II, which will actually have astronauts on board Orion flying around the Moon. If Callisto does turn out to be a success, it’s possible a future version of the system will make it onto upcoming Artemis missions but in a very different form. “We’re discussing with NASA the other applications of this,” Chambers said.
The partners already have grand visions for what future Callisto systems could do, from controlling timers, video displays, and cameras inside a spacecraft cockpit to adjusting ambient temperatures. “We see the value now,” said Chambers. “We can start working with members of the space industry to figure out what are the most valuable things that should buy their way onto this capability.”