NASA is collaborating with Google to test an AI-powered medical assistant that could support astronauts on missions beyond low Earth orbit, such as missions to the Moon and Mars.
NASA, which is committed to a new era of human spaceflight through its Artemis program, is working with Google to test a proof of concept called the Crew Medical Officer Digital Assistant (CMO‑DA), a type of Clinical Decision Support System (CDSS). It is designed to help astronauts diagnose and treat symptoms when no doctor is available or communications with Earth are blacked out.
“Trained on spaceflight literature, the AI system uses cutting-edge natural language processing and machine learning techniques to safely provide real-time analyses of crew health and performance,” Google representatives said in a statement.
READ: NASA faces around 20% workforce exodus (July 28, 2025)
Google and NASA have put CMO-DA through three scenarios: an ankle injury, flank pain, and ear pain. Three physicians, including an astronaut, graded the assistant's performance across the initial evaluation, history-taking, clinical reasoning, and treatment. They found a high degree of accuracy, judging the evaluation and treatment plan to be 74% likely correct for flank pain, 80% for ear pain, and 88% for the ankle injury. The two organizations are working with doctors to further test and refine the models.
The multimodal tool accepts text, speech, and images, and runs within Google Cloud's Vertex AI environment. It operates under a fixed-price Google Public Sector subscription agreement that covers cloud services, application development infrastructure, and model training, David Cruley, a customer engineer at Google's Public Sector unit, told TechCrunch. NASA owns the app's source code and has helped fine-tune the models. The Vertex AI platform provides access to models from Google and third parties. NASA scientists said the roadmap is deliberately incremental, according to TechCrunch.
They plan to add more data sources, such as medical devices, and to train the model to be "situationally aware," that is, attuned to space medicine-specific conditions like microgravity.
READ: Google’s AI Overviews reach 2 billion monthly users, AI Mode hit 100 million (July 24, 2025)
Deep-space missions, such as those to the Moon or Mars, may involve communication delays that make real-time consultations with Earth impossible. An onboard AI assistant could help bridge that gap. The technology might also be useful on Earth, in remote and demanding environments where access to medical professionals is limited.
While Cruley was vague about whether Google intends to pursue regulatory clearance to bring this type of medical assistant into doctors' offices on Earth, it could be an "obvious next step," according to TechCrunch.

This isn't the only way NASA is using AI. In a recent test, the agency showed how AI-based technology could help orbiting spacecraft deliver more targeted and valuable science data. For the first time, the technology enabled an Earth-observing satellite to look ahead along its orbital path, rapidly process and analyze imagery with onboard AI, and determine where to point an instrument. The whole process took less than 90 seconds, without any human involvement.
Meanwhile, 4,000 NASA employees have chosen to leave the space agency through President Donald Trump's deferred resignation program, a number that amounts to around 20% of its workforce.