News
Google Cloud and Adtalem Want Clinicians to Learn AI—Before It Lands on the Ward
- By John K. Waters
- 10/15/2025
Adtalem Global Education will team up with Google Cloud to launch an artificial‑intelligence credential aimed at students and practicing clinicians, a bid to make tools like Gemini and Vertex AI feel less like novelty and more like standard kit in clinics and hospitals.
Set to debut in 2026 across Adtalem's network—including Chamberlain University and Walden University—the program promises hands-on training with Google Cloud's AI services alongside coursework on clinical use cases, ethics, and patient-safety protocols. Adtalem says its institutions serve more than 91,000 students and count about 365,000 alumni—a sizable pipeline for feeding AI-literate workers into a strained healthcare system.
"The partnership with Google Cloud gives our students a competitive edge in their careers — whether they're treating patients, providing mental health counseling, or leading healthcare teams," said Michael Betz, Adtalem's chief digital officer, in a statement. "Our graduates will enter the workforce confident in using AI to enhance their clinical decisions, spend less time on paperwork, and more time connecting with patients."
Google casts the alliance as a guardrails‑first effort. Brent Mitchell, vice president of Google Public Sector, said the goal is to help clinicians implement AI "safely, responsibly and effectively."
Why this matters now
Hospitals are spending heavily on AI to triage tasks, streamline scheduling, and reduce documentation, but the workforce isn't uniformly ready. A Harris poll last month found that more than half of US healthcare workers are actively looking to leave their jobs. The same survey reported that 42 percent worry AI will replace some aspects of their role, and that just 41 percent feel comfortable using AI tools today.
That disconnect—investment on one side, anxiety and uneven training on the other—is what this credential targets. Rather than dumping generic "AI literacy" into already packed curricula, Adtalem says it will anchor the program in clinical scenarios and the tools commonly deployed in hospital systems. Think: how to evaluate when a model belongs in a workflow, how to verify outputs, and how to keep patient data out of the wrong places.
The fine print
The companies are positioning the effort as practical, not theoretical: time in the console with Gemini-powered services, exposure to healthcare-specific integrations, and structured modules on responsible use. The emphasis on ethics and patient safety is notable given the field's sensitivity to bias, privacy, and explainability—and the reality that any productivity gains vanish if clinicians don't trust the tools.
The bigger picture
Credentialing won't settle the debate over where generative models should live in clinical decision‑making, but it could normalize a baseline of skills—how to prompt effectively, how to audit results, and when to escalate to a human‑only workflow. If the program reaches even a fraction of Adtalem's student body and alumni, it could shape how quickly AI moves from pilots to routine practice.
The bet is simple: train clinicians to use AI before AI is thrust upon them. Whether that eases burnout or deepens skepticism will depend on execution—and on whether those "safe, responsible, effective" promises hold up once the tools leave the classroom and hit the ward.
About the Author
John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].