Google and AI in Healthcare: What It Means for Nurses
- Google and Mayo Clinic partnered to use generative artificial intelligence (AI) to improve patient care.
- Both organizations say they are addressing safety concerns, including privacy issues, while leveraging AI in healthcare.
- AI technologies have begun transforming nursing care, and generative AI may be the next step.
The complexity of healthcare data has made artificial intelligence (AI) attractive to payers and providers. The hope is that AI can perform tasks as well as or better than humans. But the question remains: How reliable is AI in an industry where accuracy is vital to patients' physical, emotional, and mental health?
Let's explore the Google and Mayo Clinic partnership to leverage AI in healthcare.
Google and Mayo Clinic Partnership: Leveraging AI in Healthcare
The partnership began in September 2019, when the two organizations announced a strategic alliance intended to redefine healthcare delivery and accelerate innovation. Mayo Clinic chose Google's cloud and AI capabilities as the cornerstone of its digital transformation, giving the health system a technology roadmap for advancing diagnosis and treatment.
Before the COVID-19 pandemic, Mayo Clinic determined that cloud computing, machine learning, and AI could help bring together consumers and providers across the globe to improve healthcare delivery. Mayo Clinic chose Google to transform virtual care and leverage advanced technology in medical research.
In a press release from Mayo Clinic, Google CEO Sundar Pichai said, "Health care is one of the most important fields that technology will help transform over the next decade, and it's a major area of investment for Google. By pairing the Mayo Clinic's world-class clinical expertise with our capabilities in AI and cloud computing, we have an extraordinary opportunity to develop services that will significantly improve lives."
On June 7, 2023, Google announced that Mayo Clinic was testing its latest generative AI developments, giving providers access to technology designed to increase their efficiency and impact on healthcare delivery. The announcement included generative AI search to help create customized chatbots, allowing workers to interpret medical data (such as medical histories, imaging, and lab results) more quickly.
Potential Use Cases for Generative AI in Healthcare
Generative AI is a form of computing that creates new content. Examples include platforms such as ChatGPT, DALL-E, and Soundful: ChatGPT produces text, DALL-E produces images, and Soundful produces music, all using generative AI programming.
Because generative AI can create unique output and accelerate efficiency, Mayo Clinic hopes the technology will improve patient care by reducing administrative tasks. The strategic alliance between Google and Mayo Clinic will explore how combining search with generative AI can help providers find important information faster and more seamlessly.
For example, the program unifies data so providers can search more quickly for relevant results. When healthcare providers need information about a specific cohort of patients, they can enter a single query for the test results they want to compare across patient records.
This capability should also make research much faster. For example, a research team seeking medical information about female patients aged 35-45, including vitamin D levels and mammogram results, could use this search tool instead of pulling each data point separately.
It could also help providers match patients with potential clinical trials. However, while the potential of generative AI in healthcare is significant, safety concerns must be addressed before the technology is made more broadly available.
Addressing Safety Concerns
The AI framework can provide healthcare providers with real-time data and streamline tasks that once took hours. However, using AI in healthcare still requires human oversight. Diagnostic applications can overlook social variables, and because AI systems generally depend on networks, they remain susceptible to security risks.
Aashima Gupta is the global director of healthcare strategy and solutions at Google Cloud. In an interview with CNBC, Gupta stressed Google's approach to privacy was compliant with the Health Insurance Portability and Accountability Act (HIPAA).
“We want to be very thoughtful and responsible in how we leverage such a powerful tool like generative AI in an enterprise setting, especially in health care,” Gupta told CNBC.
Google says it has developed an approach to privacy so that customers retain control of their data. Mayo Clinic says it has created "safe sandboxes" where workers are testing the applications.
“We take privacy of patient data very, very seriously, and our needs of our patients come first. That’s one of the reasons health care has to be very cautious in general as an industry in adopting technology that may not be fully tested, may not be fully vetted,” said Vish Anantraman, chief technology officer at Mayo Clinic.
What the Use of AI in Healthcare Could Mean for Nurses
AI has already begun transforming nursing and enhancing patient care. AI tools already in place include mobile health and sensor-based technologies that enable patients to remain at home while nurses monitor them remotely.
Clinical decision support tools, including clinical practice guidelines, reports, and order sets, help nurses make clinical decisions more efficiently. However, generative AI can take these capabilities to the next level.
AI may become a tool that helps nurses practice at the top of their license by eliminating much of the time they spend on tasks that others can perform. Given the pace at which healthcare AI applications are being developed, nursing professionals must be involved in the teams building applications that affect patient care.
Nurses must also be involved in ethical considerations. Computer algorithms are developed by humans and can therefore inherit human bias. When bias creeps into an AI project, nurses may be held accountable for how its judgments and decisions are applied.
The increasing use of AI has also raised the question of whether technology could replace human workers. Yet in nursing, very few roles can be automated. While AI technologies will affect nursing, nursing experience and skills will always be required for patient care, including human oversight of AI technology.