Researchers asked 330 cancer patients how they felt about using artificial intelligence (AI) in their treatment and care. Most patients were comfortable with AI helping with cancer screening and with lifestyle advice like exercise and diet. However, they were less comfortable when AI helped with bigger decisions: diagnosis, treatment planning, or predicting outcomes. About half of the patients worried about losing personal interaction with their doctors or about AI making medical mistakes. The study shows that while patients generally support AI in cancer care, doctors need to listen to patient concerns and involve them in how AI is developed and used.

The Quick Take

  • What they studied: How comfortable cancer patients are with using artificial intelligence (computer programs that learn and make decisions) in different parts of their cancer care, from finding cancer to treating it.
  • Who participated: 330 cancer patients from September to December 2024. About 56% were men, half were 65 years or older, and most were actively receiving cancer treatment. The group included people from different racial and ethnic backgrounds.
  • Key finding: Most patients (about 8 out of 10) were comfortable with AI for cancer screening and lifestyle help. However, only about 6 out of 10 were comfortable with AI helping decide on treatment or predict outcomes. Nearly half worried about losing personal doctor interaction and the risk of medical mistakes.
  • What it means for you: If you have cancer, AI tools may help your doctors find cancer early and give you health tips. However, your doctors should still make major treatment decisions with you, not just rely on AI. It’s okay to ask questions about how AI is being used in your care.

The Research Details

This was a cross-sectional survey study, which means researchers asked cancer patients questions at one point in time (September through December 2024) rather than following them over months or years. Researchers approached 383 cancer patients and 330 agreed to participate, which is a very good response rate of 86%. The survey asked patients to rate their comfort level with AI in 8 different areas of cancer care using a simple 1-to-5 scale, where 1 meant “very uncomfortable” and 5 meant “very comfortable.” Patients also answered questions about their concerns with AI, such as worries about medical errors or privacy problems, using a 1-to-3 scale.

This type of study is useful for understanding what people think and feel about a topic right now. It’s like taking a snapshot of patient opinions rather than watching how opinions change over time. The researchers designed the survey to be straightforward so patients could easily understand and answer the questions.

Understanding what patients think about AI is really important because doctors are starting to use AI more and more in cancer care. If patients don’t trust AI or feel uncomfortable with it, they might not want to use these tools, even if they could help. By asking patients directly what they think, doctors can design AI systems that patients actually want to use and that match what patients care about most.

This study has several strengths: a large number of patients participated (330), the response rate was very good (86%), the group included people of different ages, genders, and backgrounds, and the survey questions were clear and easy to understand, which helps get honest answers. Because the survey ran in late 2024, it also reflects current patient thinking. However, the study only captured what patients thought at one moment in time, so we don’t know whether their opinions changed later.

What the Results Show

When asked about different uses of AI in cancer care, patients showed clear preferences. They were most comfortable with AI helping find cancer early through screening (80% comfortable) and helping with supportive care like exercise routines (78%), diet advice (75%), and information about herbs and supplements (72%). These are areas where AI can help patients feel better and prevent cancer without making major medical decisions.

Patients were less comfortable when AI played a bigger role in medical decisions. Only 70% were comfortable with AI helping doctors diagnose cancer, 68% with AI helping manage symptoms, 65% with AI helping plan treatment, and 62% with AI predicting what might happen to them in the future. These are the decisions that directly affect treatment choices, so patients understandably wanted more human involvement.

When asked about concerns, about half of all patients (50%) said they were at least somewhat worried about AI in cancer care. The biggest worries were losing personal interaction with their doctors and the possibility of AI making medical mistakes. Some patients also worried about their private health information being shared or used incorrectly.

The study found that most patients (73%) were actively receiving cancer treatment when surveyed, which means they had real experience with cancer care and could speak from that perspective. Older patients (65 and up) made up about half the group, so the findings reflect the views of people who are most likely to get cancer. The study also showed that even though patients had concerns, the majority still had a generally positive view of AI in cancer care overall, suggesting they see potential benefits if concerns are addressed.

This is one of the first studies to directly ask cancer patients what they think about AI in their care. Previous research has mostly focused on what doctors and AI experts think, not what patients want. This study fills an important gap by showing that patient opinions are different from what experts might assume. Patients aren’t against AI, but they want it used in the right way—helping with screening and support, while keeping doctors involved in major decisions.

This study has some important limits to keep in mind. It only asked patients at one point in time, so we don’t know if their comfort with AI grows or shrinks as they use it more. The study was done with patients from specific hospitals or clinics, so the results might not apply to all cancer patients everywhere. We don’t know if patients’ comfort levels would be different if they had more experience with AI tools. Also, the survey asked about AI in general, but didn’t show patients specific examples of how AI actually works, which might have changed their answers.

The Bottom Line

Based on this research, here are evidence-based suggestions: (1) Doctors should feel confident using AI for cancer screening and supportive care like exercise and diet advice—most patients support this. (2) When AI is used to help with diagnosis, treatment planning, or predicting outcomes, doctors should always explain how the AI works and make final decisions with the patient, not just based on what the AI says. (3) Healthcare systems should protect patient privacy carefully when using AI. (4) Doctors should talk openly with patients about both the benefits and risks of AI in their care. These recommendations have moderate to strong support from this study.

Cancer patients and their families should care about this research because it shows that your opinions about AI in your care matter. If you’re getting cancer treatment, you have the right to ask your doctors how they’re using AI and whether you’re comfortable with it. Doctors and hospitals should care because this shows what patients actually want, which can help them use AI in ways that patients will trust and support. People developing AI tools for cancer care should definitely pay attention because this shows what features patients want and what concerns they have.

Patient comfort with AI may increase over time as patients see it work well in their care. However, concerns about losing human interaction and about medical errors won’t go away quickly; these need to be addressed through clear communication and a proven safety record. Positive experiences may help, but building trust takes time and consistently good results.

Want to Apply This Research?

  • Track your comfort level with different AI tools in your cancer care using a simple weekly check-in. Rate how comfortable you feel, from 1 (very uncomfortable) to 5 (very comfortable), with AI in screening, diagnosis support, treatment planning, and symptom management. Note any specific concerns that come up.
  • Ask your doctor specific questions about AI before your appointments. For example: “Is AI being used to help with my screening?” or “How does AI help you decide on my treatment?” Write down your doctor’s answers so you have a record of how AI is being used in your care.
  • Over 3-6 months, track how your comfort level changes as you learn more about AI in your care. Also note which AI applications (screening, diet advice, treatment planning) you feel most and least comfortable with. Share this feedback with your healthcare team to help them understand your preferences.

This research reflects what cancer patients think about AI in their care based on a survey from 2024. The findings do not provide medical advice or recommendations for specific cancer treatments. If you have cancer or are concerned about cancer, please talk with your doctor about what’s best for your individual situation. Your doctor can explain how AI is being used in your care and answer your questions. This study shows general patient preferences, but your own comfort level with AI may be different. Always discuss any concerns about your treatment with your healthcare team.