Advice for telling the difference between medical fact and fiction.


One day last summer Antonia Prescott was scrolling the internet when she saw an article with a headline that intrigued her.

“Harvard professor names best exercise to burn fat and keep it off: Dr. Daniel E. Lieberman has explained what type of exercise and for how long a week people should be doing it for best results,” it said.

Curious, Prescott turned to her husband, who was doing the dishes nearby, and asked him what type of exercise people should do to burn fat and keep it off.

“That’s really complicated. I can’t answer that,” responded her husband, who happens to be Daniel E. Lieberman, a professor at Harvard who has never provided such guidance to anyone.

Like much of the “information” available online, what she was reading wasn’t accurate, or at least it was so oversimplified as to be meaningless.

The internet is filled with questionable guidance on weight loss and nearly every other topic ‒ but when it comes to health, such sketchy bits of content can be downright dangerous.

Most Americans encounter false information related to health online, according to a recent poll by the Kaiser Family Foundation, and most aren’t sure whether that information is true.

Some may be harmless ‒ such as the best exercise for burning fat, which Lieberman, a paleoanthropologist, can’t answer simplistically from his data on human evolution.

But some of it, including outright lies, is often provided by bad actors who are trying to make money or gain power by manipulating innocent people, experts say.

These bad actors also take advantage of a flawed medical system that can leave people without access to professionals they trust to give them accurate, useful information.

Systemic changes are needed to help rebuild public trust, experts say.

At the individual level, people should learn to recognize the difference between accidental misinformation and intentional disinformation, said Lee McIntyre, a philosopher and author, who has written extensively on the subject.

Mistakes, like natural disasters, will always happen. There’s not much to be done about them.

But disinformation, he said, is a lie against which people can fight back.

“I want people to train themselves,” McIntyre said, to ask where the information in question is coming from, what’s at stake, who’s behind it and what they stand to gain by getting that information out to the public.

Building health literacy

By promoting fear, misinformation causes mental and physical fatigue, said David Novillo Ortiz, European regional adviser on data and digital health for the World Health Organization.

It has a direct impact on trust in government, government response and public health messaging, which then disempowers people and risks their health, he said.

“We have a challenge ahead of us in how we can rebuild this trust in government that has been damaged by misinformation,” said Novillo Ortiz, who is working to do that within Europe.

The world has changed dramatically within the lifetime of most people alive today. Anyone over a certain age grew up in a world where they never had to defend themselves against misinformation on social media.

There are more mobile devices than people in most countries and only half the nations in Europe and Central Asia have policies to improve digital health, Novillo Ortiz said, so it’s become easy to spread false medical information.

“We are leaving people behind because we are not investing enough in digital health literacy.”

Everyone, from politicians to public sector employees to journalists to individuals, needs to play a role in fighting misinformation, Novillo Ortiz said.

“This is a problem for all of us,” he said.

Know who to trust

Even otherwise trustworthy sources sometimes screw up, said Dr. Richard Baron, president and CEO of the American Board of Internal Medicine, which certifies doctors.

There’s no question, for instance, that the Centers for Disease Control and Prevention made mistakes early in the pandemic. But that doesn’t mean everything the CDC says should be dismissed. “They got a couple of things wrong, but I wouldn’t throw the baby out with the bathwater,” he said.

If several typically reliable sources agree, such as the CDC, along with experts or websites from well-known hospitals and universities, they’re probably right, he said.

“When you start to see information converging from reliable sources, that is trustworthy information,” he said.

Baron noted we live in an increasingly specialized society where we can’t possibly know or understand everything by ourselves, so we have to rely on experts.

His office, for instance, is on the building’s 17th floor, so he has to take an elevator to work. He doesn’t really understand how the elevator works and he has no interest in fixing it when it doesn’t. He just wants to get to the 17th floor, so he trusts other people to get him there.

Similarly, he said, the public needs to be able ‒ and willing ‒ to trust people with medical expertise.

But that doesn’t mean everyone with an MD after their name is equally trustworthy, said John Robert Bautista, now a health misinformation researcher at the University of Missouri, Columbia.

Based on his previous research at the University of Texas at Austin, Bautista said doctors who post misinformation ‒ including the Disinformation Dozen, who promoted false information about vaccines before the pandemic ‒ are typically selling a product or themselves.

They play on people’s emotions to get followers, he said. “Once they get a certain number of followers, they can use that platform to sell stuff, or if they have plans to run for office they can use that social capital they have.”

Freedom of speech is a legitimate right for doctors, as for everyone else, Baron said. But accuracy and avoidance of harm are important too. Doctors don’t get to claim freedom of speech in malpractice cases, he noted.

Also, Baron said, it’s standard fare for people pitching disinformation to attribute bad motives to others. “It’s not that everybody always has pure motives,” he said. But ask yourself: why would they have those motives? Would drug companies really sell more drugs if those drugs killed people?

Everyone likes to criticize Big Pharma for being greedy, for instance, but there’s plenty of money in the $50 billion U.S. market for dietary supplements, which are subject to far fewer government regulations than pharmaceuticals.

So, if you’re paying attention to a doctor or other self-proclaimed expert who is outside of the mainstream and you think that person ‒ and by extension, you ‒ are smarter than everyone else for doing so, you might want to reconsider, Baron and others said.

“(You have to be) skeptical about one’s skepticism,” he said. “You really can outsmart yourself.”

Institutions have a lot of work to do too

Baron said institutions like his have taken the public’s trust for granted rather than trying to deliberately build that trust.

Doctors and academic scientists have long thought about “marketing” and communicating to patients as someone else’s job, said Dr. Geeta Nayyar, author of the new book “Dead Wrong: Diagnosing and Treating Healthcare’s Misinformation Illness.”

Every candy store has an Instagram account telling customers about offerings and hours and offering opportunities to interact, she said. But “health care is arguably the complete opposite. Once you leave, you have no idea how to interact with us.”

Many people today don’t even have a regular doctor, so when they show up truly in need of medical advice, they haven’t built up the kind of trust that used to define the doctor-patient relationship.

This also puts an added strain on doctors and nurses and may explain at least some of the caregiver burnout.

Nayyar said she’s had patients come in and ask her how much money she makes on COVID-19 vaccinations. (Answer: Nothing.) “To walk in so mistrusting is difficult for anyone to swallow.”

That lack of easy communication between provider and patient has left a gaping hole that people with other agendas have been only too happy to fill.

“Misinformation grows in the dark,” Nayyar said. “We left this space dark and people are seeing the profits they can make (by taking advantage of that information vacuum).”

How to inoculate yourself

To make sure you and your family are getting the best medical information online, look for content that’s posted to platforms that are broadly available and have editors, suggests Marzyeh Ghassemi, an assistant professor at the Massachusetts Institute of Technology who develops machine-learning algorithms to inform health care decisions.

Bots and social media accounts can post anything, but something that’s been vetted by many people and posted to an institutional website is likely to be more reliable, she said.

“You’re going to go for high efficiency if your goal is to spread misinformation,” she said, so if it’s very simple to get information onto a platform, there’s a higher risk it won’t be accurate.

People behave differently toward information when they are primed to evaluate it for accuracy, Ghassemi said.

Content warnings, like those the social media site X (formerly Twitter) used to include, were effective in making people question misinformation, she said.

“That is a very powerful intervention,” she said. “If we can’t control how (information) gets generated, we can at least control how it gets delivered.”

Another way to destroy the power of lies is through “prebunking,” or exposing them as fraudulent before they can become part of the popular imagination, said McIntyre, whose most recent books include “How to Talk to a Science Denier” and “On Disinformation: How to Fight for Truth and Protect Democracy.”

Too often, McIntyre said, people opt for the “do nothing” option when doing something is actually safer or makes more sense. That’s why people frequently skip routine medical checks that might help prevent serious medical problems.

“Taking too long to make a decision is in effect making a decision,” he said.

The people who want to take advantage of others know how to exploit people’s natural prejudices, McIntyre said. “The disinformers know what the cognitive biases are and what the existing divisions are and so where to plant it,” he said.

McIntyre said he doesn’t blame conspiracy theorists for being sensitive about being deceived. “It’s a very powerful human motivation not to be fooled,” he said.

But they are being led astray by someone different than they think. “You think you’re being duped by the CDC and the FDA, but you’re actually being duped by Alex Jones and Naomi Wolf and these other people on Twitter (now X).”

In a way, falling for misinformation and not trusting “official” sources is a reflection of people not feeling heard, Ghassemi said.

Your doctor used to be someone within your community whom you knew and trusted.

“You were disproportionately likely to listen to advice they had. I don’t think that is true as much today,” she said. Electronic health records were supposed to improve things, but in some ways just baked in racial and other prejudices that were there before, she said.

“Many communities do not feel that their pain is being heard and acknowledged by power structures,” she said. “Some movements are weaponizing this collective feeling in a way that is very dangerous, and spreading misinformation can be part of normalizing behavior that comes from fear and anger.”

Karen Weintraub can be reached at kweintraub@usatoday.com.
