Old 07-01-2010, 07:55 PM   #274
Softhearted
Member

Interesting articles from the Wall Street Journal

source: http://blogs.wsj.com/health/2008/05/...cine-not-well/

"How Do American Journalists Cover Medicine? Not Very Well

By Scott Hensley
Journalist, heal thyself.

When it comes to covering the medical news of the day, journalists could do a much better job.

An independent analysis of 500 stories about medical topics by major consumer print and broadcast outlets in the U.S. found “journalists usually fail to discuss costs, the quality of the evidence, the existence of alternative options, and the absolute magnitude of potential benefits and harms.”

The findings from 22 months of media scrutiny appear in the current issue of PLoS Medicine. The work was done by HealthNewsReview.org, which started looking over our shoulder in April 2006.

Here’s a table from the PLoS paper that catalogs our shortcomings:

(sorry, the image did not want to download... argh!!!)

As self-respecting journalists, we asked who’s behind this schoolmarmish outfit and how do they do what they do? The reviewers are a bunch of doctors and public health types, and their painstaking process for deciding how vigorously to wag fingers at us is described here.

A reformed journalist named Gary Schwitzer, author of the PLoS paper, serves as the third reviewer of each piece. He’s also publisher of the reviews, a professor of journalism at the University of Minnesota, and, gasp, a blogger. Maybe that makes him a peer reviewer?

What about funding, you ask? Any hidden agenda? A-ha! All this journalistic second-guessing can be laid at the feet of that dastardly quality guru Jack Wennberg from Dartmouth. The financial support for the graders comes from the Foundation for Informed Medical Decision Making, founded in 1989 by Wennberg and colleagues.

Bonus Prescription: What can be done? A PLoS editorial that accompanies Schwitzer’s paper calls the findings “a wake-up call for all of us involved in disseminating health research—researchers, academic institutions, journal editors, reporters, and media organizations—to work collaboratively to improve the standards of health reporting.”
"



Another article:
source: http://blogs.wsj.com/health/2009/05/...research-hype/


"Academic Medical Centers Often Guilty of Research Hype

By Sarah Rubenstein
The media may be guilty of exaggerating the results of medical studies, but academic medical centers that hype the results aren’t blameless themselves.

A piece out in the Annals of Internal Medicine takes a look at press releases that academic medical centers sent out about their research, examining such details as whether they gave information on the studies’ size, hard results numbers and cautions about how solid the results are and what they mean. The conclusion: The press releases “often promote research that has uncertain relevance to human health and do not provide key facts or acknowledge important limitations.”

The authors, led by Steven Woloshin and Lisa Schwartz of Dartmouth, looked at releases from EurekAlert issued by 20 academic medical centers and their affiliates in 2005. (EurekAlert compiles many press releases and sends them to journalists.) The researchers found that 58 out of 200 releases, or 29%, exaggerated the findings’ importance.

Exaggeration was more common in releases about animal studies than human studies. Out of the 200 releases, 195 included quotes from the scientific investigators: 26% of them were “judged to overstate research importance,” the authors write.

One example they cite: A release from the Huntsman Cancer Institute at the University of Utah that had to do with a study of mice with skin cancer and was titled, “Scientists inhibit cancer gene.” It quotes the lead investigator, Matthew Topham, saying that the “implication is that a drug therapy could be developed to reduce tumors caused by Ras without significant side effects.” This was an exaggeration, the Dartmouth folks write, because “neither treatment efficacy nor tolerability in humans was assessed.”

We put in a call to Topham, who told us he thought the critique itself was an exaggeration. Though he acknowledged the release could have explicitly said the results wouldn’t necessarily be the same in humans, “we were very careful to say we had done this in mice.” The word “implication” used in the press release “suggests that we have not done anything in humans,” he says, adding he assumed it was common knowledge that animal results don’t always translate into human results.

The authors of the Annals piece didn’t look at how often exaggerated press releases actually resulted in exaggerated news reports. However, they wrote, “We believe that academic centers contribute to poor media coverage and are forgoing an opportunity to help journalists do better.”

Woloshin and Schwartz have written before about medical research and the media, including another piece about flawed press releases from medical journals and one about news reports that “often omit basic study facts and cautions” about research presentations at scientific meetings. They’re not the only ones who make a case that journalists don’t cover medicine very well.
"

Even though these articles date back a few years, I doubt the situation has changed much.