The Still-Early State of Online Doctor Reviews

A front-page Boston Globe article about a neurosurgeon suing a caregiver over a harsh blog post is exciting but unrepresentative of the overall state of online doctor reviews. Still, it prompted me to take another look at online physician ratings from the perspective of someone trying to find a doctor. My conclusion: we are still in the early days, and there is plenty of opportunity for better, more useful information. It remains difficult to use these sites for real decision making.

First I searched HealthGrades, Yelp, Angie's List and Massachusetts Health Quality Partners (MHQP) for information on something I really care about: I typed in the name of a medical specialist at a local academic medical center who is caring for a family member with a serious illness. This doctor has been in practice for 20 years, yet only one site (HealthGrades) had any reviews, and the two it had were not detailed. I then looked up other specialists and found that very few reviews are typically available; it's unusual to find more than five reviews for a given specialist on any one site, although I'm sure there are exceptions. MHQP doesn't cover specialists at all.

Next I turned to primary care. The information is better (MHQP in particular stands out on data quality), but it still leaves a lot to be desired. I searched for my own physician, Dr. Johanna Klein of the Beth Israel Deaconess Medical Center's Washington Square Group. Here's what I found:

HealthGrades — a listing with a lot of publicly available information (address, phone, insurance, date of graduation), plus seven patient experience surveys showing that people generally like her.

Angie's List — which I paid $11 to join — has a confusing search function. I found Dr. Klein but no reviews for her. There were 16 reviews for the broader medical group, though, enough to get a general idea of the practice and of some specific doctors within it. One review is harsh ("I seriously question if she has actual medical training…"), but most are sober and boring and don't sway me one way or the other. This site was the most disappointing overall, and I don't recommend subscribing.

Yelp — the liveliest of the sites, at least in its reviews of this practice, and the one that incorporates the most innovative social media features. There are 7 reviews: 3 give 5 stars, 3 give 1 star and 1 gives 2 stars. In addition to the rating, most have a significant amount of text, quite a bit more than on Angie's List. Reviews are sorted by "Yelp Sort" by default and can also be sorted by date, rating, Elites (a Yelp designation for evangelists) and Facebook friends. The Yelp Sort takes into account various factors, like user votes and recency, to list the most helpful reviews first. Each reviewer has his or her first name, last initial, town and photo displayed, along with the number of Yelp friends, the number of reviews posted and how many times he or she has "checked in" at the location. Clicking on the reviewer's name brings up a profile, ratings of the usefulness of the person's reviews, and a distribution of the person's ratings. The distribution is interesting because it gets at a key concern physicians have about ratings: are they posted only by people with negative experiences?
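Yelp doesn't disclose the formula behind Yelp Sort, but the general idea of combining helpfulness votes, recency and reviewer track record is easy to sketch. The Python below is purely illustrative: the fields, weights and scoring function are my own assumptions, not Yelp's actual algorithm.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Review:
    author: str
    stars: int          # 1 to 5
    useful_votes: int   # "useful" votes from other readers
    review_count: int   # reviewer's total reviews on the site
    posted: date

def helpfulness(r: Review, today: date = date(2012, 5, 1)) -> float:
    """Toy score combining the kinds of signals described above.
    The weights and decay window are invented for illustration."""
    age_days = (today - r.posted).days
    recency = max(0.0, 1.0 - age_days / 1095)      # fades over ~3 years
    credibility = min(r.review_count, 150) / 150   # reviewer track record, capped
    return 2.0 * r.useful_votes + 5.0 * recency + 3.0 * credibility

reviews = [
    Review("Jane D.", 5, 4, 12, date(2011, 11, 3)),
    Review("Tom R.", 1, 6, 150, date(2009, 6, 20)),
]
# Most "helpful" reviews first, mirroring the default ordering.
for r in sorted(reviews, key=helpfulness, reverse=True):
    print(f"{r.author}: {r.stars} stars")
```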

The Yelp Sort did an excellent job of ranking the reviews. The first is by a person with multiple chronic illnesses who has seen a specific doctor at the practice for 10 years and gave 5 stars. She had many specific things to report about her doctor and clearly had ample basis for her comments. Four people had rated the review helpful, and she had checked in on Yelp twice while at the practice (compared with none for the other reviewers).

The next two reviewers gave low ratings: 2 stars and 1 star. Each has written more than 150 reviews, awarding 4 or 5 stars in the vast majority of cases, which is a helpful credibility builder for me.

The last 2 reviews, 1 star each, are written by people with no Yelp friends and only a few reviews. Their negative ratings are based on specific anecdotes, and even though one has six "useful" votes it still sits at the bottom, where I think it deserves to be.

Overall the reviews rang true to me based on my own experience.

MHQP is far more scientifically rigorous than the other sites, and its data forms the basis for Consumer Reports' recent report on physician quality in Massachusetts. Clinical quality measures come from health plan data, and patient experience is derived from a statewide survey. For patient experience there are 90 responses for the Washington Square Group. Results are also displayed as one to four stars, but here the stars have a statistical basis: 4 stars means an office did better than 85 percent of the offices surveyed; 1 star means it did worse than 85 percent of them. MHQP also enables a side-by-side comparison of different offices, which is a nifty feature.
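Those thresholds give the stars a concrete meaning that can be written down as a simple percentile-to-stars mapping. Only the top and bottom cutoffs (better or worse than 85 percent of offices) come from MHQP's description; the middle boundaries below are assumptions I made for illustration.

```python
def mhqp_stars(percentile: float) -> int:
    """Map an office's survey percentile to MHQP-style stars.
    Per MHQP: 4 stars = better than 85 percent of offices;
    1 star = worse than 85 percent of them (below the 15th
    percentile). The 2/3-star boundary is assumed."""
    if percentile > 85:
        return 4
    if percentile >= 50:   # assumed cutoff
        return 3
    if percentile >= 15:
        return 2
    return 1

assert mhqp_stars(90) == 4   # top 15 percent of offices
assert mhqp_stars(10) == 1   # bottom 15 percent
```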

Despite the harshness of some of the Yelp reviews of my practice, the picture painted by the MHQP results is, if anything, worse. There are quite a few categories with 1 star (e.g., how well doctors give preventive care and advice) and few with 4. And yet 71 percent of the Washington Square Group's respondents say they would "definitely" recommend their doctor, and 19 percent say "probably." In keeping with its statistical rigor, the MHQP site has none of the qualitative comments that could shed light on these findings, and results are reported at the level of the group rather than for individual physicians. And of course MHQP covers only Massachusetts, although certain other states and regions have similar resources.

I looked at these websites when I picked my primary care physician. They didn't have much influence on me then and wouldn't today. In the end the number one issue was finding a specific physician I liked, and as mentioned there is essentially nothing documented about my doctor. Instead I relied on my previous doctor's recommendation after eliminating a few other potential choices. Location was also important, and I wanted someone within the Beth Israel system because I like the hospital and my records are on the PatientSite portal. I do have some concerns about the practice's overall customer service and some of its low MHQP ratings, but I figure that if I watch out for myself these things won't affect me.

In an ideal world the rigor of MHQP's ratings would be extended to the individual physician level, at least for certain measures, and to medical and surgical specialists. Physicians or practice managers would also have a way to reply to ratings and reviews, at least in a general way. If some of the Yelp approach could be applied to add texture to the data through user commentary, then we'd really have something.
