In an age in which people can find ratings on restaurants, books, movies and the stuff they buy on Amazon, why don’t patients know more about their doctors?
By Rose Hoban
Recently, Cathy Zizzi decided to change orthopedists.
“I had a terrible experience and it was the third one,” said the Winston-Salem resident. “I thought, ‘That’s it.’”
Zizzi tried to complain to the orthopedist’s office, but had a hard time finding a way to do it. When her patient-satisfaction survey came, there was no place to write a comment.
“I eventually wrote something on their Facebook page,” she said.
Zizzi’s experience isn’t unique. Many people throw those patient-satisfaction surveys in the recycling bin; it’s often the unhappy patients who send them back.
But patients rarely, if ever, receive a response from the hospital or physician practice once they send in a survey. It’s enough to make one wonder whether the surveys are an exercise in futility.
“It’s hard to find a doctor you’ll like,” said Zizzi, wondering why there’s not something like Yelp for doctors.
Now there is something like Yelp, at least in Zizzi’s town, where Wake Forest Baptist Medical Center recently became one of only four hospitals in the U.S. to post patient reviews of its physicians on their online profiles.
The hospital quietly began putting results from those paper patient-satisfaction surveys online this spring. The idea was to accumulate enough ratings on doctors so that by the time the announcement was made, there was something to show.
“If you want to succeed, you need to understand what people you serve think,” said Wake Forest’s chief medical officer Russell Howerton, who led the initiative to bring transparent grades to his institution.
Mum’s the word
In the past, hospitals have been mum about what patients think of their doctors, even though they collect information about their services on those patient-satisfaction surveys and share the information internally. But patients had little information about their doctors, the overall quality of the hospital or the level of satisfaction with care.
In short, there’s been no transparency.
That’s changing, though. In the past few years, a variety of websites have emerged to rate doctor and hospital quality. They range from systematic, data-driven, wonky sites like the Leapfrog Group’s ratings of hospital quality to consumer-driven commentary on Angie’s List about the customer service in doctors’ offices.
“There’s much discussion in physician circles about the kinds of websites that accumulate comments, and about managing reputations around these websites,” Howerton said.
Most recently, the website HealthGrades has created a way to do physician reviews, using eight questions ranging from “total wait time” to “how well the provider listens and answers questions.”
But patients have to know about HealthGrades and seek out the website to leave comments, so there tend to be fewer data points. In contrast, WFBMC surveys its patients directly, and about 18 percent of them return their surveys, either on paper or electronically.
“The electronic ones come back more quickly,” said Hannah Lacko, the patient-experience adviser at WFBMC. “You’ll see when you click on comments that sometimes the dates are as recent as two weeks. There’s more real-time input and constant updating.”
“We send tens of thousands of surveys to people we know for sure are our patients,” Howerton said. “The feedback to our process is orders of magnitude richer than feedback that comes to HealthGrades.”
Utah leads the way
When Howerton attended a conference where the University of Utah hospital system presented its experience with posting physician ratings on its website, he said he knew instantly he had to do it at WFBMC.
“People should have the ability to understand what people were saying about their providers,” Howerton said. “It’s so self-evidently aligned with our journey to quality that my mind said we need to do that.”
Howerton and Lacko consulted with the leadership at the University of Utah frequently as they prepared to launch their own system.
“They’re on my speed dial,” Lacko laughed.
What really convinced both of them was what the leadership at the University of Utah found: as they collected more patient reviews, the reviews got better, and so did their external quality rankings.
“The attention caused many people to focus on hearing the voice of the patient and make a thousand little changes,” Howerton said.
Slow adopters, fast adopters
Howerton said that at first, many of the physicians on his staff were wary, so he and Lacko had to be strategic.
“We were not naive and started with groups we thought would have affinity for this,” Howerton said. “Between the time we started in October or so and the time the site was ready to be turned on, everywhere we’ve been we’ve gotten them to sign on enthusiastically.”
Still, some individual physicians and groups have been slower adopters, but that’s changing. Howerton recounted what he heard from one doctor whose patient told him she made an appointment with him because of his ratings. The first doctor she looked at did not have enough reviews to be rated, so the patient went looking for another doctor who had more.
“Suddenly, faculty who were wary wanted more voices of the customer to be in the website so they would have rankings,” Howerton said.
Cody Hand, a lobbyist for the North Carolina Hospital Association, said patient rating systems on all hospital websites are just a matter of time.
“Wake Forest is just the first domino,” he said. “I give everyone else in the state five years.”
Hand noted that there are 135 hospitals in the state.
“It’s the future,” Howerton said. “You’ve been to Yelp to check on a restaurant. Our industry is a service industry. It’s inconceivable that young doctors will finish their careers where this kind of transparency isn’t everywhere.”
“We’re happy to be a leader,” he said. “The industry is going to follow us here.”