Statistics

At the June Hospital District meeting, Unity Care COO Shanon Hardie shared the clinic utilization metrics with the commissioners, as she does quarterly. One of these metrics is the number of unique patients seen in the quarter. As usual, the commissioners spent little time on this metric, and as always they had no questions about what these numbers meant or how to address them. Since they won't, we will.

Here are the data Shanon Hardie provided.

As you can see, the clinic has lost almost 100 unique patients since the peak at the start of 2015. In the 16 months that one or both of us has been attending Hospital District meetings, the commissioners have never fully discussed this trend. They have referred at times to dropping patient counts and speculated about causes, but they took no action to discover those causes, let alone address them, until we suggested the community survey that was run early this year.

Extending the data back to 2013 provides more perspective:

Now it's clear that there were strong and sustained drops in 2015 and 2017. What caused them?

Did we do it? Did the Community Paramedic (CARES) program do it?

This chart shows the patient counts from 2015 on. In her resignation speech in March, Elaine Komusi cited both the Small Point Bulletin and the CARES program as reasons. Did either of these affect clinic utilization?

We published our first comic about the Hospital District on December 10, 2017, shown with the blue arrow, and our most intense coverage of the district took place during the first quarter of 2018, the quarter in which she resigned. The red arrow shows when the CARES program launched.

Based on the timing, it's obvious that neither the Small Point Bulletin nor the CARES program could account for the drops in utilization through 2015 and 2017. You can see that utilization rose slightly in Q1 2018, also suggesting that our coverage of the district did not affect it. And it shouldn't have. How could asking the district commissioners and district staff to do their jobs have any effect on the clinic?

The CARES program likewise launched far too late to have any effect on patient counts in those quarters. But what kind of threat does it pose now? Here, a picture is worth a thousand words:

In our analysis of the district's budget, we discussed how the declining number of patients undermines the financial viability of the clinic: fewer patients produce less patient-related revenue (fees, insurance reimbursements, pharmacy payments, etc.). These revenue shortfalls have to be made up somehow; for now, they are covered by increased tax subsidies from the district to Unity Care. We also showed that clinic operating costs were exceeding projections and that the difference between profit and loss in any quarter depended on the mix of patients that came through the door. The bottom line is that the economics of running the clinic are extremely fragile at these patient levels. Reversing the decline should therefore be the top priority for the district and Unity Care, yet we have never heard it addressed at any district meeting. We had hoped that the community survey we suggested would shed light on why some people who pay taxes for the clinic don't use it, so that the district and clinic management would know what to fix.
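
To make that fragility concrete, here is a minimal break-even sketch. Every cost and revenue figure in it is hypothetical; only the 489 unique-patient count (Q1 2018, reported below) comes from the district's numbers. We are illustrating the arithmetic of the squeeze, not the district's actual books.

```python
# Hypothetical break-even sketch: the cost and revenue figures are invented
# for illustration; only the 489 patient count (Q1 2018) is from the district.
FIXED_COSTS_PER_QUARTER = 120_000   # staffing, rent, utilities (hypothetical)
REVENUE_PER_PATIENT = 260           # avg fees + reimbursements (hypothetical)

def quarterly_margin(unique_patients: int) -> int:
    """Patient-related revenue minus fixed operating costs for one quarter."""
    return unique_patients * REVENUE_PER_PATIENT - FIXED_COSTS_PER_QUARTER

for patients in (550, 489, 450):
    margin = quarterly_margin(patients)
    status = "covers its costs" if margin >= 0 else "needs a larger tax subsidy"
    print(f"{patients} patients: margin {margin:+,} ({status})")
```

With numbers like these, losing 100 patients is the difference between covering costs and running at a loss, which is exactly what we mean by fragile.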

Clinic Patient Satisfaction

In my (Vic's) interview to replace Robin Nault on the Hospital District board, I mentioned that I know statistics. Anyone managing a taxing district should at least know something about basic math. Here's why.

Here are the clinic patient satisfaction survey results (not to be confused with the community survey) from the first quarter of 2018, supplied by Unity Care at the June district meeting:

There were 489 unique patients in the quarter. Note A establishes that only six of them responded to the clinic satisfaction survey. All six reported overall satisfaction, shown in the bottom table as 100%. We covered why summary statistics of such small samples are not valid in our December comic, so we won't repeat that here. Instead, look at note B. What does this number mean? Does it mean that four people walked in without calling and only the other two spoke to the receptionist by phone? Or does it mean that of the six respondents, only two chose to answer this question, and both happened to be satisfied? If so, satisfaction among all six respondents would be 2 out of 6, about 33%, rather than the 100% shown. Without knowing which reading is correct, this number can't be interpreted.

But a bigger issue is the use of percentages to report single-digit counts. If two people in the survey talked to the phone receptionist and both reported satisfaction, that should be shown as a ratio, 2 out of 2, not as 100%. Since a percentage literally means "out of one hundred", reporting one implies a broad, generalizable result. In prior meetings, the commissioners and district staff appeared to interpret these very narrow measures as broad, general measures of overall satisfaction, even when they represented only two people. We published the December comic referred to above to (hopefully) educate them so they wouldn't draw the wrong conclusions. If they have been misinterpreting these statistics, that may explain their persistent belief that everything was running well at the clinic and their blindness to warning signals.
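
One standard way to see how little a "2 out of 2" can tell you is to put a confidence interval around it. Below is a minimal sketch of that calculation, ours and not anything Unity Care reported, using the Wilson score interval because it behaves sensibly even at tiny sample sizes:

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score interval for a proportion; unlike the naive
    interval, it stays meaningful when n is very small."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# The charitable reading of note B: 2 respondents, both satisfied
lo, hi = wilson_interval(2, 2)
print(f"2/2 satisfied: true rate plausibly {lo:.0%} to {hi:.0%}")

# A sample large enough to take seriously, for comparison
lo, hi = wilson_interval(90, 100)
print(f"90/100 satisfied: true rate plausibly {lo:.0%} to {hi:.0%}")
```

Even on the most charitable reading, "100% satisfied" from two respondents is statistically consistent with a true satisfaction rate as low as roughly one third.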

Community Survey

That's why the community survey was needed: while Unity Care routinely surveyed people who use the clinic, there was no information about people who don't, and reaching them was the whole point of our suggestion. The district published preliminary summary statistics of the results in February but has not released the final results, so we decided to look again at the preliminary data.

Before discussing the satisfaction metrics from the survey, it's useful to understand what type of pattern we would expect to see based on standard principles of statistics.

Ratings of an organization with average customer service would tend to follow the normal bell curve, with most people giving a medium rating and smaller but roughly equal numbers giving higher or lower ratings. Everyone receives the same service, but because people differ in expectations, preferences, and perceptions, some rate it above average and some below.

The key point is that a typical response pattern has a single peak and a smooth drop-off from that peak. Since the service provided to everyone is the same, the variation in responses comes from differences among the raters, and those differences tend toward a normal distribution once the number of respondents is large enough.


A well-performing organization that provided superior customer service to all customers equally might produce the ratings distribution on the right, with most customers rating it above average, fewer rating it average, and the fewest rating it below average. The distribution is skewed toward greater satisfaction, but there is still a single peak, here at the most-satisfied end, and a smooth drop-off toward not satisfied.
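
As a toy illustration of both patterns, the sketch below simulates raters who all receive identical service and differ only in their personal perceptions. The 1-to-5 scale and the noise level are our assumptions, not the survey's actual instrument:

```python
import random
from collections import Counter

random.seed(42)  # reproducible example

def simulate_ratings(true_quality: float, n: int = 1000) -> dict:
    """Identical service for everyone; rater-to-rater differences in
    expectations and perception are modeled as Gaussian noise."""
    counts = Counter()
    for _ in range(n):
        rating = round(random.gauss(true_quality, 1.0))
        counts[min(5, max(1, rating))] += 1  # clip to the 1-5 scale
    return dict(sorted(counts.items()))

print("average service: ", simulate_ratings(3.0))  # single peak at 3
print("superior service:", simulate_ratings(4.5))  # peak at 5, smooth drop-off
```

Shifting the underlying quality moves the peak, but the shape stays single-peaked with smooth tails on either side.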

Now look at the actual satisfaction ratings from the community survey. The vertical axis shows the number of respondents.

What's striking about this result is that there are two peaks separated by a gap. Informally speaking, people either love the clinic or hate it; only about 2% merely like it. This is exactly the opposite of the usual pattern, in which most people like an organization while a few love it and a few hate it. This unusual result suggests that community satisfaction with the clinic is highly polarized.
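
A two-peaked pattern like this falls out naturally if respondents come from two groups having very different experiences. Here is a minimal sketch of that idea; the 50/50 split and the two quality levels are hypothetical, chosen only to reproduce the shape:

```python
import random
from collections import Counter

random.seed(7)  # reproducible example

def rate(quality: float) -> int:
    """One rater's score: true quality plus personal variation, on a 1-5 scale."""
    return min(5, max(1, round(random.gauss(quality, 0.7))))

# Hypothetical: half of respondents have a great experience and half a poor
# one, with nobody's experience landing in the middle.
ratings = [rate(4.7) if random.random() < 0.5 else rate(1.3)
           for _ in range(1000)]
print(dict(sorted(Counter(ratings).items())))
# Expect peaks at 1 and 5 with a dip at 3: love it or hate it.
```

Ordinary rater-to-rater variation around a single quality level tends to fill in the middle of the scale; a gap there points toward two genuinely different underlying experiences.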

What does a result like this suggest? One possibility is preferential treatment of favored customers and poor treatment of less-favored ones. Another is that the two groups are seen by different staff members. We're not saying that either of these is definitely what is happening at the clinic; the data are suggestive, not definitive. But this is the type of analysis the district and clinic management should have been doing to understand the issues and figure out how to address them.

So what happened as a result of the community survey?

We have no idea.

When Elaine Komusi announced her pending resignation at the March meeting, Shanon Hardie said, "I talked to the staff a lot today about the survey, because I could see some nice trends, I saw some trends, some things about what we could do to help, some things about cost, and I thought, oh wow, there’s a lot, people don’t understand all the things we can potentially do, we could get some more information out there and help people with this and that. There were a lot of really great trends that I think we could help with. So it’s not that we’re perfect. We know that. And we’re happy to be part of those solutions. We need to be collaborating, not fighting, but collaborating so that we’re meeting those needs, and not fighting against each other."

Yes. Exactly. That's what we intended the survey to be used for: analyze the trends, find issues, and fix them. The district has had the final survey results for over three months now, but the commissioners have not discussed any findings at any Hospital District meeting, let alone potential solutions. When they review clinic metrics provided by Unity Care, they focus on what appears to be going right rather than on what's going wrong, which apparently leads them to conclude that there are no problems. There has been very little acknowledgement of the long-term drop in utilization, and even less discussion of how to reverse it.

There have been rumors around the Point this week that the clinic is on the verge of closing. We hope that isn't true. It would be particularly tragic if the survey results contain reliable trends that, acted on in time, could have kept the clinic open and even let it thrive again.

New comic every Monday