Grading the grades – a data-driven look at Restaurant Food Safety Scores

The food scene in Charleston is unquestionably one of my favorite parts of living here. Charleston has high-quality restaurants of all types and for all budgets. Despite the nuances that make each restaurant unique, they all have something in common: a food safety grade of “A,” “B,” or “C,” assigned by a DHEC inspector and displayed in the restaurant.

During a recent dining experience in a city outside of Charleston, my wife found a hair in her food. The restaurant in question had its ‘A’ food safety rating card properly, and perhaps proudly, displayed, but that did little to console my wife, who promptly put down her fork. I typically don’t pay much attention to the ratings, but the ensuing conversation about restaurant cleanliness and food safety ratings inspired me to do a little research on what goes into a DHEC grade, and frankly, DHEC may have a quality control problem.

According to the SC DHEC website, restaurants are subject to unannounced routine and/or follow-up compliance inspections. During an inspection, points are assessed for each violation uncovered and totaled, and a score is assigned: 0-12 points is an ‘A,’ 13-22 points a ‘B,’ 23-30 a ‘C,’ and anything over 30 points presumably shuts the restaurant down. Points are assigned at the discretion of the inspector, and a number of violations can be corrected during the inspection, which reduces the final point value of the violation. My first thought is that this score range seems quite wide, potentially masking serious violations behind an ‘A’ rating displayed in the window of your favorite eatery.
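As a quick illustration, the grade assignment described above reduces to a simple lookup. This is a hypothetical sketch based on the published ranges, not DHEC's actual scoring tool:

```python
def letter_grade(points: int) -> str:
    """Map total violation points to a DHEC-style letter grade.

    Cutoffs from the SC DHEC description: 0-12 -> 'A', 13-22 -> 'B',
    23-30 -> 'C'; over 30 points presumably closes the restaurant.
    """
    if points <= 12:
        return "A"
    if points <= 22:
        return "B"
    if points <= 30:
        return "C"
    return "CLOSED"  # over 30 points: restaurant presumed shut down

print(letter_grade(12))  # "A" -- the worst score that still earns an 'A'
print(letter_grade(13))  # "B" -- one point later, a different letter
```

Note how a single point separates the worst ‘A’ from the best ‘B’ – a boundary that matters later in this post.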

The data available through the DHEC website is admittedly limited, but what is available is suspicious. Of the 1,452 routine (unannounced) restaurant inspections recorded between 2013 and 2015, 97% of the restaurants inspected received an ‘A’ rating; the remaining 3% received a ‘B,’ which was the lowest grade in the dataset.

The histogram below shows how scores are distributed for the available data. I expected the data to be skewed, but I was shocked to find that the most frequently awarded score was 0, indicating no violations were found. The inspection report has 56 sections to score, and each section has multiple applicable codes; there are literally hundreds of possible violations that could be found during an inspection, and yet 12.5% of the routine inspections yielded not one violation!

[Histogram: distribution of routine inspection scores]

The other component of this data that particularly interests me is the difference between the number of restaurants that score 12 and the number that score 13 – keep in mind this is the difference between an ‘A’ and a ‘B’ rating for the restaurant. The number of restaurants that score 12 points (just enough for an ‘A’) is 17 times higher than the number that score 13, the best possible ‘B.’ That so many restaurants would land just inside the ‘A’ cutoff and so few just one point past it seems fishy. Given the perception around restaurant grades, anything less than an ‘A’ could cause a major loss of business if word got out. It makes sense, then, that an inspector might provide enough leniency to keep a restaurant in the ‘A’ category. In fact, I’m reminded of the time in high school that I caught a lucky break and a bump from B+ to A- based on (non-graded) class participation. It wasn’t a grade warranted strictly by the points I earned, but one based on the discretion and leniency of the teacher.

However, this article isn’t about getting a grade I didn’t deserve; it’s about food safety and the reliability of restaurant safety grades. The DHEC materials make it clear that inspectors are allowed to use discretion during the inspection, but the data suggests that inspectors may be too lenient on restaurant grading.

There are a number of things that could be done to address quality issues in the inspection process:

  1. Review score distribution by inspector; this distribution might indicate training issues (inspectors consistently giving very low scores indicating few or no violations) or too much leniency (inspectors that have given out many scores of 12, but few or no scores of 13).
  2. Have two inspectors present for every inspection and have them look for different things. They can tabulate the total score at the end and neither would know if a restaurant was on the cusp between letter grades.
  3. Update the scoring system; given the fairly wide range of scores within each grade, you could update the point values or narrow the grade ranges. In reviewing the report sheet, the point values assigned to each violation strike me as quite generous – a restaurant could theoretically rack up a number of violations and still total 12 or fewer points, allowing the restaurant to keep an ‘A’ rating.
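The first suggestion above could be sketched as a simple boundary check per inspector: compare how often each inspector awards 12 points (the worst ‘A’) versus 13 points (the best ‘B’). The inspector names, score lists, and flag threshold below are hypothetical illustrations, not real DHEC figures:

```python
from collections import Counter

def leniency_flag(scores, threshold=4.0):
    """Flag a suspicious pile-up of scores just inside the 'A' cutoff.

    Compares how often 12 points (the worst 'A') is awarded versus
    13 points (the best 'B'). A large ratio suggests scores may be
    getting nudged below the grade boundary.
    """
    counts = Counter(scores)  # missing scores count as 0
    at_12, at_13 = counts[12], counts[13]
    if at_13 == 0:
        return at_12 > 0  # 12s but never a single 13 is itself suspicious
    return at_12 / at_13 > threshold

# Hypothetical per-inspector score lists, not real DHEC data:
inspectors = {
    "inspector_1": [0, 4, 12, 12, 12, 12, 12, 13],  # many 12s, one 13
    "inspector_2": [3, 8, 11, 12, 13, 14, 22, 25],  # smoother spread
}
for name, scores in inspectors.items():
    print(name, leniency_flag(scores))
# inspector_1 True  -- flagged for review
# inspector_2 False
```

The threshold would need tuning against the full dataset; the point is that a pile-up at 12 with a cliff at 13 is easy to detect once scores are grouped by inspector.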

The data found on the DHEC website indicates that DHEC may have a quality control issue among its inspectors – quality issues that could put the public at unnecessary risk.

There is little evidence that foodborne illnesses are a big problem in Charleston – anecdotally, I can’t recall myself or any of my friends ever dealing with food sickness after a meal in Charleston. In light of the data, I believe this is likely a result of Charleston restaurants making an effort to keep diners safe in spite of DHEC inspections, not because of them.
