
QC for accreditation: CMS validation inspections


Anne Paxton

May 2015—Quality control is second nature and part of the air that laboratories breathe. So it’s no surprise that QC should be subject to quality checks of its own, as one of the pivotal checklist areas that CAP’s Laboratory Accreditation Program focuses on during inspections.

But many people in the laboratory may not know there’s yet another layer to the review process—a triple check, if you will. The Centers for Medicare and Medicaid Services conducts its own form of quality control for the inspection process itself: validation inspections. Under a CLIA mandate that has been in place for more than two decades, all seven of the deemed laboratory accreditation organizations are subject to these quality checks.

For the College, this validation process means that 80 to 120 of its laboratory inspections are followed up each year with re-inspections by CMS staff. After a validation inspection, the CMS compares what it finds with what CAP inspectors reported, scoring CAP “misses” as disparities.

Dr. Datto

Disparities can occur for a wide range of reasons, so variations in rates are typical. But the most recent summary report from the CMS, which covered the 103 laboratories for which the CMS performed validation inspections in fiscal year 2013, showed an unusual uptick in disparities—to 17 percent. “The level was getting to the point of being a bit uncomfortable,” says Michael Datto, MD, PhD, vice chair of the CAP Accreditation Committee.

The CAP Laboratory Accreditation Program is on solid footing, and at the end of March the CMS reapproved the CAP as a deemed accreditor for another six years. But a disparity level of 20 percent is the mark at which the CMS steps up its scrutiny via a “deeming authority review.” The CAP would like to get its disparity level down to where it normally hovers, at under 10 percent. “The College believes this may be a good time to make more laboratories aware of the CMS process and its importance,” says Dr. Datto, who is medical director for clinical laboratories, Duke University Health System, and an associate professor in the Department of Pathology at Duke.

The validation process provides evaluative data for all deemed laboratory accreditation organizations under CLIA. Only the 90 percent of CAP-accredited labs with CLIA numbers are subject to CMS validation inspections. If chosen, a laboratory gets about a week’s notice that CMS inspectors are planning to be on site to do a validation, says Amy Daniels, the Laboratory Accreditation Program’s senior manager of investigations.

“In general, they are selected at random, according to CMS,” she notes. But if the CMS receives a complaint against a laboratory that is due to have an inspection by the CAP, the CMS may add that laboratory to the validation list. Still, “A validation is kind of an unusual event for a laboratory,” Daniels says. “Most of them aren’t aware of this process and what it entails.”

The CMS must carry out the validations within a 90-day window after a CAP inspection, says Linda Palicki, CAP’s director of continuous compliance. “If they find a deficiency that was missed by the CAP inspection team, and if it is a condition-level deficiency of a higher severity or significance, that counts as a discrepancy between the CMS audit and the CAP inspection.”

The CMS and the CAP have different setups for assessing compliance, Dr. Datto explains. “At CAP, we have phase one and phase two deficiencies, but CMS is a little different in that they can take any of their accreditation requirements and say it’s ‘severe’ to such a level that it’s a condition-level deficiency.”

Nevertheless, “The deficiencies really do have to match exactly,” Daniels says. “If we cited the lab for not having performed competency on a person, but CMS cited that person for not having credential documentation in their file, that would not be the same thing. It would be a disparity.”

Because the College accredits most of the large laboratories in the country, disparities may not be what they seem. “We may be at an increased statistical risk of having a disparity because of the large size of the laboratories we accredit,” Daniels says. The differing methodologies of CAP and CMS also tend to create somewhat different results. “We may go in with a team of 12 and stay one day, while they may go in with a team of three and stay all week.”

On the CAP’s end, the validation inspection results get careful review and often lead to policy changes. The College takes two steps whenever it gets a validation report from the CMS, Daniels says. “First, we review the report, and if there are any problems, we work with the laboratory to help it correct them. The other piece is we look at all the validation results for all our labs every year comprehensively, to use them from a quality improvement perspective, to see how the College’s accreditation program can improve so that issues aren’t identified at the CMS inspection.”

Five main areas are the primary source of disparities: personnel records, test validation, proficiency testing, competency, and the responsibility of the laboratory director.

Lack of personnel documentation has been a leading issue in validation inspections since 2008, when the CMS was just beginning to issue personnel citations. “In a CLIA audit, they do a very thorough job of making sure all personnel records are there and intact,” Dr. Datto says.

As Palicki describes it, the CMS’ approach to personnel records is “a very deep dive” to make sure every employee has all the documentation of his or her education and experience in the file. “If one person—and you might have a lab of 50 or 60 people—is missing something, it’s an automatic citation.”

That procedure contrasts with the CAP’s approach in most inspections. “Oftentimes we’re dealing with very large labs with several hundred employees, so inspectors will do a sampling of employees’ files. But with sampling, you won’t find that needle in a haystack,” Palicki notes. The CMS itself is accustomed to inspecting labs, but those tend to be smaller, including physician office labs and small independent labs, while the CAP accredits larger institutions such as university hospital labs and reference labs in addition to community hospital laboratories.

At those institutions, in many cases, the laboratory may retain only a partial personnel record while the institution’s human resources department holds the remaining items, says Desiree Carlson, MD, chair of the CAP Complaints and Investigation Committee.

Dr. Carlson

“It would be really good if the laboratory could keep copies of all the items right there in the laboratory. But sometimes it’s really difficult, if HR is miles away, to get those records.” Many people don’t realize they have to have copies of diplomas, says Dr. Carlson, who is chief of pathology at Signature Healthcare Brockton Hospital in Brockton, Mass. “And if someone is a high school graduate from 30 years ago and their high school burned down, it’s not going to be possible to get that diploma, and it can be distressing for everyone.”

Even in laboratories that do have complete records, they are usually not placed in the same order, she adds, so better organization is often called for. “It would be helpful if each file had tabs so you’d know where you’re going to find the diplomas or the competency assessments or the color blindness test results.”

Some hospitals use a third-party company to verify credentials, says Richard Scanlan, MD, chair of the CAP Commission on Laboratory Accreditation. Particularly in an area like point-of-care testing, where there may be 2,000 nurses performing testing of one kind or another, this can be almost essential.

“If somebody just prints up a diploma at home, it’s hard to know if it’s legitimate or not, so somebody needs to call up the school, which can be difficult if it’s outside the U.S. So they’ve hired companies to do that,” Dr. Scanlan says. “But the companies don’t automatically forward the records they’ve vetted to the labs, so that creates a problem.”

In recent years, the Laboratory Accreditation Program has made sure to draw laboratories’ attention to this checklist area. “As soon as you open your inspection packet, the first couple of pages is about how to look at personnel records,” Dr. Datto says. Other measures are under consideration. One arrangement the CAP has proposed to the CMS is a system whereby producing records within seven days would be sufficient for purposes of an accreditation policy, says Dr. Scanlan, vice chair of laboratory medicine and director of the transfusion medicine service at Oregon Health & Science University. However, he adds, that’s not current policy and is still only at the proposal stage.

Proper validation of laboratory tests is another area that creates big questions for laboratory accreditors, Dr. Datto says. Whether a lab modifies an approved test or develops one of its own, inspectors have to take a close enough look at all the laboratory-developed tests to ensure test accuracy, precision, and reproducible, reliable results. The College and the CMS require that all tests have accuracy, precision, and reportable range determined before patient testing starts. Says Dr. Scanlan: “When tests are laboratory-developed or modifications are made to FDA-approved tests, laboratories must also determine analytic sensitivity, specificity, and reference intervals. CMS will cite laboratories that fail to do the required additional validation studies. Both laboratories and inspectors need to pay particular attention that all necessary validation studies have been done.”

It’s a dilemma for accrediting organizations, Dr. Datto says. “How feasible is it for an inspection team to go to a mid- or large-sized medical center and actually review all the validations for the new laboratory-developed tests—not only LDTs but all the FDA-approved tests, which still need validation? In a complex molecular lab, an inspection team might spend a whole day, but if the lab has introduced six new tests in the past two years, and you have to review all of the validations, that’s tough. That is a lot to ask during a short inspection,” he points out. “When the FDA reviews tests, they take weeks, months, and years. We’re trying to squeeze all that into an afternoon.”

“We’ve had several discussions on how to make sure laboratories understand how to perform a validation. We also need to make sure inspectors know all the new tests that have been brought online in the laboratory that they are inspecting,” he adds. One molecular checklist item requires that all LDTs developed in the last two years be reviewed, and sometimes the discrepant results from the CMS validation inspections concern lack of due diligence on test validation. “But it’s hard to find deficiencies in a targeted inspection over several days, particularly in a big laboratory system.”

The average laboratory does validations appropriately, Daniels says, but it is easy to slip into using a test on a different specimen type and overlooking the need for validation. “An assay may be FDA-approved for serum, but the laboratory may decide to perform the test on another body fluid or plasma. The test has to be validated on that specimen type before it can be offered to patients.”

Some tests may fall through the cracks, Dr. Carlson agrees. Before an inspection, “sometimes we’re not even aware that the laboratory has added a test or an instrument. One of the things we’re suggesting is requiring the lab, when it sends in its application for inspection/reinspection every two years, to list all the new instruments it has, so the inspector is reminded to look at the validations, because those should have been done before patient testing begins.”

CMS and CAP inspectors don’t always agree. In some instances, the CMS might feel the laboratory is not following the manufacturer’s instructions to a T, whereas the inspection team saw it differently, Palicki notes. “Sometimes it’s a subjective judgment and the inspectors may have felt what the lab did was sufficient.”

In addition to the CAP’s internal review of the disparity data, a workgroup of the Laboratory Accreditation Program is studying the data to see if there is some way to help volunteer inspectors recognize the occasions when the inspector makes a recommendation but probably should have cited a deficiency.

Proficiency testing, too, can lead to disparities between CAP and the CMS validation inspections, and the CMS has increased its scrutiny of PT, Daniels says. Among the irregularities related to proficiency testing that are frequently cited by validation inspectors: not handling PT specimens in the same way as patient specimens—for example, by testing PT samples in duplicate but patient samples only once.

Until recently, Dr. Carlson notes, the CAP allowed testing on two instruments—if you had two blood gas analyzers you could run PT on both—but the CMS disallowed that in 2014. “Now you have to run the PT on one or the other instrument to make sure that the two have corresponding results, and you have to do independent testing twice a year comparing the two instruments.”

Another important rule is that anyone who does patient testing has to do proficiency testing and quality control, Dr. Carlson adds. “If you have five people working in hematology over the course of the two-year cycle, all five people should be running the proficiency unknown and sending the answers to the College.” But what happens in practice may not quite meet that standard. For example, “Sometimes a person who seems at risk of getting the wrong answer may be informally excluded.”

One issue that comes up concerns proficiency testing referrals, Dr. Datto says. “Within a big laboratory system of the type we see associated with academic medical centers, you might have labs on either side of a street or in different buildings. Because they are physically separate, they are considered by CMS as separate labs and would have separate CLIA numbers. Walking a sample across the street—even if this is the normal workflow for the lab—is PT referral.” The great majority of PT referral is inadvertent, he adds. But to comply with CLIA, “all proficiency testing has to happen under the roof of one CLIA number.”

Follow-up is sometimes lacking. “If a laboratory fails a PT event and it doesn’t investigate or figure out appropriate corrective action—that’s another type of issue we’re seeing,” Daniels says.

What typically happens is that the laboratory scores 80 percent on a five-challenge survey and passes, but then fails to investigate why the one challenge was wrong. The CMS and the CAP want laboratories to look at every mistake and understand the nature of the error. “CAP inspection packets include a PT exception report describing all unacceptable PT results,” Dr. Scanlan says. “Inspectors should thoroughly review these instances to make sure appropriate investigation and corrective action were taken.”

All of this can be frustrating for the CAP’s program, because the checklist requirements are clear, Daniels says. “Our laboratories know. They’re just not able to do it all the time.”

The issue with competency is primarily that there are six elements of it “and you have to do all of them and usually a laboratory is missing a component,” Daniels says. “For example, it may have completely omitted a specific instrument from its competency program.”

It’s easy to explain that every person doing testing has to have his or her competency assessed annually, but it’s a hard requirement to implement, Dr. Scanlan says. “It has to be on every test system the lab uses. In a core lab like mine here at OHSU, everybody rotates around to each station, so nobody is doing just hematology or chemistry; they’re doing all the analytes. With upward of 30 test systems in my lab, that’s 30 direct observations. It gets to be an enormous amount of work to amass all the data they want to see.”

Dr. Scanlan

More often, a problem arises because the CMS inspector asks for verification. “The laboratory may have prepared by simply checking off yes, yes, yes, and yes, on the checklist,” Dr. Carlson says. “But the inspector comes in saying, ‘Show me this data from August 2013’ or ‘Show me the letter where the medical director delegates certain functions in the lab to the supervisors.’ And they won’t be able to find the documentation, or maybe it was never written.” Preparation for the inspections should include locating the actual documents that show compliance with requirements, she stresses.

Because she has been closely involved with the CAP’s accreditation program, at her own institution Dr. Carlson uses Outlook to remind herself to send an email to all supervisors on the 20th of each month, to prompt them to get their QC to the medical director before the end of the month and have the medical director sign off on the procedure review. “Ideally, everyone would have that same level of attention to the process,” Dr. Carlson says.

A final key difference in approach between CMS and CAP inspections is the point at which the laboratory director, based on his or her responsibility for a particular area, is cited for a deficiency. Once CMS inspectors find a deficiency, they are more likely than CAP inspectors to include an additional citation for the person they believe is responsible.

“CMS inspectors may cite the lab director for oversight, so there would actually be two CMS citations, where often we will only cite the specific situation,” Palicki says. Or, as Dr. Carlson puts it, the CMS has more of a tendency to “double bill” on some checklist items. As a result, the Condition for Laboratory Director is a top deficiency in CMS validation inspections.

Generally, Dr. Carlson says, “our inspectors will wait until they see a more global quality issue before citing the laboratory director. So that will sometimes contribute to the disparity rate.” It’s a judgment call, Daniels notes. “If one person’s records were missing a document, would I cite a team leader checklist deficiency? No, I wouldn’t. But if half the personnel records were missing documentation, yes, I would cite it.”

Leaders of the CAP program have discussed possibly noting on specific checklists that if a laboratory is cited for a certain deficiency, “then the team leader should also cite the laboratory director so it pairs up,” Dr. Carlson says. In fact, she considers citation of the medical director, whenever there is a problem on a section-specific checklist, the most important change inspectors should make to get in sync with CMS validation inspections. “This is at least a moderate to severe level issue we are planning to address with a number of steps,” she says.

The CMS has distinct expectations of what the laboratory medical director is doing, Dr. Scanlan stresses. “When it comes to personnel, proficiency testing, and quality control, that’s the laboratory director’s personal, legal responsibility, and he or she needs to pay attention to that. Those are crucial areas for quality in the lab, and the director needs to get personally involved in making sure that whatever has been delegated is getting done.”

In addition to raising general awareness about CMS’ validation inspections, CAP accreditation program officials and staff are considering conducting webinars for laboratory inspectors, or possibly modifying the lab packets to ensure the information provides clear expectations to inspectors, Palicki says. “I’m sure it just adds more stress for the laboratory when CMS walks in the door and finds things the previous team didn’t find. And we definitely want to minimize that for them. We want to make sure the process is as fair as possible for the labs while at the same time ensuring the quality of the labs.”

In a perfect world, CAP and CMS inspection findings would match, Daniels says, but inspectors can get a little lost in the accreditation process. “Sometimes you’re used to the checklist questions and you go through all of them, but you can lose perspective about the big picture. The big picture here is that someone could come into the laboratory after your team was there, and there are key areas they are focusing on too. However, if we make labs aware of those issues and they’re conscientious about them, I have to believe our disparity rate will improve and ultimately our labs will improve.”

The CAP program is not a punitive one, Dr. Datto notes. “Very few labs get revoked. There are a large number of labs, but CAP puts an enormous amount of effort into making sure they get up to speed in their validation, their proficiency testing, or their competency assessments.” Labs are fortunate in a way, he says. “We have standard operating procedures for everything. We have key performance indicators. We have quality management plans. We have a nice set of checklists by which we can live our laboratory lives, and all of these things are not necessarily in the DNA of other parts of a hospital system.”

As Dr. Carlson notes, the laboratory is more diligent than any other part of the hospital in keeping things up to date and recording and documenting. There’s nothing analogous to the laboratory accreditation process in other departments, she adds. “Nobody comes in and inspects radiology with 3,000 questions.”

In the quest to sync up with CMS validation inspections, the key task is paying more attention to how these high standards are met. “The whole reason why we have deemed status from the government is we’re supposed to be as stringent as the CLIA requirements,” Dr. Carlson emphasizes. In the CAP’s case, “we tend to be more stringent. We just have to make sure the laboratories are documenting that, and when they’re not, that inspectors pick up on it.”


Anne Paxton is a writer in Seattle.

