
The law of averages suggests that the longer a laboratory runs tests without having a result out of range, the more likely it is to produce an incorrect answer. The notion has no rigorous mathematical grounding, but it is a safe bet that no laboratory will forever avoid an erroneous result. Quality control and quality assurance programs exist to minimize how often errors occur and, perhaps more importantly, to recognize them when they do—a rather simple concept.
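As a rough illustration only (the per-run error probability here is hypothetical, not a claim about any real laboratory), the compounding effect behind that intuition can be expressed with independent-run probability:

```python
def prob_at_least_one_error(p: float, n: int) -> float:
    """Probability of at least one erroneous result across n
    independent runs, given a per-run error probability p."""
    return 1 - (1 - p) ** n

# Even a small per-run probability compounds quickly: at a
# hypothetical p = 1%, there is roughly a 63% chance of at least
# one error somewhere within 100 runs.
print(prob_at_least_one_error(0.01, 100))
```

The point is not the specific numbers but the shape of the curve: run long enough, and the chance of never seeing an error approaches zero.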

The challenge is that these programs are dynamic, responding to changing laboratory systems, evolving hospital needs, and developing market forces, particularly regulations. To stay on top, laboratorians need the time and the expertise to assess and adapt their programs so that their efforts are maximized and successful.

“Quality improvement really depends on having quality people. Laboratories can’t just buy a new and better quality program off the shelf, like they buy a new instrument or method,” says James O. Westgard, PhD, professor emeritus, Department of Pathology and Lab Medicine, University of Wisconsin, Madison, Wis, and president of Westgard QC Inc, also in Madison.

Unfortunately, today’s clinical laboratories are facing staffing shortages at a time when quality control is about to undergo a major revolution. “Everyone knows that equivalent QC is going away and risk-based analysis will become the industry standard,” says Andrew Schaeffer, MS, senior R&D scientist, Quantimetrix, Redondo Beach, Calif.

While equivalent QC has offered some advantages, particularly in the short term, many believe risk-based programs are a better option. Some—like Schaeffer, Westgard, and David Hunter of Randox Laboratories—consider equivalent QC to be scientifically unsound. Risk analysis offers a science-based improvement in quality protocols, though it brings drawbacks of its own, notably complexity.

“Risk analysis is a new quality tool that will take some time to learn and considerable practice to apply in a rigorous manner,” Westgard says. Laboratories face a steep learning curve, but tools and training opportunities are available to help them develop these quality programs—now, they just have to find the resources (time, expertise, money) to utilize them.


Andrew Schaeffer, MS, senior R&D scientist, Quantimetrix, Redondo Beach, Calif

WHO WILL LEAD THE CHARGE?

Expertise may be the most challenging resource to come by, particularly as more senior analysts retire. “[Laboratories] are finding it difficult to replace experienced people because educational programs in clinical laboratory science are struggling to keep up with the demands for new analysts,” Westgard says.

As laboratorians leave, they take “institutional knowledge” with them, sometimes making it difficult for the laboratory to maintain its quality program. Automation has been a tremendous boon in terms of performance, and it has had a positive impact on quality, but it helps to have an experienced eye to notice a problem with a system and then to fix it. Internal quality control checks cannot test a system’s full range of function.

“A lack of educated or experienced staff available to troubleshoot any potential QC errors—or even a lack of understanding the reasons for running QC—can have a very high impact on the quality of the results being released by any laboratory,” says David Hunter, manager of quality management solutions and products, Randox Laboratories, Co Antrim, UK.

Larger laboratories are at an advantage in the move to risk-based analysis and QC in that they can hire a specialist to manage the program. “The challenge is going to come to the smaller labs, which don’t have the resources to bring in a quality director and may not even have the resources for a significant amount of consulting time to help them design a risk-based program,” Schaeffer says.

These laboratories should identify analysts on the team who are interested in quality and then nurture them. All laboratory technologists should be aware of QC, but those who specialize can help with new tools and new plans.

“Someone in the laboratory must invest significant time and effort to understand how new tools should be applied to improve the quality of laboratory testing processes. That may require that those analysts are sent to courses and workshops outside the organization,” Westgard says.

IT’S ALL ABOUT RISK (NOT THE ECONOMY)

The need for expertise will be particularly acute during the upcoming switch. “CMS is expected to phase out existing practices for ‘equivalent quality control’ in favor of a new approach that employs risk analysis to develop QC plans,” Westgard says. For some, the change cannot come soon enough; many laboratories have already implemented programs that exceed the requirements of equivalent QC.


Laboratories can use instantaneous reporting features from data analysis services, such as Quantimetrix Corp’s Quantrol Online, to monitor performance against peers.

“Equivalent QC was a terrible concept, in my opinion,” Hunter says, noting that according to the guidelines, a control could be run as little as once a month. “If the control then falls out of range within that time period, it could take you a month to notice, and that would mean you would potentially have to go back, recall all the patients for that month, and rerun their samples. The cost involved for that could be phenomenal, and there are lives at stake,” Hunter says.

Risk-based programs, theoretically, should do a better job balancing risk and resources. Ultimately, the goal is to identify the various sources of error related to a device or test and select controls that can reduce the associated potential for error. For instance, “You can look at what the internal checks [on an instrument] do and think about how much risk that mitigates. Then, you can look at what further risks you need to mitigate with external, or wet, QC,” Schaeffer says.
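The mapping Schaeffer describes can be sketched as a simple failure-mode pass. Everything below—the failure modes, the scores, and the acceptability threshold—is hypothetical, shown only to illustrate the shape of a risk analysis, not any instrument’s actual risk profile:

```python
# Hypothetical risk-analysis sketch: score each failure mode by
# severity x probability, note which modes the instrument's internal
# checks already mitigate, and list what remains for external (wet) QC.
failure_modes = [
    # (name, severity 1-5, probability 1-5, covered by internal checks?)
    ("reagent degradation", 4, 3, False),
    ("short sample/clot",   5, 2, True),
    ("calibration drift",   4, 3, False),
    ("electronic noise",    2, 2, True),
]

ACCEPTABLE_RISK = 6  # assumed threshold on severity * probability

needs_external_qc = [
    name for name, sev, prob, covered in failure_modes
    if not covered and sev * prob > ACCEPTABLE_RISK
]
print(needs_external_qc)  # -> ['reagent degradation', 'calibration drift']
```

The residual list is what the laboratory’s external QC plan must then be designed to catch.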

It’s almost certain that this will mean more controls are run than were required under equivalent QC guidelines. “The challenge labs face is that there will, of course, be a cost implication of running more QC,” Hunter says. Yet, over the long term, it could actually save money by avoiding large patient recalls, such as in the earlier scenario.

Most importantly, however, the method will help to improve quality and patient care, and so laboratories will have to tackle the challenges. “There’s going to be a learning curve, but once they get into it and learn to analyze these risks routinely, I think we’ll see an overall improvement in quality throughout the industry,” Schaeffer says.

SIMPLER IS SOMETIMES BETTER

Because of the staffing situation in clinical laboratories, the learning curve is likely to feel steep. “CLSI [the Clinical and Laboratory Standards Institute] has partnered with CMS [Centers for Medicare and Medicaid Services] in making an aggressive push to provide guidance in this area, including educational materials, training programs, and workshops,” Westgard says, noting that professional organizations, usually proactive, have not yet shown much involvement.

Technology could help with these efforts, but Westgard says laboratories need improved data analysis packages that support the evaluation of methods, the design of statistical QC procedures, the implementation of control rules, auto-verification, patient data QC checks, the application of risk analysis, and the implementation of QC plans.

“Right now the data-analysis programs are usually assembled piecemeal, meaning there are individual programs operating on different platforms to address individual issues related to quality management,” Westgard says. “An integration of these tools would provide a more complete and comprehensive quality management program with fewer ‘hand-offs’ between one piece of software and another and fewer gaps where problems might get lost.”

Manufacturers are recognizing these needs and developing new tools that can help quality. “Industry is taking more responsibility because they recognize that more and more problems in the field are coming back to them to be solved,” Westgard says. “We’re seeing more interest and commitment from industry to fill in the training and education needs, and to support and improve quality management activities and programs.”

Westgard, an industry expert on quality control, maintains a Web site (www.westgard.com) that offers educational materials, both free and for purchase, including articles, books, references, software, online training, and workshops. Randox participates in association events, presenting quality-related talks when appropriate and possible. Earlier this year, the company offered an educational event in Dubai on good quality control practices for the clinical laboratory. Topics included daily QC, proficiency testing, and the use of an internal QC software program, as well as the value of having such a program.

“It’s important [for laboratorians] to understand why they would want to have good QC practices and the impact that has on the patient, because if you release a bad patient result, it has implications down the line, from a misdiagnosis to the worst-case scenario—a fatality,” Hunter says.

ONLINE AND IN TUNE

Naturally, many of today’s products take advantage of the Internet and offer practical options. Online quality control programs can provide significant assistance with data analysis, provided they are used, and used properly. Larger programs can offer more features and, therefore, more value, but the work can become complex.

“For small laboratories and POC [point-of-care] applications, there is a need for simple tools that provide the necessary analysis and results, with minimum demands on the operator’s skills and understanding of statistical data analysis,” Westgard says.

Bearing this in mind, Quantimetrix offers Quantrol Online, which provides instant, live, intralaboratory control information and peer group statistics. The program offers accessibility (with the ability to review and enter information at any time), instantaneous reporting, error checking (and flagging), and customized reporting. Another version of the program, UA Quantrol, provides similar capabilities for another line of tests: The Dipper, The Dropper, The Dropper Plus, DipandSpin, and QuanTscopics urinalysis controls; confirmatory tests; hCG (pregnancy) tests; specific gravity by refractometry; and microscopic evaluation.

Randox also offers an interlaboratory data management program, Acusera 24.7, which supports the associated line of controls that share its name; it is available in both online and desktop formats. Both enable laboratories to monitor analytical performance, interpret QC results, access peer group reports, and compare results with other laboratories using the same quality controls and often the same method and instrument. The program also provides access to fresh data and comprehensive reporting capabilities.

“Acusera 24.7 Web is the newest version of our online software that allows you to monitor your daily QC,” Hunter says. “Alerts can be programmed, based on specific and customized parameters, if QC results fall outside of the designated range.” For example, most people would aim to have their QC results within two SDs [standard deviations] of the mean. “With the software, you can set a rule that if the QC results fall out of that range twice within a certain time frame, the user is notified with an alert.”
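The rule Hunter describes—two results outside ±2 SD within a window—can be sketched in a few lines. This is an assumed implementation for illustration only, not Acusera’s actual logic; the function name and parameters are hypothetical:

```python
def qc_alerts(results, mean, sd, k=2.0, window=2):
    """Return indices of runs where `window` consecutive QC results
    all fall outside mean +/- k*sd (a 2-2s-style rejection rule)."""
    out_of_range = [abs(x - mean) > k * sd for x in results]
    alerts = []
    for i in range(window - 1, len(out_of_range)):
        # Alert only when every result in the trailing window failed.
        if all(out_of_range[i - window + 1 : i + 1]):
            alerts.append(i)
    return alerts

# With an established mean of 100 and SD of 2, two consecutive
# results beyond +/- 4 trigger an alert at the second of the pair:
print(qc_alerts([100, 101, 105, 106], mean=100, sd=2))  # -> [3]
```

A single excursion beyond 2 SD would not fire this rule; it takes the repeat violation, which is what distinguishes a real shift from ordinary statistical noise.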

Laboratories can, therefore, use these programs to provide another quality control monitor; the reporting and charting tools help to identify trends (such as a positive or negative bias in results), ideally before they become serious problems. Newer automation can also help, offering more internal controls to reduce the demand on the laboratorian’s physical time.
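A trend check of the kind those charting tools perform could, for example, flag a long run of results on one side of the mean. The sketch below is hypothetical, loosely modeled on a Westgard 10x-style rule rather than on any vendor’s implementation:

```python
def bias_flag(results, mean, run=10):
    """Flag a possible systematic bias when `run` consecutive QC
    results fall on the same side of the established mean."""
    streak, side = 0, 0
    for x in results:
        s = 1 if x > mean else -1 if x < mean else 0
        if s != 0 and s == side:
            streak += 1
        else:
            side = s
            streak = 1 if s != 0 else 0
        if streak >= run:
            return True
    return False

# Ten results all above the mean suggest a positive bias, even if
# each individual result is well within control limits:
print(bias_flag([101] * 10, mean=100))  # -> True
```

Results that scatter randomly on both sides of the mean never build a streak, so the flag stays quiet for ordinary imprecision.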

“If you can afford to invest in new equipment, you can save a lot of time and trouble simply using the supply-and-control modules that manufacturers have already developed to work with the instruments’ internal QC checks,” Schaeffer says.

Of course, many laboratories may not be able to afford the new equipment, or (more likely) they have a combination of analyzers with different ages and life cycles—which is why the customization of risk-based analysis plans is “ideal” and complex.

Ultimately, however, the issue is simple: Whether the law of averages is scientifically sound or unsound, eventually, a laboratory’s quality control values will fall out of range. It’s up to the quality control program to mitigate this risk and to the laboratorian to implement the program so that it maximizes success and minimizes cost. It’s so simple, it’s complicated.


Renee Diiulio is a contributing writer for CLP. For more information, contact Editor Judy O’Rourke,

COMPANIES PRESENT MORE THAN PRODUCTS

With the expected switch in guidelines (from equivalent quality control to risk-based analysis), education is now more important than ever. But laboratorians do not have a lot of resources—time or money—to invest in obtaining this education. So manufacturing companies have stepped up their educational efforts to provide laboratorians with the knowledge they need and want to create successful quality control programs.

Bio-Rad Laboratories, Hercules, Calif, has developed a number of learning opportunities in response to customer demands. The company offers five categories of training: Foundation (the basics), Implement QC Tools, Achieve Accreditation and Compliance, Unlock Your Lab’s Potential, and Stay on the Cutting Edge. It can develop sessions in these areas to match the audience’s need.


Curtis A. Parvin, PhD, and James O. Westgard, PhD, jointly presented a session on risk management and lab QC at the recent AACC annual meeting.
Photo courtesy of  WESTGARD QC

“Sometimes, the best person to speak is a regional leader, someone in the state or area who is constantly driving improved performance in their lab and whose opinion is sought out by other laboratorians. Other times, a nationally recognized speaker is more appropriate due to the attendee makeup or the need for specialty expertise, but the core of our Foundation training is performed by our field-based Quality System Specialists,” says Andy Quintenz, global scientific and professional affairs manager, Bio-Rad Quality Systems Division. Learning objectives are set, and a program developed.

Most recently, the company was involved in the presentation of material at the annual meeting of the American Association for Clinical Chemistry (AACC), including a session on risk management and lab QC, roundtables, coursework on patient safety, and a poster on the statistical validity of using group reference mean instead of instrument mean. “The labs we talked to thought there was some validity to the unendorsed practice, but there was no scientific rationale or understanding of its effect on patient results. In both cases, we were surprised to discover that the statistical analysis showed that what the labs were doing not only was valid but also had pretty strong supporting data,” Quintenz says.

In terms of risk management, Quintenz shared the primary takeaway from the focused session, which was presented by two industry thought leaders, James O. Westgard, PhD, professor emeritus in the Department of Pathology and Lab Medicine, University of Wisconsin, Madison, and president, Westgard QC Inc, both in Madison, Wis; and Curtis A. Parvin, PhD, manager of advanced statistical research at Bio-Rad. “Risk management isn’t going to supplant statistical QC. It helps more in pre- and post-analytical areas where statistical QC has never been the most appropriate tool,” Quintenz says.

Bio-Rad selects its subjects based on the popular topics in industry publications and conversations with laboratorians. QC Design is a frequent hot topic and is expected to get hotter as labs prepare to create new risk-based quality control programs. Manufacturers will be there to fill the gaps in education along with those in operations.