“It’s rather well known among control manufacturers—not just chemistry, but all controls—that these are not really sexy products. It’s a little glass bottle containing liquid or a little plastic tube with some powder in it, and that’s kind of it,” admits John Innocenti, president of AUDIT MicroControls Inc., Carlsbad, Calif.

Innovations, therefore, most often take place with packaging, delivery, and management systems. Mirroring trends in diagnostics, control sample sizes have gotten smaller, analytes have been consolidated, and automation has improved the management and reporting of quality control data. Some of these changes have meant increased costs and labor for laboratories, while others have created savings in both areas.

There has been a similar give-and-take in the laboratory regarding the freedom and responsibility surrounding quality control. Under equivalent quality control, laboratories are free to develop quality control programs that maximize their resources while assuring quality. This may mean running fewer controls, but it also means collecting and maintaining the evidence to support that approach.

“Equivalent quality control is not as well adopted as it might seem because it puts the onus on the laboratory to really understand the entire process to [be able to] make the determination of the quality control requirements for that laboratory for that test system,” says Max Williams, global scientific and professional affairs manager with Bio-Rad Laboratories, Hercules, Calif.

More or Less

Market growth, in general, is expected to be modest. “As far as chemistry controls, we’ve looked at a number of different marketing studies, and they seem to indicate the growth in this sort of general testing is only going to be about 1 percent over the next year,” says Andrew Schaeffer, senior R&D scientist at Quantimetrix in Redondo Beach, Calif.

On the other hand, greater scrutiny under new Clinical Laboratory Improvement Amendments (CLIA) regulations may push laboratories to run more controls, somewhat offsetting the expected slow growth. “If laboratories do tighten up their control rules in response to the additional pressure from CLIA, that would force them to run more controls. So we might expect to see a modest growth in this particular sector,” Schaeffer says.

(L to R) Kevin Jones, VP, sales and marketing, Aalto Scientific, and John Innocenti, president, AUDIT MicroControls Inc.

The added pressure on laboratories to successfully manage patient risk requires them to take greater control of their own processes and design their own quality control programs. “Whether following the regulatory minimum per CLIA or other standards being developed around the world, the laboratory has to look at everything, from the clinical utility of the test to the actual performance in the lab, and it requires them to look at all sources of information,” Williams says.

One of those resources is a document recently released by the Clinical and Laboratory Standards Institute (CLSI) and currently under review: “Laboratory Quality Control Based on Risk Management; Proposed Guideline (EP23-P).” The recommendations suggest developing quality control programs based on risk management and tailored to the test’s specific combination of measuring system, laboratory setting, and clinical application.

More in One

This may mean more controls or more sophisticated controls for some laboratories. “The current requirement is basically to run a control, two levels, once a day, but this doesn’t really address patient risk. If you have a lab, [for instance], a regional lab that runs anywhere from 100 to 150 tests of a specific analyte once a day and then runs QC once a day, its risk exposure is going to be far less compared to a laboratory running 2,000, 3,000, 4,000, or 5,000 results of that same type of analyte,” says Serge Jonnaert, manager for informatics strategy and business development at Bio-Rad.
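As a rough illustration of the arithmetic behind that comparison (the volumes below are the ones Jonnaert cites, and the once-daily QC schedule is the minimum requirement he describes), the number of patient results potentially affected by an undetected error scales with the volume reported between QC events:

    # Sketch of the risk-exposure comparison: with QC run once per day, every
    # patient result reported since the last passing QC is potentially affected
    # by an undetected error. The volumes are the examples quoted above.

    def results_at_risk(daily_volume, qc_events_per_day=1):
        """Maximum patient results reported between consecutive QC events."""
        return daily_volume // qc_events_per_day

    for volume in (150, 5000):
        print(f"{volume} results/day, QC once daily -> "
              f"up to {results_at_risk(volume)} results exposed per QC event")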

The laboratory with the higher volume may consider running additional controls, or it may consider different controls. An emerging trend in an increasing number of laboratories is to run more than two levels of controls. “Even though there are some short-term savings that can be achieved by running two levels of control instead of three levels, running three levels of controls eliminates the need to perform linearity verification every 6 months. The lab saves money and time by not needing to purchase and run expensive linearity tests,” says Doug Borses, director of sales and marketing for Diazyme Laboratories, Poway, Calif. In addition, some laboratorians report increased confidence with three-level controls because they enable the laboratory to verify performance above, below, and at critical cut points on every run.

Greater sophistication in controls also means one vial can be used to measure more than one analyte. “Since we offer more lyophilized products than we do liquid, we aim to put as many different tests in one vial as we possibly can to assist in keeping the cost down. The customer doesn’t have to buy six, seven, or eight different controls but possibly only one that could include everything in that vial,” says Innocenti, adding that the trend crosses disciplines and is evident in hematology, coagulation, and immunoassay in addition to chemistry.

“If the laboratory can monitor all of its analyte testing with a fewer number of controls, that can save the laboratory money and, maybe even more importantly, the labor and time spent processing the controls,” says Paul Hardy, business unit marketing manager for Bio-Rad.

Smaller Samples: Less Cost, More Storage

Savings have also been realized as sample sizes have shrunk, in part to reduce the amount of trauma the patient must suffer. “[In the past], a test would need 0.5 mL of serum. Now, they’re down to 30 or 40 microliters to run a full battery—not just one test, but 20, 30, 40 different tests at one time,” Innocenti says.

A smaller requirement in the amount of patient sample is typically reflected in the control, since it will also run as a smaller sample. Innocenti notes that, as a result, some products previously offered in a 5-mL or 10-mL vial are now available as 1-mL or 2-mL vials.

The ability to run smaller amounts of control has two potential impacts: the first is the cost savings realized from purchasing less control material; the second is the need to store controls that now take longer to consume in a way that maintains their stability. “It highlights the need for long open-vial stability of the controls, so there’s no waste and the customer can still use all the control purchased,” Hardy says.

Hardy sees some laboratories returning to lyophilized controls as a result, but liquid controls remain popular despite their higher cost. Advances have brought prices down and stability up, yet laboratories must still weigh cost and stability against testing volume and budget when selecting control material.

Better Algorithms, Better Data

If a control is no longer viable, a laboratory should be able to determine this not only from its shelf-life expiration but from its own data. Continuous monitoring of program metrics, built into the quality control program, helps a laboratory detect trends, identify corrective actions, and drive improvement. Automation makes this collection, analysis, and presentation of data much quicker and more effective, and advances in software make it easier to detect trends across reagent lots.
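For a sense of what such software automates, here is a minimal sketch, in Python, of checking a series of control results against an assigned mean and SD and flagging both out-of-range points and sustained drift; the rules and thresholds are illustrative assumptions, not any vendor’s actual logic:

    # Minimal sketch of automated QC trend checks, assuming a simple
    # Levey-Jennings-style model (assigned mean and SD for the control lot).
    # Rule names and thresholds are illustrative, not any vendor's.

    def qc_flags(results, mean, sd, drift_len=6):
        """Return (index, flag) pairs for a series of control results."""
        flags = []
        for i, value in enumerate(results):
            z = (value - mean) / sd
            if abs(z) > 3:
                flags.append((i, "beyond 3 SD: reject run"))
            elif abs(z) > 2:
                flags.append((i, "beyond 2 SD: warning, inspect run"))
        # Drift check: drift_len consecutive results moving in one direction
        for i in range(len(results) - drift_len + 1):
            window = results[i:i + drift_len]
            steps = [b - a for a, b in zip(window, window[1:])]
            if all(s > 0 for s in steps) or all(s < 0 for s in steps):
                flags.append((i + drift_len - 1,
                              f"trend: {drift_len} results drifting one way"))
        return flags

    # Example: a control with an assigned mean of 100 and SD of 3
    print(qc_flags([99, 101, 100, 102, 103, 104, 105, 106, 112], mean=100, sd=3))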

“Back in the day, we used to have paper charts on a bulletin board in the hall, and we would fill in little circles with a number two pencil and send the form off to the company from which we had purchased the controls. The company would then send back a report, which usually took a long time,” Innocenti recounts.

Today, automated systems enable this data to be collected and analyzed in real time, both within the laboratory as well as among laboratories. “Laboratories want to see not only how well they are running, but how they compare to other laboratories running the same material on the same kind of instruments,” Innocenti says.
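One common way interlaboratory QC programs express such a comparison is a standard deviation index (SDI): how far a laboratory’s mean sits from the peer-group mean, measured in peer-group standard deviations. As a rough sketch (the metric is one common choice, not necessarily the one any particular program uses):

    # Illustrative peer-comparison metric: the standard deviation index (SDI),
    # commonly reported by interlaboratory QC programs. One option among several.

    def sdi(lab_mean, peer_mean, peer_sd):
        """Distance of the lab's mean from the peer mean, in peer SDs."""
        return (lab_mean - peer_mean) / peer_sd

    # Example: a lab averaging 102 on a control that peers running the same
    # material on the same kind of instrument average 100 with an SD of 2
    print(f"SDI = {sdi(102, 100, 2):+.1f}")  # +1.0 means one SD above peers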

Quicker comparison facilitates more rapid identification and correction of any problems and, subsequently, less error. “You can know right away whether a test is in control or not instead of having to wait to get some feedback,” says Kevin Jones, vice president of sales and marketing for Aalto Scientific Limited, Carlsbad, Calif. A faster response may prevent the laboratory from having to repeat a large number of patient samples to validate earlier results.

More intelligent algorithms can discriminate between a quality control failure and artifact (also preventing repeat runs), as well as provide auto-verification features. “Basically, a QC evaluation is done, and only upon a QC pass are patient results released. In the case of a QC failure, then the system automatically stops the release of patient results,” Jonnaert says. Further software advances are expected to develop optimal quality control strategies and schedule quality control events.
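A minimal sketch of the auto-verification gate Jonnaert describes might look like the following; the pass/fail rule and data shapes here are illustrative assumptions, not any particular middleware’s design:

    # Rough sketch of an auto-verification gate: patient results are released
    # only when the associated QC evaluation passes; a QC failure holds them.
    # The pass/fail rule and data shapes are assumptions for illustration.

    def qc_passes(control_results, mean, sd, limit=2.0):
        """Pass only if every control result is within +/- limit SD of the mean."""
        return all(abs((r - mean) / sd) <= limit for r in control_results)

    def release_results(patient_results, control_results, mean, sd):
        if qc_passes(control_results, mean, sd):
            return patient_results   # QC pass: release to the reporting system
        return []                    # QC failure: hold everything for review

    # Example: one control level checked before releasing a batch of results
    print(release_results([5.4, 6.1, 4.9], [100.5, 99.2], mean=100, sd=1.5))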

More Tests, More Controls, Little Time

Of course, what’s inside the vial matters too, and advances in control materials do take place. In addition to developing multiconstituent controls, manufacturers have focused on improving the accuracy, stability, and effectiveness of control materials. This may mean matrices that are less artificial, components that are more environmentally friendly, and improved stability.

“These assays are pretty well established, but that doesn’t mean the assay, machine, and kit manufacturers have stopped working on them. So what you see is continuous refinement with regard to the performance of these assays. They continue to incrementally improve, and we have to make sure that our controls keep up with that,” Schaeffer says.

A change in the diagnostic method can impact the effectiveness of a control. Schaeffer recalls a time when one of the company’s controls that had worked for years suddenly didn’t. “We had lost all of our stability, and it turned out this was related to the switch from a polyclonal format to a monoclonal format for this particular assay. So we have to be very cognizant of these kinds of changes,” Schaeffer says.

Control manufacturers also stay on top of new assays in development so that when a diagnostic test is introduced, controls are available to support it. New tests within the lab will require new controls and processes, so collaboration between diagnostic and control manufacturers is common.

Laboratories will seek to use the best control materials available, but they must balance performance against cost. “Laboratories have budgets to contend with, but the control also has to meet their needs,” Schaeffer says.

This may mean using a specialized control, choosing between liquid and lyophilized materials, or purchasing controls from a manufacturer other than the one that produces the analyzer. “This allows an unbiased check of the testing system,” Hardy says.

Controls are typically handled as if they were patient samples, meaning the time to complete them mirrors the turnaround time on test results. “In general, a very small percentage of the overall time the laboratory is working with patient samples is spent on controls,” Hardy says, noting the value of processing quality controls exceeds the labor required, particularly when controls detect a problem.

“If certain quality control practices are not followed regularly or at appropriate frequency, when things go bad, there is a bigger trail to track back, and it involves more staff and more costs,” Williams says. On the other hand, an appropriate quality control program can avoid these issues and help to ensure reliable results. A laboratory’s reputation is built on this reliability—not very sexy, but very important.


Renee Diiulio is a contributing writer for CLP.