In 1987, when the US Food and Drug Administration issued its Guideline on General Principles of Process Validation, a young FDA reviewer asked her supervisors, "What does this term validation really mean?"
"We don't know," they responded.
Much has changed in the past 18 years. So much has changed, in fact, that the current concept of process validation, once a fresh idea in quality control that later hardened into accepted dogma, may now be ready for the trash bin. With companies achieving new levels of process understanding, what does it mean to validate a manufacturing process? Industry leaders and FDA are now examining that question and looking at new models to follow.
Losing sight of the goal
Although the 1987 guideline did not introduce the concept of validation, it did plant the notion firmly in industry. It defined validation as:
establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality attributes (1).
That seems logical enough. The problem, many experts say, is that over time validation activities have become centered on documentation instead of on ensuring quality. "It's a forest for the trees situation," says Russell E. Madsen, president of The Williamsburg Group (Gaithersburg, MD, www.thewilliamsburggroup.com). "A whole industry grew up around process validation. It has resulted in a proliferation of validation protocols, validation reports, and validation documentation; but you still have processes out there that don't work." We've lost sight of the goal, Madsen says, "which is to demonstrate the process reliably does what it's supposed to do. It's as simple as that."
[Sidebar: The continuous quality verification model]
Validating without understanding
Fernando Muzzio, PhD, a professor in the Department of Biochemical Engineering at Rutgers University (Piscataway, NJ, www.rutgers.edu) says the main problem underlying the traditional approach to validation is that we don't understand our manufacturing processes very well. And if we don't, our validation activities are essentially meaningless. "How can we agree to validate a process we don't understand?" he asks. "You know the old tale about the emperor's new clothes? Well, that's validation. Validation is, and always was, a fantasy."
The second key problem with the traditional approach to validation, Muzzio says, is that it presupposes that if you don't change anything from the validation batches, everything will remain the same. But this assumption is false, he says, because neither ingredients nor processing conditions remain fixed. "There will be lots of little changes over time that operators will introduce, or equipment will be moved from one place to another, where room moisture may be higher or lower. We will get a new supplier for a given material, and the new material may be within specifications, but it could be different in terms of its particular morphology, flow, or one of 60 possible variables. It was never real that you could keep everything the same."
The fixed validation model also has a third major problem: it rests on an approach to quality control that keeps quality high by discarding bad lots, which is inherently inefficient.
Philippe Cini, PhD, a principal at Tunnell Consulting (King of Prussia, PA, www.tunnellconsulting.com) explains it in terms of the Six Sigma approach to quality. "Right now, pharmaceutical products on the market meet a high quality standard—about 5.5 Sigma. So we have very, very few defects reaching patients or in the distribution pipeline," he says. But average industry manufacturing capabilities are at about 2.5 Sigma. "Today we close the gap from 2.5 Sigma to 5.5 Sigma by deploying quality management systems that screen out bad product once it's been made," he explains. "That's a very, very expensive way of manufacturing."
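To put rough numbers on that gap, here is a minimal Python sketch (a back-of-the-envelope illustration, not from the article) that converts sigma levels into expected defect rates, assuming the conventional Six Sigma practice of a 1.5-sigma long-term shift:

```python
# Expected long-term defect rate at a given sigma level, using the
# conventional 1.5-sigma shift: DPMO = Phi(-(sigma - 1.5)) * 1e6.
import math

def defects_per_million(sigma_level: float) -> float:
    """Approximate defects per million opportunities (DPMO)."""
    z = sigma_level - 1.5                      # conventional long-term shift
    tail = 0.5 * math.erfc(z / math.sqrt(2))   # one-sided normal tail, Phi(-z)
    return tail * 1_000_000

for s in (2.5, 5.5):
    print(f"{s} sigma -> {defects_per_million(s):,.0f} DPMO")
```

At 2.5 Sigma, roughly 158,655 units per million are defective; at 5.5 Sigma, about 32 per million. Screening out that difference after the product has already been made is the expense Cini is describing.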
Disincentives and fear
Ironically, the ingrained practices of validation, intended to ensure quality, have become an impediment to quality improvements. One of the first things any industry newcomer learns is that pharmaceutical manufacturers avoid altering an approved process—even to improve it—because any change requires a regulatory filing, which can take a long time and involve a lot of paperwork.
But the problem goes beyond that. Quality assurance staff have been afraid to even generate new data about manufacturing, much less submit them to FDA, for fear of giving the regulators the impression that they didn't understand their processes well.
"Twenty years ago, when I was trained, I was told, 'Never generate data if you don't know what you're going to do with it,'" says Gerry Migliaccio, vice-president of Global Quality Operations at Pfizer (New York, NY, www.pfizer.com). "There's been a huge fear that if you have data about your processes and you can't explain it, you're setting yourself up."
The truth is, when manufacturers launch a new product, usually they don't know that much about their manufacturing processes. Most companies don't want to delay filing for drug approval by spending a long time developing the commercial manufacturing process, because every additional day a product is on the market can mean thousands of dollars in sales. So it's not until after approval that a manufacturer generates large volumes of data about the process—from its commercial operations.
Change is in the air
Today, however, the mentality is changing, and some companies are willing to spend time and effort studying and improving their processes.
New technologies, and a change in thinking, are working hand-in-hand in this shift. The pharmaceutical industry is starting to catch up to other industries in the use of online monitoring technologies such as near infrared (NIR), Raman, and acoustic spectroscopy to gather more in-depth data about processes. FDA's guidance on process analytical technology (PAT) was important in encouraging the adoption of these technologies (2). The PAT guidance also was important in another way, because in it FDA addressed industry's fear of scrutiny. The guidance expressly states that FDA will not examine manufacturing data collected for research purposes. This guidance spurred some companies to study their processes. It also brought out of the closet process improvements that quality control managers were already making but not talking about.
Different approaches, different goals
A poll conducted last year showed that only 11% of respondents felt their companies had a high level of understanding of PAT, but more than 60% felt PAT could assist with quality control concerns such as identifying critical sources of variability and preventing scraps and rejects (3). There are several strategies for bringing more sophisticated process models onto the manufacturing floor. AstraZeneca, Genentech, and Pfizer all have programs, for example, but each illustrates a different approach to different goals.
Changing without filing: fixing broken processes. Pfizer's primary focus is on studying older, problematic processes, with the aim of reducing variability.
In one example, Pfizer had high failure rates for a reaction step in the manufacture of an active pharmaceutical ingredient (API), at a cost of more than $125,000 per batch. When the quality control team applied spectroscopic methods all along the process, it traced the problem to an unsuspected failure of an earlier reaction step that formed an intermediate ingredient, Precursor A. Data from NIR monitors revealed that about half the time, the Precursor A formation reaction never occurred. Only the application of new technology made that failure visible.
"Because this was a cryogenic process, occurring at –20 °C, we didn't have an effective way to sample it," says Laurie St. Pierre Berry, PhD, a senior development scientist with Pfizer's Process Development Laboratories in Groton, Connecticut. "When we starting using NIR monitoring, we were able to see what was going on."
Berry and her team discovered that variations in a moisture-sensitive raw material were causing the problem. The team is now working to understand and control those raw-material variations. In the meantime, Pfizer can monitor the Precursor A formation reaction and scrap failed batches earlier in the manufacturing process, before they reach API synthesis. This saves both money and time.
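The kind of early go/no-go check such monitoring enables might look like the sketch below. This is a hypothetical illustration only: the spectral band, threshold, and data file are invented, and Pfizer's actual method is not described at this level of detail.

```python
# Hypothetical early check on Precursor A formation from in-line NIR
# scans. Band location and threshold are invented for illustration.
import numpy as np

PRECURSOR_BAND = slice(120, 140)   # hypothetical detector channels tracking Precursor A
CONVERSION_THRESHOLD = 0.25        # hypothetical minimum rise in band area

def reaction_occurred(spectra: np.ndarray) -> bool:
    """spectra: (n_timepoints, n_channels) in-line NIR absorbance scans."""
    band_area = spectra[:, PRECURSOR_BAND].sum(axis=1)
    return band_area[-1] - band_area[0] >= CONVERSION_THRESHOLD

scans = np.load("batch_nir_scans.npy")  # hypothetical archived in-line scans
if not reaction_occurred(scans):
    print("Precursor A formation failed: divert batch before API synthesis")
```

The payoff is in the last two lines: a failed batch is caught right after the cryogenic step rather than after the expensive API synthesis that follows.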
Tim Marten, PhD, vice-president of global compliance at AstraZeneca (London, UK, www.astrazeneca.com), says his company also applies spectroscopy to reduce variability in difficult processes for currently marketed products. First, the manufacturing team uses probes and sensors to gain a better understanding of the critical variables. Now, with this increased understanding, the quality control staff collects and analyzes data online and uses them to monitor and control process steps.
"We've managed to increase our throughput of successful batches quite considerably, by not having products sit around waiting between stages—because instead of doing the analysis after we've finished, we've done the analysis during the processing," says Marten. "That's made an enormous difference to both pass rates and lead times."
The usual response to examples like these is to ask what regulators require before companies can replace laboratory tests with online monitoring. But Pfizer and AstraZeneca aren't letting regulatory filings inhibit their quest for process understanding. Instead, they are continuing to run their standard laboratory analyses even as they gather data with modern analytical techniques and apply those data to process monitoring and control.
"Pfizer's program is really about process understanding," says Migliaccio. "Some companies are working hard toward real-time release, but for Pfizer, that's a tertiary goal. Our priority is to reduce the number of deviations, to get to definitive root cause, to reduce cycle times." Migliaccio says the reason is limited resources. "We believe the biggest gain is eliminating deviation, rather than eliminating finished-product release testing."
Real-time release. AstraZeneca also is running its online analysis in addition to its laboratory tests, and Marten is pleased with the benefits they have achieved by reducing variability. He would like to take the use of these technologies further, however, to eliminate end-product testing. "Ultimately, quality control analytical testing at the end should be a thing of the past. In the long-term future, we would expect that good process understanding and good monitoring of the parameters during processing will assure us that the product we make is the one we want to make. We won't do quality control testing in the end, because it will add nothing—we will already know this is the right product." (See Sidebar, "A dynamic approach to quality control").
Investing early in process development. Genentech (South San Francisco, CA, www.gene.com) has taken a different approach. Its focus has been on applying design of experiments to gain a thorough understanding of a new product's manufacturing process before filing for FDA approval. With one of its latest drugs, "Avastin" (bevacizumab, a recombinant monoclonal antibody for treating metastatic colon and rectal cancer), Genentech used carefully designed, statistically valid experiments during the development stage to quickly zero in on which process parameters were critical to ensuring product quality.
Although one would expect such studies to delay the product's release to the market, Ron Branning, vice-president of commercial quality at Genentech, says this approach actually saved them time.
"In using these methods, you save time because you know what the critical process parameters are much more quickly than by using any other technique." He explains that in using a design-of-experiments strategy, developers may start with 20 or 30 parameters that could be critical. A set of well-designed, statistically valid experiments might then indicate that only a few of those factors are critical, in a very narrow range. Then, one can repeat just a small number of the experiments to confirm the results, focusing on the key factors, instead of conducting 30 or 40 runs to gather the same amount of data with the same statistical validity.
"That's a lot faster than doing a very large number of experiments and then trying to correlate data," says Branning. "I believe this approach helped us with a rapid review and approval of Avastin."
Costs and benefits
What about the cost of doing all this manufacturing science? Clearly, the manufacturers most visible in this effort are Big Pharma companies with the greatest resources. But in some cases, the benefits have been shown to outweigh the costs. FDA cites figures showing that the costs associated with pharmaceutical manufacturing can far exceed those for research and development operations (4). Others point to the cost of noncompliance: ask any company with a consent decree how much it has spent on remediation.
Migliaccio also sees the investment as time and money well spent. Pfizer has a policy of treating projects to improve manufacturing as a research investment; anyone who proposes such a project does not have to submit a return-on-investment analysis. As a result, Migliaccio can't attach a cost savings number to the company's work on process understanding, but he thinks the allocation of resources is beneficial. "We're looking at problematic processes that we were already throwing a lot of resources into anyway, because of the time and effort spent trying to resolve deviations and find root cause. We have certainly shifted the amount of time our quality control staff spends resolving problems to more fulfilling work in proactive quality improvement."
With Avastin, Genentech decided not to rush to file with incomplete data, and the company feels this strategy paid off in several ways. Branning says the submission was easier for FDA to understand, leading to rapid review and approval; the development of batch records was easier; and it reduced the amount of validation work, because fewer parameters had to be validated. "And we had ironclad, statistically valid data to base those decisions on," he says.
Since approval, the company has also found the ongoing monitoring and control of the manufacturing process easier. "We find that we have fewer discrepancies and investigations, and when we have those, they're easier to resolve," Branning says.
The future
So where do the industry and the regulators go from here? Focusing on good science is imperative. "It doesn't need to be exhaustive science," says Branning. "It just needs to be good science that focuses on the issues at hand. We need to get people to understand that that's the fundamental necessity."
Muzzio agrees, and thinks that the leaders will bring the rest of the industry with them. "Over time, whenever a higher quality standard becomes clearly achievable, it becomes part of the expectation. The bar rises naturally."
References
1. US Food and Drug Administration, Guideline on General Principles of Process Validation (FDA, Rockville, MD, May 1987).
2. FDA, Guidance for Industry PAT—A Framework for Innovative Pharmaceutical Development, Manufacturing, and Quality Assurance (FDA, Rockville, MD, Sept. 2004).
3. D. McCormick, "PAT Survey Reflects Optimism, Uncertainty," Pharm. Technol. 29 (1), 24 (2005).
4. FDA, Innovation and Continuous Improvement in Pharmaceutical Manufacturing. Pharmaceutical CGMPs for the 21st Century (FDA, Rockville, MD, Sept. 2004), p. 8; available at http://www.fda.gov/cder/gmp/gmp2004/manufSciWP.pdf.