
© 2026 MJH Life Sciences™ , Pharmaceutical Technology - Pharma News and Development Insights. All rights reserved.
Susan Schniepp, Regulatory Compliance Associates Inc., shares strategies to streamline pharma quality, reduce costs via early detection, and use data to ensure safety.
Susan Schniepp, Regulatory Compliance Associates Inc., as a part of PDA Week 2026, shares her insights on streamlining quality systems, the financial value of robust inspections, and the necessity of looking at the big picture in manufacturing. She breaks down why quality shouldn’t be a burden and how transparency can save millions.
Check out the 3-part video interview with Schniepp:
Part 1: Why Robust Quality Systems Save Pharmaceutical Companies Millions
Part 2: How AI Can Connect Pharma Manufacturing Data Systems
Part 3: How Small Pharma Companies Can Simplify Quality Compliance
Schniepp: It is actually quite simple. When we talk about a culture of quality, we are really talking about the culture of the entire organization. It boils down to a few basic behaviors: if you see something, say something, and if you make a mistake, fess up. Maintaining a quality system is not as difficult as people claim. It requires control over documents, validated manufacturing processes, and technology transfer data, especially if you are working with a CDMO.
Absolutely. I think people often make a bigger deal out of it than it is, and we need more flexibility and creativity. For instance, small or virtual companies often feel they must have an SOP for everything listed in the Code of Federal Regulations. But if you don't actually perform a specific operation, like a recall, you shouldn’t have an SOP for it. Instead, just have a statement saying you have contracted that procedure out to another party.
The goal is to avoid being burdensome. The FDA has a beautiful system where they don’t tell you how to do something; they tell you what standards you must meet. Whether your SOPs are text-heavy or full of pictures is up to you, as long as you achieve the end result: a thoroughly documented operation with investigated deviations and appropriately released batches.
The hardest part is maintaining the proper ratio of people to run the systems versus people to interpret the data. Robust investigations require significant manpower. While an investigation might take resources away from running product, it is worth it if it solves a problem and prevents a future, equally bad issue from occurring.
You justify it through the value of catching defects early. I call it the Rule of 10. It costs nothing to catch a defective component at incoming inspection; you just ship it back. However, if that defect makes it to your shelf, it costs $10,000. If it hits the manufacturing floor, it's $20,000. If it gets into a batch and you must abort mid-run, you're at $50,000. Catching it in the final product costs $1,000,000 or more. Beyond the money, any recall, even for a cosmetic defect like a smeared label, damages the company's reputation.
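The cost escalation Schniepp describes can be sketched as a simple stage-to-cost lookup. The dollar figures are the illustrative ones from her example; the stage names and function are a hypothetical sketch, not a formal costing model.

```python
# Illustrative "Rule of 10" escalation: the later a defect is caught,
# the more it costs. Figures follow the interview's example numbers.
DETECTION_COST = {
    "incoming_inspection": 0,        # ship the defective component back
    "warehouse_shelf": 10_000,
    "manufacturing_floor": 20_000,
    "aborted_batch": 50_000,
    "final_product": 1_000_000,      # plus reputational damage from a recall
}

def cost_of_late_detection(stage: str) -> int:
    """Return the approximate cost of catching a defect at a given stage."""
    return DETECTION_COST[stage]

# Savings from catching a defect at receipt rather than in final product:
savings = cost_of_late_detection("final_product") - cost_of_late_detection("incoming_inspection")
```

The point of the sketch is the ordering, not the exact numbers: each stage of escape multiplies the remediation cost, which is how robust incoming inspection justifies its headcount.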
We need to understand how all the separate data points (smoke studies, particulate matter data, and environmental monitoring) relate to each other to protect the product. Traditionally, we treat these as isolated data points. But they work together to keep the product safe.
Think of it this way: there is a continuous background of activities like media fills and water results that must run 24/7 to maintain the facility, even when you aren't running a batch. Then you have snapshots, which are your specific chemistry or micro results for a particular batch. We need to better understand how that background supports the release of every single batch.
Transparency is everything. Operations run smoother and cost less when there is a robust quality agreement and both parties understand each other. Complexity arises when a client forces a CMO to use a special requirement that differs from the CMO's general standards. This creates duplicate standards, which makes the system far more complicated than it needs to be.