We've all read articles on how to develop and validate methods. They are typically focused on withstanding regulatory scrutiny and ensuring compliance. What is often missing is an honest discussion of how well these methods will actually work in a day-to-day manufacturing setting.
Frequently, the methods and procedures developed in the Research and Development (R&D) environment overlook real-world Quality Control (QC) concerns that later become all too apparent on the manufacturing floor. Robust analytical methods and streamlined Standard Operating Procedures (SOPs) can have a positive impact on the cost of goods by allowing higher throughput, lower solvent consumption, and reduced manpower. A few extra days spent addressing the QC concerns that are likely to surface when production is scaled up can result in tremendous savings in the long run.
Most methods are developed by well-meaning, highly educated people intent on delivering a workable, cost-efficient solution. Unfortunately, the frame of reference for defining success in an R&D laboratory can be substantially different from that of a high-volume QC operation. For starters, R&D time frames occur within the context of a project that takes years to move from concept to the clinic, so overnight assays and analytical procedures are scheduled largely around the convenience of individual scientists. QC in-process and release testing time frames, on the other hand, are measured in minutes, hours, and work shifts, because the timely and efficient release of finished product is vital to the manufacturer's bottom line.
There is a similar disparity in sample throughput. An R&D laboratory may test only a few lots during preclinical work and the early stages of clinical development, whereas a QC lab in a manufacturing environment will be testing and releasing material from multiple products within a single eight-hour shift.
Method development and validation SOPs should take into account high throughput and the collateral impact on other areas of the company. Cumbersome SOPs can adversely affect throughput in the QC area, causing delays and increasing the costs associated with reserving manufacturing suites. Higher sample throughput also amplifies solvent consumption and the associated disposal costs.
Consider the issues posed by the following real-world Case Studies and the corresponding suggestions for optimizing procedures and methods.
Case Study #1
An HPLC method with a 45-minute run time was developed as the in-process test for a formulation. The QC lab would run system suitability testing while waiting for the sample, but once the sample arrived it still needed to run a check standard and prepare two sample dilutions, as required by the validated method. On average, it took 5 hours to get approved in-process QC results to the manufacturing floor, during which time the manufacturing suite and its personnel sat idle awaiting the results. In addition, the multiple preparations generated more solvent waste and dirty glassware and required more bench space.
Suggested alternatives:
- Because the in-process test does not have to be stability-indicating, an ultraviolet (UV) test can be used instead, reducing turnaround time to less than an hour.
- Change the sample preparation to a single dilution. This would reduce the number of pipetting steps (and associated risk of error) and reduce solvent consumption.
- The R&D SOP called for a check standard during the HPLC sample run and multiple sample preparations. Yet, the validation showed no need for multiple sample preparations. Rewrite the SOP to eliminate the blanket requirement for multiple preparations and instead let the number of preparations be based on the validation data.
- Change the HPLC injection sequence so that the check standard is part of the system suitability sequence, which is run prior to sample analysis, thereby shortening the total run time.
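To see how much of the turnaround these changes remove, the rough sketch below tallies the critical path that begins when the sample arrives in the QC lab. Only the 45-minute run time, the two-dilution preparation, and the roughly 5-hour baseline come from the case study; the preparation time, injection counts, and UV bench time are assumptions made purely for illustration.

```python
# Rough, illustrative tally of the post-sample-arrival critical path.
# The 45-minute HPLC run time and the two sample dilutions come from the
# case study; every other number is an assumption for illustration only.

HPLC_RUN_MIN = 45     # per-injection run time (from the case study)
PREP_MIN = 20         # assumed time to prepare one sample dilution
UV_TEST_MIN = 30      # assumed bench time for a simple UV measurement

# Original workflow: check standard plus two sample dilutions are all
# injected after the sample arrives, so they sit on the critical path.
original = 2 * PREP_MIN + 3 * HPLC_RUN_MIN       # 175 min of prep and run time
# (review, documentation, and instrument queueing pushed the observed
#  average turnaround to about 5 hours)

# Revised HPLC workflow: a single dilution, with the check standard moved
# into the system suitability sequence that ran before the sample arrived.
revised_hplc = PREP_MIN + HPLC_RUN_MIN           # 65 min

# UV alternative for an in-process test that need not be stability-indicating.
revised_uv = PREP_MIN + UV_TEST_MIN              # 50 min

print(f"original critical path : {original} min")
print(f"revised HPLC workflow  : {revised_hplc} min")
print(f"UV alternative         : {revised_uv} min")
```

Even with generous allowances for review and documentation, either change brings the manufacturing suite's idle time down from several hours to roughly one.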
Case Study #2
A multi-point dissolution profile method was developed, and the final sample time point chosen was 14 hours. The regulatory requirement for the final time point is that >80% of the product's label claim has been released. The 14-hour time point was chosen because the R&D scientist started the analysis after lunch, had an autosampler and integrated data system, and wanted all of the data available when he came in the next morning. Unfortunately, the QC lab in the manufacturing environment did not have an automated dissolution system and had to add a second shift in order to pull and test the final sample. Safety regulations required two people to be in the laboratory, so a chemist and a supervisor had to be hired.
An acceptable alternative, with no impact on product quality, would have been to use a later time point, e.g., 20 hours. This would still meet the regulatory requirement of >80% of label claim released, allow the QC chemist to pull the sample first thing the next morning, and thereby eliminate the need for a second shift.
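The clock arithmetic makes the point. If the dissolution bath is started after lunch, as in the R&D scenario, the 14-hour pull lands in the middle of the night, while a 20-hour pull falls within the next day shift. The 1:00 p.m. start time in the sketch below is an assumption chosen only to make the calculation concrete.

```python
# Illustrative clock arithmetic for the final dissolution pull.  The 14 h
# and 20 h time points come from the case study; the 1:00 p.m. start time
# is an assumption used only to make the arithmetic concrete.
from datetime import datetime, timedelta

start = datetime(2024, 1, 1, 13, 0)   # assumed start: 1:00 p.m. on the day shift

for hours in (14, 20):
    pull = start + timedelta(hours=hours)
    print(f"{hours} h time point -> pull at {pull:%I:%M %p} the next day")

# 14 h time point -> pull at 03:00 AM the next day  (second shift required,
#                    plus a second person to satisfy the two-person safety rule)
# 20 h time point -> pull at 09:00 AM the next day  (within the normal day shift)
```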
These changes, e.g., fewer sample preparations, modified injection sequences, and revised SOPs, seem minor, but their effects are cumulative. Samples are processed faster, fewer personnel are required, the risk of laboratory errors is reduced, and products are released more quickly, all with minimal effort on the front end of a project.
Well-crafted, robust QC methods pay off in long-term improvements in manufacturing efficiency and a reduced cost of goods.