Implementation of PAT in the Pharmaceutical Industry
Duquesne University Center for Pharmaceutical Technology


Page 1: Drennen IDRC 2004

Implementation of PAT in the Pharmaceutical Industry

Duquesne University Center for Pharmaceutical Technology

Page 2: Drennen IDRC 2004

Duquesne University Center for Pharmaceutical Technology (DCPT)

http://www.dcpt.duq.edu

Page 3: Drennen IDRC 2004

“No wind favors him who has no destined port.”

Montaigne

Page 4: Drennen IDRC 2004

Today’s Manufacturing Processes
Characterised by:

- Large inefficient batch equipment
- Low utilization, 30 - 40 % on average
- Low product yields
- Excessive amounts of product non-conformances
- Long lead-times due to stage and final product testing
- Capital and labour intensive
- High operating costs
- High inventories and excessive warehouse capacity
- Resistant to innovation
- Cycle time improvement perceived to be limited by regulatory constraints

CAMP Member Companies Presentation to FDA March 2002

Page 5: Drennen IDRC 2004

“The significant problems we face cannot be solved by the same level of thinking we were at when we created them.”

Albert Einstein

Page 6: Drennen IDRC 2004

Pharmaceutical Manufacturing in the Future

Guidance for Industry: PAT - A Framework for Innovative Pharmaceutical Manufacturing and Quality Assurance

Scientific principles and tools supporting innovation:
- PAT Tools
- Process Understanding
- Risk-Based Approach

Regulatory strategy accommodating innovation:
- PAT Team approach to Review and Inspection
- Joint training and certification of staff

Page 7: Drennen IDRC 2004

What is PAT?

A system for:
- designing, analyzing, and controlling manufacturing
- timely measurements (i.e., during processing)
- monitoring critical quality and performance attributes of raw and in-process materials
- process understanding

“Analyzing” includes: chemical, physical, microbiological, mathematical, and risk analysis

Page 8: Drennen IDRC 2004

Process Understanding

A process is well understood when:
- all critical sources of variability are identified and explained
- variability is managed by the process
- product quality attributes can be accurately and reliably predicted

Accurate and Reliable predictions reflect process understanding

Page 9: Drennen IDRC 2004

Key Elements of PAT Implementation

- Risk Analysis
- Experimental Design
- Control Strategies
- Sensors
- Model Development/Maintenance
- SPC
- Process Sampling
- Information Management

Page 10: Drennen IDRC 2004

Risk Analysis/Experimental Design

[Process flow diagram: unit operations (Dispensary, Sieve, Wet Granulation, FB Drier, Milling, Blender, Press, Coater) for the wet granulation and direct compression routes, with NIR measurement points marked at multiple steps.]

Page 11: Drennen IDRC 2004

Sampling/SPC/Data Management

Page 12: Drennen IDRC 2004

Objectives (I)

- Qualify capabilities of instrument and sampling system
- Evaluate the potential effect of “process signature” on calibration development
- Compare the utility of reflection and transmission data

Page 13: Drennen IDRC 2004

Instrument Performance Testing - Performed On-Line, When Possible

Page 14: Drennen IDRC 2004

Test of Sample Positioning System

Page 15: Drennen IDRC 2004

Test of Sample Positioning System

Two sides of tablet provide identical spectra

Page 16: Drennen IDRC 2004

Test of Sample Positioning System

Early positioning studies led to improvements in conveyor and trigger system

[Plot: X-position study, 2nd-derivative intensity vs. position along the belt (1/64 in).]

Page 17: Drennen IDRC 2004

Sample Position - Reflection

Page 18: Drennen IDRC 2004

Sample Position - Transmission

Page 19: Drennen IDRC 2004

Process Signature

Page 20: Drennen IDRC 2004

Production Samples

Page 21: Drennen IDRC 2004

Compression Samples - Reflection

Page 22: Drennen IDRC 2004

Production Samples Projected onto Compression Model

Page 23: Drennen IDRC 2004

Conclusion (I)

- The impact of positioning error on NIR reflection and transmission analysis can be mitigated using preprocessing techniques
- Automatic positioning system suitable for its intended use
- Shielding not required for transmission measurements
- Spectra acquired from laboratory and production samples can be pooled
- Diffuse reflection spectra were less sensitive to sample positioning

Page 24: Drennen IDRC 2004

Objectives (II)

Calibration Development/Validation

Page 25: Drennen IDRC 2004

Truncated Spectra from API Content Calibration

Page 26: Drennen IDRC 2004

Robustness Index (RI) for selection of preprocessing

Page 27: Drennen IDRC 2004

Robustness Index

RI = \frac{1}{\int_0^3 \left( A L_N^2 + B L_N + C \right) dL_N}

Page 28: Drennen IDRC 2004

Robustness Index

Where:
RI = robustness index
L_N = level of simulated noise added
A L_N^2 + B L_N + C = quadratic fit of the noise-augmented prediction error data

RI is the inverse of the AUC defined by the quadratic fit of RMSE plotted as a function of added spectral noise.
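
As a rough illustration of this definition, the sketch below fits a quadratic to noise-augmented prediction errors and inverts its area under the curve over the 0 to 3 noise-level range in the equation. The arrays, noise levels, and RMSE values are hypothetical examples, not data from the study.

```python
import numpy as np

def robustness_index(noise_levels, rmse_values, upper_limit=3.0):
    """Fit RMSE vs. added-noise level with a quadratic (A*L^2 + B*L + C)
    and return the inverse of its area under the curve on [0, upper_limit]."""
    A, B, C = np.polyfit(noise_levels, rmse_values, deg=2)
    auc = (A / 3.0) * upper_limit**3 + (B / 2.0) * upper_limit**2 + C * upper_limit
    return 1.0 / auc

# Hypothetical example: prediction error grows as simulated spectral noise is added
levels = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
rmse = np.array([1.48, 1.55, 1.70, 1.92, 2.20, 2.55, 2.95])
print(round(robustness_index(levels, rmse), 3))
```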

Page 29: Drennen IDRC 2004

Cross-Validation and Robustness Testing for API Calibration

Preprocessing Treatment   PLS Factors   RMSECV (mg)   RMSE (mg)   r2      Robustness Index
Raw Data                  5             2.36          1.96        0.886   0.18
SNV                       5             2.19          1.70        0.915   0.25
SNV + 1st Deriv.          4             1.79          1.49        0.935   0.33
SNV + 2nd Deriv.          3             1.93          1.64        0.921   0.29
MSC                       5             2.12          1.68        0.917   0.26
MSC + 1st Deriv.          4             1.80          1.48        0.936   0.33
MSC + 2nd Deriv.          3             1.89          1.61        0.923   0.29
1st Deriv.                4             1.80          1.48        0.936   0.32
2nd Deriv.                3             1.89          1.61        0.924   0.29
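
For readers who want to reproduce this kind of comparison, the sketch below cross-validates a PLS model under a few preprocessing treatments. It assumes hypothetical spectra X (samples x wavelengths) and API contents y in mg, and uses SNV, a Savitzky-Golay first derivative, and scikit-learn's PLSRegression as stand-ins; it is not the software used in the study.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def snv(X):
    """Standard normal variate: center and scale each spectrum."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

def first_derivative(X, window=11, poly=2):
    """Savitzky-Golay first derivative along the wavelength axis."""
    return savgol_filter(X, window, poly, deriv=1, axis=1)

def rmsecv(X, y, n_factors, folds=10):
    """Root-mean-square error of cross-validation for a PLS model."""
    y_cv = cross_val_predict(PLSRegression(n_components=n_factors), X, y, cv=folds).ravel()
    return float(np.sqrt(np.mean((y - y_cv) ** 2)))

# Hypothetical data: 500 tablet spectra, 350 wavelength points, API content in mg
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 350))
y = rng.normal(loc=48.9, scale=5.8, size=500)

for name, Xp, factors in [("Raw Data", X, 5),
                          ("SNV", snv(X), 5),
                          ("SNV + 1st Deriv.", first_derivative(snv(X)), 4)]:
    print(name, round(rmsecv(Xp, y, factors), 2))
```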

Page 30: Drennen IDRC 2004

Cross-Validation and Robustness Testing for Hardness Calibration

Preprocessing Treatment   Model Type               Factors/Terms   RMSECV (N)   RMSE (N)   r2      Robustness Index
Raw Data                  PLS                      3               10.11        8.88       0.907   0.042
SNV                       PLS                      3               10.96        9.75       0.888   0.040
SNV + 1st Deriv.          PLS                      3               12.04        10.86      0.861   0.038
SNV + 2nd Deriv.          PLS                      3               12.38        11.08      0.855   0.037
MSC                       PLS                      3               9.66         8.82       0.908   0.043
MSC + 1st Deriv.          PLS                      3               8.82         8.11       0.923   0.047
MSC + 2nd Deriv.          PLS                      3               8.21         7.48       0.934   0.046
1st Deriv.                PLS                      2               9.11         8.22       0.920   0.043
2nd Deriv.                PLS                      3               8.73         7.91       0.926   0.045
--                        1st Order Baseline Fit   2               NA           8.79       0.909   0.036
--                        1st Order Baseline Fit   1               NA           8.75       0.910   0.038
--                        2nd Order Baseline Fit   3               NA           8.01       0.924   0.033
--                        2nd Order Baseline Fit   2               NA           8.32       0.918   0.038

Page 31: Drennen IDRC 2004

Prediction Plot for API Content

Page 32: Drennen IDRC 2004

Calibration/Validation API

                          Calibration   VAL1    VAL2          VAL3
Samples (n)               500           350     40            38
Batches (n)               23            30      4             2
Maximum (mg)              65.32         66.06   50.31         65.53
Mean (mg)                 48.90         49.07   49.44         47.92
Minimum (mg)              32.66         33.63   47.70         32.43
Standard Deviation (mg)   5.83          4.94    0.67          13.93
RMSE (mg)*                1.48          1.25    5.35 (1.04)   5.07 (3.76)
RMSE (%, nominal)*        2.96          2.50    10.7 (2.08)   10.1 (7.52)
r                         0.967         0.972   0.441         0.974
r2                        0.936         0.944   0.194         0.948
RPD*                      3.9           4.0     NA            2.7 (3.8)
Bias (mg)*                0.00          -0.22   -5.3 (0.71)   -4.0 (2.04)

Model Type: Full-spectrum PLS regression
Preprocessing: MSC + 1st Derivative
Spectral Range (nm): 1300 - 2000
Latent Variables (n): 4

* A prediction bias was identified for the VAL2 and VAL3 datasets. The corrected values are in parentheses.
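
The statistics in this table (RMSE, bias, bias-corrected error, RPD) can be reproduced with the standard formulas sketched below; the reference and predicted values are hypothetical, and the presentation does not show its own code.

```python
import numpy as np

def validation_stats(y_ref, y_pred):
    """Bias, RMSE, bias-corrected RMSE (SEP), and RPD for a validation set."""
    resid = y_pred - y_ref
    bias = float(resid.mean())                           # systematic offset
    rmse = float(np.sqrt(np.mean(resid ** 2)))           # total prediction error
    sep = float(np.sqrt(np.mean((resid - bias) ** 2)))   # error after bias removal
    rpd = float(np.std(y_ref, ddof=1) / sep)             # reference SD over corrected error
    return bias, rmse, sep, rpd

# Hypothetical validation tablets: reference assay vs. NIR prediction (mg)
y_ref = np.array([47.9, 48.6, 49.2, 50.1, 48.3, 47.5])
y_pred = np.array([46.8, 47.7, 48.4, 49.0, 47.2, 46.6])
print([round(v, 2) for v in validation_stats(y_ref, y_pred)])
```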

Page 33: Drennen IDRC 2004

Hardness Prediction

Page 34: Drennen IDRC 2004

Calibration/Validation Hardness

                          Calibration Dataset   Validation Dataset
Samples (n)               437                   152
Batches (n)               22                    8
Maximum (N)               140.0                 145.0
Mean (N)                  61.7                  58.1
Minimum (N)               16.0                  13.0
Standard Deviation (N)    29.2                  30.9
RMSE (N)*                 8.1                   12.0 (8.5)
r                         0.961                 0.961
r2                        0.922                 0.923
RPD*                      3.6                   2.6 (3.6)
Bias (N)*                 0.0                   -8.0 (-0.01)

Model Type: Full-spectrum PLS regression
Preprocessing: MSC + 1st Derivative
Spectral Range (nm): 1300 - 2000
Latent Variables (n): 3

* A prediction bias was identified. Corrected values are in parentheses.

Page 35: Drennen IDRC 2004

Model Robustness

Page 36: Drennen IDRC 2004

High-Flux Noise Robustness Test

Page 37: Drennen IDRC 2004

Wavelength Accuracy Robustness

Page 38: Drennen IDRC 2004

Conclusion (II)

- Calibration model form (for hardness) and preprocessing operations were selected based on RI analysis and cross-validation
- Validation of accuracy, precision, linearity, specificity and robustness using independent datasets
- Robustness demonstrated to variation in instrumental high-flux noise and wavelength shift

Page 39: Drennen IDRC 2004

Objectives (III)

- Develop a system for continuous calibration monitoring
- Formulate a strategy for calibration transfer/update to support instrument maintenance and inter-instrument transfer
- Determine the required number (and evaluate stability) of instrument standardization “rescue” samples

Page 40: Drennen IDRC 2004

Failure Detection

[Flowchart: The NIR instrument measures a sample to produce NIR data, which passes through pre-treatment and the calibration model to give a prediction. The prediction's validity is checked (Q residuals and Hotelling's T2) against historical data and an action threshold. If valid, the prediction becomes the final result; otherwise the result requires investigation, with instrument evaluation and corrective action. Potential errors (A), due to a new instrument or a changed instrument response, are addressed through instrument matching, instrument standardization, and calibration transfer processes. Potential errors (B) are due to raw material or process changes.]
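
A minimal sketch of the prediction-validity gate in the flowchart above, assuming Q residual and Hotelling's T2 action thresholds derived from historical calibration data; the threshold values and function name are illustrative only.

```python
def check_prediction_validity(q_residual, t2, q_limit, t2_limit):
    """Accept the NIR prediction only if both statistics fall within
    their historically derived action thresholds."""
    if q_residual <= q_limit and t2 <= t2_limit:
        return "Result is valid"
    # High Q (sample poorly described by the model) or high T2 (extreme
    # within-model sample) triggers investigation and corrective action.
    return "Result requires investigation"

# Hypothetical thresholds from historical calibration spectra
print(check_prediction_validity(q_residual=0.8, t2=5.2, q_limit=1.0, t2_limit=12.0))
print(check_prediction_validity(q_residual=2.4, t2=5.2, q_limit=1.0, t2_limit=12.0))
```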

Page 41: Drennen IDRC 2004

Instrument Performance Testing

Page 42: Drennen IDRC 2004

IPC using Pearson Correlation

r(X_\lambda, X_{\lambda+1}) = \frac{n \sum X_\lambda X_{\lambda+1} - (\sum X_\lambda)(\sum X_{\lambda+1})}{\sqrt{\left[ n \sum X_\lambda^2 - (\sum X_\lambda)^2 \right] \left[ n \sum X_{\lambda+1}^2 - (\sum X_{\lambda+1})^2 \right]}}

where X_\lambda denotes the odd-numbered spectral data points and X_{\lambda+1} the even-numbered spectral data points.

Page 43: Drennen IDRC 2004

IPC for Typical (blue) and Noisy (red) NIR Spectra

Page 44: Drennen IDRC 2004

Noise Factor Level (NFL)

Using this formula, a noise factor level (NFL) is estimated

NFL = f(1 - r(X_\lambda, X_{\lambda+1}))
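
A minimal sketch of this check: the Pearson correlation between odd- and even-numbered points of a single spectrum stays near 1.0 for smooth spectra and falls as high-frequency noise grows. The slides do not specify the scaling function f, so the example below simply reports 1 - r; the spectra are synthetic.

```python
import numpy as np

def ipc_correlation(spectrum):
    """Pearson correlation between odd- and even-numbered data points of one spectrum."""
    x_odd, x_even = spectrum[0::2], spectrum[1::2]
    n = min(len(x_odd), len(x_even))
    return float(np.corrcoef(x_odd[:n], x_even[:n])[0, 1])

def noise_factor_level(spectrum):
    """NFL = f(1 - r); f is unspecified in the presentation, so the identity
    function is used here purely for illustration."""
    return 1.0 - ipc_correlation(spectrum)

# Synthetic smooth vs. noisy NIR-like spectra (1300 - 2000 nm grid)
wavelengths = np.linspace(1300, 2000, 350)
clean = np.exp(-((wavelengths - 1650) / 120.0) ** 2)
noisy = clean + np.random.default_rng(1).normal(scale=0.02, size=wavelengths.size)
print(round(noise_factor_level(clean), 4), round(noise_factor_level(noisy), 4))
```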

Page 45: Drennen IDRC 2004

Histogram of historical NFL scores for API content calibration spectra

Page 46: Drennen IDRC 2004

Sample-Based AOTF Wavelength Uncertainty Test

Page 47: Drennen IDRC 2004

Calibration Monitoring

Page 48: Drennen IDRC 2004

Use of Q residual and T2 for Calibration Monitoring
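
A minimal sketch of how these two statistics could be computed for new spectra against a factor model of the calibration spectra; scikit-learn's PCA is used here as a stand-in for the actual calibration model, and all variable names and data are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

def q_and_t2(X_cal, X_new, n_components=3):
    """Q residual (squared reconstruction error, variation outside the model)
    and Hotelling's T2 (leverage within the model) for new spectra."""
    pca = PCA(n_components=n_components).fit(X_cal)
    scores = pca.transform(X_new)
    reconstructed = pca.inverse_transform(scores)
    q = np.sum((X_new - reconstructed) ** 2, axis=1)
    t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
    return q, t2

# Hypothetical calibration and monitoring spectra
rng = np.random.default_rng(2)
X_cal = rng.normal(size=(200, 350))
X_new = rng.normal(size=(5, 350))
q, t2 = q_and_t2(X_cal, X_new)
print(q.round(1), t2.round(2))
```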

Page 49: Drennen IDRC 2004

Use of Q and T2 for API CAL and VAL2

Page 50: Drennen IDRC 2004

Calibration Maintenance and Transfer

Page 51: Drennen IDRC 2004

Calibration Maintenance/Transfer

Why are transfer/update protocols necessary?
- Need for a calibrated backup instrument
- Eventual expansion to further lines
- Transfer-in-time of knowledge from earlier experiments

Page 52: Drennen IDRC 2004

Master and Slave Instruments

Page 53: Drennen IDRC 2004

Exhibit 5

Potential investigation actions:
1. Review daily internal performance test and internal performance test history
2. Run an internal performance test
3. Rescan tablet
4. Review SPC of method assessment
5. Review SPC of tablet test results
6. Perform a parallel laboratory test on tablet

Potential remediation actions:
A. Instrument repair, standardization, and external performance test
B. Calibration update
C. Address as an out-of-specification (OOS) investigation

(Legend for the instrument evaluation and corrective action step in the failure-detection flowchart)

Page 54: Drennen IDRC 2004

Example: Lamp change

[Plot: reflectance ratio (0.7 - 1.4) vs. wavelength (1300 - 2000 nm) following the lamp change.]

Page 55: Drennen IDRC 2004

Justification for baseline subtraction method

[Scatter plot: predictions following the lamp change, uncorrected (mg), vs. predictions prior to the lamp change (mg); both axes span 46 - 50 mg.]

Page 56: Drennen IDRC 2004

Calibration transfer model for correcting lamp change

[Plot: additive calibration transfer coefficients, reflectance ratio (-0.05 to 0.05) vs. wavelength (1300 - 2000 nm).]
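
A minimal sketch of the additive (baseline-subtraction) correction illustrated above: the mean spectral difference between master- and slave-instrument measurements of a set of transfer samples is added to new slave spectra before the master calibration is applied. The arrays and the 15-sample transfer set are illustrative assumptions, not the study's data.

```python
import numpy as np

def additive_transfer_coefficients(master_spectra, slave_spectra):
    """Per-wavelength mean (master - slave) difference over the transfer samples."""
    return np.mean(master_spectra - slave_spectra, axis=0)

def correct_slave_spectrum(slave_spectrum, coefficients):
    """Shift a slave-instrument spectrum onto the master instrument's scale."""
    return slave_spectrum + coefficients

# Hypothetical transfer set: 15 tablets measured on both instruments
rng = np.random.default_rng(3)
master = rng.normal(size=(15, 350))
baseline_shift = 0.03 * np.ones(350)            # e.g., offset introduced by a lamp change
slave = master - baseline_shift + rng.normal(scale=0.001, size=(15, 350))

coefficients = additive_transfer_coefficients(master, slave)
corrected = correct_slave_spectrum(slave[0], coefficients)
print(round(float(np.mean(np.abs(corrected - master[0]))), 4))
```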

Page 57: Drennen IDRC 2004

Prediction quality with calibration transfer

[Scatter plot: predictions following the lamp change, transfer corrected (mg), vs. predictions prior to the lamp change (mg); both axes span 46 - 50 mg.]

Page 58: Drennen IDRC 2004

How Many Transfer Samples?

[Plot: relative error (multiple) vs. number of transfer samples (n = 5 to 30).]

Page 59: Drennen IDRC 2004

Continuous Calibration Monitoring for Stability Samples

Page 60: Drennen IDRC 2004

Conclusion (III)

- Key instrument performance parameters can be monitored using features of sample spectra
- Hotelling's T2 and Q residuals provide a basis for predicting spectral deviations
- Calibration transfer among multiple instruments can be achieved using baseline subtraction and as few as 15 transfer samples

Page 61: Drennen IDRC 2004

Conclusion (III)

- Calibration transfer samples can be stored for at least one month without compromising calibration transfer performance
- Long-term spectral database uniformity can be maintained using appropriate calibration transfer methods

Page 62: Drennen IDRC 2004

Acknowledgements

Carl Anderson
Robert Cogdill
David Molseed
Miriam Delgado

Robert Chisholm
Ali Afnan
Raymond Bolton
Thorsten Herkert
Ken Leiper

Page 63: Drennen IDRC 2004

Duquesne University Center for Pharmaceutical Technology

http://www.dcpt.duq.edu