All posts by MikeWilliamson

Interested in collecting and analyzing data through software while travelling the world

ISPE’s Commissioning and Qualification Guide Second Edition

Hello good people of the world! Today’s post covers ISPE’s release of the second edition of their commissioning and qualification guide, Volume 5 of the Baseline Guides. The first edition was released way back in March 2001, so we should expect this to be a significant revision. Please note this guide is not available for free.

One very nice thing about this second edition is that it not only updates the first edition of the volume 5 guide, but incorporates scope from two other now-outdated guides as well: “Science and Risk-Based Approach for the Delivery of Facilities, Systems, and Equipment” and “Applied Risk Management for Commissioning and Qualification.” So if you have these in your library, you can safely archive them.

It’s always important to note that industry guides such as this one do not constitute regulations and are not required to be followed. It is often the case, however, that best practices documented in guides become industry standard and then set expectations for regulators. Per the guide, it is intended to comply with EU GMP Annex 15, the FDA Guidance on Process Validation, and ICH Q9.

The table of contents shows the following sections:

  1. Introduction
  2. User Requirements Specification
  3. System Classification
  4. System Risk Assessment
  5. Design Review and Design Qualification
  6. C&Q Planning
  7. C&Q Testing and Documentation
  8. Acceptance and Release
  9. Periodic Review
  10. Vendor Assessment for C&Q Documentation Purposes
  11. Engineering Quality Process
  12. Change Management
  13. Good Documentation Practice for C&Q
  14. Strategies for Implementation of Science and Risk-Based C&Q Process

And the following appendices:

  1. Regulatory Basis
  2. User Requirements Specification Example
  3. System Classification Form Example
  4. Direct Impact System Examples
  5. System Risk Assessment Example
  6. Design Review/Design Qualification Examples
  7. Supporting Plans
  8. System Start-Up Examples
  9. Discrepancy Form Example
  10. Qualification Summary Report Examples
  11. Periodic Review Example
  12. Periodic Review for Controlled Temperature Chambers
  13. Vendor Assessment Tool Example
  14. Organizational Maturity Assessment Example
  15. Approach to Qualifying Legacy Systems or Systems with Inadequate Qualification
  16. References
  17. Glossary

You’ll have to purchase the guide to get all the details, but below are some highlights that stuck out to me:

  • This second edition introduces the term Critical Design Elements (CDEs). CDEs are defined as “design functions or features of an engineered system that are necessary to consistently manufacture products with the desired quality attributes.”
  • Concepts that were removed from this edition of the guide include Component Criticality Assessment, Enhanced Commissioning, Enhanced Design Review, Enhanced Document, Indirect Impact (systems are either direct impact or not direct impact now), and the V-Model.
  • A Direct Impact system is defined as a system that directly impacts product CQAs, or directly impacts the quality of the product delivered by a critical utility system. All other systems are considered to be not direct impact. An example included in section 3 demonstrates that the previously categorized “indirect impact” systems would become not direct impact systems and would be commissioned only, although the commissioning for these systems may be more robust than for a purely “no impact” system. The guide provides an eight (8) question process for determining if a system is direct impact.
  • System boundaries should be marked on design drawings.
  • Inputs to the URS should include: CQAs, CPPs, regulatory, organization quality, business, data integrity and storage, alarm, automation, and health, safety, and environmental requirements, and engineering specifications and industry standards. The example URS template does include a classification of each requirement (e.g. business, safety, quality).
  • A system risk assessment is performed to identify CDEs and the controls required to mitigate risks. Standard off-the-shelf systems typically do not require a risk assessment. Risk levels are defined as low, medium, and high, and the risk assessment approach is not a typical FMECA process. Instead, each CQA at each step gets one entry describing how the CQA can be impacted, the design controls around that CQA, and any alarm or procedural controls that mitigate the risk. The residual risk after controls is documented as low, medium, or high. (A rough sketch of such a risk entry appears after this list.)
  • Design Qualification looks somewhat informal: no DQ protocol, but a DQ report that summarizes other documents (URS, SIA) and design review meetings.
  • A C&Q plan should include clear scope, the execution strategy, documentation expected for each system (URS, FAT, SAT, IOQ, SOPs, etc.), and roles and responsibilities (e.g. approval matrix).
  • The discrepancy form has closure signatures only (no pre-implementation signatures).
  • For legacy systems without adequate C&Q documentation, focus should be on identifying product and process user requirements including Critical Quality Attributes (CQAs) and Critical Process Parameters (CPPs), and then the Critical Design Elements (CDEs) that affect them. It is necessary to confirm that accurate drawings exist, that maintenance files are up-to-date, and that there is test evidence to support changes since commissioning. A risk-based approach can be used to qualify the system in the absence of typical C&Q documentation.
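
To make the risk assessment entry structure described above a little more concrete, here is a minimal sketch of one such entry as a Python data structure. The field names and the example CQA/process step are my own illustrative assumptions, not the guide’s template.

```python
from dataclasses import dataclass, field
from enum import Enum


class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


@dataclass
class RiskEntry:
    """One entry per CQA per process step, per the approach described above.

    Field names are illustrative assumptions, not the guide's template.
    """
    process_step: str
    cqa: str
    how_impacted: str                                              # how the step can impact the CQA
    design_controls: list[str] = field(default_factory=list)      # CDEs / design controls
    procedural_controls: list[str] = field(default_factory=list)  # alarms, SOPs
    residual_risk: RiskLevel = RiskLevel.HIGH                      # post-controls: low, medium, or high


# Hypothetical example entry (illustrative only)
entry = RiskEntry(
    process_step="Buffer hold",
    cqa="Bioburden",
    how_impacted="Extended hold time at ambient temperature allows microbial growth",
    design_controls=["Jacketed vessel with temperature control"],
    procedural_controls=["High-temperature alarm", "Maximum hold time defined in SOP"],
    residual_risk=RiskLevel.LOW,
)
```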

Do you use the ISPE guides for your C&Q approach? Comment below.

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

 

Validation Project Plans


Hello good people of the world! Today’s post is about Validation Project Plans, which is a specific type of project plan for projects in the pharmaceutical, biotechnology, and medical device regulated industries. This post covers Validation Project Plans for pharmaceutical/biotechnology industries in particular.

Often I’ve seen Validation Project Plans contain a lot of fluff but little meat, making them of less value to the project team. A good project plan clearly documents the following, at a minimum:

  1. What facilities, systems, and equipment are in scope of the plan
  2. What are the expected activities and deliverables
  3. Who is responsible for what
  4. What is the validation approach and rationale for that approach
  5. What happens after the validation scope covered in the plan is completed (i.e. ongoing requirements)

Note I do not include project cost or schedule in a project plan, because these often change rapidly and should be maintained in a less controlled, more flexible manner (e.g. the schedule in scheduling software).

The plan itself should be controlled (i.e. approved and revision controlled) as early in the project as possible, but not so early that the scope will still change (too much).

Additional things to think about when drafting your plan:

  1. Commissioning versus Qualification versus Validation. If your project has multiple phases (and any decent-sized project should), be sure to clearly state the responsibilities and deliverables for each phase.
  2. Include references to regulations, industry guidance, and site procedures that govern your plan. Make it clear to everyone who reads the plan what framework you are working inside.
  3. The purpose and scope of the document should be clear and up front.
  4. Get buy-in from all functional groups by having them approve the document.
  5. Like all controlled documents, the plan should have version/revision history.
  6. Use tables to clearly present information.

I put together a quick template here:

Validation Project Plan Template MWV

What do you feel is necessary in a Validation Project Plan? Comment below.

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

 

 

Specifications: How to Write Them

GAMP V Model

Hello good people of the world! Today’s post is about the left side of GAMP’s V-model: specifications. Specifically, their purpose and how to write them.

Of course there are many variations of the V-model, and it is best to find what suits your organization and processes. For the purposes of this discussion, I’ll refer to the basic GAMP V-model, pictured above.

The specifications are: User Requirements Specification, Functional Specification, and Design Specification. In general, each feeds into the next. Also, Installation Qualification may test the details of the Design Specification, Operational Qualification may test the functional descriptions in the Functional Specification, and Performance Qualification may test the high-level requirements of the User Requirements Specification, although these boundaries are often blurred in practice.

Any new project should start with a User Requirements Specification which clearly defines the testable user requirements. It is important that requirements are testable, and often a SMART approach is applied: each user requirement should be Specific, Measurable, Achievable, Relevant, and Time-bound. It is also helpful to categorize user requirements upfront, since not all will be quality-related. This makes it easier to rationalize which requirements are explicitly tested in qualification protocols, which are covered by commissioning, and which are not tested at all. Typical categories include: business, safety, maintenance, and quality.
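
To illustrate categorizing requirements upfront and keeping them testable, here is a minimal sketch of a single URS entry as a Python data structure. The field names, categories, and the example requirement are illustrative assumptions on my part, not content from GAMP.

```python
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    BUSINESS = "business"
    SAFETY = "safety"
    MAINTENANCE = "maintenance"
    QUALITY = "quality"


@dataclass
class UserRequirement:
    """A single URS entry with an upfront category and a testable acceptance criterion."""
    req_id: str
    text: str
    category: Category
    acceptance_criterion: str  # the measurable condition that makes the requirement testable


# Hypothetical example (illustrative only): specific, measurable, and quality-related,
# so it would be traced to a qualification protocol rather than commissioning only.
req = UserRequirement(
    req_id="URS-014",
    text="The autoclave chamber shall reach and hold sterilization temperature.",
    category=Category.QUALITY,
    acceptance_criterion="Chamber temperature >= 121 degC for >= 15 minutes during exposure",
)
```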

The Functional Specification is then a response to the User Requirements Specification, typically provided by the vendor, explaining in detail how automated components will function to fulfill the user requirements. Functional Specifications are often confused with Functional Requirements or a Functional Requirements Specification, which may be a separate document defined by your process. GAMP’s V-model does not intend the Functional Specification to document new or more detailed requirements, but to define the functionality employed to meet the requirements defined in the User Requirements Specification. The Functional Specification can describe the sequence of operations, interlocks, alarms, etc.

The Design Specification should provide sufficient detail that an engineer could recreate the control system from scratch if need be, recreating the functionality described in the Functional Specification to meet the user requirements. The Design Specification is typically provided by the vendor and should contain such details as an I/O list, an alarm list, HMI security levels, and sequence of operations details including device positions at each step and transition conditions. This should be a very detailed document; if you’re working with 10-20 pages, it is too light.
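
For a feel of the kind of detail involved, here is a minimal sketch of two Design Specification elements, an I/O list entry and one sequence step, as Python data structures. The field names and example values are illustrative assumptions, not GAMP content or any vendor’s actual format.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class IOPoint:
    """One entry in the I/O list."""
    tag: str                           # instrument/device tag
    description: str
    io_type: str                       # e.g. "AI", "AO", "DI", "DO"
    units: str = ""
    alarm_high: Optional[float] = None
    alarm_low: Optional[float] = None


@dataclass
class SequenceStep:
    """One step of the sequence of operations, with device positions and the transition condition."""
    number: int
    name: str
    device_positions: dict[str, str]   # device tag -> commanded state at this step
    transition_condition: str


# Hypothetical examples (illustrative only)
ti_101 = IOPoint(tag="TI-101", description="Chamber temperature", io_type="AI",
                 units="degC", alarm_high=124.0, alarm_low=118.0)

step_3 = SequenceStep(
    number=3,
    name="Exposure",
    device_positions={"XV-201": "closed", "XV-202": "closed"},
    transition_condition="TI-101 >= 121 degC for 15 min",
)
```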

Documentation can be expensive and may not be fun to generate and review, but it is critical to a highly effective validation program.

What do you like to see in specifications?

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

 

Basic Components of a Test Form

Hello good people of the world! Today’s post is a short one on the basic components of a test form. In Validation, you’re going to record a lot of data, and you want this data to be well organized and easily understood. Here are the basic components I think every test form should have (a rough sketch of such a form as a data structure follows the list):

  1. Numbering! Each step should have a unique number so that it is easily identifiable and easy to reference elsewhere.
  2. A Title! What is this test all about? A short description should be provided.
  3. Purpose! What is the purpose of the test? Make it clear.
  4. Verification Steps! Clearly define what steps need to be performed.
  5. Expected Results! Clearly define what the expected results are. Does every step need an expected result? Every step can have one, so include one for each.
  6. Actual Results! This is where the actual data is collected. The actual results can be recorded exactly as the expected results are stated to avoid any confusion.
  7. Pass/Fail! Did the step pass or fail? A pass/fail column tells you at a glance, and is also a good place to reference any comments.
  8. Initials/Date! In order for the data to be attributable, initials and date uniquely identifying the test executor must be included for each step.
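
Here is a minimal sketch of such a test form as a Python data structure, covering the eight components above. The field names and the example step are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class TestStep:
    """One numbered row of the form."""
    number: str                           # unique, easy to reference elsewhere
    verification: str                     # the action to perform
    expected_result: str                  # what should be observed
    actual_result: Optional[str] = None   # recorded during execution
    passed: Optional[bool] = None         # pass/fail; comments referenced as needed
    initials: Optional[str] = None        # uniquely identifies the test executor
    date: Optional[str] = None            # makes the data attributable


@dataclass
class TestForm:
    """A test form: title, purpose, and its numbered steps."""
    title: str
    purpose: str
    steps: list[TestStep] = field(default_factory=list)


# Hypothetical example (illustrative only)
form = TestForm(
    title="Door interlock verification",
    purpose="Confirm the chamber door cannot be opened while a cycle is running",
    steps=[
        TestStep(
            number="1",
            verification="Start a cycle and attempt to open the chamber door",
            expected_result="Door remains locked and an interlock alarm is raised",
        )
    ],
)
```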

What basic components do you include on your test forms? Comment below.

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

What NOT to Re-qualify

Hello good people of the world! Today’s post is on what NOT to include in requalification/revalidation. I was recently on a site that had a five (5) year requalification requirement for sterilizers per a site SOP, which sounds reasonable (continuous monitoring would be better). But then I noted that their requalification requirements included a re-execution of the entire initial control system IOQ! The requalification included verification of:

  • Hardware/software installation
  • E-stop, guarding, and door interlocks
  • Restart and recovery
  • Recipe management
  • Temperature, pressure, and time control
  • Communication
  • Security

And it was expected that this would be done every five years! It just so happened that in 2014 they paid a contractor to do the work, who sadly did not help the site out by pointing out the wastefulness of such an endeavor. This is an egregious example of misusing resources and of failing to understand the expectations of a validation program or to take a risk-based approach. The point of requalification/revalidation is to look for drift in processes, not to blindly repeat testing already performed.

What misunderstanding-of-validation-expectation horror stories do you have? Comment below.

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

Serialization Basics

Hello good people of the world! Today’s post is high-level regarding serialization. Serialization is a process mandated by the world’s regulatory agencies to reduce counterfeit drug products in the market. Besides being costly to drug companies, counterfeit drug products are often less efficacious and less safe than the real drug they are purporting to be. Additionally, counterfeit drug products can be contaminated with other APIs and/or toxic excipients.

How to Store an Electronic Signature


Hello good people of the world! Today’s post is a short one on the topic of electronic signatures. 21 CFR Part 11 includes requirements around the use of electronic signatures in place of traditional handwritten signatures, and these requirements are fairly well understood at this point. The question for this post is: how should your electronic system store the electronic signature?

It is important to understand that electronic signatures are meta-data. That is, they are data about data. So if you have an electronic batch record, for example, the electronic signature recorded during approval is not part of the batch record data, but meta-data of the batch record data. In this example, it is specifically the who, what, and when of the batch record approval.

Given that, electronic signature data should not be included in the same record (table row, document, etc.) as the data being signed. It is meta-data and should be handled as such, an important consideration in electronic system design. This will ensure electronic signature data is robust, auditable, and readily available.
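
As a minimal sketch of that separation, here is one way it might be modeled in Python, with the signature kept as its own meta-data record that points to the batch record it signs. The class and field names are illustrative assumptions, not requirements from 21 CFR Part 11.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class BatchRecord:
    """The record data itself; no signature fields embedded here."""
    record_id: str
    content: dict


@dataclass
class ElectronicSignature:
    """Signature meta-data kept in its own store, linked to the record it signs."""
    record_id: str        # which record was signed (the "what")
    signed_by: str        # unique user identity (the "who")
    signed_at: datetime   # timestamp of signing (the "when")
    meaning: str          # e.g. "approved", "reviewed"


# Hypothetical example (illustrative only)
batch = BatchRecord(record_id="BR-2023-0042", content={"lot": "A1234", "yield_kg": 12.5})
signature = ElectronicSignature(
    record_id=batch.record_id,
    signed_by="jdoe",
    signed_at=datetime(2023, 5, 1, 14, 30),
    meaning="approved",
)
```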

How do you store electronic signatures? Any lessons learned? Comment below.

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!