Tag Archives: GAMP

Montrium Connect Compliance Concerns

Hello good people of the world! Today we’re talking about a specific Software-as-a-Service (SaaS) product: Montrium’s Connect. Montrium offers a number of modules in their Connect software (Document Management, Training, CAPA, Incidents, etc.). Today I’ll focus on the Document Management module.

What makes Montrium’s offering unique is that it is built on top of Microsoft SharePoint. I previously talked about SharePoint Online with respect to compliance concerns here (a little out-of-date, but still relevant).

The first point I’ll make is that having the application built on SharePoint brings some significant advantages and disadvantages. The primary advantage I see, in comparison with other electronic Document Management Systems (eDMS), is that SharePoint uses Microsoft’s Office Online suite, including arguably the world’s best online word processor: Word. I am not aware of any other online word processor as fully featured. I have used other eDMSs that have their own word processor, and having fewer features can be really frustrating.

That said, SharePoint also brings its clunky user interface and aging ASP.NET (.aspx) page architecture. The application won’t feel as snappy as modern websites, and you’ll see full page reloads for things that would be handled by a component re-render in more modern applications. Overall the application feels very slow. I sometimes found myself waiting minutes for items moving through a workflow to appear in my task list.

An example of Montrium / Sharepoint UI

The first thing that struck me about the compliance aspect of Montrium’s offering is that they have categorized their Connect SOP (the brand name for the Document Management module) as GAMP category 3 software. GAMP category 3 is commercial-off-the-shelf (COTS), non-configurable software. I don’t know how they consider this software non-configurable, because there are many configuration options that change how it functions, including workflows; configured products would normally fall under GAMP category 4. Classifying it as category 3 can lead end users to skip creating a configuration specification and to skip testing the configuration against their specific intended use. This could be a compliance risk.
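
To illustrate what I mean (these are generic eDMS examples, not taken from Montrium’s documentation), a configuration specification would typically capture items such as:

  • Document types and their required metadata fields
  • Review and approval workflow steps, and the roles assigned at each step
  • Periodic review intervals
  • User roles and permission levels

Each of these changes how the system behaves for your intended use, which is exactly what a configuration specification and configuration testing are meant to cover.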

Another thing I noticed is the audit trail functionality. There is no interface for viewing the audit trail; instead, it is automatically exported to a protected Excel file every 28 days. I find it strange that the audit trail is not available in real time, and think this could introduce some compliance risk. It also falls into the trap of including at least some non-human-readable data. See the example below:

Audit Trail Example
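
If you do want to review an export like this programmatically, here is a minimal sketch of the kind of check I have in mind. The file name and column layout are hypothetical placeholders, not Montrium’s actual export format; the script simply flags GUID-style values that a reviewer can’t interpret without a separate lookup.

    # Minimal sketch (hypothetical file name and layout, not Montrium's actual
    # export format): scan an exported audit trail workbook and flag entries
    # that contain GUID-like values a human reviewer cannot interpret.
    import re
    import pandas as pd

    GUID = re.compile(r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-"
                      r"[0-9a-fA-F]{4}-[0-9a-fA-F]{12}")

    audit = pd.read_excel("audit_trail_export.xlsx")  # hypothetical export file

    # True for any row where at least one cell matches the GUID pattern
    has_guid = audit.astype(str).apply(lambda col: col.str.contains(GUID)).any(axis=1)

    print(f"{has_guid.sum()} of {len(audit)} audit trail rows contain "
          "non-human-readable identifiers")

A scan like this doesn’t fix the underlying problem, but it does give you a quick measure of how much of the exported audit trail a human reviewer can actually read.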

So those are just a couple of points of concern with Montrium’s Connect software in a regulated use case.

What has your experience with Montrium Connect been? What is your favorite eDMS? Comment below.

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

Compliance in the Cloud: SaaS, PaaS, and IaaS

Hello good people of the world! Today’s post is the first in a series on compliant software in the cloud. Cloud software runs on remote servers and is typically accessed through a web browser; no software needs to be installed on the client’s local hard drive. Cloud software has significant advantages for users and vendors alike, so it is no surprise that it has become the standard for modern personal and business software applications.

It should also not be surprising that cloud software has made its way into compliant industries such as medical device, pharmaceutical, and biologic manufacturing, which are the focus of this blog. Vendors and customers alike want the advantages that cloud software brings.

And what are the main advantages? It’s all about distribution. With cloud software, vendors can push new features to users without any downtime or need for manual upgrades and installations. Vendors can monitor software use, respond to bugs much more quickly, and make improvements based on how users actually use the software, all without any additional overhead for the user.

The data collected through the routine use of cloud software has proven immensely valuable, and there’s the added bonus of the insights that can be gained when Artificial Intelligence (AI)/Machine Learning (ML) is applied to these big data sets.

Current commercial offerings generally fall into three categories:

  • Software as a Service (SaaS)
  • Platform as a Service (PaaS)
  • Infrastructure as a Service (IaaS)

Working from the bottom of the stack up: IaaS typically provides server hardware, server virtualization, data storage, and networking. PaaS offers all of that plus the server operating system(s), middleware (such as load balancing, malware protection, etc.), and runtime services such as databases. SaaS then offers all of that plus the specific software application to boot.
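
As a rough rule of thumb (the exact boundary varies by provider and contract), the split of responsibility looks something like this:

  • IaaS: the provider manages the hardware, virtualization, storage, and networking; you manage the operating system, middleware, runtime, application, and data.
  • PaaS: the provider additionally manages the operating system, middleware, and runtime; you manage the application and data.
  • SaaS: the provider manages the full stack; you are still responsible for your data, user access, and configuration.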

Examples:

  • IaaS: DigitalOcean, Rackspace, Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP)
  • PaaS: Heroku, Windows Azure, Google App Engine
  • SaaS: Dropbox, Salesforce, Zoom, Facebook

In medical device, pharmaceutical, and biologic manufacturing, companies may use IaaS and/or PaaS to outsource some of their IT needs, and may use SaaS products for functions such as Enterprise Resource Planning (ERP), electronic Document Management Systems (eDMS), Laboratory Information Management Systems (LIMS), etc.

Stay tuned for future blog posts on the subject of compliant cloud computing concerns, including: existing solutions, the validation life-cycle, regulatory documentation expectations, data integrity concerns, 483s, and CSA (Computer Software Assurance).

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

Specifications: How to Write Them

GAMP V Model

Hello good people of the world! Today’s post is about the left side of GAMP’s V-model: specifications. Specifically, their purpose and how to write them.

Of course there are many variations of the V-model, and it is best to find what suits your organization and processes. For the purposes of this discussion, I’ll refer to the basic GAMP V-model, pictured above.

The specifications are: User Requirements Specification, Functional Specification, and Design Specification. In general, each feeds into the next. On the testing side, Installation Qualification may test the details of the Design Specification, Operational Qualification may test the functional descriptions in the Functional Specification, and Performance Qualification may test the high-level requirements of the User Requirements Specification, although these boundaries are often blurred in practice.

Any new project should start with a User Requirements Specification, which clearly defines the testable user requirements. It is important that requirements are testable, and a SMART approach is often applied: each user requirement should be Specific, Measurable, Achievable, Relevant, and Time-bound. It is also helpful to categorize user requirements up front, since not all of them will be quality-related. This makes it easier to justify which requirements are explicitly tested in qualification protocols, which are covered by commissioning, and which are not tested at all. Typical categories include: business, safety, maintenance, and quality.
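
As a worked example (a generic, hypothetical requirement, not taken from any particular project):

URS-014 (Quality): The system shall require a unique user ID and password for each user and shall lock an account after five consecutive failed login attempts.

This requirement is specific and measurable (five attempts), achievable with standard access-control functionality, relevant to expectations such as 21 CFR Part 11 access controls, and straightforward to test in a qualification protocol.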

The Functional Specification is then a response to the User Requirements Specification, typically provided by the vendor, explaining in detail how automated components will function to fulfill the user requirements. Functional Specifications are often confused with Functional Requirements or a Functional Requirements Specification, which may be a separate document defined by your process. GAMP’s V-model does not intend the Functional Specification to introduce new requirements or add further requirement detail, but to define the functionality employed to meet the requirements defined in the User Requirements Specification. The Functional Specification can describe sequences of operation, interlocks, alarms, etc.
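
Continuing the hypothetical example above, the Functional Specification entry answering URS-014 might read: “User authentication is performed against the corporate directory service; accounts are locked after five consecutive failed login attempts and can only be unlocked by a system administrator.” Note that it describes how the requirement will be met; it does not introduce a new requirement.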

The Design Specification should provide sufficient detail that an engineer could recreate the control system from scratch if need be, reproducing the functionality described in the Functional Specification to meet the user requirements. The Design Specification is typically provided by the vendor and should contain details such as the I/O list, alarm list, HMI security levels, and sequence-of-operations details including device positions at each step and transition conditions. This should be a very detailed document; if you’re looking at 10-20 pages, it is too light.
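
As a hypothetical illustration of the level of detail expected, a single I/O list entry in a Design Specification might include: tag name (e.g., TT-101), description (vessel jacket temperature), signal type (4-20 mA analog input), controller address, engineering units and range (0-150 °C), and associated alarm setpoints. Multiply that by every instrument, alarm, HMI screen, and sequence step, and it becomes clear why a complete Design Specification runs far longer than 10-20 pages.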

Documentation can be expensive and is maybe not fun to generate and review, but it is critical to a highly effective validation program.

What do you like to see in specifications?

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!