Validating automation

Peter Woods, head of medical device manufacturing at GB Innomech, discusses the challenges involved in designing flexible automation for medical devices and other regulated production environments.

Dr Peter Woods, GB Innomech

It is not uncommon to specify and supply special-purpose automation for assembly or testing of a new product line while the design of the product itself is still evolving. The challenge for the automation provider is to build flexibility into machine setup while still enabling the required process control once the finalised product is in production.

The ability to fine-tune manufacturing processes is vital to maintaining consistent quality while achieving the best possible yield. In the manufacture of medical devices, however, the potential impact on human health of a faulty assembly means that almost every assembly and testing step must be shown to operate correctly within a well-defined range of process parameters.

The exercise of formally documenting that the machine, as well as the overall manufacturing process, achieves this is at the heart of the validation process that applies in these industries. Machine validation is far from a trivial task, but it is one that can be significantly simplified if validation is anticipated in the design of the automation. The techniques that solve these issues in regulated production environments can also be helpful in other industry sectors.

Anticipating product design

To validate a special purpose machine, every aspect of its functional requirements must be traceable to tests carried out on the built machine, with clear and objective criteria laid out for passing each test.

For example, an end-of-line test function to prove the correct assembly and operation of a probe used in keyhole surgery may incorporate sensors to measure the force the surgeon needs to apply to operate a slide valve.

End-of-line testing equipment – for example for multi-dose injector pens, as shown here – needs to be fully optimised for fast throughput, but also with 'lockdown' to prevent production staff from adjusting system parameters in an attempt to shorten test times.

The validity of the measurement will need to be established using a specially prepared set of product samples, including examples spanning the range of acceptable variation as well as samples outside the acceptable range. At the start of the project, though, there are no production samples, and truly representative examples might only become available once the production line has been constructed and is capable of producing them.

Let’s use this example to illustrate four possible pitfalls:

  1. The final acceptable range of a test will differ from the range defined when the testing system is first specified.

In our example, a certain minimum and maximum force should be required to operate the valve for it to be ergonomic for the surgeon.

This ergonomic range may be quite broad, but any particular device design will involve a much narrower range, centred on the properties of the slider as implemented in that design. To confirm correct assembly, the measurement system will need to be configured to apply a narrow pass criterion corresponding to the final design.

This can be dealt with by designing the user interface so that not only the test thresholds themselves but also the range over which they can be adjusted are configurable as needed, both during development and in subsequent use of the machine, rather than being hard-coded or accessible only to software engineers.
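As a minimal sketch of this idea, a threshold can be stored as a value together with its own permitted adjustment range, both held as configuration data. The class and parameter names below are hypothetical, not taken from any particular machine:

```python
from dataclasses import dataclass

@dataclass
class TestThreshold:
    """A pass/fail limit whose value AND its permitted adjustment
    range are configuration data rather than hard-coded constants."""
    name: str
    value: float        # current pass/fail limit
    adjust_min: float   # lowest value an authorised user may set
    adjust_max: float   # highest value an authorised user may set

    def set_value(self, new_value: float) -> None:
        # Reject settings outside the configured adjustment range, so the
        # machine can never be run outside the band it was validated for.
        if not (self.adjust_min <= new_value <= self.adjust_max):
            raise ValueError(
                f"{self.name}: {new_value} outside permitted range "
                f"[{self.adjust_min}, {self.adjust_max}]")
        self.value = new_value

# During development the adjustment range can be the broad ergonomic band;
# for production it is narrowed to the final design's expected band.
max_force = TestThreshold("slide valve max force (N)", 8.0, 2.0, 15.0)
max_force.set_value(6.5)     # allowed: inside the configured range
```

Because the adjustment range is itself configuration, narrowing it for production is a documented data change, not a software modification.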

A little psychology helps too, as illustrated by a further example, this time relating to a leak test of the same device.

One parameter of the test is the delay between establishing initial conditions and applying the measurement to detect a leak in the probe. Another is the measurement period over which results are averaged. The units used to define these are important: a resolution of one second would be considered too coarse, so defining these times to millisecond resolution might seem to address any such objection.

However, in that case, production staff keen to maximise throughput will be tempted to set the measurement time to a few milliseconds – shorter than the time needed to make any meaningful measurement.

Conversely, in order to diagnose machine behaviour during development, a delay of a minute or two might be required – a value which, expressed in milliseconds, is too large to fit in the 16-bit integer parameters common in machine control systems.

Defining these parameters in tenths of a second removes the possibility of such unphysical conditions without seeming to impose artificial restrictions on production staff. This in turn simplifies validation and removes the risk of the machine being operated outside a validated range.
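The same principle can be sketched in a few lines: the delay is held as an integer number of tenths, and the limits (the values here are illustrative, not from any real machine) exclude both millisecond-scale settings and overflow-prone ones by construction:

```python
# Times are stored as integer tenths of a second: fine enough not to feel
# restrictive, coarse enough to exclude physically meaningless settings.
DELAY_MIN_TENTHS = 5      # 0.5 s  - shortest physically useful delay
DELAY_MAX_TENTHS = 1800   # 180 s  - long enough for diagnostic runs

def set_delay_tenths(tenths: int) -> float:
    """Validate a delay given in tenths of a second; return it in seconds."""
    if not isinstance(tenths, int):
        raise TypeError("delay must be an integer number of tenths")
    if not (DELAY_MIN_TENTHS <= tenths <= DELAY_MAX_TENTHS):
        raise ValueError(f"delay {tenths / 10:.1f} s outside validated range")
    return tenths / 10.0
```

With this representation, a "few milliseconds" setting is simply not expressible, so there is nothing for the validation protocol to guard against.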

  2. Implementing lockdown to prevent unauthorised changes to system parameters and operation of machine functions will interfere with the testing needed to show that a function itself operates as intended.

To ensure the correct records are generated, the machine is typically designed to operate only when a valid batch code has been set up and the preliminary machine checks have been executed and passed. Process parameters are then frozen so that the same process is demonstrably applied to the entire batch.

This security is a real obstacle when first commissioning the machine, especially one that performs a series of processing stages, since the whole system would need to be functioning before each station could be set up. One way to resolve this is to provide maintenance functions that allow the complete process sequence at a station to be executed exactly as in production mode, but in isolation.

The extra work of designing and coding these maintenance functions is repaid many times over during commissioning and testing, and they are deliberately designed as a permanent part of the delivered machine, with appropriate access control built in.
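A skeleton of this arrangement might look as follows; the class names, the "engineer" role, and the station structure are all assumptions for illustration. The key point is that production and maintenance modes call the same sequence code, so maintenance testing exercises the real process:

```python
class Station:
    """One processing station; run_sequence() is the single code path
    used in both production and maintenance modes."""
    def __init__(self, name: str):
        self.name = name

    def run_sequence(self, part: str) -> str:
        # ... the actual process steps for this station would go here ...
        return f"{self.name} processed {part}"

class Machine:
    def __init__(self, stations: list):
        self.stations = stations
        self.batch_code = None

    def start_batch(self, batch_code: str) -> None:
        # Process parameters are frozen from this point for the batch.
        self.batch_code = batch_code

    def run_production(self, part: str) -> list:
        # Production lockdown: no valid batch code, no operation.
        if self.batch_code is None:
            raise PermissionError("no valid batch code set up")
        return [s.run_sequence(part) for s in self.stations]

    def run_maintenance(self, station_name: str, part: str, role: str) -> str:
        # Same sequence as production, but one station in isolation,
        # gated by access control rather than by batch setup.
        if role != "engineer":
            raise PermissionError("maintenance mode requires engineer access")
        station = next(s for s in self.stations if s.name == station_name)
        return station.run_sequence(part)
```

Because `run_maintenance` reuses `run_sequence` unchanged, evidence gathered in maintenance mode remains representative of production behaviour.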

  3. The output from an externally calibrated sensor, once integrated into the machine control system, may require a secondary calibration of the machine as a whole.

An example is the load cell used to measure the force needed to move the slider in the surgical probe. The load cell's voltage output can be calibrated by an accredited (e.g. UKAS-approved) test house, but when this voltage is converted into a digital value within the machine, arbitrary offset and scale factors may render the load cell calibration meaningless.

One solution is to use the load cell with a dedicated amplifier that outputs a character string corresponding to the force reading in newtons. The amplifier/load cell combination can then be treated as a single item for calibration purposes, making further scaling or manipulation unnecessary.
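On the control-system side, all that remains is to parse the string. The reading format below (a signed decimal followed by "N") is an assumed example, not the protocol of any particular amplifier:

```python
import re

def read_force_newtons(amplifier_line: str) -> float:
    """Parse a force reading reported directly in engineering units,
    e.g. '+012.75 N'. Because the value arrives already in newtons,
    no in-machine offset or scale factor is applied, and the external
    load cell/amplifier calibration remains valid end to end."""
    match = re.fullmatch(r"\s*([+-]?\d+(?:\.\d+)?)\s*N\s*", amplifier_line)
    if match is None:
        raise ValueError(f"unrecognised amplifier reading: {amplifier_line!r}")
    return float(match.group(1))

# read_force_newtons("+012.75 N")  ->  12.75
```

Since the conversion is a pure format change with no arithmetic on the value, validating it reduces to checking that the recorded number matches the transmitted string.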

In another example, involving flow measurement, the calibrated flow meter was chosen to have its own digital display as well as a serial output to the control system. In this case, the correspondence between the readout display and the recorded results is immediately obvious, obviating any need for detailed analysis of the computer code to establish validation.

  4. The manufacturer’s abstract requirement for a machine function might only become crystal clear once the machine itself has been designed.

The temptation is to restate the requirement in terms of the intended solution. But if the machine implementation then has to change because of a change in the product, the machine no longer meets the restated requirement.

This is another example where human psychology pushes in the opposite direction to the best overall solution.

Engineers developing the machine, and the QA staff responsible for validation on the customer side, all prefer to aim for clarity of detail so that software designs and test protocols can be fully defined as early as possible.

However, when the design of the product itself is still in a state of flux, there is a risk that these details will have to be revisited, and once defined, the cost of changing the machine design, specifications and test protocols to keep up can be considerable even in the simplest of cases. This can be avoided by explicitly acknowledging what is not yet fixed and providing suitable configuration functions instead of assuming fixed values.

About GB Innomech

GB Innomech (Innomech) specialises in automating highly complex and labour-intensive manufacturing processes to maximise outputs, improve product quality and boost business performance.

The company works with major international manufacturers in sectors such as pharmaceuticals, medical devices and environmental, as well as earlier-stage businesses looking to bring breakthrough technologies or products to market.

The company was founded in 1990, is based at The Innovation Centre in Witchford, north of Cambridge and was awarded The Queen’s Award for Enterprise 2009 to recognise its sustained growth in international markets.

For additional information about GB Innomech please visit or contact: