Validation of calibration software, as required by ISO 17025 for instance, is a topic that people prefer not to talk about. Often there is uncertainty about the following: Which software actually has to be validated? Who should take care of it? Which requirements must the validation satisfy? How can it be carried out efficiently, and how should it be documented? The following post explains the background and provides a recommendation for implementation in five steps.
In a calibration laboratory, software is used for anything from supporting the evaluation process up to fully automated calibration. Regardless of the degree of automation, validation always covers the entire process into which the software is integrated. Behind validation, therefore, lies the fundamental question of whether the calibration process fulfills its purpose and achieves all its intended goals, in other words: does it deliver the required functionality with sufficient accuracy?
To be able to carry out validation tests, you should be aware of two basic principles of software testing:
Full testing isn’t possible.
Testing is always influenced by the environment.
The former states that testing all possible inputs and configurations of a program is not feasible because of the sheer number of possible combinations. With regard to the application, the user must therefore always decide which functions, configurations and quality features are to be prioritised and which are not relevant for them.
Which decision is made often depends on the second point: the operating environment of the software. Depending on the application, there are always different requirements and priorities for the use of the software in practice. In addition, there are customer-specific adjustments to the software, for example concerning the contents of the certificate. The individual conditions in the laboratory environment, with a wide range of instruments, also generate variance. The wide variety of requirement perspectives and the almost endless complexity of program configurations within customer-specific application areas therefore make it impossible for a manufacturer to test for all the needs of a specific customer.
Taking the above points into account, validation therefore falls to the user. To make this process as efficient as possible, a procedure along the following five points is recommended:
The data for typical calibration configurations should be defined as "test sets".
At regular intervals, typically once per year, but at the latest after any software update, these test sets should be entered into the software.
The resulting certificates can then be compared with those from the previous version.
In the case of an initial validation, a cross-check, e.g. via MS Excel, can be carried out instead (see the sketch after this list).
The validation evidence should be documented and archived.
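As an illustration of the comparison and documentation steps, here is a minimal sketch of how such a regression check could be automated. It assumes, purely hypothetically, that the calculated certificate values can be exported as simple two-column CSV files (test point, value); the file names, tolerance and report path are placeholders and not part of any particular calibration software.

```python
import csv
from pathlib import Path

# Hypothetical file names and tolerance; adapt to the actual certificate export.
REFERENCE_FILE = Path("certificate_previous.csv")  # results from the previously validated version
CANDIDATE_FILE = Path("certificate_new.csv")       # results from the version under validation
TOLERANCE = 1e-6                                   # acceptable numerical deviation


def load_results(path: Path) -> dict[str, float]:
    """Read a two-column CSV (test point, calculated value) without a header row."""
    with path.open(newline="") as f:
        return {row[0]: float(row[1]) for row in csv.reader(f)}


def compare(reference: dict[str, float], candidate: dict[str, float]) -> list[str]:
    """Return human-readable findings for the validation record."""
    findings = []
    for point, ref_value in reference.items():
        if point not in candidate:
            findings.append(f"{point}: missing in the new version")
        elif abs(candidate[point] - ref_value) > TOLERANCE:
            findings.append(f"{point}: {candidate[point]} deviates from {ref_value}")
    return findings


if __name__ == "__main__":
    deviations = compare(load_results(REFERENCE_FILE), load_results(CANDIDATE_FILE))
    # Document and archive the outcome as validation evidence.
    report = "\n".join(deviations) if deviations else "All test points within tolerance."
    Path("validation_report.txt").write_text(report)
    print(f"{len(deviations)} deviation(s) found; see validation_report.txt")
```

Whether such a script, a spreadsheet cross-check or a manual comparison is used, the decisive point is that the result is recorded and archived as validation evidence.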
WIKA provides PDF documentation of the calculations carried out in the software.
Note
For further information on our calibration software and calibration laboratories, go to the WIKA website.