Measurement data management (MDM) refers to the efficient management and organization of vast quantities of measurement data, also known as “big test data.” It is the prerequisite for ensuring that this data remains available in a suitable form for long-term use, processing and analysis.
Efficient measurement data management – what tasks does it involve?
Measurement data must be backed up and secured against unauthorized access. In most cases, the data originates in various formats from a variety of measuring instruments and test stands, so appropriate import and export functions are needed to exchange it seamlessly. So that data can be located, interpreted and networked as rapidly as possible, and (for example) to allow data exchange among teams, the data needs to be indexed with meaningful descriptions: this descriptive information is known as “metadata.” Metadata includes (but is not restricted to) details about the test object, the measurement equipment used, physical units, time, the measurement setup, and the test objectives and procedures. Indexing is the basis for queries, for presenting data in specific views, and for restricting results to specific datasets.
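A minimal sketch can illustrate how metadata indexing enables such queries. All names here (`Dataset`, `MetadataIndex`, the metadata keys) are hypothetical and purely illustrative, not part of any real MDM product:

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    path: str  # where the raw measurement file lives
    # descriptive key/value pairs: test object, unit, setup, ...
    metadata: dict = field(default_factory=dict)

class MetadataIndex:
    """Hypothetical in-memory index over dataset metadata."""

    def __init__(self):
        self.datasets = []

    def add(self, dataset: Dataset):
        self.datasets.append(dataset)

    def query(self, **criteria):
        """Return all datasets whose metadata matches every given key/value."""
        return [d for d in self.datasets
                if all(d.metadata.get(k) == v for k, v in criteria.items())]

index = MetadataIndex()
index.add(Dataset("runs/001.dat", {"test_object": "gearbox", "unit": "Nm"}))
index.add(Dataset("runs/002.dat", {"test_object": "axle", "unit": "Nm"}))

hits = index.query(test_object="gearbox")  # finds only runs/001.dat
```

A production system would persist the index in a database and support richer predicates (ranges, full-text search), but the principle is the same: queries run against the metadata, not the raw measurement files.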
Key requirements for a high-performance measurement data management (MDM) system:
- Data storage, cloud solutions
- Data backup
- Data checking: detect duplicate datasets and repair corrupted ones
- Data protection, including encryption options; rights assignment
- Flexible indexing system to ensure maximum transparency and searchability
- Extensive import and export functions to load different formats and data from different manufacturers’ devices
- Comprehensive search and filter functions
- Options to generate complex queries and save them for re-use in the future
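One of the requirements above, detecting duplicate datasets, can be sketched with content hashing. This is a hypothetical illustration, not a specific product feature:

```python
import hashlib
from collections import defaultdict

def find_duplicates(files: dict) -> list:
    """Group byte-identical datasets by content hash.

    `files` maps a dataset name to its raw bytes; returns groups of
    names whose contents are identical (only groups with >1 member).
    """
    by_hash = defaultdict(list)
    for name, payload in files.items():
        by_hash[hashlib.sha256(payload).hexdigest()].append(name)
    return [names for names in by_hash.values() if len(names) > 1]

files = {
    "run_a.dat": b"\x01\x02\x03",
    "run_b.dat": b"\x01\x02\x03",  # byte-identical copy of run_a
    "run_c.dat": b"\x09\x08",
}
dupes = find_duplicates(files)  # → [["run_a.dat", "run_b.dat"]]
```

Hashing only catches byte-identical copies; recognizing semantically equivalent datasets in different file formats would additionally require comparing the decoded measurement values.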
What opportunities does an efficient measurement data management system offer?
- Fast, straightforward availability of measurement data – no matter how complex
- High data transparency that boosts efficiency and helps cut costs: different teams can exchange and benefit from each other’s measurement data, which avoids running identical tests multiple times in different projects
- Data is optimally prepared for post-processing – e.g. downstream analysis of measurement data and data mining
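Preparing data for post-processing typically means exporting it, together with its metadata, in a uniform format that downstream analysis tools can consume. A minimal sketch, with purely illustrative field names:

```python
import csv
import io

def export_csv(records: list) -> str:
    """Serialize measurement records (metadata + values) to CSV text.

    Assumes all records share the same keys, so one header row suffices.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

rows = [
    {"test_object": "gearbox", "unit": "Nm", "value": 41.7},
    {"test_object": "gearbox", "unit": "Nm", "value": 42.1},
]
csv_text = export_csv(rows)
```

Because the metadata travels with each value, the exported file stays self-describing: a statistics or data-mining tool can filter and group the measurements without consulting the original MDM system.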