However, all these tools generate data, and putting that data to adequate use is no easy task, said Udo Margraff, CEO of Laboratoires Réunis in Luxembourg, during our European Hospital interview. Among other issues, data protection proves to be a limiting factor, and the meaningful use of big data falls by the wayside. ‘The laboratory has always been very focused on IT,’ Udo Margraff points out. ‘With around 400,000 analyses per year, we generate around five million medically validated results. However, this is just a part of the data packet. Every analytical result we release additionally contains countless data on the traceability of samples.
This could be the date the sample was taken, specific data for the validation of the laboratory results, the clinical anamnesis, any possible difficulties encountered when blood was taken etc. Altogether we generate up to 100 million data sets in our IT system that the doctor doesn’t even get to see – and these are just the figures from our relatively small, private laboratory.’
All data is evaluated for different processes. ‘If, for instance, we receive feedback from the doctor that the potassium levels of a sample are too high, we test for the cause,’ the CEO explains. ‘We investigate how the sample was transported, or whether there was a problem with the centrifuge or the reagents. These analyses are incredibly complex but of no interest to the referring doctor, as long as they receive the reason for the increased levels.’
Given the sheer data volume, Margraff also sees statistics and studies as potential tools for further evaluation. Unfortunately, though, big data analyses entail two major difficulties, as he admits: ‘Everybody talks about big data, but ultimately this data cannot be utilised as well as we like to imagine.
‘Thanks to certification and standardisation, obtaining validated data is no longer a big problem. We have a lot of very good data available, although this does require the software in surgeries and hospitals to be well maintained. The big problem is consolidating this data for adequate evaluations, and here we are only just starting out.’
A Canadian-Chinese research group, for instance, has demonstrated a connection between vitamin A deficiency and Alzheimer’s disease. ‘Many Alzheimer’s patients have a vitamin A deficiency, which suggests a link,’ says Margraff. ‘I therefore wanted to correlate these findings with data from our laboratory, but out of the 2,600 eligible results from the last three years, only a few hundred could be matched with the corresponding statistics. That is not enough for a statistical evaluation. We lack the clinical patient data required to carry out such studies.’ Making validated statements for publications and studies is therefore almost impossible.
Studies like this would be very interesting for health insurers, especially with a view to prevention. ‘Prevention should begin in childhood, with a healthy diet. Laboratory data is also of interest for prevention research with regard to next-generation sequencing in genetics or the microbiome: diabetes, cardiac disease, autism, Alzheimer’s disease and multi-resistance are all areas where good studies could facilitate fantastic preventive work,’ Margraff emphasises.
When it comes to big data analyses, cost is also a big factor because, he underlines, ‘laboratory diagnostics is expensive, so it is a shame not to utilise the data adequately. Laboratory and imaging diagnostics play a central role in medicine: 60-70% of all diagnoses are made with the help of the laboratory or imaging.’
The Agence eSanté, founded in Luxembourg, is the first eHealth authority to be established in the EU. ‘I hope we will succeed in setting up a server for laboratory data that consolidates all relevant data, making data processing for publications and studies, including those on prevention, epidemics and infections, more efficient and effective,’ says Margraff. However, data protection is proving a big obstacle here.
‘Only by anonymising and standardising data can we satisfy data protection regulations,’ Margraff points out. ‘But I’m hopeful that we will find a solution for the future. Against the backdrop of cross-border healthcare, transregional data usage is also of interest for everyone. There is a growing trend towards people living and working across, and without, borders.’