Forum: help


RE: Problem Filtering Data with a lot of artefacts | By: Robin Anno Wester on 2019-09-03 13:21 | [forum:46955]
Thanks for the clarification! Since I'm taking part in an HRV study for the first time, I thought it might be possible to avoid manual artifact removal. With regard to improving the filtering algorithm: maybe you could use the approach that was used to validate the artifact detection of the ARTiiFACT program, as described in: Kaufmann, T., Sütterlin, S., Schulz, S., & Vögele, C. (2011). ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis. Behavior Research Methods, 43(4), 1161-1170. doi:10.3758/s13428-011-0107-7
RE: Problem Filtering Data with a lot of artefacts | By: Abraham Otero on 2019-09-03 12:37 | [forum:46954]
If I remember correctly (the RHRV project is even older than the R-Forge source code repository suggests), the heartbeat filtering algorithm was developed by Xose Anton Vila in his doctoral thesis, which I think is not available in digital format. I don't think there was a publication devoted just to that algorithm. In any case, the algorithm is not perfect. Improving it is now a pending task for us, and the first step is to create a benchmark that measures the performance of the current algorithm (by the way, we are completely open to external contributions :) ). However, I doubt it is possible to create an algorithm that removes all the artifacts without also eliminating a significant number of normal beats. Both for the current algorithm and for possible future improvements, I believe the optimal solution is to perform automatic filtering with parameters that are not too aggressive (thus avoiding the removal of normal beats) and then review the results manually.
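A minimal sketch of that workflow in R follows; the file name is a placeholder and the parameter values are only illustrative (see ?FilterNIHR for the defaults and their meaning), so this is an example of the approach rather than a recommended configuration:

library(RHRV)
hrv.data <- CreateHRVData()
hrv.data <- SetVerbose(hrv.data, TRUE)
hrv.data <- LoadBeatAscii(hrv.data, "beats.txt")  # beat occurrence times, one per line
hrv.data <- BuildNIHR(hrv.data)                   # build the non-interpolated heart rate series
# Automatic filtering; minbpm/maxbpm bound the physiologically plausible heart rate
hrv.data <- FilterNIHR(hrv.data, minbpm = 25, maxbpm = 200)
# Interactive manual review and removal of the outliers the automatic filter missed
hrv.data <- EditNIHR(hrv.data)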
RE: Problem Filtering Data with a lot of artefacts | By: Robin Anno Wester on 2019-09-03 09:48 | [forum:46949]
Hello Abraham, thank you! Unfortunately the tutorial doesn't give any information on how the algorithm was validated, and neither does the paper referenced in the tutorial. After re-filtering in RHRV with different settings and looking into other programs and their artifact detection features, I now have the impression that for some artifacts there is no way around visual inspection. Best regards, Robin
RE: Problem Filtering Data with a lot of artefacts | By: Abraham Otero on 2019-08-21 10:22 | [forum:46918]
Hello Robin, Check page 70 of the tutorial: http://rhrv.r-forge.r-project.org/tutorial/tutorial.pdf You could also check: https://www.youtube.com/watch?v=PoHx3d067PI Regards,
RE: Problem Filtering Data with a lot of artefacts | By: Robin Anno Wester on 2019-08-20 13:12 | [forum:46910]
One more question: did the package developers come up with the outlier detection algorithm themselves, or was a pre-existing algorithm used that has been validated elsewhere?
Problem Filtering Data with a lot of artefacts | By: Robin Anno Wester on 2019-08-20 10:27 | [forum:46909]
Hello, I'm new to HRV analysis and have a question regarding automatic artefact removal: I have some EKG recordings with a lot of artefacts. Because I have a lot of datasets, I want to filter the recordings automatically rather than go through every recording manually (that would take forever). My problem is that when I apply FilterNIHR() to some bad recordings, the number of artefacts it removes is much smaller than the number I had removed manually. Is there any way to alter the FilterNIHR settings so that it is more sensitive? Or is there no way around manually going over recordings with a lot of artefacts? Thank you for your help! Robin
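For reference, "altering the FilterNIHR settings" means passing stricter values for its arguments. The argument names below come from ?FilterNIHR, but the specific values are only a hypothetical illustration and would need to be checked against the recordings:

# long and last are the adaptive thresholds of the filter; smaller values should
# make it stricter. minbpm/maxbpm bound the acceptable heart rate range.
hrv.data <- FilterNIHR(hrv.data, long = 40, last = 10, minbpm = 30, maxbpm = 180)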