Universitätsbibliothek Heidelberg
Status: Bibliography entry

Availability
Location: ---
Copies: ---
heiBIB
 Online resource
Authors: Schülein, Patrick [Author]
 Teufel, Hannah [Author]
 Vorpahl, Ronja [Author]
 Emter, Indira [Author]
 Bukschat, Yannick [Author]
 Pfister, Marcus [Author]
 Rathmann, Nils-Andreas [Author]
 Diehl, Steffen J. [Author]
 Vetter, Marcus [Author]
Title: Comparison of synthetic dataset generation methods for medical intervention rooms using medical clothing detection as an example
Statement of responsibility: Patrick Schülein, Hannah Teufel, Ronja Vorpahl, Indira Emter, Yannick Bukschat, Marcus Pfister, Nils Rathmann, Steffen Diehl and Marcus Vetter
Year: 2023
Extent: 21 pages
Illustrations: illustrations
Notes: Published: 2 August 2023; viewed on 28.09.2023
Source title: Contained in: European Association for Speech, Signal and Image Processing: EURASIP journal on image and video processing
Source place: New York, NY : Hindawi Publishing Corp., 2007
Source year: 2023
Source volume/issue: 2023 (2023), article ID 12, pages 1-21
Source ISSN: 1687-5281
Abstract: Purpose: The availability of real data from areas with high privacy requirements, such as the medical intervention room, is low, and its acquisition is complex in terms of data protection. To enable research on assistance systems for the medical intervention room, new methods of data generation for these areas must be investigated. This work therefore presents a way to create a synthetic dataset for the medical context, using medical clothing object detection as an example. The goal is to close the reality gap between the synthetic and real data.
 Methods: Methods of 3D-scanned clothing and designed clothing are compared in a Domain-Randomization and a Structured-Domain-Randomization scenario using two different rendering engines. Additionally, a Mixed-Reality dataset captured in front of a greenscreen and a target-domain dataset were used, the latter to evaluate the different datasets. The experiments are designed to show whether scanned clothing or designed clothing produces better results under Domain Randomization and Structured Domain Randomization. Likewise, a baseline is generated using the Mixed-Reality data. A further experiment investigates whether combining real, synthetic and Mixed-Reality image data improves accuracy compared to real data only.
 Results: Our experiments show that Structured Domain Randomization of designed clothing together with Mixed-Reality data provides a baseline achieving 72.0% mAP on the test dataset of the clinical target domain. When additionally using 15% (99 images) of the available target-domain training data, the gap towards 100% (660 images) of target-domain training data could be nearly closed: 80.05% mAP versus 81.95% mAP. Finally, we show that when additionally using 100% of the target-domain training data, the accuracy could be increased to 83.35% mAP.
 Conclusion: The presented modeling of health professionals is a promising methodology for addressing the challenge of missing datasets from medical intervention rooms. We will further investigate it on various tasks in the medical domain, such as assistance systems.
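The following minimal Python sketch illustrates the Domain-Randomization idea described in the abstract: each synthetic training image is rendered under freshly randomized scene parameters, so a detector cannot overfit to any single appearance. This is not code from the paper; render_scene is a hypothetical stand-in for a rendering-engine call, and all parameter names and ranges are illustrative assumptions.

import random

def sample_randomized_scene_params():
    """Draw one random scene configuration for a synthetic render (illustrative ranges)."""
    return {
        "light_intensity": random.uniform(0.2, 1.5),           # assumed plausible range
        "light_color_rgb": [random.uniform(0.8, 1.0) for _ in range(3)],
        "camera_distance_m": random.uniform(1.5, 4.0),
        "camera_yaw_deg": random.uniform(0.0, 360.0),
        "background_texture_id": random.randrange(50),         # pick from a texture pool
        "num_distractor_objects": random.randrange(0, 10),     # random clutter
    }

def generate_synthetic_dataset(num_images, render_scene):
    """Render num_images frames, each under freshly sampled parameters.

    render_scene(params) is a hypothetical callback assumed to return
    (image, bounding_boxes) for the medical-clothing classes.
    """
    dataset = []
    for _ in range(num_images):
        params = sample_randomized_scene_params()
        image, bounding_boxes = render_scene(params)
        dataset.append((image, bounding_boxes))
    return dataset

Structured Domain Randomization, as used in the paper, additionally constrains such sampling so that objects appear in plausible contexts rather than uniformly at random.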
DOI: doi:10.1186/s13640-023-00612-1
URL: Please note: this is a bibliography entry. Full-text access for members of the university is available only if there is a subscription for the corresponding journal or anthology, or if it is an Open Access title.

Full text: https://doi.org/10.1186/s13640-023-00612-1
 DOI: https://doi.org/10.1186/s13640-023-00612-1
Medium: Online resource
Language: English
Subject keywords: 3D modeling
 3D scanning
 Camera-based AI-methods
 Deformable objects
 Domain Randomization
 Medical clothing detection
 Mixed Reality
 Structured Domain Randomization
 Synthetic dataset
K10plus-PPN:1860417051
Links: → Journal

Permanent link to this title (bookmarkable): https://katalog.ub.uni-heidelberg.de/titel/69125092