6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery
Vitreoretinal (VR) surgery is a typical microsurgery involving delicate and complex surgical procedures. Vision-based navigation for robot-assisted VR surgery has not yet been fully exploited because of challenges arising from illumination conditions, high-precision requirements, and safety assessment. This paper presents a novel method to estimate the 6DOF needle pose, specifically for robotic intraocular needle navigation, using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in the OCT volume and (2) 6DOF pose estimation of the needle point cloud using a modified iterative closest point (ICP) algorithm. For the former, a voting mechanism based on geometric features of the needle is used to segment the needle robustly in the OCT volume. Afterward, the CAD model point cloud of the needle is matched against the segmented needle point cloud to estimate the 6DOF needle pose with the proposed shift-rotate ICP (SR-ICP). The method is evaluated with an existing ophthalmic robot on ex-vivo pig eyes, and both quantitative and qualitative results are presented.
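The abstract's second stage, matching a CAD-model point cloud to the segmented needle point cloud, is a rigid point cloud registration problem. The paper's SR-ICP variant is not reproduced in this record; as a rough orientation, a minimal sketch of plain point-to-point ICP (nearest-neighbor correspondences plus a Kabsch/SVD rigid fit), which SR-ICP modifies, could look like this. All names here (`icp`, `best_rigid_transform`) are illustrative, not from the paper.

```python
# Minimal point-to-point ICP sketch for rigid 6DOF alignment of two point
# clouds. This is classic ICP (Besl & McKay style), NOT the paper's SR-ICP,
# whose shift-rotate refinement is not public in this record.
import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t


def icp(model, scene, iters=50, tol=1e-8):
    """Align `model` (e.g. CAD needle points) to `scene` (segmented needle points).

    Returns the accumulated rotation, translation, and final mean residual.
    """
    src = model.copy()
    tree = cKDTree(scene)                        # fast nearest-neighbor lookup
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        dist, idx = tree.query(src)              # closest-point correspondences
        R, t = best_rigid_transform(src, scene[idx])
        src = src @ R.T + t                      # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:            # stop when residual stops improving
            break
        prev_err = err
    return R_total, t_total, err
```

With a reasonable initial pose (small rotation and translation offsets, as a robot's forward kinematics would provide), this converges to the exact rigid transform between model and scene; SR-ICP's contribution, per the abstract, is robustness on top of such a baseline.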
Detailed description

Authors:      Mingchuan Zhou; Xing Hao; Abouzar Eslami; Kai Huang; Caixia Cai; Chris P. Lohmann; Nassir Navab; Alois Knoll; M. Ali Nasseri (all listed as authors)
Format:       E-Article
Language:     English
Published:    2019
Subjects:     Biomedical engineering; biomedical image processing; biomedical signal processing; medical robotics
Parent work:  In: IEEE Access - IEEE, 2014, 7(2019), pages 63113-63122
Parent work:  volume:7 ; year:2019 ; pages:63113-63122
Links:        https://doi.org/10.1109/ACCESS.2019.2912327 (free access); https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a (free access); https://ieeexplore.ieee.org/document/8695065/ (free access)
DOI:          10.1109/ACCESS.2019.2912327
Catalog ID:   DOAJ007592949
MARC record

LEADER 01000caa a22002652 4500
001    DOAJ007592949
003    DE-627
005    20230309234339.0
007    cr uuu---uuuuu
008    230225s2019 xx |||||o 00| ||eng c
024 7  |a 10.1109/ACCESS.2019.2912327 |2 doi
035    |a (DE-627)DOAJ007592949
035    |a (DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
050  0 |a TK1-9971
100 0  |a Mingchuan Zhou |e verfasserin |4 aut
245 10 |a 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery
264  1 |c 2019
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
520    |a Vitreoretinal (VR) surgery is a typical microsurgery involving delicate and complex surgical procedures. Vision-based navigation for robot-assisted VR surgery has not yet been fully exploited because of challenges arising from illumination conditions, high-precision requirements, and safety assessment. This paper presents a novel method to estimate the 6DOF needle pose, specifically for robotic intraocular needle navigation, using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in the OCT volume and (2) 6DOF pose estimation of the needle point cloud using a modified iterative closest point (ICP) algorithm. For the former, a voting mechanism based on geometric features of the needle is used to segment the needle robustly in the OCT volume. Afterward, the CAD model point cloud of the needle is matched against the segmented needle point cloud to estimate the 6DOF needle pose with the proposed shift-rotate ICP (SR-ICP). The method is evaluated with an existing ophthalmic robot on ex-vivo pig eyes, and both quantitative and qualitative results are presented.
650  4 |a Biomedical engineering
650  4 |a biomedical image processing
650  4 |a biomedical signal processing
650  4 |a medical robotics
653  0 |a Electrical engineering. Electronics. Nuclear engineering
700 0  |a Xing Hao |e verfasserin |4 aut
700 0  |a Abouzar Eslami |e verfasserin |4 aut
700 0  |a Kai Huang |e verfasserin |4 aut
700 0  |a Caixia Cai |e verfasserin |4 aut
700 0  |a Chris P. Lohmann |e verfasserin |4 aut
700 0  |a Nassir Navab |e verfasserin |4 aut
700 0  |a Alois Knoll |e verfasserin |4 aut
700 0  |a M. Ali Nasseri |e verfasserin |4 aut
773 08 |i In |t IEEE Access |d IEEE, 2014 |g 7(2019), Seite 63113-63122 |w (DE-627)728440385 |w (DE-600)2687964-5 |x 21693536 |7 nnns
773 18 |g volume:7 |g year:2019 |g pages:63113-63122
856 40 |u https://doi.org/10.1109/ACCESS.2019.2912327 |z kostenfrei
856 40 |u https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a |z kostenfrei
856 40 |u https://ieeexplore.ieee.org/document/8695065/ |z kostenfrei
856 42 |u https://doaj.org/toc/2169-3536 |y Journal toc |z kostenfrei
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_DOAJ
912    |a GBV_ILN_11
912    |a GBV_ILN_20
912    |a GBV_ILN_22
912    |a GBV_ILN_23
912    |a GBV_ILN_24
912    |a GBV_ILN_31
912    |a GBV_ILN_39
912    |a GBV_ILN_40
912    |a GBV_ILN_60
912    |a GBV_ILN_62
912    |a GBV_ILN_63
912    |a GBV_ILN_65
912    |a GBV_ILN_69
912    |a GBV_ILN_70
912    |a GBV_ILN_73
912    |a GBV_ILN_95
912    |a GBV_ILN_105
912    |a GBV_ILN_110
912    |a GBV_ILN_151
912    |a GBV_ILN_161
912    |a GBV_ILN_170
912    |a GBV_ILN_213
912    |a GBV_ILN_230
912    |a GBV_ILN_285
912    |a GBV_ILN_293
912    |a GBV_ILN_370
912    |a GBV_ILN_602
912    |a GBV_ILN_2014
912    |a GBV_ILN_4012
912    |a GBV_ILN_4037
912    |a GBV_ILN_4112
912    |a GBV_ILN_4125
912    |a GBV_ILN_4126
912    |a GBV_ILN_4249
912    |a GBV_ILN_4305
912    |a GBV_ILN_4306
912    |a GBV_ILN_4307
912    |a GBV_ILN_4313
912    |a GBV_ILN_4322
912    |a GBV_ILN_4323
912    |a GBV_ILN_4324
912    |a GBV_ILN_4325
912    |a GBV_ILN_4335
912    |a GBV_ILN_4338
912    |a GBV_ILN_4367
912    |a GBV_ILN_4700
951    |a AR
952    |d 7 |j 2019 |h 63113-63122
author_variant |
m z mz x h xh a e ae k h kh c c cc c p l cpl n n nn a k ak m a n man |
---|---|
matchkey_str |
article:21693536:2019----::dfedeoesiainorbtsitdir |
hierarchy_sort_str |
2019 |
callnumber-subject-code |
TK |
publishDate |
2019 |
allfields |
10.1109/ACCESS.2019.2912327 doi (DE-627)DOAJ007592949 (DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a DE-627 ger DE-627 rakwb eng TK1-9971 Mingchuan Zhou verfasserin aut 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery 2019 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Vitreoretinal (VR) surgery is typical microsurgery with delicate and complex surgical procedures. The vision-based navigation for robot-assisted VR surgery has not been fully exploited because of the challenges that arise from illumination, high precision, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose specifically for the application of robotic intraocular needle navigation using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in OCT volume and (2) needle point cloud 6DOF pose estimation using a modified iterative closest point (ICP) algorithm. To address the former, a voting mechanism with geometric features of the needle is utilized to robustly segment the needle in OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose with a proposed shift-rotate ICP (SR-ICP). This method is evaluated by the existing ophthalmic robot on ex-vivo pig eyes. The quantitative and qualitative results are evaluated and presented for the proposed method. Biomedical engineering biomedical image processing biomedical signal processing medical robotics Electrical engineering. Electronics. Nuclear engineering Xing Hao verfasserin aut Abouzar Eslami verfasserin aut Kai Huang verfasserin aut Caixia Cai verfasserin aut Chris P. Lohmann verfasserin aut Nassir Navab verfasserin aut Alois Knoll verfasserin aut M. 
Ali Nasseri verfasserin aut In IEEE Access IEEE, 2014 7(2019), Seite 63113-63122 (DE-627)728440385 (DE-600)2687964-5 21693536 nnns volume:7 year:2019 pages:63113-63122 https://doi.org/10.1109/ACCESS.2019.2912327 kostenfrei https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a kostenfrei https://ieeexplore.ieee.org/document/8695065/ kostenfrei https://doaj.org/toc/2169-3536 Journal toc kostenfrei GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 AR 7 2019 63113-63122 |
spelling |
10.1109/ACCESS.2019.2912327 doi (DE-627)DOAJ007592949 (DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a DE-627 ger DE-627 rakwb eng TK1-9971 Mingchuan Zhou verfasserin aut 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery 2019 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Vitreoretinal (VR) surgery is typical microsurgery with delicate and complex surgical procedures. The vision-based navigation for robot-assisted VR surgery has not been fully exploited because of the challenges that arise from illumination, high precision, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose specifically for the application of robotic intraocular needle navigation using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in OCT volume and (2) needle point cloud 6DOF pose estimation using a modified iterative closest point (ICP) algorithm. To address the former, a voting mechanism with geometric features of the needle is utilized to robustly segment the needle in OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose with a proposed shift-rotate ICP (SR-ICP). This method is evaluated by the existing ophthalmic robot on ex-vivo pig eyes. The quantitative and qualitative results are evaluated and presented for the proposed method. Biomedical engineering biomedical image processing biomedical signal processing medical robotics Electrical engineering. Electronics. Nuclear engineering Xing Hao verfasserin aut Abouzar Eslami verfasserin aut Kai Huang verfasserin aut Caixia Cai verfasserin aut Chris P. Lohmann verfasserin aut Nassir Navab verfasserin aut Alois Knoll verfasserin aut M. 
Ali Nasseri verfasserin aut In IEEE Access IEEE, 2014 7(2019), Seite 63113-63122 (DE-627)728440385 (DE-600)2687964-5 21693536 nnns volume:7 year:2019 pages:63113-63122 https://doi.org/10.1109/ACCESS.2019.2912327 kostenfrei https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a kostenfrei https://ieeexplore.ieee.org/document/8695065/ kostenfrei https://doaj.org/toc/2169-3536 Journal toc kostenfrei GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 AR 7 2019 63113-63122 |
allfields_unstemmed |
10.1109/ACCESS.2019.2912327 doi (DE-627)DOAJ007592949 (DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a DE-627 ger DE-627 rakwb eng TK1-9971 Mingchuan Zhou verfasserin aut 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery 2019 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Vitreoretinal (VR) surgery is typical microsurgery with delicate and complex surgical procedures. The vision-based navigation for robot-assisted VR surgery has not been fully exploited because of the challenges that arise from illumination, high precision, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose specifically for the application of robotic intraocular needle navigation using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in OCT volume and (2) needle point cloud 6DOF pose estimation using a modified iterative closest point (ICP) algorithm. To address the former, a voting mechanism with geometric features of the needle is utilized to robustly segment the needle in OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose with a proposed shift-rotate ICP (SR-ICP). This method is evaluated by the existing ophthalmic robot on ex-vivo pig eyes. The quantitative and qualitative results are evaluated and presented for the proposed method. Biomedical engineering biomedical image processing biomedical signal processing medical robotics Electrical engineering. Electronics. Nuclear engineering Xing Hao verfasserin aut Abouzar Eslami verfasserin aut Kai Huang verfasserin aut Caixia Cai verfasserin aut Chris P. Lohmann verfasserin aut Nassir Navab verfasserin aut Alois Knoll verfasserin aut M. 
Ali Nasseri verfasserin aut In IEEE Access IEEE, 2014 7(2019), Seite 63113-63122 (DE-627)728440385 (DE-600)2687964-5 21693536 nnns volume:7 year:2019 pages:63113-63122 https://doi.org/10.1109/ACCESS.2019.2912327 kostenfrei https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a kostenfrei https://ieeexplore.ieee.org/document/8695065/ kostenfrei https://doaj.org/toc/2169-3536 Journal toc kostenfrei GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 AR 7 2019 63113-63122 |
allfieldsGer |
10.1109/ACCESS.2019.2912327 doi (DE-627)DOAJ007592949 (DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a DE-627 ger DE-627 rakwb eng TK1-9971 Mingchuan Zhou verfasserin aut 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery 2019 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Vitreoretinal (VR) surgery is typical microsurgery with delicate and complex surgical procedures. The vision-based navigation for robot-assisted VR surgery has not been fully exploited because of the challenges that arise from illumination, high precision, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose specifically for the application of robotic intraocular needle navigation using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in OCT volume and (2) needle point cloud 6DOF pose estimation using a modified iterative closest point (ICP) algorithm. To address the former, a voting mechanism with geometric features of the needle is utilized to robustly segment the needle in OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose with a proposed shift-rotate ICP (SR-ICP). This method is evaluated by the existing ophthalmic robot on ex-vivo pig eyes. The quantitative and qualitative results are evaluated and presented for the proposed method. Biomedical engineering biomedical image processing biomedical signal processing medical robotics Electrical engineering. Electronics. Nuclear engineering Xing Hao verfasserin aut Abouzar Eslami verfasserin aut Kai Huang verfasserin aut Caixia Cai verfasserin aut Chris P. Lohmann verfasserin aut Nassir Navab verfasserin aut Alois Knoll verfasserin aut M. 
Ali Nasseri verfasserin aut In IEEE Access IEEE, 2014 7(2019), Seite 63113-63122 (DE-627)728440385 (DE-600)2687964-5 21693536 nnns volume:7 year:2019 pages:63113-63122 https://doi.org/10.1109/ACCESS.2019.2912327 kostenfrei https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a kostenfrei https://ieeexplore.ieee.org/document/8695065/ kostenfrei https://doaj.org/toc/2169-3536 Journal toc kostenfrei GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 AR 7 2019 63113-63122 |
allfieldsSound |
10.1109/ACCESS.2019.2912327 doi (DE-627)DOAJ007592949 (DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a DE-627 ger DE-627 rakwb eng TK1-9971 Mingchuan Zhou verfasserin aut 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery 2019 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Vitreoretinal (VR) surgery is typical microsurgery with delicate and complex surgical procedures. The vision-based navigation for robot-assisted VR surgery has not been fully exploited because of the challenges that arise from illumination, high precision, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose specifically for the application of robotic intraocular needle navigation using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in OCT volume and (2) needle point cloud 6DOF pose estimation using a modified iterative closest point (ICP) algorithm. To address the former, a voting mechanism with geometric features of the needle is utilized to robustly segment the needle in OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose with a proposed shift-rotate ICP (SR-ICP). This method is evaluated by the existing ophthalmic robot on ex-vivo pig eyes. The quantitative and qualitative results are evaluated and presented for the proposed method. Biomedical engineering biomedical image processing biomedical signal processing medical robotics Electrical engineering. Electronics. Nuclear engineering Xing Hao verfasserin aut Abouzar Eslami verfasserin aut Kai Huang verfasserin aut Caixia Cai verfasserin aut Chris P. Lohmann verfasserin aut Nassir Navab verfasserin aut Alois Knoll verfasserin aut M. 
Ali Nasseri verfasserin aut In IEEE Access IEEE, 2014 7(2019), Seite 63113-63122 (DE-627)728440385 (DE-600)2687964-5 21693536 nnns volume:7 year:2019 pages:63113-63122 https://doi.org/10.1109/ACCESS.2019.2912327 kostenfrei https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a kostenfrei https://ieeexplore.ieee.org/document/8695065/ kostenfrei https://doaj.org/toc/2169-3536 Journal toc kostenfrei GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 AR 7 2019 63113-63122 |
language |
English |
source |
In IEEE Access 7(2019), Seite 63113-63122 volume:7 year:2019 pages:63113-63122 |
sourceStr |
In IEEE Access 7(2019), Seite 63113-63122 volume:7 year:2019 pages:63113-63122 |
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
Biomedical engineering biomedical image processing biomedical signal processing medical robotics Electrical engineering. Electronics. Nuclear engineering |
isfreeaccess_bool |
true |
container_title |
IEEE Access |
authorswithroles_txt_mv |
Mingchuan Zhou @@aut@@ Xing Hao @@aut@@ Abouzar Eslami @@aut@@ Kai Huang @@aut@@ Caixia Cai @@aut@@ Chris P. Lohmann @@aut@@ Nassir Navab @@aut@@ Alois Knoll @@aut@@ M. Ali Nasseri @@aut@@ |
publishDateDaySort_date |
2019-01-01T00:00:00Z |
hierarchy_top_id |
728440385 |
id |
DOAJ007592949 |
language_de |
englisch |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">DOAJ007592949</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230309234339.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">230225s2019 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1109/ACCESS.2019.2912327</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)DOAJ007592949</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">TK1-9971</subfield></datafield><datafield tag="100" ind1="0" ind2=" "><subfield code="a">Mingchuan Zhou</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2019</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield 
code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Vitreoretinal (VR) surgery is typical microsurgery with delicate and complex surgical procedures. The vision-based navigation for robot-assisted VR surgery has not been fully exploited because of the challenges that arise from illumination, high precision, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose specifically for the application of robotic intraocular needle navigation using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in OCT volume and (2) needle point cloud 6DOF pose estimation using a modified iterative closest point (ICP) algorithm. To address the former, a voting mechanism with geometric features of the needle is utilized to robustly segment the needle in OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose with a proposed shift-rotate ICP (SR-ICP). This method is evaluated by the existing ophthalmic robot on ex-vivo pig eyes. The quantitative and qualitative results are evaluated and presented for the proposed method.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Biomedical engineering</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">biomedical image processing</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">biomedical signal processing</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">medical robotics</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Electrical engineering. Electronics. 
Nuclear engineering</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Xing Hao</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Abouzar Eslami</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Kai Huang</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Caixia Cai</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Chris P. Lohmann</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Nassir Navab</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Alois Knoll</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">M. 
Ali Nasseri</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">In</subfield><subfield code="t">IEEE Access</subfield><subfield code="d">IEEE, 2014</subfield><subfield code="g">7(2019), Seite 63113-63122</subfield><subfield code="w">(DE-627)728440385</subfield><subfield code="w">(DE-600)2687964-5</subfield><subfield code="x">21693536</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:7</subfield><subfield code="g">year:2019</subfield><subfield code="g">pages:63113-63122</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.1109/ACCESS.2019.2912327</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://ieeexplore.ieee.org/document/8695065/</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="u">https://doaj.org/toc/2169-3536</subfield><subfield code="y">Journal toc</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_DOAJ</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4367</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield 
code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">7</subfield><subfield code="j">2019</subfield><subfield code="h">63113-63122</subfield></datafield></record></collection>
callnumber-first |
T - Technology |
author |
Mingchuan Zhou |
spellingShingle |
Mingchuan Zhou misc TK1-9971 misc Biomedical engineering misc biomedical image processing misc biomedical signal processing misc medical robotics misc Electrical engineering. Electronics. Nuclear engineering 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery |
authorStr |
Mingchuan Zhou |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)728440385 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut aut aut aut aut aut aut |
collection |
DOAJ |
remote_str |
true |
callnumber-label |
TK1-9971 |
illustrated |
Not Illustrated |
issn |
21693536 |
topic_title |
TK1-9971 6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery Biomedical engineering biomedical image processing biomedical signal processing medical robotics |
topic |
misc TK1-9971 misc Biomedical engineering misc biomedical image processing misc biomedical signal processing misc medical robotics misc Electrical engineering. Electronics. Nuclear engineering |
topic_unstemmed |
misc TK1-9971 misc Biomedical engineering misc biomedical image processing misc biomedical signal processing misc medical robotics misc Electrical engineering. Electronics. Nuclear engineering |
topic_browse |
misc TK1-9971 misc Biomedical engineering misc biomedical image processing misc biomedical signal processing misc medical robotics misc Electrical engineering. Electronics. Nuclear engineering |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
IEEE Access |
hierarchy_parent_id |
728440385 |
hierarchy_top_title |
IEEE Access |
isfreeaccess_txt |
true |
familylinks_str_mv |
(DE-627)728440385 (DE-600)2687964-5 |
title |
6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery |
ctrlnum |
(DE-627)DOAJ007592949 (DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a |
title_full |
6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery |
author_sort |
Mingchuan Zhou |
journal |
IEEE Access |
journalStr |
IEEE Access |
callnumber-first-code |
T |
lang_code |
eng |
isOA_bool |
true |
recordtype |
marc |
publishDateSort |
2019 |
contenttype_str_mv |
txt |
container_start_page |
63113 |
author_browse |
Mingchuan Zhou Xing Hao Abouzar Eslami Kai Huang Caixia Cai Chris P. Lohmann Nassir Navab Alois Knoll M. Ali Nasseri |
container_volume |
7 |
class |
TK1-9971 |
format_se |
Elektronische Aufsätze |
author-letter |
Mingchuan Zhou |
doi_str_mv |
10.1109/ACCESS.2019.2912327 |
author2-role |
verfasserin |
title_sort |
6dof needle pose estimation for robot-assisted vitreoretinal surgery |
callnumber |
TK1-9971 |
title_auth |
6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery |
abstract |
Vitreoretinal (VR) surgery is a typical microsurgery involving delicate and complex surgical procedures. Vision-based navigation for robot-assisted VR surgery has not been fully exploited because of challenges arising from illumination, high-precision requirements, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose, specifically for robotic intraocular needle navigation, using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in the OCT volume and (2) 6DOF needle pose estimation using a modified iterative closest point (ICP) algorithm. For the former, a voting mechanism based on geometric features of the needle is used to robustly segment the needle in the OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose using the proposed shift-rotate ICP (SR-ICP). The method is evaluated with an existing ophthalmic robot on ex-vivo pig eyes, and both quantitative and qualitative results are presented for the proposed method.
abstractGer |
Vitreoretinal (VR) surgery is a typical microsurgery involving delicate and complex surgical procedures. Vision-based navigation for robot-assisted VR surgery has not been fully exploited because of challenges arising from illumination, high-precision requirements, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose, specifically for robotic intraocular needle navigation, using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in the OCT volume and (2) 6DOF needle pose estimation using a modified iterative closest point (ICP) algorithm. For the former, a voting mechanism based on geometric features of the needle is used to robustly segment the needle in the OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose using the proposed shift-rotate ICP (SR-ICP). The method is evaluated with an existing ophthalmic robot on ex-vivo pig eyes, and both quantitative and qualitative results are presented for the proposed method.
abstract_unstemmed |
Vitreoretinal (VR) surgery is a typical microsurgery involving delicate and complex surgical procedures. Vision-based navigation for robot-assisted VR surgery has not been fully exploited because of challenges arising from illumination, high-precision requirements, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose, specifically for robotic intraocular needle navigation, using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in the OCT volume and (2) 6DOF needle pose estimation using a modified iterative closest point (ICP) algorithm. For the former, a voting mechanism based on geometric features of the needle is used to robustly segment the needle in the OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose using the proposed shift-rotate ICP (SR-ICP). The method is evaluated with an existing ophthalmic robot on ex-vivo pig eyes, and both quantitative and qualitative results are presented for the proposed method.
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 |
title_short |
6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery |
url |
https://doi.org/10.1109/ACCESS.2019.2912327 https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a https://ieeexplore.ieee.org/document/8695065/ https://doaj.org/toc/2169-3536 |
remote_bool |
true |
author2 |
Xing Hao Abouzar Eslami Kai Huang Caixia Cai Chris P. Lohmann Nassir Navab Alois Knoll M. Ali Nasseri |
author2Str |
Xing Hao Abouzar Eslami Kai Huang Caixia Cai Chris P. Lohmann Nassir Navab Alois Knoll M. Ali Nasseri |
ppnlink |
728440385 |
callnumber-subject |
TK - Electrical and Nuclear Engineering |
mediatype_str_mv |
c |
isOA_txt |
true |
hochschulschrift_bool |
false |
doi_str |
10.1109/ACCESS.2019.2912327 |
callnumber-a |
TK1-9971 |
up_date |
2024-07-04T02:08:54.486Z |
_version_ |
1803612517288640512 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">DOAJ007592949</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230309234339.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">230225s2019 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1109/ACCESS.2019.2912327</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)DOAJ007592949</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DOAJ31a73c366e2f43e9a51bc77611db6d9a</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">TK1-9971</subfield></datafield><datafield tag="100" ind1="0" ind2=" "><subfield code="a">Mingchuan Zhou</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">6DOF Needle Pose Estimation for Robot-Assisted Vitreoretinal Surgery</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2019</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield 
code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Vitreoretinal (VR) surgery is a typical microsurgery involving delicate and complex surgical procedures. Vision-based navigation for robot-assisted VR surgery has not been fully exploited because of challenges arising from illumination, high-precision requirements, and safety assessments. This paper presents a novel method to estimate the 6DOF needle pose, specifically for robotic intraocular needle navigation, using optical coherence tomography (OCT) volumes. The key ingredients of the proposed method are (1) 3D needle point cloud segmentation in the OCT volume and (2) 6DOF needle pose estimation using a modified iterative closest point (ICP) algorithm. For the former, a voting mechanism based on geometric features of the needle is used to robustly segment the needle in the OCT volume. Afterward, the CAD model of the needle point cloud is matched with the segmented needle point cloud to estimate the 6DOF needle pose using the proposed shift-rotate ICP (SR-ICP). The method is evaluated with an existing ophthalmic robot on ex-vivo pig eyes, and both quantitative and qualitative results are presented for the proposed method.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Biomedical engineering</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">biomedical image processing</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">biomedical signal processing</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">medical robotics</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Electrical engineering. Electronics. 
Nuclear engineering</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Xing Hao</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Abouzar Eslami</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Kai Huang</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Caixia Cai</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Chris P. Lohmann</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Nassir Navab</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Alois Knoll</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">M. 
Ali Nasseri</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">In</subfield><subfield code="t">IEEE Access</subfield><subfield code="d">IEEE, 2014</subfield><subfield code="g">7(2019), Seite 63113-63122</subfield><subfield code="w">(DE-627)728440385</subfield><subfield code="w">(DE-600)2687964-5</subfield><subfield code="x">21693536</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:7</subfield><subfield code="g">year:2019</subfield><subfield code="g">pages:63113-63122</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.1109/ACCESS.2019.2912327</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doaj.org/article/31a73c366e2f43e9a51bc77611db6d9a</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://ieeexplore.ieee.org/document/8695065/</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="u">https://doaj.org/toc/2169-3536</subfield><subfield code="y">Journal toc</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_DOAJ</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4367</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield 
code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">7</subfield><subfield code="j">2019</subfield><subfield code="h">63113-63122</subfield></datafield></record></collection>