Show simple item record

dc.contributor.author       Dybedal, Joacim
dc.date.accessioned         2023-10-12T13:27:56Z
dc.date.available           2023-10-12T13:27:56Z
dc.date.created             2023-10-10T20:51:57Z
dc.date.issued              2023
dc.identifier.citation      Dybedal, J. (2023). 3D Sensor Placement and Embedded Processing for People Detection in an Industrial Environment [Doctoral dissertation]. University of Agder.  en_US
dc.identifier.isbn          978-82-8427-150-7
dc.identifier.issn          1504-9272
dc.identifier.uri           https://hdl.handle.net/11250/3096172
dc.description              Papers I, II and III are extracted from the dissertation and uploaded as separate documents to meet post-publication requirements for self-archiving of IEEE conference papers.  en_US
dc.description.abstract     At a time when autonomy is being introduced in more and more areas, computer vision plays a very important role. In an industrial environment, the ability to create a real-time virtual version of a volume of interest provides a broad range of possibilities, including safety-related systems such as vision-based anti-collision and personnel tracking. In an offshore environment, where such systems are not common, the task is challenging due to rough weather and environmental conditions, but the introduction of such safety systems could potentially be lifesaving, as personnel work close to heavy, huge, and often poorly instrumented moving machinery and equipment. This thesis presents research on important topics related to enabling computer vision systems in industrial and offshore environments, including a review of the most important technologies and methods. A prototype 3D sensor package is developed, consisting of different sensors and a powerful embedded computer. This, together with a novel, highly scalable point cloud compression and sensor fusion scheme, makes it possible to create a real-time 3D map of an industrial area. The question of where to place the sensor packages in an environment where occlusions are present is also investigated. The result is a set of algorithms for automatic sensor placement optimisation, where the goal is to place sensors so that the covered volume of interest is maximised with as few occluded zones as possible. The method also includes redundancy constraints, whereby important sub-volumes can be required to be viewed by more than one sensor. Lastly, a people detection scheme is developed, using a merged point cloud from six different sensor packages as input. Using a combination of point cloud clustering, flattening and convolutional neural networks, the system successfully detects multiple people in an outdoor industrial environment, providing real-time 3D positions. The sensor packages and methods are tested and verified at the Industrial Robotics Lab at the University of Agder, and the people detection method is also tested in a relevant outdoor industrial testing facility. The experiments and results are presented in the papers attached to this thesis.  en_US
dc.language.iso             eng  en_US
dc.publisher                University of Agder  en_US
dc.relation.ispartof        Doctoral dissertations at University of Agder
dc.relation.ispartofseries  Doctoral Dissertations at the University of Agder; no. 433
dc.relation.haspart         Paper I: Dybedal, J. & Hovland, G. (2017). Optimal placement of 3D sensors considering range and field of view. IEEE International Conference on Advanced Intelligent Mechatronics (AIM), 2017, 1588–1593. DOI: https://doi.org/10.1109/AIM.2017.8014245. Accepted manuscript. Full-text not available in AURA as a separate file.  en_US
dc.relation.haspart         Paper II: Dybedal, J. & Hovland, G. (2020). GPU-Based Optimisation of 3D Sensor Placement Considering Redundancy, Range and Field of View. 15th IEEE Conference on Industrial Electronics and Applications (ICIEA), 2020, 1484–1489. DOI: https://doi.org/10.1109/ICIEA48937.2020.9248170. Accepted manuscript. Full-text not available in AURA as a separate file.  en_US
dc.relation.haspart         Paper III: Dybedal, J. & Hovland, G. (2020). GPU-Based Occlusion Minimisation for Optimal Placement of Multiple 3D Cameras. 15th IEEE Conference on Industrial Electronics and Applications (ICIEA), 2020, 967–972. DOI: https://doi.org/10.1109/ICIEA48937.2020.9248399. Accepted manuscript. Full-text not available in AURA as a separate file.  en_US
dc.relation.haspart         Paper IV: Dybedal, J., Aalerud, A. & Hovland, G. (2019). Embedded Processing and Compression of 3D Sensor Data for Large Scale Industrial Environments. Sensors, 19(3), 1–20. DOI: https://doi.org/10.3390/s19030636. Accepted version. Full-text is available in AURA as a separate file: https://hdl.handle.net/11250/2648520.  en_US
dc.relation.haspart         Paper V: Dybedal, J. & Hovland, G. (2021). CNN-based People Detection in Voxel Space using Intensity Measurements and Point Cluster Flattening. MIC Journal: Modeling, Identification and Control, 42(2), 37–46. DOI: https://doi.org/10.4173/mic.2021.2.1. Accepted version. Full-text is not available in AURA as a separate file.  en_US
dc.rights                   Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri               http://creativecommons.org/licenses/by-nc-nd/4.0/deed.no
dc.title                    3D Sensor Placement and Embedded Processing for People Detection in an Industrial Environment  en_US
dc.type                     Doctoral thesis  en_US
dc.description.version      publishedVersion  en_US
dc.rights.holder            © 2023 Joacim Dybedal  en_US
dc.subject.nsi              VDP::Teknologi: 500  en_US
dc.source.pagenumber        185  en_US
dc.identifier.cristin       2183512
dc.relation.project         Norges forskningsråd: 237896  en_US
cristin.ispublished         false
cristin.fulltext            original
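
The abstract above summarises the sensor placement work of Papers I–III as maximising the covered volume of interest, with occlusion minimisation and redundancy constraints for important sub-volumes. The following minimal Python sketch only illustrates that coverage objective with a simple greedy heuristic; it is not the GPU-based optimisation described in the papers, and the candidate poses, the precomputed visibility matrix and the redundancy mask are hypothetical inputs.

# Minimal illustrative sketch (not the dissertation's GPU-based method):
# greedy selection of sensor poses that maximise voxel coverage, with an
# optional redundancy mask for sub-volumes that should be seen twice.
# The coverage matrix and masks below are hypothetical, precomputed inputs.
import numpy as np


def greedy_placement(coverage, num_sensors, redundancy_mask=None):
    """coverage[i, v] is True if candidate pose i sees voxel v
    (range, field of view and occlusion assumed already accounted for)."""
    num_candidates, num_voxels = coverage.shape
    selected = []
    view_count = np.zeros(num_voxels, dtype=int)

    for _ in range(num_sensors):
        best_pose, best_gain = None, -1
        for i in range(num_candidates):
            if i in selected:
                continue
            # Gain: voxels covered for the first time, plus voxels in the
            # redundancy mask that would gain a required second view.
            gain = np.sum((view_count == 0) & coverage[i])
            if redundancy_mask is not None:
                gain += np.sum(redundancy_mask & (view_count == 1) & coverage[i])
            if gain > best_gain:
                best_pose, best_gain = i, gain
        selected.append(best_pose)
        view_count += coverage[best_pose]

    return selected, view_count


# Toy example: 4 candidate poses, 6 voxels, 2 sensor packages to place.
coverage = np.array([
    [1, 1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1, 0],
    [0, 0, 0, 1, 1, 1],
    [1, 0, 0, 0, 0, 1],
], dtype=bool)
redundancy_mask = np.array([0, 0, 1, 1, 0, 0], dtype=bool)
poses, views = greedy_placement(coverage, num_sensors=2, redundancy_mask=redundancy_mask)
print(poses, views)

In the dissertation itself, per the paper titles above, the visibility of each voxel additionally accounts for sensor range, field of view and occluding geometry, and the optimisation is performed on a GPU rather than with a greedy search.
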


Associated file(s)


This item appears in the following collection(s)


Attribution-NonCommercial-NoDerivatives 4.0 International
Except where otherwise noted, this item is licensed as Attribution-NonCommercial-NoDerivatives 4.0 International