Examiner: Knut Åkesson, Department of Electrical Engineering
Using Autonomous Transport Robots (ATRs) to transport material in factories is increasingly common in modern industry. These ATRs are often required to share pathways with other agents, for example, humans and forklifts, which emphasizes the need for safety precautions. In this study, footage from ceiling-mounted cameras facing the factory floor is analyzed to provide input to the trajectory planning algorithms of the ATRs in order to guarantee safe and efficient execution. This study aims to cover (i) the detection and identification of obstacles in the ATRs' paths, and (ii) the detection and localization of the ATRs. For the detection and identification, machine learning in the form of semantic segmentation is used to distinguish obstacles from the floor. In order to train the machine learning network, annotated ground truth data was generated through a foreground extraction algorithm. Transfer learning is used to utilize an already trained state-of-the-art neural network for semantic segmentation, avoiding the need for extensive training and large amounts of data. A background subtraction method is used in combination with the neural network output to determine whether an obstacle is static or dynamic. To keep track of the objects, a Kalman filter algorithm is used. The detection and localization of the ATRs are done using a mounted fiducial marker and its corresponding software library. All obstacles and ATRs detected are projected into the world coordinate frame of the factory in order to create a map for the trajectory planning algorithms. Because of the shared pathways, these algorithms require low latency to guarantee safety. Experimental evaluation using an NVIDIA RTX 5000 GPU shows that (i) can be done in 500 ms and (ii) in 120 ms. These times were evaluated using input from four cameras simultaneously.
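To illustrate the tracking step mentioned above, the following is a minimal sketch of a constant-velocity Kalman filter for an obstacle centroid on the floor plane. This is not the thesis implementation; the state model, time step, and noise covariances below are assumed values chosen for illustration.

```python
import numpy as np

class Kalman2D:
    """Constant-velocity Kalman filter for a 2D obstacle centroid.

    State vector: [x, y, vx, vy]. A sketch of the tracking step;
    Q and R are assumed, not taken from the thesis.
    """

    def __init__(self, x0, y0, dt=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4)                      # initial state covariance
        self.F = np.array([[1, 0, dt, 0],       # constant-velocity motion model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],        # we only measure position
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01               # process noise (assumed)
        self.R = np.eye(2) * 0.25               # measurement noise (assumed)

    def predict(self):
        # Propagate state and covariance one time step forward.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, zx, zy):
        # Fuse a new centroid measurement into the state estimate.
        z = np.array([zx, zy])
        innovation = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]


# Example: track a centroid moving one unit per step along x.
kf = Kalman2D(0.0, 0.0, dt=1.0)
for t in range(1, 20):
    kf.predict()
    estimate = kf.update(float(t), 0.0)
```

After a handful of steps the filter's velocity estimate converges and the position estimate follows the measurements closely; in the full system one such filter would be maintained per detected dynamic obstacle.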