
For naive robots to become truly autonomous, they need a means of developing their perceptive capabilities instead of relying on hand-crafted models. Sensorimotor contingency theory asserts that such a means resides in learning invariants of the sensorimotor flow. We propose a formal framework inspired by this theory for describing the sensorimotor experiences of a naive agent, extending previous related work. We then use this formalism to conduct a theoretical study in which we isolate sufficient conditions for the determination of a sensory prediction function. Furthermore, we show that algebraic structure found in this prediction can be taken as a proxy for structure on the motor displacements, allowing the combinatorial structure of those displacements to be discovered. Both claims are illustrated in simulations in which a toy naive agent determines the sensory predictions of its spatial displacements from its uninterpreted sensory flow, which it then uses to infer the combinatorics of those displacements.

Ensuring care is one of the biggest humanitarian challenges of the future, since an acute shortage of nursing staff is expected. At the same time, this offers an opportunity for new technologies in nursing, such as the use of robotic systems. One potential use case is outpatient care, which nowadays involves traveling long distances. Here, the use of telerobotics could provide major relief for nursing staff, as it could spare them many of those, partially long, journeys. Since autonomous robotic systems are not desired, at least in Germany, for ethical reasons, this paper evaluates the design of a telemanipulation system consisting of off-the-shelf components for outpatient care. Furthermore, we investigated the suitability of two different input devices for control: a kinesthetic device, and a keyboard plus mouse. We conducted the investigation in a laboratory study.
This laboratory represents a realistic environment comprising an elderly home and a remote care service center. The study was carried out with 25 nurses. Tasks common in outpatient care, such as handing over objects (manipulation) and examining body parts (setting the camera view), were used in the study. After a short training period, all nurses were able to control a manipulator with the two input devices and perform the two tasks. The kinesthetic device (the Falcon) led to shorter execution times (on average 54.82 min, compared to 110.92 min with keyboard and mouse), whereas the participants were more successful with the keyboard plus mouse in terms of task completion. There was no difference in usability or cognitive load. Moreover, we point out that access to this kind of technology is desirable, which is why we identified further usage scenarios.

This paper describes a portable prosthetic control system and the first at-home use of a multi-degree-of-freedom, proportionally controlled bionic arm. The system uses a modified Kalman filter to provide 6-degree-of-freedom, real-time, proportional control. We describe (a) how the system trains motor-control algorithms for use with an advanced bionic arm, and (b) the system's ability to record an unprecedented and comprehensive dataset of EMG, hand positions, and force sensor values. Intact participants and a transradial amputee used the system to perform activities of daily living, including bi-manual tasks, in the lab and at home. This technology enables at-home dexterous bionic arm use, and provides a high-temporal-resolution description of daily use: essential information for determining clinical relevance and improving future research on advanced bionic arms.

During human-robot interaction, errors will occur.
Hence, understanding the effects of interaction errors, and especially the effect of prior knowledge on robot learning performance, is relevant for developing appropriate approaches to learning under natural interaction conditions, since future robots will continue to learn based on what they have already learned.
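The bionic-arm system described above is said to use a modified Kalman filter for 6-degree-of-freedom, real-time, proportional control from EMG. As a rough illustration of the general idea only (not the authors' algorithm; the dimensions, noise covariances, and linear EMG model below are all hypothetical placeholders), a plain Kalman filter decoding joint velocities from EMG features could be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

n_dof = 6   # degrees of freedom to control (hypothetical)
n_emg = 12  # EMG feature channels (hypothetical)

# Linear observation model: EMG features = H @ joint_velocities + noise.
# In a real system H, Q, and R are fit from training data; here they are
# random/diagonal placeholders purely for illustration.
H = rng.standard_normal((n_emg, n_dof))
A = np.eye(n_dof)          # random-walk state transition for velocities
Q = 1e-3 * np.eye(n_dof)   # process noise covariance
R = 1e-1 * np.eye(n_emg)   # observation noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle of a standard Kalman filter."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with the latest EMG feature vector z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(n_dof) - K @ H) @ P_pred
    return x_new, P_new

# Decode a short stream of synthetic EMG frames generated from a fixed
# "intended" velocity vector; the estimate converges toward it.
x = np.zeros(n_dof)
P = np.eye(n_dof)
true_v = rng.standard_normal(n_dof)
for _ in range(200):
    z = H @ true_v + 0.1 * rng.standard_normal(n_emg)
    x, P = kalman_step(x, P, z)
```

In an actual prosthetic decoder, `H`, `Q`, and `R` would be estimated from recorded EMG and hand-position training data rather than chosen by hand, and the decoded velocities would drive the arm's joints in real time.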