Recent advances in deep learning have sparked interest in AI-generated art, including robot-assisted painting. Traditional painting machines use static images and offline processing without considering the dynamic nature of painting. Neuromorphic cameras, which capture light intensity changes as asynchronous events, and mixed-signal neuromorphic processors, which implement biologically plausible spiking neural networks, offer a promising alternative. In this work, we present a robotic painting system comprising a 6-DOF robotic arm, event-based input from a Dynamic Vision Sensor (DVS) camera and a neuromorphic processor to produce dynamic brushstrokes, and tactile feedback from a force-torque sensor to compensate for brush deformation. The system receives DVS events representing the desired brushstroke trajectory and maps these events onto the processor's neurons to compute joint velocities in closed loop. The variability in the input's noisy event streams and in the processor's analog circuits reproduces the heterogeneity of human brushstrokes. Tested in a real-world setting, the system successfully generated diverse physical brushstrokes. This system marks a first step towards a fully spiking robotic controller with ultra-low-latency responsiveness, applicable to any robotic task requiring real-time closed-loop adaptive control.
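The closed-loop mapping from event stream to joint velocities can be illustrated with a minimal sketch. Everything below is an assumption made for illustration: the function names (events_to_rates, decode_target, joint_velocity_command), the rate-coded population decoding, the simplified planar 2-link arm, and all parameter values. The actual system described in the abstract runs spiking neurons on a mixed-signal neuromorphic processor and drives a 6-DOF arm; this sketch only conveys the control-loop structure.

```python
import numpy as np

# Illustrative sketch, not the authors' implementation:
# DVS-style events marking the desired brushstroke trajectory are accumulated
# into a decayed rate map, a target point is decoded from the population
# activity, and joint velocities for a simplified planar 2-link arm are
# obtained from the Jacobian pseudo-inverse of a proportional tracking error.

GRID = 32           # event-camera pixels binned into a GRID x GRID population
LINK = (0.3, 0.25)  # assumed link lengths of the simplified 2-link arm [m]
GAIN = 2.0          # assumed proportional gain of the velocity controller

def events_to_rates(events, tau=0.05, now=0.0):
    """Accumulate (x, y, t) events into an exponentially decayed rate map."""
    rates = np.zeros((GRID, GRID))
    for x, y, t in events:
        rates[y, x] += np.exp(-(now - t) / tau)
    return rates

def decode_target(rates, workspace=0.5):
    """Population-vector style decoding: centroid of activity -> workspace point."""
    total = rates.sum()
    if total == 0.0:
        return None
    ys, xs = np.mgrid[0:GRID, 0:GRID]
    cx = (rates * xs).sum() / total
    cy = (rates * ys).sum() / total
    # map the pixel centroid into a [0, workspace] x [0, workspace] area
    return np.array([cx, cy]) / (GRID - 1) * workspace

def forward_kinematics(q):
    l1, l2 = LINK
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def jacobian(q):
    l1, l2 = LINK
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def joint_velocity_command(q, events, now):
    """One closed-loop step: decode a target from events, command joint velocities."""
    target = decode_target(events_to_rates(events, now=now))
    if target is None:
        return np.zeros(2)           # no events, hold still
    error = target - forward_kinematics(q)              # Cartesian tracking error
    return np.linalg.pinv(jacobian(q)) @ (GAIN * error)

if __name__ == "__main__":
    q = np.array([0.4, 0.8])                             # current joint angles [rad]
    events = [(10, 12, 0.01), (11, 12, 0.02), (12, 13, 0.03)]
    print("joint velocities:", joint_velocity_command(q, events, now=0.03))
```

In the real system the decoding and control would be carried by spiking neurons on the processor rather than the dense array operations used here; the sketch only shows how an incoming event stream can be turned into a joint-velocity command at every control step.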