A visual motion detecting module for dragonfly-controlled robots
- Publication Type:
- Conference Proceeding
- Citation:
- 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2014, 2014, pp. 1666-1669
- Issue Date:
- 2014-01-01
This item is closed access and not available.
© 2014 IEEE. When imitating biological sensors, the early processing of sensory input is not yet fully understood and is therefore difficult to reproduce artificially. Building hybrid systems that combine artificial and real biological components is a promising alternative. For example, when a dragonfly is used as a living sensor, the early processing of visual information is performed entirely in the dragonfly's brain; the remaining tasks are recording and processing the neural signals in software and/or hardware. Building on existing work focused on recording neural signals, this paper proposes a software application for neural information processing that serves as a visual processing module for dragonfly hybrid bio-robots. After a neural signal is recorded in real time, action potentials are detected and matched against predefined templates to determine when, and which, descending neurons fire. The output of the proposed system is intended to control other parts of the robot platform.
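The abstract outlines a spike-sorting pipeline: detect action potentials in the recorded signal, then match them against predefined templates to identify which descending neuron fired. The sketch below is a rough illustration only, assuming simple upward threshold crossing for spike detection and nearest-template (Euclidean distance) matching; the function names, threshold and window values, and the placeholder neuron label are hypothetical and are not taken from the paper.

```python
import numpy as np

def detect_spikes(signal, threshold, window=32):
    """Return candidate action-potential snippets and their onset indices.

    A spike is assumed to begin wherever the signal crosses `threshold`
    upward; `window` samples from that point are kept as the waveform.
    """
    crossings = np.flatnonzero((signal[:-1] < threshold) & (signal[1:] >= threshold)) + 1
    waveforms, onsets = [], []
    for i in crossings:
        if i + window <= len(signal):
            waveforms.append(signal[i:i + window])
            onsets.append(i)
    return np.array(waveforms), np.array(onsets)

def match_templates(waveforms, templates, max_dist=2.5):
    """Label each waveform with the nearest predefined template
    (one template per descending neuron). Waveforms farther than
    `max_dist` from every template are labelled None (unclassified)."""
    names = list(templates)
    labels = []
    for w in waveforms:
        dists = np.array([np.linalg.norm(w - templates[n]) for n in names])
        best = int(dists.argmin())
        labels.append(names[best] if dists[best] <= max_dist else None)
    return labels

# Hypothetical usage: in the real system the signal would come from the
# recording hardware and the templates from labelled training recordings.
fs = 20_000                                         # assumed sampling rate (Hz)
signal = np.random.randn(fs)                        # stand-in for 1 s of recorded neural data
templates = {"descending_neuron_1": np.zeros(32)}   # placeholder template
waveforms, onsets = detect_spikes(signal, threshold=3.0)
labels = match_templates(waveforms, templates)
# `labels` indicates which descending neuron (if any) produced each detected
# spike; downstream robot-control code would react to these events.
```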