
An Experimental Cognitive Robot

2010 
The Experimental Cognitive Robot version 1 (XCR-1) is a simple three-wheeled platform for implementation experiments with the “Haikonen cognitive architecture”. The platform is non-digital and non-programmable: it contains no microprocessor and runs no pre-programmed algorithms or programs of any kind. Instead, it is built from associative neuron groups in a widely cross-connected architecture, where information is carried by spatial and temporal patterns of neural signals; no common code for these signal patterns is used or necessary. The Haikonen cognitive architecture is a multi-modal perceptive system that uses so-called perception/response feedback loops for each sensory modality, and these modalities are associatively cross-connected. Any experiment with the architecture must therefore include multiple sensory modalities so that the cross-connection effects can be demonstrated. Accordingly, the XCR-1 has sensory modalities for vision, sound, touch, shock (“pain”) and “pleasure”. The robot moves with two wheel motors and has a gripper mechanism that can grab suitably sized objects. In addition to these motor responses, the robot produces audible “self-talk”, a verbal report that reflects the robot's “mental perception” of the ongoing situation. Because no associative neuron chips are presently available, the modalities are realized in the most minimal way in order to limit the required hardware. These minimal realizations are nevertheless sufficient to demonstrate many essential issues.
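The abstract describes associative neuron groups that link signal patterns across modalities, so that activity in one modality (e.g. sound) can evoke a percept in another (e.g. vision). The sketch below is a hypothetical software illustration of that idea, not Haikonen's actual hardware circuit: a single binary associative neuron with a Hebbian-style learning rule, where co-occurring signals latch synaptic weights, and a learned associative input pattern can later evoke the neuron's output even when its main (sensory) input is absent. All class and parameter names here are invented for illustration.

```python
class AssociativeNeuron:
    """Toy binary associative neuron: fires on its main (sensory) input,
    or when a previously learned associative input pattern recalls it."""

    def __init__(self, n_assoc):
        # Binary synaptic weights for the associative inputs.
        self.weights = [0] * n_assoc

    def learn(self, main, assoc):
        # Hebbian-style rule: when the main input and an associative
        # input are active at the same time, latch that synapse on.
        if main:
            for i, a in enumerate(assoc):
                if a:
                    self.weights[i] = 1

    def output(self, main, assoc, threshold):
        # Fire if the main signal is present, or if enough learned
        # associative inputs are active (associative evocation).
        evocation = sum(w & a for w, a in zip(self.weights, assoc))
        return 1 if (main or evocation >= threshold) else 0


# Cross-modal demo: associate a "sound" pattern with a "vision" neuron
# while both are active, then evoke the visual percept from sound alone.
vision_neuron = AssociativeNeuron(n_assoc=4)
sound_pattern = [1, 0, 1, 0]

vision_neuron.learn(main=1, assoc=sound_pattern)  # co-occurrence episode
recalled = vision_neuron.output(main=0, assoc=sound_pattern, threshold=2)
unrelated = vision_neuron.output(main=0, assoc=[0, 1, 0, 1], threshold=2)
```

Here `recalled` is 1 (the learned sound pattern evokes the visual percept) while `unrelated` is 0 (an unlearned pattern does not), illustrating in miniature how cross-connected modalities could influence one another without any shared signal code beyond co-occurrence.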