Magnus Johnsson

Cognitive Scientist, Computer Scientist











When I was a PhD student I designed and built a number of robots with limited resources, using whatever was at hand, to enable my research on tactile perception in robots. I stopped doing this in 2008, when my interests became more exclusively focused on cognitive modelling and computational intelligence. These robots are therefore all outdated by now, but I have kept short descriptions, pictures and demo movies below.

The LUCS Haptic Hand I is a simple robot gripper that Christian Balkenius and I built in 2004. It is equipped with nine touch-sensitive piezoelectric sensors. A picture is available here. This gripper has been used in a number of computational models of the human haptic system, which demonstrated an ability to learn to categorize objects according to size.

The LUCS Haptic Hand II is an 8 d.o.f. robot hand equipped with 45 piezoelectric touch sensors. It consists of three fingers, each with two phalanges, mounted on a triangular plastic plate, and is equipped with a wrist and a mechanism for vertical repositioning. The piezoelectric touch sensors are distributed on the palmar side of the finger phalanges. The LUCS Haptic Hand II has been used in a number of artificial systems for haptic shape perception. These systems actively explored objects through sequences of grasps with the robot hand to gather cutaneous and proprioceptive information, and they made heavy use of tensor product operations or a novel neural network, the Tensor Multiple Peak SOM (T-MPSOM), to code the tactile information in a useful way. The systems were able to learn to cluster different shapes with the aid of a Self-Organizing Map (SOM). A picture of the LUCS Haptic Hand II is available here, and a movie that shows the robot hand (before it was equipped with tactile sensors) grasping a ball is available here: LHH2.avi.
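The clustering step in these systems can be illustrated with a minimal Self-Organizing Map. The sketch below is not the original implementation; the grid size, learning-rate schedule and toy feature vectors are assumptions chosen purely for illustration.

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small Self-Organizing Map on row-vector data (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h * w, data.shape[1]))
    # Grid coordinates of each node, used by the neighborhood function.
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                # decaying learning rate
        sigma = sigma0 * (1.0 - t / epochs) + 0.5    # shrinking neighborhood radius
        for x in data[rng.permutation(len(data))]:
            # Best-matching unit: the node whose weight vector is closest to x.
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            g = np.exp(-d2 / (2 * sigma ** 2))       # Gaussian neighborhood
            weights += lr * g[:, None] * (x - weights)
    return weights

def bmu_index(weights, x):
    """Index of the best-matching unit for input x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

# Two artificial "shape" clusters end up at different map locations.
data = np.vstack([np.full((10, 4), 0.1), np.full((10, 4), 0.9)])
som = train_som(data)
```

After training, inputs from the two clusters activate different best-matching units, which is the sense in which the SOM "clusters" the tactile codes.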

The LUCS Haptic Hand III is a five-fingered 12 d.o.f. anthropomorphic robot hand equipped with 11 proprioceptive sensors. Several successful haptic systems based solely on proprioception, capable of recognizing shape and size as well as individual objects, were implemented with this robot hand together with various kinds of self-organizing neural networks. A picture of the robot hand is available here, and a couple of pictures of the robot hand equipped with 18 very sensitive experimental binary tactile sensors of my own design are available here and here. There are also some movies of this robot hand. One movie that shows the robot hand shaking hands with its creator is available here: handshake.mpeg handshake.avi. Another movie that shows the robot hand grasping a beer can is available here: cheers.mpeg cheers.avi. A movie that shows finger movements from one side is available here: fingermovements1.mpeg fingermovements1.avi, and another that shows finger movements from the other side is available here: fingermovements2.mpeg fingermovements2.avi.

The LUCS Arm 1 is an anthropomorphic robot arm with a total of 9 d.o.f. It is equipped with 5 proprioceptive sensors. The robot hand is designed similarly to the LUCS Haptic Hand III but is much smaller and has a reduced number of d.o.f. The arm has a flexible elbow and a shoulder that enables both horizontal and vertical rotation. A picture of the robot arm is available here. A picture of the forearm is available here. Pictures of the hand are available here, here, here and here. Movies showing the robot arm moving and grasping are available here: LUCSArm1_movie1.avi and here: LUCS_arm1_movie2.avi. A movie showing the hand grasping a wooden block is available here: LUCS_arm1_movie3.avi. A movie showing finger movements is available here: LUCS_arm1_movie4.avi.

I have also developed a microphone-based texture sensor and a hardness sensor. A picture is available here. The texture sensor slides a tiny metal edge across the surface of the explored object at a constant speed. The metal edge is in physical contact with a capacitor microphone, which picks up the vibrations created. The vibration pattern is transformed into a spectrogram with the Fast Fourier Transform (FFT) algorithm. The hardness sensor explores an object by pressing a stick against it with a constant force while measuring the displacement of the stick, which is estimated from the resistance of a variable resistor. These sensors were used in SOM-based systems that learned to cluster objects according to their texture and hardness properties, as well as in bimodal systems.
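The FFT step of the texture sensor can be sketched as a short-time spectrogram of the microphone signal. The window length, hop size and sampling rate below are illustrative assumptions, not the parameters of the actual sensor.

```python
import numpy as np

def spectrogram(signal, window=256, hop=128):
    """Short-time FFT magnitude spectrogram of a 1-D vibration signal."""
    hann = np.hanning(window)
    frames = []
    for start in range(0, len(signal) - window + 1, hop):
        frame = signal[start:start + window] * hann
        # Keep only the non-negative frequency bins of the real-input FFT.
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames)  # shape: (n_frames, window // 2 + 1)

# A synthetic "texture" vibration: a 50 Hz tone sampled at 1 kHz for 2 s.
fs = 1000
t = np.arange(2 * fs) / fs
spec = spectrogram(np.sin(2 * np.pi * 50 * t))
```

Each row of the resulting array is one time slice of the spectrogram; such slices are the kind of frequency-domain representation a SOM can then cluster by texture.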