ICEA: SCRATCHbot (2006-2009)

Scratchbot in action

Spatial cognition and representation through active touch

Following on from Whiskerbot, a group of researchers from the University of Sheffield and the Bristol Robotics Laboratory has developed SCRATCHbot (Spatial Cognition and Representation through Active TouCH bot), a significant milestone in the pan-European ICEA project.

Inspired by rats

The new technology has been inspired by the use of touch in the animal kingdom. In nocturnal creatures, or those that inhabit poorly-lit places, this physical sense is widely preferred to vision as a primary means of discovering the world. Rats are especially effective at exploring their environments using their whiskers. They are able to accurately determine the position, shape and texture of objects using precise rhythmic sweeping movements of their whiskers, make rapid accurate decisions about objects, and then use the information to build environmental maps.

SCRATCHbot has a number of improvements over the Whiskerbot platform, namely:

  • the active whisker array consists of 18 rather than 6 whiskers
  • there is an additional non-actuated micro-vibrissae array located on the "nose"
  • the "head" is connected to the body by a three-degrees-of-freedom neck, and the body is driven by three independently steerable motor drive units.

These improvements have so far allowed us to model more closely the sensorimotor coordination of the active whisker array (brain stem) and the spatial orienting behaviour (inspired by the superior colliculus) that moves the micro-vibrissae array to contact points in three-dimensional space. We also plan to integrate the whisker sensory information with a hippocampal model (developed by other ICEA consortium members) to investigate how a rat may use its whiskers to navigate through its environment.

SCRATCHbot project details

The robot was designed to reproduce the behaviour of rats as they use their whiskers to explore their environment. To get a clearer picture of how rats use their whiskers, we filmed them using high-speed video cameras (500 fps) and manually tracked the position of each whisker in the array on a frame-by-frame basis.

Whisker tracking

The data from this whisker tracking allowed us to quantify the kinematics of the whiskers as the rats explored novel environments. From this we found that, after a whisker made contact with an object, there was a very rapid (~13 ms) change in the velocity profile of the 'whisking', or movement pattern, of the whiskers.
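To illustrate the kind of analysis this observation implies, the sketch below flags contact onset in a tracked whisker-angle time series by looking for an abrupt change in the velocity profile. Apart from the 500 fps frame rate, everything here (the function name, the synthetic data and the acceleration threshold) is an illustrative assumption, not the study's actual analysis pipeline:

```python
import numpy as np

def detect_contact(angles_deg, fps=500, accel_thresh=1e5):
    """Return the first frame index where the whisker's angular velocity
    changes abruptly (a proxy for object contact).

    angles_deg: per-frame whisker angle from manual tracking (degrees).
    fps: camera frame rate (the rats were filmed at 500 fps).
    accel_thresh: change-of-velocity threshold in deg/s^2; an
    illustrative value, not one reported in the study.
    """
    dt = 1.0 / fps
    velocity = np.diff(angles_deg) / dt        # deg/s between frames
    accel = np.abs(np.diff(velocity)) / dt     # change in velocity profile
    hits = np.flatnonzero(accel > accel_thresh)
    return int(hits[0]) if hits.size else None
```

Running this on a synthetic sweep whose slope breaks at frame 100 picks out the break point; on a smooth sweep it returns `None`.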

We also observed that the rat will tend to move, or orient, its nose toward the exact point of contact. Our hypothesis was that the rat was trying to optimise the force applied by the whiskers making contact with the object, as well as bringing as many additional whiskers as possible, and its nose for smelling, to bear on that point.

We designed our robot to mimic both the low-level, contact-mediated adaptation of the whisker motion pattern and the ability to orient its 'nose' towards points in three-dimensional space. Designing the physical robot to mimic these behaviours allows us to test different computational models of the underlying brain structures that control them. This exposes the models to realistic, real-time sensory information, testing their robustness and validity in a rapid and obvious way. It also reveals side effects that might otherwise not have been seriously considered, such as noise induced by the rat's own movement, which has led to further hypotheses of brain function and generated new models to test.
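The contact-mediated adaptation can be sketched as a simple whisk cycle in which protraction halts as soon as contact is detected and the whisker retracts. The frame counts, step size and angles below are illustrative, not measured robot parameters:

```python
def whisk_cycle(contact_frames, n_frames=50, amplitude_deg=40.0):
    """Simulate one whisk sweep with contact-mediated adaptation.

    The whisker protracts toward amplitude_deg; on contact (or on
    reaching full amplitude) protraction halts and the whisker retracts,
    mimicking the rapid change in the whisking velocity profile.
    All parameters are illustrative assumptions.
    """
    step = amplitude_deg / 20.0                  # degrees per frame
    angle, retracting, trajectory = 0.0, False, []
    for t in range(n_frames):
        if t in contact_frames or angle >= amplitude_deg:
            retracting = True                    # halt protraction
        angle = max(angle - step, 0.0) if retracting else angle + step
        trajectory.append(angle)
        if retracting and angle == 0.0:
            break                                # sweep complete
    return trajectory
```

With a contact at frame 10 the sweep turns around at half amplitude; without contact it completes a full protraction before retracting.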

Choosing a material for the robot's whiskers

The choice of material for the whisker shaft was, and still is, an area of significant research. We know the material properties of real rat whiskers in detail; however, our current choice of sensory transducer limits the minimum size of the whiskers we can build, which in turn has led to compromises in the material properties of the artificial whiskers.

We have experimented with glass-fibre in the past but have more recently adopted ABS plastic, as it is much closer in stiffness to real rat whiskers. Stiffness is not the only physical parameter we think is important, however; we are also interested in the shape of the whiskers. Rat whiskers are tapered and curved, both features that affect the mechanics of the whisker shaft and, we believe, could be instrumental in how the rat experiences the world.

Robot sensors

For the larger whiskers which move (known as the macro-vibrissae) we use Hall effect sensors, which measure the movement of small magnets bonded to the base of each whisker. For the smaller whiskers on the front (the micro-vibrissae) we have used glass-fibre rods bonded to the casings of small microphones embedded in a rubber substrate. The robot has no vision, proximity or auditory sensors.
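A minimal sketch of how such a sensor arrangement could be read: two orthogonal Hall-effect readings at a whisker base are converted into deflection angles as the magnet moves with the bending shaft. The 12-bit zero offsets and counts-per-degree gain are illustrative calibration values, not the robot's actual ones:

```python
def whisker_deflection(raw_x, raw_y, counts_per_degree=120.0, zero=(2048, 2048)):
    """Convert a pair of orthogonal Hall-effect ADC readings at a
    whisker base into deflection angles (degrees) plus a magnitude.
    As the whisker shaft bends, the magnet bonded to its base moves
    relative to the sensor; the offsets and gain here are assumed
    calibration values for illustration only."""
    dx = (raw_x - zero[0]) / counts_per_degree   # horizontal deflection
    dy = (raw_y - zero[1]) / counts_per_degree   # vertical deflection
    return dx, dy, (dx * dx + dy * dy) ** 0.5    # total deflection
```

For example, a reading 120 counts above the horizontal zero maps to a one-degree horizontal deflection under these assumed calibration values.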

Robot construction

All the yellow components of the robot in the photos above were designed and built in our laboratory using one of our rapid prototyping machines (3D printers). The whisker columns, located on either side of the 'head', are actuated by brushed DC motors and controlled locally using embedded microcontrollers. The "neck" was built by an external contractor (Elumotion Ltd.) and uses brushless DC motors to give the head three degrees of freedom (pitch, yaw and elevation).

The three motor drive units on the 'body' were designed and built in the laboratory and provide independent drive and steering through 180 degrees, again using brushless DC motors. As with the whisker columns, all motors are controlled locally by separate embedded microcontrollers, which receive their desired velocity or position profiles over a CAN (Controller Area Network) bus using the same protocol as most modern cars. The 'brain' of the robot is a combination of FPGA (Field Programmable Gate Array) based and PC-based neural models, which issue the motor commands and process the sensory input from the whisker array.
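To make the bus arrangement concrete, the sketch below packs a desired-velocity command for one drive motor into a classic 8-byte CAN data payload. The arbitration ID, command byte and payload layout are assumptions for illustration; SCRATCHbot's actual message format is not published here:

```python
import struct

MOTOR_BASE_ID = 0x200  # hypothetical CAN arbitration ID base

def velocity_frame(motor_index, velocity_rpm):
    """Pack a desired-velocity command for one motor into a CAN frame
    (arbitration ID, 8-byte payload). The ID scheme, command byte and
    layout are illustrative assumptions, not SCRATCHbot's protocol."""
    can_id = MOTOR_BASE_ID + motor_index
    # little-endian: command byte, pad byte, 32-bit float velocity, 2 pad bytes
    data = struct.pack("<BBf2x", 0x01, 0x00, float(velocity_rpm))
    assert len(data) == 8  # classic CAN frames carry at most 8 data bytes
    return can_id, data
```

A microcontroller on the bus would filter on its own arbitration ID and unpack the velocity field each control cycle.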

The robot does not send data back to a computer; all processing is done on the platform in real time. At the end of each experimental run, all the sensory data and robot odometry can be uploaded remotely for further analysis.

We are, however, now very close to the limit of our on-board processing and power resources, so we are designing a small asynchronous wireless board that can broadcast the whisker sensory data for external processing. Interestingly, the varying propagation delays introduced by wireless communication would prohibit stable control of the robot, and would certainly prohibit modelling certain neural structures in a biologically realistic time frame. We have therefore adopted an approach inspired by the architecture of the brain, in which low-level motor functions operate perfectly well without a high-level cortex, while the cortex can 'modify' the behaviour of the lower-level control loops. This is analogous to the processing on the robot maintaining real-time performance while any external processing (via the wireless link) exerts a 'modifying' influence on the robot's behaviour in a non-time-critical manner.
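This layered arrangement can be sketched as a low-level control step that never blocks on the high-level link. Off-board 'cortical' advice arrives with unpredictable latency, modelled here as a queue; the loop applies the most recent bias if one is waiting and otherwise carries on unchanged. The names and the gain scheme are illustrative, not the robot's actual software interface:

```python
from queue import Empty, Queue

def control_step(state, modifiers, base_gain=1.0):
    """One low-level, real-time control step. High-level advice arrives
    over a variable-latency wireless link (modelled as a queue); the
    loop never blocks on it, so real-time performance is preserved
    whether or not any advice has arrived this cycle."""
    try:
        state["bias"] = modifiers.get_nowait()  # latest advice, if any
    except Empty:
        pass  # no advice this cycle: carry on with the last-known bias
    return base_gain * state.get("bias", 1.0)
```

The low-level loop produces a valid output on every cycle, while a delayed high-level message merely rescales subsequent cycles, mirroring the cortex-over-brainstem arrangement described above.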

Application domains

The robot was designed primarily as a research tool to study the whiskers of the rat and the functional architecture of the mammalian brain, both of which were justification enough for its development. The technology demonstrated on the robot, namely the active touch sense and the ability to explore an environment non-visually, could be exploited further, and rescue operations are one possible application domain.

Other applications that may benefit from this technology include the inspection of large fluid-filled tanks found in the nuclear industry, the inspection of pipes or conduits filled with dirty fluids, and textile quality inspection; the point being that the robot was not specifically designed for search and rescue operations. However, there are no current plans to exploit this robot commercially.

Project video

Scratchbot in operation at the Bristol Robotics Laboratory


Page last updated 8 November 2012