ASSISIbf
Animal and robot Societies Self-organise
and Integrate by Social Interaction (bees and fish)


Robots are used in ethology and behavioural experiments to study the modes of interaction and communication between animals. These robots are usually teleoperated by the human experimenters in order to trigger a response from the studied animals. However, an increasing number of ethological studies use autonomous robots, capable of social integration with the animals without human intervention. We apply the same idea in the ASSISIbf project to create mixed societies of animals and robots.

For instance, here is a video of a robot that socially integrates into a group of four zebrafish in a closed two-patch set-up:

Acceptance

To be accepted by the group of fish, several aspects of the robot's design must be considered. First, the fish lure must be biomimetic (i.e. "look" like a real zebrafish). Our lure was created by 3D printing a 3D scan of a real zebrafish. The lure is also covered with a decal so that it has the same colour pattern as a zebrafish.


Biomimetic fish lure

Second, the robot must behave like a fish (biomimetic behaviour). Fish tend to be attracted to each other and to move from one room to the other. They can have complex group dynamics, with fish joining or leaving short-lived sub-groups. The robot must be able to mimic these behaviours to be accepted by the group of fish. The robot must also appear to move like a fish, with biomimetic movement patterns composed of repeated bouts of high acceleration (tail beats), each followed by a relaxation period.

Lastly, fish move very fast in the experimental set-up (on average 9 cm/s, but they can reach up to 30 cm/s). As such, the robot hardware and software must also be designed to reach comparable speeds.
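To make these numbers concrete, here is a toy sketch (in Python, with illustrative parameters, not the project's actual controller) of a burst-and-coast speed profile: each "tail beat" jumps the speed to a peak of about 30 cm/s, after which the speed relaxes, giving a mean speed close to the observed 9 cm/s average.

```python
# Toy burst-and-coast speed profile; all parameters are illustrative,
# not the values used by the actual robot controller.
import numpy as np

DT = 0.02            # simulation step [s]
BURST_PERIOD = 0.8   # time between tail beats [s]
BURST_SPEED = 30.0   # peak speed right after a beat [cm/s]
DECAY = 4.0          # relaxation rate between beats [1/s]

def speed_profile(duration=10.0):
    """Return time and speed arrays for a burst-and-coast swimming pattern."""
    times = np.arange(0.0, duration, DT)
    speeds = np.zeros_like(times)
    speed = 0.0
    next_burst = 0.0
    for i, t in enumerate(times):
        if t >= next_burst:
            speed = BURST_SPEED          # tail beat: jump to peak speed
            next_burst += BURST_PERIOD
        speeds[i] = speed
        speed *= np.exp(-DECAY * DT)     # relaxation: exponential decay
    return times, speeds

t, v = speed_profile()
print(f"mean speed: {v.mean():.1f} cm/s, peak: {v.max():.1f} cm/s")
```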

With a robot socially integrated into the group of fish, it is possible to use it to influence their behaviour and control the collective behaviour of the entire biohybrid population of fish and robots. This can be accomplished by using the robot to initiate collective departures from one room to the other, as shown in this video:

Initiation

During transitions, we observe that the fish which initiates collective departures is not always the same: the leadership is shared. However, some fish can be the leader more often than others. The leader position is linked to the number of times a fish tries to exit the room. As such, to influence the behaviour of a group of fish, our integrated robot must attempt to exit the room more often than the fish do. In this video, the robot integrates into the group of fish and then initiates a transition.
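A toy simulation of this idea is sketched below (an assumed model with made-up rates, not the project's analysis): each individual attempts to exit at its own rate, the earliest attempt initiates the departure, and raising the robot's attempt rate raises its share of initiations.

```python
# Toy model of shared leadership: the individual whose exit attempt comes
# first initiates the collective departure. Rates are illustrative.
import random

attempt_rates = {"fish1": 1.0, "fish2": 1.0, "fish3": 1.2, "fish4": 0.8,
                 "robot": 2.5}   # exit attempts per minute

def simulate_departures(n=1000):
    counts = {name: 0 for name in attempt_rates}
    for _ in range(n):
        # Exponential waiting time until each individual's next exit attempt;
        # the earliest attempt initiates this departure.
        first = min(attempt_rates,
                    key=lambda name: random.expovariate(attempt_rates[name]))
        counts[first] += 1
    return counts

print(simulate_departures())   # the robot should initiate the most departures
```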


ASSISIbf at the ARS ELECTRONICA FESTIVAL


In September 2016, the ASSISIbf project showcased an installation of fish & robots and bees & robots at the ARS ELECTRONICA FESTIVAL. This festival for art, technology and society was held from 8 to 12 September in Linz, Austria. We had the opportunity to present our project at the Post City Linz, a former postal letter and package sorting facility. Hundreds, if not thousands, of visitors came to our booth during the 5 days of the exhibition, attracted by online media posts, radio interviews and TV features. ASSISIbf was also represented at the panel session "Chemistry of Intelligence" on the FIS stage, organized by Mahir Yavuz and Pablo Honey, where the project coordinator Thomas Schmickl held a discussion with Barbara Ondrisek and Hitoyo Nakano.

Manon Briod and Martina Szopek spent months organizing the event. Their statements at the end say it all:

Manon: Scientists in an art fair are like fish in Frank’s experiment, first they are afraid and then they follow happily!
Martina: It’s been an exciting experience to present a scientific project at an art fair. Our booth was busy like our bee hives!



ASSISIbf is leading the effort towards open science. Read more on the openAIRE blog!




Workshop announcement: "Steering living and life-like complex systems" to be held at Artificial Life XV in July 2016, in Cancún, Mexico.

New technologies that exploit or emulate the unique properties of living systems have great potential, but the non-linearity and complexity exhibited by these systems render “brute force” approaches to control insufficient. An emerging collection of approaches use “steering”, whereby we continually interact with systems and attempt to move them between attractors.  This may be achieved, for instance, via manipulating the abiotic environment (e.g. in the evolution of biofilms) or by artifacts injecting social information (e.g. in bio-hybrid societies).  Understanding system dynamics and using effective leverage points can thus reduce the effort needed to retain a given desirable state.  Conceptually-related approaches are also being proposed in life-like complex adaptive systems such as regional economies, industrial networks and smart cities.


The workshop aims to bring together researchers interested in understanding and modulating complex biological and societal systems, and researchers of bio-hybrid systems are particularly encouraged to participate: the meeting would welcome perspectives focusing on methodology, ethical issues or conceptual issues in bio-hybrid systems or steering living systems more broadly.


Submissions open until 9 May! See full details of the workshop here: http://steeringcomplexsystems.wordpress.com/workshop-2016

The workshop is being organised by Alexandra Penn (ERIE/CECAN, Surrey), Rob Mills (ASSISIbf, Lisboa), and Emma Hart (FoCAS, Edinburgh Napier).


ASSISIbf pushing open science

In the ASSISIbf project we are committed to publishing our hardware designs as open source. This is motivated by a long and successful tradition of publishing open-source robot designs, such as the e-puck and Thymio robots.

Through the regular publication of our designs we have observed, over the last decade, that the landscape of open-source hardware has evolved a lot, mainly because of improved manufacturing possibilities, for instance the accessibility of 3D printing and laser cutting. Another important factor of change is the increasing accessibility of electronics, with large open-source projects like Arduino. A last factor is the trend toward open science, well represented in Europe by the OpenAIRE initiative (https://www.openaire.eu/).

In this dynamic landscape, some elements generate conflicts while others generate opportunities. What are these elements, and how can we avoid or profit from them?
This is the main motivation of two surveys we are running: one among providers of design tools (in progress) and one among the open-source hardware communities.

Therefore, if you are an open-source hardware enthusiast, could you help us by filling in the survey (it takes about 4 minutes and is open until March 27, 2016): https://www.surveymonkey.com/r/open_source_hardware

Thank you for your help. If you give your email address, we will inform you when the results are published.

From January 12 to 14, 2016, we organized a winter school on the topic "From bio-inspired to bio-hybrid (robotic) systems". The school was held at the Ecole Polytechnique Fédérale de Lausanne in Lausanne, Switzerland.

The goal of the winter school was to present the transition from bio-inspired systems, which focus on developing technology, to bio-hybrid systems, where technology is in symbiosis with living systems. These bio-hybrid systems can make the best use of the properties of both components, biological and technological. For this, both systems and their interactions need to be modelled in more detail than for bio-inspired systems. During the school, examples came from hybrid systems involving robots and bees, fish and plants.

The school consisted of four main parts: preparation (article reading), lectures, practicals, and a reporting phase after the winter school for students who wished to submit the results of their practical work for evaluation. A lab visit completed the winter school.

Of the 6 lectures, 5 were given by members of the ASSISIbf project and one by members of the Flora Robotica project. Each lecture was associated with a practical, but participants followed the same practical during the whole winter school, for a total duration of 17 hours. The lectures and practicals were run by a total of 13 people.


We had 13 participants registered and attending the winter school, with academic positions ranging from PhD students to professors. The high level of interest in the topic made the practicals extremely constructive and productive. Among the participants, 4 decided not to write a report on the practical and 9 used this opportunity to get feedback from the lecturers.

Moreover, the lectures and practicals were highly appreciated, as illustrated by the evaluation carried out with the participants.


We look forward to organizing another training, and hope to attract people as interested and passionate as those of this year!


The BeeFish game is now available!


The University of Lisbon (UNILIS) team developed a videogame called BeeFish for the dissemination of the ASSISIbf project. The BeeFish game, for mobile devices, consists of two different game modes, one with bees and CASUs and the other with fish and CASUs, each with several levels. In both modes, players control CASUs to guide the animals to their goal, mimicking the experiments with real animals that are carried out by researchers of the project.

Screenshots of the BeeFish game.

The game is now available for free download for both Android and Apple iOS smartphones and tablets. To easily find the game:

* for Android, search for "BeeFish FCUL" on the Play Store or use this link:
* for iOS, search for "BeeFish FCUL" in the iOS App Store or use this link:

The game is already being used at UNILIS to disseminate the ASSISIbf project.

The ASSISIbf project and the BeeFish game were presented to high school students who visited UNILIS.


ASSISIbf Training FER 2015


On 16 December 2015 the ASSISIbf LARICS team organized a training at the University of Zagreb, Faculty of Electrical Engineering and Computing (FER), for local students. The purpose of the training was to present, through selected lectures and a practical session, the overall goals of the ASSISIbf project and the results accomplished in the first half of the project. There were 12 students attending the training, who were divided into 4 groups of 3 students for the practical session. LARICS staff (Prof. Stjepan Bogdan, Dr. Damjan Miklić, Karlo Griparić, Tomislav Haus, Damir Mirković) were responsible for giving lectures and providing support during the practicals.

Prof. Bogdan gave a talk on the general concepts of the project. The students could learn about the FET programme, the FoCAS initiative, collective adaptive systems and the methodology applied within the ASSISIbf project. Dr. Miklić presented the software architecture developed for facilitating ethological experiments on honeybees. The emphasis was on the distributed and modular nature of the developed framework. Students became familiar with ZeroMQ (ZMQ), the messaging framework that is used as middleware in our system. Google Protobuf was introduced to the students as a powerful tool for message serialization/deserialization. The lectures were concluded by Karlo Griparić's talk on the developed Combined Actuator Sensor Unit (CASU) arena. The students became familiar with the techniques that we use to design the arena, ranging from mechanical CAD design to PCB design and embedded system design.
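For readers unfamiliar with this middleware pattern, here is a minimal sketch of ZMQ publish/subscribe messaging in Python (pyzmq); the topic name and payload are illustrative and do not reflect the actual ASSISIbf message definitions, which use Protobuf-serialized payloads.

```python
# Minimal ZMQ PUB/SUB sketch; run publisher() and subscriber() in
# separate processes. Topic and payload are purely illustrative.
import zmq

def publisher():
    ctx = zmq.Context.instance()
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://*:5556")
    # Multipart message: topic frame followed by a serialized payload
    # (e.g. a Protobuf-encoded sensor reading).
    pub.send_multipart([b"casu-001/ir", b"serialized-ir-readings"])

def subscriber():
    ctx = zmq.Context.instance()
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://localhost:5556")
    sub.setsockopt(zmq.SUBSCRIBE, b"casu-001")   # filter by topic prefix
    topic, payload = sub.recv_multipart()        # blocks until a message arrives
    print(topic, len(payload))
```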

Prof. Bogdan giving the lecture on ASSISIbf concepts.

In the practical session, the task for the students was to program CASUs. LARICS staff prepared the arena with 4 fully functional CASUs. Each group was given its own CASU to program. Bristlebots (HEXBUG Nano), tiny robots powered by a battery and vibration motor, were used to mimic the presence of bees. After the presentation of the basic CASU controller, which included the introduction of the developed Python API, the students were given 3 tasks to complete.

The first task was the calibration of infrared (IR) sensors that we use to detect bees. The second task was to control the CASU temperature and color based on the number of detected robots. In particular, if a bristlebot is detected, the temperature reference should be increased by 0.5 °C (positive feedback), and if there is no detection in a specified time period (e.g., 10s), the reference should be decreased by 0.5 °C. The color of the embedded LED should be changed with respect to the measured CASU temperature. The first two tasks were successfully completed by each group.
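A minimal sketch of the second task is shown below. The CASU interface used here (detect_bristlebot, set_temperature_reference, get_temperature, set_led_color) is hypothetical; the real ASSISIbf Python API differs.

```python
# Sketch of the bristlebot-driven temperature controller; the casu object
# and its methods are hypothetical placeholders for the real Python API.
import time

DETECTION_TIMEOUT = 10.0   # seconds without a detection before cooling down
TEMP_STEP = 0.5            # °C change per event
TEMP_MIN, TEMP_MAX = 26.0, 38.0

def temperature_to_color(temp):
    """Map measured temperature to an LED colour (blue = cool, red = warm)."""
    frac = min(max((temp - TEMP_MIN) / (TEMP_MAX - TEMP_MIN), 0.0), 1.0)
    return (frac, 0.0, 1.0 - frac)   # (red, green, blue) in [0, 1]

def run(casu):
    reference = 28.0
    last_detection = time.time()
    while True:
        if casu.detect_bristlebot():                 # IR proximity detection
            reference = min(reference + TEMP_STEP, TEMP_MAX)
            last_detection = time.time()
        elif time.time() - last_detection > DETECTION_TIMEOUT:
            reference = max(reference - TEMP_STEP, TEMP_MIN)
            last_detection = time.time()             # restart the timeout window
        casu.set_temperature_reference(reference)
        casu.set_led_color(*temperature_to_color(casu.get_temperature()))
        time.sleep(0.5)
```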

Students worked in groups to solve the training tasks. We used live thermal-camera footage to monitor the arena temperature.

The final task was more ambitious and included the estimation of the number of bristlebots in the arena, based on IR detections. The students were left free to develop any kind of algorithm, and it was interesting to see how quickly they employed their knowledge of machine learning. Eventually, one of the groups implemented a linear regression algorithm and started the learning procedure by collecting IR data while altering the number of bristlebots in the arena.
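The sketch below illustrates the general approach (ordinary least squares from IR readings to a bristlebot count); the sensor values and feature layout are made up for illustration and are not the students' data or code.

```python
# Illustrative linear-regression count estimator; data are invented.
import numpy as np

# Each row: readings from the CASU's IR proximity sensors (arbitrary units);
# each label: how many bristlebots were in the arena at that moment.
X = np.array([
    [120,  80,  60,  40,  30,  25],
    [300, 220, 180, 150, 100,  90],
    [520, 400, 350, 280, 210, 180],
    [760, 610, 500, 430, 350, 300],
], dtype=float)
y = np.array([1, 2, 3, 4], dtype=float)

# Ordinary least squares with a bias term.
X_aug = np.hstack([X, np.ones((X.shape[0], 1))])
weights, *_ = np.linalg.lstsq(X_aug, y, rcond=None)

def estimate_count(ir_readings):
    """Predict the bristlebot count from a new vector of IR readings."""
    features = np.append(np.asarray(ir_readings, dtype=float), 1.0)
    return max(0, round(float(features @ weights)))

print(estimate_count([400, 300, 250, 200, 150, 120]))
```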

Bristlebots were used to mimic bee motion. One of the tasks was to implement a bristlebot counting algorithm.

We hope that the insights the students gained during the training improved their programming skills and will help them in their ongoing and future projects. We also hope that we managed to familiarize the students with daily activities typical for scientific projects such as ASSISIbf, and to encourage them to continue their education and start a career in science.



Introducing RiBot


One year ago, we posted an article on this blog that introduced a wheeled mobile robot that could move passive lures inside an aquarium to interact with living zebrafish Danio rerio. Over the past year, we have designed an active lure, RiBot, equipped with an actuated tail in order to mimic fish body movements underwater. RiBot is a combination of the words "riba" (рыба) and "robot" (робот), which mean fish and robot, respectively, in Slavic languages. This remotely controlled, waterproof device has a total length of 7.5 cm, only 1.8 times the size of a zebrafish, and is able to beat its tail with different frequencies and amplitudes while following the group of living animals by means of the external wheeled mobile robot, which is coupled to the robotic lure with magnets. A universal infrared TV remote control using the RC5 protocol can be used to switch between the tail-beating modes. The robotic lure is also equipped with a rechargeable battery and has an autonomy of more than one hour.
Preliminary experiments on mixed societies of an actuated lure, combined with an external wheeled robot, and zebrafish Danio rerio will be presented at the SWARM 2015 conference in Kyoto, Japan.
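As a rough illustration of the tail-beating modes mentioned above, the sketch below generates a sinusoidal tail command from a chosen frequency/amplitude preset; the presets and function names are assumptions, not the RiBot firmware.

```python
# Illustrative tail-beat command generator; presets are assumed values.
import math

TAIL_MODES = {                 # (frequency [Hz], amplitude [deg]) presets
    "rest":   (0.0,  0.0),
    "cruise": (3.0, 20.0),
    "burst":  (8.0, 35.0),
}

def tail_angle(mode, t):
    """Tail deflection in degrees at time t [s] for the selected mode."""
    freq, amp = TAIL_MODES[mode]
    return amp * math.sin(2.0 * math.pi * freq * t)

# Example: sample the "cruise" beat over one second at 50 Hz.
samples = [tail_angle("cruise", i / 50.0) for i in range(50)]
print(f"peak deflection: {max(samples):.1f} deg")
```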


The first prototype of RiBot, compared with a zebrafish Danio rerio used during the experiments. (c) EPFL


Having established that the basic CASU functionality works as expected, the ASSISIbf team was ready to undertake the first collective-behaviour experiments, led by Rob Mills of Lisboa, who had worked out and coded up a number of interesting test cases.

In the first group of experiments, two CASUs were heating opposite corners of the arena, with one of them providing the bees' preferred temperature of 36 °C and the other one heating to a temperature two degrees lower. After the bees had aggregated at the optimal spot, the heating CASUs were turned off, and two new attractive spots were created in the other two corners of the arena. Again, one spot was only locally optimal, the other one globally optimal. The experiment was performed in two variants: one with an abrupt optimum change, as described above, and another where the optima moved along a chain of neighbouring CASUs. The goal was to see whether the CASU array can be used to "guide" the bees to the global optimum. Further experiments are needed in order to draw definite conclusions; however, in the performed experiments, the bees' time to reach the global optimum was decreased with the help of the CASUs.
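Schematically, the protocol amounts to two heating phases, as in the sketch below. The CASU names, the off temperature and the chain sequence are assumptions made for illustration; only the 36 °C global optimum and the two-degree-lower local optimum come from the experiment description.

```python
# Schematic two-phase heating schedule; names and values other than the
# 36 °C / 34 °C optima are illustrative assumptions.
PHASE_1 = {"casu_NW": 36.0, "casu_SE": 34.0}          # initial global/local optima [°C]
PHASE_2_ABRUPT = {"casu_NE": 36.0, "casu_SW": 34.0}   # abrupt switch to new corners
PHASE_2_CHAIN = [                                     # or a gradual move along a chain
    {"casu_NW": 34.0, "casu_N": 36.0},
    {"casu_N": 34.0, "casu_NE": 36.0},
]

def apply_phase(casus, setpoints, off_temperature=26.0):
    """Set listed CASUs to their targets and switch all others to ambient."""
    for name, casu in casus.items():
        casu.set_temperature(setpoints.get(name, off_temperature))
```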

Now THAT is what I call an aggregation!

The other set of experiments was the first step towards the ambitious ASSISIbf goal of interaction between spatially separated societies. Two groups of bees, physically separated in two smaller arenas within the CASU array, were required to coordinate their decision on an aggregation spot. The CASUs were "counting" the bees in their surroundings by means of IR proximity sensors and closing a positive feedback loop by producing more heat when counting more bees. This was the first set of fully autonomous CASU experiments!
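The positive feedback rule can be summarised as in the following sketch; the constants and the CASU methods are illustrative assumptions, not the actual firmware or API.

```python
# Sketch of the bee-count-to-heat positive feedback loop; constants and
# the casu interface are illustrative assumptions.
TEMP_BASE = 26.0       # setpoint with no bees nearby [°C]
TEMP_MAX = 36.0        # bees' preferred temperature [°C]
GAIN = 1.5             # °C added per detected bee
IR_THRESHOLD = 1000    # raw IR value treated as a detection

def temperature_setpoint(bee_count):
    """More detected bees -> more heat, saturating at the preferred temperature."""
    return min(TEMP_BASE + GAIN * bee_count, TEMP_MAX)

def control_step(casu):
    # Treat each IR proximity channel above threshold as one nearby bee (rough proxy).
    bee_count = sum(1 for reading in casu.read_ir_sensors() if reading > IR_THRESHOLD)
    casu.set_temperature(temperature_setpoint(bee_count))
```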

So close no matter how far!

Additional experiments featured a mixed society, but the details of this experiment will be kept secret for the time being.

This experiment is still top secret!

Some experimental results are not ready to be publicized yet!

One of the highlights of the workshop was the bee detection and tracking software implemented and tested by Marcelo of EPFL. Due to the bees' high density and unpredictable motion, bee tracking is a notoriously difficult problem, and to the best of our knowledge no robust solutions, whether commercial or academic, are currently available. Well, at least until the end of this week, when Marcelo adapted his fish-tracking tool to tracking bees. It took a lot of coding and some adjustments to the environment

Reliable tracking requires perfect environmental conditions.

but the results are more than impressive:

https://www.youtube.com/watch?v=OPHZg52_irA&feature=youtu.be
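For readers curious how such tracking typically works, here is a generic sketch (OpenCV 4, Python) of blob-based detection via background subtraction; this is not Marcelo's actual fish/bee tracker, and the file name is a placeholder.

```python
# Generic blob-based animal detection: background subtraction + contours.
# Not the project's tracker; parameters and file name are placeholders.
import cv2

cap = cv2.VideoCapture("arena_video.mp4")
backsub = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = backsub.apply(frame)                            # foreground = moving animals
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) > 50:                        # keep blobs large enough to be an animal
            m = cv2.moments(c)
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    # Associating centroids across frames (e.g. nearest-neighbour matching)
    # would turn these per-frame detections into continuous tracks.

cap.release()
```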

All in all, the whole ASSISIbf team is satisfied with the progress achieved and confident that they can keep up the dynamic pace set out in the project DoW.