Project

Spirit of the e-puck project

This project was started at the Ecole Polytechnique Fédérale de Lausanne (EPFL) as a collaboration between the Autonomous Systems Lab, the Swarm-Intelligent Systems Group and the Laboratory of Intelligent Systems.

An educational robot

The main goal of this project is to develop a miniature mobile robot for educational purposes at university level. To achieve this goal the robot needs, in our opinion, the following features:

  • Good structure. The robot should have a clean mechanical structure that is simple to understand. The electronics, processor structure and software have to be a good example of a clean, modern system.
  • Flexibility. The robot should cover a large spectrum of educational activities and should therefore have large potential in its sensors, processing power and extensions. Potential educational fields include mobile robotics, real-time programming, embedded systems, signal processing, image or sound feature extraction, human-machine interaction and robotics for artists.
  • Autonomy. The robot should be able to run programs for more than one hour without external power.
  • Connectivity. The robot should be able to communicate with a PC through a robust link.
  • Robustness. The robot should resist a certain amount of abuse by students.
  • Open source. All documentation, schematics and firmware should be available.
  • Low cost. For large-scale use, the robot should be cheap (450-550 euros).

An open hardware platform

To help the creation of a community inside and outside EPFL, the project is based on an open hardware concept, where all documents are distributed and submitted to a license allowing everyone to use and develop for it.

Status of the project

A first set of 20 robots (e-puck version 1) was produced in November 2004 and tested in the courses "microinformatique I" and "microinformatique II". The results of these tests were used to create a new version of the robot (e-puck version 2) during summer 2005. We produced 400 units of this final version at the beginning of 2006. Version 2 is incompatible with version 1, so we discourage everybody from starting to work on version 1. Version 2 includes many improvements and is stable.

Distribution

Two manufacturing companies have started producing the e-puck and several distributors have started to sell it. We list them on this site in the section "Where to buy". If you are interested in producing or distributing this robot, please let Francesco Mondada or Alcherio Martinoli know.

Partners

 

The e-puck project is supported by the Ecole Polytechnique Fédérale de Lausanne (EPFL)


 

 

EPFL laboratories involved in the project:

  • Autonomous Systems Lab
  • Laboratory of Intelligent Systems
  • Swarm-Intelligent Systems Group



 


 


People

 

ASL group

  • Francesco Mondada: project manager
  • Michael Bonani: design and production, e-puck basic design
  • Roland Siegwart: professor


LIS group

  • Adam Klaptocz: linear camera, ground sensors, colour LED communication and omnidirectional camera modules
  • Julien Hubert: software developer
  • Jean-Christophe Zufferey: LIS project manager
  • Dario Floreano: professor


SWIS group

  • Xavier Raemy: ZigBee module
  • Jim Pugh
  • Alcherio Martinoli: professor

 

Cyberbotics

  • Olivier Michel: Webots simulator adaptation (https://www.cyberbotics.com/)


 

 

License

e-puck Robot
Open Source Hardware License

Version 1.0

August 2005
Copyright © Ecole Polytechnique Fédérale de Lausanne (EPFL)
Switzerland

Preamble

This Open Source Hardware License aims at the dissemination of the specifications necessary to build the e-puck robot, a mobile robot developed by the Ecole Polytechnique Fédérale de Lausanne ( "EPFL"), Switzerland.

1. Definitions

"License" shall mean this Open Source Hardware License.
"Specifications" shall mean the hardware specifications, printed circuit board designs, drawings, CAD files, list of components and other artwork for building an e-puck mobile robot as released by EPFL and published on its web site (www.e-puck.org), from time to time.
"Robot" or "Robots" shall mean the e-puck mobile robot(s) built using the Specifications or any modification or derivation thereof.
"You" shall mean each licensee under this License.

2. Using the Specifications and Building Robots

You are hereby authorized to use the Specifications for any purpose.

You are further authorized to build Robot(s) based on the Specifications and distribute them, provided that you give to each purchaser of Robot(s) a copy of this License along with a copy of the Specifications under the conditions specified in Section 3.1 herein.

You may design and develop a Robot based on modifications you make to the Specifications. If you decide to distribute such a Robot, you shall only do so provided that you give to each purchaser of such a Robot a copy of this License along with a copy of the modified Specifications under the conditions specified in Sections 3.1 and 3.2 herein.

When referring to the Robot in any scientific publication, you shall make a clear reference to the e-puck Robot pages on EPFL's Web site : www.e-puck.org.

You will not be charged any royalties by EPFL for such use of the Specifications and/or building and distributing the Robot(s) or the Specifications.

3. Copying, Distributing and Modifying the Specifications

3.1 You may copy and distribute verbatim copies of the Specifications as you receive them, in any medium. If you choose to do so (or if you choose to distribute Robot(s) based on the Specifications), you shall only do so under the following conditions :

i) conspicuously and appropriately publish on each copy of the Specifications the following copyright notice and disclaimer of warranty :

"Copyright © Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland.

The specifications of the e-puck mobile robot are "open source hardware". You can redistribute them and/or modify them under the terms of the e-puck Robot Open Source Hardware License as published by EPFL. You should have received a copy of the EPFL e-puck Robot Open Source Hardware License along with these specifications; if not, write to the Ecole Polytechnique Fédérale de Lausanne (EPFL), Industrial Relations Office, Station 10, 1015 Lausanne, Switzerland.

These specifications are distributed in the hope that they will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. For more details, see the EPFL e-puck Robot Open Source Hardware License. "

and

ii) give any other recipients of the Specification a copy of this License along with the Specifications

3.2 You may modify your copy of the Specifications or any portion thereof. You are not required under the terms of this License, to distribute any of the modifications that you make to the Specifications. However, if you choose to distribute modified Specifications (or to distribute Robot(s) based on modified Specifications), you shall only do so under the terms of Section 3.1 above and subject to all of the following conditions :

i) you must cause the modified Specifications to carry prominent notices stating that you have changed the Specifications as well as the date and details of such change;

and

ii) you must allow any modified Specifications that you distribute or publish, that in whole or in part contains or is derived from the Specifications or any part, modification or derivation thereof, to be licensed as a whole at no charge by any third party under the terms of this License.

The requirements of this Section 3.2 do not apply to works that are not derivatives of the Specifications (in the sense of the applicable copyright law).

4. General

4.1 Save for the fair use or similar right granted under the applicable copyright law, you may not use, copy, modify, sublicense, or distribute the Specifications or Robots except as expressly provided under this License. Any attempt otherwise to use, copy, modify, sublicense or distribute the Specifications is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance.

4.2 You are not required to accept this License. However, nothing else grants you permission to use, copy, modify or distribute the Specifications or its derivative works or Robots and save for the fair use or similar right granted under the applicable copyright law, these actions are prohibited by law if you do not accept this License. Therefore, by using, copying, modifying or distributing the Specifications (or any work based on the Specifications) or any Robots, you indicate your acceptance of this License to do so, and all its terms and conditions for using, copying, distributing or modifying the Specifications or works based on it.

4.3 Each time you redistribute the Specifications, distribute any modifications of the Specifications, or distribute any Robot, the recipient or purchaser automatically receives a license from EPFL to use, copy, distribute or modify the Specifications subject to the terms and conditions of this License. You may not impose any further restrictions on the recipients' exercise of the rights granted herein.

4.4 EPFL may publish revised and/or new versions of the e-puck Robot Open Source Hardware License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.

Each version is given a distinguishing version number. If the Specifications specify a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by EPFL. If the Specifications do not specify a version number of this License, you may choose any version ever published by EPFL.

5. Disclaimer of Warranty

THE SPECIFICATIONS ARE PROVIDED "AS IS". EPFL DOES NOT MAKE ANY WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING WITHOUT LIMITATION WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE, ABSENCE OF DEFECT AND NON-INFRINGEMENT OF THIRD PARTIES' RIGHTS IN RESPECT OF THE SPECIFICATIONS OR MODIFIED SPECIFICATIONS OR ANY ROBOT(S) THAT MAY BE BUILT OR MANUFACTURED BASED ON THE SPECIFICATIONS OR ON MODIFIED SPECIFICATIONS. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF ROBOTS BUILT OR MANUFACTURED BASED ON THE SPECIFICATIONS OR ON MODIFIED SPECIFICATIONS SHALL BE WITH YOU. SHOULD THE SPECIFICATIONS OR MODIFIED SPECIFICATIONS PROVE TO BE DEFECTIVE, YOU SHALL ASSUME THE COST OF ALL NECESSARY REPAIRS OR CORRECTIONS TO THE ROBOT(S).
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL EPFL OR ANY PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE SPECIFICATIONS AS PERMITTED HEREIN, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE SPECIFICATIONS AND/OR THE ROBOT(S), EVEN IF EPFL OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

END OF LICENSE


Education

The e-puck has been developed as an easy platform for education. Currently it is used in the following courses:

Courses using the e-puck:

  • Biological and Artificial Intelligent Systems (Laboratory of Intelligent Systems, EPFL)
  • Microinformatique I (old) (Laboratoire de Systèmes Robotiques, EPFL)
  • Microinformatique II (old) (Laboratoire de Systèmes Robotiques, EPFL)
  • Robot Systems (Laboratoire de Systèmes Robotiques / Learning Algorithms and Systems Laboratory, EPFL)
  • Mobile Robots (Autonomous Systems Lab / Laboratory of Intelligent Systems, EPFL)
  • Swarm Intelligence (Swarm-Intelligent Systems Group, EPFL)
  • TP de robotique (filière RSA) (Laboratoire de Systèmes Robotiques / Learning Algorithms and Systems Laboratory / Laboratory of Intelligent Systems, EPFL)


A growing number of students also use the e-puck to learn about robotics in general. Most of the labs listed above also offer student projects on the e-puck.


Robot

The e-puck robot has been developed as an open tool. An article illustrating the concept and design is:

Mondada, F., Bonani, M., Raemy, X., Pugh, J., Cianci, C., Klaptocz, A., Magnenat, S., Zufferey, J.-C., Floreano, D. and Martinoli, A. (2009) The e-puck, a Robot Designed for Education in Engineering. Proceedings of the 9th Conference on Autonomous Robot Systems and Competitions, 1(1) pp. 59-65.

The full PDF can be found here.

To make the best use of this tool, it is important to know it well. This section includes general information about the hardware of the robot and its specifications. Full specifications (schematics and more) can be found in the download section (top menu).

epuck with keyboard and mouse

Access the different sections through the submenus on the left.

Mechanics of the e-puck

Simple mechanics

To reach the goals of the project (low price and robustness), the mechanics had to be very simple and elegant. The structure is therefore based on a single part (see the following pictures) that holds the whole robot together.

mechanics of the e-puck main part of e-puck

 

This basic part has a diameter of 70 mm and supports the motors, the printed circuit board and the battery. The battery is located under the motors and is connected to the printed circuit board by two vertical contacts. The battery can easily be extracted and recharged externally.

battery and e-puck extraction of e-puck battery

Motor

The choice of the motors is crucial for this purpose. For the e-puck design we decided to use miniature stepper motors with gear reduction. The motor has 20 steps per revolution and the gear has a reduction of 50:1. The output axis is strong enough to support the wheel directly.

Wheels

The wheel diameter is about 41 mm. The wheel is fixed by a four-tooth chuck; a brass piece is inserted in the plastic wheel, and a green O-ring is used as a tire. The distance between the mounted wheels is about 53 mm. The maximum speed of the wheels is about 1000 steps/s, which corresponds to one wheel revolution per second.
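These figures give the full kinematic chain from step rate to linear speed, which can be sketched as follows (illustrative Python; the constants come from the numbers above, the function name is ours):

```python
import math

STEPS_PER_MOTOR_REV = 20      # motor steps per revolution
GEAR_REDUCTION = 50           # 50:1 gearbox
STEPS_PER_WHEEL_REV = STEPS_PER_MOTOR_REV * GEAR_REDUCTION  # 1000 steps
WHEEL_DIAMETER_M = 0.041      # ~41 mm

def linear_speed(steps_per_s: float) -> float:
    """Linear speed of one wheel in m/s for a given step rate."""
    revs_per_s = steps_per_s / STEPS_PER_WHEEL_REV
    return revs_per_s * math.pi * WHEEL_DIAMETER_M

# At the maximum rate of 1000 steps/s the wheel covers one circumference
# per second, i.e. roughly 0.13 m/s.
print(round(linear_speed(1000), 3))
```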

wheel of e-puck wheel unmounted with motor

Electronic design

The electronics were designed to support a large number of possible uses of the robot in the context of university-level education. The following structure has been implemented in the current version:

 

 

schematics

 

Short description of the main components around the processor:

  • Single power supply. All electronics run at 3.3 V, except for the camera, which needs an additional 1.8 V supply.
  • An extension connector allows new functionality to be added on an extension board.
  • The robot is equipped with an RS232 and a Bluetooth interface to communicate with a host computer.
  • A 16-position rotary switch provides user input, for instance to select a running mode.
  • Reset button and programming connector, as usual.
  • The battery is based on Li-ion technology, has a 5 Wh capacity and is sufficient for about 2-3 hours of intensive use. The battery can be removed and recharged externally. Battery protection is implemented.
  • 8 IR proximity sensors are placed around the robot, unevenly distributed (more sensors at the front).
  • A 3-axis accelerometer is placed inside the robot.
  • Three microphones and a speaker allow sound capture and generation.
  • A forward-looking colour camera with a resolution of 640x480 pixels is placed inside the body. It can be used only through windowing or subsampling.
  • 8 red LEDs are placed around the robot body to display patterns. The camera should be able to see them.
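The quoted battery life is consistent with a quick power estimate (illustrative Python; the average draw figures are assumptions derived from the 5 Wh capacity and the stated 2-3 hours, not measurements):

```python
BATTERY_WH = 5.0  # battery capacity in watt-hours, as stated above

def runtime_hours(avg_power_w: float) -> float:
    """Hours of operation for a given average power draw in watts."""
    return BATTERY_WH / avg_power_w

# An average draw of roughly 1.7-2.5 W matches the stated 2-3 hours.
print(round(runtime_hours(2.5), 1), round(runtime_hours(1.7), 1))
```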

 

Processor

The choice of the processor is crucial for this robot. To allow comfortable programming, the use of standard (GNU) compilers and good computational power, we chose the dsPIC family from Microchip. This microcontroller offers many useful interfaces and a DSP core that allows very efficient data processing. Its clean structure (registers, instruction set) also makes it a very good processor for education.

Sound sensors

The e-puck robot is equipped with three microphones placed as illustrated here:

 

microphones placement on the robot

 

When acquiring all three microphones, the maximum sampling rate is 33 kHz (the A/D converter's maximum frequency of 100 kHz divided by three). A 1 kHz sound wave emitted from a computer speaker in front of the robot is acquired by the microphones as follows:

sound data acquired with a sound coming from front

If the same sound comes from the left, the data acquired is the following:

sound wave with sound coming from left

If the same sound comes from the right, the data acquired looks as following:

sound wave for sound coming from right
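The left/right differences visible above can be turned into a direction estimate: the delay between two microphone signals is found where their cross-correlation peaks. A minimal sketch, assuming the 33 kHz sampling rate mentioned above (pure Python, synthetic signals; not the robot's firmware):

```python
import math

FS = 33000.0  # Hz, maximum rate when sampling all three microphones

def delay_samples(a, b, max_lag):
    """Return the lag (in samples) of b relative to a that maximizes
    their cross-correlation; a positive result means b lags a."""
    n = min(len(a), len(b))
    best_lag, best_corr = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        corr = sum(a[i] * b[i + lag]
                   for i in range(max(0, -lag), min(n, n - lag)))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag

# Synthetic check: a 1 kHz tone delayed by 5 samples at 33 kHz.
tone = [math.sin(2 * math.pi * 1000 * i / FS) for i in range(330)]
delayed = [0.0] * 5 + tone[:-5]
print(delay_samples(tone, delayed, 10))  # prints 5
```

On the robot, the lag between microphone pairs, combined with their known geometry, gives the angle of the sound source.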

The quality of the signal is of course very dependent on the type of sound source. The microphones of the e-puck are not very sensitive, which is why the first tests used a "big" computer speaker. Using a small speaker (another e-puck) placed 130 mm to the left of the robot, the acquired 1 kHz sound looks as follows (note the amplitude and the noise of the signal):

sound wave coming from another e-puck

 

Here you can find the sound definition file that was used with the standard library to generate the sound on the second e-puck.

The FFT of this last signal clearly shows the main frequency (1 kHz) but also a noise component at 11 kHz; this is perhaps due to a resonance peak of the e-puck speaker at 11 kHz.

FFT of the sound from another e-puck

 

Accelerometer

The e-puck robot is equipped with a 3D accelerometer placed as illustrated here:

 

accelerometer placement on the robot with axes direction

 

 

The next figure shows the data acquired when taking the e-puck and putting it back on a table with a circular movement in the YZ plane (the horizontal axis shows the sample number, the vertical axis the values acquired on the three axes):

 

accelerometer data acquired with circular mouvement

 

The next figure shows the data acquired during a random 3D movement (the horizontal axis shows the sample number, the vertical axis the values acquired on the three axes):

accelerometer data with random mouvement

 

When the robot moves, the stepper motors perform a given number of steps per second. Each step causes a strong micro-acceleration and micro-deceleration of the robot. The next figure shows the acceleration measured by the accelerometer, caused by the stepper motor, when moving at 70 steps/s. Sampling is done at 7 kHz; the X axis shows the sample number.

noise on accelerometer data with 70step/s

The next figure shows the acceleration due to the stepper motor when moving at 150 steps/s. Sampling is done at 7 kHz; the X axis shows the sample number.

noise on accelerometer data with 150step/s

Proximity sensors

The e-puck robot is equipped with 8 IR proximity sensors placed as illustrated here:

 

IR proximity placement on the robot

 

 

The next figure shows the raw data acquired from IR0 (a front sensor) when the robot starts against a wall and moves backwards. The X axis shows the distance from the wall in motor steps; 1000 steps correspond to 12.8 cm.

 

 data IR0 ambiant and LEDon when robot going away

 

The next figure shows the computed sensor value (reflected light minus ambient light) acquired from IR0, IR1, IR6 and IR7 (the four front sensors) while the robot moves backwards from a wall. The X axis represents the number of motor steps; 1000 steps correspond to 12.8 cm.

data 4 sensor in front of wall
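The computed value used above (reflected light minus ambient light) and the step-to-distance conversion can be sketched as follows (illustrative Python; the raw readings in the example are hypothetical):

```python
# 1000 motor steps correspond to 12.8 cm of travel.
CM_PER_STEP = 12.8 / 1000.0

def proximity(reflected_raw: int, ambient_raw: int) -> int:
    """Computed sensor value: reflected light minus ambient light.
    Clamped at zero so ambient fluctuations never give negative readings."""
    return max(0, reflected_raw - ambient_raw)

def steps_to_cm(steps: int) -> float:
    """Distance travelled for a given number of motor steps."""
    return steps * CM_PER_STEP

print(proximity(3200, 300))   # prints 2900
print(steps_to_cm(1000))      # ~12.8 cm
```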

The next three figures show noise on the sensor measurement due to the activity of the stepper motor turning at 100 steps/s, 400 steps/s and 800 steps/s. Sampling is done at 64 Hz. This noise comes from the power supply of the e-puck and seems to increase as the battery discharges.

noise on IR sensors when motor speed is 100step/s

noise on IR sensors when motor speed is 400step/s
noise on IR sensors when motor speed is 800step/s

Camera

The camera is a colour camera with a resolution of 640(h)x480(v) pixels. The full data flow generated by this camera cannot be processed by a simple processor like the robot's dsPIC. Moreover, the processor has 8 KB of RAM, which is not even sufficient to store a single image. To acquire camera data, both the image rate and the resolution have to be reduced: typically a 40x40 subsampled colour image can be acquired at 4 frames per second, and without colour up to 8 frames per second, but not more. You can also acquire other image shapes, for instance a line of 480x1 colour pixels on the ground facing the robot.
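The RAM argument can be verified with a quick calculation (illustrative Python; the 2 bytes per colour pixel is an assumption about the pixel format):

```python
RAM_BYTES = 8 * 1024  # dsPIC RAM: 8 KB

def image_bytes(w: int, h: int, bytes_per_pixel: int) -> int:
    """Memory needed to store a w x h image."""
    return w * h * bytes_per_pixel

full = image_bytes(640, 480, 2)  # full-resolution colour frame
sub = image_bytes(40, 40, 2)     # 40x40 subsampled colour frame

print(full, full > RAM_BYTES)    # 614400 True: a full frame cannot fit
print(sub, sub <= RAM_BYTES)     # 3200 True: the subsampled frame fits
```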

Here are some examples of images taken with the e-puck monitor program:

screenshot of image capture

This first image is taken from a distance of 60 cm with a zoom factor of 8 (meaning one pixel out of every 8 is taken). The maximum sampling rate is 4.3 fps.

camera screenshot at 4 zoom

Here the distance is the same as above (60 cm) but the zoom factor is 4 (one pixel out of every 4). The framerate is the same as with zoom 8 (4.3 fps).

screenshot of camera with zoom 1

Here the distance is the same as above (60 cm) but the zoom factor is 1 (every pixel in the region is taken). The framerate is lower (1.3 fps) because the speed is limited by the acquisition rate of the processor.

Grayscale image

The last image is the same but in grayscale: same distance, but with a higher framerate (8.6 fps).

screenshot of camera not square size

You can of course choose a non-square picture (here 80x20) and get more data in one direction.

Communication

Bluetooth and e-puck

The dsPIC, the e-puck's microcontroller, has two UARTs. The first one is connected to a Bluetooth chip and the second one is physically available through a Micromatch connector. The Bluetooth chip, an LMX9820A, gives "transparent" access to the UART through a Bluetooth RFCOMM channel. In this mode, one can access the e-puck as if it were connected to a serial port, except that the LMX9820A sends special commands upon connection and disconnection, which the user application must take into account.

How to communicate with the e-puck

This section is only about communicating with an e-puck running a user program, not using the bluetooth bootloader.

Setting up a fake serial connection using Bluetooth RFCOMM under Linux

In order to set up a fake serial connection to your e-puck using Bluetooth RFCOMM under Linux, you will need the following:

  • a USB Bluetooth dongle
  • a recent 2.4 or 2.6 kernel
  • the following packages: bluez-firmware, bluez-pin, bluez-utils

The lsusb, hciconfig, hcitool and rfcomm commands are very useful for checking that a USB Bluetooth dongle is present on your Linux box and for setting up the connection with the e-puck:

# lsusb                     (list USB devices; check that the dongle is detected)
# hciconfig                 (show the local Bluetooth adapter)
# hcitool scan              (discover nearby devices and their addresses)
# l2ping <bdaddr>           (ping the e-puck at the address found by the scan)
# rfcomm release rfcomm0    (release a previous binding, if any)
# rfcomm bind rfcomm0       (bind /dev/rfcomm0 as defined in rfcomm.conf)
# rfcomm connect rfcomm0    (establish the RFCOMM connection)

You will also need to edit your /etc/bluetooth/rfcomm.conf config file to add entries corresponding to your e-puck robots:

rfcomm0 {
        bind yes;
        device 08:00:17:2C:E0:88;
        channel 1;
        comment "e-puck_0006";
}

And possibly restart the bluetooth service:

# service bluetooth restart

A PIN number will be asked of you (by bluez-pin) when trying to establish the connection. This PIN is a 4-digit number corresponding to the name (or ID) of your e-puck: if your e-puck is called "e-puck_0006", the PIN is 0006.

If you are not asked for a PIN number and you get a message like "Can't connect RFCOMM socket: Connection refused", it may be because the PC is giving the PIN to the e-puck automatically without prompting you. In this case, you should edit your /etc/bluetooth/hcid.conf and change the "security" parameter from "auto" to "user". Then you may need to restart bluez-utils:

# /etc/init.d/bluez-utils restart

Extensions

The e-puck robot can be extended through a bus that carries most of the processor signals as well as some sensor and motor signals, allowing extension boards to add new functionality to the robot.

We invite people to design their own extensions and to share them with the community.

Fly-Vision Turret

flycam07

The Fly-Vision Turret is a top turret specifically designed for experiments related to optic flow (OF). The turret consists of 3 black-and-white 102-pixel linear cameras that can be placed in several different positions. The cameras are connected directly to the e-puck's dsPIC through the extension bus, so no communication over the I2C bus is required.

Printed Circuit Board Layout:

flycam01

Base PCB Features:

  • There are 6 connectors for the 3 cameras, allowing several different camera configurations depending on the task
  • Oscilloscope test points are available for all camera signals and can be used for debugging
  • A connector is available for an optional analogue gyroscope module (to be used for OF calculation) or any other analogue sensor

Camera Configurations:

Three example configurations are shown in the photographs:

  • a configuration for avoiding obstacles during forward motion
  • a configuration for corridor-following algorithms
  • a configuration providing near-omnidirectional vision using a camera pointed backwards

 

Detailed schematics and PCB gerber files for the Fly-Vision Turret are available on the Download Page .

This extension was designed at the Laboratory of Intelligent Systems (LIS) at EPFL.

Ground Sensors

The Ground Sensors extension comprises three infrared (IR) proximity sensors pointing directly at the ground in front of the e-puck. These sensors are mounted on a small PCB which is placed in the slot near the front of the e-puck's plastic base. The PCB also includes a microcontroller that continually samples the IR sensors. The sensor values can then be read by the e-puck through the I2C serial interface.

The Ground Sensors can be used for many different applications, such as following a black line on a white surface, detecting the surface below the robot, or detecting falls. This extension is currently being used for a student exercise in the Bio-Inspired Adaptive Machines course taught by Prof. Dario Floreano at EPFL.
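As a sketch of how the three readings might drive a line follower (illustrative Python; the threshold and speed values are assumptions, not part of the module's specification):

```python
BLACK_THRESHOLD = 300   # hypothetical: readings below this mean "dark surface"

def line_follow_speeds(left: int, centre: int, right: int,
                       base: int = 400) -> tuple:
    """Return (left_wheel, right_wheel) step rates from the three
    ground-sensor readings, steering toward a dark line."""
    on_left = left < BLACK_THRESHOLD
    on_centre = centre < BLACK_THRESHOLD
    on_right = right < BLACK_THRESHOLD
    if on_centre and not (on_left or on_right):
        return (base, base)          # line centred: go straight
    if on_left:
        return (base // 4, base)     # line drifting left: turn left
    if on_right:
        return (base, base // 4)     # line drifting right: turn right
    return (base // 2, base // 2)    # line lost: slow down and search

print(line_follow_speeds(800, 200, 800))  # prints (400, 400)
```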

floor01 The front of the e-Puck with a Ground Sensor module
floor07 A close-up view of the module
floor06 The three IR sensors are spaced out on a horizontal line

Detailed schematics and PCB gerber files for the Ground Sensors can be found on the Download Page .

The user manual (by AAI Canada) of this extension is on the download page also.

This extension was designed at the Laboratory of Intelligent Systems (LIS) at EPFL.

Colour LED Communication Turret

The Colour LED Communication Turret is part of a visual communications system currently being developed by the Laboratory of Intelligent Systems (LIS) at EPFL. The LED Turret is composed of 8 sets of 3-colour Light Emitting Diodes (LEDs), each set containing a blue, red and green LED. These three colours can be mixed together at different strengths to produce hundreds of different colours in any pattern imaginable. The LEDs are all controlled using 24 separate Pulse Width Modulated (PWM) signals generated by an on-board PIC18F6722 (Microchip) microcontroller.


The Colour LED Communication Turret circuit board with side-mounted LEDs

 

The firmware on the PIC can control each LED individually for maximum flexibility, or in a synchronised fashion to send the same signal in every direction. The user simply sets a few parameters (such as the maximum brightness, blinking period and rising slope) through the I2C interface, and the firmware then generates the PWM signals by itself.
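The colour-mixing idea can be sketched as follows (illustrative Python; the 8-bit duty-cycle resolution and the brightness scaling are assumptions, not the actual firmware's behaviour):

```python
PWM_PERIOD = 255  # hypothetical 8-bit duty-cycle resolution

def rgb_to_duty(r: int, g: int, b: int, brightness: float = 1.0):
    """Map an 8-bit RGB colour to three PWM duty cycles (0..PWM_PERIOD),
    scaled by a global brightness factor in [0, 1]."""
    scale = max(0.0, min(1.0, brightness))
    return tuple(int(c * scale) for c in (r, g, b))

# A half-brightness orange: each channel's duty cycle is halved.
print(rgb_to_duty(255, 128, 0, brightness=0.5))  # prints (127, 64, 0)
```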


Some of the many colours that can be produced by the LED Turret

 

led15 The LED Turret can also display different colours on each LED group at the same time

The other part of the visual communication system is an omni-directional camera turret that is used to read signals produced by other robots' LED turrets. This camera turret is currently in prototype development at LIS.

Detailed schematics and PCB gerber files for the LED Turret will soon be posted on the Download Page .

This extension was designed at the Laboratory of Intelligent Systems (LIS) at EPFL.

ZigBee Comm Turret

An e-puck with the communication module

This ZigBee-ready communication turret was designed by Xavier Raemy in the Swarm-Intelligent Systems (SWIS) group of Alcherio Martinoli. It consists of an MSP430 microcontroller running TinyOS and a Chipcon CC2420 2.4 GHz transceiver. The turret communicates with the robot over the I2C bus. An accompanying board allows the module to be programmed via USB.

 

If you find the information presented here useful, or use a communication module based on this design to obtain experimental results, we kindly request that you cite the following paper as the official reference:

Cianci C., Raemy X., Pugh J., and Martinoli A., “Communication in a Swarm of Miniature Robots: The e-Puck as an Educational Tool for Swarm Robotics”. Proc. of the SAB 2006 Workshop on Swarm Robotics, September-October 2006, Rome, Italy. To appear in Lecture Notes in Computer Science (2007).

 

Hardware details

 

The communication module alone

This module is a re-designed custom version of the MoteIV TmoteSky.

The principal goal was to reduce the RF range to as little as a few centimeters, so that interesting experiments can be performed in a space-limited laboratory setting (on a table, for instance). To achieve this, we added an RF attenuator of ~25 dB between the antenna and the CC2420 transceiver. Combined with the ability of the CC2420 to reduce its transmission power at runtime, this gives a software-adjustable range of approximately 15 centimeters to 5 meters.
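The effect of the ~25 dB attenuator can be quantified, since an attenuation in decibels corresponds to a power ratio of 10^(dB/10) (illustrative Python, a quick check rather than a full link budget):

```python
def db_to_power_ratio(db: float) -> float:
    """Power ratio corresponding to an attenuation in decibels."""
    return 10 ** (db / 10)

# The ~25 dB attenuator reduces the radiated power by a factor of ~316.
print(round(db_to_power_ratio(25)))  # prints 316
```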

The module also includes the following hardware features:

  • Software-controlled bypass of the added hardware attenuation.
  • 4 Mbit of flash memory connected to the MCU via SPI.
  • An I2C decoder that allows users to put the MCU in "bootloader programming mode" using an I2C command.
  • The serial lines used for programming the MCU are connected to the Bluetooth UART lines of the e-puck. It is therefore theoretically possible to ask the dsPIC to put the MCU in programming mode using an I2C command, and then reprogram the turret over the robot's Bluetooth connection (not yet tested).
  • 3 LEDs (red, yellow and green).






 

 

Software details

Currently the MCU runs TinyOS 1.x, although it is capable of running TinyOS 2.x.

Because the hardware is not identical to that of the TmoteSky, we had to create a new "platform" within TinyOS, a module for controlling the hardware attenuator, and an I2C slave layer (by default, TinyOS does not support I2C in "slave" or "multi-master" modes). These modifications will eventually need to be ported to TinyOS 2.x.

 

Performance

Range of communication :

  • Min : ~15cm
  • Max:  ~5m

 

Power consumption (3.3V):

  • MCU only : 1.4mW
  • MCU and radio module enabled (receiving) :   76.2mW

 

Omnidirectional Vision Turret

The Omnidirectional Vision Turret (OVT) was developed at the Laboratory of Intelligent Systems (LIS) at EPFL. The purpose of this extension is to provide basic omnidirectional vision capabilities to the e-puck without the expense of complex processing hardware. Using a simple microcontroller, the turret is capable of advanced vision tasks such as distinguishing colour blobs in any direction, finding other robots in its vicinity, and basic navigation.

Vision System

Omnidirectional vision is achieved using a standard perspective camera (the same VGA camera used on the e-puck) looking up at a custom-designed hyperbolic mirror. The mirror was designed to provide a full 360-degree view around the robot and can see up to 5 degrees above the horizon.

An added advantage of the OVT is that it is not a top turret! All the signals from the e-Puck are passed on to a top PCB using card-edge connectors, which allows other turrets to be stacked on top of the OVT.

Hardware

The CMOS camera is connected to a First-In-First-Out (FIFO) frame buffer (Averlogic AL440B-24-PBF), which in turn is connected to a microcontroller (Microchip dsPIC33FJ256GP506). The frame buffer decouples the camera from the microcontroller: the microcontroller is no longer a slave to the camera and can read an image from the frame buffer at whatever pace it requires.

Firmware

Several algorithms have been implemented, including basic drivers for reading an image from the camera, as well as image processing algorithms. There are two ways of reading the image:

  • An entire image can be read in low resolution and stored in the memory of the dsPIC. An image of up to approximately 80x80 pixels can be read this way.
  • The image can be read line by line and processed before the next line is read. This allows the full resolution of the camera to be used (480x480 pixels), but slows down the reading of the image. Some image processing steps also require the entire image to be in the dsPIC's memory.

Some basic image processing algorithms have also been implemented on the dsPIC:

  • Image unwrapping (of both low-res images resident in memory, and high-res images being read line-by-line)
  • Colour thresholding (only on images stored in memory)
  • Edge detection (only on images stored in memory)

The firmware for the OVT will be posted on the download page once the bugs have been ironed out.

Detailed schematics and PCB gerber files for the Omnidirectional Vision Turret can be found on the Download Page.

This extension was designed at the Laboratory of Intelligent Systems (LIS) at EPFL.

Magnetic Wheels

This is a very simple extension consisting of adding magnets to the bottom of the robot or replacing the wheels with magnetic versions.

Here is a video of the system:

 

Here are some details on the implementation: For adhesion on ferromagnetic ceilings, only the wheels have to be modified. The magnetic wheel has the following shape (magnet in red):

 

Image

 

The wheel is fixed to the axle like the standard wheel. The new wheel is made of ferromagnetic iron. A 3D PDF file can be found here; you can take size measurements from this file.

To climb a wall, the standard e-puck motor does not have sufficient torque. For this type of operation, the e-puck's motor (a custom PG15S-020 with a resistance of 18 ohms) needs to be replaced with a more powerful one (the standard PG15S-020 with a resistance of 10 ohms). Our custom and standard motors were purchased from EME in Switzerland at 55 CHF per sample.

Epuck Range and Bearing Board

The e-Puck Range & Bearing board (e-RandB) improves the existing relative localization/communication software library (libIrcom) developed for the e-puck robot and based on its on-board infrared sensors. The e-RandB has been developed in collaboration with Robolabo, Iridia and RBZ Robot Design.

The board allows situated agents to communicate locally, obtaining at the same time both the range and the bearing of the emitter without the need of any centralized control or any external reference. Therefore, the board allows the robots to have an embodied, decentralized and scalable communication system. The system relies on infrared communications with frequency modulation and is composed of two interconnected modules for data and power measurement.

The board can communicate with the e-puck through the UART or the I2C bus. The user can select one of the two buses depending on which is already used by other extension modules. However, different frame rates are achieved depending on the communication bus: with the current software version we obtain 50 msg/s over the UART bus and 150 msg/s over I2C, where each message is made up of a 6-bit header, 16 bits of data and a 4-bit CRC.
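The 6-bit header / 16-bit data / 4-bit CRC budget can be pictured as a 26-bit frame. The bit layout and the checksum below are purely illustrative (the real e-RandB wire format is defined by its firmware); they only show how the three fields fit together:

```c
#include <stdint.h>

/* Hypothetical layout of a 26-bit frame in a 32-bit word:
   bits [25..20] header, [19..4] data, [3..0] CRC. */

/* Toy 4-bit checksum: XOR of the four nibbles of the data word.
   The actual e-RandB CRC polynomial is not specified here. */
static uint8_t crc4(uint16_t data)
{
    return (uint8_t)((data ^ (data >> 4) ^ (data >> 8) ^ (data >> 12)) & 0xF);
}

static uint32_t randb_pack(uint8_t header, uint16_t data)
{
    return ((uint32_t)(header & 0x3F) << 20)
         | ((uint32_t)data << 4)
         | crc4(data);
}

/* Returns 1 if the CRC matches, 0 otherwise. */
static int randb_unpack(uint32_t frame, uint8_t *header, uint16_t *data)
{
    *header = (uint8_t)((frame >> 20) & 0x3F);
    *data   = (uint16_t)((frame >> 4) & 0xFFFF);
    return crc4(*data) == (frame & 0xF);
}
```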

All the information about the e-RandB is provided here

e-RandB

The e-RandB board is currently used at:

  • ROBOLABO, ETSI Telecomunicación, Universidad Politécnica de Madrid, Madrid, Spain
  • IRIDIA, CoDE, Université Libre de Bruxelles, Bruxelles, Belgium
  • IDSIA, University of Applied Sciences of Southern Switzerland, Manno-Lugano, Switzerland
  • LARAL, Institute of Cognitive Sciences and Technologies, CNR, Rome, Italy
  • ISR, Instituto Superior Técnico, Lisbon, Portugal
  • Self-Organizing Systems Research Group, Harvard University, Cambridge, MA, USA

E-RandB Files:

E-RandB Articles:

Its use and capabilities are demonstrated in the following papers:

  • Álvaro Gutiérrez, Alexandre Campo, Marco Dorigo, Félix Monasterio-Huelin and Jesús Donate. Open E-puck Range & Bearing Miniaturized Board for Local Communication in Swarm Robotics. Proceedings of the IEEE International Conference on Robotics and Automation, May 2009, IEEE Press, Piscataway, NJ, In press.
  • Álvaro Gutiérrez, Alexandre Campo, Marco Dorigo, Daniel Amor, Luis Magdalena and Félix Monasterio-Huelin. An Open Localisation and Local Communication Embodied Sensor. Sensors 8(11), pages 7545-7563, November 2008. ISSN: 1424-8220

Software

Software available for the e-puck platform :


Check the Download section for a complete list of software/firmware/electronics files.

Tutorial

Development on the e-puck can be learned through the tutorial.

Libraries

We have developed a few simple libraries providing access to the hardware features. The following libraries are available:

  • e-puck initialisation of basic hardware features
  • management of proximity sensors
  • motor speed control
  • leds management
  • accelerometer reading
  • microphone sampling
  • sound emission
  • image acquisition
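A typical student program combines these libraries into a sense-act loop: read the proximity sensors, compute wheel speeds, set the motors. The decision step can be written as a pure function; the sensor indexing, the weights, and the speed units below are illustrative assumptions, not the official library API:

```c
/* Sketch of a Braitenberg-style obstacle-avoidance rule, as might sit
   between the proximity-sensor and motor-control libraries. The weights
   and BASE_SPEED are illustrative, not taken from the e-puck library. */
#define NB_PROX 8
#define BASE_SPEED 500  /* assumed speed unit */

static void avoid_speeds(const int prox[NB_PROX], int *left, int *right)
{
    /* Assumed sensor layout: 0-2 front-right, 5-7 front-left.
       An obstacle on the front-right slows the left wheel and speeds
       up the right wheel, turning the robot away (and vice versa). */
    static const int wl[NB_PROX] = {-3, -2, -1, 0, 0, 1, 2, 3};
    static const int wr[NB_PROX] = { 3,  2,  1, 0, 0, -1, -2, -3};
    long l = BASE_SPEED, r = BASE_SPEED;
    for (int i = 0; i < NB_PROX; i++) {
        l += (long)wl[i] * prox[i] / 100;
        r += (long)wr[i] * prox[i] / 100;
    }
    *left = (int)l;
    *right = (int)r;
}
```

On the robot, the result of this function would be fed to the motor speed control library each cycle.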

Release versions of the libraries are available at the e-puck snapshot.

Development versions of the libraries are available at gna. If you want to take part in their development, you can join the development mailing list.

Bootloader

We have a bootloader that works through bluetooth.

ASEBA

The ASEBA architecture allows scripts to run on the e-puck microcontroller while keeping good control of the code, with easy debugging and visualization of key variables and sensor data. A bridge between ASEBA and ROS has been implemented, providing a link to one of the best-known robotic programming environments.

Debug

Online hardware debugging has to be done using an external programmer, the ICD2 from Microchip. Simple debugging can also be done by sending information through Bluetooth.

Communication protocol

We have implemented a serial communication protocol to control the e-puck via a serial RS232 line or bluetooth. This allows simple experimentation with code developed on a PC.
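On the PC side, such a protocol boils down to parsing short ASCII commands. The parser below is a sketch only: the actual command set is defined by the e-puck firmware, and the "D,<left>,<right>" motor command used here is an assumed example.

```c
#include <stdio.h>

/* Parses a hypothetical ASCII motor command of the form "D,<left>,<right>".
   Returns 1 on success (filling in the two speeds), 0 otherwise. */
static int parse_speed_cmd(const char *line, int *left, int *right)
{
    if (line[0] != 'D')
        return 0;
    return sscanf(line + 1, ",%d,%d", left, right) == 2;
}
```

The same style of one-letter commands would cover sensor queries and LED control; sending them over the Bluetooth serial port is what makes PC-side experimentation simple.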

Demonstrations

We have a set of demonstrations illustrating the use of several sensors of the robot in typical applications.

Compiler and IDE

The dsPIC processor family is supported by the GCC GNU compiler collection as the PIC30 target.

Windows

Under Windows, the MPLAB environment from Microchip can be used with their C30 compiler, which is a version of GCC for dsPIC with some limitations. Those limitations can be removed by recompiling C30 from its sources.

To connect the ICD2 hardware to the e-puck you need to change the connector and use an e-puck (red, 6-pin) connector. It is a Tyco connector called Micro match, type 7-215083-6.

To upload the .hex files without using Microchip's ICD, you can use the Tiny PIC bootloader.

Unixes

On Unix systems, the piklab environment is available. Its web page provides instructions on how to get or build GCC for PIC30.

In addition, if you are running a Debian derivative (such as Ubuntu), there are instructions here on how to build clean debs of the latest PIC30 compiler (version 2.05). If you have trouble during compilation (particularly on Ubuntu, which uses dash as /bin/sh), please read the rest of the page, as there are further explanations.

To upload the .hex files you have the choice of three Unix programs:

  • epuckuploadbt: a C++ program; requires libbluetooth to be compiled, but is then fully automatic, uses the Bluetooth discovery capabilities and does not require any changes in /dev
  • epuckupload: a Perl program; does not need to be compiled and does not require libbluetooth, but needs a properly set up rfcomm link
  • piklab-prog (from piklab): allows you to program your e-puck using the ICD2. Instructions for the ICD2 under Linux:
                        First, your user must have the rights to access the ICD2 hardware. To achieve this, you simply need to add these udev rules (this requires root privileges):
    
                        valentin@lsro1pc307:~$ cat /etc/udev/rules.d/26-microchip.rules
                        #PICKit
                        SYSFS{idVendor}=="04d8", SYSFS{idProduct}=="0032", MODE="0660", GROUP="microchip"
                        #PICKit2
                        SYSFS{idVendor}=="04d8", SYSFS{idProduct}=="0033", MODE="0660", GROUP="microchip"
                        #ICD2
                        SYSFS{idVendor}=="04d8", SYSFS{idProduct}=="8000", MODE="0660", GROUP="microchip"
                        #ICD21
                        SYSFS{idVendor}=="04d8", SYSFS{idProduct}=="8001", MODE="0660", GROUP="microchip"
    
                        Be sure that your user is added to the microchip group. This can easily be done using the following command (replace user with your actual username):
    
                        valentin@lsro1pc307:~$ sudo gpasswd -a user microchip
                        When this is done, do not forget to restart the udev service and maybe relog so that all changes are effective.
    
                        You can then use piklab-prog to program your e-puck with a .hex file:
    
                        valentin@lsro1pc307:~$ piklab-prog -p icd2 -d 30F6014A --target-self-powered --force -c program e-puck.hex
                        

ePic2

ePic2 is a complete framework to command the e-puck from within Matlab. It allows the user to interact with the e-puck as one would using Windows' HyperTerminal, but improves on it by making the data gathered by the robot available as variables for Matlab's tools.

ePic2 can be used in two ways. The first is through its graphical interface, shown below. The interface offers basic commands for the e-puck and displays the values of the sensors. It also provides a way to execute controllers written in an M-file. The second method uses Matlab functions to command the robot from the Matlab command line. This offers more control than the interface.

ePic2.1 interface

ePic2 is available on the e-puck svn in the tool section. It requires Matlab 7.1 or higher with the Instrument Control Toolbox installed. It is nevertheless recommended to use the most recent version of Matlab to avoid some stability issues. Currently, ePic can be used under Windows and Linux. It does not work under Mac OS, as the functions needed to connect to the e-puck are not supported on that platform.

ePic2 and its documentation can be found in the download section, but it is highly recommended to download them from the tool section of the e-puck svn, which may contain a more recent version.

ePic2 is used for education purposes at EPFL in the course Mobile Robots. ePic is the main tool used for the exercise sessions, whose details, including exercise sheets and source code, can be found on the course website.

This software was developed at the Laboratory of Intelligent Systems at EPFL by Yannick Weibel, Julien Hubert and other contributors.

IR communication

What is it ?

libIrcom is a library that can be used directly on the e-puck robots to achieve local-range infrared communication, also known as range and bearing. libIrcom relies on the infrared sensors of the robots to transmit and receive information. The communication system is multiplexed with the proximity-sensing system commonly used on the robots, so it is possible to both communicate and avoid obstacles.

libIrcom allows communication at a rate of up to 30 bytes per second from sender to receiver, including a 2-bit CRC check in each byte to detect erroneous messages. Messages are encoded using a frequency modulation that permits usage in a wide range of light conditions. Messages can be detected at a distance of up to 25 cm between emitter and receiver. If continuous listening is activated, libIrcom tries to catch as many messages as possible, independently of the cycle duration of your controller. Messages are stored in a queue and can be retrieved at any time, unless they are overwritten when the queue is full.
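The per-byte CRC can be illustrated with a small encode/check pair. The scheme below, which folds the byte's four 2-bit groups modulo 4 and appends the result, is a hypothetical stand-in for libIrcom's actual check; it only shows how 2 extra bits per byte catch single-bit errors:

```c
#include <stdint.h>

/* Illustrative 2-bit checksum: sum the four 2-bit groups of the byte
   modulo 4. Not libIrcom's real CRC. */
static uint8_t crc2(uint8_t byte)
{
    return (uint8_t)(((byte & 3) + ((byte >> 2) & 3) +
                      ((byte >> 4) & 3) + ((byte >> 6) & 3)) & 3);
}

/* A 10-bit word on the wire: 8 data bits followed by the 2-bit check. */
static uint16_t ircom_encode(uint8_t byte)
{
    return (uint16_t)((byte << 2) | crc2(byte));
}

/* Returns 1 if the received word is consistent, 0 otherwise. */
static int ircom_check(uint16_t word)
{
    return crc2((uint8_t)(word >> 2)) == (word & 3);
}
```

Any single flipped bit changes the folded sum by 1 or 2 modulo 4, so it is always detected by this toy scheme.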

Get it right now !

Here you can find the library and two examples. All material is made available under the GNU General Public License (GPL).

How to test the examples ?

Two examples are provided with the libIrcom library: test and synchronize.

  • Windows: open the project files located in the synchronize or test folders. All necessary files are loaded; you can simply compile with the MPLAB IDE and then upload to the robots. We tested compilation with MPLAB 8.02.

  • Linux: before compiling the examples you have to compile all the libraries. Go to the root directory of the library and launch the compilation script by typing:
    > ./make.sh

  • Test example: this example shows the communication between two robots. One will be the emitter and the other the receiver. You choose which is which with the selector switch: 1 -> emitter, 2 -> receiver.

    The emitter continuously transmits a sequence of numbers from 0 to 255 (1 byte).

    The receiver takes each frame and demodulates it. If the frame is decoded correctly, the receiver sends it over Bluetooth along with information about the location of the emitter of the message (orientation and distance). The orientation is expressed in the local frame of the receiver, starting from the camera and going counter-clockwise. The distance is measured between the centres of the two robots.

    Both robots have to be connected to a computer through Bluetooth; press enter to start the communication between them.

    The code for the robots is located in libIrcomXXX/e-puck/src/test/. Go to the directory and type :
    > make

    The binary will be generated in libIrcomXXX/e-puck/bin/. Upload the file to the robots:
    > epuckupload -f test.hex

    The code for the computer side is in libIrcomXXX/pc/ircomTest/. Go to the directory and type:
    > make

    followed by
    > ./ircomTest

    Once you are connected to both robots, press enter and check the transmission.
    Good luck!

  • Synchronize example: this example shows how a group of robots can exchange the directions they are pointing in with their neighbours and end up aligned towards the same direction. The example is based on the libIrcom library. In a complete loop, each robot tries to:

    i)  Send information: because of the number of robots, each robot first sends its own ID, and after that its own direction relative to the robots from which it has received a message.
    ii)  Listen for transmissions and decode the messages. If a message is an ID, the robot stores the direction it came from. If the message is an orientation addressed to it, the robot changes its orientation according to this new information.

    The code for the robots is in libIrcomXXX/e-puck/src/synchronize/. Go to the directory and type:
    > make

    The binary will be generated in libIrcomXXX/e-puck/bin/. Upload the file to the robots:
    > epuckupload -f sync.hex

Troubleshooting

  • Wrong messages received, or messages not received. All robots emit the same infrared signals, so when several robots try to send at the same time you are likely to receive only corrupted messages. In addition, IR communication is somewhat sensitive to light conditions. If you want to get close to zero lost messages, try to work under controlled light that emits little infrared and does not flicker.

  • Timers. libIrcom is CPU-intensive because it monitors the sensors very frequently. The interrupt it uses is set to a high priority, which may supersede others. If you use agendas or any other interrupt, some steps are likely to be lost. To cope with this timing problem, we have implemented a time counter inside libIrcom (see ircomTools.c). libIrcom is the only tool that can give you accurate timings.

  • Messages going through a robot or blocked by a wall. Infrared is light, so a message cannot be sent across an opaque material. Given that e-pucks are transparent, it is possible for a message to go across a robot. Do not assume that a robot necessarily prevents messages from going through; make some tests, and restrict the communication range if needed.


More information

Find more information on https://www.e-puck.org, or ask the e-puck community on the GNA mailing lists: https://gna.org/projects/e-puck/

Contact the authors

We are interested in your work and would be happy to know what kind of exciting things you have achieved with this tool. Let us know!
For specific requests you may also contact the authors of this software:

Alexandre Campo : alexandre (dot) campo (at) ulb.ac.be
Alvaro Gutierrez : aguti (at) etsit.upm.es
Valentin Longchamp : valentin (dot) longchamp (at) epfl.ch

Player driver

A Player driver has been developed by Renato Garcia and is available here.


Simulators

Three simulators are available with e-puck modules:

Enki

Enki is an open-source, fast 2D physics-based robot simulator written in C++. It can simulate the kinematics, collisions, sensors and cameras of robots evolving on a flat surface, and provides limited support for friction. It can simulate groups of robots hundreds of times faster than real time on a modern desktop computer.

ENKI webpage here

Webots

The Webots mobile robot simulation software now includes a model of the e-puck robot and a Bluetooth remote-control interface working on Windows, Mac OS X and Linux.

e-puck simulation in Webots

A model of the e-puck robot is included in the Webots simulator.


The real and simulated e-puck control window in Webots

 

The e-puck.wbt model features accurate physics simulation (including inertia, friction, etc.).

A Bluetooth connection between Webots and the real e-puck robot allows you to remote-control the real robot from your Webots controller programs (currently works under Linux, Windows and Mac OS X).

We are also working on supporting cross-compilation of Webots controller programs so that you can execute the code directly on the robot's microcontroller.

This software is being improved regularly. The current version models the following devices:

device           number   simulation   remote-control   botstudio   cross-compilation
DistanceSensor   8
LightSensor      8
LED              9
Wheels           2
Camera           1
Accelerometer    3
Bluetooth        1
Speaker          1
Microphone       3

V-REP

V-REP is a 3D robot simulator based on a distributed control architecture: control programs (or scripts) can be directly attached to scene objects and run simultaneously in a threaded or non-threaded fashion. This makes V-REP very versatile and ideal for multi-robot applications, and allows users to model robotic systems in a similar fashion to reality, where control is usually also distributed.

Complex simulations can be created using the various supported functionality, including:

- Support for 2 physics engines (Bullet and ODE)
- Full forward/inverse kinematics solver (handles any type of mechanism)
- Collision detection calculations
- Minimum distance calculations (mesh-mesh distance)
- Path planning (holonomic tasks in 2-6 dimensions and non-holonomic tasks for car-like vehicles)
- Proximity sensor simulation (exact minimum distance calculation within a customizable detection volume)
- Camera-like sensor simulation (fully customizable, with image processing capabilities)
- Various integrated edit modes (mesh edit modes, path edit mode, and custom UI edit mode)
- CAD data import/export (DXF, OBJ, 3DS or STL)
- Simulation data recording, graphing and export
- Various extension mechanisms (through scripts, plugins or a custom client application)

 

VREP screenshot

 

Here is a video of the e-puck model running:

V-REP has various pricing and licensing schemes. Students can use V-REP on their private computers free of charge.


Accessories

Here are listed some accessories that can be used with the e-puck robot.

Infra-Red Remote Control

The e-puck robot is equipped with an IR remote-control receiver. If you want to send it commands, you need a remote control that generates RC5 signals. For those who do not want to try every remote control, here is one that has been tested and works:

rc univers15 remote

Some details:

Model name: RC UNIVERS15
Length: 130 mm
Initialisation: press "SET" and "TV" then enter code 026

Where to buy: https://www.donberg.ie
Product page: https://www.donberg.ie/descript/r/rcuies15.htm
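For reference, an RC5 frame is 14 bits sent MSB first: two start bits, a toggle bit that flips on each new key press, 5 address bits and 6 command bits. Once a receiver has shifted a complete frame into a 14-bit word, the fields separate as follows:

```c
#include <stdint.h>

/* Decoded fields of a 14-bit RC5 frame. */
typedef struct { uint8_t toggle, address, command; } rc5_t;

/* Splits a raw 14-bit RC5 word (start bits in bits 13-12, toggle in
   bit 11, address in bits 10-6, command in bits 5-0). */
static rc5_t rc5_decode(uint16_t frame)
{
    rc5_t r;
    r.toggle  = (uint8_t)((frame >> 11) & 0x01);
    r.address = (uint8_t)((frame >> 6)  & 0x1F);
    r.command = (uint8_t)(frame & 0x3F);
    return r;
}
```

For example, the word 0x380C carries toggle 1, address 0 (the TV address class) and command 12.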

Transport box

To safely transport your e-puck around, we use the following box:

box for e-puck

 

Some details:

Article name (in german): Schraubdosen - Polypropylen (PP)
Diameter: 90 mm
Height: 80 mm
Content: 500 ml

Article number: 2066
Price: 1.30 CHF/box

Where to buy: https://eshop.semadeni.com/
Product page:

FRENCH:
https://eshop.semadeni.com/?&country=CH&language=FR&action=search&searchtype=3&search=2066

GERMAN:
https://eshop.semadeni.com/?&country=CH&language=DE&action=search&searchtype=3&search=2066


Download

Documentation

Tutorials

Software

Library

Programs

e-Puck

Electronics

Mechanics

Extensions

Colour LED Communication Turret

Fly-Vision Turret

Ground Sensors

Omnidirectional Vision Turret

Video

e-puck


Where To Buy

Several companies are selling the e-puck robot:

GCtronic, a Swiss company, is manufacturing and selling the e-puck educational mobile robot directly or through distributors all over the world.

Europe

Asia

For Taiwan please contact Pitotech.

Clients in mainland China can contact BJROBOT TECHNOLOGY CO., LTD.

For other countries you can contact GCtronic directly.


Publications

The reference publication on the e-puck robot is:

Mondada, F., Bonani, M., Raemy, X., Pugh, J., Cianci, C., Klaptocz, A., Magnenat, S., Zufferey, J.-C., Floreano, D. and Martinoli, A. (2009) The e-puck, a Robot Designed for Education in Engineering. Proceedings of the 9th Conference on Autonomous Robot Systems and Competitions, 1(1) pp. 59-65.

Publications that mention the e-puck robot are listed below in chronological order. If you have a publication using the e-puck that is not listed here, please send the BibTeX information to info@gctronic.com.

 

[] Mattias Jacobsson, Ylva Fernaeus, and Lars Erik Holmquist. Glowbots: designing and implementing engaging human-robot interaction, 2008-06. [ bib ]
[] Nikolaus Correll, Christopher M. Cianci, Xavier Raemy, and Alcherio Martinoli. Self-organized embedded sensor/actuator networks for ""smart"" turbines. 2006. [ bib ]
[] S. Ljungblad, K. Walter, M. Jacobsson, and L.E. Holmquist. Designing personal embodied agents with personas. In ROMAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication, pages 575--580, 2006. [ bib | DOI ]
[] Christopher M. Cianci, Xavier Raemy, Jim Pugh, and Alcherio Martinoli. Communication in a swarm of miniature robots: The e-puck as an educational tool for swarm robotics. In Erol Şahin, William M. Spears, and Alan F. T. Winfield, editors, Swarm Robotics, pages 103--115, Berlin, Heidelberg, 2007. Springer Berlin Heidelberg. [ bib ]
[] Alberto Acerbi, Davide Marocco, and Stefano Nolfi. Social facilitation on the development of foraging behaviors in a population of autonomous robots. In Fernando Almeida e Costa, Luis Mateus Rocha, Ernesto Costa, Inman Harvey, and António Coutinho, editors, Advances in Artificial Life, pages 625--634, Berlin, Heidelberg, 2007. Springer Berlin Heidelberg. [ bib ]
[] O. Gigliotta and S. Nolfi. Formation of spatial representations in evolving autonomous robots. In IEEE Symposium on Artificial Life, 2007. ALIFE'07, pages 171--178, 2007. [ bib ]
[] Yong Xu, Matthieu Guillemot, and Toyoaki Nishida. An experiment study of gesture-based human-robot interface. In 2007 IEEE/ICME International Conference on Complex Medical Engineering, pages 457--463, 2007. [ bib | DOI ]
[] Sara Ljungblad and Lars Erik Holmquist. Transfer scenarios: Grounding innovation with marginal practices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '07, page 737–746, New York, NY, USA, 2007. Association for Computing Machinery. [ bib | DOI | https ]
[] Christopher M. Cianci, Thomas Lochmatter, Jim Pugh, and Alcherio Martinoli. Toward multi-level modeling of robotic sensor networks: A case study in acoustic event monitoring. In Proceedings of the 1st International Conference on Robot Communication and Coordination, RoboComm '07. IEEE Press, 2007. [ bib ]
[] Davide Marocco and Alberto Acerbi. Adaptation and social facilitation in a population of autonomous robots. In Proceedings of the Seventh International Conference on Epigenetic Robotics, pages 85--91, Lund, 2007. LUCS. [ bib ]
[] Mattias Jacobsson, Sara Ljungblad, Johan Bodin, Jeffrey Knurek, and Lars Holmquist. Glowbots: Robots that evolve relationships. 01 2007. [ bib | DOI ]
[] Herianto Herianto, Toshiki Sakakibara, and Daisuke Kurabayashi. Artificial pheromone system using rfid for navigation of autonomous robots. Journal of Bionic Engineering - J BIONIC ENG, 4:245--253, 12 2007. [ bib | DOI ]
[] Giovanni Pini, Elio Tuci, and Marco Dorigo. Evolution of social and individual learning in autonomous robots. In IN: ECAL WORKSHOP: SOCIAL LEARNING IN EMBODIED AGENTS, 2007. [ bib ]
[] Farzad Rastegar and Majid Nili Ahmadabadi. Grounding abstraction in sensory experience. In 2007 IEEE/ASME international conference on advanced intelligent mechatronics, pages 1--8, 2007. [ bib | DOI ]
[] Jim Pugh and Alcherio Martinoli. Parallel learning in heterogeneous multi-robot swarms. In 2007 IEEE Congress on Evolutionary Computation, pages 3839--3846, 2007. [ bib | DOI ]
[] Jim Pugh and Alcherio Martinoli. Inspiring and modeling multi-robot search with particle swarm optimization. In 2007 IEEE Swarm Intelligence Symposium, pages 332--339, 2007. [ bib | DOI ]
[] Pierre Roduit, Alcherio Martinoli, and Jacques Jacot. A quantitative method for comparing trajectories of mobile robots using point distribution models. In 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 2441--2448, 2007. [ bib | DOI ]
[] Saeed Amizadeh, Majid Nili Ahmadabadi, Babak N. Araabi, and Roland Siegwart. A bayesian approach to conceptualization using reinforcement learning. In 2007 IEEE/ASME international conference on advanced intelligent mechatronics, pages 1--7, 2007. [ bib | DOI ]
[] Jim Pugh and Alcherio Martinoli. The cost of reality: Effects of real-world factors on multi-robot search. In Proceedings 2007 IEEE International Conference on Robotics and Automation, pages 397--404, 2007. [ bib | DOI ]
[] A. Acerbi, F. Cecconi, D. Marocco, and S. Zappacosta. Individual vs Social Learning in a Population of Autonomous Robots. In Proceedings of the ECAL, pages 10--14, 2007. [ bib ]
[] Louis-Emmanuel Martinet, Benjamin Fouque, Jean-Baptiste Passot, Jean-Arcady Meyer, and Angelo Arleo. Modelling the cortical columnar organisation for topological state-space representation, and action planning. In Minoru Asada, John C. T. Hallam, Jean-Arcady Meyer, and Jun Tani, editors, From Animals to Animats 10, pages 137--147, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ]
[] Jim Pugh and Alcherio Martinoli. Distributed adaptation in multi-robot search using particle swarm optimization. In Minoru Asada, John C. T. Hallam, Jean-Arcady Meyer, and Jun Tani, editors, From Animals to Animats 10, pages 393--402, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ]
[] Dieter Vanderelst and Emilia Barakova. Autonomous parsing of behavior in a multi-agent setting. In Leszek Rutkowski, Ryszard Tadeusiewicz, Lotfi A. Zadeh, and Jacek M. Zurada, editors, Artificial Intelligence and Soft Computing -- ICAISC 2008, pages 1198--1209, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ]
[] Winai Chonnaparamutt and Emilia I. Barakova. Robot simulation of sensory integration dysfunction in autism with dynamic neural fields model. In Leszek Rutkowski, Ryszard Tadeusiewicz, Lotfi A. Zadeh, and Jacek M. Zurada, editors, Artificial Intelligence and Soft Computing -- ICAISC 2008, pages 741--751, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ]
[] Takaaki Shimone, Daisuke Kurabayashi, Kunio Okita, and Tetsuro Funato. Implementation of Formation Transition System Using Synchronization in a Mobile Robot Group, pages 423--432. Springer Berlin Heidelberg, Berlin, Heidelberg, 2008. [ bib | DOI | https ]
[] Olivier Michel, Fabien Rohrer, and Yvan Bourquin. Rat's Life: A Cognitive Robotics Benchmark, pages 223--232. Springer Berlin Heidelberg, Berlin, Heidelberg, 2008. [ bib | DOI | https ]
[] Stanislav Slušný, Roman Neruda, and Petra Vidnerová. Rule-based analysis of behaviour learned by evolutionary and reinforcement algorithms. In De-Shuang Huang, Donald C. Wunsch, Daniel S. Levine, and Kang-Hyun Jo, editors, Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence, pages 284--291, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ]
[] F. Mondada, M. Bonani, X. Raemy, J. Pugh, C. Cianci, A. Klaptocz, S. Magnenat, J.-C. Zufferey, D. Floreano, and A. Martinoli. The e-puck, a robot designed for education in engineering. The International Journal of Advanced Robotic Systems, 6(4):11--29, 2009. [ bib | DOI ]
[] M. Bonani, J. Pugh, A. G. F. Silva, and A. Martinoli. Design and implementation of an open-hardware/open-software platform for cooperative distributed robotics. In 2009 IEEE International Conference on Robotics and Automation, pages 1076--1081, 2009. [ bib | DOI ]
[] S. Magnenat and F. Mondada. Aseba: A distributed and embedded programming architecture for swarm robotics. In F. Groen, S. M. I. A. M. de Jong, and W. J. M. Wiegerinck, editors, ICT. Open Access ICT Journal., pages 411--420, 2009. [ bib ]
[] Stefano Nolfi, Alberto Acerbi, and Davide Marocco. Evolutionary swarm robotics: A review. In A. P. Agogino, K. A. De Jong, P. J. Bentley, and S. K. Ong, editors, Computational Intelligence in Engineering, pages 191--214, Berlin, Heidelberg, 2011. Springer Berlin Heidelberg. [ bib ]
[] Robot for learning about robotic and evolutionary algorithms. In 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pages 3226--3231, 2012. [ bib | DOI ]
[] Andrés González and Jesús G. González. Social interaction test between a rat and a robot: A pilot study. International Journal of Advanced Robotic Systems, 13(1):4, 2016. [ bib | DOI | arXiv | www: ]
[] Software infrastructure for e-puck (and TAM). 2016. [ bib ]
[] Yuri K. Lopes, Stefan M. Trenkwalder, André B. Leal, Tony J. Dodd, and Roderich Groß. Supervisory control theory applied to swarm robotics. Swarm Intelligence, 10(1):65--97, Mar 2016. [ bib | DOI ]
[] Learning distributed control policies for multi-robot teams: A comparison of three approaches. IEEE Access, 5:18175--18186, 2017. [ bib | DOI ]
[] T. Kargar Tasooji and H. J. Marquez. Cooperative localization in multirobot systems using event-triggered communication: The centralized case. IEEE Transactions on Control Systems Technology, 27(3):967--979, 2019. [ bib | DOI ]
[] Dara Mohammad and Sanaz Mostaghim. On the communication capacity of a small mobile robot: A real-world experiment. In Proceedings of the 2019 Genetic and Evolutionary Computation Conference Companion, GECCO '19, pages 209--210, New York, NY, USA, 2019. Association for Computing Machinery. [ bib | DOI | https ]
[] Active multi-robot simultaneous localization and mapping (SLAM) based on coverage maximization and mutual information. IEEE Access, 6:58843--58855, 2018. [ bib | DOI ]
[] Yating Zheng, Cristián Huepe, and Zhangang Han. Experimental capabilities and limitations of a position-based control algorithm for swarm robotics. International Journal of Advanced Robotic Systems, 17(3):1729881420930418, 2020. [ bib | DOI | arXiv | www: ]
[] Mehmet Dinçer Erbaş and İsmail Hakkı Parlak. Evaluation of vocal communication in a robot collective. Düzce Üniversitesi Bilim ve Teknoloji Dergisi, 8:2029--2040, 2020. [ bib | DOI ]
[] Valeria Villani, Beatrice Capelli, Cristian Secchi, Cesare Fantuzzi, and Lorenzo Sabattini. Humans interacting with multi-robot systems: a natural affect-based approach. Autonomous Robots, 44(3):601--616, Mar 2020. [ bib | DOI ]
[] Qihao Shan and Sanaz Mostaghim. Discrete collective estimation in swarm robotics with distributed bayesian belief sharing. Swarm Intelligence, Sep 2021. [ bib | DOI | https ]
[] Eduardo Castelló Ferrer, Thomas Hardjono, Alex Pentland, and Marco Dorigo. Secure and secret cooperation in robot swarms. Science Robotics, 6(56):eabf1538, 2021. [ bib | DOI | arXiv | www: ]
[] T. Kargar Tasooji and H. J. Marquez. Cooperative localization in multirobot systems using decentralized event-triggered communication: With and without timestamps mechanism. IEEE Transactions on Industrial Electronics, 70(6):5982--5993, 2023. [ bib | DOI ]
[] Tohid Kargar Tasooji and Horacio J. Marquez. Decentralized event-triggered cooperative localization in multirobot systems under random delays: With/without timestamps mechanism. IEEE/ASME Transactions on Mechatronics, 28(1):555--567, 2023. [ bib | DOI ]
[] Tohid Kargar Tasooji, Sakineh Khodadadi, and Horacio J. Marquez. Event-based secure consensus control for multirobot systems with cooperative localization against dos attacks. IEEE/ASME Transactions on Mechatronics, 29(1):715--729, 2024. [ bib | DOI ]
[] Tohid Kargar Tasooji and Sakineh Khodadadi. Multi-robot coordination under physical limitations, 2025. [ bib | arXiv | https ]

This file was generated by bibtex2html 1.99.