## Spirit of the e-puck project
This project was started at the École Polytechnique Fédérale de Lausanne (EPFL) as a collaboration between the Autonomous Systems Lab, the Swarm-Intelligent Systems Group and the Laboratory of Intelligent Systems.

## An educational robot
The main goal of this project is to develop a miniature mobile robot for educational purposes at university level. To achieve this goal the robot needs, in our opinion, the following features:
## An open hardware platform
To help create a community inside and outside EPFL, the project is based on an open hardware concept: all documents are distributed under a license allowing everyone to use the platform and develop for it.

## Status of the project
A first set of 20 robots (e-puck version 1) was produced in November 2004 and tested in the courses "Microinformatique I" and "Microinformatique II". The results of these tests were used to create a new version of the robot (e-puck version 2) during summer 2005. We produced 400 units of this final version at the beginning of 2006. Version 2 is incompatible with version 1, and we therefore discourage everybody from starting to work on version 1. Version 2 includes many improvements and is stable.

## Distribution
Two manufacturing companies have started the production of the e-puck and several distributors have started to sell it. We list them on this site in the section "Where to buy". If you are interested in producing / distributing this robot, please let Francesco Mondada or Alcherio Martinoli know.
The e-puck has been developed as an easy platform for education. Currently it is used in the following courses:
| Courses using the e-puck | Laboratory | School |
|---|---|---|
| Biological and Artificial Intelligent Systems | Laboratory of Intelligent Systems | EPFL |
| Microinformatique I (old) | Laboratoire de Systèmes Robotiques | EPFL |
| Microinformatique II (old) | Laboratoire de Systèmes Robotiques | EPFL |
| Robot Systems | Laboratoire de Systèmes Robotiques / Learning Algorithms and Systems Laboratory | EPFL |
| Mobile Robots | Autonomous Systems Lab / Laboratory of Intelligent Systems | EPFL |
| Swarm Intelligence | Swarm-Intelligent Systems Group | EPFL |
| TP de robotique (filière RSA) | Laboratoire de Systèmes Robotiques / Learning Algorithms and Systems Laboratory / Laboratory of Intelligent Systems | EPFL |
More and more students are also using the e-puck to learn about robotics in general. Most of the labs listed above also offer student projects on the e-puck.
The e-puck robot has been developed as an open tool. An article illustrating the concept and design is:
Mondada, F., Bonani, M., Raemy, X., Pugh, J., Cianci, C., Klaptocz, A., Magnenat, S., Zufferey, J.-C., Floreano, D. and Martinoli, A. (2009) The e-puck, a Robot Designed for Education in Engineering. Proceedings of the 9th Conference on Autonomous Robot Systems and Competitions, 1(1) pp. 59-65.
The full PDF can be found here.
To make the best use of this tool, it is important to know it well. This section includes general information about the robot's hardware and its specifications. Full specifications (schematics and more) can be found in the download section (top menu).
Access the individual sections through the submenus on the left.
## Mechanics of the e-puck
## Simple mechanics
To reach the goals of the project (low price and robustness), the mechanics have to be very simple and elegant. The structure is therefore based on one single part (following pictures) that supports the whole robot.
This basic part has a diameter of 70 mm and ensures the support of the motors, the printed circuit and the battery. The battery is located under the motors and is connected to the printed circuit by two vertical contacts. The battery can be easily extracted and recharged externally.
## Motors
The choice of motors is crucial for this purpose. For the e-puck design we decided to use miniature stepper motors with gear reduction. The motor has 20 steps per revolution and the gearbox a reduction of 50:1. The output axis is strong enough to carry the wheel directly.

## Wheels
The wheel diameter is about 41 mm. The wheel is fixed by a four-tooth chuck; a brass insert is moulded into the plastic wheel, and a green O-ring is used as the tire. The distance between the mounted wheels is about 53 mm. The maximum speed of the wheels is about 1000 steps/s, which corresponds to one wheel revolution per second.
## Electronic design
The electronics were designed to support a large number of possible uses of the robot in the context of university-level education. The following structure has been implemented in the current version:
Short description of the main components around the processor:
## Processor
The choice of processor is crucial for this robot. To enable convenient programming, the use of standard compilers (GNU) and good computational power, we chose the dsPIC family from Microchip. This microcontroller offers many useful interfaces and has a DSP core allowing very efficient data processing. Its clean architecture (registers, instruction set) also makes it a good processor for education.
## Sound sensors
The e-puck robot is equipped with three microphones, placed as illustrated here:
When acquiring all three microphones, the maximal sampling rate is 33 kHz per microphone (the A/D converter's maximal frequency of 100 kHz divided by three). A 1 kHz sound wave emitted from a computer speaker in front of the robot is acquired by the microphones as follows: If the same sound comes from the left, the acquired data is the following: If the same sound comes from the right, the acquired data looks as follows: The quality of the signal is of course very dependent on the type of sound source. The microphones of the e-puck are not very sensitive; this is why the first tests used a "big" computer speaker. With a small speaker (another e-puck) placed 130 mm to the left of the robot, the acquired 1 kHz sound looks as follows (please note the amplitude and the noise of the signal):
Here you can find the sound definition file that was used with the standard library to generate the sound on the second e-puck. The FFT of this last signal clearly shows the main frequency (1 kHz), but also a noise component at 11 kHz; this is perhaps due to a resonance peak of the e-puck speaker at 11 kHz.
## Accelerometer
The e-puck robot is equipped with a 3D accelerometer, placed as illustrated here:
The next figure shows the data acquired when picking up the e-puck and putting it back on a table with a circular movement in the YZ plane (the horizontal axis shows the sample number, the vertical axis the values acquired on the three axes):
The next figure shows the data acquired during a random 3D movement (the horizontal axis shows the sample number, the vertical axis the values acquired on the three axes):
When the robot moves, the stepper motors perform a given number of steps per second. At each step there is a strong micro-acceleration and micro-deceleration of the robot. The next figure shows the acceleration measured by the accelerometer, due to the stepper motors, when moving at 70 steps/s; sampling is made at 7 kHz, with the sample number on the X axis. The figure after that shows the acceleration due to the stepper motors when moving at 150 steps/s, with the same sampling rate and axes.
## Proximity sensors
The e-puck robot is equipped with 8 IR proximity sensors, placed as illustrated here:
The next figure shows the raw data acquired from IR0 (a front sensor) when the robot starts against a wall and moves backwards. The X axis shows the distance from the wall in motor steps; 1000 steps correspond to 12.8 cm.
The next figure shows the computed sensor information (reflected light minus ambient light) acquired from IR0, IR1, IR6 and IR7 (the four front sensors) while the robot moves backwards from a wall. The X axis represents the distance in motor steps; 1000 steps correspond to 12.8 cm. The three figures after that show some noise on the sensor measurement due to the activity of the stepper motors turning at 100 steps/s, 400 steps/s and 800 steps/s. Sampling is made at 64 Hz. This noise comes from the power supply of the e-puck and seems to increase as the battery discharges.
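The "reflected light minus ambient light" computation used for the figures above can be sketched as a tiny helper; the function name is illustrative, and the real e-puck library exposes its own accessors for the two raw readings:

```c
/* Ambient-compensated proximity value: subtract the ambient IR level
 * (measured with the emitter off) from the reflected reading (emitter
 * on), and clamp at zero since noise can push the difference negative. */
int proximity_value(int reflected_raw, int ambient_raw) {
    int v = reflected_raw - ambient_raw;
    return v > 0 ? v : 0;
}
```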
## Camera
The camera is a colour camera with a resolution of 640 (h) x 480 (v) pixels. The full data flow this camera generates cannot be processed by a simple processor like the dsPIC on the robot. Moreover, the processor has 8 kB of RAM, not even sufficient to store a single full image. To acquire camera data, both the image rate and the resolution have to be reduced. Typically we can acquire a 40x40 subsampled colour image at 4 frames per second; without colour we can go up to 8 frames per second, but not more. You can also acquire other image formats, for instance a line of 480x1 colour pixels on the ground facing the robot. Here are some examples of images taken with the e-puck monitor program: The first image is taken from a distance of 60 cm with a zoom factor of 8 (meaning one pixel in every 8 is kept); maximum sampling is 4.3 fps. The second uses the same distance (60 cm) but a zoom factor of 4 (one pixel in every 4); the frame rate is the same as for zoom 8 (4.3 fps). The third uses the same distance but a zoom factor of 1 (every pixel in the region is taken); the frame rate is lower (1.3 fps) because the speed is limited by the acquisition rate of the processor. The last image is the same but in grayscale, at the same distance but a higher frame rate (8.6 fps). You can of course choose a non-square picture (here 80 x 20) and get more data in one direction.
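The arithmetic behind the subsampling constraint is worth making explicit: a full colour VGA frame is far larger than the dsPIC's 8 kB of RAM, while a 40x40 subsampled frame fits easily. A sketch assuming 2 bytes per colour pixel (RGB565) and 1 byte per grayscale pixel; the exact pixel format is an assumption here:

```c
/* Bytes needed to buffer one frame of the given geometry. */
unsigned image_bytes(unsigned w, unsigned h, unsigned bytes_per_pixel) {
    return w * h * bytes_per_pixel;
}

/* full VGA colour frame: 640*480*2 = 614400 bytes, ~75x the 8 kB RAM */
/* 40x40 colour frame:    40*40*2  =   3200 bytes, fits comfortably   */
/* 40x40 grayscale frame: 40*40*1  =   1600 bytes, hence ~2x the fps  */
```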
## Communication

### Bluetooth and the e-puck
The dsPIC, the e-puck's microcontroller, has two UARTs. The first one is connected to a Bluetooth chip and the second one is physically available through a Micromatch connector. The Bluetooth chip, an LMX9820A, can be used to access the UART "transparently" over a Bluetooth RFCOMM channel. In this mode, one can access the e-puck as if it were connected to a serial port, with the exception that the LMX9820A sends special commands upon connection/disconnection; the user application must take them into account.

### How to communicate with the e-puck
This section is only about communicating with an e-puck running a user program, not about using the Bluetooth bootloader.

### Setting up a serial connection over Bluetooth RFCOMM under Linux
In order to set up a serial connection to your e-puck using Bluetooth RFCOMM under Linux, you will need the following:
The lsusb, hciconfig, hcitool and rfcomm commands are very useful to check for a USB Bluetooth dongle connected to your Linux box and to set up the connection with the e-puck:

# lsusb

You will also need to edit your /etc/bluetooth/rfcomm.conf config file to add entries corresponding to your e-puck robots:

rfcomm0 {

And possibly restart the bluetooth service:

# service bluetooth restart

A PIN will be requested (by bluez-pin) when trying to establish the connection. This PIN is a 4-digit number corresponding to the name (or ID) of your e-puck; i.e. if your e-puck is called "e-puck_0006", the PIN is 0006. If you are not asked for a PIN and you get a message like "Can't connect RFCOMM socket: Connection refused", it may be because the PC gives the PIN to the e-puck automatically without prompting you. In this case, edit your /etc/bluetooth/hcid.conf and change the "security" parameter from "auto" to "user". Then you may need to restart bluez-utils:

# /etc/init.d/bluez-utils restart
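Once the RFCOMM binding is in place, the e-puck looks like any serial device, so a PC-side program can open it with the standard POSIX termios API. A minimal sketch; the device path and the 115200 baud rate are assumptions to be matched to your rfcomm.conf entry and firmware configuration:

```c
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

/* Open the RFCOMM device bound to the e-puck (e.g. "/dev/rfcomm0")
 * and configure it as a raw serial port. Returns the fd, or -1. */
int open_epuck(const char *dev) {
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return -1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);              /* raw mode: no line editing or echo */
    cfsetispeed(&tio, B115200);   /* baud rate: an assumption, adapt it */
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}
```

Remember that the LMX9820A sends its own status messages on connection and disconnection, so the first bytes read from the port may not come from your user program.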
The e-puck robot can be extended through a bus that includes most of the processor's signals and some signals of the sensors and motors. This makes it possible to:
We invite people to design their own extensions and to share them with the community.
## Fly-Vision Turret
Printed Circuit Board Layout:
Camera Configurations:
Detailed schematics and PCB gerber files for the Fly-Vision Turret are available on the Download Page.
## Ground Sensors
The Ground Sensors extension comprises three infrared (IR) proximity sensors pointed directly at the ground in front of the e-puck. These sensors are mounted on a small PCB which is placed in the slot near the front of the e-puck's plastic base. The PCB also includes a microcontroller that continually samples the IR sensors; the sensor values can then be read by the e-puck through the I2C serial interface. The Ground Sensors can be used for many different applications, such as following a black line on a white surface, surface detection below the robot, or fall detection. This extension is currently being used for a student exercise in the Bio-Inspired Adaptive Machines course taught by Prof. Dario Floreano at EPFL.
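The line-following application mentioned above reduces to a small decision function over the three sensor values. A toy sketch; the polarity (lower reading = darker ground) and the threshold are assumptions to be calibrated against your surface, and the function name is illustrative:

```c
/* Line-following decision from the three ground sensors.
 * Returns -1 to steer left, +1 to steer right, 0 to go straight. */
int line_steer(unsigned left, unsigned center, unsigned right,
               unsigned dark_threshold) {
    int l = left   < dark_threshold;   /* sensor sees the dark line? */
    int c = center < dark_threshold;
    int r = right  < dark_threshold;

    if (c && !l && !r) return 0;  /* line centered: go straight      */
    if (l)             return -1; /* line drifting left: steer left  */
    if (r)             return 1;  /* line drifting right: steer right */
    return 0;                     /* line lost: simplified fallback  */
}
```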
Detailed schematics and PCB gerber files for the Ground Sensors can be found on the Download Page. The user manual of this extension (by AAI Canada) is on the download page as well.
## Colour LED Communication Turret
The Colour LED Communication Turret is part of a visual communication system currently being developed by the Laboratory of Intelligent Systems (LIS) at EPFL. The LED Turret is composed of 8 sets of 3-colour Light Emitting Diodes (LEDs), each set containing a blue, red and green LED. These three colours can be mixed together at different strengths to produce hundreds of different colours in any pattern imaginable. The LEDs are all controlled using 24 separate Pulse Width Modulated (PWM) signals generated by an on-board PIC18F6722 (Microchip) microcontroller.
The firmware on the PIC can control each LED individually for maximum flexibility, or in a synchronised fashion to send the same signal from every angle. The user simply has to set a few parameters (such as the maximum brightness, blinking period, and rising slope) through the I2C interface, and the firmware then generates the PWM signals by itself!
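Mixing the three colours "at different strengths" boils down to scaling each 8-bit colour channel into a PWM duty cycle. A sketch with an illustrative period value; the turret's real PWM parameters are set through the I2C interface as described above, not by this hypothetical helper:

```c
/* Map an 8-bit colour channel (0..255) onto a PWM compare value
 * (0..pwm_period). Integer arithmetic, rounding toward zero. */
unsigned duty_for(unsigned char channel_8bit, unsigned pwm_period) {
    return (unsigned)channel_8bit * pwm_period / 255;
}

/* e.g. an orange (255, 128, 0) with a period of 1000 timer ticks
 * yields duties of 1000 (red), 501 (green) and 0 (blue). */
```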
The other part of the visual communication system is an omni-directional camera turret that is used to read signals produced by other robots' LED turrets. This camera turret is currently in prototype development at LIS. Detailed schematics and PCB gerber files for the LED Turret will soon be posted on the Download Page .
## ZigBee Comm Turret
In the event that you find the information presented here useful, or use a communication module based on this design for obtaining experimental results, we kindly request that you cite the following paper as the official reference: Cianci C., Raemy X., Pugh J., and Martinoli A., “Communication in a Swarm of Miniature Robots: The e-Puck as an Educational Tool for Swarm Robotics”. Proc. of the SAB 2006 Workshop on Swarm Robotics, September-October 2006, Rome, Italy. To appear in Lecture Notes in Computer Science (2007).
## Hardware details
## Software details
Currently the MCU runs TinyOS 1.x, but it could run TinyOS 2.x. Because the hardware is not identical to the TmoteSky one, we had to create a new "platform" within TinyOS, a module for controlling the hardware attenuator, and an I2C slave layer (by default, TinyOS doesn't support I2C operating in "slave" or "multi-master" modes). These modifications will eventually need to be ported to TinyOS 2.x.
## Performance
Range of communication:
Power consumption (3.3V):
## Omnidirectional Vision Turret
The Omnidirectional Vision Turret (OVT) was developed at the Laboratory of Intelligent Systems (LIS) at EPFL. The purpose of this extension is to provide basic omnidirectional vision capabilities to the e-puck without the expense of complex processing hardware. Using a simple microcontroller, the turret is capable of advanced vision tasks such as distinguishing colour blobs in any direction, finding other robots in its vicinity, and basic navigation.

## Vision System
Omnidirectional vision is achieved using a standard perspective camera (the same VGA camera used on the e-puck) looking up at a custom-designed hyperbolic mirror. The mirror was designed to provide a full 360-degree view around the robot and can see up to 5 degrees above the horizon. An added advantage of the OVT is that it is not a top turret: all the signals from the e-puck are passed on to a top PCB using card-edge connectors, which allows other turrets to be stacked on top of the OVT.

## Hardware
The CMOS camera is connected to a First-In-First-Out (FIFO) frame buffer (Averlogic AL440B-24-PBF), which is in turn connected to a microcontroller (dsPIC33FJ256GP506 from Microchip). The frame buffer decouples the camera from the microcontroller, which means that the microcontroller is no longer a slave to the camera and can read an image from the frame buffer at whatever pace is required.

## Firmware
Several algorithms have been implemented, including basic drivers for reading an image from the camera as well as image processing algorithms. There are two ways of reading the image:
Some basic image processing algorithms have also been implemented on the dsPIC.
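Colour-blob distinction of the kind mentioned above can be sketched as a threshold-and-run scan over one scanline of an RGB565 image. The thresholds and the function name here are illustrative, not the OVT firmware:

```c
/* Extract the 5-bit red field of an RGB565 pixel. */
static unsigned red5(unsigned short px) { return (px >> 11) & 0x1F; }

/* Find the widest run of "red-ish" pixels on one scanline.
 * Returns the start index of the widest run (or -1 if none)
 * and writes its length into *width. */
int widest_red_run(const unsigned short *line, int n,
                   int red_min, int *width) {
    int best_start = -1, best_len = 0, start = -1;
    for (int i = 0; i <= n; i++) {
        int red = (i < n) && (red5(line[i]) >= (unsigned)red_min);
        if (red && start < 0)
            start = i;                       /* run begins            */
        if (!red && start >= 0) {            /* run ends: compare it  */
            if (i - start > best_len) { best_len = i - start; best_start = start; }
            start = -1;
        }
    }
    *width = best_len;
    return best_start;
}
```

On an omnidirectional image, the position of such a run along the unwrapped scanline maps directly to a bearing around the robot.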
The firmware for the OVT will be posted on the download page once the bugs have been ironed out. Detailed schematics and PCB gerber files for the Omnidirectional Vision Turret can be found on the Download Page.
## Magnetic Wheels
This is a very simple extension, consisting of adding magnets to the bottom of the robot or replacing the wheels with a magnetic version. Here is a video of the system:
Here are some details on the implementation: For adhesion on ferromagnetic ceilings, only the wheels have to be modified. The magnetic wheel has the following shape (magnet in red):
The wheel is fixed to the axis like the standard wheel. The new wheel is made of ferromagnetic iron. A 3D PDF file can be found here; from it you can take measurements for the sizes. To climb a wall, the standard e-puck motor does not have sufficient torque. For this type of operation the e-puck's motor (a custom PG15S-020 with a resistance of 18 ohm) needs to be replaced with a more powerful one (the standard PG15S-020 with a resistance of 10 ohm). Our custom and standard motors have been purchased from EME in Switzerland at 55 CHF per sample.
## E-puck Range and Bearing Board
The e-puck Range & Bearing board (e-RandB) improves on the existing relative localization/communication software library (libIrcom) developed for the e-puck robot, which is based on the robot's on-board infrared sensors. The e-RandB has been developed in collaboration with Robolabo, Iridia and RBZ Robot Design. The board allows situated agents to communicate locally, obtaining at the same time both the range and the bearing of the emitter, without the need for any centralized control or external reference. The board thus gives the robots an embodied, decentralized and scalable communication system. The system relies on infrared communication with frequency modulation and is composed of two interconnected modules for data and power measurement. The e-RandB board is currently used at:
E-RandB Files:
E-RandB Articles:
Its use and capabilities are demonstrated in the following papers:
Software available for the e-puck platform:
Check the Download section for a complete list of software/firmware/electronics files.
## Tutorial
Development on the e-puck can be learned through the tutorial.

## Libraries
We have developed a few simple libraries giving access to the hardware features. The following libraries are available:
Release versions of the libraries are available in the e-puck snapshot. Development versions of the libraries are available at gna. If you want to take part in their development, you can join the development mailing list.

## Bootloader
We have a bootloader that works over Bluetooth.

## ASEBA
The ASEBA architecture makes it possible to run scripts on the e-puck microcontroller while keeping good control over the code, allowing easy debugging and visualization of key variables and sensor data. A bridge between ASEBA and ROS has been implemented and provides a link to one of the best-known robotic programming environments.

## Debug
Hardware online debugging has to be done using an external ICD2 programmer from Microchip, but simple debugging can be managed by sending information through Bluetooth.

## Communication protocol
We have implemented a serial communication protocol to control the e-puck via a serial RS232 line or Bluetooth. This allows simple experimentation with code developed on a PC.

## Demonstrations
We have a set of demonstrations illustrating the use of several sensors of the robot in several typical applications.
## Compiler and IDE
The dsPIC processor family is supported by the GCC GNU compiler collection as the PIC30 target.

## Windows
Under Windows, the MPLAB environment from Microchip can be used with their C30 compiler, which is a version of GCC for the dsPIC with some limitations. Those limitations can be removed by recompiling C30 from its sources. To connect the ICD2 hardware to the e-puck you need to change the connector and use an e-puck (red 6-pin) connector; it is a Tyco "Micro match" connector, type 7-215083-6. To upload the .hex files without Microchip's ICD, you can use the Tiny PIC bootloader.

## Unixes
On Unixes, the piklab environment is available. Its web page provides instructions on how to get or build GCC for PIC30. In addition, if you are running a Debian derivative (such as Ubuntu), there are instructions here on how to build clean debs of the latest PIC30 compiler (version 2.05). If you have trouble during compilation (particularly on Ubuntu, which uses dash as /bin/sh), please read the rest of the page, as there are further explanations. To upload the .hex files you have the choice of three Unix programs:
## ePic2
ePic2 is a complete framework for commanding the e-puck from within Matlab. It allows its user to interact with the e-puck as one would using Windows' HyperTerminal, but improves on it by making the data gathered by the robot available as variables for Matlab's tools. ePic2 can be used in two ways. The first is through its graphical interface, which can be seen below; the interface offers basic commands for the e-puck and displays the values of the sensors. It also provides a way to execute controllers written in an M-file. The second method uses Matlab functions to command the robot from the Matlab command line, which offers more control than the interface. ePic2 is available on the e-puck svn in the tool section. It requires Matlab 7.1 or higher with the Instrument Control Toolbox installed; it is nevertheless recommended to use the most recent version of Matlab to avoid some stability issues. Currently ePic2 can be used under Windows and Linux. It does not work under Mac OS, as the functions needed to connect to the e-puck are not supported on that platform. ePic2 and its documentation can be found in the download section, but it is highly recommended to download them from the tool section of the e-puck svn, which may contain a more recent version. ePic2 is used for education purposes at EPFL in the course Mobile Robots, where it is the main tool for every exercise session; details, including exercise sheets and source code, can be found on the course website. This software was developed at the Laboratory of Intelligent Systems at EPFL by Yannick Weibel, Julien Hubert and other contributors.
## IR communication
## Get it right now!
Here you can find the library and two examples. All material is made available under the GNU General Public License (GPL).

## How to test the examples
With the libIrcom library, two examples are provided: test and synchronize.
Troubleshooting
## More information
Find more information on https://www.e-puck.org, or ask the e-puck community on the GNA mailing lists: https://gna.org/projects/e-puck/

## Contact the authors
We are interested in your work and would be happy to know what kind of exciting things you have achieved with this tool. Let us know!
## Player driver
A Player driver has been developed by Renato Garcia and is available here.
Three simulators are available with e-puck modules:
Webots includes the e-puck cross-compilation tools and runs on Linux, Windows and Mac OS X. The free version of Webots allows you to cross-compile e-puck programs.
## Enki
Enki is an open-source, fast 2D physics-based robot simulator written in C++. It can simulate the kinematics, collisions, sensors and cameras of robots evolving on a flat surface, and provides limited support for friction. It can simulate groups of robots a hundred times faster than real time on a modern desktop computer. Enki webpage here
## Webots
The Webots mobile robot simulation software now includes a model of the e-puck robot and a Bluetooth remote-control interface working on Windows, Mac OS X and Linux.
The real and simulated e-puck control window in Webots
The e-puck.wbt model features accurate physics simulation (including inertia, friction, etc.). A Bluetooth connection between Webots and the real e-puck robot allows you to remote-control the real robot from your Webots controller programs (this currently works under Linux, Windows and Mac OS X). We are also working on supporting cross-compilation of Webots controller programs so that you can execute the code directly on the robot's microcontroller. This software is being improved regularly. The current version models the following devices:
## V-REP
V-REP is a 3D robot simulator based on a distributed control architecture: control programs (or scripts) can be directly attached to scene objects and run simultaneously in a threaded or non-threaded fashion. This makes V-REP very versatile and ideal for multi-robot applications, and allows users to model robotic systems in a similar fashion to reality, where control is most of the time also distributed.
Here is a video of the e-puck model running:
V-REP has various prices and licensing schemes. Students can use V-REP on their private computers free of charge.
Here are listed some accessories that can be used with the e-puck robot.
The e-puck robot is equipped with an IR remote control receiver. If you want to send commands you need a remote control generating RC5 signals. For those who do not want to try all remote controls, here is one that has been tested and works:
Model name: RC UNIVERS15
Length: 130 mm
Initialisation: press "SET" and "TV" then enter code 026
Where to buy: https://www.donberg.ie
Product page: https://www.donberg.ie/descript/r/rcuies15.htm
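An RC5 remote sends 14-bit frames: two start bits, a toggle bit, a 5-bit device address and a 6-bit command (per the RC5 standard). Once a received frame is available as a 14-bit value, its fields can be unpacked as sketched below; how the raw frame is obtained depends on the e-puck library, and the function name here is illustrative:

```c
/* Decoded fields of a 14-bit RC5 frame. */
typedef struct { unsigned toggle, address, command; } rc5_t;

/* Bit layout (MSB first): S1 S2 T AAAAA CCCCCC
 * bits 13..12 start, bit 11 toggle, bits 10..6 address, bits 5..0 command */
rc5_t rc5_decode(unsigned frame14) {
    rc5_t r;
    r.toggle  = (frame14 >> 11) & 0x01;  /* flips on each new key press */
    r.address = (frame14 >> 6)  & 0x1F;  /* device address, e.g. TV = 0 */
    r.command =  frame14        & 0x3F;  /* key code                    */
    return r;
}
```

The toggle bit lets the robot distinguish a held-down key (toggle constant across repeats) from repeated presses of the same key (toggle alternates).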
To safely transport your e-puck around, we use the following box:
Article name (in German): Schraubdosen - Polypropylen (PP)
Diameter: 90 mm
Height: 80 mm
Content: 500 ml
Article number: 2066
Price: 1.30 CHF/box
Where to buy: https://eshop.semadeni.com/
Product page:
FRENCH:
https://eshop.semadeni.com/?&country=CH&language=FR&action=search&searchtype=3&search=2066
GERMAN:
https://eshop.semadeni.com/?&country=CH&language=DE&action=search&searchtype=3&search=2066
Several companies are selling the e-puck robot:
GCtronic, a Swiss company, is manufacturing and selling the e-puck educational mobile robot directly or through distributors all over the world.
For Taiwan please contact Pitotech.
Clients in China mainland can contact BJROBOT TECHNOLOGY CO.,LTD.
For other countries you can contact GCtronic directly.
The reference publication on the e-puck robot is:
Mondada, F., Bonani, M., Raemy, X., Pugh, J., Cianci, C., Klaptocz, A., Magnenat, S., Zufferey, J.-C., Floreano, D. and Martinoli, A. (2009) The e-puck, a Robot Designed for Education in Engineering. Proceedings of the 9th Conference on Autonomous Robot Systems and Competitions, 1(1) pp. 59-65.
Publications that mention the e-puck robot (in chronological order) are listed below. If you have a publication using the e-puck that is not listed here, please send the BibTeX information to info@gctronic.com.
[] | Mattias Jacobsson, Ylva Fernaeus, and Lars Erik Holmquist. Glowbots: designing and implementing engaging human-robot interaction, 2008-06. [ bib ] |
[] | Yating Zheng, Cristián Huepe, and Zhangang Han. Experimental capabilities and limitations of a position-based control algorithm for swarm robotics. Adaptive Behavior, 0(0):1059712320930418, 0. [ bib | DOI | arXiv | www: ] |
[] | Nikolaus Correll, Christopher M. Cianci, Xavier Raemy, and Alcherio Martinoli. Self-organized embedded sensor/actuator networks for ""smart"" turbines. 2006. [ bib ] |
[] | S. Ljungblad, K. Walter, M. Jacobsson, and L.E. Holmquist. Designing personal embodied agents with personas. In ROMAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication, pages 575--580, 2006. [ bib | DOI ] |
[] | Christopher M. Cianci, Xavier Raemy, Jim Pugh, and Alcherio Martinoli. Communication in a swarm of miniature robots: The e-puck as an educational tool for swarm robotics. In Erol Şahin, William M. Spears, and Alan F. T. Winfield, editors, Swarm Robotics, pages 103--115, Berlin, Heidelberg, 2007. Springer Berlin Heidelberg. [ bib ] |
[] | Alberto Acerbi, Davide Marocco, and Stefano Nolfi. Social facilitation on the development of foraging behaviors in a population of autonomous robots. In Fernando Almeida e Costa, Luis Mateus Rocha, Ernesto Costa, Inman Harvey, and António Coutinho, editors, Advances in Artificial Life, pages 625--634, Berlin, Heidelberg, 2007. Springer Berlin Heidelberg. [ bib ] |
[] | O. Gigliotta and S. Nolfi. Formation of spatial representations in evolving autonomous robots. In IEEE Symposium on Artificial Life, 2007. ALIFE'07, pages 171--178, 2007. [ bib ] |
[] | Yong Xu, Matthieu Guillemot, and Toyoaki Nishida. An experiment study of gesture-based human-robot interface. In 2007 IEEE/ICME International Conference on Complex Medical Engineering, pages 457--463, 2007. [ bib | DOI ] |
[] | Sara Ljungblad and Lars Erik Holmquist. Transfer scenarios: Grounding innovation with marginal practices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '07, page 737–746, New York, NY, USA, 2007. Association for Computing Machinery. [ bib | DOI | https ] |
[] | Christopher M. Cianci, Thomas Lochmatter, Jim Pugh, and Alcherio Martinoli. Toward multi-level modeling of robotic sensor networks: A case study in acoustic event monitoring. In Proceedings of the 1st International Conference on Robot Communication and Coordination, RoboComm '07. IEEE Press, 2007. [ bib ] |
[] | Davide Marocco and Alberto Acerbi. Adaptation and social facilitation in a population of autonomous robots. In Proceedings of the Seventh International Conference on Epigenetic Robotics, pages 85--91, Lund, 2007. LUCS. [ bib ] |
[] | Mattias Jacobsson, Sara Ljungblad, Johan Bodin, Jeffrey Knurek, and Lars Holmquist. Glowbots: Robots that evolve relationships. 01 2007. [ bib | DOI ] |
[] | Herianto Herianto, Toshiki Sakakibara, and Daisuke Kurabayashi. Artificial pheromone system using rfid for navigation of autonomous robots. Journal of Bionic Engineering - J BIONIC ENG, 4:245--253, 12 2007. [ bib | DOI ] |
[] | Giovanni Pini, Elio Tuci, and Marco Dorigo. Evolution of social and individual learning in autonomous robots. In IN: ECAL WORKSHOP: SOCIAL LEARNING IN EMBODIED AGENTS, 2007. [ bib ] |
[] | Farzad Rastegar and Majid Nili Ahmadabadi. Grounding abstraction in sensory experience. In 2007 IEEE/ASME international conference on advanced intelligent mechatronics, pages 1--8, 2007. [ bib | DOI ] |
[] | Jim Pugh and Alcherio Martinoli. Parallel learning in heterogeneous multi-robot swarms. In 2007 IEEE Congress on Evolutionary Computation, pages 3839--3846, 2007. [ bib | DOI ] |
[] | Jim Pugh and Alcherio Martinoli. Inspiring and modeling multi-robot search with particle swarm optimization. In 2007 IEEE Swarm Intelligence Symposium, pages 332--339, 2007. [ bib | DOI ] |
[] | Pierre Roduit, Alcherio Martinoli, and Jacques Jacot. A quantitative method for comparing trajectories of mobile robots using point distribution models. In 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 2441--2448, 2007. [ bib | DOI ] |
[] | Saeed Amizadeh, Majid Nili Ahmadabadi, Babak N. Araabi, and Roland Siegwart. A bayesian approach to conceptualization using reinforcement learning. In 2007 IEEE/ASME international conference on advanced intelligent mechatronics, pages 1--7, 2007. [ bib | DOI ] |
[] | Jim Pugh and Alcherio Martinoli. The cost of reality: Effects of real-world factors on multi-robot search. In Proceedings 2007 IEEE International Conference on Robotics and Automation, pages 397--404, 2007. [ bib | DOI ] |
[] | A. Acerbi, F. Cecconi, D. Marocco, and S. Zappacosta. Individual vs social learning in a population of autonomous robots. In Proceedings of the European Conference on Artificial Life (ECAL), pages 10--14, 2007. [ bib ] |
[] | Louis-Emmanuel Martinet, Benjamin Fouque, Jean-Baptiste Passot, Jean-Arcady Meyer, and Angelo Arleo. Modelling the cortical columnar organisation for topological state-space representation, and action planning. In Minoru Asada, John C. T. Hallam, Jean-Arcady Meyer, and Jun Tani, editors, From Animals to Animats 10, pages 137--147, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ] |
[] | Jim Pugh and Alcherio Martinoli. Distributed adaptation in multi-robot search using particle swarm optimization. In Minoru Asada, John C. T. Hallam, Jean-Arcady Meyer, and Jun Tani, editors, From Animals to Animats 10, pages 393--402, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ] |
[] | Dieter Vanderelst and Emilia Barakova. Autonomous parsing of behavior in a multi-agent setting. In Leszek Rutkowski, Ryszard Tadeusiewicz, Lotfi A. Zadeh, and Jacek M. Zurada, editors, Artificial Intelligence and Soft Computing -- ICAISC 2008, pages 1198--1209, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ] |
[] | Winai Chonnaparamutt and Emilia I. Barakova. Robot simulation of sensory integration dysfunction in autism with dynamic neural fields model. In Leszek Rutkowski, Ryszard Tadeusiewicz, Lotfi A. Zadeh, and Jacek M. Zurada, editors, Artificial Intelligence and Soft Computing -- ICAISC 2008, pages 741--751, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ] |
[] | Takaaki Shimone, Daisuke Kurabayashi, Kunio Okita, and Tetsuro Funato. Implementation of Formation Transition System Using Synchronization in a Mobile Robot Group, pages 423--432. Springer Berlin Heidelberg, Berlin, Heidelberg, 2008. [ bib | DOI | https ] |
[] | Olivier Michel, Fabien Rohrer, and Yvan Bourquin. Rat's Life: A Cognitive Robotics Benchmark, pages 223--232. Springer Berlin Heidelberg, Berlin, Heidelberg, 2008. [ bib | DOI | https ] |
[] | Stanislav Slušný, Roman Neruda, and Petra Vidnerová. Rule-based analysis of behaviour learned by evolutionary and reinforcement algorithms. In De-Shuang Huang, Donald C. Wunsch, Daniel S. Levine, and Kang-Hyun Jo, editors, Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence, pages 284--291, Berlin, Heidelberg, 2008. Springer Berlin Heidelberg. [ bib ] |
[] | F. Mondada, M. Bonani, X. Raemy, J. Pugh, C. Cianci, A. Klaptocz, S. Magnenat, J.-C. Zufferey, D. Floreano, and A. Martinoli. The e-puck, a robot designed for education in engineering. The International Journal of Advanced Robotic Systems, 6(4):11--29, 2009. [ bib | DOI ] |
[] | M. Bonani, J. Pugh, A. G. F. Silva, and A. Martinoli. Design and implementation of an open-hardware/open-software platform for cooperative distributed robotics. In 2009 IEEE International Conference on Robotics and Automation, pages 1076--1081, 2009. [ bib | DOI ] |
[] | S. Magnenat and F. Mondada. Aseba: A distributed and embedded programming architecture for swarm robotics. In F. Groen et al., editors, ICT.OPEN, pages 411--420, 2009. [ bib ] |
[] | Stefano Nolfi, Alberto Acerbi, and Davide Marocco. Evolutionary swarm robotics: A review. In A. P. Agogino, K. A. De Jong, P. J. Bentley, and S. K. Ong, editors, Computational Intelligence in Engineering, pages 191--214, Berlin, Heidelberg, 2011. Springer Berlin Heidelberg. [ bib ] |
[] | R. L. G. de Oliveira et al. Robot for learning about robotic and evolutionary algorithms. In 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pages 3226--3231, 2012. [ bib | DOI ] |
[] | Andrés González and Jesús G. González. Social interaction test between a rat and a robot: A pilot study. International Journal of Advanced Robotic Systems, 13(1):4, 2016. [ bib | DOI | arXiv | www: ] |
[] | Software infrastructure for e-puck (and TAM), 2016. [ bib ] |
[] | Yuri K. Lopes, Stefan M. Trenkwalder, André B. Leal, Tony J. Dodd, and Roderich Groß. Supervisory control theory applied to swarm robotics. Swarm Intelligence, 10(1):65--97, Mar 2016. [ bib | DOI ] |
[] | D. Bortoluzzi, B. L. S. de Almeida, et al. Learning distributed control policies for multi-robot teams: A comparison of three approaches. IEEE Access, 5:18175--18186, 2017. [ bib | DOI ] |
[] | T. Kargar Tasooji and H. J. Marquez. Cooperative localization in multirobot systems using event-triggered communication: The centralized case. IEEE Transactions on Control Systems Technology, 27(3):967--979, 2019. [ bib | DOI ] |
[] | Dara Mohammad and Sanaz Mostaghim. On the communication capacity of a small mobile robot: A real-world experiment. In Proceedings of the 2019 Genetic and Evolutionary Computation Conference Companion, GECCO '19, pages 209--210, New York, NY, USA, 2019. Association for Computing Machinery. [ bib | DOI | https ] |
[] | Active multi-robot simultaneous localization and mapping (SLAM) based on coverage maximization and mutual information. IEEE Access, 6:58843--58855, 2018. [ bib | DOI ] |
[] | Yating Zheng, Cristián Huepe, and Zhangang Han. Experimental capabilities and limitations of a position-based control algorithm for swarm robotics. International Journal of Advanced Robotic Systems, 17(3):1729881420930418, 2020. [ bib | DOI | arXiv | www: ] |
[] | Mehmet Dinçer Erbaş and İsmail Hakkı Parlak. Evaluation of vocal communication in a robot collective. Düzce Üniversitesi Bilim ve Teknoloji Dergisi, 8:2029 -- 2040, 2020. [ bib | DOI ] |
[] | Valeria Villani, Beatrice Capelli, Cristian Secchi, Cesare Fantuzzi, and Lorenzo Sabattini. Humans interacting with multi-robot systems: a natural affect-based approach. Autonomous Robots, 44(3):601--616, Mar 2020. [ bib | DOI ] |
[] | Qihao Shan and Sanaz Mostaghim. Discrete collective estimation in swarm robotics with distributed bayesian belief sharing. Swarm Intelligence, Sep 2021. [ bib | DOI | https ] |
[] | Eduardo Castelló Ferrer, Thomas Hardjono, Alex Pentland, and Marco Dorigo. Secure and secret cooperation in robot swarms. Science Robotics, 6(56):eabf1538, 2021. [ bib | DOI | arXiv | www: ] |
[] | T. Kargar Tasooji and H. J. Marquez. Cooperative localization in multirobot systems using decentralized event-triggered communication: With and without timestamps mechanism. IEEE Transactions on Industrial Electronics, 70(6):5982--5993, 2023. [ bib | DOI ] |
[] | Tohid Kargar Tasooji and Horacio J. Marquez. Decentralized event-triggered cooperative localization in multirobot systems under random delays: With/without timestamps mechanism. IEEE/ASME Transactions on Mechatronics, 28(1):555--567, 2023. [ bib | DOI ] |
[] | Tohid Kargar Tasooji, Sakineh Khodadadi, and Horacio J. Marquez. Event-based secure consensus control for multirobot systems with cooperative localization against DoS attacks. IEEE/ASME Transactions on Mechatronics, 29(1):715--729, 2024. [ bib | DOI ] |
[] | Tohid Kargar Tasooji and Sakineh Khodadadi. Multi-robot coordination under physical limitations, 2025. [ bib | arXiv | https ] |
This file was generated by bibtex2html 1.99.