This laboratory is intended for research and teaching in mobile robotics, focusing
on the following topics: Planning, sensing, and control; Trajectory planning and autonomous vehicle control; Artificial intelligence and machine vision algorithms; Sensor fusion; Human-robot collaboration.
 
 
Academic Advisor: Yael Edan
Technical Manager: Noam Peles
 
 
The lab includes:
 
Two Active Media Pioneer 2DX three-wheeled mobile platforms, each equipped with 16 front and rear sonars, bump sensors, encoders, a compass, and a camera (LACHISH, ARAD); one Active Media Pioneer 2AT four-wheeled mobile platform equipped with 16 front and rear sonars, a SONY CCD camera, and a SICK laser scanner (NEGEV); a custom-designed outdoor agricultural robot; two Gaus robots; a Fischertechnik educational robot; and one advanced platform based on a Motorola controller (RAMON).
 

Courses using this lab: Computational Methods in AI (graduate compulsory), Intelligent Automation Systems (graduate elective), and Foundations of Robotics in Industry (graduate elective).

Research Projects:

A sensor fusion framework for online sensor and algorithm selection
Researchers: Ofir Cohen, Yael Edan
A sensor fusion framework was developed for online selection of the most reliable logical sensors and the most suitable algorithm for fusing their data on a robot platform. The framework is rule-based, built on the principle of using the simplest sensor fusion algorithm with the most reliable sensors, and is realized through measures developed to quantify sensor performance online. Statistical, histogram, time-series, and graphical analyses demonstrate the advantages of this new framework.
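The selection logic can be sketched as a small rule base: score each logical sensor's recent reliability, keep only the sensors above a threshold, and fall back to a simpler fusion rule when fewer sensors qualify. The scoring function, threshold, and names below are illustrative assumptions, not the measures developed in the project.

```python
def select_sensors_and_fuse(readings, history, threshold=0.7):
    """Rule-based selection sketch: pick reliable sensors, then apply the
    simplest fusion rule the surviving sensor set supports.
    `history` maps sensor name -> list of past agreement scores in [0, 1];
    the reliability measure here is a plain moving average, an
    illustrative stand-in for the project's online performance measures."""
    reliability = {s: sum(h) / len(h) for s, h in history.items()}
    reliable = [s for s in readings if reliability.get(s, 0.0) >= threshold]
    if len(reliable) == 1:
        # One trusted sensor: the simplest "algorithm" is pass-through.
        return readings[reliable[0]]
    if reliable:
        # Several trusted sensors: a simple unweighted average suffices.
        return sum(readings[s] for s in reliable) / len(reliable)
    # No sensor passes the threshold: reliability-weighted average of all.
    total = sum(reliability.get(s, 0.0) for s in readings) or 1.0
    return sum(readings[s] * reliability.get(s, 0.0) for s in readings) / total
```

The point of the cascade is the framework's stated principle: the fusion rule only grows more complex as sensor reliability degrades.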
Statistical evaluation method for comparing grid-map-based sensor fusion algorithms
Researchers: Ofir Cohen, Edna Schechtman, Yael Edan
A method was developed for evaluating sensor fusion algorithms through a quantitative comparison that is independent of the data acquired and the sensors used. The sensor fusion performance measures and performance analysis procedure provide a basis for modeling, analyzing, experimenting with, and comparing different sensor fusion algorithms. The capability to compare different algorithms creates a ranking basis, making it possible to select the best algorithm.
The statistical evaluation method defines the experimental design and statistical analysis. The number of experiments and repetitions required are derived from the statistical characteristics and the desired confidence level. Since procedures are defined to ensure that the experiments are indeed conducted differently, the results are not specific for either the evaluated test cases or the sensor characteristics. The statistical analysis provides a systematic method for comparing sensor fusion algorithms.
Although this method requires experimentation, it offers the ability to compare actual performance in the real world. Quantitative procedures were developed to ensure that the specific environmental conditions evaluated do not influence the evaluation. To demonstrate the statistical evaluation method, it was applied to a case study comparing five different sensor fusion algorithms in a mobile robot experiment.
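The ranking step of such an evaluation can be sketched as follows: for each fusion algorithm, summarize the map-error scores from repeated, independently varied experiments as a mean with a confidence half-width, then rank by mean error. The error metric, the t multiplier, and all names are illustrative assumptions, not the project's actual statistical procedure.

```python
import statistics

def rank_algorithms(errors_by_alg, confidence_t=2.0):
    """For each algorithm, compute mean error and +/- t * standard error
    over the experimental repetitions, then sort so the lowest-error
    algorithm comes first. A t multiplier near 2 approximates a 95%
    confidence interval when enough repetitions are available."""
    summary = []
    for alg, errors in errors_by_alg.items():
        mean = statistics.mean(errors)
        sem = statistics.stdev(errors) / len(errors) ** 0.5
        summary.append((alg, mean, confidence_t * sem))
    return sorted(summary, key=lambda row: row[1])
```

Overlapping confidence intervals in the output would signal that more repetitions are needed before declaring one algorithm the winner.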
Adaptive weighted average sensor fusion algorithms for mobile robots
Researchers: Keren Kapach, Yael Edan
A new set of sensor fusion algorithms for mapping the environment of mobile robots using grid maps was developed. The algorithms use the adaptive weighted average method, taking as weights the number of times each cell was sampled by the sensor. Analysis in an indoor mobile robot experiment indicated superior performance of one of the new algorithms when compared to a previously developed adaptive fuzzy logic algorithm.
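The cell update described above can be sketched as a running, count-weighted average: each new reading of a cell is folded into the stored occupancy estimate, with the old estimate weighted by how many times the cell has been sampled. Function and variable names are illustrative.

```python
def update_cell(estimate, count, reading):
    """Adaptive weighted average for one grid-map cell: the stored
    occupancy estimate carries weight `count` (number of previous
    samples), so the estimate is the running mean of all readings."""
    new_count = count + 1
    new_estimate = (estimate * count + reading) / new_count
    return new_estimate, new_count
```

For example, folding in the readings 1.0, 0.0, 1.0 for an initially empty cell leaves the estimate at their mean, 2/3, with a sample count of 3.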
Navigation methodology for automated guided vehicles
Researchers: Keren Kapach, Yael Edan
A navigation methodology was developed for decentralized autonomous automated guided vehicles used for material handling. The methodology is based on behavior-based control augmented with multi-robot coordination behaviors and a priori way-point determination. Results indicate that the developed methodology balances the competing goals of optimal vehicle routing on the one hand and decentralized reactive operation on the other.
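A behavior-based controller of this kind can be sketched as priority-ordered arbitration: behaviors are checked in descending priority, and the first one whose trigger fires issues the command. The specific behaviors below (obstacle avoidance, yielding to another vehicle, seeking the next pre-computed waypoint) are illustrative placeholders, not the project's actual behavior set.

```python
def arbitrate(behaviors, state):
    """Behavior-based arbitration sketch: `behaviors` is a list of
    (trigger, command) pairs in descending priority; the first behavior
    whose trigger fires on the current state determines the command."""
    for trigger, command in behaviors:
        if trigger(state):
            return command(state)
    return "stop"

# Illustrative behavior set: avoid obstacles first, then yield when a
# coordination conflict is flagged, otherwise head to the next waypoint.
behaviors = [
    (lambda s: s["obstacle"], lambda s: "avoid"),
    (lambda s: s["conflict"], lambda s: "yield"),
    (lambda s: True, lambda s: "goto " + s["waypoint"]),
]
```

The a priori way-point list supplies the default (lowest-priority) behavior, while the coordination behaviors sit above it so vehicles react to each other without any central dispatcher.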
Evaluation of Automatic Guided Vehicle Systems
Researchers: Sigal Berman, Edna Schechtman, Mo Jamshidi, Yael Edan
Partners: University of New Mexico
A methodology was developed for detailed evaluation of autonomous automated guided vehicle systems (AGVS) used for material handling. The methodology includes: stand-alone sub-module evaluation, comprising comprehensive simulations and statistical analysis of the system's sub-modules along with hardware validation; quantitative system evaluation for investigating integrated system performance; and structured qualitative analyses for identifying strengths and weaknesses that are not readily apparent. The defined performance measures include aspects from both the multi-robot and AGV fields. The developed methodology provides a systematic way to model, experiment with, analyze, and compare different AGVS control methods.
KISS human-robot interfaces
Researchers: A. Eliav, T. Lavie, Y. Parmet, H. Stern, J. Wachs, Y. Edan
A human-robot interface was designed based on previous guidelines for human-robot interface design, following the 'Keep It Simple, Stupid' (KISS) principle. Two types of directional controllers (hand gestures/touch screen) and two types of distance display modes (head-up/radar) were compared in an experiment requiring an operator to follow a path in a complex industrial environment. Objective results, measured by mission completion time, indicated that the touch-screen directional controller is preferable. Subjective results indicated that the head-up display is more efficient and that the touch-screen directional controller is more comfortable, intuitive, efficient, and simple than the hand-gesture directional controller.
Toward Elevated Agrobotics: An Autonomous Field Robot for Spraying and Pollinating Date Palm Trees
Agricultural operations such as spraying and pollinating date palm trees are currently performed manually by a team of several workers from a platform raised 18 meters or more above the ground. This method is unsafe, and accidents have often occurred in the past. Alternatively, date clusters are sprayed by a large pressurized sprayer directly from the ground, a method that is highly unselective and environmentally harmful. We are developing an autonomous field robot that will effectively and accurately spray and pollinate date clusters. The robotic system consists of a visually controlled robotic arm that guides the jet of a mounted sprayer directly to the date clusters, completely autonomously. Rather than requiring an expensive, dedicated platform, this robotic system can be towed by a standard tractor operated by a single driver, and it eliminates the need for human workers to operate at great heights.

Image processing algorithms for a selective vineyard robotic sprayer
Researchers: Ron Berenstein, Ohad Ben Shahar, Amir Shapiro, Avital Bechar, Yael Edan
Image processing algorithms were developed for a selective robotic sprayer in vineyards. Two machine vision algorithms were developed for directly spraying grape clusters and foliage. The first algorithm is based on the difference in the distribution of edges between the foliage and the grape clusters. The second detection algorithm uses a decision tree, trained on a dataset of 100 images, to separate the grape clusters from the background. Both image processing algorithms were tested on video footage acquired in vineyards during the 2008 growing season. Results indicate high reliability for both foliage detection and grape cluster detection. Preliminary results show 90% accuracy in grape cluster detection, leading to a 30% reduction in pesticide use.
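The edge-based detector can be sketched as follows: compute a simple gradient-magnitude edge mask, then report the fraction of edge pixels in each small window, since grape clusters and foliage produce different edge densities. The window size, threshold, and names are illustrative assumptions, not the published algorithm's parameters.

```python
import numpy as np

def edge_density_map(gray, win=8, edge_thresh=30.0):
    """Per-window edge-density sketch: threshold the gradient magnitude
    of a grayscale image to get an edge mask, then average the mask over
    non-overlapping win x win windows. Regions whose edge density differs
    from the foliage background are candidate grape-cluster windows."""
    gy, gx = np.gradient(gray.astype(float))
    edges = np.hypot(gx, gy) > edge_thresh
    h, w = edges.shape
    h, w = h - h % win, w - w % win  # crop to a whole number of windows
    blocks = edges[:h, :w].reshape(h // win, win, w // win, win)
    return blocks.mean(axis=(1, 3))  # edge-pixel fraction per window
```

A later stage would threshold this density map, or feed it together with other features to the decision-tree classifier, to mark the windows to be sprayed.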
A quadruped legged robot driven by linear actuators
Researchers: Amir Shapiro, Raziel Riemer, Gal A. Kaminka, Hugo Guterman
The project's aim is to develop a low-cost, high-reliability legged robot. The main development thrusts are mechanical design, control, and gait algorithms.