FP7 Regpot Across: this 42-month project started in September 2011. It is a network entitled "Center of Research Excellence for Advanced Cooperative Systems" managed by Prof. Ivan Petrovic from the University of Zagreb (Croatia). It involves many European partners: KTH (Sweden), ETHZ (Switzerland), TUM (Germany), University of Manchester (UK), Vienna University of Technology (Austria), Politecnico di Milano (Italy), University of Sevilla (Spain), Eindhoven University of Technology (The Netherlands), University of Athens (Greece), etc. It provides grants for scientific visits and internships: three master students from the University of Zagreb came to Rennes for four months in 2012, and one Ph.D. student will come for three months in 2013.
This FP6 project started in September 2006. It is managed by Dassault Aviation and gathers many industrial and academic partners (Alenia Aeronautica, Eurocopter, EADS, Walphot, I3S, EPFL, ETHZ, IST, JSI). It is concerned with the automatic landing of fixed-wing aircraft and helicopters using a vision sensor. In this project, we are the leader of the work package devoted to visual tracking and visual servoing.
This 2-year project ended in fall 2011. It was a collaboration with the computer science center of the Federal University of Pernambuco in Recife, Brazil, and with the Robotics and Automation Department of the Mining Technology Center at the Universidad de Chile in Santiago. The collaboration with Santiago de Chile allowed François Chaumette to participate in the 5th IEEE Latin American Summer School on Robotics in December 2011, and in the IEEE RAS Summer School on Robot Vision and Applications in December 2012. The collaboration with Recife was about augmented reality [Lima13a][Lima12a][Lima12b][Lima12c].
Equipex Robotex: this 10-year project managed by CNRS started in fall 2011. Lagadic is one of the 15 French partners involved in this network, which is devoted to providing significant equipment to the main robotics labs in France. Up to now, it has allowed us to acquire a Viper S650 arm and a Pioneer 3DX. Our main equipment, the humanoid robot Romeo by Aldebaran Robotics, will be delivered in June 2013. Robotex also allowed us to hire Aurélien Yol as an engineer for one year in 2012. He worked on adding functionalities to ViSP requested by the Robotex partners (mainly Laas and Isir).
This 4-year project started at the beginning of 2012. It involves Femto-ST (prime) in Besançon, LPN and Isir in Paris, Thalès, and the Lagadic group through the Université de Rennes 1. Nanorobust deals with the development of nano-manipulation within a scanning electron microscope (SEM). Our goal is to provide visual servoing techniques for positioning and manipulation tasks with nanometer accuracy.
This 33-month project ended in fall 2012. It was composed of a consortium managed by Technicolor with the Artefacto, Istia, Soniris, and Bilboquet companies, Télécom Bretagne, and the Inria Metiss and Lagadic groups, all located in Rennes. The goal of this project was to provide tools for developing new TV programs that allow the end user to interact within an immersive and friendly interface. Within this project, we developed visual tracking and servoing algorithms for 3D camera localization. Our work on RGB-D sensors was carried out in the scope of this project. www: http://www.rev-tv.eu
This multidisciplinary industrial research project is led by the Prisme lab (previously called LVR) in Bourges. It is just starting, in collaboration with Lirmm in Montpellier, LMS in Poitiers, the CHU of Tours, and the Robosoft company. The objective of this project is to develop an interactive master-slave robotic platform for a medical diagnosis application (tele-echography) and a set of interactive functionalities combining visual servoing, force control, haptic feedback, a virtual human interface, and 3D representation of organs. Within this project, we will study and develop autonomous control modes that directly use the visual data provided by a camera observing the patient and the information contained in the ultrasound image to move the ultrasound probe.
This project, led by Alexandre Krupa, has just started. It involves a collaboration with the Visages team in Rennes, LSIIT in Strasbourg, and Lirmm in Montpellier. Its goal is to provide methodological solutions for the real-time compensation of soft tissue motion during ultrasound imaging. The approach consists in synchronizing the displacement of a 2D or 3D ultrasound transducer to stabilize the observed image, by using a robotic arm actuating the ultrasound probe. More specifically, the work concerns the use in the control scheme of the intraoperative ultrasound image, of the interaction force between the probe and the soft tissues, and of measurements of external signals providing the breathing state of the patient.
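Purely as an illustration of how such signals might be combined into a single probe velocity command, the sketch below mixes an image term, a force term, and a breathing feedforward term. It is a hypothetical sketch, not the project's control law: the names, gains, and the assumption that the contact force is regulated along the probe axis are ours.

```python
import numpy as np

def probe_velocity(e_img, L_img, f_meas, f_des, v_breath_pred,
                   lam_img=0.6, lam_f=0.02):
    """Hypothetical combination of image, force and breathing terms
    into one 6-DOF velocity command for the probe-holding arm."""
    v_img = -lam_img * np.linalg.pinv(L_img) @ e_img   # reduce the observed image motion
    v_force = np.zeros(6)
    v_force[2] = -lam_f * (f_meas - f_des)             # regulate the contact force along the probe axis
    return v_img + v_force + v_breath_pred             # add the predicted breathing compensation
```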
This project started in March 2008. It is realized in collaboration with BA Systèmes, CEA List, and the Université de Caen. The RobM@rket project aims at developing automated applications for order picking, a fast-expanding business that still mainly involves manual tasks. The system targets the packaging of items ordered on a website through an online catalogue of more than 1000 references before dispatching, as well as order picking for kitting. The robotic system is made of a PLC mobile platform of AGV type (Automatic Guided Vehicle, by BA Systèmes) and an industrial robot arm. This platform will be used to integrate several algorithms allowing it to pick up selected items in a warehouse from an order file and bring them back for dispatching or assembly. The items may be either methodically stored or jumbled in boxes. Our current work consists in developing vision-based object localization techniques for grasping.
ANR Tosa CityVIP: this 42-month project ended at the end of 2011. It involved Lasmea (prime) in Clermont-Ferrand, Inria (Lagadic and ARobAS), Heudiasyc in Compiègne, LCPC in Nantes, IGN in Paris, XLim in Limoges, and the Benomad company in Nice. The project consisted in enhancing the autonomy of urban vehicles by integrating sensor-based techniques with a geographical database. Our work on navigation using a visual memory and on obstacle avoidance was carried out in the scope of this project.
This project started in March 2008. It is realized in collaboration with Orange Labs, CEA Leti, Movea, Polymorph, and the Museum of Fine Arts in Rennes. The augmented reality (AR) concept aims to enhance our perception of the real world by combining it with fictitious elements. AR research is concerned with the different methods used to augment live video imagery with coherent computer-generated graphics. The combination of mobile technologies and AR allows the design of a video rendering system in which the augmentation of the real world depends on the user's localization and orientation. In this project we focus on indoor environments, with the main objective of implementing AR technologies on mobile devices. The proposed experimental field is the museum, a controlled environment (constant lighting and object locations) without some of the perturbations of outdoor environments. We estimate that a successful museum prototype could be used as the backbone of many other indoor and outdoor AR applications. Within this project we are involved in the tracking and sensor fusion parts of the AR process.
This project, led by Tarek Hamel from I3S, started in June 2007. It is realized in collaboration with I3S, the EPI ARobAS at Inria Sophia Antipolis-Méditerranée, Heudiasyc in Compiègne, the CEA-List, and the Bertin company. It is devoted to the sensor-based control of small helicopters for various applications (stabilization, landing, target tracking, etc.).
PEA Decsa: this 3-year project funded by the DGA started in fall 2011. It is composed of a consortium managed by Astrium in Toulouse with the Novadem, Sirehna, Spot Image, and Magellium companies, and with the Inria Steep group in Grenoble and Lagadic. It is devoted to the development of navigation and perception algorithms for small drones in urban environments.
This four-year project started in December 2012. It is a large project led by Aldebaran Robotics to develop functionalities for the new humanoid robot Romeo. Our work in this project is devoted to developing vision-based navigation and manipulation tasks.
Oseo Apash: this 2-year project started in September 2012. It is managed by Marie Babel from Lagadic and involves three laboratories connected to Insa Rennes, namely Irisa/Inria, IETR, and LGCGM, and two industrial partners: AdvanSEE in Nantes and Ergovie in Rennes. It aims at designing driving assistance for electric wheelchairs to improve the autonomy and safety of disabled people.
Astrium 2010: this 6-month contract was devoted to testing our model-based tracker for autonomous satellite rendezvous. It was very successful [Petit11b][Petit11a], which allowed us to start a fruitful collaboration with Astrium (Antoine Petit's and Tawsif Gokhool's Ph.D. theses, PEA Decsa).
Orange Labs: this 3-year contract will end in March 2013. It supports the Cifre convention between Orange Labs and the Université de Rennes 1 regarding Pierre Martin's Ph.D. about augmented reality on mobile devices.
Dassault Aviation: this 3-year contract ended in fall 2012. It supported the Inria-DGA grant for Laurent Coutard's Ph.D. about autonomous landing on an aircraft carrier.
This contract, which started in March 2011, supports Antoine Petit's Ph.D. about 3D model-based tracking for applications in space (satellite servicing, rendezvous, debris removal, ...).
Astrium 2012-2015: this 3-year contract started in February 2012. It supports Tawsif Gokhool's Ph.D. about the visual mapping of complex 3D environments that evolve over time, in the scope of the recent general convention between Astrium and Inria.
ECA Robotics: this 3-year contract started in May 2012. It supports the Cifre convention between ECA Robotics and Inria Sophia Antipolis regarding Romain Drouilly's Ph.D. about specifying a semantic representation well adapted to the problem of navigation in structured indoor or outdoor environments.
In October 2005 we began a project for the European Space Agency. It is realized in collaboration with the Trasys company (Brussels), Galileo Avionica (Milano), and KUL in Leuven. Its aim is to develop a demonstrator able to grasp objects using vision-based control of a robot in a space environment. The considered robot is the ESA Eurobot prototype. Our task in this project is to provide algorithms for object tracking and vision-based control.
Lagadic is involved in an Inria associate team (EA) with Prof. Seth Hutchinson from the Beckman Institute at the University of Illinois at Urbana-Champaign (UIUC). In the scope of this project, Seth Hutchinson made one-week visits in March, September, and November 2008. Reciprocally, Mohammed Marey and François Chaumette each spent one week at the Beckman Institute, in May and November 2008 respectively. Roméo Tatsambon Fomena spent one month there in August 2008 to work on the visual servoing of a mobile robot using an omnidirectional vision sensor.
This international collaboration between France and Australia is supported by CNRS. It is about the visual servo control of unmanned aerial vehicles. It started in fall 2005 for three years. It brings together Rob Mahony (Australian National University, Canberra), Peter Corke and Jonathan Roberts (CSIRO, Melbourne), Tarek Hamel (I3S, Sophia-Antipolis), Vincent Moreau (CEA-List, Paris), and our group.
This collaboration with IST Lisbon, Portugal (Prof. J. Santos-Victor) is concerned with visual servoing for robotics applications. M. Lopes made a one-month visit to our group in February 2005, and N. Mansard made a one-month visit to IST in June 2005.
This large project is headed by Inria Sophia Antipolis. It is concerned with the navigation of mobile vehicles in urban environments. Within this project, our work consists in designing autonomous vision-based navigation techniques using an image database of the environment. On the scientific side, this project is closely related to the Robea Bodega project.
The goal of the SORA project (Object tracking for augmented reality) is the development of tracking algorithms for augmented reality applications. Our partners are Total-Immersion and VideoMage. Augmented reality has now progressed to the point where real-time applications are being considered and needed. At the same time, it is important that synthetic elements be rendered and aligned in the scene in an accurate and visually acceptable way. To address these issues, a real-time, robust, and efficient 3D model-based tracking algorithm is proposed for a "video see-through" monocular vision system. Virtual objects can then be projected into the scene using the estimated pose. The tracking rate is 50 Hz.
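As a rough illustration of this last step only (not the project's code), the sketch below projects the 3D points of a virtual object into the image once the tracker has provided the camera pose; the intrinsic parameters and all names are assumptions.

```python
import numpy as np

# Assumed pinhole intrinsic parameters (focal lengths and principal point, in pixels).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def overlay_points(virtual_pts_w, cTw):
    """Pixel coordinates of virtual 3D points (world frame) for the estimated pose cTw."""
    pts_h = np.hstack([virtual_pts_w, np.ones((len(virtual_pts_w), 1))])
    pts_c = (cTw @ pts_h.T)[:3]            # world frame -> camera frame
    uv = K @ (pts_c / pts_c[2])            # perspective projection onto the image plane
    return uv[:2].T                         # where to draw the augmentation
```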
This large project is realized for the DGA through a consortium led by Thales Optronics. We work in close collaboration with the ARobAS team at Inria Sophia Antipolis-Méditerranée. This project is about the development of tracking algorithms and the control of non-holonomic vehicles. Within this project, our work consists in developing 2D image-based tracking algorithms for complex outdoor scenes. The algorithms provided last year using points of interest were improved, new functionalities were added, and our contribution was ported to the DGA's autonomous military terrestrial vehicle dedicated to survey missions.
This contract started in November 2005. It is also supported by the Brittany Council through a grant to Claire Dune for her Ph.D. ("krog" means grasping in the Breton language). It is dedicated to object manipulation by visual servoing. The goal of this project is to allow disabled persons to grasp an object with the help of a robotic arm mounted on a wheelchair. This task should be achieved with a minimum of a priori information regarding the environment, the considered object, etc.
This contract is devoted to supporting the Cifre convention between France Telecom R&D and Inria regarding Fabien Servant's Ph.D. The goal of the Ph.D. is to enable augmented reality on mobile devices such as GSM phones or PDAs used by pedestrians in urban environments. More precisely, its aim is to compute the absolute pose of the camera in order to show geolocalized information to the end user in an explicit way.
This contract started in October 2005. Its goal is to evaluate the capability of our Markerless tracking algorithm to handle rough models provided by geographic information systems. A second part of this project will consider the automatic initialization of the tracking process.
The Marker software implements an algorithm allowing the computation of the camera pose and the calibration of the cameras using fiducial markers. Pose computation is handled using the virtual visual servoing approach: the idea consists in regarding pose estimation and calibration as the dual problem of visual servoing. This method has many advantages: accuracy identical to that of traditional non-linear minimization methods, simplicity, and effectiveness.
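A minimal sketch of the virtual visual servoing idea is given below, assuming a calibrated camera, known 3D marker points, and observations expressed in normalized image coordinates; the function names, gain, and iteration count are illustrative, and this is not the Marker software itself.

```python
import numpy as np

def exp_se3(v):
    """SE(3) exponential of a twist (vx, vy, vz, wx, wy, wz)."""
    u, w = v[:3], v[3:]
    theta = np.linalg.norm(w)
    W = np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])
    if theta < 1e-12:
        R, V = np.eye(3), np.eye(3)
    else:
        R = np.eye(3) + np.sin(theta) / theta * W + (1 - np.cos(theta)) / theta**2 * W @ W
        V = np.eye(3) + (1 - np.cos(theta)) / theta**2 * W \
            + (theta - np.sin(theta)) / theta**3 * W @ W
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, V @ u
    return T

def project(model_pts, cTo):
    """Normalized image coordinates and depths of 3D model points for the pose cTo."""
    pc = (cTo[:3, :3] @ model_pts.T + cTo[:3, 3:4]).T
    return pc[:, :2] / pc[:, 2:3], pc[:, 2]

def interaction_matrix(xy, Z):
    """Stacked 2x6 interaction matrices of image points (normalized coordinates)."""
    rows = []
    for (x, y), z in zip(xy, Z):
        rows.append([-1 / z, 0, x / z, x * y, -(1 + x * x), y])
        rows.append([0, -1 / z, y / z, 1 + y * y, -x * y, -x])
    return np.asarray(rows)

def vvs_pose(model_pts, observed_xy, cTo_init, gain=0.5, iters=50):
    """Refine the pose so that the projected model converges to the observations."""
    cTo = cTo_init.copy()
    for _ in range(iters):
        xy, Z = project(model_pts, cTo)
        e = (xy - observed_xy).ravel()        # visual error s - s*
        L = interaction_matrix(xy, Z)
        v = -gain * np.linalg.pinv(L) @ e     # virtual camera velocity (control law)
        cTo = exp_se3(-v) @ cTo               # move the virtual camera accordingly
    return cTo
```

At convergence, the virtual camera observes the markers exactly where the real camera does, and the remaining transform is the estimated pose.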
This study relates to the evaluation of the quality of pig meat sides by active vision and MRI measurements. It follows a joint proposal by Cemagref Rennes, the Olympig company, and the Vista project to Ofival, which agreed to provide financial support. Within this project, our task deals with the vision-based estimation of the volume of a piece of ham. We proposed algorithms for the 3D reconstruction and exploration of a scene using a mobile, controlled camera. We used a space carving algorithm to obtain a precise and robust reconstruction of the 3D structure of unknown objects. To ensure the complete reconstruction of all the objects of the scene, we presented a gaze planning strategy that mainly uses a representation of known and unknown areas as a basis for selecting new viewpoints. The trajectory that leads to this exploration is handled using a visual servoing approach.
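For illustration only, the following is a minimal silhouette-based sketch of the carving principle, under assumptions not stated above (calibrated views with known poses and binary silhouette masks); the names are hypothetical and this is not the project's implementation.

```python
import numpy as np

def carve(voxel_centers, views):
    """Keep the voxels whose projection is consistent with every view.

    voxel_centers: (N, 3) candidate voxel centers in the world frame.
    views: list of (K, cTw, mask) tuples with camera intrinsics K (3x3),
           world-to-camera pose cTw (4x4) and a binary silhouette mask (H, W).
    """
    keep = np.ones(len(voxel_centers), dtype=bool)
    pts_h = np.hstack([voxel_centers, np.ones((len(voxel_centers), 1))])
    for K, cTw, mask in views:
        pc = (cTw @ pts_h.T)[:3]                       # world -> camera frame
        in_front = pc[2] > 1e-6
        uv = (K @ (pc / np.where(in_front, pc[2], 1.0)))[:2]
        u, v = np.round(uv).astype(int)
        h, w = mask.shape
        inside = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        consistent = np.zeros(len(voxel_centers), dtype=bool)
        consistent[inside] = mask[v[inside], u[inside]] > 0
        keep &= consistent                             # carve inconsistent voxels
    return voxel_centers[keep]
```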
One major problem of underwater observation with an autonomous vehicle is the instability of image acquisition. Indeed, such small vehicles are subject to low-frequency motions due to weak friction and water currents. In this study, we proposed to stabilize the image by controlling the pan and tilt motions of the camera mounted inside the vehicle, using techniques applied for target tracking. The main idea of this approach is that, since it is very difficult to track a point of an unknown scene using geometrical tools, the point can instead be retrieved by integrating its speed. Indeed, the velocity in the image can be estimated in real time, without any a priori knowledge of the image content. Our approach has been validated on a dry set-up.
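A small sketch of this principle could look as follows, assuming normalized image coordinates and that the tilt and pan axes correspond to rotations about the camera x and y axes; the velocity estimator, gain, and names are placeholders, not the actual system.

```python
import numpy as np

def pan_tilt_step(x, image_velocity, dt, gain=0.8):
    """One stabilization step.

    x: current virtual target point (normalized image coordinates).
    image_velocity: measured 2D velocity of the scene in the image.
    Returns the updated point and the (tilt, pan) velocity command.
    """
    x = x + image_velocity * dt                 # retrieve the point by integrating its speed
    # Interaction of a point (x, y) with the rotational velocities (wx, wy) only:
    L = np.array([[x[0] * x[1], -(1.0 + x[0] ** 2)],
                  [1.0 + x[1] ** 2, -x[0] * x[1]]])
    w = -gain * np.linalg.solve(L, x)           # drive the point back to the image center
    return x, w
```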
In this study, we proposed a method to control the displacement of a robot arm (the Victor 6000) with no proprioceptive sensor. The joint positions are not available and this manipulator is usually open-loop controlled. In order to get a more efficient control interface, we propose a closed-loop system based on an eye-to-hand visual servoing approach. We show that, using such an approach, measurement of the manipulator motion with proprioceptive sensors is not required to precisely control the end-effector motion. We propose solutions for position-based control and velocity control of the manipulator. To maintain the end effector in the camera field of view, the camera orientation is also controlled.
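As a rough illustration of the velocity-control case, under assumptions of our own (the external camera provides a 3D position measurement of the end effector, and the camera-to-base rotation is known), a sketch could be:

```python
import numpy as np

def velocity_command(X_cam, X_cam_des, bRc, gain=0.4):
    """Translational velocity of the end effector, expressed in the robot base frame.

    X_cam, X_cam_des: current and desired 3D positions of the end effector,
                      measured by the external (eye-to-hand) camera.
    bRc: rotation from the camera frame to the robot base frame.
    """
    e = X_cam - X_cam_des                 # position error in the camera frame
    v_cam = -gain * e                     # exponential decrease of the error
    return bRc @ v_cam                    # express the command in the base frame
```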
This three-year project, which began in October 2002 and is financed by CNRS within the ROBEA framework, is a collaboration between INRIA Grenoble (Sharp, Movi, PRIMA), LAAS (RIA), and the VISTA project. The objective of this project is the automation of the control of a vehicle in road environments.
This two-year project started in October 2002. The aim of this project is to develop robot navigation schemes using a panoramic vision sensor. It is realized through a collaboration between Lasmea in Clermont-Ferrand, Lirmm in Montpellier, Crea in Amiens, and our group. This year, we determined the analytical form of the interaction matrix related to image moments using the specific geometrical models of panoramic sensors. A visual servoing control law was then developed from this modeling step. A project meeting was organized at Irisa in November 2003.
This three-year project started in September 2003. The aim of this project is to develop task sequencing schemes to realize high-level robotic tasks from local sensor-based control techniques. A comparison between vision-based control and human gaze control is also considered. It is realized through a collaboration between Laas, Cerco, and Enit, all located in Toulouse, and our group.
This two-year project started in November 2003. Its aim is to develop vision-based and sensor-based methods for the autonomous navigation of mobile vehicles moving in urban environments. It is realized through a collaboration between Ensil in Limoges, UTC in Compiègne, Lasmea in Clermont-Ferrand, the Icare group of Inria Sophia-Antipolis, and our group. A project meeting was organized at Irisa in November 2003.
This two-year project started in November 2003. Its aim is to develop vision-based localization techniques and visual servoing schemes for small helicopters moving in an indoor environment. It is realized through a collaboration between I3S in Nice, Cea in Fontenay-aux-Roses, and our group.