Category archive - ViCOROB

Chain inspection at sea

In January 2015 the UdG team performed experiments at sea to test the final developments regarding chain inspection. We used a high-resolution imaging sonar, which delivers acoustic images at near-video frame rate, to detect each of the links and follow the chain. In this way, the system can operate regardless of the visibility conditions and the suspended marine fouling that may arise during cleaning. However, working with sonar data introduces several challenges (noisy data, insonification artifacts, narrow field of view, etc.) that had to be addressed. We have tackled the problem in two different configurations: a chain lying on the seafloor and a chain suspended vertically in the water column. For each of these configurations we have provided solutions for chain detection and chain following using forward-looking sonar, and also multibeam data in the vertical case.

Mock-up of the chain lying on the seafloor (left) and hanging vertically (middle and right), in Sant Feliu de Guixols (Girona coast). The water visibility was very poor, highlighting the benefit of using acoustic sensors.

After the successful performance of the chain detection and following algorithms in the water tank, we attempted the same procedures at sea, thus performing a final demonstration one step closer to a real operational environment and exposing the system to more challenging conditions (larger environment, worse visibility, water currents, etc.). Finally, we have also developed a forward-looking sonar mapping system to perform a first, high-level evaluation of the chain state. This provides an overall view of the spatial layout of the links in the environment, as well as a map with increased signal-to-noise ratio with respect to the individual frames, in which features on the order of a few centimeters can be appreciated.
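
For intuition on why the fused map has a higher signal-to-noise ratio than the individual frames, the sketch below (Python, with hypothetical frame data and integer-pixel offsets standing in for the real registration output) simply averages the overlapping frames: N independent looks at the same cell reduce uncorrelated noise by roughly the square root of N. This is only a minimal stand-in, not the project's actual blending method.

```python
import numpy as np

def fuse_registered_frames(frames, offsets, mosaic_shape):
    """Average already-registered sonar frames into a mosaic.

    frames  : list of 2-D float arrays (intensity images)
    offsets : list of (row, col) integer offsets of each frame in the mosaic
    """
    acc = np.zeros(mosaic_shape)    # sum of intensities per mosaic cell
    hits = np.zeros(mosaic_shape)   # number of frames covering each cell
    for frame, (r, c) in zip(frames, offsets):
        h, w = frame.shape
        acc[r:r + h, c:c + w] += frame
        hits[r:r + h, c:c + w] += 1.0
    # Cells seen by N frames keep the signal but average down the noise.
    return np.divide(acc, hits, out=np.zeros_like(acc), where=hits > 0)
```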

Search trajectory and inspection of the horizontal chain at sea. After following several waypoints, the Girona 500 AUV found the links of the chain and started the inspection. The left image shows the trajectory of the AUV and the right image shows the post-processed acoustic mosaic of the same trajectory.

Inspection of the chain in vertical position at sea. The left image shows the point cloud representation of the chain acquired with the acoustic multibeam. The right image shows the post-processed acoustic mosaic of the chain acquired with the forward looking sonar.

Watch the horizontal chain inspection:

Watch the vertical chain inspection:

IIT and UdG work on new capabilities to perform robust valve turning

From the 28th of November to the 5th of December, researchers from the Istituto Italiano di Tecnologia (IIT) came to the University of Girona (UdG) to carry out different tests towards achieving a successful valve turning.

NTUA, IIT and UdG working at the CIRS facilities.

During this short period the two teams worked together on two different tasks: first, the integration and testing of the new end-effector; and second, testing the Reactive Fuzzy Decision Maker (RFDM) to evaluate the safety of the valve turning.

New end-effector designed to improve the quality of the grasping, equipped with an in-hand camera and a Force/Torque sensor.

The new end-effector has been designed with three main elements: first, the shape of the passive gripper to grasp the valve handle; second, a camera installed in the centre of the end-effector to see the manipulated element; and third, a Force/Torque sensor to evaluate the quality of the grasp and the torque needed to turn the valve.
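
To illustrate how the Force/Torque readings might be used during the task, here is a minimal sketch with hypothetical threshold values (not the ones used by the IIT/UdG system): a large lateral force suggests a misaligned grasp, while the torque about the turning axis indicates whether the valve can be turned safely.

```python
# Hypothetical thresholds; the real values depend on the gripper and the valve.
MAX_LATERAL_FORCE_N = 15.0   # side load suggesting a misaligned grasp
MAX_VALVE_TORQUE_NM = 6.0    # torque above which the valve is assumed stuck

def grasp_ok(force_xyz):
    """Rough grasp-quality check from the wrist force reading (N)."""
    fx, fy, _ = force_xyz
    lateral = (fx ** 2 + fy ** 2) ** 0.5
    return lateral < MAX_LATERAL_FORCE_N

def turning_torque_ok(torque_xyz):
    """True if the torque about the turning axis (assumed z, in Nm) is within limits."""
    return abs(torque_xyz[2]) < MAX_VALVE_TORQUE_NM
```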

Valve turning scenario with the perturbation system installed

An external thruster has been installed in the valve turning scenario in order to add perturbations during the manipulation task. The perturbations affect the valve turning and thus make it possible to identify the parameters used to evaluate safety. Furthermore, the communication between the RFDM and the Learning by Demonstration reproducer has been tested.

NTUA and UdG teams working towards chain following

During the last week of November, NTUA and UdG members put their efforts together to push forward the autonomous chain cleaning task of the PANDORA project. To this end, the chain links must be detected and followed accurately.

NTUA and UdG teams working at CIRS for the chain following task

The UdG team provided a module that detects chain links in the sonar imagery. The chain link detector has been designed to overcome the difficulties of performing object recognition on sonar data (such as the presence of noise, moving shadows or intensity alterations due to viewpoint changes). Taking the link detections as input, the NTUA team developed a module that fits a curve through the multiple detections and groups them to obtain a waypoint at the center of each link. The last step consists in concurrently following the identified waypoints while performing new detections. Here, two problems were identified. First, the insonification area of the forward-looking sonar always lies several meters ahead of the vehicle, so the AUV must point in the direction of the last link while keeping its position over the current one. Second, if these two movements are not well coordinated, the chain can easily drop out of the sonar’s field of view, since it is very narrow (30º).
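
A minimal sketch of the waypoint-generation idea, assuming the link detections arrive as (x, y) positions in a common frame and that a low-order polynomial is an adequate curve model (the actual NTUA module may differ):

```python
import numpy as np

def chain_waypoints(detections, n_links):
    """Fit a smooth curve through noisy link detections and place one
    waypoint per link along it.

    detections : (N, 2) array of (x, y) link-centre detections
    n_links    : expected number of links spanned by the detections
    """
    detections = np.asarray(detections, dtype=float)
    # Parameterise the curve by x and fit y(x) with a low-order polynomial.
    coeffs = np.polyfit(detections[:, 0], detections[:, 1], deg=2)
    xs = np.linspace(detections[:, 0].min(), detections[:, 0].max(), n_links)
    ys = np.polyval(coeffs, xs)
    return np.column_stack([xs, ys])   # one (x, y) waypoint per link

# Example: five noisy detections along a gently curved chain
dets = [(0.0, 0.05), (0.5, 0.12), (1.1, 0.22), (1.6, 0.28), (2.2, 0.41)]
print(chain_waypoints(dets, n_links=5))
```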

ROS visualization while performing chain following

These algorithms were tested in the UdG water tank using the Girona 500 AUV equipped with the ARIS3000 sonar, over a mock-up chain of 7 meters. Successful results were obtained in the link detection and path generation stages. For the following algorithm, new strategies are under development.

Girona 500 AUV equipped with ARIS sonar over the chain scenario

Pandora @Oceans’13 San Diego, California

Great success for the Pandora project in San Diego, California, USA, at the Oceans’13 conference (23-26 September 2013), one of the biggest conferences in oceanic engineering.

  • Pandora Special Sessions: The Pandora team presented seven papers, grouped into two dedicated sessions. The room was always full, with many questions showing interest from the different researchers attending the conference.
  • Speakers’ Breakfast: PANDORA Table for AUV Autonomy sessions!

  • Tali Hurtòs wins the Poster Competition: with work on novel blending techniques for sonar mosaicing, carried out in the framework of the Pandora project, Tali won the 1st prize of the student poster competition. Congratulations!!!
  • The winning poster

The conference was very well attended, with two exhibition halls, several parallel technical tracks and in-water demos at the harbour, providing a unique opportunity to showcase the Pandora project and its results.
Alongside the hard work, some time for a Hawaiian-style dinner aboard the USS Midway was well deserved by the Pandora team:

Pandora team at USS Midway for Conference Gala Dinner (Hawaiian style)

Sonar mosaicing of chain scenario

After the first successful tests with the ARIS sonar, the UdG team worked towards reproducing the chain scenario of the PANDORA project. A chain of 13 links and a total length of about 7 meters was built to simulate a real mooring chain.

Reproduction of the chain scenario at UdG’s water tank.

Before the first year review of the project, we conducted some experiments inside the UdG water tank to simulate inspection of the chain by means of sonar.

Girona-500, equipped with ARIS, was manually teleoperated along the chain, gathering images at short range in order to generate a high-resolution acoustic mosaic afterwards.

The following video summarizes the mosaicing process of the sonar images:

The figure below shows the obtained full chain mosaic:

Girona500 AUV performing visual servo control

One of the demonstrations shown during the first year review was visual servo control performed by the Girona 500 AUV in front of a valve panel. This work has been carried out by the NTUA CSL group together with UdG. Three main algorithms work together to achieve this task: a visual detector identifies the valve panel and computes relative positions to it; an EKF-SLAM algorithm combines these updates with navigation sensor measurements to localize the vehicle while mapping the panel in the world; finally, a control scheme navigates and stabilizes the vehicle in front of the detected target. The control scheme has been reported in a paper submitted to IROS 2013.
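
To give a flavour of how a panel observation corrects the navigation estimate, here is a heavily simplified sketch: the vehicle position is planar, the panel is treated as already mapped and heading is ignored, whereas the real EKF-SLAM formulation estimates the vehicle pose and the panel jointly and also fuses the navigation sensors.

```python
import numpy as np

def ekf_panel_update(x, P, z, panel_xy, R):
    """Correct the planar vehicle position with a panel-relative measurement.

    x        : (2,) vehicle position estimate [x, y]
    P        : (2, 2) estimate covariance
    z        : (2,) measured panel position relative to the vehicle (from vision)
    panel_xy : (2,) panel position in the world frame (assumed known here)
    R        : (2, 2) measurement noise covariance
    """
    # Measurement model: z = panel_xy - x  ->  H = -I  (linear, so EKF == KF here)
    H = -np.eye(2)
    y = z - (panel_xy - x)                  # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new
```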

ARIS Forward-Looking Sonar: first tests

Last month the UdG team received a new piece of equipment for the PANDORA project: the ARIS Forward-Looking Sonar (FLS). This sonar generates high-resolution acoustic images at near-video rate, and can play a key role in underwater inspections where the water visibility does not allow the use of optical cameras. In the chain scenario of the PANDORA project, the process of cleaning the chain is prone to generate turbidity in the water, which can hinder the algorithms controlling the cleaning process itself as well as the subsequent inspection. By using a forward-looking sonar we plan to work with acoustic images and overcome this lack of visibility.
We have worked towards the development of an algorithm for the generation of acoustic mosaics, and we have done some preliminary tests with the ARIS sonar in the water tank of the UdG. Although the sonar is not yet integrated with Girona-500, it has been attached to the vehicle and we have used the ARIS commercial software to gather images of several small objects placed on the bottom of the water tank. The robot was driven in a zig-zag trajectory along three different tracklines, gathering around 1500 sonar frames. The developed mosaicing algorithm was able to successfully register a high number of frames, including many loop closures, achieving the consistent mosaic shown in the figure below.
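
As a rough stand-in for the frame registration step (not the actual PANDORA pipeline), OpenCV's phase correlation can recover the translation between two consecutive Cartesian-projected sonar frames; rotation estimation and loop-closure handling would come on top of this:

```python
import cv2
import numpy as np

def register_pair(frame_a, frame_b):
    """Estimate the translation between two consecutive sonar frames.

    Phase correlation returns the (dx, dy) shift that best aligns frame_b
    with frame_a, plus a response value usable as a registration confidence.
    """
    a = np.float32(frame_a)
    b = np.float32(frame_b)
    (dx, dy), response = cv2.phaseCorrelate(a, b)
    return (dx, dy), response

# Hypothetical usage on a sequence of pre-processed (Cartesian) sonar frames:
# shifts = [register_pair(frames[i], frames[i + 1]) for i in range(len(frames) - 1)]
# Chaining the shifts (plus loop-closure constraints) yields the mosaic layout.
```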

Girona-500 with ARIS installed for the experiments.

Mosaic generated from the ARIS frames.

Panel and Valve Detection

One of the problems to be addressed in the project is the detection and localisation of an underwater panel. The aim is to have the vehicle perform a valve-turning task autonomously once the location of the panel is known. Detection of the panel is performed by comparing images from the camera with a pre-defined template. A series of images of the panel are taken and the “best” image is selected for use as the template.

The panel template is taken from a close-up image of the panel. A mask is used to highlight the static marks on the panel and ignore the valve handles.
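
A minimal OpenCV sketch of masked template matching under these assumptions: a grayscale camera image, a template cropped from the close-up panel image, and a binary mask that zeroes out the valve handles (scale and perspective changes, which the real detector must also cope with, are ignored here):

```python
import cv2

def detect_panel(image_gray, template_gray, mask):
    """Locate the panel by masked template matching.

    mask (same size/type as the template) selects the static marks and zeroes
    out the valve handles, so their orientation does not affect the score.
    """
    result = cv2.matchTemplate(image_gray, template_gray,
                               cv2.TM_CCORR_NORMED, mask=mask)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    h, w = template_gray.shape
    centre = (max_loc[0] + w // 2, max_loc[1] + h // 2)
    return centre, max_val   # panel centre in image coordinates and match score
```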

Valve detection is performed by first detecting the panel and then using the known rigid geometry of the panel to localise the valves with respect to its centre. To detect the orientation of the valves, a Hough transform is applied to detect lines within a bounding box around each valve. The orientation of each valve is then obtained by searching for lines of a specified minimum length within the bounding box.
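
A sketch of the valve-orientation step, assuming OpenCV and a bounding box already placed from the panel geometry; the edge and Hough thresholds and the minimum line length are hypothetical:

```python
import cv2
import numpy as np

def valve_orientation(image_gray, bbox, min_len_px=20):
    """Estimate a valve handle's orientation (degrees) inside its bounding box.

    bbox = (x, y, w, h), placed from the known panel geometry. Returns None
    if no sufficiently long line is found.
    """
    x, y, w, h = bbox
    roi = image_gray[y:y + h, x:x + w]
    edges = cv2.Canny(roi, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=20,
                            minLineLength=min_len_px, maxLineGap=5)
    if lines is None:
        return None
    # Keep the longest line segment and report its angle.
    x1, y1, x2, y2 = max(lines[:, 0, :],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    return float(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
```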

The figures show the detected panel and valves, highlighted with white lines, at approximate distances of 2 m and 1 m. The detection of valve orientation at large distances can be inaccurate, so it is only considered at short distances.

Feature Based Localisation and Mapping

Last week Tom Larkworthy of Heriot-Watt University (HWU) visited the University of Girona (UdG) to initiate integration of recent SLAM research onto HWU’s Nessie AUV. At UdG, Sharad Nagappa has been focusing on development of SLAM using recent advances in the field of multi-object estimation.

What is SLAM?

Simultaneous Localisation and Mapping (SLAM) is a way of improving estimates of vehicle position in unknown environments. We estimate the position of landmarks based on the current vehicle position, and we can then use knowledge of these (stationary) landmarks to infer the position of the vehicle. By relying on a fixed reference, we can reduce the error due to drift.

PHD Filter and SLAM

The Probability Hypothesis Density (PHD) filter is a suboptimal Bayes filter used for multi-object estimation, where we are required to estimate the positions of an unknown number of objects. The PHD filter performs this while eliminating the need for data association. We can combine this with SLAM by using the PHD filter to represent the landmarks. More technically, this forms a single cluster process, with the vehicle position as the parent state and the landmarks as the daughter states conditioned on the vehicle position. This formulation is a form of feature-based SLAM since we approximate landmarks as point features.
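
For intuition, the sketch below shows a heavily simplified Gaussian-mixture PHD measurement update for static point landmarks observed directly (z = landmark position + noise) with a known vehicle pose; the single-cluster parent/daughter structure described above, where all of this is conditioned on the vehicle state, is omitted, and all parameters are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gm_phd_update(weights, means, covs, measurements, p_d=0.9, clutter=1e-3, R=None):
    """One GM-PHD measurement update for static point landmarks.

    weights, means, covs : lists describing the prior Gaussian-mixture intensity
    measurements         : list of observations (z = landmark position + noise, H = I)
    """
    dim = means[0].shape[0]
    if R is None:
        R = 0.1 * np.eye(dim)
    # Missed-detection terms: every prior component survives with weight (1 - p_d) * w.
    out_w = [(1.0 - p_d) * w for w in weights]
    out_m = [m.copy() for m in means]
    out_P = [P.copy() for P in covs]
    for z in measurements:
        cand_w, cand_m, cand_P = [], [], []
        for w, m, P in zip(weights, means, covs):
            S = P + R                                  # innovation covariance (H = I)
            K = P @ np.linalg.inv(S)                   # Kalman gain
            q = multivariate_normal.pdf(z, mean=m, cov=S)
            cand_w.append(p_d * w * q)
            cand_m.append(m + K @ (z - m))
            cand_P.append((np.eye(dim) - K) @ P)
        norm = clutter + sum(cand_w)                   # clutter intensity + detections
        out_w += [w / norm for w in cand_w]
        out_m += cand_m
        out_P += cand_P
    return out_w, out_m, out_P
```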

Figure: Simulation of SLAM with a combination of detected and undetected landmarks

Detecting and Estimating Map Features

The PHD SLAM formulation relies only on a set of point observations. The algorithm does not change depending on whether we are using sonar or vision. Consequently, this offers the potential to combine these two sources using a single technique – as long as we can detect useful features from the sensors! Currently, we are relying on image feature extractors such as SURF and ORB to detect features from our stereo camera. In the coming months we will consider features from the forward-looking sonar as well as apply PHD SLAM to real data.
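
A minimal sketch of the point-feature extraction step with OpenCV's ORB; the actual front end, thresholds and the stereo triangulation into 3-D landmarks are not shown:

```python
import cv2

def extract_point_features(image_gray, n_features=500):
    """Detect ORB keypoints and descriptors to feed the map as point observations."""
    orb = cv2.ORB_create(nfeatures=n_features)
    keypoints, descriptors = orb.detectAndCompute(image_gray, None)
    # Keypoint pixel locations; with the stereo pair these would be triangulated
    # into 3-D points before being passed to the filter.
    points = [kp.pt for kp in keypoints]
    return points, descriptors
```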

Challenges For PANDORA

Computational resources are particularly constrained on AUVs, and SLAM algorithms are notoriously computationally intensive. One option available in land robotics is the use of CUDA computing architectures to brute-force around the problem, but for the underwater domain there are no suitable embedded CUDA systems. Therefore one big challenge for integration in PANDORA is adapting cutting-edge SLAM algorithms to run on our embedded systems.
Another difficulty associated with the underwater domain is combining SLAM with sonar data. Standard forward-looking sonars are unable to localise accurately in the depth dimension, so observations are underconstrained. Furthermore, sonar pings do not have the reliable high-frequency content that optical images do; this means that common feature extractors, such as SIFT, do not see anything in sonar data. In PANDORA we will be using next-generation sonars to get better sonar data into the SLAM system, and developing new feature detectors that better complement SLAM in the underwater domain.

Presentation of the Valve Panel and the new Robotic Arm

The robotic arm which will be installed on the Girona 500 arrived last week. It will be used to manipulate the valve panel. The arm has 4 degrees of freedom (DoF): slew, elevation, elbow and jaw rotation, plus the opening and closing of the jaw. It arrived without proper control software, so some time will be needed to integrate it with the Girona 500.

The new robotic arm.

On the other hand, the valve panel has been built and is ready for work in the water tank at CIRS. The first steps will focus on the visual detection of the valve panel and the valve positions. The valve panel in the simulator has also been updated to match the one built.

In this video we want to show the speed and the kind of movements the robotic arm is capable of. The movement will be quite different once the arm is installed on the robot, because the arm controller will also have to take into consideration the DoF of the robot, which makes more movements possible. To control the arm for this video we used the commercial software provided by the manufacturer.