Pandora @Oceans’13 San Diego, California

The Pandora project had great success at the Oceans’13 conference in San Diego, California, USA (23-26 September 2013), one of the largest conferences in ocean engineering.

  • Pandora Special Sessions: the Pandora team presented seven papers, grouped into two dedicated sessions. The room was always full, and the many questions showed strong interest from the researchers attending the conference.
  • Speakers’ Breakfast: PANDORA Table for AUV Autonomy sessions!

  • Tali Hurtòs wins the Poster Competition: with work on novel blending techniques for sonar mosaicing, carried out within the framework of the Pandora project, Tali won first prize in the student poster competition. Congratulations!
  • The winning poster

The conference was very well attended, with two exhibition halls, several parallel technical tracks, and in-water demos at the harbour, making it a unique opportunity to showcase the Pandora project and its results.
Alongside the hard work, the Pandora team enjoyed a well-deserved Hawaiian-style dinner aboard the USS Midway:

Pandora team at USS Midway for Conference Gala Dinner (Hawaiian style)

Feature-Based Localisation and Mapping

Last week Tom Larkworthy of Heriot-Watt University (HWU) visited the University of Girona (UdG) to begin integrating recent SLAM research onto HWU’s Nessie AUV. At UdG, Sharad Nagappa has been focusing on the development of SLAM using recent advances in the field of multi-object estimation.

What is SLAM?

Simultaneous Localisation and Mapping (SLAM) is a way of improving estimates of vehicle position in unknown environments. We estimate the position of landmarks based on the current vehicle position, and we can then use knowledge of these (stationary) landmarks to infer the position of the vehicle. By relying on a fixed reference, we can reduce the error due to drift.
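To make this concrete, here is a minimal 1-D sketch in Python. It is purely illustrative, not PANDORA code, and for simplicity the landmark position is assumed perfectly known (in full SLAM it would be estimated jointly): dead reckoning alone lets the position variance grow without bound, while occasionally re-observing the landmark pulls the estimate back.

import random

# Minimal 1-D illustration (not PANDORA code): with dead reckoning alone the
# position variance grows without bound; re-observing a fixed landmark
# (assumed perfectly known here, unlike full SLAM) pulls the estimate back.
random.seed(0)

x_true = 0.0            # true vehicle position
x_est, var = 0.0, 0.0   # estimated position and its variance
Q, R = 0.1, 0.05        # odometry / landmark-range noise variances (made up)
landmark = 10.0         # known landmark position

for step in range(20):
    # Predict: move 1 m by dead reckoning; uncertainty accumulates.
    u = 1.0
    x_true += u
    x_est += u + random.gauss(0.0, Q ** 0.5)
    var += Q

    # Update: every 5th step we "see" the landmark and measure range to it.
    if step % 5 == 4:
        z = (landmark - x_true) + random.gauss(0.0, R ** 0.5)
        innovation = z - (landmark - x_est)   # measured minus predicted range
        K = var / (var + R)                   # gain for h(x) = landmark - x
        x_est -= K * innovation               # sign flips since dh/dx = -1
        var *= 1.0 - K                        # variance shrinks after the fix
    print(f"step {step:2d}  est {x_est:6.2f}  true {x_true:5.1f}  var {var:.3f}")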

PHD Filter and SLAM

The Probability Hypothesis Density (PHD) filter is a suboptimal Bayes filter used for multi-object estimation. Here, we are required to estimate the positions of an unknown number of objects. The PHD filter performs this while eliminating the need for data association. We can combine this with SLAM by using the PHD filter to represent the landmarks. More technically, this forms a single cluster process, with the vehicle position as the parent state and the landmarks as the daughter states conditioned on a vehicle position. This formulation is a form of feature-based SLAM since we approximate landmarks as point features.
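As a concrete illustration, the sketch below implements a Gaussian-mixture PHD measurement update for the map in a 1-D world, assuming direct position measurements and one known vehicle pose; in the single cluster formulation this update would run once per vehicle-pose hypothesis. All constants are illustrative, and this is a sketch rather than the project’s implementation. Note how every measurement updates every Gaussian component, weighted by likelihood: there is no explicit data association step.

import math

# Sketch of a Gaussian-mixture PHD map update in a 1-D world, for one known
# vehicle pose. In the single cluster formulation this runs once per
# vehicle-pose hypothesis. All constants are illustrative.
P_D = 0.9        # probability of detecting a landmark
CLUTTER = 0.01   # clutter (false alarm) intensity
R = 0.25         # measurement noise variance

def gauss(x, mean, var):
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

def phd_update(components, measurements):
    """components: list of (weight, mean, var) Gaussians representing the map.
    Every measurement updates every component -- no data association step."""
    # Missed-detection terms: each landmark may simply not have been seen.
    updated = [(w * (1 - P_D), m, v) for (w, m, v) in components]
    for z in measurements:
        new = []
        for (w, m, v) in components:
            s = v + R                       # innovation variance
            k = v / s                       # Kalman gain
            new.append((P_D * w * gauss(z, m, s), m + k * (z - m), v * (1 - k)))
        norm = CLUTTER + sum(w for (w, _, _) in new)
        updated += [(w / norm, m, v) for (w, m, v) in new]
    return updated

landmark_map = [(0.5, 2.0, 1.0), (0.5, 8.0, 1.0)]   # two tentative landmarks
landmark_map = phd_update(landmark_map, [2.2, 7.7])
# A PHD property: the expected number of landmarks is the sum of the weights.
print(f"expected number of landmarks: {sum(w for w, _, _ in landmark_map):.2f}")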

Figure: Simulation of SLAM with a combination of detected and undetected landmarks

Detecting and Estimating Map Features

The PHD SLAM formulation relies only on a set of point observations. The algorithm does not change depending on whether we are using sonar or vision. Consequently, this offers the potential to combine these two sources using a single technique – as long as we can detect useful features from the sensors! Currently, we are relying on image feature extractors such as SURF and ORB to detect features from our stereo camera. In the coming months we will consider features from the forward looking sonar and apply PHD SLAM to real data.
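For example, extracting candidate point features from a camera frame with OpenCV’s ORB detector takes only a few lines. The file name below is a placeholder; in the stereo case, matched keypoints would then be triangulated into the 3-D point observations the filter consumes.

import cv2

# Extract candidate point features from a single camera frame with ORB.
# "camera_frame.png" is a placeholder; in the stereo case, matched keypoint
# pairs would be triangulated into 3-D point observations for the filter.
img = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(img, None)
points = [kp.pt for kp in keypoints]   # (x, y) image coordinates
print(f"{len(points)} candidate point features")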

Challenges For PANDORA

Computational resources are particularly constrained on AUVs, and SLAM algorithms are notoriously computationally intensive. One option available in land robotics is the use of CUDA computing architectures to brute-force around the problem, but for the underwater domain there are no suitable embedded CUDA systems. Therefore one big challenge for integration in PANDORA is adapting cutting-edge SLAM algorithms to run on our embedded systems.
Another difficulty associated with the underwater domain is combining SLAM with sonar data. Standard forward looking sonars are unable to accurately localise in the depth dimension, so observations are underconstrained. Furthermore, sonar pings do not contain the reliable high-frequency content that optical images do, which means that common feature extractors, such as SIFT, see almost nothing in sonar data. In PANDORA we will be using next generation sonars to get better sonar data into the SLAM system, and developing new feature detectors that better complement SLAM in the underwater domain.
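As a crude illustration of the kind of detector that can work where SIFT fails, the sketch below simply thresholds bright sonar returns and takes blob centroids as point features. This is purely illustrative, not the detector being developed in PANDORA, and the file name and threshold are placeholders.

import cv2

# Crude illustrative sonar point-feature detector: threshold bright returns
# and keep blob centroids. "sonar_frame.png" is a placeholder file name.
sonar = cv2.imread("sonar_frame.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(sonar, (5, 5), 0)          # suppress speckle noise
_, mask = cv2.threshold(blurred, 200, 255, cv2.THRESH_BINARY)
n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
# Skip label 0 (the background) and discard tiny blobs.
features = [tuple(c) for c, s in zip(centroids[1:], stats[1:])
            if s[cv2.CC_STAT_AREA] > 20]
print(f"{len(features)} sonar point features")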

Pandora: internationally highly visible project

The month of September has seen the FP7 Pandora project, coordinated by HWU, featured at very high-profile events across Europe:

Brussels, Belgium: MoU (Memorandum of Understanding) signing event for the Robotics Public-Private Partnership, attended by EU Commissioner Neelie Kroes and members of her cabinet. Pandora was the only highlighted project dealing with underwater technology.

Arenzano, Italy: IFAC MCMC ’12, the conference on Manoeuvring and Control of Marine Craft organised by the International Federation of Automatic Control and one of the most important AUV-related conferences, with speakers and an audience from around the world.

Southampton, UK: IEEE AUV’12, the only IEEE conference devoted entirely to AUV research and technology, with speakers and an audience from around the world.

Visit to The Underwater Centre, Fort William

On Thursday 17th May, the Heriot-Watt OSL team visited The Underwater Centre in Fort William, Scotland. The aim of the visit was to evaluate the site for PANDORA trials, and initial impressions were very favourable.

The Underwater Centre has a pier from which dive, ROV and AUV operations can be conducted, plus a large indoor tank. The indoor tank features a wreck and several columns, and there are many potential targets in the open water next to the pier, including a mock-up section of an oil rig (known as the NDT structure, for non-destructive testing). Some photos and videos we took on the day are shown below.

Water access from the pier

Access steps to the water

Looking along the pier toward the land

Looking along the pier toward the sea, showing the ROV shacks (on the right) that could be used as a base of operations

Video of the tour of the indoor tank (with more photos below)

The indoor tank from above

Viewing area, and a possible location for craning vehicles into the tank

Looking into the tank from the main viewing window

View of the wreck structure in the tank

Diver access pool for the indoor tank

Scale model of the NDT oil platform mock-up (on the right)

Access hatch from the walkway above the tank. This is a second possible route for putting vehicles in the tank

Dual-Head P900X BlueView Sonar Data

We wanted to get a feel for the data produced by the BlueView P900 sonar, so last week we ran experiments in our underwater tank. This will both tell us whether the BlueView technology is a good match for the inspection task, and provide a ground-truth sonar dataset for preliminary sonar PHD SLAM work.
At Heriot-Watt we have a 3 x 4 x 2 m water tank integrated with a Cartesian robot. This allows us to position the sonar head precisely (to 1 mm) and autonomously while gathering data. We also have a pan-and-tilt unit, giving us a total of 5 degrees of freedom with which to position the sonar head. It should be noted that the pan-and-tilt head has to be positioned manually, so it is not as precisely controlled as the XYZ axes. Furthermore, the water tank is not particularly big, which means the data is heavily corrupted by multi-path reflections and near-field anomalies, but the advantage of knowing exactly where the head is throughout the data-gathering procedure outweighs these drawbacks (especially if you are trying to train a SLAM system).
We ran the experiments by moving the head in a lawnmower pattern in the X-Y plane three times for each Z level (0, 5 cm, 10 cm); a waypoint-generation sketch for this pattern appears at the end of this post. A data collection “run” consisted of two data streams, one for each head of the sonar (mounted horizontally and vertically), for a fixed setting of the pan-and-tilt unit. We performed 7 runs in total for different pan and tilt settings. The video included here is of a single run with the pan set to -10 degrees (sensor nearly facing straight ahead) and the tilt at 45 degrees (sensor facing forwards and towards the tank floor). To produce the YouTube video, several video codecs were used sequentially, so the quality has degraded somewhat; furthermore, I was unable to sync the horizontal and vertical videos (they start together but drift). The video does, however, give a flavour of the type of data coming from the sensor, but for any serious work you should use the raw data (5 GB) provided here:

git clone git://pastis3.eps.hw.ac.uk/P900X_sonar_experiments

Raw data is saved in BlueView’s proprietary “.son” data format. You can view these files with ProViewer 3.5, the free viewer downloadable from BlueView’s website (Windows only, I am afraid, which is very annoying). That software can export the .son data to a few different video formats, although I had difficulty getting many programs to read the exported data. More technical information about the experiments is in the repository.
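For reference, the lawnmower (boustrophedon) pattern used above is simple to generate. The sketch below is illustrative; the 25 cm track spacing is an assumption, not the exact value used in the experiments.

def lawnmower(x_max, y_max, spacing):
    """Yield (x, y) waypoints sweeping the X-Y plane in a lawnmower
    (boustrophedon) pattern, reversing direction on each track."""
    y, forward = 0.0, True
    while y <= y_max:
        xs = (0.0, x_max) if forward else (x_max, 0.0)
        for x in xs:
            yield (x, y)
        y += spacing
        forward = not forward

# One sweep per Z level, as in the experiments (0, 5 cm, 10 cm); the 25 cm
# track spacing here is an assumption, not the value actually used.
for z in (0.0, 0.05, 0.10):
    waypoints = [(x, y, z) for (x, y) in lawnmower(3.0, 2.0, 0.25)]
    print(f"z = {z:.2f} m: {len(waypoints)} waypoints")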

Tom Larkworthy

PANDORA’S BOX HOLDS KEY TO FULLY AUTONOMOUS UNDERWATER ROBOTS

Scientists from a leading university in Scotland are heading up a pan-European project to create the most advanced, autonomous, cognitive robot, which could help to dramatically reduce the cost of underwater monitoring operations and maintenance within the oil and gas industry.
The team from the Ocean Systems Laboratory (OSL) at Heriot-Watt University is designing a new approach to computational cognition for use in Autonomous Underwater Vehicles (AUVs), with the aim of significantly improving the inspection, repair and maintenance reliability of vehicles used for underwater monitoring.
The system, named PANDORA (Persistent Autonomy through learNing, aDaptation, Observation and Re-plAnning), will be trialled on three different AUVs in Scotland and Spain, covering in-lab and deep-water conditions, to test the system on vehicles and monitor their ability to operate in ‘real’ environments, overcome challenges, accommodate hardware failures and seek alternative missions when idle.
Following the Deepwater Horizon crisis in 2010, calls were made to dramatically improve the type and intelligence of vehicles used for underwater inspection and intervention, to reduce the opportunity for events of that scale to recur.
The European Commission issued a call for ideas on ways to increase the thinking capacity of robots, supported by the provision of significant funding for viable projects. Identifying the opportunity to put their expertise to the test, the team of scientists, headed by Professor David Lane, founder of SeeByte, a Heriot-Watt spin-out company, applied to the Commission with a comprehensive, three-year research plan to create and develop Pandora for global commercial use. Their plan received the highest praise of all the responses, and the team was duly awarded €2.8M (£2.3M).
Professor Lane explained the background to the project: “The issue with autonomous robots is that they are not very good at being autonomous. They often get stuck or ask for help and generally only succeed in familiar environments, when carrying out simple tasks. Over the next three years, our challenge is to develop a computational programme which will enable robots to recognise failure and have the intelligence to respond to it.
“We will develop and evaluate new computational methods to make human-built robots persistently autonomous, significantly reducing the frequency of assistance requests. This is an exceptionally exciting time for us and we are delighted with the response we had from the European Commission, which has allowed us to progress with our research.”
Libor Král, Head of Unit, Cognitive Systems and Robotics, DG Information Society and Media, European Commission, said: “PANDORA is a particularly exciting robotics project undertaken by top European experts.
“The researchers have identified a real issue in an underwater environment where cutting-edge technology can help solve challenging problems. The European Commission is delighted to be supporting this latest addition to its portfolio of over 100 projects within the EU’s Seventh Framework Programme for research.”
Three core themes, tailored for the underwater environment and working synergistically together, will be explored over the course of the project. These are:

  • Structure inspection using sonar and video
  • Cleaning marine growth using water jets
  • Finding, grasping and turning valves
The project is being run in partnership with five universities across Europe – Istituto Italiano di Tecnologia, University of Girona, King’s College London and National Technical University of Athens, with steering committee members from BP and SubSea7.

More information can be found at http://osl.eps.hw.ac.uk/.

ENDS

For further information, please contact Lally Cowell, Associate Communications Director at MMM on T: 0141 221 9041 / E: lally@mmm.pr

New! – HWU Ocean Systems Lab YouTube Channel

Heriot-Watt’s Ocean Systems Lab has uploaded demonstrations of vehicles, robots and projects run from within the laboratory over the last twenty years. World firsts include soft dextrous underwater actuators, autonomous docking and intervention with vehicles and manipulators, autonomous pipeline and riser inspection, collaborating vehicles in service-oriented architectures with goal-based mission planning, student competition vehicles and more. You can follow these achievements here.

Ocean Systems Lab’s Cartesian Robot


This is one of our local testing assets at Heriot-Watt. It is a 3 m by 2 m pool, 2 m deep, with a Cartesian robot suspended above. We will be using this asset to generate training data for our vision/sonar inspection tasks. The robot allows us to precisely position sensors in the water, providing us with positional ground truth for testing localisation algorithms. The pool is clearly not large enough to run missions in, but it is also very useful to have a test tank in which to move our robot, NESSIE V, about before we undertake serious missions away from the University.

We are currently making the robot compatible with ROS, so that the same software that moves the real AUV can be used to control the Cartesian robot’s position (although the Cartesian robot is only 3DOF versus the AUV’s 5DOF).
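As a flavour of what that compatibility looks like, here is a minimal rospy sketch that publishes a 3-DOF position setpoint. The topic name, frame name and message type are assumptions for illustration, not the lab’s actual interface.

#!/usr/bin/env python
# Minimal rospy sketch: publish a 3-DOF position setpoint for the Cartesian
# robot. Topic and frame names are assumptions, not the lab's real interface.
import rospy
from geometry_msgs.msg import PointStamped

rospy.init_node("cartesian_commander")
pub = rospy.Publisher("/cartesian/position_setpoint", PointStamped,
                      queue_size=1)
rospy.sleep(1.0)   # give the publisher time to connect

cmd = PointStamped()
cmd.header.stamp = rospy.Time.now()
cmd.header.frame_id = "tank"                            # assumed frame name
cmd.point.x, cmd.point.y, cmd.point.z = 1.5, 1.0, 0.5   # metres
pub.publish(cmd)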