Acceleration and optimization of complex computing calculations


GRASP aerosol algorithm

Technology abstract

An Austrian SME successfully enhanced satellite data processing algorithms using novel High Performance Computing (HPC) techniques, achieving higher performance, testability and portability by restructuring and modularizing the code. The approach has been adapted to the finance sector, manufacturing logistics, and weather forecasting. Partners are sought for applying it in, e.g., oil & gas exploration or insurance.

A versatile concept; its applicability to big data makes it particularly interesting.

- Andrea Kurz -


Technology Description

Algorithms play an important role in scientific, engineering, production and even financial contexts. Yet their practical use is sometimes limited by long calculation times. This creates demand for faster algorithms: in research to reduce calculation times, and in industry to reduce cost, capture new markets and enable near real-time applications.
The HPC techniques developed by the company can identify and resolve performance hotspots (sections of code that are frequently or repeatedly executed) and optimization candidates, enhancing the execution speed of top hotspots by a factor of up to 100. Higher performance, better scalability and better resource utilization are achieved by:

  • looking for and implementing fine grain parallelism,
  • optimizing the memory access patterns,
  • optimizing memory consumption,
  • restructuring code and enhancing its functionality, which also supports testability and portability,
  • employing highly parallel hardware (GPGPU, Xeon Phi, Multi-CPU clusters).
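As a minimal sketch of the first two points (not the company's actual code), the hypothetical hotspot below sums a large row-major matrix. Ordering the loops so the inner loop has unit stride optimizes the memory access pattern, and the independent per-row partial sums expose fine-grain parallelism that, e.g., OpenMP could exploit:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical hotspot: summing a large row-major matrix.
// Traversing it column-by-column would touch memory with a stride of
// `cols`, defeating the cache; with rows outermost, every access in
// the inner loop is sequential (unit stride).
double sum_cache_friendly(const std::vector<double>& m,
                          std::size_t rows, std::size_t cols) {
    double total = 0.0;
    // Fine-grain parallelism: each row is an independent partial sum,
    // so the outer loop could be split across cores, e.g. with
    // "#pragma omp parallel for reduction(+:total)".
    for (std::size_t r = 0; r < rows; ++r)
        for (std::size_t c = 0; c < cols; ++c)  // unit-stride inner loop
            total += m[r * cols + c];
    return total;
}
```

On large matrices, this loop ordering alone can change a memory-bound hotspot's runtime by a large factor, before any parallel hardware is involved.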

The technique applies to tuning and restructuring any class of algorithm, from physical modelling to simulations and machine learning. These applications often involve huge data sets generated in:

  • real-time scenarios, where a given processing time per frame must not be exceeded to avoid congestion or loss of data.
  • reprocessing activities, where the accumulated data for an instrument or a field campaign is analyzed.
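The real-time case can be sketched as a simple deadline check (an illustrative fragment, not the actual pipeline): each frame must finish within a fixed time budget, and a missed deadline signals impending congestion or data loss.

```cpp
#include <chrono>

// Hypothetical real-time constraint: at 200 frames/s, each frame has a
// 5 ms processing budget. Exceeding it means the input queue backs up
// and, eventually, data is dropped.
bool within_frame_budget(std::chrono::microseconds elapsed,
                         std::chrono::microseconds budget) {
    return elapsed <= budget;
}
```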

The techniques employed were developed to optimise two highly complex satellite data processing algorithms. Algorithms in the fields of Remote Sensing, Automotive, Air & Space, Weather and Climate, Medicine, Telecommunication, and Oil & Gas can typically benefit from similar treatment, particularly for adapting software initially developed for research projects to a production environment.

Innovations & Advantages

For ESA, optimisation of two highly complex satellite data processing algorithms accelerated the top performance bottleneck by a factor of over 100, and brought an end-to-end cost reduction of around 25% when incorporating GPGPU technology.
The R&D community is full of ideas for new algorithms, but bringing an idea to fruition can take several years. The earlier an algorithm can be applied to large data sets, the sooner its potential can be assessed. To speed up this assessment, overly complex custom routines that often impede experimentation with large data sets are replaced wherever possible by optimized general-purpose implementations. If necessary, scientific code written in Matlab, IDL, or Python is ported to a native compiled language.
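As an illustrative (hypothetical) example of such a replacement, a hand-written quadratic sort inherited from research code can be swapped for the standard library's optimized `std::sort` with identical results:

```cpp
#include <algorithm>
#include <vector>

// Hypothetical research-code hotspot: measurements were ranked with a
// hand-written O(n^2) selection sort. std::sort (introsort, O(n log n))
// is an optimized general-purpose replacement producing the same output.
std::vector<double> rank_measurements(std::vector<double> v) {
    std::sort(v.begin(), v.end());
    return v;
}
```

The same pattern applies to FFTs, linear algebra, and interpolation routines, where tuned library implementations typically outperform bespoke research code.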
For industry, advantages lie in:

  • higher performance
  • faster results
  • better scalability
  • better resource utilization
  • reduced cost for large data sets
  • improved testability
  • portability to future technologies
  • reprocessing of previous exploration campaigns
  • more effective adaptation of experimental algorithms to real production environments

Finally, costs for processing large data sets are reduced – as a by-product of meeting performance and scalability goals.

Further Information

Current and Potential Domains of Application

For ESA, two highly complex satellite data processing algorithms, designed to detect aerosol and surface characteristics and atmospheric gas composition, were optimised.
The approach has so far been adapted to:

  • Automated material testing: an optical coherence tomography algorithm on GPGPU devices that processes image data acquired from an interferometer in real time at over 200 images per second.
  • Crane stability calculations.
  • Optimized contract risk calculation rates for the financial industry, opening up new opportunities for business analysis.
  • Parallel machine learning algorithms to optimize flight plans for airlines.

Further application domains include, for example:

  • machining, aeronautics, insurance, oil and gas exploration