Areas of focus

The Netherlands is excellently positioned in the global computational sciences in five strategic focus areas. Together, these five focus areas provide the tools to address the challenges in complex sustainability issues.

Multiscale modeling and simulation

Can you perfectly predict how a material behaves if you know which atoms it contains? With a perfect computer, this should be possible by consistently computing the quantum mechanics of the smallest particles. In practice, this turns out to be feasible only for ultra-short phenomena involving very small numbers of particles. In the real world, by contrast, processes play out simultaneously on many different length and time scales, with fast and slow processes influencing each other.

The research area of multiscale modeling and simulation (MMS) focuses on techniques to describe such systems by linking elementary algorithmic building blocks for different length and time scales.
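The idea of linking building blocks across scales can be made concrete with a minimal, purely illustrative sketch. The model below is hypothetical (the formulas and numbers are not from the text): a stand-in for an expensive fine-scale computation supplies an effective coefficient that a coarse-scale model consumes at each step.

```python
import math

def fine_scale_diffusivity(temperature):
    """Stand-in for an expensive microscopic simulation: returns an
    effective diffusion coefficient at this temperature (toy formula)."""
    return 1.0e-3 * math.exp(-500.0 / (temperature + 273.15))

def coarse_scale_step(concentration, temperature, dt=1.0):
    """Macroscopic update that uses the fine-scale result: this call
    is the coupling point between the two scales."""
    d_eff = fine_scale_diffusivity(temperature)
    return concentration * math.exp(-d_eff * dt)

c = 1.0
for step in range(10):  # slow, coarse time scale
    c = coarse_scale_step(c, temperature=25.0)
```

In a real MMS workflow the fine-scale call would itself be a full simulation (or a surrogate for one), and the coupling often runs in both directions.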

The purpose of multiscale modeling and simulation (MMS)

The goal is to develop simulations of phenomena in complex systems, such as the electrochemistry in batteries, nuclear fusion in a reactor, and CO2 sequestration. The scope of MMS extends to the exact sciences as well as to the social sciences, engineering, and industry, and it requires a multidisciplinary approach.

An integrated systems approach is highly relevant at a time when challenging sustainability issues are becoming increasingly important, particularly in the areas of water, food, and energy. Such an approach includes all processes across the life cycle and even beyond, including recycling.

Because almost all technologies that can contribute to a sustainable future are multiscale in nature, MMS will play a crucial role in this transition, and it is therefore an important focus area within the computational sciences.

Data-driven methods

These days we collect large amounts of data in the blink of an eye. Valuable information lies hidden in this unstructured data. By applying new methods to analyze it, we discover new insights. Combined with machine learning, this allows us to develop predictive models that provide solutions to a variety of issues. This requires a systematic approach, such as high-throughput screening, which allows us to run many tests quickly.
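High-throughput screening can be sketched in a few lines. The example below is hypothetical: the candidate list, the "capacity" scoring formula, and all numbers are invented for illustration; real screening pipelines evaluate thousands of candidates with physics-based or learned models.

```python
def predicted_capacity(candidate):
    """Cheap scoring model (illustrative formula, not a real one)."""
    return candidate["density"] * candidate["voltage"]

# Hypothetical candidate materials with made-up properties
candidates = [
    {"name": "A", "density": 2.1, "voltage": 3.2},
    {"name": "B", "density": 1.8, "voltage": 3.9},
    {"name": "C", "density": 2.5, "voltage": 2.4},
]

# Screen all candidates quickly and rank them by predicted score
ranked = sorted(candidates, key=predicted_capacity, reverse=True)
top = ranked[0]["name"]
```

Only the highest-ranked candidates would then go on to expensive detailed simulation or laboratory testing.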

Rapid solutions

In contrast to time-intensive trial and error, this rational, systematic data-driven approach enables us to find solutions to urgent, large-scale issues around sustainability, responsible materials, health, and climate adaptation at lightning speed.

Data-driven methods are crucial for analyzing huge amounts of data and for developing mathematical prediction models. Both are needed to develop sustainable, climate-neutral, and circular production of high-quality plant- and animal-based food, thus reducing our carbon footprint.

Guidelines for data-driven methods

The success of data-driven methods depends on the availability of large amounts of data. When developing data-driven methods, it is therefore important to adhere to guidelines for describing, storing and publishing data, according to the FAIR principle: Findability, Accessibility, Interoperability, and Reusability. In addition, metadata such as creation date, quality and owner are essential for searching and navigating datasets.
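A minimal sketch of such a metadata record is shown below. The field names and values are illustrative (they follow the spirit of the FAIR principles and the metadata examples in the text, not a formal metadata standard), and the identifier is a placeholder, not a real DOI.

```python
from dataclasses import dataclass, asdict

@dataclass
class DatasetMetadata:
    identifier: str     # findable: a persistent ID (e.g. a DOI)
    title: str
    creation_date: str  # metadata named in the text
    owner: str          # metadata named in the text
    quality: str        # metadata named in the text
    license: str        # reusable: conditions for reuse
    file_format: str    # interoperable: open, documented format

meta = DatasetMetadata(
    identifier="doi:10.0000/example",  # placeholder, not a real DOI
    title="Battery electrolyte screening results",
    creation_date="2024-01-15",
    owner="Example Lab",
    quality="curated",
    license="CC-BY-4.0",
    file_format="CSV",
)
record = asdict(meta)  # plain dict, ready to publish alongside the data
```

Publishing such a record alongside each dataset is what makes repositories searchable and datasets combinable.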

There are now several platforms with data repositories that facilitate organizing, searching, sharing, and using high-quality data. There is a great need, and also a challenge, to integrate datasets from various sources so that their full potential can be exploited.

Ultra-fast computer simulations via machine learning

Within computational science, a new approach to accelerating simulations is emerging. With lightning-fast machine learning, such as deep neural networks, models can be trained that allow parts of calculations to be performed thousands or even millions of times faster. This use of machine learning is known as surrogate modeling. The approach is evolving rapidly, with new and more effective methods being developed continuously, bringing innovative applications within reach.

Training machine learning models

In computational science, the efficient description of atomic and molecular interactions is a central ingredient. Surrogate models have paved the way here for robust, data-driven models. In this approach, computationally intensive, detailed simulations are used to train machine learning models, which can then predict the outcome of new computations. These superfast models make it possible to investigate a wide range of materials for energy storage and conversion, materials for water purification, and complex pharmaceuticals, among others.
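The train-then-replace workflow can be sketched as follows. Everything here is a toy: the "expensive simulation" is a trivial stand-in function, and the surrogate is a least-squares line rather than the deep neural networks mentioned above; the pattern (expensive runs generate training data, the cheap fit answers new queries) is the point.

```python
import statistics

def expensive_simulation(x):
    """Stand-in for a detailed physics computation
    (in reality: hours or days of compute per call)."""
    return 2.0 * x + 0.5

# 1. Run a small number of expensive simulations to get training data
xs = [0.0, 1.0, 2.0, 3.0]
ys = [expensive_simulation(x) for x in xs]

# 2. Fit a simple surrogate (least-squares line; real surrogates
#    are typically neural networks or Gaussian processes)
mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

def surrogate(x):
    """Millisecond-scale replacement for the expensive simulation."""
    return slope * x + intercept

# 3. Query the surrogate instead of rerunning the detailed model
prediction = surrogate(10.0)
```

Because the toy "simulation" is exactly linear, the surrogate here is exact; in practice the surrogate's accuracy must be validated against held-out detailed runs.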

A completely different application of surrogate models is the accurate and efficient modeling of turbulence, a decisive factor in the success of applications such as circular water treatment, wind energy, fuel cells, and concentrated solar power (CSP).

Future of surrogate modeling

Surrogate modeling is rapidly evolving and carries great promise for rapid applications to virtually all major societal challenges. The “low-hanging fruit” is expected to be harvested soon, while for the longer term the challenge is to lift the field beyond “new promise” status.

Consider, for example:

  • creating surrogate models that are transferable to other domains
  • using efficient surrogate models to virtually examine all kinds of designs and predict optimal solutions, without time-intensive real-life testing
  • combining machine learning techniques with physics insights

Much research is still needed for such techniques and their combinations.

Uncertainties and sensitivity analyses

Without a good estimate of reliability, simulations are meaningless. To make accurate predictions and support decision-making, it is critical to know the uncertainty of model outputs, and how sensitive they are to variations in the input data.

Reliance on predictions

Inaccuracies in simulations can arise for a variety of reasons: invalid assumptions, poorly defined coefficients, or measurement errors in the initial conditions. Determining these uncertainties is computationally intensive, because hundreds or thousands of simulations are sometimes required. Nevertheless, good uncertainty and sensitivity analysis increases the informative value of predictions. We can then not only predict the outside air temperature, but also indicate a "plume" within which the temperature will lie with 95% probability. The same goes for forecasts of Covid-19, stock prices, or mortgage rate trends.
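The "plume" idea can be illustrated with a minimal Monte Carlo ensemble. The forecast model below is a deliberately trivial stand-in, and the measurement error of 0.8 degrees is an assumed, illustrative value: the same forecast is run many times with perturbed initial conditions, and the central 95% of outcomes forms the plume.

```python
import random

random.seed(0)  # reproducible toy example

def toy_forecast(initial_temp):
    """Stand-in for a weather model: cools 0.5 degrees
    per hour over a 6-hour horizon."""
    return initial_temp - 0.5 * 6

measured = 15.0  # measured initial temperature (degrees C)
sigma = 0.8      # assumed measurement error (illustrative)

# Run the forecast for 1000 perturbed initial conditions
ensemble = sorted(toy_forecast(random.gauss(measured, sigma))
                  for _ in range(1000))

# The central 95% of ensemble members gives the "plume"
low, high = ensemble[24], ensemble[974]
```

Note the cost the text warns about: one forecast became a thousand. This is exactly why efficient uncertainty algorithms are a research topic.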

The development of reliable results

In several application areas, such as drug development, food safety, and flight safety, committees exist to ensure that calculated results are reliable. Meeting their requirements demands sufficient qualified personnel who can apply state-of-the-art methods.

Taking the focus area a step further requires new developments, such as more efficient algorithms. These must become up to ten times faster, because the complexity of the applications under study is increasing: models must, for example, include multiple time and length scales in their calculations.

The challenge here is to develop methods that can estimate what effects errors at the individual scales have on the coupled models. It is also important to examine the validity of assumptions in order to make computational methods more reliable.

Role of machine learning

Uncertainty analysis even extends to the choice of which models to use for a problem. For example, probability-theory techniques are being developed to choose automatically between different computational models. The rise of machine learning also presents new opportunities and challenges for uncertainty analysis: most uncertainty analysis methods become much more computationally intensive as the number of uncertain parameters grows. Machine learning could help predict which parameters matter most.
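Identifying which parameters matter most can be illustrated with the simplest form of sensitivity analysis: perturb each uncertain parameter one at a time and compare the response of the output. The model and numbers below are purely illustrative; real studies use variance-based methods (and, increasingly, machine learning) on far larger parameter sets.

```python
def model(params):
    """Toy model with two uncertain parameters (illustrative only)."""
    return 3.0 * params["a"] + 0.1 * params["b"] ** 2

baseline = {"a": 1.0, "b": 2.0}
base_out = model(baseline)

# One-at-a-time sensitivity: perturb each parameter separately
eps = 0.01
sensitivities = {}
for name in baseline:
    perturbed = dict(baseline)
    perturbed[name] += eps
    sensitivities[name] = abs(model(perturbed) - base_out) / eps

# The parameter with the largest response dominates the uncertainty
most_important = max(sensitivities, key=sensitivities.get)
```

Each extra uncertain parameter adds another model evaluation here, and full variance-based methods scale far worse, which is the cost growth the paragraph above describes.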

Energy efficient computing

Modern computers require more and more energy: the world’s largest supercomputer even requires a separate power plant. A move to other, more energy-efficient types of computers, such as the quantum computer or neuromorphic computer chips, could curb this energy consumption. But using such computers efficiently requires new models and computational schemes.

The Dutch computational science community is a global leader in this research. Steps are also being taken toward even more efficient computers, programmed partly in software and partly in hardware. When programming in hardware, connections are made between specific computational processors to speed up computation. Developing good algorithms for this is even more challenging, but efficiency gains of a factor of ten have already been achieved.

The Netherlands is also a leader in quantum computing. Quantum computers can, in theory, solve large computational jobs in far fewer computational steps. The number of quantum algorithms available today is limited, but they offer enormous potential for efficiency improvements.

In the future, energy-efficient computing will need to be developed further to keep the energy consumption of computers manageable. This requires investment to address the following challenges:

  1. Current artificial intelligence algorithms are often started with a "clean slate," i.e. from scratch, which often requires long computations before the information is up to date.
  2. Computing with adjustable precision can yield large energy savings.
  3. Applying existing software to new problems speeds up the development of new simulation packages, but that software is usually not optimized for the new application.
  4. In many calculations, transporting the data to the computational units costs the most energy and time.
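The second point, adjustable precision, can be illustrated with a toy comparison: the same sum computed in standard 64-bit floats and in simulated 32-bit floats. Halving the precision halves the bytes that must be stored and moved (the transport cost from point 4), at the price of a small rounding error. The rounding helper below is a simulation trick using Python's standard library, not real reduced-precision hardware.

```python
import struct

def to_float32(x):
    """Round a 64-bit Python float to 32-bit precision by packing
    and unpacking it as a single-precision value."""
    return struct.unpack("f", struct.pack("f", x))[0]

# A thousand terms of the harmonic series as test data
values = [1.0 / (i + 1) for i in range(1000)]

# Full 64-bit sum vs. a sum where every intermediate is 32-bit
sum64 = sum(values)
sum32 = 0.0
for v in values:
    sum32 = to_float32(sum32 + to_float32(v))

error = abs(sum64 - sum32)  # the price paid for lower precision
```

Whether such an error is acceptable depends entirely on the application, which is why the precision must be adjustable rather than simply reduced.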

In short, this research area has many challenges but can contribute immensely to the new generation of energy efficient computers. Investment in this area will ensure that research in the Netherlands can maintain, and if possible improve, its top position.