RP 5 (Principal Developer: IMI-BAS)


Modeling, simulation and optimization (MSO) is an integral area that has been called the third pillar of scientific progress and innovation. The need for advanced MSO methods is highly pronounced in view of the importance of HP computing and ICT for science and technology-driven industry. Variational methods, stochastic methods and approximation methods hold special places in MSO. Variational methods are increasingly attractive in recent investigations of smart grids. Optimization is a priority area in industrial mathematics, involving discrete and continuous optimization with all types of constraints, optimization of complex systems with many constraints, numerical analysis of optimization problems, and software development. Control and dynamics of real-world processes is another priority area, in particular model-predictive control of dynamic processes. The stringent requirements for safety and reliability of some models (in medicine, architecture, robotics, etc.) demand methods for validation, verification and uncertainty quantification. The stochastic approach is the most widely used in this respect; alternatively, efficient interval algorithms are finding more and more applications. Nowadays, statistics is part of Big Data research, helping to screen and split big data and to reduce noise, false interpretations, etc. Special approximation methods and algorithms based on needlets and wavelets are being developed for comparing big data and for data compression. The needlet algorithms solve tested model problems about 1000 times faster than the classical algorithms and thus make feasible the comparison of real Big measurement data, arising for example in climate models, with the values predicted by the model. In recent years, low-rank and tensor approximation methods have also found increasing application in engineering problems employing computer-aided geometric design (CAGD).
In general, approximation techniques provide an interface between computational mathematics and the computer architectures supporting floating-point arithmetic. The development of specialized algorithms and software is an indispensable part of MSO; embedding them in HP computer architectures is a recent challenge of particular importance.

1. Control and optimization in technology-driven applications

Methods and algorithms of control and optimization play a crucial role in virtually all branches of modern science and technology. The research, as part of ICT approaches in machine-building, medicine and in relation to the other three thematic areas of ISSS, will include: (i) development of new variational techniques for modeling continuous optimization phenomena; (ii) methods for numerical and computational analysis of optimization problems in power supply and distribution (e.g. smart grids), design of medical equipment, and electromagnetic processing of materials; (iii) model-predictive control of dynamic processes in demography, human capital accumulation, environmental protection (e.g. emissions abatement) and epidemiology, together with the development of solvers for their simulation; (iv) discrete optimization models in bioinformatics (such as finding the 3D structure of proteins) and homology modeling of human receptors with application to drug synthesis. All of the above items inherently require advanced HPC infrastructure for software development and simulations.
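The model-predictive control of item (iii) rests on a receding-horizon idea: at each step, optimize a short sequence of future controls against a predictive model, apply only the first control, and re-optimize. A minimal sketch of this mechanism (with an assumed scalar linear plant, quadratic cost and brute-force search over a control grid, not any model from the project) is:

```python
# Minimal receding-horizon (model-predictive) control sketch.
# Plant model, horizon, cost weights and control grid are illustrative assumptions.
from itertools import product

A, B = 1.1, 0.5                         # scalar linear plant x' = A*x + B*u (unstable open loop)
HORIZON = 3                             # prediction horizon
U_GRID = [u / 4 for u in range(-8, 9)]  # admissible controls in [-2, 2]

def cost(x, us, r=0.01):
    """Quadratic cost of a candidate control sequence, simulated on the model."""
    total = 0.0
    for u in us:
        x = A * x + B * u
        total += x * x + r * u * u
    return total

def mpc_step(x):
    """Pick the first move of the cheapest control sequence over the horizon."""
    best = min(product(U_GRID, repeat=HORIZON), key=lambda us: cost(x, us))
    return best[0]

x = 5.0
for _ in range(15):                     # closed loop: re-optimize at every step
    x = A * x + B * mpc_step(x)
print(f"final state: {x:.3f}")          # controller drives the state near the origin
```

Real applications replace the exhaustive grid search with a structured solver (quadratic programming for linear-quadratic problems), but the re-optimize-then-apply loop is the same.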

2. Stochastic approach in medicine, science and industry

The continued increase in the size, dimensionality and number of variables in real-life mathematical models results in data of unprecedented volume and complexity, and creates new challenges that traditional approaches from applied mathematics cannot address. Stochastic techniques are powerful in constructing rich models that capture the complexity of Big Data from real-life phenomena. Our research in this direction will be part of ICT approaches in machine-building, medicine and industry, in particular: (i) developing advanced mathematical and statistical methods applicable to genetic data and genetic analysis. The most common data analysis of experiments related to RNA and DNA sequencing is differential gene expression analysis using statistical tests for multiple comparisons; a novel, computationally efficient test-calibration technique will be proposed to reduce the conservativeness of some of these tests; (ii) studying the spectral properties of stochastic processes to allow effective and robust estimation of the reliability of industrial processes; (iii) modeling and analyzing data from nuclear power plants to estimate the capacity and safety of low-level waste storage. Processing Big Data requires advanced HP computing infrastructure and the development of corresponding algorithms, software and parallel data-handling methods for statistical analysis.
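A standard baseline for the multiple-comparison setting of item (i) is the Benjamini-Hochberg step-up procedure for controlling the false discovery rate; any calibration technique aimed at reducing conservativeness would be compared against procedures of this kind. The sketch below (pure Python, with made-up p-values rather than real sequencing data) shows the mechanics:

```python
# Benjamini-Hochberg step-up procedure for false discovery rate (FDR) control.
# The p-values below are illustrative, not real differential-expression results.

def benjamini_hochberg(pvalues, alpha=0.05):
    """Return the indices of hypotheses rejected at FDR level alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= (k/m)*alpha, then
    # reject the hypotheses with the k smallest p-values.
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k = rank
    return sorted(order[:k])

pvals = [0.001, 0.004, 0.019, 0.03, 0.031, 0.2, 0.33, 0.46, 0.6, 0.9]
print(benjamini_hochberg(pvals))   # → [0, 1]
```

Unlike a Bonferroni correction, which compares every p-value against alpha/m, the step-up rule adapts its threshold to the rank, which is why it rejects more hypotheses while still controlling the expected proportion of false discoveries.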

3. Approximation techniques in 3D digitalization, Big Data, web-based computation services

Efficient algorithms based on approximation techniques are in high demand for the computational treatment of various important problems arising, e.g., in biology, medicine, other natural sciences and engineering. The main lines of research in this task will include: (i) development of needlet and wavelet methods and construction of highly localized frames for the design of fast and efficient algorithms for data compression, climate models and their 3D digitalization (aligned with the new equipment of the planned Lab for 3D digitalization), and comparison with huge amounts of real measurement Big Data; (ii) design of multi-dimensional and low-rank spline approximation algorithms with applications in engineering problems based on computer-aided design models and cloud technologies; these require storage and processing of Big Data and rely on HP computing; (iii) new algorithms and software for solving linear algebraic equations involving dependent uncertain parameters, which arise in finite element models, models in robotics, etc., to be embedded in web-based applications providing remote, interactive and dynamic computational services for end users. These must be installed on advanced computing infrastructure so that the new methods can demonstrate their power and spread quickly.
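The compression methods of item (i) share a common mechanism: transform the data into a wavelet basis, discard the small coefficients, and reconstruct with controlled error. The Haar wavelet, the simplest case, makes this concrete; the sketch below (a toy one-dimensional signal, standing in for the large climate data sets the project targets) shows the transform-threshold-reconstruct cycle:

```python
# One-dimensional Haar wavelet compression sketch (toy signal, not climate data).

def haar_forward(signal):
    """Full multilevel Haar transform of a length-2^n sequence."""
    coeffs, approx = [], list(signal)
    while len(approx) > 1:
        avg = [(approx[i] + approx[i + 1]) / 2 for i in range(0, len(approx), 2)]
        det = [(approx[i] - approx[i + 1]) / 2 for i in range(0, len(approx), 2)]
        coeffs.append(det)
        approx = avg
    return approx[0], coeffs[::-1]      # overall mean, details from coarse to fine

def haar_inverse(mean, coeffs):
    """Reconstruct the signal: each pair is (average + detail, average - detail)."""
    approx = [mean]
    for det in coeffs:
        approx = [v for a, d in zip(approx, det) for v in (a + d, a - d)]
    return approx

signal = [4.0, 4.1, 4.0, 3.9, 8.0, 8.2, 8.1, 8.0]
mean, details = haar_forward(signal)
# Compression step: zero out the small detail coefficients.
details = [[d if abs(d) > 0.1 else 0.0 for d in level] for level in details]
restored = haar_inverse(mean, details)
err = max(abs(a - b) for a, b in zip(signal, restored))
print(f"max reconstruction error: {err:.3f}")   # small despite discarded coefficients
```

Needlet constructions follow the same localize-and-threshold principle on the sphere, which is what makes them suitable for global climate data; the Haar case is only the simplest illustration of the idea.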