Solved problems


Life Prediction for Service Loads


A project on a new reliability method for Volvo Aero started in 2005. Based on a few worked-out examples from the company, GMMC researchers have written a handbook of instructions for a reliability approach based on the information available. The method has also been introduced in an internal course at the company and is included in its practice. For the aircraft engine applications at Volvo Aero, fatigue life predictions are based on specified thermal and external loads, material data, nominal structural geometry, mathematical models describing thermal and mechanical deformation, and models for damage accumulation. In cooperation with Volvo Aero, a special reliability method has been designed for cases where knowledge about the statistical distributions of the input variables and the model uncertainties is weak. The method is based only on estimated variances, without reference to distribution types, which can be seen as an adjustment to the amount of prior knowledge that is actually at hand in practice. The method has been used on several Volvo Aero applications and has proved useful. In particular, it has been helpful (1) to find the weakest links in the reliability assessment, (2) to establish relevant safety factors, and (3) to update the reliability assessment based on partial new knowledge.
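As a rough illustration of what a variance-only (second-moment) reliability calculation can look like, the sketch below propagates estimated input and model-error variances to the variance of the predicted log-life, ranks the contributions to point out the weakest links, and derives a safety factor. The sensitivities, variances and the log-life formulation are hypothetical placeholders, not Volvo Aero's actual model.

import numpy as np

def total_log_life_variance(sensitivities, variances):
    """First-order propagation of uncertainty: Var[log N] ~ sum_i c_i^2 Var[x_i]."""
    s = np.asarray(sensitivities, dtype=float)
    v = np.asarray(variances, dtype=float)
    return float(np.sum(s**2 * v))

def life_safety_factor(var_log_life, k=2.0):
    """Multiplicative safety factor on predicted life from k standard deviations on the log scale."""
    return float(np.exp(k * np.sqrt(var_log_life)))

# Hypothetical inputs: scatter in loads, material data, geometry, and model error
sens = np.array([1.0, -2.5, 1.8, 1.0])        # sensitivities d(log N)/d(x_i) from the life model
var = np.array([0.010, 0.004, 0.002, 0.020])  # estimated variances of the inputs / model errors
contributions = sens**2 * var
v_tot = total_log_life_variance(sens, var)

print("total variance of log-life:", v_tot)
print("weakest links (largest contributions first):", np.argsort(-contributions))
print("safety factor on life:", life_safety_factor(v_tot))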
 


Robust portfolio optimisation


How does one trade in the stock market in the best possible way? It is well known that when classic optimal portfolio strategies are applied with parameters estimated from data, the resulting portfolio weights become very unstable and volatile. The predominant explanation is the difficulty of estimating the expected returns with sufficient accuracy. A new approach has been developed at GMMC which gives stable and robust estimates of expected returns. FCC has implemented this technology as an Excel plug-in called RoPOX for computing optimal portfolio weights and judging portfolio stability. The new proposal builds on the observation that an n-stock Black-Scholes model can be modelled as an n-factor arbitrage pricing theory model in which each factor has the same expected return. The idea of identifying the relevant risk factors, assigning each of them the same fraction of wealth and then transforming back to the world of shares is quite intuitive, and it has the advantage that the problem of estimating the drift rates of stock prices is avoided. The new ideas have been analysed, with very encouraging results, in a joint pre-study by the Second Swedish National Pension Fund/AP2 and FCC.
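One possible reading of this construction is sketched below, under the assumption that the risk factors are taken from a Cholesky factorisation of the estimated return covariance matrix (playing the role of the Black-Scholes volatility matrix): every factor receives the same fraction of wealth, and the allocation is mapped back to stock weights, so only the covariance, never the drift, has to be estimated. The factorisation choice and the normalisation are illustrative assumptions and not necessarily what RoPOX does.

import numpy as np

def factor_based_weights(returns):
    """Sketch: treat the columns of a Cholesky factor of the covariance matrix as
    risk factors, give every factor the same fraction of wealth, and map the
    factor allocation back to stock weights.  Only the covariance (not the drift)
    is estimated from data."""
    cov = np.cov(returns, rowvar=False)       # n x n covariance of stock returns
    sigma = np.linalg.cholesky(cov)           # volatility matrix: cov = sigma @ sigma.T
    n = cov.shape[0]
    c = np.full(n, 1.0 / n)                   # equal wealth fraction per factor
    w = np.linalg.solve(sigma.T, c)           # transform back to the world of shares
    return w / np.sum(w)                      # normalise to a fully invested portfolio

# Illustrative usage with simulated daily returns for 5 stocks
rng = np.random.default_rng(1)
rets = rng.normal(0.0005, 0.01, size=(250, 5))
print(factor_based_weights(rets))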
 


Combustion engine optimization


It is nowadays possible to use computational fluid dynamics (CFD) methods to simulate the physical and chemical processes inside combustion engines. By varying the design parameters of the engine, different configurations can be simulated and their performance compared over a set of representative engine load conditions. The goals when designing a new engine are typically in conflict: highly efficient engines have high burning temperatures, which may result in the formation of large amounts of nitrogen oxides, so that low emissions have to be balanced against low fuel consumption. Good designs are Pareto-optimal points of a multi-criteria optimization problem which also takes into account constraints on the combustion process, such as a maximum pressure inside the engine during the combustion stroke. Even the best CFD codes can take days to run a single simulation, and the simulation will still contain numerical errors; this poses a true challenge for optimization. A team of GMMC researchers at FCC, Chalmers, Volvo Car Corporation and Volvo Powertrain Corporation has devised and tested a new algorithm, called qualSolve, on the design of a new diesel engine. For a given set of evaluated designs an explicit approximation model is constructed, based on radial basis functions. Its (approximate) Pareto-optimal set guides the search for one or more new designs, with the aim of reducing the uncertainty of the approximation model near the Pareto front. The new method avoids potential pitfalls stemming from simulation errors, and it further produces a final approximation model that can be used for post-processing of the proposed designs.
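A toy sketch of such a surrogate-assisted loop is given below: fit a radial basis function model to the evaluated designs, take the candidates whose predicted objectives are non-dominated as the approximate Pareto front, and pick the candidate farthest from the already evaluated designs as a crude proxy for where the model is most uncertain. The quality function used in qualSolve is more elaborate; the code only illustrates the general idea, and the toy objectives are made up.

import numpy as np
from scipy.interpolate import RBFInterpolator

def pareto_mask(objectives):
    """Boolean mask of non-dominated points (minimisation in every objective)."""
    n = len(objectives)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(objectives[j] <= objectives[i]) and np.any(objectives[j] < objectives[i]):
                mask[i] = False
                break
    return mask

def suggest_next_design(X, Y, candidates):
    """Fit an RBF surrogate to the evaluated designs, predict the candidates' objectives,
    and among the non-dominated candidates pick the one farthest from the evaluated
    designs (a simple proxy for surrogate uncertainty near the Pareto front)."""
    surrogate = RBFInterpolator(X, Y)                 # explicit approximation model
    pred = surrogate(candidates)
    front = candidates[pareto_mask(pred)]             # approximate Pareto-optimal candidates
    dist = np.min(np.linalg.norm(front[:, None, :] - X[None, :, :], axis=2), axis=1)
    return front[np.argmax(dist)]

# Toy usage: 2 design parameters, 2 conflicting objectives (e.g. emissions vs fuel consumption)
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(12, 2))                   # already simulated designs
Y = np.column_stack([X[:, 0] + 0.1 * X[:, 1], 1 - X[:, 0] + 0.2 * X[:, 1] ** 2])
cand = rng.uniform(0, 1, size=(200, 2))
print("next design to simulate:", suggest_next_design(X, Y, cand))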
 


Superior analysis method for industry-standard microarray experiments


Microarray experiments have for about a decade been a heavily used method for finding out which groups of genes are particularly active in different types of disease. One compares the responses of a large number of genes simultaneously between groups of patients under experimental conditions related to the mechanisms of the disease. The industry standard for microarray experiments is the multiple-probe Affymetrix type of array. For this type of array we have developed an analysis method which is considerably better than all currently used methods. In Åstrand et al. (2008) our PLW method is compared with 14 other currently used methods on five publicly available spike-in data sets. Spike-in data sets have the advantage that the correct answer is known for a number of crucial genes. It turns out that the PLW method comes out as the best method on four of these five data sets. Computer programs for the PLW method have also been made available.

Reference: Åstrand, M., Mostad, P., Rudemo, M.: "Empirical Bayes models for multiple probe type microarrays at the probe level". BMC Bioinformatics 9:156, 20 March 2008.
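The spike-in evaluation used in such comparisons can be illustrated with a small, entirely made-up example: rank the genes by the score produced by an analysis method and check how many of the known spiked-in genes appear near the top. The simulated scores below have nothing to do with the actual data sets or the PLW implementation.

import numpy as np

def spike_in_hit_rate(scores, spiked_genes, top_k):
    """Fraction of the known spiked-in genes recovered among the top_k ranked genes."""
    ranked = np.argsort(-np.asarray(scores))          # highest score = strongest evidence
    top = set(ranked[:top_k].tolist())
    return len(top & set(spiked_genes)) / len(spiked_genes)

# Made-up example: 1000 genes, 20 of them spiked in at known concentrations
rng = np.random.default_rng(3)
scores = rng.normal(0.0, 1.0, 1000)
spiked = list(range(20))
scores[spiked] += 3.0                                 # spiked genes get larger test statistics
print("hit rate in top 50:", spike_in_hit_rate(scores, spiked, 50))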
 

