Вычислительные технологии
Vol. 9, No. 6, 2004
FLOW COMPUTATIONS AT THE HIGH PERFORMANCE COMPUTING CENTER STUTTGART (HLRS)
E. Krause
Aerodynamisches Institut, RWTH Aachen, Germany
e-mail: [email protected]
A survey is given of selected numerical investigations of flow problems carried out at the High Performance Computing Center Stuttgart (HLRS), Germany, in 1998-2003. Detailed descriptions of these investigations can be found in the Transactions of the center. Some of the studies were performed at the Supercomputing Center (SSC) Karlsruhe, Germany. In the past the two centers cooperated closely, and they have recently joined to form the Center of Competence in High-Performance Computing in Baden-Württemberg. The survey covers the following topics: transitional and turbulent flows, flows about aerodynamic shapes, flows in technical applications, flows in turbomachinery, and two-phase flows.
Introduction
The High-Performance Computing Center Stuttgart, until recently closely associated with the Supercomputing Center Karlsruhe, and now joined with it to form the Center of Competence in High-Performance Computing in Baden-Württemberg, is one of the main computing centers in Germany. It supplies universities, research centers, and industry with computing power. Founded in 1996 as a federal center, it has since then made the entire spectrum of computational services available to a large number of research groups in Germany at the universities in Aachen, Augsburg, Berlin, Bayreuth, Bielefeld, Braunschweig, Chemnitz, Dortmund, Erlangen-Nürnberg, Essen, Frankfurt, Freiburg, Göttingen, Hagen, Hamburg, Hannover, Hohenheim, Kaiserslautern, Karlsruhe, Kassel, Kiel, Köln, Konstanz, Mainz, Marburg, München, Münster, Paderborn, Rostock, Saarland, Salzburg, Stuttgart, Tübingen, Ulm, and Würzburg; and also to the German Climate Research Center in Hamburg, the German Cancer Research Center, the German Center of Aero- and Astronautics (DLR) in Braunschweig, the Konrad-Zuse Center in Berlin, the Max-Planck Institute in Tübingen, and the National Hydraulic Research Institute.
During the past eight years the main production systems at the HLRS were the NEC SX-4 with 32 CPUs, the NEC SX-5 with 2-16 CPUs, and the CRAY T3E with 512 CPUs, on which a large number of projects were supported. At times more than 200 projects were processed on the machines per year; in 2002, for example, 21 new projects were approved by the steering committee of the HLRS. The largest share of machine time, about 45 percent, was used for computations in fluid dynamics, including flows of reacting gases, while for physics
about 30 percent was reserved. The rest was spent on computations in chemistry, climate research, bioinformatics, and computer science. The problems in computational fluid dynamics originated mainly in fundamental research, in part related to aerospace and automotive research. The vector-based NEC systems were preferred for flow computations because of the high memory-bandwidth requirements of the flow solvers. The investigations in physics, and also those in the other disciplines, preferred the T3E, since the algorithms developed in these fields can make good use of a much larger number of CPUs and are not as bandwidth bound.
From 1998 on, a selection of results was presented at annual results and review workshops. Since several scientific disciplines participated, the main goal was not the detailed explanation of one or the other physical phenomenon, but rather to elucidate and explain the mathematical structures of the problems to be solved, the methods of approach, the discretization techniques, the algorithms, and other methodological aspects. Experience has shown that this interdisciplinary concept of presentation proved to be of interest for all disciplines and stimulated discussions.
The results presented at the workshops were subsequently published as Transactions of the High Performance Computing Center Stuttgart in the series High Performance Computing in Science and Engineering, initiated by the Springer Verlag in 1998 and edited by W. Jäger of Heidelberg University and the author. Seven volumes have appeared since then.
Since the contributions to computational fluid dynamics constitute the largest portion of the projects processed on the HLRS systems, and because of the author's own affiliation with this field, the articles published in the Springer series on fluid dynamic problems will briefly be reviewed here. An analysis of the problems showed that mainly five topics were investigated. Some 60 articles, representing only about 40 percent of all flow investigations, could be grouped into the following sections: transitional and turbulent flows, flows about aerodynamic shapes, flows in technical applications, flows in turbomachinery, and two-phase flows. Investigations of reactive flows were included only if they dealt with aerodynamic applications, as for example supersonic combustion. These topics will be discussed in the following in the order given.
1. Transitional and Turbulent Flows
Most of the work in computing transitional and turbulent boundary layers is presently done by Prof. S. Wagner's group at the University of Stuttgart and by Prof. Friedrich's group at the Technical University in Munich. The results of the Stuttgart group are discussed first; the details may be found in references [1] to [6].
Beginning in 1998, Stemmer et al. investigated the laminar-turbulent transition induced by a harmonic point-source disturbance in an adverse-pressure-gradient boundary layer with the method of direct numerical simulation of the Navier — Stokes equations (DNS) [1]. The computations, 6th-order accurate in space and 4th-order accurate in time, were carried out on the NEC SX-4 and on the CRAY T3E with 15 and 38 million grid points. The computing performance ranged from 2.6 to 8 Gflops, obtained with 11 and 94 processors, respectively. The corresponding scaling factors of the parallelization were 0.82 and 0.94. In the following year this work was continued by Maucher et al. of the same group, as reported in [2]: the authors investigated the laminar-turbulent transition in a separation bubble, again with the DNS method, implemented on 16 processors of the NEC SX-4-32 and yielding a performance of 14 Gflops, about 50 percent of the theoretical peak performance.
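A remark on the figures just quoted: if the scaling factor is read as the parallel efficiency (an assumption, since the precise definition used in [1] is not reproduced here), it relates to the speedup on p processors as

    E_p = \frac{S_p}{p} = \frac{T_1}{p\, T_p},

where T_1 and T_p denote the run times on one and on p processors; under this reading, E_p = 0.94 on 94 processors corresponds to a speedup of roughly 88.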
In 2000 Wassermann et al. advanced the DNS method to the point that they could investigate the cross-flow instability in a transitional three-dimensional boundary layer on a
swept wing [3]. The analysis was detailed enough to show that a delay of transition is possible. The computation was carried out on the NEC SX-4-32 on a grid with 40 million points at a speed of about 1 Gflops in the serial code version. A speedup of 7 to 8 was obtained with 9 parallel processors. In a subsequent investigation Gmelin et al. developed a control algorithm for the DNS method with which a substantial delay of transition could be simulated for a two-dimensional boundary layer [4]. This result was obtained by adapting the method of superposing anti-phase disturbances, implemented on 16 processors of the NEC SX-5 with 5 GB of RAM and yielding a performance of up to 1.6 Gflops on a single processor.
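The principle of superposing anti-phase disturbances can be stated compactly. As a minimal sketch (the controller of [4] acts on the full DNS data, not on a single mode): a small-amplitude instability wave

    u'(x, t) = A \cos(\alpha x - \omega t), \qquad u_c'(x, t) = -u'(x, t),

is met by an actuator signal u_c' of equal amplitude and frequency but opposite phase, so that in the linear regime the superposition u' + u_c' vanishes and the growth of the disturbance toward transition is suppressed.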
Bonfigli et al. in 2002 [5], using the DNS method, were able to explain a shear-layer vortex interaction in the flow over a yawed wing, discovered by Bippes of the DLR in an earlier experimental investigation. The flow simulation was again implemented on 16 processors of the NEC SX-5 with 32 GB of RAM, which was urgently needed for the data handling. Meyer et al. in 2003 investigated the late stages of laminar-turbulent transition of the flow in a flat-plate boundary layer with the DNS method. To speed up the simulation they applied two different parallelization techniques: the directive-based shared-memory parallelization with OpenMP and the explicitly coded distributed-memory parallelization with MPI. This combination of methods enabled the authors to investigate the late stages of transition for the first time in agreement with earlier experimental results [6].
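Hybrid parallelization of the kind used in [6] combines the two standard programming models: MPI ranks hold distributed subdomains and exchange halo data explicitly, while OpenMP threads share the work within each rank. The following fragment is an illustrative sketch only; the grid size and the smoothing update are invented for the example and are not taken from the actual DNS code.

```c
/* Hybrid MPI + OpenMP sketch: 1D domain decomposition across MPI ranks,
   loop-level OpenMP threading inside each rank. Illustrative only. */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>
#include <stdlib.h>

#define NLOC 100000              /* hypothetical local grid size per rank */

int main(int argc, char **argv)
{
    int rank, size, i;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double *u  = malloc((NLOC + 2) * sizeof(double));  /* +2 halo cells */
    double *un = malloc((NLOC + 2) * sizeof(double));
    for (i = 0; i < NLOC + 2; i++) u[i] = (double)rank;

    int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    for (int step = 0; step < 100; step++) {
        /* distributed memory: exchange halo cells with neighbor ranks */
        MPI_Sendrecv(&u[NLOC], 1, MPI_DOUBLE, right, 0,
                     &u[0],    1, MPI_DOUBLE, left,  0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&u[1],        1, MPI_DOUBLE, left,  1,
                     &u[NLOC + 1], 1, MPI_DOUBLE, right, 1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* shared memory: threads update the interior of the local block */
        #pragma omp parallel for
        for (i = 1; i <= NLOC; i++)
            un[i] = 0.5 * (u[i - 1] + u[i + 1]);  /* stand-in smoother */

        double *tmp = u; u = un; un = tmp;        /* swap time levels */
    }

    if (rank == 0) printf("done on %d ranks\n", size);
    free(u); free(un);
    MPI_Finalize();
    return 0;
}
```

The attraction of the hybrid scheme on machines like the SX-5 or the T3E-successors is that message-passing traffic is needed only between nodes, while within a node the threads communicate through shared memory without explicit copies.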
The work of the Munich group is described in references [7] to [13]. In [7] Manhart of the Munich group in 1998 implemented the DNS method on the CRAY T3E-900 of the HLRS and on the FUJITSU VPP 700 of the Leibniz Computing Center in Munich and simulated the turbulent boundary layer on a flat plate for the first time with a locally refined grid near the wall. This approach led to considerable savings of computation time compared to a uniformly fine grid. The solution was extended by Manhart [8] to the simulation of a turbulent separating boundary layer on a flat plate in 1999. About 36 million grid points were necessary to properly resolve the flow in the separation bubble, in agreement with earlier experimental data. In the same year Hüttl et al. of the Technical University Munich, in cooperation with Smieszek of the Technical University Cottbus, formulated the DNS method and the large eddy simulation method (LES) for cylindrical coordinates and computed laminar pipe flows, cylindrical Couette flow, Taylor — Couette flow, and turbulent boundary layers along circular cylinders [9]. In the following year Hüttl et al. tested the accuracy of k-ω models used in the solution of the Reynolds averaged Navier — Stokes equations (RANS) by comparing with statistically evaluated results of the DNS method developed in [8]. Profiles of the mean velocities computed with the RANS method agree well with those obtained with the DNS method, but deviations were noted in the profiles of the Reynolds stresses, the turbulent kinetic energy, and the distributions of the skin friction. Up to 40 million grid points were used in calculations on the CRAY T3E with 256 processors [10]. The studies were continued in 2001 by Manhart and Hüttl with the adverse pressure gradient extended over a longer distance in the streamwise direction. Some irregularities were noted in the computations with the two-equation turbulence models. About 70 Mflops per processor were obtained, and a linear speed-up for more than 200 processors [11]. In [12], reported in 2002, Eisenbach et al. showed the limitations of subgrid-scale models used in a simulation of a separating and reattaching turbulent boundary layer with the LES method in comparison with experimental data. In a companion paper Manhart carried out a simulation with the DNS method for the flow simulated in [12]. He reported a satisfying agreement with the experimental data [13].
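For reference, the quantities compared in [10] derive from the Reynolds decomposition; the relations below are the standard ones, not necessarily the specific k-ω formulation of [10]:

    u_i = \bar u_i + u_i', \qquad k = \frac{1}{2}\,\overline{u_i' u_i'}, \qquad
    -\overline{u_i' u_j'} = 2\nu_t \bar S_{ij} - \frac{2}{3}\,k\,\delta_{ij}, \qquad \nu_t = \frac{k}{\omega}.

A two-equation model transports k and ω and obtains the Reynolds stresses from the eddy viscosity ν_t, whereas the DNS yields the correlations \overline{u_i' u_j'} directly by averaging the resolved fluctuations; this is what makes the term-by-term comparison of the two approaches possible.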
In addition to the work in Stuttgart and Munich, flow simulations with the LES method were carried out in 1999 and 2000 at the Technical University Hamburg-Harburg by Schmid and
Peric [14, 15], who computed the flow around a sphere at a Reynolds number Re = 50 000 on an unstructured grid with the Smagorinsky subgrid turbulence model. The results, obtained in computations on 64 processors of the CRAY T3E, were reported to be of the same order of accuracy as the experimental errors. In a third investigation, reported in 2001, Schmid et al. calculated the flow around a sphere, again at the Reynolds number of their previous studies. Good agreement of the time-averaged results with available experimental data was obtained, including the vortex pairing initiated by the Kelvin — Helmholtz instability in the shear layer [16].
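The Smagorinsky model employed here, and again in the Karlsruhe work below, closes the subgrid-scale stresses with an eddy viscosity formed from the resolved rate of strain:

    \nu_t = (C_s \Delta)^2 |\bar S|, \qquad |\bar S| = \sqrt{2\,\bar S_{ij}\bar S_{ij}}, \qquad
    \bar S_{ij} = \frac{1}{2}\left(\frac{\partial \bar u_i}{\partial x_j} + \frac{\partial \bar u_j}{\partial x_i}\right),

where Δ is a measure of the local grid width and C_s a constant of the order of 0.1; the particular values chosen in [14-17] are not repeated here.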
At Karlsruhe University, Fröhlich and Rodi in 1999 simulated the flow around a circular cylinder, with its axis normal to the direction of the free stream, with the LES method, closed by the Smagorinsky subgrid model, at a Reynolds number Re = 140 000 [17]. The iso-pressure surfaces computed for the near-wake region exhibit Kármán vortex shedding and the breakup of the vortices into three-dimensional structures. The computation was carried out on the VPP300 on a grid with about 5 million points, yielding a computational speed of 2600 Mflops. In a follow-up study Fröhlich et al. in 2000 extended the investigation to the numerical simulation of the flow about a matrix of cubes, mounted on a plate, with one of the cubes heated internally. The Reynolds number based on the cube height was Re = 3824. It is reported that the flow with heat transfer posed more difficulties for the simulation than did the cold flow [18].
At Aachen University of Technology Rütten et al. employed the LES method to study the turbulent flow through 90° pipe bends. It was reported that the time-averaged data and the turbulence intensities could be predicted with good accuracy compared to available experimental data. The vectorization rate of the flow solver reached 99 percent, and the single-processor performance on the NEC SX-4 approached 990 Mflops [19].
2. Flows about Aerodynamic Shapes
This section contains 16 contributions concerned with the simulation of flows about aerodynamic shapes. The details can be found in references [20] to [35]. As in the first section, almost one half of the investigations originated at Stuttgart University, and two more at the University of Karlsruhe. The Universities in Braunschweig and Aachen follow with three papers each, and the last is from the University of Essen. The various contributions are briefly discussed in the order given.
In [20] Fischer and Wagner of Stuttgart University in 1998 computed the flow around a helicopter fuselage with a numerical solution of the Navier — Stokes equations and evaluated and improved the state of the art, in particular the prediction of the drag of the fuselage. It could be shown that the computational grid influences the numerical results significantly and that coarse grids yield unreliable data. In continuation of this work, Buchtala et al. in 1999 developed a new solution and studied the aerodynamic and aeroelastic behavior of a helicopter rotor in forward flight. The complex vortex structures in the wake of the rotor were analyzed with the visualization system for three-dimensional flows available at the HLRS [21]. In 2000 Pomin et al. investigated the flow around helicopter rotors in hover and forward flight with a numerical solution based on an aeroelastic approach. The solutions for the flow problem and for the structural problem were programmed separately from each other, but coupled time-accurately. Both solutions and the non-iterative coupling scheme are of second-order accuracy. The solution for the aerodynamic problem was implemented on the NEC SX-4 and achieved 4.5 Gflops with 9 vector CPUs. An SGI Origin 200 workstation hosted the
solution for the structural problem and communicated with the NEC platform via a TCP/IP socket connection (a schematic sketch of such a coupling loop is given below) [22]. This work was continued until 2002 by Pomin et al., who employed two different methods for the aeroelastic analysis of the behavior of rotor blades. Analysis of the fluid-structure interaction was shown to be mandatory, and the influence of viscous forces on the overall results was found to be important, requiring an appropriate numerical solution of the Navier — Stokes equations and a suitable coupling procedure. Up to 42.6 million grid cells were used in the computations on the NEC SX-5 with 16 CPUs, for which a sustained performance of 27 Gflops was obtained with a memory of 23.5 GB [23].
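The TCP/IP coupling in [22] amounts to an exchange per time step: the flow code sends the aerodynamic loads and receives the structural deflections. The fragment below is an illustrative sketch only; the host name, port number, message layout, and array size are invented for the example and do not describe the interface of the Stuttgart code.

```c
/* Schematic client side of a flow/structure coupling over TCP/IP.
   All names and sizes are hypothetical. */
#include <sys/socket.h>
#include <netinet/in.h>
#include <netdb.h>
#include <string.h>
#include <stdio.h>
#include <unistd.h>

#define NMODES 64   /* hypothetical size of the exchanged arrays */

/* send/recv full buffers, since TCP may deliver partial chunks */
static int xfer(int fd, void *buf, size_t n, int sending)
{
    char *p = buf;
    while (n > 0) {
        ssize_t k = sending ? write(fd, p, n) : read(fd, p, n);
        if (k <= 0) return -1;
        p += k; n -= (size_t)k;
    }
    return 0;
}

int main(void)
{
    double loads[NMODES], deflections[NMODES];
    memset(loads, 0, sizeof loads);
    memset(deflections, 0, sizeof deflections);

    struct hostent *h = gethostbyname("structure-host");   /* invented */
    if (!h) { fprintf(stderr, "host lookup failed\n"); return 1; }

    struct sockaddr_in addr;
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_port = htons(7777);                           /* invented */
    memcpy(&addr.sin_addr, h->h_addr_list[0], h->h_length);
    if (connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0)
        { perror("connect"); return 1; }

    for (int step = 0; step < 1000; step++) {
        /* ... flow solver would compute loads[] here ... */
        if (xfer(fd, loads, sizeof loads, 1) < 0) break;        /* send */
        if (xfer(fd, deflections, sizeof deflections, 0) < 0)   /* recv */
            break;
        /* ... and deform the aerodynamic grid with deflections[] */
    }
    close(fd);
    return 0;
}
```

The sketch assumes both ends exchange raw arrays of doubles with identical byte order and layout; a real heterogeneous coupling, such as the one between the NEC and the SGI machines, would also have to agree on data representation and message framing.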
Also in 2002, Gerlinger et al. of Stuttgart University investigated problems of supersonic combustion. The influence of temperature and species fluctuations on the chemical reaction rates was analyzed by assuming probability density functions; the underlying averaging is sketched below. Two-dimensional simulations were carried out on the NEC SX-5 [24]. Schneider, Gerlinger, and Aigner extended this work to the study of three-dimensional mixing and combustion of hydrogen in a model scramjet in 2003. A reaction mechanism of 20 reactions and nine chemical species was assumed. More than 200 nodes of the CRAY T3E/512-900 were used, with MPI employed for the parallelization [25]. In investigations started as early as 1999, Frühauf et al., also at Stuttgart University, studied thermo-chemical relaxation in the gas phase and investigated the aerothermal surface loading on re-entry vehicles. Computations were carried out on the CRAY T3E; a nearly linear scale-up was obtained for up to 512 processors [26]. Applications of this method to non-equilibrium flow problems were reported elsewhere.
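In the assumed-PDF approach referred to in [24], the mean production rate of a species is obtained by weighting the instantaneous reaction rate with prescribed probability density functions, schematically

    \bar{\dot{\omega}}_k = \int \dot{\omega}_k(T, Y_1, \ldots, Y_N)\, P(T)\, P(Y_1, \ldots, Y_N)\, dT\, dY_1 \cdots dY_N,

where the shapes of the PDFs P are assumed in advance and only their low-order moments are transported; the specific shapes and the treatment of correlations chosen in [24] are not reproduced here.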
At Karlsruhe University, Hafermann in 1998 parallelized a multiblock finite-volume fluid-dynamics code using a static computational and communicational load-balancing algorithm; the basic idea of static balancing is sketched below. It was reported that the code can be used for the solution of aerospace, automotive, and other industrial problems. Measurements on the IBM RS/6000 SP system of the SSC Karlsruhe using up to 128 processors showed very good performance, for example for the computation of the flow about the ONERA M6 wing [27]. Also at Karlsruhe University, Mellen et al. computed the flow around an airfoil at high Reynolds number and large angle of attack in 2000 with the LES method. Special attention was paid to the near-wall region and to the development of suitable subgrid-scale models for an accurate simulation of turbulent separation and reattachment on the wing. It was concluded that refinement of the grid is essential for the improvement of LES calculations [28].
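Static load balancing of a multiblock grid can be computed before the run from the block sizes alone. The following sketch implements the common greedy heuristic (sort the blocks by work, assign each to the currently least-loaded processor); it illustrates the idea only and is not the KAPPA algorithm of [27], which in addition balances the communication load.

```c
/* Greedy static load balancing of grid blocks onto processors.
   Illustrative only; block sizes are invented sample data. */
#include <stdio.h>
#include <stdlib.h>

static int cmp_desc(const void *a, const void *b)
{
    long wa = *(const long *)a, wb = *(const long *)b;
    return (wa < wb) - (wa > wb);              /* descending by work */
}

int main(void)
{
    long work[] = { 900000, 650000, 640000, 300000, 280000, 120000 };
    int nblocks = sizeof work / sizeof work[0];
    int nprocs = 3;
    long load[3] = { 0, 0, 0 };

    qsort(work, nblocks, sizeof work[0], cmp_desc);

    for (int b = 0; b < nblocks; b++) {
        int p = 0;                             /* least-loaded processor */
        for (int q = 1; q < nprocs; q++)
            if (load[q] < load[p]) p = q;
        load[p] += work[b];
        printf("block of %ld cells -> proc %d\n", work[b], p);
    }
    for (int p = 0; p < nprocs; p++)
        printf("proc %d: total load %ld\n", p, load[p]);
    return 0;
}
```

Sorting the blocks in descending order first keeps the worst-case imbalance of this heuristic small, which matters because the slowest processor determines the time step of the whole parallel run.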
The following three contributions originated at the University and the German Center of Aero- and Astronautics (DLR) in Braunschweig. In 2001 Neef et al. at the university used numerical solutions of the Euler and Navier — Stokes equations to study the flapping flight of birds. In this investigation the FLOWer code developed at the DLR was implemented on a single processor of the NEC SX-5. The performance obtained was 2 Gflops with a vector-operation ratio of 99 percent. The computations yielded insight into the time-dependent formation of tip vortices, thrust, and drag [29]. Melber et al. of the DLR in 2002 investigated numerically the viscous compressible flow around high-lift configurations of transport aircraft. The DLR TAU code was used on a hybrid unstructured grid with 10 million grid points. A vector-operation ratio of nearly 99 percent and a performance of 816 Mflops were achieved on a single processor of the NEC SX-5 with 12 GB of memory. At high angles of attack with partial flow separation, a fully converged solution required 5000 iterations, amounting to about 650 hours of CPU time [30]. Melber-Wilkending et al. continued this work in 2003 and extended the investigation to include the formation and interaction of the vortices in the wake. Adequate resolution of a four-vortex configuration could be obtained with 12 million grid points [31].
The contributions originating at Aachen University of Technology were concerned with
the simulation of the flow around aerodynamic shapes taking the elastic properties of the configuration into account. Britten et al. in 2000 developed numerical solutions for the flow based on the Euler and the Reynolds averaged Navier — Stokes equations. The elastic wing was described with a numerical solution for a generalized Timoshenko-like beam structure. The solution for the flow field was coupled with the one for the wing structure by an iteration procedure, strongly consistent in time. The performance on a single processor of the NEC SX-4 was about 800 Mflops; a typical unsteady inviscid aeroelastic simulation for a clean wing required 20 CPU hours [32]. In a subsequent investigation in 2002, Reinartz et al. simulated the supersonic flow around the German shuttle-like technology demonstrator PHOENIX. The computation was restricted to inviscid flow, and 2.15 million grid points were needed for proper resolution. Simulations based on the Navier — Stokes equations are under way [33]. The third contribution, published in 2003, was again concerned with the numerical simulation of the fluid-structure interaction of aircraft wings. Braun et al. performed a direct aeroelastic simulation for a wind tunnel model of a wing-body configuration tested in the European Transonic Wind Tunnel. The computed results were reported to agree well with the experimental data for the static aeroelastic problem of the model deformed by the aerodynamic loads. The simulation was based on a solution of the Reynolds averaged Navier — Stokes equations, implemented on a single processor of the NEC SX-5 system and requiring a computation time of about 15 hours for roughly 4 million grid points [34].
The last contribution of this section was reported by von Lavante et al., who initiated an investigation of supersonic hydrogen-air combustion at the University of Essen in 1999. The LES method was used to simulate the unsteady three-dimensional supersonic flow with nonequilibrium chemistry in a square channel with transverse hydrogen injection. The flow was found to be highly unsteady, exhibiting fluctuations in the direction of the main stream and normal to it [35].
3. Flows in Technical Applications
Ten investigations of the period 1998 to 2003 were aimed at analyzing problems occurring in technical applications with numerical techniques. They are briefly reviewed in this section, with the details reported in references [36] to [45].
In [36] Krause and Meinke of Aachen University of Technology in 1998 reported several applications of numerical solutions of the conservation equations of fluid dynamics to technical problems. The examples include the numerical simulation of an air-in-air jet at a Reynolds number Re = 20 000 and of the flow in a 90° pipe bend with the LES method, furthermore the numerical analysis of the compressible three-dimensional viscous unsteady flow during the suction and compression strokes in cylinders of automotive engines, and the numerical simulation of vortex breakdown in swirling pipe flow. The single-processor performance achieved at the time of the investigation was typically 800 Mflops. Up to one million grid points were used, resulting in CPU times of about 120 hours.
In the same year turbulent flows in straight, curved, and helically coiled pipes were numerically simulated with the DNS method by Hüttl and Friedrich at the Technical University in Munich. Details of the formation of the secondary flow motion and the resulting pressure distributions were described in [37]. Pipe flows were again investigated in 2003 by Khalifa and Laurien at Stuttgart University. The investigation was focused on flows with low Reynolds numbers, of the order of a few thousand. The studies, verified by experiments, revealed several flow
instabilities that can cause surface deformations. The computations were carried out on the NEC SX-5 with roughly 130 000 grid points [38].
In another study of 1998, the laminar incompressible unsteady three-dimensional flow around a circular cylinder, with its axis positioned normal to the oncoming flow in a square duct, was computed with a numerical solution of the Navier — Stokes equations by Becker et al. at Heidelberg University. Results were reported for a multilevel preconditioner and a full multigrid algorithm [39]. In a further study of 1998, Janoske and Piesche of Stuttgart University investigated the separation behavior of the flow between conical rotating discs in a disc-stack centrifuge with a finite-volume method at a Reynolds number Re = 100. The simulations were found to be in good agreement with available experimental data [40].
The following two papers were concerned with the numerical simulation of the fluid flow and heat transfer in industrial Czochralski melts. Enger et al. of the University of Erlangen-Nürnberg in 1999 used a block-structured finite-volume solution of the Reynolds averaged Navier — Stokes equations, closed with the k-ε turbulence model, and concluded that this approach suppressed the fluid mechanical instabilities known to be present in the flow [41]. In 2000 Enger and Breuer used a quasi-DNS method to simulate the flow in the Czochralski melt again. The important details of the flow, as for example a buoyancy-driven vortex and the location of the hottest regions in the melt, could be detected in the computation on a grid with about 2 million control volumes [42].
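The k-ε model used in [41] closes the Reynolds stresses with an eddy viscosity formed from the turbulent kinetic energy k and its dissipation rate ε, for which two additional transport equations are solved; in the standard formulation

    \nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad C_\mu = 0.09.

Such a closure models, and thereby damps, the unsteady fluctuations, which is consistent with the suppression of the instabilities observed in [41]; the quasi-DNS of [42] avoids the closure altogether at the price of a much finer resolution.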
Dedner et al. at Freiburg University in 2002 developed a numerical solution of the governing equations for the description of three-dimensional magneto-hydrodynamic flows on an unstructured tetrahedral mesh with dynamic load balancing and demonstrated that control of the divergence of the magnetic field is absolutely crucial (the constraint and one common cleaning formulation are sketched below) [43]. Also in 2002, Becker and Laurien carried out numerical simulations of the three-dimensional flow and heat transfer in high-temperature nuclear reactors on the NEC SX-4 and NEC SX-5, with the aim of eventually being able to simulate accidents in high-temperature reactors. The computations were carried out with 360 000 control volumes [44].
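The constraint in question is the solenoidality of the magnetic field, \nabla \cdot \mathbf{B} = 0, which a discrete update of the induction equation does not automatically preserve. As a sketch of the hyperbolic (generalized Lagrange multiplier) cleaning associated with Dedner and coworkers, in which a scalar ψ transports divergence errors out of the domain and damps them:

    \frac{\partial \mathbf{B}}{\partial t} - \nabla \times (\mathbf{u} \times \mathbf{B}) + \nabla \psi = 0, \qquad
    \frac{\partial \psi}{\partial t} + c_h^2\, \nabla \cdot \mathbf{B} = -\frac{c_h^2}{c_p^2}\, \psi,

with c_h and c_p the hyperbolic and parabolic cleaning speeds; whether [43] uses precisely this variant is not stated in the summary above.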
In the last contribution of this section, in [45], Hartlep and Tilgner of Göttingen University in 2003 reported results of their study of the Rayleigh — Bénard convection in a plane layer with periodic boundary conditions in the horizontal directions. Successful application of a spectral method made it possible to compute flows with Rayleigh numbers Ra > 100 and an aspect ratio of 10.
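For orientation, the two parameters quoted in [45] are defined as follows: the Rayleigh number measures the ratio of buoyant driving to viscous and thermal damping, and the aspect ratio relates the horizontal period of the layer to its height,

    Ra = \frac{g \alpha\, \Delta T\, d^3}{\nu \kappa}, \qquad \Gamma = \frac{L}{d},

with g the gravitational acceleration, α the thermal expansion coefficient, ΔT the temperature difference across the layer of height d, ν the kinematic viscosity, κ the thermal diffusivity, and L the horizontal period of the computational box.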
4. Flows in Turbomachinery
Numerical simulation of flows in turbomachinery has become an important subject of high performance computing. Since the initiation of the transactions of the HLRS, six investigations of flow problems in compressors and turbines have been reported; see references [46] to [51]. Of these, the first four originated at Stuttgart University and the last two at Karlsruhe. The first investigation dates back to 1998: Jung et al. of Stuttgart University, using a parallelized implicit numerical solution of the Navier — Stokes equations adapted to turbomachinery applications, studied unsteady flow phenomena in an axial turbine. Detailed features of the secondary flow motion were simulated, as for example tip clearance effects, blade-row interactions, and passage vortices [46]. In another investigation of the same group, Bauer et al. in 2000 simulated the viscous unsteady transonic flow over oscillating blades in a linear cascade. The computation was carried out on 4 CPUs of the NEC SX-4 [47]. In the following year Anker et al. of the
same group investigated the influence of leakage on the main flow in a turbine with a numerical solution of the Navier — Stokes equations. The studies showed that the leakage flow introduces mixing losses and can dominate the secondary flow, resulting in severe losses [48]. Also in 2001, the problem of rotating stall was investigated by Ginter et al., also at the University of Stuttgart, for a single-stage axial water compressor with 30 runner and 30 stator blades. The numerical solution was based on the Reynolds averaged Navier — Stokes equations and implemented on a Hitachi SR 8000 computer [49].
The last two studies originated at Karlsruhe University. In 2002 Michelassi et al. investigated the flow in a low-pressure turbine affected by incoming wakes with the LES method. The algorithm was implemented on the Hitachi SR 8000 system; domain decomposition into 32 sub-domains was used for two million grid points. The fine resolution in time and space revealed several details of the flow structure, e.g. the appearance of intermittent separation of the boundary layer and of large elongated flow structures [50]. In the following year the authors continued their studies and investigated the flow in a highly loaded low-pressure cascade, again affected by incoming wakes. The LES method was implemented on the same system as before. The simulations showed that increasing the strength of the impinging wakes inhibits separation of the boundary layer on the suction side of the blade and reduces the total pressure loss [51].
5. Two-Phase Flows
The last section contains 10 contributions devoted to problems of two-phase flows, see references [52] to [61]; seven of them originated at Stuttgart University and will be discussed first. Starting in 1999, Rieber and Frohn studied the interface dynamics of two incompressible fluids with very different densities and viscosities with a numerical solution of the Navier — Stokes equations adapted to the problem. A binary droplet collision and a droplet impact on a liquid film were simulated with the parallelized algorithm on the NEC SX-4 and the CRAY T3E/512-900. A comparison of the results obtained for the droplet impact problem with experimental data showed good agreement, even in details such as the formation of the crown with fingers ejecting small droplets [52]. In the second contribution, Hase et al. in 2001 devised a numerical solution of the Navier — Stokes equations and studied the behavior of spherical and deformed droplets in gas flow. The computed drag coefficients of spherical droplets show good agreement with measured data; initially deformed droplets approach the spherical shape asymptotically. The simulations were performed on 16 CPUs of the CRAY T3E/512-900 [53]. In continuation of this work, Hase and Weigand in 2002 computed the heat transfer from droplets moving with increased, decreased, and constant velocity with the volume-of-fluid method, this time implemented on 128 processors of the CRAY T3E/512-900 [54], and, one year later, the authors studied the three-dimensional unsteady heat transfer on strongly deformed droplets at high Reynolds numbers. Large differences were observed in the variation of the Nusselt number and the heat transfer for the various deformations [55].
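The volume-of-fluid method used in [54, 55] represents the liquid by a volume fraction F, equal to 1 in the liquid, 0 in the gas, and intermediate in cells cut by the interface; F is advected with the flow,

    \frac{\partial F}{\partial t} + \mathbf{u} \cdot \nabla F = 0,

and the interface is reconstructed from F in each cell, while surface tension enters the momentum equation as a force concentrated at the interface. The particular reconstruction and surface-tension treatment of the Stuttgart code are not detailed in the summary above.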
In another investigation of 2001, Giese and Laurien began to investigate the two-phase flow of water in the form of liquid and steam in a pipe system. The commercial code CFX-4.2 was used for the solution of the Navier — Stokes equations, adapted to the two phases. The simulations showed a clear influence of the secondary flow motion on the distribution of the steam in bends and of the cavitation intensity on the pressure loss. The algorithm was implemented on the NEC SX-4 [56]. The authors continued their studies in the following year and extended their approach to include the simulation of bubbly and stratified flows. In addition to the results
obtained in 2001, the influence of the secondary flow motion on recondensation was shown. The numerical data were verified by experimental results [57]. Also in 2002, Olmann et al. devised a parallelized numerical simulation technique for the computation of multiphase flow and transport processes in porous media. The algorithm was implemented on the Hitachi SR 8000, on which the multi-step-outflow experiment was simulated. Further work is in progress [58].
In a cooperative study of the University of Halle-Wittenberg, the University of Michigan at Ann Arbor, and the Worcester Polytechnic Institute, USA, Göz et al. in 2000 simulated bubbly gas-liquid flow by using a solution of the Navier — Stokes equations on a fixed, regular three-dimensional grid with the DNS method. A parallelized finite-difference front-tracking method on two-dimensional meshes was used to follow the motion of the phase boundaries and to accurately account for the stress boundary conditions at the interface between the gas and the liquid. A large number of bubbles had to be used, because the fluctuation velocities of the bubbles markedly influence their dispersion characteristics [59]. This work was continued in 2001; since then various bubbly bidisperse systems with spherical bubbles have been investigated and compared with each other, including also the comparison with monodisperse systems. The computations were carried out on the IBM RS/6000 SP of the SSC Karlsruhe [60].
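In the front-tracking approach of [59, 60] the phase boundary is represented explicitly by a moving surface mesh whose marker points X_f are advected with the interpolated fluid velocity, while the surface-tension force is spread back from the front to the fixed grid; schematically

    \frac{d\mathbf{X}_f}{dt} = \mathbf{u}(\mathbf{X}_f, t), \qquad
    \mathbf{f}_\sigma(\mathbf{x}) = \int_{\text{front}} \sigma \kappa\, \mathbf{n}\, \delta(\mathbf{x} - \mathbf{X}_f)\, dA,

with σ the surface tension coefficient, κ the local curvature, n the interface normal, and δ a delta function that is smoothed over a few grid cells in the discrete implementation.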
Finally, in 2003, Albina and Peric of the Technical University of Hamburg-Harburg studied the forced break-up of a liquid jet [61]. The numerical simulations were carried out on 64 CPUs of the CRAY T3E/512-900.
Conclusions
The major results obtained in flow computations at the High Performance Computing Center Stuttgart during the period 1998 to 2003 were listed and briefly discussed in this article. Altogether 61 contributions were introduced, the details of which are reported in the Transactions of the High Performance Computing Center Stuttgart in the series High Performance Computing in Science and Engineering, initiated by the Springer Verlag in 1998.
The fluid dynamic investigations were mainly chosen for this survey, since they constitute the largest part of the projects processed on the HLRS systems and, in part, on the systems of the Supercomputing Center Karlsruhe. The contributions represent only about 40 percent of all flow problems investigated with numerical solutions. Most of them were published in archival journals.
The contributions reported here deal with problems of transitional and turbulent flows, flows about aerodynamic shapes, flows in technical applications, flows in turbomachinery, and two-phase flows. The review is meant to provide an overview of the flow problems investigated at the HLRS. The references refer to the HLRS transactions, with the year identifying the annual volumes.
References
[1] Stemmer C., Kloker M., Rist U., Wagner S. DNS of point-source induced transition in an airfoil boundary-layer flow // 1998. P. 213-222.
[2] Maucher U., Rist U., Kloker M., Wagner S. DNS of laminar-turbulent transition in separation bubbles // 1999. P. 279-294.
[3] Wassermann P., Kloker M., Rist U., Wagner S. DNS of laminar-turbulent transition in a 3D aerodynamic boundary-layer flow // 2000. P. 275-289.
[4] Gmelin C., Rist U., Wagner S. DNS of active control of disturbances in a Blasius boundary layer // 2001. P. 273-285.
[5] Bonfigli G., Kloker M., Wagner S. 3-D-boundary-layer transition induced by superposed steady and traveling crossflow vortices // 2002. P. 255-271.
[6] Meyer D., Rist U., Kloker M. Investigation of the flow randomization process in a transitional boundary layer // 2003. P. 239-254.
[7] Manhart M. Direct numerical simulations of turbulent boundary layers on high performance computers // 1998. P. 199-212.
[8] Manhart M. Direct numerical simulations of an adverse pressure gradient turbulent boundary layer on high performance computers // 1999. P. 315-326.
[9] Hüttl T.J., Smieszek M., Fröhlich M. et al. Numerical flow simulation in cylindrical geometries // 1999. P. 267-278.
[10] Hüttl T.J., Deng G., Friedrich R., Manhart M. Testing turbulence models by comparison with DNS data of adverse-pressure-gradient boundary layer flow // 2000. P. 356-367.
[11] Manhart M., Hüttl T. Statistical analysis of a turbulent adverse pressure gradient boundary layer // 2001. P. 286-297.
[12] Eisenbach S., Manhart M., Friedrich R. Large-eddy-simulations of turbulent wall bounded flow with and without adverse pressure gradient // 2002. P. 272-284.
[13] Manhart M. Investigation of a turbulent separating boundary layer by direct numerical simulation // 2002. P. 285-298.
[14] Schmid M., Peric M. Computation of turbulent flows with separation by coherent structure capturing // 1999. P. 304-311.
[15] Schmid M., Peric M. Large-eddy-simulation of subcritical flow around a sphere // 2000. P. 368-376.
[16] Schmid M., Bakic V., Peric M. Vortex shedding in the turbulent wake of a sphere at subcritical Reynolds number // 2001. P. 309-316.
[17] Fröhlich J., Rodi W. Large-eddy-simulation of the flow around a circular cylinder // 1999. P. 312-314.
[18] Fröhlich J., Mathey F., Rodi W. Large-eddy-simulation of the flow over a matrix of surface-mounted cubes // 2000. P. 317-325.
[19] Rütten F., Meinke M., Schröder W. LES of turbulent flows through 90°-pipe bends on NEC SX-4 // 2000. P. 377-388.
[20] Fischer A., Wagner S. Navier — Stokes-calculations of the flow around a helicopter fuselage // 1998. P. 295-307.
[21] Buchtala B., Hierholz K.-H., Wagner S. Aeroelastic analysis of a helicopter rotor in forward flight // 1999. P. 327-330.
[22] Pomin H., Altmikus A., Buchtala B., Wagner S. Rotary wing aerodynamics and aeroelasticity // 2000. P. 338-348.
[23] Pomin H., Altmikus A., Wagner S. Aeroelastic analysis of helicopter rotor blades using HPC // 2002. P. 391-405.
[24] Gerlinger P., Stoll P., Schneider F., Aigner M. Implicit LU time integration using domain decomposition and overlapping grids // 2002. P. 311-322.
[25] Schneider F., Gerlinger P., Aigner M. 3D simulations of supersonic chemically reacting flows // 2003. P. 267-276.
[26] Frühauf H.-H., Fertig M., Olawsky F., Bönisch B. Upwind relaxation algorithm for re-entry nonequilibrium flows // 1999. P. 365-378.
[27] Hafermann D. Parallelization of the CFD code KAPPA for distributed memory computers // 1998. P. 252-260.
[28] Mellen C.P., Fröhlich J., Rodi W. Computation for the European LESFOIL Project // 2000. P. 389-398.
[29] Neef M.F., Hummel D. Euler and Navier — Stokes solutions for flapping wing propulsion // 2001. P. 386-395.
[30] Melber S., Wild J., Rudnik R. Numerical high lift research // 2002. P. 406-422.
[31] Melber-Wilkending S., Stumpf E., Wild J., Rudnik R. Numerical high lift research II // 2003. P. 315-330.
[32] Britten G., Weile M., Hesse M., Ballmann J. Analysis of an elastic wing in subsonic flow using direct numerical aeroelastic simulation // 2000. P. 305-316.
[33] Reinartz B.U., Hesse M., Ballmann J. Numerical investigation of the shuttle-like configuration PHOENIX // 2002. P. 379-390.
[34] Braun C., Boucke A., Hanke M. et al. Prediction of the model deformation of a high speed transport aircraft type wing by direct aeroelastic simulation // 2003. P. 331-342.
[35] von Lavante E., Kallenberg M., Zeitz D. Numerical simulation of supersonic hydrogen-air combustion // 1999. P. 295-303.
[36] Krause E., Meinke M. CFD-applications on NEC SX-4 // 1998. P. 223-235.
[37] Hüttl T.J., Friedrich R. High performance computing of turbulent flow in complex pipe geometries // 1998. P. 236-251.
[38] Khalifa E., Laurien E. Numerical investigation of semi-turbulent pipe flow // 2003. P. 277-288.
[39] Becker Ch., Oswald H., Turek S. Parallel multilevel algorithms for solving the incompressible Navier — Stokes equations // 1998. P. 308-325.
[40] Janoske U., Piesche M. Numerical simulation of the fluid flow and the separation behavior in a disc stack centrifuge // 1998. P. 261-268.
[41] Enger S., Breuer M., Basu B. Numerical simulation of fluid flow and heat transfer in an industrial Czochralski melt using a parallel-vector supercomputer // 1999. P. 253-266.
[42] Enger S., Breuer M. High-performance computing: numerical simulation of the melt flow in an industrial Czochralski crucible // 2000. P. 290-304.
[43] Dedner A., Kröner D., Rohde C., Wesenberg M. Efficient divergence cleaning in three-dimensional MHD simulations // 2002. P. 323-334.
[44] Becker S., Laurien E. Three-dimensional numerical simulation of flow and heat transport in high-temperature nuclear reactors // 2002. P. 367-378.
[45] Hartlep T., Tilgner A. Rayleigh — Bénard convection at large aspect ratios // 2003. P. 343-358.
[46] Jung A.R., Mayer J.F., Stetter H. Unsteady flow simulation in an axial flow turbine using a parallel implicit Navier — Stokes method // 1998. P. 269-294.
[47] Bauer H., Mayer J.F., Stetter H. Unsteady flow simulations for turbomachinery applications on dynamic grids // 2000. P. 349-355.
[48] Anker J.E., Mayer J.F., Stetter H. Computational study of the flow in an axial turbine with emphasis on the interaction of labyrinth seal leakage flow and main flow // 2001. P. 363-374.
[49] Ginter F., Ruprecht A., Göde E. Numerical simulation of rotating stall in an axial compressor // 2001. P. 375-385.
[50] Michelassi V., Wissink J., Rodi W. LES of flow in a low pressure turbine with incoming wakes // 2002. P. 335-346.
[51] Michelassi V., Wissink J.G., Rodi W. The effect of impinging wakes on the boundary layer of a thin-shaped turbine blade // 2003. P. 303-314.
[52] Rieber M., Frohn A. Parallel computation of interface dynamics in incompressible two-phase flows // 1999. P. 241-252.
[53] Hase M., Rieber M., Graf F. et al. Parallel computation of the time dependent velocity evolution for strongly deformed droplets // 2001. P. 342-351.
[54] Hase M., Weigand B. Predictions of the 3D unsteady heat transfer at moving droplets // 2002. P. 299-310.
[55] Hase M., Weigand B. Numerical simulation of 3D unsteady heat transfer at strongly deformed droplets at high Reynolds numbers // 2003. P. 255-266.
[56] Giese T., Laurien E. Simulation of two-phase flow in pipes // 2001. P. 352-362.
[57] Giese T., Laurien E. Three-dimensional simulation of two-phase flow in pipes // 2002. P. 354-366.
[58] Olmann U., Hinkelmann R., Helmig R. Parallel two-phase flow simulations in porous media // 2002. P. 347-353.
[59] Göz M.F., Bunner B., Sommerfeld M., Tryggvason G. Simulation of bubbly gas-liquid flows by a parallel finite-difference front-tracking method // 2000. P. 326-337.
[60] Göz M.F., Bunner B., Sommerfeld M., Tryggvason G. Simulation of bidisperse bubbly gas-liquid flows by a parallel finite-difference front-tracking method // 2001. P. 298-308.
[61] Albina F.-O., Peric M. Numerical simulation of forced breakup of a liquid jet // 2003. P. 289-302.
Received for publication August 20, 2004