Work presented at the AHS 70th Annual Forum demonstrates that extracts are invaluable for both data reduction and quantitative analysis.
In their paper, “Turbulence Transport Phenomena in the Wakes of Wind Turbines,” Jha et al. show that data reduced by three orders of magnitude still retains full fidelity, enabling quantitative analysis that was not possible before.
The image shows a volume rendering of AVF-LESLIE results for a turbulent planar flame front.
Last month at the ParCFD 2015 meeting in Montreal, Canada, I presented a paper entitled “The Impact of In Situ Data Processing and Analytics upon Scaling of CFD Solvers and Workflows”.
This work is based on research under our Department of Energy Phase II grant (Award Number DE-SC0007548) and on work under a second DOE grant through the Office of Advanced Scientific Computing Research (Award Number DE-SC0012449).
The latter grant is an effort led by Wes Bethel at Lawrence Berkeley National Laboratory in collaboration with Argonne National Laboratory, Georgia Tech and Kitware. In this work we used the AVF-LESLIE code from Prof. Suresh Menon’s lab at Georgia Tech, instrumenting it with VisIt/Libsim.
The goal is to quantify the overhead of in situ processing compared to conventional file-based volumetric post-processing at scale; for this paper, ‘at scale’ means on the order of 60,000 cores. Next year we plan to run at 120,000 cores. To date, AVF-LESLIE has been instrumented with VisIt/Libsim and can directly output FieldView XDB files on 40,000 cores, and we are developing new data analysis pipelines that are only practical in situ.
I enjoyed the meeting and learned a lot about the challenges we all face as we work toward exascale CFD simulations.
I had the pleasure of attending this year’s ASME Verification and Validation Symposium last month. Verification, Validation, and Uncertainty Quantification is an ongoing focus area for me, so in addition to the Symposium, I attended the two-day course taught by Bill Oberkampf and Chris Roy, authors of the book Verification and Validation in Scientific Computing. The course gave me a deeper understanding of the techniques and issues we need to address to ensure that our simulations are accurate and reliable. I look forward to bringing what I’ve learned to my work leading the Applied Research Group.
Intelligent Light R&D continues to produce new tools for large-scale CFD parametric studies. EPISODE is a new large-scale data management tool that enables engineers to readily extract knowledge and insight from their large-scale, physics-based simulations and experimental data. EPISODE provides tools that let the user create a relevant subset of their solution results via in situ data extraction at regions of interest, further reduce the size of that data via proper orthogonal decomposition (POD), and then sort the parametric space of both the input and output solver parameters using self-organizing maps (SOMs).
This project consists of new data extracts and compression methods based upon POD and image compression methods such as JPEG. In addition, we are developing a new UI based upon self-organizing maps that automatically sorts a large number of simulations by their parametric inputs and outputs. The result is a set of colored maps that helps reveal trends in the data. The user will be able to click on different areas of a map to display the corresponding results in FieldView. Those results may come directly from the CFD solver, be reconstructed from the POD modes, or be computed from a reduced-order model (ROM) derived from the CFD and POD results.
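To illustrate the POD step described above (a minimal sketch, not the EPISODE implementation), the snapshot method builds a small correlation matrix from the solution snapshots and recovers spatial modes from its eigenvectors. The hypothetical `pod_leading_mode` helper below extracts the dominant mode with power iteration, in pure Python for clarity:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def pod_leading_mode(snapshots, iters=200):
    """Snapshot POD sketch: leading spatial mode via power iteration
    on the m x m snapshot correlation matrix."""
    m = len(snapshots)
    # Correlation matrix C[i][j] = <u_i, u_j> / m
    C = [[dot(snapshots[i], snapshots[j]) / m for j in range(m)]
         for i in range(m)]
    # Power iteration for the dominant eigenvector of C
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = math.sqrt(dot(w, w))
        v = [x / norm for x in w]
    # The spatial mode is the corresponding weighted sum of snapshots
    n = len(snapshots[0])
    mode = [sum(v[j] * snapshots[j][i] for j in range(m)) for i in range(n)]
    norm = math.sqrt(dot(mode, mode))
    return [x / norm for x in mode]
```

In a real workflow the retained modes and their time coefficients stand in for the raw volumetric snapshots, which is the kind of data reduction EPISODE combines with extract-based subsetting.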
The EPISODE project will address limitations of current data analysis tools by:
- Performing data management and post-processing in-situ, without writing to storage and without direct engineer interaction.
- Maintaining support for high-frequency information for maximum temporal fidelity.
- Handling both experimental and computational data together, supporting automated batch processing.
- Delivering post-processing capabilities to rapidly collect engineering information such as mass flows through a passage, FFTs, etc.
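To give a concrete flavor of the last item (a hedged sketch, not EPISODE code), a probe signal recorded at high frequency can be reduced to its dominant spectral content on the fly. The hypothetical `dominant_frequency` helper below uses a naive DFT where a production pipeline would use an FFT:

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform, O(n^2).
    Only a sketch; real pipelines would use an FFT."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def dominant_frequency(signal, dt):
    """Frequency (Hz) of the strongest non-DC bin of a real signal
    sampled every `dt` seconds."""
    n = len(signal)
    spectrum = dft(signal)
    half = spectrum[1:n // 2]          # skip DC; upper half mirrors lower
    k = max(range(len(half)), key=lambda i: abs(half[i])) + 1
    return k / (n * dt)
```

Collapsing a long probe history to a few spectral peaks like this is one way high-frequency information can be kept without writing every timestep to storage.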
Key project collaborators include:
- Robert Haimes from MIT – Haimes’ expertise will support the development of a scalable data extracts architecture and plug-in components.
- Prof. Steven Gorrell from BYU – Professor Gorrell contributes CFD domain expertise in applications for gas turbines.
This work is sponsored by the Air Force Research Laboratory (AFRL) through a Phase II SBIR, Contract FA8650-14-C-2439, and TPOC Michael List.