Reproducible and replicable CFD: it's harder than you think

Completing a full replication study of our previously published findings on bluff-body aerodynamics was harder than we thought, despite the fact that we follow good reproducible-research practices, sharing our code and data openly. Here's what we learned from three years, four CFD codes and hundreds of runs.

Olivier Mesnard, Lorena A. Barba
Mechanical and Aerospace Engineering, George Washington University, Washington DC 20052

Our research group prides itself on having adopted Reproducible Research practices. Barba made a public pledge titled "Reproducibility PI Manifesto"1 (PI: principal investigator), which at its core is a promise to make all research materials and methods open access and discoverable: releasing code, data and analysis/visualization scripts.

In 2014, we published a study in Physics of Fluids titled "Lift and wakes of flying snakes."2 It is a study that uses our in-house code for solving the equations of fluid motion in two dimensions (2D), with a solution approach called the immersed boundary method. The key feature of such a method is that it exchanges complexity in the mesh-generation step for complexity in the application of boundary conditions. It makes it possible to use a simple discretization mesh (structured Cartesian), but at the cost of an elaborate process that interpolates values of fluid velocity at the boundary points to ensure the no-slip boundary condition (that fluid sticks to a wall). The main finding of our study on wakes of flying snakes was that the 2D section with anatomically correct geometry for the snake's body experiences lift enhancement at a given angle of attack.
A previous experimental study had already shown that the lift coefficient of a snake cross-section in a wind tunnel gets an extra oomph of lift at 35 degrees angle-of-attack. Our simulations showed the same feature in the plot of lift coefficient.3 Many detailed observations of the wake (visualized from the fluid-flow solution in terms of the vorticity field in space and time) allowed us to give an explanation of the mechanism providing extra lift.

When a computational research group produces this kind of study with an in-house code, it can take one, two or even three years to write full research software from scratch and complete verification and validation. Often, one gets the question: why not use a commercial CFD package? (CFD: computational fluid dynamics.) Why not use another research group's open-source code? Doesn't it take much longer to write yet another CFD solver than to use existing code? Beyond reasons that have to do with inventing new methods, it's a good question. To explore using an existing CFD solver for future research, we decided to first complete a full replication of our previous results with these alternatives. Our commitment to open-source software for research is unwavering, which rules out commercial packages. Perhaps the most well-known open-source fluid-flow software is OpenFOAM, so we set out to replicate our published results with this code. A more specialist open-source code is IBAMR, a project born at New York University that has continued development for a decade. And finally, our own group developed a new code, implementing the same solution method we had before, but providing parallel computing via the renowned PETSc library.
We embarked on a full replication study of our previous work, using three new fluid-flow codes.

This is the story of what happened next: three years of dedicated work that encountered a dozen ways that things can go wrong, conquered one after another, to arrive finally at (approximately) the same findings and a whole new understanding of what it means to do reproducible research in computational fluid dynamics.

arXiv:1605.04339v2 [physics.comp-ph] 18 May 2016

Fluid-flow solvers we used:

cuIBM: Used for our original study,2 this code is written in C++ and CUDA to exploit GPU hardware, but is serial on CPU. It uses the NVIDIA Cusp library for solving sparse linear systems on GPU.

OpenFOAM: A free and open-source CFD package that includes a suite of numerical solvers. The core discretization scheme is a finite-volume method applied on mesh cells of arbitrary shape. http://www.openfoam.org

IBAMR: A parallel code using the immersed boundary method on Cartesian meshes, with adaptive mesh refinement.

PetIBM: Our own re-implementation of cuIBM, but for distributed-memory parallel systems. It uses the PETSc library for solving sparse linear systems in parallel.

Story 1: Meshing and boundary conditions can ruin everything

Generating good discretization meshes is probably the most vexing chore of computational fluid dynamics. And stipulating boundary conditions on the edge of a discretization mesh takes some nerve, too. Our first attempts at a full replication study of the 2D snake aerodynamics with IcoFOAM, the incompressible laminar Navier-Stokes solver of OpenFOAM, showed us just how vexing and unnerving this can be.

OpenFOAM can take various types of discretization mesh as input. One popular mesh generator is called GMSH: it produces triangles that are as fine as you want them near the body, while getting coarser as the mesh points are farther away. Already, we encounter a problem: how to create a mesh of triangles that gives a comparable resolution to that obtained with our original structured Cartesian mesh?
After dedicated effort (see supplementary material), we produced the best mesh we could that matches our previous study in the finest cell width near the body. But when using this mesh to solve the fluid flow around the snake geometry, we got spurious specks of high vorticity in places where there shouldn't be any (Figure 1). Despite the fact that the mesh passed the quality checks of OpenFOAM, these unphysical vortices appeared for any flow Reynolds number or body angle-of-attack we tried, although they were not responsible for making the simulations blow up. Finally, we gave up on the (popular) GMSH and tried another mesh generator: SnappyHexMesh. Success! No unphysical patches in the vorticity field this time.

But another problem persisted: after the wake vortices hit the edge of the computational domain on the downstream side, a nasty back pressure appeared there and started propagating to the inside of the domain (Figure 2). This situation is also unphysical, and we were certain there was a problem with the chosen outflow boundary condition in OpenFOAM, but did not find any way to stipulate another, more appropriate boundary condition. We used a zero-gradient condition for the pressure at the outlet (and tried several other possibilities), which we found was a widespread choice in the examples and documentation of OpenFOAM. After months, one typing mistake when launching a run from the command line made OpenFOAM print out the set of available boundary conditions, and we found that an advective condition was available that could solve our problem (all this time, we were looking for a convective condition, which is just another name for the same thing). Finally, simulations with OpenFOAM were looking correct, and happily, the main feature of the aerodynamics was replicated: an enhanced lift coefficient at 35 degrees angle-of-attack (Figure 3). But not all is perfect.
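The essence of the advective (convective) outlet condition is easy to demonstrate in one dimension, away from any real solver. The toy sketch below (our own made-up problem, not an OpenFOAM case) advects a pulse through a 1D domain with a first-order upwind scheme; the last node obeys the advective condition du/dt + c du/dx = 0, discretized with the same upwind difference, so the pulse leaves the domain instead of piling up at the boundary:

```python
import numpy as np

# 1D linear advection u_t + c u_x = 0 with a first-order upwind scheme.
nx, c = 200, 1.0
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.5 * dx / c  # CFL number 0.5
u = np.exp(-((x - 0.5) ** 2) / 0.005)  # Gaussian pulse mid-domain

for _ in range(2 * nx):  # integrate to t ~ 1: the pulse should be long gone
    un = u.copy()
    # Upwind update for all nodes, including the outlet node: applying the
    # interior stencil at the last node IS the discrete advective condition
    # (a zero-gradient outlet would instead copy u[-1] = u[-2]).
    u[1:] = un[1:] - c * dt / dx * (un[1:] - un[:-1])
    u[0] = 0.0  # inlet: nothing coming in
print(abs(u).max())  # residual amplitude left in the domain: tiny
```

This is only the one-dimensional skeleton of the idea; in a Navier-Stokes solver the condition is applied to the flow variables at the outlet patch, which is what finally let the wake vortices exit cleanly.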
The time signatures of lift and drag coefficient do show differences between our IcoFOAM calculation and the original published ones (Figure 4). The key finding uses an average lift coefficient, calculated with data in a time range that is reasonable but arbitrary. Although the average force coefficients match (within …)

Figure 1: Vorticity field after 52 time units of flow simulation with IcoFOAM for a snake's section with angle-of-attack 35 degrees and Reynolds number 2000. We created a triangular mesh (about 700k triangles) with the free software GMSH. Notice the patch of spurious vorticity at the center-bottom of the domain.

Figure 2: Pressure field after 52 (top) and 53 (bottom) time units of flow simulation with IcoFOAM for the snake's section with angle-of-attack 35 degrees and Reynolds number 2000. The simulation crashed after about 62 time units because of the back pressure at the outlet boundary.

Figure 3: Time-averaged drag (top) and lift (bottom) coefficients as function of the snake's angle-of-attack for Reynolds numbers 1000 and 2000. We averaged all IcoFOAM force coefficients between 32 and 64 time units of flow simulation, as we have done in our previous study.

Unstructured grids are complex geometrical objects. Two unstructured meshes built with the same parameters will not even be exactly the same. The mesh-generation procedures are not necessarily deterministic, and regularly produce bad triangles that need to be repaired. The complications of building a good-quality mesh are one of the reasons some prefer immersed boundary methods!

Story 2: Other researchers' open-source codes come with traps

Open-source research software can often be poorly documented and unsupported, and on occasion it can even be an unreadable mess. But in this case, we are in luck.
IBAMR is a solid piece of software, well documented, and you can even get a swift response from the authors via the topical online forum. Still, we ran against an obscure trick of the trade that changed our results completely.

The numerical approach in IBAMR belongs to the same family as that used in our published work on wakes of flying snakes: an immersed boundary method.

Figure 4: Instantaneous force coefficients on the snake's section with angle-of-attack 30 degrees (top) and 35 degrees (bottom) at Reynolds number 2000. We compare the IcoFOAM results with the cuIBM results from our previous study. We created a mesh of 3.4 million cells (mostly hexahedra) with SnappyHexMesh, one of the OpenFOAM mesh utilities.

The essence of the approach is that the fluid is represented by a structured mesh, while the immersed body is represented by its own, separate mesh that moves with the body. We speak of an Eulerian mesh for the fluid, and a Lagrangian mesh for the solid. The forces exerted by the fluid on the body, and vice versa, appear as an additional integral equation and interpolation schemes between the two meshes. Their role is to make the fluid stick to the wall (no-slip boundary condition) and to allow the body to feel aerodynamic forces (lift and drag). IBAMR implements a sub-class of the immersed-boundary method called the direct-forcing method, suitable for rigid bodies.
Our cuIBM code uses a variant called the immersed-boundary projection method.4 Despite the variations, the essence is the same, and it is reasonable to assume they would work similarly. We already know that boundary conditions at the outlet of the computational domain can be problematic. This is no different with immersed boundary methods.

Figure 5: Vorticity field after about 61 time units of flow simulation with IBAMR for a snake's section with angle-of-attack 35 degrees and Reynolds number 2000. We used a zero-gradient condition at the outlet for both the velocity and pressure.

Our first attempt with IBAMR used a zero-gradient velocity boundary condition at the outlet. This resulted in a spurious blockage of the wake vortices when they reach the domain boundary: strong vorticity rebounds from the artificial boundary and propagates back into the domain (Figure 5). Of course, this is unphysical and the result is unacceptable. After a long search in the literature and in the documentation, it was through a conversation with the main developers on the online forum that we discovered the solution: using a stabilized outlet boundary condition, which adds a forcing to push the vortices out. (IBAMR does not provide a convective/advective boundary condition.) With this new configuration, the simulations of the snake profile resulted in a wake that looked physical, but a computed lift coefficient that was considerably different from our published study (Figure 6). Another deep dive into the literature led us to notice that a benchmark example described in a paper reporting on extensions to IBAMR5 was set up in an unexpected way: the no-slip condition is forced inside the body, and not just on the boundary. As far as we could find, the publications using IBAMR are the only cases where interior points are constrained. Other papers using immersed boundary methods apply the constraint only on boundary points.
When we followed their example, our simulations with IBAMR were able to reproduce the lift enhancement at 35 degrees angle-of-attack, although with a slightly different value of average lift (Figure 7).

Figure 7: Instantaneous force coefficients at Reynolds number 2000 for the snake's section at angle-of-attack 30 degrees (top) and 35 degrees (bottom). Here, the no-slip condition is enforced inside the section. We compare the IBAMR results with the cuIBM ones from our past study.

One of the issues may be that our community does not have a habit of communicating negative results, nor of publishing papers about software. We learned from this experience that using an open research code and getting correct results with it could involve a long investigative period, potentially requiring communication with the original authors and many failed attempts. If the code is not well documented and the original authors are not responsive to questions, then building your own code from scratch could be more sensible!

Story 3: All linear algebra libraries are not created equal

Our previous study used cuIBM, running on a single GPU device. The largest problem that we can fit in the memory of a high-end GPU has just a few million mesh points, which is not enough to solve three-dimensional flows. We developed PetIBM, a code that uses the same mathematical formulation as cuIBM, to allow solving larger problems on distributed CPU systems. Since PetIBM and cuIBM implement exactly the same numerical method, you'd expect that giving the two codes the same mesh and the same initial conditions will result in the same solution (within floating-point error). Not so fast!
We rely on external libraries to solve sparse linear systems of equations: Cusp for GPU devices and PETSc for distributed CPU systems. It turns out the iterative solvers may have differences that affect the final solution.

When repeating our previous simulations of the aerodynamics of a snake cross-section with PetIBM, the solutions do not always match those computed with cuIBM. At a Reynolds number of 1000, both the time-averaged lift and drag coefficients match. But at Reynolds number 2000, average lift and drag match up to 30 degrees angle-of-attack, but not at 35 degrees. That means we don't see lift enhancement (Figure 8), and the main finding of our previous study is not fully replicated. Looking at the time evolution of the force coefficients for the simulation with PetIBM at Re=2000 and 35 degrees angle-of-attack, we see a marked drop in lift after 35 time units (top graph in Figure 9). What is different in the two codes? Apart from using different linear algebra libraries, they run on different hardware. Leaving hardware aside for now, let's focus on the iterative solvers. Both Cusp and PETSc use the same convergence criterion. This is not always the case, and needs to be checked! We're also not using the same iterative solver with each library. The cuIBM runs (with Cusp) used an algebraic-multigrid preconditioner and a conjugate-gradient (CG) solver for the modified-Poisson equation. With PETSc, the CG solver crashed because of an indefinite preconditioner (having both positive and negative eigenvalues), and we had to select a different method: a bi-CG stabilized (BiCGStab) algorithm, while still using an algebraic-multigrid preconditioner.

Could this difference in linear solvers affect our unsteady fluid-flow solution? The solutions with both codes match at lower angles of attack (and lower Reynolds numbers), so what is going on? We checked everything multiple times. In the process, we did find some small discrepancies. Even a small bug (or two).
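The solver swap itself is easy to observe in miniature. The sketch below (a generic 1D Poisson system standing in for the modified-Poisson equation; nothing here comes from cuIBM or PetIBM) solves one linear system with CG and with BiCGStab: both report convergence at the same nominal tolerance, yet the two "converged" answers agree only to that tolerance, not to machine precision.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, bicgstab

# A symmetric positive-definite 1D Poisson matrix: a toy stand-in for the
# modified-Poisson system mentioned in the text (not the snake case).
n = 200
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x_cg, info_cg = cg(A, b)            # CG, as in the cuIBM/Cusp runs
x_bicg, info_bicg = bicgstab(A, b)  # BiCGStab, the PETSc fallback

# Both solvers signal success (info == 0) for the same default tolerance,
# yet the two solutions differ in their trailing digits.
print(info_cg, info_bicg, np.abs(x_cg - x_bicg).max())
```

In our actual runs the preconditioner differed too (Cusp's and PETSc's algebraic multigrid are separate code bases), adding yet another source of last-digit divergence.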
We found, for example, that the first set of runs with PetIBM created a slightly different problem set-up, compared with our previous study, where the body was shifted by less than one grid-cell width: rotating the body to achieve different angles of attack was done around a different center in each case (one code used the grid origin at (0,0), while the other used the body's center of mass). This tiny difference does result in a different average lift coefficient (bottom graph in Figure 9)! The time signal of lift coefficient shows that the drop we were seeing at around 35 time units now occurs closer to 50 time units, resulting in a different value for the average taken in a range between 32 and 64. Again, this range for computing the average is a choice we made. It covers about ten vortex-shedding cycles, which seems enough to calculate the average if the flow is periodic. What is causing the drop in lift? Visualizations of the wake vortices (Figure 10) show that a vortex-merging event occurs in the middle of the wake, changing the near-wake pattern. The previously aligned positive and negative vortices are replaced by a wider wake with a single clockwise vortex on the top side and a vortex dipole on the bottom part. With the change in wake pattern comes a drop in the lift force.

Postmortem. Although PetIBM implements the same immersed-boundary method and was developed by the same research group, we were not able to fully replicate the previous findings. The aerodynamic lift on a snake section at 35 degrees angle-of-attack is a consequence of the near-wake vortices providing extra suction on the upper side of the body. When a vortex-merging event changes the wake pattern, lift drops. Vortex merging is a fundamentally two-dimensional instability, so we expect that this problem won't trouble us in more realistic 3D simulations. But it is surprising that small changes (within the bounds of truncation error, round-off error and algebraic error) can trigger this instability, changing the flow appreciably.
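The rotation-center discrepancy is worth a few lines of arithmetic: rotating a body about the grid origin or about its centroid produces the same shape, offset by a constant translation, and when the centroid sits a little off the origin that offset can easily be a fraction of a grid-cell width. A sketch with a made-up contour (not the actual snake geometry):

```python
import numpy as np

def rotate(points, angle_deg, center):
    """Rotate 2D points counter-clockwise about an arbitrary center."""
    a = np.radians(angle_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return (points - center) @ R.T + center

# Hypothetical body contour whose centroid lies slightly off the origin.
t = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
body = np.column_stack([0.5 * np.cos(t) + 0.02, 0.1 * np.sin(t)])

# Same 35-degree angle of attack, two different rotation centers:
about_origin = rotate(body, -35.0, np.zeros(2))
about_centroid = rotate(body, -35.0, body.mean(axis=0))

# The two geometries differ by a uniform shift, not by shape:
shift = about_origin - about_centroid
print(shift[0], np.ptp(shift, axis=0))  # constant offset, near-zero spread
```

With a centroid only 0.02 chord lengths from the origin, the shift is already about one percent of the chord, which on a fine near-body mesh is a sizable fraction of a cell.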
Even when the only difference between two equivalent simulations is the linear algebra library used, there can be challenges to reproducibility.

Story 4: Different versions of your code, external libraries or even compilers may challenge reproducibility

In the span of about three years, we ran more than 100 simulations with OpenFOAM, IBAMR, and PetIBM, encountering about a dozen things that can go wrong.

Figure 8: Time-averaged drag (top) and lift (bottom) coefficients as function of the snake's angle-of-attack for Reynolds numbers 1000 and 2000, using the same Eulerian mesh as in our past cuIBM simulations. We show PetIBM results obtained when the immersed boundary is rotated around: (1) its center of mass (green and orange symbols) and (2) the reference origin (solo red marker).

We replicated our previous scientific finding (enhanced lift at 35 degrees angle-of-attack for sufficiently large Reynolds number) in two out of three campaigns. Ironically, the case that did not replicate our findings was that of our own code re-write. The original code (cuIBM) and the re-write (PetIBM) use different linear algebra libraries, and it's unnerving to think this could change our results. This final story is about what happened when we went back to our original code and tried to reproduce the published findings.

As we mentioned in the opening of this article, we adopted a set of practices years ago to make our research reproducible.
The study published as "Lift and wakes of flying snakes" followed the guidance of the Reproducibility PI Manifesto, which includes: (1) code developed under version control; (2) completed validation and verification, with the report published on Figshare; (3) open data and figures for the main results of the paper on Figshare; (4) pre-print made available on arXiv; (5) code released under the MIT License; (6) a Reproducibility statement in the paper. Of course we expect to be able to reproduce our own results!

Figure 9: Instantaneous force coefficients for the snake's section with angle-of-attack 35 degrees and Reynolds number 2000. The top plot compares the PetIBM results with those reported in our previous study. The bottom plot shows results with the immersed boundary being rotated around the reference origin or around its center of mass (the latter is slightly shifted compared to our previous study).

The first hurdle we faced is that, three years after we completed our previous study, we have updated our lab computers: new operating systems, new GPU devices, new external libraries. The code itself has been modified to implement new features. Happily, we have version control. So, we set out to reproduce our results with cuIBM using (1) the same old version of the code and (2) the current version. In both cases, we used identical input parameters (Lagrangian markers to discretize the geometry, grid parameters, flow conditions, and solver parameters). But the three-year-old simulations used a version of Cusp (0.3.1) that is no longer compatible with the oldest CUDA version installed on our machines (5.0).
Thus, we adapted the old cuIBM to be compatible with the oldest version of Cusp (0.4.0) that we can run.

Figure 10: Vorticity field after 19, 52, 53, and 64 time units of flow simulation with PetIBM for a snake's section at angle-of-attack 35 degrees and Reynolds number 2000. The vortex-merging event is responsible for the change in the wake signature and the drop in the mean lift coefficient.

The case at angle-of-attack 35 degrees and Reynolds number 2000 now gave an appreciable difference compared with our previous study: the instantaneous force coefficients start to slowly drop after about 60 time units (Figure 11(c)). Now, this is really the same code, with only a difference in the version of the linear algebra library. Repeating the case with the most current version of cuIBM and the same version of Cusp (0.4.0) leads to the same force signals, with a slight drop towards the end (Figure 11(d)). And the same is the case with the current version of cuIBM and a later version of Cusp (0.5.1). The final findings in these cases do not vary from our published work: there is, in fact, lift enhancement at 35 degrees angle-of-attack ... but the results match only because we calculate the average lift in a time interval between 32 and 64. Yet, the flow solution was affected by changing the version of a dependent library. (The revision history of Cusp says that the smoothed-aggregation solver was refactored between the two versions we are using.) The hardware was also different (a K20 GPU versus a C2070 in our older study), and so were the operating system and the compiler. In an iterative linear solver, any of these things could be related to lack of floating-point reproducibility.
And in unsteady fluid dynamics, small floating-point differences can add up over thousands of time steps to eventually trigger a flow instability (like vortex merging).

Postmortem. Making research codes open source is not enough for reproducibility: we must be meticulous in documenting every dependency and the versions used. Unfortunately, some of those dependencies will get stale over time, and might cease to be available or usable. Your application code may give the same answer with a different version of an external library, or it may not. In the case of unsteady fluid dynamics, the nonlinear nature of the equations, combined with the numerical non-reproducibility of iterative linear solvers (in parallel!), can change the results.

Lessons learned

Reproducibility and replication of studies are essential for the progress of science, and much of science today advances via computation. We use computer simulations to create new knowledge. How can we certify that this new knowledge is justified, that there is enough evidence to support it? The truth is that computational science and engineering lacks an accepted standard of evidence. We label computational research reproducible when authors provide all the necessary data and the computer code to run the analysis again, re-creating the results. But what data are necessary? We found that open-source code and open data sets are a minimal requirement. Exhaustive documentation during the process of computational research is key. This includes documenting all failures. Current publication custom is biased towards positive results: this is why we had to spend so much time discovering that IBAMR gave correct results with Lagrangian points inside the body. The negative results with points only on the boundary are not published. We learned how important the computational mesh and the boundary conditions can be.
A reproducible computational paper should include the actual meshes used in the study (or a deterministic mesh-generation code) and careful reporting of boundary conditions. This is rarely (if ever!) the case. We learned that in addition to developing our code under version control, we need to carefully record the versions used for all dependencies. In practice, such careful documentation is feasible only with a fully automated workflow: launching simulations via running scripts, storing command-line arguments for every run, capturing complete environment settings. Post-processing and visualization ideally should also be scripted, avoiding software GUIs for manipulation of images.

We learned that highly unsteady fluid dynamics is a particularly tough application for reproducibility. The Navier-Stokes equations are nonlinear and can exhibit chaotic behavior under certain conditions (e.g., geometry, Reynolds number, external forcing). Some flow situations are subject to instabilities, like vortex merging in two dimensions and other vortex instabilities in 3D. In any application that has sufficient complexity, we should repeat simulations, checking how robust they are to small variations. And report negative results! Understandably, long 3D simulations that take huge computational resources may not be feasible to repeat. We should continue the conversation about what it means to do reproducible research in high-performance computing (HPC) scenarios. When large simulations run on specific hardware with one-off compute allocations, they are unlikely to be reproduced. In this case, it is even more important that researchers advance towards these HPC applications on a solid progression of fully reproducible research at the smaller scales.

Computational science and engineering makes ubiquitous use of linear algebra libraries like PETSc, Hypre, Trilinos and many others. Rarely do we consider that using different libraries might produce different results. But that is the case.
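What "converged at tolerance 1e-8" means is itself library-dependent. The sketch below (a plain Jacobi iteration on a made-up system; no real library's exit test is reproduced here) applies the same nominal tolerance under two common definitions of the stopping criterion, relative to the right-hand side versus relative to the initial residual, and stops after a different number of iterations:

```python
import numpy as np

def jacobi(A, b, x0, tol, reference):
    """Jacobi iteration with a pluggable convergence test.

    reference="rhs": stop when ||r_k|| <= tol * ||b||
    reference="r0":  stop when ||r_k|| <= tol * ||r_0||
    Both are common exit tests in sparse linear-algebra libraries.
    """
    d = np.diag(A)
    x = x0.copy()
    ref = np.linalg.norm(b) if reference == "rhs" else np.linalg.norm(b - A @ x)
    for k in range(10_000):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * ref:
            return x, k
        x = x + r / d  # Jacobi update
    return x, k

# Diagonally dominant toy system with a deliberately bad initial guess,
# so that ||r_0|| is much larger than ||b||.
n = 50
A = 2.0 * np.eye(n) + 0.5 * np.eye(n, k=1) + 0.5 * np.eye(n, k=-1)
b = np.ones(n)
x0 = 100.0 * np.ones(n)

_, k_rhs = jacobi(A, b, x0, 1e-8, "rhs")
_, k_r0 = jacobi(A, b, x0, 1e-8, "r0")
print(k_rhs, k_r0)  # same nominal tolerance, different iteration counts
```

Real libraries add further variations (preconditioned versus true residual norms, absolute-tolerance floors); PETSc, for example, lets users change both the residual norm and the convergence test. When comparing libraries, the exit test must be matched by hand, not assumed.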
Sparse iterative solvers use various definitions of the tolerance criterion to exit the iterations, for example. The very definition of the residual could be different. This means that even when we set the same value of the tolerance, different libraries may declare convergence differently! This poses a challenge to reproducibility, even if the application is not sensitive to algebraic error. The situation is aggravated by parallel execution. Global operations on distributed vectors and matrices are subject to rounding errors that can accumulate to introduce uncertainty in the results.

Figure 11: Instantaneous force coefficients on the snake's section at (a) Reynolds number 1000 and angle-of-attack 35 degrees and (b) Reynolds number 2000 and angle-of-attack 30 degrees. (c) Instantaneous force coefficients at Reynolds number 2000 and angle-of-attack 35 degrees, running the same version of cuIBM (adapted to Cusp 0.4.0) as the one used for our previous study.
(d) Drop of the mean force coefficients observed towards the end of the simulation using two versions of cuIBM ("current" and "old") with different Cusp versions (0.4.0 and 0.5.1).

We are recommending more rigorous standards of evidence for computational science and engineering, but the reality is that most CFD papers are not even accompanied by a release of code and data. The reasons for this are varied: historical, commercial interests, academic incentives, time efficiency, etc. The origins of CFD were in secret research at the Los Alamos Laboratory in the 1950s, and when computer code was stored in large boxes of punched cards, there was hardly a way to share it. The 1970s saw the birth of commercial CFD, when university professors and their students founded companies funded under the US government's SBIR program. It's not unreasonable to speculate that the potential for commercial exploitation was a deterrent to the open-source release of CFD codes for a long time. It is only in the last 15 years or so that open-source CFD codes have become available. But the CFD literature became entrenched in the habit of publishing results without making available the code that generated those results. And now, we face the clash between the academic incentive system and the fact that reproducible research takes a substantial amount of time and effort. This campaign to replicate our previous results taught us many lessons on how to improve our reproducibility practices, and we are committed to maintaining this high standard.
We will continue to share our experiences.

Supplementary materials

We provide supplementary materials in the GitHub repository for this paper, including: (1) all input files to generate the runs reported in the paper; (2) Jupyter notebooks for all the simulations, detailing every condition (dependencies, compilation, mesh, boundary conditions, solver parameters, command-line options); (3) Python codes and bash scripts needed to recreate all the figures included in the paper.

Definition of reproducible research:

The literature is rife with confused and sometimes contradictory meanings for reproducible research, reproducibility, replicability, repetition, etc. It is thus worth clarifying what we mean by these terms. The phrase "reproducible research", in reference to computational studies that can be reproduced by other scientists, was introduced by geophysics professor Jon Claerbout in the 1990s. An early volume of CiSE published an article about the reproducible-research environment created in his group at Stanford.6 Their goal was complete documentation of scientific computations, in such a way that a reader can reproduce all the results and figures in a paper using the author-provided computer programs and raw data. This ability requires open data and open-source software, and for this reason the reproducibility movement is closely linked with the open-science movement. In the following years, the term replication was distinctly adopted to refer to an independent study, re-running experiments to generate new data that, when analyzed, leads to the same findings. We follow this convention, clarified more recently in an article in Science.7

Reproducible research: The authors of a study provide their code and data, allowing readers to inspect and re-run the analysis to recreate the figures in the paper.
Reproducible research makes replication easier.

Replication: An independent study that generates new data, using similar or different methods, and analyzes it to arrive at the same scientific findings as the original study.

CiSE has dedicated two theme issues to reproducibility: January/February 2009 and July/August 2012.

Our codes are available for unrestricted use, under the MIT license; to obtain the codes and run the tests in this paper, the reader may follow the instructions in the GitHub repository for this paper.

References

1. Barba, L. A. (2012). "Reproducibility PI Manifesto," figshare; presented at the ICERM Workshop on Reproducibility in Computational and Experimental Mathematics (December 10-14, 2012).
2. Krishnan, A., Socha, J. J., Vlachos, P. P., Barba, L. A. (2014). Lift and wakes of flying snakes. Physics of Fluids, 26(3), 031901.
3. Krishnan, A., Socha, J. J., Vlachos, P. P., Barba, L. A. (2013). Lift and drag coefficient versus angle of attack for a flying snake cross-section. figshare.
4. Taira, K., Colonius, T. (2007). The immersed boundary method: A projection approach. Journal of Computational Physics, 225, 2118-2137.
5. Bhalla, A. P. S., Bale, R., Griffith, B. E., Patankar, N. A. (2013). A unified mathematical framework and an adaptive numerical method for fluid-structure interaction with rigid, deforming, and elastic bodies. Journal of Computational Physics, 250, 446-476.
6. Schwab, M., Karrenbach, N., Claerbout, J. (2000). Making scientific computations reproducible. Computing in Science and Engineering, 2(6), 61-67.
7. Peng, R. (2011). Reproducible Research in Computational Science. Science, 334(6060), 1226-1227.

Acknowledgments

We are grateful for the support from the US National Science Foundation, grant number NSF OCI-0946441, and from NVIDIA Corp. for equipment donations under the CUDA Fellows Program. LAB would like to acknowledge the hospitality of the Berkeley Institute of Data Science (BIDS), where this paper was written.

Olivier Mesnard is a doctoral student at the George Washington University.
He has an engineering degree from École Supérieure de Mécanique (SUPMECA) in Paris, and a Master's degree in Sciences from Université Pierre et Marie Curie, also in Paris. His research interests include computational fluid dynamics and immersed-boundary methods with application to animal locomotion.

Lorena A. Barba is Associate Professor of Mechanical and Aerospace Engineering at the George Washington University. She obtained her PhD in Aeronautics from the California Institute of Technology. Her research interests include computational fluid dynamics, especially immersed boundary methods and particle methods for fluid simulation; fundamental and applied aspects of fluid dynamics, especially flows dominated by vorticity dynamics; the fast multipole method and applications; and scientific computing on GPU architecture. She received the Amelia Earhart Fellowship, a First Grant award from the UK Engineering and Physical Sciences Research Council (EPSRC), the National Science Foundation Early CAREER award, and was named CUDA Fellow by NVIDIA Corp. She is a Visiting Scholar at the Berkeley Institute of Data Science.

