Computational fluid dynamics—a technology that has emerged
within the past decade because of the availability of ever more powerful
supercomputers—is completely changing the way aerospace vehicles are
designed and tested. This new technology represents as significant a milestone in
flight as the invention of the wind tunnel, which it complements. Just as the wind
tunnel was the essential first step toward heavier-than-air vehicles, the new
computer-based analytical techniques will make possible the high-performance
vehicles of the future.
The role of the wind tunnel is often overlooked. Everybody
knows of the Wright brothers' success at Kitty Hawk, N. C., on December 17,
1903. What is not widely known is that three years earlier, back in their shop
in Dayton, Ohio, the two bicycle-makers achieved the breakthrough that made
that flight possible and ushered in the age of aviation. They built the first
crude wind tunnel to test their designs before they flew them.
Until then there was only one way to test aircraft: Fly
them. That's what all the other aviation pioneers did. Many of them, like Otto
Lilienthal, died in the process. Others, like Samuel Langley, suffered a
series of embarrassing failures.
The Wright brothers correctly guessed that the key to
powered flight was the way the cross-section shape of the wings provided lift.
Birds don't fly simply by flapping their wings; birds fly because their wings
are remarkably efficient airfoils.
Once that principle of lift had been established in
ground-based testing, the first flight was, scientifically speaking, almost an
anticlimax. Nearly ninety years later, aircraft designers still base their
work on this principle as they expand the flight envelope to ever greater
speeds and altitudes.
Testing the Next Generation
Air Force Systems Command operates the world's largest
aerospace ground-test facility, the $3 billion complex of wind tunnels and environmental
chambers at the Arnold Engineering Development Center (AEDC) near Tullahoma,
Tenn. Since it opened for business in 1951, this facility has tested most of
the Air Force's new aircraft and missiles along with such NASA vehicles as
Gemini, Apollo, and the Space Shuttle.
This also is where the Air Force will test its next
generation of vehicles, including the Advanced Tactical Fighter and the X-30
National Aerospace Plane. These new vehicles will operate in a much more
demanding environment and therefore will require much more complex testing.
This is where computers become a critical factor.
Computerized simulation of aerodynamics is not new. The
idea of "flying" an airplane in a computer before undertaking
dangerous flight tests emerged after World War II from pioneering work by the
Air Force, the National Advisory Committee for Aeronautics (NACA, the
predecessor to NASA), and the aerospace industry.
What is new is the power of today's supercomputers, which
can analyze the airflow around aerodynamic vehicles with sufficient precision
to enable those vehicles to operate in the more demanding flight regimes of the future.
Although the Wright brothers were the first to demonstrate ground testing,
they made a fundamental error: They thought the flow of air under the wing provided
the lift. Today aerodynamicists know that it is the partial vacuum created
above the airfoil that is responsible for lift. An error like that was no
problem for an aircraft with the performance of the Wright Flyer. It would be
fatal for today's aircraft.
All new flight programs will rely on computational fluid
dynamics. Breaking the term into its component parts makes it easier to understand.
The computational part is obvious. This is a technology based on the use of
computers to do calculations that were heretofore impossible. A fluid is what
airplanes fly in; it's called air. The key to the concept is the third
part—dynamics. By knowing the dynamic interaction of a vehicle with its
environment, developers can optimize its performance.
Thus, CFD, as it's known, is essentially a set of software
techniques that rides the computer industry's trend toward ever more
powerful machines for a variety of demanding applications.
At its Ames Research Center near San Francisco, for example,
NASA has just put into operation a Cray Y-MP supercomputer capable of more than
a billion computations a second. NASA is shooting for a trillion computations
per second at its Numerical Aerodynamic Simulation Facility there by the end
of the century.
New Tools at Arnold
At the Arnold test site, the Air Force operates two smaller
Cray supercomputers, an X-MP and an earlier model Cray 1, both linked to each
other and to a larger Cray 2 at Kirtland AFB, N. M. These are the hardware
tools of AEDC's CFD efforts.
The critical software tools have evolved over the past ten
years, recalls Dr. Donald C. Daniel, chief scientist at the Arnold center.
They consist of two parts: gridding, which is a mathematically generated
picture of the air vehicle that he calls "a sophisticated checkerboard,"
and the algorithms that the computer uses to calculate the airflow over the
simulated vehicle (or through it, in the case of a propulsion system). The
more grid points that can be analyzed and the more sophisticated the algorithms
(actually partial differential equations) used to analyze them, the more accurately
the vehicle's performance can be calculated.
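The two ingredients Dr. Daniel names, a grid and an algorithm, can be caricatured in a few lines of code. The sketch below is a toy illustration, not AEDC's software: it lays a square "checkerboard" of grid points over a region and applies the simplest such algorithm, Jacobi iteration on a discretized Laplace equation, the textbook model for potential flow. Production CFD codes solve the far harder Navier-Stokes equations on body-fitted grids, but the pattern of grid plus iterative PDE solver is the same.

```python
# Toy CFD ingredients: a "checkerboard" grid plus an iterative algorithm.
# The PDE here is Laplace's equation, the simplest flow model; real codes
# solve the Navier-Stokes equations on far larger, body-fitted grids.
n = 21                                    # grid points per side
phi = [[0.0] * n for _ in range(n)]       # flow potential at each grid point
phi[0] = [1.0] * n                        # hypothetical boundary condition on one face

for _ in range(500):                      # Jacobi sweeps: average the four neighbors
    new = [row[:] for row in phi]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            new[i][j] = 0.25 * (phi[i - 1][j] + phi[i + 1][j]
                                + phi[i][j - 1] + phi[i][j + 1])
    phi = new
# The converged solution decays smoothly from the driven boundary into the field.
```

More grid points and a more sophisticated update rule would buy accuracy, exactly the trade-off described above.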
Furthermore, these calculations only begin on the vehicle's
surface. They must be extended outward from the vehicle's body with emphasis
on flow gradients (changes of flow) that affect vehicle performance. This
would be a simple process if all aerospace vehicles were perfect spheres or
cylinders. They aren't.
Because of the complex shapes that have to be tested,
according to Dr. Daniel, the software engineers' task is to develop equally
complex adaptive grids incorporating a feedback loop between the solution and
the grid. This is a tedious process, and Dr. Daniel notes that it took a year
to initially set up the grid and solve the flow field for the F-16 fighter.
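The feedback loop between solution and grid can be sketched in one dimension: compute a solution on a coarse grid, sense where it changes fastest, and add points there. The "solution" below is a hypothetical shock-like profile standing in for a real flow computation, and the refinement threshold is an arbitrary illustrative value.

```python
import math

def flow_quantity(x):
    # Hypothetical stand-in for a computed flow variable with a
    # shock-like steep gradient near x = 0.5.
    return math.tanh(20.0 * (x - 0.5))

grid = [i / 10 for i in range(11)]        # coarse uniform starting grid
for _ in range(4):                        # feedback loop: solve, sense, refine
    refined = [grid[0]]
    for a, b in zip(grid, grid[1:]):
        # A large jump across a cell signals a steep gradient: split the cell.
        if abs(flow_quantity(b) - flow_quantity(a)) > 0.2:
            refined.append((a + b) / 2.0)
        refined.append(b)
    grid = refined
# Points now cluster where the flow changes fastest, leaving the
# smooth far field coarse.
```

An adaptive grid like this concentrates computing effort where the physics demands it, which is why the F-16 setup Dr. Daniel mentions took so long to get right.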
Questions at Mach 15
Further complicating the process is the need for a better
understanding of the basic aerodynamic processes. "We still don't
understand turbulence," Dr. Daniel says. "It's more or less random,
and we can't model a random event well." He expects there's enough
research to be done in this area to keep scientists busy for the rest of this century.
The problem isn't so bad at subsonic and supersonic speeds.
It's the transonic regime that worries scientists like Dr. Daniel. He calls
that "the most nonlinear part" of the flight envelope, or the one in
which the relationship between flow fields and vehicle performance is least understood.
When it comes to hypersonic vehicles like the X-30 operating at Mach 15 at
300,000 feet, Dr. Daniel can only shrug, "What's your guess?"
Nonetheless, the basic principles of CFD are in place to
handle future flight programs. Dr. Daniel pays tribute to Boeing for its
pioneering work on its 757 and 767 commercial jetliners, adding that the Air
Force will get maximum benefits from the technology on the ATF, "where the
tools were there from the inception of the aircraft."
Dr. Edward M. Kraft, manager of the technology and analysis
branch of the Calspan Corp. contractor team operating the wind tunnel test
facilities at Arnold, describes the synergistic relationship among the three
facets of vehicle testing: ground testing (in which the wind tunnel is the
traditional tool), flight testing, and CFD. "Each tool has its
limitations," he says, "but the other tools overlap and accommodate."
Ground tests can't duplicate all conditions, particularly in
the case of a spacecraft, but they are less costly and less dangerous. Flight
tests are still essential because they represent the "truth,"
according to Dr. Kraft: "What you see is what you get." CFD is now
entering the picture as part of an effort to do the diagnostics first and thus
minimize ground testing and certification changes later in the program. As Dr.
John H. Fox, a principal engineer with the Calspan technology and analysis
branch, puts it, "We fly the aircraft on the computer."
"The name of the game is optimizing," adds Ralph E.
Graham, chief of the aeronautical systems division at Arnold's directorate of
aerospace flight dynamics test. "We're looking for the last one percent of performance."
Graham cites a very practical application of CFD that is paying
off for the Air Force right now: certifying the release of stores. The Air
Force has 110 kinds of stores (fuel tanks, bombs, missiles) in its inventory,
he explains, and they're used with a variety of different aircraft. This adds
up to thousands of possible combinations, so certifying a particular store for
a particular aircraft can be a lengthy, costly process.
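A quick back-of-envelope count shows how the combinations multiply. Only the store count comes from the article; the aircraft and pylon-station figures below are hypothetical illustrations.

```python
# Rough check on "thousands of possible combinations."
stores = 110                 # from the article's inventory figure
aircraft_types = 15          # hypothetical number of host aircraft
stations = 4                 # hypothetical pylon stations per aircraft

combinations = stores * aircraft_types * stations
# Even with modest assumed figures, the pairings run into the thousands.
```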
Instead, by using CFD in conjunction with wind tunnel test
and analysis to determine the basic aerodynamic behavior of the stores and their
host aircraft, the Air Force will be able to greatly reduce flight testing—in
some cases by fifty percent—and "mix and match" the two. To do this
entirely in a wind tunnel could take up to three years.
With CFD, wind tunnel testing and analysis, according to
Graham, the process at AEDC can be cut to three months. How much money could
this save? "The cost of an F-15," Graham quips. There's also a
potential performance improvement in better circular error probable (CEP) for
air-to-ground and air-to-air missiles.
Tracy Donegan, a Calspan senior engineer, describes a
typical CFD project completed last August for the F-15 fighter: The entire
aircraft (except its tail) with its seven pylons, a store, and pod was
computationally simulated with 1.1 million grid points. It took four engineers
six months working part-time to develop all the algorithms for the grids and the flow solution.
The initial purpose was to determine the aircraft/store
flow field, but the program became much broader than that, Donegan explains.
For the first time it gave the Air Force a picture of the flow field around a
complete aircraft. That picture is available on demand at a video computer
terminal in three dimensions and color-coded to show flow field gradients.
This technology is now available to airframe prime contractors, and Donegan estimates
the X-30 would require about the same number of grid points.
"CFD hasn't been extensively applied from cradle to
grave," says Col. Dale F. Vosika, Arnold's deputy for operations, "but
it does give us a level of expertise when integrated with ground and flight
tests." In the case of the X-30, he notes, the lack of ground-test
facilities will require a lot of computer simulation. This program, as well as
the ATF, will require coordination with the Air Force's Aeronautical Systems
Division (particularly the Flight Dynamics Laboratory) and the Air Force
Flight Test Center. Colonel Vosika cites the complexity of the aircraft—higher
performance, speeds, and maneuverability. The supercomputer complex at the NASA
Ames Center will also be heavily involved in CFD studies to support the X-30.
Reducing test time by using the electronic analog known as
CFD has a major impact on costs, according to Rampy, who estimates that
electrical power requirements eat up seventy percent of all test costs. That's
cheaper than the operating costs of test aircraft, but it is still a cost to be
avoided if possible.
In fact, this voracious appetite for electrical power is why
the Arnold center is located in the heart of Tennessee Valley Authority
territory. The availability of relatively low-cost power—plus water for cooling
the test facilities—reduces overall costs.
They are still hefty. Col. (Brig. Gen. selectee) Stephen P.
Condon, Arnold's Commander, has an electricity bill that would make most
homeowners blanch: $2 million a month. That amounts to nearly 500,000
megawatt-hours a year—enough, he says, to power a sizable city.
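Those figures hang together under quick arithmetic. The household consumption figure below is a rough assumption for scale, not from the article.

```python
# Sanity check on the center's power figures.
annual_mwh = 500_000                     # megawatt-hours a year, from the article
monthly_bill = 2_000_000                 # dollars a month, from the article

avg_megawatts = annual_mwh / 8_760       # hours in a year: roughly 57 MW average load
cost_per_mwh = monthly_bill / (annual_mwh / 12)   # roughly $48 per MWh

# Assumption: a household uses on the order of 10 MWh a year, so this
# is tens of thousands of homes' worth of electricity.
households = annual_mwh / 10
```

The implied rate of under five cents per kilowatt-hour is exactly the kind of low-cost TVA power the paragraph above describes.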
CFD Saves Money
Dr. Keith L. Kushman, chief of the center's facility
technology division, has pinpointed some of the cost savings attributable to
CFD. He figures the computational costs at Arnold at about $4 million a year,
of which half is salaries and most of the rest is the amortized cost of the
supercomputers. He has documented more than $2 million in cost savings to the
center's customers (principally other elements of the Air Force Systems
Command), but he estimates there is another $8 million in intangible savings
from reduced risks to conventional ground-test equipment by doing the tests in
a computer instead of wind tunnels. Furthermore, he maintains, half of the
tests his team has conducted couldn't be done at all without CFD.
His colleagues at Wright-Patterson AFB, Ohio, agree.
"Computational aerodynamic simulation now is a valid, inexpensive
alternative to wind tunnel testing of new aircraft and aerospace designs,"
according to a statement by Dr. Joseph J. S. Shang, a technical manager at the
Flight Dynamics Lab, after a series of simulations four years ago using the
X-24C lifting body. The computed results duplicated the results of earlier
wind tunnel tests for flow fields and aerodynamic forces on the vintage 1974
experimental reentry vehicle.
As supercomputers become even more powerful, the technology
of CFD can be extended even further, according to Arnold chief scientist Dr.
Daniel. He is more concerned about memory capacity than about
multibillion-operation speeds and says even the 256-million-word memory of the
top-of-the-line Cray 2 "won't be nearly enough" for some of the
projects he has in mind.
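Dr. Daniel's memory worry is easy to motivate with rough arithmetic. The per-point storage multiplier below is an illustrative assumption; the 1.1-million-point grid is the F-15 case described earlier, and the finer hypersonic grid is hypothetical.

```python
# Why 256 million words "won't be nearly enough": a rough storage estimate.
cray2_words = 256_000_000        # Cray 2 memory capacity, from the article

f15_points = 1_100_000           # F-15 grid size, from the article
words_per_point = 25             # assumed: flow variables, grid coordinates,
                                 # and solver workspace per point (illustrative)

f15_words = f15_points * words_per_point           # tens of millions: fits
hypersonic_points = 50_000_000                     # hypothetical finer grid
hypersonic_words = hypersonic_points * words_per_point   # overflows the Cray 2
```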
"The great thing about supercomputers is that they
unlock the mind," Dr. Daniel concludes.
John Rhea is a
freelance writer in Woodstock, Va., who specializes in technology issues. He
is the author of SDI—What Could Happen: 8 Possible Star Wars Scenarios, published in 1988 by Stackpole Books.