NEWS FOCUS

Supercomputers Helping Scientists Crack Massive Problems Faster

New high-powered number crunchers popping up in computer centers across the U.S. are attracting more theoretical chemists and extending the range of problems that can be solved effectively

Ron Dagani, C&EN Washington

In 1978, Charanjit S. Pangali, then a graduate student working with chemical physicist Bruce J. Berne at Columbia University, wrote a computer program to solve a difficult problem in statistical mechanics. Simply put, they were trying to predict the effect of intermolecular forces on two xenon atoms in water. Using a conventional mainframe and a minicomputer, both located on campus, they got their answer, but only after the machines had spent a total of some 500 expensive hours grinding out solutions to a blizzard of equations. Two months ago, Pangali tried the same problem on a supercomputer. He got the correct answer in 19 minutes, a speedup of roughly 1500-fold.
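The article does not describe Pangali's program, but the workhorse technique for statistical-mechanics problems of this kind is Metropolis Monte Carlo sampling of molecular configurations weighted by an intermolecular potential. The following is a minimal sketch in modern Python, purely illustrative and anachronistic for 1978: it assumes a generic Lennard-Jones fluid in reduced units, and the atom count, box size, temperature, and move parameters are arbitrary stand-ins, not the model or values Berne's group actually used.

```python
import math
import random

# Illustrative sketch only: a bare-bones Metropolis Monte Carlo loop for a
# small Lennard-Jones fluid in reduced units (energy in epsilon, length in
# sigma, Boltzmann constant = 1). All parameters below are invented for
# illustration; they are not from the calculation described in the article.
EPSILON = 1.0
SIGMA = 1.0
N_ATOMS = 20
BOX = 8.0          # side of the cubic periodic box
TEMPERATURE = 1.2
N_STEPS = 5000
MAX_MOVE = 0.2     # maximum displacement per trial move

def lj_pair(r2):
    """Lennard-Jones pair energy for a squared separation r2."""
    inv6 = (SIGMA * SIGMA / r2) ** 3
    return 4.0 * EPSILON * inv6 * (inv6 - 1.0)

def atom_energy(atoms, i):
    """Energy of atom i with all other atoms, minimum-image convention."""
    e = 0.0
    xi, yi, zi = atoms[i]
    for j, (xj, yj, zj) in enumerate(atoms):
        if j == i:
            continue
        dx = (xi - xj) - BOX * round((xi - xj) / BOX)
        dy = (yi - yj) - BOX * round((yi - yj) / BOX)
        dz = (zi - zj) - BOX * round((zi - zj) / BOX)
        e += lj_pair(dx * dx + dy * dy + dz * dz)
    return e

random.seed(1985)
atoms = [[random.uniform(0, BOX) for _ in range(3)] for _ in range(N_ATOMS)]

accepted = 0
for step in range(N_STEPS):
    i = random.randrange(N_ATOMS)
    old = atoms[i][:]
    e_old = atom_energy(atoms, i)
    # Trial move: random displacement, wrapped back into the box.
    atoms[i] = [(c + random.uniform(-MAX_MOVE, MAX_MOVE)) % BOX for c in old]
    e_new = atom_energy(atoms, i)
    # Metropolis criterion: accept downhill moves always, uphill moves
    # with Boltzmann probability exp(-dE/kT).
    if e_new <= e_old or random.random() < math.exp(-(e_new - e_old) / TEMPERATURE):
        accepted += 1
    else:
        atoms[i] = old  # reject: restore the previous position

print(f"acceptance ratio: {accepted / N_STEPS:.2f}")
```

Even this toy version hints at where the 500 hours went: every trial move requires reevaluating pair interactions against every other particle, and a production calculation with a realistic water model repeats that inner loop millions of times.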

With performance like that, the supercomputer is quickly becoming the darling of computationally oriented researchers. Its dazzling speed is enabling them to tackle gargantuan problems that previously were too time-consuming even to be considered. After all, how many scientists could afford to tie up their institution's mainframe for thousands of hours while it plugged away on a single problem? But with the arrival of the latest generation of supercomputers, such mind-benders often can be cracked in a mere hour or two.

Until now, though, relatively few researchers have had access to supercomputers. The few machines that existed were ensconced in a handful of corporate labs, mainly at oil and aerospace companies, or in government installations devoted largely to energy and weapons research, intelligence gathering, or the space program. Academic scientists, in particular, usually found themselves out in the cold. Most universities simply couldn't afford the $5 million to $20 million systems. And unless they were involved in problems of interest to the government, most professors had little hope of logging on to the prized machines. Some of them, out of necessity, took their research projects to Europe, where supercomputers have been more accessible.

Fortunately, such overseas pilgrimages no longer are necessary. These most powerful of modern scientific computers now are popping up in more U.S. computer centers than ever before.

Cray customer engineer Jerry Cavallo briefs Anita Winfield, computer support supervisor at Lawrence Livermore National Lab's magnetic fusion computer center, on the center's new Cray-2 supercomputer


In a move that has generated much excitement, the National Science Foundation is helping set up five national supercomputing centers that will be available to scientists in all 50 states (see box below). The increased access to supercomputers is expected to have a dramatic impact on research, engineering, and higher education in the U.S. These computers are extending the range of problems that can be solved effectively. They also are furnishing more detailed, accurate, and realistic solutions to complex problems that previously could be solved only to a crude approximation. Furthermore, many more professors and students now will be getting hands-on experience with supercomputers. People are betting that this will boost the U.S.'s scientific standing in the world.

Although supercomputers are capable of performing hundreds of millions of arithmetic operations per second, they cannot be considered merely hyped-up calculators. Rather, they are machines that can reveal the unseeable and even the not otherwise knowable. They are helping scientists make discoveries in two key ways: first, by mathematically simulating natural phenomena, such as the birth of a star, that cannot be studied at close hand; and second, by testing hypotheses and simulating experiments that would be difficult or impossible to carry out in the laboratory.

Such simulations can have very practical benefits. For example, by simulating the air flow around an entire airplane on a computer screen, engineers can try out new aerodynamic designs without resorting to time-consuming wind-tunnel tests on prototypes. By charting the flow of reactants and products inside an internal-combustion engine, researchers will be able to design more fuel-efficient and trouble-free engines. And by simulating the circulation of the oceans and atmosphere and the effects of carbon dioxide buildup in the atmosphere, scientists expect to improve long-range weather forecasting.

The great value of such simulations is that they permit researchers "to circumvent some real-world constraints," according to mathematician Bill L. Buzbee and physicist David H. Sharp of Los Alamos National Laboratory. For example, a nuclear reactor accident can be played out without risking environmental contamination. Or a chemical reaction that takes place in the blink of an eye can be scrutinized in slow motion on a graphics terminal.

Among chemical scientists, the supercomputer has been embraced most avidly by the theoretical chemistry community, in whose midst the line between physical chemists and chemical physicists tends to blur.

NSF guiding supercomputer use to new highs

The year 1986 is going to see an explosion of supercomputer use, says John W. D. Connolly, director of the Office of Advanced Scientific Computing (OASC) at the National Science Foundation. Academic researchers who had been chafing over the general unavailability of supercomputers finally will get their chance to log on to these most powerful of number crunchers.

The big jump in use will come at the five new national supercomputer centers that NSF announced earlier this year (C&EN, March 11, page 14; July 8, page 7). These will be located at the University of California, San Diego; the University of Illinois; Cornell University; Princeton University; and the Pittsburgh Center for Advanced Computing. The last is a consortium consisting of Carnegie-Mellon University, the University of Pittsburgh, and Westinghouse Electric.

To NSF insiders, these five sites are known as Phase II centers, to distinguish them from six sites, called Phase I centers, where many NSF grantees have gotten their first taste of working with a supercomputer. The Phase I centers, such as Purdue University, the University of Minnesota, and Boeing Computer Services, already owned superspeed machines before NSF's initiative began.

NSF wanted to wedge open the gates to large-scale computing as soon as possible, so it started buying supercomputer time at the Phase I centers and parceling it out to NSF grantees. The foundation awarded about 5000 hours in fiscal 1985. For the second year of the program, the allocation will be increased to about 8000 hours, Connolly says. Once the Phase II centers come on line in the next seven months, the amount of supercomputer time available will shoot up to about 60,000 hours. That's what Connolly means by an explosion.

As it looks now, whatever demand develops for supercomputer time will have to be handled largely by the Phase II centers already announced. Because of the unpromising budgetary outlook, NSF is not planning to set up any more Phase II centers in the near future, Connolly tells C&EN. Of course, he adds, if Congress makes more funds available, the number of centers could rise to 10 or 12.

For fiscal 1986, NSF requested $46.2 million for its supercomputing initiative. That's 15% more than it got last year and is, essentially, what Congress gave it in the recently concluded budgetary battle. NSF plans to spend about $200 million over the next five years on the Phase II centers. These centers will get between $7 million and $13 million apiece each year, depending on their size and scope.

Already, about 400 research groups at almost 100 institutions around the U.S. have received NSF grants involving supercomputer use. About 60 of these groups, or 15%, are involved in chemical research. Not surprisingly, the areas that account for the bulk of supercomputer use are physics, materials science, chemistry, engineering, and mathematics. In chemistry, researchers use the machines most frequently for computations in quantum mechanics and statistical mechanics. Supercomputers don't seem to have broken into other areas of chemistry where smaller computers are used routinely. No one, as yet, is using a supercomputer to plan organic syntheses, for example.

"Very few experimentalists in chemistry are at the stage where they need a