A Supercomputing Survivor [Archives:2000/49/Science & Technology]

December 4, 2000

Prof. Eng. Salem Al-Abdel-Rahman
Fellow of ISES (Aust.), AAS (USA), ICTP (Italy)
Once, in the late 1980s and early 1990s, there were four national supercomputing centers. Now there are only two: the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign and the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, in La Jolla, each the hub of a multi-institutional alliance. Both centers arose from unsolicited proposals to the National Science Foundation (NSF) by their directors, and if NCSA is the child of physicist Larry Smarr [see: S. Al-Abdel-Rahman, The World of Supercomputing, AIP, 1995, p. 496, USA], SDSC is the creation of nuclear engineer Sidney Karin.
"The effects of this change (from four individual supercomputing centers to two alliances) haven't yet been felt, but the effects will be enormous," says Karin, who is director of the San Diego-based National Partnership for Advanced Computational Infrastructure (NPACI). In the past, the centers catered to a core of traditional users, big consumers of large numbers of floating-point operations for simulation, he says. "But there are broadening and integrative activities going on in many dimensions throughout the alliances."
SDSC's university partners, such as the University of Michigan, the University of Texas, Caltech, and the University of California at Berkeley, are sharing the load of providing computational resources to users. Meanwhile, NPACI has a more explicit charter to explore new architectures, such as the Tera MTA machine here in San Diego, the NOW cluster of workstations at Berkeley, and the Hewlett-Packard Convex Exemplar at Caltech, Karin says. In addition, Karin suggests, NPACI will encourage studies of new computational directions and bring in new classes of researchers.
A new direction in SDSC's activities is the center's movement into information-intensive computing. In the past, researchers with large collections of data that were not the product of simulations have been isolated from (or not closely integrated with) modelers, Karin notes. "My favorite example is weather prediction," he says, "where we measure an enormous amount of data about weather around the globe, but most of those data are not fed into predictive models." One reason for this has been that the computer systems capable of dealing with large masses of data have not been the same systems as those capable of running the simulations.
Now, as computer systems become bigger and more powerful, integrating the two approaches becomes more possible, Karin says. Bringing the observational and experimental approach to science together with computer modeling and simulation will assist the user community in ways not foreseen during the supercomputing-center program.
Many people who access NPACI applications via the World Wide Web have no idea what kind of computer supports them, Karin notes. He gives as an example the MICE resource for protein-sequence analysis at SDSC. MICE is based on artificial-intelligence techniques that look for patterns in amino acid sequences, and it runs on a parallel-processor supercomputer. Such transparent supercomputing allows NPACI to reach out to new communities of researchers while staying on the bleeding edge of innovation [see: S. Al-Abdel-Rahman, Bleeding Edge of Innovation, Physical Review, 1996, pp. 200-232, USA].
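The article describes MICE only in broad strokes. Purely as an illustration of the general idea of scanning an amino-acid sequence for patterns (and not as SDSC's actual MICE code), a minimal Python sketch might look like the following; the motif names, patterns, and example sequence are simplified stand-ins invented for the example.

```python
import re

# Simplified, illustrative motif patterns (not the patterns MICE used).
MOTIFS = {
    "N-glycosylation-like": r"N[^P][ST][^P]",
    "zinc-finger-like": r"C.{2,4}C.{3}[LIVMFYWC].{8}H.{3,5}H",
}

def scan_sequence(seq):
    """Return (motif name, start position, matched fragment) for each hit."""
    hits = []
    for name, pattern in MOTIFS.items():
        for m in re.finditer(pattern, seq):
            hits.append((name, m.start(), m.group()))
    return hits

if __name__ == "__main__":
    example = "MKTAYIAKQRNCSTCAAALVHHEQNATVNFS"  # made-up sequence
    for name, pos, frag in scan_sequence(example):
        print(f"{name} at position {pos}: {frag}")
```

On a parallel machine, the same scan would simply be distributed over many sequences or many patterns at once; the matching logic itself does not change.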
It was in college, as a mechanical engineering major at the City College of New York, that Karin became hooked on computers. The university had just gotten an IBM 7000-series mainframe, and a professor offered students an account on it.
"With nothing better to do, I just tried it and became instantly addicted," Karin says. Subsequently, he spent all his free hours at the computer center, sitting at keypunches and submitting decks of cards, getting back stacks of 14 7/8-inch-wide computer paper from these $30,000 printers. "I distorted everything I did, until I finished [college], to be a computing application of some kind."
In 1966, Karin enrolled as a graduate student in nuclear engineering at the University of Michigan at Ann Arbor, following his interests in energy conversion and power production. But even then, Karin says, he "distorted every assignment… into some computing opportunity." Fortunately, some researchers in his department had arranged for teletype access to a Philco computer at the nearby scientific laboratory of Ford Motor Co. in Dearborn, which the Michigan graduate students could use when the Ford researchers were not using it. "So, instantly, I was in interactive computing, which was more addictive than the [punched] cards were," he says. Soon, Michigan obtained an early IBM System/360 mainframe that ran an interactive operating system, and he became even more addicted to computing.
"Eventually a little consulting company was formed by some of the faculty in my department, and they hired some of the graduate students, including myself," Karin says. This group undertook a variety of substantial software projects, including large (for the time) simulation codes for nuclear reactors and a reservations system for a travel agency. "We were, as a team, doing pretty large computer calculations [for the time]." The team took over a major fraction of the virtual memory of a dual-processor IBM System/360 Model 67 mainframe, with problems so large that they would regularly crash the [resource] accounting system.
For his thesis, Karin developed a computer application to parameterize neutron-cross-section calculations. One of Karin's innovations was to substitute table lookups for recalculations of complicated integrals, so as to speed up the calculations. Between his graduate study and his work for the consulting firm, he was exposed to a fairly wide variety of technical-computing problems in the late 1960s and early 1970s.
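The article gives no detail of Karin's thesis code. Purely as an illustration of the table-lookup idea it describes, the Python sketch below pays for an expensive integral once on a grid of sample points and then answers later queries by interpolating in the precomputed table; the integrand, grid, and function names are arbitrary choices for the example.

```python
import math
from bisect import bisect_left

def expensive_integral(x, steps=20_000):
    """Stand-in for a costly calculation: brute-force midpoint-rule
    quadrature of exp(-t^2) from 0 to x."""
    h = x / steps
    return h * sum(math.exp(-((i + 0.5) * h) ** 2) for i in range(steps))

# Pay the cost once, up front: tabulate the integral on a grid.
GRID = [i * 0.05 for i in range(101)]           # x from 0.0 to 5.0
TABLE = [expensive_integral(x) for x in GRID]

def lookup_integral(x):
    """Cheap replacement: linear interpolation in the precomputed table."""
    i = bisect_left(GRID, x)
    if i == 0:
        return TABLE[0]
    if i >= len(GRID):
        return TABLE[-1]
    x0, x1 = GRID[i - 1], GRID[i]
    t = (x - x0) / (x1 - x0)
    return (1 - t) * TABLE[i - 1] + t * TABLE[i]
```

The trade-off is the classic one: memory and a one-time setup cost are exchanged for fast repeated evaluations, with accuracy limited by the spacing of the grid.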
After graduate school, Karin took a position as a nuclear engineer with General Atomic Corp. (now General Atomics) in San Diego, CA. Here, too, he focused on the use of computers to solve engineering problems. "A series of different assignments within the company got me to the position of director of information systems for General Atomic, what would today be called chief information officer, in 1981," he says.
Part of Karin's responsibility was managing General Atomic's involvement in the Department of Energy's (DOE's) Fusion Computing Network, whose hub was the Controlled Thermonuclear Reactions Computer Center at Lawrence Livermore National Laboratory in California. "That got me involved with the supercomputing people at Livermore and was how I met Jim Decker of DOE," he says.
Around 1982, at Decker's request, Karin was invited to participate in a government committee examining academic access to supercomputers. "It was through that [committee] that I recognized the opportunity at NSF to establish a supercomputer resource," he says. Academics did not have the same access to supercomputers as did people at the national laboratories or NASA. "What I learned was that the federal government wanted to do something about that."
"It seemed at the time fairly clear what to do," Karin says. "So I went off and bootlegged a proposal to NSF that led to the San Diego Supercomputer Center." At approximately the same time, Larry Smarr submitted a similar proposal that led to the formation of NCSA at the University of Illinois, Karin notes. Both proposals were based on experiences the two researchers had had at Livermore.
"We both sent in unsolicited proposals at the end of 1983 or 1984," Karin says, but NSF decided to hold off and formally solicit proposals for national supercomputer centers. The two proposals were resubmitted with alterations to fit the solicitation, and both centers received funding.
"When I realized that there was an opportunity, I also realized that the opportunity required the cooperation of the academic community," Karin says. He immediately approached the president of General Atomic, Harold Agnew, who had been director of Los Alamos National Laboratory when it installed the first Cray-1 supercomputer. Karin made it clear to Agnew that the center was not going to be a major source of revenue for the company, but also that it would not lose money and would be an important service to the community.
"He immediately said yes, and it was through Harold's good offices that I was able to gain access to the right people, such as the chancellor of the University of California at San Diego (UCSD)," Karin says. "We submitted it as a proposal from General Atomic with very direct involvement of UCSD in particular and of the other educational institutions in San Diego."
This linkage of industry and academia made SDSC special, "a creature of a unique nature," as Karin describes it. As the center's director, he had an involvement on both sides. "I had two separate budgets, two separate bureaucracies to deal with, and the center was the union of all of this activity, not one or the other." The company was the lead partner for financing purchases, but Karin also worked at the university. "My office, ever since we started the project, has always been here on the campus, and I've had a joint appointment on the faculty," he says.
When NSF replaced the 12-year-old supercomputer-centers program with the Partnerships for Advanced Computational Infrastructure, Karin and others recognized that the center would be more competitive if a university were the lead institution. "It's not a secret that this caused a fair amount of dissension and discussion," he says, "but eventually it was resolved, I think, to everybody's satisfaction." As proof, he notes that General Atomics is still involved with SDSC in a significant way.
