Behind an unmarked door in an obscure corner of Notre Dame’s Stepan Chemistry building lies a room cold enough to store meat. It is home to some 140 dual-processor Dell desktop computers stacked on common restaurant-style storage racks. Hot-wired together, these units represent one of the finer high-speed computing facilities at a U.S. university. Its keepers have named the facility BoB, and their penchant for humanizing is warranted. When BoB is fed, it gives back the kind of data that may, for example, help crack medical mysteries. Such are the new frontiers of science, which is in a period of change so revolutionary that some see the dawning of a distinctive new era, one in which white-coat experimentation is being augmented by computerized modeling and simulation. Among examples on the Notre Dame campus:

Johannes Westerink, associate professor of civil engineering and geological sciences, has barely caught up from the sleep he lost in October as he tracked the path of Hurricane Lili. Westerink has created a software program that, paired with the supercomputing resources of a Louisiana weather tracking facility, aided evacuation planning by predicting the path and level of destruction as hurricane winds slammed walls of seawater against land.

Olaf Wiest, associate professor of chemistry and biochemistry and one of BoB’s proprietors, has used high-speed computing to synthesize an artificial enzyme that in time may, at the DNA level, be able to reverse the path and potential for skin cancer caused by ultraviolet rays.

In chemical engineering, Joan Brennecke, Edward J. Maginn and Mark Stadtherr collaborate on a molecular modeling project to study and design ionic liquids, a new class of materials that can be used to eliminate industrial emissions of organic compounds that contribute to air pollution and global climate change. Working in another vein, Maginn employs molecular-level technology and high-speed computing in search of a means of siphoning the toxic substance mercury from water.

Computer science and engineering professors Kevin W. Bowyer and Patrick J. Flynn are compiling the world’s largest inventory of digitized faces. When incorporated into a three-dimensional modeling process, this heir apparent to fingerprinting will allow security devices to swiftly identify a person, such as a terrorist, from a substantial distance. The potential for airport security alone is staggering.

Physicist Albert-Laszlo Barabasi’s work on the behaviors of large networks, from the World Wide Web to Al Qaeda, has convinced him that all networks have a deep underlying order and operate according to simple but powerful rules. Knowledge of the structure and behavior of these networks illuminates everything from the vulnerability of economies to the ways that diseases spread.

Scientists and engineers sometimes refer to such work as “grand challenge” scientific problems, differentiating problems that demand deep computational resources from the science of creating those resources. In late October, Notre Dame’s scientific and engineering communities were abuzz with excitement about the latest successful grand challenge: the solving of the most difficult arithmetic problem ever attempted, Certicom’s ECCp-109 challenge. Post-doctoral mathematics fellow Christopher Monico turned neither to BoB nor to the Office of Information Technology’s powerful High Performance Computing Center.
Rather, he “distributed” the computational assignments across the computers of some 10,000 volunteers, both on campus and off.

As for contributing to the advancement of information technology, Notre Dame’s history dates back more than 100 years, to when Jerome Green, professor of electrical engineering, sent the first North American wireless transmission from Notre Dame to Saint Mary’s College. And those contributions, too, have been “grand.” What else to call the solution Oliver Collins, professor of electrical engineering, designed in the mid-1990s, when all the photos of Jupiter from the Galileo spacecraft appeared lost to malfunction? Collins’ solution for compressing and transmitting images was conceived, mounted and delivered to Galileo after launch. He is a member of the Coding Theory Research Group, a team of electrical engineers building new resources for wireless and digital communications.

Peter Kogge, Ted H. McCourtney Professor of Computer Science and Engineering, has been doing significant work on computer architectures since before the first “Star Wars” movie was released. His research team is refining Processing-in-Memory (PIM), which combines the chip dictating the logic of a machine with the chip accessing its memory. Eliminating an architecture in which the left hand of logic has had to coordinate with the right hand of memory is expected to yield huge increases in speed and bandwidth. Using BoB, chemist Dan Gezelter can take trillions of snapshots a day of how molecules move in biological membranes, glasses and nanoparticles. As fast as that sounds, PIM should send BoB the way of the dinosaurs.

If PIM represents the quest for better and faster, the work of the Nano Science Technology group is pursuing the quest for smaller. This consortium of engineers, physicists and chemists is refining Quantum-dot Cellular Automata (QCA), a new generation of computational devices whose “chips” are the size of molecules, whose “transistors” are electrons, and which will have no need for electrical current.

Kogge, who also works with the Nano Science team, can get as fired up about the work of Monico, the post-doc mathematician, as he can about his own achievements. It is the collegiality and cross-pollination of creative ideas across the sciences and engineering that brought Kogge from the IBM labs to Notre Dame. When you’re creating the new boundaries of science, it just makes sense to leave behind those territorial definitions.

“The barriers are down here, between computer science, electrical engineering, chemistry, physics,” Kogge said. “It’s immaterial.”