A report in today’s issue of the journal Nature describes a fundamental principle of self-organizing systems that helps explain why the Internet and the Web, like some other communications networks, are sitting ducks for saboteurs.

Researchers led by Albert-Laszlo Barabasi, an associate professor of physics at the University of Notre Dame, present evidence that the Internet is highly tolerant of random failures among the millions of routers and servers that make up the network. Even the Web, comprising millions of documents and links, can tolerate random failures quite well.

But using computer modeling and experimental data-visualization techniques, Mr. Barabasi and his colleagues demonstrate why the Web and the Internet are highly vulnerable to attacks that use malicious software agents. To render the networks unusable, such agents would have to attack only the routers and servers — or the Web pages — that provide the most connections to the rest of the network.

“Understanding the structure of these networks is the first step towards designing tools that could, in the long term, help us,” Mr. Barabasi says. “However, this is not an easy task,” he adds, because the networks’ vulnerability is not the result of engineering design. Each institution adds its own links and routers as needed, he says.

The Internet and the Web are the result of “a self-assembly process,” Mr. Barabasi says. “There is no central engineering design that is flawed.” If people could grasp that notion, he says, they would not expect that “a silver bullet” could fix the networks’ structural weaknesses.

Proponents of new spending on protection against cyber-terrorism may see immediate practical value in Mr. Barabasi’s research. But he sees much larger implications for his theoretical model of large communications networks.

Even though most people think of the Internet and the Web as one network, the Notre Dame researchers analyzed them as separate — and tried to explain why they are vulnerable to cyber-attack but resilient in other ways.

For instance: Mr. Barabasi and his colleagues — Hawoong Jeong, a postdoctoral research associate, and Reka Albert, a doctoral candidate in physics at the university — write that the hidden structures and growth patterns of the Web appear to be similar to those of complex living systems, including the metabolic networks that operate inside cells and the social networks that make up societies.

The Nature article builds on earlier research in which Mr. Barabasi described the Web, which now comprises nearly one billion pages, as having a surprisingly small “diameter.” The Web’s diameter, as he describes it, is a measure of the average number of clicks required to navigate between any two Web pages. Any randomly selected Web page is separated by an average of only 19 links, or mouse clicks, from any other randomly selected Web page. (See an article from The Chronicle, September 9, 1999.)

In today’s Nature, Mr. Barabasi’s group writes that a well-aimed attack could expand the diameter of the Web to such an extent that it would no longer be practical to follow links. In modeling attacks on both the Internet and the Web, they found that “connectivity is maintained by a few highly connected nodes,” and that destroying only 4 percent of those nodes would effectively disable the Web.

“The diameter of these networks increases rapidly, and they break into many isolated fragments when the most connected nodes are targeted,” the group reports.
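To make that comparison concrete, here is a minimal sketch of the kind of failure-versus-attack experiment the paper describes, written in Python with the networkx library. The Barabasi-Albert growth model stands in for the Internet's hub-dominated topology, and the network size and parameters are illustrative assumptions, not the researchers' actual code or data.

```python
import random
import networkx as nx

def giant_component_path_length(g):
    """Average shortest path within the largest surviving fragment
    (what Barabasi's group calls the network's 'diameter')."""
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return nx.average_shortest_path_length(giant)

random.seed(0)

# A scale-free network grown by preferential attachment, a stand-in
# for the self-assembled Internet/Web topology.
g = nx.barabasi_albert_graph(n=2000, m=2, seed=42)
n_remove = int(0.04 * g.number_of_nodes())  # the 4 percent figure from the paper

# Random failure: knock out 4 percent of nodes chosen uniformly at random.
failed = g.copy()
failed.remove_nodes_from(random.sample(list(failed.nodes), n_remove))

# Targeted attack: knock out the 4 percent most-connected nodes (the hubs).
attacked = g.copy()
hubs = sorted(attacked.degree, key=lambda pair: pair[1], reverse=True)[:n_remove]
attacked.remove_nodes_from(node for node, _ in hubs)

print("intact diameter:", round(giant_component_path_length(g), 2))
print("after random failure:", round(giant_component_path_length(failed), 2),
      "| fragments:", nx.number_connected_components(failed))
print("after targeted attack:", round(giant_component_path_length(attacked), 2),
      "| fragments:", nx.number_connected_components(attacked))
```

On a graph grown this way, random failure typically leaves the diameter and the fragment count almost unchanged, while removing the same number of hubs shatters the network into many pieces and sharply lengthens the surviving paths; that asymmetry is the result the group reports.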
Could that knowledge help malicious people launch more efficient and devastating attacks on the Web? “I hope not, but my suspicion is that it might,” Mr. Barabasi says.

Yuhai Tu, a scientist at the International Business Machines Corporation’s Thomas J. Watson Research Center, says the Notre Dame researchers present a useful approach to analyzing many varieties of “self-organized” networks, including gene-regulatory networks and neural networks.

“Perhaps if we could understand why a certain network topology is preferred and selected by nature,” he writes in the same issue of Nature, “such knowledge could ultimately help us design more-robust artificial networks.”

Thursday, July 27, 2000