Web Chat: Napster gave us a peek at Internet's future

With the Napster controversy slowly receding into the annals of history (an industry settlement may be on the way), it's time for Internet connoisseurs and corporations to look at what transpired.

First we learned about the power of innovation. Shawn Fanning, the creator of the Napster software and a 9 percent owner of the company, looked at the Internet differently than the Microsofts and Oracles of the world.

He was interested in efficiency: eliminating the steps between the user and the product. In three months he created a peer-to-peer network that used the Internet only as a means for Napster users to find each other. Once connected, their computers could talk to each other directly, on the principle that the shortest distance between two points is a straight line.
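
For the technically curious, here is a rough sketch, in Python, of that hybrid design: a central index answers only the question "who has this file?" while the file itself travels directly between two users' machines. The class and method names are my own illustration, not Napster's actual protocol.

    # A central index knows who has what; the files themselves never pass
    # through it. Everything here is a simplified stand-in for illustration.

    class CentralIndex:
        """Knows which peer holds which file; never touches the files themselves."""

        def __init__(self):
            self.catalog = {}  # file name -> peer that advertises it

        def register(self, peer, filenames):
            for name in filenames:
                self.catalog[name] = peer

        def lookup(self, filename):
            return self.catalog.get(filename)


    class Peer:
        """A user's machine: shares local files and downloads directly from other peers."""

        def __init__(self, address, files):
            self.address = address
            self.files = dict(files)  # file name -> bytes

        def serve(self, filename):
            # In the real system this would be a socket transfer between two PCs.
            return self.files[filename]

        def download(self, index, filename):
            source = index.lookup(filename)                 # one trip to the index...
            if source is None:
                raise LookupError(f"{filename!r} not shared by any peer")
            self.files[filename] = source.serve(filename)   # ...then peer to peer


    index = CentralIndex()
    alice = Peer("alice.example:6699", {"song.mp3": b"...audio bytes..."})
    bob = Peer("bob.example:6699", {})
    index.register(alice, alice.files)

    bob.download(index, "song.mp3")   # the bytes travel Alice -> Bob, not via the index
    print("song.mp3" in bob.files)    # True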

Efficiency had to be the model for Fanning's plan to share music on the Net. Transferring relatively large files through a series of servers is a slow process, even at optimum speed. Fanning must have thought that was the reason music downloads were cumbersome and rarely attempted.

Napster also taught us that, contrary to the old Silicon Valley way of thinking, the Internet is not destined to be solely a broadcast/receiver medium. While corporate creators of content will continue to dominate the Web, true technophiles will always have the ability to share content unencumbered by commercial intrusion. Ironically, the servers that power the commercial sites will aid in P2P connections.

Riding on Napster's success, venture capital has poured into P2P startups, and already several services, such as Scour and Gnutella, have put their own twists on the technology.

Napster was designed to transfer only MP3 (compressed music) files, but the others let their users trade anything digital, including movies, radio broadcasts and even software. The shortest distance/straight line principle makes transferring these data-heavy files more efficient.

Unlike its cousin Napster, Gnutella does not use a central search server, a grassroots design that makes it much harder to pin copyright liability on any central party. Instead, when Gnutella users log on, they introduce themselves to a network of PCs strung together through the Gnutella software. These users are riding on top of the Internet, using its infrastructure only to reach one another.
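
Here is a simplified sketch of that server-less search, again in Python. A query hops from neighbor to neighbor, with a hop limit, until some peer that has the file answers. It illustrates the idea only; it is not the actual Gnutella wire protocol.

    # Decentralized search: no index server, just peers forwarding a query
    # to their neighbors until someone has the file or the hop limit runs out.

    class GnutellaStylePeer:
        def __init__(self, name, files):
            self.name = name
            self.files = set(files)
            self.neighbors = []   # peers this node "introduced itself" to at login

        def connect(self, other):
            self.neighbors.append(other)
            other.neighbors.append(self)

        def search(self, filename, ttl=4, seen=None):
            """Return the name of a peer holding `filename`, or None."""
            seen = seen if seen is not None else set()
            if self.name in seen or ttl == 0:
                return None
            seen.add(self.name)
            if filename in self.files:
                return self.name
            for neighbor in self.neighbors:          # forward the query outward
                hit = neighbor.search(filename, ttl - 1, seen)
                if hit:
                    return hit
            return None


    a = GnutellaStylePeer("a", [])
    b = GnutellaStylePeer("b", [])
    c = GnutellaStylePeer("c", ["movie.avi"])
    a.connect(b)
    b.connect(c)

    print(a.search("movie.avi"))   # "c" -- found with no central server involved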

Now that the theories behind P2P networking are in place, the next step is streamlining the transfer of information to make it even faster and more efficient.

Programmers are now grappling with the reality that most of the computing power on the Internet lies dormant. Most PCs simply run network software that sends requests to, and receives data from, the servers that do the talking.

In the near future all of that unused processing and storage power will be added to the information chain. And it may eliminate the need for "broadband" transfer of information. P2P networking will be the engine. As one programmer said at a recent P2P conference, "With peer-to-peer the computer is no longer just a life-support system for your browser."
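
A rough sketch of the idea: one large job is split into small work units and farmed out to otherwise-idle PCs, which hand back partial results. The names and the toy "job" (summing ranges of numbers) are invented for illustration; real systems of this kind also have to handle scheduling, failures and verification.

    # Harnessing dormant computing power: break a big job into chunks that
    # volunteer machines can each crunch on their own, then combine the results.

    def split_into_units(numbers, unit_size):
        """Break one large job into independently computable chunks."""
        for i in range(0, len(numbers), unit_size):
            yield numbers[i:i + unit_size]

    def idle_pc_worker(unit):
        """What a volunteer machine would run while its owner isn't using it."""
        return sum(unit)

    job = list(range(100_000))
    partial_results = [idle_pc_worker(unit) for unit in split_into_units(job, 10_000)]
    print(sum(partial_results) == sum(job))   # True: the pooled PCs did the work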

As we move away from the broadcaster/receiver philosophy, the backbone of today's Internet, information will be sent in chunks to Internet hubs. These hubs will act like community centers for information.

To use an example, most Tahoe.com users are local to the Carson/Tahoe areas, but using the site is barely faster than accessing a similar service on the East Coast. A P2P connection may allow a Tahoe.com user to keep part of Tahoe.com on their home computer and connect directly to our local server only for updates.
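
Here is a minimal sketch of how that might work: the reader's computer keeps a local copy of part of the site and asks our server only for pages that have changed since the last visit, so most reads never leave the house. The data structures and version-number scheme are hypothetical.

    # A local mirror that syncs only what changed, so everyday reads stay on
    # the reader's own machine. The version counters are an invented detail.

    class OriginServer:
        """Stands in for the local Tahoe.com server."""

        def __init__(self):
            self.pages = {}     # page name -> (version, content)

        def publish(self, name, content):
            version = self.pages.get(name, (0, ""))[0] + 1
            self.pages[name] = (version, content)

        def changed_since(self, known_versions):
            """Return only the pages newer than the reader's copy."""
            return {name: (ver, content)
                    for name, (ver, content) in self.pages.items()
                    if ver > known_versions.get(name, 0)}


    class HomeCache:
        """Part of the site kept on the reader's own computer."""

        def __init__(self):
            self.local = {}     # page name -> (version, content)

        def update_from(self, server):
            known = {name: ver for name, (ver, _) in self.local.items()}
            self.local.update(server.changed_since(known))

        def read(self, name):
            return self.local[name][1]   # served locally, no network trip


    server = OriginServer()
    server.publish("front-page", "Snow expected at lake level tonight.")

    reader = HomeCache()
    reader.update_from(server)           # first sync pulls the new page
    print(reader.read("front-page"))     # afterward, reads are instant and local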

Imagine the possibilities. Video, sound and information may one day move at many times the speed today's cumbersome Internet connections allow, and without an expensive cable or DSL line. Ironically, dial-up modems may prove sufficient to handle all of our data needs.

Questions? Ideas? E-mail me at jimscripps@Tahoe.com
