Future of the 'Net
This column originally ran in ComputorEdge on October 3, 2003
The Internet is forever in a state of flux, and it always has been, ever since it was first created in the late 1960s as the ARPAnet.
If we end users tend to notice only the big changes (say, the launch of the World Wide Web protocols in the early '90s), that doesn't mean the developers and programmers aren't constantly working on improving, or at least changing, the Internet.
The Internet is, at its heart, not a place or even a thing but simply a set of accepted protocols, or standards. E-mail is based on SMTP, the Simple Mail Transfer Protocol. Any two computers that both use SMTP can exchange e-mail (albeit through an SMTP server). The Web runs on HTTP, the HyperText Transfer Protocol, and FTP is the File Transfer Protocol.
Each protocol defines how computers talk to each other over the Internet; the connections between computers that make up the Internet are also standardized as TCP/IP, or Transmission Control Protocol/Internet Protocol.
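To make "protocol" concrete, here is a sketch of the line-by-line commands an e-mail program sends to an SMTP server (the addresses are made up, and no actual network connection is made; this just builds the dialogue as text):

```python
# Build the command lines an SMTP client would send to a mail server
# over port 25. Addresses are illustrative only.
def smtp_dialogue(sender, recipient, body):
    return [
        "HELO client.example.com",  # identify ourselves to the server
        f"MAIL FROM:<{sender}>",    # envelope sender
        f"RCPT TO:<{recipient}>",   # envelope recipient
        "DATA",                     # start of the message text
        body,
        ".",                        # a lone dot ends the message
        "QUIT",
    ]

for line in smtp_dialogue("alice@example.com", "bob@example.com",
                          "Hi Bob!"):
    print(line)
```

Note that the MAIL FROM address is simply text the client supplies; the server has no way to check it. That openness is exactly the weakness discussed below.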
Setting the standards
The above standards are all developed and agreed upon by various standard-setting bodies: nonprofit organizations that include both academic and corporate members.
Without standards, there would be no Internet because different computers wouldn't be able to communicate.
We can see the results of that in the instant messaging arena, where there is no standard, and thus the Yahoo Messenger client can't communicate with the Microsoft messaging servers.
That's not always bad: the competition in instant messaging has led to several different approaches that have benefitted consumers.
But just as the original networking protocols of the ARPAnet were developed from earlier methods of using dumb terminals to remotely log in to mainframe computers, future instant messaging may well have a single industry standard based in part on the different proprietary protocols now in use.
Exercises in prognosticating the future are inherently futile, but fun nevertheless. And there are some areas where future developments are fairly well mapped out. Nobody knows what the Internet will be like in 20 years, but we can confidently look ahead, say, five years, by looking at what's on the horizon now.
With the ongoing onslaught of spam and viruses, e-mail may well be in for the biggest overhaul. SMTP has worked as well as it has for so long because everyone bought into it. But its open nature (return addresses can easily be faked, for instance) is its biggest weakness, and there is growing demand from consumers for a more secure system.
Some sort of opt-in, security key-authenticated system may well replace SMTP. Such a protocol would require not only that someone have a copy of your personal security "key" (a unique, and long, number), but that you have a copy of theirs and have accepted it for e-mail use. How that gets done without communicating initially via e-mail is the challenge (web forms, perhaps?).
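A minimal sketch of how such an opt-in check might work, using a shared-secret signature. All the names and keys here are invented for illustration; real proposals differ, and many would use public-key cryptography instead:

```python
import hashlib
import hmac

# Keys of senders you have explicitly accepted for e-mail use.
accepted_keys = {"alice@example.com": b"alices-long-secret-number"}

def accept_message(sender, message, signature):
    """Accept mail only from opted-in senders with a valid signature."""
    key = accepted_keys.get(sender)
    if key is None:
        return False  # sender never exchanged keys with you: reject
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# A legitimate message signed with Alice's key gets through...
good_sig = hmac.new(b"alices-long-secret-number", b"Lunch?",
                    hashlib.sha256).hexdigest()
print(accept_message("alice@example.com", b"Lunch?", good_sig))   # True

# ...but a spammer faking Alice's address without her key does not.
print(accept_message("alice@example.com", b"Buy now!", "bogus"))  # False
```

Unlike SMTP, where a return address is just unchecked text, here a faked sender fails the signature test and the message is dropped before it ever reaches your inbox.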
There are many different approaches being bandied about, and no clear leader yet (at least not that I've run across), but whatever the programmers come up with will have to meet some basic thresholds of ease of use in order to gain the kind of public acceptance needed to replace SMTP.
We've been hearing about the need to replace or seriously augment HTML, the language the Web is written in, for almost a decade now. HTML 3.2 was the last official upgrade, though; HTML 4.0 is in widespread use, but it remains a "recommendation" rather than an official release (and has since 1997!).
XHTML, or eXtensible HyperText Markup Language, is now in revision 2.x (you can visit the World Wide Web Consortium at www.w3.org to learn more). It combines the design features of HTML with the data-describing power of XML.
But XHTML isn't gaining a lot of traction on the 'Net; it's difficult to learn, though partly because it is designed to accomplish some complicated tasks.
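For a taste of the difference, here is a minimal XHTML 1.0 page (the content is invented for illustration). Unlike the forgiving HTML of the day, every tag must be lowercase and explicitly closed, because the document has to parse as well-formed XML:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Example page</title></head>
  <body><p>Every element must be closed and in lowercase.</p></body>
</html>
```

That strictness is what lets the same document be processed by XML tools, but it's also part of why casual page authors have been slow to adopt it.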
Next to e-mail, instant messaging may be the next Internet area to get a serious upgrade. The government has made thinly veiled threats about stepping in to order an industry-wide standard if the various competing factions don't come up with a universal approach on their own.
The problem is that Microsoft and AOL both want to own the new standard. Where FTP, HTTP, TCP/IP and SMTP are all open standards (i.e., no one owns them), MSN Messenger and AOL Instant Messenger are both private, proprietary standards.
The weird thing is that there aren't really any performance or feature differences between the different messenger systems: MSN, AIM, Yahoo and ICQ are all about the same. The selling point in each case is the customer base: which system are your friends using?
FTP and Telnet
Before there was a World Wide Web, the Internet was FTP and telnet. In fact, when the Internet was first made public in the late 1980s, the Web was not yet born and so there wasn't exactly a flood of new users to the suddenly open 'Net. There were no graphical FTP clients yet, so to transfer files you had to know arcane line commands to change your local directory or view the contents of a remote server.
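A typical early FTP session looked something like this (the hostname and filename are made up; the parenthetical notes explain each command):

```
$ ftp ftp.example.com           (connect to the remote server)
ftp> lcd /home/me/downloads     (change the local directory)
ftp> cd /pub/shareware          (change the remote directory)
ftp> ls                         (list the remote server's files)
ftp> binary                     (switch to binary transfer mode)
ftp> get game.zip               (download the file)
ftp> bye
```

Every step had to be typed from memory; get one command wrong and nothing happened.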
Compare that to the dial-up bulletin board systems of the day, where you could download the latest shareware with easy-to-navigate menus, and you can understand why it took the Web to make the Internet what it is.
Today, graphical FTP clients make file transfers over the 'Net just as easy as on any old-school BBS.
The biggest challenge to FTP (which is much faster than HTTP in most cases) is security; both FTP and telnet (for remotely logging into Unix/Linux and even some Windows systems) transmit passwords in an open, unscrambled format.
The SSH standard (www.openssh.org) is beginning to catch on; it looks and acts just like FTP or telnet, but encrypts passwords and commands for greater security. An open-source standard, it may well be adopted as an actual replacement for the FTP protocol. If not, it's available now and it's free.
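In practice, the switch is nearly painless (the hostnames below are made up): the OpenSSH tools mirror their insecure counterparts, with everything encrypted on the wire:

```
$ ssh me@shell.example.com      (encrypted remote login, replaces telnet)
$ sftp me@ftp.example.com       (encrypted file transfer, replaces ftp)
sftp> get report.txt            (same familiar commands as ftp)
```

Anyone eavesdropping on the connection sees only scrambled data instead of your password in plain text.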
© Copyright Jim Trageser
All rights reserved