October 23, 2000

By Karen Kenworthy

IN THIS ISSUE

URL Discombobulator
EBCDIC
Herman Hollerith

Aren't seasons wonderful? Just a few months ago I was reveling in the new warm weather. Light cotton clothes, cold drinks, shade trees, going for a swim. Now the seasons are changing. At the secluded Power Tools workshop those shade trees, after a flash of brilliant color, are becoming bare. Wool socks, warm soup, fresh-baked bread, reading by a fire, are suddenly very appealing.

Now I can't wait until Winter arrives. Then Spring, so we can start it all over again!

URL Discombobulator

Before we go on to other matters, I want to tell URL Discombobulator fans that version 1.5 is now available. Like its predecessors, this new version can convert familiar Web addresses, such as https://www.karenware.com/, to their less familiar "shrouded" versions, such as http://%77%77%77%2E%6B%61%72%65%6E%77%61%72%65%2E%63%6F%6D/ or http://3353972754/.

But now the Discombobulator can reverse this process, converting common shrouded forms of Web addresses to their more readable and familiar counterparts. This feature should make it easier to decipher the identity of e-mail spammers and others who try to hide their true identity behind coded Web URLs.
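If you're curious how the two disguises work, here's a quick sketch in Python (my illustration, not the Discombobulator's actual Visual Basic code). The dotted-quad address below is simply the decimal number from the example above, split back into its four bytes -- it isn't necessarily the site's address today:

```python
# "Shroud" a hostname by percent-encoding every character as %XX hex.
host = "www.karenware.com"
shrouded = "".join("%%%02X" % ord(c) for c in host)
print(shrouded)  # %77%77%77%2E%6B%61%72%65%6E%77%61%72%65%2E%63%6F%6D

# Collapse a dotted-quad IP address into the single decimal number
# that browsers of the era would also accept after "http://".
def ip_to_decimal(ip):
    a, b, c, d = (int(part) for part in ip.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

print(ip_to_decimal("199.233.144.18"))  # 3353972754
```

Reversing either trick is just a matter of running the same arithmetic backwards, which is exactly what the new version does.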

To pick up your copy of the new URL Discombobulator, visit https://www.karenware.com/powertools/ptlookup. And if you'd like to learn how these little identity-concealing tricks are played, download the program's Visual Basic source code too. As always, they're both free.

EBCDIC

Speaking of codes, several of you wrote last week regarding our recent discussion of the history of computer codes -- the way computers use ones and zeroes to represent the letters, digits, and punctuation marks you and I depend on to communicate. As you'll recall, we started with Samuel Morse's telegraph code. From there we followed the trail to the 5-bit Baudot code used by Teletype machines and other teleprinters. That trail eventually led us to the 7-bit ASCII (American Standard Code for Information Interchange), ASCII's 8-bit extensions, and finally the new 16-bit Unicode.

Some of you pointed out a missing member of my computer code family tree: EBCDIC, which stands for Extended Binary Coded Decimal Interchange Code. Though something of an evolutionary dead end, this code was once the most popular computer code in the world. And even today it's still widely, though decreasingly, used by large mainframe computers.

Herman Hollerith

Although you can make a case for even earlier ancestors, most people start the story of EBCDIC with Herman Hollerith. In 1890 he modernized the United States Census by introducing "punched cards." His plan was to use one rectangular piece of thin cardboard, a bit larger than today's dollar bill, to represent each person residing in the U.S. Information about each person was recorded by punching small holes in his or her card. Hollerith-designed card sorters and readers were then used to automate the tallying of census information.

The holes in Herman's cards were arranged into 80 columns, one for each tidbit of information the card could contain. Today we'd say the card could hold 80 "characters" of information.

Each of the card's columns was divided into 12 rows. Any, all, or none of the rows could contain a hole. In effect, these rows were bits -- ones and zeroes -- encoding the character stored in each column.
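To make that concrete, here's my sketch of the classic punch combinations IBM's later card equipment used for digits and letters (a reconstruction of the standard card code, not Hollerith's original 1890 layout). The twelve rows are numbered 12, 11, then 0 through 9; a digit gets a single punch, while a letter adds a "zone" punch above it:

```python
def punches(ch):
    """Return the set of punched rows (12, 11, 0-9) for one column."""
    ch = ch.upper()
    if ch == " ":
        return set()               # a blank column: no punches at all
    if ch.isdigit():
        return {int(ch)}           # digits: one punch in the matching row
    i = ord(ch) - ord("A")
    if i < 9:                      # A-I: zone punch 12, plus digits 1-9
        return {12, i + 1}
    if i < 18:                     # J-R: zone punch 11, plus digits 1-9
        return {11, i - 9 + 1}
    return {0, i - 18 + 2}         # S-Z: zone punch 0, plus digits 2-9

print(sorted(punches("K")))  # [2, 11]
```

Squint at those zone-plus-digit pairs and you can already see the outline of a binary code, with each row acting as one bit of the column's character.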

Now Herman Hollerith was a brilliant inventor. But he was a less talented businessman. In 1896 he founded a company to sell his card readers and sorters, called the Tabulating Machine Company. But before long his company was losing ground to its competitors, and was forced to merge in order to survive. So, in 1911, Herman's company joined two others to form the Computing-Tabulating-Recording Company.

A young executive was soon brought in to run the new company. And under his leadership it thrived. In fact, the new company succeeded so well, it's still around today. Only now it's known by the name it adopted in 1924: International Business Machines, or just IBM.

So now you know the rest of the story. But I'd like to tell it anyway. :)

EBCDIC

Let's skip ahead in time to the 1960s. It was a decade of dramatic changes, standards in flux, challenges to authority, and new ways of doing things. Many of you probably have a favorite, or least favorite, change that came out of that turbulent time. But I'm sure for most of us, the most important development of the 1960s was ASCII -- the code that changed the world. Well, most of the world, anyway.

You see, by 1964 IBM had spent hundreds of millions of dollars developing a new line of mainframe computers, called the System/360. If this line succeeded, it would allow IBM to dominate the computer market for 20 years or more. If the new computers failed, so could IBM.

IBM was tempted to adopt ASCII as the internal language of its new computer line. But there was one problem. For years IBM and its customers had been using Hollerith-style punch cards to enter and store data. And the "collating sequence," or sorting order, of the Hollerith code was incompatible with the new ASCII code. If IBM adopted the new code, it could alienate its existing customers, force them to update all their data files, and reduce the chances the new computer line would be successful.
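You can still watch the two collating sequences disagree today. In this little Python sketch of mine, code page 037 stands in for EBCDIC. Notice how digits sort first under ASCII but last under EBCDIC -- and how EBCDIC puts lowercase letters ahead of uppercase:

```python
chars = ["a", "A", "0"]

# Sort the same characters by their ASCII and EBCDIC byte values.
ascii_order  = sorted(chars, key=lambda c: c.encode("ascii")[0])
ebcdic_order = sorted(chars, key=lambda c: c.encode("cp037")[0])

print(ascii_order)   # ['0', 'A', 'a'] -- digits first in ASCII
print(ebcdic_order)  # ['a', 'A', '0'] -- digits last in EBCDIC
```

A customer's card files, sorted the old way, would have come out in a different order on an ASCII machine -- which is exactly the headache IBM wanted to spare them.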

Now IBM is nothing if not conservative. Such a big gamble is just not its style. So instead of ASCII, IBM adopted a new computer code whose collating sequence was compatible with the Hollerith code. That code, along with its half dozen more recent variations, is known as EBCDIC.

Ironically, the conservative folks at IBM tried to hedge their bet on EBCDIC. In case EBCDIC didn't prove popular, or ASCII proved too popular, they added a little-known feature to all System/360 computers. With the flip of an electronic switch, the 360 could communicate in ASCII!

Unfortunately, this fact was not fully communicated to the programmers writing the 360's operating system. As a result, they wrote lots of code that only functioned correctly when the computer was running in its EBCDIC mode. This little oversight, and the EBCDIC momentum it created, delayed IBM's migration to ASCII-based computing for years. In fact, IBM didn't release a computer and operating system that made full use of ASCII until 1981. That computer was the IBM PC.

But that's a whole 'nother story. And it'll have to wait until another time. Right now I'm hearing distant thunder, and big fat raindrops on the roof. It's three a.m. and this sleeping weather is too good to pass up. But I'll be up and back on the 'net in a few hours. If you see me there, be sure to wave and say "Hi!" I'll be looking for you!

TTYL,