Computer Careers Book


Sunday, August 08, 2004

 
Can technological advances be delayed?
"To me, the outsourcing trend indicates that an
ever-larger part of IT work has become routine,
repetitive, and low-bandwidth--one might even
say unexciting or boring. . . . Outsourcing has
been historically a prelude to mechanization, a
nd mechanization is a high-value domestic opportunity."

-- Charles Simonyi, former Microsoft star and now CEO of
Intentional Software Corp., in Information Week

My credentials and opinion don't hold a candle to
Mr. Simonyi's, and I'm sure he has a much better
grasp of the technical details behind what he
calls the "mechanization" of software.
However, I do grasp the essential point he is
making. Maybe I can explain it more clearly for those
without his breadth of knowledge and experience.
Let me put this debate into some perspective, because it's
not really new. Ever since advances in technology started
occurring with enough speed to be noticed, they have
been opposed by those who feel threatened by any change
to the status quo.

This was as true of the Dutch workers who broke early
manufacturing machinery by throwing their wooden
shoes (or sabots) into it (thus originating the
term 'saboteur') as it is of modern-day programmers
who sense the not-so-long-term vulnerability of
their profession.

Do you know how the original computer programmers worked?
To make sure you understand this, let's back up to
the machine level of computers. All they know is two
settings: off and on.

That's it. Everything you've ever done on a computer,
from making an ATM deposit to sending email to a
friend to listening to a hit song, was simply a
complex combination of off and on. Off and on.

Usually shown as 0 or 1. Binary code in groups
of eight.

The sequences of 0s and 1s are the basis of all
computer codes and languages.
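
As a small aside, here's what those groups of eight look like from a
modern third-generation language. This is just a sketch of my own in C
(the language the link at the end of this post covers), not anything
the pioneers had -- it prints the eight 0s and 1s the machine actually
stores for a single letter:

#include <stdio.h>

/* Illustrative only: show the byte (eight 0s and 1s) behind the letter 'A'. */
int main(void)
{
    unsigned char c = 'A';              /* stored internally as 01000001 */

    printf("'%c' is stored as: ", c);
    for (int bit = 7; bit >= 0; bit--)  /* walk the bits, high to low */
        printf("%d", (c >> bit) & 1);
    printf("\n");

    return 0;
}

Run it and you get: 'A' is stored as: 01000001. Every keystroke, every
pixel, is more of the same.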

These sequences of 0s and 1s are mind-boggling to
work with, even at a keyboard. Raw bit patterns like
this are machine code; the slightly friendlier layer
above it, assembler language, merely gives those
patterns human-readable names, and only a relative
handful of programmers understand how to work at
that level.
Now, the computer registers these 0s and 1s (off and
on) through electrical impulses in silicon
microchips. In the early days, it was not so easy.
Microchips hadn't even been invented.
That's why early computers were so huge -- they
consisted of vast banks of wires.

To program those machines, the pioneer programmers
had to WELD those wires together.

Yes, one wire to indicate either 0 or 1. So it took
welding a large number of wires to instruct the
computer to do even the simplest task.

How long would your programs be if you had to "write"
them by welding a computer wire for every
single 0 or 1? Machine code, remember?
It's safe to say that the IT industry would
employ almost everybody in the world to weld enough
wires to perform about 1/1,000,000th of all the tasks
we now use computers for.

Our lives would be vastly different: there'd be no
Internet or Microsoft (who would take the time to
weld enough wires to display Windows on a computer?),
but we'd all have jobs!

Obviously, we have progressed since then.
Even those early computer programmers got tired of
welding wires, so they installed switches to flip.
Then they learned how to flip them electronically.
Then the microchip was invented.

By the time I took a class in COBOL in college in
1976, "programming" consisted of sitting down at a
machine the size of a small organ and punching holes
in IBM punchcards.

One line of code per card.

Since even the small programs I had to do as a student
ran a few hundred lines, plus sample data, that
amounted to a big stack of IBM cards. I took that to
the geek in charge of the computer room, who ran
the stack through a reader, and if I was lucky I got
the results back in a day or two, since everything
was batched.

It's safe to say that if this were still the standard
method of programming, it would take almost
everybody in the world to punch enough cards to
perform about 1/1,000th of all the tasks we now use
computers for.

We'd still have the standard business mainframe
applications we had back then, but certainly no
World Wide Web. You can't run a web site on
punch card processing.

But we'd all have jobs!

Eventually programming progressed to using computer
programs called editors, with which you could write
and view code one line at a time. Then came more
comprehensive editors, and then the full-scale
IDEs that we have now.

At every stage, more work got done more easily,
and there was less routine work for the
programmer. In short, as computers advance
they not only do more, they take over more of
the necessary but boring chores.

Another example:

Take the history of cartooning, in which Walt
Disney was a great pioneer. Ever watch the old
classic full length cartoons he made in the
late 30s and early 40s? SNOW WHITE, PINOCCHIO
and DUMBO.

They are full of exquisite detail and sophisticated
movement.

If you were a dedicated Mouseketeer who watched
the Mickey Mouse Club shows in the 1950s, you
probably remember the times they took us
behind the scenes and explained how cartoons were
made -- by artists who did tons of grunt work,
making hundreds or thousands of drawings, all only
slightly different from each other, to bring the
magic of cartoon motion to the screen.

To make a full-length animated feature such as
BAMBI required many cartoonists producing, between
them, millions of separate drawings.

A huge amount of boring work, in other words.
The Disney Studio could do that because in the
30s and 40s a lot of artists worked for peanuts,
happy just to have a job.

But eventually Walt had to start paying his artists
more money and the level of detail in cartoons
went down.

By the early 1960s, the Hanna-Barbera cartoons I
watched on TV as a kid were much cruder. When
Yogi Bear ran from the ranger, the background
rotated and his legs and arms moved but it was
obviously fake even to me as a child.

The sophistication level of cartoons continued
to decline even at Disney through the 1970s. But
in the 1980s they saw a rebirth.

Why? It wasn't because Disney cartoonists agreed
to go back to drawing thousands of pictures for
India-level pay. No, it was because computing
power became more affordable, and software could
now supply animation with intricate detail and
motion.

Now, computer-aided graphics are an important tool
for every filmmaker. They added bullet splashes to
the river water in the John Kerry campaign
movie. No fantasy or SF movie could be made
without them.

If you really think we should ignore available
technological breakthroughs in favor of keeping
jobs made obsolete, why not go back to the
beginning?

Let's get rid of tractors, harvesters, mechanical
pickers and ALL motorized farm equipment. Even
plows -- the plow, even before it was pulled by
a horse or donkey, was a great technological
advance that enabled farmers to grow a lot more
food than before.

No, let's go back to farming with sharpened sticks.
That'll give us all jobs. It'll take a lot of us
and a lot of sharpened sticks to feed the six
billion people in the world.

In every generation, we do what we have to do. When sticks
were the only way to plant seeds, we used them, because
we wanted to eat. When the plow came along, we adopted it
because we wanted to eat more food with less work. When
Walt Disney had to pay hundreds of artists subsistence
wages to create beautiful works of animation, that's what
he did.

When we had to weld wires to get a computer to add 2 plus
2 equals 4, that's what we did.
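
For the record, here is what "2 plus 2" looks like today in C -- again
just an illustrative sketch of mine, with the whole job done by one
printf statement and a little boilerplate around it, instead of a bank
of welded wires:

#include <stdio.h>

/* The entire "add 2 plus 2" program, reduced to a single statement. */
int main(void)
{
    printf("2 + 2 = %d\n", 2 + 2);
    return 0;
}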

Now that we've advanced beyond those levels, we wouldn't
think of returning to them. Yet some people -- even
techies who of all people should know better -- want to
think the advance of technology can be delayed.

To read about one of the great third generation
programming languages of our time, go to:

C for Not Yet Techies

