People in the high-tech industries don’t like to demonstrate knowledge of older technology. In some circles of the computer field such knowledge actually carries a stigma!
There are valid arguments for using NEW technology whenever possible. Older technology at whatever level is a mixture of elegance and crudeness, with old questions answered and new questions left unanswered, problems left unsolved. Newer technology answers some of those new questions, solves some of those problems, and so it makes sense to move on from there instead of living and working as though those questions were never answered. It just makes common sense to move beyond early stages of development when the opportunity is there, and especially if the marketplace demands it. Most young people (and a lot of old people, too) tend to take this view, and lace it with scorn for the old.
As a result, some people will avoid mention of older stuff except in a disparaging sentence.
How far should we carry this attitude? Is E = mc² (discovered about 80 years ago) out of date? Is c² = a² + b² (determined around 2,500 years ago) old-fashioned? Or are these timeless principles? Granted, these are abstractions (rather than technology), but so is Object-Oriented logic.
Campers sometimes use gaslights on their trips through woods. In a gaslight a piece of material is burned to an ash, and the ash glows brightly as gas burns in it. I saw streetlights on a historic street on Boston’s Beacon Hill that used gas in that way. I remember thinking, “Amazing! We can actually get that much light, without electricity!” I had to catch myself and realize that my thinking was strangely upside-down. A hundred years ago, with the introduction of the incandescent bulb, people were thinking, “Amazing! Light can be produced without flame, without oil or gas!” This was considered progress. And I looked at it backwards, with the same sense of wonder!
Though I’d worked in computer programming for years for big companies, my first home computer was a crude second-hand laptop, and my first exposure to the Internet was at a very low baud rate with very limited resources. But I gained a pretty good reputation with some educated people on a few newsgroups because they considered my postings well-reasoned, well-thought-out, and well-expressed. Given the overwhelming number of posts with bad spelling, bad grammar, bad ideas, and good ideas expressed badly, these people considered my writing a breath of fresh air (I just know that some of my flame-prone colleagues will take issue with that statement). I did what I could using limited resources.

We don’t have to look far to see more examples of people doing just that. Old mainframe programs may contain “spaghetti code” and heavy use of language features now considered “verboten,” but they also contain creative coding that managed space and run time economically back in days when those resources were scarce and expensive. Japanese builders of past generations made thin paper houses to stretch their extremely limited natural resources.

When I see people with what I call “Starship” computers with tremendous resources — superfast processors, huge RAM, superhuge-capacity hard drives, advanced and versatile programs, and wizardlike peripherals — who nevertheless lack basic communication skills, basic historical and cultural knowledge, and the essential elements of a classical education, I’m appalled. I have more respect and admiration for someone who accomplishes great feats with knowledge, insight, and intuitive leaps using crude, limited resources than for semi-literate people with generous resources at their disposal.
To support older ideas, let’s note that when Isaac Newton was asked how he’d made his discoveries and accomplishments, he humbly answered (despite the nasty things Stephen Hawking says about him), “I stood on the shoulders of giants.” Even though what he did surpassed the deeds of his predecessors, he regarded them with respect, realizing that he couldn’t have reached so high without their support.
While older technology is relatively crude, it does the job for which it was designed. Many long-standing software applications and systems continue to do useful work. Information technology far less sophisticated than today’s state of the art made it possible for companies to do business on a mega scale unheard of previously. Computer data-gathering and interpretation years ago made it possible to map the ocean floor and produce chromosomal maps, and the truly primitive computers of the paleolithic 1960s made it possible for men to go to the moon and come back alive with pieces of it.
In his book “Small is Beautiful,” E. F. Schumacher wrote about the limits of the world’s resources: not everybody in the world could live like a “little American” because of those limits. He suggested that less developed nations could get along well on intermediate technologies. But Schumacher also threw light on a kind of ignorance many of us have, a limiting premise that if a problem can’t be solved using the latest technology, it can’t be solved at all — as though earlier technologies could not possibly solve a problem that later technologies can’t.
We tend to think of technological advancement as an ordinal progression rather than an expansion in different directions. CD-R burner drives are preferred over Zip drives in desktop computers because compact discs have greater storage capacity, but that doesn’t mean Zip drives are a more primitive technology. (Actually, files can be stored more quickly on a Zip disk than on a CD, no special software is required to write the files, and a Zip disk sits in a sturdy casing that protects it from damage, unlike a CD.) Computer pioneers John Mauchly and J. Presper Eckert, inventors of the ENIAC (Electronic Numerical Integrator And Computer), had plans to produce concurrent computer processors, but serial processors took off and concurrent processors fell by the wayside; that doesn’t mean concurrent processing is an old-fashioned, primitive, or inferior concept (ironically, programs on serial, von Neumann processors may PRETEND to be concurrent through the use of semaphores, mutexes, critical sections, etc.).
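The idea of a serial machine “pretending” to be concurrent can be sketched in a few lines. Here is a minimal illustration in Python (a language chosen purely for brevity, not one the article discusses): several threads interleave on the processor, and a mutex guards the critical section so the interleaving behaves as if the work were safely concurrent.

```python
import threading

# A shared counter protected by a mutex. On a serial (von Neumann)
# processor the threads merely take turns, but the lock makes that
# interleaving safe, giving the appearance of orderly concurrency.
counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        with lock:  # critical section: only one thread at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000: the mutex prevents lost updates
```

Without the lock, two threads could read the same old value of `counter` and one update would be lost — exactly the hazard that semaphores, mutexes, and critical sections exist to prevent.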
At this writing, Visual Studio .NET is a rising technological advancement in increasing demand in companies. It automatically provides the interfaces for modules written in different languages so they can interact, an ability that could otherwise be provided only by COM (the Component Object Model). It enables people to develop software more quickly than previous development environments did. Yet this wonderful “new” development seems to be based on language principles that have been around for more than three decades and were introduced over 25 years ago in Digital Equipment Corporation’s VAX computer. Perhaps we ought to call it “primitive”?
And if you don’t mind my getting science-fictionish, an episode of classic “Star Trek” (notice that “classic” is the new euphemism for “old”?) had the Enterprise’s landing party visit an Earth colony where the inhabitants were dying of radiation-induced aging. The landing party too started to age, except for Ensign Pavel Chekov. Dr. Leonard “Bones” McCoy remembered that Chekov had entered a building, seen an extremely aged dead body, and run out in fright. McCoy made the connection that adrenaline had shot through Chekov’s system and made him resistant to the radiation. Dr. Janet Wallace said that another medicine, hyronalin, was used in radiation treatment instead, but “Bones” noted that adrenaline had been used in earlier radiation-treatment research. A serum containing adrenaline was given to the surviving victims, and they were restored to their proper biological ages. A recommendation was made to the Federation scientific community to include adrenaline in further studies on radiation.
The point is that not all old ideas are useless, primitive, or naïve, and not all new ideas are better. Sometimes an idea is abandoned for economic reasons (greater market share goes to an idea that’s not as good), and sometimes a good idea just doesn’t have the charm or allure of a lesser one.
Wilfred D. DeVoe is a software developer/programmer who has worked in both full-time permanent and consulting roles in public and private sectors. He holds Bachelor’s degrees in Computer Science (Boston University) and Psychology (Salem State College). His education and interests give him a unique perspective he likes to share with colleagues and interested people.