Son of Devil's Advocate
One of the ever-taxing HOPL (History of Programming Languages) mysteries is why some programming languages endure, warts and all, while others, equally undeserving, wilt and die. Of course, one of the culprits is the misleading analogy between natural and programming languages. Indeed, we have been cursed since the 1950s with this semantic overload. Whoever dubbed COBOL, APL and Fortran as "languages" is currently roasting in hell, forever doomed with the "unpardonable sin" of degrading decent discourse far beyond Jehovah's original Babel reprimand [ref 1].
The Cromwellian puritans who beheaded the Virgin Mary and her saints at Ely Cathedral [ref 2] are much less tormented.
We know quite a lot about the rise and fall of natural languages. They reckon that thirty such disappear each year as the last surviving "native" speaker bites/mouths the dust. She (for that is nearly always her/his gender) mumbles her final farewell in words understood only by a few academics. They are anxiously tape-recording her death-bed Abschied, trying to capture a new facet of grammar and idiom.
"Does Miwok have an inessive inflection?" they ask in stilted Miwok. Too late. "Piss off," she moans in real Miwok beyond morphology.
You could say that Miwok died much earlier when sons and daughters started job-hunting in English, reserving their Mother tongue for family squabbles.
A "living" natural language seems to require a speech community of at least 300, within which new, irregular constructions evolve beyond formal analysis. Latin and Sanskrit, although widely "written/spoken," are justly deemed "dead" because they are not subject to the cut'n'thrust of daily human usage.
It's therefore, almost by definition, impossible to resurrect a dead language. The last known speaker of Cornish (a neat Celtic variant) was buried circa 1700, yet today there are 200 noble addicts keeping the language "alive."
Modern Hebrew is a major, wondrous, rare exception but not without many conflicts with the underpinning "dead" Biblical Hebrew.
I recall one debate on whether the modern Hebrew word for "computer bug" should be based on entomological or etymological roots.
How we define "language" and "native" speaker is, of course, a whole can of worms to be opened with extreme caution. And the worms can never be re-canned.
Wars have been fought over minor linguistic variants.
Recall the life-death "shibboleth" -- are you a Gileadite or Ephraimite -- watch your sibilants, else you die (Judges 12:4-6).
Which reminds me of the definition:
"A language is a dialect with its own Army and Navy" (anon)
I suppose a programming language dies when nobody can find a suitable compiler. Meanwhile, there could be no end of executables buzzing away below our GUIs, written ages ago in some extinct version of Pascal?
Devil's Advocate Feb 1986
AI (Artificial Intelligence, dummy - this isn't BYTE Magazine, you know; we refuse to spell out every acronym to help the illiterate) is leaving the hypedom of academia and heading for the super-hypedom of your local, friendly computer boutique. Finding the doors bolted and the bailiffs on guard, AI is clever enough to move on to your distant, unfriendly computer mail-order company.
Unsold check-balancing routines are being rewritten in Prolog and repackaged as "Smart Money Managers." The new programs really know all about debits and credits. Reckless check-writing, I'm told, can actually invoke an "Insufficient Funds" warning. You can't get much more expert than that!
The same old, dumb spell-checkers are now called "intelligent" simply because they sit in RAM and can compare bytes on the fly as you type.
New database management systems offer affordable "natural language" query interfaces that expect you to formulate precise queries in the ambiguous mother tongue of your choice.
Compare the crisp, business-like, familiar QL syntax of:
FIND CLASS:CUSTOMERS= DUE$>=5000 AND STATE'AB=("CA" OR "OR" OR "WA")
with the slovenly:
"Get me dem f***ing West Coast flakes dat owe me 5 Gee or more"
I rest my case. Which reminds me that the algorithmic inadequacies of a natural language can be seen most clearly in the convoluted prose devised by the legal profession, and in the dollar-consuming court-room farces enacted to divine the meaning of it all. Lawyers might well claim that the only precise English version of the BASIC statement X%=X%+1 would be:
"Let it be known by these presents that whereas the symbol X% is here now and elsewhere implicitly and de facto declared to be of the type known as Integer the aforementioned value of the said symbol is notwithstanding prior or future statements and assignments to be henceforth incremented enlarged and adjuncted by the integer 1 (one) and further that this instruction is not liable for any damages directly or indirectly arising from carry overflow non-performance or mal de code."
This premature rush to spice up the PC market with add-on "smarts" will certainly backfire. It has already restoked the old AI controversy, which I prefer to call l'Affaire Dreyfus in honor of Hubert L. of that ilk, author of What Computers Can't Do (Harper/Colophon). After many uneasy cease-fires, the battle is now back to no-holds-barred trench-warfare. You may have seen the posters: "Thank You for Not Taking Prisoners." Observers from several factions of the Beirut Militia have been horrified by the pointless savagery of the conflict.
The opposing camps are the True Believers ("One More Research Grant, and Victory is Ours!") and the Devout Atheists ("Time Flies Like an Arrow - So There!"). Crouching in between are the Cowardly Agnostics, scorned and hated by both belligerents.
My favorite agnostic philosopher was the late C. E. M. Joad, who parried all questions by saying "Well, it all depends what you mean by...." If you asked him "Does God exist?" he would puff on his pipe a while, then reply "Well, it all depends on what you mean by 'does'."
The key AI question is "Can machines think?" Well, it definitely does depend on how you define "machine," "think," and "can." Some of the possible definitions reduce the question to nonsense, while other equally plausible definitions convert it into a tautology.
If you feel that biochemistry has reduced Homo sapiens to a machine (one of daunting complexity, no doubt, but a machine nonetheless), then the question becomes "Can I think?" Only you can answer that one. I certainly think some of the time. At least I think I do.
Defining the act of "thinking" has proved to be a major stumbling block in the controversy. If you feel that "thinking" can be defined by a long but finite list of properties or examples, then theoretically any computer can be programmed to match such a definition. The AI Atheists, however, are positive that intelligence and thinking entail certain non-finite, non-algorithmic properties that will forever be beyond the grasp of hardware and software. However complex and "expert" the AI systems become, the doubters will say "Yes, very clever! But does the machine really understand what it's doing?"
At this point, many agnostics, such as myself, can sympathize a little with the AI practitioner. The chess-computer scene is a good illustration. Some atheists, including Dreyfus, unwisely set too low a limit in the 1970s on the standard of chess achievable by a set of chips. The improvement in performance (measured simply in victories against humans) over the last five years has certainly surprised the skeptics. I confess that in my only brief encounter, after winning four quick games in succession at level 3, I cranked up my silicon opponent one notch, and lost! It is no consolation to point out the incredibly stupid, clodhopping antics performed by a dumb machine that replaces overall positional finesse with preordained trial-and-error. I'm told, for example, that after an internal trial move, the machine often asks itself "Is this piece still on the board?" The moral is that in setting goals and tests for AI, one should fairly measure the results rather than the methods.
\\ Note: this was written before the IBM cheat, Deep Blue, beat Kasparov.
The layperson is no longer surprised when a machine rapidly and unerringly multiplies two large numbers. There is a general feeling that arithmetic is somehow "mechanizable." However, what if a computer quickly compares the bytes in the string "squate" with each of the strings stored in a dictionary, and outputs the message "No such word! Did you mean 'square'?" It is easy to get carried away with anthropomorphic delusions. But not for too long. Once the trick is revealed (no mirrors, just plain old arithmetic!) it is clear that the machine "understands" nothing. This is not to denigrate or discourage the incredible programming feats of the AI community. All we seek, as Professor Dreyfus stresses in What Computers Can't Do, is more honesty in describing the programs and the progress made.
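For anyone keen to spoil the trick for themselves, it really is just plain old arithmetic. Here is a minimal sketch in Python: the toy dictionary and the use of the standard difflib module are my own assumptions, but the point stands, byte-comparison and similarity scores produce the eerie "Did you mean?" effect without a flicker of understanding:

```python
# The "trick" behind an "intelligent" spell-checker: no mirrors, no
# understanding -- just comparing the suspect string against a stored
# word list and ranking candidates by plain sequence similarity.
import difflib

# A toy dictionary, invented purely for illustration.
DICTIONARY = ["square", "state", "quote", "squad"]

def spell_check(word):
    if word in DICTIONARY:
        return "OK"
    # get_close_matches scores each candidate arithmetically and
    # returns the best match above a similarity cutoff (default 0.6).
    suggestions = difflib.get_close_matches(word, DICTIONARY, n=1)
    if suggestions:
        return "No such word! Did you mean '%s'?" % suggestions[0]
    return "No such word!"

print(spell_check("squate"))  # -> No such word! Did you mean 'square'?
```

The machine "understands" nothing about squareness; it merely computes that "squate" and "square" share more bytes than any rival pairing.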
My own contribution to AI will be released quite soon and well within living memory! It is based on the observation that the natural language compilers now emerging require the rewriting of your existing conventional (unnatural) programs. My solution will obviate this costly conversion. It ignores your code but compiles your comments. If you have not been commenting your programs adequately, don't blame me. We've warned you often enough!
Confident of success, I have already planned yacc (yet another comment compiler). To whet your appetite I can reveal that Pass 1 of yacc converts e.g.:
++a; // pre-increment count by 1
pre-increment count by 1 // ++a;
maintaining the old convention that any vagueness in the left hand column can be optionally clarified in the comment field, or vice versa.
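For the morbidly curious, Pass 1 can be parodied in a few lines of Python. The function name and the one-comment-per-line "//" convention are my own assumptions, not a leaked yacc spec:

```python
# A parody sketch of yacc's Pass 1: swap the code and the comment
# around the first "//", so the compiler can get on with compiling
# the comments. Uncommented lines pass through untouched -- if you
# haven't been commenting adequately, don't blame me.
def yacc_pass1(line):
    code, sep, comment = line.partition("//")
    if not sep:
        return line  # nothing to swap
    return "%s // %s" % (comment.strip(), code.strip())

print(yacc_pass1("++a; // pre-increment count by 1"))
# -> pre-increment count by 1 // ++a;
```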
ref 1: "GOTO [sic], let us go down and confound their language" (Genesis 11:7). A remarkably credible myth on the birth of diverse tongues. A divine, effective smiting of Mankind vainly reaching to the sky for Godhood. Linguists keep boiling down their endless language-family categories each month, seeking the pre-Babel "common language." My best guess is that God natively speaks all seven Basques and is reasonably fluent in Mayan, Hebrew, Latin, Arabic, Prouvençau, and Hungarian. He tells me He's given up on His Berlitz English and French classes. His use of GOTO indicates a blessing on several unstructured programming languages such as BASIC and C/C++.
ref 2: "God and Mankind -- Comparative Religions," Robert Oden, Audio-Book set, 1992; Superstar Teachers, The Teaching Company, POB 3370, Dubuque, IA 52001-3370, (800) 832-2412
Liverpool-born Stan Kelly-Bootle has been exposed to computing, on and off and vice-versa, since 1953 when, after graduating in Pure Mathematics at Cambridge University, he switched to impure post-grad work on the wondrous EDSAC I. After some trenching with IBM and Univac in the 1960s and 70s, Stan opted for self-employment as a consultant, writer, folk-song revivalist, after-dinner entertainer, and cunning linguist.
His monthly DA ("Devil's Advocate") column ran and ran in UNIX Review (aka Performance Computing) from 1984 until January 2000 (a date that will live in infamy) but lives on as SODA ("Son of DA") via www.sarcheck.com, the homepage devoted to UNIX performance.
The latest of his umpteen books are "The Computer Contradictionary" (MIT Press) and "UNIX Complete" (Sybex). More on his biblio- and disco-graphy can be found on http://www.feniks.com/skb/, soon due for its millennial update.
Stan welcomes reader reaction: email@example.com
Portions © copyright Stan Kelly-Bootle 2001.