* Focus on building working software.
* Maximize collaboration.
* Become a generalizing specialist.
* Minimize handoffs.
* Constantly improve your skill set.

These are all very good goals in building quality software, and I have attempted to serve every one of them throughout my career. Does that make me an "agile" programmer? Hardly. Scott goes into more detail for each of these, and as they say, the Devil is in the details.
The handoffs line is really about communication. Scott does not communicate carefully (there were numerous miscommunications in our exchange, significantly more than I normally experience in email), so I can see why he wants to minimize opportunities for failure there.
Collaboration is necessary for very large projects. It also goes hand-in-hand with large development budgets. I tend to work on smaller budgets, so it's not much of an option for me at this time. I do work well with competent people whose skills complement my own; a large part of my career was spent in such an environment. I have also done reasonably well nurturing trainees. I am less successful at pretending equality with persons who do not understand (or who deny) their own incompetence.
But today I want to focus on that first "Rule", which Scott elaborates:
Upon hearing the detailed description of a requirement you should have working, tested software in place within hours or at most days, not weeks or months.

So this Rule is not about working software at all -- since all of us want working software -- but about making it work very soon. Hence, no doubt, the name "agile". And here also we see the connection with his summary line about evolution.
Evolution, as you know, posits a general principle pervading the whole universe, where highly complex systems "evolve" slowly by minute accidental improvements followed by extensive testing and culling out of the misfits by natural selection. It's a beautiful model. Its only problem is that nobody has ever seen it actually work. Worse, it violates the laws of physics that we know and understand. But it's a beautiful model.
If evolution really worked, it would be a marvelous cost-effective way to develop software: set thousands (or millions: think of SETI @ Home) of computers banging away like a million proverbial monkeys on typewriters, churning out occasional pieces of usable software amid all the noise. Computers are cheap, probably a couple hundred dollars each for a high-powered CPU in a server farm, certainly a lot cheaper by the thousand than the cost of a single high-quality programmer for a year or two. But nobody, NOBODY does that. Why? It doesn't work.
Agilists do not try to use evolution the way the atheistic biologists claim it works. They know better. Besides, they want to be "agile" (fast). Instead they just use evolution as a politically correct metaphor to represent what they think is beneficial in their methodology. Unfortunately, metaphors infect the way we think. "Working, tested software within hours" assumes that the changes you make are very small, easy to implement, and easier to test. It's pretty easy to show counterexamples.
Consider a reasonable, useful program -- I have one on this computer (I didn't write it, but I thought about it a lot) -- that has exactly three requirements:
1. It reads a scanned document (image) file,
2. It produces an editable text file, and
3. The text accurately reflects the scanned text image.

The first two requirements are pretty easy; any apprentice programmer can do that part in an hour or two, and have it tested by the end of the day. Requirement #3 is the whole essence of this program, and it's a very simple requirement that any nonprogrammer can understand -- but it takes a very big program to do that. It's not going to happen in days or weeks; if you are starting from scratch, it's going to take several people most of the year. That's one requirement.
Now if I had to defend the agile methodology, I might try to claim that there are multiple requirements buried there, noting the word "detailed". That's not exactly true. There are multiple implementation strategies and subgoals, but one requirement. The requirement is sufficiently detailed. But as I said, Scott does not communicate well; let's give him the benefit of having misstated his Rule, and assume he really meant to say:
Upon hearing the detailed specification of a single software component carefully designed by a master software architect, you should have working, tested software in place within hours.

Then it's only a matter of forcing that master architect to partition his design into small enough pieces and in sufficient detail that the apprentice coder can turn out code in a few hours. Of course all we have done here is put all the burden of software development on that master architect, and relegated to the functionaries who putatively prove Rule #1, work that should be done by a computer and not by people at all. The architect is not turning out "working, tested software within hours." It takes him far more than hours to nail down all those specifications, and it's still not working yet. Besides, what you end up with is the old "waterfall" development model that agile is supposed to replace.
Scott wants us to believe that all the team members are generalists, and everybody is capable of doing their part of the design. What does each person do? Who decides whether to use neural nets or so-called genetic algorithms or hill-climbing algorithms or rule-based reasoning? Who chooses the metrics that these algorithms will optimize, or the rules to program and in what order? Who decides on the interface between the image-processing front end and the feature extraction engine? Each of those parts is pretty big, will you partition out to separate tasks/developers the identification of individual letters? There's still far too much work to do in a week.
But this is all pretty simple -- huge, but simple. What got me on the case is the situation where you have multiple components that interact in ways to achieve a result that cannot happen if one component is missing. Non-evolutionists call this "irreducible complexity" (IC), and evolutionists try to deny that such a thing exists. Michael Behe's 4-part mousetrap is a fairly simple example of IC, and nobody has yet shown it to be reducible. One wag tried to make all four components of one piece of metal. Of course all he proved is that by careful design you can attach all four components together into one piece, but the four functions are still separately identifiable, and he did separately identify them.
The real world is filled with examples of IC, and real-world software has numerous examples of it too. So I asked Scott how he could get "working, tested software in place within hours" when nothing works until all the necessary components are there. I guess he did not like the idea of IC messing up his beautiful evolutionary model, so he carefully dodged the question. Several times. The dodge was unnecessary.
If he were a better communicator, he could have admitted that the "working software" line really means this: for each component we specify its (tiny!) contribution to the whole and its interactions with the other components, and we define it to be working when it responds properly to the test program we write to stimulate that interaction. The whole program, of course, only "works" to the extent that the tiny pieces each do their own thing in isolation. With IC functionality, that crucial part of what the final program does will not appear in hours or days, but only when all the required components are there and not only "working" but actually working together -- which probably means that your original notion of those interactions has been altered to encompass a better understanding of how the complex system has to work. Agilists call that "refactoring", and if you are lucky, you can do it in hours or days. If you are lucky. I just finished refactoring my PC framework, and it took 4 weeks. Most often I can do it in hours, but not always.
But I am not an evolutionist, so it does not offend me to admit that I "design" software, and that some of the increments take rather a long time to figure out and implement. If agilists cannot do that kind of development, then, as I told Scott, I have a competitive advantage over them in the software development market.
Kind of cool.
It's not hard to discern the demographics of a magazine subscriber list: just look at who the ads target (also known as "follow the money" ;-) WIRED magazine sells expensive playthings, some of them technical; Dr. Dobb's Journal offers software development tools, and InfoWorld promotes high-ticket computer technology products to top-level managers. BAR ads are a hearty combination of world travel and Christian kitsch (essentially the same companies and products that I see in WORLD magazine, an unapologetic conservative Christian newsweekly). This predominance of conservative Christian BAR readers is also confirmed, though less reliably, in the letters they print, which often include numerous outraged "cancel my subscription" requests protesting some sacrilege, such as printing pictures of Moabite idols dug out of the ground. Yes, some guy really did cancel over that.
The editor, Hershel Shanks, like many magazine editors (WORLD and InfoWorld excepted), is not a member of his primary readership demographic. I think he's Jewish. However, he is a good businessman, and he knows his readership. Well, sort of: in the current issue he got some complaints about an off-hand remark against President Bush. In any case, he vigorously promotes the publication of archeological findings in the Middle East, which is what conservative Christians want to read about.
His own interests range rather wider than that limited demographic, and he tried to spin off magazines on general archaeology and (non-conservative) theology. Both folded -- or rather, he merged them back into BAR -- probably from lack of readers. So now we have to put up with irrelevant digs in other parts of the world (mildly interesting, but not a compelling component of our faith), and more pseudo-theology.
The current issue is a case in point. It has a fascinating cover story on "Satan's Throne" (mentioned in Revelation 2:13), which the author credibly links to the temple in Pergamon, and another update on the "James, son of Joseph, brother of Jesus" ossuary and the follies of the Israel Antiquities Authority vainly trying to discredit it. And then there are three Bible Review leftovers, uninformed speculations trying to debunk the Biblical texts and early Israelite history.
No competent literary critic would ever use the kind of circular reasoning that goes into Wellhausen's "form criticism" on any other literature from any period in history -- except the Bible -- and no recent Biblical scholars consider it a credible study of the Bible. You get it in the so-called Jesus Seminar, but that's a publicity stunt, not scholarship. Yet here it is in BAR, where this author tries to throw Newton, Darwin and Wellhausen into the same pot from which he hopes to discredit the Bible as history. Newton and Darwin are not even in the same league. Newton was a conservative Christian, whose science flowed out of his faith. The faith was real, so the science was good science -- and still is, nearly 300 years later. Darwin came up with a credible idea, but the science just isn't there to support it. I guess Wellhausen was a Darwinist; he certainly did not follow the rigors of Newton's science.
The reason I continue to read BAR is that the facts they dig up in Israel and its neighboring regions support the Bible quite well. Sometimes the dates don't match, but archeological dating appears to be on the same circular shaky ground as geological dating. The BAR authors never explain how they arrive at their dates. I have repeatedly asked Shanks to print an article explaining how they date the layers of a dig, but he neither responds nor prints any. I once asked for more inscription photographs with a readable transcription so those of us who imagine ourselves able to read ancient languages can have a go at it -- and he now does that: in the current issue are a number of inscriptions that you can read. Why nothing on dates? The answer is obvious: they'd be laughed out of court. People publish what they are proud of, and hide what they are ashamed of.
Sullivan has invented a neologism, "Christianism", and, like most label-slingers, carefully defines it to best match his own personal qualities. By way of comparison, he observes that "Islamists are those who want to wield Islam as a political force..." The people who actually live in the so-called "religious right" prefer to wield their Constitutional right to vote their personal preferences, which is to restore the civil rights that the Anti-Religious Left has taken away from them over the last century or so. But "-ism" labels are like that: they tend to become popular only with critics of the philosophy so labeled (see, for example, the Wikipedia article on Darwinism). The key definition comes nearly at the end of the essay:
It is the belief that religion dictates politics and that politics should dictate the laws for everyone.

Sullivan declines to tell us in this essay the particulars of his own efforts to dictate the laws for everyone, but his colleagues at TIME are less reticent. Judge Jones, whose public actions are indistinguishable from those of a militant atheist, is, we are told, a Lutheran. The left-wing bigots heartily cheered his atheistic politics, which dictate the establishment of their religion in the public schools and stifle even the scientific study of public issues for everyone. On another occasion Sullivan admitted to being gay, without repudiating the homosexual agenda that not only forces their abnormal "lifestyle" in our faces but makes the other 98% of the population bear the cost burden of their unhealthy behavioral choices -- under force of law.
Of course Sullivan doesn't tell us this in the context of his rant against Christians who, unlike himself, choose to live a life consistent with the teachings of their professed religion, and who -- like himself -- have every Constitutional right to vote their conscience. Nevermind that the left-wing bigots want the conservatives to shut up and just pay their taxes.
They can't have it both ways. Either this is a democracy and everybody (not just the lefties) is free to vote for whatever we think is best for the country, or else it's a religious monarchy, currently controlled mostly by atheists in black robes and writing in back-page magazine columns. The religious right isn't even close to forcing their agenda on the rest of the country. Not like the atheists have done for the last 50-80 years.
Me, I like the democracy idea better. If Sullivan doesn't like my politics, he can vote for something different. Oh wait, he probably did. That's the power of democracy: when you are in an extreme minority, your vote doesn't count for much. Oh well, there's always magazine columns.
After the section on the "TIME 100" they printed an essay essentially explaining their choices. I think Joel Stein's column was supposed to be humorous (perhaps like the pig-headed political "humorist" who got so much coverage during the election; they also let him write one of the pieces in this issue), but it had the ring of truth: Stein tells us that his "Joel 100" list (like TIME's list, although he did not say so explicitly) was nothing more than his own friends. That explains it perfectly. The editors and writers at TIME are so far out of touch with the American people that they do not know any honest conservatives; they just picked their friends for this list (with a few really powerful conservatives thrown in for the false appearance of honesty).
The back-page essay in the same issue told the same story. It was written by another admitted left-winger bemoaning the fact that the Democratic party is leaving her behind -- effectively pushing her unwillingly into the Republican party. I was reminded again of my Market Politics posting a couple of years ago. The left-wing bigot politicians and their allies in the national media have Clue Deficit Disorder. They are so far out of the mainstream they no longer can see how far away they are. The American people can, but it takes a telescope. Or a tele-vision.
One positive observation, in another magazine (you won't see this in TIME), is that the left-wing bigots fail to meet their own Darwinian criteria for survival. Because they kill off their children, it's the conservatives who are populating the next generation of voters, not the lefties. They will get shrill and tiny, tiny and shrill -- and mostly irrelevant. Like the Marxists who ran the former Soviet Union, it doesn't matter what they think because they are the Losing Team.
I have a "framework" on the Mac which puts up windows and handles user events about like the Mac itself does -- except this one is designed to work the way my own operating system does. My operating system is written in my own Java-like programming language Turkish Demitasse (Turk/2 or T2, "a stronger brew than Java") and runs reasonably well in emulation. The framework is also written in T2, but instead of being interpreted in a virtual machine, it is compiled to native machine code and linked to the Mac user interface system calls. It's a lot faster than the emulated version, and for the last couple of months I've been running major programs in that framework on the Mac.
However, the Mac is a dead operating system, and one of the goals here is to re-implement the framework on the PC. I actually have something like that running, but a large part of the translation process to use it depends on HyperCard on the Mac, which I'm trying to get out of. HyperCard is great, but it's so Mac (easy to use, but dead). So I retargeted the T2 compiler again, this time to generate C code. The last couple weeks have been spent redoing the framework. The existing PC version is written entirely in C, which makes it difficult to debug and maintain; this time I'm trying to do as much as possible in T2. With two previous T2 implementations and existing C code, you'd think it would go pretty fast. You'd think wrong.
C/C++ is a horrible language, but it's all we have. The reason for inventing T2 is to fix some of the problems and make programming more productive. This week I have been experiencing those problems. Writing in T2 goes quickly and the code is fairly robust. The C compiler is much more forgiving, so it's possible to get a lot more bugs in your code that the compiler doesn't catch -- and maybe the programmer doesn't either, which is why commercial software sucks and viruses abound. The result is that you can write C code somewhat faster (more sloppy) than T2, but it takes a lot longer to get a C program working properly. Programmers like being able to churn out very high KLoCs (thousands of lines of code), and maybe finding all those bugs makes them feel smart. Anyway they love C.
Converting existing C code to T2 is difficult, precisely because T2 won't let you do all those stupid things that C encourages. Adapting existing T2 code written for the Mac operating system, so it compiles instead to C and uses the Win32 system calls is difficult because the systems are so different. The Mac system calls were designed for Pascal (a robust language like T2, but long dead); the Win32 systems calls were designed for C. The difference shows.
For a break (I tend to take more breaks when the programming is such a chore) I read old computer magazines. I don't know why I started getting C/C++ Users Journal (I certainly didn't ask for it), but today I was reading the February issue. Other magazines go to the top of my reading pile; this goes to the bottom, so I don't get to it very soon. Counting the editorial, there are 11 articles in this issue, six of them about problems in C or C++ and how to get around them. That's more than half. Other magazines -- take PCworld, for example: it generally has maybe one or two articles on how to get around the problems in the PC, but never half of the whole issue. Before Apple started shipping Unix instead of the MacOS, MacAddict struggled to find one or two articles every six months on getting around problems in the Mac (that went up to at least a couple every month with OSX, but that's another story).
Somehow the preponderance of articles on getting around C problems fits with my experience. Which is why I'm trying to get as much of my working code into Turk/2 as possible. It's rough going, but I only have to do this once.
I thought it would be intellectually dishonest not to say to his face what I say to people like you.
-- Former Secretary of State Madeleine Albright, to TIME

I do not know many honest people like that.
Part of the American value system that most people live by forbids disaffirming people to their face. It's apparently OK to slander them behind their back, where they don't know what hit them and cannot defend their good name, but speaking the truth to their face is socially unacceptable.
It's the Devil's own value system, and I won't do it. This policy tends to get me in trouble with people of what I can only understand as dishonest values. I suspect that because they reserve the most flattering things to be said to a person directly, and the most negative and hurtful things to be said only to other people, they must assume I am doing the same -- which makes me exceedingly cruel in their opinion. The reverse is actually true: I will never say anything more negative about you to any other person until you have had a chance to hear it from me directly (and correct any misinformation). Sometimes negative evaluations are necessary, but with me (and now I see, also Madeleine Albright) you always have the opportunity to deal with it as or before it causes you damage. There is no other honest way to do things.
Bravo, Madeleine Albright!
I suspect that the attraction of gnosticism is the feeling of superiority it confers on its adherents, "I know something you don't, and that makes me a better person" (more likely to attain Nirvana or Heaven, or at least actually make the computer do what I want). To retain this we-them distinction, it is important that the secret knowledge conferring this power remain secret -- but part of the power is exercised in controlling the novitiates, giving them little tidbits of information, but not enough to keep them from coming back and grovelling for more.
Linux qualifies as gnosticism because of the vast quantity of esoteric technical knowledge required to install it and keep it running. Did you ever try to install Linux? Immediately you are confronted with a plethora of different partition options in arcane jargon, with dire warnings about erasing data or simply failing to boot -- none of which is really necessary to know in order to use a computer (as any Windows or Mac person will quickly tell you), and (most important) none of which is explained when you are sitting there looking at the options on an otherwise dead computer.
There are vast quantities of self-help documents on the internet explaining what partitions are for and how to do some trivial part of setting them up in "fdisk" after you have Linux already running, but of course this is not much help during the install process. Not only is fdisk not running at install time, but all the terminology is different from whatever it is that is running.
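For readers who have never faced that screen: a hypothetical example of the kind of layout the installer expects you to already understand, in the usual device/mount-point jargon (device names and sizes here are invented for illustration):

```
/dev/hda1   /boot   ext2    100 MB   (kernel and boot loader)
/dev/hda2   swap            512 MB   (virtual memory, no mount point)
/dev/hda3   /       ext3    rest     (everything else)
```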
There are two kinds of Linux users: the holier-than-thou high priests, who do not answer questions in other than cryptic acronyms like RTFM (nevermind that there are no manuals to read), and the newbies trying to reach that level of sanctification. The newbies write all the on-line documentation, but much of it is either wrong or irrelevant or both. Some of the Linux help forums I looked at have long sets of rules encouraging people to be nice to the newbies and answer their questions, but I don't think anybody does more than genuflect at the rules on their way by.
The Macintosh seemed to qualify as a gnosticism when it first came out -- it was so different, and it was so hard to write correct programs for it -- but it wasn't long before the basic concepts were mastered and there was nothing more to learn, no way for priests to become bishops and cardinals. The Mac was too easy to use, so the cognoscenti deserted it for something more challenging. That destroyed the Mac market. Apple took note of the fact and replaced the MacOS with Unix, and the sacerdotes are (slowly) returning. Apple's Unix is still too easy for them, but all the unnecessary complexity is there under the hood. Just today I read a news item in which a manager responsible for supporting both Macs and PCs reported that his team spent three times as long supporting each Mac, compared to a PC. Back before OSX (perhaps before his experience), the ratio was reversed: Mac users needed almost no help.
In a previous life I too drank the kool-aid of computer gnosis, but I learned the truth underlying the Macintosh Way, which is that the computer should not be a conscious part of your problem solving. It just works. Nothing else ever came close, before or since, although Microsoft is blindly stumbling in that general direction -- sometimes. The problem is that this heretical opinion tends to show through when I ask questions about getting Linux up. "Infidel! Off with his head!" OK, the Linuxies are not waving scimitars and blowing people up (yet), but the disdain is palpable.
So in five years of trying, dozens of different versions, multiple platforms, I have never yet succeeded in getting a single Linux install to survive the first or second reboot and do something useful. A couple came close: I downloaded the PuppyLinux "live CD" and it booted up twice before crashing so badly I had to erase the hard drive. While it was up, it actually played a Flash movie that neither my Mac nor WinXP can. But it doesn't do anything else worth doing. The Linux I ordered with the PC ("everything installed and working") boots into a Terri Schiavo coma -- you know, looks functional, but cannot do anything intelligent. We got YellowDog up on the SnowBall Mac at the university, but that's we, as in I had serious hands-on guru help. This week I downloaded Ubuntu, but it refused to install. At least it told me it wouldn't replace an existing Linux system, but the only clear way out of that partition mess was to erase the whole disk, which I'm not about to do. WinXP at least works, and I don't want it erased.
You'd think with a PhD in rocket sci-- ah, I mean computer science (which should be better), I could get something to work. The problem is, I'm not a gnostic. Christianity has always opposed gnosticism. Everything you need to know to be 100% going-to-Heaven Christian can fit on a small tract. There are things to learn, but that is for the purpose of helping other people, not for your own personal gratification. That makes me a populist, not a gnostic. It is also the Macintosh Way. Make that was, because it doesn't sell computers, so Apple no longer does that.
Oh well.
It's a simple matter of supply and demand. Demand is going up -- with India and China coming on-line, it's really going up -- and supply is going down. Besides just plain running out of reserves, we have the Iraq problem and nukes in Iran and Sudan doing stupid things to their people and the guy in Venezuela not much smarter, all of which tend to limit the availability of oil. Not to mention reduced gulf coast capacity because of Katrina, and limited Alaska capacity due to fruitcake environmentalist action.
No matter what color the pump is painted, all the gas comes out of the same wells and through the same refineries, which are all running at full capacity. There's nothing to drive any prices down. If Exxon can't sell the gas in their own stations, they will sell it on the spot market to other stations, just as they always have done. They can even charge a premium for the extra demand.
So what will the effect of this "boycott" be? Higher prices. Yup, the other stations have greater demand, which drives their prices up, and Exxon stations need to raise their prices to make up for the reduced volume. I wonder how long it will take the driving public to figure out how foolish they are.
I don't have a health problem like that (yet), but I often go through the same calculation. All of us are limited in what we can do. Even healthy people cannot run a two-minute mile nor work 27-hour days.
Computer programming is a strange discipline where we get to invent (most of) the rules that constrain us, so it feels like we have unlimited opportunity. However, producing real working software that people can use (as opposed to "OpenSource" freeware that is too hard to do useful work with) involves a lot of external constraints that I and my programmer colleagues would prefer to pretend do not exist. I have called the discipline "triage"; Christine Miserandino calls it "counting your spoons," but it's the same thing. The only difference is that some people (like Christine) run out of resources sooner.
Thank you Christine for the insight.
I am told that Oregon used to have a billboard on I-5 at the California border, "Welcome to Oregon. Now go home." Perhaps the sign is still there, but it's unnecessary. Anybody in their right mind is aware that Oregon is the only government in the whole world where persons more than a few days old can be legally killed against their will without committing any crime, solely if somebody deems their life not worth living. That can also happen to you in Holland and Florida, but not as a result of legislative action. Younger people can be killed with impunity all over the country -- again, not a legislative choice, but a consequence of the fact that we no longer live in a republic subject to the will of the voters. In Oregon the voters made their choice.
In Massachusetts the voters also (through their elected representatives) have chosen the harmful path to destruction -- not this time in the termination of life, but in the maintenance of affordable quality health care, which by the Law of Unintended Consequences they have utterly and foolishly abandoned while claiming to do the opposite. It's a Lie from the Pit.
I was going to repeat here my previous explanation of why the new Mass law will drive up health care costs for everybody, but I see there still are a few sane voices of reason in that state. The Coyote Blog begins with this poignant quote:
What are you guys smoking over there? Here I am in Massachusetts, without health insurance, and with a family of four, and all that has happened is on top of having to pay full freight for my family's doctor bills, I get fined $1000.00 for the privelege.

Other postings in the same blog point out, as I have noticed over the years, that paying cash for medical costs is cheaper in the long run than buying insurance.
I downloaded the full text of the law to look for loopholes. There are a couple, one of which is the empty shell of a religious exemption, but if you ever pay for any medical services the exemption is revoked. In other words, Jehovah's Witnesses are not required to pay for services they don't use, but the rest of us must pay double. A somewhat sturdier loophole is a 63-day grace period between coverage blocks. Provided that you can affirm that you were covered on December 31, the law seems to allow for signing up for insurance on the last day of every other month, then cancelling it the next day. I don't know how much of a refund you can wheedle out of the carriers, and you may actually need to use the service that one day (go in for a hangnail or something) to keep them from retroactively cancelling out the whole enrollment, but it might cost less than the penalty tax. I was hoping I could find a way to declare self-insurance as compliance, but apparently not. Another possibility that might work for me, should I ever have the misfortune to live under this monstrosity, is to point out that my self-insurance has cost me $35/month (with no burden on other taxpayers) for over 20 years, and their cheapest offering is much more expensive. Of course they know that, that's why they want my dollars being put into the system.
They say Governor Romney is hoping this will energize his bid for the White House in 2008. Not with my vote. He was quoted as comparing this to mandatory car insurance, but that's a lie. You don't have to buy car insurance if you don't own a car. Driving, we are told by the Department of Motor Vehicles, "is a privilege, not a right." There are costs associated with that privilege, one of which is being able to pay for the damage a car can do. Bad health mostly does not do damage to other persons or property. Some states provide an exemption to their mandatory car insurance law, if you post a bond to cover the statutory liability requirements. Rich people (and smart people of modest means) put the money in the bank instead of buying insurance, then post the bond if needed. Perhaps that option is not available in Massachusetts. It is reported that Romney wanted a bond-posting option in the health care bill, but the legislature took it out. More's the pity.
Some people are claiming this new law encourages personal responsibility, but that's another Lie from the Pit. It actually discourages people from making responsible health care decisions by having that option taken away from them. Now their only remaining choice is whether to spend another $10 for an unnecessary procedure recommended by the doctor -- who gets to keep a large piece of the $1000 of somebody else's money that it really costs. That's not a responsible choice, and it will not be made responsibly.
From where I sit -- recall that I have the full text of the law, not a regurgitated repetition of the half-baked ideas of some other person who has neither read nor understood it -- the Democrats and crypto-Democrats in Massachusetts are doing for their cronies in the insurance and medical industry what they (falsely) accuse Cheney and the White House of doing for the oil companies, namely fattening their corporate profits at the expense of innocent taxpayers. They call that evil and I agree.
The worst part of this law is that everybody recognizes it as the camel's nose in the tent. The sycophant left-wing bigot media mavens all love it (I was unable to find a single mainstream media writer finding fault with it), as do the homosexuals and others who want everybody else to pay the consequences of their own unhealthy behavior choices. The best we can hope for is that the other states will take long enough to follow suit, so that the inherent problems of this system will become apparent before it spreads over the whole country.
http://www.technologyreview.com/articles/04/10/wo_muller101504.asp?p=1
Although these items are a few years old, it begins to look like the evidence for global warming is not significantly better than the evidence for Darwinistic evolution -- in other words, there is none. I am thus inclined to repeat my suggestion that this is not a problem requiring massive scientific and political attention, and that the only reason it gets so much press coverage is that it is the one putative large-scale problem for which the present Bush administration is not responding in a way likely to mitigate whatever problem might exist. In other words, it is an opportunity to bash President Bush, which the left-wing bigots controlling the media are only too eager to do.
http://www.john-daly.com/cause/cause.htm
I consider it unfortunate that Christians have jumped on the bandwagon without giving adequate attention to the facts. Well, not all of them. There still are thinking Christians. Josh, for example.
My essay "A Christian View of Climate Change" considers this topic more fully (with links).
Perhaps his interview took place some time ago, before the "tipping point" discoveries in global warming recently reported in TIME magazine, but in this interview Houghton seems to think that the effect can be stopped. The TIME cover story gave no such illusion, even though they urged everyone -- including India and China, which the Kyoto treaty conveniently omits -- to try anyway.
Does anybody honestly believe that government/human efforts can stop global warming? I really want to know.
Today I want to look at some philosophical reasons why the Marxist underpinnings of the Free Software movement consistently fail in the real world. I prefer to use the word "religion" to refer to philosophical questions that ask about ultimate reality, questions such as "Where did we come from?" and "Why should I do that?" -- because questions like these have traditionally been answered by God (or the gods), without reference to repeatable science. That is still the case, except that the gods in question are no longer claimed to be supernatural (although they definitely are still not repeatable science).
Karl Marx famously said "Religion is an opiate of the masses." For Marx, the class struggle, the workers of the world rising up against their bourgeois masters and throwing off their chains, that was the ultimate reality. Somehow that got co-opted into the Marxists themselves becoming the new bourgeois masters, while the workers remained just as enslaved as ever. But the class struggle was an example of the prototypical Darwinist competition between the species, with the prize -- survival -- going to the fittest, which the Marxists of course imagined themselves to be.
There is a fundamental problem with that, which is that Marxist economic theory requires cooperation, not competition. In the economic arena, capitalism is the prototypical competitive sport. Marxists are supposed to contribute freely and altruistically to the public square, and take out only so much as they need. The Darwinist model, however, is somewhat more in tune with how people really operate. Marxism is not a Darwinist economic theory. If and to the extent that Darwinists accurately describe "the selfish gene" as the way things work, Marxism is doomed to failure. And so it was.
Longtime readers of my blog know I am no friend of Darwin. Nevertheless, Natural Selection is a wonderful insight for preserving a species in a changing and unpredictable natural environment (the same property works against forming new species). The same principle applies in the economic domain: some variation of Natural Selection drives capitalism on in preserving the economic resources of each participant, just as selfishly as Dawkins' genes. And just as brutally.
There is an alternative: altruism. Darwinists deny that there is such a thing as altruism; it doesn't fit in their theory. Because of their hostility to religion in general and Christianity in particular, Marxists would be inclined to agree -- except that Marxist economic theory only works in an altruistic environment, where each member works for the common good. The first recorded communism was in fact Christian [Acts 4:32-35]. It didn't last very long, because even committed Christians act selfishly from time to time.
Several years ago I read about an experiment -- I think it was Martin Gardner, the Mathematical Games columnist for Scientific American, but I'm having trouble verifying that -- with the Prisoner's Dilemma puzzle, which he tried on 50 real people with real money, and he was utterly astounded by the results. Being logical, intelligent people, they should all have made the same choice, but about a third of them chose to defect, and another third confessed that they would have if they thought nobody was looking. The numbers are credible and repeatable: one third altruists, one third selfish, and one third driven by what other people will think. The Marxists expect everybody to be altruists; the Darwinists know everybody is selfish; and the Christians are the altruists whose good behavior influences that other third to good behavior. That's what makes America work, and Marxism does not offer it.
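The three-way split can be sketched as a toy one-shot Prisoner's Dilemma. The population thirds come from the experiment described above; the payoff numbers (3/0/5/1) are the standard textbook values, an assumption on my part since the original stakes are not given:

```python
# Toy one-shot Prisoner's Dilemma with the three behavior types described
# above: altruists always cooperate, the selfish always defect, and the
# image-driven third cooperate only when somebody is looking.
# Payoff values (3/0/5/1) are the textbook convention, not from the source.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def choice(kind, observed):
    if kind == 'altruist':
        return 'C'                       # cooperates regardless
    if kind == 'selfish':
        return 'D'                       # defects regardless
    return 'C' if observed else 'D'      # image-driven: depends on witnesses

def play(kind_a, kind_b, observed):
    return PAYOFF[(choice(kind_a, observed), choice(kind_b, observed))]

# When nobody is looking, only the altruistic third still cooperates --
# matching the roughly two-thirds defection the experiment reported.
kinds = ['altruist', 'selfish', 'image-driven']
defectors = [k for k in kinds if choice(k, observed=False) == 'D']
print(defectors)   # ['selfish', 'image-driven']
```

Under observation the defecting share drops to one third, which is the point of the paragraph above: visible good behavior changes what the middle third does.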
The most articulate and vocal spokesman for the Electronic Frontier Foundation, Lessig commonly promotes his ideas about "free software" in print. Few have the temerity and intellectual horsepower to point out his errors. I doubt I'm any better at it than the others, but it's worth a try.
To his credit, Lessig correctly points out a serious flaw in American copyright law, and I agree with him that the legislative trend is in the wrong direction. Then we part company.
Lessig severely damages his credibility, if not his veracity, by tying his position to Richard Stallman's notion of "copyleft". Stallman is a Marxist -- if not in name, then certainly in disposition. Marxism, you will recall, is an economic system that was given a fair trial in a number of countries, and then overwhelmingly rejected by the people forced to live under it. It now survives only on the faculties of a few universities, where its proponents were never forced to endure its consequences, and among a few has-been eastern-bloc bureaucrats who likewise escaped them. The basic premise of Marxism, as stated by its early promoters, was "From each according to his ability, to each according to his need." In practice, this devolves into something closer to "Put in as little as possible, and take out as much as possible." Capitalists go at it the same way, but because what you can take out is closely coupled to how much you put in, people are motivated to put more into the system, thereby raising the wealth of everybody (even the slackers and Marxists among us).
Stallman and Lessig promote a more limited form of Marxism, limited to intellectual property only, but the principle is the same. The most vocal proponents of "free software" hope to take more out of the system than they put in. Those actually making substantial contributions -- I'm thinking here of Linus Torvalds, creator of Linux -- tend not to promote the concept quite so vigorously; he seems to have little or no interest in taking software out of the system, but it was a convenient way to promote his own skills in the otherwise predominantly capitalist labor market. Really creative people are like that: excellence comes so easily to them that giving a little away now and then is no big deal. I've watched this effect numerous times in different people. It's the "suits" (pointy-haired managers who produce nothing on their own) who try to guard what little they have and acquire what they don't have.
Marxism is a fine idea in a zealot community where everybody believes in the same cause and they are willing to work together to achieve it. In the real world these communities are small and outcast. It is also important that their shared cause be something other than personal gain. The Marxists at universities (including both Lessig and initially Stallman) are on adequate salaries; they don't need to worry about how to feed their families or pay their rent. They are not promoting Marxist educational systems (they do not hold classes for, nor grant degrees to, the public at no cost). If they did, they would soon be out of a job and forced to market their labor like the rest of us. Marxism only works (if at all) as a minority position in a much larger market driven by capitalist forces.
The so-called "copyleft" notion has another serious flaw: despite what they tell us, it is not about truly free software, free from all restrictions. There is a long-standing way to make something truly free, and Lessig and Stallman both know about it, and chose not to do that. It's called "public domain." The Gnu Public License is not free; if you build on a GPL property, you are forced to put your labor into their Marxist pool. You pay for the right to build by your labor. You make no required payment at all to build on a public domain base. You are truly free to release your work or not. That is freedom; the GPL is a form of slavery.
Despite their best intentions, the GPL only works by depending on copyright, as Lessig correctly states. When that copyright expires (70 years after the death of the author, under current law), all that "free software" becomes truly free, with no restrictions whatsoever, not even the requirement to put your labor back into the pool. Of course such a term is far too long for any software copyright; the software has long since become worthless. But Lessig is not arguing more sensibly for a reduction in the term of software copyrights. Lessig wants to lord it over the users of the intellectual property in his stable, to specify what they can and cannot do with his copyrights, while denying that same right to the other owners. Lessig is a hypocrite.
Lessig's essay is further damaged by several technical errors. He attributes to a 1909 "error in the wording" the current copyright protection of copies, completely ignoring the original Copyright Law of 1790 which used the word not only in its text but also in its very name. It's about the right to copy, that's why it's called a "copyright." That fundamental idea has not changed since George Washington himself signed it into law. The notion was at least 119 years old in 1909.
Lessig is also unduly alarmist about the technology of "digital rights management" (DRM). I agree that DRM is wrong-headed and an abuse of the intent of the Framers of the Constitution, and that DRM proponents are every bit as greedy in managing their copyrights as the "free software" folks are greedy to take it away from them. But technology cannot prevent copying, any more than it can prevent spam. All legally enforced technology can do is make unlawful copying slightly more difficult. It hasn't even done that for spam, which has increased tenfold since Congress passed the spam enabling ("Can-Spam") act. We have 20 years of experience on the ability of technology to stop unauthorized copying of floppy disks. Nobody bothers to try any more, because they can't. If Congress passes an even more onerous DMCA (the "Digital Millennium Copyright Act," which already makes it illegal to exercise free speech telling you how to bypass anti-copy technology), then more of the work will be done overseas. It cannot be stopped by law-enforced technology, only pushed into other countries.
Pirate copying can be stopped -- or at least slowed down to where it is not much worse than the traditional "fair use" policy that the law and the courts upheld until the DMCA abomination came about -- and it's called "iTunes" (or any of its many competitors), no change in law needed.
Lessig goes on to claim, "No society has ever imposed the level of control that the proprietary culture of digital technologies and DRM would enable." Perhaps not de jure, as Lessig claims is now being proposed, but he didn't say "no government" ever imposed it. The fact is, technology itself imposed that restriction from the dawn of time until very recently. There simply was no way to make the kinds of copies that anybody at all can make today. You could hand-write a copy -- but then no technology or chip being proposed even now can prevent that.
Lessig worries about the limits in creativity when intellectual property owners seek to prevent "remixing" of their properties. Utter nonsense! What happens is that people are released to new levels of creativity unhampered by old forms. Consider what happened when Warner Chappell bought the rights to "Happy Birthday" in 1990 and started vigorously enforcing what is now recognized as an invalid copyright: All of a sudden restaurants stopped singing it and came up with a royalty-free clapping song in its place. When Adobe got greedy with PostScript fonts, Apple and Microsoft got creative and invented TrueType.
Lessig knows about this remedy. He even mentions a variation of it himself, remarking that the Brazilians intend to stop infringing Microsoft copyrights by ceasing to use Microsoft products. I say Bravo!
Unfortunately, free software tends to be worth what you pay for it. Linux enjoys a robust 3% of the market because only 3% of the computer users value their own many hours of time less than the $200 they might save if they bought Windows. Apache wins over the Microsoft product solely because the market for internet servers is so small that it does not justify the expense Microsoft would have to go to, to make their server easier to use than the open-source turkey.
The bottom line is still the bottom line. The people willing to program for free can supply only a tiny fraction of the programming that needs to be done. Who is going to pay for the rest? The government? In a Marxist economy, perhaps, but those countries mostly don't exist any more -- the few left are trying to re-invent their economies as something closer to what works so well here in the USA. Will Brazil's efforts at open-source work? I wish them the best of luck, but programmers -- even in poor countries like Brazil and China -- need to feed their families.
Despite its occasional oscillations and excesses, the free market (including the property rights to support it) does an awesome job of increasing wealth for everybody, while balancing the interests of producers and consumers. When the rights holders get greedy, the market will swing back toward the consumers -- as indeed it is doing with "piracy" -- and when the consumers get greedy, it swings the other way, with DRM. Balance.
Professor Lessig has his head stuck in the 1930s sand.
Millions of people will be inconvenienced, maybe even die from the effects of global warming. Darwinism, which TIME promotes at the expense of more credible alternatives, predicts -- even requires -- that unfit people and organisms will die out and be replaced by more evolved (and thus more fit) species.
Millions of people die from genocide in Sudan and starvation in Zimbabwe, but nobody has the political will to make it stop. Millions of people die from AIDS, but only one country in the whole world has the political will to reduce the incidence of new cases -- and TIME doesn't even have the integrity to report on the success in Uganda.
TIME runs out the usual litany of technical solutions to global warming, but if all of them were implemented immediately and all at once, it would hardly slow global warming down, let alone stop it -- and the worldwide economic cost of doing so would be worse than the effects of unmitigated warming. The Sahara desert was getting bigger long before there was such a thing as global warming, but parched ground makes nice scary pictures for readers who presumably don't know any better.
When water levels rise, people can move away from the seashore to higher ground. They have plenty of time to do so. There is no need to restore and maintain underwater cities like New Orleans. Technology solved the food problem in India, it can solve the drought problem in Africa. If people want to.
Worrying about global warming is foolish. There are things that can be stopped, like nuclear bombs in Iran and North Korea, like AIDS in Africa, the "brain drain" in the USA, but global warming isn't one of them. It just happens to be something TIME can use to berate the Bush administration over; solving those other problems takes more of what he is doing, not less.
My essay "A Christian View of Climate Change" considers this topic more fully (with links).
TIME has a new baby brother to compete with for fiction. WIRED magazine now openly admits to being a "technology-and-culture magazine." Yes, hyphenated like that. I wonder how long before they reverse the order on the two parts to accurately reflect what their advertisers already know.
The current issue -- perhaps as an April Fool joke, but I doubt it -- ran a couple of back-to-back fiction-as-fact stories. The first expresses the hopeful wishes of some biologist/computerist who thinks he has invented a way to work backward from the genomes of living species to their presumed common ancestor. He tested his software by starting with a specified sequence of DNA codes, then after applying what he supposes to be mutations the same as evolution, his program recovered the original sequence. What he doesn't tell you (and WIRED authors are generally too credulous to ask) is how he knows that his test sample matches evolution, which nobody has ever observed in real life. All he did was contrive out of his own head some rule for randomizing bits, and then applied the same rule forwards and backwards, to get the original data. Big whoop-de-doo! Well, I wish him all the fun with his new toy. Ten years from now -- perhaps sooner -- somebody else will show how his results are deeply flawed, and his work will be as discredited as Korean clones. Evolution-driven "science" is like that. Oh, did I mention? This guy is at Santa Cruz. I got a degree there; I know what a nuthouse that place is.
Immediately following is a wishful fantasy by game guru Will Wright. His lead paragraph describes children "in imaginary worlds, substituting toys and make-believe for the real surroundings that we are just beginning to explore and understand." And then, in his progression, "We add rules and goals." He doesn't emphasize it here, but it becomes clear later on: we (as children) add these rules; it is creativity in the child. Two paragraphs later his essay has morphed into a paean to video games (what else? He writes them) where "they [the children] are learning in a totally new way -- [which] means they'll treat the world as a place for creation, not consumption." Let's see if I understand this correctly. The children give up the creativity they add to their own make-believe (before video games came on the scene), and learn something totally different from that (no disagreement there), which is somehow now creative. No, the creative part is what they gave up.
Games, Wright tells us, "start in a well-defined state, ... and end when a specific state is reached." No room for creativity there. He goes on to say that in modern videogames "we're invited to create and interact with elaborately simulated worlds... they actually amplify our powers of imagination." Well, not really. The game software invites you to interact with the world the game developer invented, which operates by the rules he invented, and not any rules the player brings to the game. I know, I have written games too. Wright's own game lets the players add minor customizations, and he tells us the trend is in that direction, but don't be fooled: 99% of the world in a video game was designed by the developer and cannot be changed except as the developer chose in advance. Sort of like evolution, I think: completely designed by the Designer with the appearance of change over time, but nothing really significant.
Will Wright does offer one insight, which I can personally confirm. "Just watch a kid with a new videogame," he tells us. "They pick up the controller and start mashing buttons to see what happens. ... it's the essence of the scientific method." Kids have always done that, even as babies. They throw food to see what kind of sound and blob it makes on the floor or mommy's nice dress. They stick things in their mouth to see what this colorful shape tastes like. But because kids do this, new software no longer bothers with instruction manuals. Not just games: even adult productivity tools must be learned the same way. Or else they create an aftermarket for "Photoshop for Dummies" books. I paid over $1000 for a software development tool that I have to learn the way a kid learns a game, by mashing buttons until something works.
There is an adult way to learn things that somebody else already knows. My father used to tell me, "Experience is a hard school, but the fool learneth in none other." He was right. The important things in life are too numerous and too complicated to learn by trial and error. That's why we send kids to school -- and increasingly, to college -- because they cannot self-teach that kind of knowledge in their lifetime. Even game programmers build their games on a lot of physics and sociology and other sciences that the programmers learned in school or by reading books, not by trial and error.
Games follow a fairly narrow paradigm of specific rules. Kids learn those rules pretty quickly, and if a game does not play by those rules, it's deemed unplayable. I know. If all their learning is in this restricted universe, then kids will have a pretty restricted skill set when they reach adulthood, nevermind what Will Wright wants to believe about his occupation.
"Greg" (not his real name) asked:
As a long time computer scientist, you discuss logic and logical concepts at length on your website, yet at the same time you espouse the Bible and Christianity; a belief that can hardly be inferred from logic. How is this possible?
Well, there is a logical basis for it, which I very carefully outlined in my essay, "What's Really Important". So I pointed him to it and invited his response. There was none. After a while, I pinged him. He was strangely illogical:
One of the things I have noticed about evangelistic promoters of any belief is that they seem to start with the premise that they are right, ... they start with a conclusion and work backwards!
It's like he didn't read what I wrote at all.
One of the things I have noticed about evangelistic promoters of any belief is that if they start making provably false claims about my thought processes, it usually makes most sense to take those claims as a projection of their own internal thoughts and motives. In this case, as usually happens, it makes sense. "Greg" went on to say:
For myself, I can only say that my personal God dwells within me and is known to me alone.
In other words, he has invented a "god" from the whole cloth of his internal feelings, and then wants to assume (by projection) that the Bible and Christianity are based on the same kind of imaginary feelings.
I asked him if he has a problem with the traditional dictionary definition of "god". He didn't reply. I didn't expect him to. You see, he does have a problem with an external, objective God -- not the product of his or my imagination, but we are the product of God's imagination -- because that kind of God cannot be controlled. That kind of God might make moral demands on us that we cannot logically squirm out of. That's rather threatening, and "Greg" knows it. So he started at a more desirable conclusion (what kind of "god" he might be comfortable with) and worked backwards.
The problem is, that's not logical, and "Greg" knows it. Thus his initial question to me.
If I or "Greg" or anybody else tried to redefine any other objective external reality (for example, electricity or food or money or binary math) to our own liking, everybody would immediately call that insane. It is insane.
So I'm stuck with where the logic takes me.
Like "Greg" I don't debate feelings. My essay is about logic and science, not feelings. If you think it's illogical, I want to hear from you. I don't like being wrong, so if I am, I want to change my opinion. Logic can do that. Are you logical?
One of his hobby horses is called preterism. Wikipedia's article on preterism is reasonable but somewhat confusing. The idea seems to be tied up in theonomy -- I think that's another word for theocracy, the kind of government secularists unjustly accuse Bush of promoting -- and DeMar does seem to argue from time to time for an explicitly theocratic American government. Anyway, I read this 3-page article that seems mostly to castigate the dispensationalists (think: Left Behind books), and his explanation of the Bible texts is just as off-center as he attributes to the other guys.
So I sent an email asking for a clarification. He responded, but completely ignored my point that if he just wants to sell me a book, please give me some reason to believe the book has answers. It's weird, like a robot was answering his email, picking out a word or two to index some canned paragraphs. Now I happen to believe robots aren't that smart yet, but people usually aren't that stupid. He told me he doesn't get royalties from his book, but he sure doesn't give them out for free. He doesn't even give the information out for free, like on a website. Maybe he gets a salary, maybe his benefit from book sales is only secondary, I don't know.
Letters to other non-profits get varied results. Focus on the Family always answers my letters. A real, live human reads and thoughtfully replies. My thoughts never make it into Dr. Dobson's office, of course, not at 4000 letters per day, not a chance. Ligonier never even bothered to respond. Ligonier never got a second letter (you know, the one with green stuff in it ;-)
Each of these organizations has an agenda, ideas they are trying to sell in the marketplace of ideas. Focus is huge and has a lot of influence because they always give every letter and caller personal attention. If I were a secularist cynic looking in, I would describe it as good marketing. Well, it is good marketing, even if I happen to agree with (most of) their agenda. Ligonier has a good agenda too, but their marketing sucks. DeMar's group was responsive, but not in a meaningful way. With not much more effort than he spent replying to my email, he could have effectively marketed his book. But he didn't.
Oh well.
1. SOA is not really new; all software is essentially "service-oriented." The only difference is that they are getting better at making these services work together. It's a natural progression of existing technology in an obvious direction -- nevermind the novel moniker.
2. SaaS is also not a new idea. A decade or more ago the vendors were promoting "thin clients" (essentially software as a service, hosted from a central server). You don't hear about thin clients very much any more. Three decades before that, software as a service was all you could afford, because personal (desktop) computers had not yet been invented. The Apple II with VisiCalc changed all that. Business people learned that they could do business calculations on these $2000 desktop appliances that their mainframe SaaS providers could not. It was essentially all over for the mainframes. IBM capitulated and started manufacturing desktop computers. Nobody ever went back.
3. The primary national religion in the USA is personal autonomy. If there are two ways to do something, and one of them gives people more autonomy, kiss the other one goodbye. Desktop computers put control of business computation in the offices where the data was being used. Information Technologists have been trying -- unsuccessfully -- for 30 years to wrest control back. Automobiles and buses get people to work, but buses are cheaper. Everybody drives to work. Movie theaters and DVD players show the same movies, but the image and sound quality is better at the cinema, so everybody rents (or more often buys) DVDs. Plain Old Telephone Service has wires to every house, but more and more of those wires are falling into disuse as families with cell phones in every pocket or purse disconnect the landlines. Autonomy.
4. Some services are necessarily centralized. When solar power or personal nuclear reactors become affordable and convenient for the average home-owner, the power companies will be toast. Until then we are stuck with central power generation. Telephones only make sense because there is a national -- make that international -- grid connecting everybody to everybody else on demand. In some places you can drill for water just about anywhere, but more and more the receding water table and encroaching pollutants are making that impractical. Los Angeles residents get their water from two states away. So water is a public utility. Internet access is like telephone service: you need to be on the grid. Software isn't like that. Maybe there is a financial savings in letting the service provider aggregate the facilities and maintenance costs, but cost has never won out over autonomy, unless it's a vast order-of-magnitude difference, like local power generation. SaaS is not that kind of savings.
5. Here's where the local control wins: the InfoWorld article tells us that "SaaS providers incrementally swap in new functionality, streaming new innovations to all customers at once." Suppose you run an accounting firm. Your crunch time is March and April, and you don't want to have to stop and train the staff on new software during that heavy business time. Or you are a toy retailer, with the busy time in November and December when you don't want to be training staff on new SaaS functions that suddenly made your business processes fail. Or you have a travel agency, with a busy time in June and July. And so on. But the customer doesn't get to say "Please, it ain't broke, don't fix it during crunch time." Besides, there is no time that isn't somebody's crunch time.
22 years ago I switched over to Apple Macintosh. It was wonderful. My productivity took a leap that has never been equaled since. I write software, and eventually I started writing Mac software. Then Apple changed the operating system and everything broke. We finally got up to speed on System/7 and Apple changed the processor. Instead of the 68000 I knew well, I had to deal with the PowerPC and a lot of secret system code Apple was not letting us into. I spent a lot of time getting up to speed on that, but at least the old software still ran (in emulation). Then Apple changed the operating system again -- completely threw out the MacOS and replaced it with unix. The old software no longer works. Some vendors rewrote their software for the new system, but my productivity tools were not among them. Did I mention the vendor? Apple. Right. Apple killed my mission-critical productivity tools and provided no upgrade path onto their new unix system. I still run these tools on a MacOS system (not OSX). Eventually I will rewrite the tools myself, but probably not for an Apple system. Apple has since killed the processor again; it no longer impacts me, but recent programs run (slowly) in emulation, and none of the older stuff runs at all.
When I was at the university, I taught a VisualBasic course. Hey, this stuff is pretty cool, almost as easy to use as Apple's defunct HyperCard! I wrote several programs in VB, one of them now doing mission-critical work in a biotech research lab. When I left the university, I wanted to take VB with me (that is, buy my own copy), but Microsoft had already killed it, and replaced it with an incompatible Dot-Net sound-alike with no upgrade path. I spent a lot of time and money rewriting the VB application in C. At least C is an international standard, and (sort of) portable. I learned my lesson: NO MORE PROPRIETARY LOCK-IN products in mission-critical systems. Am I alone in this insight? I don't think so, I read about it all the time in the trade press.
I have a 50-year-old electrical appliance that I can plug into the wall and it still works. It's working right now, as I write this. I have a 30-year-old telephone I can plug into the wall and it still works. I have some 15-year-old software tools that I still use efficiently, but I can't plug them into any modern computer, because they don't work. At least I can buy older computers to run them on, until I can get them phased out on my schedule. If I were on SaaS I would be up the creek without a paddle. There is no compatibility between software operating systems, let alone between SaaS products. What happens when they unilaterally decide to kill your mission-critical service? Or just fold their doors? Public power and telephone utilities don't do that; computer companies do it all the time. Even BIG companies like Apple and Microsoft kill products without recourse. I can buy an old computer to keep limping along, but I can't buy an old service.
Am I the only one to notice that? I don't think so.
SaaS is not "the Next BIG Thing" until the service providers can standardize on their service so I can unplug from one provider and plug into their competitor and not miss a beat, the way I can plug my appliances or telephones in anywhere in the country and they Just Work. But SaaS providers aren't going to do that. They want lock-in.
I don't want lock-in, and I'm not alone. SaaS is going to be "the Next BIG Thing" for a looong time. But it will never arrive.
Darwin wrote that "Civilized races of man will almost certainly exterminate and replace the savage races." Anyway, it got me to thinking, and I realized that democracy is not friendly to Darwinism. The feeling is mutual. If Darwin were correct, there would be no democracy anywhere, only a super-race of superior beings (Aryans? The magazine article pointed out that Hitler was a good Darwinist) subjugating and eventually exterminating the inferior races. Democracy was invented by Christians (not Darwinists), and continues to be possible in this country because the Christians outnumber the Darwinists. I wrote up my thoughts in an essay, "Darwin and Democracy". Tell me what you think.
I tried looking at the blog software he installed to see if I could modify it to be more in conformance with the policies on this website. It's an open-source package, so I could read it, but it's all written in PHP, which is one of those modern internet languages I had not yet learned. It even has a goofy unixy name: the "P" stands for "PHP" as in "PHP Hypertext Preprocessor" but I suspect the TLA (a self-referential Three-Letter Acronym) originally stood for Pretty Hard Programming.
Anyway, because PHP is open source, they make a valiant effort to publish their documentation in a variety of different formats -- all of them inaccessible to me for one reason or another. Did I mention? I run a high-security system here; the modem that came with my DSL connection has a firewall, but it's undocumented and blocks access to my own website, so it's essentially useless. Therefore I try to stay offline most of the time. I found a wonderful freeware program called SiteSucker which downloads to my hard drive all of a website's pages for offline reading -- except it doesn't work with some sites, including the php.net site. But they offer a variety of downloads: everything in one giant 2MB HTML file (which my browser chokes on), or everything in one flat directory of HTML files compressed into a ".chm" file that this Mac never heard of, or else a hierarchical directory of HTML files compressed into a ".tar.gz" file, which I recognize as a unix format but cannot otherwise decode (despite my best efforts, I have been unable to run a usable unix/Linux system here, and I am unwilling to spend the vast amounts of time and energy required to overcome that problem).
Google was a little help here: I found a freeware "CHM -> HTML" conversion program that runs on a PC, so I was able to unpack that documentation package -- into 5180 files, which the Mac Finder chokes on. No matter, I just don't open that folder. I created an alias to the root index file and I open that, which works until I click on a link to read about one of the built-in functions, which in the original hierarchy is deeply nested in some folder structure that they helpfully flattened into a very long file name that the Mac system does not support -- so it can't find it. I could read the docs on the PC, but I can't copy/paste across the platforms and (worse) the HTML browser there has this exceedingly annoying mushy imprecise scrollbar that does not work properly and cannot be turned off. But at least I can read it, sort of.
They called the software "Simple PHP Blog" but the "simple" part is a lie. With 270 program files to wade through, plus themes and colors and ratings and a whole bunch of other decorations, it is anything but simple. I tried working my way through the code, but it is exceedingly complex. I set it aside for another day -- or perhaps a sleepless night (of which I have very few lately). Maybe when my plate isn't so full of more important battles...
Last week a friend bcc'd me a reply he made to some magazine article, so I went to the magazine web site to see what he was replying to (at which quest I failed), but in the process I stumbled across that magazine's own blog page with a link to the Wikipedia article on RSS. Bingo! This is the article Google couldn't find. Wiki often has logorrhea, but they did link to a French site which actually explained RSS simply. So now I'm up to speed on RSS. I still cannot verify on my own system what I post, so I hope y'all will let me know if something is broken.
Software is the same way. Early in my career my clients would ask if I could re-use older code. I usually told them that I would "hang it up on the wall for inspiration." Once I priced out the cost of adapting the old code versus throwing it out and starting over -- and gave them the choice. They bought the cheaper, more robust, new code.
Software magazines these days are full of ads and reviews and news items about virus and spam-blocking utilities. I think if you add it up, the total cost of this protection software -- which only works 95% of the time -- exceeds the cost of a total rewrite of the operating system to make such malware irrelevant. That garbage can't get in if you don't open the door, and a system designed from the ground up without those holes would be simpler, smaller, and easier to maintain than the current bloatware out of Redmond and the garages of LinuxWorld.
Case in point: I run a desktop system that viruses cannot get into. A few years ago InfoWorld printed some mind-blowing statistics, known vulnerabilities for various operating systems. Two systems were reported with fewer than ten flaws each; both systems are no longer marketed today -- but I still use the MacOS. The next runners-up had nearly ten times as many known vulnerabilities. WinNT (the same code base as current WinXP) and Linux were both three times that, and BSD-unix (the base system under Apple's OSX) was the worst, with twice as many vulnerabilities as Linux or WinXP, and 43 times worse than the (real) MacOS. Why?
My best guess is that the business model for quality software and systems is not profitable. Apple had a quality operating system, but nobody could make any money on it -- except the users -- so they killed it and replaced it with the oldest and most vulnerable system they could find. The developers love it. If they weren't so busy making money on Wintel systems, the spammers and other villains would love it too. And everybody else (except me, of course) gets bilked out of thousands of dollars in filter software and "threat management" utilities and software upgrades and what-all else.
Go figure.
The time for e-postage has come and gone. If it had been implemented ten years ago, closer to when Bob Metcalfe first proposed it, there would be no spam problem today. Now the spammers have had that heady first snort of free money, and there's no turning back the clock. All they will do is move on to other electronic messaging media -- think: cell phones -- as they already are doing.
The current issue of PCworld magazine has a depressing article on spammers -- including an interview with an unrepentant spammer. It's getting ugly, and they tell us it will get worse. But all they can hope for is more filter software.
Spam still can be stopped. It just needs to be made unprofitable. The only people with the horsepower to pull that off are the lawyers, and it's really not all that hard. Just empower the victims of spam to sue ANYBODY who knowingly profits directly or indirectly from unsolicited commercial electronic messages. Let the winner collect actual damages (punitive are better, but not necessary), including legal expenses. The spammers themselves may be beyond the reach of American law, in Israel or the Bahamas or southeast Asia, but they sell their services to American companies; so sue the beneficiaries. The payments have to go through American banks (which make a profit on the transactions!), so sue the banks. The erotic drugs are made by pharmaceutical companies who also sell other drugs in the USA, so sue them. It won't take long for the high-priced lawyers to figure out that $1 in actual damages (salary for the time it takes to delete one spam message in a corporate office), plus $10,000+ in legal fees is not a bad day's work, and it won't take long for the suppliers to refuse to sell to the spammers, and for the banks to require their customers to indemnify them against spam-related litigation. And it won't be long after that when the low-life scum who actually perpetrate this evil on the rest of us go back to holding up stagecoaches and robbing candy machines.
But it's not going to happen, at least not until Congress members themselves start to feel the pain. Just making it illegal, like the recent spam-enabling law "CAN-SPAM", only makes matters worse. You need to empower the people who can do something about it. Congress is disinclined to do that, for two reasons. First, and most important, Congress is elected by money, not votes, so they take care of the corporate interests who provide their primary income. Those corporate interests -- the internet service providers, the software companies selling spam-zapping software, and all the nefarious companies who use spam as an advertising tool -- even the banks and credit card companies -- they make money from increasing spam, not reducing it. Then there is the feeling of power that the Federal Government (including Congress) gets by taking power away from the people and centralizing it in Washington. CAN-SPAM nullified some of the best existing (State) spam legislation and replaced it with the toothless law that now causes spam to grow faster than ever before. It used to be that one major political party favored more government waste and bureaucracy, while the other favored less; now both parties are indistinguishably Big-Government wastrels. sigh
Some technical writers will consciously alternate male and female pronouns when discussing male-dominated technical domains, presumably out of some misguided notion of erasing stereotypes. In this magazine all of the pronouns to non-specific persons are female. I picked one article that gave the email address of the author (a person with a feminine name), and asked discreetly if it was her doing or the editor's. She said that of the seven pronouns in that article, two referred to specific female persons, two she herself had made feminine, and the other three were the editor's work -- and the editors of that magazine, she reminded me, are all female.
Now I have nothing against avoiding stereotyping gender in a field where it is presumably irrelevant. It's not hard to do, and I do it myself. The English language has a long history of using gender-neutral plural pronouns for non-specific persons, even when only one of them is being discussed at the moment. This practice actually predates the imported Latin rule that a singular person must be referred to consistently by a singular pronoun, a favorite of grammatical pedants. Most technical writers achieve this kind of non-sexist détente with the feminists.
Perhaps it was the intent and purpose of the editors to put people on edge, perhaps in retaliation for perceived grievances against themselves. They succeeded: I will not buy their magazine. I will not recommend it. The editors have a socio-political agenda that transcends the stated purpose of the magazine, so they cannot be trusted.
They are also misguided. The computer technology field, more than any other I know about, is a strict meritocracy. You are rewarded for performance, not for superficial qualities like gender or race. If it is dominated by men, it is because the men are willing to put in the devotion to succeed -- and the women complain about the requirements, almost in the same breath that they complain about the gender ratio. The complaints are sexist; the requirements are not.
So why is it grating to read female pronouns that refer to male programmers? Why would it be grating to read exclusively male pronouns referring to (mostly) female nurses? It's simply not the case. It will never be the case -- at least for nurses, because certain personality types, which are characteristically gender-specific, make better nurses. That may or may not be the case with programmers, I don't know. I do know that the proportion of female programmers is comparable to the proportion of male nurses. And yes, both proportions are rising -- at least at the present -- a fact unrelated to the use of pronouns.
The last issue of Better Software on this stack actually has one article with a couple of male pronouns in it. Maybe the editors got religion, or at least recognized that their heavy-handed agenda was counter-productive. Or maybe it was a fluke. I will never know.
When I was a child, our family was not wealthy. We did not throw leftovers away. Sometimes it was, "Come on, Tom, finish it up. We don't want leftovers." You wouldn't believe what that does to a person's psyche. Even today I have a hard time passing up food, even after I have eaten enough. So I cringe when I see perfectly good food go into the trash. But I'm a Christian. More than just being a product of my upbringing, I want to be conformed to the teachings of Jesus.
Did you know the Bible speaks to this issue? God does not seem to have a problem with throwing things away, like for example, walking away from bad decisions. In 2Chron.25:9 the Man of God tells the Jewish king that God can easily replace the 100 talents of silver he had already foolishly spent hiring mercenaries. In Joel 2:25 the LORD promises to replace the years the locusts have eaten. God owns it all, it's nothing to Him. Jesus can feed 5000 men (not counting women and children) out of five pita sandwiches made from a couple sardines (a child's lunch) -- and have twelve laundry baskets full of leftovers! And then Jesus says this very curious thing: "Gather up the leftovers, let nothing be wasted" [John 6:12]. Why? Jesus can make food out of stones if he wants to, why this concern about wasting food? Was Jesus raised in a dirt-poor family too? Well he was, but the expensive perfume slathered all over his feet didn't seem to bother him. Is food different? I don't have a good answer.
So I cringe when I see perfectly good food go into the trash.
The really good insight author Brian Button brings to this article is that the best sample code for other people to look at as they try to use your program code, is the test programs you should be writing anyway. Good unit tests explore the limits of the software they are testing, and they must be kept up to date with code changes, while ad-hoc sample code in separate documentation tends to get out of date as the code is revised (and documentation is not). Nobody likes writing and maintaining documentation, but if the test code is simple and readable, separately maintained documentation is unnecessary.
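The idea is easy to see in miniature. Here is a sketch of my own (the `Stack` class and its tests are my invention, not from Button's article): each test is short and descriptively named, and shows exactly how a caller would use the code, so it doubles as sample code that cannot silently go stale.

```python
import unittest

class Stack:
    """A tiny LIFO stack -- a stand-in for whatever code you are documenting."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()
    def __len__(self):
        return len(self._items)

class TestStack(unittest.TestCase):
    """Each test shows one usage pattern a reader can copy verbatim."""
    def test_push_then_pop_returns_last_item(self):
        s = Stack()
        s.push("a")
        s.push("b")
        self.assertEqual(s.pop(), "b")  # LIFO order
        self.assertEqual(len(s), 1)     # one item left
    def test_pop_on_empty_stack_raises(self):
        s = Stack()
        with self.assertRaises(IndexError):
            s.pop()  # documents the error behavior at the limit
```

Note that the second test is exactly the kind of limit-exploring case Button describes: it tells a reader what happens at the boundary, which prose documentation often forgets to mention.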
His only problem is that very large programs have so many test programs that finding the piece of code you need to look at can become difficult. Button suggests maintaining a roadmap of test programs -- while admitting that this roadmap document suffers the same flaw as all separately maintained documentation. He hopes the tool vendors will eventually get around to solving that problem.
Me, I'm a tool-maker. There is an easy solution to the problem. Easy, that is, for a compiler-writer like myself: Just invent special syntax in the test programming language that designates where this item fits on the roadmap and what the summary line there should be. The compiler can insist on this syntax, so your (test) program won't compile without it, and then it can automatically build the roadmap from all the test programs. The compiler can't force you to put meaningful comments there, but at least the foolish or out-of-date remarks will be there staring you in the face when you add revisions to the test code.
There is more in the article that I won't (and probably shouldn't) repeat here, but the fundamental idea is simple: Keep the test code simple, name the test programs descriptively, and release it to the users to look at. I'm working on a couple projects destined for open-source release, and I plan to use this idea extensively.
Thank you Brian.
Last year the same production team came out with a documentary, Beyond the Gates of Splendor, which describes the situation in more detail, but less dramatically. They interviewed on-screen an anthropologist who had studied a nearly identical tribe (I think in southeast Asia someplace, perhaps Indonesia), but which lacked the ferocity of the Waodani; he was able to compare the two cultures and identify the single point of difference, the high valuation the Waodani placed on individual freedom and autonomy. As these are also highly valued in our American culture, he offered an additional insight: "When you carry this constellation of American values to its extreme, what you get is Waodani."
The documentary used actual footage shot by the missionaries in 1956; a few of those scenes were also used in Spear, but most of it was a dramatization re-enacted in Panama with a different tribe (speaking their own language, with subtitles). I suspect Steve Saint's memory (and/or the screen writers) may have fudged some of the details to make a better story, but not much.
The whole story is personally interesting to me, because like Steve Saint, I also spent time as a child in the Amazon jungle. My family never knew the five families involved in this event, but I did later meet one of the widows, who had remarried and was the wife of one of my seminary professors. I had at the time just finished reading a missionary novel, No Graven Image by Elisabeth Elliot, another of the five. It seemed to me a rather nihilistic story. I said so. "Betty is pretty bitter," she said. I guess Ms. Elliot got over that phase in her life; she has a credible public ministry (speaking and writing) today, with no evidence of that negativity. I can't really fault her for it; I've had a few downers myself, three in the last decade, when things went rather badly. Like her, I got over it.
The sermon yesterday put a cap on my (and Ms. Elliot's) feelings. S is a missionary in a middle-eastern country, where his life is in real danger. His topic was John the Baptist's question of Jesus, after Herod had put him in prison and not long before he was executed. Jesus replied with a reference to the Messianic events going on around him -- almost an exact quote from prophecy in Isaiah -- and then added a curious remark, which S took to be a put-down. "Thank you, John, for your wonderful work as fore-runner. Your job is over. No, I'm not going to rescue you from prison and death. You are where I want you." The five men in Ecuador risked their lives to reach the Waodani. They personally did not succeed, but it did happen, largely as a result of their efforts. S is risking his life in the middle east, convinced that is where God wants him. My personal risk is rather smaller than theirs, but I was thinking along these lines just three weeks ago. Strange how that happens.
33 years ago the Supreme Court of the USA decided that human beings who have not yet breathed their first breath are not "persons" eligible for the protections of the U.S. Constitution. 116 years before that, they made the same foolish decision -- except the poor unfortunate non-persons were descended from African slaves. The Court was wrong on both occasions. It took a bloody war to get the Dred Scott decision turned around; what will it cost this time?
Make no mistake, It will cost. The country is hopelessly divided now as it was 150 years ago, and the Pro-Life folks have the moral high ground. Sooner or later the people of African descent will begin to realize that abortion is racist -- the major abortion provider and promoter in this country was founded with the explicit purpose of eradicating Negroes, and people of color are still disproportionately targeted. Sooner or later the people of Hispanic descent will begin to realize that the political party they voted for has values hostile to Hispanic religious and family values. Hispanics don't want abortion; why do they vote against their own conscience?
Sooner or later. I hope sooner. Give the young women a choice. Let them live.
Two years ago today I was handed a document which I immediately recognized to be the first step in involuntary termination. My immediate supervisor denied it -- I guess he was ashamed of what he was doing -- but subsequent events proved me correct. The administrator with actual hiring and firing power refused to tell me why. He too was ashamed. Other administrators were less careful, so I know why, and they are right to be ashamed. I am not ashamed of what I did. I acted with integrity in support of the corporate agenda they disclosed to me. The trouble is, they were not honest. They were ashamed of their own corporate policy. Don't do that.
Muslims are not ashamed of what they do. Many of them tell you quite openly that they plan to "kill the infidels." It's their religion they are ashamed of. Their god is not powerful enough to make his own converts, Allah needs the guns and bombs of his faithful followers to coerce people. The Christian God needs no coercion, it is self-evident that His precepts are higher and more honorable. The harder the Muslims try to coerce, the faster their own adherents become Christians. No wonder they are so ashamed.
The atheists are ashamed even of what they do. China and North Korea deny that they kill and torture Christians, but any observer will tell you otherwise. The atheists in our own country are ashamed of their religion. It is so unconvincing that even with a state-funded monopoly in all the public schools for the last century, they still cannot convince more than a piddly 15% of the population to believe their religion of Darwin. No wonder they run scared. No wonder they try to deny that Darwinism is a religion. They are ashamed of it. They know it cannot stand on its own feet in a free market of ideas against Intelligent Design. There is no evidence for Darwinism, and they know it. That's embarrassing. If I were a Darwinist, I would be ashamed of it too.
"I am not ashamed of the Gospel of Jesus Christ. It is the power of God" to persuade everyone who wants to know the truth, beatings and bombs and bullets not needed.
It should also be noted that I have registered this phone on the national Do Not Call registry. Federal action like that registry is worthless. The only way to stop harassment is to let the victims sue for damages (including legal expenses). That would stop spam, too. But the Feds won't pass such a law, because they collect too much money from lobbyists who profit from spam and telemarketing calls. Not just the perpetrators, mind you: How much money did people spend last year on spam blockers? How much money did Internet Service Providers (ISPs) spend enlarging their servers and net connections to handle the volume (over 95% spam to my email address) that the excess traffic engenders? It didn't cost the ISPs anything; they passed the expense on to you and me. About the only thing Microsoft can hope for to sell their new Vista operating system is new security measures that would be unnecessary if people could just sue the bums into the poorhouse.