The DVD manufacturers are trying to overcome their disadvantages, really they are. It also helps to scrub the fingerprints off the media; the tapes have a complicated shutter interlock, so you have to work at it to get your fingers and dirt onto the media surface. Anyway, the third DVD player has gotten a little smarter, and it makes a valiant effort to find and play the movie, an effort the DVD producers try equally hard to frustrate. TV show collections are the worst: it usually winds up playing the last episode on the disk, and I must resort to the remote and navigate over to the menu where I can choose "Play All" to get what the poor player couldn't figure out.
The latest movie for the player to fail on was billed as a "Smart Romantic Comedy". Usually those words accompany humorless and plotless stories that glorify sex at the expense of romance, which is why the library has so many of them. The library movie collection is entirely donated, and nobody gives away good movies -- except VHS, because their tape player stopped working. This one might have been a better story, but the player didn't choose to play the regular movie, and I didn't notice that I was seeing a mash-up until it was half over. I guess the player just looks for the longest episode in the menu and plays that. When there is a director's voice-over edit, the player somehow doesn't choose it over the regular movie -- perhaps eliminating the studio logos at the beginning makes it slightly shorter -- and the "Making of..." documentaries tend to be only 20 minutes, which is about all anybody can take of them. This one instead did a kind of mash-up of a "Making of..." mockumentary chopped and sliced into the movie. I didn't realize that wasn't the way it was supposed to play until I kept seeing the same (silent) producer and actor head shots over and over. Maybe I saw (or rather heard: they often played movie audio over production stills and shots of the camera crews) the whole movie, maybe not, but it wasn't worth going back to see the intended movie. Anyway, the mash-up lasted some 20 minutes longer than the box gave for the movie running time, which is probably why the player chose to play it. Apparently the DVD file format is just a series of links to little chunks of content files, so you can get different language voice-overs by using a different index file. The mash-up was just another index file with mangled links, mostly to the same audio and video content.
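The idea is easy to model. Here is a minimal sketch of an index-of-chunks playback scheme; the chunk names and granularity are made up for illustration, and the real DVD-Video format (IFO index files pointing into VOB content files) is considerably more elaborate than this:
    # Toy model: each "title" on the disc is just an index, an ordered list
    # of pointers into shared content chunks. A different index file gives a
    # different cut (or a mangled mash-up) of the same audio and video.
    chunks = {
        "logo":   "studio logos",
        "scene1": "movie scene 1",
        "scene2": "movie scene 2",
        "talk1":  "silent producer head shot",
        "talk2":  "camera-crew footage",
    }
    titles = {
        "feature": ["logo", "scene1", "scene2"],
        "mash_up": ["scene1", "talk1", "scene2", "talk2"],  # same chunks, mangled order
    }
    def play(title):
        for ref in titles[title]:
            print(chunks[ref])
    # A dumb player that just picks the longest title would choose the mash-up.
    longest = max(titles, key=lambda t: len(titles[t]))
    play(longest)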
None of the DVD players have progressed to the level of access that my computer and every tape player has, where I can fast-forward (or backward) while watching, to skip over the commercials at the front, or back up and watch a scene over because I missed a crucial word or gesture. I also fast-forward the tape through the credits, to see who the Foley artist is, or especially the location credits at the end. Many movies are shot in Canada (the idiots running California government seem to have taxed the local businesses out of existence, which should be a warning the current Feds are ignoring), and a lot of early-America outdoor movies get shot in New Zealand. Viet Nam war movies get shot in the Philippines, Bible lands flicks in Morocco. Also, when there are credits for digital effects (always zillions of names), it's fun to try and guess what they were used for. The best digital effects are not obvious during the movie.
I read somewhere recently that some guy named Foley was particularly good at inserting artificial sounds to match the image, so that is now the generic name for whoever does that job in post-production. Less skillful Foley editors will sometimes make mistakes, like footsteps which sound like running on concrete as the guy on the screen runs across grass and dirt, or bangs that don't match the lightning on-screen. Lightning is particularly hard to get right. Distant lightning rumbles five to ten seconds after the flash (one or two miles away), which is the flash they showed; when it's close enough to get the high-frequency components of the "craaack" sound, it's too close to see much of the bolt. Real lightning when it's that close also includes some follow-up rumble, because the bolt is actually a mile or so long from sky to ground, and not all of the sound arrives at the same time. Foley artists don't seem to know much physics.
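For reference, the flash-to-thunder arithmetic is simple: the light arrives essentially instantly, and sound covers roughly a mile in five seconds. A quick sketch, taking the speed of sound as 343 m/s in ordinary air:
    # Flash-to-thunder delay gives the distance to the strike:
    # sound travels about 343 m/s, roughly a mile in five seconds.
    SPEED_OF_SOUND_M_PER_S = 343.0
    METERS_PER_MILE = 1609.3
    def strike_distance_miles(delay_seconds):
        return delay_seconds * SPEED_OF_SOUND_M_PER_S / METERS_PER_MILE
    for delay in (1, 5, 10):
        print(delay, "s delay is about", round(strike_distance_miles(delay), 1), "miles")
    # 1 s ~ 0.2 mi, 5 s ~ 1.1 mi, 10 s ~ 2.1 mi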
Oh well. It's not like seeing the movie is an important thing to do.
Sound scholarship always avoids extremes and seeks to build on empirical data and consensus views. -- Wm Dever in Biblical Archaeology Review
It's a remarkable definition from a recognized scholar in his own field (archaeology). Let's unpack it:
"Empirical data" is the least controversial component here. Entertainers, including authors of fiction, do not need to feel constrained by the real world. Indeed they are more successful when they are not. The rest of us must be constrained by the real world if we are to survive. There are exceptions...
"Consensus views" is rather more interesting. Three years ago Michael Crichton was quoted as saying "If it's consensus, it isn't science. If it's science, it isn't consensus." In the historical, umm, "sciences" such as Dever's archaeology, the global climate trends Crichton was addressing, and biological origins (a frequent topic of mine in this blog) there often isn't much empirical data to go on. So if you want to do original research, you need to come up with inventive new ways to interpret that limited data, which essentially is a denial of consensus. Dever's comment came in the context of a book review, where he praised the author for expressing ideas we might deem "conservative" (meaning closer to the empirical data), despite that there is a substantial subculture there that wants to reject anything consistent with the Bible. There is no consensus at this time, but despite their repeated lack of confirming data (see my June 28 "Man vs the Bible" post), the so-called "minimalsts" are slowly growing in numbers. A hundred years ago the biologists were experiencing the same demographics (and corresponding lack of supporting data), but it is now politically incorrect (and very likely to lead to unemployment) to argue the data over the "consensus" in biology. Biblical archeologists have one factor limiting their abandonment of the data: much -- perhaps most -- of their funds come from conservative Christians, who are more likely to support conservative (that is, data-driven) scholarship.
The "avoid extremes" part is a little more subtle. Extremity is essentially
a popularity issue, like the "consensus" item. All innovative new insights
are by definition "extreme", as are also older, possibly valid and data-conforming,
ideas after the whole population has abandonned them for the latest unsound
fad, such as global warming or Darwinism.
"Our Christ sacrificed his life on the cross for our sins. What has your prophet done for you?"It's an excellent question. The next question was not asked, but her persecutors implicitly answered it:
"Our God can fight His own battles, and without any human help at all, will send to eternal death all who oppose Him; do you believe as much about yours?"The Muslims themselves are ashamed of their prophet and their god. They do not believe their god can do any of the mighty acts we Christians depend on the Christian God to do. By attempting to "kill the infidels" by evil human methods (beheading, shooting, burning down their houses, etc) they implicitly admit that they do not believe their god is great enough to carry out the destruction himself.
A tiny number of so-called Christians have in the past lacked saving faith in the power of Almighty God, and sought to do such acts as God reserves to Himself -- the Spanish Inquisition and the Crusades are often mentioned in this context -- but Muslim blasphemers every year kill more of their opponents than the Inquisition and Crusades together did over their entire existence. Furthermore, most Christians (following the advice of Jesus Christ, who taught us to "love your enemies") repudiate violence against unbelievers. Our God is great enough to convert them without coercion, and often does. If there are any Muslims who are not ashamed of their prophet and their god, let them repudiate the killings and the burnings. Let their god fight his own battles.
Unless of course the Muslim blasphemers are right, and their god is as impotent as their actions tell us they believe him to be. It seems their actions might be more blasphemous than burning the Quran.
What surprised me most is that the scientific arguments the authors bring against Darwinism are well-documented and current. The alleged similarities between human and chimpanzee DNA have attracted a lot of ink lately (see my "Extreme Science" post two weeks ago). It was one of the main arguments Collins used to defend his own not-very-scientific persuasion (see my post two years ago about Collins). The new Nevin book shows a visual comparison between the two, and sure enough, the long ones line up with the long ones, and the short ones line up with the short ones. They don't say, but with a little reflection and reading between the lines, it's apparent that they numbered the chromosomes by size from large to small, so of course the big ones pair with the big ones. In fact, they didn't line up so well until somebody guessed that human chromosome #2 looks like it could have fused from (I think it was) ape #17 and #19. So they renumbered the two shorter ape chromosomes to be #2p and #2q. There is a rush among the evolutionary biologists doing genetic research to confirm the apparent similarity.
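The numbering-by-size point is easy to see with a toy example; the lengths below are made up, and the only thing illustrated is the arithmetic fact that any two unrelated lists, each sorted longest to shortest, will pair long with long and short with short:
    # Two unrelated lists of made-up lengths, each sorted largest to smallest:
    list_a = sorted([48, 37, 5, 22, 61, 14], reverse=True)
    list_b = sorted([9, 55, 30, 44, 18, 26], reverse=True)
    for a, b in zip(list_a, list_b):
        print(a, b)   # big pairs with big, small with small, by construction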
This book mentioned one of those recent findings:
It was reported that the presence of secondary alphoid DNA was not just in human chromosome 2 but also in human chromosome 9. To complicate matters further, others reported finding secondary alphoid DNA in all primates. Subsequently, other scientists hybridized twenty-one different chromosome-specific human alphoid DNA probes to the full complement of chromosomes from chimpanzee, gorilla and orang-utan. They found that the majority of the human probes did not hybridize to their corresponding equivalent ape chromosome but rather gave positive signals on non-corresponding chromosomes. They concluded that alphoid DNA sequences showed little conservation in the primates. [p.154-155]
If you look closely at the picture on page 152 (I did not get a good scan from the book, so I cannot show it here), the light and dark banding is similar in some places, and very different in others. Apparently there is a lot of similarity between the respective DNA (a lot of our bodies are similar in structure and function), but it's not in the same places in the chromosomes. That suggests that a Designer recycling His own designs might be a better explanation than descent from a common ancestor, as I was saying more than 20 years ago (I believe my phrase at the time was "tightly coded subroutines").
I also learned from this book that the different number of chromosomes is not necessarily a problem for reproduction:
In addition, there are other examples of chromosomal fusion within individual species. Perhaps the most remarkable is the Muntjac deer. The Indian MuntJac (Muntiacus muntjac) possesses the lowest chromosomal number in mammals (six chromosomes in the female, seven in the male) whereas the Chinese MuntJac (Muntiacus reevesi) has twenty-three pairs of chromosomes in both sexes. They look identical and can interbreed. Variations in chromosomal numbers have also been reported in other mammals including the humble house mouse. So what is clear is that chromosomal number reduction within a specific lineage can be tolerated in some circumstances.
The situation -- like all of God's creation -- is rather more complicated than the simplistic models we humans like to contrive.
Several of the seven ideas propose live research on humans in various states of development, with possible risks to future health and well-being. Some of these are not so different from what is being done today (with government funding) on human embryos killed without their informed consent for the sake of research, so I don't understand what the WIRED editor's so-called ethical objection is. We kill a million people every year in this country alone with no research benefits at all, and part of our current President's platform when he ran for office was to keep them dying. He has done what he promised on that one point. I do not see WIRED calling for an end to the slaughter.
Other proposed experiments involve adults. Christian values praise voluntary self-sacrifice for the benefit of other people, because even if we die in the process, the duration and quality of our eternal reward greatly exceeds the "light and momentary afflictions" of this present life. Pagan ethics praise coerced sacrifice from people (generally not including the ones doing the coercion, although they wouldn't say that), as in ObamaCare. The only difference between the experimentation eschewed by WIRED for alleged ethical reasons, and the social experimentation forced on the American public by the politicians they openly supported, is that these people are in denial about the damage done by the laws they approve. The stated ethical concerns in all of these cases seem disingenuous at best.
The seventh and final experiment they propose is a little different from the others: In view of the alleged similarity between human and chimpanzee DNA, they propose to cross-breed human+chimp and see what happens. Nothing will happen. That's the Christian prediction.
They call it "unethical" not for any actual ethical concerns -- if they are wildly successful and get an actual "ape-man" fetus, they can kill it before birth with no more ethical problem than happens a million times every year in this country with the blessing of all the left-wing bigots who might otherwise expect and/or hope to see such an experiment succeed. A good Christian would call such an attempt unethical because God said not to do that, but Darwinists don't believe God's commands apply to us and especially not to themselves. No, they call it "unethical" to preclude anybody actually trying it out and showing what a disaster it would be.
You see, humans and chimpanzees have a different number of chromosomes. They don't line up. One of the safety factors God programmed into the genetic reproductive system needs these genes to line up and match, code for code. Variations in the codes (ones for zeros and vice-versa) are designed into the program, but a complete missing strand can't reproduce properly, and the union never starts to grow. Horses and donkeys have a similar disparity, which I guess is why mules are sterile. The author of this item is hoping for as much. The Darwinists aren't telling us, but the chimp DNA is also in a different order than the same genes in a human. They won't line up because they are not descended from a common ancestor. That very idea is anathema to a Darwinist, so nobody ever publishes any data supporting that conclusion. They would lose their government grants if they did.
I think the chimp+human hybrid has already been tried, probably dozens of times. They just don't dare publish their results, because it looks so bad for the government-funded established Darwinist religion of this country. So people keep trying -- and quietly failing.
Do not go hastily into [competition], for what will you do in the end if your neighbor puts you to shame? -- Prov.25:8
Several times in my life I started out to attain some goal, then looked around and evaluated my competitors, and decided I could not beat them and dropped out. That's what competition is all about, and the smart ones get out before they waste a lot of resources on a lost cause. The honest capitalistic market encourages that kind of self-evaluation, but sometimes the honesty must be enforced externally. This time it was not, and the choice was denied me. I still got out, but in the Providence of God it was for other reasons, and only afterward did I learn the losing nature of my entry.
I do not begrudge the company for seeking the "best" products for their web-based delivery system. They didn't exactly lie to me, and maybe their business model does not really allow the competitors to evaluate each other. If so, that's an unfortunate flaw in their system, which might tend to limit the quality of the products they offer. I did observe that effect in the products they already had on display.
It would have taken me six months of hard work to produce a product for their market that I considered worthy of my name. I more or less said so, but they wanted it sooner. About a month into my efforts they disinvited me from the competition, and five days later I saw their general announcement of the product they had chosen over mine. That other product had obviously been in preparation long before I got started.
A few years ago I heard about a website that let companies post the requirements for custom software, and invited programmers to bid openly on the jobs. I looked at a few of the items and the existing bids: programmers in India were bidding $200 for jobs I wouldn't do for less than $2000. I never went back. I can't compete in that market, but I had the choice. I believe the clients get what they pay for, but there's no way in that forum to sell my superior quality. They go there for cheap, and cheap is what they get.
Several decades ago, I was looking for a university to get a PhD in computer science. My first pick was Stanford, and I eventually did my research in the Stanford library. But the admissions officer there told me they only accept one applicant out of twenty (or something like that, I forget the exact number). "We see our students as an investment," he said. Picking up on the business terminology, I asked, "And what return on investment [RoI, a business term] are you looking for?" He looked me in the eye and said "We want our graduates to be Nobel Prize winners." I know I'm not Nobel material, so I went elsewhere. I had that choice.
There's more to be said about how the global market destroys quality, maybe some other time.
Now I'm casting about for something I can produce where the competition does not have a head start on me. I had that in the Mac software market, but Apple pulled the plug on the Mac and replaced it with Unix. There's no way I can catch up to a market technology that was 30 years old when Apple went for it. But I do still have the choice to evaluate my own options.
People say I write well. There are a lot of writers and writer wannabes out there. Can I compete? I don't know. Unlike software and education products, there is a big market for fiction and the readers keep buying new products, sometimes from new authors.
My compiler book (education, not fiction) got published because I had a student who knew the publishers. His name is on the cover for getting the contract, and he got it because he had an inside track; we weren't merely one cypher out of hundreds competing for the slot. I keep sending resumes to colleges announcing teaching positions in computer science, but more and more of them are using robots to screen applicants, which reduces us all to indistinguishable cyphers, and in this (Obamanomics high-unemployment) economy there are a hundred or more applicants for every position; at least the cost of competition (sending out a resume, or filling in the roboform) is low. I have the choice.
Then I started working on that hopefully revenue-producing job, which required a video of (some) computer doing what I was (voice-over) talking about. OSX is somewhat behind the competition in several ways, one of them being the ability to make screen-capture videos of a specified resolution, so most of my time was spent in Microsoft's MovieMaker in WinXP, and in GraphicConverter on my Mac, tweaking screen-shot stills to be used in the movie. I did some screen captures in OSX for inclusion; they also needed their dimensions adjusted in GC. Everything else I still do on the Mac, so it got the lion's share of my attention, followed by most of the rest of my time in MM on WinXP, and only a much smaller fraction of time in OSX.
A few days ago I perceived that this movie project was not going to produce the anticipated revenue, so I dropped it and started trying to learn how to build OSX software. I have not yet succeeded, because Apple's Xcode development system seems more chaotic and poorly documented than Microsoft's VisualStudio, in which I experienced similar startup difficulties. But I hope to get up to speed eventually, as I did in VS.
The interesting thing about this particular week is that I spent significant time in all three systems, and I experienced software crashes in all three systems, two each on the Mac and Windows, and three in OSX, which is where I spent the least active time. Both failures in the Mac were caused by a third-party program that regularly crashes, perhaps every third or fourth time I use it. Both failures in Windows were in Microsoft's own MovieMaker, which I had not previously used. All three OSX failures were in flagship Apple software, one in their Safari browser, two in their Xcode development system.
That's rather remarkable: the system which the vendor promotes as "most stable" experienced the most crashes with the least actual use, and in premier vendor software. The MacOS, which regularly gets jeered as "unstable", only failed in flakey (I call it "unixy" because it appears to be designed by a person or team with preferences for that kind of system) third-party software. The only other software that fails on the Mac -- besides my own, while I'm still debugging it -- is another unixy program, and then only when it visits poisoned web sites (which I mostly avoid); that did not happen this week. I do not allow WinXP anywhere near the internet, so I can't say anything about its robustness online (which I obviously do not trust), but as far as I can recall, this PC has never crashed for me before this week, and the failures I did experience were in a program that normally sees very little use, and might thus be excused as inadequately tested.
This is not a statistically valid sample. It's just illustrative of my overall experience, that unixy programs crash a lot, no matter which system they run on. Of course on a unix system like OSX, all the programs are by definition unixy, with a correspondingly higher overall failure rate.
He who writes the dictionary wins the debate.
The atheists do it all the time. They define "science" to mean "excluding supernatural effects," so by definition, there are no supernatural events to be studied. But the Christians are not immune to the disease.
The current issue of Acts & Facts has an unusually long article "How Natural Selection Is Given Credit for Design in Nature" where they base their entire argument on the artificial definition of "selection" as the product of a Selector (person, namely God). One could with equal validity argue that "sorted" (like the size of fossils in turbulent Flood waters, a Creationist argument promoted by A&F founder Henry Morris) requires a Sorter. We might prefer to believe that a personal Selector or Sorter is involved, but to present a case against Darwinism that young people (who have not yet made up their minds) can buy into, you must show that the Darwinist argument fails on its own grounds, using its own definitions.
The Darwinists credibly compare Natural Selection to a farmer who selectively breeds his crops or animals by choosing which traits to favor in the breeding stock for the next generation. The desirable traits survive by producing offspring; the undesirable traits die out by not being bred. Natural selection works the same way, but without any person making the decision, just the better ability of those traits to confer survival to the bearers. It's a reasonable analogy.
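The analogy is simple enough to act out in a toy simulation. Everything below is made up purely to make the effect visible (the trait names, the survival odds, the population size); the only point is that differential survival, with no person choosing, is enough to shift the population toward the hardier trait:
    import random
    # Each individual carries one of two trait labels; the "hardy" trait
    # gives better odds of leaving offspring. Nobody chooses; the
    # differential survival does the selecting. All numbers are arbitrary.
    random.seed(1)
    population = [random.choice(["hardy", "frail"]) for _ in range(1000)]
    SURVIVAL = {"hardy": 0.6, "frail": 0.4}   # chance of leaving offspring
    for generation in range(20):
        parents = [p for p in population if random.random() < SURVIVAL[p]]
        population = [random.choice(parents) for _ in range(1000)]
    print(population.count("hardy") / len(population))   # approaches 1.0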
The Darwinist error is not in the definition of the word -- choose any other word, perhaps "breedability" or "survival", and the analogy is still valid -- but in the fact that there is no mechanism for injecting survivable traits into the genome that does not also inject far more fatal traits, so natural selection has its metaphorical hands full just keeping the organism alive and well without killing it off each generation. I can't put my hands on the reference today, but I read a few years back of research showing that directed "evolution" over multiple generations stopped improving and actually went backward. Obviously the Darwinists who control the media don't want real facts like that out where people can see them. Natural selection preserves the genome, rather than enabling unbounded change. It's a much stronger argument than quibbling over the meaning of words.
This glass brick does not have a well-defined way to tell it to stop right now. In fact, it easily gets into a mode where nothing seems to get it into a known state. For a phone whose nominal and main purpose for existence is so I can call 911 if I fall or get hurt, it is arbitrarily hard to do so.
What if I get something in my eye and can't see? I can find the power button, but suppose it's already on and in one of those goofy states where the power button does not turn it off? Maybe it's not in that state, and I got it turned on. The screen is locked until I rub my finger across it diagonally at just the right pressure. Too much or too little, and nothing happens. There's no way to know if it took without looking. Then to make a call you have to push the button in the lower right corner. It has a telephone icon on it (no help if I can't see), but I can remember which corner it's in. Then can you make a call? No, you have to select one of the touch panels to bring up a touch-panel version of a telephone pad. Good luck on feeling where the buttons are.
The clamshell phone, I flip it open, feel where 9 is in the bottom right corner, press it, then slide my finger up diagonally to the 1 and press it twice, then slide my finger just past it to the next button (which happens to have the green word "SEND" on it) and press that, and my call goes through. I can do it with one hand and my eyes shut.
Back to turning things off. I keep accidentally triggering calls to numbers I don't want to call. On a desk phone, that doesn't happen, because I have to pick up the handset, then press the digits. But if I ever do, I can stop whatever it is doing now by replacing the handset on its cradle. Electronic phones with a squawk box, I might need to push the speakerphone button a second time to turn it off. The clamshell, I just close it, and it stops now. On this glass brick, I need to slide that red stop lever to the right. It usually doesn't take on the first or second try; then the phone locks up and I must first press the power button -- which doesn't turn it on or off, it just unlocks the touch screen for a couple seconds -- so I get another shot at sliding that red rectangle to the right before it freezes up again.
It has an alarm clock. The clamshell phone has nice musical tunes I can program separately for my wake-up and the people who might call me, but this brick only makes various similar-sounding noises. When the alarm goes off, there is that little red rectangle again, and with a few tries, I can slide it to the right, and the alarm stops. For five minutes, then it sounds off again. To actually stop the alarm, you must slide the little red rectangle to the left. Sliding it to the right only activates snooze, which in smarter phones happens automagically if you do nothing at all.
The previous two phones had a choice of musical ringtones, but this one only has a variety of indistinguishable and obnoxious noises. It has internet access which lets you go online to buy ringtones, but nothing worth paying for. I tried Googling for ringtones, and found a couple free sites, but the site terms were unconscionable. So now I have no way of knowing who's calling.
The worst feature of this phone is its touch screen, which tends to select things I'm not pointing to, or select things when I'm not even touching the screen, and more often than not refuses to select what I do touch. The first thing it did when I powered it up was put a "Missed event(s)" message in the middle of the screen, with no way of dismissing it and no response to anything I did to it. I pressed all six buttons in various sequences and combinations, to no effect. I touched and slid my finger all over the screen, to no effect. I even tried turning it on and off, including holding the power button for several seconds, which usually forces most computer-like things to quit whatever they are doing, but to no effect. The minuscule "All you need to know to get going" user manual said nothing about it. I downloaded the vendor's 100+ page PDF -- I hate PDF, it's slow and hard to read -- from the internet, and it said nothing at all about it. I must have struggled with this for an hour or so before something -- I know not what, but I was pushing on the speaker bump about that time -- cleared it.
I still have difficulty figuring out how to navigate to one of the few features on this I actually want to use. I suspect that is a casualty of the small screen and the need for oversize buttons to activate things (they obviously aren't big enough). The farther we get from 1984 (when the Mac pioneered and simultaneously crested the peak of easy to use), the more like 1984 (the book, whose story line describes the opposite of friendly to people) our equipment becomes.
This thing is so hard to operate, I am convinced that if I never get another touch screen device in the rest of my life, it will be too soon.
In the last couple decades, BAR has run a number of articles promoting or criticizing an anti-Christian position called "Biblical Minimalism" by which its adherents mean that the Bible text should be considered counter-productive in doing Middle Eastern archaeology. If the Bible states that there were historical Kings David and Solomon who ruled over a vast united Hebrew Kingdom, then -- so these people claim -- there were no such kings nor kingdoms; it was all invented hundreds of years later. Aside from the particulars, people have been announcing the scientific death of Biblical literalism for centuries. The only deaths recorded in the debate are of the critics and their theories. Something over a century ago, the Bible was all wrong because there were no such people as Hittites. Then they found and excavated the capital of the Hittite empire in Turkey. The Bible could not be an ancient document because there was no writing -- until they found a huge library of cuneiform tablets dated more than a thousand years before Moses. The so-called Minimalists argued against the historicity of David and Solomon -- until an undisputed reference to "the house [dynasty] of David" was found on a victory inscription dated only a hundred years later. So on page 46, BAR proudly announces (for eight full pages, plus notes) "The Birth and Death of Biblical Minimalism" as if any of their readers -- most of us actually believe the Bible -- are as surprised as the minimalists themselves. It's a fascinating read.
A few pages earlier is a one-page commentary on the political climate in Egypt and its neighbors provocatively titled "The Pharaoh, the Bible, and Liberation (Square)". Author Ronald Hendel accurately observes that the Bible aggressively promotes the liberation of people under oppression -- not only the Israelites in Egypt, but also the poor people living in the Israelite kingdom under their later kings -- and then goes on to point out (again accurately) that the Bible does not promote any kind of democracy. He neglects to mention that the so-called "divine right" of kings to rule is also not taught in Scripture. God grudgingly allowed Samuel to anoint Saul king, and then later selected David as more appropriate. After that, God was pretty much out of the picture, except to designate a couple replacement kings for the civil-war-torn northern kingdom. Clearly the wicked kings they replaced had no such thing as a divine right to rule. God is involved in all of history. Sometimes He sets up rulers; sometimes He takes them down. Mostly, whatever God is doing in national and international politics is inscrutable to the rest of us. But religious leaders like to think otherwise.
The remarkable insight in this short piece was his observation on how democracy came about in a world where kings ruled by Divine Right, and the Bible had the final say in all matters of faith and practice:
The greatest obstacle to the rise of modern democratic ideals was, perhaps ironically, the Bible... Government authority flowed from the top down, from God to the king, who ruled the people below him. The first people to question this arrangement and to argue that governmental authority must stem from the people -- from the bottom up -- had to somehow undermine the Biblical doctrine of the divine right of kings. To do this one had to argue that the Bible was not a sure authority in matters of government... This move required the invention of modern Biblical scholarship... [emphasis added]
There you have it in black and white. The denigration of the Bible has a political agenda, unrelated to any science or facts. Despite Churchill's famous remark on the quality of democracy as a form of government, it's really not all that much better than any other form -- including divine-right monarchy. Democracy only worked in the USA (note the near-past tense) because, as one of our own Founding Fathers noted, we had the necessary religious and internally moral citizen base. The recent erosion of that moral base is moving us to the same state of chaos that plagues every other so-called democracy, where the people who make it to the top of the power pyramid figure out ways to stay there, until some other tyrant throws them out and they rewrite their constitution (again) to make his position secure against all comers. We have a long way to go before the USA is as bad as whoever is in second place, but all parties are vigorously trying to close the gap. Fortunately, we still have a fairly substantial Christian moral base to hold the power-mongers back.
That's not exactly true. I've been playing video games all week.
A video game is any of those computer programs where the software presents you with some kind of conundrum, and then you try to guess which of the available resources will solve the problem. For example, you might be facing a blue orc on the screen. Can you shoot it with your crossbow? Nope. Light saber? No again. Photon grenade? Still no effect. Laser defragger? That one worked against the trolls on the previous level, but not here. So you finally give up and go on the internet looking for "cheats", hints to solve the problem, and learn that what you need for combatting blue orcs is the bag of pixie dust you neglected to pick up on level 3.
When programming in the Goode Olde Dayes (30 or 40 years ago), you just read the vendor's documentation for everything you needed to know about why their compiler wasn't giving you code that worked properly. Usually it was a cockpit error, and easily fixed. Once in a while it was a "feature" -- in quotes like that, it means "a bug we aren't going to fix." I will never forget the advice in the Univac Fortran 5 manual: if your code gave wrong results, "randomly re-arrange the terms of your expression" until it works. It seems their aggressive code optimizer occasionally made mistakes that could be foiled by a different expression order.
Modern language processors, however, don't have any documentation at all. Microsoft VisualStudio pretends to, but it's inaccessible. If you do a search for one of their error messages (one that the C++ compiler generated), you get 500 links to unrelated topics in VisualBasic, C#, and FoxPro. They cut off the results page at 500 hits, and the C items are not listed first. They have a checkbox for limiting your search to C++, but it has no effect. Fortunately, there are thousands of VS programmers out there, and many of them hit the same problems. Google usually turns up one or two good answers in the first page. That was five years ago.
Now I'm trying to program in JavaScript. I have the language spec from the Sun website (the link is long gone, but I captured and saved the files), and it's been helpful, but one of their examples kills the interpreter. Netscape (the vendor) is long gone, and all its documentation links hard-coded in their browser jump to a generic AOL greeting page with no information about Netscape at all. It's an older browser, so Google can't help much. It only finds what everybody else is looking for.
So I'm back to the video-game method... Maybe if I try this laser defragger...
Last night I woke up around 1am with lightning flashing and crashing all around me. After one particularly close hit (a noticeable delay, but less than one second between flash and crash) I saw a red glow on the ceiling. It was the phone's off-hook light, which I know can also be triggered by being unplugged. I checked the second line, and it also was dead with the light on. So I called the phone company to report the outage.
Now if I were smarter (and more awake), I would have run around disconnecting all the phones and then tried the signal with a phone that had not been connected at the time of the hit. But I didn't, so the phone company sent a guy over to tell me that the signal was OK to the box, it was a short inside the house. Then he traipsed around messing up all my wires and doing (at a probable cost of $100 on my next bill) what I could have done myself in less time at no cost. Except for three telephones, all shorted out. One of them had cost me $150 used. I looked all over for a CallerID-aware answering machine four years ago. Nobody makes any, but I found a used Nortel unit. I groused about it to the phone tech, and he said they're available at Wal-Mart and RadioShack. Not true.
The LORD giveth, and the LORD taketh away. I needed that unit four years ago, when I was getting calls at 2am from one person. That's no longer the case, so now I don't need to spend the time figuring out how to drive it each week. I bought a piece of Chinese garbage at RadioShack (nothing at Wal-Mart) to serve as an answering machine.
I can think of it as losing $300 -- I never lost that much in an earthquake in Calif (and that doesn't even count what tornadoes do here) -- or I can be grateful I have the money to buy a replacement (and to make an equal donation to VoM in penance for supporting the abusive Chinese government). But even the cost of adverse weather here does not overwhelm the savings in the cost of living: in Calif you must drive for miles to get to anything, but here I can walk or bicycle to everything but ALDI (cheap groceries, nothing like that out on the Left Coast) and Wal-Mart, which are a couple of miles away. I fill my gas tank maybe five times a year. So this is not such a bad place to park while seeking gainful employment.
The science of genetics is about how DNA works to form proteins, what those proteins do, and how they differ from other proteins that were constructed by different software -- yes, software. DNA is a digital code for making proteins, and the total genome is a computer program for making a whole organism. It's incredibly complicated, and we still do not know very much about how it all works, but people like Collins -- or mostly the people who worked under his supervision rather than Collins himself -- are learning more every day.
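A small sketch of that "digital code" point, using a handful of entries from the standard genetic code (written as DNA coding-strand codons, T in place of the RNA U); the sample string is made up:
    # Three-letter codons map to amino acids; a stop codon ends the protein.
    # Only a few entries of the real codon table are shown here.
    CODON_TABLE = {
        "ATG": "Met",   # also the usual "start" signal
        "TTT": "Phe", "GGC": "Gly", "AAA": "Lys",
        "TAA": "stop", "TAG": "stop", "TGA": "stop",
    }
    def translate(dna):
        protein = []
        for i in range(0, len(dna) - 2, 3):
            amino = CODON_TABLE.get(dna[i:i+3], "?")
            if amino == "stop":
                break
            protein.append(amino)
        return "-".join(protein)
    print(translate("ATGTTTGGCAAATAA"))   # Met-Phe-Gly-Lys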
What they do not know is Who wrote the software. Collins and a few others say they do, but then they turn around and deny it.
Genetics is not an historical science. In fact, if it were about history, it wouldn't be science at all. Paleontology is in that category: it looks like science, and they use scientific-sounding long words, but on the bottom line it's myths and fairy tales invented by people who refuse to read the history Book, instead preferring to believe children's stories about things like frogs turning into princes. They really believe that one. It's not science. Real frogs don't turn into people, not even in millions of years, and real science has never shown otherwise. For more on how I know that, see my essay on "Biological Evolution: Did It Happen?"
Francis Collins is mentioned in the appendix to my essay (with details here). I ask anybody and everybody who has done peer-reviewed research (Collins being among them) what in their own research supports the Darwinistic hypothesis. Like everybody else, Collins did not reply, but I found some of his work which comes close to answering it. The answer is the same for everybody who believes the myth: their own research does not support the story; they got it from somebody else, Collins included.
The "evidence" they are claiming in denial of the historicity of Adam consists mostly in finding similarities and differences in the DNA of humans and apes. Think of it as like deducing which book evolved from what other book by noticing similarities in their words and sentences. If we find the same whole paragraph in two different books, we start accusing one author of copying from the other. It happens all the time, and sometimes the later author gives credit to the source. But genetics is still studying the words; we haven't gotten to the whole paragraph stage yet. Every book in the English language has the same words in it, "the" and "a" and "is". There was no copying, that's just the language.
I'm reading a sci-fi novel this week, where some of the aliens and anti-religious humans quote lines from the Bible. It's like the novelist doesn't even know he copied. But the lines are different enough from the ordinary things people say that we know where they came from. Composers like Bach and Handel often copied from their own music. There was no evolution; they just copied a part they liked. Why can't God do that?
If they had science (meaning, they could perform experiments and get repeatable results) to prove the Bible is wrong, then I could not remain a Christian. After all, if I can't trust what it says that can be tested, why should I believe what cannot? The Bible itself points that out (see "my BS Detector"). Other people -- apparently Collins among them -- are like the movie Secondhand Lions, where (so they assume) "if it's worth believing, you should believe it, even if it isn't true." Honest people cannot do that. I can't. Even the movie director himself couldn't really do it.
People have been announcing the death of Christianity for hundreds of years. The evidence they offer generally shrivels up and blows away, usually within a decade or two, before the proponents have died off. That's right, the doom-sayers are the ones who die, not Christianity. When we know more about genetics, perhaps in 2020 or 2030, the arguments Francis Collins and his atheist colleagues are bringing against Adam will be as embarrassing as Piltdown Man is today. They will have a new set of bogus "scientific" accusations to bring. It always happens that way.
There was only one actual lion in the story, and it was not particularly clear that it was "secondhand," so I was left to wonder whence the title. Perhaps it was the unbelievable quality of the swashbuckling stories (as a pun on lines or lying) one of the two older guys told the kid about their experiences on another continent. In the documentary on the flip side, the writer-director explained it as a metaphor for their age and their feeling washed up and useless. I know the feeling.
The philosophy behind the flick, exposed mostly in the documentaries, but also in one turn-off line sometime after the middle, was completely bogus. It was the sort of thing you might expect from a pagan writer who grew up exposed to American cultural religion without catching any truth from it. All through the movie you were left wondering if the stories these guys told were fabricated or true, if they had earned their legendary wealth more or less legitimately in Africa, or as bank robbers not yet apprehended by the law. The kid found their stash of cash, which was bundles of crisp high-value currency and bags stamped with the name of a bank, not bags and piles of gold doubloons as suggested by the stories.
The philosophy line did not help with their credibility: "If it's worth believing," he said, "you should believe it, even if it isn't true." The same day I had spent a half-hour defending the rather contrary but totally Christian idea that we should believe what we believe because it is true. The Christian story of Adam and Jesus and the Resurrection is not worth believing if it's not true, and we mean "true" the same way these old guys in the movie meant it. It also happens to be the same way the judge wants you to mean "true" when you testify in court, which is conformance to reality and not a fabrication unrelated to what really happened. Churches are shrinking in America today because we have abandoned our firm foundation in Truth. The truth is there, but too many people have let themselves be bamboozled by unsubstantiated atheist bluster and fairy tales. The cultural American religion practiced in the Bible Belt (the film was set in Texas) "believes" even though (they suppose) what they "believe" isn't really true (like Santa Claus and the Tooth Fairy). And smart people won't buy it. Smart people have not been told the truth -- not by the Christians, and especially not by the atheists -- and they don't realize they need to dig it out for themselves, so they believe the Secondhand Lie and join the government-funded Established religion of the USA, which is the atheist church.
More of the bogus philosophy came out in the documentaries. The writer-director said it was about parenting. The kid's single mother obviously was not doing a very good job of it, because she abandoned the child at her rich uncles' farm in hopes of his finding (and perhaps stealing) their money, but the actress -- you can't really blame her for wanting to believe in the character she played -- tried to defend the floozy. One of the producers gushed over the film because it gave him permission to be an absentee father, the kind where a few minutes of "quality time" are deemed to overcome the lack of quantity, while in fact the uncles did their substitute parenting 24/7 -- except in the case of the four punk teens the older guy had beaten one-on-four in the brawl; he then gave them his "growing up" speech, and somehow they magically turned out OK.
In the end (spoiler alert) the real truth wins out. The stories were true, and as confirmation some of the African family shows up at the funeral of the old guys to pay their respects. We Christians also carry the real Truth, and the "Happily ever after" part of our story lasts a very long time. It really is worth the effort to find out which of the stories are true.
I don't like creepy crawlies. Spiders eat the other bugs, so (other than ants and an occasional fly) mostly I only see spiders. For a long time I would wake up from nightmares of huge cat-sized spiders crawling over my face. Once, as a teen, I woke up suffocating. The family cat had decided to sleep on my face. Maybe it's connected, so I don't tolerate bugs in my sleeping area. Two or three years ago a spider rode onto my bed on my Bible as I picked it up for morning devotion. Now I shake it over the floor first. Yesterday there was a brown smudge there on the floor. My vision is a little blurry early in the morning, so I supposed it might be a spider and dropped a book on it. There was a wet but dead spider stuck to the book when I picked it up. Today I picked up a handkerchief, and when I got it close to my nose, another spider ran off. Earlier, I'd seen a dark spot on the floor near a box of stuff, but the spot was long and narrow. Just to be safe, I picked up the box to drop on it, and the spot took off running, the way spiders do. Maybe I was seeing the shadow cast by my bed lamp.
I was going to say something about the ants now roaming my kitchen counter, but I see I already said it in a post four years ago. I just need to run a stream of ant spray all around the counter. They won't cross it for a month or two; then I need to repeat the application once or twice before the cold drives them off. The ants stuck inside the perimeter wander around aimlessly, suffering from CBS.
Like many guys, the kid (let's call him "Chuck") prefers to believe he knows more than is justified by his cognitive accomplishments. Apple used to design their products so that don't-read-the-manual-just-do-it types like him could actually just do it. That appears no longer to be the case. If a user manual came with the iPod, Chuck already discarded it without noticing its importance. I don't normally throw anything away, but I was not there. There was no usable on-line documentation, but (like Microsoft products, which are similarly undocumented) Googling the failure mode turned up several non-Apple hits. Unlike with MS, they weren't very helpful.
The iPod refused to do anything interesting before being connected to a computer running iTunes, which neither computer in the house had. This is where I came in. With some effort, I found it on the Apple website. It was a 5-hour download. I transferred it to Chuck's computer, where it promptly refused to install because it wanted a password; Chuck gave it his, but that was not the admin password being asked for. Chuck was clueless. I was not involved in setting up the system, so I didn't know what it wanted either. I refuse to configure any of my computers to be disabled without a password (see my post "Stuck Again"). Fortunately, I guessed that the dealer might have set this one up similarly, and just hitting Enter when it asked for the password worked.
Meanwhile, on my sister's computer (the one with internet access), I spent the rest of the evening helping her set up an iTunes store account so Chuck could use the gift cards she had encouraged family (like me) to give him. I refuse to transact financial business on the internet, and this long and error-fraught process confirmed the wisdom of my policy. After I thought the song was successfully downloaded, I tried to disconnect her from the ISP before proceeding to transfer the song to his iPod. Fortunately, whoever had set up her ISP connection had neglected to provide such a tool (she just shuts down the computer each time), because it turned out that the download had another 20 minutes to go. Not so fortunately, it hung on completion. I killed the program and restarted (the usual recovery method for unixy software, just about all of which -- like iTunes -- is full of bugs), and at least it restarted its download and finished successfully.
Speaking of buggy iTunes software, I tried to disable all the unwanted functions of the iTunes install that she does not want to accidentally trip, but it does not offer that kind of freedom. I suspect she will be doing a lot of killing the program and/or shutting down the computer and restarting (the only reliable error recovery procedure since Apple killed the only commercially viable WYSIWYG operating system that ever existed nine years ago) to get out of some mode that she does not want to be in. Apple learned their lesson from the amazing success of the MacOS system (people want to do what they want to do, not what the vendor and the system gurus want to charge them money to do) and promptly replaced their system with something more to the liking of their high priestly preferences.
We finally gave up and called the Apple support number. After a half-hour or so, we finally got back into the iTunes setup, which requires you to be online, goes through the whole EULA nonsense and registration every time, and refuses to let Chuck manage his own music on his own computer (no internet connection, for his own safety) while downloading tunes on his mother's computer under her supervision. She was reasonably philosophical about the extra hassle; it's something she gets a lot of because of his special needs.
Four hours later I finally managed to fake the deletion of iTunes from Chuck's computer, because like those nasty rootkit viruses, iTunes refuses to be uninstalled (it's unixy software, so it just hung = crashed). Even the Windoze uninstaller hung. Trying to delete the file told me I didn't (as "owner") have permission -- another unixy holdover whose primary function is to harass and oppress users. Recalling that Windoze uses hard-coded file path names which break if you move or rename things, I just changed the names of the folders. Windoze complained, but let me do it.
Bottom line: I cannot recommend ANY current Apple product for any purpose at this time. Nor Windows Vista. What a snail! The only thing slower than Vista is Vista accessing an Apple website. That's Apple's fault, not Vista's, because it accesses my site (you are looking at it) very quickly.
My car died in the church parking lot Sunday evening, so I was in the repair shop Monday waiting for the bad news, and I happened to see a TIME magazine with Osama bin Laden X-ed out on the cover. I stopped reading TIME several years ago because I got tired of their left-wing politics in place of news. This was no exception. The editorial, the cover story, and several accessory articles featured the death of Osama bin Laden, yet not one gave the date when it happened.
I hit up Google today: lots of picture and video links, plenty of commentators eager to offer their own ignorant opinions, plus (on the second page) a couple of news stories -- and still no dates.
So I still have no idea when this happened. Maybe when ChristianityToday comes out next month, some commentator will mention when it happened.
Knowing my own opinions on the government-funded ("established") religion of the USA, a friend pointed to chapter 62 of Breathless, which argues against the Darwinian hypothesis from a mathematical perspective. I majored in math as an undergraduate at Berkeley, and the argument in this book is not the sort of thing a mathematician would present, but rather what you might expect from a statistician. I then went back and read the whole book; this "mathematician" is introduced in chapter 7 beating a Las Vegas casino at blackjack by counting cards (and then giving his winnings away to needy people). Later we are told that he specializes in probabilistics (another word for statistics) and chaos. Chaos is the fair-haired child of fiction math since Jurassic Park -- probably because nobody knows anything about it yet. Koontz admits as much in chapter 44, where speaking to an expert biologist, he has the mathematician say, "When we've got a century and a half behind us, if we haven't piled up multiple irrefutable proofs of basic contentions, I'd agree with you that we should stop calling it science and start calling it religion." This is a not-very-subtle slam against Darwin, who published 150 years earlier and still has no primary evidence at all.
It was the abusive actions of the government commandos that most got my attention. As a Christian, I am required to submit to the government (except when it contradicts God's higher law), and the heroes of this story did both. But it took me a day or two to get past my rebellion over the high-handed tactics used against them, and not before I came up with a reasonable defense of my own. Here's the scenario:
Before being subjected to the session, she had been provided with a statement signed by [the government man] in the presence of a witness, stating that no information obtained herein could be used against her in any court of law and that she was immune from prosecution for any matters touched upon by his questions and her replies.
I don't think so. There is a third way, which I see as inspired by Jesus under similar circumstances: say nothing. "No comment." In a court of law, you are legally and morally obligated to respond. Under adjuration (1st-century oath in a court of law) Jesus responded truthfully and completely. But this is not court. The guy in the story was on a fishing trip, and he was willing to be abusive, as we subsequently learned two pages later. But the clues were there before anybody said anything: the first people out of the unmarked military chopper were a SWAT team carrying assault rifles. That's a threat of punishment right there.
On the other hand, once she had been granted immunity, if she still declined to be polygraphed, she could be prosecuted under two statutes that, upon conviction, allowed for consecutive sentences totalling as much as four years in prison.
When [she] still hesitated, [he] said, "Look at it this way. If you want to lie your head off, you can do so with no fear of punishment. You've got immunity..." [p.276]
So I can truthfully say "Your alleged immunity is worthless, because you are obviously lying. How do I know you are authorized to grant immunity? On 24 it must be signed by the President. But even that can be forged. It also can be revoked by people (like yourself) who value control more than they value truth. You can attach your machine, but I will not reply." What are they going to do, put me in jail? He just promised not to. Even if he does, it solves my unemployment problem: three hots and a cot, paid for at government expense. Plus a lot of very bad publicity. Or they could just shoot me "trying to escape." They do that in the movies. Quick trip to Heaven, now that I seem to have nothing useful to do here any more.
But this is fiction. Things like this don't happen to real people. Especially we don't have alien invasions replacing what Jesus promised in himself (and most everybody has already rejected).
This author had numerous books on the shelf, several of them with sci-fi stickers (mostly later volumes in one or more series lacking the first), some with fantasy stickers, and several with no sticker. I picked out one that had sci-fi-ish cover art, and the plot summary on the inside jacket didn't suggest it was set in a prior century as some of the others did. The plot summary appeared to be misleading: the story opened in the year 1340 and devoted its narrative detail to explaining the thoughts and motivations of 14th-century noblemen and fighters. Then a huge starship appeared in the sky and abducted the hero and his entourage. At that point I guessed from the title Excalibur Alternative and the fact that the lead character's name was Sir George that it was likely to be some kind of sci-fi-ish alternative retelling of the sword in the stone and/or St. George slaying the dragon.
So I skipped forward to the next-to-last chapter to verify my guess in anticipation of returning the book mostly unread. Here Sir George joined with another alien race in overcoming their abductors (still in the 14th century). Suddenly (in the final chapter) the story resumed in our future, where these same abductors were threatening the whole earth with annihilation, with a totally different cast of characters. Enter Sir George to the rescue. It was proper sci-fi after all. But I didn't bother to go back and read the intervening chapters, which I suppose mostly explained in detail how Sir George gained the confidence of the aliens and learned the technology needed to overcome them. Except for the one I'm currently writing, I don't do historical novels. I guess it's like video games: I don't play them, I write them.
Speaking of historical fiction, the same week I was reading (and skipping over) this novel, I read of an essay in ChristianityToday, where some film critic was asked to propose a Christian novel to become a movie, and he offered Eifelheim, another sci-fi story set in 14th-century Europe. I probably won't ever read the book (and I doubt anybody is likely to make the movie), but it's a curious coincidence.