The commentary explained that they had exactly two real boats. Nobody ever used the technical word "pentaconter", so named for its 50 oars, but they did mention the 50 warriors on each. A while back I read about the efforts of some academics to recreate a single authentic pentaconter, so I knew the movie producers would not waste resources on even a hundred of them for a minute or two of screen time. Instead they programmed the computers to generate a thousand different (but similar) pentaconters. In the commentary they pointed out that 1000 ships that close together were unseaworthy, so they reduced the fleet to about 300.
I correctly guessed that the outdoor fight scenes used cheap Mexican labor -- even before I saw the credits I thought I heard some off-screen Spanish and wondered about it. It did not occur to me that 70,000 warriors were about as cost-effective as 1000 pentaconters. They used a few hundred real people and a zillion computer animations. Hence the motion capture: they had real people go through all the motions of fighting, then let the computer select different action sequences for every individual soldier on the beach. Thus there was no repetition. I looked for repetition among the ships too, but did not see any, and silently congratulated them for the variability.
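The principle is simple enough that a few lines of C can show it -- the numbers here are invented, since they never showed any actual code:

  /* Hypothetical sketch: seed a random clip chain per soldier, so
     no two soldiers on the beach fight the same way. */
  #include <stdio.h>
  #include <stdlib.h>

  #define CLIPS 40   /* assumed size of the motion-capture library */
  #define MOVES 5    /* assumed clips chained per soldier */

  int main(void) {
      for (int soldier = 0; soldier < 4; soldier++) {  /* 70,000 in the film */
          srand(soldier);            /* each soldier gets his own sequence */
          printf("soldier %d:", soldier);
          for (int m = 0; m < MOVES; m++)
              printf(" clip%02d", rand() % CLIPS);
          printf("\n");
      }
      return 0;
  }

A unique seed per soldier is all it takes to make every warrior's fight different.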
At church they have bought new presentation software that plays moving nature scenes (waterfalls, waves on a seashore, wind through the branches of a tree, and so on) behind the PowerPoint text of the 7-11 songs (seven words repeated eleven times). A number of us in the congregation have given up paying attention to the boring text and started looking for where they stitched the film loop together for the repeat. The central image flows pretty smoothly across the splice, but the bubbles around the edges mysteriously disappear or appear right at the seam. For a second or two, a stalk of grass waving in the breeze in the foreground doubles into two stalks. Yes, we the audience notice those things.
The movie commentary pointed out their efforts to eliminate repetition. They have a bigger budget than church presentation software.
Jurassic Park was the turning point, after which computer graphics would forever be used for large-scale effects. Forrest Gump put modern actors' faces on old newsreel clips. Now they are replacing whole actors -- well, extras -- with software. It's not just the programming and service jobs going to India; even our actors' jobs are being eaten by the computers.
Brad Pitt still has a job, and they can't computerize my work. Sorry about the rest of you.
The show is a 90-minute monologue: the last disciple, in prison on Patmos, mostly talking and responding to some (unseen) visitors from Ephesus. Much of the speech is lifted from John's gospel and his Apocalypse and first epistle, but the author has also brought in a lot from the other gospels and from a very modern emotional perspective that is utterly foreign to the flavor of the actual writings of John, even though John is the most relational of the evangelists. Unlike the Da Vinci Code, which does the same kind of modernist injection from an anti-Christian perspective, this is at least a reverent and pro-Christian presentation.
I appreciate a good show -- and this is outstanding -- but I still could wish for something a little more historically accurate in presenting how the Apostle thought about his world. sigh
But you don't need their writers to explain things. I got a lot less interested in angel-food cake a few decades ago when I saw a "surfactant" among the ingredients; that's the stuff they put in laundry detergent to make it sudsy.
Take a look at the ingredients in peanut butter. After the peanuts, most brands list "hydrogenated peanut oil", which is essentially oleomargarine, and then a couple of different names for sugar. So I buy "natural" peanut butter, which is made from peanuts and salt. The peanut oil tends to separate from the solids, forming a quarter-inch to half-inch layer of oil at the top of the jar. Stir it up, then keep it cold, and it works just fine. Lately I thought it too soft, so I was pouring off most of the oil before stirring in the rest. Once or twice the result was too stiff, but eventually I got the balance about right.
Until this week. Maybe this was a fresh jar that had not been sitting on the grocery shelf very long, but there wasn't much oil to pour off. The whole jar was not regular spreadable peanut butter, but something closer to peanut soup. It poured like honey on a warm day.
For a while, when I lived in urban California not too far from "The Bread of Life" (an organic grocery, eventually bought out by Whole Foods), I bought my peanut butter fresh-ground. They had a machine next to a huge peanut bin. You scooped peanuts into the hopper, turned the machine on, and held your tub under the spout to catch the peanut butter. It was thick and spreadable, nothing like what I bought last week.
That was then; this is now. The peanuts are different.
I suspect what has happened is that the peanut farmers have genetically engineered (or cross-bred) their peanuts to produce more oil, since there is a bigger market for peanut oil -- think: fried foods -- than there is for solid peanuts. Jif and Peter Pan don't care; they just hydrogenate more of the oil. And nuts with more oil already in them can be "dry-roasted" (higher profit) more easily, because they have enough of their own oil to keep from drying out. The tiny slice of the market that eats real peanut butter -- we don't have the clout to justify farmers growing the older, drier peanuts just for our peanut butter.
Oh well.
I guess I will need to buy up several jars of peanut butter and leave them on my pantry shelf for a few months to let the oil separate out, so I can pour it off. Or buy at a smaller grocery, so the jars will have sat on the store shelf longer.
As for this jar, I wish I had a centrifuge. Peanut butter that runs all over the plate instead of staying in the sandwich isn't worth much.
God also seems to allow for categories, groups of similar things which share some common attribute. We are all humans, different from the animals, even more different from plants and bacteria and non-living things. We have names for these categories and the names are meaningful.
People seem to like categories. Unlike God, we have finite minds, and grouping things into categories enables us to process information about large numbers of things, something that would be utterly impossible if we could only refer to things and people by their unique features.
Nevertheless, people want to be considered unique. Nothing insults somebody so much as being relegated to some generic category -- unless it is a category held in high esteem, of course, and sometimes not even then.
Many years ago I was at a dinner party that included among its guests a black musician. Thinking it a compliment, I offered the opinion that black people were better musicians than us white folks. Ah, the sweet taste of toe jam! He was incensed. He did not want to be considered part of some category. He is a unique person, to be esteemed for his own virtues, and not relegated to categories. I have repeated this error several times in different contexts, with different -- um -- categories of people.
The problem is not racial, it is not sexist, it is not a matter of education, it is human.
My friend Dennis was trying to make sense of our relationship by identifying categories in which he supposed us to be similar (that is, in the same category), and others where he supposed us to be different. I didn't think much of his characterization of me. I didn't even like his categories. It became somewhat of a problem.
The trouble is, I was doing exactly the same thing to him, and he rejected my attempts to categorize him.
I'm not sure what to do with this, but it seems to be a consistent property of human relationships, (ahem) a category.
The problem is that Vista preserves the failed Unix security model, which gives various levels of permissions to users. That may have been a reasonable model 30 years ago, when users shared limited access to a single mainframe at a remote site, system administrators worked in a physically secure environment, and there were no such things as viruses and worms and Trojan horses attacking over the internet. That was then; this is now.
The model we need today should give various levels of permissions to programs, not users. With very few exceptions, almost everyone works on single-user computers. Many -- perhaps most -- of us are responsible for administering our own computers. The operating system should protect the computer from external attacks, not from what we users need to do every day. The attacks come almost exclusively over the internet, when we are doing ordinary internet kinds of things that do not need to endanger the system. There is only one program we are running that enables those attacks, and that program alone should be restricted from system-modifying activities, regardless of which user is online and regardless of what that user's administrative permissions are.
There is no valid reason, at any time, for any user's web browser to be making system modifications such as installing drivers and rootkits and other programs capable of altering your hard drive contents or phoning home with confidential data. Never. The operating system should prevent it in such a way that the prohibition cannot be overridden, not even by experts.
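To make the idea concrete, here is a minimal sketch in C of what I mean -- the program names and permission levels are invented for illustration, not anything Vista or any real OS actually implements:

  /* Hypothetical per-program permission check: the policy is keyed
     to the program's identity, and the user never enters into it. */
  #include <stdio.h>
  #include <string.h>

  enum perm { NET_ONLY, USER_FILES, SYSTEM_MODIFY };

  struct policy { const char *program; enum perm max; };

  static const struct policy table[] = {   /* invented example entries */
      { "browser.exe",  NET_ONLY },        /* never touches the system */
      { "editor.exe",   USER_FILES },
      { "osupdate.exe", SYSTEM_MODIFY },
  };

  static int allowed(const char *program, enum perm request) {
      for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
          if (strcmp(table[i].program, program) == 0)
              return request <= table[i].max;
      return request <= USER_FILES;  /* unknown programs get no system access */
  }

  int main(void) {
      /* Even an administrator's browser is refused a driver install: */
      printf("browser driver install: %s\n",
             allowed("browser.exe", SYSTEM_MODIFY) ? "allowed" : "denied");
      printf("updater driver install: %s\n",
             allowed("osupdate.exe", SYSTEM_MODIFY) ? "allowed" : "denied");
      return 0;
  }

Notice that nothing in the check asks who the user is; the browser's cage stays welded shut no matter whose fingers are on the keyboard.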
In a corporate environment, system administrators may want to remotely install software upgrades. They can install their own trusted client software to do this; they do not need to do it in a general-purpose browser. Hackers may enjoy the power of Unix to castrate itself; let them do it in a suitably insecure operating system like Linux and older versions of Windows. The rest of us neither need nor want that happening on our computers.
It's too bad Microsoft didn't give us what we need. Maybe next time.
It's not that hard to do.
The program compiled and ran fine -- in the debugger. It bombed when I tried to run it as a standalone program. Even the standalone ran fine in Win95; it only crashed in WinXP. Some bogus write to random memory, deep in the bowels of system software that I didn't write.
C/C++ is a horrible programming language, totally unusable for anything that you want to run reliably. But it's all we have. I wrote BibleTrans in my own strongly-typed language (T2), but it compiles to C for running on the PC. Worse, there are about 4000 lines of hand-coded C to support it. A small mistake in those 4000 lines of C could percolate down to and corrupt the system code and crash it. The Microsoft system software is all written in C/C++, so it has no defenses against errors coming from badly-written application programs (like mine, but I am not alone: everybody's software crashes from time to time).
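Here is a contrived example of the kind of bug I mean -- mine was far subtler, but the principle is the same: the compiler happily accepts a write past the end of a buffer, and the damage surfaces somewhere else entirely, maybe much later:

  /* Contrived illustration (not my actual bug): an overrun write
     corrupts neighboring memory, and the crash comes later, far
     from the line that actually did the damage. */
  #include <stdio.h>
  #include <string.h>

  int main(void) {
      char name[8];
      const char *message = "hello";  /* may sit next to name[] in memory */

      strcpy(name, "BibleTrans");     /* 10 chars + NUL into an 8-byte buffer */

      printf("%s\n", message);        /* might crash here, or anywhere later */
      return 0;
  }

A strongly-typed language would have refused the copy at compile time or trapped it at run time; C just scribbles.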
Trying to find a bug in a C program is like finding a particular straw -- not a needle, you can use a magnet for that -- in a haystack. What you are looking for looks just like everything else. My only hope of getting around the problem was to completely rewrite those 4000 lines of C and hope I didn't make the same mistake twice. I also hoped to move a significant part of it to T2, but I did not succeed. I still do not know whether I eliminated the crashing bug -- which brings me to the title topic of this post.
Microsoft VisualStudio (VS) is the flagship of PC compilers. It regularly wins awards for being the best of the best. Compared to what I have been used to on the Mac, it sucks lemons. I suspect that is mostly the same effect as I noticed 20-some years ago, when a student reported that Microsoft Word did not have the ability to set font styles the way MacWrite did. Of course Word did have that capability, but she couldn't find it. With all the money Microsoft has to throw at it, I'm sure VS is able to do everything I want it to and more, but I can't figure out how. There is no reference manual in the top-of-the-line model I bought.
So it's like a video game: I'm being attacked by this orc, but my phaser doesn't work on him, try the laser defragger. Nope, that doesn't work either, maybe a proton grenade ... nope again. After a few hours of fiddling, I might just put the whole thing on hold and go search the internet. Other people often had the same problems and posted their solutions and "cheats". Ah so, what I need here is the light saber I left behind on Level 3. Programming in VS is like that. It's a huge drain on the economy -- or at least on my productivity.
VisualStudio has on-line documentation. When I ask for their "context-sensitive help" it offers to tell me about the latest features in C# and Dot-Net. Apparently their context sensitivity is non-functional: it cannot tell that I'm looking at a resource file in a C program, trying to figure out what all these commands mean. I try their help search function. It brings up 500 (apparently the size of their buffer) items about programming in FoxPro and multimedia and, yes, more C# and Dot-Net. Never mind that I have set their "filter" to be limited to C/C++ only. If I just open up the table of contents, all I see is yet more promo for C# and Dot-Net.
I don't know what they are smoking in Redmond, but when I buy a new product (VS in this case) to do with it what it is supposed to do (that is, make programs for the PC), it would be really nice if it could actually do that in a very few steps. Part of the reason for throwing the BibleTrans project away instead of rebuilding it is that I tried a rebuild last time, and VS keeps trying to search non-existent legacy files. I don't know how to purge its cache. I thought maybe starting over from scratch might work. I'm not so sure any more.
So here I am trying to compile 28,000 lines of source code generated by my T2 compiler (plus 4000 lines of handwritten C). In the old Unix days you just created a "makefile" that drove the command line to do it. VS lets you do that, but it's a lot of error-prone typing. This is the 21st century; IDEs (integrated development environments) are supposed to eliminate that drudgery. VS claims to be an IDE -- hence the "Visual" in its name. So I start VS and try to figure out what to do next. "New Project" seems like an obvious choice, but it doesn't work; you have to choose "New Solution", but then it offers no help on how to get it to compile your code -- or even how to get it to see your code. After fiddling around interminably, I finally found an "Add Project" submenu that offers me a choice of C# and Dot-Net and a bunch of other stuff I don't want. At least C++ is in the list. After a bunch more false starts (throw the whole thing away and start over each time), I finally get to a generated C++ text file open in the editor. This creates a program that is non-standard and won't run on older computers, but I discovered a couple of years ago that I can throw their code away and replace it with my own text file having the same name (and turn off some compiler options), and it works.
Why can't they tell me how to do that?
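For all that fiddling, the old Unix makefile I mentioned would have been just a few lines -- the file names here are invented stand-ins for my real sources:

  # minimal sketch of the makefile approach (illustrative names only)
  CC = gcc
  CFLAGS = -O2

  # recipe lines must begin with a tab character
  bibletrans: generated.c support.c
  	$(CC) $(CFLAGS) -o bibletrans generated.c support.c

Type "make" and you are done. That is the drudgery the IDE is supposed to improve on.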
When I start up the program from scratch, the Help menu should have an item "Getting Started". If they want to promote their unsupported proprietary C# and Dot-Net to lock users in, I can understand that, put those items first. They could even offer two or three paths through the help system, one for previous VS customers who just now upgraded and want to play with their new toys, and another for professionals new to VS who just want to compile a program (maybe there aren't many of us?), and maybe yet another for newbies learning how.
What they are actually shipping in VS is a marketing ploy to create job security for the guru consultants who express their appreciation by recommending Microsoft products to their customers. But it doesn't benefit the users (me).
sigh
Memo to myself: Don't do that. Documentation should support the users where they are.
My first electric was in college, a second-hand Remington Roll-a-Matic. I used it for 20 or 30 years, until it beat itself to death. It had three rows of slots separated by the rollers. Behind each row of slots was a matched set of cutting edges that worked like scissors by sliding back and forth against the slots to cut the whiskers off. The vibration induced by this action put a lot of stress on the whole device, and the power wires soon broke loose from pins in the plug where the cord went. I soldered them back in. Then the pins broke loose from the plug, but at least they still made a good connection. I let them hang loose on the wires. The other end of the wires broke loose from the motor winding terminals. I resoldered those joints several times. The motor mount itself came loose. Eventually some internal connection broke, so it would only work when you held it at a particular (awkward) angle. So I gave up and bought another shaver.
The current Remington model at that time replaced the rows of slots with a curved sheet of metal with a zillion tiny holes they called a MicroScreen, which worked essentially like the slots. I guess they could make this sheet of metal thinner than the slot structure, so the whiskers got cut off shorter. Consumer Reports rates things like shavers, and they gave this shaver a top rating. Remington turned around and cited this rating in their ads, which is against CR policy, so CR delisted it. Me, I didn't see the ads, and I had been happy with the previous Remington product, so I bought it anyway. It did shave closer.
This was a battery model, and after a few years the battery wore out so it wouldn't hold a charge. I took it into the local shaver repair center in Carmel, and they put a new battery in. That wore out too, but when I took it back, the fellow told me that this was his last battery of that type, that they had stopped making them, and that a new shaver would cost me less than the battery. Or something like that. Anyway I bought another MicroScreen shaver.
This one differed from the previous MicroScreen model only in that the mustache trimmer was not mounted permanently next to the screen, but rather popped up when you pushed a little lever. This turned out to be important, because if the screen holes missed a whisker, the next time it would fold over and not go through ever again. I suspect the previous model had the same problem, but the mustache trimmer would cut them off as it went by, so they never got too long to go through. I got in the habit of running my hand over my face to feel for leftover hairs, then doing a trimmer run before finishing up.
I think the mustache trimmer on the earlier MicroScreen model got replaced along with the screen, or maybe it was better built, because it never failed to keep the long hairs cut off. On this new model, the trimmer stopped cutting.
So I gave up and bought a shaver with slots. A folded over whisker will still fit through the slot and get cut off. The rotary cutters have less cutting surface than the Remington models, so it takes longer to do my whole face, and it doesn't cut them off as close as the MicroScreen -- I can feel the difference -- but it sure cuts them closer than the long hairs the MicroScreen left behind. It's also much quieter.
I was in the store evaluating shavers -- Consumer Reports got too political for my tastes, which also tended to bias their reviews, so I gave up reading it several years ago -- and noticed that the cheap corded model cost only $2 more than the replacement cutter head; adding a battery doubled the price. From what I read, this is good marketing, first invented by Gillette: you give the razor away free and sell the blades. The blades in this case are the rotary cutter assembly at an obviously high-profit $30, and the shaver behind them is a loss-leader $2. They can't make it completely free, because people would just get a new shaver each time the blades wore out, which defeats the marketing ploy. The second marketing trick is the low come-on price, then sell up. The cord is too short to make its use convenient, presumably because they can actually make a profit from the battery model and the other high-end versions (up to $120 or more), which offer no perceptible advantages other than gimmicks. Batteries wear out. I don't need gimmicks. I bought the come-on for $32. Eventually they will make some money off me selling replacement heads, but not much.