Four months ago it was self-driving cars, and you may recall my take on the self-serving lawyer arguing for reduced liability for the manufacturers. It must have stirred up a hornets' nest, because the same rag has now run a full-issue, multi-article feature titled "Can We Trust Robots?" The authors went to a lot more effort to give due consideration to all facets of the question -- except to address the title question! Now you need to understand that the IEEE is a professional organization, meaning it is an organization of people, not corporations. At least that's what they want you to believe, and they probably believe it themselves.
I'm still a member, and I was active in their standards-making activities for a decade or more. My complaint then -- and the situation obviously has not improved -- is that the engineers (one of the "E"s in IEEE) cannot escape the corporate control of the companies that pay their salaries, sometimes to the detriment of the public good. One committee I was on consistently met in the San Francisco Bay Area. There were good reasons for that: two of the leading technical experts were at Berkeley, one as faculty, the other as one of his grad students. Educators, and especially their students, do not have money to go gallivanting across the country and across the seas every month. Although it was not public knowledge, I was a grad student myself at the time. On another committee, I got myself named head of the USA delegation, and the IEEE picked up the travel tab to Europe, where most of the meetings were held every six months or so.

But one of the corporate interests represented on the Floating Point standards committee could see that if Professor Kahan's designs became the standard, it would cost them dearly to build conforming hardware. Their delegates -- all there as "individual professionals, not representing any corporate interests" -- swarmed to the meetings in vast numbers from Massachusetts, all at company expense, hoping to overwhelm those of us convinced by the Professor's sound logic. Fortunately, there were enough of us locals (often no more than one from each silicon foundry, and many of the companies not represented at all, but Silicon Valley is so named because there are a lot of them) to keep the venue stable. One of that company's tactics was to claim -- plausibly, but disingenuously -- that the stable venue was a burden on East-coast parties, and yes, I suppose that would be true if they were paying their own way.
Rotating it around the countryside would have meant that only employees of large corporations could afford to be present at every meeting, and the smart phone you hold in your hand today would not have been as smart. Meaning: the Good Guys won.
They are still at it. The current issue of IEEE Spectrum clearly and blatantly favors the robots. Duh. These are Electronics (another one of those "E"s in the name) engineers; they are paid to make electronic stuff that the public will buy. They even credibly address the ethical issues in designing cars and weapons that act autonomously and may accidentally (or, in the case of weapons, intentionally) kill people.
But they studiously avoid any mention of the elephant in the room, like it isn't there. Stephenson had the same problem in Snow Crash, although more obliquely. The elephant is religion.
The problem is that computers operate by working through a set of rules (called a "program" or "computer code"). Artificially Intelligent programs have a very large number of rules, most of them these days programmed not explicitly, line by line, but implicitly: the computer is given explicit rules (code) for deciding which rules to prioritize as it works through a batch of experiences that get either rewarded or punished. So (except for the core code) the rules are implicitly programmed by the choice of experiences and the quality of the rewards. We do that to children, but the built-in "code" in children is far more complex than anything we have yet been able to design into our robots. The programmers -- optimists, all of us -- think we are getting close, but we haven't got a clue.
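To make that concrete, here is a toy sketch -- entirely my own illustration, not anything from the IEEE articles, and vastly simpler than any real learning system -- of what "explicit core code deciding which implicit rules get priority" looks like. The class name, rule names, and reward numbers are all invented for the example:

```python
# Toy illustration: the explicit core code adjusts the priority (weight)
# of implicit rules according to rewarded or punished experiences.

class RuleLearner:
    def __init__(self, rules):
        # every rule starts with the same priority
        self.weights = {name: 1.0 for name in rules}

    def choose(self):
        # act on the rule the machine currently trusts most
        return max(self.weights, key=self.weights.get)

    def experience(self, rule, reward):
        # the core code: reward raises a rule's priority, punishment lowers it
        self.weights[rule] *= 1.1 if reward > 0 else 0.9

learner = RuleLearner(["brake", "swerve"])
for _ in range(5):
    learner.experience("brake", reward=+1)   # braking gets rewarded
    learner.experience("swerve", reward=-1)  # swerving gets punished

print(learner.choose())  # after training, "brake" has the higher weight
```

Notice that nobody ever wrote a line saying "prefer braking"; that preference was implicitly programmed by the choice of experiences and rewards, which is exactly the point.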
The IEEE authors are at least honest enough to describe their problem in terms of "code" rather than "experiences," but they recognize that ethical decisions don't count as "ethical" if they merely fit the code the programmers imagined and predicted and programmed for. Here comes the elephant: religion solves this problem with a different kind of rule -- it's still a "rule," but at a totally different level. The Golden Rule (GR) invites each participant to imagine themselves in the other person's situation, and then to choose the outcome they would want if they were that person. Not many of us do that, but even the atheists know about it. The trouble is, it's a religious rule. Because it is central to the teaching of Jesus (who got it right out of the middle of the Torah [Lev.19]), and because Christianity has invaded the entire western half of the globe, our half of the globe is incredibly wealthy. The rest of the world sees that wealth and sometimes tries to play catch-up by stealing the values without really understanding where they came from, and they fail. Christianity took over Russia a thousand years ago, but it became a formality, not a way of life as Jesus taught it, so the people forgot the value system. Russia is now a third-world country with a fading memory of greatness. It's worse in "NAMEStan" (North Africa, the Middle East, and a bunch of 'Stans), because Christianity was pushed out a few centuries earlier, and the replacement religion does not make the GR central in its teaching.
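For the programmers in the audience, the "different level" point can be sketched in code. This is purely my own toy contrast -- the function names, the lookup table, and the outcome/preference values are all invented -- between an ordinary rule the programmers anticipated and a GR-style meta-rule that scores each action by the outcome the agent would want if it stood in the other party's place:

```python
# Ordinary rule: a fixed case the programmers imagined and programmed for.
def coded_rule(situation):
    table = {"pedestrian_ahead": "brake"}
    return table.get(situation, "proceed")

# Meta-rule (GR-style): choose the action whose outcome for the OTHER party
# I would most want if I were that party.
def golden_rule(actions, outcome_for_other, my_preference):
    return max(actions, key=lambda a: my_preference(outcome_for_other(a)))

# The other party is unharmed if we brake, hurt if we proceed.
outcome = {"brake": "unharmed", "proceed": "hurt"}.get
prefer = {"unharmed": 1, "hurt": -1}.get

print(coded_rule("pedestrian_ahead"))                      # "brake"
print(golden_rule(["brake", "proceed"], outcome, prefer))  # "brake"
```

Both pick "brake" here, but only the first depends on the programmers having foreseen the situation; the second derives the choice from imagining the other's position -- and, of course, the hard part the sketch hides is where a machine would get a genuine `my_preference` from.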
Why does this matter? Because the GR invites us to empathize with the other persons involved. How can a computer empathize with a person? It is not a person; it has never experienced the pain of failure nor the joy of success and approval. People can empathize with their pets, because dogs and cats show almost-human emotions: they get hungry and angry, and even seem ashamed when caught in the act of doing what is forbidden. We have done those things, and maybe the pet doesn't think exactly like a human, but it looks a lot more like one than the robot merely following rules. The robot cannot think for itself until it has a "self" to think for. Never mind what the atheists claim: God gave us that "self," and we are the robot's god. The robot will never think as clearly and deeply as we do, any more than we can be expected to think God's thoughts. I get a tiny hint of it when I program, but oh so tiny (see my "Me & My Computer" video).
Neal Stephenson is a programmer -- not hard-core like me, but he understands how it works. Like (almost) all geeks, he does not understand religion, so he invented his virus metaphor to explain why people are attracted to what he sees as fraud. It's a cute idea, but it basically ignores the elephant in the room. So the world he imagines is dystopic, "red in tooth and claw" -- oh wait, that's a Darwinist line, in Stephenson's mind but not in his book. The hero (unimaginatively named "Hiro," a Japanese name that sounds the same) is stranded on a raft in the middle of the ocean, having been rescued from a boat that the Bad Guys blew up and sank. His rescuer offered no altruism; he was hoping to do a hostage swap. Ships and boats sail by, see that there is nothing to steal, and keep on going. Stephenson has Hiro wishing those passing lookers would do a GR kind of thing, but Stephenson does not even have the vocabulary to describe it in those terms. The elephant is as invisible as Hiro is able to make himself in his virtual world (because he's a hacker = programmer, and can do those kinds of things).
Stephenson clearly understands what kind of world we would live in with no religious GR; he just doesn't understand why the world we live in today is not that. The IEEE authors do not understand why American -- even Texas -- drivers are more polite than drivers in South America or the Middle East. We have 300+ years of "Christian" values -- basically the GR -- saturating the culture. The gas tank is empty and we're running on the fumes, but it's still sooo much better than those other parts of the world where nobody has seen Christian virtue next door and across the street in a thousand years.