Here's a somewhat loopy but interesting article wishing for future computer overlords rather than the venal meatware politicians we're currently stuck with. Here's a passage that struck me with some very religious overtones:
When I asked [futurist Zoltan] Istvan, currently touring America in his Immortality Bus, what American politics would look like in 100 years, he answered that “There won’t be an America in 100 years. I’m sure of that fact. That’s why I formed the World Transhumanist Party, whose mandate is to become the first democratically [elected] political party to run a world government.” When I asked him if he would want to be replaced by an AI politician, Istvan’s response was upbeat and matter-of-fact.
"Yes, absolutely," he said. "I would love to see a truly altruistic entity running our government. Right now, all politicians, including myself, are motivated by self-interest. This is just how humans are. So wouldn’t it be nice to have something like a super-intelligent AI running things and it be entirely after our best interest?"
What would a "truly altruistic entity" (TAE) look like? Sounds like God to me. However, this TAE would have to be created and programmed by mere mortals. "A just machine to make decisions, programmed by fellas with compassion and vision" went the third verse of Donald Fagen's "I.G.Y."
Fagen had his tongue firmly in his cheek with that song, lovingly looking back at the 1950s' cloyingly optimistic World of the Future with a quarter-century (at the time) of cynicism. Our futurist, on the other hand, seems to be serious.
Who are those fellas going to be? Vision for new gadgets and new apps isn't in short supply in the tech industry, but compassion isn't always its strong suit. Any compassion that gets programmed into our TAE is going to reflect the world-view of the programmer, which will make it only as altruistic as the programmers are.
The TAE god we make will be made in our image. Any effort to model the "better angels of our nature" will be a flawed model, only as good as the programmer's wisdom and godliness.
A super-de-duper-computer might be faster and better at math than a human brain, but that doesn't mean it will be wiser. I've seen a number of pieces worrying about how self-driving cars will deal with no-win Kobayashi Maru situations; if the car has to choose whom to mow over, what algorithm will it use to pick the person to harm?
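To make the point concrete, here's a minimal sketch (all names and harm scores are invented for illustration) of what such a decision rule might boil down to: score each unavoidable outcome and pick the minimum. The ethics live entirely in the numbers a programmer assigned beforehand; the code itself has no wisdom of its own.

```python
def choose_outcome(outcomes):
    """Pick the outcome with the lowest assigned harm score.

    `outcomes` is a list of (description, harm_score) pairs supplied
    in advance by the programmer -- the car just does arithmetic.
    """
    return min(outcomes, key=lambda pair: pair[1])

# Whoever wrote these scores settled the moral question long before
# the crash: the machine only executes that prior judgment.
options = [
    ("swerve left into barrier", 0.7),
    ("brake hard, hit obstacle", 0.4),
    ("swerve right onto sidewalk", 0.9),
]
print(choose_outcome(options)[0])  # -> brake hard, hit obstacle
```

However sophisticated the real systems get, the same structure holds: the "altruism" is only as good as the values somebody typed in.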
If we're wetting our pants over the idea of having a computer figure out how to drive a car, are we anywhere near having it drive a country? We trust the generic 16-year-old with a driver's license, but are somewhat pickier (one would hope) about who is in charge in Washington.
Futurists have a concept of the Singularity, in which supercomputers design ever-better computers, quickly reaching a point of godlike computing power that leaves humans in the dust. They might have seemingly infinite computing power, but wisdom and mercy might well be in short supply. Will that man-made god be more like Yahweh or Cthulhu (running on a platform of "why settle for the lesser evil?")?
Progress might lead us to better computers, but that doesn't mean they'll be wiser ones. However, many folks put their faith in Progress to a point that strikes the writer of the piece as religious:
The conception of progress put forward by some of the more radical transhumanists doesn’t sound merely religious—it sounds specifically Christian: history progressing toward a rupture point where man is born anew, made immortal and limitless in a time after time. Instead of asking whether transhumanism is based on good science, it might be more useful to question the rigorousness of it as a theology.
That's not an orthodox take on Christianity, although there are some on the New Agey left who see mankind growing in knowledge to become God in time, with the Singularity becoming an Omega Point where the universe grows infinitely wise and becomes God.
Not quite. You can't make something infinitely wise from finite pieces. It also ignores human nature, which would likely be passed down to our computer offspring. Garbage in, garbage out, and the garbage of human nature goes into the supercomputer our heroes want to put in charge.
However, a lot of folks see mankind as perfectible. That was the fallacy of Communist thought and seems to be the major blind spot in transhumanist thought. In the Star Trek universe, Khan was a transhuman ruler; even with that suave Spanish accent, he's not my choice for president.