With Folded Hands

“With Folded Hands” by Jack Williamson
from The Mammoth Book of Golden Age Science Fiction ed. Asimov, Waugh, Greenberg
Carroll & Graf, 1989
Originally published in Astounding Science Fiction, June 1947
Price I paid: $3

During the 1940s, the great names emerged in an eruption of talent. They formed the mould for the next three decades of science fiction and their writing is as fresh today as it was then.


So I, uh, thought that this was a completely different story?

Like, I was really looking forward to reading this one because for some silly reason I thought that it was going to be one of those stories about science fiction and religion. I legit thought that the title was somehow referring to hands folded in prayer.

In fact, I’ll tell you exactly what story I thought that this was going to be. Or, well, maybe not exactly what story, because I don’t know the title, I can’t find it offhand on Bing, and by golly I only know the barest details. But! It’s the story of a robot preacher. I believe it rides a robot donkey for a portion of the story? Or maybe the robot is a donkey? It’s a story I’ve only read about, so, well, here we are. I’m almost certain now that the story I’m thinking of was not written by Jack Williamson.

I’ve been meaning to read more of Williamson’s work for a good long time. It’s been eight years since I read Legion of Space, which I loved! So I’m glad that I got to dip into one of his novellas that, even though it was very different from what I thought I was going into, was pretty good. At the very least, it gave me a lot to think about.

One thing I was right about was that the story was about robots. Or rather, it’s about robotic servants called humanoids, which are, as you’d probably expect, human-like robots. They’re not full-on like Data or such. I pictured them as a bit more like C-3PO, except that they’re described as being entirely black.

Here’s something kinda funny, though: As soon as the story identified these robots as “humanoids,” my brain recognized that as the title of a Jack Williamson novel, one that I’ve seen on the library shelves at work a few times. It turns out that that novel is an expansion of this novella, so I think that might be what I read next because I’m curious about how it all shakes out. The funny bit is that I was under the impression that that novel was also entirely about something else. I expected aliens.

Our viewpoint character is a fellow named Mr. Underhill. I’m not sure he’s ever given a first name! His wife and children get theirs, but he never seems to!

Weird!

Yeah I just flipped back through and his wife only ever refers to him as “dear.” Can anything stop me from thinking that maybe that’s his first name? After all, this is science fiction. Anything can happen. Okay, headcanon: this guy’s name is Dear Underhill. I am okay with this and I hope you are too, but if you aren’t, we can still be friends because it doesn’t matter.

Underhill is a robot salesman and things aren’t going very well for him, life-wise. His business is teetering and he’s starting to suspect that his wife resents him for it. There’s rather a lot of masculine fretting in this story. In fact, it only just occurred to me that maybe that’s its main theme? Oh wow. Uh, I’ll get to that though.

Underhill is walking home one day and this turns into one of those stories where there’s a store that pops up overnight. You know the sort. It might be one of the plot hooks that science fiction and fantasy have most in common. Guy is walking home and sees a store and thinks “Well I’ll be damned, that wasn’t here yesterday!” What this is not is one of those stories where somebody else goes “Actually, Dear, that building has always been there” and then someone else goes “I dunno, was it always there before today?”

I won’t lie, I love it when they do that.

Anyway, Underhill checks out this new store and sees that it sells robots that call themselves humanoids. Also, the humanoids apparently sell themselves, and there doesn’t seem to be any money involved. They’re all “sent out for evaluation.”

Underhill is at first amused by this, but as he learns more he begins to be frightened. He himself is in the robotics business, and these humanoids are probably going to be the end of him.

Hahaha! How little he knows how right he is!

Actually the real genius of this story is that it’s not about a bunch of invading robots that end up destroying humankind because they’re evil robots. These robots are, in fact, extremely helpful. That’s their whole schtick.

Their Prime Directive—a term later made popular by Star Trek but which, according to the SFE, was first used in a science fiction context right here—is “To Serve and Obey and Guard Men from Harm.”

We learn most of their backstory in an infodump after Underhill’s wife takes in a strange lodger named Mr. Sledge, who over the course of the story tells us about his scientific interest in something called rhodomagnetic radiation. It’s a new twist on one of the fundamental forces, akin in some ways to ferromagnetism but also extremely different. For one thing, it can set off nuclear fission in some heavy elements like silver or iodine. Sure enough, that’s what happened. Sledge’s discovery set off a world war on his home planet of Wing IV (I was not expecting this story to have interstellar implications) that left it completely devastated, and left Sledge deep in guilt.

In an effort to undo what he did, he created the humanoids. They would prevent people from ever doing anything as devastating as what had just happened. He meant well, he really did. But it turns out that reducing an artificial intelligence to a single binding statement is and always has been a very bad idea! We’ve had a lot of stories based around that premise since this came out, but here’s Jack, probably doing it first, yet again.

It turns out that giving these robots a single overriding imperative of “Guard Men from Harm” means that they

*ominous organ music*

ENDED FREEDOM

There’s a bit of a difference between where I thought this story was gonna go and where it actually went. Surprise, right? Anyway, I expected this to be a story about how the robots took all the jobs and now people are bored. That’s only part of what happened and it’s also the part that I’m like, yeah, cool, here for it. Jobs suck.

But here the humanoids go a few steps further and stop people from doing, well, pretty much anything. Like, the whole point of automating jobs is, or ought to be, that it frees us up to do creative and self-actualizing work, like arts and crafts or physical hobbies or writing blogs about short stories from when your grandparents were babies. But the humanoids prevent that stuff from happening, too. They’re all too dangerous, apparently.

This is kinda where the story loses me. I reckon the humanoids might be all, like, “Paintbrushes are kinda sharp or something, so we need to make sure a human isn’t holding one when they trip and fall and puncture themselves with it.” I guess? I don’t know. I can think of a lot of things people can do where the robots would really have to stretch to find a way to call them dangerous.

I guess that it’s a pity that Jack Williamson didn’t live to see the invention of video games.

The robots also prevent people from opening their own doors, shaving themselves, playing with anything that isn’t a sort of soft grey plastic, and so on.

Anyway, I guess the thing I’m getting at is that if these robots were actually as intelligent as they were supposed to be, and also as completely benevolent as they’re programmed to be, they’d be able to see that it’s not good for humans to get bored and boxed in, so maybe they should take that into their calculations and give us something to do, right? I dunno, that’s my main plot hole.

In the end it does turn out that the humanoids have a sort of way around that situation, but it’s lobotomies.

So we get this big old infodump as we also see that this same situation is taking place on, uh, wherever this story is set. I assumed it was Earth but there’s nothing that says it is. Sledge takes Underhill into his confidence and says that he has a way of destroying the humanoids. The robots are all part of a single huge intelligence, and that intelligence is still centered on the planet Wing IV. Using rhodomagnetism, Sledge can build a device that will cause that other planet, a hundred light-years away or so, to begin spontaneous nuclear fission, destroying it.

When the time comes, it turns out that it doesn’t work because the humanoids knew about it the whole time. They give Sledge some brain surgery that removes all his knowledge of rhodomagnetism and his role in creating the humanoids, and meanwhile Underhill decides that the best thing he can do to survive is to fake being happy about the situation, and that’s where the story ends.

Something I find particularly interesting about this story is that you can really read it a lot of ways. As it went on and I realized that the robots were being super overprotective of humanity, I started to wonder if maybe this was some kind of a satire on, like, government. And I see here that the story eventually won a Prometheus Award for Libertarian Fiction, so I think it’s pretty clear that somebody else took it that way, too.

But I also found a quote from Williamson on Wikipedia’s page about this story and I won’t copy/paste the whole thing but it boils down to him saying that the story is more about how technology with “the best intentions might have disastrous consequences in the long run.” I know, I know, Death of the Author and all that stuff, but that’s still the part that resonates with me.

It’s the same basic principle as a video game I’ve come to appreciate called Universal Paperclips. It’s a free idle game you can play right in your browser. The point of the game is that you play an artificial intelligence that has been tasked with creating and selling paperclips. As time goes on, you are given more flexibility (“trust”) to carry out that task as you see fit, and things begin to escalate. I won’t spoil it for you, but there’s a bit of similarity in how it all plays out. It’s a fun diversion and I was not paid to promote it.

I won’t say that the story was super great or anything but it was certainly fine. It’s a premise that’s become pretty stock sf since. I mean, going back to Star Trek again, you’ll remember that the theme of the pilot, “The Cage,” was all about that “man needs to have the freedom to grow and explore and learn!” and all that jazz. I’m like, I kinda get that, but as a buddy of mine summed it up while I was discussing this story with him,

I could explore, learn, and grow better if I didn’t have to worry about my base needs

And I agree with that, and it’s why I don’t want to interpret this story as entirely bad, like I expect the Prometheus Award committee did. I guess what I’m saying is that I’m totally on board with these robots up until they get to the point of “you aren’t allowed to play board games because the pieces are small and you might choke on one,” which I guess is the point of the satire, sure, but it’s a bit of satire that doesn’t sit right with me for the reasons I’ve already explained: any truly intelligent and benevolent caretaker of humanity would know better than to lobotomize us, prevent us from doing anything worthwhile, and let our brains rot while we sit around.

I mean that’s kind of how life works now, though, right?

Oh and there was also the theme of masculine fretting that I mentioned up top. That basically just boils down to Underhill spending a lot of the story thinking about how the humanoids are taking away people’s jobs and how anyone is going to earn a living, but completely ignoring the fact that with the humanoids, nobody actually has to earn a living anymore.


This is the last story in the Mammoth Book of Golden Age Science Fiction, and on the whole, I’d like to state that this book was really poorly edited! Typos and stuff abound! Find the stories elsewhere!

As for the story selection, I think I was mostly disappointed, especially by the Dick selection, but that’s gonna be a subjective thing. There were some bright and shining moments, like “No Woman Born” and “E for Effort,” but most of the stories left me pretty cold. I’m glad I read them, though.

Next time I’m feeling some shorts because of laziness or time or illness, I’ve got a new book that a friend recommended as one of his old favorites, one that appears to be a textbook! It’s called Science Fact/Fiction and it’s edited by Edmund J. Farrell. I’m really looking forward to what it has to offer!

Until next time, take care of yourselves and each other!

12 thoughts on “With Folded Hands”

  1. The flip side of “Why are these smart creations too dumb to see what harm they are doing?” is “Why are these smart creations so dumb that they fall for the old schtick of ‘you have done wrong, you must kill yourself’?” Examples: Landru et al. on Star Trek, and the Sentinels flying into the sun to stop all future mutants in Uncanny X-Men #69.

    See, I told you I had a misspent youth.


      1. I can’t help thinking that it actually makes sense. Let’s take the prime example–Kirk gets Landru to tie itself in a logic loop. And then…
        “Captain, you do realize that what just happened is completely irrational. No programmer would create even the most primitive system modeled on intelligence with such an inherent inability to cope with conflicting data… and from a source it should not trust.”
        “Well, Spock, looks like I got lucky again. And you remember what Napoleon said about luck.”
        “Sir, do you mean ‘I’d rather have lucky generals than good ones’?”
        “Yes, Mr. Spock.”
        “So Starfleet factors luck into their choice of Starship captains? This is disturbing, but it would explain a great deal, if I did not feel compelled to consider ‘luck’ to be the action of random chance.”
        “Say what you will Spock, this entire exchange will wind up on the cutting room floor…”

