[personal profile] skygiants
I have formed a vague & probably untenable ambition of reading all the Nebula nominees this year; the first one I managed to get round to (that I hadn't read already) was The Terraformers.

This book is a sort of multigenerational saga on a planet that's being gradually prepared for habitation by humans [and others]; the first section takes place while the planet is still a protected zone inhabited only by terraforming staff, the second section focuses on urban planning during the very early stages of settlement, and the third section focuses on corporate land grabs and gentrification on the now heavily settled planet.

Conceptually this is a very cool premise, I really enjoy science fiction that explores cultural shifts over the long term and the book is dealing with a number of ideas that I find extremely interesting! I am really glad to have read it, I think it's an ambitious project, I am always glad to read books that give me things to think about and argue with and this book certainly gave me a lot to think about and argue with, as everyone who has had the pleasure of communicating with me in the past three days now knows in excruciating detail because I have not been able to shut up about it.


This book is really really profoundly concerned with personhood and who gets it. Wonderful; I too am concerned with this question.

We are in a far future space in which all reproduction is conducted by 3-D printing various sapient creatures to design. Do not worry about the casually eugenicist implications of this concept. Or, I mean, do worry about it -- the author is very worried about it, because the bad corporate entities are unethically 3-D printing unintelligent people to do service instead of printing everyone out as properly intelligent, and by people I mean 'moose' -- our first major plot point in this regard involves a romance between a pair of sentient moose, one of whom has been printed out with a lower intelligence rating than the other, to the great concern of all.

The moose text [with their brains]. They sound exactly like human beings when they do so. The less-intelligent moose's human partner is deeply concerned that the smarter moose is just using her friend for sex. I explained this to [personal profile] genarti, who said "....does she know about rutting behavior?" We neither know nor care about rutting behavior. Once we give creatures human-level sapience there is no difference between a texting human and a texting human in a moose suit.

Okay, so the basic argument being made here is "if we can 3-D print everything with human-type sentience, then we have a moral obligation to do so." This results in a number of wild and frankly extremely funny plot elements like:

- intelligent dogs learn about the history of domestication, get extremely angry about it, and leave in a huff to perform their own science on another continent
- intelligent moose from evil corporate town visits egalitarian paradise, asks for barn to stay in, and is met with politely appalled reactions: why a barn? why not an apartment like everyone else??
- intelligent cat cannot pay rent on their apartment, stays with friends for a while, but then starts feeling self-conscious about freeloading

([personal profile] genarti, upon having these plot points explained to her: has this author ever met a dog?? has this author ever met a cat??? is this a PETA tract????)

Eventually it's revealed that the less-intelligent moose is not in fact less intelligent in any way except for the fact that there is an artificial inhibitor in his brain that prevents him from using words of more than one syllable, which gets fixed by the end of the section!

the book, posing a question: hey, do we have a moral obligation to imagine different kinds of intelligence and treat beings with lesser intelligence by our standards or different ways of experiencing the world with respect and dignity?
the book, answering it: no! we have a moral obligation to make sure everyone can talk exactly like a human! problem solved!!!

At some point in the second section, the protagonists stumble upon a community that is working to give intelligence to earthworms, but has run into a problem where the earthworms won't talk to them.

"Ah," I thought, "we are complicating this at last! We've given intelligence to earthworms but they aren't interested in communicating like humans, and why should they be!"

but no, this is just a software bug and once they fix it the earthworms also start acting exactly like humans in earthworm suits.

(this does set up a very funny sequence in the third section where the protagonists go to a game jam and the earthworms are working on a video game for earthworms that's a bee simulator. 'earthworms make a bee simulator' is a great gag! if I wasn't so irritated already at the book's whole attitude towards animal intelligence I would be so charmed by this!)

When explaining why they felt compelled to give intelligence to earthworms, the scientist on the project says, "We were working with them on soil sustainability and infrastructure maintenance for the colony. It didn't seem right that we couldn't talk to them."

Later in this same section, the protagonists (one human and one intelligent cyborg cow) come across a (free-range) dairy farm populated by normal non-augmented cows and are shocked, horrified, appalled, made physically ill by the concept. And like -- sure, the point is that in this world where everyone is people, they are reacting like we would if we found a farm full of human women with cow-level intelligence being milked on the regular. Now, cows do experience the world in a different way than humans. All animals in fact experience the world in different ways than humans, and there is a lot of interesting science writing exploring our best understanding of those experiences. But not in this book! where everyone should experience the world exactly like humans do and if not we are doing something wrong!

This section takes care to point out that there are animals with animal-level intelligence wandering the terraformed planet -- but that's different, because they're part of a natural ecosystem. The point seems to be that humans cannot interact with other creatures ethically unless they can talk like us to express consent like us. We're not part of the ecosystem, I guess. We're exempt, by virtue of controlling the 3-D printers and being able to make everyone that we interact with specifically to design -- is this a bit eugenicist? DO NOT WORRY ABOUT IT!!!!!

This section, by the way, is primarily composed of figuring out how to design a public transit system for the planet. Someone comes up with the idea of making trains that can fly and are intelligent! What a brilliant idea! Everyone in our consensus-based idyllic model community of free citizens on the planet is so stoked about this!

One (1) character -- the POV character's love interest -- raises a concern about the idea of creating a sentient species to solve a public infrastructure problem. What if we make trains specifically to be trains and they don't want to be trains?

Not only is this roundly shouted down -- the trains can be anything they want, but it's normal that creatures created to be good at a thing will want to do the thing, so probably many of the trains will enjoy being trains and others can be scientists or ballerinas if they want to! and obviously we will not compel the trains in any way, they will be their own self-governing union! no problems ever with maintaining infrastructure in consensus-based model community! -- but the POV character immediately starts ghosting the love interest, "repulsed" by the fact that he would even raise concerns about such an obviously cool and ethical solution to the problem, until the love interest makes a sincere and profound apology for his poor behavior.

I'm so mad about this in particular because conceptually I love sentient trains. I was all ready to adore the sentient trains. I cannot believe this book ruined sentient trains for me by forcing me to ask questions and then answering them with "if everyone acts ethically it will be fine, do not worry about it, and if you worry about it you're Bad and Wrong."

And the thing is, like -- a lot of this book is profoundly silly, intentionally so, viz. earthworms making a bee simulator. None of this would make me that angry if it wasn't so clear that it is also intending to be a Big Ideas book, "a feat of revolutionary imagination" (Publishers Weekly), "a primer for how to embrace solutions to the challenges we all face" (Scientific American), etc. etc. etc. This is a didactic book. Even the bits that are silly are didactic. More than anything else, it really profoundly reminded me of the work of Sheri Tepper -- another author with a delightfully creative imagination who had a lot of Big Progressive Ideas, a real determination to explore Solutions to our Current Problems through Science Fiction, and an unfortunate tendency to accidentally slide from ecofeminism towards ecofascism with a little bit of eugenics thrown into the mix for flavor.

I am in no doubt that the author of this book and I agree on most of our political and cultural opinions. I too am concerned about the climate and the bad actions of corporations, I too would love to imagine a queer and trans-inclusive future, I too feel bad about eating animals! But I do think any progressive science fiction author runs the risk of falling into the trap of believing that they have hit the endpoint of human thought and moral behavior; that they can easily and without friction inscribe that onto a far future world and society in which all good and sympathetic characters represent good behavior as we understand it here, now, in 2024. To me, this is both boring and annoying, EVEN WHEN it doesn't result in a take I disagree with as profoundly as "it's unethical to interact in any way with any living creature that can't communicate with you in complex sentences in your own language (! !! !!!)"

I have not listened to the author's podcast but my understanding is that it is called Our Opinions Are Correct. From reading this book, that is exactly what I would expect.

Date: 2024-04-01 06:53 pm (UTC)
From: [personal profile] sciatrix
but why could words of one beat not speak as wise as long words could? You can pack words so tight with thoughts, right?

Like--really? That's where you rest intelligence? Under syllable length?

Date: 2024-04-02 07:36 am (UTC)
From: [personal profile] rydra_wong
Also! It makes no sense in terms of language development! There are plenty of people with very limited spoken language who can, for example, say "biscuit" or "DVD"!

Date: 2024-04-02 05:58 pm (UTC)
From: [personal profile] sciatrix
right exactly--all words are just syllables strung together! There's nothing inherently smarter about saying more or fewer syllables to convey a concept!

The only reason we tend to assume this is a thing in English is the class association of Romance-derived loan words over historical time! That's an arbitrary historical development!

like I am just befuddled over here.

Date: 2024-04-02 06:29 pm (UTC)
From: [personal profile] rydra_wong
And if it's not intending to say that intelligence actually is functionally equivalent to how polysyllabic your speech is, it's saying that in this far-future super-sophisticated society, nobody has been able to tell the difference in all this moose's life up to this point between intellectual disability and a very very specific language impairment.

"because the bad corporate entities are unethically 3-D printing unintelligent people to do service instead of printing everyone out as properly intelligent"

Also, why is this trope ("we will make stupid/slow people so they can be the perfect servants/slaves!") even a thing? Where does this idea that intellectually-disabled people are naturally compliant and subservient and happy obedient child-like slaves come from?

Because it's clearly not from anyone who's ever met people with intellectual disabilities, who are as autonomous and rebellious as anyone else (and in many cases especially so because being surrounded by people trying to control you will often tend to make people push back hard, in whatever ways they can).

Maybe it's supposed to be obviously wrong, but the evil corporate entities should have found that one out immediately.

Date: 2024-04-03 04:38 am (UTC)
From: [personal profile] sciatrix
Right! Obviously! Also very obvious that the author has never worked with real animals, either: not only that animals can often communicate extremely clearly without necessarily using language as humans know it (ie communicating concepts like "don't do that" or "gimme"), but real animals are often quite skilled at manipulating their handlers (and other animals) to get desired outcomes. Which may or may not be anything a human comprehends as "good!"

Neither has the author apparently spent much time around small children and toddlers, who are certainly not particularly biddable or pliant either despite still developing their cognitive faculties. Obviously adult intellectually disabled people are quite different from small children in that they are adults with the time to learn and percolate and develop skills and, you know, be adults—the best definition of what general intelligence is that I have seen is essentially processing speed and capacity, which does not comment on what is being processed—but the point re: pliancy remains.

In fact, in my experience it's if anything easier to convince individuals with more processing speed to behave and act as you'd like them to: you can basically fast talk them into dropping inputs and convincing themselves that doing what you want is actually in their best interest, whether or not it actually is. Or you can entrap them emotionally, at which point they will generally spend their considerable cognitive powers justifying why it actually makes total sense to do what you'd like them to.

Not that I've ever seen any of that play out in real life or anything. Sometimes staving off the meltdown to get through a stressful situation means keeping yourself in that situation much longer than you need to be.

Anyway, it's such an indication of a level of self-centered complacency about cognition, variation, and diversity to find in a writer who is theoretically interested in what personhood is and means! Folks elsewhere in the discussion keep invoking animal rights positions and language to explain it, and I think that's exactly right: animal rights (vs animal welfare) ideology encourages people to explore personhood by projecting themselves into the position of an alien experience and imagining how they would feel if transplanted immediately into that social position. Animals are viewed as humans in furry suits without the ability to speak, not as beings with fundamentally different priorities, opinions, and sensory experiences from humans.

I really do think that this emphasis on "intelligence" as language, and specifically language as easily parsed and unpacked by the speaker, is a key to the fundamental mindset. It's also the thing that lets people be tricked by processes that simulate language without actual cognition, like ChatGPT. It's just so heartbreaking watching someone concerned enough about defining personhood and variation in cognitive experiences to write a whole speculative scifi novel probing the boundaries of the concepts... while being totally trapped within the paradigm of a world populated by tiny versions of ourselves, devoid of tools to use to actually connect to and understand the alien.
