Safe Exoticism, Part 1: Science

August 31st, 2011

Note: This 2-part article is an expanded version of the talk I gave at Readercon 2011.

I originally planned to discuss how writers of SF need to balance knowledge of the scientific process, as well as some concrete knowledge of science, with writing engaging plots and vivid characters. But the more I thought about it, the more I realized that this discussion runs a parallel course with another; namely, depiction of non-Anglo cultures in Anglophone SF/F.

Though the two topics appear totally disparate, science in SF and non-Anglo cultures in SF/F often share the core characteristic of safe exoticism; that is, something which passes as daring but in fact reinforces common stereotypes and/or is chosen so as to avoid discomfort or deeper examination. A perfect example of both paradigms operating in the same frame and undergoing mutual reinforcement is Frank Herbert’s Dune. This is why we get sciency or outright lousy science in SF and why Russians, Brazilians, Thais, Indians and Turks written by armchair internationalists are digestible for Anglophone readers whereas stories by real “natives” get routinely rejected as too alien. This is also why farang films that attain popularity in the US are instantly remade by Hollywood in tapioca versions of the originals.

Before I go further, let me make a few things clear. I am staunchly against the worn workshop dictum of “Write only what you know.” I think it is inevitable for cultures (and I use that term loosely and broadly) to cross-interact, cross-pollinate, cross-fertilize. I myself have seesawed between two very different cultures all my adult life. I enjoy depictions of cultures and characters that are truly outside the box, emphasis on truly. At the same time, I guarantee you that if I wrote a story embedded in New Orleans of any era and published it under my own culturally very identifiable name, its reception would be problematic. Ditto if I wrote a story using real cutting-edge biology.

These caveats do not apply to secondary worlds, which give writers more leeway. Such work is judged by how original and three-dimensional it is.  So if a writer succeeds in making thinly disguised historical material duller than it was in reality, that’s a problem. That’s one reason why Jacqueline Carey’s Renaissance Minoan Crete enthralled me, whereas Guy Gavriel Kay’s Byzantium annoyed me. I will also leave aside stories in which science is essentially cool-gizmos window dressing. However, use of a particular culture is in itself a framing device and science is rarely there solely for the magical outs it gives the author: it’s often used to promote a world view. And when we have active politics against evolution and in favor of several kinds of essentialism, this is something we must keep not too far back in our mind.

So let me riff on science first. I’ll restrict myself to biology, since I don’t think that knowledge of one scientific domain automatically confers knowledge in all the rest. Here are a few hoary chestnuts that are still in routine use (the list is by no means exhaustive):

— Genes determining high-order behavior, so that you can instill virtue or Mozartian composing ability with simple, neat, trouble-free cut-n-pastes (ETA: this trope includes clones, who are rarely shown to be influenced by their many unique contexts). It runs parallel with optimizing for a function, which usually breaks down to women bred for sex and men bred for slaughter. However, evolution being what it is, all organisms are jury-rigged and all optimizations of this sort result in instant dead-ending. Octavia Butler tackled this well in The Evening and the Morning and the Night.

— The reductionist, incorrect concept of selfish genes. This is often coupled with the “women are from Venus, men are from Mars” evo-psycho nonsense, with concepts like “alpha male rape genes” and “female wired-for-coyness brains”. Not surprisingly, these play well with the libertarian cyberpunk contingent as well as the Vi*agra-powered epic fantasy cohort.

— Lamarckian evolution, aka instant effortless morphing, which includes acquiring stigmata from VR; this of course is endemic in film and TV SF, with X-Men and The Matrix leading the pack – though Star Trek was equally guilty.

— Its cousin, fast speciation (Greg Bear’s cringeworthy Darwin’s Radio springs to mind; two decent portrayals, despite their age, are Poul Anderson’s The Man Who Counts and The Winter of the World).  Next to this is rapid adaptation, though some SF standouts managed to finesse this (Joan Slonczewski’s A Door into Ocean, Donald Kingsbury’s Courtship Rite).

— The related domain of single-note, un-integrated ecosystems (what I call “pulling a Cameron”). As I mentioned before, Dune is a perfect exemplar though it’s one of too many; an interesting if flawed one is Mary Doria Russell’s The Sparrow. Not surprisingly, those that portray enclosed human-generated systems come closest to successful complexity (Morgan Locke’s Up Against It, Alex Jablokov’s River of Dust).

— Quantum consciousness and quantum entanglement past the particle scale. The former, Roger Penrose’s support notwithstanding, is too silly to enlarge upon, though I have to give Elizabeth Bear props for creative chutzpah in Undertow.

— Immortality by uploading, which might as well be called by its real name: soul and/or design-by-god – as Battlestar Galumphica at least had the courage to do. As I discussed elsewhere, this is dualism of the hoariest sort and boring in the bargain.

— Uplifted animals and intelligent robots/AIs that are not only functional but also think/feel/act like humans. This paradigm, perhaps forgivable given our need for companionship, was once again brought to the forefront by the Planet of the Apes reboot, but rogue id stand-ins have run rampant across the SF landscape ever since it came into existence.

These concepts are as wrong as the geocentric universe, but the core problems lie elsewhere. For one, SF is way behind the curve on much of biology, which means that stories could be far more interesting if they were au courant. Nanobots already exist; they’re called enzymes. Our genes are multi-cooperative networks that are “read” at several levels; our neurons, ditto. I have yet to encounter a single SF story that takes advantage of the plasticity (and potential for error) of alternative splicing or epigenetics, of the left/right brain hemisphere asymmetries, or of the different processing of languages acquired in different developmental windows.

For another, many of the concepts I listed are tailor-made for current versions of triumphalism and false hierarchies that are subtler than their Leaden Age predecessors but just as pernicious. For example, they advance the notion that bodies are passive, empty chassis which it is all right to abuse and mutilate and in which it’s possible to custom-drop brains (Richard Morgan’s otherwise interesting Takeshi Kovacs trilogy is a prime example). Perhaps taking their cue from real-life US phenomena (the Teabaggers, the IMF and its minions, Peter Thiel…) many contemporary SF stories take place in neo-feudal, atomized universes run amuck, in which there seems to be no common weal: no neighborhoods, no schools, no people getting together to form a chamber music ensemble, play soccer in an alley, build a telescope. In their more benign manifestations, like Iain Banks’ Culture, they attempt to equalize disparities by positing infinite resources. But they hew to disproved paradigms and routinely conflate biological with social Darwinism, to the detriment of SF.

Mindsets informed by these holdovers won’t help us understand aliens of any kind or launch self-enclosed sustainable starships, let alone manage to stay humane and high-tech at the same time. Because, let’s face it: the long generation ships will get us past LEO. FTLs, wormholes, warp drives… none of these will carry us across the sea of stars. It will be the slow boats to Tau Ceti, like the Polynesian catamarans across the Pacific.

You may have noticed that many of the examples that I used as good science have additional out-of-the-box characteristics. Which brings us to safe exoticism on the humanist side.

Part 2: Culture

Images: 1st, Bunsen and his hapless assistant, Beaker (The Muppet Show); 2nd, the distilled quintessence of safe exoticism: Yul Brynner in The King and I.

Related entries:

SF Goes McDonald’s: Less Taste, More Gristle

To the Hard Members of the Truthy SF Club

Miranda Wrongs: Reading too Much into the Genome

Ghost in the Shell: Why Our Brains Will Never Live in the Matrix

“Are We Not (as Good as) Men?”

August 23rd, 2011

— paraphrasing The Sayer of the Law from H. G. Wells’ The Island of Dr. Moreau

When franchises get stale, Hollywood does reboots — invariably a prequel that tells an origin story retrofitted to segue into already-made sequels either straight up (Batman, X-Men) or in multi-universe alternatives (Star Trek). Given the iconic status of the Planet of the Apes original, a similar effort was a matter of time and CGI.

In Rise of the Planet of the Apes, we get the origin story with nods to the original: throwaway references to the loss of crewed starship Icarus on its way to Mars; a glimpse of Charlton Heston; the future ape liberator playing with a Lego Statue of Liberty. As Hollywood “science” goes, it’s almost thoughtful, even borderline believable. The idea that the virus that uplifts apes is lethal to humans is of course way too pat, but it lends plausibility to the eventual ape dominion without resorting to the idiotic Ewok-slings-overcome-Stormtrooper-missiles mode. On the other hand, the instant rise to human-level feats of sophistication is ridiculous (more of which anon), to say nothing of being able to sail through thick glass panes unscathed.

The director pulled out all the stops to make us root for the cousins we oppress: the humans are so bland they blend with the background, the bad guys mistreat the apes with callous glee… and the hero, the cognitively enhanced chimpanzee Caesar (brought to disquieting verisimilitude of life by Andy Serkis), not only fights solely in defense of his chosen family… but to underline his messianic purity he has neither sex drive nor genitals. This kink underlines the high tolerance of US culture for violence compared to its instant vapors over any kind of sex; however, since Project Nim partly foundered on this particular shoal, perhaps it was a wise decision.

As it transpires, Caesar is exposed to little temptation to distract him from his pilgrimage: there are no female hominids in the film, except for the maternal vessel who undergoes the obligatory death as soon as she produces the hero and a cardboard cutout helpmate there to mouth the variants of “There are some things we weren’t meant to do” — and as assurance that the human protagonist is not gay, despite his nurturing proclivities. Mind you, the lack of a mother and her female alliances would make Caesar (augmented cortex notwithstanding) a permanent outcast among his fellows, who determine status matrilinearly given the lack of defined paternity.

Loyal to human tropes, Caesar goes from Charly to Che through the stations-of-the-cross character development arc so beloved of Campbellites. Nevertheless, we care what happens to him because Serkis made him compelling and literally soulful. Plus, of course, Caesar’s cause is patently just. The film is half Spartacus turning his unruly gladiators into a disciplined army, half Moses taking his people home — decorated with the usual swirls of hubris, unintended consequences, justice, equality, compassion, identity and empathy for the Other.

Needless to say, this reboot revived the topic of animal uplift, a perennial favorite of SF (and transhumanist “science” which is really a branch of SF, if not fantasy). Human interactions with animals have been integral to all cultures. Myths are strewn with talking animal allies, from Puss in Boots to A Boy and His Dog. Beyond their obvious practical and symbolic uses, mammals in particular are the nexus of both our notions of exceptionalism and our ardent wish for companionship. Our fraught relationship with animals also mirrors preoccupations of respective eras. In Wells’ Victorian England, The Island of Dr. Moreau struggled with vivisection whereas Linebarger’s Instrumentality Underpeople and the original Planet of the Apes focused on racism (plus, in the latter, the specter of nuclear annihilation). Today’s discussions of animal uplift are really a discussion over whether our terrible stewardship can turn benign — or at least neutral — before our inexorable spread damages the planet’s biosphere past recovery.

When SF posits sentient mammal-like aliens, it usually opts for predators high on the human totem pole (Anderson’s eagle-like Ythrians, Cherryh’s leonine Hani). On the other hand, SF’s foremost uplift candidates are elephants, cetaceans – and, of course, bonobos and chimpanzees. All four species share attributes that make them theoretically plausible future companions: social living, so they need to use complex communication; relative longevity, so they can transmit knowledge down the generations; tool use; and unmistakable signs of self-awareness.

Uplift essentially means giving animals human capabilities – primary among them high executive functions and language. One common misconception seems to be that if we give language to near-cousins, they will end up becoming hairy humans. Along those lines, in Rise chimpanzees, gorillas and orangutans are instantly compatible linguistically, emotionally, mentally and socially. In fact, chimpanzees are far closer to us than they are to the other two ape species (with orangutans being the most distant). So although this pan-panism serves the plot and prefigures the species-specific occupations shown in the Ape pre/sequels, real-life chances of such coordination, even with augmentation, are frankly nil.

There is, however, a larger obstacle. Even if a “smart bomb” could give instant language processing capability, it would still not confer the ability to enunciate clearly, which is determined by the configuration of the various mouth/jaw/throat parts. Ditto for bipedal locomotion. Uplift caused by intervention at whatever level (gene therapy, brain wiring, grafts) cannot bring about coordinated changes across the organism unless we enter the fantasy domain of shapeshifting. This means that a Lamarckian shift in brain wiring will almost certainly result in a seriously suboptimal configuration unlikely to thrive individually or collectively. This could be addressed by singlet custom generation, as is shown for reynards in Crowley’s Beasts, but it would make such specimens hothouse flowers unlikely to propagate unaided, much less become dominant.

In this connection, choosing to give Caesar speech was an erosion of his uniqueness. Of course, if bereft of our kind of speech he would not be able to give gruff Hestonian commands to his army: they would be reliant on line of sight and semaphoring equivalents. However, sticking to complex signed language (which bonobos at least appear capable of, if they acquire it within the same developmental time window as human infants) would keep Caesar and his people uncanny and alien, underlining the irreducible fact of their non-human sentience.

Which brings us to the second fundamental issue of uplift. Even if we succeed in giving animals speech and higher executive functions, they will not be like us. They won’t think, feel, react as we do. They will be true aliens. There is nothing wrong with that, and such congress might give us a preview of aliens beyond earth, should SETI ever receive a signal. However, given how humans treat even other humans (and possibly how Cro-Magnons treated Neanderthals), it is unlikely we’ll let uplifted animals go very far past pet, slave or trophy status. In this, at least, Caesar’s orangutan councillor is right: “Human no like smart ape,” no matter how piously we discuss the ethics of animal treatment and our responsibilities as technology wielders.

Images: top, Caesar; bottom, Neanderthal reconstruction (Kennis & Kennis, National Geographic). What gazes out of those eyes is — was — human.

The Unknown Archmage of Magic Realism

August 11th, 2011

Between long hours at the lab and a bout of lingering illness, I have let the blog go quiet for a while.

However, I haven’t been totally inactive. A year ago, I wrote an essay about the fact that writers feel free to use Hellenic contexts (myths, history, location), blithely assuming they know my culture well enough to do so convincingly. In that essay, I also stated that Hellás may be home to the best magic realist alive right now: Evgenía Fakínou. As a follow-up, I wrote a brief introduction to her work, which appeared today at the SFF Portal helmed by Val Grimm. Here is the conclusion:

“Fakínou’s books are full of vision quests, awakenings, boundary crossings. All have open endings, with their protagonists poised at thresholds on the last page. At the same time, they make their readers whole by reclaiming a past that might have led to an alternative future. Fakínou is a windwalker, a weaver of spider silk. I’m sorry she is not world-famous, but even sorrier for the dreamers who will never get a chance to lose – and find – themselves in her work.”

Image: Astradhení (Starbinder), Evgenía Fakínou’s first novel.

The Death Rattle of the Space Shuttle

July 25th, 2011

I get out of my car,
step into the night,
and look up at the sky.
And there’s something
bright, traveling fast.
Look at it go!
Just look at it go!

Kate Bush, Hello Earth

[The haunting a cappella chorus comes from a Georgian folk song, Tsin Tskaro (By the Spring)]

I read the various eulogies, qualified and otherwise, on the occasion of the space shuttle’s retirement.  Personally, I do not mourn the shuttle’s extinction, because it never came alive: not as engineering, not as science, not as a vision.

Originally conceived as a reusable vehicle that would lift and land on its own, the shuttle was crippled from the get-go.  Instead of being an asset for space exploration, it became a liability – an expensive and meaningless one, at that.  Its humiliating raison d’être was to bob in low earth orbit, becoming a toy for millionaire tourists by giving them a few seconds of weightlessness.  The space stations it serviced were harnessed into doing time-filling experiments that did not advance science one iota (with the notable exception of the Hubble), while most of their occupants’ time was spent scraping fungus off walls.  It managed to kill more astronauts than the entire Apollo program.  The expense of the shuttle launches crippled other worthwhile or promising NASA programs, and its timid, pious politics overshadowed any serious advances toward crewed space missions.

In the past, I had lively discussions with Robert Zubrin about missions to Mars (and Hellenic mythology… during which I discovered that he, like me, loves the Minoans).  We may have disagreed on approach and details, but on this he and I are in total agreement: NASA has long floated adrift, directionless and purposeless.  Individual NASA subprograms (primarily all the robotic missions), carried on in the agency’s periphery, have been wildly successful.  But the days when launches fired the imagination of future scientists are long gone.

It’s true that the Apollo missions were an expression of dominance, adjuncts to the cold war.  It’s also true that sending a crewed mission to Mars is an incredibly hard undertaking.  However, such an attempt — even if it fails — will address a multitude of issues: it will ask the tough question of how we can engineer sustainable self-enclosed systems (including the biological component, which NASA has swept under the rug as scientifically and politically thorny); it will allow us to definitively decide if Mars ever harbored life; it will once again give NASA – and the increasingly polarized US polity – a focus and a worthwhile purpose.

I’m familiar with all the counterarguments about space exploration in general and crewed missions in particular: these funds could be better used alleviating human misery on earth; private industry will eventually take up the slack; robotic missions are much more efficient; humans will never go into space in their current form, better if we wait for the inevitable uploading come the Singularity.

In reality, funds for space exploration are less than drops in the ocean of national spending and persistent social problems won’t be solved by such measly sums; private industry will never go past low orbit casinos (if that); as I explained elsewhere, we in our present form will never, ever get our brains/minds into silicon containers; and we will run out of resources long before such a technology is even on our event horizon, so waiting for gods… er, AI overlords won’t avail us.

Barring an unambiguous ETI signal, the deepest, best reason for crewed missions is not science. I recognize the dangers of using the term frontier, with all its colonialist, triumphalist baggage. Bravado aside, we will never conquer space. At best, we will traverse it like the Polynesians in their catamarans under the sea of stars. But space exploration — more specifically, a long-term crewed expedition to Mars with the express purpose to unequivocally answer the question of Martian life — will give a legitimate and worthy outlet to our ingenuity, our urge to explore and our desire for knowledge, which is not that high up in the hierarchy of needs nor the monopoly of elites. People know this in their very marrow – and have shown it by thronging around the transmissions of space missions that mattered.

It’s up to NASA to once again try rallying people around a vision that counts.  Freed of the burden of the shuttle, perhaps it can do so, thereby undergoing a literal renaissance.

“We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win.”

John Fitzgerald Kennedy, September 1962

Images: Pat Rawlings, Beyond; Randy Halverson, Plains Milky Way; European Space Agency, High Aurora.

In the Undertow of the Heat Wave

July 22nd, 2011

Bull Spec magazine is a massive labor of love and care by editor Sam Montgomery-Blinn.  Issue 6 (Fall 2011) has just come out and in it is a poem of mine, Spacetime Geodesics.  Another one, Night Patrol, will appear in issue 7.

I also had two papers accepted within two months… it’s a good feeling, since I’ve been gathering the data for them over six years, through illness and lab moves and lack of money.  Who knows, their publication might even translate to grant funding!

As mentioned in my previous entry, my Readercon talk took an unexpected direction: it became a discussion of safe exoticism in SF/F.  I intend to post an expanded version of the talk in two parts: the first will focus on science, the second on culture.

By coincidence, yesterday I started reading a perfect example of safe exoticism, Jill Paton Walsh’s YA novel The Emperor’s Winding Sheet.  I say started because I had to stop halfway, something that happens extremely rarely — it was that unbearable. Purporting to tell the end of Byzantium from the POV of an innocent abroad, it’s a lifeless, clunky, contorted mess plastered with undigested descriptions from tourist guides and history textbooks.  The specific topic aside, it’s also awful by-the-numbers fiction: lazy plot devices, wet-cement characters, non-stop clichés.

Since I believe in giving people long ropes, I will read Walsh’s other Hellenic-based novel, Farewell, Great King, though at this point my expectations are, let’s say… modest.

The Hidden Readercon Panel

July 3rd, 2011

Because of other commitments, I restricted my Readercon activities to three items: a talk at 1 pm on Sunday, which is threatening (promising?) to be something a bit different from what I originally envisioned; a Saturday lunch with Francesca Forrest; and a Saturday dinner at my home.

My guests last year were Joan Slonczewski, Jack McDevitt and Sue Lange and I half-joked I should have registered it as a panel. This year they are Joan Slonczewski, Anil Menon and Alex Jablokov (definitely);  Vandana Singh, Sonya Taaffe and Kay Holt (possibly).

Now if only the weather allows cooking of the ambitious meal the chef envisions…

The Sheep Look Up

June 25th, 2011

Some will rob you with a six-gun
And some with a fountain pen.

Woody Guthrie, from Pretty Boy Floyd

I will not go into detail about the situation in Greece. The Guardian’s columnists have been covering it decently, given their distance from the source. Closer up, so has The Press Project, whose documentary DebtOcracy puts the entire thing into perspective, showing up the contortions of the European Central Bank and the International Monetary Fund for what they are: attempts to safeguard large banks and their owners from any risks they incurred in the pursuit of exorbitant profits at the expense of societies and people.

I’m no blind patriot. I know what portion of this has come from bad internal policies and habits. Even so, I’m proud that my people are demonstrating non-stop, saying “Ohi!” (No) to the new occupiers-to-be just as they said it to the Italians and Germans when they demanded unconditional surrender. It is a source of great grief to me that my father, who was part of the resistance to the invaders and then helped rebuild the devastated country bit by hard-won bit as one of its most prominent engineers, will most likely leave life when his nation is at such a state.

I will leave you with a famous Cretan call to rebellion, sung by Nikos Ksilouris: “When will the skies clear, so we see the stars?” The people sang it before the Athens Parliament building.  Because humans need bread and roses — and recognize slavery when they see it, no matter what those in power call it.

Images: Acropolis moonrise (Antónis Ayiomamítis); protests in Syntagma (Constitution) Square in front of the Parliament building, Athens, June 2011 (Oréstis Panayótou).

Ghost in the Shell: Why Our Brains Will Never Live in the Matrix

June 23rd, 2011

Introductory note: Through Paul Graham Raven of Futurismic, I found out that Charles Stross recently expressed doubts about the Singularity, god-like AIs and mind uploading.  Being the incorrigible curious cat (this will kill me yet), I checked out the post.  All seemed more or less copacetic, until I hit this statement: “Uploading … is not obviously impossible unless you are a crude mind/body dualist. // Uploading implicitly refutes the doctrine of the existence of an immortal soul.”

Clearly the time has come for me to reprint my mind uploading article, which first appeared at H+ magazine in October 2009. Consider it a recapitulation of basic facts.

When surveying the goals of transhumanists, I found it striking how heavily they favor conventional engineering. This seems inefficient and inelegant, since such engineering reproduces slowly, clumsily and imperfectly what biological systems have fine-tuned for eons — from nanobots (enzymes and miRNAs) to virtual reality (lucid dreaming). An exemplar of this mindset was an article about memory chips. In it, the primary researcher made two statements that fall in the “not even wrong” category: “Brain cells are nothing but leaky bags of salt solution,” and “I don’t need a grand theory of the mind to fix what is essentially a signal-processing problem.”

And it came to me in a flash that most transhumanists are uncomfortable with biology and would rather bypass it altogether for two reasons, each exemplified by these sentences. The first is that biological systems are squishy — they exude blood, sweat and tears, which are deemed proper only for women and weaklings. The second is that, unlike silicon systems, biological software is inseparable from hardware. And therein lies the major stumbling block to personal immortality.

The analogy du siècle equates the human brain with a computer — a vast, complex one performing dizzying feats of parallel processing, but still a computer. However, that is incorrect for several crucial reasons, which bear directly upon mind portability. A human is not born as a tabula rasa, but with a brain that’s already wired and functioning as a mind. Furthermore, the brain forms as the embryo develops. It cannot be inserted after the fact, like an engine in a car chassis or software programs in an empty computer box.

Theoretically speaking, how could we manage to live forever while remaining recognizably ourselves to us? One way is to ensure that the brain remains fully functional indefinitely. Another is to move the brain into a new and/or indestructible “container”, whether carbon, silicon, metal or a combination thereof. Not surprisingly, these notions have received extensive play in science fiction, from the messianic angst of The Matrix to Richard Morgan’s Takeshi Kovacs trilogy.

To give you the punch line up front, the first alternative may eventually become feasible but the second one is intrinsically impossible. Recall that a particular mind is an emergent property (an artifact, if you prefer the term) of its specific brain – nothing more, but also nothing less. Unless the transfer of a mind retains the brain, there will be no continuity of consciousness. Regardless of what the post-transfer identity may think, the original mind with its associated brain and body will still die – and be aware of the death process. Furthermore, the newly minted person/ality will start diverging from the original the moment it gains consciousness. This is an excellent way to leave a clone-like descendant, but not to become immortal.

What I just mentioned essentially takes care of all versions of mind uploading, if by uploading we mean recreation of an individual brain by physical transfer rather than a simulation that passes Searle’s Chinese room test. However, even if we ever attain the infinite technical and financial resources required to scan a brain/mind 1) non-destructively and 2) at a resolution that will indeed recreate the original, additional obstacles still loom.

To place a brain into another biological body, à la Mary Shelley’s Frankenstein, could arise as the endpoint extension of appropriating blood, sperm, ova, wombs or other organs in a heavily stratified society. Besides being de facto murder of the original occupant, it would also require that the incoming brain be completely intact, as well as able to rewire for all physical and mental functions. After electrochemical activity ceases in the brain, neuronal integrity deteriorates in a matter of seconds. The slightest delay in preserving the tissue seriously skews in vitro research results, which tells you how well this method would work in maintaining details of the original’s personality.

To recreate a brain/mind in silico, whether a cyborg body or a computer frame, is equally problematic. Large portions of the brain process and interpret signals from the body and the environment. Without a body, these functions will flail around and can result in the brain, well, losing its mind. Without corrective “pingbacks” from the environment that are filtered by the body, the brain can easily misjudge to the point of hallucination, as seen in phenomena like phantom limb pain or fibromyalgia.

Additionally, without context we may lose the ability for empathy, as is shown in Bacigalupi’s disturbing story People of Sand and Slag. Empathy is as instrumental to high-order intelligence as it is to survival: without it, we are at best idiot savants, at worst psychotic killers. Of course, someone can argue that the entire universe can be recreated in VR. At that point, we’re in god territory … except that even if some of us manage to live the perfect Second Life, there’s still the danger of someone unplugging the computer or deleting the noomorphs. So there go the Star Trek transporters, there go the Battlestar Galactica Cylon resurrection tanks.

Let’s now discuss the possible: in situ replacement. Many people argue that replacing brain cells is not a threat to identity because we change cells rapidly and routinely during our lives — and that in fact this is imperative if we’re to remain capable of learning throughout our lifespan.

It’s true that our somatic cells recycle, each type on a slightly different timetable, but there are two prominent exceptions. The germ cells are one, which is why both genders – not just women – are progressively likelier to have children with congenital problems as they age. Our neurons are another. We’re born with as many of these as we’re ever going to have and we lose them steadily during our life. There is a tiny bit of novel neurogenesis in the olfactory system and possibly in the hippocampus, but the rest of our 100 billion microprocessors neither multiply nor divide. What changes are the neuronal processes (axons and dendrites) and their contacts with each other and with other cells (synapses).

These tiny processes make and unmake us as individuals. We are capable of learning as long as we live, though with decreasing ease and speed, because our axons and synapses are plastic as long as the neurons that generate them last. But although many functions of the brain are diffuse, they are organized in localized clusters (which can differ from person to person, sometimes radically). Removal of a large portion of a brain structure results in irreversible deficits unless it happens in very early infancy. We know this from watching people go through transient or permanent personality and ability changes after head trauma, stroke, extensive brain surgery or during the agonizing process of various neurodegenerative diseases, dementia in particular.

However, intrepid immortaleers need not give up. There’s real hope on the horizon for renewing a brain and other body parts: embryonic stem cells (ESCs, which I discussed recently). Depending on the stage of isolation, ESCs are truly totipotent – something, incidentally, not true of adult stem cells that can only differentiate into a small set of related cell types. If neuronal precursors can be introduced to the right spot and coaxed to survive, differentiate and form synapses, we will gain the ability to extend the lifespan of a brain and its mind.

It will take an enormous amount of fine-tuning to induce ESCs to do the right thing. Each step that I casually listed in the previous sentence (localized introduction, persistence, differentiation, synaptogenesis) is still barely achievable in the lab with isolated cell cultures, let alone the brain of a living human. Primary neurons live about three weeks in the dish, even though they are fed better than most children in developing countries – and if cultured as precursors, they never attain full differentiation. The ordeals of Christopher Reeve and Stephen Hawking illustrate how hard it is to solve even “simple” problems of either grey or white brain matter.

The technical hurdles will eventually be solved. A larger obstacle is that each round of ESC replacement will have to be very slow and small-scale, to fulfill the requirement of continuous consciousness and guarantee the recreation of pre-existing neuronal and synaptic networks. As a result, renewal of large brain swaths will require such a lengthy lifespan that the replacements may never catch up. Not surprisingly, the efforts in this direction have begun with such neurodegenerative diseases as Parkinson’s, whose causes are not only well defined but also highly localized: the dopaminergic neurons in the substantia nigra.

Renewing the hippocampus or cortex of an Alzheimer’s sufferer is several orders of magnitude more complicated – and in stark contrast to the “black box” assumption of the memory chip researcher, we will need to know exactly what and where to repair. To go through the literally mind-altering feats shown in Whedon’s Dollhouse would be the brain equivalent of insect metamorphosis: it would take a very long time – and the person undergoing the procedure would resemble Terri Schiavo at best, if not the interior of a pupating larva.

Dollhouse got one fact right: if such rewiring is too extensive or too fast, the person will have no memory of their prior life, desirable or otherwise. But as is typical in Hollywood science (an oxymoron, but we’ll let it stand), it got a more crucial fact wrong: such a person is unlikely to function like a fully aware human or even a physically well-coordinated one for a significant length of time – because her brain pathways will need to be validated by physical and mental feedback before they stabilize. Many people never recover full physical or mental capacity after prolonged periods of anesthesia. Having brain replacement would rank way higher in the trauma scale.

The most common ecological, social and ethical argument against individual quasi-eternal life is that the resulting overcrowding will mean certain and unpleasant death by other means unless we are able to access extra-terrestrial resources. Also, those who visualize infinite lifespan invariably think of it in connection with themselves and those whom they like – choosing to ignore that others will also be around forever, from genocidal maniacs to cult followers, to say nothing of annoying in-laws or predatory bosses. At the same time, long lifespan will almost certainly be a requirement for long-term crewed space expeditions, although such longevity will have to be augmented by sophisticated molecular repair of somatic and germ mutations caused by cosmic radiation. So if we want eternal life, we had better first have the Elysian fields and chariots of the gods that go with it.

Images: Echo (Eliza Dushku) gets a new personality inserted in Dollhouse; any port in a storm — Spock (Leonard Nimoy) transfers his essential self to McCoy (DeForest Kelley) for safekeeping in The Wrath of Khan; the resurrected Zoe Graystone (Alessandra Torresani) gets an instant memory upgrade in Caprica; Jake Sully (Sam Worthington) checks out his conveniently empty Na’vi receptacle in Avatar.

Why I Won’t Be Taking the Joanna Russ Pledge

June 16th, 2011

A Geology Lesson

Here, the sea strains to climb up on the land
and the wind blows dust in a single direction.
The trees bend themselves all one way
and volcanoes explode often.
Why is this? Many years back
a woman of strong purpose
passed through this section
and everything else tried to follow.

— Judy Grahn, from She Who

Between the physical death of Joanna Russ and the latest endless lists and discussions about women’s visibility and recognition in SF/F, well-meaning people have come up with the Russ Pledge. Namely, a pledge to acknowledge and promote women’s work.

As recent history has shown, Twitter notices don’t start revolutions, let alone sustain them. Even if they did, I won’t be taking the Russ pledge for the simplest of reasons. I have been implementing it for the last forty-plus years. It’s not a cute button on my lapel. It’s not a talking point in my public persona. I cannot take it off when I take off my clothes. It’s not an option. It’s an integral component of my bone marrow that has shaped my personal and professional life.

Long before her death, Russ had been marginalized for being too prickly, a prominent target of the “tone” argument. Even many women found her uncomfortable — she might annoy the Powers that Be and compromise paltry gains. As if good behavior brought acceptance to boys’ treehouses. As if she didn’t threaten the status quo by her mere existence, let alone her uncompromising stories, essays and reviews. Most people know of The Female Man and How to Suppress Women’s Writing, if only by rumor, but the rest of her opus is just as radical. If you want to have your preconceptions soothed by feel-good feminism, Russ is not your woman.

It’s not surprising that eventually she burned out (“chronic fatigue syndrome”), like most people in equivalent circumstances. She kept showcasing true aliens — women as autonomous beings with agency! — and asking questions outside the box. She kept pointing out that even if you have been “promoted” from field hand to house servant you can still be sold down the river. An uncomfortable reminder for those who keep clinging to the hope of “change from within”, the illusion that being abjectly nice to the ensconced gatekeepers and kicking the more disenfranchised below will ensure decent treatment, or even survival.

Joanna Russ paved the way for all who walk the path of real change not merely with words, but with her body. Like the women in folk ballads who got buried alive so that bridges would stand, she deserves more than pious twitterings now that she’s safely dead. I recognize the good intentions of those who promote this pledge in her name. But enough already with “mistress lists” and their ilk. If people want to really do something, I suggest (and mind you, this is a suggestion, not the forcible penectomy some obviously consider it to be) that they read women’s books. Publish them for real money, as in pro-rate presses – pathetic as pro rates are, these days. Review them in major outlets. Nominate them for prestigious awards. Hire them as editors, columnists and reviewers (not slush readers or gofers) in major venues, in more than token numbers.  Teach them in courses.

Unconscious bias is a well-documented phenomenon and is alive and thriving even (especially) in self-labeled “progressive” communities. Women have shown up the arguments for intrinsic inferiority by winning open chairs in orchestras when performing behind curtains and winning major literary awards when hiding behind pseudonyms. But this does not change the dominant culture. And it does not make up for the oceans of creative talent that got lost — suppressed or squandered in anger or hopelessness.

I will let Russ herself have the last word. It’s striking how ageless she remains:

“Leaning her silly, beautiful, drunken head on my shoulder, she said, “Oh, Esther, I don’t want to be a feminist. I don’t enjoy it. It’s no fun.”

“I know,” I said. “I don’t either.” People think you decide to be a “radical,” for God’s sake, like deciding to be a librarian or a ship’s chandler. You “make up your mind,” you “commit yourself” (sounds like a mental hospital, doesn’t it?).

I said Don’t worry, we could be buried together and have engraved on our tombstone the awful truth, which some day somebody will understand:

WE WUZ PUSHED.”

from On Strike Against God

The Hard Underbelly of the Future: Sue Lange’s Uncategorized

June 14th, 2011

Sue Lange’s collection Uncategorized (Book View Café, 2009, $1.99 digital edition) contains fifteen stories published in various venues (among them Apex, Astounding Tales, Sentinel, Mbrane and Aoife’s Kiss). If I wanted to categorize them, I’d call them quasi-mundane near-future SF – but some of the unifying threads that run through them are unusual.

One is Lange’s love and professional knowledge of music, which pops up in unexpected spots in the stories and is the focus of one of them (“The Failure”).  Another is the matter-of-fact attitude toward technology: the people and societies in Uncategorized have come to terms with genetic engineering and its cousins, although they are aware of their problems.  Finally, the points of view are resolutely working class (many springing directly from Lange’s varied work experience). This doesn’t merely mean that the protagonists/narrators are blue collar. Instead, almost all the stories in Uncategorized center around work issues for people whose jobs are not a way of “expressing themselves” but a way to keep food on the table. These are people who cannot afford ennui or angst, who must punch time cards and undergo intrusive HR evaluations.

There’s an additional feature that makes Lange’s blue-collar protagonists stand out: most are women who do “traditionally masculine” work: meat plant workers, plumbers, soldiers, radiation cleanup crews. Furthermore, these women focus on their jobs and many of their co-workers and friends are women as well. In other words, Lange’s stories handily pass the Bechdel test without falling even remotely into the arbitrarily devalued subgenre of chicklit. If you took Rosie the Riveter and transposed her to a near-future alternate US (minus such outworn cyberpunk accessories as pneumatic-boob avatars and Matrix-style gyrations), you’d have the setting for most of the stories in Uncategorized.

Contributing to this gestalt are Lange’s deadpan humor and rapid-fire dialogue, which require some acclimatization but can become as catchy as strong beats and riffs. Her language is unvarnished Bauhaus – there’s scarcely a descriptive adjective or adverb to be found. Ditto for the settings, which are urban grit to the max even in the stories set off-Earth. One recurrent weakness is hurried endings, often accompanied by twists that were predictable (to me at least). Almost all the stories in Uncategorized would have increased their impact if they were longer and/or less sparse, because they grapple with important issues in original ways without fanfare.

Although Uncategorized hews to the premise of its title, some of its stories are thematically paired – one version comic, the other tragic. “The Club” / “How to Dispose of Sneakers” deal with the intractable problem of humanity’s ecological footprint; “BehaviorNorm” / “Buyer’s Club” tackle another intractable problem, the callousness of administrative management (think Dilbert with a touch of Big Brother transhumanism). For me, the standouts in the collection were: “Peroxide Head”, a poignant vignette on what balancing issues might really be like for a liaison to “Others” in Banks’ Culture universe; “The Meateaters”, a no-holds-barred Outland retelling of Euripides’ Bacchae; “Buyer’s Club”; “Pictures”, a valentine to second chances; and “Zara Gets Laid”, in which sexual intercourse boosts the immunity of bio-augmented radiation cleanup workers (based on solid extrapolation, no less!).

Despite its deceptively plain trappings, Uncategorized subtends a wide arc and is textbook-classic SF: its stories follow “what if” questions to their logical conclusions, pulling no punches. It’s a prickly, bracing read that walks a fine line between bleakness and pragmatism, and it deserves the wider readership it might well have got if its author had been of the other gender.

Miranda Wrongs: Reading Too Much into the Genome

June 10th, 2011

Introductory note: My micro-bio in several venues reads “Scientist by day, writer by night.” It occurred to me that for lo these many years I have discussed science, scientific thinking and process, space exploration, social issues, history and culture, books, films and games… but I have never told my readers exactly what I do during the day.

What I do during the day (and a good part of the night) is investigate a process called alternative splicing and its repercussions on brain function. I will unpack this in future posts (interspersed among the usual musings on other topics), and I hope that this excursion may give a glimpse of how complex biology is across scales.

To start us off, I reprint an article commissioned by R. U. Sirius that first appeared in H+ Magazine in April 2010. An academic variant of this article appeared in Politics and the Life Sciences in response to Mark Walker’s “Genetic Virtue” proposal.

“We meant it for the best.” – Dr. Caron speaking of the Miranda settlers, in Whedon’s Serenity

When the sequence of the human genome was declared essentially complete in 2003, all biologists (except perhaps Craig Venter) heaved a sigh of gladness that the data were all on one website, publicly available, well-annotated and carefully cross-linked. Some may have hoisted a glass of champagne. Then they went back to their benches. They knew, if nobody else did, that the work was just beginning. Having the sequence was the equivalent of sounding out the text of an alphabet whose meaning was still undeciphered. For the linguistically inclined, think of Etruscan.

The media, with a few laudable exceptions, touted this as “we now know how genes work” and many science fiction authors duly incorporated it into their opuses. So did people with plans for improving humanity. Namely, there are initiatives that seriously propose that such attributes as virtue, intelligence, specific physical and mental abilities or, for that matter, a “happy personality” can (and should) be tweaked by selection in utero or engineering of the genes that determine these traits. The usual parties put forth the predictable pro and con arguments, and many articles get published in journals, magazines and blogs.

This is excellent for the career prospects and bank accounts of philosophers, political scientists, biotech entrepreneurs, politicians and would-be prophets. However, biologists know that all this is a parlor game equivalent to determining the number of angels dancing on the top of a pin. The reason for this is simple: there are no genes for virtue, intelligence, happiness or any complex behavioral trait. This becomes obvious by the number of human genes: the final count hovers around 20-25,000, less than twice as many as the number in worms and flies. It’s also obvious by the fact that cloned animals don’t look and act like their prototypes, Cc being the most famous example.

Genes encode catalytic, structural and regulatory proteins and RNAs. They do not encode the nervous system; even less do they encode complex behavior. At the level of the organism, they code for susceptibilities and tendencies — that is, with a few important exceptions, they are probabilistic rather than deterministic. And although many diseases develop from malfunctions of single genes, this does not indicate that single genes are responsible for any complex attribute. Instead they’re the equivalent of screws or belts, whose loss can stop a car but does not make it run.

No reputable biologist suggests that genes are not decisively involved in outcomes. But the constant harping on trait heritability “in spite of environment” is a straw man. Its main prop, the twin studies, is far less robust than commonly presented — especially when we take into account that identical twins often know each other before separation and, even when adopted, are likely to grow up in very similar environments (to say nothing of the data cherry-picking for publication). The nature/nurture debate has been largely resolved by the gene/environment (GxE) interplay model, a non-reductive approximation closer to reality. Genes never work in isolation but as complex, intricately regulated cooperative networks and they are in constant, dynamic dialogue with the environment — from diet to natal language. That is why second-generation immigrants invariably display the body morphology and disease susceptibilities of their adopted culture, although they have inherited the genes of their natal one.
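
To make the GxE point concrete, here is a minimal toy sketch in Python. The genotype and environment labels, the effect sizes and the additive form are all invented for illustration (nothing comes from real data); the sketch only demonstrates the one thing that matters for the argument, namely that once an interaction term is present, which genotype scores higher depends entirely on the environment.

```python
# Toy gene-by-environment (GxE) sketch. All names and numbers are invented
# for illustration; this is not a model of any real trait or dataset.

BASELINE = 100.0

def phenotype(genotype_effect, env_effect, interaction):
    """Additive toy model: baseline + genotype + environment + GxE term."""
    return BASELINE + genotype_effect + env_effect + interaction

# Two hypothetical genotypes in two hypothetical environments. The interaction
# term depends on the (genotype, environment) pair, not on either alone.
cases = {
    ("G1", "env_A"): phenotype(+5.0, +2.0, interaction=0.0),
    ("G1", "env_B"): phenotype(+5.0, -2.0, interaction=-8.0),
    ("G2", "env_A"): phenotype(-1.0, +2.0, interaction=0.0),
    ("G2", "env_B"): phenotype(-1.0, -2.0, interaction=+6.0),
}

for (genotype, env), score in cases.items():
    print(f"{genotype} in {env}: {score:.1f}")

# Prints a rank reversal: G1 beats G2 in env_A (107.0 vs 101.0), but loses to
# it in env_B (95.0 vs 103.0). There is no "better genotype" in the abstract.
```

The toy says nothing about mechanism; it only illustrates why asking which genotype is better, full stop, has no answer outside a specified environment.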

Furthermore, there’s significant redundancy in the genome. Knockouts of even important single genes in model organisms often have practically no phenotype (or a very subtle one) because related genes take up the slack. The “selfish gene” concept as presented by reductionists of all stripes is arrant nonsense. To stay with the car analogy, it’s the equivalent of a single screw rotating in vacuum by itself. It doth not even a cart make, let alone the universe-spanning starship that is our brain/mind.

About half of our genes contribute directly to brain function; the rest do so indirectly, since brain function depends crucially on signal processing and body feedback. This makes the brain/mind a bona fide complex (though knowable) system. This attribute underlines the intrinsic infeasibility of instilling virtue, intelligence or good taste in clothes by changing single genes. If genetic programs were as fixed, simple and one-to-one mapped as reductionists would like, we would have answered most questions about brain function within months after reading the human genome. As a pertinent example, studies indicate that the six extended genomic regions that were defined by SNP (single nucleotide polymorphism) analysis to contribute the most to IQ — itself a population-sorting tool rather than a real indicator of intelligence — influence IQ by a paltry 1%.

The attempts to map complex behaviors for the convenience and justification of social policies began as soon as societies stratified. To list a few recent examples, in the last decades we’ve had the false XYY “aggression” connection, the issue of gay men’s hypothalamus size, and the sloppy and dangerous (but incredibly lucrative) generalizations about brain serotonin and “nurturing” genes. Traditional breeding experiments (cattle, horses, cats, dogs, royal families) have an in-built functional test: the progeny selected in this fashion must be robust enough to be born, survive and reproduce. In the cases where these criteria were flouted, we got such results as vision and hearing impairments (Persian and Siamese cats), mental instability (several dog breeds), physical fragility and Alexei Romanov.

I will leave aside the enormous and still largely unmet technical challenge of such implementation, which is light years distant from casual notes that airily prescribe “just add tetracycline to the inducible vector that carries your gene” or “inject artificial chromosomes or siRNAs.” I play with all these beasties in the lab, and can barely get them to behave in homogeneous cell lines. Because most cognitive problems arise not from huge genomic errors but from small shifts in ratios of “wild-type” (non-mutated) proteins which affect brain architecture before or after birth, approximate engineering solutions will be death sentences. Moreover, the proposals usually advocate that such changes be done in somatic cells, not the germline (which would make them permanent). This means intervention during fetal development or even later — a far more difficult undertaking than germline alteration. The individual fine-tuning required for this in turn brings up differential resource access (and no, I don’t believe that nanotech will give us unlimited resources).

Let’s now discuss the improvement touted in “enhancement” of any complex trait. All organisms are jury-rigged across scales: that is, the decisive criterion for an adaptive change (from a hemoglobin variant to a hip-bone angle) is function, rather than elegance. Many details are accidental outcomes of an initial chance configuration — the literally inverted organization of the vertebrate eye is a prime example. Optimality is entirely context-dependent. If an organism or function is perfected for one set of circumstances, it immediately becomes suboptimal for all others. That is the reason why gene alleles for cystic fibrosis and sickle cell anemia persisted: they conferred heterozygotic resistance to cholera and malaria, respectively. Even if it were possible to instill virtue or musicality (or even the inclination for them), fixing them would decrease individual and collective fitness. Furthermore, the desired state for all complex behaviors is fluid and relative.
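
As an aside for readers who want the arithmetic behind that persistence: the textbook balancing-selection calculation (standard population-genetics symbols, not values measured in any real sickle-cell population) shows why an allele that is costly in homozygotes settles at a stable intermediate frequency when the heterozygote is the fittest genotype.

```latex
% Heterozygote advantage (balancing selection), textbook form.
% Relative fitnesses: w_{AA} = 1 - s,  w_{AS} = 1,  w_{SS} = 1 - t,
% with p the frequency of allele A and q = 1 - p that of allele S.
% At equilibrium the marginal fitnesses of the two alleles are equal:
\begin{align*}
  \bar{w}_A &= p(1 - s) + q, \qquad \bar{w}_S = p + q(1 - t), \\
  \bar{w}_A = \bar{w}_S
    &\;\Longrightarrow\; ps = qt
    \;\Longrightarrow\; \hat{q} = \frac{s}{s + t}.
\end{align*}
```

The stronger the selective pressure s (malaria killing non-carriers) relative to the homozygote cost t, the higher the equilibrium frequency of the sickle allele; remove the malaria and the identical allele becomes a plain liability, which is the context-dependence of optimality described above.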

The concept that pressing the button of a single gene can change any complex behavior is entirely unsupported by biological evidence at any scale: molecular, cellular, organismic. Because interactions between gene products are complex, dynamic and give rise to pleiotropic effects, such intervention can cause significant harm even if implemented with full knowledge of genomic interactions (which at this point is not even partially available). It is far more feasible to correct an error than to “enhance” an already functioning brain. Furthermore, unlike a car or a computer, brain hardware and software are inextricably intertwined and cannot be decoupled or deactivated during modification.

If such a scenario is optional, it will introduce extreme de facto or de jure inequalities. If it is mandatory, beyond the obvious fact that it will require massive coercion, it will also result in the equivalent of monocultures, which is the surest way to extinction regardless of how resourceful or dominant a species is. And no matter how benevolent the motives of the proponents of such schemes are, all utopian implementations, without exception, degenerate into slaughterhouses and concentration camps.

The proposals to augment “virtue” or “intelligence” fall solidly into the linear progress model advanced by monotheistic religions, which takes for granted that humans are in a fallen state and need to achieve an idealized perfection. For the religiously orthodox, this exemplar is a god; for the transhumanists, it’s often a post-singularity AI. In reality, humans are a work in continuous evolution both biologically and culturally and will almost certainly become extinct if they enter any type of stasis, no matter how “perfect.”

But higher level arguments aside, the foundation stone of all such discussions remains unaltered and unalterable: any proposal to modulate complex traits by changing single genes is like preparing a Mars expedition based on the Ptolemaic view of the universe.

Images: The “in-valid” (non-enhanced) protagonist of GATTACA; M. C. Escher, Drawing Hands; Musical Genes cartoon by Polyp; Rainbow nuzzles her clone Cc (“carbon copy”) at Texas A&M University.

A Plague on Both Your Houses – Reprise

May 28th, 2011

Note: This article originally appeared in the Apex blog, with different images. The site got hacked since then and its owner did not feel up to reconstituting the past database.  I reprint it as a companion piece to Sam Kelly’s Privilege and Fantasy.

Don’t you know
They’re talkin’ ’bout a revolution
– Tracy Chapman

In James Tiptree’s “Houston, Houston, Do You Read?” three male astronauts are thrown forward in time and return to an earth in which an epidemic has led to the extinction of men. They perceive a society that needs firm (male) guidance to restore correct order and linear progress. In fact, the society is a benevolent non-coercive non-hierarchical anarchy with adequate and stable resources; genetic engineering and cloning are advanced, spaceships are a given, there’s an inhabited Lunar base and multiple successful expeditions to Venus and Mars. One of the men plans to bring the women back under god’s command (with him as proxy) by applying Pauline precepts. Another plans to rut endlessly in a different kind of paradise. The women, after giving them a long rope, decide they won’t resurrect the XY genotype.

The skirmish in the ongoing war about contemporary fantasy between Leo Grin and Joe Abercrombie reminds me of Tiptree’s story. Grin and Abercrombie argued over fantasy as art, social construct and moral fable totally oblivious to the relevant achievements of half of humanity – closer to ninety percent, actually, when you take into account the settings of the works they discussed. No non-male non-white non-Anglosaxon fantasy writers were mentioned in their exchanges and in almost all of the reactions to their posts (I found only two partial exceptions).

I expected this from Grin. After all, he wrote his essay under the auspices of Teabagger falsehood-as-fact generator Andrew Breitbart. His “argument” can be distilled to “The debasement of heroic fantasy is a plot of college-educated liberals!” On the other hand, Abercrombie’s “liberalism” reminds me of the sixties free-love dictum that said “Women can assume all positions as long as they’re prone.” The Grin camp (henceforth Fathers) conflates morality with religiosity and hearkens nostalgically back to Tolkien who essentially retold Christian and Norse myths, even if he did it well. The Abercrombie camp (henceforth Sons) equates grittiness with grottiness and channels Howard – incidentally, a basic error by Grin who put Tolkien and Howard in the same category in his haste to shoehorn all of today’s fantasy into the “decadent” slot. In fact, Abercrombie et al. are Howard’s direct intellectual descendants, although Grin’s two idols were equally reactionary in class-specific ways. Fathers and Sons are nevertheless united in celebrating “manly” men along the lines demarcated by Tiptree.

As I’ve said elsewhere, I enjoy playing RPGs in many guises. But even for games – let alone for reading – I prefer constructs that are nuanced and, equally importantly, worlds in which I can see myself living and working. Both camps write stories set in medieval worlds whose protagonists are essentially Anglosaxon white men with a soupçon of Norse or Celt to spice the bland gruel. To name just a few examples, this is true of Tolkien’s Middle-earth, Howard’s Conan stories, Moorcock’s Elric saga, Leiber’s Fafhrd series, Jordan’s Wheel of Time toe-bruisers, Martin’s fast-diminishing-returns Ice and Fire cycle. The sole difference is approach, which gets mistaken for outlook. If I may use po-mo terms, the Fathers represent constipation, the Sons diarrhea; Fathers the sacred, Sons the profane – in strictly masculinist terms. In either universe, women are deemed polluting (that is, distracting from bromances) or furniture items. The fact that even male directors of crowd-pleasers have managed to create powerful female heroes, from Jackson’s Éowyn to Xena (let alone the women in wuxia films), highlights the tame and regressive nature of “daring” male-written fantasy.

Under the cover of high-mindedness, the Fathers posit that worthy fantasy must obey the principles of abrahamic religions: a rigid, stratified society where everyone knows their place, the color of one’s skin determines degree of goodness, governments are autocratic and there is a Manichean division between good and evil: the way of the dog, a pyramidal construct where only alpha males fare well and are considered fully human. The Sons, under the cover of subversive (if only!) deconstruction, posit worlds that embody the principles of a specific subset of pagan religions: a society permanently riven by discord and random cruelty but whose value determinants still come from hierarchical thinking of the feudal variety: the way of the baboon, another (repeat after me) pyramidal construct where only alpha males fare well and are considered fully human. Both follow Campbell’s impoverished, pseudo-erudite concepts of the hero’s quest: the former group accepts them, the latter rejects them but only as the younger son who wants the perks of the first-born. Both think squarely within a very narrow box.

Other participants in this debate already pointed out that Tolkien is a pessimist and Howard a nihilist, that outstanding earlier writers wrote amoral works (Dunsany was mentioned; I’d add Peake and Donaldson) and that the myths which form the base of most fantasy are riddled with grisly violence. In other words, it looks like Grin at least hasn’t read many primary sources and both his knowledge and his logic are terminally fuzzy, as are those of his supporters.

A prominent example was the accusation from one of Grin’s acolytes that contemporary fantasy is obsessed with balance which is “foreign to the Western temperament” (instead of, you know, ever thrusting forward). He explicitly conflated Western civilization with European Christendom, which should automatically disqualify him from serious consideration. Nevertheless, I will point out that pagan Hellenism is as much a cornerstone of Western civilization as Christianity, and Hellenes prized balance. The concept of “Midhén ághan” (nothing in excess) was crucial in Hellenes’ self-definition: they watered their wine, ate abstemiously, deemed body and mind equally important and considered unbridled appetites and passions detriments to living the examined life. At the same time, they did not consider themselves sinful and imperfect in the Christian sense, although Hellenic myths carry strong strains of defiance (Prometheus) and melancholy (their afterworld, for one).

Frankly, the Grin-Abercrombie fracas reminds me of a scene in Willow. At the climax of the film, while the men are hacking at each other down at the courtyard, the women are up at the tower hurling thunderbolts. By the time the men come into the castle, the battle has been waged and won by women’s magic.

So enough already about Fathers and Sons in their temples and potties. Let’s spend our time more usefully and pleasantly discussing the third member of the trinity. Before she got neutered, her name was Sophia (Wisdom) or Shekinah (Presence). Let’s celebrate some people who truly changed fantasy – to its everlasting gain, as is the case with SF.

My list will be very partial and restricted to authors writing in English whose works I’ve read, which shows we are dealing with an embarrassment of riches. I can think of countless women who have written paradigm-shifting heroic fantasy, starting with Emily Brontë, who wrote about a world of women heroes in those tiny hand-sewn diaries. Then came trailblazers Catherine Moore, Mary Stewart and André Norton. Ursula Le Guin’s Earthsea is another gamechanger (although her gender-specific magic is problematic, as I discussed in Crossed Genres) and so is her ongoing Western Shore series. Katherine Kurtz’s Deryni cycle is as fine a medieval magic saga as any. We have weavers of new myths: Jane Yolen, Patricia McKillip, Meredith Ann Pierce, Alma Alexander; and tellers of old myths from fresh perspectives: Tanith Lee, Diana Paxson, Marion Zimmer Bradley, Terri Windling, Emma Bull, C. J. Cherryh, Christine Lucas.

Then there’s Elizabeth Lynn, with her Chronicles of Tornor and riveting Ryoka stories. Marie Jakober, whose Even the Stones has haunted me ever since I read it. Elizabeth Marshall Thomas, whose heroic prehistoric fantasies have never been bested. Jacqueline Carey, who re-imagined the Renaissance from Eire to Nubia and made a courtesan into a swashbuckler in the first Kushiel trilogy, showing a truly pagan universe in the bargain. This without getting into genre-cracking mythmakers like Isak Dinesen (Karen Blixen) and Louise Erdrich.

These authors share several attributes: they have formidable writing skills and honor their sources even as they transmute them. Most importantly, they break the tired old tropes and conventional boundaries of heroic fantasy and unveil truly new vistas. They venture past medieval settings, hierarchical societies, monotheistic religions, rigid moralities, “edgy” gore, Tin John chest beatings, and show us how rich and exciting fantasy can become when it stops being timid and recycling stale recipes. As one of the women in Tiptree’s “Houston, Houston” says: “We sing a lot. Adventure songs, work songs, mothering songs, mood songs, trouble songs, joke songs, love songs – everything.”

Everything.

Images: Éowyn, shieldmaiden of Rohan (Miranda Otto) in The Two Towers; Sonja, vampire paladin (Rhona Mitra) in Rise of the Lycans; Yu Shu Lien, Wudan warrior (Michelle Yeoh) in Crouching Tiger, Hidden Dragon.

Area 51: Teen Commies from Outer Space!

May 19th, 2011

(with due props to Douglas Kenney, National Lampoon co-founder)

Driving to work earlier this week, I heard Terry Gross of Fresh Air (NPR) interview Annie Jacobsen about her new book, Area 51: An Uncensored History of America’s Top Secret Military Base. Jacobsen is a national security reporter, which means she is used to facing stonewalling and secrecy and therefore well aware that she must triple-check her information. It all sounded like sober investigative reporting, until it got to the coda. Bear with me, grasshoppahs, because the truth is definitely way out there.

As you know, Bobs, Area 51 is a military installation in Nevada next to the Yucca Flats where most of the US nuclear tests have been conducted. Area 51 was the home of the U-2 and Oxcart military aircraft testing programs and its resident experts appear to have reverse-engineered Soviet MiGs. Some conspiracy lovers opine that the lunar landing was “faked” there. Not surprisingly, its specifics are heavily classified, including an annually-renewed presidential EPA exception to disclosing (ab)use of toxic agents. People in the ostensibly free world can’t even get decent aerial pictures of it — which of course did not deter satellites of other nations, but who cares for rationality where national security is concerned?

To UFO believers, Area 51 is also the facility that analyzed whatever crashed near Roswell in 1947. Which is where Jacobsen’s theory comes in, backed by a single anonymous source. She proposes that the Roswell object was neither a weather balloon nor an alien spacecraft but a remotely flown Soviet craft based on prototypes by the Horten brothers, aircraft designers and Nazi party members. This part is old news, since this possibility was already considered in cold war US investigations.

Jacobsen’s addition (asserted with a completely straight face and demanding to be taken seriously) is that this craft contained “genetically/surgically altered” teenagers engineered by Josef Mengele at the command of that other monstrous Joseph, Stalin. The modifications had produced uniform results of “abnormally large heads and eyes” etc. The goal was to scare the US and weaken its defenses by a repetition of the panic created by Orson Welles’ War of the Worlds 1938 broadcast.

Got that? I can safely bet that Hollywood agents are bidding frantically for the rights to the screenplay even as we speak. And so they should — it’s a guaranteed blockbuster. It has everything: UFOs, Nazis, Frankenstein monsters, government conspiracies… It ties so many loose ends together so neatly that it’s irresistible.

I will leave to other experts the issues of quoting a single anonymous source and the previous debunkings of similar Area 51 “insiders” like Robert Lazar et al. The part that made me laugh out loud was the “genetically/surgically altered” cherry on top of that fabulous cake. To her credit, Gross pressed Jacobsen on this, only to get a rewinding of the tape without any real explanation beyond “I trusted my Cigarette-Smoking Man because I spent two years talking to him.”

For those who don’t live in a parallel universe, the fact that DNA is the carrier of heredity for most terrestrial lifeforms was established in the fifties, which (*counting on my fingers*) came after 1947. So Mengele or anyone else could not have engaged in any form of targeted genetic engineering; that only became possible, in its crudest form, in the eighties. If “genetic” is intended to mean plain ol’ interbreeding, humans take a bit more than two years (the interval from the time the Russians walked into Berlin till the Roswell crash) to 1) produce children, 2) have the children grow into teenagers and, just as crucially, 3) reliably reproduce traits.

Starvation or breaking of bones during childhood can lead to stunting (as Toulouse-Lautrec’s case demonstrates) but I know of no surgery that can increase head size — hydrocephalus kills its sufferers in rather short order. Grafting was so primitive back then that it’s unlikely its recipients would have survived long enough for a transatlantic trip. The only scenario I can envision that would result in anything remotely tangential to Jacobsen’s contention is if the Soviets used youngsters suffering from growth factor or growth hormone deficiencies — genetic conditions that arise without the intervention of experimentation.

Don’t misunderstand me, I know the idiocies that military and “intelligence” agencies are capable of — from marching soldiers to ground zero well after the consequences of radioactive fallout had become obvious, to the frightening abuses of MK-ULTRA, to the Stargate “Jedi warriors” who stared at spoons and goats. But all these are extensively documented, as well as compatible with the technology available at the time they occurred. Jacobsen’s theory is as grounded as the alien craft alternative it purports to debunk. Pity that Gross didn’t invite a biology undergrad to the program.

My theory (and I’ll be happy to talk to Hollywood agents about it) is that the engineered youngsters decided to defect, commandeered the craft and crashed it while drunk on freedom and contraband beer. I even have my own impeccable source: the small store owner at the outskirts of Roswell who sold them the beer. Smoking Parodies and drinking his regular shot of Colt 45 from an oil can, he confided wistfully: “They just wanted to see the Vegas shows, like any kid their age.”

Images: top, Independence Day — the alien craft secreted and reverse-engineered in Area 51; bottom, another possible explanation for the Roswell crash: abduction lesson troubles (from Pixar’s Lifted).

What’s Sex Got to Do with It?

May 17th, 2011

(sung to Tina Turner’s à propos catchy tune)

Two events unfolded simultaneously in the last few days: Arnold Schwarzenegger’s admission that he left a household servant with a “love child” and Dominique Strauss-Kahn’s arrest for attempting to rape a hotel maid. Before that, we had the almost weekly litany of celebrity/tycoon/politician/figurehead caught with barely-of-age girl(s)/boy(s). In a sadly familiar refrain, an ostensibly liberal commentator said:

“…we know that powerful men do stupid, self-destructive things for sexual reasons every single day. If we’re looking for a science-based explanation, it probably has more to do with evolutionarily induced alpha-male reproductive mandates than any rational weighing of pros and cons.”

Now I hate to break it to self-labeled liberal men, but neither love nor sex has anything to do with sexual coercion, and Kanazawa-style Tarzanism trying to pass for “evolutionary science” won’t cut it. Everyone with a functioning frontal cortex knows by now that rape is totally decoupled from reproduction. The term “love child”, repeated ad nauseam by the media, is obscene in this context.

Leaving love aside, such encounters are not about sex either. For one, coerced sex is always lousy; for another, no reproductive mandate is involved, as the gang rapes of invading armies show. What such encounters are about, of course, is entitlement, power and control: the prerogative of men in privileged positions to use others (women in particular) as toilet paper with no consequences to themselves short of the indulgent “He’s such a ladies’ man…” and its extension: “This was a trap. Such men don’t need to rape. Women fling themselves in droves at alpha males!”

As I keep having to point out, there are no biological alpha males in humans no matter what Evo-Psycho prophet-wannabees preach under the false mantra of “Real science is not PC, let the chips fall where they may”. Gorillas have them. Baboons have them, with variances between subgroups. Our closest relatives, bonobos and chimpanzees, don’t. What they have are shifting power alliances for both genders (differing in detail in each species). They also have maternally-based status because paternity is not defined and females choose their partners. Humans have so-called “alpha males” only culturally, and only since hoarding of surplus goods made pyramidal societies possible.

The repercussions of such behavior highlight another point. Men of this type basically tell the world “I dare you to stop my incredibly important work to listen to the grievances of a thrall. What is the life and reputation of a minimum-wage African immigrant woman compared to the mighty deeds I (think I can) perform?” Those who argue that the personal should be separate from the political choose to ignore the fact that the mindset that deems a maid part of the furniture thinks the same of most of humanity — Larry Summers is a perfect example of this. In fact, you can predict how a man will behave in just about any situation once you see how he treats his female partner. This makes the treatment of nations by the IMF and its ilk much less mysterious, if no less destructive.

Contrary to the wet dreams of dorks aspiring to “alpha malehood”, women generally will only interact with such specimens under duress. They’re far more popular with men who (like to) think that being a real man means “to crush your enemies, see them driven before you, and to hear the lamentation of their women.” Civilization came into existence and has precariously survived in spite of such men, not because of them. If we hope to ever truly thrive, we will have to eradicate cultural alpha-malehood as thoroughly as we did smallpox — and figure out how we can inculcate snachismo as the default behavioral model instead.

Images: Top, Malcolm McDowell as Caligula in the 1979 eponymous film; bottom, Biotest’s “Alpha Male” pills.

Privilege and Fantasy

May 14th, 2011

by Sam Kelly

This is the companion piece to Sam’s Nostalgia. The questions he raises bring to mind some of the haunting works of Isak Dinesen (Karen Blixen), most notably Sorrow-Acre in Winter’s Tales.

In my last essay, I talked about two forms of nostalgia, and the characterization of History within fantasy texts. This time around, it’s time for an assertion: it’s much harder for the privileged classes to write literary fantasy than it is for the oppressed and marginalized.

Let’s start with some definitions (do feel free to take issue with them in the comments—I’m not going to be ideological about them):

Literary: of enduring worth; of complexity; supporting multiple disparate readings; possessing novelty or making an original contribution. Layered and polysemous enough that it isn’t immediately accessible in its entirety. Possessing an awareness of itself as a text.

Fantasy: That Which Is Not: a change in the philosophical and/or metaphysical nature of the world, which I’ll tentatively call a diversa after Suvin’s “novum”. A desideratum, or an elegy. Passion is a necessary and perhaps sufficient condition for fantasy; there are some unpleasant words for fantasy without passion. Popular trope fantasy is perhaps the apotheosis of advertising, without any product. It’s normally impossible to tell it from pisstake fantasy.

Privileged: Possessing something inherited or innate that makes life easier for them than most people, and, in general, not aware that this makes a difference. Tending to ascribe their success entirely to hard work or luck. Generally, in the case of fantasy writers, it means “middle-class white cis urban-dwelling Western/minority-world men whose first language is English, and who aren’t disabled”, and it covers most of them.

One of the fundamental aspects of privilege is that it allows you to remain isolated from life, to an extent that others can’t. Write, as They say, what you know.

It is difficult to be sat upon all day, every day, by some other creature, without forming an opinion about them. On the other hand, it is perfectly possible to sit all day, every day, on top of another creature and not have the slightest thought about them whatsoever. — Dirk Gently’s Holistic Detective Agency, regarding horses.

Worlds are complex; even the kind that are only written about must, of necessity, contain multitudes. If one aspires to realism, or even to plausibility (which I find much more palatable in fantastika generally) then one must also know, and write, multitudes.

Knowing things is easy. What’s hard, and unusual, is wanting to know things that don’t directly affect you; being aware that there are questions, and that the questions are important. Writers are better at curiosity than most, but that doesn’t mean they’re better at knowing which questions to ask. So there’s only really one way to train that into someone: teach them that there are difficult, and important, things going on around them. Things that they don’t understand, things that people don’t want to teach them, things people have a vested interest in keeping from them.

One of the biggest examples of that—fittingly for fantasy, given how rooted in place & belonging it always is—is land. Who owns the land? Whose ancestors owned the land? Is talking about land in terms of ownership always useful? Who gets to name the land? Who has what resource rights in the land? And, crucially: who knows the histories of the land? (Disclaimer: my examples are drawn from British history, because that’s what I know.)

Narrative and naming are almost always the property of the privileged classes—not the ruling classes, because those tend to be much smaller—and are unchallenged in the popular narrative. Well, I say unchallenged. Alternative narratives tend to be subjected to the usual erasure & marginalization, because the unmarked nature of the privileged classes means that (by definition) no alternative is plausible. (Unmarked: seen as “normal”, ie. people & disabled people, people & people of color, people & gay people, people & poor people…)

The two aspects of privilege combine interestingly around the issue of the legible land; the privileged classes may hold property rights in it, but they’re very rarely experts on it, or strongly emotionally invested. Many of the land-owners we see in fantasy books are… but they’re usually invested in the picturesque or the monetary aspects, not the economic or ecological ones. And, as always, the frequency of representation in literature is no guide to reality! Fundamentally, being rich means you still eat that day if you go home without a rabbit and that all you need to know about wheat or cabbage is that the brown end goes in the ground, the other end goes in your stomach, and there are some boring processes in the middle which peasants and women deal with.

Being poor, on the other hand—or otherwise marginalized and reduced in scope—means you develop an intimate understanding of the means of production. This isn’t a symmetrical thing: it’s not that privileged people understand one set of things and non-privileged people understand another. People who have lacked privilege really do have a closer and more urgent understanding of life, being closer to the sharp end of the System. NB: I’m distinguishing experiences and understandings from the results of formal education here—obviously they overlap to an extent, but the infamous ivory tower phenomenon shows that formal education can have entirely the opposite effect.

(Food, of course, isn’t the only thing that the land gives us. There’s another, equally important resource: stories. Whether it’s stories about the time Ellis Gwyn lost two fingers to a tractor engine, or about Rhiannon easily outpacing Pwyll’s hounds, one of the most important things is that they happened Right There. There’s a whole long strand here, waiting to be unravelled, about fantasy, social mobility, and the motif of travel in portal-quest stories. But that’s for another time.)

Under the definition of “literary” above, I talked about novelty & original contributions. Here, we get to invoke Sturgeon’s Law, and I can point out that the only reason the good stuff looks so good is by comparison & contrast to the masses of tedious pabulum surrounding it. Nothing can be original or different unless there’s a mainstream to swim across, and the people who tend to swim across it are the ones whose whole life experience points them in other directions. As far as “awareness of itself as a text” goes, the World Is Text, this one and all others. Taking the text at face value is a luxury privileged people have. As for layers and multiple meanings… the idea that events & ontologies may be interpreted in several different ways is brain-bendingly difficult to internalize if you start off with the luxury of unopposed certainty. However, if you have it rubbed into you every day that you are not like other people, that normal people see the world differently and get different things from the world, then the wave/particle dualities of histories are easy by comparison.

So that’s “literary”, and as an illustration: without stopping to think, make a list of ten or a dozen 18th & 19th century novelists. How many of the list are women? Do you think the proportion is an accurate reflection of how many men & women were writing at the time?

As far as fantasy goes, that one’s easy to deal with. If you passionately want the world to be different, then you’re probably less than happy with the way of things as it is. As I talked about in “The time-binding of nostalgia”, there are two ways to desire change. You can either look forward to a golden time, or look back to a golden time. The first is a perfectly normal act of imagination, but the second always involves a regurgitated lump of plastic history and a covert appeal to the idea that things were always like that, but the forces of Darkness recently changed things away from their true course. You’re probably thinking of Tolkien as an example here, which is superficially reasonable—however, the Professor’s history is deliberately self-problematizing, including as it does its own historiography, and The Lord of the Rings is not even slightly a desideratum in that sense. It’s entirely elegiac, an extended if-only-it-could-have-been.

Of course, if you aren’t passionate about your fantasy, if you’re only proposing a diversa because it’s a vaguely interesting idea or because you need some stage setting for your Awesome Characters and Plot of Awesomeness, then that’s fluff. There’s nothing wrong with fluff, so long as it acknowledges that that’s what it is.

Another frequent failure mode is where the only diversa is that for every epic problem there is an equally epic solution, and the status quo ante is restored. This is a lazy and slapdash way to construct the framework for a story, but it’s unfortunately very common amongst privileged Extruded Fantasy Product writers. A well-constructed diversa, on the other hand, narrativises the textual world by introducing crosslinks, structural rhymes, and reified metaphors, and it’s easier to think about these—to acknowledge the possibility of them, to imagine a world with innate meaning—if you haven’t had the blithely unthinking benefit of the real world’s equivalent all your life.

This is not to say that any of these things—being female, or nonwhite, or trans, or poor, or far from urban life, or disabled, or any of the many other ways of lacking privilege—gives an author a free pass to Literary Fantasist status, or that privilege forbids it; all three concepts (literature, fantasy, and privilege) are far too complex and intersectional to be reduced to those sorts of rules. But privilege does make it harder to achieve.

Images: 1st, Mingary Castle, photo by Sam Kelly; 2nd, Rogue Roman by Frank Frazetta; 3rd, Helena Bonham Carter and Mark Wahlberg in Tim Burton’s Planet of the Apes.

The Time-Binding of Nostalgia

May 7th, 2011

by Sam Kelly (with an afterword by Athena)

It is my great pleasure to reprint an essay by my friend Sam Kelly (aka Eithin — Cymraeg for Gorse) who blogs at Cold Iron and Rowan-Wood. Sam was born to theatre folk and engineers, and brought up in the Essex woodlands and the Welsh mountains. Disability cut short a PhD in nanomaterials chemistry; he now lives in London, where he writes about fantastika and makes art.

I’ve been reading a lot of Guy Gavriel Kay recently (Under Heaven, The Wandering Fire, The Darkest Road, and The Lions of Al-Rassan) and have therefore naturally been thinking about identity, passion, and pride.

It’s a commonly accepted trope amongst many fantasy critics, scholars, and commenters that fantasy is, at its root, about nostalgia. I’ve never quite agreed with this, but I think that’s partly because nostalgia comes in several flavours. The word comes from the Greek nostos, a homecoming, and algos, pain, and was coined as a medical term in 1688 to describe Swiss mercenaries’ longing for the mountains of their home. (As a Welshman, I can relate to that! The Welsh word hiraeth is mostly untranslatable, but Heimweh does seem like a cultural analogue.)

In recent decades, however (and especially by the English) it’s been coopted to describe a kind of early 20th century idyll. You know the one—ploughmen, foaming nut-brown ale, small children waving at steam trains, The Countryside or The Beach two hours’ journey away, a distinct lack of brown people. It’s basically thinly disguised neo-mediaevalism, or rather neo-mediaevalism (in fantasy writers of a certain age, at least) is a proxy for their yearning for the kind of social certainty that supposedly existed in the recent past.

I feel compelled to point out here that that past (either of those pasts) never really existed, and the only way to pretend that they did is by wholesale erasure of the experiences and histories of women, the working classes, nonwhite people (there have always been nonwhite people in Britain, at least back to the Romans if not before) and Jews. Not to mention (and people rarely do mention) those who are more than one of those. It’s fairly safe to blame the Victorians for making up the mediaeval idyll. We’ve been reimagining recent history ever since, and it’s not as though revisionist history started in 1820 for that matter, but it was the Victorians who pioneered the mass production of History.

So that’s one way in which nostalgia is expressed in English-language fantasy fiction: the desire for an imagined past. That can be a joyful escapist wish, as with William Morris, or a heartfelt elegy for something that could never have been, as with Tolkien. In either version, the past (in the context of the novel, ie. the created world’s own imagined past) is seen explicitly as a good thing, a lost Golden Age.

There’s another version of nostalgia, however—nostalgia in its most etymologically strict sense, the pain of longing for a homecoming—and that is the one experienced by those whose home is contested, denied, erased. The interesting thing about that is that in the latter, the past-within-the-text is usually unpleasant, problematized, or generally Not Even Slightly Golden.

Athena’s afterword: There is a third group that experiences nostalgia – those who have left home (mostly) voluntarily and live as perpetual exiles, outsiders to both natal and adopted cultures, a shard of thick glass between our hearts and our words. For those who walk between worlds, home is what we carry in our heads. Even the worlds we remember never existed or no longer exist and we are feral orphans who press our faces against others’ lit windows. If we ever return home, it does not recognize us – and we are more like Angelopoulos’ Odysseus than Homer’s.

Black Swallow — a folksong of exile from Thrace, sung by Hrónis Aidhonídhis (“Son of the Nightingale”). Click on the title to listen:

My black swallow from afar,
my white dove from home,
you fly so high! Come lower
and open your wings,
so I can write to my mother,
my sisters — and my love.

And the lament of those left behind: the famous Tzivaéri mou (“My Treasure”) from the Dodecanese, sung by Dhómna Samíou.

Images: 1st, Oleg Yankovsky in Andrei Tarkovsky’s Nostalghia; 2nd, Harvey Keitel in Theódoros Angelópoulos’ The Gaze of Odysseus.

Of Federal Research Grants and Dancing Bears

May 1st, 2011

Warmth and comfort are yokes for us.
We chose thorns, shoals and starlight.
— from Mid-Journey

I came to the US in 1973, all fired up to do research. I have been doing research as my major occupation since 1980 and have run my own (tiny: average two-member) lab since 1989. So I fulfilled part of my dream and wrote about how it feels to do so in The Double Helix. Yet I may have to abandon it prematurely. Objectively, that’s not a tragedy. People die, people retire – hell, people have midlife epiphanies that make them join odd religions or take jobs in industry with salaries several-fold higher than their academic ones. But right now, I’m one of many who are disappearing. And our disappearance will have an impact far beyond what outsiders perceive (incorrectly) as cosseted academic careers.

Biomedical researchers in the US are on the selling side of a monopsony. If our research is very basic and/or we’re starting out, we can get small grants from the National Science Foundation. If it’s very directed or applied, we can get tiny grants from private foundations or the rare decent-sized grant from the Departments of Energy or Defense. But all these amount to peanuts. The engine behind US biomedical research is a single organization: the National Institutes of Health (NIH). When the NIH says “Jump!” we ask “How high?” on the way up.

Grant submissions to the NIH have always been as arcane and painful as a complex religious ritual that includes flaying. Success depends not only on the quality of our science and the number and impact of our papers but also on sending the grant to the right study section for peer review, on using fashionable (“cutting edge”) gadgets and techniques, on doing science that is perceived to fit the interests of the NIH institute that hosts our grant, and on being lucky enough to get reviewers and program officers who agree on the importance of our proposed work (and in the case of reviewers, not direct competitors who are essentially handed unpublished data on a platter). All this, subsumed under the rubric “grantsmanship” or “being savvy”, is not taught at any point during our long, arduous training. In my youth, we learned by literally walking into brick walls.

However, when I started my PhD biomedical researchers could afford to have their egos and labs bashed by savage critiques and terrible grant scores because the NIH payline was 30-40%. This meant that one in three grants got funded – or that each of our grant applications (if of reasonable quality) got funded in one of the three tries the NIH allowed. Also, at that time the traditional university salary covered nine months. For most academic researchers, a grant meant they got the last three months of salary plus, of course, the wherewithal to do research.

My situation was different: I went to a poor institution (my startup package was seventeen thousand – the common minimum is a million) and except for my first two years and a six-month bridge later on I was entirely on soft money till mid-2008. So for me the equation was not “no grant, no lab”; it was “no grant, no job”. This was made a bit tougher by the fact that my research fit the “starts as heresy and ends as superstition” paradigm. For one, the part of my system that is relevant to dementia undergoes human-specific regulation. So unlike most of my colleagues I did not detour into mouse models, long deemed to be the sine qua non of equivalence (an equation that has hurt both biology and medicine, in my opinion, but that’s a different conversation).

Don’t misunderstand me, I’m not saying I’m a neglected Nobel-caliber genius. But much of what I pursued and discovered was against very strong headwinds: I investigated a molecule that had been relegated to the back of the neurodegeneration bus for decades. To give you a sense of how some of my work was received, I once got the following comment for an unexpected observation that flew against accepted wisdom (verbatim): “If this were true, someone would have discovered it by now.” The observation has since been confirmed and other labs eventually made plenty of hay with some of my discoveries, but I’ve never managed to get funding to pursue most of them. I was either too early and got slammed for lacking proof of feasibility or direct relevance to health, or too late and got slammed for lacking cutting-edginess.

It is hard to keep producing data and papers if your lab keeps vanishing like Brigadoon and has to be rebuilt from scratch with all the loss of knowledge and momentum each dislocation brings. To prevent this as much as I could, whenever I had a grant hiatus I cut my salary in half to pay my lab people as long as possible (if you go below 50% you lose all benefits, health insurance prominently among them). I could do this because I had no children to raise and educate, though it has seriously affected my retirement and disability benefits.

While I was marching to my malnourished but stubborn inner drummer, the NIH payline was steadily falling. It now stands at around 7%, which means roughly one in fifteen grants gets funded – or that we must send in fifteen grants to get one. Academic institutions, grown accustomed to the NIH largesse, have put more and more of their faculty on largely or entirely soft salaries. At the same time, grants are essentially the sole determinant for promotion and tenure – for those universities that still have tenure. Additionally, universities have decided to run themselves like corporations and use the indirect funds (money the NIH pays to institutions for administrative and infrastructure support) to build new buildings named after their board members, hire ever more assistant vice presidents and launch pet projects while labs get charged for everything from telephone bills to postdoc visas. Essentially, at this point most US biomedical researchers are employed by the NIH and their universities rent them lab space at markup prices comparable to Pentagon toilet seat tariffs.

Meanwhile, the NIH has been changing grant formats for the sake of streamlining (an odd objective, given its mission). We used to have three pages to respond to reviewers. Now we have one. We were able to send last-minute discoveries or paper acceptances that would make the difference between success and failure. Now we cannot. We used to get detailed critiques. Now we get bullet points. We were allowed three tries. Now we’re allowed two. If our two-strikes-and-we’re-out submission fails, we must change our work “substantially” (nebulously defined, and up to the interpretation of individual NIH officers) and to ensure compliance, the NIH has invested in software that combs past grants to uncover overlap. And of course there is essentially no appeal except for resubmission: the NIH appeal process, such as it is, has been copied from Kafka’s Trial.

All these changes are essentially guaranteeing several outcomes: young people will have fewer and fewer chances (or reasons) to become researchers or independent investigators; new and small labs will disappear; and despite lip service to innovation, research will take refuge in increasingly safe topics and/or become “big ticket” science, doing large-scale politically dictated projects in huge labs that employ lots of robots and low-level technicians — a danger foreseen by Eisenhower in his famous “military-industrial complex” address, which today would get him labeled as fringe hard left. A recent analysis by MIT’s Tech Review showed that biomedical work still timidly clusters around the few areas that have already been trampled into flatness, ignoring the far vaster unknown territories opened up by the human genome sequencing project and its successors.

Of course, we biomedical researchers have played a significant role in our own slaughter. Because of the unavoidable necessity of peer review, we have acted as judges, jury and executioners for each other. Like all academic faculty, we’re proud we are “as herdable as cats” and like all white-collar workers we have considered it beneath our dignity to be members of a union. Our only (weak and vanishing) protection has been tenure. Many of us are proud to demonstrate how much hazing we can take, how little we need to still produce publishable, fundable science while still carrying the burdens of mentoring, committee work, teaching… The early rounds of cuts were called “pruning dead wood” or “trimming fat”. Except that for the last two decades we haven’t been cutting into fat but into muscle – and now, into bone. Too, keeping ahead of the never-abating storm swells presupposes not only a lot of luck and/or the ability to tack with the winds of scientific fashion but also personal relationships with granite foundations and cast-iron health.

In my case, after my second hiatus I succeeded in landing two small grants that last two years. The day I found out one of them would be funded was the day I was interviewing at the NIH for a branch administrator position. I turned down that offer, with its excellent terms. My heart is irrevocably tied to research. I had not abandoned my home, my country, my culture to become an administrator — even though the position I was offered can make or break other people’s research. One month after the second of these small grants got activated, I got my cancer diagnosis. Being on soft money meant I could not suspend the grant during my surgery and recovery: my health bills would have driven me to penury. So my productivity suffered accordingly. But such factors are not considered during grant review.

Biology is an intrinsically artisan discipline: unlike physics, it cannot be easily reduced to a few large truths reached by use of increasingly larger instruments. Instead, it looks like a crazy quilt of intricately interwoven threads (take a look at the diagram of any biological pathway and you get the picture, let alone how things translate across scales). Some argue that larger labs are likelier to be innovative, because they have the money to pursue risky work and the gadgets to do so. However, it has been my personal experience that large labs are often large because they pursue fashionable topics with whatever techniques are hot-du-jour, regardless of the noise and artifacts they generate (plus their heads have time for politicking at all kinds of venues). They also have enormous burnout rates and tend to train young scientists by rote and distant proxy.

Granted, we need big-science approaches; but we need the other kind, too – the kind that is now going extinct. And it’s the latter kind that has given us most of the unexpected insights that have translated into real knowledge advances, often from neglected, poorly lit corners of the discipline.

Now I’m not all I thought I’d be,
I always stayed around:
I’ve been as far as Mercy and Grand,
Frozen to the ground.
I can’t stay here and I’m scared to leave;
Just kiss me once and then
I’ll go to hell —
I might as well
Be whistlin’ down the wind.
— Tom Waits, Whistle down the Wind

Images: 1st, Sand Lily Shadow (Tudio, Falassarna, Crete); 2nd, I Stand Alone (Michellerena, Burnie, Tasmania); 3rd, Sea Gate (Peter Cassidy, Heron Island, Australia)

The Anxiety of Kings

April 14th, 2011

by Calvin Johnson

I’m delighted to once again host my friend Calvin Johnson, who earlier gave us insights on Galactica/Caprica and Harry Potter.

Art casts a shadow, and any artist must move out of that shadow to create new art. This is the theme of The Anxiety of Influence by the critic Harold Bloom. Bloom identified six strategies but they really come down to two: doing it bigger and better or opposing, inverting, and subverting. Peter Jackson’s remake of King Kong was of the bigger-and-louder variety, while the reboot of Battlestar Galactica was a thorough subversion of the original.

Bloom was analyzing poetry, but his argument applies to all arts and all genres. Nowhere is it more true than in fantasy literature, where the long shadow of John Ronald Reuel Tolkien has sent would-be writers scurrying for light for over half a century.

It is almost — almost — impossible to out-Tolkien Tolkien, hence many subsequent fantasies have relied upon some sort of inversion and subversion. Stephen R. Donaldson’s Thomas Covenant was a morally flawed anti-hero (who early on commits rape) and just to rub our noses in it, Covenant is also a leper. And whereas Frodo loses one finger to Gollum, Covenant loses two to his disease, another topper. Although Ursula Le Guin’s wizard hero Ged is no creep, her Earthsea series sheds the war-between-Good-and-Evil trope altogether, as well as highlighting heroes with non-European phenotypes.

Despite the flourishing (or infestation) of urban fantasy with vampires, zombies, and werewolves, and despite the welcome branching out of fantasy into more Asian and African-inspired settings (such as two of this year’s Nebula novel nominees, N. K. Jemisin’s The Hundred Thousand Kingdoms and Nnedi Okorafor’s Who Fears Death), high fantasy still often echoes Tolkien and the Euromythic setting of Middle-earth. Ilúvatar knows I despair of all the awful, tepid, Tolkien knockoffs I still read these days in writing workshops.

But occasionally it’s done well.

Over a decade ago, our good host Athena sent me a paperback copy of a novel. At the time I was in a funk, struggling with faculty politics while an assistant professor at LSU. The cover showed a novel that was clearly high fantasy in the Tolkien mode and having been soured by too many bad imitations, I was prepared to dislike it.

I didn’t.

The book was George Raymond Richard Martin’s A Game of Thrones, the first in the series A Song of Ice and Fire. It is set in a quasi-medieval land, full of echoes of Middle-earth. But Martin both outdoes Tolkien and thoroughly, ruthlessly subverts him.

Tolkien was a medievalist, specializing in the languages of a thousand years ago, and Lord of the Rings has the scent of the formal Arthurian epic in it. There is war and death and struggle, but nonetheless LotR is sanitized. Not so with A Song of Ice and Fire. Martin does his homework; his series reads as if he had gone back in time, spent a couple of years banging around old castles and village hovels trading tales with the village witch, and then popped back with fresh memories of the stench of manure and hung pheasant. Imagine Middle-earth written with the grittiness of The Wire. Quite unlike Tolkien, Martin’s characters are complex and morally ambiguous.

Three sequels have appeared, with three more planned. Unfortunately the time between sequels has grown longer and longer, and the books ever longer too; the third, A Storm of Swords, weighed in at over 1100 pages.

Hence even a concise synopsis is not easy. There are half a dozen kings in the land of Westeros alone. The lord of the North, Eddard Stark, is a noble if earthy man who attempts to do what is right when what is right is impossible. Foremost of his foes is the conniving Cersei Baratheon née Lannister, queen in the South. You know the evil queens in Disney movies, the jealous Queen in Snow White, the shape-shifting Maleficent in Sleeping Beauty? Or real-life schemers such as Lucrezia Borgia? Cersei would eat them all for breakfast and boil up the bones for tea for her brother, Jaime Lannister, the greatest swordsman in the land, who earned the nickname Kingslayer after killing the mad king Targaryen so that Cersei’s husband Robert could become king. But Cersei is bedding her brother Jaime (I told you she was evil), who casually cripples Eddard’s son Bran for spying on them; after that, it’s no stretch for them to plot to kill Robert, frame Eddard, and place Cersei’s son — fathered by Jaime, not Robert — on a throne made from the swords of defeated enemies.

Got it? And that’s just the beginning.

Thrown into the mix is Cersei and Jaime’s brother, Tyrion Lannister. Tyrion is the brains of the family. He is also a dwarf, so Cersei and Jaime tend to discount him, always a bad idea.

Tyrion is one of my two favorite characters, the other being Daenerys Targaryen, the daughter of the mad, dead King Targaryen. She was exiled overseas and forced into marriage to a Genghis Khan-like barbarian who promptly dies, leaving Daenerys nearly helpless. But not quite, for Daenerys has her wits and, importantly, her daddy’s dragon eggs. I forgot to mention that the Targaryens conquered Westeros with dragons several generations earlier; as the heirs of Eddard Stark, Robert Baratheon, and others battle to exhaustion, Daenerys begins a looping journey back to Westeros and revenge, collecting an army along the way.

I’ve left out, oh, about a dozen major characters, such as Stark’s daughters and his bastard son Jon Snow, sent to the farthest north to defend the land against an eerie, unseen foe. But you get the idea.

Martin plunks down dozens of sharply realized, multifaceted characters; even the least tavern whore whom Tyrion beds comes across as a real person, not just a prop. Martin also downplays the fantastic elements in favor of realpolitik: while there are dragons and there appears to be magic, the magic slinks around the edges, giving atmosphere rather than magi ex machina solutions to plot problems.

It’s not clear that Martin actually has a plot. In what is both the series’ greatest strength and biggest frustration, the books read like a fictionalization of real history, with all the messy, chaotic, and even unfair turns of events. There is no quest. There will be no vindication of good over evil. There is a war for power, but like real wars and political struggles, it is messy and chaotic. Main characters, characters who in any other series would clearly be destined to be the heroes, die halfway through, leaving the reader to swear in the name of the seven gods (Martin postulates a Jungian seven-fold deity rather than a Freudian trinity and gets in some nice wordsmithing, coining words like septon). Given the unraveling skein of events, narrative voices that multiply like rabbits with a supercomputer, and the increasing time between books, Martin is in danger of losing his audience.

But it’s been a hell of a ride.

And now A Game of Thrones is coming to television — to HBO (alas, as I don’t have cable). And what a great cast. Sean Bean (the tragic, haunted Boromir in Jackson’s Fellowship of the Ring) is Eddard Stark. Lena Headey (the robot-ass-kicking eponymous heroine of the short-lived series The Sarah Connor Chronicles) is Cersei Lannister. And Tyrion is played by Peter Dinklage, usually unrecognizable under pounds of latex in fantasy movies, but who proved his acting chops in The Station Agent. Dinklage was Martin’s favored choice for this role as well, and I’d give good odds of him stealing the series. The buzz is favorable. We shall see.

Images: 1st, Stephen Youll’s cover for A Game of Thrones (cropped); 2nd, Lena Headey (Cersei Lannister) as Sarah Connor; 3rd, Peter Dinklage (Tyrion Lannister); 4th, Sean Bean (Eddard Stark) as Boromir.

Update: PZ Myers’ Why I’m not Interested in Watching GoT encapsulates my feelings about the opus.

The Quantum Choice: You Can Have either Sex or Immortality

March 29th, 2011

Note: A long-term study from Northwestern University (not yet in PubMed) has linked participation of young adults in religious activities to obesity in later life. Overhanging waistlines in First World societies undoubtedly contribute to the degenerative illnesses of lengthened lifespan. But it’s important to keep in mind that fat fulfills critical functions. This article, which looks at the other side of the coin, was commissioned by R. U. Sirius of Mondo 2000 fame and first appeared in H+ Magazine in September 2009.

Because of the four-plus centuries of Ottoman occupation, the folklore of all Balkan nations shares a Trickster figure named Hodja (based on 13th century Konyan Sufi wandering philosopher Nasreddin). In one of the countless stories involving him, Hodja has a donkey that’s very useful in carting firewood, water, etc.  The problem is that he eats expensive hay. So Hodja starts decreasing the amount of hay he feeds the donkey. The donkey stolidly continues doing the chores and Hodja, encouraged by the results, further decreases the feed until it’s down to nothing. The donkey continues for a few days, then keels over. Hodja grumbles, “Damnable beast! Just when I had him trained!”

Whenever I hear about longevity by caloric restriction, I immediately think of this story.

But to turn to real science, what is the basis for caloric restriction as a method of prolonging life? The answer is: not humans. The basis is that feeding several organisms, including mice and rhesus monkeys, near-starvation diets appears (emphasis on the appears) to roughly double their lifespan. Ergo, reasons your average hopeful transhumanist, the same could happen to me if only I had the discipline and time to do the same – plus the money, of course, for all the supplements and vitamins that such a regime absolutely requires, to say nothing of the expense of such boutique items as digital balances.

I will say a few words first about such beasties as flies (Drosophila melanogaster) and worms (Caenorhabditis elegans) before I climb the evolutionary ladder. Many organisms in other branches of the evolutionary tree have two “quantum” modes: survival or reproduction. For example, many invertebrates are programmed to die immediately after reproduction, occasionally becoming food for their progeny. In some cases, their digestive tracts literally disintegrate after they release their fertilized eggs. Conversely, feeding royal jelly to a female larva that would otherwise become an infertile worker bee turns her into a fully functioning queen. The general principle behind caloric restriction is that it essentially turns the organism’s switch from reproductive to survival mode.

Most vertebrates from reptiles onward face a less stark choice. Because either or both parents are required to lavish care on offspring, vertebrate reproduction is not an automatic death sentence. So let’s segue to humans. Due to their unique birth details, human children literally require the vaunted village to raise them — parents, grandparents, first degree relatives, the lot. At the same time, it doesn’t take scientific research to notice that when calories and/or body fat fall below a certain minimum, girls and women stop ovulating. It also takes just living in a context of famine, whether chosen or enforced, to notice the effects of starvation on people, from lethargy and fatigue to wasted muscles, brittle bones and immune system suppression, crowned with irritability, depression, cognitive impairment and overall diminished social affect.

Ah, says the sophisticated caloric restriction advocate, but much of this comes from imbalances in the diet – missing vitamins, minerals, etc. Well, yes and no. Let me give a few examples.

All vitamins except B and C are lipid-soluble. If we don’t have enough fat, our body can’t absorb them. So the excess ends up in odd places where it may in fact be toxic – hence the orange carotenoid-induced tint that is a common telltale sign of many caloric restriction devotees. Furthermore, if we have inadequate body fat, not only are we infertile, infection-prone and slow to heal due to lack of necessary hormones and cholesterol; our homeostatic mechanisms (such as temperature regulation) also flag. And because caloric restriction forces the body to use up muscle protein and leaches bones of minerals, practitioners can end up with weakened hearts and bone fractures.

Speaking of fat, the brain has no energy reserves. It runs exclusively on glucose. When starved of glucose, it starts doing odd things, including the release of stress chemicals. This, in turn, can induce anything from false euphoria to hallucinations. This phenomenon is well known from anorexics and diabetics entering hypoglycemia, but also from shamans, desert prophets and members of cultures that undertook vision quests, which invariably included prolonged fasting. So caloric restriction may make its practitioners feel euphoric. But just as the cosmic insights granted by psychoactive drugs are illusory, this euphoria comes at the cost of impaired judgment and related executive functions.

So what about those glowing reports which purport to have demonstrated that caloric restriction doubles the lifespans of mice and rhesus monkeys, as well as giving them glossy pelts? Surely we can put up with a bit of mental confusion, even failing erections, in exchange for a longer life, as long as it’s of high quality – otherwise we’ll end up like poor Tithonus, who was granted immortality but not youth and dwindled into a shriveled husk before the gods in their whimsical mercy turned him into a cicada. And it does seem that caloric restriction decreases such banes of extended human lifespan as diabetes and atherosclerosis. Well, there’s something interesting going on, all right, but not what people (like to) think.

In biology, details are crucial and mice are not humans. In Eldorado Desperadoes: Of Mice and Men, I explained at length why non-human studies are proof of principle at best, irrelevant at worst. Laboratory mice and monkeys are bred to reproduce early and rapidly. They’re fed rich diets and lead inactive lives – the equivalent of couch potatoes. The caloric restriction studies have essentially returned the animals to the normal levels of nutrition they would attain in the wild. Indeed, caloric restriction of wild mice does not extend their lives, and when caloric levels fall below about 50% of normal intake, both lab and wild mice promptly keel over, like Hodja’s donkey. In the rhesus studies, lifespans appeared extended only when the investigators counted a subset of the deaths in the group they tested.

On the molecular level, much attention has been paid to sirtuin activators, resveratrol chief among them. Sirtuins are a class of proteins that regulate several cell processes, including aspects of DNA repair, the cell cycle and metabolism. This means they’re de facto pleiotropic, which should give would-be life extenders pause. As for resveratrol, it doesn’t even extend life in mice – so the longer lives of the red-wine-loving French result from other causes, almost certainly including their less sedentary habits and their universal and sane health coverage. That won’t stop ambitious entrepreneurs from setting up startups to test sirtuin activators and their ilk, but I predict they will prove as effective as leptin and its relatives were for non-genetic obesity.

This brings to mind the important and often overlooked fact that genes and phenotypes never act in isolation. An allele or behavior that is beneficial in one context becomes deleterious in another. When longer-lived mutants and wild-type equivalents are placed in different environments, all longevity mutations result in adaptive disadvantages (some obvious, some subtle) that make the mutant strain disappear within a few generations regardless of the environment specifics.

Similarly, caloric restriction in an upper-middle-class context in the US may be possible, if unpleasant. But it’s a death sentence for a subsistence farmer in Bangladesh who may need to build up and retain her weight in anticipation of a famine. For women in particular, who are prone to both anorexia and osteoporosis, caloric restriction is dangerous – hovering as it does near keeling-over territory. As for isolated, inbred groups that have more than their share of centenarians, their genes are far more responsible for their lifespans than their diet, as is the fact that they invariably lead lives of moderate but sustained physical activity surrounded by extended families – provided they remain relatively dominant within their family and community.

Average human life expectancy has already nearly tripled, courtesy of vaccines, antibiotics, clean water and the use of soap during childbirth. It is unlikely that we will be able to extend it much further. Extrapolations indicate that caloric restriction will not lengthen our lives by more than 3% (a pitiful return for such herculean efforts) and that we can get the same result from reasonable eating habits combined with exercise. Recent, careful studies have established that moderately overweight people are the longest-lived, whereas extra-lean people live no longer than obese ones.
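To put that 3% in perspective, a back-of-the-envelope calculation suffices (assuming, purely for illustration, a First World life expectancy of roughly 80 years, a number of my choosing rather than part of the extrapolations): 0.03 × 80 ≈ 2.4 extra years, purchased at the price of spending the preceding decades hungry.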

So what can you really do to extend your life? Well, as is the case with many other quality-of-life attributes, you should choose your parents carefully. Alleles that lower your susceptibility to degenerative, age-related diseases (diabetes, heart disease, hypertension, dementia) are a great help — as is high income in a developed country with first-rate medical services, which ensures excellent lifelong nutrition and enough leisure time and/or devoted underlings to make it possible to attend to suchlike things.

Baby, You Were Great!

March 26th, 2011

— title of a story by Kate Wilhelm

Everyone who meets me inevitably finds out that science fiction and fantasy (SF/F) occupy a large portion of my head and heart: I write it, read it, review it and would like to see it discard the largely self-imposed blinkers that impoverish it. For a while, Strange Horizons (SH) magazine thrilled and captivated me. So it’s doubly heartbreaking for me to see it regressing into “normality” and losing sight of what made it stand out in the first place.

My relationship with SH has long been ambivalent. I was happy it was a major SF/F venue brought to vibrant life by women: founder Mary Anne Mohanraj and, after her, Susan Marie Groppi. I was pleased it published many works by women and Others and had significant numbers of women on its masthead (as editors, not gofers or dishwashers). I was glad it showcased non-famous writers from the get-go and cast its net wide. My second major SF article appeared there when I was relatively unknown in the domain.

However, there were some worms in the tasty apple. One was that SH seemed to have adopted a stance of “science hurts our brains” – perhaps to distinguish itself from the scienciness of Analog and Asimov’s. This was true not only (and increasingly) of its stories but also of its non-fiction articles, which steered determinedly clear of science and concentrated instead on literary and social criticism. There’s nothing wrong with that, of course, especially while SF/F is still struggling for legitimacy as literature. But other speculative magazines – Lightspeed, for one – manage to include interesting science articles without shedding cooties on their fiction.

So I read SH fiction less and less but continued to browse its columns and reviews. Then, in the last few years, I noticed those shifting – gradually but steadily. They were increasingly by and about Anglo-Saxon white men and showed the tunnel vision this context denotes and promotes. The core reviewers who coalesced were youngish British men (with token “exotics”) convinced of their righteous enlightenment and “edginess” along the lines of “We discovered/invented X.”

I caught a whiff of the embedded assumptions that surface when these self-proclaimed progressives relax, safe from prying eyes. One of them recently reviewed a story on his site and characterized its protagonist with the term “cunt”. He used the word repeatedly, as a synonym for “empathy-lacking sociopath”. Having stumbled on the entry, I remarked that, feminist bona fides aside, the term doesn’t ring friendly to female ears, and that even its canon definition (“extremely unpleasant person, object or experience”) is not equivalent to sociopath. Perhaps not so incidentally, I was the only woman on the discussion thread.

The reviewer’s first response was that only Amurrican barbarians “misunderstand” the term. I replied (in part) that I’m not American, and presumably he wishes to be read by people beyond Britain and its ex-colonies. At that point he essentially told me to fuck off. His friends, several of them SH reviewers or editors, fell all over themselves to show they aren’t PC killjoys. They informed me that US cultural hegemony is finally over (if only), that “cunt” is often used as an endearment (in which case his review was a paean?) and that women themselves have reclaimed the term (that makes it copacetic then!).

So this is the core group that has been writing the majority of reviews at SH for the last few years and is now firmly ensconced not only at SH but across British SF/F venues. This may explain the abysmal gender percentages of the latter, which haven’t really budged even after the discussions around the not-so Mammoth Book of Mindblowing SF or the handwringing over the aptly named Gollancz Masterwork Series. The recent epic fantasy debate showcased the prevailing attitudes by discussing exclusively the works of (repeat after me) white Anglo-Saxon men. Not surprisingly, the editor of SH just revealed that roughly two-thirds of recent SH reviews were by male reviewers and two-thirds discussed works by male authors, adhering to the in/famous “one-third rule” that applies to groups helmed by men.

People will argue that SH still has “a preponderance” of women in its masthead and pages. That’s mostly true — for now. However, it is significant that the percentages of works by women in SH consistently reflect the ratios and clout of women within each of its departments. Too, it’s human nature to flood the decks with one’s friends when someone takes over a ship. The problem is that, given the makeup of the current editor’s inner circle, an echo chamber is all but assured. To give one example, a new SH column offers capsule summaries of online discussions relevant to SF/F. Although the editor in charge of it asked for input and admitted he got an avalanche of responses, its entries so far have come almost exclusively from members of the in-group.

So SH is inching towards a coterie of white Anglo-Saxon men as arbiters of value, a configuration Virginia Woolf would have found depressingly familiar. People are fond of repeating that publication ratios merely reflect the fact that women submit less often than men. What I increasingly see at SH are stances that need not be in-your-face hostile to exert a chilling effect. If someone smirks at you constantly, the passive-aggressive condescension will eventually stop you from going to his parties as effectively as if he had explicitly barred you entry (check out The Valve to observe this dynamic at work).

It grieves me to see SH slowly but inexorably become a neo-Victorian club in all but name. It grieves me that one of the few SF/F venues once genuinely receptive to women’s work is resorting to smug lip-service. Perhaps the magazine is a victim of its own success: once women had nurtured it to prominence, men could take over and reap the benefits – a standard practice.

I see developing patterns early, so much so that I often joke I should be called Cassandra, not Athena. Yet this once, for the sake of the genre and the women who painstakingly watered the now-vigorous SH tree, I fervently hope I’m proved wrong. Otherwise, given the attention span of the Internet, a handful of us will wistfully recall (to hoots of incredulous derision, no doubt) that once there was a verdant oasis in SF/F that women created, shaped and inhabited.

Remedios Varo, Nacer de Nuevo (To Be Reborn)

Note to readers: I am aware this will lead to polarizing and polarized views. I will not engage in lengthy back-and-forths, although I made an exception for the expected (and predictable) response by Abigail Nussbaum. People are welcome to hold forth at whatever length and pitch they like elsewhere.