
    Simulating Religion

    A Christian takes stock of Silicon Valley’s rationalist community

    By Alexi Sargeant

    January 3, 2018
    4 Comments
    • Aapje

      Good article. I do think that it tries to fit rationalism into the Christian traditionalist paradigm too much, though. I would argue that Yudkowsky's means (overcoming your own bias) became the unifying element of the rationalist community, and that it is considerably more diverse with regard to beliefs about monogamy, AI, or a desire for immortality than you think. I would argue that the movement is especially attractive to a certain personality type (non-conformist, nerdy, likes to philosophize, disinclined to believe things without evidence, highly scrupulous).

      Rationalists tend to desire community, social contact, and cooperation, like (pretty much) everyone else. In the (Christian) traditionalist paradigm, this is achieved by fairly strongly prescribing how people ought to live, as well as offering rituals. In the rationalist community, there is more focus on the differences between individuals, where community, social contact, and cooperation are voluntary and can be designed around the needs of the person. Of course, this also means that moral behavior has to come primarily from people deciding to behave morally out of their own desire to do so, rather than fear of the consequences of breaking a set of rules. It also means that people have to be capable of making good choices for themselves. People have become considerably better educated over the last century or so, so a good argument can be made that people are becoming more and more capable of this and have less need of the traditionalist solutions.

      Of course, one can criticize this. I especially think that a substantial subset of the population may not be capable of self-directing their lives to such an extent and will make self-destructive choices. However, the individualization of society that is currently happening is not caused by rationalists, so I don't think they ought to be blamed for it. Ultimately, traditionalist society was also unpleasant to a substantial subset of the population, and I think that fewer and fewer people accept this nowadays. If many people are unwilling to go back to the solutions of the past, then we have to find new solutions.

      I have to disagree with your portrayal of the rationalist movement as relatively dangerous because it may cause negative world-scale change; it is largely a reactionary movement that tries to cope with changes that are happening or are expected to happen. For example, MIRI is not in the business of bringing into existence a very powerful AI. What they seek to do is to ensure that if society makes a very powerful AI, it will not destroy humanity. Imagine a group of people building a community pool near you, where another group of people then demands that the pool get various safety features, so the chance that children drown is reduced. It seems very unreasonable to call the latter group dangerous.

      PS. I don't consider RationalWiki to be rationalist, but rather 'skeptic'. They themselves say that their name is not the most accurate and that they would have used SkepticWiki if that name hadn't already been taken. They are rationalism-adjacent, but different in that skeptics tend not to have as strong a goal of overcoming their own bias, and are more focused on the faults of others. I think that rationalists are more prone to adhere to Matthew 7:1-5.

    • Ken B.

      Kevin, are you aware of another game in town?

    • Andrew F.

      Thank you for writing this article; as someone on the fringes of the rationalist movement I agree that outside perspectives and criticism are what it needs most--and this article is much fairer to the community than some of us have come to expect.

      One of the most interesting points you brought up was Altair's reluctance to "[make] normative claims about what we *should* value." I suspect that experience was not unique, and that it might be one of the most difficult gaps to bridge in dialogues between Christians and non-religious rationalists. Many members of the community are there precisely *because* they felt there was a fundamental mismatch between their own values and the values (Christian or otherwise) that they felt were imposed upon them by some external source, and therefore they are rightly suspicious of claims that their, or anybody else's, values "should" change.

      So how to bridge that gap? The most effective way would be to bring evidence into the discussion: evidence that an alternate (e.g. Christian) set of values would be more likely to lead to some outcome a rationalist seeks, or conversely evidence that seeking different outcomes might be more in line with that person's existing values. The community's engagement in quasi-religious rituals might be a starting point, but it would likely have to overcome the counterargument that there is little those rituals have in common with Christianity that they do not also share with Shinto, or Islam, or Norse paganism.

      Something which likely *cannot* bridge the gap is statements or arguments from faith alone, because treating "faith" as a core value ("terminal value", to use the rationalist term) is anathema to many if not most rationalists. For example, you say that "they are making themselves – and, increasingly, their children – human test subjects in a social experiment that is bound to prove destructive"; a rationalist would likely agree with you up to 'experiment', and even take it as a compliment! And then they'd take issue with the "bound to prove destructive" bit. "How do you *know* it will prove destructive?", they might ask. "Have you tried it? Do you have data that demonstrates this? What if it turns out that our relationship model results in children who are happier/more successful/more charitable/more socially engaged/[insert your preferred value here]?" That is the exact kind of argument you'll have to find a different kind of support for before the rationalist community as a whole will admit it might have something to learn from you on some subject.

    • Kevin Cushing

      A secular religion seeing people as absolute masters of their destiny, looking to science and technology as means to salvation. Such hubris.

    Christians, by and large, don’t like tech culture. To many, Silicon Valley seems irredeemably hostile to New Testament values. The young people working there, in this view, are hedonistic digital yuppies riding a fiscally conservative, socially liberal ideology to its dystopian endpoint, atomized individuals with no ties of loyalty except to over-powerful and under-regulated tech firms.

    There are good reasons for this skepticism, but it’s missing something. Strange and surprising subcultures are thriving in the hothouse of tech country. Among them is a loose-knit community calling themselves “rationalists.” Based in California, this group is united by a heady mix of futuristic idealism and communitarianism.

    As an outsider, I’m by turns fascinated and frightened by the rationalists and their worldview. They can be articulate critics of the tech world’s dominant ideology. They’re also social experimenters who serve as their own guinea pigs.

    I first came across the rationalist community by clicking links on the blog of the woman I would one day marry, Leah Libresco. My wife thinks her conversion to Catholicism is partly attributable to discussions she was having on the rationalist internet. Leah loved the rationalists’ focus on seeking truth above all else and investigating ways to set aside bias in order to do that. Leah says, “I was particularly taken with an essay by the Bay Area thinker Eliezer Yudkowsky on the virtues of rationality, especially that of lightness – holding oneself so as to be ready to be moved by evidence. It helped me, as Christianity became more and more plausible, to be ready to give in – to let myself be moved by truth, and ultimately, by God.”


    While I don’t know of many other rationalists who’ve made a similar journey, Leah’s experience has shown me that Christians need to take the rationalists’ ideals seriously. We can affirm their commitment to learning and living the truth; we can applaud their attempts to create community; and we can take a cue from their desire to do good on a global scale. We can also learn from their hunger for the sacred, a desire for meaning that drives them to engineer new secular sacraments and that offers a vivid glimpse into a different kind of religious seeking.

    Origins

    The rationalist community formed on the internet, first on the economics blog Overcoming Bias and then, especially, on its spinoff website, LessWrong.com. This site was cofounded by Eliezer Yudkowsky, a self-taught researcher in the world of artificial intelligence (AI) frightened by the possibility that these programs might one day become unfriendly to humankind and frustrated that people in his field did not prioritize investigating safety measures to protect humanity from them. What if we created something much smarter and therefore more powerful than us and it didn’t share our values but instead sought an alien goal of its own – like repurposing all the matter in the universe into paper clips?

    Yudkowsky felt he could best encourage AI safety research by creating a culture of rational thinking. LessWrong is an attempt at creating that culture, and it has attracted a crowd interested in AI safety. The community has also come to discuss and sometimes emulate Yudkowsky’s other interests: the Bayesian theory of statistics, polyamory, and a sci-fi future of cryonic freezing and mind-uploading immortality.

    Rationalists take a somewhat paradoxical approach to keeping hold of their humanity in the face of technology. Apprehensive of the potential anti-human power of artificial intelligence, some in the community ponder the possibility that humanity might have to be changed radically to defeat death, perhaps shedding this mortal coil to become digital beings. By undergoing this change, they hold, we would preserve the things they see as crucial to our humanity, like our boundless curiosity and creativity. (Our physical selves are, for the vast majority in the community, not intrinsic to who we are.) One rationalist, Alex Altair, says he believes that not only could we someday live indefinitely, but we could alter our minds so as to always be learning new things and “trying on different personalities.”

    Altair, a software developer, was first drawn into the community by Yudkowsky’s fan-fiction epic Harry Potter and the Methods of Rationality. He loves the rationalists’ “world-scale ambition” and lack of bias toward localness, either in time or space. “Why care only about Earth when there are so many planets out there?” He sees no reason not to take the whole world – indeed, the whole of time – as a playing field. After all, if humanity survives to colonize the universe, he posits, it is likely that the vast majority of people who will ever exist are in our future. Averting existential risk and helping those future people is a major goal, a value that he holds because, he says, “All humans are pretty similar, compared to other objects in the universe.”

    Altair thinks that anyone who reasons with enough “hardiness of mind” will come to see AI as one of the most important risks we face as a species. Though he admits his apocalyptic worries can sound absurd, Altair thinks they are gaining steam among people who think seriously about these things, starting with Nick Bostrom (author of Superintelligence: Paths, Dangers, Strategies) and spreading to Bill Gates and Elon Musk.

    Bay Area rationalists at a Secular Solstice celebration (image from secularsolstice.com/blog)

    Rationalist Rituals

    Despite their commitment to the idea that genuine community can exist on the internet, many rationalists have affirmed the importance of community in the flesh as well. Bay Area rationalists have even congregated in group houses. Another physical gathering place is the Center for Applied Rationality (CFAR), which advances the cause of combating existential AI risk by running intensive four-day workshops on rationality. The 2016 annual LessWrong survey of rationalists had about three thousand respondents, perhaps a ballpark number for the community’s size.

    In building a shared culture, some rationalists draw on the example of religious communities. In a recent post on his blog Compass Rose, Ben Hoffman weighs the pros and cons of becoming a Quaker, given their track record of coming to what he considers correct positions (like abolitionism and pacifism) before the rest of society. In another post, he advocates the wisdom of Jewish Sabbath restrictions as a sanity check on the modern world: “If something like the Orthodox Sabbath seems impossibly hard, or if you try to keep it but end up breaking it every week – as my Reform Jewish family did – then you should consider that perhaps, despite the propaganda of the palliatives [e.g., fast food, Facebook], you are in a permanent state of emergency.” Contemporary society drowns us in noisy demands, he argues, producing spread-too-thin communities and lonely individuals.


    Altair has attended and helped organize iterations of the rationalists’ Secular Solstice celebration. The event had its genesis in rationalists’ observations of other cultures’ various rituals and traditions. “It may be weird to engineer that, but we’re all engineers,” says Altair. The first Secular Solstice event was spearheaded by musician Raymond Arnold in 2013. Arnold wrote in an introduction to a book of Solstice songs: “I have some oddly specific, nuanced, and weird beliefs. And I had the hubris to arrange a night of carefully designed ritual for myself and my friends to celebrate them.” The Solstice’s central structure is based on light and darkness. Attendees bring and light “oil lamps, LEDs, plasma balls and imitation lightsabers” that are eventually extinguished so that participants can experience darkness together. Finally, everything is lit again, a symbol of a brighter future. It’s a powerful visual, one that feels like a processed and repurposed Easter vigil, though it was apparently designed as a “reverse Hanukkah.”

    Music is a central component of the event. One of Altair’s favorites is the curiously titled “The Word of God,” a deist encomium to scientific investigation with a refrain of variations on “Humans wrote the Bible, God wrote the rocks.” After a few Solstices, Arnold amped up the song’s secularism by altering the lyrics to “Humans wrote the book of earth, time wrote the rocks” and variations thereon – God and the Bible have disappeared even as objects of deist critique.

    Apart from songs, the event features secular sermons and mindfulness exercises. Many of these focus on hopes and fears for the far-flung future, asking the attendees, “Are we being good ancestors?” It struck me that although Christians’ vision of the future is different, this would be a useful question to ask ourselves as well.

    The Secular Solstice event spread to seven cities by 2016, and attendance has steadily increased, with people coming for the music, speeches, and communal solidarity on offer. Altair sees the celebration as hugely successful at capturing the trust-building and bonding functions of religious rituals.

    Overlapping circles showing the Impact Focus, Human Focus, and Truthseeking Focus of the Rationality Community (image by Raemon from LessWrong.com)

    Rationalist Religion?

    There’s a parallel between the rationalists’ efforts to engineer ritual profundity and their way of describing human beings’ interior life. Rob Bensinger, research communications manager at Machine Intelligence Research Institute (MIRI), told me, “We can think of human minds as a kind of computer, or (if we reserve the word ‘computer’ for engineered artifacts) as a naturally occurring system that a computer could emulate.” Altair was confident that human minds could eventually be simulated perfectly: “The physical process that makes up the mind is all there is to the mind. If you simulate that in some other medium, you get the same thing.” David Souther, a CFAR alumnus and software engineer who considers himself on the periphery of the rationalist community, made the comparison more bluntly: “What makes us different from computers? Today, efficiency of processing.” He assured me that we’ll see computers as efficient as human minds in, at most, a few decades, and machine consciousness will probably follow that.

    As with minds, so with holidays. If no phenomenon, not even human consciousness, is more or other than its underlying physical process, then simulating the relevant parts of the process in a new medium will also produce that phenomenon. Take candlelight, closeness, and song; run the simulation with rationalists; and – ta-da! – the output is culture. Altair was intrigued by the analogy that the rationalist community was a sort of simulation of a religious community. “Reductionism is one of our primary tools that we tend to reach for first,” he said.

    This suggestive parallel, however, doesn’t mean rationalists are religious believers. To the contrary: LessWrong’s 2016 survey shows that believers in God were a tiny minority, with only 3.7 percent of respondents identifying as “committed theists” – a distant fourth after “atheist and not spiritual” (the winner by a mile at 48 percent), “agnostic” (8 percent), and “atheist but spiritual” (5.5 percent). Despite this, the problems the rationalists are trying to solve are essentially theological: What would a just, omnipotent entity be like? What eternal future should humans desire, and how can we get there? Thus the religious casts the rationalists take on, from Solstices to Sabbath-envy, shouldn’t be so surprising. The question from the AI research branch of the rationalists is “How can we responsibly make a god?” And the answer from the community-building branch is “First, make a church.”

    Altair, Arnold, and others are responding to a lacuna in the present world not by shrugging but by trying to build something to fill it. After all, rationalists like Hoffman are quite right that mainstream contemporary culture is in many ways inhumane, even dehumanizing. Christians have even more resources to draw on to make this critique and to model our lives on. I wish more of us were as willing as the rationalists to look foolish in the eyes of the dominant culture in service of wholeness (and in our case, holiness).

    To act more effectively on these good desires for wholeness, rationalists could use a more robust anthropology. The reductionism to which the rationalists so often resort poses challenges for a group that is seeking to preserve humanity against the threat of its potential extinction. What, in the minds of the rationalist community, are these humans whose well-being they are eager to promote? One thing that frustrated me in my conversation with Altair was the way he treated each person’s values as almost immutable, the way he resisted making normative claims about what we should value. Surely, rationality shouldn’t only help us achieve whatever we happen to want; shouldn’t it also help us shape our desires in accord with truth? According to Altair, this is a contentious issue among rationalists, though he himself has “cheated” by simply making his fundamental goal knowing true things. Though I tried, I could not get him to concede that some of those truths might be moral facts about human purpose.

    Sexual morality, for example, is an area in which rationalists’ views clash starkly with what Christianity teaches as true. It’s almost as though rationalists view monogamous marriage as something like an outdated technology. Some Bay Area rationalists have paired off in unconventional ways such as handfasting, a novel commitment ceremony that binds the participating couple for a year and a day. In LessWrong’s 2016 survey, 13 percent of respondents said they preferred polyamorous relationships. In my view, here the rationalists are blind to grave dangers: they are making themselves – and, increasingly, their children – human test subjects in a social experiment that is bound to prove destructive.

    The LessWrong rationalists have their critics. RationalWiki, another rationalist group, heaps the same scorn on Yudkowsky as it does on creationists and UFO conspiracy theorists, and essentially accuses him of leading a personality cult. But LessWrong participants are not blind to these dangers and write long sequences of blog posts on spotting and correcting bad aspects of cults. Rob Bensinger (MIRI) says he responds to this critique of the LessWrong community by asking the critic to be more specific: “I think ‘culty’ can be gesturing at a lot of different things, some of which are really important to be wary of in small ‘idea-oriented’ communities (the tendencies toward conformism and overconfidence), and others of which aren’t inherently concerning (such as weirdness).”


    The Yudkowsky-as-cult-leader charge is undercut by the fact that plenty of rationalists are willing to critique him. When his utilitarianism leads him to abhorrent positions such as claiming it would be right to torture an innocent person if it saved a sufficiently high number of future people from minor annoyances, other rationalists in his community make counterarguments. More light-heartedly, a humor page on LessWrong collected Chuck Norris–style Yudkowsky facts like “There was no ice age. Eliezer Yudkowsky just persuaded the planet to sign up for cryonics.”

    Another stick commonly used to beat the rationalists is the infamous episode of Roko’s Basilisk, a 2010 thought experiment. A LessWrong contributor posited that a future, friendly, omnipotent AI might be obligated to recreate and then torture a simulation of anyone who didn’t do enough to bring about its existence. As each day we aren’t ruled by a benevolent god-computer is a needless waste of human life, the future AI might need to institute these punishments in order to motivate us to contribute all we can to AI research. (Yes, this does involve all sorts of mental time travel, but that’s a detail; don’t forget rationalists like to think globally in time and space.) The thought experiment’s force depends on the utilitarianism common to the community: an action is right if it leads to the greatest happiness for the greatest number of people. And it was, for some of those in the community, remarkably upsetting.

    Yudkowsky deleted the original post and banned discussion of it, citing the psychological harm it could cause readers who agonized over it. The whole incident is commonly used as evidence that the rationalists are crazy. But, at least in the 2016 LessWrong survey, only 10 percent of rationalists reported ever feeling any sort of anxiety about Roko’s Basilisk, and vanishingly few were still worrying about it. All in all, the episode mostly proves that some rationalists took seriously the stated eternal-life-or-eternal-death stakes of their beliefs. If hell does not exist, it might be necessary to invent it.

    The Shape of Alief

    In the end, I find I can only describe rationalists by using a term they introduced me to: alief, an attitude that lies beneath our articulated beliefs but still informs our actions. Rationalists may not believe a spiritual creed, but many seem to share a strong alief in reverence, communion, and liturgy – even in the power of their pseudo-sacraments to help humanity pull itself out of the darkness and reach eternal life among the stars. I believe their desires – for meaning, for solidarity, for immortality – are true, truer even than they themselves know, even though they believe these things must be created by man in the face of an uncaring universe, not received as graces from the Creator.

    On this front, Christians can engage those involved in the rationalist project. These atheists don’t live in a “disenchanted” world, but in one where their actions and beliefs have meaning, even eschatological meaning. The shape of their secular hope, the desire for a future ruled by a benevolent intelligence, points towards the inescapable human longing for the kingdom of God. Of course, in their enthusiastic pursuit of truth, community, and techno-futurist salvation, they have invented some novel errors. Yet, these are people with whom we can and should be talking, though it might require learning their idiosyncratic language. And, fortunately, one of their virtues is an openness to talking to people radically at odds with them.


    I told my rationalist interlocutors that, frankly, I feared the rationalist community could come to represent the social equivalent of the unfriendly AI that Yudkowsky is worried about. After all, what he fears is the rule of a superintelligent moral naïf. The rationalists are a new, smart, powerful entity, not particularly bound by conventional morality, interested in rebuilding the world in their image. How can we be sure they aren’t turning themselves and us into paper clips – maximizing an idiosyncratic goal at the expense of normal human values?

    Altair said he thought many rationalists would take the comparison as a compliment, since it means I view them as very smart. But he also said he could understand someone fearing the scale of rationalist aims: “World-scale ambition comes with a world-scale potential for failure – not failure, destruction.” However, Altair asks critics of rationalism to talk to the rationalist community, so together we can test our beliefs and improve our mental map of the world. He says our common destiny should motivate cooperation: “Help us figure out how to make this one future that we share a good one.”

    Contributed by Alexi Sargeant

    Alexi Sargeant is a teacher and culture critic who writes from the DC area, where he lives with his wife Leah and their two daughters.
