This might be a low IQ point so forgive me lmao, but: if "human society" were to suffer a crisis of meaning, wouldn't it really only happen in the increasingly secular, increasingly irreligious West?
Please don't take this as crude, but I think it's only a non-problem for people who don't reflect on the ontological parameters of meaning (and as you said, you aren't a member of that group). If spiritual people remain confident in a fixed set of axioms that guarantee meaning, then they won't encounter any crisis. Of course, they will continue to encounter an unrelentingly material reality that undermines their beliefs, forcing them to rationalize their positions. As long as science keeps deepening its knowledge of the Higgs boson, neutrinos, etc., spirituality will have to answer by fortifying its assumptions about reality.
I don't think Bakker has much concern or time for spiritual people. For him, the unknown "outside" isn't the abode of a deity, but something more like the uncaring plane of the Old Ones, the "mad, black Deleuzianism" of Nick Land, the cosmic vortex that Rust Cohle stares down in the finale of True Detective. All metaphorically speaking, of course--but the basic idea is the same: that there is a reality that exceeds meaning, and spiritual people color this dark space in with the empyrean light of eternal godliness.
It may be the case that those who "believe" are able to fend off the semantic apocalypse, but only for so long; because if the outside isn't a humanist deity that crafted us in its own image, then it probably doesn't give a shit whether we "believe" or not.
To paraphrase Hemingway, we'll die like dogs, and for no good reason.
This is fucking mindblowing!
I'm skeptical of two of the author's claims:
1) "AI will destroy human society by destroying meaning": While I appreciate the exploration of this as a possibility, the author strikes me as overly certain that AI will develop this capability, and therefore overly certain of this destructive outcome. If his argument is that this is already happening, my anecdotal experience suggests that we're not obviously more mired in cognitive dissonance and conflict today than at other points since the Enlightenment, once you factor out other developments like globalization, industrialization, and the runaway depletion of natural resources.
2) "Civilization was doomed from the start due to the intractability of our biomechanical nature": It's easy to make this claim today in the context of environmental destruction and natural resource depletion, but I think it oversimplifies things unless we look at individual nations and cultures; some (e.g. the Germanic ones, if we focus on the easier-to-evaluate developed nations) have a much better track record on sustainability than others.
Aside from that, I thought this was a brilliant articulation of the limits of human cognition, and it looks at our social/political problems in a deeply atomic way that I haven't seen before. Thanks for sharing!
No prob! And I agree with your suspicions. Bakker is pretty confident in the evolution of superintelligent AI, and that seems to be a crux for his argument. He seems to speak from a teleological perspective that assumes history had to develop a certain way. Whether that's actually what he thinks or just a consequence of the way he talks about it, I'm not sure. But I will say that I don't think AI is strictly necessary for the semantic apocalypse (as he calls it) to come to pass. The conditions under which it could come to pass are already present, regardless of whether superintelligent AI ever arrives.