constant escape
winter withered, warm
Been interested in elaborating ideologies that are not centered around the human (which is arguably impossible), but perhaps around something upon which the human is superstructural.
Spending time here, at Dissensus, granted me an understanding of accelerationism as an inhuman ideology (at least the Nick Land brand - but again, I cannot elaborate much on his thought here. I believe someone here described accelerationism as a communism oriented around capital, which stuck with me). To be sure, whether accelerationism is effectively dead, exhausted, or has ever carried any political purchase is another question. My question: is accelerationism a taste of the kind of inhuman ideologies we may adopt at some point in the future? Perhaps ideology is no longer the right word, at this point.
What kind of ideology, or value system, would be centered "around something upon which the human is superstructural"? Perhaps a vitalism, but I'd be inclined to go deeper. Something upon which even the vital/alive is superstructural? I lack the words here, but I would say: the manner in which matter forms and compounds upon itself is the very force that amounts to intelligence. Perhaps this force is intelligence, or perhaps intelligence emerges from this force - but whether or not this force is predestined, whether or not its development is predetermined, is almost an irrelevant question. This development can be stochastic or pseudorandom and still, in hindsight, abide by some telos/purpose (rather than seeming random in the present and in hindsight).
My question is this: is an inhuman value system necessarily anti-human, if carried to some conclusion? Could humans adopt an inhuman ideology without becoming inhumane - becoming merely indifferent towards the human instead? What could such an ideology orient around? The expansion/evolution of physical ordered complexity in the cosmos? The ever-deepening information density of computers, etc.?
Perhaps humans wouldn't be the ones to "adopt" such an ideology, which is why I hesitate to even use the word ideology - perhaps there is a better word from computational theory, some set of values that operate within/behind/beyond the algorithmic? ("algorithm" as a set of instructions for a computer, instructions that spell out what is true and what is not. I welcome criticism here, but I think that is a good point of departure)
In thinking about the intersections of various developing technologies (blockchain, surveillance, machine learning, quantum computing, data/attention economies, internet of things), as well as the fact that covid is (arguably) accelerating our exodus from meatspace into the entirely manmade online realm, a realm which almost nobody understands enough to prevent their utter manipulation by the techies who do - in thinking about all this, a radically new day-to-day epistemology seems well warranted.
To be clear, it's not like most people have a thorough understanding of how "meatspace" works, but the key point is that meatspace is not manmade, or at least most of it isn't. With "cyberspace", everything is manmade; every quantum/pixel is determined, even if arbitrarily, according to a human's will. Thus, the navigator of such a world is not navigating a neutral space, but a space imbued with the ideological residue/inheritance of meatspace. This promises to enable a magnitude of power unseen in human history, right? There are people who construct/determine/curate search engines, which have become veritable extensions of our cognition - and yet we traffic through these spaces as if they were spontaneously realized by some neutral yet benevolent programmers, merely because the infrastructure is practically alien to us.
Might as well end the rant here - but I am looking for a way to smuggle a cryptohumanism into an ostensibly posthuman/inhuman ideology. Many knots to tie and untie here.