constant escape

winter withered, warm
My point is that whether or not there is something propelling complexity, complexity can hasten itself by imposing a "something" to propel it. Inventing history to predict the future.
 

constant escape

winter withered, warm
Disputability is negatively correlated to the kind of advancement I have in mind, because by advancing you are subsuming more and more of positionality into your superpositional state. It really is a vast all-or-nothing thing, it seems. But it withstands (seemingly) everything that is thrown at it. At the very least, it, as a "metaideology", can be embraced as a way to cut out a good bulk of the abstract suffering that would otherwise await us.

The dubious thing is the drinking-of-one's-own-koolaid that is fundamentally opposed to the "there is no "objectively" optimal path" sensibility, it seems. Not saying you said that, but it does seem to be a common sensibility.

And even after all of this, and at the very least, it still seems like a much more open-minded and productive dogma than we have previously seen. Presumptuous? Definitely. And that is even assuming the dogma isn't some cosmic optimal.
 

padraig (u.s.)

a monkey that will go ape
complexity can hasten itself by imposing a "something" to propel it
imparting a consciousness and will to "complexity" is the same as imparting them to the "cosmos"

you're just worshiping at a different altar - complexity, guided self-organization, whatever

while saying "but no, this is different"
 

padraig (u.s.)

a monkey that will go ape
but yes, there is no objectively optimal path

you'd have to qualify what you mean - optimal for who, or what - and then it isn't objective
 

constant escape

winter withered, warm
imparting a consciousness and will to "complexity" is the same as imparting them to the "cosmos"

you're just worshiping at a different altar - complexity, guided self-organization, whatever

while saying "but no, this is different"
Even if this is just worshipping at a different altar - is it not the most pragmatic one? A pragmatism that ought to calibrate around some averaged aim of humanity.

If it is imparting a consciousness to complexity, is it not also demystifying/dethroning consciousness as transcendent/divine? This still all needs to be grounded in physics, and I can't say the "hard problem of consciousness" is something I yet have an answer for. But one of the goals would be to diffuse the significance of consciousness throughout the cosmos itself.
 

constant escape

winter withered, warm
"The dubious thing is the drinking-of-owns-own-koolaid that is fundamentally opposed"

can this be clarified without the word salad

You're right - I mean the circular logic of value, how a given metric can ascribe value to its own meters. Bootstrapping, the Book is true because the Book says the Book is true. Drinking one's own koolaid, circular logic, is opposed, not without reason, by a more empirical "value needs to come from something prior/higher" sensibility.

And I mean optimal in the sense that molecules can have some kind of optimal structure, some structure that preserves them against dismantling forces. There is arguably always a more optimal state, given the virtually infinite number of states, but the structure itself can become more and more optimal without hitting that point.

We can say "but its still subjective to say that molecules which last long enough to compound onto one another are better molecules, "objectively" more optimal" but that is where I draw the line, I guess, in terms of skepticism. At least, I don't see what is to be gained by it.
 

padraig (u.s.)

a monkey that will go ape
where you're going astray is with the idea that complexity is an objectively desirable aim in and of itself

I listened recently to a multidisciplinary roundtable on metaphysics (expansively) with people like Dennett etc

Stephen Jay Gould said something of relevance - that people often mistake evolution to imply that that which survives is "better" (I'm paraphrasing)

his example was woolly mammoths - they weren't "better" elephants than some other variant that died out, only more optimal for their conditions

just as a molecular structure can be more optimal in terms of a specific biochemical process, but not "better"

not seeing what is gained by that distinction is exactly what places you at the altar of complexity

it's a common line of thinking these days with tech-savvy youth et al

the drive to optimize everything, under the assumption optimal is better and thus an end unto itself
 

constant escape

winter withered, warm
Part of the assumption is that science does actually peer ever more deeply into how things work, even if the measurement of this is how effectively we can predict/structure the cosmos. Even if some portion/bulk of science is supportive scaffolding for the rare insight, there is the rare insight.

And another assumption would be that there are universal optimals - not that the universe has preferences, but that we, in our limited nature, can afford to think of it as having preferences. The science doesn't need to be right, as much as it needs to be right enough. Enough for what? The advancement of science, perhaps?

Part of the difficulty is that, to get this across, one needs to express that all expressions, even the ones aimed against it, are valid. To take a position that affirms all positions, in a way that seems to pass through our register without registering, seemingly like this quantum business. How is it possible to be planted in multiple places at the same time, perhaps even in all places, everywhere, at the same time?
 

padraig (u.s.)

a monkey that will go ape
Part of the difficulty is that, to get this across, one needs to express that all expressions, even the ones aimed against it, are valid
it's hard to think of a better summation of benevolent totalitarianism

you can't even have a truly dissenting position because all is equally valid, equally contained within the metastructure (or superposition, or whatever)
 

sus

Well-known member
Great points, and yeah you're right, that the mainstream conversation regarding AI is locked in a kind of fetal paralysis because the concept is too fantastic and dominated by sci-fi portrayal (which is often sensationalized to highly detrimental extents, no?). Ultron, Her, The Matrix - various portrayals (of varying calibers) which send messages that are almost always overshadowed by the more impressive production elements.

Right, media portrayals basically show AI as human-like, which is crazy. Human intelligence, and all known animal intelligence, is like this tiny pin-prick in the space of all possible intelligences. Films like Ex Machina portray it like, “Well, you know, we’ll build AIs, and the AIs will, of course, want some resources for themselves, but if we’re nice to them, they’ll probably be nice to us. And on the other hand, if we’re cruel to them, and we try to enslave them, then they’ll resent that, and they’ll feel rebellious, and they’ll try to break free.” But of course, that kind of thinking may have a lot to do with human normative biases and not with how intelligence works broadly. I like the idea of intelligence as a natural kind, that it's purely instrumental, one's ability to achieve goals, completely separate from goals themselves. That allows ethics and intelligence to be completely, cleanly uncoupled.
 

padraig (u.s.)

a monkey that will go ape
and I'm a staunch defender of science - I have a reasonable background in it, albeit not physics

but I'll tell you what science can't do, will never be able to do - tell you something is "better" or "worse" than something else

only better or worse in some specific context
 

sus

Well-known member
Vimothy, who makes over a million (£) a year as a programmer, says there is no way AI is happening. It's sci-fi fantasy.

Pessimism is definitely the woke, insider take on this issue; a lot of people in the field look at e.g. the GPTs, which are cutting-edge and have high-wow factor, as very very far away from general superintelligence. But still, that kinda thinking means we're maybe 50-100 years away from transformative AI, rather than 10-20 years. The inevitability is there; it's just a matter of timescale.
 

sus

Well-known member
I thought you had studied history? or enough not to make such lame caricatures as "the Dark Ages"

hunter-gatherers successfully existed for tens of thousands of years. industrial humanity has wrecked the planet in a few hundred.

that's not an argument for quality of life, just sustainability

I'm glad it's not an argument for QoL, cuz if we're talking history, male homicide rates in historic hunter-gatherer societies are like, what, 30%? Infant mortality is like 50%? Childbirth mortality not far off, etc. Expected lifespan even if you beat child mortality is still like, 40? Brutal times.

There's no doubting we've fucked the ecosystem up, but I don't think it's quite so hopeless, we might just see this as a bad blip of irresponsibility. We're an adolescent society, right? We've never really wielded insane, planet-changing power until this past century. Until recently, no one even knew it was possible to change the climate through industrial production, gas emissions etc.

I think there's maybe both optimism and pessimism in the adolescent picture, each in equal measure
 

constant escape

winter withered, warm
benevolent totalitarianism
I can start to see where this could very much apply to what I'm talking about.

tho capitalism is already perfectly good at that, without all the fancy cosmology talk, in the form of recuperation
Hopefully the right cosmology can harness capitalism's incredible momentum, rather than let capitalism continue to exploit our relatively immature sensibilities.
 

sus

Well-known member
@suspendedreason you really do have to stop attributing positions to people they don't hold, it's a terrible habit

again you'd think someone with a fancy education would know not to keep setting up strawmen

if people bring up my "fancy education" one more time I swear to god I'm leaving this forum. ad hominem is a worse offense than unintentional misattribution
 