version

Well-known member
They're just exercises. It's a matter of minutes. You're supposed to get up every twenty minutes or so whenever you're sat down anyway.
 

yyaldrin

in je ogen waait de wind
I dunno I think it all depends what your working assumptions are

My work on incentive structure, reward functions, surrogation, and Campbell's law has led me to think greater data collection is actually the path towards a fairer, better world. There are certainly dangers of giving access to data to corrupt bodies, but at the same time, the data is probably our best solution to corruption. So there's opportunity as much as there is danger in this tech

you sound exactly like miles dyson from terminator.
 

constant escape

winter withered, warm
I dunno I think it all depends what your working assumptions are

My work on incentive structure, reward functions, surrogation, and Campbell's law has led me to think greater data collection is actually the path towards a fairer, better world. There are certainly dangers of giving access to data to corrupt bodies, but at the same time, the data is probably our best solution to corruption. So there's opportunity as much as there is danger in this tech
I think my main point, regarding this widespread opposition to the ostensible techbro dystopia, is that much of what we think we need to live a magically fulfilling life is actually extraneous. I struggle to articulate beyond this, but what I'm saying is that the ever-data-heavy whirlwinds of algorithmic governance and production are only daunting and dark if we continue to clutch what we feel to be the only rocks in the river. They aren't.

There are rocks downstream that sit in much more hospitable waters, and moving downstream, or rather letting yourself be swept downstream, requires that you affirm the direction of the currents. In order to let yourself be swept up by the current, you need to believe there is a current in the first place. If you don't believe there is an "objective" direction the cosmos is taking, I can see where all this just seems ideological. And hey, much of it is, to the extent that the current itself has a direction.

edit: to be sure, any one articulation of this stuff can only approximate it, so I could very well be leaving out or overlooking critical dynamics.
 
  • Love
Reactions: sus

constant escape

winter withered, warm
This is the thread for it - so what are the dark feelings or thoughts that arise when anticipating the future, specifically in light of the increasing power of the tech-literate, especially the higher-ups among them?

Is it a threat against widespread libertarian values? Against privacy? Or is it a broader aversion to having your strings pulled by someone/something? Perfectly reasonable, as far as I can tell.

Is the unease bolstered by the (ostensible) lack of demographic diversity among these players? I really am not familiar with the lineup, so I can't speak further here. But seemingly, the common opinion is that it is a bunch of young white guys, of varying crunchiness, that are leading us into this frontier. At least that is the sense one gets in America, albeit perhaps by lack of information. Again, I don't know.

So is it more of an ethical thing, that the kind of age-old biases will manage to permeate the infrastructure/code of our future landscape even more robustly than they currently do, perhaps despite best intentions at mitigating the "New Jim Code"? Or is it more a fear of being manipulated by people who can, quite effectively, reach through your screen and rewire your circuits?

Let's spill it. What is scary about this? It seems to be a point of consensus among otherwise ideologically diverse members, that this tech-heavy future is almost necessarily dystopic.
 

version

Well-known member
It's pretty much all of that, although it doesn't matter which demographic's pulling the strings. The problem is the strings being pulled, not who's pulling them. I don't trust any human with that kind of power.
 

constant escape

winter withered, warm
What do you think of humans increasingly outsourcing that power/responsibility to the bots? We get a taste of this regarding the financial/stock market's usage of machine-learning/AI, millisecond operations, whatnot.

Also, what do you think of the proposition that part of our collective responsibility is to monitor these trends, predicting their trajectory and attempting to minimize damage from runaway developments?

The humans at the forefront of these developments might harness insane amounts of power, but we could get metaphysical and attribute that power to larger forces acting through them, forces that seek expression and yield only marginally to our values/intentions. But that margin could be the difference between dystopia and utopia, no?
 

version

Well-known member
I don't trust that either. I don't trust people not to do something stupid and dangerous with AI out of greed, malice or ignorance. And yeah, I think we do have a collective responsibility to monitor this stuff and attempt to minimise damage, but I don't expect us to do so since we still haven't done it with previous technologies.
 

sus

Well-known member
This is the thread for it - so what are the dark feelings or thoughts that arise when anticipating the future, specifically in light of the increasing power of the tech-literate, especially the higher-ups among them?

Is it a threat against widespread libertarian values? Against privacy? Or is it a broader aversion to having your strings pulled by someone/something? Perfectly reasonable, as far as I can tell.

Is the unease bolstered by the (ostensible) lack of demographic diversity among these players? I really am not familiar with the lineup, so I can't speak further here. But seemingly, the common opinion is that it is a bunch of young white guys, of varying crunchiness, that are leading us into this frontier. At least that is the sense one gets in America, albeit perhaps by lack of information. Again, I don't know.

So is it more of an ethical thing, that the kind of age-old biases will manage to permeate the infrastructure/code of our future landscape even more robustly than they currently do, perhaps despite best intentions at mitigating the "New Jim Code"? Or is it more a fear of being manipulated by people who can, quite effectively, reach through your screen and rewire your circuits?

Let's spill it. What is scary about this? It seems to be a point of consensus among otherwise ideologically diverse members, that this tech-heavy future is almost necessarily dystopic.

Plato's Republic with Spendy as Philosopher King
 

sus

Well-known member
SV's libertarian streak might be its saving grace, in the end—too much hippie utopianism unchecked is a recipe for high-modernist interventionism. "Oh, we're helping! They just don't know it yet, but one day they'll appreciate us."
 

sus

Well-known member
I don't trust that either. I don't trust people not to do something stupid and dangerous with AI out of greed, malice or ignorance. And yeah, I think we do have a collective responsibility to monitor this stuff and attempt to minimise damage, but I don't expect us to do so since we still haven't done it with previous technologies.

One thing we have to maybe face down is the technological inevitability. So, e.g., while it's true that a lot of AI dev is coming out of the Valley, it's also happening in China, in Russia, funded by centralized government agencies specifically in the service of warfare, and where there's no question that it would be put to (is already being put to) pretty oppressive ends. And individuals from the Valley are predominantly the group sounding the alarm on how dangerous AI can be—trying to get the public and the government to take it seriously. Apparently Clinton's campaign, for all HRC's faults, was taking superintelligence seriously, and it's a shame we didn't walk that timeline.
 

constant escape

winter withered, warm
Great points, and yeah you're right that the mainstream conversation regarding AI is locked in a kind of fetal paralysis because the concept is too fantastic and dominated by sci-fi portrayals (which are often sensationalized to highly detrimental extents, no?). Ultron, Her, Matrix - various portrayals (of varying calibers) which send messages that are almost always overshadowed by the more impressive production elements.

But underneath that frozen layer, that unactivated conversation, festers this age-old insecurity about being somehow essentially distinct from the physics around us, denying the possibility that we could eventually build something, physically, through increasingly masterful manipulations of whatever is around us, that surpasses us at some atomic or subatomic level.

Because even a cursory mainstream understanding of AI (basically that it is an intricate pattern-recognition gauntlet, one that could be expressed in such a way as to adhere to the Common Sense, to be accessible and even intuitive) could enable markets to open up for various industries/fields, even recreational ones, and we could hasten the road to some AI golden age (perhaps an age where myriad material problems are solved, but are supplanted by higher-order existential and metaphysical problems?).

But such an understanding is bogged down, and yeah I suppose almost the only people attempting any public education are more or less coming from Silicon Valley, no? Is that your point, and that however dubious they may seem to us, we can rest assured that there are far more dubious technicians and investors out there?

Part of what makes it scary, perhaps, is how heavily associated it is with surveillance (quick thought about phenomenology @suspendedreason, perhaps data collection can be appreciated subjectively as a kind of surveilled paranoia, like there are forces objectively above you that can cultivate you how they please. Phenomenology of paranoia?).

surveillance (n.) 1802, from French surveillance "oversight, supervision, a watch," noun of action from surveiller "oversee, watch" (17c.), from sur- "over" (see sur- (1)) + veiller "to watch," from Latin vigilare, from vigil "watchful" (from PIE root *weg- "to be strong, be lively"). Seemingly a word that came to English from the Terror in France ("surveillance committees" were formed in every French municipality in March 1793 by order of the Convention to monitor the actions and movements of suspect persons, outsiders, and dissidents).

The example that would likely come up first is China, no? I know very little about China, and I wish I knew more, because it would be crucial to understand how AI is implemented there, even from an outside perspective. Is anyone here more familiar with how machine learning/AI has been integrated into China?
 
  • Like
Reactions: sus

luka

Well-known member
Vimothy, who makes over a million (£) a year as a programmer, says there is no way AI is happening. It's sci-fi fantasy.
 
  • Haha
Reactions: sus

padraig (u.s.)

a monkey that will go ape
in re #notalltechbros - "tech lords" would be more accurate than "tech bros"

when I say something like "dystopia of UBI proles ruled by an elite of tech-bro would-be philosopher-kings"

I'm talking about yr Bezos, Zuckerberg, Thiel (especially), Jack Ma, etc and lesser examples

not so much programmers per se - may or may not be - as the robber baron class of Big Tech, and a posthuman global superrich in general

actual tech bros in this imagined (likely) future are more like an elite class of servants, feudal retainers, not the lords themselves
 
  • Like
Reactions: Leo

padraig (u.s.)

a monkey that will go ape
go live in the Dark Ages, or ancient Sparta as a peasant-slave or some shit
I thought you had studied history? or enough not to make such lame caricatures as "the Dark Ages"

hunter-gatherers successfully existed for tens of thousands of years. industrial humanity has wrecked the planet in a few hundred.

that's not an argument for quality of life, just sustainability

no one here is advocating a return to an earlier way of life. such a thing isn't possible.

or if it happens, it will be patchwork as society gradually collapses, bits of old and new all jumbled up together jury-rigged

the planet has been wrecked to such a degree that the only way out is probably some kind of technological solution

hopefully it's more Star Trek (leaving out the Federation's "we're the good guys" authoritarianism) than dystopic

I ain't holding out much hope
 