|
Post by StyxD on Aug 14, 2016 12:19:14 GMT
"best way to make all humans happy is to kill all humans? just don't have your theoretical AI think that way I guess. problem solved."
Oh, absolutely. It's not a criticism. I just mentioned that thought because it occurred to me first when I was writing the previous post.

"no i invite discussion, my main critique is that you are treating it like a practical proposal. It's more a plot device and thought experiment than my vision of a possible future. (...) So the point? I suppose my point is that a totalitarian rule can be a positive state of things, even if such control is taken to such extremes as 'the resistance is part of the system'"
See, the problem I have with this thought experiment is that I don't really get what the experiment is. If the base assumption is that "it's totalitarian and it's awesome", then it's hard to really reason about whether or not totalitarianism could in theory be awesome (which I gathered was the point of this) - it's awesome because you said so. Everything the AI does (without specifying what it does) is right. Everyone is happy (without specifying how and why). There's no discussion, really.

"My take is that Utopia would require an absolute dictatorship with a god-like leader, and a level of technology approaching Clarke's third law. So it's, basically, the biblical Kingdom of God. To be honest, I would be hoping that, at least in theory, the AIs in my idea would be objectively right."
No political view can be said to be objectively right, and no vision for the future objectively the best - they are all predictions. The AIs would just be biased towards one of the visions, and perhaps able to efficiently bring it about, but their direction still could not be objectively right.
|
|
|
Post by TempestFennac on Aug 14, 2016 14:52:36 GMT
I meant their solutions would be practical within the confines of reality while still sticking with their ideology.
|
|
|
Post by tiberia on Aug 16, 2016 1:26:56 GMT
Not quite. The idea is just of a hypothetical positive totalitarian state. The question, however, is whether this is desirable from an ideological standpoint. Not a practical one. A man like Dostoyevsky would say no, that human freedom is more important above all else. Another might say no because they feel struggle itself has an inherent value, and so conflict and struggle should be embraced, not lessened. Etc...

If you could put yourself in a box that would stimulate your brain so that you would have perfect happiness and satisfaction forever, and you would live forever more in that box, would you do it? By practical measure you should say yes, but I wager you will say no, because you will have some serious ideological issues with living in that box even if it gives "perfect happiness".

Politics, and the imagining of society, have never been much for the practical. Listen to how communist revolutions describe their goals, then tell me how practical they are being. Next look at how much ire the very idea of a communist society will garner even today.

You say there is no discussion, but you are very wrong; you are just looking at the item of contemplation from the wrong angle. Stop thinking in practical terms. I already said nothing about this is practical. Think about ideology. Think about the ideal role of government; the intrinsic value of freedom, struggle, and conflict. Is it desirable for things to be controlled, or undesirable? If you want to bring in the religious lines, would control of this magnitude constitute a very real example of playing god? Is just creating the AI akin to a golden calf? NEVER say there is nothing to discuss. There is always something, even if it's a painting that's a uniform shade of blue.

You aren't wrong; they do call the singularity a modern messiah myth for nerds for a reason. And this also takes off some of the smoke screen from what people may actually be saying when they bring up a kingdom of heaven.
|
|
|
Post by Canuovea on Aug 16, 2016 3:32:55 GMT
Of course, one must wonder if such a state of plenty and happiness requires authoritarianism in the first place, even if said authoritarianism is in the form of an AI. So... basically, is what is "positive" about this state something that requires giving freedom up at all? I'm not so sure. Of course, it is difficult to discuss such things without diving into matters of practicality. Still, you've taken this in one direction, but you could easily take it in an anarchist direction as well.
Also, if you wouldn't want to put your brain in a box to have it stimulated to perfect happiness, then it clearly isn't perfect happiness.
|
|
|
Post by tiberia on Aug 16, 2016 4:27:53 GMT
Practicality of anarchism aside, I don't think I'd want to live in a utopic anarchist society. It would feel directionless and listless to me. There would be a smallness to it, with everyone's world limited to just themselves and their social circle. Maybe you would have some community projects, but I don't think you would get that sense of being a part of something, which I think is needed. The moment you really got something you could feel a part of, it would stop being anarchism. As paradoxical as it might be to say, I feel like a utopic anarchy, by relying so much on the individual, would make me feel all the smaller. I think we should come together in some organization of some form.
|
|
|
Post by Canuovea on Aug 16, 2016 4:33:32 GMT
I'm not entirely sure a sense of being part of something is needed for people in general, but perhaps for some. However, I feel that there are some anarchists who would reject the notion that anarchism and community are contradictory. So I suppose we'd need to ask an anarchist.
Though I personally do agree that community organization and working together are generally to be desired. Nonetheless, feeling smaller isn't necessarily bad. Less can be more, and small can be beautiful. Then again, to achieve something like going to space, or expanding the frontier, or even (likely) scientific discovery, you'd need coordination.
|
|
|
Post by StyxD on Aug 16, 2016 15:22:22 GMT
"A man like Dostoyevsky would say no, that human freedom is more important above all else. Another might say no because they feel struggle itself has an inherent value, and so conflict and struggle should be embraced, not lessened. Etc..."
See, for me the question of freedom is a practical one. How will this "totalitarian AI" limit human freedom? Why at all? We are limited in freedom anyway, if only by the laws of nature and causality. So it's: what more are we giving up, in exchange for what? If you had, for example, the AI act as a guardian angel / advisor, which makes all the decisions for the humans under its ward in accordance with their innermost needs and desires, and those decisions are guaranteed to maximize your satisfaction with existence - but in exchange you can't disagree - then we could discuss that. But as you presented it, your utopia is completely undefined. Except that it's both totalitarian and utopic - two states that are essentially contradictory without further explanation. Well, I guess we learned that I'm not a "freedom freak", or something. But we don't - and can't - achieve total freedom anyway. The struggle argument I regard as inherently fallacious.

"If you could put yourself in a box that would stimulate your brain so that you would have perfect happiness and satisfaction forever, and you would live forever more in that box, would you do it? By practical measure you should say yes, but I wager you will say no because you will have some serious ideological issues with living in that box even if it gives 'perfect happiness'"
If we could create boxes that are eternal, faultless, and can manufacture perfect happiness for all humans, then by all means, we would have achieved the pinnacle of human development. There's nothing more left to do but for everyone to GET IN DA BOX! But such a box cannot exist. For one, it would still have to exist in this universe, and it could be affected by things from this universe. And it would be impossible for the person inside the box to react. So... my objection lies in that it would not be safe enough. But in the realm of ideals, I would have no objections. But! It also depends on what you mean by "stimulate the brain". If it's by chemical drugs, then that perpetual high would not be something I want. But if it's by sensory stimulation, I'd be in for it.

"Next look at how much ire the very idea of a communist society will garner even today."
To be fair, nowadays when people say they don't like communism, most of the time they mean "that Jewish conspiracy that forces us to accept homosexual deviants". Not all of them. But most. Words don't mean things anymore, sadly.

"NEVER say there is nothing to discuss. there is always something. even if its a painting that's a uniform shade of blue"
But... it's not uniform!

"they do call the singularity a modern messiah myth for nerds for a reason."
Except more silly. It reminds me of when the LessWrong blog "reinvented" the wrath of God against the unfaithful, and then part of them had a collective nervous breakdown about it. Because the singularity is such a serious and inevitable concept.

"Maybe you would have some community projects, but I don't think you would get that sense of being a part of something, which I think is needed. The moment you really got something you could feel a part of, it would stop being anarchism."
Hey, but here I totally can muster an ideological objection. Setting aside the fact that anarchists aren't anti-community, just anti-hierarchy, and that people tend to limit themselves to their social circle anyway, whether or not they ascribe to themselves and their circle the meaning of being "part of something more" - I'm vehemently distrustful of the "need to be part of something more" feeling. On one hand it's natural; on the other, it's the root of so many evil things, the festering womb, the repugnant origin, whence all granfalloons spew forth. People feel "the need to be part of something bigger" - and so they join crazy sects, authoritarian mobs, and all manner of cults of personality - following anyone who can peddle to them the feeling of "bigger", of "some direction". It doesn't need to be something good, or concrete, or beneficial - it can be literally anything. Da feelz is all that matters in that regard. And with the universe being what it is, it's much easier to create a grand illusion that generates huge feelz of having a direction in one's life than to create something actually meaningful - acknowledging the truth that ultimately this universe has none. Wow, dunno why I suddenly felt like ranting like that. Maybe because you've just admitted you'd rather have an authoritarian utopia instead of an anarchistic one, because you want to feel (feel!) big. Which kind of plays right into this little theory of mine. But there, we have a discussion.
|
|
|
Post by Canuovea on Aug 17, 2016 6:38:01 GMT
Question. How would you define Utopia? Give the dictionary definition if you want, but I'd prefer having it in your own words (we could always compare to the dictionary definition).
Basically, we need some understood shared premises here.
Also, StyxD, do you mean Utopia and Totalitarianism are essentially contradictory without further explanation, or with further explanation? I'm kinda confused ("without further explanation" is the more common phrasing in English, so I dunno if English is just being dumb again or if there was a typo), and what do you mean by that?
|
|
|
Post by StyxD on Aug 17, 2016 22:55:27 GMT
"Also, StyxD, do you mean Utopia and Totalitarianism are essentially contradictory without further explanation or with further explanation? I'm kinda confused (as 'without further explanation' is more commonly used in English, so I dunno if English is just being dumb again or if there was a typo) and what do you mean by that?"
It was always saying "without". You ain't not seen no typo.

"How would you define Utopia?"
A state where every citizen is happy and no (serious) problems occur for anyone in the populace.
|
|
|
Post by Canuovea on Aug 17, 2016 23:34:09 GMT
Nice. I'd probably say something similar.
Though how does a society that has an AI who knows everything you do (and probably what you think and want), provides all that everyone needs for life, and seems to be able to prevent any and all harm from occurring, end up contradicting that?
In fact, I'm not sure that we're really dealing with totalitarianism and anarchism as being different at all in this case. Ironically, I think we've got a marriage of anarchism and totalitarianism. The AI makes all the governmental decisions, but also seems to be mostly enforcing a certain base set of standards (health, safety, etc) and really ensuring a world without consequences. The humans are free to literally do whatever they wish in this scenario, even to each other so long as the AI doesn't consider it harmful (and since the AI knows what the people involved want, more likely than not...).
|
|
|
Post by StyxD on Aug 18, 2016 10:48:19 GMT
"Though how does a society that has an AI who knows everything you do (and probably what you think and want), provides all that everyone needs for life, and seems to be able to prevent any and all harm from occurring, end up contradicting that?"
I guess the issue is not our definitions of utopia, but of totalitarianism. Totalitarianism means that the government strives to control every aspect of the citizens' lives. This is why it's contradictory with utopia: since you can't live the life you want, but rather the government chooses it for you, it's not - in the general sense - a place where everyone can be happy. Unless you can somehow winnow the populace and keep only the people who willingly embrace the governing ideology - which is the holy grail of populist dictators everywhere. But I digress. Now, if the AI in our example bends around individual citizens to cater to their needs, while still allowing them no option but the one they should logically want, we can argue whether or not such totalitarianism would be a utopia. But that's not how Tib presented it. She said - as I understood it - that it's both totalitarianism and utopia, but did not specify what boundaries the AI enforces on the citizens. In fact, if:

"The higher tiers would be left to the individual, with a bit of help. So all that would be left for a person to worry about is stuff like self-actualization, and from there they decide what is best for them, and the AI just sort of plays referee from there."

then I'm not even sure this qualifies as totalitarianism. For example, can a citizen publish a pamphlet about how they hate the AI?

There's another thing I forgot to mention. I'm not really taking the "controlled rebellion" aspect into consideration when writing all this, because it feels... very oddly specific in this case. All aspects of the system are left in the air, except the fact that the rebellion is controlled by the AI. But why even allow rebellion? It would, as I mentioned, erase the utopia status, because 1) innocent people would have to deal with the threat of terrorism, and 2) the game of cat and mouse with the rebellion (which the AI controls, but can't allow to overthrow itself) would have to escalate constantly, since if the rebels achieved flat nothing for decades, they'd become disillusioned - which would defeat the whole purpose of the controlled rebellion, somehow. There are other ways to deal with such things when you're an omniscient AI. It's such a specific element that I feel it only exists to facilitate erotic fiction, or something...
|
|
|
Post by TempestFennac on Aug 18, 2016 14:06:56 GMT
How would that set-up with the rebels facilitate erotic fictions?
|
|
|
Post by Canuovea on Aug 18, 2016 19:37:52 GMT
I was going to move on to totalitarianism's definition eventually, I'm sure...
In this case (I'm going to try to frame it), there would be no government, no citizens' input in terms of laws (at least not really), no citizens' say in any of the decisions the AI makes (at least not really), and no right to privacy from the AI (who would be watching literally everything). People would be living in a world whose actual structure and rules they have no say in. It has one individual entity that makes all those decisions for everyone. If that is not totalitarian (and you're right, it may not be, because said entity does not necessarily want to totally micromanage people's lives), then what do we call it?
Certainly, I don't think she meant that the AI forces upon citizens the optimal choice for what they really want. I think you're right that she meant the AI provides the baseline for human needs and enforces certain rules, rules which the AI probably created.
As for the rebellion thing...
Well, any threat of terrorism would be at most a nuisance, since nobody is going to actually be hurt. The most that could be done is damage infrastructure. While that might get old eventually, who knows?
Also, I don't think the rebellion bit has anything to do with erotic fiction really, since that may (may) have had more to do with me raising the question of "and what happens with people who find that kind of control over them unacceptable?" Or maybe Tiberia already thought of that without me.
|
|
|
Post by wordweaver3 on Aug 19, 2016 21:05:56 GMT
What I don't get is how an AI is going to interpret morality and utopia, since both are subjective. One person may say the perfect world is one without any wants, where all their needs are met; another might consider a world without struggle and desires a pointless existence. In the end, whoever programs the AI ends up being the one who is really in control of everyone's fate. The world becomes subject to that person's narrow view of "perfection".
|
|
|
Post by StyxD on Aug 19, 2016 22:35:53 GMT
I'm not really sure what that system would be called. If all important things were handled by an absolute AI driven solely by logic, while individual choices, insignificant in the big picture, were left to humans, then I guess it could be given the moniker of "technocracy"?

"Also, I don't think the rebellion bit has anything to do with erotic fiction really, since that may (may) have had more to do with me raising the question of 'and what happens with people who find that kind of control over them unacceptable?'"
Then why not just have them... leave? It would be an imperfect solution, but surely more efficient than an unending simulacrum for the sake of a really small handful of people (if everyone were content and only idealists opposed the system, there would be very few of them).
|
|