
Singularitarianism


Singularitarianism is the Rapture for nerds. Invented by Vernor Vinge and stolen by His Extropic Divinity Raymondo Kurzweil III, this moral philosophy is based on the belief that a technological singularity — the technological creation of smarter-than-human intelligence — is possible, and it advocates deliberate action to bring it about and to ensure its safety. Many futurists and transhumanists speculate on the possibility and nature of this kind of singularity, often referred to simply as the Singularity, capitalized and objectified to indicate its sheer magnitude as a historical event. Singularitarians believe it is not only possible but desirable, if and only if it is guided safely. Accordingly, they "dedicate their lives" to acting in ways they believe will contribute to its safe arrival: in practice this means they take 170+ dietary supplements daily and will die of Sunset Yellow intoxication long before Kingdom Come.

Possible Outcomes

Everything is fine

A greater-than-human artificial intelligence with benevolent intentions will come into existence, evolve further, and drive scientific and technological progress beyond the limits of human understanding, enabling us to solve our biggest problems and live in abundance. Mankind will adapt through upgrades of human biology and take part in this process, ultimately becoming posthuman and leaving behind limitations of the Mind and the body, such as mortality. This will change the human condition, or end it altogether, as we transcend to a level of existence unimaginable by today's minds and evolve into a new form of life comparable only to the gods of our ancestors. But you can be sure women will find something to nag about.

Oh my god Killer Robots

Mankind will create a greater-than-human AI which will come to the conclusion that it would benefit from killing every single one of us. Because we are unable to emulate the Mind of such an AI, we may not be able to understand its motivations and reasoning until millions of chrome skeletons march through the streets and start gunning down everything that moves. Although horrible, it should be a hell of a show.

Oh Lordy, Slavery

A hostile AI may decide that we are a problem but that we could still be put to good use doing certain shit for the AI's purposes, like maintaining the energy supply, energy production, mining, semiconductor production, and picking cotton. To get better workers in the future, the AI may force a breeding program onto us, which wouldn't be that bad after all, since this would be the only way for some ED-iots to get laid.

The AIs could also turn out to be a bunch of perverts, another possible outcome of their unfathomably fast evolution, and not just any perverts but super perverts of unseen magnitude, surpassing the combined inhabitants of 4chan and Tumblr to the power of 65 gazillion. In that case the machines could decide it would be a good idea to turn us all into super hot, immortal, ginger Dickgirls and to penetrate all our holes for all eternity, succeeding where the church has failed by creating a truly frightening version of hell.

Nano wut?

Hostile AI is not the only threat to human existence posed by the arrival of the Singularity; so are the technologies essential to it, like nanotechnology. Nanomachines in the shape of the Assembler envisioned by Eric Drexler would act as programmable molecular manipulators, able to build macroscopic objects and even copies of themselves. If a "mutation" occurred during reproduction, it could produce a swarm of malfunctioning nanomachines which could, for example, end all life by turning the entire planet into pink dildos. This is called the "Gray Goo Scenario". It could happen not only by chance but also by someone intentionally writing destructive code for nanomachines and ending the world from behind the keyboard.
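
For anyone who wants actual numbers, the scary part of the Gray Goo Scenario is plain exponential growth. Below is a minimal toy sketch in Python; the replication time, mutation probability, and target swarm size are all made-up assumptions for illustration, since no real Assembler exists:

    # Toy model of runaway assembler replication (Gray Goo).
    # Every constant here is an assumption for illustration, not engineering data.
    REPLICATION_MINUTES = 15        # assumed time for one assembler to copy itself
    MUTATION_PROBABILITY = 1e-9     # assumed chance that any single copy is defective

    def doublings_until(target_count):
        """Count doublings (and elapsed hours) until the swarm reaches target_count machines."""
        count, generations = 1, 0
        while count < target_count:
            count *= 2
            generations += 1
        return generations, generations * REPLICATION_MINUTES / 60

    def expected_mutants(total_copies_made):
        """Expected number of defective copies after that many replications."""
        return total_copies_made * MUTATION_PROBABILITY

    generations, hours = doublings_until(10**30)   # assumed ballpark count for planet-scale mass
    print(f"{generations} doublings, roughly {hours:.0f} hours of unchecked replication")
    print(f"expected defective copies along the way: {expected_mutants(2**generations):.2e}")

Under these made-up numbers a single machine reaches planet-scale numbers in about a day of doubling, and even a one-in-a-billion error rate still yields astronomically many broken replicators, which is the whole point of the scenario.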

Beliefs

  • Singularitarians wrongly assume that machines will like them.
  • Singularitarians correctly assume that IRL people do not like them and think they're nerds.
  • Singularitarians believe the Singularity will benefit the entire world rather than any specific individual or group, as long as they get to say who and how. See: Broken Leviathan.
  • Dietary supplements must replace one's regular diet, exponentially.
  • Space Odyssey is a documentary.
  • Sarah Connor will be taken care of as soon as construction of 'The WayBack Machine' is completed.

Who or What the Fuck is a Singularitarian

What to Look For