
[deleted]


It's just one more "oh my god the world is going to be scary" ghost story. The human race has gone through a number of really, really deeply changing technological hurdles over the past few thousand years while still remaining fundamentally human (and in fact I think more fundamentally human), and I don't think this one is going to be any different.

Because when you get down to it, there are two ways an AI can be intelligent. Either it participates in the human world - in which case we are inherently part of its future - or it doesn't - in which case we are irrelevant to it, and it is irrelevant to us, in essentially the same way bacteria are. Sure, bacteria can kill us and the world is more risky with them in it. And sure, looked at from a certain angle, we evolved "beyond" bacteria and thus took over the world from them.

But when you look at reality, first: we're made of bacteria. And even if you don't consider eukaryotes just a good way for a few bacteria to survive better, there are actual bacteria that live in us - to the point that more of the cells making up your body are bacterial than are human cells with human DNA - and you would die without them.

So even though evolution has gone "beyond" bacteria - it's gone beyond them in a way that in no way makes their lives worse, because it really is bacteria all the way down. So far from being surpassed, actually the bacteria have conquered everything.

Human culture is, in a sense, software running on that substrate. The things that make us human are our language, our ideas - and those are at an entirely different level from the bacterial. They coexist as separate continua, really.

So let's assume that there arises a higher-level self-aware mind (and let's further assume that hasn't happened yet) - at worst it's going to evolve "beyond" us in the same way we are "beyond" the bacteria. I don't see that as precluding us continuing in our mundane human existence in a way that the vast majority of us will never notice.

So to be perfectly clear - I personally believe entirely that something like the Singularity is in our future. What I think is crazy is the idea that we will be superseded. These anti-tech terrorists think it's going to be Terminator, when actually it's going to be something like a more ubiquitous Internet. Nobody's afraid of the Internet - the terrorists aren't, even - even though it's highly probable that the Internet itself is what will evolve into the next level of life.

I just don't see how anybody can look at the changes that are coming down the pike and think, "Oh my God, the future is scary" - it's just going to be better. I mean, shit, I grew up in the middle of nowhere with interlibrary loan the best way for me to find out the really hard stuff. The Internet is like crack to me. I now do programming and technical translation for customers all over the world, and I can move whenever I want. That lifestyle was impossible twenty years ago. Impossible. No agency would track me down by mail if I had to keep leaving forwarding addresses, but now? I go to Hungary for the summer, we lived in Puerto Rico for two years, now I bought a foreclosure close to home (that I found on an Internet search) and we've been here a couple of years - and next year it's back to Europe. My customers might notice I'm getting up six hours earlier, but otherwise it's all the same to them.

As technology gets smarter, I will continue to build it into my life, and as a result, my life will continue to be closer to what it ought to be. I just don't see the downside to that. And as it gets really smart, it's going to be inside my brain anyway, and - poof! Exponential curve. I expect to be part of it. How could it be otherwise?

So, no. I think the only reason to be afraid of the future is if you want to be afraid of something. And I consider that truly crazy.


Keep in mind, humans had absolutely no compunction about inventing antibiotics to selectively exterminate entire species of bacteria that happened to inconvenience us. I'd hate to have something so powerful with the same kind of power over us. Technology enhances our lives now because we have agency over it, not the other way around.


That's an interesting angle, though I'd personally rate the higher risk being humans who do have agency over technology. Technology is a human ability multiplier, which is a problem if the ability includes destruction. The 20th century brought nuclear weapons, for example, which have been kept remarkably out of use through, I think, a combination of luck and extreme suppression. It just so happens that some of the engineering challenges in making a nuclear weapon are particularly hard to DIY, even when you understand the science (fuel enrichment is apparently the most significant bottleneck, with a secondary bottleneck at actual weapon construction). And then governments make extensive efforts to keep any sort of nuclear DIY scene from developing, to make sure that even harmless science-fair type versions of practical nuclear knowledge don't arise "in the wild".

Will that all also be true for 21st-century destructive technologies? If any technology appears where one person could kill 20 million people with it, either it will have to be very hard to DIY, we'll have to be very good at suppressing it, or likely, both simultaneously.


I think some of what you're saying is a bit too broad, but it's more or less the same point I'm making. Technology drives human progress - some could argue it -is- human progress - and the idea that there is anything to fear from computers or "nanobots" is about as ridiculous as believing "The Matrix" is based on a true story.

Luddites have existed in our society (by that I mean First world) for a long time. The ITS is just another in a long list of people conspiratorially whispering about things William Gibson wrote about in fiction 20 years ago.


> things William Gibson wrote about in fiction 20 years ago

Greg Bear, 26 years ago (Blood Music): http://en.wikipedia.org/wiki/Grey_goo


Which is funny, because the goo is actually green.

I never understood the fear of nanotech, because I see no reason that tiny robots would be any more scary or harmful than all the bacteria out there already. And the bacteria can already both self-replicate and kill us.


If bacteria were as powerful as nanotech, there would not be any need for nanotech. Nanotech could potentially be to bacteria what the space shuttle is to a horse.


Nanotech can neither replicate nor kill us yet. It's not magic, and evolution has a huge head start on dealing with issues like powering it and self-replication. Devices at that scale are, generally, quite frail to things like stray cosmic rays or other background radiation causing them to break down.

And the space shuttles are dead; there will be no more of them. Horses, meanwhile, have survived for millions of years, so you're making the opposite point by mistake.


thanks for taking the time to write this. it was quite an epic read, while listening to radiohead's 'everything in its right place'.

That deleted comment was mine. My intention was to trigger some argumentation, but then @feral wrote something that sounded like he had a much more formed opinion than I did, so my comment felt unneeded. anyway, deleting it was such a bad call, sorry about that. lesson learned.

anyway, for the record and the curious, here it is (snatched it back from my cache):

--

But to imagine that that self-aware technology will be anything but symbiotic with us is truly crazy.

Why is it absurd? I'm not arguing one way or the other, I'm just interested. I know very little about Ray Kurzweil's singularity theory (or AI research for that matter), the little I know doesn't sound that implausible. But it does sound quite dark, so if there's a reason why it's truly crazy then I would joyously hear it.

Edit: to be clear here, crazy or not crazy, I do not endorse violence.

--



