Highest Rated Comments


marvin · 14 karma

Deep gravity well. You need to bring a lot of fuel, landing systems and engines dimensioned for takeoff, which makes it too costly for a startup company ;)

At least in the near term, if we're talking about mining. It sounds like a good idea if humans ever have an ambition to return to the moon.

marvin · 11 karma

Hi, Luke. I'm a huge fan of yours and the other SIAI researchers' work. Either you're doing some of the most important work in the history of humanity (formalizing morality and friendliness in a form that would eventually be machine-readable, to make strong AI that benefits humanity), or, in the worst case, you're just doing philosophical thinking that won't cause any problems. Either way, I was sure philosophy had pretty much no practical applications before I saw your work.

Anyway, my question is related to funding. Is SIAI well funded at the moment? Can you keep up your research and outreach to other institutions? Do you have any ambitions to grow? Do you see the science of moral philosophy moving in the right direction? It seems like SIAI asks more questions than it provides answers, and it would be reassuring to start seeing some preliminary answers.

Once again, thanks for being the only institution that thinks about these things. Worst case, you're wasting a bit of time dreaming about important topics, but in my estimation you might prevent the earth from being turned into paperclips by a runaway superhuman artificial intelligence. I really wish you all the best.

[Edit: To anyone curious about these questions, have a read at http://singularity.org/research/. It's really interesting stuff.]

marvin · 11 karma

It's a pretty awesome system. You put the empathic people in key positions, using their empathy and soft skills to gain trust and information by treating the other side as humans: understanding their humanity, meeting them on a level playing field, and doing the really tough, honest diplomatic work that's required.

Then you pay them very well and thank them for doing such a kick-ass job. Finally, they hand off the information they collected to a guy who launches a Hellfire missile to blow them and the rest of their wedding to smithereens.

That's what's so great about military intelligence. You get a team with capabilities that are better than the sum of the skills of the individual members!

marvin · 6 karma

I've got another question, actually. When (or if) it becomes possible to create strong/general artificial intelligence, such a machine will provide enormous economic benefits to any company that uses it. How likely do you believe it is that organizations with deep computing expertise (e.g. Google) will deliberately create superhuman AI before it is possible to make such intelligence safe for humanity?

This seems like a practical/economic question that's worth pondering. These organizations might have the economic muscle to pull off a project like this long before it becomes anywhere near commonplace, and there will be strong incentives to do it. Are you thinking about this, and what do you think can be done about it?

marvin · 3 karma

It's not about conception, it's about the fetus. We have no idea what would happen if a pregnancy were carried in zero gravity — for all we know, a full-term pregnancy in zero gravity could lead to severe birth defects or even the death of the mother. So it's a safety-first thing.