To quote from Rucker's post:

"This is because there are no shortcuts for nature’s computations. Due to a property of the natural world that I call the “principle of natural unpredictability,” fully simulating a bunch of particles for a certain period of time requires a system using about the same number of particles for about the same length of time. Naturally occurring systems don’t allow for drastic shortcuts."

Rucker's argument is fair enough as far as it goes, but the whole point of the statistical mechanics developed by Gibbs, Maxwell, and Boltzmann is that once a system contains enough particles you can make accurate statistical statements about it without tracking each particle individually.

So we have the gas laws, the laws of thermodynamics, and so on.
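To make that concrete, here's a toy sketch (my example, not Rucker's or anything from the original post): the ideal gas law lets you compute the pressure of a mole of gas, roughly 6 × 10^23 particles, from three macroscopic numbers, with no per-particle simulation anywhere in sight. The specific values (STP conditions, molar volume) are just standard textbook figures.

```python
# Ideal gas law PV = nRT: a macroscopic prediction about ~6e23 particles
# computed from three bulk quantities, with no per-particle bookkeeping.
R = 8.314    # gas constant, J/(mol K)
n = 1.0      # amount of gas in moles (~6.022e23 molecules)
T = 273.15   # temperature in K (0 degrees C)
V = 0.0224   # volume in m^3 (22.4 L, the molar volume at STP)

P = n * R * T / V  # pressure in pascals
print(P)           # about 1.01e5 Pa, i.e. roughly 1 atmosphere
```

That's the kind of "shortcut" nature does allow: the statistical behaviour of the ensemble is far cheaper to predict than the trajectory of any one molecule.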

Another point worth making is that current developments in spintronics (computations using the "spin" of electrons) offer a layer of computation beneath that of atomic matter.

I concede that at some point "fudging" will have to take place, but as I pointed out before: statistical mechanics isn't really fudging. Diffusion can be accurately modelled without having to model every single damn particle.

Anyway my gut feeling is that if something like a singularity happens it will be much weirder than simply grinding up the Earth into nanomachines then running a simulated Earth on the nanomachines.

I mean c'mon, if you're a superhuman intelligence what's the first thing you're going to do? Create the perfect lay? Work out the formula for the perfect cup of tea (which, according to Douglas Adams, is a much harder computational problem than almost anything else...)?
