Human Scale Memory Timeline Calculator

I have previously mentioned my estimator for when human scale computation power will be available. I have since realized that the bottleneck might be memory rather than computation. I’ve created a similar estimator for memory.

Although we may achieve human-level compute power in 2014, memory capacity looks set to lag by another 6 years, assuming the low estimates. With the high estimates, compute power arrives in 2020 and memory capacity lags 5 years behind that, to 2025.

However, if Flash memory or a similar technology will do the trick, a factor-of-4 cost reduction would advance the timeline by about 4 years.
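The factor-of-4 figure follows from simple doubling arithmetic. As a sketch, assuming memory cost per bit halves roughly every two years (a Moore's-law-style rate I am supplying here, not a figure taken from the estimator itself):

```python
import math

# Assumption: memory cost per bit halves roughly every 2 years.
DOUBLING_PERIOD_YEARS = 2.0

def years_advanced(cost_reduction_factor):
    """Years a cost-driven timeline moves up for a one-time cost reduction."""
    doublings = math.log2(cost_reduction_factor)
    return doublings * DOUBLING_PERIOD_YEARS

print(years_advanced(4))  # 2 doublings * 2 years = 4.0 years
```

A factor of 4 is two doublings, so under that assumed rate the timeline moves up by about four years.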

Brian Wang’s things to watch for 2009

Brian Wang writes about technologies to watch for 2009, including Memristors, high speed networking, optical computing and quantum computing.

Peer to Peer Development

GitTorrent (described on Advogato) is a truly distributed version control system, based on Git and BitTorrent. It seems to hold the promise of:

  • Public-key (PGP) authentication of changes
  • No central web site for a project
  • Easy forking of projects
  • Package and OS distributions without a central download location
  • A distributed mechanism for security and feature updates

The significance of all this is that it:

  • levels the playing field for individual developers and small groups
  • routes around censorship more effectively
  • allows end users to choose different views of the repository based on which developers they trust

H/T: Slashdot

I would suggest a further improvement – multiple signatures on sources and on binaries. This would greatly reduce the chance of Trojan binaries being installed on hundreds of thousands of computers the next time Canonical/Debian/RedHat distribution points are subverted by a black-hat hacker. Binary signatures would require a repeatable transformation from source to binary – achieved by fully specifying the compile tools and compilation environment and using fixed values for timestamps.
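A toy illustration of why fixed timestamps matter – this is my own sketch, with a stand-in "compiler", not any real build system. If the build embeds the wall-clock time, two honest builds of identical source produce different binaries, so independent parties can never countersign the same digest; pin the timestamp and the output becomes byte-identical:

```python
import hashlib

SOURCE = b"int main(void) { return 0; }"

def build(source, timestamp):
    """Stand-in for a compiler whose output embeds a build timestamp."""
    return source + ("\n// built at %d" % timestamp).encode()

def digest(binary):
    """The value each independent distributor would sign."""
    return hashlib.sha256(binary).hexdigest()

# Builds at different wall-clock moments diverge...
print(digest(build(SOURCE, 1230768000)) == digest(build(SOURCE, 1230768001)))  # False

# ...but with a fully specified timestamp the output is byte-identical,
# so multiple signers can all verify and sign the same digest.
print(digest(build(SOURCE, 0)) == digest(build(SOURCE, 0)))  # True
```

With a reproducible digest in hand, "multiple signatures" reduces to requiring that some quorum of independent builders each rebuilt the source and signed the same hash before a binary is accepted for installation.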