Entries Tagged as 'General'

Started a new company

It’s named Base Zero, and it will be very cool.

 

BuddyNS

BuddyNS is a free secondary name service. Yes, free. I started using it a couple of months ago and had no issues.

Good for your random project domains where you can’t justify spending on DNS fees.

Sir Martin Rees on the Future of the Cosmos

Martin Rees talks about the future of the cosmos and our responsibility to prevent existential risks at a Long Now Foundation seminar.

Nice to see H+ memes coming from the president of the Royal Society.

H/T Tom McCabe @ Kurzweil AI

The computation market becomes more liquid

The Register tells us that Amazon will auction off its excess capacity.  We’re a couple of steps away from computation becoming a liquid commodity.  The next step is for a couple of additional providers to arise (Google?).  The step after that is for the APIs to be brought into sync by the providers or by a third-party intermediary.

Peer to Peer Development

GitTorrent (described on Advogato) is a truly distributed version control system, based on Git and BitTorrent. It seems to hold the promise of:

  • PGP public keys for authenticating changes
  • No central web site for a project
  • Easy forking of projects
  • Package and OS distributions without a central download location
  • A distributed mechanism for security and feature updates

The significance of all this is that it:

  • levels the playing field for individual developers and small groups
  • routes around censorship more effectively
  • allows end users to choose different views of the repository based on which developers they trust

H/T: Slashdot

I would suggest a further improvement: multiple signatures on sources and on binaries. This would greatly reduce the chance of Trojan binaries being installed on hundreds of thousands of computers the next time Canonical/Debian/Red Hat distribution points are subverted by a black-hat hacker. Binary signatures would require a repeatable transformation from source to binary, achieved by fully specifying the compile tools and compilation environment and using fixed values for timestamps.
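A minimal sketch of the idea, assuming reproducible builds: each independent builder compiles the same source in the same fully specified environment, signs the hash of the resulting binary, and a client installs only when enough trusted builders agree on the same hash. The names here (`REQUIRED_SIGNATURES`, `enough_signatures`) are illustrative, not from any real package manager, and actual cryptographic verification of each signature (e.g. via PGP) is elided.

```python
import hashlib

# How many independent builders must have signed the same binary hash
# before we trust it.  Purely illustrative threshold.
REQUIRED_SIGNATURES = 3

def binary_digest(binary: bytes) -> str:
    """Hash of the binary; with reproducible builds, every honest builder
    compiling the same source arrives at this same digest."""
    return hashlib.sha256(binary).hexdigest()

def enough_signatures(binary: bytes,
                      signed_digests: dict,
                      trusted_builders: set) -> bool:
    """signed_digests maps builder id -> the digest that builder signed
    (verifying the PGP signature itself is elided in this sketch)."""
    digest = binary_digest(binary)
    agreeing = {builder for builder, signed in signed_digests.items()
                if builder in trusted_builders and signed == digest}
    return len(agreeing) >= REQUIRED_SIGNATURES
```

A trojaned binary produced by one subverted distribution point would hash differently from the independently built copies, so it could never gather the required number of matching signatures.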

Many-worlds Immortality and the Simulation Argument

An alternative to the simulation argument:

Nick Bostrom’s Simulation Argument holds that at least one of the following must be true:

  • the human species is very likely to go extinct before reaching a “posthuman” stage
  • any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history
  • or we are almost certainly living in a computer simulation

However, I see other possibilities. Assumptions:

  • The strong many-worlds theory is correct (i.e. all consistent mathematical systems exist as universes, a.k.a. “everything exists”)
  • The many-worlds immortality theory is correct (i.e. for every conscious state there is at least one smooth continuation of that state in the many-worlds)

Given these assumptions, it doesn’t matter if we are in a simulation because our conscious state exists in many simulations and many non-simulated worlds that look identical to us (but are different in imperceptible ways). Even if all the simulations stopped, there would still be a continuation of our conscious state in a non-simulated world consistent with our observations to date.

Further, it seems that there are more non-simulated worlds than simulated ones. This is because there are many ways for a mathematical model to exist that cannot be formulated in a finite way, and such a model therefore cannot be simulated by an intelligent entity. It might even be that simulatable worlds are of measure zero in the many-worlds.

Further out ideas:

A fascinating related idea is the Egan Jump, as described in the book Permutation City. The idea is to jump to another world in the many-worlds by simulating the genesis of a new universe. In this universe you code yourself into the initial conditions, and design the rules so that you end up as an upload in the substrate of the new universe. Because that universe will continue as its own mathematical model, your conscious state will continue in that universe, branching off from your original self.

Yet another, more distantly related idea is that the peculiarities of our universe (quantum physics, large amounts of empty space) are in a sense an error-correcting mechanism. Because any perturbation of a world is also a world, the result is quite chaotic and inhospitable to meaningful life. The structure we see around us, with its large aggregates, “averages out” the chaos. This leads to the stable environment required for conscious observers to arise.

Amazon tries to patent S3

Slashdot reports that Amazon is trying to patent S3. This means that I will refrain from using it for any project, and stop using it for my existing business, FillZ.

Mouse brain simulated at 1/10 of real-time

Update: the BlueGene/L instance used here is only 1/32 the size of the one deployed at LLNL, so we are still within the high bound after all. On the other hand, it remains to be seen how accurate the model is compared to functional neurons.


Dharmendra S. Modha posts an article about a recent result presented at CoSyNe 2007.

We deployed the simulator on a 4096-processor BlueGene/L supercomputer with 256 MB per CPU. We were able to represent 8,000,000 neurons (80% excitatory) and 6,300 synapses per neuron in the 1 TB main memory of the system. Using a synthetic pattern of neuronal interconnections, at a 1 ms resolution and an average firing rate of 1 Hz, we were able to run 1s of model time in 10s of real time!

This is excellent news, since it will now be possible to figure out what biological modeling aspects are important to functionality.

Since the human brain has 100 billion neurons, this represents roughly 1/10,000 of a human brain. The computer was a $100 million BlueGene/L. So an improvement of 10,000,000 (10,000 for scale, 10 for speed, and 100 for cost) is required in order to model a human brain for $1M in real time.

However, the BlueGene/L is two years old, and it is about 20 times less efficient than commodity hardware (based on a quoted 360 teraflops). So the real improvement required is only around 100,000.

Based on this data, the human brain requires 10 Exa CPS, one order of magnitude above the high estimate used in my calculator. A human equivalent for $1M would be available around the year 2023.

Hardware specifically suited to this application may bring this down to 1 Exa CPS and pull the date back to 2020.
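The scaling arithmetic above can be restated as a few lines of Python. All figures come from the post itself; the back-of-the-envelope “improvement” factor simply multiplies scale, speed, and cost together.

```python
neurons_model = 8_000_000            # neurons simulated on the BlueGene/L
neurons_human = 100_000_000_000      # ~100 billion neurons in a human brain
slowdown      = 10                   # 1 s of model time took 10 s of real time
cost_ratio    = 100_000_000 / 1_000_000  # $100M machine vs. a $1M target

# Combined factor needed to model a human brain in real time for $1M
improvement = (neurons_human / neurons_model) * slowdown * cost_ratio
print(f"required improvement: {improvement:,.0f}")  # 12,500,000 -- i.e. ~10,000,000
```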

Various Autism pointers

– Discover has an article about environmental stress (diet, toxins), inflammation, and autism.

– A University of Nottingham study described on the Biosingularity blog shows that autistic children are actually very good at inferring mental states by looking at eyes. This study uses moving images and overturns previous studies, which used static images. I personally find expressions quite readable, yet I used to block them out because the associated emotional charge is anxiety-provoking.

– IEET links to an article by Michael L. Ganz that attempts to quantify the cost of autism to society. However, this makes me wonder if the cost is overwhelmingly offset by the contribution of people with Asperger’s syndrome to science and technology. Asperger’s is thought to be a form of high-functioning autism.

Compute power estimate for future (and past) years

Since I end up figuring these out a few times a year, I went ahead and created a little calculator.

It has two fields for years, and calculates the compute powers and their ratio. It also compares the compute powers to the power required to compete with (or simulate) a human brain.
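The core of such a calculator can be sketched in a few lines, assuming compute power per dollar doubles about once a year (the doubling time used in the actual calculator may differ):

```python
DOUBLING_TIME_YEARS = 1.0   # assumed price-performance doubling time

def compute_ratio(year_a: int, year_b: int) -> float:
    """Factor by which compute power per dollar grows between two years,
    under a simple exponential (Moore's-law-style) model."""
    return 2.0 ** ((year_b - year_a) / DOUBLING_TIME_YEARS)

# Example: growth from 2007 to 2023 under this assumption
print(f"{compute_ratio(2007, 2023):,.0f}")  # 65,536
```

The same function also gives the ratio for past years (it returns a fraction below 1 when `year_b` precedes `year_a`).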

Nano and lightyears in context

Check out this interactive “powers-of-ten” flash presentation from Nikon. Good for some perspective…

Make sure your browser is full screen or you may miss the controls at the bottom.

Hat tip to Nanodot.

Random collection of reputation system articles

Here is a collection of links to articles about reputation systems, in no particular order:

– Random Reputation Ramblings (and quite a few other reputation posts on the same blog)

– Reputation systems and vendor relationship management (VRM – the opposite of CRM)

– Group reputation and its application to loans

– Jim Downing mentions a study on feedback gaming on eBay on the Smart Mobs blog

– On gaming reputation systems, in Privacy Digest

Hybrid Wetware

Wired reports on machines controlled by natural neural networks, built at the Georgia Institute of Technology.

Japan’s Petaflop Supercomputer

Japan builds MDGRAPE-3, a petaflop supercomputer, for a measly $9 million.

The hardware is specialized, but may possibly be suitable for neural simulation.

Amazon releases grid storage

Price is reasonable too!

Amazon S3 – Amazon Web Services


Google is god

http://money.cnn.com/2006/01/24/technology/dumbest_googlegod/index.htm

Timeline to Singularity

(this one is by my father, Vladimir)

Timeline for Singularity is Near (by R. Kurzweil)

2025: IPM creates the first machine to execute 500 trillion instructions/second, the throughput needed to simulate the brain.

2027: Mr. Bill Bates obtains from IPM the exclusive rights to produce a human-level software system. Bill buys an inexpensive brain simulation
and creates the first Humachine.

2028: With huge VC financing and a grant from POD, Bill hires tens of thousands of programmers, builds a campus of 400 buildings, and achieves
a de facto monopoly in the Humachine software business.

2029: A first division of our Humachines destroys the enemy at Pat Pot. Enemy retaliates.

2030: One million Humachines suffer sudden death. A machinopsy shows that the last message in the system was: “irreversible error”.

2031: Improved software with fewer bugs is introduced. The production of Humachines takes up most of the Earth’s
available resources. Earth warms up considerably as a result of the excessive use of resources to produce Humachines.

2050: The world’s human population decreases considerably as a result of low birthrates and losses in the Humachine wars.
Twenty million Humachines are destroyed by a terrorist attack on their power sources. Many more are built.

2052: Earth’s temperature continues to increase. A group of humans led by Steve Gobs designs an interplanetary vessel
that defies the laws of physics and might be able to reach outer worlds.

2072: Humachines attack the remaining humans. Gobs’ group escapes with a number of space vessels.

2112: Earth is too hot for life. After 40 years of wandering through space, the last humans and their children reach the
New Outer World and establish a new civilization (where producing any machine that can do more than 100 computations/s is punishable by death).

The Singularity Was Here…

V.

I started a Blog.

Yup. I jumped on the bandwagon. The bandwagon has been circling the town for a while now, waiting for me to jump on it.