Bringing the Search for Extra-Terrestrial Intelligence to Your Own Home!

This is about more than intelligent life in space; this is about intelligent life here on Earth. Our greatest scientists performed all of their mathematics by hand before Texas Instruments produced the first hand-held calculator in 1967, and even that could only perform the four basic operations: add, subtract, multiply, and divide. It is this need for ever more powerful computational tools that led directly to the development of the ‘computer’. Today there are billions of computers around the world, and that number becomes even more staggering if you include smartphones, tablets, and other devices in the count. Eventually we started constructing ‘super-computers’, but here is the secret to these massive machines: they are really just thousands upon thousands of ordinary processors working together under an organizational pattern called ‘distributed computing’.

“What is distributed computing?” you might ask. Well, first consider a single, ordinary computer. It contains a processor, primary and secondary memory, a display and interface, and enough hardware to power them. Eventually, we started having computers work together to solve problems that were too time-consuming for individual processors. Today, the most powerful super-computer, Japan’s K computer, links 88,128 processors together to achieve 10.51 petaFLOPS (a petaFLOPS is a quadrillion floating-point operations per second, the standard measure of computational power). The machine isn’t even finished yet, with a slated completion date of June 2012, and once complete it is expected to cost ten million US dollars a year to operate. Such an expensive enterprise has made super-computing a large-scale project reserved for research institutions. Until recently, that is.
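To make the pattern concrete, here is a minimal sketch of the core idea: a large job is chopped into independent chunks, each chunk is handed to a separate worker, and the partial results are combined at the end. The workers here are local processes rather than networked machines, and the prime-counting workload and function names are invented purely for illustration.

```python
# A toy illustration of the distributed-computing pattern:
# split a large job into independent work units, farm them out to
# workers, then combine the partial results. The "workers" here are
# local processes, but the same shape applies to machines on a network.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) -- a deliberately slow stand-in for real work."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    limit, chunk = 200_000, 25_000
    # Split the range [0, limit) into independent work units.
    work_units = [(i, i + chunk) for i in range(0, limit, chunk)]
    with Pool() as pool:                      # one worker per CPU core by default
        partials = pool.map(count_primes, work_units)
    print("primes below", limit, "=", sum(partials))
```

The only requirement is that the chunks can be computed independently; that is what lets the same job be spread across four local cores or four hundred thousand volunteered machines.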

Just as the computer was the natural successor to the calculator, the internet is the next step in the evolution of tech-assisted living. In order to design architectures that would let multiple computers work together efficiently and reliably, computer scientists created a thought experiment called the ‘Byzantine Generals’ Problem’. The name harks back to the Byzantine Empire, the name given to the eastern half of the Roman Empire that survived after the western half collapsed in late antiquity. The defense of the realm was entrusted to several generals, each with his own resources. When needed, they were supposed to communicate their plans to one another and work together to crush large invasions. The question was: how could the generals, sending messengers back and forth across enemy-held territory, agree on a common plan of attack when messengers might be captured or killed, messages might be distorted, and some of the generals might even be traitors deliberately sending conflicting orders?
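To see why this matters for computers: if a result is replicated across several machines and a minority of them may be faulty or outright lying, the honest machines can still agree by exchanging what they each received and taking the majority. The little example below is my own illustration of that majority-vote intuition, not the actual agreement protocol used by any system mentioned later.

```python
# A toy illustration of tolerating "traitorous" nodes by majority vote.
# With 3f + 1 replicas, up to f faulty nodes cannot change the outcome.
from collections import Counter

def agreed_value(reported_values):
    """Return the value reported by a strict majority of replicas."""
    value, votes = Counter(reported_values).most_common(1)[0]
    if votes <= len(reported_values) // 2:
        raise RuntimeError("no majority -- too many faulty replicas")
    return value

# Four replicas (f = 1): three loyal generals relay the true order,
# one traitor relays something else. The loyal majority still wins.
reports = ["attack at dawn", "attack at dawn", "retreat", "attack at dawn"]
print(agreed_value(reports))   # -> attack at dawn
```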

In 1999, the problem was tackled using what became known as Practical Byzantine Fault Tolerance, which started a cascade of further research. As a result, distributed computing is now used to solve problems far too large for any single machine: Berkeley uses it to sift through radio-telescope data for the SETI project, Stanford uses it to calculate possible protein structures, Oxford uses it to run climate and weather models, and CERN uses it to improve the Large Hadron Collider. The project organized by Stanford University, at last inspection, involved 426,787 processors and benchmarked at 8.588 petaFLOPS, which would rank it as the second most powerful super-computer in the world, with more than three times the power of the machine actually holding second place, China’s 2.566-petaFLOPS Tianhe-1A. These projects grow on a volunteer basis through what is known as ‘volunteer computing’: computer owners install free client software, such as BOINC (for Windows, Mac, Linux, and BSD) or Apple’s Xgrid, to dedicate their machines’ excess processing power to projects of their choice.
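Mechanically, these projects are the same chunk-and-combine pattern sketched earlier, stretched across the internet: a project server hands out small ‘work units’, volunteers’ machines crunch them during idle time, and the finished results are sent back to be cross-checked and merged. The sketch below is a deliberately simplified simulation of that loop in a single process; it is not BOINC’s actual protocol or API.

```python
# A deliberately simplified simulation of volunteer computing:
# a "server" backlog of work units and a "client" loop that fetches,
# processes, and reports them. Real systems such as BOINC add networking,
# scheduling, checkpointing, and redundant result validation.
import queue
import random

# --- server side: a backlog of work units, each holding a chunk of raw samples
work_queue = queue.Queue()
for unit_id in range(5):
    work_queue.put({"id": unit_id,
                    "samples": [random.random() for _ in range(1000)]})
results = {}

# --- client side: the loop a volunteer's machine would run when idle --------
def process(unit):
    """Stand-in for the real science: here, just average the samples."""
    return sum(unit["samples"]) / len(unit["samples"])

while not work_queue.empty():
    unit = work_queue.get()               # "download" the next work unit
    results[unit["id"]] = process(unit)   # crunch it using spare CPU time
    # in a real project the result would be uploaded and cross-checked

print(results)
```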

Some uses of distributed computing have been more controversial. Consider Bitcoin, a digital currency whose ledger is maintained simultaneously by every user of the software rather than by any central authority. The protocol itself controls the money supply: new coins are created as rewards for the computers that verify transactions, the reward shrinks on a fixed schedule, and the total supply is capped at roughly 21 million coins, keeping inflation predictable. Since the currency is digital it can be divided down to eight decimal places, whereas conventional money usually stops at two, which we call ‘cents’. Since its creation in 2009 it has been adopted by a variety of institutions, both legal and illegal, as a form of payment and donation. The idea has grown into a reality, and some in the American government are calling for its termination on the grounds that it facilitates money laundering. That may be, but it offers advantages the US dollar could only dream of: it is inherently international, available around the clock, immune to manipulation by “quantitative easing”, pseudonymous by default, and able to bypass banks and their fees entirely.
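The ‘mining’ that creates new coins is, at heart, a brute-force search: computers race to find a number (a ‘nonce’) that, combined with the pending transactions, hashes to a value below a target set by the protocol. The sketch below shows only that core idea using SHA-256; real Bitcoin mining double-hashes a structured block header against a dynamically adjusted target, none of which appears here.

```python
# A stripped-down sketch of proof-of-work "mining": search for a nonce
# such that hash(block data + nonce) falls below a difficulty target.
# Real Bitcoin mining double-hashes a structured block header; only the
# brute-force search is shown here.
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Return a nonce whose hash has at least `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)      # hashes below this value "win"
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

block = "Alice pays Bob 0.00000001 BTC"        # pretend transaction data
nonce = mine(block, difficulty_bits=18)        # tiny difficulty so it finishes quickly
print("found nonce:", nonce)
print("hash:", hashlib.sha256(f"{block}{nonce}".encode()).hexdigest())
```

Because the only way to win is to try nonce after nonce, whoever contributes more computing power finds more blocks, which is exactly why mining turned into a worldwide distributed-computing race.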

Another popular use is the Tor network and its browser; the name was originally an acronym for The Onion Router. First developed at the US Naval Research Laboratory to keep the identities of covert operatives safe when they accessed government systems from around the world, it has since grown into a fully fledged privacy movement. The idea is simple: a request made through the Tor browser is wrapped in several layers of encryption and relayed through a chain of volunteer computers, each of which peels off one layer and learns only the previous and next hop. The request finally exits to its destination from a computer that has no idea who originated it, and the response travels back along the same path, so the person retrieving the information appears completely unrelated to the person who requested it, resulting in anonymity. When Egypt’s government cut the country off from the internet in 2011, proxies like these were how people inside the nation were able to communicate their situation to those outside. Furthermore, software exists that lets servers accept connections only through such proxies, masking their IP addresses and thus their physical locations. This has allowed the creation of black markets on the internet accessible only through onion routing.
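The ‘onion’ refers to that layered encryption: the sender wraps the message once for each relay, and each relay can strip off only its own layer, learning nothing beyond where to forward the rest. The sketch below shows just the wrap-and-peel structure using the third-party cryptography package’s Fernet cipher; real Tor negotiates keys per circuit, uses different primitives, and relies on directory servers, none of which is modeled here.

```python
# A toy illustration of onion routing's layered encryption.
# Requires the third-party "cryptography" package (pip install cryptography).
# Only the wrap-then-peel structure is shown; real Tor works differently.
from cryptography.fernet import Fernet

# Three relays, each holding its own symmetric key.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def wrap(message: bytes, keys) -> bytes:
    """Sender: encrypt for the last relay first, so the first relay's layer is outermost."""
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def peel(onion: bytes, key: bytes) -> bytes:
    """Each relay removes exactly one layer and forwards whatever remains."""
    return Fernet(key).decrypt(onion)

onion = wrap(b"GET http://example.org/", relay_keys)
for key in relay_keys:            # the packet hops from relay to relay
    onion = peel(onion, key)
print(onion)                      # only the exit relay sees the plain request
```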

As these tools spread through the public domain, they are triggering legal questions around the world. The internet itself relies on a form of distributed computing: the Domain Name System, a distributed database that maps human-readable names to the IP addresses of the computers behind them. Governments have attempted to shut down parts of the internet by forcing internet service providers to make their name servers ignore the existence of entire sections of this distributed network. Aside from being largely ineffective, these techniques would have no effect at all on private distributed computing projects. One would think this would be a boon to the First Amendment guarantees of freedom of the press and free association, but it has instead prompted congressional investigations into the legality of such networks: should organizations outside the bounds of the law be allowed?
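Name-level blocking is shallow because a blocked name can usually still be resolved by asking a resolver the ISP does not control, or bypassed entirely by connecting to the IP address directly. The snippet below illustrates the point with Python’s standard library plus the third-party dnspython package; Google’s public resolver at 8.8.8.8 is just one well-known alternative.

```python
# Why DNS-level blocking is shallow: any resolver willing to answer can
# map the name to an address, not just the ISP's.
# Requires the third-party "dnspython" package (pip install dnspython).
import socket
import dns.resolver

# 1) The normal path: ask whatever resolver the operating system is configured
#    to use (typically the ISP's, which is where blocking gets applied).
print("system resolver says:", socket.gethostbyname("example.org"))

# 2) The workaround: ask a different resolver directly (here, Google's 8.8.8.8).
resolver = dns.resolver.Resolver()
resolver.nameservers = ["8.8.8.8"]
answer = resolver.resolve("example.org", "A")
print("public resolver says:", [record.to_text() for record in answer])
```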

The most obvious answer is: absolutely! The technology is independent of the intent of its users, just like handguns or vehicles. Both are leading causes of death in the modern world, yet we preserve the right to own both because their benefits strongly outweigh their costs. Consider the implications of distributed computing going global on a grassroots level. We are approaching the limit of the roughly 4.3 billion IPv4 addresses, which necessitated the creation of the IPv6 protocol; that means billions of devices are already connected to the internet. It took only about 430,000 of them, a hundredth of a percent, to create a poor man’s super-computer! How many critical computational problems could be solved in the first few moments of such a global computer’s existence? Perhaps we might finally find that needle in the radio-signal haystack that Carl Sagan imagined was out there when SETI was started. If you are interested in becoming part of the future of the internet, in becoming part of something bigger than yourself, then I strongly recommend you check out Berkeley’s project, SETI@home (http://setiathome.berkeley.edu/).