Federalism: The Original Cloud Computing Solution



In the late 1970s and early 1980s, a quiet war was brewing in the world of technology.

Computers and the functions they managed were becoming part of the mainstream American consciousness, with nearly every industrial sector incorporating some form of automation or rapid calculation to make its products and services faster, more efficient, and therefore more affordable for consumers. Not since Henry Ford’s introduction of the automotive assembly line had there been such a seismic shift in how companies and governments conducted business.

But the scientists who had led the computing revolution just a few years earlier were growing nervous as these systems took on an increasingly monolithic character. Computing was becoming centralized, with data stored on massive, floor-spanning mainframes while users were given individualized access through dumb terminals consisting of little more than a screen, a keyboard, and whatever meager processing hardware was required to retrieve data from some labyrinthine storage facility.

Meanwhile, futurists such as Dr. Alan Kay and Sir Arthur C. Clarke believed the future would belong to decentralized systems built around personal, self-contained home computers and a worldwide, peer-to-peer network.

Then, during the third quarter of Super Bowl XVIII, a commercial changed everything:

Apple Computer teamed with acclaimed director Ridley Scott, the filmmaker behind “Alien” and “Blade Runner,” to promote its new creation, the Macintosh. A true forerunner of the modern desktop computer, it offered built-in storage, an easy-to-use graphical user interface driven by a mouse, and a footprint small enough to occupy just a corner of a user’s desk. And most importantly, it was available for purchase by any individual who could afford one.

The resulting commercial, dubbed simply “1984” and inspired by George Orwell’s dystopian novel of the same name, portrayed then-current computer architecture as a monolithic, Big Brother-esque dictatorship from which Apple sought to rescue the world.

The ad was hailed as a masterpiece of marketing and is regarded by many as the greatest television commercial ever made. The sales numbers seemed to bear that out: the Macintosh’s first-year sales outpaced those of its IBM-manufactured rival.

Fast-forward to 2017, and Kay and Clarke’s dream of the personal computer seems to have come true.

As I look around my office now, I count no fewer than three laptops, two tablets, two internet-capable video game consoles, a high-definition TV with two app-powered entertainment boxes attached, and a smartphone that records 4K video and possesses more computing power than the systems that put Neil Armstrong and Buzz Aldrin on the surface of the moon. And yet, old habits die hard.

In September, credit-reporting agency Equifax announced a breach of its systems that led to the leak of 145.5 million user records, nearly all of which contained sensitive information such as Social Security numbers, names, addresses, telephone numbers, and credit histories.

Anyone in possession of this data would have everything needed to steal someone's identity and take out loans in their name, acquire credit cards, and incur massive debt. In short, it is potentially the most devastating data breach in U.S. history, with nearly seven times as many victims as the Office of Personnel Management breach in 2014 and 2015. Only the dual hacks of Yahoo’s systems constitute a larger number of known victims, and the data loss in that case was comparatively minimal.

Last year, the newly elected Tory government in the U.K. announced plans to consolidate a large share of government departments’ data on every British citizen — tax records, medical records, government benefit rolls, criminal records — into a single, centralized, shared database. The plan was heavily criticized by civil libertarians worldwide as not only a breach of privacy, but also an open invitation for malicious hackers to steal massive amounts of exploitable data in a single breach.

In May, a cobbled-together worm used a leaked National Security Agency exploit to deploy a devastating piece of ransomware known as WannaCry, which quickly spread worldwide and crippled many computer systems, including patient record libraries held by branches of the U.K.’s National Health Service — some of the very databases the U.K. government wanted to integrate, top down, into every level of a British citizen’s dealings with government services.

The only real difference between the Equifax breach and the WannaCry outbreak within the U.K. government is that the former affected — ostensibly — the private sector. But the credit system in this country is still based around a centrally planned, government-run system called Social Security. And because of the bloated leviathan the federal government has grown into ever since Roosevelt’s New Deal, it is nearly impossible for a citizen to change their sensitive information in the event of a breach.

While Equifax has offered a year of free credit monitoring (not repair, mind you — MONITORING) to affected users, all someone with malicious intent would have to do is wait 15 months or so for the monitoring to expire and then begin wreaking havoc.

Consider for a moment how many services — from your utility providers to your cable company to your cellular provider — request the last four digits of your Social Security number to make changes to your account. This breach does more than put credit at risk. It demonstrates how keeping too much identifying information about a person in a single location can destroy every aspect of that person’s life.

It’s a staggering number: 145.5 million people. Nearly half the population of the U.S. is impacted by this breach.

And the only reason this intrusion has grown to such a massive scale so quickly is that our banking system is built on a government platform that can only operate with anything resembling effectiveness when it is the central arbiter of our lives.

The only way this nation is going to be able to fight situations like this in the future is to decentralize not only our personal data and our economy, but our government itself. And if history is an indicator, it could lead to better things than we could possibly imagine.
