Other than the frenzied anticipation for the coming breed of tablet PCs, the one topic that dominates the mindspace of the technorati these days is the world of e-readers. More specifically, a great debate is brewing; each of the e-readers and their associated online bookstores favors differing standards and file formats, and we may have another good ole fashioned format war on our hands. (Nothing gets a techie's blood pressure going more than watching competing technologies duke it out.) Format wars are to the tech world what elections are to politics, or what playoffs are to sports: a chance for competing candidates to go big or go home -- based on the preferences of the masses. The most cited example is the great Betamax vs. VHS war of the early '80s (in which the objectively better standard got trounced), but, in truth, battles over standards have been with us since the first wheel was chipped from stone. Switched took a gander through time and pulled together some of the greatest (and most interesting) format wars, most of which affect us today. Hit up the comments for battles that you think deserve inclusion in our hall of forgotten, or outmatched, tech.
1. Metric vs. U.S./English customary
There is no battle of standards that better exemplifies the sheer amount of stupidity and stubbornness humans can muster than the one between the metric system and English customary. The first version of the metric system was proposed by the French following their revolution in the 1790s. Eventually, the Gauls invited the international community to help refine and formalize it for use around the world. From the start, the U.S. was very supportive, officially allowing the use of the metric system in industry and business in 1866. In fact, at the so-called Treaty of the Meter in 1875, the U.S. was one of the first to agree with the idea of transitioning to a universal system of measurement, and, since then, all non-metric measurements in the U.S. have been defined in relation to the metric system.
Yet, it never took. Every few decades, another round of legislation and educational initiatives comes out; promises are made, papers are drafted and committees formed. (And, occasionally, a Mars orbiter is lost to a metric-English clash.) Inevitably, some nationalistic outcry arises, or studies come out showing that the cost of transitioning to metric would be too onerous (despite evidence to the contrary from almost every other country in the world). So, 200 years later, we still cling to our precious yards, miles, quarts and pounds -- one of just three countries on the planet, along with Burma and Liberia, still not using the metric system.
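Those metric-based definitions aren't just trivia: every U.S. customary unit is pinned to an exact metric value, which makes conversion a one-liner. A quick sketch in Python (the helper function and variable names are our own):

```python
# U.S. customary units are legally defined in exact metric terms
# (per the international yard and pound agreement of 1959).
YARD_IN_METERS = 0.9144         # 1 yard = 0.9144 m, exact by definition
POUND_IN_KG = 0.45359237        # 1 avoirdupois pound, exact by definition
MILE_IN_METERS = 1760 * YARD_IN_METERS   # 1 mile = 1,760 yards
INCH_IN_METERS = YARD_IN_METERS / 36     # 1 inch = 1/36 yard = 0.0254 m

def miles_to_km(miles):
    """Convert miles to kilometers via the exact metric definition."""
    return miles * MILE_IN_METERS / 1000

print(f"1 mile = {miles_to_km(1):.6f} km")
print(f"1 inch = {INCH_IN_METERS * 100:.2f} cm")
```

In other words, the yard itself is legally a fraction of a meter; we just refuse to say so out loud.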
2. AC vs. DC
Although Thomas Edison is justly remembered for his many formidable contributions to our modern world, his direct current (DC) electrical grid was his greatest failure and most absurd folly. Edison, at first, had great success with DC, which debuted in 1882 in New York City and quickly became the early standard of electricity transmission in U.S. cities. However, as the demand for electricity increased, DC's limitations quickly became apparent when compared with the alternating current (AC) technology developed by Edison's one-time employee Nikola Tesla (and championed by his competitor Westinghouse). At the time, DC power came in one flavor (110 volts), and, due to technical constraints, couldn't be transported beyond 1.5 miles from a generation plant. AC, on the other hand, could be easily transported via cheaper wires at high voltage, and then could be stepped down for use in the home by a variety of devices.
To thwart the adoption of AC, and to save his investments and patents, Edison funded a vast and outrageous misinformation campaign to discredit it, even staging public electrocutions of animals to show how dangerous AC was -- most famously frying a Coney Island circus elephant that had gone rogue. To cap it off, despite being an adamant opponent of capital punishment, Edison secretly funded the creation of New York's first electric chair -- using AC power. Initially, it was a disaster, and the condemned had to be shocked multiple times before dying. The demonstration also did nothing to halt the widespread adoption of AC. Amazingly, many U.S. cities kept pockets of DC power for decades, with some lasting until just a few years ago.
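The physics behind DC's distance problem is simple enough to sketch: for a fixed amount of delivered power, raising the voltage lowers the current (I = P / V), and resistive loss in the wires falls with the square of the current (P_loss = I²R). AC won because transformers made high-voltage transmission practical. A rough illustration (the power and resistance figures below are hypothetical, not historical):

```python
# Resistive line loss for a fixed delivered power at different voltages.
# Higher voltage -> lower current -> dramatically lower I^2 * R loss.

def line_loss_watts(power_w, volts, line_resistance_ohms):
    current = power_w / volts              # I = P / V
    return current ** 2 * line_resistance_ohms   # P_loss = I^2 * R

P = 100_000      # 100 kW of delivered power (hypothetical)
R = 0.5          # ohms of total wire resistance (hypothetical)

for v in (100, 1_000, 10_000):
    loss = line_loss_watts(P, v, R)
    print(f"{v:>6} V -> {loss:>10.1f} W lost ({100 * loss / P:.2f}% of load)")
```

At low DC-era voltages, the wire can burn off more power than it delivers; step the voltage up by a factor of 100 and the loss shrinks by a factor of 10,000.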
3. QWERTY vs. Dvorak
In the early days of the typewriter, one of the most confounding dilemmas was creating a keyboard that didn't get its works all jammed up. Inventor Christopher Latham Sholes came up with a novel and effective solution in 1873: rearranging the layout of the keys so that frequently used letters were spaced in a manner that largely prevented their mechanisms from clashing with each other. His system, which we now know as QWERTY, worked well, and eventually pushed out all the competing layouts. The only problem is that, apparently, the QWERTY layout is pretty darned inefficient, and is an ergonomic nightmare that promotes fatigue. One researcher working in the 1930s, a Dr. August Dvorak, created a keyboard layout that actually made typing easier, faster and more comfortable. Despite a wealth of evidence supporting his layout's value (as his students readily and repeatedly whupped most QWERTY typists in competition), world events conspired against him. The economics of the Great Depression, followed by the materials shortage of WWII and the reticence of already established QWERTY training schools, meant that no one was willing to give him a chance. Though tons of alternatives are available today (and the Dvorak layout is offered as an option on most PCs), the vast majority of keyboards are QWERTY.
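Dvorak's central idea -- put the most common letters on the home row so fingers travel less -- is easy to illustrate. The sketch below counts what share of the letters in a sample phrase are typed from the home row under each layout (the sample text and the letters-only simplification are our own):

```python
# Home-row keys for each layout. Dvorak places all five vowels
# under the left hand and the most common consonants under the right.
QWERTY_HOME = set("asdfghjkl")
DVORAK_HOME = set("aoeuidhtns")

def home_row_share(text, home_keys):
    """Fraction of alphabetic characters typed from the home row."""
    letters = [c for c in text.lower() if c.isalpha()]
    return sum(c in home_keys for c in letters) / len(letters)

sample = "the quick brown fox jumps over the lazy dog"
print(f"QWERTY home-row share: {home_row_share(sample, QWERTY_HOME):.0%}")
print(f"Dvorak home-row share: {home_row_share(sample, DVORAK_HOME):.0%}")
```

Even on this tiny sample, the Dvorak home row catches roughly twice as many keystrokes -- which is exactly the ergonomic argument Dvorak made.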
4. TV Scan Lines: 441 vs. 800
The early days of television were like a slow-motion gold rush. Starting at the turn of the last century, companies worldwide streamed into the new field strictly on the speculation that riches were to be found somewhere, some day. Not surprisingly, with so many differing and competing players, television's technical development was completely scattershot. By the onset of WWII, the U.S. had just nine television stations across the country, each with unique broadcasting standards and incompatible television sets; some networks' systems, like that of DuMont, even differed from city to city. Though the varying stations and budding TV makers negotiated with one another over the years, none could agree on a common standard that would allow any TV to receive signals from any station. To reconcile the fractured market before it fell apart, the FCC stepped in and appointed the National Television System Committee, which finally formalized broadcast standards in 1941. At the time, one of the crucial competing standards was the number of scan lines used for TVs -- effectively, the vertical resolution of the set. The FCC chose 525 lines as a compromise between RCA's preferred 441 and a proposed higher resolution of 800 scan lines. That compromise -- which thwarted what would have effectively been the same vertical resolution as today's 720p HDTVs -- would remain in the U.S. for more than 50 years.
5. AM Radio vs. FM Radio
There's no room to recount the travails of the radio world in this tiny space, but one of the more dramatic conflagrations involved the battle between AM and FM. After debuting around the turn of the last century, AM radio became big business and, of course, a groundbreaking development, as people across the world could hear live events and entertainment programs. The problem with AM, though, was that its signal was highly susceptible to interference. Enter Edwin Armstrong, already successful thanks to his other technical patents, who set out to experiment with ways to mitigate interference. Through his work, he ended up inventing FM radio.
After building an FM transmitter for RCA in 1934, Armstrong offered to license his patent to his friend, and head of RCA, David Sarnoff. Sarnoff, more concerned with developing TV, declined the offer, and Armstrong went on to build his own station. A public display of his technology brought incredible fanfare, and, within a few years, Sarnoff had realized that his AM radio business was threatened, and petitioned the FCC to restrict FM. His insider connections prevailed, forcing FM radio to move to different frequencies, which meant that thousands of FM radios (and transmitting equipment) became obsolete overnight. The coup de grace came when RCA filed for -- and won -- the patent for FM radio. Armstrong was devastated, and ended up battling RCA in court for the rest of his life. After a court loss, and resulting financial ruin, Armstrong committed suicide in 1954 (though his wife posthumously won the case). FM radio would eventually prevail, largely relegating AM to the domain of talk radio and news, but it would take more than three decades after its initial aborted launch for the format to take off.
6. CDMA vs. GSM Cellular Technology
Like so many early experiments, the analog days of cellular telephones (like the one Gordon Gekko carried in the original 'Wall Street') were characterized by a mishmash of technologies that consolidated only fairly recently into two competing and incompatible standards: CDMA, used in the U.S. and a minority of countries across the globe; and GSM, used by about 85 percent of the world market. GSM was developed and released in the '90s as a global standard to promote interoperability. Interestingly, CDMA came later but was based on far older technology, and was instead developed with security and restriction in mind. The most pronounced strength and unique characteristic of CDMA is that it uses a spread-spectrum technology called code division (the CD of CDMA). Not surprisingly, the basis for the technology came from the military, but it actually dates back to the WWII era. Even more mind-blowing is that it was invented and patented in 1942 by uber-legendary silver screen babe Hedy Lamarr (with composer George Antheil) to remotely and securely control torpedoes. It wasn't used by the military for that purpose, but the coding system eventually found its way into Qualcomm's CDMA standard. By the time it launched in Korea in 1995, though, all of Europe had already agreed to use GSM technology.
While CDMA does offer many advantages and has the largest number of subscribers in the U.S., its lack of compatibility with most foreign countries' systems, and even with some operators in the U.S., has left consumers at a competitive disadvantage. Sadly, the next generation of wireless isn't doing much to fix the problem. (See Wireless Data below.)
7. SACD vs. DVD-Audio
Staying true to historical form, music and electronics companies got bored in the late '90s and decided that it was time to switch up the physical format again and abandon CDs. (Poignantly, their greed would earn a karmic nut-kick that still has them reeling.) The nominees for the next best thing were SACD (made by Sony and Philips), which debuted in 1999, and DVD-Audio (supported by Toshiba and a posse of heavy-hitters) which came out the following year. Both were pitched to consumers with the idea that they offered incredibly high sample and bit rates, various forms of surround sound, and, in the case of DVD-A, the potential for added value, such as video, video games and other content. And, naturally, they also required special and pricey new players to use them, high-end surround-sound speaker systems to appreciate them, and a deep wallet to buy your entire collection of music all over again.
In a rare case of clearheaded decision-making, consumers simply balked, and opted instead for low-quality, downloadable music. Neither format has done much business since, other than in niche communities like classical music fans and audiophiles. It is with much delight, too, that we link to an Audio Engineering Society double-blind test, which found that even engineers and audiophiles can't distinguish SACDs (and, we'd assume, DVD-As as well) from regular CDs. Enjoy.
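For context, the raw numbers these formats were sold on are easy to compute from their published specs: Red Book CD audio, DVD-Audio's stereo maximum, and SACD's 1-bit Direct Stream Digital format. A quick comparison (the helper name is our own):

```python
# Uncompressed data rate of a PCM-style audio stream in kilobits/sec:
# sample rate x bit depth x channel count.
def pcm_kbps(sample_rate_hz, bits, channels):
    return sample_rate_hz * bits * channels / 1000

cd = pcm_kbps(44_100, 16, 2)        # Red Book CD: 44.1 kHz, 16-bit stereo
dvda = pcm_kbps(192_000, 24, 2)     # DVD-Audio stereo maximum: 192 kHz, 24-bit
sacd = pcm_kbps(2_822_400, 1, 2)    # SACD's DSD: 2.8224 MHz, 1-bit stereo

print(f"CD:        {cd:8.1f} kbps")
print(f"DVD-Audio: {dvda:8.1f} kbps")
print(f"SACD DSD:  {sacd:8.1f} kbps")
```

Four to six times the data of a CD sounds impressive on a spec sheet; the double-blind test above suggests ears are less easily impressed.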
8. Digital Audio Formats
The early '90s were a perfect storm of technological progress. At the same time that computers were getting cheap enough for home buyers yet powerful enough to handle multimedia, the World Wide Web suddenly became accessible to non-academic types. Soon CD ripping and file sharing were all the rage, and the first shots of the music format war were fired. Since the music business steadfastly refused to sell music online for several crucial years, the gap was filled by the end of the decade by online "sharing" (read: stealing) services like Napster. Though several audio file formats existed that offered far better quality, MP3 became the quick and dirty lingua franca thanks to the small file sizes needed for downloading over dial-up, which was the only option for most users at the time. Once the music industry realized it was quickly going bankrupt and decided to enter the game, it made its second blunder and put out files laden with DRM -- a variety of file types that weren't transferable between computers or necessarily compatible with the digital music players entering the marketplace.
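The dial-up math explains MP3's victory almost by itself. A rough sketch of download times for a four-minute song over a 56k modem (real-world dial-up throughput was lower than the modem's rating, so these are optimistic figures):

```python
# Approximate download time for audio of a given bit rate over a slow link.
MODEM_KBPS = 56          # V.90 modem, theoretical downstream rate

def download_minutes(song_minutes, audio_kbps, link_kbps=MODEM_KBPS):
    """Minutes to download a song encoded at audio_kbps over link_kbps."""
    size_kbits = song_minutes * 60 * audio_kbps
    return size_kbits / link_kbps / 60

for kbps, label in [(128, "128 kbps MP3"), (1411, "uncompressed CD audio")]:
    print(f"{label}: about {download_minutes(4, kbps):.0f} min to download")
```

Roughly ten minutes per song was tolerable; well over an hour and a half for the uncompressed version was not. Quality never stood a chance.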
By the time so-called MP3 players became mainstream, the situation was not unlike that of e-readers now: consumers had to decipher what file type their media was, what types their player supported, and what their rights were when using or sharing them. Rather than agree upon a single file type playable on any device, electronics companies (notably Apple) refused to cooperate. As Apple's iTunes store rose to dominate the market, the only competitive option left to other online stores was to sell DRM-free MP3s that could be played on any device. The result is that virtually every online store -- again with the major exception of Apple's iTunes store (which sells DRM-free AAC files) -- now sells completely DRM-free MP3s that will work on any digital music player. And yet, instead of enjoying tunes at the best possible quality, the average music lover today listens to lossy, lower-quality MP3s.
9. HD DVD vs. Blu-ray
The most high-profile format war to have taken place since the great Betamax/VHS debacle, the battle between HD DVD and Blu-ray was sort of like the first Gulf War: tons of saber-rattling and build-up, leading to a massive invasion and a relatively quick denouement. The battle pitted two large groups of electronics makers, software companies and movie studios against one another, with Toshiba leading the charge for HD DVD and Sony championing Blu-ray. The DVD revolution of the late '90s had left electronics companies struggling to make a profit in a market flooded by cheap players, while movie studios raked in cash. Sony, Toshiba and their ilk realized that the new hi-def technology was an opportunity to create a patented standard, for which they could charge royalties. Two competing technologies emerged, and, despite ongoing negotiations to find a single standard, the two sides quickly lined up behind their respective formats in 2002. Each had its merits. Among other variables, HD DVD was cheaper to manufacture, and used existing dual-layer DVD discs, while Blu-ray was pricier but offered almost twice the storage. Players for each camp debuted in 2006, and, for the first year or so, sales were tepid, especially since players were insanely expensive. What arguably tipped the scales in favor of Blu-ray was -- like the adoption of DVDs the generation before -- the inclusion of a Blu-ray player in Sony's PlayStation 3. While HD DVDs initially outsold Blu-ray, more households ended up owning Blu-ray players and investing in Blu-ray discs. So, when Warner Bros. announced on the eve of the 2008 CES convention that it would soon stop supporting HD DVD, the war was effectively over. HD DVD supporters canceled their programs at CES, and, within months, the format was dead.
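The storage gap came down to per-layer capacity -- 15 GB per layer for HD DVD versus 25 GB for Blu-ray, per the two formats' published specs. In numbers:

```python
# Per-layer and dual-layer capacities of the two competing disc formats
# (published spec figures).
HD_DVD_LAYER_GB = 15
BLU_RAY_LAYER_GB = 25

print(f"HD DVD dual layer:  {2 * HD_DVD_LAYER_GB} GB")
print(f"Blu-ray dual layer: {2 * BLU_RAY_LAYER_GB} GB")
print(f"Capacity advantage: {BLU_RAY_LAYER_GB / HD_DVD_LAYER_GB:.2f}x")
```

Hence the article's "almost twice the storage": 50 GB versus 30 GB on the common dual-layer discs.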
10. 4G Wireless: WiMax vs. LTE
Complete entanglement-free wirelessness is the holy grail and future of virtually every techno-gadget, but speed is surely a close second -- which makes the ongoing war for 4G supremacy a delicious battle royale. Seemingly having learned from the errors of the past, U.S. wireless companies got together with their foreign counterparts to create a worldwide standard for ridiculously fast and smartly run 4G systems. But, once again, there are two standards playing this particular game of chicken -- WiMax and LTE -- and neither side plans to back down. So, in the U.S., AT&T, MetroPCS and Verizon customers will (or may already) be using theoretically compatible LTE cell phones and PC modems, while Sprint users will be chugging down data using WiMax. (T-Mobile has decided to simply wait this one out and just upgrade its 3G network for now.) Some observers have pointed out that the competition is good, and that both technologies may well end up succeeding. We agree, but we have to point out that, once again, consumers are left without a single, universal standard -- and squarely in the hands of the wireless companies. Just where they want us. Great.