The Pocket Watch Was the World’s First Wearable Tech Game Changer
Google Glass is just the latest in a long line of body-borne technologies designed to enhance
our lives.
Would you wear a computer on your wrist?
It’s a new high-tech debate, as “wearable” computers begin to go on sale. We’ve long grown
accustomed to carrying a computer in our pockets—but now tech firms are betting we’d rather
have one on our wrist, showing us our messages, social-networking pings, maybe some Google
searches. Already, over 400,000 people bought Pebble smartwatches last year, and Google’s
head-mounted Glass computer was released to over 10,000 early adopters. Apple is widely
rumored to be putting out a smartwatch later this year.
For many, wearables seem like a final, crazed step in information overload: Tweets on your
wrist! Supporters, however, claim that a smartwatch might actually be less annoying—because
you can quickly glance at it.
This isn’t the first time we’ve run through this debate, though. To really understand how the
wearable computer could change our lives, consider the impact of the original wearables—the
pocket watch and the wristwatch.
Clocks began to transform everyday life as early as the medieval period, when church bells
sounded the hours, letting villagers know the pace of the day. But timekeeping began to weave
itself into day-to-day life in an entirely new way as clocks became more omnipresent and
portable. Affordable pocket watches weren’t common until the 19th century, but once they
arrived, they quickly invaded the world of commerce. When you could time your actions with
those of a remote trading partner, new styles of just-in-time commerce could emerge.
“Merchants desperately needed to time certain things,” says Nigel Thrift, co-author of Shaping
the Day, a history of early timekeeping. “If you think about all the farms, those goods and crops
around London, if they don’t get to the city at a certain time, they’re spoiled.” Meanwhile,
pocket-watch-wielding conductors meant trains could begin to keep regular schedules; scientists
and astronomers could conduct more precise experiments. Portable watches even made it easier
for lovers to conduct illicit affairs, by arranging to meet at a preordained spot and time. (“You
try conducting an affair without a sense of time,” Thrift jokes.)
And when precise time wasn’t available? Chaos ensued. In 1843, elections in Pottsville,
Pennsylvania, were disputed when nobody could agree on what time the polls had closed—
because the townspeople hadn’t synchronized their clocks. (“It is well known that we have no
exact or certain standard of time in this borough,” complained a local paper.)
Having a watch wasn’t just about keeping to the clock, though. It was a cultural marker—a
performance of punctuality. Every time you pulled out your watch, conspicuously and in public,
you signaled others that you were reliable.
“You were a modern person, a timekeeping person, a regular person,” says Alexis McCrossen, a
professor of U.S. history at Southern Methodist University who wrote Marking Modern Times,
a history of American timekeeping. A 1913 Hamilton watch ad explicitly described the device
as a tool for moral improvement: “The Hamilton leads its owner to form desirable habits of
promptness and precision.” Soon, the watch was a straightforward metaphor for having attained
the middle class: Horatio Alger novels often showed the plucky protagonist had “arrived” when
he got a watch. The technology even created a new compliment: If you were ambitious and
hardworking, people called you a “stemwinder”—somebody who habitually wound his
timepiece.
“Punctuality gets marked as a morally elevated thing,” notes Robert Levine, author of A
Geography of Time and a social psychologist at California State University, Fresno.
But pocket watches had one problem: They were impractical when you were on the go. If you
were trying to do something active—like drive a car or ride a horse—reaching into your pocket
could distract you and cause disaster. So, much as today’s gym-goers put their iPods on an
armband while they work out, sporting folks of the 19th century began to fashion “wristlets”—
leather straps that would hold their pocket watch on their wrist while they rode on bicycles or on
horseback. The 18th and 19th centuries also saw some of the first formal wristwatches—with
delicate, small watch faces, worn by women as a form of jewelry.
Time became information you acquired with a quick glance. But because women were the main
wearers of wristwatches, men mostly avoided the trend; the devices looked too effeminate.
“They were very gender divided,” Thrift notes. Even watchmakers thought the wristwatch trend
was silly and hoped it would die off. One decried it as “the idiotic fashion of carrying one’s
clock on the most restless part of the body.”
The tide changed during World War I. Officers began using wristwatches to coordinate the new
style of attack: opening with a barrage of gunfire to stun and destabilize the enemy, followed
immediately by an onrush of soldiers.
“You’d want the soldiers to be alert to the fact that the guns were about to stop, and be ready to
spring,” says David Boettcher, a British horologist who has researched wartime watch-wearing.
This required precise timing, and officers fumbling around in the dark for a pocket watch
wouldn’t do. To make the wristwatches easily legible in battle, watchmakers fashioned them
with large, round faces that had prominent dark numbers set off by a white porcelain backing
and coated in radium that glowed brilliantly in the dark.
Suddenly, wristwatches seemed manly.
“It was the iPhone of its day; it was leading-edge technology,” Boettcher notes. And like many
forms of hot new tech, it spread virally. “You get loads of boys out on military maneuvers, and
one’s got on his watch that ticks and glows, and so everybody wants one.” Millions of soldiers
went home having developed a wristwatch-wearing habit. The numbers tell the tale: In 1920,
wristwatches were only 15 percent of all watches made in America; by 1935, they accounted for
85 percent. (Even today, men’s wristwatches are ostentatiously large—and often
sold in ads boasting how jet-fighter pilots use them. “It’s almost to say, ‘I’m not a piece of
jewelry—I’m a piece of technology,’” as McCrossen jokes.)
By mid-century, the exploding world of white-collar work presumed that its employees would
—more often than not—have a wristwatch. Students received them as gifts upon graduation.
Glanceability was precious in the highly coordinated world of office meetings. Craning your
neck to look at the wall clock could risk offending a superior; a quick glance at your wrist
wouldn’t. “There are all sorts of ways you can glance at your watch without anyone knowing,
and it’s instantaneous,” McCrossen notes.
By the 1980s, the wristwatch had become, as York University humanities professor Douglas
Freake dubs it, “perhaps the most important cybernetic device in contemporary industrialized
societies.” We were cyborgs of time. And slaves, too, as critics pointed out. Wristwatches may
have made us more efficient, but as humanists had long fretted, perhaps total efficiency is a
creepy goal for everyday life.
These days, of course, glanceable time is no longer only on our wrists. It has evaporated into the
world around us. Clocks are everywhere: on computer screens, phones, coffeemakers and
microwave ovens. Nobody needs to wear a wristwatch to tell time anymore. The wristwatch has
transformed into pure metaphor, nothing but a signal.
But if the evolution of the wristwatch offers any clues, the journey of the wearable computer is
likely to be tumultuous. As with early watches, the companies selling these odd new devices
make appeals to one’s morality. Google claims its head-mounted Glass helps you “get
technology out of the way,” while Pebble says a glance at the wrist is less rude than having to
“pull your phone out in the middle of the meeting.”
Whatever one thinks of those assertions, it’s certain that wearables would tweak our orientation
to the world around us. Much as wristwatch wearers developed a heightened sense of time, we’d
develop a heightened sense of “what’s going on”—news of the day, invisible details of our
health, the thoughts of a loved one. The watch allowed new feats of time coordination;
wearables would increase social coordination.
And so we’d probably see a cultural echo, too. Those who thrive off social contact will love a
wearable, but those already overwhelmed by Facebook and texting will find it tears at their
solitude and sense of self. Both will be, in part, right. The device may be new, but those hopes
and fears are old.
***
Did the Evolution of Animal Intelligence Begin With Tiktaalik?
How one marvelously preserved fossil sheds light on how the vertebrate invasion of land took
place.
Tiktaalik roseae had fish-like fins and a flattened, crocodile-like skull, and is thought to have
lived in shallow water, using its fins to prop itself up.
Water gave birth to life, and guarded it jealously. For billions of years, the land was barren,
while life proliferated in the buoyant, nurturing bath of the seas, ignorant of such terrestrial
concerns as falling down. The first invaders were plants, which began creeping upland from the
streams and swamps some 450 million years ago, followed by arthropods and a few brave
mollusks, which became the land snails. But waiting in the shallows was a nine-foot-long,
crocodile-headed fish with both gills and, on the top of its head, air-breathing nostrils called
spiracles. With a fateful lunge landward, it changed the course of life on earth forever.
In 2004, when the fossil bones of Tiktaalik roseae were dug from the ground of Ellesmere
Island, in the Canadian Arctic, the discovery was hailed as a breakthrough not just for
paleontology, but for beleaguered science teachers trying to keep creationism out of their
classrooms. A fish (with scales and gills) clearly resembling a tetrapod (with a flat head, a neck
and prototypes of terrestrial limb bones in its lobelike fins), it precisely filled one of the gaps in
the fossil record that creationists cited as evidence against Darwinian evolution.
Scientists can’t say whether Tiktaalik itself is the ancestor of any species alive today; there were
likely several related genera making the same transition around the same time. But the
marvelously preserved fossil sheds new light on how the vertebrate invasion of land took place,
some 375 million years ago.
Until this year, Tiktaalik was known only from its front half, but in January, evolutionary
biologist Neil Shubin of the University of Chicago and his colleagues reported excavating the
posterior skeleton of their original specimen. The hip and pelvis were surprisingly robust,
suggesting more powerful rear limbs than previously believed. Although those appendages were
almost certainly still encased in fleshy lobes, they could have helped support or even propel the
animal in shallow water or on mud flats. If so, it changes our view of the evolution of tetrapods, whose
ancestors were believed to drag themselves by their forefins, only developing useful hind legs
once ensconced on land.
As for what drove this epochal migration, “it’s extremely bloody obvious: There were resources
on land, plants and insects, and sooner or later something would evolve to exploit them,” says
vertebrate paleontologist Mike Benton of the University of Bristol. It’s also possible, says
Shubin, that fear played a part. “If you look at the other fish in the water at the time, they’re big
monstrous predators,” he says. Some exceeded 20 feet in length. Even for Tiktaalik, a toothy
carnivore itself, this was a “predator-rich, competitive environment.” If you can’t be the biggest
fish in the pond, maybe it’s better to get out of the water altogether.
And from those first lumbering steps, it appears, came the whole parade of terrestrial
vertebrates: amphibians and reptiles and birds and mammals, including those that later returned
to the oceans. The process set in motion by Tiktaalik (or its cousins) was necessary for the great
variety of animal life we see today.
On land, animals faced all new challenges. Nothing in the history of life to that point would
have prepared them for the rude experience of slipping off a branch, or the shocking necessity
of copulation as a substitute for broadcast spawning. Challenges, though, were also
opportunities, to expand and diversify; Benton estimates that the land holds perhaps ten times as
many species as the oceans.
Human intelligence is unique on the planet, and even by a generous definition of language, only
a handful of other mammals and birds seem to have mastered it. Simon Conway Morris of Cambridge,
England, author of Life’s Solution, believes that evolution inevitably converges on certain traits,
including intelligence. The octopus, which can manipulate objects with its arms and solve
problems, is an example of an intelligent animal whose ancestors (as far as we know) never
lived on land. But it’s hard to imagine anything like our technology developing underwater.
The emergence of intelligence remains a mystery, Benton says: “Arguably, a coral reef is
equally complex as a forest. But why primates developed big brains to navigate around and find
food, but not, say, clown fish—I couldn’t say.”
Shubin’s book, Your Inner Fish, recently adapted as a PBS series, traces the evolutionary
history of the human body back through the time of Tiktaalik, showing, for instance, how a bone
adjacent to the spiracle evolved into a bone in the tetrapod’s middle ear. A fish that hauled itself
out of the water on its fleshy fins, for reasons we can only speculate on, bequeathed us our
limbs, backbone, teeth and sense organs—and, perhaps, our expansive curiosity and
restlessness.