[Metroactive CyberScape]


Photograph by George Sakkestad

Heads Up: Local researchers say that technology is forcing the human brain to compartmentalize and manage time in ways never before required. (photo taken at 'Cyberheads' exhibit at San Jose's Tech Museum of Innovation.)

Silicon Valley Human

As the information revolution rages onward, scientists here have quietly started to study what high technology is doing to the mind and body of the human species. How does it feel to be a guinea pig for the future?

By Traci Hukill

IN A RECENT EPISODE of Star Trek: Voyager, crewmember and former Borg "Seven of Nine" develops a case of what Shakespeare called "vaulting ambition." Questing for perfection and hungry for success, the 24th-century cyborg overloads her computerized brain, eventually paying the price the way Macbeth did 700 years earlier--with the sanity of her human half.

Seven of Nine's troubles start when she upgrades her electronically enhanced brain and starts downloading larger-than-usual quantities of the ship's log while she "regenerates," or sleeps. At first, the upgrade is a boon. Seven of Nine's skyrocketing productivity helps avert disaster when she discovers a dangerous virus in the ship's plasma coils, thanks to her turbo-charged circuitry. Praise and promotion are just around the bend.

But shortly the torrent of information flooding Seven of Nine's processors overwhelms her. Paranoid and unable to distinguish between truth and fiction, she starts spreading rumors of conspiracy that hurl the ship into chaos. In the end the diagnosis is simple and humbling: Seven of Nine's ambition has exceeded her capabilities. The ship's doctor dismantles the upgrade, and life aboard the Voyager is restored to order.

Leave it to science fiction to point out the dark side of progress. "Information overload" doesn't have a heading in The Merck Manual yet, but it should. The phrase aptly describes the cranky, insomniac state that follows when a person's synapses are fried to a smoking mess. There are other names, too: David Shenk, who wrote a book about it, calls it choking on "data smog." British psychologist David Lewis calls it "information fatigue syndrome." And of course there's Sting, who beat the curve in 1981 with the Police song "Too Much Information."

Edge Times

SCIENTISTS ARE just beginning to unravel what effect the information revolution is having on the human species. They know this much: the kind of work people do in the modern industrial world is unlike any work humans have done before. It involves huge amounts of information and demands that people wrap their minds around different subjects on cue, bouncing between instant email messages and urgent phone calls, pagers and websites, assimilating facts and ideas all day long. Multitasking used to be making sure the rolls didn't burn while also talking on the phone and chopping vegetables. Now it means ping-ponging not just between ideas but between the ways they're presented: read and written, spoken and listened to, angled this way and that.

It's not just the amount of information that's changed; it's also the speed of transmission. People strive to work as quickly as their computers will let them, an unfair competition that always leaves the humans coming up short. In the all-powerful world of commerce, busyness is practically a religion and wired Silicon Valley its Vatican City. Silicon Valley prides itself on its hectic pace of life, so much so that the image of exhausted startup employees slumped face-down in cold pizza has become an icon. Product cycles have gotten so short it isn't worth taking a vacation because catching up is too hard. Times are good for some and times are hard for others, but time is scarce all around.

We have technology to thank for this state of affairs, technology that's become a feature of the environment, like cold winters. And there's reason to believe humans are changing their behavior to adapt to it.

As the old barriers separating work and home life dissolve, work gets diffused throughout the whole of life. New cultural trends reflecting this fact are already emerging, and something else may be beginning, too: a different way of thinking, one that doesn't require long uninterrupted periods of time but instead mirrors the quicksilver shift of the images on a computer screen. History might look back at something called Silicon Valley Human and recognize it as the prototype for a slightly but distinctly different human being, one that uses its brain in an entirely new way and whose cultural habits reflect a fundamental environmental shift.

Any time a species evolves, something is gained and something is lost. The question is, what are those things, and how will they change the way people live?

Body of Evidence: Scientists say high-tech tools have vastly increased our opportunities for 'multitasking'--doing several things at one time--which most modern-day humans do at a higher rate than at any other point in human history.

Culture Incubator

NO HALLUCINOGEN-huffing Amazon tribe could offer San Jose State University anthropologists Jan English-Lueck, Chuck Darrah and Jim Freeman charms above those of the Silicon Valley clan. They believe this place is a culture incubator, and that studying Silicon Valleyites may provide a clue to how the rest of the world will behave once it too is wired to capacity (if the rest of the world gets wired to capacity, as they like to point out).

The word that describes the culture-incubator phenomenon stems from Greek roots meaning "a people" and "origin."

"To me Silicon Valley is a premier laboratory for ethnogenesis," says Jan English-Lueck. "We are creating culture right and left. I think that's tremendously exciting."

One year ago the trio completed 170 lengthy interviews with employees in the high-tech sector, from janitors to CEOs, shadowing their subjects throughout the workday and into their homes. Within a short time the project's indicators began to swing toward a kind of attitudinal magnetic north: Regardless of how many hours people spent at the office, they felt like they were working all the time.

Darrah, Freeman and English-Lueck found people (as we all have, to our great comfort and joy) making work calls during the commute, reading memos in front of the TV, checking email after the kids went to bed and networking during every conceivable social function, from picking up the kids at day care to chatting at barbecues. They found people taking self-improvement courses to further their careers and using job skills at home. People "worked" on their relationships, their bodies, their personalities, their spirituality.

"The code word I use for it is 'workification,'" English-Lueck says. "When we're touring people's houses they'll take us to various work stations: this is where they pay their bills, this is where they do research on their garden. The language people use, the processes they use, come straight out of work."

One person, she recalls, talked about having "family meetings." Being a consumer is another form of work: shopping for airline rates, picking an energy provider, taking care of the recycling.

There's just one problem. Most people don't want to work all the time. In fact, this sounds to lots of people like a living hell.

English-Lueck, however, is sanguine about the way work shades every part of people's lives. It's not nonstop drudgery in her view, but rather constant activity that leaves scant time to recharge.

"In many ways this is not an utterly bad thing, except that work is stressful, and that means that a lot of things that once were considered refuges are now considered work. So where do you go to get away from work if your family is work and your garden is work or driving is work?"

Tag Along Technology

SO WHO TRICKED US into accepting 16 hours of work every day? Computers were supposed to ease the burden of work, not just spread its weight over a larger portion of life.

Chuck Darrah believes that even though there's a crisis mentality among a lot of working people in Silicon Valley, much of it is self-induced. If so, then it seems even he's not exempt. Darrah arrives at a 3pm interview not yet having eaten--he's had meetings all day--and polishes off a burger between questions. A boots-wearing native of Santa Clara Valley, father of two and self-professed "old frump," Darrah insists that he's really not a prophet of doom. The pace of life in Silicon Valley is fast, he concedes, but people exaggerate it--and besides, a lot of them kind of like it.

"People are very scheduled, but they're not pathological," he says. "There's kind of a pride or machismo about working so much. Despite all the talk about how things are spinning apart, when you talk to people, quite often they'll also say their lives are quite satisfying. They'll say, 'Part of this is that my life is so exciting. I'm doing things I never thought I'd be able to do, meeting people I never thought I'd meet.'"

Employers are more than happy to feed that intoxicating momentum with more thrilling work. But they also have the help of little inanimate elves--the gadgets, the go-anywhere, do-anything toys of work: laptop computers, pagers, cell phones. This is one group of work's handmaidens, the things that allow people to haggle with clients while sunning at the beach.

"An awful lot of it has to do with the fact that this work isn't confined to certain settings," Darrah says. "It's transportable."

The work of manipulating information requires nothing so gross as an assembly line or machine press, so it can be done anywhere, even in pajamas. The slow seepage of work through the supposedly impermeable walls of the home is a disaster or a victory, depending on how you look at it. Maybe it spells mental absenteeism by keyboard-tapping parents who are present in body only. Maybe it's a return to a more integrated (if romanticized) way of life that prevailed for most of human history, when people worked in fields or shops close to home, their children nearby.

"Many people argue that what's going on today isn't anything new, that it's reuniting work and family in a way that was rendered apart by the Industrial Revolution," Darrah says.

He says he often hears "farm stories" from people who imagine that their lives working from a computer at home are not so different from those of the dairy farmers and cows of yesteryear. But Darrah sees a huge difference, namely that most of the activity is being driven by major corporations that are considerably more bossy than cows.

Corporations, he says, often hand employees the responsibility of balancing their private lives with their professional duties as if it's easy--"As if it's somehow a level playing field between you and Hewlett-Packard," Darrah says. He points out that in all corporate jobs, downsizing is a constant threat and competition as ferocious within companies as between them. Try slapping the big black-and-white haunches of Intel to make it move over next time it's about to back you into a corner. You'll soon find out you don't own the old girl, even if you do get to work by the pool. She owns you. And she knows that with the help of those fancy new tools, you can do a lot more work than you used to.

Chunklet Chips

DARRAH AND ENGLISH-Lueck don't know what life will be like in 50 years. But they do know that all the interruptions by email and phone calls and faxes make it difficult for their study's subjects to concentrate the way they used to.

"It means life is lived in very short chunklets, little chunks of time in which something gets done," Darrah says. "Whereas in the past you might just sit down and just do something for a long period of time. Very seldom do people have time to do that."

After having a family, Darrah had to learn to write academic papers in snippets of time, abandoning the linear beginning-to-end method he grew up perfecting. He says he can now take 10- or 15-minute increments and scribble paragraphs on his yellow tablets that he'll later rearrange into a coherent order, as if he were using a word processing program.

"I said, 'I can either never write again or I can learn to write [while] being interrupted,' " he says. "I didn't want to be one of those people who's 'home with the kids,' but they'd better not interrupt you."

Darrah forced himself to learn a new way of working because he recognized that some things, like kids, don't happen in chunklets.

"Most parents will admit that they have occasions when they'll be trying to get their kids to do something and they'll be going, 'Come on, come on, hurry up!' and then realize, 'Hey, this kid's only 4 years old.' And then they realize that they're kind of living in these short little pieces."

If every moment, even outside of work, is spent striving toward some officious end--reading a quick article in a trade journal, exercising to keep heart disease at bay, maintaining a network of potentially useful acquaintances with quick personal emails--then something has to fall away. And some people think it's the fragile things that go first: contemplative time, time to just be and not do, time to let the mind drift and spin.

Neil Quinn, director of ethics and technology for the Markkula Center for Applied Ethics, thinks creative thought is the biggest casualty of the gadget glut. It's the reflective time that's slipping away, he says, time people need to assimilate what they've learned and, as he puts it, "arrive at a new and higher understanding of things."

"Take two unrelated things that have occurred," he says. "You've read books by two authors who are completely unconnected. But you happen to have gained some third insight that draws a relationship between those two books. There has to be that quiet time that allows you to assimilate the information you're dealing with. You're never going to come up with that third conclusion, that synergy, without it."

Quinn thinks "real in-depth thinking ability" has taken a hit, too. "This may be a little bit far-fetched, but what I think we're seeing today is more people becoming jacks of all trades because we're gaining lots of information that's new to us," he says. "But we're not advancing the state of intelligence the way we previously did. So granted, more people know more things, but we're not increasing depth of knowledge."

In the argot of the metaphysicians, "as above, so below." Scattered, surfacey thinking can do a number on a person's physiology as well as on intelligence. If someone's juggling a number of tasks and the tasks don't get finished, says Larry Rosen, co-author of the 1997 book TechnoStress, the brain and the body get stressed out. The brain wants to do things thoroughly, and it rebels when it can't.

"There's something called the Zeigarnik effect that says that you remember unfinished tasks better than you remember finished ones," Rosen says. He explains that when you start a task and don't finish it, it stays "on" in the brain like a string of lights. If enough strings are left on, they will disrupt sleep--usually between 2 and 4am. Multitasking increases that likelihood.

"You could light up the same number of tasks one at a time," Rosen says, "and finish each one and it would go off. But if you have 10 areas all lit up, some dimmer than others, and you're switching between, then your brain's controller is always checking in on those unfinished tasks to make sure you don't forget them. You can think of it as your brain always being in a state of fight or flight."

Legend has it that Nero fiddled while Rome burned, but here's the problem with legends: violins didn't exist in 64 C.E. Many great works of art may owe their inspiration to disasters, but most people don't write poetry while their houses are burning.

Unitasking: Cupertino resident April Sakara, vice president of corporate marketing for FastForward Networks, says people can help prevent information overload by picking just three things to accomplish every day and creating free-form, unstructured time away from work.

Photograph by Jeff Kearns

Silicon Solutions

FOR THE sleep-starved many who keep waking up at 3:30am thinking about work, Rosen has some recommendations. Don't check email or voicemail just before going to bed. Make a to-do list at night as a cue to the brain that you'll take care of unfinished business tomorrow. Then, before going to sleep, do something totally unrelated to work: read, watch TV, whatever.

People who are very busy have different strategies for getting through their days. Compartmentalizing is a key strategy for some of them.

April Sakara, vice president of marketing for a startup called FastForward Networks and mother of two, has a schedule as tight as a waterproof weave, with every half-hour accounted for. She prevents information overload by picking three things to accomplish every day and "having those channels running" in her mind. When something comes up related to one of those goals, she "finds the bucket to drop it in." But even her system isn't foolproof.

Work is one thing. But when something personal blindsides her, like a family member's illness, she gets thrown. "Obviously if there are fires I get overloaded," she says. "I have to erase things in my head, it seems like."

Sakara is so efficient she's even recruited her dreams to help her out from time to time. Like Seven of Nine, she sometimes processes things in her sleep.

"It's very bizarre. Usually I don't dream, yet when I have things I'm struggling with I don't have the same kind of sleep. I'm like walking the earth or something. It's hard to explain. Sometimes it's a solution I'm looking for. Sometimes it'll be an at-ease feeling about it, like I found the bucket it needs to go in and it will be solved in my everyday routine."

This intensity of experience does not drive Sakara into a made-for-TV paranoid spin because she reserves one mental compartment for complete freedom. She regularly cordons off weekends and vacations and makes sure that they're totally clear of schedules.

"We just get in the car and go," she says. "When I'm with my family, whatever happens happens. There's no expectations set for what we're going to do."

Chopping up the day's activities into neat cubes that fit into certain compartments is just one part of the solution, though. The other part is moving between the compartments with the agility that urgent emails and ringing cell phones demand.

Heady Experience

PEOPLE SEEM naturally to head to the Cyberheads exhibit at the Tech Museum of Innovation. Guests enter the museum at a point directly beneath the souped-up 3D scanner, stepping into a gleaming lobby where, if they are lucky, hordes of schoolchildren spill across the polished floor toward the doors, done with their button-pushing, bell-ringing interactive tour and ready to board buses. If that's the case, the Tech's visitors get to leave the cacophony in the vestibule echoing behind them and take the escalator to the museum's second floor in relative quiet.

Quiet, that is, until they set foot in the Innovations collection of interactive exhibits and enter a thicket of screens and bleeps and lights where a different sort of cacophony begins. There is one quiet spot, though, the perfect place to get one's bearings: The Cyberheads exhibit.

There a senior-citizen volunteer genially ushers the visitor to a chair where a heavy camera circles at eye level, scanning the unique topography of a human head: the fleshy hills of cheek, the ledge of jaw, the shallow cirque of the temple.

Within moments an image of the viewer's disembodied head, able to be rotated 360 degrees in any direction, appears onscreen at a nearby console. With the click of an icon the familiar human flesh blinks out and a different skin overlays the cyberhead's 3-D line frame: marbled granite, green scales, iridescent titanium. It's fascinating but disturbing, this protean thing that is both familiar and foreign, made to morph at the click of a mouse. There is the recognizable valley of the eye socket, the rise of the lip, but they belong to a prehistoric green swamp creature or a mute piece of stone, milky and ghoulish.

If the schoolchildren beat the visitor to the Cyberheads exhibit, he or she will notice that the children are unfazed by the foreign heads rotating in front of them. In fact, they're having fun making them spin diabolically.

"Imagine that you could change like a chameleon or make yourself look like a shiny metal robot," the poster over the exhibit reads. The kids don't have to work too hard to imagine changing like chameleons. Some part of them has been preparing for it all their lives.

The Kids Are All Right

THERE IS ONE GROUP seemingly unfazed by the fuss, one that handles the ringing phones and beeping emails with perfect aplomb. The kids.

"Now we have the luxury of looking at younger people--teenagers and kids--and they just accept this as the norm," says Jan English-Lueck. "[For them] having this kind of density of information is just the way you live, and they don't have much angst about it, or regret. If anything, they kind of like it."

There's nothing like a half hour of quality time with MTV to make a grownup feel old and irrelevant. Music videos, like the commercials and movie trailers they inspired, explode with flashing images that stay onscreen for fractions of a second. It might make parents feel nauseated, but the tykes can gaze at it for hours. They can even do homework in front of it, or so countless kids have insisted in countless arguments with their parents.

The ante's been upped at the computer and console games table, too, says Christopher Erhardt, who teaches game design at DigiPen Institute of Technology in Washington.

"It used to be possible to put out something with minimum graphics and minimum sound and they'd be content to play games that weren't what you'd call graphic showpieces," he says. "But now kids want bang for their buck. They tend to want high action with sophisticated controls that are innovative but not really overwhelming."

Erhardt notes that kids are able to keep up with just about anything game designers put on the market. He attributes this not to some mysterious cognitive development but to something quite recognizable to the parents arguing with their kids over doing homework in front of the TV while talking on the phone.

"It's more an ability we all had as teenagers that we tend to forget, and that's the ability to multitask," Erhardt says. "We always as adolescents have a much better ability to compartmentalize than we do as adults."

Caren Ludmer, a child psychologist working in San Francisco, explains why children and adolescents are more chameleonlike in their thinking than adults.

"In general, kids' brains are more plastic than adults' are," she says. "That's why kids can learn language more easily than adults. It's not a chemical difference, it's the fact that kids' brains are growing. First of all, the structures of the brain itself are growing, so pathways can grow, and alternate pathways too."

It is very difficult and very expensive to track the minute chemical footprints in the brain that show how the cognitive process has adapted to late 20th-century technology. In fact, no one's done it. But scientists have definite ideas about what they would find if they could.

"We've been tossing it around for a long time, and there's no study on it," says Larry Rosen. "Our guess is because of the multimedia nature of technology, and because of changes in the presentation of information over the years--from longer and more detailed to shorter and more to the point--that people today are getting less tolerant of longer bursts of information."

"But I'm pretty sure we're going to find that we have shorter attention spans," he says confidently. "Even the schools have cut down the amount of time for a lesson. When I was in junior high and high school, classes were exactly an hour. Classes now are 42 minutes long, and my kids can tell you exactly what time each class begins and ends."

Mental long-distance runners have fallen out of fashion. The sprinters are the ones who succeed today.

It Started With Book Larnin'

THIS IS NOT THE first time human brains have mutated to expand some functions while allowing others to wither. Anthropologists Darrah and English-Lueck both compare the present to the moment in the early 16th century when literacy touched down among the masses.

"One of the effects of literacy is that you can no longer remember anything," English-Lueck says. "There's no way on earth I could begin to memorize a five-hour saga, and yet that was the norm for someone who was educated, even 500 years ago. Because we learned to externalize information to paper, we started thinking differently and working differently and acting differently. And it could be that we are in a similar situation where we're going to treat information differently."

We already do. Why commit an 11-digit phone number to memory when there's a speed dialer on the phone? Email addresses can be easily stored on email programs. Automated billing and its nefarious cousin, automatic withdrawal, make keeping up with the bills easier. And this is just the beginning of the ways people rely on technology to remember things for them.

Maybe it's a good thing. In spite of ethicist Neil Quinn's fears about the prospects for creative thought, he thinks humans will ultimately benefit from not having to sweat the small stuff. In terms of the literacy model, he says, the modern-day analog to 16th-century memory loss is even more memory loss, courtesy of the "memory augmentation" offered by other mediums (specifically RAM).

The new crucial skill, Quinn says, will be knowing how to navigate the information quickly in order to retrieve what's useful. This development makes for unlikely heroes: reference librarians have been masters of navigating information for decades.

"I think it's better that we have our reasoning skills and deductive skills fine-tuned more than they used to be," Quinn says, "instead of cluttering our minds with pure information."

"An improvement? Yeah, it really is."

Quinn's take is a refreshing change from the usual gripe that we're headed for a national ADHD epidemic. That one is a chilling scenario. It would leave many layers of cultural subtlety exposed to a stampede of people basically stuck in perpetual teenagehood. It kind of makes older generations think, "And after all we've done for you ..."

Maybe, like a nation of absent-minded professors who've finally found the perfect assistants to mind the details, we'll dump all our petty concerns into a Palm Pilot (or something like it) and, free at last to pursue poetry, philosophy and higher math, get down to the serious business of thinking. The computer will alert us when it's time to address one of the petty concerns, and humankind will evolve into a nobler version of itself.

OK, maybe that's optimistic. But here's the sad truth about the matter: whatever it is that will arrive in the coming generations is not really worth fearing, because it will be cloaked in normalcy soon enough. And whatever it is that will be lost in the coming generations is not really worth lamenting, because shortly no one will miss it, and one may as well not be too precious about something doomed.

The anthropologists know how to put it into perspective.

"We're in a time of transition," says Chuck Darrah. "We don't know how it's going to shake out yet. How long does it take for things to change? The simple answer to that is 'however long it takes the last generation to die out.'"


From the January 6-12, 2000 issue of Metro, Silicon Valley's Weekly Newspaper.

Copyright © 1999 Metro Publishing Inc. Metroactive is affiliated with the Boulevards Network.

