About that pendulum...
This entry was originally posted on 8 October 2003 at 1:58 p.m.
The irrepressible Scott posted this in my guestbook:
"It probably happened a long time ago, in some transition that was so gradual that people barely noticed which way the pendulum was beginning to swing." I think maybe you've nailed your own question. Perhaps using a swinging pendulum as a model for human relations was the thing that kicked off our use of machine metaphor in human life?
I think he might be right.
Metaphors are important. They allow us to talk about abstract concepts in concrete ways without spending too much time on capturing the unique essence of a given situation¹. And what is more concrete than mechanical or physical processes? Okay, so they're not always tangible, but they are immutable: everyone who has seen a pendulum in action knows exactly how it acts and what its motion looks like.
So this might explain why we so easily adapt things like computers--whose odd behaviors and errors are already quantified and qualified, and whose workings and interfaces are common to many people worldwide--for use in metaphor. Because of this, a phrase like "he doesn't multitask well" makes perfect sense².
But the fact still remains: we also use human behavior as a metaphor for computer behavior. For example, one might say that a certain CD-ROM drive is "finicky" or that a certain program "doesn't like" some action (presumably, one that produces the kind of error that causes the machine to crash). It isn't uncommon for someone to complain that his or her computer is stubborn, pissy, or otherwise displays some commonly human behavior or emotion.
It isn't unusual for humans to anthropomorphize inanimate objects. We've been talking about cars in human terms for decades now--but computers are different. We interact with them in more abstract and complex ways than we do with cars. Driving is complex, but we don't expect cars to learn from their interactions with us the way we expect Microsoft Word to automatically turn certain features on or off based on our habitual actions. The entire concept of artificial intelligence demonstrates that we expect something more of a computer than we do of a car--and that we're more willing to place responsibility for error on a computer than we would for a car (unless some component of the car fails or is defective).
And so the question: given the amount of crossover between human and computer metaphors, given the context in which we engage computers, and given our drive to make computers emulate humanity, will we someday begin to emulate computers? Or have we already?
Footnotes:
1. That uniqueness is suspect anyway; I have this theory that our experiences largely fall into classes, or perhaps resemble recipes composed of interchangeable subjective qualia--but that's another entry.
2. The link leads to Dictionary.com's definition of the word "multitask"--which comes straight from computers. Apropos, farther down the page, it mentions how this is often used in a human context.