Our tools shape us
April 12, 2013 | Posted by Cameron Shelley in STV202 | Comments closed
Marshall McLuhan once said that, “We shape our tools and thereafter our tools shape us.” The basic idea expressed here is that technology does not merely help its users to alter the world to their liking, but it can also change the users themselves in the process. (Perhaps he was following Winston Churchill who remarked, “We shape our buildings, and afterwards our buildings shape us.”) In short, technology is more than just a tool.
An interesting little piece from New Scientist provides an illustration of this observation. In 2010, an expedition from the National Museums of Kenya discovered a third metacarpal bone from a 1.4-million-year-old Homo erectus. This may not sound exciting, but such bones are rare in the fossil record, and this one was particularly informative:
Like modern human metacarpals, it has a small lump at its base – the styloid. This projection helps stabilise the wrist when the hand is gripping small objects between the thumb and fingers.
This feature of the bone suggests that the hands of Homo erectus were adapted for the manipulation of small objects, stone tools in particular. This feature stands in contrast to the hands of previous hominids, which were adapted to gripping branches.
Since the Acheulean stone tool kit appeared 1.7 million years ago, the implication is that the modern human hand evolved to make stone tools. How about that? Just as our tools are our handiwork, our hands are the handiwork of our tools, so to speak!
(Cameron Diaz courtesy of Tony Shek/Wikimedia commons)
Smart phones are emasculating
February 28, 2013 | Posted by Cameron Shelley in STV302 | Comments closed
Wired provides this peculiar report from the 2013 TED conference. During his 10-minute talk, Google co-founder Sergey Brin explained to the crowd that smart phones are emasculating:
“You’re actually socially isolating yourself with your phone,” Brin told the audience. “I feel like it’s kind of emasculating…. You’re standing there just rubbing this featureless piece of glass.”
The point seems to be that smart phones require people to adapt their movements, and other aspects of their lives, to their phones.
(Steve Jurvetson/Wikimedia commons)
Of course, it is not clear why this point should be considered degrading in itself. New technologies typically invite or require users to change the way they act. This blog is full of examples. A car requires the driver to change the way they get around, but the effect can be quite liberating (sometimes).
However, it may be that Brin had something more specific in mind:
“Is this the way you’re meant to interact with other people? It’s kind of emasculating. Is this what you’re meant to do with your body?” he asked.
According to the TED blog, Brin said: “I have a nervous tic. The cell phone is a nervous habit. If I smoked, I’d probably smoke instead; it’d look cooler. But I whip this out and look as if I have something important to do. It really opened my eyes to how much of my life I spent secluding myself away in email.”
Here, the point seems to be that smart phones prevent users from living in reality, or in the moment. Their use becomes second nature.
Again, this is a feature of much technology. Cars, to return to that example, divide their occupants from the space around them, and their use can help people to feel cool. Is that emasculating?
Brin then went on to contrast the smart phone with Google’s new project, Google Glass. These are spectacles that present an augmented reality to the wearer, projecting information from Google on top of the user’s view. On Brin’s account, having information presented to you, anticipated for you by Google, is empowering, not emasculating. People, apparently, are meant to use high-tech glasses!
To be frank, I still do not get it. Users of Google Glass will be just as reliant on Google for their augmented reality as smart phone users are reliant on their apps. Google Glass may well divide people’s attention away from reality and become second nature just as much as smart phones currently do. Perhaps more so. Because the effect of Google Glass on its users is more private, it will not be so visible to others. Is that what makes it more manly? Only if manliness is merely a matter of appearances.
The Punisher’s dashcam
February 27, 2013 | Posted by Cameron Shelley in STV202, STV302 | Comments closed
I recently posted an item about Russian dash cams. At the end, I wondered whether or not the presence of dash cams would affect the way that Russians drive.
Perhaps we have an answer already. Below is a video cutting together scenes from the dash cam of Alexei Volkov, a bus driver in Zelenograd near Moscow. Volkov has earned the name “The Punisher” because he relentlessly inflicts vigilante justice on people whom he considers poor drivers, that is, people who cut off his bus. His weapon? The bus.
Here is a typical event, as described in a recent piece in The Atlantic Cities:
Volkov performs some traffic maneuver that pisses off a nearby motorist. That motorist guns the engine to pull in front of Volkov and then slams on the brakes, because that’s always the smart thing to do in front of a 14-ton municipal transport vehicle. Much to the dismay of said driver, Volkov acts like he doesn’t see the stopped car and plows right into it, pushing it along the roadway like a bulldozer whose operator has fallen asleep.
Here are some examples of this process in action.
You can find more on Volkov’s YouTube Channel.
An interview with Volkov shows that he is unapologetic about his practice, and that his bosses are unconcerned, as long as Volkov does not lose any court cases. So far, he has a good track record, at least in that department.
So, this example illustrates a common theme on this blog, about how technology is not just a tool. The presence of a video camera does more than record already existing behaviors. It also can turn people into directors of their own shows, in which they become the stars by acting in potentially new ways.
Evolution in the human mouth
February 22, 2013 | Posted by Cameron Shelley in STV202, STV203 | Comments closed
A pair of new articles in Science suggests an interesting story about the evolution of bacteria that live in the human mouth (registration required). In brief:
Some microbes that had lurked at low levels in the mouths of hunter-gatherers bloomed on the sugary films coating the teeth of farmers who munched cereal grains. Eventually the cavity-causing Streptococcus mutans, for one, took root. It adapted to the sweet life, multiplying like a weed and edging out many other species of bacteria. That leaves the modern mouth a depauperate ecosystem, according to two new genetic studies.
One study analyzed the DNA of microbes found in the tartar of ancient human teeth. By analyzing the relative population sizes of different bacteria in these teeth, Alan Cooper and colleagues were able to identify the point in time at which the cavity-causing Streptococcus mutans became the dominant member of the human oral microbiota. Their study suggests that this event occurred around 10,000 years ago, when agriculture was developed. The starchy sugars present in grainy foods allowed S. mutans to expand its domain at the expense of other critters less well adapted to feed on the new nutriment.
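To make the idea of comparing "relative population sizes" concrete, here is a minimal sketch of the kind of calculation involved. The species counts below are invented for illustration only; they are not the study's data, and the function name is hypothetical.

```python
# Hypothetical sketch: comparing the relative abundance of one species
# across samples. All counts here are invented for illustration.
def relative_abundance(counts):
    """Convert raw per-species counts to fractions of the whole sample."""
    total = sum(counts.values())
    return {species: n / total for species, n in counts.items()}

# Invented counts for a pre-agricultural and an agricultural-era sample.
hunter_gatherer = {"S. mutans": 20, "other species": 980}
farmer = {"S. mutans": 300, "other species": 700}

for label, sample in [("hunter-gatherer", hunter_gatherer), ("farmer", farmer)]:
    frac = relative_abundance(sample)["S. mutans"]
    print(f"{label}: S. mutans makes up {frac:.0%} of the sample")
```

The point of normalizing to fractions is that ancient samples differ in how much material survives, so only proportions, not raw counts, can be compared across teeth.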
Omar Cornejo et al. took a different tack, analyzing the distribution of DNA variants in modern mouths and arrived at a very similar result. It seems that the introduction of agricultural technology brought about fundamental changes not just in human nutrition but in our oral health.
The story continues with the development of refined sugars during the Industrial Revolution:
Modern samples and preliminary data from the mid-19th century show an oral environment even more dominated by S. mutans. Cooper concludes that the change occurred with the Industrial Revolution, about 1850 in England, when cavities also increased and refined sugars entered the diet. “Sugar and flour caused everything to go berserk,” he says.
This story amplifies earlier work showing how the development of food technology affects the human form, such as dental occlusion. Modern, processed foods tend to be soft, requiring less chewing and, apparently, leaving our jaws under-developed.
One result of this change in diet has been the development of dentistry and dental technology. All the cavities and other problems created by soft, sugary foods result in a need for special oral hygiene and the technology to provide it. So, next time you pick up your toothbrush, spare a thought for your predecessors, who planted the seeds of modern oral hygiene all those years ago.
(Henrik Abelsson/Wikimedia commons)
The technology of glamour
January 18, 2013 | Posted by Cameron Shelley in STV202 | Comments closed
Cameron Russell provides a brief and thought-provoking TEDx talk on her life as a fashion model. More specifically, she discusses how her photogenic looks were the result of a “genetic lottery” that won her a successful career but also issues with personal insecurity. More broadly, she discusses how our collective fascination with image and appearances tends to work to the benefit of beautiful people and to the detriment of others.
The talk certainly provides issues to think about. One issue that struck me is the role of image technology in the matter. Russell does not say so explicitly, but she appears to blame people (and their evolutionary past) for their obsession with image. I would agree that people are “wired” to attend to appearances. However, you might still wonder whether or not the technology for producing images makes a difference as well.
On this point, Russell displays and comments on pairs of photos of herself. In each pair, one photo is from a fashion shoot, emphasizing her nubile qualities, whereas the other photo is a contemporary, casual picture of her, emphasizing her engagement in normal, youthful activities. The impression left by each photo is quite different. This difference suggests that the camera does not inherently glamourize its subject. (That could change. Cameras already have cosmetic features such as “red eye removal”. What enhancements might they have installed in future?)
However, the camera is not the only technology involved in this situation. As Russell points out, there is also the apparel, that is, clothing and shoes. Also, cosmetics are clearly present. Russell also notes that some models undergo surgery to alter their appearance. In future, they may undertake genetic therapy to improve their standing in the “genetic lottery”.
In addition, there is the image reproduction technology, that is, the glossy magazine, where fashion photos are printed for public consumption.
All these things are a manifestation of people’s deep concern with appearances. I wonder, though, if their collective presence does not also increase this concern. If so, then image technology is not just a tool that allows us to express our preoccupation with images but a device that amplifies it.
20 years of txting
December 3, 2012 | Posted by Cameron Shelley in STV202 | Comments closed
December 3rd marks the 20th anniversary of texting. On December 3rd, 1992, Neil Papworth sent the first message on his newly designed message transmission system. The message read simply, “Merry Christmas”:
Vodafone was having a Christmas party in a separate building, and Mr. Papworth, surrounded by colleagues, got down to work. He typed out the 14-character yuletide greeting to a company official at the party and hit “send.”
“I was a little bit nervous. I just wanted everything to work,” Mr. Papworth recalled. Word came back from the Christmas party: The text had landed.
Mr. Papworth now works at Tekelec in Montreal.
The article does a good job of reviewing the effects, mostly unexpected, that texting has had on the communication habits of people worldwide. For example:
“Texting has transformed youth,” says Richard Smith, director of the Centre for Digital Media in Vancouver. It is true in nations worldwide, he said. “It changed what it is to be a young person, under the thumb of your parents. Young people can be in constant touch with a whole range of people, unbeknownst to their parents – they’re texting under the table, behind their parents’ back, in their bedrooms. We might think that youth have always been crazy, but in fact, they’re much crazier, and text messaging is an enabler of that.”
I doubt that young people are any crazier than the youth of yesterday. However, texting may indeed provide more autonomy to them.
In the previous generation, youth in developed nations used phones to call each other ad nauseam, sometimes leading parents to provide separate phone lines to their children’s bedrooms so that the rest of the family could make and receive calls. Today, phone calls are considered rude or awkward, unless it is an emergency. Of course, this cultural change has affected not only youth but everyone. I have to admit that I am usually startled to get a phone call these days, unless it was previously arranged.
In any event, texting is an illustration of the observation that technology is more than just a tool. That is, sometimes a new technology does not merely facilitate people’s behaviors, it alters them profoundly.
(“Yes, she’s texting” by laszlo-photo/Wikimedia commons)
Who’s your daddy truck
August 22, 2012 | Posted by Cameron Shelley in STV203 | Comments closed
From ABC comes news of the truck that cruises the streets of New York City offering DNA paternity tests. The truck makes its way from one neighborhood to another, offering people in each the opportunity to hop on board, where the driver, Jared Rosenthal, can take photographs and cheek swabs. Rosenthal contacts his clients a few days later with the results.
The idea for the truck was Rosenthal’s own, born of necessity: he could not afford an office. However, clients seem to like this form of service, finding it low-key and approachable. Rosenthal also says that people find it more intimate than a clinic; they seem happy to chat with him and develop a more personal relationship than they would expect at a more conventional medical site. Given the highly personal nature of paternity testing, this need for intimacy seems appropriate.
Knowledge of genetic paternity brings trade-offs. Sometimes, the results are positive:
He recounted meeting an 18-year-old woman from another state who had contacted the man she believed to be her father living in New York. A DNA test at the truck proved it was true, bringing a broken family back together. “He began to form a relationship with this woman and it was great.” Rosenthal said. “They lost 18 years but they found each other.”
Sometimes, the results are negative:
Rosenthal brought up the story of one woman in her early 20s who came in for a test, only to find out that the people she believed to be her father and her three half-sisters were not related to her at all. In fact, the test revealed she was from an entirely different ethnic background. “When she found out her father wasn’t her biological father it totally rocked her identity to the core,” he said.
I am surprised that such a service is so viable. Was there always a substantial level of doubt about paternity that the “Who’s your Daddy?” truck has merely tapped into? Or, has the presence of such an easy and approachable service increased anxiety on the matter?
Daddy, what’s a remote control?
May 31, 2012 | Posted by Cameron Shelley in STV202 | Comments closed
A recent New York Times article discusses the history and imminent demise of the TV remote control. The article provides a nice and articulate overview of the effects of the TV remote on modern life with television. The remote was designed simply as a convenience, a less effortful version of getting up, walking over to the tube, and working the controls located there. As we all know, however, the remote proved to be more than just a tool; it changed the way that television worked.
One change effected by the remote was in how viewers attended to the content of television. The effort of having to get up, stroll to the TV, and work the controls there put people off doing the work at all. In short, viewers tended to sit passively as the program they were watching went on, was interrupted by commercials and announcements, and then resumed. By severely reducing the effort needed to change channels, the remote made viewers, or their thumbs at least, more active. If people got bored with a show, they could easily search for something more interesting. Instead of watching ads they did not like, viewers could find something more appealing on another station. To some extent, the remote allowed TV viewers to become curators of their own viewing experience, formerly the exclusive preserve of network producers.
The producers fought back, of course. For example, the presentation of TV shows changed in order to minimize the impact of the most boring parts, such as the credits at the end of a show:
Television began to change, rapidly and profoundly, as power shifted from corporate offices to increasingly fickle viewers. After a research team at NBC discovered that 25 percent of its audience changed channels when credits rolled, the network introduced the format known as “squeeze and tease” in 1999. Credits were consigned to a third of the screen, running simultaneously with promotional spots intended to keep the viewers hooked.
Nowadays, this form of presentation is standard.
This story is interesting in its own right. However, it also illustrates the broader theme of the effect of technology on attention. Nicholas Carr, for example, has argued that the Internet is scrambling people’s ability to pay attention by offering a feast of bite-sized chunks of information in hypertext form, allowing readers to skip quickly from one to the next. On his account, the upside is that we can form an acquaintance with information in many areas efficiently. The downside, however, is that we form a deep comprehension of few, if any, of those areas.
The history of the remote control tells us that this issue is not a new one. The channel surfing made possible by the remote also invites people to pay attention in an unsustained way. I am not aware that anyone, apart from TV executives, regretted this phenomenon, perhaps because TV literacy was never regarded as highly as book literacy, which the Internet is seen as threatening. Nevertheless, the fact that television has changed to address the problem suggests that it is not insurmountable. Although traditional book literacy may fade somewhat in importance, new and useful forms of literacy better adapted to the Internet age may yet be on the way.
Sleep on it
March 1, 2012 | Posted by Cameron Shelley in STV100, STV202 | Comments closed
The BBC provides an interesting item on recent sleep research. A new book by Craig Koslofsky, called “Evening’s empire”, puts forward an account of how sleeping patterns have changed significantly since the 17th Century.
(Carlo Naya/Wikimedia commons)
Before that time, a normal sleep pattern, in many parts of the world, was bimodal. That is, people tended to go to sleep an hour or two after dusk and sleep for about four hours. Then, they would wake for an hour or so before going back to sleep for another four-hour snooze. The period of wakefulness between sleep cycles could be used for different activities, such as reading, meditating on dreams, or having sex. References to such a bimodal sleep pattern from around the world suggest that this rhythm of sleep is a kind of natural or default condition for human beings.
However, this pattern began to change in Western Europe in the late 17th Century. Instead of enjoying a long period of rest after dark, people began to stay out, frequenting coffee houses, for example. It seems that people began to view the long break at night as unproductive, and sought out ways to use the hours of the night more efficiently.
Technological change seems to have had a hand in this development, explains Dr. Koslofsky. Clearly, one technology involved was the shipping trade. The increase in wealth in Western Europe through colonization and overseas trade generated more of what we might call disposable income. Coffee became cheaper and more widely available and, of course, could help keep a body awake after dark.
In addition, advances in lighting technology helped. Major cities began to light their streets at night, making going abroad after dark more feasible and acceptable:
In 1667, Paris became the first city in the world to light its streets, using wax candles in glass lamps. It was followed by Lille in the same year and Amsterdam two years later, where a much more efficient oil-powered lamp was developed.
Also implicit in this article’s discussion is that time-keeping technology may have had a role to play:
“People were becoming increasingly time-conscious and sensitive to efficiency, certainly before the 19th Century,” says Roger Ekirch. “But the industrial revolution intensified that attitude by leaps and bounds.”
Huygens invented the pendulum clock around that time, and personal timepieces, e.g., watches, also began to appear. Today, the spread of artificial lighting continues apace, as does the influence of time management technology in our lives.
Still, people are resilient and have adapted to the new regime with the eight-hour block sleep. However, it may be that problems people occasionally experience with sleep are brought about by tensions between our natural sleep cycle and the one that we have adopted:
This could be the root of a condition called sleep maintenance insomnia, where people wake during the night and have trouble getting back to sleep, [historian Roger Ekirch] suggests.
The condition first appears in literature at the end of the 19th Century, at the same time as accounts of segmented sleep disappear.
The story illustrates a number of themes regarding technology. For example, it illustrates that technology is not just a tool: the introduction of new lighting and timekeeping technologies did not simply help people to make their existing lifestyles more efficient, it changed how people lived, and slept. Also, it illustrates the trade-offs that people make in order to realize the advantages of new technologies. Lighting and timekeeping technologies have allowed people to create and exploit new opportunities for work and leisure. At the same time, we may have sacrificed some of the time for intimacy or contemplation that were afforded by the intermission in the ancient, bimodal sleep pattern.
Stop worrying and love your robot car
February 10, 2012 | Posted by Cameron Shelley in STV202 | Comments closed
Tom Vanderbilt, author of Traffic: Why we drive the way we do, has posted a piece on Wired responding to critics of autonomous or robot cars. Vanderbilt does not name the critics whom he rebuts, so it is difficult to tell if he represents them correctly and fairly. However, the points he makes seem sensible enough on their own merits.
(Alex Goy/Wikimedia Commons)
Two points, however, may present difficulties that do not receive adequate attention in his brief commentary. Let’s start with autonomy and privacy.
Vanderbilt notes that some people may object to robot cars because they would allow Big Brother, aka the government, to have a larger say in the behavior of your car. The point is, I gather, that some drivers enjoy illegal practices such as speeding, or drinking, texting, or watching TV while driving, and so on. A robot car would be programmed to obey all local rules and regulations, and would also be programmed to rat out occupants who break them. Of course, it might be worth pointing out that drinking, watching TV, and the like might not be considered offenses in a robot car.
In defense of Vanderbilt’s anonymous critics, robot cars will raise issues with driving and traffic that simply do not exist with human drivers. The problem of managing accidents is one that has already been considered in this blog. Another would be the potential for increasing complexity of traffic regulation. Consider speed limits. Presently, speed limits tend to be fairly generic, with major highways having a default limit of 110 km/h (here in Ontario), county highways 80 or 90 km/h, and city streets 50 km/h. One could imagine matching speed limits much more precisely to local conditions. A single stretch of road could have dozens of speed limits at different points, depending on how straight it is, how narrow, how far from housing or schools, and so on. And then there are weather conditions to think about. The possibilities for regulating autonomous cars may be nearly boundless, raising many new issues for governments and citizens to consider. My point is not that we should ban robot cars. It is just that a robot car is more than just a tool; it cannot be added to the existing traffic system without affecting that system in return. A similar story will apply to considerations of privacy.
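To see how quickly such fine-grained regulation could multiply, here is a toy sketch of a rule that resolves a speed limit from local conditions. Every threshold, adjustment, and name below is invented for illustration; no real jurisdiction's rules are being described.

```python
# Hypothetical sketch: a robot car resolving a speed limit from local
# conditions instead of a single generic posted value. All numbers and
# condition names are invented for illustration.
def local_speed_limit(base_kmh, near_school=False, narrow=False, raining=False):
    limit = base_kmh
    if near_school:
        limit = min(limit, 40)       # invented school-zone cap
    if narrow:
        limit -= 10                  # invented penalty for a narrow road
    if raining:
        limit = round(limit * 0.8)   # invented wet-weather adjustment
    return max(limit, 10)            # never drop below a floor value

# Even this crude rule yields different limits along one stretch of road:
print(local_speed_limit(50))                             # plain city street
print(local_speed_limit(50, near_school=True))           # passing a school
print(local_speed_limit(50, narrow=True, raining=True))  # narrow and wet
```

With only three boolean conditions there are already eight possible limits per base value; add curvature, time of day, and traffic density, and the regulatory space becomes exactly the "nearly boundless" one described above.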
Another point that Vanderbilt raises is whether autonomous cars will increase or decrease the amount of driving that occurs. Vanderbilt is skeptical that Jevons’ Paradox will apply. This is the argument that, since robot cars will likely be more fuel-efficient than regular cars, people will consume their savings by making more trips and at longer distances. As he implies, there is only so much time in the day, and people have other things they would likely prefer to do than sit in a car. So, this factor should limit any effect from the Paradox. However, as Vanderbilt concedes, the situation is not so simple. With the car driving itself, the driver’s attention is freed up for other tasks, such as sending and reading emails:
The utility of the commute could theoretically improve as people once stuck driving the car can now fire off e-mails with abandon. Then again, this increased utility might lead to more people taking advantage of the utility, thus leading to more traffic and more time spent in gridlock. At which point you might long for that other, essentially “self-driving” vehicle: the train.
In other words, the autonomous car will probably increase the productivity of each car trip for its occupants. Thus, the effective cost of the trip will decrease. It will be as if, along the lines of Jevons’ argument, you had added more time to the length of the day. In that case, people might well be willing to consume that “extra” time, while their car drives them places. In that event, fuel consumption could actually rise as a result of the introduction of robot cars.
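The arithmetic behind this rebound worry is simple enough to work through. The numbers below (annual distance, efficiency gain, rebound in driving) are invented for illustration; the point is only that a large efficiency gain can be more than offset by a modest rise in driving.

```python
# A toy rebound-effect calculation in the spirit of Jevons' argument.
# All figures are invented for illustration.
def total_fuel(km_per_year, litres_per_100km):
    """Annual fuel consumption in litres."""
    return km_per_year * litres_per_100km / 100

before = total_fuel(km_per_year=15_000, litres_per_100km=8.0)

# Suppose the robot car is 25% more fuel-efficient...
efficient = 8.0 * 0.75  # 6.0 L/100 km

# ...but freed-up attention makes each trip "cheaper" to the occupant,
# so annual driving rises by 40%.
after = total_fuel(km_per_year=15_000 * 1.4, litres_per_100km=efficient)

print(before, after)  # 1200.0 vs 1260.0: consumption rises despite efficiency
```

In this toy case, a 25% efficiency gain is swallowed by a 40% rise in driving, which is exactly the possibility raised above: fuel consumption could rise as a result of introducing robot cars.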
As Vanderbilt says, neither of these observations provides a compelling reason to ban or abandon autonomous cars. However, neither should we think of the introduction of such cars as being just like the introduction of a new model year in an existing type.