Some people believe that technology creates cultural convergence: that using similar technologies leads to cultural homogenization. It’s a deterministic view, and it often comes up in discussions of technology and globalization. However, to continue the theme I started last week, we could easily draw the Olympics into this discussion. After all, it is a global event with over 200 countries participating, and the signs of globalization are omnipresent in the logos of a select few multinational sponsors. Moreover, I hope you’re not visiting the Games with a hankering for french fries — at least not the non-McDonald’s kind:
To sell fish and chips, the London organising committee (Locog) had to get a special dispensation from McDonald’s, the official restaurant sponsor, which is expected to provide 10% of meals served at the Games. Under its deal with the International Olympic Committee, the fast-food chain had the sole rights to sell chips or french fries. It allows Locog’s caterers to sell fish and chips, but not chips on their own.
The Olympic menu sounds pretty bland, given the hundreds of cultures that could have been represented. But even if there were hundreds of different cuisines available at the Games, that might not matter. As David Nye points out in Technology Matters, such a choice might be illusory. Consider the microcosm of a multi-ethnic food court at the mall, with a variety of different meal options, including curries, kebabs, pizza, and Timbits:
On the level of the technological systems used to produce and deliver the food, most differences evaporate. The food court’s businesses all use the same kinds of freezer, steam trays, fryers, and microwave ovens. They prepare dishes suited to the demands of a cafeteria, an assembly-line operation that functions best when food does not require much on-site preparation before it is served. (p. 83)
Somehow, that passage reminds me of Olympic athletes: all using the same gear, the same training programs, the regimented and biomechanical pursuit of efficiency and maximization of the human body.
And yet, I’m not worried about cultural convergence. Not after learning that the American television broadcaster NBC decided to cut a segment from its own coverage of the Opening Ceremonies last Friday. A somber tribute to “terrorism victims” and others who could not join in, depicting the struggle between life and death, was replaced by NBC with an interview of a star American athlete because:
“Our program is tailored for the U.S. television audience,” said NBC Sports spokesman Greg Hughes. “It’s a credit to [ceremony director] Danny Boyle that it required so little editing.”
So I guess the ceremony just wasn’t American enough. Huh. I don’t think it was Canadian enough either, because I don’t recall seeing that bit on the CTV broadcast in Canada. I also missed Lord Voldemort and dozens of flying Mary Poppins. Too British for our post-colonial tastes?
Finally, if we’re going to poke at NBC’s lack of global awareness, or perhaps just awareness in general, I was a bit gobsmacked by NBC’s reaction to the mid-ceremony appearance of Sir Tim Berners-Lee, who was sitting at a computer inside a giant house. Maybe not everyone has to remember the name of the very famous British man responsible for the creation and growth of the World Wide Web, but I’ll let Twitter take it from here:
“If you haven’t heard of him, we haven’t either.” NBC Olympic anchor on Tim Berners-Lee. Co-anchor: “Google him”. Breathtaking.
— The Firm (@TheFirmOnline) July 28, 2012
Turkle at TED: Alone together
April 4, 2012 | Posted by Cameron Shelley in STV202, STV302
There are many more ideas presented in the talk than I can comment on here at any length. Instead, I will just make a few simple points.
First, Turkle’s concerns can be framed in a form common to many critiques of modern technology:
- There is a frailty in human nature
- Technology has a powerful way of exploiting this frailty
- Therefore, technology has the potential to worsen the human condition
Clearly, this schema is simplistic, but I hope it captures the gist of the argument. In this case, Turkle argues that we have a vulnerability: the fear of being isolated or alone. Information technology is able to provide us with constant connections to each other, or to some facsimile of connection. Therefore, people may become attached to the connection that technology offers, at the expense of losing touch with themselves. The problem with this form of argument is that it sets aside the strengths resident in human nature. So, advocates for technology may respond that people are resilient enough, on the whole, to enjoy the benefits of instant and persistent connectivity without becoming slaves to it. True? How so, or why not?
Second, Turkle concentrates on information technology as a means for people to avoid their fear of loneliness. Besides this sort of “push” mechanism, there may also be a “pull” mechanism at work. Here, I am reminded of some recent remarks by the novelist Jonathan Franzen on why it is that people love their high-tech gear:
Let me toss out the idea that, as our markets discover and respond to what consumers most want, our technology has become extremely adept at creating products that correspond to our fantasy ideal of an erotic relationship, in which the beloved object asks for nothing and gives everything, instantly, and makes us feel all powerful, and doesn’t throw terrible scenes when it’s replaced by an even sexier object and is consigned to a drawer.
In short, people like their gadgets because they make people feel awesome and powerful without making any demands on them. This observation brings us to the same conclusion as Turkle’s, but for different reasons. Besides helping us to avoid our fears of rejection, our gadgets help us to gratify our erotic longing to reject others without consequence. True?
Finally, Turkle makes use of the metaphor of technology as an active agent. She refers to technology as if it were a person with particular wants or proclivities. The idea that technology, rather than being inert, has a kind of agency is a feature of actor-network theory, among other approaches to understanding technology and society. However, the view that technology has a kind of “mind of its own” seems like a form of determinism. While useful for some purposes, anthropomorphizing technology can distort our ideas about how to manage or control it. (Is it our friend or our foe? If the latter, could we declare “war” on technology to solve the problem? What would we use to fight the war?) In reality, technology is part of the human condition, not an independent actor co-starring with us on the world stage. Ultimately, I think we have to decide not what we want from technology but what we want from ourselves. Our technology policies would then be guided by that decision.
Nuclear determinism
March 16, 2012 | Posted by Cameron Shelley in STV100
The Economist magazine has a special report on nuclear energy. The report is well worth a read and is very timely, coming on the anniversary of the disaster at the Fukushima Daiichi nuclear power plant.
The main thrust of the report is that civilian nuclear power has never realized the promise that it seemed to offer. Uranium fission reactors were first designed and built for a variety of reasons, including to exploit experience with nuclear powered submarines, to reconcile people to the development of nuclear weapons, and to begin what was imagined to be a sure progression to fusion power.
On that last point, consider the famous statement made by Lewis Strauss, then chairman of the US Atomic Energy Commission, in 1954 that atomic energy would soon bring Americans electricity at virtually no cost:
Our children will enjoy in their homes electrical energy too cheap to meter.
Strauss was actually referring not to uranium fission energy but to secret research on hydrogen fusion, the obvious final stage in the progression of atomic energy development.
Of course, despite decades and billions of dollars of research, it remains true that fusion is the power of the future, and always will be. For better or worse, the same seems to be essentially true of fission power. In spite of all the research and experience with fission energy, and continued enthusiasm for it in some quarters, fission power cannot compete in the marketplace with alternative energy sources. Despite hefty subsidies, fission power has remained stubbornly expensive and has not displaced much in the way of energy generated through fossil fuels. Also, while the cost of fission power remains high, the cost of energy from renewable sources continues to drop.
One of the lessons of this story is that the development of a technology over time is not determined by the intrinsic qualities of that technology:
… if nuclear power teaches one lesson, it is to doubt all stories of technological determinism. It is not the essential nature of a technology that matters but its capacity to fit into the social, political and economic conditions of the day.
Technological determinism is, roughly, the notion that the progress of a technology is dominated by its intrinsic nature. The history of nuclear energy demonstrates that the development of a technology may be dominated instead by social, political, and economic factors. This fact may be disappointing, and regrettable in some cases, but it cannot be ignored.
“Unstoppable, autonomous and out of control.”
October 3, 2011 | Posted by Scott Campbell in STV100
The post title sounds a bit like the tag-line to a Hollywood blockbuster about talking robots intent on taking over the Earth, or maybe a ragtag group of misfits intent on taking over the Earth.
What I was trying to summarize is the deterministic feeling some people have about technology in general: that it is an inescapable and certain force that shapes human fate and social circumstances. It is often linked to the idea that technological change, under the banner of “progress,” is inevitable.
Writing on the Atlantic Monthly’s website, Alexis Madrigal observed this same kind of technological determinism in the acceptance or rejection of Facebook’s new “frictionless sharing” feature:
…where applications will be allowed to post about activities, like what news articles you’ve read, or what music you are listening to, without your explicitly deciding to share or “like” that bit of content (definition via http://www.informationweek.com/thebrainyard/news/social_networking_consumer/231700022).
At least one pundit feels this new and highly controversial feature is here to stay (“Why Facebook’s frictionless sharing is the future“), but Madrigal points out that not everyone agrees and some online services have declined to participate. In his words:
What’s important here, I think, is that Facebook is trying to push the idea that their version of ‘frictionless sharing’ is some kind of inevitable technological development about which people have no choice. “It’s like resisting cars, boyo!” But the idea that technologies run these independent paths with no intermediation from humans is far from established fact. People shape technologies as much as technologies shape people. When’s the last time you heard about supersonic flight? That was supposed to be the next big thing! But it had some problems and people said, “No, thanks.”
Indeed! Societies do reject technologies, and Madrigal is referring to the very same example we discuss in STV100: passenger supersonic transport planes. Though the French and British were able to cooperate for once and build the Concorde, the Americans never got that far, largely because the public rejected the technology outright. Not just because such a plane hopping from New York to LA or Chicago would be a noisy nuisance, but because it represented the inevitability of the government-military-academic-industrial complex of the mid- to late 20th century. It wasn’t just “No, thanks” to the notion of a supersonic plane, or that the technology was held up by some technical issues that couldn’t be resolved (after all, the Concorde flew for decades with a very good record), but instead it was “No, thanks” to the entire idea that expensive, blindly-funded and military-derived technology was unstoppable or out of control. Thus, no American SST.
Societies can and do reject technologies all the time. Even massive technological systems, which seem incredibly hard to shift and may persist for decades, will eventually decline. To borrow from Thomas Hughes, for every Charles Darwin who is there to explain the rise and evolution of a technological system, there must also be an Edward Gibbon to chronicle the decline and fall. In the late 1990s, who could have predicted Microsoft would fall beneath Apple’s shadow? I suspect that Facebook will eventually meet a similar fate. More worrisome to many, and without blaming this directly on Facebook or even technology, is how much our concept of privacy will change in the meantime.
Government accountability and IT
July 21, 2011 | Posted by Cameron Shelley in STV302
Can the government be made more accountable by consolidation of its IT infrastructure? “Yes,” argues Deborah Moores of the Globe and Mail. One of the obstacles to obtaining government data (assuming there are legitimate grounds for distributing it) is the very fact that governments often have multiple computing platforms. Those platforms may be incompatible or simply poorly connected, so that it is difficult to assemble relevant information when needed. So, a plausible solution would be to impose a single computing setup within the government.
(Image courtesy of Sub619 via Wikimedia Commons.)
Given the record of government IT projects, as revealed in several IEEE blogs, for example, this suggestion should be greeted with some scepticism. It is not that government computer installations should not be more efficient; it is that the complexity and cost of the task are so often underestimated.
Furthermore, efficient IT systems do not necessarily translate into greater transparency. The Chinese government has a pretty good system, which has not made it any more accountable. There is no deterministic relationship between IT and freedom of information, in spite of the early optimism among Internet developers that “information wants to be free“. Instead, transparency is a result of an administrative culture, the habits (and laws, of course) through which information is dispensed by those in office.
Instead, I would suggest that we seek a mandate from the government to be more transparent. Although the Harper government campaigned in the past on a platform of transparency, its record has been one of obscurantism and unaccountability, as Moores notes. If transparency is to be realized, let us aim for that goal directly, instead of hoping that it will follow as a side benefit of an IT project.