The Titanic: too conservative or too experimental?
April 13, 2012 · Posted by Cameron Shelley in STV202 · comments closed
As we near the 100th anniversary of the Titanic disaster, many commentators are reviewing its causes and implications. An exhaustive list of commentaries is not within my purview, but instructive items can be found at NPR, IEEE Spectrum, and Edward Tenner’s website.
One item worth adding comes from Paul Louden-Brown’s discussion of Titanic’s design. Louden-Brown notes that the design stayed with the “tried and true”:
Titanic … adopted tried and trusted methods for her design and construction. No risks were taken with the choice of engines which were enlarged versions of the propulsion system first used experimentally in Laurentic, another White Star liner, in 1909. The triple screw vessel had proved that two expansion engines feeding exhaust steam into a low pressure turbine were more economical than vessels using expansion engines or turbines alone.
Actually, this statement strikes me as containing a curious contradiction. No risks were taken, yet Titanic’s engines were significantly larger than previous ones. In fact, Titanic as a whole was quite a bit larger than previous vessels run by White Star (even the Laurentic). A significant change of scale brings with it new risks, as bigger versions of old designs do not necessarily work in the same way.
Louden-Brown acknowledges that the issue of scale applied to Titanic, and to her maneuvering system in particular:
Titanic’s hull and upper works were also enlarged versions of designs refined over several decades. Her stern, with its high graceful counter and long thin rudder, was an exact copy of an 18th-century sailing ship, wrought in steel, a perfect example of the lack of technical development. … No account was made for advances in scale and little thought was given to how a ship, 852 feet in length, might turn in an emergency or avoid collision with an iceberg.
In fact, the rudder design, while adequate at a smaller scale and for lower speeds, left Titanic somewhat sluggish at changing direction in the water. This point becomes clearer as Louden-Brown contrasts Titanic’s design with that of ships designed by a rival company, Cunard:
They were built principally from lessons learnt from advances in warship construction, but most importantly both were powered by steam turbines driving quadruple screws, each fitted with a large balanced rudder, making them faster than the competition and easier to manoeuvre.
Compared with the rudder design of the Cunarders, Titanic’s was a fraction of the size.
The implication is clear: even had Titanic’s crew spotted the iceberg sooner than they did, the ship might have been unable to maneuver sufficiently to avoid a devastating impact.
This point brings me back to the contradiction I noted above. Louden-Brown characterizes Titanic’s design as conservative. This implies that her design followed the strategy prescribed by the precautionary principle, according to which new technological risks are to be avoided until proven acceptably safe. Insofar as Titanic’s design resembled that of previous ships, it seems like a cautious one. However, the significant increase in scale cannot be considered a cautious step. The issue becomes clear when White Star’s approach is contrasted with Cunard’s: Cunard had learned from experience with warship design that simply scaling up the rudder from smaller vessels was not adequate and, indeed, introduced novel risks. Thus, merely scaling up previous designs should be viewed as a permissive approach rather than a precautionary one.
How, then, can we explain the seeming confusion in which Titanic’s design counts as both cautious and novel? I think confusion is the key term here. To say that a significant increase in scale is conservative is to confuse conservatism with simplicity. It is conceptually simple to scale up a design based on existing models: make the engines bigger, make the rudder bigger, keep everything in proportion. Unfortunately, simple measures are not necessarily conservative ones, although they might seem to be. An elephant would not “work” if scaled down to the size of a mouse, nor would a mouse “work” if scaled up to the size of an elephant (see the “square-cube law”). In the final analysis, it would appear that the design of Titanic’s steering system should be considered an experimental rather than a tried-and-true feature.
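The square-cube law behind this point can be made concrete with a bit of arithmetic. The sketch below is purely illustrative (generic numbers, not Titanic’s actual dimensions): scaling a design up by a linear factor k multiplies its areas (structural cross-sections, rudder surface) by k², but its volumes (mass, displacement) by k³, so the area-to-mass ratio falls off as 1/k.

```python
# Square-cube law: scale a design by a linear factor k.
# Areas (e.g., rudder surface, strength of structural members) grow as k**2;
# volumes (e.g., mass, displacement) grow as k**3.

def scaled_properties(k, area=1.0, volume=1.0):
    """Return (area, volume, area-to-volume ratio) after scaling by k."""
    new_area = area * k**2
    new_volume = volume * k**3
    return new_area, new_volume, new_area / new_volume

for k in (1, 2, 4):
    a, v, ratio = scaled_properties(k)
    print(f"scale x{k}: area x{a:.0f}, volume x{v:.0f}, ratio {ratio:.2f}")
    # ratio falls from 1.00 to 0.50 to 0.25 as k doubles twice
```

So a ship built to quadruple scale has sixteen times the rudder area pushing against sixty-four times the mass: keeping everything “in proportion” quietly cuts the steering leverage to a quarter.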
uwaterloo.xxx
November 21, 2011 · Posted by Cameron Shelley in STV302 · comments closed
The site St. Louis Today has an article about universities that have been buying up .xxx domain names. The .xxx domain is intended to provide a special space on the Web for pornography services. (Is that truly necessary? By all accounts, the ’net is already quite efficient at delivering smut.) I hesitate to think of some of the domain names that will appear there, but washingtonuniversity.xxx was not one I anticipated. Is the University becoming more broad-minded about its services?
Not so, as it turns out. Washington University is reserving the domain name in order to prevent it from becoming associated with any pornographer in the future. The concern is not unfounded; there is a pornographic connection with the name Washu:
But the school does share a name with a female character, Washu Hakubi, from the world of Japanese animated cartoons. The anime genre has inspired a subset of cartoons heavy on sex and violence, leaving open the possibility that Washington University could find itself an accidental victim.
So, the University is taking pre-emptive action against the website “WashU.xxx”. From the article, it seems that many universities are doing the same thing; the cost is only $200 per name, so they feel that they would rather be safe than sorry. It seems that the precautionary principle applies.
This matter raises a couple of questions. First: Is the University of Waterloo about to open up a .xxx site? A quick Web search fails to turn anything up, suggesting that it is not. At least, not yet. Second: if it did, what would be on it? The WashU site will point to nowhere, the article states. I guess that we could expect something about equally stimulating from Waterloo, which has always resisted the use of suggestive imagery.
Lasers arrive too slowly for Marines in Iraq
February 17, 2011 · Posted by Cameron Shelley in STV202 · comments closed
Here is an interesting piece from Wired about how bureaucracy delayed the arrival of a laser-based security system destined for Marines in Iraq. The system in question is a non-lethal laser gun designed to dazzle car drivers by shining an intense light in their eyes. The expected result is that dazzled drivers slow down and stop or else lose control of their cars. The idea is that the system can be used at military checkpoints to give soldiers an alternative to firing bullets at cars that fail to slow as they approach.
The Marine Corps made an urgent request for the technology in 2005. As the article explains, Iraqi driving habits have long featured weaving and speeding. As a result, drivers would unwittingly speed towards military checkpoints in a manner resembling that of a suicide bomber. Marines at the checkpoints would flash their headlights and fire warning shots, but not always to any effect. In the end, the Marines would fire on the drivers to protect themselves, resulting in a number of innocent casualties. Having the laser guns on hand would give the soldiers a way of halting the cars without killing the drivers.
The September 2005 request was supposed to be resolved within six months; in fact, it was not even addressed until six months had passed. The delay was caused, in part, by a dispute over which system to purchase. The Marine force in Iraq also tried to purchase the dazzlers directly, circumventing the bureaucracy at Development Command; only after Development Command had stopped that purchase did it proceed with its own. In the meantime, according to a report prepared for the Corps brass, an estimated 50 innocent Iraqis were killed at checkpoints who might merely have been incapacitated had the dazzlers been available.
This incident is, among other things, an illustration of what R. G. Little calls institutional bias. Institutions and organizations can have a kind of built-in bias towards precaution or permissiveness in their adoption of designs or technologies, and for various reasons that bias is not always healthy. Famously, the Space Shuttle program had a permissive bias, which contributed to the Challenger disaster. In the current example, the Marine Corps could be accused of having an overly precautionary bias.
Of course, it is easy to dump on bureaucracies for being slow-moving and self-serving. However, we should bear in mind that precaution has its uses. A permissive system allows for the rapid acquisition of both useful and useless or even harmful technologies. Imagine, for instance, that the Corps urgently acquired a bomb detector (cf. this story about dogs vs high-tech detectors) that turned out to be defective. Probably, many soldiers and civilians would be harmed before the detector was abandoned. In that event, the bureaucracy might be blamed for not being slow-moving enough.
I suppose the moral is that we should certainly keep watch on the biases of our institutions to ensure that they fit the circumstances those institutions are in. To do so, however, we have to consider and balance the errors that come with both permissive and precautionary approaches to progress.
Bee careful?
December 15, 2010 · Posted by Cameron Shelley in STV202 · comments closed
FastCompany has run a couple of articles on the use of clothianidin as a pesticide on corn in the US. The pesticide may well be highly toxic to honey bees, which are vitally important both to the honey industry and as pollinators for the corn plants themselves (and other plants too, of course).
(Image courtesy of WWalas via Wikimedia Commons.)
A leaked document (not Wikileaks though) suggests that tests supporting the EPA’s decision to allow the pesticide were poorly designed and thus inconclusive as to the safety of clothianidin in the presence of honey bees. Yet, the EPA continues to allow the use of clothianidin. Several other countries, e.g., France, have already banned the pesticide out of fear of its effects on bee populations, and bee populations in the US have declined precipitously in recent years.
One of the topics we often discuss in my class is how to proceed in the introduction of designs or design elements in the face of uncertainties of this type. Do we proceed with innovations unless and until harm is proven, or do we adopt a precautionary stance and hold off unless and until safety is assured? In a more recent entry, Ariel Schwartz at FastCompany recommends the precautionary strategy:
No one can say for sure that neonicotinoids alone are causing bees to die off–many more studies have to be done. But the EPA would do well to err on the side of caution for the beekeepers who are rapidly losing their bees. Tom Theobald, for example, saw his smallest honey crop in 35 years of beekeeping, and Hackenberg claims that he has talked to beekeepers across the country who have lost up to 90% of their output this year.
Given our reliance on bees for crop pollination, and the seriousness of colony collapse disorder generally, this argument seems compelling. However, what are the risks to the agricultural industry if clothianidin is banned? What would you do if you were in charge at the EPA?
Institutional bias at BP
June 17, 2010 · Posted by Cameron Shelley in STV202 · comments closed
The Deepwater Horizon oil spill is much in the news and on everyone’s mind these days, quite naturally. It appears that this spill is not the first troubling incident to occur at a BP wellhead in the Gulf (and perhaps elsewhere) in recent years. A blog article on a site called The Oil Drum indicates that there have been a number of near misses.
One issue is the repeated reliance on a “Blow Out Preventer” or BOP. This device is essentially a set of shears poised at the connection between the well on the seafloor and the pipe that transfers oil up to the rig (see below).
(Image courtesy of The Oil Drum.)
The BOP is meant to be the “last line of defense”, in other words, the last in a series of mechanisms meant to prevent a spill. The blog article notes that the BOP has been activated a number of times on wells in the Gulf, thus preventing spills (until this year). This news should have been not comforting but alarming for the operators, as the poster notes:
In the 2003 spill, and in many similar cases, the fact that the blind-shear BOP functioned as intended is not a sign that the system worked, for a truly fail-safe system would be where the last line of defense from disaster is never reached.
It seems plausible to guess that the effectiveness of the BOP on previous occasions gave rise to some complacency instead of alarm about the overall safety of the drilling apparatus.
R. G. Little, among others, has noted how institutions can be biased in how they deal with risk. Some will tend towards a precautionary approach, others a permissive approach. The preferred approach can be reinforced by events over time.
The classic example is NASA, which, according to the Rogers Commission report, became complacent about the safety of Shuttle launches as greater and greater risk-taking continued to pay off in successful launches. Then came the Challenger disaster.
One of the roles of regulation is to arrest this sort of process. A rule can be made that establishes a certain minimal safety margin for operation of an oil rig, Space Shuttle, and so on. Of course, regulations bring problems of their own: They can become outmoded, ignored, or altered to suit inappropriate interests. In the end, we rely on the intelligence, good will and integrity of those who are operating these devices and overseeing their operations.
Apple bans stuff from App Store…again
June 15, 2010 · Posted by Cameron Shelley in STV202, STV302 · comments closed
Apple has attracted some controversy lately for banning apps with controversial content. The latest episode concerns an image of a woman, seen naked from the waist up, in a graphic-novel adaptation of James Joyce’s Ulysses.
These incidents make Apple seem prudish and uncool, if not downright censorious. Perhaps it is. However, it may simply be applying a precautionary strategy: err on the side of caution and thus avoid upsetting what it sees as a somewhat unadventurous target market. It can always change its mind if enough people make a fuss.
The alternative would be a permissive strategy: Let people put what they like in their apps and win plaudits from libertarians and the more avant-garde.
Both strategies are equally rational on their own merits. So why should Apple prefer one over the other? Evidently, Apple wants its designs to be perceived as comfortable and non-threatening and thus appealing to a broad audience. It makes sense, then, to be cautious and back down occasionally, instead of being permissive and risking being perceived as libertine. Would you want to be the person at Apple who approved an app that the public perceived as, say, an instance of child porn?
Volcanic ash: Deadly threat or minor menace?
April 22, 2010 · Posted by Cameron Shelley in STV202 · comments closed
Robert Charette at IEEE Spectrum points out that EU officials have characterized as flawed their decision to halt air traffic over Europe in the face of the ash cloud from the Eyjafjallajökull eruption. The computer models they relied on, say the officials, were based on “incomplete science” and incomplete data.
However, Charette points out, some volcanologists take the view that the halt in air travel was appropriate. So, the controversy continues.
(Photo courtesy of Makaristos, Wikimedia Commons)
This situation illustrates an age-old tension in decision making and design. When confronted with a potential risk, should we adopt a permissive view, that is, full steam ahead, or adopt a precautionary view, that is, better safe than sorry?
In his article on the Y2K issue, Farhad Manjoo notes that the precautionary approach was adopted. Billions were spent to update computer code prior to the arrival of the year 2000. And what happened? Almost nothing. The resulting non-catastrophe persuaded many observers that the whole project was a waste of resources. Of course, it could be that nothing happened precisely because the project was a success. In general, Manjoo argues, the precautionary approach tends to be self-undermining: its very successes make it seem less compelling.
The same could be said about precaution in the face of the volcanic ash cloud. No planes fell out of the sky, but a lot of money was lost! Of course, we will never know what would have happened had flights not been canceled but, I’ve heard it said, there’s nothing like a smoking hole in the ground to change people’s minds.