Link to the University of Waterloo home page

Link to the Centre for Society, Technology and Values home page


Drone News Network: 24 Nov. 2014 November 24, 2014

Posted by Cameron Shelley in: STV302

As the holiday season approaches, drones are set to play a greater role in Christmas celebrations:

So, when you hear a faint buzzing sound coming down your chimney this Christmas, it is just Santa’s drone delivering your presents, probably.

Genetic privacy and insurance November 20, 2014

Posted by Cameron Shelley in: Uncategorized

DOE Human Genome Project

In recent postings, I have noted that Canadian law does not protect the privacy of genetic information from insurers, and that insurers do require that information under certain circumstances. This situation can have unfortunate consequences.

Consider the case of Teresa Quick. This Toronto woman has a family history of cancer: both her mother and grandmother suffered from it. Teresa opted for genetic testing and was found to carry a BRCA1 mutation, predisposing her to breast (and ovarian) cancer. As a result, Teresa had a double mastectomy and applied for disability insurance that would help her to cover her mortgage in the event of a medical problem that prevented her from working. The application was denied due to her family history.

Teresa’s case could be considered a case of genetic discrimination. That is, Teresa’s insurance application was denied on the basis of genetic information, whereas a similar application without that information would have been accepted. Most western countries have laws prohibiting genetic discrimination, although Canada, as noted above, is an exception. Both the federal and provincial governments have proposed genetic non-discrimination legislation, but nothing has been passed.

This situation may change as the Canadian Senate is currently considering Bill S-201: An Act to prohibit and prevent genetic discrimination. The Bill would prohibit anyone from requiring a genetic test or disclosure of a genetic test result as a condition for a contract such as an insurance policy. The Bill has passed second reading and is currently before the Standing Senate Committee on Human Rights.

It is not clear to me that the Bill would alter what happened to Teresa Quick. The Bill defines a genetic test as follows:

“genetic test” means a test that analyzes DNA, RNA or chromosomes for purposes such as the prediction of disease or vertical transmission risks, or monitoring, diagnosis or prognosis.

Should a family history be considered a genetic test? Insurance applications routinely require disclosure of family history in order to establish “existing medical conditions” that insurers might wish not to cover. Of course, the point of a family history is to uncover any inherited conditions, which would usually be genetic (although epigenetic conditions might also be captured). If family history is not considered a genetic test, then the decision on Quick’s application would not be altered.

Canada needs laws regarding genetic privacy. However, lawmakers have to be clear on what it is they seek to regulate: Should it be specifically the technology of genetic testing, or should it be genetic information from whatever source?

Trust in Uber November 19, 2014

Posted by Cameron Shelley in: STV202, STV302

In case you do not know, Uber is a ride-sharing service in which car owners who want to make money giving people rides in their cars are connected with people who would like those rides. As such, it is a part of a wider market phenomenon known as the sharing economy in which people with excess capacity, such as cars they are not using for personal reasons, rent it out to those who would like to use it.

If Uber’s offering sounds to you much like a taxi service, then you are not alone. Although Uber has been widely successful, it is often opposed by city administrations and taxi companies as a source of unfair competition. Now it appears that Toronto can be added to this list of cities. City officials have sought a court order to prevent the service from operating there. Their argument is that since Uber operates like a taxi service, connecting riders with providers, it should be regulated like one. Uber disagrees.

Toronto’s incoming mayor, John Tory, is an Uber supporter:

John Tory, however, issued a statement late Tuesday saying Uber and services like it “are here to stay.”
“It is time our regulatory system got in line with evolving consumer demands in the 21st century,” it said. “As Mayor, I intend to see that it does, while being fair to all parties, respecting the law and public safety.”

This quotation is not abundantly clear but seems to appeal to some notion of technological determinism, according to which the arrival of new technologies is inevitable, so we should just get used to the idea. As readers of this blog know, new technologies do sometimes go away without becoming established. In any event, the second part of the Mayor-elect’s statement seems to vitiate the first.

Besides determinism, another reason not to treat Uber as a taxi service lies in the differences between the way Uber operates and the way traditional taxis do:

Prof. Gans [of UofT] is a frequent user of Uber’s service, saying the app allows him to pay using his smartphone and avoids the hassle of fumbling for change. As for safety, he says he ordered an Uber cab driver to pick up his 12-year-old son from school one day when he had lots of stuff to carry home. With the app, he knew who the driver would be and his son did not have to carry cash for the fare.

Payments for rides on Uber are handled by the service rather than the driver (and no tips!) and customers can accept or reject (and review) given drivers. The latter mechanism is a common means of building trust in a sharing-economy service. Clearly, Prof. Gans trusts Uber and enjoys that feature of the service.

Not everyone does, however. Like many online services, Uber compiles data about its users. In that way, the service can predict what kind of service a given user is likely to want. It also gives the service plenty of data about its users’ movements and activities. Uber’s privacy policy states that it does not sell personal data, but may share anonymized information for use by third parties such as advertisers. There are two caveats that apply to this assurance.

The first is that Uber seems to have violated this policy in the past. A story related by Peter Sims suggests that Uber has a “God view” that lets it follow individual rides in real time, as was exhibited at a party held by company executives in Chicago in 2011. From the story, it appears that Uber has built itself a “back door” in its service that allows it to share personal data for promotional purposes. Like most back doors, this feature is likely accessible to other interested parties on the Internet.

The second caveat is that anonymized data can sometimes be de-anonymized. As noted in a recent posting, anonymized taxi data released by the City of New York was successfully attributed to individuals by researchers.
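For readers curious how such de-anonymization works, here is a minimal sketch in the spirit of the New York taxi case, where identifiers were “anonymized” with an unsalted MD5 hash. Because the identifier format is small and predictable, the whole keyspace can be hashed in advance and reversed by lookup. The ID format below is illustrative, not the real medallion scheme:

```python
import hashlib

# Suppose IDs follow a small, predictable format: one digit, one letter,
# two digits (e.g. "7J42"). That is only 10 * 26 * 100 = 26,000 values,
# so hashing them all and inverting the map is trivial.
def candidate_ids():
    for digit in range(10):
        for letter in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
            for num in range(100):
                yield f"{digit}{letter}{num:02d}"

def build_rainbow_table(candidates):
    """Map each candidate ID's MD5 hash back to the plaintext ID."""
    return {hashlib.md5(c.encode()).hexdigest(): c for c in candidates}

table = build_rainbow_table(candidate_ids())
anonymized = hashlib.md5(b"7J42").hexdigest()  # a "de-identified" record
print(table.get(anonymized))  # recovers "7J42"
```

The lesson is that hashing is not anonymization when the space of possible inputs is small enough to enumerate.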

In this respect, Uber is also different from traditional taxi companies, whose privacy policies seem to be more restrictive, e.g., the policy of Blue Line Taxi:

In the event that you should choose to send Blue Line Taxi Inc. your e-mail addresses, telephone numbers, addresses and/or any personal information so that we may communicate with you, Blue Line Taxi Inc. will not sell, trade or rent this information to third parties.

Whether this difference is sufficient to differentiate Uber from taxi services for legal purposes is another matter. In any event, it may suggest an area in which regulation would be appropriate.

Drone News Network: 17 Nov. 2014 November 17, 2014

Posted by Cameron Shelley in: STV202, STV302

Doodybutch/Wikimedia commons

The drones are coming, still!

This last item is interesting. Until the present, drone designers have been interested mainly in getting drones to fly well and take lots of pictures cheaply. The onus of mitigating the risks has fallen on those who regulate the usage of drones. However, drones could perhaps be designed to mitigate the risks they pose. Besides biodegradability, how might drones be designed to pay more attention to the social issues they raise?

Privacy and equitability November 13, 2014

Posted by Cameron Shelley in: STV202, STV302

Rama/Wikimedia commons

FastCompany has an interesting item about a photographer, Andrew Hammerand, who is staging an exhibition of photos from an Internet-connected camera. What makes the exhibit especially of interest is that the subjects do not know that they are being watched.

Six years ago, Hammerand heard about such cameras, which are often not protected by any security measures, so that anyone on the ’net can monitor the feeds they generate. He located one camera, attached to a cell phone tower over an American Midwest town, that was completely controllable:

“When I found this one I knew it was immediately different because you had full control of the camera,” Hammerand says. “This one had a full 360-degree view. You could pan, tilt, and control the exposure, and it was a high enough quality camera where you could zoom in from quite far away.”

Apparently, the camera was installed by developers looking to allow prospective buyers to see what the town is like.

Hammerand has surveilled the town using the camera for years and generated some very interesting and jittery pictures. A selection is being displayed at an exhibition called “The New Town”.

We have commented previously on challenges for privacy in the era of the Internet-of-things. This case illustrates one such challenge. Cameras like the one above this unnamed town allow people on the ’net to have a look at the public (and perhaps other) spaces of the town for legitimate purposes. In this respect, the camera functions much like Google Street View. However, it could also be used for surveillance in ways that the locals do not anticipate or understand, and might well consider a form of harassment.

Another way to understand the situation is through the concept of equitability. What I have in mind is a system for the exchange of access to information that treats all parties more-or-less equally. In this case, it would mean that person A could pry into the life of person B only to the extent that A would allow B to do the reverse.

The example above fails to meet this criterion. Anyone with an Internet connection can surveil the doings in this town, while the residents do not (necessarily) have a similar opportunity with respect to the public spaces of others. Thus, the access granted by this camera is inequitable.
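The criterion can be stated compactly: treat access as a relation between watchers and watched, and call the arrangement equitable only if the relation is symmetric. A small sketch (the function and pair names are my own, not from any real system):

```python
def equitable(access):
    """access: a set of (watcher, watched) pairs.
    The arrangement is equitable iff the relation is symmetric:
    whenever A can watch B, B can also watch A."""
    return all((b, a) in access for (a, b) in access)

# The town camera: any Internet user can watch residents,
# but residents have no comparable view back.
camera_access = {("internet_user", "town_resident")}
print(equitable(camera_access))  # False: inequitable

# Adding the reverse pair would restore symmetry.
mutual_access = camera_access | {("town_resident", "internet_user")}
print(equitable(mutual_access))  # True: equitable
```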

One way to address the issue would be to mount open surveillance cameras on all public spaces. Whatever the feasibility of this option, my guess is that people would find it hard to accept. If so, then perhaps we should come to some agreement about a more restrictive default for cameras mounted over public spaces.

Drone News Network: Look, up in the sky! November 12, 2014

Posted by Cameron Shelley in: STV202, STV302

Walkera QR X350

It’s a bird! It’s a plane! No, it’s a rogue drone!

One of the most persistent issues with personal drones is where they should be allowed, and why their access to certain places might be restricted (or not). Here are some recent incidents to consider:

And so is born the concept of a rogue drone, a drone flying somewhere it should not be flying and thus causing concerns over safety and security. So far, authorities have reacted with restriction zones. How should they respond when those restrictions are not respected?

Bonus footage: An unknown drone watches Chancellor Angela Merkel of Germany last September!

Mother, father, DNA November 11, 2014

Posted by Cameron Shelley in: STV203

DOE Human Genome Project

As noted in earlier posts, biotechnology has been applied to fertility in ways that challenge traditional concepts of family relations. Who is the mother or father of a child when genetic transfer and surrogacy are involved? Two further illustrations of these questions have surfaced recently in the news.

The first concerns a ruling by the Irish Supreme Court holding that the genetic mother of twins is not their legal mother. The twins were conceived using the intended mother’s eggs and sperm from her husband. However, the intended mother’s sister carried the twins to term because the intended mother cannot do so due to a disability. After the birth, the Registrar of Births registered the genetic father as the twins’ father but refused to register the genetic mother as the “mother” on their birth certificate. Instead, the birth mother was listed because the Registrar interpreted this action as required by Irish law.

The couple sued and won their case in court. However, the government appealed, arguing that, by default, birth mothers should be considered as legal mothers. Furthermore, birth mothers who use donated eggs might find their parental rights removed by the ruling. The lawyer for the family argued that, among other things, genetic tests are already used to determine paternity, so the same standard should be applied to mothers.

The Court ruled in favour of the government, saying that Irish law has no definition of “mother”. Until recently, there would have been no question that a birth mother is the mother of a child. Of course, biotechnology has undone that assumption, and the Irish parliament has passed no legislation on the matter. Rather than make that determination itself, the Court sided with the government so that parliament could pass appropriate legislation without being constrained by a prior court decision.

In a California case, actor Jason Patric has recently been ruled the legal father of his genetic son Gus. Gus was conceived from sperm donated by Patric to impregnate his on-again, off-again girlfriend, Danielle Schreiber. By default, California law grants no parental rights to sperm donors in the absence of a written agreement to the contrary. Patric sued for access and was denied by a lower court. He appealed and persuaded the appeals court judge that the presumption against in-vitro fathers should not be “so categorical”.

Ms. Schreiber objects that Patric has not played the role of father, neither signing the child’s birth certificate nor changing any of his diapers.

Doubts about paternity have always been with us. Sometimes, it was hoped that genetic testing would put such issues to rest. At the same time, however, biotechnology in fertility has somewhat detached the concept of fatherhood from that of genetic descent.

Maternity has usually appeared more clear cut. Again, biotechnology has unmoored traditional assumptions about who is a mother to whom. Laws are unclear on these matters in part because people themselves are not sure what to make of the new reality.

Privacy today November 10, 2014

Posted by Cameron Shelley in: STV202, STV302

That privacy is an important issue today is not news. Consider the recent posting in this blog about e-voting and the lack of ballot secrecy it involves. However, there has been an abundance of privacy-related news items that are worth pondering:

As ever, privacy is not a simple challenge.

E-voting in US midterm elections November 7, 2014

Posted by Cameron Shelley in: STV302

The United States recently held its 2014 midterm elections. The big winners were the Republican Party and, by some accounts, the voters of Alaska. That is because the state offered on-line voting (or e-voting). No more long lineups at the polling stations!

As this piece in IEEE Spectrum notes, security experts caution that online voting such as that offered in Alaska is open to online attacks. The same caution has been raised before, as noted in earlier postings on this blog. Once again, Estonia’s online voting system is held up as a remedy: if the Estonians can do it, why can’t Americans? It is true that Estonia has invested heavily in its secure, online identification system, which has paid some dividends. However, researchers at the University of Michigan have examined Estonia’s system and found a number of vulnerabilities. They recommend that Estonia discontinue its online voting until the problems are addressed.

Another issue with online voting is that it is not secret. That is, voters casting ballots can be surveilled by other parties as they do so. In the past, this practice has resulted in some voter coercion and vote selling. If others can be sure how an elector is voting, then they may pressure that elector to vote in a given way or entice that behaviour with a reward. I noted in an earlier posting that this problem is why the Ontario Elections Act forbids so-called voting selfies, that is, photographs of ballots that voters take with their smartphones in voting booths.

So, current online voting is neither secret nor secure.

That much is well known. One interesting wrinkle on this issue in the recent midterm elections is that Alaska has included a waiver in its information about its online voting system. Alaska’s own Division of Elections website includes the following statement:

When returning the ballot through the secure online voting solution, you are voluntarily waiving your right to a secret ballot and are assuming the risk that a faulty transmission may occur.

I think that this notice sets an interesting precedent. Why not apply the standard of informed consent to online voting? Currently, electors who use an online system are simply presented with a ballot informing them about their voting choices. They are not given information about the risks they are taking with that ballot. Perhaps they should be presented with this information before choosing to vote online.

The waiver on the Alaskan website would be a start. But a more complete waiver might run as follows:

    Voters choosing to cast ballots online agree to the following conditions:

  1. You are waiving your right to a secret ballot, and assuming the risk that third parties may pressure you to change your ballot.
  2. You have the right to sell your ballot to third parties on any terms agreeable to you both.
  3. You are assuming the risk that your ballot may be changed or lost in the event of faulty transmission.
  4. You are waiving your right to a secure ballot, and assuming the risk that others may alter or discount your ballot without your knowledge.
  5. You are waiving your right to have the counting of your ballot scrutinized by candidates or their appointed scrutineers.

(Agree) or (Disagree)?

The last right is guaranteed under the Ontario Elections Act (1990, section 45.5):

On the general polling day, the deputy returning officer and the poll clerk shall, at the hour fixed for the closing of the general poll, and in the presence of such of the candidates or their scrutineers as are present, proceed to count the ballots cast.

No one can scrutinize the counting of ballots when that occurs in a computer in a distant location.

Have I missed anything?

Data in the iCloud, by default? November 6, 2014

Posted by Cameron Shelley in: STV202, STV302

It is well known that privacy in the Internet era can be a challenge. The advent of big data techniques, for example, has made privacy harder to maintain, as discussed in an earlier posting. One problem mentioned there is the difficulty users face in maintaining control over uses of their data by those who collect it.

Another aspect of this problem was recently brought back into focus by the introduction of the iCloud Drive service in the new Mac OS, Yosemite. iCloud is simply Apple’s cloud data service, allowing users to access their data over the Internet from any appropriate Apple device. It turns out that several applications, including TextEdit, Preview, and Keynote, save working files to iCloud Drive by default. The point of this feature, as noted by Apple, is to allow files to be viewed and edited from multiple locations.

However, some files contain sensitive or private information, which users may not want placed into a cloud service where they might be accessed by hackers. Think of the recent celebrity nude selfie hack, for example. So, it may come as an unpleasant surprise to those users to find that their information is being handled in this way without their prior consent. Of course, they can opt out by changing the system preferences for iCloud.

This case, then, is another illustration of the importance of defaults. Unless the system is changed by the user, it defaults to a setting where user data is less secure than it could be. The payoff comes in the form of greater user access to the data. Apple has assumed that convenience is more important, overall, and thus applied the paradigm of presumed consent by users to this handling of their data. The alternative would be to apply the paradigm of express consent, that is, keeping the behaviour off by default and seeking permission from users to change it.
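The difference between the two paradigms comes down to a single default value. A minimal sketch (the setting names are hypothetical, not Apple’s actual preference keys):

```python
from dataclasses import dataclass

@dataclass
class SyncPreferences:
    # Presumed consent: cloud sync is on unless the user opts out.
    sync_presumed: bool = True
    # Express consent: cloud sync stays off until the user opts in.
    sync_express: bool = False

# A user who never opens the preferences pane gets the defaults:
prefs = SyncPreferences()
print(prefs.sync_presumed)  # True  -> working files uploaded by default
print(prefs.sync_express)   # False -> nothing uploaded until opted in
```

The code change is trivial; the ethical choice of which default to ship is not, since most users never change defaults at all.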

Was Apple’s decision in favour of presumed consent the correct one?
