Genetic privacy and insurance | November 20, 2014 | Posted by Cameron Shelley in Uncategorized
[Image: DOE Human Genome Project]
In recent postings, I have noted that Canadian law does not protect the privacy of genetic information from insurers, and that insurers do require that information under certain circumstances. This situation can have unfortunate consequences.
Consider the case of Teresa Quick. This Toronto woman has a family history of cancer: both her mother and grandmother suffered from it. Teresa opted for genetic testing and was found to carry a BRCA1 mutation, predisposing her to breast (and ovarian) cancer. As a result, she had a double mastectomy and applied for disability insurance that would help cover her mortgage in the event of a medical problem that prevented her from working. The application was denied due to her family history.
Teresa’s case could be considered a case of genetic discrimination. That is, Teresa’s insurance application was denied on the basis of genetic information, whereas a similar application without that information would have been accepted. Most western countries have laws prohibiting genetic discrimination, although Canada, as noted above, is an exception. Both the federal and provincial governments have proposed genetic non-discrimination legislation, but nothing has been passed.
This situation may change as the Canadian Senate is currently considering Bill S-201: An Act to prohibit and prevent genetic discrimination. The Bill would prohibit anyone from requiring a genetic test or disclosure of a genetic test result as a condition for a contract such as an insurance policy. The Bill has passed second reading and is currently before the Standing Senate Committee on Human Rights.
It is not clear to me that the Bill would alter what happened to Teresa Quick. The Bill defines a genetic test as follows:
“genetic test” means a test that analyzes DNA, RNA or chromosomes for purposes such as the prediction of disease or vertical transmission risks, or monitoring, diagnosis or prognosis.
Should a family history be considered a genetic test? Insurance applications routinely require disclosure of family history in order to establish “existing medical conditions” that insurers might wish not to cover. Of course, the point of a family history is to uncover any inherited conditions, which would usually be genetic (although epigenetic conditions might also be captured). If family history is not considered a genetic test, then the decision on Quick’s application would not be altered.
Canada needs laws regarding genetic privacy. However, lawmakers have to be clear on what it is they seek to regulate: Should it be specifically the technology of genetic testing, or should it be genetic information from whatever source?
Trust in Uber | November 19, 2014 | Posted by Cameron Shelley in STV202, STV302
In case you do not know, Uber is a ride-sharing service that connects car owners who want to make money giving people rides with people who would like those rides. As such, it is part of a wider market phenomenon known as the sharing economy, in which people with excess capacity, such as cars they are not currently using, rent it out to those who would like to use it.
If Uber’s offering sounds to you much like a taxi service, then you are not alone. Although Uber has been widely successful, it is often opposed by city administrations and taxi companies as a source of unfair competition. Now it appears that Toronto can be added to this list of cities. City officials have sought a court order to prevent the service from operating there. Their argument is that since Uber operates like a taxi service, connecting riders with providers, it should be regulated like one. Uber disagrees.
Toronto’s incoming mayor, John Tory, is an Uber supporter:
John Tory, however, issued a statement late Tuesday saying Uber and services like it “are here to stay.”
“It is time our regulatory system got in line with evolving consumer demands in the 21st century,” it said. “As Mayor, I intend to see that it does, while being fair to all parties, respecting the law and public safety.”
This quotation is not abundantly clear but seems to appeal to some notion of technological determinism, according to which the arrival of new technologies is inevitable, so we should just get used to the idea. As readers of this blog know, new technologies do sometimes go away without becoming established. In any event, the second part of the Mayor-elect’s statement seems to vitiate the first.
Besides determinism, another reason not to treat Uber as a taxi service lies in the differences between how Uber operates and how traditional taxis do:
Prof. Gans [of UofT] is a frequent user of Uber’s service, saying the app allows him to pay using his smartphone and avoids the hassle of fumbling for change. As for safety, he says he ordered an Uber cab driver to pick up his 12-year-old son from school one day when he had lots of stuff to carry home. With the app, he knew who the driver would be and his son did not have to carry cash for the fare.
Payments for rides on Uber are handled by the service rather than the driver (and no tips!) and customers can accept or reject (and review) given drivers. The latter mechanism is a common means of building trust in a sharing-economy service. Clearly, Prof. Gans trusts Uber and enjoys that feature of the service.
Uber’s privacy policy suggests that personal ride data is kept confidential or anonymized before being shared. Two caveats are in order here. The first is that Uber seems to have violated this policy in the past. A story related by Peter Sims suggests that Uber has a “God view” that lets it follow individual rides in real time, as was exhibited at a party held by company executives in Chicago in 2011. From the story, it appears that Uber has built itself a “back door” into its service that allows it to share personal data for promotional purposes. Like most back doors, this feature is likely accessible to other interested parties on the Internet.
The second caveat is that anonymized data can sometimes be de-anonymized. As noted in a recent posting, anonymized taxi data released by the City of New York was successfully attributed to individuals by researchers.
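The NYC taxi case is instructive enough to sketch. The released data reportedly obscured medallion numbers with an unsalted MD5 hash, but valid medallion numbers follow a small, fixed pattern, so the entire keyspace can be enumerated and matched against the released hashes. The pattern used below is a simplification of my own, not the actual TLC format, but the principle is the same:

```python
import hashlib
from itertools import product
from string import ascii_uppercase, digits

def crack_medallions(released_hashes, pattern=(digits, ascii_uppercase, digits, digits)):
    """Recover plaintexts behind unsalted MD5 hashes of a small keyspace.

    pattern is a tuple of character sets, one per position; here it is a
    hypothetical digit-letter-digit-digit medallion format (26,000 values).
    """
    recovered = {}
    for combo in product(*pattern):
        candidate = "".join(combo)
        digest = hashlib.md5(candidate.encode()).hexdigest()
        if digest in released_hashes:
            recovered[digest] = candidate
    return recovered
```

With only tens of thousands of possible plates, this runs in a fraction of a second, which is why hashing identifiers drawn from a small keyspace offers no meaningful anonymity.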
In this respect, Uber is also different from traditional taxi companies, whose privacy policies seem to be more restrictive, e.g., the policy of Blue Line Taxi:
In the event that you should choose to send Blue Line Taxi Inc. your e-mail addresses, telephone numbers, addresses and/or any personal information so that we may communicate with you, Blue Line Taxi Inc. will not sell, trade or rent this information to third parties.
Whether this difference is sufficient to differentiate Uber from taxi services for legal purposes is another matter. In any event, it may suggest an area in which regulation would be appropriate.
Drone News Network: 17 Nov. 2014 | November 17, 2014 | Posted by Cameron Shelley in STV202, STV302
The drones are coming, still!
- Debate over the recreational use of drones in Australia has increased with news of a drone that took pictures of a woman sunbathing in her own backyard. The photography was commissioned by a real estate company in Melbourne to show photos of the area to prospective clients. One of the photos shows Mandy Lingard caught unawares on her lawn behind her house. The Civil Aviation Safety Authority stipulates that personal drones are not to be flown over populous areas or beaches, but these and similar restrictions are often more honored in the breach than the observance.
- Police forces in the United States are interested in small drones. The LAPD has acquired a pair, and San Jose has purchased one that it would like to use to assist in police work. Both forces note the potential usefulness of drones for hostage or standoff situations. Public reception has been tepid, however, over concerns that the drones would also be used for prying into people’s private spaces and affairs. It may be that the public would accept the use of police drones given appropriate restrictions and civilian oversight.
- The US Navy has recently deployed a laser gun that can shoot down military drones. The gun has several settings, from a low power that merely dazzles the optical systems of drones (stun) to a high power that lights their targets on fire (kill). No word on when a civilian version will be available.
- Given the increasing presence of drones in our airspace, not to mention efforts to bring them down, Evocative Design of New York has designed a “bio-drone.” Much, though not all, of the drone’s chassis is made from biodegradable materials, so that the drone will simply decay away after a crash. Only the battery, wires, and mechanical parts would be left to show that the drone was ever there.
This last item is interesting. Until now, drone designers have been interested mainly in getting drones to fly well and take lots of pictures cheaply. The onus of mitigating the risks posed by drones has fallen on regulators of their usage. However, drones could perhaps be designed to mitigate the risks they pose. Besides biodegradability, how might drones be designed to pay more attention to the social issues they raise?
Privacy and equitability | November 13, 2014 | Posted by Cameron Shelley in STV202, STV302
FastCompany has an interesting item about a photographer, Andrew Hammerand, who is staging an exhibition of photos from an Internet-connected camera. What makes the exhibit especially of interest is that the subjects do not know that they are being watched.
Six years ago, Hammerand heard about such cameras, which are often not protected by any security measure so that anyone on the ‘net can monitor the feed they generate. He located one camera attached to a cell phone tower over an American Midwest town that was completely controllable:
“When I found this one I knew it was immediately different because you had full control of the camera,” Hammerand says. “This one had a full 360-degree view. You could pan, tilt, and control the exposure, and it was a high enough quality camera where you could zoom in from quite far away.”
Apparently, the camera was installed by developers looking to allow prospective buyers to see what the town is like.
Hammerand has surveilled the town using the camera for years and generated some very interesting and jittery pictures. A selection is being displayed at an exhibition called “The New Town”.
We have commented previously on challenges for privacy in the era of the Internet-of-things. This case illustrates one such challenge. Cameras like the one above this unnamed town allow people on the ’net to have a look at the public (and perhaps other) spaces of the town for legitimate purposes. In this respect, the camera functions much like Google Street View. However, it could also be used for surveillance in a way that the locals do not anticipate or understand, and might well consider a form of harassment.
Another way to understand the situation is through the concept of equitability. What I have in mind is a system for the exchange of access to information that treats all parties more-or-less equally. In this case, it would mean that person A could pry into the life of person B only to the extent that A would allow B to do the reverse.
The example above fails to meet this criterion. Anyone with an Internet connection can surveil the doings in this town, while the residents do not (necessarily) have a similar opportunity with respect to the public spaces of others. Thus, the access granted by this camera is inequitable.
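For the sake of precision, the criterion can be stated as a symmetry condition on an access relation. The representation below is a sketch of my own, not a formal definition drawn from any source:

```python
def is_equitable(access_pairs):
    """Check the equitability criterion on a set of (watcher, watched) pairs.

    Access is equitable when every granted observation is reciprocal:
    A may observe B only if B may likewise observe A.
    """
    return all((watched, watcher) in access_pairs
               for (watcher, watched) in access_pairs)
```

On this test, the town's camera fails: the relation contains pairs like ("net_user", "resident") with no reciprocal pair, so the access it grants is inequitable.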
One way to address the issue would be to mount open surveillance cameras on all public spaces. Whatever the feasibility of this option, my guess is that people would find it hard to accept. If so, then perhaps we should come to some agreement about a more restrictive default for cameras mounted over public spaces.
Drone News Network: Look, up in the sky! | November 12, 2014 | Posted by Cameron Shelley in STV202, STV302
[Image: Walkera QR X350]
It’s a bird! It’s a plane! No, it’s a rogue drone!
One of the most persistent issues with personal drones is where they should be allowed, and why their access to certain places might be restricted (or not). Here are some recent incidents to consider:
- The US Federal Aviation Administration is concerned with a “rash” of drones that have been showing up at American football games. A half-dozen or so incidents have been reported at college events, causing authorities to be concerned about safety. With multitudes of people packed in cheek-by-jowl, a drone crash could cause injuries. The article notes that a 19-year-old Brooklyn man was killed last year when the helicopter drone he was piloting crashed into his head. In addition, drones can simply disrupt sporting events, as demonstrated by an incident in a soccer game in Serbia.
- The US government is receiving “near daily” reports of drones flying in the vicinity of airports, near airplanes and helicopters. The FAA reports that there has already been a near miss between a personal drone and a passenger airliner in Florida. The agency restricts drones from airports for obvious safety reasons:
The FAA tightly restricts the use of drones, which could cause a crash if one collided with a plane or was sucked into an engine. Small drones usually aren’t visible on radar to air traffic controllers, particularly if they’re made of plastic or other composites.
The article claims that the FAA’s restrictions on drones are routinely ignored.
- The New York Police Department is concerned that weaponized drones may be used by terrorists. Personal drones have already been equipped with paintball guns and tasers, so the idea that they could be even more fiercely armed is not implausible. So, when unidentified drones begin hovering around sensitive locations or events, authorities may feel compelled to respond.
- The French government reports that personal drones have also been sighted surveilling some of the nation’s nuclear power plants:
Between Oct. 5 and 20, people reported unidentified drones flying around seven of the country’s 58 state-owned nuclear power plants. In a press conference, French Minister of the Interior Bernard Cazeneuve said that the drones were small, commercially available models.
Suspicion has fallen on Greenpeace, an environmental group known for its interventions in institutional facilities. The group denies the allegation. Of course, nuclear power plants could be targets for terrorist groups as well.
And so is born the concept of a rogue drone: a drone flying somewhere it should not be flying, thus causing concerns over safety and security. So far, authorities have reacted with restriction zones. How should they respond when those restrictions are not respected?
Bonus footage: An unknown drone watches Chancellor Angela Merkel of Germany last September!
Mother, father, DNA | November 11, 2014 | Posted by Cameron Shelley in STV203
[Image: DOE Human Genome Project]
As noted in earlier posts, biotechnology has been applied to fertility in ways that challenge traditional concepts of family relations. Who is the mother or father of a child when genetic transfer and surrogacy are involved? Two further illustrations of these questions have surfaced recently in the news.
The first concerns a ruling by the Irish Supreme Court holding that the genetic mother of twins is not their legal mother. The twins were conceived using the intended mother’s eggs and sperm from her husband. However, the intended mother’s sister carried the twins to term because the intended mother could not do so due to a disability. After the birth, the Registrar of Births registered the genetic father as the twins’ father but refused to register the genetic mother as the “mother” on their birth certificate. Instead, the birth mother was listed, because the Registrar interpreted this action as required by Irish law.
The couple sued and won their case in court. However, the government appealed, arguing that, by default, birth mothers should be considered as legal mothers. Furthermore, birth mothers who use donated eggs might find their parental rights removed by the ruling. The lawyer for the family argued that, among other things, genetic tests are already used to determine paternity, so the same standard should be applied to mothers.
The Court ruled in favour of the government, saying that Irish law has no definition of “mother”. Until recently, there would have been no question that a birth mother is the mother of a child. Of course, biotechnology has undone that assumption, and the Irish parliament has passed no legislation on the matter. Rather than make that determination itself, the Court deferred so that parliament could pass appropriate legislation without being constrained by a prior court decision.
In a California case, actor Jason Patric has recently been ruled the legal father of his genetic son Gus. Gus was conceived from sperm donated by Patric to impregnate his on-again-off-again girlfriend, Danielle Schreiber. By default, California law grants no parental rights to sperm-donor fathers in the absence of a written agreement to the contrary. Patric sued for access and was denied by a lower court. He appealed and persuaded the appeals court judge that the presumption against in-vitro fathers should not be “so categorical”.
Ms. Schreiber objects that Patric has not played the role of father, neither signing the child’s birth certificate nor changing any of his diapers.
Doubts about paternity have always been with us. Sometimes, it was hoped that genetic testing would put such issues to rest. At the same time, however, biotechnology in fertility has somewhat detached the concept of fatherhood from that of genetic descent.
Maternity has usually appeared more clear cut. Again, biotechnology has unmoored traditional assumptions about who is a mother to whom. Laws are unclear on these matters in part because people themselves are not sure what to make of the new reality.
Privacy today | November 10, 2014 | Posted by Cameron Shelley in STV202, STV302
That privacy is an important issue today is not news. Consider the recent posting in this blog about e-voting and the lack of ballot secrecy it involves. However, there has been an abundance of privacy-related news items that are worth pondering:
- A number of LinkedIn users are suing the company in the US for violating that country’s Fair Credit Reporting Act. Their beef is that the service allows potential employers to construct lists of references connected in some way to a job applicant without the applicant’s knowledge. The plaintiffs claim that it is illegal for companies to perform such anonymous searches without ensuring the accuracy of the information gathered. LinkedIn contends that all the information made available through the service is information that users have consented to make public. US law does provide special protection for background checks intended for employee screening. Should LinkedIn users enjoy similar protection?
- New Scientist has an item on a privacy-conscious CCTV camera. The camera is designed to remove moving objects from its data stream. In effect, the visual feed includes fixed objects like buildings and mailboxes but not moving ones like people and vehicles. The editing is done in the camera itself, so that hackers cannot defeat the feature, at least not by intercepting the data stream. In effect, the design anonymizes the video feed. In a grocery store (one place where such cameras are deployed), users of the camera could track the flow of goods into and out of the aisles but would not know exactly who did the moving, thus protecting the privacy of customers. This camera could be considered an instance of privacy by design.
- Targeted marketing also involves surveillance and big data. Web services track user behavior and use the results to sell marketers ad space tailored to those users. The result increases the efficiency of advertising but decreases the privacy of users. Some users respond with browser extensions such as ad blockers. Another approach is to degrade the performance of tracking software. This approach is exemplified by a browser extension called AdNauseam. In effect, the extension clicks on ads indiscriminately, thus making the user’s data very noisy. It provides a kind of privacy through obfuscation. Is that appropriate?
- An article in Slate addresses the general impact of “big data” on social control. Had early American slave owners been able to access modern surveillance and data mining techniques, would institutions such as the Underground Railroad have been possible? Neil Postman argued that new technologies tend to favour established social groups. This article makes a case that big data fits with Postman’s view:
These examples may seem extreme. But they highlight an important and uncomfortable fact: Throughout our history, the survival of our most vulnerable communities has often turned on their ability to avoid detection.
The point of the piece is that decisions over the use of data tend to be made without the interests of vulnerable constituencies in mind. True?
- An international task force recently shut down a number of “dark markets”, that is, online networks for items such as illegal drugs, stolen credit card numbers, and illegal weapons. Such markets are able to operate in the “dark”, in part, through the use of privacy protection software such as Tor. All transactions are anonymized through the service, thus making it difficult for authorities to track them. This situation illustrates one of the trade-offs that exists with privacy protections: They can benefit not only constituencies vulnerable to exploitation but also those seeking to hide from legitimate regulation.
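The privacy-conscious camera mentioned above invites a sketch. The New Scientist article does not describe the camera's algorithm, but a standard way to retain fixed objects while dropping moving ones is background estimation, for example a pixel-wise temporal median over recent frames. A minimal sketch, with invented names:

```python
import numpy as np

def anonymized_feed(frames):
    """Estimate the static background from a window of video frames.

    A moving object occupies any given pixel only briefly, so the
    pixel-wise median over time keeps buildings and shelves while
    discarding people and vehicles.
    """
    return np.median(np.stack(frames), axis=0)
```

Doing this inside the camera, as the article describes, means the raw frames never leave the device, so even an intercepted feed reveals no passers-by.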
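The AdNauseam item above can also be sketched. The extension's real mechanics differ, and every name and parameter here is my own invention; the point is only the obfuscation idea: bury each genuine click among decoys drawn from the visible ad inventory so that a tracker's profile of the user fills with noise.

```python
import random

def obfuscated_clicks(real_clicks, ad_inventory, noise_ratio=10, rng=None):
    """Mix each genuine click with noise_ratio indiscriminate decoy clicks.

    A tracker observing the resulting stream cannot tell which clicks
    reflect the user's actual interests.
    """
    rng = rng or random.Random()
    stream = list(real_clicks)
    for _ in real_clicks:
        stream.extend(rng.choice(ad_inventory) for _ in range(noise_ratio))
    rng.shuffle(stream)
    return stream
```

With a noise ratio of ten, only one click in eleven carries any signal, which is the trade the user makes: privacy bought at the cost of polluting the advertiser's data.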
As ever, privacy is not a simple challenge.
E-voting in US midterm elections | November 7, 2014 | Posted by Cameron Shelley in STV302
The United States recently held its 2014 midterm elections. The big winners were the Republican Party and, by some accounts, the voters of Alaska. That is because the state offered on-line voting (or e-voting). No more long lineups at the polling stations!
As this piece in IEEE Spectrum notes, security experts caution that online voting such as that offered in Alaska is open to online attacks. Similar cautions have been raised in earlier postings on this blog. Once again, Estonia’s online voting system is held up as a remedy: if the Estonians can do it, why can’t Americans? It is true that Estonia has invested heavily in its secure, online identification system, which has paid some dividends. However, researchers at the University of Michigan have examined Estonia’s system and found a number of vulnerabilities. They recommend that Estonia discontinue its online voting until the problems are addressed.
Another issue with online voting is that it is not secret. That is, voters casting ballots can be surveilled by other parties as they do so. In the past, this practice has resulted in voter coercion and vote selling. If others can be sure how an elector is voting, then they may pressure that elector to vote in a given way or entice that behaviour with a reward. I noted in an earlier posting that this problem is why the Ontario Elections Act forbids so-called voting selfies, that is, photographs of ballots that voters take with their smartphones in voting booths.
So, current online voting is neither secret nor secure.
That much is well known. One interesting wrinkle on this issue in the recent midterm elections is that Alaska has included a waiver in its information about its online voting system. Alaska’s own Division of Elections website includes the following statement:
When returning the ballot through the secure online voting solution, you are voluntarily waiving your right to a secret ballot and are assuming the risk that a faulty transmission may occur.
I think that this notice sets an interesting precedent. Why not apply the standard of informed consent to online voting? Currently, electors who use an online system are simply presented with a ballot informing them about their voting choices. They are not given information about the risks they are taking with that ballot. Perhaps they should be presented with this information before choosing to vote online.
The waiver on the Alaskan website would be a start. But a more complete waiver might run as follows:
- Voters choosing to cast ballots online agree to the following conditions:
- You are waiving your right to a secret ballot, and assuming the risk that third parties may pressure you to change your ballot.
- You have the right to sell your ballot to third parties on any terms agreeable to you both.
- You are assuming the risk that your ballot may be changed or lost in the event of faulty transmission.
- You are waiving your right to a secure ballot, and assuming the risk that others may alter or discount your ballot without your knowledge.
- You are waiving your right to have the counting of your ballot scrutinized by candidates or their appointed scrutineers.
(Agree) or (Disagree)?
The last right is guaranteed under the Ontario Elections Act (1990, section 45.5):
On the general polling day, the deputy returning officer and the poll clerk shall, at the hour fixed for the closing of the general poll, and in the presence of such of the candidates or their scrutineers as are present, proceed to count the ballots cast.
No one can scrutinize the counting of ballots when that occurs in a computer in a distant location.
Have I missed anything?
Data in the iCloud, by default? | November 6, 2014 | Posted by Cameron Shelley in STV202, STV302
It is well known that privacy in the Internet era can be a challenge. The advent of big data techniques, for example, has made privacy harder to maintain, as discussed in an earlier posting. One problem mentioned there is the difficulty for users maintaining control over uses of their data by collectors of it.
Another aspect of this problem was recently brought back into focus by the introduction of the iCloud Drive service in the new Macintosh operating system, OS X Yosemite. iCloud is simply Apple’s cloud data service, allowing users to access their data over the Internet from any appropriate Apple device. One feature of this service is that several applications, including TextEdit, Preview, and Keynote, upload working files to iCloud Drive by default. The point of this feature, as noted by Apple, is to allow files to be viewed and edited from multiple locations.
However, some files contain sensitive or private information, which users may not want placed into a cloud service where they might be accessed by hackers. Think of the recent celebrity nude selfie hack, for example. So, it may come as an unpleasant surprise to those users to find that their information is being handled in this way without their prior consent. Of course, they can opt out by changing the system preferences for iCloud.
This case, then, is another illustration of the importance of defaults. Unless the system is changed by the user, it defaults to a setting where user data is less secure than it could be. The payoff comes in the form of greater user access to the data. Apple has assumed that convenience is more important, overall, and thus applied the paradigm of presumed consent by users to this handling of their data. The alternative would be to apply the paradigm of express consent, that is, keeping the behaviour off by default and seeking permission from users to change it.
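The two paradigms differ only in what a setting does before the user has expressed any preference. A toy sketch (names mine):

```python
def should_sync(paradigm, user_choice=None):
    """Decide whether to upload working files, given a consent paradigm.

    user_choice is None until the user actively sets a preference.
    Presumed consent: sync is on unless the user opts out.
    Express consent: sync stays off until the user opts in.
    """
    if user_choice is not None:
        return user_choice
    return paradigm == "presumed"
```

Apple's choice corresponds to the "presumed" default; the express-consent alternative would keep uploading off until the user explicitly enabled it.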
Was Apple’s decision in favour of presumed consent the correct one?
3D printed gun ammo | November 5, 2014 | Posted by Cameron Shelley in STV202
We have not said a great deal about 3D printed guns since the invention of the Liberator. That was the first firearm to be totally printable on a 3D printer.
One of the limitations of such weapons was that the gun’s material could not withstand many firings. This meant that firing the gun could be dangerous to the user and not just the target, and that semi-automatic 3D guns were not practicable. This fact has been welcomed by regulators, such as the US Bureau of Alcohol, Tobacco, and Firearms, who have not attempted to regulate this particular innovation yet.
However, this problem may now have been overcome. Wired reports that Michael Crumling, a machinist from York, Pa., has designed ammunition especially for 3D printed firearms:
His ammunition uses a thicker steel shell with a lead bullet inserted an inch inside, deep enough that the shell can contain the explosion of the round’s gunpowder instead of transferring that force to the plastic body or barrel of the gun. …
“It’s a really simple concept: It’s kind of a barrel integrated into the shell, so to speak,” says Crumling. “Basically it removes all the stresses and pressures from the 3-D printed parts. You should be able to fire an unlimited number of shots through the gun without replacing any parts other than the shell.”
Recently, for example, Crumling shot 19 of his home-made rounds from a gun printed on a run-of-the-mill home 3D printer. The results can be seen in his video:
Crumling has not yet tooled up to produce his ammunition en masse but says that he may do so if enough people express an interest.
If regulators were hoping for more time before taking action, it seems that that time is growing short.