-The Border Search Exception- I'm writing a paper for a different class about the exception to the Fourth Amendment's warrant requirement for searches conducted at the international border. I pricked up my ears in class last week when Eben said that en route to JFK someone could do the digital equivalent of locking a door and sliding the key underneath it. I thought it might be useful to set out what I've learned so far about this area of law, if nothing else to make people who are traveling internationally aware of the new border directives.

--Brief background-- The Fourth Amendment at the border rests on a tension between the privacy rights of individuals in their persons, papers and effects, and a government's interest in controlling what crosses its borders. The courts have determined that at the border the government's interest is "at its zenith" whereas the individual's expectations of privacy are lowered. (United States v. Flores-Montano, 541 U.S. 149, 152 (2004)) The asymmetry in these interests has shaped the development of the border search exception: an exception to the presumption that, for a search to be reasonable under the Fourth Amendment, the government must obtain a warrant. Electronic devices, now ubiquitous, have posed a challenge to the border search exception. The amount and type of data these devices contain mean that laws and concepts that applied to the tangible world do not neatly translate to the digital era.

--The New US Border Directive-- The new CBP Directive allows an Officer to carry out a warrantless "basic search" with or without suspicion of wrongdoing. The device's networking functions must be turned off, and the Officer is limited to data, the operating system, and software stored on the device itself. To carry out an "advanced" search an Officer must have reasonable suspicion of the commission of an offense, or a "national security" concern, and must obtain approval from higher officers. In an advanced search a device can be connected to tools that review, copy and/or analyze the data contained therein.

A traveler is obliged to present the device and the information it contains in a condition that allows inspection. An Officer can request assistance in presenting the device and information in a way that allows its inspection. Passcodes or other means of access may be requested and retained as needed to facilitate the examination, including for information on the device that is accessible through software applications on the device. If an Officer is unable to complete an inspection because the device is password protected or encrypted, the Officer may detain the device.

--The New New Zealand Legislation-- As part of my paper I also looked up the situation in New Zealand, only to find that we also passed new legislation on this earlier this year. Prior to this Act the courts had recognized a broad power regarding the searching of electronic devices. Contrary to the position of NZ Customs, which wanted simply to codify this broad power with clarifications, Cabinet has broken away from its Five Eyes partners by requiring reasonable suspicion of the commission of an offense to conduct even an initial search of an electronic device. To conduct a "full search" with forensic assistance, a Customs officer needs reasonable suspicion that evidence related to the commission of an offense is on the electronic device. Having raised this threshold for a search, Cabinet has made it an offense to fail, without reasonable excuse, to give access information (passwords etc.) to enable a Customs officer to carry out the search.

--To sum up-- In a nutshell, the US CBP has broad powers to conduct searches at the border. In my paper I argue that an increased suspicion threshold is required for an initial search, given the heightened privacy interests that people have in the data stored in their devices (whether put there intentionally or through sensor collection).

How Smartphones Hijack our Minds, WSJ Oct 6, 2017

Bug in Google Home - listening to everything Oct 11, 2017

Beacon tones - Rackmann v Colts Tech and Marketing Law Blog, Oct 4, 2017

U.S. Supreme Court to decide major Microsoft email privacy fight, Reuters, Oct 17, 2017 - whether US prosecutors can get access to emails stored on overseas servers.

-- RebeccaBonnevie - 11 Oct 2017

[Incomplete] How do parents protect the privacy environment of a digital native?

-- By RebeccaBonnevie - rewrite 31 March 2018

A child's privacy ought to be valued and protected. The protection falls primarily to the adults in her life. Unfortunately, due to convenience and ambivalence, digital natives are being born into already polluted privacy environments. Furthermore, the adults' actions enable the Machine to collect her data via their digital identities and their sharing on the child's behalf. To change this, privacy needs to be protected as a collective interest. Failing this at the societal level, parents must protect it themselves.

Lawrence Lessig, in "The Law of the Horse," suggests there are four modalities of regulation in real space and cyberspace: law, social norms, markets and architecture. Markets benefit from behavior collection, so regulation is unlikely to come from that modality. This leaves law, social norms and architecture to address the protection of the privacy environment.

Adapting Law: covering the greater privacy environment

The UNICEF 10-Point proposal on the e-rights of children includes, as number 6, "The right to withhold personal data on the Internet and to preserve their identity and their image from possible unlawful use." A California "online eraser" statute enacted in 2015 goes some way toward this. A platform provider must comply with a request from a person under the age of 18 to delete their posts. This law has some major limitations: it doesn't cover all online platforms; the information might be hidden from the public but may still be kept on a server; and the platform is not required to remove something posted by a third party about the person, including where a third party reposted the person's own post.

Law tends to focus on privacy as an individual right that affects only the individual if it is relinquished by express consent. Cambridge Analytica has shown that to be false. The French attempted to create consequences for parents oversharing, but this is a reactive rather than preventative approach. The law is not an effective modality to protect children's privacy environments.

Creating a social norm: Parents need to be educated

When I began this rewrite the challenge was how to convince parents that their actions in engaging with the Machine and sharing information about themselves and their children affect the children's privacy and allow them to be targeted. However, the past few weeks of revelations about Cambridge Analytica seem to have got us halfway there. The masses have been shocked and revolted at the revelation that Cambridge Analytica could obtain data about person A via a decision person B had made about themselves: how dare CA obtain data to analyze and target me without my knowledge, consent or control! This should awaken them to the action needed to preserve a child's privacy environment.

In New Zealand most expectant couples attend ante-natal classes about birth and having a newborn. At these classes, and through the health support system of midwives, hospitals and doctors, couples are given all sorts of information. From the first few weeks to vaccinations, from co-sleeping to breastfeeding, information is given about all aspects of a child's life. All aspects except their privacy. When I asked myself how a social norm to protect a child's privacy environment could be adopted, I thought of this as a vehicle. A child's privacy environment needs to be treated like any other aspect of the child's health: teach it alongside how to swaddle your newborn, and provide reading materials that emphasize the importance of the baby's privacy, handed over with the pamphlet on the different breastfeeding holds.

Change the architecture: secure sharing

Finally, the architecture must provide parents a means to share their cherub with family and friends without putting the child's datapoints up for collection. I would expect it to begin with privacy hygiene for the immediate family and expand to the mechanisms for sharing. My local early childhood center uses software called Storypark, which the center pays for but whose content the guardians of the child own. The guardians can invite family members to view the content and keep control of the account even when no longer associated with the early childhood center, and the terms and conditions say the personal data of the children will not be sold to anyone. This by itself is a good step, though of course it has a limitation if the email notification of new content lands in a Gmail account. It is, however, not public broadcasting, so it may not provide the affirmation that the Facebook generation needs to get the requisite dopamine high.


Privacy is a combination of three things: secrecy, anonymity, and autonomy. One of the problems with addressing it through law is that it is often conceptualized as an individual right rather than a collective one. In fact, my actions affect those around me, a fact made all too obvious by the Chinese system of social rating. Privacy should be approached in an environmental way, and the environment of the most vulnerable in our society should be protected by the rest of us. Like anything that involves societal change, the momentum for the movement takes time to build. Hopefully the knowledge of Cambridge Analytica will make parents open to counter-ideas about the privacy environment. I think the reality will be, as in so many areas, that one generation will do the damage, and the next generation will be left trying to clean up the mess and pull the train back onto the tracks.




r4 - 25 Apr 2018 - 02:21:44 - RebeccaBonnevie