In December 2012 a homeless man named Lukis Anderson was charged with the murder of Raveesh Kumra, a Silicon Valley multimillionaire, based on DNA evidence. The charge carried a possible death sentence. But Anderson was not guilty. He had a rock-solid alibi: drunk and nearly comatose, Anderson had been hospitalized—and under constant medical supervision—the night of the murder in November. Later his legal team learned that his DNA had made its way to the crime scene via the paramedics who had arrived at Kumra’s residence. They had treated Anderson earlier the same day—inadvertently “planting” the evidence at the crime scene more than three hours later. The case, presented in February at the annual American Academy of Forensic Sciences meeting in Las Vegas, provides one of the few definitive examples of a DNA transfer implicating an innocent person and illustrates a growing opinion that the criminal justice system’s reliance on DNA evidence, often treated as infallible, actually carries significant risks.
As virtually every field in forensics has come under increased scientific scrutiny in recent years, especially those relying on comparisons such as bite-mark and microscopic hair analysis, the power of DNA evidence has grown—and for good reason. DNA analysis is more definitive and less subjective than other forensic techniques because it is predicated on statistical models. By examining specific regions, or loci, on the human genome, analysts can determine the likelihood that a given piece of evidence does or does not match a known genetic profile, from a victim, suspect or alleged perpetrator; moreover, analysts can predict how powerful or probative the match is by checking a pattern’s frequency against population databases. Since the mid-1990s the Innocence Project, a nonprofit legal organization based in New York City, has analyzed or reanalyzed available DNA to examine convictions, winning nearly 200 exonerations and spurring calls for reform of the criminal justice system.
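The statistical step described above can be sketched in a few lines. Under the standard "product rule," the random match probability of a profile is the product of its genotype frequencies at each independent locus (2pq for a heterozygote, p² for a homozygote, under Hardy–Weinberg assumptions). The loci and allele frequencies below are invented for illustration only, not real population data:

```python
def genotype_frequency(p, q, heterozygous):
    """Hardy-Weinberg genotype frequency from allele frequencies p and q."""
    return 2 * p * q if heterozygous else p * p

def random_match_probability(loci):
    """Multiply per-locus genotype frequencies across independent loci
    (the 'product rule') to estimate how rare a full profile is."""
    rmp = 1.0
    for p, q, het in loci:
        rmp *= genotype_frequency(p, q, het)
    return rmp

# Hypothetical profile typed at three loci: (p, q, heterozygous?)
profile = [
    (0.10, 0.20, True),   # heterozygous: 2pq = 0.04
    (0.15, 0.15, False),  # homozygous:  p^2 = 0.0225
    (0.05, 0.30, True),   # heterozygous: 2pq = 0.03
]

print(random_match_probability(profile))  # ~2.7e-05, i.e. about 1 in 37,000
```

The smaller this number, the more probative the match; real casework uses many more loci, so the probabilities become vanishingly small, which is precisely why DNA evidence is so often treated as conclusive.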
Like any piece of evidence, however, DNA is just one part of a larger picture. “We’re desperately hoping that DNA will come in to save the day, but it’s still fitting into a flawed system,” says Erin E. Murphy, a professor of law at New York University and author of the 2015 book Inside the Cell: The Dark Side of Forensic DNA. “If you don’t bring in the appropriate amount of skepticism and restraint in using the method, there are going to be miscarriages of justice.” For example, biological samples can degrade or be contaminated; judges and juries can misinterpret statistical probabilities. And as the Anderson case brought to light, skin cells can move.
Once your smart devices can talk to you, who else are they talking to? Kashmir Hill and Surya Mattu wanted to find out – so they outfitted Hill’s apartment with 18 different internet-connected devices and built a special router to track how often they contacted their servers and see what they were reporting back. The results were surprising – and more than a little bit creepy. Learn more about what the data from your smart devices reveals about your sleep schedule, TV binges and even your tooth-brushing habits – and how tech companies could use it to target and profile you. (This talk contains mature language.)
There is an unwitting mole amongst my friends. Without my permission, they passed my personal information to a Facebook app called “This Is Your Digital Life”, which eventually ended up in the hands of Cambridge Analytica, the company famed for using questionable tactics in an effort to influence election campaigns.
Facebook won’t say for certain exactly what happened, nor which friend was involved. Only 270,000 people ever used the This Is Your Digital Life (TIYDL) app, but Facebook estimates that data from 87 million people ended up in the hands of Cambridge Analytica this way.
As a result, Facebook’s boss Mark Zuckerberg spent last week being grilled by the US Congress. In the UK, a legal team is gathering claimants to take Facebook to court for mishandling their data. Where did it all go wrong?
Personal information can sound so vague, so let’s be specific. People who used the TIYDL app gave it permission to access their friends’ public Facebook profile pages, dates of birth, current cities and pages they had liked. Facebook also says that “a small number of people” gave permission to share their own timelines and private messages too, meaning that posts or correspondence from their friends would have been scooped up as well.
In May 2008, Facebook announced what initially seemed like a fun, whimsical addition to its platform: People You May Know.
“We built this feature with the intention of helping you connect to more of your friends, especially ones you might not have known were on Facebook,” said the post.
It went on to become one of Facebook’s most important tools for building out its social network, which expanded from 100 million members then to over 2 billion today. While some people must certainly have been grateful to get help connecting with everyone they’ve ever known, other Facebook users hated the feature. They asked how to turn it off. They downloaded a “FB Purity” browser extension to hide it from their view. Some users complained about it to the U.S. federal agency tasked with protecting American consumers, saying it constantly showed them people they didn’t want to friend. Another user told the Federal Trade Commission that Facebook wouldn’t stop suggesting she friend strangers “posed in sexually explicit poses.”
In an investigation last year, we detailed the ways People You May Know, or PYMK, as it’s referred to internally, can prove detrimental to Facebook users. It mines information users don’t have control over to make connections they may not want it to make. The worst example of this we documented is when sex workers are outed to their clients.
When lawmakers recently sent Facebook over 2,000 questions about the social network’s operation, Senator Richard Blumenthal (D-Conn.) raised concerns about PYMK suggesting a psychiatrist’s patients friend one another and asked whether users can opt out of Facebook collecting or using their data for People You May Know, which is another way of asking whether users can turn it off. Facebook responded by suggesting the senator see their answer to a previous question, but the real answer is “no.”
Facebook refuses to let users opt out of PYMK, telling us last year, “An opt out is not something we think people would find useful.” Perhaps now, though, in its time of privacy reckoning, Facebook will reconsider the mandatory nature of this particular feature. It’s about time, because People You May Know has been getting on people’s nerves for over 10 years.
What can regular Facebook users do to safeguard their personal data other than deactivating their account?
The question for many people is really how they can exercise more control over what Facebook knows about them:
It’s important to remember that even deactivating Facebook doesn’t erase the history of data Facebook has already collected on you, and the more you used Facebook, the more data it collected.
One of the best ways to manage what data you’re handing over is to review which apps you have connected to your Facebook account. You can also think carefully about what information you post and how you use Facebook beyond actually looking at your timeline – for example, signing into other services through Facebook.
Concerned users can use Facebook minimally. For many of us, leaving Facebook is like leaving our social networks, but users can treat Facebook as a directory of contacts and choose to move more conversations offline. This is valuable because the most precious data Facebook harvests from you is behavioral. It’s all about our social networks, our friends, our hobbies, etc., and using that information to guess what our preferences are. We’re generally okay with advertising firms doing this when selling us shoes, but not when political campaigns are trying to influence our votes.
This is a reversal of the way users’ relationship with Facebook should work. Why should users extract themselves from their social networks to protect themselves and their data, especially when the data users feed Facebook is Facebook’s lifeline?
Nearly two decades after 18-year-old Angie Dodge was brutally murdered in her Idaho Falls, Idaho, apartment, police were still hunting for the killer who left his DNA at the crime scene, while a man who did not match the DNA was serving a 30-year sentence for participating in the crime.
In 2014, police took a new and very controversial approach to try to find a match to that DNA. They searched a public DNA database owned by Ancestry.com, hoping to find someone related to Angie’s killer. They got a close enough match to make them think they had found the killer’s family tree – and there they found what they believed to be their man: a young New Orleans filmmaker who happened to have produced a short film about a girl’s brutal death.
But was he?
“Nobody ever thinks that they’re gonna get picked up by the police and taken into an interrogation room and questioned about a murder,” filmmaker Michael Usry Jr. told “48 Hours.” “When it happens to you, it’s definitely a game changer.”
Dr. Sandra Wachter is a lawyer and Research Fellow in Data Ethics, AI, robotics and Internet Regulation/cyber-security at the Oxford Internet Institute and the Alan Turing Institute in London, as well as a member of the Law Committee of the IEEE. She serves as a policy advisor for governments and NGOs around the world on regulatory and ethical questions concerning emerging technologies. Prior to joining the OII, Sandra worked at the Royal Academy of Engineering and at the Austrian Ministry of Health.
It’s the smartphone conspiracy theory that just won’t go away: Many, many people are convinced that their phones are listening to their conversations to target them with ads. Vice recently fueled the paranoia with an article that declared “Your phone is listening and it’s not paranoia,” a conclusion the author reached based on a five-day experiment in which he talked about “going back to uni” and “needing cheap shirts” in front of his phone and then saw ads for shirts and university classes on Facebook.
(For what it’s worth, I also frequently see ads for shirts on Facebook, but I’m past the age of the target audience for back-to-school propaganda.)
Some computer science academics at Northeastern University had heard enough people talking about this technological myth that they decided to do a rigorous study to tackle it. For the last year, Elleen Pan, Jingjing Ren, Martina Lindorfer, Christo Wilson, and David Choffnes ran an experiment involving more than 17,000 of the most popular apps on Android to find out whether any of them were secretly using the phone’s mic to capture audio. The apps included those belonging to Facebook, as well as over 8,000 apps that send information to Facebook.
Sorry, conspiracy theorists: They found no evidence of an app unexpectedly activating the microphone or sending audio out when not prompted to do so. Like good scientists, they refuse to say that their study definitively proves that your phone isn’t secretly listening to you, but they didn’t find a single instance of it happening. Instead, they discovered a different disturbing practice: apps recording a phone’s screen and sending that information out to third parties.
THE THREE MEN who showed up at Michael Usry’s door last December were unfailingly polite. They told him they were cops investigating a hit-and-run that had occurred a few blocks away, near New Orleans City Park, and they invited Usry to accompany them to a police station so he could answer some questions. Certain that he hadn’t committed any crime, the 36-year-old filmmaker agreed to make the trip.
The situation got weird in the car. As they drove, the cops prodded Usry for details of a 1998 trip he’d taken to Rexburg, Idaho, where two of his sisters later attended college—a detail they’d gleaned by studying his Facebook page. “They were like, ‘We know high school kids do some crazy things—were you drinking? Did you meet anybody?’” Usry recalls. The grilling continued downtown until one of the three men—an FBI agent—told Usry he wanted to swab the inside of Usry’s cheek but wouldn’t explain his reason for doing so, though he emphasized that their warrant meant Usry could not refuse.
The bewildered Usry soon learned that he was a suspect in the 1996 murder of an Idaho Falls teenager named Angie Dodge. Though a man had been convicted of that crime after giving an iffy confession, his DNA didn’t match what was found at the crime scene. Detectives had focused on Usry after running a familial DNA search, a technique that allows investigators to identify suspects who don’t have DNA in a law enforcement database but whose close relatives have had their genetic profiles cataloged. In Usry’s case the crime scene DNA bore numerous similarities to that of Usry’s father, who years earlier had donated a DNA sample to a genealogy project through his Mormon church in Mississippi. That project’s database was later purchased by Ancestry, which made it publicly searchable—a decision that didn’t take into account the possibility that cops might someday use it to hunt for genetic leads.
Usry, whose story was first reported in The New Orleans Advocate, was finally cleared after a nerve-racking 33-day wait—the DNA extracted from his cheek cells didn’t match that of Dodge’s killer, whom detectives still seek. But the fact that he fell under suspicion in the first place is the latest sign that it’s time to set ground rules for familial DNA searching, before misuse of the imperfect technology starts ruining lives.