Robots Should Be Slaves:

Slaves are normally defined to be people you own. In recent centuries, due to the African slave trade, slavery came to be associated with racism and also with endemic cruelty. In the past though (and in some places still today) slaves were often members of the same race or even nation that had simply lost private status. This happened generally as an outcome of war, but sometimes as an outcome of poverty. Excesses of cruelty are greatest when actors are able to dehumanise those in their power, and thus remove their own empathy for their subordinates. Such behaviour can be seen even within contemporary communities of citizens, when a person in power considers their very social standing as an indication of a specialness not shared with subordinates. Our culture has for good reason become extremely defensive against actions and beliefs associated with such dehumanisation.

But surely dehumanisation is only wrong when it’s applied to someone who really is human? Given the very obviously human beings that have been labelled inhuman in the global culture’s very recent past, many seem to have grown wary of applying the label at all. For example, Dennett (1987) argues that we should allocate the rights of agency to anything that appears to be best reasoned about as acting in an intentional manner. Because the costs of making a mistake and trivialising a sentient being are too great, Dennett says we are safer to err on the side of caution.

Dennett’s position is certainly easy to be sympathetic with, and not only because such generosity is almost definitionally nice. As I discuss below, there are many reasons people want to be able to build robots that they owe ethical obligation to. But the position overlooks the fact that there are also costs associated with allocating agency this way. I describe these costs below as well.

But first, returning to the question of definition – when I say “Robots should be slaves”, I by no means mean “Robots should be people you own.” What I mean to say is “Robots should be servants you own.”

There are several fundamental claims of this paper:

Having servants is good and useful, provided no one is dehumanised.
A robot can be a servant without being a person.
It is right and natural for people to own robots.
It would be wrong to let people think that their robots are persons.

Digital Human: Series 15, Ep 5 – Subservience

Fairness, transparency, privacy:

Aims

Every day seems to bring news of another major breakthrough in the fields of data science and artificial intelligence, whether in the context of winning games, driving cars, or diagnosing disease. Yet many of these innovations also create novel risks by amplifying existing biases and discrimination in data, enhancing existing inequality, or increasing vulnerability to malfunction or manipulation.

There are also a growing number of examples where data collection and analysis risk oversharing personal information or delivering unwelcome decisions without explanation or recourse.

The Turing is committed to ensuring that the benefits of data science and AI are enjoyed by society as a whole, and that the risks are mitigated so as not to disproportionately burden certain people or groups. This interest group plays an important role in this mission by exploring technical solutions to protecting fairness, accountability, and privacy, as increasingly sophisticated AI technologies are designed and deployed.

Once your smart devices can talk to you, who else are they talking to? Kashmir Hill and Surya Mattu wanted to find out – so they outfitted Hill’s apartment with 18 different internet-connected devices and built a special router to track how often they contacted their servers and see what they were reporting back. The results were surprising – and more than a little bit creepy. Learn more about what the data from your smart devices reveals about your sleep schedule, TV binges and even your tooth-brushing habits – and how tech companies could use it to target and profile you. (This talk contains mature language.)
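
The article doesn't reproduce Hill and Mattu's setup, but a rough sketch of the kind of router-side monitoring they describe might look like the following Python script (it uses the scapy packet-capture library; keying on DNS lookups and the ten-minute capture window are assumptions for illustration, not their actual tooling):

    # Sketch: log which external hosts each device on the home network looks up,
    # and how often. Assumes this runs on the router (or another machine that can
    # see the LAN's traffic) with permission to sniff packets.
    from collections import Counter
    from scapy.all import sniff, DNSQR, IP

    lookups = Counter()  # (device IP, queried domain) -> number of lookups

    def record(pkt):
        if pkt.haslayer(DNSQR) and pkt.haslayer(IP):
            device = pkt[IP].src
            domain = pkt[DNSQR].qname.decode(errors="replace").rstrip(".")
            lookups[(device, domain)] += 1

    # Watch DNS queries for ten minutes, then print the chattiest device/server pairs.
    sniff(filter="udp port 53", prn=record, store=False, timeout=600)
    for (device, domain), count in lookups.most_common(20):
        print(f"{device} contacted {domain} {count} times")

Even a crude tally like this is enough to show how often a given device phones home, which is the pattern of contact the experiment set out to measure.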

Digital Human: Series 15, Ep 1 – Jigsaw

Survival in the digital age: Stephanie Hankey offers a guide to safe online activism.

Digital Human: Series 15, Ep 1 – Jigsaw

‘People You May Know:’ A Controversial Facebook Feature’s 10-Year History:

In May 2008, Facebook announced what initially seemed like a fun, whimsical addition to its platform: People You May Know.

“We built this feature with the intention of helping you connect to more of your friends, especially ones you might not have known were on Facebook,” said the post.

It went on to become one of Facebook’s most important tools for building out its social network, which expanded from 100 million members then to over 2 billion today. While some people must certainly have been grateful to get help connecting with everyone they’ve ever known, other Facebook users hated the feature. They asked how to turn it off. They downloaded a “FB Purity” browser extension to hide it from their view. Some users complained about it to the U.S. federal agency tasked with protecting American consumers, saying it constantly showed them people they didn’t want to friend. Another user told the Federal Trade Commission that Facebook wouldn’t stop suggesting she friend strangers “posed in sexually explicit poses.”

In an investigation last year, we detailed the ways People You May Know, or PYMK, as it’s referred to internally, can prove detrimental to Facebook users. It mines information users don’t have control over to make connections they may not want it to make. The worst example of this we documented is when sex workers are outed to their clients.

When lawmakers recently sent Facebook over 2,000 questions about the social network’s operation, Senator Richard Blumenthal (D-Conn.) raised concerns about PYMK suggesting a psychiatrist’s patients friend one another and asked whether users can opt out of Facebook collecting or using their data for People You May Know, which is another way of asking whether users can turn it off. Facebook responded by suggesting the senator see their answer to a previous question, but the real answer is “no.”

Facebook refuses to let users opt out of PYMK, telling us last year, “An opt out is not something we think people would find useful.” Perhaps now, though, in its time of privacy reckoning, Facebook will reconsider the mandatory nature of this particular feature. It’s about time, because People You May Know has been getting on people’s nerves for over 10 years.

Digital Human: Series 15, Ep 1 – Jigsaw

Dr. Sandra Wachter is a lawyer and Research Fellow in Data Ethics, AI, robotics and Internet Regulation/cyber-security at the Oxford Internet Institute and the Alan Turing Institute in London, as well as a member of the Law Committee of the IEEE. She serves as a policy advisor for governments and NGOs around the world on regulatory and ethical questions concerning emerging technologies. Prior to joining the OII, Sandra worked at the Royal Academy of Engineering and at the Austrian Ministry of Health.

Digital Human: Series 15, Ep 1 – Jigsaw

Domesday: on the record and on the road:

Although Domesday had a permanent home at Westminster, it did still travel occasionally. Medieval kings travelled a great deal around the kingdom, and there is evidence that, on occasion, Domesday (and other treasured documents) went with them. During the plague years in the reign of Elizabeth I, Domesday accompanied Exchequer officials who relocated temporarily to Hertford. And in September 1666 it was taken to Nonsuch to escape the Great Fire of London. Fire was again a menace in 1834, when much of the Palace of Westminster was engulfed in flames. The fire was caused by the burning of wooden tallies, notched pieces of wood that had been used in historical accounting procedures. Domesday was being kept in the Chapter House, and the keeper of the Chapter House, the historian and scholar Sir Francis Palgrave, asked the Dean of Westminster to be allowed to move Domesday and other historical records to the Abbey for safekeeping. Astonishingly, the Dean refused, saying that he first needed a warrant from the Prime Minister, Lord Melbourne. Fortunately the fire did not spread to the Chapter House and Domesday survived.

An accounting roll for the removal of the ‘Receipt of the Exchequer’ to Nonsuch because of the ‘late dreadful fire’ (AO 1/865/1).

This appalling fire was one of the factors used to promote the foundation of a new Public Record Office – an institution where government records could be brought together from the various places where they were stored, and kept safely and securely, with their conditions carefully monitored. This was certainly necessary for Domesday: a report to the Royal Commission on Public Records in the early 19th century states that Domesday had to be rebound as the wooden boards which protected it were being attacked by woodworm. The Public Record Office (PRO) was eventually founded in 1838, and Domesday moved in in 1859. (We can also note that the tallies which survived the Westminster conflagration were later also transferred to the new PRO where they could cause no further harm!)

Report on the rebinding of Domesday in 1819 due to danger from worms (PRO 36/7, p.237).

One might think that its arrival at the PRO would have put an end to Domesday’s peregrinations. But this was not quite the case. In the late 1850s the head of the Ordnance Survey Department, Sir Henry James, had developed a new photographic technique called photozincography and was determined to prove its worth by reproducing medieval documents. In 1861 he was able to convince the various officials with responsibility for Domesday, including the aforementioned Sir Francis Palgrave (by now Deputy Keeper of the PRO) to allow him to reproduce it using his new technique. This involved disbinding Domesday and taking it, a few counties at a time, to Southampton, where the folios were photozincographed in the open air on the South Downs. By 1863, the whole of Great and Little Domesday had been reproduced in this way. The whole enterprise was extremely expensive, and documents held at The National Archives include all sorts of wrangling about which government departments should pay for what. The project was supported throughout by the raising of subscriptions and by the sale of the bound volumes of the reproductions. Although taking this ancient record onto the South Downs for this escapade sounds rather reckless to us, the resulting photozincograph edition was a great achievement, and did much to bring Domesday to wider public attention.

Digital Human: Series 15, Ep 1 – Jigsaw

No, Your Phone Isn’t Secretly Recording You:

It’s the smartphone conspiracy theory that just won’t go away: Many, many people are convinced that their phones are listening to their conversations to target them with ads. Vice recently fueled the paranoia with an article that declared “Your phone is listening and it’s not paranoia,” a conclusion the author reached based on a 5-day experiment where he talked about “going back to uni” and “needing cheap shirts” in front of his phone and then saw ads for shirts and university classes on Facebook.

(For what it’s worth, I also frequently see ads for shirts on Facebook, but I’m past the age of the target audience for back-to-school propaganda.)

Some computer science academics at Northeastern University had heard enough people talking about this technological myth that they decided to do a rigorous study to tackle it. For the last year, Elleen Pan, Jingjing Ren, Martina Lindorfer, Christo Wilson, and David Choffnes ran an experiment involving more than 17,000 of the most popular apps on Android to find out whether any of them were secretly using the phone’s mic to capture audio. The apps included those belonging to Facebook, as well as over 8,000 apps that send information to Facebook.

Sorry, conspiracy theorists: They found no evidence of an app unexpectedly activating the microphone or sending audio out when not prompted to do so. Like good scientists, they refuse to say that their study definitively proves that your phone isn’t secretly listening to you, but they didn’t find a single instance of it happening. Instead, they discovered a different disturbing practice: apps recording a phone’s screen and sending that information out to third parties.
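
The study’s own pipeline isn’t described in this excerpt, but the screen-recording finding suggests the kind of check involved: inspect an app’s outbound traffic and flag anything that carries media. A minimal sketch, assuming the requests have already been captured (for example with an interception proxy) into simple records with a URL, a content type, and a raw body; the record format is an assumption, not the researchers’ actual tooling:

    # Sketch: flag outbound requests whose payload looks like audio, image, or
    # video data, either by declared content type or by magic bytes in the body.
    MEDIA_PREFIXES = ("audio/", "image/", "video/")
    MEDIA_SIGNATURES = {
        b"\x89PNG": "PNG image",
        b"\xff\xd8\xff": "JPEG image",
        b"RIFF": "WAV audio or AVI video container",
        b"\x1aE\xdf\xa3": "WebM/Matroska video",
    }

    def flag_media_uploads(requests):
        """requests: iterable of dicts with 'url', 'content_type', and 'body' (bytes)."""
        for req in requests:
            ctype = (req.get("content_type") or "").lower()
            body = req.get("body") or b""
            if ctype.startswith(MEDIA_PREFIXES):
                yield req["url"], f"declared media type {ctype}"
            else:
                for magic, label in MEDIA_SIGNATURES.items():
                    if body.startswith(magic):
                        yield req["url"], f"payload looks like {label}"
                        break

A request uploading a screen capture would show up in a check like this even if the app labels it as something innocuous, which is how media leaving the phone can be spotted in otherwise ordinary-looking app traffic.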

Digital Human: Series 15, Ep 1 – Jigsaw

Nervous Systems Interviews: Stephanie Hankey on Unfitbits:

Stephanie is so fascinating, I wish we could just follow her around with a mic for a day… but that might be just as creepy as the data harvesting we talked about in today’s episode… never mind.

Digital Human: Series 15, Ep 1 – Jigsaw

Human-like robots may have a disturbing impact on actual humans:

Stuart Russell, vice chair of the World Economic Forum Council on robotics and artificial intelligence, called for a “ban of highly human-like humanoid robots” during the Milken Institute’s panel titled “Artificial Intelligence: Friend or Foe?”

“We’re just not equipped in our basic brain apparatus to see something that’s perfectly humanoid and not treat it as a human being,” he said. “So in some sense, a humanoid robot is lying to us using the lower levels of our brain we don’t get to control.”

“Particularly for young children, growing up in a household where there are humanoid robots and humans, it could be extremely confusing,” he said. “And we could see psychoses developing as a result of machines not behaving as the child expects them to behave, because they think it’s a human.”

A study done by Japanese researchers actually found that children are likely to show “serious abusive behaviors” towards robots. The researchers concluded that the more human-like the robots looked (or the more they approached the uncanny valley), the more likely it was for kids to start beating them up.

Little kids might just be evil though… it’s a distinct possibility…

Digital Human: Series 13, Ep 3 – Visage