Category: privacy

Sharenting: parent blogging and the boundaries of the digital self – LSE Research Online:

We had Professor Sonia Livingstone on this week’s episode talking about where the right to tell your own story ends and the privacy rights of others begin. But you should check out her article on sharenting in full – it’s a fascinating read 🙂

This article asks whether “sharenting” (sharing representations of one’s parenting or children online) is a form of digital self-representation. Drawing on interviews with 17 parent bloggers, we explore how parents define the borders of their digital selves and justify what is their “story to tell.” We find that bloggers grapple with profound ethical dilemmas, as representing their identities as parents inevitably makes public aspects of their children’s lives, introducing risks that they are, paradoxically, responsible for safeguarding against. Parents thus evaluate what to share by juggling multiple obligations – to themselves, their children in the present and imagined into the future, and to their physical and virtual communities. The digital practices of representing the relational self are impeded more than eased by the individualistic notion of identity instantiated by digital platforms, thereby intensifying the ambivalence of both parents and the wider society in judging emerging genres of blogging the self.

Digital Human: Series 17, Ep 4 – Cameo

Social networking sites as virtual ‘showcases’:

A survey of Italian mothers who engage in ‘sharenting’ suggests they are motivated both by a desire for external validation and by more communitarian goals, such as sharing moments with distant relatives and seeking support. But while many mothers see it as their right to engage in sharenting, what implications does this have for children’s rights and privacy?

Digital Human: Series 17, Ep 4 – Cameo

BBC Radio 4 – Four Thought, Other People’s Stories:

A lunchtime listen for you guys: Dr Anna Derrig was on our show this week, but she goes deeper into the ethics of life writing in this episode of Four Thought. Well worth a listen if you’re going to write your life story.

Digital Human: Series 17, Ep 4 – Cameo

A cautionary tale about social media privacy:

A woman secretly photographed on a flight to Dallas has released a statement about how she has been shamed and harassed since a fictional romance about her went viral on social media.

Parts of a conversation she had with a fellow passenger on 3 July were overheard by actress and comedian Rosey Blair and her boyfriend, who documented their interpretation of it as an unfolding romance, which became known online as #PlaneBae.

But the woman says she has been hounded and doxxed – internet terminology for having one’s personal information revealed without consent.

“I did not ask for and do not seek attention. #PlaneBae is not a romance – it is a digital-age cautionary tale about privacy, identity, ethics and consent,” she said in a statement given to Business Insider by her lawyer.

Digital Human: Series 17, Ep 4 – Cameo

How brands are using emotion-detection technology – Econsultancy:

Market research within the film industry is usually qualitative, with data being manually collated from surveys, reviews, and post-screening responses.

Disney, however, has been using technology to determine how audiences enjoy its movies, specifically creating an AI-powered algorithm that can recognise complex facial expressions and even predict upcoming emotions.

The software – which involves a method called ‘factorised variational autoencoders’ (FVAE) – captured people’s faces using infrared cameras during movie screenings, including ‘The Jungle Book’ and ‘Star Wars: The Force Awakens’.

After just a few minutes of tracking facial behaviour, the algorithm was able to predict when individual viewers would smile or laugh at specific moments in the movies.
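
For the technically curious, here’s a toy sketch in Python/PyTorch of the factorisation idea (my own illustrative reconstruction, not Disney Research’s actual FVAE – the model sizes, variable names, and the landmark representation are all assumptions). Each viewer’s facial-landmark vector at each moment is decoded from the elementwise product of a per-viewer latent factor and a per-time latent factor, which is why a few minutes of observed reactions are enough to fit a viewer’s factor and extrapolate the rest of the screening:

# Toy factorised VAE, illustrative only (not Disney Research's code):
# reactions are decoded from the product of a per-viewer latent factor
# and a per-time latent factor, so fitting a new viewer's factor from a
# few minutes of footage lets the model extrapolate the whole screening.
import torch
import torch.nn as nn

class TinyFVAE(nn.Module):
    def __init__(self, n_viewers, n_steps, n_landmarks, d=8):
        super().__init__()
        # Factorised latent space: one Gaussian per viewer, one per time step.
        self.viewer_mu = nn.Embedding(n_viewers, d)
        self.viewer_logvar = nn.Embedding(n_viewers, d)
        self.time_mu = nn.Embedding(n_steps, d)
        self.time_logvar = nn.Embedding(n_steps, d)
        # Decoder maps the combined factor to facial-landmark coordinates.
        self.decoder = nn.Sequential(
            nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, n_landmarks))

    def sample(self, mu, logvar):
        # Reparameterisation trick: mu + sigma * epsilon.
        return mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

    def forward(self, viewer_idx, time_idx):
        zv = self.sample(self.viewer_mu(viewer_idx), self.viewer_logvar(viewer_idx))
        zt = self.sample(self.time_mu(time_idx), self.time_logvar(time_idx))
        return self.decoder(zv * zt)  # elementwise product = the factorisation

model = TinyFVAE(n_viewers=400, n_steps=9000, n_landmarks=68 * 2)
viewers = torch.randint(0, 400, (16,))   # which audience member
times = torch.randint(0, 9000, (16,))    # which moment in the film
landmarks = model(viewers, times)        # predicted facial landmarks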

Welp, that’s not creepy in the slightest… :/

Digital Human: Series 17, Ep 1 – Numb

Fairness, transparency, privacy:

Aims

Every day seems to bring news of another major breakthrough in the fields of data science and artificial intelligence, whether in the context of winning games, driving cars, or diagnosing disease. Yet many of these innovations also create novel risks by amplifying existing biases and discrimination in data, enhancing existing inequality, or increasing vulnerability to malfunction or manipulation.

There are also a growing number of examples where data collection and analysis risk exposing personal information or producing unwelcome decisions without explanation or recourse.

The Turing is committed to ensuring that the benefits of data science and AI are enjoyed by society as a whole, and that the risks are mitigated so as not to disproportionately burden certain people or groups. This interest group plays an important role in this mission by exploring technical solutions to protecting fairness, accountability, and privacy, as increasingly sophisticated AI technologies are designed and deployed.

Once your smart devices can talk to you, who else are they talking to? Kashmir Hill and Surya Mattu wanted to find out – so they outfitted Hill’s apartment with 18 different internet-connected devices and built a special router to track how often they contacted their servers and see what they were reporting back. The results were surprising – and more than a little bit creepy. Learn more about what the data from your smart devices reveals about your sleep schedule, TV binges and even your tooth-brushing habits – and how tech companies could use it to target and profile you. (This talk contains mature language.)
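
For a sense of how that kind of passive monitoring works, here’s a minimal sketch in Python using the scapy packet-sniffing library (my assumption for illustration – Hill and Mattu’s actual router setup was more involved, and the interface name below is a placeholder). It counts which domains each device on the network looks up over DNS:

# Minimal sketch: count DNS lookups per device on the local network.
from collections import Counter
from scapy.all import DNSQR, Ether, sniff

lookups = Counter()

def record(pkt):
    # Each DNS question reveals which server a device is phoning home to.
    if pkt.haslayer(DNSQR) and pkt.haslayer(Ether):
        domain = pkt[DNSQR].qname.decode(errors="replace").rstrip(".")
        lookups[(pkt[Ether].src, domain)] += 1

# Requires root privileges; "eth0" is a placeholder for your LAN interface.
sniff(iface="eth0", filter="udp port 53", prn=record, store=False, timeout=3600)
print(lookups.most_common(20))  # the chattiest device/domain pairs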

Digital Human: Series 15, Ep 1 – Jigsaw

You sent spit for private DNA analysis. How long before the police get it?:

In 2014, police in Idaho Falls, Idaho, were trying to solve a cold case from 1996, in which a young woman was murdered in her apartment. Police obtained DNA from the scene, but could not match it in criminal databases. So they went to a then-public database started by the Sorenson Molecular Genealogy Foundation, which held results for roughly 100,000 DNA tests and had recently been purchased by Ancestry.

That analysis and other matches led police to question a man named Michael Usry Jr., a New Orleans filmmaker. But after police took a sample of his DNA, they found – many weeks later – it did not match the sample found at the crime scene.

Murphy cites the false match as a cautionary tale. On the one hand, she said, DNA absolved Usry of murder. But before that, it put him under a cloud of suspicion for weeks. “Imagine what that would be like,” she said. “Imagine what that would mean if an employer, or a girlfriend, found out.”

Following that case, Ancestry put the Sorenson database behind a firewall and took other measures to tighten up its privacy policies, and other big consumer genetics companies followed suit. In response to the East Area Rapist case on Thursday, 23andMe went further than Ancestry by stating it is “our policy to resist law enforcement inquiries to protect customer privacy.”

Digital Human: Series 15, Ep 1 – Jigsaw

Cambridge Analytica scandal: Users shouldn’t have to leave social networks to protect themselves | View:

What can regular Facebook users do to safeguard their personal data other than deactivating their account?

The question for many people is really how they can exercise more control over what Facebook knows about them:

  • It’s important to remember that even deactivating Facebook doesn’t erase the history of data Facebook has already collected on you – and the more you used Facebook, the more data it collected.
  • One of the best ways to manage what data you’re handing over is to review what apps you have connected to your Facebook account. You can also think carefully about what information you post and how you use Facebook beyond actually looking at your timeline – for example, signing into other services through Facebook.
  • Concerned users can use Facebook minimally. For many of us, leaving Facebook is like leaving our social networks, but users can treat Facebook as a directory of contacts and choose to move more conversations offline. This is valuable because the most precious data Facebook harvests from you is behavioral. It’s all about our social networks, our friends, our hobbies, etc., and using that information to guess what our preferences are. We’re generally okay with advertising firms doing this when selling us shoes, but not when political campaigns are trying to influence our votes.

This is a reversal of the way users’ relationship with Facebook should work. Why should users extract themselves from their social networks to protect themselves and their data, especially when the data users feed Facebook is Facebook’s lifeline?

Digital Human: Series 15, Ep 1 – Jigsaw

Dr. Sandra Wachter is a lawyer and Research Fellow in Data Ethics, AI, robotics and Internet Regulation/cyber-security at the Oxford Internet Institute and the Alan Turing Institute in London, as well as a member of the Law Committee of the IEEE. She serves as a policy advisor for governments and NGOs around the world on regulatory and ethical questions concerning emerging technologies. Prior to joining the OII, Sandra worked at the Royal Academy of Engineering and at the Austrian Ministry of Health.

Digital Human: Series 15, Ep 1 – Jigsaw