Tech for Girls

October 11, 2021

On October 11, 2021 I was invited by the amazing @Adewunmi to talk about how to support children and families in navigating the data-centric tech landscape as part of #InternationalDayOfTheGirl with @TheABB, hosted by @cloudera. The panel was live-streamed on YouTube. But if you don't want to access YouTube, here is a rough script of my contribution to the panel discussion.


Topic 1 - Importance of women working in data-centric technologies

The importance of getting more women involved in the tech sector cannot be over-emphasised. The tech sector is no longer just providing the plumbing, such as the routers or hubs that get us connected, or a couple of programmers sitting in front of a monitor coding something up to manage the accounting of a small business or serve a few routine web pages. Today, the data-centric tech sector is far more sophisticated and affects our lives in a much more profound way.

The other day, on a school run, my children and I counted the connected digital devices we have at home, and we ended up with 26! That is probably well above the national average of nine for a UK household, but what are these devices? Apart from the usual smartphones, there are many other things we may have at home that are constantly connected to the Internet, collecting data and sending it off somewhere: smart speakers, smart meters, smart doorbells, and so on. Without any doubt we benefit a great deal from these technologies and the convenience they provide, saving on our bills, keeping us safe, or tracking our health. However, because these technologies are so embedded in our lives, we must get things exactly right; otherwise, the consequences can be severe and dramatic.

I am not just talking about stopping our devices from getting hacked or keeping our money in the bank safe. Our main concern today, shared by many computer scientists all over the world, is that computer code can be a direct reflection of our society: of what our society values and how it already operates. Therefore, it is critical for us to think about: 1) Who are the people behind the computer code? 2) How do they write this code? 3) Who are the executives defining what the code does and how it may affect our lives?

Let me give you two simple examples of how the computer code behind these technologies can have a dramatic effect on some populations in our society.

One widely discussed example is that when you search for curly hair on a search engine, the results can be dominated by pictures of Black African women, not men, and not Asian or Caucasian women. And if you search for Black girls, Latina girls, or Asian girls, which I would suggest you not try at home with your children beside you, the first page of results is often dominated by a particular type of content that is unfairly associated with girls of colour. As Professor Safiya Noble from UCLA, who is, by the way, my absolute hero and role model, puts it eloquently: “it tells us so much about how vulnerable communities and people are neglected and really not thought-about in the design of these algorithms.” (https://www.wnycstudios.org/podcasts/takeaway/segments/hidden-biases-search-engine-algorithms).

Another example comes from a research study we conducted in Oxford a few years ago to understand children’s experiences with algorithmic systems. In one of the studies, a Muslim girl described how she was always shown an anti-terrorism video on a certain video platform before she could watch the video she had come for. Until that day, she had no idea that this was not just a routine clip shown to everyone before a video on that platform; she had been the only one in her class exposed to such an experience.

These are not pleasant scenarios, but I believe they powerfully illustrate how computer code can reflect some deep societal issues. If the design of algorithms is not handled carefully and considered in a much bigger societal context, it will not only cause distress in our children’s online experiences but also keep them in a very unfair digital society in the long run. That is why we firmly believe that increasing diversity in this sector must be taken seriously and with all sincerity.

Topic 2 - The role of data/digital literacy and autonomy in girls' development

As a result of these observations, a few years ago I became involved in an initiative called 100+ Brilliant Women in AI and Ethics. We use this initiative as a platform to facilitate peer support among women of colour in the tech sector and related fields who care deeply about the societal issues surrounding technology. Our goal is to promote the involvement of women of colour so that their different perspectives can be heard much more strongly, and so that the power of such cooperation can make a difference in different parts of the globe.

Another amazing example I would like to share: a few years ago, a first-year DPhil student came into my office and described her interest in exploring cybersecurity from a feminist perspective. I have to admit, at the time, I was thoroughly perplexed. The student did not come from a traditional computer science background, but from a combination of law and sociology, with some tech literacy of course. However, I was amazed by her perspective and realised how much women’s best interests had been neglected in the design of cybersecurity mechanisms, and what dramatic consequences this can have. Her subsequent research on how the design choices of digital devices can actually exacerbate the risk of women being subjected to domestic abuse or online stalking has been the most illuminating research I have seen in years.

We are not saying that men or boys in this sector are making the wrong choices or following the wrong line of thinking. However, I hope these examples show the huge potential, and the dramatic difference it can bring, when more women are encouraged to get involved in this sector.

https://100brilliantwomeninaiethics.com

https://www.oii.ox.ac.uk/videos/reconfigure-feminist-action-research-in-cybersecurity-report-launch/

Topic 3: What can parents do?

· What do parents need to know?

· How can you talk to kids about this?


Parents are overburdened these days, but data literacy matters: it helps children form an informed understanding of the searches they do online and the recommendations they see on common video or game platforms.

In Oxford, we have done a lot of research into how aware parents and children are of the data manipulation behind the online systems they interact with. So far, our results have consistently shown that parents are predominantly concerned with keeping their children safe online and have limited support for talking about the data-centric nature of these platforms or how they affect their children.

We have also done a lot of research into supporting children’s attention management on social media platforms. Apart from trying to understand current behaviours from a psychological point of view, we hypothesise that a deeper understanding of how these platforms operate, such as how they recommend social connections or generate feeds, could make a great difference to how children perceive their interactions with them.

One thing I would recommend to parents is not to focus simply on the amount of screen time your child spends on their digital devices, but on what they do with that time. I gave an invited talk on this at the Oxford AI festival two years ago, and Professor Sonia Livingstone from the London School of Economics provides the best well-rounded evidence on the topic; the blog link can be found along with this discussion afterwards.

Another thing for parents to bear in mind is to tread carefully when applying existing parental control mechanisms to your children’s devices or digital activities. We have a recent paper analysing the current landscape of parental control and how these mechanisms are perceived differently by parents and children. It offers plenty of food for thought, and again, if anyone fancies an academic read, the link to the paper will be circulated.

For older children (8+), I would recommend parents pick a platform that their children use regularly and talk about the data landscape around it; the blog maintained by LSE offers a great set of resources: https://blogs.lse.ac.uk/parenting4digitalfuture/.

For even older children, there are brilliant educational websites, such as https://atozofai.withgoogle.com/ or https://www.unicef.org/globalinsight/featured-projects/ai-children.

It may be much harder to talk about data with much younger children, but the ebook from Cloudera offers a great start!