Dangers and advantages in the idea of implicit bias

Last night I was on a panel discussion around the theme of “Success: is it all in the mind? A discussion on women in science.”

On that panel we discussed the idea of implicit bias: that we can behave in ways that are prejudiced even if we believe ourselves to be without prejudice (or even anti-prejudice). Relevant examples might be: interrupting women more than men in meetings, filling departmental seminar or conference keynote slots with men rather than women, rating CVs from women as less employable and deserving of a lower salary, and so on.

The idea of implicit bias has both benefits and dangers for how we talk about bias. On the positive side, it gives us a mechanism for thinking about discrimination which isn’t about straightforward explicit prejudice. Sure, there are people who think “Women can’t do physics” or “Women shouldn’t work”, but implicit bias lets us talk about more subtle prejudice; it helps make visible the intangible feeling that, in a thousand different ways, life is different for members of different social groups. Relatedly, implicit bias lets us recognise that the line of division cuts through every one of us. It isn’t a matter of dividing the world into the sexists vs the feminists, say. Rather, because we’re all brought up in a world which discriminates against women, we acquire certain gender-prejudiced habits of thought. Even if that only means automatically thinking of a man when asked to imagine a scientist, that can have accumulating effects for women in science. Finally, thinking about implicit bias gives us a handle on what it might mean for an institution or a culture to be prejudiced. Again, without the need to identify individuals, implicit bias can help us talk about the ways in which we participate, or our organisation participates, in perpetuating discrimination. Nobody has to want people who are more likely to have childcare commitments to be excluded, but if your departmental meetings are always at 4pm then you risk excluding them.

But the idea of implicit bias can have a negative influence as well. We live in an age which is fascinated by the individual and the psychological. Just because implicit biases can be measured in individuals’ behaviour doesn’t mean that all problems of discrimination should be addressed at the psychological level of individuals. If one thing is clear about implicit bias, it is that the best approaches to addressing it won’t be psychological. This is a collective project: there is little or no evidence that ‘retraining’ individuals’ implicit biases works, and raising awareness, whilst important, doesn’t provide a simple cure. Approaching bias at an institutional or inter-personal level is more promising – things like tracking the outcomes of hiring decisions or anonymised marking have been shown to mitigate bias, or to insulate individuals from the possibility of bias.

Secondly, the way people talk about bias evokes a metaphor of our rational versus irrational selves which owes more to religion than it does to science. Implicit biases are often described as unconscious biases, even though the meaning of ‘unconscious’ here is unclear, and there’s plenty of evidence that people are aware, in some ways, of their biases and/or able to intervene in their expression. By describing bias as ‘unconscious’ we risk thinking of these biases as essentially mysterious – unknowable and unalterable (and from there the natural thought is: well, there’s nothing I can do about them). My argument is that biases are not some unconscious, extraneous process polluting our thinking. Rather, they are constitutive of our thinking – you can’t think without assumptions and shortcuts. And assumptions and shortcuts, while essential, also create systematic distortions in the conclusions you come to.

The idea of implicit bias helps us see prejudice in unexpected places – including our own behaviour. It also sets our expectations: there will be no magic bullet for addressing bias, and progress will probably be slow, because cultural change is slow. These are the good things about thinking about the psychology of bias. But although the psychological mechanisms of bias are fascinating, we must recognise the limitations of only thinking about individuals and individual psychology when trying to deal with prejudice, especially when that prejudice is embedded in far wider historical, social and economic injustices. Nor should we let the rhetoric of biases being ‘unconscious’ trick us into thinking that bias is unknowable, or something we cannot be held accountable for. There is no single thing to be done about discrimination, but things can be done.

Links & Endnotes:

The event was “Success: is it all in the mind? A discussion on women in science.”, organised by Cavendish Inspiring Women. The other panelists were Jessica Wade, Athene Donald and Michelle Ryan. My thanks to all the organisers and our chair, Stuart Higgins.

My thinking about bias is funded by the Leverhulme Trust, on a project led by Jules Holroyd. All my thinking about this has benefited from extensive discussion with her, and with the other members of that project (Robin Scaife, Andreas Bunge).

A previous post of mine about bias mitigation, which arose from doing training on bias with employment tribunal judges

A great book review: What Works: Gender Equality by Design, by Iris Bohnet, which says many sensible things but which risks describing bias as unconscious and therefore more mysterious and intractable than it really is

A good example of the risk of ‘psychologising’ bias: there are more police killings of black people than white people in the US, but that may reflect other injustices in society rather than straightforward racist biases in police decisions to shoot (and even if it did, it isn’t clear that the solutions would be to target individual officers). See also ‘Implicit Bias Training for Police May Help, but It’s Not Enough’.

A great discussion of the Williams and Ceci (2015) claim that “sexism in science is over”, and also here. See also ‘How have gender stereotypes changed in the last 30 years?’.

2015 review

Here’s a selective round-up of my academic year

Teaching: I taught my Cognitive Psychology course for the second time. It takes inspiration from MOOCs and ‘flipped classroom’ models, so I try to scaffold the lectures with a bunch of online resources and pre- and post-lecture activities. This year I added pre-lecture quizzes and personalised feedback for each student on their engagement. Based on thinking about my lecture discussions I wrote a short post on Medium, ‘Cheap tricks for starting discussions in lectures’ (the truth is, lectures are a bad place for starting discussions, but sometimes that’s what you have to work with). I rewrote my first year course on emergent models of mind and brain. It uses interactive Jupyter notebooks, which I’m very happy with. The lectures themselves show off a simple neural network as an associative model of memory, and the interactive notebooks mean that students can train the neural network on their own photos if they want. I also held an ‘intergenerational tea party’ every Thursday afternoon of autumn semester, where I invited two students I supervise from every year of the undergraduate course (and my PG students and post-docs). If you came to one of these, thanks – I’ll be doing it again next semester.
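For the curious, here is a minimal sketch of the kind of associative-memory demo described above (this is not the course notebook itself): a Hopfield-style network trained with Hebbian learning, which stores binary patterns and recalls them from noisy cues. Image loading is omitted, and the pattern sizes and names are invented for illustration.

```python
# A minimal sketch of a Hopfield-style associative memory (illustrative only,
# not the course notebooks): store +/-1 patterns, then recall from noisy cues.
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: the weight matrix is the averaged outer product of stored patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)              # no self-connections
    return W / patterns.shape[0]

def recall(W, cue, steps=10):
    """Iteratively update the state; it usually settles on the nearest stored pattern."""
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    memories = np.sign(rng.standard_normal((3, 100)))            # three random 100-unit patterns
    W = train_hopfield(memories)
    noisy = memories[0] * np.sign(rng.standard_normal(100) + 1.5)  # flip roughly 7% of units
    recovered = recall(W, noisy)
    print("overlap with stored pattern:", np.mean(recovered == memories[0]))
```

With student photos, the same idea applies after binarising each image into a +/-1 vector.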

Writing: I had a piece in the Guardian, The science of learning: five classic studies, as well as my regular BBC Future column, a few pieces for The Conversation, and some ad-hoc blogging as a minor player on the mindhacks.com blog. I self-published an e-book, ‘For argument’s sake: evidence that reason can change minds’, which was very briefly the 8th most popular experimental psychology e-book on Amazon (1 place behind ’50 sexting tips for women’).

Engagement: The year began with me on a sabbatical, which I spent at Folksy. Thanks to everyone there who made it such an enjoyable experience. I learnt more observing small business life in my home city than I think I would have in another Psychology department on the other side of the world. This year I was also lucky enough to do some work with 2CV related to a Transport for London brief on applying behavioural science to passenger behaviours, with Comparethemarket.com on understanding customer decisions and with McGraw-Hill Education on analysis of student learning. Our work on decision biases in court was also kindly mentioned on the UK parliament website, but I have to say that my getting-out-of-the-university highlight of the year was appearing in the promotional video for Folksy’s drone delivery project (released 1/4/2015).

Research: We rebooted interdisciplinary Cognitive Science activities at Sheffield with a workshop, several seminars and a mailing list for everyone to keep in touch. Kudos to Luca for helping to instigate these things.

Several existing grants kept me very busy:

Our Leverhulme grant on Bias & Blame continued with our investigation into the cognitive foundations and philosophical implications of implicit bias. The PI, Jules Holroyd, was awarded a prestigious Vice Chancellor’s Fellowship at Sheffield, so she’ll be a colleague in the new year as well as a collaborator (well done Jules!). As part of this project we pre-registered an experimental test of our core hypothesis, and this December Robin Scaife finished a heroic effort in data collection, so expect results on this in the new year. Pre-registration was an immensely informative process, not least because it made me finally take power analysis seriously (previously I just sought to side-step the issue). As a result of this work on decision making and implicit bias, I did training for employment tribunal judges on bias in decision making, during which I probably learnt more from them than they learnt from me.
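As an aside on the power-analysis point: here is a minimal sketch of the kind of a-priori calculation that pre-registration pushes you to do. The effect size, alpha and power below are illustrative assumptions, not the values from our study.

```python
# A minimal a-priori power analysis sketch (illustrative parameters, not the study's).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.3,  # assumed small-to-medium Cohen's d
                                    alpha=0.05,       # conventional significance threshold
                                    power=0.80)       # desired probability of detecting the effect
print(f"Participants needed per group: {n_per_group:.0f}")
```

The useful discipline is not the arithmetic itself but being forced to commit, in advance, to a smallest effect size of interest.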

We’ve been scanning at the York Neuroimaging Centre, as part of our project on ‘Neuroimaging as a marker of Attention Deficit Hyperactivity Disorder (ADHD)’. One of the inspirations for this project, Maria Panagiotidi, passed her PhD viva in November for her thesis ‘The Role of the Superior Colliculus in Attention Deficit Hyperactivity Disorder’. Congratulations to Maria, who goes on to work as a research psychologist for Arctic Shores in Manchester.

Funded by the Michael J Fox Foundation, we continued testing in Sheffield and Madrid, using typing as a measure of the strength of habitual behaviour in Parkinson’s Disease. For this grant the heroic testing efforts were performed by Mariana Leriche. For the analysis we are combining timing information (my specialty) with an information-theoretic analysis based on language structure. Colin Bannard (University of Liverpool) is leading on this part of the analysis, and working with him has been a great pleasure and immensely informative on computational linguistics.
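To give a flavour of what combining timing with an information-theoretic analysis of language structure can mean (the project’s actual pipeline, led by Colin, will differ), here is a rough sketch relating inter-keystroke intervals to simple bigram surprisal. The corpus, typed text and interval values below are all invented for illustration.

```python
# A rough sketch of the general idea only: relate inter-keystroke intervals (timing)
# to an information-theoretic measure of language structure (bigram surprisal).
# Corpus, typed text and interval data are invented; the real analysis will differ.
import math
from collections import Counter
import numpy as np

def bigram_surprisal(text, corpus):
    """Surprisal (in bits) of each character given the previous one, estimated from a corpus."""
    bigrams = Counter(zip(corpus, corpus[1:]))
    unigrams = Counter(corpus)
    surprisals = []
    for prev, char in zip(text, text[1:]):
        p = (bigrams[(prev, char)] + 1) / (unigrams[prev] + 27)  # add-one smoothing, ~27 symbols
        surprisals.append(-math.log2(p))
    return np.array(surprisals)

corpus = "the quick brown fox jumps over the lazy dog " * 50   # stand-in language model corpus
typed = "the brown fox"                                         # invented typing sample
intervals_ms = np.array([120, 135, 310, 140, 150, 160, 155, 290, 145, 150, 170, 165])

s = bigram_surprisal(typed, corpus)
print("correlation between surprisal and typing interval:",
      np.corrcoef(s, intervals_ms)[0, 1])
```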

Our students in the Sheffield Neuroeconomics network are approaching their final years. Angelo Pirrone and I have been working with James Marshall in Computer Science on perceptual decision making and on fitting models of decision making.
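The post doesn’t name the specific models we fit, but the drift-diffusion model is a standard model of perceptual decision making, so here is a minimal simulation sketch with illustrative parameters (not a description of Angelo’s actual fitted models).

```python
# A minimal drift-diffusion simulation (illustrative parameters; not the project's fitted models).
import numpy as np

def simulate_ddm(drift=0.3, threshold=1.0, noise=1.0, dt=0.001, max_t=3.0, rng=None):
    """Accumulate noisy evidence until it hits +threshold or -threshold (timeouts count as lower)."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("upper" if x >= threshold else "lower"), t

rng = np.random.default_rng(1)
trials = [simulate_ddm(rng=rng) for _ in range(2000)]
upper_rts = np.array([t for choice, t in trials if choice == "upper"])
print(f"proportion 'upper' choices: {sum(c == 'upper' for c, _ in trials) / len(trials):.2f}")
print(f"mean 'upper' response time: {upper_rts.mean():.3f} s")
```

Fitting works in the other direction: adjust drift, threshold and noise until simulated choices and response times match the observed ones.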

That’s not all, but that is all for now. The greatest pleasure of the year has been all the people I’ve had a chance to work with; students, colleagues and collaborators. Everything I have done this year has been teamwork. So apologies if you’re not mentioned above – it is only due to lack of space, not lack of appreciation – and my best wishes for 2016.

Crowdsourcing analysis, an alternative approach to scientific research: Many Hands make tight work

Guest Lecture by Raphael Silberzahn, IESE Business School, University of Navarra

11:00 – 12:00, 9th of December, 2015

Lecture Theatre 6, The Diamond (32 Leavygreave Rd, Sheffield S3 7RD)

Is soccer players’ skin colour associated with how often they are shown a red card? The answer depends on how the data is analysed. With access to a dataset capturing the player-referee interactions of premiership players from the 2012-13 season in the English, German, French and Spanish leagues, we organised a crowdsourced research project involving 29 different research teams and 61 individual researchers. Teams initially exchanged analytical approaches — but not results — and incorporated feedback from other teams into their analyses. Despite this, the teams came to a broad range of conclusions. The overall group consensus (that a correlation exists) was much more tentative than would be expected from a single-team analysis. Raphael Silberzahn will provide insights from his perspective as one of the project coordinators, and Tom Stafford will speak about his experience as a participant in this project. We will discuss how smaller research projects, too, can benefit from bringing together teams of skilled researchers to work simultaneously on the same data, thereby balancing discussion and providing scientific findings with greater validity.

Links to coverage of this research in Nature (‘Crowdsourced research: Many hands make tight work’), and on FiveThirtyEight (‘Science Isn’t Broken: It’s just a hell of a lot harder than we give it credit for’). Our group’s analysis was supported by some great data exploration and visualisation work led by Mat Evans. You can see an interactive notebook of this work here
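For a flavour of the kind of analysis the teams ran, here is one illustrative sketch only: a simple logistic regression of red cards on a skin-tone rating, using invented data. The teams’ actual models ranged much more widely (logistic, Poisson, Bayesian multilevel models and more), and the real dataset is not reproduced here.

```python
# One illustrative analysis sketch on invented data; the 29 teams used many different models,
# and this is not the project's dataset or any team's actual specification.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 5000                                   # invented player-referee dyads
skin_tone = rng.uniform(0, 1, n)           # invented rating from 0 (light) to 1 (dark)

# Invented generative process with a small positive association, for illustration only.
p_red = 1 / (1 + np.exp(-(-4.5 + 0.3 * skin_tone)))
red_card = rng.binomial(1, p_red)

X = sm.add_constant(skin_tone)             # intercept plus skin-tone predictor
model = sm.Logit(red_card, X).fit(disp=False)
print(model.summary())
```

Even on shared data, choices like which covariates to include and how to handle repeated player-referee pairings can push estimates in different directions, which is exactly the point of the talk.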

 

Event: Crowdsourcing Psychology Data – Online, Mobile and Big Data approaches

Smart phones, social media and networked sensors in everything from trains to toasters – the spread of digital technology creates new opportunities for cognitive scientists. Collecting and analysing the resulting “big data” also poses its own special challenges. This afternoon of talks and discussion is suitable for anyone curious about novel data collection and analysis strategies and how they can be deployed in psychological and behavioural research.

Time: 1pm-5pm, 11th of November 2014

Venue: Department of Psychology, University of Sheffield

We have four speakers followed by a panel discussion. Our speakers:

Martin Thirkettle: “Taking cognitive psychology to the small screen: Making a research focussed mobile app”

Developing a mobile app involves balancing the needs of a number of parties – researchers, funders, ethics committees, app developers, not to mention the end users. As the Open University’s “Brainwave” app, our first research-focussed cognitive psychology app, nears launch, I will discuss some of the challenges we’ve faced during the development process.

Caspar Addyman: “Measuring drug use with smartphones: Some misadventures”

Everyday drug use and its effects are not easily captured by lab or survey-based research. I developed the Boozerlyzer, an app that let people log their alcohol intake and their mood, and play simple games that measured their cognitive and emotional responses. Although this had its flaws, it led to an NHS-funded collaboration to develop a simple smartphone tracker for Parkinson’s patients. Which was also problematic…

Robb Rutledge: “Crowdsourcing the cognitive science of decision making and well-being”

Some cognitive science questions can be particularly difficult to address in the lab. I will discuss results from The Great Brain Experiment, an app that allowed us to develop computational models for how decision making changes across the lifespan, and also how rewards and expectations relate to subjective well-being.

Andy Woods: “[C]lick your screen: probing the senses online”

We are at the cusp of some far-reaching technological advances that will be of tremendous benefit to research. Within a few short years we will be able to test thousands of people from any demographic with ‘connected’ technology every bit as good as what we use in our labs today — indeed perhaps better. Here I discuss on-web versus in-lab testing, predicted technological advances, and issues with online research.

Tickets are free and available here.