All decisions we make as humans are subject to implicit bias: a distortion in our thinking process, a cognitive shortcut we take to reach a decision. It comes in many forms, but all have the consequence of error-prone decisions.
It can be difficult to avoid implicit bias as it often operates outside our immediate awareness, hence it is also known as unconscious bias.
We spoke to psychologist and scientist Pete Jones of Shire Professional Chartered Psychologists to discuss its potential influence on how scientific funding is awarded. Dr Jones is also working directly with ERC staff to ensure bias does not affect the granting process.
Can you give us an example of implicit bias?
The most prevalent form of unconscious bias is affinity bias. We have an inclination to prefer people who are similar to us on the basis of a wide range of characteristics, including social or career background, gender, accent, education, ethnicity, age, hobbies and interests. But there are many other biases.
Anchoring can result in relying too heavily on one piece of information or a personality trait when making decisions. For example, if an applicant has a degree from a particularly high quality institution, the qualification becomes the entire or main focus of the employment decision. Stereotyping can create judgements about individuals within particular social groups. Confirmation bias can mean that once we feel someone or a situation is a particular way, we seek out information to confirm it and ignore evidence to the contrary.
Research and measurement of bias
To understand how and when to try to avoid a particular bias, we first need to establish whether it actually exists and how strong it is.
In 1868 the Dutch physician Franciscus Donders tried to measure differences in human reaction time to infer differences in cognitive processing. Ultimately this could be used to measure bias by examining the speed with which bias was shown in the associated process of decision making – e.g. how quickly a subject would associate the words "man" with the word "scientist" and so exhibit gender bias. Technological limits of the time made it impossible to measure the milliseconds involved.
In 1950s America, unconscious bias was often discussed in terms of racial prejudice. Social psychologists such as Gordon Allport worked to define implicit bias, and work carried out in 1998 at Harvard created a popular way to measure the success or failure of action against it: the Implicit Association Test (IAT). This test became established as a metric of choice in clinical forensic psychology, but it is still subject to debate in academic circles due to inconsistent results.
Benchmarking data from previous studies is another way to estimate bias. One study suggests that 30% of academic staff have sufficient bias in the areas of ethnicity or disability to influence their behaviour.
Can we eliminate bias?
No, but a realistic aim is to find ways to mitigate bias and so avoid distortion in decision making. Research from Harvard found that the effects of personal interventions such as awareness raising are positive, but short-lived. Giving people strategies to mitigate bias proved more effective, resulting in a downward trend in displayed bias over a three-month period. It has been estimated that even one hour of awareness training, combined with understanding and putting mitigation strategies into practice, has a positive effect.
Systems used for decision making should also be adapted to minimise the chance that bias can influence the process. But ultimately we need to take a certain amount of personal responsibility. As with many things in life, it's down to us how we behave in practice.
Potential effects on a granting process
Implicit bias could initially affect who is encouraged to apply for scientific funding. To mitigate this, role modelling by a diverse range of people has proven important. Apart from measures to encourage applications from different groups, grant allocators also need to ensure that, once received, applications are treated equally.
Implicit bias could affect who is selected to consider which proposals should win grants, i.e. who qualifies to be a remote reviewer and who is selected to be an evaluator on a panel. It can affect which criteria reviewers are asked to use when evaluating a research project proposal and of course their actual behaviour and decision making while acting as peer reviewers.
Evaluators in a panel can react according to an affinity that has built up between them over the course of their work. They can be influenced by each other's opinions, for example if they have a great deal of professional respect for a fellow evaluator who is very well known in their field. If they have a personal preference for, or dislike of, a fellow evaluator, their judgement regarding that person's views can also be affected.
Avoiding bias in the granting process and grant lifecycle
At the ERC, the aim has always been to keep excellence as the sole project evaluation criterion. It therefore takes the challenge posed by unconscious bias very seriously and, as a preventative initiative, I was asked to train the ERC staff members dealing with scientific aspects of the granting process (the "Scientific Officers") on implicit bias. The intention is to provide them with the tools to detect and eliminate any potential bias during the evaluation of grant applications. This training will be followed by the production of an awareness-raising video for the external evaluators involved in the granting process.
This article is the start of a series on the ERC's efforts to mitigate bias in all aspects of the granting process.
Dr Pete Jones is a Chartered Psychologist and Chartered Scientist with a background in psychometrics and assessment. He is a former research manager at the UK Home Office. Pete is a leading UK applied subject matter expert on unconscious (implicit) bias. He has provided research, training and consultancy across the HEI sector, from Faculty Executive Boards and Student Union officers to researchers and support staff managers. He has delivered unconscious bias training for scientific institutes and societies. In the private sector he has provided research and consultancy services and trained board members and managerial staff in large companies. Pete is the developer of Implicitly®, an unconscious bias test platform and business metric for measuring unconscious biases and their likely impact on decisions and behaviour at work. Pete talks openly about his own biases and his attempts to mitigate their effects.