The skilful use of data to improve teaching and learning is not easy. It is important to be aware of some common mistakes and pitfalls that can hamper your ability to use data to transform your teaching and learning inquiry.
1. Viewing data according to cognitive biases
We all have cognitive biases which influence the way we see and interpret the world, including how we analyse and interpret data. When we engage with data, we rely on habits and short-cuts for meaning-making (called “cognitive biases”) that might blind us to potential meanings in the data that fall outside of these habitual understandings. When viewing data it is critical that we challenge or interrupt these cognitive biases, otherwise data use is unlikely to lead to changes in either beliefs or practice. The use of data for inquiry can be more powerful if teachers engage in ‘intentional interruption’ of these cognitive biases.
Here are five biases that frequently influence our thinking and decision making. Understanding these biases can help you to become a better user of data.
i. Confirmation bias: making data fit your existing view of the situation
We have a natural tendency to transform information to fit what we already believe to be true about the world, rather than using new information to restructure our understandings. This is called the “confirmation bias”: when we have an existing theory or hypothesis about something, we tend to look for things that confirm existing beliefs rather than challenge them. For example, we might look at data and pay more attention to evidence that supports our hypotheses than to evidence that challenges them. This acts to preserve the status quo and prevents learning.
ii. Recognition bias: valuing the known over the unknown
We tend to place greater value on things we recognise than on things we do not – this is called “recognition bias”. We might quickly make a decision or come to a conclusion based on something that is easily recognisable, while failing to notice other, less familiar messages in the data.
iii. The competency trap: interpreting current events and data through past experience
We tend to choose solutions based on what has worked in the past. However, these may not always apply to a new situation. This is the ‘competency trap’. It can also occur when we have limited information or limited expertise with an issue, forcing us to look to precedents and past solutions.
iv. Vividness bias: over-focusing on dramatic data results
When something is striking, or conjures up particularly vivid images, it tends to be over-emphasised in our minds. This is the vividness bias. An example is the way in which we are more worried about being in a plane crash than a car accident, even though the latter is more likely. This is because media coverage of plane accidents is far greater than that of car crashes, making them more vivid and more memorable.
v. False correlations: assuming two variables are related when they are not
Sometimes, because certain events are more vivid or stand out in our minds, we make illusory correlations between them. An example is the belief that “it always rains on weekends”, which develops because we focus on all the times when these things have coincided, and not on when they did not. The coincidence gets noticed when it happens, but not when it does not, and its vividness convinces us there is a relationship.
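The rain-on-weekends belief can be checked with a small simulation. The sketch below (illustrative only; the numbers and 0.3 rain probability are assumptions, not from this text) generates five years of days where rain is independent of the day of the week, then compares weekend and weekday rain rates. Counting all the days, not just the memorable ones, shows the two rates are essentially the same.

```python
import random

random.seed(42)

DAYS = 5 * 365
RAIN_PROB = 0.3  # assumed daily chance of rain, identical for every day

weekend_rain = weekend_days = 0
weekday_rain = weekday_days = 0

for day in range(DAYS):
    is_weekend = day % 7 in (5, 6)            # two days in every seven
    rained = random.random() < RAIN_PROB      # rain ignores the calendar
    if is_weekend:
        weekend_days += 1
        weekend_rain += rained
    else:
        weekday_days += 1
        weekday_rain += rained

weekend_rate = weekend_rain / weekend_days
weekday_rate = weekday_rain / weekday_days
print(f"rain rate on weekends: {weekend_rate:.2f}")
print(f"rain rate on weekdays: {weekday_rate:.2f}")
# Both rates sit close to 0.3: the "always rains on weekends" impression
# survives only if we count the hits and ignore the misses.
```

Working systematically through all the data, as recommended below, is exactly this kind of full count of hits and misses.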
How to overcome these tendencies towards bias:
- Develop an awareness of cognitive bias; this can help to interrupt it.
- Don’t cherry pick your data but work systematically through all the data.
- Purposefully seek and pay attention to disconfirming evidence.
- Spend time considering all possible directions in the evidence, rather than quickly proceeding with initial thoughts and feelings.
- Be open to what the evidence is suggesting, and to the need for change.
- Work with data in groups where members are comfortable to challenge each other.
2. Focusing on action rather than evidence
Many people equate inquiry with engaging in a course of action, and in the rush to “do something” spend little time gaining an in-depth understanding of the problem. There are two tendencies that need to be guarded against:
i. Jumping to solutions
We like to feel that we are accomplishing something, so we often take shortcuts in problem analysis and immediately start thinking about solutions. Experts, by contrast, often spend a long time mapping out the requirements of a problem before taking action. Spending too little time on problem analysis leads to a superficial understanding of the problem and time wasted pursuing less effective or wrong solutions. Choosing activities before understanding the problem creates an “activity trap”, in which activity, rather than understanding, becomes the focus.
How to overcome this tendency:
- Follow the steps of an inquiry cycle. (See spiral model)
- Use data to unpack and describe the problem under inquiry.
ii. Avoiding action
We can often be afraid to take a risk and change practice in case we make things worse for our students rather than better. Somehow, it seems as if doing nothing and continuing with usual practice is safer than making mistakes, and it is common to believe that the harm caused by taking action is worse than the harm caused by taking no action. It is also a mistake to believe that doing nothing is doing nothing: continuing with usual practice is still doing something, just nothing new.
How to overcome this tendency:
- Use data to see the consequences of continuing with the usual practice.
- Compare the preferable future (what you would like to happen) with the probable future (what you can expect if you maintain the status quo).
3. Limiting inquiry in order to protect relationships and status
For some people, the need to belong to and be esteemed by a group interferes with their capacity to engage in robust inquiry activities, such as asking probing questions, suggesting alternative ideas or offering hypotheses that contradict those of others in the group. However, a focus on preserving group harmony leads to superficial definition and exploration of problems, cognitive bias in reviewing the evidence, and inappropriate and ineffective solutions.
Another danger is to view problem solving as a competition and seek to prove yourself the group member with the correct hypothesis or best solution. Valuable energy is then spent advocating for your own proposals and opinions, and criticising those of others, rather than exploring problems and issues in depth and from a range of perspectives.
How to overcome this tendency:
- Spend time building relationships to support risk taking and honest dialogue in the group.
- Use clear processes to manage group behaviour, ensure equal participation and encourage professional critique.