
Common mistakes in using data for inquiry and improvement

The skilful use of data for improvement is not easy. It is important to be aware of some common mistakes and pitfalls that can hamper your ability to use data to inquire into and improve your teaching practice.  

Viewing data according to cognitive biases 

We all have cognitive biases: habits and short-cuts for meaning-making that influence the way we see and interpret the world, including how we analyse and interpret data. When we engage with data, these biases can blind us to potential meanings that fall outside our habitual understandings. It is critical that we challenge or interrupt cognitive biases when analysing and interpreting data.  

There are five common forms of bias that influence our thinking and decision making. Understanding these biases can help you to become a better user of data. 

  1. Confirmation bias: This describes our natural tendency to interpret information to fit what we already believe rather than using new information to restructure our understandings. Confirmation bias occurs when we have an existing theory or hypothesis about something and look for evidence that confirms it rather than challenges it. We may even avoid evidence to the contrary. 
  2. Recognition bias: This causes us to value the known over the unknown and to place greater value on things we recognise than on things we do not. We might quickly make a decision or come to a conclusion based on something easily recognisable, while failing to notice other, less familiar messages in the data. 
  3. The competency trap: This involves interpreting current events and data through past experience. We tend to choose solutions based on what has worked before, even when they may not apply to a new situation. It can also occur when limited information or limited expertise with an issue forces us to rely on precedents and past solutions. 
  4. Vividness bias: This leads us to prioritise or over-emphasise striking or dramatic data results. For example, we worry more about being in a plane crash than a car accident, even though the latter is far more likely.  
  5. False correlations: This involves assuming two variables are related when they are not, particularly when certain events are vivid or stand out in our minds. An example is the belief that it always rains on weekends, which develops because we focus on the times it did rain and not on the times it did not, leading us to an illusory correlation between the two. 

There are a number of strategies you can use to overcome these tendencies towards bias. Start by developing an awareness of cognitive bias, which can help to interrupt it. Take care not to cherry-pick your data but work systematically through all the data you have gathered. Purposefully seek and pay attention to disconfirming evidence, and spend time considering all possible interpretations of the evidence rather than quickly proceeding with initial thoughts and feelings. Be open to what the evidence is suggesting and to the need for change. It can be useful to work with data in groups where members are comfortable to ask questions and to challenge each other. 

Focusing on action rather than evidence 

Another common mistake in using data is to treat inquiry as simply engaging in a course of action and, in the rush to start making changes to practice, to spend too little time gaining an in-depth understanding of the problem. This focus on action over careful problem diagnosis can undermine the success of the inquiry. There are two tendencies to guard against: 

  1. Jumping to solutions 

Spending too little time understanding a problem leads to superficial diagnosis and time wasted pursuing ineffective or wrong solutions. Choosing activities before understanding the problem creates an ‘activity trap’, in which activity rather than understanding becomes the focus. To overcome this tendency, follow the steps of an inquiry cycle and use data to fully unpack and describe the problem you are inquiring into before making changes to practice. 

  2. Avoiding action 

We can often be afraid to take a risk and change practice in case we make things worse for our students rather than better. It can seem as if doing nothing and continuing with usual practice is safer than making mistakes, and it is common to believe that the harm caused by taking action is worse than the harm caused by taking none. It is also a mistake to believe that doing nothing is doing nothing: you are still doing something, just nothing new. To overcome this tendency, consider the consequences of continuing with usual practice, and compare possible futures: what you would like to happen, and what you can expect if you maintain the status quo. 

Limiting inquiry in order to protect relationships and status 

For some people, the need to belong to and be esteemed by a group interferes with their capacity to engage in robust inquiry activities such as asking probing questions, suggesting alternative ideas or offering hypotheses that contradict those of others in the group. However, a focus on preserving group harmony can lead to superficial definition and exploration of problems, cognitive bias in reviewing the evidence, and inappropriate and ineffective solutions. 

Another danger is to view problem solving as a competition and to seek to prove yourself as the group member with the correct hypothesis or best solution. Valuable energy is then spent advocating for your own proposals and opinions and criticising those of others, rather than exploring problems and issues in depth and from a range of perspectives. You can overcome this tendency by spending time building relationships that support risk taking and honest dialogue in the group, and by using clear processes to manage group behaviour, ensure equal participation and encourage professional critique. 


By Dr Vicki Hargraves
