
I believe predictive processing theory - the latest thinking in cognitive science and neuroscience on how our cognition works - provides an immensely simplifying underlying theory and heuristic for managing desirable difficulty.

Predictive processing theory posits that our brain creates a prediction of what our senses will experience next from a generative model stored in long-term memory, in the context of what has just happened. We compare this prediction to our actual sensory input and generate an error if they don't match. If they do match, then our prediction was correct, the generative model used was successful, and it is reinforced in our LTM. If there is an error, it is used to correct the generative model in use. Successful adjustments that minimise future error are then the signal to update the generative model in LTM (i.e. learn).

If errors are low (low difficulty), then learning is low; if errors are too high, we get overloaded and lose information; if errors are just right (desirable difficulty), we learn best. So desirable difficulty is about having the right level of predictability and surprise. A toy sketch of this error-driven loop is included at the end of this comment.

Each individual will have varying past knowledge, experience, and generative models affecting how predictable things are for them. Each situation and context will also vary how predictable things are. Extraneous load means more things to predict - familiar things create less prediction error, unfamiliar things create more, and both contribute to the overall prediction error rate. As educators we want to get this overall prediction error rate into the "just right" range: high enough to be generating learning, but not so high that our prediction error bandwidth is overloaded. Shifting the framing from "difficulty" to "predictability" allows for a clearer understanding, and a clearer management task, to get into the "just right" range.

See my blog: https://predictablycorrect.substack.com/ - Adam W
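To make the loop concrete, here is a minimal sketch of the predict-compare-update cycle described above. Everything here is an illustrative assumption rather than part of the theory itself: the "generative model" is reduced to a single numeric estimate, and the learning rate and the `low`/`high` error thresholds that mark the "just right" band are arbitrary values chosen for the demo.

```python
import random

# Toy sketch of the predictive processing loop: predict the next sensory
# input from a generative model, compare it to the actual input, and use
# the prediction error both to classify difficulty and to update the model.
# All names, thresholds, and the learning rate are illustrative assumptions.

class GenerativeModel:
    def __init__(self, estimate=0.0, learning_rate=0.1):
        self.estimate = estimate          # model's guess at the next input
        self.learning_rate = learning_rate

    def predict(self):
        return self.estimate

    def update(self, error):
        # Error corrects the model; adjustments that reduce future error
        # stand in for the "learning" signal updating the model in LTM.
        self.estimate += self.learning_rate * error

def lesson(model, sensory_inputs, low=0.05, high=2.0):
    """Run one lesson; the error magnitude classifies the difficulty."""
    for actual in sensory_inputs:
        prediction = model.predict()
        error = actual - prediction       # prediction error
        if abs(error) < low:
            regime = "too easy: little learning"
        elif abs(error) > high:
            regime = "overload: information lost"
            error = 0.0                   # bandwidth exceeded, no useful update
        else:
            regime = "desirable difficulty: learning best"
        model.update(error)
        print(f"predicted {prediction:+.2f}, saw {actual:+.2f} -> {regime}")

model = GenerativeModel()
# Inputs that are only slightly surprising sit in the "just right" range,
# so the model's estimate converges as the lesson proceeds.
lesson(model, [random.gauss(1.0, 0.3) for _ in range(5)])
```

In this framing, an educator's job is tuning the inputs (not the learner) so that the error stream stays inside the middle band: predictable enough to avoid overload, surprising enough to force updates.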
