Q. Why does it seem like learning content is always going stale?
A. Because you can’t spell OLD without L&D ;)
But seriously, I think it’s simply because we don’t tend to track our “Best By” dates.
We could do that, y’know. We could totally capture how long we estimate a given learning experience is likely to remain valid AND we could choose to present that estimate to our learners upfront, before they even register for training. In the initial analysis we often do the first part anyway; we just don’t have the practice of presenting it to the people it impacts, for some reason I’ve never understood.
Just like with food, a Best By date does not guarantee that the contents have been handled properly. It is not a promise of goodness or of badness, it is an indication of iffy-ness that people can judge for themselves — when they have the info they need to do so.
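To make the idea concrete, here is a minimal sketch of what capturing a Best By date as metadata might look like. Everything here is hypothetical and illustrative: the `LearningExperience` record, its field names, and the one-year shelf life are my own assumptions, not a real system's schema.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class LearningExperience:
    title: str
    published: date
    estimated_shelf_life: timedelta  # how long we estimate the content stays valid

    @property
    def best_by(self) -> date:
        # Best By date = publication date plus the estimated shelf life
        return self.published + self.estimated_shelf_life

    def freshness_label(self, today: date) -> str:
        # Like a food label: an indication of iffy-ness, not a verdict
        if today <= self.best_by:
            return f"Best by {self.best_by.isoformat()}"
        return f"Past its best-by date ({self.best_by.isoformat()}); review before consuming"

# A hypothetical course with a one-year estimated shelf life
course = LearningExperience(
    title="Intro to Our Deployment Pipeline",
    published=date(2023, 1, 15),
    estimated_shelf_life=timedelta(days=365),
)
print(course.freshness_label(date(2023, 6, 1)))  # still within its window
print(course.freshness_label(date(2024, 6, 1)))  # stale; flag it for learners
```

The point of the sketch is that the label is cheap to compute once the estimate exists; the hard part, as the rest of this post argues, is getting people to keep supplying and surfacing that estimate consistently.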
Though I’ve been talking about this simple concept for years now, and helping organizations implement solutions around it, to my knowledge it hasn’t made much real difference to learners, because the people who could be providing consistent labeling don’t tend to keep doing it on their own. Metadata like this gets deemed optional and skipped after a while, risking what I call “learn poisoning” all over again. And after a bad case of that, can you blame people if they never want to consume the same kind of training again?
We don’t tend to forget such events. Our survival biases go crazy with assumptions and pick up a ton of false positives, as they have done for millennia (“avoid ALL courses at work!”). In a battle against the limbic brain’s evolutionary training, the humble training we make will lose. It deserves to.
A label can put us on the right side of human evolution and change the outcome. Not all at once, but over time, if we’re consistent, I still believe it can.
What do you think?
Would you be willing to try?
Or have you tried already? And how did that go?