Apophenia is "the tendency to perceive meaningful connections between unrelated things." But I prefer the poetic rendition that this is what enables us to stare up at the clouds and see in them familiar objects floating in the sky. Perhaps I like that rendition because it's actually not a kind of pattern recognition I am good at - I rarely see a cloud as something else. I do see more patterns than average, though, and as the Wikipedia entry says: [link again] "Apophenia has come to imply a human propensity to seek patterns in random information".
If we learned anything from trying to build autonomous vehicles, it's that every human has amazing pattern recognition capabilities. At the same time, there are those whose "pattern recognition module" is just that bit more active. Some of them end up in jobs that use that capability (e.g. Futurist, Analyst, Forecaster, Doctor, Artist, Scientist, to name just a few), but not all. More active pattern recognition is a potential advantage, but at the same time it poses particular challenges.
Possibly the best fictional example comes in the medical drama House [link]. What we see through numerous episodes is the challenge of diagnosis - linking disparate pieces of information that indicate some underlying phenomenon. The main character is of course (it's a TV drama!) a genius at this kind of pattern recognition. But we also see some of the challenges. Sometimes a set of symptoms could point in more than one direction, i.e. it could be X or it could be Y. Other times, the set of symptoms is caused by more than one thing. Finally, and rarely (because it doesn't make good TV), the symptoms are not actually a guide to the problem - or maybe someone is just seeing a pattern in random data.
As an ex-Futurist I've spent a lot of time with this problem. When is a pattern significant? Or real? How do we walk the tightrope? And it is a tightrope - one response is to play it safe, but failing to act on a pattern can be as bad as acting on a misperceived one.
Some of the professions I have mentioned are regulated, and they have developed some particular approaches, notably sharing the decision with others and using safety margins. The safety margin for pattern matching is some external standard of confirmation. This works well for situations similar to those seen before, because you know where to look for confirmation. In new situations, of course, it may not work so well. There is also the issue that recognising patterns is very often most valuable at the early stages. By the time robust confirmation is available, everyone can see what is going on.
Sharing the decision with others can work very well: if no-one else can see the logic of what you are seeing, maybe that's a sign? However, there are a couple of problems here too. First of all, outside some large organisations you may not have someone in a similar role to share your findings with. If I have been watching the development of trading patterns and I think there is a crunch coming, someone with no familiarity with the trading system does not automatically provide a good check. The second problem is more subtle: communicating a disparate set of information about a complicated pattern can be very difficult. It's not easy to identify what information and context is held in common, and explaining everything can take a lot of time - which may not be available, especially if it is just a check. Another issue is that communicating context is something we as a society are not very good at. Especially in broad situations, we lack the words and scaffolding to make it easier to paint these pictures for each other.
These problems matter because, for those who are sensitive to patterns, a safety margin or the veto of others leaves a lot of situations untouched. The pattern was seen, but it was too diffuse to convince. And then it happens. And it can haunt you. Yet at the same time you become painfully aware, over time, that sometimes the pattern in the noise is just an accidental arrangement. You can flip a coin and get 10 heads in a row. It is rare - but it can happen, and it doesn't inherently mean the coin is biased. Sometimes things just line up, with no deeper cause or meaning.
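The coin example is worth making concrete. A minimal sketch (illustrative only - the numbers here are just the standard probability calculation plus a simulation, not from the original text):

```python
import random

# Probability of 10 heads in a row from a fair coin: (1/2)**10
p_ten_heads = 0.5 ** 10
print(f"P(10 heads in a row) = {p_ten_heads:.6f}")  # 1/1024, about 0.001

# Simulate many runs of 10 flips: even this "striking" pattern
# shows up regularly, given enough opportunities to see it.
random.seed(42)
trials = 100_000
all_heads = sum(
    all(random.random() < 0.5 for _ in range(10))  # one run of 10 fair flips
    for _ in range(trials)
)
print(f"Runs that came up all heads: {all_heads} of {trials}")
```

The pattern is rare in any single run, yet across many runs it is almost guaranteed to appear somewhere - which is exactly why a vivid coincidence is not, by itself, evidence of a biased coin.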
So we're left with a tightrope to walk - what can we do?
First of all we need to understand that we are operating at the edge. To keep an awareness that something we see may or may not be significant. An obvious step into trouble is to dismiss everything or to believe everything. If we deny the tightrope we deny our talent.
Second, we need to accept and defend the fact that this is a matter of judgement. We live in a world which likes simple rules and little automatic algorithms, but unusual patterns, especially those detected early, don't fit into this scheme. Testing them, or convincing others about them, might have some data-based components, but the case is always going to turn on a qualitative judgement about meaning and significance. Ironically, this brings us back full circle to AI and ML - these are machines that can find correlations anywhere, and judging the value of those correlations is a big coming challenge.
Third, we shouldn't dismiss safety margins and outside checks where they are practical or useful. But we also need to think more about ways of testing - experiments which can give us some information, but which are also safe even if they fail.
Finally, it's important to keep working on our understanding of the world in general, to keep finding, creating and updating models and mechanisms. To recognise a pattern is to make a prediction about what follows next. The interesting patterns occur in fields where we don't have easy proof (e.g. "the laws of physics"), but if we have a meaningful sense of how things work, we can not only judge the pattern's significance better (by seeing how its evolution compares with how we think things work) but also stand a much better chance of explaining it to others.