Some things we know we only know through instruments of different kinds. Microscopes and telescopes show us things that we could not observe with our biologically bounded powers of observation. We are now adding software to our set of scientific instruments, and there are things we can see in data sets only through the application of certain algorithms.
Our instruments present us with an interesting philosophical problem. The more advanced they become, the harder it is for us to explain exactly how we arrived at a given piece of knowledge. Our ability to go back and reverse engineer a result, and to point to exactly which step in a complex algorithm produced which output, is declining. So we end up with a growing black box at the center of our scientific project.
We can imagine a future society in which the absolute majority of what we consider common knowledge is produced inside that black box. And as data sets grow and systems become more complex, it becomes ever harder to work out how we arrived at what we say we know. The foundations of our knowledge will become more and more inaccessible.
As we push the boundaries of what we know further out, we seem to be creating a black box of ignorance at the center of our knowledge. So does that mean the following proposition holds true?
The more we know, the less we will know how we know what we know.