One often gets the feeling that the steadily growing sentiment in the world is "the more, the merrier". Quantity beats quality, and information speaks in volumes. So goes the trend in computing: bigger really IS better. Pour enough data into the data lake, feed it into some black-box machine learning algorithms rigged with n-layer-deep semantic networks, and out come all kinds of conclusions that one could, and could not, possibly think of.

This development is partly a consequence of, and partly a driver behind, the viral spread of sensors and the corresponding rise in connectivity. Every conceivable thing is hooked up in the Internet of Things (IoT) age. Whatever is observable is now considered worth observing. Is this an AI renaissance, or its breakthrough? We have data, big time; it is Big Data time.

The early masters of scientific observation, such as Isaac Newton and Leonardo da Vinci, didn't have this luxury. They had to be smart, careful and alert, observing exactly what was needed in the process of discovery. They had to concentrate on that which could shed light on their suspicions, hunches and hypotheses while avoiding distractions. In the words of da Vinci: they had to learn how to see.