Kim Albrecht visualizes cultural, technological, and scientific forms of knowledge. His diagrams are meant to unfold and question the structures of representation and explore the aesthetic of the intermingling of technology and society through the sensual knowledge of tracing information.
AI Senses visualizes sensor data of the machines that surround us to develop an understanding of how they experience the world. Today, machine learning and artificial intelligence are buzzwords. But they are more than that—they influence our behavior as well as our conception of the technologies themselves and the world they represent. A lack of understanding of how these systems operate on their own terms is dangerous. How can we live with, trust, and interact with this alien species, which we set forth into the world, if we know it only through interfaces designed to make the machine unnaturally akin to the world we already know? This project visualizes raw sensor data that our phones and computers collect and process, to help us understand how these machines experience the world.
Contemporary culture is unimaginable without the machines that surround us every day. Our knowledge is influenced by Google search results, our music taste by the mixes Spotify creates for us, and our shopping choices by Amazon recommendations. This strange new world became part of our reality in a very short time. Human-facing interface design makes these systems feel natural, as if they are really of our world. But if we want to live with these devices and understand them, we should not solely rely on the machines becoming something easily understandable to us. We need to develop an understanding of how these devices experience our world. The visualizations here explore a number of sensory domains: seeing, locating, orienting, hearing, moving, and touching. Rather than rendering the machine’s sensory data in ways that we intuitively grasp, however, these visualizations try to get closer to the machine’s experience. They show us a number of ways in which the machine’s reality departs from our own. With many of its sensors, for example, the machine operates on a timescale too fast for us to follow; the orientation sensor returns data up to 300 times per second. This is too quick to draw each value on the screen, and too quick for us to comprehend. In most cases, to make these visualizations, the machine had to be tamed and slowed for us to perceive its “experience.”
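That taming can be read as simple downsampling. A minimal sketch in Python, assuming a 300 Hz stream of orientation readings reduced to a drawable 30 Hz (the tuple layout, rates, and function name are illustrative, not the project’s actual pipeline):

```python
def tame(stream, source_hz=300, target_hz=30):
    """Slow a high-frequency sensor stream to a drawable rate
    by keeping only every n-th reading."""
    step = source_hz // target_hz  # here: keep 1 of every 10 readings
    return stream[::step]

# One simulated second of orientation data: (alpha, beta, gamma) angles.
second = [(i * 0.1, i * 0.2, i * 0.3) for i in range(300)]
drawable = tame(second)
print(len(drawable))  # 30 readings per second remain, slow enough to perceive
```

Averaging each window instead of discarding readings would be an equally plausible way to slow the stream; dropping samples is simply the most literal form of "taming."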
A second and more worrying finding is the similarity among many of the images. Seeing, hearing, and touching for humans are qualitatively different experiences of the world. They lead to a wide variety of understandings, emotions, and beliefs. For the machine, these senses are very much the same, reducible to strings of numbers with a limited range of actual possibilities. While some of these sensory experiences—notably temperature—have long been given numerical value, their effects on us remain ineffable. Nowadays, however, it is not only temperature that can be reduced to a discrete number, but seemingly anything. But is this really true? Isn’t there something that our current measures of temperature do not reveal about the entire spectrum, from crisp cold to feverishly hot? And is this a question of more data points, or is there a deeper disconnect, reflective of a difference in kind, in these translations? The entire orientation of a machine towards the world is mediated by numbers. For the machine, reality is binary—a torrent of on and off. Any knowledge about the world that we learn from the machine goes through this process of abstraction. As we become more dependent on our machines, we need to understand the underlying limits and boundaries of this abstraction.
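The flattening described above can be made concrete. In the sketch below (with made-up illustrative values, not data from the project), three different “senses” arrive in the machine as the same kind of object: a bounded sequence of numbers.

```python
# Illustrative values only: a row of pixels, a snippet of audio, a touch trace.
seeing   = [0, 127, 255, 64]              # 8-bit grayscale intensities
hearing  = [-0.42, 0.13, 0.99]            # amplitudes normalized to [-1, 1]
touching = [120.0, 540.0, 121.0, 542.0]   # screen coordinates in pixels

# To the machine the senses are interchangeable: each is just numbers
# within a limited range, whatever qualitative experience they encode.
for sense in (seeing, hearing, touching):
    assert all(isinstance(x, (int, float)) for x in sense)
    print(min(sense), max(sense))
```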