Social Question
Are human ways of thinking evolving as computer "thought" evolves?
Before Euclid, Pythagoras and the other founders of geometry contributed new ways of conceiving Place using planes, rectangles, squares, lines, triangles and circles, human thought was confined almost entirely to abstraction and philosophy. With the introduction of Geometry, we began to think in more precise terms about place, mapping the positions of the stars, the planets, the sun and the moon.
Human thinking had evolved. Without printing presses and advanced math, much of human wisdom had been locked in the minds of individual artisans and craftsmen. Their combined wisdom was great, but almost entirely unshared. There was no good way to share it. With the advent of Geometry, humans could share information about place in a powerful new way. We could calculate how to circumnavigate the Earth and which way to bow if we wished to bow to Mecca.
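The "which way to bow" problem is exactly the kind of question that geometry, extended to the sphere, answers: the initial great-circle bearing between two points. A minimal sketch, using the standard bearing formula and approximate city coordinates (the London and Mecca figures here are illustrative assumptions):

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

# Approximate coordinates (assumptions): central London and the Kaaba in Mecca.
qibla = initial_bearing(51.5074, -0.1278, 21.4225, 39.8262)
print(f"Qibla bearing from London: {qibla:.1f} degrees")  # roughly east-southeast
```

Nothing here is beyond spherical trigonometry the ancients developed; what changed was having a shared notation in which the answer could be written down and passed on.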
When Isaac Newton bent his mammoth mind to understanding the stars and planets, he wanted to define not just Place but Pace. To do this, he needed maths more powerful than geometry. He invented calculus. And with it came a whole new way of computing things accurately where only rough approximations had been possible before. Before electronic computing, it might take several weeks for human “computers” (yes, there used to be such a job description) to calculate the trajectory of a long-range artillery round. But once the table was written, it could be communicated to gunnery teams around the world. Humans were now able to talk and think more definitively about both place and pace.
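In the simplest drag-free case, Newton's mechanics collapses the artillery problem into a closed-form range formula, R = v² sin(2θ)/g, and a firing table becomes a short loop. A sketch of such a miniature table (air resistance is ignored, and the 100 m/s muzzle velocity is an illustrative assumption; real firing tables needed far heavier numerical work):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def range_m(velocity, elevation_deg):
    """Level-ground range of a drag-free projectile, in metres."""
    theta = math.radians(elevation_deg)
    return velocity ** 2 * math.sin(2 * theta) / G

# A miniature firing table for a hypothetical 100 m/s muzzle velocity.
for elev in (15, 30, 45, 60):
    print(f"{elev:2d} deg -> {range_m(100, elev):7.1f} m")
```

The table a human computer spent weeks on was, in essence, this loop evaluated by hand, entry by entry.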
But there are things we need to describe that are vastly more complex than the motions of celestial bodies or artillery shells. Biological processes, ecology, global economics and all the other processes of life are governed by such a bewildering array of forces that we truly don’t even know whether free will enters into human actions, or whether we are executing an enormously complex, but totally deterministic, program.
The human mind isn’t even well equipped for thinking about such things. We are reasonably adept at learning, but lousy at forgetting. We have no delete function. Once we think we know something, we tend to hold on to it doggedly. This served us well in adapting to our environment when we lived as hunter-gatherers, but it gets squarely in the way of understanding biological processes. To grasp the complexities of living, volitional systems interacting in large groups, we human computers were woefully handicapped. We needed another set of new maths, maths that humans can’t even master.
Only when electronic computing and the idea of iterative algorithms came along could we begin to truly understand and predict life processes. Computers are perfectly happy running trillions of calculations, erasing them all, and running a new simulation, storing only the result of each iteration and feeding it into the next run. Even the relatively simple Monte Carlo simulation would have been a monumental undertaking for human computers. But you can now download Monte Carlo simulations for business management and run them in Excel on your desktop computer.
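A toy Monte Carlo run makes the point: estimate π by scattering random points in a unit square and keeping only the running tally, forgetting each point the moment it is counted. A minimal sketch (the sample count and seed are arbitrary choices):

```python
import random

def estimate_pi(samples, seed=42):
    """Estimate pi from the fraction of random points inside the unit quarter-circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
        # Each point is discarded once tallied; only the running count survives.
    return 4 * inside / samples

print(estimate_pi(100_000))  # lands near 3.14
```

One hundred thousand throwaway calculations is an afternoon for a desktop machine and a career for a roomful of human computers.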
The Google Panda update, which used machine learning to hone an algorithm that simulates human reactions and forms human-like “opinions” about websites, would have been utterly impossible without networked supercomputers. At that level, humans can’t even fathom the final program that emerges from the iterative process the network of supercomputers uses. We concern ourselves primarily with the result of the computer simulation. Once completed, the process becomes irrelevant.
What is relevant is that we now have a way to define Pattern as well as Place and Pace. In the past, each step toward clarity of definition has taken us further from our original world, where human wisdom was held in the heads of the butcher, the baker and the candlestick maker. Each new step has changed how we think.
The cartwright is no longer king in his village. Wisdom is widely distributed and readily available. The Internet puts it before us all. How will this change the way we think? What will it do to the evolution of human thought? How long till we bend our algorithms to the improvement of DNA and accelerate actual evolution at the molecular level?