In 1687, Sir Isaac Newton, Lucasian Professor of Mathematics at Cambridge, published the Philosophiae Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy). It was a stunning achievement, destined to change the world forever. The Principia set out the principles of classical mechanics - the laws governing motion and gravity, from the orbits of astronomical bodies and the tides to the progression of the seasons and the behavior of everyday objects in motion.
The work would shape our view of the natural world for two and a half centuries. Its formulae describe an orderly universe, where objects follow strict, predictable laws of behavior with clockwork-like precision. The Principia helped usher in a new Age of Reason - a breathtakingly optimistic view that we could understand the deepest secrets of reality if we used the proper yardstick. For the first time in human history, we seemed to be the utter masters of the universe; incredibly complex phenomena like the movement of the planets and progression of the seasons could now be predicted using the simplest of mathematical formulae.
Then, in 1927, a brilliant young physics professor in Copenhagen released a paper that would rip away the illusion of comfortable predictability we had so long enjoyed. Matter, he demonstrated, acted in the strangest of ways.
With "On the Perceptual Content of Quantum Theoretical Kinematics and Mechanics" and subsequent writings, young Werner Heisenberg propelled the new field forward, helping physicists understand how stars ignite, why atoms don't implode, and why so-called "empty" space never truly is. His research would eventually earn him the Nobel Prize in Physics.
Over the previous ten years, renowned physicists like Heisenberg's mentor Niels Bohr and his contemporary Erwin Schrödinger had been working out the principles of this new field of quantum theory, which explains the behavior of matter on a subatomic scale.
Heisenberg would go on to demonstrate that nature is inherently uncertain, or "fuzzy", and that there are limits to what we can learn about quantum particles. At best, we can describe a subatomic particle's behavior and location only in terms of probabilities. We would also learn that energy does not flow continuously, but instead comes in discrete packets called quanta, and that these quanta can act like either a wave or a stream of particles, depending on how they are measured.
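The quantization described above is captured by Planck's relation, a standard formula (not quoted in this article) stating that each quantum of light carries an energy proportional to its frequency:

```latex
E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \text{J·s}
```

Here h is Planck's constant and \nu is the frequency of the light; the tiny size of h is why this graininess goes unnoticed in everyday life.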
Heisenberg had discovered a fundamental problem with quantum particle measurement: we can measure either the position (x) or the momentum (p) of a subatomic particle with precision, but never both at once. In fact, the more precisely we determine one property, the less precisely we can know the other.
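This trade-off is conventionally written as an inequality between the uncertainties in position and momentum; the modern standard form, using the reduced Planck constant ħ (added here for illustration, not quoted in the article), is:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi} \approx 1.055 \times 10^{-34}\ \text{J·s}
```

Because ħ is so small, the trade-off is utterly negligible for macroscopic objects, but it dominates at the scale of an electron.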
Objects in the macroscopic world become visible when photons bounce off them and travel to the retina at the back of your eye. There, a photon's energy triggers a molecular change in a light-sensitive protein called rhodopsin, and when enough of these changes accumulate, they produce a neural impulse that is delivered to the visual cortex of your brain.
However, to "see" the position of an electron or other subatomic particle, we must bounce a photon off it, and that collision can alter the electron's path, making it impossible to predict precisely where it will travel next. Likewise, the electron's rapid motion means its position changes the instant after it is measured. Either way, either the position or the momentum is measured inaccurately, disturbed by the very act of measuring it.
Because it is impossible to pin down an electron's momentum and position simultaneously, we describe electron orbits in terms of "probability clouds" - regions where an electron is likely to be found. In an atom, negatively charged electrons occupy these probability clouds around a nucleus containing positively charged protons.
Source: "What is Heisenberg's Uncertainty Principle?", Alok Jha, The Guardian, November 10, 2013, Guardian News and Media