Definitions for "Uncertainty Principle"
In the world of quantum mechanics, there is an intrinsic uncertainty in determining the position and the momentum of a particle at the same time. This means that studying physics at small distances, where an accurate determination of the position is needed, requires probes of high momentum and hence high energy.
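As a rough illustration of that last point (a back-of-the-envelope sketch, not part of the definition above), a probe that resolves a distance scale d must carry momentum of order ħ/d, so the energy required grows as the scale shrinks:

```python
# Back-of-the-envelope sketch (illustrative assumption, not from the
# definition above): resolving a distance scale d needs a probe with
# momentum p ~ hbar/d, and a relativistic probe then has E ~ p*c = hbar*c/d.
HBAR_C_EV_NM = 197.327  # hbar*c in eV*nm (equivalently MeV*fm)

def probe_energy_ev(distance_nm: float) -> float:
    """Rough probe energy in eV needed to resolve a distance scale given in nm."""
    return HBAR_C_EV_NM / distance_nm

for label, d_nm in [("atomic scale (0.1 nm)", 0.1),
                    ("nuclear scale (1 fm)", 1e-6)]:
    print(f"{label}: ~{probe_energy_ev(d_nm):.3g} eV")
# atomic scale (0.1 nm): ~1.97e+03 eV  (keV-scale electrons or X-rays)
# nuclear scale (1 fm): ~1.97e+08 eV  (~200 MeV, accelerator energies)
```

This scaling is why probing nuclear and sub-nuclear distances requires particle accelerators.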
Principle of quantum mechanics, discovered by Heisenberg, that there are features of the universe, such as the position and velocity of a particle, that cannot be known simultaneously with complete precision. Such uncertain aspects of the microscopic world become ever more severe as the distance and time scales on which they are considered become ever smaller. Particles and fields undulate and jump between all possible values consistent with the quantum uncertainty. This implies that the microscopic realm is a roiling frenzy, awash in a violent sea of quantum fluctuations.
Heisenberg's uncertainty principle states that the uncertainty in the measured value of momentum multiplied by the uncertainty in the measured value of position is of the order of the reduced Planck constant, ħ = h/2π (more precisely, Δx·Δp ≥ ħ/2). Heisenberg originally motivated the principle with the idea that any measurement of a system must disturb it in some way, although the uncertainty is now understood to be intrinsic to quantum systems rather than merely a limitation of the measuring apparatus. The more precisely you measure the momentum, the less precisely the position can be determined, and vice versa. The principle is of fundamental importance in atomic and nuclear physics. A consequence of the principle is that, unlike in Newtonian mechanics, the exact behaviour of a system can never be predicted.
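To make the relation Δx·Δp ≥ ħ/2 concrete, here is a minimal numeric sketch; the confinement scale of 0.1 nm (roughly an atomic diameter) is an assumed, illustrative value rather than anything stated in the definition above:

```python
# Minimal numeric sketch of Delta_x * Delta_p >= hbar/2. The constants are
# standard CODATA values; the scenario (an electron confined to ~0.1 nm,
# roughly atomic size) is an illustrative assumption, not from the text above.
HBAR = 1.054_571_817e-34    # reduced Planck constant, J*s
M_E  = 9.109_383_7015e-31   # electron mass, kg

def min_momentum_uncertainty(delta_x_m: float) -> float:
    """Smallest momentum uncertainty (kg*m/s) allowed for a given Delta_x (m)."""
    return HBAR / (2.0 * delta_x_m)

delta_x = 1e-10                      # 0.1 nm
delta_p = min_momentum_uncertainty(delta_x)
delta_v = delta_p / M_E              # corresponding spread in velocity
print(f"Delta_p >= {delta_p:.2e} kg*m/s  ->  Delta_v >= {delta_v:.2e} m/s")
# Delta_p >= 5.27e-25 kg*m/s  ->  Delta_v >= 5.79e+05 m/s
```

This is one reason the principle matters so much in atomic physics: an electron confined to atomic dimensions cannot have an arbitrarily small spread in momentum, and hence cannot sit at rest.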