In 1913, Sir William Bragg and his son Sir Lawrence Bragg worked out a mathematical relation to determine inter-atomic distances from X-ray diffraction patterns. Bragg's law states that
“The intensity of the beam reflected from a crystal lattice at a certain angle will be maximum if the path difference between two waves reflected from two different planes is an integral multiple of the wavelength of the incident X-ray.”
If the path difference is nλ, where n = 1, 2, 3, …, the condition for maximum intensity becomes

2d sin θ = nλ

where d is the spacing between adjacent lattice planes and θ is the glancing angle. This relation is known as Bragg's law.
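As a quick numerical check of this relation, the sketch below solves Bragg's law for the glancing angle. The wavelength and spacing used (Cu K-alpha X-rays, λ ≈ 1.54 Å, and NaCl planes, d ≈ 2.82 Å) are illustrative values assumed here, not taken from the text above.

```python
import math

def bragg_angle(d, wavelength, n=1):
    """Glancing angle theta (in degrees) satisfying n*lambda = 2*d*sin(theta)."""
    s = n * wavelength / (2 * d)
    if s > 1:
        # sin(theta) cannot exceed 1, so no diffraction maximum exists
        raise ValueError("no diffraction: n*lambda exceeds 2d")
    return math.degrees(math.asin(s))

# Assumed illustrative values: Cu K-alpha X-rays on NaCl (both in angstroms)
theta1 = bragg_angle(d=2.82, wavelength=1.54, n=1)  # first-order maximum
```

The angle comes out near 16 degrees, a typical glancing angle in X-ray diffraction experiments.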
Derivation of Bragg’s Law
Consider parallel monochromatic X-rays incident at a glancing angle θ on two adjacent lattice planes of a crystal separated by an interplanar spacing d. One ray is reflected from an atom in the upper plane and the other from an atom in the lower plane directly beneath it.

According to the theory of interference, the reflected X-rays reinforce one another (giving maximum intensity) when the path difference between them is an integral multiple of the wavelength λ. The ray reflected from the lower plane travels an extra distance, partly before and partly after reflection.

In the right-angled triangle formed on the incident side by the perpendicular dropped from the point of reflection on the upper plane, the extra path before reflection is d sin θ.

In the right-angled triangle formed similarly on the reflected side, the extra path after reflection is also d sin θ.

From these, the total path difference is

d sin θ + d sin θ = 2d sin θ

For constructive interference this path difference must equal nλ, so

2d sin θ = nλ, where n = 1, 2, 3, …

This is known as Bragg's law.
The value of n gives the order of diffraction: n = 1 for the first-order maximum, n = 2 for the second order, and so on.
Lattice Planes or Bragg’s Planes
Bragg proposed that the atoms inside a crystal are arranged in different sets of parallel planes separated by a fixed distance. These atomic planes are called lattice planes or Bragg's planes.
By measuring the glancing angle (angle of diffraction) for a known wavelength of monochromatic X-rays and a particular order of diffraction, the spacing between the lattice planes can be calculated from Bragg's law.
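This measurement can be sketched as a short calculation: rearranging Bragg's law gives d = nλ / (2 sin θ). The measured angle and wavelength below are assumed illustrative numbers, not values from the text.

```python
import math

def lattice_spacing(theta_deg, wavelength, n=1):
    """Interplanar spacing d from Bragg's law: d = n*lambda / (2*sin(theta))."""
    return n * wavelength / (2 * math.sin(math.radians(theta_deg)))

# Assumed measurement: first-order maximum observed at theta = 15.0 degrees
# with lambda = 1.54 Å (illustrative values only)
d = lattice_spacing(theta_deg=15.0, wavelength=1.54, n=1)  # result in angstroms
```

With these numbers the spacing comes out close to 3 Å, the order of magnitude expected for inter-atomic distances in crystals.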
Bragg's law is applicable to X-rays because the inter-atomic spacing in crystals is of the order of a few angstroms and the wavelength of X-rays is of the same order. It is not valid for visible light, whose wavelength (roughly 4000 Å to 7000 Å) is thousands of times larger than the inter-atomic spacing.
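The failure for visible light can be seen directly from the law: since sin θ = nλ / (2d) must not exceed 1, a wavelength much larger than 2d leaves no possible diffraction angle. The spacing and wavelengths below are assumed typical values for illustration.

```python
# Assumed interplanar spacing, about 2.82 Å (illustrative value)
d = 2.82e-10  # metres

# Required sin(theta) for a first-order maximum: sin(theta) = lambda / (2d)
sin_xray = 1.54e-10 / (2 * d)   # typical X-ray wavelength, ~1.54 Å
sin_visible = 5.5e-7 / (2 * d)  # green visible light, ~5500 Å

# sin_xray is below 1, so a diffraction angle exists for X-rays;
# sin_visible is enormously greater than 1, so no angle can satisfy Bragg's law
```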