A thin convex lens of focal length 25 cm is inserted between a luminous point object and a screen separated by 100 cm. The lens forms a sharp image of the object on the screen. When a thin glass plate is introduced between the object and the lens, the screen has to be shifted away by 50/23 cm to form a sharp image on the screen. If, instead of this, we put the same glass plate between the lens and the screen, by what distance (in cm) should the screen be shifted to form a sharp image of the object on the screen?
According to the given data, the object and the image are separated by 100 cm and the lens sits 50 cm from each of them. Since the object and image distances are equal, the lens is being used at twice its focal length, so f = 25 cm.
If a glass plate is placed between the object and the lens, the rays reaching the lens appear to come from a shifted source that is slightly closer to the lens than the original object.
The distance between the original and the apparent position of the source is known as the lateral shift. For a plate of thickness $t$ and refractive index $\mu$, the shift in the object's position is
$$s = t\left(1 - \frac{1}{\mu}\right).$$
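As a quick numeric illustration of this formula (the problem never specifies $t$ or $\mu$, so the values below are purely hypothetical), a 6 cm slab with $\mu = 1.5$ would produce exactly the 2 cm shift obtained below:

```python
def lateral_shift(t, mu):
    """Apparent shift s = t * (1 - 1/mu) produced by a slab of thickness t
    and refractive index mu (paraxial approximation)."""
    return t * (1 - 1 / mu)

# Hypothetical slab: 6 cm thick, mu = 1.5 (these values are not given in the problem).
print(lateral_shift(6.0, 1.5))  # -> 2.0 cm
```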
Writing the lens equation for the case when the plate is inserted between the object and the lens and the screen is shifted away by 50/23 cm,
$$\frac{1}{50 - s} + \frac{1}{50 + \frac{50}{23}} = \frac{1}{25},$$
which gives $s = 2$ cm.
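A short numerical check of this step (a sketch, not part of the original solution), using the thin-lens relation $\frac{1}{v} + \frac{1}{u} = \frac{1}{f}$ with $f = 25$ cm and the screen moved to $50 + \frac{50}{23}$ cm:

```python
from fractions import Fraction

f = Fraction(25)                     # focal length, cm
v = Fraction(50) + Fraction(50, 23)  # image (screen) distance after shifting away, cm

# Solve 1/v + 1/u = 1/f for the apparent object distance u = 50 - s.
u = 1 / (1 / f - 1 / v)
s = 50 - u
print(s)  # -> 2, i.e. the lateral shift is 2 cm
```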
Now, if the plate is instead placed between the lens and the screen, the slab acts on the converging rays and pushes the point of convergence away from the lens by the same lateral shift. The screen therefore has to be moved 2 cm farther from the lens, which is the required answer.
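To make the final step concrete (again only a sketch under the assumptions above): without any plate the image lies 50 cm behind the lens, and a slab placed in the converging beam moves that point back by the same shift $s$, so the screen must follow it.

```python
v_no_plate = 50  # image distance with no plate: lens used at 2f, cm
s = 2            # lateral shift t * (1 - 1/mu) found above, cm

# Slab between lens and screen: the convergence point recedes from the lens by s.
v_with_plate = v_no_plate + s
print(v_with_plate - v_no_plate)  # -> 2 cm, the distance the screen must be moved away
```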
Lenses
Lateral Shift