In a Young's double slit experiment the minimum intensity is found to be non-zero. If one of the slits is covered by a transparent film which absorbs 10% of the light energy passing through it, then which of these is true? (There may be more than one.)
Intensity at maxima must decrease
Intensity at maxima may decrease
Intensity at minima may increase
Intensity at minima may decrease
The resultant intensity at a point on the screen is given by I_resultant = I₁ + I₂ + 2√(I₁I₂) cos θ, where θ is the phase difference between the two same-frequency waves arriving at that point from the two slits.
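As a quick numerical check of this formula (a sketch in Python; the slit intensities I₁, I₂ are arbitrary example values):

```python
import math

def resultant_intensity(i1, i2, theta):
    # I = I1 + I2 + 2*sqrt(I1*I2)*cos(theta)
    return i1 + i2 + 2 * math.sqrt(i1 * i2) * math.cos(theta)

i1, i2 = 4.0, 1.0  # example slit intensities (arbitrary units)
i_max = resultant_intensity(i1, i2, 0)        # cos θ = +1 at a maximum
i_min = resultant_intensity(i1, i2, math.pi)  # cos θ = -1 at a minimum
print(i_max, i_min)  # 9.0 1.0 : (√4 + √1)² = 9 and (√4 − √1)² = 1
```

Note that the extremes equal (√I₁ ± √I₂)², which is the form used in the argument below.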
Since the transparent film absorbs 10% of the light passing through one slit, I₂ decreases. The intensity at a maximum is (√I₁ + √I₂)², which therefore must be less than when the film is not there.
At a minimum the intensity is (√I₁ − √I₂)², which is non-zero, so I₁ ≠ I₂. If the covered slit was the brighter one, reducing its intensity brings the two amplitudes closer together and the minima decrease; if it was the dimmer one, the amplitude gap widens and the minima increase. So the minima may increase or decrease, and A, C, D are correct.
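The two cases for the minima can be checked numerically (a sketch in Python; the intensity values 4.0 and 1.0 are arbitrary example choices, not from the problem):

```python
import math

def extremes(i1, i2):
    # Maxima: (sqrt(I1)+sqrt(I2))^2 ; minima: (sqrt(I1)-sqrt(I2))^2
    a1, a2 = math.sqrt(i1), math.sqrt(i2)
    return (a1 + a2) ** 2, (a1 - a2) ** 2

absorb = 0.10  # film absorbs 10% of the energy through the covered slit

# Case 1: covered slit was the brighter one (I2 > I1)
_, min_before = extremes(1.0, 4.0)
_, min_after = extremes(1.0, 4.0 * (1 - absorb))
print(min_after < min_before)  # True: amplitudes move closer, minima decrease

# Case 2: covered slit was the dimmer one (I2 < I1)
_, min_before = extremes(4.0, 1.0)
_, min_after = extremes(4.0, 1.0 * (1 - absorb))
print(min_after > min_before)  # True: amplitude gap widens, minima increase
```

In both cases the maximum (√I₁ + √I₂)² goes down, which is why option A is a "must" while C and D are only "may".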