Folklore is filled with cautionary tales about getting lost in the middle of a dark forest. This trope is not just a convenience for storytellers; it is a lesson about light scattering. The farther one goes into a forest, the more opportunities photons have had to hit a tree and bounce back out of the forest.
Suppose you find yourself in a dark forest that is illuminated from one side. The canopy is extremely thick, so no light penetrates from above. Moreover, the forest consists of cylindrical trees with average radius $\bar{r}$, randomly distributed with number density $\rho$ (i.e. 2 trees per square meter).
What fraction of the incoming light can you see, on average?
The idea here is simple: the farther one walks into the forest, the more trees have had a chance to stand between the edge of the forest and your position. If a tree stands at a depth $l$ into the forest, it blocks light for all positions $l' > l$.
The effect of all the trees compounds, so that very deep into the forest no light can penetrate at all, as illustrated in the diagram below.
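To make this compounding picture concrete, here is a minimal Monte Carlo sketch (not part of the original solution): it scatters trees at random with number density $\rho$, treats each trunk as an opaque strip of width $2\bar{r}$, and counts how many straight rays from the forest edge survive to a given depth. All numerical values ($\rho$, $\bar{r}$, the strip size, the number of rays) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not taken from the problem statement)
rho   = 2.0     # trees per square meter
r_bar = 0.05    # average trunk radius, meters
width = 200.0   # lateral extent of the simulated strip, meters
depth = 5.0     # maximum depth we look into the forest, meters

# Scatter trees uniformly over the strip with number density rho.
n_trees = rng.poisson(rho * width * depth)
tree_x  = rng.uniform(0.0, depth, n_trees)   # depth of each trunk
tree_y  = rng.uniform(0.0, width, n_trees)   # lateral position of each trunk

# Light rays enter at the forest edge (x = 0) and travel straight in +x.
n_rays = 2000
ray_y  = rng.uniform(0.0, width, n_rays)

# Depth of the first trunk that shadows each ray (inf if it is never blocked).
overlaps    = np.abs(tree_y[None, :] - ray_y[:, None]) < r_bar
x_if_block  = np.where(overlaps, tree_x[None, :], np.inf)
first_block = x_if_block.min(axis=1)

for l in (1.0, 2.0, 3.0, 4.0, 5.0):
    simulated = (first_block > l).mean()
    predicted = np.exp(-2.0 * r_bar * rho * l)
    print(f"depth {l:3.1f} m: simulated {simulated:.3f}, exp(-2*rbar*rho*l) = {predicted:.3f}")
```

Under these assumptions the simulated survival fraction tracks $e^{-2\bar{r}\rho l}$, the result derived below.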
On average there are $\rho$ trees in every square meter patch of forest, and each tree blocks light entering a $2\bar{r}$-wide strip of forest. We expect each step of length $\Delta l$ into the forest to reduce the illumination $L$ by a factor $(1 - 2\bar{r}\rho\,\Delta l)$, i.e.
$$\frac{\Delta L}{\Delta l} = -2\bar{r}\rho\,L.$$
This has the solution $L(l) = L_0\, e^{-2\bar{r}\rho l}$.
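As a quick check of the limiting step, the discrete rule $L \to L\,(1 - 2\bar{r}\rho\,\Delta l)$ can be iterated with smaller and smaller steps and compared with the exponential. The parameter values in this sketch are assumptions chosen only for illustration.

```python
import math

# Assumed illustrative values: density (trees/m^2), radius (m), depth walked (m)
rho, r_bar, l = 2.0, 0.05, 5.0
exact = math.exp(-2.0 * r_bar * rho * l)

for n_steps in (10, 100, 1000, 10000):
    dl = l / n_steps
    L = 1.0                                    # start from the full illumination, L0 = 1
    for _ in range(n_steps):
        L *= (1.0 - 2.0 * r_bar * rho * dl)    # one step of the difference equation
    print(f"{n_steps:6d} steps: L/L0 = {L:.5f}   exact exp(-2*rbar*rho*l) = {exact:.5f}")
```

As $\Delta l \to 0$ the iterated product converges to $e^{-2\bar{r}\rho l}$, which is the continuous solution quoted above.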
Therefore, we expect the illumination in the forest to drop off exponentially with the distance walked into the forest.
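In other words, the fraction of the incoming light still visible at depth $l$ is
$$\frac{L(l)}{L_0} = e^{-2\bar{r}\rho\,l}.$$
For a sense of scale, take the stated density $\rho = 2$ trees per square meter together with an assumed average trunk radius $\bar{r} = 0.05\ \text{m}$ (this radius is not given in the text above): the decay length is $1/(2\bar{r}\rho) = 5\ \text{m}$, so after walking $5\ \text{m}$ into the forest about $e^{-1} \approx 37\%$ of the edge illumination remains, and after $15\ \text{m}$ only $e^{-3} \approx 5\%$.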