Two sound waves from a point source on the ground travel through the ground to a detector. The speed of one wave is 7.5 km/s, the speed of the other wave is 5.0 km/s. The waves arrive at the detector 15 s apart. What is the distance from the point source to the detector?
Let \(D\) be the distance (in km) from the point source to the detector. Then, using time = distance/speed, the time (in seconds) for the faster wave to reach the detector is \(D/7.5\) and the time for the slower wave is \(D/5\). Since the waves arrive 15 seconds apart, we have

\[
\frac{D}{5} - \frac{D}{7.5} = 15 \implies \frac{3D - 2D}{15} = 15 \implies D = 15 \times 15 = 225 \text{ km}.
\]
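As a sanity check, here is a minimal Python sketch that solves the same arrival-time equation numerically (variable names are illustrative, not from the original solution):

```python
# Solve D/v_slow - D/v_fast = dt for the source-detector distance D.
v_fast = 7.5  # km/s, speed of the faster wave
v_slow = 5.0  # km/s, speed of the slower wave
dt = 15.0     # s, difference in arrival times at the detector

# D/v_slow - D/v_fast = dt  =>  D = dt / (1/v_slow - 1/v_fast)
D = dt / (1.0 / v_slow - 1.0 / v_fast)
print(D)  # 225.0 (km)
```

Since \(1/5 - 1/7.5 = 1/15\), the division reproduces \(D = 15 \times 15 = 225\) km, matching the algebra above.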