If a cloud sitting at 6500 feet produces a raindrop whose terminal velocity is 25 feet per second, about how many minutes would it take the drop to land at sea level?
A good tip for solving this problem is to focus on dimensional analysis (i.e., pay close attention to the units) to make sure the answer comes out right.
The problem gives us a distance in feet and asks for a time in minutes, and the conversion factor provided is in feet per second. Dividing 6500 feet by 25 feet per second cancels the feet and leaves 260 seconds.
The last step is to divide 260 seconds by 60 seconds per minute, which gives about 4.3 minutes.
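For readers who want to check the arithmetic, here is a minimal Python sketch of the same unit-cancellation steps; the variable names are illustrative, not part of the original problem.

# Cloud altitude and raindrop terminal velocity from the problem statement.
altitude_ft = 6500          # feet above sea level
terminal_velocity_fps = 25  # feet per second

# feet / (feet per second) -> seconds: the feet cancel,
# just as the dimensional analysis suggests.
fall_time_s = altitude_ft / terminal_velocity_fps   # 260 seconds

# seconds / (seconds per minute) -> minutes
fall_time_min = fall_time_s / 60                    # 4.333... minutes

print(f"{fall_time_min:.1f} minutes")               # prints "4.3 minutes"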