Recently there have been technological advances that allow for 3-d imaging of objects up to kilometers away. This is done by bouncing laser light off an object: by measuring the time it takes the laser light to travel to the object and back, one can construct a detailed 3-d image of it. The obstacle has been getting enough of the light back to actually form an image, but new detectors are sensitive enough to overcome this limitation.
If I want to measure a 3-d object down to millimeter accuracy, to what accuracy in seconds must my detector be able to measure the time of arrival of the light that bounced off the object?
Remember... the light has to bounce there and back.
Details and assumptions
Assume the speed of light is $3 \times 10^8\ \text{m/s}$.
How can we measure the distance?
If you want to measure down to millimeter accuracy, then the accuracy of your timer has to equal the time taken for light to return from the object if it were 1 mm away. Since the light travels there and back, the round-trip path changes by 2 mm, so a total distance of 0.002 m has to be used to determine this.
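Here is a minimal Python sketch of that calculation (the function name `timing_resolution` and the rounded value $c = 3 \times 10^8\ \text{m/s}$ are my own choices for illustration):

```python
C = 3e8  # rounded speed of light in m/s, as assumed in the problem

def timing_resolution(depth_accuracy_m: float) -> float:
    """Timer resolution (s) needed to resolve a given depth (m).

    The pulse travels to the object and back, so the round-trip
    path changes by twice the depth difference.
    """
    return 2 * depth_accuracy_m / C

print(timing_resolution(1e-3))  # ~6.67e-12 s for 1 mm accuracy
```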
I answered 0.6e-11. Is it incorrect?
You need to give your answer to 3 significant figures, and it is always best to write it in the form 6.67e-12 rather than 0.667e-11, even though they are the same value.
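If you want to check the formatting programmatically, Python's `g` format specifier produces normalized scientific notation with a chosen number of significant figures (a small illustration of my own, not part of the original thread):

```python
t = 0.002 / 3e8    # round-trip time for 2 mm, in seconds
print(f"{t:.3g}")  # '6.67e-12' -- normalized, 3 significant figures
```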
For scientific notation $a \cdot 10^b$, the significand must satisfy $1 \le a < 10$. This is sometimes confused with the binary mantissa, written in binary in the form $1.\bar{m} \times 2^n$, which is commonly used in IEEE 754 floating point, electrical engineering, and numerical analysis in programming.
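To see the difference concretely, here is a small Python illustration of my own (assuming only the standard `math` module) contrasting the decimal significand with the binary mantissa of the same number:

```python
import math

x = 6.67e-12

# Decimal significand a with 1 <= a < 10, as in scientific notation a * 10^b
b = math.floor(math.log10(abs(x)))
a = x / 10**b
print(f"decimal significand: {a:.4f} * 10^{b}")  # 6.6700 * 10^-12

# Binary mantissa: math.frexp gives x = m * 2^n with 0.5 <= m < 1;
# doubling m and decrementing n gives the IEEE-style form 1.f * 2^(n-1)
m, n = math.frexp(x)
print(f"binary mantissa:     {2*m:.4f} * 2^{n-1}")  # ~1.8334 * 2^-38
```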
I forgot to consider that the light is reflected, so I got it wrong.
For a more general solution, you can show that this is the time difference between the light travelling to a distance $x$ and travelling to a distance $x + 10^{-3}$ m.
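A quick numerical sanity check of that claim (my own sketch, using the same rounded $c$): the round-trip time difference $\Delta t = 2\,\Delta x / c$ comes out the same for any baseline distance $x$.

```python
C = 3e8  # m/s

def round_trip_time(x: float) -> float:
    """Round-trip travel time to a target at distance x (m)."""
    return 2 * x / C

# The difference is always 2 * 1e-3 / C ~ 6.67e-12 s, whatever x is
for x in (1.0, 100.0, 5000.0):
    print(x, round_trip_time(x + 1e-3) - round_trip_time(x))
```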
In this question, you have to find out how long the light would take to travel from its source to the object and back again when the object is only 1 mm away. The speed of light is constant, so you can use:
$$\text{Speed} = \frac{\text{Distance}}{\text{Time}}$$
Rearranged for Time:
$$\text{Time} = \frac{\text{Distance}}{\text{Speed}}$$
As the speed is in $\text{m s}^{-1}$, you have to convert the total distance covered by the light into meters:
$$\text{Distance} = 0.002\ \text{m}$$
Put this into the equation for time:
$$\text{Time} = \frac{0.002}{300\,000\,000}$$
Therefore the required accuracy for the time is:
$$6.67 \times 10^{-12}\ \text{s}$$
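As a final check, here is a short Python snippet (my own addition) that reproduces the arithmetic and also shows that using the exact speed of light instead of the rounded $3 \times 10^8\ \text{m/s}$ does not change the answer to 3 significant figures:

```python
DISTANCE = 0.002       # round-trip path for 1 mm of depth, in meters
C_ROUNDED = 3e8        # value used in the solution above
C_EXACT = 299_792_458  # defined value of the speed of light, m/s

print(f"{DISTANCE / C_ROUNDED:.3g}")  # 6.67e-12
print(f"{DISTANCE / C_EXACT:.3g}")    # 6.67e-12 -- same to 3 sig figs
```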