A very thin lens has diameter D and focal length f. It obeys a strange imaging equation:
(1/f)^n = (1/s)^n + (1/s')^n
where s is the object distance and s' is the image distance, both measured from the lens, and n is a positive integer (an ordinary lens has n = 1).
Rays of light then arrive at the lens at a small angle θ from the lens axis (θ << 1), and a screen is placed behind the lens at the focal plane. Determine the radius of the patch of light that appears on the screen!
Use the expansion (1+x)^n ≈ 1 + nx + (1/2)n(n-1)x^2 for x << 1.
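One way to attack the problem, sketched under a loud assumption: treat each ray that crosses the lens at height y (|y| ≤ D/2) with slope θ as if it came from the on-axis point s = y/θ to which it extrapolates backwards, and image that point with the given equation. Then

\[
\left(\frac{1}{s'}\right)^{\!n} = \left(\frac{1}{f}\right)^{\!n} - \left(\frac{\theta}{y}\right)^{\!n}
\qquad\Longrightarrow\qquad
\frac{f}{s'} = \left[1 - \left(\frac{f\theta}{y}\right)^{\!n}\right]^{1/n} \approx 1 - \frac{1}{n}\left(\frac{f\theta}{y}\right)^{\!n},
\]

using the given binomial expansion to first order (with exponent 1/n) and assuming fθ/|y| << 1. The refracted ray heads for the axial point s', so it crosses the focal plane at height

\[
y\left(1 - \frac{f}{s'}\right) \approx \frac{(f\theta)^n}{n\,y^{\,n-1}}.
\]

For n = 1 every ray lands at height fθ, the familiar point image; for n > 1 the landing height depends on y, and the marginal rays y = ±D/2 land a distance 2^{n-1}(fθ)^n / (n D^{n-1}) from the axis while rays nearer the axis land farther out. How to turn this spread into a single "radius" depends on one's reading of the question, which, as the second comment below argues, is not entirely well posed.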
2 comments
Physics greetings from budakfisika...
Yes, no problem, what matters is that the physics content is easy to understand...
by budakfisika
There is a bit of a problem in the question...
Through the theory of cardinal points, merely from the assumption that a focus exists (i.e., assuming that parallel rays do converge to a single point), one can prove the formula 1/v + 1/u = 1/f.
So all of optics would fail if you assumed this relation for any lens; any assumption you take in the solution would yield back the equation with n = 1.
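For reference, here is a minimal sketch of the cardinal-point argument the comment invokes, assuming only that focal points F and F' exist with focal length f and that paraxial imaging is linear. Measure the object distance x from F and the image distance x' from F'. For an object of height h with image of height h', the ray through F and the ray parallel to the axis give, by similar triangles, two expressions for the magnification:

\[
\frac{h'}{h} = \frac{f}{x} = \frac{x'}{f}
\qquad\Longrightarrow\qquad
x\,x' = f^2 \quad (\text{Newton's relation}).
\]

Substituting u = x + f and v = x' + f then gives

\[
\frac{1}{u} + \frac{1}{v}
= \frac{x + x' + 2f}{(x+f)(x'+f)}
= \frac{x + x' + 2f}{f\,(x + x') + 2f^2}
= \frac{x + x' + 2f}{f\,(x + x' + 2f)}
= \frac{1}{f},
\]

where xx' = f^2 was used in the denominator. So the existence of a focus alone forces n = 1, as the comment says.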