Wanted a definition for near field and far field photometry
(OP)
Please could someone clarify what near field and far field photometry are and when they are appropriate. I am measuring luminaires and would like to understand what the differences are so I do not make erroneous reports.

RE: Wanted a definition for near field and far field photometry
TTFN
RE: Wanted a definition for near field and far field photometry
Photometry treats light as propagating in straight lines, like rays; diffraction is not considered. "Near field" and "far field" therefore mean something entirely different in photometry than they do in diffraction theory. As such, IRstuff has given the proper definition, and your prior impression was correct.
The best way to find the transition between near field and far field is to measure the received power as a function of distance from your luminaire. (Your detector must be much smaller than your luminaire; if it is not, put an aperture in front of it.) The far field begins at the distance where the power received by the detector starts to fall off as 1/r^2.
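As a quick sketch of what that distance scan looks like, here is the exact on-axis irradiance from a uniform Lambertian disk source, E(d) = pi * L_v * R^2 / (R^2 + d^2). The source radius R and luminance L_v below are made-up illustrative values, not numbers from this thread:

```python
import math

def irradiance(d, R=0.5, L_v=1000.0):
    """On-axis irradiance (lux) at distance d (m) from a uniform
    Lambertian disk of radius R (m) and luminance L_v (cd/m^2).
    Near field (d << R): E -> pi * L_v, roughly constant.
    Far field (d >> R):  E -> pi * L_v * R^2 / d^2, inverse square."""
    return math.pi * L_v * R**2 / (R**2 + d**2)

# Compare the exact value against the pure inverse-square prediction
# to see where the far-field approximation becomes good:
for d in [0.1, 0.5, 1.0, 2.5, 5.0]:
    exact = irradiance(d)
    inv_sq = math.pi * 1000.0 * 0.5**2 / d**2
    err = abs(inv_sq - exact) / exact  # relative error = R^2 / d^2
    print(f"d = {d:4.1f} m   E = {exact:8.2f} lx   1/r^2 error = {100*err:5.1f}%")
```

The relative error of the inverse-square fit is R^2/d^2, which is why the common rule of thumb puts the photometric far field at roughly five times the largest source dimension (about 4% error there).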
CV
RE: Wanted a definition for near field and far field photometry
Prior to that, each element of the source still obeys 1/r^2, but as you move back the detector sees more of the extended source, which compensates exactly for the 1/r^2 drop. This is the "constant brightness" law for extended sources.
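The cancellation can be made explicit with a toy model: a detector with a fixed angular field of view sees a source patch whose area grows as d^2 while the per-patch power falls as 1/d^2, so the product is constant until the field of view outgrows the source. All parameter values here (field of view, source radius, exitance) are illustrative assumptions, not from the thread:

```python
import math

def detected_power(d, half_angle=0.1, source_radius=0.5, exitance=1.0):
    """Power reaching a detector with a fixed angular field of view
    (half_angle, rad) at distance d (m) from a uniform disk source."""
    # Radius of the source region inside the field of view, capped at
    # the physical source radius once the whole source is visible.
    seen_radius = min(d * math.tan(half_angle), source_radius)
    visible_area = math.pi * seen_radius**2  # grows as d^2, then saturates
    return exitance * visible_area / d**2    # per-patch 1/d^2 falloff

for d in [1.0, 2.0, 4.0, 8.0, 16.0]:
    print(f"d = {d:5.1f} m   power = {detected_power(d):.4f}")
```

While the field of view lies entirely on the source, the d^2 growth in visible area cancels the 1/d^2 falloff exactly; once the whole source is in view, the visible area stops growing and the pure inverse-square regime takes over.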
TTFN