(Image source: https://www.thorlabs.de/newgrouppage9.cfm?objectgroup_id=5287)
In a Shack-Hartmann wavefront sensor, the local slope of the incident beam's wavefront is measured as a displacement of the focal spot positions, as shown in the figure above. Many documents and papers state that the wavefront gradient is computed as the ratio of the spot displacement to the focal length of the microlens:
$$ \frac{\partial \phi(x,y)}{\partial y} \approx \frac{\Delta y}{f_{ML}} $$
This expression makes sense at first glance, but it occurred to me that the units of the left-hand and right-hand sides seem inconsistent. The wavefront phase is usually measured in units of angle (e.g. radians), while the infinitesimal displacement ($\partial y$), spot displacement ($\Delta y$), and focal length ($f_{ML}$) are in units of length (e.g. meters). That means
$$\frac{\partial \phi(x,y)}{\partial y} \quad \text{[rad/meters]}$$
and
$$\frac{\Delta y}{f_{ML}} \quad \text{[meters/meters] = [rad]}$$
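
To make the bookkeeping concrete, here is a minimal Python sketch of the quoted relation; the values of `f_ML` and `delta_y` below are hypothetical, not taken from any particular sensor:

```python
# Hypothetical example values for one microlens (not from any datasheet):
f_ML = 5.0e-3     # microlens focal length: 5 mm, in meters
delta_y = 2.0e-6  # measured spot displacement: 2 um, in meters

# Local wavefront slope per the quoted formula: a ratio of two lengths,
# so the result carries units of meters/meters, i.e. it is dimensionless.
slope = delta_y / f_ML
print(f"dphi/dy ~ {slope:.1e}  (m/m, dimensionless)")
```

The code only reproduces the arithmetic of the formula; it does not resolve which unit the result should be interpreted in, which is exactly my question.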
I think I have misunderstood something. Could anyone tell me how to interpret this formula correctly in terms of units?
