Defocus

In optics, defocus is the aberration in which an image is simply out of focus. This aberration is familiar to anyone who has used a camera, video camera, microscope, telescope, or binoculars. Optically, defocus refers to a translation of the focus along the optical axis away from the detection surface. In general, defocus reduces the sharpness and contrast of the image. What should be sharp, high-contrast edges in a scene become gradual transitions. Fine detail in the scene is blurred or even becomes invisible. Nearly all image-forming optical devices incorporate some form of focus adjustment to minimize defocus and maximize image quality.

In optics and photography

The degree of image blurring for a given amount of focus shift depends inversely on the lens f-number. Low f-numbers, such as f/1.4 to f/2.8, are very sensitive to defocus and have very shallow depths of focus. High f-numbers, in the f/16 to f/32 range, are highly tolerant of defocus, and consequently have large depths of focus. The limiting case in f-number is the pinhole camera, operating at perhaps f/100 to f/1000, in which case all objects are in focus almost regardless of their distance from the pinhole aperture. The penalty for achieving this extreme depth of focus is very dim illumination at the imaging film or sensor, limited resolution due to diffraction, and very long exposure time, which introduces the potential for image degradation due to motion blur.
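For illustration, under a thin-lens approximation with the object at infinity, the blur-circle diameter at the sensor is roughly the longitudinal focus shift divided by the f-number. A minimal sketch of this relationship (the focus shift and f-number values below are arbitrary illustrative choices):

    # Approximate geometric blur-circle diameter for a given focus shift,
    # assuming a thin lens focused at infinity: diameter ≈ shift / f-number.
    def blur_circle_diameter_mm(focus_shift_mm, f_number):
        return focus_shift_mm / f_number

    shift = 0.5  # assumed longitudinal focus shift at the image plane, in mm
    for n in (1.4, 2.8, 5.6, 16, 32):
        print(f"f/{n}: blur circle ≈ {1000 * blur_circle_diameter_mm(shift, n):.0f} µm")

The inverse dependence on f-number is evident: the same focus shift produces a blur circle about 23 times larger at f/1.4 than at f/32 (the ratio of the two f-numbers).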

The amount of allowable defocus is related to the resolution of the imaging medium. A lower-resolution imaging chip or film is more tolerant of defocus and other aberrations. To take full advantage of a higher resolution medium, defocus and other aberrations must be minimized.

Defocus is modeled in Zernike polynomial format as \( a(2 \rho^2-1) \), where a is the defocus coefficient in wavelengths of light. This corresponds to the parabola-shaped optical path difference between two spherical wavefronts that are tangent at their vertices and have different radii of curvature.
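A minimal sketch of this term evaluated over the unit pupil (the grid size and coefficient value are arbitrary choices for illustration):

    # Zernike defocus term a*(2*rho^2 - 1) sampled over a unit circular pupil;
    # a is the defocus coefficient in wavelengths of light.
    import numpy as np

    def defocus_opd(a, n=256):
        y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
        rho2 = x**2 + y**2
        opd = a * (2.0 * rho2 - 1.0)
        opd[rho2 > 1.0] = np.nan   # points outside the pupil
        return opd

    opd = defocus_opd(a=0.25)                # a quarter wave of defocus
    print(np.nanmin(opd), np.nanmax(opd))    # OPD runs from -a (center) to +a (edge)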

For some applications, such as phase contrast electron microscopy, defocused images can contain useful information. Multiple images recorded with various values of defocus can be used to examine how the intensity of the electron wave varies in three-dimensional space, and from this information the phase of the wave can be inferred. This is the basis of non-interferometric phase retrieval. Examples of phase retrieval algorithms that use defocused images include the Gerchberg–Saxton algorithm and various methods based on the transport-of-intensity equation.
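In its simplest form, the Gerchberg–Saxton algorithm alternates between two planes in which only the intensity (amplitude) has been measured, enforcing the measured amplitude in each plane while keeping the current phase estimate. A minimal sketch, assuming the two planes are Fourier-conjugate; defocus-based variants substitute an appropriate propagation operator for the Fourier transform:

    # Gerchberg–Saxton iteration between two Fourier-conjugate planes.
    import numpy as np

    def gerchberg_saxton(obj_amplitude, far_amplitude, iterations=200):
        rng = np.random.default_rng(0)
        # start from the measured object amplitude with a random phase guess
        field = obj_amplitude * np.exp(1j * rng.uniform(-np.pi, np.pi, obj_amplitude.shape))
        for _ in range(iterations):
            far = np.fft.fft2(field)
            far = far_amplitude * np.exp(1j * np.angle(far))      # impose measured far-field amplitude
            field = np.fft.ifft2(far)
            field = obj_amplitude * np.exp(1j * np.angle(field))  # impose measured object amplitude
        return np.angle(field)    # recovered phase estimate in the object plane
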
In vision

In casual conversation, the term blur can be used to describe any reduction in vision. In a clinical setting, however, blurry vision means the subjective experience or perception of optical defocus within the eye, called refractive error. The appearance of the blur depends on the amount and type of refractive error.

The extent of blurry vision can be assessed by measuring visual acuity with an eye chart. Blurry vision is often corrected by focusing light on the retina with corrective lenses. These corrections sometimes have unwanted side effects, including magnification or reduction, distortion, color fringes, and altered depth perception. During an eye exam, the patient's acuity is measured without correction, with their current correction, and after refraction. This allows the optometrist or ophthalmologist ("eye doctor") to determine the extent to which refractive error limits the quality of the patient's vision. A Snellen acuity of 6/6 (20/20), or 1.0 as a decimal value, is considered sharp vision for an average human; young adults may reach nearly twice that value. A best-corrected acuity lower than this indicates that something other than uncorrected refractive error is limiting vision.
The blur disk

Optical defocus can result from incorrect corrective lenses or from insufficient accommodation, as in presbyopia of the aging eye. As noted above, light rays from a point source are then not focused to a single point on the retina but are spread over a small disk of light, called the blur disk. Its size depends on pupil size and the amount of defocus, and is given by the equation

\( d = 0.057\,p\,D \)

(where d is the blur-disk diameter in degrees of visual angle, p the pupil diameter in mm, and D the defocus in diopters).[1]
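A short worked example of this formula (the pupil size and defocus values are arbitrary illustrative choices):

    # Blur-disk diameter d = 0.057 * p * D
    # (d in degrees of visual angle, p = pupil diameter in mm, D = defocus in diopters).
    def blur_disk_deg(pupil_mm, defocus_diopters):
        return 0.057 * pupil_mm * defocus_diopters

    print(blur_disk_deg(4.0, 1.0))   # a 4 mm pupil with 1 D of defocus: ≈ 0.23 degrees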

In linear systems theory, the point image (i.e. the blur disk) is referred to as the point spread function (PSF). The retinal image is given by the convolution of the in-focus image with the PSF.
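This relationship can be sketched numerically by convolving an image with a uniform disk, a simple geometric-optics approximation of the defocus PSF (the disk radius here is an arbitrary illustrative value):

    # Defocus blur as convolution with a disk-shaped point spread function.
    import numpy as np
    from scipy.signal import fftconvolve

    def disk_psf(radius_px):
        r = int(np.ceil(radius_px))
        y, x = np.mgrid[-r:r+1, -r:r+1]
        psf = (x**2 + y**2 <= radius_px**2).astype(float)
        return psf / psf.sum()    # normalize so total intensity is preserved

    def defocus_blur(image, radius_px=5.0):
        return fftconvolve(image, disk_psf(radius_px), mode="same")

    # usage: blurred = defocus_blur(sharp_image)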
See also

Bokeh
Shape from defocus

References
Smith, Warren J. (2000). Modern Optical Engineering. McGraw-Hill, Chapter 11. ISBN 0-07-136360-2.

Strasburger, Hans; Bach, Michael; Heinrich, Sven P. (2018). "Blur Unblurred—A Mini Tutorial". i-Perception. 9 (2): 204166951876585. doi:10.1177/2041669518765850. PMC 5946648. PMID 29770182.
