
LED near-eye holographic display with a large non-paraxial hologram generation

Open Access

Abstract

In this paper, two solutions are proposed to improve the quality of a large image that is reconstructed in front of the observer in a near-eye holographic display. One of the proposed techniques, to the best of our knowledge, is the first wide-angle solution that successfully uses a non-coherent LED source. It is shown that the image obtained with this type of source has less speckle noise but a resolution comparable to that obtained with coherent light. These results are explained by the developed theory, which also shows that the coherence effect is angle varying. Furthermore, for the used pupil forming display architecture, it is necessary to compute a large virtual non-paraxial hologram. We demonstrate that for this hologram there exists a small support region whose frequency range is capable of encoding the information generated by a single point of the object. This small support region is beneficial since it enables us to propose a wide-angle rigorous CGH computational method, which allows processing of very dense point clouds that represent three-dimensional objects. This is our second proposed key development. To determine the corresponding support region, the concept of local wavefront spatial curvature is introduced, which is proportional to the tangent line to the local spatial frequency of the spherical wavefront. The proposed analytical solution shows that the size of this area strongly depends on the transverse and longitudinal coordinates of the corresponding object point.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

The meta-universe is a universe where the real and digital worlds merge and become one. Within this concept, people can live a parallel, virtual life. In the meta-universe, it is possible to attend schools, business meetings, entertainment events, and other social activities without leaving home [1]. This futuristic vision is becoming real thanks to the rapid development of virtual reality and augmented reality near-eye displays. To ensure a faithful imitation of reality, these displays must provide high-quality 3D images and support the optical parameters of human vision. Among many 3D near-eye display types, holographic displays are the best solution for 3D object reconstruction. These devices allow reproducing images floating in the air with all depth cues of human vision [2,3]. Current developments in near-eye holographic displays include the possibility of color reconstruction [4,5], support for binocular vision [6], reduction of the form factor [7], reconstruction of speckle-free images [8], aberration minimization [9], manipulation of the reconstructed object in interaction mode [10], efficient content generation [3,11], and others.

In a holographic display, the 3D object is reconstructed with the use of a spatial light modulator (SLM). This device enables obtaining a high-quality 3D image. However, the SLM has a large pixel size and thus a small diffraction angle, preventing direct reconstruction of a large object visible at a large angle. To meet the expectations of customers, a holographic display should be characterized by high immersion [3]. Thus, it must provide a high-quality image that, when reconstructed, can be viewed over a wide field of view (FOV). In the state of the art there are several solutions to optically enlarge the FOV. These methods can be divided into two main display architectures: one projecting the hologram into the eye and the other generating the hologram in proximity to the reconstructed 3D object. In the first architecture, the display is commonly a 4F pupil relay system [3,6,11–14]. Two lenses with different focal lengths allow demagnifying the pixel pitch of the SLM, resulting in an increased FOV. However, to ensure a large FOV, the focal length of one of the lenses must be much larger than that of the other. The second one is a pupil forming architecture [15], where a large FOV is obtained with a 4F system and an optical magnifier [16–21]. As the magnifier, displays utilize an eyepiece [16,17], a holographic optical element [18–20], or a curved reflective element [21]. Nevertheless, the disadvantage of all the wide-angle display solutions mentioned above is that they employ lasers. Such light sources produce coherent, high-contrast speckles that significantly reduce image quality and are potentially hazardous to human vision. This speckle effect can be diminished by using incoherent LED illumination [22,23]. However, this type of source has reduced spatial coherence, which may produce image blur and therefore loss of fine details of the reconstructed object. For narrow-angle displays, the advantages of employing LED illumination have been theoretically analyzed and experimentally demonstrated [24] by showing that high-quality 3D images with reduced speckle noise can be obtained.

Computation of the computer generated hologram (CGH) that provides visual content for a wide-angle view is a key issue in holographic near-eye displays [3]. This is because the CGH is the only known source of content for these devices, and the quality of the computed hologram determines the immersion level. Depending on how the FOV is enlarged, a suitable CGH technique must be chosen for calculating the required holograms. For displays within the first architecture, CGHs can be calculated with accurate methods like point-cloud based synthesis [25], ray tracing [26], or the polygon based method [27]. Recently, approximate acceleration methods like the phase added stereogram method [28] were introduced to calculate wide-angle view CGHs efficiently [11,29]. For the second display architecture, the ray tracing method [30–32] or a technique representing the 3D object by a set of slices [17] is employed for synthesizing CGHs. In the ray tracing method the hologram is divided into sub-holograms of equal size, in which the information of the corresponding spherical wavefronts is stored. When selecting a sub-hologram of one-pixel size, the coded information is accurate. However, the long computation time makes this case impractical. On the other hand, a sub-hologram larger than one pixel accelerates calculations at the expense of reduced accuracy in the CGH due to aliasing. Thus, the selection of the sub-hologram size for a wide-angle CGH must be a compromise between accuracy and speed. What is more, it should depend on the depth and size of the object. In the second approach, the 3D object is divided into several layers and each of them is propagated to the hologram plane by a Fresnel-based propagation method [33]. Unfortunately, in this technique there is no method suitable for wide-angle field propagation. Thus, the wide-angle propagation method is replaced by paraxial solutions [17,33], which introduce numerical errors.

In this paper, we propose a large-FOV near-eye holographic display with LED illumination. The design of this work is based on the second architecture, since it produces a large and defocused hologram with a spherical non-paraxial reference wave. It is shown that this design allows fast non-paraxial calculation of a wide-angle CGH of a 3D object, because each point of the 3D object volume is supported by a small area of the hologram. Notably, this support region enables storing non-paraxial information free of aliasing, which additionally requires a small amount of space-bandwidth product to be processed. To estimate the corresponding support area, a solution based on phase space analysis, which is a key element of the developed CGH algorithm, is proposed. By using this approach, it is demonstrated that there exists a small spatial region in which each point of the object has local spatial frequencies that fall within the frequency band of the hologram. A rigorous solution to this problem requires processing non-linear equations, which is computationally inefficient. To avoid this, a novel phase space concept of local spatial curvature, which is proportional to the tangent lines of the corresponding wavefront, is introduced. By using this concept, an approximate analytical solution for the point source support area, which is computationally efficient and has negligible approximation error, is given.

The presented display concept, to the best of our knowledge, is the first wide-angle solution that successfully uses a non-coherent LED source. It has been experimentally confirmed that it can generate high-quality large images with reduced speckle noise and a resolution comparable to that obtained with coherent light. This experimental conclusion is supported by the developed theory, which is based on phase space analysis and describes the spatial coherence effect and the related image blur in a wide-angle near-eye display. This theoretical solution reveals that the coherence effect is angle dependent. It is shown that the image resolution improves for larger angles. For the proposed display with an LED of size 960 µm, the introduced theory indicates very small, negligible image blur.

2. Display

This work develops a near-eye holographic display with a large FOV, which enables reconstruction of a large object with LED illumination. The proposed imaging system is based on the pupil forming architecture, which is composed of three lenses. Two lenses form a 4F system, which gives a real image of the SLM. Next, this image is viewed with the third lens acting as a magnifying glass. As a result, a large hologram is projected far from the eye and close to the reconstructed object.

Figure 1 presents the near-eye holographic display setup with an incoherent LED source and a 4K phase-only SLM. It consists of two modules: illumination and imaging. In the first module, the SLM (HoloEye Gaea 2.0, pixel count 3840×2160, pixel size ΔSLM = 3.74 µm) is illuminated by the set of polarized plane waves generated by the LED source (Doric Lenses pigtailed red diode, central wavelength λ = 635 nm, full width at half maximum = 25 nm, fiber core 960 µm) placed in the back focal plane of the collimating lens Lc (Fc = 400 mm). The input polarization state is set by the polarizer P according to the main polarization axis of the modulator's liquid crystals. In the second module, the assembly of the 4F imaging system and the eyepiece forms a large hologram, from which a large 3D object is reconstructed. The main role of the 4F afocal system, composed of lenses L1 (F1 = 100 mm) and L2 (F2 = 122 mm), is to create a real magnified copy (magnification m4F = 1.22) of the SLM, which is viewed through the eyepiece Lep (Fep = 33 mm). The optics creates the final virtual magnified image of the SLM, where the large hologram is reconstructed. The magnification introduced by the eyepiece depends on its axial position in relation to the real SLM image given by the 4F setup. It increases when the distance s between the eyepiece and the real image of the SLM approaches Fep. In our implementation s = 31 mm. Using the lens equation 1/s' − 1/s = 1/f', the distance s' between the eyepiece and the final virtual image is zh = 511 mm. For these parameters, the magnification introduced by the eyepiece mep is 16.48 and the total magnification of the display is mtot = 20.11. As a result, a large hologram is formed with pixel size Δh equal to 75.21 µm and physical dimensions 288.80 × 162.45 mm2, giving a maximum FOV of 31.6°. The optics of the display images the LED source to the viewing window (VW) plane with two 4F afocal systems. Lenses Lc and L1 form the first one with magnification 0.25, while lenses L2 and Lep form the second one with magnification 0.27. The combined transverse magnification is mts = 0.067. For the LED source of diameter 960 µm, its image at the VW is small, equal to 64.32 µm. Therefore, the 3D image generation process can be represented as the reconstruction of a large virtual hologram using a beam generated by a small incoherent source.
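For reference, this chain of magnifications can be reproduced numerically. The following minimal sketch (Python; variable names are ours, values taken from the text) recomputes the hologram parameters from the quoted focal lengths:

```python
# Sketch: display parameters of Section 2, recomputed from the focal lengths
# quoted in the text (illustrative variable names; small rounding differences
# with respect to the quoted values are expected).
import math

dx_slm = 3.74e-6                                  # SLM pixel pitch [m]
n_x, n_y = 3840, 2160                             # SLM pixel count
f1, f2, f_ep, f_c = 0.100, 0.122, 0.033, 0.400    # L1, L2, Lep, Lc [m]
s = 0.031                                         # eyepiece to real SLM image [m]

m_4f = f2 / f1                                       # 4F magnification, 1.22
z_h = abs(1.0 / (1.0 / f_ep - 1.0 / s))              # virtual image distance, ~511 mm
m_ep = z_h / s                                       # eyepiece magnification, ~16.5
m_tot = m_4f * m_ep                                  # total magnification, ~20.1

dx_h = m_tot * dx_slm                                # hologram pixel, ~75.2 um
b_xh = n_x * dx_h                                    # hologram width, ~288.9 mm
fov = 2 * math.degrees(math.atan(b_xh / (2 * z_h)))  # ~31.6 deg

m_ts = (f1 / f_c) * (f_ep / f2)                      # source imaging magnification, ~0.067
led_vw = 960e-6 * m_ts                               # LED image at the VW, ~65 um

print(f"z_h = {z_h*1e3:.0f} mm, pixel = {dx_h*1e6:.1f} um, FOV = {fov:.1f} deg")
```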

Fig. 1. Scheme of large FOV near-eye holographic display setup with LED illumination. Lc – collimating lens, P - polarizer, BS – beam splitter, L1 and L2 lenses forming 4F system, Lep – eyepiece.

In the display, a complex virtual hologram is generated by the phase-only SLM using a complex amplitude encoding technique [34,35]. In this method, the hologram is encoded as a sum of two complex distributions. In the decoding step, a physical cut-off frequency filter is needed. It is placed in the Fourier plane of lens L1 and passes only the object beam of the complex virtual hologram. In our solution, cosine modulation complex amplitude encoding [35] is employed. It is optimal since the applied cut-off filter is large: its dimensions correspond to half and the full bandwidth of the SLM in the x and y directions, respectively. Thus, the resolution of the image is reduced only in the x direction, and the corresponding reduction factor is close to 2.
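The exact cosine modulation method is detailed in [35]. As a minimal illustration of this family of encodings, in which a complex field is written as a sum of two phase-only distributions, the sketch below uses the classical double-phase checkerboard decomposition; it is not necessarily identical to the method of [35], and all names are ours:

```python
# Sketch: complex-to-phase encoding by double-phase decomposition, one simple
# member of the "sum of two phase-only distributions" family mentioned above
# (illustrative only; the display uses the cosine modulation variant of [35]).
import numpy as np

def double_phase_encode(u: np.ndarray) -> np.ndarray:
    """Return a phase-only map (radians) encoding the complex field u."""
    a = np.abs(u) / np.abs(u).max()        # amplitude normalized to [0, 1]
    phi = np.angle(u)
    # u/|u|max = (exp(i*th1) + exp(i*th2)) / 2 with th1,2 = phi +/- arccos(a)
    theta1 = phi + np.arccos(a)
    theta2 = phi - np.arccos(a)
    checker = (np.indices(u.shape).sum(axis=0) % 2).astype(bool)
    return np.where(checker, theta1, theta2)   # interleave the two phase maps
```

The checkerboard interleaving creates a carrier that separates the object beam in the Fourier plane of L1, where the cut-off filter described above selects it.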

3. Holographic imaging analysis

To evaluate the parameters of the image, this work employs the phase space representation (PSR), which investigates the object signal recorded in the hologram. In this Section, the analysis is carried out for coherent light. The case of a non-coherent source (LED) is considered in Section 5. Figure 2 shows the simplified geometry of the display in the image domain, where the observer, hologram, and object planes are marked. The object plane is the plane to which the analyzed point of the 3D object belongs. Using the PSR, the frequency range of an object that is encoded in a hologram can be evaluated. Applying the PSR in the plane of the hologram, it is possible to determine the transfer of information between the hologram, the image, and the observer. Here, the PSR is employed to evaluate image resolution and size. The considered object wave is generated by the p-th point source of the 3D object of coordinate rop = [xop, yop, zop], which can be expressed as

$${U_o}({\mathbf{r}_h}) = {e^{ - ik\left\|{{\mathbf{r}_h} - {\mathbf{r}_{op}}} \right\|}},$$
where rh = [xh, yh, zh] and $\left\| \mathbf{r} \right\| = {({x^2} + {y^2} + {z^2})^{1/2}}$. The analysis is performed for the hologram plane and for the selected object point; hence, the dependence of Uo on the object coordinate is omitted in the above notation. The PSR of this object signal is described as
$${f_{xo}}({\mathbf{r}_h}) = \frac{{{x_h} - {x_{op}}}}{{\lambda \left\|{{\mathbf{r}_h} - {\mathbf{r}_{op}}} \right\|}},$$
and
$${f_{yo}}({\mathbf{r}_h}) = \frac{{{y_h} - {y_{op}}}}{{\lambda \left\|{{\mathbf{r}_h} - {\mathbf{r}_{op}}} \right\|}}$$
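These local frequencies translate directly into code; a minimal sketch, with illustrative names:

```python
# Sketch: local spatial frequencies of the object spherical wave, Eqs. (2)-(3),
# evaluated on the hologram plane z = z_h (vectorized over xh, yh).
import numpy as np

def local_frequencies(xh, yh, z_h, r_op, wavelength):
    """Return (f_xo, f_yo) [1/m] at hologram coordinates (xh, yh, z_h)."""
    xop, yop, zop = r_op
    dist = np.sqrt((xh - xop)**2 + (yh - yop)**2 + (z_h - zop)**2)
    return (xh - xop) / (wavelength * dist), (yh - yop) / (wavelength * dist)
```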

Reconstructing the hologram H, which encodes the object wave given by Eq. (1), with a spherical reference wave R yields an object wave Ur

$${U_r}({\mathbf{r}_h}) = H({\mathbf{r}_h})R({\mathbf{r}_h}),$$
where $R({\mathbf{r}_h}) = {e^{ik\left\|{{\mathbf{r}_h}} \right\|}}$. The hologram H is a discrete signal sampled with pixel pitch Δh; thus, its bandwidth is limited to Bfh = 1/Δh. Consequently, the corresponding reconstructed object wave Ur is a band-limited version of the analyzed object spherical wave Uo. The local bandwidth of the reconstructed hologram Ur is found by evaluating the local spatial frequencies of the reference wave and adding the hologram bandwidth as follows
$${f_{xh \pm }}({\mathbf{r}_h}) = \frac{1}{{2\pi }}\frac{\partial }{{\partial {x_h}}}[{k\left\|{{\mathbf{r}_h}} \right\|} ]\pm \frac{{{B_{fh}}}}{2} = \frac{1}{\lambda }\frac{{{x_h}}}{{\left\|{{\mathbf{r}_h}} \right\|}} \pm \frac{{{B_{fh}}}}{2},$$
and
$${f_{yh \pm }}({\mathbf{r}_h}) = \frac{1}{{2\pi }}\frac{\partial }{{\partial {y_h}}}[{k\left\|{{\mathbf{r}_h}} \right\|} ]\pm \frac{{{B_{fh}}}}{2} = \frac{1}{\lambda }\frac{{{y_h}}}{{\left\|{{\mathbf{r}_h}} \right\|}} \pm \frac{{{B_{fh}}}}{2}.$$

Fig. 2. Geometrical diagram of the holographic image domain. Here xe, xh, and xo stand for eyebox, hologram, and object plane, respectively.

Figure 3(a) shows the PSRs of the local bandwidth limits of the reconstructed hologram and of the object spherical wave at the hologram plane for the x-direction. This plot is calculated for the experimental parameters of the display and for xo = 0.45Bxo and zo = 1000 mm, where Bxo = 520.5 mm is the size of the image plane at the object axial location zo. As seen in this plot, the large spatial extent of the hologram results in a small hologram bandwidth. As a result, only a small area of the hologram supports the spherical wave under consideration. Other regions cannot represent this object signal. The object point hologram support area, denoted as Ωxh,yh, is the region bounded by the intersections of fxo with fxh±, and of fyo with fyh±. The boundaries of this support region for the x-direction are shown in Fig. 3(a) by the vertical lines, and a zoom of the corresponding region is presented in Fig. 3(b). The small object point hologram support area Ωxh,yh is advantageous because it requires only a small amount of space-bandwidth product (SBP) to be processed, which enables high-speed CGH calculations.

Fig. 3. Diagrams for estimation of image resolution: (a) PSR of hologram support and object beam for xo = 234.2 mm, zop = 1000 mm. The yellow line represents the PSR of the off-axis point source in the hologram plane, while the red and blue lines depict the PS support of the hologram. (b) Zoom of the hologram support area; the points where the lines intersect generate the support area Ωh. (c) Error of the evaluation of the image resolution for xop = 0.45Bxo.

A similar approach is found in the wave recording plane (WRP) algorithm [36]. This method requires an intermediate plane between the object and the hologram, in which a portion of each spherical wave is stored. When all the spherical fields are placed in this intermediate plane, a propagation method is employed to obtain the final hologram. For the proposed hologram configuration, the known propagation methods are numerically expensive, and thus they cannot be applied. Moreover, the size of the support area in the WRP is proportional to the diffraction angle of the SLM and to the distance between the object and the intermediate plane, which must therefore be small. This is the main problem of the WRP. Here, it is shown that the hologram can be computed without propagation and without an intermediate plane. Consequently, the diffracted field from the object is calculated directly on the hologram plane. This is possible due to novel theoretical results obtained with the PSR. Moreover, the CGH calculations consider a large non-paraxial spherical reconstruction wave and an object significantly larger than the hologram.

The resolution of the object point Bfxo can be computed from the frequency separation between the PSR coordinates (xh+, fxo+) and (xh−, fxo−). This parameter is found by introducing the concept of local spatial curvature of the wavefront, which is proportional to the slope of the PSR tangent lines of the corresponding wavefronts. The local spatial curvature of the wavefront is calculated from the derivatives of the local spatial frequencies, which for the object spherical wave takes the form

$$c_{xo}^{}({\mathbf{r}_h})\mathop = \limits^{def} \lambda \frac{{\partial {f_{xo}}}}{{\partial {x_h}}} ={-} \frac{{{{({{y_h} - {y_o}} )}^2} + {{({{z_h} - {z_o}} )}^2}}}{{{{\left\|{{\mathbf{r}_h} - {\mathbf{r}_o}} \right\|}^3}}},$$
and
$$c_{yo}^{}({\mathbf{r}_h})\mathop = \limits^{def} \lambda \frac{{\partial {f_{yo}}}}{{\partial {y_h}}} ={-} \frac{{{{({{x_h} - {x_o}} )}^2} + {{({{z_h} - {z_o}} )}^2}}}{{{{\left\|{{\mathbf{r}_h} - {\mathbf{r}_o}} \right\|}^3}}},$$
while for limits of the bandwidth of reconstructed hologram is given as
$$c_{xh}^{}({\mathbf{r}_h}) = \lambda \frac{{\partial {f_{xh \pm }}}}{{\partial {x_h}}} = \frac{{y_h^2 + z_h^2}}{{{{\left\|{{\mathbf{r}_h}} \right\|}^3}}},$$
and
$$c_{yh}^{}({\mathbf{r}_h}) = \lambda \frac{{\partial {f_{yh \pm }}}}{{\partial {y_h}}} = \frac{{x_h^2 + z_h^2}}{{{{\left\|{{\mathbf{r}_h}} \right\|}^3}}}.$$

The estimate of Bfxo can be found by evaluating the local spatial curvatures at the center point of the support region Ωxh,yh, that is

$$[{{x_{hop}},{y_{hop}}} ]= \left[ {{x_{op}}\frac{{{z_h}}}{{{z_{op}}}},{y_{op}}\frac{{{z_h}}}{{{z_{op}}}}} \right].$$

For the sake of clarity, here 1D resolution in the image is investigated. The corresponding bandwidth Bfxo can be found from the condition

$$\frac{{{B_{fxo}}}}{{|{c_{xo}^{}({x_{hop}})} |}} = \frac{{{B_{fh}} - {B_{fxo}}}}{{|{c_{xh}^{}({x_{hop}})} |}}.$$
When solving for Bfxo, it is found that
$${B_{fxo}} = {B_{fh}}\frac{{{z_h}}}{{{z_{op}}}}.$$
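The step from Eq. (12) to Eq. (13) can be traced explicitly. Inserting Eq. (11) into Eqs. (7) and (9) gives, at the center of the support region,

$$c_{xo}({\mathbf{r}_{hop}}) = - \frac{{({y_{hop}^2 + z_h^2} ){z_h}}}{{({{z_{op}} - {z_h}} ){{\left\|{{\mathbf{r}_{hop}}} \right\|}^3}}},\qquad c_{xh}({\mathbf{r}_{hop}}) = \frac{{y_{hop}^2 + z_h^2}}{{{{\left\|{{\mathbf{r}_{hop}}} \right\|}^3}}},$$

so the common factor cancels in Eq. (12), leaving ${B_{fxo}}({{z_{op}} - {z_h}} )= ({{B_{fh}} - {B_{fxo}}} ){z_h}$, from which Eq. (13) follows.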

Equation (13) directly provides an estimate for the image size

$${B_{xo}} = {B_{xh}}\frac{{{z_{op}}}}{{{z_h}}}.$$

This result, obtained with a non-paraxial analysis, shows that the resolution of the 3D image changes according to a simple proportion, which, surprisingly, is spatially invariant: it does not depend on the view angle. Figure 3(c) illustrates the error of the approximation of the image resolution using Eq. (13). It indicates that the relation is accurate and that the taken approach introduces an error of negligible magnitude.

4. CGH calculation framework

The CGH calculation scheme has two steps for each object point: evaluating the area Ωxh,yh and computing the spatially limited spherical wave. For a hologram of millions of points, the numerical operations of both steps must be computationally efficient. First, we analyze an optimized solution for evaluating the boundaries of Ωxh,yh. The spatial region of the frequency support area Ωxh,yh is bounded by the coordinates xh− < xh < xh+ and yh− < yh < yh+, which are shown in Fig. 3(b) and can be found by solving the equations

$${f_{xh \pm }}({\mathbf{r}_h}) = {f_{xo}}({\mathbf{r}_h}),$$
and
$${f_{yh \pm }}({\mathbf{r}_h}) = {f_{yo}}({\mathbf{r}_h}).$$

These equations have non-linear terms that make it difficult to find an analytical solution for x and y. Since Eqs. (15) and (16) cannot be solved analytically, numerical methods must be employed. However, an accurate numerical solution needs fine sampling in the spatial domain, which has a time cost. When these numerical calculations are repeated millions of times, the processing of a whole hologram becomes slow; thus, this approach is not suitable.
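For illustration, the rigorous route solves Eq. (15) by root finding; a minimal 1D sketch (names are ours, and the bracket [x_lo, x_hi] must straddle the intersection). This is exactly the per-point cost that the analytical solution below avoids:

```python
# Sketch: rigorous 1D boundary search, solving f_xh+-(x) = f_xo(x) (Eq. (15))
# by bracketed root finding. Robust, but too slow to repeat per object point.
import numpy as np
from scipy.optimize import brentq

def boundary(sign, xop, zop, z_h, b_fh, wavelength, x_lo, x_hi):
    def residual(xh):
        f_xo = (xh - xop) / (wavelength * np.hypot(xh - xop, z_h - zop))
        f_xh = xh / (wavelength * np.hypot(xh, z_h)) + sign * b_fh / 2
        return f_xh - f_xo
    return brentq(residual, x_lo, x_hi)   # root = support boundary x_h+ or x_h-
```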

An interesting alternative for finding the points x and y is to employ the introduced local spatial curvatures of the object spherical wave and of the boundaries of the reconstructed hologram bandwidth. For this, Eqs. (7)-(10) are evaluated at the coordinate (xhop, yhop), giving the sought tangent lines. The resulting equations take the form

$${f_x} - {f_{xo}}({\mathbf{r}_{hop}}) = \frac{{c_{xo}^{}({\mathbf{r}_{hop}})}}{\lambda }({x_{h \pm }} - {x_{hop}}),$$
and
$${f_x} - {f_{xh \pm }}({\mathbf{r}_{hop}}) = \frac{{c_{xh}^{}({\mathbf{r}_{hop}})}}{\lambda }({x_{h \pm }} - {x_{hop}}).$$

The same is done for the y-direction. Equating these two expressions, the condition for finding the coordinates of the intersections between the tangent lines can be written as

$$c_{xo}^{}({\mathbf{r}_{hop}}){D_{xho}} = {c_{xh}}({\mathbf{r}_{hop}}){D_{xho}} + \frac{{\lambda {B_{fh}}}}{2},$$
and
$$c_{yo}^{}({\mathbf{r}_{hop}}){D_{yho}} = {c_{yh}}({\mathbf{r}_{hop}}){D_{yho}} + \frac{{\lambda {B_{fh}}}}{2},$$
where [2Dxho, 2Dyho] is the size of the support region Ωxh,yh. Solving these expressions for Dxho and Dyho, it is found that
$${D_{xho}} = \frac{{{B_{fh}}\lambda ({{z_{op}} - {z_h}} ){{\left\|{{\mathbf{r}_{hop}}} \right\|}^3}}}{{2({z_h^2 + y_{hop}^2} ){z_{op}}}},$$
and
$${D_{yho}} = \frac{{{B_{fh}}\lambda ({{z_{op}} - {z_h}} ){{\left\|{{\mathbf{r}_{hop}}} \right\|}^3}}}{{2({z_h^2 + x_{hop}^2} ){z_{op}}}}.$$

Finally, coordinates of the boundary points of Ωxh,yh are expressed by

$${x_{hop \pm }} = {x_{hop}} \pm {D_{xho}},$$
and
$${y_{hop \pm }} = {y_{hop}} \pm {D_{yho}}.$$
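Under these approximations, Eqs. (11) and (21)-(24) give the support region in closed form; a minimal sketch (illustrative names, SI units):

```python
# Sketch: closed-form support region boundaries of Eqs. (11) and (21)-(24).
import numpy as np

def support_region(r_op, z_h, b_fh, wavelength):
    """Return ((xh-, xh+), (yh-, yh+)) of the region Omega for point r_op."""
    xop, yop, zop = r_op
    xhop, yhop = xop * z_h / zop, yop * z_h / zop      # Eq. (11)
    r_hop = np.sqrt(xhop**2 + yhop**2 + z_h**2)
    common = b_fh * wavelength * (zop - z_h) * r_hop**3 / (2.0 * zop)
    d_x = common / (z_h**2 + yhop**2)                  # Eq. (21)
    d_y = common / (z_h**2 + xhop**2)                  # Eq. (22)
    return (xhop - d_x, xhop + d_x), (yhop - d_y, yhop + d_y)
```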

Accurate numerical encoding of an object point in a hologram requires that the coordinates ${x_{hop \pm }}$ and ${y_{hop \pm }}$ be found with an absolute error smaller than the pixel pitch Δh. In this way, the corresponding discrete value is accurately evaluated. The approximation error and the normalized area of the support region Ωxh,yh are evaluated for the points (xop = 0, yop = 0) and (xop = 0.45Bxo, yop = 0.45Bxo), and the results are presented in Fig. 4(a) and 4(b), respectively. The calculations are repeated for a wide range of depths. The size of the support area is analyzed to show its dependence on the spatial position of the object point, and the depicted plots are normalized with respect to the total area of the SLM. The plots in Fig. 4 are generated for two FOVs, 31.6° and 63°, which are calculated as follows [37]

$$FOV = 2{\tan ^{ - 1}}\left( {\frac{{{m_{4F}}{N_{SLM}}{\Delta _{SLM}}}}{{2{F_{ep}}}}} \right),$$
where NSLM is the number of pixels in the SLM. Thus, for a constant Fep, the FOV can be modified by changing the magnification m4F or NSLM. For FOV = 31.6°, NSLM = 4096 pixels (4K) is chosen, while FOV = 63° is obtained by increasing the magnification m4F or by using twice the pixels in the SLM, i.e., NSLM = 8192 pixels (8K). In Fig. 4, the symbols m4F and SLM8K indicate how the FOV is increased. The solid lines represent the results for the on-axis point, while the dashed lines represent the corner point. Red represents the FOV of 31.6°, while blue and green depict FOV = 63° for m4F and SLM8K, respectively. As observed in Fig. 4(a), the error for the central point within the region Ωxh,yh is constant and almost zero for all analyzed FOVs, which makes it negligible. However, the error of the corner point evolves differently. Figure 4(a) shows that this error grows monotonically for all FOVs. When considering the 31.6° FOV, it is seen that for positive depths the error grows, and it reaches a value slightly smaller than 5% for the largest depth. When considering FOV = 63°, two different off-axis error plots are obtained. To describe this, let us start with the FOV that is changed through the magnification, i.e., m4F. In this case, the pixel pitch of the projected SLM is increased, which results in a magnified SLM of almost twice the size. The increase in pixel size reduces the frequency bandwidth of the hologram. Thus, the corresponding off-axis support region decreases its area by around 3.3 times with respect to the support region for FOV = 31.6°. As a result, the support region registers a smaller number of frequencies than for the small FOV, and thus a smaller error than for FOV = 31.6° is obtained. For the second case, i.e., when the size of the SLM is enlarged to 8K, the error grows faster, as shown with the green dashed line in Fig. 4(a). This is expected because, compared to the 31.6° FOV case, the pixel size of the enlarged SLM does not change, but the off-axis angle doubles. Thus, the maximum error obtained for this off-axis point is around 12%, which is still within the tolerance. For the case of negative depths, the error grows more rapidly. Notably, the error should be smaller than 100%. When using the 8K configuration, the error is 50% at the position zop = 219 mm, and for zop = 166 mm the error is 100%, which is the maximal error that can be permitted. Notably, for these distances the object is too close to the eye, which is not a practical case because the near point of human vision is about 250 mm. Thus, it can be concluded that in all of the analyzed cases, the proposed CGH method provides sufficient accuracy.

The change of the area of the support region along the axial direction for FOV = 31.6° and 63° and for the on-axis and corner sources can be seen in Fig. 4(b). From this plot, it can be observed that for the small FOV the normalized area is large, while for the large FOV the normalized area is the same regardless of the mechanism used to change the FOV. However, the most important finding from this graph is that the support area is highly spatially variant: it changes with depth and with the transverse coordinate.
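A direct evaluation of Eq. (25) reproduces the quoted base FOV; a one-line check (illustrative):

```python
# Sketch: Eq. (25) for the base 4K configuration of the display.
import math

def fov_deg(m_4f, n_slm, dx_slm, f_ep):
    return 2 * math.degrees(math.atan(m_4f * n_slm * dx_slm / (2 * f_ep)))

print(fov_deg(1.22, 4096, 3.74e-6, 0.033))   # ~31.6 deg
```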

Fig. 4. a) Error normalized to Δh (left). Solid and dashed lines depict the behavior for on-axis and off-axis fields, respectively. The vertical dashed line represents the distance between the hologram and the observer. b) Area of Ωxh,yh normalized to the hologram size (right). The results are analyzed for FOV = 31.6° and 63°. Note that for the case of FOV = 63°, two cases were considered, m4F and SLM8k, which means that the FOV was increased using the optical 4F system or a larger SLM, respectively.

Once the accuracy of the algorithm has been validated, we can proceed to its second step, which is to evaluate the object wavefield of Eq. (1) over the support area Ωxh,yh. In this way, the wavefield Ur has the appropriate frequency range, and thus it is equivalent to Uo. Additionally, the obtained wavefield is multiplied by the conjugate spherical reference wave, in accordance with Eq. (4). Thus, the hologram for one point is given by Uo(rh)R*(rh). When processing all the object point sources, the final hologram can be expressed as

$$H({\mathbf{r}_h}) = {R^\ast }({\mathbf{r}_h})\sum\limits_{p = 1}^P {U_o^p({\mathbf{r}_h})} ,$$
where P is the number of point sources in the object.
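Putting the two steps together, the whole calculation reduces to a loop over object points, each touching only its own support region; a minimal single-threaded sketch (reusing the support_region sketch above; names are ours):

```python
# Sketch: assembling Eq. (26) point by point. Each spherical wave U_o (Eq. (1))
# is evaluated only inside its support region, then the accumulated field is
# multiplied by the conjugate reference wave R*.
import numpy as np

def compute_cgh(points, amplitudes, shape, dx_h, z_h, wavelength):
    ny, nx = shape
    k = 2 * np.pi / wavelength
    b_fh = 1.0 / dx_h
    x = (np.arange(nx) - nx // 2) * dx_h
    y = (np.arange(ny) - ny // 2) * dx_h
    acc = np.zeros((ny, nx), dtype=np.complex64)
    for (xop, yop, zop), a in zip(points, amplitudes):
        (x0, x1), (y0, y1) = support_region((xop, yop, zop), z_h, b_fh, wavelength)
        ix = slice(np.searchsorted(x, x0), np.searchsorted(x, x1))
        iy = slice(np.searchsorted(y, y0), np.searchsorted(y, y1))
        xx, yy = np.meshgrid(x[ix], y[iy])
        dist = np.sqrt((xx - xop)**2 + (yy - yop)**2 + (z_h - zop)**2)
        acc[iy, ix] += a * np.exp(-1j * k * dist)                # U_o, Eq. (1)
    xx, yy = np.meshgrid(x, y)
    r_conj = np.exp(-1j * k * np.sqrt(xx**2 + yy**2 + z_h**2))   # R*
    return acc * r_conj                                          # Eq. (26)
```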

The proposed display architecture enables high-speed CGH calculation, since the object spherical waves are computed only over the small area Ωxh,yh. According to Fig. 4(b), the size of the support region depends on the axial location of the object points. Therefore, the computation time also depends on this distance. For this reason, the CGH calculation time is investigated for three axial object locations, zo = 500 mm, 1000 mm, and 5000 mm, and for four point cloud sizes: 0.1, 1, 5, and 10 million points. The test objects are randomly distributed point clouds that cover 2/3 of the image space. For each case, the measured time is presented as the average of 10 repeated hologram generations. Figure 5 presents the obtained results.

Fig. 5. Time consumption for different sizes of the point cloud.

The results show that the proposed algorithm can be used for hologram generation of a very dense object. The time consumption increases linearly with the cloud size. In contrast, the computation time depends nonlinearly on the object distance, since it is proportional to the size of the support area. It is interesting to note that the proposed algorithm can be employed efficiently for the next generation of 8K SLMs.

5. Incoherent light effect

The display system, which is illustrated in Fig. 1, utilizes LED illumination. The optics of the display images the LED source to the VW plane with a transverse magnification of mts = 0.067. In our discussion, we assume that the LED is an assembly of incoherent point sources located at the source plane. Consequently, in the image space each illumination point source generates a single reconstruction of the holographic image. The final incoherent image is a sum of the shifted individual reconstructions. Reference [24] has shown that for paraxial viewing angles, the resulting incoherent image is a convolution of the scaled source with the coherent image, where the scaling parameter is a function of the axial location of the holographic image and the transverse magnification. Here, this analysis is generalized to the wide-angle view. The PSR is applied in this regard. It enables evaluation of the parameters of the coherent image and of the scaling factor of the incoherent source. For simplicity of the discussion, a 1D analysis is utilized.

To estimate the image blur due to the incoherent source, a holographic image obtained with the reconstruction wave generated by a single off-axis point source of the LED, of coordinate xs = [xs, 0, 0], is analyzed. Hologram reconstruction of a single object point of coordinate rop is considered, which can be written as

$${H_{rd}}({\mathbf{r}_h}) = H({\mathbf{r}_h})R({\mathbf{r}_h} - {\textrm{x}_s}) = {U_o}{R^\ast }({\mathbf{r}_h})R({\mathbf{r}_h} - {\textrm{x}_s}).$$

This equation shows that the object wave is distorted by the factor R*(rh)R(rh − xs), which introduces a spatial shift of the reconstructed object point. To estimate the value of this spatial shift, the PSR is utilized. Figures 6(a) and (b) show the PSR diagrams of the reconstructed objects, ideal and distorted, and of the corresponding reference waves. The plots were generated for the parameters of the display and for an object point at zop = 1 m, xop = 234.23 mm, which corresponds to 0.9 of the half image size. The hologram reconstructions obtained for the on-axis illumination point source are shown using a dotted line, while those for the off-axis point source are shown with a solid line. The off-axis illumination point source is the most off-axis point source of the LED, which has size ws = 960 µm. In image space, the most off-axis point source coordinate is xs = 32.2 µm. The difference between the sets of distributions is small, and it is not visible at the full scale of Fig. 6(a). All four distributions are illustrated in Fig. 6(b), which shows the zoom of the spatial region of the hologram support area. The coordinate of zero spatial frequency of the spherical wave is also the coordinate of the point source. The zoom including the spatial locations of the object point xo and of the distorted one xso is presented in Fig. 6(c). To estimate the coordinate xso, the local spatial frequency distribution of the distortion factor R*(rh)R(rh − xs) is evaluated as

$${f_{lx}}_{(R{R^\ast })}({\mathbf{r}_h}) = \frac{{x_h^{}}}{{\lambda \left\|{{\mathbf{r}_h}} \right\|}} - \frac{{x_h^{} - x_s^{}}}{{\lambda \left\|{{\mathbf{r}_h} - {\textrm{x}_s}} \right\|}}.$$

Fig. 6. PSR for estimation of spatial shift due to shifted reconstruction wave: (a) PSR diagrams of local spatial frequency of reconstruction waves $R({{{\mathbf r}_h}} )$ - flx(R(xh)), and $R({{{\mathbf r}_h} - {{\mathbf x}_\textrm{s}}} )$ - flx(R(xh-xs)) and of object wave ${u_o}({{{\mathbf r}_h}} )$ - flx(uo(xh)), and ${u_{or}}({{{\mathbf r}_h}} )$ - flx(uor(xh)), (b) zoom A of plot (a), (c) zoom B of plot (a), the red arrow depicts the displacement position xso of the point source due to the influence of the extended source while the vertical arrow represents the local spatial frequency distribution of distortion factor at the point xo. (d) PS distribution of ${f_{lx({RR\ast } )}}$.

Under the assumption of a small xs, this yields

$${f_{lx}}_{(R{R^\ast })}({\mathbf{r}_h}) = \frac{{x_s^{}z_h^2}}{{\lambda {{\left\|{{\mathbf{r}_h}} \right\|}^3}}}.$$

Figure 6(d) presents the plot of flx(RR*) as a function of the transverse coordinate xo. Applying Eq. (29) to the geometry of Fig. 6(c), the value of the spatial shift can be estimated as

$$x_{so}^{} = \frac{{x_s^{}z_h^2({z_{op}^{} - z_h^{}} )}}{{{{({z_h^2 + x_o^2} )}^{3/2}}}} = {m_s}{x_s}.$$
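The field dependence of the shift factor is easily quantified; a minimal sketch (the absolute blur width depends on the assumed geometry, here zop = 1.511 m, i.e., the object 1 m behind the hologram, while the relative change with xo does not):

```python
# Sketch: field dependence of the shift factor m_s in Eq. (30).
def m_s(x_o, z_h, z_op):
    return z_h**2 * (z_op - z_h) / (z_h**2 + x_o**2) ** 1.5

z_h, z_op = 0.511, 1.511                    # [m]; z_op is an assumed value
ratio = m_s(0.2342, z_h, z_op) / m_s(0.0, z_h, z_op)
print(f"blur at x_o = 0.45*B_xo is {ratio:.2f}x the on-axis blur")   # ~0.75
```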

Equation (30) shows that for a source located at xs, the reconstructed image point is obtained at msxs. The intensity distribution of the image can be found by assuming spatial invariance of the reconstructions obtained for all illumination sources, each equal to the coherent-light reconstruction, which follows directly from Eq. (13) as sinc(Bfxo(xo − xop)). The partially coherent image is a sum of the intensity contributions from all the sources of the LED. This can be written as a convolution:

$$ I_{or}\left(x_o\right)=\operatorname{sinc}\left(B_{f x o}\left(x_o-x_{o p}\right)\right) \otimes I_{s}\left(\frac{x_o}{m_s}\right). $$

The equation enables an easy interpretation of the partially coherent illumination effect in a wide-angle display. Let us first discuss the two elements of the equation separately. Their resolution estimates as a function of the coordinate xo are presented in Fig. 7(a). The first element is spatially invariant, and the resolution estimate 1/Bfxo is presented by the red line. The effect of the source size is field dependent. For the analyzed 1D case, ${I_s}({{x_o}m_s^{ - 1}} )= rect({{x_o}m_{ts}^{ - 1}m_s^{ - 1}w_s^{ - 1}} ),$ where rect(…) is the rectangle function. The change is presented by the blue line. For the points xo = 0 and 0.45Bxo, the corresponding widths of the rectangles are 53.9 µm and 41.5 µm. Since the final reconstructed point is a convolution, at the off-axis point the reconstruction is slightly more coherent. The result of the convolution is presented in the plot of Fig. 7(b) using the blue line.
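A minimal numerical sketch of Eq. (31), taking the on-axis rectangle width of 53.9 µm quoted above and a coherent bandwidth from Eq. (13) (both values illustrative):

```python
# Sketch: partially coherent point image of Eq. (31) as a discrete convolution
# of the coherent point image with the scaled source.
import numpy as np

x = np.linspace(-300e-6, 300e-6, 6001)               # image-plane axis [m]
b_fxo = 4.5e3                                        # [1/m], Eq. (13), assumed
coherent = np.sinc(b_fxo * x)                        # coherent point image
source = (np.abs(x) <= 53.9e-6 / 2).astype(float)    # scaled source, rect
source /= source.sum()                               # normalize the kernel
partially_coherent = np.convolve(coherent, source, mode="same")
```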

Fig. 7. (a) The resolution estimates of the coherent image and the source size effect. (b) Coherent and incoherent point source reconstruction for xop = 0.45Bxo.

6. Experimental results

In this section, the holographic near-eye display with a large hologram is investigated in two experiments. The first compares the quality of single-frame reconstructions with coherent laser and incoherent LED sources. The second demonstrates the capability of the display to reconstruct a high-quality, large-scale 3D object with the LED as the light source. All images shown in this section were captured by a smartphone wide-angle camera, whose parameters allow imitation of the human eye [38].

As a test object, a virtual model of an 18th-century British sailing ship is used. It is composed of 12 million points. The object is well suited for experimental verification, since it has many details like flags, nets, ropes, sails, deck, etc. The dimensions of the model are matched to the maximum FOV of the display and equal 560 mm in width, 460 mm in height, and 236 mm in depth. The object is placed 1000 mm from the hologram plane. The CGH is calculated using the method described in Section 4. In the display, the virtual object observed by the viewer is reconstructed 1544 mm from the eye. For the experimental evaluation, one perspective view of the ship was selected, which shows many fine details. For this view, hidden points are removed from the point cloud [11], resulting in 7.8 million points for CGH encoding.

In the first experiment, the CGH is reconstructed with two light sources: an incoherent LED (λ = 635 nm) and a coherent laser (λ = 632.8 nm). For the second case, the display setup was modified: the LED was replaced by a laser with a spatial filter composed of a pinhole and a microscope objective. Single-frame holograms were calculated for both wavelengths, keeping the other parameters unchanged. Figure 8 shows the results of the optical reconstructions. For better visualization of the differences, parts of the images are enlarged. It can be noticed that the quality of the reconstruction obtained with the LED is higher than with the laser. All edges are sharper, and more details of the wooden deck are visible. The reconstruction obtained using the laser has much more speckle noise, and the edges are slightly more blurred.

Fig. 8. Optical reconstructions of the 3D point cloud object representing the ‘ship’ and its enlarged parts obtained for a), c) the laser and b), d) the LED source. The yellow rectangle indicates the area chosen for the calculation of the speckle contrast.

The second experiment shows the image quality obtained in the proposed near-eye display. Reconstruction of the object is carried out using the LED as the light source. To achieve the highest possible quality, time multiplexing with random phase is applied during content generation. Figure 9 presents the reconstructed object captured by the camera, which was focused on the ropes attached to the middle sail of the ship. Three regions of the image are enlarged to illustrate the detailed reconstruction of the object. Since the 3D model is deep, the flag at the back part of the deck is slightly blurred. This effect is caused by the camera and is not visible when observing with the eye. The obtained result proves that the quality of the reconstruction is high, and small details of the deck of the ship can be seen. The employed multiplexing method was tested with the laser source as well. As for the single-frame reconstruction shown in Fig. 8, the laser gives higher speckle noise. In an additional experiment, CGHs for a 360° rotation of the viewpoint around the object are generated. The step between two consecutive views is 1/8°; thus, 2880 holograms are calculated and displayed sequentially by the SLM. Visualization 1 shows the obtained optical reconstructions.

Fig. 9. Optical reconstruction of the 3D point cloud object representing the ‘ship’ obtained in the wide-angle near-eye holographic display with LED illumination and time multiplexing (left). Three zoomed-in areas of the obtained image (right).

To quantify the improvement of the image quality using the LED, we have calculated the speckle contrast parameter C [23,39] for a region of the reconstructed sail marked with the yellow square in Fig. 8. The selected area has the best uniformity of the reconstruction. Parameter C is defined as the ratio of the standard deviation of the intensity distribution to the mean intensity in the chosen region of interest. The obtained values are 0.79 and 0.70 for the single-frame reconstruction with the laser and the LED, respectively. For the reconstruction with time multiplexing and the LED, the corresponding value is 0.2. Please note that fulfilling the condition of a uniform reconstruction using a point source 3D object is not possible, and thus the calculated measure is not exact. Nevertheless, the calculated C indicates a quality improvement.
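For reference, the metric reduces to a one-line computation over the region of interest; a minimal sketch (roi is a grayscale image patch; names are ours):

```python
# Sketch: speckle contrast C = sigma_I / mean_I over a region of interest.
import numpy as np

def speckle_contrast(roi: np.ndarray) -> float:
    roi = roi.astype(np.float64)
    return float(roi.std() / roi.mean())
```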

7. Conclusion

This work presents a near-eye holographic display with a FOV of 31.6°, large non-paraxial hologram generation, and LED illumination. In the display, a large hologram is formed far from the observer's eye. This system architecture offers fast wide-angle hologram calculation and the possibility of employing an incoherent light source. We show theoretically and experimentally that a large LED is an optimal light source for a wide-angle near-eye display because it enables a high-resolution, high-quality image.

The selected near-eye display architecture allows projecting a large virtual hologram far from the observer's eye. The total magnification of the display mtot is 20.11, giving a 288.80 mm × 162.45 mm hologram. An essential functionality of the system is the complex encoding of holograms, which has been solved using a cosine-based solution.

For the presented display configuration, only a small area of the hologram has a frequency range that supports one object point. This small object point hologram support area is advantageous: it enables high-speed calculation of wide-angle CGHs, since it requires a small amount of SBP to be processed. Numerical simulations prove that the proposed CGH generation algorithm handles very dense objects. The key step of the proposed CGH algorithm is the calculation method of the point source support area. For this, the concept of local spatial curvature of the wavefront, which is proportional to the slope of the tangent line to the PSR of the corresponding wavefront, is introduced. By using this concept, an approximate analytical solution for calculating the point source support area is developed that is computationally efficient and has negligible approximation error.

It has been experimentally shown that the resulting LED image has less speckle noise and a resolution comparable to that obtained with coherent light. This experimental conclusion is supported by the developed theory describing the spatial coherence effect in the wide-angle display. It is shown that the coherence effect is angle variant. The coherence analysis conducted for the parameters of the experimental system shows that the used LED with a diameter of 960 µm blurs the image only very slightly; moreover, for off-axis points the blurring effect decreases.

Funding

Politechnika Warszawska; Narodowe Centrum Nauki (UMO-2018/31/B/ST7/02980).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. L.-H. Lee, T. Braud, P. Zhou, L. Wang, D. Xu, Z. Lin, A. Kumar, C. Bermejo, and P. Hui, “All One Needs to Know about Metaverse: A Complete Survey on Technological Singularity, Virtual Ecosystem, and Research Agenda,” arXiv 14, 1–66 (2021). [CrossRef]  

2. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence-accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vis. 8(3), 33 (2008). [CrossRef]  

3. C. Chang, K. Bang, G. Wetzstein, B. Lee, and L. Gao, “Toward the next-generation VR/AR optics: a review of holographic near-eye displays from a human-centric perspective,” Optica 7(11), 1563–1578 (2020). [CrossRef]  

4. X. Yang, P. Song, H. Zhang, and Q.-H. Wang, “Full-color computer-generated holographic near-eye display based on white light illumination,” Opt. Express 27(26), 38236–38249 (2019). [CrossRef]  

5. X. Yang, S. Jiao, Q. Song, G.-B. Ma, and W. Cai, “Phase-only color rainbow holographic near-eye display,” Opt. Lett. 46(21), 5445–5448 (2021). [CrossRef]  

6. A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36(4), 1–16 (2017). [CrossRef]  

7. M. Y. He, D. Wang, Y. Xing, Y. W. Zheng, H. Le Zhang, X. L. Ma, R. Y. Yuan, and Q. H. Wang, “Compact and lightweight optical see-through holographic near-eye display based on holographic lens,” Displays 70, 1–6 (2021). [CrossRef]  

8. P. Sun, S. Chang, S. Liu, X. Tao, C. Wang, and Z. Zheng, “Holographic near-eye display system based on double-convergence light Gerchberg-Saxton algorithm,” Opt. Express 26(8), 10140–10151 (2018). [CrossRef]  

9. S. W. Nam, S. Moon, C. K. Lee, H. S. Lee, and B. Lee, “Aberration-corrected full-color holographic augmented reality near-eye display using a Pancharatnam-Berry phase lens,” Opt. Express 28(21), 30836–30850 (2020). [CrossRef]  

10. J.-S. Chen and D. P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23(14), 18143–18155 (2015). [CrossRef]  

11. J. Martinez-Carranza, T. Kozacki, R. Kukołowicz, M. Chlipala, and M. S. Idicula, “Occlusion culling for wide-angle computer-generated holograms using phase added stereogram technique,” Photonics 8(8), 298 (2021). [CrossRef]  

12. E. Buckley, A. Cable, N. Lawrence, and T. Wilkinson, “Viewing angle enhancement for two- And three-dimensional holographic displays with random superresolution phase masks,” Appl. Opt. 45(28), 7334–7341 (2006). [CrossRef]  

13. J. S. Lee, Y. K. Kim, and Y. H. Won, “See-through display combined with holographic display and Maxwellian display using switchable holographic optical element based on liquid lens,” Opt. Express 26(15), 19341–19355 (2018). [CrossRef]  

14. Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25(7), 8412–8424 (2017). [CrossRef]  

15. B. Kress and T. Starner, “A review of head-mounted displays (HMD) technologies and applications for consumer electronics,” Proc. SPIE 8720, 87200A (2013). [CrossRef]  

16. W. Song, X. Li, Y. Zheng, Y. Liu, and Y. Wang, “Full-color retinal-projection near-eye display using a multiplexing-encoding holographic method,” Opt. Express 29(6), 8098–8107 (2021). [CrossRef]  

17. S. Kazempourradi, E. Ulusoy, and H. Urey, “Full-color computational holographic near-eye display,” J. Inf. Disp. 20(2), 45–59 (2019). [CrossRef]  

18. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41(11), 2486–2489 (2016). [CrossRef]  

19. C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37(6), 1–14 (2018). [CrossRef]  

20. X. Duan, J. Liu, X. Shi, Z. Zhang, and J. Xiao, “Full-color see-through near-eye holographic display with 80° field of view and an expanded eye-box,” Opt. Express 28(21), 31316–31329 (2020). [CrossRef]  

21. Z. Zhang, J. Liu, X. Duan, and Y. Wang, “Enlarging field of view by a two-step method in a near-eye 3D holographic display,” Opt. Express 28(22), 32709–32720 (2020). [CrossRef]  

22. M. Chlipala and T. Kozacki, “Color LED DMD holographic display with high resolution across large depth,” Opt. Lett. 44(17), 4255–4258 (2019). [CrossRef]  

23. S. Lee, D. Kim, S. W. Nam, B. Lee, J. Cho, and B. Lee, “Light source optimization for partially coherent holographic displays with consideration of speckle contrast, resolution, and depth of field,” Sci. Rep. 10(1), 1–12 (2020). [CrossRef]  

24. T. Kozacki, M. Chlipala, and P. L. Makowski, “Color Fourier orthoscopic holography with laser capture and an LED display,” Opt. Express 26(9), 12144–12158 (2018). [CrossRef]  

25. J. H. Park, “Recent progress in computer-generated holography for three-dimensional scenes,” J. Inf. Disp. 18(1), 1–12 (2017). [CrossRef]  

26. T. Ichikawa, K. Yamaguchi, and Y. Sakamoto, “Realistic expression for full-parallax computer-generated holograms with the ray-tracing method,” Appl. Opt. 52(1), A201–209 (2013). [CrossRef]  

27. K. Matsushima and S. Nakahara, “Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method,” Appl. Opt. 48(34), H54–H63 (2009). [CrossRef]  

28. H. Kang, T. Yamaguchi, and H. Yoshikawa, “Accurate phase-added stereogram to improve the coherent stereogram,” Appl. Opt. 47(19), D44–D54 (2008). [CrossRef]  

29. R. Kukołowicz, M. Chlipala, J. Martinez-Carranza, M. S. Idicula, and T. Kozacki, “Fast 3D Content Update for Wide-Angle Holographic Displays,” Appl. Sci. 12(1), 1–15 (2022). [CrossRef]  

30. T. Ichikawa, T. Yoneyama, and Y. Sakamoto, “CGH calculation with the ray tracing method for the Fourier transform optical system,” Opt. Express 21(26), 32019–32031 (2013). [CrossRef]  

31. R. Watanabe, K. Yamaguchi, and Y. Sakamoto, “Fast calculation method of computer generated hologram animation for viewpoint parallel shift and rotation using Fourier transform optical system,” Appl. Opt. 55(3), A167–A176 (2016). [CrossRef]  

32. T. Zhan, E. L. Hsiang, K. Li, and S. T. Wu, “Enhancing the optical efficiency of near-eye displays with liquid crystal optics,” Crystals 11(2), 1–8 (2021). [CrossRef]  

33. J. Xia, W. Zhu, and I. Heynderickx, “Three-dimensional electro-holographic retinal display,” 49th Annu. SID Symp., Semin., and Exhib. 2011, Disp. Week 2011, 2, 591–594 (2011).

34. O. Mendoza-Yero, G. Mínguez-Vega, and J. Lancis, “Encoding complex fields by using a phase-only optical element,” Opt. Lett. 39(7), 1740–1743 (2014). [CrossRef]  

35. X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21(18), 20577–20587 (2013). [CrossRef]  

36. T. Shimobaba, N. Masuda, and T. Ito, “Simple and fast calculation algorithm for computer-generated hologram with wavefront recording plane,” Opt. Lett. 34(20), 3133–3135 (2009). [CrossRef]  

37. L. Shi, F. C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36(6), 1–17 (2017). [CrossRef]  

38. R. Cicala, “The Camera Versus the Human Eye,” https://petapixel.com/2012/11/17/the-camera-versus-the-human-eye/.

Supplementary Material (1)

Visualization 1: This video presents the 360° reconstruction of a point cloud representing an 18th-century British sailing ship. The cloud has 12 million points and physical dimensions of 560 × 460 × 236 mm (width, height, and depth, respectively).
