Vol. 26, No. 24 | 26 Nov 2018 | OPTICS EXPRESS 31094

Micro-tomography via single-pixel imaging

JUNZHENG PENG,1,2 MANHONG YAO,1 JIAJIAN CHENG,1 ZIBANG ZHANG,1,2 SHIPING LI,1,2 GUOAN ZHENG,3 AND JINGANG ZHONG1,2,*

1 Department of Optoelectronic Engineering, Jinan University, Guangzhou 510632, China
2 Guangdong Provincial Key Laboratory of Optical Fiber Sensing and Communications, Jinan University, Guangzhou 510632, China
3 Biomedical Engineering, University of Connecticut, Storrs, Connecticut 06269, USA
* Corresponding author: [email protected]

Abstract: Tomographic imaging allows for the cross-sectional imaging of a specimen, whereas single-pixel imaging can produce images with only a spatially non-resolving detector. Here we propose a compact tomographic imaging system based on single-pixel imaging. This approach uses a digital micromirror device (DMD) to encode the spatial information of the specimen and employs an array of single-pixel detectors to record the light signals from different directions. For each single-pixel detector, we can retrieve an image of the specimen from a unique perspective angle. Based on the retrieved images, we can realize tomographic imaging, such as intensity image refocusing and three-dimensional (3D) differential-phase-contrast imaging, without mechanically scanning the specimen. Experimental results also demonstrate that micro-tomographic images with 384×384 pixels can be obtained simultaneously with an array of only 5×6 single-pixel detectors. Furthermore, owing to the broad operational spectrum of single-pixel detectors, the proposed method is a good candidate for tomographic imaging in non-visible wavebands, such as terahertz and x-ray, and thus could open up opportunities in many life science and engineering fields. © 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Tomographic imaging is a vital tool for biomedical research, as it produces a group of image slices of the specimen [1–5]. Stacking these image slices together forms a set of volumetric data of the sample. To obtain the image slices, however, conventional tomographic imaging methods usually require scanning the specimen along the optical axis or at different angles. In either case, the success of tomographic imaging depends critically on precise and expensive devices to control the specimen or the illumination beam. As an alternative, light field microscopy based on a micro-lens array can realize micro-tomographic imaging without the need for scanning [6, 7]. However, it imposes a trade-off between the spatial and angular resolutions (i.e., one can obtain densely sampled images in the spatial domain with sparse samples in the angular domain, and vice versa). A low angular resolution leads to severe aliasing artifacts in refocusing, and a low spatial resolution produces low-quality images. Recently, an approach based on an LED matrix [8] provided a solution for tomographic imaging while bypassing the trade-off between the spatial and angular resolutions. However, this approach is performed in the visible region of the spectrum, where camera technology is well developed. For other regions of the spectrum, such as infrared and terahertz, tomographic imaging traditionally requires complicated, bulky, and expensive cameras. Therefore, a compact and cost-effective tomographic imaging system that can operate efficiently across a much broader spectral range is still highly desirable. Here we propose micro-tomography via single-pixel imaging [9–23]. In our approach, we use a digital micromirror device (DMD) to encode the illumination patterns and employ a single-pixel detector array to collect the light signal from the specimen. Similar to light field microscopy, we achieve tomographic imaging by recording both the spatial and angular information of the light field. For each single-pixel detector, we can produce a 2D image of the specimen from a unique perspective angle by single-pixel imaging. Based on the recovered images with different perspective angles, we can realize tomographic imaging by digitally refocusing the specimen at different depths, without physically scanning the specimen. We can further use the recovered perspective images to generate 3D differential-phase-contrast images without changing the setup. Different from light field microscopy, the spatial information of the proposed approach comes from the encoded illumination and the angular information comes from the distinct locations of the single-pixel detectors. Hence, the proposed method removes the trade-off between the spatial and angular resolutions. The proposed approach provides a connection between single-pixel imaging and micro-tomographic imaging. The broad operational spectrum of the single-pixel detector allows the proposed method to be applied in a wide variety of wavebands such as infrared and terahertz. Hence, our scheme might benefit various applications in biomedical inspection as well as security screening.

#341220 Journal © 2018 | https://doi.org/10.1364/OE.26.031094 | Received 1 Aug 2018; revised 6 Oct 2018; accepted 9 Oct 2018; published 12 Nov 2018

2. Experimental system

2.1. Experimental setup

The configuration of the proposed system is shown in Fig. 1. The collimated light beam emitted from an LED source (center wavelength: 633 nm) is directed onto the DMD (0.95'', 1920×1080 pixels, pixel size: 10.8 µm) by a reflecting mirror. We then display a set of illumination patterns on the DMD and project them onto the specimen through a tube lens (focal length: 200 mm) and an objective lens (10×, NA 0.25). Finally, the light transmitted through the specimen is collected by an array of single-pixel detectors on the right. For experimental convenience, we employ a CCD camera (Point Grey GS3-U3-60QS6C-C, 1.00'' CCD, 2736×2192 pixels, pixel size: 4.54 µm) to build the array of single-pixel detectors. Each single-pixel detector is constructed by binning 240×240 pixels of the camera. Owing to the use of the single-pixel imaging method, the resolution of the retrieved image does not rely on the size of the single-pixel detectors, but depends on the objective lens and the pixel size of the DMD.


Fig. 1. Schematic diagram of the experimental setup, with fTL and fOL denoting the focal lengths of the tube lens and the objective lens, respectively.


2.2. Acquisition of the spatial and angular information

Since measurements are performed via a single-pixel imaging scheme, the proposed setup encodes the 2D spatial information into a 1D intensity sequence of individual pixels. As such, each single-pixel detector records a 1D intensity sequence. For the case where Fourier basis patterns are used, one can assemble the Fourier spectrum of the specimen from the single-pixel measurements [12]. Applying the inverse Fourier transform to the Fourier spectrum yields the image of the specimen. As such, each single-pixel detector generates a 2D image of the specimen from the sequential single-pixel measurements.

Fig. 2. Reciprocal configuration of Fig. 1. (a) Object AB is located at the focal plane, (b) object CD is located in front of the focal plane, (c) object EF is located behind the focal plane.

Apart from retrieving the specimen image (also termed the 2D spatial information), the proposed system allows recording of the angular information. To understand the principle of recording the angular information, we show the reciprocal configuration in Fig. 2. Since single-pixel


imaging is subject to the Helmholtz reciprocity, the proposed setup is equivalent to its reciprocal configuration with an LED array shown in Fig. 2. Specifically, the single-pixel detector array shown in Fig. 1 is equivalent to the LED array, where each single-pixel detector is equivalent to an LED element. Meanwhile, the DMD is equivalent to the 2D image sensor shown in Fig. 2. Compared to the size of the specimen, the distance between the LED array and the specimen is so large that the light delivered onto the specimen can be treated as plane waves [24]. We can draw the following conclusions based on Fig. 2:
i) Each individual LED illuminates the specimen at a unique angle.
ii) If the object is located at the focal plane, its image on the sensor plane is unchanged whether it is illuminated by the on-axis or an off-axis LED, as shown in Fig. 2(a).
iii) When the object deviates from the focal plane, its projection on the sensor plane is laterally shifted, as shown in Figs. 2(b) and 2(c).
iv) When the object extends over different depths, features at different depths are laterally shifted on the sensor plane by different amounts and in different directions, as shown in Figs. 2(b) and 2(c).
Based on Fig. 2, we can capture a set of perspective images by sequentially illuminating the specimen with different LEDs. Each captured image provides a specific perspective view of the specimen. Although the light travels in opposite directions in the two imaging systems shown in Figs. 1 and 2, the images retrieved from both are equivalent because of the reciprocity principle. For the proposed system in Fig. 1, each single-pixel detector therefore produces a 2D image with a unique perspective view of the specimen. Different from the light field microscope using an LED array, we do not need to illuminate the specimen sequentially. Instead, all perspective images in our system are simultaneously recorded by all single-pixel detectors.
For the perspective image retrieved by one single-pixel detector, its resolution is determined by that of the Fourier basis patterns loaded on the DMD. For example, to reconstruct an image with N × N pixels, the resolution of the Fourier basis patterns should be set to N × N. Note that the pixel size of the DMD here is larger than the diffraction limit of the objective lens. Therefore, the lateral resolution of the perspective images is determined by the magnification of the objective lens (denoted as M) and the pixel size of the DMD (denoted as ∆ps), and can be calculated as ∆ps/M.

3. Method

3.1. Methods of intensity refocusing and 3D DPC imaging

Based on the perspective images retrieved from different detectors, we can reconstruct the specimen at different depths with the shifting-and-adding algorithm [8]. For example, if n × n single-pixel detectors are used to retrieve the perspective images, the digital refocusing procedure can be summarized as follows.
1) Determine the refocusing depth ∆z.
2) Calculate the incident angles (θx, θy) for each single-pixel detector via tan θx = xi/d, tan θy = yi/d, where (xi, yi) is the coordinate of the single-pixel detector and d is the distance between the sample and the image sensor of the camera.
3) Calculate the shift amounts for each perspective image, ∆x = ∆z · xi/d and ∆y = ∆z · yi/d, and shift each perspective image by (∆x, ∆y).
4) Add all shifted images together to synthesize the refocused image.

Another capability of the proposed system is differential-phase-contrast (DPC) imaging without changing the hardware. DPC imaging is an important tool for life science, as it provides label-free phase contrast for transparent samples. In a conventional optical microscope, DPC can be realized by illuminating the sample with two sets of opposite sources (e.g., capture one image IL with the left side of the source and another image IR with the right side of the source). The DPC image is then calculated as the normalized difference between the two images:

IDPC = (IL − IR)/Itot,  (1)

where Itot = IL + IR. Since the single-pixel imaging technique is subject to the principle of Helmholtz reciprocity, we can endow the proposed system with the same DPC imaging capability. Taking the left-right DPC image as an example, the implementation steps are as follows: (1) retrieve one image using the values measured by the left-side single-pixel detectors; (2) retrieve another image using the values measured by the right-side single-pixel detectors; (3) calculate the DPC image as the normalized difference between the two images. The top-bottom DPC image can obviously be obtained with similar steps. Inspired by the concept of 3D DPC [25, 26], we can further endow the proposed system with 3D DPC imaging capability without changing any hardware. The realization process is similar to that of intensity image refocusing, except for the last step. Taking the 3D left-right DPC as an example, after shifting all perspective images according to the refocusing depth, we separately add the shifted images retrieved from the left-half and right-half detectors to synthesize the refocused images:

IL^∆z = Σ_{NL} Ii^∆z,  (2)

IR^∆z = Σ_{NR} Ii^∆z,  (3)

where Ii^∆z denotes the shifted intensity image at depth ∆z from the ith detector, and NL and NR represent the numbers of left-half and right-half images, respectively. Finally, the refocused DPC images at depth ∆z are calculated as

ILR^∆z = (IL^∆z − IR^∆z)/Itot^∆z,  (4)

ITB^∆z = (IT^∆z − IB^∆z)/Itot^∆z,  (5)

where Itot^∆z = IL^∆z + IR^∆z denotes the refocused bright-field image at depth ∆z. Since the left-right and top-bottom DPC images represent the phase gradient of the specimen along the horizontal and vertical directions, respectively, they can be used to visualize different features of the specimen.
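The shift-and-add refocusing and the left-right DPC synthesis above can be sketched in numpy as follows (function and variable names are my own; shifts are rounded to whole pixels, whereas a practical implementation would interpolate for sub-pixel shifts):

```python
import numpy as np

def refocus(images, coords, d, dz):
    """Shift-and-add refocusing (steps 1-4 above); a sketch.

    images: list of 2D perspective images, one per single-pixel detector
    coords: matching list of detector coordinates (xi, yi)
    d:      distance between the sample and the camera sensor
    dz:     refocusing depth
    """
    acc = np.zeros_like(images[0], dtype=float)
    for img, (xi, yi) in zip(images, coords):
        sx = int(round(dz * xi / d))   # step 3): dx = dz*xi/d
        sy = int(round(dz * yi / d))   # step 3): dy = dz*yi/d
        acc += np.roll(img, (sy, sx), axis=(0, 1))
    return acc                          # step 4): sum of the shifted images

def dpc_refocus(images, coords, d, dz):
    """3D left-right DPC, Eqs. (2)-(4): refocus each detector half separately."""
    left = [i for i, (xi, _) in enumerate(coords) if xi < 0]
    right = [i for i, (xi, _) in enumerate(coords) if xi > 0]
    IL = refocus([images[i] for i in left], [coords[i] for i in left], d, dz)
    IR = refocus([images[i] for i in right], [coords[i] for i in right], d, dz)
    return (IL - IR) / (IL + IR)        # Eq. (4); assumes a nonzero denominator

# Toy check: a feature at depth dz appears shifted by -dz*xi/d in each
# perspective view, so refocusing at dz realigns all copies of the feature.
base = np.ones((9, 9))
base[4, 4] = 2.0
coords = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]
views = [np.roll(base, (0, -int(round(2.0 * xi))), axis=(0, 1)) for xi, _ in coords]
assert refocus(views, coords, d=1.0, dz=2.0)[4, 4] == 6.0
```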

3.2. Experimental procedure

The measurement procedure of the proposed system is the same as that of a conventional single-pixel imaging setup. In general, reconstructing a 2D image via a single-pixel detector requires modulating the object with spatially or temporally varying patterns. Recently, a number of illumination strategies, such as random patterns [9], Hadamard basis patterns [10, 11], and Fourier basis patterns (also called sinusoidal intensity patterns) [12], have been reported to improve the reconstruction quality as well as to reduce the acquisition time. Since Fourier basis patterns are more efficient than the other two strategies [27], we use them for modulating the specimen in our implementation. With the Fourier basis patterns, we perform Fourier spectrum measurements to acquire the spatial information of the object. Each Fourier coefficient is obtained using four sinusoidal patterns with phase shifts of π/2 between them. The resolution of the reconstructed images is 384×384 pixels in our experiments. To maximize the projection speed, we use binary Fourier basis patterns [28] in the experiment. Meanwhile, to ensure the quality of the retrieved image, we up-sample the binary Fourier basis patterns to 768×768 pixels on the DMD.
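The four-step acquisition can be simulated end-to-end. The sketch below (all names illustrative; a small grayscale Fourier-basis simulation rather than the binary patterns used in the experiment) "measures" every Fourier coefficient of a synthetic object with four sinusoidal patterns of phase shifts 0, π/2, π, and 3π/2 and a simulated bucket detector, then recovers the object by an inverse FFT:

```python
import numpy as np

# Simulated four-step Fourier single-pixel acquisition (illustrative sketch).
# Patterns P(x, y) = a + b*cos(2*pi*(fx*x + fy*y)/N + s), with a = b = 0.5
# so that each pattern stays within [0, 1].
N = 32
obj = np.random.default_rng(1).random((N, N))   # stand-in specimen transmittance
y, x = np.mgrid[0:N, 0:N]

spectrum = np.zeros((N, N), dtype=complex)
for fy in range(N):
    for fx in range(N):
        theta = 2 * np.pi * (fx * x + fy * y) / N
        # four bucket (total-intensity) measurements per Fourier coefficient
        d = [np.sum(obj * (0.5 + 0.5 * np.cos(theta + s)))
             for s in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
        # four-step combination: (D0 - Dpi) + j*(Dpi/2 - D3pi/2) = 2b * F[fy, fx]
        spectrum[fy, fx] = (d[0] - d[2]) + 1j * (d[1] - d[3])

recovered = np.fft.ifft2(spectrum).real   # 2b = 1 here, so this equals the object
assert np.allclose(recovered, obj)
```

Because the object is real-valued, its spectrum is conjugate-symmetric, so in practice only about half of the coefficients need to be measured.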


4. Experimental results

4.1. Reconstruction of perspective images

We use a thick specimen (a cotton aphid) to test the capability of recording the spatial and angular information. The distance between the specimen and the array of single-pixel detectors is 33 mm. We first use an array of 8×10 single-pixel detectors. Each single-pixel detector is built from the CCD camera at intervals of 240 pixels, so the pitch between adjacent single-pixel detectors is 240 × 0.00454 = 1.09 mm. After projecting the Fourier basis patterns onto the specimen, the light rays are simultaneously recorded by the 8×10 single-pixel detectors from different angles. Based on the light signal collected by each single-pixel detector, we can retrieve a unique perspective image via the Fourier single-pixel imaging algorithm. Figure 3 shows the reconstructed images from 9 different single-pixel detectors.
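The construction of the detector array by pixel binning can be sketched as follows (the 240×240 bin size follows the text; the camera frame here is a synthetic stand-in):

```python
import numpy as np

# Emulate an 8x10 array of single-pixel detectors by summing 240x240-pixel
# blocks of a camera frame: each block acts as one bucket detector.
frame = np.random.default_rng(2).random((8 * 240, 10 * 240))
buckets = frame.reshape(8, 240, 10, 240).sum(axis=(1, 3))  # one value per detector
assert buckets.shape == (8, 10)
```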


Fig. 3. Perspective images of the cotton aphid retrieved from the (a) 1st, (b) 5th, (c) 10th, (d) 41st, (e) 45th, (f) 50th, (g) 71st, (h) 75th, and (i) 80th single-pixel detectors. The positions of these single-pixel detectors are given in Fig. 1.

From Fig. 3, it can be seen that when the images are reconstructed from single-pixel detectors at different positions in the horizontal direction, features of the specimen at different depths, such as those highlighted by the red solid ellipses in Figs. 3(a)–3(c), are horizontally shifted from each other. Similarly, when the images are reconstructed from single-pixel detectors at different positions in the vertical direction, features at different depths, such as those highlighted in Figs. 3(b), 3(e), and 3(h), are vertically shifted from each other. These results agree well with the analysis given in Fig. 2. They also indicate that the proposed system can simultaneously record the 2D angular and


spatial information. More importantly, because the resolution of the image retrieved from each single-pixel detector depends only on the illumination system, i.e., on the resolution of the modulation patterns loaded on the DMD, the angular information can be recorded on the detection side without sacrificing the spatial resolution. Furthermore, all angular information is simultaneously recorded by all single-pixel detectors in our implementation. Thus, the measurement time is not increased compared with the case of using one single-pixel detector.

4.2. Intensity image refocusing

Using the perspective images retrieved above, we obtain the digital refocusing results shown in Fig. 4. The digital refocusing z-stack is animated in an AVI-format movie (Visualization 1). From these results, it can be seen that different depth sections, such as the head and tentacle of the specimen shown in Fig. 4, can be refocused without physically moving the sample along the axial direction. For a microscope equipped with only one single-pixel detector, however, the retrieved image provides just one perspective view of the specimen, as shown in Fig. 3, and thus cannot be used to separate the profiles of the specimen at different depths. In addition, all single-pixel detectors work simultaneously during the measurement process, and as such, we can recover the 3D intensity images without increasing the number of measurements compared with using one single-pixel detector.


Fig. 4. Intensity images refocused at different depths by using an array of 8 × 10 single-pixel detectors.

4.3. 3D DPC imaging

Based on the perspective images, we can also synthesize a group of 3D DPC images of a thick specimen without changing the experimental setup. Different from the refocused intensity images shown in Fig. 4, the DPC images shown in Fig. 5 represent the phase contrast of the specimen. Details that lack good contrast in the intensity images, such as the edge profile of the specimen in Fig. 5, are rendered much more clearly in the DPC images. The reader can refer to Visualization 2 and Visualization 3 for detailed information.

4.4. Tomographic imaging with a small number of single-pixel detectors

As earlier researchers have pointed out [29, 30], the object information changes much more slowly in the angular dimensions than in the spatial dimensions. Hence, if the depth variation of the specimen is relatively uniform, tomographic imaging can be implemented with a small number of single-pixel



Fig. 5. Left-right DPC images refocused at different depths by using the proposed system.

detectors. To verify this, we synthesized a group of results using an array of 5×6 single-pixel detectors, given in Fig. 6. Each single-pixel detector is built from the CCD camera at intervals of 400 pixels, so the pitch between adjacent single-pixel detectors is 400 × 0.00454 = 1.82 mm. Compared with Fig. 4, the image quality in Fig. 6 remains almost unchanged, except for Fig. 6(a). The small aliasing problem in Fig. 6(a) may be caused by the reduced number of single-pixel detectors; we believe it can be resolved with a deconvolution algorithm [31]. We note that the number of detectors needed to retrieve the perspective images depends on the complexity of the tested target: a complex target has large depth variations and therefore requires more single-pixel detectors to retrieve more perspective images for tomographic imaging. It is possible to further develop an algorithm that determines the number of single-pixel detectors adaptively [32].


Fig. 6. Intensity images refocused at different depths by using an array of 5 × 6 single-pixel detectors.


5. Discussion

5.1. Axial resolution

One of the important parameters of tomographic imaging is the axial resolution ∆z, i.e., the ability to distinguish features at different depths. For the proposed method, because the refocused images are synthesized from the perspective images with the shifting-and-adding method, the axial resolution (also called the minimum refocusing step size) is determined by the shift amounts of the perspective images. For example, to refocus the specimen at depth ∆z, the maximum shift amount sm of the perspective images should be larger than one pixel,

sm = ∆z · tan(θm) · M ≥ ∆ps,  (6)

where ∆ps denotes the pixel size of the perspective image, θm is the largest angle between the optical axis of the objective lens and the outermost single-pixel detector (as shown in Fig. 7), and M is the magnification of the objective lens. Therefore, the axial resolution of the proposed method can be formulated as

∆z ≥ (∆ps/M) · cot(θm).  (7)

For the proposed system equipped with a 10× objective lens, ∆ps = 10.8 × 2 = 21.6 µm, M = 10, and tan(θm) = (2376 × 4.54)/(2 × 1000 × 33) ≈ 0.16. Using Eq. (7), the axial resolution of the proposed system is then calculated to be 13.5 µm. In addition, Eq. (7) suggests how to improve the axial resolution: one can increase θm of the single-pixel detector array or the magnification of the objective lens, or reduce the pixel size ∆ps.

Fig. 7. Schematic diagram used to analyze the axial resolution.
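Equation (7) can be checked numerically with the parameters quoted above (the text rounds tan θm to 0.16, which yields the 13.5 µm figure; the unrounded value gives a slightly smaller bound):

```python
# Numeric check of Eq. (7) using the system parameters quoted in the text.
M = 10                                         # objective magnification
dps = 10.8e-6 * 2                              # DMD pixel size after 2x up-sampling (m)
tan_theta_m = (2376 * 4.54e-6) / (2 * 33e-3)   # outermost detector geometry, ~0.163
dz_min = (dps / M) * (1.0 / tan_theta_m)       # Eq. (7): ~13.2 um (13.5 um with tan = 0.16)
lateral = dps / M                              # lateral resolution of Sec. 2.2: 2.16 um
```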

To verify the theoretical analysis above, a slide of fly mouthparts was measured while changing θm of the single-pixel detector array. Because this sample is smaller than the cotton aphid, here we used another objective lens (16×, NA 0.4) to project the modulation patterns onto the specimen. Figures 8(a)–8(d) list the digital refocusing results of this specimen at different depths when θm was set to 10.5°. Figures 8(e)–8(h) give another set of results refocused at the same depths when θm was set to 14.5°. From these figures, it can be seen that a larger θm yields refocused images with better sectioning ability, such as in the area highlighted with the dashed ellipse in Fig. 8. These results agree well with the theoretical analysis above. Note that θm cannot exceed the acceptance angle set by the NA of the objective lens; otherwise, we will obtain a dark-field perspective image from the outermost single-pixel detector, and a high-quality bright-field refocused image cannot be synthesized from dark-field perspective images.



Fig. 8. Comparison of axial resolution: refocusing results at different depths when θm was set to 10.5° (a–d) and 14.5° (e–h), respectively. The images in each column were refocused at the same depth.

5.2. Advantages and disadvantages of the proposed method

Compared with conventional tomographic imaging technologies, the presented approach has several advantages. First, since single-pixel detectors provide a broader spectral range than conventional cameras, the proposed method can be readily extended to non-visible wavebands via an array of single-pixel detectors. As such, the reported scheme may benefit many applications that require tomographic imaging with non-visible light. Second, the spatial information of the proposed approach comes from the encoded illumination and the angular information comes from the distinct locations of the single-pixel detectors. Therefore, the proposed method eliminates the trade-off between the spatial and angular resolutions in conventional light field imaging. Limited by our experimental conditions, tomographic imaging was implemented with a CCD camera whose measurement speed is 25 frames per second. Hence, our technique has only been demonstrated with a static specimen. We believe this drawback can be overcome with an array of bucket detectors read out by multi-channel data acquisition cards, because the sampling rate of a data acquisition card is much faster than the pattern rate of the DMD. In that case, the measurement speed of the proposed system depends on the refresh rate of the DMD. A state-of-the-art DMD can generate 22,000 binary patterns per second, so the image shown in Fig. 3 could be obtained in 2 seconds. Recently, Satat et al. have reported an ultrafast single-pixel imaging scheme using light pulses and a time-resolved sensor [33]. This approach may open the door to increasing the speed of single-pixel tomographic imaging. Another limitation of the proposed method is that the size of the measurement data increases with the resolution of the perspective images: the greater the resolution, the slower the reconstruction. However, since natural images tend to be compressible in Fourier space, this drawback can be alleviated by using compressive sensing, e.g., one can reconstruct the perspective images by selectively sampling the Fourier coefficients.

6. Conclusion

In conclusion, we report a compact micro-tomographic imaging approach via single-pixel imaging. Experimental results demonstrate that the proposed system can be used to realize


tomographic imaging without physically scanning the sample. It can also realize 3D DPC imaging without hardware alterations. With an array of bucket detectors, the proposed method can be extended to non-visible wavebands. Future work includes extending the current scheme to enhance resolution and contrast, and to perform polarization imaging.

Funding
National Natural Science Foundation of China (NSFC) (61475064, 61605126, 61605063, 61875074); Natural Science Foundation of Guangdong Province, China (2015A030310458); Fundamental Research Funds for the Central Universities (21617403).

Acknowledgments
We would like to thank Dr. Deng Dingnan and He Wenqi from Shenzhen University, China, and Mr. Liu Shijie from the Department of Optoelectronic Engineering, Jinan University, China, for their help with experimental preparation.

References
1. F. Charrière, A. Marian, F. Montfort, J. Kuehn, T. Colomb, E. Cuche, P. Marquet, and C. Depeursinge, "Cell refractive index tomography by digital holographic microscopy," Opt. Lett. 31(2), 178–180 (2006).
2. W. Choi, C. Fang-Yen, K. Badizadegan, S. Oh, N. Lue, and M. Feld, "Tomographic phase microscopy," Nat. Methods 4, 717–719 (2007).
3. Y. Sung, W. Choi, C. Fang-Yen, K. Badizadegan, R. Dasari, and M. Feld, "Optical diffraction tomography for high resolution live cell imaging," Opt. Express 17(1), 266–277 (2009).
4. J. Conchello and J. W. Lichtman, "Optical sectioning microscopy," Nat. Methods 2, 920–931 (2005).
5. J. Mertz, "Optical sectioning microscopy with planar or structured illumination," Nat. Methods 8, 811–819 (2011).
6. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, "Light field microscopy," ACM Trans. Graph. 25(3), 924–934 (2006).
7. M. Levoy, Z. Zhang, and I. McDowall, "Recording and controlling the 4D light field in a microscope using microlens arrays," J. Microsc. 235(2), 144–162 (2009).
8. G. Zheng, C. Kolner, and C. Yang, "Microscopy refocusing and dark-field imaging by using a simple LED array," Opt. Lett. 36(20), 3987–3989 (2011).
9. M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, "Single-pixel imaging via compressive sampling," IEEE Signal Process. Mag. 25(2), 83–91 (2008).
10. S. S. Welsh, M. P. Edgar, R. Bowman, P. Jonathan, B. Sun, and M. J. Padgett, "Fast full-color computational imaging with single-pixel detectors," Opt. Express 21(20), 23068–23074 (2013).
11. P. Clemente, V. Durán, E. Tajahuerce, P. Andrés, V. Climent, and J. Lancis, "Compressive holography with a single-pixel detector," Opt. Lett. 38(14), 2524–2527 (2013).
12. Z. Zhang, X. Ma, and J. Zhong, "Single-pixel imaging by means of Fourier spectrum acquisition," Nat. Commun. 6, 6225 (2015).
13. M.-J. Sun, M. P. Edgar, D. B. Phillips, G. M. Gibson, and M. J. Padgett, "Improving the signal-to-noise ratio of single-pixel imaging using digital microscanning," Opt. Express 24(10), 10476–10485 (2016).
14. J. Yang, L. Gong, X. Xu, P. Hai, Y. Shen, Y. Suzuki, and L. Wang, "Motionless volumetric photoacoustic microscopy with spatially invariant resolution," Nat. Commun. 8, 780 (2017).
15. W. L. Chan, K. Charan, D. Takhar, K. F. Kelly, R. G. Baraniuk, and D. M. Mittleman, "A single-pixel terahertz imaging system based on compressed sensing," Appl. Phys. Lett. 93(12), 121105 (2008).
16. T. Mohr, A. Herdt, and W. Elsässer, "2D tomographic terahertz imaging using a single pixel detector," Opt. Express 26(3), 3353–3367 (2018).
17. N. Radwell, K. Mitchell, G. Gibson, M. Edgar, R. Bowman, and M. Padgett, "Single-pixel infrared and visible microscope," Optica 1(5), 285–289 (2014).
18. G. Gibson, B. Sun, M. Edgar, D. Phillips, N. Hempler, G. Maker, G. Malcolm, and M. Padgett, "Real-time imaging of methane gas leaks using a single-pixel camera," Opt. Express 25(4), 2998–3005 (2017).
19. J. Greenberg, K. Krishnamurthy, and D. Brady, "Compressive single-pixel snapshot x-ray diffraction imaging," Opt. Lett. 39(1), 111–114 (2014).
20. A. Zhang, Y. He, L. Wu, L. Chen, and B. Wang, "Tabletop x-ray ghost imaging with ultra-low radiation," Optica 5(4), 374–377 (2018).
21. B. Sun, M. P. Edgar, R. Bowman, L. E. Vittert, S. Welsh, A. Bowman, and M. J. Padgett, "3D computational imaging with single-pixel detectors," Science 340(6134), 844–847 (2013).
22. Z. Zhang and J. Zhong, "Three-dimensional single-pixel imaging with far fewer measurements than effective image pixels," Opt. Lett. 41(11), 2497–2500 (2016).
23. M. J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, "Single-pixel three-dimensional imaging with time-based depth resolution," Nat. Commun. 7, 12010 (2016).
24. G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nat. Photonics 7(9), 739–745 (2013).
25. L. Tian, J. Wang, and L. Waller, "3D differential phase-contrast microscopy with computational illumination using an LED array," Opt. Lett. 39(5), 1326–1329 (2014).
26. L. Tian and L. Waller, "3D intensity and phase imaging from light field measurements in an LED array microscope," Optica 2(2), 104–111 (2015).
27. Z. Zhang, X. Wang, G. Zheng, and J. Zhong, "Hadamard single-pixel imaging versus Fourier single-pixel imaging," Opt. Express 25(16), 19619–19639 (2017).
28. Z. Zhang, X. Wang, G. Zheng, and J. Zhong, "Fast Fourier single-pixel imaging via binary illumination," Sci. Rep. 7, 12029 (2017).
29. M. Levoy and P. Hanrahan, "Light field rendering," in Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques (ACM, 1996), pp. 31–42.
30. T. Georgiev, K. C. Zheng, B. Curless, D. Salesin, S. Nayar, and C. Intwala, "Spatio-angular resolution tradeoff in integral photography," in Proceedings of the Eurographics Symposium on Rendering (2006).
31. M. Broxton, L. Grosenick, S. Yang, N. Cohen, A. Andalman, K. Deisseroth, and M. Levoy, "Wave optics theory and 3-D deconvolution for the light field microscope," Opt. Express 21(21), 25418–25439 (2013).
32. J. Chai, X. Tong, S. C. Chan, and H. Y. Shum, "Plenoptic sampling," in Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques (ACM, 2000), pp. 307–318.
33. G. Satat, M. Tancik, and R. Raskar, "Lensless imaging with compressive ultrafast sensing," IEEE Trans. Comput. Imaging 3(3), 398–407 (2017).