Article pubs.acs.org/ac

Color Difference Amplification between Gold Nanoparticles in Colorimetric Analysis with Actively Controlled Multiband Illumination

Xiaodong Cheng, Dinggui Dai, Zhiqin Yuan, Lan Peng, Yan He,* and Edward S. Yeung

State Key Laboratory of Chemo/Biosensing and Chemometrics, College of Chemistry and Chemical Engineering, College of Biology, Hunan University, Changsha, 410082, China

*S Supporting Information

ABSTRACT: Spectral chemical sensing with digital color analysis by using consumer imaging devices could potentially revolutionize personalized healthcare. However, samples with small spectral variations often cannot be differentiated in color due to the nonlinearity of color appearance. In this study, we address this problem by exploiting the color image formation mechanism in digital photography. A close examination of the color image processing pipeline emphasizes that although color can be represented digitally, it is still a reproducible subjective perception rather than a measurable physical property. That makes it possible to physically manage the color appearance of a nonradiative specimen through engineered illumination. By using scattering light imaging of gold nanoparticles (GNPs) as a model system, we demonstrated via simulation that enlarged color difference between spectrally close samples could be achieved with actively controlled illumination of multiple narrow-band light sources. Experimentally, darkfield imaging results indicate that the color separation of single GNPs with various sizes can be significantly improved and the detection limit of GNP aggregation-based colorimetric assays can be much reduced when the conventional spectrally continuous white light is replaced with three independently intensity-controlled laser beams, even though the laser lines are uncorrelated with the LSPR maxima of the GNPs. With low-cost narrow-band light sources widely available today, this actively controlled illumination strategy could be utilized to replace the spectrometer in many spectral sensing applications.



INTRODUCTION

Over the past two decades, gold nanoparticles (GNPs) that can give rise to localized surface plasmon resonance (LSPR) have attracted wide interest in the fields of chemical, physical, material, and biomedical science.1−3 Since the LSPR absorbance and scattering spectral maxima of GNPs are highly sensitive to their composition, size, shape, and surrounding environment,4 a large number of chemical sensing and disease diagnosis schemes have been developed based on the detectable LSPR spectral shift of functionalized GNPs.5−9 However, a common limitation associated with most of these assays is that a grating-based spectrometer is needed to acquire the LSPR spectra. Unlike light sources and detectors that can be readily miniaturized and integrated into consumer devices, common spectrometers cannot be decreased much in size or manufactured inexpensively, posing a bottleneck in developing consumer applications for personalized healthcare. Nonetheless, electromagnetic radiation in the 380−700 nm wavelength range with a certain intensity level directly correlates to the sensuous perception of color by human eyes. Because a large spectral shift in the visible region is always accompanied by a noticeable color change, simple and low-cost colorimetric sensing schemes are actively developed for GNP-based detection of heavy metal ions10−13 and small molecules.14 Especially, with the fast development of digital color imaging in recent years, cameras15−18 and smart-phones19−21 are increasingly utilized for quantitative evaluation of colors in gas sensing,22,23 biomolecule detection,24−26 nanoparticle sizing,27 etc. Compared to spectral analysis, however, digital color analysis is generally not as capable in differentiating small spectral variations. While the resolution of spectral measurement can be improved systematically by increasing the line density of the diffraction grating and decreasing the width of the entrance slit, colorimetric sensing often suffers from the nonlinearity of color response. Both human eyes and color cameras are highly sensitive to spectral variation in some wavelength regions, e.g., most of the 540−590 nm yellow−green range, but are insensitive in other regions, e.g., the 600−700 nm red region28 and the so-called MacAdam ellipses specified on the CIE (International Commission on Illumination) 1931 chromaticity diagram.29 As a result, many small spectral shifts may not

Received: April 16, 2014
Accepted: July 14, 2014
Published: July 22, 2014

© 2014 American Chemical Society

dx.doi.org/10.1021/ac501448w | Anal. Chem. 2014, 86, 7584−7592

Figure 1. (A) Schematic diagram of the color image processing pipeline of digital color cameras, usually including four major steps: (1) raw image acquisition, (2) CFA demosaicing, (3) colorspace transform, and (4) tone-mapping/gamma correction. (B) Schematic illustration of the process of color matching by using three monochromatic primaries at wavelengths of 700 nm (red), 546.1 nm (green), and 435.8 nm (blue).33 (C) The CIE 1931 RGB color matching functions.34

display identifiable color change in their "natural" appearance. Moreover, color analysis often encounters the color inconsistency problem. One spectrum may exhibit various colors under different lighting conditions, making it difficult to judge whether a color change is caused by a certain spectral shift. Even with the same white light illumination and camera, reliable color measurements may not be achieved if the position of the light source, the exposure time of the camera, or the white balance of the image is not fixed, and these conditions may not be controlled well in many situations. Therefore, although simple and straightforward to use, digital color analysis so far has not been widely adopted for spectral analysis applications. In this study, we present a new framework to improve the applicability and reliability of colorimetric analysis. We start by correcting a general misunderstanding about color imaging. It is well-known that in digital color cameras with a pixelated sensor chip, the incoming light is detected via three bandpass filters in the red, green, and blue wavelength regions, respectively, and the color information at each pixel is represented numerically by an RGB triplet. To people who are not familiar with color science, it may therefore appear that a color image is simply a combination of three grayscale images obtained from the three primary color channels, and RGB analysis such as calculating the R/G ratio has been thought to be similar to ratiometric analysis of light intensities detected at the red and green wavelengths. In fact, as a subjective sensation rather than an objective reality, the color appearance can only be reproduced but not measured experimentally. To render the acquired image "faithfully" to human vision, an image processing pipeline or "color engine" must be implemented in digital cameras.
While that makes it difficult to associate the “acquired” color information to the spectral profile of the specimen, it also becomes possible to manipulate the color appearance at various stages of the pipeline. Subsequently, we take advantage of this

color reproduction mechanism and replace the white light illumination with engineered illumination of multiple narrowband light sources, i.e. individually intensity-controlled red, green, and blue laser beams. With darkfield imaging of plasmonic GNPs as a model system, we demonstrate with both simulations and experiments that chromaticity coordinates of the GNPs can be manipulated by varying the intensity ratios of these laser beams. Such actively controlled illumination enables direct observation of enlarged color difference between spectrally close GNPs that would otherwise appear almost the same color under the white light. This method is similar to adding a multiband bandpass filter with independently variable transmission efficiencies into the light path. With today’s readily available narrow-band light sources and low-cost digital color imaging devices, we expect this simple strategy to be widely used to replace spectrometer-based spectral analysis with colorimetric sensing in various fields.
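The effect of the laser intensity ratios on the perceived chromaticity can be illustrated with a toy numerical sketch: treat each laser as a single wavelength, weight the sample's scattering spectrum at those wavelengths by adjustable intensities, and project the result onto color matching functions. Note that the Gaussian matching functions and the sample spectrum below are illustrative stand-ins, not the CIE 1931 data or measured GNP spectra:

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Toy stand-ins for the x, y, z color matching functions (illustrative only)
def cmf(lam):
    return (gauss(lam, 600, 40), gauss(lam, 550, 40), gauss(lam, 450, 40))

LASERS = (635.0, 532.0, 473.0)  # R, G, B laser lines (nm)

def chromaticity_under_lasers(s, weights):
    """Chromaticity [x, y] of a sample s(lambda) illuminated by three
    laser lines with independently controlled intensities."""
    X = Y = Z = 0.0
    for lam, w in zip(LASERS, weights):
        xb, yb, zb = cmf(lam)
        X += w * s(lam) * xb
        Y += w * s(lam) * yb
        Z += w * s(lam) * zb
    total = X + Y + Z
    return X / total, Y / total

# A GNP-like scattering spectrum peaking near 550 nm (toy stand-in)
s = lambda lam: gauss(lam, 550, 30)

print(chromaticity_under_lasers(s, (1, 1, 1)))  # equal mix
print(chromaticity_under_lasers(s, (6, 1, 4)))  # ratio used in the paper
```

Changing the weight ratio moves the sample's chromaticity coordinates, while scaling all three weights together leaves them unchanged, which is the sense in which the illumination acts like a multiband filter with variable transmission efficiencies.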



EXPERIMENTAL SECTION

Chemicals and Materials. HAuCl4·3H2O and sodium citrate were purchased from Sinopharm Chemical (Shanghai, China). 2-Mercaptosuccinic acid (MSA) was obtained from Sigma-Aldrich (U.S.A.). Ultrapure water with a resistivity of 18.2 MΩ·cm, produced by a Millipore Milli-Q apparatus, was used in all experiments. Preparation and characterization of the MSA- and thymine mercaptol (T-SH)-modified GNPs were performed according to the literature, and the details are described in the Supporting Information.

Darkfield Imaging. For darkfield microscopy, a Nikon 80i upright microscope equipped with a 100-W halogen tungsten lamp, an NA = 1.20−1.43 oil immersion darkfield condenser, and a 60× plan fluor oil immersion objective with iris was used. For laser illumination, the lamp was replaced with R−G−B lasers of 473, 532, and 635 nm (CNI Laser, Changchun, China). The color images were obtained with an Olympus DP72 color

CCD camera. The acquired sRGB TIFF images were analyzed using ImageJ and MATLAB.

RESULTS AND DISCUSSION

Color Image Formation Mechanism. Figure 1A shows the simplified color image processing pipeline in most digital cameras, which includes four major steps: raw image acquisition, demosaicing, colorspace transform, and tone-mapping/gamma correction.30 The related issues have been extensively studied in color science and digital photography but are too specialized and sometimes even confusing for many chemists. Particularly, the term RGB is quite misleading for nonspecialists, as it has been overused to convey different meanings in different contexts. To avoid confusion, in this paper, the term RGB is supplemented with subscripts, slashes, and/or hyphens for different meanings. Most digital color cameras nowadays utilize a single-chip CCD or CMOS imaging sensor with three channels sensitive to the long, middle, and short wavelength visible lights, respectively. This is usually implemented by using a Bayer color filter array (CFA) placed on top of a monochrome sensor chip, where each pixel is designed to detect only one of the three primary colors. Upon light impingement onto the sensor chip surface, every pixel, depending on the wavelength range of the microfilter on top of it, acquires a stimulus value on one of the three primaries, which can be quantitatively described as31

R0 = ∫0^∞ I0(λ) s(λ) Dr(λ) dλ

G0 = ∫0^∞ I0(λ) s(λ) Dg(λ) dλ

B0 = ∫0^∞ I0(λ) s(λ) Db(λ) dλ    (1)

where I0(λ) is the power spectral distribution function of the light source; s(λ) is the characteristic spectral response of the sample; and Dr(λ), Dg(λ), and Db(λ) are the spectral sensitivity curves of the red, green, and blue color channels of the CFA, respectively. The obtained R0, G0, or B0 values, after correction of device-dependent reading errors, have a high dynamic range (12- or 14-bit) and are linear with respect to the incoming light intensity. We denote them as R/G/Braw, which can be stored as the raw images in high-end camera models. Depending on the requirement for scene-specific chromatic adaptation, the relative sensitivity of the three channels may need to be adjusted here by manual or auto white balance. Notice that to mimic the color responses of eyes throughout the visible wavelength range, the spectral sensitivity curves of the CFA cannot be selected arbitrarily. Figure S1 shows those of the human cone cells and several camera models. It can be seen that they are all spectrally wide and have large overlaps. Therefore, most wavelengths will not be detected by just one of the three channels, and there is considerable channel crosstalk at the raw image level. Nevertheless, after the image acquisition, every pixel is "labeled" by one of the red, green, or blue colors, and the raw image is in fact a combination of three grayscale images, each having 1/2 or 1/4 of its original size. To produce a high-resolution image with every pixel containing all three of the R0, G0, and B0 values, the missing two at each pixel have to be interpolated from the information on neighboring pixels, a process referred to as demosaicing. This can be done by a variety of algorithms such as nearest neighbor, bilinear, gradient-based, and high-quality linear interpolation.32 The ones used by commercial cameras are typically more complex and are proprietary "black boxes" to end users. Noise reduction and image sharpening are usually also implemented at this stage to remove artifacts due to the interpolation. Notably, although complicated demosaicing algorithms improve the overall imaging quality without sacrificing the spatial resolution and the linearity of the intensity response, they inevitably lead to further crosstalk of the red, green, and blue channels. The acquired as well as the interpolated R0, G0, and B0 values of the three primaries obtained after demosaicing at each pixel form an RGBsensor triplet. Although they have already encoded the spectral information at the pixel, what we get here are only three numbers from overlapping channels. To find out their color representation to human vision, each RGBsensor triplet must be mapped to a canonical colorspace, e.g., the CIE XYZ colorspace. The CIE XYZ colorspace is an absolute mathematical abstraction or numerical representation of human color perception and is so far the most widely used color reference model. It was originally derived through matching the color appearance of each wavelength in the visible region to that of the linear combination of three single wavelengths, i.e., 435, 546, and 700 nm, with standard human eyes as the difference detector (Figure 1B).34,35 Three color matching functions r̄(λ), ḡ(λ), and b̄(λ) were constructed (Figure 1C) and can be used to find out the R−G−B combination for any visible irradiation. Particularly, for a nonradiative sample with spectrum s(λ) under illumination of I0(λ), the color-matched combination is given by36

R700 = ∫380^700 [I0(λ) s(λ)] r̄(λ) dλ

G546 = ∫380^700 [I0(λ) s(λ)] ḡ(λ) dλ

B435 = ∫380^700 [I0(λ) s(λ)] b̄(λ) dλ    (2)
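On a discrete wavelength grid, the integrals in eqs 1 and 2 reduce to weighted sums. A minimal numerical sketch of eq 1 follows; the Gaussian channel sensitivities and the sample spectrum are toy stand-ins, not a real camera's Dr, Dg, and Db curves or measured GNP spectra:

```python
import math

# 1 nm wavelength grid over the visible band
lam = list(range(380, 701))

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Toy stand-ins for the wide, overlapping CFA sensitivities (cf. Figure S1)
Dr = [gauss(l, 600, 50) for l in lam]
Dg = [gauss(l, 540, 45) for l in lam]
Db = [gauss(l, 460, 40) for l in lam]

def raw_rgb(I0, s):
    """Eq 1 as a Riemann sum: R0 = sum over lambda of I0*s*Dr (dlambda = 1 nm)."""
    R0 = sum(i * t * d for i, t, d in zip(I0, s, Dr))
    G0 = sum(i * t * d for i, t, d in zip(I0, s, Dg))
    B0 = sum(i * t * d for i, t, d in zip(I0, s, Db))
    return R0, G0, B0

# Flat "white" illuminant vs. three narrow laser lines at 635/532/473 nm
# with the 6:1:4 intensity ratio used later in the paper
white = [1.0] * len(lam)
lasers = [6 * gauss(l, 635, 1) + 1 * gauss(l, 532, 1) + 4 * gauss(l, 473, 1)
          for l in lam]

# A GNP-like scattering spectrum peaking near 550 nm (toy stand-in)
s = [gauss(l, 550, 30) for l in lam]

print(raw_rgb(white, s))   # raw triplet under white light
print(raw_rgb(lasers, s))  # same sample, different channel balance
```

The same sample yields a different channel balance under the two illuminants, which is the handle that engineered illumination exploits before any color processing takes place.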

A normalized matrix transform on the obtained R700, G546, and B435 values, which form an RGBlinear triplet, gives the device-independent XYZ tristimulus values36

⎡X⎤      1      ⎡ 0.49     0.31     0.20    ⎤ ⎡R700⎤
⎢Y⎥ = ───────── ⎢ 0.17697  0.81240  0.01063 ⎥ ⎢G546⎥
⎣Z⎦    0.17697  ⎣ 0.00     0.01     0.99    ⎦ ⎣B435⎦    (3)
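This transform can be sketched directly, with the matrix entries taken from eq 3; the chromaticity normalization x = X/(X + Y + Z) used below to check the white point is the one the text defines next (eq 4):

```python
# RGB_linear -> XYZ matrix from eq 3
M = [[0.49,    0.31,    0.20],
     [0.17697, 0.81240, 0.01063],
     [0.00,    0.01,    0.99]]

def rgb_to_xyz(r700, g546, b435):
    """Eq 3: device-independent XYZ tristimulus values."""
    rgb = (r700, g546, b435)
    return tuple(sum(M[i][j] * rgb[j] for j in range(3)) / 0.17697
                 for i in range(3))

def xy_chromaticity(X, Y, Z):
    """Luminance-independent [x, y] chromaticity coordinates (eq 4)."""
    total = X + Y + Z
    return X / total, Y / total

# Every row of M sums to 1, so equal amounts of the three primaries
# (R = G = B) map to X = Y = Z, i.e., the white point x = y = 1/3
x, y = xy_chromaticity(*rgb_to_xyz(1.0, 1.0, 1.0))
```

Scaling the RGBlinear triplet by any factor scales X, Y, and Z together and leaves [x, y] unchanged, which is why the chromaticity serves as the intensity-independent readout.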

The CIE XYZ colorspace is not associated with any real color. It was deliberately designed to characterize the luminance of a color by the value Y and the chromaticity of a color by two nonnegative values x and y (with the white point designated to x = y = 1/3)

⎡x⎤   ⎡X⎤
⎢y⎥ = ⎢Y⎥ / (X + Y + Z)    (4)
⎣z⎦   ⎣Z⎦

The [x,y] coordinates of all colors form a tongue-shaped CIE x−y chromaticity diagram (Figure S2).21 By definition, the chromaticity value of a color is independent of its luminance, so it contains the intensity-independent spectral


information on an object and is thus the optimal parameter for relating a color change to a spectral shift. With the CIE XYZ colorspace as reference, all that is needed is to determine for each RGBsensor triplet the corresponding XYZ tristimulus values and/or the RGBlinear triplet. This is generally performed by establishing a camera-specific color correction matrix or lookup table (LUT) together with a white balance algorithm. All possible combinations of the RGBlinear triplets constitute the CIE RGB colorspace that occupies an enclosed triangle on the CIE chromaticity diagram (Figure S2). By choosing different sets of the three primary colors and the white point (specified by their chromaticity coordinates), different RGB colorspaces have been created. Among them, the most popular one is the sRGB colorspace, which has been used extensively by monitors, cameras, printers, and many other digital devices. Notice that the physical meaning of the RGBsensor triplet is totally different from that of the RGBlinear triplet. The former are the pixel values resulting from weighing the incoming power spectral distribution function of the illumination/sample with three wideband bandpass filters that have considerable channel crosstalk. The latter are the relative amounts of the three single-wavelength primaries in their orthogonal combination, which describe a sensuously equivalent physical representation of the former. The only bridge between the RGBsensor triplet and the RGBlinear triplet is human eyes! An important inference here is that purely numerical manipulations of the R/G/Braw values, such as calculating the red/green ratio, would "downgrade" colorimetric analysis to ratiometric analysis of two heavily overlapped channels and would not provide good spectral resolution. Indeed, the acquired R/G/Braw and RGBsensor values as well as the derived RGBlinear and sRGB values are all various formats of differential inputs that provoke the "sensory response" of the color camera, which, just like human eyes, could in principle differentiate millions of colors regardless of whether or not there are overlaps between the input channels. Therefore, to realize the full power of colorimetric analysis, one should quantify the color difference in a colorspace. With the CIE XYZ colorspace standardizing device-independent chromaticity reproduction of color, the last major step in the image processing pipeline of the camera is to reproduce the brightness of color for display. According to the discussions earlier, the R/G/Braw, RGBsensor, RGBlinear, and XYZ values all show a linear dependence on the light intensity, usually with a high dynamic range (HDR, 12-bit or more). But display devices such as CRT/LCD monitors, printers, and projectors can only reproduce a very limited dynamic range (LDR, less than 8-bit). If an HDR image is proportionally reduced to an LDR one, the high intensity part of the image will appear too bright while the low intensity part will become hardly discernible, because the eyes are more sensitive to luminance change in dark scenes than in bright scenes. Therefore, a nonlinear tone-mapping operation is usually applied to the RGBlinear values to preserve the image details and make a subjectively "pleasing" reproduction of the scenes.37 The tone-mapping algorithm of every camera model is unique and mostly proprietary but can be estimated by a calibration procedure if the operation is spatially uniform and is based only on the luminance level at each pixel. Finally, to compensate for the nonlinear input−output characteristics of display monitors, the sRGB colorspace specification also incorporates a gamma encoding procedure, the "display gamma," which is now a standard adopted by all the popular color image formats such as JPEG and TIFF. Hence, after tone-mapping and gamma-encoding, the obtained sRGB values redistribute the HDR tonal levels of the camera into LDR ones that are more perceptually uniform and make the most efficient use of the bit-depth of color video displays or computer graphics. It is noteworthy that when calculating color readout parameters such as the chromaticity, the hue, or the R/G ratio, most previous reports did not consider the gamma-encoding or tone-mapping effect, so their findings were inevitably biased by the luminance factor. Taken together, color is not an objective measurable quantity; rather, it is a stimulated sensation that can only be simulated with good approximation. The whole purpose of the color image processing pipeline is to "faithfully" reproduce human perception of the colors of a real scene. This is performed, after the initial image acquisition via three different wavelength channels, in three consecutive steps: spatial resolution reproduction via demosaicing, chromaticity reproduction via colorspace transform through linear mixing of three single-wavelength primaries, and brightness reproduction via tone-mapping and gamma correction. While the input/output interfaces of digital color cameras function the same way as those of monochrome cameras, their inner data flow involves nonobjective domain conversions (though largely objectified thanks to the advancement of color science) and is thus clearly different from that of conventional analytical devices such as a spectrometer. Therefore, although the color appearance of a sample is related to its intrinsic spectrum, there is no one-to-one correspondence between them. That explains why different spectra could show the same color and the same spectrum may exhibit different colors under different lighting conditions. While this uncertainty makes things complicated, it also implies new opportunities when using colorimetric sensing for spectral analysis.

Color Difference Amplification with Multiband Illumination: Simulations. With a clear understanding of the color image formation mechanism, we can now postulate a scheme for applicable and reliable colorimetric analysis. As mentioned earlier, a major problem in color sensing-based spectral analysis is that small spectral shifts may not lead to detectable color change. Many people, however, passively settle for the "natural color" because of the underlying assumption that color is a measurable property like the spectral maximum. In reality, the color appearance of a nonradiative sample is determined by both its own spectrum and the power distribution function of the light source, and is created through orthogonal combination of three monochromatic lights. Therefore, when the color difference between two spectrally close samples is too small under the usual broadband white light illumination, it is possible to manipulate the color appearance and improve the color separation with multiple narrow-band light sources whose intensities are controlled independently. To demonstrate this concept, we use the combination of three individually intensity-modulated red (635 nm), green (532 nm), and blue (473 nm) laser beams as the artificial light source and compare its performance with the halogen tungsten lamp (the conventional white light in microscopy) in darkfield scattering imaging of GNPs. Prior to the experiments, we simulated the color images of four GNPs with sizes of 30, 50, 70, and 90 nm under the two different lighting conditions. Figure 2A shows the normalized LSPR scattering spectra of the GNPs


Figure 2. (A) Calculated scattering spectra for 30, 50, 70, and 90 nm GNPs. (B) The spectral profiles of the white light and R−G−B lasers (I635/I532/I473 = 6:1:4). (C) Simulated scattering images of the GNPs under the white light (1−4) and the R−G−B laser (5−8) illumination. (D) The corresponding chromaticity [u′,v′] coordinates of the GNP color images on the CIE u′−v′ diagram.

calculated based on the Mie theory, whose peak wavelengths are 535, 542, 550, and 569 nm, respectively. Figure 2B shows the measured spectral profile of the white light and the simulated spectral power distribution function of the R−G−B lasers with intensity ratio I635/I532/I473 = 6:1:4. Figure 2C shows the calculated LSPR scattering images of the GNPs. It can be seen that under the white light illumination, the 30, 50, and 70 nm GNPs all appear green with small color differences, and the 90 nm one looks yellow. But under the R−G−B laser illumination, the 30 nm GNP looks a little blue and the 90 nm one appears red, displaying an obvious shift of the color appearance and an enlargement of the color separation. To quantify the chromaticity difference among the GNPs, their [x,y] coordinates are calculated according to the color matching functions (eqs 2−4). As shown in the CIE x−y diagram (Figure S3), the separation distances of the GNP [x,y] coordinates are clearly widened when the white light is replaced with the R−G−B lasers. One problem, however, is that these enlarged [x,y] coordinate differences are only semiquantitatively meaningful, since the CIE XYZ colorspace is not perceptually uniform; i.e., the same distance on the x−y chromaticity diagram might not produce a change of about the same visual importance. For better quantification of the color difference and the chromaticity difference, another colorspace transform from CIE XYZ to the perceptually more uniform CIE LUV is adopted38

L* = (29/3)³ (Y/Yn),           for Y/Yn ≤ (6/29)³
L* = 116 (Y/Yn)^(1/3) − 16,    for Y/Yn > (6/29)³

u* = 13L* × (u′ − un′)
v* = 13L* × (v′ − vn′)    (5)

where

u′ = 4x/(−2x + 12y + 3) = 4X/(X + 15Y + 3Z)
v′ = 9y/(−2x + 12y + 3) = 9Y/(X + 15Y + 3Z)    (6)

Here, the chromaticity coordinates [un′,vn′] and luminance Yn specify the reference white point. The numerically comparable color difference and the luminance-independent chromaticity difference can then be calculated from the Euclidean distances of the [L*,u*,v*] and [u′,v′] coordinates, respectively, by38

ΔE*uv = [(ΔL*)² + (Δu*)² + (Δv*)²]^(1/2)    (7)

ΔC*uv = [(Δu′)² + (Δv′)²]^(1/2)    (8)

Figure 2D and rows 1 and 2 in Table S1 show the [u′,v′] coordinates of the 30, 50, 70, and 90 nm GNPs in the CIE u′−v′ chromaticity diagram under the two lighting conditions. With R−G−B laser illumination, the ΔC*uv variations in the 30/50 nm, 50/70 nm, and 70/90 nm pairs enlarge by 10%, 67%, and 133%, respectively. Such amplified chromaticity separation can apparently be attributed to a higher percentage of the blue and red light in the R−G−B laser combination, which emphasizes the spectral difference in the long and short wavelength regions of the GNP spectra. The above simulation assumes that the laser beams have single-wavelength line spectra with 0.2 nm bandwidth, but the results are similar even if the bandwidth is increased. Rows 3 to 5 in Table S1 show the calculated [u′,v′] coordinates as well as the ΔC*uv values when the bandwidth of each line of the R−G−B incident beams is increased to 5, 20, and 30 nm, respectively. Also shown in row 6 are the values when the R−G−B beams centered at the laser wavelengths are replaced with typical narrow-band R−G−B LEDs centered at similar wavelengths (470, 525, and 625 nm) with ∼30 nm bandwidth for each LED.28 The intensity ratios of the R−G−B beams are kept unchanged at I635/I532/I473 = 6:1:4. The spectral profiles of these illuminators are displayed in Figure S4. It can be seen that in almost all of these cases with narrow-band R−G−B lights, the ΔC*uv variations between the GNPs are significantly larger than those obtained with white light illumination (Figure 3). Interestingly, somewhat wider bandwidths seem to lead to better results. Therefore, the color differences of the GNPs can be enlarged by using a variety of low-cost and easy-to-integrate light sources such as LEDs and laser diodes, making this engineered multiband illumination strategy very appealing for practical applications.
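The XYZ-to-LUV transform and the two difference measures used above can be sketched directly from eqs 5−8. This is a minimal sketch with the white point supplied as an XYZ triplet, not a vetted colorimetry library:

```python
def uv_prime(X, Y, Z):
    """Eq 6: the [u', v'] chromaticity coordinates."""
    d = X + 15 * Y + 3 * Z
    return 4 * X / d, 9 * Y / d

def xyz_to_luv(X, Y, Z, Xn, Yn, Zn):
    """Eq 5: [L*, u*, v*] relative to the white point (Xn, Yn, Zn)."""
    t = Y / Yn
    L = (29 / 3) ** 3 * t if t <= (6 / 29) ** 3 else 116 * t ** (1 / 3) - 16
    u, v = uv_prime(X, Y, Z)
    un, vn = uv_prime(Xn, Yn, Zn)
    return L, 13 * L * (u - un), 13 * L * (v - vn)

def delta_E(luv1, luv2):
    """Eq 7: color difference, Euclidean distance in [L*, u*, v*]."""
    return sum((a - b) ** 2 for a, b in zip(luv1, luv2)) ** 0.5

def delta_C(uvp1, uvp2):
    """Eq 8: luminance-independent chromaticity difference in [u', v']."""
    return ((uvp1[0] - uvp2[0]) ** 2 + (uvp1[1] - uvp2[1]) ** 2) ** 0.5
```

Scaling a sample's overall intensity changes ΔE*uv through L* but leaves ΔC*uv untouched, which matches how the two measures are used here: ΔC*uv for pure chromaticity separation and ΔE*uv for the perceptual difference compared against the just noticeable difference.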


Figure 3. Effect of the spectral bandwidth of the R−G−B lights on the ΔC*uv,RGB/ΔC*uv,white of the GNP color images. The six different light sources are: L1, white light; L2−L5, synthesized R−G−B lights with the same intensity ratio (I635/I532/I473 = 6:1:4) and bandwidths of 0.2 (laser), 5, 20, and 30 nm, respectively; L6, real R−G−B LEDs (I625/I525/I470 = 6:1:4).

Besides I635/I532/I473 = 6:1:4, other intensity ratios of the three beams were studied as well. Figure S5 shows the simulated chromaticity u′ and v′ coordinate values of the GNPs as a function of the relative intensity of beam R while keeping the intensities of beams G and B unchanged. It can be seen that for all the GNPs, the u′ values increase and the v′ values decrease monotonically with rising beam R intensity, but there is not much variation in the u′ separation distance or v′ separation distance between different GNPs at different beam R intensities. Hence, the enlargement of the color separation is not sensitive to the intensities of the R−G−B beams, though the ratios do affect their color appearance and determine whether their chromaticity coordinates are in the sensitive region of human eyes. This is very appealing because in many situations the light source intensities may not be controlled accurately.

Color Difference Amplification with R−G−B Lasers: Experiments. As a proof-of-concept experiment, single GNPs dispersed on coverglass surfaces were imaged under R−G−B laser darkfield microscopy (Figure S6). To extract the intensity-independent chromaticity values from standard gamma-encoded sRGB images, a calibration-based method was established. Figure S7 displays the spectral and imaging data along with the analysis results related to 4 nearby GNPs with apparently different sizes, including the scattering spectra of these 4 particles acquired via a transmission grating under white light (Figure S7A), the R−G−B laser spectra (Figure S7B), the scattering images of the GNPs under white light (Figure S7C) and R−G−B laser illumination (Figure S7D), and the chromaticity [u′,v′] coordinates of the GNPs under the two lighting conditions (Figure S7E). The experimental results are obviously consistent with those of the simulation. Figure 4 shows the imaging results from monodisperse MSA-modified GNPs with average sizes of 42, 54, and 81 nm, respectively. Again, R−G−B laser illumination leads to larger color separation than the white light illumination (Figure 4); the ΔC*uv,42/54 and ΔC*uv,54/81 variations in the 42/54 nm and 54/81 nm pairs increase from 0.009 to 0.019 and from 0.056 to 0.139, respectively. By constructing a calibration curve, this method could be utilized for efficient and sensitive single-GNP sizing. Further studies along this direction are under way.

Finally, as a preliminary demonstration of practical application, this actively controlled illumination scheme was applied to a food-safety related GNP aggregation assay. It has been reported that melamine was used as an illegal food additive in milk production in some places. Based on the fact that a melamine molecule can form hydrogen bonds with multiple thymine moieties, Huang and co-workers developed a colorimetric method for its detection based on melamine-induced aggregation of thymine-modified GNPs, whose limit of detection is ∼200 nM with UV−vis spectrometry.39 In our experiments, we prepared T-SH modified 47 nm GNPs (Figure S8) and added melamine into the GNP solution at final concentrations of 0, 20, 80, 200, and 400 nM, respectively. The solutions were then subjected to digital-photo imaging and darkfield imaging. It can be seen that with digital-photo imaging of the transmitted natural light (Figure 5A), some color change from pink to purple can be observed at 400 nM melamine, but there is little color difference in the 0 to 80 nM solutions. With darkfield imaging of the LSPR scattering of the GNPs under white light illumination (Figure 5B), the five solutions all appear green with their [u′,v′] coordinates clustered together (Figure 5D), though the 200 and 400 nM ones appear a little darker due to the scattering intensity difference. However, under the R−G−B laser illumination (Figure 5C), there is a clear color change of the GNP solutions from yellow green to red with increasing melamine concentration. Notably, the 0 and 20 nM solutions exhibit a noticeable color difference, and the [u′,v′] coordinates of the five solutions are well

Figure 4. (A) TEM images of 42, 54, and 81 nm GNPs. The scale bar is 100 nm. (B) Size distributions of the three GNPs calculated from A. (C, E) Darkfield scattering images of the GNPs obtained under (C) white light illumination and (E) R−G−B laser illumination. The scale bar is 5 μm. (D and F) Distributions of ΔC*uv = ((u′)² + (v′)²)^(1/2) values of the corresponding GNP color images in C and E, respectively.

dx.doi.org/10.1021/ac501448w | Anal. Chem. 2014, 86, 7584−7592


Figure 5. Images and the corresponding chromaticity coordinates of five SH-T modified GNP solutions after adding different concentrations of melamine. (A) Digital-photo pictures of the test tubes. (B, C) Darkfield scattering images obtained under (B) white light illumination and (C) R−G−B laser illumination. The scale bar is 10 μm. (D) Chromaticity [u′,v′] coordinates of the color images in B and C on the chromaticity u′−v′ diagram. The inset is the curve of color differences ΔE*uv,0 vs melamine concentration. The dashed line indicates the value of the JND (just noticeable difference).

separated (Figure 5D), suggesting a ∼10-fold reduction of the detection limit compared to that under white light illumination. Since the scattering intensity of each solution can also be obtained from its sRGB image, the improved sensitivity can be further quantified by calculating the luminance-dependent color difference ΔE*uv,0 with respect to the 0 nM solution according to eq 7. It has been suggested that a ΔE*uv value of around 2.9 corresponds to a just noticeable difference (JND) in the CIE LUV colorspace.40 As shown in Figure 5D, under white light illumination, the ΔE*uv,0 values of the 20 and 80 nM solutions are both below 1.5 and much less than the JND, making them indistinguishable in color from the 0 nM solution. Under the R−G−B laser illumination, however, the ΔE*uv,0 value of the 20 nM solution is already 6.7, which is over twice as large as the JND and suggests that the detection limit may be reduced further. Overall, we have demonstrated through both simulations and experiments that by using individually intensity-modulated narrow-band light sources, the colors of a set of spectrally similar GNP samples can be changed and their color separation can be enlarged. Notice that this method is different from postimaging color manipulation with image processing software, which is purely subjective and could vary from person to person. In contrast, engineered multiband illumination acts at the very beginning of the image processing pipeline with a clear physical significance. By replacing the continuous broadband white light with multiple discrete narrow-band lights, the initial spectral profile of the sample is changed before the raw R/G/B values are acquired. Therefore, eq 1 can be rewritten as

R0 = ∫0^∞ [s(λ)I0,multi(λ)]Dr(λ) dλ

G0 = ∫0^∞ [s(λ)I0,multi(λ)]Dg(λ) dλ

B0 = ∫0^∞ [s(λ)I0,multi(λ)]Db(λ) dλ    (9)

where I0,multi(λ) is the discrete power spectral distribution function of the multiband R, G, and B lights. By placing the I0,multi(λ) term between the sample and the detector, the light source essentially plays the role of bandpass filters or the entrance slit of a spectrograph with variable transmission efficiency. Therefore, the obtained R0, G0, and B0 values are equivalent to the detector responses of the discretely "filtered" light signal from the sample but can be acquired much more easily than with a spectrometer. Importantly, since the color is a reproduced rather than a directly measured quantity, there is no need for these "filters" to spectrally match the characteristic peak locations of the samples, making it possible to employ multiple "filters" simultaneously, with various wavelengths and bandwidths, to achieve the best color separation for distinct samples. Indeed, in the present study, the wavelengths of the R−G−B lasers were uncorrelated with the LSPR maxima of the GNPs. That also explains why the chromaticity enlargement is not affected by small fluctuations of the bandwidth or relative intensity of the narrow-band lights. The major limitation of this scheme is that the engineered spectral differences of the samples still have to be evaluated by the three channels of the CFA, which are mostly sensitive in the ∼450−600 nm range (Figure S1). Hence, to obtain a significant color difference, the sample spectra need to show enough variation within this wavelength region, regardless of whether their peak wavelengths lie in the region or not. As a result, our strategy could be utilized to enlarge the color difference of silver




nanoparticles with scattering light in the 390−460 nm range (Figure S9) and gold nanorods with scattering light in the 600−670 nm range (Figure S10). Further improvements can be made by designing new CFAs. In the future, the illumination need not be restricted to three beams of light. One can design a laser diode or narrow-band LED array that consists of 10−15 wavelengths and covers the entire visible range. Then, instead of carrying a bulky spectrometer or a set of optical filters for spectral analysis in the visible region, one can use just a color camera plus a small illuminator chip. Spectral analysis can be performed instantaneously by examining the color difference produced after applying a sequence of predefined light pulses to each sample. Such small but versatile lighting units could even be integrated into future smart phones for the development of personalized healthcare applications.
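To make the role of eq 9 and of the pulse-sequence idea concrete, the sketch below numerically integrates a sample spectrum against illumination spectra. Note that the CFA sensitivity curves, the GNP scattering spectrum, and the LED bank are illustrative Gaussians chosen for this sketch, not measured data from the paper.

```python
import numpy as np

lam = np.arange(380.0, 700.0, 0.05)  # wavelength grid in nm, fine enough for laser lines

def gaussian(center, fwhm):
    """Normalized-peak Gaussian spectral profile on the grid `lam`."""
    sigma = fwhm / 2.355  # convert FWHM to standard deviation
    return np.exp(-0.5 * ((lam - center) / sigma) ** 2)

# Illustrative CFA channel sensitivities D_r, D_g, D_b (not real camera data)
D_r, D_g, D_b = gaussian(600, 80), gaussian(540, 80), gaussian(460, 80)

def raw_rgb(s, I0):
    """Eq 9: R0, G0, B0 = integral of s(lam) * I0(lam) * D_x(lam) over lam."""
    return np.array([np.trapz(s * I0 * D, lam) for D in (D_r, D_g, D_b)])

# Hypothetical GNP scattering spectrum with an LSPR-like peak near 545 nm
s = gaussian(545, 60)

# Multiband illumination: 473/532/635 nm laser lines (0.2 nm FWHM), intensity ratio 4:1:6
I_multi = 4 * gaussian(473, 0.2) + 1 * gaussian(532, 0.2) + 6 * gaussian(635, 0.2)
I_white = np.ones_like(lam)  # flat broadband reference

w, m = raw_rgb(s, I_white), raw_rgb(s, I_multi)
print("white:", w, "multiband:", m)  # the channel balance shifts with illumination

# Pulse-sequence spectral fingerprint with a hypothetical 12-LED array (20 nm FWHM)
led_centers = np.linspace(400, 680, 12)
leds = np.array([gaussian(c, 20) for c in led_centers])
patterns = np.eye(12)  # predefined pulses: one LED at a time

def fingerprint(sample):
    """Per-pulse integrated response, a stand-in for one camera reading per pulse."""
    return np.array([np.trapz(sample * (p @ leds), lam) for p in patterns])

f1, f2 = fingerprint(gaussian(545, 60)), fingerprint(gaussian(555, 60))
```

Even for the two spectrally close samples (peaks 10 nm apart), the 12-element fingerprints differ, which is the discrimination the proposed illuminator chip would exploit.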


ASSOCIATED CONTENT

Supporting Information

Additional figures as mentioned in the main text. This material is available free of charge via the Internet at http://pubs.acs.org.



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

Notes

The authors declare no competing financial interest.



ACKNOWLEDGMENTS

We thank Dr. Jianping Li of Hong Kong Baptist University for helpful discussions. This work was supported by NSFC 21127009, NSFC 91027037, NSFC 21221003, Natural Science Foundation of Hunan Province 13JJ1015, and Hunan University 985 fund.



CONCLUSIONS

Although human eyes can differentiate millions of colors and high-quality color images can be readily acquired today, it remains a big challenge to utilize color difference detection for spectral shift measurements. A major obstacle is that the color appearance can be affected by many factors, and reproducible quantification of the color information can be achieved only under carefully fixed lighting and camera conditions that are not always practical. Furthermore, even when these conditions are well controlled, the sample may not exhibit a detectable color difference in response to the spectral variation because the color response is not linear. Those who mistake color for a spectrum-like measurable property would thus have little choice but to settle for the "natural color". In this study, we have made a significant step toward applicable and reliable colorimetric sensing for spectral analysis. By elucidating the color image formation mechanism in digital photography, we stress that the digital representation of the color is actually reproduced rather than measured by digital cameras. That makes it possible to control the color appearance at various stages of the color image processing pipeline. Therefore, we replace the conventional spectrally continuous white light with spectrally discrete, independently intensity-controlled narrow-band light sources. Simulations indicate that under darkfield microscopy, the color separation between plasmonic GNPs of various sizes is amplified. Further introduction of the CIE LUV colorspace allows quantitative comparison of color differences. Preliminary results on a GNP aggregation-based food-safety assay show that under actively controlled R−G−B laser illumination, the detection limit is about 10 times lower than under white light illumination.
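The chromaticity and color-difference quantities used throughout (u′, v′ from the CIE 1976 UCS diagram, and ΔE*uv in CIELUV) can be computed from a gamma-encoded sRGB pixel as sketched below. This is a generic implementation of the standard colorimetric formulas, not the paper's exact calibration procedure.

```python
import numpy as np

def srgb_to_linear(rgb):
    """Invert the standard sRGB gamma encoding (component values in [0, 1])."""
    rgb = np.asarray(rgb, dtype=float)
    return np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)

def linear_rgb_to_xyz(rgb_lin):
    """Linear sRGB (D65 white) -> CIE XYZ tristimulus values."""
    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    return M @ rgb_lin

def xyz_to_uv_prime(xyz):
    """CIE 1976 u', v' chromaticity coordinates (intensity independent)."""
    X, Y, Z = xyz
    d = X + 15 * Y + 3 * Z
    return 4 * X / d, 9 * Y / d

def delta_e_uv(xyz1, xyz2, xyz_white):
    """Luminance-dependent CIELUV color difference ΔE*_uv between two stimuli."""
    un, vn = xyz_to_uv_prime(xyz_white)
    Yn = xyz_white[1]

    def luv(xyz):
        Y = xyz[1]
        u, v = xyz_to_uv_prime(xyz)
        yr = Y / Yn
        L = 116 * yr ** (1 / 3) - 16 if yr > (6 / 29) ** 3 else (29 / 3) ** 3 * yr
        return np.array([L, 13 * L * (u - un), 13 * L * (v - vn)])

    return float(np.linalg.norm(luv(xyz1) - luv(xyz2)))
```

For example, a white pixel (R = G = B = 1) maps to u′ ≈ 0.198, v′ ≈ 0.468, the D65 white point; a ΔE*uv above roughly 2.9 between a sample and the blank would be counted as a just noticeable difference, as in Figure 5D.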
This engineered narrow-band illumination strategy is similar to adding multiple bandpass filters with variable transmission efficiencies into the light path but is practically more flexible and potentially more cost-effective than using a spectrometer or a filter wheel. Importantly, because the color is a "simulated reality" with no one-to-one correspondence to the intrinsic sample spectrum, there is no need to match the wavelengths of the lights to the characteristic spectra of the specimen. Therefore, the same light source combination (with different intensity ratios) can be used for all kinds of samples. With today's fully developed lighting industry and readily available consumer imaging devices, this strategy will open up a whole new avenue for spectral analysis applications in personalized healthcare, food safety, environmental monitoring, etc.



REFERENCES

(1) Saha, K.; Agasti, S. S.; Kim, C.; Li, X.; Rotello, V. M. Chem. Rev. 2012, 112, 2739−2779.
(2) Elghanian, R.; Storhoff, J. J.; Mucic, R. C.; Letsinger, R. L.; Mirkin, C. A. Science 1997, 277, 1078−1081.
(3) Dreaden, E. C.; Alkilany, A. M.; Huang, X.; Murphy, C. J.; El-Sayed, M. A. Chem. Soc. Rev. 2012, 41, 2740−2779.
(4) Willets, K. A.; Van Duyne, R. P. Annu. Rev. Phys. Chem. 2007, 58, 267−297.
(5) Mock, J. J.; Smith, D. R.; Schultz, S. Nano Lett. 2003, 3, 485−491.
(6) Raschke, G.; Kowarik, S.; Franzl, T.; Sönnichsen, C.; Klar, T. A.; Feldmann, J.; Nichtl, A.; Kürzinger, K. Nano Lett. 2003, 3, 935−938.
(7) Lee, J.-S.; Ulmann, P. A.; Han, M. S.; Mirkin, C. A. Nano Lett. 2008, 8, 529−533.
(8) Wu, Z.; Wu, Z.-K.; Tang, H.; Tang, L.-J.; Jiang, J.-H. Anal. Chem. 2013, 85, 4376−4383.
(9) Valentini, P.; Fiammengo, R.; Sabella, S.; Gariboldi, M.; Maiorano, G.; Cingolani, R.; Pompa, P. P. ACS Nano 2013, 7, 5530−5538.
(10) Du, J.; Yin, S.; Jiang, L.; Ma, B.; Chen, X. Chem. Commun. 2013, 49, 4196−4198.
(11) Guo, Y.; Wang, Z.; Qu, W.; Shao, H.; Jiang, X. Biosens. Bioelectron. 2011, 26, 4064−4069.
(12) Xue, X.; Wang, F.; Liu, X. J. Am. Chem. Soc. 2008, 130, 3244−3245.
(13) Wang, Y.; Yang, F.; Yang, X. R. Biosens. Bioelectron. 2010, 25, 1994−1998.
(14) Liu, D.; Wang, Z.; Jiang, X. Nanoscale 2011, 3, 1421−1433.
(15) Cantrell, K.; Erenas, M. M.; de Orbe-Payá, I.; Capitán-Vallvey, L. F. Anal. Chem. 2009, 82, 531−542.
(16) Liu, Y.; Ling, J.; Huang, C. Z. Chem. Commun. 2011, 47, 8121−8123.
(17) Suzuki, K.; Hirayama, E.; Sugiyama, T.; Yasuda, K.; Okabe, H.; Citterio, D. Anal. Chem. 2002, 74, 5766−5773.
(18) Hirayama, E.; Sugiyama, T.; Hisamoto, H.; Suzuki, K. Anal. Chem. 1999, 72, 465−474.
(19) Wei, Q.; Nagi, R.; Sadeghi, K.; Feng, S.; Yan, E.; Ki, S. J.; Caire, R.; Tseng, D.; Ozcan, A. ACS Nano 2014, 8, 1121−1129.
(20) Oncescu, V.; O'Dell, D.; Erickson, D. Lab Chip 2013, 13, 3232−3238.
(21) Shen, L.; Hagen, J. A.; Papautsky, I. Lab Chip 2012, 12, 4240−4243.
(22) Wang, X. D.; Meier, R. J.; Link, M.; Wolfbeis, O. S. Angew. Chem., Int. Ed. 2010, 49, 4907−4909.
(23) Ali, R.; Lang, T.; Saleh, S. M.; Meier, R. J.; Wolfbeis, O. S. Anal. Chem. 2011, 83, 2846−2851.
(24) Xiao, L.; Wei, L.; He, Y.; Yeung, E. S. Anal. Chem. 2010, 82, 6308−6314.


(25) Zou, B.; Liu, Y.; Yan, X.; Huang, C. Chin. Sci. Bull. 2013, 58, 2027−2031.
(26) Steiner, M.-S.; Meier, R. J.; Duerkop, A.; Wolfbeis, O. S. Anal. Chem. 2010, 82, 8402−8405.
(27) Jing, C.; Gu, Z.; Ying, Y.-L.; Li, D.-W.; Zhang, L.; Long, Y.-T. Anal. Chem. 2012, 84, 4284−4291.
(28) Schubert, E. F.; Gessmann, T.; Kim, J. K. Light Emitting Diodes; Wiley Online Library: New York, 2005.
(29) MacAdam, D. L. J. Opt. Soc. Am. 1942, 32, 247−274.
(30) Ramanath, R.; Snyder, W. E.; Yoo, Y.; Drew, M. S. IEEE Signal Proc. Mag. 2005, 22, 34−43.
(31) Jenison, R.; Yang, S.; Haeberli, A.; Polisky, B. Nat. Biotechnol. 2001, 19, 62−65.
(32) Ramanath, R.; Sander, W. A., III; Snyder, W. E.; Bilbro, G. L. J. Electron. Imaging 2002, 11, 306−315.
(33) Brown, M. S. IEEE International Conference on Image Processing, 2013. http://www.comp.nus.edu.sg/brown/ICIP2013_Brown.html.
(34) Wright, W. D. Trans. Opt. Soc. 1929, 30, 141.
(35) Guild, J. Philos. Trans. R. Soc. London, Ser. A 1932, 149−187.
(36) Fairman, H. S.; Brill, M. H.; Hemmendinger, H. Color Res. Appl. 1997, 22, 11−23.
(37) Čadík, M.; Wimmer, M.; Neumann, L.; Artusi, A. Comput. Graph. 2008, 32, 330−349.
(38) Sharma, G.; Bala, R. Digital Color Imaging Handbook; CRC Press: Boca Raton, FL, 2002.
(39) Qi, W. J.; Wu, D.; Ling, J.; Huang, C. Z. Chem. Commun. 2010, 46, 4893−4895.
(40) Mahy, M.; Eycken, L.; Oosterlinck, A. Color Res. Appl. 1994, 19, 105−121.
