Let me address the implications first. One really has to understand how the sample and the spectrometer act together to understand the recorded data. To recover chemistry-specific data from a heterogeneous sample, one must understand how the spectrometer interfaces with that type of sample. If there is a large enough market in which a certain type of sample is analyzed repeatedly (such as breast tissue or prostate tissue, which are both of interest in our laboratory), then perhaps you could design an instrument in which the scattering is known, minimized, or otherwise handled, so that chemical information is obtained very quickly. That's the driving force for this trend.
The evidence is that until recently, most imaging instruments, and even most point-mapping instruments, in IR spectroscopy were much the same and performed rather similar functions. You could load up your sample, and either the beam would raster across it or you would collect a two-dimensional image, and essentially the same performance and the same sort of data were obtained, with only slight differences between manufacturers. A few years ago we started seeing instruments designed for first responders and others in the homeland security area, most notably by Smiths Detection, and then people realized that perhaps even within the spectroscopy market there might be niche areas that could be addressed. For example, last year Bruker came out with a dedicated microscopy system.
There are specific segments in this market in which, if we focus, we can make an instrument that outperforms any other instrument, but it might do only a limited number of things. It's not a general-purpose instrument, but what it does, it does really well. This trend is also encouraged by other technologies coming on board. Our laboratory and others have used what we term discrete frequency imaging, as opposed to Fourier transform or continuous spectral imaging. Our laboratory developed narrow-band filters that can be manufactured easily and at fairly low cost.
We've also used quantum-cascade lasers. With quantum-cascade lasers or filters, you are not really measuring the entire spectrum. You could if you wanted to, but usually you are interested in just a few frequencies. This is quite the opposite of the Fourier transform model, in which you measure the whole spectrum whether you want it or not. When you start using the discrete frequency approach, the question arises of which frequencies should be examined, and that's where the intended use of the instrument really comes into play. Instead of a general-purpose spectral recording device, you might have a spectrometer with a very specific function, based on a set of filters or a quantum-cascade laser within a certain range, for example, and that might drive future applications. It's both the need and the technology evolving now that will make this trend possible.
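As a rough numerical illustration of the difference (the spectral axis, band positions, and data below are hypothetical stand-ins, not taken from any instrument discussed here), a discrete frequency measurement keeps only a handful of diagnostic bands instead of the full spectrum at every pixel:

```python
import numpy as np

# Hypothetical hyperspectral cube: (rows, cols, wavenumbers).
# In the Fourier transform model, every pixel carries the full spectrum.
rng = np.random.default_rng(0)
cube = rng.random((128, 128, 1600))
wavenumbers = np.linspace(900, 3800, 1600)   # cm^-1, assumed axis

# Discrete frequency imaging measures only a few bands of interest,
# e.g. amide I, amide II, and a CH-stretch band (illustrative choices).
bands_of_interest = [1650, 1550, 2920]       # cm^-1
band_idx = [np.argmin(np.abs(wavenumbers - b)) for b in bands_of_interest]

df_images = cube[:, :, band_idx]             # 3 images instead of 1600
print(df_images.shape)                       # (128, 128, 3)
```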
How has data interpretation in IR microscopy and imaging improved recently in terms of identifying relationships between structure and spectra?
That's a great question, because this is one of the most exciting areas in IR microscopy today. We have always known that the real part of the refractive index has a tremendous influence on the spectral data that are recorded; in fact, the first paper I published in this field demonstrated it. That was in 1998, and at the time we thought that yes, it does have an influence, we understand that, but there are more important things to be done: making better instruments and showing that IR imaging is useful in a variety of applications. About seven years ago, people started realizing again that scattering, and the transport of light through a sample more generally, influences the data that we acquire and actually limits us in applications. Since then, a lot of work has been done. The first attempts were model based: we would treat every pixel independently, as if there were a scattering center there and as if that were the only effect on the system. Then, about four years ago, we came out with a comprehensive theory for IR microscopy, which has since been published in a series of papers in Analytical Chemistry (2,3).
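To make the per-pixel, model-based idea concrete, here is a minimal sketch in the style of an extended multiplicative signal correction (EMSC); the reference spectrum and polynomial baseline are stand-ins, and this is not the comprehensive theory of references 2 and 3:

```python
import numpy as np

def emsc_correct(spectrum, reference, order=2):
    """Per-pixel, model-based correction (EMSC-style sketch).

    Fits spectrum ~ a * reference + polynomial baseline by least
    squares, removes the fitted baseline, and divides out the
    multiplicative factor. Illustrative only; each pixel is treated
    independently, as described in the interview.
    """
    n = spectrum.size
    x = np.linspace(-1, 1, n)
    # Design matrix: the reference spectrum plus polynomial terms.
    basis = np.column_stack([reference] + [x**k for k in range(order + 1)])
    coef, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
    baseline = basis[:, 1:] @ coef[1:]
    return (spectrum - baseline) / coef[0]
```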
So what we have been doing over the years is taking this theory and applying it to very specific cases. Now, for some objects that are well defined and whose properties we know, we can predict what kind of spectra would result if a particular instrument were used. The same framework was recently used to explain the limit of information in high-definition IR imaging. Early in 2013, an Applied Spectroscopy paper from my group laid out the theoretical background and simulations showing the image quality that we can potentially get with IR imaging (4). So it's not just an improvement in identifying the relationship between structure and spectra; it's also leading to better instrument design, which in turn allows us to better understand structure and spectra. This resonance between basic understanding, application, and instrument design is really one of the more exciting trends in the field now, and I hope it will continue to be for some time to come.
What other trends in IR imaging do you think will become important in the near future?
One of the important trends we are seeing is the emergence of many new components. Although we have already seen advances on the source side, with lasers, filters, better sources, and better interferometers, we perhaps have not yet seen the full potential of what can happen on the detection side. Certainly, hardware advances will, as always, continue to be important. One aspect that has not been quite as important in IR microscopy is theory and algorithms. I think this trend will really become apparent now that we have realized that the information we can extract from IR imaging data is limited unless we use some sort of correction. Nearly all forms of data recording will require some data processing that is based on the physics of the instrumentation.
What we are really doing here is moving from a passive data recording approach to more of a computed data recording paradigm, in which computation and hardware are merged synergistically. Applications, most certainly, are going to be a big trend. There have been many new ideas; in my laboratory, we have used IR imaging on food grains, forensic materials, and other samples. Biomedical applications will, as always, continue to remain strong.
The one trend that I have not yet seen much evidence for, but that is likely to emerge, is the idea of total information: not only spectral information, and not only thinking of pixels independently of one another, but asking how multiple pixels work with each other and how different structures in the image relate to the same chemistry or the same problem. We have not really seen analytical algorithms address the entire imaging data set. We have often seen chemometric algorithms focused on spectral data on a per-pixel basis, and we have seen images extracted at a specific wavenumber, for example, but looking at all dimensions of the information together is likely to be an area of interest in the future.
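As a toy illustration of that contrast (the data and the 3x3 neighborhood feature are hypothetical choices, not a published algorithm), conventional per-pixel chemometrics flattens the cube and discards spatial structure, whereas a total-information approach would let spatially related pixels share information:

```python
import numpy as np

rng = np.random.default_rng(1)
cube = rng.random((64, 64, 200))                  # rows x cols x wavenumbers

# (a) Per-pixel chemometrics: every spectrum is treated independently.
pixels = cube.reshape(-1, cube.shape[-1])         # (4096, 200)

# (b) Spatial-spectral features: append the local 3x3 mean spectrum,
# so neighboring pixels contribute to each pixel's representation.
padded = np.pad(cube, ((1, 1), (1, 1), (0, 0)), mode="edge")
neighborhood = sum(
    padded[i:i + 64, j:j + 64, :] for i in range(3) for j in range(3)
) / 9.0
features = np.concatenate([pixels, neighborhood.reshape(-1, 200)], axis=1)
print(features.shape)                             # (4096, 400)
```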
(1) R. Bhargava, Appl. Spectrosc. 66, 1091–1120 (2012).
(2) B.J. Davis, P.S. Carney, and R. Bhargava, Anal. Chem. 82, 3487–3499 (2010).
(3) B.J. Davis, P.S. Carney, and R. Bhargava, Anal. Chem. 82, 3474–3486 (2010).
(4) R.K. Reddy, M.J. Walsh, M.V. Schulmerich, P.S. Carney, and R. Bhargava, Appl. Spectrosc. 67, 93–105 (2013).