Smartphones as Blood Analyzers and Allergen Testers
Aydogan Ozcan’s phone can analyze blood, tell whether a cookie contains peanuts, and watch sperm cells dance
Steven Cherry: Hi, this is Steven Cherry for IEEE Spectrum’s Techwise Conversations.
I’ll tell you the name of the journal article, but I don’t think it’ll convey just how cool this is. Okay. It’s “A Personalized Food Allergen Testing Platform on a Cellphone.” Was I right?
Here’s what Aydogan Ozcan and five other researchers have done. They’ve created an attachment, which they call an iTube, for a smartphone and its camera. The camera reads the light transmitted through a chemical assay performed in another part of the iTube. The assay was capable, in their testing, of detecting the presence or absence of peanuts in store-bought cookies. The research is being published in Lab on a Chip, a journal put out by chemical sciences publisher RSC Publishing. Peanuts were chosen for the testing because some people are highly allergic to them.
Aydogan Ozcan is an associate professor in the UCLA department of electrical engineering and heads his own research group there. He’s a Fellow of the SPIE, which is the leading international society for optics and photonics. In addition to his official UCLA bio, he has one at National Geographic as an “Emerging Explorer,” and he’s my guest today by phone.
Aydogan, welcome to the podcast.
Aydogan Ozcan: Hi. Thanks for having me.
Steven Cherry: Tell us more about the iTube. It weighs only 40 grams. That’s less than 2 ounces. What does it look like, and how do you hook it up to the phone?
Aydogan Ozcan: So it is a very lightweight attachment that has some optics in it and attaches to the back of your camera phone. And the idea is that there’s a smart application on the phone which is specifically designed to digitally process the images captured through this optical attachment at the back of the cellphone. This is actually an extension of our work to use the cellphone itself as a lab in which you can conduct various different tests. In this recent work, we used it to quantify the contaminants, the allergen concentration, in different kinds of food, and in this specific study we focused on peanut. The idea is that the user would quickly grind the food they suspect of potential peanut contamination, and then there’s a chemical assay that will take, roughly speaking, around 20 minutes.
Steven Cherry: So basically you have to grind up the cookie? And do you put some substance in it as well and then wait?
Aydogan Ozcan: Yes. So there is a chemical assay, and there’s a protocol, which you can also access through the cellphone’s application, that instructs you how to prepare it. Essentially, there are some liquids that you mix with the food after some grinding, and then some incubation and washing steps to start the chemical assay, which will change color based on the concentration. There is also a separate control tube, which doesn’t have any food in it. The phone then differentially analyzes the transmission of light through these tubes. The light is generated by a light-emitting diode, the same thing we have at the back of our BlackBerrys, which blinks whenever we have a new email. That transmission changes if there is an allergen. By differentially comparing the transmission of the test tube against the control tube, you can understand, quite accurately actually, the concentration of the allergen.
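The differential reading Ozcan describes amounts to comparing how much LED light the test tube transmits versus the control tube. Here is a minimal sketch of that arithmetic in Python; the function names and the calibration slope are hypothetical illustrations, not values from the study.

```python
import math

def relative_absorbance(i_test, i_control):
    """Beer-Lambert-style differential reading: absorbance of the
    test tube relative to the food-free control tube."""
    return -math.log10(i_test / i_control)

def concentration_ppm(absorbance, slope, intercept=0.0):
    """Map absorbance to allergen concentration via a linear
    calibration curve (slope and intercept are hypothetical values
    that would come from lab calibration)."""
    return (absorbance - intercept) / slope

# Example: the test tube transmits half as much light as the control.
a = relative_absorbance(50.0, 100.0)   # about 0.301
c = concentration_ppm(a, slope=0.03)   # hypothetical calibration
```

In practice, the phone application would perform this comparison on pixel intensities from the two tubes in the captured image.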
Steven Cherry: So people can go into anaphylactic shock from just a tiny amount of peanut butter. Does the iTube detect down to those tiny amounts?
Aydogan Ozcan: Yes. It’s actually down to one part per million, which is pretty good in terms of sensitivity.
Steven Cherry: It certainly is. How about other substances? You can go into anaphylaxis from shellfish, eggs, even chickpeas. Lots of things . . .
Aydogan Ozcan: Yeah, I mean the idea here is that it could be a panel of things that you can look at. Here we demonstrated it for peanuts, but potentially you can go for other types of nuts, or even eggs or other potential allergens. It all depends on the chemistry that is employed; some assays are less sensitive, some are more sensitive. We chose peanut because allergy to it is extremely widespread, especially among young children. But essentially it’s a platform where we can test different kinds of allergens.
Steven Cherry: And would each substance that you’re looking for require a different liquid to be added to them?
Aydogan Ozcan: Absolutely, yes. There will be different test kits for different allergens. The idea is that each would take you roughly the same amount of time to prepare, but most of that time is not active work; it’s waiting for the chemistry to run. So in that regard, you probably spend around five minutes on preparation, then wait around 10 minutes or so depending on the assay, and then you read the result within a couple of seconds using the cellphone.
Steven Cherry: How phone-specific is it? What phone did you use? And would it have to be built differently for different-size and -shaped phones?
Aydogan Ozcan: So that’s a great question. We like Android phones because it’s much easier to program them, much easier to integrate separate attachments, and much easier to tap into their batteries for powering up the iTube components that you attach. So we really like Android phones, and this was a Samsung Android phone. Not the best of the best, but one of the recent versions of Samsung smartphones. Potentially you can do it on all kinds of smartphones. What would change from one Android phone to another is not the application but mostly the mechanical attachment to the phone, because of the dimensions and the location of the camera. So it could work, essentially, with any smartphone. We haven’t implemented it on iPhones because we find iPhones quite difficult to work with.
Steven Cherry: So right now it’s a lab device. How far do you think this would be from being a commercially manufacturable attachment for, even limiting yourself to, say, one manufacturer of Android phones?
Aydogan Ozcan: So there’s a start-up company based in Los Angeles, very close to UCLA, of which I am also a co-founder. Over the last year and a half, this company has licensed more than 20 different intellectual property applications from my lab at UCLA, and this is one of them. So we anticipate, depending on the business model and how profitable it is, this could easily go to market within one and a half to two years from now.
Steven Cherry: I wanted to get to one of those other devices. I understand that cost and practicality concern you a lot. In that National Geographic biography I mentioned, you point out that as a researcher you’re not that interested in making, sort of, yet another sophisticated $100 000 blood-count analyzer or making a really good one even better, as much as you would rather make a simple blood-count analyzer available to many people, even if it’s not as good, if it’s a lot cheaper. You’ve actually made a blood-count analyzer, is that right?
Aydogan Ozcan: Yes. Yes, various different kinds. So, indeed, simplification of advanced microscopy, microanalysis, and diagnostic tools is a huge interest in my own research, for two reasons. First, there’s a need for it. Obviously, especially for global health and for resource-poor countries, there’s a huge need for cost-effective, compact, field-portable microscopes, diagnostic tools, microanalysis devices, cytometers, and blood analyzers. There’s a huge need for that.
And the second reason that I’m very much interested in this direction is that it’s timely. Intellectually, I think there is a huge depth to it, and that is mostly coming from competition, because today’s components, in terms of optical hardware, are quite advanced. Another way of saying the same thing is that the cameras in our cellphones, the cameras that take pictures, are quite advanced compared to a decade ago. They have more megapixels, and they have very nice pixels that are quite sensitive to light at a very small size. And they can really be used to simplify imaging, microscopy, and related microanalysis and diagnostic applications.
In our cellphones today there is massive volume; literally, we have 6 billion cellphone subscribers today. Subscribers. And that massive volume is driving the cost so low for such an advanced technology as the cellphone. For scientists, that is a great gold mine, because you can tap into better components in terms of their optics and their performance. Let me open up a parenthesis here and talk about, for instance, the design of a conventional lens-based microscope. It hasn’t changed in the last century or so, and it’s mostly made out of analog components, so you can actually think of a microscope as an analog computer. But today’s world, I think, is highly digital, and the cost of high-quality computation is extremely low. Therefore, I think it’s the right time to think about how we can fundamentally change the design, the operating principles, and the use of microscopes, and make them more digital than ever before.
Steven Cherry: I wanted to ask you about the microscope. As I understand it, it adds an LED to the side of the camera on a phone, and then the light is scattered, and this creates a hologram. And then the system captures, basically, a substance’s holographic signature, and then some digital signal processing reconstructs the microscopic image. Do I have that right?
Aydogan Ozcan: Yep, exactly. So essentially you can think of it as a shadow imager. The holograms that you mentioned are no different from a typical shadow. What is different is this: when you’re walking outside on a sunny day, of course you cast your shadow on the street, but the illumination coming from the sun, and our body’s interaction with that light, doesn’t permit a holographic shadow, because we’re opaque and the light is not coherent enough to create fringes in our shadow. But when you fine-tune the illumination for the microworld, you can tune the properties of your sun, so to speak. By using an LED in a special configuration, you can make it work almost like a laser for a single cell or a group of cells. And because cells are so small, on the order of a few microns, and they’re transparent to light, light can penetrate through their bodies, unlike our own, and each cell casts a unique shadow, which is textured and contains these fringes, which are holograms.
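The holographic shadow Ozcan describes can be simulated numerically. The sketch below, assuming NumPy, uses the standard angular spectrum method to propagate an optical field between the object and sensor planes; back-propagating with a negative distance is the essence of the digital reconstruction step he mentions. The toy "cell" and all parameter values here are illustrative, and the real pipeline involves phase recovery and other refinements not shown.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field a distance z (same units as
    the wavelength and the pixel pitch dx) with the angular spectrum
    method. A negative z back-propagates a hologram toward the object."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)              # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z)                   # propagation transfer function
    H[arg < 0] = 0.0                          # discard evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy "cell": a weakly absorbing 5-micron disk lit by a plane wave.
n, dx, wavelength = 128, 1.0, 0.5             # 1-micron pixels, 0.5-micron light
y, x = np.mgrid[:n, :n] - n // 2
cell = 1.0 - 0.5 * (x**2 + y**2 < 5**2)      # transmission mask
hologram_plane = angular_spectrum_propagate(cell.astype(complex),
                                            wavelength, dx, z=200.0)
recovered = angular_spectrum_propagate(hologram_plane,
                                       wavelength, dx, z=-200.0)
```

Back-propagating the full complex field recovers the object exactly here; in an actual lensfree setup the sensor records only the hologram's intensity, which is why phase-retrieval algorithms are part of the real reconstruction.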
Steven Cherry: It sounds tremendously useful, and not just for the Third World. It strikes me that, you know, even in the First World, there are tests that don’t get done because a hospital can’t afford the equipment, or because the test costs too much, or because we’re spending so much on expensive tests that we don’t do other things as well. So this isn’t just a Third World issue, it seems.
Aydogan Ozcan: Absolutely. For the Third World, I think it is the bread and butter; it’s so fundamental. But for the First World, I think it will lubricate the system to be more efficient and cost-effective. It could enable a point-of-care office to function almost like a major hospital in terms of its lab capabilities. So that’s essentially what it enables for health care, in either the First World or the Third World.
But in terms of technology, computational imaging of the kind I described also has some unique angles: extreme throughput, extreme speed, extreme imaging volume. And it can enable us to shed light onto, literally, biophysical processes that have never been observed as clearly as these techniques allow.
One recent example, where it really answered a very interesting question, was imaging of human sperm using computational microscopes based, again, on holographic shadows. Last fall we published a major result in which we reconstructed, using holographic shadows on a chip, the three-dimensional trajectories of human sperm at extreme throughput, meaning that we were looking at literally a few thousand human sperm all in parallel across extremely large volumes at the microscale. And we were tracking the sperm head in three dimensions with submicron accuracy. This high throughput, this large statistical sample, actually provided some very interesting insights into how human sperm move. And we have recorded very nice movies of some of their rare motion, in the form of, for instance, a three-dimensional helix.
Apparently some human sperm, maybe one out of every 25 or so, travel along a three-dimensional helix, and a very tight one. Imagine a human sperm head moving in a helix with a radius of a few microns, like 2 or 3 microns, while rotating on the order of 10 rotations per second. So it’s a very tight helix that these sperm are tracing.
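The numbers Ozcan quotes pin down how fast the head sweeps around the helix axis. As a back-of-the-envelope check (taking the midpoint of the quoted radius range, an assumption for illustration):

```python
import math

helix_radius_um = 2.5     # midpoint of the quoted 2-to-3-micron radius
rotation_rate_hz = 10.0   # "on the order of 10 rotations per second"

# Tangential speed of the sperm head around the helix axis:
rotational_speed_um_per_s = 2.0 * math.pi * helix_radius_um * rotation_rate_hz
print(round(rotational_speed_um_per_s))   # prints 157
```

So the rotational component of the head's motion alone is on the order of 150 microns per second, dozens of body lengths every second, which is why submicron, high-frame-rate 3-D tracking is needed to resolve it.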
Steven Cherry: That does sound pretty amazing. You know, Aydogan, if you can bring down the cost of some medical equipment by two or even one order of magnitude, it would be a magnificent achievement. On behalf of everyone, thanks for your work. And on behalf of our listeners, thanks for joining us today.
Aydogan Ozcan: Well, thank you. Thank you for having me.
Steven Cherry: We’ve been speaking with UCLA researcher Aydogan Ozcan about using the cameras and smarts of smartphones to develop low-cost medical diagnostic tools.
For IEEE Spectrum’s Techwise Conversations, I’m Steven Cherry.
NOTE: Transcripts are created for the convenience of our readers and listeners and may not perfectly match their associated interviews and narratives. The authoritative record of IEEE Spectrum’s audio programming is the audio version.