Despite being otherwise enamored of the intrigue of holography, Dr. Daniel Smalley says that, unfortunately, the scene in Avengers in which Tony Stark puts his gloved hands into holographic projections would not be possible; holography couldn’t allow light to travel through his hands and certainly couldn’t bend the light to go around them. Such imagery could, however, be created by 3D projections, and Dr. Smalley, an electrical and computer engineering professor at Brigham Young University, is happy to tell us how.
Listen in to learn more about holography and 3D projection technology, including its potential uses in medicine, aerospace surveillance, and telepresence. Visit his website at www.smalleyholography.org and the Brigham Young University ElectroHolography Research Group at https://holography.byu.edu.
Richard Jacobs: Hello. This is Richard Jacobs with the Future Tech Podcast. I have Daniel Smalley with me. He is a physicist at Brigham Young University in Provo, Utah. We’re going to be talking about 3D projections and 3D holograms, so this will be a pretty cool call. Well, Daniel, thank you for coming.
Dr. Daniel Smalley: Yeah, my pleasure.
Richard Jacobs: I guess as a lay person, I don’t really know much about holograms. What is a hologram versus a 3D projection? And then let’s talk about how you’re improving it.
Dr. Daniel Smalley: Well, I think that the biggest difference is the screen. When we imagine a 3D image from science fiction, we imagine something that’s projected out in the air. We don’t imagine having to look through glasses or having to look into a monitor to see that 3D image. And that is the primary difference between volumetric technologies, where, as with a water fountain, the imagery just pops out at you and you don’t have to be looking down into anything to see it, and the more traditional 3D where you have to be looking into a screen, like you look into a television. So that’s the biggest difference between what we’re doing and holography and other more traditional 3D approaches.
Richard Jacobs: So how do holograms work?
Dr. Daniel Smalley: Holography has a very specific scientific definition. Holography has to do with the creation of a wavefront of light as light passes through patterns of lines and diffracts, and by that definition you have to be looking at some diffractive surface for that hologram to deliver light to your eyes. You have to be looking at some pattern of lines. By contrast, a 3D projection in general isn’t creating images with light alone. Image points in a volumetric display are actually little pieces of matter floating around in the air. And it’s the combination of that little piece of matter as a scatterer and the light scattering off it that creates the image and gives it a lot of its special abilities.
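To make the “patterns of lines” point concrete: those lines act like a diffraction grating, and the standard grating equation d·sin(θ) = m·λ sets where the light ends up, which is why you must be looking at the diffractive surface itself to receive it. A minimal sketch follows; the wavelength and fringe spacing are illustrative assumptions, not values from the interview.

```python
import math

wavelength_nm = 532          # green laser light (assumed)
fringe_spacing_nm = 1000     # assumed 1 micron fringe spacing in the hologram

# First diffraction order (m = 1), normal incidence: d * sin(theta) = m * lambda
sin_theta = 1 * wavelength_nm / fringe_spacing_nm
print(f"First-order deflection angle: {math.degrees(math.asin(sin_theta)):.1f} degrees")
```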
Richard Jacobs: Okay. So where is the light coming from to create a hologram or a 3D projection, and where does the viewer perceive it as coming from, if it seems to come from nowhere?
Dr. Daniel Smalley: So in a hologram, the light is coming either from the back, transmitting through a film or a diffractive surface, or from the front, maybe from the top, reflecting off that surface and being modulated by those lines. In a projection display, or a volumetric display, the light can come from a couple of different places; it could be coming from just about anywhere. It can come remotely from some illumination source, it could even be a room light, or the little particles themselves, the little pieces of matter that make up the image points, could be luminous, emitting light themselves. So there are many different ways that light can be emitted from an image point in a volumetric or 3D projection display.
Richard Jacobs: Well, we’ve seen in the movies, like Tony Stark in The Avengers, that he puts on his gloves, puts his hands into these holographic projections, and it all just works. In real life, what is it like, or what could it be like, versus what we see in the movies?
Dr. Daniel Smalley: Seeing that scene was actually annoying to me. I had been singing the praises of holography for most of my adult life at that point, and when I saw that scene I realized it was something holography couldn’t do, in particular because light travels in straight lines through the air. So the light would have to either burrow through Tony Stark’s hand to get to his eyes or somehow turn around in midair, and that just wasn’t something I could imagine holography doing. I still believe that to be a fundamental limitation; having an image that wraps around the human body is a difficult problem to solve. But 3D projections can do it. If you can place light scatterers anywhere you want, there’s nothing stopping you from placing those scatterers above, around, and behind the arms of an individual, so that you could actually live with, and live in the same space as, your 3D display. In medicine, for example, you could imagine using a 3D projection to project a potential prosthetic design onto the stump of somebody who’s lost an arm, something like that. You could actually fit it to their stump. And in fact, we play little games in our lab where we’ll take little toys, like I’ve got a little Enterprise and a little Klingon battle cruiser, and we’ll set them in the display, and we’ve got a student who has the little toys shooting photon torpedoes back and forth at each other. So you can actually breathe life into these otherwise inanimate physical objects by augmenting them with this 3D projection technology.
Richard Jacobs: So where is it used by the common person? The only place I’ve ever seen holograms is maybe a sticker on a piece of merchandise that looks a little bit orange and green, like a cheap hologram, or when I go to a science center and see one. But are there any commercial applications or wide-scale uses of it?
Dr. Daniel Smalley: For screen-based 3D it’s harder to find commercial applications, and it’s because of the screen. For screen-based 3D to be useful, it needs to be really big, so that everywhere you’re looking, you’re more or less looking into that display. For this newer 3D projection technology there are a lot more use cases, especially for small displays. Right now our images are really small, thumbnail-sized, but we hope to get them soon to maybe eight inches in maximum dimension. And at that point you can imagine using it for aerospace surveillance, tracking satellites to make sure there are no collisions or conjunctions. You could imagine using it for medical imaging, looking at catheters as they pass through the vasculature of the body during an intervention, or even telepresence, having little disembodied heads pop up out of your phone or a watch, turn around, and talk to you just like a real person would. These are all possibilities as we scale this technology.
Richard Jacobs: So it sounds like it would have a big home in augmented reality applications.
Dr. Daniel Smalley: Yeah, I think so. And one of the big bonuses is that most 3D is really computationally intensive; that is to say, even a small holographic display can take a supercomputer to run. But with 3D projection technology, it really only requires red, green, blue and X, Y, Z for every image point in space. And so with just the amount of computing power that the video card in your laptop has, you could run a pretty decent telepresence app with maybe a million little particles running around drawing a face for you to talk to. So I think there’s a big advantage in the computational complexity being much lower.
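For a concrete sense of how light that payload is, here is a rough back-of-envelope sketch; the point count and refresh rate are assumptions chosen for illustration, not specs from the BYU system.

```python
import numpy as np

NUM_POINTS = 1_000_000   # hypothetical telepresence point cloud (roughly a face)
REFRESH_HZ = 30          # assumed update rate

# Each image point is just a color and a position: R, G, B + X, Y, Z
points = np.zeros(NUM_POINTS, dtype=[("rgb", np.float32, 3),
                                     ("xyz", np.float32, 3)])

bytes_per_frame = points.nbytes
print(f"Per-frame payload: {bytes_per_frame / 1e6:.1f} MB")
print(f"Data rate at {REFRESH_HZ} Hz: {bytes_per_frame * REFRESH_HZ / 1e9:.2f} GB/s")
```

Under these assumptions that is about 24 MB per frame, well within reach of an ordinary graphics card, whereas a holographic display has to compute a full diffraction pattern over a high-resolution surface rather than one small record per visible point.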
Richard Jacobs: So what would be a more realistic depiction of what’s possible? Star Wars, perhaps? Those images were grainy.
Dr. Daniel Smalley: Yeah. Well, and they did that on purpose. I think a lot of what happened in Star Wars was trying to tarnish the science fiction so that it looked quotidian, to make it look everyday by making it look a little dusty. In fact, for the Princess Leia projection, George Lucas created vertical scan lines on purpose so that you would know it was something different from television, which had horizontal scan lines. All of this was carefully done to create a certain response in the viewers. But I think we can expect 3D projection technology to be able to create all the images from science fiction that we’ve seen so far, including a few that we haven’t seen. So I definitely think we’ll see Princess Leia projections, I definitely think we’ll see an Avatar table, I think we’ll see that Iron Man display, and I think we’re going to see something brand new. I think we’re going to see displays that are customizable, so that we’ll all be looking into the same space but seeing something totally different, if that makes any sense. One of the very interesting things about 3D projection, at least the way we’re doing it in our lab, is that we can get light to scatter in one direction but not the other. So if you tried to project Princess Leia with isotropic scatterers, with fireflies for example, you’d be able to see the front of Princess Leia and the back of her at the same time. You’d be able to see both her hair buns from every direction all the time; she’d look like a ghost, so to speak. If you want her to look solid, you have to have the ability, for example, to take one particle that only scatters forward and use that to draw the front of Princess Leia, and take another particle that only scatters backward and draw the back of Princess Leia, and then you won’t have this problem; she’ll look solid. But you can use that very same technology to make everybody see something different. So you could imagine a family room where you’ve got a little kid doing her homework in the middle of the room, in some central image volume. Right next to her, mom is talking to grandpa in that same volume. And next to her, dad is playing a soccer game. Coupled with parametric speakers, they could each have these totally immersive, totally customized experiences, and yet at the same time not be wearing glasses or looking down at a screen, which would be pretty unique for an American family room. In a way, instead of abstracting people into the digital world, we’re taking data out, making it physical, and putting it in our world, making it a physical part of our space.
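As a toy illustration of that directional-scattering idea, the sketch below gives each image point a scatter direction and cone, and only “shows” the point to viewers who sit inside that cone; the function name, angles, and viewer positions are hypothetical, not the lab’s actual rendering code.

```python
import numpy as np

def visible_points(points_xyz, scatter_dirs, cone_half_angle_deg, viewer_pos):
    """Return a mask of points whose scatter cone covers the viewer."""
    to_viewer = viewer_pos - points_xyz
    to_viewer /= np.linalg.norm(to_viewer, axis=1, keepdims=True)
    cos_angle = np.sum(scatter_dirs * to_viewer, axis=1)
    return cos_angle >= np.cos(np.radians(cone_half_angle_deg))

# Two points at the same spot: one scatters toward +x (viewer A), one toward -x (viewer B)
pts = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
dirs = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
viewer_a = np.array([2.0, 0.0, 0.0])
viewer_b = np.array([-2.0, 0.0, 0.0])

print(visible_points(pts, dirs, 30.0, viewer_a))  # [ True False]
print(visible_points(pts, dirs, 30.0, viewer_b))  # [False  True]
```

Swapping the viewer position is all it takes to give two people in the same volume entirely different content, which is the customization described above.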
Richard Jacobs: Yeah, that’s cool. I like that a lot better. So how does 3D projection work? Can you reiterate and compare it to a hologram just once more? That would help me, and the listeners, get the concept.
Dr. Daniel Smalley: Yeah. So the way this works really sounds like science fiction from the very start. We begin with a laser tractor beam, and we use this laser tractor beam to pick up a little particle of paper and scan that little particle around the image space. And as we scan it, we illuminate it with red, green, and blue lasers, which means we can make it glow any color we want. So the combination of illuminating that particle and scanning it through the air gives us an effect not unlike what kids do with sparklers on the 4th of July, where they’ll write their name in the air if they move that sparkler fast enough. This is an effect called persistence of vision. So if we can get that little glowing particle moving around fast enough, we can make images, and if we go even faster, we can make those images move frame by frame and do little animations. We’ve started to do that; I’ve actually got a little stick man that jumps up and down on your finger, and other little animations.
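To put rough numbers on that persistence-of-vision requirement, here is a back-of-envelope sketch; the refresh rate, point count, and path length are illustrative assumptions rather than measured values from the display.

```python
POINTS_PER_IMAGE = 1_000   # hypothetical number of image points drawn per pass
REFRESH_HZ = 24            # assume the whole image must be redrawn ~24 times per second
PATH_LENGTH_M = 0.5        # assumed total path the particle traces per pass, in meters

points_per_second = POINTS_PER_IMAGE * REFRESH_HZ
dwell_time_us = 1e6 / points_per_second
avg_speed_m_s = PATH_LENGTH_M * REFRESH_HZ

print(f"Points drawn per second: {points_per_second:,}")
print(f"Dwell time per point:    {dwell_time_us:.1f} microseconds")
print(f"Average particle speed:  {avg_speed_m_s:.1f} m/s")
```

The sparkler analogy is the same math: if the particle retraces the image faster than the eye can resolve, the trail reads as a single continuous picture.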
Richard Jacobs: Are you using ambient dust particles, or where do the particles come from? Do they fit into a framework? Or are they moving around randomly until you can capture them, so you can use them as an image as long as there’s a certain density?
Dr. Daniel Smalley: We have a little cocktail of particle pieces that sit in a reservoir, and we have an automated technique for picking particles up out of this reservoir. The hope is that we can get good enough at automatically picking up these particles that even if you knocked one out, even if you stuck your hand in the image and knocked out the particle, the system would go down, pick up a new one fast enough that you’d never even know anything had changed or anything had happened. That way you could make it really robust. But we’ve got a little reservoir of particles; it picks up from the little reservoir and it draws an image.
Richard Jacobs: Well again, “it picks up a particle,” what does that mean? Does it magnetically move one around, or electrostatically lift it out of position and hold the particle in a certain spot? And would it be possible to use a natural phenomenon, the natural Brownian motion of dust in an area, to project the image onto, or a cloud or something like that?
Dr. Daniel Smalley: To answer your second question, there are certainly people out there who have worked with ambient dust. In fact, one of the earliest volumetric display concepts was put forth by Ken Perlin in a paper and a patent he called Holodust, where he says, let’s just take the dust bunnies, the dust motes floating in the air, and scan the room to find where they are. And once we’ve found them, we’re going to shoot visible laser beams at them to make them glow in the air. The problem being, of course, that you need a lot of dust in the air to make images that are nice and complete. The bonus with what we’re doing is that you don’t have to modify the environment at all. What we do is take a laser beam and focus it with a lens, but the lens is imperfect, and that imperfect focus has a hole in it. That hole is a low-intensity region; it’s a dark spot. And when a little particle gets in that dark spot, if it tries to escape, it gets warmed up by the bright areas around the dark spot, and that creates kind of a jet action that pushes it back into the center. So it’s basically stuck there; it’s trapped. And then once it’s in the trap, you can move the laser beam around, you can move that hole around, and the particle is obligated to be dragged around with it.
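As a cartoon of that trapping behavior, the sketch below models a one-dimensional “dark spot” whose bright rim pushes a straying particle back toward the center. It is a qualitative illustration only, with an invented restoring-force shape, not a physical model of the real photophoretic forces.

```python
import math

def restoring_push(x, trap_radius=1.0, strength=5.0):
    """Push pointing back toward x = 0, growing as the particle nears the bright rim."""
    if abs(x) >= trap_radius:
        return 0.0                       # escaped the trap: no restoring push
    return -strength * math.sin(math.pi * x / trap_radius)

# Simple explicit integration of an over-damped particle starting off-center
x, dt = 0.4, 0.01
for _ in range(300):
    x += restoring_push(x) * dt
print(f"Final position after settling: {x:.4f}")   # approaches 0: pulled back to the dark center
```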
Richard Jacobs: So you’re using essentially like a cage of laser beams to keep particles in a certain area?
Dr. Daniel Smalley: Yep, that’s right. We call them bottle beams.
Richard Jacobs: I guess it’s like when you go to a science museum and there’s a ping pong ball sitting in a column of air; it stays in the column of air because the pressure there is lower than outside the column.
Dr. Daniel Smalley: Yeah, that’s right. That’s a good analogy.
Richard Jacobs: Wow. So, all right. Since you have a reservoir, can you say what the particle in the reservoir is?
Dr. Daniel Smalley: Yeah, cellulose, cellulose or paper. It’s a little carbonaceous particle.
Richard Jacobs: So there’s no danger of someone, like, inhaling it?
Dr. Daniel Smalley: Oh, they could, I mean, they could absolutely inhale it, but it would just be one of the thousands or millions of particles that we inhale at regular intervals. The average American household produces 40 pounds of this dust every year, so we’re breathing this stuff all the time; it wouldn’t have any effect on the total amount of dust we’re breathing in. In fact, as you can imagine, you could configure your display to be like a sponge that actually pulls dust out of the air when it’s not doing other things. It would actually make the air a little cleaner than it was otherwise. So yeah, we don’t consider there to necessarily be a health issue.
Richard Jacobs: That’s really interesting. What about other applications besides projection? I don’t know, could you mount a column of this above a doorway and have it trap particles, push them down a light gradient somehow, and take them out of the ambient air? Could it be used to clean a space?
Dr. Daniel Smalley: Sure, I’m sure it could. And while it may not be very effective on Earth, where there are other more efficient ways of doing it, perhaps it could be useful in other places, like space. The problem with space, though, is that there’s no air, and you’ve got to have air for this to work. But there might be places where you just can’t be blowing the air around or something like that. Maybe you’re dealing with something dangerous; let’s say you’re trying to move malaria samples or something else from one bottle to another in teeny tiny droplets and you don’t want anybody to touch them. You could use this to transport little droplets from one place to another without anybody ever having to touch them. So there’s a lot of nondestructive testing, and levitation is used broadly, and has been for decades, in lots of different fields. So it does have pretty broad applications outside of display.
Richard Jacobs: Yeah, I don’t want to go too far afield. So what size particle range does this work for? Do you have to tune the system?
Dr. Daniel Smalley: Particles can be trapped from nanometers up to 50 or 100 microns, which is actually really big. We tend to use particles in the 10 micron range, and then of course it just depends on what we’re trying to do. If we want nice directional scattering, we’ll use a bigger particle; if we want the light to go everywhere, if we just want it to glow everywhere, we’ll use a smaller particle.
Richard Jacobs: That’s what I was going to ask you: how do you modulate resolution? In any given field of view, if you have areas of smaller particles and larger particles, I’m sure you could create all kinds of different effects. You could create, say, the body of an object and maybe a diffuse area around it, if you want that effect, by having traps for different sized particles.
Dr. Daniel Smalley: Yeah, definitely, having lots of traps enables us to do all sorts of potentially interesting things. I think the biggest thing that trapping multiple particles enables us to do is make the display bigger. Right now, or at least until not long ago, what we were doing was just taking a single particle and moving it through a complicated path to make our images. But that’s not very scalable as an approach. So one of the things we can do instead is have lots of traps holding lots of particles, and instead of moving each one through a complicated path, we can move it through a simple path, maybe just up and down, for example. So you can imagine a plane of particles that just moves up and down and gets illuminated as it goes, and then you could imagine making much bigger images that way. That’s how we hope to get to eight inches, and beyond, because inches are not the end here. We have sketched up designs that we hope will enable us to get to meter-sized displays so we can do soldier decoys and that sort of thing.
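A minimal sketch of that plane-sweep idea, assuming a hypothetical trap grid and an invented target volume: the particle plane steps through z, and the RGB illumination fires only at heights where the 3D model has content.

```python
import numpy as np

NX, NY, NZ = 64, 64, 32                 # hypothetical trap grid and number of z-steps
volume_rgb = np.zeros((NZ, NX, NY, 3))  # target image: RGB per voxel
volume_rgb[NZ // 2, 20:40, 20:40] = [0.0, 1.0, 0.0]   # a green square mid-volume

def sweep_once(volume_rgb):
    """Yield, for each z position of the particle plane, the illumination frame."""
    for z in range(volume_rgb.shape[0]):
        frame = volume_rgb[z]            # what the RGB lasers should paint at this height
        lit = np.count_nonzero(frame.sum(axis=-1))
        yield z, frame, lit

for z, frame, lit in sweep_once(volume_rgb):
    if lit:
        print(f"z-step {z}: illuminate {lit} particles")
```

Because every particle follows the same simple up-and-down motion, adding more traps grows the image without making any single particle’s path more complicated.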
Richard Jacobs: Oh wow. Those are some of the anticipated future applications, so that would be one. Any others that are really interesting to you?
Dr. Daniel Smalley: So in addition to medical visualization and aerospace surveillance, and definitely things like telepresence, there are a lot of people who have approached me who seem to be really interested in AI agents, in making AI corporeal: taking Alexa or Siri and giving them a little body, enabling them to interact with us face to face. You could imagine, for example, something like a holo-nurse that could be projected around, that could follow a loved one or a senior citizen around the house and help them with medication compliance, make sure they’re taking the right stuff at the right time and point to the label, do medical education, or even point out fall dangers as they’re walking around during the day. They’d have a little friend. The funny thing is that if it’s customizable, it could be a little friend that only you see. It could exacerbate this thing where you’re walking down the street today and everybody seems schizophrenic because they’re on their Bluetooth; it could get worse, with everybody seeing something that only they can see and talking to it.
Richard Jacobs: Yeah, I imagine. What if you’re in a bank, or let’s say a nightclub, and at the security level they see certain things that no one else can see, at the customer level they see other stuff, and the boss sees other stuff still? Same environment, but they’d literally see different things, which is pretty cool.
Dr. Daniel Smalley: Yeah. According to your security clearance, your proclivities, or the language you speak, all these things could be customizable, and that could make for some very interesting scenarios.
Richard Jacobs: You’d have redacted reality. Well, very good. What’s the best way for folks to see some of this stuff?
Dr. Daniel Smalley: Yeah, so we have a website, smalleyholography.org, and BYU has a bunch of resources online through their university pages, and we’re always making things available through publications. So we’re happy to point people to the patents and papers that our group is publishing.
Richard Jacobs: Excellent. Well Daniel, thank you so much for coming on. I appreciate it.
Dr. Daniel Smalley: Yep. Thank you.