Seamus Blackley was one of the renegades at Microsoft who created the original Xbox game console, launched in 2001. (Somebody I know wrote a book about that). He went on to a long career in games and as an agent at Creative Artists Agency.

More recently, he cofounded a mobile game company called Innovative Leisure, to release games made by former Atari classic arcade game designers. That company didn’t quite make it, though Blackley still has some games that could potentially be launched at some point in the future.

Then Blackley returned to his original career: doing research related to high-energy physics. He met with Brian Mullins, CEO of augmented reality firm Daqri. Now Daqri is acquiring Blackley’s research lab and assigning him to work on commercialization of an interesting new 3D printing technology dubbed “software defined light.” It could enable instant 3D printing, as you can see in the video below.

I talked with Blackley about his new job, and how his quest for augmented reality and instant 3D printing could one day lead him back to gaming.

Here’s an edited transcript of our interview.

Above: Seamus Blackley revives the Atari game design band.

GamesBeat: How did this new job come about?

Seamus Blackley: I had a contact with a Hollywood guy who wanted some help doing physics work. I ended up starting a physics lab, and it’s sort of like the mob in the Godfather movies. Once you’re a physicist, it always drags you back. I set up a rapid prototyping lab, primarily focused on physics, in Pasadena, in the space where I was restoring all my arcade games. We ended up having staff and doing rapid prototyping projects.

Through this intern we brought in from Caltech, I met someone who turned out to be one of the first investors in Daqri. He introduced me to the CEO of Daqri, Brian Mullins, who I really took a shine to. I consider Brian one of my best friends today. I thought Brian was looking for a job at the time, but it turned out that he was interested in what we were capable of doing at my lab. We were taking very difficult, very abstract, very technical problems and converting them into working prototypes and devices that exploited new ideas in physics to accomplish real things. It was at an intersection between mathematics, high-speed programming, and reality.

The company we were doing our work for at the time started facing some problems with financing. They suggested we look for other work to keep doing what we wanted to do. Brian showed me some of the stuff they were doing at Daqri. He invited me to his place in downtown L.A., and I was not prepared for what I saw down there. I wasn’t prepared to see this big company with cool Star Trek-looking offices and tons of augmented reality stuff.

Brian’s real interest is that he has this large collection of analog holograms, the kind you make with real objects. He’s always been fascinated by that, and I have too. When we were talking about Xbox, back in 2001 or 2002, you might remember that at some point you asked me what the next step in games would be, and I think you were expecting me to talk about HD resolution or something. I said the next big thing was holographic rendering. I’m always going to talk up that. And here I met this guy with funding who said that what he really wanted to do was holographic rendering. That’s where he believes AR is going. The company is set up around that.

https://www.youtube.com/watch?v=hqOdqq778AU

They had acquired a company in the U.K. called Two Trees that had been making a holographic device for use in automotive heads-up displays. If you use a holographic device as opposed to a video device for a HUD, you can render things at different distances. You can render stuff on the dashboard near to the driver, and then render navigation commands over where the driver is focusing toward the road. It causes much less eyestrain, and because it operates using lasers, you can compensate for the optical power needed to overcome the sun. You can render in broad daylight.

Two Trees had been working on this a long time before they were acquired. They were acquired because heads-up displays (HUDs) are ground zero for AR. It’s AR performing a meaningful service for the driver. It’s a very directed thing. It’s a place you can be introduced to AR, become dependent on it, and understand its value. One of Brian’s key missions with Daqri is that AR will help humanity as it solves problems for people. That’s why Daqri concentrates entirely on professional, commercial customers for its AR devices. They solve a problem. You’re working at a petroleum plant and there are 650,000 valves. You can put the smart helmet on and tell which one is which. Everyone is safer and operations are smoother.

The HUD fit perfectly with that idea, so Daqri acquired Two Trees, and along with it acquired this holographic technology that was being used for automotive HUDs. I learned about that from Brian, and his dream of holography, and it became apparent to us — as well as Brian and the principal at Two Trees, whose name is Jamie Christmas, a brilliant guy — that there was much more potential for holography in this device than was being exploited with just the HUD. They had thought of a few things, so we discussed them. Amongst them was using holography to do instantaneous 3D printing. You probably by now have seen the prototype we built here in Pasadena doing that, in the MIT Tech Review.

I looked at the system and got a feel for the hardware. I attacked the problem from the standpoint of being a high-energy physicist and having done a lot of field theory. If you recall, I wrote a flight simulator called Flight Unlimited. Flight Unlimited was able to do aerobatic flight simulation because I wrote a different approximation to the Navier-Stokes equations from the ones that typical flight simulators use. My approximation was written with computability in mind relative to state of the art hardware at the time, so it could run in real time and simulate all sorts of nonlinear stuff for aerobatics and tumbling that nobody else had been doing.

People said, “Oh, you pulled off the impossible,” and that’s not really true. It’s just that because I was a physicist, I could look at the differential equations and write an approximation that was computable, as opposed to using the normal approximations that everyone is taught. When people use a normal set of approximations and there’s a status quo, you start to think those are the equations, and they’re not. They’re approximations of the equations. People have been using those for flight simulation since the ‘50s.
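To make that concrete, here is a minimal sketch of the general trade Blackley describes: a lumped lift-and-drag force model stepped once per frame with cheap explicit Euler integration instead of a full fluid solve. The aircraft parameters are made up and this is a generic textbook illustration, not the approximation actually used in Flight Unlimited.

```python
import numpy as np

# Generic real-time aerodynamic step: lumped lift/drag forces integrated once
# per frame, rather than solving the full Navier-Stokes equations.
# Illustrative parameters only; not the Flight Unlimited approximation.
RHO = 1.225          # air density, kg/m^3
WING_AREA = 16.0     # m^2
MASS = 1000.0        # kg
GRAVITY = np.array([0.0, 0.0, -9.81])

def step(pos, vel, alpha, dt=1.0 / 60.0):
    """Advance position and velocity by one 60 Hz frame."""
    speed = np.linalg.norm(vel)
    if speed < 1e-6:
        return pos + vel * dt, vel + GRAVITY * dt
    q = 0.5 * RHO * speed ** 2                              # dynamic pressure
    cl = 2.0 * np.pi * alpha                                # thin-airfoil lift coefficient
    cd = 0.02 + cl ** 2 / (np.pi * 8.0)                     # parabolic drag polar
    drag = -q * WING_AREA * cd * (vel / speed)              # drag opposes velocity
    lift = q * WING_AREA * cl * np.array([0.0, 0.0, 1.0])   # crude: lift straight up
    accel = (lift + drag) / MASS + GRAVITY
    vel = vel + accel * dt                                  # explicit Euler, cheap per frame
    return pos + vel * dt, vel

pos, vel = np.zeros(3), np.array([60.0, 0.0, 0.0])   # 60 m/s level flight
for _ in range(600):                                 # ten seconds at 60 Hz
    pos, vel = step(pos, vel, alpha=0.08)
print(pos, vel)
```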

Above: Lab for software defined light.

Image Credit: Daqri

Fast forward, I’m looking at this holographic device and thinking of Maxwell’s equations, wave solutions to Maxwell’s equations for light. Light is electromagnetic radiation, so it follows the equations of James Maxwell that he assembled in the 1800s. Maxwell has this great moment when he was first assembling these equations, when he’s sitting there looking at them and trying to make sure they’re correct by solving all these electricity and magnetism problems that people had done using separate sets of equations in the past, making sure he gets the right answer.

He noticed there was a wave solution to this set of equations, and he saw that in the wave solution there’s a term for the speed of the wave as it propagates. It’s built from these two constants that were measured by rubbing a glass rod with cat hair and connecting batteries to cloth-covered wires near pieces of iron on scales. One over the square root of their product turns out to be exactly the speed of light. He was sitting there looking at the proof that light is electromagnetic radiation. He knew that when no other people could know it. Others had guessed it, but he had shown it.
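The arithmetic is easy to check. Plugging in today’s standard values for the vacuum permittivity and permeability (the two constants measured in those tabletop experiments), the wave speed that falls out of Maxwell’s equations is the speed of light. The values below are the usual CODATA constants, nothing specific to this story:

```python
import math

# Vacuum permittivity (F/m) and permeability (H/m), standard CODATA values.
EPSILON_0 = 8.8541878128e-12
MU_0 = 1.25663706212e-6

# Maxwell's wave solution propagates at c = 1 / sqrt(epsilon_0 * mu_0).
c = 1.0 / math.sqrt(EPSILON_0 * MU_0)
print(f"c = {c:.6e} m/s")   # ~2.998e8 m/s, the measured speed of light
```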

I looked at the problem of holography fundamentally as a problem of solving Maxwell’s equations. I started to write down, here in my lab, solutions and approximations for those equations, for the set of Maxwell’s equations that holograms represent. We came up with alternate approaches that were really computable using high-end graphics hardware, for computing holograms in real time. The purpose of this was so we could generate objects made out of light and project those into tanks of monomer, in order to solidify it and do instant 3D printing.

GB: Tanks of what?

Blackley: A monomer is a resin that can be polymerized in some way so it becomes a solid. In an epoxy, for instance, you mix two liquids and they become a solid. In 3D printing that uses lasers, the laser solidifies this fluid wherever it touches. Our method uses a fluid similar to that. We have chemists here now working on this all the time. Our fluid is a light-sensitive monomer that polymerizes when you have a certain threshold of photons hitting a specific molecule. We can project this structure made out of light into the tank of monomer and it will solidify wherever there are hologram photons. It won’t solidify elsewhere. If I make a hologram in the shape of a paper clip, you can pull out a paper clip right away. We can do a bunch of other objects as well.
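A toy way to picture that threshold behavior is a voxel grid that solidifies wherever the projected light dose crosses a cutoff. The dose field and threshold below are made up for illustration; they are not Daqri’s chemistry or geometry.

```python
import numpy as np

# Toy threshold model: the monomer solidifies only where the projected light
# dose exceeds a cutoff. Illustrative values, not real photochemistry.
N = 64
z, y, x = np.mgrid[:N, :N, :N]

# Pretend the hologram deposits a bright ball of light in the middle of the tank.
dose = np.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2 + (z - N / 2) ** 2) / (2 * 8.0 ** 2))

POLYMERIZATION_THRESHOLD = 0.5      # hypothetical photon-dose cutoff

solid = dose >= POLYMERIZATION_THRESHOLD   # boolean voxel grid of the printed part
print(f"{solid.sum()} of {solid.size} voxels solidified")
```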

It works, and it’s really neat. You’ve made an actual object out of light, using the principle of holography as computed by this approximation strategy we have, using consumer graphics hardware. We started doing that, and it became immediately clear, to me at least, that the potential for this was enormous. There’s no reason that we can’t push this all the way to large-scale displays that can be fully interactive and holographic. Groups of people can look at a phone display or a television display, and they’ll all see the same 3D scene from different perspectives as if they were looking at real objects. All the math we’ve done, all the simulation, all the engineering work shows that this is the case. We can do it at a small scale already.
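Blackley doesn’t spell out his lab’s approximation, but the textbook starting point for computer-generated holography gives a feel for the computation: sum the complex field that a set of object points would produce at each pixel of a phase modulator, then keep the phase. The NumPy sketch below is that standard point-source method with arbitrary wavelength and geometry; it is the kind of per-pixel sum that graphics hardware is good at accelerating, not Daqri’s actual algorithm.

```python
import numpy as np

# Minimal point-source computer-generated hologram (CGH).
# Wavelength, pixel pitch, and geometry are arbitrary example values.
WAVELENGTH = 532e-9    # green laser, meters
PITCH = 8e-6           # modulator pixel pitch, meters
N = 512                # modulator is N x N pixels

# Object: a few 3D points (x, y, z) in meters; z is distance from the modulator.
points = np.array([
    [0.0,     0.0,    0.10],
    [0.001,   0.0005, 0.12],
    [-0.0008, 0.001,  0.09],
])

# Coordinates of every modulator pixel.
coords = (np.arange(N) - N / 2) * PITCH
X, Y = np.meshgrid(coords, coords)

# Sum the complex field contributed by each point source at the modulator plane.
k = 2 * np.pi / WAVELENGTH
field = np.zeros((N, N), dtype=np.complex128)
for px, py, pz in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += np.exp(1j * k * r) / r

# A phase-only hologram keeps just the phase of the summed field.
hologram_phase = np.angle(field)
print(hologram_phase.shape)
```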

Above: Daqri is also working on augmented reality heads-up displays for cars.

Image Credit: Daqri

We’re continuously building toward larger devices and more applications of the technology, which we’re calling software-defined light. “Hologram” has become a completely maligned word. It’s lost any meaning. The idea of a hologram now essentially means Pepper’s ghost, a video image reflected off a glass in front of you so it appears to float in the world. These are different. They’re actual things you can see. When you move around the perspective changes, they occlude themselves, and do all the things that holograms do. Because they’re real objects made out of light, you can use them to print, because it’s a real thing. It also means you can make beautiful displays, and you can do other interesting applications that take advantage of the fact that you can now directly connect the mathematics in your software to physical light in the world.

That’s where we get to this idea of software-defined light, or SDL, which is really the generalization of this technology. We can compute a light field that we want to generate, a pattern of photons, and if we compute it correctly, it will exist. It’s a direct connection between mathematical computation, information theory work in a machine, and a light field in the real world. That applies to everything from lidar applications to headlights that don’t blind oncoming cars to anywhere you have a system of lenses or complicated optical devices. In just the way that software-defined radio allows Qualcomm to put custom radio hardware in phones and other devices, letting a computer send and receive radio, we are now able to use a computer to directly send and receive light.
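One small, standard example of what connecting math directly to light looks like: write a sawtooth phase ramp across a phase modulator and the outgoing beam deflects by the first-order angle of the grating equation. The parameters here are illustrative, not Daqri hardware specs.

```python
import numpy as np

# Steering a beam purely in software: a sawtooth phase ramp acts like a blazed
# grating, deflecting the first diffraction order. Example parameters only.
WAVELENGTH = 532e-9    # meters
PITCH = 8e-6           # modulator pixel pitch, meters
N = 1024               # pixels across the modulator

period_pixels = 16                          # the ramp repeats every 16 pixels
grating_period = period_pixels * PITCH      # meters

x = np.arange(N)
phase = (2 * np.pi * x / period_pixels) % (2 * np.pi)   # sawtooth phase pattern

# Grating equation, first order: sin(theta) = wavelength / period.
theta_deg = np.degrees(np.arcsin(WAVELENGTH / grating_period))
print(f"Beam steered by about {theta_deg:.3f} degrees")
```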

https://www.youtube.com/watch?v=uMHMsaawupE

When all of this became clear, it was obvious that this was an important thing to pursue. I found myself, as a person with a background in field theory and real time computing and making hardware and running a physics lab — it seemed like I was the guy to take this technology and exploit it as quickly as possible. That’s what we’ve done. My lab has become the graphics prototyping and rapid research lab for Daqri around software-defined light. I’ll be working on all the stuff I just mentioned, and probably some more.

GB: How many people were in your lab? Are they all coming over to join this effort?

Blackley: Everybody came over. We had five people, and then another four or five contractors. Most of them have come over. Now we’re rapidly accelerating our hiring, getting more people with backgrounds in holography, more electrical and optical engineers. We’re having a really good time. It’s a pretty easy sell because we have this stuff working. We bring people in who’ve been working on optics and we say, “Hey, we can do this.” They say, “OK, I’ll start now.”

GB: As far as the application here, is it a brand new product line for Daqri, or is there some way this comes back to help with their smart helmet?

Blackley: This is something of a serendipitous outcome from the AR work on the HUD. The guys on Jamie Christmas’s team who invented this knew that there were other applications. It’s not like we discovered some kind of secret. But we did discover that the techniques they were employing made it possible sooner than anyone had thought. It doesn’t affect the strategy at Daqri. In fact, there are places where using these devices can make their AR displays much better and more powerful. In that way it complements them. But it also introduces ways to take our thinking about AR and extend it into the real world.

Instantaneous printing, if you think about it, is the ultimate example of AR. You take a computer object and you make it real. It augments reality. That sounds like a joke, but it really is the sort of thing people have been thinking about around AR since the ‘40s and ‘50s in science fiction. One of the weird things that happens, in fact, when we show this to people who aren’t familiar with 3D printing technology, is that they’re not that impressed. If you’ve never used a 3D printer you’re not aware of the state of the art. You probably expect, from Star Trek, that 3D printing involves just pressing the button and you have the object in your hand in 10 seconds. So people say, “Well, of course that’s how 3D printing works.” Which is super depressing when you’re trying to impress someone with a demo, but it makes a lot of sense.

GB: Is the first application in 3D printers? Or will you be doing something else?

Blackley: We have a lot of partners at Daqri in various business categories. We obviously have an automotive partner for the HUD. Hundreds of thousands of vehicles on the road are using the first generation of the HUD technology. Those vehicles are already using the holographic technology in that version, and phase two of our product road map comes in the near future. We also have other automotive partners that we haven’t announced yet.

When we showed the capabilities of SDL to those automotive partners, they became incredibly interested in its applications for lidar, the laser-based sensing system. If I can mathematically steer light and read back the signal, I have a big advantage over mechanically scanning lidar systems. Also, if I’m mathematically defining the outgoing light, I can encode in it all sorts of information that makes the lidar problem easier, faster, more robust, and so on. We’ll be talking about this more in the future. It’s an area of research right now because it’s so important to self-driving car efforts and safety features. It may be that we end up licensing that technology to partners. It may be that we try to build some products. We haven’t made that decision yet.
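Blackley’s point about encoding information in the outgoing light maps onto a standard signal-processing trick: transmit a known code, cross-correlate the return against it, and the correlation peak gives the round-trip delay and therefore the range. The sketch below is that generic coded time-of-flight idea with made-up numbers, not Daqri’s lidar design.

```python
import numpy as np

# Generic coded time-of-flight ranging: a mathematically defined outgoing code
# lets correlation pull a weak echo out of noise. Illustrative numbers only.
C = 3.0e8               # speed of light, m/s
SAMPLE_RATE = 1.0e9     # 1 GHz sampling, i.e. 1 ns per sample

rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], size=256)    # pseudo-random transmit code

true_range_m = 45.0
delay = int(round(2 * true_range_m / C * SAMPLE_RATE))   # round-trip delay in samples

# Received signal: an attenuated, delayed copy of the code buried in noise.
received = rng.normal(scale=0.5, size=4096)
received[delay:delay + code.size] += 0.2 * code

# Cross-correlate against the known code and take the peak as the delay estimate.
corr = np.correlate(received, code, mode="valid")
est_delay = int(np.argmax(corr))
est_range_m = est_delay * C / (2 * SAMPLE_RATE)
print(f"true range {true_range_m} m, estimated {est_range_m:.2f} m")
```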

Enterprises could use augmented reality glasses to improve technical maintenance.

Above: Enterprises could use augmented reality glasses to improve technical maintenance.

Image Credit: Daqri

The same thing goes for printing. We have partners in our AR business, customers of our helmets and goggles and other sensing devices, who are very interested in not only just straight 3D printing, but in the mechanism of that printing, which would enable you to not just print things in a tank, but also print things on objects, print things inside objects, mark objects, and so on. That’s a licensable technology. It’s also something that could be a stand-alone product in conjunction with a partner.

The overall message is that I really am running an R&D lab. The decisions we make, based on what succeeds and fails in that R&D, have a whole spectrum of implications. But the immediate application will be in the automotive side, just to put a fine point on it.

GB: Are you guys a standalone lab, or do you join another R&D unit that’s already inside Daqri?

Blackley: We’re a standalone lab. The R&D that went on in holography took place in Milton Keynes, in England. Fortunately, they’re incredibly nice, incredibly smart people, and so we’ve been able to partner with them. They’re doing fundamental research for the automotive products, the fundamentals of the holography. We’re looking at applications and improvements and other ways to push that technology into different places.

GB: As far as technologies you’re excited about, do you think some of these big problems are going to get solutions soon? Things like what Magic Leap is dealing with, and making outstanding AR in general. Is that going to be solved soon? Does this technology play a role there?

Blackley: This technology is completely distinct from the technologies that are used by Magic Leap or HoloLens or any of the other AR companies. This is a much harder way to go about solving the problem. I was nervous about whether or not we would be able to use it properly to generate holographic images in something like real time.

To some extent, the technologies at HoloLens and Magic Leap and so on are ideas that people had to create images that look a lot like holograms, without having to try to solve the general holography problem, because that’s been viewed as very difficult. It is very difficult. We operate in a totally different way. Our progress with software-defined light is in a separate direction from what those guys are doing.

We met, actually, with a very senior engineer who works on virtual reality, not augmented reality. Like all engineers we show this stuff to, he was very skeptical, and then when we proved it’s working, he was super excited and very impressed. He was laughing, because he suggested we call it Pepper’s ghost instead of calling it holography. He said that for so long, since the ‘60s or the ‘70s, people have been projecting images onto screens and sheets of mylar and stuff and saying, “Oh, look, it’s a hologram.” It would be a funny joke if, now that we’re making a real hologram, we went and called it Pepper’s ghost. It’s different enough that tech people crack jokes about how different it is.

I have no idea what the fate of those companies is going to be. I don’t even really know anymore what struggles they’re facing, because I’ve been so focused on my own stuff. But I can tell you that it’s a completely different technology tree. It comes from an unexpected place. We never thought that automotive HUD holographic projection units could be pushed to do live 3D printing or polygonal holographic displays. But it turns out we can.

Above: Software Defined Light enables instant 3D printing.

Image Credit: Daqri

GB: I’m doing one of our conferences again. We’re focusing on the theme of how different fields are inspiring each other. Science fiction inspired games in areas like virtual worlds, and real-world technologies like AI, and now those technologies are accelerating past what fiction predicted. I wonder about your own point of view on that. Do you see that kind of acceleration going on as far as what inspires what these days?

Blackley: I came to this taking what I thought was a contract job for my rapid prototyping lab, to take somebody’s holographic display and make a printer out of it. That was the contract I accepted. It was a complete surprise that this device could be made to generate live holography. I’ve now gone through years of academic research and other stuff looking at the development of computational holography. I don’t think anybody expected us to be at this point so soon.

A lot of times, at least in my experience, things that are actual breakthroughs come from unexpected places. The combination of automotive HUD and 3D printer is a pretty interesting way to come up with novel display technology. If you look at any movie you go to, all the movies dorks like you and me go and see, everything assumes that in the very near future there will be holographic displays all over the place. But as yet, nobody had any idea how we would do this, unless everyone was wearing the same brand of goggles or something.

It’s in the back of everyone’s mind, which is a double-edged sword, because on the one hand, it’s the first thing you want to go for when you think you have a new technology like this. It really excites people. I don’t want to overstate things as if we’re going to be building holographic TVs next year, because it’s really hard. It’s going to take a long time. I don’t know exactly how it’s going to go down. But we’re going at it hard. On the other hand, the fact that it’s been in popular media so much means that people just expect it.

Above: Star Trek Tricorder

Image Credit: Paramount

GB: It’s sort of like, “Why hasn’t somebody made me a tricorder yet?”

Blackley: Right. Or when we show the instant 3D printer to people who are new to printing. When you show that to someone who does a lot of 3D printing, they freak out. But when you show it to somebody who doesn’t know anything about the field, they just say, “Oh, that’s cool.” Because of movies and TV and science fiction they assume that’s just how it’s supposed to work. Why would it take time?

Maybe 20 years ago, somebody found a meteorite in the ice near the South Pole and thought it might contain fossilized bacteria from Mars. The public reaction to that wasn’t, “Holy shit, life on Mars!” The public reaction was just, “Oh, okay.” Because we’ve been trained to believe that this is just one of those things. Holography is the same. It’s seeped into the public consciousness so deeply that the standard you have to hit in order to actually impress someone is incredibly high. Of course a little robot could project Princess Leia. Why don’t I have that at home right now? It turns out that it’s harder than that.

GB: I didn’t know what you were doing out here.

Blackley: It’s where I live. I had the arcade and all that stuff going on. I was working on some cool projects. When I saw that it had become possible to change the way displays work, it was too epic to turn down. It’s one of these situations where — I was talking to our CEO, who has a degree in physics and is a really smart guy. It became clear to me that I was an ideal person for this. The universe was handing me a mission, almost. “Here’s a terrible computation problem in graphics hardware, which you know about that needs to simulate field theories, which you know about, with the goal of making super high quality displays, something you also know about. You’d better get on it.”

GB: Was there any difficulty in making everything line up as far as getting one company to work on this technology, something that had been the work of multiple companies before?

Blackley: This predated me, but Brian and the board at Daqri and the executive team seemed to be pretty diligent about finding companies that were doing interesting things in AR, and either partnering with them or acquiring them. They acquired Two Trees, which was pretty unique in having a device like this that worked, and they also acquired a couple of patent libraries from other researchers who were doing similar things, with the goal of aggregating it all in one spot so you could really get somewhere. I have Brian to thank for doing that, and his foresight. We seem to have a clear runway at this point.

Above: Daqri’s helmet

Image Credit: Daqri

GB: Are you actually in an arcade?

Blackley: We took the warehouse that had all the old games in it. That warehouse is now a really cool super-high-tech Millennium Falcon-looking lab, with a bunch of PhD guys in it.

GB: Is it a big switch from working on games?

Blackley: It’s almost like a continuum. The thing that always interested me about games is the intersection between technology and art. The collection of games I have here is really a collection of the intertwined relationship between people and computers and entertainment and art. It’s an appropriate place to get right on the bleeding edge and push that. It feels right from here. It feels really good.

GB: If you keep working on that display technology, you’ll have a new platform for games.

Blackley: I hope so. I really do. That, in my heart of hearts, is the thing I’m pushing for. Having done it a couple of times before, I think I have a pretty good shot.
