Wednesday Mar 15, 2023
Gavin Smith, Voxon
The 16:9 PODCAST IS SPONSORED BY SCREENFEED – DIGITAL SIGNAGE CONTENT
When I was at the big ISE pro AV trade show a few weeks ago, I yet again saw several products that were billed as holograms, even though they didn't even loosely fit the technical definition.
I am always paying attention to news and social media posts that use that terminology, and once in a while, I come across something that actually does start to align with the true definition of holograms and holography. Like Voxon, which operates out of Adelaide, Australia.
Started years ago as a beer drinking and tinkering maker project in a garage, Voxon now has a physical product for sale that generates a visual with depth that viewers can walk around and see from different angles.
That product is mainly being bought by universities and R&D teams at companies to play with and learn, but the long game for Voxon is to produce or be the engine for other products that really do live up to the mainstream, Hollywood-driven notion of holograms.
I had a great chat with co-founder and CEO Gavin Smith.
Subscribe to this podcast: iTunes * Google Play * RSS
TRANSCRIPT
Gavin, thank you very much for joining me. I know you're up in Scotland, but you are based in Adelaide, Australia, correct?
Gavin Smith: Yes, that's right. I'm originally from Scotland. I grew up here, spent the first part of my life in the north of Scotland in Elgin, and then I went to university in Paisley, Glasgow and then eventually, after working for 10 years in the banking sector, I immigrated to Australia and I've lived in Adelaide for the last 14 years.
That's quite a climate shift!
Gavin Smith: Yes, it is a climate shift. I was speaking to my wife the other day, and it was about 40 degrees there; they're having a heat wave just now, whereas up here in Elgin it's about 1 degree at the moment.
Yeah. I'm thinking, why are you there in February? But on the other hand, why would you wanna be in Adelaide if it's 40 Celsius?
Gavin Smith: I quite like the cold. I prefer to be in this temperature right now than 40 degrees, that's for sure.
Oh, I just spent 45 minutes with my snow machine clearing 25 centimeters of snow off my driveway, so I wouldn't mind being in Adelaide today.
Gavin Smith: Thankfully I can have the best of both worlds. I'm heading back there in about a week and a half's time.
I was intrigued by your company. I saw a couple of LinkedIn posts with embedded videos and thought that's interesting and I wanted to speak more. So can you tell me what Voxon does?
Gavin Smith: Yes, sure. So Voxon is a company that started in about 2012-2013, and it came out of two joint research projects. One was me and my friend Will, based in Adelaide; we had a Thursday Night Lab Session, as we called it, where we went to the shed, drank a few beers and tried to invent things. It was a bit Weird Science-esque.
So this wasn't exactly a lab?
Gavin Smith: It was a shed, let's face it, with a beer fridge and a lot of machinery in various stages of repair. We used to get hard rubbish off the side of the road in Adelaide and take it apart and see what we could make.
It was just amateur invention hour. At the start of that project, we built fairly rudimentary machines, CNC machines, and we took apart laser scanners, just inquisitive about how they work from a mechanical point of view. But that then turned into more of a, let's see how far we can push ourselves and learn new stuff, and we'd been inspired by sci-fi, Star Wars, all those sorts of things. So we said, let's try and make the sort of 3D display that we'd seen in the movies. Those science fiction movies always had the same type of display, and it wasn't a screen, it wasn't a headset. It was always some sort of floating image that you could walk around and look at from any direction, and the common name for that in popular media was a holographic display. That's what people called it. So that's what we set out to build.
We very quickly figured out that this type of display had to involve projecting images or dots onto some sort of surface that moved, because in order to render these little dots that make up the image inside a space with physical dimensions, you can't just make lights appear in mid-air. We figured you might be able to do it with some sort of gas or lasers, things like that, but the way we approached it was starting off by just shaking business cards back and forwards and shining lasers on them, and that made a line because of persistence of vision.
I always think that Neanderthal man invented the volumetric display, because they probably waved burning embers around on sticks at nighttime and drew patterns in the air, and those patterns really only existed because of the persistence of vision and the extrusion of light through a volume of space. So that's what we decided to do, and we realized that if you could draw a line, and if you could control the laser and turn it off and on again, you could draw a dot. We did that by cutting the laser beam with a rotating CD stuck on a high-speed drill with some sticky tape. We chopped the laser into little bits, and by controlling the speed, we ended up with a single dot, which we referred to as a voxel; we Googled it and found that a dot in space is called a voxel.
Then we extrapolated from there and said, if we're building these images out of little pixels of light, or voxels, we need more and more of these dots, and when you do the math you quickly realize that you need millions of dots of light per volume to make an image, and that's difficult. That started us down the road of experimenting with video projectors, with lasers, with all sorts of things, and more and more advanced moving surfaces. Eventually we made a small helical display using a vacuum-formed helix that we basically made in the oven in Will's wife's kitchen when she was out, and we created a very small image of an elephant. You might call it a hologram; that's what we called it at the time, but it was a volumetric swept-surface image. I'll go into the terminology in more detail later, but at the time it was just a hologram to us, and we thought this was amazing and we'd never seen it before. So we put a video of it on YouTube, and some guys in America who, unbeknownst to us, were doing the same project got in contact with us, and push came to shove, we decided to join forces and form Voxon. That was back in 2013.
So when you created this little elephant, was that like a big ‘aha’ moment? Like, “Oh my God, we figured this out”?
Gavin Smith: Yes, very much so. We believed at the time that we were the first people to do this. In fact, we weren't, but it was the first time we'd seen this type of image, and it was literally spine-tinglingly amazing to see a truly three-dimensional object that you could look at from above, from the sides, from any angle. It filled a space the same way you or I fill a space in the physical world; you could measure its length, its breadth, its height, and even its volume in gallons or liters. It had a tangible existence in the physical world, not on a screen as other 3D images tend to do.
At this point, was this a stationary object?
Gavin Smith: Yes, at this point the elephant was stationary. To make the elephant, we first needed to have the swept surface moving. That was the helical screen, which was spinning at about 900 RPM on a very small electric motor, and then we had a video projector that we'd managed to get going at about 1,200 frames per second. The images, which were helical cross-sections of an elephant, were all created offline. The way I approached that was using software called 3D Studio Max, which is design software, and in that I modeled a helix and an elephant. I then intersected the helix with the elephant in the software, rotated the helix digitally, and rendered out the resultant cross-section, the boolean operation of one on the other. It's like taking a drill, drilling a hole into the ground, and looking at just a helical core sample.
So really it was like a CT scan of this elephant, just a slice at a time, and then I rendered those images to a file. I wrote some software to convert it to a new video format that we had to invent to compress all that data into a high-speed image stream, and then projected that onto the helix. Now, of course, the timing of the images and the rotation of the helix were not in sync, so, much like an old CRT screen where the vertical hold is not dialed in, the elephant would drift out the top of the display and come back in at the bottom. At that point, we knew that this was all about a combination of mathematics, optics, precision, and timing, and to make it interactive, we'd have to write a real-time computer program capable of generating these images in real time. That was the next part of the puzzle.
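To make the slicing idea a little more concrete, here is a minimal sketch of how helical cross-sections of a voxelised model could be precomputed, in the spirit of the offline workflow Gavin describes. This is not Voxon's pipeline; the model, resolutions and helix parameters are illustrative assumptions.

```python
import numpy as np

def helical_slices(volume, n_frames, turns=1.0):
    """Precompute the images to project onto a spinning helical screen.

    volume   : (X, Y, Z) boolean voxel grid of the model (the "elephant")
    n_frames : projector frames per full revolution of the helix
    turns    : how many turns the helical surface makes across the volume depth

    For each rotation phase, every pixel (x, y) lands on the helix at a height
    that depends on its angle around the axis; sampling the voxel grid at that
    height gives the cross-section to project for that frame.
    """
    X, Y, Z = volume.shape
    cx, cy = (X - 1) / 2.0, (Y - 1) / 2.0
    xs, ys = np.meshgrid(np.arange(X), np.arange(Y), indexing="ij")
    theta = np.arctan2(ys - cy, xs - cx) % (2 * np.pi)   # angle of each pixel

    frames = []
    for f in range(n_frames):
        phase = 2 * np.pi * f / n_frames                 # current helix rotation
        frac = ((theta - phase) % (2 * np.pi)) / (2 * np.pi)
        z = np.clip((frac * turns % 1.0) * (Z - 1), 0, Z - 1).astype(int)
        frames.append(volume[xs, ys, z])                 # boolean slice image
    return frames

# Illustrative use: a 64^3 sphere standing in for the elephant model.
g = np.indices((64, 64, 64)) - 32
sphere = (g ** 2).sum(axis=0) < 20 ** 2
slices = helical_slices(sphere, n_frames=360)
print(len(slices), slices[0].shape)   # 360 frames, each 64x64
```

In the real display these frames have to be locked to the screen's rotation, which is exactly the sync problem Gavin describes with the drifting elephant.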
This was a working prototype, basically.
Gavin Smith: This was a working prototype, yeah.
How big was it?
Gavin Smith: The helix was very small. It was about five centimeters in diameter, an inch and a half or so, and about an inch tall. The projector that we used was a pico projector, about half the size of a pack of cards, this tiny little thing that we got off the internet from Texas Instruments, and you could focus it at about one centimeter away. So all those little pixels were infinitesimally small, so it was a very high-resolution display, and very small. We realized that to get that number of frames per second, we'd have to take advantage of one of the most incredible pieces of engineering ever conceived, in my opinion, and that is the DLP chip from Texas Instruments, invented by Larry Hornbeck, who sadly passed away several years ago. That is an array of mirrors grown on a chip using photolithography, the same process used to create microchips, and that array contains upwards of a million mirrors arranged in a two-dimensional grid, and they can physically tilt on and off about 30,000 times a second.
And that's called a MEMS, a microelectromechanical system, or in optical terms, a spatial light modulator. So it's something that turns the light on and off at ultra-high speed, and those on-off cycles are what give us our Z-resolution on the display. So that's the slices that make up the display.
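As a rough back-of-the-envelope, the Z-resolution Gavin mentions falls straight out of the mirror flip rate divided by the sweep rate. The numbers below are illustrative assumptions, not Voxon specifications.

```python
# Illustrative arithmetic: how DLP binary frame rate translates into slices.
binary_fps = 30_000          # on/off mirror flips per second (order of magnitude)
sweep_hz   = 30              # volume refreshes (sweeps) per second
xy_pixels  = 1000 * 1000     # pixels in one projected frame (~1 megapixel DMD)

slices_per_sweep = binary_fps // sweep_hz
voxels_per_second = slices_per_sweep * xy_pixels * sweep_hz

print(slices_per_sweep)               # 1000 slices of Z-resolution per volume
print(f"{voxels_per_second:.1e}")     # ~3.0e10 addressable voxels per second
```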
Wow. So where are you at now with the company now that you've formed it and you've grown it, what's happened since that very first prototype elephant?
Gavin Smith: Following that, we realized that my programming skills were finite. I'd spent 10 years as a COBOL programmer in banking, and I wasn't up to the task of writing what was needed, which was a low-level graphics engine. This wasn't a job for a mainframe, and we couldn't have afforded a mainframe even if we'd wanted one.
So we looked on the internet to see who we could find, in terms of programmers, to join the company, and there were two who stood out. They were referred to as the top two programmers in the world: John Carmack, of Oculus, and Ken Silverman, who wrote the graphics engine for Duke Nukem back in the 90s. John wasn't available, so we contacted Ken and demoed the technology to him at Brown University in Rhode Island, where he was working as a computer programming teacher alongside his dad, who was the Dean of Engineering there. Ken really liked what we were doing, and his understanding of mathematics, voxels and 3D rendering really made him think this was something he wanted to be involved in. So he joined our company as a founder and chief computer scientist, and he has led the development of the core rendering engine, which we call the Voxon Photonic Engine. That's really our core IP: the ability to take any 3D graphics from a third-party source, from Unity, from a C program or something else, and turn it into a high-speed projected image, which can be processed in such a way as to de-warp the frames when they're projected, so they're the right size. We use dithering in real time to make color possible, which is similar to CMY newsprint in a newspaper, and this all basically allows us to project images onto any type of moving surface now, do it in real time, and make applications that are much bigger and extensible, so we can plug it into other programs or have people write their own programs for our displays.
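The real-time dithering Gavin compares to newsprint is conceptually similar to ordered (halftone-style) dithering. Here is a minimal, generic sketch of that idea; it is not the Voxon Photonic Engine's actual code, and the matrix and image are illustrative.

```python
import numpy as np

# 4x4 Bayer matrix: a classic ordered-dither threshold pattern,
# conceptually similar to the halftone screens used in CMY newsprint.
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def ordered_dither(channel):
    """Turn one 0..1 intensity channel into a binary on/off mirror pattern.

    DLP mirrors are strictly on or off, so shading has to come from spatial
    patterns like this rather than true grey levels.
    """
    h, w = channel.shape
    threshold = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return channel > threshold

# Illustrative use: dither a horizontal intensity ramp.
ramp = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
binary = ordered_dither(ramp)
print(binary.shape, binary.dtype)   # (64, 64) bool
```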
So you've emerged from being an R&D effort in the shed to a real company with working prototypes, and now you're an operating company with a product.
Gavin Smith: I like to say we've emerged, but I'd very much say we're still crossing the chasm, so to speak, in terms of the technology landscape.
After that initial prototype, we spent many years butting heads, trying to work as a team in America, and eventually Will and I decided to raise some money in Australia and set up the company there. We raised about a million and a half Australian dollars, about a million US dollars back in 2017, and that was enough to employ some extra engineers, business development and an experienced COO, and start working on our first product, which was the VX1. Now, the VX1 was a different type of display. We decided not to do the helix back then; we decided to make a reciprocating display instead, and so we invented a way of moving a screen up and down very efficiently using resonance. It's the same mechanical property that all objects have: at a certain frequency, they start vibrating if there's a driving force. The Tacoma Narrows Bridge falling down when the wind blew at the right speed is an example of resonance destroying something, and an opera singer breaking a glass at the right pitch is another example of something vibrating due to a driving force. We found that if we built a screen mounted on springs of a very particular stiffness and weight, we could vibrate that subsystem and the screen would move up and down very efficiently and very fast, fast enough that you couldn't see the screen. So that's what the VX1 became. Onto the back of that screen we project images, and those images form a swept volume. The VX1 has a volume of about 18x18x8 cm, I think about 7 inches square by about 3 inches tall, and we have a single projector mounted inside of it, and a computer and a ton of electronics keeps it all in sync. We built a software API for it and a library of programs that come built in. So it's off the shelf: you turn it on and it works.
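The resonant-screen idea comes down to the standard mass-spring relation, f = (1/2π)·√(k/m). The figures below are purely illustrative assumptions, not the VX1's actual spring constants or masses.

```python
import math

def resonant_frequency(k_total, mass):
    """Natural frequency (Hz) of a mass on springs: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(k_total / mass) / (2 * math.pi)

# Illustrative numbers only: four 4 kN/m springs driving a 0.4 kg screen assembly.
k = 4 * 4000.0      # N/m, springs acting in parallel
m = 0.4             # kg
print(f"{resonant_frequency(k, m):.1f} Hz")   # ~31.8 Hz, i.e. fast enough to blur the screen
```

Driving the assembly at (or near) that frequency is what lets a small motor sweep the screen quickly with very little energy input.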
And so we built that back in 2017, and over the last five years it's evolved into something which is very reliable. Now you can't tell them apart when they're manufactured; at the start, each one might look different, with hot glue and duct tape and all the rest of it, but now we have a complete digital workflow. We outsource most of the manufacture of the parts, and we do final assembly, software, QC and packaging, then ship them out. We've sold probably about 120 VX1s globally since 2017, and those have gone to companies all around the world, like Sony, MIT, Harvard, CMU, Unity, BAE Systems, Verizon and Ericsson, and they're generally going into explorative use cases.
Yeah, I was going to say, it sounds like they're going into labs as opposed to stores.
Gavin Smith: Yeah, they're not going into stores. The VX1 is really an evaluation system. It's not prime-time ready for running all day long, and the reason for that is it has a vibration component to it, and also the refresh rate of the VX1 is actually variable within the volume. It's hard to explain, but the apparent volume refresh rate is 30 hertz in the middle and 15 hertz at the poles, so it has a little bit of flicker. But in a dark environment it's really spellbinding, and it's actually used in museums. There are some in Germany, in a science museum there. It's been used in an art exhibition in Paris, where the art was created by David Levine and the MIT Media Lab, and it's frequently used in universities and pops up at all sorts of trade shows. It's always a talking point and it always gathers a crowd, and what we like to say about the volumetric display from a marketing point of view, or really as a description of what it is, is that it's about creating a digital campfire. That's the kind of user experience.
It's gathering people around something intimately in a way that they can still have eye contact and maintain a conversation, and each person has their own perspective and view of the 3D data.
The scale you're describing is still quite small, and that seems consistent with what I've experienced when I've seen demonstrations of light field displays at the SID trade show. They're all about the size of a soda bottle at most.
Is that a function of the technology, that you can't just make these things big?
Gavin Smith: You can make them bigger, and we have since that point. The biggest display that we've made so far is one we just delivered to BAE Systems in Frimley, near London, and for that particular one we've gone back to the helical display. It's 46 centimeters in diameter and 8 centimeters deep, so that's about nine times the volume of the VX1. It's a much bigger display.
Now, with a swept volume, you can go as big as you'd like within the realms of physics, and what I mean by that is that with a rotating display, you can make the display as big as something that can rotate at a speed fast enough to make the medium effectively disappear. So if you think about propellers and fans, for example, I've seen pedestal fans a meter in diameter running faster than we run our display. With rotating displays it's easier to do, because you have conservation of momentum and the inertia drives the display around, and you can enclose the rotating volume so that you're not generating airflow the way a fan does.
So, for example, if you have a propeller-shaped blade encased in a cylindrical enclosure, and that enclosure is spinning, then you don't get the air resistance you get with a fan, and the display that we made for BAE Systems is virtually silent and flicker-free, because we're running at exactly 30 hertz throughout the volume. Reciprocating displays, ones that go up and down, are more of a challenge to scale, because you're having to push the air out of the way, and as the screen gets bigger, if you're projecting from behind, you also have to start considering things like the flexing of the substrate that you're projecting onto. For a front-projection display, where you project down from the top, we can go bigger, because you can make a very lightweight but stiff, thicker screen out of exotic materials: things like aerogels, foamed metals and very lightweight honeycomb structures. That way you can go bigger, but we may need to move into the realms of reduced-atmosphere displays, partial vacuums and things like that to reduce the resistance, or use materials that are air-permeable, such as meshes that move up and down very quickly. We have done experiments with those and found that we can go a lot bigger.
However, with the current projection systems that we're using, you then have to increase the brightness, because the brightness of the image is also stretched out through a volume. A home cinema projector putting out 3,000 or 4,000 lumens has pretty much all of its pixels evenly lit in every frame. Whereas when we project these thousands of images, we're only illuminating the cross-section of every object, so we're maybe only using 1% of the available brightness of the projector at any one time, unless you project a solid slice all the way across. You're really building up a construct, and the way I explain it to people is that it's very similar to 3D printing. If you look at how a 3D printer works, we are doing exactly the same thing, except we are printing using light instead of PLA, and we're printing thousands and thousands of times faster.
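The brightness point can be put into rough numbers: only the cross-section pixels are lit in any slice, so only a small fraction of the projector's output is drawing the image at any instant. The figures below are assumptions for illustration, not measured values.

```python
# Illustrative arithmetic: how much projector light is actually in use at once.
projector_lumens = 3000    # nominal brightness of a home-cinema class projector
lit_fraction     = 0.01    # ~1% of pixels lit in a typical cross-section slice

light_in_use = projector_lumens * lit_fraction
print(light_in_use)        # ~30 lumens drawing the volumetric image at any instant
```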
In digital signage, the thing that always gets people nervous is moving parts, and that directly affects reliability and longevity. How do you address that?
Gavin Smith: So the VX1 is a good example of moving parts in a display that isn't yet ready for long-running use, and when I say long-running, we do have it in exhibitions, but we have recently engineered it in such a way that the parts that may break, or will break, are the four springs that drive the machine, and those have been engineered to resonate at a particular frequency. After several hundred million extensions, those springs can fatigue and will eventually break, and that's something we're working on; that might be three weeks or a month of running 24/7. So we've made those springs user-replaceable. You can change them in two or three minutes for a fresh set, so it's almost like the maintenance profile of something like an inkjet printer, where you have to change the cartridge every so often.
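The replacement interval follows from dividing cycles-to-failure by the oscillation rate. Both numbers below are assumptions chosen only to illustrate the arithmetic, not published VX1 figures.

```python
# Illustrative service-interval arithmetic for the user-replaceable springs.
cycles_to_fatigue = 200_000_000   # assumed "several hundred million" extensions
oscillation_hz    = 80            # assumed screen oscillation rate, for illustration

seconds = cycles_to_fatigue / oscillation_hz
days = seconds / 86_400
print(f"{days:.0f} days of 24/7 running")   # ~29 days, i.e. roughly a month
```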
And we find with mechanical stuff, people accept mechanical things in their lives as long as the maintenance-to-utility ratio is at a level they can accept, like bicycles, cars and things like that; you maintain them as long as their utility outweighs the inconvenience of the repair. Now, for projection equipment and things like that in digital signage, there are a lot of two-dimensional technologies that are ultra-reliable: big LED panels, 2D video projectors and just lighting. You can turn them on and leave them and you should be okay.
So with our rotating displays (and we have another rotating display that we're working on, which we can't discuss just now because it's still under NDA), part of the reason we're going down that design path is that we can make rotating displays which are very reliable. They're effectively like a record player: you turn it on and it spins around, and you could leave it and come back in three weeks and it would still be spinning. Also, a rotating display, if properly manufactured within tolerances, won't cause vibration, and vibration is really the thing that causes issues, because it can lead to fatigue and failure in electrical and electronic components, small cracks in circuits, and things like that.
So from our point of view, we're moving towards rotating mechanics, because that ultimately allows us to make things which are reliable enough to be used in a wide range of industries, including digital signage, advertising, medical imaging, gaming and many more.
In my world, there are all kinds of companies who are saying that they have holographic products of some kind or another. As somebody who's doing something that sounds very much like a hologram or close to what we thought of when we all saw Star Wars, what do you think of those things?
Gavin Smith: I don't like to be a troll on LinkedIn, first of all, so I try to shy away from saying, look, that's rubbish. What I try to do is politely point out how things work when it's not clear from someone's post how something might work, or where it's misleading. Now, if you look at the term hologram, it comes from the Greek hólos and grammḗ, which means the whole message, and in a way, even an actual hologram, which is created using lasers, interference patterns, light beams and things like that, doesn't represent the whole message. Because if you take your credit card out, which is one of the few places you will see a hologram, you'll notice that you can't look down on the hologram from above, and you can't turn the card over and look at it from the back. They are a limited view of something. So the term hologram has become, as you say, in popular fiction and popular media, really a catch-all for anything that is sci-fi 3D related, right? And it's misused; everyone calls it a hologram, and our staff sometimes call our display a hologram. I like to say it's not a hologram, because it has a lot more features than a hologram.
Holograms have some really interesting properties, one of which is that you can cut a hologram into 10 little pieces and it turns into 10 individual little holograms, and that's a really interesting thing. But holograms, from a 3D point of view, don't exist in signage anywhere. They simply don't. The terminology used to describe things that you see in signage and popular media is completely misused, and I like to go through and categorize them.
First of all, there are volumetric displays, of which we're the only company in the world making a commercial volumetric display. There's one other company, Aerial Burton, based in Japan, that makes a volumetric display, but it's a very high-tech scientific prototype that uses lasers to explode the air and has very low resolution.
Then you've got autostereoscopic 3D displays, which broadly fit into the category of lenticular displays. Those are, as you probably know, LCD panels with a plastic lens array on them that allows you to see a left and a right image, and those two images can give you a stereoscopic view. I would call them stereoscopic displays because they're not truly 3D: you can't look at them from any direction, and they don't physically occupy three-dimensional Euclidean space, which is what the real world is. Those displays come in different formats. Some have just horizontal parallax, which means you can move your head left and right and see a number of distinct views, and some also give you a little bit of vertical parallax as you move up and down. There are probably five or six companies doing those sorts of displays: Looking Glass, Light Field Lab, Acer and Sodium, so that area can grow. The physical size of those displays can get bigger, but the bigger they get and the further away you stand, the harder it is to get a 3D view because of your pupil distance. And with any display like that, because the 3D image you see is the result of your left and right eyes seeing two independent images, that image can never leave the bounds, the window, of the display. That's something advertising misuses a lot: they show a 2D monitor with the image leaping out beyond the border of the monitor, and that just can't happen. It breaks the laws of physics. So that's the autostereoscopic 3D landscape, and it's hard to even say "autostereoscopic 3D display" because people zone out and go, is it a hologram? And no, it's not.
The other types of 3D that are popular just now are, obviously, the glasses-based displays: AR, VR, mixed reality. We don't really mind about those, because you have to put something on your head, and that's a different thing really. They offer an immersive experience where you go down a rabbit hole and you're in another world, and that's not what we are about.
And then you've got the fake 3D displays, which are not stereoscopically 3D but appear that way, and that's where I get slightly annoyed, though I understand there are people making types of signage, I guess you would say, that are perfectly suitable for a scenario. Those are things like Pepper's ghost, which is when you reflect a 2D image off a big piece of glass or plexiglass; the famous one is the Tupac hologram at Coachella. I met the guy behind it and had a good chat about it. He's a really lovely guy, and he knows full well that it's an illusion, but it's the illusion Disneyland has been using for many years, and it's a perfectly good illusion for a seated audience because they see someone on stage. They're doing it now with, I think, the ABBA show in London, which is a similar type of setup.
They call them holograms, but it's a 2D picture that's far enough away that you can be made to believe it's three-dimensional, and it might exist at different levels like a diorama. You could have a stack of images, on fly screens or whatever, that appear to be layered, but ultimately they are 2D.
And then the one that's come out recently, which probably causes the most confusion for people, is the anamorphic projection on large billboards. Everyone's seen these on LinkedIn and YouTube; they tend to appear on large curved billboards in parts of China where the rental of the billboards is sufficiently cheap that you can put these big images up there, film them from one particular spot in 2D, and then put that on LinkedIn and have people comment on it and say, wow, that's an amazing hologram. Even though a) they haven't seen it in real life, and b) it's not a hologram and it's not even three-dimensional. It's a perspective-based 2D trick. So one of our challenges is expectation management: people see large-scale fake 3D images, conclude that this must be possible and want to buy one, and then when they see ours they go, oh, it's much smaller than I imagined, and you feel like saying, it's real. It's actually based on science, and you can walk around it.
And that's the challenge we face just now: trying to move away from this feeling that you have to have the biggest display in the world for it to be valid. A lot of the inquiries we get are from the likes of the Middle East, where they want to build very big, very impressive, very bright, very colorful displays, and they say, we want a hologram that will fit in a football stadium and fly around in the sky, and you have to say, well, that's great, but that's also impossible using anything that's even imaginable today, let alone physically achievable. So yeah, we very much try to be as honest as we can about the limitations, but also about the opportunities, because even though our technology is relatively small compared to large billboards, we have the ability to create sci-fi-inspired interactive displays that you can put in personal spaces, in museums, in galleries, in shopping centers, and up close, under scrutiny, they really do look like something you might see in a Marvel movie. That's the kind of relationship we're trying to find with other companies as well.
There are other types of display as well. You've probably talked to Daniel about some of his displays, which levitate grains of dust and things like that, and the challenge I have with them is that yes, you can make a 3D image, but you have to look at how long it takes to make that image. They're really more akin to painting with light; it's like long-exposure photography. You have to manipulate something and move it around over a long period of time to build a single image, and scaling those types of displays is impossible. It's the same with laser-based displays: whenever you're moving a single dot around, you run out of resolution extraordinarily fast, because it's a linear thing. Even Aerial Burton, exploding the air with a laser, can only do about 1,000 or 2,000 dots a second, and that breaks down to being able to draw maybe a very simple two-dimensional shape, whereas drawing a detailed image, like the elephant or anything else we've displayed in the past, requires upwards of 30 or 40 million dots a second; each volume contains millions of dots.
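The scaling argument against single-point approaches is really just throughput arithmetic. With the figures quoted in the conversation (taken as ballpark values, not measurements):

```python
# Illustrative throughput comparison between display approaches.
laser_points_per_sec  = 2000          # a single scanned/plasma dot, upper end quoted
needed_voxels_per_sec = 40_000_000    # ballpark for a detailed volumetric image

shortfall = needed_voxels_per_sec / laser_points_per_sec
print(f"{shortfall:,.0f}x short")     # a single moving dot is ~20,000x too slow
```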
Where do you see this going in, let's say, five years from now? And are you at that point selling products or are you licensing the technology to larger display manufacturers? Or something else?
Gavin Smith: So at the moment, we're looking for projects that we can scale, and the technology can be applied to a range of different industries, as you can imagine with any new display technology. You could use it for CT scans, you could use it for advertising, for point of sale, for a whole lot of different things. But you have to choose those projects early on when the technology is immature, and that is the low-hanging fruit, if you want to use that term. Our low-hanging fruit at the moment, we believe, is in the entertainment industry, digital out-of-home entertainment to be specific, which is the likes of video gaming and entertainment venues. In 2018, we were at the Tokyo Game Show with one of our machines, and we were situated next to Taito, the company that made Space Invaders. Their senior board members came across, played with our technology and really liked it, so we entered into a conversation with them, and over several years we have built a Space Invaders arcade machine called Next Dimension. That's using our rotating volumetric display, with three projectors each running at 4,000 frames per second and a large rotating volume, and we've written a new Space Invaders arcade game and Taito has granted us the license to bring it to market. In order to do that, we're now doing commercial and technical testing, which involves taking the technology into venues, play-testing it and getting feedback from the venues on the suitability of the game and its profitability as a product. With that game, our plan is to follow in the footsteps of the previous Space Invaders game, Space Invaders Frenzy, made by Raw Thrills, which sold 3,000 or 4,000 units globally. If we could do that, it would be a profitable first venture in terms of bringing the technology to market. At the moment, we're looking to raise some capital; we need to raise $2-3 million USD to do the design for manufacture and build the first batch of machines, which would be rolled out globally.
Now, that's really seen by us as a launch of the technology, using the IP of Space Invaders as a carrier, a launch vehicle. But once launched, and once our technology is widely known and understood, what we then plan to do is build our own revenue-generating model and technology platform that can be deployed to venues around the world, which can use this as a kind of entertainment device where you can run different IP from different vendors and do a profit share with the venue owners. So cinemas, Chuck E. Cheese, Dave & Buster's, those types of venues, as well as bowling alleys, VR arcades and all those types of entertainment venues, a sector that is starting to grow in strength, largely because people are now looking for entertainment experiences, not necessarily just staying at home.
COVID obviously threw a curveball our way as well. When our Space Invaders machine was sent to Japan for testing, COVID had just happened, so it went into internal testing within Taito, and then Square Enix, Taito's parent company, decreed that Taito would no longer manufacture arcade machines but would only license their IP. That kind of threw a spanner in the works, and they've come back to us and said, we love the game, but we want you to bring it to market, not us. So that's one thing we're working on just now. There's a video of Space Invaders: Next Dimension on YouTube that you can look at, and it's a really fun experience because it's a four-player game. We've added the volumetric nature: you can fly up and down during sub-games, you can bump your next-door neighbor with your spaceship and get a power-up. For us it really is a way of saying, look, this is a new palette with which to make new gaming experiences, and the future is really up to the imaginations of people writing software.
All right. That was super interesting. I learned a lot there, and as is often the case, some of it I even understood.
Gavin Smith: That's great. I'm glad you understood. It is a hard thing to wrap your head around, especially for us, trying to demonstrate the nature of the technology in 2D YouTube and LinkedIn videos. You really have to see it with your own eyes to understand it, and that's why, this week, I was over for a meeting with BAE Systems, but I took the opportunity to spend several days in London at a film studio in Soho, whose owners very graciously let me set up a demonstration there, and I spent two days last week demonstrating the product to ten or so companies that came in to see the technology. It's only then that they really start to get their creative juices flowing, and that's where POC projects kick off.
So what we're looking for just now are companies that have imaginative people and a need to create new interactive media that can be symbiotic with their existing VR and AR metaverse-type stuff, but really something that's designed for people up close and personal, intimate experiences.
If people want to get in touch, where do they find you online?
Gavin Smith: So we have a website, which is just www.voxon.co. Voxon Photonics is our Australian company name, and you can find us on LinkedIn.
Actually, my own personal LinkedIn is generally where I post most stuff. That's Gavin Smith on LinkedIn, you can look me up there, and then we have the Voxon Photonics LinkedIn page, and we're on Twitter, Facebook and YouTube as well. We have a lot of videos on YouTube; that's a good place to start. But if you want to get in touch, contact us via Voxon.co. Drop us an email and we'll be happy to have a meeting or a video call.
All right, Gavin, thank you so much for spending some time with me.
Gavin Smith: My pleasure. Thanks very much for having me.