Jim Nista On Code Painting
Monday Jul 03, 2023
The 16:9 PODCAST IS SPONSORED BY SCREENFEED – DIGITAL SIGNAGE CONTENT
When a big LED video wall gets populated with fresh creative, the creatives and the people operating a display are likely going to have a conversation about the size of the finished file and how to move it – because there’s a good chance the rendered file is huge, and not something that can be attached to an email.
So I was intrigued as hell when a creative guy told me the video wall creative he’d produced for a project could fit on an old-school floppy disk … because it was really just some lines of code.
A lot of people in the digital signage ecosystem will know Jim Nista. He started and ran the LA creative technology shop Insteo, before it was acquired by Almo. Nista worked for his new masters for a while, but eventually went off on his own, and is now spinning up a new boutique agency that’s doing creative for visual projects.
One of the things he’s been actively working on is what he calls Code Painting – a big visual that gradually builds in front of viewers and then self-destructs, replaced by a new visual that again starts to build. You set the file and visual instructions, and then forget it, as it will just run and run but always be a bit, or a lot, different.
It’s all done in programming instructions, and in the case of his current efforts, is focused on the familiar visuals of flowers.
Nista’s work was one of three used for the Sixteen:Nine Mixer at InfoComm last month. Having had a couple of explainers of what was going on, and the approach, I figured a podcast was the best way to help the industry understand what he’s figured out, and what he’s delivering to clients.
Subscribe from wherever you pick up new podcasts.
TRANSCRIPT
Hey Jim, can you explain what code painting is?
Jim Nista: Yeah, it's a fun new concept for me. I know that other people are doing some of the same types of things, but really, I have been trying to make a painting through either JavaScript or other coding techniques. I started with a simpler approach, and the goal truly was to create something where you're watching a painting come to life, and my own brief to myself was that this needs to look good at every stage of the process. Sometimes a time-lapse at the beginning is just, what are we looking at here? And so it's been a fun process for me to figure out a way to make something not just paint itself through code, but be interesting to look at through the entire process, however long it takes, though 30 seconds to a minute is what I'm usually trying to come towards.
But it is truly just code. There are no images, videos, AI, machine learning, or anything else. It's just a scripted process of creating a unique painting while you watch.
We were sitting in Orlando, chatting about this and you were describing it to me, and I was thinking, boy, this is a little bit over my head, but it sounded like it starts with almost like an armature. You start with some curves, and it just builds from there?
Jim Nista: Yeah. It starts from a very primitive drawing. It is almost like a child's drawing because some of the early pieces, and certainly some of the pieces that I showed down in Orlando, are flowers because those are shapes that are very recognizable to our eyes.
We can spot a flower-type shape almost as well as we can determine a human face-style shape. I don't know why, and I don't know the evolutionary reasons behind it, but I realized that this is a pattern that we can determine very easily. So behind the scenes, one of the very first things that this code does is to generate out some curves, and if you think about the shape of a curve, if you flip that shape, it makes a petal or a leaf shape. So if you make a simple curve and then flip it, you end up with a leaf shape or a petal shape, and if you take that and rotate it around in a certain way and put a dot in the center, our eyes say “flower,” and we're really good at it, right?
It probably has something to do with our ability to find food and all the other things that we do as humans. But it becomes a shape that is very recognizable to us, and so once I have these very primitive drawn shapes stored in memory, then the real code takes over, which is the work that I spent so much time on over the last year and a half, trying to make realistic-looking paint strokes come out of these primitive drawings that are stored in memory. That's really been the fun part of the project: inventing a way to create something that comes to life that way, but is truly just based on the most primitive, basic line drawings you could possibly imagine. So there's some color encoded in that primitive, there are some rough shapes encoded in that primitive, but really it's just very simple, and then it works from there to create a painting while you watch.
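To make that primitive step a little more concrete, here is a minimal sketch of the idea he describes: mirror a single curve into a petal, rotate copies of it around a centre dot on a browser canvas, and you get something our eyes read as a flower. This is an illustration only, not Nista's actual code; the canvas id, function names, and parameters are all assumptions.

```javascript
// Minimal sketch of the "curve -> petal -> flower" primitive (illustrative, not Nista's code).
// Assumes a page with <canvas id="wall" width="800" height="600"></canvas>.
const ctx = document.getElementById('wall').getContext('2d');

function drawPetal(ctx, length, width) {
  // One curve, then its mirrored copy, closing into a leaf/petal shape.
  ctx.beginPath();
  ctx.moveTo(0, 0);
  ctx.quadraticCurveTo(width, length * 0.5, 0, length);  // right-hand curve
  ctx.quadraticCurveTo(-width, length * 0.5, 0, 0);      // flipped copy completes the petal
  ctx.fill();
}

function drawFlower(ctx, x, y, petals, length, width, hue) {
  ctx.save();
  ctx.translate(x, y);
  ctx.fillStyle = `hsl(${hue}, 70%, 55%)`;
  for (let i = 0; i < petals; i++) {
    ctx.rotate((Math.PI * 2) / petals);  // rotate the petal around the centre
    drawPetal(ctx, length, width);
  }
  // The dot in the centre: the cue that makes our eyes say "flower".
  ctx.fillStyle = `hsl(${(hue + 180) % 360}, 60%, 45%)`;
  ctx.beginPath();
  ctx.arc(0, 0, width * 0.6, 0, Math.PI * 2);
  ctx.fill();
  ctx.restore();
}

drawFlower(ctx, 400, 300, 8, 120, 30, Math.random() * 360);
```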
We demoed this on a very large canvas down in Orlando, a 26-foot-tall, curved LED video wall that was 155 feet wide, and you had little vignettes, a number of these, I shouldn't say little, because they were substantially sized, maybe 15 feet tall or something like that, but you had a number of them, and essentially, something would just build on this as you went and eventually show itself as a flower and evolve from there.
How long do these things take to build, and when they're finished building, is that what stays on screen, or does it erase off and you start over?
Jim Nista: The big idea that I started with for this project, and just going back one step, is this really came from before the pandemic, when I started trying to learn oil painting again. I had done it in college and art classes and things like that, and it was always a fun passion. But I was struggling because when I was learning oil painting, especially during the pandemic, that was my hobby, I kept making paintings that were almost too realistic, and I challenged myself. I was like, I know how to code too, so why can't I code a paint stroke that can teach me how to be more loose? And this is where the genesis of this came from, and from there, I let this take on a life of its own, but along the process, I decided I don't want this to look as organic as it could.
The overall idea is that it creates a piece as you're watching it and then destroys it and creates a new one, and then destroys that and creates a new one, and the idea came out of so many video walls, and we've all done them, where you end up with a five-minute loop of stock footage, and there's a lot of fatigue that you can imagine, from employees who have to watch this screen every single day, to guests and visitors having to see the same content over and over again, and just the boring factor of, yeah, we spent hundreds of thousands of dollars on this video wall and now we're gonna spend a thousand dollars on content. There are so many projects that are like that, and so I was thinking, how can I put together something that is eternal content, or at least doesn't need to be changed as often?
And that was the genesis behind this. Let it create a piece, let it destroy it, let it make a new one, let it destroy that, and just eternally create work that hopefully carries some unique nature to it, and that's part of what this code carries. It will not just change its colors throughout the day and do dramatic colors at sunrise and sunset and things like that, but it also changes its colors, I even call it its mood, throughout the year. So running this on a video wall, it would never be the same image. There would be a lot of similarities from day to day, and if you looked at something in one November versus the next November, you'd see a lot of similarities, of course. But that content would constantly evolve and change throughout the day, the month, and the seasons, with the idea that you can truly just set this and forget about it, walk away, and let this run on a video wall indefinitely and truly never see the same thing twice.
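As a rough idea of how colour and "mood" can be driven by nothing more than the clock, here is a hedged sketch. The function name, hue ranges, and the sunrise/sunset weighting are all assumptions for illustration, not his actual values.

```javascript
// Illustrative only: derive a palette from the time of day and day of year.
function paletteForNow(date = new Date()) {
  const hour = date.getHours() + date.getMinutes() / 60;
  const dayOfYear = Math.floor((date - new Date(date.getFullYear(), 0, 0)) / 86400000);

  // Peak near sunrise (~7:00) and sunset (~19:00); drift slowly with the season.
  const sunriseness = Math.exp(-Math.pow(hour - 7, 2) / 4) + Math.exp(-Math.pow(hour - 19, 2) / 4);
  const seasonalShift = Math.sin((dayOfYear / 365) * Math.PI * 2) * 40; // +/- 40 degrees over a year

  const baseHue = (200 + seasonalShift - sunriseness * 160 + 360) % 360;
  return {
    hue: baseHue,
    saturation: 55 + sunriseness * 30,  // more dramatic colour at sunrise and sunset
    lightness: 45 + sunriseness * 10,
  };
}

console.log(paletteForNow()); // e.g. { hue: ..., saturation: ..., lightness: ... }
```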
What's the timeline?
I imagine there are all kinds of variables and parameters you can set, but typically, is it the sort of thing that builds over the space of four hours or four minutes?
Jim Nista: I've been playing around with it and I love the idea of letting these things run even longer. But right now I've found, at least from people watching some of the early samples and some of the places that I've installed it, that shorter time durations are better, so about a minute to create a piece and letting it linger for a little while after it's created and then destroying it, and that really does seem to be the sweet spot.
Now everything is just code. So if somebody said, I want it to run and build for four hours, I can certainly do that. But what's been funny as I've been building this is that because it's code, I can just run it in a web browser, and that means I can run it on my phone, and so I've been able to annoy friends and demo this out in front of a lot of people as I've been working on it, and I noticed very early on that the slower pieces tended to have people looking away as opposed to looking at it. So capturing a performance of creating this piece within a minute, refining it and almost getting the rough sketch done sooner rather than later, really that's how it started to feel like it was creating a performance that people would wanna watch, and I think it can capture people's attention while they're transiting through a space, which is where a lot of these sorts of corporate AV installations take place. Nobody is expected to just stand there and look at it for a really long time. So it's tuned to that.
But I do know that, even running it in my own space, I have a projector here and I can run it on a big wall from time to time just to see it, that sometimes when it is more ambient, letting it build for several minutes is a better approach. But in the public space, I think about what somebody is always seeing: when somebody first walks into the space, I don't know what point of the build, of the painting, it's gonna be at, it's just continual, and so the goal is to just have it always look good, and that's been a very difficult goal to achieve, because you think about something at the very beginning of its drawing, it might be very abstract, and it's hard for people to understand.
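One bare-bones way to pace a build so it performs over roughly a minute and then lingers is sketched below. The duration constants, the drawStroke() placeholder, and the canvas id are hypothetical, chosen only to illustrate the idea of budgeting strokes against elapsed time.

```javascript
// Illustrative pacing loop: finish the build on schedule, hold, then start a new cycle.
// Assumes a page with <canvas id="wall" width="800" height="600"></canvas>.
const BUILD_MS = 60_000;   // ~1 minute to paint the piece
const LINGER_MS = 20_000;  // hold the finished piece before the next cycle
const TOTAL_STROKES = 1200;

const ctx = document.getElementById('wall').getContext('2d');
let cycleStart = performance.now();
let strokesDrawn = 0;

function drawStroke(ctx) {
  // Placeholder mark; the real piece would lay down a paint-like stroke here.
  ctx.fillStyle = `hsla(${Math.random() * 360}, 60%, 55%, 0.5)`;
  ctx.beginPath();
  ctx.arc(Math.random() * ctx.canvas.width, Math.random() * ctx.canvas.height,
          5 + Math.random() * 25, 0, Math.PI * 2);
  ctx.fill();
}

function tick(now) {
  const elapsed = now - cycleStart;
  if (elapsed < BUILD_MS) {
    // How many strokes should exist by now, so the build completes on time.
    const target = Math.floor((elapsed / BUILD_MS) * TOTAL_STROKES);
    while (strokesDrawn < target) { drawStroke(ctx); strokesDrawn++; }
  } else if (elapsed > BUILD_MS + LINGER_MS) {
    // Begin a new cycle; a real version would wash over rather than clear outright.
    cycleStart = now;
    strokesDrawn = 0;
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```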
When you say it builds and then destroys itself, what does that look like when it's in destruction mode, so to speak?
Jim Nista: I had to make the brushes move a lot faster during that mode so that it does attract attention like something's being wiped out. But I also found that leaving a lot, not completely erasing the canvas, leaving a lot of the underlying or previous painting adds a lot of character as well, and so it roughly washes over with larger brushes and lighter brushes, but it will leave pieces of the previous painting there, and that's a nice approach.
Now there's another version of this that was more prevalent in Orlando, that we ran there, which is more of a continual mode where rather than creating an image behind the scenes, it's creating a 3D image and using that instead, and so now that can constantly rotate or unfold or evolve, and so that's an alternative version. In that one, the destruction is almost immediate, where it is changing from, let's say, creating an orange flower to now evolving into creating a purple flower. That transition is a lot less noticeable than some of the other versions of this that I'm running, which are creating one painting, building that painting, and then destroying it.
So there are a few different ways that I've envisioned this, and one is truly continuous but evolving, where rather than building and painting and destroying, it's constantly painting, and what it's painting is constantly changing, and so it's a different approach, and that process requires a little bit more horsepower. So I've built one of these knowing, from having worked in digital signage, that we don't necessarily always have the fastest media players in the world, and so one of these I've tuned to be low cost and work even with previous-generation media players. We're getting some really fast new media players that have GPUs built into them, so that's really going to be wonderful to take advantage of, but I built this before those were really available to me, and some of the versions of this are really designed for a sort of solid-state media player, like BrightSign players or SpinetiX players, that kind of thing, right?
That's where I've been focused in turning this around: they're great HTML engines, they don't have a lot of memory, and they don't have a lot of horsepower, but how can we do generative art on that type of hardware, which is so prevalent around the industry?
So this doesn't need to be on a big-ass media server.
Jim Nista: There's a version that does, and that's the version that I was running in Orlando because of course we had a lot of horsepower there and a bigger screen too, to take advantage of. But yeah, there's a version of this that's just pure JavaScript and I've tested it all over the place, including on a 15-year-old laptop, and it runs fine there.
So I've written for a number of years now about visualized data, and that's evolved into the terminology of generative visuals and generative AI. But you skipped past this really quickly when you were explaining things: this isn't AI. This is its own thing, right?
Jim Nista: Yeah, I could have used some machine learning techniques for this in terms of creating the underlying primitive image. But rather than doing that, given that I'm dealing with somewhat simple shapes like flowers and landscapes and hills and trees and things like that, code can easily create things like trees and landscapes and those types of things. So it didn't make sense for me to train a machine learning model to build those primitives for me, and certainly, machine learning wouldn't help with the process of coming up with the painting itself, but the idea of connecting this to live data or sensors in a space is really where this is headed.
I've had other projects that are more interactive or immersive, especially involving the Microsoft Kinect from the Xbox days, before it evolved into a commercial tool. So now some of the new work around this is reading what's happening in the space. If somebody is standing in front of this and they're wearing red, the flowers will be red, and so those are some of the pieces that are coming out of this because, yeah, it's generative. I can pull weather data, I can pull any sort of information and add it to the mix of what I'm currently doing.
Oddly, some of the early versions of this were intended and requested to be offline, completely isolated from the internet, and to run forever, and so really the only data that those have is the clock. They just know the time and the date, and that's the only data that they can use. So everything has been pre-programmed in and it's just following its script forever, but there's so much randomness to it that it has a tendency to never repeat.
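One plausible way an offline piece can stay varied with only the clock as input is to seed a small deterministic random number generator from the date and time. The sketch below is an assumption for illustration, not his code; the seeding scheme and parameter names are made up, and mulberry32 is just a well-known tiny PRNG.

```javascript
// Illustrative: derive all "randomness" from the clock, so no network data is needed.
function mulberry32(seed) {
  return function () {
    seed |= 0; seed = (seed + 0x6D2B79F5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

const now = new Date();
// Seed from the date and the minute of the day: the only "data" the player has.
const seed = now.getFullYear() * 10000 + (now.getMonth() + 1) * 100 + now.getDate()
           + now.getHours() * 60 + now.getMinutes();
const rand = mulberry32(seed);

const petalCount = 5 + Math.floor(rand() * 7);   // 5..11 petals this cycle
const strokeJitter = rand() * 0.4;               // how "loose" the brushwork is
console.log({ petalCount, strokeJitter });
```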
So one of the things that was interesting about walking around the exhibit hall at InfoComm recently was seeing how a lot of the big display guys, particularly the LED guys, were using generative art on the displays instead of just the stock videos and so on, which was what happened for a whole bunch of years.
It is the sort of thing that Refik Anadol pioneered; there may be another artist as well, but that's the one that most people would know. Is that the sort of thing you could conceivably do with this as well, or is it just a different track?
Jim Nista: No, a lot of the work that he's doing, and I don't mean to trivialize it, because when I see some of the work that he's doing, he's pulling in these massive data sets, right? But a lot of the work itself is running through software that I use as well, like TouchDesigner, and a lot of the same types of effects are happening. What he's built, and a lot of people are copying it, unfortunately for him, but what he's built is in some respects a two-part process. He's pulling all of this data together and then, from there, using his own code to render it, and a lot of that is done in software like TouchDesigner, Unity, Unreal Engine; those types of applications are where a lot of this happens. But yeah, I think that one challenge we're facing is that an artist like him, his style is identifiable, but as I was walking around the show floor, I'm seeing essentially what are either ripoffs, direct ripoffs of his work, or artists just copying it or inspired by his work.
At the end of the day, and again, I don't mean to trivialize what he's doing, but there are effects built into these software applications that sort of mimic his style. He was the first guy to come up with it and use those tricks and techniques and everything else. But a lot of people can just follow along a 30-minute YouTube tutorial and mimic a lot of the work that is coming from him and then some of these other generative artists as well. So there is a danger in that: working this way, if your style becomes popular, it becomes somewhat easy to mimic it, and it's sad to see that so many companies are either hiring somebody to copy this artist's style or just outright taking the work directly from other videos that he's published online.
But it's unfortunate. It happens. It's nicer to see, though, because most of the time what we see at these shows is somebody running movie trailers or, worse, Big Buck Bunny or those Blender Foundation free videos, and those are very well produced, but they're now 15 years old, and Blender has gotten better in the last 15 years, so it's nice to see a little bit more creativity around the show floor. But at the end of it, it's not creative, because a lot of it's just, “Hey, look at this popular artist. Let's take his work.” And there's a fine line with that. I certainly would love to have some of my demos made available, but at the same point, if we start seeing it over and over again or people are copying it, it's a nice form of flattery, but it's also a dangerous form of flattery as well.
One thing that you mentioned when you were finalizing this stuff for this big video wall was you said that the actual code package, scripting, or however you wanna describe it, and you'll do a better job than me obviously, was so light that you could have loaded this on a couple of floppy discs, which for the youngsters out there, look it up, and you'll see what a floppy disc is.
Jim Nista: It's the icon that we use when we save things on our regular software. Not that anybody has seen one in a while.
That canvas we were working on for the Orlando project is 18,000 by 3,000 pixels, so a lot of real estate, and of course, this was rendered out, given the nature of the project and everything, but if that had been delivered generatively, it's just shader code. That project was built using a concept called GLSL shaders, and it's code, it's a weird code language. It borrows a lot from many different types of scripting languages, but it's for creating visuals like that through code, and the files that created those flowers, the individual code for some of those was 9 kilobytes, just because it's just a little script running and doing all this creative. But what's funny in there is that along the way, as I was initially building some of these projects, I would go into graphics software, like Adobe Illustrator for example. I'd go into that graphic software and I'd hand-draw what I thought a paintbrush should look like.
And so now I've got this little chunk, not really code, but a vector graphic of what a paintbrush should look like, and over time, all of those little things that I did, I took them out and said, no, I need to code what a paintbrush looks like. I can't rely on having drawn something in advance, and so all of these asset files that were initially part of this just to save a step or move faster were removed and replaced with code. So there's just one file that builds these experiences, and it just has to be launched and played back. Some of those are HTML and run in a web browser. Some of them are not HTML and would need a GPU to render out properly. But yeah, they're very small files that run.
Obviously if it's running in a web browser, the digital sign is just going to be playing its HTML content, and the file that is uploaded to, for example, BrightSign players can be a few kilobytes. It's a fun, different process versus hundreds of gigabytes of files, or these large dataset files that we see from some of these artists, where they're saying, “I built this dataset analyzer that goes through a million photos of this city to create this art.” And I'm looking at it going, that's cool, I created some random noise channels to generate my randomness rather than having to go through millions of photos.
It's certainly a different approach and makes for fun stories as far as not having to deliver all of these massive files. I've had some surprises along the way. “I don't think you sent me everything.” “Yes, I did.” Just launch it on the BrightSign player and see what it does.
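On the "random noise channels" idea: a noise channel can be as simple as a little value-noise function that turns a coordinate (or time) into smooth, repeatable variation, with no external dataset at all. The sketch below is a generic example of that technique, not Nista's implementation; the constants and function names are assumptions.

```javascript
// Generic 1-D value noise: smooth, deterministic variation from a single coordinate.
function hash(n) {
  // Cheap deterministic pseudo-random value in [0, 1) for an integer input.
  const s = Math.sin(n * 127.1 + 311.7) * 43758.5453;
  return s - Math.floor(s);
}

function valueNoise(x) {
  const i = Math.floor(x);
  const f = x - i;
  const t = f * f * (3 - 2 * f);          // smoothstep easing between lattice points
  return hash(i) * (1 - t) + hash(i + 1) * t;
}

// Sample the channel slowly over "time" to drive, say, petal width or hue drift.
for (let t = 0; t < 5; t += 0.5) {
  console.log(t, valueNoise(t).toFixed(3));
}
```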
So operationally, there are implications it seems, in terms of data transmission times, and bandwidth consumption, although that's not as big an issue as it used to be, and local storage, things like that. Are there other kinds of operating implications or advantages of going down this path?
Jim Nista: Yeah, I think the biggest advantage is just being able to promise a client that the content's not going to get stale, right?
You can set it and truly forget it.
Jim Nista: Yeah, and that provides a big advantage. There are some other challenges to this, and certainly there are some projects that I've done where, after a while, we realized that that particular circumstance and that particular hardware are not really conducive to running generative content, and in those cases, I've rendered eight-hour-long movies. I just let my computer do the generative work and record it, and then upload a big, long movie. So that defeats the purpose: the idea of only having to send a nine-kilobyte file has now all of a sudden turned into a big, long movie. But for the most part, no, there's really not too much to consider with this, especially on the simpler version of this code, which certainly is not as dramatic as what we're seeing in museums, in some of the early days of some of the generative artists that are starting to get really well noticed.
But I'm also thinking in terms of the real-world applications of this. We have a lot of low-cost, low-power media players out there in the world that are well suited to this and can handle a project like this without overheating or anything else like that. So that was a big driving factor for me, and I know a lot of other artists wouldn't really even think to put limitations on themselves like that. They would just think, I don't care about the technology, and then suddenly create a project that requires multiple of the latest, greatest, expensive GPUs on a Windows device, and there's nothing wrong with Windows, but as we all know, it's not the biggest friend in digital signage. It's a noisy operating system, and it wants to make its presence known, and what we're looking for most of the time is for all of that stuff to be in the background as far as possible.
Is this something that you came up with or was it technology that existed and some people were using and you've just adopted it and done your own thing with it?
Jim Nista: I think in terms of the work that I'm doing with GLSL shaders, and the more modern GPU process, I'm coming into a workflow that other people have been pioneering, and so I'm just getting more than my feet wet with that, but it really is newer.
But on the other side, building generative art on, essentially, think about some of the last-generation BrightSign players, we're talking about devices that were designed in 2015-2016. So seven years old, already intended to be solid state, and not necessarily having any GPUs. That idea, to create generative art on some of these older devices, is new, and I have not seen anybody building content that way. I don't know why you would, right? If you start on a project like this and you don't create limitations for yourself, you are going to want the best GPU, you're going to want the best system to run it on. You're going to aim for the highest caliber, and here I am going, no way, I've got to aim for these devices that are out of date and don't necessarily have the fastest horsepower, but I know I can count on them, I know I can rely on them, and I know that they're going to do what part of this project's goal was for me, which is to run indefinitely, right? To be able to create something like this.
I think another idea that's always popped into my head, in just an odd way, is if you go to a contemporary art museum or gallery, you see audiovisual art fairly often; it's a big part of it. I was at one a couple of years ago, one of those big art shows where the galleries come, and I bought the tickets and decided I wanted to go, like Art Basel, but Frieze is the one that I went to, and I noticed a lot of the AV was running very poorly. So they're trying to sell us a million-dollar art project and there's a BrightSign player on the floor with cables, and the AV guys frame-shooting it, and I know it just isn't right. I know that device. I know what it's capable of. I know it could pull this off.
So a lot of that was, what happens when somebody invests in one of these pieces or wants a work of art in their space? Now they have to keep it running, and they have to have an AV tech constantly going out there and patching it up and fixing it and keeping it going and all the other things, and it really started to make sense as you look at older AV art installations. There are a lot of AV artists from the 80s and 90s who used CRT devices, right? How do these museums keep that stuff running? So it was also just a practical idea for me, having an understanding of the AV industry, to think, as an artist doing this work, I have to prepare for that. Who knows if anything I do will ever have longevity, or maybe nobody will even look at it. But it was an idea from the beginning. I want to help solve these things. I've been that guy on the floor fixing the BrightSign player, and I don't wanna create that problem for somebody else.
So it was an idea born out of seeing how a lot of this audiovisual art becomes a technical nightmare, and asking how I can do something that, from the beginning, avoids those technical challenges. I've done projects where, up until the pandemic, I had BrightSign players that had been on and running for 10 years, and if I can count on a device to just run that way, that makes this creative all the more impactful, or at least easier to operate into the future. It's just a fun little goal of mine to say, how can I challenge this to do something that attracts people, that is interesting to look at, but is also capable of running on much, much older devices?
So is this a product?
You've had a design agency in the past that you sold to Almo, then you went off on your own, doing your own thing, and you're spinning up an agency again. But is this something where, if I rang you tomorrow and said, yeah, I want five for five different venues, I could buy it, or how does that work?
Jim Nista: Yeah, that's part of the idea for me. I wanted to do something as a creative project, but I also wanted to find a commercial space for it too, right? It's no fun just to build something and then walk away from it, and knowing and having a background in the commercial AV space, it's designed for that, especially the more modern version that is a little bit more active. Both of them are really designed for the commercial AV space, and yeah, I've been working already with some architectural firms. I think that they're the ones that are going to get this right away. But then also there are some AV companies that are really specialists in building video walls in corporate and residential spaces, right? And I think the residential space is going to be a big upcoming space for this. There's also another factor around this type of work that it lends itself to: most larger cities in North America have a percentage of a project's budget that has to go to art, and I've noticed more and more LED walls getting put in to satisfy that piece of the art budget, and so I can productize this around some of the commercial AV companies that I know are getting those projects, right? They can say, oh, we spent all this money on art, but they're putting in an LED wall. You know exactly what the next step is. They're going to put the stock video on it and call it art, and so if we can have something readily available, somewhat off the shelf, and I say somewhat because I don't know too many clients who are going to just want this exactly as is.
This is custom creative. Everyone's gonna want a little bit of customization to it, and I've always known, in any of the template-type projects that I've done, that there's a piece of this that has to still be within the client's control, so they feel that they drove some of the creativity, they wanted certain colors or whatever, and I've left that as part of this. But yes, it really truly is intended for the commercial AV space as a somewhat off-the-shelf product that you can just pick up and install as the primary visuals on an LED wall.
Some of the work that's been done by these generative artists is amazing and everything else, but those are six-figure projects. I don't want to put you on the spot, but can you give any sense of how this fits? For people listening to this and going, that would be amazing, but I don't have half a million dollars to buy this original creative.
Jim Nista: A lot of that comes with the name, and once you get to be a well-known and well-identifiable artist, suddenly the prices go up. I'm not there yet, and you know a lot of projects have a tendency to fit into that magic sweet spot of the 10k budget, right? The $10,000 content budget. They're spending $250,000 on the LED and the installation, and the content is still in the afterthought realm, and that's really the other idea that I've had around this.
What's very common is I'll get a phone call: we have an LED wall that's being built, this is the type of content we want, and only sometimes are clients asking for generative. They come back and they're looking for some sort of animated visuals, motion graphics, or edited video loops, and it typically comes in with a budget of around $10,000 for getting that content pulled together, and that's what I was aiming for with this: that it would fall into that sweet spot. If somebody's looking for more, or more customization, certainly that's possible, and that would impact the price, but just prepping this and getting it ready to go for your average installation is intended to be easy, and also not half a million dollars, right?
At least not yet!
Jim Nista: No, and even then, if you reach that stage as an artist, there's another fun factor that comes along with it: now you have gallery exclusivity and all kinds of other things, and while it's fun, those big tickets are being shared amongst a lot of different greedy hands, so you see something that sells for half a million dollars and the artist might have gotten 20% of that, and as sad as that is, there are a lot of hands that get into those pies. So at least for now, focusing this around a space that I know better, I don't necessarily know that gallery space so well, but I know the commercial AV space a lot better, and focusing on that just makes a lot more sense to me.
I do interact with commercial art brokers, and that's been happening now for about the last six months or so, but, again, that space is different from the commercial art gallery space, which is very interesting.
I bet. So if people want to know more about this, how can they find you?
Jim Nista: My website is getting redone. I'm in the process of making it prettier, and I still need to put this work on there, because I have put very little on there, so this is all new for me as I'm getting this up this summer.
But yeah, it's nista.co, and I'll be enhancing that over the next couple of weeks to be showing some more of the work that I'm doing. I have a few projects that are wrapping up all around the same time, so I should get those wrapped up, get some photographers out to those locations, and then get this content on my website. If you look at it today, it could be showing off some of the generative work.
Okay. Thank you. That was terrific. I now understand it better, although I'm not ready to sit down and try to do it myself.
Jim Nista: No, it's fine. I appreciate this and the opportunity to talk a lot more. I'm having a blast with it. I can't wait to see what else is out there as I get more involved with this work.