Welcome to the “Real-time” issue of your free Digital Art Live magazine. Here we’re riding on the coat-tails of the great advances currently being made in real-time 3D rendering.
Subscribe now and receive the PDF FREE every month
“Faster, faster, faster” has always been the cry of the 3D digital artist, followed by an anguished “why is it so slooowww!?” The ideal is that we can see the final art, and make incremental changes to it, as fast as we can dream it. Of course users of DAZ Studio and Poser have long had big real-time OpenGL previews, and these could be saved out as a render. Those who used DAZ and Poser for their original purpose — figure reference images — often found that a 4k OpenGL render was enough. Just ink or paint over it. Smith Micro later found an elegant way to marry the two approaches inside Poser 11. Their easy-to-use Comic Book Preview feature lays an adjustable real-time inking layer on top of filtered OpenGL colour-flats. Hopefully new Poser owner Renderosity will continue to refine and develop the real-time comics side of Poser.
But for those who wanted realistic pictures, “faster” usually meant a spend of at least £600 (about $750). The Poser Firefly and Vue xStream semi-pro user usually found a refurbished Dell T5500 workstation fitted with 2x Xeon X5650 CPUs (or higher, up to X5690), and this would give 12 fast cores for CPU rendering. The DAZ Studio iRay user and Poser 11 SuperFly user usually fitted two slot-in Nvidia Titan graphics cards, and wrestled with fitting the new power-supply unit and cooling needed to run the hot new GPUs. Even then, one could wait some appreciable time for a smooth 4k render with all the finest visual refinements and details.
The rest of us built our scene by excluding a mental checklist of items known to cause slow renders — ‘bad’ hair, fur, fog, water, and similar. We tweaked arcane render-settings for an optimum balance of speed and quality. Then we left the PC to chug for 10 hours overnight, hoping there would be no crash. This can still be done, but is not ideal.
Others tried real-time videogame engines, via software such as iClone, Muvizu or even Garry’s Mod. FlowScape is the latest entry there, with a sweet price of just $10 for easy access to the power of the Unity game engine.
Real-time raytracing, via the latest Nvidia Titan RTX cards, is now changing the equation for GPU rendering, especially for time-pressed artists with $1,000+ to spend. It’s well known that Blender 2.8 has its new Eevee real-time engine. Less well known is that DAZ Studio turned on iRay RTX 2019.1 in its 4.12.x beta, enabling fully interactive 3D rendering. Even less well known is that the iRay dev team has quietly reported of iRay RTX 2019.1.3 that: “… we optimized some parts of iRay interactive to render faster on all GPUs, so one can expect a 2x speedup for most scenes even on non-RTX cards … iRay photoreal also got a free ~2x speedup when using the CPU for rendering.” A 2x iRay speedup for free, and iRay RTX 2019.1.3 is available now in the DAZ Studio 4.12.x beta or higher. Thanks, devs and DAZ!
Welcome to ISSUE 43 : Real-time
We interview Chris Hunter of Pixel Forest Games, the maker of the best-selling new real-time landscape creation software.
SOFTWARE | GAMES
“When I was younger I put a ton of hours into Bryce and Terragen, and the main frustration was vegetation. So the core focus has been to easily do that in FlowScape. What it does not yet have is procedural landscapes, but I have prototypes looking very promising.”
‘Crazy Knife’ is a San Francisco digital artist, and one of the leading users of the Garry’s Mod real-time engine.
“Garry’s mod satisfies a very nice niche, in terms of being able to quickly bash concepts together with human characters, environments, and a story. I’ve never seen any technology come close to this in its speed of execution.”
Jonathan is a veteran of the TV title-sequence world, and has worked on the likes of Iron Man, Transformers, and more.
SCREEN TITLES | VFX | VR
“I met Mix Master Mike [Beastie Boys] by just asking him if I could create the first hip hop VR music video with him. I had no idea what I was doing, going in! [And now] “Magma Chamber” has been featured at the Cannes festival and has won numerous awards.”
OUR LIVE WEBINARS!
INDEX OF BACK ISSUES
GALLERY: NATURE INSPIRED DESIGN
DAL: Michelangelo, many thanks for contacting us directly for an interview. You seem an excellent choice for an in-depth interview on the latest forms of raytracing available within the Unreal videogame engine.
MC: Thank you for giving me the opportunity to talk about my passion! I’m always happy to share some insight into my explorations of digital art, and hopefully inspire other artists to use amazing programs like Unreal to express themselves. Ray tracing is a technology that has been around for quite a while, but seeing it in action in real time has been so thrilling for me… and I think that every real time artist should look into it!
DAL: Indeed, the new RTX technology is very interesting. But tell us more about yourself first, how did you first get into 3D artistry when you were younger?
MC: I was developing a game with some friends and I was essentially the only artist in the team. I originally just wanted to contribute concept art but, as production continued, and our ‘dream game’ became reality, I realized how my lack of 3D skills impeded the visual goals of the team.
In our minds we were imagining the game as being a beautiful 3D-stylized tactical RPG (with dynamics heavily inspired by Final Fantasy Tactics) but, in reality, our prototype consisted of cutout shapes and sprites painfully animated in Photoshop. The game looked ‘interesting’, and it was functional as a 2.5D demo, but the programming and design had to be constantly re-targeted to fit my capabilities and schedule.
Just as our team was starting to tour the country to showcase the demo at indie game conventions, I was finishing high school in Italy and was looking for possible higher education abroad. Concept art was still my main focus, and to make it my profession I was recommended to attend several art schools in Los Angeles. During my first visit to the city, I stumbled upon Gnomon, and I realized that it would be the place where I would learn the most, as most of the artworks decorating the school walls were done entirely in 3D, and it all looked absolutely stunning.

DAL: A fine choice. And you’re now an environment designer in Los Angeles. Did you also train in other fields?
MC: My technical training includes online courses, drawing books, digital art workshops, comic book schools and, most recently, a two-year certificate program at Gnomon where I learned about 3D art. But, without years of avid reading and playing, it would be impossible for me to apply the technical knowledge.
What I’m saying is that the main foundation of environment design is storytelling, and I have always been passionate about books, comics, movies and of course games. Starting with comics, I always consumed these media with a critical mind, and tried to replicate them with the same passion that I had in consuming them.
As my range of skills widened, I organically found myself spending more time designing scenes and environments compared to the characters that would inhabit them. I find more chances to tell an interesting story by letting the viewer, or the consumer of my art, be the main character, rather than placing a stand-in figure in the scene.
DAL: Would you recommend the Gnomon training route if people want to get into the industry at a high level? I’d imagine there’s a certain cost barrier there?
MC: Studying at Gnomon is definitely very expensive but, yes, if your goal is getting into the industry, then this is one of the best schools to join. All of my classmates and friends, who studied alongside me for those two years, are now employed in some of the biggest companies in the videogame and movie industry. But, even in the past year, many new digital schools have opened, and the amount of tutorials and online courses and webinars has grown substantially, so, to any artist looking to work in the industry, I always recommend looking at what is readily available first. Ask professionals for advice and, most importantly, have a clear goal.
DAL: And specifically, you now work for the major game developer Naughty Dog. Are you able to tell us more about your work there? Within the limits of non-disclosure and ‘trade secrets’ of course.
MC: Sure, I am working there as a texture artist, so I mostly use a proprietary in-house material editor to create or modify materials that belong to assets in the game. To ensure that the material response is correct, and the level is optimized, I have to work with different software packages and interact with artists from different departments, so I find myself learning something new about game creation every day.
My work is overseen by several leads and all the other environment artists are always available for questions and can help me out when there is an issue. Occasionally I get to help other colleagues too, since texture artists are still being hired, and it’s a great way to solidify my knowledge and really master the craft.
DAL: Great, congratulations. And I understand that Naughty Dog has a unique free-wheeling approach to game development? Can you outline some of that for our readers, please?
MC: From early development, up until the end of production, game designers come up with new ideas for gameplay, and software engineers and artists work together to integrate them into the game. This ensures the best possible game, and everybody is happy to keep making it better.
There are still specifically defined phases of production that dictate the deadlines for each department (alpha, beta, gold), and they are very important, just not as crucial as the final quality of the game, both visually and performance-wise.
DAL: Sounds very optimised. Turning to the technical side, now. You are developing to take advantage of the new RTX series graphics cards, which are designed for real-time ray-tracing in Unreal and other software such as DAZ Studio. What are the main challenges in developing for RTX, and how are you solving those challenges?
MC: As soon as Unreal supported ray tracing I was very curious to try it in my personal projects. Turning it on is still not optimal in most scenarios, but the new RTX cards have dedicated RT cores, and are also more powerful than the previous generation, so they can easily handle these expensive features.
The main challenge is figuring out whether it’s worth sacrificing performance for the visual enhancements brought by ray tracing. You cannot just ‘turn all the sliders to the max’, even with the most powerful cards, and you always need to compromise somewhere.
In my projects I always think about ‘the visual target’ along with the story. For example, if the focus of my scene is an object with a refractive surface, then ray tracing is currently the best option to make it look realistic.
But if the environment has numerous complex materials, or is exceptionally geometry-dense, then ray tracing will only hurt performance with little to no visible benefit.
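For readers who want to experiment with the trade-off Michelangelo describes, Unreal lets you enable ray tracing project-wide and then toggle each effect on its own. The sketch below is a hedged example using the console variable names as they existed around Unreal Engine 4.23 (it assumes an RTX card, Windows 10 and the DX12 renderer — check the version of Unreal you are running, as these names may change):

```ini
; DefaultEngine.ini — project-wide switches (UE 4.23-era names)
[/Script/Engine.RendererSettings]
r.SkinCache.CompileShaders=True   ; prerequisite before ray tracing can be enabled
r.RayTracing=True                 ; compile the ray-tracing shaders for the project

; Then, in the in-editor console, turn on effects one at a time rather than
; "turning all the sliders to the max":
;   r.RayTracing.Reflections 1
;   r.RayTracing.Shadows 1
;   r.RayTracing.GlobalIllumination 0   ; often the most expensive; leave off first
```

Toggling one effect at a time makes it easy to see which feature is eating the frame budget in a given scene, which is exactly the compromise described above.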