Rendering in Cinema 4D with Redshift

To avoid adding too many more posts to the “iClone 8 Render Engine” thread, I’m starting this new topic in the hope that C4D- and Redshift-related posts can be better gathered here.

In view of the discussion about various global illumination (GI) modes in Redshift, I have run a few renders with different settings. The scene is an interior view of a diner, with the following features:

  • 2+ million polys without instancing, 6+ million with
  • There are a number of (practical) lights in the scene, plus a sun-and-sky rig for the exterior
  • The exterior visible through the windows is mesh-based (not an HDRi)
  • For the following renders, the same output size (3840x1644) and render settings were used; the only change is to the GI mode for the primary and secondary method
  • The render settings include post-render denoising with OptiX (built into RS); there is also a light application of RS post effects. No further post work was done outside of C4D
  • All render times are in seconds on a single RTX4090

Image #1, No GI, 84 seconds

Image #2, Brute Force for primary and secondary method, 149 seconds

Image #3, Brute Force for primary, Irradiance Point Cloud for secondary, 242 seconds

Image #4, Irradiance Cache for primary, Irradiance Point Cloud for secondary, 296 seconds (i.e., almost 5 minutes)

Render settings:


[Screenshot RS_SS_02: Redshift render settings]

Conclusions:

  • No GI is obviously the fastest, but also looks the “worst” (relatively speaking)
  • Brute Force for primary and secondary GI beats any other combination of GI methods for THIS scene hands down
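
As an aside, if you want to script comparisons like these instead of timing them by hand, a minimal C4D Python sketch (run from the Script Manager) could look like the one below. This is just my illustration, not how the timings above were produced; the resolution matches the tests above, and you would switch the GI mode in the Render Settings between runs:

```python
import time
import c4d

# Render the active document once and report wall-clock time.
# Switch the GI mode in the Render Settings between runs.
def timed_render(doc, width=3840, height=1644):
    rd = doc.GetActiveRenderData()
    rdata = rd.GetDataInstance()
    rdata.SetFloat(c4d.RDATA_XRES, width)   # keep render data and bitmap in sync
    rdata.SetFloat(c4d.RDATA_YRES, height)
    bmp = c4d.bitmaps.BaseBitmap()
    bmp.Init(width, height)
    t0 = time.perf_counter()
    result = c4d.documents.RenderDocument(doc, rdata, bmp, c4d.RENDERFLAGS_EXTERNAL)
    return time.perf_counter() - t0, result

seconds, result = timed_render(c4d.documents.GetActiveDocument())
print("Render took %.1f s (result code %d, 0 = OK)" % (seconds, result))
```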

My understanding, however, is that I “fix” a range of samples and, based on the (quality) Threshold parameter I set, RS then decides how many samples are actually used within that range. If I want a quick preview, I can increase the threshold (higher values mean lower quality/more noise in the image, and thus faster renders); for final renders, I decrease the value and/or use denoising.
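
To illustrate what that min/max range plus threshold means in practice, here is a conceptual Python sketch of adaptive sampling. This is only my illustration of the idea, not Redshift’s actual algorithm; the noise estimate and the stand-in “shader” are made up for the example:

```python
import random

# Conceptual sketch only: keep sampling a pixel until a simple noise estimate
# drops below the threshold, within a fixed min/max range. A higher threshold
# lets the loop exit earlier, i.e. fewer samples and a faster render.
def adaptive_samples(shade, min_s, max_s, threshold):
    total = total_sq = 0.0
    n = 0
    while n < max_s:
        v = shade()
        total += v
        total_sq += v * v
        n += 1
        if n >= min_s:
            mean = total / n
            var = max(total_sq / n - mean * mean, 0.0)
            if (var / n) ** 0.5 < threshold:  # standard error as a noise proxy
                break
    return n

noisy = lambda: 0.5 + random.uniform(-0.25, 0.25)  # stand-in "shader"
print(adaptive_samples(noisy, 16, 512, threshold=0.05))  # preview: few samples
print(adaptive_samples(noisy, 16, 512, threshold=0.01))  # final: many more samples
```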

I have no personal experience with Arnold, so I can only speak in general terms:

  • I’d rather have a render engine that I can tweak according to my or the scene’s requirements, rather than one that goes the whole hog all the time (if that is what Arnold does).
  • Without having done any comparisons myself, I would still think that Redshift’s render settings could be tweaked to “Hollywood film” level (at the cost of render speed, of course). However, if I’m planning to publish my content on platforms like YouTube or Vimeo, the question is whether such quality is actually necessary. (In my experience, YouTube cannot even give me the same quality I see in my original, non-Hollywood-quality renders once the video has been uploaded and re-compressed by YT—haven’t uploaded anything to Vimeo in ages.) So, yeah, as soon as Hollywood starts beating a path to my door, I’ll consider engines like Arnold. :wink:

Arnold was literally built for use on massive render farms, but while it is unbiased by default, we single users have a wide variety of settings we can adjust based on the type of scene we are rendering.

That is what I love about Blender Cycles: not only can we set the number of samples independently for diffuse, specular, reflection, and volume (like with Arnold), but we also have two really good denoising options for even more speed, as well as the option to literally set the seconds/minutes to “cook” per frame, making it easier to predict how long our machine will be tied up with a particular render.
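
For reference, both of those features can also be set from Python. A minimal sketch, assuming Blender 3.x with Cycles (these are standard bpy properties):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

scene.cycles.samples = 1024        # upper bound on samples per pixel
scene.cycles.time_limit = 60.0     # seconds to "cook" per frame (0 disables)

# The two built-in denoisers mentioned above:
scene.cycles.use_denoising = True
scene.cycles.denoiser = 'OPENIMAGEDENOISE'  # or 'OPTIX' on RTX GPUs
```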

But personally, I am fine with EEVEE for my own 3D animation work for YouTube etc., and these days I am mostly doing super-fast NPR 3D and/or outright 2D for comics and graphic novels.

I don’t have a render farm… But if you can adjust the settings to your needs as a single user, all is good.

So does RS. On the Sampling tab, under Overrides, you can either define a multiplier (e.g. 4x the standard sample rate) or set the samples in absolute terms (for SSS, I usually go to 4096 or 8192, while the regular sample rate tops out at 512).
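
In case anyone wants to set such an override from a script: the parameter id below is a placeholder (I have not looked up the real one), but the pattern of finding the Redshift video post on the render data is standard C4D Python. The Script Log echoes parameter changes made in the UI, which is how you would find the real id:

```python
import c4d

RS_SSS_SAMPLES_ID = 0  # PLACEHOLDER id, not the real one; get it from the Script Log

doc = c4d.documents.GetActiveDocument()
rd = doc.GetActiveRenderData()

# Walk the video posts attached to the render data until we find Redshift.
vp = rd.GetFirstVideoPost()
while vp is not None and "Redshift" not in vp.GetName():
    vp = vp.GetNext()

if vp is not None and RS_SSS_SAMPLES_ID:
    vp[RS_SSS_SAMPLES_ID] = 4096  # absolute SSS sample override, as described above
    c4d.EventAdd()
```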

OK, the time to “cook” per frame cannot be set in RS (I don’t think I’d be using that one even if it were available).

RS has 3+ denoisers: OIDN, OptiX, Altus (as single and dual). Personally, I only ever use OptiX or OIDN depending on the scene.

Just speaking from an animator’s perspective, I don’t understand why this scene needs to be 2+ million polys.

I see a lot of fairly simple geometry and no objects with highly “sculpted” details, with the possible exception of the vinyl booth seat cushioning, and that could easily have been accomplished with normal maps.

Also, I don’t see the point of having detailed geometry outside of the windows unless there is some vital story/narrative reason.

Also, even though I have multiple machines and could theoretically commit to “days-long” render times, all of the render times posted here, on a minutes-per-frame scale, would still be unacceptably long for my personal preference in an animation project.
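
That scale is easy to underestimate, so here is the quick arithmetic (my numbers, based on the timings from the first post):

```python
# Hours of rendering per minute of footage at a given per-frame cost.
def hours_per_minute_of_footage(sec_per_frame, fps=30):
    return sec_per_frame * fps * 60 / 3600.0

print(hours_per_minute_of_footage(84))   # No GI:                   42.0 h
print(hours_per_minute_of_footage(149))  # Brute Force/Brute Force: 74.5 h
```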

Well, I guess Evermotion (the company that made this asset/scene) figured that everything should also look good up close. It works for me; I absolutely hate the low-poly look I find in so many iClone renders. And I appreciate being able to get close to things if I want to, without them “breaking apart” visually. This is an asset for “photorealistic” archviz, not a stylized, low-poly set for iClone.
For example, the bottle lamps have 47K polys each (instanced, of course) and are refractive—even the filaments are modeled; the napkins on the table are 17K each, and so on.

And, no, you couldn’t get the same look for the upholstery with normal maps, at least not up close, unless those maps were used for deformation at render time (via hardware tessellation).

Although the stuff outside the windows is not all that detailed, I’m happy Evermotion did it with meshes. If they had used an HDRi or some other kind of “flat” backdrop out there, parallax would not work properly, once the camera is in motion. Also, let’s say I had people walking in the street outside the windows, their reflections (e.g. in the windows on the other side of the street) and shadows wouldn’t work properly either if the backdrop was a “flat” plane.
If you want to use DOF, the background elements also cannot all be the “same” distance from the camera for DOF to look realistic.

So, yeah, except for skies (devoid of any buildings, mountains, trees, etc.), I always use mesh-based backgrounds and not HDRis.

BTW: I did not put this scene together; I just converted the materials for use in Redshift and did a little optimization with regard to render times. Other than that, the scene is pretty much the way it came out of the “box”.

Unless I have an absolutely trivial scene at 1080x1920 (for YT shorts), I don’t see 1-minute-per-frame render times (not even on the RTX4090). I have learned to accept that, because what you call NPR renders are not really an option for me.

Of course, anything imported from iClone (including the characters, to be honest) is never going to look truly “realistic” in a pro render engine like Redshift, V-Ray, Arnold, Octane, or Unreal Engine.

Even the higher-quality Daz Genesis figures fail the “realism” test despite being brute-force path traced in the old non-RTX version of Iray that Daz users cling to.

I was not even aware that Evermotion was still in business, but I can see why their archviz products are so heavy: they are not really meant for animation/games.

Render times are relative depending on your objective.

But I can see why Aurora Trek’s latest stylized production is not being rendered with Redshift.

I don’t get that. OK, I suppose for games, low-poly assets may be of some benefit to keep things at 150+ FPS, but for non-real-time animations? Who cares if a render takes a few seconds longer; I’d much rather have the detail that a higher poly count gives me. So, I’m happy that Evermotion has not felt the need to produce low-poly junk (at least at the time when they created this diner scene a few years ago). There are enough low-poly assets out there (not just on the RL platforms but also on other 3D platforms).

The poly count in this scene is, IMO, not the decisive factor with regard to render times. I have had higher-poly scenes that rendered faster. The thing here is stuff like refractions (including lights in a refractive “container”), the number of individual lights, and, of course, the quality settings, not the resolution of the meshes. Also, interiors with GI almost always render slower than exterior shots (with similar poly count and render settings).

Image #1, same settings, refractions off, 32 seconds

Image #2, same settings, reflections off, 121 seconds (i.e. 28 seconds faster than with reflections)

Image #3, same settings as Image #1 in the first, original post of this thread, but with the threshold way higher (i.e., fewer samples, lower quality), 37 seconds

The poly count is unchanged in all three examples, and while the 37-second render is 4x as fast as the “standard” version, the quality is not what I would consider really acceptable. (It does not show too much in the image here, but that is because (a) these images are just 25% of the original resolution, downscaled by the forum software, and (b) the denoiser got rid of a lot of noise in the original render.)

Unless you have a monster CPU (say 128 cores, 256 threads) and a low-end GPU, Redshift is probably going to run circles around the Standard and Physical engines in C4D.
I think @Auroratrek is just getting his feet wet in Redshift and that is why he hasn’t switched, not because Redshift is slower than Standard or Physical or less suitable for a “stylized production”. I’d be cautious, too, if I were contemplating switching render engines mid-production. We’ll see.

Edit: Since Redshift can also run on the CPU, I re-did Image #3 using the 16-core CPU only: It took 12 times as long as the RTX4090 (445 vs. 37 seconds).

Funny you should mention this. For my newest chapter, I finally jumped into Redshift, but to the points made here, it was only after I purchased a GPU that was 3 times as fast as my previous one. Even so, there were some pretty hefty renders–like 4-5 minutes per frame. I thought Redshift was supposed to render “realtime”? LOL

Overall, I’m pleased with the look. I didn’t mind the previous standard render look, but sometimes having to simulate all the reflected lighting was a real drag, so going with Redshift made lighting a bit easier for the most part. There was a learning curve for node-based materials, and I’m still getting used to those. One of the biggest pluses might be the Matrix Scatter and Redshift trees for landscapes–certainly allows for a lot of trees. Check out chapter 4 here:

Which GPU did you get?

AFAIK, Redshift (production) was never touted as real-time—just as “fast” (for a ray-tracing render engine). There is also Redshift RT (which I have never used) that is supposed to be faster, at the expense of realism/quality. If I were a Blender user, I don’t think I’d be using EEVEE either.

I see you also “preserved” the previous look to some extent—probably a good thing for an ongoing series. However, it seems to me that the background has more details than I recall from previous episodes. It also looks to me that you used RS tessellation.

You have some really great voice acting in this series!!

I think Redshift has a preview mode called “RT” where you can get a near “realtime” preview for setting up lights & materials.

To be honest, my experience is that working in software that has a close-to-final-quality, near-“realtime” preview is where you really gain production time on a movie or series.

It was torture having to do endless full-quality test renders (at 5-7 minutes per test frame) just to check lighting in my old C4D R11.5 on the Mac when I was making “Galactus Rising”.

Thankfully those days are over for everyone.

I got an Nvidia GeForce RTX4070 Super, and after a lot of hemming and hawing and research, it really did run at pretty much the speed increase the bench tests said it would. Of course, once I had the new GPU, I moved the goalposts again, so I’m probably rendering at the same per frame rate as before–you can’t win this arms race!

Okay, got it, not realtime, though I guess it is a lot faster for all the calculations it’s doing.

I didn’t really change the characters, so they look pretty much the same, just with upgraded lighting. I guess I was able to add more stuff in the background, but also maybe the improved lighting gives everything a bit more interest. The other nice thing with the new GPU is that hair is a lot faster, so the dwarf’s beard didn’t prolong the renders too much.

EEVEE was originally introduced to give video game content creators a “real-time” viewport renderer, so they could see a close approximation of how their game assets will look without having to constantly export to the actual game engine. Animated filmmakers also use EEVEE for the same preview purposes before switching over to the Cycles engine for final rendering.
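
A minimal sketch of that preview-versus-final switch in Blender Python (assuming Blender 3.x, where the engine identifiers are 'BLENDER_EEVEE' and 'CYCLES'):

```python
import bpy

scene = bpy.context.scene

scene.render.engine = 'BLENDER_EEVEE'  # fast preview while setting up lights/materials
# ... lookdev and lighting tweaks happen here ...

scene.render.engine = 'CYCLES'         # switch over for final quality
bpy.ops.render.render(animation=True)  # render the animation with Cycles
```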

Personally, I am just fine with the “game cinematic” look of EEVEE for most things, particularly the AO and bloom effects, and the render speed is addicting.

Thanks! Yes, the elf and dwarf (and drunk, who is the same guy doing the dwarf) are paid actors from Voices.com, and I think they do a great job. The little gnome you may already know is my wife–she doesn’t charge me as much LOL.

Yes, the preview window is very helpful for setting up lighting, and I think I’m getting a handle on that. I dabbled a little with Unreal Engine without knowing what I was doing, but that seems like quite a hurdle from where I am now, so I’m sticking with C4D for now. I keep looking at iClone’s renderer, too, but I feel like since I do so much customizing in C4D, it’s easier to stay there. Until the next tech leap, then we’ll see. :wink: I keep thinking that the hardware will finally catch up with the software, but that may be a pipe(line) dream…

Not likely. As you said, we keep moving the goalposts.
I read an article about ILM’s production of the Transformers movie “Revenge of the Fallen”.

Now, you might think they were rendering in realtime on their multi-million-dollar hardware setups, right?

No, they were getting a frame every 2-5 hours or something, because the “Optimus Prime” character alone had so many layers of geometry and 4K textures that his scene files took up 30 TERABYTES on the ILM servers. It’s insane. :weary:

Not exactly; Redshift RT is to production Redshift what EEVEE is to Cycles. The preview window will depend on the “engine”/RS version you choose, but since it is of lower resolution than the final render (at least for me) and uses progressive rendering (instead of buckets), it gives a fairly good idea within seconds. However, a “preview” render at full resolution and with “production” settings is obviously going to take its time. You only need that if you do pixel peeping (which I do to figure out if, for example, higher sampling rates are really worth the added render time).

Just for the fun of it, here is a render of the diner scene with Redshift RT. To me it just looks bad, i.e. unusable. So, I guess, I won’t be using RT (I suppose there may be ways of improving this, but I don’t think it’s worth exploring further at this time; perhaps when, at some future point, there is less of a quality gap compared with the “production” Redshift engine):

That is not exactly what I meant. The standard renders have a bit of a dull look to them with regard to, for example, metals. With Redshift, you could use a PBR/metalness approach that would give you a more realistic look for the metallic materials. However, that would probably not be in line with the look of the other episodes so far, and perhaps also not fit the overall, somewhat stylized look you are going for. In other words, even though you changed the render engine, you pretty much preserved the look of the previous episodes—which is probably a good thing for a series.
However, if you do stand-alone renders or, at some time in the future, consider a new series, a PBR/metalness workflow may be something to look into.

True, but that is not Redshift RT, although I still get fairly good feedback from it when moving around in the viewport. You can also have Redshift draw the viewport instead of OpenGL (I personally don’t do that, but for less complex scenes it can work reasonably well).

Exactly. It will probably also be easier to integrate simulation stuff in C4D than in iClone; also, the more complex a scene gets, the more iClone slows down (at least in my experience).

Well, unless RL really does revamp their internal render engine considerably, I think Redshift will give you more bang for the buck (especially since it now comes with C4D for “free”). Also, if you decide to upgrade your system, Redshift can also make use of multiple GPUs (which iClone, to the best of my knowledge, cannot).

Which is not, IMO, a bad thing. How else do you get improvements?

I think that the render quality that can be achieved “at home” is closing in on what Hollywood does. It’s the same with regular video equipment: 50 years ago, there was a huge difference between what the big boys with their 35mm (or larger) film cameras could capture and what amateurs were able to do with their 8mm cameras. With 4K and 8K cameras in everybody’s pocket, that gap has been reduced enormously; much more than could have been anticipated only 25 years or so ago. Today, you could render the CGI scenes from Tron or The Last Starfighter on your home system in better quality.
Or take “Flow” (which I still haven’t watched), which does not even use the best render engine available in free software like Blender but still manages to win awards (that must be frustrating for Pixar and the other big-league animation studios).
