Apologies for coming back to this thread so late. I wanted some more time to gather my thoughts so I wouldn't come up with unnecessary questions and sound like a total noob. But thank you all for the nice warm welcome. I feel at home already.
Tangled-Universe: I've been browsing a lot through these forums during the past couple of weeks and managed to get a lot of answers right here. It significantly shortened my (initially) extensive list of questions.
And yes, I am indeed Dutch, as my name implies. A true-born Utrechter, to be precise.
ChrisC: Thank you for posting those links. There's a lot of info there which I found very educational.
Cypher: Those are looking very promising, Richard. I would love to see how far you can take these inside Terragen.
Okay then, let me start with a few questions first. Please pardon my ignorance if I come across as confusing. Things I didn't understand:
•Displacements and imported objects. Currently this appears to be limited to the scanline renderer. No displacements are possible when the 'Ray trace objects' option is turned on. Is this correct? Furthermore, is there any reason why imported objects need to be triangulated for displacements to occur? When using a model consisting of quads, the displacement did not show up in my tests. Can we expect to render displacements on imported meshes in a future release of Terragen? I guess the same question goes for populations of imported meshes…
•Render farm. This is a big point. Can Terragen render on a Linux-based render farm? I suppose not, as the application is OSX and Windows only. I suppose I could conjure up a small cluster of local Mac Pro workstations. Would Terragen be able to deal with that? Are there any links or tutorials available that explain how to set this up?
•Is there any way of adding one or more nodes to an already existing group in the 'Node graph'? Or conversely, a method of removing a node from its group? I could not find a way.
•Can the front clipping plane of the camera be adjusted?
•I came across one situation where I had an 'Image map' camera-projected onto the planet surface. A default 'Surface layer' was plugged into the 'Image map shader's' input node so as to stack behind/below the projected image. Another image map shader was used to mask out the effect of the projected image via the 'Blend by Shader' port. What I found strange was the fact that the GI prepass was visibly ignoring the blend shader, yet the main ray tracer rendered the image as expected. Is this normal? I'd think such behaviour would result in an incorrect GI cache… no?
I also thought it might be nice to write down a few of the things I was pleasantly surprised by. So here are a few things I really like:
•Its stability. Granted, I may not have pushed Terragen as hard as some have, but in all my playing around with it I think it crashed… once? Awesome! If only Maya was as stable as this.
•Ability to pause and continue the render. Very convenient if you quickly need to divert 100% CPU resources to something else. Again, why don't more applications have this?
•The way the anti-aliasing levels scale. I remember reading somewhere that Matt mentioned that if he were to start writing a renderer from scratch he'd probably limit the sampling to a geometric sequence of integer powers of two (1, 2, 4, 8, 16, 32, etc.). But from a user's point of view, being able to dial up the sampling along a much more gentle slope really helps keep rendering times in check by fine-tuning AA settings. Cool!
•The soft clipping, compensate and contrast effects under the 'Extra' tab in the Render nodes. Really reins in that raw 32bpc look one so quickly gets when working in linear floating point. Plus it's nice and easy to reproduce in Nuke with a ColorCorrect and a SoftClip node for virtually identical results. I suspect Matt's tenure at Digital Domain brought him in contact with Nuke, as parts of Terragen have that Nuke feel to them… which is great, as it's my compositing app of choice.
•The vast amount of geometric detail it can handle. Here at last is an application that can handle everything from planets down to sand grains, and does so elegantly and reliably.
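As an aside on the soft-clipping point above: I don't know Terragen's (or Nuke's SoftClip node's) exact formula, but the general idea is a curve that is linear below a knee and eases smoothly toward a limit above it. Here's a sketch of one common construction; the function name and parameters are just my own illustration:

```python
import math

def soft_clip(x, knee=0.8, limit=1.0):
    """Pass values below the knee through unchanged; above the knee,
    roll highlights off asymptotically toward the limit. The curve is
    C1-continuous: value and slope match at the knee."""
    if x <= knee:
        return x
    span = limit - knee
    return limit - span * math.exp(-(x - knee) / span)

# The linear range is untouched; hot HDR values get rolled off
# instead of clipping harshly at 1.0.
print(soft_clip(0.5))   # 0.5 (below the knee, unchanged)
print(soft_clip(2.0))   # ~0.9995 (eased toward, never exceeding, 1.0)
```

That asymptotic roll-off is what tames the raw 32bpc look: super-white values compress into the displayable range instead of burning out.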
Kudos to Matt, Jo and Oshyan. And lastly some things I would love to see in future versions of Terragen:
•A slight modification to the colour picker so that the RGB and HSB sliders are displayed at the same time. Right now you have to select either set from a pull-down menu (at least in the OSX version I'm using).
The way I arrive at choosing colours is almost always by modifying both sets of sliders. Having to constantly switch between RGB and HSB in the pull-down menu is a tad annoying and unnecessary. I see no reason why the two sets of sliders can't be displayed side by side in a single panel. Like in Nuke, for instance. Very handy.
•Procedural plants and trees. Right now I'm using Onyx Tree, XFrog and Maya's Paint Effects to generate my personal arboretum's worth of trees, plants, grasses and shrubs, and bring them into Terragen via the OBJ format. This works well. However, I can't help but fantasize about Terragen having its own procedural tools to generate such forms of life. It would free us from large, static OBJ files, long exports and imports, expensive RAM usage, etc. A procedural plant/tree system of sorts inside Terragen could potentially introduce natural variety/unique trees (akin to Vue's EcoSystem, where every tree is different), proper dynamics for wind and other effects, shape/growth adaptation to surroundings, etc.
•Displacements and populations on imported geometry. For obvious reasons.
•Arbitrary Output Variables (excuse the Pixar/PRMan talk), for those unfamiliar with AOVs: it's basically a function of Pixar's PRMan renderer which, at very little cost, is able to generate additional render channels/passes as it renders the beauty pass (the actual image). These can range from a simple 'diffuse' channel to much more involved passes such as 'reflection', 'normal direction', 'Y-height gradients', 'matte/ID' passes, etc. It would help tremendously with tying Terragen into a VFX pipeline where a lot of work is done in compositing packages, rebuilding the image from its constituent render channels, allowing for a lot of post-render tweaking and flexibility.
•Render View with history. I'd like to be able to store test renders inside Terragen's Render View window. Ideally not handled in the clumsy way that the Maya Render View does this, but with access to previous renders via a catalogue window of sorts. It would be great if certain important render settings were recorded alongside each render. So if I like how a certain render turned out in terms of detail, anti-aliasing, sampling, etc., but I've deviated from those settings since then (and can't remember what those good settings were), I would be able to look this up in the Render View catalogue. Some in-house render buffer tools I've used go as far as offering options to copy/paste those recorded settings right back into the render node.
Also, the ability to re-arrange the order of the renders stored in the catalogue would be a big bonus.
•F-curves/TCB curves for animation in an accompanying graph editor. These would lend themselves well to controlling animation in anything other than a linear fashion.
•Better documentation. Right now it's scattered all over the place and is patchy to say the least. There are so many tools, options, nodes, etc. that really have no explanation at all and leave the user to experiment or resort to Google.
ZBrush has an awesome feature "hidden" inside its tooltips. When hovering over a button or a slider, you get a default tooltip displaying the tool's name and its hotkey (if assigned). If you press CTRL while hovering, the tooltip expands into a very helpful, concise description. Usually this is enough to inform the user of what said tool does. I'd love something like that inside Terragen.
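To illustrate the AOV point from a few bullets up: the reason extra passes are so valuable in compositing is that the beauty image is (roughly) the sum of its lighting passes, so individual components can be re-weighted without a re-render. A toy sketch, with made-up pass names and pixel values (real renderers have more passes and more involved recombination math):

```python
# Hypothetical per-pixel values for a two-pixel image; the pass
# names and the simple additive model are illustrative assumptions.
diffuse    = [0.30, 0.12]
specular   = [0.05, 0.40]
reflection = [0.10, 0.02]

# The beauty pass is (approximately) the sum of the lighting passes...
beauty = [d + s + r for d, s, r in zip(diffuse, specular, reflection)]

# ...so a compositor can re-weight any component after the fact,
# e.g. halve the reflections without touching the renderer:
tweaked = [d + s + 0.5 * r for d, s, r in zip(diffuse, specular, reflection)]
```

That post-render flexibility is exactly what makes AOVs cheap insurance in a VFX pipeline: a supervisor's "less reflection, please" becomes a slider tweak instead of an overnight re-render.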
Phew... I write too much, so I'd better leave it at that and get back to Terragen. Thanks all for reading my blurb.