Displacement is a technique used to give greater shape and detail to surfaces without relying on precalculated geometry. This means you can take an object made up of relatively few polygons and add much more detail. During rendering, scene elements such as the terrain or objects are broken up into micropolygons. Displacement moves these micropolygons in 3D space to create more detailed shapes.
Displacement is a fundamental part of the Terragen 4 rendering engine. All of the terrain is created by applying displacement to the smooth sphere of the underlying planet, even when using heightfields. Displacement can create features from the size of mountain ranges down to little pebbles. Displacement can also be applied to other parts of a scene, such as objects like rocks or imported 3D models.
You might be familiar with bump mapping. Bump mapping is another technique which can be used to give surfaces the appearance of more detail. It uses a bump texture to simulate lighting effects that give the impression of a surface having more shape, making it "bumpier". The difference between displacement and bump mapping is that displacement creates real 3D geometry whereas bump mapping fakes the appearance. With displacement a flat surface takes on a real 3D shape, and it looks 3D from all viewing angles. With bump mapping the flat surface stays flat, and this is obvious from many viewing angles, especially when viewed from the side.
Displacement is generated at render time by mathematical calculations performed by shaders. The geometry generated by displacement is not saved in project files or exported with terrain or object files.
As mentioned above, displacement can be used to create both very large and very small features. Large features should be created as part of the terrain rather than as a surface layer; it's possible to do it the other way, but we don't recommend it. For best results, connect any shaders that generate large displacements into the terrain part of the network, above the Compute Terrain node.
Most of the shaders which can generate displacement have a common set of parameters for controlling it. Here's a rundown of how those parameters work:
This popup list allows you to choose the direction in which displacement is applied. Any options in the popup list marked "(requires computed normal)" require a Compute Terrain or Compute Normal node connected somewhere above the node in the network to work properly. The popup has the following options:
- Along vertical: Displacement happens along the normal of the underlying object (i.e. the planet or a model), ignoring any displacement already applied to the surface.
- Along normal: Displacement happens along the current surface normal.
- Vertical only (requires computed normal): Displacement only happens along the normal of the underlying object (i.e. the planet or a model). The displacement is scaled by the difference between the object normal and the surface normal, so it is reduced as the angle between the normals approaches 90°.
- Lateral only (requires computed normal): Displacement only occurs in the lateral plane, in other words perpendicular to the normal of the underlying object.
- Lateral normalized (requires computed normal): This is the same as Lateral only, but the normal is normalized (scaled so it has a length of 1).
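The options above can be sketched in code. This is a rough illustration, not Terragen's actual implementation: the mode names are made up for the example, and the dot-product falloff used for "Vertical only" is an assumption consistent with "reduced as the angle approaches 90°".

```python
import math

def displace(point, amount, mode, object_normal, surface_normal):
    """Return the displaced point for a given direction mode.

    Vectors are (x, y, z) tuples; both normals are assumed unit length.
    Illustrative only; not Terragen's actual code.
    """
    def scale(v, s):
        return tuple(c * s for c in v)

    def add(a, b):
        return tuple(x + y for x, y in zip(a, b))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    if mode == "along_vertical":
        # Along the base object's normal, ignoring prior displacement.
        offset = scale(object_normal, amount)
    elif mode == "along_normal":
        # Along the current (possibly already displaced) surface normal.
        offset = scale(surface_normal, amount)
    elif mode == "vertical_only":
        # Along the object normal, fading out as the surface normal
        # tilts away from it (zero at 90 degrees) -- assumed falloff.
        offset = scale(object_normal,
                       amount * max(0.0, dot(object_normal, surface_normal)))
    elif mode == "lateral_only":
        # Keep only the component perpendicular to the object normal.
        lateral = add(surface_normal,
                      scale(object_normal, -dot(surface_normal, object_normal)))
        offset = scale(lateral, amount)
    elif mode == "lateral_normalized":
        # Same as lateral_only, but rescaled to unit length first.
        lateral = add(surface_normal,
                      scale(object_normal, -dot(surface_normal, object_normal)))
        length = math.sqrt(dot(lateral, lateral))
        offset = scale(lateral, amount / length) if length > 0 else (0.0, 0.0, 0.0)
    else:
        raise ValueError(mode)
    return add(point, offset)
```

For example, with a surface normal tilted 45° away from the object normal, "vertical_only" displaces by about 0.71 of the full amount, while "lateral_normalized" displaces the full amount sideways.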
This multiplies the displacement values coming from the Displacement function input. A value of 1 leaves the incoming values unchanged. A value of 2 would make the incoming values twice as large. A value of 0.5 would make them half as large. Negative values will invert the displacement.
This parameter is where you connect the node(s) used to generate displacement for the layer. It expects scalar input, so some nodes which create displacement themselves may not give the results you expect (no displacement at all, for example). This is because those nodes displace scene geometry directly rather than outputting values which can be used for generating displacement in another node. You can connect nodes which create colour, though. The colour will be automatically converted to a scalar.
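The colour-to-scalar conversion can be pictured as collapsing the three channels to one number. The exact weighting Terragen uses isn't stated here; a plain channel average is shown as one simple possibility.

```python
def colour_to_scalar(rgb):
    """Collapse an RGB colour to a single displacement value.

    A plain average of the channels is used here as an assumption;
    Terragen's actual weighting may differ.
    """
    r, g, b = rgb
    return (r + g + b) / 3.0
```

Under this scheme, white (1, 1, 1) drives full displacement, black (0, 0, 0) drives none, and a pure red (1, 0, 0) drives one third.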
An example of this situation is using the Simple Shape Shader to generate displacement for another node. If you just turn on displacement for the shape shader you won't get any displacement in the node it's connected to. However, if you turn on colour for the shape shader you will see displacement.
From v2.4 on you can use the Displacement Shader to Vector node to convert the output of a displacement shader to a vector which can be connected to the Displacement function input. The vector gets converted to a scalar.
This value is added to incoming displacement values after they are multiplied by the Displacement multiplier parameter. This creates the effect of offsetting the displacement by a set amount along the Displacement direction. Positive values push the displacement out, so it looks almost as if it was sitting on a plinth. Negative values sink the displacement back into the surface. This doesn't reverse the displacement; it's more like creating a hole in the surface and then applying the displacement to the bottom of the hole.
You might find that surfaces with rough or spiky displacement occasionally show problems, such as being cut off at bucket edges or causing gaps in ray-traced shadows. Some nodes, such as the Planet, have a Displacement tolerance parameter which can help to improve this. Changing this parameter can greatly increase render times, so you should only change it if you have a specific need. The default value is 1. If you're having problems, try increasing it to 2. If that improves things but doesn't completely resolve them, try increasing it in small increments. You would not generally want to go above a value of 4 or 5, however.
Literally, to change the position of something. In graphics terminology, to displace a surface is to modify its geometric (3D) structure using reference data of some kind. For example, a grayscale image might be taken as input, with black areas indicating no displacement of the surface and white indicating maximum displacement. In Terragen 2 displacement is used to create all terrain, by taking heightfield or procedural data as input and using it to displace the normally smooth sphere of the planet.
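The grayscale-to-displacement mapping just described can be sketched as a simple linear remap. This assumes 8-bit pixel values; it is an illustration of the idea, not Terragen's heightfield code.

```python
def heights_from_grayscale(pixels, max_displacement):
    """Map 8-bit grayscale values to displacement amounts.

    Black (0) means no displacement, white (255) means the maximum.
    `pixels` is a list of rows of integer values in 0..255.
    """
    return [[v / 255.0 * max_displacement for v in row] for row in pixels]
```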
A single object or device in the node network which generates or modifies data and may accept input data or create output data or both, depending on its function. Nodes usually have their own settings which control the data they create or how they modify data passing through them. Nodes are connected together in a network to perform work in a network-based user interface. In Terragen 2 nodes are connected together to describe a scene.
A parameter is an individual setting in a node parameter view which controls some aspect of the node.
A scalar is a single number. 1, 200.45, -45, -0.2 are all examples of scalar values.
A shader is a program or set of instructions used in 3D computer graphics to determine the final surface properties of an object or image. This can include arbitrarily complex descriptions of light absorption and diffusion, texture mapping, reflection and refraction, shadowing, surface displacement and post-processing effects. In Terragen 2 shaders are used to construct and modify almost every element of a scene.
A vector is a set of three scalars, normally representing X, Y and Z coordinates. It also commonly represents rotation, where the values are pitch, heading and bank.
When Terragen renders, it divides the image up into buckets or tiles. Each bucket is rendered separately, allowing multiple buckets to be rendered at once. It also allows memory to be used more efficiently.
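The tiling scheme can be sketched as follows. The bucket size and function name here are illustrative assumptions; the point is that the tiles exactly cover the image, with edge buckets clipped to the image bounds.

```python
def make_buckets(width, height, bucket_size):
    """Split an image into bucket rectangles (x, y, w, h).

    Buckets along the right and bottom edges are clipped so the
    tiles exactly cover the image with no overlap.
    """
    buckets = []
    for y in range(0, height, bucket_size):
        for x in range(0, width, bucket_size):
            buckets.append((x, y,
                            min(bucket_size, width - x),
                            min(bucket_size, height - y)))
    return buckets
```

Because each bucket is independent, a renderer can hand different buckets to different threads and only needs to hold a bucket's worth of pixel data in memory at a time.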