3.1 Multitexturing Overview
Multitexturing is perhaps the simplest technique that can be implemented with a pixel shader. Furthermore, because pixel shaders replace the fixed-function multitexturing stage, we should have at least a basic understanding of what multitexturing is and what it does. This section therefore gives a concise overview of multitexturing.
When we discussed texturing (translator's note: the texturing discussion is not within the scope of this translation; refer to the original book), we ignored the multitexturing capabilities of the fixed-function pipeline for two reasons. First, multitexturing is a somewhat involved process, and we considered it an advanced topic at the time. Second, the fixed-function multitexturing stage is replaced by the newer and more powerful pixel shaders, so it would be pointless to spend time on the fixed-function approach.
The idea behind multitexturing is related to blending, which we learned about earlier (translator's note: the blending chapter, Chapter 7 of the original book, is not within the scope of this translation). We extend that same idea to multiple textures; that is, we enable several textures at once and then define how these textures are blended together to achieve a specific effect. A common use of multitexturing is lighting: instead of using Direct3D's lighting model in the vertex processing stage, we use special texture maps called light maps, which encode how a surface is lit. For example, suppose we want to shine a spotlight on a large wooden crate. We could define the spotlight as a D3DLIGHT9 structure, or we could blend a texture map representing the crate with a light map representing the spotlight, as shown in Figure 18.1.
Figure 18.1: Rendering a wooden crate lit by a spotlight using multitexturing. Here we combine the two textures by multiplying their corresponding texels together.
Note: As with the blending discussed in Chapter 7 of the original book, the resulting image depends on how the textures are blended. In the fixed-function multitexturing stage, the blending mode is controlled through texture render states. With a pixel shader, we can write the blending function programmatically as a simple expression, which lets us blend the textures in any way we want. We discuss texture blending further when we examine this chapter's sample application.
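For concreteness, here is a minimal sketch of how the multiplicative blend in Figure 18.1 could be expressed with the fixed-function texture render states. This snippet is illustrative only and assumes the crate texture has already been set at stage 0 and the light map at stage 1 (Device is an IDirect3DDevice9 pointer); this chapter's sample performs the blend in a pixel shader instead.

// Stage 0: output the crate texture's color unmodified.
Device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
Device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);

// Stage 1: multiply (modulate) the light map with the result of stage 0.
Device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
Device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
Device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

// Stage 2: terminate the cascade of texture stages.
Device->SetTextureStageState(2, D3DTSS_COLOROP,   D3DTOP_DISABLE);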
Blending multiple textures (two in this example) to light the crate has the following advantages over using Direct3D's lights:
- The lighting is precalculated into the spotlight's light map, so it does not have to be computed at run time, which saves processing time. Of course, the lighting can only be precalculated for static objects and static light sources.
- Because the light map is precalculated, we can use lighting models that are more accurate and sophisticated than Direct3D's model. (Better lighting results in a more realistic scene.)
Remark: The multitexturing stage is typically used to implement a full lighting engine for static objects. For example, we might have a texture map that holds the surface colors of an object, such as the crate texture map. Then we might have a diffuse light map that holds the diffuse surface shade, a separate specular light map that holds the specular surface shade, a fog map that holds the amount of fog covering the surface, and a detail map that holds small, high-frequency surface details. When all of these textures are combined, the scene is effectively lit, colored, and detailed simply by looking up values in these precalculated textures.
Note: The spotlight light map is a trivial example of a very basic light map. Typically, special programs are used to generate light maps for a given scene and set of light sources; generating light maps is beyond the scope of this book. Interested readers can refer to the discussion of light mapping in 3D Games: Real-Time Rendering and Software Technology.
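Returning to the "full lighting engine" remark above, one plausible per-texel combination of such precalculated maps is sketched below. The formula and the helper function CombineMaps are assumptions made for illustration, not a formula given by the book; real engines choose their own combination.

#include <d3dx9.h>

// Hypothetical per-texel combination of precalculated maps.
// colorMap: base surface color; diffuseMap/specularMap: light maps;
// fogMap.r: fog amount in [0, 1]; detailMap: high-frequency detail.
D3DXCOLOR CombineMaps(const D3DXCOLOR& colorMap,
                      const D3DXCOLOR& diffuseMap,
                      const D3DXCOLOR& specularMap,
                      const D3DXCOLOR& fogMap,
                      const D3DXCOLOR& detailMap,
                      const D3DXCOLOR& fogColor)
{
    D3DXCOLOR lit;
    D3DXColorModulate(&lit, &colorMap, &diffuseMap); // color * diffuse shade
    lit += specularMap;                              // add specular highlight
    D3DXColorModulate(&lit, &lit, &detailMap);       // modulate with detail

    // Blend toward the fog color by the fog amount stored in the fog map.
    D3DXCOLOR out;
    D3DXColorLerp(&out, &lit, &fogColor, fogMap.r);
    return out;
}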
3.1.1 Enabling Multiple Textures
Recall that textures are set with the IDirect3DDevice9::SetTexture method and that sampler states are set with the IDirect3DDevice9::SetSamplerState method; their prototypes are as follows:
HRESULT IDirect3DDevice9::SetTexture(
    DWORD Stage, // specifies the texture stage index
    IDirect3DBaseTexture9 *pTexture
);
HRESULT IDirect3DDevice9::SetSamplerState(
    DWORD Sampler, // specifies the sampler stage index
    D3DSAMPLERSTATETYPE Type,
    DWORD Value
);
Note: A particular sampler stage index i is associated with the ith texture stage. That is, the ith sampler stage specifies the sampler states for the ith set texture.
The texture/sampler stage index identifies the texture/sampler stage whose texture or sampler states we want to set. Thus, we can enable multiple textures and set their corresponding sampler states by using different stage indices. Previously in this book we always specified index 0, denoting the first stage, because we only used one texture at a time. So, for example, if we need to enable three textures, we use stages 0, 1, and 2 like this:
// Set first texture and corresponding sampler states.
Device->SetTexture(0, Tex1);
Device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

// Set second texture and corresponding sampler states.
Device->SetTexture(1, Tex2);
Device->SetSamplerState(1, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(1, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(1, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

// Set third texture and corresponding sampler states.
Device->SetTexture(2, Tex3);
Device->SetSamplerState(2, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(2, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(2, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
This code enables Tex1, Tex2, and Tex3, and sets the filtering modes for each texture.
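Once the multitextured geometry has been drawn, it is also good practice to disable the stages that are no longer needed so they do not affect later draw calls. A minimal sketch (an illustrative addition, not part of the book's listing; passing a null pointer disables the texture at that stage):

Device->SetTexture(0, 0);
Device->SetTexture(1, 0);
Device->SetTexture(2, 0);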
3.1.2 Multiple Texture Coordinates
Recall from the texturing chapter (Chapter 6 of the original book) that for each 3D triangle we define a corresponding triangle on the texture that is to be mapped onto the 3D triangle. We did this by adding texture coordinates to each vertex, so that every three vertices defining a triangle also define a corresponding triangle on the texture.
Since we are now using multiple textures, for each 3D triangle we need to define a corresponding triangle on each of the enabled textures. We do this by adding an extra set of texture coordinates to each vertex for each enabled texture. For example, if we blend three textures together, then each vertex must have three sets of texture coordinates that index into the three enabled textures. Thus, a vertex structure for multitexturing with three textures might look like this:
struct MultiTexVertex
{
    MultiTexVertex(float x, float y, float z,
                   float u0, float v0,
                   float u1, float v1,
                   float u2, float v2)
    {
        _x = x;   _y = y;   _z = z;
        _u0 = u0; _v0 = v0;
        _u1 = u1; _v1 = v1;
        _u2 = u2; _v2 = v2;
    }

    float _x, _y, _z;
    float _u0, _v0; // Texture coordinates for texture at stage 0.
    float _u1, _v1; // Texture coordinates for texture at stage 1.
    float _u2, _v2; // Texture coordinates for texture at stage 2.

    static const DWORD FVF;
};
const DWORD MultiTexVertex::FVF = D3DFVF_XYZ | D3DFVF_TEX3;
Observe that the flexible vertex format flag D3DFVF_TEX3 is specified, indicating that the vertex structure contains three sets of texture coordinates. The fixed-function pipeline supports up to eight sets of texture coordinates; to use more than eight, you must use a vertex declaration and the programmable vertex pipeline.
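To illustrate how this structure might be used, here is a minimal sketch that fills a vertex buffer with one multitextured quad and draws it. The quad geometry, the buffer name QuadVB, and the choice of identical coordinates for all three stages are illustrative assumptions, not code from the book's sample.

IDirect3DVertexBuffer9* QuadVB = 0;
Device->CreateVertexBuffer(
    6 * sizeof(MultiTexVertex), D3DUSAGE_WRITEONLY,
    MultiTexVertex::FVF, D3DPOOL_MANAGED, &QuadVB, 0);

MultiTexVertex* v = 0;
QuadVB->Lock(0, 0, (void**)&v, 0);

// Two triangles forming a quad; each vertex carries three sets of
// texture coordinates -- one per enabled texture stage.
v[0] = MultiTexVertex(-1.0f, -1.0f, 1.25f,  0.0f, 1.0f,  0.0f, 1.0f,  0.0f, 1.0f);
v[1] = MultiTexVertex(-1.0f,  1.0f, 1.25f,  0.0f, 0.0f,  0.0f, 0.0f,  0.0f, 0.0f);
v[2] = MultiTexVertex( 1.0f,  1.0f, 1.25f,  1.0f, 0.0f,  1.0f, 0.0f,  1.0f, 0.0f);
v[3] = MultiTexVertex(-1.0f, -1.0f, 1.25f,  0.0f, 1.0f,  0.0f, 1.0f,  0.0f, 1.0f);
v[4] = MultiTexVertex( 1.0f,  1.0f, 1.25f,  1.0f, 0.0f,  1.0f, 0.0f,  1.0f, 0.0f);
v[5] = MultiTexVertex( 1.0f, -1.0f, 1.25f,  1.0f, 1.0f,  1.0f, 1.0f,  1.0f, 1.0f);

QuadVB->Unlock();

// When drawing: declare the vertex format, bind the buffer, and draw.
Device->SetFVF(MultiTexVertex::FVF);
Device->SetStreamSource(0, QuadVB, 0, sizeof(MultiTexVertex));
Device->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 2);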
Note: In the newer pixel shader versions, we can use one set of texture coordinates to index into several textures, thereby eliminating the need for multiple sets of texture coordinates. Of course, this assumes the same texture coordinates are used for every texture stage; if the coordinates differ per stage, then we still need multiple sets of texture coordinates.