How to Project a Texture

zhaozj, 2021-02-17


Original source: SGI OpenGL tutorial

Translation: Heart Blue (Pan Liliang), XheartBlue@etang.com

Translator's preface:

There are two classic ways to implement shadows: shadow volumes and shadow mapping. But how do you use light mapping to achieve a projected shadow? The answer is projective texturing: casting a texture onto the scene like a slide. Imagine a film projector in a theater: the film is projected along the lens direction onto the wall. Projective texturing works the same way, and the texture we want to apply plays the role of the film.

The following is an article I found among SGI's tutorials, offered here for everyone.

How to Project a Texture

http://www.sgi.com/software/opengl/advanced98/notes/node49.html

Projecting a texture image into your synthetic environment requires many of the same steps that are used to project the rendered scene onto the display. The key to projecting a texture is the contents of the texture transform matrix. That matrix is the concatenation of three transformations:

1. A modelview transform to orient the projection in the scene.

2. A projective transform (perspective or orthographic).

3. A scale and bias to map the near clipping plane to texture coordinates.

The modelview and projection parts of the texture transform can be computed in the same way, and with the same tools, as the normal modelview and projection transforms. For example, you can use gluLookAt() to orient the projection, and glFrustum() or gluPerspective() to define a perspective transformation.

The modelview transform is used in the same way as it is in the OpenGL viewing pipeline: it moves the viewer to the origin, with the projection centered along the negative z axis. In this case, the viewer can be thought of as a light source, and the near clipping plane of the projection as the location of the texture image, which can be imagined as printed on a transparent film. Alternatively, you can picture a viewer at the view location, looking through the texture on the near plane at the surfaces to be textured.

The projection operation converts eye space into normalized device coordinate (NDC) space, in which the x, y, and z coordinates each range from -1 to 1. When used in the texture matrix, the coordinates are s, t, and r instead. The projected texture can be visualized as lying on the surface of the near plane of the oriented projection defined by the modelview and projection parts of the transform.

The final part of the transform scales and biases the texture map, which is defined in texture coordinates ranging from 0 to 1, so that the entire texture image (or the desired portion of it) covers the near plane defined by the projection. Since the near plane is defined in NDC coordinates, mapping the NDC near plane to match the texture image requires scaling by 1/2, then biasing by 1/2, in both s and t. (Note: [-1, 1] * 1/2 + 1/2 = [0, 1].) The texture image is then centered on, and covers, the entire near plane. The texture could also be rotated if the orientation of the projected image needs to change.

The transforms are specified in the same order as in the graphics pipeline: the modelview transform happens first, then the projection, and finally the scale and bias that maps the near plane onto the texture image:

1. glMatrixMode(GL_TEXTURE);

2. glLoadIdentity(); (start over)

3. glTranslatef(.5f, .5f, 0.f);

4. glScalef(.5f, .5f, 1.f); (texture covers entire NDC near plane)

5. Set the projection transform (e.g., glFrustum()).

6. Set the modelview transform (e.g., gluLookAt()).

What about the texture coordinates for the primitives that the texture will be projected on? Since the projection and modelview parts of the matrix have been defined in terms of eye space (where the entire scene is assembled), the most straightforward method is to create a 1-to-1 mapping between eye space and texture space. This can be done by enabling texture coordinate generation in eye-linear mode and setting the eye planes to a 1-to-1 mapping (see OpenGL's texture coordinate generation; the equivalent can also be found for D3D):

GLfloat Splane[] = {1.f, 0.f, 0.f, 0.f};

GLfloat Tplane[] = {0.f, 1.f, 0.f, 0.f};

GLfloat Rplane[] = {0.f, 0.f, 1.f, 0.f};

GLfloat Qplane[] = {0.f, 0.f, 0.f, 1.f};

You could also use object-space mapping, but then the current modelview transform at the time the planes are specified must be taken into account.

So when you've done all this, what happens? As each primitive is rendered, texture coordinates matching the x, y, and z values that have been transformed by the modelview matrix are generated, then transformed by the texture transformation matrix. That matrix applies a modelview and projection transform, which orients and projects the primitive's texture coordinate values into NDC space (-1 to 1 in each dimension). These values are then scaled and biased into texture coordinates, and normal filtering and texture environment operations are performed using the texture image.

If the transform and texturing are applied to all the rendered polygons, how do you limit the projected texture to a single area? There are a number of ways to do this. The simplest is to render only the polygons you intend to project the texture onto while projective texturing is active and the projection is in the texture transformation matrix, but this method is crude. Another way is to use the stencil buffer in a multipass algorithm to control which parts of the scene are updated by the projected texture: render the scene without the projected texture, set the stencil buffer to mask off an area, then re-render the scene with the projected texture enabled, using the stencil buffer to mask off all but the desired area. This lets you create an arbitrary outline for the projected image, or project a texture onto a surface that is already textured (translator's note: doing the latter in a single pass is multitexturing, which requires the ARB_multitexture extension).

There is a very simple method that works when you want to project a non-repeating texture onto an untextured surface: set the texture environment to GL_MODULATE, set the texture wrap mode to GL_CLAMP, and set the texture border color to white. When the texture is projected, the surfaces outside the texture itself will default to the texture border color and be modulated with white. This leaves those areas unchanged, since each color component is scaled by one.

Filtering considerations are the same as for normal texturing: the size of the projected texture relative to screen pixels determines minification or magnification. If the projected image will be relatively small, mipmapping may be required to get good-quality results. Using good filtering is especially important if the projected texture moves from frame to frame.

Note that because the projection is modeled on viewing, texture projection is not completely physical: unless special precautions are taken, the texture will affect all surfaces, whether they are in front of or behind the projection point (translator's note: as if the image also appeared behind the film projector, which of course does not match real optics). Because there is no default view volume, the application needs to take care to avoid undesired projected-texture effects. User-defined clipping planes (additional clipping) can help give better control over where the projected texture appears.

Please credit the original source when reposting: https://www.9cbs.com/read-30074.html
