Rendering Grass with Instancing in DirectX* 10
By Anu Kalra

Because of its geometric complexity, rendering realistic grass in real time is difficult, especially on consumer graphics hardware. This article introduces the concept of geometry instancing with the Direct3D* 10 API and shows how it can be used to render realistic grass on consumer graphics hardware.
Instancing Grass

A typical patch of grass can easily have a few hundred thousand blades. Each blade is similar to the others, with slight variations in color, position, and orientation. Rendering a large number of small objects, each made from a few polygons, is not optimal. Current-generation graphics APIs, such as DirectX* and OpenGL*, are not designed to efficiently render models with a small number of polygons thousands of times per frame.

To efficiently render thousands of blades of grass, the number of draw calls needs to be drastically reduced. If the geometry of the grass blades doesn't change, the best approach is to place the grass geometry in a vertex buffer and render it in one draw call. However, if the geometry changes often (for example, if a level-of-detail scheme is being used for geometry simplification), this approach won't work, because a large amount of data would need to be sent to the graphics card every time the geometry changes.

Geometry instancing allows the reuse of geometry when drawing multiple similar objects in a scene. The common data is stored in a vertex buffer, and the differentiating parameters, such as position and color, are stored in a separate vertex (instance) buffer. The hardware uses the vertex and instance buffers to render unique instances of the models (see the Related Articles section).

Using geometry instancing APIs helps factor the common data from the unique data (the flyweight design pattern) and thus reduces memory utilization and bandwidth. The vertex buffer can stay resident in graphics memory, and the instance buffer can be updated more frequently if needed, providing both performance and flexibility.

Implementation Details

In the example described in this article, numerous small patches of grass are drawn across the terrain. A patch of grass consists of a vertex buffer that contains a number of randomly placed intersecting quads. Each quad is mapped with a texture containing a few blades of grass.

A natural waving motion of the grass blades is achieved by animating the vertices of each quad using a combination of sine waves of different frequencies. Color changes that occur with the waving motion and from the effects of the wind are simulated using the same sine wave that animates the grass.¹ Geometry instancing places numerous small patches along a grid on the terrain. This method allows visible patches to be selectively drawn. Patches with various levels of detail (depending on the camera position) can also be introduced with relative ease.

Figure 1 highlights the dynamic culling of grass geometry. Only patches shown in blue are rendered. Refer to the code sample provided later in this article for more details.

Figure 1. Selective drawing of grass patches using geometry instancing.

Instancing with Direct3D* 10

Several steps are necessary to implement geometry instancing using the Direct3D* 10 API.

1. Define the vertex and instance buffers.

Direct3D* 10 does not distinguish between the various buffer types; they are all stored as ID3D10Buffer objects. To render instanced grass, two buffers are created: one contains the static geometry information for the patch, and the other contains the positions at which the patches are to be drawn.

2. Associate the input buffers with a vertex shader.

The vertex and instance buffers are associated with a vertex shader in Direct3D* 10 using an input layout object, which describes how the vertex buffer data is streamed into the input assembler (IA) pipeline stage (Figure 2).
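Before looking at the input layout in detail, a minimal sketch of step 1, creating the two buffers, might look like the following. The GrassVertex and PatchInstance structures, the usage flags, and the function name are illustrative assumptions, not the definitions used by the actual sample.

```cpp
// Sketch of step 1: creating the geometry and instance buffers
// (structure layouts, usage flags, and names are illustrative assumptions).
#include <d3d10.h>

// Per-vertex data for the static grass-patch geometry (assumed layout).
struct GrassVertex
{
    float pos[3]; // quad-corner position within the patch
    float tex[2]; // texture coordinate into the grass-blade texture
};

// Per-instance data: where one patch is placed on the terrain grid (assumed layout).
struct PatchInstance
{
    float worldPos[3];
};

// Both buffers are plain ID3D10Buffer objects; only how they are bound later
// distinguishes per-vertex data from per-instance data.
HRESULT CreateGrassBuffers(ID3D10Device* device,
                           const GrassVertex* verts, UINT vertCount,
                           const PatchInstance* instances, UINT instanceCount,
                           ID3D10Buffer** vertexBuffer, ID3D10Buffer** instanceBuffer)
{
    D3D10_BUFFER_DESC desc = {};
    D3D10_SUBRESOURCE_DATA init = {};

    // Static patch geometry: created once and kept resident in graphics memory.
    desc.Usage     = D3D10_USAGE_IMMUTABLE;
    desc.BindFlags = D3D10_BIND_VERTEX_BUFFER;
    desc.ByteWidth = sizeof(GrassVertex) * vertCount;
    init.pSysMem   = verts;
    HRESULT hr = device->CreateBuffer(&desc, &init, vertexBuffer);
    if (FAILED(hr))
        return hr;

    // Per-patch positions: also a vertex buffer, but with a default (or dynamic)
    // usage so it can be refilled if the set of visible patches changes.
    desc.Usage     = D3D10_USAGE_DEFAULT;
    desc.ByteWidth = sizeof(PatchInstance) * instanceCount;
    init.pSysMem   = instances;
    return device->CreateBuffer(&desc, &init, instanceBuffer);
}
```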
¹ Isidoro, J. and D. Card, "Animated Grass with Pixel and Vertex Shaders." http://ati.amd.com/developer/shaderx/shaderx_animatedgrass.pdf
Figure 2. The process of streaming the vertex buffer into the input assembler stage.

An input layout object is created from an array of input-element descriptions and a pointer to a compiled shader. Each element describes the data structure and layout of a vertex buffer. In the input-element array used by the sample source for rendering instanced grass, the first two elements define the data structure of the vertex data coming from the vertex buffer, while the third element describes the data structure of the instance data coming from the instance buffer (the second vertex buffer). Notice that the input slot (the fourth data entry) is different for the vertex and instance buffers. For more details, refer to "Getting Started with the Input-Assembler Stage."²

² "Getting Started with the Input-Assembler Stage (Direct3D* 10)." http://msdn.microsoft.com/en-us/library/bb205117(VS.85).aspx

3. Bind the objects.

Once the vertex buffers are ready, they are bound to the IA stage. An array of vertex buffer pointers containing the geometry and instance buffers, along with their strides and offsets, is created and bound to the IA together with the previously created input layout.

4. Draw the primitives.

Once all the input resources have been bound to the pipeline, draw calls are issued to render the primitives. Direct3D* 10 supports various instanced draw calls for drawing geometry, based on the primitive topology used; the instanced grass sample renders triangle lists.
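A minimal sketch of steps 2 through 4 might look like the following. It reuses the illustrative GrassVertex and PatchInstance structures from the earlier sketch; the semantic names, index format, and parameter names are likewise assumptions rather than the sample's actual listing.

```cpp
// Sketch of steps 2-4 (illustrative names; not the sample's actual listing).
#include <d3d10.h>

// Step 2 (at initialization): build the input layout once against the compiled VS.
ID3D10InputLayout* CreateGrassInputLayout(ID3D10Device* device,
                                          const void* vsBytecode, SIZE_T vsBytecodeSize)
{
    // The first two elements describe per-vertex data from input slot 0; the third
    // describes per-instance data from input slot 1 and advances once per instance
    // (step rate 1) rather than once per vertex.
    D3D10_INPUT_ELEMENT_DESC layoutDesc[] =
    {
        { "POSITION",    0, DXGI_FORMAT_R32G32B32_FLOAT, 0,  0, D3D10_INPUT_PER_VERTEX_DATA,   0 },
        { "TEXCOORD",    0, DXGI_FORMAT_R32G32_FLOAT,    0, 12, D3D10_INPUT_PER_VERTEX_DATA,   0 },
        { "INSTANCEPOS", 0, DXGI_FORMAT_R32G32B32_FLOAT, 1,  0, D3D10_INPUT_PER_INSTANCE_DATA, 1 },
    };

    ID3D10InputLayout* layout = NULL;
    device->CreateInputLayout(layoutDesc,
                              sizeof(layoutDesc) / sizeof(layoutDesc[0]),
                              vsBytecode, vsBytecodeSize, &layout);
    return layout;
}

// Steps 3 and 4 (every frame): bind the buffers and issue one instanced draw call.
void DrawGrassPatches(ID3D10Device* device, ID3D10InputLayout* inputLayout,
                      ID3D10Buffer* vertexBuffer,   // static patch geometry (slot 0)
                      ID3D10Buffer* instanceBuffer, // per-patch positions   (slot 1)
                      ID3D10Buffer* indexBuffer,
                      UINT indexCountPerPatch, UINT visiblePatchCount)
{
    // Step 3: bind both buffers to the IA stage in their respective slots,
    // along with the layout, index buffer, and primitive topology.
    ID3D10Buffer* buffers[2] = { vertexBuffer, instanceBuffer };
    UINT strides[2] = { sizeof(GrassVertex), sizeof(PatchInstance) };
    UINT offsets[2] = { 0, 0 };
    device->IASetInputLayout(inputLayout);
    device->IASetVertexBuffers(0, 2, buffers, strides, offsets);
    device->IASetIndexBuffer(indexBuffer, DXGI_FORMAT_R16_UINT, 0);
    device->IASetPrimitiveTopology(D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    // Step 4: one instanced draw call renders every visible patch.
    device->DrawIndexedInstanced(indexCountPerPatch, visiblePatchCount, 0, 0, 0);
}
```

Because the per-instance element uses a step rate of 1, the input assembler re-reads the patch geometry for every instance while advancing the instance buffer once per patch, so a single DrawIndexedInstanced call replaces thousands of individual draw calls.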
The instancing portion of the source is divided into two classes: (1) InstancedBillboard and (2) BBGrassPatch. InstancedBillboard is a generic class that hides the implementation details of instancing; it accepts the vertex and instance data structures as template parameters. BBGrassPatch builds on it and initializes the grass blades and patches.

Rendering grass blades with alpha-to-coverage

The grass patch is rendered as a number of randomly placed intersecting quads. Each quad is mapped with an alpha texture containing a few blades of grass. Rendering it as is with alpha blending requires that the quads be sorted back to front to render transparency correctly. This can be computationally expensive, especially if hundreds of thousands of quads must be sorted every time the camera moves. Resending the sorted data to the graphics card, or using depth peeling on the graphics card, is equally prohibitive.

Using alpha-to-coverage solves this problem. Since the grass billboards use the alpha channel as cut-outs (alpha is either 0 or 1), this method works well. However, the cut-outs are not always binary: in some cases the edges of the cut-outs are blurred to make the vegetation look more realistic. In this case, alpha-to-coverage combined with multi-sample anti-aliasing (MSAA) can solve the problem.

Alpha-to-coverage converts the alpha component output by the pixel shader into a coverage mask that is applied at the sub-pixel resolution of an MSAA render target. When the MSAA resolve is applied, the output pixel gets a transparency from 0 to 1 depending on the alpha coverage and the MSAA sample count. Images produced using this technique (see Figure 3) look realistic and have no artifacts.³⁴ This method is a pseudo order-independent transparency solution and works well when the alpha channel is used for cut-outs (alpha is either 0 or 1), as in rendering vegetation. However, it doesn't work if correct transparency is desired. (See the blend-state sketch following the conclusions below.)

Future Work and Conclusions

The Direct3D* 10 API simplifies the implementation of instancing and also offers improved performance. Because the mapping of vertex buffers to shader inputs is done by creating input layouts during initialization, it doesn't need to be repeated at draw time for every frame, as in earlier versions of the API, which improves performance.

The sample source provided can be modified with relative ease to create a level-of-detail simplification scheme. Multiple static patches at different levels of detail can be generated, each with fewer polygons, and placed on the grid with instancing depending on the camera distance. Further simplification can be achieved by using two animated textures for grass patches at even greater distances.

The sample currently runs at reasonably good frame rates on consumer graphics hardware (30–60 fps) and can be further optimized by tweaking the size of the static grass patch (and hence the number of blades) and the number of instances, along with the other level-of-detail optimizations mentioned above.

Figure 3 shows a field of grass rendered on integrated graphics hardware with alpha-to-coverage, running at about 60 fps.

Figure 3. Field of grass rendered using alpha-to-coverage.
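For reference, a minimal sketch of enabling alpha-to-coverage through a Direct3D* 10 blend state follows; the function and variable names are illustrative, and the render target is assumed to be multisampled.

```cpp
// Sketch: enabling alpha-to-coverage via a Direct3D* 10 blend state (illustrative).
#include <d3d10.h>

ID3D10BlendState* CreateAlphaToCoverageState(ID3D10Device* device)
{
    D3D10_BLEND_DESC desc = {};
    desc.AlphaToCoverageEnable = TRUE;   // pixel-shader alpha becomes an MSAA coverage mask
    desc.SrcBlend  = D3D10_BLEND_ONE;    // conventional blending stays disabled,
    desc.DestBlend = D3D10_BLEND_ZERO;   // but the enum fields still need valid values
    desc.BlendOp   = D3D10_BLEND_OP_ADD;
    desc.SrcBlendAlpha  = D3D10_BLEND_ONE;
    desc.DestBlendAlpha = D3D10_BLEND_ZERO;
    desc.BlendOpAlpha   = D3D10_BLEND_OP_ADD;
    for (int i = 0; i < 8; ++i)
    {
        desc.BlendEnable[i] = FALSE;
        desc.RenderTargetWriteMask[i] = D3D10_COLOR_WRITE_ENABLE_ALL;
    }

    ID3D10BlendState* state = NULL;
    device->CreateBlendState(&desc, &state);
    return state;
}

// Usage: bind before drawing the grass into an MSAA render target.
// FLOAT factor[4] = { 0, 0, 0, 0 };
// device->OMSetBlendState(alphaToCoverageState, factor, 0xFFFFFFFF);
```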
³ Instancing10 Sample, Microsoft DirectX SDK: http://msdn.microsoft.com/en-us/library/bb205317(VS.85).aspx
⁴ "Anti-Aliasing with Transparency," NVIDIA technical report: http://developer.download.nvidia.com/SDK/9.5/Samples/DEMOS/Direct3D9/src/AntiAliasingWithTransparency/docs/AntiAliasingWithTransparency.pdf
Related Articles

The following resources provide additional information:

Carucci, Francesco, "Inside Geometry Instancing," in GPU Gems 2: Programming Techniques for High-Performance Graphics and General-Purpose Computation, Matt Pharr, ed., Addison-Wesley Professional, 2005.

About the Author

Anu Kalra is a senior software engineer at Intel. He works in the visual computing software division, where he develops technologies and advises gaming ISVs in the adoption of multi-core and Larrabee technologies. He holds a Master's degree in Computer Science from the University of Illinois, where he specialized in computer graphics and virtual reality.

Sign up today for Intel® Visual Adrenaline magazine: www.intelsoftwaregraphics.com

Intel does not make any representations or warranties whatsoever regarding quality, reliability, functionality, or compatibility of third-party vendors and their devices. All products, dates, and plans are based on current expectations and subject to change without notice. Intel and the Intel logo are trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States and other countries. *Other names and brands may be claimed as the property of others. © Copyright 2009. Intel Corporation. All rights reserved.