
麒麟子


Programming 3D games on Android with Irrlicht and Bullet

Programming 3D games on Android with Irrlicht and Bullet (Part 1)

Posted on May 19, 2011, 11:00 pm, by xp, under Programming.

Just got a new Android phone (a Samsung Vibrant) a month ago, so after flashing a new ROM and installing a bunch of applications, what would I want to do with the new phone? Well, I’d like to know if the phone is fast enough to play 3D games. From the hardware configuration point of view, it is better equipped than my desktop computer in the 1990s, and since my desktop computer at the time had no problem with 3D games, I would expect it to be fast enough to do the same.

At first, I considered downloading a 3D game from the market, but 3D games for Android are still rare, so why not create a 3D demo game myself?

After looking around at which 3D game engines are available for the Android platform, I settled on Irrlicht. It is an open-source C++ graphics engine, not really a game engine per se, but it has enough features to create my demo 3D application. And I like to have realistic physics in my game, so what could be better than the Bullet Physics library? It is the best-known open-source physics library, also developed in C++. The two libraries together make an interesting combination.

Although Irrlicht was developed for desktop computers, luckily someone has already ported it to the Android platform, which requires a special device driver for the graphics engine. And guess what? Someone has also created a Bullet wrapper for the Irrlicht engine. All of it is C++, and open source. All we need to do now is pull this code together to build a shared library for Android.

In this part, I'll just describe what is needed to compile all the code for Android. Since we will compile C/C++ code, you'll need to download the Android Native Development Kit (NDK). Please refer to its documentation for installation instructions.

We create an Android project and add a jni folder, then put all the C/C++ source code under the jni folder. I created three sub-folders:

  1. Bullet: the Bullet Physics source code. We actually only need the Collision, Dynamics, Soft Body and Linear Math libraries.
  2. Irrlicht: the Irrlicht 3D graphics engine source code; this is the Android port of the engine.
  3. irrBullet: the Bullet wrapper for the Irrlicht engine, which makes it easier to write your programs.

After that, all we need to do is create an Android.mk file, which is quite simple, really. You can read the makefile to see how it is structured. Basically, we just tell the Android NDK build tools that we want to build all the source code for the ARM platform and link with the OpenGL ES library, to create a shared library called libirrlichtbullet.so. That's about it.
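As a rough illustration, a makefile along these lines would do the job. This is a hedged sketch only: the module name follows the text, but the source-file lists are abbreviated placeholders, not the project's actual file list.

```makefile
# Sketch of an Android.mk for building Bullet + Irrlicht + irrBullet into
# one shared library. Source lists are placeholders for illustration.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

LOCAL_MODULE    := irrlichtbullet
LOCAL_SRC_FILES := $(wildcard Bullet/*.cpp) \
                   $(wildcard Irrlicht/*.cpp) \
                   $(wildcard irrBullet/*.cpp)
# Link against OpenGL ES 1.x and the Android log library.
LOCAL_LDLIBS    := -lGLESv1_CM -llog

include $(BUILD_SHARED_LIBRARY)
```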

However, there's one minor thing to note. Android does not ship the C++ Standard Template Library by default, but the irrBullet library makes use of it. Therefore, in the jni folder, we need to add an Application.mk file containing the following line:

APP_STL  := stlport_static

And that’s it. Now, you can run ndk-build to build the shared library. If you have a slow computer, it would take a while. If everything is alright, you should have a shared library in the folder libs/armeabi/. That shared library contains the Bullet Physics, Irrlicht and the irrBullet wrapper libraries. You can now create your 3D games for Android with it. In the next part, we will write a small demo program using this library.

You can download all the source codes and pre-built library here.

 

 

Programming 3D games on Android with Irrlicht and Bullet (Part 2)

Posted on May 20, 2011, 11:30 am, by xp, under Programming.

In the last post, we built the Bullet Physics, Irrlicht and irrBullet libraries together into a shared library for the Android platform. In this post, we are going to create a small demo 3D game for Android, using the library that we built earlier.

This demo is nothing new; I am simply converting an Irrlicht example to run on Android. In this simple game, we stack up a bunch of crates and then shoot a sphere or a cube at them from a distance to topple them. The Irrlicht engine handles all the 3D graphics, and the Bullet Physics library takes care of rigid-body collision detection and realistic physics. For example, when we shoot a sphere from a distance, how it follows a curved trajectory through the air, how far it flies, where it lands, how it reacts when it hits the ground or the crates, and how the crates react when hit: all of this is handled by Bullet Physics, and Irrlicht renders the game world accordingly.
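To build some intuition for what the physics engine does each frame, here is a small, self-contained sketch (my own toy code, not Bullet) that integrates a projectile under gravity with a fixed timestep; Bullet's stepSimulation() applies this kind of integration, plus collision handling, to every rigid body:

```cpp
struct Vec3 { float x, y, z; };

// Integrate a projectile under gravity with a fixed 60 Hz timestep until
// it falls back to the ground (y <= 0). Returns the number of steps taken
// and, via landingX, how far downrange it landed. This is a toy sketch of
// the integration a physics engine performs, not Bullet code.
int simulateShot(float launchHeight, float vx, float vy, float* landingX) {
    Vec3 pos = {0.0f, launchHeight, 0.0f};
    Vec3 vel = {vx, vy, 0.0f};
    const float g = -9.8f;          // gravity, m/s^2
    const float dt = 1.0f / 60.0f;  // fixed timestep, 60 frames per second
    int steps = 0;
    while (pos.y > 0.0f) {
        vel.y += g * dt;            // gravity changes the velocity...
        pos.x += vel.x * dt;        // ...and velocity changes the position
        pos.y += vel.y * dt;
        ++steps;
    }
    if (landingX) *landingX = pos.x;
    return steps;
}
```

Calling simulateShot(1.0f, 10.0f, 5.0f, &x) integrates a shot fired from 1 m up, at 10 m/s horizontally and 5 m/s vertically, until it lands a little over a second later.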

Since it is easier to create an Android project in Eclipse, we are going to work with Eclipse here. You will need the following tools:

  1. Android SDK
  2. Android NDK
  3. Eclipse IDE
  4. Eclipse plugin for Android development.

I’m assuming you have all these tools installed and configured correctly. And I’m assuming you have basic knowledge on Android programming too, so I won’t get into the basic details here.

Let’s create a project called ca.renzhi.bullet, with an Activity called BulletActivity. The onCreate() method will look something like this:

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Lock screen to landscape mode.
    this.setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
    mGLView = new GLSurfaceView(getApplication());
    renderer = new BulletRenderer(this);
    mGLView.setRenderer(renderer);
    DisplayMetrics displayMetrics = getResources().getDisplayMetrics();
    width = displayMetrics.widthPixels;
    height = displayMetrics.heightPixels;
    setContentView(mGLView);
}

This just tells Android that we want an OpenGL surface view. We will create a Renderer class for this, something very simple like the following:

public class BulletRenderer implements Renderer
{
    BulletActivity activity;

    public BulletRenderer(BulletActivity activity)
    {
        this.activity = activity;
    }

    public void onDrawFrame(GL10 arg0)
    {
        activity.drawIteration();
    }

    public void onSurfaceChanged(GL10 gl, int width, int height)
    {
        activity.nativeResize(width, height);
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config)
    {
        activity.nativeInitGL(activity.width, activity.height);
    }
}

The renderer class's methods are invoked every time a frame needs to be rendered. There's nothing special here: when they are invoked, we simply call the native methods on the activity class, which in turn call the C native functions through JNI. Since Irrlicht and Bullet are C++ libraries, the main part of the game has to be written in C/C++; we keep very little logic in the Java code.

When the surface is first created, the onSurfaceCreated() method is invoked, and here we just call nativeInitGL(), which initializes our game world in Irrlicht. This initializes the device and the scene manager, creates a physics world to manage the rigid bodies and their collisions, creates a ground floor, and puts a stack of crates in the middle. Then we create a first-person-shooter (FPS) camera to look at the stack of crates; the player sees the game world through this camera.

I’m not going to describe the codes line by line, since you can download the codes to play with it. But when you create the Irrlicht device with the following line of code:

device = createDevice( video::EDT_OGLES1, dimension2d(gWindowWidth, gWindowHeight), 16, false, false, false, 0);

Make sure you select the correct OpenGL ES version for your Android device. Mine has version 1.x; if yours has version 2.x, use video::EDT_OGLES2 instead.

After the initialization, we would have a scene that looks like this:

When a frame needs to be rendered, the onDrawFrame() method of the renderer is invoked, and here we just call the nativeDrawIteration() function, which handles the game logic in C/C++. The code looks like this:

void Java_ca_renzhi_bullet_BulletActivity_nativeDrawIteration(
        JNIEnv* env,
        jobject thiz,
        jint direction,
        jfloat markX, jfloat markY)
{
    deltaTime = device->getTimer()->getTime() - timeStamp;
    timeStamp = device->getTimer()->getTime();
    device->run();

    // Step the simulation
    world->stepSimulation(deltaTime * 0.001f, 120);

    if ((direction != -1) || (markX != -1) || (markY != -1))
        handleUserInput(direction, markX, markY);

    driver->beginScene(true, true, SColor(0, 200, 200, 200));
    smgr->drawAll();
    guienv->drawAll();
    driver->endScene();
}

As you can see, this is a very standard Irrlicht game loop; the only thing added is a function to handle user input.

User input includes moving left/right and forward/backward, and shooting a sphere or a cube. The Irrlicht engine depends on keyboard and mouse for user interaction, which are not available on Android devices, so we create a very basic kludge that uses touch and tap. Moving a finger left and right on the left half of the screen moves left and right in the game world; moving a finger up and down on the right half moves forward and backward; tapping the screen shoots. The movement is translated into a parameter called direction and passed to the native code to be handled. We also grab the X and Y coordinates of the shooting mark and pass them to the native code as well.
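As an illustration, the mapping from a touch gesture to the direction parameter might look like the following. The encoding here (-1 for none, 0/1 for left/right, 2/3 for forward/backward) is my own assumption for the sketch; the demo's actual values may differ.

```cpp
// Hypothetical encoding of a touch gesture into the "direction" parameter
// passed to the native code: -1 = none, 0/1 = left/right (left half of the
// screen), 2/3 = forward/backward (right half). Illustrative only.
int gestureToDirection(float startX, float dx, float dy, int screenWidth) {
    const float threshold = 10.0f;               // ignore tiny jitters
    if (startX < screenWidth / 2.0f) {           // left half: strafe
        if (dx < -threshold) return 0;           // move left
        if (dx > threshold) return 1;            // move right
    } else {                                     // right half: walk
        if (dy < -threshold) return 2;           // move forward
        if (dy > threshold) return 3;            // move backward
    }
    return -1;                                   // no movement
}
```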

That’s it. You can now build it, package into an apk, install it on your Android device, and play with it. When you shoot on the stack of crates, you would have a scene that looks like this:

The performance on my Samsung Vibrant is OK: I get about 56 or 57 FPS, which is quite smooth. But if there are too many objects to animate, especially after we have shot many spheres and cubes, the screen hangs and jumps a bit, or sometimes stops reacting to user input for a fraction of a second. In a real game, we might want to remove objects that have done their work, so that the number of animated objects stays low enough to maintain acceptable performance.

The other important thing to improve is user interaction and control. The Irrlicht engine was developed for desktop computers and relies mainly on keyboard and mouse, which are not available on mobile devices. The current demo attempts to use touch and tap as user control, but it does not work very well. In the next post, we will try to create virtual controls on screen (e.g. buttons, dials), and we might want to take advantage of the sensors as well, which are a standard feature on mobile devices now.

You can download the source codes of the demo here.

 

Programming 3D games on Android with Irrlicht and Bullet (Part 3)

Posted on May 23, 2011, 6:25 pm, by xp, under Programming.

In the last post, we created a basic 3D demo application with Irrlicht, in which we stacked some crates and toppled them by shooting a cube or a sphere.

In this post, we will try to create an on-screen control so that you can move around with it, like using a hardware control. What we want to have is something like this, in the following screenshot:

What we have here is a control knob sitting on a base, centred on the base. The on-screen control always stays on top of the game scene, and its position stays fixed regardless of how you move the camera. However, users can press the knob and drag it left and right, which in turn moves the camera left and right accordingly.

Obviously, you can implement movement along more than one axis, and you can also have more than one on-screen control if you want, since the device has a multi-touch screen. But that's left to you as an exercise.

Placing an on-screen control is actually quite easy. All you have to do is load a picture as a texture, then draw it as a 2D image on the screen at the location where you want the control. But since we want the on-screen control always on top, we have to draw the 2D image after the game scene (and all the scene objects) has been drawn; if we draw the 2D image first, it will be hidden by the game scene. So, in your loop, you would have something like this:

driver->beginScene(true, true, SColor(0, 200, 200, 200));
smgr->drawAll();
guienv->drawAll();
// Draw the on-screen control now
onScreenControl->draw(driver);
driver->endScene();

That’s the basic idea. Here, we have created an OnScreenControl class, which encloses two objects, one of VirtualControlBase class and the other, of VirtualControlKnob class. The draw() method of the OnScreenControl class looks like this:

void OnScreenControl::draw(IVideoDriver* driver)
{
    base->draw(driver);
    knob->draw(driver);
}

It just delegates the drawing work to its sub-objects, the control base and the control knob. Note that the knob has to be drawn after the base; otherwise it would be hidden behind the base instead of sitting on top of it. The draw() method of the base looks like:

void VirtualControlBase::draw(IVideoDriver* driver)
{
    driver->draw2DImage(_texture,
                        position2d(pos_x, pos_y),
                        rect(0, 0, width, height),
                        0,
                        SColor(255, 255, 255, 255),
                        true);
}

As you can see, it just draws a 2D image with the texture at the specified location. That's it.

After putting the on-screen control in place, we have to handle user touch events on the screen. If the user presses the knob (i.e. presses within the square boundary of the knob image) and moves the finger around, we update the knob's position according to the movement. We want to make sure the user cannot drag the knob out of the control base's boundary (or at least not too far out), so that it looks and behaves like a real control. As the user moves the knob around, you update the camera's position accordingly, and when the knob is released, you reset it to the centre of the control base.
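The clamping part can be sketched in a few lines. This is a minimal illustration assuming a circular base with hypothetical centre coordinates and radius, not the demo's actual code:

```cpp
#include <cmath>

// Keep the knob within the (assumed circular) base: if it is dragged
// outside the base radius, pull it back onto the rim along the same
// direction. Illustrative helper, not the demo's actual code.
void clampKnobToBase(float baseX, float baseY, float baseRadius,
                     float* knobX, float* knobY) {
    float dx = *knobX - baseX;
    float dy = *knobY - baseY;
    float dist = std::sqrt(dx * dx + dy * dy);
    if (dist > baseRadius) {                 // dragged outside the base:
        float scale = baseRadius / dist;     // pull it back onto the rim
        *knobX = baseX + dx * scale;
        *knobY = baseY + dy * scale;
    }
}
```

On release, resetting the knob is just *knobX = baseX; *knobY = baseY;.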

That’s basically the idea. You can grab the source code here. Ugly codes, I warn you.

The major problem with programming Irrlicht on Android is the separation between Java code and C/C++ code. If you can limit your program to Android 2.3 or later, you can probably write the whole program in C/C++ using the native activity class, so you don't have to move back and forth between Java and C++. But if you want to run on older versions of Android, your Activity must be written in Java, with the main game logic in C/C++. You then have to catch user interaction events in your Java code and pass them through JNI to your C/C++ code. There is a loss of information as you move back and forth, not to mention quite a bit of code duplication. You could certainly create a full wrapper for the Irrlicht and Bullet libraries, but that would tax your mobile device heavily and certainly hurt performance, and creating a full wrapper for these two libraries would be a heck of a job.

The other problem is that Irrlicht is an engine developed for the desktop, where keyboard and mouse are the main input devices. The Irrlicht port to Android is mainly concerned with a display driver for Android; it has not really gone deep into user interaction. Therefore, as you write your Irrlicht-based Android program, you have to hack together your own input-handling and event models. In my demo I haven't even done that properly; I just kludged together some primitive event-handling code. To make our programs fit multi-touch devices, we would have to dig into Irrlicht's scene-node animator and event-handling mechanisms and work from there. For example, we would define our own scene-node animator based on touch events instead of keyboard and mouse events, and add it to the scene node we want to animate. This is something we will look into in future posts.

posted @ 2013-04-06 21:19 麒麟子 Views (1345) | Comments (0)

Irrlicht engine: mirror effect

 

Recently I have been using Irrlicht for a small 3D fitting-room project, and to add a little flair to it, I decided to implement a mirror.

I remember the D3D "dragon book" has an example implemented with the stencil buffer, and there are OpenGL examples online as well. This time, though, I wanted to implement the mirror effect with Irrlicht's RTT (render-to-texture).

The principle is the same as for water-surface reflection, just without the perturbation.

Step 1: render the reflection texture

Rendering the reflection texture amounts to mirroring the camera about the mirror plane. I searched Irrlicht for a long time without finding a mirror-matrix routine, but I did find one online that works quite well.

I also checked the code of my former company's engine project and found that it uses the same formula. If you are interested, see here:

http://www.cnblogs.com/glshader/archive/2010/11/02/1866971.html

With this mirror reflection matrix we can mirror the camera, which is equivalent to looking out from inside the mirror, and render the world from there. Remember to set a clip plane when rendering; in my test I did not.
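For reference, reflecting about the plane ax + by + cz + d = 0 (with unit normal (a, b, c)) is the standard construction P' = P - 2(n·P + d)n. Below is a small self-contained helper of my own (not an Irrlicht routine) that builds the 4x4 matrix row-major and applies it to a point:

```cpp
// Build the 4x4 reflection matrix for the plane a*x + b*y + c*z + d = 0
// (the normal (a, b, c) must be unit length), stored row-major with the
// translation in the fourth column. Generic helper for illustration only.
struct Plane { float a, b, c, d; };

void makeReflectionMatrix(Plane p, float m[16]) {
    float a = p.a, b = p.b, c = p.c, d = p.d;
    float r[16] = {
        1 - 2*a*a,   -2*a*b,     -2*a*c,   -2*a*d,
         -2*a*b,    1 - 2*b*b,   -2*b*c,   -2*b*d,
         -2*a*c,     -2*b*c,    1 - 2*c*c, -2*c*d,
            0,          0,          0,         1
    };
    for (int i = 0; i < 16; ++i) m[i] = r[i];
}

// Apply the matrix to a point (column-vector convention, w = 1).
void transformPoint(const float m[16], const float in[3], float out[3]) {
    for (int i = 0; i < 3; ++i)
        out[i] = m[i*4+0]*in[0] + m[i*4+1]*in[1] + m[i*4+2]*in[2] + m[i*4+3];
}
```

With the plane z = 0, the point (1, 2, 3) maps to (1, 2, -3); applying the matrix to the camera's position and target mirrors the camera about the plane.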

Step 2: re-render the world

When re-rendering the world, the mirror needs a special material to apply the reflection texture (projective texture mapping in the mirrored camera's space). This mapping ignores the mirror's own texture coordinates and instead computes projective coordinates from the mirrored camera, then maps the texture onto the mirror. In my test this was implemented with a shader, as a special material for the mirror.

Below is the shader. It is very simple; if anything is unclear, consult some material on projective texturing.

Vertex shader (HLSL)

float4x4    WorldViewProj;
float4x4    MirrorWorldViewProj;
struct VS_OUTPUT
{
    float4 position    :POSITION;

    float3 uv: TEXCOORD0;
};

struct VS_INPUT
{
    float4 position        : POSITION;
    float4 color        : COLOR0;
    float2 texCoord0    : TEXCOORD0;
};

VS_OUTPUT main(VS_INPUT input)
{
    VS_OUTPUT output;
    float4 pos = mul(input.position, WorldViewProj);
    output.position = pos;

    // Compute the reflection texture coordinates

    pos = mul(input.position,MirrorWorldViewProj);
    output.uv.x = 0.5 * (pos.w + pos.x);
    output.uv.y = 0.5 * (pos.w - pos.y);
    output.uv.z = pos.w;
    return output;
}

 

Pixel shader (HLSL)

sampler2D colorMap;
struct PS_OUTPUT
{
    float4 color : COLOR0;  
};

struct PS_INPUT
{
    float4 position    : POSITION;
    float3 uv: TEXCOORD0;
};
PS_OUTPUT main( PS_INPUT input )
{
    PS_OUTPUT output;
    float2 uv = saturate(input.uv.xy / input.uv.z);
    output.color = tex2D(colorMap,uv);
    return output;
}

 

 

As for the RTT operations, Irrlicht's RenderToTexture example already makes them clear, so I won't repeat them here.

Screenshot above; that's a wrap.

posted @ 2013-04-05 00:53 麒麟子 Views (2026) | Comments (1)

Irrlicht engine: realistic water rendering

Screenshots first:

[screenshot]

[screenshot]

I originally set out to build the mirror effect, but the hand-computed mirror reflection matrix never produced the right result when applied to the Irrlicht camera, so I went searching online.

On the official Irrlicht wiki I found this extended WaterNode. I downloaded it, fixed a few bugs, and integrated it into the Terrain demo; the screenshots above show the result.

On my machine the HLSL version works fine; the GL version seems to have an RTT problem.

 

Click here to download the source code.

posted @ 2013-04-04 18:16 麒麟子 Views (1941) | Comments (2)

10 Fun Things to do with Tessellation

Original article: http://castano.ludicon.com/blog/2009/01/10/10-fun-things-to-do-with-tessellation/

Hardware tessellation is probably the most notable feature of Direct3D11.

Direct3D11 was announced at the last Gamefest and a technical preview was released in the November 2008 DirectX SDK. Hardware implementations are expected to be available this year.

Direct3D11 Pipeline

Direct3D11 extends the Direct3D10 pipeline with three new stages: two programmable shader stages (the Hull and Domain Shaders), and a fixed-function stage (the Tessellator). More details can be found here and here.

Rendering of Catmull-Clark subdivision surfaces is often mentioned as the primary application for the tessellation pipeline, but there are many other interesting uses that have not received that much attention.

I thought it would be interesting to take a closer look at those other applications, and submitted a proposal to do that at GDC’09. However, it seems that the organizers do not think tessellation is as interesting as I do, or they didn’t like my proposal, or maybe it’s just that they know I’m a lousy speaker. I will never know, because the gracious feedback of the GDC review committee can be represented by a single boolean.

In any case, here’s a brief overview of the 10 fun things that I was planning to talk about. I don’t get very deep into the technical details, but in future posts I may describe some of these applications more thoroughly. Please, leave your comments if there’s something you would like to learn more about.

PN-TRIANGLES

Curved PN Triangles is a triangle interpolation scheme that operates directly on triangle meshes whose vertices are composed of positions and normals (PN stands for Point-Normal).

PN Triangles

It’s an interesting way of improving visual quality that offers a simple migration path, since assets do not need to be heavily modified.

The PN Triangle evaluation consists of two steps: first, for every triangle of the input mesh, a triangular cubic patch is derived solely from the vertex positions and normals; no adjacency information is required. Then, the resulting patch is subdivided or tessellated for rendering.

The resulting surface is smoother than the polygonal surface, but does not have tangent continuity in general, and that results in shading discontinuities. To hide these discontinuities normals are interpolated independently using either linear or quadratic interpolation. These normals are not the true surface normals, but they provide a smooth appearance to the surface.

This two-step evaluation maps very well to the Direct3D11 tessellation pipeline. The evaluation of the control points can be performed in the Hull Shader, the fixed function tessellator can produce a tessellation pattern in the triangle domain, and the actual surface can be evaluated for each of the tessellated vertices in the Domain Shader.

Scalar Tagged PN-Triangles

In order to support sharp edges, a rim of small triangles is added along the edges. That increases the number of patches, and it's not entirely clear how to properly texture map them. Scalar Tagged PN-Triangles solve that problem in a more elegant way by tagging each crease vertex with three scalars that act as shape controllers and modify the construction of the surface control points. However, this representation does not support crease corners.

SILHOUETTE REFINEMENT

When tessellation is enabled, the only supported primitive type is the patch primitive. In Direct3D11 a patch is an abstract primitive with an arbitrary number of vertices. You can use patches to represent traditional primitives (i.e. a triangle is just a patch with 3 vertices), but this also enables you to represent other input primitives with arbitrary topology and additional connectivity information.

Silhouette Refinement

An interesting extension of PN-Triangle tessellation is to augment the input triangles with the neighboring vertices in order to perform silhouette refinement.

With this additional information it's possible to compute tessellation factors in the Hull Shader based on whether an edge is on the silhouette or in the interior of the mesh. The fixed-function tessellator then uses these edge tessellation factors to produce a semi-regular tessellation pattern, and the Domain Shader transforms it to interpolate the surface.

PHONG TESSELLATION

Phong Tessellation

Phong Tessellation is a geometric version of Phong interpolation, but applied to vertex positions instead of normals.

First, points are interpolated linearly over each triangle using its barycentric coordinates, then the points are projected onto the planes defined by the corner position and normal, and finally the result of the three projections is interpolated again.

This procedure produces a smooth surface comparable to PN Triangles, but its evaluation is much cheaper, since no additional control points need to be computed.
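The three interpolation steps above can be written down directly. Here is a small self-contained sketch (my own illustrative types, not shader or engine code) of Phong tessellation for a single point of a triangle:

```cpp
struct V3 { float x, y, z; };

static V3 add(V3 a, V3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static V3 scale(V3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Project point p onto the plane through corner c with unit normal n.
static V3 projectToPlane(V3 p, V3 c, V3 n) {
    V3 d = {p.x - c.x, p.y - c.y, p.z - c.z};
    return add(p, scale(n, -dot(d, n)));
}

// Phong tessellation of one point: linearly interpolate the corners with
// barycentric coordinates (u, v, 1-u-v), project the result onto each
// corner's tangent plane, then blend the three projections with the same
// barycentric coordinates.
V3 phongTessellate(const V3 pos[3], const V3 nrm[3], float u, float v) {
    float w = 1.0f - u - v;
    V3 p = add(add(scale(pos[0], u), scale(pos[1], v)), scale(pos[2], w));
    V3 p0 = projectToPlane(p, pos[0], nrm[0]);
    V3 p1 = projectToPlane(p, pos[1], nrm[1]);
    V3 p2 = projectToPlane(p, pos[2], nrm[2]);
    return add(add(scale(p0, u), scale(p1, v)), scale(p2, w));
}
```

For a flat triangle whose vertex normals all equal the face normal, the projections are identities and the result reduces to plain linear interpolation; curvature appears only where the corner normals diverge.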

BEZIER SURFACES

Curved surfaces are not only useful for characters, but also for level geometry and objects.

Quake 3 Arena

id Software introduced the use of quadratic Bezier patches for architectural geometry in Quake 3 Arena and has been using them ever since.

Climax Brighton’s Moto GP used cubic Bezier patches to model the motorcycles.

Bezier patches can be evaluated very efficiently, because they don’t need any information about the surrounding mesh. As these games show, tessellation hardware is not required to render these surfaces. However, hardware tessellation will allow doing it much more efficiently, and will facilitate the use of these and more complex surfaces.
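For a concrete picture of why these patches are so cheap to evaluate, here is a self-contained sketch (illustrative code, not from any of the engines mentioned) evaluating a biquadratic Bezier patch, the kind Quake 3 Arena used, at a parameter pair (u, v):

```cpp
struct Point { float x, y, z; };

// Evaluate a biquadratic Bezier patch at (u, v) in [0,1]^2 from its 3x3
// grid of control points, using the tensor product of the quadratic
// Bernstein basis (1-t)^2, 2t(1-t), t^2. No mesh adjacency is needed.
Point evalQuadraticBezierPatch(const Point ctrl[3][3], float u, float v) {
    float bu[3] = {(1 - u) * (1 - u), 2 * u * (1 - u), u * u};
    float bv[3] = {(1 - v) * (1 - v), 2 * v * (1 - v), v * v};
    Point p = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            float w = bu[i] * bv[j];      // tensor-product weight
            p.x += w * ctrl[i][j].x;
            p.y += w * ctrl[i][j].y;
            p.z += w * ctrl[i][j].z;
        }
    return p;
}
```

Each output vertex depends only on the patch's own nine control points, which is exactly why no information about the surrounding mesh is required.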

APPROXIMATION TO SUBDIVISION SURFACES

Rendering of approximated Catmull-Clark subdivision surfaces is probably the most anticipated application of hardware accelerated tessellation. Several approximation methods exist.

Approximation to Catmull Clark Subdivision Surface

Approximating Catmull-Clark Subdivision Surfaces with Bicubic Patches is the most popular one. This approximation constructs a geometry patch and a pair of tangent patches for each quadrilateral face of the control mesh. The geometry patch approximates the shape and silhouette, but does not provide tangent continuity. A smooth normal field is constructed using two additional tangent patches. The approximation supports boundaries and has also been extended to support creases in Real-Time Creased Approximate Subdivision Surfaces.

GPU Smoothing of Quad Meshes proposes an alternative approximation using piecewise quartic triangular patches that have tangent continuity and do not require additional tangent patches to provide a smooth appearance. In Fast Parallel Construction of Smooth Surfaces from Meshes with Tri/Quad/Pent Facets the same approach is extended to approximate triangular and pentagonal faces.

(c) Kenneth Scott, id Software

Gregory patches are a more compact representation that also provides a very similar approximation, but only support quad and triangle control faces.

The availability of sculpting tools like ZBrush and Mudbox makes it possible to create highly detailed meshes. Displaced subdivision surfaces provide a compact and efficient representation for these meshes.

RENDERING GEOMETRY IMAGES

Another approach to render highly detailed surfaces is to use geometry images. While geometry images can be rendered very efficiently, their video memory requirements are generally higher than displacement maps due to the lack of high precision texture compression formats. Traditional animation algorithms are not possible with this representation, and view dependent tessellation level evaluation is complicated, because geometry information is not directly available at the Hull Shader stage. However, geometry images may be the fastest approach to render small static objects at fixed tessellation levels.

TERRAIN RENDERING

Terrain rendering is one of the most obvious applications for tessellation. The flexibility of the tessellation pipeline enables the use of sophisticated algorithms to evaluate the level of refinement of the terrain patches, and frees you from having to worry about many of the implementation details.

Saga of Ryzom

It’s also possible to extend traditional terrain engines with arbitrary topologies. Some MMORPGs are already doing that to create more rich environments.

For example Saga of Ryzom, a game that is based on the Nevrax engine, uses cubic patches to model the terrain, which enables them to create impressive cliffs and overhangs.

Saga of Ryzom

Tessellation should make it possible to combine regular heightfields, with caves, cliffs, arches, and other interesting rock formations.

I think that ZBrush or Mudbox would be excellent tools to create natural looking rugged terrain.

HAIR RENDERING

Efficient hair rendering is one of the most interesting applications of the Direct3D11 tessellation pipeline. In addition to triangular and quad patches the fixed function tessellator can also generate lines, which are very useful for applications like hair and fur rendering.

Nalu

The algorithm described in Hair Animation and Rendering in the Nalu Demo maps very well to the tessellation pipeline.

As shown in Real-Time Rendering of Realistic Hair, the use of the hardware tessellation pipeline makes it very easy to simulate and render realistic hair with high geometric complexity in real-time.

That’s possible, because the simulation is performed only on a few hundred guide hairs, that are expanded by the tessellator into thousands of hair strands.

RENDERING PANORAMAS

Another application for tessellation is performing arbitrary non-linear projections, which is useful, for example, to create real-time panoramas.

Since graphics hardware relies on homogeneous linear interpolation for rasterization, arbitrary projections and deformations at the vertex level result in errors unless the surface is sufficiently refined.

PanQuake

The traditional image-based approach is to render the scene to a cube map and then perform an arbitrary projection of the cubemap to screen space, relying on texture hardware to do the sampling and interpolation. This was the approach taken in Fisheye Quake and PanQuake.

While that works well, it requires rendering the scene to the 6 cube faces, and sometimes results in oversampling or undersampling of some areas of the scene.

panorama

Dynamic Mesh Refinement on GPU using Geometry Shaders proposes the use of the geometry shader to dynamically refine the surfaces to prevent linear interpolation artifacts. However, the Geometry Shader operates sequentially and is not well suited for this task. On the other hand, the dynamic mesh refinement algorithm maps well to the Direct3D11 tessellation pipeline.

RENDERING OF 2D CURVED SHAPES

While GPUs can render simple polygons, they are not able to automatically handle complex concave and curved polygons with overlaps and self-intersections without prior triangulation and tessellation.

SVG Tiger

The Direct3D11 tessellation pipeline is not designed to perform triangulation. However, there's a well-known method to render arbitrary polygons using the stencil buffer that can be used in this case. This method was first described in the OpenGL Red Book, but was recently popularized by its implementation in the Qt graphics library.

It’s possible to combine this technique with hardware tessellation to render curved tessellated shapes without the need of expensive CPU tessellation and triangulation algorithms.

posted @ 2013-04-01 00:18 麒麟子 Views (397) | Comments (0)

Some issues in BSP construction

Using BSP to partition indoor game scenes is a traditional but effective technique. The quality of the constructed BSP determines how robust it is in use, since the BSP will later drive rendering, physics interaction, and other operations. Building a BSP is simple yet tedious: the logic is straightforward, but splitting nodes raises quite a few issues. Below is a brief summary of my experience building one.

BSP tree construction steps:

1. Obtain the scene's polygon list (how the polygons are organized does not matter as long as they carry enough information; I use indices. Note: the list must carry front-face normals or a front/back flag for each polygon).

2. Check whether the polygon set at the current node is convex. If so, mark the node as a leaf; otherwise continue with the steps below. (The convexity test must be consistent with the splitting-plane selection method; if they contradict each other, infinite splitting can occur.)

3. Walk the planes of all polygons in the current node's set and find the best splitting plane. (There are many selection criteria to choose from depending on your situation; I use one that keeps the BSP tree balanced.)

4. Mark all polygons coplanar with the splitting plane as used; child nodes no longer use them for splitting.

5. Classify the current node's polygons against the best splitting plane and store the resulting sets in the child nodes (the concrete operation should match how the polygon list is organized).
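Step 5 hinges on classifying each polygon against the chosen splitting plane. Here is a minimal self-contained sketch (an assumed data layout for illustration, not my project's code):

```cpp
#include <vector>

// Classify a polygon against a splitting plane a*x + b*y + c*z + d = 0.
// Vertices on the positive side count as "front", on the negative side as
// "back"; a polygon with vertices on both sides must be SPLIT, and one
// whose vertices all lie on the plane is COPLANAR.
enum Side { FRONT, BACK, SPLIT, COPLANAR };

struct V3 { float x, y, z; };
struct Plane4 { float a, b, c, d; };

Side classifyPolygon(const std::vector<V3>& verts, Plane4 p) {
    const float eps = 1e-5f;   // thickness tolerance for "on the plane"
    int front = 0, back = 0;
    for (const V3& v : verts) {
        float dist = p.a * v.x + p.b * v.y + p.c * v.z + p.d;
        if (dist > eps) ++front;
        else if (dist < -eps) ++back;
    }
    if (front && back) return SPLIT;
    if (front) return FRONT;
    if (back) return BACK;
    return COPLANAR;
}
```

A SPLIT polygon would then be divided by the plane, and a COPLANAR one handled according to its facing direction, as described in the notes that follow.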

Issues to watch out for:

1. If the current polygon set is convex, no splitting plane should be found; for a non-convex set, a splitting plane must always exist. (Problems here usually appear when the splitting-plane selection and the convexity test are inconsistent.)

2. Handling polygons coplanar with the current splitting plane: if a polygon faces the same direction as the plane, put it in the front node's set; if it faces the opposite direction, put it in the back node's set.

3. For a convex polygon set, checking every polygon against all the others should yield: front >= 0, back = 0, split = 0, overlap_samedir >= 0, overlap_diffdir = 0.

4. The criterion for choosing the best splitting plane must be consistent with the convexity test; otherwise you can fail to find a splitting plane for a non-convex set.

The image below shows the BSP partition of a scene (rendered with faces; different colors indicate different leaf nodes):

[Figure: BSP partition result; one color per leaf node]

Once the BSP is built, Portal insertion can follow. To be continued...

http://blog.csdn.net/bugrunner/article/details/5259174

posted @ 2013-04-01 00:13 麒麟子 Views (343) | Comments (0)

久久综合狠狠综合久久激情| 亚洲欧洲一区二区三区| 午夜视频一区在线观看| 欧美激情1区| 国产视频综合在线| 亚洲一区二区三区精品在线观看 | 亚洲人人精品| 久久se精品一区二区| 日韩视频在线观看| 久久深夜福利| 国产一级一区二区| 亚洲女人小视频在线观看| 欧美韩国日本综合| 欧美一区二区视频在线观看2020 | 国产偷国产偷精品高清尤物| 亚洲美女毛片| 你懂的视频欧美| 亚洲欧美日韩高清| 欧美日韩国产综合新一区| 一区二区在线看| 久久久久久91香蕉国产| 中文一区在线| 欧美视频日韩视频|