A reflection on the state of simming...


No, really: simulated radar reflection:



Here's a GPU-shader-powered radar backscatter image, dynamically generated from the normal map provided by the new API detailed in the posts below...



The original normal map used to calculate backscatter.



Another test image, this time with the radar located at (0.5, 0.0).
This version is much simpler, displaying a wider backscatter area for easier debugging.
It also lacks color and height mapping.
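Out of interest, here's a rough sketch of how a backscatter pass in this spirit could be put together. This is not the shader used for the images above; it just assumes a simple dot-product model where texels whose decoded normal faces back toward the radar return more energy, and the uniform names, grazing-angle constant and GLSL version are all placeholders.

```
// Hypothetical fragment shader, held as a C++ string constant.
// Toy model only: brightness = how directly the surface faces the radar.
static const char* kBackscatterFrag = R"GLSL(
#version 120
uniform sampler2D normalMap;   // RGB-encoded normals from the map API
uniform vec2      radarPos;    // radar position in texture space, e.g. (0.5, 0.0)
varying vec2      uv;

void main()
{
    // Decode the stored normal from [0,1] back to [-1,1].
    vec3 n = normalize(texture2D(normalMap, uv).rgb * 2.0 - 1.0);

    // Direction from this texel toward the radar, with an arbitrary
    // shallow grazing angle so flat terrain still returns something.
    vec3 toRadar = normalize(vec3(radarPos - uv, 0.25));

    // Lambert-style return: surfaces facing the radar scatter more
    // energy back. Real radar models are considerably more involved.
    float intensity = max(dot(n, toRadar), 0.0);

    gl_FragColor = vec4(vec3(intensity), 1.0);
}
)GLSL";
```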



New mapping API:

This is the major news for this release: it brings a whole new level of power to Gizmo64 for mapping the DSF files for use with synthetic vision systems, TAWS, moving maps and more.


Here are some sample images:


Heat mapped to show height, normal mapped to show definition.




Composite Normal map made by joining two adjacent sample tiles.



Composite Height map made by joining two adjacent sample tiles.


This set of images is a composite of two different tile data sets.
The images were joined in Photoshop by pixel alignment only. There is no detectable seam.




An earlier height map example showing a different tonal range.


I am creating more reference code that will show how to tile and manage the data sets in the simulator at runtime.


The height map is colored using a simple shader and this texture lookup table stored as a PNG file.


The color table is very simple to replace, with each horizontal pixel matching a height in the heightmap.
Colors start at the left-hand side, representing the lowest terrain, and finish with the highest terrain color at the right.
Your color map texture should be exactly 256 pixels wide.
If it is more than one pixel high, only the bottom row of pixels is used.

Black height map pixels read as 0.0, representing a low altitude.
White height map pixels read as 1.0, representing a high altitude.

A heightmap value of 0.25 (a dark grey tone) would cause the shader to select a color about a quarter of the way in from the left-hand edge of the color map texture.


The shader is fairly crude at the moment.
It was nice to find a solution that avoids any branching in the shader code.
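As a rough illustration of that branch-free lookup (not the actual shipped shader; the sampler names and GLSL version are assumptions), the height sample can simply be reused as the horizontal texture coordinate into the 256-pixel-wide color table:

```
// Hypothetical fragment shader, held as a C++ string constant.
static const char* kHeightColorFrag = R"GLSL(
#version 120
uniform sampler2D heightMap;   // greyscale: 0.0 = lowest, 1.0 = highest
uniform sampler2D colorTable;  // the 256 x 1 PNG lookup table
varying vec2      uv;

void main()
{
    // Read the height, then use it directly as the horizontal
    // coordinate into the color table -- no if/else required.
    // A height of 0.25 lands about a quarter of the way from the left.
    float h = texture2D(heightMap, uv).r;
    vec3  c = texture2D(colorTable, vec2(h, 0.5)).rgb;
    gl_FragColor = vec4(c, 1.0);
}
)GLSL";
```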



Synth Vision:

Significant progress has been made towards providing an in-sim synth vision display.

The new map API includes functions for converting terrain data into both a TriMesh (more on these later) and, optionally, directly into an OpenGL VBO* handle.

There are some "fitting" errors; this is to be expected when you translate floating-point numbers through two different coordinate sets three times...

The error appears to be about 120 meters horizontally on both the X and Z axes. It may be possible to find an offset value that results in a better fit of the terrain; experiments yield good results but nothing definitive yet.

World > GL Query > Probe > GL Response > World(Fix gravity normal) > GL Query > RE-Probe > GL Response > World(Altitude instead of probe returned Y value... *phew*)
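For anyone wondering what that chain looks like in code, here is a rough C++ sketch of one way to do the double pass against the X-Plane SDK probe calls. It is my own illustration of the steps listed above, not the code inside Gizmo64, and the helper name is made up.

```
#include "XPLMGraphics.h"   // XPLMWorldToLocal / XPLMLocalToWorld
#include "XPLMScenery.h"    // XPLMCreateProbe / XPLMProbeTerrainXYZ

// Illustrative helper: probe the terrain height under a lat/lon using the
// world > local > probe > world round trip twice, as described above.
bool ProbeElevation(XPLMProbeRef probe, double lat, double lon, double* outEleMtr)
{
    double x, y, z, lat2, lon2, ele;

    // World > GL Query
    XPLMWorldToLocal(lat, lon, 0.0, &x, &y, &z);

    // Probe
    XPLMProbeInfo_t info;
    info.structSize = sizeof(info);
    if (XPLMProbeTerrainXYZ(probe, (float)x, (float)y, (float)z, &info) != xplm_ProbeHitTerrain)
        return false;

    // GL Response > World
    XPLMLocalToWorld(info.locationX, info.locationY, info.locationZ, &lat2, &lon2, &ele);

    // World > GL Query, this time at the altitude we just learned
    XPLMWorldToLocal(lat, lon, ele, &x, &y, &z);

    // RE-Probe
    if (XPLMProbeTerrainXYZ(probe, (float)x, (float)y, (float)z, &info) != xplm_ProbeHitTerrain)
        return false;

    // GL Response > World: report the converted altitude rather than
    // the raw Y value returned by the probe.
    XPLMLocalToWorld(info.locationX, info.locationY, info.locationZ, &lat2, &lon2, &ele);
    *outEleMtr = ele;
    return true;
}

// Usage: XPLMProbeRef probe = XPLMCreateProbe(xplm_ProbeY);
```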


Here are some example images from the core of the synth vision system:



Wide area long distance terrain fit.



Close-up at the airfield, where lots of ground-based effects are possible.
Vertical offset error is about 20 centimeters in this shot.
Long distance fitting of mountainous terrain is also quite good.




Testing two sample tiles with entirely different query parameters.
Both display good results. FPS impact was minimal.




Long distance wide area tiling test.
Three different sets of data tiled together in OpenGL.
Horizontal fitting is excellent for all tiles.
Vertical fitting displays some errors that are returned by the Terrain Probe API; these may be reduced in future.





TriMesh Data

We also have a new TriMesh API that can be used to create large blobs of vertex data suitable for use with OpenGL's high-performance drawing techniques.

This area of the new API code is lacking in documentation and example code.

It was developed to enable the new synth vision systems and bloomed into a new API for scripts to pass vertex data into C++ and OpenGL efficiently.
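To make that idea concrete, here is a generic sketch of the marshalling step, not the Gizmo64 implementation: the function name is made up and a Lua 5.1 / LuaJIT style C API is assumed. Passing vertex data from a script into C++ usually boils down to walking a flat table of numbers into a contiguous buffer that OpenGL can consume.

```
extern "C" {
#include "lua.h"
}
#include <vector>

// Copy a flat Lua table of numbers {x1, y1, z1, x2, ...} at stack
// index 'idx' into a contiguous float buffer suitable for glBufferData.
std::vector<float> TableToFloats(lua_State* L, int idx)
{
    size_t count = lua_objlen(L, idx);      // table length (Lua 5.1 / LuaJIT)
    std::vector<float> out;
    out.reserve(count);

    for (size_t i = 1; i <= count; ++i)
    {
        lua_rawgeti(L, idx, (int)i);        // push t[i]
        out.push_back((float)lua_tonumber(L, -1));
        lua_pop(L, 1);                      // tidy the stack
    }
    return out;
}
```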

I am planning to create a new OBJ8 loader script example shortly that will demonstrate how to use TriMesh data.

For now, please refer to the documentation:

http://www.gizmo64.com/Gizmo64_API.htm#trimesh.newTriMesh



For more information on the new mapping API please see the reference manual here:

http://www.gizmo64.com/Gizmo64_API.htm#map.newQueryPoints

...and also the working example code here:

https://github.com/benrussell/Gizmo-Open-Extensions/tree/master/MapHarness




The shaders API has also had some new code added to it.
We can now do shader-powered, multi-textured (up to 16 textures!) custom (OBJ8) drawing effects.
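For context, here is what driving that many textures looks like at the plain OpenGL level: each texture goes on its own texture unit and the shader sees it through a sampler uniform. This is a generic sketch, not the Gizmo64 shaders API; the uniform naming scheme and function name are mine.

```
#include <GL/glew.h>
#include <cstdio>

// Bind up to 16 textures for a custom drawing pass.
// Assumes glUseProgram(program) has already been called and that the
// shader declares uniform sampler2D tex0 .. tex15 (naming is arbitrary).
void BindTextureSet(GLuint program, const GLuint* textures, int count)
{
    for (int i = 0; i < count && i < 16; ++i)
    {
        glActiveTexture(GL_TEXTURE0 + i);           // select texture unit i
        glBindTexture(GL_TEXTURE_2D, textures[i]);  // attach the texture to it

        char name[8];
        snprintf(name, sizeof(name), "tex%d", i);   // "tex0" .. "tex15"
        glUniform1i(glGetUniformLocation(program, name), i);
    }
    glActiveTexture(GL_TEXTURE0);                   // leave unit 0 active
}
```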




* Drawing with VBO data unlocks the speed of modern graphics cards.
Gizmo64 is now free from the limits of LuaGL in terms of custom OpenGL drawing, and we should start to see some really interesting effects in the near future.
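As a generic illustration of why that matters (again a sketch, not Gizmo64 code; legacy fixed-function vertex arrays are assumed for brevity), the VBO path uploads the vertex data to the GPU once and then draws it straight from graphics memory each frame:

```
#include <GL/glew.h>
#include <vector>

// Upload x,y,z triplets into a VBO once.
GLuint UploadTriMesh(const std::vector<float>& xyz)
{
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, xyz.size() * sizeof(float),
                 xyz.data(), GL_STATIC_DRAW);        // one-time upload
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    return vbo;
}

// Draw the mesh every frame directly from GPU memory.
void DrawTriMesh(GLuint vbo, GLsizei vertexCount)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const void*)0); // 3 floats per vertex
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```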


