PBR just works
But GI is a problem.
Also, I render my terrain multipass, 4 materials per pass... with Surface shaders!
It seems to work like a charm. A manual blending mode and the keepalpha pragma make my splat shader actually work. You can see that I achieved the material blending with a per-vertex alpha value for correction, and a Splatting Passes Texture that is sampled using vertex texture fetch.
This way I achieved multi-material support: it's per-vertex, but it looks perfect.
After I fix all the splatting bugs, I will describe how the terrain rendering works for me. I had to use two sets of normals, however: the PolyVox normals for splatting and Unity's default normals for lighting. I still have some problems with normals. When I render Unity's normals they look yellowish, but lighting works. PolyVox's normals look red/green/blue-ish; splatting works with them, but lighting is completely broken, and I have no idea why. And the opposite happens too: if I use Unity's mesh normals (which also come from PolyVox), splatting is broken but lighting works.
Unity is doing something with the normals, probably storing them differently, so I get different values for lighting than for splatting.
The triplanar texturing shader is the old Ogre function, heavily modified to avoid glitches, but it works. See the "Bug" screenshot.
Overall, what I am doing now:
I am storing the game world in chunks of my own gfArray3d&lt;VT&gt;. VT is a struct I posted earlier, but a little bit smaller now.
When I extract the Marching Cubes mesh, I construct a 20x20x20 Polyvox::RawVolume&lt;uint8_t&gt; and fill it only with the densities. I run the extractor on it, then I run the decimator (0.8) on it, and keep both meshes: the original mesh is used for rendering, the decimated one for physics.
For each vertex in the original mesh, I take the vertex position and calculate the trilinearly interpolated alpha values for it. Materials map to colors: one RGBA channel is set to 1 per material, so when I interpolate these, I get a "color" that encodes the splatting alpha levels. I do this based on the passes data, and generate an NPOT texture with width = vertex count, height = pass count, where each pixel holds the interpolated splatting alphas.
I pass the vertex, index, and alpha buffers plus the splatting texture to Unity and construct the mesh data there: both meshes (rendering and physics) and the 2D texture with splatting alphas per vertex/pass. Then I set up the shader passes and render them.
By using MeshRenderer.sharedMaterials[] with more than one generated material, you make Unity render the mesh in multiple passes.
Then smile... and time to go to lunch now.