Ok, let's leave aside the issue of having a different texture on each face and focus first on handling the local transformation.
You say you already have the system working without a local transformation, which implies that you have managed to generate UV coordinates (for sampling the textures) and normals (for blending the textures) from the final world space positions of the vertices?
In this case you should be able to transform your world space positions and normals back to local space. In your Ogre .program or .material you are probably using something like:
Code:
param_named_auto world world_matrix
param_named_auto viewProj viewproj_matrix
to pass these values to the vertex shader? If so, you can try passing the following matrix to the pixel shader: 'inverse_world_matrix'. I think that's the one you want, but have a look at this page to see your options. You can then use it to transform your normal back to local space, and also your positions (probably before you turn them into UV coordinates).
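For what it's worth, here is a rough sketch of what the pixel shader side might look like. It's Cg/HLSL-style, and all the names (invWorld, worldPos, worldNormal, the texture binding) are placeholders rather than anything from your actual setup. It also assumes your world transform has no non-uniform scale, otherwise the normal needs the transpose of the world matrix instead:
Code:
// in the .material / .program, alongside world and viewProj:
//   param_named_auto invWorld inverse_world_matrix

float4 main(float3 worldPos    : TEXCOORD0,   // world space position from the vertex shader
            float3 worldNormal : TEXCOORD1,   // world space normal from the vertex shader
            uniform float4x4 invWorld,        // inverse_world_matrix from Ogre
            uniform sampler2D tex : register(s0)) : COLOR
{
    // transform position back to local (object) space
    float3 localPos    = mul(invWorld, float4(worldPos, 1.0)).xyz;

    // w = 0 so the translation part is ignored; only really correct
    // if the world matrix has no non-uniform scale
    float3 localNormal = normalize(mul(invWorld, float4(worldNormal, 0.0)).xyz);

    // then build your UVs and blend weights from localPos / localNormal
    // exactly as you were doing with the world space values, e.g.:
    float2 uv = localPos.xz;
    return tex2D(tex, uv);
}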
The above is untested... fingers crossed