TextureAssign for Unity

Ever wondered why every Substance Painter to Unity workflow tutorial shows the Unity scene with all textures already applied? Because this process is incredibly tedious and time-consuming. Each material has 4 or 5 textures which you need to find and then drag and drop onto the material.

The TextureAssign editor script solves this problem by automating the assignment of textures to materials in Unity. It works by name-matching textures with materials.
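
As a rough illustration of the name-matching idea (this is a sketch, not the actual plugin source; the "_BaseColor" suffix, the _MainTex slot, and the menu path are assumptions based on typical Substance Painter export names):

using UnityEngine;
using UnityEditor;

// Hypothetical sketch of name-matching textures to materials.
// The real TextureAssign plugin handles all texture channels
// (base color, metallic, normal, ...), not just one.
public static class TextureAssignSketch
{
    [MenuItem("Tools/Assign Textures By Name")]
    static void AssignTextures()
    {
        // Visit every material in the project.
        foreach (string guid in AssetDatabase.FindAssets("t:Material"))
        {
            string matPath = AssetDatabase.GUIDToAssetPath(guid);
            Material mat = AssetDatabase.LoadAssetAtPath<Material>(matPath);

            // Find a texture named "<MaterialName>_BaseColor" and assign it.
            foreach (string texGuid in AssetDatabase.FindAssets(mat.name + "_BaseColor t:Texture2D"))
            {
                string texPath = AssetDatabase.GUIDToAssetPath(texGuid);
                mat.SetTexture("_MainTex", AssetDatabase.LoadAssetAtPath<Texture2D>(texPath));
            }
        }
        AssetDatabase.SaveAssets();
    }
}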

[IMG: textureassign]

Get it here:
https://share.allegorithmic.com/libraries/2396


A320 cockpit CAD model renders

Here are some new renders I made from the A320 cockpit CAD model. It now includes all panel text, which is embossed into the geometry. View the screenshots at full size by clicking on the thumbnail and then selecting “view full size” on the bottom right.

The CAD model is available for purchase. Please use the contact form at www.airlinetools.com for inquiries.

Using sprites for button states

Changing the state of a button in Unity (ON, OFF, FAULT, etc.) can be done in a few different ways. The easiest is to make a separate material for each state and swap the material at runtime. However, this is not good for performance.

There is a better way: use a texture atlas (sprite sheet) and shift the UVs. This way the operation runs entirely on the GPU, which is many times faster. It is not as easy to set up though, so here is a detailed description of how to do it.

First you need to render separate emissive textures for each button state. In most cases there are 4 states: no lights, ON light, FAULT light, and both ON and FAULT lights. This is great because each texture can be stored in a corner of the main texture, making it both efficient and easy to set up.

The textures can be rendered in Substance Painter by creating a layer for each button state, each with different emissive materials placed using the ID map. Then disable all emissive layers except one and export the textures. Rename the emissive texture and export again with another button state enabled. Do this for each button state until you have 4 separate textures. Click on the thumbnail for a better view. I wrote a plugin for Substance Painter which makes exporting the textures easier. You can find it here:

https://share.allegorithmic.com/libraries/2319

Once the plugin is installed, just press the “Export Emissive” button and it will automatically save the emissive channel and rename the texture as necessary.

Below you can see the 4 exported emissive channel textures. Note that they are oriented on their side. This is due to the automatic UV unwrapping from Unwrella. It might not look nice, but it is completely irrelevant in this workflow.

[IMG: emission-textures]

The next step is to place each of the 4 textures in a corner of a new, bigger texture. This can be done in GIMP using a plugin called “Fuse layers”. You can find the plugin here:
http://registry.gimp.org/node/25129

Once the plugin is installed, fuse the 4 textures into a single one:
-File -> New -> set the same resolution as one of the input images. The resolution of the final image will be increased automatically.
-Set Image -> Mode to RGB.
-File -> Open as Layers -> select all 4 images.
-Delete the background layer.
-Filters -> Combine -> Fuse layers. Set x = 2.
-To save, use File -> Export.
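
If you prefer to skip GIMP, the same fusing can also be scripted. A minimal Unity-side sketch (assuming the 4 textures have equal sizes and are imported with Read/Write enabled; the corner order matches the UV offsets used later):

using UnityEngine;

// Pack 4 equally sized, readable textures into the corners of one atlas.
// Corner order matches the UV offsets used later in the shader:
// (0,0) = bottom left, (0.5,0) = bottom right, and so on.
public static class EmissiveAtlas
{
    public static Texture2D Fuse(Texture2D bl, Texture2D br, Texture2D tl, Texture2D tr)
    {
        int w = bl.width, h = bl.height;
        var atlas = new Texture2D(w * 2, h * 2, TextureFormat.RGBA32, false);
        atlas.SetPixels(0, 0, w, h, bl.GetPixels()); // bottom left
        atlas.SetPixels(w, 0, w, h, br.GetPixels()); // bottom right
        atlas.SetPixels(0, h, w, h, tl.GetPixels()); // top left
        atlas.SetPixels(w, h, w, h, tr.GetPixels()); // top right
        atlas.Apply();
        return atlas;
    }
}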

Now we have a single texture containing a button state in each corner:

[IMG: emissive-combined-example]

This texture can’t be used as-is because the UV mapping needs to be changed in code. This is what happens when you apply the texture in Unity without any UV modifications:

[IMG: emissive-unity-default]

To fix this, the Unity standard shader needs to be modified. Here is how to do that:

-Download the Unity built-in shaders.
-Copy UnityStandardInput.cginc and put it into your project.
-Copy Standard.shader, rename it to StandardShift.shader, and put it into your project.

-Open StandardShift.shader.
-Modify the first line to:
Shader "StandardShift"

-Place this code just below the _DetailNormalMap("Normal Map", …) line:
_EmissionTileOffset("EmissionTileOffset", Vector) = (1,1,0,0)

-Open UnityStandardInput.cginc.
-Place this code just below the line "sampler2D _EmissionMap;":
half4 _EmissionTileOffset;

-Search for this function:
half3 Emission(float2 uv)
-Place this code just above the line "return tex2D(…":
uv.x *= _EmissionTileOffset.x;
uv.y *= _EmissionTileOffset.y;
uv.x += _EmissionTileOffset.z;
uv.y += _EmissionTileOffset.w;

-Create a material and set its shader to StandardShift.
-Add the material to an object.
-Place the texture with the 4 button states in the emissive slot.
-Create a script and add it to the object.
-Add this code to the script to show each button state:

// The vector format is: Tile X, Tile Y, Offset X, Offset Y.
Renderer rend = GetComponent<Renderer>();
// Bottom left.
rend.material.SetVector("_EmissionTileOffset", new Vector4(0.5f, 0.5f, 0f, 0f));
// Bottom right.
rend.material.SetVector("_EmissionTileOffset", new Vector4(0.5f, 0.5f, 0.5f, 0f));
// Top left.
rend.material.SetVector("_EmissionTileOffset", new Vector4(0.5f, 0.5f, 0f, 0.5f));
// Top right.
rend.material.SetVector("_EmissionTileOffset", new Vector4(0.5f, 0.5f, 0.5f, 0.5f));
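
Calling these four SetVector lines in a row would of course just leave the last state visible; in practice you switch between them over time. Here is a minimal sketch of such a component (the class name and the one-second timing are arbitrary):

using UnityEngine;

// Sketch: cycle through the four button states, one per second, by
// moving the emission UV offset to a different corner of the atlas.
public class ButtonStateCycler : MonoBehaviour
{
    // Vector format: Tile X, Tile Y, Offset X, Offset Y.
    static readonly Vector4[] States =
    {
        new Vector4(0.5f, 0.5f, 0f,   0f),   // bottom left
        new Vector4(0.5f, 0.5f, 0.5f, 0f),   // bottom right
        new Vector4(0.5f, 0.5f, 0f,   0.5f), // top left
        new Vector4(0.5f, 0.5f, 0.5f, 0.5f), // top right
    };

    Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        int state = (int)Time.time % States.Length;
        rend.material.SetVector("_EmissionTileOffset", States[state]);
    }
}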

Now we can cycle through the different button states using a script in Unity. Here is the result:

[Video]

Inventor for Game Design #2

I released the InvToSP script along with a few tutorials which describe how to get an Autodesk Inventor model into Substance Painter for texturing.

Here is a playlist of the tutorials:

[Video playlist]

Script features:
-Significant workflow speedup.
-Automatically import high poly and low poly models from a supplied directory.
-Fix flipped faces using automated re-import.
-Dynamically adjust mesh resolution.
-Fuse objects (collapse, attach) without destroying the explicit normals.
-Supports Unwrella for automatic unwrapping.
-Convert materials to FBX compatible materials.
-Name modification for Substance Painter “match by name” baking.
-Add nearby objects to high res model for AO baking.
-Assemble standalone parts into final model.

You can find the script and manual here:
http://www.scriptspot.com/3ds-max/scripts/inventor-to-substance-painter

Rendering light sources

The funny thing is that there are thousands of references available on how a light affects an object, but the references on how the light itself looks can be counted on one hand. I once found a scientific paper, but that’s about it. Perhaps that is why very few people get it right. Often you see an emissive sphere with a flare sprite slapped on top of it, but that is a far cry from the physically based approach I will describe here.

Most lights have a lens, which makes them either highly directional, like a flashlight, or horizontally directional, the result of a cylindrical Fresnel lens. This directional behavior is simulated with a phase function, which shows nicely on a polar graph. Here you can see two common light radiation patterns:

[IMG]

The blue graph has the function 1 + cos(theta*2), where theta is the angle between the light normal and the vector from the light to the camera. The output of the function is the irradiance. Adding this to the shader gives the lights a nice angular effect.
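
In the shader this is just a few instructions per light. Here is the same math in C# form for clarity (the real version lives in the shader; the names are illustrative):

using UnityEngine;

public static class LightPhase
{
    // Phase function 1 + cos(theta*2), where theta is the angle between
    // the light's normal and the vector from the light to the camera.
    public static float Irradiance(Vector3 lightNormal, Vector3 lightPos, Vector3 cameraPos)
    {
        Vector3 toCamera = (cameraPos - lightPos).normalized;
        float cosTheta = Mathf.Clamp(Vector3.Dot(lightNormal.normalized, toCamera), -1f, 1f);
        float theta = Mathf.Acos(cosTheta);
        return 1f + Mathf.Cos(theta * 2f);
    }
}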

[IMG]

Next is the attenuation. Contrary to popular belief, focused lights (in the extreme case, lasers) still attenuate with the inverse square law, as described here:
http://www.quora.com/Is-the-light-f…distance-grows-similar-to-other-light-sources

But contrary to even popular scientific belief, lights themselves don’t behave in quite the same way, or at least not perceptually. The inverse square law states that the intensity is inversely proportional to the square of the distance. Because of this:

[IMG]

You see this reference all over, for example here:

[IMG]

Yet the light itself is brighter than bar number 4, which is at about the same distance from the camera as the light. The light itself doesn’t seem to attenuate with the inverse square law. So why is this? It turns out that in order to model high gain light sources (such as directional lights), you need to place the virtual source location far behind the actual source. Then you can apply the inverse square law like this:

[IMG]

Note that highly directional lights have a very flat attenuation curve, which can be approximated with a linear function if needed in order to save GPU cycles.
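
A sketch of this attenuation model (the virtualOffset parameter, i.e. how far behind the physical source the virtual source sits, is a per-light-type tuning value, and the name is mine):

using UnityEngine;

public static class LightAttenuation
{
    // Inverse square law with the source moved virtualOffset units behind
    // its physical location. virtualOffset = 0 gives plain inverse square
    // falloff; larger values give the flatter curve of high gain lights.
    public static float Attenuate(float intensity, float distance, float virtualOffset)
    {
        float d = distance + virtualOffset;
        return intensity / (d * d);
    }
}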

Some more reading about the subject here (chapter Validity of the Inverse Square Law):
http://blazelabs.com/f-u-photons.asp

One other problem is that a light will disappear if it gets too far from the camera. This is the result of the light becoming smaller than one pixel. That is fine for normal objects, but not for lights, because even extremely distant or small lights are easily visible in real life, a star for example. It would be nice if we had a programmable rasterizer, but so far no luck. Instead, I scale the lights up when they are smaller than one pixel, so they keep the same screen size. Together with the attenuation, this gives a very realistic effect. All of this is done in the shader, so it is very fast: about 0.4 ms for 10,000 lights on a GTX 780 Ti.
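
The scale-up follows from the camera projection. A sketch of the idea in C# (the shader does the same math per light; a perspective camera is assumed):

using UnityEngine;

public static class LightScaler
{
    // Returns the scale factor that keeps a light of the given world-space
    // size covering at least minPixels pixels on screen.
    public static float ScaleFactor(Camera cam, Vector3 lightPos, float worldSize, float minPixels = 1f)
    {
        float distance = Vector3.Distance(cam.transform.position, lightPos);
        // World-space size of one pixel at this distance.
        float pixelWorldSize = 2f * distance * Mathf.Tan(cam.fieldOfView * 0.5f * Mathf.Deg2Rad) / cam.pixelHeight;
        return Mathf.Max(1f, minPixels * pixelWorldSize / worldSize);
    }
}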

Since I made this system for a flight simulator, I included some lights specific to aviation, like walking strobe lights (also done entirely in the shader):

[IMG]

And PAPI lights, which are a bit of a corner case. They radiate light in a split pattern like this (used by pilots to see if they are high or low on the approach):

[IMG]

Simulated here, also entirely in the shader.

[IMG]

Normally there are only 4 of these lights in a row, but here are 10,000, just for the fun of it. They have a small transition where the colors are blended (just like in reality), which you won’t find in any simulator product, even multi-million-dollar professional simulators. That’s a simple lerp() by the way.

I should also note that the shaders don’t use any conditional if-else statements, but use lerp, clamp, and scaling trickery instead. So they play nicely even on low-end hardware.
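
As an example of the branchless style, the PAPI red/white transition can be written with just a clamp and a lerp (a sketch; the angle parameters and the width of the transition band are illustrative):

using UnityEngine;

public static class PapiBlend
{
    // Below the threshold angle the light reads red, above it white,
    // with a small blended transition band instead of an if/else.
    public static Color Blend(float angleDeg, float thresholdDeg, float transitionDeg)
    {
        float t = Mathf.Clamp01((angleDeg - thresholdDeg) / transitionDeg + 0.5f);
        return Color.Lerp(Color.red, Color.white, t);
    }
}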

Released for free, but with limited support:
https://www.assetstore.unity3d.com/en/#!/content/46409

WebGL:
https://googledrive.com/host/0Bwk4bDWv3jAcUkZBc2dNQ0RNT2M
Controls: same as in the editor.
Speed up/slow down: 1 and 2 (not numpad)
Bloom is SE Natural Bloom.
It looks best in fullscreen.
Note that the frame rate is much higher in a standalone build.

Video:

[Video]

Hole in a curved surface

Previously, adding a hole on a complex curved surface was not easy to do in Inventor. With the new Curve On Face feature in Inventor 2017, this has become much easier. This is how it’s done:

-Start a 3D sketch.
-Select “Curve On Face”.
-Draw a 3 point spline on the surface, with a 4th point connecting back to the first one to close the loop.

[IMG: hole on surface]

-Finish the 3D sketch.
-Create a plane from 3 points, using the 3 points of the spline.

[IMG: hole on surface 2]

-Create a 2D sketch on the plane.
-Add a circle on the 2D sketch and place it in the center of the spline points. The circle will be used to extrude the hole.
-Finish the 2D sketch.
-Create an axis by selecting “Normal to Plane through Point” and then selecting the mid point and the work plane. This axis can be used later to align a screw.

[IMG: hole on surface 3]

-Add a work point at the center point. This will be used later to place a screw.
-Extrude the circle with a cut operation. You might have to extrude it both ways to create a clean cut.

[IMG: hole on surface 4]

-Hide the 3D sketch and the work plane, leaving only the hole, work point, and the work axis behind.

[IMG: hole on surface 5]

-Now you can add a screw, rod, etc. in an assembly.

[IMG: hole on surface 6]

A320 CAD design

Here are some screenshots of the finished A320 cockpit CAD model. The CAD model is available for purchase. Please use the contact form at www.airlinetools.com for inquiries.

Experimental texturing of a monitor in Substance Painter:

[IMG: monitor 2]

Test of a leather material in Substance Painter:

[IMG: leather]

Test of a leather material (armrest) in Substance Painter:

[IMG: leather 2]

Throttle unit. The text is embossed into the geometry:

[IMG: full 9]

Some more screenshots:

[IMGs: full 1 to full 8]