Sunday, May 20, 2012

How to use Mia material with Maya's ambient light pt2.

This is a follow-up to the post, How to use Mia material with Maya's ambient light.

The following MEL script creates a Lambert material and connects it to all the existing mia_material_x_passes materials in one shot.

// create a Lambert material named ambPicker
string $ambPicker = `shadingNode -asShader lambert -name ambPicker`;
setAttr ($ambPicker + ".color") -type double3 1 1 1;
setAttr ($ambPicker + ".diffuse") 0;
setAttr ($ambPicker + ".miFrameBufferWriteOperation") 0;

// make a connection between the ambPicker and the mia materials
string $mia_material[] = `ls -type "mia_material_x_passes"`;
for ($member in $mia_material) {
    int $connectivity = `isConnected ($ambPicker + ".outColor") ($member + ".ao_ambient")`;
    if ($connectivity) {
        print ($member + " is already connected to ambPicker.\n");
    }
    else {
        connectAttr -force ($ambPicker + ".outColor") ($member + ".ao_ambient");
    }
    setAttr ($member + ".ao_on") 1;
    setAttr ($member + ".ao_samples") 16;
    setAttr ($member + ".ao_distance") 30;
    setAttr ($member + ".ao_dark") -type double3 0 0 0;
}

You can simply delete the ambPicker material when it's not needed.

Friday, May 11, 2012

MEL: Initializing Maya's file texture nodes for mental ray.


The default filter type of Maya's file texture node is Quadratic, which is not desirable for mental ray since mental ray uses mipmap filtering. Also, the default Filter value of 1.0 is a little too high for most cases.
So you'd need to manually tweak these settings on every single file texture node, and that's tiresome.

Instead, I'd like to automate this process with a MEL script.
To make this possible, the script needs to do two things.
One, collect all the file texture nodes in the scene.
Two, change the relevant attribute values on each of those file texture nodes.

Here's the complete script.
It switches the current Filter Type to Mipmap and sets the Filter value to a modest 0.5.

/* select all the file texture nodes in the scene and put them in a string array */
string $selected[] = `ls -type "file" -long`;
/* declare variables for the file node's filterType and filter */
int $filterType;
float $filter;
/* loop through all the file nodes */
for ($member in $selected) {
    /* get the current attribute values */
    $filterType = `getAttr ($member + ".filterType")`;
    $filter = `getAttr ($member + ".filter")`;
    /* set the new values: filterType 1 = Mipmap */
    setAttr ($member + ".filterType") 1;
    setAttr ($member + ".filter") 0.5;
}

Thursday, May 3, 2012

Extending render pass availability of mi_car_paint_phen_x_passes








The mi_car_paint_phen_x_passes shader provides the following render passes by default.

ambientRaw
diffuse
directIrradiance
reflection
specular
indirect

However, if you utilize the custom color pass, you can render out the following passes as well.

Ambient, Spec1, Spec2, Flake, Flake Reflection, Dirt.

I like using the bent normal method for simulated indirect illumination. (This is introduced in this post.)
This method is readily available for mi_car_paint_phen_x_passes because it has an Ambient color slot.

And I also like to use a mia_material for the car paint's reflection, since it gives me better control and quality, especially when coupled with the mia_envBlur.

So the question comes down to whether I can get all the necessary render passes after setting up the shader.

The render passes I need are listed below.

Ambient ( for the simulated indirect illumination through bent normal )
Diffuse
Reflection ( mia material's )
Specular ( car paint's )

MasterBeauty

Here's the MasterBeauty of the test render. The left sphere uses the mi_car_paint_phen_x_passes shader only, and the right sphere uses the mi_car_paint_phen_x_passes plus mia_material combo.

I simply connected the output to the additional color attribute of the mia material.

The render passes are below.

ambientRaw

diffuse



specular 
I turned off the specular of the mia_material by setting its Specular Balance to zero. So what shows up on the right one is the specular of the mi_car_paint_phen_x_passes shader.

reflection
I turned off the reflection of the mi_car_paint_phen_x_passes shader by setting the Reflection Color to black. So only the right one shows the reflection which comes from the mia material.


The problem with the default render passes is that the ambientRaw pass is a raw color, not a processed color. This means it's just the pure bent normal output, which is the simulated indirect illumination color itself. So I need a pure diffuseMaterialColor pass to multiply with the ambientRaw pass to generate a processed pass.

I created a new Custom Color render pass and named it "AmbientCarPaint".
Then I created a writeToColorBuffer node and set it up as below. Note that the AmbientCarPaint pass is selected as the Custom Color Pass.


 This is the shader connection. Notice that a new writeToColorBuffer node is created and connected.





The below is the connection between the mi_car_paint_phen_x_passes shader and the writeToColorBuffer node.


By default, mi_car_paint_phen_x_passes doesn't support the diffuseMaterialColor pass. Instead, its Ambient Level output produces the same result, and it can now be written out through the custom buffer.

ambientLevel

The reason it doesn't look flat like a regular diffuseMaterialColor pass is that the mi_car_paint_phen_x_passes shader has a unique attribute, Edge Color Bias, that darkens the diffuse color around the edges, much like a Fresnel effect.

Now that I have all the necessary passes, here's the comp in Nuke.

The comp result is identical to the first MasterBeauty render. So, this car paint shader is ready to go!
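Since the comp boils down to one multiply and a few adds, the math can be sketched per pixel. This is a minimal Python sketch, not the actual Nuke script; the additive combination of the passes is an assumption based on the pass list above.

```python
# Per-pixel sketch of the comp: the raw bent-normal ambient is multiplied
# by the ambientLevel pass (standing in for diffuseMaterialColor), then
# the remaining passes are added. RGB values are plain 3-tuples here.

def comp_pixel(ambient_raw, ambient_level, diffuse, specular, reflection):
    processed_ambient = tuple(a * l for a, l in zip(ambient_raw, ambient_level))
    return tuple(pa + d + s + r for pa, d, s, r in
                 zip(processed_ambient, diffuse, specular, reflection))
```

In Nuke terms, the first line is the Multiply (Merge "multiply") of ambientRaw by AmbientCarPaint; the second is the chain of Plus merges.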






Wednesday, May 2, 2012

Rayleigh scattering and Mie scattering in Maya


Rayleigh scattering and Mie scattering are the main reasons why the sky is blue and fog is white.
Depending on the size of the particles that sunlight hits, the color and the directionality of the scattering vary.
If the particles are as tiny as air molecules, and therefore smaller than the wavelength of the light, the light is scattered in every direction. In this case, the shorter wavelengths like blue and violet scatter first; that's why the sky is blue. But when the sun is low in the sky and its light has to travel through a thicker slab of atmosphere, virtually all the blue is scattered away before it reaches you, and the remaining red gets scattered too. That's why the sunset sky turns red. So the behavior is spectrally selective.
In contrast, Mie scattering occurs for larger particles like water droplets. In this case, the scattering is directional and not spectrally selective, so Mie scattering mainly creates white color and tends to look like a halo around the sun or other light sources.
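The spectral selectivity of Rayleigh scattering follows the well-known 1/wavelength^4 law. As a quick back-of-the-envelope sketch (450 nm and 700 nm are rough stand-ins for blue and red, not values taken from Maya):

```python
def rayleigh_ratio(lambda_a_nm, lambda_b_nm):
    # Rayleigh scattering strength scales as 1/wavelength^4, so the
    # relative strength of wavelength a vs. wavelength b is (b/a)^4.
    return (lambda_b_nm / lambda_a_nm) ** 4
```

Blue light around 450 nm scatters roughly 5.9 times more strongly than red light around 700 nm, which is why the clear sky reads blue while low-sun light shifts red.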


These two natural phenomena are implemented in Maya's physical fog.
Here's a render of a terrain with no atmospheric effect at all.


No matter how gigantic the terrain is, it's hard to feel its scale.
I'll add an environment fog node to the scene. The environment fog option lives in the Maya Software render settings, under Render Options.
Once you create the node, mental ray will respect it when rendering.




When the Physical type is set to Sky, this works as the combination of Rayleigh and Mie scattering.
Fog is Mie and Air is Rayleigh. You can control both individually.
First, I turn off the fog by setting the Fog Density to zero. With the air component only, below is the result.
It clearly shows just the blue sky.


Not only can you see the sky, but also the blue-tinted faraway terrain around the horizon.
This is what's called aerial perspective (or atmospheric perspective), one of the essential elements that makes outdoor scenery look realistic and natural.


Now I turn on the Fog by raising the fog density.



With the addition of the fog, the terrain near the horizon becomes more desaturated.
Since the fog is sensitive to the direction of the sun, there are controls for the sun's position.



At a Sun Azimuth of zero, the sun is in line with the Z axis. Since the current camera is pointing in the Z direction, the following adjustment will reveal the sun.
I set the Fog Light Scatter attribute to a lower value like 0.2, which gives the fog scattering more directionality. (A higher value makes the scattering more uniform.)


Here's the result. You can see the white halo forming around the sun.



If you use a very small value like 0.01 for the Fog Light Scatter, you may get a sun-like shape as below.





This is an extreme case of back-lit fog.


In conclusion: you may want to use mental ray's Physical Sun and Sky as your environment solution, but it lacks atmospheric effects such as aerial perspective. On the other hand, Maya's physical fog is not an illumination source the way Physical Sun and Sky is. So the two features are complementary to each other.


 

Do not scale the camera!


There are cases where you shouldn't scale the camera; otherwise you will get artifacts or poor performance.

Case 1: When you use a mia_envBlur node to provide a blurry reflection to a mia_material.


This is a highly reflective metal material. I use the mia_envBlur to apply the blurry reflection, as follows.

  
And then I scale up the camera by 5 times.


You'll get this pinching-point-like artifact.
This time, I scale the camera down to one fifth (0.2).


Another weird artifact.
So, don't scale your camera up or down when you use mia_envBlur for blurry reflections.
It doesn't matter by how much; as soon as you change the scale, the artifacts appear.
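Since even a small scale change triggers the artifacts, it can be worth auditing the scene before rendering. Here is a hedged MEL sketch (the strict comparison against 1.0 assumes nobody is animating camera scale) that warns about any camera transform carrying a non-unit scale:

```mel
// flag every camera whose transform carries a non-unit scale
string $camShapes[] = `ls -type "camera"`;
for ($shape in $camShapes) {
    string $parents[] = `listRelatives -parent -fullPath $shape`;
    float $s[] = `getAttr ($parents[0] + ".scale")`;
    if ($s[0] != 1.0 || $s[1] != 1.0 || $s[2] != 1.0)
        warning ($parents[0] + " has a non-unit scale; mia_envBlur may produce artifacts.\n");
}
```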



Case 2: When you use the Optimize for Animations FG mode with the View option on.



This time, I use the FG to create this.


I use the Optimize for Animations mode with the View option on. It gives me a fairly even FG point distribution, like below.


So far, so good. But if I scale up the camera, say by 10 times, the FG brightness diminishes by roughly that same factor, like below.



This has to do with the fact that the View mode uses camera space to place FG points and determine their density. You can compensate for the brightness by scaling up the Min/Max pixel values by the same scaling factor that's applied to the camera.
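As a sketch of that compensation in MEL: assuming the view-mode radii are the finalGatherMinRadius/finalGatherMaxRadius attributes on miDefaultOptions (treat these attribute names as an assumption and verify them in your Maya version), scaling both by the camera's scale factor would look like this:

```mel
// hypothetical compensation for a camera uniformly scaled by $camScale;
// attribute names are assumed and may differ between Maya versions
float $camScale = 10.0;
float $minR = `getAttr miDefaultOptions.finalGatherMinRadius`;
float $maxR = `getAttr miDefaultOptions.finalGatherMaxRadius`;
setAttr miDefaultOptions.finalGatherMinRadius ($minR * $camScale);
setAttr miDefaultOptions.finalGatherMaxRadius ($maxR * $camScale);
```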

That being said, I'd rather not use a scaled camera.
 

Bent normal to simulate indirect illumination (plus diffuse convolution).


Using final gathering on animated objects demands very high settings and can take a very long time to render. These high settings are usually there to prevent flickering or swimming pixels in the image.
With a flicker-free alternative, one could feel free to use this kind of lighting for animation.
A bent normal setup can easily produce a one-bounce, final-gathering-like result, even though it takes some preparation.





First, I rendered with final gathering, using the following 2K environment map, to get this result.



This is a 3-bounce FG result.
I assigned a white mia_material to the sphere on the left. For the one on the right, I will use a bent normal environment as the indirect illumination input.

Here's the bent normal env connection which will be used as an indirect illumination input.



The bent normal env node's attributes.


The amb occlusion node feeds the Bent Normals attribute of the bent normal env node.




For the lookup_spherical node's environment texture, you need a convolution-applied (diffusely blurred) version of the environment map that was used for the final gathering.


This small thumbnail-like image is what I used as the input environment for the bent normal env node.
It can be really small like this since it doesn't require any detail; it only has to simulate fully diffuse illumination.

Here's the other mia_material for the object on the right.
I connect the output of the bent normal env node to the Ambient Light Color attribute.



Now, this mia_material shader has incoming illumination which is attenuated by the built-in AO.



This is the render result. The left one is illuminated by FG; the right one gets the simulated indirect illumination that comes from the bent normal environment. (I turned off the Final Gather Cast and Receive render flags for the object on the right to prevent it from receiving indirect illumination twice.)


They look almost the same. This method can be a good alternative to FG, and it's flicker-free.
With textured shading, it's almost impossible to tell which is which.



Tuesday, May 1, 2012

lat-long style Environment map creation



 
Rendered using Vue xStream

E-on software's Vue is one good solution for creating realistic natural environments. I do like the atmosphere and the sky it generates.

If you render the environment in Vue in lat-long style and use it for IBL in Maya, it makes for a very efficient environment setup. Vue provides an option to export the sky in lat-long style, but it takes a very long time to render since it only uses a single CPU thread for this particular task.

So, what I usually do is create the sky in Maya using the xStream plug-in and render within Maya.
If a panoramic lens shader such as the latlong_lens shader is available, it makes life a bit easier. If not, you can still render 6 images cubemap-style and transform them into a lat-long format.



Here's the result of the latlong_lens shader. It can directly create a spherical (lat-long) panoramic image from a single camera.


The only problem is that this shader is not always available, especially for a newly released Maya version.
In that case, you need to fall back on the traditional cubemap method, which requires 6 images: top, bottom, front, back, left and right.
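The cubemap and lat-long layouts describe the same sphere of directions, which is what makes the conversion possible. This Python sketch (the face labels and u/v conventions are my own assumptions, not HDRshop's) maps a lat-long pixel coordinate to the cube face it samples from:

```python
import math

def latlong_to_face(u, v):
    """Map a lat-long coordinate (u, v in [0, 1]) to one of the six
    90-degree cube faces by picking the dominant axis of the view ray."""
    theta = (u - 0.5) * 2.0 * math.pi      # longitude, -pi .. pi
    phi = (0.5 - v) * math.pi              # latitude, -pi/2 .. pi/2
    x = math.cos(phi) * math.sin(theta)
    y = math.sin(phi)
    z = math.cos(phi) * math.cos(theta)
    ax, ay, az = abs(x), abs(y), abs(z)
    if ay >= ax and ay >= az:
        return "top" if y > 0 else "bottom"
    if ax >= az:
        return "right" if x > 0 else "left"
    return "front" if z > 0 else "back"
```

A full converter would then resample each lat-long pixel from the chosen face's image, which is essentially what the cross-to-lat-long transform does.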



You can use one camera and rotate it as in the image above, or use 6 cameras.
I prefer the latter.



Here's my camera rig. Note that the naming convention has to be exactly as shown.



Each camera must cover exactly 90 degrees of field of view in a square frame.
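That 90-degree requirement pins down the camera settings: with the standard pinhole relation fov = 2*atan(aperture / (2*focal)), a 90-degree FOV means the focal length must equal exactly half the film back. A quick Python check (the 25.4 mm square film back is just a hypothetical example):

```python
import math

def horizontal_fov_deg(aperture_mm, focal_mm):
    # standard pinhole relation between film back width and field of view
    return math.degrees(2.0 * math.atan(aperture_mm / (2.0 * focal_mm)))
```

With the focal length set to half the aperture, the field of view comes out at exactly 90 degrees.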

Here are the render settings.

Don't forget to set the framebuffer to 32-bit float.


Now, batch render it.







I will stitch these 6 images into one cross-type cubic environment map.
To do this, I use an application called cube2cross.exe, which takes the 6 HDR-format images and generates one big cross-type HDR map.


I load this in HDRshop.


Using the settings above, I transform the cross-type map into a lat-long environment map.


This transformed result is basically identical to the one rendered through the latlong_lens shader.
Both methods have the advantage of using all CPU threads.


The HDR image is now loaded into Maya's IBL node, and it takes virtually no time to render the environment.