default manipulators and zooTriggered

david | animation,mel script,tutorials | Wednesday, April 16th, 2008

Yesterday I learned something new. Stev Kalinowski posted this on CGTalk:

In the attribute editor, under the Display rollout you set the "Show Manip Default" drop down list to Translate or Rotate for each control. Then to select objects hit "t" (Show Manipulators) instead of the move or rotate tool. Now when you select objects you'll get the manipulator you set for each control.

I tested it out using two objects called "translateMe" and "rotateMe" and I set the Show Manip Default attribute as shown here

defaultManipAttrEd_translate.jpg

defaultManipAttrEd_rotate.jpg
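
For scripted setups the same attribute can be set from MEL. A minimal sketch, using my two object names and assuming the enum indices are 1 for Translate and 2 for Rotate (confirm them against the drop-down in your version of maya):

// set the Show Manip Default attribute from MEL
// (enum indices 1 = Translate, 2 = Rotate are an assumption - check the attribute editor)
setAttr "translateMe.showManipDefault" 1;
setAttr "rotateMe.showManipDefault" 2;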

Now, as long as I am in Show Manipulators mode (invoked with the "t" hotkey by default), the manipulator handle changes based on my selection.

This is quite handy when animating, but it depends on staying in Show Manipulators mode, which I would not be comfortable with a lot of the time, since I frequently need to use the translate, rotate and scale modes as well.

Well, there is another way to do the same thing using Hamish McKenzie's zooTriggered scripts.

I have written about using zooTriggered here, but if you read that quick walk-through, you might think its use is restricted to selecting a control object to trigger the selection of other objects. In fact zooTriggered can trigger almost any mel command(s), which means it can be used to activate a specified manipulator when you select an object.
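
Under the hood that just means switching the active tool with ordinary MEL. I have not checked exactly what zooTriggered's presets install, but it would be something along the lines of this sketch, using maya's standard tool contexts:

// roughly what a move-tool or rotate-tool trigger would run
setToolTo moveSuperContext;    // activate the move tool
setToolTo RotateSuperContext;  // activate the rotate tool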

Here's a really quick look.

Open the zooTriggered UI and make sure "Edit zooTriggered command" is selected in the View menu. Add the objects you want to configure to the top left pane, then right-mouse-click in the lower pane and select either the "use move tool preset" or the "use rotate tool preset".

zooTriggeredMoveToolPreset1.jpg

You should end up with something like this

zooTriggeredRotateToolPreset2.jpg

Now enable the trigger by clicking the "load" button so that the "load status" bar changes to green

zooTriggeredEnable.jpg

That's it. Now whenever you select the object, the manipulator tool you specified will become active.

There is a slight delay between your selection and the change of manipulator, which is not the case with the Show Manipulators mode, but my preference is still the zooTriggered method since it lets me stick with the regular translate, rotate and scale modes that I generally use.

mip_matteshadow and how I have used it

david | mentalray,rendering,tutorials | Tuesday, April 8th, 2008

In this post I will explain the way I have been using the mip_matteshadow shader in maya 2008 (along with some of the other mentalray "production" shaders). It is not going to be a step-by-step tutorial - and I'm certainly no "production" shader expert. I started by reading Zap's mental ray tips, the Production Shader Wiki and the PDF manual. Then I did some experimenting to see how I could include these new methods in my work which, unlike Zap's examples, usually involves rendering CG characters in CG environments rather than live-action or photographic backgrounds.

Here is a simple example of how I have recently started using mip_matteshadow as a better alternative to maya's useBackground shader... (click here for the rest of the post) (more...)

one way to customize maya

david | mel script,tutorials | Sunday, March 2nd, 2008

I realise that many of my posts contain references to mel scripts that customize the way maya works, so I thought it could be useful to write some details about how I have configured my maya environment to allow me to customize it without messing up the default installation.

systemPropertiesNewVbl.jpg

Read more...

envSphere, mentalray and IBL

david | mentalray,rendering,tutorials | Wednesday, February 13th, 2008

The way mentalray has been integrated with maya's native shaders is a constant source of curiosity. Some things work the way you expect, some not at all and some need to be tricked into working. In this post I'm going to share the results of my experimentation with maya's envSphere shader in a mentalray context, which led me to the obvious conclusion that IBL is a better choice.

After years of using maya's own software renderer I'm used to doing things in a certain way. The envSphere environment shader is easy to use. I hook up a fileTexture or a rampTexture, and I can then use the place2dTexture node to control exactly how it sits within the sphere. For example, if I have a 180deg panorama I will probably want to set repeatU=2 on the place2dTexture.
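
From MEL that is a single setAttr (the node name place2dTexture1 is just an example):

// double the U repeat so a 180deg panorama wraps the full sphere
setAttr "place2dTexture1.repeatU" 2;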

The trouble with envSphere is that it doesn't seem to work properly with mentalray shaders. If I connect it to the camera's mentalray environment shader slot, maya crashes when I attempt to render.

To the untrained eye, the mentalray equivalent of an envSphere would appear to be mib_lookup_spherical, but it is not so user-friendly, because it forces you to use a mentalrayTexture node and a whole bunch of other nodes to get anything approaching the functionality of the place2dTexture. Too hard! And I could never figure out how to hook up a ramp texture. Often I have simply resorted to mapping my fileTexture onto a big nurbs sphere. Easy!

But I like messing around with mentalray and the hypershade, and today, almost by accident, I discovered that it's possible to use envSphere as a mentalray environment shader after all. Rather than connect it directly to the camera, you need to go via a multiplyDivide node (or something similar). Here is a snapshot of the hypershade connections (click image for the full size version).

envSphereHypershadeSml.jpg
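
In MEL the connections amount to something like this sketch. The node names are examples, and I am assuming the camera's environment slot is the cameraShape's miEnvironmentShader attribute - check the exact attribute name in the connection editor:

// route the envSphere through a multiplyDivide node into the camera's mentalray environment slot
connectAttr envSphere1.outColor multiplyDivide1.input1;
connectAttr multiplyDivide1.output cameraShape1.miEnvironmentShader;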

I guess the multiplyDivide node acts as some kind of data converter. Here's a render I did.

envSphereMR_filter1.jpg

I'm using a fairly low-res image, but it should not be that blurry. What I realized is that you need to set the fileTexture filter=0 in the effects tab (don't set the filter type to none; leave it as quadratic or mipmap). Then when I rerendered I got this.

envSphereMR_filter0.jpg
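
That setting is the file node's Filter attribute in the Effects section, so from MEL it is just (node name file1 is an example):

// turn off texture pre-filtering on the file node
setAttr "file1.filter" 0;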

This works pretty well. I've got the convenience of the place3dTexture to rotate the environment and the place2dTexture to set my repeats and to fine tune the offset. Here is one I did with a ramp instead.

envSphereMR_ramp.jpg

My excitement was short lived though. When I turned on final gather I got this.

envSphereMR_FG.jpg

Dohhh!

Fortunately, there is a solution - Image Based Lighting. You can create a mentalrayIbl node by clicking a button in the Environment tab at the bottom of the render settings window. There is no need to connect it to the camera. Once the IBL node is in your scene mentalray will use it as your environment.

Here is my artifact-free render with final gather, where I have removed the envSphere from the scene, created an IBL node and connected my ramp to it.

IBL_ramp_FG.jpg

When you create an IBL it defaults to Type=Image File and expects you to give it the name of your image file. But if you want to fine-tune your image mapping with a place2dTexture node, or you want to use a ramp, then you need to change to Type=Texture and connect a maya fileTexture or a ramp texture.

A minor inconvenience is that you can't simply drag-and-drop the texture node onto the texture swatch in the IBL attribute editor. You can try, but no connection gets made. To connect an existing node you need to use the connection editor and connect as shown here.

connectRampToIBL.jpg

Notice how the IBL attribute called "texture" in the attribute editor (below) is called "color" in the connection editor (above right) - just to confuse you.

IBL_attr.jpg
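
The same connection can be made with a couple of lines of MEL. A sketch assuming default node names, and assuming the Type drop-down is the type attribute with Texture at index 1 - confirm both in the attribute editor:

// switch the IBL from Image File to Texture mode, then connect the ramp
setAttr mentalrayIblShape1.type 1;
connectAttr ramp1.outColor mentalrayIblShape1.color;   // "texture" in the AE, "color" in the connection editor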

A bonus with the IBL node is that you get a textured preview in the viewport which really helps you line up your reflections. By default the IBL will be scaled very large and you may need to scale it down to see it within your camera's clipping planes.

IBL_viewport.jpg

The IBL can also do light and photon emission. Neither is turned on by default, but they are there if you want them and can afford the render time.

Conclusion: Sometimes it is possible to get mentalray to do things it probably wasn't meant to do, but for an environment shader, Image Based Lighting is a better choice than envSphere (and forget mib_lookup_spherical unless you like complication).

layering shaders with mentalray

david | mentalray,rendering,tutorials | Friday, February 1st, 2008

This is a mini tutorial where I will show one method of layering shaders using mentalray for maya 2008. To make it slightly more interesting one of the shaders will be the misss_fast_simple.

Maya comes with its own layered shader node and you can hook it up with mentalray shaders, but it doesn't always work, and I'm not a big fan of the interface so I don't use it. For several years I have been using the mix8layers shader (from ctrl.studio). It is a reliable shader and texture layering node with lots of features. But today my layering example will use a phenomenon I created called dj_mix_colors, which is built upon the mib_color_mix node.

I am going to layer a rusty metallic-looking blinn over misss_fast_simple to get the result you can see in this picture.

layeringShadersInMR_DJ001.jpg

I created a sphere on a plane with a couple of area lights as shown in the next picture

perspView.jpg

Then I created a blinn shader with its color coming from a crater texture and bump from a noise texture.

Next I created a misss_fast_simple. When you do this in maya 2008, a few extra nodes are automatically created at the same time: a lightmap node and a mentalrayTexture, which are used to generate and store the sub-surface scattering calculations. The important thing to realize is that the misss_fast_lmap_maya node (the light map) connects to the shading group node rather than to the material node. This connection to the shading group node must be taken into account when layering the shaders.

Before getting into any layering I simply assigned the blinn to my sphere and adjusted the material attributes to get the rusty metallic appearance. Then I assigned the misss_fast_simple and adjusted the lighting to show off the subsurface scattering.

Next I created a dj_mix_colors node. This node can either be used as a texture or as a material in mentalray shader networks. For reasons which I will not explain here, if you are trying this example, you may not be able to assign dj_mix_colors to a surface using the hypershade Assign Material To Selection menu. However it can be easily done by assigning another shader (say a lambert) to create a shading group node, and then you can drag-and-drop the dj_mix_colors into the mentalray Material Shader slot in the shading group node. The lambert will be ignored (and can be deleted).

In this example I just reused the shading group node that was created along with the misss_fast_simple material. I connected my dj_mix_colors node so that it replaced the misss_fast_simple material. This saved me creating that temporary lambert material, but much more importantly it means the lightmap connection is already set up.
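
In MEL the swap is roughly the following sketch. The node names are examples, and I am assuming dj_mix_colors exposes an outValue output like the stock mentalray nodes and that the shading group slot is its miMaterialShader attribute - mirror whatever connection the misss_fast_simple already had if yours differ:

// replace misss_fast_simple with dj_mix_colors in the shading group's mentalray Material Shader slot
disconnectAttr misss_fast_simple1.outValue misss_fast_simple1SG.miMaterialShader;
connectAttr dj_mix_colors1.outValue misss_fast_simple1SG.miMaterialShader;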

So here is what the shading group attribute editor looks like now (I have renamed it to be more meaningful)

layeringShadersSG_attr.jpg

dj_mix_colors can mix up to 8 different inputs in a variety of ways. In this case I started by setting color0 to black. I then enabled 2 additional colors and set the mode for each of these to add. I dragged the blinn to color1 and dragged the misss_fast_simple to color2.

If I do a render now, both materials are added together. In dj_mix_colors, weight1 and weight2 control the relative strength of the 2 layers. But I want to use a mask to cut out one material, and in this example I am using a simple checker texture with the repeatU set to 0.5 so I get horizontal stripes.

First I connect the checker to the dj_mix_colors weight2. Now the misss_fast_simple will only affect the areas defined by the white parts of the checker. I also need to connect to weight1, but this needs to go via a reverse node so that the blinn will be seen only in the areas where the checker is black.
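
As MEL that is something like this sketch (example node names; I am assuming the weights are single floats, so the checker's red channel is a convenient source):

// white areas of the checker reveal the misss_fast_simple layer
connectAttr checker1.outColorR dj_mix_colors1.weight2;
// reversed checker reveals the blinn where the checker is black
connectAttr checker1.outColor reverse1.input;
connectAttr reverse1.outputX dj_mix_colors1.weight1;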

The dj_mix_colors attribute editor looks like this

dj_mix_colors_attr.jpg

Rendering now shows that the material colors are masked correctly, but the bump continues to cover the sphere. This is due to the way maya handles bump maps. Even though the bump node is connected to the material, that is not how the renderer sees it. To correct the bump I just have to connect the reversed checker to the alphaGain of the noise texture to cancel out the noise.
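
That fix is a single connection (again with example node names, reusing the reverse node from above):

// mask the bump: the reversed checker scales the noise texture's alpha
connectAttr reverse1.outputX noise1.alphaGain;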

That's it!

Here is a snapshot of the hypershade network with the connections highlighted. Click the image for the full-sized version

layersHyperGraphSmall.jpg

And here is the render again

layeringShadersInMR_DJ001.jpg

You can download my scene file here.

To use this example you will also need to get dj_mix_colors but the technique could also be used with the standard mib_color_mix.

depth-of-field pass for maya or mentalray

david | rendering,tutorials | Tuesday, December 11th, 2007

A common approach to rendering is to do it in several passes and to composite them using something like after effects or fusion. For me an important pass is a variation on the standard depth pass, which I refer to as the depth-of-field pass. For a few years I have rendered the depth-of-field pass using a wonderful mentalray shader called zDepthDOF by Andreas Bauer. (The link takes you to a good explanation of how the shader works and why it is better than a standard depth render.)

The thing I like most about zDepthDOF is that I can use it with the maya distance tool. I connect the distance attribute to the shader's focus distance parameter. I point-constrain one of the distance tool's locators to the camera, then animate the other locator to accurately control the focal point in my depth pass, which can then be used to create some nice focus-change effects using an after effects filter like compound blur.
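
The setup is only a few lines of MEL. A sketch, assuming default node names and a focusDistance parameter on the shader - check zDepthDOF's actual attribute name in the attribute editor:

// measure the distance between two locators
distanceDimension -startPoint 0 0 0 -endPoint 0 0 10;
// pin the first locator to the rendering camera
pointConstraint camera1 locator1;
// drive the shader's focal distance from the measured distance
connectAttr distanceDimensionShape1.distance zDepthDOF1.focusDistance;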

But zDepthDOF is a mentalray shader, so just for fun I decided to see if I could do the same thing using the standard maya software renderer. It wasn't too difficult and I thought it might be worth sharing. It's a great example of how the maya utility nodes can be used in a shader network.

Click here for........... (more...)

exploring mentalray phenomena

david | mel script,mentalray,rendering,tutorials | Wednesday, November 21st, 2007

In the "Introduction to Mentalray" section of the maya documentation I read this:

The Phenomenon concept provides the missing framework for the completion of the definition of a scene for the purpose of rendering in a unified manner.

It inspired me to find out more about this concept and to see if I could do something useful with it.

A phenomenon is a mentalray shader network that has been exported in a special format that allows it to appear like a custom shader as a single node with its own interface.

I conceived a simple task as a practical way of exploring how to create a phenomenon. My aim would be to take one of the mentalray Data Conversion shaders called mib_color_mix and give it a make-over so people will find it easier to use.

Click here to read the full article describing the steps I took to achieve my aim.

(more...)

mia_physicalsky and haze

david | mentalray,rendering,tutorials | Monday, October 29th, 2007

A while ago I learned from one of "Zap's Mentalray Tips" that the mia_physicalsky haze parameter could be used to make clouds in the sky by connecting a file texture to it. Since then I have experimented with this technique and discovered some limitations, and some solutions to them.
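
The basic hook-up is a single connection (a sketch with example node names; I am assuming the haze parameter takes a scalar, which makes the file texture's alpha a convenient source):

// drive the sky's haze from a cloud texture
connectAttr cloudFile1.outAlpha mia_physicalsky1.haze;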

sky_hazeWithVisDist_mentalrayTexture.jpg

(more...)

faster renders

david | rendering,tutorials | Wednesday, October 17th, 2007

Depending on how you currently start your maya renders, this tip may save you some render time.

If you launch renders using the command line (either directly or via a render manager application) then you will already know about the various command-line flags that can be used to control aspects of the render.

For a while now I have been using the command "render -r file" to tell the render command to use whichever renderer (usually mentalray or software) I have specified inside the scene file. This is especially convenient when, because of render layers, I'm using both these renderers in the one scene.

Today I made an interesting discovery. When I instead used "render -r mr" to specify mentalray explicitly, my render times decreased significantly - in this case dropping from just over 5 minutes per frame to approximately 3 minutes per frame.

I know these differences will probably vary depending on the content of the scenes. And I know that being specific about the renderer will require some extra thought when different render layers use different renderers, but the time saving should make it worth the effort.

mia_physicalsky visibility distance

david | mentalray,rendering,tutorials | Wednesday, October 10th, 2007

This is a mini tutorial on using the visibility distance parameter of the mia_physicalsky shader.

Most people probably add mia_physicalsky to their scenes using the create button in the environment tab at the bottom of the mentalray render-settings (shown here)

renderSettings.jpg

This creates a shader network involving several nodes. If we look at the camera attribute editor mentalray tab we see that mia_physicalsky has been connected to the environment shader slot and mia_exposure_simple to the lens shader slot.

cameraAttrEdA.jpg

The default settings are a good starting point for lighting outdoor scenes. Here is a render of 9 objects in a line using the default sun and sky settings - except for the mia_physicalsky visibility distance, which I changed from 0 to 10.

default sun sky render

So what is visibility distance and why am I writing about it?

The manual says "Aerial Perspective is a term used by painters to convey how distant objects are perceived as hazier and tinted towards the blue end of the spectrum. mia_physicalsky emulates this with the visibility_distance parameter." This can be a little confusing for two reasons: 1. there is also a "haze" attribute, and 2. it doesn't seem to matter what value you give to visibility distance - nothing changes.

Making visibility distance work is easy once you know how. You need to connect mia_physicalsky as a lens shader. Our camera already has a mia_exposure_simple as a lens shader, but we can add another one. You might expect that you could drag-and-drop the existing mia_physicalsky shader into the empty lens shaders box, but for some reason that doesn't work. Luckily it's easy to do with a simple mel command.

connectAttr mia_physicalsky1.message cameraShape.miLensShaderList[0];

(You may need to change the cameraShape name to match your camera.) Then the camera attribute editor should look like this

cameraAttrEdB.jpg

Update: You can also install a modified mel script to make the connection using a button in the attribute editor. Read about it here.

If we rerender the test scene we get this

withLensShader_visDist10_haze0.jpg

These are the same settings as the first render except now mia_physicalsky is being used as a lens shader as well as an environment shader. Here are some different visibility distance settings

withLensShader_visDist40_haze0.jpg

withLensShader_visDist100_haze0.jpg

So what is the difference between visibility distance and haze? Haze sets the amount of haze in the air and changes the look of the sky but on its own does not obscure the view of the objects. Here is a render with haze but zero visibility distance

withLensShader_visDist0_haze10.jpg

Here is the same haze value but visibility distance set to 100

withLensShader_visDist100_haze10.jpg

And here is another variation

withLensShader_visDist100_haze3.jpg

Summary: visibility distance only works if the mia_physicalsky is used as a lens shader.
