Apr 09, 2015
 

I’ve been working for the last few months on a pipeline system for Wolf & Crow here in Los Angeles. My efforts are typically focused on Maya pipelines, which I’ll be documenting later, but I wanted to show off a little something I made that’s usually outside my repertoire… web programming.

The company needed an interface to handle mundane tasks like creating new jobs and assigning job numbers to them, creating new assets, and creating new sequences & shots for projects. Normally this would be as easy as duplicating a template folder, but the pipeline here makes extensive use of symbolic links in order to tie multiple Maya workspaces together in a way that’s as efficient as possible (for example, the “sourceimages” folder of every Maya project in a job can be symlinked to a single “textures” repository). Symlinks can’t be created in Windows, although Windows will happily follow symlinks served from a Linux server. So the bulk of the work has to be done on the server itself via Python or Bash scripts.
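The symlink creation itself is simple enough once you’re on the server. Something like this (the paths and folder names below are just placeholders, not the actual pipeline structure):

    # Rough sketch: symlink each Maya workspace's sourceimages folder to a
    # single shared textures repository. All paths here are placeholders.
    import os

    job_root = '/jobs/JOB_0123'                    # hypothetical job root
    textures = os.path.join(job_root, 'textures')  # the shared texture repo

    for shot in ('sh010', 'sh020', 'sh030'):       # hypothetical shot list
        workspace = os.path.join(job_root, 'shots', shot, 'maya')
        if not os.path.isdir(workspace):
            os.makedirs(workspace)
        link = os.path.join(workspace, 'sourceimages')
        if not os.path.lexists(link):
            os.symlink(textures, link)  # runs on the Linux server, not on Windows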

Prior to the web app, the standard procedure was to run plink.exe (an SSH command-line utility) inside a Python script using the subprocess module (see my earlier post on subprocess), which would pass a command over to the Linux file server to create all the symlinks. Or you could just SSH into the server using putty.exe and run the command yourself. This was clumsy, though, since you either needed to have Maya open to run the Python script through an interface, or you had to be comfortable enough with SSH and the Linux command line to run these scripts.
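The plink call itself looked something like this (the server name and remote script path are made up for the example):

    # Push a command to the Linux file server through plink.exe.
    # Hostname, user, and the remote script are hypothetical examples.
    import subprocess

    cmd = [
        'plink.exe', '-ssh', '-batch',
        'pipeline@fileserver',                            # example server login
        'python /pipeline/scripts/make_job.py JOB_0123',  # example remote command
    ]
    ret = subprocess.call(cmd)
    if ret != 0:
        print('Remote job setup failed with exit code %d' % ret)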

Instead, there’s a (sort of) sexy web interface that anyone can run from a browser! Here are some examples.


Apr 27, 2012
 

I just wrote out a new version of the Maya to FBX export script, this time in Python, and with a lot of bugs and instabilities cleaned up. It auto-detects deforming vs. non-deforming objects, and exports each type of object to a separate FBX, optionally from a selection. Everything is baked into world space so there are no complications with unexpected transforms.
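The deforming vs. non-deforming check boils down to looking for deformers in each mesh’s history. Roughly like this (just the general idea in maya.cmds, not the script itself):

    # General idea only: separate deforming meshes from static ones by
    # checking each shape's history for deformer nodes.
    import maya.cmds as cmds

    def is_deforming(mesh_shape):
        """Return True if any deformer (skinCluster, blendShape, etc.) is upstream."""
        history = cmds.listHistory(mesh_shape, pruneDagObjects=True) or []
        return bool(cmds.ls(history, type='geometryFilter'))

    meshes = cmds.ls(type='mesh', noIntermediate=True)
    deforming = [m for m in meshes if is_deforming(m)]
    static = [m for m in meshes if m not in deforming]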

The new script is on the scripts page, in place of the old MEL one.

Post a comment if you have any questions…

Oct 29, 2011
 

For those of you who like to use a linear workflow in Maya, you might notice that once you apply a gamma 2.2 lens shader (mental ray) or an overall gamma setting (V-Ray) to your scene, your gamma-corrected textures might look great, but your image planes will be washed out. This is a pain when you’re trying to use a plate as a reference for lighting your scenes. There is a way to gamma correct image planes, but the setup takes a few extra steps and it’s somewhat counter-intuitive. Thanks, Maya!

The first thing to do once you attach an image plane to the camera is to switch the “Type” attribute from “image” to “texture.” Once you do this, you can attach a regular File node to the texture, and set up your image sequence attributes on the file node if you are using a sequence.

The next step is to connect the file node to a gammaCorrect node: create a gammaCorrect node, set the gamma in all three channels to 0.454 (or the inverse of your current gamma, if you’re not correcting to 2.2), and then connect the outColor of the file node to the “value” of the gammaCorrect node.

The last step is the tricky part. Try connecting the output of the gammaCorrect node to the imagePlane node… where does it go in the Connection Editor? Maya decided that you, the user, should be sheltered from this most dangerous input, and it’s been hidden by default. Turn on “show hidden” on the right side of the Connection Editor, and then connect the outValue of the gammaCorrect node to the “texture” input of the imagePlane.
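If you’d rather script the whole thing, it looks roughly like this in maya.cmds. Node names are just examples, and I’m assuming the hidden input’s long name is sourceTexture, which may vary by Maya version:

    # Rough maya.cmds version of the steps above. The image plane is assumed
    # to already exist and be attached to the camera.
    import maya.cmds as cmds

    img_plane = 'imagePlaneShape1'   # your existing image plane shape

    # 1. Switch the "Type" attribute from "Image File" to "Texture" (1 = Texture)
    cmds.setAttr(img_plane + '.type', 1)

    # 2. File node for the plate (enable useFrameExtension for sequences)
    file_node = cmds.shadingNode('file', asTexture=True)
    cmds.setAttr(file_node + '.fileTextureName', '/path/to/plate.0001.exr', type='string')

    # 3. Gamma correct back down (1 / 2.2 is roughly 0.454)
    gamma = cmds.shadingNode('gammaCorrect', asUtility=True)
    for channel in ('X', 'Y', 'Z'):
        cmds.setAttr(gamma + '.gamma' + channel, 0.454)
    cmds.connectAttr(file_node + '.outColor', gamma + '.value')

    # 4. The hidden "texture" input (long name assumed to be sourceTexture)
    cmds.connectAttr(gamma + '.outValue', img_plane + '.sourceTexture')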

I don’t really understand why this has to be so difficult in Maya, but the same could be said about a lot of this program.

Oct 26, 2011
 

A lot of After Effects compositors like to use a plugin called Re:Map to do texture substitution on objects. You render out a special color pass from your 3D scene, and then the plugin uses those colors to wrap a flat image onto the object, basically allowing you to re-texture in post.

A lot of people have asked me (or others) how to build the material you’re supposed to use to render this pass from Maya. There are different ways to make it, but by far the easiest is to just use a place2dTexture node and a surface shader (or a VRayLightMtl, if that’s your thing).

Connect place2dTexture.outU -> surfaceShader.outColorR. Then connect place2dTexture.outV -> surfaceShader.outColorG.

That’s it. Apply the material to everything and you’re done. In VRay, you don’t even need a material if you want to save a render layer; just create a VrayExtraTex element on your render layer and connect the place2dTexture outputs to the ExtraTex element in the same manner.
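In script form, the basic material setup is only a few lines (a quick maya.cmds sketch with made-up node names):

    # UV pass material: U drives red, V drives green, blue stays black.
    import maya.cmds as cmds

    p2d = cmds.shadingNode('place2dTexture', asUtility=True)
    shader = cmds.shadingNode('surfaceShader', asShader=True)

    cmds.connectAttr(p2d + '.outU', shader + '.outColorR')
    cmds.connectAttr(p2d + '.outV', shader + '.outColorG')

    # Assign the shader to everything you want to re-texture in post
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name=shader + 'SG')
    cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader')
    cmds.sets(cmds.ls(geometry=True), edit=True, forceElement=sg)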

The setup is really easy, but there are a few things to watch out for. First of all, if your UVs are distorted, then any texture you place on them is going to be distorted, so you need good UVs. If you’re using simple rectangular billboards, make sure the UVs are normalized (you can normalize UVs in the UV Texture Editor). Also, the image quality will suffer if you aren’t rendering to a floating point file format; 32-bit floating point images are best to avoid artifacting.

One other subtle thing to watch out for. If you are using a linear workflow when you render (and you should be!) it’s easy to screw up this render and end up with weirdly warped images. This render should NOT be gamma-corrected in any way, so disable your lens shaders if you’re in mental ray, and set your gamma to 1.0 and turn off “linear workflow” if you’re in VRay. It’s hard to see, but take a look at the difference between a linear render of this pass and an sRGB (gamma 2.2) render:

The image on the left is the correct one. Using a color-corrected UV pass will cause your substituted textures in After Effects to appear very warped around certain edges.

(I’ve made this mistake way too many times.)