Sep 26 2014
 

SideFX added a new Cloud FX toolkit a version or two ago, and I recently had a chance to mess around with it. Their Cloud Rig shelf tool is pretty great out of the box… pick a shape, turn it into a cloud, done. The Cloud Noise and Cloud Light SOPs that are built into the rig setup can get some pretty good results, but it wasn’t exactly what I wanted… I was looking for a solution that’s a little less dependent on the volume container resolution and driven more by textures instead. It’s not necessarily the greatest approach for swirling, dynamic simulations, but for more-or-less static clouds it gives nice, resolution-independent results that are quick to generate. Maya’s fluid shader supports textures by default, but the Cloud Shader that Houdini uses doesn’t have much in the way of textural control, so I had to make some customizations, and that’s what this post is about. Check the link below to keep reading…

Continue reading »

Sep 11 2014
 

One of the bigger challenges with rendering liquids is that it can be difficult to get good UVs on them for texturing. Getting a displacement map on a liquid sim can make all the difference when you need some added detail without grinding out a multimillion-particle simulation. Unfortunately, liquid simulations have the annoying habit of stretching your projected UVs out after just a few seconds of movement, especially in more turbulent flows.

In Houdini smoke and pyro simulations, there’s an option to create a “dual rest field” that acts as an anchor point for texturing so that textures can be somewhat accurately applied to the fluid and they will advect through the velocity field. The trick with dual rest fields is that they will regenerate every N seconds, offset from each other by N/2 seconds. A couple of detail parameters called “rest_ratio” and “rest2_ratio” are created, which are basically just sine waves at opposite phases to each other, used as blending weights between each rest field. When it’s time for the first rest field to regenerate, its blend weight is at zero while the rest2 field is at full strength, and vice versa.
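To make that a little more concrete, here’s a tiny sketch of what those blend weights look like as functions of time. The function and parameter names are mine, not the solver’s; it’s just the opposite-phase wave idea written out:

import math

def rest_ratios(t, period):
    """Hedged sketch of the dual rest field blend weights: two opposite-phase
    waves, each hitting zero at the moment its rest field regenerates.
    't' is the current time, 'period' is the regeneration interval N."""
    phase = (t % period) / period                              # 0..1 through one cycle
    rest_ratio = 0.5 - 0.5 * math.cos(2.0 * math.pi * phase)   # 0 when the rest field regenerates
    rest2_ratio = 0.5 + 0.5 * math.cos(2.0 * math.pi * phase)  # 0 when the rest2 field regenerates
    return rest_ratio, rest2_ratio

The two weights always sum to 1, so blending the two rest-field texture lookups by these ratios hides the pop whenever either field snaps back to a fresh projection.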

It’s great that these are built into the smoke and pyro solvers, but of course nothing in Houdini can be that easy, so for FLIP simulations we’ll have to do this manually. Rather than dig into the FLIP solver and deal with microsolvers and fields, I’ll do this using SOPs and SOP Solvers in order to simplify things and avoid as many DOPs nightmares as possible.

Here’s the basic approach: Create two point-based UV projections from the most convenient angle (straight down onto the XZ plane, in my case) and call them uv1 and uv2. As point attributes, they’ll automatically be advected through the FLIP solver. Then reproject each UV map at staggered intervals, so that uv2 always reprojects halfway between uv1 reprojections. We’ll also create a detail attribute to act as the rest_ratio, which will always be 0 when uv1 is reprojecting and 1 when uv2 is reprojecting. It all sounds more complicated than it really is. Here goes…
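Before the full walkthrough (link below), here’s a rough sketch of the timing logic as a Python SOP you could drop inside a SOP Solver. The 48-frame period, the project_xz() helper, and the dead-simple planar projection are all assumptions for illustration, not the actual setup from the post:

import math
import hou

node = hou.pwd()
geo = node.geometry()
frame = int(hou.frame())
PERIOD = 48  # frames between reprojections of the same UV set (assumption)

def project_xz(geo, name):
    """Planar-project point positions onto the XZ plane into a 2-float point attribute."""
    if geo.findPointAttrib(name) is None:
        geo.addAttrib(hou.attribType.Point, name, (0.0, 0.0))
    for pt in geo.points():
        p = pt.position()
        pt.setAttribValue(name, (p[0], p[2]))

# make sure both UV sets exist on the very first frame
if geo.findPointAttrib("uv1") is None:
    project_xz(geo, "uv1")
if geo.findPointAttrib("uv2") is None:
    project_xz(geo, "uv2")

# uv1 reprojects on the beat, uv2 on the half-beat
if frame % PERIOD == 0:
    project_xz(geo, "uv1")
if (frame + PERIOD // 2) % PERIOD == 0:
    project_xz(geo, "uv2")

# rest_ratio: 0 right after uv1 reprojects, 1 right after uv2 reprojects
if geo.findGlobalAttrib("rest_ratio") is None:
    geo.addAttrib(hou.attribType.Global, "rest_ratio", 0.0)
phase = (frame % PERIOD) / float(PERIOD)
geo.setGlobalAttribValue("rest_ratio", 0.5 - 0.5 * math.cos(2.0 * math.pi * phase))

At render time you’d sample your texture twice, once through uv1 and once through uv2, and blend the two samples with rest_ratio, which is exactly what the smoke and pyro shaders do with their dual rest fields.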

Continue reading »

Sep 08 2014
 

I ran into a problem recently where I was trying to make some nice-looking embers in Houdini, complete with nice motion-blurred trails. Typically with a particle system you use the velocity attribute to handle motion blur, but geometry velocity blur is always linear, so your motion trails will always be perfectly straight even if your embers are moving along nice squiggly paths.

Deformation motion blur looks great, but in most simulations particles are being born and dying all the time, and deformation motion blur doesn’t work with a changing point count.

The solution is to force a constant point count. This can be problematic when your particles need to have a lifespan, so there are a few little tricks you’re going to have to pull in order to make this work…
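The actual tricks are behind the link, but as a rough illustration of the general idea (not necessarily the post’s exact method): keep every particle in the geometry and hide the expired ones at render time instead of deleting them, so the point count never changes. The pscale approach below is my assumption, and it assumes the birth side is handled by emitting your full particle pool up front.

# Sketch: a Python SOP after the particle sim. Instead of deleting expired
# particles (which changes the point count and breaks deformation blur),
# shrink them to nothing so they don't show up in the render.
import hou

node = hou.pwd()
geo = node.geometry()

if geo.findPointAttrib("pscale") is None:
    geo.addAttrib(hou.attribType.Point, "pscale", 0.01)

for pt in geo.points():
    age = pt.attribValue("age")    # standard POP attributes
    life = pt.attribValue("life")
    if age > life:
        pt.setAttribValue("pscale", 0.0)  # expired: keep the point, hide it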

Continue reading »

Mar 25 2014
 

I’ve been messing around with using Ptex in VRay for Maya and ran into a particularly weird little problem involving a normal map exported from Mudbox. It’s probably easier to show it than to tell about it (Fig. 1):

[Fig. 1: the Ptex-textured bottle rendering incorrectly. Check out the wax wrapper up top… all chunky and gross.]

That wax wrapper looks terrible… not at all like the original sculpt. The mesh is set to render as a subdivision surface (using the custom VRay attributes from the Attribute Editor), but the details are all mangled like it’s not subdividing the surface at all.

Even weirder is what happens when I only render a small region of the image (Fig. 2):

[Fig. 2: a region render of the same frame. The details are suddenly cleaner! All I changed was enabling region render.]

This was really confusing, and although I’m still not 100% sure what’s going on in VRay to cause this (I’m fairly certain this is a bug), there is at least a solution.

What’s probably happening here is that choosing to render this mesh as a subdivision surface is changing the point count of the object BEFORE any Ptex information is applied to the mesh during rendering. Ptex is very sensitive to your geometry… any change in point count could potentially break things. You’re not allowed to polySmooth objects that will have Ptex textures, for example, or Maya won’t know what points to assign Ptex information to.

The way to get around this is to put the object into a VRay displacement set. Assign displacement and subdivision attributes to the displacement set like you normally would, but don’t attach a displacement map, and make sure the Displace Amount is set to 0. Rendering this way gets you the more correct image (Fig. 3):

[Fig. 3: the render with the object in a VRay displacement set. This looks a lot more like the original sculpt from Mudbox.]

This seems more like buggy behavior to me than some technical thing I’ve overlooked, but thankfully the workaround is pretty painless. If you have any better insight into what exactly is happening here, or if there’s a less hacky way to prevent it, I’d love to hear it.

Nov 19 2013
 

I just wanted to post a little bit about the Creep POP in Houdini because I’ve had such a hell of a time getting it to do anything useful.

I was trying to create the effect of raindrops sliding down glass. In Maya, this is a pretty simple thing: enable Needs Parent UV on the emitter, create Goal UV attributes on the particles, and set goalU=parentU and goalV=parentV in the creation expression. The particles stick to the object, and then you can work from there.

Houdini, as it usually does, makes things a little more complicated. The Creep POP looks for two important attributes: POSPRIM and POSUV. It wants to know what primitive to stick to, and what UV coordinates on that primitive to stick to. Of course, when emitting a particle randomly on a surface, you rarely have any idea what those numbers are, so we’ll have to use some expressions to create them manually.

Here goes. On the Source POP you’re emitting from, set up a birth group (leave Preserve Group off) and call it something like “justBorn.” We’ll create the attributes only on the brand new particles so they know what their starting position should be for the Creep. Next we’ll create two Attribute POPs. The first one we’ll use to create POSPRIM, so we know what primitive on the source geometry the particle was emitted from. On the first Attribute POP, set the Source Group to “justBorn”, the type to Integer, and the name to “posprim.” The expression for the value looks like this:

xyzdist($TX,$TY,$TZ,"../path_to_emitter",-1,3)

Here’s what this is doing. xyzdist() can return a number of things about an object based on a position in space; in this case, we’re querying based on the position of each new particle, $TX $TY $TZ. We want values from the object we’re emitting from, which is ../path_to_emitter. The -1 means we want values based on the nearest primitive we can find; any other value would query that specific primitive number instead. The final 3 means we want the function to return the primitive number, and nothing else.

If you check the Details View now and run your simulation a bit, you should see that each new particle now remembers what primitive it was emitted from via our new POSPRIM attribute.

Now we have to take care of POSUV, which is done in a very similar manner. On your second Attribute POP, again set the Source Group to “justBorn.” This time the type is Float and the size is 2 (since we’re dealing with a UV coordinate) and the name is “posuv.”

Here are the expressions for this one. The first expression goes into the first Value slot, and the second expression goes into the second.

xyzdist($TX,$TY,$TZ,"../path_to_emitter",$POSPRIM,1)

xyzdist($TX,$TY,$TZ,"../path_to_emitter",$POSPRIM,2)

It’s almost exactly the same expression every time… the difference between these expressions and the first one is that now we’re looking for U and V coordinates for a specific primitive based on our location. Since we have POSPRIM to tell us which primitive we should be sampling for every point, we can use the values 1 and 2 at the end of the expression to return U and V coordinates, respectively, based on our position. Now every particle knows what primitive it was emitted from, and the UV coordinates it was emitted from exactly. Check the expression help for xyzdist() to get a better idea of what kinds of things you can return from this expression.

Now you can just drop down a Creep POP and use the default local variables $POSPRIM and $POSUVU/$POSUVV for the Prim Number and Prim UV. If you want to use forces to drive the particles, set the Behavior to Slide and use whatever force you want to push the particles along the surface. It’s not so bad once you do it a few times; it’s just too bad this behavior doesn’t have a quicker setup.

Apr 15 2013
 

My latest gigantic Houdini project is finally live! I was the Technical Director for this one, which also means Houdini tube effects, lighting, shading, rendering, etc. Thanks to my fellow Houdini artist, Alvaro Segura, for handling the inky effects, as well as for helping me to refine the tube generator OTL I spent so long constructing.

Feb 11 2013
 

Many operations in Maya will run faster if Maya doesn’t have to refresh the viewport while running them. For example, if you switch the viewport to only show the Graph Editor before baking animation or caching particles or Alembic geometry, the operation will happen much faster than if Maya had to actually display the geometry for each frame as it’s baked. There is probably a way via the API to tell Maya’s viewport not to refresh, but since I don’t know shit about the API, here’s a workaround using a few of Maya’s less-documented MEL commands and some Python.

I wrote this method assuming that artists would want to cache out information frequently without it disrupting their workflow. That means I needed to first store the user’s current panel layout, then switch it to something that doesn’t require a refresh on every frame (like the Graph Editor), and then restore the previously stored layout when the operation finishes.

I prefer coding in Python, but some of the procedures I’m running are MEL functions found in scripts/startup, so they’re not documented and don’t have Python equivalents. A hybrid approach is the best way to handle it, since MEL is terrible.
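As a rough illustration of the idea (the full store-switch-restore script, with those undocumented MEL procs, is behind the link below), here’s a simpler variant that gets a similar speedup by temporarily unmanaging the main viewport pane so it stops drawing. The $gMainPane global and the paneLayout flags are standard Maya MEL; wrapping your bake in the try/finally is the assumption here:

import maya.cmds as cmds
import maya.mel as mel

def with_viewport_disabled(func, *args, **kwargs):
    """Hide Maya's main viewport pane while func runs, then restore it."""
    mel.eval("global string $gMainPane; paneLayout -e -manage false $gMainPane;")
    try:
        return func(*args, **kwargs)
    finally:
        mel.eval("global string $gMainPane; paneLayout -e -manage true $gMainPane;")

# usage: bake keyframes without the viewport redrawing every frame
# with_viewport_disabled(cmds.bakeResults, "pCube1", t=(1, 240), simulation=True)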

Click for some Python code… Continue reading »

Feb 01 2013
 

Just wanted to document a dumb little problem I was having. My compositors need cameras in .FBX format, and I had a camera that had a Look At object (an aim vector) which wasn’t being considered in the export. I tried parenting a duplicate of the first camera to the original and exporting that, but of course I just ended up with a still camera.

The solution is to use CHOPs. Make a CHOP network, then drop down an Object CHOP. Select the object you want to bake as the Target Object. Then you can create a duplicate of your original object, and use this expression to drive any of the channels you need to bake:

chop("../chopnet1/objectCHOPname/channelName")

Or you can use an Export CHOP, point it to the duplicate object, and make sure the Path field includes the full list of channels you want to export (such as t[xyz] and r[xyz]).

 

Jan 29 2013
 

I’ve been very, very busy, which is a lame excuse for the lack of posting new things, but there you have it.

Much of my work time recently has been dedicated to building inky, nebula-like effects in Houdini. Generally speaking, when you’re trying to make that whole ink-in-water effect, you dust off your copy of 3DS Max, do a quick fluid simulation, advect a bunch of particles through it, cache out a bazillion partitions of your simulation with different random seeds, and then render in Krakatoa. Well, I have none of those things, and I needed greater control over the simulation than what can typically be achieved in Fume/Krakatoa, so I tried to do it in Houdini. I definitely have a lot of respect for the Krakatoa renderer after a week spent on this effect. This stuff is hard! Hit the jump to see exactly how I went about this…

Continue reading »

Oct 25 2012
 

A short film that I worked on with a friend of mine, director Dan Blank, is finally out to the public! I was the CG Supervisor, and I handled all of the lighting, materials and rendering in addition to the overall pipeline.

Take a look at this article from The Atlantic magazine:
http://www.theatlantic.com/video/archive/2012/10/monster-roll/263968/

The short:

And here’s the VFX breakdown:

So excited it’s finally out there and getting good reviews!