I realize this effect was done years and years ago by smarter people than me (JT Nimoy, Adam Swaab, among others) but I’m a big nerd for procedural art and wanted to see how I could create the Quorra Heart effect described on Nimoy’s blog here. The effect itself isn’t Read more…
A quick script here to convert Gizmos in Nuke into Groups. Gizmos are nice in theory in that they provide reusable groups of nodes for users to interact with, but they’re often more trouble than they’re worth, especially if you need to send your Nuke scripts to someone else. Here’s Read more…
I’ve implemented most of my planned pipeline, finally. Just want to do a brief walkthrough to brag a little and show what this set of tools is capable of doing. The pipeline uses symlinks to allow each asset or shot to exist inside its own self-contained Maya project workspace, while Read more…
I've been working for the last few months on a pipeline system for Wolf & Crow here in Los Angeles. My efforts are typically focused on Maya pipelines, which I'll be documenting later, but I wanted to show off a little something I made that's usually outside my repertoire... web programming.
The company needed an interface to handle mundane tasks like creating new jobs and assigning job numbers to them, creating new assets, and creating new sequences & shots for projects. Normally this would be as easy as duplicating a template folder, but the pipeline here makes extensive use of symbolic links in order to tie multiple Maya workspaces together as efficiently as possible (for example, the "sourceimages" folder of every Maya project in a job can be symlinked to a single "textures" repository). Symlinks can't readily be created from Windows, although Windows will happily follow symlinks served from a Linux server. So the bulk of the work has to be done on the server itself via Python or Bash scripts.
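To make the symlink idea concrete, here's a minimal sketch of what one of those server-side Python scripts might do: replace a project's "sourceimages" folder with a link into a shared textures repository. The function name, folder layout, and the assumption that the template folder is empty are all mine, not the actual pipeline's.

```python
import os

def link_sourceimages(project_root, textures_repo):
    """Symlink a Maya project's 'sourceimages' folder to a shared
    textures repository, so every workspace in the job sees the
    same textures. Paths and layout are illustrative."""
    src = os.path.join(project_root, "sourceimages")
    # Clear out whatever the template copy left behind.
    if os.path.islink(src):
        os.remove(src)
    elif os.path.isdir(src):
        os.rmdir(src)  # assumes the placeholder folder is empty
    os.symlink(textures_repo, src)
    return src
```

Run on the Linux server, this keeps a single copy of every texture on disk while each shot's workspace still looks like a normal, self-contained Maya project.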
Prior to the web app, the standard procedure was to run plink.exe (an SSH command-line utility) inside a Python script using the subprocess module (see my earlier post on subprocess), which would pass a command over to the Linux file server to create all the symlinks. Or you could just SSH into the server using putty.exe and run the command yourself. This was clumsy, though, since you either needed to have Maya open to run the Python script through an interface, or you had to be comfortable enough with SSH and the Linux command line to run these scripts.
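The old plink-from-Python pattern looked roughly like the sketch below. The host name, user name, and remote script are made-up examples; only the general shape (build an argument list, hand it to subprocess) reflects what the post describes.

```python
import subprocess

def run_remote(host, user, remote_cmd, dry_run=True):
    """Run a command on the Linux file server via plink.exe.

    'host', 'user', and 'remote_cmd' here are hypothetical examples.
    -batch stops plink from prompting interactively; -ssh forces the
    SSH protocol.
    """
    cmd = ["plink.exe", "-batch", "-ssh", "%s@%s" % (user, host), remote_cmd]
    if dry_run:
        # Just return the argument list for inspection.
        return cmd
    # Raises CalledProcessError if the remote command fails.
    return subprocess.check_output(cmd)
```

The catch, as noted above, is that every artist either needs this wrapped in a Maya UI or needs to be comfortable running it by hand, which is exactly the friction the web app removes.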
Instead, there's a (sort of) sexy web interface that anyone can run from a browser! Here are some examples.
Click below to see a whole bunch of code.
I'm working on a production right now that involves sending absolutely enormous animated meshes (6.5 million polygons on average with a changing point count every frame) out of Houdini and into Maya for rendering. Normally it would be best to just render everything directly out of Houdini, but sometimes you don't have as many Houdini licenses (or artists) as you'd like, so you have to make do with what you have. If the mesh were smaller, or at least not animated, I'd consider using the Alembic format since V-Ray can render it directly as a VRayProxy, but for a mesh this dense animating over a long sequence, the file sizes would be impossibly huge since Alembics can't be output as sequences. Try loading a 0.5 TB Alembic over a small file server to 20 machines rendering the sequence at once, and you can see why Alembic might not be ideal for this kind of situation. Hit the jump below to see the solution. (more…)
Side FX added a new Cloud FX toolkit a version or two ago, and I recently had a chance to mess around with it. Their Cloud Rig shelf tool is pretty great out of the box... pick a shape, turn it into a cloud, done. The Cloud Noise and Cloud Light SOPs that are built into the rig setup can get some pretty good results, but it's not exactly what I wanted... I was looking for a solution that would be a little less dependent on the volume container resolution, and based more on textures instead. It's not necessarily the greatest solution for swirling, dynamic simulations, but for more-or-less static clouds, it gets nice, resolution-independent results that are quick to generate. Maya's fluid shader supports textures by default, but the Cloud Shader that Houdini uses doesn't have much in the way of textural control, so I had to make some customizations, and that's what this post is about. Check the link below to keep reading... (more…)
One of the bigger challenges with rendering liquids is that it can be difficult to get good UVs on them for texturing. Getting a displacement map on a liquid sim can make all the difference when you need some added detail without grinding out a multimillion-particle simulation. Unfortunately, liquid simulations have the annoying habit of stretching your projected UVs out after just a few seconds of movement, especially in more turbulent flows.

In Houdini smoke and pyro simulations, there's an option to create a "dual rest field" that acts as an anchor point for texturing, so that textures can be somewhat accurately applied to the fluid and advected through the velocity field. The trick with dual rest fields is that they regenerate every N seconds, offset from each other by N/2 seconds. A couple of detail parameters called "rest_ratio" and "rest2_ratio" are created, which are basically just sine waves at opposite phases to each other, used as blending weights between the two rest fields. When it's time for the first rest field to regenerate, its blend weight is at zero while the rest2 field is at full strength, and vice versa. It's great that these are built into the smoke and pyro solvers, but of course nothing in Houdini can be that easy, so for FLIP simulations we'll have to do this manually.

Rather than dig into the FLIP solver and deal with microsolvers and fields, I'll do this using SOPs and SOP Solvers in order to simplify things and avoid as many DOPs nightmares as possible. Here's the basic approach: create two point-based UV projections from the most convenient angle (XZ-axis in my case) and call them uv1 and uv2. As point attributes, they'll automatically be advected through the FLIP solver. Then reproject each UV map at staggered intervals, so that uv2 always reprojects halfway between uv1 reprojections. We'll also create a detail attribute to act as the rest_ratio, which will always be 0 when uv1 is reprojecting, and 1 when uv2 is reprojecting.
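The staggered blending weights described above can be sketched as a pair of opposite-phase waves. This is a minimal Python illustration of the idea, not the exact curve Houdini's built-in rest_ratio parameters use; the cosine shape is simply one reasonable choice that hits zero exactly when its field reprojects.

```python
import math

def rest_weights(t, period):
    """Blend weights for two staggered UV rest fields.

    uv1 reprojects at t = 0, period, 2*period, ... and uv2 at the
    halfway points in between. Each field's weight is zero at the
    instant it reprojects, so the pop from the fresh projection is
    hidden while the other field carries the texture.
    """
    w1 = 0.5 - 0.5 * math.cos(2.0 * math.pi * t / period)
    return w1, 1.0 - w1
```

In the SOP Solver version, the same expression would live in a wrangle or an expression on a detail attribute, with `t` driven by the current time and `period` by however often you choose to reproject.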
It all sounds more complicated than it really is. Here goes... (more…)
I ran into a problem recently where I was trying to make some nice-looking embers in Houdini, complete with nice motion-blurred trails. Typically with a particle system you use the velocity attribute to handle motion blur, but geometry velocity blur is always linear, so your motion trails will always be perfectly straight even if you have nice squiggly motions with your embers. Deformation motion blur looks great, but in most simulations particles are being born and dying all the time, and deformation motion blur doesn't work with a changing point count. The solution is to force a constant point count. This can be problematic when your particles need to have a lifespan, so there are a few little tricks you're going to have to pull in order to make this work... (more…)
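The core bookkeeping behind a constant point count can be sketched simply: never actually delete dead particles; instead, keep every particle in the geometry for the whole sequence and compute a visibility flag from birth and death times, so dead ones can be scaled to zero or made transparent at render time. The attribute names and frame-range convention below are my own illustration, not the exact setup behind the jump.

```python
def alive_mask(birth_frames, death_frames, frame):
    """Per-particle visibility for a fixed-size particle pool.

    Deleting dead particles changes the point count and breaks
    deformation motion blur, so instead every particle exists for
    the whole sim and this mask marks which ones are 'alive' at
    the given frame (born on or before it, not yet dead).
    """
    return [int(b <= frame < d)
            for b, d in zip(birth_frames, death_frames)]
```

Since the point count never changes, the renderer can match points frame-to-frame and produce properly curved deformation blur trails.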
I’ve been messing around with using Ptex in VRay for Maya and ran into a particularly weird little problem involving a normal map exported from Mudbox. It’s probably easier to show it than to tell about it (Fig. 1): That wax wrapper looks terrible… not at all like the original Read more…