SOPs to COPs
Building upon the infection solver, we now interpret the @infection_time attribute and generate a texture map. This technique is a powerful way to transfer point-level attributes—whether animated or static—into a 2D texture. By leveraging textures in materials, we can avoid the need for extremely high polygon counts, making it an excellent optimization strategy for rendering.
This method is rooted in the classic Compositing Operators (COPs) workflow, sticking with Houdini's traditional 2D texture generation tools. No Copernicus required—pure OG COPs!
To start, we need to make sure our model has clean, non-overlapping UVs. Without them, this process will produce some strange results in the baked texture.
We also need the UVs assigned to the points, not the vertices. To do this, I use a Point Split SOP with the attribute set to uv, which splits the points at UV seams; we can then promote the attribute from vertices to points. After that, we just need to ensure the uv attribute is propagated all the way downstream through the infection system to where we calculate @infection_time.
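If you prefer a wrangle over the Attribute Promote SOP, the vertex-to-point copy after the Point Split can be sketched in VEX (a minimal sketch; node placement as described, names illustrative):

```vex
// Point Wrangle (Run Over: Points), placed after the Point Split SOP.
// After the split, each point sits on one side of a UV seam, so copying
// the first attached vertex's uv onto the point is unambiguous.
int vtx = pointvertex(0, @ptnum);   // linear index of the point's first vertex
v@uv = vertex(0, "uv", vtx);        // promote the vertex uv to the point
```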
When we have our @infection_time attribute out of the solver, we first need to normalize it to a consistent 0-1 range. I use an Attribute Promote SOP, set to Detail and Maximum, to store the highest point value as a detail attribute. I can then reference that detail attribute in an Attribute Remap SOP to ensure we always get the normalized 0-1 range.
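The same remap can be done in a single Point Wrangle after the Attribute Promote — a sketch, assuming the promoted detail attribute keeps the default name infection_time:

```vex
// Point Wrangle, after the Attribute Promote (Detail, Maximum).
// Reads the promoted maximum and remaps every point into the 0-1 range.
float maxt = detail(0, "infection_time");
f@infection_time = (maxt > 0) ? f@infection_time / maxt : 0.0;
```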
Next, we want our points to sit at their UV coordinates. In a Point Wrangle, this is a single line:
"v@P = v@uv;"
Now our points are in UV space, still carrying the @infection_time attribute.
At this point, we need two Nulls:
One connected to the points in their 3D coordinates (branched off before the UV flatten, so the positions are still in 3D).
One upstream, where we point split our UV attribute.
However, before the second Null, we also need the geometry in 2D space. This is accomplished by applying the same transformation:
"v@P = v@uv;"
With this step, we now have:
3D points with the @infection_time attribute.
2D geometry aligned with UV space.
Lastly, we need to capture the 3D rest position of our geometry.
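This can be done with a Rest Position SOP, or with a one-line Point Wrangle placed before the UV flatten (attribute name rest as used later in the COPs network):

```vex
// Point Wrangle, BEFORE the "v@P = v@uv;" line:
// store the original 3D position so each flattened point
// still remembers where it lives in 3D
v@rest = v@P;
```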
Now, we need a COPs network. Inside it, we only need one node: the VOP Generate node. This node should be set to a square resolution (1024 × 1024, for example).
Inside the VOP, we combine the X and Y globals into a vector, leaving the Z value at 0. This maps each pixel's coordinates into 0-1 space in 3D, where our 2D geometry resides.
Next, we need to collect the 3D rest position from our 2D geometry. To do this, use the XYZdist node. Set the P channel's input to the XY vector and reference the 2D geometry Null.
Finally, to capture the rest position, add a Primitive Attribute node that references the same 2D geometry and collects the @rest attribute.
Now, we essentially have proxy 3D coordinates for our geometry at each pixel of our COP output. This can be visualized by feeding the output of the Primitive Attribute node into the R, G, and B channels.
However, we want to sample a custom attribute for each pixel, which allows us to avoid having an excessively high point count for our simulation. To achieve this, we use the PCOpen node. First, take the rest position attribute from the Primitive Attribute node and feed it into the P channel of the PCOpen. This time, reference the 3D points null as the "Point Cloud Texture."
To extract an attribute from the point cloud, we use a PCFilter node. You can input any attribute name into the channel input—in this case, we want the @infection_time attribute.
Finally, connect the output of the PCFilter to the R, G, B channels of the VOP Generate node, and we’re good to go.
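Putting the whole VOP network into words, the per-pixel logic is equivalent to the following VEX sketch (the op: paths, null names OUT_2D and OUT_3D, and the 0.1 search radius are placeholder assumptions, not values from the setup above):

```vex
// Per pixel: X and Y are the generator's 0-1 pixel coordinates.
vector uvpos = set(X, Y, 0.0);

// 1. Find the closest location on the UV-flattened geometry.
int prim;
vector primuv;
xyzdist("op:/obj/geo1/OUT_2D", uvpos, prim, primuv);

// 2. Read the interpolated 3D rest position at that location.
vector restP = primuv("op:/obj/geo1/OUT_2D", "rest", prim, primuv);

// 3. Open the 3D point cloud at the rest position and filter the attribute.
int handle = pcopen("op:/obj/geo1/OUT_3D", "P", restP, 0.1, 8);
float infection = pcfilter(handle, "infection_time");

// infection now drives the R, G and B channels of the output pixel.
```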
For previewing the texture, I use the Labs Quick Material node and reference the COP VOP node using the "op:" prefix before the reference (the path will look something like op:/img/comp1/vopgenerate1, depending on your network names).
When sending to render, I find it’s best to cache the infection simulation first and then use the PDG tools to bake out the texture map. Once a texture is baked for each frame, the render can simply read them back in per frame.