In April 2019, Harry Alisavakis made a great write-up about the “magical marbles” effect he had shared previously on Twitter. Check that out first to get a high level overview of the effect we’re after (while you’re at it, you should see some of his other excellent shader posts).
While his write-up provided a brief summary of the technique, the purpose of this tutorial is to offer a more concrete look at how you could implement code for this in Three.js to run on the web. There are also some tweaks to the technique here and there that try to make things more straightforward.
This tutorial assumes intermediate familiarity with Three.js and GLSL.
Overview
You should read Harry’s post first because he provides helpful visuals, but the gist of it is this:
- Add fake depth to a material by offsetting the texture look-ups based on camera direction
- Instead of using the same texture at each iteration, let’s use depth-wise “slices” of a heightmap so that the shape of our volume is more dynamic
- Add wavy motion by displacing the texture look-ups with scrolling noise
There were a couple of parts of this write-up that weren’t totally clear to me, likely due to the difference in features available in Unity vs Three.js. One is the jump from parallax mapping on a plane to a sphere. Another is how to get vertex tangents for the transformation to tangent space. Finally, I wasn’t sure if the noise for the heightmap was evaluated as code inside the shader or pre-rendered. After some experimentation I came to my own conclusions for these, but I encourage you to come up with your own variations of this technique.
Here’s the Pen I’ll be starting from. It sets up a boilerplate Three.js app with an init and tick lifecycle, color management, and an environment map from Poly Haven for lighting.
See the Pen by Matt (@mattrossman) on CodePen.
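The Pen’s exact code isn’t reproduced here, but a minimal setup along those lines might look like the sketch below (the HDR file name, camera placement, and tone mapping choice are placeholders, and the color management API shown is the pre-r152 one):

import * as THREE from 'three'
import { RGBELoader } from 'three/examples/jsm/loaders/RGBELoader.js'

const renderer = new THREE.WebGLRenderer({ antialias: true })
renderer.setSize(innerWidth, innerHeight)
renderer.outputEncoding = THREE.sRGBEncoding // color management
renderer.toneMapping = THREE.ACESFilmicToneMapping
document.body.appendChild(renderer.domElement)

const scene = new THREE.Scene()
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100)
camera.position.set(0, 0, 3)

// Environment map from Poly Haven for image-based lighting ('studio.hdr' is a placeholder path)
new RGBELoader().load('studio.hdr', hdr => {
  const pmrem = new THREE.PMREMGenerator(renderer)
  scene.environment = pmrem.fromEquirectangular(hdr).texture
  hdr.dispose()
})

// tick lifecycle
renderer.setAnimationLoop(() => renderer.render(scene, camera))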
Step 1: A Blank Marble
Marbles are made of glass, and Harry’s marbles definitely showed some specular shine. In order to make a truly beautiful glassy material it would take some pretty complex PBR shader code, which is too much work! Instead, let’s just take one of Three.js’s built-in PBR materials and hook our magical bits into that, like the shader parasite we are.
Enter onBeforeCompile, a callback property of the THREE.Material base class that lets you apply patches to built-in shaders before they get compiled by WebGL. This technique is very hacky and not well explained in the official docs, but a good place to learn more about it is Dusan Bosnjak’s post “Extending three.js materials with GLSL”. The hardest part about it is determining which part of the shaders you need to change exactly. Unfortunately, your best bet is to just read through the source code of the shader you want to modify, find a line or chunk that looks vaguely relevant, and try tweaking stuff until the property you want to modify shows visible changes. I’ve been writing personal notes of what I discover since it’s really hard to keep track of what the different chunks and variables do.
I recently discovered there’s a much more elegant way to extend the built-in materials using Three’s experimental Node Materials, but that deserves a whole tutorial of its own, so for this guide I’ll stick with the more common onBeforeCompile approach.
For our purposes, MeshStandardMaterial is a good base to start from. It has specular and environment reflections that will make our material look very glassy, plus it gives you the option to add a normal map later on if you want to add scratches onto the surface. The only part we want to change is the base color on which the lighting is applied. Luckily, this is easy to find. The fragment shader for MeshStandardMaterial is defined in meshphysical_frag.glsl.js (it’s a subset of MeshPhysicalMaterial, so they are both defined in the same file). Oftentimes you need to go digging through the shader chunks represented by each of the #include statements you’ll see in the file; however, this is a rare occasion where the variable we want to tweak is in plain sight.
It’s the line right near the top of the main() function that says:
vec4 diffuseColor = vec4( diffuse, opacity );
This line normally reads from the diffuse and opacity uniforms which you set via the .color and .opacity JavaScript properties of the material, and then all the chunks after that do the complicated lighting work. We are going to replace this line with our own assignment to diffuseColor so we can apply whatever pattern we want on the marble’s surface. You can do this using regular JavaScript string methods on the .fragmentShader field of the shader provided to the onBeforeCompile callback.
material.onBeforeCompile = shader => {
  shader.fragmentShader = shader.fragmentShader.replace(/vec4 diffuseColor.*;/, `
    // Assign whatever you want!
    vec4 diffuseColor = vec4(1., 0., 0., 1.);
  `)
}
By the way, the type definition for that mysterious callback argument is available here.
In the following Pen I swapped our geometry for a sphere, lowered the roughness, and filled the diffuseColor with the screen space normals which are available in the standard fragment shader on vNormal. The result looks like a shiny version of MeshNormalMaterial.
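As a rough sketch of what that could look like (the geometry detail, roughness value, and the 0.5 remap of the normal are my own choices, not necessarily the Pen’s exact code):

const geometry = new THREE.SphereGeometry(1, 64, 32)
const material = new THREE.MeshStandardMaterial({ roughness: 0.1 })

material.onBeforeCompile = shader => {
  shader.fragmentShader = shader.fragmentShader.replace(/vec4 diffuseColor.*;/, `
    // vNormal is already declared by the chunks included above main()
    vec4 diffuseColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);
  `)
}

scene.add(new THREE.Mesh(geometry, material))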
See the Pen by Matt (@mattrossman) on CodePen.
Step 2: Fake Volume
Now comes the harder part — using the diffuse color to create the illusion of volume inside our marble. In Harry’s earlier parallax post, he talks about finding the camera direction in tangent space and using this to offset the UV coordinates. There’s a great explanation of how this general principle works for parallax effects on learnopengl.com and in this archived post.
However, converting stuff into tangent space in Three.js can be tricky. To the best of my knowledge, there’s not a built-in utility to help with this like there are for other space transformations, so it takes some legwork to both generate vertex tangents and then assemble a TBN matrix to perform the transformation. On top of that, spheres are not a nice shape for tangents due to the hairy ball theorem (yes, that’s a real thing), and Three’s computeTangents() function was producing discontinuities for me so you basically have to compute tangents manually. Yuck!
Luckily, we don’t really need to use tangent space if we frame this as a 3D raymarching problem. We have a ray pointing from the camera to the surface of our marble, and we want to march this through the sphere volume as well as down the slices of our height map. We just need to know how to convert a point in 3D space into a point on the surface of our sphere so we can perform texture lookups. In theory you could also just plug the 3D position right into your noise function of choice and skip using the texture, but this effect relies on lots of iterations and I’m operating under the assumption that a texture lookup is cheaper than all the number crunching happening in e.g. the 3D simplex noise function (shader gurus, please correct me if I’m wrong). The other benefit of reading from a texture is that it allows us to use a more art-oriented pipeline to craft our heightmaps, so we can make all sorts of interesting volumes without writing new code.
Originally I wrote a function to do this spherical XYZ→UV conversion based on some answers I saw online, but it turns out there’s already a function that does the same thing inside of common.glsl.js called equirectUv. We can reuse that as long as we put our raymarching logic after the #include <common> line in the standard shader.
Creating our heightmap
For the heightmap, we want a texture that seamlessly projects on the surface of a UV sphere. It’s not hard to find seamless noise textures online, but the problem is that these flat projections of noise will look warped near the poles when applied to a sphere. To solve this, let’s craft our own texture using Blender. One way to do this is to bend a high resolution “Grid” mesh into a sphere using two instances of the “Simple Deform modifier”, plug the resulting “Object” texture coordinates into your procedural shader of choice, and then do an emissive bake with the Cycles renderer. I also added some loop cuts near the poles and a subdivision modifier to prevent any artifacts in the bake.
The resulting bake looks something like this:
Raymarching
Now the moment we’ve been waiting for (or dreading) — raymarching! It’s actually not so bad. The following is an abbreviated version of the code. For now there’s no animation; I’m just taking slices of the heightmap using smoothstep (note the smoothing factor, which helps hide the sharp edges between layers), adding them up, and then using this to mix two colors.
uniform sampler2D heightMap;
uniform vec3 colorA;
uniform vec3 colorB;
uniform int iterations;
uniform float depth;
uniform float smoothing;

/**
 * @param rayOrigin - Point on sphere
 * @param rayDir - Normalized ray direction
 * @returns Diffuse RGB color
 */
vec3 marchMarble(vec3 rayOrigin, vec3 rayDir) {
  float perIteration = 1. / float(iterations);
  vec3 deltaRay = rayDir * perIteration * depth;

  // Start at point of intersection and accumulate volume
  vec3 p = rayOrigin;
  float totalVolume = 0.;

  for (int i = 0; i < iterations; ++i) {
    // Read heightmap from current spherical direction
    vec2 uv = equirectUv(normalize(p));
    float heightMapVal = texture(heightMap, uv).r;

    // Take a slice of the heightmap
    float height = length(p); // 1 at surface, 0 at core, assuming radius = 1
    float cutoff = 1. - float(i) * perIteration;
    float slice = smoothstep(cutoff, cutoff + smoothing, heightMapVal);

    // Accumulate the volume and advance the ray forward one step
    totalVolume += slice * perIteration;
    p += deltaRay;
  }

  return mix(colorA, colorB, totalVolume);
}
/**
 * We can use this later like:
 *
 *   vec4 diffuseColor = vec4(marchMarble(rayOrigin, rayDir), 1.0);
 */
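One detail the abbreviated code leaves open is where rayOrigin and rayDir come from. Below is a sketch of one way to wire them up (my own approach, not necessarily what the Pen does): pass the vertex position through as a varying and use the built-in cameraPosition uniform. Here marchMarbleGLSL is a hypothetical string holding the function above, and v_pos is a made-up varying name; as noted further down, this leans on the marble sitting untransformed at the origin.

material.onBeforeCompile = shader => {
  // (wiring up the custom uniforms is covered in the next section)

  // Pass the vertex position through to the fragment shader
  shader.vertexShader = shader.vertexShader
    .replace('#include <common>', '#include <common>\nvarying vec3 v_pos;')
    .replace('#include <begin_vertex>', '#include <begin_vertex>\nv_pos = position;')

  shader.fragmentShader = shader.fragmentShader
    // Paste the marchMarble function after <common> so equirectUv is in scope
    .replace('#include <common>', '#include <common>\nvarying vec3 v_pos;\n' + marchMarbleGLSL)
    .replace(/vec4 diffuseColor.*;/, `
      vec3 rayDir = normalize(v_pos - cameraPosition); // from camera to surface
      vec3 rayOrigin = v_pos;                          // point on the sphere
      vec4 diffuseColor = vec4(marchMarble(rayOrigin, rayDir), 1.0);
    `)
}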
This logic isn’t really physically accurate — taking slices of the heightmap based on the iteration index assumes that the ray is pointing towards the center of the sphere, but this isn’t true for most of the pixels. As a result, the marble appears to have some heavy refraction. However, I think this actually looks cool and further sells the effect of it being solid glass!
Injecting uniforms
One final note before we see the fruits of our labor — how do we include all these custom uniforms in our modified material? We can’t just stick stuff onto material.uniforms like you would with THREE.ShaderMaterial. The trick is to create your own personal uniforms object and then wire up its contents onto the shader argument inside of onBeforeCompile. For instance:
const myUniforms = {
  foo: { value: 0 }
}

material.onBeforeCompile = shader => {
  shader.uniforms.foo = myUniforms.foo
  // ... (all your other patches)
}
When the shader tries to read its shader.uniforms.foo.value reference, it’s actually reading from your local myUniforms.foo.value, so any change to the values in your uniforms object will automatically be reflected in the shader.
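So animating a value later just means mutating your own object, for example inside the tick function:

// The compiled program holds a reference to the same { value } object
myUniforms.foo.value = Math.sin(performance.now() / 1000)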
I typically use the JavaScript spread operator to wire up all my uniforms at once:
const myUniforms = {
  // ... (lots of stuff)
}

material.onBeforeCompile = shader => {
  shader.uniforms = { ...shader.uniforms, ...myUniforms }
  // ... (all your other patches)
}
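For this marble, the uniforms object might look something like the following sketch (the names match the GLSL above; the texture path and default values are placeholders):

const heightMap = new THREE.TextureLoader().load('heightmap.png') // the bake from earlier

const myUniforms = {
  heightMap: { value: heightMap },
  colorA: { value: new THREE.Color('#000000') },
  colorB: { value: new THREE.Color('#00ffff') },
  iterations: { value: 48 },
  depth: { value: 0.6 },
  smoothing: { value: 0.2 }
}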
Putting this all together, we get a gassy (and glassy) volume. I’ve added sliders to this Pen so you can play around with the iteration count, smoothing, max depth, and colors.
See the Pen by Matt (@mattrossman) on CodePen.
Technically the ray origin and ray direction should be in local space so the effect doesn’t break when the marble moves. However, I’m skipping this transformation because we’re not moving the marble, so world space and local space are interchangeable. Work smarter not harder!
Step 3: Wavy Motion
Almost done! The final touch is to make this marble come alive by animating the volume. Harry’s waving displacement post explains how he accomplishes this using a 2D displacement texture. However, just like with the heightmap, a flat displacement texture warps near the poles of a sphere. So, we’ll make our own again. You can use the same Blender setup as before, but this time let’s bake a 3D noise texture to the RGB channels:
Then in our marchMarble function, we’ll read from this texture using the same equirectUv function as before, center the values, and then add a scaled version of that vector to the position used for the heightmap texture lookup. To animate the displacement, introduce a time uniform and use that to scroll the displacement texture horizontally. For an even better effect, we’ll sample the displacement map twice (once upright, then upside down so they never perfectly align), scroll them in opposite directions and add them together to produce noise that looks chaotic. This general strategy is often used in water shaders to create waves.
uniform sampler2D displacementMap;
uniform float time;
uniform float strength;

// Lookup displacement texture
vec2 uv = equirectUv(normalize(p));
vec2 scrollX = vec2(time, 0.);
vec2 flipY = vec2(1., -1.);
vec3 displacementA = texture(displacementMap, uv + scrollX).rgb;
vec3 displacementB = texture(displacementMap, uv * flipY - scrollX).rgb;

// Center the noise
displacementA -= 0.5;
displacementB -= 0.5;

// Displace current ray position and lookup heightmap
vec3 displaced = p + strength * (displacementA + displacementB);
uv = equirectUv(normalize(displaced));
float heightMapVal = texture(heightMap, uv).r;
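On the JavaScript side, the displacement texture needs repeat wrapping so the scrolled look-ups tile, and the time uniform gets bumped every frame. A rough sketch (the file name and scroll speed are placeholders):

// Add these to myUniforms before the material first compiles
const displacementMap = new THREE.TextureLoader().load('displacement.png')
displacementMap.wrapS = displacementMap.wrapT = THREE.RepeatWrapping

myUniforms.displacementMap = { value: displacementMap }
myUniforms.time = { value: 0 }
myUniforms.strength = { value: 0.1 }

const clock = new THREE.Clock()
renderer.setAnimationLoop(() => {
  myUniforms.time.value = clock.getElapsedTime() * 0.05
  renderer.render(scene, camera)
})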
Behold, your magical marble!
See the Pen by Matt (@mattrossman) on CodePen.
Extra Credit
Hard part’s over! This formula is a starting point from which there are endless possibilities for improvements and deviations. For instance, what happens if we swap out the noise texture we used earlier for something else like this:
This was created using the “Wave Texture” node in Blender
See the Pen by Matt (@mattrossman) on CodePen.
Or how about something recognizable, like this map of the earth?
Try dragging the “displacement” slider and watch how the floating continents dance around!
See the Pen by Matt (@mattrossman) on CodePen.
In that example I modified the shader to make the volume look less gaseous by boosting the rate of volume accumulation, breaking the loop once it reached a certain volume threshold, and tinting based on the final number of iterations rather than accumulated volume.
For my last trick, I’ll point back to Harry’s write-up where he suggests mixing between two HDR colors. This basically means mixing between colors whose RGB values exceed the typical [0, 1] range. If we plug such a color into our shader as-is, it’ll create color artifacts in the pixels where the lighting is blown out. There’s an easy solve for this by wrapping the color in a toneMapping() call as is done in tonemapping_fragment.glsl.js, which “tones down” the color range. I couldn’t find where that function is actually defined, but it works!
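As a rough sketch, an HDR color on the JavaScript side is just a color pushed past 1 (the hex value and multiplier here are arbitrary), and in the shader the final mix gets wrapped in that toneMapping() call:

// An “HDR” color: components beyond the usual [0, 1] range
myUniforms.colorB.value = new THREE.Color('#e68bff').multiplyScalar(8)

// ...and at the end of marchMarble:
// return toneMapping(mix(colorA, colorB, totalVolume));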
I’ve added some color multiplier sliders to this Pen so you can push the colors outside the [0, 1] range and observe how mixing these HDR colors creates pleasant color ramps.
See the Pen by Matt (@mattrossman) on CodePen.
Conclusion
Thanks again to Harry for the great learning resources. I had a ton of fun trying to recreate this effect and I learned a lot along the way. Hopefully you learned something too!
Your challenge now is to take these examples and run with them. Change the code, the textures, the colors, and make your very own magical marble. Show me and Harry what you make on Twitter.
Surprise me!