any clever trick for tall grass on mobile (Quest)?

Suppose you had to make something like this:

on a mobile platform like Quest. The obvious approach is to have a bunch of overlapping transparent cards for all those bundles of grass. They could be combined into one big mesh or drawn with something like DrawInstancedIndirect, but no matter how the polys get into the pipeline, there’s going to be a huge amount of overdraw.

Is there any more clever solution, using a custom shader? Some way a whole area of grass could be one simple mesh, merely drawing as if it had a bunch of grass under it? I’m thinking something like casting a ray, within the shader, into a computationally defined sea of grass, and plotting the color of the first blade such a ray would hit, so each pixel only gets drawn once (by the grass, at least). Obviously not general ray-casting, but something very specific to the constraints of this problem.

Am I crazy? Is there a better trick? Is there an off-the-shelf solution already?

Try interior mapping and its relatives. It’s a kind of raycasting too; plenty of Shadertoy shaders also deal with infinitely repeating negative shapes. I’m working on positive infinitely repeating shapes for hair rendering, and a solution for thin axis-aligned strips exists.

Once you have infinity, it’s like a 3D query inside a convex shape: just discard hits that fall outside a convex primitive to get patches, and combine convex primitives to carve out more complex shapes.
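
To make that discard step concrete, here’s a toy CPU-side sketch (names are mine, not from the thread): trace against the infinite repeating field first, then keep the hit only if it lies inside a convex bounding primitive, here an axis-aligned box.

```python
def clip_to_box(hit, box_min, box_max):
    """Keep a hit point from an infinite repeating field only if it lies
    inside an axis-aligned box; otherwise 'discard' it (return None),
    which carves a finite patch out of the infinite field."""
    inside = all(lo <= h <= hi for h, lo, hi in zip(hit, box_min, box_max))
    return hit if inside else None
```

In a fragment shader the `None` branch would be an actual `discard`; combining several such primitives (keep if inside any of them) gives the more complex carved shapes described above.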

I’m working on a trick to shuffle the positions so they don’t look grid-aligned, basically the same way we shuffle seeds to create pseudo-random numbers by overlaying hashes at different scales.
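
A minimal sketch of that shuffling idea (the hash constants are arbitrary placeholders, not from any particular shader): hash the integer cell coordinates to get a stable pseudo-random offset per blade, so the field no longer reads as a regular grid.

```python
def hash2(ix, iy, seed=0):
    """Cheap deterministic integer hash -> float in [0, 1).
    The mixing constants are arbitrary; any decent integer hash works."""
    h = (ix * 374761393 + iy * 668265263 + seed * 1274126177) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return ((h ^ (h >> 16)) & 0xFFFF) / 65536.0

def jittered_blade_pos(ix, iy, cell_size=1.0, jitter=0.4):
    """Offset each blade from its grid cell centre by a hashed amount.
    Keeping jitter < 0.5 guarantees the blade stays inside its own cell,
    so a per-cell raycast still finds it."""
    jx = (hash2(ix, iy, 1) - 0.5) * 2 * jitter
    jy = (hash2(ix, iy, 2) - 0.5) * 2 * jitter
    return ((ix + 0.5 + jx) * cell_size, (iy + 0.5 + jy) * cell_size)
```

Because the offset is a pure function of the cell index, the shader can recompute it during the trace with no memory lookups.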

The problem with these techniques is that they are limited to primitive shapes (raycasting plus ray discard), so you have to combine primitives to get the desired shape. Specific texture shapes are typically random, so you can’t predict them mathematically.

So the most practical solution for you is probably raymarching: assume all the quads are on a grid and test visibility at each plane. That is, run a for loop that tests each pixel along the projected ray against a grass texture, stopping at the first solid pixel.

IMHO, on mobile hardware it’s not a real solution; you just move the problem elsewhere. Random memory access is not great on those chips, and it’s already bad on typical desktop GPUs anyway.

If what you’re aiming for is actually a Quest, or at least an Adreno-based mobile platform like the Quest, use alpha-tested grass. There’s no way to make that performant with alpha blending. Adreno’s tiled deferred rendering is surprisingly good at handling overdraw as long as what you’re rendering is opaque or using alpha-tested materials with ZWrite On. It’s even pretty decent with alpha to coverage, though for grass that’s too expensive.

Thanks to both for your replies — @bgolus , yes, alpha testing with ZWrite On would be how we handle it if we go with a bunch of overlapping cards (though I do hate the aliasing that causes… but maybe it’ll be less noticeable on Quest 2). And we’ve also used alpha to coverage for other foliage (looks SO much better but as you noted, is more expensive). If the hardware is especially good at this, then I guess I should try it before assuming it can’t work.

But I’m still hoping there’s a more clever way that doesn’t involve a bunch of cards at all.

@neoshaman , I understood at least 2/3 of the words in your post. :slight_smile: What you’re working on sounds very relevant, though. Got any screen shots you can share?

I don’t understand your point about random memory access (though to be fair, that’s only one of many things I don’t understand). I’m picturing something where the shader is built specifically for “drawing grass”. You slap it on some simple mesh, and it does the ray cast/march to work out what blade of grass (in a regular grid of grass blades) is hit. It wouldn’t need to access memory, in the simplest version, because it knows there is some perfectly regular grid of grass blades.

Raymarching means you step through many pixels to resolve a single fragment. Accessing a pixel is accessing memory, and it also means you jump around between memory locations, hence random access.

What you describe is raycasting: you solve the intersection mathematically, hence an interior-mapping-style shader. It infers the hit from a simple shape, finds the virtual position, then samples accordingly. I’m exploring this here: Infinite parallax hair volume using wrapping grid tracing (attempt)
Or Relief function shader for high depth and tiling, where it starts to look like grass.
But try interior mapping too, similarly with high depth and close walls along one axis. Raycasting techniques are limited to primitive shapes, though, so your grass will look boxy most of the time. You can try to alleviate that by mixing multiple casts, picking the closest result, and with texture trickery. Don’t forget to blend with the scene by writing the virtual z to the z-buffer.

Yes, something like raycasting is what I was thinking then. I think it’s fine to treat each blade of grass as some simple shape like a long, slender cone or cylinder, and to further assume they are in a regular square grid. So then, yeah, someone with sufficient math chops could probably work out how to cast a ray into a field of such and figure out what it hits.
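
For what it’s worth, that cast can be written as a top-down grid walk. The sketch below is my illustration, not code from the thread; it assumes unit cells, one vertical cylinder per cell centre, and a normalized 2D ray direction. It visits cells in ray order (a DDA traversal) and does an analytic ray-circle test in each, so the first hit found is the nearest blade.

```python
import math

def ray_vs_cylinder_field(ox, oy, dx, dy, radius=0.2, cell=1.0, max_cells=32):
    """Top-down view: walk the ray cell by cell over a square grid with a
    vertical cylinder of `radius` at each cell centre. Returns the ray
    parameter t of the first hit, or None. (dx, dy) must be normalized."""
    ix, iy = math.floor(ox / cell), math.floor(oy / cell)
    step_x = 1 if dx > 0 else -1
    step_y = 1 if dy > 0 else -1
    # ray parameter at the next x / y cell boundary, and per-cell increments
    tmax_x = ((ix + (step_x > 0)) * cell - ox) / dx if dx else math.inf
    tmax_y = ((iy + (step_y > 0)) * cell - oy) / dy if dy else math.inf
    tdelta_x = abs(cell / dx) if dx else math.inf
    tdelta_y = abs(cell / dy) if dy else math.inf
    for _ in range(max_cells):
        # analytic ray vs circle at this cell's centre
        cx, cy = (ix + 0.5) * cell, (iy + 0.5) * cell
        fx, fy = ox - cx, oy - cy
        b = fx * dx + fy * dy
        c = fx * fx + fy * fy - radius * radius
        disc = b * b - c                  # quadratic discriminant (a == 1)
        if disc >= 0:
            t = -b - math.sqrt(disc)      # near intersection
            if t >= 0:
                return t
        # advance to whichever cell boundary the ray crosses first
        if tmax_x < tmax_y:
            ix += step_x; tmax_x += tdelta_x
        else:
            iy += step_y; tmax_y += tdelta_y
    return None
```

A shader version would then shade the blade from the hit point (and could add the hashed jitter mentioned earlier, since the cylinder centre is a pure function of the cell index).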

After reviewing those other threads, though… whew! :slight_smile: I’m not sure that someone is me.

The code for infinite flat parallel square grass blades exists. Basically you would mix the code published in both threads. Start with a tutorial on parallax mapping or parallax occlusion mapping to get a rough overview, then look at box-projected cubemap and interior mapping tutorials; after that it will make sense.

Basically you need the tangent-space view vector at each pixel. Then test whether you are in the empty space (e.g. test whether U or V is over 0.5), look at the corresponding component of the view vector (x or y), and divide the width of the empty space by that component, from the ray position, to get how many steps are needed to reach the non-empty space along that direction. The integer number of steps tells you which plane was hit, so you can infer the z position, sample the texture at the scaled-down coordinates, do depth-related adjustments like fog, and write to the z-buffer (or discard the sample if the depth is too great, or if the virtual hit is outside the range you want, such as above the grass height). Since everything is based on position, you can deform the lookup to get effects; a swaying pattern might be possible.
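
If I read that recipe right, the core step (start in the gap, divide the gap width by the view-vector component to find the first solid strip) can be sketched in 2D like this. The strip width and period are arbitrary example values, and in a shader (x0, dir_x, dir_z) would come from the UV and the tangent-space view vector:

```python
def trace_strips(x0, dir_x, dir_z, solid=0.5, period=1.0):
    """Analytic trace against infinitely repeating axis-aligned strips:
    solid where (x mod period) < solid, empty otherwise.
    Returns (x_hit, z_hit) of the first solid strip the ray reaches,
    or (x0, 0.0) if it already starts inside one; None if it never can."""
    frac = x0 % period
    if frac < solid:
        return x0, 0.0                    # already inside a blade
    if dir_x > 0:
        dist = period - frac              # gap remaining ahead of us
    elif dir_x < 0:
        dist = frac - solid               # gap back to the strip edge
    else:
        return None                       # ray parallel to the strips
    t = dist / abs(dir_x)                 # divide gap by view component
    return x0 + dir_x * t, dir_z * t      # infer virtual depth from t
```

The returned z_hit is the "virtual z" to write to the depth buffer (or compare against a height limit and discard).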

That’s roughly the idea.

I did find this work-in-progress asset which uses parallax occlusion mapping (POM) to make short volumetric grass. Not exactly what I started out asking for, but pretty close (and we could probably live with shorter grass if necessary).

There’s also this tutorial on making a geometry shader that generates a procedural blade of grass for every point on a mesh, and for points in between, using runtime tessellation. But I don’t understand why this would be any more efficient than simply modeling all that grass in your favorite modeling program.

@bgolus , any thoughts on that? Is a geometry shader (with or without tessellation) ever going to perform better than the equivalent predefined mesh?

POM is raymarching, by the way.

Sorry for necroing this thread, but I would really like to know what your final solution was, @JoeStrout ?

None. We gave up on it.


“Make your grass rendering nearly free with this one easy trick…”
The trick is don’t have any grass


Seriously though, you can absolutely do grass on the Quest. But understand it comes at the cost of basically everything else. You have a limited vertex count and fragment budget, and there’s no getting around grass being very high on both. Even with Adreno’s hardware tile-based deferred rendering and alpha testing significantly reducing over-shading, it’s still there. Alpha-tested shaders basically end up running twice, once for the coverage and once for the color. In the second pass you’ll get a ton of over-shading from pixel quads: the screen is a grid of 2x2 pixel groups called pixel quads, and if a triangle is visible in even one pixel of a quad, all 4 pixels are processed. Basically the same problem as micro-triangles.
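
The pixel-quad cost is easy to see with a toy model (my illustration, not code from the thread): count the 2x2 quads a primitive touches and charge 4 fragment invocations per quad, regardless of how many pixels are actually covered.

```python
def overshade_factor(covered_pixels):
    """Helper-lane cost model: GPUs shade in 2x2 pixel quads, so every quad
    a triangle touches costs 4 fragment invocations even if only one pixel
    is covered. Returns (invocations, covered, invocations / covered)."""
    quads = {(x // 2, y // 2) for x, y in covered_pixels}
    invocations = 4 * len(quads)
    covered = len(covered_pixels)
    return invocations, covered, invocations / covered
```

A one-pixel-wide sliver of grass that covers 4 pixels in a vertical line touches 2 quads, so it pays for 8 invocations: a 2x over-shading factor before any overdraw between blades is counted.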

So, you can have grass, but possibly nothing else.

The best way to do grass on the Quest is to have as little of it as possible. A few tufts here or there and the rest is simply hinted at by ground textures and edge details. See almost every Nintendo game ever with grass.

Only a few scattered clumps of “real” grass, with most of it faked by giving the bits of geometry that stick out of the ground some grass textures around the base. Also fade out / shrink and hide the grass after a few meters.