NumberFlow, visual editor for procedural textures

NumberFlow is a visual node-based editor extension that allows you to design your own procedural textures right inside Unity. You can generate textures and animations at runtime, providing virtually limitless content while requiring very little storage space. The full source code is included.

Buy Now from the Asset Store (affiliate link).

Download Free for non-commercial use from the product web page.

Visit the product web page for more information, web player demos, and documentation.

Frequently Asked Questions

Does it require Unity Pro?
No. NumberFlow works with both the free and Pro versions of Unity. It also looks nice with both the Light and Dark editor skins.

Can I use procedural textures while editing my scenes?
Yes. With the help of a Diagram Material Manager, you can use materials with procedural textures in edit mode.

Do I need to program to use it?
No. You design diagrams visually by dragging, configuring, and connecting nodes. You can link them to materials with a Diagram Material Manager. You can also export them to PNG images and use those like any other texture.

Does it run on mobiles?
Yes. It works on iOS, Android, and any other platform.

Does it run on Flash?
Don't count on it. It might, but NumberFlow is not supported for Unity's discontinued Flash export.

Does it use the GPU?
No. NumberFlow runs fully on the CPU.

Can I use texture animations?
Yes, but it is only practical to do realtime texture animation in very limited cases. You also need to do some scripting yourself. The web player demos give a good impression of how nonstop texture generation strains the CPU. The Gallery demo has some examples of using shaders for animation, which is a lot cheaper.

How can I generate textures as fast as possible?
The fastest way is to design in NumberFlow and then reproduce the result with your own code. Then you can eliminate the overhead of the diagram structure and incorporate all kinds of case-specific optimizations. Whether this is a wise use of development time is questionable.

How can I improve performance?
There are many ways to be smart about this. Here are some tips, but the main idea is to do less, more creatively. Use a Diagram Material Manager or your own thread to compute textures. Generate once at load time. Use a lower resolution. Use fewer frames for animations. Use pregenerated short looping animations. Combine multiple simple images for more complex effects. Use fewer octaves of noise. Use squared distances instead of exact distances.

Is it memory friendly?
Yes. Once a diagram is initialized it allocates no memory while generating textures. Of course the textures themselves do take up memory, as do any color arrays that you use. Make sure to set textures that won't change after generation to be no longer readable.
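In Unity this is done by passing `makeNoLongerReadable` to `Texture2D.Apply` after the last pixel write. A minimal sketch (standard Unity API, not NumberFlow-specific):

```csharp
using UnityEngine;

public static class TextureFinalizer
{
    // After the final SetPixels call, upload the data to the GPU and
    // drop the CPU-side pixel copy, roughly halving the memory cost.
    // Note: a non-readable texture cannot be written to again.
    public static void MakeUnreadable(Texture2D texture)
    {
        // updateMipmaps: false, makeNoLongerReadable: true
        texture.Apply(false, true);
    }
}
```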

Can I generate compressed textures at runtime?
Textures need to have the ARGB32, RGB24, or Alpha8 format; otherwise their pixels cannot be set. However, you can use the Texture2D.Compress method afterwards to compress to DXT1 or DXT5, where available.
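A sketch of that sequence (standard Unity API): generate into an uncompressed format, then compress in place.

```csharp
using UnityEngine;

public static class RuntimeCompression
{
    // Compress() picks DXT1 for an opaque format like RGB24 and DXT5
    // when there is an alpha channel (ARGB32), on platforms that
    // support it. Must be called before the texture is made unreadable.
    public static void CompressGenerated(Texture2D texture)
    {
        texture.Compress(true);     // true = high-quality compression
        texture.Apply(false, true); // optionally drop the readable copy too
    }
}
```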

Does it use Allegorithmic's Substance technology?
No. NumberFlow is a visual scripting tool for designing procedural textures yourself, right inside Unity. The textures are generated by scripts written in C#. There is no custom data format, no native libraries, and no external tool needed.

Can it be used with JavaScript/UnityScript and Boo?
Yes. Though NumberFlow is written in C#, you can use it just fine with the other scripting languages.

Is the source code available?
Yes. When you buy NumberFlow you get all source code. The free version has mostly precompiled DLLs.

Buy Now from the Asset Store (affiliate link).

Download Free for non-commercial use from the product web page.

Really nice for the good price. And the fireball - so beautiful :hushed:

This is super impressive. I have a question though. Is NumberFlow multithreaded, and if so, how well does it scale across threads? Does it automatically use as many cores as the device has, or do you need to specify how many it should use? The reason I ask is that I would like to know how well it scales on mobile. If I run it on a quad-core mobile chip, will it automatically run on 4 cores and be faster than on a single-core mobile chip, or would I need to explicitly tell it to run on 4 cores?

I noticed in your demo there is the computational spread option. I'm unclear whether that is spreading the computation across multiple cores of a CPU, or spreading it across multiple frames of game time. Also, the fact that the demo lets you specify how to spread the computation makes me wonder what its default behaviour is. Ideally it would have a load balancer that automatically spreads the work across multiple threads based on how busy they are. Is that the case?

I've also got a suggestion to make NumberFlow more user friendly. I think the most useful feature of NumberFlow would be linking its runtime-generated textures to Unity materials. It looks like this is possible, but at the moment you have to do it in code. It would be great if you added a button, similar to the one that exports to PNG, that exports a script which automatically links the diagram's texture output to the current game object's material. Then all a user would have to do is drag that script onto the game object they want to use the texture on, and it would be used by its shader.

The generated script could display the input variables of the linked diagram in Unity's inspector window and let the user vary them, as well as specify which texture channel it should link to on the object's material. You could also design it so that a user could drag more than one of these scripts onto a game object, because they may want to use them in multiple texture channels of a shader. That way they could combine multiple diagrams into one shader and use the GPU to perform any combining operations that would be too expensive to do on the CPU with NumberFlow.

You could even get really fancy and add options in the generated scripts that let the user specify how the diagram's texture is generated, similar to what can be done with substances: generated on level load, run in realtime, run once and then cached in memory, or run once and saved to storage, so that on the next application load the diagrams don't need to run at all and their only purpose is to keep the required download size down.

@V0odo0: The fireball is my favorite too!

@IanStanbridge: NumberFlow diagrams can be run in separate threads. (Caveat: textures must be supplied as Color arrays.) However, you cannot use multiple threads to compute a single diagram, because that will screw up its internal data. So at most one thread per diagram.
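The one-thread-per-diagram pattern described above might look like the sketch below. The `diagram.Fill(...)` call is hypothetical (NumberFlow's actual method name is not shown in this thread); the threading and main-thread texture upload are standard Unity practice.

```csharp
using System.Threading;
using UnityEngine;

public class ThreadedDiagramRunner : MonoBehaviour
{
    public int width = 256, height = 256;
    Color[] pixels;
    volatile bool done;

    void Start()
    {
        pixels = new Color[width * height];
        // One worker thread per diagram; never share a diagram
        // between threads, as that corrupts its internal data.
        new Thread(() =>
        {
            // Hypothetical call: fill the color array off the main thread.
            // diagram.Fill(pixels, width, height);
            done = true;
        }).Start();
    }

    void Update()
    {
        if (done)
        {
            done = false;
            // Texture APIs are main-thread only, so apply the result here.
            var texture = new Texture2D(width, height, TextureFormat.ARGB32, false);
            texture.SetPixels(pixels);
            texture.Apply();
            GetComponent<Renderer>().material.mainTexture = texture;
        }
    }
}
```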

The demo distributes the computation over multiple frames, not threads. It gives you manual control over how many frames to use, but it can also be adaptive, which is what the editor does.

I can put some threading stuff in there, but a thread management solution is kinda out of scope. That's typically a stand-alone thing that people use for their entire app and would want NumberFlow to work with, not something NumberFlow should roll its own version of.

The initial NumberFlow version is basically the minimum needed to get it out there, but I definitely want to add functionality and improve usability. Your suggestions are very nice and I really want people to bring more! I am investigating how to integrate better with materials but the options are limited. I'll keep working though. Code generation is also on my mind, it was just totally out of scope for 1.0. Buy NumberFlow, and updates will come!

I have updated the diagram demo to use multithreading instead of spreading computation over multiple frames. This change will be included in version 1.0.2, which should come out somewhere later this week. The introductory price will also end then, so grab it quick!

Very cool package,

Just bought and imported into 4.3

Assets/Plugins/Catlike Coding/NumberFlow/Functions/FloatFunctions.cs(297,47): error CS0120: An object reference is required to access non-static member `Perlin.Noise(float, float, float)'

Regarding multi-threading..

I am using Loom Multithreading from the asset store. Would you consider in a future update to add support for Loom Multi-threading?


@rocki: That is a weird error, as that method signature does not match what's being used. Can you check whether the code does resolve to CatlikeCoding.Math.Perlin? Does it fail for all the noise calls?

I am not familiar with Loom, but I looked through its API. It looks like you can just fire up a thread with a method that fills a color array and calls back to the main thread when it's done, so you can fill a texture there. Is there some specific Loom stuff I should hook into?

Got it working... turns out there was another plugin that has a Perlin.cs in the Plugins folder. Seems strange that even though your project is in its own namespace, Unity should use the other Perlin.cs.

As for Loom, what I meant about supporting Loom is that users of NumberFlow who have Loom wouldn't have to do anything: just start up the package and it's already multithreading with Loom. Perhaps there could be a cross-promotion between the 2 packages, as they are really a great fit for one another.

Btw, is there a way to do distance maps in the package?

Really enjoying playing with the package... Very Flexible and Great Potential... Super Nice.

Found a very interesting link on Amortized Noise.

"A new noise generation algorithm called amortized noise generates smooth noise using dynamic programming techniques to amortize the cost of floating point multiplications over neighboring points."

@rocki: Thank you for checking! I did some tests and it clashes when the script has the same name, is also in the Plugins folder, and is not inside a namespace. I'll make sure it won't clash anymore in 1.0.2 by using the fully qualified type.

I could include add-on functionality that conveniently hooks into Loom. Not automatic, as that might well worsen performance and screw things up, but certainly optional. I will investigate this.

Computing distance maps from arbitrary textures isn't a good fit. You're better off with a specialized algorithm. I do happen to have a free distance map generator for use in Unity's editor, which might be useful.

I'm not sure that amortized noise is worth it, as it seems to rely on a lot of cached data. The paper doesn't do much explaining, so I'll have to study the code.

:) Oh, and if you design something neat, I wanna see!

Would be so awesome if you could make NumberFlow and Loom integration more convenient.

Nice Distance Map Generator, will try it out.

Sure thing, if I have something cool, definitely will show you.


first: this is a great package and for this price a real steal.

i had the same issue with the procedural examples (from unity) package. so its not that rare ;). i don't know why unity gives precedence to the other class instead of yours.

factor 2 or 3 faster is at least worth the look imo ;). there is also a very extensive gpu-noise package from jesta. it would be great if this could be included/linked once it supports gpu->cpu communication (in the works).

some remarks, ideas and feature suggestions:
complex diagrams get messy real quickly, so it would be great to have a string field per node where the user can enter a "variable name" (just for the user), comments and remarks. just to remember what a node is supposed to do.
pi should be included as a constant (maybe also e), as well as functions for arcsine and arccosine (and maybe tangent + arctangent).
the currently selected node should be more "visible", maybe through another color or a frame.
also you could highlight all links which lead to the currently selected node so its easier to "follow" the values (where they come from) when there are several "paths".
when i select a node it shows "linked" in the detail view. it would be useful to click on this and have the linked node selected (or have a small button beside it to do this) to traverse backwards.
the v coordinate goes from bottom (0) to top (1). i don't know if this is standard behavior but at least it could be documented somewhere ;). for my purpose i have to use 1-v.
an organizational feature would be some rectangular container (nameable/commentable) where i can put nodes inside, and then only move the container to move all contained nodes accordingly. this way one could name the parts of an "equation" and keep it organized in one place.
noise has a parameter lacunacity which should be lacunarity if i'm not mistaken (not a native english speaker).
a gradient should not have a solid color by default. i was wondering why my noise was not working as expected. a default gradient (e.g. black/white) would at least prevent this and make issues more obvious.

now what i have done with it: i created a diagram for a planet texture that shall be mapped on a sphere. i calculate the position of each pixel in 3d space and then feed this position as a direction vector into the noise engine to grab a noise value there. with this approach seamless planetary surfaces can be achieved, which you see mapped on an icosphere (a lat/long sphere has texture issues at the poles, much reduced here but still not gone).
consider the distortions of the flat texture at the top and bottom (poles), which occur because the pixels are much closer together there. the gradient is simply red for 0 and white for 1.

my attempts are not finished yet and i have plans for different planet types, planetary ring textures, a corona and clouds. i just want to let all people know that your tool is a great help, easy to use and highly recommended for any type of prototyping (and for production when performance is ok). its a pleasure to work with (once you get the concept) as you directly see the effects of your changes.

i also want to achieve craters. do you have an idea how this can be achieved with your package (maybe with custom nodes)? can a region be set to a certain value? you have a function for distance from center, so a distance from an arbitrary point would maybe be useful. maybe also creating a "mean" value for a region.
what about creating a normal map? or do i need an external tool for it? (would be ok).
do unlinked nodes affect performance? i ask because its often required to show some intermediate result or to test something, and its a hassle to always create the output again.
when is the diagram saved? when the scene is saved? after each change?

Work for 1.0.2 is nearly done. I've added various flavors of voronoi noise. Though slower than Perlin noise, it enables lots of interesting effects.


@exiguous: :) Ah, you bought it, and I am glad to hear you like it!

I admit that the editor is currently quite bare-bones. Figuring out how to improve the workspace is a real challenge, because it needs to stay as simple as possible, but I have some ideas. The selected node and its connections have a slightly different color, but you're right in that it's not visible enough. Turning "linked" into a button sounds like a good idea, thanks!

I'm using OpenGL's UV convention, which has (0,0) at the bottom left. Here you can't please everyone, but at least there's 1 - V.

Cool that you're texturing planets! I'm going to include sphere coordinate nodes somewhere in the future, but I've yet to decide which systems to use. You're using the obvious one here, but as you mentioned, quality suffers at the poles.

I can definitely include a PI constant node, but you can also use Sinusoid nodes instead of Sine and Cosine nodes. They have 2PI conveniently factored in already.

Can you achieve craters? I'd say yes, but can you elaborate? Do you want to place features at exact points, or leave it up to noise? You can get the distance to a point by subtracting and taking the (squared) length of the result, which you can then use to modulate some effect.
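The subtract-and-take-length trick maps directly to code. A minimal plain-C# sketch (names are illustrative, not NumberFlow's API); note that using the squared distance avoids a square root, which is the "squared distances" tip from the performance FAQ above:

```csharp
using System;

public static class CraterFalloff
{
    // Squared distance from (x, y) to a crater center; avoids Sqrt.
    public static float SquaredDistance(float x, float y, float cx, float cy)
    {
        float dx = x - cx, dy = y - cy;
        return dx * dx + dy * dy;
    }

    // Modulation factor for an effect: full strength at the center,
    // fading linearly (in squared distance) to zero at the radius.
    public static float Falloff(float x, float y, float cx, float cy, float radius)
    {
        float t = SquaredDistance(x, y, cx, cy) / (radius * radius);
        return Math.Max(0f, 1f - t);
    }
}
```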

Can you create normal maps? Yes, but there are many ways to do so. The Terrain demo uses a two-step approach where one diagram creates a height field and another computes normals from that. See the "Normals Heights" diagram. There is another "Terrain" diagram in the other demo, which calculates heights, colors, rough normals, and lighting in one go. Nice to show off, but not practical.
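The second step of that two-step approach, computing normals from a height field, is typically done with central differences. A generic plain-C# sketch (not the demo's actual code; `strength` scales how pronounced the slopes appear):

```csharp
using System;

public static class NormalMapper
{
    // Rough surface normal at (x, y) of a row-major height field,
    // via central differences, clamped at the edges.
    public static float[] NormalAt(float[] heights, int width, int height,
                                   int x, int y, float strength)
    {
        int xm = Math.Max(x - 1, 0), xp = Math.Min(x + 1, width - 1);
        int ym = Math.Max(y - 1, 0), yp = Math.Min(y + 1, height - 1);
        float dx = (heights[y * width + xp] - heights[y * width + xm]) * strength;
        float dy = (heights[yp * width + x] - heights[ym * width + x]) * strength;
        // The unnormalized normal is (-dx, -dy, 1); normalize it.
        float len = (float)Math.Sqrt(dx * dx + dy * dy + 1f);
        return new float[] { -dx / len, -dy / len, 1f / len };
    }
}
```

For a normal map texture you would then remap each component from the -1..1 range to 0..1 color values.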

Do unlinked nodes affect performance? No. Only nodes leading to the output node are part of the computation. I regularly work with diagrams that have huge dead chunks in them.

When is the diagram saved? It's saved when you save the scene.

i know that you can't please everyone. i just thought there was a standard, as this way its counter-intuitive for me. but probably there are standards to cover all possibilities, and that is like no standard at all ;). i also wanted to mention this for other people needing this information. so it was not entirely a complaint ;).

this problem is caused by the mesh, not the texture or the uv coordinates. the only way to avoid it would be to use a sphere based on a cube, with a cubemap texture mapped onto it. but then the coordinate calculation is not straightforward any more. but if you made a solid mapping of cubemap uv to 3d coordinates and vice versa, i would not complain.

the placing of those features can be random, maybe influenced by a linked value. for example there could be more craters in higher regions and none at sea level.

i have no idea how to "loop" over the data. in general i would determine the position of the crater center, then find all pixels within its radius and adjust their values. from my understanding your package calculates each pixel once? i don't know how to include several potential influences in the diagram. as mentioned i would not mind doing it in a custom node, but you stated that those should not rely on persistent state (i guess because of distributed calculation), so it is difficult to keep a list of crater data. or maybe i just understand the workings wrong. thats why i ask if you see a method/approach that fits the internal workings of your package.

is there a save check for when i (accidentally) close the diagram or open another one (by selecting it)? is it saved in the background, am i asked, or is my work since the last manual scene save lost?

another thing i would like to understand is how your fractal noise samples the perlin coordinates in relation to the point put in as parameter.
so imagine i have 2 octaves and the pixel coordinates are given as input on the x-z plane. pixel 0,0 is used to sample perlin at position 0,0,0. pixel 1,1 samples at 1,0,1 and so on. so where is the value for the second octave with doubled frequency taken from? 0,0,0 and 2,0,2 in the example? so is the frequency simply a multiplier for the coordinates? i ask because the noise wraps around without a seam out of the box, and i want to understand why. i know its hard to explain/understand.

anyway thanks for your explanations and input.
btw i always prefer the documentation as a pdf together with the package. a pdf can be searched for keywords, but for information spread over different webpages this is more difficult/inconvenient.
and i hope you see my complaints not as nagging but as constructive criticism and suggestions for improvements. in the end you decide what matches your "vision" of the product and what not. you don't have to include everything just to make me happy (i have paid already ;)).

@exiguous: :) You aren't nagging, I greatly appreciate your input!

Indeed you don't loop over data. What you do is blend. For example, you can design something that looks like craters all over the place, then use altitude to fade it in or out on top of your base altitude. The upcoming voronoi noise would be a good candidate for generating a crater landscape.

Here I added one big hole by factoring it into some noise.

The Perlin noise is solid 3D noise, it's sampled at whatever point you provide. The frequency is indeed used to scale this coordinate. Only the tiling noise variants have to do some extra trickery. Your planet texture is seamless because you wrap it around a sphere in 3D.
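That frequency-as-coordinate-multiplier idea is the standard fractal sum. A generic sketch of how octaves typically sample solid noise, assuming lacunarity 2 and persistence 0.5 (NumberFlow's exact parameters may differ); the noise function is passed in so any solid 3D noise works:

```csharp
using System;

public static class FractalNoise
{
    // Sum octaves of a solid 3D noise function. Each octave samples
    // the same noise at a scaled coordinate: the frequency doubles
    // and the amplitude halves per octave.
    public static float Sum(Func<float, float, float, float> noise,
                            float x, float y, float z, int octaves)
    {
        float sum = 0f, amplitude = 1f, frequency = 1f;
        for (int o = 0; o < octaves; o++)
        {
            sum += noise(x * frequency, y * frequency, z * frequency) * amplitude;
            frequency *= 2f;
            amplitude *= 0.5f;
        }
        return sum;
    }
}
```

So for the example above, the second octave of point (1, 0, 1) is indeed sampled at (2, 0, 2), just weighted at half strength.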

It wasn't really correct of me to say it's saved when you save the scene. The diagrams are assets and as such cannot be lost unless you delete them from your file system. You won't lose work when you close them. Unity saves assets at various moments, from what I remember including on scene save and application quit. The only way you could lose work is when Unity crashes before it saves its assets.

I don't like PDFs and much prefer web pages myself. Searching in documents is so often riddled with false positives. Web pages are far easier to maintain for me. All the links, animations, web player, downloads, etcetera are also right there. I could add a site search though.

just to make sure i'm not too demanding ;).

thanks for the example. i hope i find some time on the weekend to investigate it in detail.

i know this issue. i had already created seamless textures with simplex noise in code a while back. i just thought it should only work for the point itself. in my tests (with simplex) i used the position on the unit sphere as a direction and scaled it to the length of the radius i wanted to sample from for each octave. so a short radius has a small circumference and thus samples a low-frequency noise.
i just wondered why it worked with all octaves in your noise. it seems its done the same way in your method, by uniformly scaling the point in all 3 directions.

ok. i just wondered because when i select another diagram in the project view it gets displayed immediately without clicking "open in editor". this is useful for quick reference of examples etc, but only as long as the previous diagram is not lost ;).

i understand this. with a site search the issue is gone anyway. an "online documentation" also comes to mind, where you press a question mark beside a command and land on the relevant page. but i'm just thinking out loud ;).

hey Jasper,
i have followed your big hole example.
your example has one crater and thats my issue: how do i achieve multiple ones? its hard for me to understand when i cannot loop over data and refer to other data (neighbouring pixels).
am i right that i need to do a blend for each crater i want to have? so my theoretical setup is to have a diagram that produces the general surface in a texture. then i create a loop with one iteration per crater and feed the texture to another diagram which blends in one crater per iteration at an arbitrary position. is this the setup you meant? one blend per crater might be very time consuming, as the whole texture is touched. maybe its better to do it in code and only touch the required pixels.
also i don't know how to wrap around the texture when a crater is at the border and reaches the "other" side.

more questions:
how can i seed the perlin noise?
as noise is needed between 0 and 1 in most cases, you could do the scaling to 0-1 internally (in a special node or via a checkbox).
in what form is an input gradient defined? is it a 1d texture?

thanks alot.

Edit: another "issue" is that the pixel distance is not equal to the 3d-space distance. so if i simply used it, the craters would look distorted. so it seems craters in my use case are too hard to achieve with your tool. this is not bad, as it cannot be suited for everything. so i'll do the craters in code and use the diagrams for texture, heightmap, normal map and maybe clouds.

@exiguous: Doing it once per crater is indeed not a good idea. Instead, you want to generate craters with noise as well. Below I've done just that with NumberFlow 1.0.2. I used three voronoi noise nodes to generate spheres at different sizes and subtracted them from the terrain noise.


As it's all 3D this approach also works fine when combined with a sphere texture. The screenshot shows a 3x3 tiled preview.


The noise uses a permutation table. You cannot seed it; instead you can offset the sample point so you end up using a different part of the domain, like adding 123 to Z. You have a 256-unit cube to move around in; beyond that it will repeat. The Perlin noise matches the reference, which is centered around zero and has an effective range of about -0.7 to 0.7 for one octave. Transforming it to a 0-1 range is expensive in the general case; better to define your own constant factor that works. Another useful trick is to use a "Float / Range / Pingpong 01" node to make it bounce between 0 and 1.
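The offset trick can be wrapped in a tiny helper (illustrative, not part of NumberFlow): derive a per-seed offset inside the 256-unit repeat range and shift the sample point by it, so different seeds sample different parts of the noise domain while the same seed stays deterministic.

```csharp
using System;

public static class SeededSampling
{
    // The permutation table cannot be reseeded, but shifting the
    // sample point works just as well. The noise domain repeats every
    // 256 units, so offsets are kept within that range.
    public static void OffsetPoint(int seed, ref float x, ref float y, ref float z)
    {
        var random = new Random(seed); // same seed -> same offset
        x += random.Next(256);
        y += random.Next(256);
        z += random.Next(256);
    }
}
```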

thanks for your example. so i'll wait until 1.0.2 arrives. btw making 1 crater per blend was not because i like the method but because of my lack of understanding of how this blending works. thats why i asked about looping over data. and without the voronoi noise (which was not released then) i saw no other possibility. but thanks for showing me the way.

for simplex noise the table contained the numbers from 0 to 255 iirc, and i used a seed for unity's random number generator to fill the list with all those numbers in pseudorandom order. this way the noise should look different at the same position with different seeds. or does the table need to be in this fixed form for a certain reason?