Let’s say I have a Texture2D where every pixel is either black or white, and for each black pixel I want to find the nearest white pixel. How could I do so? I know I can use color.grayscale to tell whether a pixel is black or white, and I know how to get the distance between two pixels, but I’m stuck on finding the nearest neighbor without just checking every pixel in the texture.
At some point the work has to be done. If there are relatively few white pixels, iterate the entire image once and make a list of their positions, then iterate that list when you care and ask which is closest. Assuming the image doesn’t change, that list is valid forever.
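A sketch of that approach (in Python rather than Unity C#, purely as an illustration; `image` as a 2D list of 0/1 values stands in for the texture data):

```python
# Illustrative sketch, not Unity code: `image` is a 2D list where
# 1 = white and 0 = black, standing in for Texture2D pixel data.

def white_pixels(image):
    # One pass over the image: collect the (x, y) position of every white pixel.
    return [(x, y) for y, row in enumerate(image)
                   for x, v in enumerate(row) if v == 1]

def nearest_white(whites, bx, by):
    # Compare squared distances; the ordering is the same as with true
    # distances, and we skip the square root entirely.
    return min(whites, key=lambda p: (p[0] - bx) ** 2 + (p[1] - by) ** 2)
```

As long as the texture doesn’t change, `white_pixels` only needs to run once; each black pixel then costs one pass over the (hopefully short) list.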
Hello,
to avoid quadratic searches (each pixel checking every other pixel) I would map out the white ones before running the foreach loop. With an array of white-pixel positions you’d have far fewer points to look through. That is, if your white points are randomly placed.
Knowing it’s a 2D texture, more optimal approaches can be imagined if your points are less random. Walking over your points in an organized manner (top to bottom, left to right), you may be able to find the nearest white pixel simply by index in the white-points array: the first one being the most top-left, the last one the most bottom-right.
Finally, once your white pixels are mapped out, you can imagine building a heat map of closest white pixels with an algorithm like Voronoi. That map would let you look up the closest one directly.
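One way to sketch that “map of closest white pixels” (Python for illustration, not Unity code): a multi-source BFS that floods outward from every white pixel at once, so each black pixel inherits its nearest source. Note this measures 4-connected (Manhattan) steps, which only approximates Euclidean distance; the jump flooding algorithm is the usual GPU-friendly way to build a proper Voronoi map.

```python
from collections import deque

def nearest_white_map(image):
    # image: 2D list where 1 = white, 0 = black.
    # Returns a grid where each cell holds the (x, y) of its nearest white
    # pixel, measured in 4-connected steps.
    h, w = len(image), len(image[0])
    nearest = [[None] * w for _ in range(h)]
    queue = deque()
    for y in range(h):
        for x in range(w):
            if image[y][x] == 1:
                nearest[y][x] = (x, y)  # white pixels are their own source
                queue.append((x, y))
    while queue:
        x, y = queue.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and nearest[ny][nx] is None:
                nearest[ny][nx] = nearest[y][x]  # inherit the source
                queue.append((nx, ny))
    return nearest
```

Building the map costs one pass over the image; every lookup afterwards is constant time.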
It kinda depends on how your white pixels are placed and when you have the time to perform the calculations.
Little side note here: to compare distances, I would recommend using sqrMagnitude instead of magnitude, because the latter is just the square root of the former. It’s not much with today’s computational capabilities, but there’s no need to add another computation when it’s not necessary.
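A quick way to convince yourself the comparison is equivalent (Python for illustration; `dist2` is a made-up helper playing the role of sqrMagnitude):

```python
import math

def dist2(a, b):
    # Squared Euclidean distance: monotonic in the true distance,
    # so min/max/sorting give the same winner without any sqrt.
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

points = [(3, 4), (1, 1), (6, 0)]
origin = (0, 0)
# Same winner whether we compare squared or true distances.
closest = min(points, key=lambda p: dist2(origin, p))
```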
EDIT: another point, this seems like a job for the GPU. I advise you to look into ComputeShaders to immensely accelerate the process.
You can try implementing a quad tree, or finding an existing implementation with a permissive license.
https://en.wikipedia.org/wiki/Quadtree#Region_quadtree
This algorithm works for textures of arbitrary complexity (i.e. pixel distribution) or dimensions in near-optimal time.
To use it properly, you would have to build the structure by feeding all the white “pixels” to it. The algorithm doesn’t care about the “pixels” themselves (or the texture), as everything’s just a point to it, but you can make it even faster if it works with integers (aka integral quadtree as opposed to spatial).
Then you would query it with a 2D bounding box and it would return a list of all the white pixels inside it. If there are no white pixels in the bounding box, you expand it until there is at least one, and that’s your winner. If there is more than one pixel, you then crunch the squared distances and pick the winner (the one with the smallest distance).
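A minimal sketch of that structure and query loop (Python rather than Unity C#, with made-up names like `Quadtree` and `nearest_in_quadtree`; a real implementation would want iterative traversal and tighter bounds handling):

```python
CAPACITY = 4  # max points per node before it subdivides

class Quadtree:
    """Region quadtree over integer (x, y) points, e.g. white-pixel positions."""

    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h  # this node's region
        self.points = []
        self.children = None

    def insert(self, px, py):
        if not (self.x <= px < self.x + self.w and self.y <= py < self.y + self.h):
            return False
        if self.children is None:
            # Leaves hold points until full; unit-sized nodes never split.
            if len(self.points) < CAPACITY or self.w <= 1 or self.h <= 1:
                self.points.append((px, py))
                return True
            self._subdivide()
        return any(c.insert(px, py) for c in self.children)

    def _subdivide(self):
        hw, hh = self.w // 2, self.h // 2
        x, y, w, h = self.x, self.y, self.w, self.h
        self.children = [Quadtree(x, y, hw, hh),
                         Quadtree(x + hw, y, w - hw, hh),
                         Quadtree(x, y + hh, hw, h - hh),
                         Quadtree(x + hw, y + hh, w - hw, h - hh)]
        for p in self.points:  # push stored points down into the children
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, qx, qy, qw, qh, out):
        # Prune subtrees whose region doesn't overlap the query box.
        if (qx >= self.x + self.w or qx + qw <= self.x or
                qy >= self.y + self.h or qy + qh <= self.y):
            return out
        out += [p for p in self.points
                if qx <= p[0] < qx + qw and qy <= p[1] < qy + qh]
        for c in (self.children or []):
            c.query(qx, qy, qw, qh, out)
        return out

def nearest_in_quadtree(tree, bx, by, max_r):
    """Expand a box around (bx, by) until it catches a white pixel, then
    re-query once at the winner's true radius (a corner hit in the first box
    can hide a closer pixel just outside it) and crunch squared distances."""
    r = 1
    while r <= max_r:
        hits = tree.query(bx - r, by - r, 2 * r + 1, 2 * r + 1, [])
        if hits:
            d2 = min((p[0] - bx) ** 2 + (p[1] - by) ** 2 for p in hits)
            r2 = int(d2 ** 0.5) + 1
            hits = tree.query(bx - r2, by - r2, 2 * r2 + 1, 2 * r2 + 1, [])
            return min(hits, key=lambda p: (p[0] - bx) ** 2 + (p[1] - by) ** 2)
        r *= 2
    return None
```

Note the second query: the first hit bounds the distance, but since a box at radius r can contain points up to √2·r away, one re-query at the winner’s actual radius guarantees the true nearest.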
I have shared a util before, to efficiently determine the closest point from a list, here.
Note: Do mind that using this in a bruteforce manner isn’t a good solution on its own. You still need an efficient spatial structure, comparable to a quadtree.
The reason you’d compare squared distances is that it avoids computing the roots. I’ve also included a version that computes the root of just the one winning distance, in case you care about the actual distance at all.