Currently I’m having quite a problem with FPS performance in my VR app.
According to the Profiler, it’s all about rendering (Render.OpaqueGeometry).
In this app I create a point cloud of about 17,000 points.
At the moment I instantiate a primitive black sphere for each point.
I have no lights or other expensive effects, just a solid white background and many black sphere GameObjects.
The more spheres are in my FOV, the lower the FPS, which is kind of logical, but the FPS drops below 15 very fast.
So my question is: is there a way to increase performance, maybe by attaching a simpler shader/material than the built-in Standard one, or is this number of GameObjects simply too much for the rendering pipeline to provide good FPS?
Enabling GPU instancing on the GameObjects’ material doesn’t seem to do the trick.
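For reference, this is roughly how I turn it on from code; sphereMaterial is just a placeholder name for the one material all spheres share:

// Assumption: sphereMaterial is the single material shared by every sphere’s renderer.
sphereMaterial.enableInstancing = true;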
What exactly do you mean by attaching all spheres to a single object?
Do you mean attaching them to a parent object or something like Mesh.CombineMeshes?
EDIT:
Found something that helps quite a bit:
The wrong way was to create a sphere for each point with GameObject.CreatePrimitive() and then edit the components in code.
Now I create a prefab of a sphere with just a Mesh Filter and a Mesh Renderer and instantiate this prefab for each point.
UnityEngine.Object o = Resources.Load("Prefab");
GameObject go = GameObject.Instantiate(o) as GameObject;
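For each point, the instantiation looks roughly like this (points is just a placeholder for my actual point-cloud positions):

GameObject prefab = Resources.Load("Prefab") as GameObject;
foreach (Vector3 point in points) // points: hypothetical Vector3[] with the point positions
{
    GameObject go = GameObject.Instantiate(prefab, point, Quaternion.identity) as GameObject;
}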
Now the spheres are clones of the same prefab rather than individually built objects.
Another thing that helped was to combine the meshes of all the spheres into one (Mesh.CombineMeshes).
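A rough sketch of how the combining could look, assuming all spheres are children of one parent object that has its own Mesh Filter and Mesh Renderer with the shared material:

using System.Collections.Generic;
using UnityEngine;

public class PointCloudCombiner : MonoBehaviour
{
    void Start()
    {
        MeshFilter[] filters = GetComponentsInChildren<MeshFilter>();
        List<CombineInstance> combine = new List<CombineInstance>();

        foreach (MeshFilter mf in filters)
        {
            if (mf.transform == transform) continue; // skip the parent’s own MeshFilter

            CombineInstance ci = new CombineInstance();
            ci.mesh = mf.sharedMesh;
            ci.transform = mf.transform.localToWorldMatrix;
            combine.Add(ci);

            mf.gameObject.SetActive(false); // the individual sphere is no longer needed
        }

        Mesh combined = new Mesh();
        // 17,000 spheres have far more than 65k vertices in total, so 32-bit indices are needed.
        combined.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
        combined.CombineMeshes(combine.ToArray());

        GetComponent<MeshFilter>().sharedMesh = combined;
    }
}

This way the whole cloud is drawn as one mesh instead of one draw call per sphere.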