I use the HLAPI in Unity 5.x to connect two instances of my application on localhost. On the host the Oculus Rift is used with VR Support enabled. The client instance is launched with the -vrmode None flag, so it renders the same scene to the regular screen. But when the client connects to the server, the host's performance drops from 75 fps to 45-50 fps, and the more clients connect, the lower the host's fps gets. If I run the instances separately (without any network communication), everything is fine. According to the profiler, it looks as if the server computes physics and rendering on behalf of the clients. Am I right? And would switching to the LLAPI improve performance?
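For context, here is a minimal sketch of the kind of bootstrap I mean (not the actual project code; the -client flag and the class name are just placeholders): the host starts with VR enabled, while an instance launched as a client connects through the HLAPI's NetworkManager and keeps VR rendering off.

```csharp
using System;
using System.Linq;
using UnityEngine;
using UnityEngine.Networking;   // HLAPI
using UnityEngine.VR;           // Unity 5.x VR settings

// Hypothetical bootstrap: starts this instance as the host (rendering to
// the Rift) or as a plain client, depending on a -client command-line flag.
public class NetworkBootstrap : MonoBehaviour
{
    public NetworkManager manager;   // assigned in the Inspector

    void Start()
    {
        bool isClient = Environment.GetCommandLineArgs().Contains("-client");

        manager.networkAddress = "127.0.0.1";
        manager.networkPort = 7777;

        if (isClient)
        {
            // -vrmode None already disables the HMD; this just makes sure
            // the client never tries to render in VR.
            VRSettings.enabled = false;
            manager.StartClient();
        }
        else
        {
            manager.StartHost();     // host renders to the Oculus Rift
        }
    }
}
```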
No, I don’t think so - the avatar is just a box. The behavior is the same even in an empty scene containing nothing but a plane with a GUI rendered on it. I’ve updated the profiler results, and the main cause of the slowdown is waiting for the GPU; rendering for the client takes only a small part. Here are profiler screenshots for the host-only instance and for the host with a client connected:
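One thing I could try to confirm that the two instances are simply competing for the GPU (just a sketch, nothing I've verified yet): cap the client's frame rate so it takes less GPU time, and see whether the host's fps recovers.

```csharp
using UnityEngine;

// Hypothetical tweak for the non-VR client only: disable vsync (otherwise
// targetFrameRate is ignored) and cap the frame rate so the client leaves
// more GPU headroom for the VR host running on the same machine.
public class ClientFrameCap : MonoBehaviour
{
    void Start()
    {
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 30;   // arbitrary cap, just for testing
    }
}
```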
Maybe it’s just an Oculus synchronization issue. I mean, if a rendered frame can’t be presented in time for the required refresh rate, the host has to wait until the next refresh to satisfy the rate limitation. Or am I wrong?
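To put rough numbers on that (assuming the headset refreshes at 75 Hz, which matches the 75 fps I normally get): the frame budget is 1000 / 75 ≈ 13.3 ms, so a frame that misses the deadline is held until the next refresh, and in the strict-vsync case the effective rate drops towards 37.5 fps. A simple logger like the sketch below (just an illustration, not profiler output) should show frame times clustering near multiples of 13.3 ms if that is what is happening:

```csharp
using UnityEngine;

// Hypothetical frame-time logger: if the host is being quantised by the
// Rift's 75 Hz refresh, average frame times should sit near multiples of
// 1000 / 75 ≈ 13.3 ms (13.3, 26.7, ...) rather than drift smoothly.
public class FrameTimeLogger : MonoBehaviour
{
    float accumulated;
    int frames;

    void Update()
    {
        accumulated += Time.unscaledDeltaTime;
        frames++;

        if (accumulated >= 1f)   // report roughly once per second
        {
            float avgMs = accumulated / frames * 1000f;
            Debug.Log(string.Format("avg frame time: {0:F1} ms ({1:F1} fps)",
                                    avgMs, frames / accumulated));
            accumulated = 0f;
            frames = 0;
        }
    }
}
```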