Hi all,
I’ve noticed that the Mac client of my game runs buttery smooth, while the Windows client has some framerate drops. I know I’ll need to spend some quality time with the profiler to optimize my game, but I’m curious why this is happening. The quality settings on both are the same (same graphics settings, resolution, vsync, monitor refresh rate, etc.), and while my Mac is newer, it has much less power than my Windows desktop (CPU speed, GPU, and RAM). For most other games it’s the reverse: my Mac chugs along while my PC runs them exceptionally well (even games that I know were made in Unity). But for my game, it’s the opposite.
Is it likely a difference in graphics APIs (DirectX on Windows vs. Metal or OpenGL on Mac)? I’ve heard the API the Mac uses is better.
Edit: Sorry, I put this in the wrong section. I only realized I’d posted it in “Scripting” after the fact, and I’m not sure if I can move or delete it.