Can someone from Unity (or someone with Multiplay experience) answer the following four questions about scaling in Multiplay?
-
When setting max available servers, Unity shows a prompt saying 'more servers may be made available at no extra cost'. We have seen this ourselves: we set max servers to 2 but get 7 available servers. Of those 7, only 2 can be allocated; the other 5 cannot be used at all. Is this the expected behavior? If so, the prompt is very misleading and the extra 5 servers should be hidden.
-
The scaling settings documentation (Scaling settings (unity.com)) contradicts that statement, because it says that any available server incurs costs. This ties back to the first question: do available servers incur costs even when we set max to a smaller number?
-
Once you exceed the number of servers a single machine can hold, Multiplay automatically spins up another machine to serve the next allocation. This is all well and good, but the machine is never switched off, even after all of its servers are deallocated. For example, with 7 servers per machine, allocating an 8th server leaves us with 14 available servers, and we are most likely paying for 14 servers instead of 8 (see the sketch below). Is there a configuration setting that allows machines to be turned off when they are no longer needed, i.e. when their servers are deallocated?
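To make the cost concern concrete, here is a rough back-of-the-envelope sketch in Python. The hourly rate is a placeholder (not a real Multiplay price), and it assumes billing is per available server rather than per allocated server, which is exactly the point we are trying to confirm:

```python
# Placeholder figures: RATE is an invented hourly price per server;
# 7 servers per machine matches what we observed on our fleet.
RATE = 0.10               # $/server/hour (placeholder, not a real price)
SERVERS_PER_MACHINE = 7

allocated = 8             # servers we actually use after the 8th allocation
machines = -(-allocated // SERVERS_PER_MACHINE)   # ceiling division -> 2
available = machines * SERVERS_PER_MACHINE        # 2 machines * 7 = 14

print(f"cost/hour if billed per allocated server: ${allocated * RATE:.2f}")   # $0.80
print(f"cost/hour if billed per available server: ${available * RATE:.2f}")   # $1.40
```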
-
Is there an easy way to monitor costs while testing, so we can get an idea of CPU, memory, and bandwidth usage? A sketch of the kind of measurement we mean is below.
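For reference, this is the sort of machine-level measurement we have in mind, as a minimal Python sketch using psutil (the 5-second interval and output format are arbitrary choices, and this only captures local usage on the machine running the server build, not anything Multiplay reports or bills on):

```python
# Log CPU, memory, and network throughput every 5 seconds while load-testing.
# Requires `pip install psutil`.
import time
import psutil

INTERVAL = 5  # seconds; arbitrary choice
prev = psutil.net_io_counters()
psutil.cpu_percent(interval=None)  # prime the counter; the first call returns 0.0

while True:
    time.sleep(INTERVAL)
    cpu = psutil.cpu_percent(interval=None)   # CPU % since the previous call
    mem = psutil.virtual_memory().percent     # % of RAM currently in use
    now = psutil.net_io_counters()
    up = (now.bytes_sent - prev.bytes_sent) / INTERVAL / 1024    # KiB/s out
    down = (now.bytes_recv - prev.bytes_recv) / INTERVAL / 1024  # KiB/s in
    prev = now
    print(f"cpu={cpu:5.1f}%  mem={mem:5.1f}%  up={up:8.1f} KiB/s  down={down:8.1f} KiB/s")
```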
Thank you!