Hi there,
I’m seeking clarification on how CPU pricing is determined in relation to its MHz. Could you explain whether changing the CPU’s MHz affects the cost, and how that calculation works? This will help us estimate our spending more accurately.
Are you an Enterprise customer? If not, you cannot change the server specifications.
There is no public information about bare-metal machine pricing. You can use the Compute Engine pricing structure to make a best guess, though. The default that everyone gets is the lowest-end N2 machine.
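For a back-of-the-envelope version of that best guess, here is a minimal sketch in Python. The per-vCPU and per-GB rates are placeholders, not actual prices; look up the current on-demand figures for your region on the Compute Engine pricing page. Note that the formula has no MHz term at all, since Compute Engine bills per vCPU and per GB of RAM, not per clock cycle:

```python
# Rough estimate of what a server might cost if it were billed like a
# standard Compute Engine N2 VM. The rates below are PLACEHOLDERS; pull the
# current per-vCPU and per-GB-of-RAM figures for your region from the
# Compute Engine pricing page before trusting the output.

VCPU_RATE_PER_HOUR = 0.03   # USD per vCPU-hour (placeholder, region-dependent)
RAM_RATE_PER_HOUR = 0.004   # USD per GB-hour   (placeholder, region-dependent)
HOURS_PER_MONTH = 730       # commonly used average month length

def estimate_monthly_cost(vcpus: int, ram_gb: float) -> float:
    """Estimate the monthly on-demand cost of an N2-style machine.

    The formula only uses core count and RAM; clock speed (MHz) does not
    appear anywhere, because Compute Engine bills per vCPU, not per MHz.
    """
    hourly = vcpus * VCPU_RATE_PER_HOUR + ram_gb * RAM_RATE_PER_HOUR
    return hourly * HOURS_PER_MONTH

# Example: the lowest-end default machine (2 vCPUs, 8 GB RAM).
print(f"~${estimate_monthly_cost(2, 8):.2f} per month for an always-on server")
```

Treat the result as an order-of-magnitude guess only; the actual bare-metal rates are negotiated and not public.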
Feel free to use my Unity Gaming Service Cost Calculator sheet for estimating the various service costs. It’s available by joining my free Patreon.
No, I’m not. But I’ve seen tutorials and documentation showing that you can customize the CPU clock speed in the build configuration. What would be the point of minimizing the clock speed if it doesn’t affect the cost? And if it does, how does the calculation work?
For example, check 6:25 in the video.
No. You have to reach out to the sales team, meaning it’s negotiated on a per-customer basis and behind an NDA.
You can see in the video that these fields are greyed out. CodeMonkey switches to the custom tab but doesn’t actually attempt to modify the values. I suppose he simply assumed he could modify them but never tried; if he had, it would likely have popped up a message saying he can’t. Or CodeMonkey has Enterprise-level access and is documenting the services from a different perspective (likely unknowingly, since modifying the machine specs is the only Enterprise-level difference for the Services).
Okay, so at this point it’s not customizable. I saw another video in which he could adjust the clock speed; he set it to 2000 MHz, for example, which made me wonder how pricing works. So, to summarize: clock speed is fixed, and pricing is based on the number of CPU cores. Is that correct? Also, isn’t 750 MHz too low for game hosting? I see 750 MHz as the default in the video, and you said it can’t be changed.
Maybe he was using an earlier version of the services. I last checked the machine specs less than two months ago, and they were as follows:
Google Cloud General Purpose (according to machine names)
GCP-N2 @ 2.8 GHz | 8 GB RAM
CPU: Intel Cascade Lake (Xeon)
Note that this is a virtual machine with 2 cores; the underlying Xeon of course has many more cores than that.
The clock speed is also documented somewhere in the manual.