I wanted to ask more knowledgeable people: what might the impact of this whole Intel chip flaw and/or exploit be on our day-to-day game development? I have read that the major OSes are currently patching the flaw, but that the patches will slow down Intel processors by up to 30%. The slowdowns apparently depend on the type of tasks you run, and I have read that code compiling will take a large performance hit. So does anyone know what the impact might end up being on development?
Not sure. I don’t see any articles mentioning antivirus or firewall software being effective against this. From what I’ve read, it can be as simple as some JavaScript running on a web page that grabs the contents of your system’s memory.
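For what it’s worth, the published JavaScript proof-of-concept leaned on very fine-grained timing to pull off the cache side channel, which is why the browser vendors coarsened performance.now() (and pulled SharedArrayBuffer) as part of their mitigations. If you’re curious whether your own browser has already been patched along those lines, here’s a rough TypeScript sketch (drop the type annotations if you paste it straight into the dev console; it’s only an illustration, nothing official) that reports the smallest timer step your browser currently hands out:

```
// Rough sketch, browser assumed: the public Spectre proof-of-concept
// relied on very fine-grained timing, so one visible mitigation was
// coarsening performance.now(). This reports the smallest timer step
// the browser will currently hand out.
function smallestTimerStep(samples: number = 100000): number {
  let smallest = Infinity;
  let last = performance.now();
  for (let i = 0; i < samples; i++) {
    const now = performance.now();
    if (now > last && now - last < smallest) {
      smallest = now - last;
    }
    last = now;
  }
  return smallest; // in milliseconds
}

console.log(`Smallest observable performance.now() step: ${smallestTimerStep()} ms`);
```

On a patched browser you should see something far coarser than the microsecond-level resolution the proof-of-concept needed.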
I posted up the same query on the Unity Reddit as well. So far this is a promising comment: “I’m running Windows 10 and installed the update earlier today. I haven’t noticed any decrease in performance and have about ~600 script files, for reference.”
Then again, I have no idea about this person’s processor or system specs. It seems that newer-generation processors will feel less of an impact, though I also have no idea what qualifies as “newer gen”. I’m guessing Skylake or newer?
Seems like a conspiracy theory… they ship faulty chips, a “third party” notifies everyone of the flaw and the danger, creating fear and motivation, and the solution is… buy a new computer… well… so long as it’s got Intel inside. Hmm.
With a low-level issue like this, and the fix expected to (basically) impact processing time by 30%, it’s kind of just something we have to deal with. I don’t think it really impacts us specifically as game devs as much as it does other businesses that rely heavily on processor speed. I mean, we’re more or less just subject to whatever our target hardware is, so we’ll just have to accommodate the speed adjustment in the future.
That being said, a lot of businesses that still rely on their existing processing speeds may not be affected at all if they do their work on private networks, like some render farms and such. The main target seems to be cloud services, where you can rent access to a server, pull a memory dump, and move on to the next server. That, while terrible, does seem at least somewhat niche.
It was already fixed in macOS in the version from a month ago, and didn’t slow anything down (to any degree that’s actually noticeable for users). Put down the conspiracy theories and back away slowly.
AMD must be partying hard at the moment. Not gonna bother upgrading until my work needs it. There’s a lot of drama and so on about this, but ultimately it’s only going to screw you if you visit dodgy websites, download dodgy cracks, etc.
I’m not really one for any of the conspiracy theories out there. Because this seems like a highly technical issue at the processor level, and I keep reading about processor slowdowns of up to 30%, my main concern is what the impact will be on running Unity itself, code compiling, baking lightmaps, etc. (anything processor-intensive). Again, I lack understanding of exactly which aspect of CPU performance this impacts. I don’t dev on a Mac, nor do I intend to, so I’m mostly curious how this impacts the PC development environment using Unity.
I personally think the risk of impact and the performance-degradation numbers are being overblown. These flaws (Spectre and Meltdown) will be very difficult for hackers to actually take advantage of. OS-level software mitigation of these flaws will most likely NOT yield a 30% drop in performance. My guess is that we will see a 5-10% drop in performance for some use cases and possibly even a small performance gain in other use cases. And remember that most desktop and mobile computing involves the computer waiting for the user, the file system, or network resources. Modern computers spend a lot of time in energy-saving states just waiting for other things to happen anyway.
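To add a bit of detail to that: the Meltdown mitigation (kernel page table isolation) mainly makes user-to-kernel transitions more expensive, so syscall-heavy work (a big compile touching thousands of small files) should feel it much more than pure number crunching (which is most of what a lightmap bake is doing). If you want a crude feel for which bucket your own tools fall into, rather than trusting the headline 30% figure, a little timing sketch like this works (Node.js assumed; it’s only an illustration, not a real benchmark):

```
// Crude sketch: contrast a syscall-heavy loop with a pure compute loop.
// KPTI-style Meltdown mitigations mostly tax kernel transitions, so the
// first number is where any slowdown should show up.
import { statSync } from "fs";
import { performance } from "perf_hooks";

function timeIt(label: string, iterations: number, fn: () => void): void {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(1)} ms`);
}

const N = 100_000;

// Each statSync() call is a round trip into the kernel.
timeIt("syscall-heavy (fs.statSync)", N, () => statSync("."));

// Pure user-space math; mitigations should barely move this number.
let acc = 0;
timeIt("compute-only (Math.sqrt loop)", N, () => {
  for (let i = 0; i < 100; i++) acc += Math.sqrt(i);
});
console.log(`(checksum so the loop isn't optimized away: ${acc.toFixed(0)})`);
```

Run it before and after installing the OS patch and compare the first number; the second one shouldn’t move much either way.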