Hi guys,
I have an OS X app (TacticalPad Pro) that was deployed for Mac using the highest quality settings. On a MacBook Pro with two video cards (integrated and NVIDIA), it looks perfect, with the expected AA, texture compression, etc.
But I have users whose MacBook Pros have only Intel Iris graphics, and for them it seems to run with no AA and heavy texture compression.
I've attached screenshots of the same build on these two setups, along with the Mac About screen showing each configuration.
Any idea what the cause might be? The Unity player, or some OS X setting?
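One way to narrow this down (a sketch of mine, not from the thread; the component name is hypothetical, but the `QualitySettings` and `SystemInfo` APIs are standard Unity) would be to log what quality level the player actually selected on each machine:

```csharp
using UnityEngine;

// Hypothetical diagnostic component: attach to any GameObject in the first
// scene and compare the log output between the NVIDIA and Iris machines.
public class QualityProbe : MonoBehaviour
{
    void Start()
    {
        int level = QualitySettings.GetQualityLevel();
        Debug.Log("GPU: " + SystemInfo.graphicsDeviceName);
        Debug.Log("Quality level: " + QualitySettings.names[level]);
        Debug.Log("Anti-aliasing: " + QualitySettings.antiAliasing + "x");
        Debug.Log("Texture limit (0 = full res): " + QualitySettings.masterTextureLimit);
    }
}
```

If the two machines report different quality levels, the Iris Mac may simply be falling back to a lower preset rather than rendering the same settings badly.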
Thanks for the support and any suggestion.
Maybe file a bug report for them to investigate it.
This may be the way to go. If I don't hear from someone on the Unity team today, I will do that =). Thanks @MrEsquire
It's probably the fact that the resolutions are different and you are using point filtering? If so, this will happen in any engine and on any hardware. If you are not using point filtering, then it is likely a bug. When talking about issues (especially on Macs), the Unity version should be mentioned.
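For reference (a sketch, not from the thread; the component name and Inspector field are mine), checking for and overriding point filtering at runtime would look something like this:

```csharp
using UnityEngine;

// Hypothetical helper: point filtering picks the single nearest texel, so it
// looks blocky whenever the on-screen size differs from the texture size
// (e.g. Retina vs. non-Retina displays).
public class FilterCheck : MonoBehaviour
{
    public Texture2D texture; // assign in the Inspector

    void Start()
    {
        if (texture.filterMode == FilterMode.Point)
        {
            Debug.LogWarning(texture.name + " was point filtered; switching to trilinear.");
            texture.filterMode = FilterMode.Trilinear;
        }
    }
}
```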
In principle, both are running Retina with no scaling. I've asked our user to send me their Display settings screen. As soon as I have it, I will post it here. Thanks!
Yeah, but which version of Unity did you use to make your game?
Are you using point-filtered textures? You would know this without asking a user.
No. Filter Mode is Trilinear.
Hi @hippocoder, absolutely the same version (from the App Store), both Macs on the default scale. Our filter is always trilinear. Attached are pics from the Intel Iris machine. If you have MacBook Pros with both Iris and NVIDIA, I can send a promo code from the App Store so you could at least run it. Here I only have NVIDIA, unfortunately.
Thanks!
Set the texture's compression format to Truecolor, perhaps?
And they are textures, correct? If they are sprites, make sure to turn off “Generate Mip Maps.”
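Both suggestions above can be applied through a texture's import settings. As a sketch (the menu path and class name are mine; `TextureImporterFormat.AutomaticTruecolor` is the Unity 5-era API and was deprecated in later versions), an editor script doing this for the selected asset might look like:

```csharp
using UnityEditor;
using UnityEngine;

// Hypothetical editor helper: sets the selected texture to uncompressed
// Truecolor and disables mip map generation, then reimports it.
public static class TextureFixup
{
    [MenuItem("Tools/Set Truecolor And Disable Mip Maps")]
    static void Apply()
    {
        string path = AssetDatabase.GetAssetPath(Selection.activeObject);
        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null) return; // selection is not a texture asset

        importer.textureFormat = TextureImporterFormat.AutomaticTruecolor; // no compression
        importer.mipmapEnabled = false; // sprites drawn 1:1 don't need mip maps
        AssetDatabase.ImportAsset(path); // reimport with the new settings
    }
}
```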