Isn’t 3.4 backwards compatible with version 3.2? I thought it was, but apparently even after I remove or fix all the deprecated code and shaders, Unity still won’t let me build a standalone (I have a Pro license).

Why would I want to move my project back to 3.2? Because the same project that runs at 60 fps in 3.4 runs at 400+ fps in 3.2.
Very weird issue…
I think that every time a project gets updated to a newer version of Unity, there is a warning that it won’t be backwards compatible (I’m not certain, because accepting it becomes automatic).
Accepting that is normal practice in software, since new versions add new features and new save formats.
No, not every time; only if the project serialization format changes.
That was the case for 3.1 → 3.2 and for 3.2 → 3.3.
3.3 ↔ 3.4 is no problem.
3.2 ↔ 3.4 is a bit of a problem, as the texture handling got updated. But generally you shouldn’t hit many problems at all unless you used something that was added in 3.3+, naturally. In particular, look out for any OnDestroy usage (that was a 3.3 addition, wasn’t it?).
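To illustrate the kind of callback meant here, a minimal sketch in C# (assuming the standard MonoBehaviour.OnDestroy callback; the thread itself is unsure in which version it was added):

```csharp
using UnityEngine;

// Sketch: OnDestroy runs when the object is destroyed or the scene
// unloads. A project relying on it will silently misbehave after a
// downgrade to a Unity version that predates the callback, because
// the engine simply never invokes it.
public class CleanupExample : MonoBehaviour
{
    void OnDestroy()
    {
        Debug.Log("Object destroyed; release resources here.");
    }
}
```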
The code and shader issues I understand. What I don’t understand is why my license suddenly got reduced to lower than an Indie license. The only target I can build for is the webplayer?
Hehe … you got a magical new license, special feature just for you 
Naw, it sounds more like either a hard-disk mess-up (but then 3.4 would show it too), or the license system changed between 3.2 and 3.4, upgraded your license, and now needs a reactivation inside 3.2 to make it work again 
It’s called vsync, and you can turn it off in 3.4, so 3.4 runs at 450 fps too. That’s right.
@hippocoder - Yup, I just figured that out. (Probably at the same time you wrote it!) Vsync used to be a toggle setting and I typically left it off, so this was a bit of a surprise. I’ll stick with 3.4 after all.
@Dreamora - If I create a new project in 3.2 I can build as expected. The new “custom license” only happens when I open a 3.4 project in 3.2. Weird, eh?
hehe, interesting that the vsync fix catches that many off guard (vsync should only be disabled on slow machines, not high-end machines → the lower 3 quality settings should have it disabled, the higher 3 enabled - after all, you don’t want high quality and really, really ugly tearing; that’s not the point of it ;))
and yeah, definitely weird … you must have discovered an undocumented new feature in 3.4 related to platforms or the like, if it’s able to lock the license that way and store that info inside the project …
@Dreamora - Interestingly enough, I’m not seeing tearing on my i7 with a fast Nvidia card with vsync off. Question though: with vsync on, does it just “regulate” the playback speed to 60 fps? It seems to run at 60 fps no matter what I do to the scene, so the apparent huge drop in frame rate really shouldn’t matter, because it’s holding steady at 60 fps, right?
Vsync on doesn’t actually regulate anything. It just tells the driver to present frames at the screen’s native refresh rate, so on a 60 Hz screen it will only present 60 frames per second. That keeps it perfectly in sync with the screen refresh: the image only updates when the screen does too.
And yes, it will keep it there and just “wait the rest of the time”, which leads to huge device-present times in the profiler, because it waits for the driver to release the frame again. Using Application.targetFrameRate alongside it should help here: it won’t wait for the driver to return, but will only send frames at that rate and then go back to the “business logic” (unless 3.4 changed that).
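The Application.targetFrameRate approach mentioned above could look roughly like this (a sketch only; how the cap interacts with vsync has varied between Unity versions, and with vsync enabled the vsync rate generally wins):

```csharp
using UnityEngine;

// Sketch: cap the frame rate in script rather than letting vsync
// block in the driver. With vsync off, the engine sleeps between
// frames to hit the target instead of busy-waiting on the driver.
public class FrameRateCap : MonoBehaviour
{
    void Awake()
    {
        // -1 (the default) means "run as fast as possible".
        Application.targetFrameRate = 60;
    }
}
```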
As for tearing: perhaps it’s just not visible, or the screen refreshes fast enough that you don’t see it, but normally you will see the tear line wandering from top to bottom, or rather, it shows up at one place per screen refresh.
Ok, got it. Thanks very much Marc. 
It’s not actually a scanline on a flat panel though, is it? 
Not technically, naturally, unless you have the flattest tube screen ever 
But it has the same visual look. I can see if I can grab a photo at some point with the camera showing it on my 27" Samsung dual-screen setup, for example.
It might be a bit screen-dependent, but generally the screen image still gets built starting from one corner, and if the image being drawn changes midway through the refresh, you will still get a visible cut along the “line” where the old and new image meet.