Issue with Application.LoadLevel and DontDestroyOnLoad

I have a function which will load a level with:

Application.LoadLevel("Scene");
Instantiate(Object1, Vector3(0,0,0), Quaternion.identity);
Instantiate(Object2, Vector3(0,0,0), Quaternion.identity);

I want these objects to appear after the level is loaded, but they are destroyed by the load unless I use DontDestroyOnLoad. I do want them to be destroyed on later loads, though. I just expected them to appear in the newly loaded level, since the object this script is attached to isn't destroyed on load.
I could just place both objects in the scene that gets loaded, but I want to be able to use that scene for other things without the objects.

How could I prevent these objects from being deleted?

If you want them deleted then don't do anything. If you do not want them deleted, then use DontDestroyOnLoad. What's the problem with that?

Put the two Instantiate calls in another script's Awake function: when the next scene is loaded the old copies will be destroyed, and when the new scene finishes loading they'll be re-instantiated. Something like an empty GameObject with a "scene setup" script attached that contains those two lines, as in the sketch below.
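
Here is a minimal sketch of that scene-setup approach, assuming Object1 and Object2 are prefab references assigned in the Inspector on an empty GameObject placed in the loaded scene (the script and field names are just placeholders):

using UnityEngine;

public class SceneSetup : MonoBehaviour
{
    // Prefabs to spawn when this scene starts; assumed to be assigned in the Inspector.
    public GameObject Object1;
    public GameObject Object2;

    void Awake()
    {
        // Runs each time the scene containing this object is loaded, so the
        // objects are recreated after every load and cleaned up on the next one.
        Instantiate(Object1, Vector3.zero, Quaternion.identity);
        Instantiate(Object2, Vector3.zero, Quaternion.identity);
    }
}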

I guess this means Application.LoadLevel() waits until the current frame is finished before tearing down the current scene and loading the new one, so the Instantiate calls that come right after it still run in the old scene. Thanks for testing that out for us!
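
If you'd rather keep driving the spawns from the persistent object itself, a rough sketch of deferring them until the level has actually loaded, under the assumption that the legacy OnLevelWasLoaded message fires on the surviving object once the new scene is in place (PersistentLoader and LoadScene are just placeholder names):

using UnityEngine;

public class PersistentLoader : MonoBehaviour
{
    // Prefab references, assumed to be assigned in the Inspector.
    public GameObject Object1;
    public GameObject Object2;

    void Awake()
    {
        // Keep this loader alive across level loads.
        DontDestroyOnLoad(gameObject);
    }

    public void LoadScene()
    {
        // The actual scene switch only happens after the current frame finishes.
        Application.LoadLevel("Scene");
    }

    void OnLevelWasLoaded(int level)
    {
        // By now the new scene exists, so these instances live in it and
        // will be destroyed normally on the next load.
        Instantiate(Object1, Vector3.zero, Quaternion.identity);
        Instantiate(Object2, Vector3.zero, Quaternion.identity);
    }
}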