I’m working on a workflow automation that involves opening some custom editor windows from plugins and clicking a few buttons. Unfortunately the plugin exposes no public API, so the preferred way would be to trigger those button clicks from a script somehow.
Even though the Event class now has a PopEvent method (used internally by the new UI system) to remove an event from the event queue, there is no way to artificially add events to the queue, so there’s no solution inside Unity itself. Your only option in that case would be to generate OS input events to simulate mouse and/or keyboard input. Of course, depending on the exact inner workings of the third-party editor window, you may be able to use reflection to manipulate its internal state or call internal methods directly. However, when the relevant code sits directly inside OnGUI, inside the button’s if statement, there isn’t much you can do besides going through the OS event route.
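To illustrate the reflection route: a minimal sketch of poking at an editor window’s internals. All the plugin type and member names here (`SomePlugin.CustomEditorWindow`, `ApplySettings`, `m_SomeToggle`) are made up — you’d have to find the real ones by decompiling the plugin assembly, e.g. with ILSpy or dnSpy.

```csharp
using System;
using System.Reflection;
using UnityEditor;

public static class PluginAutomation
{
    [MenuItem("Tools/Trigger Plugin Action")]
    static void TriggerPluginAction()
    {
        // Hypothetical assembly-qualified type name of the plugin's window.
        Type windowType = Type.GetType("SomePlugin.CustomEditorWindow, SomePluginAssembly");
        if (windowType == null)
            return;

        // Get (or open) the window instance.
        EditorWindow window = EditorWindow.GetWindow(windowType);

        // Invoke a private/internal method that the button click would call.
        // Only works if the button's logic is factored out into a method.
        MethodInfo method = windowType.GetMethod("ApplySettings",
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic);
        method?.Invoke(window, null);

        // Alternatively, mutate private state and trigger a repaint.
        FieldInfo field = windowType.GetField("m_SomeToggle",
            BindingFlags.Instance | BindingFlags.NonPublic);
        field?.SetValue(window, true);
        window.Repaint();
    }
}
```

Note that this only helps when the button’s logic lives in a callable method or reachable state; as mentioned above, code inlined in the OnGUI if block is out of reflection’s reach.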
Of course it depends on what exactly you want to do, but simulating OS events is quite easy to automate with AutoHotkey. In any case you would need either the absolute or relative positions of the buttons you want to interact with; AutoHotkey also has methods that let you “search” for images on the screen. However, this would get complicated quite fast ^^.
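If you’d rather stay in C# instead of AutoHotkey, the same kind of OS-level click can be generated on Windows via P/Invoke into user32.dll. This is a Windows-only sketch; you would still need to determine the button’s screen coordinates yourself, and clicking into the Unity editor this way is inherently fragile:

```csharp
using System;
using System.Runtime.InteropServices;

static class MouseSimulator
{
    [DllImport("user32.dll")]
    static extern bool SetCursorPos(int x, int y);

    [DllImport("user32.dll")]
    static extern void mouse_event(uint dwFlags, uint dx, uint dy,
        uint dwData, UIntPtr dwExtraInfo);

    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP   = 0x0004;

    // Move the OS cursor to absolute screen coordinates and perform a left click.
    public static void ClickAt(int x, int y)
    {
        SetCursorPos(x, y);
        mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, UIntPtr.Zero);
        mouse_event(MOUSEEVENTF_LEFTUP,   0, 0, 0, UIntPtr.Zero);
    }
}
```

`mouse_event` is deprecated in favor of `SendInput`, but it is simpler to call and still works for this kind of one-off automation.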
Thanks for the quick reply. Yeah, I also had the idea of simulating OS mouse-click events, but that seems messier than I’d like. For now I’m going with an extracted, stripped-down version of the original code, using reflection where necessary, but I’m concerned about its reliability and maintainability.