Thanks for sharing those videos.
I think mostly this feels fine (and getting tools to be consistent is important), but I have some questions / notes:
Conversion path for existing tools. I tried to get a few people onboarded to the EditorTools API, and many said they do not want to support something (at least right now) that they cannot integrate easily, which is how the EditorTools API turned out for them.
I tried porting some of our own tools over to this, and the issue is that these “tools” are often implemented on top of Components / MonoBehaviours and are thus very context-specific to the selected components. At least from my tests so far it didn’t feel “easy” to move these to the new API, mostly because an “EditorTool” has no understanding of selection per se, and quite a bit of boilerplate seems to be needed just to “forward” a tool command to a specific, currently selected component or set of components.
So the question here would be: are there samples for component-specific tools and how to implement them properly? Or, to put it differently, can I tap into the EditorTools API from a custom inspector directly, instead of having to make an EditorTool that then somehow figures out what is selected and provides the right details?
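To illustrate the boilerplate I mean, here is a rough sketch of the per-component bridging I ended up with in my attempts (MyComponent and DoToolAction are placeholders, and I may well be missing a simpler route):

```csharp
using UnityEditor;
using UnityEditor.EditorTools;
using UnityEngine;

// Placeholder component: the actual tool logic already lives here.
public class MyComponent : MonoBehaviour
{
    public void DoToolAction(Vector3 position) { /* ... */ }
}

// One extra class per component type, just to bridge EditorTool -> component.
[EditorTool("My Component Tool", typeof(MyComponent))]
class MyComponentTool : EditorTool
{
    public override void OnToolGUI(EditorWindow window)
    {
        // 'targets' is whatever the tool system resolved from the selection;
        // I still have to cast back to my component and forward manually.
        foreach (var obj in targets)
        {
            if (obj is MyComponent component)
            {
                EditorGUI.BeginChangeCheck();
                var pos = Handles.PositionHandle(component.transform.position, Quaternion.identity);
                if (EditorGUI.EndChangeCheck())
                    component.DoToolAction(pos);
            }
        }
    }
}
```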
Shortcuts. This seems to be a key piece missing from the videos: shortcuts are important for specific tools and scenarios, and current solutions like the Shortcut Manager only deal with global shortcuts, not context-specific ones; those have been “on the roadmap” for a long time now and nobody seems to be working on them. The 2nd video suggests you’re in fact planning for Shortcut Manager integration, so this would be a big issue I see with that plan.
I do not want to remap a million keyboard keys and make sure they don’t overlap, which is the current situation with tools; I just want shortcuts to be forwarded to whatever tool is active and handled there, and then let them “bubble through” to the rest of the editor if they aren’t handled (just one way to deal with this, there are surely more).
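Just to make it concrete: today the only reliable way I’ve found to get tool-specific keys is to handle raw events inside the tool itself, roughly like this (a sketch of the current workaround, not actual Shortcut Manager integration; the key binding is arbitrary):

```csharp
using UnityEditor;
using UnityEditor.EditorTools;
using UnityEngine;

[EditorTool("Key-Handling Tool")]
class KeyHandlingTool : EditorTool
{
    public override void OnToolGUI(EditorWindow window)
    {
        var evt = Event.current;
        if (evt.type == EventType.KeyDown && evt.keyCode == KeyCode.G)
        {
            // Handle the shortcut while this tool is active...
            Debug.Log("Tool-specific action");
            // ...and consume it so it does not reach the rest of the editor.
            evt.Use();
        }
        // Anything not consumed here "bubbles through" to the scene view
        // and the global shortcut system as usual.
    }
}
```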
Multiple Scene Windows. One big advantage of the tools section being at the top was that screen space was saved and all scene views stayed clean. With the tools section moving into the scene view, what happens when I have multiple scene views open? Do the tools jump around with me? What happens to a “floating” layout that I set up manually by positioning the tools in scene views of different sizes? Or, if each scene view comes with its own configuration, can I copy / paste configs to other scene views?
Contextual Tools. As mentioned in the “Conversion” question already, I believe most tools are very context-specific. How will the screen space be used for those? If I have 10 different MonoBehaviours that come with different sets of tools relevant only to them, it seems I would either waste a ton of screen space by keeping all their tool containers open, or constantly reconfigure my scene view to show just the tools relevant to the selected object. As mentioned before, it would be great to have the option to use the EditorTools API directly from a custom inspector and configure what is shown there, as is done today by hooking into OnSceneGUI, but my understanding is that you want to clean this all up and have everything go through the new API.
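For reference, this is roughly what the current OnSceneGUI approach looks like, where the inspector of the selected component owns the scene handles and the tooling only exists while that object is selected (again using the placeholder MyComponent from above):

```csharp
using UnityEditor;
using UnityEngine;

[CustomEditor(typeof(MyComponent))]
class MyComponentEditor : Editor
{
    // Called by the custom inspector while the component is selected,
    // so the tool UI is inherently scoped to the selection.
    void OnSceneGUI()
    {
        var component = (MyComponent)target;

        EditorGUI.BeginChangeCheck();
        var pos = Handles.PositionHandle(component.transform.position, Quaternion.identity);
        if (EditorGUI.EndChangeCheck())
        {
            Undo.RecordObject(component.transform, "Move MyComponent");
            component.transform.position = pos;
        }
    }
}
```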
Additional Customization and Tools that affect the project, not the scene. There’s more to the UI by now than just the scene view and the tools section. I understand that these might be handled by different teams inside Unity, but as a user I’d also like to be able to configure/dock tools to e.g. the right side of the toolbar (the area where Collab etc. live). This is especially true for tools that are project-related rather than scene-related.
Looking forward to more details on those!