Muse in the Editor?

I find it odd that the people at Unity created a web-based version of ChatGPT rather than integrating it directly as an editor window. I think Muse would be orders of magnitude better inside the Unity editor, since there it can gain context from a project’s files.

For example, feeding the LLM the context of the code base and console so it can help decipher error messages, suggest changes to a codebase, refactor, etc…

Though, the video trailer shows an editor version that looks promising.

12 Likes

Hi @DangerDano!
Thank you for your feedback! We are reading everything, but might not be able to answer straight away.

We are looking into integrating it into the Editor. Would you mind providing some more insight into why and where it would be beneficial?
You also mention contextual awareness of the code base and console - could you expand on this, please?

Thank you!

1 Like

For a chat interface, a simple editor window where we can message back and forth without needing to leave Unity to find the answers.

It would be more beneficial than a web-based approach because, in the editor, the LLM can use context from the project. For example, it can have access to the scripts, resources, and the Assets folder, which it can use to answer our questions more precisely.

Let’s say I have an error in a script and I don’t know how to fix it. Currently, I have to copy the erroring script and the message from the console, take them into the web app, tell Muse/ChatGPT my intent, and then paste the code and error messages.

If there were a chat interface in the editor, this step could be eliminated, because the model would have access to the console log and the Scripts folder. With the log, the model has the context of the problem and can follow it to the line of code where the error occurs. From there it can rewrite the script while explaining the fix.

Muse in the editor will already know what version of Unity we’re using.
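To make the console-log idea concrete, here is a hypothetical sketch (the class and method names are my own invention, not any Muse API) of how an editor extension can already capture errors and bundle them with the Unity version into prompt context:

```csharp
using System.Collections.Generic;
using UnityEditor;
using UnityEngine;

// Hypothetical sketch: collect recent console errors so an in-editor
// assistant could attach them to a prompt automatically.
public static class ConsoleContextCollector
{
    static readonly List<string> recentErrors = new List<string>();

    [InitializeOnLoadMethod]
    static void Hook()
    {
        Application.logMessageReceived += (condition, stackTrace, type) =>
        {
            if (type == LogType.Error || type == LogType.Exception)
                recentErrors.Add(condition + "\n" + stackTrace);
        };
    }

    // Build the context block a chat window could prepend to a question.
    public static string BuildPromptContext()
    {
        return "Unity version: " + Application.unityVersion + "\n"
             + "Recent errors:\n" + string.Join("\n---\n", recentErrors);
    }
}
```

A chat window could then prepend `BuildPromptContext()` to whatever the user types, so the model sees the errors and the Unity version without any copy-pasting.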

An editor version would help Muse differentiate itself from web-based interfaces like ChatGPT, Claude, and so on by being integrated with Unity itself, allowing for more potential applications like function calling, where the agent can invoke Unity API functions automatically.

For example, it could:

  - Point to and find resources in the project that support its explanations
  - Suggest code improvements, then write them to C# script files
  - Write common code like object pooling, random placement of objects, etc…
  - Refactor code
  - Help with profiling
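As a sketch of what function calling into Unity could look like, here is a hypothetical tool (all names invented for illustration) that an agent might invoke to write a generated C# script into the project and trigger a compile:

```csharp
using System.IO;
using UnityEditor;
using UnityEngine;

// Hypothetical sketch: how a function-calling agent might write a
// generated C# script into the project and let Unity compile it.
public static class GeneratedScriptWriter
{
    public static void WriteScript(string className, string sourceCode)
    {
        string folder = "Assets/Scripts/Generated";
        if (!Directory.Exists(folder))
            Directory.CreateDirectory(folder);

        string path = Path.Combine(folder, className + ".cs");
        File.WriteAllText(path, sourceCode);

        // Trigger an import and compile pass so the new script becomes usable.
        AssetDatabase.Refresh();
        Debug.Log("Wrote generated script to " + path);
    }
}
```

The interesting part is the plumbing Unity already provides: `AssetDatabase.Refresh()` makes the editor pick up and compile the new file, so the agent’s output immediately becomes a real component.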

11 Likes

Being in the editor could give it direct access to all of the following data, which could be useful for various reasons:

  1. The Player Log and the Editor Log
    These two logs would let Muse examine our games’ log output, including exceptions, in real time. The surrounding log context could hold clues that help isolate the cause of an exception.

  2. The C# code as well as any debug information available from DLLs
    Since the log output contains exceptions and function calls, those could be mapped directly to the C# and DLL code, allowing Muse to understand the full context of why an exception happened.

  3. With access to the structure of a scene hierarchy in real time, combined with the code, Muse could answer questions about why the hierarchy is in the state it is in.

  4. Muse could be connected to a debugger and inspect the debugger’s state.

  5. Once Muse understands how to correctly guide us on the creation of shaders, it should also be able to interactively describe how to make them and what is wrong with them. The current Muse answers my questions incorrectly, and does so in the same way that ChatGPT does.
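On points 1 and 2, editor tooling can already read the Editor log today. Here is a hypothetical sketch, assuming `Application.consoleLogPath` resolves to the Editor log while running in the Editor:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical sketch: read the tail of the Editor log so an assistant
// could scan recent exceptions without the user copy-pasting them.
public static class EditorLogReader
{
    public static string ReadLogTail(int maxChars = 8000)
    {
        // In the Editor this path points at the Editor log file.
        string path = Application.consoleLogPath;

        // The Editor keeps the file open, so request shared read access.
        using (var stream = new FileStream(path, FileMode.Open,
                                           FileAccess.Read, FileShare.ReadWrite))
        using (var reader = new StreamReader(stream))
        {
            string text = reader.ReadToEnd();
            return text.Length <= maxChars
                ? text
                : text.Substring(text.Length - maxChars);
        }
    }
}
```

Feeding just the tail keeps the prompt small while still covering the most recent exceptions and stack traces.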

5 Likes

Some more thoughts as well:

  1. The ability to shorthand a script directly into the editor. Meaning, a suggested method is placed in the right script, or a generated script is created with the right namespace, in the right folder.
  2. Unit test / integration test generation. This is huge because all other LLMs, and even Copilot, will not understand the right context to generate an appropriate Unity test for a given component/behavior.
  3. The ability to apply best practices to a scene. Suppose you construct a scene with various components: they could be arranged in a more performant manner, or there may be recommendations such as converting a public field into a [SerializeField] private field.
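The [SerializeField] suggestion in point 3 could even be mechanized. Here is a hypothetical sketch (the names are my own) that scans the open scene for public component fields via reflection and flags each one:

```csharp
using System.Reflection;
using UnityEngine;

// Hypothetical sketch: one "best practice" check an assistant could run.
// It flags public instance fields declared on components in the scene,
// which could usually be [SerializeField] private fields instead.
public static class SceneBestPracticeScanner
{
    public static void ScanScene()
    {
        foreach (var behaviour in Object.FindObjectsOfType<MonoBehaviour>())
        {
            var fields = behaviour.GetType().GetFields(
                BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly);

            foreach (var field in fields)
            {
                // Passing the behaviour as context lets a click on the
                // warning highlight the offending object in the hierarchy.
                Debug.LogWarning(
                    behaviour.GetType().Name + "." + field.Name +
                    " is public; consider a [SerializeField] private field instead.",
                    behaviour);
            }
        }
    }
}
```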
3 Likes

Hey all, these are all good and valid ideas. Rest assured, we are working on integrating Muse Chat with the Editor, and of course also on making the Muse platform aware of and capable of using a Unity project as context, so it can make specific suggestions and share ideas in a context-sensitive way.

The benefit of shipping Muse Chat as a web application is that it is immediately accessible to everyone, even Unity novices who just want to learn about Unity. In addition, the use cases for Muse Chat go beyond the Unity Editor: it also helps users with many other tools and services, and with general questions they might have about Unity.

Let me know if this makes sense 🙂

4 Likes

+1 for moving this into the editor. If we are thinking about long-term development, with more features coming along the way, at some point it has to touch code or assets (at least read them for context). Refactoring suggestions and best-practice recommendations based on your current project would be super nice to have in the future.

1 Like

Here is a quick example of using the OpenAI API from the Inspector. The API call gives ChatGPT the context of the error and code to help solve the problem at hand without going back and forth with a web app.
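For anyone curious what such a call looks like, here is a minimal hypothetical sketch along those lines. The endpoint and JSON shape follow OpenAI’s public chat completions API, but the model name is a placeholder; check the current OpenAI docs before relying on it.

```csharp
using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical sketch: POST an error message and code snippet to the
// OpenAI chat completions endpoint from inside the Editor.
public static class OpenAIChatRequest
{
    const string Endpoint = "https://api.openai.com/v1/chat/completions";

    public static IEnumerator Ask(string apiKey, string errorAndCode,
                                  System.Action<string> onDone)
    {
        string prompt = "Explain this Unity error and suggest a fix:\n" + errorAndCode;
        string json = "{\"model\":\"gpt-4\",\"messages\":[{\"role\":\"user\",\"content\":\""
                      + Escape(prompt) + "\"}]}";

        using (var request = new UnityWebRequest(Endpoint, "POST"))
        {
            request.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(json));
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Content-Type", "application/json");
            request.SetRequestHeader("Authorization", "Bearer " + apiKey);
            yield return request.SendWebRequest();
            onDone(request.downloadHandler.text); // raw JSON response
        }
    }

    // Minimal JSON string escaping for the prompt text.
    static string Escape(string s) => s
        .Replace("\\", "\\\\").Replace("\"", "\\\"")
        .Replace("\n", "\\n").Replace("\r", "\\r").Replace("\t", "\\t");
}
```

Note that driving a coroutine outside play mode needs something like the Editor Coroutines package, and the API key should come from a local settings file rather than source control.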

6 Likes

@martina_johannesson - We recently purchased this plugin, which provides a really seamless in-editor experience. The Generate button sits under the “Add Component” button in the Inspector and pops up an editor window for prompts. This is really useful. Can we have Muse integrated in the same way, instead of copy-pasting queries into another chat experience?

Also, an in-IDE experience like Codeium while coding will be really useful!

3 Likes

+2 on getting Muse into the editor as soon as possible. From a development standpoint (which it seems everyone here is taking), in-editor usage with actual project-specific awareness seems to be the holy grail.

2 Likes

By contrast I think the Editor is the worst place for this, and a step backwards. The best place is in the IDE - i.e. MSVC/VSC/JetBrains(Rider)/etc.

Development happens primarily in an IDE, and the main IDEs already have full access to the Unity project (and are able to e.g. deep-link into the Scene, and find references both in code and in scene through the same shared search box). If it was primarily in the Editor I … would stop using it, because ChatGPT / Copilot are already integrated into the IDE, which is vastly more useful.

I think it would be a useful feature for it to have access to your project. For example, if you had an error in the console, you could click an option to ask Muse about it, and it would have more specific data about why the error occurred, rather than giving a generalized answer.

1 Like

I think you’re speaking only of development with scripts. Imo, although already useful, GitHub Copilot still has some way to go in its IDE integration. It is useful because it removes the back-and-forth of having to give a web app the context it needs to understand your problem.

And going beyond the current state of GPT, OpenAI and others like it will become ‘multimodal’, more than just text predictors. As you can see in the videos above, images can be generated, which could speed up the texturing and prototyping stages.
Though, image generation would be quite useless in an IDE.

Muse should be the GOAT when it comes to Unity-specific tools like profiling, DOTS, VFX, Shader Graphs, Editor windows, UXML, USS, etc… This would shorten the learning curve and let developers get creating faster.

2 Likes