Levels should be able to be treated like clay – while still supporting decent hard-surface operations too.
This means free-form and flexible tools.
Boolean-style CSG operations are mandatory nowadays, along with at least some semblance of a voxel-based environment-design solution – but also LOD and imposter generation options. Even the simplest geometry often needs to be displayable in an open-world setting, so creating LODs for it is a necessity. Vertex-by-vertex editing is mostly outdated for the kind of pipelines modern games need.
Sometimes, I feel Unity’s tool design pipeline – especially for even moderately open worlds – is just completely out of touch.
Having used Unity’s recently touted “workflow” for “optimization” in “open world games” (such as the “optimization” in Alba, for example), I wasn’t surprised that project nearly failed because of its “iterative” approach to geometry generation!
Speaking as a game artist and researcher familiar with many procedural tools and workflows, the best way to build geometry, in my experience, is with brush, sculpt, and chain/placement tools (all of which Blender is great at – and which I abhor in Unity, since they often require addons/assets).
As an added side-effect, these “better” tools (sculpting, placement, chaining) are also applicable in VR – ironically, a result of their more intuitive usage, which leads to a generally better UX overall.
Additionally, users should be able to “program geometry” using tools like Blender’s geometry nodes. If I’m being honest, Unity might do a lot better by simply porting/mirroring Blender’s geometry generation and other tools’ results into Unity itself (even adding its own operation handles for some of them), similar to how Houdini Engine is used in Unity/Unreal. That would give Unity users lots of great tools and a truly decent pipeline option, with relatively little effort on Unity’s part – mostly grabbing a resulting mesh, setting boundaries/placement options, and perhaps adding some UI/UX features via an addon in Blender. Unity users would do only basic operations natively in Unity itself (i.e. vert/edge/face manipulation, CSG operations); the integration’s main goal would be sending and receiving assets/geometry to (and from) Blender automatically, making Unity assets part of Blender’s Asset Library while in use in Unity, or performing CSG operations etc. in Unity and sending those small updates back to Blender. The heavy geometry processing stays in Blender – keeping Unity out of the business of developing more crappy art tools.
This approach, I think, would yield a much more future-proof tool (and require less buy-in from higher-ups), and since Blender is constantly evolving and becoming more user-friendly (and is also free), many of its tools, as they improve, simply make Unity better as a result. Networking can be used to talk to/from Blender and synchronize its results (and to push results back to Blender to keep things mirrored).
I’ve seen this done in Unity already – so there’s already a bit of a base to work from. It would make greyboxing levels much more intuitive and freeform (especially if users can rely on Blender to provide a VR editor experience). Generating Unity instances from points in Blender would work very similarly to how Houdini Engine already works for Unity. And the best part – it’s free.
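To make the networking/synchronization idea above a little more concrete, here is a minimal Python sketch of a wire format that could carry mesh updates between a Blender addon and a Unity editor plugin. Everything here is hypothetical – the message schema, field names, and framing are invented for illustration, not an existing protocol.

```python
import json
import struct

# Hypothetical message schema: a mesh update identified by an asset GUID,
# carrying flat vertex coordinates and triangle indices.
def encode_mesh_update(guid, vertices, triangles):
    """Frame a mesh-update message as a 4-byte length prefix + JSON body."""
    body = json.dumps({
        "type": "mesh_update",
        "guid": guid,
        "vertices": vertices,   # [x0, y0, z0, x1, y1, z1, ...]
        "triangles": triangles, # [i0, i1, i2, ...]
    }).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_mesh_update(data):
    """Inverse of encode_mesh_update: strip the length prefix, parse JSON."""
    (length,) = struct.unpack(">I", data[:4])
    return json.loads(data[4:4 + length].decode("utf-8"))

# A single quad as Blender's side might send it; Unity's side would rebuild
# a Mesh from the same flat arrays.
msg = encode_mesh_update(
    "a1b2c3",
    vertices=[0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0],
    triangles=[0, 1, 2, 0, 2, 3],
)
decoded = decode_mesh_update(msg)
print(decoded["guid"], len(decoded["vertices"]) // 3)  # -> a1b2c3 4
```

In practice the Blender side would be an addon pushing these frames over a local TCP socket whenever a mesh is edited, while Unity listens and patches the corresponding asset – the same shape of loop Houdini Engine already uses.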
Unity + Blender (Pros/Cons)
Pros:
- Networking interfaces can exchange data and changes in realtime
- An existing base for geometry mirroring between Unity and Blender already exists in Unity’s GitHub repo
- Using Blender would be similar to using Houdini Engine in Unity (a fully working example already exists)
- Access to procedural geometry and sculpting tools out of the box
- This access includes CSG (and potentially material) workflows
- Blender is a free/open-source tool that is always improving, meaning its increasing value adds value to Unity
- Unity is only responsible for the in-editor “handles” that chain and place any meshes generated
- Unity can provide very basic low-level mesh tools (vertex/edge/face editing, CSG boolean operations) and mesh/object/instance placement tools
- More complex mesh operations can be passed off to Blender and returned for visualization without the time-consuming export process that is currently required
- Asset Libraries in Unity can contribute to – and benefit from – Blender’s Asset Libraries and art ecosystem
- Purchased assets/tools from the Asset Store can be used in Blender as well: they can be added to its Asset Library as a new Collection for use in a scene or other projects, and come back to Unity as a simple “instance” (since each already exists as a GameObject/prefab in Unity). This lets a user create complex levels with repeated elements – without the struggle of importing/exporting mesh variations just because something is a “mesh” in Blender but a “Prefab” in Unity. As long as it is in an Asset Library, it is an “instance” of a prefab, existing in both applications.
- Greyboxing is kind of a “duh” operation in Blender, and dumb meshes can later be swapped out for fancier ones with materials/animations/etc., since Unity is aware of them and the user can use a Blender addon to send them over to Unity in a special way
- Hard-surface modeling could be combined with sculpted modeling quickly and easily
- It would be possible to partner with the Blender Institute and bring more users to Unity, as more “official” support for Unity as the replacement for Blender’s game engine might come with an official partnership (or Unity could simply sit pretty for a while and even gain support from the official Blender development team for its integration/mirroring of Blender’s tools/API, if it ever desires to)
- Although Blender is non-native, it could FEEL like Blender is being used natively by making the UI/UX mostly transparent, in a similar way to Houdini Engine – i.e. by providing a handle-based UI for certain UX (splines, curves, or other visual modifiers), with the UI/UX defined in a Blender addon that assists in value-tweaks to certain nodes and parameters as they appear in Unity (similar to an HDA in Houdini – a digital asset). Blender does the hard work while Unity appears to do it natively – an approach that slots naturally into Blender’s procedural “everything nodes” pipeline
- Unity gets to decide how “intuitive” its art tools are by leveraging Geometry Nodes (from the “Everything Nodes” project) and putting an interface on them – probably by developing its own addon that signals to Unity the kind of UI or 3D widgets particular named or custom nodes require for their UX in Unity (such as pseudo-CSG operations, or even dungeon generation)
- Sculpting, brush, and chain/placement tools are VR-ready
- Blender is aiming to integrate VR into its tool pipeline, meaning Unity may not have to be the pioneer on that path and integrate its editor itself
- Artists, animators, and game designers are already using Blender for their development/design work
Cons:
- Unity has no official support for Python
- Unity wants barebones greyboxing tools and nothing more
- Unity doesn’t want to use or play nice with third-party tools
- Unity has to put effort into a tool that is not its own, making that tool non-native
- Blender would be the boss over which features stay or go in the future
- Unity would (potentially) have to make an addon or two in Blender to control its output to Unity, in order to maintain feature-parity with something like an HDA digital asset from Houdini (for custom procedural tools and pipelines)
- Keeps Unity out of the “business” of developing crappy art tools
- Procedural tools wouldn’t be as good as Houdini’s – but for free? It’s definitely a start!
- Doesn’t fit with Unity’s (currently vague) greyboxing direction – but provides all it will ever need (and more!) in a single-tool solution (such as random greybox dungeon generation), so…
- It’s me suggesting this, so… it’s probably “hard”.
- Some designer @ might need me to guide them.
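To make the “addon or two in Blender” con more concrete, a sketch of the kind of parameter manifest such an addon might emit so Unity knows which handles or inspector widgets to draw for a node group’s exposed inputs – much like an HDA’s parameter interface. All names here (the node group, parameters, and widget types) are invented for illustration.

```python
import json

# Hypothetical manifest describing the exposed inputs of a Blender
# geometry-node group, so the Unity side knows which 3D widget or
# inspector control to draw for each parameter (HDA-style).
def build_param_manifest(node_group_name, params):
    """params: list of (name, widget_type, default_value) tuples."""
    return json.dumps({
        "node_group": node_group_name,
        "params": [
            {"name": n, "widget": w, "default": d}
            for n, w, d in params
        ],
    })

# Example: a dungeon-generator node group exposing three tweakable inputs.
manifest = build_param_manifest("DungeonGenerator", [
    ("room_count", "int_slider", 8),
    ("corridor_width", "float_slider", 2.0),
    ("entrance", "position_handle", [0.0, 0.0, 0.0]),
])
print(json.loads(manifest)["node_group"])  # -> DungeonGenerator
```

Unity would read a manifest like this and draw an int slider, a float slider, and a draggable 3D handle, then send changed values back to Blender for re-evaluation – the same division of labor Houdini Engine establishes with HDAs.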