Survey and Discussion - Built-in Mesh and Greyboxing Tools (ProBuilder)

DISCLAIMER: This is an early survey only; no actual feature or timeline is implied.

Hi there! Coming up next on the Scene Tooling roadmap, we intend to enable fast, intuitive “greyboxing” tools by default. That means you can open up a new install of Unity and immediately have access to tools for building and editing simple meshes (and maybe more). As the title implies, this is essentially the next evolution of ProBuilder, and we are very excited to begin work on this!

Similar to the Right-Click Menu post, we want to ensure we get your input early in the process and continually throughout. To begin, we’re gathering general feedback, requests, and info on how you currently use ProBuilder.

If you are interested in greyboxing, level design, in-editor mesh building or similar workflows - this is the time to get your questions, requests, and feedback posted! We’d love to get your input on the short survey HERE, and please use this forum post to discuss. Thanks very much for your time and input!

4 Likes

Filling out the survey, this is very exciting!
Is this related to https://portal.productboard.com/dzcznunfgebtky7ipmhrc22z/c/1839-native-block-out-tools by any chance?

1 Like

Hi! Yes, directly related! :slight_smile: Thanks!

Does or will ProBuilder support non-destructive workflows (CSG)? If not, that would be on my wishlist. :wink:

3 Likes

It doesn’t currently, and it isn’t specifically planned out yet, but yes, we’d love to have something like this too. Are you aware of RealtimeCSG?

2 Likes

Sure, I tried that and SabreCSG. They both felt somewhat off-putting, and at least one created broken geometry with rather simple boolean ops. Not having received any updates in >4 years makes them unsuitable for production use.

The devs actually teamed up to make a new CSG tool called Chisel, but they warn it is far from production-ready, and sadly there has been barely any development on it in the past two years.

The last option is to use TrenchBroom (a Quake map editor) and import the maps. The problem is that you end up with a single static mesh. Doors, lifts, and platforms would have to be made in Unity after importing and matched up with that mesh. Then you go back and make changes to the map and … ugh.

I just recently learned that Dusk was made entirely with ProBuilder without the use of CSG boolean operations. That reinforced my notion that ProBuilder is still the best option when it comes to making Quake-style maps (which is basically grayboxing, except you end up polishing and using that graybox). Together with (nested) prefabs I think this can work really well.

1 Like

Thanks for all the info! TrenchBroom is another good example. Sad to hear none of those worked out, but yes, the folks behind Dusk, Strafe, Tunic, SUPERHOT … they’ve made ProBuilder work, so it’s definitely doable. We know it can be much better though! I’m sure they’ll tell you that too, ha!

All that said, we are considering a re-scope to include boolean and/or CSG-style workflows as primary. We’re big fans of this … it’s considerably more work, but it might be the right way to go.

5 Likes

A good bevel and boolean solver like Blender’s would be great.

4 Likes

After 4 years of the new prefab workflow, we still have this issue on Unity 2021.3.11, using the latest ProBuilder version, 5.0.6.
As others have mentioned in this thread:

Prefabs always think ProBuilder objects have overrides. This makes it hard to maintain prefabs and tell whether they’re up to date, and it makes it hard to find the actual overrides when we use ProBuilder objects in a prefab.
This is annoying and has slowed down our work.
Many people have asked for this to be fixed.

2 Likes

@shikhrr - yes, “putting holes in things, easily and non-destructively” will be a major focus. Doors and windows should be simple.

@afshin_a_1 - also yes - working fully with prefabs and version control are requirements!

2 Likes

It would be amazing to have boolean operations as intuitive as in vector-editing tools (see the gif below).

Typically, in 3D programs it’s more like this:

  • Select two objects
  • Then select the correct type of boolean operation (there are several buttons)
  • Apparently I’ve selected the two objects in the wrong order, so I have to revert the changes and redo it.

Then repeat who knows how many times.

I understand this is much more difficult to do in 3D space than in 2D, but it would still be amazing to have.
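
For reference, here’s roughly what that operand-order problem looks like when scripted – just a minimal sketch using Blender’s Python API, where the “Wall” / “Door” object names are placeholders I made up:

```python
# Minimal sketch using Blender's Python API; the "Wall" and "Door" object
# names are placeholders, and the modifier setup is only illustrative.
import bpy

def subtract(target, cutter):
    """Non-destructive boolean: keep 'target minus cutter' as a modifier."""
    mod = target.modifiers.new(name="Cut", type='BOOLEAN')
    mod.operation = 'DIFFERENCE'
    mod.object = cutter
    cutter.display_type = 'WIRE'  # keep the cutter visible but unobtrusive

wall = bpy.data.objects["Wall"]
door = bpy.data.objects["Door"]

subtract(wall, door)   # wall with a door-shaped hole: what I usually want
# subtract(door, wall) # swapped order carves the wall's shape out of the door instead
```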

3 Likes

One thing I’d love to see is an improved selection workflow. Too many times when I do a box selection it just grabs random vertices, and as a result I have to go through all of them one by one :hushed:

4 Likes

Levels should be able to be treated like clay – with decent hard-ops tools as well.

This means free-form and flexible tools.

Boolean-style CSG operations are mandatory nowadays, along with at least some semblance of a voxel-based environment-design solution, plus LOD and impostor generation options. Even the simplest geometry often needs to be displayed in an open-world setting, so creating LODs for it is necessary. Vertex-by-vertex editing is mostly outdated in the kinds of pipelines modern games need.

Sometimes, I feel Unity’s tool design pipeline – especially for even moderately open worlds – is just completely out of touch.

I’ve used Unity’s recently touted “workflow” for “optimization” in “open world games” (such as the “optimization” in Alba, for example), and I wasn’t surprised that project nearly failed because of its “iterative” approach to geometry generation!

Speaking as a game artist and researcher familiar with many procedural tools and workflows, the best ways to build geometry, in my experience, are brush, sculpt, and chain / placement tools (all of which Blender is great at – and which I abhor in Unity, since they often require addons/assets).
As a side effect, these “better” tools (sculpting, placement, chaining) are also applicable in VR, and ironically this is a result of their more intuitive usage, leading to a generally better UX.

Additionally, it would be great to be able to “program geometry” using tools like Blender’s geometry nodes. If I’m being honest, you might do A LOT better by simply porting / mirroring Blender’s geometry generation and other tools’ results into Unity itself (and even adding your own operation handles for some of them), similar to how Houdini Engine is used in Unity / Unreal. That would give Unity users lots of great tools and a truly decent pipeline option with relatively little overall effort on Unity’s part: aside from grabbing the resulting mesh, setting boundaries / placement options, and perhaps adding some UI / UX features via a Blender addon, Unity users would only do basic operations natively in Unity itself (i.e. vert/edge/face manipulation and CSG operations). The main goal would be sending and receiving assets/geometry to and from Blender automatically, making Unity assets part of Blender’s Asset Library while in use in Unity, or performing CSG operations etc. in Unity and sending those small updates back to Blender – leaving the heavy geometry processing to Blender and keeping Unity out of the business of developing more crappy art tools.

This approach, I think, would lend itself to a much more future-proof tool (and require less buy-in from higher-ups), and since Blender is constantly evolving and becoming more user-friendly (and is free), many of its tools, as they improve, simply make Unity better as a result. You can use networking to talk to / from Blender and synchronize its results (and push results back to Blender to keep things mirrored).

I’ve seen this done in Unity already – so you’ve already got a bit of a base to work from. It would make greyboxing levels much more intuitive and freeform (especially if you let users rely on Blender to implement a VR editor experience). Generating Unity instances from points in Blender would work very similarly to how Houdini Engine already works for Unity. And the best part – it’s free.
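
To make the networking idea concrete, here is a minimal sketch of the Blender side of such a sync. It is assumption-heavy: it presumes a hypothetical Unity-side listener on localhost:9000 and a made-up, length-prefixed JSON wire format, not any real protocol or official API:

```python
# Minimal sketch of pushing the active Blender mesh to Unity over a socket.
# Assumptions: a hypothetical Unity listener on 127.0.0.1:9000, and an
# invented wire format (length-prefixed JSON); nothing here is an official API.
import bpy
import json
import socket
import struct

def mesh_payload(obj):
    """Flatten an object's evaluated, triangulated mesh into plain lists."""
    depsgraph = bpy.context.evaluated_depsgraph_get()
    mesh = obj.evaluated_get(depsgraph).to_mesh()
    mesh.calc_loop_triangles()
    vertices = [c for v in mesh.vertices for c in v.co]               # x, y, z per vertex
    triangles = [i for t in mesh.loop_triangles for i in t.vertices]  # 3 indices per triangle
    return {"name": obj.name, "vertices": vertices, "triangles": triangles}

def send_to_unity(obj, host="127.0.0.1", port=9000):
    """Send one length-prefixed JSON message describing the mesh."""
    data = json.dumps(mesh_payload(obj)).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("<I", len(data)) + data)

# Example: push whatever object is active in Blender to the Unity listener.
send_to_unity(bpy.context.active_object)
```

The Unity side would just need a matching TCP listener that rebuilds a mesh from the vertex and triangle arrays; keeping the format this dumb is what would make the mirroring cheap for Unity.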

Unity + Blender (Pros/Cons)
Pros:

  • Networking interfaces can exchange data and changes in real time

  • A base for geometry mirroring between Unity and Blender already exists in Unity’s GitHub repo

  • Using Blender would be similar to using Houdini Engine in Unity (fully working example already exists)

  • Access to Procedural Geometry and Sculpting Tools out of the box

  • This access includes CSG (and potentially material) workflows

  • Blender is a free/open-source tool that is always improving, meaning its increased value adds value to Unity

  • Unity is only responsible for the in-editor “handles” that chain and place any meshes generated

  • Unity can provide very basic low-level mesh tools (vertex/edge/face editing, CSG boolean operations) and mesh/object/instance placement tools

  • More complex mesh operations can be passed off to Blender and can return to be visualized without the time-consuming export process (which is currently required)

  • Asset Libraries in Unity can contribute to – and benefit from – Blender’s Asset Libraries and art ecosystem

  • Purchased assets / tools from the Asset Store can be used in Blender as well: added to its Asset Library as a new Collection for use in a scene or other projects, and brought over as simply an “instance” in Unity (since they already exist as GameObjects/prefabs in Unity). This lets you create complex levels with repeated elements without the struggle of importing/exporting mesh variations just because something is a “mesh” in Blender but a “Prefab” in Unity – as long as it is in an Asset Library, it is an “instance” of a prefab shared between both applications

  • Greyboxing is kind of a “duh” operation in Blender, but dumb meshes can be swapped out for fancier ones with materials / animations / etc., since Unity is aware of them, and the user can use a Blender addon to send them over to Unity in a special way.

  • Hardsurface modeling could be combined with sculpted modeling quickly and easily

  • It would be possible to partner with the Blender Institute and bring more users to Unity, since more “official” support for Unity as the replacement for Blender’s game engine might come with an official partnership (or, at minimum, Unity could simply sit pretty for a while and potentially gain support from the official Blender development team for integrating / mirroring its tools/API, if it ever desires to).

  • Although Blender is non-native, it could FEEL like Blender is being used natively by making the UI / UX mostly transparent, in a similar way to how Houdini Engine does it – i.e. by providing a handle-based UI for certain interactions (splines, curves, or other visual modifiers), defined in a Blender addon, to assist with value tweaks to certain nodes and parameters as they appear in Unity (similar to an HDA – a Houdini digital asset). That lets Blender do the hard work while Unity appears to do it natively, which slots naturally into Blender’s procedural “everything nodes” pipeline

  • Unity gets to decide how “intuitive” its art tools are by leveraging Geometry Nodes (from the “Everything Nodes” project) and putting an interface on them, probably by developing its own addon that signals to Unity the kind of UI or 3D widgets particular named or custom nodes require for their UX in Unity (such as pseudo-CSG operations, or even dungeon generation)

  • Sculpting, Brush, and Chain/Placement tools are VR-ready

  • Blender is aiming to integrate VR into its tooling pipeline, meaning Unity may not have to pioneer that path or integrate it into its own editor

  • Artists, Animators, and Game Designers are already using Blender for their development/design work

Cons:

  • Unity has no official support for Python

  • Unity wants barebones greyboxing tools and nothing more

  • Unity doesn’t want to use or play nice with third-party tools

  • Unity has to put effort into a tool that is not its own, making that tool non-native

  • Blender would be the boss over what features stay / go in the future

  • Unity would (potentially) have to make an addon or two in Blender to control its output to Unity in order to maintain feature-parity with something like an HDA digital asset from Houdini (for custom procedural tools and pipelines)

  • Keeps Unity out of the “business” of developing crappy art tools

  • Procedural tools wouldn’t be as good as Houdini’s – but for free? It’s definitely a start!

  • Doesn’t fit with Unity’s (currently vague) greyboxing direction – but provides all it will ever need (and more!) in a single tool solution (such as random greybox dungeon generation), so…

  • It’s me suggesting this so… it’s probably “hard”.

  • Some designer @ might need me to guide them.

2 Likes

Here are some more minor but cool features that could land in the backlog.

A nicer way to bevel corners: (skip to 1:22)

A cool feature in UE5; I’ve already had occasion to use it, and it’s almost perfect. I also love the interface that pops up from the left only when needed (a bit like in Maya).

Some cool features from a video game (made in Unity) that has house-building elements:

Also, it would be nice to get an implementation of splines in ProBuilder.

3 Likes

Absolutely agree on the Cube Grid suggestion. Ever since I saw that, I’ve been drooling, lol. The Cube Grid tool in UE5 is a game changer for workflow and is extremely simple to use for blockouts when you need a specific kind of prototyping. I tend to think more in that way, so I’ve been thinking of building my own solution in Unity for prototyping like that. Having support for it in ProBuilder would certainly be helpful, though.

1 Like

Honestly, Unity would first have to do something about its grid system and the fact that ProBuilder (and Polybrush) meshes show up as overrides in every scene.

Especially the grid system: UE5 now has two separate grid systems, one for placing meshes and a second for creating them via Cube Grid. This becomes an annoying issue whenever you want to move the mesh you’ve just created and then go back to edit it a tiny bit.

There is also a problem with UVs and mesh generation. You can’t use this tool for anything other than rapid prototyping, even if you intend to make a game with voxel graphics or a Quake-like.

Well yeah, a blockout tool is a blockout tool. It doesn’t need to be supported with all that extra stuff like UV mapping. It’s just so you can block out a design for a level really, really rapidly. Textures aren’t usually part of a blockout. You’d export your blockout and build your nice fancy scene geometry in something like Blender, working around the blockout version. But I imagine simply UV mapping each block face to the full UV range would be sufficient for a blockout if you really needed to use some textures for something.

I just think that sometimes, if you try too hard to make a tool into an “everything” tool, you will almost always fail to make it good at “specific” things. I’d rather have a really, really simple blockout tool for more involved game development and art workflows than a dull Swiss Army knife that beginners can latch onto without really learning any good workflows with other programs.

Honestly, I like texturing ProBuilder meshes in prototypes, so having an option for that is always a big plus.

Stairs are still broken! Can’t make a curve, and the steps have the wrong angle.

I really hope that Polybrush gets some love too! It hasn’t been updated in a long time :frowning:

3 Likes