I’m making several tools, which can work separately, but one can enhance the other. So for example:
Tools A and B can each work without the other installed.
Tool A gains extra features if tool B is installed.
The problem I’m having is actually detecting that tool B isn’t installed without throwing up compiler errors. Ideally I’d just have a “#define TOOL_B” in tool B and an “#if TOOL_B” in tool A, but I can’t seem to get that working. Are there any known workarounds for this? If at all possible, I’d also like it to still fit nicely into a unitypackage, so adding an rsp file isn’t the best option for me, as it would overwrite any existing rsp file.
What is the reason for wanting to use defines instead of just not using the extra code if the other tool doesn’t exist? I doubt you’re really dealing with that much saved space.
This would work fine if ToolB was in the project, but if it wasn’t, it would throw an error because of the “ToolB.Method()” line, since ToolB does not exist.
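To make the failure mode concrete, here's a minimal sketch (ToolA, ToolB, and the TOOL_B symbol are the hypothetical names from the question): with the `#if` guard in place, the ToolB call compiles away cleanly when TOOL_B isn't defined.

```csharp
// Hypothetical sketch: ToolA only references ToolB inside an #if block,
// so this compiles whether or not TOOL_B is defined.
// (Add "#define TOOL_B" at the very top of the file, or add TOOL_B to the
// Scripting Define Symbols, to enable the ToolB path.)
public static class ToolA
{
    public static string Build()
    {
#if TOOL_B
        // Compiled only when TOOL_B is defined; if ToolB's scripts were
        // missing and this line were compiled, it would be a compile error.
        return "built, version " + ToolB.GetVersion();
#else
        return "built without versioning";
#endif
    }
}
```

Without the guard, the `ToolB.GetVersion()` call is compiled unconditionally, which is exactly the error described above.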
There might be a way to either find an abstraction that resides in a common code base of both “tools” (like an interface) or if that doesn’t work, you’ll be able to handle this either with a type-safe event (C# events) or some reflection based (not necessarily type safe) event system, kinda like SendMessage.
Without any further information about the complexity and relation of these tools, it doesn’t make a lot of sense to go into detail for either of these approaches. I wouldn’t solve this with conditional compilation though.
It basically boils down to the example I gave earlier of “If Tool B is installed, execute some code that requires it.”
One of the tools is a custom build pipeline, the other is a tool that automatically stamps the version from git into the build. Both work without each other, but the custom build pipeline tool can integrate with the automatic versioning tool to set the version before building the player.
Some packages include Scripting Define Symbols when they’re imported. For example, after importing the Post-Processing Stack V2, you can see this in the Player settings window:
I don’t know how intrusive it is to modify the Scripting Define Symbols, but it works the way you’d want. In Cinemachine (another, unrelated asset), they use #if to determine whether the Post Process Stack is installed. For example:
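A sketch of what such a guard looks like; the define symbol name below comes from the Post-Processing Stack V2 package, but double-check it against the package version you actually target:

```csharp
// Cinemachine-style guard: the using directive and the field only exist
// when the Post-Processing Stack V2 package has added its define symbol.
#if UNITY_POST_PROCESSING_STACK_V2
using UnityEngine.Rendering.PostProcessing;
#endif

public class ExampleExtension : UnityEngine.MonoBehaviour
{
#if UNITY_POST_PROCESSING_STACK_V2
    // Only compiled when PPv2 is present in the project.
    public PostProcessProfile profile;
#endif
}
```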
I have tried this actually. My fear there is that a user will install Tool A and B, then decide they don’t want Tool B, uninstall it, and then they’re getting errors because the symbols still exist.
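One known workaround for exactly that stale-symbol problem is an editor script in Tool A that checks via reflection whether Tool B's types actually exist, and strips the define if they don't. A sketch, assuming Unity's PlayerSettings define-symbol API; the symbol and type names are placeholders:

```csharp
using System;
using System.Linq;
using UnityEditor;

// Runs on every domain reload; removes the TOOL_B define if Tool B's
// main type can no longer be found (i.e. the user deleted Tool B).
[InitializeOnLoad]
public static class ToolBDefineSanitizer
{
    const string Symbol = "TOOL_B"; // placeholder symbol name

    static ToolBDefineSanitizer()
    {
        var group = EditorUserBuildSettings.selectedBuildTargetGroup;
        var defines = PlayerSettings.GetScriptingDefineSymbolsForGroup(group)
                                    .Split(';').ToList();
        // Placeholder type name: any type that only exists when Tool B is installed.
        bool toolBPresent = Type.GetType("ToolB.GitVersionStamper, ToolB") != null;

        if (!toolBPresent && defines.Remove(Symbol))
        {
            PlayerSettings.SetScriptingDefineSymbolsForGroup(
                group, string.Join(";", defines));
        }
    }
}
```

Tool B itself would add the symbol on import the same way; the sanitizer only ever removes it, so the two don't fight.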
Redesign the code so your dependencies go through interfaces. So your build pipeline tool would be able to use an IVersionSetter for example if one exists. Then just have a simple plugin registry that knows what plugins are installed and knows how to instantiate them by name via Activator.CreateInstance.
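A sketch of that registry idea, assuming lookup by assembly-qualified type name; `IVersionSetter` and the type names are illustrative, not a real API:

```csharp
using System;

// Tiny plugin-registry sketch: dependencies are resolved by type name,
// so a missing tool simply resolves to null instead of producing a
// compile error in the dependent tool.
public interface IVersionSetter
{
    void SetVersion(string version);
}

public static class PluginRegistry
{
    // Returns an instance of the named type, or null if the type
    // (i.e. the tool that defines it) isn't present.
    public static T Resolve<T>(string assemblyQualifiedName) where T : class
    {
        var type = Type.GetType(assemblyQualifiedName); // null if absent
        if (type == null) return null;
        return Activator.CreateInstance(type) as T;
    }
}
```

The build pipeline would then ask for an `IVersionSetter` by its configured type name and simply skip the versioning step when it gets null back.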
Please don’t use ifdef’s, that’s horrible and needs to die in a fire, I wish Unity would stop asset vendors from using that.
I just meant some code that knows what tools are installed. You could just save it in json local to the device. You could even just make it a scriptable object that end users configure manually. Like simplest/easiest form I can think of is an enum with a value for each tool, and the scriptable object just has a list of those enums. Or just a static list of names.
Then your tools could just have logic to query the registry and should know how to instantiate the dependency if it’s there.
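The "simplest/easiest form" described above, sketched as a ScriptableObject the end user fills in manually (the enum values are placeholders for whatever tools exist):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Placeholder tool names; one enum value per installable tool.
public enum InstalledTool { BuildPipeline, GitVersionStamper }

// Asset the user creates once and edits by hand when installing or
// removing a tool.
[CreateAssetMenu(menuName = "Tools/Installed Tools Registry")]
public class InstalledToolsRegistry : ScriptableObject
{
    public List<InstalledTool> installedTools = new List<InstalledTool>();

    public bool Has(InstalledTool tool) => installedTools.Contains(tool);
}
```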
So you’re suggesting I then have Tool C which is a dependency of Tools A and B? That may end up being my best option. I don’t really see a way around this without either having the user manually set some values or having a dependency keep track of things.
So that’s some good information to start with. Since this is a pretty common scenario, you’d actually find a lot of approaches that neither use conditional compilation nor some other fancy logic.
It seems you wanna process the same data all the time, so you won’t even need specific interfaces, instead you can keep this very generic.
The following snippet demonstrates just one quick and trivial approach out of many others:
using System.Collections.Generic;

public interface IPipeline<TData>
{
    void Add(IPipelineModule<TData> module);
    void Remove(IPipelineModule<TData> module);
    TData Process(TData data);
}

public interface IPipelineModule<TData>
{
    TData Process(TData inputData);
}

public class CustomBuildPipeline<TData> : IPipeline<TData>
{
    private List<IPipelineModule<TData>> _pipelineModules;

    public CustomBuildPipeline()
    {
        _pipelineModules = new List<IPipelineModule<TData>>();
    }

    public void Add(IPipelineModule<TData> module)
    {
        _pipelineModules.Add(module);
    }

    public void Remove(IPipelineModule<TData> module)
    {
        _pipelineModules.Remove(module);
    }

    public TData Process(TData data)
    {
        foreach (var module in _pipelineModules)
        {
            data = module.Process(data);
        }
        return data;
    }
}
You could now inherit from the custom class (or use the generic version on the fly), which already defines the logic; all you’d do is specify what TData is. Register various implementations of pipeline modules (your tools) that operate on a TData instance, and the pipeline runs through all of them.
Your tools now only need to know the absolute minimal “common denominator”: the IPipelineModule interface and the TData type.
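To make the registration side concrete, here's a usage sketch; the pipeline types are repeated in compact form so the snippet stands alone, and `VersionStampModule` with its version string is made up:

```csharp
using System.Collections.Generic;

// Compact restatement of the pipeline interfaces above.
public interface IPipelineModule<TData> { TData Process(TData inputData); }

public class Pipeline<TData>
{
    private readonly List<IPipelineModule<TData>> _modules =
        new List<IPipelineModule<TData>>();

    public void Add(IPipelineModule<TData> module) => _modules.Add(module);

    public TData Process(TData data)
    {
        foreach (var module in _modules) data = module.Process(data);
        return data;
    }
}

// Illustrative module: stamps a fake version string onto the build data.
public class VersionStampModule : IPipelineModule<string>
{
    public string Process(string inputData) => inputData + " v1.2.3";
}
```

A hosting tool would only register the module when the versioning tool is present; the pipeline itself never references it directly.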
There’s also a design pattern you could look into: the decorator pattern does something very similar. Also, pipelining or streams (in terms of Java 8 streams - not to be confused with input/network/IO streams) can be written in a more functional way - which is also very nice, and that approach would require fewer type declarations, too.
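The "more functional" variant mentioned above can be sketched with plain delegates: each step is just a `Func<TData, TData>`, and the pipeline is a fold over the steps, with no interface declarations at all.

```csharp
using System;
using System.Linq;

public static class FunctionalPipeline
{
    // Composes the steps left to right into a single function:
    // the output of each step feeds the next.
    public static Func<T, T> Compose<T>(params Func<T, T>[] steps) =>
        data => steps.Aggregate(data, (acc, step) => step(acc));
}
```

For instance, `FunctionalPipeline.Compose<int>(x => x + 1, x => x * 2)` applied to 3 yields (3 + 1) * 2 = 8.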
As you may guess by now, there’re tons of ways to achieve that.
Very interesting approach. It seems similar to a callback, basically invoking a set of methods at a certain point in the code, with the ability for other classes to add to that set of methods.
Yes, that’s it. As mentioned, that’s only a quick, dirty and very trivial example.
The main idea is: offer a small set of interfaces you’d like to use at a certain point and allow others to implement these interfaces (e.g. it could be compiled into a small assembly someone else could add to a project) and voila, they’ll be ready to write plugins and extensions for your core systems.
The only remaining question is who’s responsible for the registration part of the modules. @snacktime has already pointed out a valid and commonly used approach, namely using json files, or more generally, some sort of configuration. That’s usually the most flexible solution, as it depends on non-hardcoded information: the mentioned configuration file, programmatic yet dynamic contexts, or a simple straightforward solution: another application-specific component that knows which modules exist and how to set up the dependencies (which order, which concrete implementation, which configuration, etc.).
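A sketch of that configuration-driven registration, with module type names kept in JSON so nothing is hardcoded. This uses System.Text.Json for a self-contained example; inside Unity you'd more likely use JsonUtility, and the config shape is invented:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Invented config shape: a list of assembly-qualified type names.
public class ModuleConfig
{
    public string[] ModuleTypeNames { get; set; }
}

public static class ModuleLoader
{
    // Instantiates every configured module whose type actually exists;
    // missing tools are silently skipped instead of causing errors.
    public static object[] Load(string json)
    {
        var config = JsonSerializer.Deserialize<ModuleConfig>(json);
        var modules = new List<object>();
        foreach (var name in config.ModuleTypeNames)
        {
            var type = Type.GetType(name); // null if the tool is absent
            if (type != null) modules.Add(Activator.CreateInstance(type));
        }
        return modules.ToArray();
    }
}
```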
There’s a lot of stuff you can come up with, and with a little bit of effort you can already create a very flexible system.