Auto Gamepad Generator - Generate controller support in under 5 minutes!

What is it? Another input manager! Before you stop reading: it is different from the other excellent input solutions on the store, like Rewired, InControl, etc. Where each of those solutions effectively replaces the Unity Input system with its own API, Auto Gamepad Generator instead uses the Unity Input API, so you don’t have to change a single line of code and can add gamepad support to your keyboard/mouse game in under 5 minutes. Essentially, even if you didn’t want to put the effort in to support controllers, now you easily can!

Cost: $19.98

Trailer:

Deep Dive:

The main market I see for this asset is anyone whose game is keyboard/mouse and who wants to put in zero effort to support most controllers. Yes, the Unity Input API isn’t perfect: it doesn’t support rumble or hot-plugging. But it works just fine for the vast majority of games. The asset is also priced accordingly, at much less than Rewired or InControl.

Features
* Automatically detects your input manager
* Provides easy drop-downs for mapping
* No new API to learn
* No code changes required
* Remaps both keyboard and mouse
* Both control schemes coexist seamlessly
* Works both in-game and in new Unity GUI menus
* Supports many controllers and platforms
* Takes just a few minutes

Asset Store Description
So you’ve created your game with keyboard/mouse controls using Unity’s Input API for quick iteration in the editor. Perhaps you’ve thought about controller support, but which controllers and which platforms? Then you looked into it and realized that there are no standards for controller mapping at all; it’s a mess. Finally, something easy can be done about it!

With Auto Gamepad Generator you can support today’s most common gamepads within 5 minutes. No coding required: simply use our user-friendly editor window to add the controller mappings you desire for your game, and that’s it! We generate the additional Unity Input Manager virtual axes and buttons for you. Your existing code will now work with gamepads!

Platforms Supported
* Windows
* Mac
* Linux
* iOS
* Android
* Xbox Consoles
* PlayStation Consoles
* Fire TV
* Android TV
* WebGL (Chrome, Firefox, and Edge)

Our asset has just gone live today. I’ll be supporting this for a long time, so if you have any questions or issues, don’t hesitate to ask.


Version 1.1 update is now live on the store.

ChangeLog

  • Added support for additional Bluetooth 4.0 controllers on Windows
  • Added a Unity 5.0 package to the Asset Store in addition to the original Unity 4.3 package, to remove a warning for 5.0 users (the package is compatible with Unity 4.x and Unity 5.x)

Hi!
Does the manager support 4 gamepads?

Yeah, it uses the Unity Input API under the hood, so it supports up to 11 gamepads.

Thanks for the answer, I’ll give it a go!

Hey, this is a terrific tool, thanks for making it! However, we’re having problems with a couple of things:

  1. Left/Right trigger inputs don’t work.

  2. At times in the game, the cursor is released so the player can click on GUI buttons etc. I’ve assigned left thumbstick left/right to all the Horiz/Vertical, Mouse X/Y, and Move Cursor X/Y inputs, so the player can use the left thumbstick to move the cursor around, but it doesn’t work on the controller.

Using 4.6.9, and I’m developing on a Mac, so I haven’t tried a Windows build yet. Any suggestions?

Thanks for the kind words. Responses below:

  1. The Mac is by far the pickiest platform for controller support, probably because it relies on 3rd party drivers for the most popular controllers. Can you let me know what OSX version, what controller, and what 3rd party driver you are using, and I can try to test that on my end. One important thing to note about the triggers on OSX is that they start out at 0 but then switch to a -1 to 1 range the first time you hit them. So it’s usually best in your GetAxis code to check whether the value is greater than 0, since most people just want a positive range (depending on the game type). The other platforms (Windows, iOS, Android, etc.) are much more consistent in how controllers behave.
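To illustrate the trigger check, something like this should be safe on OSX (the axis name here is just an example; use whatever your trigger axis is called):

```csharp
using UnityEngine;

public class TriggerExample : MonoBehaviour
{
    void Update()
    {
        // On OSX the trigger axis reads 0 until first touched, then switches
        // to a -1..1 range, so treat any positive value as "pressed".
        if (Input.GetAxis("RightTrigger") > 0f)
        {
            // trigger is at least partially pressed
        }
    }
}
```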

  2. This is not inherent to AGG, but it is important to any controller/keyboard navigation of Unity GUIs. When you have a Unity GUI and are not using a mouse or a touchscreen, you always have to make sure something is selected. This is because keyboard/gamepad menu navigation is relative (move left/right/up/down) and only makes sense if the GUI knows what’s selected (not the case with the mouse, since it uses raycasts). If, when you release the cursor, you make sure the GUI element you want is selected, it should navigate your Unity GUI correctly. If I’m misunderstanding what you mean, elaborate a bit more on what you are seeing and let me know.
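For example, when you release the cursor you could select an element with something like this (the button field is just a placeholder for whichever element you want focused first):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class MenuFocus : MonoBehaviour
{
    public Button firstButton; // placeholder: assign the element to focus

    // Call this whenever you release the cursor / open the menu
    public void OnMenuOpened()
    {
        // Clear the selection first, then set it, so re-opening the same
        // menu still triggers the selection.
        EventSystem.current.SetSelectedGameObject(null);
        EventSystem.current.SetSelectedGameObject(firstButton.gameObject);
    }
}
```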

Thanks for the quick response!

  1. I’m running 10.10.5, with a 360 controller and this driver:
    Release 360Controller 0.15 (beta 3) · 360Controller/360Controller · GitHub
    We’ll check the axis code as you suggest.

  2. I didn’t mention that this game uses a combination of OnGUI and uGUI, for legacy reasons, and the OnGUI code is pretty crazy in some places. Would your suggestion work with OnGUI, or only with uGUI?

  1. I can’t say I’ve used this driver before. In general my recommendations for Mac are the Tattie Bogle driver for the 360 controller and the Xone-OSX driver for an Xbox One controller, and these are what I use on my end. I’m not saying the other driver is the culprit (it might be fine), but it’s something to try if you can’t get the triggers to work. It’s also important to note that I’m on the latest El Capitan and you’re on Yosemite, so there are other differences in play.

  2. My suggestion is for uGUI. I haven’t used OnGUI in ages, but I believe the premise was the same for keyboard/controller. In the legacy system I believe you used GUI.FocusControl and GUI.SetNextControlName, and had to explicitly do a lot of things that aren’t necessary in uGUI, which handles them based on positioning. However, when not using mouse/touchscreen, always having something selected is important. These control schemes are relative and can’t use raycasts like mouse/touchscreen can.
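From memory, the legacy version looked roughly like this (untested sketch; the control name is arbitrary):

```csharp
using UnityEngine;

public class LegacyMenu : MonoBehaviour
{
    bool focused;

    void OnGUI()
    {
        // Name the control so it can be focused by name
        GUI.SetNextControlName("PlayButton");
        if (GUI.Button(new Rect(10, 10, 120, 30), "Play"))
        {
            // start the game
        }

        // Force focus once so keyboard/gamepad navigation has a starting point
        if (!focused)
        {
            GUI.FocusControl("PlayButton");
            focused = true;
        }
    }
}
```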

Hello,

I’m wondering if there’s a convenient way to use the Xbox controller’s D-pad or triggers as if they’re regular buttons. The problem is that since they’re mapped to axes (on some platforms, in the D-pad’s case) I can’t use Input.GetButtonDown on them. Do I have to keep track of each trigger/button’s state myself, or do you have a convenient solution that doesn’t require extra code?

Edit: It turns out that to achieve this you do need to use Input.GetAxis. So there would be a one-line code change if you want to map a keyboard key to an analog stick. Mapping it to the face buttons, bumpers, etc. does not require any code changes. The advice below on dead zone, gravity, etc. is still valid, but you cannot use GetButtonDown on an axis, only on a button.

You can treat any axis (whether the dpad or triggers) like a button if you want to. The picture below shows the important parts to set before you click generate, mainly the overrides for gravity, dead zone, sensitivity, and invert. Analog sticks usually have settings like the first entry (0, .19, and 1 respectively). The second is your more typical digital button, with values of (1000, .001, and 1000 respectively). The gravity and sensitivity values dictate how quickly the value changes from 0 to 1 and then back from 1 to 0. The dead zone dictates which low values should be ignored.

Most of the time the defaults that are generated are what you want, and you don’t have to change them (as with the dpad; but if you want the triggers, which are analog, to behave like a digital control, you will have to change the values). Whether you are using GetAxis or GetButton, you are essentially always getting a value from 0 to 1, and they can be used interchangeably.

So in your case, where you don’t want to change code and still want to use GetButton on the dpad, you certainly can; just align the values so that the value changes to 1 when you want it to. Since the dpad is a digital control, if I wanted right on the dpad to be a button, I would set the values like a digital control (1000, .001, 1000, which is the default, so you don’t have to change anything) and set it to dpad horizontal. If I wanted dpad left to be a button, I would do the same but check the invert box, so that GetButtonDown fires when the value goes to -1, not to 1.

The triggers would be the same, but by default they behave like an analog stick (which can generate .25, .67, etc.; we don’t want that, we want it to be button-like: 0 or 1). So for the trigger we’d want to override the sensitivity and the dead zone so it behaves like we want. I would recommend (0, .2, and 1000 respectively). This would fire the button down when the trigger is pressed 20% and go back to 0 when the trigger is released to less than 20%. The dead zone dictates how far you’ve depressed the trigger before the “button” will fire.

(Attachment: AGG.png - screenshot of the override settings in the editor window)

Hope that helps. You should be able to achieve what you want without changing any of your existing code.

Oh thanks!
From other forum posts I got the impression that you can’t use Input.GetButtonDown if your input is of type Joystick Axis. Cool if it does work!

I went back and explicitly tested this tonight, because I knew I’d done something like this in the past, and you are correct: GetButton, GetButtonDown, etc. do not fire when they are mapped to an axis (only when mapped to buttons like A, X, left bumper, etc.).

You can still accomplish what you are trying to do by changing the code (or adding a condition to an if statement) to make it work for axes as well, if you want to use the dpad or triggers on platforms that treat them as axes rather than buttons (like Windows; on Android and other platforms they are actually buttons). You would change a line of code like the following

if (Input.GetButton("ActionButton"))

to the following

if (Input.GetAxis("ActionButton") > .2f)

The same would apply to GetButtonDown or GetButtonUp, but you would have to keep a bool and only fire when the value changes from the previous frame.
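As an untested sketch, the bool-tracking version would look something like this (names are placeholders):

```csharp
using UnityEngine;

public class AxisAsButton : MonoBehaviour
{
    const float Threshold = 0.2f; // how far the trigger/axis must move to count as pressed
    bool wasDown;                 // pressed state from the previous frame

    void Update()
    {
        bool isDown = Input.GetAxis("ActionButton") > Threshold;

        if (isDown && !wasDown)
        {
            // GetButtonDown equivalent: fires the frame the axis crosses the threshold
        }
        if (!isDown && wasDown)
        {
            // GetButtonUp equivalent: fires the frame it drops back below
        }

        wasDown = isDown;
    }
}
```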

Obviously this only applies if you are trying to map a digital key to a joystick or other axis; mapping to face buttons or other digital buttons works just fine.

I knew I had done something like this in the past, and I dug up my old code, and it’s actually the opposite that is true… Meaning that if you have a joystick axis, you can map it to 2 digital buttons just fine (like making left/right bumper be an axis). In my old example I replaced my GetButton code with the GetAxis equivalent, and then I didn’t need an if condition or separate code regardless of whether it was an axis or a button.

Sorry for the confusion.

Yep, thanks for checking. I was kind of hoping I wouldn’t need to add the messy hack of tracking each of these button states in the code…

Yeah, no problem. I take it that in your game you’ve already used up the 6 digital controls (face buttons and bumpers) and are on Windows, where the dpad is only mapped to an axis? I kind of like that on Mac, iOS, Android, and WebGL Unity mapped the dpad and triggers to digital buttons as well; I wish they did the same on Windows, but they didn’t.

Well, we want to support all of them.

And yeah, we need a lot of buttons…

So Tommi and I are setting this up, but now I’m getting goofy results when testing in the editor, so I went back and re-read the documentation and realized I’d misunderstood this part:

If your game is multiplatform, select the platform you are currently building for (you can regenerate easily later when you build for a different platform). If using play mode in the editor for testing, select the platform Unity is currently running on.

I had thought that I could add a second round of inputs for the other platform (to get both Mac and Windows). But now I realize I have to set up one platform, publish, then clear the settings, do the other platform, and publish again. That is, I misunderstood the purpose of the platform pulldown: I thought it was telling Unity “assign the appropriate settings for this platform and use them exclusively on that platform,” but it’s only saying “assign the appropriate settings for this platform.” So this turns out to be a pain to redo each and every time I build for the other platform. Is there really no way to have settings for multiple platforms?

In general, to answer this question you have to understand what Auto Gamepad Generator (AGG) is doing for you. At the end of the day it is generating entries in the Unity Input Manager for you, based on your previous mouse/keyboard controls and the platform/controller buttons you select. The reason there is a platform dropdown at all is that Unity does not maintain consistency across its platforms in controller layouts. For example:

Windows:
Button 5: right bumper
Button 6: back
Button 7: start
Button 8: clicking left stick

Mac:
Button 5: dpad up (right bumper is button 14)
Button 6: dpad down (back is button 10)
Button 7: dpad left (start is button 9)
Button 8: dpad right (click left stick is button 11)

As you can see, these actually conflict with each other, so if you were to use the Windows-generated inputs and run the game on the Mac, it isn’t that nothing would work; totally different buttons would fire! If you generated both sets, you would have conflicting entries, and your buttons on both platforms would do multiple things. At the end of the day the issue is with the underlying Unity mappings in their API, and it is not something that can be compensated for in an external plugin, which is why the documentation says to delete/regenerate before each platform’s build. For a typical game with only 8-10 actions this takes about 1-2 minutes, but I agree it is annoying. If there were a way to change this I would certainly add it, but unfortunately there isn’t.

If you use the Unity Input API there isn’t a way around this, and it is why this asset is less than half the price of Rewired. Essentially I market this as an asset that lets you generate controller support from your existing mouse/keyboard game easily when you have used the Unity API, whereas Rewired replaces the Unity Input API with its own; but that means you have to do it Rewired’s way from the start, including mouse/keyboard. At the end of the day, I certainly feel there are games that would benefit more from AGG (mainly games that want to add controller support to their mouse/keyboard game after the fact) and games that would benefit more from Rewired (if you need hot-plugging, DirectInput compatibility, or other things the Unity API doesn’t provide).

Couldn’t you store the mapping info somewhere other than the Input Settings, and then let the user simply switch between their setups when switching platforms (or even switch automatically as the user switches platforms, though I’m not sure that’s possible)? I think the most annoying thing is that the setup resets after you generate the inputs every time (unless we’re doing something wrong, of course). This is not only annoying: even if it’s doable in 2 minutes, it introduces the possibility of error every time you build.