Best practices for UI audio?

I was wondering about the best way to handle UI sounds like button clicks and such.

My initial thought was to make a single AudioSource in my scene holding a click sound, then wire each button up to that AudioSource and call AudioSource.Play in the onClick event.
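For dynamically spawned widgets, that wiring can also be done from code instead of in the Inspector. A minimal sketch of the single-shared-AudioSource idea, assuming a helper component (the names `UIClickSounds` and `clickSource` are illustrative, not from the original post):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical helper: hooks every Button under this object up to one
// shared AudioSource, so spawned prefabs don't need manual wiring.
public class UIClickSounds : MonoBehaviour
{
    [SerializeField] private AudioSource clickSource; // holds the click clip

    private void Awake()
    {
        // Include inactive children so widgets enabled later are covered too.
        foreach (Button button in GetComponentsInChildren<Button>(true))
        {
            button.onClick.AddListener(() => clickSource.Play());
        }
    }
}
```

Buttons instantiated after `Awake` would still need to register themselves, e.g. in their own `Start`, so this is only a partial answer to the dynamic-spawning problem.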

However, I dynamically spawn a lot of UI widgets, so that’s a little annoying to do. Then I thought I could just put an AudioSource component on every button GameObject; that way I can link the event up to the AudioSource in my button prefabs. That’s a lot easier to do, but I wasn’t sure about the performance/memory impact of doing things this way.

Thoughts? Other considerations I’ve missed?


What I do in general (not just for UI): I have a “soundEffect” prefab, which is a Transform with an AudioSource attached. Then I have an audio handler with methods for spawning sound-effect prefabs at the needed locations with the right parameters. The sound effect destroys itself after it finishes playing. Scripts that play sounds just reference my audio handler object and call something like AudioHandler.PlaySound(soundX); this way I can spam sounds.
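A sketch of that pattern, under stated assumptions: the class and method names (`AudioHandler`, `PlaySound`) follow the post, but the signature and the prefab field are my guesses, not the poster’s actual code:

```csharp
using UnityEngine;

// Sketch of the "spawn a sound-effect prefab, let it clean itself up" pattern.
public class AudioHandler : MonoBehaviour
{
    // Prefab consisting of a Transform with an AudioSource attached.
    [SerializeField] private AudioSource soundEffectPrefab;

    public void PlaySound(AudioClip clip, Vector3 position)
    {
        AudioSource source = Instantiate(soundEffectPrefab, position, Quaternion.identity);
        source.clip = clip;
        source.Play();
        // Delayed Destroy removes the spawned object once the clip has finished.
        Destroy(source.gameObject, clip.length);
    }
}
```

Worth noting: Unity’s built-in `AudioSource.PlayClipAtPoint(clip, position)` does essentially the same thing (spawn a temporary GameObject, play, destroy), so for simple one-shots you may not need a custom handler at all.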

I don’t know whether many AudioSources are bad for performance, but most of the time in my application no more than two are playing anyway!