Imitation learning without sensors

Is it possible to set up imitation learning when the project doesn’t have any sensors in it?

The documentation and functions for the demonstration recorder seem minimal. I'm assuming it usually uses sensor components to detect what is happening. However, in my project I don't use sensor components to gather observations; they're all set through AddObservations() in an agent.

If it is possible, how do I let the demonstration recorder know what’s going on?

AddObservations() actually writes into a vector sensor under the hood. That sensor is created whenever you set the vector observation space size to any value other than 0, so the demonstration recorder still sees your observations.
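To make this concrete, here is a minimal sketch of what that looks like in the ML-Agents C# API. Note that in recent ML-Agents versions the override is named `CollectObservations(VectorSensor sensor)` and the per-value call is `sensor.AddObservation(...)`; the class name `MyAgent` and the specific observed values are illustrative assumptions, not taken from the question:

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Sensors;
using UnityEngine;

public class MyAgent : Agent
{
    // Each AddObservation call writes into the agent's built-in VectorSensor.
    // Because the DemonstrationRecorder records from the agent's sensors,
    // these values end up in the .demo file without any extra setup.
    public override void CollectObservations(VectorSensor sensor)
    {
        sensor.AddObservation(transform.localPosition);   // 3 floats
        sensor.AddObservation(transform.localRotation.y); // 1 float
    }
}
```

With this setup, you just attach a DemonstrationRecorder component to the same GameObject as the agent and make sure the Vector Observation Space Size in the agent's Behavior Parameters matches the number of floats you add (4 in this sketch); recording then works the same as it would with explicit sensor components.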