What is the difference between SetReward() and AddReward()? They both look the same when I watch the cumulative reward in my environment. I didn't understand the difference that the documentation talks about.
With SetReward() you set the reward for a specific step during learning; with AddReward() you add a value to the reward already accumulated in that step. Calling AddReward(0.5) twice in the same step, or SetReward(1) once, gives the same result. If you first call AddReward(0.5) and then SetReward(0), the reward for that step will be 0. If you first call SetReward(0.5) and then AddReward(0.5), the reward for that step will be 1.
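Here is a minimal sketch of that behaviour inside an Agent subclass. The class name and reward values are only for illustration, and the OnActionReceived signature assumes a recent ML-Agents release (older versions take a float[] instead of ActionBuffers):

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;

public class RewardDemoAgent : Agent
{
    public override void OnActionReceived(ActionBuffers actions)
    {
        // Two AddReward calls accumulate within the same step:
        AddReward(0.5f);
        AddReward(0.5f);   // step reward is now 0.5 + 0.5 = 1

        // SetReward overwrites whatever was accumulated this step:
        SetReward(0f);     // step reward is now 0; the AddReward calls above are discarded

        // AddReward after SetReward adds on top of the new value:
        SetReward(0.5f);
        AddReward(0.5f);   // step reward is now 1
    }
}
```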
Note that these functions only affect a single step; they don't set the reward for the entire episode (although, depending on your setup, the last step of an episode may be the only one in which the agent receives a reward).
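For example, a common sparse-reward setup rewards the agent only on the step where the episode ends. A sketch of that pattern, with a placeholder reachedGoal flag and illustrative values:

```csharp
if (reachedGoal)
{
    SetReward(1f);       // reward for this final step only
    EndEpisode();        // the per-step rewards together form the episode's cumulative reward
}
else
{
    AddReward(-0.001f);  // small per-step time penalty added to this step's reward
}
```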