Average (?) vector from two lines

I have two lines, which can be represented by two vectors (V1 and V2). I am trying to figure out how to find the point (at unit distance) along the line that points outward along the average angle of the two vectors (represented here by the dot on the green line), and also the inward one (the purple dot on the purple line). Hope someone can help!
[Attachment: 140000-vectorangles.png]

Just normalize V1 and V2, add them together, and normalize the result again. That gives you the purple vector; inverting it gives you the green one. Keep in mind that if the angle between V1 and V2 goes past 180°, the purple and green vectors swap places. There's also an edge case when V1 and V2 are exactly 180° apart: they cancel each other out and the sum is the (0,0,0) vector. That case can be handled, but how depends on what you want to happen there.
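A minimal sketch of that approach in Python/NumPy (the function name `bisector` and the fallback behavior for the 180° case are just illustrative choices, not part of the original answer):

```python
import numpy as np

def bisector(v1, v2, eps=1e-9):
    """Return the normalized 'inward' bisector of v1 and v2 (the purple vector).

    Negating the result gives the 'outward' (green) vector.
    """
    n1 = v1 / np.linalg.norm(v1)
    n2 = v2 / np.linalg.norm(v2)
    s = n1 + n2
    length = np.linalg.norm(s)
    if length < eps:
        # v1 and v2 are (nearly) opposite, so the sum cancels to ~zero.
        # What to do here depends on the application; raising is one option.
        raise ValueError("Vectors are opposite; the bisector is undefined")
    return s / length

# Example usage:
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
purple = bisector(v1, v2)   # inward bisector (unit length)
green = -purple             # outward bisector
print(purple, green)
```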