A spacesuit should be aware of its location and the relative locations of other spacesuits, vehicles, and anything else with which it maintains radio contact. The audio system within the helmet should take this into account. If you're in a suit on an EVA and another astronaut who is above, behind, and to the left of you speaks, it should sound like they're above, behind, and to the left.
This would be particularly useful for moon and Mars colonies, where lengthy EVAs could put significant distance between astronauts and terrain could block line of sight. The cues wouldn't even have to be precise. If a statement like "Hey, look at this!" doesn't need to be followed by a location report, and doesn't require the other astronaut to scan around to find their companion before responding, then I think suited interaction becomes more efficient, more natural, and less distracting.
One might even want the system to adjust a speaker's volume (to a point) according to their distance.
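The core of the idea above reduces to a small bit of geometry: take the two suits' positions, rotate the direction to the speaker into the listener's head frame, and derive an azimuth, an elevation, and a distance-clamped gain for the audio renderer. Here's a minimal sketch of that computation in Python; all of the names, the yaw-only orientation model, and the `ref_dist`/`min_gain` parameters are my own illustrative assumptions, not any real suit-audio API.

```python
import math

def spatialize(listener_pos, listener_yaw_deg, speaker_pos,
               ref_dist=10.0, min_gain=0.2):
    """Return (azimuth_deg, elevation_deg, gain) for rendering a
    companion's voice in the listener's helmet audio.

    Hypothetical sketch: positions are (x, y, z) in meters in a shared
    frame; listener_yaw_deg is heading (0 = +x axis, counterclockwise
    positive); azimuth is relative to the listener's facing direction.
    """
    dx = speaker_pos[0] - listener_pos[0]
    dy = speaker_pos[1] - listener_pos[1]
    dz = speaker_pos[2] - listener_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)

    # Bearing to the speaker in the world frame, then rotated into the
    # listener's frame and wrapped to (-180, 180].
    world_az = math.degrees(math.atan2(dy, dx))
    azimuth = (world_az - listener_yaw_deg + 180.0) % 360.0 - 180.0

    # Elevation from the listener's horizontal plane.
    horiz = math.hypot(dx, dy)
    elevation = math.degrees(math.atan2(dz, horiz)) if dist > 0 else 0.0

    # Inverse-distance attenuation, clamped at min_gain so a distant
    # companion quiets down but never fades out entirely ("to a point").
    gain = 1.0 if dist <= ref_dist else max(min_gain, ref_dist / dist)
    return azimuth, elevation, gain
```

For example, a companion 5 m to the left and 5 m up comes back as azimuth 90°, elevation 45°, full gain; one 20 m straight ahead keeps azimuth 0° but drops to half volume. A real system would feed these values into an HRTF-based binaural renderer rather than simple panning, but the inputs would look much like this.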
Just a thought.