Both my mother and my wife work in the health field, and both constantly talk about the hostility in their workplaces, to the point that they are ready to leave. Some of their colleagues already have, for the same reasons. Yet the health industry remains a field that draws many women.
I'm not trying to justify the behavior of people in IT. I'm all for IT becoming a paragon for the treatment of women and setting the standard for the rest of the working world. However, I'm not sure IT is unique in this regard, so hostility alone doesn't really explain why women are drawn away from it and into other, equally hostile, professions.