The specific details of the hiring process are exactly what's in question. You are running away from grappling with the (increasingly likely) possibility that bias wasn't the (only) driver of the hiring disparity.
The point I'm trying to make is that the details of any specific hiring process aren't really germane to the overall discussion. Hiring disparity is a metric that's measured at a much higher level than any individual organization.
> Hiring disparity is a metric that's measured at a much higher level than any individual organization.
And how do we know if the hiring disparity is due to bias? The details of the hiring process are absolutely relevant, because the notion that the hiring disparity is due to bias is a claim about the details of the hiring process.
My main issue with a lot of DEI programs is that they don't try to eliminate bias. They just assume disparities are due to bias and work towards "fixing" the disparity with explicit discrimination. The problem is that when you actually try to measure and quantify bias in tech, the results often aren't what DEI advocates assume. See, e.g., https://www.pnas.org/doi/10.1073/pnas.1418878112
This is why there's such strong pushback against anonymizing interviews and other bias mitigation measures. What happens if your interviews and applications are all anonymized and the hiring disparity remains? The justification for "fixing" the representation is now a lot weaker, since it's harder to claim it's due to bias. "Let's address bias in our hiring process" is a lot more popular than "let's set quotas". So the people who want quotas claim that by setting quotas they're just fixing bias.
You don't eliminate biases by focusing on the gender ratio of the orchestra. You eliminate biases by putting a veil between the auditioner and the evaluators. We all know this, but some people feel compelled to pretend they're working towards eliminating biases when in reality they're working towards achieving certain demographic outcomes.