It's also almost exclusively used by the kind of third-rate mass-recruiting consulting companies that have given Indian developers a bad name. Which is not a surprise, since colleges (including mine) have picked up on this, and now specifically train students for such exams. Even for technical roles (supposedly, although I think spamming StackOverflow hardly qualifies), the online test rarely contains a computer science / computer programming component, and new recruits are commonly expected to undergo year-long training at their own expense even after being hired.
My dislike for Aspiring Minds is hard to put into words. It's strong enough that I would refuse to work for a company that uses them, out of fear of the unchecked incompetence of my future coworkers.
Even if the figure is true, this is not the right starting point for discussion. Also, we can do better with direct links to studies, rather than bad journalism digests.
Doesn't matter if it's damaging to the business in the long run, it makes their quarterly results look great, and that's all Wall Street cares about. Headcount is down, profit is up.
Who cares if their business logic will be a hot mess in 2 years because everyone competent got fired? Everyone high up will have already collected their performance bonus.
Far too many companies are afraid to make long-term decisions that are better for the company but worse in the short term. This was cited as one of the primary reasons for Dell to go private again:
"My partners at Silver Lake Management and I successfully took Dell private a year ago in the largest corporate privatization in history. I’d say we got it right. Privatization has unleashed the passion of our team members who have the freedom to focus first on innovating for customers in a way that was not always possible when striving to meet the quarterly demands of Wall Street." 
I refuse to believe this article, though. It seems very much a clickbait title/article to get visitors to their website. Any study can prove pretty much anything.
This is a case of molding a study around trying to substantiate a predetermined (and financially motivated) goal.
Unfit to do what specific "software development" job? There are Indians working in this very vast field. Presenting them with a test they fail does not change that fact.
If your study results in a claim that contradicts reality, you are at best doing it horribly wrong and have at worst terrible ethics.
36,000 engineering students from IT related branches took [the test]
over 2/3 could not even write code that compiles
1. What stage in their IT education were they in?
2. Was the test given in a language the students were meant to be familiar with?
I've been programming professionally for over 15 years, yet I wouldn't even be able to write "Hello world" in Haskell without having to look up a tutorial - because I've never used Haskell.
Perhaps your experience was such because most of these "engineers" remain unemployed, take up another profession, or join the Indian sweatshops (i.e., TCS, Wipro, HCL, Infosys, Tech Mahindra, Accenture, etc.).
I have the impression that there is huge social pressure in India to become a developer, which makes sense because it is a good profession.
But many of these aspiring developers lack the aptitude to become programmers; hence there may be a stereotype of poor programmers from India.
One of many examples competing (successfully) with, for example, California companies (who pay their devs around 2-4x as much): http://enpass.io
Disclaimer: I'm not Indian but European, and not an Enpass owner/user/developer.
In any case, this isn't an India-specific problem; the fact that most candidates to a programming job can't program has been reported in various countries, which is why Fizzbuzz is still a thing.
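For context, FizzBuzz is deliberately trivial: it screens only for the ability to write any working code at all. A minimal Python sketch (the function name and structure here are just one common way to write it):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings:
    multiples of 3 -> "Fizz", of 5 -> "Buzz", of both -> "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print("\n".join(fizzbuzz(15)))
```

That a problem this small still filters out a large share of applicants is exactly the phenomenon the parent comment describes.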
For me, the more problematic evidence comes from the interaction colleagues and I have with actual Indian engineers. Westerners are far from perfect, but I've had a lower success rate with Indian outsourcers. As I understand it, Indian cost of living is at most 1/2 that of western countries, so I would expect a 1/2 price differential for the same quality. But that doesn't seem to happen. I'd guess it's due to upper management failing to see the difference between the Indian quality at 1/2 price and Indian quality at 1/10 price, and choosing the latter, making the former scarce.
1. Have they reached 100% of students, to claim 95%? No, they have not — there are 500+ colleges in a single state.
2. Which engineering year were the tested students in? Most colleges share the same syllabus for the first year.
3. The company itself mentioned that, using Automata, the machine learning score was able to predict only 22.6% of good candidates.
"Automata is world's most advanced and only programming assessment that uses machine learning for grading programs."
That pretty much sums up anything you might want to know about the nature of the test.
I'm not sure where race/national origin factors into this if all the participants are Indian. Perhaps you could point that out? I am vaguely aware of India's caste system but I don't see any mention of it in the original article.