Neither HackerRank nor Leetcode invented this style of interview. The questions you're complaining about predate both sites. The sites were created to help you study for those questions; they did not significantly popularize them.
Except, there's a difference when you're in a room with somebody who can tell when the shitty site or the vague nature of a question makes it difficult to complete the request. They can also better gauge someone's understanding of the subject matter just by talking through the question, approach, etc.
The availability of these sites makes this seem like a standard, which adds to the problem. Some small company might not have taken that approach otherwise. Now, you don't want to fall behind by not making use of a widely accepted "standard" practice.
I agree with many parts of your comment. I have been in interviews where the interviewers themselves clearly had no fundamental understanding of the problem or solution. To be clear, I didn't either, but asking clarifying questions got me nowhere.
> Now, you don't want to fall behind by not making use of a widely-accepted "standard" practice.
Absolutely. I can understand Google asking you to write a k-way distributed sort off the top of your head, but I've had similar questions from an "enterprise gone unicorn" that was clearly still in the "wtf even is cloud" phase.
Even so, in my experience there is usually someone available for clarification. Obviously, this varies across companies and in how helpful it actually is. Your point is valid, though.
The other part is that it puts the onus on the third-party site(s), which exacerbates the problem; whereas if the interviewers themselves don't understand the question, they're more likely to focus on correcting one of those issues going forward.