Now, sometimes you don’t need this level of effort, but don’t underestimate just how much CPU you might be leaving on the floor. Sometimes it means that what seems like an impractically slow algorithm actually runs fine. Even if you ran an infinite-precision version of the ML code on a modern desktop CPU, you might struggle to get the performance needed to run the system in real time (we needed to run the simulation within 5-10 ms and then do a bunch of even more complicated math to compute the blue dot location every 250 ms).
Moreover, many problems related to money must compute with limited accuracy. For example, all intermediate results must be in 1/1000ths of a euro. You can use floats (yeah, I know, everybody says it's a no-go) or fixed point, but in the end, you absolutely can't avoid thinking about rounding.
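For illustration, here's a minimal sketch in C of the fixed-point approach (the amounts are made up): keep money as integer thousandths of a euro, and make the rounding rule explicit at every division, because that rule is a business decision, not an implementation detail.

    #include <stdio.h>
    #include <stdint.h>

    /* Amounts as integer thousandths of a euro (the 1/1000th resolution above). */
    typedef int64_t milli_eur;

    /* Division with explicit round-half-up; positive amounts only, for brevity.
       Which rounding rule to pick is exactly the thing you can't avoid deciding. */
    static milli_eur div_round_half_up(milli_eur amount, int64_t divisor) {
        return (amount + divisor / 2) / divisor;
    }

    int main(void) {
        milli_eur price = 10000;                               /* 10.000 EUR */
        milli_eur vat = div_round_half_up(price * 19, 100);    /* 19% VAT */
        printf("VAT: %lld.%03lld EUR\n",
               (long long)(vat / 1000), (long long)(vat % 1000));
        return 0;
    }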
OTOH that starts getting quite “application oriented”, and where do you draw the line? Compsci is perhaps more about computation and how to achieve it, analyze it, etc., and in that case even dates and times shouldn't be on the list, being an “application” domain concern.
> where do you draw the line?
My language is written right-to-left. Jira couldn't care less about my market, so I don't use it, even though I could trivially add support myself in Firefox's user CSS file. But there are another 300 million people who speak RTL languages whom they are ignoring along with me. Is 300 million people a small market? "But everybody in tech speaks English," they argue.
Being _able_ to use their product in a language foreign to me doesn't mean that I'll _choose_ to use their product given alternatives.
Of course your language has just as much right to be dealt with correctly in computers as every other, but that's probably a separate conversation about the right to equal access to technology whatever the economics (a cause I agree with), etc., etc.
> I meant where do you draw the line as to what is "computer science", and what is domain and application-specific problem-solving?
I actually think that the line is clear. Solving a practical problem or bug? That's not computer science. Developing or improving a generally useful algorithm or technique, such as a sort function? That _is_ computer science.
Science is improving our understanding of how things work or can be made to work. Actually putting that knowledge to application is Engineering (or tinkering).
Text Rendering Hates You - https://news.ycombinator.com/item?id=21105625 - Sept 2019 (170 comments)
Most barely do L10n; some make it to proper I18n.
Proper G11n... maybe you could count them on the fingers of one hand.
And then, you can formally prove that code does not have certain kinds of bugs. You cannot formally prove that code has no bugs, because 1) you don't even know all possible kinds of bugs, and 2) even for the kinds you do know, you don't have formal proofs for all of them.
* Strings (think Unicode, collations, string sizes, etc)
* Numbers (think currencies, precision, explaining floats to people, etc)
* Dates (as mentioned in the post)
And it's funny that we start teaching programming with these concepts.
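On the numbers bullet above, the classic first lesson is that binary floats can't represent most decimal fractions exactly, which a tiny C program demonstrates:

    #include <stdio.h>

    int main(void) {
        double a = 0.1 + 0.2;
        /* Neither 0.1 nor 0.2 has an exact binary representation,
           so the sum carries a tiny error. */
        printf("%.17f\n", a);      /* prints 0.30000000000000004 */
        printf("%d\n", a == 0.3);  /* prints 0 */
        return 0;
    }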
I think the same kind of thinking applies for all the simple things in each domain.
I'm going to be the "actually" guy and say that, actually, you can formally verify some stuff about programs written in traditional/mainstream languages, like C. Matter of fact, this is a pretty lively research area, with some tools like CBMC and Infer also getting significant adoption in the industry.
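To make that concrete, here's a minimal sketch of the kind of check CBMC can do, using its convention that an undefined function named nondet_* stands for "any possible value":

    #include <assert.h>

    /* Undefined on purpose: under CBMC this means "an arbitrary unsigned char". */
    unsigned char nondet_uchar(void);

    /* Saturating add: must never wrap around. */
    unsigned char sat_add(unsigned char a, unsigned char b) {
        unsigned int sum = (unsigned int)a + (unsigned int)b;
        return sum > 255u ? 255u : (unsigned char)sum;
    }

    int main(void) {
        unsigned char a = nondet_uchar();
        unsigned char b = nondet_uchar();
        unsigned char s = sat_add(a, b);
        /* "cbmc this_file.c" checks this over all 2^16 input pairs
           and produces a counterexample trace if it can fail. */
        assert(s >= a || s == 255);
        return 0;
    }

Note that this proves only the stated property, which is the earlier point: you rule out specific kinds of bugs, not all bugs.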
* knapsack (fit the most “boxes” into the fewest containers, sometimes 4D+ “boxes”)
* traveling salesman
Anyway yeah there are definitely lots of hard things.
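For the knapsack item above, the textbook 0/1 variant at least has a clean dynamic-programming sketch (pseudo-polynomial in the capacity); the multi-dimensional “4D+ boxes” and bin-packing versions are where it gets genuinely hard. A toy instance in C, with made-up weights and values:

    #include <stdio.h>

    #define CAPACITY 10

    int main(void) {
        /* Toy instance; weights and values are made up for illustration. */
        int weight[] = {3, 4, 5, 6};
        int value[]  = {4, 5, 7, 9};
        int n = 4;

        /* best[c] = max value achievable within capacity c. */
        int best[CAPACITY + 1] = {0};
        for (int i = 0; i < n; i++)
            for (int c = CAPACITY; c >= weight[i]; c--)  /* reverse: each item at most once */
                if (best[c - weight[i]] + value[i] > best[c])
                    best[c] = best[c - weight[i]] + value[i];

        printf("best value for capacity %d: %d\n", CAPACITY, best[CAPACITY]);  /* 14 */
        return 0;
    }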