Additionally, I would definitely recommend learning math in English, since there is a lot of overlap with the jargon of computer science that can be really, really helpful. Even if it is just the little nudge that allows you to connect problems.
Furthermore, you can unlearn a tool if you don't use it. I learned frequency decomposition in school and forgot about it two weeks later. Only when I started to implement my own shitty JPEG compression did I really begin to use it as a tool. Had to relearn it, of course. Turns out there are many applications for it in general image recognition. Great. Now the math got useful and is indeed needed.
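To make "frequency decomposition" concrete: JPEG-style compression rests on the discrete cosine transform (DCT), which rewrites a block of samples as weights of cosine frequencies, so that smooth image data piles its energy into the first few coefficients. A minimal sketch, assuming a naive 1D DCT-II (the function name and sample values here are my own illustration, not from any real codec):

```python
import math

def dct_1d(block):
    """Naive DCT-II: express N samples as weights of N cosine frequencies."""
    N = len(block)
    out = []
    for k in range(N):
        s = sum(x * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n, x in enumerate(block))
        # Orthonormal scaling, as used in JPEG-style transforms.
        scale = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(scale * s)
    return out

# A smooth, symmetric row of pixel values: almost all of the energy lands
# in the low-frequency coefficients, which is exactly what makes
# quantizing (and thus compressing) the high frequencies cheap.
samples = [10, 11, 12, 13, 13, 12, 11, 10]
coeffs = dct_1d(samples)
```

Running this, the DC coefficient `coeffs[0]` dominates and the odd coefficients vanish entirely (the input is a palindrome), which is the energy-compaction property the compression exploits.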
There are some "purely" mathematical tools, like adding zero, multiplying by 1, or logic, that allow for further transformations. But I would argue that they don't help to solve a problem so much as they help to verify the solution.
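Two classic instances of those tricks, sketched in standard notation (my own examples, not from the original): completing the square by adding and subtracting the same term, and rationalizing a denominator by multiplying by a cleverly chosen 1:

```latex
x^2 + 6x \;=\; x^2 + 6x + 9 - 9 \;=\; (x+3)^2 - 9
\qquad
\frac{1}{\sqrt{2}} \;=\; \frac{1}{\sqrt{2}} \cdot \frac{\sqrt{2}}{\sqrt{2}} \;=\; \frac{\sqrt{2}}{2}
```

In both cases nothing about the value changes; the rewriting only makes a property visible, which is why these moves feel more like verification than discovery.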
But some thousands of years ago, someone must have decided to make mathematical notation specifically unreadable when expressed in ASCII symbols. Really don't like that guy.
But then again, without him we would probably miss a lot of what makes programming elegant, like writing functions as f(x).
In an alternate timeline, we might all be using Arabic mathematical notation.
And they are called papers.
However, in this article I am not advocating that mathematical thinking means using one-letter notation. I simply follow that convention for standard things like functions.