Just because it doesn't have a 100% success rate doesn't mean it shouldn't be considered; otherwise we could use the same logic to dismiss basically any rule, law, or regulation ever made.
Echo chambers are never a good thing, echo chambers built around hateful extremist ideologies are a lot worse. By forcing these people to get out of them some good can be achieved. Of course some of them will manage to regroup elsewhere but statistically a significant proportion will return to a more healthy lifestyle.
But Reddit is not the entire internet, and that ban perhaps directly fed into 8chan's rise. People do not disappear because a forum went offline, and tools to communicate are only getting better, faster and more resilient.
It seems highly dubious that all that attention completely disappeared instead of following other channels, which may be even more extreme. I can't find any study of this.
These individuals don't disappear, and it really doesn't take long for them to route to a new site. Fixing the underlying problem is a better idea than covering it with a band-aid.
So to rephrase the original statement, deplatforming rarely solves anything.
Surely you'd have an issue if the government showed up at a business's office with guns and forced the owner (at gunpoint!) to allow a Nazi hate site to continue to exist on their platform, eh? Because that is what you are arguing for.
Not the point, but even so, what people are arguing against is mobs of internet users pressuring companies into political decisions.
Literally (as in figuratively), the people asking for the site to be banned are the ones holding the owner at gunpoint, demanding the termination of a business relationship.
The PR benefits you're claiming only exist because they placate the very same people who raise the outrage and call for corporate action in the first place.
If we were to go the other direction, we can see how having a greater platform would be worse. E.g. if there were a blatantly neo-Nazi cable TV channel in everyone's home, we would expect many more people to end up watching neo-Nazi content and some of them to become radicalized. Propaganda requires a platform to be effective. It is thus not exactly surprising that you can make propaganda less effective by eliminating the reach of the platform.
If racists like those on 8chan can be pushed into the deepest corners of the dark web, where you have to use Tor to get to them or whatever, that's a win. Lots of people aren't going to bother, and thus will never run across them, and never have the opportunity to be radicalized by their propaganda/content.
Technology is not standing still. Tor isn't necessary. We're seeing the rise of distributed, federated, encrypted, and anonymous networks that take little more than an app install or website link. They are only getting more hardened against these mitigation techniques and the approach of "just shut it down" will soon become an infeasible solution.
My concern with big companies deplatforming political extremists that weren't in the public eye is that it almost validates their "they don't want us saying XYZ because it's true" points. Not that their political shit has any basis in reality, but when impressionable people see that they actually are being squeezed out of the internet, it leads many of them to conclude that their other points are valid.
Right now, loads of extremists are taking to Discord and other private chats to discuss their points and recruit people. Inside those tight-knit private groups, there's no possibility of a random passerby stopping in and offering a dissenting view. They see their discussions as the absolute reality of the world. Pushing them deeper into those groups feels far more dangerous to me.
Censorship is the suppression of speech, public communication, or other information, on the basis that such material is considered objectionable, harmful, sensitive, or "inconvenient". Censorship can be conducted by a government, private institutions, and corporations.
In this case, as far as I can tell, the server provider (not CF) was repulsed by the site content. I support that choice and the company's ability to make it. Still, it is censorship.
There is bad censorship and good censorship.
All of that said, CF can choose not to do business with anyone, especially an entity that is causing legal grief for them and/or abusing the AUP (see section 2.7). Rather than censorship, I would call it PITA avoidance. I would not want to be the CDN for any of the chan sites.
 - https://www.cloudflare.com/terms/
More specifically, what is a crime prevention service?