Historically there have been issues with inefficient code on pages with many elements, or with data sizes larger than expected. Web pages are complex and ads keep evolving, so there's a constant arms race: blockers run ever more code trying to keep up, that code takes time and memory to run, and the rapid update cycle means changes get pushed out quickly with little performance testing.
A lot of the examples I've seen were unexpected situations: e.g. if you scan every <img> src with a poorly-tuned regex and someone uses data: URLs, you might suddenly be backtracking pathologically on strings orders of magnitude longer than anything the rule was tested against. There also used to be an issue where each injected CSS file was stored separately per document. It was fixed years ago, but for a multi-year period people would complain that Firefox was slow, you'd ask whether they had AdBlock Plus installed, and the performance problem cleared up as soon as they disabled it. The culprit was an extremely large style sheet multiplied by every open tab and iframe. It was bad enough that Mozilla officially called it out on their blog:
The other thing to remember is that a browser developer has to support every user, not just the savvy ones. You might know to stick to certain better-trusted extensions, but the Safari developer's threat model has to include someone's grandfather installing McEaglePatriotGuardElite and blaming Apple when his iPad is slow or the injected code turns out to be exploitable. Apple, characteristically, chose to respond by reducing flexibility, which is certainly a valid decision but also one people can reasonably disagree with.
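To make the regex point concrete, here's a minimal sketch in Python. The pattern, the URLs, and the scan helper are all invented for illustration, not taken from any real filter list; the point is that nested quantifiers make a non-matching scan exponentially slower as the input grows:

```python
import re
import time

# Invented, deliberately bad filter pattern: the nested quantifiers in
# (\w+-?)+ are ambiguous, so on a long run of word characters with no
# "banner" after it, the engine tries exponentially many ways to split
# the run before giving up.
bad_filter = re.compile(r"(\w+-?)+banner")

def scan(src):
    """Time one scan of an img src, the way a naive blocker might."""
    start = time.perf_counter()
    hit = bad_filter.search(src)
    return hit, time.perf_counter() - start

# An ordinary URL fails fast:
_, fast = scan("https://example.com/images/photo.jpg")

# A data: URL is one long run of base64-ish characters; each extra
# character roughly doubles the work on a non-match, so even this short
# payload is dramatically slower, and a real 100 KB data: URL would
# effectively never finish.
_, slow = scan("data:image/png;base64," + "A" * 18)

print(f"normal URL: {fast:.6f}s, data: URL: {slow:.6f}s")
```

This failure mode is one reason some filter engines restrict themselves to regex subsets or linear-time matchers rather than full backtracking regexes.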