What makes you think it's the developers in this case being selfish and not you? The numbers might be on either side, or on nobody's side depending on how the problem gets framed.
I may not be thinking about this correctly, but some assumption has to be made:
(1) Notifications on by default.
(2) Notifications off by default.
Either (1) or (2) can of course be presented to the user as the default, and the user can be made aware that the default can be modified. That seems optimal, until you remember that the defaults developers choose for notifications are not the only defaults they choose. Many, indeed most, settings are given default states.
So then it's a question of which settings are explicitly presented to the user as having defaults and being modifiable. Are they all important enough? Of course not. Are notifications? Maybe.
But if notifications are important enough, then it's very likely that other settings will be important enough as well. And this means that upon installing a program, the user is in for a real choice-making treat.
Most users are simply not capable of reasoning about such choices. And if they are capable, then they'll likely either know how to find and change the setting or know how to find out how to find and change the setting.
In all other cases, it only makes sense to make assumptions about what users want when they probably don't know what they want, or can't really reason about the situation due to a lack of language tools, context, background, or whatever.
The most irresponsible and selfish thing for developers to do to such users is say, "Here are a bunch of choices. Figure it out yourself." Most can't figure it out themselves, would make the wrong choices with regard to themselves, and would be less happy for it. Developers can sometimes make horrible and damaging assumptions about users, yes, but I don't think this is one of those times.
In the case of notifications, most people I know want to be notified 9 times out of 10 about Facebook mentions, Google+ circle adds, SMS, etc. Mostly these people love the interactions; it's why they have smartphones in the first place. Call it vanity, vacuous egotism, or what have you. I am very, very suspicious of it myself.
(There is an excellent philosophical discussion to be had about the responsibility one has for one's own attention and the attention one demands of others -- a discussion that would probably end in a denunciation of all things notification-like -- but I'm setting it aside here for a pragmatic one instead.)
But the fact remains that developers are probably making the right choice and assumptions about the default-is-on state of notifications for most users. Now, I understand you do not like the notifications. In that, you are not alone. My guess is that the set of people who despise notifications is probably not too divergent from the set of people who know how to turn them off. Granted, I've got no data on that. Maybe somebody else does (it would be very interesting data to reason about!).
So all this is to ask about the question of responsibility on the part of the developers. And at the bottom of my reasoning here is the idea that (1) developers must make assumptions about defaults, because (2) users can't in most cases be trusted to make those assumptions themselves, and (3) if assumptions must be made they better be made with the majority in mind.
iOS has had a prompt of the form "This application would like to use notifications" the first time you start an app, for ages. Maybe it's one of the things I'm very visibly noticing in switching to Android.
It's not a complicated setting.
"Do you want to be notified when there's an update to be had?", asked the first time the app runs. That's when attention to the app is at its highest: we've just picked it, installed it, and want to try it.
It could be a system-wide setting that filters down into all mobile apps, or it could simply start unchecked until you decide otherwise.
If notifications are on by default, it's often out of a selfish design motive: higher uptake and stickiness for the app. We all know that most apps are overwhelmingly rarely used after installation, despite notifications defaulting to on.
I was also, specifically, speaking about apps with notifications on mobile devices. If you wish to extend that to software in general, that's fine, fair, and your decision, but it's not what I'm speaking to.
It's entirely pragmatic to have a say in how your focus, productivity, and attention are interrupted. It's not philosophical. I like getting stuff done, and separating the signal from the noise of what to pay attention to is integral to that.
Most users already have enough of an issue handling the information overload they're facing. Instead of being an empowering tool, technology becomes quite the opposite when users aren't proactively given the chance to make one basic decision: "Should I bother you?"