The spec says "A new specification is under development with the aim of addressing the use cases that mutation events solves, but in more performant manner." so hopefully it won't be too hard to port things like this over... I hope so anyway, because the mutation events are mighty handy!
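For what it's worth, the replacement work the spec alludes to (the "MutationReplacement" effort linked elsewhere in the thread, which became the MutationObserver API) makes porting a wait-for-element helper fairly straightforward. A minimal sketch of my own; `waitFor` is a made-up name, not part of any library mentioned here:

    // Minimal sketch: wait for an element using MutationObserver instead of
    // the deprecated mutation events. `waitFor` is a made-up name.
    function waitFor(selector, callback) {
      var existing = document.querySelector(selector);
      if (existing) { callback(existing); return; }

      var observer = new MutationObserver(function () {
        var el = document.querySelector(selector);
        if (el) {
          observer.disconnect();          // stop observing once we have a match
          callback(el);
        }
      });

      // Watch the whole document for added/removed children, including nested ones.
      observer.observe(document.documentElement, { childList: true, subtree: true });
    }

    // Usage:
    waitFor('#comments', function (el) {
      el.className += ' ready';
    });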
I'm all for bringing back vote counts, I do sorely miss them... but I have to ask, in this particular case, how would mankind be served if we knew that 127 more people agree with cstuder? Maybe I'm missing something, this is an honest question.
The problem is that without a vote count, how do you know whether 2 people voted for it (and thus it really isn't a big deal) or 127 (and thus the OP should really fix his stupid site)? The only alternative is "me too" posts, which are hideous (and which is basically what my reply was).
In short, as-is, the OP really has no way of knowing just how aggravating his site is.
Thanks for explaining your comment, that makes sense. I do think, however, that your parent's comment makes a point on its own merit, not sure it needs to be "me too'd" 127 times to be heard. It also gets a prominent position on the page even without a vote count.
I'll concede that 127 visible upvotes are much preferable to 127 "me too" replies, though!
1. Those who aren't familiar with the topic. With a single-digit score, there's potentially not much value there. With a three-digit score, hey, maybe people shouldn't use large images in the background. Then this person can either follow the advice, or look up (or ask) why this is. It marks the significance of an otherwise unfamiliar topic.
2. Similarly, if the author of the site sees this, they can see just how many people agree with it. This is important because, again, a high score means people agree with it. This person may still not take action, but should at least consider it if the score is high enough. Maybe even chime in and respond with a rationale.
3. The rest of the community. This will either confirm your own views or raise a red flag if the score is high enough. Did you miss something? Are enough people just plain wrong according to what you believe to be true? Should you post a counter-point or add/ask for clarification?
These are all good reasons to bring back vote counts (which, again, I'm in favor of) but only your point (2) is a reply to my question about this particular case. (1) and (3) would apply to any comment here on HN.
I guess what I didn't take into account in my previous comment is that the guy responsible for the animated GIF might drop by HN and might take the GGP's comment more seriously if he sees a large number of votes attached to it. This makes sense and is another good reason to get vote counts back here, especially if it discourages others from posting me-too comments.
If you see a comment or article you approve of, you should upvote it, regardless of how popular (or not) it is among other people. If there are 529 people who approve of a certain comment, then it comes by those 529 upvotes honestly.
This is the reason some beggars on the street make more money than a full-time job pays (if they're in a really good spot).
Everyone thinks they're just giving them a little bit. But all those little bits add up to a lot. Without any visibility into how much other people have already given, there is no way for people to decide whether they really need the money or not.
If you saw a beggar on the street, and knew that they'd already been given $1000 so far that day, would you still give them a dollar?
I don’t see upvoting as giving something to a beggar; I see it as an exchange of value. You give me something I like (something interesting to read) and I give you something you like (karma).
I am the OP; I just removed the background... but it's going to be back! (as soon as I figure out a way to make it consume less CPU... probably using canvas)
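In case it helps, a rough sketch of the canvas approach, assuming a requestAnimationFrame loop (the effect drawn here is just a stand-in for the real background):

    // Rough sketch: one full-page canvas animated with requestAnimationFrame,
    // instead of a large animated GIF. Run this after <body> exists.
    var canvas = document.createElement('canvas');
    canvas.width = window.innerWidth;
    canvas.height = window.innerHeight;
    canvas.style.cssText = 'position:fixed;top:0;left:0;z-index:-1';
    document.body.appendChild(canvas);

    var ctx = canvas.getContext('2d');
    var frame = 0;

    function draw() {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      // Stand-in effect: a slowly pulsing tint; replace with the real animation.
      ctx.fillStyle = 'rgba(30, 30, 30, ' + (0.05 + 0.05 * Math.sin(frame / 60)) + ')';
      ctx.fillRect(0, 0, canvas.width, canvas.height);
      frame++;
      requestAnimationFrame(draw);        // throttled by the browser in hidden tabs
    }
    requestAnimationFrame(draw);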
Why are you comparing this to `jQuery(document).ready()`? Apples and oranges.
You say it's faster, but fail to provide any numbers / a jsPerf test case. I'm sure it's faster to execute `waituntilexists()` initially, but if you take into account that it uses a 5 ms interval in which it traverses the DOM for the same elements over and over again until they're finally found, it seems like it probably has a negative impact on overall performance.
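To make that concrete, here is roughly what that kind of polling looks like (a simplification of my own, not the plugin's actual source):

    // Simplified polling sketch, not the plugin's real code: every 5 ms the whole
    // document is re-queried for the selector until something matches.
    function pollForElement(selector, callback) {
      var timer = setInterval(function () {
        var matches = jQuery(selector);   // full DOM traversal on every tick
        if (matches.length) {
          clearInterval(timer);           // stop polling once a match appears
          callback(matches);
        }
      }, 5);
    }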
Actually, the script doesn't make any sense, because it would be executed as soon as the browser is idle, that is, after it has built the DOM completely, which is when the DOMContentLoaded event fires.
The right way to run a script right after an element has been created is to place the script right after the element's closing tag.
You can still use it for setting late-binding event handlers, though.
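For reference, that placement looks like this (using a made-up `#comments` element):

    <!-- The inline script runs as soon as the parser has finished building
         #comments, well before the rest of the page has loaded. -->
    <ul id="comments">
      <li>First comment</li>
    </ul>
    <script>
      var comments = document.getElementById('comments');
      comments.className += ' enhanced';  // the element already exists here
    </script>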
I like the idea, and it does seem that it could reduce the time your JavaScript code has to wait before it can run. However, in reality most real front-end JavaScript code requires not just one element but multiple elements, including nested ones. I would be hesitant to start running JavaScript just because one element was ready unless I was sure that the other elements my code referenced or manipulated were also ready. I'm sure it would be possible to make a version that allowed you to specify a list of elements to wait for before running code (see the sketch below), but that creates a maintenance nightmare: keeping track of the entire list of elements that your JavaScript requires. For me the jQuery(document).ready() method seems easier and more reliable.
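Here is what that multi-element variant could look like (a sketch with made-up names, built on the same polling idea rather than on the plugin's actual API):

    // Hypothetical helper: run the callback only once every selector matches.
    function waitForAll(selectors, callback) {
      var timer = setInterval(function () {
        var allFound = selectors.every(function (sel) {
          return jQuery(sel).length > 0;  // re-check each selector on every tick
        });
        if (allFound) {
          clearInterval(timer);
          callback();
        }
      }, 50);
    }

    // Usage: don't touch either piece until both exist.
    waitForAll(['#sidebar', '#sidebar .vote-count'], function () {
      // safe to wire the two elements together here
    });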
* I could test it myself, but I expected the article to give detailed information on the different possible states the document can be in when my callback is called. These are the questions I know the answers to when waiting for the whole document to load: Is it half loaded? What is the DOM state? Is it completely loaded? If not, has the document's JavaScript code been run entirely? And so on. This kind of unanswered question often bites hard later.
* Why use "itself"? I would have used a sensible default: when no context is given, use the awaited DOM object instead of an arbitrary value (document), and use something else only when it is explicitly given. The "itself" thing should be left as an internal flag, IMHO.
1) The DOM state is that everything inside the HTML element you were waiting for already exists when the function is executed.
2) When I am writing JavaScript it is very common for me to use the global object (window) as the context, so in my case it is not a good idea to use the HTML element as the context by default.
Does someone have a really good example of where this would be useful? The description is basically, "When this element exists, do something" but what is the something you'd do? If it's styling, that should be done in CSS. If it's event handlers, that can be done with `.live()` or `.delegate()`.
I'd also like to see some benchmarks to back up the assertion that this is faster. The DOM modification events are generally accepted as slow and it seems like this is leaving a handler attached for the duration of the page's existence.
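On the event-handler point above: with delegation the element doesn't need to exist yet at all, e.g. (the `.vote-button` class is just an example of mine):

    // Event delegation with jQuery's .delegate() (1.4.2+): the handler lives on a
    // stable ancestor, so it also fires for .vote-button elements added later.
    jQuery(document).delegate('.vote-button', 'click', function (e) {
      e.preventDefault();
      // handle the click here
    });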
Some UI stuff could benefit from this: things like turning a list into an accordion. (Normally there is at least a little work to be done to ensure you don't get a flash of unstyled/unscripted content; if this fires fast enough you could avoid that.)
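Using the hypothetical `waitFor` helper sketched earlier in the thread, that could look something like this (made-up markup; the point is that the list is collapsed the moment it exists rather than at document-ready):

    // Collapse the accordion panels as soon as the list exists, so the user never
    // sees the fully expanded, unscripted version of it.
    waitFor('#faq', function (list) {
      var $list = jQuery(list);
      $list.children('li').children('ul').hide();       // start collapsed
      $list.delegate('li > h3', 'click', function () {
        jQuery(this).next('ul').slideToggle();          // toggle a panel on click
      });
    });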
http://www.w3.org/TR/DOM-Level-3-Events/#event-type-DOMNodeI...
They have been deprecated because they perform poorly:
> Firefox, for example, when it realizes that a mutation event has been turned on, instantly goes into an incredibly-slow code path where it has to fire events at every single DOM modification. This means that doing something like .innerHTML = "foo" where it wipes out 1000 elements would fire, at least 1000 + 1 events (1000 removal events, 1 addition event).
http://lists.w3.org/Archives/Public/www-dom/2009AprJun/0072....
> Mutation Events are widely acknowledged as “slow” in terms of the real performance degradation that occurs on websites that use them heavily for tracking changes to the DOM
http://www.w3.org/2008/webapps/wiki/MutationReplacement
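For comparison, this is the mutation-event style being deprecated; even a listener this small runs once for every node inserted anywhere in the document, which is exactly the cost described above:

    // Deprecated mutation-event approach: the handler fires for every inserted
    // node, so a single innerHTML assignment can trigger it hundreds of times.
    document.addEventListener('DOMNodeInserted', function (e) {
      if (e.target.nodeType === 1 && e.target.id === 'comments') {
        // the element of interest just appeared
      }
    }, false);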