Top 5 Bugs of my Career (javadots.blogspot.com)
10 points by pembleton 2811 days ago | 4 comments



At some point I decide that my new rules are complete and I put them to work on several real inputs: large zip files with > 100K classes. I notice that both the import and load commands run much slower than before. In fact, they run so slowly that they wipe out all the benefits of having my own optimized data structures.

Well, it would have been even slower without the correct data structure, right?

Remember, we are speaking about the C programming language, so expecting something as fancy as a stacktrace is out of the question.

Maybe on Windows; it's certainly not a problem with the GNU stack. And even then, I doubt that Win32 C programs are truly impossible to debug.

First, I tried to apply reasoning. I made educated guesses about which events were likely to be causing the crash. After several hours of unsuccessful guesses I moved to a more brute-force approach: I commented out the whole switch block.

I would have tried 'printf("%d\n", msgid)' first. Seems like this would have identified the faulty switch case almost immediately.
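
Something like this, where msgid and the handler names are hypothetical placeholders, not the original program's identifiers:

    #include <cstdio>

    // Hypothetical event dispatcher. Log (and flush) the message id
    // before dispatching, so the last line printed before the crash
    // names the faulty case.
    void dispatch(int msgid) {
        fprintf(stderr, "dispatching msgid=%d\n", msgid);
        fflush(stderr);

        switch (msgid) {
            case 1: /* handle_open();  */ break;
            case 2: /* handle_close(); */ break;
            default: break;
        }
    }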

I started debugging this code. When I stepped over the b = true statement the program crashed. This puzzled me. b is a local variable. It is stack allocated. How can an assignment to it fail?

Are you sure that as the stack started to unwind after this, destructors on local variables were not being called? (C++ does actually have this sort of automatic memory management, after all.)
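
A contrived sketch of that failure mode, with invented types (not the original code): the crash actually happens in a destructor during unwinding, and the debugger blames the innocent statement it was stepping near.

    #include <stdexcept>

    struct Guard {
        int* p;
        // Buggy destructor: double delete is undefined behavior and
        // will likely crash here, not at the statement the debugger
        // was stepping over.
        ~Guard() { delete p; delete p; }
    };

    void handler() {
        Guard g{new int(42)};
        bool b = false;
        b = true;  // stepping here looks fine...
        throw std::runtime_error("unwind");  // ...but unwinding runs
                                             // ~Guard(), and the crash
                                             // gets blamed nearby
    }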

If you implement a cache, you must always implement some cleanup strategy.

Every language with garbage collection has weak references. That's what these are for.
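
You can get the same effect in C++ without a GC, too. A minimal sketch (names are made up) of a cache holding std::weak_ptr, so entries are freed as soon as the last owner outside the cache drops them:

    #include <map>
    #include <memory>
    #include <string>

    // Illustrative cache: the map holds weak_ptrs, so cached objects
    // die with their last outside shared_ptr owner.
    class Cache {
        std::map<int, std::weak_ptr<std::string>> entries_;
    public:
        std::shared_ptr<std::string> get(int id) {
            auto it = entries_.find(id);
            if (it != entries_.end()) {
                if (auto alive = it->second.lock())  // still owned elsewhere?
                    return alive;
                entries_.erase(it);                  // stale entry: clean up
            }
            auto fresh = std::make_shared<std::string>(
                "value for " + std::to_string(id));
            entries_[id] = fresh;
            return fresh;
        }
    };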

Anyway, this article was kind of weird. "Here are some weird bugs that caused me lots of problems", but no word on how they were ever resolved.


> Anyway, this article was kind of weird. "Here are some weird bugs that caused me lots of problems", but no word on how they were ever resolved.

I think that's part two... from the bottom of the post:

Got it? Great. Otherwise, wait for the next post...

(To be concluded)

EDIT: Updated to include the quote.


#4: Sometimes transforming your data into an "optimized" structure requires a significant upfront cost. If the processing on that data is comparatively small, it might not be worth the effort. I'm going to guess that the import and load functions had O(n^2) algorithms in them.
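
For example (made-up data, just to illustrate the guess): building the "optimized" structure by repeated sorted insertion is O(n^2), while bulk-loading and sorting once is O(n log n).

    #include <algorithm>
    #include <vector>

    // Inserting each key at its sorted position shifts the whole tail:
    // O(n) per insert, O(n^2) total for n keys.
    void import_slow(std::vector<int>& sorted, const std::vector<int>& keys) {
        for (int k : keys) {
            auto pos = std::lower_bound(sorted.begin(), sorted.end(), k);
            sorted.insert(pos, k);  // shifts the tail on every insertion
        }
    }

    // Same result in O(n log n): bulk-load first, then sort once.
    void import_fast(std::vector<int>& sorted, const std::vector<int>& keys) {
        sorted.insert(sorted.end(), keys.begin(), keys.end());
        std::sort(sorted.begin(), sorted.end());
    }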

#2: Assuming that no objects go out of scope, memory corruption. If objects do go out of scope, I'd still suspect memory corruption, but I'd check the destructors first.
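
A contrived example of how an assignment to a stack local can fault (undefined behavior throughout, and it assumes frame pointers are in use and no stack protector): an earlier overflow smashes the saved frame pointer, so after the call returns, the caller's locals are addressed off garbage.

    #include <cstring>

    void parse(const char* input) {
        char buf[8];
        strcpy(buf, input);  // overflows for long input; the saved frame
                             // pointer above 'buf' gets overwritten
    }

    void handler() {
        bool b = false;
        parse("a string much longer than eight bytes");
        // parse() restored a garbage frame pointer on return, so 'b' may
        // now be addressed relative to junk, and this innocent store faults:
        b = true;
    }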

#1: He already answered it, really. The caching mechanism was probably maintaining references to the objects, while still allocating space for new ones.


#1: The rows are being traversed by index, but the indexes change during the execution of the loop as some rows are deleted. The loop ends up skipping any row that immediately follows a deleted row.
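
That bug in miniature, with a made-up delete condition:

    #include <cstddef>
    #include <vector>

    // Bug: erasing rows[i] shifts every later row left one slot, and the
    // subsequent ++i jumps over the row that slid into position i.
    void delete_matching_buggy(std::vector<int>& rows) {
        for (std::size_t i = 0; i < rows.size(); ++i) {
            if (rows[i] % 2 == 0)
                rows.erase(rows.begin() + i);
        }
    }

    // Fix: only advance the index when nothing was deleted.
    void delete_matching_fixed(std::vector<int>& rows) {
        for (std::size_t i = 0; i < rows.size(); ) {
            if (rows[i] % 2 == 0)
                rows.erase(rows.begin() + i);
            else
                ++i;
        }
    }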



