Ask HN: Do you spend more time coding or debugging?
84 points by WillKirkby on May 19, 2016 | 75 comments
I find myself spending most of my time debugging older code, and I'm curious as to how other people's time is split at work, between developing new code, extending existing code, and maintaining existing code. Anyone care to share?

I used to spend the vast amount of my time debugging.

Fast forward a few years: experience, good programming habits, and the gratuitous use of assertions. Now I spend a negligible amount of time debugging, and it never ceases to amaze me how frequently things just work the first time.

Edit: I guess I want to say that there is one, and only one, bottom line: the relentless, ruthless pursuit of quality. It takes time to develop the good habits and watch for the pitfalls, but once you're there you develop your software products in a quarter as much time, with one tenth the stress, and everyone on your team feels proud of themselves and each other. Then with your free time you can focus on what's really important: your business and your life.
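The "gratuitous use of assertions" habit is easy to illustrate. A minimal Python sketch, with a hypothetical function and invariants invented purely for illustration:

```python
def transfer(balance: float, amount: float) -> float:
    """Deduct amount from balance, asserting invariants up front."""
    # Preconditions: fail loudly at the call site instead of corrupting state.
    assert amount > 0, f"amount must be positive, got {amount}"
    assert balance >= amount, f"insufficient balance: {balance} < {amount}"
    new_balance = balance - amount
    assert new_balance >= 0  # postcondition: balance can never go negative
    return new_balance
```

The point is that bad inputs blow up immediately, at the line where the assumption was violated, rather than surfacing as a mystery three modules away.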

Agreed, but with the addition of some periodic, many hour, something-completely-unanticipated-happened-and-I-am-so-perplexed debugging sessions.

This but let's also define debugging. From reading some of the comments left throughout the thread it seems like people don't think that a test failing and you having to go back and fix code (or even the test) is debugging but IMO it is. There is a bug, you are going back, finding it and fixing it. It usually means it is easier and faster to find/fix, but it is still time spent "debugging".

I spend more time creating than debugging when I write the code from scratch, but I'd still say I spend at most 70% coding if it is a new project. The rest is in failing tests and finding those weird screwed-up errors that take multiple days of debugging to find. For existing code I take over, it is usually more like a max of 50-60% coding, depending on the original author's skill.

"Quality? But how does that affect our bottom line?" :)

I know you're joking... But it really does affect the bottom line! I've learned the hard way that taking twice as long to do something the right way will be more economical in the long run than doing it quickly, and cleaning up the bugs that manifest themselves over the following months.

Yes, but this is also a false dichotomy. In my experience doing something the "right" way generally adds no overhead over doing it the "quick" way. Adding a check, or a log message, or making sure you're using consistent styling, or an assertion before/after some kind of operation... these things don't add time or complication.

I strongly disagree. Thinking through and handling all of the edge cases that could break my code inevitably takes more time than getting it to work for the few examples I know of and calling it a day. For example, I just wrote a function to generate a rolling average from a timeseries dataset with one data point per day. All of my sample data has exactly one datum per day. But I just know that eventually there will be an issue that causes data gaps, and of course writing code to handle gaps when they occur takes more time than not doing so and ignoring the consequences. And that's not even mentioning the time required to write unit tests.
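As a concrete illustration of the gap-handling overhead described above, here is a rough Python sketch of a rolling average that tolerates missing days (the function name and window size are mine, not the commenter's):

```python
from datetime import date, timedelta

def rolling_average(points, window_days=7):
    """Trailing rolling mean over a window of days, tolerating gaps.

    points: list of (date, value) pairs, not necessarily one per day.
    Averages only the data that actually exists inside each window,
    instead of assuming exactly one datum per day.
    """
    points = sorted(points)
    out = []
    for i, (day, _) in enumerate(points):
        start = day - timedelta(days=window_days - 1)
        # Keep only points inside [start, day]; gaps simply shrink the sample.
        window = [v for d, v in points[: i + 1] if d >= start]
        out.append((day, sum(window) / len(window)))
    return out
```

The gap-free version is one line with a fixed-size slice; the difference between the two is exactly the extra work the comment is talking about.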

Furthermore, a lot of the extra time comes not from the initial coding exercise, but the diligence and follow-up required. I.e., cool, I've implemented a feature, but did I go through carefully and ensure that I've removed all my console.logs and commented-out experiments? Did I leave dead code anywhere? Did I make any changes that require renaming or refactoring of other parts of the codebase? I never submit a PR these days without carefully going through my own git diff and double-checking myself. I almost always catch something when I do. These things take time.

This. And I'd add that the type and complexity of the workload your application sustains is almost proportional to the amount of time spent debugging it.

I remember the first time I wrote an entire controller and everything worked exactly like it did in my head. There was a lot of whooping and dancing to be had.

Not that it happens much.

Neither; I spend more time writing or thinking about tests. Earlier in my career I used to insert debug statements, run the code, and look at the output to figure out what was going on. Now, I try to write tests that eliminate the need to debug. If I'm trying to fix a bug, the first thing I do is write tests for what should happen, if they don't already exist, and work backwards from there. If you get in the habit of this, it becomes natural, and with today's toolchains you can re-run tests with a watcher to get a real-time feedback loop, which is better than a debugger IMO. The plus side is that going forward you always know the code works, vs. just seeing it work that one time. As your application grows, something may change the value your debug statement would have printed and you'll never catch it, but your test will. /old man rant
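The write-the-test-first loop described here can be sketched in a few lines. Everything below is hypothetical (the slugify helper and its bug are invented as an example), but the shape is the point: the bug report becomes a permanent test before any fix is attempted:

```python
import re

def slugify(text: str) -> str:
    """Collapse any run of non-alphanumerics into a single dash."""
    # The fixed implementation; the original (per the imagined bug report)
    # only replaced single spaces and left punctuation behind.
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

# Step 1: capture the reported behaviour as a test, then fix code until it
# passes. The test stays forever, so a watcher re-runs it on every save.
def test_slugify_strips_punctuation_and_collapses_spaces():
    assert slugify("Hello,  World") == "hello-world"

test_slugify_strips_punctuation_and_collapses_spaces()
```

With a watcher (pytest-watch and similar tools), saving the file re-runs the test instantly, which is the real-time feedback loop the comment describes.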

I'd shift your statement slightly, "I spend my time Thinking" (eliding "about tests"), but agree entirely with the thrust of your point.

My reason for the truncation is that the thinking isn't just about what tests to write to validate the semantics you want, but what semantics you even WANT. The happy path may take an hour to figure out, but getting to a point (for a reasonably complex system) where I feel confident that I've enumerated the "perimeter" of the mental model, such that there are fewer surprises, gotchas, and odd edge cases, usually takes significantly more contemplation of the problem space than modern big-co "DELIVER FEATURES NOW NOW NOW" would often like - certainly more time than is spent actually implementing, by and large.

(You may sense some bitterness. It is largely because a respected mentor of mine made significant effort to stress to me that if I'm leaning on a debugger, or having to printf a lot, I probably don't UNDERSTAND what's going on and can fall prey to far more severe logical issues. And despite my observation that I became a far more robust engineer utilizing this strategy, it's often hard to incentivise balancing this against simply shipping, especially given the difficulty of empirically justifying "I need a day to think really hard about this problem to make sure it's not subtly wrong" against the rebuttal of "what's the ROI".)

Agreed. I used to debug large "integration tests". Now with unit testing, I'm only debugging a bit of new code atop a foundation that is regularly unit tested.

From The Mythical Man-Month by Fred Brooks:

No parts of the schedule are so thoroughly affected by sequential constraints as component debugging and system test. Furthermore, the time required depends on the number and subtlety of the errors encountered. Theoretically this number should be zero. Because of optimism, we usually expect the number of bugs to be smaller than it turns out to be. Therefore testing is usually the most mis-scheduled part of programming. For some years I have been successfully using the following rule of thumb for scheduling a software task:

  1/3 planning
  1/6 coding
  1/4 component test and early system test
  1/4 system test, all components in hand.

This differs from conventional scheduling in several important ways:
1. The fraction devoted to planning is larger than normal. Even so, it is barely enough to produce a detailed and solid specification, and not enough to include research or exploration of totally new techniques.

2. The half of the schedule devoted to debugging of completed code is much larger than normal.

3. The part that is easy to estimate, i.e., coding, is given only one-sixth of the schedule.
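Brooks's fractions do sum to a whole schedule, which is easy to verify; note that the two testing quarters together are half of it, while coding is only a sixth:

```python
from fractions import Fraction

# Brooks's rule of thumb from The Mythical Man-Month.
brooks = {
    "planning": Fraction(1, 3),
    "coding": Fraction(1, 6),
    "component test and early system test": Fraction(1, 4),
    "system test, all components in hand": Fraction(1, 4),
}

assert sum(brooks.values()) == 1  # the fractions cover the whole schedule
# Testing/debugging: 1/4 + 1/4 = 1/2 of the schedule; coding only 1/6.
```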


and from Robert L Glass, Facts and Fallacies of Software Engineering (2003, but I think a self-reference from Building Quality Software, 1992):

"The data on the percentage of time spent in error removal has varied over the years, but the usual figures are 20-20-20-40. That is, 20 percent for requirements, 20 percent for design, 20 percent for coding (intuition suggests to most programmers that here is where the time is spent, but intuition is very wrong), and 40 percent for error removal."

It seems a lot of people here are talking about debugging in the context of automated tests and manually stepping through code in a debugger.

For me debugging usually means "figuring out why we had an outage." This means looking at: 1) application logs, 2) server metrics, and 3) source code associated with the failed applications.

I recently had to ssh in and run ngrep on 8(!) servers to see how groups of messages were passed around, and then look at timestamps to correlate what happened. It was very tedious. This could have been avoided by better debug logging; we could have switched that on for 2 minutes, run the tests, and then looked at everything in Logstash.

When this happens, I end up spending a ton of time tracking down errors. On a bad week, this can be half my time.

So to me, debugging is as much thinking about how you'll have to solve errors in the future and planning for it as it is writing unit tests and tweaking code.

Definitely not debugging.

I am working on an existing distributed system with many moving pieces, which is rather prone to outages. This is fintech, so outages mean a lot of money for a lot of people. So my job involves overhauling the existing system, as I upgrade bits of the system slowly: A full rewrite at once would be madness, but I suspect nothing in the current system will remain in two years.

The biggest time sinks are stress testing any of the newer pieces that I try to bring in, followed by incident remediation. There's an incident that requires a human hand to fix it every couple of weeks or so, and I end up spending about three days each time writing better error handling code, adding observability and alerts, and if something is really recurring, writing automation to make the problem fix itself.

This is a fact of life in any distributed system that was written fast: people start out happy because it works most of the time, but as you chase the 4th and 5th nine, you need people hardening the system. This is something that is very hard to do as you build anyway: while unit tests are good, there are entire layers of behavior nobody will be able to spec properly by looking at one piece at a time, so stress testing, gamedays and such are the only ways to make sure not just that the system works to a spec, but that we can even come up with a spec that behaves the way we want in practice.

There's value in evaluating scenarios in your head, but I've also seen what happens when mathematicians use that as their only weapon in a distributed system: Months are spent making sure the system is correct, but then lots of effort is spent on scenarios that are more theoretical than practical, and other scenarios are ignored, even though they occur a lot in practice.

In this respect, it's not very different from entrepreneurship: Getting an MVP out the door and doing things by hand instead of using automation is going to beat spending a lot of time making a product without having any idea of what the market really wants.

There are two phases to programming: bugging and debugging.

Sometimes I feel like I'm not writing code at all, just debugging till it works. :)

Implementing a feature is fixing a missing functionality bug, fixing a bug is completing a feature. There is no real difference between feature dev and debugging, except in how they are perceived.


I've sunk a lot of time into trying to change this. Among other changes, I've:

- Improved crash dump collection, to spend less time reproducing bugs and be more thorough in addressing them.

- Improved code debuggability - for example, writing scripts to inject call stack information into actionscript and java via disassembly, which I can then display on assert, especially on platforms where I have unreliable or incomplete debuggers.

- Learned and used defensive coding techniques to make bugs fewer in number, shallower, and caught more quickly and with more context.

- Wrote thorough tests to catch said bugs before I even run my main application, and added more edge cases to the inputs.

- Learned more tools to catch bugs I might not even know exist - valgrind, address sanitizer, static analyzers, fuzz testers, ...

I spend much less time debugging my own code now. If I'm lucky, I'll work on projects where I don't have to debug my coworkers' code all that much either. That still leaves debugging 3rd party libraries and tools - which I may lack the source to entirely - that I suddenly have more free time to really properly investigate and get to the root cause of.

Neither. I spend most of my time procrastinating and/or day dreaming.

Same here. I often spend time wondering whether it's simply that this field isn't for me, that 40hrs a week kills my drive, or that it's a bigger issue like depression.

The only things I enjoy now are exercise and music.

Have you ever coded for a project you were excited about? Where you could feel really good about delivering a quality, useful product? If possible, try and find that at your current job. If not, a new job may be the best route. Management and bureaucracy are great at squashing good vibes here.

Also, see if you can find some satisfaction in expanding your programming skills through reading and learning. Not sure your experience level here, but I would recommend that to anyone -- it has made a huge difference to me personally.

Pretty early in my career. I've held two jobs that both sound wildly better than any other option in the near area (small startups with tons of work vs. big businesses where nothing gets done).

I intend to spend no free time on my career outside of work. There are far too many other things in life I would prefer to work on and experience, hence why I wonder if this is the right field for me.

Sorry if I sound so negative, it's just how I've felt since first starting out in my field.

On the other hand I bet your .bashrc and .emacs are amazing

> On the other hand I bet your .bashrc and .emacs are amazing

Nope, but my .vimrc is a work of art.

I think a lot of debugging activities end up laying the framework for new features, so it's unclear how to separate the two.

Working as a developer supporting and maintaining web applications over the past few years, I have spent the most time fixing bugs left behind by those who came before me, and even more time debugging to understand functionality and architecture. A significant amount of time unit testing, of course, and lastly some time adding new features or extending existing ones. Of course, like many developers out there, I do look forward to truly building code from scratch - from a whiteboard concept to an MVP at work. No luck there so far, because I usually end up serving businesses that have, for a number of reasons, chosen to remain where they were a decade ago!

Debugging tests. It's the worst, because the feature took an hour to add with its own tests, but then a day and a half of debugging how the changes broke other tests.

                Analysis Programming Debugging Overhead
  ------------  -------- ----------- --------- --------
  My Own Stuff    30%       60%         10%       0%
  Others' Code    50%       10%         30%      10%
  Enterprise      10%       10%         10%      70%

I get your point but my guess is the lack of analysis is only exacerbating the problem on the Enterprise end.

If he means reading code and documentation as the only activities of analysis, then I'd say the ratios are about right.

The overhead consists of endless meetings that never reach consensus, but arguably this could be filed under analysis.

Meetings, discussions about meetings, pre-meeting discussions regarding discussion to be had in meeting, logging all time spent in meetings, discussing meetings, discussing discussions to be had in meetings, logging time lost to "noise" to figure out why nothing gets done on time, meeting to look at time spent on "noise", planning what to do with remaining time, having a meeting to discuss what to do with remaining time, updating numerous workload management tools with tasks to do with remaining time, spend time trying to get back into work, get pulled off to do something of far less value but which is "much more urgent", log time spent on that, have a meeting to discuss all upcoming urgent and unscheduled things, have a discussion about what wont get done, goto 1.

That's where you schedule a meeting to talk about scheduling a meeting to talk about Where We Went Wrong With Analysis.

You joke but the meeting meetings are pretty common, and as awful as you would expect.

50% design, 25% coding, 20% documenting, 5% debugging

But I have the luxury to work in a result-oriented environment with people too experienced to fall for "agile". So I can spend half of my week in a cafe with pen & paper as long as the project is done by Friday night.

Preface: I work with C/C++ the majority of my time. C#, F#, and PowerShell are the bulk of my remaining professional time. Go & OCaml/F# at home.

When I started, the bulk of my time was debugging my own code. I am gifted with the ability to write vast swathes of code in a short amount of time and when I was younger, it was vast swathes of shit code.

A little later into my career years 3.5-5, I spent more time coding and less time debugging. I designed my code better, used better patterns, and generally was just an all around better engineer.

I've come into the third stage of my career now where I spend a good portion of my time debugging junior engineers' code in a complex system I work on. In particular, my focus is usually in reliability and performance. I don't tend to debug the junior engineer one-off issues but rather the subtle regressions introduced by seemingly harmless changes.

In this third stage, I still write a lot of code, but much more of it is investigative and refinement over existing ideas with occasional injection of something wholly new.

I think it will really depend on the role.

- One project is in active development, and I probably spend about 70-80% of keyboard time coding with 20% debugging.

- A separate project is in maintenance mode, and obviously most of my time on it is debugging as bugs come in. So probably opposite, 70-80% debugging there.

- Sometimes feature extension requests come in, in which case it's probably closer to 50/50 on that project.

A bunch of time planning the change and reading through the current version of the code, so that I can add things without breaking everything else (and/or understand the current behavior and the desired behavior, so I can repair the bug). About half as long actually writing new code, and then debugging iterations until the tests work again. Some time writing new tests for the feature. Some time fixing integration issues (occasionally). A fair amount of time dealing with customer escalations (usually tiny edge cases really messing stuff up at a customer's site).

So, they're all bugs, and in a sense, all coding is debugging. New-feature, regression, existing (from previous release), and escalation bugs. They're all basically the same thing: Identify the deficit, write a fix (includes what you might have meant by "debugging"), write tests to cover the changed behavior, check it in, deploy/release.

I am currently integrating a handful of open source big data systems and frameworks, the breakdown of my time is

    60% debugging
    15% stack overflow
    25% email archives
    15% commit logs
    10% navigating code, spelunking
    12% jira
     5% writing tests to confirm config/state/feature availability
     3% coding

Implementation (existing code) - 30%

  Maintaining - if this means soft feature creep, then 10%

  Maintaining - if this means bug fixes and other things, lump it in with implementation, 5%
Creating nice PPTs and control documents about stuff - 30%

  I genuinely like this

  New functionality 20%, including sitting with users for new functionality requests, seeing their workflow
My main job. Having whittled this down to 20% of my day, I need to start ramping it up again. It is nice to have 10% time, and I loved 50% time, but 80% time may get discovered (though all time is dedicated to the company).

Other stuff. Like filling in timesheets, which assume hours can accurately be attributed to discrete tasks for discrete people any and all of the time.*

* Just set goals for staff. Do staff achieve their goals? If so, why timesheet? Or just timesheet roughly, my hour-by-hour 7 day per week sheet is a pain.

I spend more time in the planning stages, followed by debugging, followed by writing code.

I work in an environment where tests are unheard of. Most of the Javascript written is erring towards the "simple" side with many edge cases as it is all written for front-end web development. Things like calculating the size of the header, adding/removing classes based on certain user actions, managing Google Maps or having logic for complex forms.

Most things are several small (<10 line) functions.

The most complex thing I've built is a pre-qualification form. Thank god for moment.js, because I never thought it would be so difficult to calculate if someone is between the ages of 40 and 82 (or will be 40 by October 15th of next year).
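That age check is doable with the standard library too. A rough Python sketch of the same date logic, with the cutoff rule interpreted as "40-82 either today or by October 15th of next year" (the function names and that reading are my assumptions, not the commenter's code):

```python
from datetime import date

def age_on(birth: date, on: date) -> int:
    """Age in whole years on a given date (birthday not yet reached -> minus 1)."""
    return on.year - birth.year - ((on.month, on.day) < (birth.month, birth.day))

def qualifies(birth: date, today: date) -> bool:
    """True if aged 40-82 today, or will be by October 15th of next year."""
    cutoff = date(today.year + 1, 10, 15)
    return 40 <= age_on(birth, today) <= 82 or 40 <= age_on(birth, cutoff) <= 82
```

The subtle part, and presumably why it felt hard, is the tuple comparison that handles "birthday not yet reached this year"; naive year subtraction is off by one for part of every year.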

Of all the answers here that provide stats, I wonder what percentage of those have actually measured the time versus just giving numbers that 'feel about right'. If there is one thing I've learned about stats it is that 96.60% of people's gut feel is usually wrong ;-)

That said, about the question asked. I can attest to the fact that in general thinking and writing good tests greatly reduces time spent debugging. Also, adding judicious and useful logging for the critical / edge cases helps a whole lot.

Well, it's a fairly even split of both. But a lot of time is spent fixing projects that started as something really simple and immediately ballooned into some sort of bloated megaproject that ended up with no cohesive plan in regards to how anything would be tested.

Bonus points if the project started as one thing and pivoted to something completely different midway through development, and about 50% of the code is completely unused.

I spend most of my time debugging or extending old code, but then again I'm personally responsible for about 600,000 lines of undocumented code without any tests (but there are a lot of daily reports which act as monitoring checks). I will say that I rarely need to debug my own code; it's usually old code that needs to be updated due to a data change or business logic change.

I'm not sure there's much that can be concluded from someone's ratio. I definitely wouldn't assume that someone who spent more time debugging was a "worse" programmer.

There are certainly practices that reduce the amount of debugging, but it's all relative. Personally, the question for me is nearer something like;

> When is the right time to let go of my current approach?

> I definitely wouldn't assume that someone who spent more time debugging was a "worse" programmer

I believe that the more time you spend debugging, the worse the code you're debugging is. Now, if you're spending most of your time debugging your own code, then likely you're a novice who hasn't learned the many ways to write quality code that "just works".

If, on the other hand, you inherited a codebase from someone who did not follow the tenet of "develop your code as if the next maintainer is an axe murderer who knows where you live", then spending a great deal of time debugging is understandable and likely unavoidable.

Personally, during the time that I get paid for programming, most of my time is spent writing tests and developing features.

On the side, however, I have a project that I inherited from someone who clearly never intended to have another person look at the code, and most of my time is spent spelunking and debugging (and slowly replacing every last line).

I find myself doing the exact opposite. The time I get paid for is mostly spent fixing bugs in a codebase of variable quality that I've inherited. The hobby projects I get up to are entirely my own code and thus I get to spend the majority of that time implementing new features.

Well - my "hobby" project is running the league website for a pool hall that was written in PHP ~12 years ago. There are 4 different leagues, each on their own database, all running very similar (but not the same) versions of the code on similar but not the same db schemas.

So I'm trying to modernize it by building a separate app that can interoperate with the 4 different schemas and do all the same things that the old app did. It's an interesting exercise in replacing legacy code piece by piece while still using it (all the leagues would not function if the site didn't work, and there's basically 3 weeks out of the year when the leagues aren't playing).

Professionally, on the other hand, I work at a startup where I've more or less had my hands in the code from day one.

Debugging, because coding is the easiest part. When I am asked to build a feature, I take some time thinking about what I should do and how (around 10%-20% of my time). Once I feel I have a good idea, I start to code (but sometimes when I start to code I spend more time thinking about code design, like best practices). So I think 30%-40% of my time goes to coding and thinking about best code practices. But then comes the part where I really struggle: DEBUGGING. The rest of my time goes into debugging.

But I am not sure how I can improve. I am not sure whether anyone else has faced this, but I feel like beginners really feel the same way.

All the fun lies in debugging; I sometimes love it. I find funny bugs in my code. But it is so time consuming and I really want to reduce it. Not sure how. Any help would be appreciated.

Like other commenters have noted, the more experience you get and the more you learn to improve the design of your code right off the bat, the less debugging you should have to do. Hopefully then the coding part becomes just as fun as the debugging currently is for you -- a lot of satisfaction can come from coding something really well and having it work just as you expected the first time.

Lately, it's mostly debugging. I rarely find debugging difficult (tedious at worst), and I like fixing real problems (vs building a new feature or component I don't see the reason behind) so that suits me just fine.

Honestly, I'm probably better at it than building new features anyway.

Not employed. As a hobby programmer I spend more time debugging. The larger the project the more debugging.

I wonder if there is a study about the percentage of time spent debugging as you become more experienced.

Roughly equal, probably debugging takes more time (or at least it feels that way!).

Usually can get through a chunk of code no problems but when that 1 inevitable bug arises it will take up a lot of time through trial and error, stack overflow and just generally googling to find a solution.

It depends on what I'm writing. For portable data terminal programs, it was 90% coding, 10% testing and debugging. For the air traffic control software, it was 10% coding and 90% inspection, testing and debugging.

Between developing, extending and maintaining code, I'd say about 20%, 60%, 20%.

That said, I'd say only about one-third of my time is spent on code. I spend significantly more of my time doing operations work and having meetings.

My work is 50/50 fixing other people's code and writing new code, and personally about 10/90; it's so much easier for me to debug things when I get that memory spark from having written them the first time.

If you consider that the life of a software product is 5 years, it would typically take the first 3 months to build v1. So coding vs debugging would pretty much be in the same proportion. 3:60 in this case.

Coding is only the tool. I don't spend much time coding; I spend a lot of time designing/analyzing, then I spend time debugging/testing, which takes more time than coding.

Debugging the platform so I can get 10 lines of my code to run.

A lot of planning, then coding and debugging - there's a lot of refinement going on during the debugging process, so I can't call it just debugging.

Neither, reading and understanding occupies the bulk of my time programming, even when I'm reading and understanding code I wrote.

Mostly coding & unit testing, but sometimes I hit a case where debugging becomes the focus for a while.

*cough* .NET Parallelism *cough*

40% coding, 50% unit testing, 10% debugging.

I am spending most of the time on coding. Debugging is "waste".

Depends on whether I'm coding or debugging.

80% implementation

80% maintenance

50% coding, 40% testing, 10% debugging.

Before 40: coding. After 40: debugging.

Am spend more time on debugging.

you need to debug this comment

Missing capitalization, and terminating semicolon.

I mean period. Terminating period.

debugging. esp. not my code.
