
Ask HN: How did you improve the quality of your code base? - splittingTimes
Our company builds a Java OpenGL CAD/CAM application suite for Windows
desktops. We have a couple tens of millions of LOC, with ~50 projects and
1000s of packages. We grew to ~50 devs. Our teams are distributed worldwide across several locations.
After ~10 years of neglect we need a strategy to increase code quality (lots of dependencies,
feature-envying inheritance hierarchies, spaghetti code, similar problems solved
in myriad ways, all that jazz).

+ How do you measure code quality? How do you interpret the metrics?

+ What are good tools for a Windows/Java/Eclipse dev environment?

+ How do you act on the metrics and actually improve code quality?

+ Can you recommend any resources or success stories on how companies managed to increase the code quality of a big, tangled system?

What we are doing ATM:

- code reviews via Gerrit

- Jenkins for build + testing (~5% code coverage, tests run nightly and take ~1h)

- Scrum with 2-week sprints, with 30% of the time allocated for refactoring/writing tests
  + 1 week of maintenance to work on the bug backlog

- a small team (2 members) increasing GUI tests to automate validation

- developing guidelines for reviews and coding
======
avitzurel
* Code review and care.

* Always leave code in a better state than you found it.

This project is big; it's definitely not your average Express/Node.js webapp
you see these days.

From my experience in projects this big with a big team, there's absolutely no
better solution than reviewing code carefully and caring about quality.

I was in the same situation as a consultant a few years back, and the rule of
leaving code better than you found it really resonates with most people who
respect what they do.

I'd say that the VP ENG should be involved in the process and set some rules
for what is acceptable quality and what isn't.

One more thing:

Everyone knows a smell: every single member of your team has seen some piece
of code that doesn't make sense. Keep a document with all of these and make
sure you scratch items off EVERY single day.

Stuff like:

* User creation is using LOCK on table_x and it shouldn't

* Form submit code is too complex, needs to be better

* Extract component X into a microservice

etc...

If you go through a list like this and fix things one by one, you'll be better
off in a short amount of time.

Don't try to take it on all at once; create manageable, consumable pieces that
your team can relate to, understand, and get behind.

Good luck

------
twunde
The main business goals of code quality improvements are to 1) reduce the
number of problems customers encounter, thereby reducing customer support
costs, 2) reduce the amount of unplanned work/firefighting engineers do, and
3) increase the pace of innovation. Therefore, along with code metrics, you
should be tying these quality changes to business metrics such as the number
of customer-filed bugs per month, the number of customer support calls per
month, etc. This should validate the code quality improvements and give you a
way to sell further improvements to management. Also keep in mind that there
are non-technical ways to improve IT efficiency (improve project management,
improve release management, improve testing, etc.).

So how do you go about making code quality improvements?

1) See if you can remove unused code/dependencies/features. Less code means
less code to support, and faster compile and testing times. Look at metrics
like lines of code removed.

2) Focus on the most problematic areas of the code and eliminate errors and
bugs. If you can eliminate a significant source of unplanned
work/firefighting, you'll have more time to spend on planned development
instead of just reacting. These problematic areas are where tests will be
most useful.

3) Add static analyzers and linters to easily detect simple problems (unused
variables, style problems, problematic constructs such as if (foo = bar)). As
you develop your coding guidelines, implement the rules in these tools so they
automatically find these minor problems. This will allow your code reviews to
focus on the big picture instead of nit-picky implementation details.
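As a sketch of point 3, here is a minimal Checkstyle configuration (Checkstyle integrates with both Eclipse and Maven builds) that flags exactly the kinds of constructs mentioned above; the module selection is just an illustrative starting point, not a recommended house style:

```xml
<?xml version="1.0"?>
<!DOCTYPE module PUBLIC
    "-//Checkstyle//DTD Checkstyle Configuration 1.3//EN"
    "https://checkstyle.org/dtds/configuration_1_3.dtd">
<module name="Checker">
  <module name="TreeWalker">
    <!-- flags assignments inside expressions, e.g. if (foo = bar) -->
    <module name="InnerAssignment"/>
    <!-- flags unused import statements -->
    <module name="UnusedImports"/>
    <!-- warns when a method's cyclomatic complexity gets too high -->
    <module name="CyclomaticComplexity"/>
  </module>
</module>
```

Once a rule set like this runs in CI, reviewers stop arguing about mechanics and the tool becomes the single source of truth for style.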

4) Do things to close the feedback loops for development. Consider running
tests throughout the day. Maybe shorten release cycles. Maybe add UX earlier
in the cycle.

------
hiperlink
This thread might give you some ideas:
[http://softwareengineering.stackexchange.com/questions/15548...](http://softwareengineering.stackexchange.com/questions/155488/ive-inherited-200k-lines-of-spaghetti-code-what-now)

------
sethu-b
In our company, we use the Community Edition of SonarQube to help improve code
quality. SonarQube lets you set up different metrics and fail builds when they
are not met, via the sonar-maven-plugin. SonarSource's newer SonarLint project
has plugins for all the modern IDEs and does a quick analysis of the current
file.
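For context, a typical analysis run with the Maven plugin looks something like this (the server URL and token below are placeholders, not real values):

```shell
# run the SonarQube analysis and push results to your server
# (sonar.host.url and the token are placeholders for your own setup)
mvn sonar:sonar \
  -Dsonar.host.url=http://sonarqube.example.com:9000 \
  -Dsonar.login=YOUR_TOKEN
```

Wiring this into the nightly Jenkins job means the quality gate can fail the build the same way a failing test does.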

------
gravypod
This might be difficult to do but can you explain your overall architecture
design? What sorts of issues are you having? Where are the pain points?

Going from top to bottom:

> Our company builds a java openGL CAD/CAM application suite for windows
> desktops

If it's an application suite then, as I understand it, you'll be building a
main set of libraries and then a set of tools that all use those libraries.
Have you considered a hierarchical plugin design? Have a main application that
starts and sets up all of your main rendering and CAD/CAM magic. From there,
work out the simplest of APIs for what everything actually _needs_
access to.

Your main application basically just manages UIs/drawing to an OpenGL
viewport. From there you can load modules to do other things. If you abstract
what is needed, then each module should only need to define how a piece of
functionality is executed, not where it lives or what it should look like in
the UI. For instance, refactor your code to follow a structure like:

    
    
       Master UI System (Exposes: "Options", "Renderables", "Views")
        -> Drafting Plugin (Exposes: "Models", "Collision", "Faces")
        -> CAM Plugin (Exposes: "Routing Paths")
    

Master UI does not need to know anything about the Drafting Plugin or the CAM Plugin.

Drafting Plugin needs to know about Master UI but nothing about CAM Plugin.

CAM Plugin needs to know about Master UI and Drafting Plugin.

That's what I would try to do if this were a new project, but it isn't one,
and uprooting your entire code base (or even any recognizable percentage of
it) is unreasonable.

> We have a couple tens of million LOCs, with ~50 projects and 1000s of
> packages

If you've got that many packages then you might want to find out what sort of
abstractions are being used but not working correctly, and remove them or
replace them with simpler solutions. How many of these packages are filled
with interfaces/abstract classes/implementations of interfaces?

> After ~10 years of neglect we need a strategy to increase the code
> quality (lots of dependencies, feature-envying inheritance hierarchies,
> spaghetti code, similar problems solved in myriad ways, all that jazz).

One at a time:

> lots of dependencies

Slowly replace dependencies by either abstracting features further, replacing
them with new standard-library features, or implementing other solutions to
the same problems. Every dependency is an added layer of complexity in my
book, so it's best to avoid them as much as possible.

> feature-envying inheritance hierarchies

This comes as a side effect of not knowing what a level of abstraction is
actually meant to do. Have a team meeting and ask what each team thinks the
actual problems to be solved are. The people knee-deep in the crap will have a
better idea of the correct or natural abstraction for these cases, if the ones
currently in use are unnatural. It may just be that the code base has had too
many large-scale changes, or too many features pushed in at once (which for a
CAD/CAM tool is definitely not unheard of; this is a very specialized and hard
domain).

> spaghetti code

Get some sort of static analyzer. I remember one group I worked with used
Sonar. Also remember that the best code-quality tool is a good, agreed-upon
set of standards. Some things that have worked for me on group projects I've
worked on: avoid complicated constructors, default variables to final, avoid
complicated logic statements, exit early rather than filtering afterwards in a
loop, and use all the up-to-date constructs that aid code clarity
(try-with-resources, for (var : set), and more).
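As an illustration of those conventions together (final locals, try-with-resources, early exit), here is a small made-up example; the class and method names are invented for the sketch:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;

// Made-up example showing the conventions above: final parameters/locals,
// try-with-resources, and exiting early instead of filtering afterwards.
final class FirstMatch {
    static String firstStartingWith(final String text, final String prefix) {
        // try-with-resources closes the reader even on the early return
        try (final BufferedReader reader = new BufferedReader(new StringReader(text))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.startsWith(prefix)) {
                    return line; // exit early on the first hit
                }
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e); // cannot happen for in-memory strings
        }
        return null; // no match found
    }
}
```

The early return keeps the happy path flat instead of nesting the loop body inside filter conditions.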

> similar problems are solved in myriad ways

If the same problem exists in two places, this is an opportunity to pull that
part out, abstract it, and use it as a library. This is a double-edged sword,
since the two parts actually need to contain the same problem, which is
sometimes not the case.

Now to the nitty gritty:

> How do you measure code quality? How do you interpret the metrics?

(How many times does the code result in an error) * (The time in hours that it
takes to debug the code).

A larger number is worse. Keep a notebook/log of these times, graph them, and
use that as a map to decide what is worth refactoring. If a piece of code
"just works" but looks ugly, it can wait to be refactored when another piece
of code looks "visually appealing" while still causing daily side effects in
the active development of the project.
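That heuristic fits in a few lines of Java; the module names and numbers below are invented purely for illustration:

```java
import java.util.Comparator;
import java.util.List;

// Sketch of the metric above: pain = (times the code fails) * (hours to debug).
// A higher pain score means refactor sooner. All data here is hypothetical.
record Hotspot(String module, int failures, double debugHours) {
    double pain() {
        return failures * debugHours;
    }
}

class Triage {
    // Order candidates by descending pain to decide what to refactor next.
    static List<Hotspot> rank(final List<Hotspot> spots) {
        return spots.stream()
                .sorted(Comparator.comparingDouble(Hotspot::pain).reversed())
                .toList();
    }
}
```

Note how the "ugly but stable" code (few failures) naturally falls to the bottom of the list, matching the advice above.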

> What are good tools for a windows/java/eclipse dev environment?

I've always managed Ant scripts for my group projects since they are very
cross-platform. Maven works great, but I'm not a fan of the install complexity
for non-Linux users. Also check out IntelliJ IDEA for its built-in Maven
support.
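For reference, a minimal Ant build file looks something like this; the project name and directory layout are placeholders:

```xml
<!-- minimal Ant build sketch; project name and paths are placeholders -->
<project name="cadsuite" default="compile" basedir=".">
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src" destdir="build/classes" includeantruntime="false"/>
  </target>

  <target name="test" depends="compile">
    <!-- hook JUnit or your own test runner in here -->
  </target>
</project>
```

Everything runs off a plain JDK install, which is what makes it portable across developer machines.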

> How do you act on the metrics and actually improve code quality?

Change your code by coming at it from a different perspective. If that
perspective yields a more promising piece of code (one that is easier to
understand, causes fewer side effects, and uses less external/non-standard
functionality), then keep it. A lot of the code I write is code I throw away.
This is much harder to justify to business people, but it's an important part
of the process to sketch up what you think _might_ work, even if the attempts
aren't always fruitful.

> Can you recommend any resources of success stories on how companies managed
> to increase code quality of a big, tangled system?

Check out the U.S. Digital Service for the only recent success story that
comes to mind [1].

If anyone knew the secret sauce, they wouldn't give it out for free. The
ability to "fix" all the "broken" projects isn't solved at the scale we like
to think it is. A large portion of all technology-related projects fail [0].
If anyone could prove they were able to reliably fix these issues, they'd be
billionaires overnight.

[0] - [http://www.zdnet.com/article/study-68-percent-of-it-projects...](http://www.zdnet.com/article/study-68-percent-of-it-projects-fail/)

[1] - [http://www.theatlantic.com/technology/archive/2015/07/the-se...](http://www.theatlantic.com/technology/archive/2015/07/the-secret-startup-saved-healthcare-gov-the-worst-website-in-america/397784/)

Edit: Removed "What do you mean by thousands of packages?"

Looking forward to what you think of all this.

