
DIB Guide: Detecting Agile BS (2018) [pdf] - gashad
https://media.defense.gov/2018/Oct/09/2002049591/-1/-1/0/DIB_DETECTING_AGILE_BS_2018.10.05.PDF
======
eru
Great document overall!

> The purpose of this document is to provide guidance to DoD program
> executives and acquisition professionals on how to detect software projects
> that are really using agile development versus those that are simply
> waterfall or spiral development in agile clothing (“agile-scrum-fall”).

Actually doing 'waterfall' properly would probably be fine? Or at least not
the bogeyman it's made out to be.

The real danger is that the project would just be managed badly, independent
of professed approach. Just muddling through badly.

(I am not even sure if waterfall was ever actually a thing; I mostly only ever
hear about it as a thing to avoid.)

~~~
tsimionescu
> Actually doing 'waterfall' properly would probably be fine? Or at least not
> the bogeyman it's made out to be.

Doing waterfall in the exact sense that is usually described will almost
certainly never work well. People usually focus on the design phase, but I
think the much more catastrophic part of 'waterfall' is doing testing only
once development is complete.

Now, you absolutely can successfully run software projects that are design
heavy, that more or less freeze requirements and designs early on, and that
seek to execute on the design, instead of iterating. However, if you truly
develop for 6 months before any kind of external QA, as described in most
Agile talks about what Waterfall means, that is a recipe for disaster. If you
do test things by feature, invest in component testing and practice that all
along the way, you can succeed.

This essentially describes a 2-stage waterfall: design and execute, where the
execute phase includes both development and testing all along the way. Is this
'doing waterfall properly'? Or is it a hybrid methodology? That's a matter of
definitions in the end.

It's also important to note that the extent to which this works depends a lot
on the project itself. If the requirements are volatile (e.g. when developing
a new product with an uncertain market), or if the project is too large (e.g.
if the design phase suggests it will take 5 years to finish given the current
team), then it's likely that the project would benefit a lot more from an
Agile style of development, where you deliver smaller chunks of the project to
its end users to gather early feedback on the actual usefulness.

~~~
jrumbut
I think the parent poster has, but for any reader who is a software engineer:
take 15 minutes out of your life to read W. W. Royce's "Managing the
Development of Large Software Systems."

In this brief and engaging paper you will find the diagram used by many agile
enthusiasts to describe the "waterfall method" and will be shocked to discover
that it is held up as an example of a process that never actually works in
reality.

You will then read quotes like this, which could have come out of an agile
book:

"For some reason what a software design is going to do is subject to wide
interpretation even after previous agreement. It is important to involve the
customer in a formal way so that he has committed himself at earlier points
before final delivery. To give the contractor free rein between requirement
definition and operation is inviting trouble."

[http://www-scf.usc.edu/~csci201/lectures/Lecture11/royce1970.pdf](http://www-scf.usc.edu/~csci201/lectures/Lecture11/royce1970.pdf)

~~~
qznc
[http://beza1e1.tuxen.de/waterfall.html](http://beza1e1.tuxen.de/waterfall.html)

Incremental and iterative development is not an invention of Agile either. It
was used before "Waterfall" existed.

~~~
Frost1x
I've used incremental and iterative processes in R&D for many years before
"agile" became the current go-to. Unfortunately, the obsession in tech with
following trends has pushed many to try and modify what is, IMHO, a better-
refined process for a domain where agile just doesn't quite work. In R&D,
prototyping work is quite challenging and doesn't lend itself well to full-out
agile pipelines. CI/CD can actually hamstring you far more than help you, as
can other typical PM tools. You need to be more _agile_ than "agile."

For larger projects with adequate resources, agile _can_ make sense, although
it's typically modeled after quarterly focused business models and can miss
long term opportunities. The core issue with agile is that its structure is
ripe for abuse by everyone involved except the actual developer(s). Instead,
developers have to reconcile all sorts of poor choices together in a fairly
formal environment, leading to headache after headache.

------
ngngngng
Interesting that all these issues are completely different from my own
experiences with Agile BS. We would have 10 hours of exhausting meetings every
2 weeks in order to plan our sprints. Unconsciously, we just ended up hyper-
inflating estimates, so our team would joke that the only thing we did each
sprint was "slap a box on it" (in CSS, or some similarly simple task).

I left that job when all the developers completed their tasks a few hours
before the end of the sprint, giving me (QA) just a few hours to test, merge,
and deploy their code, which, because of our terribly clunky and manual deploy
system, just wasn't possible. I was placed under an internal investigation for
not being productive because I held up the sprint of the "most productive team
in the company" and made us look bad to out-of-state executives.

~~~
dionian
testing should have been part of the estimates and you should have thrown that
right back at the rest of the team during retro

~~~
ngngngng
Exactly. But my team didn't think it was a problem; we just forgave ourselves
and moved on, and I finished testing and deploying early in the next sprint.
The problem came when external executives noticed how little we completed
compared to our usual throughput and insisted that I be punished for it since
I was the bottleneck. My team was just as confused as I was, but I had been
meaning to move from QA/Test automation to being a developer for some time, so
that just hurried me along and I left within a couple weeks.

------
vbtemp
I love the document and will distribute to my co-workers. The short story,
however, is not really that there's a checklist to determine BS Agile, but
rather that all/most Agile is BS.

In my career I've seen "Agile" throw a wrench in the works for so many
projects. Embedded systems; data center distributed systems that are air-
gapped; aerospace safety systems; R&D work. Agile in these cases just isn't
the thing to do, but unfortunately the culture these days is that everyone
MUST be Agile, and so it creates another bureaucratic nightmare of
dysfunction.

The funny thing is, people will always go to bat for Agile (MUCH more so on
Reddit than HN).. and I don't understand why. I think furthermore the
discourse around this has become so weird. For example, I asked someone kindly
what is "Not-Agile"? There's no answer, other than "bad ways of making
software". The discourse has become caveman-like "Agile good. Not-Agile bad."

At the end of the day, Agile is probably great if you're making a mobile app
or a web app with a small team for a client with a small-to-medium budget,
which accounts for most of the work software developers do, which is why it's
so popular. But it's inherently too short-sighted and unable to address
technical challenges that go more than superficially deep.

~~~
jrott
I love this document as well for very similar reasons. It's interesting how
attached people become to the ritual of doing things instead of thinking about
why they are doing it.

In my experience, it seems like a ton of what people want to do with agile is
skip the writing and design work that has to happen to make deep technical
projects succeed. There is a really strong desire to just start implementing
and then be able to refactor to working code. Of course, there is always
pressure to deliver faster, so the refactor only ever gets half done, and then
there is an architectural mess.

------
zoomablemind
What's DIB? Just trying to understand who is the target audience of this
guide.

It's a practical set of traits to spot. But inevitably a question comes up:
"what to do next?" Re-educate, enforce, hire/fire, disband?

One needs to remember how Agile processes were being "installed" back in the
day in organisations/teams of various degrees of dysfunction. Lots of those
teams went through trials of "templates", including waterfall, with just the
same outcomes.

Too often, the failure is not at the team level but at the org level. The base
tenet of Agile success is buy-in at all levels. Yet it's easier for management
to "buy into" a structure and its attributes than into actually empowering and
trusting the teams.

So, this detection approach may find all the right attributes, tools, lingo,
roles... but not the actual practices. A beaten-down example is the morning
stand-up, which disguises the dreaded subordinate reports; the best indicator
of such theater is the presence of a "clipboard" or note-taker person.

I'd think that for such a guide to be of better practical value, there should
be a section outlining ways to detect the constraints and obstacles to
adopting a process that would be effective in a given team's case. It does not
have to be Agile-or-wrong.

~~~
Jtsummers
DIB = Defense Innovation Board,
[https://innovation.defense.gov](https://innovation.defense.gov)

Made up of various tech industry leaders (mainly CEOs, it seems). The purpose
was to modernize the way defense software systems are developed and
maintained, or at least present a path to modernization. Because it's
presently a clusterfuck.

------
sgt101
My current problem: customers who don't want to participate in the agile
process but do want a "simple" pre-agreed specification to use to determine
project success/failure. Of course they can't write such a spec and want me to
do it; and of course I can't without doing large parts of the work to
implement it (because if I'm wrong, I'm on the hook for a large sum of money
wasted).

~~~
kthejoker2
Why can't they write it? Conduct a user story workshop, slice out an MVP
release, and that's your "spec."

You can't force customers to "get" Agile. You can force them to understand the
risks of not participating in the process.

~~~
sgt101
>Why can't they write it? Conduct a user story workshop, slice out an MVP
release, and that's your "spec."

Well - yes, this (variations) is what we do, but guess what the outcome is....
"I'm not convinced that we've got this right...", "I was never fully signed up
to that...", "I think we have invested a lot of effort in a process that isn't
generating business value..."

The problem with Agile is that it doesn't account for politics. If people play
nice and are all signed up to get the best done with the tools and people
available, it's brilliant. If you've got to deal with corporate politics, it
leaves those with good intent exposed in hundreds of ways.

------
dang
Discussed at the time:
[https://news.ycombinator.com/item?id=18910608](https://news.ycombinator.com/item?id=18910608)

------
greatgib
It started well, then it completely fell over into corporate bullshit.

For example: "Some current, common tools in use by teams using agile
development".

This is the kind of reason why a lot of people are required by management to
use useless, overkill stacks for their needs, like Docker or Kubernetes.

Also, the "questions to ask" are typical ridiculous agile corporate bullshit,
like "do you have a product charter", or the usual forced process-oriented
questions.

~~~
mumblemumble
I don't think that listing some example tools is a problem. Especially for the
audience of a primer like this, a categorized vocabulary list can be
indispensable for giving people a quick lay of the land and a preview of some
names they'll encounter.

My complaint would be that, "Tools listed/shown here are for illustration
only: no endorsement implied," should have been inline instead of buried in a
footnote.

------
swiley
I interviewed recently at a defense contractor and experienced something like
this. I know you’re supposed to ask questions during an interview, but I
usually only really come up with two: what is your git/hg workflow, and what
does your automated test coverage look like?

Usually the second one has an answer along the lines of “not good but we’re
working on it” which is fine. This place though tried pretty hard to convince
me they were using git to manage their code right up until I asked the second
question. The senior engineer sort of mumbled a few things ending with
something along the lines of “we’re still figuring out exactly what the
transition from svn will look like.”

I’m not sure why they didn’t decide to hire me but I feel like that
interaction really upset someone and may have been a big part of it.

------
lmilcin
For me the biggest clue is always a fixed "agile" process. By definition, any
notion of fixing a process is anti-agile.

In my current team, the only thing I asked for when "going agile" was a
biweekly retro to discuss what to improve. It seems to be going pretty well,
even if most of our problems have solutions in the various published "agile"
process templates.

