Three weeks ago I initiated The Big Agile Practices Survey. In those three weeks, with the help of many bloggers and twitterers, the survey attracted 341 participants. The survey was actually divided into six mini-surveys: requirements, design, construction, testing, process, and organization.
Not everyone bothered to answer all questions, which is not surprising. The survey was indeed quite BIG: spanning 67 software engineering practices, with six questions per practice. That's 402 checkboxes in total! I am very, very thankful to all those participants who had the stamina to fill out the extensive forms. I won't ask this again any time soon, I promise.
As is often the case with surveys, this one got some criticism from both famous and less famous people. Johanna Rothman wrote that surveys like this may be harmful. And James Shore told me that the survey suffers from selection bias. And they are both right! Every poll on the Internet suffers from selection bias, and if data is not interpreted and used correctly the results can indeed be harmful.
Of course, the same applies to the selection process for Agile 2009. It was biased from the start (but I don't care). The selection of practices in agile books, like The Art of Agile Development, is also biased and potentially harmful. And don't forget my own Top 100 Blogs for Developers: very biased indeed. However, there IS value in knowing what other people think, though you should never let those opinions dictate your own actions. Which is why I ignored the critics and went my own way…
Fortunately, I had many more supporters than critics. Here's a big thanks to the bloggers who helped me with the promotion of this survey: Dave Nicolette, Corey Ladas, Mike Cohn, Mike Cottmeyer, Alberto Brandolini, Alvin Ashcraft, Robert Dempsey, Kirk Knoernschild (DZone), Raven Young, Amr Elssamadisy (InfoQ), Lee Henson, Dahlia Bock, Artem Marchenko (AgileSoftwareDevelopment), and all those on Twitter who tweeted about this survey. My hard work would have been in vain without you! Thank you.
And now, the results…
Which practices are REALLY AGILE, or… highest level of agility?

| Category | Practice | % saying agile |
| --- | --- | --- |
| Process | Iteration Planning / Planning Game / Sprint Planning | 98.8% |
| Process | Daily Stand-up Meeting / Daily Scrum | 97.6% |
| Organization | Self-Organizing Team / Scrum Team | 97.5% |
| Process | Sprint Review / Iteration Demo | 96.3% |
| Process | Timeboxing / Fixed Sprints / Fixed Iteration Length | 96.0% |
Comments: These are the top 10 practices that the participants in this survey most often associated with agile software development. Iteration Planning is the clear winner for this question. And it's interesting to see that practices from the Process and Organization categories are the ones most strongly associated with agile.
Which practices are NOT REALLY AGILE, or… lowest level of agility?

| Category | Practice | % saying agile |
| --- | --- | --- |
| Construction | Issue Tracking / Bug Tracking | 27.4% |
| Design | Design by Contract | 32.2% |
| Construction | Software Metrics / Code Metrics & Analysis | 32.5% |
| Construction | Code Reviews / Peer Reviews | 42.3% |
| Process | Root Cause Analysis / 5 Whys | 42.6% |
| Organization | Move People Around | 42.6% |
Comments: These are the top 10 practices that are least associated with agile software development. Issue Tracking is the "winner" for this question, and interestingly enough four practices come from the Construction category. Agile software development is clearly more associated with organization and processes than with the tools and practices that support the construction of code.
On which practices was the LEAST AGREEMENT regarding level of agility?

| Category | Practice | Yes/No margin |
| --- | --- | --- |
| Design | User Interface Prototyping | 1.1% |
| Construction | Coding Style / Coding Guidelines / Coding Standard | 1.9% |
| Construction | Source Control / Version Control | 4.1% |
| Design | Domain Driven Design | 7.9% |
| Testing | Smoke Testing / Build Verification Test | 9.0% |
| Organization | Move People Around | 14.8% |
Comments: For some practices the answers were split almost evenly, which indicates a lack of agreement on what it means to be agile. These are the top 10 practices with the least agreement among participants. User Interface Prototyping is the winner here, with an (almost) even split between Yes and No answers. And it is a bit surprising to see no fewer than four practices from the Testing category.
On which practices was the LEAST CONFIDENCE regarding level of agility?

| Category | Practice | % answered |
| --- | --- | --- |
| Process | Lead Time / Cycle Time | 49.2% |
| Process | Value Stream Mapping | 54.6% |
| Process | Root Cause Analysis / 5 Whys | 62.2% |
| Requirements | Defer Decisions / Real Options | 65.1% |
| Construction | Behavior Driven Development | 66.3% |
Comments: Not everyone answered every question. In fact, some questions were ignored by many, which indicates a lack of confidence in being able to answer the questions (due to the practices being not well-known or not well-described). Here it seems that people had the most trouble with lean practices, like Lead Time, Value Stream Mapping, and Kanban Board.
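For readers who wonder how such numbers could be derived from the raw answers, here is a minimal sketch. It assumes each practice has simple counts of Yes, No, and blank answers; the function names, the example counts, and the exact formulas are my own assumptions for illustration, not the survey's published methodology:

```python
# Hypothetical sketch of how the three survey metrics could be computed
# from raw answer counts (yes / no / blank). Illustration only; the
# actual survey may have used different formulas.

def agility(yes: int, no: int, blank: int) -> float:
    """Share of Yes answers among those who answered the question."""
    return yes / (yes + no)

def agreement_gap(yes: int, no: int, blank: int) -> float:
    """Margin between Yes and No; a value near 0% means an even split."""
    return abs(yes - no) / (yes + no)

def confidence(yes: int, no: int, blank: int) -> float:
    """Response rate; a low value suggests many participants skipped it."""
    return (yes + no) / (yes + no + blank)

# Example (made-up counts): 150 Yes, 147 No, 44 blank out of 341 participants
yes, no, blank = 150, 147, 44
print(f"agility:    {agility(yes, no, blank):.1%}")
print(f"agreement:  {agreement_gap(yes, no, blank):.1%}")
print(f"confidence: {confidence(yes, no, blank):.1%}")
```

With these made-up counts, the practice would score around 50% agility, show a near-even Yes/No split (a candidate for the "least agreement" list), and an 87% response rate.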
Which practices are REALLY IMPORTANT, or… highest level of importance?

| Category | Practice | % saying important |
| --- | --- | --- |
| Construction | Source Control / Version Control | 100.0% |
| Process | Definition of Done / Done Done | 99.4% |
| Construction | Frequent Delivery / Frequent Releases | 98.3% |
| Testing | Storytesting / Acceptance Criteria / Acceptance Testing | 98.1% |
| Process | Retrospective / Reflection Workshop | 98.0% |
Comments: When it comes to importance, participants unanimously agreed that Source Control is an important practice. More surprisingly, there is not a single design practice in this top 10 of most important practices.
Which practices are NOT REALLY IMPORTANT, or… lowest level of importance?

| Category | Practice | % saying important |
| --- | --- | --- |
| Design | Design by Contract | 52.9% |
| Organization | Move People Around | 53.4% |
| Construction | Software Metrics / Code Metrics & Analysis | 57.1% |
| Organization | Scrum of Scrums | 64.2% |
Comments: In the top 10 of least important practices we see a significant number of requirements and design practices. (Of course, this does not mean that design and requirements in general are unimportant.)
On which practices was the LEAST AGREEMENT regarding level of importance?

| Category | Practice | Yes/No margin |
| --- | --- | --- |
| Design | Design by Contract | 5.9% |
| Organization | Move People Around | 6.8% |
| Construction | Software Metrics / Code Metrics & Analysis | 14.3% |
| Organization | Scrum of Scrums | 28.3% |
Comments: These are the top 10 practices with the least agreement among participants regarding their importance. What is most intriguing here is that these are exactly the same results as in the previous question, only in a different order. The practices that people find least important are apparently also the ones on which public opinion varies the most!
On which practices was the LEAST CONFIDENCE regarding level of importance?

| Category | Practice | % answered |
| --- | --- | --- |
| Process | Lead Time / Cycle Time | 46.5% |
| Process | Value Stream Mapping | 47.6% |
| Design | Design by Contract | 55.9% |
| Organization | Move People Around | 57.2% |
| Process | Root Cause Analysis / 5 Whys | 57.8% |
Comments: These are the top 10 practices that people ignored the most when asked about importance. Again, the lean practices dominate the results, which suggests that many people in agile software development haven't made up their minds yet about the importance of specific lean practices.
Which practices are REALLY APPLIED, or… highest level of application?

| Category | Practice | % applying |
| --- | --- | --- |
| Construction | Source Control / Version Control | 100.0% |
| Process | Iteration Planning / Planning Game / Sprint Planning | 92.9% |
| Process | Daily Stand-up Meeting / Daily Scrum | 90.8% |
| Process | Timeboxing / Fixed Sprints / Fixed Iteration Length | 90.8% |
| Construction | Daily Builds / Automated Builds / Ten-Minute Builds | 90.3% |
Comments: These are the top 10 practices most often applied by the participants in their own organizations. Again, Source Control is the clear winner, with a significant lead. And again, there are no clear winners from the Design category of practices.
Which practices are NOT REALLY APPLIED, or… lowest level of application?

| Category | Practice | % applying |
| --- | --- | --- |
| Process | Value Stream Mapping | 24.2% |
| Design | Design by Contract | 25.3% |
| Process | Lead Time / Cycle Time | 29.4% |
| Construction | Behavior Driven Development | 35.1% |
Comments: In this top 10 of least applied practices, we see both lean and design practices dominating the results. Interestingly enough, the Testing category is absent from these results, which might be good news for quality assurance people around the world.
On which practices was the LEAST AGREEMENT regarding level of application?

| Category | Practice | Yes/No margin |
| --- | --- | --- |
| Construction | Software Metrics / Code Metrics & Analysis | 1.2% |
| Construction | Pair-Programming / Pairing | 4.5% |
| Design | Domain Driven Design | 7.1% |
| Process | Root Cause Analysis / 5 Whys | 8.0% |
| Requirements | Product Vision / Vision Statement | 16.5% |
| Construction | Code Reviews / Peer Reviews | 18.1% |
| Organization | Move People Around | 18.5% |
| Requirements | Minimum Marketable Features | 18.6% |
| Organization | Scrum of Scrums | 22.9% |
Comments: These are the top 10 practices with the least agreement among participants. It is interesting to see Software Metrics topping this list for the first time, which means there are about as many people applying software metrics as there are people not applying them. The Testing category is also conspicuously absent in this case.
On which practices was the LEAST CONFIDENCE regarding level of application?

| Category | Practice | % answered |
| --- | --- | --- |
| Process | Lead Time / Cycle Time | 64.3% |
| Process | Value Stream Mapping | 67.0% |
| Requirements | Defer Decisions / Real Options | 68.0% |
| Organization | Move People Around | 72.2% |
| Process | Root Cause Analysis / 5 Whys | 74.1% |
| Design | Design by Contract | 74.2% |
Comments: These are the top 10 practices that people ignored the most when asked about their own application of the practices. Not surprisingly, the lean practices dominate the results again. Either people don't know what these practices are, or they don't know whether they are being applied in their organizations.