Agile Practices: 2009 Open Research

This open research into agile practices was performed the first week of July 2009, and there were 123 respondents. The survey was announced on several agile mailing lists, including the Extreme Programming (XP), Test-Driven Development (TDD), Scrum Development, Agile Modeling, Agile Databases, Agile China, and Agile Unified Process (AUP) mailing lists. The goal was to find out what agile developers were actually doing, so that it could be compared with what’s being talked about.

The Survey Results

Some findings include:

  • Figure 1 lists the top 10 agile practices which are believed to be most effective.
  • Figure 2 lists the top 10 agile practices which are believed to be easiest to learn.
  • Figure 3 lists the top 10 agile practices which are believed to be hardest to learn.
  • Figure 4 lists the top 8 agile practices which were most likely to be tried and then abandoned.
  • Figure 5 lists the top 10 agile practices which people want to adopt but have not yet done.
  • Figure 6 shows that 68% of people indicated that their agile teams were of size 10 people or fewer. Some people indicated that they were working on agile teams with hundreds of IT people involved. Team size is one of several way of working (WoW) tailoring factors.
  • Figure 7 shows that 33% of respondents indicated that their projects had to conform to regulatory compliance. Regulatory compliance is one of several way of working (WoW) tailoring factors.
  • Figure 8 shows that 9% of respondents indicated that they were working on projects that were CMMI compliant. It is possible to take an “agile CMMI” approach.
  • Figure 9 shows that 42% of teams were co-located; the rest had some form of geographical distribution. Geographical distribution is one of several way of working (WoW) tailoring factors.


Figure 1. Most effective agile practices.

Figure 2. Agile practices that were easiest to learn.

Figure 3. Agile practices that were hardest to learn.

Figure 4. Agile practices which were tried and abandoned.

Figure 5. Agile practices which people hope to adopt some day.

Figure 6. Average size of agile teams.

Figure 7. Agile software development and regulatory compliance.

Figure 8. Agile software development and CMMI.

Figure 9. Agile software development and geographical distribution.




Survey questions

The Survey Questions (116K)

Survey Data File

Raw Data (103K)

Survey Presentation

Summary Presentation (206K)


What You May Do With This Information

You may use this data as you see fit, but you may not sell it in whole or in part. You may publish summaries of the findings, but if you do so you must reference the survey accordingly (include its name and the URL to this page). Feel free to contact me with questions. Better yet, if you publish, please let me know so that I can link to your work.


Discussion of the Results

  1. I’m disappointed that the response rate was only 123 people. I suspect that the agile community has become tired of being surveyed constantly.
  2. The question about organization size was misworded. As a result, I suspect that most people answered with their team size instead, particularly because the answers were very similar to those given for the question where I actually asked about team size, and because previous surveys had very different answers for organization size.
  3. Eight people dropped off at question #7, which asked what phase (initiation, construction, release, production, …) your agile project was in. “Phase” is a dirty word among some extremists, even though it’s exceptionally obvious that agile projects go through phases (oops, I mean they exhibit rhythms), and it’s disappointing that that many people would quit the survey just because I used terminology that goes against their “agile sensibilities”.
  4. Different people find different practices easy to learn versus hard to learn, so there is some overlap between figures 2 and 3. This was expected.
  5. This survey suffers from the fundamental challenges faced by all surveys.


Links to Other Articles/Surveys

  1. My other surveys


Why Share This Much Information?

I’m sharing the results, and in particular the source data, of my surveys for several reasons:

  1. Other people can do a much better job of analysis than I can. If they publish online, I am more than happy to link to their articles/papers.
  2. Once I’ve published my column summarizing the data in DDJ, I really don’t have any reason not to share the information.
  3. Too many traditionalists out there like to use the “where’s the proof?” question as an excuse not to adopt agile techniques. By providing some evidence that a wide range of organizations appear to be adopting these techniques, maybe we can get them to rethink things a bit.
  4. I think that it’s a good thing to do and I invite others to do the same.