Thursday, September 14, 2006

Early Performance Feedback

One of the tenets of Agile product development is to seek and learn from feedback, even from your deployment process and server environments. At my company, while we build and test for well-performing code, we wait until we're in the Certification/Staging environment to officially test performance. This is yet another activity that, while valuable and necessary, should be moved forward in the development cycle. First, this testing happens just late enough to disallow any significant performance-enhancing changes at the code level. Second, the timing of these tests adds little value to the software development process, so no learning can occur.

I asked my team to consider the following:

  1. Write tests that time certain functions of our application.
  2. Run the tests to get an average performance time for each (note that we're not testing performance, we're merely benchmarking it). Benchmark during the same window in which you'll regularly run these tests.
  3. Schedule the tests to run and compare current performance against the expected performance time. Factor in a little tolerance, say 10%, and fail each test when it performs outside that range.
  4. Run these tests as part of a regression test suite, triggered after every build deployment.
  5. Identify and investigate failed tests immediately to see whether the changed code could have adversely affected performance.
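To make the idea concrete, here's a minimal sketch of steps 1-3 in Python. Our actual suite isn't written this way; the function names (`search_orders`, `benchmark`, `assert_within_tolerance`) and the baseline numbers are all hypothetical, and the point is only the shape of the check: record an average baseline, then fail when a later run drifts beyond the tolerance.

```python
import time

# Hypothetical application function we want to benchmark (a stand-in for real work).
def search_orders(keyword):
    time.sleep(0.01)
    return [keyword]

def benchmark(fn, *args, runs=5):
    """Step 2: average wall-clock time over several runs."""
    start = time.perf_counter()
    for _ in range(runs):
        fn(*args)
    return (time.perf_counter() - start) / runs

def assert_within_tolerance(fn, args, baseline, tolerance=0.10):
    """Step 3: fail when current time exceeds the baseline by more than the tolerance."""
    elapsed = benchmark(fn, *args)
    limit = baseline * (1 + tolerance)
    assert elapsed <= limit, (
        f"{fn.__name__} took {elapsed:.4f}s, over the {limit:.4f}s limit"
    )

# Record the baseline once, then re-check it on every regression run.
baseline = benchmark(search_orders, "widgets")
assert_within_tolerance(search_orders, ("widgets",), baseline, tolerance=0.25)
```

In a real suite, the baseline would be stored (not recomputed per run) and each such check would simply be one more test case triggered by the post-deployment regression run.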

As Agile does so many times, this challenged conventional thinking. Why would you test in an environment known to be a horrible performer? How could you trust the test results and not spend countless hours investigating errors that prove to have nothing to do with the code? But just think of the benefits:

  • shifts responsibility for performance to the developer, rather than to a tester or someone supporting testing (possibly even a developer who didn't write the code)
  • provides the opportunity to immediately change code that affected performance. Remember that the earlier you find and fix a problem, the less time is actually spent doing so. Finding such problems during a Certification cycle will likely take sizeably longer, and the cost to modify related code can be sizeably higher as well. That waste could have been avoided and is not Lean thinking.
  • encourages the developer to think about performance while designing code, rather than trusting that someone else's test will report back any poor performance
  • investigating these results facilitates learning, which enhances future code design as well as future test design. Such continuous process improvement and commitment to individuals are hallmarks of Agile, Lean and Scrum.

I don't see this as an option. I think we absolutely have a responsibility to ourselves and our company to bring this testing (or timing & investigating) forward. And we'll all benefit from it. Lastly, I'd like to give a special call out to Mike Kelly, who took the time to discuss this topic with me and blog a wonderful post about it. Thanks Mike!

Saturday, September 09, 2006

Agile Risk Management

A regular thought in the back of my head on Mondays is that I've got to go look at my project's risk log and see what mitigating activities are needed this week. And in the fast-paced, open, collocated environment my project operates in, Risk Management is often the last thing on my mind. Compound that with our risks living in an online Excel spreadsheet (vs. casually handled, openly expressed and discussed, and if possible radiated on the wall), and I was the proverbial deer in the headlights. I had regressed back to my old ways of PMing with respect to Risk Management!

For clarity's sake, I defined risks as "the watch list for things that could go awry" (content not fabricated, new/unfamiliar technology, geographically dispersed team, interdependence with another project, etc.). I began seeking the input of those more experienced, and did I ever spark a lengthy back-and-forth. Here are my takeaways:
  • In Scrum, this is exactly what the Scrum Master does every day, while in XP it is the responsibility of the entire team. But what if you're "Agile?"
  • With daily standups, heartbeats & retrospectives, and other forms of communication and feedback, the team focuses on real issues and on higher-probability, nearer-term concerns. Of course this reduces waste by not spending valuable time mitigating risks that may never happen. And as the stories related to a given risk draw near, the team can say "we're uncomfortable with the new technology needed to implement the XYZ story, so let's allocate 5 additional points to do a Spike or a Learning Task beforehand".
  • I was reminded that many of the principles and practices of an Agile team mitigate most of the risks many people would call "the motherhood and apple pie" of risk management for PMs. As an example, you don't fret over the possibility of losing a key developer mid-project as long as you're pair programming. Similarly, you don't have to fear priorities suddenly changing, because of our broadly-brushed product backlog (only the next 1-2 iterations carry the additional details required to sufficiently estimate, build and test).
  • My question even moved Ron Jeffries to write an article for the XP website, which also prompted Kent Beck to chime in. Woo hoo! I've hit the big time, baby!

Self Retrospective - 8-Sept-2006

I thought one of my standard posts could be a self retrospective of my personal Agile journey. I’ll follow the 4-question format: what went well, what could be improved upon, what still has me puzzled, and who do I want to give some shouts out to (props, kudos, flowers/chocolates [thanks Diana and Esther]):

What went well?

  • Took some constructive inquiries and well-intended feedback very well from a different Agile product manager within my company
  • Well prepared before every daily standup. Started focusing my comments on what I think is “of value” to my teammates, rather than just a core dump over everything I did.
  • Began discussions with an external QA manager who will still certify my last project’s deliverable as a component of a much larger program release. I want to track defects found “outside” of our Agile project—as testimony to the improved quality of code developed on our project.
  • Initiated a company-wide distribution list for Agile practitioners, akin to a very non-automated version of Yahoo! Groups.

What could be improved upon?

  • Not having the right answer to help my developers feel comfortable with incremental design + refactoring (as opposed to having more requirements now and knowing how everything’s going to fit together in the future).
  • Cannot sustain my current pace (which includes full-time Scrum Master duties, significant self-education, and additional efforts to operationalize Agile’s fit for my entire department/company).

What still has me puzzled?

  • Seeking examples of Sprint Planning checklists to facilitate both preparation for and execution of the Sprint Planning session.
  • Quality of backlog – on a 14-week project, the estimated backlog projects out to roughly 30-35 weeks. We’re at least talking about this collectively, but I’m still uncertain how I can forecast something like this to external sponsors without getting ourselves in a jam.
  • Can we get our automated Quick Test Pro acceptance test suite installed on the Continuous Build Box and have them run every time a new build is run?

To whom do I want to give some shouts out?

  • My team for their willingness to volunteer at the St. Louis Food Bank as a team-building opportunity. I think I’ll write a separate entry about this altogether.
  • Bob Payne and his Agile Toolkit (still my first and foremost podcast site for work-related information)
  • St. Louis Bread Company (aka Panera or Au Bon Pain) for their wifi access. Allows me to get away at lunch and still do some web research.

Friday, September 08, 2006

Welcome to my Agile blog

Welcome to my maiden voyage at blogging. Well, actually my second. My first was a spirited attempt at putting up something about me, my family and my interests (movies, Chiefs, Jayhawks), and using it to keep extended family updated. Well, that didn't last long. I think I was intimidated by the technology a friend set me up with, and I had neither the time to learn it nor, I guess, the motivation to commit time regularly. But since being exposed to Agile (and ready-made blog templates like Blogger), I can address both concerns and get going now!

So why a blog? Great question, and one I'm not sure of myself just yet. But the last five months have been perhaps the most inspired months of my professional life! I hope to chronicle my journey, highlight the epiphanies as they come, continuously improve my understanding of Agile, and who knows.....maybe even help someone newer to Agile than me.

I modestly hope to provide something of interest to most visitors, maybe attract some interesting feedback, and evolve this blog's purpose as and when needed.

With both trepidation and conviction...

Paul Arrowood
Agile Coach and Scrum Master