From: Brian Marick
Date: Wed, 14 Aug 2002 12:04:57 -0500
Subject: XP/Agile Universe trip report

XP / Agile Universe was held August 4-7 in Chicago, Illinois, USA. It was an outstanding conference for testers. I venture to say it's the start of a beautiful relationship. Here's my trip report. Other people on this list were there. I hope they chime in.

The week began with a workshop organized by Bret Pettichord and me. It followed roughly the format of Cem Kaner's, Brian Lawrence's, and Elisabeth Hendrickson's LAWST workshops, in that it was organized around people telling stories of things they'd done, with the audience asking both clarifying and more probing questions.

The group, which included James Bach, Ward Cunningham, Lisa Crispin, Janet Gregory, Martin Fowler (later), and about ten others, was quite friendly and productive. Most people worked on XP projects. There was no one there from the other named agile methods (Scrum, DSDM, etc.). There were both people who thought of themselves as testers and people who identified themselves as programmers. We had no people from the XP Customer role.

Bret and I may be writing up a complete workshop report. Here's what sticks in my mind, perhaps skewed by my biases:

- The testers seemed to be happier than most, and felt more appreciated. There's more interaction and collaboration between programmers and testers than in conventional projects.

- Despite an emphasis on automated tests, *no one* used commercial GUI tools (for capture/replay, scripted tests, or data-driven tests). There were three overlapping strategies mentioned (there's a sketch of strategy 3 after this list):

  1) Build the system such that a thin GUI talks to the "guts". Use data-driven testing (typically in a tabular format) to drive the guts.

  2) Same as (1), but write the tests in a programming language (Ruby, Java).

  3) Like (2), but with the addition that the GUI is also tested "from within" by posting events to the event queue and querying GUI state. The interesting twist here is that the same test can both drive the GUI (as an external GUI tool could) and also poke at the guts.

  Note that all these strategies work because the programmers are willing to add testability support to the product. At the conference, four people told me they were willing to (or about to) release their product testing frameworks as open source code. (Three of them used strategy 3, one strategy 1.)

- There is strong interest in writing product tests first and using them to guide development. (I've started to call these "coaching tests".) I venture to say that's considered ideal, but that people are struggling with making it work. (The biggest problem seems to be getting the tests ready fast enough.)

- There was a lot of interest in James Bach's (and others') notions of manual exploratory testing. Ward Cunningham, in particular, seemed quite taken with James's ideas. Because of Ward's well-deserved reputation in the XP world, I consider this a wonderful sign. There's much work to be done, but I feel now that it'll be easier to get permission to be (partly) non-automated.
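To make strategy 3 concrete, here's a minimal sketch in Java of what testing the GUI "from within" might look like. None of this comes from any particular workshop project: the CounterPanel class, its fields, and the test are made up for illustration, and I'm using JUnit-style assertions. The point is only that one test can dispatch a GUI event and then check both the widget and the guts.

    import java.awt.EventQueue;
    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import javax.swing.JButton;
    import javax.swing.JLabel;
    import junit.framework.TestCase;

    /** A made-up thin GUI: one button that increments a counter shown in a label. */
    class CounterPanel {
        final JButton addButton = new JButton("Add");
        final JLabel countLabel = new JLabel("0");
        int count = 0;   // the "guts" -- deliberately trivial here

        CounterPanel() {
            addButton.addActionListener(new ActionListener() {
                public void actionPerformed(ActionEvent event) {
                    count = count + 1;
                    countLabel.setText(String.valueOf(count));
                }
            });
        }
    }

    /** Strategy 3: drive the GUI from within, then query both GUI state and the guts. */
    public class CounterPanelTest extends TestCase {
        public void testClickUpdatesLabelAndGuts() throws Exception {
            final CounterPanel panel = new CounterPanel();
            // Run the click on the event dispatch thread, the way a real
            // user gesture would arrive, and wait for it to be processed.
            EventQueue.invokeAndWait(new Runnable() {
                public void run() {
                    panel.addButton.doClick();
                }
            });
            assertEquals("1", panel.countLabel.getText()); // query GUI state
            assertEquals(1, panel.count);                  // poke at the guts
        }
    }

The open source frameworks people mentioned presumably wrap this sort of plumbing up more conveniently; the sketch shows only the bare mechanism.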
Because the workshop was only one day, we didn't have time to hammer out a sweeping consensus statement. We did reach consensus on four points:

- Agile methods enable us to use tests as specification by example. (Editorial comment: "Specification by example" is Martin Fowler's phrase. I think it's a particularly good one. It says to me that you don't need 'totalizing' specifications that tell what the program must do for all inputs. Instead, you give the programmers concrete examples - tests - and trust that they, assisted by conversation, will be able to generalize.)

- Documented acceptance tests are a useful subset of the value a tester can add to a project. (Editorial comment: This implies that manual exploratory tests are another one.)

- XP-style acceptance testing is intended to measure progress. (Editorial comment: You achieve a steady, sustainable pace of development by passing ever more tests.)

- "Agile acceptance testing" is acceptance testing on an agile project, for purposes of this discussion. (Editorial comment: Huh? This was probably the outgrowth of discussions about how "acceptance testing" isn't such a good name. We adopted it for the name of the workshop because it's in common use on XP projects.)

The next day, Martin Fowler (conference chair of the Agile Universe half) devoted some words of his keynote to testing. He said that he'd been happy to discover a school of testing (the context-driven school) that's nicely compatible with agile methods, named some leaders of the school (James, Bret, Cem Kaner, me), said that two of us were still at the conference (James had left), and said that he hoped this was the start of a lot of cross-fertilization. This, to me, was another "we're in!" moment.

Much discussion of testing happened throughout the conference, including confirmations of the workshop points. A couple of miscellaneous additional notes:

- Non-testers at the conference like to talk about testing *activities* but are less comfortable with a separate testing *role*. Testers, I think, tend to resist that attitude, fearing the consequences of a loss of independence. I think those consequences are real, but I'm now more inclined to explore alternatives.

- Martin Fowler points out that the traditional analyst role also feels out of place on agile projects. How can analysis and testing be merged into one activity, specification by example?

Bret convened an Open Space on the role of a tester in an agile project. Some things that caught my attention:

- The focus seems to be shifting from "finding bugs" to "providing information". A nice phrase: "[the goal of the tests is to] ensure that the current health of the product is known at any time." (This focus shift is not new in some parts of the testing world.)

- Here (and throughout the conference) there's an emphasis on tests that are understandable by a customer (though not necessarily written *by* the customer). More generally, think of automated tests as text to be read and talked about - by testers, by programmers, by customers (there's a sketch of the idea just after this list). In this way, tests participate in the desire to make communication permeate agile projects (see Kent Beck's notion that code is about communication with programmers [_Smalltalk Best Practice Patterns_]).

- Testers can help the customer in the difficult job of representing the interests of multiple parties.

(For more on Open Space: .)
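To give a feel for what "tests a customer can read" might look like, here's another made-up sketch. The scenario (an order total and a discount rule), the Discounter class, and the table layout are invented for illustration; the frameworks people described at the workshop varied. The idea is that the rows are the specification by example - something a customer can scan and argue with - and a small driver turns them into checks against the guts.

    import junit.framework.TestCase;

    /** Made-up business rule: orders of $100 or more get 5% off. */
    class Discounter {
        public double discountedTotal(double orderTotal) {
            return orderTotal >= 100.00 ? orderTotal * 0.95 : orderTotal;
        }
    }

    /** Strategy 1 flavor: the table *is* the test; the customer reads the rows. */
    public class DiscountRuleTest extends TestCase {
        // order total | expected charge
        private static final double[][] EXAMPLES = {
            {  50.00,  50.00 },   // below the threshold: no discount
            { 100.00,  95.00 },   // at the threshold: 5% off
            { 200.00, 190.00 },   // above the threshold: 5% off
        };

        public void testDiscountExamples() {
            Discounter discounter = new Discounter();
            for (int i = 0; i < EXAMPLES.length; i++) {
                assertEquals("row " + i,
                             EXAMPLES[i][1],
                             discounter.discountedTotal(EXAMPLES[i][0]),
                             0.001);
            }
        }
    }

In a real project the rows might well live somewhere friendlier than Java source (the workshop stories mentioned tabular formats); the reading-and-talking-about part is the same either way.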
Lisa Crispin hosted a panel discussion, with Janet Gregory, Ron Jeffries, Jeff Canna, Glen Alleman, and Ken Auer, on whether XP makes testers extinct. It actually quickly opened up to include comments from many in the room. In addition to being generally friendly and productive, Lisa's panel was important to me personally. I thought we ended up with more people wishing harder for tests as specification by example, written in advance. But I also had a chance to start to correct an impression I've given - that I think that tests should *replace* Programmer conversation with the Customer, rather than *improve* it. (It was round about that panel that I started toying with calling such tests "coaching tests", rather than "guiding tests". "Guide" can have the connotation of someone who talks at you and tells you where to go because you're too dumb to go safely on your own. A "coach" is someone who helps you realize your potential. Also, "coaching test" has the advantage of having no obvious meaning, so people will find it harder to jump to conclusions [other than that this guy Marick speaks in tongues].)

I also would like to thank Ron for being gracious, and for writing down something I said. (What was it?)

Kay Johansen convened an Open Space (think Birds-of-a-Feather) session on Agile Testing Techniques. Their notes: They voted on their top three points. They were:

# Incorporate your test team into the entire team
# Start testing on day one
# Take the time and effort to build a relationship of understanding between testers and developers (and, by extension, with all on the project team)

(Notice how none of these are "techniques" in the sense of "cause-effect graphing".)

I convened an Open Space session on taking advantage of the momentum we'd achieved at this conference. I won't summarize that, but I'll make three points:

- There's a lot of interest. We people-who-do-testing should go to agile conferences. There are programmers and methodologists who want to talk with us.

- We should go in an organized way. It was good that Bret and I had the workshop. The Open Space gave us a way to get people talking that doesn't happen if you just rely on blundering into people with similar interests.

- The people at this conference are big on hands-on experience. For example, XP Fest is a miniature XP project, with a customer, people programming in pairs, etc. Next conference, we should do both automated coaching tests and manual exploratory tests at the XP Fest. And bring your laptop with sample tests to show people. They will be interested in looking.

--
"Act always so as to increase the number of choices." -- Heinz von Foerster

--
Brian Marick, marick@testing.com
www.testing.com - Software testing services and resources
www.testingcraft.com - Where software testers exchange techniques
www.visibleworkings.com - Adequate understanding of system internals

[I've edited this slightly, to improve the chronology and incorporate some comments Brian made in a later email. -- Bret Pettichord.]