There’s a renewed interest in test teams and tools, and Microsoft’s Chuck Sterling can think of two good reasons why this is so.

First, he said, development teams have changed around agile processes and continuous delivery of software, but test teams are still using processes that no longer integrate with how developers work.

Second, with the emergence of the cloud and the predominance of Web applications, the pace of development is now roughly 10 times faster than the 18-month or two-year release cycles the industry used to see. Testers need tools that work at this speed and integrate with how development teams now work.

The term “agile testing” was defined in 1986, Sterling noted, but is only now gaining momentum. “There’s a dramatic change” under way in the testing culture, he said. “Test teams were looking at the availability of a box, and now they’re looking at the reliability of a service…and whether or not you had the right idea. Testers are looking at things now with a customer-colored view. ‘Is that what the customer wants?’ That’s a huge change.”

Yet the more things change, the more they stay the same. For instance, 80% of software testing done today is still performed manually, according to Voke analyst and founder Theresa Lanowitz. “Testers still struggle with the cost/quality/schedule triangle. They still spend a lot of time documenting test cases, and provisioning test environments, and reproducing defects, for instance. They need to optimize their time to do more strategic types of things.”

Both Lanowitz and Sterling agree that testing needs to drive quality upstream. “You can’t test in quality at the end,” Sterling said. “By then, if you find the software architecture is poorly defined, or there are major coding problems, it’s almost too late to do anything about it. You want to drive testing back into the ideation. It’s no longer at the end.”

One way to better integrate dev teams and test teams is to get them onto a single point of reference or a common language. “Having a common language is a big benefit,” Lanowitz said. “That enables a high degree of collaboration.” It’s to an organization’s advantage, she said, to have the automated test engineers on the test side working in the same language the codebase was developed in. With that, you get the added bonus of transparency, she pointed out.

Testing tools built with this higher level of integration can be particularly helpful in getting the teams working together, and source-code analysis is one important area. “Developers should write automated unit tests, and then those become part of the test suite,” said Lanowitz. “The developer can say, ‘Here’s the source code. Here’s the level of quality.’ And the tester can go in, find a defect, replay it, and work with the developer to resolve it in a product-equivalent environment.”
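To make that workflow concrete, here is a minimal, hypothetical sketch, written in Python’s built-in unittest module rather than any tool mentioned in this story, of a developer-written unit test that ships with the source code and a tester-added case, kept in the same suite, that reproduces a reported defect:

import unittest

def apply_discount(price, percent):
    """Hypothetical developer-owned function under test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTests(unittest.TestCase):
    # Developer-written unit test: checked in alongside the source code
    # and run as part of the shared test suite.
    def test_basic_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    # Tester-added case reproducing a reported defect: a zero-percent
    # discount should return the original price unchanged. Keeping the
    # reproduction in the same suite lets the tester replay it and the
    # developer fix it against the same code in the same environment.
    def test_zero_percent_discount(self):
        self.assertEqual(apply_discount(19.99, 0), 19.99)

if __name__ == "__main__":
    unittest.main()

Because both cases live in one suite, written in the same language as the codebase, either side can run them and see the same result, which is the kind of common reference point Lanowitz describes.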

Lanowitz, who will be speaking at the STAR East Conference this week, calls Microsoft’s Visual Studio 2010 Test Professional tools “the best testing product no one has ever heard of.”

“They use virtualization well,” she said, pointing out that Microsoft has the first automated virtual lab management capability built into an ALM tool. “People are still building physical test labs. Microsoft has come out with tooling to free testers up from the manual stuff,” which makes them more efficient.

This culture shift might help testers shed their image as impediments to software releases and instead be seen as facilitators of quality releases. “How do we work with the development team in this brave new world?” Sterling asked. “The role of testers in the cloud becomes a huge value-add. Testers triage the feedback data, and turn around to tell developers, ‘Your customers want features 5, 10 and 43.’

“All of a sudden,” he said, “your success in getting your next check is pinned on testers getting you the feedback data to deliver what those customers want.”

Or, as Lanowitz simply put it: “It’s a pretty exciting time in testing.”

David Rubinstein is editor-in-chief of SD Times.