Agile Development: Fragile Deliverables

Agile Development is great; it’s how we build for most of our clients. But it comes with its challenges:

  • Weak Acceptance Criteria (Documentation Aversion): Stories created in a sprint often lack enough detail for QA to build proper acceptance criteria.
  • Inadequate Test Coverage: With continuous integration and changing requirements, it can be easy to miss critical tests for a requirement.
  • Inadequate API Testing: Most, if not all, applications start with an API layer, which client-side devs quickly start consuming without adequate acceptance testing.
  • Regression Issues/Frequent Builds: In 2-week sprints, code is deployed multiple times a day, which increases the risk of breaking existing features.
  • Performance Bottlenecks: As features are added every sprint, the code grows more complex and can introduce performance bottlenecks that go unnoticed if not properly tested for.

The “V” Model for QA is proven and still a very viable practice. We have tailored it to suit the needs of our current-day product development in an Agile environment.


Business Requirements Stage (Weak Acceptance Criteria or Documentation Aversion)

  • It is not practical to define sufficiently detailed user stories within a sprint and still expect dev and test to meet timelines with quality.
  • SourceFuse recommends a dedicated Business Analyst who works with the product owner to ensure that the user stories in the backlog are detailed enough for the scrum team to plan them effectively in upcoming sprints.
  • The business analyst then performs UAT at the end of a sprint and provides a sprint demo to the onsite product owner to ensure quality.


Software Requirements Stage (Inadequate Test Coverage)

  • Once user stories have been detailed and added to a sprint, the dev team breaks them into sub-tasks (a JIRA feature) while QA creates system test cases for the complete user story and its sub-tasks (if needed).
  • A common challenge in Agile dev is visibility into test coverage for the current sprint as well as Regression testing.
  • SourceFuse maps each user story to its respective test cases. This serves as a requirements traceability matrix and also gives an easy snapshot of testing progress at any point in the sprint, making it possible to answer the question – Is this ready for deployment?
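The story-to-test-case mapping above can be sketched as a simple data structure. This is an illustrative sketch only; the story IDs, test names, and statuses are hypothetical examples, not actual JIRA data.

```python
# Minimal sketch of a requirements traceability matrix: each user story
# maps to its test cases and their latest run status.

def sprint_ready(matrix):
    """A sprint is deployable only when every story has at least one
    test case and every test case has passed."""
    for story, tests in matrix.items():
        if not tests or any(status != "pass" for status in tests.values()):
            return False
    return True

# Hypothetical sprint snapshot.
matrix = {
    "STORY-101": {"login succeeds with valid creds": "pass",
                  "login rejects bad password": "pass"},
    "STORY-102": {"cart total updates on add": "fail"},  # blocks deployment
}

print(sprint_ready(matrix))  # → False
```

One failing test case (or an untested story) is enough to answer "not ready", which is exactly the snapshot the matrix is meant to provide.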


Functional API Testing (Inadequate API Testing)

  • A lot of inefficiency creeps into the handoff of APIs from server-side devs to client-side devs (web/mobile).
  • At SourceFuse we recommend building API definitions using the Swagger (OpenAPI) standard. The same Swagger doc is used to import test cases into the SoapUI tool for functional testing of APIs.
  • This best practice ensures consistently documented and tested APIs are handed to the client-side developers.
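As a hedged illustration of what functional API testing against a Swagger definition checks, here is a minimal Python sketch. The endpoint, response fields, and stubbed handler are assumptions for the example; in practice the real Swagger doc is imported into SoapUI rather than hand-coded like this.

```python
# Hypothetical fragment of a Swagger-style definition: the 200 response
# for GET /users/{id} must contain these fields.
swagger_fragment = {
    "/users/{id}": {
        "get": {
            "responses": {"200": {"required": ["id", "name", "email"]}}
        }
    }
}

def fake_get_user(user_id):
    # Stand-in for a real HTTP call to the API under test.
    return 200, {"id": user_id, "name": "Ada", "email": "ada@example.com"}

def check_response(path, method, status, body):
    """Functional check: does the response carry every required field?"""
    spec = swagger_fragment[path][method]["responses"][str(status)]
    missing = [field for field in spec["required"] if field not in body]
    return not missing

status, body = fake_get_user(7)
print(check_response("/users/{id}", "get", status, body))  # → True
```

Because the same definition drives both the docs and the tests, a server-side change that drops a field fails the functional check before client-side devs ever see the API.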



Automation (Regression Issues/Frequent Builds)

  • Code is updated and pushed multiple times a day, so it is not practical to manually test every existing feature while also testing new sprint features within a single sprint.
  • At SourceFuse we leverage automation testing of every code base:
    • API – SoapUI
    • Web – Selenium
    • Mobile – Appium
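The shape of such an automated regression suite can be sketched in plain Python. The feature functions below are stubs standing in for the real application; in practice SoapUI, Selenium, or Appium would drive the actual API, web, or mobile layer instead.

```python
import unittest

# Stub "features" standing in for existing application behavior that
# must keep working as new code is pushed every day.
def add_to_cart(cart, item):
    return cart + [item]

def cart_total(prices):
    return sum(prices)

class RegressionSuite(unittest.TestCase):
    """Runs on every build, so a push that breaks an existing
    feature fails before it reaches the sprint demo."""

    def test_add_to_cart_preserves_existing_items(self):
        self.assertEqual(add_to_cart(["book"], "pen"), ["book", "pen"])

    def test_cart_total(self):
        self.assertEqual(cart_total([10, 5]), 15)

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=0)
```

Wiring a suite like this into the CI pipeline is what makes multiple deployments a day safe: every existing feature is re-verified automatically on each push.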



Automated Load Testing (Performance Testing)

  • As more and more features are added, the volume and complexity of the code base increase, and the volume of data in the database typically grows as well.
  • Functional testing does NOT test for performance issues that could be caused by slow queries on a large dataset (production size), or when a large number of concurrent users hit the system.
  • SourceFuse performs end-to-end load testing of apps using JMeter and other tools to ensure that no bottlenecks are introduced by sprint deliverables.
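The core idea can be sketched in a toy Python example: fire many concurrent "users" at an endpoint and measure the elapsed time. The endpoint here is a stub; real load tests use JMeter against a deployed, production-sized environment.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_endpoint():
    # Stand-in for a real HTTP request; simulates server work.
    time.sleep(0.01)
    return 200

def run_load(concurrent_users=20):
    """Hit the endpoint with N concurrent users and time the batch."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        statuses = list(pool.map(lambda _: fake_endpoint(),
                                 range(concurrent_users)))
    elapsed = time.perf_counter() - start
    return statuses, elapsed

statuses, elapsed = run_load()
print(all(s == 200 for s in statuses), round(elapsed, 3))
```

Functional tests exercise one request at a time; it is only under concurrency like this that slow queries and contention on a large dataset surface, which is why load testing is a separate stage.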