TradeMeLong


<< return to main page

Senior Software Test Analyst – Trade Me (Dec 2014 - Aug 2016)

References
  • 2016 Trade Me Motors - Jason Cullum (Reference, Annual Review)
  • 2016 Mike Berry - Delivery Mgr

Part of a small squad (a cross-functional agile team) testing software changes to the iconic NZ Trade Me website (Motors Group), covering:
  • context-driven, tool-assisted exploratory testing using session- and thread-based techniques
  • testing DB, UI, API, and architectural changes
  • leading the deployment of changes using Trade Me's continuous integration and continuous delivery processes
  • Splunk system monitoring
  • agile methods and squad mastering
  • test automation for API (Ready!API/SoapUI) and UI changes (Tractor/Protractor) using BDD with Gherkin syntax
  • test planning and peer test reviews
  • visual test tools (e.g. mindmaps for test planning, video capture of test sessions)
  • Jira for test plans, managing test sessions, and defect workflow; Confluence wiki for test practice documentation
  • active contributor to test and agile guilds
  • new staff induction and junior staff support

After my first professional testing role at LeasePlan, I was really excited to join the Trade Me Motors team as a senior test analyst. Trade Me is a large-scale internet organisation with a well-established test guild and a reputation for its agile implementation, all things I was very keen to learn more about.

Development at Trade Me

Squads take ownership of implementing the necessary changes to the full technology stack to deliver new features and projects, from inception through to deployment in production:
  • Database
  • System Architecture
  • API
  • UI
  • The squad is responsible for the story's design, implementation, testing, and deployment to production.
  • Development is performed on short feature branches using Mercurial. When stories are ready to be deployed they are merged into the integration branch and then the release branch, before being deployed to production and eventually merged back into the default trunk of the code.
  • All code changes require code reviews.

Agile at Trade Me

Trade Me was my first introduction to agile: working in a small cross-functional team within the Motors vertical, as part of the Motors Test Chapter, and supported by the wider company test guild.
  • Trade Me uses the Spotify model of squads, chapters and guilds.
  • Squads were usually two developers, one tester, and half a BA, with access to design. The product owner (PO) provided direction but sat just outside the squad.
  • Most squads are product-facing, but a number of squads provide internal technical and support services to the product squads (DB, Platform, API, Automation, Code Health, etc.). Squads are trusted to ask for assistance when needed and to reach out to others where there are shared or overlapping responsibilities.
  • Trade Me has an active agile guild, and squads are encouraged to use Scrum, Kanban, or any other mix of methodologies. Squads hold regular retrospectives to genuinely review their processes, drop activities that are not useful, and continually look for new ways to improve.
  • As squad master I facilitated processes and meetings for project inception, story grooming, planning & estimation, retrospectives, daily standups, and the squad whiteboard.

Testing at Trade Me

  • The test guild at Trade Me is a strong advocate for Context Driven Testing
  • Testers are expected to consider a wide scope of testing, looking for anything that might surprise someone who mattered. We would contribute to discussions about UX and UI designs, system architecture, database and code structure, as well as verifying that the solution behaved as intended by the various designs.
  • At all times testing needs to be efficient and provide value for the resources spent, focusing on the highest risks and leaving acceptable risks untested.
  • Testing discussions usually start at project inception, and testing requirements are discussed as part of grooming, planning and estimation.
  • A story's implementation journey starts with the "Three Amigos" meeting of BA, Dev & Test. This helps us understand the problem we're trying to solve.
  • A "Test Notes" discussion would usually follow. Here we'd discuss the proposed solution and the kind of testing we'd like to do. This helps align Dev and Test efforts, and reduces rework.
  • Now the tester can scope out a rough session/thread based test plan, perform risk assessments, and investigate the tools that might be needed to perform the testing.
  • Test plans are put into Jira as a subtask, and if required, a mindmap created and attached.
  • Devs are encouraged to share builds as early as possible, so the tester can start exploring the implementation, filling in details on the test sessions/map, and giving early test feedback.
  • Once initial dev is complete, I would ask for a demo and code walk-through on my PC. This confirms the build runs on my machine, shows which modules were changed, and lets me ask what else might be affected. I would also show and discuss the test plan and map.
  • By this stage the test plan should be reasonably well formed, the highest risks tested and we are checking each of the test ideas in the sessions/threads.
  • The test plan always requires a peer review. The level of the review is determined by the squad, and the risks identified.
  • Story bugs found during testing are communicated to the dev, usually verbally, and tracked simply in the Jira case to ensure they are addressed prior to deployment.
  • Once a story's implementation & testing are completed to a satisfactory level it is ready to start the deploy process.
I performed all of the above routinely as part of my senior test analyst role.

Deploy Process at Trade Me

The deploy process is where a story completed on a feature branch is deployed to production. Deployments are shepherded by the testers, with active support from the platform team and the story's developers.
  • The completed feature branch is merged into the integration branch (at any time) and then promoted to a test environment and tested.
  • Most weekdays there are two two-hour deploy windows. Just prior to the start of a deploy window, a release candidate is built from the head of the integration branch; this batches up all the integrated change sets since the last deploy. The release is promoted to, and tested in, the stage environment.
  • The release candidate is then deployed to production and errors actively monitored for a short period of time.
  • Once the deploy is confirmed, the release is merged to the default trunk of the code.
At times multiple components need to be deployed, and the sequence of deploying them is important.
During the steps above, the testers are responsible for monitoring and making a number of checks to ensure the new build is good to deploy to production. If an urgent issue occurs, the deploy lead can choose to roll back or to ask the squad for a hot fix.
I have deployed new stories routinely and been a test deploy lead many times at Trade Me.

Tools used at Trade Me

Your brain! First and foremost you were expected to use your intelligence to solve problems and to continually improve the processes and tools used.
  • Ready!API for API testing and automation (a UI for SoapUI, which also allowed custom Groovy scripts)
  • Jira for managing cases/stories, test plans/session charters, test progress, and issue (bug) tracking
  • MindMup, XMind, and SimpleMind for mindmapping
  • Confluence Wiki for storing anything that might be useful for others, eg implementation details, how-to's for testing, common testing processes
  • Mercurial for version control and deploy scripts (CMDer, Beyond Compare)
  • Tractor for automating the new Angular user interface for Trade Me (a UI for Protractor); a minimal spec sketch follows this list
  • Powershell scripts
  • Icecream screen capture
  • Loads of Chrome extensions (TM API Tester, CJS Custom JavaScript, Bug Magnet, Clear Session, Responsive Web Ruler, WASP, etc.)
  • Microsoft SQL Server Management Studio for querying databases and profiling stored procedure calls
  • Splunk for live error analysis and error graphs
  • Fiddler & Wireshark for network traffic capture
  • Developer tools on tier 1 browsers
  • MS Office
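
To give a flavour of what the Tractor/Protractor UI automation looked like, below is a minimal Protractor spec sketch written in TypeScript. It is an illustration only: the page path, CSS selector, and expected text are hypothetical, not Trade Me's actual code.

  // listing-title.spec.ts - minimal Protractor spec sketch (hypothetical page and selectors)
  import { browser, element, by } from 'protractor';

  describe('Motors listing page', () => {
    beforeEach(async () => {
      // Protractor waits for the Angular app to settle after navigation
      await browser.get('/motors/listing/12345');
    });

    it('shows the listing title', async () => {
      // Locate the title element by a hypothetical CSS class and assert on its text
      const title = element(by.css('.listing-title'));
      expect(await title.getText()).toContain('2014 Toyota Corolla');
    });
  });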

API First Policy

Trade Me started in 1999, when APIs weren't part of the IT landscape, and its flagship desktop browser website did not implement an API. Since then we have seen the rise of mobile devices and, with it, the desire to use APIs to implement business logic separately from a proliferation of user interfaces and technologies. Trade Me now has a number of mobile apps for Android and iOS devices, and several business-to-business integrations, all connected via its API. The desktop website has always been the most fully featured of the applications, and most new features were developed first (or only) for the desktop website. This was becoming a serious drag on implementing the mobile solutions, so the company adopted an API First policy: any new feature destined for the desktop website must first be implemented in the Trade Me API, so that it can also be implemented in the mobile solutions when needed.
This meant that all product squads were required to develop and deliver API solutions, and API testing was part and parcel of my routine work. The API testing was divided between tool-assisted exploratory testing, using an in-house API test tool, and API automation cases written in SoapUI (part of Ready!API), which were then merged into the general API automation code repository.
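
To give a sense of what an API-level check involves (the actual automation used Ready!API/SoapUI with Groovy scripts, alongside the in-house exploratory tool), here is a minimal sketch in TypeScript; the endpoint URL and response fields are hypothetical.

  // api-check.ts - minimal API check sketch (hypothetical endpoint and fields); Node 18+ for built-in fetch
  async function checkListingApi(): Promise<void> {
    const response = await fetch('https://api.example.com/v1/listings/12345');

    // The endpoint should answer 200 OK
    if (response.status !== 200) {
      throw new Error(`Expected HTTP 200, got ${response.status}`);
    }

    // Basic shape checks on the JSON payload
    const listing = await response.json() as { id: number; title: string };
    if (listing.id !== 12345 || typeof listing.title !== 'string') {
      throw new Error('Listing payload did not match the expected shape');
    }

    console.log('API check passed:', listing.title);
  }

  checkListingApi().catch((err) => {
    console.error(err);
    process.exit(1);
  });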

Projects

Motors Price Change

Partway through 2015, Trade Me announced to the NZX (where Trade Me is a listed company) that the motors pricing was changing, effective in just over two weeks. That put a fixed deadline on the change. To make matters harder, the pricing structure was being changed from three tiers to four. This was now a high-profile, risky, time-constrained project with potential for significant financial and reputational impacts. I had only been with the company for a few months, and there were very few testers at the Auckland office with any more experience.
This project was scary from the outset, but with help from my newly joined test chapter lead, Jason Cullum, we set out to implement and test the changes. It was clear we were not going to be able to cover as much of the testing scope as we would have liked, so the task became determining which areas to cover, which areas we would not be able to cover, and communicating the planned test coverage to the wider team for feedback and buy-in. We decided to draw up a large mindmap for this, and it proved a very good visual tool to convey the information. We also looked for opportunities to call in help with specific parts, and we were able to borrow a tester from the Christchurch office for a few days.
In this project I was the lead tester, and managed the testing and comms with guidance and support from my chapter lead. I was pleased that we were able to quickly work out the agreed test coverage and then put the blinkers on to test as much as we could, as fast as we could. In the end the price change went live without major issues; only a couple of minor issues were found in the untested areas, and these were fixed quickly with fast-follow updates.

Premium Packages - Motors Sell Process

The process Trade Me users follow to create (or edit) a listing is called the sell process. A user specifies the listing's details, uploads some photos, can promote the listing with some extras, is shown a confirmation screen and presto! the listing is live for people to look at and buy. Parts of the sell process code are shared across the entire Trade Me site. The promotional extras include: a feature to show the listing higher in search results, a bold title, a yellow border to highlight the search card, a subtitle, etc. Most of these were shown in an unattractive list on the extras page and were in need of a facelift. The product owner, after investigations, decided to replace the entire concept of extras with a new model of four packages with much nicer graphics and UX. Each package would contain a suite of the extras, growing from the basic package to bronze, silver and gold packages, each with more goodies.
This was a difficult project, requiring changes and testing through the entire stack, from the DB and core shared sell process architecture to a whole lot of work at the UI for the new page design. We found many places where the extras were being referenced, either by name or by their cost, that required updates and testing. There was a plethora of edge cases because the sell process code was also used when sellers edit their listings. Making sure the page displayed properly across the tier 1 browsers and portable devices such as iPads also proved quite hard.
I was trying to help the squad focus on the most common use cases, identifying the value associated with certain features and the cost of leaving small bugs to be fixed later. This was hard, but over time we developed a good model of periodic demos of progress and potential issues to the product owner, where together we decided what was fixed now and what was deferred.
Due to the size and fairly risky nature of this feature it was implemented behind an on/off config key, and we were able to deploy smaller stories to production in a switched-off (aka dormant) state. After several months' work we were finally able to switch the feature on in production, but after a few weeks it showed that it negatively affected TBC
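
The on/off config key mentioned above is essentially a feature toggle. Below is a minimal sketch of the pattern in TypeScript, with a hypothetical key name and a hard-coded config object standing in for a real configuration store.

  // featureToggle.ts - minimal feature-toggle sketch (hypothetical key and config source)
  interface FeatureConfig {
    premiumPackagesEnabled: boolean; // hypothetical on/off key for the new packages page
  }

  // In production the value would come from a central config store; a literal stands in here
  const config: FeatureConfig = { premiumPackagesEnabled: false };

  function renderLegacyExtrasList(): string {
    return 'Feature, Bold title, Highlight, Subtitle';
  }

  function renderPremiumPackagesPage(): string {
    return 'Basic / Bronze / Silver / Gold packages';
  }

  // Dormant code path: the new page ships dark and is switched on when ready
  function renderExtrasPage(): string {
    return config.premiumPackagesEnabled
      ? renderPremiumPackagesPage()
      : renderLegacyExtrasList();
  }

Shipping the feature dark like this is what allowed the smaller stories to be merged and deployed to production without exposing the unfinished page to users.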

DealerBase to SalesForce API (Automation)

Metrics-Driven Development of the Motors Sell Process Extras Page

The formula for calculating the confidence of the difference between the control and the variant (this should go into a blog at some point)
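
The write-up never made it here, but the usual way to compute this for conversion-style metrics is a two-proportion z-test; the sketch below is my assumption of the intended approach rather than the exact formula used at Trade Me.

  \[
    z \;=\; \frac{\hat{p}_v - \hat{p}_c}{\sqrt{\hat{p}\,(1-\hat{p})\left(\frac{1}{n_c} + \frac{1}{n_v}\right)}},
    \qquad
    \hat{p} \;=\; \frac{x_c + x_v}{n_c + n_v}
  \]

Here x_c and x_v are the conversion counts and n_c and n_v the sample sizes for the control and the variant, so the observed rates are p̂_c = x_c/n_c and p̂_v = x_v/n_v; the confidence in the observed difference is then read off the standard normal distribution (e.g. |z| > 1.96 for 95% two-sided confidence).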

Statistical Significance for MDD

Preview - a new Motors Home Page


<< return to main page