Fiserv Auckland - Intermediate Software Test Engineer

From Vincent's CV Wiki

Jan-2017 - Apr-2020

Fiserv

Fiserv Auckland is responsible for developing mobile apps utilized by over 2000 banks (mainly in the USA), serving more than 8 million active users. Additionally, they manage multi-tier and multi-tenanted Web and API integration servers interfacing with core online banking (OLB) systems and third-party platforms. Fiserv's solutions offer extensive configurability, allowing for customization of features and branding. Operating within the stringent and risk-averse banking domain, reliability and quality are paramount. Testing at Fiserv was complex and challenging, yet the role was rewarding and intellectually stimulating.


Software Developer in Testing - 2019-2020

In this role I assisted with integration testing the mobile API server, which was used by the mobile apps as a gateway to a network of core online banking systems (OLBs). Each OLB had its own interface contract, and each served multiple financial institutions (FIs). Due to the expense and difficulty of replicating the OLB systems, only three integrated testing environments were created. These test environments were subject to frequent configuration changes, used by many staff, and tightly controlled from the USA. Despite these difficulties and the non-deterministic nature of testing in these environments, integration testing remained essential.

To streamline integration testing and monitor environment readiness, I spearheaded the development of the Postman Testrunner Framework (PTF), a flexible solution capable of dynamically executing complete user scenarios through various OLBs and FI/user configurations.

Development of Postman Collection

Using tools like Fiddler, Burp Suite CE, and MITM Proxy, we captured the API calls made by the mobile app and then built a comprehensive Postman collection of requests. Each user scenario was a sequence of calls, with each call performing an action and storing relevant data in Postman environment variables. I emphasised obtaining data dynamically from the OLB to minimize reliance on potentially stale data.
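The pattern of a call storing data for later calls can be sketched as a Postman "Tests" script. The request and field names here are illustrative, not the real collection's; in Postman the `pm` object is provided by the sandbox, so a minimal stand-in is defined to make the snippet self-contained.

```javascript
// Minimal stand-in for Postman's sandbox `pm` object (illustrative only).
const pm = {
  environment: {
    _vars: {},
    set(key, value) { this._vars[key] = value; },
    get(key) { return this._vars[key]; },
  },
  response: {
    // Example response from a hypothetical "list accounts" call.
    json: () => ({ accounts: [{ id: 'ACC-123', type: 'checking' }] }),
  },
};

// Pull live data out of the response instead of hard-coding it, so later
// requests in the scenario never depend on potentially stale fixture data.
const body = pm.response.json();
pm.environment.set('accountId', body.accounts[0].id);

console.log(pm.environment.get('accountId')); // ACC-123
```

Later requests in the scenario then reference `{{accountId}}` rather than a fixed test value.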

Architecture of the Postman Testrunner Framework (PTF)

The PTF automatically orchestrated the calls in the correct order to execute the user scenarios reliably. It read an external JSON file specifying a sequence of steps called userActions; each userAction referenced a request from the collection and contained response handlers, one per HTTP response code, that set the next userAction to perform. Effectively, the PTF was a simple state machine. The PTF also implemented a simple nested JSON data syntax for storing data such as user credentials and FI connection settings. Passwords were encrypted when stored and decrypted at run time.
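The state-machine idea can be sketched as follows. The userAction structure, request names, and handler shape are assumptions for illustration, not the actual framework's schema, and the request sender is stubbed out.

```javascript
// Sketch of a userActions definition: each action names a request and maps
// HTTP status codes to the next action; null ends the scenario.
const userActions = {
  login:        { request: 'POST /login',    onResponse: { 200: 'listAccounts', 401: null } },
  listAccounts: { request: 'GET /accounts',  onResponse: { 200: 'transfer',     404: null } },
  transfer:     { request: 'POST /transfer', onResponse: { 200: null } },
};

// In the real framework this would execute the Postman request via Newman;
// stubbed here to always succeed.
function sendRequest(request) { return 200; }

// Walk the state machine from a starting action until a handler returns null.
function runScenario(start) {
  const executed = [];
  let current = start;
  while (current) {
    const action = userActions[current];
    executed.push(current);
    const status = sendRequest(action.request);
    current = action.onResponse[status] ?? null;
  }
  return executed;
}

console.log(runScenario('login')); // [ 'login', 'listAccounts', 'transfer' ]
```

Because each transition is driven by the actual response code, an unexpected status simply routes the scenario to a different (or terminal) state instead of crashing the run.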

Custom Development and Integration

The PTF was implemented using Newman in a Node.js project, with a custom JavaScript reporter developed to process events emitted by Newman during execution. This allowed for real-time capture of results and detailed logs, providing clear insights into failures and partial successes. Results were sent to the PTF dashboard, as well as to a dedicated Splunk instance for comprehensive monitoring and analysis. The PTF dashboard and Splunk implementations are detailed in the sections below.

The PTF was executed inside a shell terminal on TFS build agents, and used shell environment variables to provide the PTF with FI settings and user credentials. The PTF was designed to be able to execute in parallel, and TFS was configured to run all users concurrently once per hour.

Development of PTF Dashboard

I used Node.js with Express.js and Pug to create

  • an API for receiving events from the PTF, and
  • a Web UI to display a snapshot of the latest results in a tabular dashboard.

The API was designed to process data from concurrent PTF executions, and the Web UI updated in real-time to give immediate feedback about the environment health from multiple user perspectives. The fast feedback for multiple users was particularly useful following a deployment of the mobile API server.

In addition to pass and fail, the dashboard distinguished scenarios that:

  • could not run, e.g. a user with only one account cannot attempt to transfer money between accounts.
  • passed with a warning (⚠) when only partially successful, e.g. fetching a list of bill payments returned no items because none had been made.
  • were not supported by the FI/OLB.
  • had not run, e.g. skipped, or still waiting to be run.

For each result cell I used hover and click interactions to reveal details.

Link to a screenshot of the PTF dashboard

Splunk Enterprise Setup

I set up a dedicated instance of Splunk Enterprise to store and analyze trends in the PTF data (results, logging, and full API requests and responses). This involved configuring indexes, HTTP Event Collector (HEC) endpoints, and user access permissions, and managing VM storage requirements. I developed dashboards to visualize historical PTF data, utilizing shades of green, red, and grey to represent pass, fail, and indeterminate results, with the shading used to differentiate users. These grids provided valuable insights into environment health, user status, feature performance, and OLB status. Click-through functionality was added to facilitate investigations, drilling down through the layers into increasingly detailed views of the data.
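Shipping a PTF result to Splunk via HEC looks roughly like the following. The host, token, and index names are placeholders, not the real configuration; HEC expects a POST to `/services/collector/event` with a `Splunk <token>` Authorization header and the payload wrapped in an `event` field.

```javascript
// Build (but do not send) an HEC request for one PTF result event.
function buildHecRequest(result) {
  return {
    url: 'https://splunk.example.internal:8088/services/collector/event',
    headers: {
      // HEC authenticates with a per-input token, not user credentials.
      Authorization: 'Splunk 00000000-0000-0000-0000-000000000000',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      index: 'ptf_results',   // dedicated index configured for the PTF data
      sourcetype: 'ptf:result',
      event: result,          // result, logs, or full request/response payload
    }),
  };
}

const req = buildHecRequest({ user: 'alice', scenario: 'transfer', result: 'pass' });
console.log(JSON.parse(req.body).event.result); // pass
```

Routing results, logs, and raw API traffic to separate sourcetypes keeps the dashboard searches cheap while still allowing drill-down to the full request/response when investigating a failure.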


Link to screenshot of the feature grid

Software Test Engineer - 2017-2018

At Fiserv, I began as a QA member within agile teams responsible for implementing changes across various mobile banking solutions.

My responsibilities included:

  • Testing new features for mobile apps, and conducting cross-device regression checks.
  • Contributing to the development of the C# Specflow API automation suite for mobile API servers.
  • Deploying environments and modifying configurations using Octopus.
  • Testing a banking Web App hosted on dedicated hardware, where I leveraged Powershell scripts for configuring and automating deployments.

Tools and Technologies

At Fiserv I used the following tools and technologies:

  • Postman and SoapUI for API testing.
  • Splunk for log analysis and monitoring.
  • Team Foundation Server (TFS) for version control (Git repos) and continuous integration (build server). (Note: TFS has since been rebranded as Azure DevOps.)
  • Powershell for automation tasks.
  • Octopus as a deployment automation tool.
  • Specflow and C# for API automation using BDD.
  • Testing across various domains, including mobile functional, accessibility, iOS upgrade, and platform API functional testing.
  • Fiddler, Burp Suite CE, and MITM Proxy for capturing network calls.
  • XMind for mind mapping.
  • Confluence for product and project documentation.
  • SQL Server Management Studio for data queries, test data setup, and testing SQL scripts.
  • Visual Studio for code related tasks
  • Microsoft Test Manager for managing test cases and suites, and recording test progress.

Agile

Fiserv used the Scaled Agile Framework (SAFe) to govern its Agile practices. Squads were typically about ten people in size.

The squad

  • Engaged in Agile rituals - stand-ups, backlog grooming, estimation, planning, demos, and retros.
  • Owned the development lifecycle - story design, implementation, testing, and integration.
  • Followed Gitflow (on TFS) - feature branches for development, with changes integrated into release-train branches.
  • Contributed to quality checking at various stages before code changes were deployed to production.