Fiserv Auckland - Intermediate Software Test Engineer

From Vincents CV Wiki

Jan-2017 - Apr-2020

Intro

Fiserv Auckland is responsible for developing mobile apps used by over 2,000 banks (mainly in the USA), serving more than 8 million active users. It also manages multi-tier, multi-tenanted Web and API integration servers that interface with core online banking (OLB) systems and third-party platforms. Fiserv's solutions are highly configurable, allowing customization of both features and branding. Operating within the stringent, risk-averse banking domain, reliability and quality are paramount. Testing at Fiserv was complex and challenging, yet it proved a rewarding and intellectually stimulating role.


Software Developer in Testing - 2019-2020

In this role I assisted with integration testing of the mobile API server, which the mobile apps used as a gateway to a network of core online banking systems (OLBs). Each OLB had its own interface contract, and each served multiple financial institutions (FIs). Because the OLB systems were expensive and difficult to replicate, only three integrated testing environments existed. These test environments were subject to frequent configuration changes, shared by many staff, and tightly controlled from the USA. Despite these difficulties and the non-deterministic nature of testing in these environments, integration testing remained essential.

To streamline integration testing and monitor environment readiness, I spearheaded the development of the Postman Testrunner Framework (PTF), a flexible solution capable of dynamically executing complete user scenarios across various OLBs and FI/user configurations.

Key Contributions:

Development of Postman Collection

Utilizing tools like Fiddler, Burp Suite CE, and MITM Proxy, we captured the API calls made by the mobile app and used them to create a comprehensive Postman collection of requests. Each user scenario was a sequence of calls, with each call performing an action and storing relevant data in Postman environment variables. I emphasised obtaining data dynamically from the OLB to minimize reliance on potentially stale data.

Architecture of the Postman Testrunner Framework (PTF)

The PTF automatically orchestrated the calls in the correct order to execute user scenarios reliably. An external JSON file specified a sequence of steps called userActions; each userAction referenced a request from the collection and contained a response handler for each HTTP response code, which set the next userAction to perform. Effectively, the PTF was a simple state machine. The PTF also implemented a simple nested JSON data syntax for storing data such as user credentials and FI connection settings. Passwords were encrypted when stored and decrypted at run time.

Custom Development and Integration

The PTF was implemented using Newman in a Node.js project, with a custom reporter developed to process events emitted by Newman during execution. This allowed for real-time capture of results and detailed logs, providing clear insights into failures and partial successes. Results were sent to the PTF dashboard, as well as to a dedicated Splunk instance for comprehensive monitoring and analysis. The PTF dashboard and Splunk implementations are detailed in the sections below.

The PTF was executed in a shell terminal on TFS build agents, with shell environment variables supplying FI settings and user credentials. The PTF was designed to execute in parallel, and TFS was configured to run all users concurrently once per hour.

Development of PTF Dashboard

I used Node.js with Express.js and Pug to create

  • an API for receiving events from the PTF, and
  • a Web UI to display a snapshot of the latest results in a tabular dashboard.

The API was designed to process data from concurrent PTF executions, and the Web UI updated in real-time to give immediate feedback about the environment health from multiple user perspectives. The fast feedback for multiple users was particularly useful following a deployment of the mobile API server.

In addition to pass and fail, the dashboard distinguished several other scenario outcomes:

  • could not run, e.g. a user with just one account could not try to transfer money between accounts.
  • pass ⚠ (partial success), e.g. an attempt to fetch the list of bill payments returned no items because none had been made.
  • not supported by the FI/OLB.
  • not run, e.g. skipped, or still waiting to be run.

Each result cell supported hover and click actions to reveal the underlying details.

Link to a screenshot of the PTF dashboard

Setup of Splunk Enterprise

I set up a dedicated instance of Splunk Enterprise to store and analyze trends in the PTF data (results, logging, and full API requests and responses). This involved configuring indexes, HEC event collectors, and user access permissions, and managing VM storage requirements. I developed dashboards to visualize historical PTF data, using shades of green, red, and grey to represent pass, fail, and indeterminate results respectively, with shading used to differentiate users. These grids provided valuable insights into environment health, user status, feature performance, and OLB status. Click-through functionality allowed investigations to drill down through the layers into increasingly detailed views of the data.


Link to screenshot of the feature grid

Software Test Engineer - 2017-2018

At Fiserv, I began as a QA member within agile teams responsible for implementing changes across various mobile banking solutions.

My responsibilities included:

  • Testing new features for mobile apps, and conducting cross-device regression checks.
  • Contributing to the development of the C# Specflow API automation suite for mobile API servers.
  • Deploying environments and modifying configurations using Octopus.
  • Testing a banking Web App hosted on dedicated hardware, where I leveraged Powershell scripts for configuring and automating deployments.

During this period, I used tools and technologies such as:

  • Postman and SoapUI for API testing.
  • Splunk for log analysis and monitoring.
  • Team Foundation Server (TFS) for version control (Git repos) and continuous integration (build server). (Note: TFS has been rebranded to Azure DevOps.)
  • Powershell for automation tasks.
  • Octopus as a deployment automation tool.
  • Specflow and C# for API automation using BDD.
  • Fiddler, Burp Suite CE, and MITM Proxy for capturing network calls.
  • XMind for mind mapping.

My testing spanned several domains, including mobile functional, accessibility, iOS upgrade, and platform API functional testing.


Old content from Fiserv long page



I joined Fiserv on the first working day of 2017. Fiserv is a very different organisation from Trade Me and presented some real challenges for me. It is a huge, global (but USA-centric) financial services company with over 23,000 staff and over 100 million active users. Its systems were far more complex, the financial services domain far more risk averse, with very strict quality and deployment process requirements. The management hierarchy was much deeper, and most of the work performed in NZ was directed from offshore, with little access to clients or end users. The NZ office of Fiserv produces mobile banking apps and their supporting integration servers.

Below is being updated to reflect my time and experience at Fiserv, but was copied from a prior role as a template

Development at Fiserv

xxx
  • Database
  • System Architecture
  • API
  • UI
  • The squad is responsible for the story's design, implementation, testing ....
  • Development was performed on short feature branches using Mercurial. When stories were ready to be deployed they were merged into the integration branch and then the release branch, before being deployed to production and eventually merged into the default trunk of the code.

Agile at Fiserv

Fiserv uses the Scaled Agile Framework (SAFe), which is ...
  • XXX to be updated XXX Squads were usually two devs, one tester, and half a BA, with access to design. In addition, the PO provided direction but was considered just outside the squad.
  • XXX to be updated XXX Most squads are product facing, but a number of squads provide internal technical and support services to help the product squads (DB, Platform, API, Automation, Code Health, etc.). Squads are trusted to ask for assistance when needed, and to reach out to others where there are shared or overlapping responsibilities.

Testing at Fiserv

  • TBD

Tools I used at Fiserv

  • Postman for functional API testing; I also developed a framework for managing settings and orchestrating API calls from a general collection to test different user scenarios.
  • SoapUI for functional API testing
  • VersionOne for managing cases/stories, test plans/session charters, bug (issue) tracking, and test progress
  • xmind for mind maps and visual models to help test planning, execution, and reporting
  • Confluence Wiki for storing anything that might be useful for others, eg implementation details, how-to's for testing, common testing processes
  • Git & TFS version control and build server
  • Powershell scripts, mainly for speeding up repetitive tasks, eg deployments to multi-VM test environments
  • Octopus deployment engine
  • Chrome CJS Custom Javascript extension (to assist with repetitive QA specific tasks)
  • Microsoft SQL Server Management Studio
  • Visual Studio for various code development tasks
  • Microsoft Test Manager for managing test cases and suites, and recording test progress.
  • Splunk error analysis and error graphs
  • Fiddler & Burp Suite for network traffic capture
  • Developer tools on common browsers
  • MS Office