Fiserv Auckland - Intermediate Software Test Engineer


Jan-2017 - Apr-2020

Intro

Fiserv Auckland is responsible for developing mobile apps used by over 2,000 banks (mainly in the USA), serving more than 8 million active users. It also builds and operates the multi-tier, multi-tenanted Web and API integration servers that interface with core online banking systems and third-party platforms. Fiserv's solutions are highly configurable, allowing customisation of features and branding. Operating within the stringent, risk-averse banking domain, reliability and quality are paramount. Testing at Fiserv is complex and challenging, yet rewarding and intellectually stimulating.


Roles

Software Developer in Testing - 2019-2020

Developed the PTF (Postman Testrunner Framework)

The Postman Testrunner Framework was born out of a need to test the API serving our mobile banking apps in a large, integrated test environment over which we have little control. Our API platform acts as an aggregator of several core online banking systems (OLBs), each serving multiple financial institutions (FIs), and each with its own data/interface contract. The integrated environment is used by hundreds of staff across the company, and the data setup changes constantly. Testing here is never deterministic; it has to be opportunistic and needs to respond appropriately to any number of situations.

To monitor the environment's health we needed a simple check of API functionality, capable of exercising all integration paths across the range of different OLB data interfaces and FI/user configurations. It had to be flexible enough to run for a range of users, FIs, and OLBs, as well as for different deploy instances of our platform, and finally to handle the range of dynamic responses possible in such a fluid environment.

To start with, we captured and observed the API calls made by the mobile app (using Fiddler, Burp Suite, and mitmproxy) and designed a Postman solution to emulate the app's behaviour. To initiate a check, we select a platform instance, a code for the FI, and a user agent for the device, and then enter the user's credentials. Thereafter, the information required for subsequent scenarios must be obtained in prior calls. For example, to test a transfer scenario, you first need to obtain a list of the user's accounts.

Our development teams have been using Postman for over a year and have built up a collection with 100+ endpoints and requests. Many requests are furnished with helpful test scripts that extract data from the response and save it to the Postman global/environment variables. The collection is organised into feature folders and alphabetised to facilitate interactive functional testing of the platform API; however, the developer/test analyst must know the sequence of calls to make to start a session before they can perform feature testing. The collection is actively maintained and versioned with pull requests and reviews in a Git repo. It is a wonderful resource, and this project leverages its value by implementing a framework that can orchestrate the correct sequence of API requests to automate common functional (API) scenarios.

The Postman Testrunner Framework (PTF) uses an external data file to specify a sequence of steps called userActions. A userAction executes a request from the underlying collection and then has a list of handlers for the possible response codes. Response handlers are small snippets of code that determine the next userAction to perform. When no next userAction is specified in the response handler, execution moves to the next userAction in the external data file until the scenario is completed. In effect, the PTF is a simple state machine.

The PTF also implements a data store of the information necessary to test with many different users, FIs, OLBs, deploy instances, etc. A data syntax was developed that links different data types and selects the values necessary to initiate a scenario for a user. The input variables are processed and the relevant data links expanded, so that the Postman global and environment variables are ready prior to the first request. Alongside this, a custom reporter was developed to receive the test results.
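A minimal sketch of the state-machine idea, assuming the userAction steps from the external data file have been loaded into an environment variable (ptf_userActions is a name invented here) and that handlers are keyed by request name and response code; the real PTF internals may well differ:

  // Hypothetical collection-level test script (Postman sandbox) illustrating the
  // PTF state machine: find the current userAction, run the handler matching the
  // response code, and let it decide which request runs next.
  const steps = JSON.parse(pm.environment.get("ptf_userActions")); // from the external data file
  const idx = Number(pm.environment.get("ptf_stepIndex") || 0);
  const step = steps[idx];

  // Response handlers: small snippets keyed by request name and response code.
  // A handler may return the name of the next userAction; if it returns nothing,
  // execution falls through to the next step in the data file.
  const handlers = {
    "Authenticate": {
      200: () => { pm.environment.set("sessionToken", pm.response.json().token); },
      401: () => "Report Invalid Login"
    },
    "Get Accounts": {
      200: () => {
        const acct = pm.response.json().accounts.find(a => a.canTransferFrom);
        if (!acct) return "End Scenario";           // nothing usable in this shared environment
        pm.environment.set("fromAccountId", acct.id);
      }
    }
  };

  const handler = (handlers[step.request] || {})[pm.response.code];
  const next = handler ? handler() : undefined;

  if (next) {
    postman.setNextRequest(next);                   // handler chose an explicit jump
  } else if (idx + 1 < steps.length) {
    pm.environment.set("ptf_stepIndex", String(idx + 1));
    postman.setNextRequest(steps[idx + 1].request); // continue in data-file order
  } else {
    postman.setNextRequest(null);                   // scenario complete
  }

Newman (or the Postman collection runner) simply executes whichever request setNextRequest names, which is what lets the handlers steer a scenario opportunistically rather than following a fixed script.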


Set Up Splunk Enterprise & Integrated PTF with Splunk
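A minimal sketch of how PTF run results could be forwarded to Splunk via the HTTP Event Collector (HEC), assuming Node.js 18+ (for the global fetch), a provisioned HEC token, and a made-up hostname and sourcetype; the actual integration details may differ:

  // sendToSplunk.js - hypothetical helper posting a PTF run summary to Splunk HEC.
  async function sendRunSummary(summary) {
    const res = await fetch("https://splunk.example.internal:8088/services/collector/event", {
      method: "POST",
      headers: {
        "Authorization": `Splunk ${process.env.SPLUNK_HEC_TOKEN}`,
        "Content-Type": "application/json"
      },
      body: JSON.stringify({
        sourcetype: "ptf:run",   // hypothetical sourcetype for PTF results
        event: summary           // e.g. { scenario, fi, olb, instance, passed, failed, durationMs }
      })
    });
    if (!res.ok) throw new Error(`Splunk HEC returned ${res.status}`);
  }

  // Example usage after a Newman/PTF run completes:
  // sendRunSummary({ scenario: "internalTransfer", fi: "FI123", passed: 12, failed: 0 });

Events tagged with their own sourcetype can then be searched and charted on Splunk dashboards alongside the platform's application logs.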

Developed In-house Web UI for PTF Results

  • Quick glance dashboard (and associated data API) written in Node.js/Express.js/Pug
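
A minimal sketch of the shape such a dashboard and data API could take, assuming hypothetical routes, an in-memory results store, and a views/dashboard.pug template; the real implementation is likely organised differently:

  // app.js - hypothetical Express/Pug app serving a quick-glance PTF dashboard
  // plus a small JSON data API for the stored results (requires express and pug).
  const express = require("express");
  const app = express();

  app.set("view engine", "pug");
  app.use(express.json());

  // In-memory stand-in for wherever the PTF results are actually persisted.
  const results = [];

  // Data API: the PTF reporter posts run summaries here, dashboards read them back.
  app.post("/api/results", (req, res) => {
    results.push({ ...req.body, receivedAt: new Date().toISOString() });
    res.status(201).end();
  });
  app.get("/api/results", (req, res) => res.json(results));

  // Quick-glance dashboard rendered with Pug (views/dashboard.pug assumed).
  app.get("/", (req, res) => {
    const failed = results.filter(r => r.failed > 0).length;
    res.render("dashboard", { total: results.length, failed });
  });

  app.listen(3000, () => console.log("PTF results UI on http://localhost:3000"));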


Software Test Engineer - 2017-2018

At Fiserv, I began as a QA member within agile teams responsible for implementing changes across various mobile banking solutions.

My responsibilities included:

  • Testing new features for mobile apps, and conducting cross-device regression checks.
  • Contributing to the development of the C# SpecFlow API automation suite for mobile API servers.
  • Deploying environments and modifying configurations using Octopus.
  • Testing a banking Web App hosted on dedicated hardware, where I leveraged PowerShell scripts for configuring and automating deployments.

During this period, I used tools and technologies such as:

  • Postman and SoapUI for API testing.
  • Splunk for log analysis and monitoring.
  • Team Foundation Server (TFS) for version control (Git repos) and continuous integration (build server). (Note: TFS has since been rebranded as Azure DevOps.)
  • PowerShell for automation tasks.
  • Octopus as a deployment automation tool.
  • SpecFlow and C# for API automation using BDD.
  • Mobile functional, accessibility, iOS upgrade, and platform API functional testing.
  • XMind for mind mapping, and Fiddler and Burp Suite CE for network capture and analysis.

Original notes

Fiserv Auckland produces mobile apps for 2000+ banks (8M active users), as well as the multi-tier web and API integration servers that interface to core online banking systems and third parties. Our solutions are configurable with varying degrees of customisation of features and branding. The banking domain is very strict and risk averse! Reliability and quality are particularly important. I've found testing our product complicated, difficult & challenging.
  • (2019-2020) - Sole developer of the Postman Testrunner Framework (PTF).
  • (2017-2018) - QA member of agile teams delivering changes to a range of mobile banking solutions.
Whilst at Fiserv I worked with the following technologies:
  • Postman/Newman/JavaScript/tv4 JSON schema validator (see the schema-check sketch after this list)
  • Node.js/Express.js/Pug (Simple Web UI, Data API for test results, task scripting, data analysis)
  • Splunk (system monitoring, setup data collectors, creating new dashboards)
  • TFS (Git repos, build server, and script scheduling)
  • PowerShell (system deployment automation & TFS)
  • Octopus (deployment engine)
  • SpecFlow/C# (Gherkin API automation)
  • Mobile functional, accessibility, iOS upgrade testing
  • Platform API functional testing
  • XMind (Mind Mapping Tool)
  • Fiddler/Burp Suite (Network capturing)
  • SoapUI (API testing)
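
The tv4 library mentioned above was bundled in the Postman sandbox for JSON schema checks (it has since been superseded by other validators). A minimal sketch of the kind of response-schema test it enables, using a made-up schema and request:

  // Hypothetical Postman test script validating a response against a JSON schema with tv4.
  const accountListSchema = {
    type: "object",
    required: ["accounts"],
    properties: {
      accounts: {
        type: "array",
        items: {
          type: "object",
          required: ["id", "balance"],
          properties: {
            id: { type: "string" },
            balance: { type: "number" }
          }
        }
      }
    }
  };

  pm.test("GetAccounts response matches schema", function () {
    const body = pm.response.json();
    pm.expect(tv4.validate(body, accountListSchema), JSON.stringify(tv4.error)).to.be.true;
  });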


Old content from the Fiserv long page



I joined Fiserv on the first work day of 2017. Fiserv is a very different organisation from Trade Me and presented some real challenges for me. They are a huge, global (but USA-centric) financial services company with over 23,000 staff and over 100 million active users. Their systems were far more complex, the financial services domain far more risk averse, and the quality and deployment process requirements very strict. The management hierarchy was much deeper, and most of the work performed in NZ was directed from offshore with little access to clients or end users. The NZ office of Fiserv produces mobile banking apps and the supporting web/API integration platform.

The content below is being updated to reflect my time and experience at Fiserv; it was copied from a prior role as a template.

Development at Fiserv

xxx
  • Database
  • System Architecture
  • API
  • UI
  • The squad is responsible for the story's design, implementation, testing ....
  • Development is performed on short feature branches using Mercurial. When stories are ready to be deployed, they are merged into the integration branch and then the release branch before being deployed to production, and eventually merged back into the default trunk of the code.

Agile at Fiserv

Fiserv uses the Scaled Agile Framework (SAFe), which is ...
  • XXX to be updated XXX The squads were usually 2 Devs, 1 tester, and ½ a BA, with access to design. In addition, the PO provided direction but was considered to sit just outside the squad.
  • XXX to be updated XXX Most squads are product facing, but there are also a number of squads that provide internal technical and support services to help the product squads (DB, Platform, API, Automation, Code Health, etc.). Squads are trusted to ask for assistance when needed, and to reach out to others when there are shared or overlapping responsibilities.

Testing at Fiserv

  • TBD

Tools I used at Fiserv

  • Postman for functional API testing; I developed a framework for managing settings and orchestrating API calls from a general collection to test different user scenarios.
  • SoapUI for functional API testing
  • VersionOne for managing cases/stories, test plans/session charters, test progress, and issue (bug) tracking
  • XMind for mind maps and visual models to help with test planning, execution, and reporting
  • Confluence Wiki for storing anything that might be useful for others, e.g. implementation details, how-tos for testing, and common testing processes
  • Git & TFS for version control and the build server
  • PowerShell scripts, mainly for speeding up repetitive tasks, e.g. deployments to multi-VM test environments
  • Octopus deployment engine
  • Chrome CJS (Custom JavaScript) extension (to assist with repetitive QA-specific tasks)
  • Microsoft SQL Server Management Studio
  • Visual Studio for various code development tasks
  • Microsoft Test Manager for managing test cases and suites, and recording test progress.
  • Splunk for error analysis and error graphs
  • Fiddler & Burp Suite for network traffic capture
  • Developer tools on common browsers
  • MS Office