Fiserv Auckland - Intermediate Software Test Engineer
Revision as of 21:39, 22 May 2024
'''Jan-2017 - Apr-2020'''
== Fiserv ==
Fiserv Auckland is responsible for developing mobile apps used by over 2000 banks (mainly in the USA), serving more than 8 million active users. They also manage multi-tier, multi-tenanted Web and API integration servers that interface with core online banking (OLB) systems and third-party platforms. Fiserv's solutions offer extensive configurability, allowing customization of features and branding. In the stringent, risk-averse banking domain, reliability and quality are paramount. Testing at Fiserv was complex and challenging, yet rewarding and intellectually stimulating.
To streamline integration testing and monitor environment readiness, I spearheaded the development of the Postman Testrunner Framework (PTF), a flexible solution capable of dynamically executing complete user scenarios across various OLBs and FI/user configurations.
=== Development of Postman Collection ===
=== Architecture of the Postman Testrunner Framework (PTF) ===
The PTF automatically orchestrated the calls in the correct order to execute user scenarios reliably. It used an external JSON file to specify a sequence of steps called userActions; each userAction referenced a request from the collection and contained response handlers for each HTTP response code, which set the next userAction to perform. Effectively, the PTF was a simple state machine. The PTF also implemented a simple nested JSON data syntax for storing data such as user credentials and FI connection settings. Passwords were encrypted when stored and decrypted at run time.
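The userActions state machine described above can be sketched roughly as follows. This is an illustrative sketch only: the JSON shape, field names, and request names are assumptions, not the PTF's actual schema.

```javascript
// Hypothetical sketch of a userActions state machine.
// Field names and request names are assumptions, not the real PTF schema.
const userActions = {
  login: {
    request: "POST /auth/login", // request name from the Postman collection
    onResponse: { 200: "getAccounts", 401: "stop" },
  },
  getAccounts: {
    request: "GET /accounts",
    onResponse: { 200: "stop", 500: "stop" },
  },
};

// Stand-in for executing a collection request; the real PTF ran these via Newman.
function executeRequest(name) {
  return 200; // pretend every call succeeds
}

// Walk the state machine: each response code selects the next userAction.
function runScenario(actions, start) {
  const trail = [];
  let current = start;
  while (current && current !== "stop") {
    const action = actions[current];
    trail.push(current);
    const status = executeRequest(action.request);
    current = action.onResponse[status]; // undefined ends the scenario
  }
  return trail;
}
```

With the data above, `runScenario(userActions, "login")` walks login → getAccounts and stops, which is the state-machine behaviour the external JSON file drove in the PTF.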
=== Custom Development and Integration ===
The PTF was implemented using [https://www.npmjs.com/package/newman '''Newman'''] in a [https://nodejs.org/en '''Node.js'''] project, with a custom JavaScript reporter developed to process events emitted by Newman during execution. This allowed for real-time capture of results and detailed logs, providing clear insights into failures and partial successes. Results were sent to the PTF dashboard, as well as to a dedicated [https://www.splunk.com/ '''Splunk'''] instance for comprehensive monitoring and analysis. The PTF dashboard and Splunk implementations are detailed in the sections below.
The PTF was executed inside a shell terminal on [https://learn.microsoft.com/en-us/previous-versions/azure/devops/all/overview?view=tfs-2018 '''TFS'''] build agents, and used shell environment variables to supply FI settings and user credentials. The PTF was designed to execute in parallel, and TFS was configured to run all users concurrently once per hour.
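Picking up settings from shell environment variables might look like the sketch below. The `PTF_*` variable names are invented for illustration; because each build job sets its own variables, parallel runs naturally stay isolated from one another.

```javascript
// Sketch: build an FI/user configuration from environment variables set by
// the build agent. The PTF_* names are assumptions, not the real PTF's names.
function configFromEnv(env) {
  const required = ["PTF_FI_BASE_URL", "PTF_USERNAME", "PTF_PASSWORD"];
  const missing = required.filter((name) => !env[name]);
  if (missing.length) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return {
    baseUrl: env.PTF_FI_BASE_URL,
    credentials: {
      username: env.PTF_USERNAME,
      password: env.PTF_PASSWORD, // decrypted secrets stay in the job's own env
    },
  };
}
```

Failing fast on missing variables keeps a misconfigured build job from producing a confusing half-run instead of a clear setup error.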
=== Development of PTF Dashboard ===
My responsibilities included:
* Testing new features for mobile apps, and conducting cross-device regression checks.
* Contributing to the development of the C# SpecFlow API automation suite for mobile API servers.
* Testing a banking Web App hosted on dedicated hardware, where I leveraged PowerShell scripts for configuring and automating deployments.
== Tools and Technologies ==
At Fiserv I used the following tools and technologies:
* [https://www.postman.com/ '''Postman'''] and [https://www.soapui.org/ '''SoapUI'''] for API testing.
* [https://specflow.org/ '''SpecFlow'''] and [https://en.wikipedia.org/wiki/C_Sharp_(programming_language) '''C#'''] for API automation using BDD.
* Conducted testing across various domains including mobile functional, accessibility, iOS upgrade, and platform API functional testing.
* [https://www.telerik.com/fiddler '''Fiddler'''], [https://portswigger.net/burp/communitydownload '''Burp Suite CE'''], and [https://mitmproxy.org/ '''mitmproxy'''] for capturing network calls.
* [https://xmind.app/ '''XMind'''] for mind mapping.
* [https://www.atlassian.com/software/confluence '''Confluence'''] for product and project documentation.
* [https://en.wikipedia.org/wiki/SQL_Server_Management_Studio '''Microsoft SQL Server Management Studio'''] for data queries, test data setup, and testing SQL scripts.
* [https://visualstudio.microsoft.com/ '''Visual Studio'''] for code-related tasks.
* '''Microsoft Test Manager''' for managing test cases and suites, and recording test progress.
== Agile ==
Fiserv used the [https://www.scaledagileframework.com '''Scaled Agile Framework''' (SAFe)] to govern their Agile practices. Squads were typically about ten people in size, engaged in common Agile rituals, and were responsible for the SDLC through to integration. We followed Gitflow (on TFS).
The squad:
* Engaged in Agile rituals: stand-ups, backlog grooming, estimation, planning, demos, and retros.
* Owned the development lifecycle: story design, implementation, testing, and integration.
* Followed Gitflow: feature branches for development, with changes integrated into release-train branches.
* Contributed to quality checking at various stages before code changes were deployed to production.