Monday, March 7, 2011

Symphony KT

CitiFinancial is replacing one of its legacy applications (Maestro) with a web-based application (Symphony). The project is called Symphony. It is a 120 million project, and around 200 people from different teams are working on it. We are working on an application called Lead Management, which is one of the major enhancements being made to the existing loan booking system. It is a CRM (Customer Relationship Management) application that would eventually become a revenue-generating application for CitiFinancial. We work with different types of customers, including present borrowers, former borrowers, private labels and, mainly, potential customers (prospects).

The data for all the customers is pulled from a centralized Citi database called Account Master (the existing production system) by a third-party vendor called Datalab and is sent to the ETL team as a space-formatted flat file. Before sending the data to the testing team, Datalab applies scrubbing algorithms to the data so that personal information like SSNs, phone numbers, names and account details is not exposed for general testing purposes.

Once the file is received, the testing team looks into the file and validates the data in it. The business provides a field layout that describes the characteristics of each field and its boundary values. Based on that, the file is validated using Microsoft Excel or Microsoft Access. Once the testing team approves the data validation, the ETL team pulls the file and loads the data into the database using the data warehousing tool (Ab Initio). The loaded data is then validated against the business requirements and business rules using SQL queries. We use TOAD to access the Oracle database and write complex SQL queries to validate the data; we often join more than 10 tables, effectively building our own view, and validate the data against the source flat file.
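As a rough illustration of this kind of back-end check (the table and column names here are hypothetical, not the actual Symphony schema), a query along these lines joins the loaded customer tables back to a staging copy of the flat file and flags any records whose values do not match the source:

-- Hypothetical reconciliation: compare values loaded by Ab Initio
-- against a staging copy of the Datalab flat file.
SELECT stg.customer_id,
       stg.zip_code     AS source_zip,
       addr.zip_code    AS loaded_zip,
       stg.loan_amount  AS source_loan_amt,
       acct.loan_amount AS loaded_loan_amt
FROM   stg_datalab_file stg
       JOIN customer cust ON cust.customer_id = stg.customer_id
       JOIN address  addr ON addr.customer_id = cust.customer_id
       JOIN account  acct ON acct.customer_id = cust.customer_id
WHERE  stg.zip_code    <> addr.zip_code
    OR stg.loan_amount <> acct.loan_amount;

Any rows returned are candidates for defects against the ETL load rules.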

Once the data is loaded and validated, we go ahead and start testing the web application. The web application we use is a product of Chordiant, a company that sells out-of-the-box (OOTB) applications for different lines of business, so our objective was to enhance the OOTB application based on our requirements. The input to this application is the marketing data loaded earlier by ETL. We pull the data, test all the functionality and the GUI, and check whether the front-end transactions have been updated in the DB; we use SQL to validate the DB part. Once functional, integration and system testing (the latter covering other internal Citi systems such as ISW, EERS and HR Feed) are complete, we move on to the data warehousing phase.
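For example (again with made-up table and column names), after dispositioning a lead in the Chordiant front end, a quick query like this can confirm the transaction actually landed in the database with the expected status and audit columns:

-- Hypothetical check: confirm a lead updated through the GUI
-- was persisted with the expected status and audit timestamp.
SELECT lead_id,
       lead_status,
       assigned_user,
       last_updated_ts
FROM   lead
WHERE  lead_id = 1002345
AND    lead_status = 'CONTACTED'
AND    TRUNC(last_updated_ts) = TRUNC(SYSDATE);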

Now, for creating the data warehouse, the input is the source tables. We again use Ab Initio to extract the data from the source based on the requirements, the rules and the front-end transactions, and migrate it to the data warehouse (reporting tables). The target tables are the reporting tables, which are built on a star schema (dimensions and facts).
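To make the star schema idea concrete (the table names below are illustrative, not the actual reporting model), a typical reporting query joins one central fact table to its surrounding dimension tables:

-- Hypothetical star-schema query: lead activity fact joined to date,
-- branch and campaign dimensions, aggregated for a monthly report.
SELECT d.calendar_month,
       b.branch_name,
       c.campaign_name,
       SUM(f.leads_created) AS leads_created,
       SUM(f.loans_booked)  AS loans_booked
FROM   fact_lead_activity f
       JOIN dim_date     d ON d.date_key     = f.date_key
       JOIN dim_branch   b ON b.branch_key   = f.branch_key
       JOIN dim_campaign c ON c.campaign_key = f.campaign_key
GROUP  BY d.calendar_month, b.branch_name, c.campaign_name;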

The testing team validates the source data against the target data using SQL queries and joins, and also against the business requirements and rules. Once that validation is done, we log in to a BI tool called Cognos and validate the mapping between the front end and the back end. After the Cognos reports are validated, we move on to Cognos cubes. We have to validate the cubes as well; cubes are mainly used for enterprise-level reports because a cube takes a snapshot of the data and stores it within itself until it is refreshed again. So the user can work with massive data, analyse the reports and create ad hoc views based on need, with no performance issues.
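A common pattern for this source-to-target validation (again with hypothetical names) is a MINUS comparison in both directions, which surfaces rows that were dropped, duplicated or transformed incorrectly during the warehouse load:

-- Hypothetical source-to-target check: rows in the source extract
-- that never made it into the reporting fact table.
SELECT customer_id, campaign_code, lead_created_date
FROM   src_lead_activity_v
MINUS
SELECT customer_id, campaign_code, lead_created_date
FROM   fact_lead_activity;

-- And the reverse direction, to catch rows that exist only in the target.
SELECT customer_id, campaign_code, lead_created_date
FROM   fact_lead_activity
MINUS
SELECT customer_id, campaign_code, lead_created_date
FROM   src_lead_activity_v;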

For all of these validations, we have to have a proper test strategy, test plan and test cases written before proceeding with execution. Defects are logged in QC (Quality Center), and the testing team is responsible for defect management and tracking defects to closure. If the business decides not to fix a defect in the current release because it is not critical for launch, that defect is backlogged for future releases.

We are involved in the full life cycle of the project, right from business requirement analysis, design document analysis and drafting the test strategy (please look at the test strategy of the LM project to learn the components of a test strategy and get a basic understanding), through test planning and effort estimation. Effort estimation is based on the number of test cases per day per person, the complexity of the test cases and their priority; complexity and priority should be decided by the testing team only. Based on that, the test scripts are assigned to the testers in Quality Center, which is task management (an illustrative calculation follows below). Any downtime is also captured and tracked. Once all the processes are done and the project is complete, we prepare an exit report. This covers all the downtimes, issues encountered, defects backlogged for future releases, functionalities completed, in-scope functionalities not tested, and approvals from the project stakeholders (please have a look at the exit report to get an idea of its components).
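As a purely illustrative effort estimate (the numbers are hypothetical, not actuals from the LM project): if a cycle has 300 simple and 100 complex test cases, and a tester can execute roughly 20 simple or 10 complex cases per day, the execution effort works out to 300/20 + 100/10 = 25 person-days, which is then split across the assigned testers and padded for defect retests and any environment downtime.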

We also constantly interact with the customers, provide inputs and suggestions (make sure you have one good suggestion that you made to the customer ready as an answer for this), attend defect calls, daily calls with the onshore folks, etc.

You can also mention along the way that you have been doing functional testing, integration, system, regression, automation (if any), user acceptance testing (by discussing with the business and understanding the business logic and expectations), and unit testing on ETL (drafting test cases from the ETL transformations when there is no specific business requirement, only a technical requirement to make sure the data loaded is valid).
