Anjireddy Kandula


Summary:
Over 8 years of experience as a Senior Data Warehouse Developer, with extensive project lifecycle
experience in the design, development, implementation, and testing of various sized applications using
ETL & BI tools.

- Over 7 years of Ab Initio experience with ETL, data mapping, transformation, and loading from source to target databases in complex, high-volume environments.
- Extensive experience with the EME for version control, impact analysis, and dependency analysis.
- Good understanding of newer Ab Initio features such as component folding, Parameter Definition Language (PDL), continuous flows, queues, and publisher and subscriber components.
- Working experience with JCL for creating the batch jobs that run graphs on the mainframe; experience with mainframe files (MVS, VSAM, GDG, and flat files).
- Experience in data modeling schemas (RDBMS; multi-dimensional modeling with star and snowflake schemas; MOLAP, ROLAP, and HOLAP) using Erwin Data Modeler.
- More than one year of experience with Teradata SQL, Teradata ANSI SQL, and Teradata Tools & Utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ, and QueryMan).
- Good experience working with heterogeneous source systems such as Oracle 10g, DB2, MS SQL Server, flat files, and legacy systems.
- Extensive experience in prototyping, RAD, SDLC, JAD, Agile iterative development, structured techniques, metadata management, and data mapping.
- Proficient in using shell scripts and SQL to automate ETL processes.
- Good experience in scheduling, production support, and troubleshooting for various ETL jobs.
- Wrote UNIX shell scripts for batch processing; used the vi and emacs editors.

TECHNICAL SKILLS

Data Warehousing: Ab Initio (GDE 1.15.10, Co>Operating System 2.15.10), data warehouse design
(mapping design, mapplets, transformations), ETL, metadata, data mining, data marts, EME.
Databases: Teradata V2R5, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000, DB2, MS Access.
Data Modeling: Star & Snowflake Schemas, Visio.
Tools: SQL Assistant, MS Access Reports, TOAD, Maestro, Control-M, ClearQuest.
Programming: C, SQL, Shell Scripting, Korn Shell, SQL*Plus, PL/SQL, Visual Basic 6.0.
Operating Systems: Windows NT/98/2000, AIX 5.0/5.2/5.3, OS/390.


Professional Experience:


Legg Mason, Baltimore, MD
ETL/Ab Initio Developer                                              Oct 2010 – Present



Ab Initio Salesforce Integration:
Australia needs to report account information to interested third parties. To accommodate this
requirement, several fields had to be added to SalesForce.com. The interested parties are added as
Contacts and linked back to the Client (Company). Once all required fields are in SFDC, data from
Maximizer is applied to the SFDC environment, mapping the required fields from Maximizer to the new
fields that serve the same purpose in SFDC.
Responsibilities:
- Developed Ab Initio graphs using the Salesforce components.
- Good knowledge of the Salesforce components such as Query Salesforce, Write Salesforce, Get Salesforce Info, and Retrieve Salesforce Objects.
- Deployed the Ab Initio graphs and migrated them to other instances such as QA using air commands.
- Designed and developed wrapper scripts for batch stream jobs (see the sketch after this list).
- Created the Autosys jobs for the Ab Initio graphs and maintained their JIL in the DEV and QA instances.
- Developed psets, DMLs, and XFRs for generic graphs.
- Extensively used partition components and developed graphs using Write Multiple Files, Read Multiple Files, Filter by Expression, Run Program, Join, Scan, Rollup, Sort, and Reformat.
- Responsible for loading the international batch stream into various marts, both domestic and international.
- Extensively used Ab Initio DB components to load data from heterogeneous source systems (DB2, Oracle, etc.) into the target DW DB2 database and other file systems.
- Used Ab Initio multifile systems (MFS) to run graphs in parallel via the layout feature.
- Debugged production issues for batch stream jobs.
- Wrote complex SQL queries based on the given requirements.
- Interacted with business partners to gather requirements.
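
A minimal sketch of such a wrapper script, assuming a pset-driven graph run via air sandbox run; all
paths, names, and the logging scheme are hypothetical, and exact air options vary by Co>Operating
System version:

#!/bin/ksh
# Illustrative wrapper for an Ab Initio graph pset (names/paths are assumptions).
SANDBOX=/apps/sand/sfdc_load
PSET=$SANDBOX/pset/sfdc_account_load.pset
LOG=/apps/logs/sfdc_account_load.$(date +%Y%m%d.%H%M%S).log

# Run the pset and preserve the exit status so the Autosys job
# that calls this wrapper can report success or failure.
air sandbox run $PSET > $LOG 2>&1
rc=$?
if [ $rc -ne 0 ]; then
    echo "sfdc_account_load failed with rc=$rc, see $LOG" >&2
fi
exit $rc

The Autosys JIL for each instance (DEV, QA) then simply points its job's command at that instance's
copy of the wrapper.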

Environment: Ab Initio (GDE 3.0.2.1, Co>Operating System 3.0.2; GDE 1.14, Co>Operating System 2.14),
UNIX shell scripting, SQL Server, DB2, Oracle 10g, UNIX, Windows NT/2000.


Verizon, Atlanta, GA                                                 Sep 2009 – Aug 2010
Sr. ETL/Ab Initio Developer



Responsibilities:
- Developed several partition-based Ab Initio graphs for a high-volume data warehouse.
- Involved in all phases of the System Development Life Cycle, including analysis and data modeling.
- Extensively used the Enterprise Meta Environment (EME) for version control.
- Extensive exposure to generic graphs for data cleansing, data validation, and data transformation.
- Created sandboxes and edited sandbox parameters to match the EME repository.
- Used air commands to perform dependency analysis on all Ab Initio objects.
- Involved in Ab Initio design and configuration: ETL, data mapping, transformation, and loading in a complex, high-volume environment, processing data at the terabyte level.
- Extensively used partition components and developed graphs using Write Multiple Files, Read Multiple Files, Filter by Expression, Run Program, Join, Sort, and Reformat.
- Followed best design principles, efficiency guidelines, and naming standards in designing the graphs.
- Developed shell scripts for archiving, data loading procedures, and validation.
- Involved in writing unit test scripts, support documents, and implementation plans.
- Tuned the graphs by creating lookup files and setting memory sort and max-core parameters for maximum use of cache memory and enhanced performance.
- Implemented a 6-way multifile system composed of individual files on different nodes, partitioned and stored in distributed directories (see the sketch after this list).
- Responsible for cleansing the data from source systems using Ab Initio components such as Reformat and Filter by Expression.
- Used subgraphs to increase the clarity of graphs and to encapsulate reusable business rules.
- Capable of designing solutions around Ab Initio, with advanced skills in high performance and parallelism.
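
A minimal sketch of creating such a 6-way multifile system, assuming the classic m_mkfs invocation
(control directory first, then one data directory per partition); all paths are hypothetical:

#!/bin/ksh
# Create a 6-way MFS whose partitions sit on different mount points/nodes
# (illustrative paths; in practice these map to separate disks or hosts).
m_mkfs /u/etl/mfs/mfs6way \
       /data01/mfs/p0 /data02/mfs/p1 /data03/mfs/p2 \
       /data04/mfs/p3 /data05/mfs/p4 /data06/mfs/p5

# Graphs then reference multifiles under the control directory, e.g.
# /u/etl/mfs/mfs6way/main/acct_detail.dat, and take a 6-way layout from it.
m_ls /u/etl/mfs/mfs6way/main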

Environment: Ab Initio (GDE 1.15.10, Co>Operating System 2.15.10), UNIX shell scripting, Windows
NT/2000, DB2, IBM AIX 5.1.

Wells Fargo (Legacy MBNA), Dallas, TX                                           Jan 2009 – Aug 2009
Sr. ETL Developer

Responsibilities:

- Worked on the Data Management Team on data extraction, fictionalization, subsetting, data cleansing, and data validation.
- Responsible for requirements gathering and development of the Expiration Date Fictionalization project.
- Created the batch jobs on the mainframe to run the graphs.
- Working experience with DCLGEN in DB2 to create the copybooks for the DB2 tables.
- Involved in migrating the data warehouse from DB2 to Teradata.
- Wrote several queries and BTEQ scripts for the ETL process; responsible for choosing primary and secondary indexes for tables and for using analytical functions (QUALIFY ROW_NUMBER() OVER (PARTITION BY ...)) for effective performance at peak volumes and in complex joins (see the BTEQ sketch after this list).
- Resolved major spool space issues by collecting statistics on the proper columns of the Teradata tables and ensuring proper distribution of data in the volatile sessions of the Teradata scripts.
- Responsible for creating the parameters for the graphs to run in the UNIX environment.
- Coordinated with different testing groups to accommodate their test data requirements and translated them into data selection criteria in Ab Initio format.
- Performed batch processing for data downsizing (subsetting).
- Maintained the sandbox, storing all work in sequential order.
- Developed UNIX shell scripts for parsing and processing data files; maintained and troubleshot batch processes for overnight operations.
- Coordinated with the data team on future changes to file and table structures to accommodate future testing requirements.
- Worked with MVS, VSAM, GDG, DB2, flat files, and Excel sheets as inputs to the graphs.
- Worked with Ab Initio date, string, and user-defined functions as per the requirements.
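
A minimal sketch of the BTEQ pattern described above, with hypothetical logon, database, table, and
column names: collect statistics on the partitioning column, then keep the latest row per account with
QUALIFY ROW_NUMBER():

#!/bin/ksh
# Illustrative BTEQ step; real scripts would source the logon from a secured file.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password

COLLECT STATISTICS ON stage_db.card_txn COLUMN (acct_id);

INSERT INTO mart_db.card_txn_latest
SELECT acct_id, txn_dt, txn_amt
FROM   stage_db.card_txn
QUALIFY ROW_NUMBER() OVER (PARTITION BY acct_id ORDER BY txn_dt DESC) = 1;

.LOGOFF
.QUIT
EOF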

Environment: Ab Initio (GDE 1.14.16, Co>Operating System 2.14.1), UNIX shell scripting, Windows
NT/2000, DB2, Teradata V2R6.

AIG, NJ                                                              Oct 2007 – Nov 2008
CB – Consumer Banking
Sr. ETL Developer
Responsibilities:

- Extensively used Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sorted, Rollup, Scan, FTP, Lookup, Normalize, and Denormalize; used features like MFS (8-way partitioning), checkpoints, and phases.
- Optimized and tuned the Ab Initio graphs. Architected and implemented this project using an Agile iterative methodology with the help of the BT Project Manager; the iterations were CD, MM, and integration with card products using Customer Identification Number, marketing campaigns, and IVR call and agent response data.
- Automated the ETL process using UNIX shell scripting.
- Extensively used the Teradata utilities BTEQ, FastLoad, MultiLoad, and TPump, along with DDL and DML commands (SQL); see the FastLoad sketch after this list.
- Did the Proof of Concept (POC) using Ab Initio to show that Ab Initio was the right solution; used components like Call Web Service, Read XML, and Write XML and the xml-to-dml utility for testing. Also did a POC with Ab Initio and Oracle stored procedures (PL/SQL) to evaluate performance.
- Wrote complex SQL queries based on the given requirements, created a series of Teradata macros for various applications in Teradata SQL Assistant, and tuned Teradata SQL statements using the Teradata EXPLAIN command.
- Prepared unit and integration testing plans and performed unit and integration testing against them.
- Created several SQL queries and reports from the data mart for UAT and user reporting, using SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling.
- Involved in post-implementation support, user training, and data model walkthroughs with business/user groups.
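
A minimal sketch of a FastLoad script of the kind referenced above; the logon, database, table,
delimiter, and file names are hypothetical (FastLoad targets an empty table):

#!/bin/ksh
# Illustrative FastLoad of a pipe-delimited IVR extract into an empty staging table.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
DATABASE stage_db;

SET RECORD VARTEXT "|";

BEGIN LOADING stage_db.ivr_call
      ERRORFILES stage_db.ivr_call_e1, stage_db.ivr_call_e2;

DEFINE call_id (VARCHAR(18)),
       acct_id (VARCHAR(18)),
       call_ts (VARCHAR(26))
FILE = /data/in/ivr_call.dat;

INSERT INTO stage_db.ivr_call VALUES (:call_id, :acct_id, :call_ts);

END LOADING;
LOGOFF;
EOF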

Environment: Ab Initio (GDE 1.13.7, Co>Operating System 2.14.104), Teradata V2R5, FastLoad,
MultiLoad, FastExport, BTEQ, UNIX IBM AIX 5.1, UNIX shell scripts.

FedEx, Memphis, TN                                                   Oct 2005 – Aug 2007
ETL DW Developer

Responsibilities:
- Analyzed Data Warehouse/Decision Support business requirements.
- Involved in designing fact, dimension, and aggregate tables for the Data Warehouse.
- Followed the star schema methodology to store the data in the Data Warehouse.
- Developed a number of Ab Initio graphs based on business requirements using components like Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, and Gather.
- Used Ab Initio multifile systems (MFS) to run graphs in parallel via the layout feature.
- Worked with the Teradata DBA team to understand table structures and attributes before loading from Ab Initio into Teradata.
- Modified BTEQ scripts to load data from the Teradata staging area to the Teradata data mart, and wrote complex queries to load summary tables based on the core tables in the DW.
- Implemented data parallelism through graphs that divide the data into segments and operate on each segment simultaneously, using the Ab Initio partition components.
- Responsible for designing parallel, partitioned Ab Initio graphs for a high-volume data warehouse.
- Used UNIX environment variables in all the Ab Initio graphs to specify the locations of the source and target files.
- Responsible for creating a 4-way multifile system composed of individual files partitioned and stored in distributed directories.
- Involved in Ab Initio graph design and performance tuning of the load graph process.
- Developed Ab Initio graphs for data validation using validate components like Compare Records and Compute Checksum.
- Developed complex Ab Initio XFRs to derive new fields and meet various business requirements.
- Extensively used Ab Initio Co>OS commands like m_ls, m_dump, m_mkfs, and m_rollback (see the sketch after this list).
- Enhanced Ab Initio graph performance using techniques such as lookups (instead of joins), in-memory joins, and rollups.
- Involved in developing UNIX Korn shell wrappers to run various Ab Initio scripts.
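
A minimal sketch of those Co>OS commands against a hypothetical 4-way multifile and DML file; the
paths are illustrative and option sets vary by Co>Operating System release:

#!/bin/ksh
MFS=/u/etl/mfs/mfs4way

# List multifiles much like ls.
m_ls -l $MFS/main

# Print records from a multifile, interpreted through their DML description.
m_dump txn.dml $MFS/main/txn.dat

# Roll back a failed graph from its recovery file, deleting the file afterwards.
m_rollback -d run/load_txn.rec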

Environment: Ab Initio (GDE 1.13.11, Co>Operating System 2.13.1), UNIX shell scripting, SQL Server,
Teradata V2R5, UNIX, Windows NT/2000.

AAA Mortgage Corporation, Birmingham, MI                             Jun 2003 – Aug 2005
ETL Developer


Responsibilities:
- Interacted with business partners to gather requirements.
- Converted the business requirements into list pull specifications.
- Wrote precise, reusable ETL specifications and patterns to facilitate the development of best practices and internal competencies.
- Used Ab Initio as the ETL tool to pull data from source systems, then cleanse, transform, and load the data into databases.
- Developed various graphs for data cleansing using functions like is_valid, is_defined, is_error, string_substring, string_concat, and other string_* functions.
- Created a proper PI, taking into consideration both planned access and even distribution of data across all the available AMPs (see the sketch after this list).
- Considering both the business requirements and other factors, created NUSIs for smooth (fast and easy) access to the Teradata tables.
- Designed parameterized generic graphs that take their values from a wrapper script.
- Based on business requirements, developed a number of graphs using components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Gather, Broadcast, and Merge.
- Improved the performance of Ab Initio graphs using techniques like lookups (instead of joins) and in-memory joins.
- Prepared technical designs and test plans.
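
A minimal sketch of the PI and NUSI choices described above, with hypothetical database, table, and
column names (the primary index drives row hashing across the AMPs; the NUSI covers a known secondary
access path):

#!/bin/ksh
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password

/* UPI on loan_id: unique, evenly distributed, and the main join/access key. */
CREATE TABLE mart_db.loan_acct (
    loan_id   INTEGER NOT NULL,
    branch_cd CHAR(4),
    open_dt   DATE
) UNIQUE PRIMARY INDEX (loan_id);

/* NUSI on branch_cd: selective access path that the primary index does not cover. */
CREATE INDEX loan_acct_nusi_branch (branch_cd) ON mart_db.loan_acct;

.LOGOFF
.QUIT
EOF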

Environment: Informatica, Erwin 3.5, Oracle 8i, Teradata, SQL Server 2000, AIX UNIX, shell scripts,
PL/SQL, Windows NT.
