Audit _ Knowledge Transfer Document 1

   what requirements analysis/study has taken place ? who are the users ? key contacts ?
   what documentation has been developed ? basic functionality matrix ?
   usage profile for solution ? user matrix ? (performance and scalability testing ?)
   number of users and location ? hours of cover ?
   technical architecture documentation ?
   documentation: emphasis here is on whether documentation is available; whether it is hard or soft copy; what national
     language it is in; how fit for purpose it is to assist in knowledge transfer; and whether an exercise will need to be
     undertaken to produce documentation
     system overview
     key functions within system
     business/user spec
     functional spec
     technical spec
     program spec
     operational guide
     User manual
     DBA manual/ procedures
     staff induction manual

   technical design documentation ?
   networked ? standalone solution ?
   development process followed ? review cycles, code reviews (documents)
   configuration management procedures followed ? tools ? number of environments ?
   what tools were used to construct the solution.
   key components ? screen, batch processes ? % complete ?
   external interfaces ? files, on-line real time ? batch ? encryption ?
   scheduling ? tools ?
   coding standards ? naming conventions ?
   quality standards/methodology ?
   error reporting ?
   security standards
   what's been tested ? at which stage is testing ? unit testing ? integration testing ? UAT ? performance
     testing ?
   database schema development
   underlying infrastructure ? servers, database, operating system ? upgrade path ? versions ? support
    considerations ? networking topology ?
   licensing considerations ? inventory of software licenses
   third party products ? support agreements ? SLAs, responsibilities ?
   backup and recovery considerations ?
   error tracking database ? change control notes ? error reports ?
   test strategy ?
   structured test plan ? scripts ?
   acceptance criteria ? key users involved in defining this ? user liaison ? (frequency)
   development plan/effort ?
   key design principles ?
   resourcing, skills, composition of development team ?
   key document index ?
   transition planning ?
   development completion plans.
   internationalisation ?
   Current Development Status (Work in progress)
     Open problem calls
     Open user queries

76d721c0-2019-400a-8a37-157e6ce52530.doc1                                             12/02/2011
     Change requests
     Ad hoc requests
     Software upgrades
     Systems info. Upgrades
     Agreed Enhancements and status
     Agreed Developments and status
   development environment
   test environment (number of)

10TH JULY 2002 – DES BERRY (CR17)

   controls ? these are sub-sets of data for campaign management. i.e. a sample of the data that reflects
    the distribution of data in the entire campaign.
   fully system tested (originally written for Germany) but not put into production (about 2 years ago).
    i.e. no controls for Germany at the moment.
   requirement to deliver this into the UK.
   no requirement to deliver this functionality into France.
   business rules used to create the subset of data in the controls (who – des to supply)
   5% are included in population of campaign mailing (this percentage is configurable in the database and
    is held as a ratio)
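The percentage above is held in the database as a configurable ratio (per the notes, in T_PERMANENT_CONTROLS_RATIO). A minimal sketch of how such a ratio might be applied when picking controls from a stream of record ids – the shell variable and the awk-based selection are illustrative assumptions, not the actual PL/SQL implementation:

```shell
# Illustrative only: the real selection runs in PL/SQL against the database.
# A 1:20 ratio corresponds to the ~5% figure mentioned in the notes.
CONTROL_RATIO=20

select_controls() {
    # read record ids on stdin; emit every CONTROL_RATIO-th one as a control
    awk -v ratio="$CONTROL_RATIO" 'NR % ratio == 0'
}
```

For example, piping 40 record ids through `select_controls` with a 1:20 ratio yields two control records.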
   france not yet mature enough – if required will be table entry in control reference table and retesting
   key business user (GPT) – Matt Spinks.
   the requirement for this functionality is a carry-over from the legacy system documented in the
    original business requirements for the system – there is no specific requirements document for this
   the whole activity is run via UniCentre as part of the current Campaign Management processing
   this runs after the load program and before the extract (not much runs in between anyway). what is run ? aren't these
     just amendments to the current software, hence requiring no additional configuration of UniCentre ? –
     new process implemented for permanent controls.
   historical run is manual – SQL*Plus command line (one-off); all development is in a package marked as
     historical ('his' in the title).
   entire development done in PL/SQL – no other languages used and no reliance on third party
     software – just UniCentre.
   software split into 2 sections – the first inserts permanent controls (T_GMDRIVE_APL); the second is the
     change of address program (described below).
   configurable via T_PERMANENT_CONTROL_TABLES i.e. we can set the input tables for the data that is the
     source of the controls.
   records are read in one by one and tested against the business rules to become permanent controls
   permanent customer control (from gm tables) – astra, vectra and corsa
   prospect control (T_STD_PROSPECT) – checked against table of “like” cars defined in the marketing
   holds the ratio (1:20); active suppression created – T_CONSUMER_SUPPRESSION (person details
     for people who should not be mailed) – another flag active/deactivate i.e. moving house.
   second is the change of address program (based on address indicators – insert or update) only called if
     there are updates of addresses.
   only processes people handled on that day (relies on batch run date) T_BATCH_RUN_DATE
   load sets the T_BATCH_RUN_DATE (used to keep consistency past midnight)
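The T_BATCH_RUN_DATE mechanism above can be pictured as a date captured once at load time, so every later step in the run (even past midnight) sees the same processing date. A hedged sketch – the function name and date format are assumptions:

```shell
# Sketch: capture the batch run date once, at load time, so steps running
# after midnight still process "that day's" records consistently.
# In the real system this value lives in the T_BATCH_RUN_DATE table.
set_batch_run_date() {
    BATCH_RUN_DATE=$(date +%Y%m%d)
    export BATCH_RUN_DATE
}
```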
   every time looks at records if update address is already on the database. if present then this record
    becomes a permanent control.
   3 tables needed are T_PERMANENT_CONTROL_TABLES (applied table and country code – only
     processing for German data) and T_PERMANENT_CONTROLS_RATIO (2 records, one for permanent and
     one for prospect).
   single table feed for customer, 1:n for prospect
   permanent control range cars similar to astra (accessed via data dictionary) marketing segment
    detailed by cars. i.e. super minis are 002 ?!

   CR310 – supports multiple country usage i.e. added a country attribute to the table to allow sources
    (input tables) to be linked to specific countries.
   rules for CR17 – business rules originally written by Brendan Walsh – enhanced by Des Berry.
   business rules are encoded in s/w.
   permanent controls – no housekeeping; the sub-sets of data are held ad infinitum for each campaign –
     seeded from the campaign management front-end. The Hist program needs to be run.
   no real performance/scalability testing – similar software has run in the uk for the last 5 years (but not this
   performance issues with historical ? no testing done in this area – what is historical data ?
   runs against applied records for a day – estimates are 1000s therefore run times are not likely to be an issue
   installation in 2 parts – package header. (one piece of s/w)
   one package for historical. B01 and K01 customer p01 and prospects p01 – change of address
     in p02
   1100 lines of code (uses existing version of PL/SQL and Pro*C)
   complex but well commented at start of each block
   cgy1 is potential test bed but will require schema changes to the current environment.
   copy of database schema request for enhancement. (310) des to provide list of amends and inserts
   T_PROGRAM_DETAIL – logging purposes (existing table).
   standalone – requires no additional support other than auditing areas i.e. logging of processing using
     standard logging facilities in ECM21.
   no formal requirements definition for permanent controls – the original requirements documentation
     feeds this enhancement.
   knowledge transfer to Des Berry took about 1.5 wks to understand the requirement and be in a position to
     start the development work
   5 weeks all together for the entire CR (including KT and unit/system testing).

(CR18) 10TH JULY 2002 (DES BERRY)
Objective : loading additional data into production system – for which country ?
   marketing body style
   option code engine
   option code transmission
   fuel type

   total development took about nine weeks using n people using the existing version of PL/SQL and Pro*C
   data is currently sent from sources but no processing is undertaken – this data is not loaded onto the
    production system.
   requirement to change transformation program (written in pro c)
   three tables (reference tables) have been created to support this development. These are

        in addition, the existing MRKTG_BODY_STYLE_CODE has been changed to CHAR(3) to
         hold sourced data.

   checks against the db for new values - reference data already on the database. if not present, then rejected until
     new reference data is loaded. GMDrive, GMFAST (ge/fr), SMST (uk). MVS code
     corresponding to ECM code.
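The reject-until-loaded rule above can be sketched as a simple lookup against the reference data already on the database. The code list and function name here are invented for illustration – the real check is a database lookup during transformation:

```shell
# Assumption: REF_CODES stands in for reference data already on the database.
REF_CODES="001 002 003"

check_reference() {
    # $1 = sourced code; records are rejected until matching reference data is loaded
    case " $REF_CODES " in
        *" $1 "*) echo accept ;;
        *)        echo reject ;;
    esac
}
```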
   for each record there are 3 additional accesses to the db – when ?
   after transformation – output file is created in the hygiene directory for a specific country.
   sql loader – pumping data onto work table (GMDRIVE_W1) - positional changes in the control file
   load program (des did this and some analysis) – data processed in W1 inserts or updates to
     T_VEHICLE - successful inserts to APL – rejects to REJ.
   rejects at the end of the night's processing.

   changes required to schema. minor changes to inserts to cater for new fields. DMCR308 (new tables
     for reference used in transformation)
   dmcr309 load program i.e. extending MRKTG_BODY_STYLE_CODE and adding 3 new fields
   no additional process for unicentre (processing about 3000 per minute).
   existing version of PL/SQL and Pro*C used during development.
   invalid version in RCS currently – should have new version in /ln/dev_root/CR18 on antwerp server.

   outputs from unit test and system test activity.
   what errors were recorded during unit and system testing ?
   what was the test strategy ? availability of test scripts ?
   business contact for CR17
   marketing segment mapping - how do we know to which segment cars are linked ?
   confirm estimates (i.e. how long has it taken to get the development to this point ?)
   has any transition planning been done for this development.
   what are the business rules for CR17 used to create the sub-set of data for the controls ? are they
    country specific ?
   list of CR and corresponding database changes raised against CR17 & CR18 ?


Live (on-going KT for support)

   trillium fixes up names and addresses
   matching below lower threshold – insert
   if match – then update
   thresholds are user defined – records between the 2 thresholds fall to manual review
   rejects out of hygiene but not out of matching (is on or not on database).
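The two-threshold rule in the bullets above can be sketched as follows; the numeric thresholds are placeholders, since the notes say the real values are user defined:

```shell
# Placeholder thresholds – the real ones are user configurable.
LOWER_THRESHOLD=60
UPPER_THRESHOLD=80

classify_match() {
    # $1 = match score: at/above upper -> update, below lower -> insert,
    # between the two thresholds -> manual review
    if   [ "$1" -ge "$UPPER_THRESHOLD" ]; then echo update
    elif [ "$1" -lt "$LOWER_THRESHOLD" ]; then echo insert
    else echo manual_review
    fi
}
```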
   use trillium and SSA - all files in flat file format.
   VB6.0 (front end to view and update records) – requires oracle odbc drivers loaded and configured on
    the local PC.
   uses trillium and ssa dlls
   no third party controls used during the development.
   download files to pc, corrects and uploads to the server.
   ftp not done interactively through the tool i.e. file must be copied to local pc/server using ftp tools
    i.e. hummingbird prior to using manual review tool.
   must download and then find with tool.
   records are sent to manual review (and reviewed by the manual review tool) if the number of possible
     matches is > 1.
   when records have been reviewed (and corrected) they are sent through hygiene again. After hygiene,
     the file is sent to the matching directory. After matching, records are loaded or rejected – identified
     by the file extension applied whilst running through the batch process.
   possible matches can be viewed through the tool ?
   setting up manual review: set up oracle logon, role profile and user access table. manual reviewer for
     ge should not be able to see data for fr. done via table on db.
   transfer via ftp should be ascii – hummingbird 6.01.
   control file – records the number of records amended and how many will be sent to hygiene or
    matching. This file is held locally on pc.
   control files are timestamped on local pc.
   validation on screens using the data on the db sourced from trillium (address search) – how does the
    data get onto the db ? data loaded manually (not in batch) SQL Loader – part of DBA function –
    scripts should be available.
   there is a basic online help facility.
   multicountry functionality i.e. can use on ge or fr - can only access data for one country at a time.
   is this tool used in the uk ? yes.

   access to country data is decided when the user logs on.
   screen labelling is only in english.
   controlled by ini file. i.e. logos for different countries are specified in the .ini file
   screen 3 hygiene or matching review – select country at this stage.
   different screen for hygiene and matching. input files selected – standard outfile names allocated.
   saves as we go along, in case the pc crashes.
   no facility to move backwards in the file – i.e. can't go backwards only forwards.
   corrections are written immediately to the file i.e. as we go along record by record.
   validation of addresses via search address button.
   before and after images shown – can't change the “title gender field” – this is based on values in gender or
     title (note, potential title values are country specific).
   vanity addresses (outside of the PAF update file) will be forced through hygiene i.e. this has been
    specified by the customer and will not be updated using PAF.
   tool indicates the number of records in the file.
   refresh facilities if mistake made – saves copy before amendment.
   can't search for specific records (this is in the help desk tool). i.e. can't look for the occurrence of
     walsh in a specific file.
   accepted records go back through hygiene – they could be thrown out again.
   vanity or send directly on to matching to overcome problems raised during hygiene.
   search address screen.
   using current version of trillium (used for hygiene). data loaded into table – no direct interface with
     ssa for matching (uses dll)
   are trillium/ssa dlls linked to a specific version of the operating platform ?
   wildcards can be used searching.
   valid for nt4.0 client (sp5). only supported for nt4.0
   not officially supported for any other platform.
   matching – screen shows all possible matches.
   person clustering ? master and 2/3 slaves ? customer linking (links cus-seq-nos together) – takes the
     most recent data supplied.
   marks the records in the file – upd-indicator switch (I/U), and puts the cus_seq-no in the file.
   non matches have a blank cus-seq-no and a “I” in the indicator field.
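The marking rule in the two bullets above can be sketched like this; the comma-separated field layout is an assumption – the notes only say the indicator is I/U and the cus-seq-no is blank for non-matches:

```shell
mark_record() {
    # $1 = cus_seq_no returned by matching ("" when there was no match)
    if [ -n "$1" ]; then
        echo "U,$1"    # match: update, carry the matched cus-seq-no
    else
        echo "I,"      # no match: insert, cus-seq-no left blank
    fi
}
```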
   puts the master record's cus-seq-no in the field (to be rediscussed when the person clustering software is
     handed over)
   upper threshold in the ini file.
   fairly standard vb development – ssa matching, trillium for paf.
   tool was first created for gm custom (vb4 original development) – upgraded to vb6 but worth looking
     into bringing in line with vb6. especially if going to vb7 !
   exclusive usage on a file during manual review.

function prototype for ssa calls.

 person clustering (master and slave).
 one stored procedure added to the unicentre batch (currently held on the applicat server)
 significant amount of setup time required for person clustering – over 40 reference points.

11TH JULY 2002 09:30 – 11:30 (TOM FITZGERALD) PAF UPDATE (NO CR)
   trillium for data cleansing
   could trillium support PAF Update ? run PAF update for a given qtr.
   currently QAS used.
   sent file to
   (ACTION) - Tom to send PAF Update report
   how would trillium process records processed by PAF update (62,000)

   different quality codes QAS – 62,000 had cross section of quality codes used on QAS
   81%-85% went through straight away (no tuning) if tuning, higher (see Tom's report)
   addresses cleansed by QAS and trillium in parallel ?
   records on the base that have trillium error codes against them – 204,000 failed for different reasons
     (trillium parser, address verification). error codes will allow cleansing at a later date.
   functionality already in uk (QAS). just add functionality for ge/fr (trillium).
   ecm21 (GM Custom) – vauxhall (UK) using QAS. '99 ecm21 implemented for ge (using trillium). '01
     uk migrated to new version of ecm21 (kept trillium). '01 france implemented with trillium.
   PAF update (extract records in a particular file format from base). QAS tool allows comparison and
   trillium same scenario (extract records) – 40,000 rec files (trillium looks at file and cleanses – sorted
    post code)
   10 records through trillium, cleanse using new PAF file, trillium sets flags on file to identify specific
    change, then records reapplied based on cus_seq_number.
   technical design.
   ecm21 unique address key. possible key on base for storing PAF key (not sure whether paf keys are
     unique). trillium don't provide PAF changes file.
   PAF File from post office but QAS has a changes file too. trillium does not source a changes file.
   run every quarter – last run was december. (missed two).
   takes about 300,000–400,000 addresses an hour (fr – 1,000,000, ge – 3,000,000, about 7,000,000 addresses in uk)
   start Friday evening – complete Sunday.
   do not run paf updates together (because accessing same T_ADDRESS_TABLE) – but fr/ge can be
    run same weekend.
   extract no vanity addresses (cherished addresses in uk).
   must be run exclusively. switch off all calendars and no datamart activity, just run the initial jobs at the
     beginning of the batch (timestamping) – logstart.ksh.
   e.g. PAF_Update_driver_script.ksh runs 3 jobs – extraction in 400,000 chunks
     (PAF_Update_Extraction_SCR01.ksh) then puts file into hygiene directory – then runs specific scripts
     for country (hygiene) then runs PAF_Address_update (single for all countries) – only six scripts.
   only runs if the env variable RUNPAFUPDATE is set to true.
   after hygiene copied to matching directory for country and then runs PAF_ADDRESS_UPDATE.
   extraction number can be set via environment variables (PAF_FETCH_SIZE).
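Given PAF_FETCH_SIZE, the number of extraction batches for a country follows by ceiling division – a small sketch (the helper name is invented; the 20,000 default mirrors the figure suggested later in these notes for matching):

```shell
# Default batch size is an assumption based on the PAF_FETCH_SIZE = 20,000
# recommendation elsewhere in these notes.
PAF_FETCH_SIZE=${PAF_FETCH_SIZE:-20000}

num_chunks() {
    # ceiling division: extraction batches needed for $1 addresses
    echo $(( ($1 + PAF_FETCH_SIZE - 1) / PAF_FETCH_SIZE ))
}
```

For example, the 62,000-record PAF update sample mentioned earlier would take 4 batches at this size; the ~1,000,000 French addresses would take 50.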
   files are archived from the matching directory – does not copy to load directory (removed from the
    matching directory).
   DOCUMENT – Program specification.
   audit table to support extraction (identifies last extracted address_seq_num)
   customer linked to address via T_CUSTOMER_CONTACT_POINT – if address type then
    add_seq_num held in record.
   no write to reject file ! just skip the record.
   pro c and c supported by ksh. - trillium to tidy it up. no other software required.
   trillium v5.0.
   all runs on the server.
   scheduled runs with service bureaus.

PAF UPDATE (Co-ordinated via PAF_Update_Driver)
1. extraction (PAF_Update_Extraction_SCR01.ksh) – only takes non-vanity addresses.

2. revalidating against new paf file (Common_Hygeine_SCR01.ksh – country specific script)
 order (normal trillium processing) – based on flag (PAF_UPDATE_FLAG for conditional
 cleanse rude words/special (COMMON_Hygeine_SCR01.ksh – one for each country)
 parsed (breaking data into individual elements - trillium processing) COMMON_Hygeine_SCR01.ksh
 split file based on vanity and non-vanity addressing.
 geocode only the non-vanity addresses. (paf checked) - COMMON_Hygeine_SCR01.ksh – address
      verified addresses.

   output from geocoder – strip out records that have had no change applied or no issue with them.
     conditional step (identify changed addresses - based on PAF_UPDATE_flag) – just for paf
     processing. checks geocoded changed flags and forwards records to next steps – unchanged records are dropped.
   name & address re-construction
     uk specific programs (person name, company name and paf organisation name).
      software to build (uk_non_vanity_address_construction_psn01) – non-standard addressing
         created by trillium is converted to the vauxhall agreed format (address reconstruction). uk only
      structures in fr and germany – just use the standard trillium converter step to build output name
         and address. paf organisation names don't occur in fr/ge – no major issues (the bureaus don't
         source the information – field is available but not used in fr/ge).
      uk_vanity_address_reconstruction_psn01 – address fields are copied across (this is the way the
         customer wants to be known – but trillium is used to parse name and company parts of the address).
         paf-organisation is part of the address therefore copied straight across – this only applies to normal
         trillium processing not paf update.
     merge 2 files together (vanity/non- vanity) for normal trillium processing.
   separation for manual review and load file (COMMON_Hygeine_SCR01.ksh) based on flags set.

3. apply changes to the db (PAF_Address_Update)
       (PAF_UPDATE_Apply_Address_change_psn01) –
      ssa has keys to identify addresses and these are stored on the db (used in matching).
       calls ssa function to establish ssa address key (give address as parameter and key returned)
      reads file from matching directory, connects to base, g_process_paf_address makes calls to ssa to
          generate keys and performs update on the db. (PAF_Address_update.pc) – uses
          COMMON_matching_pc functionality.
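The three stages above, plus the RUNPAFUPDATE gate mentioned earlier, can be sketched as a driver; the echoes stand in for the real ksh jobs, whose internals are not reproduced here:

```shell
run_paf_update() {
    # the update only runs when the RUNPAFUPDATE environment variable is true
    [ "$RUNPAFUPDATE" = "true" ] || return 1
    echo extraction   # PAF_Update_Extraction_SCR01.ksh
    echo hygiene      # Common_Hygeine_SCR01.ksh (country specific)
    echo apply        # PAF_Address_Update
}
```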

 all held in paf-promotion directory. (document reflecting how to install software).
 test paf files before putting into production (trillium errors i.e. files don‟t work)
 should be sending a text based version of the paf file for the manual review tool. this is loaded onto the db.
    note, we don't call trillium directly from the manual review tool. nb. make sure the files used for
    manual review and normal processing are in sync.
 all manual review should be cleared before running a paf update.
 requires a week to receive the file, test, validate and set up the paf update activity. promoting files to
    production, copies to backup, setting up unicentre and scheduling the run with the customer.

   awaiting trillium licensing issues to be addressed but business objects licenses have been transferred to
   quarterly paf update work – copy on applicat (set up by Neil Veitch).
     incorporates changes to common hygiene scripts for ge, fr and uk and (pg31) to support PAF
     incorporates changes to trillium to cater for paf update.

   /u03/tril5v0/great_britain
     de_postal
     fr_postal

   replace detabcit causes problems – worth using the old one and update - trillium supplied files.
   pick off all non-vanity addresses from the database and place in paf_update folders (see document
     re: directory structure (p4) for requirements – set up the folder structure and permissions.
   probably requires most testing for great britain.
   parms directory - pfaccept
   MTR Hygiene Presentation ?
   converted, parser and geocoder.
   converted – rude words, special words.

   parser – decomposes the inbound record into a trillium friendly structure – also identifies if vanity or
     non-vanity (vanity.par, non-vanity.par) – uses parsout dictionary “dict” directory, gender checking i.e.
     Mary is female, title matches gender MRS and F
   geocoder – uses paf files to validate and cleanse addresses, if possible – if it can't be corrected then sent
     to manual review. might correct grafton road to grafton st based on postcode
   matching uses ssa.
   transformation puts data into 3474 file format – mapping of data from sources takes place in this step.
   most countries sort on postcode – easier for the geocoder to match against paf files.
   casing rules – put first letter to upper case, etc (trillium converter 4)
   identify errors (converter 5) OUT_FIELD_COMPARE lists error codes – see pfcondv4.par
   in france – if fail level 3 (no house number) copy error code to quality field. this isn't too much of an
     error in france but we can track where and when this occurs.
   trillium doesn‟t like tabs.
   just offsets have changed to cater for additional address changed fields in the parsout file.
   four fields added are locality_changed, postcode_changed, st_name_changed and st_part_changed.
   if we want to run PAF manually then RUN_PAF_UPDATE must be set to 1.
   matching runs into problems with large files so keep the population low – PAF_FETCH_SIZE = 20,000
   change to common_hygiene_scripts – skip first converter (rude names, etc) if paf_update flag set.
   change to common_hygiene_scripts – skip sixth converter if paf_update flag set. (family checks –
     france and germany don't want to send out info to families or businesses. only does converter for fr/ge
   each country has own common hygiene script.
   still to do UAT on PAF Update
   some volume testing.
   test cases and plans.
   no error reporting during testing – just keep going through test cases until they pass.
   used existing unit and system test env on antwerp.
   could use existing UAT environment.
   need to run scripts with permissions.
   spreadsheet listing all code that is being checked out of RCS.
   coupled with PAF Installation Guide.
   look on applicat server – all documents and unit test data and plans on applicat server.
   on antwerp /ln/dev_root/paf promotion - all source code for the development.
   matt spinks involved for UAT testing (user rep) GM
   germany/france – britta windt (user rep) – sabine/aurelie (EDS)

11TH JULY 2002 15:30 – 16:30 (DIERDRE MCMAHON) CR24
   mini apps (front end vb application and unix scripts) in rcs and on applicat.
   cold list loads (separate folders) – non-standard – this option allows new source codes to be added.
   standard prospect files can be loaded too but doesn’t allow new source code to be added.
   processed outside of the batch – usually manual process
   powerpoint presentation.
   vb6.0 application installed on local pc.
   business reqts (CR24_ABD.doc)
   technical spec (CR24_TDN01.doc) and prog spec (CR24_PSN01.doc)
   uses no non-standard/third party controls
   uses sockets to talk to unix server – no requirement for ODBC connectivity.
   if cold list customers are sourced from elsewhere then suppression indicator lifted.
   unix scripts will write messages to log file.
   uses standard batch scripts with minor alterations – prefixed by mini_app_……
   common application for all 3 countries – all labelling in English – select country at login.
   uses dp_user1 – has access to tables – possible write access issue.
   uses ecm21ma.ini file – all setting will need to be changed for UAT environment.
   uk has choice of source codes and file layouts i.e. sdif2 or sdif3.
   if you have wrong sdif for country – transformation will create an error message.

   source files need to be placed in the “cold list” directory.
   unit and system testing already undertaken – no UAT.
   dierdre to check whether this can be used to run standard jobs.
   dierdre to check which version of the sockets specification is required ?
   dierdre to check qualification of application to operating platform ?
   no formal collation of errors identified during testing for all CRs – just test until tests passed ?
   dierdre to check what coding standards have been followed for VB and UNIX.
   put mini_apps. scripts in bin directory and load app on local pc (and amend .ini file)
   user login must provide access to all tables (particularly matching, load and creation of new source
   test file in coldlist directory or files in standard prospect directories i.e. ibis, icats, etc. – available up to end of July.
