BaBar Analysis:
Think before you leap


        Fergus Wilson
Rutherford Appleton Laboratory
    BABAR Analysis School
         Feb 11, 2008
BaBar Analysis School

Blind Analysis
Systematic errors
Monte Carlo Pitfalls
General Rules for Analysis




  11th February 2008   BaBar Analysis School, Fergus Wilson   2
1 - Blind Analysis




  So oft in theologic wars,
  The disputants, I ween,
  Rail on in utter ignorance
  Of what each other mean,
  And prate about an Elephant
  Not one of them has seen!




Blind Analysis and Blind Faith

 “Seek and ye shall find”
 Examples
   o   The sun goes round the Earth. Obvious so why bother testing?
   o   Weapons of Mass Destruction.
   o   Pharmaceutical testing.
   o   Canals of Mars.
   o   UFOs (OK so they exist really).
 Famous (non)-Discoveries
   o   Parity Conservation in weak interactions. Obvious so why bother
       testing?
   o   17 keV neutrinos
   o   Leptoquarks
   o   Cold Fusion
   o   Higgs Discovery
   o   Top Discovery
   o   New Resonances that then disappear

                                 Could a Blind Analysis have
                                 prevented premature discoveries?

Simple Definition
 “ Applied to a test or experiment conducted by one
  person on another in which information about the
  test that may lead to bias in the results is
  concealed from the tester until after the test is
  made; orig. used of tests for determining the
  efficacy of drugs.”
 Good Review: Annu. Rev. Nucl. Part. Sci., 55 (2007)
  141-163.

Dangers of unblinded analysis

 If a theorist says you will see 1000 events:
   o   You optimize for a Branching Fraction measurement.
 If a theorist says you will see 0.1 event:
   o   You optimize for an Upper Limit.
 You only perform checks if the result is wrong.
 You optimize cuts on the data.
   o   Statistical fluctuations.
 You never (hardly ever) de-optimize your cuts.
   o   Errors on independent measurements are no longer Gaussian.
 You optimize cuts for the result you want.




Blinded Analysis:

 Advantages:
   o   Avoids experimenter’s bias.
   o   Selection criteria are more likely to be unbiased.
   o   Avoids the Upper Limit/Central Value dichotomy.
   o   Tests are decided before the result is known (and performed
       whatever the result).
   o   Useful when you have some idea of what you are looking for.
   o   Useful when looking at very rare processes.
 Disadvantages:
   o   Can add to the time to do an analysis.
   o   Can easily be subverted.
   o   Perhaps not very useful when doing searches (SUSY at LHC).



PDG

   (PDG history plots: every measurement agrees with the previous measurement)




Top at UA1



     Nature, July 1984




Top at UA1
 Associated Production of an isolated, large transverse momentum
     lepton (electron or muon) and two jets at the CERN pp̄ Collider
     G. Arnison et al., Phys. Lett. 147B, 493 (1984)

 Looking for

     pp̄ → W → t b̄, with t → b l ν



 Signature is isolated lepton
     plus MET and two jets
      o   Mass (jl) should peak at mt
      o   Mass (jjl) should peak at mW



What they found
6 events observed
0.5 expected




What was happening


          Hard to get m(lj1j2) below
          m(lj2) + 8 GeV
          (since pTj1 > 8 GeV)




Hard to get m(lj2) below 24 GeV
(since pTl > 12 GeV )
In fact 30–50 GeV is typical for
events just passing the pT cuts


The moral
 If the kinematic cuts tend to make events lie in the region
  where you expect the signal, you are really doing a “counting
  experiment” which depends on absolute knowledge of
  backgrounds
                   (Figure: efficiency × background = peak)




 UA1 claim was later retracted after analysis of more data and
  better understanding of the backgrounds (J/ψ, Υ, bb̄ and cc̄)
   o   The knowledge of heavy flavor cross sections and the
       calculational toolkit available at that time were much less
       complete
   o   Final limits from UA2 (UA1): mt > 69 (60) GeV


The good news
 Being wrong doesn’t
  stop you getting a
  Nobel Prize




Different approaches to Blind Analysis

 1) Hidden Signal Box:
  o   Hide the signal region (“signal box”) until analysis is complete.
  o   Useful for rare processes where the signal region is known in
      advance:
       • e.g. B decays: Know where these events fall in the ΔE and
         mES region.
        • e.g. tau decays: know what the mass of the tau is but not
          how many events to expect.
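A hidden signal box can be sketched as a simple veto. This is an illustrative sketch, not a BaBar prescription: the function name and cut values are invented, with the windows chosen to be roughly 2-3 sigma for typical B-decay resolutions.

```cpp
#include <cassert>
#include <cmath>

// Illustrative hidden-signal-box veto (cut values are assumptions, not a
// recipe): |DeltaE| < 0.1 GeV and mES > 5.27 GeV/c^2 roughly bracket the
// B signal. While the analysis is blind, events inside the box are skipped
// before any plot or fit is made.
bool inSignalBox(double deltaE, double mES) {
    return std::fabs(deltaE) < 0.1 && mES > 5.27;
}
```

Anything outside the box (the ΔE and mES sidebands) stays visible, which is what makes background studies possible while blind.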




Different approaches to Blind Analysis

 2) Blind Cut optimisation on subset of data:
   o   Half the MC and the data outside the “signal box” is used to
       optimize the cuts.
   o   The remaining half is used to obtain background normalisation.
   o   Not used much in BaBar as we are often statistics limited in
       MC as well as data.
 Variation on a theme:
   o   if an analysis is being updated, could reblind the original data
       set. Although this is possible, it often isn’t worth the bother.




Different approaches to Blind Analysis

 3) Hidden Answer:
  o   Useful in precision measurements of parameters.
  o   Can still do error analysis while blind.
  o   Plots of distributions do not give away answer.
   o   KTeV (direct CP and superweak vs. weak hypotheses):
         • Measure ε’/ε (related to K0L and K0S decays to ππ).
         • ε’/ε known to be about 10^-3 – 10^-4.
         • Add an unknown offset in the fitting program:

               (ε’/ε)(Hidden) = (±1) × (ε’/ε) + C

           C: hidden constant chosen by a deterministic
              random number generator.

  o   Can give datasets with different C to different analysis groups.
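The hidden-constant idea fits in a few lines of code. This is an illustrative sketch, not KTeV's actual implementation: the function names and the seeding scheme are assumptions. A blinding string deterministically fixes the offset C, so the same string always reproduces the same constant.

```cpp
#include <cassert>
#include <cmath>
#include <random>
#include <string>

// Illustrative hidden-offset blinding (not KTeV's real code). The blinding
// string deterministically seeds the generator, so the hidden constant C is
// reproducible but unknown to the analysts.
double hiddenOffset(const std::string& blindString, double scale) {
    std::seed_seq seed(blindString.begin(), blindString.end());
    std::mt19937 gen(seed);
    std::uniform_real_distribution<double> flat(-1.0, 1.0);
    return scale * flat(gen);   // C, fixed for a given blinding string
}

// Only blinded values are ever printed while the analysis is open.
double blind(double trueValue, const std::string& s, double scale) {
    return trueValue + hiddenOffset(s, scale);
}

// Applied once, after the analysis is frozen.
double unblind(double blindedValue, const std::string& s, double scale) {
    return blindedValue - hiddenOffset(s, scale);
}
```

Giving each group its own blinding string is exactly the "different C to different analysis groups" idea above.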

  11th February 2008         BaBar Analysis School, Fergus Wilson     17
Different approaches to Blind Analysis

 4) Hidden Answer and Asymmetry:
   o   Useful where shifting the central value is not enough or
       distributions reveal the true value.
        • e.g. CP Asymmetry:

              Δt(Hidden) = (±1) × Stag × Δt + C

          Stag: +1/−1 for B0/B0bar flavor tags
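This Δt blinding can be sketched with a hidden sign as well as a hidden offset. The function and argument names below are illustrative; the real BaBar blinding code differs in detail.

```cpp
#include <cassert>

// Illustrative asymmetry blinding: a hidden sign (+1 or -1) and a hidden
// offset hide both the magnitude and the shape of the CP asymmetry.
// stag = +1 for B0 flavor tags, -1 for B0bar flavor tags.
double blindDeltaT(double deltaT, int stag, int hiddenSign, double hiddenOffset) {
    return hiddenSign * stag * deltaT + hiddenOffset;
}
```

Multiplying by Stag means that simply splitting the sample by flavor tag no longer reveals the true asymmetry.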




How does BaBar do a ML Hidden Signal Blind Analysis?

 Define a blinded signal region:
    o   Often ΔE, mES or both.
 How large should the blinded region be?
    o   Not so large that it includes too much background.
    o   Base the size on the resolution of the variables (e.g. 2 or 3 sigma).
 Choose loose cuts based on:
    o   Signal MC.
    o   uds continuum MC.
    o   B generics MC (“peaking”).
 Look at data outside the signal region:
    o   Off-resonance data.
    o   Sidebands (watch out for feeddown).
    o   Cross-check with MC.
 Extract PDFs from:
    o   Signal MC.
    o   uds MC / off-resonance data.
    o   B backgrounds MC (“peaking”).



How BaBar does a ML Blind Analysis

 Perform ML fits using the MC to:
   o   Use PDFs to generate distributions.
         • Assumes no correlations.
   o   Multiple “toy” fits to understand distribution of results.
   o   Correlations.
   o   Fitting errors and convergence.
   o   Warning: there is a circular argument here: PDFs -> toys -> Fits
       -> PDFs. Can cross-check by using full MC which will have
       correlations in it.
         • But are the correlations the same as in data?
 Sometimes fit to data with result blinded and only
  show error:
   o   Frowned upon: if error is ±5 events, 25 is a pretty good guess
       for the central value.

How BaBar does a ML Blind Analysis
 If possible fit to an unblinded Data sample that has
  similar characteristics (“calibration channel”):
   o   Allow as many signal parameters to vary as possible.
   o   Quite often this is a D decay to the same final state.
   o   Compare errors and shifts with MC values.
   o   Can be difficult to interpret if the calibration channel has
       large background.
 Finally fit to Data:
   o   Usually the signal PDF parameters are fixed while the
       background PDF parameters are allowed to float:
        • We are not certain of continuum distributions.
 Not finished yet:
   o   Need to compare fitted PDFs to data distributions:
        • sPlots and LH ratios (see later this week).
   o   Need to perform systematics analysis

2 - Systematic Errors: General Introduction

 A nice introduction: Annu. Rev. Nucl. Part. Sci., 57 (2007)
  145-169.
 “Systematic errors are errors that produce a result
  that differs from the true value by a fixed amount.
  These errors result from biases introduced by the
  instrument, method, or human factors.”
 “The nature of systematic errors is that they may
  not cause different answers when the experiment is
  repeated”.
 “The reliable assessment of systematic errors
  requires much more thought and work than for the
  corresponding statistical error.”


It’s not just mathematics
                         Who’s right?

   (Cartoon: Result A and Result B, obtained by different people
    using Method A and Method B)

                       Now who’s right?


Correlated Systematic Errors

 Example: 2-body decay B0 → π+π−
  o   B mass = [(E1+E2)² − (p1+p2)²]^(1/2)
       • Energy comes from calorimeter
       • Momentum from tracking detectors.
  o   But if magnetic field is incorrect BOTH p1 and p2 will have
      either a higher or lower momentum.
  o   The systematic errors on the momenta are correlated.
  o   Errors associated with each track must be added linearly to
      form overall tracking error.
  o   Same is often true of PID:
       • A possible exception: B → Kπ. If “K” is defined as “not π”,
         the systematic errors could be anti-correlated and cancel.
         Depends on what you are measuring.
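The arithmetic difference is easy to see in a sketch (the numbers are illustrative): two fully correlated 0.5% per-track uncertainties add linearly to 1.0%, while (wrongly) treating them as independent would give only ~0.71%.

```cpp
#include <cassert>
#include <cmath>

// Fully correlated per-track systematics add linearly; independent ones add
// in quadrature. Using quadrature for a correlated source (e.g. a common
// magnetic-field miscalibration) underestimates the total error.
double combineLinear(double e1, double e2)     { return e1 + e2; }
double combineQuadrature(double e1, double e2) { return std::sqrt(e1 * e1 + e2 * e2); }
```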



Different Types of Systematic Errors
 Separating the different errors is important:
   o   Determining significance of result.
   o   Combining different measurements or decays.
 Additive Systematic Errors
   o   Affect the central value and the statistical significance of a
       result.
   o   e.g. Biases in fits.
 Uncorrelated (Multiplicative) Systematic Errors
   o   Affect the central value but not the statistical significance of
       a result.
   o   e.g. Daughter Particle Branching Fractions.
 Correlated Systematic Errors
   o   Affect all modes or results in the same way.
   o   e.g. PID, Tracking.


BaBar Systematic errors

 Each analysis (or type of analysis) has its own set of
  systematic errors. Some are general to all analyses.
   o   PID efficiency
   o   Tracking efficiency
   o   Neutrals efficiency
   o   K0s efficiency
   o   Fitting biases
   o   Backgrounds
   o   MC
   o   B counting
   o   etc…




Tracking Efficiency
 http://www.slac.stanford.edu/BFROOT/www/Physics/TrackEffi
  cTaskForce/TrackingTaskForce-2007-R22.html
 Good News: no tracking efficiency correction needed in Release
  22.
 Add systematic uncertainty linearly per track.
 May not be suitable for very low PT tracks or in very high
  multiplicity events
                        Systematic Uncertainty per track (%)
                        GTL                  GTVL                     CT
Run 1                   .652                 .674                     .719
Run 2                   .377                 .351                     .202
Run 3                   .476                 .454                     .496
Run 4                   .620                 .662                     .630
Run 5                   .678                 .674                     .333
Run 6                   .277                 .298                     .226
Average                 .236                 .229                     .142

K0s Tracking Efficiency (BAD677)
 http://www.slac.stanford.edu/BFROOT/www/Physics/TrackEffi
  cTaskForce/TrackingTaskForce-2007.html#KsRecipe
 You need to store for each MC K0s:
    o   Transverse Flight Length
    o   Transverse Momentum
    o   Polar angle in degrees
        (all with respect to the primary interaction point,
         not the B decay)
 Identify the correction table that has the cuts that most
  closely match your analysis (see URL).
 Run the root macros from the URL:
   o   Separate macro for each run period (aarrrgh…!)
   o   Different macros for 1 or 2 K0s.
   o   It would be nice if the macros were a bit more friendly.
 Will get efficiency correction and error. Remember to weight
  by luminosity per run.
 Minor complication to do with “FEX bug” in run 5.
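The per-run weighting mentioned above amounts to a luminosity-weighted mean. The function below is an illustrative sketch, not part of the task-force macros:

```cpp
#include <cassert>
#include <vector>

// Luminosity-weighted average of per-run-period efficiency corrections
// (illustrative sketch): corr[i] is the correction for run period i and
// lumi[i] the integrated luminosity collected in that period.
double lumiWeighted(const std::vector<double>& corr,
                    const std::vector<double>& lumi) {
    double num = 0.0, den = 0.0;
    for (std::size_t i = 0; i < corr.size(); ++i) {
        num += corr[i] * lumi[i];
        den += lumi[i];
    }
    return num / den;
}
```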

PID Efficiency

 http://www.slac.stanford.edu/BFROOT/www/Physics
  /Tools/Pid/PidSelectors/index.html
 There are a large number of PID selectors
 There is a method for calculating a correction called
  “PID weighting”/”PID tweaking”
   // Uncorrected: theSelector is a particular selector, e.g. “PidLHElectrons”
   result = theSelector->accept(myCand);

   // Corrected: loop over the PID efficiency-corrected list for this selector
   HepAListIterator<BtaCandidate> iter(*pidList);
   while (pidcand = iter()) {
       if (myCand->overlaps(*pidcand)) { accept = true; break; }
   }

 There is NO common recipe for calculating
  systematics. You have to create your own.

Creating your own recipe
 Compare Data to MC (no corrections).
 Compare PID-Killing with no correction.
 Compare PID-Weighting with no correction.
 Compare to a control sample.
 Look at a similar analysis and quote those results
  (but be sensible).
 The full Monte:
    o   Run w/o corrections on MC: ε = εsignal,MC
    o   Run with PID-Killing on control sample (CS): ε1 = εCS,Data
    o   Run with PID-Tweaking: ε2 = εSignal,MC x (εCS,Data/εCS,MC)
    o   Correction: ε2 / ε1
 But be sensible. If your analysis is not limited by
    PID systematics you don’t need to do this.
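The "full Monte" recipe above reduces to a single ratio. The function below is an illustrative sketch following the slide's ε definitions; the name and signature are invented:

```cpp
#include <cassert>
#include <cmath>

// "Full Monte" PID correction as defined on the slide (illustrative sketch):
//   eps1 = eps(CS, Data)                                  (PID-Killing on control sample)
//   eps2 = eps(Signal, MC) * eps(CS, Data) / eps(CS, MC)  (PID-Tweaking)
//   correction = eps2 / eps1
double pidCorrection(double epsSignalMC, double epsCSData, double epsCSMC) {
    const double eps1 = epsCSData;
    const double eps2 = epsSignalMC * (epsCSData / epsCSMC);
    return eps2 / eps1;
}
```

The deviation of this correction from 1 is one handle on the PID systematic.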

Some PID systematics

  Choose an analysis
 similar to yours and
      quote those
      systematics




Neutral Efficiency (photons, π0, K0L)
 http://www.slac.stanford.edu/BFROOT/www/Physics/Analysis/
  AWG/Neutrals
 Corrections to energy Scale and Resolution are applied at run
  time:
    talkto EmcNeutCorrLoader {
         correctionOn set true
    }


   Efficiency Correction choices:
    1.   Average correction with 3% systematic per π0.
    2.   A momentum-corrected correction with 3% systematic per π0.
    3.   Efficiency correction tables from BAD 870.
o   The hope is that this systematic will soon come down to ~1%.
o   There are also corrections for K0L and single photons (not
    covered here).


PDF parameters
 Many analyses are dominated by the uncertainty on the fixed
  parameters used in the Maximum Likelihood fit.
 How to estimate uncertainty on parameters:
   o   It may not be the statistical error on the fitted signal MC because
       you have many more MC signal events than your final data set.
 Options:
   o   Find a data calibration mode where you can float as many of the signal
       parameters as possible.
   o   Float individual parameters on your data and see if there is significant
       shift in the results.
   o   Compare to other analyses. If a similar analysis sees a shift then
       perhaps you should as well.
   o   There are quite a few programs that will automatically vary all the
       parameters by their errors and calculate the systematic error taking
       into account the correlations.
 It is a black art and there is no correct answer.
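A minimal version of the vary-by-error option above can be sketched as follows, ignoring correlations (illustrative; the automated tools mentioned on the slide propagate the full covariance matrix instead, and `result` here stands in for a complete refit):

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <vector>

// Illustrative sketch: vary each fixed parameter by +/- its error,
// re-evaluate the result, take the larger |shift| per parameter, and add
// the pieces in quadrature. Valid only for uncorrelated parameters.
double parameterSystematic(
    const std::function<double(const std::vector<double>&)>& result,
    std::vector<double> params, const std::vector<double>& errors) {
    const double nominal = result(params);
    double sum2 = 0.0;
    for (std::size_t i = 0; i < params.size(); ++i) {
        const double p0 = params[i];
        params[i] = p0 + errors[i];
        const double up = result(params) - nominal;
        params[i] = p0 - errors[i];
        const double down = result(params) - nominal;
        params[i] = p0;   // restore the nominal value
        const double shift = std::max(std::fabs(up), std::fabs(down));
        sum2 += shift * shift;
    }
    return std::sqrt(sum2);
}
```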

Other Common Systematics to consider

 B-Counting: 1.1% (but different in tau studies).
 Upsilon(4S) to B0B0bar/B+B- : 0.5%
 MC statistics: due to limited statistics (but don’t
    double count with statistical error on efficiency).
   Daughter Branching Fractions: from PDG.
    Fit Bias: the systematic is often taken as half the fit bias.
   BB backgrounds: Often fixed in fits, so redo fits
    with different background levels.
   Matter/anti-matter asymmetry of Kaons:
    Calorimeter is made of matter so could respond
    differently to K±. Thought to be small ~0.3-0.5%.


Homework

 Why is it good to have 2 detectors performing the
  same experiment?
 If two measurements from BaBar are combined,
  which systematics are correlated?
 If two measurements from BaBar and Belle are
  combined, which systematics are correlated?




Systematic Errors Conclusion

 Many systematics can be calculated before you
  unblind.
 The number of systematic errors to be considered is
  infinite:
   o   Many are unimportant but…
   o   Phases of the moon and train timetables: turned out to be
       important for LEP.
 Concentrate on the important systematics.
 “Systematic studies will expand to fill the time
  available for their completion”.




3 - Monte Carlo Pitfalls
    Generic MC.
    uds MC.
    Tagbits.
    Geant.
    Photon Emission.
    MC matching.
    Resonance Width.
    Interference.
    Backgrounds.
    Check the Decay File.
    MC is not Perfect.
    Finding Modes is difficult.

                       There is no such thing as “MC Truth”

The Signal Decay File
 Signal Modes
  o   Over 8000 modes.
       • Finding your mode is very time consuming
       • Get to know BbkSPModes and grep.
  o   Some are obsolete.
  o   Some are repeated but with changes.
       • Document modes used in your analysis in the BAD.
  o   Some have mixing in, some do not.
  o   Some have incorrect transversity distributions.
  o   The daughter branching fraction may not be correct or
      changed with time.
       • Always check the decay file.
  o   Most decay files do not have interference:
       • Dalitz analyses.
                              Get to know your signal decay files


The Background Decay Files
 The B generic MC is not a perfect record of all B
  decays.
   o   Not all modes are there.
   o   Some modes have enhanced BFs.
 Useful for identifying possible background modes:
   o   Run over B generics.
   o   Find which decays are accepted by your analysis
   o   Find the equivalent signal decay mode
   o   Run over signal decay mode
   o   Fit the resulting backgrounds or count number of events.
 uds continuum MC:
   o   Does not contain all background resonances
   o   Overall normalisation not well known.
        • Float normalisation in fits.


The Simulation and Reconstruction

 MC not perfect:
  o   Always need to apply corrections (see systematics section).
  o   Time dependent:
       • Different run periods have different conditions.
       • Always select runs over a number of different periods.
 MC matching not perfect:
  o   mcFromReco sometimes fails:
       • Neutrals
       • Resonances
  o   Photon emission can be confusing (e.g. a soft γ radiated in
      the B decay).

                mcFromReco only returns true if the reconstructed
                candidate has exactly the same number of tracks as
                the MC truth.
The Simulation and Reconstruction
 Tagbits and skims
   o   Some MC is skimmed, some is not.
   o   Your analysis may only use a subset of the tags or need to merge tags.
        • e.g. BFourHHHH and BFourHHHK.
   o   Need to ensure that your code requires the tagbit for all input files.
 Resonances
    o   Not all resonances are Breit-Wigners (but sometimes they are):
         • ρ0, f0, K*(1430)
        • Check how they were simulated.
 MC cuts:
   o   Some MC have acceptance cuts already applied
        • Particularly Bhabhas.
 Beware of the tails of distributions.
   o   This is where the MC will be most wrong.
   o   Cut loosely here if you have to.



Monte Carlo Conclusion

 There is no such thing as “MC Truth”.
 BaBar MC is very sophisticated.
 Don’t just accept what you are given. Check the
  assumptions.

                                          Trust but verify




4 – General Rules for Analysis
 Try writing an abstract. What are the goals of the
  analysis?
 Is the analysis worth doing? Is it publishable?
   o   Will your results be new?
   o   Will your results be significantly better than published
       results?
   o   Always do a back-of-the-envelope calculation first.
 Don’t use your data to choose your cuts.
   o   Statistical fluctuations.
   o   Introduces biases (when have you ever chosen a worse set of
       cuts based on what you see?)
 Keep your cuts simple and limited.
   o   The more cuts you add, the more systematics to consider.
   o   More cuts mean more correlations.


Enough and no more

  Are there too many cuts?
  Do the cuts do anything?
  Are they correlated?
  Do they add unnecessary systematics?




General Rules

 Always check to see if your signal is robust as you
  vary cuts:
   o   Vary cuts and see if BF changes significantly
   o   Split analysis into different run periods (e.g. Run 4).
   o   Examples:
        • LASS shape
        • D* feeddown

       (Plot: signal yield vs. selection criteria)




General Rules

 Look at distributions before you apply your cuts.
 Look at distributions after you unblind:
   o   e.g. if you do not use helicity in your selection, look at the
       helicity after you unblind to see that it is correct.
   o   But keep in mind efficiency corrections (see next page).
 Use MC to measure efficiencies and backgrounds
   o   BUT cross-check with:
        • Off resonance data.
        • Sideband data.
   o   Remember the MC caveats we talked about.
 Systematic errors:
   o   Do not underestimate.
   o   Do not overestimate.

Spin in Top decays
      Because its mass is so large, the top quark is expected to
       decay very rapidly.
      No time to form a top meson.
      Top → Wb decay then preserves the spin information:
        o   reflected in decay angle and momentum of the lepton in
            the W rest frame (left-handed, right-handed and
            longitudinal components).
      We find the fraction of RH W’s to be (95% CL):
       F+ = 0.08 ± 0.08 ± 0.05 (DØ)
          < 0.09 (CDF)
      CDF finds the fraction of longitudinal W’s to be:
       F0 = 0.74 +0.22 −0.34
       (lepton pT and cos θ* combined)
      In the SM, F+ ≈ 0 and F0 ~ 0.7.
              All consistent with the SM.

       (Plot: cos θ* distributions, L = 230 pb−1)
                                          PRD 72, 011104 (2005)
Combining errors - 1
 Be careful how you combine
  errors.
 Are these errors
  compatible?
 Is the average meaningful?
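A standard combination can be sketched as an inverse-variance weighted mean plus a compatibility check; when χ²/ndf > 1 the error is inflated by the scale factor S = √(χ²/ndf). This is an illustrative sketch of the PDG-style procedure, not PDG code:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Inverse-variance weighted mean with a chi2-based scale factor (sketch of
// the PDG averaging procedure): incompatible inputs inflate the quoted error.
struct Average { double value; double error; double scale; };

Average weightedAverage(const std::vector<double>& x,
                        const std::vector<double>& e) {
    double sw = 0.0, swx = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        const double w = 1.0 / (e[i] * e[i]);
        sw += w;
        swx += w * x[i];
    }
    const double mean = swx / sw;
    double chi2 = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
        chi2 += (x[i] - mean) * (x[i] - mean) / (e[i] * e[i]);
    const double ndf = static_cast<double>(x.size()) - 1.0;
    const double S = (ndf > 0.0 && chi2 > ndf) ? std::sqrt(chi2 / ndf) : 1.0;
    return { mean, S * std::sqrt(1.0 / sw), S };
}
```

A scale factor well above 1 is exactly the warning sign on this slide: the inputs disagree, so think before quoting the average.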




Combining errors - 2

 Be careful how you interpret errors




Combining errors - 3


                                             Kamiokande and
                                             Super-Kamiokande
                                             do not agree, so
                                             what does the
                                             average mean?




BaBar Specific
 Do you like to work by yourself or in a group?
   o   Might determine the analysis you do.
 Check early to see if your signal and background MC is
  available:
   o   Turnaround can be many months.
 Not all analyses are equal:
   o   Core Analyses       – Unitarity Triangle etc…
   o   Important Analyses – BF, First Observations.
   o   Speculative Analyses – Pentaquarks.
   o   Historical Analyses – αS , Michel Parameters, …
 BaBar Documentation:
   o   In general, it either does not exist or is wrong. Always check how
       up to date it is.
   o   Write your own and offer it to the collaboration.
   o   The exception is the BAD repository.
 Selection Criteria:
   o   Always check for correlations (e.g. Neural Net selection versus Dalitz
       plot position).


BaBar Specific
 Use a loose pre-selection if possible:
   o   Can tighten later without rerunning over all the data.
   o   Maximum Likelihood methods don’t need tight cuts.
 Hypernews:
   o   Unfortunately, some people think hypernews is the same as
       documentation – “It’s on hypernews”.
   o   The search function is not ideal.
   o   It can be a good source of bad information and should be looked at
       before asking a question.
   o   Ask yourself how old the information is.
 Publication:
   o   Look at the Pub Board pages now. There are lots of hoops to jump
       through so understand what is needed now:
       http://www.slac.stanford.edu/BFROOT/www/Organization/PubBoard/
       index.html
 Speakers Bureau
   o   Writing a talk? Look at the Publication Database:
       http://oraweb.slac.stanford.edu/pls/slac/babar_documents.startup



Conclusion

 BaBar is a mature experiment and many problems
  have already been ironed out.
   There is usually no reason to re-invent the wheel.
   Sometimes the wheel could be improved.
   In September 2008, you will have access to a
    consistent set of Data and MC all processed under
    the same conditions.
   Have fun…





				