friccita/boosted-tau-analysis

Setup (only needs to be done once)

1. Login to ucdavis.cern.ch

2. Execute

cmsrel CMSSW_5_3_11
cd CMSSW_5_3_11/src
cmsenv

For Rachel:
git init
git pull git@github.com:rpyohay/boosted-tau-analysis

For everyone else:
git clone https://github.com/rpyohay/boosted-tau-analysis.git BoostedTauAnalysis

git clone https://github.com/yiiyama/Toolset.git
git clone https://github.com/rpyohay/physics-tools.git PhysicsTools
git clone https://github.com/cms-sw/RecoLuminosity-LumiDB.git RecoLuminosity/LumiDB
cd RecoLuminosity/LumiDB
git checkout V04-02-10
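# check out pristine copies of the packages below in a scratch release area,
# then copy them into the working release so they can be patched and rebuilt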
mkdir ~/Tau
cd ~/Tau
cmsrel CMSSW_5_3_11
cd CMSSW_5_3_11/src
cmsenv
git cms-addpkg DataFormats/HepMCCandidate
git cms-addpkg DataFormats/MuonReco
git cms-addpkg DataFormats/JetReco
git cms-addpkg DataFormats/TauReco
git cms-addpkg RecoTauTag/RecoTau
git cms-addpkg PhysicsTools/PatUtils
git cms-addpkg PhysicsTools/Utilities
cp -r DataFormats/ ~/CMSSW_5_3_11/src/
cp -r RecoTauTag/ ~/CMSSW_5_3_11/src/
cp -r PhysicsTools/PatUtils/ ~/CMSSW_5_3_11/src/PhysicsTools/
cp -r PhysicsTools/Utilities/ ~/CMSSW_5_3_11/src/PhysicsTools/
cd ~/CMSSW_5_3_11/src/
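# overwrite selected source files with patched versions from public AFS areas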
cp ~friccita/public/HepMCCandidate/src/classes* DataFormats/HepMCCandidate/src
cp ~friccita/public/MuonReco/src/classes* DataFormats/MuonReco/src
cp ~friccita/public/JetReco/src/classes* DataFormats/JetReco/src
cp ~yohay/public/CMSSW_5_3_11/src/DataFormats/TauReco/src/* DataFormats/TauReco/src/
cp -r ~yohay/public/CMSSW_5_3_11/src/RecoTauTag/Configuration RecoTauTag
scram b -j16
rm -rf ~/Tau

3. Execute

mkdir -p /data1/`whoami`/QCD/analysis
mkdir -p /data1/`whoami`/QCD/EDM_files
mkdir -p /data1/`whoami`/QCDB/analysis
mkdir -p /data1/`whoami`/QCDB/EDM_files
mkdir -p /data1/`whoami`/QCDBMu/analysis
mkdir -p /data1/`whoami`/QCDBMu/EDM_files
mkdir -p /data1/`whoami`/DYJetsToLL/analysis
mkdir -p /data1/`whoami`/DYJetsToLL/EDM_files
mkdir -p /data1/`whoami`/SingleTop/analysis
mkdir -p /data1/`whoami`/SingleTop/EDM_files
mkdir -p /data1/`whoami`/TTJets/analysis
mkdir -p /data1/`whoami`/TTJets/EDM_files
mkdir -p /data1/`whoami`/WNJetsToLNu/analysis
mkdir -p /data1/`whoami`/WNJetsToLNu/EDM_files
mkdir -p /data1/`whoami`/Wbb/analysis
mkdir -p /data1/`whoami`/Wbb/EDM_files
mkdir -p /data1/`whoami`/WJetsToLNu/analysis
mkdir -p /data1/`whoami`/WJetsToLNu/EDM_files
mkdir -p /data1/`whoami`/WW/analysis
mkdir -p /data1/`whoami`/WW/EDM_files
mkdir -p /data1/`whoami`/WZ/analysis
mkdir -p /data1/`whoami`/WZ/EDM_files
mkdir -p /data1/`whoami`/Wh1_Medium/EDM_files
mkdir -p /data1/`whoami`/gg/EDM_files
mkdir -p /data1/`whoami`/ZH/analysis
mkdir -p /data1/`whoami`/ZH/EDM_files
mkdir -p /data1/`whoami`/VBF/analysis
mkdir -p /data1/`whoami`/VBF/EDM_files
mkdir -p /data1/`whoami`/ZZ/analysis
mkdir -p /data1/`whoami`/ZZ/EDM_files
mkdir -p /data1/`whoami`/nonIsoWDYJetsToLL/analysis
mkdir -p /data1/`whoami`/nonIsoWDYJetsToLL/EDM_files
mkdir -p /data1/`whoami`/nonIsoWTTJets/analysis
mkdir -p /data1/`whoami`/nonIsoWTTJets/EDM_files
mkdir -p /data1/`whoami`/nonIsoWWNJetsToLNu/analysis
mkdir -p /data1/`whoami`/nonIsoWWNJetsToLNu/EDM_files
mkdir -p /data1/`whoami`/data/analysis
mkdir -p /data1/`whoami`/data/EDM_files
mkdir -p /data1/`whoami`/nonIsoWData/analysis
mkdir -p /data1/`whoami`/nonIsoWData/EDM_files
mkdir -p /data1/`whoami`/SinglePhotonParkedData/analysis
mkdir -p /data1/`whoami`/SinglePhotonParkedData/EDM_files
mkdir -p /data1/`whoami`/results

4. Put the following lines in your rootlogon.C script:

gSystem->Load("libRooFit.so");
gSystem->Load("libRooFitCore.so");
gSystem->SetIncludePath("-I/afs/cern.ch/cms/slc5_amd64_gcc462/lcg/roofit/5.32.00/include");

5. Execute

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test
cmsenv
make -f STLDictionary_Makefile
root -l -b -q compilePlottingCode.C

Running TauAnalyzer analysis jobs over data and MC

1. Login to lxplus.cern.ch

2. Execute

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test
./generateJobFiles.sh v<version_number_1>
./generateJobFiles.sh v<version_number_2> tauanalyzer_WNJetsToLNu_Wh1_narrowMassBins_template_cfg.py tauanalyzer_data_narrowMassBins_template_cfg.py tauanalyzer_nonisodata_narrowMassBins_template_cfg.py
./submitAll.sh v<version_number_1>
./submitAll.sh v<version_number_2>

to launch the batch jobs.  The STDOUT files created by the batch jobs will be written to ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/v<version_number>/LSFJOB_XXXXXXXXX/STDOUT, where XXXXXXXXX is the job number.  These files tell you whether the jobs finished successfully.  In what follows, <version_number_1> refers to a run of the analysis with standard mass binning, while <version_number_2> refers to a run of the analysis with 0.25-GeV bin widths (used for the J/psi fit).
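
To check the whole set at once, you can scan the STDOUT files for the LSF success marker (a minimal sketch; verify the exact string against one of your own logs):

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/v<version_number_1>
# count jobs whose STDOUT contains the LSF success marker
grep -l "Successfully completed." LSFJOB_*/STDOUT | wc -l
# list jobs missing the marker (candidates for resubmission)
grep -L "Successfully completed." LSFJOB_*/STDOUT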

3. Once the batch jobs are finished, login to ucdavis.cern.ch and execute

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test
cmsenv
./copyAll.sh v<version_number_1>
./copyAll.sh v<version_number_2>
cd v<version_number_1>
./runggTauAnalyzerCfgs.sh
./runWh1TauAnalyzerCfgs.sh
./runZHTauAnalyzerCfgs.sh
./runVBFTauAnalyzerCfgs.sh
./runWh1IsoSignalMETUncertaintyTauAnalyzerCfgs.sh
./runggIsoSignalMETUncertaintyTauAnalyzerCfgs.sh

4. Make sure that in your rootlogon.C script, you DO NOT have the line

AutoLibraryLoader::enable();

Then execute

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test
cmsenv
./runStandardAnalysis.sh <version_number_2> <version_number_1>

If you also want to make plots where no HPS tau isolation cut is applied, edit the root commands in runStandardAnalysis.sh so that the flag "true" is passed as the last argument to each function.
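
For example, a call of the form

root -l -b -q 'formatPlots.C+("v<version_number_1>")'

would become

root -l -b -q 'formatPlots.C+("v<version_number_1>", true)'

(The macro name and argument list here are illustrative; use the calls actually present in runStandardAnalysis.sh.)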

To calculate the signal uncertainties, run

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test
cmsenv
./calculateBVetoUncertainties.sh v<version_number_1>

The uncertainties will be printed as text.

5. Check the output files:

/data1/<your_username>/DYJetsToLL/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/DYJetsToLL/analysis/muHadIsoAnalysis_DYJetsToLL_v<version_number>.root
/data1/<your_username>/DYJetsToLL/analysis/muHadNonIsoAnalysis_DYJetsToLL_v<version_number>.root
/data1/<your_username>/DYJetsToLL/analysis/muHadAnalysis_DYJetsToLL_v<version_number>.root

/data1/<your_username>/SingleTop/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/SingleTop/analysis/muHadIsoAnalysis_T_v<version_number>.root
/data1/<your_username>/SingleTop/analysis/muHadNonIsoAnalysis_T_v<version_number>.root
/data1/<your_username>/SingleTop/analysis/muHadAnalysis_T_v<version_number>.root

/data1/<your_username>/TTJets/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/TTJets/analysis/muHadIsoAnalysis_TTJets_v<version_number>.root
/data1/<your_username>/TTJets/analysis/muHadNonIsoAnalysis_TTJets_v<version_number>.root
/data1/<your_username>/TTJets/analysis/muHadAnalysis_TTJets_v<version_number>.root

/data1/<your_username>/WNJetsToLNu/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/WNJetsToLNu/analysis/muHadIsoAnalysis_WNJetsToLNu_v<version_number>.root
/data1/<your_username>/WNJetsToLNu/analysis/muHadNonIsoAnalysis_WNJetsToLNu_v<version_number>.root
/data1/<your_username>/WNJetsToLNu/analysis/muHadAnalysis_WNJetsToLNu_v<version_number>.root

/data1/<your_username>/WW/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/WW/analysis/muHadIsoAnalysis_WW_v<version_number>.root
/data1/<your_username>/WW/analysis/muHadNonIsoAnalysis_WW_v<version_number>.root
/data1/<your_username>/WW/analysis/muHadAnalysis_WW_v<version_number>.root

/data1/<your_username>/WZ/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/WZ/analysis/muHadIsoAnalysis_WZ_v<version_number>.root
/data1/<your_username>/WZ/analysis/muHadNonIsoAnalysis_WZ_v<version_number>.root
/data1/<your_username>/WZ/analysis/muHadAnalysis_WZ_v<version_number>.root

/data1/<your_username>/Wh1_Medium/muHadIsoAnalysis_<MT_Bin>_Wh1_a<a1_mass>_v<version_number>.root

/data1/<your_username>/gg/muHadIsoAnalysis_<MT_Bin>_gg_a<a1_mass>_v<version_number>.root

/data1/<your_username>/ZH/muHadIsoAnalysis_<MT_Bin>_ZH_a<a1_mass>_v<version_number>.root

/data1/<your_username>/VBF/muHadIsoAnalysis_<MT_Bin>_VBF_a<a1_mass>_v<version_number>.root

/data1/<your_username>/ZZ/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/ZZ/analysis/muHadIsoAnalysis_ZZ_v<version_number>.root
/data1/<your_username>/ZZ/analysis/muHadNonIsoAnalysis_ZZ_v<version_number>.root
/data1/<your_username>/ZZ/analysis/muHadAnalysis_ZZ_v<version_number>.root

/data1/<your_username>/nonIsoWDYJetsToLL/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/nonIsoWDYJetsToLL/analysis/muHadIsoAnalysis_nonIsoWDYJetsToLL_v<version_number>.root
/data1/<your_username>/nonIsoWDYJetsToLL/analysis/muHadNonIsoAnalysis_nonIsoWDYJetsToLL_v<version_number>.root
/data1/<your_username>/nonIsoWDYJetsToLL/analysis/muHadAnalysis_nonIsoWDYJetsToLL_v<version_number>.root

/data1/<your_username>/nonIsoWTTJets/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/nonIsoWTTJets/analysis/muHadIsoAnalysis_nonIsoWTTJets_v<version_number>.root
/data1/<your_username>/nonIsoWTTJets/analysis/muHadNonIsoAnalysis_nonIsoWTTJets_v<version_number>.root
/data1/<your_username>/nonIsoWTTJets/analysis/muHadAnalysis_nonIsoWTTJets_v<version_number>.root

/data1/<your_username>/nonIsoWWNJetsToLNu/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/nonIsoWWNJetsToLNu/analysis/muHadIsoAnalysis_nonIsoWWNJetsToLNu_v<version_number>.root
/data1/<your_username>/nonIsoWWNJetsToLNu/analysis/muHadNonIsoAnalysis_nonIsoWWNJetsToLNu_v<version_number>.root
/data1/<your_username>/nonIsoWWNJetsToLNu/analysis/muHadAnalysis_nonIsoWWNJetsToLNu_v<version_number>.root

/data1/<your_username>/data/analysis/muHadNonIsoAnalysis_SingleMu_v<version_number>.root

/data1/<your_username>/nonIsoWData/analysis/isoVsNonIsoTaus_normalizedTo1_v<version_number>.root
/data1/<your_username>/nonIsoWData/analysis/nonIsoW_muHadIsoAnalysis_SingleMu_v<version_number>.root
/data1/<your_username>/nonIsoWData/analysis/nonIsoW_muHadNonIsoAnalysis_SingleMu_v<version_number>.root
/data1/<your_username>/nonIsoWData/analysis/nonIsoW_muHadAnalysis_SingleMu_v<version_number>.root

/data1/<your_username>/results/MC_closure_v<version_number>.root
/data1/<your_username>/results/dataVsMCQCDFromData_muHadNonIsoAnalysis_19p7fb-1_v<version_number>.root
/data1/<your_username>/results/dataVsMC_RegionAQCDEstimate_v<version_number>.root
/data1/<your_username>/results/dataVsMC_RegionBQCDEstimate_v<version_number>.root
/data1/<your_username>/results/dataVsMC_muHadNonIsoAnalysis_19p7fb-1_v<version_number>.root
/data1/<your_username>/results/dataVsMC_muHadNonIsoDifference_19p7fb-1_v<version_number>.root
/data1/<your_username>/results/final_v<version_number>.root
/data1/<your_username>/results/sigVsBkgQCDFromData_muHadIsoAnalysis_19p7fb-1_v<version_number>.root
/data1/<your_username>/results/sigVsBkgQCDFromData_muHadIsoAnalysis_normalizedTo1_v<version_number>.root
/data1/<your_username>/results/sigVsBkg_muHadIsoAnalysis_19p7fb-1_v<version_number>.root
/data1/<your_username>/results/sigVsBkg_muHadIsoAnalysis_normalizedTo1_v<version_number>.root

Skimming BACKGROUND MC SAMPLES ONLY with CRAB

1. In ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test/multicrab.cfg:

Comment out (with the # sign) the dataset and associated parameters for every dataset you DO NOT want to process, including, of course, the SingleMu and SinglePhotonParked datasets
Increment the version number on the names of all the tasks in square brackets (e.g. v2 --> v3)
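
For example (the task names and dataset paths below are illustrative; edit the entries actually present in your multicrab.cfg):

# data task commented out -- not processed in a background-MC-only skim
#[SingleMuRun2012ASkim_v3]
#CMSSW.datasetpath = /SingleMu/Run2012A-22Jan2013-v1/AOD

# MC task kept, with the version in the task name bumped v2 --> v3
[WJetsToLNuSkim_v3]
CMSSW.datasetpath = /WJetsToLNu_TuneZ2Star_8TeV-madgraph-tarball/Summer12_DR53X-PU_S10_START53_V7A-v1/AODSIM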

2. Submit your jobs with multicrab from ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test as you normally would

3. When your jobs complete, calculate the total number of events processed and the number of events passing each filter of the skim by executing, for example,

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test
./combineResults.sh 1 200 --exclude=5,7,34 /afs/cern.ch/user/<first_letter_of_your_username>/<your_username>/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test/WJetsToLNuSkim

where

1 is the job number of the first job in the task WJetsToLNuSkim
200 is the job number of the last job in the task WJetsToLNuSkim, or equivalently the number of jobs in the task
--exclude=5,7,34 means exclude jobs 5, 7, and 34 from the sum because they failed
/afs/cern.ch/user/<first_letter_of_your_username>/<your_username>/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test/WJetsToLNuSkim is the full path to the CRAB directory of the task WJetsToLNuSkim

4. Use the information from step 3 to update the weights, if necessary, in ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/formatPlots.C
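
(The per-sample weight is typically cross section x integrated luminosity / number of events processed, so if the total event count from step 3 changed, the weight scales inversely with it.)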

5. Update the locations of the new MC skim files in

~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateQCDTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateQCDBTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateQCDBMuTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateDYJetsToLLTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateSingleTopTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateTTJetsTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateWNJetsToLNuTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateWbbTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateWJetsToLNuTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateWWTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateWZTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateZZTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generatenonIsoWDYJetsToLLTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generatenonIsoWTTJetsTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generatenonIsoWWNJetsToLNuTauAnalyzerCfgs.sh

Skimming DATA SAMPLES ONLY with CRAB

1. In ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test/multicrab.cfg:

Comment out (with the # sign) all MC datasets and associated parameters.  Note that the data samples end in "AOD" and the MC samples end in "AODSIM".
Increment the version number on the names of all the tasks in square brackets (e.g. v2 --> v3)

2. In ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test/crab.cfg:

Comment out lines 19-20 (splitting by events)
Uncomment lines 21-22 (splitting by lumi sections)
Uncomment line 25 (JSON file)
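
After the edits, the splitting block in the [CMSSW] section of crab.cfg should look schematically like this (parameter values and the JSON file name are illustrative; keep the ones already in the file):

# lines 19-20: event-based splitting, commented out for data
#total_number_of_events = -1
#events_per_job = 50000
# lines 21-22: lumi-based splitting, uncommented for data
total_number_of_lumis = -1
lumis_per_job = 200
# line 25: JSON file of certified lumi sections, uncommented for data
lumi_mask = Cert_190456-208686_8TeV_22Jan2013ReReco_Collisions12_JSON.txt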

3. Submit your jobs with multicrab from ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test as you normally would

4. When your jobs complete, calculate the total number of events processed and the number of events passing each filter of the skim by executing, for example,

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test
./combineResults.sh 1 200 --exclude=5,7,34 /afs/cern.ch/user/<first_letter_of_your_username>/<your_username>/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test/WJetsToLNuSkim

where

1 is the job number of the first job in the task WJetsToLNuSkim
200 is the job number of the last job in the task WJetsToLNuSkim, or equivalently the number of jobs in the task
--exclude=5,7,34 means exclude jobs 5, 7, and 34 from the sum because they failed
/afs/cern.ch/user/<first_letter_of_your_username>/<your_username>/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test/WJetsToLNuSkim is the full path to the CRAB directory of the task WJetsToLNuSkim

5. Run "multicrab -report" to get JSON files of the successfully processed lumi sections
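
Concretely:

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test
multicrab -report

This should leave a lumiSummary.json in each task's res/ directory, which step 6 reads.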

6. Execute

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test
cmsenv
pixelLumiCalc.py overview -i SingleMu_Run2012A-22Jan2013-v1_AOD_skim_v2/res/lumiSummary.json
pixelLumiCalc.py overview -i SingleMu_Run2012B-22Jan2013-v1_AOD_skim_v2/res/lumiSummary.json
pixelLumiCalc.py overview -i SingleMu_Run2012C-22Jan2013-v1_AOD_skim_v2/res/lumiSummary.json
pixelLumiCalc.py overview -i SingleMu_Run2012D-22Jan2013-v1_AOD_skim_v2/res/lumiSummary.json

to get the processed recorded luminosity per dataset, changing "_skim_v2" as necessary to match what you originally submitted.

7. Calculate the total processed recorded luminosity by adding the recorded luminosities for the 4 datasets in step 6
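
For example, with illustrative recorded luminosities of 0.9, 4.4, 7.1, and 7.3 fb^-1 for Run2012A-D, the total is 0.9 + 4.4 + 7.1 + 7.3 = 19.7 fb^-1, matching the "19p7fb-1" label in the output file names.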

8. Use the information from step 7 to update the weights in ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/formatPlots.C

9. Update the locations of the new data skim files in

~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateDataTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateNonIsoWDataTauAnalyzerCfgs.sh
~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauAnalyzer/test/generateSinglePhotonDataTauAnalyzerCfgs.sh

Skimming SIGNAL MC SAMPLE ONLY

1. Login to ucdavis.cern.ch

2. Execute

cd ~/CMSSW_5_3_11/src/BoostedTauAnalysis/TauSkimmer/test
cmsenv
./generateWh1SkimCfgs.sh <version> tauSelectionSkim_Wh1.py
./generateggSkimCfgs.sh <version> tauSelectionSkim_gg.py
./generateZHSkimCfgs.sh <version> tauSelectionSkim_ZH.py
./generateVBFSkimCfgs.sh <version> tauSelectionSkim_VBF.py

where <version> is the desired skim version.  Then execute

cd <version>
./runWh1SkimCfgs.sh
./runggSkimCfgs.sh
./runZHSkimCfgs.sh
./runVBFSkimCfgs.sh
