CINT can now automatically generate the dictionary for templated classes, e.g. A<B>, if the header files for A and B are known. This allows interpreting templated containers such as vector<T> by transparently generating a dictionary for them. This feature can be toggled on or off with ".autodict".
source $VO_CMS_SW_DIR/$SCRAM_ARCH/external/apt/0.5.15lorg3.2-CMS3/etc/profile.d/init.csh
apt-get update
apt-get install cms+cmssw+CMSSW_2_1_8
process.load("RecoVertex.BeamSpotProducer.BeamSpot_cfi")
and add process.offlineBeamSpot to the Sequence (see the sketch below).
edmDumpEventContent [file.root] | grep PixelDigi
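For the beam spot setup above, a minimal cfg sketch (the process name and input file name are placeholders, not from the original note):

import FWCore.ParameterSet.Config as cms

process = cms.Process("BEAMSPOT")
# placeholder input file
process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring('file:input.root')
)
# load the standard beam spot producer and run it in a path
process.load("RecoVertex.BeamSpotProducer.BeamSpot_cfi")
process.p = cms.Path(process.offlineBeamSpot)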
# dump the fully expanded configuration to a file
temp = process.dumpPython()
outputfile = open("dumped_cfg.py", 'w')
outputfile.write(temp)
outputfile.close()
and then run: python [your_cfg.py]
[ewv@vaandering ~]$ cat ~/.scramrc/symlinks
lib:/uscms_data/d1/$(USER)/$(SCRAM_PROJECTNAME)/$(SCRAM_PROJECTVERSION)
tmp:/uscms_data/d1/$(USER)/$(SCRAM_PROJECTNAME)/$(SCRAM_PROJECTVERSION)
bin:/uscms_data/d1/$(USER)/$(SCRAM_PROJECTNAME)/$(SCRAM_PROJECTVERSION)
and then use the -s option of scram.
PhysicsTools/PatAlgos/test/*py
, in particular patLayer1_fromAOD_full.cfg.py
, which produces a skim containing the PAT objects. Then you can run on the PAT tuples with a cfg like the one in PhysicsTools/StarterKit/test:
import FWCore.ParameterSet.Config as cms

process = cms.Process("StarterKit")
process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring('file:./PATLayer1_Output.fromAOD_full.root')
)
process.maxEvents = cms.untracked.PSet(
    input = cms.untracked.int32(200)
)
process.load("PhysicsTools.StarterKit.PatAnalyzerSkeleton_cfi")
process.TFileService = cms.Service("TFileService",
    fileName = cms.string('PatAnalyzerSkeletonHistosFromSkim.root')
)
process.p = cms.Path(process.patAnalyzerSkeleton)
et voilà.
// Create the root file
theFile = new TFile(theRootFileName.c_str(), "RECREATE");
bool dirStat = TH1::AddDirectoryStatus();
TH1::AddDirectory(kTRUE);
// book histograms here
TH1::AddDirectory(dirStat);
process.filterPath = cms.Path(process.wfilter * process.wanalyzerFilter)
process.ep = cms.EndPath(process.output)
process.output.SelectEvents = cms.untracked.PSet(
    SelectEvents = cms.vstring('filterPath')
)
Configuration.StandardSequences.Reconstruction[Cosmics]_cff
TrackPropagation.SteppingHelixPropagator.SteppingHelixPropagator_cfi
edm::FileInPath listruneventstmp("DPGAnalysis/Skims/data/"+listruneventsinpath_);
process.pickEvents = cms.EDFilter("PickEvents",
    RunEventList = cms.untracked.string("listrunev")
)
where "listrunev" is a file as follows:
# Format of select file: <action> <run range> <event range>
# ##############################################
# <action> is either "+" (i.e. keep run/event) or "-" (i.e. reject)
# <run range> is of type "aaaaaa:bbbbbb". bbbbbb can also be "infty"
# <event range> is of type "cccccc:dddddd". dddddd can also be "infty"
#
# By default events are rejected. This is equivalent to a
# "- 0:infty 0:infty"
# at the beginning of the file
#
# All actions are applied in order, so if an action at line 6
# contradicts an action at line 5, line 6 will be retained and executed.
#
# All lines with an unrecognized structure will be considered comments.
+ 122314:122314 16952773:16952773
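A minimal sketch of a complete skim cfg built around the PickEvents filter above (the file names and process label are placeholders); it writes out only the selected run/event pairs:

import FWCore.ParameterSet.Config as cms

process = cms.Process("PICK")
# placeholder input file
process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring('file:input.root')
)
# the filter reads the run/event list described above
process.pickEvents = cms.EDFilter("PickEvents",
    RunEventList = cms.untracked.string("listrunev")
)
process.selection = cms.Path(process.pickEvents)
# keep only events accepted by the selection path
process.output = cms.OutputModule("PoolOutputModule",
    fileName = cms.untracked.string('picked_events.root'),
    SelectEvents = cms.untracked.PSet(SelectEvents = cms.vstring('selection'))
)
process.ep = cms.EndPath(process.output)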
edmProvDump fileName.root | grep -i globaltag
static int j = 1;
if ((_ev % j) == 0) {
  if ((_ev / j) == 9) j *= 10;
  cout << "Run:Event analyzed " << event.id().run() << ":" << event.id().event()
       << " Num " << _ev << endl;
}
lumiSummary.json file to be used with
lumiCalc.py -i lumiSummary.json delivered
cmsRun -j fjr.xml ...
and then pass the produced fjr.xml to the following script, lumiJobReport.py:
lumiJobReport.py fjr.xml
that will return the integrated luminosity as well as a lumiSummary.json to be used with the previous method.
cvs co RecoLuminosity/LumiDB/
This is far from trivial; here is how far I have got so far.
edmConfigFromDB --runNumber 148822
and look at the output: at the beginning you'll see something like:
process.streams = cms.PSet(
    A = cms.vstring( 'BTau',
        'Commissioning',
        ...
plus other streams. You are interested in stream A.
edmConfigFromDB --runNumber 148822 --format streams.list:A.Jet
You'll get the list of HLT paths associated with that stream/PD.
You must start from RAW (or possibly DIGI, to be checked) and run L1 and HLT simulation. The simple way to have a working cfg is to use
cmsDriver, e.g. like this:
cmsDriver.py test -s DIGI,L1,HLT:GRun,RECO --conditions START38_V14::All --processName test --no_exec
then modify your input, add your analysis, and you are done.
srm-rmdir srm://t2-srm-02.lnl.infn.it:8443/srm/managerv2?SFN=/pnfs/lnl.infn.it/data/cms/store/user/slacapra/EWK_Wanalysis_300_pass2/
BLUE="\[\033[0;34m\]"
RED="\[\033[0;31m\]"
LIGHT_RED="\[\033[1;31m\]"
WHITE="\[\033[1;37m\]"
NO_COLOUR="\[\033[0m\]"
case $TERM in
    xterm*)
        TITLEBAR='\[\033]0;\u@\h:\w\007\]'
        ;;
    *)
        TITLEBAR=""
        ;;
esac
#$BLUE[$RED\$(date +%H%M)$BLUE]\
#PS1="${TITLEBAR}\$LIGHT_RED\h:\w$BLUE>\$NO_COLOUR"
PS1="${TITLEBAR}$LIGHT_RED\h:\w$BLUE>$NO_COLOUR"
PS2='> '
PS4='+ '
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.3 -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf
apt-get install openafs-client openafs-krb5 krb5-user krb5-config
apt-get install module-assistant
module-assistant prepare openafs-modules
module-assistant auto-build openafs-modules
dpkg -i /usr/src/openafs-modules-2.6.27-7-generic_1.4.7.dfsg1-6+2.6.27-7.16_i386.deb
./openafs-client start
klog.afs slacapra -cell cern.ch
Last updated: 11-Jan-13 15:44