This ATLAS log page collects the tools and procedures I use every day in ATLAS. They are not analysis specific but general tasks: how to run a derivation, how to set up a release, how to download datasets from the grid, etc.



Tips and Tricks

  • A useful way to search for things in ATLAS is https://search.cern.ch
  • Print the current file, function, and line number (debug macro):
    #define DEBUG std::cerr << __FILE__ << "::" << __FUNCTION__ << "::" << __LINE__ << std::endl;
    

ATLAS Help mailing lists

  • For Physics Analysis help: hn-atlas-PATHelp@cern.ch
  • For Offline software: hn-atlas-offlineSWHelp@cern.ch
  • For Grid help: hn-atlas-dist-analysis-help@cern.ch

Every login

setupATLAS
cd build
asetup --restore
source */setup.sh

Setup athena in rel21

setupATLAS
mkdir source build run
cd build/
asetup 21.2.3,AnalysisBase
emacs ../source/CMakeLists.txt # fill this file! (see the sketch below)
cmake ../source/
make
source */setup.sh
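
For reference, a sketch of what the top-level ../source/CMakeLists.txt can look like, written here as a shell heredoc. This follows the standard AnalysisBase standalone-project template as I remember it; the project name "MyAnalysis" is a placeholder and the exact macro calls changed a bit between 21.2 point releases, so cross-check the ATLAS software tutorial for the release you set up.

cat > ../source/CMakeLists.txt <<'EOF'
# Minimal project file for building user packages against AnalysisBase (sketch).
cmake_minimum_required( VERSION 3.2 )
project( MyAnalysis VERSION 1.0.0 LANGUAGES CXX )
find_package( AnalysisBase 21.2 REQUIRED )
atlas_ctest_setup()
atlas_project( USE AnalysisBase ${AnalysisBase_VERSION} )
lcg_generate_env( SH_FILE ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/env_setup.sh )
install( FILES ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/env_setup.sh DESTINATION . )
EOF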

Compile after modifying files (no new files)

cd ../build/
make

If you add files

cd ../build/
cmake ../source/
make

Update analysis release

rm -rf build
mkdir build
cd build
asetup AnalysisBase,9.9.99 # replace 9.9.99 with the new release number
cmake ../source/
make

Compile with individual packages

cd ../source
git clone [...] athena # copy the repository address from gitlab
emacs package_filters.txt # list the packages you want to keep (see the sketch below)
cd ../build
cmake ../source
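
A sketch of the package_filters.txt referenced above, plus the cmake flag normally used to pick it up. The SUSYTools path is only an example; the + / - lines are regular expressions matched against package paths, and ATLAS_PACKAGE_FILTER_FILE is the cache variable I believe the ATLAS CMake setup reads, so check the tutorial for your release.

cat > ../source/package_filters.txt <<'EOF'
# Keep only the packages you want to build; filter out everything else.
+ PhysicsAnalysis/SUSYPhys/SUSYTools
- .*
EOF
cmake -DATLAS_PACKAGE_FILTER_FILE=../source/package_filters.txt ../source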

Create my package

cd ../source/
MyPackage="myVBF"
mkdir ${MyPackage}
mkdir ${MyPackage}/${MyPackage}
mkdir ${MyPackage}/Root
mkdir ${MyPackage}/src
mkdir ${MyPackage}/src/components
mkdir ${MyPackage}/share
# add "atlas_subdir (${MyPackage})" to ${MyPackage}/CMakeLists.txt
cd ../build/
cmake ../source/
make
source `eval echo \\${${AtlasProject}_PLATFORM}`/setup.sh
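
A sketch of the package-level CMakeLists.txt beyond the bare atlas_subdir line, again as a heredoc to be run from the source directory (i.e. where the package was created, before running cmake). The library name and the AsgTools / xAODEventInfo dependencies are placeholders for whatever your code actually uses.

cat > ${MyPackage}/CMakeLists.txt <<EOF
# Package build rules (sketch); atlas_subdir() is the only mandatory line.
atlas_subdir( ${MyPackage} )

atlas_add_library( ${MyPackage}Lib
   ${MyPackage}/*.h Root/*.cxx
   PUBLIC_HEADERS ${MyPackage}
   LINK_LIBRARIES AsgTools xAODEventInfo )
EOF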

xAOD tips

Interactively check xAOD file

root -l
gROOT->Macro( "$ROOTCOREDIR/scripts/load_packages.C" );
xAOD::Init();
f = TFile::Open("/path/to/DAOD_EXOT4.root", "READ");
t = xAOD::MakeTransientTree( f );
t->Draw( "ElectronCollection.pt()");

Check number of events in xAOD

Open the file in ROOT and run:

CollectionTree->GetEntries()

Merge xAOD

xAODMerge <output_xAOD.root>  <input_xAOD_1.root input_xAOD_2.root ...>

Running truth derivation

setupATLAS
asetup 20.1.8.3,AtlasDerivation,gcc48,here
Reco_tf.py --inputEVNTFile powheg.root --outputDAODFile tutorial.pool.root --reductionConf TRUTH1

Running truth derivation on the grid

setupATLAS
asetup 20.1.8.3,AtlasDerivation,gcc48,here
lsetup rucio panda ganga
voms-proxy-init -voms atlas
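
The notes above only set up the environment; here is a sketch of the actual submission, modeled on the pathena --trf command used in the "Run derivation on grid" section further down, adapted to the TRUTH1 Reco_tf.py call above (the dataset names are placeholders):

pathena --trf "Reco_tf.py --inputEVNTFile %IN --outputDAODFile %OUT.TRUTH1.pool.root --reductionConf TRUTH1" \
    --inDS <INPUT_EVNT_DATASET> --outDS user.othrif.<OUTPUT_DATASET>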

Rucio useful commands

# List dids
datastring="Sherpa_221*Zmumu*.merge.DAOD_TRUTH1"
rucio list-dids mc15_13TeV:mc15_13TeV*${datastring}*/ --filter type=container | grep mc15 | sort | awk -F"|" '{print $2}'

# Download from list
mylist="Zmm.list"
for i in $(cat ${mylist}); do echo $i; rucio download $i; done

How to access eos from lxplus

https://cern.service-now.com/service-portal/article.do?n=KB0001998

setupATLAS
voms-proxy-init -voms atlas
lsetup xrootd

Depending on whether you want to access the ATLAS EOS area or your CERNBox, set the environment variable accordingly:

for ATLAS:

eos root://eosatlas.cern.ch ls /eos/atlas/user/o/othrif
# or
export EOS_MGM_URL=root://eosatlas.cern.ch
eos ls /eos/atlas/user/o/othrif/

for cernbox:

eos root://eosuser.cern.ch ls /eos/user/o/othrif
# or:
export EOS_MGM_URL=root://eosuser.cern.ch
eos ls /eos/user/o/othrif

in root

TFile * test = TFile::Open("root://eosatlas.cern.ch//eos/atlas/user/o/othrif/hist-input.root")

How to access eos from NAF:

setupATLAS
voms-proxy-init -voms atlas
lsetup xrootd
kinit othrif@CERN.CH
xrdfs eosatlas.cern.ch ls -l /eos/atlas/user/o/othrif/
xrdfs eosuser.cern.ch ls -l /eos/user/o/othrif/
xrdcp <file.txt> root://eosatlas.cern.ch//eos/atlas/user/o/othrif/.
xrdcp root://eosatlas.cern.ch//eos/atlas/user/o/othrif/<file.txt> .
xrdcp root://eosuser.cern.ch//eos/user/o/othrif/<file.txt> .
# in root:
TFile * test = TFile::Open("root://eosatlas.cern.ch//eos/atlas/user/o/othrif/hist-input.root")

Check xAOD

TO DO: need instructions for rel21.

Setup RootCore:

mkdir checkdir && cd $_
setupATLAS
lsetup 'rcsetup Base,2.4.28'
rc find_packages
rc compile

Interactive ROOT

root -l
gROOT->Macro( "$ROOTCOREDIR/scripts/load_packages.C" );
xAOD::Init();
f = TFile::Open( "xAOD.pool.root" )
t = xAOD::MakeTransientTree( f )
t->Draw( "Electrons.pt() - Electrons.trackParticle().pt()" );

You can also use a TBrowser after creating the transient tree; it shows up under "root/Root Memory/CollectionTree".

checkxAOD.py

To list the containers in a file, with their types and key names (e.g. type "xAOD::CaloCluster" with key name "egammaClusters"):

setupATLAS
asetup 21.2.13,AnalysisBase
checkxAOD.py xAOD.pool.root

To see the variables associated with this container:

root -l xAOD.pool.root
root [1] CollectionTree->Print("egammaClusters*")

The variable name comes after the dot, like "rawEta" in "egammaClusters.rawEta". You can, for instance, print or plot this variable.

Metadata

Explore metadata with

asetup AthAnalysis,21.2,latest
lsetup PyAMI
getMetadata.py --inDS=mc15_13TeV.3610*EVNT*
getMetadata.py --inDS=mc15_13TeV.361063.*EVNT* --explainFields kFactor

Also, you can put the sample dataset names in a text file and get an output file with:

getMetadata.py --inDsTxt=my.datasets.txt --outFile=my.metadata.txt

The output file can be parsed directly in your analysis code.
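
For a quick look at the file from the shell, a rough sketch (the column layout depends on the fields you requested, so adjust the column numbers; the grep just skips possible comment lines):

grep -v '^#' my.metadata.txt | awk 'NF {print $1, $NF}'   # first and last column of each non-empty line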

You can also see what metadata is available:

checkMetaSG.py xAOD.pool.root

Grid stuff

Check the status of tasks older than the default 7 days: https://bigpanda.cern.ch/tasks/?username=Othmane%20Rifki&days=50

If tasks have disappeared because too much time has passed, try adding &days=500 to the URL.

Grid setup

setupATLAS
lsetup rucio panda ganga
voms-proxy-init -voms atlas

Rucio

A DID has the form scope:name, e.g. data15_13TeV:DAOD_SUSY2.06797864._000001.pool.root.1

rucio list-dids "user.${USER}:*"
rucio list-scopes | grep $USER
rucio list-dids  <DID>
rucio list-files <DID>
rucio list-content <DID>
rucio download   <DID>
rucio download  --nrandom 1 <DID>   # download 1 file chosen at random
rucio get-metadata <DID>            # fields can be used with --filter options
rucio list-dataset-replicas <DID>
rucio list-file-replicas <DID>
rucio list-account-usage $USER      # shows your quotas

Examples: rucio list-dids "data15_13TeV:*276329*" --filter type=dataset,datatype=DAOD_SUSY2

Duplicate to another site

for i in $(cat list); do rucio add-rule  $i 1 DESY-HH_LOCALGROUPDISK;done

Kill PIDs in pbook:

# Run inside pbook: read the PIDs from /tmp/sub (one per line) and print the
# kill(...) commands, to be checked and then pasted back into pbook.
content = []
with open("/tmp/sub") as f:
    content = f.readlines()
content = [x.strip() for x in content]
for x in content:
    print "kill(",x,")"

pbook retry help

Open pbook:

help(retry)

When you get a "brokerage failed" error, retry the task:

setupATLAS
lsetup rucio panda ganga
pbook
retry(PID)

To upload files to the grid

rucio upload --rse DESY-HH_SCRATCHDISK file1 file2 file3  --scope user.<YOUR_USERNAME>

To check the available RSEs and your quotas

rucio list-rses
rucio list-account-limits othrif
rucio list-account-usage othrif

DESY: DESY-HH_LOCALGROUPDISK, DESY-HH_SCRATCHDISK

Working with rules

 rucio list-rules --account othrif

Delete rule with

 rucio delete-rule <datasetname>

Or redirect the output of list-rules to rules.txt and run in python:

#!/usr/bin/python
with open('rules.txt', "rb") as rules:
    for rule in rules:
        args = rule.split()
        print 'rucio delete-rule', args[2], '--rse_expression', args[4]

Then check the generated commands before copy-pasting them into the terminal.
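
An equivalent shell one-liner, assuming the same column layout as in the python snippet above (DID in the third column, RSE expression in the fifth); as before, inspect the printed commands before running them:

awk '{print "rucio delete-rule", $3, "--rse_expression", $5}' rules.txt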

Problems with the grid

  • Check the grid output: if your jobs fail because the output directory is too large, look at pilotlog.txt in the job logs.

Pandamonium

  • Check out: https://github.com/dguest/pandamonium
  • To get a list of jobs: pandamon user.othrif
  • To retry finished: pandamon | awk '$1 ~ /finished/ {print $2}' | panda-resub-taskid
  • To retry failed: pandamon | awk '$1 ~ /failed/ {print $2}' | panda-resub-taskid

Tricks from Kirill

  • In pbook killAndRetry(14359589, newSite=True, newOpts={'--forceStaged':''},)

Setup Athena release

mkdir MyAnalysisProject; cd MyAnalysisProject
mkdir source build
cd build
acmSetup --sourcedir=../source AthAnalysis,21.2.6

CP Tools

Jet calibration

  • Twiki: https://twiki.cern.ch/twiki/bin/view/AtlasProtected/ApplyJetCalibrationR21
  • The configuration files are stored in the calibration area: http://atlas.web.cern.ch/Atlas/GROUPS/DATABASE/GroupData/JetCalibTools/
  • They are also available on CVMFS: /cvmfs/atlas.cern.ch/repo/sw/database/GroupData/JetCalibTools/
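
To see which configuration files are actually available, you can browse the CVMFS copy directly (a quick sketch using the path quoted above; the *.config file pattern is just a guess at what to look for):

ls /cvmfs/atlas.cern.ch/repo/sw/database/GroupData/JetCalibTools/
find /cvmfs/atlas.cern.ch/repo/sw/database/GroupData/JetCalibTools/ -name '*.config' | head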


Git with ATLAS

Every time you want to update a package (e.g. SUSYTools):

git fetch upstream
git checkout -b my21.2-2018-03-21 upstream/21.2

Clone Athena with full checkout

git clone https://:@gitlab.cern.ch:8443/othrif/athena.git
cd athena/
git remote add upstream https://:@gitlab.cern.ch:8443/atlas/athena.git
git remote -v show
git fetch upstream
git checkout -b <THE BRANCH I WANT> upstream/master --no-track
git push --set-upstream origin <THE BRANCH I WANT>

Derivation framework

Existing derivation

setupATLAS
asetup 21.2,AthDerivation,latest
Reco_tf.py --inputAODFile $disk/samples/AOD/input.AOD.root --outputDAODFile output.pool.root --reductionConf EXOT5 --maxEvents 10

Passthrough on:

Reco_tf.py --passThrough True --inputAODFile $disk/samples/AOD/input.AOD.root --outputDAODFile output.DAOD.root --reductionConf EXOT5

Modify derivation

#kinit othrif@CERN.CH
setupATLAS
lsetup asetup
lsetup git
mkdir sandbox && cd $_
export here=$PWD
mkdir build run
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git
cd athena
git atlas addpkg DerivationFrameworkExotics
git branch -a
#git checkout -b 21.2-updateRel21EXOT5-2018-01-18 upstream/21.2 --no-track
git branch -a
# MODIFY athena/PhysicsAnalysis/DerivationFramework/DerivationFrameworkExotics/.
git diff PhysicsAnalysis/DerivationFramework/DerivationFrameworkExotics/
# git push -u origin 21.2-updateRel21EXOT5-2018-01-26 upstream/21.2
cd $here/build/
asetup 21.2,AthDerivation,latest
cmake ../athena/Projects/WorkDir
make
source */setup.sh
cd ../run/
Reco_tf.py --inputAODFile $disk/samples/AOD/input.AOD.root --outputDAODFile output.DAOD.root --reductionConf EXOT5 --maxEvents 10

Run derivation on grid

setupATLAS
lsetup rucio panda ganga
voms-proxy-init -voms atlas
pathena --trf "Reco_tf.py --passThrough False --inputAODFile %IN --outputDAODFile %OUT.EXOT5.pool.root --reductionConf EXOT5" --nFiles=1 --inDS <INPUT_DATASET> --outDS user.othrif.<OUTPUT_DATASET>
pathena --trf "Reco_tf.py --passThrough False --inputAODFile %IN --outputDAODFile %OUT.EXOT5.pool.root --reductionConf EXOT5" --nFilesPerJob=5 --inDS <INPUT_DATASET> --outDS user.othrif.<OUTPUT_DATASET>

Generate PRW config file

setupATLAS
acmSetup AthAnalysis,21.2,latest
lsetup pyAMI panda
voms-proxy-init -voms atlas
generatePRW.py --inDsTxt=mySamples.txt --outDS=user.othrif.myPRW --skipNTUP_PILEUP
checkPRW.py --inDsTxt=mySamples.txt path/to/prwfiles/*.root

Running the lowest unprescaled trigger tool

setupATLAS
asetup AthenaP1,21.1.20
python -c "
from TriggerMenu.api.TriggerAPI import TriggerAPI
from TriggerMenu.api.TriggerEnums import TriggerPeriod, TriggerType
print TriggerAPI.getLowestUnprescaled(337833, TriggerType.mu_single)"

ATLAS framework tips

Debug line number

#define DEBUG std::cerr << __FILE__ << "::" << __FUNCTION__ << "::" << __LINE__ << std::endl

cout in Athena

ANA_MSG_INFO( "done with histInitialize Nominal" );

Check xAOD

xAODChecker /path/to/DAOD.file
checkFile.py /path/to/DAOD.file

Code base path for derivation

  • /cvmfs/atlas.cern.ch/repo/sw/software/21.2/AthDerivation

Latest release

Nightlies

For example, if I want to know the next numbered build for AthDerivation in 21.2, I check https://gitlab.cern.ch/atlas/athena/blob/21.2/Projects/AthDerivation/version.txt which shows 21.2.17.0. I can confirm it by looking at https://twiki.cern.ch/twiki/bin/view/AtlasProtected/DerivationProductionTeam#Info_on_AtlasDerivation_caches_a which, at the time of writing, shows 21.2.16.0 as the last entry in the table. So the next release candidate number is 21.2.17.0.

You can also browse a specific release directly, e.g. https://gitlab.cern.ch/atlas/athena/blob/release/21.2.18.0/PhysicsAnalysis/DerivationFramework/ for release 21.2.18.0.
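
The same version file can be fetched from the command line (a sketch; it assumes the branch is readable to you, otherwise log in to CERN GitLab and use the web view as above):

curl -sL https://gitlab.cern.ch/atlas/athena/raw/21.2/Projects/AthDerivation/version.txt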

Compile ATLAS notes with LaTeX

export PATH=/afs/cern.ch/sw/XML/texlive/previous/bin/x86_64-linux:$PATH

HistFitter

Setting up SS3L HistFitter

On LXPLUS:

kinit othrif@CERN.CH
mkdir -p /afs/cern.ch/work/o/othrif/SS_HistFitter && cd $_
git clone https://:@gitlab.cern.ch:8443/disimone/SSHistHitter.git
mkdir run2 && cd $_
mkdir -p HistFitterUser
wget https://bootstrap.pypa.io/ez_setup.py -O - | python - --user
wget https://pypi.python.org/packages/ef/33/9b65092aaa03a43e90326dcf4d921516fe9e649a47eb928f3e19f961b050/odict-1.6.2.tar.gz
tar -xvzf odict-1.6.2.tar.gz
cd odict-1.6.2/
python setup.py install --user
git clone https://:@gitlab.cern.ch:8443/HistFitter/HistFitter.git
cd HistFitter
git checkout tags/v0.54.0
source setup_afs.sh
cd src/
make
cd ../..
ln -s ../SSHistHitter/macroSS macroSS
ln -s ../SSHistHitter/python python
ln -s ../SSHistHitter/prepare prepare
ln -s ../SSHistHitter/susyGridFiles susyGridFiles
ln -s /afs/cern.ch/work/o/othrif/InputTrees InputTrees
export PYTHONPATH=/afs/cern.ch/work/o/othrif/SS_HistFitter/run2/python:$PYTHONPATH
export HFRUNDIR=/afs/cern.ch/work/o/othrif/SS_HistFitter/run2/InputTrees
# emacs python/pathUtilities.py  # Modify the paths
cd /afs/cern.ch/work/o/othrif/SS_HistFitter/SSHistHitter/HFpatch
source cpFiles.sh
cd ${HISTFITTER}/src
make
cd /afs/cern.ch/work/o/othrif/SS_HistFitter/run2/prepare/xsections/mc15_13TeV
ln -s ../xsections_Ximo.txt Backgrounds.txt

On ouhep01:

kinit othrif@CERN.CH
setupATLAS
lsetup "root 6.04.02-x86_64-slc6-gcc48-opt"
git clone https://:@gitlab.cern.ch:8443/disimone/SSHistHitter.git
mkdir run2 && cd $_
git clone https://:@gitlab.cern.ch:8443/HistFitter/HistFitter.git
cd HistFitter
git checkout tags/v0.54.0
source setup.sh
cd src
make
cd ../..
mkdir
ln -s ../SSHistHitter/macroSS macroSS
ln -s ../SSHistHitter/python python
ln -s ../SSHistHitter/prepare prepare
ln -s ../SSHistHitter/susyGridFiles susyGridFiles
export PYTHONPATH=/UserDisk2/othrif/ss3l_HistFitter/run2/python:$PYTHONPATH
export HFRUNDIR=/UserDisk2/othrif/ss3l_HistFitter/fromPeter
# emacs python/pathUtilities.py  # Modify the paths

Every time you want to launch the fit:

cd HistFitter
source setup_afs.sh
export PYTHONPATH=/afs/cern.ch/work/o/othrif/SS_HistFitter/run2/python:$PYTHONPATH
export HFRUNDIR=/afs/cern.ch/work/o/othrif/SS_HistFitter/run2/InputTrees

AMI

To look at information about dataset:

ami show dataset info mc15_13TeV.308276.PowhegPy8EG_NNPDF30_AZNLOCTEQ6L1_VBFH125_ZZ4nu_MET125.evgen.EVNT.e6126

Or:

ami list datasets mc15_13TeV.308276.PowhegPy8EG_NNPDF30_AZNLOCTEQ6L1_VBFH125_ZZ4nu_MET125.evgen.EVNT.e6126 --fields events,cross_section,generator_filter_efficiency

Or, to generate the ami commands for a whole list of datasets while skipping comment (#) and blank lines:

while read -r line; do [[ "$line" =~ ^#.*$ ]] && continue;  if [ ! -z "${line}" ]; then echo "ami list datasets  ${line} --fields events,cross_section,generator_filter_efficiency | grep mc15" ; fi; done < evnt_samples.list

Quick tests in Rel21

Setup rel21

mkdir MyProject
cd MyProject
mkdir source build
cd build
acmSetup AthAnalysis,21.2.18

GIT merge requests

Go to this URL: https://gitlab.cern.ch/othrif/athena/merge_requests/new
