LHCb software @ Lapp


Before Starting

Some versions of the software use .opts option files, while others use .py; below is the list of versions installed at LAPP, with the option-file format each one expects.

Gauss

Releases v2* work with .opts => DC06 official production

Releases v3* work with .py => MC08 and test 09

Boole

Release v12r10 works with .opts => DC06 official production

Release v16r2 works with .py => MC08 and test 09

Brunel

Releases v30r* and v31r* work with .opts => DC06 official production

Releases v33r* and v34r* work with .py => MC08 and test 09

DaVinci

Up to release v19r12, only .opts can be used

For later releases you can choose between .opts and .py

Ganga

Releases up to and including 5.0.3 work with .opts; later releases work with .py

The above restriction with Ganga applies only when you submit Grid jobs with the Dirac or Local backend; otherwise both .opts and .py work.
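To make the difference concrete, here is how the same application would typically be launched in each mode (a sketch reusing the launch commands described further down this page; Gauss is taken as the example, and the option-file names are illustrative):

  # DC06-era release, driven by an .opts file:
  $GAUSSROOT/$CMTCONFIG/Gauss.exe $GAUSSROOT/options/<My_Options_File>.opts
  # MC08-era release, driven by a .py file:
  GaudiRun <My_Options_File>.py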

Getting started

Available versions of the software should be checked first on

/grid_sw/lhcb/lib

Check this page to see whether the wanted version is available and works at LAPP [1]
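A quick way to check from the command line, assuming the usual install_project layout under /grid_sw/lhcb/lib (the exact directory names may differ):

  ls /grid_sw/lhcb/lib/lhcb            # installed LHCb projects
  ls /grid_sw/lhcb/lib/lhcb/DAVINCI    # installed versions of one project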

There are two ways of working at LAPP: interactively or via the Grid. To stay consistent with the rest of the LHCb environment (lxplus and the Grid), the LHCb software should be run on SLC4 (Scientific Linux 4). Both 32-bit and 64-bit builds are supported on lappsl4 and on the Grid. To use either of them, work on a lappsl4 machine: ssh UserName@lappsl4.in2p3.fr.

LHCb environment

LHCb uses several different software applications: [2]

Starting with LHCb software at LAPP

A (t)csh script is available in /lapp_data/lhcb to set up the whole LHCb environment:

1) First, edit your $HOME/.login and remove the "." entry from your path, e.g.:

if ($?path) then
   set path=($HOME/bin $HOME /usr/local/sbin /usr/local/bin /sbin /usr/sbin ... $path)
endif

and then

source .login
source /lapp_data/lhcb/LHCbEnv.csh [3]

Sourcing LHCbEnv.csh then:

2) points to the software installed on the LAPP cluster (MYSITEROOT),

3) declares which Scientific Linux flavour and word size you use (CMTCONFIG),

4) sources the ExtCMT script so that the LHCb software installed on the cluster can be used from outside the Grid,

5) finally declares your user release area:

  setenv MYSITEROOT /grid_sw/lhcb/lib
  setenv CMTCONFIG slc4_amd64_gcc34
  source /grid_sw/lhcb/lib/scripts/ExtCMT.csh
  setenv User_release_area ~/cmtuser

Don't worry about the warnings.
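To check that the environment is correctly set, you can echo the variables declared above (expected values assume a 64-bit lappsl4 session):

  echo $MYSITEROOT          # /grid_sw/lhcb/lib
  echo $CMTCONFIG           # slc4_amd64_gcc34 (64-bit) or slc4_ia32_gcc34 (32-bit)
  echo $User_release_area   # your ~/cmtuser directory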

At the same time, /lapp_data/lhcb/LHCbEnv.csh sets up a version of ROOT:

source /lapp_data/lhcb/setup_root_sl4_(32 or 64).csh [4]

By default ROOT v5r14.00f is used, because it is supported by the LHCb software and by TMVA, which will be introduced later.
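A quick sanity check that the right ROOT was picked up (the version banner should match the version quoted above):

  which root    # should point to the LAPP ROOT installation
  root -b -q    # prints the ROOT version banner and exits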

Starting an application

You first need to set up the chosen application, specifying the version you want to use:

  SetupProject <Application_Name> <version>

If you've never done that before

Run

  setenv<Application_Name> <Version>

It will take you to the directory ~/cmtuser/<Application_Name>_<Version>.

Then you have to getpack the wanted application; an alias has been defined for this in LHCbEnv.csh:

  alias getpack /grid_sw/lhcb/lib/scripts/getpack

You also need to create a ~/.ssh/config file containing what is described on this page [5]

First you have to be identified as an LHCb user to be able to run getpack:

kinit -4 <your_afs_user_name_at_CERN>
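You can verify that the ticket was obtained with klist, the standard Kerberos command:

  klist    # lists your current Kerberos tickets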

Then, you just have to write

  getpack <Application_Directory>/<Application_Name> <Version>

The <Application_Directory> is, for example, Phys for DaVinci, Rec for Brunel, Digi for Boole and Sim for Gauss. This information can be found at [6].
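For example, for the DaVinci version used in the exercise below, the sequence would be (an illustration of the templates above):

  setenvDaVinci v19r7          # instance of setenv<Application_Name> <Version>
  getpack Phys/DaVinci v19r7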

If it fails with a "Bad ticket" message or similar, first change your password on lxplus at CERN and try again. If it still doesn't work, contact [User support @ CERN].

It will then create the whole tree:

  ~/cmtuser/<Application_Name>_<version>/<Application_Directory>/...

Go to the directory

cd ~/cmtuser/<Application_Name>_<version>/<Application_Directory>/cmt/

then configure, source setup.csh and compile:

  cmt config
  source setup.csh
  cmt br make

Finally you can launch your jobs with:

$<APPLICATION_NAME>ROOT/$CMTCONFIG/<Application_Name>.exe $<APPLICATION_NAME>ROOT/options/<My_Options_File>

or, if you work with Python (releases supporting Python can be found at [7]):

GaudiRun <JOB_OPTIONS>.py
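For instance, with DaVinci the two variants become (the .py file name is illustrative):

  $DAVINCIROOT/$CMTCONFIG/DaVinci.exe $DAVINCIROOT/options/DaVinci.opts
  GaudiRun $DAVINCIROOT/options/DaVinci.py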

If you've already done that before

The setenv<Application_Name> command will take you to the directory ~/cmtuser/<Application_Name>_<Version>.

Go to the directory ~/cmtuser/<Application_Name>_<version>/<Application_Directory>/<Version>/cmt/ and source the setup.csh:

  source setup.csh

and compile:

cmt br make

Finally you can launch your jobs with:

$<APPLICATION_NAME>ROOT/$CMTCONFIG/<Application_Name>.exe $<APPLICATION_NAME>ROOT/options/<My_Options_File>

or in Python (depending on the version of the software you're using):

GaudiRun $<APPLICATION_NAME>ROOT/options/<My_Options_File>.py

Exercise

Try to install DaVinci v19r7 and DaVinciUser v8r3 (@ ~/cmtuser/DaVinci_v19r7) and reconstruct the Pi0 mass in the channel Bd->J/Psi Pi0. The data are in /lapp_data/lhcb/Tutorial/DaVinci.

As an algorithm you can use the following files (Pi0.cpp and Pi0.h), [8] and [9], to put into DaVinciUser/src.

You need to copy a new requirements file into the cmt directory of DaVinciUser before compiling [10]

and the .opts file that goes with them (DaVinci.opts) into DaVinci/options [11]

You can also copy them from /lapp_data/lhcb/Tutorial/DaVinci
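A rough sketch of the whole exercise in one go (the sub-directory layout, in particular the v8r3 level, is assumed; adjust the paths to the tree that getpack actually creates):

  setenvDaVinci v19r7
  getpack Phys/DaVinci v19r7
  getpack Phys/DaVinciUser v8r3
  cp /lapp_data/lhcb/Tutorial/DaVinci/Pi0.cpp ~/cmtuser/DaVinci_v19r7/Phys/DaVinciUser/v8r3/src/
  cp /lapp_data/lhcb/Tutorial/DaVinci/Pi0.h ~/cmtuser/DaVinci_v19r7/Phys/DaVinciUser/v8r3/src/
  cd ~/cmtuser/DaVinci_v19r7/Phys/DaVinciUser/v8r3/cmt
  cmt config
  source setup.csh
  cmt br make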

Here is the answer [12]

If releases aren't installed on /grid_sw/lhcb

Check in /grid_sw/soft-dev-lhcb/lhcb whether it has been installed there.

If it's installed, just

source /grid_sw/soft-dev-lhcb/lhcb/LHCbEnv.csh

and proceed exactly as for releases installed on /grid_sw.

If it's not installed you can install it with

python /grid_sw/soft-dev-lhcb/lhcb/install_project.py -p <Software> -v <Version>
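For example, to install one of the DaVinci releases mentioned above (values illustrative):

  python /grid_sw/soft-dev-lhcb/lhcb/install_project.py -p DaVinci -v v19r12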