[torqueusers] searching for the directory of a running job

Mahmood Naderan nt_mahmood at yahoo.com
Thu Jan 31 00:39:07 MST 2013


Below is the output of "qstat -f". Please note that I am not looking for PBS_O_WORKDIR; that is the directory from which I ran qsub. What I want to find is the temporary directory on the compute node that is running the executable. Assume the following sequence:
1- I compile the program
2- I qsub the program
3- while '2' is still running, I modify the code
4- I recompile the code
5- I qsub the program again

Now two instances of my program are running, yet they are independent. So Torque should have copied the executables somewhere on the compute nodes to provide this independence, and I want to find that location.
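
(A rough sketch of one place to check, under a couple of assumptions: as far as I know pbs_mom spools a copy of the submitted job script on the execution host, under mom_priv/jobs in the TORQUE spool directory, and /var/spool/torque below is only the common default path, not necessarily yours:

ssh <exec_host> 'ls /var/spool/torque/mom_priv/jobs/'
# the spooled copy of the job script appears as <jobid>.SC

As far as I understand, only the job script is spooled there, not the binaries it starts; a process that is already running keeps using the binary image it was started from even if the file on disk is replaced by a recompile.)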


 
root at orca:/home/mahmood# qstat -f 92322
Job Id: 92322.hpclab.orca
    Job_Name = sample_machine-SET2-4core-MILC-LESLIED3D-GEMSFDTD-LBM.simics
    Job_Owner = ali at hpclab.orca
    resources_used.cput = 12:54:55
    resources_used.mem = 11203852kb
    resources_used.vmem = 11568104kb
    resources_used.walltime = 09:35:49
    job_state = R
    queue = orcaq
    server = hpclab.orca
    Checkpoint = u
    ctime = Thu Jan 31 01:25:32 2013
    Error_Path = hpclab.orca:/home/ali/lu/work_update/sample_machine-S
        ET2-4core-MILC-LESLIED3D-GEMSFDTD-LBM.simics.e92322
    exec_host = ws05/6
    exec_port = 15003
    Hold_Types = n
    Join_Path = oe
    Keep_Files = n
    Mail_Points = a
    mtime = Thu Jan 31 01:25:33 2013
    Output_Path = orca:/home/ali/lu/work_update/tor-reports/SET2-4core
        -MILC-LESLIED3D-GEMSFDTD-LBM.simics.tor.rep
    Priority = 0
    qtime = Thu Jan 31 01:25:32 2013
    Rerunable = True
    Resource_List.neednodes = ws05
    Resource_List.nodect = 1
    Resource_List.nodes = ws05
    Resource_List.walltime = 960:00:00
    session_id = 24161
    substate = 42
    Variable_List = PBS_O_QUEUE=orcaq,PBS_O_HOME=/home/ali,
        PBS_O_LANG=en_US.UTF-8,PBS_O_LOGNAME=ali,
        PBS_O_PATH=/home/ali/lu/msim/bin:/home/ali/lu/workdir
        :/home/ali/lu/msim/bin:/home/ali/lu/workdir:/opt/mpich
        2/bin:/opt/mpiexec/bin:/opt/mpich2/bin:/opt/mpiexec/bin:/usr/local/mau
        i/sbin:/usr/local/maui/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/u
        sr/bin:/sbin:/bin:/usr/bin/X11:/usr/games,
        PBS_O_MAIL=/var/mail/ali,PBS_O_SHELL=/bin/bash,
        PBS_O_HOST=hpclab.orca,PBS_SERVER=hpclab.orca,
        PBS_O_WORKDIR=/home/ali/lu/work_update,SHELL=/bin/bash,
        TERM=xterm,
        XDG_SESSION_COOKIE=de692068037f222a35a95f874dc6aed9-1359565750.805643
        -944328263,SSH_CLIENT=213.233.182.203 9134 22,SSH_TTY=/dev/pts/5,
        USER=ali,
        LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=
        40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:o
        w=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01
        ;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=0
        1;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31
        :*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;3
        1:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.ace=01
        ;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=
        01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tg
        a=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*
        .svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;3
        5:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=
        01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wm
        v=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.
        fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.
        yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:
        *.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.mid=00;36:*.midi=00;
        36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00
        ;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:,
        PATH=/home/ali/lu/msim/bin:/home/ali/lu/workdir:/home
        /ali/lu/msim/bin:/home/ali/lu/workdir:/opt/mpich2/bin:
        /opt/mpiexec/bin:/opt/mpich2/bin:/opt/mpiexec/bin:/usr/local/maui/sbin
        :/usr/local/maui/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin
        :/sbin:/bin:/usr/bin/X11:/usr/games,MAIL=/var/mail/ali,
        SIMICS_INSTALL=/home/ali/gems/simics-3.0.30,
        PWD=/home/ali/lu/work_update,LANG=en_US.UTF-8,
        GEMS=/home/ali/gems,HOME=/home/ali,SHLVL=2,
        BASH_ENV=~/.profile,LOGNAME=ali,PYTHONPATH=./modules,
        SSH_CONNECTION=213.233.182.203 9134 194.225.69.105 22,
        LESSOPEN=| /usr/bin/lesspipe %s,SIMICS_EXTRA_LIB=./modules,
        LESSCLOSE=/usr/bin/lesspipe %s %s,_=/usr/local/bin/qsub
    euser = ali
    egroup = users
    hashname = 92322.hpclab.orca
    queue_rank = 27317
    queue_type = E
    etime = Thu Jan 31 01:25:32 2013
    submit_args = tor-files/SET2-4core-MILC-LESLIED3D-GEMSFDTD-LBM.simics.tor
    start_time = Thu Jan 31 01:25:33 2013
    Walltime.Remaining = 3421384
    start_count = 1
    fault_tolerant = False
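
(Using exec_host and session_id from the output above, one way to see where the job is actually executing is to inspect the process on the node itself. A minimal sketch, assuming shell access to ws05 and permission to read the job owner's /proc entries:

ssh ws05 'ps -o pid,cmd -s 24161'          # processes in the job's session
ssh ws05 'readlink /proc/24161/cwd'        # current directory of the session leader (the job shell)
ssh ws05 'readlink /proc/<child pid>/exe'  # binary image a child process is actually running

The child PIDs come from the ps line; their cwd and exe links should show where each instance of the executable lives.)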




Regards,
Mahmood



________________________________
 From: Diego Bacchin <diego.bacchin at bmr-genomics.it>
To: Torque Users Mailing List <torqueusers at supercluster.org> 
Cc: Mahmood Naderan <nt_mahmood at yahoo.com>; Torque Users Mailing List <torqueusers at supercluster.org> 
Sent: Thursday, January 31, 2013 12:48 AM
Subject: Re: [torqueusers] searching for the directory of a running job
 

Post us the output of 
qstat -f jobid

--
Diego Bacchin

On 30 Jan 2013, at 15:51, Chrysovalantis Paschoulas <cpaschoulas at gmail.com> wrote:


>try this:
>
>$ qstat -f <jobid> | grep -i workdir
>
>or just run
>
>$ qstat -f <jobid>
>
>and search for the work dir..
>
>best regards,
>chrys
>
>On Wed, Jan 30, 2013 at 3:16 PM, Mahmood Naderan <nt_mahmood at yahoo.com> wrote:
>
>There is no such variable
>>
>>mahmood at orca:~$ qstat 92285
>>Job id                    Name             User            Time Use S Queue
>>------------------------- ---------------- --------------- -------- - -----
>>92285.hpclab               ...3-AG-l2pf mahmood         01:43:31 R orcaq
>>
>>mahmood at orca:~$ qstat -f 92285.hpclab | grep init_work_dir
>>qstat: illegally formed job identifier: 92285.hpclab
>>
>>mahmood at orca:~$ qstat -f 92285 | grep init_work_dir
>>mahmood at orca:~$
>>
>>
>>
>>
>> 
>>Regards,
>>Mahmood
>>
>>
>>
>>
>>________________________________
>> From: diego bacchin <diego.bacchin at bmr-genomics.it>
>>To: torqueusers at supercluster.org 
>>Sent: Wednesday, January 30, 2013 3:59 PM
>>Subject: Re: [torqueusers] searching for the directory of a running job
>> 
>>
>>Hi,
>>     qstat -f jobid | grep init_work_dir
>>Bye
>>
>>Diego Bacchin
>>IT System Administrator at
>>  BMR Genomics srl - Via Redipuglia, 19 - PADOVA (PD) - Italy
>>  CRIBI - University of Padova - Via U. Bassi, 58 - PADOVA (PD) - Italy
>>diego at bmr-genomics.it - diego.bacchin at cribi.unipd.it
>>366 72 97 232
>>
>>On 30/01/2013 at 12:41, Mahmood Naderan wrote:
>>> Dear all,
>>> How can I see the working directory of the running job on a a 
>>> computing node based on the job ID. In condor, there is a command " 
>>> <id>". Is there something similar in torque?
>>>
>>> Regards,
>>> Mahmood*
>>> *
>>>
>>>