[torqueusers] multiple commands in one qsub script

Matt Britt msbritt at umich.edu
Thu Jun 27 11:52:24 MDT 2013


Yes, and if you only asked for one processor, doing that just used four
processors.  If your system allocates by processor rather than by whole node,
this would impact both you and other users.  If you did ask for 4
processors, it ties up those resources unnecessarily unless all 4 jobs end
at roughly the same time.
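For reference, the background-and-wait pattern being discussed can be sketched in plain shell. The task names below are placeholders, not real programs; in an actual PBS script each would be a compute command, and the job should request one core per backgrounded task (e.g. #PBS -l nodes=1:ppn=4 for four tasks):

```shell
#!/bin/sh
# Sketch of the background-and-wait pattern. "task" is a placeholder
# workload; a real job script would run actual programs here.
task() {
    sleep 1                      # stand-in for real work
    echo "task $1 done"
}

start=$(date +%s)
task 1 &                         # & backgrounds each task so they
task 2 &                         # run concurrently rather than
task 3 &                         # one after another
task 4 &
wait                             # block until every backgrounded task exits
end=$(date +%s)
echo "elapsed: $((end - start))s"
```

Because all four tasks run concurrently, the elapsed time is close to the longest single task (about 1 second here), not the sum of all four.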

Thanks,
   - Matt

--------------------------------------------
Matthew Britt
CAEN HPC Group - College of Engineering
msbritt at umich.edu



On Thu, Jun 27, 2013 at 1:50 PM, Jack Wilkinson
<jwilkinson at stoneeagle.com>wrote:

> In the example you've given, doesn't that put all four jobs on the same
> box, though, rather than spreading them across four different nodes?  I'm
> just curious.
>
> Thanks!
> jack
>
> -----Original Message-----
> From: torqueusers-bounces at supercluster.org [mailto:
> torqueusers-bounces at supercluster.org] On Behalf Of Chandler Wilkerson
> Sent: Thursday, June 27, 2013 12:12 PM
> To: torqueusers at supercluster.org
> Subject: Re: [torqueusers] multiple commands in one qsub script
>
> If you would like the commands in a qsub script to run in parallel, you can
> employ a useful trick we've been recommending to some of our users.
>
> Basically, you allocate multiple cores for your job (on a single node),
> run one process per core in the background (with the & symbol), and put a
> wait statement at the end of your script.
>
> e.g.
> #PBS -l nodes=1:ppn=4
> prog1 &
> prog2 &
> prog3 &
> prog4 &
>
> wait
>
> This will run the four programs in parallel and wait for all of them to
> finish before exiting the job.
>
> On 6/27/13 8:23 AM, Mahmood Naderan wrote:
> > Hi
> > Assume I put more than one command in a typical qsub script. Are these
> > commands executed sequentially or in parallel? For example, the script
> > looks like
> >
> > #PBS -N temporal
> > #PBS -V
> > #PBS -q tigerq
> > #PBS -l nodes=1
> > cd $PBS_O_WORKDIR
> > ls
> > env
> >
> > I want to know whether 'ls' and 'env' execute in parallel or sequentially.
> > Thanks.
> > Regards,
> > Mahmood
> >
> >
> > _______________________________________________
> > torqueusers mailing list
> > torqueusers at supercluster.org
> > http://www.supercluster.org/mailman/listinfo/torqueusers
> >
>

