h1. Submit Grid Job

*This page is under construction* (obviously)

*1. Before you can submit Grid jobs you need to prepare your setup.*

First finish the steps described here: https://p55cc-redmine.utu.fi/projects/user-s-page/wiki/Prepare_For_Grid_Usage

*2. Other preparation*

You use the Grid resources via the ARC (Advanced Resource Connector) middleware, developed by the NorduGrid community. There are basically two ways to submit a Grid job using ARC:

# From any computer other than the Pleione or Titan login frontend
# From the Pleione or Titan login frontend (i.e. pleione.utu.fi or titan.utu.fi)

h3. 2.1 Using the Grid from other computers

To use the Grid from computers other than Pleione or Titan you need to install and configure the ARC client software. These tasks are documented here: https://research.csc.fi/fgci-arc-middleware#1.3.2

h3. 2.2 Using the Grid from the Pleione or Titan frontend

The rest of this guide is a short version of this guide: https://research.csc.fi/fgci-using-arc-middleware

So you might want to read the comprehensive guide before running the example.

*3. Example of running a simple Grid job from the Titan login node*

To run your binary as a Grid job you need the following files:

- compiled program (the binary)
- job description file
- batch file
- possibly input file(s)

The C source of an example program *gtest.c*:

<pre>
#include <stdio.h>

/* Print a greeting, then echo standard input back to standard output. */
int main(void)
{
    char *line = NULL;
    size_t size;

    printf("Hello UTU.\n");
    while (getline(&line, &size, stdin) != -1)
        printf("%s", line);
    return(0);
}
</pre>

Compile the source:

<pre>
gcc -Wall -o gtest gtest.c
</pre>

The job description file *gtest.xrsl*:

<pre>
&(executable=gtest.sh)
 (jobname=g_test)
 (runtimeenvironment>="ENV/FGCI")
 (join="yes")
 (stdout=std.out)
 (cpuTime="1 hours")
 (count="1")
 (memory="1000")
 (inputfiles=
   ("gtest" "")
   ("gtest.txt" "")
 )
 (outputfiles=
   ("gtest.tgz" "")
 )
</pre>

The batch file *gtest.sh* (a simpler serial variant is sketched after the input file below):

<pre>
#!/bin/sh

echo "Running gtest"

module load OpenMPI

chmod u+x gtest
mpirun ./gtest < gtest.txt > gtest.out

tar czf gtest.tgz gtest.out

echo "Done"

exit 0
</pre>

The input file for this test run *gtest.txt*:

<pre>
Hello FGCI.
</pre>
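The batch file above loads OpenMPI and launches the binary with mpirun. If your program is a plain serial binary, the batch file can simply run it directly; the following is only a sketch using the same file names as this example, not part of the original walkthrough:

<pre>
#!/bin/sh
# Sketch of a serial variant of gtest.sh: no MPI module and no mpirun,
# the binary is run directly on the allocated core.

echo "Running gtest (serial)"

# The staged-in binary may not be executable, so set the bit first.
chmod u+x gtest

# Feed the input file to the program and capture its output.
./gtest < gtest.txt > gtest.out

# Pack the output so it matches the (outputfiles=...) entry in gtest.xrsl.
tar czf gtest.tgz gtest.out

echo "Done"
exit 0
</pre>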
*Now you are ready to run the job.*

Get a proxy to run the job:

<pre>
$ arcproxy
Enter pass phrase for private key:
Your identity: /DC=org/DC=terena/DC=tcs/C=FI/O=Turun yliopisto/CN=Timo Eronen tke@utu.fi
Proxy generation succeeded
Your proxy is valid until: 2016-12-08 19:46:02
</pre>

Submit the job:

<pre>
$ arcsub gtest.xrsl
Job submitted with jobid: gsiftp://io-grid.fgci.csc.fi:2811/jobs/3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm
</pre>

Query the status of the job:

<pre>
$ arcstat gsiftp://io-grid.fgci.csc.fi:2811/jobs/3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm
Job: gsiftp://io-grid.fgci.csc.fi:2811/jobs/3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm
 Name: g_test
 State: Finishing

Status of 1 jobs was queried, 1 jobs returned information
</pre>

The job was not yet finished, but after a while:

<pre>
$ arcstat gsiftp://io-grid.fgci.csc.fi:2811/jobs/3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm
Job: gsiftp://io-grid.fgci.csc.fi:2811/jobs/3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm
 Name: g_test
 State: Finished
 Exit Code: 0

Status of 1 jobs was queried, 1 jobs returned information
</pre>

Get the results:

<pre>
$ arcget gsiftp://io-grid.fgci.csc.fi:2811/jobs/3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm
Results stored at: 3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm
Jobs processed: 1, successfully retrieved: 1, successfully cleaned: 1
</pre>

And then you are ready to view the results:

<pre>
$ ls -l 3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm/
total 8
-rw------- 1 tke admin 148 Dec  8 09:14 gtest.tgz
-rw------- 1 tke admin  19 Dec  8 09:14 std.out

$ cd 3dxMDmPc7Ypn9NOVEmGrhjGmABFKDmABFKDmXGKKDmABFKDmWJfHjm/

$ cat std.out
Running gtest
Done

$ tar xf gtest.tgz

$ cat gtest.out
Hello UTU.
Hello FGCI.
</pre>
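A few other ARC client commands can be handy while a job is running. The sketch below is not part of the original walkthrough; replace <jobid> with the job identifier printed by arcsub, and check each command's --help output on the frontend for the options available in the installed client version:

<pre>
# Query the status of all jobs known to the ARC client, not just one.
$ arcstat -a

# Print the stdout of a job while it is still running (here std.out,
# because of (join="yes") and (stdout=std.out) in gtest.xrsl).
$ arccat gsiftp://io-grid.fgci.csc.fi:2811/jobs/<jobid>

# Cancel a job that is no longer needed.
$ arckill gsiftp://io-grid.fgci.csc.fi:2811/jobs/<jobid>

# Remove a finished job from the cluster without downloading its results
# (arcget already cleaned this example's job when it fetched the output).
$ arcclean gsiftp://io-grid.fgci.csc.fi:2811/jobs/<jobid>
</pre>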