cluster computing - Passing variables between Python scripts through a shell script


I can't think of a way of doing what I'm trying to do, so I'm hoping for a little advice. I'm working with data on a computing cluster, and I process individual files on separate compute nodes. The workflow I have right now is the following:

**file1.py**: gathers the files, parameters, and other info from the user; calls file2.sh
**file2.sh**: submits file3.py to a compute node
**file3.py**: processes the input file with the given parameters

What I'm trying to do is call file2.sh and pass it each input data file one at a time, so that there are multiple instances of file3.py running, one per file. Is there a way to do this?

I suppose the root of the problem is that if I iterate through the list of input files in file1.py, I don't know how to pass that information to file2.sh and on to file3.py.

From the description, I'd say the straightforward way is to call file2.sh directly from Python.

import commands  # Python 2's commands module

status, result = commands.getstatusoutput("file2.sh " + arg_string)
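Note that the commands module is Python 2 only and was removed in Python 3; there, subprocess.getstatusoutput covers the same ground. A minimal sketch of the equivalent call, same assumptions as above:

import subprocess

# Python 3 equivalent of the call above; arg_string is assumed
# to already contain the input file path and any parameters
status, result = subprocess.getstatusoutput("file2.sh " + arg_string)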

Is that enough of a start to get you moving? Are the nodes conversant enough that one can launch a command directly on another? If not, you may want to consider reading up on "interprocess communication" on Linux. If they're not on the same network node, you'll need REST commands (POST and GET operations), at which point things grow more overhead.
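To make the whole chain concrete, here is a minimal sketch under the obvious assumptions (the file names come from the question; the list of inputs and the forwarding lines are my own illustration). subprocess.Popen starts each file2.sh without waiting for it to finish, so one instance runs per input file:

import subprocess

# file1.py: launch one file2.sh per input file without blocking,
# so one file3.py instance ends up running per file
input_files = ["data1.txt", "data2.txt"]  # hypothetical; whatever file1.py gathers

jobs = [subprocess.Popen(["./file2.sh", f]) for f in input_files]

for job in jobs:
    job.wait()  # optional: block until every submission call has returned

# file2.sh only needs to forward its argument along, e.g.:
#   #!/bin/sh
#   python file3.py "$1"   # or hand "$1" to your scheduler's submit command

# and file3.py reads it back off the command line:
#   import sys
#   input_file = sys.argv[1]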

