[Bioperl-l] parallel processing with perl
pwilkinson at videotron.ca
Fri Jul 16 09:44:34 EDT 2004
This is something you can do by creating a process for each query and
capturing the output from each child process. Start by having a look at the
fork() command. Be warned that process management is an art, and you may
end up spending some time learning about managing multiple processes.
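A minimal sketch of that approach, assuming the queries are independent: each command is launched in its own child via a piped open (which forks internally), so all four run concurrently, and the parent keeps only each child's last output line, matching the `| tail -1` idiom below. The echo commands are placeholders for the actual mysql invocations.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder commands; substitute e.g.
#   "mysql -h localhost -u xx -pyy filter < batchfile1.sql"
my @cmds = (
    "echo one",
    "echo two",
    "echo three",
    "echo four",
);

# Start every command first so the children run in parallel.
my @fhs;
for my $cmd (@cmds) {
    open( my $fh, '-|', $cmd ) or die "cannot start '$cmd': $!";
    push @fhs, $fh;    # child is now running in the background
}

# Then collect results: read each child's stdout, keeping the last line.
my @results;
for my $i ( 0 .. $#fhs ) {
    my $last;
    while ( my $line = readline( $fhs[$i] ) ) {
        $last = $line;
    }
    close $fhs[$i] or warn "command '$cmds[$i]' exited nonzero\n";
    chomp $last if defined $last;
    $results[$i] = $last;
}

print "$_\n" for @results;
```

Because the reads happen after all four children have been started, a slow query in one child does not delay the others; the parent simply blocks until each child's output is complete.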
At 04:27 AM 7/16/2004, gowthaman ramasamy wrote:
>Please ignore this question if it sounds unrelated to BIOPERL.
>I have a lengthy Perl script running on a 4-processor machine. At one
>point I have to execute four mysql queries from
>four different batch files (via shell). Currently I run them one after
>another. Can I somehow fire them simultaneously so that they occupy all 4
>processors and get the job done quickly?
>portion of script follows ...
>$var1=`mysql -h localhost -u xx -pyy filter < batchfile1.sql |tail -1`;
>$var2=`mysql -h localhost -u xx -pyy filter < batchfile2.sql |tail -1`;
>$var3=`mysql -h localhost -u xx -pyy filter < batchfile3.sql |tail -1`;
>$var4=`mysql -h localhost -u xx -pyy filter < batchfile4.sql |tail -1`;
>NOTE: I don't want to use Perl-DBI.
>Many thanks in advance
>Malaria Research Group,
>ICGEB , New Delhi.
> 91-11-26173184; 91-11-26189360 #extn 314
>Bioperl-l mailing list
>Bioperl-l at portal.open-bio.org