Aim:
Instant Cluster, just add water (continued)!
Attending:
CIS(theta) 2010-2011: DavidG, HerbertK, JoshG, RyanH
Reading:
NA
Research:
openSSH and openMPI
MPI on Ubuntu
http://ubuntuforums.org/showthread.php?t=1327253
Sample MPI code
http://www.cc.gatech.edu/projects/ihpcl/mpi.html
Thomas Jefferson High courses
http://academics.tjhsst.edu/compsci/parallel/
Thomas Jefferson High paper
http://www.tjhsst.edu/~rlatimer/techlab07/BWardPaperQ3-07.pdf
Thomas Jefferson High ftp
http://www.tjhsst.edu/~rlatimer/techlab07/
Thomas Jefferson High teacher
http://www.tjhsst.edu/~rlatimer/
MPI4py
http://blog.mikael.johanssons.org/archive/2008/05/parallell-and-cluster-mpi4py/
Parallel Python
IPython
Large Integer number crunching Mersenne Primes
http://www.hoise.com/primeur/03/articles/weekly/AE-PR-01-04-37.html
Large Integer number crunching Beal Conjecture
http://www.bealconjecture.com/
MPIPOV
http://comp.uark.edu/~ewe/misc/povray.html
POVRAY
http://himiko.dnsalias.net/wordpress/2010/03/29/persistence-of-vision-ray-tracer-povray/
MPI and blender
http://www.blender.org/forum/viewtopic.php?t=10244&view=next&sid=3003d90233a27b81c5093a374c2b0e31
More MPI and blender
http://wiki.vislab.usyd.edu.au/moinwiki/BlenderMods/
InstantCluster Step 4: Software Stack (Continued)
We got Josh up to speed installing openSSH with public-key authentication, plus openMPI. So, what follows is a summary of what we did to get up to public-key authentication.
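In a nutshell, the standard Ubuntu recipe looks something like this (just a sketch; user@node02 is a placeholder for your own account and node names):
sudo apt-get install openssh-server    # on every node
ssh-keygen -t rsa                      # accept the defaults; an empty passphrase keeps mpirun from prompting
ssh-copy-id user@node02                # copies your public key into the remote authorized_keys
ssh user@node02                        # test: should log you in without a password
Once that works from the head node to every other node, mpirun can launch jobs without password prompts.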
We then installed openMPI (see dependencies below) and tested multi-core with flops. Testing the cluster as a whole will have to wait until the next meeting! We followed the openMPI install instructions for Ubuntu from
http://www.cs.ucsb.edu/~hnielsen/cs140/openmpi-install.html
These instructions say to use sudo and run apt-get install openmpi-bin openmpi-doc libopenmpi-dev. However, the way our firewall is set up at school, I can never update my apt-get sources properly. So, I used http://packages.ubuntu.com to look up these packages and chase down all their dependencies by hand!
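For the record, once you've downloaded all the .deb files by hand, the install itself is just dpkg (~/openmpi-debs here is a made-up directory name):
cd ~/openmpi-debs    # wherever the downloaded .deb files ended up
sudo dpkg -i *.deb   # installs openmpi-bin, openmpi-doc, libopenmpi-dev and their dependencies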
If you think that's bad, look at all the dependencies I ran into installing VLC on Ubuntu 10.10 Maverick Meerkat, which was required to install HandBrake!
We finally got all this working (sans VLC, which is on my smartboard station). Then we used the flops.f FORTRAN benchmark to test multi-core. FORTRAN, really? I haven't used FORTRAN since 1979!
We compiled flops.f after installing gfortran, too:
mpif77 -o flops flops.f
and tested openMPI, getting around 900 MFLOPS using 2 cores:
mpirun -np 2 flops
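By the way, if you don't have flops.f handy, here's a minimal MPI program in C that makes a decent smoke test. This is just my sketch of the classic pi-by-integration demo, not the flops benchmark itself; the file name pi.c is made up:
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, i, n = 1000000;
    double h, x, sum = 0.0, mypi, pi;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* midpoint-rule integration of 4/(1+x^2) over [0,1] = pi;
       each rank takes every size-th strip starting at its own rank */
    h = 1.0 / n;
    for (i = rank; i < n; i += size) {
        x = h * (i + 0.5);
        sum += 4.0 / (1.0 + x * x);
    }
    mypi = h * sum;

    /* combine the partial sums on rank 0 and print the result */
    MPI_Reduce(&mypi, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("pi is approximately %.12f\n", pi);

    MPI_Finalize();
    return 0;
}
Compile and run it the same way as the FORTRAN benchmark:
mpicc -o pi pi.c
mpirun -np 2 pi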
Next meeting we have to generate a "machines" file to tell mpirun where all the nodes are:
mpirun -np 4 --hostfile machines flops
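The machines file itself is just a plain text file listing the nodes; something like this (the hostnames are invented, and slots is how many cores to use on each node):
# machines: one line per node
node01 slots=2
node02 slots=2
With two dual-core boxes listed like that, -np 4 puts one process on each core.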
Well, that's all for now, enjoy!