DavidG, HerbertK, JayW, JoshG, RyanH
Reading: Building Parallel Programs, Chapter 1
I welcomed our new CIS(theta) team today! We will meet every other Wednesday. Our tasks are to learn about cluster computing hardware in general, investigate the current state of cluster computing software, and find a problem worthy of running on our cluster.
First, let's get some vocabulary straight:
CIS(theta) = Computing Independent Study class
CIS(team) = DavidG, HerbertK, JoshG, JayW, RyanH (aka Geek Squad)
Shadowfax = 25 node, 64bit dual-core AMD Athlon 2GHz, gigE cluster
How about some history:
CIS(team) 0: We've done a bit of cluster programming over the years as a final project in AP Computer Science after AP Week, going all the way back to clusterKNOPPIX, i386/486s and slow ethernet! Actually, about 25 years ago I helped JamesM with a robotics project for the Westinghouse Talent Search. That's when the bug bit me to get into all this research with my students! I think the computing independent study started about 20 years ago, when JoeB wanted to learn about Mathematica and Vector Calculus.
CIS(team) I: ChrisR, FrankK, NathanielR successfully installed openMOSIX on 25 Pentium IVs over fastE and created several Mandelbrot plots using C++ and fork(). We followed the advice from this blog, http://nullprogram.com (search there for mosix, octave, c++ or fractal). Take a look at the fractal prints on my deviantart site, http://cistheta2007.deviantart.com/gallery
CIS(team) II: MarcA, MitchellW were the first to use the current cluster hardware. They installed the Quantian Linux DVD and used bash scripts to scatter/gather povray jobs over publicly authenticated ssh. Take a look at the ray tracings on my zazzle site: http://www.zazzle.com/cistheta2008/gifts
CIS(team) III: ArthurD, DevinB, JeremyA, StevenB were the first to attempt using the current cluster hardware in 64bit mode. We tried Fedora, CentOS and even Rocks Clusters and OSCAR to no avail. I think we barely got HelloCluster.cc running on openMPI.
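For the record, a HelloCluster-style program on openMPI is only a few lines of C++. This is a generic sketch, not the team's actual HelloCluster.cc: each process reports its rank, the total process count, and the node it landed on.

```cpp
// Minimal "hello, cluster" for openMPI -- a generic sketch, not the
// original HelloCluster.cc mentioned above.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);                // start the MPI runtime
    int rank = 0, size = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  // this process's id
    MPI_Comm_size(MPI_COMM_WORLD, &size);  // total processes launched
    char name[MPI_MAX_PROCESSOR_NAME];
    int len = 0;
    MPI_Get_processor_name(name, &len);    // which node we're on
    std::printf("Hello from rank %d of %d on %s\n", rank, size, name);
    MPI_Finalize();
    return 0;
}
```

Compile with mpic++ and launch across the nodes with something like mpirun -np 25 --hostfile nodes ./HelloCluster (the hostfile name here is an assumption).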
CIS(team) IV: DavidG, HerbertK, JayW, JoshG, RyanH will try 64bit mode again on Ubuntu Linux 10.04, which I have already installed on all nodes (see my post on this, http://shadowfaxrant.blogspot.com/2010/06/so-many-linux-distros-so-little-time.html). You may want to try pelicanHPC too. This is a 32bit live Linux CD. With this CD you are supposed to be able to reboot the whole room as an MPI cluster in minutes without installing a thing! What's nice about this solution is that it includes openMPI and has a lot of demos. We had this working on the new hardware last summer, but sometimes it's hard to use as it's based on PXE and there's some DHCP server on our 10.5.*.* subnet that interferes with booting the nodes. However, another benefit of this CD is that it includes Octave (like MATLAB/SAGE) and mpiTB (an interface for Octave to work with MPI)! Hence your online research assignment listed above. Another CD solution with MPI is BCCD. An older DVD with openMOSIX is QUANTIAN.
So, our first meeting consisted of discussing all the above. Also, I mentioned that the CIS(theta) team will act as my personal Geek Squad and help me install hardware, firmware and software in our PC Classroom (since I am still recuperating from surprise hernia surgery). So, we had 2 "field trips."
Field Trip 1: We went to the display case with some fractal and ray tracing prints we wanted to swap out. We also went to Mrs. Murthy's new office to help decorate it with some prints. We saved some nice big prints for the PC Classroom. We are getting 6 new marker boards in there which will be installed all around the room. I will hang a big print above each one. So, my decorative work here is done!
Field Trip 2: I also showed the current CIS(theta) team around to all the servers:
centauri: ftp server in the book room
colossus: ssh server in the business office
guardian and caprica: new servers in the PC Classroom
So, the Geek Squad will also be helping me to install these new servers. The new servers will eventually replace the old ones. These new servers each have 4x72GB RAID drives and 64bit Intel Xeon quadcores! Maybe we should make a cluster out of these? Google "Cloud Computing using Ubuntu" and "Cloud Computing for HPC."
Wow, we have our work cut out for us. But, I think we are going to have a lot of fun this year!