Saturday, March 23, 2013

CIS(theta) 2012-2013: March Update

Computing Independent Study 2012-2013

We only had one meeting this month, and all we did was set up for Game Day! We had an OpenArena LAN Party all day yesterday. Thanx to KyleS (period 1), fun was had by one and all! All the brownies and cupcakes were great too! Thanx to JessicaP (period 6) and LeslieM (period 8), no one left hungry! 

Yesterday (3/22/13) was the Friday before break. It had the feeling of a celebration like the day before XMas break. I suppose that's because we haven't had a break since then. 

Not only did Hurricane Sandy close the school for 10 days and condemn the Math building, but it cancelled Midterm Week and February Break too! Thanx a lot, Sandy....

We finally got around to writing and using mpi4py, matplotlib (and its spectral colormap), but only on 3 cores. Our work is based on Lisandro Dalcin's excellent materials listed below. We ran our script as an executable file from the terminal, just like any standard Python program. To accomplish this we added:
#!/usr/bin/env python
as the first line of the file and ran chmod 755 on it to make it executable. The script is then executed like:
mpiexec -n 3 python <scriptname>

OOPS, we got our signals crossed and had to postpone this meeting! Next time I think we'll try to render some Mandelbrot Fractals. If we ever get back to our old room, we could make a FractalZoom movie using 100 cores instead of just 3....
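When we do get to Mandelbrot fractals, the escape-time algorithm is simple enough to prototype serially first. Here's a minimal sketch (hypothetical, not code we've run in class yet) of the per-point kernel; since every row of the image is independent, rows could later be dealt out to MPI ranks for that 100-core FractalZoom:

```python
def mandel(c, max_iter=100):
    """Escape-time kernel: iterate z -> z*z + c and return the
    iteration count before |z| exceeds 2 (max_iter if it stays bounded)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

def render(width=40, height=20, max_iter=100):
    """Compute a small grid of escape counts over the classic
    viewing window [-2, 1] x [-1.25, 1.25]."""
    rows = []
    for j in range(height):
        y = -1.25 + 2.5 * j / (height - 1)
        row = [mandel(complex(-2.0 + 3.0 * i / (width - 1), y), max_iter)
               for i in range(width)]
        rows.append(row)
    return rows
```

In a parallel version, each rank would call render on its own band of y values and rank 0 would gather the bands before plotting with matplotlib.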

Finally, we got working today on a single node with 3 cores! We wrote the following code:

from mpi4py import MPI

rank = MPI.COMM_WORLD.Get_rank()
size = MPI.COMM_WORLD.Get_size()
name = MPI.Get_processor_name()

print("Hello, World! "
      "I am process %d of %d on %s" %
      (rank, size, name))

We saved this text in a file in the pelicanHPC home dir. Then, from a shell in the home dir, we executed this code by:
mpiexec -n 3 python
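Once each process knows its rank and size, the usual next step is dividing up work among them. A rank-aware helper like this one (a hypothetical sketch, not part of the class code above) gives each process its own slice of a job without any communication at all:

```python
def my_chunk(n_items, rank, size):
    """Return the slice of range(n_items) owned by this rank,
    spreading any remainder over the lowest-numbered ranks."""
    base, extra = divmod(n_items, size)
    start = rank * base + min(rank, extra)
    stop = start + base + (1 if rank < extra else 0)
    return range(start, stop)

# On a 3-process run (size=3) splitting 10 work items:
# rank 0 -> range(0, 4), rank 1 -> range(4, 7), rank 2 -> range(7, 10)
```

In an mpi4py script, each process would pass its own Get_rank() and Get_size() values in, loop over just its slice, and let rank 0 collect the results.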

We downloaded the new ISO with mpi4py included and followed the tutorials listed below.

We tried to isolate our LAN from other DHCP servers but could not. So we still only have individual 3-core SMP boxes running at about 1.2 GFLOPs. We also tried our hand at mpi4py, but we have a way to go in that department too!
UPDATE: mpi4py was left out of the current pelicanHPC by mistake! No wonder mpi4py didn't work! Look here:

We finally had a chance to meet this month in our new PC Lab! We got pelicanHPC to run 3 cores at about 1.2 GFLOPS. We have AMD Phenom IIs. These CPUs are supposed to be quad-cores. It seems one core is dead. Even so, that's not a bad start. However, we could only run pelican in SMP mode. We could not PXE boot any other nodes. We are also looking into Flame Fractals.

Sorry to say that we had no meetings this month. I was hoping for 3 meetings, but Hurricane Sandy changed everything! We are in a new room where we may be able to try out liveLinux CD based MPI clusters next month. Stay tuned!

That was a great meeting today! We burned 8 CDs of Precise Pangolin and reinstalled the whole back row of our PC LAB/Classroom. Many thanx go to Jeremy for coming to visit today and lending a hand! 

Here are the steps we followed for a minimal install of the Student Stations (64bit Athlons):
Reboot each Linux box with the Precise Pangolin CD. Answer some basic questions about time zone, userid, passwd, no login on bootup, etc.
Reboot each Linux box without the CD. Make sure to configure the gigE cards and proxy server:
IP: 10.5.129.x
Configure System Settings as desired (unit circle trig calculator background, no screensaver, etc).
We had to switch Software Sources in the Ubuntu Software Center (edit/source) to Main before this would work:
sudo apt-get update
sudo apt-get upgrade
Now, we could use the Ubuntu Software Center to install WINE.
I copied my VTI83 and VTI89 directories from my memory stick to the Desktop. Then, after editing preferences to have VTI open with WINE, I configured each calculator.
I will edit my /etc/crontab tomorrow....
We haven't decided what else we may have to install (local SAGE server, JRE, openSSH, openMPI,etc). We'll have to think about that! Here's some info on install fests from prior years:
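For the proxy step above, one way to make the setting stick for apt is a drop-in conf file (a sketch only; the host and port here are placeholders, not our school's actual proxy):

```
# /etc/apt/apt.conf.d/95proxy -- hypothetical proxy host/port
Acquire::http::Proxy "http://proxy.example.org:8080/";
```

With that file in place, sudo apt-get update and upgrade work from behind the firewall without exporting http_proxy in every shell.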

We decided to try out the new Ubuntu Linux 64bit Desktop 12.04 nicknamed Precise Pangolin. So we surfed on over to the Ubuntu site and downloaded the latest ISO. We burned the CD, rebooted a guinea pig box and reinstalled it. This should be a simple procedure as we no longer use dualboot or dualnic boxes. However, we ran into a SNAFU right away! Intranet gigE worked fine, but we couldn't get on the Internet. Oops, we forgot the network proxy. If at first you don't succeed, try, try again!

Ubuntu Release History
4.10 Warty Warthog (mammal)
5.04 Hoary Hedgehog (mammal)
5.10 Breezy Badger (mammal)
6.06 Dapper Drake (bird)
6.10 Edgy Eft (amphibian)
7.04 Feisty Fawn (mammal)
7.10 Gutsy Gibbon (mammal)
8.04 Hardy Heron (bird)
8.10 Intrepid Ibex (mammal)
9.04 Jaunty Jackalope (mythical beast)
9.10 Karmic Koala (mammal)
10.04 Lucid Lynx (mammal) 
10.10 Maverick Meerkat (mammal) 
11.04 Natty Narwhal (mammal) 
11.10 Oneiric Ocelot (mammal) 
12.04 Precise Pangolin (mammal) 
12.10 Quantal Quetzal (bird) release: 10/18

Guardian, our ssh server, is running 10.04 32bit. Guardian has a dualcore 32bit intel Xeon processor with 2GB RAM and a 512GB RAID drive.

Caprica, our ftp server, is running 10.04 32bit. Caprica has a dualcore 32bit intel Xeon processor with 2GB RAM and a 512GB RAID drive.

Shadowfax, our teacher station, is running 11.10 32bit. Shadowfax has a dualcore 64bit amd Athlon processor with 2GB RAM and a 256GB hdd. We use a 32bit OS here as SmartNotebook doesn't run on 64bit....

Alpha-Omega, our student stations, are running 11.04 64bit. These Linux boxes, like Shadowfax, have dualcore 64bit amd Athlon processors with 2GB RAM and a 256GB hdd.

We are only upgrading Alpha-Omega to 12.04 (or 12.10 if it's available when we upgrade in a couple of weeks). We are also waiting for a hardware upgrade for Alpha-Omega to amd quadcore Phenoms!

We had our traditional first organizational meeting:

(1) Wreath of the Unknown Server: 
We visited our first ssh server, Colossus, which is still in the switch room though dormant. I set it up for the first time in 1995 running Slackware Linux. Colossus ran for 12 years straight, 24x7, never having to shut down, reboot or even have anything re-installed! Colossus would not die. We finally just replaced Colossus with a dual-core Intel Xeon box complete with a 1TB RAID drive. Old Linux boxes never die, they just fade away...

(2) Display Case Unveiled: 
We took down a ton of fractal prints and ray tracings from Room 429 to the 2 cases on the 1st floor near the art wing. We decorated both cases as best we could and left before anyone saw us. Must have been gremlins.

(3) Recruiting 2012: 
We decided that we did not have a good pool of candidates to recruit more CIS(theta) members for this year's Geek Squad, so we tabled that topic.

(4) Planning 2012: 
Next meeting would have been 9/28, but that's Yom Kippur. So, we have to wait another 2 weeks after that for 10/10, at which point Ubuntu 12.04 Precise Pangolin 64bit Desktop Edition should be available for a mini install fest. After that, we may use bootable cluster Linux CD distros to learn MPI.
What we are researching I (Sept)
(look what this school did in the 80s): 
Thomas Jefferson High courses
Thomas Jefferson High paper
Thomas Jefferson High ftp
Thomas Jefferson High teacher

What we are researching II (Oct)
(clustering environments): 
Parallel Virtual Machine
Message Passing Interface

What we are researching III (Dec)
(instant MPI clusters via liveCDs): 
Cluster By Night
Flame Fractals

What we are researching IV (Jan)

What we are researching V (Feb)

Today's Topic:
CIS(theta) 2012-2013 -

Today's Attendance:
CIS(theta) 2012-2013: Kyle Seipp

Today's Reading:
Chapter 5: Building Parallel Programs (BPP) using clusters and Parallel Java
Membership (alphabetic by first name):
CIS(theta) 2012-2013: 
Kyle Seipp

CIS(theta) 2011-2012: 
Graham Smith, George Abreu, Kenny Krug, Lucas Eager-Leavitt

CIS(theta) 2010-2011: 
David Gonzalez, Herbert Kwok, Jay Wong, Josh Granoff, Ryan Hothan

CIS(theta) 2009-2010: 
Arthur Dysart*, Devin Bramble, Jeremy Agostino, Steve Beller

CIS(theta) 2008-2009: 
Marc Aldorasi, Mitchel Wong*

CIS(theta) 2007-2008: 
Chris Rai, Frank Kotarski, Nathaniel Roman

CIS(theta) 1988-2007: 
A. Jorge Garcia, Gabriel Garcia, James McLurkin, Joe Bernstein, ... too many to mention here!

Well, that's all folks, enjoy!
Happy Clustering,
