Wednesday, June 26, 2019

preCalculus Honors 2018-2019: Fare Thee Well, You Will Be Missed! (June)

preCalculus Honors 2018-2019:
Fare Thee Well, You Will Be Missed!
(June)

JUNE UPDATE (last one):
Well, goodbye to all my preCalculus students! We had fun with our Game Day, right? Hope you had a great year and you learned a lot! Have a great summer. Hope to see you next year!

CALC UNIT03-UNIT06 SCREENCASTS







MAY UPDATE: 
We flew thru an 8-week intro to Calculus Boot Camp. UNIT03 was about Differentiation. UNIT04 was about Applications of Derivatives. UNIT05 was about Integration. UNIT06 was about Applications of Integrals.

UNIT02 is an introduction to Limits. It's called UNIT02 as it's a version of UNIT02 from my Calculus class where UNIT01 is a review of preCalculus. We don't need UNIT01, now do we? 

So, in UNIT02 we extended the ideas of CHAP03, where we talked about Limits at Infinity and Limits at a Point with respect to graphing asymptotes of Rational Functions. Then we talked about 1-sided and 2-sided limits, continuity and the definition of the Derivative. 
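If you want to play with the definition of the Derivative numerically over the summer, here's a quick Python sketch (the kind of thing you could paste into SAGECELL; the function and point are just an example, not from class) of the 1-sided difference quotients closing in on a 2-sided limit:

```python
# Numerical check of the limit definition of the derivative:
#   f'(a) = lim h->0 [f(a+h) - f(a)] / h
# Try f(x) = x^2 at a = 3, approaching from both sides.

def f(x):
    return x * x

a = 3.0
for h in [0.1, 0.01, 0.001, -0.1, -0.01, -0.001]:
    dq = (f(a + h) - f(a)) / h   # difference quotient
    print(f"h = {h:8} : {dq:.4f}")

# Both 1-sided difference quotients close in on 6, so f'(3) = 6.
```

Positive and negative h giving the same limit is exactly the 2-sided limit existing, which is what makes f differentiable at a.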

XTRA CREDIT FILKS RUBRIC 
(1 video = up to 5 bonus points):
1) Use a recognizable tune.
2) Karaoke the entire song, changing up the words (about STEAM).
3) You are Singing, Dancing or Playing an instrument.
4) You upload your video to YouTube and provide the url.
5) YouTube Description includes the lyrics.

XTRA CREDIT ARTICLES RUBRIC
(up to 5 articles = 1 bonus point each):
1) Cover Sheet is a Summary of the article.
2) FullPage, 12 pt, DoubleSpaced, 1" Margin.
3) Article has to be STEAM-related.
4) Article has to be a current event.
5) Copy of entire article is attached.

Well, that's all folks!
Teaching with Technology, 
AJG
A. Jorge Garcia

 

Applied Math, Physics & CompSci
PasteBin SlideShare 
MATH4R and 4H, AP CALC: GC or SAGECELL
CSH: SAGE Server
CSH: Interactive Python
APCSA: c9.io
APCSA: openProcessing

Beautiful Mind Soundscape:

AP Calculus BC 2018-2019: Fare Thee Well, You Will Be Missed! (June)

AP Calculus BC 2018-2019:
Fare Thee Well, You Will Be Missed!
(June)

JUNE UPDATE (last one):
Well, goodbye to all my AP Calculus BC students! We had fun with our AP Movie Marathon (The Theory of Everything, Hidden Figures), didn't we? Hope you had a great year and you learned a lot! Have a great summer. Hope to see you next year!

UNIT12 ScreenCasts
MAY UPDATE: 
May was all about AP Review, AP Exam Week, AP Movie Marathons and Final Projects!

UNIT12 is all about Vector and Polar notation. We started by talking about parametrized motion problems. Then we extended our formulas for ArcLength and Surface Area. Finally, we did the same thing with Polar Graphs: finding Area and ArcLength. 
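Those extended formulas are easy to sanity-check numerically. Here's a little Python sketch (SAGECELL-friendly; the helper name and sample curve are just illustrative) that Riemann-sums the parametric ArcLength integral on the unit circle, where the answer should be the circumference 2π:

```python
from math import cos, sin, sqrt, pi

# Parametric ArcLength: L = integral of sqrt(x'(t)^2 + y'(t)^2) dt.
# Sanity check on the unit circle x = cos(t), y = sin(t), 0 <= t <= 2*pi.

def arc_length(xp, yp, a, b, n=100000):
    """Midpoint Riemann-sum approximation of the parametric arc length."""
    dt = (b - a) / n
    total = 0.0
    for i in range(n):
        t = a + (i + 0.5) * dt           # midpoint of subinterval i
        total += sqrt(xp(t)**2 + yp(t)**2) * dt
    return total

L = arc_length(lambda t: -sin(t), lambda t: cos(t), 0, 2 * pi)
print(L)  # ~ 6.2832 = 2*pi
```

The same loop handles a Polar ArcLength by swapping in the polar integrand sqrt(r(θ)² + r'(θ)²).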


We're loving our class set of TI nSPIRE CX CAS Graphing Calculators from DonorsChoose!


RECOMMENDED AP CALCULUS BC REVIEW:
CRIB SHEET (not given during exam) 
REVIEW BARRONS BOOK (see me)
REVIEW BARRONS ONLINE 

REVIEW KUTASOFTWARE
REVIEW DELTAMATH AB CALC FLASH CARDS
REVIEW APCENTRAL (latest BC FRQs)

REVIEW APCENTRAL (older AB FRQs)
REVIEW APCENTRAL (older BC FRQs)
REVIEW EDX MOOC01 
REVIEW EDX MOOC02 
REVIEW COURSERA MOOC03 


XTRA CREDIT FILKS RUBRIC 
(1 video = up to 5 bonus points):
1) Use a recognizable tune.
2) Karaoke the entire song, changing up the words (about STEAM).
3) You are Singing, Dancing or Playing an instrument.
4) You upload your video to YouTube and provide the url.
5) YouTube Description includes the lyrics.

XTRA CREDIT ARTICLES RUBRIC

(up to 5 articles = 1 bonus point each):
1) Cover Sheet is a Summary of the article.
2) FullPage, 12 pt, DoubleSpaced, 1" Margin.
3) Article has to be STEAM-related.
4) Article has to be a current event.
5) Copy of entire article is attached.

Well, that's all folks!
Teaching with Technology, 
AJG
A. Jorge Garcia

 

Applied Math, Physics & CompSci
PasteBin SlideShare 
MATH 4H, AP CALC: GC or SAGECELL
CSH: SAGE Server
CSH: Interactive Python
APCSA: c9.io
APCSA: openProcessing


Beautiful Mind Soundscape:

AP CompSci A 2018-2019: Fare Thee Well, You Will Be Missed! (June)

AP CompSci A 2018-2019:
Fare Thee Well, You Will Be Missed!
(June)

JUNE UPDATE (last one):
I didn't get a chance to screencast the last 6 projects (listed below). Stay tuned, I'll probably do so over the summer!

Well, goodbye to all my AP CompSci A students! We had fun with our AP Movie Marathon (The Martian, The Imitation Game, Jobs) and Game Day, right? Hope you had a great year and you learned a lot! Have a great summer. Hope to see you next year!

LAB19 ScreenCasts



MAY UPDATE: 
We did a few new labs involving recursion:
C19X15 LSystems in JDK and openProcessing
C19X16 Math.java
We did a few new labs on inheritance:
C14X5 Employee
C14X6 Worker
We added a few labs to Lab07 (arrays)
C7X11 Chess960.java
C7X12 Game of Life in openProcessing
We also had AP Review, AP Exam Week, AP Movie Marathon and Final Projects!

Our last topic before AP Review was LAB19 Recursion. We didn't have a lot of time left for this topic, so all we did was mimic the static Math class by writing our own Math.java with its own pow() method. We also added a Factorial method, a Fibonacci method and a Pascal's Triangle number method. LAB20 is about Searching and Sorting algorithms. We just watched the infamous video: Sort Out Sorting!
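For the curious, those recursive methods translate almost line-for-line into any language. The lab itself was Math.java in Java; here's a Python sketch of the same ideas (method names here are just illustrative, not the lab's exact signatures):

```python
# The four recursive methods from the Math.java lab, sketched in Python.

def power(base, exp):
    """Recursive pow() for a non-negative integer exponent."""
    return 1 if exp == 0 else base * power(base, exp - 1)

def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

def fibonacci(n):
    """0, 1, 1, 2, 3, 5, 8, ... (0-indexed)."""
    return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)

def pascal(row, col):
    """Entry col of row `row` of Pascal's Triangle (both 0-indexed)."""
    if col == 0 or col == row:
        return 1
    return pascal(row - 1, col - 1) + pascal(row - 1, col)

print(power(2, 10), factorial(5), fibonacci(10), pascal(4, 2))  # 1024 120 55 6
```

Each method has the same shape: a base case plus a call to itself on a smaller input, which is the whole point of LAB19.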

BTW, we're loving our new cs50.io, aka c9.io, IDE from the Amazon Cloud (AWS)! Don't forget about the Processing IDE! We just started using OpenProcessing online as they've just added Processing.js, which is more compatible with Processing's own IDE. The default mode is p5.js (JavaScript) but we switch to Processing.js (Java syntax) and all is well! So, we're computing in the cloud even when we use Processing! IDK, is OpenProcessing on AWS or the Google Cloud Platform (GCP)?

RECOMMENDED AP COMPSCI REVIEW:

CRIB SHEET (given during exam)
REVIEW BARRONS BOOK (see me)
REVIEW BARRONS ONLINE 
REVIEW APCENTRAL (past FRQs)
REVIEW EDX REVIEW MOOC01 
REVIEW UDEMY REVIEW MOOC02 
REVIEW CODING_BAT 
REVIEW PRACTICE_IT 
REVIEW RUNESTONE 
AUDIT CS50


XTRA CREDIT FILKS RUBRIC 
(1 video = up to 5 bonus points):
1) Use a recognizable tune.
2) Karaoke the entire song, changing up the words (about STEAM).
3) You are Singing, Dancing or Playing an instrument.
4) You upload your video to YouTube and provide the url.
5) YouTube Description includes the lyrics.

XTRA CREDIT ARTICLES RUBRIC

(up to 5 articles = 1 bonus point each):
1) Cover Sheet is a Summary of the article.
2) FullPage, 12 pt, DoubleSpaced, 1" Margin.
3) Article has to be STEAM-related.
4) Article has to be a current event.
5) Copy of entire article is attached.

Well, that's all folks!
Teaching with Technology, 

Beautiful Mind Soundscape:

Saturday, June 15, 2019

CIS(theta), 2018-2019 No June Meeting!

CIS(theta), 2018-2019
No June Meeting!

CH01(MEETING0) SEPTEMBER READING
CH02(MEETING1) OCTOBER READING
CH03(MEETING2) NOVEMBER READING
CH04(MEETING3) DECEMBER READING 
CH05(MEETING4) JANUARY READING
CH06(MEETING5) FEBRUARY READING
CH07(MEETING6) MARCH READING
CH08(MEETING7) APRIL READING
CH09(MEETING8) MAY READING

JUNE SUMMARY (no meeting)
This was the first year using the new hardware, namely the Raspberry Pis. Historically, any time we change hardware, firmware or software in this project, we have had a lot of growing pains. 

This year we didn't get as far as we would have liked. We completed the first few steps of the project as listed below, but then we had to tear down the project in preparation for next year's group as we ran out of time: 

1a) update and upgrade Linux on each Raspberry Pi (RPI)

1b) learn how to use a RPI as replacement desktop  

2) install and test openmpi on each RPI  

3) install and test openssh on each RPI  

4) connect 2 or more RPIs and benchmark  

5) write programs to plot hires fractals using the cluster  

6) write programs to plot hires ray tracings using the cluster  

7) write programs to generate a fractal zoom movie  

8) write programs to generate animated movie sequences using the cluster  

We were on step 4 when we ran out of time.
We started benchmarks and got up to 1 GFLOP/s with 2 nodes or 8 cores, a far cry from the 50 GFLOP/s with 25 nodes or 100 cores we used to have with our old Linux boxes.

Next year I hope to get farther, learning from the experience of this year.  

We were trying to scale the cluster from 2 to 4 to 8 to 16 nodes plus 1 master node (steps 1-4) similar to the cluster in this video which has 32 nodes and 1 master node:

Video: the RPiCluster, a 33-node Beowulf cluster built from Raspberry Pis for dissertation work at Boise State University. Documentation, source code and EagleCAD designs: https://bitbucket.org/jkiepert/rpiclu... (www.youtube.com)
Step 5 looks like this:
Step 6 looks like this:

Video: using POV-Ray's MPI patch on a cluster (NCE). (www.youtube.com)
Step 7 looks like this:

Video: a Mandelbrot zoom all the way down to a mini-brot at a depth of e1091, with a large colour variety from a new rendering technique; it took well over a week to render. (www.youtube.com)
Step 8 looks like this:

Video: a render farm is simply a collection of networked computers that render a sequence together, so the total render time becomes a fraction of a single machine's. (www.youtube.com)


Hope you learned a lot this year!
Good luck in college next year!
Have a great Summer!

MAY UPDATE (meeting 8 - last one)
We collected our Raspberry Pis and reported on our progress with the project. Unfortunately, we did not get very far as a strict independent study done at home. Next year, we may have to set up a bench in our room with at least one RPI permanently installed so we can experiment together. 

MARCH & APRIL UPDATE: (meetings 6&7)
We are stepping back a bit and doing some research. Let's see how others are attacking similar projects:

Here's a HowTo!
Here's another HowTo!
And one more HowTo for you!
Here's a cool desktop solution!









FEBRUARY UPDATE: (meeting 5)
We tried to install openMPI but spent forever just updating our Linux OS:
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install openmpi-bin libopenmpi-dev gfortran
sudo apt-get install openssh-server

JANUARY UPDATE: (meeting 4)
We un-boxed all the stuff we got from BOCES and DonorsChoose and figured out how to boot up 1 RPI per student. We are playing around with these micro-board PCs as replacement desktops at home until our next meeting.

USB Power Supplies

NOOBS SD Cards

HDMI to VGA Converters

DECEMBER UPDATE: (meeting 3)
We downloaded the latest pelicanHPC ISO and burned a DVD for each of us. Then we booted our PCs from the DVD drive and ran openMPI from RAM. We used flops.f to test our "clusters." flops.f is a FORTRAN program that uses mpirun to stress a cluster by calculating PI using Riemann sums for 1/(1+x^2) from a=0 to b=1. 

BTW, I call our PCs "clusters" since they have quadcore processors and openMPI runs on multicore just as well as on a grid. We can't set up a grid-based (multinode) Linux cluster anymore as we are no longer allowed to run our own DHCP server. We got about 2 GigaFLOP/s per core, so 8 GigaFLOP/s per PC. If we could set up our own DHCP server, we'd get 100 cores running in parallel for about 200 GigaFLOP/s!

Compile:
mpif77 -o flops flops.f
Execute multicore:
mpirun -np 4 flops
Execute multinode:
mpirun -np 100 --hostfile machines flops
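For anyone who wants to see what flops.f is actually computing, here's the same Riemann sum in serial Python (flops.f splits the rectangles across MPI ranks; this sketch just does them all in one process, and the function name is just illustrative). The integral of 1/(1+x^2) from 0 to 1 is arctan(1) = PI/4, so multiplying by 4 gives PI:

```python
from math import pi

def pi_riemann(n):
    """Midpoint Riemann sum of 4/(1+x^2) on [0,1], which converges to PI."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h            # midpoint of rectangle i
        total += 4.0 / (1.0 + x * x)
    return total * h

approx = pi_riemann(1_000_000)
print(approx, "error:", abs(approx - pi))
```

An MPI version hands each rank its own slice of the range(n) loop and then sums the partial totals with a reduce, which is exactly how flops.f stresses every core at once.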

Enter pelicanHPC as our first solution! We demoed an old DVD we had to show how to fire up the cluster. Our experiment demonstrated that we could not boot the whole room anymore, as we used to, since PXE Boot (Netboot) requires that we run our own DHCP server. When you boot the DVD on one PC, it sets up a DHCP server so all the other PCs can PXE Boot the same OS. However, our new WimpDoze network uses its own DHCP server. The two servers conflict, so we cannot reliably connect all the Worker bees to the Queen bee. We can't set up grid computing or a grid cluster, but we can still use SMP. In other words, boot up a single PC with the pelicanHPC DVD and run multicore applications on all the cores of that one PC.

So, here's your homework. Download the latest pelicanHPC ISO file and burn your own bootable DVD. Don't worry if your first burn doesn't boot. You can use that DVD as a "Linux Coaster" for your favorite beverage the next time you play on SteamOS. If you can make this work at home, try to run Hello_World_MPI.py from John Burke's sample MPI4PY (MPI for Python) code.


NOVEMBER UPDATE: (meeting 2)
See below for our Raspberry Pi project. We have been waiting for funding for some extra hardware from DonorsChoose and we just got it! Yeah! In the meantime we're playing with pelicanHPC and BCCD DVDs to see how openMPI works so we can set it up the same way on our new Linux Cluster.

OCTOBER UPDATE: (meeting 1)
We've decided to make a Linux Cluster out of Raspberry Pi single board computers! Our school district has been kind enough to purchase 25 RPIs plus some USB and Ethernet cabling, so now we just need some power supplies, routers and SD cards. So here comes DonorsChoose to the rescue! We started a campaign to raise the money to purchase all the remaining equipment from Amazon!



What we want to do is replace our Linux Lab of 25 quadcore PCs, where we used to do this project, with 25 networked RPI 3s. The Raspbian OS is a perfect match for our project! Raspbian is Linux-based just like our old lab, which was based on Ubuntu Linux. Also, Python is built in, so we can just add openSSH and openMPI to code with MPI4PY once again! With the NOOBS SD card, we start with Linux and Python preinstalled!


Once we get all the hardware networked and the firmware installed, we can install an openMPI software stack. Then we can generate Fractals, MandelZooms, POV-Rays and Blender Animations!
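To give a taste of the Fractal step, here's a minimal serial escape-time Mandelbrot sketch in Python; a cluster version would farm the rows out to MPI workers and stitch the results back together (the names here are just illustrative, not from our actual project code):

```python
# Escape-time Mandelbrot: iterate z = z^2 + c and count the steps
# before |z| exceeds 2; points that never escape are in the set.

def escape(c, max_iter=50):
    """Iterations before z = z^2 + c escapes |z| > 2 (max_iter = bounded)."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i
    return max_iter

# ASCII render of the classic view: re in [-2, 1], im in [-1.2, 1.2].
for row in range(24):
    im = 1.2 - row * 0.1
    line = ""
    for col in range(60):
        re = -2.0 + col * 0.05
        line += "#" if escape(complex(re, im)) == 50 else " "
    print(line)
```

Each row is independent of every other row, which is why this problem is "embarrassingly parallel" and perfect for an openMPI cluster: rank r just renders rows r, r+size, r+2*size, and so on.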


SEPTEMBER UPDATE: (meeting 0)
Please see the September organizational blog post.

NEW SMARTBOARD SETUP
NOTE: MIC FOR SCREENCASTING!
NOTE: TI nSPIRE CX CAS EMULATOR!!
NEW DECOR IN THE REAR OF ROOM 429
NOTE: SLIDERULE!
NOTE: OLD LINUX SERVERS!!
NEW TAPESTRIES IN ROOM 429
NEW VIEW FROM LEFT REAR SIDE
NOTE: OLD UBUNTU DESKTOP!
NEW VIEW AS YOU WALK IN
NOTE: SLIDERULE!

So, what's all this good for aside from making Fractal Zoom or Shrek Movies?
SETI Search
Econometrics
Bioinformatics
Protein Folding
Beal Conjecture
Scientific Computing
Computational Physics
Mersenne Prime Search
Computational Chemistry
Computational Astronomy
Computer Aided Design (CAD)
Computer Algebra Systems (CAS)

These are but a few examples of using Computer Science to solve problems in Mathematics and the Sciences (STEAM). In fact, many of these applications fall under the heading of Cluster Programming, Super Computing, Scientific Computing or Computing Science. These problems typically take too long to process on a single PC, so we need a lot more horse power. Next time, maybe we'll just use Titan!

====================
Membership 
(alphabetic by first name):

CIS(2019)!
AaronH(12), AidanSB(12), JordanH(12), PeytonM(12)

CIS(theta) 2018-2019:
GaiusO(11), GiovanniA(12), JulianP(12), TosinA(12)

CIS(theta) 2017-2018:
BrandonB(12), FabbyF(12), JoehanA(12), RusselK(12)

CIS(theta) 2016-2017: 
DanielD(12), JevanyI(12), JuliaL(12), MichaelS(12), YaminiN(12)

CIS(theta) 2015-2016: 
BenR(11), BrandonL(12), DavidZ(12), GabeT(12), HarrisonD(11), HunterS(12), JacksonC(11), SafirT(12), TimL(12)

CIS(theta) 2014-2015: 
BryceB(12), CheyenneC(12), CliffordD(12), DanielP(12), DavidZ(12), GabeT(11), KeyhanV(11), NoelS(12), SafirT(11)

CIS(theta) 2013-2014: 
BryanS(12), CheyenneC(11), DanielG(12), HarineeN(12), RichardH(12), RyanW(12), TatianaR(12), TylerK(12)

CIS(theta) 2012-2013: 
Kyle Seipp(12)

CIS(theta) 2011-2012: 
Graham Smith(12), George Abreu(12), Kenny Krug(12), Lucas Eager-Leavitt(12)

CIS(theta) 2010-2011: 
David Gonzalez(12), Herbert Kwok(12), Jay Wong(12), Josh Granoff(12), Ryan Hothan(12)

CIS(theta) 2009-2010: 
Arthur Dysart(12), Devin Bramble(12), Jeremy Agostino(12), Steve Beller(12)

CIS(theta) 2008-2009: 
Marc Aldorasi(12), Mitchel Wong(12)

CIS(theta) 2007-2008: 
Chris Rai(12), Frank Kotarski(12), Nathaniel Roman(12)

CIS(theta) 1988-2007: 
A. Jorge Garcia, Gabriel Garcia, James McLurkin, Joe Bernstein, ... too many to mention here!
====================

Well, that's all folks!
Happy Linux Clustering, 
AJG
A. Jorge Garcia

 

Applied Math, Physics & CompSci
PasteBin SlideShare 
MATH 4H, AP CALC: GC or SAGECELL
CSH: SAGE Server
CSH: Interactive Python
APCSA: c9.io


Beautiful Mind Soundscape: