Intel NUC Computers, AWS, and Racing Cars

It has been over five years since my last post about software and technology.  It’s not that I stopped using it.  I just stopped talking about it.  Lately I have been on a bit of a streak.  I have been working on the MARRS Points tracking app in AWS for over a year now.  It will be the official points tracking application for the 2016 season across all race classes in the Washington DC Region (WDCR) of the SCCA.  I have actually done something mildly productive with my spare time!

An AWS Project Was In Order

It was mainly by happenstance that I got the app going.  I wanted to work in the Amazon AWS cloud a bit to understand it better.  I had managed teams that used it for years at various companies, so it seemed like a reasonable learning experience.  I could have easily chosen Microsoft Azure or the Google Cloud, but AWS has the deepest legacy and I started there.  Once I logged in and started to play with AWS, they let me know my first year was FREE if I kept my usage below specific CPU and memory levels.  Sure, no problem.  But what to build, what to do?  I remembered I had built an old Java/JSP app as a framework for a racing site for my brothers and me, called cahallbrosracing.com.  GoDaddy had taken its Java support down and the site had been throwing errors for years.  So I decided that was the perfect domain to try, and grabbed the skeleton code.  It would be some type of Java/JSP racing application that used a MySQL database backend.  But for now, I just needed to see if I could configure AWS to let me get anything live.

EC2, RDS, a little AWS security magic…

I provisioned an EC2 node, downloaded Tomcat and Oracle Java, and went to work.  In no time, I had the fragments of the old site live and decided I should put my race schedule online.  The schedule would not come from a static HTML page.  It would use a JSP template and a Java object to pull the data from the database; each year I would just add new events to the database and display them by year.  Quickly the MySQL database was provisioned, the network security configured, the DB connectivity assembled, and the schedule was live.  OK – AWS was EASY to use and I now had a public-facing Java environment.  I was always too cheap to pay for a dedicated host, or to sort out a real public-facing Java environment that let me control the Linux services so I could start and stop Tomcat as needed.  But FREE was right up my alley.
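For the curious, here is a minimal sketch of the kind of data access behind that schedule page, assuming a hypothetical race_events table and JDBC connection details (the actual schema is not shown in this post).  A JSP template would then simply loop over the returned list to render the rows.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

/** Minimal DAO sketch: loads race events for one season from MySQL via JDBC.
 *  Table and column names are hypothetical; the MySQL Connector/J driver
 *  must be on the classpath. */
public class ScheduleDao {
    private final String url;       // e.g. a jdbc:mysql:// URL for the RDS instance
    private final String user;
    private final String password;

    public ScheduleDao(String url, String user, String password) {
        this.url = url;
        this.user = user;
        this.password = password;
    }

    /** Returns "date – track" display strings for every event in the given year. */
    public List<String> eventsForYear(int year) throws SQLException {
        String sql = "SELECT event_date, track FROM race_events "
                   + "WHERE YEAR(event_date) = ? ORDER BY event_date";
        List<String> events = new ArrayList<>();
        try (Connection conn = DriverManager.getConnection(url, user, password);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, year);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    events.add(rs.getDate("event_date") + " – " + rs.getString("track"));
                }
            }
        }
        return events;
    }
}
```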

So there I was, developing Java, JSP, and SQL code right on the “production” AWS Linux server.  Who needs Maven or Ant?  I was building it right in the server directories!  Then I started to realize I did not have backups and was not using a source code repository.  It could all go away, like a previous big app I wrote that was lost when both of my RAID drives failed in the great 2005 Seattle wind storm.  Not a good idea.

Intel NUCs (and GitHub) to the rescue!

Enter the NUCs!!!  I had learned about the Intel NUC series and bought a handful of them to make a home server farm for Hadoop and Cassandra work.  These units are mostly the i5 models with 16GB of RAM running Ubuntu 14.04.4 LTS.  I realized I needed to do the development at home, keep the code in a GitHub repository, and then push updates to AWS when the next version was ready for production.  My main Java development NUC has been awesome.  It is a great complementary setup: an AWS “production” environment in the cloud and a Linux development environment at home, with the source code repository also in the cloud.  I even installed VMware Workstation on my laptop so I have Linux at the track.  This allows me to pull the code from GitHub down to my laptop and make changes from the track.  It’s almost like I have made it to 2013 or something.

Why software is never “done”

Well, once I got going, I wanted to track my points in the MARRS races.  So I made some tools to allow manual entry of schedules, race results, etc.  This manual process clearly did not scale well.  Discovering Race Monitor and its REST APIs solved that issue.  I wrote code to pull the results back from Race Monitor, using Google’s GSON parser to marshal the JSON data into the objects used in the Java code.  Unfortunately, Race Monitor does not pass a piece of critical data: the SCCA ID for each racer.  The next step was to work with the Washington DC Region and the fine people at MotorsportReg.com to use their REST APIs to get that data for each race.  This simple Java app has become complex, with two REST APIs and the tools to manage them.
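Here is a rough sketch of that pattern, assuming a hypothetical results URL and made-up field names (Race Monitor’s real endpoints and JSON shape are not reproduced here).  GSON’s trick is that the class fields simply mirror the JSON keys, so a single fromJson() call turns the whole feed into Java objects.

```java
import com.google.gson.Gson;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/** Sketch: fetch a race results feed and let Gson map the JSON onto objects.
 *  The URL, class names, and fields below are illustrative, not Race Monitor's
 *  actual API. */
public class RaceResultsClient {

    // Field names must match the JSON keys in the feed (made up here).
    static class Result {
        int position;
        String racerName;
        String carClass;
        String totalTime;
    }

    static class RaceResults {
        Result[] results;
    }

    public static RaceResults fetch(String resultsUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(resultsUrl).openConnection();
        conn.setRequestMethod("GET");
        try (Reader reader = new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8)) {
            return new Gson().fromJson(reader, RaceResults.class);   // JSON -> objects in one call
        } finally {
            conn.disconnect();
        }
    }
}
```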

The rest is history.  The tool can now also import CSV files from the MyLaps Orbits software.  A simple CMS was added to publish announcements and steward’s notes per race.  All of the 2015 season has been pulled into the application across all of the classes and drivers.  Many features, bells and whistles have been added thanks to Lin Toland’s sage advice.  Check out the 2015 season SSM and SM Championship pages.  A ton of data and a lot of code go into making those look simple.
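The CSV import is conceptually simple; a minimal sketch is below, assuming an illustrative column layout of position, car number, driver, and laps (a real Orbits export has more fields and quoting rules than shown here).

```java
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

/** Sketch: import one Orbits CSV export.  The column layout is illustrative;
 *  real exports have more fields and may quote values containing commas. */
public class OrbitsCsvImporter {

    static class Row {
        int position;
        String carNumber;
        String driver;
        int lapsCompleted;
    }

    public static List<Row> parse(Path csvFile) throws Exception {
        List<Row> rows = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(csvFile)) {
            String line = reader.readLine();          // skip the header row
            while ((line = reader.readLine()) != null) {
                if (line.trim().isEmpty()) continue;  // ignore blank lines
                String[] cols = line.split(",", -1);  // naive split; no quoted-field handling
                Row row = new Row();
                row.position = Integer.parseInt(cols[0].trim());
                row.carNumber = cols[1].trim();
                row.driver = cols[2].trim();
                row.lapsCompleted = Integer.parseInt(cols[3].trim());
                rows.add(row);
            }
        }
        return rows;
    }
}
```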

Racing into the future with MARRS

I am really looking forward to being able to help all of the WDCR MARRS racers track their season starting in April.  Let’s hope I can drive my car better than last year and race as well as I have coded this application.

It is kind of odd to think that my desire to play with AWS caused me to build something useful for hundreds of weekend racing warriors.  Now the next question: should I make it work for every racing group across the world?  I mean multi-tenant, SaaS, world domination?  Hmmm…  Maybe I should try to finish better than 6th this year…

Ted Cahall

Kubuntu and Wubi

Linux desktop variations

After playing with Debian and Ubuntu, I wanted to see what the latest in KDE looked like. I have mostly been a Gnome user and had read some interesting tidbits on KDE 4.3 in Linux Journal. I did not want to “pollute” my Ubuntu installation by downloading all of the KDE parts into it. So I decided to add a Kubuntu partition to my Ubuntu box, and also to test Kubuntu on my 64-bit Windows machines using Wubi.

I was surprised to see that the installers for Ubuntu and Kubuntu are not really from the same code base. The installation on my 32-bit Ubuntu box went off without a hitch. I had a spare drive on it and used that for the new partition. I needed to manually change the partitions with the partition manager so it would leave the old Ubuntu 9.04 and 9.10 versions where they were. Even this was simple and straightforward.

Wubi letdown

I guess my biggest surprise was that Wubi does not install Kubuntu/Ubuntu to run “on top of Windows” as I thought it would. I had thought there was an additional VXLD layer or something that was written to let Linux run as a guest OS on top of Windows XP. This would have been really cool, sort of like Cygwin on steroids. It may sound ridiculous, but a colleague from long ago, Bill Thompson, wrote such a VXLD for Windows back in the mid ’90s that allowed x86 versions of Unix to run on top of Windows.

I searched around the web, Facebook, and LinkedIn to see if I could find Bill, and with much digging I found him on LinkedIn. His start-up was called “nQue”. He was also a file system guru who wrote a lot of CD-ROM file system drivers, etc. after the start-up went south.

Needless to say, I think if that feature could be added to the Wubi concept, a lot more people might try Ubuntu, adding it right on their Windows desktop as an application environment without requiring a reboot. I know Wubi does not alter the Windows partitions, so it is still a fairly painless way to try Ubuntu without risking much. Users can always uninstall it as they would any Windows application if they are not happy with it. I just prefer to rarely reboot my systems if I can avoid it.

Ted Cahall

Ubuntu and Debian Installation Fun

Home Data Center Saga continues…

The “home data center” is getting a bit crazy to maintain. It is a good thing I have so much free time on my hands (not). I did finish a couple of the projects on my list last weekend. I wanted to upgrade one of my P4 3.0GHz “home brew” machines from Fedora Core 3 to Ubuntu and put Debian server on one of the “new” used Dell 2850 servers I bought from work. I am now the proud systems administrator of both of these machines – with plenty of fun along the way.

Upgrading from Fedora to Ubuntu

I started with the Fedora Core 3 to Ubuntu conversion by making sure all of the applications I had written in Java, PHP, Perl, and sh, as well as the databases in MySQL, had been successfully ported to another CentOS machine and regression tested. That took longer than expected (of course). I had already used BitTorrent (thanks, Bram Cohen) to download Ubuntu 9.04. I like their numbering scheme, as even I can figure out how new/old the rev is. I then went and installed it on my “home brew” Intel motherboard based system. It worked like a charm and I was checking out its slick UI and features within minutes. So far, so good.

Next I decided to see how the graphics worked and if I could get it into 1920×1280 mode with my 24″ monitor. That was a tad trickier – but I was pleased to see that it went out on the Internet and figured out where to get the latest NVidia driver that supported the video card I had bought years ago. That was slick and the graphics were awesome. In high-res mode it even gives windows some transparency and “momentum distortion” as you move them. Not sure how useful that is – but it looks pretty cool.

VNC for graphical UI across machines

I like to sit in my home office and use VNC to access the 7 Linux boxes running in my basement and other rooms (versus running between them to try things). I know that “real systems administrators do not use VNC”, as one of our real systems administrators at AOL (and CNET years ago) once told me. I am not embarrassed to say I am not a real systems administrator! I like the graphical UI access to all of these machines. It makes working on them so much easier with 4 or 5 windows open at a time. So here is the rub. I enabled VNC, ran back to my office, and tried it. No luck. I made sure SSH worked and I could get to the box – that was all set and good to go. I checked that the machine was listening on port 5901 – that was good too. A little snooping in the VNC log file let me know it could not execute the /etc/X11/xinit/xinitrc script. I thought that was odd, but I enabled execute permissions on the file and everything worked.

Upgrading Ubuntu to 9.10

As I performed a routine update of the OS and files, it let me know that Ubuntu 9.10 was now out (as it was past October – month 10 in year 2009). I had downloaded 9.04 a month earlier when I began thinking of the project. A 9.10 upgrade sounded great – so I decided to “go for it”. Bad decision. After the upgrade the video would not work in graphics mode and I could only bring the system up in text mode. Not a big deal for a “real systems administrator” but definitely not what I was looking for – especially on a desktop machine where I wanted to check out the cool graphics in Ubuntu.

Video driver hell

Since the machine had no real work on it and I did not feel it was worth my time to really figure it all out in 80×24 text mode while troubleshooting the X Window System, I simply put 9.04 back on the machine and got it working where it was before the upgrade. This would be my fallback in a worst case scenario. I then used BitTorrent to get 9.10 on a DVD. Ubuntu allows you to add multiple OS versions by partitioning the drive, so I shared the drive between 9.04 and 9.10 and performed the installation. 9.10 came up and worked from scratch – but the video upgrade would not. When I tried to get it to go out and upgrade the video driver as it had in 9.04, it kept telling me that there were no drivers and that the upgrade was not possible. This did not let me use the 1920×1280 graphics mode of the card or monitor.

After playing with the software update tool, I was able to find some NVidia drivers that were available and downloaded those. Once I did that, the system finally let me do the upgrade to enhanced video mode and use the 1920×1280 resolution. I am not sure why the 9.10 version was not able to automatically find these drivers as the 9.04 version was, but clearly this was why the upgrade had failed when I tried to go from 9.04 to 9.10 “in place”. The VNC issue for xinitrc still existed, and I again corrected that. Project complete!

And on to the Debian upgrades

The Debian 5.0.3 server install for my Dell 2850 proved to be less frustrating – but not without hiccups. I had downloaded the first 3 DVDs for Debian and proceeded to the basement to start the install. That is when I noticed that this 2850 came with a CD-ROM drive and not a DVD-ROM drive! I had already put CentOS on the other Dell 2850 months ago – so I “assumed” that both machines had DVD-ROM drives. Bad assumption… The nice thing about Debian is that it allows a “net install” CD to be burned that is fairly small. It then downloads the rest of what it needs as it goes along. So this is the route I chose for the Debian server. From there the install was fairly straightforward. The graphics are nowhere near as nice as Ubuntu’s – but this is a server install and I don’t have a fancy video card in the 2850 anyway. The VNC issue for xinitrc also exists with this version of Debian – no surprise, as Ubuntu is a downstream distribution of Debian. Another project complete, and now I have systems to compare different OS features and issues, and to keep up with some of the pilot projects we are doing at work to streamline software distribution, etc.

Ted Cahall

CentOS Rocks – Good-Bye RHEL

Home Data Center saga

I upgraded my home “data center” recently with the addition of two used HP DL320s.  They both have 4GB of RAM and two 15k RPM 36GB drives in a RAID1 configuration.  I build and buy hardware as a hobby to keep me close to the reality of managing corporate software systems and data centers.

From RHEL to CentOS

I decided to run CentOS 5.2 on these new home systems.  It is fantastic.  You have to love the GPL that makes this possible.  Some companies making their first foray into Linux might not feel comfortable using CentOS.  Mature companies that have been using some form of RHEL for a few years should feel very comfortable.  How many Linux OS support calls do you make in a year anyway?  Needless to say, we are migrating AOL from RHEL to CentOS at a significant annual savings.

I would have gone to Debian, as I needed a free alternative to improve my company’s operating margins, but why not use CentOS when it is binary-compatible with RHEL and FREE?  Easy decision.

Open Source Commoditization of Software

This is similar to my decisions to move my previous companies off of BEA WebLogic to Tomcat, and off of Sybase to MySQL.  It is almost hard to believe that companies paid for the Netscape Web Server now that Apache is ubiquitous.  Companies paying for AltaVista, Verity, or FAST search products are distant memories now that we have Lucene and SOLR.  Thank you, Open Source!

Ted Cahall

Home Data Center “Drudgery”

Managing my home data center takes more time lately. I now have a Mac mini and a Fedora Core 3 box in my office, in addition to the servers in my garage.

I enjoy the Fedora Core 3 box and am looking to upgrade an older AS2.1 box to Fedora. First I need to move the old applications off of it. I am keeping notes in a Wiki on one of my machines now to track all the stuff I am doing to these boxes. At least I have MRTG graphs for my PIX and the NICs on all the machines in the garage. I also have RRD graphs for all of the web servers on those machines.

Connecting my DSL and Cable Modem Networks

I plan on buying a PIX for the upstairs office so I can set up a site-to-site VPN that auto-connects the upstairs cable modem network to the garage’s DSL line. Then I will instrument the upstairs machines with MRTG and RRD as well.

Maybe I will start my Harley sometime in May. It has been so long the battery may be dead by now. 🙁 It is overcast and cold this morning up here in the hills – so no ride today.