I suspect some SEO will drive the wrong audience to this post with the above title. NAS Hot-Swap Love is all about being able to change the drives in my Synology Network Attached Storage (NAS) servers without powering them down. The act of doing this while the power is on and the units are still running is called a “Hot-Swap”.
As I mentioned in a previous article, I run my home network with two Synology NAS units for backup and media storage. These units are great. Between the two of them, I have 30TB of disk in RAID-5 configuration. They would be even more valuable if I had anything more impressive than old racing videos to store on them. You can only take so many pictures of your cats – no matter how good looking they are.
Right before I relocated to Virginia in July 2019, my first Synology DS1515+ died. Synology replaced it at no charge under warranty. I removed the drives, numbered them, and shipped them to VA. I had Synology send the new system to VA as well. When I put it all back together, it actually worked! Which was great news, until the first drive died.
I shut it down, ordered a new replacement drive and waited for it to arrive. Upon arrival, I replaced the drive, rebuilt the RAID set, and yes, all my useless files were still available. This again went smoothly for a month until the second drive died. I knew what that beeping sound was this time, so I ordered two new drives – as something told me these drives would start failing one after the other…
Last month, the third drive failed. I became emboldened! I took the drive out while the unit was still on, put the new drive in the housing, and reinserted it. It allowed me to rebuild the RAID set and I never was offline for one second. This was NAS Hot-Swap Love at its very finest. I did not play opera music while performing the operation – but I could hear it faintly in my head.
Alas, I realized I had exhausted my supply of 3TB drives – and had never stocked any 6TB drives for my new DS1517+ unit. As the picture above shows, I now have a supply of two 3TB and two 6TB drives waiting for the next drive failure. I have never felt so prepared for NAStiness.
I received this unit in December 2019 with Ubuntu 18.04 LTS installed. The 3.28 version of Gnome running on the system did not support display “Fractional Scaling” – which made some applications much too small (or in 200% mode much too big) to read and use well. Some research showed that a newer version of Gnome would support Fractional Scaling with 125%, 150%, and 175% options available. The 20.04 LTS version of Ubuntu includes Gnome 3.36, which does support Fractional Scaling.
Right now, Ubuntu does not provide a direct upgrade path from 18.04 LTS to 20.04; that will not arrive until the 20.04.1 release. I did download the ISO, ran it under virtualization, and confirmed the Fractional Scaling option exists!
I will soon move my development environments over and switch from the Ubuntu Unity desktop to Gnome on this high-powered mini-PC with 64GB of RAM, a quad-core i7 processor, and amazing video power.
I have moved around the United States many times while pursuing my career in business and technology. My relocation plans have taken me to Chicago, three different homes in the Bay Area of Northern California, Seattle, Northern Virginia, back to Seattle, over to Minneapolis and now splitting my time between Blacksburg and Arlington Virginia.
My trips used to include packing hundreds of pounds of books, audio CDs, computer CDs, etc.
My time at Digital River allowed me to see how they built an early part of their business which leveraged high bandwidth speeds to obviate the need for software CDs. We all just download our software now and would not think of lugging boxes of software CDs.
Back-up CDs are also a thing of the past. I have my two Synology NAS devices, which slowly back themselves up to the AWS cloud.
Mr. Steve Jobs made audio CDs a thing of the past with iTunes. Most kids have no idea what a record player or even a cassette tape is. It felt good throwing away all those boxes of audio CDs.
I was browsing my Audible library this morning. Yep, all of my books are in there. Even my books I read in print are in my Kindle app.
Many folks embark upon a home theater upgrade only to find it a tad more difficult than they expected. My goal was to get everything in my Family Room to UHD 4K. While it looks terrible, I was able to achieve that by putting an external rack next to my built-in TV / speaker cabinet. But wait, my house came with built in component racks for this stuff! This is where my hell begins. (Current “hacked” set up next to my TV below).
Great House, Great Cabinets, Aging Tech
Almost six years ago, I bought a wonderful house, originally built by the Minnesota Vikings great, Joe Sensor, and later upgraded amazingly by the daughter of the founder of Best Buy. It was clearly Geek Squad city in here for weeks. I have home theater set-ups in both the Family Room and the Media Room in the walk-out basement. Both use embedded racks in cabinets built into the walls, with AMX control panels to control everything from window shades and lights on down to the TV and the components. The main issue I have is that the embedded rack cabinets are on the opposite side of the room from the giant built-in TV / speakers cabinet. The cabinets and the embedded racks are really well designed. They can be pulled out on attachable rails (see below) and have articulating wire guides. There are two racks – one for the main components of receiver, amplifier, etc. and a second one for old accessories that are now unnecessary such as DVD players, CD players, Blu-ray, VHS, etc. Pretty cool racks huh?
This does not sound terrible at first consideration until you realize it was built around 2005 before HDMI cables, HD or UHD. It was even before common use of Cat-5E or Cat-6 Ethernet cables. The TV and components in the house when I bought it were pre-HD. For goodness sake, we cannot have that! I would lose my Platinum Couch Potato card.
So What is the Problem?
So the main issue is getting the UHD 4K receiver connected via HDMI 2.0 to the UHD 4K TV across the room (thus my hacked setup with the external stack next to the TV cabinet). Unfortunately, the Geek Squad (or home remodelers) left no auxiliary conduit between the racks and the TV cabinet. The conduit used between the rack and TV is absolutely stuffed with speaker wires and various coax wires. It is impossible to re-fish anything through that mess.
Potential Solutions to Home AV Hell
Fortunately, there is one Cat-5 (not Cat-5E) wire that seems to run directly between the racks and the TV cabinet. I have not buzzed it out to test if it is a “direct” point-to-point wire or if it goes through one of the basement Ethernet “home run” switches. It turns out, there are a few HDMI UHD 4K extenders that run over Cat-5E or Cat-6. While the cable I have is Cat-5 – it might work if the distance is short enough. So an HDMI Extender over Ethernet is my best option.
Of course, there is also the option to rip up the walls, ceiling, and floor and run a 50-foot HDMI cable rated for UHD 4K. While I was in there, I would add a bunch of Cat-6 cables for any type of future expansion, since that seems to be the type of network cable used by wire converters. This seems crazy expensive considering the path through my walls that would be necessary (the room is two stories high) and I don’t have the home wiring diagrams as to the routes they took.
One other option I have is a wireless HDMI solution. Right now, I only see systems that support HD quality HDMI over wireless. This might have to be the setup I use when I sell the house so that everything is in an enclosed cabinet and the place looks high tech (even though it will be low tech HD). UPDATE: I just found some wireless UHD 4K @ 60 Hz by J-Tech. $500 for the pair. If the $300 Ethernet based pair do not work due to my cable only being Cat-5 and not Cat-5E, I now have an option. It says line of sight. I wonder if I could cook a hot dog on one of those antennas.
Conclusion – more to come
I have ordered the “No Hassle AV” UHD 4K Extenders (see Ethernet extender link above) that run over Ethernet. I will test them and see if it will solve my dilemma. I will post updates back here as I make progress (or lack of progress). Comments and feedback welcome on Facebook where I posted this article.
One of the most important components of my home data center is my set of Network Attached Storage (NAS) servers.
I have had my old NetGear ReadyNAS unit for at least 9 or 10 years now. It has a whopping 1.3TB of storage across 3 drives in a RAID 5 configuration. NAS units are great for storing my racing videos that no one will ever watch, old photos now that everyone with a phone collects thousands of photos a year, and copies of my important tax, mortgage, and legal documents. Some of my friends store TBs of pirated videos from the dark web. I am a NetFlix and AppleTV guy so that saves me a few TBs.
Goodbye NetGear, hello Synology
While the ReadyNAS served me well, it was long in the tooth and short on TBs. It also was missing some of the interesting new features that I did not even know I was living without until I bought my first Synology NAS back in 2015 – the DS1515+. These guys have done the whole consumer NAS thing really well.
The main feature I use and like is the immediate file sync of Linux directories on my Linux servers (and one of my Windows 10 desktops as well). Once I configured this option and selected the directories I wanted synced, all of those files are safely stored on the NAS. No backup jobs – it copies the files to the NAS file system immediately upon edit or save. It is also a nice way to move files from one machine to another, as the systems can all see the disk replicas across the servers.
This does not mean I do not do backups. I have Amazon Glacier storage and I have those critical legal, tax, mortgage files sent out to Glacier storage once a week from the Synology NAS. The great thing about that is Synology provides the service that runs on the NAS to do the Glacier backup. Really simple integration.
Built-in Servers (services)
The disk drives are even “hot swappable”. No downtime if you have a drive go bad. Aside from rock-solid hardware technology, another amazing thing about Synology is the application ecosystem they provide on the NAS server. They want you to make this your “server” for everything and anything you do in your home. Want a VPN server? It has that. DNS? Yep. Connect with my Macs, Windows, and Linux in their native network protocols? Of course. It has email servers, video security servers (I bought two cameras to test and they are great), and video, photo, and audio servers. There are Active Directory, Email, Network Management, Print, Content Management, WordPress, WikiMedia, E-Commerce, Docker, Git, Web, Plex, Application (Tomcat) and Database servers! These all run natively on the NAS – not just using its disk, but in its memory and on its quad-core CPU.
I cannot possibly list all of the features and servers these new Synology NAS units supply. I have tested many of them, and they are rock solid and dependable. I never envisioned using my NAS as a “server” other than as a network attached storage server. Now it can work as so much more.
The more the merrier
The Synology product has me so hooked, I bought my second unit! A DS1517+ with 8GB of main memory and 30TB of storage (5 disks @ 6TB each). I used this for the security video storage and as a snapshot backup of the first unit. Had I planned it better, I could have arranged these two Synology units in an active-passive mirrored configuration. This would allow one to take over if the other crashed. Clearly I do not need that at home. But it is nice to know that simple consumer-grade products offer these features now.
I fully and highly recommend these Synology NAS products. They do not sell direct. I recommend finding them on Amazon after you spend hours like I did on their product site comparing models and features.
[Update] One cool thing I forgot to mention before I hit “publish”, is that this unit of course runs Linux. It runs a 3.10 kernel version modified by Synology. This is the reason so many of these services (servers) are available as a stock part of the unit. Synology chose to make Linux the engine to run the NAS and brought along many of the Linux services. With simple configuration, you can ‘ssh’ into the NAS and work on it as though it were a plain old Linux box. It is really well done.
I am pretty psyched to get my latest Intel NUC. The NUC7i7DNKE has an 8th generation Intel® Core™ i7 vPro™ 4.2 GHz “Turbo”, quad core processor with 32GB of DDR4 2400 MHz RAM and a 1TB SSD drive. Not to mention built-in 4K UHD video with HDMI ports and USB 3.0.
I will use this as my main development machine. It is crazy that I tend to run out of RAM on my 16GB machines running Ubuntu.
This will be my 9th NUC. Maybe I am a little too in love with these things. They make great clusters for home research and development on distributed technologies such as Cassandra and Hadoop. I have three nodes running Cassandra and Hadoop today – and am looking to add a 4th node when I free up my current development machine NUC.
Quiet, Low Power, great for clustering!
They are whisper quiet and use very low power. There are 5 in a stack sitting on my desk next to me as I write this, and they make less noise than a single standard PC. In fact, they seem to make no noise at all.
I also run Windows 10 on one as a home theater type of PC connected to a Samsung UHD TV via HDMI. These NUCs are awesome. I gave my old i3 core media NUC to my younger brother as a gift.
Here is an old picture of my early stack of NUCs. They are each 4″ x 4″.
It has been over five years since my last post about software and technology. It’s not that I stopped using it. I just stopped talking about it. Lately I have been on a bit of a streak. I have been working on the MARRS Points tracking app in AWS for over a year now. It will now be the official points tracking application for the 2016 season across all race classes in the Washington DC Region (WDCR) of the SCCA. I have actually done something mildly productive with my spare time!
An AWS Project Was In Order
It was mainly by happenstance that I got the app going. I wanted to work in the Amazon AWS cloud a bit to understand it better. I had managed teams using it for years at various companies. So it seemed like a reasonable learning experience. I could have easily chosen Microsoft Azure or the Google Cloud, but AWS has the deepest legacy and I started there. Once I logged in and started to play with AWS, they let me know my first year was FREE if I kept my usage below specific CPU and memory levels. Sure no problem. But what to build, what to do? I remembered I had built an old Java/JSP app as a framework for a racing site for my brothers and me, called cahallbrosracing.com. GoDaddy had taken their Java support down and it had been throwing errors for years. So I decided that was the perfect domain to try, and grabbed the skeleton code. It would be some type of Java/JSP racing application that used a MySQL database backend. But for now, I just needed to see if I could configure AWS to let me get anything live.
EC2, RDS, a little AWS security magic…
I provisioned an EC2 node, downloaded Tomcat and Oracle Java and went to work. In no time, I had the fragments of the old site live and decided I should put my race schedule online. The schedule would not come from a static HTML page. It would use a JSP template and a Java object to get the data from the database. Then each year I would just add new events to the database and display by year. Quickly, the MySQL DB was provisioned, network security configured, DB connectivity assembled, and the schedule was live. OK – AWS was EASY to use and I now had a public facing Java environment. I was always too cheap to pay for a dedicated host. Too cheap to sort out a real public facing Java environment that allowed me to control the Linux services so I could start and stop Tomcat as needed. But FREE was right up my alley.
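The display-by-year idea can be sketched in a few lines of Java. This is a minimal sketch with hypothetical class and event names (the post does not show the real schema); the real app pulls rows from MySQL, but the filtering a JSP page would render looks roughly like this:

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical event record; the actual database schema isn't described in the post.
record RaceEvent(int year, String name) {}

public class Schedule {
    // Return the event names for one season, as the JSP would after a query by year.
    static List<String> eventsForYear(List<RaceEvent> all, int year) {
        return all.stream()
                  .filter(e -> e.year() == year)
                  .map(RaceEvent::name)
                  .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Stand-in for rows that would come back from the MySQL schedule table.
        List<RaceEvent> db = List.of(
            new RaceEvent(2015, "MARRS 1 - Summit Point"),
            new RaceEvent(2016, "MARRS 1 - Summit Point"),
            new RaceEvent(2016, "MARRS 2 - Summit Point"));
        System.out.println(Schedule.eventsForYear(db, 2016));
    }
}
```

Adding a new season then really is just inserting rows and passing a different year to the query.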
So there I was, developing Java, JSP and SQL code right on the “production” AWS Linux server. Who needs Maven or Ant, I was building it right in the server directories! Then I started to realize I did not have backups. I was not using a source code repository. It could all go away like a previous big app I wrote when my RAID drives both failed in the great 2005 Seattle wind storm. Not a good idea.
Intel NUCs (and GitHub) to the rescue!
Enter the NUCs!!! I had learned about the Intel NUC series and bought a handful of them to make a home server farm for Hadoop and Cassandra work. These units are mostly the i5 models with 16GB of RAM running Ubuntu 14.04.4 LTS. I realized I needed to do the development at home, keep the code in a GitHub repository, and then push updates to AWS when the next version was ready for production. My main Java development NUC has been awesome. It is a great complementary setup. An AWS “production” environment in the cloud and a Linux environment at home with the source code repository also in the cloud. I even installed VMWare Workstation on my laptop so I have Linux at the track. This allows me to pull the code from GitHub down to my laptop and make changes from the track. It’s almost like I have made it to 2013 or something.
Why software is never “done”
Well once I got going, I wanted to track my points in the MARRS races. So I made some tools to allow manual entry of schedules, race results, etc. This manual process clearly did not scale well. The discovery of Race Monitor and their REST APIs solved that issue. The code was written to pull the results back from Race Monitor and used Google’s GSON parser. GSON let me marshal the JSON data to objects used in the Java code. Unfortunately, Race Monitor does not pass a piece of critical data, the SCCA ID for each racer. The next step was to work with the Washington DC Region and the fine people at MotorsportReg.com to use their REST APIs to get that data for each race. This simple Java app has become complex with two REST APIs and tools to manage them.
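The GSON marshaling step is the heart of that integration. Here is a minimal sketch of the pattern; the field names are hypothetical (Race Monitor's actual JSON schema differs), but the `fromJson` call mapping JSON keys onto matching Java fields is exactly the GSON idiom:

```java
import com.google.gson.Gson;

public class ResultParser {
    // Hypothetical result shape; Race Monitor's real payload has different fields.
    static class Result {
        String name;
        int position;
    }

    public static void main(String[] args) {
        // Stand-in for a JSON fragment fetched from a REST API.
        String json = "{\"name\":\"A. Driver\",\"position\":6}";
        // Gson populates fields whose names match the JSON keys.
        Result r = new Gson().fromJson(json, Result.class);
        System.out.println(r.name + " finished P" + r.position);
    }
}
```

For a full results payload, the same call with a list type turns an entire JSON array into Java objects, which is what makes the manual-entry tools unnecessary.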
The rest is history. The tool can now also import CSV files from the MyLaps Orbits software. A simple CMS was added to publish announcements and steward’s notes per race. All of the 2015 season has been pulled into the application across all of the classes and drivers. Many features, bells and whistles have been added thanks to Lin Toland’s sage advice. Check out the 2015 season SSM and SM Championship pages. A ton of data and a lot of code go into making those look simple.
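The CSV import follows the same shape as the REST work, just with simpler parsing. A minimal sketch, assuming a hypothetical column layout (the actual Orbits export format depends on its configuration):

```java
import java.util.List;

public class OrbitsCsvImport {
    // Hypothetical columns: position, car number, driver, best lap time.
    // Skip the header row and split each data row into fields.
    static List<String[]> parse(List<String> lines) {
        return lines.stream()
                    .skip(1)
                    .map(line -> line.split(","))
                    .toList();
    }

    public static void main(String[] args) {
        // Stand-in for an exported Orbits results file.
        List<String> csv = List.of(
            "Pos,No.,Driver,Best Tm",
            "1,11,J. Racer,1:28.512",
            "2,42,A. Driver,1:28.977");
        for (String[] row : OrbitsCsvImport.parse(csv)) {
            System.out.println("P" + row[0] + " #" + row[1] + " " + row[2]);
        }
    }
}
```

A real importer would also validate field counts and map each row onto the same result objects the REST path produces, so both sources feed one set of standings tables.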
Racing into the future with MARRS
I am really looking forward to being able to help all of the WDCR MARRS racers track their season starting in April. Let’s hope I can drive my car better than last year and race as well as I have coded this application.
It is kind of odd to think that my desire to play with AWS caused me to build something useful for hundreds of weekend racing warriors. Now the next question, should I make it work for every racing group across the world? I mean multi-tenant, SaaS, world domination? Hmmm… Maybe I should try to finish better than 6th this year…
I just realized that I bought my “new” Windows 7 machine way back in late January. The thing is amazing: 8GB RAM, i7 860 Quad Core CPU, 3.0Gbps RAID-1 SATA drives, etc. I recently went out and bought a 30 inch Samsung monitor so I could put the video card in 2560×1600 mode. The speed, video, stability, etc. of this machine are incredible!
The most amazing thing is the OS. I skipped Vista due to all of the bad press – coupled with the fact that XP mostly did everything I needed from a desktop OS. Mostly was the key part of that sentence. It really could not handle more than about 2GB of memory efficiently – and I had some leaky open-source apps that regularly gobbled that up since I rarely reboot…
Free Microsoft Software!
Additionally, Microsoft has tossed in some FREE apps that were not available under XP as part of their Windows Live Essentials program. The most significant of those apps (to me) is Movie Maker. I regularly edit and upload portions of my SCCA Club Racing videos using Movie Maker. It is simple and easy – which fits my video skill level really well. I am also in the process of adding in a TV Tuner card so I can really utilize the Windows Media Center software that came with my Windows 7 Ultimate version. That should make it even more interesting to connect to my Xbox-360 (which now gives my AppleTV a run for the money in renting movies from the Internet).
Windows 7 handles memory well
I now regularly run over 3GB of apps without any issues on the machine whatsoever. I have not added all the DB servers, app servers, etc. that I used to run on my various Windows desktops. That is because I never retire my old machines and they are still on the network somewhere. I finally have created what is mostly a desktop machine used as a desktop.
No question, Windows 7 is a really fantastic OS. It will continue to be my main machine to access all the servers running in my home data center.
This morning I decided to grab a few photos off of my friend’s camera that I borrowed when I went to the World Class Driving 200 MPH EXTREME event last weekend. After all, it has a mini-USB connector on the camera (I thought) and I have dozens of cables from the myriad of devices I have purchased over the years.
Enter micro-USB connector
Much to my surprise, the Olympus FE-370’s mini-USB connector is very “mini”. In fact, it is so mini, it is called “micro” USB. It is just slightly smaller than mini and will not accept any of the many mini cables that I own. With literally two and a half feet of fresh snow outside, and not being the type to give up easily, I fished around for a few of my memory readers and removed the memory card from the camera.
My handy little Transcend RDP8 memory card reader can read four different formats. This should be no problem. Denied! It turns out the Olympus has a special memory card called an xD Picture Card. These are probably more common than I think – but not common enough for my Transcend reader that I bought for my 16GB CF card. My SanDisk reader (2 formats) would not accept it either.
Looks like I need to trudge out into the cold and snow to borrow the cable for the camera. I should probably invest in a micro-USB cable of my own and a newer memory card reader as well.
Great work by Brad Allison in creating SUMO, and by the data center and SA teams for pushing its usage. This tool allows AOL to identify underutilized servers and either decommission them – or bundle them up onto virtualized hosts.
It is great to work with dedicated people that are not only smart, but care about their environment at the same time.