NAS Hot-Swap Love

I suspect some SEO will drive the wrong audience to this post with the above title. NAS Hot-Swap Love is all about being able to change the drives in my Synology Network Attached Storage (NAS) servers without powering them down. The act of doing this while the power is on and the units are still running is called a “Hot-Swap”.

Replacement drives for my 2 Synology NAS units – waiting for NAS Hot Swap Love

As I mentioned in a previous article, I run my home network with two Synology NAS units for backup and media storage. These units are great. Between the two of them, I have 30TB of disk in RAID-5 configuration. They would be even more valuable if I had anything more impressive than old racing videos to store on them. You can only take so many pictures of your cats – no matter how good-looking they are.

Right before I relocated to Virginia in July 2019, my first Synology DS1515+ died. Synology replaced it at no charge under warranty. I removed the drives, numbered them, and shipped them to VA, and had Synology send the new system to VA as well. When I put it all back together, it actually worked! Which was great news, until the first drive died.

I shut it down, ordered a replacement drive, and waited for it to arrive. Upon arrival, I replaced the drive, rebuilt the RAID set, and yes, all my useless files were still available. Things again went smoothly for a month, until the second drive died. I knew what that beeping sound was this time. This time I ordered two new drives – as something told me these drives would start failing one after the other…

Last month, the third drive failed. I became emboldened! I took the drive out while the unit was still on, put the new drive in the housing, and reinserted it. The unit let me rebuild the RAID set, and I was never offline for one second. This was NAS Hot-Swap Love at its very finest. I did not play opera music while performing the operation – but I could hear it faintly in my head.

Alas, I realized I had exhausted my supply of 3TB drives – and had never stocked any 6TB drives for my new DS1517+ unit. As the picture above shows, I now have a supply of two 3TB and two 6TB drives waiting for the next drive failure. I have never felt so prepared for NAStiness.

Ted Cahall

Graphs and Charts with JFreeChart

Have you ever wanted to build free graphs or charts using open source in Java? Well, it turns out that you can build free Java graphs and charts with JFreeChart!

Graphs and Charts with JFreeChart: JFree.org website
JFreeChart at http://www.jfree.org/jfreechart/

As the author of marrspoints.com, I have often thought of bringing some of the statistics to life with graphs and charts. The main issue I faced was that they needed to work with Java, and they needed to be free and open-source. After considering a few other options, I landed on JFreeChart, written by David Gilbert. David maintains a blog, but it, like the JFreeChart documentation, is somewhat out of date. More on that in a bit – as there must be a reason I still chose it for marrspoints!

I originally developed marrspoints.com in 2015 and have continued to update it, adding some SEO and then some statistics in April of 2016 and, more recently, the Bracket Racing program (a very different type of class) in April of 2018. Oddly, I am writing this post in April of 2020 – the month I added the graphs I discuss here.

Advantages of Graphs and Charts with JFreeChart

The main advantages I found with JFreeChart were my key requirements:

  • Written for Java and works with Java 8
  • Free and Open-Source
  • It has some documentation (not free and not up to date)
  • It has a huge library of ready-made samples (not free)
  • Works with Java WebApps – specifically Tomcat 8

Since the library comes as a plain Java JAR file, I decided to go ahead and buy the documentation for about $55 USD. The first thing I noticed was that the PDF file was for the 1.0.19 version and not the current 1.5 version. This was worrisome. However, David’s blog pointed out that the documentation purchase also includes a complete sample set of charts in a JAR file, with instructions to run it.

Between the documentation and the sample set of 300 graphs, it was fairly easy to make quick progress. The documentation gave me enough clues to create a Java Servlet that generates a PNG with a summary chart. There is a sample app that lets you click through the 300 sample graphs and charts to see the various features: chart type, legend manipulation, axes control, titles, subtitles, etc. Combining these two elements, I was able to pull together four graphs of three different chart types fairly quickly. My first chart follows.

Graphs and Charts with JFreeChart: Unique Drivers per Class per Season
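The servlet approach described above can be sketched roughly as follows. This is a minimal example, assuming JFreeChart 1.5 and the Servlet API (Tomcat 8) are on the classpath; the class name and the hard-coded data values are hypothetical – in the real app the dataset would be loaded from the marrspoints database.

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.jfree.chart.ChartFactory;
import org.jfree.chart.ChartUtils;
import org.jfree.chart.JFreeChart;
import org.jfree.chart.plot.PlotOrientation;
import org.jfree.data.category.DefaultCategoryDataset;

// Hypothetical servlet that streams a bar chart to the browser as a PNG.
public class DriversChartServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // Placeholder data; the real numbers come from the database.
        DefaultCategoryDataset dataset = new DefaultCategoryDataset();
        dataset.addValue(42, "Drivers", "2018");
        dataset.addValue(51, "Drivers", "2019");

        JFreeChart chart = ChartFactory.createBarChart(
                "Unique Drivers per Season",  // chart title
                "Season",                     // category axis label
                "Drivers",                    // value axis label
                dataset,
                PlotOrientation.VERTICAL,
                false,   // no legend
                true,    // tooltips
                false);  // no URLs

        resp.setContentType("image/png");
        ChartUtils.writeChartAsPNG(resp.getOutputStream(), chart, 600, 400);
    }
}
```

Note that in 1.5 the utility class is `ChartUtils`; the older 1.0.19 documentation refers to it as `ChartUtilities`, which is one of the small gaps you have to bridge when reading the PDF against the current JAR.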

Disadvantages of Graphs and Charts with JFreeChart

As previously mentioned, JFreeChart is not totally free if you want the documentation and the sample charts. But to me, they are well worth $55 USD. Also as mentioned, the documentation is not current for version 1.5 – but it was good enough in the sections I needed to help me create Java Servlets that produce a PNG image. The sample charts are well worth the money and helped me craft custom charts such as the bar chart below.

Graphs and Charts with JFreeChart: Top 3 Ironman past 5 years

It would have been nicer if the documentation were up to date. It also would make me feel a bit better if there had been updates to this project from David after 2017. But it works great and is mostly free. David also has newer SVG graphs, JS graphs, and a whole new charting library. But this one worked well enough for marrspoints.com.

Conclusion of Graphs and Charts with JFreeChart

Hopefully this short article has helped you see that you can build some free, Java-based graphs and charts with JFreeChart.

Ted Cahall

Gnome 3.36 and Fractional Scaling

Top view of the SimplyNUC Hades Canyon 8i7HVK

My latest Intel NUC unit from SimplyNUC is a Hades Canyon 8i7HVK. I have written about the power and quiet nature of these NUC units before, but this unit is a break from all my previous NUCs. It comes packed with 64GB of DDR4 RAM, two 1TB SSD drives in RAID-1, four VR-ready 4K video ports, dual gigabit NICs, and nine USB 3.1 ports. It is a monster in a very small NUC package. The unit is about twice as wide as a normal NUC and about an inch and a half deeper, at 8.7″ x 5.6″ x 1.54″. It is slightly taller than the normal NUCs, but shorter than the “full height” versions.

Front ports on the Hades Canyon NUC

I received this unit in December 2019 with Ubuntu 18.04 LTS installed. The 3.28 version of Gnome running on the system did not support display “Fractional Scaling” – which made some applications much too small (or, in 200% mode, much too big) to read and use well. Some research showed that a newer version of Gnome would support Fractional Scaling, with 125%, 150%, and 175% options available. The 20.04 LTS version of Ubuntu includes Gnome 3.36, which does support Fractional Scaling.

Back ports of the Hades Canyon NUC

Right now, Ubuntu does not offer direct upgrades from the 18.04 LTS release to 20.04; those arrive with the 20.04.1 point release. I did download the ISO, ran it under virtualization, and confirmed that the Fractional Scaling option exists!
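For what it's worth, on an X11 session the fractional scaling choices in Gnome 3.36 are gated behind a mutter experimental flag. Assuming a stock Ubuntu 20.04 X11 session (this is a commonly cited setting, not something specific to my setup), enabling it looks like:

```shell
# Expose the 125% / 150% / 175% options under Settings -> Displays (X11 session)
gsettings set org.gnome.mutter experimental-features "['x11-randr-fractional-scaling']"
```

On a Wayland session the equivalent flag is reported to be `scale-monitor-framebuffer` instead.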

I will soon be moving my development environments over – and moving from the Ubuntu Unity desktop to Gnome – on this high-powered mini-PC with 64GB of RAM, a quad-core i7 processor, and amazing video power.

Ted Cahall

Digital Age Relocation

In July of this year, I had the amazing opportunity to join block.one, an innovative blockchain company and creator of the EOSIO blockchain software.

I have moved around the United States many times while pursuing my career in business and technology. My relocation plans have taken me to Chicago, three different homes in the Bay Area of Northern California, Seattle, Northern Virginia, back to Seattle, over to Minneapolis, and now splitting my time between Blacksburg and Arlington, Virginia.

My trips used to include packing hundreds of pounds of books, audio CDs, computer CDs, etc.

My time at Digital River allowed me to see how they built an early part of their business by leveraging high bandwidth speeds to obviate the need for software CDs. We all just download our software now and would not think of lugging boxes of software CDs.

Back-up CDs are also a thing of the past. I have my two Synology NAS devices, which slowly back themselves up to the AWS cloud.

Mr. Steve Jobs made audio CDs a thing of the past with iTunes. Most kids have no idea what a record player or even a cassette tape is. It felt good throwing away all those boxes of audio CDs.

I was browsing my Audible library this morning. Yep, all of my books are in there. Even the books I read in print are in my Kindle app.

The software I write is stored in GitHub in the cloud. No CDs there. Even my PCs are super small and easy to move.

Why is it I still hate moving my stuff so much? 🙂

Happy Holidays 2019!

Ted

UHD 4K over CAT5 – Solved!

In a previous post about my 4k home theater upgrade, I noted that I was trying to solve an issue caused by my AV cabinets being on one side of my family room, while my TV enclosure is located on the other. This only became an issue when I decided to upgrade the TV and the AV receiver to UHD 4k. UHD 4K requires HDMI cables that support HDCP 2.2. None of those wonderful cables exist in my walls between the AV cabinets and the TV enclosure. However, there currently are some solutions on the market that can run HDMI/HDCP 2.2 over Ethernet cables – specifically CAT5e or above.

Unfortunately, the house was wired back in 2005, before CAT5e was available. There were no extra cables pulled between the AV cabinets and the TV enclosure, and no spare open conduit – that would have been sweet! In assessing my situation, I noticed that a single CAT5 wire had been pulled between the AV cabinets and the TV enclosure. It had been used for IR signaling to turn components on and off. Unfortunately, CAT5 is not CAT5e.

Taking a Risk on HDBT on CAT5

What is HDBT / HDBaseT? According to Wikipedia, it is

a consumer electronic (CE) and commercial connectivity standard for transmission of uncompressed high-definition video (HD), audio, power, home networking, Ethernet, USB, and some control signals, over a common category cable (Cat5e or above) using the same 8P8C modular connectors used by Ethernet.

Wikipedia

Please note that the specification reference states CAT5e or above. Well, unfortunately, I did not have CAT5e – I only had CAT5 in the wall. I had seen references to a model of HDMI extender that ran 330 feet over CAT5e. My question was: would it run correctly over 30 feet of plain CAT5? The answer turned out to be yes! The unit I purchased was from “No Hassle AV” – but don’t bother with their website, as clearly that was too much of a hassle for them to build. But they did answer the phone when I called and shipped the product via Amazon very quickly. They do supply some support manuals online – but no link to them from the main page. Maybe even a simple link to the support page was too much of a hassle! But the extenders WORK!!!

Logitech Harmony Elite Remote / Hub

Controlling all of the components (receiver, cable box, AppleTV, Roku, etc.) in the AV cabinets, along with the TV, requires some magic, since they are all located behind some type of enclosure. Furthermore, the two enclosures are, again, on opposite sides of the room.

The Harmony Elite remote and hub are a great consumer-level solution for this. The remote communicates with the hub over Wi-Fi. The hub can also be controlled by an app on your smartphone. The hub, in turn, controls all of the components enclosed in the cabinets. One of the nicest features of the Elite remote is that it is fully programmable from your smartphone. The earlier, non-Wi-Fi versions of their programmable remotes required using their website and a USB cable to program updates. Adding and removing devices and configurations is now amazingly simple.

I have owned all of these Logitech Remotes – in fact, these are their boxes… Elite on the far left.

Sending the IR from the Hub to the TV

The final issue with this setup was controlling the TV on the other side of the room. The hub was in the cabinet, and its “IR extenders” were also behind that closed cabinet door. Another great feature of the HDBaseT specification and the extenders I purchased is that they allow IR signals to ride over the CAT5 as well. The extenders come with two additional IR receivers and transmitters, so signals can be sent in both directions. Now the remote’s hub can send the IR signal across the room via the extenders to control the TV as well.

Final Snag….

I was able to make the HDMI extender work with the Logitech “IR Blaster” – but only when it was in a lighted room. I spent hours trying to debug this. It turns out the “Blaster” was “saturating” the IR receiver when in the enclosed cabinet. By ordering the Logitech Harmony Precision IR cable, I was able to solve this issue as well. Note my “hack” setup below, used until I figured out why it only worked outside of the cabinet.

IR Hack

UHD 4K Home Theater Happiness

While I may have exited UHD 4K Home Theater Hell, I would not quite say this is heaven – but it is pretty close. I purchased the more expensive HDMI extenders to improve the odds of them working over CAT5 instead of CAT5e. It is not clear whether the cheaper model would have worked just as well.

I am very satisfied with this setup. But I also realize that had the previous installers not added a CAT5 cable for IR signaling, I would not have been able to use this solution. I was considering radio-based HDMI extenders – but those specified “line-of-sight”, which did not make me confident they would work behind closed cabinet doors. I did hear of some possibilities of fishing new wires through the ceiling of the basement via my can lighting. While that would have been an option, I was luckily saved all that research and pain by a simple CAT5 cable.

Ted Cahall

UHD 4K Home Theater Upgrade Hell

Many folks embark upon a home theater upgrade only to find it a tad more difficult than they expected. My goal was to get everything in my family room to UHD 4K. While it looks terrible, I was able to achieve that by putting an external rack next to my built-in TV / speaker cabinet. But wait – my house came with built-in component racks for this stuff! This is where my hell begins. (Current “hacked” setup next to my TV below.)

Hacked External Media Rack

Great House, Great Cabinets, Aging Tech

Almost six years ago, I bought a wonderful house, originally built by the Minnesota Vikings great Joe Senser and later upgraded amazingly by the daughter of the founder of Best Buy. It was clearly Geek Squad city in here for weeks. I have home theater setups in both the family room and the media room in the walk-out basement, both using embedded racks in cabinets built into the walls, with AMX control panels to control everything from window shades and lights on down to the TVs and the components. The main issue I have is that the embedded rack cabinets are on the opposite side of the room from the giant built-in TV / speaker cabinet. The cabinets and the embedded racks are really well designed. They can be pulled out on attachable rails (see below) and have articulating wire guides. There are two racks – one for the main components (receiver, amplifier, etc.) and a second for old accessories that are now unnecessary, such as DVD players, CD players, Blu-ray, and VHS. Pretty cool racks, huh?

This does not sound terrible at first consideration – until you realize it was built around 2005, before HDMI cables, HD, or UHD. It was even before common use of Cat-5E or Cat-6 Ethernet cables. The TV and components in the house when I bought it were pre-HD. For goodness sake, we cannot have that! I would lose my Platinum Couch Potato card.

So What is the Problem?

So the main issue is getting the UHD 4K receiver connected via HDMI 2.0 to the UHD 4K TV across the room (thus my hacked setup with an external stack next to the TV cabinet). Unfortunately, the Geek Squad (or home remodelers) left no auxiliary conduit between the racks and the TV cabinet. The conduit used between the rack and TV is absolutely stuffed with speaker wires and various coax wires. It is impossible to re-fish anything through that mess.

Potential Solutions to Home AV Hell

Fortunately, there is one Cat-5 (not Cat-5E) wire that seems to run directly between the racks and the TV cabinet. I have not buzzed it out to test whether it is a “direct” point-to-point wire or whether it goes through one of the basement Ethernet “home run” switches. It turns out there are a few HDMI UHD 4K extenders that run over Cat-5E or Cat-6. While the cable I have is Cat-5, it might work if the distance is short enough. So an HDMI extender over Ethernet is my best option.

Of course, there is also the option to rip up the walls, ceiling, and floor and run a 50-foot HDMI cable rated for UHD 4K. While I was in there, I would add a bunch of Cat-6 cables for any type of future expansion, since that seems to be the type of network cable used for wire converters. This seems crazy expensive considering the path through my walls that would be necessary (the room is two stories high), and I don’t have the home wiring diagrams showing the routes they took.

One other option I have is a wireless HDMI solution. Right now, I only see systems that support HD-quality HDMI over wireless. This might have to be the setup I use when I sell the house, so that everything is in an enclosed cabinet and the place looks high tech (even though it will be low-tech HD). UPDATE: I just found some wireless UHD 4K @ 60 Hz extenders by J-Tech – $500 for the pair. If the $300 Ethernet-based pair does not work due to my cable being only Cat-5 and not Cat-5E, I now have an option. It says line of sight. I wonder if I could cook a hot dog on one of those antennas.

Conclusion – more to come

I have ordered the “No Hassle AV” UHD 4K Extenders (see Ethernet extender link above) that run over Ethernet. I will test them and see if it will solve my dilemma. I will post updates back here as I make progress (or lack of progress). Comments and feedback welcome on Facebook where I posted this article.

Ted Cahall

Amazing or Amazon Web Services (AWS)?

Working with Amazon Web Services (AWS) is actually quite amazing. It is not just a hosting platform – which is all I initially needed when I launched marrspoints.com. You know, the mundane, standard bit of Java, Tomcat, and MySQL hosting. After launching the FinTech start-up TheSubtractor.com with Scott Scazafavo last year, it has been a fantastic journey into the many application services that they offer as well.

It is so nice that the Route 53 DNS service integrates so seamlessly with their SSL certificates and elastic load balancers. It makes certificates dead simple and painless. Compared to the work of standing up SSL sites in the past, it feels like the GEICO caveman commercials: so easy.

Email services are a snap with the Simple Email Service (SES). Standard email client code such as Apache Commons Email works right out of the box once your account and domain are cleared by Amazon to send.
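To give a feel for "right out of the box", here is a minimal sketch using Apache Commons Email against the SES SMTP interface. The SMTP host, credentials, and addresses below are placeholders (the real values come from the SES console), and the sender address must be SES-verified:

```java
import org.apache.commons.mail.DefaultAuthenticator;
import org.apache.commons.mail.Email;
import org.apache.commons.mail.EmailException;
import org.apache.commons.mail.SimpleEmail;

public class SesMailSketch {
    public static void main(String[] args) throws EmailException {
        Email email = new SimpleEmail();
        // SES regional SMTP endpoint; credentials are SES SMTP credentials,
        // not your regular AWS access keys.
        email.setHostName("email-smtp.us-east-1.amazonaws.com");
        email.setSmtpPort(587);
        email.setStartTLSEnabled(true);
        email.setAuthenticator(new DefaultAuthenticator("SES_SMTP_USER", "SES_SMTP_PASS"));
        email.setFrom("noreply@example.com");   // must be a verified SES sender
        email.addTo("user@example.com");
        email.setSubject("Hello from SES");
        email.setMsg("Sent with Apache Commons Email over the SES SMTP interface.");
        email.send();
    }
}
```

The same code works against any SMTP server; the only SES-specific parts are the endpoint and the sending-domain verification step.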

Email integration is also done elegantly, with bounce, complaint, and delivery notifications handled through their Simple Notification Service (SNS) via JSON messages. The costs are crazy low if you do not mind writing the glue between your applications and the services. Even text messaging is super cheap when you move to add texting in addition to email.

Need a CDN? No problem: start with some S3 buckets as the backing store, put AWS CloudFront in front of them, add in an SSL cert for the secure pages, and you are up and running in minutes! Again, dead simple to implement. Want to make the image management of your CDN native to your in-house tool set (versus using the AWS S3 console to upload files)? Use the AWS SDK in the language of your choice to list the files and upload new ones. It’s a little sad that sub-directories are not real in S3, but with some code you can fake those pretty well. You might want to pull in only the services you need from the SDK, as the all-up Java version is over 100MB!
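A rough sketch of that list-and-upload pattern with the AWS SDK for Java (v1) follows. The bucket name, prefix, and file are placeholders; note how a key prefix plus a delimiter "fakes" the sub-directories that S3 does not really have:

```java
import java.io.File;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ListObjectsV2Request;
import com.amazonaws.services.s3.model.ListObjectsV2Result;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class CdnAssetsSketch {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion(Regions.US_EAST_1)
                .build();

        // "Directories" in S3 are just key prefixes; listing with a prefix
        // and a delimiter gives a folder-like view for an in-house tool.
        ListObjectsV2Request req = new ListObjectsV2Request()
                .withBucketName("my-cdn-bucket")   // placeholder bucket name
                .withPrefix("images/")
                .withDelimiter("/");
        ListObjectsV2Result result = s3.listObjectsV2(req);
        for (S3ObjectSummary obj : result.getObjectSummaries()) {
            System.out.println(obj.getKey() + " (" + obj.getSize() + " bytes)");
        }

        // Upload a new asset under the same "folder".
        s3.putObject("my-cdn-bucket", "images/logo.png", new File("logo.png"));
    }
}
```

Pulling in just the `aws-java-sdk-s3` artifact (rather than the full SDK) keeps the dependency size sane, per the 100MB warning above.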

Need to do some AI or machine learning? Amazon has some amazing services that allow you to spin up compute and only use it when you need it. You do not need to build out a Hadoop cluster on a bunch of EC2 nodes – they take care of all of that for you. You supply the PySpark or MapReduce code on top of their Elastic MapReduce (EMR) service. They are making so many inroads into Big Data as a service that it really lowers the bar of entry for companies to innovate, test, and learn with their data strategies.

For a number of reasons, I learned to move to Ubuntu 16.04 as my main version of Linux on AWS. Too many third-party packages were not available in Amazon Linux 2 from AWS. Things like Zabbix had to be hacked in, and Amazon Linux 2 was way behind on the version of Tomcat in its repo. Issues with the MySQL client libraries also come to mind. Just too much fiddling to get a new node to work with my standard development scheme, when everything works perfectly from the repos with Ubuntu 16.04 (minus the Oracle version of Java, of course).

I am sure I am leaving out dozens of items I found along the way of working 100% in AWS for our production site at TheSubtractor.com. It has been a really enjoyable journey – so enjoyable I have not written a blog post since July! We are no longer in Beta and are open to all for registration. I could not be happier with AWS – it does what it says it does, and it is super easy to implement and integrate with all of its other services.

Ted Cahall

WolkeWerks Alpha Goes Live

Today marks another step along my journey as a co-founder (chief bottle washer?) of a FinTech start-up – we are ready to announce our WolkeWerks Alpha launch! It has been an interesting and rewarding experience, to say the least. My co-founder, Scott Scazafavo, and I have spent most weekdays with at least one video meeting as we hash through the details of our product and the problems it solves for our consumers. Only two people in the company, and yet we still have multiple locations and a two-hour time-zone gap. Flexibility is a key to success.

To me, we have too much polish and too many features for an Alpha.  For Scott, he no longer “cringes” when showing the product to his friends and family.   The joys of being co-founders.

We are really fast, er, not that fast

As usual, things always take longer than one would like or even optimistically estimate. After Scott and I determined the initial high-level plan, we selected a data provider, and I was able to produce a proof-of-concept / prototype in one week. WE are REALLY fast, we thought. Within a day, I had skinned the “consumer” version in Bootstrap 3. OMG, we are SUPER fast! The prototype made it clear we had all of the building blocks we would need (aside from an army of software engineers, designers, research assistants, etc.).

If I was able to write a fully functioning, Bootstrap-skinned prototype based upon a data service’s REST API in under two weeks, surely we could get our Alpha product live by April? May at the very latest.

Where did it all go Blanche?

Hmmm. Oddly, as we launch today, on July 5th, it is interesting to see where the time went. Figuring out the features for Alpha took longer than expected as we scoped the ideas down to the bare essentials. Over a month was spent looking at technologies and products that we eventually realized would not be used until the MVP phase. A huge help was the book “The Lean Startup” by Eric Ries. It helped us focus on much less and test our way into changes incrementally (and thus an Alpha, Beta, MVP, and then the major releases).

Of course, there is also the fact that we only have one software engineer (me). I like to think I can code fairly quickly, but in fact, I am also the AWS systems administrator, the Apache and Tomcat administrator, the MySQL DBA, the front-end web developer (JavaScript, jQuery, Bootstrap 4, HTML5, CSS, JSP), and the core services developer (Java). Oddly, as has been the case during my whole career, product management was able to generate ideas and features faster than engineering could produce them at the same pace. I was doing some product management along with Scott – but only I was doing any coding or system administration. Clearly, I learned to stop making my own backlog bigger fairly quickly. 😉

In the beginning…

In the early phases we needed to agree on the basic functionality.  We knew the long term product would use distributed processing, AI and machine learning.  These are of great interest to me and so I poured myself into learning them more deeply (and getting them working in my lab) as fast as possible. This was going to be a super cool product and possibly even more fun to build!

A dollop of Hadoop and a sprinkle of Spark

What a dream job! I was a full-time student again. One of our product’s main goals is to help a consumer manage their online subscriptions. My quest for building a Hadoop-based AI engine allowed me to add at least five more online subscriptions to my credit cards! I was a super-user and the product was not even built yet. Courses from Udemy, Lynda.com, Coursera, and Pluralsight were great! I quickly outpaced the top courses in Treehouse, but had fun looking at them. These paid services were in addition to my regular free sites, such as w3schools.com. I was suddenly an online training expert. Visions of blogging about the various online training sites and their relative merits sadly danced in my brain.

I took courses about Eclipse, Docker containers, AWS S3, Hadoop, MapReduce, Spark, and stuff I cannot even recall at this point.  All of this while building out and upgrading my Hadoop and Cassandra clusters and testing my various theories on how to make the product sing.  Then a dose of reality hit as I was working my way through “The Lean Startup” book.  Oops.  I had gone way too far down that path when we were not even sure of the core viability of our product.  There were MUCH simpler ways to achieve the product viability testing we would need without the AI engine working day one.  Well that was at least one month of “fun” research that was pointless to our Alpha.  Ouch.  That was a big chunk of time lost – regardless of how much fun it was.

Time for a road trip

Once I realized I had lost my way according to the “start-up bible”, I quickly re-focused my interactions with Scott. We decided we needed to spend some time in the same city (with a real whiteboard) to flesh out the phases of the product roadmap. Scott and I chose Denver as it was in between Seattle and Minneapolis. We used Trello (for free) and carved out what the Alpha, Beta, and MVP versions of the product needed to be (knowing even the Alpha would shift as we continued to determine feasibility).

Real collaboration – at the code level

The next issue was linking Scott’s product features and design to my coding. It sounds simple, but we needed to agree on basic things such as a front-end framework – and possibly a tool that would allow Scott to design mock-ups that could be implemented with relative ease inside the chosen framework. With much angst and encouragement, we agreed Scott would learn a little Bootstrap and select a Bootstrap tool to make the “unauthenticated” portion of the Alpha site, since my version was a white page with a login form (I thought it looked great!).

Since Bootstrap 4 was available, Scott picked a tool that generated Bootstrap 4 code. No worries – Bootstrap 4 is backwards compatible with the Bootstrap 3 code used in the prototype, right? Um, no. 🙁

I made the mistake of doing a five-minute hack of the consumer site (integrating the authenticated prototype and the new unauthenticated code from Scott). It produced a hybrid Bootstrap 3/4-ish site that I showed to Scott. I think I almost killed the company. It looked OK-ish (to me). Scott was so depressed at how bad it looked that he was on LinkedIn looking at Junior Product Manager roles.

Staying positive and building momentum

I realized that incremental crap-ism might not be the way to encourage Scott along the Bootstrap 3 to 4 migration journey. I quickly researched all the issues with making the site fully Bootstrap 4 and re-coding the prototype so it looked proper in both the authenticated and non-authenticated states.  Slowly I coaxed Scott back off the ledge when he saw his designs working pixel-by-pixel as he had designed them.

Moving from Bootstrap 3 to 4 and creating a working rhythm with Scott as he generated pages and updates to the prototype took some time, but the pages started looking better and better. Eventually Scott learned how to use GitHub and started making some UI changes directly himself. Now we had put down our rocks and sticks and begun cooking with gas.

There were some data service issues that we needed to address. Then some AWS issues we needed to investigate and correct – plus redirects, SSL certs, password standards, and a lot of other things that were correctly deemed important for a FinTech Alpha.

Then the real fun began. The product was getting close enough, but there were some key features that we felt would make the product more useful and pleasing to our consumers, and there was an issue with “the browser wars” that was likely to cause confusion for our Alpha users. Should we add the features and try to fix the browser-wars issue? In the end, we decided yes. The additional features required about a day of coding. Well worth it.

FireFox, FireFox, why, why, why?

We decided we needed to make a valiant effort to correct an issue where browsers were auto-populating our site’s credentials into a third-party access credentials form. We allow our consumers to access their own data through our product, but they need to securely pass their credentials on to their banking service. If the browser auto-populated these fields with our site’s login credentials, the user would likely pass the wrong credentials to the back-end banking service and the connection would fail. It should be easy to stop a browser from auto-populating a form field, you would think. Unfortunately, again, no.

Here is where the “browser wars” come into play again. The HTML5 specification says that there is an attribute called “autocomplete” that can be used on the <form> and <input> tags. By setting autocomplete=”off”, the browser should know not to populate that password field with the current site credentials. Perfect – except none of the browsers honor it!
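For reference, the markup the spec suggests looks like this (the form action and field names here are purely illustrative, not our actual form):

```html
<!-- What the HTML5 spec says should suppress autofill -->
<form action="/link-bank" method="post" autocomplete="off">
  <input type="text" name="bank-username" autocomplete="off">
  <input type="password" name="bank-password" autocomplete="off">
  <!-- Mozilla's documented alternative for password fields: -->
  <!-- <input type="password" name="bank-password" autocomplete="new-password"> -->
  <input type="submit" value="Connect">
</form>
```

In practice, browsers treat saved login credentials as exempt from these hints, which is exactly the behavior described below.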

Protecting the installed base – of course!

There is a significant hurdle browser companies implement so users are less likely to consider switching browsers.  Many users allow the browser to remember all of their passwords to the many banking, media, and merchant sites they use.  Smart users do not use the same password for their favorite recipe site as their investment or banking sites.  Who can remember all of those passwords?  Let Chrome, Firefox and Edge remember them for you!  But once a user has done that, it is way too much work to start over again with a new browser and migrate all of those passwords (that they really don’t remember) along with them.  Great product management by the Chrome, Firefox and Edge teams in protecting their installed bases.

This is all fine and dandy, until a site needs to be sure the consumer is not confused when a different set of credentials are being requested.  Auto-populating credentials (especially when they should be different than the site they are on) may cause less sophisticated users to submit the wrong credentials.  This further confounds the user experience and frustration levels.

A partial fix for now – and full fix in Beta

Interestingly enough, there is an article on the Mozilla Developer Network blog that explains a simple way to stop the browser from auto-populating the form fields.  And guess what?  It works for the latest versions of Chrome and Edge – but NOT FIREFOX!?   Mozilla explains how to correct the issue (autocomplete=”new-password”) – but then notes that Firefox ignores that as well.

Why would Mozilla do this?  Because they have been losing market share to Chrome for years – and they need to be more aggressive in their product design to capture and keep new users by storing their passwords at all costs.  Sad.

So we launch knowing Firefox users may see some confusion when we (securely) need their credentials for their bank services.  There is a complex JavaScript fix that we will eventually implement: it randomizes the field names on load, then changes them back to “password” and “username” just prior to form submit.  Sad that we need to resort to that.  But we will make that part of the Beta and just pre-warn our Firefox Alpha users.  After all, it is an Alpha, and we decide who signs up and what issues we are asking them to avoid.
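A minimal sketch of that randomization idea (the function names, nonce scheme, and regex here are illustrative, not our production code):

```javascript
// Sketch of the field-name randomization workaround: give the inputs
// throwaway names on load so the browser cannot match them against any
// saved credentials, then restore the real names just prior to submit.

// Create a one-off decoy name from the real field name.
function makeDecoyName(realName) {
  const nonce = Math.random().toString(36).slice(2, 10); // e.g. "k3v9x1qz"
  return `${realName}_${nonce}`;
}

// Strip the decoy suffix to recover the real field name before submit.
function restoreRealName(decoyName) {
  return decoyName.replace(/_[a-z0-9]*$/, "");
}

// In the browser, this would be wired up roughly as:
//   on load:   input.name = makeDecoyName(input.name);
//   on submit: input.name = restoreRealName(input.name);
```

Because the browser never sees a stable field named “password”, it has nothing to match its stored credentials against; the back end still receives the expected field names because they are swapped back right before submit.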

A curvy roadmap – and then a right turn

So after a month’s travel down a wrong road in the course of developing our roadmap, we have finally arrived at the day of our Alpha release.  We need to remind ourselves it is not an MVP or a Beta – it is just an Alpha.  But to me, it is an Alpha that is pretty slick and does a lot of what we will need it to do as we roll towards Beta and the MVP.

There are many features left for Beta and the MVP as well as dusting off the machine learning and AI code – but we are on our way!

Ted Cahall


Ted Cahall, Moz, and Open Source

Ted Cahall and Moz
Ted Cahall and Moz at the old Netscape offices circa 2009

I am a huge proponent of open source.  I often refer to using open-source software as “standing on the shoulders of giants”.  Such amazing leverage to accomplish complex tasks.  Software developers today are modern alchemists, stringing together pieces of the solution as systems integrators.  My tribute to open source and Mozilla.  Taken at AOL’s old Netscape offices back in the 2009 time frame.

Ted Cahall

Zoom – FREE P2P Video Conference

Scott Scazafavo and I have been working full time on our new start-up, WolkeWerks.com.  This often places me in my home office reaching out to colleagues for advice and collaboration.  My communication tools of choice have been free peer-to-peer (P2P) video conference tools.  Scott and I have used Skype and FaceTime, but experienced the common video lags and garbled voices.  Needless to say, these were frustrating experiences.

Zoom Logo

It only takes a garage to fall on me

My dad used to say, “I don’t need an entire house to fall on me to learn something, it only takes a garage”.  I think he was telling me to learn from trends when they are still small – and even a garage hurts when it falls on you.

The second time someone (OK, a nice recruiter) asked me to connect with Zoom, I realized it was a high-quality service, with none of the usual video choppiness and garbled voices.  I did not look into pricing, as I figured it was another service used by larger corporations à la WebEx or BlueJeans.  When a colleague in Berlin sent an invite with it, I thought it was odd that he was willing to pay for a service just to chat with me.  This was on top of another colleague with a pending Zoom call scheduled.  Why does everyone want to see a bald guy on video when it is such a frustrating technology?

FREE Zoom P2P Video Conferencing

Because it really isn’t frustrating anymore.  At least from my sample set of four calls so far: one to Boston, one for two hours to Berlin, and two to different people in Seattle.  But the biggest surprise: it is FREE for two people with unlimited connectivity.  It is also free for three or more people for 40 minutes or less.  FREE is my favorite word, as I am an unabashed open-source bigot.  But FREE that really works well is amazing.

I love the idea of getting people to try something for free for personal use; once they fall in love with it, they are happy to pay for it in other circumstances.  I have not tried a call with three or more participants yet.  I suspect they have this technology so dialed in that once you do a 30-minute call with three or four people that runs long, you get hooked on how well it worked and add your credit card to the account.

Check out Zoom

The folks at Zoom also have connectivity modules and upgrades for corporate H.323/SIP systems and LifeSize, Polycom, and Cisco gear.  It is all on their website.  It seems like they have really nailed the tech on this so far.

I have a call Monday to London with another colleague.  I would never have asked him to use video conferencing in the past.  Too clunky and messy.  But we are set up on Zoom and it will be good to see his face for the first time in a year – even though we catch up nearly every month.

Check out Zoom. All it takes is a laptop, iPad or a mobile phone.

Ted Cahall