SSIDs Everywhere

I’ve been an apartment dweller for a relatively long time, and there are some solid perks to it. It’s nice to never have to worry about things like maintenance; if something goes wrong, I open a service ticket and someone shows up to fix the problem. When my furnace wouldn’t start on a cold day, for example, I wasn’t scrambling to figure out who to call. There are some downsides as well, though. For one, many people are frustrated that they can’t customize their home to the degree they’d like with respect to things like the color of the walls. Anyone who knows me knows that I couldn’t possibly care less about that. What does give me grief, though, is how thoroughly the spectrum around me is polluted by my neighbors.

I had issues with this in my last apartment, which was roughly the same size as my current one but with a longer, narrower layout. When I moved my home office to the bedroom, at the opposite end of the apartment from my router, I saw a noticeable performance decrease on my home network; this was a problem when I was working from home and an even bigger problem when I was trying to reach Platinum in Overwatch (spoiler alert: I didn’t manage to do it). At the time, I replaced my cheap home router with a mesh WiFi setup so that I could utterly drown out my neighbors with sheer WiFi dominance. I ended up buying a pack of 3 access points because it was only slightly more expensive than buying 2 individually. That was more than enough to blanket my 900 square foot apartment, and I didn’t think much about my setup after that.

Fast forward to my current apartment, which is in a much more populous area than where I previously lived. Pre-pandemic, I still didn’t think much about my home network setup. The only bandwidth-intensive activity I did was video streaming, and that was mainly done from streaming sticks connected to a TV sitting literally right on top of the router/access point connected to my modem; I never ran into issues with it. After the pandemic kicked into high gear in this area, though, I started living out of my home office and spending unholy amounts of time on web conferences. While they worked well enough most of the time, I’d periodically have spikes of extremely high latency that made me sound like Megatron’s cousin on calls. This was annoying on work calls and infuriating when trying to record podcast episodes.

At first I assumed the problem had to be upstream, with my ISP being overloaded now that everyone was suddenly staying home all the time. As the problem persisted, though, that made less and less sense to me. It would be reasonable if this behavior happened in the evenings, when everyone is sitting around binge-watching their favorite shows because they can’t go out. When my network was choking to death on a 9 AM call, though, I was left scratching my head. Surely not enough people could be doing video calls at that time, right? And while latency affects VoIP traffic heavily, video conferencing isn’t exactly the most bandwidth-intensive thing to be doing in 2020.

Thinking my mesh network was the problem, I even tried ripping it out and replacing it with a single router connected to my modem. While it seemed a bit more stable at first, I still ran into some of the same problems. Finally, I noticed just how many networks were visible from any of my client devices. I fired up WiFi Explorer and was presented with this nightmare.

If you’re thinking that looks disgusting, you’re correct. Every channel in both the 2.4 GHz and 5 GHz bands is completely packed with networks. I’ve been checking this periodically since realizing it could be the problem, and I regularly see anywhere between 50 and 70 different wireless networks from my apartment. Yikes.

Admittedly, I’m part of the problem. I’m broadcasting both a 5 GHz and a 2.4 GHz network from a router that I use exclusively for work. I’m also broadcasting a main and a guest network on my mesh setup, and matching SSIDs broadcast on both the 2.4 GHz and 5 GHz bands from each of the 3 access points per network, meaning I’m technically broadcasting 12 SSIDs. Even so, I’m still competing with, at minimum, 40-ish other devices crowding the same spectrum.
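That per-band, per-access-point multiplication is simple enough to sketch as a back-of-the-envelope calculation. This is purely hypothetical arithmetic mirroring my setup; none of the numbers are read from any actual hardware:

```python
# Hypothetical back-of-the-envelope SSID math mirroring my setup;
# none of these numbers come from real hardware.
mesh_networks = 2    # main + guest networks on the mesh
bands = 2            # 2.4 GHz and 5 GHz
access_points = 3    # mesh nodes

mesh_broadcasts = mesh_networks * bands * access_points
work_router_broadcasts = 2  # the work router's 2.4 GHz + 5 GHz networks

print(mesh_broadcasts)                           # 12 SSIDs from the mesh alone
print(mesh_broadcasts + work_router_broadcasts)  # 14 in total from my apartment
```

Small numbers, but they add up fast when every neighbor is doing the same thing.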

After realizing that I may have been blaming all of the wrong things, I adjusted my access points so that the main router/access point connected to my modem has direct line of sight across my apartment to the mesh access point at my office desk (which moved from the desk proper to a table next to it). Since doing that, knock on wood, things have at least seemed a bit more stable. Either I’ve been having a better experience on web conferences, or no one bothers to complain when my audio suddenly sounds like garbage because they’re used to that happening from my end.

All that being said, I do still have an issue where the network stack on my MacBook Pro will crash, leaving me with no network connectivity until I disable and then re-enable WiFi. I haven’t managed to find a fix for that particular problem yet, though I imagine having 4 different flavors of VPN client installed probably isn’t doing me any favors.

Ubuntu Linux GRUB Error After 20.04 Upgrade

While I’ve nuked my personal VPS, I still have a VPS that I use for work; it comes in handy for things like running cron jobs, maintaining persistent shells, and generally handling anything where a Linux shell beats a macOS shell (I’m looking at you, remote PowerShell sessions connecting to Microsoft Exchange). This week I decided to upgrade it from Ubuntu 18.04 to Ubuntu 20.04. I like to stick to the LTS (long term support) releases for my servers, but I typically prefer to keep even the LTS releases upgraded rather than waiting for them to go end of life. I could have kept using Ubuntu 18.04 with maintenance updates until 2023 and security maintenance until 2028, but what’s the fun in that?

Upgrading a VPS is always a bit of a nerve-wracking situation just because I don’t have local access to the host in case something goes extremely awry. Ubuntu tries to help alleviate this by opening a second SSH daemon on a different port just in case the primary daemon crashes during the upgrade, but if the machine ends up in a non-bootable state I’m still more or less hosed. Fortunately for me, things almost went off without a hitch… almost.

While the upgrade did complete, I received an error toward the end of the process that GRUB failed to upgrade successfully. This was mildly terrifying since GRUB is the bootloader; if it’s not working properly the system won’t boot, and I can’t access the host of the VPS to troubleshoot it. Luckily, GRUB continued to work in my case, and my system was able to reboot successfully after the 20.04 upgrade and beyond. GRUB just wasn’t getting upgraded. I quickly noticed that I also received an error from GRUB every time I ran sudo apt update && sudo apt upgrade to update my system. Again, the other packages would upgrade successfully, but GRUB would always complain:

dpkg: error processing package grub-pc (--configure):
 installed grub-pc package post-installation script subprocess returned error exit status 1
Errors were encountered while processing:
 grub-pc
E: Sub-process /usr/bin/dpkg returned an error code (1)

After spending some time ignoring the problem since it wasn’t exactly critical, I finally decided to do some digging. It turns out that problems like this have plagued Ubuntu upgrades for a while; I found a thread describing the same problem all the way back in an upgrade to Ubuntu 14.04. The solution in that case was to simply “nuke and pave” by removing GRUB and then re-installing it. It’s once again a bit of a white-knuckle situation, since if anything happens between removing and reinstalling GRUB, the system will be left unable to boot. The steps were very similar to those in the linked thread, with some minor differences in the era of Ubuntu 20.04. The first step was still to purge GRUB:

sudo apt-get purge grub-pc grub-common

Running this command in 2020 already removes /etc/grub.d/, so there’s no need to remove it manually. Instead, I moved straight to re-installing GRUB:

sudo apt-get install grub-pc grub-common

The installation process kicks off an interactive wizard asking which disk(s) GRUB should be installed to. In my case, I only needed it on the main disk, which is /dev/sda. With that done, I updated GRUB and then rebooted:

sudo update-grub
sudo reboot now

This part kind of sucked, as I was left running nmap against the SSH port for my VPS and hoping that GRUB was properly set up to allow the system to boot. After a nervous 15 seconds, though, the port started to respond again, and I could successfully SSH into the server. Re-checking for updates showed that everything was fine; the errors about GRUB having a pending upgrade that couldn’t be installed were gone. Admittedly, going through this upgrade without any specific reason for it was probably unnecessary, but the beauty of Ubuntu is its popularity. Rarely will there be an issue someone else hasn’t already encountered, solved, and documented, and this problem was no exception.
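The nmap-and-pray loop can be approximated with a plain TCP probe. Here’s a hypothetical Python helper along those lines; it’s a sketch of the idea rather than what I actually ran, and the hostname in the usage comment is a placeholder:

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 120.0,
                  interval: float = 1.0) -> bool:
    """Poll a TCP port until something accepts a connection or time runs out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful connect means the SSH daemon (or whatever service
            # lives on this port) is listening again.
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)  # refused or timed out; try again shortly
    return False

# Example usage after a remote reboot (hypothetical hostname):
# if wait_for_port("my-vps.example.com", 22):
#     print("SSH is back; safe to reconnect")
```

Same effect as watching nmap output, minus the manual re-runs.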

Offsite Podcasting

I’ve written before about how difficult it can be to record podcasts remotely, something that has continued to be a struggle throughout the pandemic. For the Unusually Pink Podcast, the irritation of recording remotely was enough to make both Brandi and myself decide it was best to throw in the towel after a year. Mark and I have managed to keep the Same Shade Of Difference podcast going, though significantly more work is involved in getting an episode together remotely. I say that like I have anything to do with it, but it’s really all Mark doing the work of recording the episodes we do over Discord, editing the files, adding in the music, and everything else. It’s substantially more effort, and at the end of the day we still don’t end up with as clean and natural a product as we would recording in person; latency means we still periodically try to talk over one another, even as we use video calls to try to avoid exactly that.

What we’ve been experimenting with lately, though, are offsite recordings. They give us an experience similar to the onsite recordings we used to do from our old podcast studio, just in random locations. We first gave this a shot at Dead Lizard Brewing Company in Orlando when we were there for the Podfest Expo, AKA the last thing we did before the pandemic ground everything to a halt here. That was a surprisingly good experience, and you might be surprised at how rarely even loud background noise makes it into the recording.

With how cool the experience was at Dead Lizard, Mark and I have been trying to basically emulate that experience now from places that allow us to sit outdoors, away from everyone else. We’re racing against time a bit as the weather continues to get colder, but it’s mostly been a great experience thus far. We first recorded at the West Sixth Farm for the “How We Work in the COVID-19 Era” episode. The Farm is a terrific location, with a covered pavilion area that has picnic tables, accessible WiFi, and outdoor power we can use. Barley the farm dog came to hang out with us for a little while, and it’s a very difficult view to top while recording.

The one downside of this setup is that, with things being relatively out in the open, we end up at the mercy of the wind. We attempted to record a second Same Shade Of Difference episode there about a month ago, but we had to scrap it because the windscreens for our microphones couldn’t keep up, and the wind noise would have been a constant disruption to the episode.

More recently, we recorded at Idlewild Park in northern Kentucky. This was also a cool experience where we were able to take a quick look around, find a pavilion with no one else at it, and post up to record without having to worry about being too close to anyone. We were fairly close to Cincinnati/Northern Kentucky International Airport, but even that sound barely came through the microphones.

That episode isn’t live yet, but look for it soon over at the Same Shade Of Difference website or your podcast source of choice.

As with anything, there are caveats to offsite recording. Things like electricity and Internet access aren’t guaranteed when you’re trying to stay away from everyone else; the West Sixth Farm is an insanely good location precisely because it offers both. While at Idlewild Park, we just leaned on the batteries for our gear and used the hotspots from our phones and tablets for connectivity. The pavilions at both locations were clutch given the possibility of inclement weather. Either Mark or I had to travel at least a bit to get to each location, so having backups of pretty much everything is helpful; spare batteries, memory cards, headphones, and the like can save you from a scrapped recording session. I say this as if I provide any of it, but it’s really Mark who has both everything you need and backups of everything you need in his bag at all times. I give him a lot of grief over how heavy his backpack is, but I’d be lying if I said it hadn’t saved me more than a few times.

On the whole, I would highly recommend that anyone struggling with remote podcast recording over the Internet try meeting up at an out-of-the-way place to record in person while outdoors and socially distant, weather permitting. The elephant in the room, of course, is how you record the episode, since dragging something like the RODECaster Pro out there isn’t feasible, even if it’s technically possible given that the device will run off of D batteries. The recordings in Orlando and at the Farm were done with Mark’s Zoom H4n Pro, and while that device did a terrific job of recording, it still left a lot to be desired with regard to the amount of editing required afterward. Without any built-in soundboard, someone (AKA Mark) has to go in and edit the recording to insert things like our intro music. To alleviate this, Mark recently got the new PodTrak P4. I won’t share any thoughts on it here, though, as that’s the topic of a future podcast episode! Stay tuned to the Same Shade Of Difference if you’re interested in our take on that particular device.

Updating PowerShellGet

It’s not too often these days that I find myself needing to update the underpinnings of PowerShell. The majority of the PowerShell work I do now is based on PowerShell Core, the current version of which is 7.0.3, and newer versions frequently just ship with updated supporting modules. PowerShell Core began with PowerShell 6 and is built on .NET Core, Microsoft’s open source, cross-platform flavor of .NET. PowerShell version 5 and earlier, known as Windows PowerShell, is the original, Windows-specific variant. Microsoft doesn’t really do any new development work on Windows PowerShell, instead opting to work on PowerShell Core and slowly make the full set of functionality available on all platforms.

This is awesome, but some systems very specifically target Windows PowerShell. That can easily happen since the interpreters even have different names: Windows PowerShell runs as powershell.exe while PowerShell Core runs as pwsh.exe, an effort to allow the two versions to co-exist on the same Windows host. As a result, systems that proxy PowerShell commands or scripts on your behalf down to a target machine, and that haven’t been updated to expect PowerShell Core, will generally target Windows PowerShell instead. This was the situation I found myself in last week.

I was attempting to load a script I had written into a monitoring platform, which then sends the script down to any number of “collector” machines in order for it to execute and do the actual data aggregation. In this case, my script failed because it called the MSAL.PS module. MSAL is the Microsoft Authentication Library, and as the name indicates, it facilitates authentication to Azure AD. It replaces the older Azure AD Authentication Library (ADAL) and is honestly much nicer to use. The module needs to be installed first, though, and while I had previously installed it on the target system under PowerShell Core, Windows PowerShell is a completely separate entity with a separate space for modules. I remoted into the system and ran the following from an administrative Windows PowerShell session to handle the installation:

Install-Module -Name MSAL.PS

Instead of joy, I got the following error message:

WARNING: The specified module 'MSAL.PS' with PowerShellGetFormatVersion '2.0' is not supported by the current version of PowerShellGet. Get the latest version of the PowerShellGet module to install this module, 'MSAL.PS'.

Ick… some things were a bit old in the Windows PowerShell installation. This was one of the rare instances where the error message didn’t tell me exactly how to fix the issue, so I did a few searches on the exact error text. The trick is that updating PowerShellGet involves not one but two steps.

While PowerShellGet is a module specifically for discovering and installing PowerShell packages from the PowerShell Gallery, it leverages Microsoft’s much more generic NuGet package manager under the hood. To get the latest version of PowerShellGet, I first had to make sure I was using the latest version of NuGet by running:

Install-PackageProvider -Name NuGet -Force

Once that completed, then I was able to successfully update PowerShellGet via:

Install-Module -Name PowerShellGet -Force

Once the update completes, the current PowerShell session will still be running the old version. I just closed PowerShell, re-launched a new administrator instance, and then successfully installed the module via the same cmdlet from earlier:

Install-Module -Name MSAL.PS

Safari 14

Last week the 10.15.7 update to macOS Catalina came with a nice surprise: Safari 14. I was caught off guard since I had assumed we wouldn’t see Safari 14 until Big Sur releases later this year. It was also a welcome surprise, since Safari has become my browser of choice, not just on my iPhone and iPad but also on my MacBook Pro. The big reason is that I do my best to avoid any Chromium-based browser. Over the last few years we’ve seen browser diversity erode more and more as new browsers are built on Chromium (e.g. Brave) while others abandon their own engines in favor of it (e.g. Opera and Edge). I see this homogeneous browsing platform as pretty bad for the Internet as a whole, as it lets web developers focus all of their effort on Chrome and ignore everything else. That leads to sites that only work in Chrome and ignore web standards, just like we saw back in the day when much of the web was developed with only Internet Explorer 6 in mind. The difference now is how the web has evolved into an entire platform. In 2004 the main issue was that sites developed just for IE 6 wouldn’t quite render properly in other browsers. In 2020, there are entire web apps that straight up won’t work in non-Chromium browsers. That’s something I can’t support.

The two major browsers moving forward with their own engines are Firefox (with Gecko) and Safari (with WebKit). I was previously using Firefox on my laptops, but I became extremely concerned recently when Mozilla had massive layoffs and shifted their mission to focus on revenue. I certainly understand that Mozilla needs to make money in order to keep making Firefox, but when a group lays off its entire incident response team, I don’t exactly feel warm and fuzzy about using the product. I still use it on my Linux installations, but on macOS I switched to Safari.

The pleasant part about switching is that, for the most part, Safari has been a very slick browser that I’ve enjoyed. While Safari 14 doesn’t do anything earth-shattering or even much different from other browsers, it does bring Apple’s offering up to parity with some of the major players. For example, Safari will now finally display favicons for websites on tabs. How they’ve made it this far without supporting them I’ll never understand, but it immediately makes a huge difference in quickly finding the tab I want… and I say this as a person who typically doesn’t have more than 10 tabs open at any given time. Tab addicts (you know who you are) will especially appreciate this when Safari starts stacking tabs on top of one another. As another update to tabs, Safari can now preview the content of a page when the mouse hovers over its tab. This can also be useful for quickly finding the appropriate tab without actually switching to anything.

The big change, though, is how Safari communicates with the user about how it has helped protect against invasive tracking. The feature is extremely similar to the Protections Dashboard in Firefox. An icon to the left of the address bar can be clicked at any time to see a breakdown of trackers on the current page, including the specifics of which trackers are being blocked:

For a bigger picture, I can also get an overall view of what’s been blocked in the past 30 days. I can see which sites were attempting to be the most invasive, and similar to the per-site rendering, each can be expanded to show which trackers they had embedded:

Similarly, I can click on the Trackers heading in order to see a list of which trackers appear the most frequently across the sites I’m visiting. I can expand those listings to see which specific sites are hosting that tracker:

I don’t think it should come as a surprise to anyone that Google, Bing, and Facebook appear the most frequently after just a short period of testing. It’s also interesting to see trackers from both Facebook and Snapchat when I don’t use either of those “services”. It really shows you how pervasive they are across the Internet.

While I can already hear the Apple-haters I know pointing out that Firefox already has this feature, it’s nice to see Apple bringing their browser up to feature parity, offering a more transparent and secure browsing experience in a package that doesn’t leverage Chromium and that has more than a skeleton crew supporting it. You still don’t see anything like this today in Chrome or Edge, likely because the companies behind them both appear relatively high up in the tracker list.