A collection of tips and tricks...

Turn your AKS (Azure Kubernetes Service) on and off!

Turning off your AKS will reduce cost, since all your nodes will be shut down.

Turning it off:

Stop-AzAksCluster -Name myAKSCluster -ResourceGroupName myResourceGroup

And then turning it back on:

Start-AzAksCluster -Name myAKSCluster -ResourceGroupName myResourceGroup

If you’re an Azure n00b like me and you get “Resource Group not found”, switch to the correct subscription using either its name or ID:

Select-AzSubscription -SubscriptionName 'Subscription Name'

or

Select-AzSubscription -SubscriptionId 'XXXX-XXXXX-XXXXXXX-XXXX'

That’s it for today!

Migrating from Authy to 1Password

I’ve previously used LastPass and Authy, but I’ve decided to switch to 1Password instead, as their app is nicer and they offer many features, like 2FA, that are not available natively in the LastPass desktop app or browser extension.

But how to migrate without having to re-setup 2FA on every site?

After trying out some JavaScript browser hacks without much luck, I found a program written in Go that uses Authy’s device feature to get access to the TOTP secrets. It works like a charm, as they say!

Here’s a link to the program:


Recursive unrar into each folder…

I’m not sure why this was so hard to find, but now it’s working… I initially tried using find with -exec UnRAR, but it didn’t work too well (I couldn’t find a good one-liner to split a path into its directory and full filename inside the find -exec command; if you have one, let me know).

find "$PWD" -type f -name '*.rar' -print0 |
while IFS= read -r -d '' file; do
    dir=$(dirname "$file")
    unrar e -o+ "$file" "$dir"/
done

(some parts of the script were inspired by other online sources)
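For the record, there is also a pure find variant that avoids splitting the path at all: the -execdir action runs the command from the directory containing each match. A small sketch — the echo stands in for unrar, which I can’t assume is installed:

```shell
# -execdir runs the command with the matched file's directory as the
# working directory, so no dirname splitting is needed.
# The real extraction would be:
#   find . -type f -name '*.rar' -execdir unrar e -o+ '{}' \;
demo=$(mktemp -d)
mkdir -p "$demo/sub"
touch "$demo/sub/archive.rar"
find "$demo" -type f -name '*.rar' -execdir sh -c 'echo "extracting in $PWD"' \;
```

Note that GNU find refuses -execdir if your PATH contains a relative entry like “.”, which is a safety feature rather than a bug.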

That oneliner…

Once again, this is just so that I don’t forget 🙂

apt -y update && apt -y upgrade && apt -y dist-upgrade && apt -y autoremove && apt -y clean && apt -y purge && reboot

Because you just want to keep your system updated… I’m sure some of the commands are redundant, but hey, it works!

Manipulating “date added” in the Plex Media Server Database

So I noticed a quite annoying thing on my Plex server: some items were always marked as “recently added”. As usual, I went online and searched for a solution.

Some posts suggested wiping the library and starting over, while others suggested that doing so doesn’t fix the problem, as the “bug” seems to be that when an initial scan is done, Plex will sometimes set a “date in the future” as the “added date”.

So I got curious. I started looking at the XML file of the items in question, and just as described in one of the posts, the year of the added date was 2037, a fair bit into the future.

I started looking for solutions to the problem and trying to find someone who had been able to fix it, when I stumbled upon a reddit post with a SQL-script.

Reddit Post by bauerknight

So I started by downloading the Plex SQLite3 Databases from my Plex Server (making sure to make backups of the original files…), I then downloaded a SQLite3 Database Tool to my computer and started exploring the database structure.

After looking around, it seemed like the 4 year old Script found on Reddit would do what I wanted, so I modified it to work on my Library and ran it.

After that I uploaded the DB-files to my Plex Media Server and started it, problem solved!

Reddit article: bauerknight’s post

SQL Oneliner for future use:

UPDATE metadata_items SET added_at = originally_available_at WHERE library_section_id = '5' AND added_at >= '2020-08-29 00:00:00';
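If you want to dry-run the UPDATE before touching the real database (the main library file is usually com.plexapp.plugins.library.db), here is a minimal sketch against a scratch SQLite database; the table is reduced to just the columns the statement touches:

```shell
# Build a scratch DB mimicking the relevant columns of metadata_items,
# insert an item with a bogus year-2037 added_at, then apply the fix.
db=$(mktemp)
sqlite3 "$db" <<'SQL'
CREATE TABLE metadata_items (id INTEGER, library_section_id TEXT,
                             added_at TEXT, originally_available_at TEXT);
INSERT INTO metadata_items VALUES (1, '5', '2037-01-01 00:00:00', '2019-06-15 00:00:00');
UPDATE metadata_items SET added_at = originally_available_at
 WHERE library_section_id = '5' AND added_at >= '2020-08-29 00:00:00';
SELECT added_at FROM metadata_items WHERE id = 1;
SQL
```

The SELECT should now return the release date instead of the year-2037 timestamp. Run the real thing on a copy of the database, with Plex stopped.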

KVM VM Image Crash

So, something bad happened! My Virtual Machine running on my NAS crashed.

And it looked like this…

2020-05-08T19:51:09.582896Z qemu-system-x86_64: -drive file=/share/Storage/VM/Windows 10 Enterprise/Windows 10 Enterprise.img,format=qcow2,if=none,id=drive-virtio-disk0,cache=writeback: qcow2: Image is corrupt; cannot be opened read/write

So I did what I normally do: I started googling, and found a bunch of old articles saying that I had to load the “nbd” kernel module and then run “ddrescue”. Most of the articles I found pointed to one very sad solution: scrap your VM and reinstall, because you’re never getting your data back.

Anyway, all of that seemed pretty old (and sad), and I thought there must be a better way. And guess what, there was!

First I ran a command called “qemu-img” with the subcommand “check”, and it looked like this:

./qemu-img check /share/Storage/VM/Windows\ 10\ Enterprise/Windows\ 10\ Enterprise.img

When that was done it gave this:

<WALL OF TEXT> (list of all errors in the Image), and then a Summary:

2047 errors were found on the image.
Data may be corrupted, or further writes to the image may corrupt it.
17593 leaked clusters were found on the image.
This means waste of disk space, but no harm to data.
802489/4096000 = 19.59% allocated, 4.83% fragmented, 0.00% compressed clusters
Image end offset: 53754200064

I liked the part about “no harm to data”. So I ran the second command that I had found, which was:

./qemu-img check -r all /share/Storage/VM/Windows\ 10\ Enterprise/Windows\ 10\ Enterprise.img

And after that command finished:

The following inconsistencies were found and repaired:
17593 leaked clusters
1024 corruptions
Double checking the fixed image now…
No errors were found on the image.
802489/4096000 = 19.59% allocated, 4.83% fragmented, 0.00% compressed clusters
Image end offset: 53754331136

And that solved it!

Lesson learned: Scheduled Backup of your VM = good thing.

Remove files with Find

Try to remember that the syntax to remove files recursively using Find is:

find . -name "Thumbs.db" -exec rm '{}' \;
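find also has a built-in -delete action, which skips spawning an rm process per file. Just make sure the -name test comes before -delete, or find will happily delete everything it visits:

```shell
# Same effect as -exec rm '{}' \; but handled by find itself.
# Order matters: the -name test must come before -delete.
find . -name "Thumbs.db" -delete
```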

HDMI + MacBook = No Network

I’ve always been one of those guys saying “why do you need a fancy HDMI cable? It’s a digital signal: either it works or it doesn’t, and the signal won’t be affected by interference in the same way as an analog signal”.

While this is true, a digital connection like HDMI comes with other challenges, like versioning. It turns out (I was aware of this, but have not paid much attention to it in the past) that HDMI has several different versions that each introduce different features, and even though the cable and connector look the same on all products, the actual result of the connection might vary and/or bring different problems with it if there is a mismatch.

I recently experienced this with my new monitor. When I connected it via my HDMI dock adapter to my MacBook Pro, all networking went bananas. I tried turning the monitor off, thinking I had received a unit that was interfering with the Wi-Fi connection on my computer (bad ESD protection or something, was my first thought). But after some research and tinkering with different cables and input devices, I figured out that it was actually the HDMI cable causing the problem.

With a newer HDMI cable (and not one of the many old ones I’ve gotten for free with various purchases over the years), everything worked fine!

So if your network connection or other things start acting up after you connect something new with an HDMI cable, you might be experiencing the same thing!

What I really think could be improved here, as a consumer, is some kind of error messaging via a fallback to a lower version: “Ethernet over HDMI malfunction, because cable does not support HDMI v. 1.4”.

AWS Glacier!

Hard drives will fail!

Hard drive manufacturers like to boast about “MTBF” (Mean Time Between Failures) and “AFR” (Annualized Failure Rate) figures suggesting that a drive should last up to 300,000 hours (~34 years), or that its likelihood of failing during a given year is as low as 0.8%.

But real-life tests performed by companies like BackBlaze show that disk drives can fail much sooner: they actually saw a 5.1% AFR during the first 1.5 years, 1.4% from 1.5 to 3 years, and a staggering 11.8% between the third and fourth year. Their study also shows that after 4 years only 80% of the original disk drives were still “alive”, meaning that 20% of the drives had failed after 4 years.
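As a quick sanity check on those numbers (my reading: roughly 1.5 years at 5.1% AFR, 1.5 years at 1.4%, then one year at 11.8%), multiplying the per-period survival rates lands right around BackBlaze’s 80% figure:

```shell
# Survival after ~4 years = product of the per-period survival rates:
# (1-AFR)^years for each period.
awk 'BEGIN { printf "%.1f%%\n", 100 * (1-0.051)^1.5 * (1-0.014)^1.5 * (1-0.118) }'
# prints 79.8%
```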

So, why does this matter?

I have videos, photos and important documents that I need to back up and in some cases archive. Archiving means “removing the local copy and storing the item in a secure place” (think of bookkeeping, where you are required to archive the books, but not to keep multiple copies in case of a fire), whereas a backup is a second copy of the object in another location in case of a fire. We’re talking wedding photos, wedding videos, photos of my son growing up, etc. that, in the case of an unforeseen catastrophic event in my home, could potentially be lost forever!

Companies like BackBlaze offer products for backing up your local machine (and also files stored on a server, as in my case), but I opted for another setup using Amazon Web Services S3 Glacier!

AWS Glacier

So what is S3 Glacier? Glacier is an archiving service where you can upload all your data and pay a very small service fee per month for Amazon to keep it safe. Amazon S3 Glacier is designed to provide an average annual durability of 99.999999999% for an archive. The service redundantly stores data in multiple facilities and on multiple devices within each facility. To increase durability, Amazon S3 Glacier synchronously stores your data across multiple facilities before returning SUCCESS on uploading archives. S3 Glacier performs regular, systematic data integrity checks and is built to be automatically self-healing.

My setup consists of software running on my local NAS that monitors specific folders where I can put important files and documents; these are then uploaded to Amazon S3 Glacier.

Putting files into Glacier is actually quite easy and the price for storing your data in Glacier is very modest, it’s when you want to retrieve the files that you might need to pay (it also depends on how much data you want to retrieve and how fast you need your files).

As there is no GUI for exploring AWS Glacier vaults (where the files are stored), I found a really great application online called Freeze that makes life much easier for anyone who wants to browse their files and retrieve them from a UI.

So, are you in the market for a simple, easy-to-use and very secure way of storing files, documents, pictures, videos, etc. of your loved ones, and do you want to be sure that even if your house burns down you will still have access to them? Try out AWS Glacier with the Freeze app!

(This is not a paid advertisement; BackBlaze, AWS and the Freeze app have not been consulted and/or informed about this blog post)

Best Screen Capture for macOS

The best tool out there that I have found is Skitch.

What’s great about Skitch is that it’s awesome for capturing screenshots, annotating them right in the app, and then just copy-pasting the annotated screenshot into any other application, be it Skype for Business, e-mail or Microsoft Teams. It just works, and what’s even better, it includes free synchronisation of screen snaps to Evernote.

So if you’re ever looking for a great screen-snap/screenshot utility for macOS, look no further!

(This is not a paid advertisement)