Why Managing Transaction Logs in SQL Server Is Important

This past week I had a situation arise where the SQL Server transaction logs for a particular site grew out of control. This ended up consuming an entire disk volume for the log files, which eventually caused the SQL Server to stop processing and, by extension, caused application downtime. This is a common cause of unplanned downtime for an application when the transaction log is not managed correctly. So just what is the transaction log in SQL Server for, and what do you need to know about it?

What is the transaction log for?

In SQL Server each database consists of two types of files: data files and log files. Each database may have one of each, or multiple data and/or log files.

The data file contains the tables, views, functions, indexes, stored procedures and so on: basically all the objects in the database. The primary data file normally has the extension .mdf, while any secondary data files use .ndf.

The log file on the other hand contains a record of transactions that have occurred in the database. The transaction log is important to consider for many reasons including performance, data integrity, disaster recovery and high availability. Log files normally have the extension .ldf (the .ndf extension belongs to secondary data files, not logs).

Note: a common myth is that in SIMPLE recovery mode the transaction log is switched off. All operations are recorded in the transaction log whether the database is in FULL or SIMPLE mode. The difference is that in SIMPLE recovery the log is truncated automatically at checkpoints once transactions have been committed, whereas in FULL it is only truncated by a transaction log backup.
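
If you want to check which recovery model each of your databases is actually using, a quick query against sys.databases will tell you:

--Check the recovery model of every database on the instance
SELECT name, recovery_model_desc
FROM sys.databases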

Why is it important to manage the transaction log?

The most important answer is that the transaction log forms part of the disaster recovery plan of any SQL Server database. If a disaster occurs then backups of the transaction logs will be required to restore to the last known good point in time.

An immediate unwanted effect of uncontrolled transaction logging is that the log file will grow continuously if autogrowth is enabled. The risk is that it eventually consumes the entire disk volume the transaction log occupies. If SQL Server cannot grow the log file of a database then the transaction it is trying to process will fail.
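
If you suspect a log is growing out of control, a couple of quick checks will show how full each log file is and what, if anything, is preventing it from truncating:

--How large is each transaction log and how much of it is in use?
DBCC SQLPERF (LOGSPACE)

--If a log will not truncate, this shows the reason SQL Server is waiting
SELECT name, log_reuse_wait_desc
FROM sys.databases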

Why and when should I use SIMPLE or FULL transaction logging?

When deciding on the most appropriate recovery model, the decision should be focused on the recovery requirements and the acceptable data loss window.

For example, if you have a live ERP system you should consider FULL recovery so that you are able to restore the database using transaction log backups if a failure occurs after a full or differential backup of the database has been taken.
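
For illustration, the backup routine for a database in FULL recovery might look something like this; the database name, paths and frequencies are purely illustrative:

--Periodic full backup (e.g. nightly)
BACKUP DATABASE [ExampleDatabase]
TO DISK = N'D:\SQLBackups\ExampleDatabase_Full.bak'
WITH INIT, STATS = 10

--Regular transaction log backups in between (e.g. every 15 minutes)
BACKUP LOG [ExampleDatabase]
TO DISK = N'D:\SQLBackups\ExampleDatabase_Log.trn'
WITH STATS = 10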

On the other hand, if you have copied the same database into a temporary test system, and you are not using that test environment with the intention that it should be a perfect replica of live, then consider using the SIMPLE recovery model.

Generally use SIMPLE:

  • Test/training databases.
  • Data warehouses.
  • Settings databases (e.g. lists of usernames, application configuration where the data does not change frequently).

Generally use FULL:

  • Production databases for live OLTP applications.
  • UAT and pre-production systems where there is a requirement to mimic the live environment configuration exactly for proving purposes.

You should also consider switching a production system with databases in FULL recovery mode to SIMPLE on a temporary basis whilst you are performing an application upgrade or patch which updates the database. This avoids the transaction log growing out of control during the operation. After finishing the upgrade the recovery model should be switched back to FULL.

Footnote: Switching between recovery models.

  1. Stop all dependent applications for the databases.
  2. Take a full backup of the databases you are switching the recovery model for. Safety first.
  3. Run the following T-SQL for each database, choosing either FULL or SIMPLE in the second statement:

--Set the database to single user mode so that further transactions are stopped.
ALTER DATABASE [ExampleDatabase] SET SINGLE_USER WITH ROLLBACK IMMEDIATE

--Switch the recovery model (choose FULL or SIMPLE)
ALTER DATABASE [ExampleDatabase] SET RECOVERY SIMPLE

--Finally set the database back to multiple user mode.
ALTER DATABASE [ExampleDatabase] SET MULTI_USER

You can then start any application services that depend on these databases.
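
One extra point worth noting: switching a database out of FULL recovery and back again breaks the transaction log backup chain, so once you have returned to FULL you should take a full (or differential) backup before resuming log backups. A minimal example, with an illustrative path:

--Re-establish the log backup chain after switching back to FULL
BACKUP DATABASE [ExampleDatabase]
TO DISK = N'D:\SQLBackups\ExampleDatabase_PostUpgrade_Full.bak'
WITH INIT, STATS = 10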

Now That We’ve Got a Cumulative Update…Updating SQL Server on Linux

A few weeks ago I blogged about installing SQL Server 2022 on Linux. It just so happens that Cumulative Update 8 for SQL Server 2022 came out this week. This gives us the perfect opportunity to talk about applying those all-important CUs to the SQL Server we built a couple of weeks back.

As discussed, applying SQL Server Cumulative Updates on Linux works differently to Windows. On Windows you generally either get CUs by manually downloading the update via Microsoft's KB article for that particular update, or accidentally let one install via Windows Update if there's a security fix. On Linux you generally get updates via the repositories you have configured for your server.

Applying Cumulative Updates for SQL Server (of any supported version) ensures that you have the latest fixes for the SQL Server platform. Each update is certified to the same degree as a service pack used to be, and Microsoft generally recommends you keep up to date with their installation. That being said, you should still test carefully in UAT before applying to a live system.

The first thing you need to do of course is backup your SQL Server. Something could go wrong so you must make sure you have a rollback plan in case that does happen. This could be: taking a snapshot of the server’s VM in your hypervisor, performing a full backup of all the system and user databases, initiating a full server backup using your favourite backup agent or something completely different.

Once you have a test plan in place which has been approved plus your backups you are now ready to install the latest Cumulative Update.

Step 1: Check to see if you’ve got the correct apt source in your repos:

sudo apt edit-sources

You will then get a choice of editors (1-4). I chose nano, which is option #1, at which point you'll see nano appear. Scroll right down and find your sources.

If you don’t see Microsoft’s sources for SQL Server in there it’s not all bad news as Microsoft have made a guide on how to resolve that problem.

Once you are done reviewing the sources list press CTRL + X to close nano.

Step 2: Run apt update to fetch the current package list from the repositories:

sudo apt-get update

This takes a few minutes and for my server resulted in a 14 MB download. This command is vital to ensure that the next step runs properly; otherwise you'll be working from an outdated package list.
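
While you're at it, you can also see which mssql-server version the repository is now offering compared to what's installed; handy for confirming the CU you're expecting is actually there:

# Show the installed version and the candidate version available from the repo
apt-cache policy mssql-server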

Step 3: Perform an upgrade of mssql-server using apt:

# To update everything on the system at once
sudo apt-get upgrade

# To upgrade MS SQL Server only
sudo apt-get install --only-upgrade mssql-server

This will then list out all the packages that need upgrading, based on what's installed on your server versus what's available in the repository. If you decided to upgrade everything, one of those packages should be…yes, you guessed it…mssql-server. Cumulative Update 8 is approximately 268 MB to download:

You are of course going to answer ‘Y’ to this question. Or press enter (note that the “Y” is capitalised. This means that it’s the default answer if you smack return).

This will then run through all the updates to go through. Highlighted here is apt setting up the mssql-server package version 16.0.4075.1-1 which is indeed Cumulative Update 8.

Step 4: Verify the mssql-server service is alive:

systemctl status mssql-server --no-pager

As you can see we’re onto a winner:

At this point it’s also probably a good idea to open SQL Server Management Studio or Azure Data Studio to check you have a working connection. To double check you have Cumulative Update 8 you could also execute the following command in whatever SQL query tool you are using:

SELECT @@VERSION AS VersionString
Microsoft SQL Server 2022 (RTM-CU8) (KB5029666) - 16.0.4075.1 (X64)  	Aug 23 2023 14:04:50  	Copyright (C) 2022 Microsoft Corporation 	Developer Edition (64-bit) on Linux (Ubuntu 20.04.6 LTS) <X64>

At this point you should start performing your acceptance tests on your databases and applications. Aim to verify that the usual business and system processes are working before declaring a success!

But What If I Wanted A Specific Version of SQL Server?

Whilst it's usually recommended that a new SQL Server instance goes through a UAT phase with whatever application(s) will be running against it, using the latest CU available at the time, there could be situations where a specific version of SQL Server is required. Whatever the reason, in order to do this run the following commands:

sudo apt-get install mssql-server=<version_number>
sudo systemctl start mssql-server

Where <version_number> is the version string you need. For example Cumulative Update 7 is 16.0.4065.3-4. To find the version number you need consult the release notes for SQL Server 2022 on Linux.
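
If you do pin to a specific version and want apt to stop offering to upgrade it during a general upgrade, you can put the package on hold until you're ready to move on:

# Hold mssql-server at its current version...
sudo apt-mark hold mssql-server

# ...and release the hold when you're ready to take updates again
sudo apt-mark unhold mssql-server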

TLS on Windows is Dead. Long Live TLS on Windows But Also Avoid Losing Connectivity to SQL Server.

Microsoft have announced this week that future versions of Windows will disable TLS (Transport Layer Security) 1.0 and 1.1 by default. These ageing cryptographic protocols are designed to secure traffic over a network. The move is a bid to improve the security posture in Windows by ensuring that only newer versions of TLS are used between client and server applications.

TLS 1.0 and 1.1 were standardised in 1999(!) and 2006 respectively, and both were deprecated in 2021 via RFC 8996. Although Microsoft claims that no known unpatched exploits exist in its Schannel implementation, newer versions of TLS offer much better security. A number of bodies have mandated that these older versions be avoided; the Payment Card Industry (PCI), for example, has deprecated their use since 2018. There are a number of security flaws in both TLS 1.0 and 1.1, which means we can no longer rely on them for securing traffic.

In addition, all major browsers have dropped support for anything prior to TLS 1.2 since 2020. As with all things in computing security it's best to be ahead rather than behind. There shouldn't be any browsers or OSes out there that are still supported and can't use at least TLS 1.2. I fully recommend keeping ahead of developments and planning accordingly to drop anything prior to TLS 1.2.

SQL Server and Applications Impacted

Although Microsoft believe, via their telemetry, that usage of deprecated versions of TLS is low, it would be wrong to simply assume that you can turn off TLS 1.0/1.1 and call it job done.

If you aren't sure how this will impact your business, it's time to start with a review of your applications and how they will be affected. Soon there will be Windows desktops out there that simply don't support older versions of TLS out of the box. Whilst Microsoft have stated that you can re-enable TLS 1.0 and 1.1 via the Schannel registry keys in the meantime, you really shouldn't bother doing so. There's a reason things move on. Microsoft will at some point do the right thing and completely remove deprecated versions of TLS from the operating system, and putting off the problem won't solve anything long-term.

Possibly the most direct way this affects SQL Server based applications is indeed the front-end. Many applications now work via a web UI rather than a Windows application. This is perhaps where your investigations should start.

For internet facing applications you could run a test via Qualys which will produce a useful report on how your server is configured. Scroll down and you’ll see the projected impact regarding client browsers and OSes with what versions of TLS they might use.

Yes I do run this test against this blog.

If your applications are internal only, it's not wise to assume that your wires and airwaves are safe even if you own them. For these you can check the Schannel registry keys at the following location:

HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\

You can check individual protocols at this location to see if they are enabled or disabled.

Getting down to the SQL Server level, things get more interesting. Support for TLS 1.2 exists out of the box as of SQL Server 2016, so those versions are good to go.

For SQL Server 2014, if you have it configured with encrypted connections, this version needs a cumulative update applied before it will support TLS 1.2. By now you really, really should have applied an update beyond the version where this was introduced anyway, but for any new deployments of SQL Server 2014 instances, remember to apply CUs after you've done the install.

For SQL Server 2008, 2008 R2 and 2012, things are arguably beyond the point as those releases are no longer supported. You can get a hotfix to apply to those too, but unless this is for an application that's segregated away in some corner of the network for legacy purposes, you've either got bigger things to worry about or another good reason to upgrade if it's still a production system.

Potentially you'll also need to update the client driver if your applications use the SQL Server Native Client driver. Check with the application vendor for the system requirements on this front.
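
While you're reviewing TLS it's also worth checking whether connections to your SQL Server instances are encrypted at all. This DMV query won't tell you the TLS version in use, but it does show the encryption state and authentication scheme of each current connection, which is a useful starting point:

--Which current connections are encrypted, and how did they authenticate?
SELECT session_id, encrypt_option, auth_scheme, client_net_address
FROM sys.dm_exec_connections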

Weekend Project: SQL Server 2022 on Ubuntu

There's something I'll have to admit to: I don't (didn't) have a SQL Server test instance to play with. Gasp! I broke apart my Windows desktop earlier this year, sold all the parts, and then remembered: how am I supposed to play with SQL Server without it? Well, it did save some desk space…

This weekend it came to the top of my task list to build another so for this brief tutorial I’m going to show you how to install SQL Server 2022 onto an Ubuntu Server. Yes that’s right; no Windows involved.

Pre-requisites

  • A fully set up Ubuntu Server 20.04 running on either a spare PC/server, virtual machine, cloud service, basically somewhere. Hell, they even run Doom on pregnancy tests these days so I'm sure you can find something to run it on.
  • SQL Server 2022 requires a minimum of 2 GB of system memory, so spec/configure as appropriate (see the quick check below).
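
If you want to sanity-check the memory on the box before you start (not an official step, just a quick look), free will tell you what's available:

# Confirm the server has at least 2 GB of memory available for SQL Server 2022
free -h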

Once you have your shiny new Ubuntu Server you should then use SSH to connect into the environment and let the fun begin.

Step 1: Import the GPG keys for the Microsoft repository. This means that you can trust the repo that you’ll download SQL Server 2022 from:

curl https://packages.microsoft.com/keys/microsoft.asc | sudo tee /etc/apt/trusted.gpg.d/microsoft.asc

Step 2: Register the Microsoft repository with apt. This means that you add the address of the Microsoft repository into apt’s list it uses to check for and download software & updates. This is where you’ll actually download SQL Server 2022 from:

sudo add-apt-repository "$(wget -qO- https://packages.microsoft.com/config/ubuntu/20.04/mssql-server-2022.list)"

Step 3: Update apt’s list of available updates then instruct apt to install SQL Server 2022. Note the -y switch after the install argument means that you are telling apt to do all this automatically i.e.: yes to all prompts. This will trigger a download around 1.3 GB in total so warm up that internet connection:

sudo apt-get update
sudo apt-get install -y mssql-server

Step 4: Run mssql-conf setup to configure your newly installed SQL Server 2022 instance:

sudo /opt/mssql/bin/mssql-conf setup

  1. You'll first get asked for the edition. I chose number 2 for Developer as I want to use this build as a test/training server. Whatever edition you choose, do make sure you are using an edition you are appropriately licenced for. You've been warned.
  2. Following choosing an edition you'll get asked to accept the licence terms. I know you'll read them fully and very carefully, but make sure you type "Yes" to accept.
  3. Next will be the language. I chose 1 for English, but choose as you require.
  4. After that you'll get prompted to specify an sa password. Make sure you choose something secure and record it securely, especially if this is production.
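
As an aside, if you ever want to script this step rather than answer the prompts interactively, mssql-conf can also be driven by environment variables for an unattended setup. Something along these lines should work, although I'm writing this from memory so do check the current Microsoft documentation for the exact variable names and switches:

# Unattended setup sketch: edition, EULA acceptance and sa password supplied up front
sudo MSSQL_PID='Developer' ACCEPT_EULA='Y' MSSQL_SA_PASSWORD='<YourStrongPassword>' \
  /opt/mssql/bin/mssql-conf -n setup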

Step 5: Confirm that the service is running. It would be most definitely disappointing if it weren’t:

systemctl status mssql-server --no-pager

You’ll get something resembling this which confirms that SQL Server 2022 is running:

If you're new to SQL Server on Linux then something to be aware of is that you can't run SQL Server Management Studio (SSMS) on Linux. As an alternative you can try Azure Data Studio, which has the Admin Pack for SQL Server extension pack available; that includes tools for Profiler, SQL Server Agent, Import and working with .dacpac files.
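
If you'd rather run queries from the server itself, you can also install Microsoft's command line tools (sqlcmd and bcp). The sketch below reflects how I understand the packaging works for Ubuntu 20.04, so double-check the repository path against Microsoft's documentation for your release:

# Register Microsoft's production package repository (Ubuntu 20.04 shown)
curl https://packages.microsoft.com/config/ubuntu/20.04/prod.list | sudo tee /etc/apt/sources.list.d/msprod.list
sudo apt-get update

# Install sqlcmd and bcp (you'll be prompted to accept the licence terms)
sudo apt-get install mssql-tools unixodbc-dev

# Connect to the local instance as sa
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa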

Once you are connected to the SQL Server instance, one thing to note is that if you run the T-SQL command SELECT @@VERSION you'll get something like this:

Microsoft SQL Server 2022 (RTM-CU7) (KB5028743) - 16.0.4065.3 (X64)  	Jul 25 2023 18:03:43  	Copyright (C) 2022 Microsoft Corporation 	Developer Edition (64-bit) on Linux (Ubuntu 20.04.6 LTS) <X64>

Specifically, the point here is that Cumulative Updates (CUs) for SQL Server on Linux are handled differently than on Windows. On Ubuntu Linux, to take an example, CUs are delivered using apt. This is unlike Windows, where you have to download and install the cumulative update that you want, or accidentally let Windows Update install one if there's a security fix that needs applying. It also means that the release you get by default from the repository is the latest one.

So, congratulations. You’ve made it this far. If you don’t like reading this blog (why?) then you can always follow Microsoft’s direct instructions on the setup above but then you’ll have to admit I made you scroll all the way down here to find this link.

Bonus Round! Restoring AdventureWorks Sample Data

Now you have your newly installed Ubuntu Server with an SQL Server 2022 instance you’re going to need some test data in there (unless you’re straight into production in which case bon voyage!).

To do this you can use wget to bring in the SQL Server 2022 edition of the venerable AdventureWorks databases.

# OLTP
wget https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks/AdventureWorks2022.bak

# Data Warehouse
wget https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks/AdventureWorksDW2022.bak

# Lightweight
wget https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks/AdventureWorksLT2022.bak

This should download the databases into your Home directory if you haven’t done a cd out of there.
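
One thing to watch before restoring: SQL Server on Linux runs under the mssql user, which won't necessarily have permission to read files sitting in your home directory. Copying the backups somewhere the service account can read (and pointing the FROM DISK paths below at that location instead) avoids an access denied error. For example:

# Move the backups somewhere the mssql service account can read them
sudo mkdir -p /var/opt/mssql/backup
sudo cp ~/AdventureWorks*.bak /var/opt/mssql/backup/
sudo chown mssql:mssql /var/opt/mssql/backup/*.bak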

You can then use Azure Data Studio to restore the databases if you have preview features turned on. If you don’t then you can restore using the following T-SQL:

--OLTP
USE [master]
RESTORE DATABASE [AdventureWorks2022]
FROM DISK = N'/home/<user>/AdventureWorks2022.bak'
WITH FILE = 1,
MOVE N'AdventureWorks2022' TO N'/var/opt/mssql/data/AdventureWorks2022.mdf',
MOVE N'AdventureWorks2022_log' TO N'/var/opt/mssql/data/AdventureWorks2022_log.ldf',
NOUNLOAD, STATS = 10

--Data Warehouse
USE [master]
RESTORE DATABASE [AdventureWorksDW2022]
FROM DISK = N'/home/<user>/AdventureWorksDW2022.bak'
WITH FILE = 1,
MOVE N'AdventureWorksDW2022' TO N'/var/opt/mssql/data/AdventureWorksDW2022.mdf',
MOVE N'AdventureWorksDW2022_log' TO N'/var/opt/mssql/data/AdventureWorksDW2022_log.ldf',
NOUNLOAD, STATS = 10

--Lightweight
USE [master]
RESTORE DATABASE [AdventureWorksLT2022]
FROM DISK = N'/home/<user>/AdventureWorksLT2022.bak'
WITH FILE = 1,
MOVE N'AdventureWorksLT2022_Data' TO N'/var/opt/mssql/data/AdventureWorksLT2022.mdf',
MOVE N'AdventureWorksLT2022_Log' TO N'/var/opt/mssql/data/AdventureWorksLT2022_log.ldf',
NOUNLOAD, STATS = 10
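
Once the restores complete, a quick check confirms the new databases are present and online:

--Confirm the AdventureWorks databases restored successfully
SELECT name, state_desc
FROM sys.databases
WHERE name LIKE 'AdventureWorks%'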

Last Week in Tech…

Two things caught my attention in the IT world this week.

The first was an unfortunate and catastrophic incident involving Danish cloud provider CloudNordic. The provider told customers this week that, following a ransomware attack, all their data was effectively lost and that it was not prepared to, nor had the means to, negotiate with the attackers. The firm believes the attack happened during a move from one data centre to another, as the unknowingly infected servers were brought onto the same network as their administration systems. Thankfully it appears that no data was taken by the attackers; however, all the backups the company retained were also encrypted.

This isn't and shouldn't be a "told you so" moment or an argument against going to the cloud, but it does serve as a reminder that even though your systems are in the cloud you really need to have a complete and tested backup of critical data. If a situation like this arises and the provider is totally compromised, you may be left in a disaster recovery situation you can't fully recover from. It doesn't matter if you are a customer of a hyperscaler like Microsoft or AWS; whilst your data ought to be safe on their servers, it still doesn't negate the need for your own off-site backups.

I really do feel sorry for the engineers at CloudNordic right now and wish them the best for the recovery effort.

Second was a piece in ISP Review reporting that Ofcom has decided it is time to consult on the use of "Fibre" in marketing for broadband services in the UK which quite clearly aren't fully fibre optic. We're talking mainly about ISPs reselling Openreach-based VDSL2, which is Fibre to the Cabinet (FTTC) with the country's expansive copper telephone cables used for the "last mile" to the property, and Virgin Media, who use coaxial cables in a similar fashion.

Does anybody remember the halcyon days of the late 00s to late 10s, when ISPs marketing broadband were getting away with vague terms such as "Meg" to describe download speeds, "up to" for ADSL2+ technologies that couldn't possibly offer the 24 Mbps speeds advertised and (a personal favourite, this one) advertising services as "unlimited" whilst burying an actual limit in the so-called Fair Use Policy (or "FUP")?

Yes. This would be the next battle in that ongoing saga.

It really should be a no-no for confusing terms like all of the aforementioned to creep into marketing materials. Advertising a product as "Fibre" (a technology whose attainable speed is theoretically limitless) despite copper being used in the delivery just means the consumer is receiving a service that is not strictly as described.

It's important for the UK economy that we have access to fast, reliable and affordable broadband services, and advertising them fairly, which means accurately and correctly, is the first step in the uptake process for the end user. There shouldn't need to be a term like "full-fibre" to differentiate FTTC (Fibre to the Cabinet) from FTTP/B (Fibre to the Premises/Building), for example.

WFH…Is It Over?

This week video conferencing specialists Zoom announced that they expect employees back in the office for at least some of the week, at least if you are working within "commutable distance" of their offices.

It's not just Zoom (which, strangely enough, was the weapon of choice when we were all sent home for the COVID-19 pandemic); the likes of Amazon have decided that they will track and penalise employees working from home too much. And it's not just them either. Over the past year BT Group, Apple and Twitter/X have also decided on some form of return to the office.

So what’s the big fuss? 

Working from home was a necessity during the pandemic for obvious reasons, but now that COVID is in some kind of endemic phase it's not really a barrier to bringing people back to the office any more. Top-level management are seemingly intent on using that office space they've spent the cash on and reaping the benefits of bringing teams back together. Or so the theory goes, right?

I read different articles and different studies on the effects of working from home. Some in favour, some against. Personally I think it's very much down to the individual and their circumstances. I do work from home but I don't really like doing it, as I feel there should be a separation of work and home space, and also because interacting with people face-to-face is critically important. I do live alone, so that's a big reason for the aforementioned, mind.

There are MANY different benefits to working from home policies: environmental benefits (fewer cars on the road and/or stuck on the M62 might be nice for some), parental responsibilities, focus away from busy environments (who on earth likes working in open offices?!) and countless others.

Despite the fact that I personally wouldn't choose to work from home, I would still discourage an employer from withdrawing the facility. The fact is that it's outdated to think that computerised work can only take place in one location and that location must be the company offices. Not everybody wants to work in an office and be closely supervised on work they can do easily at home. Just why am I working this hard so that employers can decide working from a beach bar is a bad idea?

This Is One Of Those Installation Reports

Someone's been doing a lot of digging around my area. It was a bit of a nuisance: dust went everywhere and traffic was a nightmare for a few weeks. Anyway, the digging has ceased and so has my complaining, at least until the next crew decide they want a go at the fibre market. Yes, this was the team from CityFibre laying down some sweet fibre optics.

A couple of weeks back I got a leaflet through from an ISP offering speeds I had previously only been able to dream of. The time had arrived and the fibre was lit. I instead phoned my incumbent ISP, Zen Internet, and placed an order.

I thought carefully about what my needs were: I live by myself, I don’t stream or play video games lots any more and running Speedtest.net all the time isn’t a legitimate application of the line. This isn’t going to impress the ladies is it? I picked up the phone with a sensible 100 Mbps in mind. When I was asked I came out with 900 Mbps which will set me back £43/month (inc VAT). Well why not.

How Now Brown…Box

The installation appointment was booked for a week and a day ahead, with the router getting dispatched later that day. When the day arrived, the installation was attended by two engineers and took around an hour. I would've been instantly online at that point, but unfortunately I got the VLAN setting wrong.

One of the crew got to work on the outside. This involved taking a cable originating from a CityFibre pavement cover on the street underneath my garden wall, digging up a bit of my garden and drilling under the pavement slabs to track a brown fibre cable through to finally join the first box outside my property by the front door.

The second crew member got to work internally. The outside box neatly lined up with a point indoors where the fibre cable appears. I opted for the cable to be tracked around a door frame and along a skirting board so that the Calix GigaPoint 801Gv2 Optical Network Terminal (ONT) could be mounted to the wall behind a sideboard I put all my tech on.

Installation was OK. During the civils construction phase the workforce turned up unannounced, and I had an incident where a worker started cutting the pavement right in front of my new car, which did not leave me impressed. As for the install to my home, there are some grubby marks and hammer damage on the walls. That is to say, overall, that the street and home installations together haven't been the greatest experience.

CityFibre ONT
The ONT ont’ wall. I’ve been waiting to make that joke for a while.

Zen supplied an AVM FRITZ!Box 7530 AX which conveniently replaced an…AVM FRITZ!Box 7530 (there was an additional “AX” moniker in the former if you didn’t catch on). As these two routers have practically the same configuration it was possible to export the config file from the 7530 and import it directly to the 7530 AX.

Going from 19 Mbps to around 900 Mbps is a bit of a shift in thinking about how you actually use the internet. Everything gets heavier as we go along, because the speed of your internet connection is arguably an enabler of what can be done next, as opposed to an augmentation of what can be done right now. Just over three years ago nobody thought working from home would be the norm, or that you would need a connection the whole family could share for working, playing, surfing and communicating in HD and beyond, all at the same time and in real time.

I am very much glad I've left the copper era behind. That in 2023 I was on a VDSL2 line only marginally faster on downloads than the ADSL2+ I had at my parents' house in 2006, and this at a new-build house completed in 2016, is a dire indictment of the state of broadband in the UK. Thankfully competition in the UK market is starting to move, and alongside fibre lines we also have the option of 5G cellular networks increasingly available.

Now just to find a use or rather the uses for all them megabits.

So, anyway, the speed test. This one's brought to you via an Ubuntu terminal with the Speedtest.net CLI over a gigabit Ethernet adapter.
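
For what it's worth, if you fancy running the same kind of test yourself and don't have the official Speedtest.net CLI installed, the community speedtest-cli package from Ubuntu's own repositories does a similar job (note this is the community client, not the Ookla one I used):

# Community speed test client from the Ubuntu repos
sudo apt-get install -y speedtest-cli
speedtest-cli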

ProtonPass

This week the Proton team announced that their take on a password manager, ProtonPass, is now on general release. I had a test of the product in the beta phase, and now that a stable release has been announced I've started testing with a view to adoption, a trial I'm intending to run until the end of July.

ProtonPass has launched with extensions for all the popular browsers such as Chrome, Edge, Firefox, Safari and more, as well as Android & iOS apps. Within the vault there is support for storing data in the form of logins, for which you can also store 2FA codes, as well as notes. Proton stress that all data is end-to-end encrypted and protected under Swiss privacy laws.

I’m currently using BitWarden but decided ProtonPass was worth a go. Here’s what I’ve found so far.

Loading from my current password manager, BitWarden, was straightforward as ProtonPass can read the exported JSON. For some reason it missed at least one record, which was a note containing an encryption key; ProtonPass found the record but did not put the key with it. ProtonPass also doesn't support filling in your addresses or credit/debit cards, which is a useful feature to have. For these records there was at least a note in the import wizard recording that they were skipped. It should be noted that there is support for importing from many other providers too.

Screenshot from the iOS app.

As noted above there is support for a wide range of web browsers and also both major smartphone OSes. Being a Firefox user on the desktop and an iOS user in my hand, I found no major issue installing either the extension or the app respectively. I've found both to have a modern, clean and easy to use design which matches the design language of the existing Proton applications. All in all, no complaints here. It does struggle for some reason to autofill logins to this website, which seems to have something to do with the fact this blog's URL matches several account emails in the vault.

There are some things missing from the product that you'll find in other password managers, such as the ability to organise items into folders. ProtonPass has a "vaults" concept, but it did not create vaults based on the folders in my BitWarden import, so I am not 100% sure folders and vaults are analogous concepts. Password breach monitoring and reports are also missing, which I would like to see on the roadmap; these can alert you when an account has appeared in a breach and are also useful for highlighting weak or reused passwords.

Possibly most missed however is a web vault. At the moment you need to use a browser extension or smartphone app to access your vault. Whilst this is OK, and arguably you use a password manager most from the apps & extensions, it does mean that if you need a larger UI to look through and organise your vault then you'll miss having such an interface.

ProtonPass is free for the basic tier, which includes unlimited devices & logins, entries such as passwords and notes in the vault, and up to 10 hide-my-email aliases via SimpleLogin. The "PassPlus" option, at €4.99 per month or €1 a month on a 12 month plan, offers unlimited email aliases, 2FA and vaults to organise items. It is more feature complete, but questionable value at €4.99 if you only want to commit month by month for some reason. If anything it would be best to either stick with the free tier or get it as part of an Unlimited or Family subscription.

At this point, if you don't have a password manager then ProtonPass is worth a try. It is barebones in some regards, such as reporting and the completely missing web interface, but with strong privacy credentials from the Proton team it's a solid option, whether on the free tier or as part of a wider Proton subscription.

Powered by Pasties

Last week I was back at work after a week scoffing pasties down in Cornwall with the family. I had a Parkrun and a few sea swims in between too, which was great. I also gave myself time to finish reading Diana Nyad's memoir Find a Way.

Diana was the first person to swim from Cuba to the United States. This incredible achievement over a distance of around 180km (110 miles) took 5 attempts, a lot of heartache and dedication to the dream.

I wanted to read this book after watching an interesting TED Talk by Diana titled "Never, ever give up". In the 15-minute appearance she talks about the challenges and the motivation she found for taking on extreme distance swimming. Right at the end she calls to the crowd: "find a way!". Not only did I want to read her full story, I wanted to find out who exactly Badass Bonnie was too!

I found Find a Way well worth the read and interesting as an open water swimmer. I would definitely recommend it to anyone whether a swimmer, sports fan or in need of some inspiration.

That is also to say that, other than playing "catch up" out of the water, that's as much as I've done professionally since my return.

Pulling Teeth* or Pulling Contacts? 

I don't know how this got complicated but it did, so here's a blog post on how I rescued a load of contacts off a Microsoft account without owning a Windows device, and therefore without Outlook on the desktop.

I've very much moved away from Microsoft as my email, calendar and storage provider. My new provider is Proton, a privacy-centric outfit based in Switzerland. The very last bit to move has been my contacts, which Proton can handle in the Mail client but doesn't sync to devices so that you can use them in the phone and messenger apps. I possess an Apple iPhone, and whilst I don't use Apple iCloud for mail, calendar, etc. I was using it for Tasks, so I decided contacts could be stored there for now.

I tried various tactics to get my contacts away from Microsoft but nothing seemed to work. If I had to make an educated guess, this isn't straightforward to do at a technical level because Microsoft's Exchange and Apple's iCloud (which is presumably an implementation of CardDAV) store information in different formats: Microsoft will spit out a CSV; iCloud only accepts contact cards (vCards). There's never much motivation for a provider to make the export process any easier when it's about migrating away, so I decided not to expect a straightforward time.

There’s probably a better way of doing this but as I no longer own a Windows device here’s how I achieved the move in an abstract:

  1. In Outlook.com export all the contacts into a CSV file. 
  2. Check your CSV file using your favourite text editor for any errors, duplicated contacts or anyone who’s unfortunately become a bit of an enemy. 
  3. In Evolution perform an import into the local contacts folder.
  4. Set up your iCloud Contacts account in Evolution using the CardDAV address https://contacts.icloud.com.
  5. Drag and drop all the contacts into the iCloud account.

You could do the process in fewer steps by importing directly into iCloud instead of the local Evolution folder; however I found that Evolution would go unresponsive and not provide a progress indicator. I had 162 contacts and the process was slower overall when importing directly, i.e. it seemed to work faster importing locally and then copying to iCloud.

The caveat was that no matter what date format I used in the Microsoft CSV, the dates wouldn't import into Evolution, nor would it produce an error as to why. I had to manually re-enter the dates in my contacts in iCloud.

For my next trick I’m considering setting up a local contacts server such as Radicale.

(* Pulling teeth is an expression that means doing something that ends up being quite painful, like pulling out teeth without anaesthetic!)