I don’t know how this got so complicated, but it did, so here’s a blog post on how I rescued a load of contacts off a Microsoft account without owning a Windows device, and therefore without desktop Outlook.
I’ve very much moved away from Microsoft as my email, calendar and storage provider. My new provider is Proton, a privacy-centric outfit based in Switzerland. The very last bit to move has been the contacts, which Proton can handle in the Mail client but doesn’t sync to devices, so you can’t use them in the phone and messenger apps. I possess an Apple iPhone and, whilst I don’t use Apple iCloud for mail, calendar and so on, I was using it for Tasks. I decided the contacts could be stored there for now.
I’ve tried various tactics to get my contacts away from Microsoft but nothing seemed to work. If I had to make an educated guess, this isn’t straightforward on a technical level because Microsoft’s Exchange and Apple’s iCloud (which is presumably an implementation of CardDAV) store information in different formats. Microsoft will spit out a CSV; iCloud only accepts contact cards. There’s never much motivation for a provider to make the export process any easier when it’s about migrating away, so I decided not to expect a straightforward time.
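To give a feel for the mismatch, here’s a rough illustration. The Outlook export is a spreadsheet-style row whilst iCloud wants a vCard; the contact, fields and headers below are made up for the example and the real export will differ:

First Name,Last Name,E-mail Address,Birthday
Jane,Example,jane@example.com,15/01/1980

BEGIN:VCARD
VERSION:3.0
FN:Jane Example
N:Example;Jane;;;
EMAIL:jane@example.com
BDAY:1980-01-15
END:VCARD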
There’s probably a better way of doing this, but as I no longer own a Windows device, here’s how I achieved the move in the abstract:
In Outlook.com export all the contacts into a CSV file.
Check your CSV file using your favourite text editor for any errors, duplicated contacts or anyone who’s unfortunately become a bit of an enemy.
In Evolution perform an import into the local contacts folder.
Set up your iCloud Contacts account in Evolution using the CardDAV address https://contacts.icloud.com.
Drag and drop all the contacts into the iCloud account.
You could do the process in fewer steps by importing directly into iCloud instead of the local Evolution folder; however, I found that Evolution would go unresponsive and not provide a progress indicator. I had 162 contacts and the direct route was slower overall, i.e. it seemed to work faster importing locally and then copying to iCloud.
The caveat was that no matter what date format I used in the Microsoft CSV, the dates wouldn’t import into Evolution, and no error was produced as to why. I had to manually re-enter the dates for my contacts in iCloud.
For my next trick I’m considering setting up a local contacts server such as Radicale.
(* Pulling teeth is an expression for doing something that turns out to be quite painful, like having teeth pulled without anaesthetic!)
If you’re stuck in life trying to move on, the internet might throw various quotes back at you in the vein of: if you’re trying to move forward then don’t look back. This blog isn’t about philosophy but sometimes it’s damn well close.
Recently I caught myself reminiscing about the earlier days of my career, when I’d joined the family firm and ended up taking charge of the IT infrastructure. When I started, this was one Fujitsu box running Small Business Server 2003 along with a rabble of Windows XP desktops dotted about. There were a few Vista laptops appearing at the time as well.
When I first joined the company I became aware of an issue that would strike at some point over the weekend: on most Monday mornings the internet would be down. Not every Monday, but quite a lot.
The simple fix was to exclude the Internet Security and Acceleration Server (ISA, later rebranded Forefront Threat Management Gateway) proxy cache file from the weekly full backup that Backup Exec was doing. That was the first major IT issue solved in my career.
Curvy boxes were an in thing back then.
Shortly after fixing that, the company decided it had outgrown the Small Business Server 2003 setup and we decided to replace it with Small Business Server 2008 on the advice of our IT partner. Windows 7 had also appeared, which was of keen interest to the company and to long-suffering employees stuck with that all-too-blue XP interface.
But if you don’t know much about the legendary Microsoft product that was Small Business Server, allow me to explain, dear reader.
The big idea with Small Business Server was to bundle together many core products vital to a growing business into one licence, at a reasonable price, all carefully designed to work together more or less out of the box. It would then be up to an IT provider to design, implement and support the server. In addition, if you needed it you could buy the shiny Premium add-on, which granted a second Windows Server plus a licence for my favourite video game, SQL Server (at the time the rather advanced SQL Server 2008).
Small Business Server 2008 provided: Active Directory, Microsoft Exchange, File & Print, SharePoint, Windows Server Update Services, plus a built-in backup solution. To manage all this the server offered a console which also reported on the status of the server as well as the clients connected to it. Somehow it was a product that was sold like an appliance, worked like an appliance, but wasn’t actually an appliance.
(There’s probably something important I missed out in that list).
Did I mention this thing also provided Remote Desktop Web Gateway and PPTP-based VPN? Yes indeed! This was a more innocent time in the age of the internet, when broadband lines weren’t quite as ubiquitous as they are now. You did have to run it behind a router, as having a second network interface was prohibited.
Behold! The SBS 2008 console.
But in my tenure as an SBS admin this simply wasn’t enough. Nope. We decided to add on Symantec’s Backup Exec and Sophos Endpoint AND Sophos PureMessage. Somehow it all continued to work together.
Back in the day this product was, on one hand, valuable for small and medium businesses wanting access to server technologies, but on the other it was questionable whether it was such a great idea to actually run it. By the standards of today it’s an absolutely crackers product for a small to medium business to run, because the sheer number of moving parts in the installation was asking for trouble.
I would be very surprised if there weren’t horror stories out there of SBS completely falling over, backups not working and entire businesses grinding to a halt. This product was arguably a dangerous thing to run a small to medium business upon.
The world moved on from Small Business Server and the last version would be Small Business Server 2011 based on Windows Server 2008 R2. For the Windows Server 2012 era Microsoft would replace it with Windows Server Essentials and also nudge you towards Office 365.
I owe a great deal to Small Business Server. What I learned running the product was the basis for the first 16 years of my career. After 9 years I moved on to a consultancy role and took with me the skills in managing Windows Server, Active Directory and, most importantly, SQL Server. Arguably the last remaining “on-premise” skills now that the world is more cloud centric.
The most valuable lessons I learned from supporting Small Business Server?
First was to never run a server on RAID5 because whilst the storage is efficient (in a three-disk array only a third of the capacity goes to parity, and proportionally less with more disks), the write penalty from recalculating parity made performance absolutely dire. Taking 10-20 minutes plus to reboot whilst emails were probably getting lost was unacceptable then and would be grounds for dismissal now.
Second was that given the rise of email, instant messaging and, to a lesser extent, services like SharePoint, it’s absolutely vital to keep these afloat, and therefore a single box running everything is too great a single point of failure for the business. It’s time to consider hosted or cloud options for such things unless you have the resources to reliably host them and build adequate redundancy on site.
Third, now that the product has gone, it’s always worth remembering that there was a time when we needed to run everything ourselves. In a cloud-first era someone still has to do the work in the datacentre to run all of this. Tip a thought to those individuals every so often and appreciate the work that gets put in.
Suicide is the biggest killer of men under 45. We lose 1 man every 2 minutes to suicide. It’s an absolutely shocking statistic.
On Monday 17th April 2023 we’re opening the doors of Andy’s Man Club in Mirfield where I live. We’ll be open from 7pm at Mirfield Library every Monday except bank holidays. I’m a veteran facilitator at the nearby Huddersfield club and I’m proud to be part of the team that’s going to help establish another club in the area.
My Andy’s Man Club (AMC) story began during the pandemic. As a single person with mental health issues occurring earlier in my life I was suddenly faced with a situation where I was alone, isolated and in danger of a serious relapse. Without the usual human contact we take for granted I was essentially facing the most serious global crisis since perhaps the Second World War all on my own. I attended the online sessions and found great comfort being able to say what was on my mind without fear of judgement. It got me through those turbulent days of uncertainty and isolation.
Just when I thought I’d cleared the exit of that situation I was faced with a series of events I could never have predicted would happen. On one particular April morning I received a text message from a woman I was dating to say it was all over and she wasn’t interested in pursuing a relationship. Considering things seemed to be going really well, I was absolutely gutted. I did what I do best and went swimming that evening to try to clear my head, let all that grief out in the pool, then came back home and made myself a Shakshuka. Just when I thought my day couldn’t get any worse I received a call that same night from my best friend to deliver some disturbing news. A mutual friend of ours who’d inexplicably lost contact with us had been convicted in court of some serious offences without us even knowing what had happened or why. Not only did I not understand what was going on, I also didn’t know at the time what would happen to me and whether I’d be caught in the fallout from it.
It was without a shadow of a doubt the worst day of my life so far.
The day after, I had to take the afternoon off work because I was that distraught. I seriously had to consider that, as a result of my so-called friend’s actions, my life would implode. I didn’t even have the chance to grieve over a lost relationship. That’s when I realised that my attendance at Andy’s Man Club had to continue. Shortly after that day of infamy AMC gave the all clear to restart face-to-face sessions. I turned up at my first session and, in all honesty, I had to catapult myself through the front door I was that nervous.
Thankfully the facilitation team were absolutely fantastic. Two of the facilitators had a chat with me about the session format and rules, another facilitator made me a brew and I was in. I got my chance to open up to the group and surprisingly found another member who’d gone through a similar experience. Just having the space to open up gave me the chance to move on with my life and get back in control of what was going on in my head.
Fast forward perhaps a year later and things were on the up. The pandemic in the UK was under control and things were getting to the new normal. I got asked to be a facilitator. It’s been one of the biggest honours of my life and one that I least expected. I honestly thought I was in trouble somehow when the facilitator team asked me after the session to have a chat!
That’s my AMC story and arguably the single biggest reason I’m not a statistic either for mental health reasons or for something far worse.
So, men of Mirfield (and even beyond), the invitation to you is to come visit, have a brew with us and get things off your chest. Attendance (and the brew) is always free and all sessions are run on the understanding of confidentiality and no judgement.
In the news recently has been that the password manager service LastPass was infiltrated and password vaults were stolen. The gist of it is that attackers were able to gain access to the company’s development environment and, by extension, raid a backup environment for customer password vaults.
Understandably, a lot of people out there who have used LastPass will be very worried. Within the IT profession, questions will begin over how this happened and how we should respond when consulting. From my perspective this isn’t a post to defend LastPass, explain the attack or analyse what they should’ve done. That’s a whole separate subject and, whilst those questions are important, what I’m going into here is the general theory of password managers, the immediate impact of the data loss on users and the potential security responses to it.
With the LastPass hack, much of the data in the stolen vaults was encrypted. In the December 22nd post from Karim Toubba the stolen data is described as:
The threat actor was also able to copy a backup of customer vault data from the encrypted storage container which is stored in a proprietary binary format that contains both unencrypted data, such as website URLs, as well as fully-encrypted sensitive fields such as website usernames and passwords, secure notes, and form-filled data.
Karim Toubba, LastPass, December 22nd 2022
The attacker was also able to retrieve customer account details such as email addresses, billing addresses, phone numbers and the IP addresses used to access the vault. Whilst it might appear that customers got off lightly, this data is still sensitive and we’ll come back to it in a bit.
Concerning the stolen vaults, each one was protected with AES-256 encryption using a key derived from the user’s master password. If users have gone with the security defaults of a 12-character password and 100,100 iterations of the Password-Based Key Derivation Function (PBKDF2), then it would take a considerable amount of processing power – potentially millions of years with the technology we currently have commercially available – to crack a properly secured vault. The attacker would have to keep hold of the trove long enough for a weakness to be found, or wait long enough for the right computing power to be available, by which point the data could be useless anyway.
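If you’re curious what that key derivation actually looks like, here’s a minimal sketch using OpenSSL 3’s kdf command. The password and salt are made up for the example, and this illustrates PBKDF2 in general rather than LastPass’s exact implementation:

# derive a 256-bit key from a master password using 100,100 rounds of PBKDF2-SHA256
openssl kdf -keylen 32 -kdfopt digest:SHA256 -kdfopt pass:CorrectHorseBatteryStaple -kdfopt salt:user@example.com -kdfopt iter:100100 PBKDF2

Bumping the iteration count is what makes each guess at the master password more expensive for an attacker, which is why the derivation matters almost as much as the password itself.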
Based on what LastPass has said I would remain cautious, but for anything hyper-sensitive in the vault, such as email accounts, banking or financial accounts, social media accounts and medical accounts, I would change the passwords immediately as a swift precaution.
As touched on above, however, the customer account data was not kept encrypted. Arguably the biggest risk LastPass users now face is that the attacker has enough information to run targeted phishing against victims, either to get the master password for the vault or the credentials for specific websites of interest they’ve identified.
My recommendation on password managers still stands the same, and that is to combine a password manager with a separate multi-factor authentication (MFA) tool for the vault and the logins it contains. At the moment my favoured tools are Bitwarden combined with YubiKeys. An important note on what I’ve just said there: I do not use Bitwarden to store MFA codes. Whilst I think it might be a good idea to add MFA codes to a vault if you for some reason have to share an account with a team, on balance they really should be separate. It’s also worth saying that MFA backup codes don’t belong in the vault either.
YubiKey. Tethered to an old Intel Xeon so it can’t escape.
Yes, there is a considerable argument that an offline password management tool like KeePass is a much safer option, but that in itself brings its own problems. What happens if you lose the vault and your backup is insufficient? What happens if you can’t get to the vault in a critical moment because it’s not on your person? What if the vault is stolen from you and you hadn’t applied security practices as good as you thought? As always with security it’s a battle between doing the most secure thing and the most convenient thing. Personally I stick with the online option so I don’t have to worry about any of this.
In short: don’t give up on password managers. The benefits of having them far outweigh going back to a shared password for all your accounts. As long as LastPass users have a decent master password on their vault, apply MFA to sensitive accounts, change passwords for anything hyper-sensitive and, most critically, watch out for phishing attempts, then I would hope that victims will remain safe from mass attacks.
That’s it for 2022. I packed away my work laptop and phone after submitting my final timesheet of the year. Overall it’s been a great year working hard, responding to the challenges of modern working and supporting organisations whatever their mission may be.
Lots happened for me in 2022. Professionally I ascended to membership of the British Computer Society, passed a few Microsoft exams and also formally adopted permanent working from home. In my private life I helped pull off a successful beer festival and bonfire as part of Mirfield Round Table, and I got close to my goal of swimming 10k by swimming… 9k… but I also had my heart broken a couple of times :’-(.
Key Anticipations for 2023
It’s getting a lot cloudier out there. For my part in this I’m going to be focusing a lot more on cloud-hosted applications, whether that be lift-and-shifts to public cloud VMs or migrating clients to cloud-native solutions. Fact is, they don’t want anything “on-prem” anymore. Fine by me.
I also anticipate we’ll be talking more about general ethics in IT. Whether that be privacy concerns, making the profession more inclusive or ensuring that we are safeguarding the planet for future generations, we have our work cut out for us and it’s critically important we rise to that challenge.
We’re also inevitably going to see a lot more challenges regarding security, stability and connectivity. As we move from the (arguably) “Wintel” desktop and server world to one that’s more cloud native and ARM powered, we will see opportunities and problems arise. A constant challenge of mine is getting applications into the hands of users in a variety of settings, devices and conditions. My personal challenge for 2023 and beyond will be to make sure I can do that for people who aren’t “Wintel native”.
However your 2023 looks I wish you a Merry Christmas and a Happy New Year.
The only recent professional development to share is that I recently joined the British Computer Society, and I’m currently in the induction phase. More on that story later, but yet another shiny badge right here.
We’re fast approaching Christmas, which means that winter is also looming for those in the northern hemisphere. Conditions in the UK are becoming quite challenging: the cost of living crisis is making a real, human impact and we are entering a period of economic recession. We’ve also got to adapt to Brexit, whether we voted for it or not. With this in mind I’ve been making sure I’m as prepared for the winter months as possible to ensure I stay mentally well.
I thought I’d share some tips for surviving work from home (WFH) during winter. These are written by an IT person so take and tweak accordingly.
Ensure you’re interacting with others and not just through glass – if you are like me and live alone this is not the greatest time of the year. It can get lonely and is very much a dehumanising experience. I attend Andy’s Man Club on a fortnightly basis to discuss my feelings and listen to other men doing the same in a mental health safe space. For women we signpost to Women’s Wellbeing Club as an equivalent.
Set aside your workspace – if you can, make sure that your workspace is for work only, keep the door shut and talk to the people you live with to set boundaries. This minimises the disruption and keeps your mind focused on work.
Maintain a line between work and home – in meetings, use a background and a headset so the conversation is kept private. At the end of the day shut down laptops and phones and then shut the door. You are done, you are human and it’s time to rest. Stick to set working hours and ensure you set time aside for self-care and rest. It’s not a guilty pleasure, procrastination or anything else. You need this time to reset.
Write your tasklist – write down everything you need to get done in a day. Prioritise the important things you need to do, delegate the tasks that you can to share the workload and plot the tasks that can wait for days when the load is lighter. Ensure that you ask for a deadline from colleagues and remember that “no” is not a bad word; it’s actually a good one! If there’s too much to handle you need to say it!
Exercise – get out of the house daily for exercise, whether that just be for a walk or for something more strenuous like a run. I’m indoor swimming now, which helps.
Check in with Colleagues – keep communication constant even if that only starts with “Hello” on a morning and “Goodnight” in your team channel. Do share any big problems, unusual discoveries or even funny incidents you’ve had. This way you aren’t being forgotten about.
Journal – write down thoughts and feelings on a suitable medium and keep track of any persistent thought patterns. If you do identify anything that’s making you feel down, make sure to act on it, whether that be by raising it with your team or a manager, or by taking action yourself.
Get out for a day – contradicting a few points above, but I’ve found it helpful to work from a coffee shop one day a week. I do have to provide my own mobile internet for security purposes, but I’ve found that this breaks up the week and gives me something to look forward to.
Please put a comment below to share your tips or just check in if you’re having a bad day. I’d love to hear from you.
Canonical have today released Ubuntu 22.04.1 and by extension opened direct upgrades for installs running 20.04.
I was very excited to upgrade today with my StarBook Mk V but I came across an issue that’s plagued me for some time not just with distribution upgrades but occasionally with routine updates.
Situation: when you run updates/upgrades you get something along the lines of this:
The upgrade needs a total of X M free space on disk '/boot'. Please free at least an additional Y M of disk space on '/boot'. You can remove old kernels using 'sudo apt autoremove', and you could also set COMPRESS=xz in /etc/initramfs-tools/initramfs.conf to reduce the size of your initramfs.
Running sudo apt autoremove doesn’t resolve the problem and as a Linux novice I cannot speak about compressing initramfs & the implications there.
Apparently this is caused by a load of old kernels the system “hangs onto” following an upgrade, filling up the /boot partition. I *think* this is fixed or better handled in versions after 20.04 but I’m not so sure. Indeed it’s a very bizarre problem because surely Ubuntu ought to handle this itself, right?
Today I found out how to deal with it and I decided to share it with the internet. Disclaimer: I’m currently learning Linux and Ubuntu. This may kill your system. If it does I apologise but only slightly.
First things first: list the kernel that’s currently in use. We’re going to try really hard not to delete it.
uname -r
Once you’ve made a note of that then list out the kernels that have been installed:
dpkg -l | grep linux-image
In my case there were about 7-8 listed (!!!) in addition to the currently running kernel. What we need to do is trim this list down so that we’ve got some space in the /boot partition.
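If you want to see just how tight things are before and after the clean-up, a quick check of the partition does the job:

df -h /boot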
At this point you should’ve got a backup and possibly consulted someone who’s a Linux expert as opposed to a Microsoft one.
Remove excess kernels by running the following command depending on what you find (replacing the version numbers of course):
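Something along these lines, where the version number is purely an example from my machine; substitute whatever old kernels dpkg listed for you:

# remove an old kernel image you no longer need (example version number)
sudo apt remove linux-image-5.13.0-40-generic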
You should probably keep the immediate past version to the one identified with uname -r.
This should free up enough disk space in /boot so that you can upgrade your OS. However, you may run into the problem I had, and this is the one that’s bugged me for ages: if you remove the signed image, for whatever reason apt installs the corresponding unsigned image. Not knocking a free offer but I’m not sure why that is. So anyway, the trick is to purge both the signed and unsigned images at the same time, like so:
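Again the version number here is just an example; the pattern is to name both packages in one go:

# purge the signed and unsigned images together so apt doesn't pull the unsigned one back in
sudo apt purge linux-image-5.13.0-40-generic linux-image-unsigned-5.13.0-40-generic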
Third exam of the year. This time it’s AZ-900: Azure Fundamentals and yes here’s another shiny, shiny badge I can show to Mum to prove it:
AZ-900 is the easiest Microsoft exam I’ve ever done by a long way. It’s testing knowledge of Azure at a very basic level. That being said it’s definitely not one to underestimate. You need at least a superficial understanding of how Azure works and what the key concepts are with cloud computing.
On May 5th 2022 Microsoft adjusted the exam and made it even more fundamental, going as far as removing the bits about databases. A lot of the resources you’ll find on the internet therefore probably go into far too much depth.
The Study Plan
The key with AZ-900 is not to overthink it. This one is free with the cornflakes. Take an Azure Virtual Training Day: Fundamentals course to get your free exam voucher, then either have a go at the exam or go study the Microsoft Learn learning path with a copy of the study guide in hand (or on screen; save paper please).
If you fail it you can always redo the training day or pay £69 to save yourself the hassle.
Practice Exam? Don’t bother. Just enjoy learning and relax about it.
This one along with DP-900 completes the two exams I needed to do this year so I am very pleased with both passes and both done first time.
This also means I get another shiny badge to put up on this blog so here it is and yes you can click to verify I’m not fibbing about it:
The Study Plan
I could’ve very much done a copy and paste job on the DP-900 effort I did earlier this month but that would’ve meant that I couldn’t write another witty title and that would be boring.
I now have 6 years’ experience deploying, patching, configuring, troubleshooting and tweaking SQL Server. In these exams, well-founded experience and knowledge of what you’re being tested on helps.
Used the learning path for DP-300 on Microsoft Learn. I’ll admit at this point I did not finish the last two modules on Automation and High Availability but thankfully I did very well on those questions.
Again, used the Measureup practice test and yes, some similar or even the same questions came up on the exam. I did start to notice with this test that I began learning the answers by recognising the questions rather than understanding what was being asked. The note of caution here is not to rely on this too much, as there are only 122 questions in the bank.
It’s quite a hard exam and I found it challenging. Whilst SQL Server is familiar to me, Azure SQL Database was completely new. I had a lot to learn in a short space of time but I got through comfortably. If you’re taking the same exam soon then all the best to you.
It should be noted that the contents of the exam changed on 5th May 2022. From the updated skills measured sheet it would appear to me that they made the new format more “fundamental”. That’s not to say it’s an outrageously easy exam. I had to learn a few new concepts but, as someone with an interest in SQL Server, I enjoyed the learning process.
Here’s the badge to say “I did it”.
The Study Plan
I attended the free Azure Virtual Training Day: Data Fundamentals from Microsoft. Each session was just under 4 hours long and was a pre-recorded video. By attending across the two full days you receive a credit to take the exam for free so not only do you get an intro to the subject you also save £69 for the exam.
Used the Azure Data Fundamentals learning path on Microsoft Learn. This was a good source of basic knowledge and a few free labs on Azure were available too. Made lots of notes here to revise with later on.
Subscribed to the official practice test available on Measureup.com. Some questions from this practice test came up on the exam, although it must be said that the practice test probably does not reflect the May 2022 changes just yet, so keep an eye on the website for more info. I put the test in practice mode and set it to explain wrong answers to strengthen my knowledge and further improve my notes.
Overall not the hardest exam to pass. As long as you understand the subjects in the exam you’ll have no problem passing it. All the best!