IT professional specialising in Microsoft SQL Server, Windows Server and Azure based in West Yorkshire, United Kingdom. Driven to learn, improve and create better IT outcomes for clients. Open water swimming enthusiast and facilitator for Andy's Man Club.
Microsoft have recently announced the availability of the next version of Windows Server. Through retail or channel partners, Windows Server 2025 is now ready for deployment both in your own datacentre and in public cloud environments.
There are many areas of new features and improvements in Windows Server 2025 including enhancements to Active Directory Domain Services (ADDS), security, performance, software defined networking, general management, and much more.
As with all previous releases of Windows – server and desktop – it will take some time before applications are certified to run on the new release. That being said, it’s well worth evaluating Windows Server 2025 for any near-term deployments you may be considering. Although Windows Server 2022 will continue to be generally supported, moving to the latest version of the Microsoft server system is recommended for maximum longevity and to take advantage of the latest improvements.
Whether you’re considering a new deployment, a cloud repatriation strategy to reduce costs, or need to move away from older versions of Windows Server that are no longer supported, Digital Incite and Matter Ltd have the knowledge and expertise to help. Please contact us today to discuss your upcoming projects.
We’re all (probably) familiar with the error message “Something Went Wrong”. You might also receive a long string of numbers, get told to try again later and be asked to report the error to an administrator. These kinds of messages aren’t really helpful by themselves.
Regardless of where this error comes from, you need a better understanding of what’s actually going wrong before you can work out how to address it, or even whether you need to do anything about it at all.
In SQL Server there are two features which can be used to provide in-depth information: SQL Server Profiler and Extended Events. As a SQL Server and/or application administrator, learning to use these tools is a critical skill when responding to application errors.
Before we go into any further depth it’s necessary to mention that SQL Server Profiler and SQL Trace are marked as deprecated by Microsoft. This means the features are no longer being developed and will be removed in a future version of SQL Server. It also means you should learn Extended Events for future reference.
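For reference, if you do move to Extended Events, a minimal session that captures user errors might look something like the sketch below (the session name and file path are illustrative):

-- Create an Extended Events session capturing reported errors (severity 11 and above)
CREATE EVENT SESSION [AppErrorCapture] ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.sql_text, sqlserver.database_name, sqlserver.username)
    WHERE severity >= 11
)
ADD TARGET package0.event_file (SET filename = N'AppErrorCapture.xel');
GO
-- Start the session; stop it later with STATE = STOP
ALTER EVENT SESSION [AppErrorCapture] ON SERVER STATE = START;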
From professional experience, however, many application vendors we work with will still request a SQL Server trace file (with a .trc extension), so if you need such a trace to add to a support ticket, read on.
A Basic SQL Server Trace
Before any trace is started on the SQL Server instance, please be aware that this activity can – and will – place a significant strain on server resources, whether in the disk space required for the capture or in the impact on processing performance. With this in mind you should try to organise a SQL Trace session when you either have exclusive access to the application database or know it will be a quiet time (i.e. outside regular office hours). If you are doing this for the first time, definitely practise on a test/training environment before working on production.
The settings I’ll go through here are to be considered a basic trace only. If you are being asked to provide a trace file you should always first check with the software author or your DBA to ask which events they need to see in the trace. You can then either open their trace template or use the instructions below to select the trace criteria they require.
Please note that the following was performed on SQL Server 2022 with SQL Server Management Studio (SSMS) 20.2. SQL Server Profiler has remained much the same across all prior versions, so don’t expect any major deviations from these instructions.
Step 1: Launch SQL Server Profiler either from the Start menu or from SQL Server Management Studio via Tools > SQL Server Profiler. Log in with a user that has the ALTER TRACE permission by clicking Connect.
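If the account you’re connecting with isn’t a sysadmin, the ALTER TRACE permission can be granted at the server level; for example (the login name below is illustrative):

-- Grant the server-level permission required to run SQL Server Profiler traces
USE master;
GRANT ALTER TRACE TO [MYDOMAIN\TraceUser];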
Step 2: On the first tab – General – we need to set up the basics of the trace:
Give the trace a suitable name.
For a template use Standard (default) for now.
Check Save to File. This will then prompt for a save location.
Set maximum file size (MB): increase this to 100 MB.
I recommend leaving Enable file rollover checked. This will create multiple trace files as necessary.
Step 3: On the second tab – Events Selection – you will now select the events you want to capture. As we selected the Standard (default) template, we already have a set of pre-selected events to work with. I suggest adding a few more in order to get a slightly more useful trace:
Click Show all events and Show all columns to see the full list.
Select the additional rows using the checkbox to the left of the event:
Errors and Warnings
User Error Message
Stored Procedures
SP:Completed
SP:Starting
Finally, uncheck Show all events to display only the events you have selected. Review the list to make sure the ones suggested above have been selected.
Ideally we should filter the trace events to those for the required database(s) only. This can be done by clicking the Column Filters button. On the left, scroll to find the DatabaseName filter. Expand the Like operator on the right of the window, then type in the name of the database you want to trace. Click OK once you have your databases listed.
Step 4: Click Run to start the trace. You will see the trace window appear with events being logged. Don’t worry if the events are flying past too quickly; by default SQL Server Profiler will continuously scroll to the bottom.
Notice that there is also now a trace file saved to the path you specified in Step 2.
Step 5: Either get your colleague to replicate the problem in the application or follow their replication steps yourself. In this example I captured a simple SELECT statement against AdventureWorks.
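For reference, the activity I captured was nothing more elaborate than a simple read along the lines of the sketch below (the database and table are just examples from the AdventureWorks sample):

-- A simple query to generate traceable activity against the AdventureWorks sample database
USE AdventureWorks2022;
GO
SELECT TOP (100) FirstName, LastName
FROM Person.Person
ORDER BY LastName;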
At this point note the two highlighted buttons on the screenshot below. The left “stop” button (Stop Selected Trace) will end the trace and stop SQL Server Profiler from capturing events. The right button (Auto Scroll Window) will stop the window scrolling if you need to quickly study a series of events you have noticed whilst keeping the trace capturing new events.
Once you have finished live analysis, make sure to click the Stop Selected Trace button. As mentioned, SQL Server Profiler running a trace will have a significant performance impact on the SQL Server instance, so don’t leave it running!
The aforementioned trace file can now be securely transferred to the software author or you can re-open it later and review it any time you want.
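As an aside, a saved .trc file doesn’t have to be opened in Profiler; it can also be queried with T-SQL via the built-in sys.fn_trace_gettable function, which is handy for filtering a large capture (the file path below is illustrative):

-- Load a saved trace file into a result set for querying
SELECT TextData, ApplicationName, LoginName, StartTime, Duration
FROM sys.fn_trace_gettable(N'C:\Traces\MyAppTrace.trc', DEFAULT);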
In the SQL Server Profiler window you’ll no doubt see many different events along with captured text in the bottom half of the Profiler window. By studying what the SQL Server engine is doing we can begin the process of troubleshooting problematic or unexpected application behaviour. Whilst the level of information in a SQL Server trace will be comprehensive, it’s necessary to take the time to study it properly.
Conclusion
In this blog post we’ve learned to create a basic trace file in SQL Server. Whilst only a basic trace, additional events can be captured and additional filters specified to help us understand application behaviour when an issue is reported.
If you need further support Digital Incite and Matter can not only help create the requested trace file but we can also work with your software provider to manage the incident case from diagnosis to patch deployment. Please get in touch with us today for further assistance.
IT-related waste is a growing problem worldwide. In 2022 alone over 62 million tonnes of “e-waste” was generated. Whilst there have been a number of initiatives worldwide to alleviate this problem, such as the EU directives for common power chargers for portable devices and WEEE for device recycling, the amount of global e-waste continues to grow rapidly.
Device support policies are a significant cause of e-waste. On October 14th 2025 Microsoft will end support for Windows 10 Home and Pro. Up to and past that important date, IT teams across the planet will be replacing devices that will no longer receive important security updates. This will sadly see otherwise worthy devices sent for recycling, or even landfill if not handled properly.
Important OS end-of-support dates are not the only trigger for unnecessary device replacements; hardware designs that make it difficult, if not impossible, for a user to replace faulty parts further contribute to the e-waste problem. For example, the use of adhesives to bond together tablets and smartphones can make it a challenge to open a device for repair without shattering the screen.
Thankfully it’s not all bad news. The right to repair movement is still strong and gaining further momentum. Many devices out there – desktops, laptops and servers – are still designed with upgradability and repair in mind. For the impending Windows 10 end of support date, the operating system can be swapped for a Linux OS, allowing a device to continue in use.
When choosing devices we always encourage sourcing from manufacturers that consider serviceability in their product design. This also includes open warranties, spare parts availability and avoiding operating system lockdown.
Digital Incite and Matter Ltd can provide a variety of servicing, upgrade and repair options for your desktops and laptops. Whether you require a storage upgrade, a reload of the operating system or a device repair, we can be of assistance. Not only does this save on replacing devices that would otherwise only need an economical repair, it also helps your organisation meet environmental goals by avoiding sending devices to landfill.
Masks off. This week, on October 10th 2024, is World Mental Health Day. This year’s theme is very on point for professionals worldwide: “It is time to Prioritize Mental Health in the Workplace”.
According to the charity Mind, around 1 in 4 people in England will experience a mental health issue each year, with specific demographics – such as LGBTQIA+ people, Black or Black British people, young women and those with overlapping problems – even more likely to report mental health problems.
Thankfully the conversation is moving on and the stigma around having mental health problems is being challenged. In a work setting it’s of critical importance to create an environment that supports mental well-being. Whether it’s additional support from colleagues, time off to seek help and treatment or reworking a job role, everyone should be able to get help when they need it most.
I volunteer for Andy’s Man Club and they’re a great example of a group that’s worked hard to challenge the stigma and support men like me when life got challenging. For groups that cater to your own needs, check out Hub of Hope.
Did you know that 25% of users in a World Password Day survey admitted to reusing their passwords across multiple sites? This kind of behaviour poses a challenge for organisations. Should a user’s password be guessed or compromised then an attacker could access multiple systems via that one combination.
Multi-Factor Authentication, or MFA, is a system by which an additional factor of authentication beyond a password is added to an account. This is done to enhance the security of the account by preventing takeovers if the password is lost, guessed or brute-forced. You may also hear this referred to as Two-Factor Authentication, or 2FA, a term often used interchangeably.
By using MFA for both administrators and users of a system you can prevent account takeovers that result from passwords that have been guessed, reused or compromised. Enabling MFA for any cloud services your organisation subscribes to is also required for schemes like Cyber Essentials in the UK.
Types of MFA could include but are definitely not limited to:
A one-time code sent via SMS to a mobile phone number, a voice call to a telephone number or an email account.
Approving a notification via an authenticator app on a smartphone.
A Time-based One-Time Password (TOTP) generated by a smartphone app or physical device.
A hardware authentication device such as a Yubikey. This may support the FIDO2 scheme and/or the older U2F standard.
An example of MFA: a Yubikey.
It should be noted that whilst MFA offers additional security against unauthorised access, it does not completely guarantee secure systems. Breaches may still result via other means. Principally, you should always be mindful of attackers trying to gain access to an account through social engineering techniques. A common example is attackers phoning users to ask them for their one-time codes.
Additional ways of adding security to accounts should also be considered. A password manager can help users avoid password reuse, help them generate secure and unique passwords per login, and allow an administrator to monitor password use. Securing sites with Single Sign-On (SSO) is another option to explore. This allows a user to access multiple systems seamlessly via one login, helping them avoid situations where they struggle to remember different passwords and are tempted to reuse the same one. Instead their “primary” account acts as the login, which can be secured via MFA and continuously monitored (such as with Microsoft’s Conditional Access for Entra ID).
Ultimately the password alone is indeed quite dead. Across personal and professional accounts, it’s not uncommon for an individual to have hundreds of logins to keep track of. Passwordless schemes along with SSO are the way to go.
In summary MFA is a great way of adding additional security to a system for both administrators and users. It should be an automatic requirement in setting up new accounts – both cloud and privately hosted – especially in Cyber Essentials certified organisations.
Get in touch with us today for further assistance with Active Directory, Entra ID or SQL Server based authentication challenges.
Let’s imagine that you’ve just come back from summer holidays. You’ve been away travelling, enjoying life and having a relaxing time. Then you crash back down to work in September (sadly) and you’ve been asked to write some T-SQL based queries. Maybe it’s for a new dashboard component, report lines or even a new view. Somehow that query isn’t performing as you or a colleague expect. Panic sets in as you look blankly at the query and think “what do I do?!?!”.
Don’t fret; performance issues with T-SQL queries happen. Most of the time it’s an issue that can be fixed very quickly. If you find you’ve gone a bit rusty over summer, here are five quick tips to help you troubleshoot those queries:
Tip 1: Check For Obvious Offenders
Quite often you’ll find your issue by re-reading your T-SQL code and making some adjustments. Common causes could be:
Avoid using SELECT *. Whilst this is sometimes OK to get an understanding of the table contents, it results in absolutely everything in the table being retrieved. Not only is the full set of data unnecessary but you may also be impacting other queries executing simultaneously on the instance (see the sketch after this list).
Check for complex joins. Occasionally a table join will introduce a complex operation that SQL Server must complete to get the matching rows. Check that your joins are appropriate for the data you want to select.
Consider using the WITH(NOLOCK) hint on live OLTP databases. This prevents queries from locking the table and blocking other queries. This may result in dirty reads so watch out!
Not obvious where the query is going slow? Break the query down bit by bit and re-introduce statements one at a time until you find the offending part of the query.
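As a minimal sketch of the first and third points above (the table and column names are hypothetical), compare retrieving everything against selecting only what’s needed:

-- Heavy-handed: retrieves every column and every row
SELECT * FROM Sales.Orders;

-- Better: name only the columns you need and filter the rows
-- WITH (NOLOCK) avoids blocking on a busy OLTP table but permits dirty reads
SELECT OrderID, OrderDate, CustomerID
FROM Sales.Orders WITH (NOLOCK)
WHERE OrderDate >= '2024-01-01';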
Tip 2: Check the Query Execution Plan
An automatic go-to for understanding how SQL Server is working at the engine level is to include the execution plan. In SQL Server Management Studio (SSMS) you can enable this with the Include Actual Execution Plan option on the toolbar at the top.
Once your query has executed and completed, the actual execution plan will be displayed as a tab next to the results window. Within the execution plan, check for expensive operations, missing indexes and any other pertinent issues that might be causing your problem. Some execution plans end up quite in-depth, so take some time to study each part of the plan.
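If you prefer T-SQL to the SSMS toolbar button, the actual plan can also be returned alongside your results; a quick sketch (the query itself is just a placeholder):

-- Return the actual execution plan as XML alongside the query results
SET STATISTICS XML ON;

SELECT OrderID, OrderDate
FROM Sales.Orders
WHERE CustomerID = 42;

SET STATISTICS XML OFF;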
Once you have an understanding of how SQL Server is trying to perform your query, you can then go and make adjustments to your T-SQL or work with your DBA on the potential bottlenecks the execution plan helped you identify.
Tip 3: Check Indexes
In SQL Server, indexes are used to help the database engine perform the most efficient lookup of data possible. Having indexes in place is imperative for a production database, especially where the number of records becomes substantial. Without indexes the database engine must perform more complex lookups of the data held in a table, which take longer to complete versus tables that have appropriate indexes designed.
Check the execution plan for Index Scans and Seeks. Index Scans are more intensive as they mean the whole index or table is being read, whereas an Index Seek retrieves only the matching records. An Index Seek is generally preferable except in certain circumstances where there are a large number of matching records in a table. You will also see a warning box at the top of the execution plan if SQL Server has identified a missing index that may have helped the query.
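If the plan does flag a missing index, the suggestion can usually be translated into a straightforward CREATE INDEX statement along these lines (the names and key columns are hypothetical; review with your DBA before applying to production):

-- Example nonclustered index to turn a scan on CustomerID lookups into a seek
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
ON Sales.Orders (CustomerID)
INCLUDE (OrderDate, OrderTotal);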
Tip 4: Index Defragmentation and Statistics Update Jobs
As discussed in the previous tip, indexes and statistics are important in maintaining adequate database query performance. Over time the indexes on a table will become fragmented and the statistics will become out of date, which gradually harms performance. If you find your existing queries are getting slower and slower, then this is a probable cause.
Check the system view sys.dm_db_index_physical_stats for your particular database. If the avg_fragmentation_in_percent values for your indexes are running high, check on the SQL Server instance that an Agent job or maintenance plan is in place to perform a regular index reorganise, or an index rebuild for heavily fragmented indexes.
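A quick way to review fragmentation for the current database is a query along these lines:

-- List index fragmentation for the current database, worst first
SELECT OBJECT_NAME(ips.object_id) AS TableName,
       i.name AS IndexName,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON ips.object_id = i.object_id AND ips.index_id = i.index_id
WHERE ips.page_count > 100          -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC;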
Depending on how the SQL Server instance is configured, statistics may be updated automatically; however, there should also be a SQL Server Agent job or maintenance plan to update index and/or column statistics on a regular basis as appropriate.
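The maintenance itself then comes down to statements like the following, typically scheduled via an Agent job or maintenance plan (a common rule of thumb is to reorganise between roughly 5% and 30% fragmentation and rebuild above that; the object names are illustrative):

-- Reorganise a moderately fragmented index
ALTER INDEX IX_Orders_CustomerID ON Sales.Orders REORGANIZE;

-- Rebuild a heavily fragmented index (also refreshes its statistics)
ALTER INDEX IX_Orders_CustomerID ON Sales.Orders REBUILD;

-- Refresh statistics on the table as a whole
UPDATE STATISTICS Sales.Orders;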
Tip 5: Use the Query Store
The Query Store is a useful feature that was introduced in SQL Server 2016. It is not enabled by default unless you are working with SQL Server 2022 and have created a new database. Not only does the Query Store contain a number of useful reports that help you understand how queries are performing in your SQL Server instance, but from SQL Server 2017 it also enables the Automatic Tuning functionality.
If the Query Store is not enabled, it may be enabled on a per-database basis via SQL Server Management Studio or T-SQL like so:
ALTER DATABASE <my_database>
SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);
Once the Query Store is on, you need to let it run for a while to capture sufficient data. Whilst this is going on, take some time to review the Query Store for information on query regressions and potentially make adjustments to the offending queries accordingly.
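Beyond the built-in reports, the Query Store catalog views can also be queried directly; for example, a rough look at the slowest queries by average duration might look like this:

-- Top 10 queries by average duration recorded in the Query Store
SELECT TOP (10)
       qt.query_sql_text,
       rs.avg_duration,
       rs.count_executions
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs ON p.plan_id = rs.plan_id
ORDER BY rs.avg_duration DESC;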
Once you have captured sufficient data you can then enable automatic tuning on your database like so:
ALTER DATABASE <my_database>
SET AUTOMATIC_TUNING ( FORCE_LAST_GOOD_PLAN = ON );
Still Having Issues?
If you find yourself still having performance issues then perhaps it’s time to bring in a consultant to help you out. Digital Incite and Matter are proficient at query writing and optimisation. If your issue lies beyond the obvious then we can also advise on the infrastructure, instance configuration and expanded troubleshooting.
Get in touch with us today and we’ll be pleased to assist you further.
On Friday 29th August Canonical announced the availability of the first point release of Ubuntu 24.04 “Noble Numbat” LTS. If you’ve been running Ubuntu 22.04 LTS this means you will shortly receive a prompt to upgrade. Alternatively you can upgrade manually via Ubuntu’s Software Updater or from the terminal by running:
sudo do-release-upgrade
There’s a number of worthwhile changes to Ubuntu especially if you are upgrading from the previous LTS release. Not only will you receive kernel 6.8 with the latest improvements there but there are also a number of refinements in the desktop OS such as the redesigned system menu in GNOME 46, a reworked App Centre which is far better than the old one and also updates to the various tools and packages Ubuntu relies upon.
As always make sure your backups are current and tested before opting to upgrade.
Upgrade Issue on Devices Using Secure Boot?
For me however it wasn’t all that straightforward. There’s definitely a case for doing fresh installs of any desktop OS rather than upgrades. I found the following error after trying to upgrade my StarBook MK V. This was noted after the upgrade failed and I ran Software Updater to try again:
The shims are to do with UEFI/Secure Boot and Ubuntu’s implementation of it. For my device this was solved by running the following in the terminal and then trying the upgrade again:
sudo apt-mark hold shim-signed grub-efi-amd64-signed
This effectively gets the apt package manager to hold these packages back. From looking around I think that this perhaps has something to do with running Coreboot and upgrading through various iterations of the firmware.
The anticipation on this one has been rising for some time. It’s finally here! After some long delays Star Labs have shipped their StarLite MK V Linux tablet. I placed an order back in August 2023 and took delivery a few weeks ago.
The StarLite MK V is the successor to the MK IV, which was a netbook. The MK V shifts to a tablet form factor with support for an optional keyboard with trackpad, plus pen support. Whilst the MK IV was a great PC for travel and surfing, you could argue the netbook form factor is a bit dated compared to a tablet. There’s been considerable interest in the StarLite MK V from the Linux hardware community so let’s explore further.
In your box you’ll find the StarLite MK V packaged as per the picture above. The charger, cables and pen arrive in a box neatly nestled in with the rest, and the keyboard ships in its own smartly designed schematics-printed box too. The initial impressions, as always with Star Labs, are very good.
Specifications
12.5-inch (diagonal) LED-backlit 10-point touch display with IPS technology.
2160×1440 resolution at 208 pixels per inch at 3:2 aspect ratio.
MPP Pen Support.
1.00GHz quad-core Intel Alder Lake N200 Turbo Boost up to 3.70GHz, with 6MB Smart Cache.
I opted for the 2 TB SSD plus a UK English keyboard, the MPP active pen, and an upgrade to a 5M USB-C charge cable which came to a total of £866.67.
Note on the display: the unit I have shipped with a 3K display (2880×1920); however, Star Labs have since changed to the 2K display given in the specifications above, due to what I understand are sourcing issues.
Connectivity wise you’ll find the following ports around the tablet:
Micro HDMI (version 2.0).
2 x USB Type C 3.2 Gen 2 (up to 10 Gbps) with Power Delivery 3.0.
Supports Display Port (DP Alt Mode).
Micro SD Memory Card Reader.
3.5mm Headphone Jack.
The StarLite MK V will connect up to two displays at 3840×2160 resolution: up to 60Hz using HDMI or up to 30Hz using DisplayPort.
In terms of OS compatibility any recent Linux distribution will be supported. During your order you may select some StarLabs provided choices such as Ubuntu, Mint, Manjaro, MX Linux, Zorin and elementary OS. You may also opt for Windows 11 to be shipped with your device if you really want it.
The tablet shipped with Coreboot 24.05, which had a number of issues with the screen, suspend and charging, amongst a few other minor points. Thankfully version 24.07 has been made available on LVFS and has provided a much smoother experience. For my review I installed Ubuntu 24.04 Noble Numbat. Unfortunately the kernel 24.04 ships with does not properly recognise the orientation of the screen. For installation I connected an external display and then ran some commands provided by Star Labs to correct it. As I understand it, later builds of kernel 6.8 should be OK; however, the end experience is pretty much dependent on having both that and Coreboot 24.07 in use.
First Look
The StarLite MK V comes in a smart looking black anodised aluminium chassis with the Star Labs logo embossed into the back. With each order there is the customary blue sleeve to keep the device in. On the back there is also a 2MP camera for taking photos and recording video.
At around 0.9 kg the StarLite MK V is heavy and unwieldy to handle. Compared to a Microsoft Surface Go 4 (with an Intel N200 processor, for comparison), which has a mass of 521g, the StarLite is on the heavier side. I have to use both hands or rest the device somewhere to use it.
The screen itself is very bright and readable. For example, reading PDF files was clear and legible, and watching videos was also a fine experience with acceptable colour and brightness levels. The screen won’t be on the level of some high-end laptops and tablets, but for the price point I feel it’s pretty good. Unfortunately around the edge of the display there are some hollow points underneath which can be felt when using the device, though not so many that they completely spoil the experience.
Performance and Battery Life
As per the specifications, Star Labs quote up to 12 hours of battery life. I wrote this review on the StarLite MK V starting at approximately 09:00 with a charge of exactly 70%. With Ubuntu set to power saver mode, I lasted until approximately 11:23, when the critical notification at 5% was triggered. This was whilst connected to a VPN over WiFi, doing largely emailing and web browsing. In other words, even with the uncharged 30% of battery accounted for, it’s not anywhere near the quoted 12 hours.
The Intel N200 processor is a 4-core SoC without Hyper-Threading. Performance wise that should put it around 10th generation Core i3 levels. It’s more than acceptable for basic web browsing and watching videos; I didn’t notice any dropped frames watching a short film, for example. For anything more strenuous than basic tasks, such as compiling software, virtualisation or gaming, you’ll not find this tablet well suited. There is 16GB RAM available to the user, which means that multitasking should be OK.
Keyboard and Pen
Whilst you may just want a tablet for web browsing, media consumption and video chatting the StarLite MK V optionally ships with a custom backlit keyboard with a multi-touch trackpad for more keyboard and mouse intensive applications such as word-processing, coding and spreadsheets. We have to unfortunately discuss the keyboard in not so great terms. I opted for the UK variant and my experiences are based as such. Languages include UK English, US English, German, Spanish, French, and Nordic.
The keyboard is not very visually appealing. It’s made from what I would guess is polyurethane, which picks up anything greasy or oily quite easily. It doesn’t have any Star Labs branding on it and fully covers the tablet when attached. Typing on the keyboard is OK. Sadly you can’t put the keyboard at an angle, so it will always lie flat on whatever surface you are resting upon, which is a drawback for coding and word processing. Some of the keys aren’t cut properly and catch on the frame; I particularly note the space bar and the Escape key are susceptible to catching. All things considered you won’t be using this keyboard for typing-intensive activities.
The trackpad surface itself is OK; it’s responsive and accurate to use for navigating the OS. Multi-touch is supported, which is nice to have if your OS or application supports gestures.
The keyboard connects to the tablet by way of two little plastic prongs. These don’t actually clip the tablet to the keyboard in any semi-permanent way, so it’s quite possible that the tablet might fall out if you’re not careful! A rudimentary kickstand is achieved by folding the back cover horizontally, which is a pretty clever design.
Overall the keyboard stands as a disappointment. Having one is all but a requirement to get the full experience from a tablet, for example if you want to write out a longer email, work on a spreadsheet, code or input some shell commands.
The StarLite MK V display includes support for MPP active pens. This admittedly isn’t a strong requirement for me, however for completeness I wanted to try it out. The pen itself is an extra £30 with your order, and two extra nibs are included as spares. The pen seems reasonably accurate, although, being artistically challenged, I’m not able to make an authoritative judgement on that.
Final Words
It’s great to finally have hold of the StarLite MK V. If you’re reading this review I’m assuming an interest in open source and by extension Linux. If you want the StarLite MK V you really must want this tablet and be prepared to live with its limitations.
It’s hard to talk about this without acknowledging that you can find an iPad or Android tablet for less money that will also provide an app store with a fully fledged selection of apps. If that’s important to you, look away now.
The tablet itself maintains the usual Star Labs traditions of great packaging, a smartly designed chassis and plenty of flexibility to customise the tablet’s storage, accessories and pre-loaded OS. The customer support has been great as always.
As alluded to, battery life and the keyboard are the most noticeable drawbacks of this tablet. The battery life may be addressed in the future by improvements to the Coreboot firmware and kernel updates. The keyboard, however, cannot, and despite being an option it’s very much a requirement for situations where you need a hardware keyboard or trackpad for system tasks.
For Linux enthusiasts out there the StarLabs Starlite MK V is well worth a look. Should you have any questions do let me know in the comments!
It’s been some time (over)thinking, (over)planning and (over)delaying but today marks the first day of my career break.
I have been working for Lake Financial Systems for the past 8 years as a Technical Consultant specialising in Microsoft SQL Server, Microsoft Windows Server and Microsoft Azure. I have now decided it’s time to move along by taking a short career break. Thank you all to the Lake team once again and all the best for the future.
Once the decompression phase is over I’m working on transforming this site into the fully functioning business I always intended it to be. As a business we’ll be offering the best of support to all kinds of organisations with their data and infrastructure requirements. I shall still be blogging and next week plan to release a review of the new Star Labs Linux tablet, the StarLite MK V.
Small form-factor (SFF) PCs are very much in demand in both office and home environments right now. They have a variety of use cases, from a go-to device for basic internet, email and video on the desktop, to use as a mini-server, and everything in between.
UK company Star Labs decided to replace their initial Byte mini-PC with a new version aptly called the Byte Mk II. I was a fan of their StarBook MK V laptop (still in use and writing this review on it) and also of the idea that they started their company in a pub. What could be more British than starting a company in a pub? After a delay with shipping finished units and delivery to customers, it’s finally the day to review this new iteration of their Linux-championing SFF PC.
Specifications
The PC itself measures a rather compact 12.7cm x 12.7cm x 4.3cm (W x H x D) and has a mass of only 265g. The PC is fanless and is cooled by a passive heatsink, so produces no noise.
As for internal hardware and possible configurations from Star Labs’ website expect the following:
Intel N200 SoC – 4 cores @ 1.00GHz, Turbo Boost to 3.70GHz, with 6MB Smart Cache
Intel UHD Graphics at 750MHz frequency.
1 x 8GB / 1 x 16GB / 1 x 32GB DDR4 3200MHz memory module
As noted, the memory and SSD are customisable options. The base price of the system configured with 8GB memory and a 512GB SSD is £504 before discounts as of writing. For the system as reviewed I opted for 16GB memory and a 2TB SSD. I don’t think 32GB memory was an option when I pre-ordered.
I also noted it’s possible to add 2.5″ SSD/HDD via what I presume is an internal connector on the motherboard. That might for example be useful if you were intending to use this PC as a NAS or for some additional storage beyond the M.2 SSD.
As usual with Star Labs, when placing an order you get a choice of operating systems that can be pre-loaded before the unit is sent out to you. There are several Linux distros to choose from, including Ubuntu, MX Linux, Mint, Manjaro, elementaryOS and Zorin. A notable choice is that you may have Windows 11 installed, despite the fact that the Byte Mk II ships with coreboot firmware. This surprised me as the last I heard this was a bit experimental, however if that’s your thing then you’re covered. I opted for no OS to be shipped with the device as I prefer to do that myself.
In the box you’ll also find your choice of power adapter (mine was for the UK) and a VESA mounting kit so that you can attach the Byte 2.0 to the back of a monitor to keep your desk tidy. The rubber feet to cover the screws do not come pre-attached so if you want to change components before powering on you won’t damage the feet.
Connectivity
Front
Power button
3.5mm Headphone Jack
1 x USB 3.0 Type-C
2 x USB 3.0 Type-A
Back
2 x USB 3.0 Type-A
1 x DisplayPort
1 x HDMI
2 x Ethernet
1 x DC Power
According to the specifications the Byte 2.0 can support up to two displays with 4096×2160 resolution at 60Hz. Perfect for dual screen office work and general browser based tasks.
Power On, Coreboot and Performance
As mentioned above the Byte Mk II runs coreboot and not the usual UEFI based firmware. Recently Star Labs have changed from using a tool within the Linux desktop to a built-in menu system accessible by repeatedly pressing F2 / Del on the keyboard at boot. This will also let you choose the boot order for your connected devices. If you are used to configuring a UEFI or BIOS system then coreboot will be pretty similar. As mentioned above, I think this system will even boot Windows if that’s really your thing, but do check with Star Labs first.
Initial power-on was without any drama. As mentioned above the Byte Mk II is passively cooled, so no noise could be heard at all. I attached a USB drive to the Byte Mk II’s front panel and proceeded to install Proxmox VE 8.2. The OS installer recognised the SSD and network adapters with no issues. If you use a recent version of your favourite Linux distribution you should have no problems.
I haven’t benchmarked the CPU, GPU or the SSD for the purpose of this review. Partly because that’s not directly doable on Proxmox and also because I don’t feel anyone’s going to be buying this PC for high performance requirements. Sorry!
The Intel N200 SoC was powerful enough to run Proxmox as a learning environment consisting of the base hypervisor and two CLI-only instances of Ubuntu 24.04. Performance was acceptable given the 4 cores and single-channel memory in use, however I would expect further load to strain the system somewhat.
Final Words
The Star Labs Byte Mk II is easily a recommended low power, Linux focused SFF PC. The packaging gives the usual excellent first impression.
Worth also noting that Star Labs produces a disassembly guide to help you replace or upgrade components. I like to see importance given to serviceability so kudos for supporting the movement for self-repair.
I’ll be using this PC to experiment with Proxmox VE. Thanks to the dual network ports you may find this device pretty useful as a low-powered firewall, router or file server, for example. The Intel N200 SoC may be limiting for some uses such as gaming and bulk video transcoding, however as it’s a similar chip to those used in retail NAS units you may well find it suited to a media PC too. Of course the Byte Mk II would also serve well as a home office PC sat behind your monitor.