Sitecore 10.4.1 is finally here! Everyone's excited about the new release, but with every update, surprises can sneak in. And we've got one such story for you.
This happened on a Sitecore 10.4.0 instance running in an Azure Kubernetes Service (AKS) cluster. One day, a fellow Sitecorian reached out and told me that things had just… broken.
The Problem
He said it was supposed to be a normal release. A developer pushed changes, the pipeline ran, and then, boom, errors everywhere. So we started looking into it. For context, this is the error we got:
The Analysis
Our First Guess: Maybe It’s the Code?
We thought maybe the issue was in the new Pull Request (PR). We checked the changes, but everything looked okay.
Just to be sure, we reverted the PR and tried the release again. Same error.
Then we tried releasing an older build (before the error started). That one worked just fine.
So now we knew—this wasn’t caused by code changes.
The Solution
Soon, we started to feel that something had changed behind the scenes. I remembered seeing the announcement that Sitecore 10.4.1 had been released (my LinkedIn feed was already full of the buzz).
Based on my past experience with CI/CD and my familiarity with Sitecore in AKS, I knew that the release process for Sitecore in AKS is different from on-premise and PaaS releases. The key difference: for Sitecore on AKS, the build is pushed to a container registry, and the custom solution's build artifacts are layered over the base Sitecore image (which is not how on-premise or PaaS releases work).
During the build stage in CI/CD, the base Sitecore image is pulled fresh each time from Sitecore's public container registry, which means it can change between builds. So I had a strong feeling the issue was related to the Sitecore image.
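As a rough sketch, the layering pattern typically looks something like the Dockerfile below. The registry path, tag, and artifact paths here are illustrative examples, not our actual configuration:

```dockerfile
# The base image is pulled from Sitecore's container registry on every build.
# The tag is usually parameterized from env.template / kustomization files.
ARG BASE_IMAGE=scr.sitecore.com/sxp/sitecore-cm:10.4.0-ltsc2022
FROM ${BASE_IMAGE}

# Layer the custom solution's build artifacts over the base Sitecore image.
# (Paths are illustrative; adjust to your solution's output layout.)
COPY ./artifacts/website/ ./inetpub/wwwroot/
```

Because the `FROM` image is re-pulled on each build, anything that changes what the tag points to changes what ships, even if your own code hasn't moved.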
We reached out to Sitecore Support and explained everything to them.
They replied quickly—almost like they were already expecting this problem.
And guess what? They confirmed it: our build had started using the new 10.4.1 image, not 10.4.0 as we thought. The root cause was the wrong Sitecore image being used.
How Did It Happen?
If you are new to Sitecore in Kubernetes and Docker, it's useful to know that the base Sitecore image versions are specified in the env.template and kustomization files.
In our Kubernetes config files (env.template and kustomization.yaml), we were using this image tag: 10.4-ltsc2022
This tag means: “Give me the latest image in the 10.4 series.”
So when 10.4.1 was released, the build automatically started using the latest version in the 10.4 series, even though we were expecting 10.4.0.
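In practice, the tag usually comes from a variable in env.template that the Kubernetes specs consume. A simplified illustration of the floating-tag setup (the variable names and registry path are examples, not necessarily what your repo uses):

```
# env.template (simplified, illustrative)
REGISTRY=scr.sitecore.com/sxp/
SITECORE_VERSION=10.4-ltsc2022

# The deployment specs then reference something like:
#   image: ${REGISTRY}sitecore-cm:${SITECORE_VERSION}
# which resolves to the *newest* 10.4.x image at pull time.
```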
You can check the image reference page to see the latest released images: https://raw.githubusercontent.com/Sitecore/docker-images/refs/heads/master/tags/sitecore-tags.md
To fix the issue, we changed the tag to: 10.4.0-ltsc2022
This locked the version to exactly what we wanted, and the error was gone. Happy ending? Not yet.
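In config terms, the fix was a one-line change. Shown here against an illustrative env.template; your variable names may differ:

```
# Before: floating tag - silently picks up 10.4.1 when it ships
SITECORE_VERSION=10.4-ltsc2022

# After: pinned tag - always resolves to exactly 10.4.0
SITECORE_VERSION=10.4.0-ltsc2022
```

The same pinning applies wherever the tag appears, including kustomization.yaml, so the build stops depending on what "latest 10.4" happens to mean on release day.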
Important Tip for Sitecore in AKS/Docker
Most people reuse the files from the Sitecore MVP-site repo. The snapshot below shows how the Sitecore image version is referenced in these files:
In the snapshot, you can see that the default env.template file (and the k8s files as well) comes with a two-segment image tag, e.g. 10.4-ltsc2022. This needs to be changed to the three-segment form 10.4.x-ltsc2022 (in our case, 10.4.0-ltsc2022).
We passed this information to Sitecore Support so they could notify existing users, who may also be impacted, to make the same change.
Need more details? See the following reference page:
Final Thoughts
This issue took us by surprise, but it taught us something valuable:
Always pin your Sitecore image versions in Kubernetes.
Don’t assume it’ll stay the same forever.
We also told Sitecore Support so they could help others who might face the same problem.
Hope this helps you avoid the same headache!
Until next time, happy Sitecoring! 👋