
Experience Sitecore !

More than 200 articles about the best DXP by Martin Miles

Why CNAB could be a game-changer for Docker containers, and how Sitecore can benefit from it

Introduction

Currently, in order to run Sitecore in Docker locally, one has to pull the code from a GitHub repository, build the images (or pull already-built images from a Docker registry), set the license file, and run docker-compose up, not to mention the prerequisites required. When targeting a cloud, a deployment into Kubernetes is needed as well, which demands yet another set of skills.

Imagine an ideal situation where you don't need to do any of that: you just pull a "managing" image from a remote registry, and that image itself takes care of running everything internally, such as checking prerequisites, pulling the dependent images, preparing the environment, networking, and dependencies, performing the steps you would otherwise script with docker-compose, and much more.


Or, going far beyond that: shipping it as a traditional GUI installer for non-technical people, or deploying that same image into a cloud, any cloud, ready to use. Doesn't that sound desirable? What if I told you this technology is already available and you can use it today? Please welcome the new universal spec for packaging distributed apps, created by Microsoft and Docker:

Cloud-Native Application Bundles or simply CNAB

Firstly, what the heck is CNAB at all? Using a Cloud-Native Application Bundle, a developer can deploy an app either locally on a dev machine or in a public cloud; the bundle bolts containers and services together into a seamless whole, with a strong focus on standardization. It is an open-source specification that aims to facilitate the bundling, installing, and managing of containerized apps. With such a bundle, users can define resources that can then be deployed to a number of runtime environments, such as Docker, Azure, Kubernetes, Helm, automation services (such as those used by GitOps), and more.

At first glance, that task is supposed to be solved by Docker itself. However, when dealing with large-scale hybrid infrastructures, its standard features become insufficient. Thus, CNAB is an attempt to standardize the packaging, deployment, and lifecycle management of distributed apps on top of Kubernetes, Helm, Swarm, and others, using a unified JSON-based package format.

Recently the CNAB spec reached version 1.0, which means it is ready for production deployment. The spec itself is now broken down into several chapters:

  • CNAB explains the fundamentals of the CNAB core 1.0.
  • CNAB Registry will describe how CNAB bundles can be stored inside of OCI Registries (this section is not yet complete).
  • CNAB Security explains the mechanisms for signing, verifying, and attesting CNAB packages.
  • CNAB Claims describes the CNAB Claims system and shows how records of CNAB installations are formatted for storage.
  • CNAB Dependencies describes how bundles can define dependencies on other bundles.

Tooling

Each of the two organizations has provided its own tooling that demonstrates CNAB capabilities: Microsoft released Duffle, while Docker shipped Docker App. The Docker Desktop application has been fully compatible with CNAB since May 2019.

CNAB is not the only solution for managing the cloud application lifecycle. For example, Kubernetes has the Crossplane manager as well as the Helm package manager. However, CNAB is the first solution that supports several of the most popular tools and is platform-agnostic. By the way, CNAB can also work with Helm; I came across a sample of that on GitHub.

Duffle is a simple command-line tool that interacts with Cloud-Native Application Bundles, helping you package and unpackage distributed apps for deployment on whatever cloud platforms and services you use. Its goal is to exercise all parts of the specification. The tool also comes with very handy VS Code extensions, one of which, named Duffle Coat, allows you to create a native executable installer (*.exe) from your bundle:
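To give a flavour of it, a typical Duffle session looks roughly like this (a sketch only; the bundle and installation names are hypothetical):

```shell
duffle init                                  # one-time setup: creates local config and signing keys
duffle build .                               # build a bundle from the duffle.json in the current folder
duffle install my-sitecore example/sitecore-bundle:0.1.0   # install a bundle under a chosen name
duffle list                                  # show installations known to duffle
duffle uninstall my-sitecore                 # tear everything down again
```

Each installation is tracked as a claim, so Duffle knows what it has installed and can upgrade or uninstall it later.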


This results in a proper installer that will install and configure Sitecore 9.3 locally from a bundle image stored in a Docker registry:



Once again: instead of a local Sitecore installation (the way SIA does it), we have a CNAB installer that installs the platform from a Docker registry with no prerequisites required at all! The CNAB bundle takes care of all the dependencies and parameters. What magic!

Another tool, Porter, is Microsoft’s CNAB builder that gives you building blocks to create a cloud installer for your application, handling all the necessary infrastructure and configuration setup. It provides a declarative authoring experience that lets you focus on what you know best: your application. The power of Porter comes from mixins, which give CNAB authors smart components that understand how to adapt existing systems, such as Helm, Terraform, or Azure, into CNAB actions.
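To illustrate that declarative style, a minimal Porter manifest could look something like the sketch below (hypothetical: the name, tag, and scripts are illustrative only, and it uses the basic exec mixin):

```yaml
name: sitecore-demo
version: 0.1.0
description: "An example CNAB authored with Porter"
tag: example/sitecore-demo:v0.1.0

mixins:
  - exec

install:
  - exec:
      description: "Run the install script"
      command: bash
      arguments:
        - ./install.sh

uninstall:
  - exec:
      description: "Clean everything up"
      command: bash
      arguments:
        - ./uninstall.sh
```

Swapping the exec mixin for the helm, terraform, or azure mixin is what turns the same manifest shape into a cloud-aware installer.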


And of course, Docker App is a CNAB builder and installer that leverages the Docker Compose format to define your applications. To facilitate developer-to-operator handover, you can add metadata and run-time parameters. These applications can easily be shared using existing container registries. Docker App is available as part of the latest Docker release.
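The Docker App workflow is similarly compact; sketched with hypothetical names, it could look like:

```shell
docker app init hello                                      # scaffolds hello.dockerapp (compose file, metadata, parameters)
docker app validate hello                                  # checks the metadata and parameter defaults
docker app push hello --tag example/hello.dockerapp:0.1.0  # share it via a container registry
docker app install example/hello.dockerapp:0.1.0           # install on the target environment
```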

Bundle manifest file
As I said above, the specification defines a way of packaging distributed applications of various formats. CNAB includes a package definition (named bundle.json) used to describe an app, as well as a special image (also called the invocation image) used for its installation. A bundle.json is similar to a docker-compose.yml file in that it describes a complex configuration for image deployment. The difference is that a CNAB bundle is very clearly defined as to how it should be laid out, how it is encoded, and where all associated files must reside. It contains:

  • The schema version.
  • Top-level package information.
  • Information on invocation images.
  • Map of images.
  • Specification for parameter override (with a reference to a validation schema).
  • List of credentials.
  • Optional description of custom actions.
  • A list of outputs produced by the application.
  • A set of schema definitions used to validate input.
Here is a sample bundle.json file:
{ 
   "credentials":{ 
      "hostkey":{ 
         "env":"HOST_KEY",
         "path":"/etc/hostkey.txt"
      }
   },
   "custom":{ 
      "com.example.backup-preferences":{ 
         "frequency":"daily"
      },
      "com.example.duffle-bag":{ 
         "icon":"https://example.com/icon.png",
         "iconType":"PNG"
      }
   },
   "definitions":{ 
      "http_port":{ 
         "default":80,
         "maximum":10240,
         "minimum":10,
         "type":"integer"
      },
      "port":{ 
         "maximum":65535,
         "minimum":1024,
         "type":"integer"
      },
      "string":{ 
         "type":"string"
      },
      "x509Certificate":{ 
         "contentEncoding":"base64",
         "contentMediaType":"application/x-x509-user-cert",
         "type":"string",
         "writeOnly":true
      }
   },
   "description":"An example 'thin' helloworld Cloud-Native Application Bundle",
   "images":{ 
      "my-microservice":{ 
         "contentDigest":"sha256:aaaaaaaaaaaa...",
         "description":"my microservice",
         "image":"example/microservice:1.2.3"
      }
   },
   "invocationImages":[ 
      { 
         "contentDigest":"sha256:aaaaaaa...",
         "image":"example/helloworld:0.1.0",
         "imageType":"docker"
      }
   ],
   "maintainers":[ 
      { 
         "email":"matt.butcher@microsoft.com",
         "name":"Matt Butcher",
         "url":"https://example.com"
      }
   ],
   "name":"helloworld",
   "outputs":{ 
      "clientCert":{ 
         "definition":"x509Certificate",
         "path":"/cnab/app/outputs/clientCert"
      },
      "hostName":{ 
         "applyTo":[ 
            "install"
         ],
         "definition":"string",
         "description":"the hostname produced installing the bundle",
         "path":"/cnab/app/outputs/hostname"
      },
      "port":{ 
         "definition":"port",
         "path":"/cnab/app/outputs/port"
      }
   },
   "parameters":{ 
      "backend_port":{ 
         "definition":"http_port",
         "description":"The port that the back-end will listen on",
         "destination":{ 
            "env":"BACKEND_PORT"
         }
      }
   },
   "schemaVersion":"v1.0.0",
   "version":"0.1.2"
}
You may read more about the bundle.json format on the official CNAB.io page.
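Since the layout is strict JSON, even a tiny script can sanity-check a bundle before publishing. The sketch below is not an official validator; it only checks for the required top-level fields shown in the sample above (schemaVersion, name, version, invocationImages):

```python
import json

# Required top-level fields of a bundle.json (a simplified subset of the spec)
REQUIRED = ["schemaVersion", "name", "version", "invocationImages"]

def missing_fields(text):
    """Parse a bundle.json document and return any missing required top-level fields."""
    bundle = json.loads(text)
    return [field for field in REQUIRED if field not in bundle]

sample = '{"schemaVersion": "v1.0.0", "name": "helloworld", "version": "0.1.2"}'
print(missing_fields(sample))  # → ['invocationImages']
```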

How about Azure?
For Azure, there is also a solution: the Azure CNAB Quickstarts library. It demonstrates how one can use bundles for deploying applications and solutions, and how to create one's own bundles. The library is optimized for bundles that use Azure resources, but is not limited to Azure only. The repository has a CI/CD workflow using custom GitHub Actions to automatically build bundles and publish them to a public Azure Container Registry. The library supports bundles made with the Porter tool I mentioned above, which is capable of building, publishing, invoking, and updating bundles.

Final thoughts
CNAB is likely to become a game-changer in 2020, as we move more and more into containerized deployments and orchestrating them in the clouds. The specification is quite new and not many companies are using it yet, but interest in it is ever-growing. Since all the major vendors are now on board, I am quite sure it will boost the whole industry in the coming months!

Hope you find this post helpful and get your own hands on CNAB shortly!

Uploading scheduled auto-backups of editors' content for your Helix / SXA website into Azure Blob storage

INTRODUCTION

I assume you are following good practices and develop Helix-powered websites (or SXA, which follows Helix by definition). In that case, you separate actual user-generated content items from definition items, which are normally shipped along with each release and created by developers (if not, then please refer to an article I've previously written on that). So you end up having a Unicorn configuration that stores all of your author-edited content:

We are going to split the process into 3 major steps:

  1. Re-serialize Unicorn configuration for content items
  2. Archive serialized content
  3. Upload an archive into Azure Blob Storage

IMPLEMENTATION

1. Re-serialize Unicorn configuration for content items. Luckily, Unicorn provides us with the MicroCHAP.dll library, which helps automate the sync process as part of your deployment (note that the DLL and the corresponding PowerShell module Unicorn.psm1 should be referenced from your code). The good news is that any verb can be passed to it, not just Sync, so one can use Reserialize. The call will look like:

$ErrorActionPreference = 'Stop'
$ScriptPath = Split-Path $MyInvocation.MyCommand.Path

Import-Module $ScriptPath\Unicorn.psm1
Sync-Unicorn -Verb 'Reserialize' -Configurations @('Platform.Website.Content') -ControlPanelUrl 'https://platform.dev.local/unicorn.aspx' -SharedSecret '$ecReT!'

2. Archiving serialized content is the next step. If you click the Show Config button for the Platform.Website.Content configuration on the Unicorn configuration page, you will find all the relevant information about it, including the physical folder where items are serialized. That is the folder we need to archive. The step looks like:

$resource = "Content_$(get-date -f yyyy.MM.dd)"
$archiveFile = "d:\$resource.7z"
$contentFolder = "C:\inetpub\wwwroot\Platform.dev.local\App_Data\serialization\Project\serialization\Content"

7z a -t7z -m0=lzma -mx=9 -mfb=64 -md=32m -ms=on $archiveFile $contentFolder
I am using 7zip as the archiver, as my content is slightly more than 4 GB and the traditional zip format cannot handle that. As a hidden bonus, I get the best compression ratio 7zip offers. It would also be worth checking whether a file with that name already exists at the target and deleting it before archiving, especially if you run the script more often than daily.
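That existence check could be sketched as follows (reusing the $archiveFile variable from the step above):

```powershell
# Delete a stale archive with the same name, if any, before re-archiving
if (Test-Path $archiveFile) {
    Remove-Item $archiveFile -Force
}
```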

3. Uploading to Azure Blob Storage concludes the routine. To make this happen, you should have a subscription and, ideally, a connection string for your Storage account. Then you may use the following code to achieve the result:
$containerName = "qa-serialization"
$ctx = New-AzureStorageContext -ConnectionString "DefaultEndpointsProtocol=https;AccountName=lab1234;AccountKey=y9CFAKE6PQYYf/vVDSFAKEzxOgl/RFv03PwAgcj8K80mSfQFDojdnKfakeaLMva0S9DbrQTzNjDMdGCp7rseRw==;EndpointSuffix=core.windows.net"

Set-AzureStorageBlobContent -File $archiveFile -Container $containerName -Blob $resource -Force -Context $ctx

Also, it is assumed you already have the blob container created; if not, you need to create it upfront:
New-AzureStorageContainer -Name $containerName -Context $ctx -Permission blob
Optionally, you may want to delete the temporary archive once it's uploaded to blob storage.


RUNNING

Here's the entire script that works for me:
# Step 1: re-serialize user-generated content
$ErrorActionPreference = 'Stop'
$ScriptPath = Split-Path $MyInvocation.MyCommand.Path

Import-Module $ScriptPath\Unicorn.psm1
Sync-Unicorn -ControlPanelUrl 'https://platform.dev.local/unicorn.aspx' -Configurations @('Platform.Website.Content') -Verb 'Reserialize' -SharedSecret '$ecReT!'
# Step 2: archive serialized user-generated content with 7zip using best compression
$resource = "Content_$(get-date -f yyyy.MM.dd)"
$archiveFile = "d:\$resource.7z"
$contentFolder = "C:\inetpub\wwwroot\Platform.dev.local\App_Data\serialization\Project\serialization\Content"

7z a -t7z -m0=lzma -mx=9 -mfb=64 -md=32m -ms=on $archiveFile $contentFolder

# Step 3: upload generated content into Azure Blob Storage
$containerName = "qa-serialization"
$ctx = New-AzureStorageContext -ConnectionString "DefaultEndpointsProtocol=https;AccountName=lab1234;AccountKey=y9CFAKE6PQYYf/vVDSFAKEzxOgl/RFv03PwAgcj8K80mSfQFDojdnKfakeaLMva0S9DbrQTzNjDMdGCp7rseRw==;EndpointSuffix=core.windows.net"
Set-AzureStorageBlobContent -File $archiveFile -Container $containerName -Blob $resource -Force -Context $ctx

# Step 4: clean up after yourself
Remove-Item $archiveFile -Force
I run it daily via Windows Task Scheduler in order to get a daily snapshot of editors' activity. The script produces the following output:


As a result of running the script, I get an archive appearing in Azure Blob Storage:



RESTORING

There's no sense in making backups unless you confirm that restoring the data from them works well. For content items: download an archive, extract it, substitute the content serialization folder with what you've extracted, then sync the content configuration. As simple as that!
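Those restore steps could be sketched like this (a hypothetical reversal of the backup script above; the snapshot name and extraction folder are illustrative, and $containerName / $ctx are the same as in the upload step):

```powershell
$resource = "Content_2020.01.01"    # the snapshot to restore
$archiveFile = "d:\$resource.7z"
$contentFolder = "C:\inetpub\wwwroot\Platform.dev.local\App_Data\serialization\Project\serialization\Content"

# 1. Download the archive from blob storage
Get-AzureStorageBlobContent -Blob $resource -Container $containerName -Destination $archiveFile -Context $ctx

# 2. Substitute the serialization folder with the extracted content
Remove-Item $contentFolder -Recurse -Force
7z x $archiveFile -od:\restore
Move-Item "d:\restore\Content" $contentFolder

# 3. Sync the content configuration back into Sitecore
Sync-Unicorn -Verb 'Sync' -Configurations @('Platform.Website.Content') -ControlPanelUrl 'https://platform.dev.local/unicorn.aspx' -SharedSecret '$ecReT!'
```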

Note that your content items should be aligned with the definition items, or the restore may not work well!

Hope this post helps!