Julian Wraith


Continuous Delivery vs Content Management

Recently I had a discussion where the initial question somewhat baffled me. Having thought about it more, I want to write something about it to see if I can come to a nice conclusion. The question was: is Continuous Delivery a threat to Content Management? The form of the question presupposes that the asker believes Continuous Delivery really is a threat to Content Management, but why?

As someone who tries to take an independent stance but leans heavily on Content Management for the staple of my work, my initial reaction is: no, Continuous Delivery is not a threat to Content Management, and nor is Content Management a threat to Continuous Delivery. Both have a place in any internet delivery environment, and the question is a little like comparing apples and pears. But for kicks, let's look at it in a little more detail.

What is Content Management (CMS)?
Specifically, we are talking about Web Content Management (rather than the general definition). Wikipedia describes this as:

A web content management system is a software system that provides website authoring, collaboration, and administration tools designed to allow users with little knowledge of web programming languages or markup languages to create and manage website content with relative ease. A robust Web Content Management System provides the foundation for collaboration, offering users the ability to manage documents and output for multiple author editing and participation. (source: https://en.wikipedia.org/wiki/Web_content_management_system)

Systems like SDL Web (Tridion) make good on this: they allow non-technical users to edit site content (and even manipulate layout), collaborate on content, and version and reuse content across multiple channels and sites. Some systems allow for additional integrations to support content creation, such as translation systems, DAM and so on, and changes can be made to a production website in a matter of minutes. Not all CMS platforms support direct updates; some instead rely on periodic refreshes of the content.

What is Continuous Delivery (CD)?
Continuous Delivery differs in what it, as an approach, is trying to solve. Wikipedia describes it as:

Continuous Delivery is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently. The approach helps reduce the cost, time, and risk of delivering changes by allowing for more incremental updates to applications in production. A straightforward and repeatable deployment process is important for continuous delivery. (source: https://en.wikipedia.org/wiki/Continuous_delivery)

The focus here is on agility of change in the development lifecycle, with heavy use of automation of repetitive tasks to drive productivity and quality improvements. The automated stages are things like build, test and deployment. CD toolchains integrate products covering collaboration, unit testing, versioning and source control, and they are typically focused on product development (code, which could be a website). That development can, and often does, include editing assets such as labels, (website) content and binary objects.

Comparing the two
The two overlap in two areas: version control and pipeline management. Both paradigms focus on rapid delivery of assets, and they are only comparable if we are talking about the delivery of a website; a CMS is no good for delivering a desktop application. While many CMSs support the delivery of code and come with a web application, SDL Web does not mandate such a thing: you can develop any application you would like, with varying degrees of code in the CMS itself. SDL's current recommended practice is not to include code in the CMS, but to develop a separate application and have SDL Web deliver content, which it can do in any form you need (e.g. JSON or XHTML). Continuous Delivery, by contrast, specializes in the delivery of code and assets.

In Continuous Delivery, you can check content assets into a versioning system (e.g. Git) and include them in the build that is eventually deployed. Content can be edited with a suitable IDE in a semi-non-technical way; I do not want to say completely non-technical, because an IDE is typically still a technical tool. CMS systems, by contrast, focus on empowering non-technical users: organizations that use SDL Web have non-technical marketing users editing content either via forms or with tools like Experience Manager. Content entered into the versioning system can then be pushed through the delivery pipeline and deployed together with all the application code. This has an advantage in deployment agility, because the content is always delivered with the deployment and is exactly as available as the application. Where SDL Web sometimes has challenges is that you need a single, scaled and redundant source of content for the web application; that content source always needs to be available for the web application to work. However, separating the pipelines of code and content, using CD and WCM respectively, means you can make minute-by-minute changes to the website without requiring the application to be redeployed. If you want to separate your web application from your CMS, content can be delivered through content-as-a-service, as sketched below.
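To illustrate the content-as-a-service pattern, here is a minimal sketch of a web application fetching a page's content as JSON from a content service at request time, so content changes reach the site without redeploying the application. The endpoint URL and response fields are hypothetical and not the actual SDL Web API.

    import requests  # third-party HTTP client

    # Hypothetical content-as-a-service endpoint; not the actual SDL Web API.
    CONTENT_SERVICE_URL = "https://content.example.com/api/pages"

    def fetch_page_content(page_id: str) -> dict:
        """Fetch a page's content as JSON from the (hypothetical) content service.

        The web application stays decoupled from the CMS: updated content is
        picked up on the next request, with no application redeployment.
        """
        response = requests.get(f"{CONTENT_SERVICE_URL}/{page_id}", timeout=5)
        response.raise_for_status()
        return response.json()  # e.g. {"title": ..., "body": ..., "regions": [...]}

    if __name__ == "__main__":
        page = fetch_page_content("home")
        print(page.get("title"))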

Conclusion
Every web application needs content, so if you do not have a CMS you will need to deliver your content through CD. CD will provide enough features to edit and manage content, provided you have the right people and you allow a suitable speed of updates in the form of multiple deployments per day. What you lose by not having a CMS are the features a CMS would bring, such as content inheritance, translation, inline editing and minute-by-minute content updates. For a simple site (i.e. a micro-site), a full enterprise CMS is perhaps overkill, especially if you do not already have one. If you do already have a CMS, reusing its content and content editing processes is a considerable plus.

If your website is larger in content terms than a simple site, and is really multiple sites in multiple languages with a high amount of content reuse, then using CMS and CD together seems to be the ideal solution. You can manage all your content for all your channels (including campaigning) through one tool and develop awesome apps in record time with CD. One is not a threat to the other.

Going forward, I would recommend that your deployments follow a microservice architecture in which your CMS content is delivered as a service (along with other capabilities such as targeting). That way, all deployed sites take advantage of centrally managed content, application deployments are not weighed down by large volumes of content assets, and CMS features like content targeting are uniformly available on all channels.

Photo credit: Ian Brown (Flickr)

SDL Tridion in the Cloud – Tridion Developer Summit 2014

Earlier this year I gave a lightning presentation at the Tridion Developer Summit in Amsterdam. The event was awesome and really well attended. Here are the slides that I used:

SDL CXC and Tridion

Today I posted some slides to SlideShare on SDL CXC and Tridion. They are just a simple overview of what we have done for Tridion on SDL's CXC platform.

The SDL Tridion Reference Implementation

Recently, SDL released the Tridion Reference Implementation, or TRI, to the community. The first edition of the TRI is based on .NET and aims to provide a reference for customers and partners on a preferred approach to an implementation.

Traditionally, Tridion has had an open architecture in which a delivery implementation (e.g. an MVC framework) was not supplied with the product. This allowed customers to choose the delivery architecture that suited their needs, and as a result we see variations from simple ASP.NET/JSP page-based sites to fully dynamic MVC-based implementations. With the release of the TRI, nothing changes in that regard, but, should you wish, there is now an implementation you can start from.

The TRI is an MVC-based application that shares common elements and approaches with the community MVC framework, DD4T. This means that customers using DD4T are compatible with the TRI from a Content Management point of view and, so long as the TRI receives content in the right format, it can be independent of the content provider through its support for the semantic web (by default, Tridion is the content provider). The templates are managed in the application and not in content management; the supplied templates follow a RESS approach and can be re-branded using Bootstrap/LESS. This approach means you can rapidly rebrand the application to another design, and it can support a wide variety of devices with one site rather than separate websites for different devices.
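To illustrate what independence from the content provider means in practice, here is a minimal, language-agnostic sketch (written in Python purely for brevity) of a content-provider abstraction; the interface and class names are hypothetical and are not the actual TRI or DD4T APIs.

    from abc import ABC, abstractmethod

    class ContentProvider(ABC):
        """Hypothetical abstraction: the view layer depends only on this
        interface, not on where the content comes from (Tridion, DD4T, a
        static stub, ...)."""

        @abstractmethod
        def get_page(self, url: str) -> dict:
            """Return the structured content for a page in the agreed format."""

    class StaticContentProvider(ContentProvider):
        """Example provider serving content from an in-memory dictionary,
        useful for local development or tests."""

        def __init__(self, pages: dict):
            self._pages = pages

        def get_page(self, url: str) -> dict:
            return self._pages[url]

    # The application renders whatever provider it is given.
    provider: ContentProvider = StaticContentProvider({"/": {"title": "Home"}})
    print(provider.get_page("/")["title"])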

The current release, version 1, will be replaced by the upcoming version 2, which will add new features (implementing more SDL product capabilities), will have a Java version, and will become part of the SDL product stack, available with the installation of Tridion. Right now, you can contribute to the release by going ahead and using it.

You can download it from SDL Tridion World and look at the sources on GitHub.

Five years of MVP events

The 2012 MVP event (via Chris Summers, @UrbanCherry)

Since the SDL Tridion MVP program started five years ago, five people have managed to be an MVP every year. I count myself among the lucky few who have managed, sometimes narrowly, to remain privileged enough to attend the events. Each of the five events has been held in and around Lisbon, Portugal. Over time the group has grown from 10 to around 25, and the balance of members has steadily shifted in favor of community members external to SDL.

Each year around this time, we descend on Lisbon and embark on a long weekend of work. From Thursday to Saturday the team works on various community-related activities, covering both open source projects and thought leadership.

This year the team is working heavily on the Tridion Reference Implementation, SDL's newly released delivery framework. Currently a community edition based on .NET, it is planned to become fully supported and delivered out of the box with the product. That move will come with an updated version sometime in the not-too-distant future.

This year, the city of Evora in Portugal is our home: a tourist destination somewhere between Lisbon and Madrid. It's old, built by the Romans, and probably not really ready for the MVPs. I hope they have a strong constitution…

Provisioning Tridion in the cloud

See! White and Fluffy

It's been a little over a year since we started the "Tridion in the Cloud" project at SDL. Traditionally, SDL's flagship content management system, Tridion, was an on-premise solution: customers either installed the product on their own hardware or private cloud, or hosted it with an SDL partner. However, Tridion was the only SDL product that could not be hosted with SDL itself. So, a year back, we set out to change this so that SDL's customers could have a completely managed SDL solution.

Code or Infrastructure

In my day-to-day role, I talk with many of SDL's customers who are looking to re-provision their Tridion environments and are taking the opportunity to look at cloud providers like AWS or Azure. This is typically IT-driven, motivated by cost reduction or by outsourcing responsibility for the platform to experts. What I rarely come across is the idea of moving to a more agile approach to infrastructure. Public cloud providers like AWS and Azure have APIs for creating infrastructure with code, which means you can move from treating infrastructure as a series of interconnected virtual machines (servers) to treating it as code objects. This is not a concept that IT teams grasp easily, because infrastructure has traditionally been something you provision and then hand over to the application team to install something onto. Even with templating of virtual machines, which is commonplace, IT teams still work from a very fixed idea of how hardware has always been provisioned.

When you treat infrastructure as code, you can roll the build of the infrastructure, the deployment of the platform (Tridion) and the deployment of the implementation (your websites) into a single provisioning process. If each layer in that process is abstracted, you can even replace individual layers with new or updated ones (e.g. an updated Tridion version). For a single organization it makes little sense to go to the effort of coding all of this just for Tridion, but it would make sense as an approach for all your deployed applications. For SDL, it makes perfect sense when managing many Tridion implementations. Not only does it make building and rebuilding environments fundamentally easier, it also makes configuration changes easier to deploy, makes moving to different regions straightforward, and allows issues with individual servers to be resolved quickly by replacing the server.
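As a rough illustration of that layered idea, here is a minimal sketch in which infrastructure, platform and implementation are abstracted, replaceable steps in one provisioning process; the function names and parameters are hypothetical and do not represent SDL's actual tooling.

    from typing import Dict

    # Hypothetical provisioning layers; each can be swapped for an updated
    # version (e.g. a new Tridion release) without touching the others.

    def provision_infrastructure(env: Dict) -> Dict:
        """Create networks, security groups, DNS and storage; return their identifiers."""
        raise NotImplementedError

    def install_platform(env: Dict, tridion_version: str) -> Dict:
        """Install the CMS platform, required modules and hotfixes."""
        raise NotImplementedError

    def deploy_implementation(env: Dict, site_package: str) -> Dict:
        """Deploy the customer's websites on top of the platform."""
        raise NotImplementedError

    def provision(env: Dict, tridion_version: str, site_package: str) -> Dict:
        """A single provisioning process composed of replaceable layers."""
        env = provision_infrastructure(env)
        env = install_platform(env, tridion_version)
        env = deploy_implementation(env, site_package)
        return env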

Provisioning

When we provision environments at SDL, we undertake the following steps:

  1. Provisioning of the infrastructure, security groups, DNS, tagging and storage
  2. Installation of the Tridion application, required modules, hotfixes and pre-requisites
  3. Configuration of the installed software
  4. Deployment of the customer’s implementation
  5. Hook each infrastructure component to monitoring

The process is very quick. With a traditional physical deployment, you can expect around 12 weeks to order and install the hardware and then an additional 1-2 weeks to get all the software installed. With a fully automated approach, once you have everything set up and ready to go, an environment can be built in less than an hour. Preparation is, of course, potentially time-consuming, but once it is done the process is repeatable in the event of failures or updates to the platform.

To do this, you can control the provisioning process with technologies such as Chef or Puppet and leverage whatever your cloud provider offers to make it easier. SDL uses AWS CloudFormation. CloudFormation is not transferable to Azure, so if you want to be agnostic of cloud provider you will need to control the process entirely with third-party tools rather than cloud-provider-specific technology. SDL chose CloudFormation because it is very powerful and coding something similar would have been time-consuming. We labeled this "pragmatic portability": the provisioning we built could be ported, but we did not want to code forever for scenarios that may or may not happen. Better to know where it is not portable and keep coding to add value to the Tridion product.
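For a concrete flavor of driving CloudFormation from code, here is a minimal sketch using the AWS SDK for Python (boto3); the stack name, template file and parameters are hypothetical, and this is not SDL's actual provisioning code.

    import boto3  # AWS SDK for Python

    cloudformation = boto3.client("cloudformation", region_name="eu-west-1")

    # Hypothetical template describing the whole Tridion environment.
    with open("tridion-environment.template.json") as template_file:
        template_body = template_file.read()

    # Create the stack: infrastructure, security groups, DNS, storage, etc.
    cloudformation.create_stack(
        StackName="tridion-example-prod",
        TemplateBody=template_body,
        Parameters=[
            {"ParameterKey": "TridionVersion", "ParameterValue": "2013SP1"},
            {"ParameterKey": "EnvironmentType", "ParameterValue": "production"},
        ],
        Tags=[{"Key": "customer", "Value": "example"}],
    )

    # Wait until the infrastructure is up before installing the platform.
    waiter = cloudformation.get_waiter("stack_create_complete")
    waiter.wait(StackName="tridion-example-prod")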

Updated awards

Finally got around to updating the awards on the site.

SDL Tridion Translation Manager

This is the last video of 2013, and I am now really starting to enjoy making them. If only I had more time. This one is about Tridion Translation Manager (TTM), the plugin that enables sending content to and from translation with SDL TMS or WorldServer.

SDL Tridion Whiteboarding Experience Manager

In this edition, I look at the basic architecture of Experience Manager and how it interacts with Content Manager.

MVP weekend 2013

I *think* I have recovered, although I am really not sure.

So what happened? Well, I would say a lot of fun and games at the latest edition of the MVP meet-up. It started late for me after I arrived late on Thursday; driving up to the castle at night was awesome, but the final walk was a killer (and also twice the distance it should have been). After an evening of fun, Friday was a day of work and catch-up for me. The project I fitted into was the Tridion Context Engine Wrapper (a punchy title, right?). The Wrapper (let's leave it at that, shall we) wraps the new 2013 SP1 context engine with a set of libraries in .NET (with Java and TCDL to follow) that can be implemented with SDL Tridion 2013 SP1 and the SDL Tridion Content Engine on a delivery website.

Oh so what does that mean?

Well you can do things like this:

<context:Family Names="smartphone,tablet" runat="server">
  <div>this content will only display on smartphones or tablets</div>
</context:Family>

This is really good because you can act conditionally, server side, for different types of device, using definitions that you create yourself. Awesome, huh?

After building documentation on Friday, we ventured out on Saturday to do lots and lots and lots of things. Way too much to really describe, but in short: sour cherry schnapps tasting, off-roading, visiting a lagoon, a guided tour of the town where we stayed (Obidos)… did I forget anything? Not sure; it was just awesome.

 

© 2016 Julian Wraith. All rights reserved.
