Getting Mucky in SDL Tridion Publishing

The general view of publishing that most of us see is the publishing queue: a long list of jobs that get processed and move between statuses such as "Waiting for Publish", "In Progress" and, if you are unlucky, "Failed". However, there is a lot of additional information lurking under the hood that could be considered pretty useful.

There are a lot of use cases where the information stored with publishing jobs might be applicable to you, so it is worthwhile picking up the documentation and taking a look. Between the 2009 and 2011 .NET APIs there have been a lot of improvements, but the main one is that in 2011 you can access all the data through API calls rather than loading up the publish transaction XML and surgically picking out what you want to know.

The use case that I know best is measuring the performance of the content being rendered. For a customer, we wanted to know how quickly everything was being rendered on a per-item basis. We also wanted to gather data for longer-term analysis (e.g. are we improving the overall performance on a day-to-day basis?). To do this, we extract the data from the queue for a given day in a pipe-separated format for import into Excel. Over time we have built a very complete picture of the growth and performance of publishing.

Now, before I dive into code, I have to declare that I am not a programmer; I am a Technical Account Manager, which was likened last week to being retired from being a consultant. So my skills are not as good as some people I could mention (in fact, those who implied that I was retired from useful things). However, it works! And for this, that is the most important thing.

So, to look into the queue items you need to do the following steps:

  1. Get the queue transaction list and loop through it
  2. For each transaction get the transaction itself
  3. Dig around for details

In detail this looks something like…

Get the queue transaction list and loop through it…

We need to start a session, get the list, get the XML document and then the nodes. This is the only XML-related work you have to do, which is the big plus over the 2009 API.

// RemoteUser is the account to impersonate; QueueFilter is the list filter
// (e.g. a PublishTransactionsFilter) that selects which transactions to return.
Session TridionSession = new Session(RemoteUser);
XmlElement QueueTransactionElement = TridionSession.GetList(typeof(PublishTransaction), QueueFilter);
XmlDocument QueueItems = QueueTransactionElement.OwnerDocument;
XmlNodeList QueueNodes = QueueItems.SelectNodes("tcm:ListPublishTransactions/tcm:Item", GetNamespace(QueueItems.NameTable));
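
The GetNamespace call here is not part of the API; it is a small helper that registers the tcm prefix so SelectNodes can resolve it. A minimal sketch of such a helper, assuming the standard Content Manager namespace URI:

private static XmlNamespaceManager GetNamespace(XmlNameTable nameTable)
{
    // Map the "tcm" prefix used in the list XML to the Content Manager namespace
    XmlNamespaceManager ns = new XmlNamespaceManager(nameTable);
    ns.AddNamespace("tcm", "http://www.tridion.com/ContentManager/5.0");
    return ns;
}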

Lastly, we loop over the QueueNodes in a nice for loop.
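
Something like this, with the per-transaction work from the next sections inside the body:

for (int i = 0; i < QueueNodes.Count; i++)
{
    // per-transaction work goes here (see the next sections)
}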

Get the transaction

To get a transaction we need the TCM URI of the publish transaction (job), and then we can load the transaction object:

string transactionId = QueueNodes.Item(i).Attributes["ID"].Value;
PublishTransaction publishTransaction = (PublishTransaction)TridionSession.GetObject(transactionId);

Digging around in the mud

The transaction

So now we can see what we can get from a transaction. Let’s start with the basics and let’s assume we are just looking at successful publishing jobs.

So starting with some general details:

The Transaction ID: publishTransaction.Id

The ID of the item being published: publishTransaction.Items.First().Id – This reveals the development path of the API: there is only ever one item, yet it is still exposed as a collection of items.

The Item Type being published: publishTransaction.Items.First().Id.ItemType

The title of the item being published: publishTransaction.Items.First().Title.ToString()

The priority: publishTransaction.Priority.ToString()

The purpose: publishTransaction.Instruction.ResolveInstruction.Purpose.ToString() – This is either publish or unpublish. According to the documentation, it should also have a "re-publish" state but I can't seem to get this to work.

Who published: publishTransaction.Creator.Title.ToString()

Duration:

DateTime dateTransactionStart = publishTransaction.Instruction.StartAt;
DateTime dateLastStatusChange = publishTransaction.StateChangeDateTime;
TimeSpan tsDuration = dateLastStatusChange - dateTransactionStart;

The tsDuration is now how long our job took to complete, from start (the time it went into the queue) to end (the time its status was changed to "Success"). If you submitted a lot of jobs at once, then for some of them this time will be long because it includes queuing time.
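
Tying this back to the pipe-separated export mentioned at the start, the fields can then be glued together into one record per transaction. A minimal sketch (the field order is my own choice; First() needs a using System.Linq directive):

string record = string.Join("|", new string[] {
    publishTransaction.Id.ToString(),
    publishTransaction.Items.First().Id.ToString(),
    publishTransaction.Items.First().Title,
    publishTransaction.Priority.ToString(),
    publishTransaction.Creator.Title,
    tsDuration.TotalSeconds.ToString()
});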

The job itself

Within the transaction is the context. The context holds the actual job itself; for instance, it contains details of the items the job resolved to.

The contexts are available as publishTransaction.PublishContexts.
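
PublishContexts is a collection, so we loop over it as well. A sketch that introduces the transactionContext variable used in the snippets below (assuming the context type is PublishContext):

foreach (PublishContext transactionContext in publishTransaction.PublishContexts)
{
    // per-context details (see below)
}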

We can then…

Get the count of the processed items: transactionContext.ProcessedItems.Count

The publication: transactionContext.Publication.Title.ToString()

The Publication Target name:  transactionContext.PublicationTarget.Title.ToString()

The processed items

Then within the context we have the processed items, which we can loop over to get yet more details:

The processed item id: processedItem.ResolvedItem.Item.Id

The time it took to render: processedItem.RenderTime.TotalMilliseconds – RenderTime is a TimeSpan, and note that the Milliseconds property would only return the milliseconds component, not the full duration.

The template id it was rendered against (if applicable): processedItem.ResolvedItem.Template.Id

The item type of the processed item: processedItem.ResolvedItem.Item.Id.ItemType.ToString()

We can of course do things like add up all the render times to make some more numbers, and if we subtract the total from the duration I mentioned higher up, we get an estimate of how much time was taken to deploy (everything except rendering).
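
As a sketch, per context that could look like this (assuming the processed item type is ProcessedItem; the variable names are mine):

TimeSpan totalRenderTime = TimeSpan.Zero;
foreach (ProcessedItem processedItem in transactionContext.ProcessedItems)
{
    totalRenderTime += processedItem.RenderTime;
}
// Everything that is not rendering: queuing, resolving, transport and deployment
TimeSpan deployEstimate = tsDuration - totalRenderTime;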

Summing it up…

As you can see, there is a wealth of information in the publishing transaction data, and this was just the detail I needed for my purposes. I suspect there is a lot more in there, and playing around with the API is somewhat like digital archeology. To help you out, I've added the scripts I use for measuring publishing on SDL Tridion 2011 SP1, which you can download, play with and even use to collect your ready-made statistics!

Download QueueView2011_v0.3. This is an alpha release and requires additional work to make it production-ready.
