Tuesday, February 14, 2023

Celebrating 10 years on StackExchange

Well, time does fly when you're having fun. Or getting older, I guess.


At times during our professional careers there are milestones worth celebrating, and next week we have one of those - it will be 10 years since https://tridion.stackexchange.com went live.

It is hard to think back to those days. Before going live on StackExchange we were using the main Stack Overflow site (under the Tridion tag) or the good old Tridion Forum, which I think was hosted on a very old VM in the Amsterdam office. (Quick note: I just realized that the Tridion tag on Stack Overflow is still being used.)


2013 was a really good year for our community. Just look at the list of MVPs for that year. It was also the year the idea for a Tridion Developer Summit, community owned and organized, came up. All because someone (Robert Curlette) thought it would be cool to meet up with his old friends in Amsterdam - and after getting about 100 of us together at the Eye, decided he'd go for it.

I cannot thank enough everyone who worked so hard to make this community what it became, and so many of the strong friendships I have today were forged during those days. People I feel privileged to have worked with, traveled with, drunk with, and more importantly: dreamed with.

So here's a cheer to our beloved community. 10 years well spent together. And a special thank you to the lovely people at StackExchange, who have ALWAYS been super helpful in maintaining our little piece of the Internet.


Tuesday, January 23, 2018

Starting and stopping AWS instances via Powershell

On my long-overdue return to this blog, here's a script I wrote recently to shorten my usually long process of starting a full environment, doing some work on it, and then stopping it.

Objective

I regularly have to run POCs or tests on somewhat complex environments involving 10+ machines. Since I am a cost-conscious person, I will normally start the environment, do my testing/work, and then stop it at the end. All good with this, except that:

  1. Logging in to the AWS console, switching profiles, and switching zones takes a good 2-3 minutes
  2. Finding the instances I need and starting them takes another 30 seconds
  3. Updating all links to my test instances in Remote Desktop Manager (which I use) takes another good 5 minutes (remember we're talking 10 instances).
So I decided I should automate this process, given that's what APIs are for.

The script below will:
  • Find all instances that use a given AWS Security Group
  • Start all those instances
  • Modify my Remote Desktop Manager xml file that contains the links to the VMs
As usual... the code is written in a very "results oriented way"... some code purists may be annoyed with my lack of code consistency.

Hope this helps anyone out there.

I promise I'll figure out code highlighting on this blog another day.
# Get list of instances 

# Important Variables
$groupName = "NHS-ALL-LOCAL" #Name of security group to search for in instances
$rdgFilePath = "C:\Users\Nuno\OneDrive\Documents\NHS.rdg" #Path to RD Manager group file with links to Machines/RDP
$instanceStartWaitTime = 20 # Time in seconds to wait for EC2 instances to start

function GetName($instance) {
    # Return the value of the "Name" tag, or $null if the instance has no Name tag
    foreach ($tag in $instance.Tags) {
        if ($tag.Key -eq "Name") {
            return $tag.Value
        }
    }
    return $null
}

# Return a hashtable of instance Name -> instance object for every instance using the $groupName security group
function GetAllInstances() {
    $allAWSInstances = aws ec2 describe-instances | ConvertFrom-Json
    $instances = @{}
    foreach ($instance in $allAWSInstances.Reservations.Instances) {
        foreach ($group in $instance.SecurityGroups) {
            if ($group.GroupName -eq $groupName) {
                $nameTag = GetName($instance)
                $instances.Add($nameTag, $instance)
            }
        }
    }
    return $instances
}

# Flatten the instances hashtable into a plain array of instance IDs
function GetInstanceIds($instances) {
    $listInstances = @()
    foreach ($instance in $instances.GetEnumerator()) {
        $listInstances += $instance.Value.InstanceId
    }
    return $listInstances
}
$instances = GetAllInstances 
Write-Host "Found the following Instances:"
foreach ($instance in $instances.GetEnumerator()) {
    Write-Host $instance.Name " - " $instance.Value.InstanceId
}

# Start all instances
$list = GetInstanceIds($instances)

Write-Host "Starting all instances with security group $groupName..."
aws ec2 start-instances --instance-ids $list
Write-Host "Waiting $instanceStartWaitTime seconds"
for($i=1; $i -le $instanceStartWaitTime; $i++)
{
    $timeRemaining = $instanceStartWaitTime - $i
    Write-Progress -Activity "Waiting $timeRemaining seconds" -PercentComplete (($i/$instanceStartWaitTime)*100)
    Start-Sleep -Seconds 1
}
$instances = GetAllInstances

# Should be updated now.
foreach ($instance in $instances.GetEnumerator()) {
    Write-Host $instance.Value.InstanceId is now $instance.Value.State.Name
}

# Modify C:\Users\Nuno\OneDrive\Documents\NHS.rdg to place all the right URLs
[xml]$rdg = Get-Content -Path $rdgFilePath


foreach ($instance in $instances.GetEnumerator()) {
    $instanceName = $instance.Key
    $publicUrl = $instance.Value.PublicDnsName
    foreach ($server in $rdg.RDCMan.file.server.properties) {
        if ($server.displayName -eq $instanceName) {
            Write-Host "Updating Node: $instanceName with Url $publicUrl"
            $server.name = $publicUrl
        }
    } 
}
$rdg.Save($rdgFilePath)
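
One possible refinement, assuming the same AWS CLI setup: instead of sleeping a fixed number of seconds, you can let the CLI's built-in waiter block until the instances actually report running (and if you need the profile/zone switching I complained about above, every aws call in the script also accepts --profile and --region flags):

# Optional refinement: block until AWS reports the instances as running,
# instead of waiting a fixed $instanceStartWaitTime seconds.
aws ec2 wait instance-running --instance-ids $list
Write-Host "All instances report running."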

Tuesday, August 04, 2015

Rumors

The Tridion Developer Summit is just around the corner, as you probably know, since I'm sure you registered for it already, right?

Given the proximity, and the fact that I'll be delivering the keynote, and also that we're just about to release a new version of our software, I thought it would be the right time to throw some rumors in the air about what I will be announcing. Some are true, some are just rumors... I'll leave it to you to do the filtering - or better yet, come to the Developer Summit and see it for yourself!

Rumor list
  • You'll be able to move things up & down in the BluePrint
  • You can launch a new site simply by following a wizard
  • You won't need Java anymore to run .NET websites
  • You will have a central configuration service for Content Delivery
  • You won't need to store Deployer information in the CM database anymore
  • You'll have access to development licenses for free
  • You now get SDL Mobile as part of Content Delivery (i.e., no additional license)
  • All our software (including CM) will support Cloud Databases
  • Creating a new publication will now also create a Root Structure Group
  • All namespaces and package names are being renamed to Sdl.Web instead of Tridion or com.tridion

Choose a few, then check with me after September 17 to figure out which ones you got right. See you in Amsterdam!

Tuesday, May 20, 2014

10 things you did not know about SDL Tridion 2015

Last week (May 15th) we had our first ever Tridion Developer Summit, and man, was it good!

I've posted the slides I used for the keynote on slideshare - so share away. Below you'll find some additional background info for each of the changes introduced. To be clear, there is a lot more to SDL Tridion 2015, but 10 is a nice round number.




#1 - You will be able to load AppData from multiple items in bulk
One of the most expensive (in terms of CPU and database time) things to do with Tridion is extending Lists. Not necessarily the extension by itself, that part is easy, but the retrieval of the data to display in your extended list may be hard to achieve, because the data is "buried" deep in the CM. In one of the harder extensions I've built we had to load metadata from a keyword in a linked component and display it. Couple that with lists that contain thousands of items, and this is a performance nightmare (and may get the DBA knocking at your door). So a common workaround is to use the Event System to store this data in AppData and then read it from there, which removes quite a few database round trips, but you still need to talk to the database at least once for each individual item. In Tridion 2015 we added the ability to load AppData from a collection of item IDs in one go, removing the biggest bottleneck in this scenario.
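
To make the difference concrete, here's a rough sketch - the two functions below are stubs invented purely for illustration, not the actual API names - showing the shift from one round trip per list item to a single bulk call:

# Illustration only: stub functions standing in for the real calls (these are not the actual API names).
function Get-AppDataPerItem($itemId, $applicationId) { "appdata for $itemId" }
function Get-AppDataBulk($itemIds, $applicationId) { $itemIds | ForEach-Object { "appdata for $_" } }

$listItems = 1..2000 | ForEach-Object { "tcm:5-$_" }

# Pre-2015 pattern: one call (and at least one database hit) per item in the extended list
$before = foreach ($id in $listItems) { Get-AppDataPerItem $id "MyListExtender" }

# 2015 pattern: a single bulk call for the whole collection of item IDs
$after = Get-AppDataBulk $listItems "MyListExtender"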

#2 - We will have a new item type (4096 - Business Process Type)
This is a new root-level Organizational Item that will allow you to define different sets of rules for different blueprint branches in terms of governance models. You will be able to use Business Process Types to associate Content Types, Page Types, Schemas, Target Types and Workflow Definitions to a specific Publication (and children) and "impose" rules for governance using standard Blueprinting. Because it is a Repository Local Object, normal rules apply to it - it can be localized, it can be inherited, etc.

#3 - Content Delivery will be able to load configuration from a repository (rather than File System)
As we move to these virtual, always-on, seamlessly scaled environments called "cloud", the requirement to have specific server configurations on the File System makes it unnecessarily complex to spin up new servers on demand. As of Tridion 2015 you will be able to manage CD configuration settings from a repository (for all your servers in a given CD Environment) and apply changes to all servers in the same "stack" centrally, reducing margins for error and improving manageability.

#4 - Content Delivery will expose a discovery endpoint to CM
Again linked to cloud and to managing many sites with different functional or technical requirements from a Delivery standpoint, we need to improve how we decide whether content of a given publication should be published to one Content Delivery stack versus another. In order to achieve that, we need to first discover what is available in CD (think of it in terms of capabilities: search, SmartTarget, Experience Manager, etc). So, for instance, if you have a requirement that content published to "staging" has Experience Manager, this information could help us select an environment that provides such capability.

#5 - SDL Tridion 2015 will trigger Events on lists
 The current framework for List Extenders works great when you want to add/modify/remove information from a list in the Content Manager Explorer or Experience Manager. But if you want to do the same with WebDAV, or even lists requested by an external application via CoreService, we don't provide that same mechanism. With Tridion 2015 you'll be able to use the core Event System to extend any given list, therefore removing this limitation (while also making it easier to implement).

#6 - SDL Tridion 2015 will allow code to temporarily elevate a user's privilege
One of the most common scenarios I ran into as an implementer was writing event system code that created objects or published items to areas the user did not necessarily have access to. Since the user did not have access, we'd normally work around this by creating a new Content Manager session with an Admin account, creating/modifying/publishing our objects, and then hopefully disposing of this session. We'd also run into many issues with these sessions not being properly disposed of, and of course the actions in version history would be logged as having been performed by the Admin account, which damages traceability. With the new release you'll be able to elevate the user's permissions, perform a given action, and trace it back to an action taken by that user while elevated (somewhat similar to sudo). Awesome stuff.
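
Conceptually - and this is just stub code to show the idea, not the actual API - the difference is roughly this: the action runs elevated, but the audit trail keeps pointing at the real user:

# Conceptual sketch only - made-up function, not the Tridion API.
function Invoke-AsElevated {
    param([string]$UserName, [scriptblock]$Action)
    Write-Host "AUDIT: '$UserName' performed an elevated action"   # version history stays with the real user
    & $Action                                                      # the action itself runs with elevated rights
}

Invoke-AsElevated -UserName "DOMAIN\editor" -Action { Write-Host "Publishing to a restricted target..." }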

#7 - SDL Tridion will let you create "Site Types"
Site Types have actually been renamed to "Site Templates" in the past few days, but the principle is pretty much the same. This is not really new, as you already had this ability in the past via Blueprinting, but we're now making it explicit, and you'll be able to define several rules around this Site Template (like, for instance, which Business Process Type is applicable to it). By the way, a Site Template is a subtype of the Repository class (much like a Publication).

#8 - SDL Tridion 2015 will introduce a Topology Manager
This is one of the coolest things we're doing now, and it is used to tie together all this information that we have about Target Type purposes, Business Process Types, Site Templates and Content Delivery capabilities. This service sits between CM and CD and will be responsible for most of the logic tying CM and CD together, will provide information about and configuration for Content Delivery, and will in the future drive additional features around scaled-out environments, cloud integrations, etc. The Epic most of these features come from was called "Target Awareness" - this should give you a good idea about where we're trying to go with it.

#9 - SDL Tridion 2015 will have a default website and reference implementation
I don't know about you, but I for one am getting tired of the "we can do anything" approach we've had with Tridion in the past. Yes, it is true. I haven't yet found a file system or database that I can't publish to from Tridion, but this only works when the people driving the implementation actually understand Tridion's way of working and understand architectures in general. Basically, "we can do anything as long as you know what you're doing". Nothing wrong with that, and we won't change that. But we'll also give you a reference implementation, something like "this is how we think you should be doing it", something that new developers/implementers can use to learn the ropes with. I will be talking more & more about this in the near future, so I won't expand much more here than I did already.

#10 - Publication Targets will be deprecated in SDL Tridion 2015
And last, but certainly not least, we are revamping the publishing pipeline and how the configuration between CM and CD is done - and while doing this we realized that we don't actually need a Publication Target anymore. Yes, we still need the information stored in it (we need to know how to get to a given deployer, which target language, etc.), but we could store this information in either the Topology Manager (for the deployer endpoint) or the Target Type. And the Target Type can hold additional information we need, like the capabilities required for a given Site Template.

Yes, exciting times ahead. A lot of these 10 things are actually already developed and in our main dev branch, and everything else is currently in progress. As usual with product development, there are things that may drop, but as far as I can see this will not be the case for any of these 10. If you were at the Developer Summit I hope you had a blast! (I certainly did, even though it's a bit foggy after 2 AM.) If you were not there... well, you should have been!

Friday, March 07, 2014

Context, Context, Context

Lately it seems to be all I do and talk about... Context. Contextual. Content in Context. Context Engine. So I felt it appropriate to blog about what exactly we are doing with Context in the SDL Web Product Line.

If you've played around with Tridion 2013 SP1 you might have seen something new in the installer: a "Context Expressions" option.

This is probably the only visible part of the Context Engine (for now) in the Content Manager (and I'll get to it in a bit), but behind the scenes there is a really powerful tool to help you determine the current context of a visitor's session.

What is context? Well, to put it bluntly, Context is Everything, especially in the context of Customer Experience Management.
"To create a perfect experience - we must understand why you are here - we achieve this by understanding who you are and the current context…"
After a few iterations, we decided to focus on the following 4 aspects of Context:
  • Device used
  • Time spans
  • Geo Location
  • Visitor characteristics (explicit or implicit profile)
And then we started working on ways to discover these characteristics, and on how the intersection of all of those can give meaning to someone's visit to your site - therefore allowing you to "decide" what to show to this visitor in this context. We have now finished (and released last December) our server-side Device Detection module, "SDL Mobile", which you can use in conjunction with any client-side device-ready code you may have in place already, extending the capabilities of RWD into RESS territory (image resizing based on device width, HTML optimizations depending on device capabilities, etc., all done server-side to minimize bandwidth usage - and ultimately requiring less time to load).

Time spans are easy to control and calculate too... though what those time spans are really depends on the nature of your business. In some business areas, time spans can be as wide as a whole season, while in others the actual time of day could be the most important time span.

Geo-location is not black magic either, and we offer two ways to do it: server-side IP geo-location, or integrating client-side HTML5 geo-location into your Ambient Data set.

Fourth, and definitely the biggest aspect, is management of the user profile. And this is where the "Context Expressions" you see in the new installer come into play. As you might know, SDL acquired Alterian early in 2012, and one of the many parts we acquired was a product called "CMA" (Customer Management and Analytics), which can analyze massive amounts of data you gathered about your customers, regardless of the source - POS, website, newsletters - and lets you create segments from that data.

With the Context Expressions extension we allow you to save the segment definition into Tridion as an Expression (it could be something as simple as "customer.state='NY'", or more complex like "customer.country='NL' and device.mobile=true and customer.ownsproduct=true"), and you can link this expression to any component presentation - the content will only display if the expression is true. Mixing and matching these context variables allows editors to create content that maps to a specific context, and gives you lots of power in targeting... or contextualizing your site's content and presentation.

Obviously, given that the Context Engine runs on top of the Ambient Data Framework, all this information is also available to other tools, like Fredhopper/SmartTarget.

Happy context! (it's Friday).

Friday, January 03, 2014

'tis the season...

Welcome to the future 2014!

Since it is January, and it is the time for wishes, predictions, promises and whatnots, here's a few of my own:

Wishes for 2014:
  • I wish Google would finally figure out that changing my geographical location does not change the language I speak in
  • LinkedIn to figure out that people, sometimes, do want to link to their groups and would appreciate URLs that don't look like they were generated by some late 90's Portal product.
  • The end of "please download our app to see this content"
  • The end of "please switch to our mobile|desktop site to see this content"
  • The end of WYSIWYG for multi-channel content creation...
Predictions for 2014 (I don't like predicting stuff):
  • There will be 12 months this year.
  • The SDL Tridion MVP Retreat will be absolutely awesome, even better than all 4 past years together
  • I will be 40 by this time next year.
  • I will fly on an A380 this year (next week actually)
Promises for 2014:
  • I will pay some attention to details instead of focusing almost exclusively on the bigger picture (I think our DEV teams will appreciate this)
  • I will be more patient with people, and lower my expectations (and pressure) of loved/close ones.
  • I will figure out what the heck is the deal with not allowing Distributed Transactions in cloud environments...
  • I will do my utmost to do what I want instead of what I have to.
  • I will listen more to myself and less to customers (*)

(*) Nothing wrong with listening to customers, and I will continue doing so - but customers rarely want disruptive new features, and look forward to incremental updates instead of "the next thing". Doing the right thing for a product requires a good balance of both disruption AND incremental improvements.

Sunday, December 01, 2013

Product Manager definition: A translator?

About 18 months ago I took a pretty big side step in my professional career, approached Dominique Leblond and told him I wanted to join his team as a Product Manager. 6 months later I left Professional Services - where pretty much all my career was built - and moved away from a comfortable position as Principal Consultant for SDL US to that hotbed of discussion, politics and Priority Management: Product Management.

One of my dearest friends told me: "I hope you make me change my opinion of PMs. Every single one I know sucks".

To me it was a simple decision: move from a position where I implement the product - often working around design limitations and customers' lack of vision - to a position where I can change the product to more closely match (current-day) WCM customers' needs. It is not a secret that WCM has changed immensely in the past 10 years, and to continue to stay ahead of the curve we need to (like everyone else) start worrying about what happens after you click the publish button. (Disclaimer: I am in no way responsible for SDL's move into this space; these are processes and projects that can take years to complete and were already in progress before I joined Product Management - probably reason number 1 why I joined PM was that I agreed with the vision.)

What wasn't so clear was what exactly a Product Manager does. I obviously saw and was inspired by the performances of my fellow PMs (Davina, Alexandra and of course Dominique) whenever they had a chance to talk to us - on new product features, on new launches, on roadmap planning - but it wasn't really clear to me what goes on behind the scenes that makes the clock tick. A roadmap or a product launch is not something that happens in a vacuum, born out of pure boredom.

So, in the past 14 months I've been learning the ropes of what it takes to be a good Product Manager. I took to the web for inspiration, and I've learned that if you're good, yours is the number 1 position your company can live without... at least for the first year or so. Do go read Kenneth Norton's take on it, a very good read.

And then, yesterday, a thought hit me on the head about what probably defines my role in the best way: I am a translator. I spend my days translating vision into high-level, flashy, sales-ready brochures and presentations, translating vision into low-level, very un-flashy, development ready epics and themes, translating from high-level, blurry customer requirements to development themes, from low-level, incredibly detailed and narrow-focused requirements from the implementer community into higher-level, theme-linked approaches.

Obviously, interpreting priorities is the most challenging part of my job. Understanding that there are 20 things our teams could be doing, but only 5 will be done by the next release is easy. Deciding which 5 is very hard. And I have to do that by translating the needs of our customers (internal or external), the longer term goals of the company, the short term goals of the company (including sales - sales goals are always short term, no matter which company you work for) and doing what's right.

So... let's take a look at an example of how this translation process goes, shall we?

  • Roadmap states "Mobile Experience Management" as a theme.
  • Translate up: Provide clarity to Product Marketing and Sales teams as to what components are modified (and how) to enable "MEM" (because we need a new acronym, CEM, CXM, WCM, PEM and such others are not enough :-)).
    • This will take the form of high-level briefings and presentations on how Mobile Experience Management will be part of the day-to-day work of both developers and content editors, by using modules such as the SDL Tridion Context Engine and Experience Manager's Device Preview (it is awesome, by the way)
  • Translate down: Provide clarity to developers on how to group devices together, use Ambient Data to track device information, understand if we are currently in "Device Preview" mode rather than a real device (and take corrective measures, for instance, on how we determine which browser you're using)
    • This will take the form of low-level use cases and functional requirements that can be further translated to "real" development actions.
  • Translate left: Provide information to implementers on how to use this new feature called "MEM" to their benefit, usually by making sure it is all correctly documented.
  • Translate right: Inform existing customers and prospects on how MEM will make their life so much easier, they'll forget they ever had a challenge with mobile.
And this is how I spend most of my time: translating. Looking at the same topic from 4 different views, describing it in 4 different ways, using different language, using different techniques and tools, and trying to bridge expectations across all 4 "channels".

A similar process is done with translations going the other way around, where a requested functionality may end up becoming a theme by itself and land on the roadmap, and eventually the product (recent examples include our completely revamped workflow engine and bundles - both introduced in SDL Tridion 2013).

If it sounds boring to you, well, maybe you're just not the personality type that would like Product Management. I do spend a lot of time discussing themes and future developments - not only of the product, but of the web as a whole, with a focus on how Content Management must evolve. But at least half of my time is spent translating. And it's great, often I end up knowing a lot more about a feature I designed myself :-)

Friday, November 22, 2013

Playing with the future - Part 5 - Any tool can create content


Today, we are all Information Architects. The average number of documents, presentations, emails, blog posts, and the myriad other information sources we have to cope with daily continues to grow exponentially, with no end in sight.

So we all come up with nice little tricks to organize our content. Some go for the "all in one folder approach" (works with good search), others go for the super-structured approach for content management (folders upon folders of content hierarchies), and others (if your company's smart enough about knowledge management) go for the Enterprise Search approach. "Dump it in any of our document repositories, and go to this url to find it back".

Coveo nod: I really like their tagline of "Stop searching and start finding".



So, as part of our daily job as information architects (for our own information, not our organization's), we work with a lot of tools, and very often the tool you use is determined not by your preferences but by the intended audience of the content you're creating:

  • Microsoft Word for the audit report
  • Microsoft PowerPoint for the roadmap or visionary statement
  • Blogger/wordpress for your personal blogpost
  • Email for the quick communication
  • Twitter for the even quicker communication
  • Tridion / CQ / SiteCore / Sharepoint for your company's official blog
  • Visual Studio or Eclipse for the really cool stuff
  • OneNote (or EverNote or Google Keep) to take notes during meetings
  • Prezi for the "I'm cool" effect (nope, doesn't work that way anymore, you're 3 years late)
  • Confluence for requirement gathering and roadmap grooming
  • Jira for backlog management
  • Facebook for the family/friend hugs
  • [list goes on]
Many years ago I remember thinking that, perhaps, the browser would be the tool of the future. Nope, that didn't really work either - yes, you do use your browser to do a lot of your work today, but you're not really using a browser - you're using the application behind the browser.

And another thing that is happening is that we're losing the W in WCM. Content that is not web accessible is not really content anymore, is it?

Hence my prediction... tools that can handle content transformation easily and can abstract the delivery mechanism are the tools we're going to use for everything. CM will eventually become a standard set of APIs (yeah, yeah, CMIS is an effort in that direction... but not really there yet, and too enterprise-y) and the tool you use to create content won't matter anymore. Because there will be enough intelligence behind the tool to "understand" what you're talking about (see part 1 and part 4 of this series) there will also be enough intelligence to understand how to transform that content to your required delivery format. And the tool(s) of the future will be born to address this requirement - hide all information architecture complexity from me, let me create content as content, and then help me deliver the content to my audience. And don't make me think.

Saturday, November 16, 2013

Playing with the future - Part 4 - Data is the future of content

A few months ago I had a question from a prospect that had me stumped.

Your product is great with content, but how does it deal with data?
It took me a while to understand - more context was needed - and still today it is somewhat haunting me. To me, content is data, so what in the name of <insert random deity> did they really mean?

Well, what they meant is not exactly what this post is about, but it is somewhat linked. In their specific scenario Data meant semi-structured data feeds they get from their other applications that may or may not be displayed on the website.

How to deal with it is linked to the topic of this post.

You've seen the semantic web at play. For us WCM geeks, some of the first examples of the semantic web were those "helpful links" under a Google search result (and now they show with even more detail, like a link to a related blog), and lately, with efforts like GoodRelations and schema.org, the semantic web keeps creeping up on us with great results for everyone (and search engines!).

With me so far? Semantic Web is good, content is good, data is also content, but might come from a different source.

Now, why do I say that data is the future of content?

If you've developed websites before (not just snazzy HTML, I mean really designed websites, web experiences, content creation flows, contextual experience definitions, etc.) you've probably been frustrated, like I have, by the lack of metadata on content. Editors seem to just want to use an HTML WYSIWYG editor to create content, but then expect you to do miracles with how the content displays, magically determine which pieces of data are used for your tab names, which images to use for the home page, etc. I had a particularly harrowing experience with an editor who insisted all content should be classified under "Personal Finance", yet expected the site to be able to sort out the difference between auto loans and mortgages (deep sigh).

Summary #2:
  1. Content Editors want an easier to use, easier to create content in, simpler UX/UI paradigm that allows them to create semantically meaningful content without having to deal with complex operations or data structures.
  2. Semantic content needs proper annotations, schema compliance, and contextual information (used to determine, for instance, when it is appropriate to show this content and when it's not)
  3. Both points above are at odds: to create semantically meaningful content, editors need to spend more time curating it.
And this means that the solution is to enrich content with semantically meaningful metadata automatically (with the possibility to be modified/enriched by the editors).
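
As a toy illustration of what "automatically" could mean here - naive keyword matching standing in for real entity extraction, with an invented taxonomy - the enrichment step might look something like this:

# Toy illustration only: naive keyword matching standing in for real entity extraction.
$taxonomy = @{
    "mortgage"  = "Personal Finance/Mortgages"
    "auto loan" = "Personal Finance/Auto Loans"
}

function Get-SuggestedTags([string]$text) {
    # Return every tag whose trigger word appears in the text; an editor can then review or extend the suggestions
    $taxonomy.GetEnumerator() |
        Where-Object { $text -match [regex]::Escape($_.Key) } |
        ForEach-Object { $_.Value }
}

Get-SuggestedTags "Compare our fixed-rate mortgage offers before you buy."   # -> Personal Finance/Mortgages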

In other words - editors will get what they want: simpler ways to create content; and developers will get what they want: more meaning attached to their content in the form of "data" - metadata, ambient data, structured streams, whatever you want to call it. And that will allow us to start creating smarter UIs that can help you determine layout, presentation and context with less effort. All you need is more data, and we will get more data from intelligent systems that can do most of the digging for us.

Monday, November 11, 2013

Playing with the future - Part 3 - Context Engines

As a starting point for the 3rd post in this series of "non-binding futuristic plays", I'll tell you a secret that everyone except Marketing seems to have realized. Actionable analytics already exist, and they've been around for quite a while.

Yes, it's true. I keep hearing some babble babble from Marketing people on how they need actionable analytics, and we (IT guys with a clue) keep on asking them "how do you want to use them?".

You see, we have the data. We have had the data for many years now. The challenge is not there. The real question you (Marketing guys with a clue) should be asking is "how do I use the data we have without having to call you every time I need to change something?" So, basically, what you need is not actionable analytics. What you need is a way for you to act on the data, and a way to know which data you have, that doesn't involve calling me or some other IT fellow unfortunate enough to be on your quick-dial list.

OK, guess that's enough to set the stage for what I want to talk about today: Context Engines. Back in August we released (rather silently, I admit, and for good reason) version 1.0 of the SDL Context Engine, and we're now finalizing version 1.1 (to be part of SDL Tridion 2013 SP1 and available for 2011 SP1 as an add-on, if that's what you were going to ask) and I am really impressed with what we were able to cook so quickly.



What does a context engine do?

I believe that modern sites should be able to answer, within milliseconds, a very simple question: why did I come here? Understanding the reasons that drive someone to open up a given URL gives us the insight required to serve that visitor's request quickly and without wasting their time (i.e., providing a good web experience). And there's no other way to understand why you're here than by understanding the context that made you come here.

So, what is context?

That's a very open ended question, so I'll answer it in 2 ways:

  1. Context is everything
  2. Context is a collection of data points that can be used programmatically to determine why you visited a web page, and lets you act upon this based on configurable rules.
 A Context Engine does the following two things:
  1. Determines the properties of the current context
  2. Evaluates the context and executes a certain contextual path or rule
Example:

In the current context we determine that you are using an iPad 2, it's 10 in the morning, it's the second time you came to our site today, and the last product you looked at was coffee.
Based on this information we can:
  • Make sure you see the tablet optimized UI for our site (server-side, with optimized images, not only RWD)
  • Give you a coupon for the nearest starbucks
The beauty of this is not that it can be done. I (and most other web people out there) could have written code for this back in 1999 (well, not really the tablet-optimized UI part). The beauty of it is that this rule was created by a content editor, perhaps using something as simple as SDL Customer Analytics (or, who knows, Tridion Target Groups), and the Context Engine simply chose the most appropriate path based on your rules.
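
Just to make that tangible, here's a trivial sketch in plain PowerShell - nothing to do with the actual Context Engine API, and the property names are invented - of the two steps above: a bag of context claims, and a rule evaluated against it:

# Illustration only - invented property names, not the Context Engine API.
$context = @{
    DeviceFamily    = "Tablet"
    HourOfDay       = 10
    VisitsToday     = 2
    LastProductSeen = "coffee"
}

# A rule that a content editor could have configured elsewhere in the system
$morningCoffeeOnTablet = {
    param($ctx)
    $ctx.DeviceFamily -eq "Tablet" -and $ctx.HourOfDay -lt 12 -and $ctx.LastProductSeen -eq "coffee"
}

if (& $morningCoffeeOnTablet $context) {
    Write-Host "Serve the tablet-optimized layout and the coffee shop coupon"
}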

Now if you extend the data awareness of a context engine to include data from your purchasing history (or interaction-with-my-brand) you suddenly open the door to way more ways to provide a contextual experience to any visitor, and you start being really good at understanding why I came to your site, and, who knows, maybe you'll even be able to sell me that great vacation I clearly need.

This is - again - not new. SDL Fredhopper, for instance, is an amazing Context Engine. What I think will be new by 2020 is that most sites will be using a Context Engine or similar technology to determine the context and decide what your experience should be. I also expect to start seeing cloud-based Context Engines (someone called them Context Brokers in the past) with all the serious privacy implications this includes...

One last note. As part of the development of the mobile aspect of Context, I've come to realize that most people ignore the fact that the device you are using is only a part of the context, not all of it. The WCM industry seems to be focused so much on how to show nice buttons on an iPhone that we seem to forget the bigger picture: why are you using an iPhone to come to this site? Are you on the move? Are you having a smoke outside? Are you right outside my shop? Are you in my competitor's shop doing price comparison?

If experience was determined by UI alone, then nobody would ever use craigslist. No, well-designed Context Engines put editors in control of selecting the right content for the right context.

Tuesday, November 05, 2013

Playing with the future - Part 2 - Content Ownership

As a follow up to my previous blogpost, here's the second concept we came up with on the topic of "How will content authors create content in 2020".

This idea might be a bit more radical than the first one... "Content ownership will be diluted".

There are many types of content creators out there, from the marketing-snazzy, crowd-sourcing-heavy world of "modern social media buzzword compliant marketing" to the corporate, workflow-heavy, legal-review world of most of the customers I work with.

In some industries, it is perfectly acceptable to have someone from outside the organization create content for you - be it via "endorsed blogging", or "fan content on Facebook", or even comments on specific pages that get promoted to full-page articles given their quality. This is something we already see happening today on a regular basis.



But the brand fan of the future is different. The brand defender of the future is possibly 16 years old, and is compelled to share because sharing is in their DNA - hyper-connectivity does that to you. So companies - including workflow-heavy, legal-compliance companies - will go out of their way to find methods to assess how much of a fan you really are, and possibly give you special rights to create content on their websites.
If you believe in my company and brand even more than I do myself, why would I stop you from contributing positively?
Here's how I think this will impact the world of content:
  1. Gamification principles and social media tracking will be used to accurately measure a person's brand-awareness level - you want to find those brand defenders out there, and you want to empower them
  2. Brand defenders will - from outside your firewall - have special privileges on your content platform - be it by being allowed to review content, or by being able to create content themselves. This process already happens today, but in a rather unstructured way. (I certainly get emails from brand defenders about content published to Tridion World, I can only imagine that Bart Koopman gets even more)
  3. Brand defenders will be given access to marketing strategies, campaign ideas, and any other branding material. They will carry the flag for you in exchange for early access to data, exclusive T-Shirts and bragging rights. Why wouldn't you reward them in their own coin (data)?
In other words, brand defenders will become your "trusted content contributors".

I can certainly see a future where even the most legalese of texts gets reviewed by people who are - at first glance - completely unrelated to your company, but who know your brand value better than the people being paid to create that brand value. Where content is created for your website by your most loyal fans, and where content management tools are built with this in mind from the ground up. Where content review is done by people outside your corporate legal department (but likely not excluding legal completely), and where you provide your brand defenders with all the tools and data they need to be heard.

Tune in soon for my next non-binding futuristic play: Context Engines.

Friday, November 01, 2013

Playing with the future - Part 1

Some time ago I had an interesting conversation by email with my colleague and fellow Product Manager, Alexandra Popova. The subject was "How will content authors create content in 2020".

This spawned a whole series of ideas and concepts about content creation and - especially - content delivery, ensuring that the content that is shown is what you are searching for at this point in time. From there I ended up creating a slide deck I sometimes use, with the title "The future of content - a non-binding futuristic play". I think it's time to put those ideas out to the world and see if there are any strong disagreements.

There are 5 main "ideas" that we think will be prevalent in 2020:
  • Schemas will disappear (as in, you won't see the content structure anymore)
  • Content ownership will be diluted
  • Context Engines will be mainstream
  • Data is the future of content
  • Any tool can be used to create content for a delivery platform
I left some out for the simple reason that by 2020 they will already be a strong reality: web content will stop being page centric (some argue that this is already the case, and I agree), content will be self descriptive and "atomic".

Anyway, let's dive into the 5 ideas that I think will be a reality in 2020.

Idea #1 - Schemas will disappear (from the editor's screen)

There will still be some niche markets (product catalogs and support documentation) where this type of interface makes sense, but as systems improve and become more reliable, you'll be more & more using "smart" content editors that derive the semantic meaning of your content for you. There will still be a schema that your content must comply to, it just won't be "in your face". And no, HTML5 is not a content schema - at most it's a page schema and a vocabulary for content. You can present structured content using HTML5, but that's a result of its flexible design.

So, what are some examples of this out there? Well, our own SDL Xopus editor, for starters. It binds to an XSD just like most XML editors, but presents it in a completely familiar way to editors used to working with "less structured" content tools like MS Word or EverNote. (go play with the demos if you don't believe me)



Furthermore, the advances being made in what was once the exclusive domain of "enterprise search" software - entity discovery and concept mapping (see the excellent Apache Stanbol as an example) - mean that this type of technology is no longer confined to very expensive and rare software. No, it's starting to be available to anyone with a workable Internet connection. And if I, as a content editor, can let the software discover what my content is about and tag it appropriately, then I'm free to create my content and let the metadata tag itself (I obviously need to review and approve).

An interesting side effect of this is that you will not get less metadata. Au contraire, we're going to (finally?) get a lot more metadata that can be used to segment our content in ways that will allow our devices to display it properly.

Systems (and people) will struggle for a while, but as leading systems pick this up and start improving on it with usage, we'll all get better for it. And devices like HUDs on cars will easily display your content, in a way that won't distract the driver from what he needs to focus on.

Next week I'll post about how I feel the concept of content ownership will slowly dilute, and distributed ownership (including from outside your firewall) will be the norm.