Blog Changing Domains – update your RSS feeds!

Dear taxonomists, content managers and other subscribers…

We recently launched a new website on Drupal (www.earley.com) and integrated our blog into the new site.

Please update your RSS feed readers with the new address: www.earley.com/blog
(or http://www.earley.com/rss.xml if you want the whole site).

We will stop double-posting as of now, so if you want to continue reading our fabulous blog posts, like the mega-menu series or my most recent post on social media and persuasion, please update your links.

Thanks for your continued readership!

Stephanie and the rest of the Earley team

Taxonomy & Mega Menus… Part 2: Grouping

Best Practice #2: Use chunking and grouping to increase scannability and learnability

So you’ve taken the mega-menu plunge and you now have more labels to fit into your drop-down. How do you make sure it doesn’t look like a mess of text?

There are a couple of options:

Grouping:
Create clear and logical groupings within the menu and give them prominent labels that can easily be scanned.

There are four elements to this approach:

1. Logic: Groups have to be internally coherent and logical: either they are all children of a common parent, or they are conceptually related in a way that is evident and quickly learnable.

2. Labeling: Use simple, unambiguous labels that convey the nature of each group. Decide if your labels will be “clickable” – is there a landing page behind them or is it just a visual way-finder? The mega-menus tend to discourage clicking on such intermediate levels, but marketing may want the space to provide category-level merchandising.

3. Volume: Follow general good practice on the number of items in a category. Cognitive science gives us an easy rule of thumb of 7 +/- 2, but in a mega-menu, where space is limited, I would reduce that to 5 +/- 2. This reduces visual noise and fits well with best practice #1 (less is more).

4. Visual distinction: Use striking colors, increase white space between groups, use shading or dotted lines… anything you can do to create a visual separation between the groups, so that the eye can quickly skip from one group to the next without much thinking.
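The volume guideline above is easy to check mechanically. Here's a minimal sketch in Python; the menu data, thresholds and function name are my own invention, purely for illustration:

```python
# Sketch: flag mega-menu groups that fall outside the 5 +/- 2 guideline.
# The menu contents below are invented for illustration.
menu = {
    "Paper Supplies": ["Copy Paper", "Notebooks", "Message Pads"],
    "Writing": ["Pens", "Pencils", "Markers", "Highlighters",
                "Erasers", "Correction Tape", "Chalk", "Crayons"],
}

def volume_warnings(menu, low=3, high=7):
    """Return (label, count) pairs for groups outside the low..high range."""
    return [(label, len(items))
            for label, items in menu.items()
            if not low <= len(items) <= high]

print(volume_warnings(menu))  # → [('Writing', 8)]
```

Nothing tool-specific here; it's just the 5 +/- 2 rule of thumb expressed as a quick sanity check you could run over a navigation spreadsheet before any visual design happens.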

Let’s look at some examples.

Here’s a screenshot from an office supply site, taken a while back:

[Screenshot: office supply mega-menu]

Aside from the fact that it has far too many terms, it suffers from poor visual distinction. It’s hard to distinguish the group labels from the single items below them, and some of the terms are a little confusing: “Basic Supplies”? What makes a message pad a basic supply vs. a paper supply? I’d have a hard time scanning that label and deciding whether to skip over the category or make the effort to look at its children.

Compare this to the next version of the same site:

[Screenshot: redesigned version of the same mega-menu]

We still have the labeling issue, but notice how this version is visually much easier to scan – I can concentrate on the bright blue headings to quickly decide where to focus my attention.

Here’s yet another office supply site:

[Screenshot: another office supply mega-menu]

Good: nice visual grouping – the flashy orange and capital letters direct attention.
Bad: ambiguous labels (Organization? How is a Post-it “organization”?) and too many terms in one category.

This is fun… One more!

[Screenshot: mega-menu with grey group headings and an A-Z artist block]

Although they’ve tried to use capital letters and some dashed lines, the grey text just doesn’t attract the eye. There are also too many groups, and the inconsistent use of multiple groups per column makes it look cluttered and confusing. Another interesting thing to note: although visually subtle, the use of groupings likely still attracts more attention than the left-hand column, which holds a collection of single choices. I would venture that a majority of visitors to this site ignore the left-hand column altogether (except for the A-Z artist block).

Keep in mind that when available, groupings will visually and cognitively trump singles, so avoid mixing the two approaches.

There are some cases in which the mega-menu does not lend itself well to grouping, such as low term volume (in total or per group). If you have 6 categories with only 2 items in each, it can be more visually distracting to have group labels cluttering up the menu. In such cases, an alternative approach is:

Chunking:
Create “chunks” of related terms separated by white space (i.e. with no labels).

This approach is definitely less straightforward and doesn’t give the scannability that grouping with labels does, but it’s better than a big list of undifferentiated terms.

[Screenshot: apparel mega-menu using chunking]

Again, not as effective as labels, but cleaner and more meaningful than a straight list. Underwear, loungewear and socks are clearly related, and so are all the accessory-type categories. You might be able to skip over a chunk after scanning only its first few items, once you get the gist of its grouping principle.
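To make the chunking idea concrete, here's a small sketch. The clothing terms and hidden parent categories are invented; the point is only that the chunks come from a grouping principle that is never actually displayed:

```python
from itertools import groupby

# Each term has a conceptual parent that is never shown in the menu;
# it only drives the ordering and the blank-line separation.
parents = {
    "Underwear": "basics", "Loungewear": "basics", "Socks": "basics",
    "Belts": "accessories", "Wallets": "accessories", "Ties": "accessories",
}

def chunked_menu(terms, parents):
    """Order terms by their hidden parent, then render each run of
    same-parent terms as a chunk separated by a blank line (no labels)."""
    ordered = sorted(terms, key=lambda t: parents[t])  # stable sort keeps within-chunk order
    chunks = [list(group) for _, group in groupby(ordered, key=lambda t: parents[t])]
    return "\n\n".join("\n".join(chunk) for chunk in chunks)

print(chunked_menu(list(parents), parents))
```

Printed, this gives two visually separated chunks (Belts/Wallets/Ties, then Underwear/Loungewear/Socks) with no group labels at all – exactly the low-volume compromise described above.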

Here’s what it looks like when you don’t use chunking or even ordering based on grouping principles.

[Screenshot: mega-menu without chunking or meaningful ordering]

The price points aren’t grouped together… it’s hard to tell whether the items below them are meant to be under $50 (they aren’t)… the tea products are scattered in with other types… generally messy.

So, to recap:

  • grouping is a good way to reduce the amount of work someone has to put in scanning a larger list of items
  • use visually distinctive grouping mechanisms
  • make sure labels are simple and intuitive
  • if you can’t group due to low volume, consider chunking (or at minimum using meaningful ordering)

Next in the series: Best Practice #3: Use simple & concise terminology

SharePoint Content Structure – Let a thousand content types bloom?

“How many content types should you have?”

This is the question that came up in a conference call last week on SharePoint architecture. This organization had implemented their corporate portal on SharePoint 2007 and was interested in going forward with more portal sites but had some concerns about the approach to information architecture they had undertaken.

I answered what I would answer no matter what technology it was – “Only as many as you really need to implement the appropriate level of metadata, workflow and templates.” Which is of course vague, as most good consultant-ese is. I followed up with some stats: when we work on web content management implementations, we typically end up with about 10-15 content types for a site of medium complexity. We always try to keep the structure simple and number of content types few for many good reasons, ranging from ease of content structure management to content publisher user experience.

The folks on the phone were quiet for a minute… You see, the previous consultant they had worked with had a bit of a different (read: opposite) approach. The philosophy they described was that SharePoint content types should be created to the maximum degree of granularity (e.g. one content type per library) so as to reduce the need for content publishers to select a content type and tag metadata values. For example, if you had a site for human resources forms, you would have one library and content type for medical forms, one library and content type for dental forms, etc. Each content type would be extremely specific and require little tagging. “If you need 30,000 content types, then so be it” is the idea. (Insert eye twitch.)

The intent behind this – to reduce uncertainty and effort for content publishers – is noble, and in some specific cases it might be the right approach. But in general, overly granular content types are in sledgehammer-to-kill-a-fly territory. To help explain why, I thought I’d enlist the help of a couple of friends and colleagues.

First, I emailed content management guru Bob Boiko, author of the Content Management Bible, to see if he agreed. His response?

“How many content types is the right number? The fewest possible to squeeze the most value out of the info you possess. If it were my system, I would create a generic type and put all the info that I could not find a business justification for into that bucket. It’s not worth naming if you can’t say clearly why you are managing it. Then I would start with the info we have decided is most valuable and put real energy into naming the type and fleshing out the metadata behind it. Then on to the next most valuable and so on till I ran out of resources. In that way, the effort of typing is spent on the stuff that is most likely to repay the effort.”

Amen to that! But I also wanted to get a tool-specific view from my colleague and SharePoint expert friend Shawn Shell. So I skyped him…

Stephanie: So, what do you think?

Shawn: Well, having a content type for every document library is certainly an interesting approach, though I think your SharePoint administrators, as well as your users, will go quite mad.

Stephanie: So, I think the argument is that having this many content types is supposed to make it easier on the users by presetting all choices and removing the potential for error. If you never have to choose a content type, because each library has a very specific default that matches the content you are creating, then there’s no confusion – or so the idea seems to be. From a general content management perspective, this is flawed. But what about from a SharePoint-specific standpoint?

Shawn: I can understand why this might make sense on the surface. Unfortunately, I think you end up exchanging one kind of confusion for another. Further, there’s a huge maintenance implication as well. For example, if you have a content type for each library, you are, for all practical purposes, requiring the user to decide where to physically store a document. That physical storage then implies your classification, regardless of whether a default content type is applied.

Stephanie: So, you’re basically recreating all the ills of a fileshare folder structure!

Shawn: In essence, yes. To make matters worse, more complex SharePoint environments will necessarily include multiple applications and multiple site collections. Because content types are bound to a site collection, administrators have to create and maintain them in each collection and ensure consistency across applications. That overhead exists in any deployment, but with such an overload of content types and libraries, the complexities of management are compounded.

Stephanie: So, if you have 50 content types and you need to use them in 2 or 3 site collections, you’d have to create up to 150 content types. A good argument for keeping your use of content types judicious. Is there a hard limit to the number of content types one can manage in a site collection?

Shawn: The answer is “sort of.” There’s no specific hard limit to the number of content types in a site collection, but there are some general “soft limits” in the product around numbers of objects (generally 2,000). This particular limit is an interface limit: users will see slower performance if you try to display more than 2,000 items. The condition won’t typically manifest itself for normal users, but it will for administrators. The other real limit is that the content type schema can’t exceed 2 GB. While this seems like a pretty high limit, if you have a content type for each library, loads of libraries in a site collection and robust content types, there’s certainly a chance of hitting it.

Stephanie: What about search? I assume that a plethora of content types would have adverse effects on search.

Shawn: It absolutely does. Like everything we’ve discussed here, the impact is primarily twofold: 1) administration and 2) user experience. Content types, as well as columns, can be used as facets for search. If you have an overwhelming number of facets in results, the value facets bring is reduced. Plus, as I mentioned before, having large numbers of content types could also produce performance problems when trying to enumerate all of the types included in the search results.

From an administrative standpoint, we’re back to managing all of these content types across site collections, ensuring that the columns in those content types are mapped to managed columns (a requirement for surfacing the metadata in search results) and, if you have multiple Shared Services providers, that this work is done across all SSPs.

Stephanie: I expect there will also be a usability issue for those trying to create content outside of the SharePoint interface. Wouldn’t users have to choose from the plethora of content types if they started in Word, for instance?

Shawn: This is another excellent point. Often, when discussing solutions within SharePoint, we think only of the web interface. When developing any solution, however, you need to keep both the Office and Windows Explorer interfaces in mind as well. Interestingly, using multiple document libraries, with a content type for each library, makes a little more sense from the end user’s perspective, since it’s similar to physical file shares and folders. However, the same challenges that many organizations face in managing file shares can manifest themselves with the multiple-library, matching-content-type approach as well, putting these organizations back in the same unmanageable place they started.

Stephanie: Great, thanks Shawn for your insights! I’ll be sure to spread the word to avoid a content type pandemic.

So there you have it folks. As a general rule, less is more. Standardize, simplify and don’t let your content types multiply needlessly. Your content contributors and SharePoint administrators will thank you.
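Boiko's "fewest possible" rule can even be sketched in a few lines of Python. The content types, fields and function below are invented for illustration; the point is the generic fallback bucket:

```python
# One generic catch-all type, plus fleshed-out types only where the
# metadata has a clear business justification (all names invented).
content_types = {
    "Generic Document": ["Title", "Author", "Date"],
    "Press Release": ["Title", "Author", "Date", "Embargo Date", "Region"],
    "Product Sheet": ["Title", "Author", "Date", "SKU", "Lifecycle Stage"],
}

def type_for(doc_kind):
    """Fall back to the generic bucket when no specific type is justified."""
    return doc_kind if doc_kind in content_types else "Generic Document"

print(type_for("Press Release"))  # → Press Release
print(type_for("Dental Form"))    # → Generic Document
```

Contrast this with the type-per-library philosophy: here a dental form simply lands in the generic bucket, rather than getting its own content type that administrators must then recreate and maintain in every site collection.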

Taxonomy & Mega Menus… Part 1 of many

No matter where I run, I cannot seem to hide from them.

They fly out of website navigation menus with no warning. They assault my senses with link overload.

…they are…mega menus.

Are they a new navigation paradigm or just a bad fad – like acid washed jeans?

And whose idea were they anyways?
It’s difficult to trace the starting point of the mega menu (or mega fly-out, or maxi menu, or whatever you call them); they started popping up on e-commerce sites a couple of years ago. The first time I bumped into one, my brow got that wrinkle it gets when I am at once curious and horrified – horrious? curified? I remember thinking: really? This is what we’re doing now instead of putting effort into making our drop-downs more usable? Let’s just add more drop-down…

Once the initial feeling of horriosity passed, I just forgot about it. None of our clients were using them, so I didn’t really have to pay attention. And THEN, this March, Jakob Nielsen put out an Alertbox saying “Mega Menus Work Well.” That was really the clincher. If Jakob/NNG says it’s ok, well, there goes the neighbourhood.

Integrating Taxonomy with CMS – Book Chapter in Publication

The final draft has been submitted… Mark your calendars…

The Information Management Best Practices 2009 book is going to publication this week, in hopes of being ready for launch at the J.Boye Conference in Aarhus, Denmark, Nov 2-4. I’ll be there, giving a talk on SharePoint IA, but also to lend a hand with the book launch activities.

I’m proud to have a chapter in this book, co-authored with Seth Earley and Charlie Gray (CMS & Taxonomy Strategist, Motorola), on one of our most in-depth and successful projects – integrating taxonomy with CMS at Motorola. The best practice covers the steps below in great detail, offering practical advice and screenshots from the actual implementation at Motorola.

Steps

  • Step 1: Educate Stakeholders on Taxonomy
  • Step 2: Bring a Taxonomy Expert onto your CMS Implementation Team
  • Step 3: Determine Functional Requirements

Taxonomy in Extreme Places

How often do you get to be immersed in a completely alien work environment?

As a taxonomist, I get to learn about so many different domains through my work, from mouse genetics to greeting card manufacturing. Each company has its interesting quirks and workplaces… Like the toy manufacturer, whose workers had their cubicles adorned with all sorts of inspiration and materials: multi-colored fur, googly-eye collections, pictures of themselves as superheroes…

But this week, I got to experience something completely different.

We just started a content strategy project with a semiconductor equipment manufacturer which aims to help their service groups (the folks who fix the machines) get the right information at the right time. This is an interesting project involving issues around technical writing and information architecture (DITA), integration across many different knowledge systems and databases, and getting information to users in a less than hospitable environment – the clean room.

A clean room is essentially a manufacturing or research facility that has low levels of environmental pollutants, such as dust and microbes. Pollutants are kept to a minimum through air filtering and circulation, as well as a strict dress code involving what are “lovingly” referred to as “bunny suits”. A clean room suit involves:

  • Glove liners
  • Rubber gloves x 2
  • Hair/beard net
  • Face mask
  • Shoe covers
  • Coveralls
  • Hood
  • Booties
  • Safety glasses

You get dressed in a specific sequence so as to reduce contamination: first the glove liners, rubber glove #1, hairnet, face mask, and shoe covers. Then you enter a second room where you add the hood, coverall, booties, rubber glove #2 and safety glasses. You then walk over some sticky paper into an air lock, where you are blasted with air, and you’re ready for the clean room.

Two minutes in a bunny suit and you gain a quick appreciation for the difficulties inherent to working in such an environment. It’s hot under all those layers, you have poor peripheral vision in the hood, the glasses constantly fog up from your breath under the mask, and it’s hard to walk. (Well, I have to admit that the “hard to walk” part is probably because I was wearing high heels in my booties – ill-advised and embarrassing! I also made the newbie mistake of taking a cough drop before putting on my mask, and I ended up breathing menthol air into my eyes and fighting back tears the whole time.)

But if I’ve set the scene up appropriately, you can start to imagine the challenges inherent to knowledge work in this environment. First of all, it’s hard to get access to information – carrying around a laptop is difficult, your hands are slippery, there’s nowhere to set it down in this lab, nowhere to plug it in… Even if you did find a place for it, you can’t use a track pad when you are wearing 3 layers of gloves – it’s hard to type, and the gloves don’t create enough friction for the pad to capture movement. You might use a tablet and stylus, but there are holes in the floor, so if you drop it… You might use a handheld device, but again, with gloved hands, good luck typing on that tiny keypad, and the screen is much too small to show detailed tool schematics. You don’t have access to the internet, so all the information has to be available on the machine, and there are hundreds of parts for each machine.

Add the next layer: search, systems and content structure. These folks currently have to search across multiple systems to try to find documentation on specific problems… starting with the original manual, which is likely for the product as it was shipped, not as it was configured at the client site. There are multiple databases where there might be troubleshooting tips or solutions, but you have to check them individually. The content is not well tagged or structured, so if you do find a document that might be useful, it’s typically a gigantic PDF that you have to comb through.

As you can see, this is a challenging problem: how do you get the right information (the right amount of it) in a way that is well structured and accessible to them in the clean room environment? What part of it involves structured writing in XML vs. system integration vs. taxonomy and metadata and how do we pull all those pieces together to offer a simple interface to a service professional?

We’ll be working on this project for the coming weeks, so I’ll keep you posted on the conclusions and insights. But in the meantime, I’m sure this will remain my personal “one to top” in terms of taxonomizing in extreme places. Perhaps I’ll beat it if we ever do any work with spelunkers, submarines, or NASA…

Share your extreme taxonomy stories in the comments!

Photo credits:

http://lasp.colorado.edu/images/engineering/tech_cap/clean-room-suit.jpg
http://ixbtlabs.com/articles2/cm/intel-israel-dec2k5.html

Ensuring Cross Channel Consistency in Brand Management

“Having the people, systems and governance in place to facilitate a cross channel view of marketing assets and customer experience is a critical challenge many organizations are facing.”

Laura Keller, Strategist at MISI company

Silos Revisited

In many organizations the responsibility for creating marketing assets is decentralized and siloed by channel. One group is working on email marketing, another on web commerce, others on social media and still other groups on more traditional print and broadcast. Without solid governance and systems to support a view across these channels, companies are missing a tremendous opportunity to:

  1. Re-use marketing assets
  2. Realize value from cross channel synergies
  3. Evaluate the consistency and quality of marketing assets

A great deal of time and money is wasted creating new assets because of a lack of awareness of existing assets. Assets also need to be managed across channels in order to maintain consistency, measure the effectiveness of programs and maximize impact from spend. Any organization engaging in cross channel marketing programs will benefit from core tools and approaches that, when put into place, can improve response times and save money.

A centralized repository of marketing assets that is supported by consistent organizing principles (taxonomy) is a requirement for facilitating cross channel views and re-use of assets. Unfortunately, the following scenarios are all too common.
