Archive of Technology posts


Writer River

I try to follow as many technical writing blogs as I can, but beyond the dedicated blogs there are other good sources of information and ideas out there. Finding these additional articles can be tricky as they are scattered across a huge number of different places.

To help with this, Tom Johnson set up Writer River as a way to allow anyone (with an account) to post links to interesting blog posts and articles. The website has been running for a while now and continues to provide links to useful and topical articles.

I monitor an RSS feed of the website and, whilst that is adequate, it does mean that I can go a week or more without checking, which can lead to a build-up of too many links to check. That usually means I skim the link titles rather than click through them all, and the likelihood is that I’ve missed something that may have been useful.

So I’m delighted to hear that Writer River now has an auto-updating Twitter feed. Every link posted on Writer River will be pinged over to Twitter, and as I’m following Writer River on Twitter, I’ll see the links as they are added, increasing the chances of me taking a few minutes to click through to the linked website.

Time will tell, but this is either a great use of social networking tools, allowing me to keep bang up to date and probably meaning I won’t skip any links, or another challenge to my timekeeping: a quick check of Twitter during a context-switch moment at work may lead to a longer break than I had intended.

We’ll see but, as ever, Tom continues to push his ideas forward to the benefit of us all.

Thoughts on HATT Survey thoughts

Tom Johnson has had a look at the survey recently published by the HATT matrix website on help authoring and, by pulling in the results of some other surveys in the same area, has extrapolated some good conclusions from them.

He rightly points out that surveys need to be taken with a pinch of salt (he goes into the detail of why this is so), and that whilst the numbers involved would seem to be high enough it’s likely that the questions themselves need further consideration in future.

That said, there are two things I took from his post.

1. Know the problem before picking the tool
You may not be in the position to switch authoring tools, but if you are and you have investigated the market then please make sure that you are buying a tool that addresses the problems you have.

The presumption here is that if you have a legacy tool (like we currently do, FrameMaker 7.1) and it still works and meets your requirements then there is no good reason to upgrade. I’ve been a victim of buying into the ‘keeping up’ frenzy that software manufacturers like to generate, but once a product is reasonably mature it is likely it has most of the features you need already.

I’d offer Microsoft Word as an example here: I could probably still use Word 2.0 for the few documents I maintain in that format, as the newer versions add functionality I don’t need (and which has ended up intruding on my workflow at times!).

The X-Pubs conference a couple of years ago had a common, if not publicised, theme. Almost all of the presentations included the advice to figure out what problems you had before deciding IF single sourcing (using XML as the base format) will help, and that’s even before you consider the tools themselves.

2. DITA is still a theory
Whilst it is true that the number of people leveraging DITA for their content is rising, the numbers remain low.

Partly that will be due to the fact that few organisations/teams/people are in a position to quickly switch just because a new technology has come along but, and I’ve said this before (in fact I’ve said that I’ve said this before!), the rollout of DITA remains harder than rolling out a bespoke authoring tool.

When costing an implementation of a new tool there are various factors, and it’s very easy to see that you can get MadCap Flare up and running quickly, whereas a DITA-based solution takes investment in developing the environment. This is beginning to change but, as yet, the phrase ‘DITA support’ really only means that you can output to a DITA-formatted XML file. The tools aren’t constructed around the DITA concepts, so you immediately lose a lot of the benefits that DITA can bring.

Until there is a tool that fully leverages DITA, building it into the workflow of using the tool, and helping the concepts become part of your daily working practice then it will continue to be a marginal player.

Which, in a way, is how it should be. DITA is not a tool, it is a technology and methodology. It is there to support the toolset and the writer. It’s just a shame that tool vendors continue to believe that THEIR format is best, refusing to budge from that position and shoe-horning ‘DITA-esque’ features into their software.

Anyway, the rest of the survey write up is interesting and worth a read but, as Tom says:

“I do love these surveys, though; if for no other reason than they give us something to talk about”

UA Conference Notes – Day 1

Notes and thoughts from Day 1 of the User Assistance Conference

Session 1 – Tony Self – Emerging Help Delivery Technologies
It’s been quite a while since I heard Tony speak but as ever he provided an entertaining, if somewhat limited, presentation. Covering the various types of help viewing technologies he nicely summarised some of the available choices including the features to look out for, including the ability to wrap up an online help system in its own application (using technology like Adobe AIR). It was interesting to hear some Web 2.0 features making their way into online help technologies, including voting and commenting facilities which would give you direct feedback from the people using your help system.

Session 2 – Joe Welinske – Write More, Write Less
Embracing the Value of Crafted Words and Images
Another regular speaker, Joe was certainly fired up, challenging us all from the outset of his presentation to consider how we work in far more detail than we currently do. First up, he suggested that we should write fewer words whilst making sure those words are correct, lessening the impact on the reader by providing just the information they need and nothing more.

And then he hit on something that I’ve previously mentioned here (although Joe nailed it much better than I did), namely allocating writing resource to the highest priority pieces of documentation work, rather than the traditional approach of documenting everything. It’s a simple approach that, when combined with better writing, leads the craft of technical communications to provide much higher value to the business which is good news for all of us.

Session 3 – Sonia Fuga – DITA & WordPress Solution for Flexible User Assistance
A showcase-style presentation of a stunningly simple concept. With a little bit of coding work (building a DITA importer to get XML content into the WordPress database), the team at Northgate offer a web-based help system which allows users to add their own notes and to vote for useful information, and which can receive updates with new content at each release.

How? By using WordPress features. Notes are left as comments, votes are left using a WordPress plugin, and the updateable content is controlled by only allowing the customer (who has access to the WordPress admin screen) to create Pages, leaving the Posts controlled by Northgate. I use WordPress for this website, and spoke to Sonia in the evening to confirm some of the finer details. It’s a very clever use of WordPress, and I hope Northgate release their DITA importer to the open source community!
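Northgate’s importer isn’t public, so the following is only a guess at its shape: a minimal sketch, assuming standard DITA topic markup, that maps one topic onto a WordPress-style post dictionary. The function name, sample topic, and field values are all illustrative, not Northgate’s actual code.

```python
import xml.etree.ElementTree as ET

# A hypothetical DITA topic, following the standard topic DTD shape.
DITA_TOPIC = """<topic id="backup">
  <title>Backing up your data</title>
  <body>
    <p>Run the backup wizard before applying any patch.</p>
  </body>
</topic>"""

def dita_topic_to_post(xml_text):
    """Map one DITA topic onto a WordPress-style post dictionary."""
    topic = ET.fromstring(xml_text)
    title = topic.findtext("title", default="").strip()
    # Flatten each <p> in <body> into a paragraph of post content.
    paragraphs = ["".join(p.itertext()).strip()
                  for p in topic.findall("./body/p")]
    return {
        "post_title": title,
        "post_content": "\n\n".join(paragraphs),
        "post_type": "post",           # Posts stay under Northgate's control
        "post_name": topic.get("id"),  # slug taken from the topic id
    }

post = dita_topic_to_post(DITA_TOPIC)
print(post["post_title"])  # Backing up your data
```

The real importer would of course handle nested elements, conrefs, and the actual database write, but the core of it is just this: one topic in, one post out.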

Session 4 – Questions and Rants
A short session with four speakers each giving a two minute ‘rant’ and then taking questions. Nothing particularly noteworthy came of this but it’s a good addition to the usual style of presentations and made for a little bit of light relief.

Session 5 – Dave Gash – True Separation of Content, Format, Structure and Behaviour
Another familiar name, Dave is always entertaining and a very dynamic speaker and in this session he even managed to make the somewhat mundane topics of HTML, CSS, and JavaScript interesting!

Outlining some basic principles, he showed how you could take an HTML file full of embedded behaviour (JavaScript), style rules (CSS), structure, and content, and strip out all four parts into a more manageable set of files. This way, holding the style and behaviour in referenced files, you can make changes to either and have them ‘ripple’ through all of your deliverables.

Admittedly this was all done by hand but the basic principles are something that you should be following if you have that kind of output.

Session 6 – Matthew Ellison – User-centred Design of Context-sensitive Help
Interesting presentation by Matthew which started a little slowly, covering the history of context-sensitive help before taking us onto the idea of task support clusters. Originally presented by Michael Hughes at the WritersUA conference, the premise is to offer the user a smarter landing page, referred to as Keystone Concept topics here.

The key to a successful Keystone Concept topic is not to limit what is presented, and to consider that it should be different depending on the context from which it was launched, with the ultimate aim of getting the user back on task as quickly as possible. This includes any form of tips and hints, and crucially suggests NOT to include the obvious stuff (don’t answer questions that users will never have!). This mirrors part of the theme from Joe’s talk early in the day, and certainly seems to be a sensible goal given the business (time and resource) pressures we are all under.

After that, I had a few beers and a chat with some other delegates, and as ever it was great to hear that most of us have similar issues, problems and solutions.

I’ll post my notes from Day 2 of the conference tomorrow.

Web 2.0 is hard

Question: How much investment does Web 2.0 really take?

Answer: A lot.

I’ve seen the same quote repeated in several different locations recently. It was uttered by O’Reilly and has the benefits of being short, quantitative, and seemingly true. As I’m in the midst of setting up a new website for our company, focussed on the developer community that already exists (in number if not in action), it was a phrase that made me realise just how much work lay ahead of me.

Part of the work I’m doing is to replace the existing website, rebranding and updating it in one fell swoop. Most of the work is largely concerned with uploading documents and files to make sure that everything that is currently available will be available from the new website, but there are already thoughts around how we can use the website to drive further adoption, innovation and so on.

And, of course, because Web 2.0 is the phrase of the moment there are quite a few eyes waiting to see what will appear.

One thing I have realised, and I’m still winning over minds on this, is that most of what Web 2.0 is about isn’t the technology and, whilst this may seem like an odd statement, it’s not really about the people who use the website, not initially at any rate. No, for me the big issues that surround Web 2.0 adoption by corporations are centred around information and transparency, about being part of the conversation.

That last sentence is important. You cannot drive a conversation on the internet; you can start it, you can contribute to it, but once you’ve set it free you no longer have control over it. All you can do is hang on for the ride, and that’s where transparency kicks in. The more the conversations grow, the easier they are to manage if you are open and upfront. For, as Tim O’Reilly said of Web 2.0 (and I’m paraphrasing here):

“The more people that use it, the more uses we’ll find”

So, just as a more connected community of users will increase what they can achieve both individually and collectively, so too will the number of pitfalls awaiting the cumbersome grow.

What this confirms is that most of the challenges around setting up a community website are largely about the individuals and being able to reach out to them, to be able to consistently engage them and ultimately offer them benefits for their time and input.

Which doesn’t half take a lot of work.

Everything is connected

This post has been bubbling for the past year or so, ever since I started this blog. It’s a bit of a ramble but if I don’t publish it now I’ll just keep adding to it and it’s long enough as it is!

I question everything. It’s part of the way my mind works, is something I’ve embraced, and I believe it makes me better at my job as a technical communicator. That attitude has also helped me realise that there is a common thread that can be found across several different areas of our industry, which I (and others) are slowly pulling together. Convergence is the word that springs to mind, and as businesses clamber onto the social networking bandwagon, now is an excellent time to grab the reins and take control.

Let’s step back a little.

Late last year, on two separate mailing lists, I followed discussions about what the myriad of people who share my profession have as job titles. I prompted one discussion on the ISTC mailing list, and chipped in some thoughts on the TechWR mailing list before dropping out later on when the noise ratio, as ever, got too high.

I wonder how much useful information I miss when I do that? Ahhh something else to ponder. But not today.

Anyway, discussions around how we as a profession should be referring to ourselves inevitably lead to discussions and thoughts about what we do, where our skills lie, and the benefits we can bring to an organisation. Something I’ve toyed with before, but which is wrapped up in many layers of ifs, buts and other such caveats.

Following on from that, I read an article by Virginia Lynch in the CIDM newsletter (and if you aren’t subscribed to their newsletter, you should be) entitled Information Developers – The New Role of Technical Writers in a Flat World which encapsulates a lot of my current thinking on how to take my current team forward, making sure we are matching company strategy whilst allowing the team members to retain a focus on maintaining and developing their core skills. The article title rather neatly alludes to Thomas Friedman’s book The World Is Flat: The Globalized World in the Twenty-first Century which is certainly worth a read.

Virginia mentions that JoAnn Hackos recently referred to these core skills as “Basic Hygiene”, citing the fact that, regardless of how the collation, production, distribution, and usage of information may change as we explore the burgeoning arena of new tools available to us under the banner of “social web applications”, our core skills remain. Typically they tend to drop off as we are pushed to create more, faster, with a rise in quantity favoured over a maintenance of quality.

Whilst style, grammar, punctuation, spelling, and even clarity seem to have been sacrificed for quantity, JoAnn points out that knowledge of basic writing skills is still critical to our success as writers. Basic Hygiene also comprises an understanding and appreciation of editing, the information development life cycle, fundamental web and computer skills, and, of course, attention to detail.

However it is important to note the nod towards quantity being a business leader, and those of us tasked with managing a team need to consider how we achieve that business aim, without impacting our integrity as Technical Writ… umm… Information Developers?

So, how do we produce more whilst maintaining quality?

Wait! What’s that coming over the hill? Ahhh yes, the shining white knight of single source, armour gleaming, his trusty DITA (or DocBook) in hand, ready to do battle against the ills of productivity measurements and over-zealous QA departments. What else were you expecting? Ohh more resource? No, not these days when everyone is a “content creator”, not these days when we should be embracing and encouraging our audience to help plug the gaps in our information dykes (I really must stop mixing my metaphors).

Topic-based writing certainly seems to tick the required boxes and every business case and ROI I’ve read (and I’ve written a couple myself) points us towards the promises found over the horizon and the “he’ll be here real soon, honest” arrival of the aforementioned white knight. The trouble is that, whilst it is easy to agree with the theory, I’m not all that sure the white knight is all he seems. Certainly as we climb the hill towards him, auditing our content, deciding on chunking levels, agreeing metadata requirements, we begin to see that the armour seems a little thin and dented in areas, and I’m not entirely sure the knight is filling that armour as much as he should. Aren’t they supposed to be big strapping warriors? He looks a little weedy to me…

Topic-driven content written with a minimalist slant, deferring here to the instructions of Strunk and White* rather than Roy Carroll, is where we seem to be (need to be?) heading, and that’s fine and good from where I’m sitting.

* A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts. This requires not that the writer make all his sentences short, or that he avoid detail and treat his subject only in outline, but that every word tell.

On the flip side, there is a definite growth in awareness around the use of Web 2.0 technologies and systems, building online communities, integrating Wikis, blogs, RSS feeds into the information flow either as part of end user deliverables or as methods for encouraging information creation by everyone involved with the product, internal or external.

A large part of our job concerns the collation and filtering of information, so as far as I’m concerned anything we can do to make the creation of source information easier has to be welcomed. Extending these mechanisms beyond internal usage means it should be easier to provide information to the people who really need it, with the added bonus of a greater level of trust in that information. Don’t believe me? Which type of information do you put most weight on, the information passed to you by a trusted colleague who you know uses the product heavily, or the product documentation? (And bear in mind that we technical writers are pre-disposed to favour the work of our peers.) That in itself is another issue which may be alleviated by embracing social content creation, pulling on the goodwill generated by openly inviting contribution and collaboration, whilst giving technical writers a chance to show their worth in full public view.

So where is all this heading? I’m not sure if anyone is too sure, but there do seem to be some trends appearing. The use of Wikis to host documentation, the creation of community websites with few restrictions, and more. There are plenty of tools, and with a little work you can get them talking to each other. Technology is not the limiting factor anymore; attitudes are now the only thing stopping us from trying these wondrous new things. It’s a big step for some companies, and some people, to free their information, to pass their hard-earned knowledge about willy-nilly without a clue as to how it will be used.

Once you’ve gotten past those limitations and have your community or collaboration up and running, the real effort is in the surrounding processes. Do you want to pump content into the website regularly? (yes). Do you want to allow anyone and everyone to contribute to that same store of information? (yes). Do you want to allow others to quietly correct your mistakes? (yes). Do you want to give the people who need it access to information about your product, regardless of where it originates, trusting them to use their judgement? (yes).

The final pieces of the jigsaw are the finer details of implementation. Presuming we want to reuse information as often as possible, where do you store the information and how do you allow access to it? Who should be involved in verifying new information? Where and how is the level of trust established?

Pulling together the threads of this emerging role is tricky, with so much overlap into multiple areas and so much to consider there is a danger of not seeing the wood for the trees. This post is an attempt to step back and make a little more sense of what I can see, what I know, and the changes starting to drag our profession in interesting new directions. I fear I may have muddied the waters, but hopefully they’ll settle and things will start to make sense.

Regardless of whether I’m right or wrong, one thing is for sure, these are exciting times and we have a great opportunity to finally leverage technical communications into the spotlight. The value of information is finally being properly realised, and we are ideally placed to help any organisation make the most of what information they have and help them understand and create the information they really need.

Back to DITA?

I’ve mentioned DITA a few times on this blog, and my DITA is not the answer post is still attracting attention. As I’ve said, I think the DITA standard is an excellent one for software documentation and the DITA movement is slowly catching up to the hype. I’ve never given up on DITA and had always planned to use it as the basis for the next stage of our content development, and as it happens the switch to a full DITA/CMS based solution may be closer than I had anticipated.

We have been considering how best to publish up to date information in keeping with patches and minor releases, and if we can tidy up and publish useful information from our internal Wikis and support system. The nature of the product we work with means there are a lot of different usage patterns, not all of which we would document as they fall outwith typical (common) usage.

So, how to publish formal product documentation, in line with three versions of the product, in PDF for ‘printed’ manuals, JavaHelp to be added to our product, and HTML to be published to a live website alongside other technical content (ideally maintained in the same system as the product documentation)? Storing the content as XML chunks also allows us to further re-use the content programmatically (which can be tied into our product in a smarter, dynamic, fashion).
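As a very rough sketch of that programmatic re-use (the chunk markup and function names below are invented for illustration, not our actual FrameMaker/WebWorks pipeline), a single XML chunk can feed more than one output format:

```python
import xml.etree.ElementTree as ET

# One hypothetical XML chunk, written once, rendered several ways.
CHUNK = """<topic id="install">
  <title>Installing the patch</title>
  <body><p>Stop the service, then run the installer.</p></body>
</topic>"""

def render_html(xml_text):
    """Render the chunk as an HTML fragment, e.g. for the website."""
    topic = ET.fromstring(xml_text)
    title = topic.findtext("title")
    body = " ".join("".join(p.itertext())
                    for p in topic.findall("./body/p"))
    return f"<h1>{title}</h1>\n<p>{body}</p>"

def render_text(xml_text):
    """Render the same chunk as plain text, e.g. for in-product help."""
    topic = ET.fromstring(xml_text)
    lines = [topic.findtext("title"), ""]
    lines += ["".join(p.itertext()) for p in topic.findall("./body/p")]
    return "\n".join(lines)

print(render_html(CHUNK).splitlines()[0])  # <h1>Installing the patch</h1>
```

The content is authored exactly once; each deliverable (PDF, JavaHelp, HTML) is just another rendering function over the same source.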

The obvious answer is single source using DITA to structure the content, storing the content as XML to give us the greatest potential avenues for re-use. Nothing particularly startling there I know, but it’s a switch from the direction we had been considering. So I’ve been catching up on what’s new in DITA-land and have to admit I’m a little disappointed.

We already have FrameMaker and WebWorks in-house, although both are a couple of versions old, and thinking we might keep using those applications I’ve been hunting about to see if I can find a solution that offers a coherent, end-to-end story. There are several CMS solutions which require an editor, editing solutions which require a CMS, and a few products that straddle both CMS and editing but then require publishing engines.

I understand that it would take collaboration between vendors to be able to offer a simple, seamless solution.

In addition to that, there does seem to be a tendency for any DITA-focused solution to remain the remit of the overly technical. Don’t get me wrong, I’m quite happy delving into XML code, hacking elements, or running command line scripts to get things done. But surely I shouldn’t have to resort to such things? Now, I’m sure there are many vendors who will tell me that I don’t need to worry, but I’ve seen several demos and all of them miss a part of the FULL story.

Come on then vendors, stick your necks out. If you are a CMS provider, then recommend an editor. If you sell editing software then talk nice to a CMS vendor and start promoting each other (yeah Adobe, I’m looking at you!).

And yes, I’ll happily admit that maybe I’m just not looking closely enough. If only there was some sort of technical community website that I could join, perhaps with a group or two on DITA? That’d be great.

Ohhh wait. There is! (not the most subtle plug in the world, was it? I think the new Content Wrangler communities could be a big hit, do check them out).

Have I got the wrong end of the stick? Are there really gaps in the market in this area at present, or is it just my imagination? I guess I’ll be running a fair few evaluations over the coming weeks and, of course, I’ll post my thoughts and findings here.

Q10: Text editor with a difference

Recently I’ve been toying with some software tools to try to focus my writing a little better, stumbling across an application called Q10, a full-screen text editor which is, according to the website, “a simple but powerful text editor designed and built with writers in mind.”

Whenever I start a new piece of work I tend to write up notes in text files first, before moving them into FrameMaker for formatting and publishing. This was largely borne from limitations in Structured FrameMaker, and sits well with our usage of an internal Wiki for content collaboration. Q10 helps with this approach by doing one thing very well.

Like most well-designed products, the features are usable out of the box, but there is enough scope for tweaking things to get a system that suits you. The full-screen aspect of Q10 is the most important, blacking out the rest of the screen and leaving you with a blank slate on which to focus your thoughts. There is no menubar, with options only available through keyboard shortcuts, including one to turn off the typewriter noises which I’ve left on as they are somewhat soothing… oddly.

You can set the file encoding, set a target number of words (handy if you are writing an article), and change the font and colour settings, and it supports quick text, allowing you to replace given character combinations with whatever you specify (I use this for product names, although you have to search and replace in the underlying text file if the name changes).
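Q10’s internals aren’t public, but the quick text behaviour described above amounts to a simple substitution table, something along these lines (the trigger syntax and expansions here are invented for illustration):

```python
# A hypothetical quick text table: trigger -> expansion.
QUICK_TEXT = {
    "\\pn": "SuperProduct 3000",  # product name
    "\\co": "Example Corp Ltd",   # company name
}

def expand(text, table=QUICK_TEXT):
    """Replace each trigger with its expansion. Longest trigger first,
    so overlapping triggers behave predictably."""
    for trigger in sorted(table, key=len, reverse=True):
        text = text.replace(trigger, table[trigger])
    return text

print(expand("Install \\pn before use."))
# Install SuperProduct 3000 before use.
```

This also shows why a renamed product means a search-and-replace on the saved file: the expansion happens as you type, so the plain text file holds the expanded name, not the trigger.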

At first I was only really using Q10 for writing blog posts and articles, but I’ve started to extend it into my set of ‘work’ tools and it’s proving very useful. Sure it gets some odd looks as people glance at your screen but I certainly seem to be more capable of focusing on my writing these days.

I’ve tried a few other full-screen text editors (for Mac and Windows) but as the bulk of my writing time is spent on Windows machines, Q10 is proving very useful. Best of all it’s free!