A tech writing team for 21st century software development, whether an in-house tech writer, freelance writer, or independent contractor, needs to adopt the same spirit, structure, and methodologies as an Agile software development team. Each writer needs to live by the basic tenets of Agile development: (1) interact directly with customers and developers to support and communicate with all stakeholders, (2) use tactics that work rather than straining against established protocols and layered processes, and (3) react quickly to changes. These practices stand in contrast to the overly planned and scripted writing methods of the past.
Traditional Writing Expectations
A traditional writer waits for a product team to finish its coding and testing, and then acts as a scribe, perfunctorily writing up the resulting products and features. They do not engage in PRD planning, lend their skills to UI development, or immerse themselves in customer issues at the customer's level. They see themselves as having a single responsibility on the team: handling help files and reacting only to explicit complaints. Consequently, traditional tech writers are either limited to a simple project or fail when asked to handle multiple responsibilities.
If not immersed in the product team through agile writing methodologies, the traditional tech writer is most likely living a frenetic, reactive life: waiting to get laid off, or meeting seemingly endless deadlines under maximum stress. Processes that identify application features, identify and gather feedback from audiences, set milestones for delivery, and provide replicable workflows should be welcomed warmly by today's tech writer, especially one juggling so many balls and growing responsibilities.
The Agile Tech Writer
Tech writing is most effective when the writer knows conceptually why the new feature or product is needed in the market and how it is going to be used. This is what agile methodologies provide. These writers go beyond simple reference identification to expand information to assist the user in understanding the value and utility of the software product.
An agile tech writer, like an agile dev team, is part of the whole planning, development, and support process. The agile tech writer understands everyone's responsibilities, knows the product as a whole, and meets the myriad communication requirements of a variety of stakeholders across the product lifecycle.
An Agile tech writer
- Understands customer practices across vertical industries.
- Engages throughout the product lifecycle to provide internal communication and documentation early to stakeholders and to capture content to frontload their own research needs. This releases engineers and PMs to focus on their jobs. Everybody wins through new efficiencies, common information, and expanded writing services.
- Provides right-brain communication from within a predominantly left-brained engineering organization. They embed themselves as chief communicator for all internal and external needs, expanding roles to include technical journeyman, real-time author, curator of content, and publishing strategist.
- Iterates through each writing cycle to improve products and processes with each delivery.
Like agile software developers, the agile tech writer needs to rely on interaction with key individuals rather than on perfunctory processes and never-used planning tools. They need to embed within the dev teams to understand team goals and changing product features, requests, and opportunities. They value best practices from the customer perspective over detailing product features with reference data and obvious help files. They do not see themselves as scribes with specific writing assignments during the release phase, but rather as champions of communication and collaboration throughout the lifecycle of the product, a task never complete until the software is obsolete.
The agile tech writer goes beyond help systems and release notes into a comprehensive role as tech info coordinator, documentation writer, and curator of product information all wrapped into one. In my mind, this is an Agile Tech Writer: someone with the skills to support APIs, generate in-depth articles, manage formal documentation, and integrate their own writing with social and cloud resources, publishing to the web, the KB, or any other format using object-oriented writing techniques. They identify overall product inconsistencies, initiate needed conversations, challenge dev and PM from another perspective, and leverage their team relationships to generate articles for formal doc and other publications across the KB, the company, and the cloud.
Agile software development methodology is based on customer input and quick reaction to meet customer needs and opportunities. It structures resources to meet specific business needs, puts the work on a calendar, provides milestones, and requires goals to be met. It also requires customer interaction from all members, along with backend processes and tools that become more efficient and scale to meet targeted and vertical needs. For the tech writer, support protocols that meet all these goals would allow the writer to touch all points of communication and publish required doc across the enterprise.
- Agile tech writers embed with the dev team to learn product requirements from PM and customer input, understand products and features from conception, stay privy to UI design and vocal in its implementation, and take part in beta testing. By staying engaged all the time, they frontload most of their work when writing articles on new practices and new features.
In future postings, I will expand on how to improve your technical skills and writing processes to improve your agility and ability in supporting software development teams.
One valuable instance of a Knowledge Mashup is the concept of a documentation mashup, or DocMashup, a name that brevity and convention demand. Basically, this type of knowledge mashup serves as an aggregation of content from all resource types, near and far, for a product or service, including the importing and updating of the traditional static documentation set for a hardware, software, or other technical, healthcare, or scientific product. The DocMashup consolidates diverse information sources from the web: real-time comments streamed from forums, independent content from blogs, images and videos hosted on the web, and localized glossary entries from Wikipedia (or DBpedia if using semantic tags). This public web content is integrated with internal corporate or personal content from local servers or files: documentation guides, wikis, help sets, and training collateral, for example. The result is a portal or centralized document linking all germane information and continuously keeping it current.
The DocMashup author pulls together all the disparate content and gives it context, narrative, and organization. Headers, tables, panes, links, and new content combine to form a new type of dynamic, multimedia document with streaming, real-time, semantically-relevant knowledge for a selected audience. The DocMashup can be used for traditional documentation guides and help sets, as well as replacing traditional training and professional services documentation.
The Guide by the Side
The premise behind the DocMashup is that the author no longer stands as the master of the set of all knowledge for a particular topic, product, or service. The author is the aggregator, teacher, student, researcher, collaborator, organizer, contributor, and explorer in finding content, adding content, providing context to disparate information sources, and imparting shared knowledge for the group. He or she is the guide by the side, albeit the leading guide, but not the sage on the stage pedantically feeding information and taking responsibility for all knowledge. The writer is now the director, casting director, and producer…yet still a writer and editor.
So what does it mean to no longer act as the "sage on the stage" when writing? For me, it means that the author changes perspectives from the sole generator of content and moves to a position as the information manager and producer. He or she continues to write, but also shares all the other aspects of writing with the audience: allowing access to initial research of all media types, citing research topics and experts, and capturing important information from subject matter experts. The whole writing process is now shared, published, and available for scrutiny as it unfolds.
When delivering the content, the writer gives all the captured information credence and provides context to render real knowledge for a focused audience. The DocMashup author acts as the guide in finding valuable content, not the sole builder and distributor.
Reader Adds to the Mashup Content
Along with all other resources, the reader also adds content to the DocMashup. The writer initiates original resources and organization, but the reader is also allowed to edit and update content.
After handoff to the reader, the author can update content to meld with new versions of the reader's content. Likewise, the reader can feed their comments back to the author and other members of the audience.
Where do you author and host a DocMashup?
This is a question we will experiment with and answer in detail at a later date. Suffice it to say that there is no perfect solution at this time, but there are many good pieces to cobble together. A complete solution stands as our ultimate goal in building a DocMashup, but until then, here are a few ideas:
- Author a media-rich document in MS Word, Adobe RoboHelp, Madcap Flare, or any other authoring tool and save to PDF.
- Create a template for MS OneNote or Evernote and let readers add content.
- Create an Adobe AIR file with media-rich XHTML and allow for comments.
- Create a standard mashup using a mashup platform.
- Create separate web artifacts pulled together on a web page.
The basic idea here is to pull together all resources from the web and local data stores, add comments from the author and reader, and keep the content updated.
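That basic idea can be sketched in a few lines. The following is a minimal, hypothetical illustration (the feed content, note text, and function names are all invented for this post): it parses items from an RSS 2.0 string using Python's standard library, merges them with local author notes, and renders a single HTML list, the skeleton of a DocMashup page. A real mashup would fetch live feeds and re-render on a schedule to keep the content current.

```python
import xml.etree.ElementTree as ET

# A stand-in for a live forum feed; a real DocMashup would fetch this from the web.
SAMPLE_RSS = """<rss version="2.0"><channel>
  <title>Product Forum</title>
  <item><title>Install tip</title><link>http://example.com/1</link></item>
  <item><title>Known issue</title><link>http://example.com/2</link></item>
</channel></rss>"""

def rss_items(xml_text):
    # Extract (title, link) pairs from an RSS 2.0 document.
    root = ET.fromstring(xml_text)
    return [(i.findtext("title"), i.findtext("link")) for i in root.iter("item")]

def render_mashup(feed_xml, local_notes):
    # Merge web items and local author/reader notes into one HTML list:
    # the centralized "mashup" page linking all germane content.
    rows = ['<li><a href="%s">%s</a></li>' % (link, title)
            for title, link in rss_items(feed_xml)]
    rows += ["<li>%s</li>" % note for note in local_notes]
    return "<ul>\n" + "\n".join(rows) + "\n</ul>"

page = render_mashup(SAMPLE_RSS, ["Author note: see the install chapter of the admin guide"])
print(page)
```

The point of the sketch is the shape, not the tooling: any of the options listed above (OneNote, AIR, a mashup platform) is ultimately doing this same aggregate-and-render step.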
The next posting will talk more about tools, platforms, and practices to build a DocMashup. I hope to get a few comments and ideas from others engaged in information aggregation and knowledge mashups.
February 17, 2010
· Michael Hiatt
It's the world according to YOU. It's about you getting the information you want when you want it. It's about accessing content from open and collaborative sources, then filtering and focusing that content to meet unique documentation, training, and other educational needs. It's personalized, real-time information delivered directly to your computing devices. That's the promise of Web 2.0 moving to Web 3.0: automatically accessing and controlling the best information you deem relevant to your needs. In your world, you are the one who provides context and meaning in the cloud of overwhelming data and disparate information. You are the existentialist of the Information Age who defines real knowledge amongst the chaos and chatter.
That's the idea behind the Knowledge Mashup. Find the best pieces of information, tag that information, and then structure and sequence it for specific needs. Use a simple object-oriented approach that identifies articles, videos, pictures, and people as independent objects, and then give these objects context and navigation by overlaying them with some type of structural paradigm (a list, content map, outline, or search filter).
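That object-and-overlay idea can be made concrete. In this hypothetical Python sketch (all titles, URLs, and tags are invented), each piece of content is an independent tagged object, and an outline overlay gives the objects context and navigation:

```python
from dataclasses import dataclass

@dataclass
class ContentObject:
    kind: str     # "article", "video", "picture", "person"
    title: str
    url: str
    tags: set

# A small library of independent, tagged content objects.
library = [
    ContentObject("article", "Intro to widgets", "http://example.com/a", {"widgets", "intro"}),
    ContentObject("video", "Widget teardown", "http://example.com/v", {"widgets", "advanced"}),
    ContentObject("article", "Gadget history", "http://example.com/g", {"gadgets"}),
]

def overlay(outline, objects):
    # Apply a structural paradigm (here, an outline): each outline entry is a
    # (heading, tag) pair that pulls in every object carrying that tag.
    return {heading: [o.title for o in objects if tag in o.tags]
            for heading, tag in outline}

toc = overlay([("Getting started", "intro"), ("Deep dives", "advanced")], library)
print(toc)
```

Swapping in a different overlay (a content map, a search filter) reorganizes the same objects without touching them, which is the whole point of treating content as independent objects.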
The idea of a knowledge mashup is to bring together disparate but germane resources for a specific topic. The original author of the knowledge mashup can tailor these resources and give them context to educate, communicate, and impart information. It allows the best content to be used while an author or teacher controls its context and organization. It is a filter on the fire hose of the Internet, directing just the information needed for a specific product, process, instructional aid, or discipline. Among other benefits, it allows the author to be a "guide by the side" of the reader in finding information rather than the all-knowing "sage on the stage."
A Simple Idea
The idea and practice of a knowledge mashup can be fairly basic, even primitive. Like all mashups, the idea is to take pieces from existing content and pull these pieces together to form a new entity. Because content on the web is the most pervasive and accessible to capture, cut and paste, or link to, a basic knowledge mashup requires little coding or enhanced logic to capture important content. It basically identifies germane comments, articles, paragraphs from articles, glossary references on Wikipedia, videos on YouTube, or streamed RSS feeds, and publishes them according to a narrative, organization, or paradigm that fits the author's audience and intent.
In fact, the knowledge mashup is less a "mashing" of content than a "meshing" of topics. It employs the same hyperlinking and single-sourcing practices used by web developers and technical writers for years. But it applies these topic-based management tactics to both open resources and proprietary documentation sets and internal knowledgebases. It also allows readers to add their own topics and annotations. But like any book, web site, or published material of any type, all this diverse content needs to be given some form in order to impart ideas and provide navigation.
With Web 3.0, the introduction of the Semantic Web, with markup based on content types, will further personalize information and automate distribution techniques. This is the future of knowledge mashups.
Types of Knowledge Mashups
Knowledge mashups can be used for a variety of needs:
- Product documentation
- Product training
- Travel guide
My next postings will explore some of the vertical uses of knowledge mashups and drill down into the practices and protocols used to build them.
Building Cloud Computing and Communication Tools and Processes
At Mashstream.com, we are taking it to the virtual streets. We are joining the egalitarian revolution of open web protocols, open cloud development practices, open data streaming, open information sharing, and open global markets. Now is the time to take the theoretical ideas of Web 2.0 socialization and cloud computing techniques and move them further into the realm of real-life solutions. To that end, we introduce the Mashstream Projects. These open source, ongoing projects are designed to help focus our understanding of emerging technologies and to generate new ideas through research and collaboration.
As coordinator and project manager for the Mashstream Projects, I have been working with Jeffrey Hanson from the Mashstream Advisory Committee to identify an array of current challenges that can be met using Web 2.0 practices and emerging cloud computing technologies. These projects will allow us to research a variety of topics, such as mashup applications using mixed data and services, information aggregation portals, real-time web streaming, and linked data processes for a variety of vertical businesses and disciplines.
Quick Note: I will be updating content to the initial projects listed below and adding new projects as they are suggested and approved. Check back often to see updates and progress on the projects and to comment and join in on the fun. Additional postings will address many of the basic issues of the technologies used in these projects. We will highlight case studies, educate ourselves on basic concepts, and design systems for the next generation of Web 2.0 moving to Web 3.0 technologies and trends.
Learning by Doing
Throughout the research and execution of the Mashstream Projects, we plan to learn by doing: to learn new technologies and tools by immersing ourselves in common problems and challenges. I am relying on this goal-based approach to better learn Web 2.0 tools and technologies by applying them to real-world challenges. We can employ our shared expertise as a community to define requirements, proffer ideas, and find workable solutions. Oh yeah, and to change the world of knowledge development as we know it, thereby saving the planet. All in a day's work.
As a community of web developers, information developers, and content strategists, we are all students in this brave new world of emerging cloud computing and virtual interaction. It's time to put our heads down, set some goals, and immerse ourselves in the Age of Information.
For deep-dive technical questions, we will go to experts like Jeffrey Hanson and other technical gurus and resources for guidance and a periodic reality-check. We will also seek input from those working in the trenches of education, business, medicine, science, and other related disciplines for their ideas and feedback. This open-source development can use all the input, criticism, and ideas that it can muster. Together we can serve all of our common interests while assisting each other in finding specific answers and strategies.
Enough said. Let's bring on the projects.
Welcome to the Mashstream Projects
Here's a list of the current projects being researched and developed:
Project 1: Linked Data System for Researchers
Opportunity: Scientific researchers need raw, real-time data from peer researchers integrated with existing content from published scientific journals, compendiums, and other relevant information sources accessible from the web. This aggregated content needs to be maintained and given context by each research team and integrated with existing knowledge systems. The process needs to harness knowledge flow for internal knowledgebases and external portals and provide features to expedite traditional publishing needs.
Scenario: Provide linked data communication and solutions for a nanotechnology and DNA research company.
Factors: Streaming raw data is what linked data technology is all about. It is the most basic of Semantic Web implementations, requiring content to be structured and semantically tagged. Security is an issue here. We need to research existing implementations and define the infrastructure.
Goals: Design Linked Data processes to integrate with existing information services for research teams. Consolidate open data sets with private data sets providing necessary security.
- Identify available generic data sets and those specific to research.
- Identify research in this area and available semantic web already in place.
- Research process for securely linking public with private data sets using semantic markup.
- Analyze costs for semantic markup.
- Establish Linked Data communication of raw content between scientific researchers.
- Link with existing internal knowledge sets.
- Link with existing corporate publications and information developers.
- Allow research data to be shared immediately and given context by each research team for their unique needs.
- Integrate relevant data sets using basic Linked Data practices and identify possible services for this sector.
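The subject-predicate-object shape behind Linked Data can be illustrated with a toy triple store. This is only a sketch (the sample identifiers and predicates are made up); real systems would use RDF vocabularies, semantic markup, and a query language like SPARQL rather than a Python list:

```python
# Raw research observations stored as (subject, predicate, object) triples,
# the basic unit of RDF-style linked data.
triples = [
    ("sample:42", "measuredBy", "team:alpha"),
    ("sample:42", "meltingPointC", "1064"),
    ("sample:43", "measuredBy", "team:beta"),
]

def match(pattern, store):
    # Return every triple matching an (s, p, o) pattern; None acts as a wildcard.
    return [t for t in store
            if all(p is None or p == v for p, v in zip(pattern, t))]

# Everything measured by team:alpha, regardless of subject:
alpha_measurements = match((None, "measuredBy", "team:alpha"), triples)
print(alpha_measurements)
```

Because every team's data shares this one shape, public and private triple sets can be merged and queried together, which is what makes the consolidation goal above plausible.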
Project Plan of Attack
Here's a list of the initial steps required to get this project moving.
- Research and report current Linked Data practices.
- Identify relevant data sets.
- Formulate a strategy for delivering data.
- Identify a process for creating Semantic markup.
- Devise a process for sharing, providing peer review, and publishing results.
Here are the foreseeable technologies needed for this project:
- Linked Data practices
- RSS and Atom feeds
- Open knowledge management systems
- Publication strategies for multiple audiences and information formats
Planned Product or Service
- Linked Data services and processes for public and private researchers
Project 2: DocMashup for IT Management Software Systems
Opportunity: IT Management Systems are changing dramatically. The advent of online and virtual software applications and new hardware capabilities allows for new opportunities to challenge conventional knowledge aggregation and publishing. During this time of transition, there are great opportunities to consolidate features and provide supporting resourcesâtraining, implementation instructions, and general documentationâas well as upgrade features for legacy management tools to synch with the coming management systems in the cloud.
Scenario: An independent company of information developers provides content by combining in-house documentation and training with available cloud resources, including web articles, YouTube training videos, links to tech sites, and the addition of common practices. This independent group of information developers remixes and adds updated content for products like the Dell Management Console and initiatives such as the announcement of HP working with Microsoft on the Azure web development platform.
Factors: Currently, sales engineers, implementers, and customers wade through shotgun communication from software companies or write their own content for use with disparate management tools. During the transition to new IT management services (and while this current recession lingers), business investment in new IT management systems will be waning. There's also a current mistrust of proprietary systems being developed by major market vendors. Most businesses will work to extend the life of current IT management systems and add on features using widgets and other artifacts that employ open protocols and cloud resources.
Goals: Build a DocMashup template and specific content for system management tools. Confront shotgun communication fixes for products while expanding options to add cloud management features to proprietary IT management systems.
Moving to open cloud computing protocols while utilizing legacy applications and databases requires some type of mashup to remix data and services. We plan to provide practices for using in-house data sets while mixing content and adding functionality from open APIs and online management services being developed by companies large and small.
- Develop a DocMashup guide of standard IT management practices.
- Map customer needs to vendors, technologies, and open source options.
- Identify future trends through industry and analyst predictions.
- Plan a mashup application for open source extensions of legacy systems.
- Publish the DocMashup for IT management systems.
Project Plan of Attack
Here's a list of the initial steps required to get this project moving.
- Investigate cloud computing IT management strategies, including those being developed by HP, Microsoft, Dell, Symantec, KACE, Apple, Linux, and other platforms.
- Research MS Azure, Amazon Web Services, Google App Engine, and VMware vCloud, and other web development platforms.
- Develop mashups for âlong tailâ solutions using legacy data and online data sets, Google maps, and other web resources.
- Build the DocMashup for IT Systems Management, aggregating relevant content, documentation, and training.
- Build artifacts for IT management available from any web page.
Here are the foreseeable technologies needed for this project:
- Web 2.0 Mashup
- Semantic web tagging
Planned Product or Service
- Knowledge Mashup examples for multiple vendors: Dell, HP, MS, and others.
- Set up consulting services for small information developers and SMB enterprises.
Project 3: Aggregation for Education
Opportunity: While high school educators in many areas resist using web content in the classroom, we plan to develop a system where teachers can combine their lesson plans and personal research with online resources and learning portals. The aggregation of focused information for the classroom needs features to "white list" sites selected by the teacher and to track college-preparation requirements using existing web sites.
Scenario: A high school teacher and student work together to integrate the syllabus, lesson plans, external web sites, and student notes into an artifact for a web site or aggregator, saved to the web and the local hard drive. Mashing college requirements with high school content is also a necessity.
Factors: Education is already moving toward online but proprietary education services. Aggregation of content needs to allay the fears of parents concerned about accessing the Internet, provide structure to impart information defined by the teacher, and allow students to add their own notes. For educators, many Ning sites exist but have yet to move beyond teacher collaboration. While college professors are starting to use social networking strategies and college students have adopted Microsoft OneNote, Evernote, and other online information aggregators, the full potential of online aggregation and content mashups is yet to be realized.
Goals: Successfully complete trial for aggregated education services using a selected teacher and student.
- Identify platforms for content aggregation, such as note-taking software, AIR and Silverlight applications, Ning web sites, and other open source communities.
- Identify data sets for SAT and ACT tests.
- Plan the mashup application.
- Publish the mashup, a case study, and a DocMashup for education services.
Project Plan of Attack
Here's a list of the initial steps required to get this project moving.
- Identify tools for online note taking and content aggregation.
- Identify SAT and ACT portals.
- Investigate the Adobe AIR and Microsoft Silverlight platforms.
- Find a teacher and student for the case study.
Here are the foreseeable technologies needed for this project:
- Linked Data practices
- Ning social sites
Planned Product or Service
- Content Mashup for multiple information sites, online teaching venues, and personal notes from student and teacher
Through the Mashstream Projects, we plan to define important areas of investigation and doggedly push to find real solutions. We plan to identify challenges, develop new ideas, research the feasibility of each idea, plan a project, and render usable products, services, and processes to help various disciplines and social communities. We hope to work together with you to identify opportunities and meet common challenges for our mutual benefit.
Meanwhile, as âmashstreamersâ with shared interests, we need to keep the questions and ideas flowing. Together as a community we can research, develop, and employ cloud technologies and Web 2.0 strategies to provide solutions for our own lives, whether in education, IT technology, scientific research, personal content aggregation, or a miscellany of other disciplines where innovation is needed.
So that's the highline of our ambitions. We are also hardened capitalists here at Mashstream and are intent on turning a profit. While much is left to do in defining standards, building open APIs, and organizing the larger Semantic Web, there's enough infrastructure and tooling in place to begin bringing mashups, linked data, and semantic tagging techniques to market.
It's time to put together some mashups that make sense, capture raw data using linked data techniques, optimize and monetize social networking, and develop projects that show the maturity and direction of emerging trends and technologies. Stay tuned.
Hope everyone had a happy new year and a fine holiday season, all across the world. Like many where I reside, I have been away from business, spending time with family and over-celebrating in this traditional party season of the West. But those are just good memories now (except for that episode where I missed the train and couldn't find a cab on New Year's morning). Now I am focused and excited by the prospects of a new year, especially this one. I have a really good feeling about 2010.
The global recession abates, new ideas abound, and thereâs much work to be done in setting up new systems and societies after the gold rushes of the past twenty years. I see a year where competition gears up, old businesses fail, and new enterprises supplant the inefficient. A year of glorious birth and destruction. I am planning for the former.
Quick Note: This is my fourth posting on changing ecosystems for technical communicators and content strategists of all types: web programmers, technical writers, knowledge managers, and anyone providing technical content to a new generation of Web 2.0 technologists. See also evolving, adapting, and competing in the Information Age.
New Year's Resolutions
I resolve to make this a big year for me. I plan to make a lot of friends and collaborators. I plan to take some risks and act on some provocative ideas to work them through to their natural conclusions. But most of all, I plan to confront and act on the dream of possibilities that I see all around.
I plan to be the captain of my own ship, whether a loyal subject, privateer, or just plain pirate. I plan to find myself a place in the high seas of commerce and emerging new marketplaces.
I have a kind of musing about future trends and possibilities that I can't get out of my head. Much of it comes from bits and pieces of what I hear across many disciplines, coupled with anecdotal news stories. Some questions revolve around new ideas in cloud computing, new social interaction possibilities, access to information and real knowledge, open markets, and new points of competition, all future trends. And part of my musings casts back to my knowledge and continued study of history. Between the old and the new, I can see many, many similarities.
It is clear to me that we are in an epoch of new markets and new platforms from which to launch innovation. We are witnessing the advent of instant, global communication that can improve lives and confront pressing problems, from reshaping education to improving health care to generating smart energy to communicating better in business and science. Some of my ideas are more humble, like providing data and protocols for a mashup to help people grow and eat locally by integrating data from agricultural production, farmers markets, and Google Maps. My musings are part utopian fantasy and part momentary flashes of possibility (and part not getting my medication just right, as some may say).
A Dream of the Enlightenment
I believe a new epoch presents itself for some countries and cultures. While manufacturing settles into lower-cost labor markets, the Information Age of a protocol society presents new opportunities. I believe the emergence of a global community and the ubiquitous Internet cloud opens new markets and cottage capitalism, as past epochs did. It is the European Enlightenment combined with American manifest destiny: an ecosystem of open market practices in the cloud, with new territory to explore and improve. It's the combining of Athenian democracy, Adam Smith capitalism, and Ben Franklin energy and innovation, all improved and brought to reality using twenty-first century technology and protocols.
It's a new age of democratic enlightenment glimmering like a distant shore. Ripe markets. New opportunities. Unexplored terrain. I envision willing trading partners in an enlightened economy based on innovation and hard work, devoid of corporate cronyism and the exploitation of peoples and resources. It diminishes the monarchs of current corporate mercantilism and moves beyond the Industrial Age.
Don't wake me up. Let me enjoy this illusion for just another minute.
I see a democratic workplace in the cloud where people work as tribes of researchers, developers, writers, and market specialists bringing all types of products and services to an open and willing market. Associations of innovative, high-tech craftspersons, scientists, and online merchants support each other in open guilds and communities of entrepreneurs and professional peers. It's like the best parts of 18th-century European idealism and fiscal enlightenment.
Don't wake me. I walk in reverie with Rousseau.
I see societies springing up around innovative new ideas disseminated through social networks and built from the resources of virtual ecosystems. It's a place where competition and collaboration strengthen and broaden services to blur the lines between customers, vendors, and specialists. Where automated processes lead to increased productivity. It's an enlightened marketplace furnishing a foundation for meritocracy unburdened by bureaucracy and politics. It's a place where Ben Franklin—the consummate entrepreneur, printer, inventor, politician, and revolutionary—would feel right at home with its possibilities and open promise.
It's all so beautiful…
But. Okay. Time to wake up.
I know all about dreaming, but I also know about abruptly waking up. Nothing this good can be true.
Waking up to the Protocol Society
Good or bad, change is coming. It has actually been here for a while, but now the bubble has burst and people need results. The recent global recession only accelerated the need for change, regardless of political or social stance. But that may be a good thing. The next economic surge needs to come from wholesale innovation. Those societies most able to change their cultures and embrace innovative practices—from education to business to the environment—will prevail. Most of us in the West live in a protocol society and are just now feeling its effects and identifying its challenges.
I love the U.S. political commentator David Brooks, whether watching him on PBS, hearing him on NPR, or reading his NY Times Op-Ed column. One of his latest essays was on the protocol society. It starts out with a clear message:
"In the 19th and 20th centuries we made stuff: corn and steel and trucks. Now, we make protocols: sets of instructions. A software program is a protocol for organizing information. A new drug is a protocol for organizing chemicals. Wal-Mart produces protocols for moving and marketing consumer goods. Even when you are buying a car, you are mostly paying for the knowledge embedded in its design, not the metal and glass."
Brooks goes on to talk about the currency of ideas: "The success of an economy depends on its ability to invent and embrace new protocols…but they are really talking about how quickly a society can be infected by new ideas." He argues that the economic culture is key to growth and depends on "how fast a society can absorb and change to the increased velocity of new recipes [protocols]."
This is a reality the Western world has been living with for decades: a protocol society where innovative ideas lead to new software programs, auto and home designs, shared scientific data, contextual knowledge sets, and streamlined processes that constitute a major portion of the GNP.
David Brooks finishes with this: "Economic change is fomenting intellectual change. When the economy was about stuff, economics resembled physics. When it's about ideas, economics comes to resemble psychology."
For technical communicators, web developers, product managers, and content strategists, the clear message is that the interaction of groups, the authoring of contextual content, and the creation of innovative protocols are social and psychological interactions, an emerging megatrend, and our bread and butter for the future. Communicators who engage, support, and gain the trust of users and readers where they live and work open a whole new avenue of virtual storefront opportunities.
Breathing like Ben Franklin
Benjamin Franklin, the charismatic founding father of the U.S. Constitution and all-around commentator, inventor, revolutionary, and bon vivant of his time, never went into any endeavor without coming up with a few new ways of doing it better. He invented swim fins after noticing that his boyhood friend with big feet could swim fastest. He famously researched electricity and invented the lightning rod. He listened to a concert of Handel's Water Music and then went home and invented his own glass armonica.
Ben Franklin tackled lifelong sight ailments by fathering bifocals. He worried about ships sinking on Atlantic voyages, so he suggested using the Chinese model of dividing holds into separate watertight compartments. He developed a flexible urinary catheter to relieve his brother John's kidney stones.
But lest we forget his other noteworthy achievements—a fervent apologist for the arguments of the Enlightenment, a statesman during the Second Continental Congress and the Constitutional Convention, a diplomat extraordinaire who won over the French during the U.S. War for Independence, and, as always, the innovative newspaperman and entrepreneur. Maybe it would be easier to say what he didn't succeed at.
Ben Franklin touched many other disciplines and left them better with his curiosity, hard work, and uncompromising need to discover. Innovation for Ben Franklin was like breathing.
Ben Franklin lived during a time of eroding mercantile controls in a de facto wilderness economy with expansive opportunities and unending resources. Unlike industrial European economies, the New World economy grew as exploration and settlement grew and the brutal replacement of native societies and ecosystems took place.
To support both the new economy and emerging Information Age as a protocol society, a new structure and open set of economic rules are needed. Benjamin Franklin and the other merchants and farmers who railed against the British controls and supported the new ideas of the enlightened economist Adam Smith changed the rules through social and military revolution. I'm only advocating the former, once again.
What would Ben do?
Like Ben Franklin's economic world, new expanses lie ahead. Today they exist virtually in the form of Web 2.0 communication delivery and practices—social networking, personalized delivery of information, open cloud computing, interactive real-time communication, independent online merchants, virtual research partners, and the reach of global communication. Unlike Benjamin Franklin, we are not on the verge of the Industrial Age but rather moving past it into an Information Age. But I think these passing times share some characteristics. For fun, let's go back and see if we can learn anything.
Question for the Past: What would Ben Franklin do in the emerging Information Age?
- Use many avatars. From the anonymous Silence Dogood letters that Ben left on his publisher brother's stoop to his pretense as a genteel, coonskin-wearing American charming the French court, Ben always liked to hide behind a good stereotype and nom de plume to get the job done. Whether reviving the anonymous Celia Single or Alice Addertongue, Ben would today come out in all voices, each with selected graphic attributes.
- Promote democratic web access and publishing. Sure, Ben made good as the owner and publisher of the Pennsylvania Gazette and then sold out for a comfortable retirement. But I have to believe he would NOT be doing as Rupert Murdoch is, threatening lawsuits and erecting subscription walls. It's a time for innovation for newspapers. We really do need the genius of enlightened commercial thought to get the fourth estate on its feet again.
- Use the web to sell. Franklin was the consummate salesman. He became an expert at selling as a young shop owner of twenty and handled rough competition in his later newspaper years. However, he saved his best salesman's skills for his political years in the Continental Congress and as ambassador for the colonies, enlisting France to help win the U.S. War for Independence.
- Publish controversy. Like Poor Richard's Almanac, published as a commercial spoof to generate interest and profit, Ben would have stretched reality for entertainment. He may have been a movie director, come to think of it. He most definitely would have gone toe-to-toe with competitors in publishing or politics. He probably would still have forecast his main competitor's death as a business calculation, but he would not have kept taunting the man about the prediction after his death. We are no longer THAT cold-blooded.
- Advocate common sense. The commercial, sometimes prosaic, but always sensible advice of Poor Richard's Almanac would be a welcome relief to blog readers. But then again, maybe Oprah has taken over this market. Ben would definitely not throw out past wisdom for the promise of new technologies without first identifying the advantage. He most certainly would blend common sense with each changing technological or social fashion.
- Join several social groups. Ben Franklin was nothing if not a civic-minded man. He was gregarious and a believer in society and the civic duty of all. He would now, as he did then, connect to as many social groups as possible for business, politics, and, I'm sure, to meet babes. Ben loved people. As his alter ego Poor Richard warned, "He that drinks cider alone, let him catch his horse alone." Today, maybe that truism would be more like, "He without Facebook friends, let him drink his cider with his horse, if he can catch him alone." Or something similar.
I'm still working on this list. I think I will email Walter Isaacson, author of the fine biography Benjamin Franklin: An American Life, quoted throughout this posting, to see if he will add to the What Would Ben Do? list.
I am starting to see our current world of change and turmoil as a time for opportunity. It's time to move on to the projects of 2010. We need to take these where Ben Franklin and others of his time of enlightenment would have gone—Gov 2.0, Press 2.0, Med 2.0, Jobs 2.0, and maybe Peace 2.0. That is a lot of innovation to ask for. Just as Benjamin Franklin would have liked it.
Check in to my next posting as we kick off our projects and the revolution begins. It's time to bring all these dreams to reality.
Competition drives all life on Earth. It is the invisible hand of markets and the engine of evolution. In our economic world, we see the traditional competition for jobs: Younger employees versed in new technologies forcing out workers with older skills, experienced employees competing with new graduates for fewer jobs, new technologies and services driving out existing companies, and the ever-popular corporate strategy of moving to cheaper labor markets. These are competitive factors we know and expect. But nowadays, new competitive pressures and opportunities present themselves in our ever-changing technological and globally socialized Age of Information.
Quick Note: I talked about the evolution of a new breed of technical communicator and content strategist in earlier postings, as well as adapting to a new open information environment. This is my third posting on changing ecosystems for content strategists of all types—web programmers, technical writers, knowledge managers, and anyone providing technical content to a new generation of Web 2.0 technologists and communicators using various media ports, portals for social interaction, and groups of common stakeholders.
Competing in the cloud
In the world of Web 2.0 social groups, companies rely on open forums, corporate Facebook pages, Twitter tweets, LinkedIn, and other online social media to expand services and communicate with current and prospective customers. It is a symbiotic ecosystem in the web cloud where predators and prey feed and find their natural markets. But it is also an environment constantly roiling with changing roles and opportunities.
Competing as a solitary barracuda to challenge larger corporate sharks in the high tech ecosystem or scavenging as pilot fish for high-value product leftovers is nothing new. Microsoft, IBM, Apple, Blackberry, Google, and other high tech companies rely on independent third-party developers and authors to assist in educating their user base and providing applications for their platform. Subcontractors are needed to provide service engagements and work as project-based consultants. Most platforms rely on the competition of independent software and information developers. It is a developed market ecosystem.
Evolving markets and changes in traditional markets are also changing the economic environment. Small, nimble companies of individuals or collaborative tribes of developers now compete directly with larger competitors and in-house corporate teams. These small competitors provide more focused product lines (see TriActive), support the "long tail" of products abandoned by companies but not customers (see exprescient), or develop innovative mashups and service-oriented applications for specific market needs (see programmableweb). Small companies and individuals can develop add-on features and upsell to existing customers, fill in with new utilities for existing features or products, or develop entirely new applications by integrating open data sources and employing universal web artifacts.
Independent developers employ easy-to-build mashup applications, knowledge portals using RSS and Atom feeds, configurable social user interfaces (UIs), consumable APIs, and semantically rich web content marked up with XML syntax or other metadata using the Resource Description Framework (RDF) data model. These new cloud infrastructures, methodologies, markups, and emerging protocols allow for easy assembly of powerful applications and personalized, contextual knowledge resources that fundamentally challenge the way we communicate and work together as societies.
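As a concrete illustration of the feed-based knowledge portals mentioned above, here is a minimal sketch, using only Python's standard library, of pulling entry titles and links out of an Atom feed. The feed content and URLs are invented for illustration; a real portal would fetch the XML over HTTP and merge many feeds.

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"  # Atom's XML namespace

def extract_entries(atom_xml: str) -> list:
    """Return a {title, link} dict for each <entry> in an Atom feed."""
    root = ET.fromstring(atom_xml)
    entries = []
    for entry in root.iter(ATOM_NS + "entry"):
        title = entry.findtext(ATOM_NS + "title", default="")
        link_el = entry.find(ATOM_NS + "link")
        href = link_el.get("href", "") if link_el is not None else ""
        entries.append({"title": title, "link": href})
    return entries

# Invented sample feed standing in for a real HTTP fetch.
sample_feed = """<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Portal</title>
  <entry>
    <title>Protocol societies</title>
    <link href="http://example.com/protocols"/>
  </entry>
  <entry>
    <title>Mashup basics</title>
    <link href="http://example.com/mashups"/>
  </entry>
</feed>"""

for e in extract_entries(sample_feed):
    print(e["title"], "->", e["link"])
```

The same few lines handle RSS by swapping in the RSS element names; from there, rendering the entries into a personalized page is ordinary templating.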
For the content strategist, the ecosystem is a cloud of web objects ready to be formed and given purpose. Tim Berners-Lee's vision of a data-based system rather than a document-based system will be a reality at some time in the future. Personalization of features to deliver raw data and then render that data as usable knowledge will be required—no one wants your UI or your doc or your education. They want THEIR user interface and THEIR doc and THEIR education. They just want you to provide a way to consume it, even if they <gasp> have to pay for it.
Darwinian rules for everybody
Adaption to environment. Natural Selection. Inheritance. Survival of the fittest. Predators and prey.
All of the driving forces of the natural world apply to the brave new economic world we live in. As the 150th anniversary of the publication of "The Origin of Species" passes by, it would serve us well to take a closer look at the technological and economic Darwinism that faces us today.
Change brought on by the "Great Recession" seems to be destroying one ecosystem while creating opportunities in a newly emerging climate and environment. The world is in transition. As always, all inhabitants are simply looking for a steady source of sustenance and a new way to survive. Much like the finches of the Galapagos Islands.
Most evolutionary processes take too long to view using standard empirical scientific research, but the finches of the small island of Daphne in the Galapagos archipelago seem to evolve very quickly. Like most of the flora and fauna of these islands, their distant location makes it easier to see how species evolve independently in an isolated environment separated by miles of ocean. For a young Charles Darwin aboard the HMS Beagle, the Galapagos was a perfect laboratory of segmented evolution. It is still the perfect laboratory for seeing evolution on steroids.
The finches of Daphne seem to be evolving almost instantaneously. In a matter of decades, larger finches, first noticed in 1982, began replacing the smaller finches because their larger beaks were better at cracking the native seeds. They were more efficient at their jobs. Consequently, the birds with smaller beaks moved to eating the smaller seeds overlooked by the larger birds, which favored the smallest of the small birds, since the smallest finches were more adept at eating the smaller seeds. This is known as "character displacement": the act of finding a position that minimizes competition in order to live in better harmony (i.e., to better survive).
During a drought in 2003 and 2004 on the island, fewer seeds grew overall; the larger birds competed for the large seeds and the smaller birds competed for the smaller ones. A more competitive environment accelerated the size divergence between the species of finch, based on the size of the seeds they competed for and their efficiency in finding, cracking, and eating those seeds: the larger birds got larger competing for the larger seeds, and the smaller birds got smaller competing for the smaller seeds. After the culling of the less efficient food gatherers in each population, the most efficient sizes of finches evolved to fit the ecosystem. The finches of Daphne found an equilibrium and defined a new competitive system, with each subgroup finding its most advantageous size and position.
So must we.
Some Feet-on-the-ground Ideas
Okay, so enough of the nebulous references to cloud computing, birds, and highbrow talk of becoming an independent content strategist. As my new friend Corda asks: How do you make this happen? Let's get real. We have mortgages and grocery bills. How do we make it work?
First of all, I do not call myself an expert in telling others what to do professionally. I am a student of new trends and an inquisitive bystander at this point, although I plan to put all of these ideas to the test very soon. So with that gigantic caveat, let me share some ideas:
- If this is fun, we're having it. The world is changing. Get used to it. It's not like we have a choice. Enjoy the new skills you are about to learn and the challenges in front of you, and embrace all the opportunities. Because if this is NOT fun, we're also having it.
- The markets are changing. Products, services, and corporations that have gotten fat, lazy, and effete are vulnerable to competition. A new type of democratic mercantilism of independent experts, collaborative tribes, and organized craftsmen and craftswomen is emerging. Increased productivity, better tools, innovative practices, open web infrastructure, and skilled communicators can compete with anyone.
- The old markets are still in place. Some corporations are de facto monopolies, or at least oligopolies. Think Toyota, Walmart, Microsoft, Google, and Apple. Technical communicators need to keep close to these behemoths to utilize their services and infrastructure and tap into their customer bases.
- We are straddling changing societies and technologies. The IT management system your company bought today will be obsolete in five years. Everything will be virtual partitions on a hard drive with embedded or online applications. Twenty-five percent of what you know this year will be obsolete next year. Your iPhone, Blackberry, or Google Droid is yesterday's model. All of this change affects societies. The Iranian government is slowing down web access and using Twitter to counterstrike protestors.
- Skills are changing. Jobs are changing. You need to evolve into a multi-skilled Technical Communicator. The goal is to get information to people. That will require some additional skills in the emerging Information Age.
How to compete:
Let's get down to some specific ideas. How about a table to keep notes? I am going to start a list of ideas here and expand it through time and effort.
| Projects | How to compete |
| --- | --- |
| Author contextual knowledge | Find a topic needing content and become an expert. Take a stance on a social issue. Blog, comment on sites, build social coalitions, write e-books, and push your agenda. |
| Build mashups | Design and build web products using content and services from cloud resources. Mash videos, text, graphics, audio, and legacy content to provide alternatives to corporate products. |
| Build custom applications | Design and build applications for various platforms such as iPhone, Blackberry, Adobe AIR, MS Silverlight, and others large and small. |
| Sell services | Become an independent contractor. |
| Extend professional services | Work directly with customers or with other contractors to supply real-time, real-world content directly to customers. |
| Become an information portal | AppDeploy.com was an information portal whose knowledge base proved valuable enough that the company was bought by KACE. |
| Provide independent training | Undercut institutional education wherever you can. |
| Hire on as a content strategist | Build a résumé around social networking and strategies to get companies noticed and their message out. |
| Become an analyst for a market and set of products | Become a customer advocate in defined fields and compare and contrast products specifically for readers. |
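The "Build mashups" row deserves a concrete sketch. Below is a toy Python mashup in the spirit of the earlier farmers-market musing: it joins an invented set of market records to a mapping service by generating map-search URLs. The URL pattern shown is the public Google Maps search format; the market data is made up for illustration.

```python
from urllib.parse import quote_plus

# Invented local dataset; a real mashup might pull this from an open-data feed.
markets = [
    {"name": "Downtown Farmers Market", "city": "Boise", "state": "ID"},
    {"name": "Capitol Market", "city": "Salt Lake City", "state": "UT"},
]

def map_link(market: dict) -> str:
    """Return a map-search URL for one market record."""
    query = f'{market["name"]}, {market["city"]}, {market["state"]}'
    return "https://www.google.com/maps/search/?api=1&query=" + quote_plus(query)

for m in markets:
    print(m["name"], map_link(m))
```

Rendering those links alongside produce schedules or vendor lists is all it takes to turn raw public data into a useful, local product.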
My goal in this posting is to ferret out some of the opportunities for competing in a changing marketplace and point out some vulnerabilities to be exploited. We need to identify our seeds and see how big our beak needs to be to compete. That would be a significant difference between humankind and finches. We can decide what kind of forager or predator we are suited for and how we most want to compete.
I intend to openly discuss some of the problems and requirements associated with being a new type of technical communicator or content strategist and web masher. I plan to address more ideas around the reality of being an independent communicator (regardless of your current situation in or out of a corporation), what that means, and where the market is today. I will start by throwing out these basic ideas and hopefully get some comments. I will be updating this article from time to time as well.
December 11, 2009
· Michael Hiatt · 9 Comments
Posted in: Cloud Computing, Content Strategist, Contextual Data, Information Age, Information management, Knowledge management, Mashups, technical communicator
The times they are a-changin'.
I've been listening to a lot of Bob Dylan lately. My fifteen-year-old son finds him cool. It's a strange revolving world.
Regardless of my motives, Bob's prediction of the '60s reappears 50 years later with the same growing urgency. Socially, economically, and technologically, the symptoms and repercussions are different, but the same grassroots revolution hangs in the air today. The environment of Web 2.0 and cloud computing hastens many changes on the job front, and the advent of the "Great Recession" only accelerates their reality. Society is discontent. Unpopular wars persist. Freedom and change cross everyone's lips like mantras. People are mad and aren't going to take it anymore. I have seen this before.
John Steinbeck said all human endeavor was two steps forward and one step back. It seems like we are about to move off our one step backward and take two big steps forward. And as with all major changes in the never-ending cycle of destruction and rebirth, there will be winners and losers based on those who can and those who canât adapt.
Quick Note: This posting is the second of four articles that explore some of the major changes and opportunities in the ecosystem for the technical communicator. See Evolving as a Content Strategist. Your criticism, reality checks, pushback, and all-around input are appreciated.
Fourteen thousand years ago, the natural dam that held ancient Lake Bonneville in the western United States broke and the lake immediately drained. Almost overnight, an ecosystem evolving for 16,000 years washed through the Snake River to the Columbia River to the Pacific Ocean, catastrophically changing the landscape for all local species in and out of its wake. The freshwater fish poured into the ocean, and the osprey found different prey or moved to new hunting waters, or died. A lake the size of Lake Michigan released a surge of floodwaters that lowered its level by 105 m (350 ft.) on average. All that remained was a puddle with no drainage, now called the Great Salt Lake, and the other low-lying lakes and rivers feeding into it.
Things change. Sometimes dramatically.
For the alluvial fans of igneous rock damming the waters at Red Rock Pass, seepage had been wearing away the mountain notch for years, and the rising level of the lake consistently put pressure on an already existing weakness. The process of wearing away the north wall of Lake Bonneville was gradual, but its effects were dramatic when it failed: 33 million cubic feet of water per second burst out at 110 km per hour (70 mph) into the unsuspecting valley below. The landscape changed forever in the blink of a geological eye.
The Information Age is here. For writers and content providers, imperceptible changes have been occurring for the past twenty years. But soon the dam will break and the floodwaters will inundate. So grab your surfboard or learn to breathe underwater. In some way, all of us will be affected, and we need to learn to ride the wave.
Factors of Impending Change
Things may happen slowly beneath the surface, but their consequences once actuated are immediate and cataclysmic. As software and information developers, we all feel the ongoing strain of impending change in our world. But that change is not relenting. It is only accelerating as the destruction/creation cycle begins anew. Here are some of the factors that come to mind causing these changes:
- The flattening world. The world is getting closer and flatter, meaning that immediate and interactive communication brings together interested audiences and employment resources irrespective of location. Goods and services are electronic, so vendors, corporations, and personnel can reside anywhere in the world. Companies can outsource from the U.S. to Estonia, from Estonia to India, from India to China, from China to the Philippines—wherever the job costs are lower. But it also means that the individual can compete directly with a corporation on a level playing field. Stature between the individual subject expert and the corporate messaging machine is also becoming level.
- The "Great Recession." Or so it has been termed. The slowing global economy put all types of content providers out of jobs. During this crisis, executives focused on cutting costs and bolstering the end phase of the sales cycle to wring more revenue without spending more money. It was a short-term solution that put more pressure on writers, web developers, and training personnel seen as expendable. These highly skilled, creative people are now roving the new world looking for new opportunities and challenges. And the pen really is a mighty weapon.
- Social groups. Facebook, Google Wave, Twitter, wikis, forums, and all the other resources of Web 2.0 technologies and social-oriented communication continue to change the way we consume and disseminate information. Customers can no longer be easily lured to your website. You need to go to them on Facebook or elsewhere where they congregate. Or use Facebook Connect and Google FriendConnect to continue with a web presence while integrating with these social sites to get the best of both worlds. Customers look for germane and honest content delivered directly to them based on their personal needs.
- Content is king. The web infrastructure is built and now real content is needed. Entertainment and information stand preeminent to the delivery tools. Those who have the creativity, perspective, and writing skills can now take their rightful place with programmers and technicians as a thread in the wide tapestry of economic opportunities.
- Cloud Computing. Mashups, the Semantic Web, lifestreaming, the real-time web, and other emerging methodologies and technologies provide new ways of delivering and consuming content. Online books read on Kindles, multimedia novels, multisource notebooks saved to the cloud using Evernote and accessed from your iPhone—all provide a new wave of capturing content and communicating with stakeholders. Cloud computing technologies will provide a way to bring content directly to you and your readers.
Service-oriented architecture (SOA) and mashups hold much promise for corporate development for business needs in the future, as does Wikipedia for the teacher, online real-time practice groups for the musician, timely information for sales people, and online multimedia travel maps for the retiree to share real-time with the grandkids. What do they have in common? All of the services have APIs on the web now and can be implemented with the correct know-how using simple HTML, semantic markup, and mashups using scripting languages.
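To make the service-composition idea above tangible, here is a hedged Python sketch of a mashup function that merges the JSON responses of two hypothetical web services into one piece of contextual content. The service URLs and payloads are invented; the fetcher is passed in as a parameter so the composition logic can be exercised without live network calls (a real version would pass something built on urllib).

```python
import json

def compose_trip_note(fetch, city: str) -> str:
    """Merge a (hypothetical) weather service and events service into one note."""
    weather = json.loads(fetch(f"https://weather.example.com/{city}"))
    events = json.loads(fetch(f"https://events.example.com/{city}"))
    names = ", ".join(e["name"] for e in events["events"])
    return f'{city}: {weather["forecast"]}; events: {names}'

# Fake fetcher standing in for a real HTTP call; returns canned payloads.
def fake_fetch(url: str) -> str:
    if "weather" in url:
        return json.dumps({"forecast": "sunny"})
    return json.dumps({"events": [{"name": "Market Day"}, {"name": "Concert"}]})

print(compose_trip_note(fake_fetch, "Boise"))
```

The design point is that the value lives in the composition: each service alone is a commodity, while the merged, contextual note is the product.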
- Information turnover. For some disciplines and technologies, 25 percent of what you know this year will be obsolete next year. Make your plans accordingly.
Technical communicators face changes in the near future. Exciting ones if we ride this wave well and tack our course correctly. Catastrophic ones if we panic and try to hold on to past practices. As always, the impending changes will have good and bad sides.
The World is Flat—Laterally and Vertically
All economic waters run together, and the world's cultures and markets are more equally sized. This is commonly referred to as the "flattening of the world."
- Corporations take advantage of globalization and instant electronic communication to outsource or employ independent contractors at lowered costs. Getting costs down is just good business.
- Global communities, the blogosphere, aggregators, search engines, social groups, forums, wikis, knowledgebases, websites, and free online information seemingly handle all communication needs, casting newspapers and technical communicators as yesterday's word processors. Again, using the latest trends and technologies to improve processes, lower costs, and market globally is just good business.
- Increased training costs and professional service engagements are likewise just good business. But raising the prices of new services and cutting off free ones halts when everyone hunkers down for a global recession.
There are lots of good outcomes for business that can be disruptive to traditional technical communicators, web developers, and programmers. The flattening of the world means location is moot, companies are global, and products and services are virtual. We compete on an open field.
On the flip side, business opens itself to competition on this flat field. Communication is democratic. Customers prize objectivity and can smell out a business scam. Customers see past corporate attempts to saddle them with professional service costs and training courses as add-on sales. Technical communicators have an opportunity to take market share from the large corporation.
Until recently, freedom of expression belonged only to those who owned a printing press. Now we can all publish somewhere. The environment and rules have changed for all organizations and individuals.
Business has other vulnerabilities. Sales forces traditionally focus only on the big sales and leave small and medium businesses to fend for themselves. And while many free services and tools exist on the web that require only a knowledge of integration and best practices, companies still see information management as their exclusive province. It may be time for a predatory "trimming of the herd" in the marketplace to keep companies lean.
It's Alright Ma
The pressures of wholesale change have been pushing on the walls of the dam for many years now. Corporations put too much value on in-house marketing and proprietary platforms. Writers delivered only basic reference content with little value. Training teams required customers to travel to costly education sites while lowering quality. Corporations forced customers to buy overpriced professional services contracts. Technology received too much praise for delivering too little actual cost savings and productivity.
Weâve seen it all before. The outpouring of discontent brings on radical change. But we can embrace both the horror and hope to make some plans for our future. If anyone can acclimate to a changing ecosystem, it is the technical communicator: the most communicative and adaptable creators of new information and products.
Thatâs inspiring and comforting. And exciting. It calls for another Bob quote:
He not busy being born is busy dying
There's no lying back in the heated swimming pool and taking it easy nowadays. You are either creating something new or lagging behind professionally. Innovation stands supreme. But for those of us who are used to producing new content and meeting deliveries, we will be alright.
That can do what’s never been done
That can win what’s never been won
Meantime life outside goes on
All around you.
– from It's Alright, Ma (I'm Only Bleeding).
Eric Schmidt, now the CEO of Google, has been saying for years that we are each our own company. He claims the market is going on all around us. I believe him. I believe now is the time to brand ourselves, employ our expertise or become experts on something compelling, communicate that expertise, sell our ideas, and open new markets in this world of infinite communication and open commercial possibilities, especially for those of us who can communicate.
I acknowledge that the climate out there for information developers is unsettled and many questions are pending. But let's take a look at what we do know and try to forecast some answers together in this changing environment.
Finding new feeding grounds
Writers need to take advantage of this golden age of information development and delivery. The Web provides a smorgasbord of research and writing opportunities begging for someone to give the disparate data structure and give readers meaningful knowledge. We are the technical communicators: the creators and conveyors. We're not the clean-up crew.
As we are told, globalization flattens the world. Instantaneous communication allows extended teams of experts to work in virtual teams worldwide without regard to distance. It can improve our lives dramatically and allow us to work together for our mutual benefit. Companies can employ the best workforce based on expertise and production to provide value to their bottom line while addressing society's needs. These are the best of times for technical communicators.
The next postings will keep questioning many new assertions and publish some ideas from myself and others. It is also time to put the rantings of my latest posting to the test with a real-life project. So stay tuned for "feet to the fire" time.
I for one plan to take advantage of the opportunities of future trends and put down the expectations of the past.
It's time for Bob one more time:
I got a head full of ideas
That are drivin’ me insane.
It’s a shame the way she makes me scrub the floor.
I ain’t gonna work on Maggie’s farm no more.
December 7, 2009
· Michael Hiatt · 2 Comments
Tags: Content Strategist, democratic mercantilism, Information Age, technical communicator · Posted in: Content Strategist, Information, Information Age, Information management, Knowledge management, Real-Time Web, Semantic Web
The Age of Information is here. The economic, political, and global landscape is changing quickly. As web programmers and information developers (or those with shared skills), we search for a way to survive and prosper. We know that we need to recreate ourselves to be self-sustaining, innovative, and adaptable. We fight fears of corporate downsizing and of fewer resources shared by expanding, cheaper labor markets. We want to know how to fight back.
Social groups in Web 2.0 are pervasive and promising for the creative information developer, but how does one make money? Open-source programming taunts the Web developer with its promise of free services and open functionality. Mashup applications, the real-time web, the semantic web, and other new cloud computing strategies abound. Our skills are needed somewhere, but how do we fit in? And how long will we stay with the staid, proprietary domains and skill set demanded by our current employer? Thinning out seemingly superfluous content providers is in fashion for everyone from CNN to local newspapers to IBM.
It's time to make a stand. As content providers we need to evolve from simple programmers and writers into personalized content aggregation providers, focused on delivering usable content to the high-traffic sites where customers virtually congregate in the 21st century. We need to take control of our expertise and domain of knowledge in independent careers as content and communication experts. We are not scribes, word processors, or one-way blog posters. We are the creators and broadcasters of new ideas and the independent genius behind new, innovative, interactive products. We are the Content Strategists. And this is our time.
Quick Note: This posting is the first of four articles that explore some of the major changes and opportunities in the ecosystem for the content strategist. Your criticism, reality checks, pushback, and all-around input are appreciated.
A new breed of communicator
What is a Content Strategist? And why do you lump writers and programmers together with a shared job and a common fate?
As to the first question, let me give a brief introduction from my own experience and beliefs.
My definition of a Content Strategist: The designer and arbiter of valuable content and disseminator of knowledge for a corporation or online audience with shared interests. One who expedites, filters, and manages access to all types of high-value information from personal, internal, and web resources to give information context as usable knowledge for selected audiences.
I also like the overview of a real-life Content Strategist, Rachel Lovinger: "…to use words and data to create unambiguous content that supports meaningful, interactive experiences. We have to be experts in all aspects of communication in order to do this effectively." Kristina Halvorson adds to Rachel's posting with additional definitions and comments. But while both do a great job of describing the Content Strategist from a writer's perspective, they do not go far enough in identifying the emerging role of the web programmer joined with the traditional writer. Both need to deliver germane data and integrated online functionality for content and knowledge delivery.
Web programmer and technical writer share same feeding grounds
Both jobs require taking advantage of open cloud resources to provide aggregated functionality and integrated, contextual content from the web. Web programmers are now able to mash together different data sets and web functionality to form new online applications from the cloud. Think Housingmaps.com.
Likewise, writers can embrace linked data methodologies and semantic web markup to deliver real-time content to meet the ever-changing needs of their readers. Think how you could re-use open-source content if topics were stored in dynamic repositories as is being done at dbpedia.org and linkedgeodata.org.
Together, the web programmer and technical writer will need to bring together disparate pieces from the cloud to reach their audiences and form new products. Writers and web programmers will be sharing the same feeding grounds in the near future.
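The Housingmaps-style combination of feeds can be sketched in a few lines of Python. Everything here is invented for illustration: the listing and geocode data are inlined stand-ins for responses from two hypothetical web APIs, but the join-by-shared-key pattern is the essence of a mashup.

```python
# Sketch of a Housingmaps-style mashup: join listings from one
# (hypothetical) classifieds feed with coordinates from another
# (hypothetical) mapping feed, keyed by street address.

listings = [  # imagine this came from a classifieds API
    {"address": "12 Elm St", "price": 1200},
    {"address": "98 Oak Ave", "price": 950},
]

geocodes = {  # imagine this came from a geocoding API
    "12 Elm St": (40.71, -74.00),
    "98 Oak Ave": (40.73, -73.99),
}

def mash(listings, geocodes):
    """Merge the two feeds into map-ready records, skipping any
    listing the geocoder does not know about."""
    merged = []
    for item in listings:
        coords = geocodes.get(item["address"])
        if coords:
            merged.append({**item, "lat": coords[0], "lon": coords[1]})
    return merged

for record in mash(listings, geocodes):
    print(record)
```

In a real mashup the two dictionaries would be fetched from open APIs; the value the programmer adds is exactly this joining and filtering step.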
Why do I group web programmers with information developers?
Basically, both disciplines are moving together at a quickening pace as providers and strategists of personalized knowledge. The advent of Active Server Pages (ASP) and JavaServer Pages (JSP) in the nineties combined coding with information on the same Web page. The user interface and textual content became one. This continues to happen.
Proprietary development is fading away. Current cloud computing initiatives and social networking challenges continue to bring these disciplines and professionals even closer together. Both need to adopt open platforms and furnish users, companies, and Web audiences with dynamic data and collaborative knowledge.
Both web and information developers are looking for better human interaction and automation of services brought directly to Facebook (using Facebook Connect), Google Docs (using Google Friend Connect), or other integrated social networking in the cloud, where the customer/user/reader resides and calls the shots. Both disciplines should realize that customers don't want a proprietary application or a pre-built user interface. Customers of today and tomorrow want personalized functionality and real-time knowledge delivered to their doorstep in a way that fits their unique needs. The times they are a-changin', and no one is coming to your website. You need to go to them.
But most importantly, I believe that both the web developer and the information developer share a common fate of planned obsolescence in the near future. Both types of professionals need to adopt the belief that they are not in the "coding" business or the "writing" business, but in the business of aggregating and directing knowledge to users where they live.
I am not haphazardly lumping these two disciplines together. It is moving technology and the marketplace that are pushing these jobs together. However, that doesn't mean that most established companies are catching on quickly.
Then why do companies continue to hire only traditional web programmers and writers?
Touché. Good point. For the most part, traditional programming and technical writing jobs are all you see advertised. The idea of a corporate Content Strategist is far down the executive's range of vision and priorities. Here are a few reasons why.
New ideas don't survive in Big Business. Are we really going to count on large corporations to break from their cronyism, bureaucratic decision-making, and stale attempts at forced innovation? Before you answer, think Enron, Tyco, and Global Crossing. Then think Lehman Bros., Bear Stearns, and AIG. Isn't it time we take the reins away from these supposed experts and leapfrog to the more democratic mercantilism of the Information Age? Then think Novell, Sun, and Symantec. Do you really think they will change before they all end up in the scrapheap of yesterday's technology and the executives jump off with their golden parachutes? I don't think so. The small, nimble businesses that supplant these behemoths will prevail long after the slow-moving giants fall into the tar pits.
Lowered quality of knowledge. Let's be completely honest. Most companies currently advertising for experienced writers and high-level programmers are either looking for cut-rate entry-level hires or moving jobs to low-cost labor markets. Or maybe they are still giving jobs to a brother-in-law who once took English 101. All content professionals, from the web designer to the information designer to the SQL scripter, notice that companies are taking less responsibility for delivering useful information. Companies are repositioning as simple vendors: call our consultants if you want real-world knowledge, but it's going to cost you.
Think of Home Depot, a big-box home repair retailer in the U.S. It hires young and inexperienced workers to stock shelves and ask, "Can I help you?" But these workers have never actually replaced a drain trap or fixed a missing roof tile on their own. How can they have any idea of my needs? The sage, experienced owner of the small hardware store has been replaced with low-cost products and a "you're on your own" attitude. Who is going to stand in for the experienced craftsman and craftswoman? That would be you and me.
Corporations actually do want to hire a content strategist. They just don't know it. Executives are not deaf to the possibilities of expanding social networks and free cloud computing promises. They know that an offhand comment from a blogger or forum participant carries more weight than their whole marketing department. Companies want to "game" the system to get free marketing and product interaction on social networks but still control the messaging. They think customers can open wikis or visit forums and write their own documentation and APIs for each other, thereby reducing the need for paid content providers. But this is an unsustainable model. Eventually the reader will demand honest information, and the author will need to comply in order to build a loyal "downline" of users and readers.
That means giving up proprietary content and priming the discussions with real-world knowledge before expecting the reader to offer up their own hard-won knowledge. It has to be a quid pro quo arrangement among the author, the reader, and the social group. A one-for-all, all-for-one attitude is the only way to keep people involved voluntarily. It is a symbiotic relationship: the rainforest of knowledge.
And somewhere along the line the executive will question the costs and overlapping effort of shotgun communication that occurs each time a product or service is released. But here is what the executive will hear: Complaints about high training costs for products undermining sales from the sales field; researchers and engineers spending all their time publishing or hoping for published materials across disciplines; and products deemed too abstruse and complicated. How the left-brained overachieving executive will respond will most likely follow this pattern: Outsource it > let R&D open a new secluded wiki > hire my brother-in-law with an anthropology degree.
Regardless of the name applied to the content strategist, it is a position that every organization needs. Whether in entertainment, science, software, healthcare, or engineering, all left-brain organizations need a right-brain Content Strategist to impart information, make connections, and communicate value and services.
A changing environment
All trends seem to encroach on the traditional territory of the software coder and technical writer. Corporations continue to play the game of short-sighted decisions based on short-term returns, leaving software and information developers at the mercy of downsizing plans and defensive about their roles, value, and jobs.
But on the other hand, opportunities abound in the new world of functional mashups and real-time web computing using open APIs and knowledge integration through Linked Data and personalized delivery. Interest in open cloud computing ascends as the world gets flatter and the Web provides accessible open source opportunities, allowing the individual to offer new types of collaborative Web services and information sources. This is especially true for the innovative and creative among us.
It's time to get back on offense. We need to change from the techno-toadies of the past, at the mercy of bean counters and greedy executives, and evolve into our rightful positions as independent Content Strategists. Freedom of the press used to be only for those who owned a press. Now a baseline web infrastructure is built and the tools are assembled to let each of us publish content quickly and easily. We can now compete with larger organizations based on value to the reader, delivery of quality knowledge, and nimble processes.
A Company of One
You don't make the rules. You can only exploit them. You need to take advantage of the opportunities and the changing landscape. That means you need to brand yourself and become an expert for hire. I like the ideas presented at New Grad Life for new graduates who need to brand themselves, but I think the idea needs to be taken beyond a job search to a new environment where each professional sees themselves as their own business, whether paid by a weekly paycheck or through a personal LLC using invoice payments. Each professional serves in their own company, a Company of One, regardless of how they are paid.
Trust me. The company employing you will only stick with you until the money runs out, or until they find a lower wage earner. It is the law of business to always lower costs. They will let you go when it serves their purpose, not when it serves yours. In turn, you need to look out for yourself. Content providers traditionally attached to software, engineering, medical, or scientific teams living symbiotic existences can now be liberated to prosper on their own. We all share the same fate of obsolescence if proprietary, in-house information development continues down this road.
The only way to survive is to be a threat to larger companies, an ally of them, or both. You can compete in the same feeding waters by finding your own audience, or be the pilot fish attached to the larger predator. Both are sustainable livings. But regardless of the strategy, you need your own business path, expertise, and marketable skills.
Your current company won't last forever, and handing out resumes and begging for a job will give way to a new environment where working relationships are established between companies. To that end, you need to start branding yourself, become a domain expert, and create your own self-supporting company.
Turn into a Company of One. Be an amalgam of a small businessperson, creative designer, accountant, publisher, and salesperson for Me, Inc. It is time to move into your rightful role as a Content Strategist using your multiple communication skills and vast experience as an information developer so needed in the marketplace of ideas and talent.
Brave New World of content strategists
Many questions need to be answered, but here is what I know about this changing world:
- Knowledge is king of the jungle. Most of the writing tools and web infrastructure are already built. That's not to say there won't be improvements and new technologies, but the need for real content and usable knowledge is on the upswing. We have 900 channels on our cable TV, but still need entertainment beyond reruns of Star Trek and Gilligan's Island (I hope these old reruns are only being seen on U.S. TV and Hulu.com). Content is resuming its rightful position as king in driving traffic and attracting customers. Now we just have to make it pay.
- Content providers are moving up the food chain. The focus on technical capability to deliver information in new and unique ways is giving way to a real need for usable content. But who will provide this information in a way that leads to real knowledge? Certainly not the technocrat of yesterday.
- Join a Company of One and work with tribal communities. Writing, publishing, researching, and collaborating as writers, web developers, editors, and subject matter experts hold many possibilities, but also many questions. Experts have been saying for years that we are moving to a project-based, independent-contractor employment model. I personally don't want to wait until the rivers dry up before I migrate.
As information and web developers evolving to Content Strategists in this brave new world, we have great opportunities and all the skills necessary to adapt. So do I dare say it? Yes I do:
BRING IT ON!
Linked Data and Mashups rely on open standards to integrate information and furnish collaborative features, but differ on their uses and the intricacies of combining disparate data and features to build new functionality from cloud resources. Both provide emerging standards to provide content and shared applications for readers/users from existing Web resources coupled with proprietary enterprise data. Each promises increased productivity by combining and efficiently repurposing content and features for focused tasks and specific audiences.
Linked Data employs basic identification of content to loosely mesh and deliver shared semantic content of raw, personalized data from disparate sources. Likewise, Mashups employ semantic tagging but require logic through coding or high-level scripting using open APIs to provide functionality from disparate cloud services melded with proprietary data to render a new knowledge set or services. While both hold great promise for Web 2.0 moving to Web 3.0 strategies, each has significant differences in implementation and usage. Both technologies remain in their infancy to date but they are growing up fast.
This posting attempts to confront the similarities and differences of these technologies and their emerging standards while providing use cases and current examples. My intent in this and following postings is to highlight the promise of each technology and help developers and information developers catch a glimpse of a possible new frontier before they make catastrophic decisions using yesterday's technology. I don't want anyone to pour concrete now on the plot they plan to garden next spring.
Quick Note: This posting comes from comments presented by Kingsley Idehen, President and CEO of OpenLink Software and my religious reading of Dion Hinchcliffe and Jeff Hanson. All commented on my About Mashups and Linked Data posting.
Linked Data and Mashups: Comparison and Contrasts
Linked Data and Mashups see the Web as a collection of objects of raw data and open functions rather than formatted documents residing on websites or proprietary applications. Both employ open Web standards to consolidate content and coordinate tasks using open protocols. Both promise quick and easy development for enterprise and commercial products and publications without in-depth development costs or expensive, proprietary tools. And both utilize structured markup as metadata to identify information based on its semantic meaning. Essentially, Linked Data and Mashups expose, share, and connect data to render comprehensive knowledge and actuate collaborative, complementary program features from the Web cloud.
Linked Data employs simple HTTP protocols to connect exposed data stores and stream unedited content quickly to interested readers before it is published as contextual knowledge. As a subset of the Semantic Web, Linked Data relies on semantic markup to define the meaning of content and then employs dereferenceable URIs to locate and deliver Web content at URL addresses. It is Tim Berners-Lee's vision of the Web as a universal data, information, and knowledge exchange.
The premise of Linked Data is this: the more connected data you have, the more powerful the information and the more accessible the knowledge. It is the meshing of information without relying on published documents or formal web sites. Its promise is to deliver personalized content to each individual based on his or her unique needs. As Kingsley Idehen playfully puts it, "Linked Data basically delivers the ability to Mesh disparate data sources rather than settling for brute-force data mashing as exemplified by Mashups."
Both emerging methodologies provide a glimpse of the future of programming and information development. As Dion Hinchcliffe writes, "Web 2.0 [provides] alternatives at a fraction of the cost of their enterprise-class predecessors, even if they don't have exactly the same functionality. Enterprise 2.0 solutions, depending on their feature set, are becoming candidates to replace existing document management and knowledge management systems, enterprise portals, and even larger enterprise suites such as CRM systems."
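The "meshing" idea can be illustrated with a toy example. The triples and URIs below are hypothetical stand-ins for data exposed by sources like dbpedia.org and linkedgeodata.org; real Linked Data would use RDF tooling and dereferenceable URIs, but the core move, grouping statements from disparate sources by a shared subject URI, is the same.

```python
# RDF-style triples (subject URI, predicate, object) from two
# hypothetical sources, combined by shared subject URI. No
# documents or web sites are published; the data itself is meshed.

dbpedia_style = [
    ("http://example.org/city/Berlin", "population", 3600000),
    ("http://example.org/city/Berlin", "country", "Germany"),
]

geodata_style = [
    ("http://example.org/city/Berlin", "lat", 52.52),
    ("http://example.org/city/Berlin", "lon", 13.40),
]

def mesh(*sources):
    """Group all triples by subject URI into one record per resource."""
    records = {}
    for source in sources:
        for subject, predicate, obj in source:
            records.setdefault(subject, {})[predicate] = obj
    return records

berlin = mesh(dbpedia_style, geodata_style)["http://example.org/city/Berlin"]
print(berlin)
```

Because every source addresses the same resource by the same URI, new sources can be meshed in without any brute-force matching logic.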
Problems with Employing Linked Data and Mashups
There are problems here that need to be confronted by proponents of both technologies:
- Data not ready to be shared. Much of the data in the world is not structured to be accessed and shared. Governments, enterprises, and social groups are not working together to provide access to their data. You can scrape data from Facebook, but not MySpace, or Twitter, or Flickr, or YouTube. Each is its own silo. Tim Berners-Lee calls this "database hugging." And let's not even explore the massive data tied up in corporations and governments.
Unique URIs and a hierarchical ontology for each domain are needed to expose linked data. Some of this work is being done at dbpedia.org and linkedgeodata.org and elsewhere, but much more work remains to create and import ontologies and define standards.
- Some database hugging is reasonable. Legitimate reasons for holding on to proprietary data exist. Medical and financial information needs to be private. Internet service providers better not be handing out my personal information. And companies have good reason for keeping information private, especially when it would be equally valuable to their competitors and less valuable to the public. Sometimes keeping information far from the Web is the responsible and legally-wise policy.
- Throw away the database queries and applications of the last 20 years? Are we really supposed to discard decades of database research designed to store, index, and query datasets in favor of open, accessible text files over the web? Instead, SQL queries against legacy databases need to be combined with common semantic tagging.
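One way to combine legacy SQL with semantic tagging, sketched here under the assumption of a simple SQLite table and an invented URI scheme, is to keep the database as-is and expose each row as subject-predicate-object triples:

```python
# Sketch: keep the legacy SQL database, but project its rows into
# RDF-style triples so semantic tooling can link them. The table,
# data, and base URI are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
conn.execute("INSERT INTO product VALUES (1, 'Widget', 9.99)")

def rows_as_triples(conn, table, base_uri):
    """Turn each row into (subject URI, column name, value) triples,
    using the first column as the row's identifier."""
    cursor = conn.execute(f"SELECT * FROM {table}")  # table name is trusted here
    columns = [c[0] for c in cursor.description]
    triples = []
    for row in cursor:
        subject = f"{base_uri}/{table}/{row[0]}"
        for col, value in zip(columns[1:], row[1:]):
            triples.append((subject, col, value))
    return triples

triples = rows_as_triples(conn, "product", "http://example.org")
print(triples)
```

Nothing about the existing SQL workflow changes; the triple view is an additional, shareable projection of the same data.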
I acknowledge that there is no reason to rush headlong into the open display of information and application functionality at this time, but there are scenarios where it makes sense now.
Immediate Uses of Linked Data and Mashups
While much has to be worked out in adopting Linked Data and Mashups, now is the time to determine if these technologies and methodologies are ripe to exploit in some fundamental way right now:
- Scientific research needs Linked Data now. I was invited to apply for a position as a writer for a leading-edge nanotechnology and DNA research company. They wanted someone to handle content shared between scientists both internally and externally. My input as an experienced information developer was to establish an ontology of common semantics to automate content markup, facilitate the interaction between research teams, and provide real-time data. They didn't need to throw this data over the wall to a writer to publish on a web site or through formal papers. They are getting back to me.
Example: http://bio2rdf.org/ – a semantic web atlas of post-genomic knowledge.
- Shotgun communication. Companies wrestle with all of their diverse content when releasing products and services from different R&D, technical support, marketing, and writing teams working in individual silos. All of this overlapping information needs to be brought together to identify cost savings and get the right information in front of the various types of customers (prospective customers, customers needing best practices, upsell customers, internal employees, et al.).
Example: See my posting on this issue and an example of shotgun communication.
- Low-hanging opportunities. Some mashup applications are easier to implement than others. These high-value, low-cost implementations need to be taken to the market to build on for more complex and productive products.
Also refer to Enterprise Web 2.0 postings from Dion Hinchcliffe for a more authoritative validation and some additional best practices.
My Direction in using Linked Data and Mashups
In my next posting, I will define a project and take it to its logical conclusion while sharing my travails and successes. I will provide a product requirements document to identify my intent and define success. I will then share my findings. I plan to test out my previous postings about linked data and mashups, the myth of single-source authoring and publication, shotgun communication, and employing the confluence of content on the web, as well as the experience and expertise of real subject experts. I am going to hold my own feet to the fire.
November 24, 2009
· Michael Hiatt · 2 Comments
Tags: Linked data, Mashups, Semantic Web, single-source authoring · Posted in: Cloud Computing, Information management, Knowledge management, Linked data, Mashups, Ontologies, Semantic Web
Single-source publishing is a zombie idea that revives itself periodically and refuses to stay dead. Its zombie supporters chant its purported benefits as a "write once, publish to many" promise and ploddingly follow it as their ultimate goal for mechanized authoring and machine translation. As an object-oriented writing methodology, it is as human as present-day robot technology: good only for conveyor-belt assembly or specialized tasks, and always very expensive to implement. Single-source publishing lacks purpose in today's world of information turnover and the dynamic nature of the Web 2.0 moving to Web 3.0 landscape.
But hope survives to finally bury this living-dead entity once and for all. And who will be our emerging heroes to fill the promise of content reuse and localization savings? Knowledge mashups and applications using cloud-based linked data and the emergence of the semantic Web.
Quick Note: This posting starts a periodic thread on authoring content and the collaboration of knowledge providers for the individual as well as the enterprise. In my mind, concepts about integration of content to form ad hoc knowledge and the collaboration of services to form unique applications are interchangeable when talking about mashups, cloud computing, and linked data. All come from the same fountainhead of processes and the same philosophy with the same benefits.
Not Working as Promised…Again
After its death knell in the early '90s with SGML markup and the DocBook DTD, single-source authoring rose again with XML and DITA in the late nineties and early 21st century. IBM liked the idea early on but, with the advent of Web publishing and the quickening pace of information management, gave it up. Novell and other large corporations adopted it as a multi-platform solution to render content in different formats, relying on syntactic processing rather than semantic markup. But the overhead of highly structured writing and the evolution of multiple publishing formats made the practice obsolete. Still, some companies hang on to their archaic ideas of single-source authoring and, like the plot of the movie Weekend at Bernie's, continue to prop up this dead thing as a real-life entity just to keep the party going (and to justify their now-entrenched systems and management decisions). But like today's updated version of zombies, single-source authoring will never be more than a corpulent, well-dressed stiff.
Taking on the Purported Benefits of Single-Sourcing
Single-source publishing promises that the same content can be employed in different documents or in various formats to reduce writing costs in publishing and localization. Proponents claim that the expensive, labor-intensive work of setting metadata and reusing topics across different documents and formats can be accomplished mechanically by automated tools to save time and money. I want to dispel some of these arguments:
Publishing to Multiple Formats. The reasons for highly-structured content to publish as online help, PDF, printed materials, and Web content died in the nineties. These days, any content can be saved to all of these formats using the authoring tools in the marketplace today.
Reusing Topics. In theory, the ability to re-use content from a library of already written, edited, and translated topics seems to save time and leads to cost savings. Itâs like the difference between procedural, top-down coding and the now ubiquitous use of object-oriented programming. Sounds good as a theory, but writing and coding are far different endeavors.
In practice, single-source authoring rarely works. While code classes are organized and accessed for specific needs and can be extended from the root classes for specific needs, they always rely on same base class functionality. Rules for each programming language are objective and reliant on a compiler to translate using exacting syntax into program operations. Conveying information is subjective from the context of the writer to the context of the reader. One needs to meet finite compiler rules, and the other needs to move information from the synaptic interconnects in one personâs brain to the synaptic interconnects of anotherâs. Writing relies on the de facto connotations of language as it evolves organically in a society, while programming languages are de jure, unbending rules are set by a software vendor or open-source committee.
Trusting Others. Single-sourcing within a company requires one writer to generate a topic to be used by another. That's not a problem for object-oriented programmers; it's actually an effective process and the status quo for programmers today. After all, each coding language includes a library of classes to be implemented or extended rather than reinventing each procedure. In my experience, however, information developers seldom reuse another writer's topic unless it is a basic glossary entry link. The needs of each communicator are so vast and different in imparting knowledge in an e-book, guide, or document as to make reuse not worth the time or effort.
And then there is my empirical knowledge of content reuse. As a manager of a technical writing team engaged in single-sourcing methods, my experience shows that a writer seldom grabs a topic wholesale and places it into his or her document. Topics rarely meet all the needs of the author and usually throw off the context and purpose of the document. At best, some parts of a document (a paragraph or two) can be referenced and reused as content references (the conref feature in DITA, for example). But then, cut and paste proves effective here too.
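For readers unfamiliar with conref, this kind of paragraph-level reuse looks roughly like the following in DITA markup; the file name and ids below are invented for illustration:

```xml
<!-- Hypothetical example: pulling a shared paragraph into a topic
     via DITA's conref attribute. "shared-notes.dita" must contain a
     topic with id="shared" holding a <p> with id="safety-note". -->
<topic id="install">
  <title>Installing</title>
  <body>
    <!-- Resolved at build time to the referenced paragraph's content -->
    <p conref="shared-notes.dita#shared/safety-note"/>
  </body>
</topic>
```

This is the machinery that, in my experience, gets used for a paragraph or two at most, which is why plain cut and paste often ends up just as effective.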
I can see where a single writer or a closely knit group of writers (two or three at most) can collaborate seamlessly at a workable level of lockstep writing. But for most organizations, the planning and practice of content reuse are rarely successful beyond the publication of a cookbook or other basic reference materials.
Localizing Documents. The translation of content through individual, reusable topics presents a chance for information developers to demonstrate a real cost savings of single-sourcing for an organization. The argument goes that because translation costs so much, reuse delivers savings that can be shown on the ledger sheet.
I profoundly doubt this argument. Writing topic-based content requires dumbed-down, standardized information to fit the assembly-line process of single-source authoring. Topics need to follow a formula of concept, task, and reference topics strung together and watered down to meet the lowest common denominator of all translated languages. This means the author must omit the richness of each dialect's language. I understand that diluted language reflects the nature of writing for localization, but single-source strategies only add to banal explanations and reference content not really needed by users. Customers need context and real-world knowledge.
Many proponents of topic-based writing point to the savings of using it with automated machine translation (MT) rather than human translation, which is probably a highly contested argument across localization companies. That debate is not something I want to get into here. I have even heard a high-ranking manager forecast that one day machines will do all the writing as well. Machine writing to machine translation to human readers. Good luck with that.
For now, setting up machine translation with topic-based writing requires a large investment that rarely, if ever, realizes cost savings. To become a believer in the merits of single-source authoring, I would need to see the total costs of staffing a localization team to painstakingly set up MT, and then get a forensic accountant to study the time and effort spent adding metadata to each document and compiling its various components into a readable document. And then I would have to see a customer satisfaction study on how the loss of quality affects sales.
Authoring In-house. Practices in writing and reading content are changing rapidly; just ask Rupert Murdoch or any newspaper publisher. Aggregators, bloggers, and social networks stand as the future for imparting much of the information we will consume. In-house authoring now competes with bloggers who are experienced subject matter experts, and with the group editing inherent in social network postings critiqued by multiple readers. For the sole writer confronting the horror of the blank page, it's hard to compete with so much experience and intellect. Instead, social writing practices should be embraced and fostered.
In addition, the logistics of single-source authoring, which requires all writers to use a common database and authoring tools regardless of location, cause many performance and security problems. Add to those logistical problems the advent of online translation through Google Translate and other services, and the argument for single-sourcing and proprietary machine translation practices seems weak.
Single-Sourcing After Publishing
For information developers working with product, service, or development teams, the goal is to describe the features of the product or service as presented to them by internal experts. Consequently, they produce a feature-by-feature description of the product from the inside out. You want to travel on vacation? Well, first let me give you an encyclopedia of the features of the combustion engine. I may get around to a travel guide and maps later. In-house authors write from the perspective of features developed by the R&D, marketing, or product support teams rather than from the outside-in best practices and innovative uses needed by the customer. See Shotgun Communication for an in-depth view of corporate information problems and examples.
The main issue for me is the choice between authoring static in-house documents with single-sourcing methods before publishing, and capturing information sources dynamically after publishing from online social networks, linked data sources, and knowledge mashups. The myth of single-source authoring is that it has a life in the future and remains a viable goal for many information developers. With so many mega-trends against it, the belief that static authoring from a single vantage point, by a single author paid by a single organization, is a workable system seems ludicrous. Instead, we should be looking to capture, sequence, and give context to the wealth of rich content already published in context on the Web. Collaborating with the many subject experts, authors, videographers, bloggers, tweeters, and writers coming together on the Web with shared interests will be powerful if it can be harnessed.
In a future posting, I will present my ideas for knowledge mashups and linked data objects that utilize the best of in-house authors to prime key discussions while giving stakeholders the knowledge and impetus they need to perform tasks specific to their unique needs.
November 18, 2009
· Michael Hiatt · 25 Comments
Tags: knowledge mashups, Linked data, single-source authoring, Single-source publications · Posted in: Contextual Data, Information management, Knowledge management, Linked data, Mashups, Single-source publishing