Tool evaluation and prototyping

Until you have experienced a tool or technical framework in action, it is very hard to judge whether it can work for you. Mekon’s direct approach lets you evaluate technology against your real requirements, proving feasibility and revealing any hidden risks before it is too late.

Product demos by vendors are a useful introduction, but they often focus on key selling points rather than on your organization’s needs. When it comes to choosing a technical framework, second-hand, overly conceptual knowledge is no substitute for direct experience. For over a decade, Mekon has helped organizations develop their requirements and then evaluate products against them through hands-on evaluation and detailed proofs of concept. Mekon’s clients benefit from:

  • A real understanding of a proposed approach — how it will change life for content creators, or how your customers will be able to interact with content.
  • Reduced risk through comprehensive user testing and flagging areas of concern at an early stage.
  • Documented evaluations and insightful analyses of the tools under consideration, based on your team’s testing of those tools.

Example services:

Conference Room Prototype

Mekon work with solution vendors to deliver tailored workshop and demonstration sessions, using your own content, so that you can experience first-hand how tools actually work, look and feel in a live environment. Our comprehensive evaluation structure:

  • Ensures the focus is on your organization’s real needs rather than the technical capabilities and flashy new features of the tools.
  • Includes qualitative evaluation of the tools in addition to numerical scores.
  • Identifies potential risks and concerns, and provides a platform for addressing them with vendors systematically.

The key stages of a conference room prototype are developing requirements, preparing sample content, participating in hands-on evaluation workshops, and analyzing the results.

  • Developing user stories. User stories emphasize what people actually need from the tool rather than the specifics of how it should accomplish that, ensuring comparability between tools and keeping the focus on real needs. Where clear system behavior-oriented requirements exist, these are included in the same framework, in a suitable format that is distinct from the user stories. Requirements are prioritized so that scores can be weighted accordingly in the final analysis.
  • Preparing sample content. Content for testing the system needs to be carefully selected to ensure comprehensive coverage of all key structures that will be used by authors, and sufficient representation of content from different product lines or business areas. Where taxonomical or other metadata is not presently used but will be in the future, Mekon can prototype this effectively in the sample content.
  • Hands-on evaluation workshops. Mekon use the requirements framework to guide the evaluation process, ensuring that all concerns and questions are addressed, and qualitative and quantitative feedback is recorded.
  • Analyzing the data. Mekon process the quantitative data and use that together with users’ comments and reflections to provide key insights as to the pros and cons of each tool.
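The weighted-score analysis described in the stages above can be sketched as follows. This is a hypothetical illustration, not Mekon’s actual scoring method: the requirement names, the 1–3 priority weights and the 0–5 workshop scores are all assumptions chosen for the example.

```python
# Illustrative weighted scoring: each requirement carries a priority weight,
# and each tool's per-requirement workshop scores are combined into a single
# weighted score for comparison. All names and scales below are assumptions.

# Requirement priorities on an assumed 1-3 scale (3 = must-have).
weights = {"reuse": 3, "review_workflow": 2, "publishing": 3, "search": 1}

# Workshop scores per tool on an assumed 0-5 scale.
scores = {
    "Tool A": {"reuse": 4, "review_workflow": 3, "publishing": 5, "search": 2},
    "Tool B": {"reuse": 5, "review_workflow": 4, "publishing": 3, "search": 4},
}

def weighted_score(tool_scores, weights):
    """Weighted average: sum(weight * score) / sum(weights)."""
    total = sum(weights[req] * tool_scores[req] for req in weights)
    return total / sum(weights.values())

for tool, tool_scores in scores.items():
    print(f"{tool}: {weighted_score(tool_scores, weights):.2f}")
```

In practice the quantitative ranking is only a starting point: the qualitative comments and reflections recorded in the workshops are used alongside it to explain why a tool scored as it did.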

Dynamic Delivery Prototype

Choosing a dynamic delivery product can be complex. The market is small and specialized, and a demonstration, or even a full workshop, is not enough to establish a product’s capabilities across content distribution, search, SEO, analytics, uptime and user experience.

Supported by the solution vendors, we help customers run meaningful evaluations that allow enough time to examine all facets of a dynamic delivery product. We can develop user stories, advise on data-gathering approaches, conduct usability interviews with users and analyze the resulting feedback.

As Information Architecture specialists we can also make recommendations on changes to the content to help achieve your goals for a dynamic delivery system. Effective content structure, metadata and taxonomy can improve performance, ease of content distribution and user experience. If the basics are not in place, a dynamic delivery solution can fail to deliver on its promise.

I was really impressed by not only how well the team was able to summarize our company perspective, but also on their domain knowledge … the final presentation and the comprehensive work leading up to it left no doubts about our solution moving forward.

(Information Architecture Lead at a major security solutions provider)