Information Management

Accessible PDFs: Questions, Thoughts and Ideas from a Social Network Exchange

First published on G3ict.org

Should accessible PDF documents be a part of a company’s web accessibility strategy? That’s the question that was posted recently in a LinkedIn web accessibility forum. The question inspired a lengthy and lively discussion among accessibility experts from a variety of sectors and roles. What resulted was an informative and multi-faceted conversation that raised several questions, comments and solutions related to accessible PDFs. To read the entire LinkedIn exchange, copy and paste this link into your browser: http://www.linkedin.com/groupItem?view=&gid=41800&type=member&item=266438271&trk=groups_search_item_list-0-b-ttl. For those who just want the quick highlights, I have consolidated a few of the more popular thread themes, questions and ideas that emerged.

PDF Document Accessibility

There was near-unanimous consensus among accessibility experts that all online PDF documents should be made accessible. Within the LinkedIn discussion, a web accessibility consultant commented that since they’re likely available on a company’s public-facing website or customer-facing portals, PDFs should be part of a company’s overall web accessibility strategy. “It’s particularly important if that information isn’t available in another format that’s accessible,” a Section 508 accessibility and remediation specialist added. Others pointed out that some companies have gotten around creating accessible PDFs by making the same information available in an accessible HTML format instead. Keep in mind that whatever the format, when approaching accessibility for what I call ad-hoc or one-to-many documents, such as marketing collateral, publications, informational documents and reports, the approach is typically a manual one, whether repairing, touching up or creating accessible PDFs. The key here is to author with accessibility in mind.

What about Archived PDFs?
I also saw a strand of comments regarding whether or not archived PDFs should be available in an accessible format. While many of the contributors in the discussion suggested that, ideally, they’d like to see historic PDFs made accessible, most saw the process of converting them as too time-consuming and cost-prohibitive. In my opinion this is likely because the traditional approach to making these documents accessible is a manual tagging and repair process that simply couldn’t be applied to such a large volume of archived PDFs. One researcher in usability and accessibility pointed out that no one would ever look at those documents anyway, while a Section 508 accessibility and remediation specialist stated that converting them would depend on budget, timeliness and importance – otherwise, accessible archived documents could be made available upon request. My opinion here is multi-fold. Firstly, whether or not someone would or could look at an archived document shouldn’t be the basis for deciding whether it is made accessible. If the document is made available, it should be made available to everyone, including those with visual disabilities. Traditionally, there hasn’t been a timely or cost-effective solution that would allow these archived documents to be addressed post-composition, but there is an automated solution on the market now that does just that and produces WCAG 2.0 Level AA compliant PDFs. “Many companies are mandated – either by internal by-laws or external regulations – to store those historical PDF documents,” wrote my colleague from Actuate. For other companies, allowing customers to access historical documents (including statements or past invoices) may be a value-added service. “Whatever the reason for storing might be, if any time in the future that content needs to be accessed then it goes without saying that it should be accessible,” he added.
“That doesn’t mean that organizations have to store the content in an accessible format for the lifespan of the document, but rather can employ a solution to automatically convert documents on-the-fly/on-demand.” This is very exciting, since the solution mentioned here is a patented and fully deployed solution performing this very process for very large financial institutions today. It can be done, and it is being done!

What about PDF/A format for archived documents?

Many PDFs kept for historical purposes are stored in this format, a Section 508 assistant coordinator pointed out. PDF/A formats have a stricter structure that allows them to remain backward and forward compatible, he added. Could PDF/A formatting get in the way when it comes to accessibility? My colleague responded that, no, it didn’t “negate or hinder … capabilities to provide that content in an accessible format” when the content is requested (meaning this format too would work for a solution that applies accessibility tagging or PDF remediation on demand). There are exceptions, of course – not every document can be made accessible post-production, depending on how it’s been authored or formatted – but a large number can be, without the need to re-author them.

Accessible HTML Instead of PDFs?

A marketing communications consultant pointed out that HTML isn’t always appropriate. For example, it is not appropriate in the case of very long documents or for those that will be distributed mainly through print. A Section 508 assistant coordinator added that if it’s the PDF that’s going to be widely distributed, it should still be available in an accessible format. I hear this HTML question posed to me frequently, and agree that in many cases HTML or XML is the best format when the content (code) is designed accessibly. HTML and XML typically pose fewer accessibility issues for assistive technologies like screen readers, particularly for web content.
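An aside for technically minded readers: the tagging that remediation adds lives in the PDF’s /MarkInfo dictionary and structure tree. A very rough way to see whether a PDF even claims to be tagged is to look for a /Marked true entry in the raw bytes. The sketch below is only a heuristic, not a conformance check – the helper name looks_tagged is mine, and real audits should use a proper PDF library or accessibility checker:

```python
def looks_tagged(pdf_bytes: bytes) -> bool:
    """Crude heuristic: does this PDF declare itself as tagged?

    Tagged PDFs carry a /MarkInfo dictionary with /Marked true in the
    document catalog. Compressed object streams can hide it, so False
    here is not proof the file is untagged.
    """
    # Collapse runs of whitespace so "/Marked   true" still matches.
    flat = b" ".join(pdf_bytes.split())
    return b"/MarkInfo" in flat and b"/Marked true" in flat


# Tiny illustrative fragments, not complete PDFs:
tagged_fragment = b"<< /Type /Catalog /MarkInfo << /Marked true >> >>"
untagged_fragment = b"<< /Type /Catalog /Pages 2 0 R >>"
print(looks_tagged(tagged_fragment))    # True
print(looks_tagged(untagged_fragment))  # False
```

A check like this only tells you whether a document claims tagging at all; whether the tags are correct and WCAG-conformant is a much deeper question that the remediation tools discussed above exist to answer.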
But what about the case of high-volume, electronically delivered customer communication documents, like bank statements, telco bills, medical notices, etc.? That is the question I pose, and it is a leap for many to consider. This content is usually presented as PDF and can quickly add up to millions, or even hundreds of millions, of pages per month per organization, and is therefore in a different category of challenges, mostly due to sheer volume. The typical approach to making PDFs accessible – design with accessibility in mind, convert to PDF, then check and touch up the PDF, which is the repair or remediation process – simply doesn’t fit or scale for statement-type PDF documents. They’re also typically not available in HTML or XML, since PDFs are usually the format of choice and are often required for archival and regulatory compliance purposes. Additionally, large organizations producing these types of communications have often invested heavily in their technology infrastructure, with sophisticated software that transforms the data – like names, account numbers, marketing ads, etc. – into print files that get turned into paper communications and also into PDFs for online presentment. So providing accessible HTML/XML in this case may not be a solution. There is now a technology solution that works within enterprise IT environments and allows every PDF to be created completely accessible automatically (to WCAG 2.0 Level AA conformance), so these companies can now include all their e-delivered PDF communications as part of their overall web accessibility strategy. That’s just a small sample of some of the discussions exchanged on LinkedIn around PDF accessibility as part of an overall web accessibility strategy – along with a few of my opinions on the topics posed. Thank you to all who provided great insight into the accessibility issues with PDF.
I hope we can continue to have more of these types of conversations on social media, with lots of industry experts sharing their insights! Please connect with me on LinkedIn or via Twitter.

See more on PDF Accessibility:
» PDF accessibility using PDF/UA format: PDF/UA: What is it? Why is it relevant?
» PDF Association’s PDF/UA Competence Center

Read More

Say Hello to the Digital Business! – Is it a Case of Going Back to the Future?

At the beginning of each year I tend to spend a lot of time reviewing key technology trends identified by various IT press and industry analysts. These are the technology trends that will impact consumers and businesses for the next twelve months. One of the key trends at the moment is the ‘digital business’, and I don’t know about you, but I feel a bit of déjà vu occurring here. Let me explain why.

I have been fortunate to have been around long enough to have seen a lot of technology-related trends come and go over the years. My first recollection of the term ‘digital’ was actually in the mid-seventies, when my grandfather paid what seemed a small fortune for a Sinclair Oxford digital calculator. He wanted a way to simplify the bookkeeping of his business. In fact, the word digital seemed to spread like wildfire in the late 1970s and early 1980s. The first recollection I have of going digital myself was when I was given a Binatone games console in the late 1970s; I was the talk of the road that I lived in at the time. The ability to control two paddles on the screen and watch a white square being hit from left to right had my friends enthralled! ‘Pong’ may have been a very simple game to play, but these early consoles kicked off the computer games industry – or should this have been called the ‘digital games industry’?

My own digital journey really began to gain momentum when I was at secondary school, where the peer pressure from fellow pupils to buy new digital gadgets was immense: for example, trading in my Timex watch for a cool-looking digital watch with the obligatory red LED display, getting rid of trigonometry conversion tables and getting my first scientific digital calculator, or trading in my record player for my first compact disc player (and the many battling digital formats that came with it at the time).
It was also around the mid-eighties that I saw a stainless steel bodied DeLorean car making its film debut, a very modern car that at the time represented a very modern era to be growing up in. It was during my latter years at secondary school that a new piece of technology was introduced: the computer. I joined the computer club at school, where every week we wheeled out a Research Machines 380Z computer (our only computer in the entire school!) on a trolley to test the computer programs that we had spent the previous week writing down on paper. I got my first home computer in 1983, a Commodore Vic-20, and I remember spending hours entering code from computer games magazines to save the £5 or so that it would have cost to buy the games on tape! Anyway, it was this two-year flirtation with the Vic-20 that set me on the path for a ‘digital career’.

I went off to university and studied Computer Aided Engineering; yes, the digital term was being consigned to the bin, and everyone had decided that ‘computers’ were the way forward. In the early 1980s it was almost as if companies were rolling out ‘computer aided’ strategies: Computer Aided Design to develop products, Computer Aided Process Planning, and then Computer Aided Manufacturing to manufacture the products from the 3D representation of the product that had been created. I spent many years honing my technical drawing skills on paper, buying my own Rotring drawing board, only for that to be consigned to the dustbin as Computer Aided Design packages started to become more popular. In fact, for many manufacturing companies it was the design office that emerged as the department with the most computers, often expensive ones, to automate the design process and hence get products out of the door more quickly. I then went on to join the leading CAD vendor at the time, Computervision.
Since then I have remained working for various technology vendors, and so it seems strange that we seem to be entering a new digital period where companies are just starting to realise the benefits of going digital. But hang on a second: many of today’s CIOs are from my generation. They have grown up with digital technology in their personal lives, and yet for some reason translating their digital personal life into a digital business life seems to be taking quite a while. This is mainly because most companies have complex business processes to adhere to, and government mandates ensure that paper-based documents must be retained for regulatory compliance purposes, so it is taking time to digitise every single business process. During my career at GXS, and now OpenText, I have been amazed at how many companies still have manual, paper-based processes in place, whether it is raising shipping documentation or sending invoices that need a whole army of people in the loop before a supplier can be paid.

But wait, what’s this pulling into the IT strategy station? Yes, it is the Digital Express train, and it’s time for every CIO to get on board or head back to the IT dark ages. The term that represented the 1980s is back, and it is meaner and tougher than ever: digital. Needless to say, digital in the 1980s was focused around the consumer. Today, however (with help from other technological marvels such as the internet, mobile devices, the cloud, social networks and the very latest set of buzzwords, the Internet of Things), it is about the ‘digital business’. So yes, the time is right for the digital business: a place where information, irrespective of its source, can be digitised, archived, accessed and viewed any time, any place, anywhere. From digital mock-ups and digital manufacturing through to digital procurement, digital is now going mainstream, but this time it is business- rather than consumer-driven.
In a time when companies are exploring new markets, having a scalable and flexible IT infrastructure to allow a digital business to spread seamlessly around the world is a standard requirement, not an option. Over the coming months I will be posting more blogs around every aspect of the digital manufacturing business, from digital design to the digital shop floor, from digital testing and compliance through to digital service and support. The time is right for every company to develop its own digital business strategy to put it on the path for growth and success. So here we are, going digital again. It seems strange that back in the late 1970s my grandfather had the vision to automate his bookkeeping via a digital calculator, and here we are, forty years later, talking about the introduction of the digital business. However, this time it is here for good, as I will reveal in future blog entries. So what was your very first memory of going digital, either in your personal or business life? In closing, today’s generation Z will become tomorrow’s digital natives, and to see what they think of a piece of ‘pre-digital’ technology that was a mainstay in companies across the world for decades, take a look at this short video.

Read More

Assessing Cloud Architecture and Fax Performance

So, you’ve decided (or been mandated) to migrate fax operations to the cloud? It’s a big step, not unlike other IT migrations. You’ll need to understand how the onboarding cloud platform works in conjunction with your desired application; in this case, how your organization will manage faxing in the new cloud environment. Whether you’re replacing fax hardware entirely or cloud-enabling aspects of your on-premises fax infrastructure, understand that successful implementation depends on how well you leverage the cloud’s unique capabilities in conjunction with enterprise fax applications.

One of the first things to ask is whether or not the cloud fax provider features a robust network based on multi-tenant cloud architecture. Considering the business-critical nature of the ability to fax from your specific applications in a unique environment, that answer should be “yes.” The network should also support connections to multiple telephone carriers, a concept known as carrier diversity. In other words, it should ensure consistent fax uptime across geographic borders by utilizing multiple connectivity options and points of presence to support disaster recovery, redundancy and failover. This also underscores the provider’s ability to scale out fax operations according to your specific volume requirements without the need to pay for excess capacity.

Other architectural components to look for are:
· A platform that allows simple integration of your unique desktop fax applications
· An environment that offers additional fax document workflow options such as automatic capture and notifications
· 24 x 7 customer support and operations monitoring
· Service Level Agreements (SLAs) guaranteeing specific network performance requirements

These are simply the basics, as every fax implementation is different. However, when making the leap to the cloud for managing your enterprise faxing, the above represents the bare essentials you can’t overlook.
In the next installment I’ll dig deeper into how cloud fax architecture and performance directly impact the concept of disaster recovery.

NOTE: This is an installment of a blog post series on enterprise-class cloud fax services. To view other posts in the series, please refer to the following links:
» What Makes Cloud Fax Services Enterprise Class
» Recovering from Fax Disaster
» Fax Compliance in an Ever Changing World
» Cloud Fax Takes Information Management to the Next Level
» Cloud Fax Services Make Administration Easy
» Simplify Global IT Support with Cloud Fax
» Know Your Cloud Fax Service Provider’s Strengths

Read More

Choosing the right fax server: What is the business need?

Many organizations begin their quest to implement a fax server based on a business need that is driving the project. This goes beyond your boss telling you to “Go out and find us a fax server – stat.” Usually, you’ll be asked, “What is your business need?” but you should also consider, “What is your business pain?” Understanding the “business pain” is an important first step. Ask yourself if you have these typical business pains regarding faxing:
· Do I have to reduce costs by eliminating fax machines?
· Do I need to increase productivity by making it easier for employees to fax?
· Do I need a faster way to process incoming faxes?
· Do I exchange sensitive and private information that needs to be secure and protected?
· Is faxing a bottleneck in a workflow or business process?

Once you’ve identified your business pain(s), it becomes easier to identify how you will use the fax server: the business need. This is important for evaluating the capabilities that you will need in a fax server. Understanding how fax is used is key to choosing the best fax server for your organization. Here are some questions to consider:
· How do my users need to fax – from desktop applications, MFPs, inside email applications? Tip: Determining the “source” of the content will help identify the most efficient way to fax the information.
· Is there a business process or workflow involving fax that can be made more efficient with a fax server? Tip: Identify the workflow/business process and the line-of-business owner/stakeholder.
· Has the line-of-business owner/stakeholder mapped how fax documents flow in and out of this business process or workflow? Tip: Interview the line-of-business owner/stakeholder to see how the process works today and how it would ideally look with a fax server implementation. Map out the flow of fax documents (inbound and outbound) to see where efficiency, productivity and cost savings can be gained.
· What types of applications need to be integrated with fax? Tip: Make a list of all of the back-end applications that are part of a workflow or business process for fax (ERP, CRM, document management, etc.) and any vertical application systems (software specifically developed for healthcare, legal, financial, etc.). Do users need to fax from applications such as Microsoft Office? Do you need to integrate fax with your existing email application? The more comprehensive the list, the better prepared you will be to evaluate fax server capabilities.

Now that you’ve identified and documented business needs and how fax is used in your organization, you can move on to evaluating the next key capability of a fax server – Choosing the right fax server: Desktop, Email and MFP Integrations.
1. What is the Business Need?
2. Desktop, Email and MFP Integrations
3. Production (Automatic) Faxing and Application Integrations
4. Easy Routing and Storage of Electronic Fax Documents
5. Security, Privacy and Compliance
6. Business Continuity/Disaster Recovery
7. Ease of Administration and Administrative Tools
8. Telephony Compatibility

Read More

Talk to an OpenText Hero: Getronics Optimizes BPM in the Cloud

Getronics, winner of the 2013 OpenText Heroes Award for Best Implementation of Cloud Services, uses OpenText Cloud Brokerage and Business Process as a Service (BPaaS) to orchestrate a variety of application services they provide to their customers. The OpenText solution enables Getronics to remove organizational and technical silos and enable process-led, agile operations in the delivery of those services, while helping Getronics’ customers save money and reduce internal costs when moving to the cloud. Hear from Tim Patrick-Smith, Group CIO of Getronics, as he explains how OpenText Cordys (now Cloud Brokerage and BPaaS) helped their business reach its vision. Want to hear more from Getronics? Register for the “Process Efficiency at Your Fingertips: Discover BPM Optimization in the Cloud” webinar featuring Alex Howe, Senior Enterprise Architect at Getronics, or join him at Enterprise World 2014 on November 9-14.

Read More

The File-Sharing Dilemma (a.k.a. It’s 3 a.m.: Do You Know Where Your Content Is?)

You’re in IT management. What keeps you up at night? Standard stuff like health and retirement savings? Or is it that new hire in marketing – the one leaving the office every night with confidential campaign plans copied to a flash drive? Or maybe it’s the R&D manager who’s using public file sync and sharing services to transfer sensitive product development specs between their work and home computers. If either of those scenarios is familiar, that’s what you should be stressing over. And for a couple of reasons: at the most basic level, that’s your organization’s critical information – its lifeblood – out there roaming beyond the firewall. At a higher level, it also means your enterprise probably doesn’t have a secure, compliant, user-friendly file sync and share solution integrated into its ECM platform.

You’re not alone. If it makes you feel any better, many organizations are struggling to adapt to a rapidly evolving work environment that now encompasses anywhere, anytime, and any device. To help put the changing landscape in perspective, here are some results I’ve pulled together from a few surveys:
· 65% of respondents have accessed work-related data on their mobile device, though only 10% have corporate-issued devices. Shockingly, over 50% said access to their devices wasn’t password protected.
· 78% of companies say the number of personal devices connecting to their networks has doubled over the past two years. However, less than 10% are fully aware of which devices are logging in.
· 93% of companies without an enterprise file sync and share platform say their employees are specifically using Dropbox, despite (or, more likely, due to an unawareness of) several recently documented security issues.

BYOx Has Arrived. What’s Your Response?

Fact is, companies are expecting more out of their employees, and resourceful staff members are doing their best to deliver.
So much so that the concept of BYOD (Bring Your Own Device) is quickly morphing into BYOx, where “x” is defined as whatever’s necessary to get the job done: devices, applications, web services, cloud storage, and more. Good on the staff for showing initiative, but it’s now all on the infrastructure architects to provide them with a secure, productive sandbox to play in. I’m not alone in saying that adopting an “anything goes” policy for external information sharing and storage is a no-win proposition. It results in an inefficient, tangled mess for users and gruesome security and governance risks for information guardians. There really is only one true win-win in this new world, and it’s in the form of a cohesive, dedicated file sync and sharing application that’s built from the ground up with inherent security and compliance to excel at all three aspects of the corporate sync-and-share paradigm: usability, governance and security.

The Best of All File Sharing Worlds Is in One Simple Solution

So, at the most basic level, it seems there are two paths to meeting the demands of the next-gen workforce and workplace. Sadly, one involves trying to grow a business through public file sync and sharing tools created for non-business use: tools that are incompatible with your tech environment, ask you to rely on someone else’s definition of security, and can’t tell you where your data’s been hanging out. Truth is, solutions like OpenText Tempo Box are the foundation for the future. Tempo Box is built on an ECM infrastructure and operates in the cloud, on-premises, or as a hybrid model that incorporates both. It’s time to take the leap and implement a true enterprise-grade sync and share solution that effortlessly brings the best advantages of external file sync and sharing – content creation, collaboration, and storage – back behind the firewall and into a secure, governable structure where it belongs. I guarantee you’ll sleep better. Try Tempo Box today!

Read More

Choosing the right fax server: How to evaluate and choose an enterprise-grade fax server solution

When organizations want to make faxing operations an efficient part of their business processes or workflows, they turn to an enterprise fax server to boost efficiency and productivity by increasing the speed of transmitting, routing, and processing faxed documents. Understanding the features and functionality of enterprise fax servers is an important step in choosing the solution that is best for your organization. This series of articles can be used as a decision support tool for organizations planning to choose and implement an on-premises, enterprise-grade fax server solution. It is designed to help evaluate business needs and develop a foundation of criteria for choosing the optimal fax server application. The future articles below will outline eight basic fundamentals of what enterprise fax servers must do to meet the rigorous demands of today’s document-centric businesses.

1. What is the Business Need? – The first step in evaluating enterprise fax servers is to determine the business need for a fax server within your organization. This will help shape how you evaluate the subsequent key capabilities of a fax server.
2. Desktop, Email and MFP Integrations – Enterprise faxing solutions must provide company-wide users the ability to send, receive, and manage faxes from virtually all user-based desktop systems.
3. Production (Automatic) Faxing and Application Integrations – Choose a fax server that has the ability to fax-enable any application that generates documents as part of a workflow or automated business process.
4. Easy Routing and Storage of Electronic Fax Documents – The ideal fax solution provides several options for routing inbound faxes, with notifications and an audit trail available for every document touch point. Also, for long-term storage, a fax archiving option should be available to offload and store documents for as long as you need them.
5. Security, Privacy and Compliance – An on-premises fax server solution should help with compliance initiatives by providing a secure solution for managing all fax documents. The solution must offer various features and capabilities that help organizations achieve privacy and security standards.
6. Business Continuity/Disaster Recovery – Identify an enterprise fax solution that can deploy in high-availability scenarios. Look for a solution that can provide disaster recovery options across multiple site locations.
7. Ease of Administration and Administrative Tools – Key to the success of any fax server implementation is the administration and management of the system. Choose a fax server that provides comprehensive guides and tools to make administration of the fax server as efficient as possible.
8. Telephony Compatibility – Choose a solution that can operate entirely in-house, as a hybrid solution, or as a combination of both. Make sure that the fax server software is compatible with your telephony equipment, if applicable.

Watch for the first topic: Choosing the right fax server: What is the business need?

Read More

What Makes Cloud Fax Services Enterprise Class

Think (wish) fax is going away? It’s not! In fact, many large organizations are still faxing in high volumes, particularly those in the healthcare, financial services and supply chain sectors. However, the volume of faxes these industries transmit manually has employees wasting valuable time printing documents, walking to standalone fax machines, dialing numbers, hitting send and waiting for confirmations. In addition, the costs of toner, paper and other supplies necessary to maintain fax operations waste money. To overcome the waste of time and money associated with manual faxing, enterprises began leveraging on-premises fax software implementations that enabled faxing from the computer desktop. But on-premises software requires capital investment and IT resources to keep these implementations running – two things many organizations are also trying to reduce. So now many enterprises are taking another step and investigating cloud fax services. Instead of investing in on-premises fax implementations, organizations are cloud-enabling their fax infrastructure to drive efficiencies wherever possible – whether in telephony costs, disaster recovery, scalability, or testing new applications.

There are many cloud fax service providers claiming to offer “enterprise-class” cloud fax services. With so many vendors out there, it’s hard to tell whether they’re enterprise class or not. To help make the determination, an enterprise-class cloud fax service provider should feature the following:
· Sophisticated architecture and performance
· Comprehensive business continuity and disaster recovery plans
· Security and compliance controls
· Flexible platforms that support a variety of integrations
· 24/7 administrative services
· A worldwide network
· A track record of servicing organizations on a global scale

While the above doesn’t tell the whole story, it speaks to the beginning of the tale. We recommend you carefully vet potential cloud fax service providers based on these points.
Overlooking them will likely lead to a series of costly backtracks for your organization. This is the first in a series of blog posts about what makes a cloud fax service enterprise class. In the next post I’ll talk about what to look for in cloud architecture and performance to support large enterprises.

NOTE: This is an installment of a blog post series on enterprise-class cloud fax services. To view other posts in the series, please refer to the following links:
» Assessing Cloud Architecture and Fax Performance
» Recovering from Fax Disaster
» Fax Compliance in an Ever Changing World
» Cloud Fax Takes Information Management to the Next Level
» Cloud Fax Services Make Administration Easy
» Simplify Global IT Support with Cloud Fax
» Know Your Cloud Fax Service Provider’s Strengths

Read More

Outbound Customer Communications: Author Once, Publish Many?

Originally published on Doculabs.com

For many of our clients today, there are three primary outbound communication channels: paper via USPS mail, email via ExactTarget (or similar), and web pages via the website. Unfortunately, for most organizations, each of these channels tends to have a different content authoring team, different publishing processes, and different underlying technologies. In addition, the manner in which the content is structured – in terms of naming conventions, meta-data and hierarchy – is different. As a result, while the world has adopted multiple channels of communications, costs have increased for these organizations, when you would actually expect to achieve some synergies.

Consider some stats. On an annual basis, a financial services or insurance client with 10 million customers will likely generate 100 million pieces of physical mail, send 50 million emails (assuming a 25% “e-doption”), and see roughly 25 million site visits and more than 75 million page views. In support of these activities, across these three channels, there are typically in excess of 50,000 discrete “templates”, and likely a quarter- to a half-million content components used in the various templates. And considering both business and IT resources, it takes more than 100 FTEs to support the authoring, publishing, and delivery processes. Not cheap!

Given the magnitude, how do we streamline these processes and gain some efficiency? What about the promise of “author once, publish many” that the industry has been advocating for well over 10 years? Well, in my opinion, the operational constraints of the different target mediums and supporting technologies result in some redundancy. Regardless, though, some efficiencies can be gained. So the key questions our customers ask are: What is the right balance? Can we encourage content re-use within and across channels, without compromising the optimization needed for a particular channel?
Here are some suggestions for organizations looking for additional leverage, from the standpoints of people, process, information architecture, and technology:

People
- Cross-pollinate team members. Granted, the skills needed to work in Adobe for web content, in Documaker for templates destined for paper, or in Aprimo for email campaigns, etc., involve some specialization. But the fundamental skills needed to understand tone, conform to a style guide, and use brevity apply across all channels. So the more you can cross-train individuals in these three teams, and use common practices for authoring and assigning meta-data, the more you will be encouraging common processes.
- Organizational alignment. If possible, have the different content authoring teams report directly or matrix to a common set of managers; again, consistency is increased. In particular, the different teams will begin to understand and appreciate some of the unique complexities within each channel, but also many of the similarities. And it affords the organization an opportunity to define common operating metrics – the number of change requests, the length of time and number of hours to modify content, etc.

Process
- Standardized workflows. While the platforms used to author and publish content may be different in each delivery channel, some degree of process standardization is achievable. Consider, for instance, how change requests are submitted, how requirements are articulated, what additional artifacts are needed, how approvals are collected, etc. In many cases, standardizing these steps simplifies the process, both for the individuals within the publishing teams and for the business users interacting with the various teams.
- Roles, responsibilities, and metrics. The participants in the workflow, regardless of channel, can also be standardized so that everyone knows their role. Most critical is the definition of which tasks business or IT staff are responsible for. In addition, define the key metrics, including the number of units of work to be completed and the desired SLAs.

Information Architecture
- Naming conventions and hierarchy. Just about every client I work with complains that they have content "all over the place". Of course content is everywhere, because no one ever proactively defined naming and retention protocols. Even if you are using simple file shares, taking the time to define basic meta-data, where content should be stored, and how to do version control (even something as simple as Filename_v1, …v2) goes a long way toward consistency.
- Meta-data. Once the basics above have been addressed, go a step further and define the essential set of indices that can be used to describe a template or content component. Graphics and design elements are particularly problematic, so get a cross-channel team in the room and agree on the three to five descriptive elements needed from the original creator of the content.

Technology
Today, the supplier community might claim to support multi-channel publishing – and if you were starting "greenfield," that may well be the case. But most of our clients, given the divergence of platforms already in use, need to create channel-specific renditions. "Rendition" is probably too favorable a term, as it implies simply pushing a button and magically producing an email or web page. Regardless, if the upstream processes leverage common standards (as outlined above), the particularities of the technology platforms can be minimized (but not eliminated). Overall, while the perfect world of authoring once and dynamically publishing across paper, email, and web channels is still a challenge, there are many steps organizations can take to streamline their operations. Developing common skills, processes, and content naming conventions optimizes the use of resources to the extent possible.
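Even the simplest convention mentioned above, Filename_v1, …v2 versioning on a shared drive, is easy to enforce programmatically. The sketch below, a minimal illustration rather than any particular product's behavior (the function name and file names are assumptions), computes the next version number for a content component given the files already present:

```python
import re

def next_version(filename, existing):
    """Return the next versioned name following a Filename_v1,
    Filename_v2 convention, given the base name and the list of
    files already stored. Illustrative sketch only."""
    stem, _, ext = filename.partition(".")
    # Match e.g. "Welcome_Letter_v3.docx" and capture the version number
    pattern = re.compile(rf"^{re.escape(stem)}_v(\d+)\.{re.escape(ext)}$")
    versions = [int(m.group(1)) for f in existing if (m := pattern.match(f))]
    n = max(versions, default=0) + 1
    return f"{stem}_v{n}.{ext}"

print(next_version("Welcome_Letter.docx",
                   ["Welcome_Letter_v1.docx", "Welcome_Letter_v2.docx"]))
# Welcome_Letter_v3.docx
```

A small utility like this, paired with a few required meta-data fields, is often enough to bring the consistency described above to teams that are not yet on a full content management platform.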

Read More

How to Avoid Pitfall #6 in your EDI Supplier Enablement Program

EDI

In my blog, Six Pitfalls to Avoid When EDI-Enabling Your Suppliers – PART 2, the sixth pitfall I highlighted was "Insufficient planning for supplier communications." As part of your supplier communications plan, you will need to send a series of communications, most likely in the form of emails. Preparing complete, well-thought-out email templates will help reduce the number of support calls you would otherwise get from your suppliers further down the line in your program. Before you start, take a look at this checklist for supplier enablement.

The first email in your series should provide suppliers with an introduction to your EDI program. You will send this introduction to each group of new suppliers you want to add. It should include:
- The goal of the program for your company – such as improved productivity, better customer service, greater efficiency
- The benefits that participation will deliver to the supplier – such as faster payment of invoices, the potential to become a strategic supplier that will enjoy increased business, the opportunity to continue to get your business
- How to learn more about your EDI program – such as a link to a website where additional materials may be found, registration for a webinar during which you'll describe more of the plans and details, or the contact to call for answers to questions
- Timing requirements – it's best to include a deadline by which some action must be taken or a process completed. If applicable, it should also include the consequences of non-compliance.
- Next steps – Should the supplier contact a specific person? Complete a survey or form? Update a website? Register for a webinar?

As with all your communications, the email should be concise, accurate, and informative. Furthermore, it should be written with your supplier's perspective in mind.
For example, if you’ve done a supplier survey and you’ve identified a list of suppliers that would be good candidates for a web forms-based solution – either because they don’t have the resources to invest in a full EDI solution or the timing is not right, but you still want them to participate immediately – you need to provide the information about the forms solution. Here is a sample introductory email that has been used successfully with such suppliers. And as a reminder, you may find this Supplier Enablement Program Checklist a useful tool. It provides a listing of the detailed steps you should complete for each phase of your supplier enablement program, along with a description of each step.

Read More

10 Best Practices For Your Targeted Messaging

Organizations are continually looking for opportunities to engage with their customers, and one way is through their customer communications. Within customer communications, engagement can be enhanced if the message is personalized and targeted to the user. Proper application of targeted messaging can lead to increased sales and a better customer experience across all channels. When targeted messaging is done poorly, however, it can annoy customers and drive them into the arms of your competition. Here are 10 best practices that can help you develop and deliver effective targeted messages:

1. Clearly define the type of message you want to send. To do this, you must understand your objective. Is it related to sales (including advertising), customer service, or awareness? For example, are you trying to sell a new service? Reinforce your brand? Provide information about locations or business hours? Draw attention to charitable causes supported by your organization? Be as specific as possible and take care not to mix your messages.

2. Design each message for a specific channel. While designing your messages, take into account the characteristics of each channel, why customers use each channel, and what kind of experience they expect to have on it.

3. Optimize messaging with analytics and many data sources. Fine-tune personalized messages by applying analytics to as many (internal and external) data sources as possible and as much history as possible. The more customer data you assess, the better your chances of crafting appealing messages.

4. Present customer offers in order of priority. To increase offer acceptance rates, use the personalized insights garnered from analytics to select the most appropriate offers and rank them in decreasing order of relevance. Then present them to your customers in that order, starting with the offer that has the best chance of success.
5. Do not repeat accepted or rejected offers on other channels. Coordinate offers across all channels so you don't become a nuisance to your customers. After you extend an offer, it should be automatically removed from a master list to prevent it from being sent out again on another channel. If you waste your customers' time by forcing them to process the same offers over and over again, they will eventually opt out of receiving content from your organization. You might even lose their business.

6. Deliver each message at an appropriate time. Be sure that your customers receive messages when they are most likely to be receptive to them. Both timing and context are important here. For example, a customer who logs in to a web portal to complete a transaction does not want to be interrupted mid-task by a sales message. Wait until the customer finishes before delivering a message that is relevant to the transaction.

7. Use strategic partnerships to increase messaging opportunities. To gain access to a large pool of high-quality sales prospects, partner with an organization selling products and services that complement your own. For example, a software company that partners with a bank can offer discounted tax preparation software to the bank's customers.

8. Protect customer privacy when aggregating data. Take steps to ensure data security and customer privacy whenever you employ analytics software to spot trends and create targeted messages.

9. Always use permission marketing. Spam is not going to help you win new business, so only target people who opt in to your content. If you start delivering unsolicited messages to your customers, many will simply drop your company in favor of a less intrusive competitor.

10. Abide by all government regulations when tracking customers online. Always gather data about customers' online activities in strict accordance with local laws. Legal infractions can be costly, from both a monetary and a public relations standpoint.
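The ranking and master-list ideas in practices 4 and 5 can be sketched in a few lines of code. The class below is an illustration only (its name and methods are assumptions, not any vendor's API): offers are held per customer in relevance order, and once an offer is extended on any channel it is retired so no other channel repeats it.

```python
class OfferCoordinator:
    """Minimal sketch of cross-channel offer coordination: serve each
    customer's offers in priority order, and never serve one twice."""

    def __init__(self, ranked_offers):
        # ranked_offers: {customer_id: [offers ranked by predicted relevance]}
        self.pending = {cust: list(offers) for cust, offers in ranked_offers.items()}

    def next_offer(self, customer):
        """Return the highest-ranked offer not yet extended, then retire it
        from the master list so other channels cannot repeat it."""
        offers = self.pending.get(customer, [])
        return offers.pop(0) if offers else None

coord = OfferCoordinator({"cust-42": ["cash-back card", "auto loan", "travel insurance"]})
print(coord.next_offer("cust-42"))  # cash-back card (extended by, say, email)
print(coord.next_offer("cust-42"))  # auto loan (web portal now shows a fresh offer)
```

In practice the retired-offer state would live in a shared datastore visible to every channel, which is exactly the coordination point practice 5 argues for.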
If you are interested in more information about targeted messaging, please read this 2-page Business Overview or email me directly at scastrucci@actuate.com.

Read More

What Happens in Europe, Stays in Europe

As companies adopt disruptive new technologies such as mobile and cloud computing, many countries have begun to develop new information compliance standards and regulations. As a result, the geographic location of stored information has become a growing concern for companies in countries that have such regulatory or legislative requirements. In response, OpenText recently launched the European Data Zone for its end-to-end cloud fax services. The OpenText European Data Zone, based in the UK, now provides OpenText Fax2Mail, OpenText Production Messaging, and OpenText RightFax Connect customers with complete fax handling that includes message rendering and processing in addition to delivery of cloud fax messages. Customers concerned about data sovereignty issues can have their accounts provisioned to the European data center to ensure their data is handled entirely within Europe. To read the press release in its entirety, click here.

Read More

Breaking Bad: How technology is changing Media & Entertainment

Inspired by the upcoming Emmy Awards broadcast, and with a nod to last year's awesome Best Drama winner Breaking Bad, let's take a look at the impact of technology innovation on the Media and Entertainment industry and what it will mean for those companies to win an "IT Emmy." As consumers, we have all directly experienced the dramatic changes happening throughout the Media and Entertainment (M&E) industry. From music to television, books and movies, new and innovative technology has been a major, and at times quite disruptive, factor in reforming the industry's business models. It is only karmic, then, that M&E companies are using technology in turn to create solutions that deal with the underlying changes in long-established production practices and distribution approaches. OpenText sees enterprise information management (EIM) technologies playing an important role in those solutions. It's been said that the larger lesson of Breaking Bad is that "actions have consequences." To help me take a closer look at both the actions and the consequences for M&E, I've invited my colleague and industry expert, Charles Matheson, to answer some top-of-mind technology questions for publishers, broadcasters, movie studios and advertisers alike. Deb Miller: Charles, the Media & Entertainment industry seems to be going through a real transformation driven in large part by technology innovations. Traditional business models have been turned upside down or, in some cases, simply ceased to exist. What segments of the M&E industry are suffering from technology disruption? Charles Matheson: Business model disruption has been in effect for the past 15 years, and the first segment of the M&E industry to feel the sting of technology disruption was the music industry in the early 2000s. The emergence of digital music files, or MP3s, that could be shared or sold blew up the decades-old business model of recording and distributing music.
In half a decade the music industry saw billions of dollars evaporate from the business, with devastating results. Just try to find a record store in your town. The other segments of M&E saw what happened to music and determined that it wouldn't happen to them, so the adoption of new information technology and an innovation imperative now drive their strategic planning and decisions. Media companies across the board have decided it's safer to ride the technology wave than it is to resist and get buried by it. Deb Miller: Each segment of the M&E industry is different and will face different business pressures and innovation challenges. Are some segments handling this business transformation better than others? Charles Matheson: One key enabler has been the adoption of business process management disciplines and technology starting at the inception of a project, book, show or advertisement. Modeling and mapping a business process, even one as iterative as film production or website development, has proven to deliver results. Cost savings and performance improvements through process and case management are obvious benefits, especially those focused on serving the customer. However, increasing revenue may be the real "gold statue" winner for BPM. For example, a BPM-based application that helps manage intellectual property and licensing can have a tremendous impact. Deb Miller: How are the segments using technology to enable this business transformation? Charles Matheson: Each segment has its challenges, but they are not in denial or resistant to change. The two segments of M&E that took the hardest hit early on were music and newspapers in the publishing segment. The advent of a ubiquitous free network caused major disruptions in key pieces of their respective business models. Music was freely, if illegally, distributed through services like Napster, and news was aggregated through portals while classified advertising moved to specialized web sites like Craigslist.com.
Advertising jumped on the new technology early on and perceived it, correctly, as a tremendous opportunity. Other segments like publishing, studios, broadcasters and game developers have changed their business models to adapt and prosper. Deb Miller: How are distribution models changing to meet the new business environment? Charles Matheson: Motion pictures used to be cut, approved and canned for distribution and released in a series of “windows” for consumption. The first “window” was theatrical release followed by a series of other “windows” which generated revenue for the property. Sell it to airlines for in-flight movies or premium cable channels, then network television and finally VHS or DVD distribution. With digital distribution this model stops working, so now films are distributed to as many resellers as possible in the shortest time frame possible – in other words, all the traditional “windows” of distribution are collapsing. The inaccessibility to films gave them great value; but that power is diminished in a networked world so ubiquitous distribution at a lower price point is the new way. This has a ripple effect all the way down the chain of production and accounting and requires new IT systems and applications to address the new paradigm. Deb Miller: What kind of new applications are media companies implementing for distribution? Charles Matheson: I’m a member of MESA (Media & Entertainment Services Alliance) and we spend a lot of time focused on digital distribution as the fastest growing segment of the M&E business. This is creating new challenges for content owners and distributors. Omni-channel distribution is the new imperative and media companies are looking for software platforms that intelligently push the right content in the best format for any device. These solutions need to be rock solid yet flexible and intuitive for the editors producing content and the end user consuming it. 
They need to be able to deliver highly personalized content over any network. Deb Miller: So that's a great example for the movie business; do you have other examples that you can share? Charles Matheson: That's just one example of a changing business model for motion pictures; there are many other examples I could give just within that same segment. The impact of DI (Digital Intermediate) in the editing process, the impact of digital cameras and the increasing size of digital files, and the increased number of file types and devices that need to be supported for distribution all force organizations to modify their business processes and technological infrastructure. Book publishers, for example, are being forced to change and adapt their business to the new competitive environment. To the dismay of bibliophiles everywhere, the book is inexorably moving from paper to bits displayed on e-readers and tablets. The new "digital book" is a very different property than its paper-bound predecessor, and publishers need to reimagine what it is they're selling and how they sell it. Magazine publishers, at least the ones that will survive, are already way down this transformational path, and the print product is becoming a supplement to a powerful online brand with video, polls and questionnaires, social media integration, and relevant links to additional information "outside" the property. Deb Miller: How are M&E companies meeting this challenge of huge and growing volumes of digital media? Charles Matheson: Media companies must have a robust digital asset management (DAM) system, preferably one that is integrated with tiered storage. Using disk storage for these large files is not practical or cost-effective, and managing petabytes of video requires the use of object storage or LTO (Linear Tape-Open) tape systems. LTO is being used in production, and the DAM system must know where content is located and how long it will take to retrieve it.
Deb Miller: These are really powerful specific examples of “actions and consequences” for the M&E industry. What would you say are common issues and solutions that apply across all segments in Media? Charles Matheson: Digital asset management is a broad and fundamental imperative for media companies. All segments must effectively manage their content and that content or asset is at this point, by and large, digital. Ironically, because it is so important, almost all media companies have purchased or developed “in house” multiple DAM systems that don’t integrate with each other, so having a federated view and access capability to multiple repositories is vital for performance and innovation. This is an internal business process problem for media companies and the complexity is shielded from consumers and investors, but managing digital assets in the media creation supply chain costs a lot. The digital distribution of content is also a common problem for all media companies and with advances in the network infrastructure and the explosion of end user devices, screen types and sizes, this issue is overwhelming. Leveraging cloud services for software and storage has become important for all media companies, as has the adoption of social networking software for building audience externally and collaboration internally. Federated Media Access is the most common solution to solving the problem of multiple repositories containing valuable content. This can be accomplished in two ways: first by aggregating and storing metadata from different repositories in one single master database with links or pointers back to the actual content or second, by having an application that integrates to multiple repositories and gives the user real time search and access to the content. 
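The first federated-access approach Charles describes, aggregating metadata from different repositories into one master index with pointers back to the actual content, can be illustrated in a few lines. The sketch below is a simplified assumption (repository names, fields, and functions are invented for illustration), not any particular DAM product's design:

```python
def build_master_index(repositories):
    """Merge per-repository asset metadata into a single searchable index.
    Each record keeps a pointer (repo name + asset id) back to the
    system of record, rather than copying the content itself."""
    index = []
    for repo_name, assets in repositories.items():
        for asset_id, meta in assets.items():
            record = dict(meta)
            record["source_repo"] = repo_name  # pointer back to the repository
            record["source_id"] = asset_id
            index.append(record)
    return index

def search(index, keyword):
    """Naive keyword search across aggregated titles."""
    return [r for r in index if keyword.lower() in r["title"].lower()]

repos = {
    "broadcast_dam": {"a1": {"title": "Evening News Promo", "format": "mov"}},
    "web_dam": {"x9": {"title": "News Site Banner", "format": "png"}},
}
index = build_master_index(repos)
print([r["source_repo"] for r in search(index, "news")])
# ['broadcast_dam', 'web_dam']
```

The second approach, an application that queries each repository live, trades the staleness risk of a copied index for the latency and availability cost of real-time federation; which is preferable depends on how often the underlying assets change.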
Deb Miller: With so many technology hurdles and outside competitive threats, what can M&E companies do to gain competitive advantage and win the "IT Emmy?" Charles Matheson: As a rule, Media companies run two parallel IT organizations. One IT shop focuses on traditional IT systems that one would find in any large enterprise. I'm talking about financial systems, email, CRM and other internal properties like the corporate intranet. The other IT organization is focused on creating content like books, magazines, websites, movies or TV programs and advertising plus all the marketing material associated with those properties. These two IT shops use very different technology to achieve their goals and they are not vertically integrated – in fact, often the right hand doesn't know what the left hand is doing. The integration of production IT with business IT will improve productivity, reduce cost and spur innovation and that is where media companies will focus their energies in the future. There are only a few companies who have operational experience in both back-office IT and media production environments, and they are poised to lead the next wave of business transformation in M&E. A version of this article first appeared in CMSWire. Illustration from Shutterstock.

Read More

Are Data Protection Concerns Keeping you From Leveraging Information?

data protection

A recent PwC study suggested that many businesses are so concerned about protecting data from possible theft or misuse that they are not taking full advantage of the value that information represents. "Beyond good intentions: An introduction to the 2014 Information Risk Maturity Index" summarizes results from interviews with 1,800 business leaders in Europe and North America on their "risk maturity," or the extent to which businesses implement and monitor policies and procedures to protect and manage their information assets. While I don't think the paper presented evidence to tie together the respondents' risk maturity level and their ability or willingness to use their data effectively, it brings up a point worth discussing. The PwC study contends that a "company-wide focus on security has kept organizations and their boards from sharing and distributing data and information within the organization to maximize its value." It goes on to explain: "The big issue facing most businesses … is that their information assets are in the hands of the right people to safeguard them, but the wrong people to manage their exploitation. Organizations continue to believe that the IT manager should have ultimate responsibility for protecting information. … A cultural shift is needed so that information is shared with the people who have the right skills. This will include e.g. [sic] data analysts who can mine it and derive insight from it, as well as business units such as R&D, sales, and customer relationship teams who can determine what the insight means for them." The challenge here is how to share proprietary information with employees who can create business value from that data without running the risk of having it fall into the wrong hands. Of course, most companies have ECM systems that enforce user rights and permissions, but what about users who need access to information without having editing, downloading or printing rights?
For example, only authorized users could be given access to certain documents—and that access could be restricted to viewing the content on a server only, not downloading the files or copying the content within them. OpenText™ Brava makes this type of secure content viewing possible. With Brava integrated into an ECM system, organizations can maintain full control over their content while ensuring that authorized users can view the information they need.

Read More

eDiscovery Costs Keep Rising: It’s Time for Software to Come to the Rescue

I read the recent 2014 Norton Rose Fulbright Annual Litigation Trends survey, which (rather unsurprisingly) shows that corporate lawsuits continue to rise. What did surprise me was how big a jump 2013 showed: "Litigation spending rose once again in 2013, with 71 percent of U.S. companies reporting an average spend of $1 million or more, up sharply from 53 percent reported in both 2011 and 2012." Why this jump in costs? eDiscovery. According to the 2014 Gartner Magic Quadrant for eDiscovery Software, eDiscovery is feeling the pain of the data explosion, fueled by "increasing volumes of litigation and regulatory investigation, and the ever-expanding amount of ESI [electronically stored information] that must be searched in support of these activities." Drilling down further, a 2013 RAND study found that 73% of eDiscovery costs are spent in the document review stage, where ESI is evaluated for relevance and privilege. If an organization can streamline and simplify content viewing, and particularly redaction, it can help reduce the expense involved in eDiscovery. When I first ventured into the eDiscovery market seven years ago, I was truly astonished to find that companies and law firms took perfectly good electronic documents and printed them to perform review and redaction. They would often use interns, college students, or outside paralegal firms to comb through all the printed material and manually redact information with markers or tape, then scan everything back in. To make the now-flat scanned documents text-searchable again, they would run them through optical character recognition (OCR). I can't imagine doing that on hundreds of documents, let alone millions. Using the original, electronic version of data is—and must be—the trend in eDiscovery. According to forecasts, enterprise eDiscovery software revenues are expected to nearly double over the next few years, from $1.8 billion today to $3.1 billion in 2018.
This entertaining—and informative—video shows how software can be used effectively to help speed the review and redaction of data. It uses OpenText™ Brava to provide multi-format viewing and redaction in a single interface.

Read More

Making ECM Easy: Accelerate User Adoption and Productivity

Realizing your new ECM’s projected Return on Investment—and other Information Governance benefits—is very dependent on how quickly your users start adopting the new system. Often times, your people have been doing their jobs without this new system for many years, and now you need to convince them that they “need” to make a change. As we have all learned over the years, people really don’t like change. Even if they aren’t thrilled with the current ways of doing things, at least they know how to get the job done. I’ve read several studies that show that ECM implementations can struggle with user adoption. So how can you ensure that your teams will use your ECM system and not try to work around it? In a July 2013 Forrester Research, Inc. report, Improve ECM Satisfaction Levels Through Agility, Analytics, And Engagement (subscription required for access), Cheryl McKinnon identified “several essential practices for business process management (BPM) professionals that have direct applicability to ECM practitioners: 1) identify opportunities to design for specific interactions; 2) add contextual information to activities; 3) apply empathy for users and how they interact with the system; and 4) explore abductive reasoning skills (i.e., push the envelope and imagine scenarios outside of traditional comfort zones). Problems can be reframed; instead of blaming the user for not embracing ECM software, you might ask how to make ECM easier, luring people away from reliance on paper.” Sitting down with users in each department and asking them to help you design “their” user experience will usually create avid adopters, but only if you can deliver. The trick is to create interfaces that tailor the user experience for each role or interaction without having to change the core ECM code, which often requires expensive proprietary coding skills, could bypass critical permissions and auditing, and almost always requires recoding when the ECM is upgraded. 
User adoption is critical to the success of your ECM project, but this is a high price to pay. Over the past several years, OpenText has resolved to address this by providing our customers with several standards-based tools to enhance and tailor user experiences to meet specific business user needs. About a year ago we purchased Resonate KT, the company that developed the WebReports and ActiveView products for our ECM Platform (Content Suite). Just a couple of months ago, I was asked to help market Content Intelligence, which bundles this technology into a complete package and also includes a stack of pre-packaged dashboards and tools to help customers get started. The marketing materials addressed all of the concerns outlined above, which was very exciting, but I wanted to hear how this technology actually performed for customers. I have since spoken with several customers, and I’ve got to tell you that I’m amazed at their stories. They have really been able to create user experiences that meet each department’s unique needs, even if those needs included documents and metadata from other systems. They have also all told me that when they upgraded their Content Server, these customizations “just worked.” Wow—now I’m impressed! Here is a brief description of the solution, but please continue reading down to the customer story. That is what really convinced me. OpenText Content Intelligence bundles ActiveView and WebReports tools, as well as a complete set of instantly deployable and easily modifiable prebuilt reports, actionable dashboards, and applications via the Report Pack, to make it easy for organizations to optimize the business user experience, drive adoption, and tailor applications to suit specific requirements of their Content Server deployment. 
With Content Intelligence, organizations can:
- Improve user adoption by making OpenText Content Server look and feel the way your business users want it to
- Limit or eliminate training costs by making it easy for users to work with your ECM system
- Easily develop and deploy applications tailored to your organization's specific needs without outsourcing or the need for specialized development resources (HTML5, CSS3, JavaScript, AJAX)
- Make sure that consistent metadata is available and applied so that users can find what they need, when they need it, and ensure the reliability of reports and dashboards
- Build role-specific dashboards and automated content reports so users don't have to burden IT with requests or learn how to create reports on their own
- Lower TCO and improve ROI by making OpenText Content Suite a natural extension of the way business users create and interact with enterprise content

I had the pleasure of meeting with Michelle and Melanie at Santee Cooper, and I was thrilled to hear how they have used Content Intelligence technology to create very specialized interfaces for each of their departments. They are even using this to sync documents and data from five other systems so they can provide their users with the information they need in each tailored interface, without requiring them to log in to these other ERP, HR, Finance, and Asset Management systems and then try to track down this information for themselves. This sync also enables them to leverage OpenText's Records Management on all of this critical content, helping ensure that they meet their regulatory mandates while offering their users a central point of truth for this business-critical information. That is what I call making ECM easy for your users.
Here is a quote from our discussion that summarizes the benefits they have realized: "We've used WebReports, the foundation of Content Intelligence, since 2004 to deploy tailored solutions for each department's needs to ensure quick user adoption and greatly increase productivity and ROI. Over the years, we've been impressed with the effortless upgrades, especially given our complex content and data integrations with systems like Oracle eBusiness and Maximo. It is evident that Content Intelligence will help customers get an immediate head start in visualizing and managing Content Server information and creating their own tailored applications, views, and integrations." – Melanie Bodiford, ECM Technical Lead, Supervisor Application Support at Santee Cooper

Please have a look at the Content Intelligence Executive Brief to learn more about this solution and the user adoption and productivity challenges that it is helping to address.

Read More

Digitize, Transform, Accelerate the Bank

The Digital Enterprise. A buzzword trending in financial services in 2014, but what is it all about? According to many, the Digital Enterprise is an organization that uses technology as a competitive advantage in both its internal and external operations. At OpenText GXS we recently identified three trends developing across wholesale banking that enable transformation into the Digital Enterprise. Below is a summary of our findings collected over the last few months, which puts in perspective how financial institutions should digitize, transform and accelerate their business outcomes.

1 – Banks are under continued operational optimization pressure to improve cost-income ratios

Business process optimization and automation have transformed manufacturing and other industries, whereas Financial Services has lagged behind the curve. How can a bank deliver process optimization with fragmented legacy technology? It probably boils down to the ability to scale services through self-service and standard components. At the same time, de-coupling legacy architecture and compliance will remove the major restrictions to flexible client servicing. Standardization of the components and features enables the bank to reduce operational costs and creates the platform for future process automation based on standard software. This may include:

- Enabling the delivery of cross-product functions as services: deliver services based on common features across products in the same way that car manufacturing uses single components to deliver multiple models.
- De-coupling the application layer of product functions, compliance functions and generic business services to ensure that Compliance and Regulation have less impact on product development based on customer needs.
- Increasing customer self-service and consumer experience; corporate services lag retail services in this area.
- Reducing customization per client.
- Increasing use of economies of scale from technology partners and standard software products over home-grown legacy, with differentiation based on service delivery.

2 – Bank clients are demanding consistent multi-channel delivery of services ('Any product over any channel') to improve the Customer Experience

Historic product silos, from both technical and business management perspectives, have led to an architecture that does not lend itself to delivery of feature/function across product lines, let alone optimizing them for the customer experience across any channel. Enabling products to be used over the customer's preferred channel is a significant challenge. How are banks delivering against this goal today, and what mechanisms are available to enable a multi-channel experience? Client demand is to deliver services at the point of process need; for example, based on the corporate client and their very own individual process. 'Any product over any channel' will become a prerequisite for service delivery, although most banks currently treat each channel as a separate silo rather than looking at an integrated channel strategy. A few leading transaction banks are considering placing the customer in control of the interface. Focus is shifting from channel to digital experience. This trend is driving towards a Digital Experience organizational unit that would overlay a layer on Channel Services to improve the client digital experience, a precursor to Cloud Management units. The first step towards achieving this goal is widely accepted to be the deployment of a single gateway architecture as a framework for future product and service delivery.

3 – Corporate clients will want to consume banking services in the Cloud

How does a bank respond to corporate client requirements for banking services that align to the Corporate's business processes delivered in the Cloud, while maintaining a stable and standardized operational model? 
The need continues for banks to increase Transaction Revenues as interest-rate products continue to yield lower returns. Decreasing operational costs and increasing customer self-service and satisfaction will always be the outcomes relevant to the business case, but they will be achieved through entirely new mechanisms:

- Cloud services create the market and network opportunity to deliver bank services in the marketplace
- Service-oriented models create the opportunity for low-cost standardized delivery, whilst enabling the customer to tailor use and self-serve

Overall, Cloud and consumable web services also create a number of challenges, similar to the dawn of e-commerce and B2B integration a few years ago. The banks need to think beyond new product development and innovation and make step changes towards the Digital Enterprise. To conclude, let's recognize that this Digital Transformation will largely change the way products are developed and the way banks will leverage innovation to differentiate. Equally, new challenges arise with compliance and data risk ("Who owns what where?"), while commercial models need to be adapted ("How are these services charged for, and how are the necessary counterparties compensated?"). If you have some ideas and would like to share your thoughts, I'd welcome your views!

Read More

Shoe Shopping, Supply Chains and One Very Happy Customer

Followers of Gartner's yearly Supply Chain Top 25 list may have noticed signs of progress over the last few years toward a responsive, demand-driven supply chain. As a student of the supply chain, I watch to see who makes the list and what market advances it highlights for suppliers and retailers. Last week, though, I got to experience the progress firsthand.

An Omnichannel Supply Chain in Action

Gartner's list identifies 25 companies that "best exemplify the demand-driven ideal for today's supply chain" and shares their best practices. There are some remarkable advances being made by supply chain leaders — whether or not they are on the Top 25 list — and sharing best practices is a good way to improve across the industry. And while all boats rise with a rising tide, the seas are much more turbulent these days in our omnichannel world. Supply chains need to align or get caught in a new perfect storm. Consider this scenario: You're walking through the mall and see the perfect pair of shoes. You know, the ones you noticed on the fashion website and then on your Facebook page (because Facebook always knows what you are thinking). There they are in the store window, on sale, so you pop in. You ask for your size in black to go with a new suit you just bought. CATASTROPHE! They only have them in blue in your size. You try them on. They fit like a dream and look great on, so now you really, really want them. They are no longer just an idea; they are your must-have shoes, but they are the wrong color. A decade ago, a consumer who faced this out-of-stock situation at their retailer might be told to wait until the next shipment arrived, or a check might be done and a suggestion made to visit another store location. More recently, the consumer might have gone online, found the shoes and perhaps purchased them from a different retailer. 
Consumers expect more these days, and retailers have to step up to a seamless customer experience in order to keep the sale and maintain a delighted customer. In this case the customer got exactly what she wanted, and the retailer did too. Here's the second half of how this scenario played out: The salesperson immediately says she can check for the shoes' availability online. Yes, they have them, and will ship them to your home with no delivery charge. They arrive within the week — right size, right color, same wonderful sale price! And yes, they look fabulous with your new suit. Since I am a true supply chain geek, I wondered what went on behind the scenes in this scenario. It turns out the retailer in this story — Macy's — has been investing over the past years in technology and approaches to optimize for omnichannel. They are outstanding in this endeavor and have been recognized by analyst and industry organizations alike for what they do and how they do it.

The 30-Second Definitive History of Retail Supply Chain Excellence

The retail industry was one of the first to embrace electronic data interchange (EDI) to automate and gain efficiencies in the supply chain. To this day, EDI and its progeny continue as the backbone of the business networks that bring product from suppliers through to the point of sale — from ordering, packaging and shipping to payments and returns. Macy's received an excellence in supply chain collaboration award from VICS (Voluntary Interindustry Commerce Solutions Association) several years back for a MyMacy's logistics initiative to reduce the complexity, cost and time required to process vendor routings. More recently, a Gartner survey reported that Macy's major supply chain focus, as well as overall industry priorities, had shifted from reducing costs to initiatives for customer satisfaction and growth. And last year it received the Customer Engagement Award for Digital Technology at the National Retail Federation's Annual Conference. 
Underlying this journey for Macy's and other leading retailers are the technologies that enable labor and cost savings, cycle-time reductions, streamlined inventory management, improved cash flows and better decision making. That same technology is now being applied to drive end-to-end supply chain visibility. Much of supply chain execution these days is predicated on visibility: tracking consumer behaviors in planning cycles, then adapting inventory and fulfillment tactics to match demand at any channel.

Adaptive Approaches Maximize Inventory at Any Channel

Seventy-eight percent of digital shoppers "webroom" — research online before heading to a store to purchase — according to a recent Accenture survey. Alternatively, some store trips lead to a digital purchase. The same study found that 72 percent of respondents "showroom," or buy digitally after seeing a product in a store. Consumers have merged online and offline into a single shopping experience, and they expect retailers to align with that world view. Yet Accenture's benchmark study of retailers' readiness to deliver seamless customer experiences found 81 percent reporting absent or underdeveloped capabilities in tailoring assortment, pricing and shopping occasion to customer expectations across channels. Macy's has been working to make sure those capabilities are in place, creating an agile supply chain that adapts to customer behaviors. It has built multichannel commerce strategies and made technology investments against demand-supply point combinations. For example, the buy-in-store/deliver-to-home shoe scenario described above might be one combination; another might be web order/store pickup. Macy's is turning retail outlets into multichannel fulfillment centers. It is also fulfilling orders directly from its supply chain partners' DCs (Distribution Centers). In a recent interview, Macy's Group VP of Logistics shared that Macy's uses a combination of lean manufacturing and forecasting, along with an omnichannel strategy. 
It focuses on fulfilling customer needs through existing inventories instead of purchasing new ones. Macy's customers are twice as likely as other online buyers to have researched a product in its stores before purchasing it online. And Macy's cashes in on this: it makes sure that customers can place their orders anywhere, through any device. Macy's is also investing millions with vendors, in joint decision making in the choice of "inventories that haven't even … come into their possession." The shoes in my earlier scenario were sent directly from the vendor, shipped to the consumer, and "sold" to Macy's. This required both a solid backbone and enhanced management capabilities to keep better track of real-time perpetual inventory levels and dynamically allocate inventory across channels to match ever-changing customer behavior and demands.

Same-Day Delivery May Be a Battle, But Profitable Customers Win the Day

While Google, Amazon and Walmart have focused on customer shopping behaviors and battling over shorter and same-day deliveries, forward-looking retailers will ultimately need to focus on winning "profitable customers." In his recent blog post, Gartner analyst Robert Hetu shared his own supply chain shoe story. He pointed out that in the pursuit of multichannel excellence, "one of the most challenging aspects has been accurate and flexible inventory that can be maximized to meet customer demand regardless of channels." Robert ordered two pairs of dress shoes from macys.com, and they arrived simultaneously at his door — one pair came from the Galleria Mall near Pittsburgh, Pa., while the other came from Willowbrook Mall in Wayne, NJ. "Flawless execution of an e-commerce order fulfilled from store inventory and shipped to my door. And the shipping was free! 
I am a delighted customer." Robert also wrote about the next challenge for high-performing supply chain companies: "After years of perfecting a supply chain that delivers to stores, retailers are faced with an entirely new challenge. Supply chain optimization, along with assortment optimization, are two critical elements of the go-forward multichannel strategy. The complexity of layered costs will require a new way of accounting and assessment at a customer level. The question is: how profitable am I as a customer of Macy's?"

In the End, It's All About the Shoes

I have no doubt that we will continue to see paradigm shifts in supply chain strategies. I'd like to end this article with some thoughts on the amazing results that supply chains like Macy's are achieving in the new world of omnichannel retailing, and share my prescient remarks on the challenges ahead. But at the moment all this supply chain talk is just background noise, because I really need to go try on my new suit with my truly AMAZING new shoes! This story first appeared in CMSWire.

Read More

What Should You Do with Your Legacy Archiving System?

Guest post by Richard Medina. First published on RichardMedinaDoculabs.com. This post provides a simple procedure to help you decide what to do with your big legacy archiving systems – if you have big legacy archiving systems. By archiving system I mean a big COLD system, enterprise report management system, IDARS, mainframe output management system, or even a big image repository – really any big repository that manages fixed content that's infrequently accessed, and which might be used for long-term archiving of historical data, customer service support, electronic bill presentment, management and distribution of report data (mainframe output, financial reports) and a few other applications. You should be evaluating your archiving systems. On the one hand, they may be among the most reliable content management systems in your enterprise – they may be one of the few fixed points as you update your ECM or information management strategy. On the other hand, some of them aren't reliable, some have trouble scaling, most cannot provide the capabilities you need (like adequate records management), and many suffer from product and vendor risk – you shouldn't depend on either the products or the vendors to be there for you in three years. It's also possible that you don't even need a big archiving system. So what should you do? You have three options with respect to your legacy archiving system: 1) stay with your legacy archive; 2) replace it with a system that is as scalable, as complex, and possibly as costly; or 3) replace it with something different and probably less dramatic (e.g. using an existing ECM system to manage the legacy archive content).

Evaluate Your Existing Applications

To determine which of these options – maintain or replace – makes sense for your organization, consider the following four variables:

1. Volume: Storage volume, including the number of reports, size of reports, and throughput of content stored on your legacy archive.
2. Complexity: The complexity of the reports stored on your legacy archive and the number (and complexity) of integrations between the legacy archive and other business systems.
3. ECM and Records Management Requirements: Your organization's requirements for retention, disposition, legal holds and releases, and library functions regarding content stored on your legacy archive, as well as any workflows incorporating content from the archive.
4. Financial Analysis: The financial implications of staying with your legacy archive versus replacing it.

Now we can map out the possible scenarios and get close to the best decision for your organization. The decision matrix is based on the four variables above, and it presents a total of sixteen possible scenarios. Note that Legacy Archive in the table means a big, complex archive. That's a bit vague, so if you want to discuss your particular situation or particular products, drop me a line – rmedina@doculabs.com, 312-953-9983. Also note that ERM means enterprise report management: basically, archiving systems that focus on mainframe report data like bills, financial reports, etc. So evaluate your legacy archive applications according to each of the above criteria: the volume of content stored in the system, the complexity of both your reports and your integrations, the depth of your organization's ECM and RM requirements, and the results of your financial analysis. Your answers should lead you to the minimal prudent solution that will meet your requirements. Keep in mind, however, total cost of ownership (TCO) considerations, as a more complex solution can sometimes be more cost-effective over the long term. While the final decision concerning your future approach to enterprise archiving will warrant more detailed analysis, this matrix will help you approach the evaluation of your existing legacy archive applications, and can serve as a first cut at what your own enterprise archiving strategy should look like.
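The four-variable evaluation above can be sketched as a simple decision helper. The post's actual sixteen-scenario matrix is not reproduced here, so the rule below is only an illustrative first cut: the class, function, and threshold choices are my own assumptions, not Doculabs' matrix.

```python
from dataclasses import dataclass

@dataclass
class ArchiveProfile:
    """Coarse yes/no answers to the four evaluation variables."""
    high_volume: bool               # 1. Volume
    high_complexity: bool           # 2. Complexity (reports + integrations)
    deep_ecm_rm_requirements: bool  # 3. ECM and RM requirements
    replacement_favorable: bool     # 4. Financial analysis favors replacing

def first_cut_recommendation(p: ArchiveProfile) -> str:
    """Illustrative mapping from the four variables to the three options."""
    if not p.replacement_favorable:
        return "1) stay with your legacy archive"
    if p.high_volume and p.high_complexity:
        return "2) replace with a comparably scalable archiving system"
    # Lower volume and complexity usually point to a less dramatic
    # replacement, e.g. folding the content into an existing ECM system
    # (which can also cover retention, disposition, and legal holds).
    return "3) replace with something less dramatic (existing ECM system)"
```

For example, a low-volume, low-complexity archive whose financial analysis favors replacement lands on option 3, the "minimal prudent solution" described above; the TCO caveat still applies, since a more complex solution can win over the long term.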

Read More

BIRT iHub F-Type Generates Analyst and Press Buzz

Since BIRT F-Type launched on July 10, the analysts and press who cover the personalized analytics market have been buzzing about Actuate’s free server for BIRT reports. These are just a few of their comments. “The interactive reporting capabilities are a strong-point of the platform … and a core aspect of self-service BI that many vendors are overlooking.” Cindi Howson, BI Scorecard. “Looking at the UI, report creators and even users will have the ability to decide how a report is modified, copied, stored and even limit who can send it on. With the ease that data is currently leaked out of many organisations, this will make a lot of companies very happy, especially as they continue to deploy analytics solutions and highly sensitive data to an increasing number of people inside their businesses.” Ian Murphy, business-cloud.com. “Users can run the software to produce up to 50MB of output each day, which can be enough to produce hundreds of reports, with copious tables, charts and other visualizations. … The results of an analysis can be exported into an Excel spreadsheet or embedded into a Web page through a JavaScript programming interface. The Web report can even have controls that the user can manipulate to further scrutinize the data presentation.” Joab Jackson, PC World. “BIRT iHub F-Type provides the industry’s most complete and powerful deployment infrastructure for delivering BIRT content. Developers using BIRT iHub F-type will be allowed 50MB of data free of charge on any given day, with two overages allowed per month. The customers can pay for additional bandwidth by credit card if and when it is required. The users can schedule, store and manage BIRT content, add interactivity for BIRT Reports, collaborate and share content with users, eport [sic] BIRT content to Live Excel, PDF, HTML, XML, and more.” Mandira Srivastava, infoTECH Spotlight. 
"The new software boosts open source BIRT developers' productivity through free access to the features and power of the commercial BIRT iHub enterprise-grade deployment platform, with metered output capacity." Information Management. BIRT iHub F-Type is available free today from the BIRT iHub F-Type download page, and is distributed under Actuate's commercial licensing terms. There are some limits on capacity (50MB of output capacity per day), but upgrades are priced starting at $500 per month for an additional 50MB. And because you read this blog, Actuate is offering an additional 50MB of daily capacity until the end of 2014. For additional information, check out "All You Need To Know About BIRT iHub F-Type." Complete coverage of BIRT iHub F-Type can be found on Actuate's BIRT Buzz page. "Buzz" image by Sean MacEntee.
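The metered-capacity terms described in this post (50MB of free output per day, with two overages allowed per month before paid upgrades kick in) can be summarized in a small sketch. This is only my illustrative reading of the published limits, not Actuate's actual metering logic:

```python
def within_metered_capacity(output_mb: float, overages_used: int,
                            daily_cap_mb: float = 50.0,
                            max_overages_per_month: int = 2) -> bool:
    """Return True if a day's output fits the free tier as described.

    Illustrative only: output at or under the daily cap always fits;
    output over the cap consumes one of the month's allowed overages.
    """
    if output_mb <= daily_cap_mb:
        return True
    return overages_used < max_overages_per_month
```

On this reading, a third over-cap day in the same month would require paying for additional capacity, priced from $500 per month per extra 50MB.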

Read More