Information Management

RM: Information Governance Superhero-in-Waiting

Ahhh, good ol’ Records Management. If, like me, you’ve been working in large corporate environments for 10 years or more, just saying the phrase undoubtedly conjures up the classic records management stereotypes you picked up in your early days: windowless basements stacked with banker’s boxes and rows of filing cabinets; obsessive crews of process-oriented quasi-librarians meticulously assigning a file number to “official” corporate documents that may never be needed again. Yes, it was all necessary, but it was also all seen as somewhat inconsequential and certainly not as a key business driver.

DILBERT © 2004 Scott Adams. Used by permission of Universal Uclick. All rights reserved.

Well, guess what, folks. Turns out there was something to all that detail-driven classification, retention scheduling and archiving. The regulatory and legal climate of the business world has changed drastically: the systematic management of corporate records has become a core element of compliant, defensible Enterprise Content Management (ECM). Accordingly, the practice of records management has evolved from marginalized afterthought to essential survival tool: an organization’s very existence can hinge on its ability to provide timely, accurate responses to compliance, regulatory or discovery obligations. Talk about going from zero to hero in the span of a decade!

Yet it still amazes me that the generations of thought leadership, technology and best-practices development that have gone into managing the lifecycle of official corporate records are often not applied to the vast volumes of additional unstructured data organizations now possess. Think about it: it’s the aforementioned traditional business records that have historically been most likely to be targeted in compliance and discovery reporting. It only makes sense to extend the time-tested, rigorous principles applied to the most scrutinized corner of the business to the enterprise as a whole.

The concept of information archiving is a perfect example. Organizations have become extremely proficient at generating and collecting information. From emails to marketing collateral to financial statements and beyond, it’s growing at staggering rates. But what are many organizations doing with it once it’s served its purpose? In the absence of a thorough information governance program, they’ve defaulted to dumping it en masse into an archive. Now, they can compress, de-duplicate and encrypt to their heart’s content but, in very short order, they’re going to want to cost-effectively access and retrieve pieces of that information in its original context, something that’s next to impossible unless well-planned information governance policies were in place beforehand.

Chances are the optimal solution could already be roaming your hallways. By bringing your records manager and their knowledge base into the loop, and having them collaborate with IT, legal, and compliance on a comprehensive information governance program, your organization can extend pre-existing best practices to all its enterprise information and define classification, retention, preservation, and disposition parameters in an easily accessible, defensible structure.

Optimal InfoGov Adds Value Across an Enterprise

Enterprise archiving is just one of the areas where this cross-functional collaboration can be hugely beneficial to an organization.
In the modern business environment, compliance and legal requirements obviously play a huge role, but effective information management also adds significant value to product development, process improvement, disaster recovery and more. To achieve this, every email, every R&D report, every accounting ledger, every HR comment, every piece of unstructured content needs to be subject to the same fastidious processes as the most formal organizational business record.

If you’re a CIO, information architect, or IT strategist, a great first step toward developing a comprehensive Enterprise Information Management (EIM) strategy is thoroughly familiarizing yourself with the practice of records management and the principles behind it. Yes, you may have barely noticed the records management specialists at the company holiday party all those years ago, but times have changed. Great things are waiting for those who think outside the (banker’s) box.
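As a rough illustration of what such “classification, retention, preservation, and disposition parameters” might look like in machine-readable form, here is a minimal sketch; the policy format and field names are hypothetical, not drawn from any particular product.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical illustration: a retention rule as records managers have long
# defined them, extended to any class of enterprise content.
@dataclass
class RetentionRule:
    record_class: str     # e.g. "Finance/Invoice"
    retention_years: int  # keep this long after the trigger event
    disposition: str      # "destroy", "archive", or "review"

def is_due_for_disposition(rule: RetentionRule, trigger: date,
                           on_legal_hold: bool, today: date) -> bool:
    """True once the retention period has lapsed and no hold applies."""
    if on_legal_hold:  # preservation obligations always trump the schedule
        return False
    return today >= trigger + timedelta(days=365 * rule.retention_years)

rule = RetentionRule("Finance/Invoice", 7, "destroy")
print(is_due_for_disposition(rule, date(2005, 3, 1), False, date(2013, 6, 1)))  # True
```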


How B2B Solutions Provide the ‘Cornerstone’ to Improved Supply Chain Resilience

In my previous blog entry I discussed how Supply Chain Directors need to focus more on improving resilience across their supply chain and logistics networks. According to a report from Accenture, the correct and increased deployment of ICT infrastructures can contribute significantly towards improved supply chain resilience. GXS offers B2B solutions and services that can provide the flexibility and scalability that Supply Chain Directors need to minimise disruptions across their supply chains. In fact, cloud-based B2B integration solutions can provide a key ‘cornerstone’ for developing a highly resilient end-to-end supply chain. B2B solutions can help build supply chain resilience in three distinctive ways:

Provides Flexibility – One of the challenges faced by companies during a period of disruption is ensuring that business-related transactions supporting production or other processes continue to flow across the extended enterprise. If disruption occurs, an alternative path may have to be identified quickly to allow transactions to flow uninterrupted to their intended destination. In terms of a B2B infrastructure, companies should ensure that if disruption occurs and data centre resources become impacted, they can instantly fail over to an alternative data centre location. If no other data centre is available, then some form of disaster recovery process should be in place to ensure that transactions can be recovered as and when the data centre comes back online. GXS Managed Services, GXS’ outsourced B2B environment, utilises a highly available, dual-continent data centre infrastructure that provides immediate failover and backup to an alternative data centre, should supply chain disruption occur. Using state-of-the-art storage devices and network routers, GXS can ensure that your business-critical transactions get from A to B with no interruption or delay. In addition, the GXS Managed Services environment is highly scalable and flexible, allowing your B2B infrastructure to expand or contract as your business requirements change. Many companies are beginning to spread their production risk by manufacturing goods in more than one country, so that if one plant is impacted, production can be quickly ramped up at the other locations. In a similar manner, GXS Managed Services provides an environment to spread your B2B risk so that you can continue working with your trading partner community during a period of disruption. Companies managing their own B2B environments via software-based solutions will not enjoy this level of flexibility or reliability.

Enhances Visibility – The manufacturing sector is probably the most global in nature compared to other industries, and true end-to-end visibility across a supply chain is an important requirement to ensure production lines do not get disrupted. Whether sourcing parts from suppliers in China, Brazil or India, having end-to-end visibility of all shipments can make a big difference to the smooth operation of today’s production lines. In fact, in the automotive sector, Just-In-Time production techniques depend on the timely delivery of parts, and any delay in delivery can result in severe penalties for the suppliers concerned. So when an unexpected earthquake or flood occurs, having visibility of what is happening across your supply chain becomes more important than ever before.
You may need to identify alternative sources of inventory, quickly locate goods in transit so that they can be re-routed to an alternative location, or gain visibility into cross-border shipments. GXS Active Logistics provides a single, cloud-based platform to monitor all global shipments irrespective of the logistics carrier or the mode of transport being used for the shipment. Being able to monitor shipments across multi-modal transport networks anywhere in the world, and keep track of shipments as they pass through busy container ports such as Dubai or Singapore, can have a major impact on your supply chain operations. GXS Active Logistics also allows key performance indicators (KPIs) to be monitored across your 3PL providers, giving you the ability to identify points of weakness across your supply chain. Many companies have established control towers across their operations to monitor supply chain shipments, and GXS Active Logistics forms a strong foundation for such an environment. With so many disruptions occurring across today’s supply chains, these visibility control towers are doubling up as crisis management centres and becoming the go-to point of contact during a period of disruption.

Improves Communications – One of the key areas where B2B tools can help minimise supply chain disruption is in supplier management, otherwise referred to as community management. After all, when some form of disruption strikes, you need to make sure you can remain in contact with your trading partner community or identify alternative sources of production materials very quickly. Community management tools increase resilience in two key ways. First, they can help establish what can best be described as a centralised B2B contact directory, typically a self-service environment where all contact information for a particular supplier can be held. As a condition of doing business with some companies, suppliers are required to enrol through a workflow-driven registration process to ensure that they provide all the relevant contact information. Second, community management tools help to improve command-and-control processes across the supply chain. They also allow you to quickly assess the state of your supply chain by identifying which suppliers have been impacted following a period of significant disruption. Because suppliers provided their contact information during the registration process, community management tools allow companies to send out mass communications to a trading partner community with ease. You could also ask suppliers to complete online assessments to allow you to monitor the state of your supply chain. If one supplier is impacted in some way, alternative suppliers can be identified quickly thanks to an up-to-date contact database. Dual sourcing of key components is becoming a popular method of combatting supply chain disruption, and community management tools help to orchestrate the selection of alternative suppliers with ease. Community management tools help to minimise risk across a supply chain. Companies deploy Microsoft Office for creating business documents and implement ERP systems for managing production and business processes; in the same way, community management tools establish a dedicated IT infrastructure to help assess, manage and mitigate supply chain risk. When was the last time your company undertook a risk assessment of your supply chain? How long did it take to complete? Did every supplier participate in the assessment?
The ability to assess the state of your supply chain via simple-to-use community management tools brings a significant competitive advantage during a period of supply chain disruption. GXS Active Community is a community management tool that can help to build greater resilience to future supply chain disruptions. GXS Trading Grid®, the world’s largest cloud integration platform, has traditionally focused on moving data from one location to another. GXS Active Community, on the other hand, has been designed from the ground up to manage the people-based interactions across a supply chain, an area that many companies tend to forget when extending or restructuring their supply chains. From establishing a centralised contact database for your trading partner community through to deploying some form of assessment or survey, GXS Active Community improves the way in which you engage with your suppliers on a daily basis. Whether working with contract manufacturers in China, distributors in Eastern Europe or third-party logistics providers in South America, GXS Active Community provides a single global platform that can help improve the flow of information across an extended enterprise.

It is one thing to deploy these B2B tools to help build increased resilience, but when disruption strikes a supply chain there needs to be a co-ordinated approach to resolving any issues that arise. One increasing trend that I have noticed on professional networking web sites such as LinkedIn or Xing has been the rise of the business continuity manager role. This person becomes the go-to employee during a period of disruption, the central person responsible for steering a company through a period of supply chain disruption. Sometimes referred to as the ‘Masters of Disaster’, these people are key to making today’s supply chains operate efficiently and seamlessly. The ability to proactively monitor supply chains, and almost sense when disruption is likely to occur, has become a key competitive weapon that companies have been keen to acquire. Increasing regulatory compliance has also made the business continuity manager more visible in today’s extended enterprise. In the same way that a conductor controls an orchestra, the Master of Disaster will use the B2B tools at their disposal to monitor global supply chains and take appropriate action before major disruption can impact the business. Have you appointed a Master of Disaster yet?

In summary, to build increased resilience across a supply chain, companies need to address both physical and digital supply chain issues. They need a B2B platform that is scalable, flexible, secure and available 24x7x365. If B2B solutions are deployed correctly and proactively, they can offer your business a significant competitive advantage; being unprepared, or merely reacting to disruptions as they occur, is generally bad for business. If you would like further information on why companies should be thinking about building further resilience into their supply chains, please review THIS WEBINAR, which discusses some of the topics covered in this and my previous blog entry in more detail.
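As a sketch of the community-management pattern described above – a central contact directory plus a mass assessment after a disruption – here is a minimal, hypothetical example. The directory structure and function are illustrative and not the GXS Active Community API.

```python
# Hypothetical sketch of community management: a central contact
# directory plus a broadcast assessment sent to an affected region.
suppliers = {
    "ACME-PLASTICS": {"email": "ops@acme-plastics.example", "region": "APAC"},
    "BETA-METALS":   {"email": "duty@beta-metals.example",  "region": "EMEA"},
}

def broadcast_assessment(directory: dict, region: str, question: str) -> list:
    """Send the same status question to every supplier in the affected region."""
    notified = []
    for name, contact in directory.items():
        if contact["region"] == region:
            # In a real system this would send an email or portal survey.
            print(f"To {contact['email']}: {question}")
            notified.append(name)
    return notified

impacted = broadcast_assessment(suppliers, "APAC",
                                "Can you ship this week's orders on schedule?")
```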


Building “Mini Cooper” apps: How ISVs use BPM PaaS to build Smart Process Applications

In a previous blog you may recall that I described how the cloud is a catalyst for innovation and differentiation by applying the principles of Mass Customization in IT. I used the example of the Mini Cooper, whose personalization options give you that “hey… never seen this one before” feeling every time you come across one. (BTW: Mini doesn’t sponsor me, and I myself drive a pretty nondescript grey MPV. The concept just intrigues me.) And here is why: last week we held a webinar with John Rymer from Forrester and Uday Vijayan from Excel Software about building and delivering the next generation of cloud-enabled business solutions.

John Rymer started off describing the challenge facing software developers (both at ISVs and in enterprises): 78% of them feel that the pressure to deliver faster, and 79% that the pressure to deliver more, has made work harder. One of the key reasons why cloud platforms are chosen is actually to deliver fast. John continued to explain that faster time to market means two things: fast delivery requires the possibility of customization, and prompt updates and changes must be “designed in”. So if you’re building applications on a cloud platform, it’s pretty key that the platform supports you with options for continual improvement. How far can SaaS applications be tailored, customized and personalized without losing the advantages of economies of scale and lower unit cost that come with the cloud? In other words, can I build “Mini Cooper” apps that have that appeal and the options for customization, personalization and tailoring the user experience at affordable cost? Needless to say, the more support you get out of the box from the cloud platform, the easier it will be for an ISV or enterprise to take advantage of those options at minimal cost and burden. So far the theory.

Uday Vijayan, founder and CEO of Excel Software, delivered the second part of the webinar, and it intrigued me that they are applying exactly these principles leveraging the Cordys platform. Excel Software are based out of India and focus on building applications for Supply Chain Management, MRP and Finance. They started the journey of migrating their applications to Cordys a while ago and just recently launched one of their core applications, Medico Online, as a SaaS application. Uday explained how they pushed the Cordys platform to the extremes to validate whether the concept they wanted to apply could actually stand the test. Medico Online deals with sales and distribution in the Pharmaceutical, Food and FMCG industries, and many of Excel Software’s clients require seamless integration of Medico Online with their ERP back-end. For example, Excel built a single transaction on Cordys that could write more than 72 (!) transactions into the client’s global ERP system without compromising any of the system controls – in fact, adding far richer domain-specific functionality to the ERP.

Uday unveiled a pretty interesting view of the architecture of the application. Rather than choosing a “classical” 3-tier architecture, Excel Software leverages the Cordys platform to build their applications in five tiers. The three tiers of UI, Application Logic and Database are extended with a Process tier and an Integration tier. The two additional tiers are used as and when relevant, so you’re not forced into a straitjacket here.
The Process tier allows you to build in flexibility and collaboration by design, particularly when combined with business rules and business activity monitoring. The Integration tier supports you in establishing seamless integration with other applications and services, whether from the cloud or on premise. From a design-time perspective we offer design artifacts for all the “tiers” in one integrated, browser-based Collaborative Workspace (CWS). The Collaborative Workspace also covers the application lifecycle management aspects of deploying and upgrading applications. Applications in Cordys are collections of various artifacts that are managed through the Cordys Application Packaging functionality in CWS. Application packages can be deployed at tenant level or sub-tenant level (the latter are called organizations in Cordys). Cordys supports organization-level styling and customizations, down to the level of individual design artifacts. So this gives ISVs and enterprises all the flexibility to do application configuration and personalization, up to the level of schema extension, while leveraging the standard application that is shared across organizations and tenants. Cordys manages the dependencies within the standard applications, and between the standard and customized artifacts at organization level, respecting those customizations during upgrades.

For Excel Software this translates to very tangible benefits. The CEO shared that Excel Software reduced TCO substantially and, more importantly, that this approach has cut the implementation time of their applications five-fold, dramatically accelerating time to market. This is what I meant by the closing remark in my last blog that we’re seeing ISVs adopt this approach with great results. I will unveil more of the platform secrets that enable speed to production and continual improvement options in my next blog, about Cordys Cloud and the benefits of multi-level multi-tenancy. Stay tuned.
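The organization-level customization model described above can be pictured with a small sketch: resolve each artifact by preferring an organization’s own override and falling back to the shared standard package. This is a hypothetical illustration of the general pattern, not Cordys code.

```python
# Hypothetical sketch: resolve an application artifact per organization,
# preferring an organization-level customization over the shared standard
# package, so upgrades to the standard never clobber local overrides.
standard_package = {"invoice_form": "standard v2", "approval_flow": "standard v2"}

org_overrides = {
    "acme":   {"invoice_form": "acme-branded v1"},  # customized artifact
    "globex": {},                                    # uses pure standard
}

def resolve_artifact(org: str, name: str) -> str:
    """Organization override wins; otherwise fall back to the standard."""
    return org_overrides.get(org, {}).get(name, standard_package[name])

print(resolve_artifact("acme", "invoice_form"))   # acme-branded v1
print(resolve_artifact("acme", "approval_flow"))  # standard v2 (shared)
```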


Big Data and Information Governance in Life Sciences

The clamor around big data’s applications and how best to tame them was one of the key themes that emerged at this year’s BIO International Convention, held in Chicago. Hosted by the Biotechnology Industry Organization (BIO), the event drew 13,594 industry leaders from 47 states, the District of Columbia, Puerto Rico, the U.S. Virgin Islands and 62 countries. I was there and fortunate to have the opportunity to gain first-hand insight into the industry’s challenges and opportunities, as well as the strategies and tactics that forward-thinking companies are using to extract value from Big Data and improve therapeutic options for patients.

The volume of data generated by all aspects of life sciences is staggering. Estimated at around 150 exabytes in 2011, this almost inconceivable aggregate of information has been increasing at a rate of 1.2 to 2.4 exabytes per year. To put this in context, one estimate holds that all words ever spoken by human beings could be stored in five exabytes of data. And just how much is an exabyte? An exabyte is 10^18 bytes, or one million terabytes. Truly staggering!

The incredible vastness of this information presents an unprecedented set of opportunities for the biopharma enterprise, particularly around personalized medicine and companion diagnostics. It also presents some stiff challenges regarding information governance: all this data must be captured, curated, stored, searched, shared, transferred, analyzed, and visualized while maintaining 21 CFR Part 11 compliance.

The scene at BIO was set in the first day’s Personalized Medicine and Diagnostics Forum. One of the sessions, Big Problems Need Big Solutions – Fixing the Health Care System Using Big Data, addressed the potential value biopharmaceutical companies could realize by exploring the databases of payers, hospital groups, and CROs, in addition to their own datasets. An integration of these datasets would allow researchers to determine which patients will benefit from certain drugs and which could experience side effects. It would also assist in identifying patterns and causality in complex diseases, potentially feeding back into the drug development process.

How might an integration, synthesis, and governance of data such as this be achieved? OpenText specializes in providing Enterprise Information Management (EIM) solutions that can help life sciences organizations deal with the challenges of Big Data. From Electronic Lab Notebooks and LIMS to FDA filings, OpenText Pharmaceutical & Life Sciences solutions ensure compliance with federal regulations for data governance while helping solve critical process challenges.

What about the future? New ways of delivering solutions are being explored. Is innovation in the cloud the answer – might it be the new “space” to accelerate scientific discovery and development? Subsequent sessions at BIO delved into this question and others, exploring the trends, challenges, and best practices of drug development in the cloud as it relates to ecosystem partnerships, data analytics, and compliance.

As we work to implement current capabilities and aim toward future standards-based collaboration, the conclusion is clear: collaboration across biopharma, technology providers, and regulatory agencies will be essential to develop the standards, technology, and approaches necessary to critically evaluate data and generate useful information for improved health outcomes in patients. The taming of Big Data is not a simple task for Life Sciences, but the innovation that can flow from it is well worth the effort!
Now is the time to get on board and explore the possibilities.


Customer Experience Management: Art vs. Science

This is a CMSWire cross-post. There is a long-standing debate as to how much of customer experience management is science and how much is art. As you decide where you weigh in on the question, here are two customer experiences for your consideration. While one experience is a customer satisfaction triumph and a pleasure to share, the other tells a cautionary tale. They both illustrate the critically important role technology can play, blending science and art to create positive impressions and continuing customer loyalty.

Customer Experience Management is Personal

As followers of my CMSWire article series know, I often write about how technology, properly applied, can change how we do business for the better. Customer experience management lives in the oceans of data being generated every day in ever-increasing amounts. This Big Data offers competitive opportunities, giving companies new ways to listen, learn and respond to their customers’ needs. The technology I work with plays an important role in allowing companies to extract and apply value from Big Data. The results are especially powerful when connected with the customer experience. The goal is to forge relevant interactions that are both efficient for the business to deliver and at the same time personalized for the customer. And sometimes it gets VERY personal, because after all, we are all consumers. Here is a look at two of my recent customer experiences. You may conclude, as I did, that it’s not a question of art vs. science in creating extraordinary experiences, but rather how well you combine the two.

Customer Experience 1: Web to Store to #Delight

Saturday morning and I’m searching the web for, of all things, a replacement for a door threshold. Our kitchen walk-in cupboard threshold had splintered over the years and we decided we would replace it ourselves. This seemed a simple task, though in the spirit of full disclosure we are not a DIY family by any means. My web search for thresholds immediately reveals well-recognized possibilities from two of the leaders in home improvement. I begin to explore both and come to the realization that I am not only viewing this customer experience from my homeowner persona but also from my “day job” persona. I am indeed part of the trend of more business happening online, with product research increasingly moving online as the first destination. I notice that both sites are doing a good job of organizing and presenting the product information and images, and also providing relevant do-it-yourself video options. Then I realize I’m late for errands we have scheduled, so I abandon my laptop, head for the car, and continue my research on my iPhone en route (I am passenger, not driver). Whether on my laptop or on my iPhone, the respective web experiences are quite good. Clearly these companies use customer experience management technologies to create a rich and consistent digital presence across channels. Both have what appear to be responsive designs and good in-context channel features. Both web customer experiences immediately identify where My Store would be; one got it right on the money, but both are easy to adjust. Both offer some form of free shipping too, probably a nod to the online-only top competitor, whose advantage is most often fast shipping and lower pricing.
Both companies seem to have what we need at a comparable price, but we are not really certain we have found the right item, and we wonder what we would need to do to install it. Since we are out doing errands anyway, we decide to stop by the local Home Depot store. The store has the item its website displayed, and at the promised price. In person, it doesn’t look quite like the shape of the splintered one we had brought with us for comparison. Enter customer experience rock star Charlie, who leads the lumber department at the store. Charlie not only assures us that we have the right item, but he also offers to trim and notch the piece based on the old one so it will be simple to replace. But it turns out the notches need a saw they don’t have at the store, so Charlie TAKES THE PIECE HOME TO HIS WORKSHOP, cuts it and brings it back the next day, leaving me a message confirming that it is ready to pick up. I go on the Home Depot Facebook page to tell them about my best customer experience EVER and hear right back from them thanking ME. I don’t know if this is an isolated incident, but I do know Home Depot is doing something right these days. They have shown the strength of their brand and their channel management in the last three months, with overall sales up 13.9% and same-store sales up 7%. There was no extra charge for my Home Depot customer experience, though we certainly would have paid. The result for Home Depot is that they not only won this small transaction but now also have a customer for life. #DELIGHT

Customer Experience 2: Store to Mobile to #Fail

Customer experience management isn’t just for traditional retailers. Banking is increasingly coming to terms with the importance of coordinating channels to attract and retain retail customer business. Multi-channel is certainly necessary, but is no longer sufficient for success. Now omni-channel has hit the financial services world as a major factor in customer experience management. Cisco® refers to this new reality as the “Era of Omni-channel Banking,” explaining that it moves beyond the current approach, in which banks encourage customers to use the least expensive channel, to delivering a consistent and seamless customer experience across channels. The intent is to bring the industry closer to the promise of true contextual banking, in which financial services become seamlessly embedded into the lives of individual and business customers. Creating a positive end-to-end customer experience is at the heart of this approach, and banks are moving toward secure, integrated architectures to enable their channel-ready infrastructure. This future sounds great, but we still have a lot to learn in the here and now about holistic customer experience management. Case in point is my own customer experience that occurred on the very same day that Home Depot had so delighted me. I am shopping for pet food at a store we often frequent, and am at the check-out with a large order when my credit card transaction is denied by the system. I immediately receive a message on my mobile phone to call my bank’s security line, with a return number to call. Okay … this sounds very omni-channel and actually quite responsive and responsible. I’m a bit annoyed that my transaction was rejected, but somewhat mollified by the notion that my bank is being circumspect. This bank has a fabulous website in my opinion and a pretty good IVR system as well, and I viewed them as advanced in their approach to customer experience. So I return the call and provide the usual account and security information to the IVR system.
A human voice then comes on the line, clearly reading from a script, asking me for the same information I had just entered PLUS additional security information. I provide it. I am told my account is flagged and I’ll be transferred to a security specialist to ensure it is properly unblocked, and thanks for my patience. I am not particularly known for my patience, but I had come this far and would see it through, since we had two very hungry golden retrievers waiting at home. I am put on hold for a couple of minutes, and then another human voice, clearly following a script, asks me AGAIN for the SAME INFORMATION I had just provided. I ask why, since (1) they had called me originally and told me to call them; (2) they had then transferred me; and (3) I had already given them the information twice, so surely they already knew it. I am told that if I do not want to provide the information, my account will remain blocked. Seriously? So I complete the discussion, providing once again all the information they asked for and more, and my account is unblocked. Perhaps the worst of all in this experience was that the bank representative could not (or would not) provide a clear reason why they had determined to block my card at that moment, and they also left me with the warning that it might happen again. I quickly run my card and successfully complete the transaction before another random event, perhaps sun showers or gamma rays on the moon, might trigger a similar block. #FAIL

Perfecting the Science, Enabling the Art of Customer Experience

I spend a great deal of my professional time helping companies gain benefit from enterprise information management, the discipline of discovering, managing, extracting value from and building applications on top of unstructured enterprise information. EIM can bring together core technologies and applications within defined practices like Customer Experience Management. I have come to understand that translating information-based capabilities into value for the enterprise requires the right technology tool set and an approach that changes how the organization accomplishes work. When this is combined with behavioral science and artfully applied to the customer experience, it offers a simple, powerful route to improved customer satisfaction. McKinsey writes about “Using behavioral science to improve the customer experience,” noting that companies who care deeply about the quality of customer interactions invest heavily in responsive websites and in simplified call centers that enable customer choice. Yet many companies ignore what makes people tick. Banks, for example, often disturb the customer experience through poor menus on ATMs or ill-advised and ever-changing interactive voice response (IVR) systems. Other companies place too much emphasis on average handling times at call centers and not enough on the quality of the interaction. It doesn’t have to be that way. Academics, such as Professor Richard Chase at the University of Southern California’s Marshall School of Business, have used research on how people form opinions about their experiences to design actual services. Chase and his team set forth principles to consider when designing any customer interaction, including:

Get bad experiences over early, so customers focus more on the positive subsequent elements of the interaction.

Break up pleasure, but combine pain, for your customers, so that pleasant parts of the interaction form a stronger part of their recollections.

Finish strong, as the final elements of the interaction will stick in the customers’ memory.
I venture to say that each of these scientific principles was violated in my recent bank credit card interaction, and exceeded in my DIY experience. Customer experience, then, is certainly part science, part art, part technology and part human. And effective customer experience management is a competitive weapon, especially in industries where product differentiation is difficult, if not impossible. For example, my door threshold and my credit card products are fairly comparable across their respective vendors; the differentiation happens through the customer experience that is provided. The moral of my two tales is to ensure customer experience management capability is organized and implemented to be customer-centric and applied in the moment to optimize the “personal” customer experience. In fact, recent Aberdeen Group research on “Next Generation Customer Experience Management” shows that Best-in-Class companies:

Invest in CEM-related technology tools and solutions,

Create a unified view of customer data across the organization, and

Personalize product and service offerings based on customer data.

A new set of priorities and technologies is necessary to orchestrate what is required for more effective customer experience management, and we need to include all touch points, including the experience a customer has with company representatives. This is true for my consumer experiences and true in turn for the customers I serve, who are typically large firms that operate in multinational environments with multiple languages and want to reach their customers through a multitude of channels. To succeed, these companies will use a holistic technology and business approach that includes customer experience, information and communication management, as well as an information flow or case management paradigm that ensures the customer is the organizing principle. They will leverage technology to blend the best of art and science with the customer in mind. At the same time, encouraging employees to be motivated, customer-oriented representatives of the company, and giving them the power to #delight customers, wouldn’t hurt either! Editor’s Note: To read more of Deb’s thoughts on Customer Experience, see her Oz the Great and Powerful: How ACM Transforms the Customer Experience


Unveiling Brand Moments That Matter

I am very excited about the latest release of our Web Experience Management product – we are transforming organizations worldwide with powerful new capabilities we unveiled this week. Our launch themes are designed to empower marketers, customer service and every line of business to engage the new digital consumer in this age of experience. Organizations must rethink how they can deliver branded moments that matter across different geographies, languages, social and media channels. We have enabled the enterprise to be truly responsive: agile and adaptive to the consumer at the moment they want to engage with you. It is imperative that every marketer worldwide reinvest in the web to deliver targeted experiences that flow across devices and screens. OpenText can enable an organization today to engage the new mobile consumer in a way that brings agility to digital marketing campaigns and, at the same time, engage the employee and embrace the partner channel, all with the same CEM solution. It is no longer only about sharing information across multiple touch points. Organizations must map the individual’s journey with targeted omni-channel experiences across mobile, social and traditional web sites. It is one thing to be responsive; more importantly, organizations must deliver targeted moments that matter, incorporating digital media from DAM to traditional print from CCM – nothing else like it exists on the planet. Seeing is believing!

This week, we showcased the new release of WEM at an event in San Francisco, California, at the Exploratorium on Pier 15. We had guests from all across North America join us to see the product up close and hear first-hand, amazing stories from two of our customers: Wells Fargo and Taco Bell. Lori Robinson from Wells Fargo shared insights into how all 275,000 team members can use Web Experience Management to increase productivity and access information, which in turn improves the end customer experience. With every percentage point of improvement in productivity and information access comes thousands (if not millions) of dollars in revenue gains. Nicholas Tran from Taco Bell stated he has “the best job in the world” working with the website and social media to capture his customers’ experiences. His team helps build brand awareness by sharing stories, both humanitarian and fun, with others through WEM. If you missed it, check out www.opentext.com/wem.


Supply Chain Management Transformation: Technology Shifts the Focus from Process Efficiency to Adding Strategic Value

Cost shedding and process optimization are the typical drivers of supply chain technology purchases. No surprise there. However, according to Gartner, a shift is taking place: supply chain managers now recognize technology as a strategic asset, linking those drivers with value creation. I came across this notion last month at the 2013 Gartner Supply Chain Executive Conference in Phoenix. The theme was simple: to achieve supply chain management success, enterprises must adopt technologies with real-time, analytic and predictive capabilities. The goal is to formulate strategies in which the supply chain function and the enterprise’s value proposition tie together. Additional conference insights included:

The desire to reduce integration costs is accelerating SaaS adoption.

Visibility and event management (alerts, notifications, etc.) technologies are critical compliance levers.

Due to today’s business climate, supply chain managers are embracing solutions with favorable financing options, such as subscription or transaction-based services.

By 2016, more than 40% of new business logistics application purchases will be cloud based.

OpenText has a host of services specializing in supply chain data integration to improve process speed, performance, reach and profitability. Click here to read more; it might help earn your seat at the table where C-level executives make strategic decisions.


Journey to a New User Interface

Good user interface design is highly subjective. It falls in the category of “I’ll know it when I see it.” Yet it is a critical part of every development effort. I am working with several people on a project to come up with a new interface for our Digital Asset Management product, and I want to share this fascinating journey as a diary of how we get there.

At OpenText we are fortunate to have some UI design experts with a background in behavioral psychology. They have been busy interviewing users to create detailed dossiers that help us understand the “typical” user. These profiles, called personas, are based on the various roles of people within an organization who interact with the software. Our meeting discussed the information gathered from interviews with users and how to capture it in the personas. We can then start to map out the tasks, concerns, and priorities for each persona: what they care about and what they need to get the job done. We discussed “Barb” the job owner. She coordinates projects and campaigns, making sure the work is assigned and completed in a timely manner. It was interesting how much information has been captured about Barb. Just looking at how to initiate a new project reveals many steps. Barb interacts with people and the software: notifications and responses, templates that automatically create project and folder structures, and all the associated reporting and dashboards that allow her to track projects, assign tasks, and monitor progress. Some of the questions are: how does Barb prepare the campaign or project for different market segments and channels? Who needs this information? How does she collaborate on, review and approve created works?

The first stage of the journey is discovery, and a major part of it is interviewing and understanding the users. There are many roles that interact with the software, with many different needs and different priorities. Digital Asset Management is used for creative workflows, campaigns and projects, and distribution and delivery; it is an integral part of the digital media supply chain. The next stage is creating some wireframes for the UI and drilling down on the technology for a presentation layer. I’m just a layman, not a user interface expert. However, I do use lots of different UIs, and I know a good one when I see it.
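To make the persona idea concrete, here is one way “Barb’s” tasks and concerns might be captured as data. The structure is purely illustrative; only the details about Barb come from the description above.

```python
from dataclasses import dataclass, field

# Illustrative only: one way a UI team might record a persona so that
# design discussions can trace each screen back to a user's real tasks.
@dataclass
class Persona:
    name: str
    role: str
    tasks: list = field(default_factory=list)     # what they must get done
    concerns: list = field(default_factory=list)  # what they worry about

barb = Persona(
    name="Barb",
    role="Job owner: coordinates projects and campaigns",
    tasks=["initiate projects from templates",
           "assign and track tasks",
           "review and approve created works"],
    concerns=["timely completion", "who needs which information"],
)
```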


The Problems with Most CCM Systems Today

When Actuate decided to develop a solution for Customer Communications Management (CCM), we didn’t want to stay stuck in the past. Historically, problems have existed in CCM, and we wanted to create something that would address them. But first we had to identify what those problems were. The main issue, of course, has always been that each step in the CCM process has been supported by a different toolset from a different technology vendor. That adds up to five or six applications, all cobbled together to try to construct a cohesive system. But we all know that technologies that weren’t designed to work together can never truly be cohesive. So issues emerged:

The system couldn’t keep up with new business needs and new demands. Today, most organizations need solutions that can scale dynamically and work within any cloud configuration (public, private or hybrid). Also, the output format for statements is changing: consumers expect to access their information online or through mobile devices, and to interact with it once they do. In order to keep up with changes like these, all components of a CCM system must be scalable. However, when relying on multiple technology providers, you’re always likely to get one that falls behind or prioritizes different upgrades and features from the ones you require.

The CCM system doesn’t work in unison. By using components that come from separate vendors, organizations are forced to learn, work with and maintain the varying capacities of different technologies, piecing them together as best as possible. Hand-offs break, data gets lost in translation and IT scrambles to create custom code in order to process content seamlessly. Nothing is in sync, because nothing was designed to work in unison.

Accountability is difficult to determine. If a problem emerges, it’s challenging to find out where the blame falls: is it with the data solution or the document composition tool, for example? Each may work fine independently, but fail when used together. Too much time and effort is spent running diagnostics and troubleshooting instead of optimizing the system’s features and performance to stay competitive.

It’s impossible to audit data as it moves through the process. As data moves from processing to content storage and delivery, organizations want, and are often required, to keep track of it for auditing purposes. However, today’s systems aren’t integrated, which makes an end-to-end audit trail virtually impossible.

All of these issues make it difficult for companies to make effective use of their CCM solution or to rely on it completely. As a holistic solution, Actuate’s CCM solution moves past all of those problems, since it was designed to be a single system that encompasses the whole CCM process. The historic problems of CCM become just that: history. Learn more about Actuate’s Customer Communications Solution.


Customer Communications Management: The Rundown

Here at Actuate, we get asked this question a lot: what exactly is Customer Communications Management (CCM)? Consider organizations that handle vast amounts of information for millions of customers: cell phone bills, credit card statements or insurance policies. They need to ensure that the process that gets those statements to their customers runs smoothly, while managing every step in the data flow. The process also has to be flexible enough to respond to changing business, IT and customer requirements. For that to happen, organizations rely on the six steps of CCM:

Data Acquisition and Analytics. Raw data is captured, normalized and augmented to ensure accuracy and completeness, making it usable for transactional documents.

Document Composition. Data is formatted in a way that’s visually appealing and user friendly, adhering to corporate brand standards and accommodating personalized advertising with intelligent, targeted trans-promo offers.

Document Processing and Transformation. Business rules are defined that outline how content will be processed, how and when data will be acquired, what composition templates will be used to format the data, where the resulting documents will be delivered, what repository the content will be stored within, what format will be used to store it and what index or metadata information should accompany that content.

Multi-Channel Delivery. Everything is formatted so that customers can receive their information in the way that works best for them, whether that means print distribution, online, smart phone or tablet access.

Electronic Archiving. Data is archived to satisfy regulatory requirements, provide high-speed search and retrieval for internal discovery and analysis, reduce internal helpdesk costs and improve information availability.

Portal Technology. CCM portals connect customer-facing applications with relevant content to increase information availability and reduce manual processing. Portals deliver communications in traditional static formats, like PDF, for simple statement review, as well as in interactive HTML formats that provide insight through rich graphical visualizations and allow users to interact directly with the transactional content.

Getting the CCM process right helps organizations use data effectively and efficiently.
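How these six steps chain together can be pictured with a small sketch. This is a hypothetical illustration, assuming each step is a function that enriches the output of the previous one; none of this is Actuate code, and the stage behaviors are reduced to placeholders.

```python
# Hypothetical sketch: the six CCM steps as a simple pipeline, where each
# stage consumes the output of the previous one. Real systems add business
# rules, templates, and per-channel renderers at every stage.
def acquire(raw):           return {"data": raw, "normalized": True}
def compose(doc):           return {**doc, "template": "statement-v3"}
def transform(doc):         return {**doc, "format": "pdf+html"}
def deliver(doc, channel):  return {**doc, "channel": channel}
def archive(doc):           return {**doc, "archived": True, "indexed": True}
def publish_to_portal(doc): return {**doc, "portal_url": "/statements/123"}

statement = publish_to_portal(
    archive(deliver(transform(compose(acquire("raw billing records"))),
                    channel="email")))
print(statement["channel"], statement["archived"])  # email True
```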


Next Gen File Transfer Enhances Transmittal Management

Until now, transferring large files to points outside the corporate firewall has been an exercise in inconvenience, irritation, and risk. Believe me, as a technology professional working out of a home office, I can’t tell you the hours I’ve spent managing the movement of multi-GB files to colleagues and customers, either electronically through FTP and its ilk, or (shudder) physically through the shipping of storage media. The fact is, we’ve all had to make do with inadequate large-file transfer options over the past decade:

Email is a no-go if your attachment is larger than 10MB or so.

FTP, USB drives and DVDs are time-intensive, unreliable and present sizable security and compliance risks.

Public file-sharing services? Don’t even go there, friend. Aside from the ever-present threat of hacking and uncontrolled distribution, I’ll bet you didn’t know that the service providers themselves generally reserve the right to access your data at any time, for a wide variety of reasons.

The Future of Large File Transfer is Already Here

The technology exists for a better way, though. I’ve seen it every time my son effortlessly downloads a multi-GB video game from an online retailer with a simple click. And I finally had the opportunity to experience the same convenience in a B2B setting with the recent introduction of the OpenText Managed File Transfer (MFT) solution.

How easy is it to use? Transferring a 3.9GB file from head office to my remote laptop involved simply receiving an email informing me there was a file waiting, clicking on a link in the email so I could direct MFT where to put the file on my hard drive, and going back to work. That’s it, that’s all. The download finished seamlessly in a few minutes and I had no reason to worry about the transfer at all. Not only is MFT fast and user friendly, it will auto-resume if the connection hangs and it encrypts the data during transmission. The genius behind the solution (and the reason we have patents pending on it) can be explored here but, in plain user-speak, I saved a nice chunk of time, worked with a familiar email-based interface, and securely received a complete, non-corrupted file with an auditable trail. What more could you ask for?

How about the fact that OpenText is now integrating the progressive capabilities of our new file transfer solution with our existing Transmittal Management application? It’s all part of our commitment to providing optimal value to our customers through a comprehensive, cross-pillar Enterprise Information Management (EIM) strategy. Thinking about this combination in action really got the wheels in my head turning!

A Step-Change in Seamless, Secure Transmittal Management

The engineering sector is my specialty. And I know from listening to Document Controllers across the industry that the process of efficiently managing project transmittal information around the world has become a major issue. In recent years, the scope and complexity of these transmittals have increased in lock step with their financial and legal implications. Document Controllers now regularly spend stress-filled hours struggling with inefficient methods of transferring contracts, drawings, specifications, and other time-sensitive, mission-critical information. What’s more, these transmittals are often destined for remotely located engineering projects utilizing networks of varying quality and stability.
The integration of OpenText Transmittal Management with OpenText Managed File Transfer will be all they need to adapt to these fluctuating environments while ensuring that essential files are delivered completely, securely, and in full compliance with corporate policies and industry regulations. Granted, not everyone’s daily activities involve transmitting the blueprints for a hydro-electric dam to the jungles of Borneo, but if the new OpenText Managed File Transfer integration with OpenText Transmittal Management is designed to excel in some of the most demanding environments in cyberspace, think of the stability, security and efficiency it can add to your organization.
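The auto-resume behavior described above reflects a general technique worth sketching: ask the server only for the bytes you do not yet have, so a dropped connection costs almost nothing. Here is a minimal Python sketch using standard HTTP range requests; the URL is a placeholder, and this is an illustration of the general approach, not OpenText’s patented implementation.

```python
import os
import urllib.request

def resume_download(url: str, dest: str, chunk: int = 1 << 20) -> None:
    """Download url to dest, resuming from whatever is already on disk."""
    have = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url, headers={"Range": f"bytes={have}-"})
    # A server with nothing left to send answers HTTP 416; production code
    # would catch that and verify a checksum before declaring success.
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
        while True:
            block = resp.read(chunk)
            if not block:
                break
            out.write(block)

# Hypothetical usage:
# resume_download("https://files.example.com/big-file.bin", "big-file.bin")
```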


Social BPM and Smart Process Apps Are Improving Productivity for All Process Participants

Submitted by Derek Weeks on June 6, 2013

OpenText: OpenText recently contributed a chapter, “How Social Technologies Enhance the BPM Experience for all Participants,” to the book “Social BPM: Work, Planning and Collaboration Under the Impact of Social Technology“. What prompted OpenText to write about Social BPM?

Derek Weeks: We are excited about exploring the technology and usage models found in social applications and the results that can be achieved by mapping them to the unique characteristics, diverse participants and emerging opportunities in Business Process Management (BPM).

OT: Why do you think social applications are seeing so much success lately in the enterprise?

DW: I think the recent success of social applications stems from the pioneering work these applications have done to make it easier for people to connect with one another and to share information. As they reached critical mass, even more people embraced the technology and made it a common and familiar way to interact and find valuable information. As a result, social capabilities historically associated with applications like Facebook and LinkedIn are being demanded within business organizations. Social applications such as OpenText’s Tempo Social, as well as Plaxo and Yammer, are specifically targeting the enterprise market.

OT: What about Social BPM? Do you see that impacting the ability to improve process participant productivity, especially with unstructured work?

DW: Within the business process and case management context, there are many participants who would benefit from more powerful tools that support connecting people and sharing information. BPM initiatives focus on a variety of participants, including business and process analysts, IT professionals, knowledge workers, management and partners, to name a few.

OT: There is so much going on right now in the world of social, it can get confusing. How would you categorize the various social technologies?

DW: Given the dynamic nature of social computing, the technologies that define it are constantly evolving. I see three categories: personal networking, information sharing, and collaboration. Most successful social computing solutions draw on multiple technologies and use them within a social or business context to form a social application. As an example, Facebook and LinkedIn both draw on the technology capabilities within the personal networking category, but each site exists to deliver a different (though clearly overlapping) application. Flickr and Picasa draw on information sharing technology to deliver social applications that enable participants to share photographs.

OT: What is the most important piece of advice you would give someone who is currently considering applying social technologies for process improvement?

DW: I would say that instead of starting with the process model, start with the participants within the process. Once their needs are understood, processes can be developed that support those needs, not the other way around. Social technologies exist to accelerate social conventions that people already participate in. When evaluating how participants work within a process, how they collaborate and access information should be part of that analysis. By having social technologies as part of the BPM toolkit, the value delivered to a participant is increased.

OT: How are social capabilities being applied to non-social applications?
DW: In the BPM business at OpenText, we mostly talk about applying social technologies and process automation to real people. For example, the concept of following people, which was popularized by Twitter, can be used to follow processes as well. Twitter makes it very easy to see the status of something. There is no dashboard or tabular report to understand; there is simply a single sentence that sums it all up. By using this model to convey the status of a customer’s loan application, or the current status of a service level, participants can now see information in a form that is simple, intuitive and consistent.

OT: How can social help promote collaboration?

DW: Collaboration is not just about the ability to create discussion boards. It is about knowing who to collaborate with, how to structure and progress the collaboration, and how to capture the collaboration so it is useful to others. Collaboration is greatly enhanced when it is combined with expert search capabilities. Having access to a network of knowledgeable people enables participants to find the most skilled resources to form a collaborative team.

OT: OpenText is talking a lot about Smart Process Applications these days. How does social fit into Smart Process Apps?

DW: Many customers of our Smart Process Applications are using social and collaborative capabilities today. By providing out-of-the-box capabilities through our smart process application factory, organizations are able to quickly and consistently introduce capabilities like instant messaging, user surveys, and knowledge management. One of these customers, Irish Life, was recently covered in a Smart Process Applications research paper by Forrester, Smart Process Apps: To Combine Social and Dynamic Case Management, which stated, “Irish Life used social within its case management claims solution and increased productivity by 35%”.

OT: What do you think is the most significant promise of social?

DW: I see tremendous potential not just in making people more productive but also in making them more knowledgeable. Having access to the right information or expert at the right time has a long-lasting impact on the individual doing the work. Productivity and service are also enhanced when process participants have tools like OpenText Smart Process Applications with built-in social capabilities that enable them to immediately find and connect with the people needed to keep process and case work moving. By integrating these collaborative capabilities into business applications, the collaborations can be contextual, in that transaction-specific information can be shared, and the collaboration itself can be linked or copied into a system of record, so audit trails are more complete and reconstructing how decisions were made is easy.

To learn more about OpenText BPM and Smart Process Applications, please visit www.opentext.com/smartprocessapplications. We also invite you to read OpenText’s contributed chapter in “Social BPM: Work, Planning and Collaboration Under the Impact of Social Technology“.

Derek Weeks joined OpenText in 2009, where he leads the global product and corporate marketing team for its Business Process Management (BPM) and Enterprise Architecture (EA) portfolio. Over his 20-year career, Derek has held executive and senior management positions at Systar, Hyperformix, StorageTek, and Hewlett-Packard’s OpenView software business. OpenText recently sat down with Derek to discuss BPM and social technologies.
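The “follow a process the way you follow a person” idea above lends itself to a simple sketch. The observer pattern below is a generic illustration of process status delivered as one-line messages, not OpenText code; the class and field names are invented.

```python
# Illustrative sketch: participants "follow" a business process and get
# one-line status updates, Twitter-style, instead of reading dashboards.
class LoanApplication:
    def __init__(self, applicant: str):
        self.applicant = applicant
        self.followers = []

    def follow(self, notify) -> None:
        """Subscribe any callable that accepts a single status string."""
        self.followers.append(notify)

    def set_status(self, status: str) -> None:
        update = f"Loan for {self.applicant}: {status}"  # one simple sentence
        for notify in self.followers:
            notify(update)

app = LoanApplication("J. Smith")
app.follow(print)  # here the "feed" is just stdout
app.set_status("documents verified")
app.set_status("approved, pending signature")
```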


Cloud: the catalyst for innovation through mass customisation in IT

Part I – How BPM platforms help enterprises to capitalise on the cloud

If you believe that cloud equals cost cutting and that's all, then you can skip this post. If you're looking for ways to leverage the cloud for innovation at affordable cost, here's my view on how smart manufacturing principles can now be applied in IT, thanks to the wealth of services that can be consumed from the cloud. The question is how to capitalise on this. The (short and simple) answer: apply a BPM platform as your "cloud service assembly line".

I believe that we are going through a radical shift in our industry, and it strikes me that there are a lot of similarities with how the manufacturing industry has evolved over time. Many people view the cloud as a means of just reducing the cost of IT and, while that's definitely an important factor, there's a far more fundamental pattern that revolves around consuming IT services and applications from the cloud, tailored to the needs of customers and end users at affordable cost. One could say that cloud services and applications are "democratising" IT and fostering innovation: combining various services at (relatively) low cost into tailored, personalised solutions that drive productivity and engagement through a personalised experience for customers, partners and employees. In the manufacturing world this concept is called Mass Customisation, as first described by Joseph Pine 20 years ago. It's all about postponing differentiation until as late as possible in the production process, combined with the lowest possible unit cost.

Enterprises should take advantage of this concept and apply it in IT by:

- Embracing the cloud and consuming services at the various XaaS levels where and whenever possible (I appreciate that some systems may be kept on premise), and
- Making sure that they use an "assembly line" to integrate, aggregate and orchestrate these services.

In my view, a BPM platform is an ideal "assembly line", as it can perfectly handle the initial provisioning, the "stitching" of the services, and the orchestration of the running aggregated services. If you're smart, you'll use BPM "as a service". I say "should" take advantage because this opportunity of leveraging cloud and applying mass customisation will turn into a threat if ignored. I am seeing companies adopt this approach and drive innovation at a cost level that is substantially lower, and at a pace that is significantly higher, than the alternative of doing it on premise.

As this is a pretty fundamental concept, let me share some background on Mass Customisation, so that you really get what I am referring to. As you may know, the Cordys core team has its roots in the ERP space, working for Baan Company, one of the four players that shaped the ERP market in the nineties. We were particularly strong in the discrete manufacturing vertical, with best-of-breed functionality to support multiple manufacturing principles, ranging from make-to-stock to assemble-to-order and engineer-to-order. Make-to-stock production is typically for commodity goods, where mass production allows for the lowest possible price per unit. At the other extreme, engineer-to-order is typically suitable for capital-intensive industrial goods, like high-end machinery that is built uniquely or in small series.
In between the two extremes is the assemble-to-order concept, which allows companies to assemble a product with unique characteristics from standard modules and assemblies that, in turn, can be made to stock or produced on demand to predefined specs. Assemble-to-order combines elements of make-to-stock and engineer-to-order to produce larger series at affordable cost. A smart way of mixing these concepts is the manufacturing principle called Mass Customisation, described as "effectively postponing the task of differentiating a product for a specific customer until the latest possible point in the supply network. It combines the low unit costs of mass production processes with the flexibility of individual customisation".

It's interesting that complete industries, such as car manufacturing, have gone through cycles in applying these manufacturing principles. A century ago cars were pretty much engineered to order. Then Henry Ford revolutionised the industry with the introduction of the Model T, which could be ordered in "any colour, as long as it was black". In recent decades car manufacturers have driven lean and assemble-to-order manufacturing principles to perfection, more recently including support for Mass Customisation. One of the best examples is how BMW has turned the Mini Cooper into a personal and affordable style icon: reportedly, out of 100,000 new Minis only five(!) will be exactly alike.

So what does it take to apply Mass Customisation principles? In a nutshell: first, you need the right set of standard components, assemblies and modules as building blocks, along with the definition and routing of the operations to be performed to get the final product. Second, you need the production or assembly line to perform the assembly tasks and operations. Last but not least, the customer needs the tooling to configure the product within the applicable rules and constraints.

You may be thinking, "This is all very nice, but how does this work for IT?" In IT, much like the cycles the car manufacturing industry went through, we started off with engineer-to-order systems and applications as the predominant model. Then came "commercial off-the-shelf" or "packaged" applications, a market boosted by the availability of client–server systems. The rise of the internet, followed by Web 2.0 built on worldwide web standards, and the availability of bandwidth infrastructure have propelled the proliferation of IaaS, PaaS and SaaS offerings for both consumers and businesses at a pace unseen in the history of technological innovation.

So we now have the building blocks at various levels of the "stack" and various levels of granularity: network, storage, virtual servers, and a myriad of applications for CRM, ERP, SCM, BI, office productivity, mobile apps and so on. They're all available as services from the cloud, and the standards (REST and SOAP) for "plug and play" integration of those services are there. The last piece we need is the "assembly line" to create, aggregate and orchestrate the right bundles of integrated services, tailored to the needs of end users. A BPM platform, assuming it's supported by solid integration and application development capabilities, is the perfect assembly line for applying the Mass Customisation principle to cloud services. Let me give you an example to illustrate my point.
We have replaced a Siebel CRM system at one of the leading emergency centres in The Netherlands with a BPM / Case Management application that combines core application logic built to customer specification (engineer-to-order) with a set of standard public and private cloud services, allowing this company to handle road assistance support calls very efficiently and effectively. BPM is essentially used as the mechanism to steer the application (assemble-to-order). We mix dynamic workflows, to support the call handlers, with straight-through workflows that call and generate the right mix of services to coordinate all stakeholders in the value chain. One of the ways in which we apply mass customisation: by integrating Google Maps with the application to plot salvage companies, hotels and the like in the neighbourhood of an incident, the call handler gets visual support for selecting the right services for the client in trouble. Similar services are called for number plate checks, insurance policy checks and so on. There is no way we could have built an equivalently rich solution, so tailored to the needs of the customer, at an affordable cost without leveraging these public and private cloud services. (A simplified sketch of this kind of service orchestration follows at the end of this post.)

In the case above we leverage pretty simple and straightforward cloud services. In another example, one of our strategic partners has built a full-blown solution for "a 360° view on the customer" with an integrated portfolio of cloud applications for web content management (Drupal), marketing automation (Eloqua), sales-force automation (SFDC) and analytics (Adobe). This portfolio is offered to the customer as a single solution, connected to on-premise systems like ERP and managed by the partner under a specific Service Level Agreement.

In both enterprise customer examples, the BPM platform acts as the decoupling layer between different types of services, supports aggregation at multiple levels of granularity, and introduces the flexibility to influence (via business rules), change and replace these services at an unparalleled level of productivity. If this approach gets industrialised, with "service configurators" on the front end and an assembly line that takes and personalises the right mix of services, we'll see the birth of Mass Customisation in IT.

In my next blog I'll address how the principles of Mass Customisation are applied in the next generation of business applications, where we're seeing ISVs adopt this approach with great results in productivity and application lifecycle management cost.
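As promised, here is a minimal sketch of the "assembly line" orchestration in Python. The endpoints, parameters and response fields are invented for illustration; a real BPM platform would layer state management, human tasks, retries and monitoring on top of this kind of call sequence.

```python
# Hypothetical orchestration of standard cloud services into one tailored
# road-assistance process; none of these endpoints are real.
import requests

SERVICES = {
    "crm": "https://crm.example.com/api/customers",
    "maps": "https://maps.example.com/api/nearby",
    "plates": "https://registry.example.com/api/plates",
}

def handle_assistance_call(customer_id: str, lat: float, lon: float) -> dict:
    """Assemble-to-order: stitch standard services into one response."""
    customer = requests.get(f"{SERVICES['crm']}/{customer_id}").json()
    # Plot salvage companies and hotels near the incident for the call handler.
    nearby = requests.get(SERVICES["maps"], params={
        "lat": lat, "lon": lon, "types": "salvage,hotel"}).json()
    # Call a further standard service, e.g. a number plate check.
    plate = requests.get(f"{SERVICES['plates']}/{customer['plate']}").json()
    return {"customer": customer, "nearby": nearby, "vehicle": plate}
```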


HIPAA Violation Penalties on the Rise

In the United States, the Health Insurance Portability and Accountability Act (HIPAA) protects the privacy of medical patients through its Privacy Rule. The rule governs how patient records must be handled by medical providers and ensures that patient rights are protected. The Privacy Rule also allows providers to share information with doctors and others who need it; this sharing usually requires patient consent, such as a signed permission form. The US Department of Health and Human Services is responsible for enforcing the HIPAA Privacy Rule, and penalties for violations have recently increased. Get more information about these penalties, and how RightFax can assist with compliance, in this blog post by OpenText partner Advantage Technologies.


SME e-Invoicing Adoption in Europe – Part 2

The Role of Service Providers

One of the biggest complaints about B2B e-Invoicing, and perhaps a fair one in the early days, is that it is too expensive for small and medium companies, and that B2B programs were all driven by larger customers who essentially left their trading partners with little choice about participating. I'm not sure that the statement is 100% accurate. It is true that some companies initiate their e-Invoicing programs with a 'mandate' that suppliers use a certain tool or service provider; however, I can also give plenty of examples where large companies offer their SME suppliers a range of different solutions and incentivise them through early payment (if electronic) or supply chain finance. It is also true that this method has been the most successful to date in increasing e-Invoicing adoption.

However, the world is changing. Not all service providers are aimed at large corporate customers; many now focus on solutions that specifically target the SME market, including the major accounting solution providers. But these solutions need to be able to connect SMEs to their larger customers. For the most part, SME solutions are commercial: an SME could face an up-front investment of anything from €50 to €3,000, depending on the complexity of the solution and process. But the transaction fees will be less than the price of a stamp and, given enough time and volume, the system will pay for itself. Commerce in action.

Why should an SME pay to e-Invoice? Each of these solutions has been researched, developed and offered to the market by private enterprises. Each has involved a level of investment, and therefore requires a return on that investment in some shape or form. Some solution providers claim 'free' e-Invoicing for SMEs. In many cases the large buyer pays so that SMEs adopt quickly, but my personal thoughts are clear: somewhere, somehow, somebody pays (even if through taxes), and it is well to remember the adage, "If you're not paying for the product, you are the product". Others disagree.

Of course, SMEs can engage in electronic invoicing by themselves. A recent report from Sage France highlighted that the majority of SMEs in France currently send e-Invoices via email as PDF attachments. This is both encouraging and disheartening at the same time. It is positive that these companies have thought to remove paper from the process, but a PDF is essentially 'digital paper': it does not contain structured data that can be easily imported and processed by their customers' systems. In fact, many companies that receive these PDFs print them to paper... and, as we discussed in my previous post, innovative service providers are now offering solutions that can extract data from modern PDFs (a simple sketch of the idea follows below).

Another factor to consider in the PDF scenario is tax compliance. Under the new rules in Europe, it is still unclear whether this method meets the requirements of Directive 2010/45/EU as implemented by each member state. Do SMEs care? Probably not; it is doubtful they examine the intricacies of doing business electronically. They just do it, but this can have tax and process consequences for their customers.
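To make the 'digital paper' point concrete, here is a minimal sketch of the kind of data extraction a service provider might perform on a machine-generated PDF invoice, using Python and the pypdf library. The field patterns are assumptions; production systems use far more robust parsing and validation.

```python
# Sketch: recover structured fields from a "digital paper" PDF invoice.
import re
from pypdf import PdfReader  # pip install pypdf

def extract_invoice_fields(path: str) -> dict:
    # Concatenate the text layer of every page (works for generated PDFs,
    # not scans, which would need OCR first).
    text = "".join(page.extract_text() or "" for page in PdfReader(path).pages)
    number = re.search(r"Invoice\s*(?:No\.?|Number)[:\s]*(\S+)", text, re.I)
    total = re.search(r"Total[:\s]*€?\s*([\d.,]+)", text, re.I)
    return {"invoice_number": number.group(1) if number else None,
            "total": total.group(1) if total else None}

print(extract_invoice_fields("invoice.pdf"))
```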
This leads me to another complaint I hear about service providers: that they confuse the e-Invoicing marketplace on tax compliance and use 'scare tactics' to win business. It's a perspective, I guess, but not one I subscribe to; much of my life is spent assuring my customers that they, and their suppliers, will be tax compliant if they use our solutions. The hardest part is interpreting the current rules across all 27 member states and working with tax consultants to ensure certainty in tax matters. Far from confusing the landscape, I think the majority of service providers work very hard to provide clarity, and to ensure their customers meet the expectations of tax authorities.

In my previous post I discussed the power of the network effect for SMEs. However, with different corporates joining different networks, does this mean some SMEs have to use multiple e-Invoicing service providers when invoicing different customers? How networks interoperate has been a topic of discussion for years, with telecoms being a historical example, but the interoperation of B2B networks is more challenging due to the complexity of processes, standards and tax regulation. Service providers are coming together in industry associations and working to ensure the challenges around network interoperability are overcome.

Another set of service providers that should not be overlooked is banks. While some have attempted to enter the B2B market selling to corporates, they have met with varying levels of success. But all SMEs have banking relationships, so can banks leverage this relationship further? In the Nordics, banks are very successful in consumer e-Invoicing: direct debit is not widely used there, and a consumer can log into their online bank account to view and pay their bills. This model could potentially be extended across the B2B SME community, and banks could also play a larger role in early payments through different supply chain finance models.

It seems to me that B2B service providers are providing valuable services and solutions that help companies do business electronically. Essentially, a service provider 'does what it says on the tin': it provides a service which, if it does not provide value, will not be purchased. This is how commerce works and how private individuals acquire wealth by starting their own companies. I think this is important to understand, because within the European Union there are moves to fundamentally alter the way that business is done electronically. The European Commission has invested millions of euros into large-scale pilot programs (LSPs) that compete with private companies in the B2B space with a value proposition that is 'free' (again, nothing is free: your tax euros are funding it). Competition is one thing, but governments are thinking of mandating the use of these LSPs. This 'federalisation' of industry is why my next blog will discuss the role of government in SME e-Invoicing.


Payments, SEPA and the Berlin Wall – EBA Day 2013

Last week I attended the Euro Banking Association's EBA Day 2013 in Berlin, a pan-European conference focusing on cross-border payments, collaboration between banks and payment infrastructures, partnerships and innovation. The city and the venue's location were highly symbolic of the conference themes. The Berlin Estrel Convention Centre sits on the edge of the cold-war-era demilitarised zone, with three original sections of the Berlin Wall still standing in the hotel's courtyard. Another big concrete wall is about to be demolished within the next few months: this one built of domestic payment schemes, rules and laws. The SEPA and PSD wrecking balls from the European Commission will now provide the cross-border payments harmonisation that will enable corporates, banks and infrastructures to maximise their business outcomes. If you missed this conference, you can view the full agenda and speakers' list here, and watch interviews with key speakers and experts on EBA Day TV here.

The most valuable session, in my opinion, was "The future payments and transaction banking landscape", delivered by Patrick Dixon. According to his study, it currently takes a consumer like you or me about three seconds to be annoyed by a slow-loading web page, and about five seconds to decide to hit the "back" button when faced with a slow experience. Those five seconds can lead to a 95% loss of patience among bank clients, be it a consumer on a mobile app, a treasurer on a banking portal or a wealthy investor stuck on an automated personal banking switchboard. Patrick Dixon believes that the future of banking, not just payments, will thrive on monetising customers' personal emotions and habits. If they want to deliver, banks, mobile network operators and phone manufacturers should team up to provide an end-to-end service, wrapped up with Big Data analytics, and focus their efforts on packaging a total "turnkey" personalised solution that includes both banking products (accounts, cards, services, payments) and mobile services (handset, voice, data, identity).

In summary, EBA Day 2013 was a great occasion for networking; the rest of the GXS team and I had the opportunity to meet with many of our global clients to discuss the challenges and opportunities of payments. I am already looking ahead to next year's conference, where I expect to walk through each bank's lessons learned from the SEPA migration and understand where some may have tripped up. The re-unification of Germany and the lifting of the Iron Curtain were a lot more complex than breaking down a single concrete wall; likewise, the next iteration and broader scope of the Payments Services Directive, as well as the future SEPA rulebooks, are going to be the new operating frameworks for all players. Why do I say this? Well, my colleagues and I are already helping financial institutions, corporates and payments processors to transform this first generation of "tactical compliance" into long-term, strategic operational excellence. If you want to know more about how we are doing this, I'd love to talk to you about it!


Five Common ECM Wins…and the Straightforward Cloud Solution

As cloud-based Enterprise Content Management (ECM) implementations become increasingly commonplace, I'm seeing a growing number of noteworthy client success stories. The kind that overcome substantial hurdles to transform a client's content management infrastructure from non-existent to world class in one seamless, magnificent leap. The really intriguing element in the vast majority of them is witnessing the tangible, quantifiable benefits the cloud brings to information management. For me, it's one thing to discuss the practice of cloud-based content management as a theoretical talking point; it's quite another to see it in action, elevating content solutions from great to how-did-we-ever-operate-without-this through additional functionality, security and efficiency.

The OpenText team recently collaborated on yet another example of this, designing and implementing a cloud-based ECM platform for a non-profit clinical research firm. The core challenges and goals of this organization are remarkably similar to those of most large enterprises, namely, an extensive remote workforce generating and collaborating on vast amounts of mission-critical data that's highly sensitive and stringently regulated. Sound familiar? What's more, you may also recognize the environment that sparked this particular initiative: essential data stored in isolated network drives and filing cabinets, email serving as the principal means of collaboration, and security and governance tools struggling with oversight of a highly mobile staff. (As a side note, did you just mentally give your organization one, two, or three checkmarks? That's what I thought!)

In short, this particular enterprise was at the mercy of the disjointed and exceedingly manual processes that result from the lack of an integrated content management solution. For participants, the frustration factor was setting in, with inefficiencies making it hard to meet deadlines and an overall lack of confidence in the information being used to fuel decision making. On top of this, every day was like rolling the dice, with potentially catastrophic implications for data security, financial stability, regulatory compliance, and reporting credibility.

To be sure, these were considerable issues that permeated right to the core of the enterprise. Yet the solution was relatively, even surprisingly, straightforward: a tailored version of OpenText Content Server provided the centralized and standardized data management structure they needed to ensure optimal security, management, and compliance. What put it over the top (and into my personal business hit parade) was the astute decision to use an OpenText Cloud solution to host the application, ensuring easy, real-time access for far-flung contributors and collaborators, both internal and external to the organization, through a custom-designed web interface.

Voila... one undemanding yet multi-faceted suite of ECM tools solving five common issues:

- A secure repository for storage and distribution of version-controlled documents
- Increased adoption, efficiency, and collaboration via a user-friendly, web-based interface
- Comprehensive auditing and reporting functions from one centralized source
- Full compliance with all applicable government and industry regulations
- Simplified system management that exceeds security requirements, thanks to OpenText Hosting Services

It's a beautiful thing. And I invite you to read through their entire success story for the full details.
For us here at OpenText, these content management successes are both rewarding and invigorating. Not only do they fulfill our mission of contributing to our customers' success (you can literally see the stress from years of legacy inefficiencies fall away as velocity, security and adoption skyrocket), they also serve as notice that we, as an organization, are on the right path. Cloud-based services, be they ECM or broader-based Enterprise Information Management (EIM) initiatives, are the future. There's just too much to be gained with them. It's time to start considering the benefits of cloud services. I know they continue to change my perception of what's possible in the world of enterprise content management.


Thoughts on “5 Reasons Why DAM is No Photoshop”

I saw this thought-provoking article by David Diamond in CMS Wire, "5 Reasons Why DAM is No Photoshop" (read it here), and wanted to share some of my thoughts. We both agree that DAM is no Photoshop, and I am sure even more reasons could be listed. However, I am curious as to why the comparison of DAM to Photoshop? Is it to show how DAM has missed the mainstream boat, why DAM has not evolved into "SharePoint for the masses"? (Although Microsoft would like to see it that way...) I think there are many factors in play. Photoshop is a tool for creatives: highly flexible and complex, not intuitive, with too many buried menus, hard-to-discover capabilities and a significant learning curve. And yet, as you point out, it has become the leader. It also caters to a highly skilled professional market. Other Photoshop-like systems are out there to address non-professionals: freeware like Paint.NET, photo editing tools and even mobile apps. In the world of DAM there is not a professional community per se; few organizations have identified roles for DAM managers, and those who use DAM are frequently casual users or consumers of content. And, as you know, DAM can mean anything and everything; the lack of a definition, or its continually expanding one, has perplexed the industry for a while.

That said, your points are valid. There is a bit of schizophrenia in DAM. On one side we work with the creatives, managing assets, workflow and work-in-progress: all those things that are not part of the core competency of tools such as Photoshop. On the other side are business, marketing, operations and delivery departments, all interacting with digital assets and DAM. Digital assets are aggregated from other sources, such as stock photos and agencies, requiring usage and rights management. Assets are repurposed, refreshed and reused as they move through their lifecycle: prepped, formatted and conformed for delivery and distribution to a myriad of destinations, platforms and devices.

How do vendors, even dinosaur vendors, drive innovation and communicate better? I think part of it is a focus on the core competency of what DAM offers, with the added understanding that systems exist in an ecosystem. I see DAM providing core infrastructure for an organization: multiple systems and people able to securely access a body of well-organized, catalogued, indexed and findable content to serve any number of purposes. Our experience has been that innovation happens in the organization in partnership with vendors: clever developers working with open APIs and flexible platforms, interfacing with business and creative systems, and integrating with software to streamline and automate online, mobile, video, print and other delivery (a small sketch of the idea follows below). The goal is not a complex, monolithic application geared only for professionals, like Photoshop, but a platform or infrastructure allowing controlled, secure access to the digital assets (the IP of the organization), integrating workflows and applications to connect people, processes and technology. This is what will help organizations be more successful as the digital world continues to saturate everything an organization does.
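As a sketch of the kind of developer integration mentioned above, consider a script that pulls approved, rights-cleared assets out of a DAM through a REST API and hands them to a delivery pipeline. The endpoint, parameters and response fields here are hypothetical, not any particular vendor's API.

```python
# Hypothetical DAM REST query: find approved, rights-cleared assets.
import requests

DAM_API = "https://dam.example.com/api/v1"
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder token

def find_assets(keyword: str) -> list:
    """Search the catalogued repository rather than a shared drive."""
    resp = requests.get(f"{DAM_API}/assets", headers=HEADERS,
                        params={"q": keyword, "rights": "cleared",
                                "status": "approved"})
    resp.raise_for_status()
    return resp.json()["assets"]

# Hand each web rendition to an (equally hypothetical) delivery pipeline.
for asset in find_assets("spring campaign"):
    print(asset["id"], asset["title"], asset["renditions"]["web"])
```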


Putting the “Super” in QSuper

Submitted by Michelle Dufty on May 17, 2013

We're Talking Smart Process Applications for Financial Services

We have all heard the story before: organizations trying to become more customer-centric but unable to do so because they are held back by ageing applications and siloed operations that leave them unable to keep up with the demands of the modern customer. I have been working a lot recently with customers in the Financial Services industry, where this issue is extremely prevalent. The good news is that there are alternatives to legacy systems that no longer fit the bill; at OpenText we call them Smart Process Applications. Smart Process Apps combine the power of Business Process Management or Case Management with ECM, capture, analytics, collaboration, and customer communications management to allow organizations to be more collaborative and to support the dynamic, demanding customer environment that exists today. Smart Process Apps break down the barriers of application silos while still referencing the legacy applications as the system of record.

I am really pleased to announce a new case study we did with QSuper, one of Australia's largest superannuation (aka retirement) funds, which serves government employees, related entity workers, and their spouses. Like many financial services organizations, QSuper operates in a highly competitive and dynamic environment, where legacy applications with limited functionality and no central view of the customer could limit its ability to open new accounts, provide stellar customer service, and ensure consistent execution of operations. QSuper used to rely on a workflow system embedded within one part of the organization. The system had limited functionality and no disaster recovery capabilities, and experienced frequent downtime that led to disruptions in the business. In addition, operations staff used eight different systems with numerous repositories for customer information, which included a mixture of paper and electronic documents. Like many other financial services organizations, QSuper realized that it needed to modernize its application infrastructure to provide a single view of the customer, become more efficient and improve the customer experience.

The new system, workQ, is a Smart Process Application that now handles 78 percent of customer administration processes and is used across QSuper, from knowledge workers processing claims, to business operations and IT staff, to mid- and senior-level management. QSuper has also been able to decommission five of the eight former customer systems, which has significantly reduced business operations costs and improved employee responsiveness to customer inquiries. The advanced analytics and reporting in the new system have reduced the manual effort to create reports by 99 percent and provide a much clearer view of overall business performance.

QSuper is just one example of the numerous customers who are providing better customer service with a modern Smart Process Application infrastructure. To learn more about Smart Process Apps and other customer success stories, follow me here.


Why FoIP?

Organizations depend on fax for secure, compliant, auditable, and reliable communications. Fax use is expected to remain high and to increase yearly, as millions of business-critical documents continue to be faxed every day. And with voice over IP (VoIP) now commonplace, fax over IP (FoIP) allows organizations to exchange high volumes of these business-critical documents over their VoIP networks. For companies extending their communication strategies with IP-based document delivery, software-only fax solutions such as OpenText RightFax help facilitate these efforts. They also future-proof an IP network investment by providing a scalable architecture and easing the transition from traditional PSTN to IP communications. Importantly, RightFax integrates directly with the most popular business systems, including those provided by market leader Cisco. RightFax 10.5 is certified by Cisco to work with Cisco Unified Communications Manager (CUCM) and Unity Connection version 9, and has recently been approved for Cisco's Solutions Incentive Program (SIP).

Effective FoIP solutions leverage IP network resources, enhance unified communications, and speed investment returns on IP equipment and VoIP applications. Additional benefits include:

- Improved document and communication security
- Reduced document delivery costs
- Resource consolidation
- Simplified IT management
- Bridged data and voice networks
- Enhanced compliance (e.g. with HIPAA, SOX, GLBA)
- Server virtualization

Centralizing and consolidating fax also reduces administrative costs, eliminates fax machines and consumables (often by leveraging MFPs), and unifies voice, fax, and data onto a single network (a generic integration sketch follows below). To learn how OpenText helps businesses make the transition to FoIP, click here.
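One common way applications hand documents to a centralized fax server is an email-to-fax gateway, where the local part of the address carries the number to dial. The sketch below, using only Python's standard library, illustrates that generic pattern; the addresses and host names are invented, and this is not the RightFax API.

```python
# Generic email-to-fax sketch: mail a PDF to a gateway address whose
# local part encodes the destination fax number. All hosts are invented.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "records@hospital.example"
msg["To"] = "15551230000@faxgateway.example"  # fax number as local part
msg["Subject"] = "Discharge summary"
msg.set_content("Cover page notes go here.")
with open("discharge_summary.pdf", "rb") as f:
    msg.add_attachment(f.read(), maintype="application", subtype="pdf",
                       filename="discharge_summary.pdf")

with smtplib.SMTP("mail.hospital.example") as smtp:
    smtp.send_message(msg)
```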
