Enterprise Content Management

Demystifying the Complexity of a Trusted System


In 2012, the state of California adopted regulations specifying that official electronic records must be maintained in a Trusted System. California is leading the way, and other states are following. This mandate has led many state agencies, cities, and counties in California and throughout the nation to determine how best to comply and implement a Trusted System for electronic records. Most of these agencies already have some form of imaging or Enterprise Content Management (ECM) system managing their electronic content, and the good news is that it is not as difficult as many think to establish and certify your current or next ECM environment as a Trusted System.

What is a “Trusted System”?

A Trusted System is defined as “a combination of techniques, policies, and procedures for which there is no plausible scenario in which a document retrieved from or reproduced by the system could differ substantially from the document that is originally stored” (CA Government Code 12168.7).

Why is a “Trusted System” Important?

The sole reason to certify an ECM environment as a Trusted System is so an agency can destroy the paper copy of a record and establish that the electronic version, stored in the Trusted System, is the official document of record. If your agency does not want or need to eliminate the paper copy, there is no reason to seek Trusted System certification. However, this should prompt some interesting questions about how much your agency is paying for onsite or remote storage of paper; companies are very happy to store your paper for you and have no issue taking your money. One large county HHS agency in California that I worked with closely on its Trusted System implementation was able to justify the cost of converting from paper to electronic on the hard-dollar ROI of eliminating the costs of storing and moving paper alone. They did not even need to count the soft-dollar savings from the increased efficiency and processing speed of having the electronic version instantly accessible to county staff.

How to Implement a “Trusted System”

A certified Trusted ECM System is neither too difficult nor too costly to implement. In a nutshell, it can be accomplished by following the AIIM Recommended Practice document. OpenText works with clients on a regular basis to:

- Ensure that two copies of each document are stored in the system: redundant storage in separate locations to prevent possible data loss, backed by documented procedures for backup and recovery.
- Ensure that documents cannot be altered while in the system: a combination of WORM-type storage, the PDF/A file format, and ECM system security.
- Ensure that you have documented policies and procedures for scanning and electronic records management that conform to the AIIM recommendations.

You can self-certify that your system meets the recommendations defined in the AIIM Recommended Practice document, but because you will want the documents to be admissible in court, you should get certification signatures from both your CIO and General Counsel, or from an independent third party. Note that any major change to the system, such as an upgrade or a new architecture, will require recertification.

OpenText is pleased to report that several large government agencies in California have already achieved Trusted System certification using OpenText™ Documentum. We would be pleased to share this information with you and show you how other agencies have achieved it. What is your experience with Trusted Systems?
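To make the integrity requirement concrete, here is a minimal sketch in Python (purely illustrative, not OpenText's implementation; the class and method names are invented) of the two ideas above: keeping two copies in separate storage locations and verifying at retrieval time that neither copy differs from what was originally stored.

```python
import hashlib

def sha256(data: bytes) -> str:
    """Content fingerprint recorded at ingestion time."""
    return hashlib.sha256(data).hexdigest()

class TrustedStore:
    """Toy two-copy store: primary and replica stand in for separate storage locations."""
    def __init__(self):
        self.primary = {}   # doc_id -> bytes
        self.replica = {}   # doc_id -> bytes, held in a second "location"
        self.manifest = {}  # doc_id -> hash recorded when the record was stored

    def ingest(self, doc_id: str, data: bytes) -> None:
        self.primary[doc_id] = data
        self.replica[doc_id] = data
        self.manifest[doc_id] = sha256(data)

    def retrieve(self, doc_id: str) -> bytes:
        """Return the document only if both copies still match the ingestion hash."""
        expected = self.manifest[doc_id]
        for copy in (self.primary, self.replica):
            if sha256(copy[doc_id]) != expected:
                raise ValueError(f"Integrity failure for {doc_id}")
        return self.primary[doc_id]

store = TrustedStore()
store.ingest("permit-001", b"%PDF-1.4 scanned permit ...")
doc = store.retrieve("permit-001")   # verified against the hash recorded at ingestion
```

In a production Trusted System the two "locations" would be physically separate storage systems (often WORM media), and the recorded hashes would themselves be protected from tampering.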


Big Data in Healthcare: Using Health IT Innovation to Accelerate Value


The healthcare industry is in the midst of a big data revolution, starting with the increasing supply of patient information. As we transition to more data-driven healthcare, the ability to access, share, and optimize patient data will become even more critical to our ability as an industry to deliver high-quality care, let alone reduce costs for both patient and provider. But creating a more complete view of the patient record still requires access to multiple clinical systems for patient data and associated documents: information that remains, despite our best efforts with widespread EMR adoption, locked away in numerous systems and document repositories, in multiple formats, without easy access or any facility for electronic sharing. And the industry will be faced with even more patient data moving forward.

Technology, though, as we started to discuss in last month's blog, isn't enough to improve healthcare outcomes or lower costs. The way the industry manages and optimizes all patient information will have to undergo fundamental changes before providers and organizations alike can distinguish between valuable data and information overload. More specifically, the industry will have to leverage big data solutions that integrate fragmented healthcare information with existing patient health records to make data more meaningfully usable and enable providers to gain the actionable knowledge needed to make better clinical decisions. And access to all patient data will have to be streamlined across the continuum of care to support these efforts.

To put it more bluntly, we will have to divine intelligence from patient data and visualize it to prevent “digital landfills” that render it useless. Recent advances in value-added solutions are allowing for just that: information can be easily collected and analyzed from multiple sources, and integrated into existing clinical systems for use at the point of care and for sharing across the continuum.
Financial concerns and clinical outcomes fueling big data demand

Financial concerns are driving the demand for big data solutions, conceivably more than any other influence. What we know for sure is that the current trajectory of healthcare spending in the U.S. is simply unsustainable. Spiraling healthcare expenditures are expected to double, according to IDC Health Insights, to $4.5 trillion by 2019, which continues to take its toll on the U.S. economy by way of increased healthcare costs and health insurance premiums. One notion behind this trend is overutilization. As a result, many payers are shifting from fee-for-service to value-based reimbursement, which will prioritize patient outcomes and place greater emphasis on quality and preventative care as opposed to treatment volume. Under these reimbursement models, providers will have greater incentive to compile and exchange patient information in an effort to speed care delivery and reduce costs, but doing so will require access to comprehensive patient health records.

Clinical trends to improve patient outcomes are also paramount in big data's upsurge in enabling data-driven healthcare. Providers have traditionally been trained to care for the ‘patient in front of them,’ using their own judgment to make treatment decisions. However, the shift to value-based reimbursement changes this paradigm. It is creating an outcomes-based scenario in which providers will have to embrace evidence-based medical practice, systematically reviewing clinical information to make treatment decisions based on all available patient data.
Providers will now also have to think in terms of entire populations of patients, especially those with specific conditions or diseases, including those who may be ‘at risk.’ Big data will drive the new disciplines of population health management and analytics, which will enable healthcare organizations to identify and segment patients with certain diseases and provide them with preventative care. Aggregating these data sets will further fuel evidence-based medicine, since access to more information and larger datasets will provide more robust evidence for treating patients, allowing precise, personalized care to be delivered more quickly and cost-effectively.

From meaningful use to “meaningful use”

Healthcare has always lagged behind other sectors in the use of big data, mainly over the chief issue of patient confidentiality. In recent years, however, we have begun to catch up, and first adopters of big data solutions are achieving positive results. But we still face a critical need to accelerate big data innovation in healthcare, and its implementation, in order to capture the full value of what patient data can provide. To do so, we need to disrupt the current state of data management in the industry and shift from meaningful use of EMR technology to a more literal “meaningful use” of patient information to improve patient outcomes and lower costs. But this transformation to data-driven healthcare will require equal parts record integration, interoperability, usability, and infrastructure. Then and only then will healthcare delivery advance to a point where we can solve the problem of rising healthcare costs and decreasing quality. The days of looking at big data as a technology rather than a tool to enable better healthcare delivery are long gone, especially when you factor in the need to remove the usability constraints from EMRs and other clinical systems.
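As a toy illustration of population segmentation (the patient fields and thresholds below are invented for the example, not clinical guidance), identifying an at-risk cohort can be as simple as filtering a population by condition and flagging members whose markers warrant preventative outreach:

```python
# Hypothetical patient records; field names (condition, hba1c, last_visit_days)
# are assumptions for this sketch.
patients = [
    {"id": "p1", "condition": "diabetes", "hba1c": 9.1, "last_visit_days": 400},
    {"id": "p2", "condition": "diabetes", "hba1c": 6.8, "last_visit_days": 60},
    {"id": "p3", "condition": "asthma",   "hba1c": None, "last_visit_days": 30},
]

def segment(population, condition):
    """All patients with a given condition: the population of interest."""
    return [p for p in population if p["condition"] == condition]

def at_risk(cohort):
    """Flag members whose markers suggest preventative outreach:
    poorly controlled HbA1c, or a long gap since the last visit."""
    return [p for p in cohort
            if (p["hba1c"] is not None and p["hba1c"] >= 8.0)
            or p["last_visit_days"] > 365]

diabetics = segment(patients, "diabetes")
outreach = at_risk(diabetics)   # only p1 qualifies in this toy data
```

A real population health platform would run rules like these over aggregated records from many clinical systems, which is exactly why the record integration discussed above matters.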
Such solutions bring us a step closer to truly appreciating the role that evidence-based medicine, population health management, and analytics can play in healthcare delivery, and they will allow us to find the connection points between data, identify trends, improve patient outcomes, and drive down costs. Big data solutions will provide the mechanism needed to generate more meaningful knowledge that can positively impact patient outcomes. Clinicians will also be empowered to deliver better care and to use access to all information known about a patient to help turn expensive patient encounters into more affordable ones, for both the patient and the healthcare organization. However, healthcare organizations will have to continue to work together to create and share more comprehensive patient records so that the needs of the patient can be met as they journey across the continuum of care.


Quality of Care with IDC & HIMSS models – Where are eHealth projects going in EMEA?


It is quite interesting to see how HIMSS is evolving its Continuity of Care model. I wrote a bit about it before WoHIT, and I was looking forward to seeing how they would introduce it into the EMEA region, where the demand is clearly much more distinct than in the US. The model is still officially scheduled to launch in August; however, in Nice many more details were shared with the audience than in Orlando. And I am willing to share my thoughts, as promised in the previous blog, on the commonalities and differences between the IDC Integrated Care Study and the HIMSS Continuity of Care Maturity Model.

Based on a recent survey and White Paper, IDC takes a snapshot of the current state of affairs and proposes a tool, a framework, which should allow concerned stakeholders in healthcare to move into integrated care delivery today. HIMSS brings in the new term “continuum of care” (compared with integrated care from IDC) and suggests a maturity model and the classical multi-step approach that was quite successful for its EMRAM ranking. So, in a nutshell, we have a framework to get to integrated care from IDC and a roadmap to get to the ultimate continuity of care from HIMSS. And basically I'd say this is the only major difference between the approaches. At the end of the day, the goal is better care delivery for every participant in the ecosystem, and suggesting how to get there.

A framework is good for starting to address the key challenges today, assessing the legacy, and simplifying complex issues. IDC focuses on information as a technology term and suggests it as the key driver toward integrated care. It points out which key decisions should be made before implementing any IT solution, and they are quite pragmatic, e.g. looking at the requirements for the patient data model.
This is very valuable for developing countries, where the legacy isn't such a burden and decisions in many cases are made more quickly; a lot of steps can simply be skipped. Georgia can be a good example of such an approach. On the other hand, in my opinion, the HIMSS Maturity Model can be especially helpful for mature markets. I'm not foreseeing a similar competition-based motivation, which was a major driver of EMRAM's success with hospitals. The main beneficiaries of the Continuity of Care model are still regional and country officials and networks of hospitals, rather than single hospitals directly. And they will mainly benefit from the detailed guidelines across different dependent tracks for designing a solid national or regional eHealth strategy in which the proper steps are made at the proper time and include the necessary stakeholders. Just to name some tracks: standardization, IT systems, privacy, patient engagement, and many others.

I'm still impressed by the Finnish eHealth initiative shared at WoHIT. A consistent, step-by-step progression, which has already taken more than 15 years, was the key to building what I suppose is the most advanced eHealth platform in EMEA. As discussed in Nice, the top-down approach and the decision to adopt standards have been instrumental in securing the success of the initiative, at the beginning as well as now that the system is evolving and opening up to new internal improvements and to international data exchange.

Another careful look at both concepts reveals strong similarities again. As I said before, the goal is the same, and the must-do things for achieving better care are similar. There should be accessible, exchangeable, and understandable information about patients, and all the stakeholders at both ends of the value chain should benefit from it. Obviously, the two concepts should complement each other, and the best of each should be taken.
Vendors are starting to incorporate analysts' guidelines into their solutions and to run projects accordingly. What we saw in Nice at WoHIT showed this quite clearly. No participant skipped talking about interoperability, health data exchange, and how all the technologies are built to serve the whole purpose of continuity of care. I won't name names, but the big guns among EMR systems and integration platform vendors were pitching standardization commitments and projects where the concepts were already implemented. This new market will need at least a couple of years to stabilize, and we have yet to see in practice what gets implemented and how far integrated care and continuity of care deliver the expected increase in quality.

In the meantime, we will be part of the HIMSS conference in Turkey coming up in June, addressing both national programs and hospital IT maturity; I believe we will test these new ideas and concepts quite practically in this very interesting market. I'm looking forward to having more discussions in Istanbul and to seeing which topics are of most interest to the audience. At OpenText, we are running eHealth/Integrated Patient Record projects in other countries across Europe, so we welcome you to pass by the booth or visit the sessions where some of the results of our projects will be shared with the audience.


Information Governance in the Energy Sector: Treating Information as a Valuable Resource

This is the first in a series of blogs exploring how Information Governance is delivering value to utilities, mining, and oil and gas companies in their core business processes. Information Governance is more than just “records management”: it is a means to manage risk, ensure HSE and regulatory compliance, and achieve operational excellence and competitive advantage from your information.

Even in heavily regulated industries such as energy and resources, the term “records management” (RM) has a somewhat negative connotation. Over the span of my career in engineering firms, energy companies, and public sector bodies responsible for governing the industry, I've come to accept that executives are reticent to invest in RM initiatives. In my experience, it typically takes some compelling event to drive major information and records management initiatives within these businesses: a scathing audit demonstrating non-compliance; a health, safety, or environmental incident that could have been avoided; or a costly claim or lawsuit. In these instances, poor information management (IM) or the reactive costs of eDiscovery are too high a price for the business to pay.

Today, it is hard to imagine any executive, or employee for that matter, who doesn't recognize the value of information in their day-to-day operations. Especially when you consider that companies such as Google and Facebook have built such lucrative businesses out of information alone, it's clear that information is a valuable resource. Yet many utilities and resources companies continue to suffer from poor information management, often failing to invest in the tools that support even their most basic information management needs, let alone leveraging that information for innovation and competitive advantage. Information has value, and when properly managed, information can reduce risk as well as positively impact your overall revenue, efficiency, and profitability.
When effective information management is embedded in the processes and activities that your organization performs in the course of normal operations, you're able to achieve increased compliance and realize the strategic value of that information. This is where Information Governance comes in. I am pleased that the IM industry has adopted this term to reflect the new era of information management. It more accurately reflects the paradigm shift we've been experiencing over the past 20 to 30 years, as electronic information in its many forms has replaced antiquated practices and tools rooted in paper-based processes.

Over the course of the next few weeks, we will be running a series of blogs that demonstrate how Information Governance supports business processes and challenges that are specific to energy and resources companies. You will hear from my colleagues, with their individual areas of expertise, about content-centric applications such as Engineering Document Management, Contract Management, Asset Information Management, Customer Communication, and Customer Information Management. We will also cover two additional topics reflecting the challenges of the complex information technology (IT) landscape in modern energy companies: governing your SAP and SharePoint information, and managing legacy systems and information silos from mergers, acquisitions, and joint ventures.

I hope that you will join us for the entire blog series, as we demonstrate how Information Governance manages your risk, ensures your compliance, and delivers value to your business. For more information on the topic of Information Governance in the energy and resources sector, I encourage you to read the whitepaper we co-authored with PennEnergy, How to Mitigate Risk and Ensure Compliance: Govern Your Documents Accordingly.


Making the Most of Your CCM Initiative: Influencing Customer Behavior

Customer Communications Management (CCM) isn't only about communicating with customers; it's about ensuring that you communicate effectively. That means ensuring that customers are sent communications that are relevant to them. For our last post on making the most out of your CCM strategy, we're going to look at how to successfully use CCM to influence customer behavior. This is the final step listed in the InfoTrends white paper Improve Your Enterprise Customer Communications Strategy in Five Vital Steps.

Data analytics and business intelligence tools can help companies better understand their customers: who they are and how they behave. By understanding their customers better, organizations can tailor communications directly to their customers' needs, predict future requirements, and fuel upsell and cross-sell initiatives. Customer profiles, determined by online behavior, social media data, and other factors, can help ensure that companies send more relevant, personalized communications, which leads to a higher rate of response. Meanwhile, technology such as personalized landing pages and QR codes can even help track printed communications. This becomes even more relevant as companies push more marketing communications through their CCM platforms, allowing them to become hubs that store, track, and measure communications. The right CCM technology, with business intelligence and data analytics capabilities built in, will provide the functionality to do so.

Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps
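As a rough sketch of how a customer profile might drive relevance (the profile fields, thresholds, and template names below are hypothetical, not a real CCM product's API), selecting a communication can be modeled as a rules pass over profile attributes:

```python
# Hypothetical message templates keyed by campaign name.
TEMPLATES = {
    "winback":        "We miss you! Here is an offer to come back...",
    "upsell_premium": "Based on your usage, you qualify for our premium plan...",
    "newsletter":     "This month's product news...",
}

def pick_template(profile: dict) -> str:
    """Choose the most relevant communication for a customer profile.
    Rules are checked in priority order; the first match wins."""
    if profile.get("days_since_login", 0) > 90:
        return "winback"            # lapsed customer: re-engagement first
    if profile.get("spend_last_quarter", 0) > 1000:
        return "upsell_premium"     # high spender: cross-sell opportunity
    return "newsletter"             # default: general relationship content

message = TEMPLATES[pick_template({"days_since_login": 120})]
```

In practice these rules would come from analytics models rather than hard-coded thresholds, but the principle is the same: the profile decides which communication the customer receives, which is what drives the higher response rates described above.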


Top 5 reasons you don’t need Information Governance

Enterprise Information Management companies like OpenText are pushing information governance as something that is imperative for business, but do you really need it in your life? Here are the top 5 reasons that your organisation doesn't need an information governance program.

1. All your content and business information is stored and managed. Your IT department has a large and ever-increasing budget, so it can afford to keep all of your enterprise content and information forever, especially the duplicate and transient varieties. The more copies the better…

2. Your employees all follow company procedure to the letter. Your team is the best: they never fail to follow standard procedures, because they have time to read and memorise them, and they regularly review them in case company policy changes. So, they understand that information is the most valuable asset in your organisation, and they'd never use their USB sticks or unsecure public file sharing sites to store and share content…

3. Your industry is immune to oversight and regulation. There's really no need to put legal holds on information if there is a case brought against you; the law does not apply to you, and you can afford the huge fines anyway, right?

4. Your employees don't BYOD and they never use social media. There is no one under 60 in your company, and what's wrong with phoning people up on landlines if you want to talk to them? Mobile phones and tablets are just a fad and would only encourage employees to play Angry Birds all day.

5. Your business runs best when you are completely uninformed. In your industry, it pays not to know what's going on; that way you can't be blamed for making bad decisions, yes?

For the rest of us who do need information governance, there is a wealth of videos, white papers, and other resources aimed at helping you understand how it helps minimize risk, ensure compliance, and maximize the value of information in your organization.


The Discovery Suite and the Rise of Big Content

Today, April 30th, 2014, OpenText announced the General Availability of the Discovery Suite. The Discovery Suite uses the power of search and content analytics to solve the problems organizations experience with huge volumes of unstructured content. A great example of the type of problem that can be solved is auto-classification of content, which has become a very hot topic. Be sure to grab our whitepapers that discuss why it is time to consider auto-classification. Based on the availability of the Discovery Suite and interactions with customers and analysts, I would like to revisit some thinking from a previous blog.

The Rise of Big Content – Change the Conversation to Unlocking Value in Unstructured Information

I recently had the opportunity to sit on a panel with some of the leading lights in the field of Information Governance. One of the panelists, Alan Pelz-Sharpe, Research Director at 451 Group, took on the role of protagonist, pointing out that, despite the hype, Information Governance is falling flat with senior executives and many large corporations. He cited his research to point out that “less than one-third of senior management believed IG was very important at their organizations.” If, like me, you are a believer in the process and the benefits of Information Governance, this is probably both disturbing and perplexing. Why is it that we have not been able to get the attention of the C-suite when it comes to Information Governance? I think the answer is becoming obvious: Information Governance discussions inevitably end up focusing on risks and cost. To get the attention of execs, we need to talk about value, and more specifically about ways of contributing to the top line. We talk about costs because they are tangible and measurable. We talk about risks because of past experience, or events that happen to similar organizations.
Value, on the other hand, is typically acknowledged as a goal, but then it often fades into the background amongst the many challenges facing organizations as they take on Information Governance.

Big Content – A New Way to Look at Unstructured Content

Big Content is not the wake of our digital activities like Big Data, but the byproduct of the knowledge economy. It is the huge volume of freeform, unstructured information that organizations are creating and storing on the desktop, on mobile devices, and increasingly, in the cloud. Like the “3V's” of Big Data, Big Content has significant challenges that need to be addressed in order for it to be useful: Big Content is Unintegrated, Unstructured, and Unmanaged. In order to exploit this information and gain insight from it, we need to solve the “3U's” of Big Content.

Announcing the OpenText Discovery Suite – Solutions for Big Content

Integrating the Unintegrated: Instead of getting smaller, the list of systems that host unstructured information is growing in most organizations. The OpenText Discovery Suite eliminates these silos with a unified information access platform for enterprise sources and content. The Discovery Platform has connectors for critical enterprise applications, and it processes and indexes documents found across the enterprise. It also provides a rich library of UI components and APIs so that developers can embed search directly into line-of-business applications.

Bringing Structure to the Unstructured: Normalizing and enriching content brings structure and consistency to otherwise freeform content. The OpenText Discovery Suite uses OpenText Content Analytics to bring structure to the unstructured. It extracts semantic information from content and applies it as structured metadata. Semantic metadata includes people, places, organizations, complex concepts, and even sentiment. This, combined with file metadata and permissions metadata, provides a way to query and organize unstructured information in ways not possible before.

Managing the Unmanaged: Big Content, more than anything else, is unmanaged. We need to manage the content first so that we can differentiate between the valueless content and the content that is worth keeping. The OpenText Discovery Suite has applications to manage specific Big Content problems that organizations struggle to solve every day. Search and content analytics alone do not solve business problems. Business logic, roles, reporting, and much more need to be built on top of the platform in order to support the use case and provide a clear return on investment. Information Governance use cases include auto-classification of content, collection and early case assessment for eDiscovery, and remediation and migration to control the lifecycle of content. Engagement and productivity use cases include site search and intranet and extranet search applications.

The Way Forward – Making Big Content Valuable

In solving the “3U's”, the Discovery Suite encapsulates the processes required to identify value in Big Content. Once the valueless content has been identified and managed, it is possible to focus on securing and managing the valuable content. Finally, the value of the content can be amplified: enriched content provides better access, greater productivity, increased collaboration, and content reuse. This allows us to have conversations about affecting both the bottom and top line.
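To make the idea of bringing structure to the unstructured concrete, here is a deliberately naive sketch: simple regular expressions stand in for real content analytics (the patterns and field names are invented, and real entity extraction is far more sophisticated), showing candidate entities pulled from freeform text and attached to the document as structured, queryable metadata.

```python
import re

def enrich(doc_text: str) -> dict:
    """Toy content-analytics pass: extract candidate entities and attach
    them to the document as structured metadata."""
    # Naive organization pattern: the capitalized name that precedes a
    # corporate suffix (the suffix itself is not captured).
    orgs = re.findall(r"([A-Z]\w*(?:\s[A-Z]\w*)*)\s(?:Inc|Corp|Ltd)\.?", doc_text)
    # ISO-style dates are easy to pick out structurally.
    dates = re.findall(r"\b\d{4}-\d{2}-\d{2}\b", doc_text)
    return {"text": doc_text,
            "metadata": {"organizations": orgs, "dates": dates}}

doc = enrich("Contract signed with Acme Corp on 2014-04-30.")
```

Once every document carries metadata like this, queries such as "all contracts mentioning this organization in this date range" become possible over content that was previously opaque freeform text.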


Making the Most of Your CCM Initiative: Automating and Integrating

There are several ways to make the most out of your Customer Communications Management (CCM) strategy. We've recently been outlining the 5 steps discussed in the InfoTrends white paper, Improve Your Enterprise Customer Communications Strategy in Five Vital Steps. The fourth tip relates less to the processes you put in place and more to how they are implemented, with the goal of automating and integrating through the right CCM technology.

Most CCM systems are implemented in IT architectures that are patchy at best, with stakeholders often operating in silos with limited knowledge of how the overall CCM process works. Legacy systems may be in place, and they may be less advanced in terms of their management and tracking capabilities. Corporate IT systems that could be considered part of CCM include enterprise resource planning, CRM, accounting/tax, and archiving systems. Having a centralized approach to your CCM system can help ensure that you automate and integrate everything in the most efficient way possible. To that end, communications should be stored centrally, whether in an archive, an Enterprise Content Management (ECM) system, or a CCM solution. Other things to consider include:

- CCM technology should be easily integrated with other systems.
- Legacy communications should also be stored and tracked centrally, since modern CCM platforms and post-composition solutions can process and store legacy output in an archive.
- Vendors that invest in cloud solutions can offer the next level of integration, likely including on-premise data and cloud-based delivery.

Lastly, the right technology will also ensure that your communications are tracked and stored properly, giving business users and customer service representatives the information they need to communicate with customers. It also allows for the use of data analytics, offering information you can use for upsell and cross-sell opportunities.
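As a minimal sketch of the centralized-storage idea (a toy in-memory archive; the class and field names are invented, not a product API), tracking every communication in one place gives a service representative the full cross-channel history for a customer:

```python
from collections import defaultdict
from datetime import date

class CommunicationsArchive:
    """Toy central archive: every outbound communication, regardless of
    channel or originating system, is stored and tracked in one place."""
    def __init__(self):
        self._by_customer = defaultdict(list)

    def record(self, customer_id, channel, subject, sent_on):
        self._by_customer[customer_id].append(
            {"channel": channel, "subject": subject, "sent_on": sent_on})

    def history(self, customer_id):
        """Everything a service rep sees before contacting a customer,
        oldest first."""
        return sorted(self._by_customer[customer_id], key=lambda c: c["sent_on"])

archive = CommunicationsArchive()
archive.record("c42", "email", "Welcome", date(2014, 1, 5))
archive.record("c42", "print", "Statement", date(2014, 2, 1))
history = archive.history("c42")   # full cross-channel view for customer c42
```

The same single store is also what makes analytics possible: once email, print, and legacy output all land in one archive, they can be queried and measured together.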
We'll finish off this series in our next blog post by looking at one final step to a successful CCM strategy: influencing customer behavior. Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps


Regulatory Matters: What you should know about new FDA Supply Chain Security regulations

Last week, the European Medicines Agency sent warning letters to healthcare professionals across Europe about falsified and/or tampered vials of Herceptin, Roche's potent drug for breast cancer. It appears that the vials were stolen in Italy, had their lot numbers modified, and were reintroduced into the supply chain. This is a growing problem, not only in the EU but in the US as well. In response to the growing threat of counterfeit, adulterated, stolen, and diverted medications entering the pharmaceutical supply chain, the US Food and Drug Administration (FDA) has implemented several important regulations highlighted within the Drug Supply Chain Security Act and the FDA Safety and Innovation Act (FDASIA).

As with any global enterprise, the risks of maintaining supply chain integrity from manufacturing to distribution across international borders are massive. These risks are multiplied when we're talking about a nearly trillion-dollar industry in which a breakdown can mean injury or death. The FDA estimates that 40% of finished drugs and 80% of active ingredient precursors are imported, and it has defined a rigorous process of inspections with the stated goal of preventing any type of illegal activity within the supply chain. Title VII of FDASIA, signed into law in 2012, grants the FDA new authority to address these new challenges and better ensure the safety, effectiveness, and quality of drugs imported into the United States. As stated on the FDA's website, “Implementation of these authorities will significantly advance its globalization and harmonization strategies and support FDA's ongoing quality-related initiatives. Further, these authorities will allow FDA to collect and analyze data to make risk-informed decisions, advance its risk-based approach to facility oversight, strengthen its partnerships with foreign regulators, and drive safety and quality throughout the supply chain through strengthened tools.
At the same time, implementation of Title VII of FDASIA is difficult and complex, and requires not only the development of new regulations, guidances and reports, but also major changes in FDA information systems, processes and policies.” Pharmaceutical companies and API manufacturers will need to become familiar with these regulations and determine how to leverage existing technologies, or implement new ones, to interface with the FDA. These requirements include the registration and listing of all drug/excipient manufacturers and importers. One critical aspect of Title VII is outlined in Section 706, which speaks to the types of records required by the FDA and the timelines for producing those records prior to an inspection or audit. Having a robust document and records management and recovery strategy has always been important, but under these new guidelines, getting the right information to the FDA quickly is essential to prevent delays in manufacturing or distribution. With over 300 life science implementations, OpenText has long provided validated ECM solutions for pharmaceutical records and process management, and through its tight integration with SAP, ensures compliance with current and emerging FDA and EMA regulations. In the next Regulatory Matters, I’ll go into more detail about the Drug Supply Chain Security Act, which outlines critical steps to build an electronic, interoperable system to identify and trace certain prescription drugs as they are distributed in the United States, also known as “Track and Trace.”
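The Section 706 scenario above (produce the right records within a regulator’s timeline) can be pictured as a simple records index. This is only a sketch: the record types, document IDs, and day counts below are hypothetical illustrations, not figures from the statute or any OpenText product.

```python
from dataclasses import dataclass, field

@dataclass
class RecordsIndex:
    """Toy index of regulatory records, keyed by record type."""
    records: dict = field(default_factory=dict)  # record_type -> list of doc IDs

    def add(self, record_type, doc_id):
        self.records.setdefault(record_type, []).append(doc_id)

    def respond(self, record_type, days_to_produce, deadline_days):
        # Return the matching documents and whether the retrieval
        # strategy meets the (hypothetical) production deadline.
        docs = self.records.get(record_type, [])
        return {"documents": docs, "on_time": days_to_produce <= deadline_days}

index = RecordsIndex()
index.add("batch_record", "BR-2014-001")
index.add("batch_record", "BR-2014-002")

# A request for batch records against an assumed 15-day deadline:
request = index.respond("batch_record", days_to_produce=3, deadline_days=15)
```

The point of the sketch is the design choice: records are findable by type ahead of time, so deadline risk is a lookup, not a scramble.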

Read More

Electronic and Physical Records Management: Why a Combined Approach is Best Practice

When many organizations make plans to implement Records Management, they focus on electronic records exclusively. Other organizations that are heavily paper based may focus on physical records management. Best practice using current technology is to combine both approaches into a common Records Management platform. The ideal paradigm is one that allows users to search for records without regard to file type or media. If the result is an electronic record and the user has the correct permissions, the file can be retrieved and viewed immediately. If the result is a physical record and the user has the correct permissions, the user can initiate a check-out process to have the record delivered; the system will track the checked-out record until it is returned. All records, regardless of media or file type, including email, should be governed by the same records retention rules. Your records retention schedule should be media independent so that there are no unique rules for paper records versus electronic records.

What is included in Physical Records Management?

A Physical Records Management solution is much like an inventory control system or a warehouse management system. It includes locations: the building with address, room, row, shelf, and bin where boxes of records are stored. Locations and boxes are barcoded for rapid data entry when boxes are placed in a location and for checking out boxes for records requests. When records are placed in a box, the box identifier is entered into the system so that if a user requests a particular record, the specific box is known. When records are accessioned, they are typically accessioned in bulk as a group of boxes. These boxes must be stored on shelves, ideally in contiguous space in the same area of the file storage area, onsite or at a remote records storage facility. A key feature therefore is “Space Management,” which calculates the amount of storage space needed for the accession and provides available storage locations. 
When records are moved from one location to another, groups of boxes are typically placed on pallets for shipment. Another key feature therefore is the ability to “containerize.” This feature provides the ability to group records into boxes, boxes onto pallets, and so on, so that pallets can be tracked from one location to another, with visibility into which boxes are included on each pallet. When a user requests records, the solution sends a notification to the records center. Notifications are often batched together into a “pick list,” much like a warehouse management or inventory control system. The pick list may be sorted by location to make it easy for staff to retrieve the requested boxes. The staff uses barcodes to inform the system that a box has been checked out, and the system records the requestor, request date, and fulfillment date and time. Reports provide lists of all checked-out records and any due dates for return.

This solution can be used to track any physical objects, not just boxes of records. It can be used for evidence lockers in law enforcement, and for storage of data tapes, CDs, DVDs, external hard drives, file cabinets, historical artifacts, and so on. Tracking of location, check-out, and check-in applies to these functions very well. For paper records stored in boxes, each box should include an inventory sheet that indicates the content of the box. Ideally, each individual record has an entry in the Records Management system with the same information that an electronic record would have, plus the Box ID for the container in which it is stored.

Summary

An Electronic Records Management system uses software to manage all records, electronic and physical. If you are considering implementation of a solution, you should consider what features are supported for all media types. 
A separate solution for physical records versus electronic records is not best practice, and ultimately this approach will lead to higher costs and policy that is inconsistent at best.
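The box-tracking, check-out, and pick-list workflow described in this post can be sketched in a few lines of code. The class and field names below are illustrative, not any vendor’s API, and a real system would add barcode scanning, due dates, and audit history.

```python
class RecordsCenter:
    """Minimal sketch of physical-records tracking: barcoded boxes
    assigned to shelf locations, with check-out state and a pick list."""

    def __init__(self):
        self.locations = {}    # box_id -> storage location string
        self.checked_out = {}  # box_id -> requestor

    def store(self, box_id, location):
        self.locations[box_id] = location

    def check_out(self, box_id, requestor):
        if box_id in self.checked_out:
            raise ValueError(f"{box_id} is already checked out")
        self.checked_out[box_id] = requestor
        return self.locations[box_id]  # tells staff where to retrieve it

    def check_in(self, box_id):
        self.checked_out.pop(box_id, None)

    def pick_list(self):
        # Pending requests, sorted by location for efficient retrieval
        return sorted(self.checked_out, key=self.locations.get)

center = RecordsCenter()
center.store("BOX-001", "Bldg A / Room 2 / Row 3 / Shelf 1")
center.store("BOX-002", "Bldg A / Room 1 / Row 1 / Shelf 4")
loc = center.check_out("BOX-001", requestor="jsmith")
```

The same structure works for any physical object the post mentions (evidence, data tapes, artifacts): only the IDs and locations change, not the tracking logic.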

Read More

How I Learned to Stop Worrying about Compliance and Love Information Management

Even in heavily regulated industries such as energy and resources, the term “records management” (RM) has a somewhat negative connotation. Over the span of my career in engineering firms, energy companies, and the public sector bodies responsible for governing the industry, I’ve come to accept that executives are reluctant to invest in RM initiatives. In my experience, it has typically been some compelling event that has driven major information and records management initiatives within these businesses – a scathing audit demonstrating non-compliance; a health, safety or environmental incident that could have been avoided; or a costly claim or lawsuit – where poor information management (IM) or the reactive costs of eDiscovery were too high a price for the business to pay.

Today, it is hard to imagine that there is any executive, or employee for that matter, who doesn’t recognize the value of information to their day-to-day operations – especially when you consider that companies such as Google and Facebook are building such lucrative businesses out of information alone. Yet many utilities and resources companies continue to suffer from poor information management – often failing to invest in the tools that support even their most basic information management needs, let alone leveraging that information for innovation and competitive advantage. Information has value, and when properly managed, information can reduce risk as well as positively impact your overall revenue, efficiency, and profitability. When effective information management is embedded in the processes and activities that you perform in the course of your normal operations, you achieve increased compliance – and you realize the strategic value of that information. This is where “Information Governance” comes in. I am pleased that the IM industry has adopted this term to reflect the new era of information management. 
It more accurately reflects the paradigm shift we’ve been experiencing over the past 20-30 years, as electronic information in its many forms has replaced antiquated practices and tools rooted in paper-based processes. An effective Information Governance strategy not only addresses the myriad of information types in your business – paper, electronic documents, rich media, structured data and master data, or sensor and operational analysis data – but it also manages the information in the core business processes and applications across the enterprise. The right Information Governance platform is the foundation that unifies the information silos and processes – and delivers value to the core operations of the business. Over the course of the next few weeks, we will be running a series of blogs that demonstrate how Information Governance supports business processes and challenges that are specific to energy and resources companies. You will hear from my colleagues, in their individual areas of expertise, about content-centric applications such as Engineering Document Management, Contract Management, Asset Information Management, and Customer Communication and Customer Information Management. We will also cover two additional topics reflecting the challenges of the complex information technology (IT) landscape in modern energy companies: governing your SAP and SharePoint information, and managing legacy systems and information silos from mergers, acquisitions, and joint ventures. I hope that you will join us for the entire blog series, as we demonstrate how Information Governance manages your risk, ensures your compliance, and delivers value to your business.

Read More

Making the Most of Your CCM Initiative: Enabling Business Users

Recently, we’ve been looking at different ways to make the most out of your Customer Communications Management (CCM) initiatives, discussing some of the strategies listed in the InfoTrends white paper, Improve Your Enterprise Customer Communications Strategy in Five Vital Steps. Our tips have so far involved taking a more centralized approach to your CCM strategy and engaging your customers through mobile technology. Another step to a more successful CCM strategy is finding ways to enable your business users, while reducing IT costs in the process. Standardized processes for document creation, production, and fulfillment can sometimes make flexibility in CCM difficult – any changes along the way require approval from marketing and legal, and need the line of business (LOB) or department to sign off on costs. Only then can IT implement any changes. Such a long process not only means it takes a while before those changes are finally made, but can add to overall costs as well. The right technology can help streamline the process, putting less strain on IT. More specifically, template-based technology can help business users create, manage, and deploy customer communications themselves. With this end in mind, tiered template communication systems are becoming the norm. In this system, there are three “tiers” of users:

– The IT user develops data and content constructs to make template creation easy.
– The business super user builds the templates and creates business rules.
– The business user then creates, modifies, and schedules the communications.

This method takes a lot of the pressure off IT, and puts more power into the hands of users, creating a more efficient and flexible CCM system. In our next blog post, we’ll look at the fourth step for improving your CCM initiative: Automating and Integrating. Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps.
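One way to picture the three-tier model is as a simple role-to-action permission map. The role and action names below are illustrative assumptions, not any product’s configuration.

```python
# Which actions each tier may perform in a tiered template system.
# Roles and actions are hypothetical labels for illustration.
TIER_PERMISSIONS = {
    "it_user":       {"define_data_constructs"},
    "super_user":    {"build_template", "create_business_rules"},
    "business_user": {"create_communication", "modify_communication",
                      "schedule_communication"},
}

def can(role, action):
    """True if the given tier is allowed to perform the action."""
    return action in TIER_PERMISSIONS.get(role, set())
```

For example, `can("business_user", "schedule_communication")` is allowed, while `can("business_user", "build_template")` is not, which is exactly how the tiers keep template changes out of IT’s queue without losing control.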

Read More

Forms and Functions: Technology Enablers of Paperless Communication

Last time I wrote in this space, we looked at paperless communication from the business perspective; today, let’s look at it from the standpoint of the technologies that are available for you to consider as key enablers of the paperless practice. First and foremost, please note the use of the words “technologies” and “enablers” – the plural forms of the nouns – in the preceding statement. In this connected day and age, there’s no question that, to do paperless communication right, you must seamlessly utilize and leverage multiple technology stacks to get various important things to happen.

The Usual Suspects

Take, for instance, a typical customer service situation. The phone rings, or a chat window opens, and the company representative on the receiving end:

– Uses an electronic form to query the CRM and ERP systems for basic information about the customer, including buying history and reports related to any recent interactions.
– Leverages the company’s taxonomy and metadata to facilitate the searching for/finding of that information.
– Accesses one or more document repositories to review related policies, correspondence, etc.
– Utilizes workflow to route inquiries to and receive responses from colleagues – and sometimes the customer – as needed.
– Uses another electronic form to capture his or her notes regarding the current encounter and any follow-up actions to be taken.
– Generates a report that summarizes the nature and result of the interaction and sends it to the customer in a form and format that can be viewed anytime, anywhere, and on different devices (computers, tablets, smartphones, etc.) – including, possibly, on paper.
– When appropriate, flags the customer to receive additional transpromo, marketing, sales, or support information.

Now, count the number of technologies used in this simple example. How many did you get? Certainly more than one, and that’s precisely my point. 
Interoperability is Front and Center

What makes this all work is the intelligent implementation of tools that enable interoperability, which, simply put, involves “getting stuff to work together” and must be added to the list of individual technologies you just made. This isn’t conceptually new – in the old days we might have called it “middleware” – but the increasing and accelerating convergence of so many once-separate capabilities means these tools are playing an ever more important role in making paperless communication a reality. The trick is to pipe all the data involved into, around, and out of all the technology stacks your business processes touch. On the back end, this in many cases no longer needs to be done programmatically; rather, it can be accomplished via Web services or by using standards based on them (like CMIS, which allows different vendors’ content management systems to talk to one another). On the front end, the most common approaches utilize HTML, PDF, plain-text email, and/or, increasingly, custom apps for smart devices.

Wanted: Unified Perspective

Besides the technical challenge interoperability presents, the hardest part may be that many of the technologies involved are “owned” by different departments: ERP by accounting, CRM by marketing or sales, individual document repositories by separate lines of business, composition tools by designers and print shops, etc. By its very nature, interoperability requires that a unified perspective be brought to bear on the overall business process – and this can be difficult to do given the competition for organizational budget and power that so often characterizes life in the corporate lane. For this reason, the single most important technology enabler of paperless communication may be the office water cooler, around which discussions about efficiency and effectiveness can be held, and consensus reached as to how best to create seamless, electronic flows of information to and from the audiences you serve. 
Pay this sufficient heed, and you can greatly enhance your ability to browbeat the plurality of tools and personalities into submission.
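To make the back-end “piping” concrete, here is a minimal sketch of the kind of normalized, self-describing record that lets separate systems exchange one customer interaction without point-to-point custom code. The field names and system labels are assumptions for illustration, not any particular standard or schema.

```python
import json

# One normalized customer-interaction record. Any system that can parse
# JSON (CRM, ERP, a document repository, a reporting tool) can consume it.
interaction = {
    "customer_id": "C-1042",
    "channel": "phone",
    "systems_consulted": ["CRM", "ERP", "policy_repository"],
    "notes": "Customer asked about warranty terms.",
    "follow_up": {"action": "send_summary", "formats": ["html", "pdf"]},
}

payload = json.dumps(interaction, sort_keys=True)  # wire format
restored = json.loads(payload)                     # what the next system sees
assert restored == interaction  # the record round-trips losslessly
```

The design choice this illustrates is the one the post argues for: agree on one shared representation at the boundary, and each department’s system only has to integrate once, not once per peer.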

Read More

Solving the Open Data and Public Records Problem

Government agencies are being asked, and in some cases mandated, to become more open and accountable with their data and public records. The challenge for many of these agencies is not that they do not want to publish their data and public records, but that they do not have the tools or technologies to do this easily. As they look around their organization, they see many disparate systems and repositories of data and documents. Some are aging and have limited accessibility. Some contain a mix of public data and sensitive personal citizen data that cannot be released. Another challenge is that this data is a combination of different data types. Some data is the type that can be stored in a typical database, but much of it is unstructured content such as documents, images, and even voice and video. Put simply, the challenges are:

– Existing internal systems do not have the accessibility or scalability needed to expose to the public
– Existing data is a mix of public and private data, so direct access cannot be granted
– Existing data is a mix of types, including database tables, drawings, documents, images, voice, and video
– Existing data sources are varied and disparate, so data resides in many different places
– New data is being generated at an ever increasing pace, and new systems are not being designed for public access

When looking at these challenges, most believe that the problem is too big and costly to solve. The typical workaround is to assign more people to respond to open data requests. As the number of requests increases, more people are needed to ensure reasonable response times. Tracking and ensuring that each request is responded to is often done in a spreadsheet or simple database so that no request is lost, delayed, or mishandled. Another risk is that private citizen data will accidentally be exposed, so a reviewer is assigned to review all outgoing data to ensure that no private information is accidentally released. 
When tasked with a request, public records responders must then conduct multiple searches of disparate systems and content repositories in a time-consuming and labor-intensive effort to ensure public records compliance and public data access. These challenges may lead government IT leaders to ponder the following questions:

– What if there was a tool that could search, classify, and extract data across multiple databases, content repositories, and file systems?
– What if there was a single repository that could hold all the various types of data across the organization?
– What if the data was stored in industry-standard XML, with XML descriptors for unstructured content, such that it could be easily linked to Data.gov if desired?
– What if the repository could hold as much data as desired and be exposed to the public with a simple search screen?
– What if the repository could also handle live data feeds from other systems that generate real-time data streams or real-time content?
– What if the pricing for this system were based on the amount of content stored, such that the cost was low for the first few terabytes of data and/or content?
– What if this system was available either on premise or in the cloud, such that there would be no impact to current data center or staff operations?

Technologies from OpenText now make it possible to solve the problems in providing direct public access to both data and content.

Step 1 (Optional): Purge the content which does not need to be retained, migrate the content into OpenText™ Documentum, and assign records retention policies such that the content is subject to records compliance.

Step 2: Load all public data desired into OpenText™ InfoArchive, located in your data center or in a private cloud service offered by OpenText, which is accessible to the public and accessed via a simple link off your website.

Step 3: Receive recognition and accolades for your accomplishments, and find new tasks for your public records response team. 
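The public/private separation described above can be illustrated with a short sketch that wraps a record in XML descriptors while withholding fields flagged as private. The element and field names here are illustrative, not the InfoArchive schema or any agency’s actual data model.

```python
import xml.etree.ElementTree as ET

def to_public_xml(record):
    """Serialize a record to XML, dropping fields marked private."""
    root = ET.Element("record", id=record["id"])
    for name, (value, is_private) in record["fields"].items():
        if is_private:
            continue  # private citizen data never reaches the public copy
        ET.SubElement(root, name).text = value
    return ET.tostring(root, encoding="unicode")

# A hypothetical permit record mixing public and private fields:
permit = {
    "id": "P-2014-0042",
    "fields": {
        "project": ("Sidewalk repair", False),
        "status": ("approved", False),
        "applicant_ssn": ("123-45-6789", True),  # withheld from output
    },
}
xml_out = to_public_xml(permit)
```

Flagging sensitivity at the field level, rather than reviewing each outgoing document by hand, is what turns the manual reviewer step described earlier into an automated rule.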
If you would like more information on public data access solutions for government agencies, please comment on this blog and we will follow up with you.

Read More

Experience Suite Release—Adaptive, Omni-Channel Experiences

Digital technologies have ushered in the “age of the customer.” With increasing access to information about products and services, today’s consumers are empowered to shop comparatively online, influence the buying behavior of peers, and guide the overall brand experience. They expect immediacy and experiences that satisfy. The sales funnel of the past is being replaced by multiple interactions that create a lifetime of customer value, and brands are increasingly maintained in consumer communities rather than by advertising or marketing departments. By the year 2020, enterprises will need to have transformed for the age of the customer. To gain a sustainable competitive advantage, companies need the ability to connect directly with the customer to create exceptional, long-term, and rewarding experiences. To nurture deeper relationships with customers, organizations are increasingly required to create consistent experiences, tailored to suit individual needs, across multiple digital channels, touch-points, and even supply chains. Organizations that do this effectively will benefit from deeper relationships with their customers, based on data-driven insights that enable them to more effectively meet their customers’ evolving needs.

The latest release of the OpenText Experience Suite bundles interlocking capabilities into a comprehensive solution designed to deliver engaging customer experiences across all touch-points. It manages the entire customer experience by combining adaptive content management, orchestrated information flows, and compelling omni-channel experiences—all within a governance framework to ensure consistency. Experience Suite accelerates collaborative media creation, curation, and personalization to ensure that each user gets the best possible experience at every point of interaction. Experience Suite delivers rich, digital engagement through:

– Faster time to market: omni-channel and tailored experiences based on analytic insights and market feedback.
– Responsive delivery: adaptive content management consolidates information from multiple sources to publish the most effective content for the buyer. Adaptive content delivers experiences that are device agnostic and can adjust as customers switch between devices and channels.
– Unified, seamless experiences: connecting a broad ecosystem of partners (e-commerce, translation engines, web analytics, social media, and more) to deliver a consistent and seamless brand experience.
– Governance: orchestrated information flows ensure information and assets are effectively managed, governed, and can be reused in multiple instances.
– Extensible EIM services: engagement can be extended to include comprehensive EIM capabilities, from enterprise content management to process management, information exchange, and information discovery—all in one environment.
– Flexible deployment options: Experience Suite gives organizations the option to deploy on premise, in the cloud, or in combination using a hybrid solution.

OpenText Experience Suite provides rapid design tools to create and manage content that is delivered to web, social, mobile, and print destinations, and is integrated with the AppWorks EIM developer platform, extending capabilities to all of the OpenText product suites. Solutions can be built on the platform to address digital marketing, customer self-service, social communities, and brand management initiatives. OpenText Experience Suite combines capabilities that have been architected to work seamlessly together. Technologies include Web Experience Management (WEM), Customer Communication Management (CCM), Media Management, Tempo Social, and Portal. To find out more, read the press release.

Read More

As ‘Obamacare’ Open Enrollment Ends, Data-driven Healthcare Ramps up

The first week in April was no “April Fool’s Day” prank for the healthcare industry, as reality started to sink in following a set of changes that will undoubtedly affect the industry for some time to come. The U.S. Senate voted to pass a House-approved measure that would implement a temporary ‘Doc Fix’ for Medicare’s sustainable growth rate formula—a scheduled 24% cut to Medicare physician reimbursement rates—and delay the ICD-10 Coding System for one year, pushing the compliance deadline to 2015. The week also marked the official cutoff for insurance enrollment under the Affordable Care Act (ACA). President Obama announced last Tuesday that more than 7.1 million Americans signed up for health coverage under ‘Obamacare’ for 2014. But just as the rollout of health insurance coverage under the ACA came to an end across the country for 2014, the realization of what data-driven healthcare will have to become moving forward ramped up.

More Americans are gaining insurance coverage. More patient data is coming. Along with all the benefits that extending health coverage implies in terms of healthcare access and financial protections comes the sheer amount of data that it will create, not only for payers, but specifically for providers. As healthcare organizations face this new challenge, however, they are already experiencing rapid growth in the volume of all forms of information. Their current applications and older legacy systems are bursting with information—patient data, clinical and administrative documents, voice recordings, and medical images—which, for most organizations, has continued to raise several serious challenges with cost management, compliance, and legacy system preservation. Unfortunately, the problems with organizing and managing patient information are only just beginning as more Americans with access to health insurance start to flood the market seeking healthcare services.

Transforming care delivery. 
As a result of healthcare reform and other long-standing drivers in healthcare, the stakeholder landscape is evolving. New care delivery and reimbursement models will require healthcare organizations to transform how they deliver care across the continuum of care, placing greater emphasis on quality and prevention. We are, undeniably, moving into an era where healthcare will be hugely data-driven, and to beat any of the challenges we face as an industry, we’re going to need access to information more than we ever have before. More directly, though, healthcare organizations are going to need comprehensive patient health records. These records, as a recent IDC Health Insights study noted, will enable healthcare organizations to provide their clinicians “with a longitudinal view of the care their patients received across the continuum of care and include(s) health information from multiple healthcare IT (HIT) systems managed by various stakeholders.” But simply making a mental note of technology needs won’t be enough for today’s healthcare organizations, given that providers will be reimbursed based on quality care measured against patient outcomes. As such, organizations will have to not only leverage current IT investments, but also invest in integrated patient-centric record solutions to improve care team collaboration and care coordination. Increasingly, though, organizations will also have to invest in solutions that allow them to share data across the continuum of care.

Transitioning to data-driven healthcare.

Healthcare organizations globally have made significant investments in electronic medical record (EMR) systems to manage structured information as they look to meet the challenges of improving health outcomes for their communities while carefully managing operational costs. 
And more specifically to Meaningful Use efforts here in the U.S., EMR technology has offered the basis for improving clinical and population health outcomes, enhancing patient care, and increasing efficiency. Rapid adoption of these systems has certainly offered boundless potential for the advancement of patient care, quality, and safety—all of which support a foundation for collaborative care and help clinicians and organizations alike meet evolving patient expectations. But for all the benefits they have brought the industry, EMRs still fall short of a complete patient-centric view—with only 25% to 50% of a patient’s health information typically available for view in an electronic format. This leaves the majority of the content that comprises the patient record residing outside of the EMR system, due to several key challenges:

– Multiple Formats and Systems: Volumes of information are locked in multiple formats and numerous healthcare information systems, paper-based documents and processes, and non-machine-readable documents—none of which are aggregated.
– Information Silos Prevent Sharing: Patient health information is trapped in numerous siloed repositories without easy access or facility for electronic sharing, making information exchange difficult or non-existent.
– Incomplete Patient Records: Electronic records are incomplete, forcing clinicians to make diagnosis and treatment decisions, and run duplicate tests, due to inadequate access to a complete view of patient history.
– Paper-Intensive Industry: Paper is not going away as quickly as expected, with a majority of documents used in healthcare today remaining paper-based. Paper volumes are increasing year after year, and the inadequate availability of this information in clinical systems continues to result in medical errors.

These challenges have plagued healthcare organizations by way of increased costs and decreased efficiency, at a time when reining in costs and improving efficiency could not be more urgent. 
And the approach many organizations have taken to managing patient information introduces a whole host of problems, from increased risk and quality and privacy issues to difficulty achieving regulatory compliance. To put this in a slightly different perspective, according to IDC Health Insights, 18% of medical errors are the result of patient information missing from the patient record—such as medication history, lab results, known allergies, or blood type. The percentages likely only get worse as additional critical information is inadequately available for clinician review. But as Dr. David Denton noted last week in commentary within Newsweek, “technology isn’t enough to improve healthcare. Doctors must be able to distinguish between valuable data and information overload.” So the challenge we are really faced with is ultimately transforming how we view, organize, access, manage, and use patient information to create efficiencies and optimize patient care. Stay tuned for our next blog post, where we will complete this topic and talk more about why simply having information is no longer enough for organizations to succeed. It now must be organized and managed for use in delivering better healthcare.

Read More

Making the Most of Your CCM Initiative: Taking a Centralized Approach

Customer Communications Management (CCM) can help build customer satisfaction and loyalty – but only if you do it right. And that begins with implementing the right CCM strategy. In my previous post, “How to Advance Your CCM Strategy: Recommendations for Enterprises”, I looked at recommendations that emerged in the InfoTrends white paper, Improve Your Enterprise Customer Communications Strategy in Five Vital Steps. InfoTrends has identified five steps to making your CCM strategy more successful. The first step: Taking a Centralized Approach to your CCM initiative.

Chances are, your company currently doesn’t create, produce, or manage its customer communications centrally. In fact, most businesses don’t. Instead, they manage them by line of business (LOB) or department, where communicating with customers is a key business function. While this may be a popular way of approaching CCM, it’s not necessarily the method that works best. InfoTrends found that a more centralized approach to CCM is more effective, helping your organization in several ways:

– You get more out of your initiative. By taking a centralized approach, you can benefit more from the latest CCM technology, including channel preference management, data analytics, and a synchronized experience between channels. Team members are also better able to see and understand what exactly has already been communicated to a specific customer.
– You improve your ability to stay consistent. By seeing the whole picture, you can make sure brand, messaging, and style remain consistent between departments.
– You can shorten the investment cycle. Having centralized experts in place to guide the initiative through the funding approval process can help push it through any red tape faster. What’s more, these experts are also able to articulate the benefits of investing in CCM, arguing its merits to anyone who needs convincing. 
In my next blog post, I’ll look at the second step to a more successful CCM initiative: Engaging through Mobile Technology. Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps


Start With a Sip From the Asset Management Fire Hose

Today’s plants and energy facilities – along with their workforces – are aging. And while this is no newsflash, the associated challenges aren’t receding. Given that the “newest” large U.S. refineries have been in operation since the 1970s and many people working at them are nearing retirement, it’s becoming even more difficult to meet the “standard” pressures to boost production, decrease downtime and comply with demanding regulations. This hit home when I read a recent report from Ovum Consulting. After surveying utilities, E&P and refining/gas processing executives, Ovum showed that state-of-the-art asset lifecycle information management (ALIM) provides significant operational and competitive advantages. It even helps avoid certain risks and reduce costs. All of this is great news. What’s not so great is that ALIM adoption is very poor across all of these industries. Despite the fairly obvious benefits, companies are choosing not to invest in improving their information management. Perhaps an information management strategy has never been more important than now – when you need a long-term vision for the future, yet you still have to manage your operations for today. It reminds me of the analogy of drinking from a fire hose. When you consider the multiple facets of improving plant operations, it can be overwhelming. Modernizing information systems, preparing for a new workforce and their preferences, meeting the latest regulatory standards, ensuring the security of documentation and the physical plant… The list goes on. When we’ve worked with customers faced with these same challenges, we’ve found that the best approach is to break down the ultimate goal of operational excellence into stages, or individual, sequenced projects. Taking this approach, the steps to the final objective are not only easier to visualize, they’re much more manageable.
And you can realize business rewards right away, not just at the end of a complex implementation process. One important element in the information management mix is getting control of your documents. The Ovum study lists nine examples of ways enterprises may suffer when asset data is incomplete, hard to find, incorrect or inconsistent. These include scenarios of lost time, higher costs and eroded profits, as well as increased risks to people and the environment. Based on our work with customers who are already tackling this challenge, we’ve identified a set of best practices that can help you get started. They cover the gamut from evaluating your content, to ensuring optimal yet secure access, to improving management of change, to integrating with other business systems – all with an eye to making steady progress and achieving benefits at each stage of the journey. No matter what their reasons for initiating information management projects – personnel changeover, the desire to reduce fines or improve schedules, tightening security – customers often admit that the fire hose of improving overall operational excellence looked daunting at first. But after their first sip of a structured information management strategy, they were ready for more.


From HIMSS in Orlando to WoHIT in Nice

A month has passed since the HIMSS annual conference in Orlando. Quite a few comments on it have been published already, and I want to reference the one from the IDC Health Insights team, which is probably the closest reflection of my own impressions after spending five days in Orlando. The weather was very rainy on the day of the HIMSS party at Universal, so we skipped the fun part. The rest of the event was very much predictable, though incomparably bigger than what we do in Europe with World of Health IT (WoHIT). What makes me wonder every time I go to the HIMSS Annual Conference is how challenging it must be for customers to navigate all these booths, kiosks, workshops and seminars. One of the key aspects debated in many ways was the Integrated Care concept. As the adoption of integrated care spreads within the healthcare IT community, it is becoming harder to orient oneself among hundreds of offerings. IDC speaks about these aspects in its recent report on EMEA, as well as in its 2014 healthcare predictions report. HIMSS and IDC are doing a great job of helping customers understand how to get value out of many different offerings and products, and where to focus in the short, medium and long term on the path toward integrated care. The new Continuity of Care Maturity Model (CoCMM) from HIMSS was briefly announced in Orlando and will be unveiled through a bigger launch at WoHIT in Nice. In my opinion, IDC may be slightly ahead of HIMSS, as its integrated care framework has already been taken into account by some European governments and big hospitals. We are contributing to both models, as some other big vendors and healthcare institutions do, and we follow those models and guidelines in some ongoing regional and national projects across EMEA. But let’s leave a deeper analysis of the two models, and the differences between them, until after WoHIT.
In Orlando, I also got the impression that most Electronic Health Record (EHR) vendors were caught up in the idea of decoupling data from apps and were trying to build next-generation platforms to implement it. However, based on my experience, I can say it will take quite some time, especially for the big guns in the EHR space, to really change their lock-in strategies and underlying architectures. It will be interesting to see how those platforms evolve over the next couple of years. Another key element I can’t avoid mentioning is big data and analytics. There has been buzz in this area for several years now, and for us the two go together: analytics with big data. Business intelligence tools have been around for years. I remember the first time I visited HIMSS, in 2007, and it was already full of solutions in the analytics and decision-support areas. What really makes a difference now is the ability to process huge amounts of data (petabytes and beyond) in a reasonable timeframe and to get practical information out of it. This is starting to emerge in the genetics area with genome sequencing and stratified medicine, and a number of university clinics and pharmaceutical companies are already piloting it. If you are interested, you may have a look at Aridhia’s work in the UK. That said, I’m looking forward to WoHIT in Nice this year, which promises to focus on some of the major trends seen at the HIMSS Annual Conference, but less influenced by the dynamics of the US domestic health IT market.


Create a better LinkedIn profile inspired by ECM technology

If you have ever tried creating a portfolio webpage to go along with an online personal website or resume, the portfolio feature on LinkedIn is clearly a welcome addition to an already solid social networking platform. While I’ve been watching people take different approaches to using the portfolio feature on LinkedIn, there is something the OpenText ECM solution does that you should consider as you build out your own portfolio. Although it may seem counterintuitive, it’s not all about the end result – a prospective employer will appreciate seeing the path you took to solve problems. OpenText’s ECM solution has been able to keep versions of pretty much anything you can throw at it for decades now, which for most businesses makes “showing your work” – the steps you took to get to a specific point – really easy. Those of us building out our own personal portfolios need to take the same kind of diligent approach to showing how we work. One example using the LinkedIn portfolio feature would be taking screenshots of an application or website you are developing, starting from the first iteration and ending with the final product. This illustrates how you improved on previous work until you arrived at the final result. How have you been using the LinkedIn portfolio feature to showcase your accomplishments? Do you have any examples of how you’ve used OpenText’s ECM solution to show a prospective client or partner how you work and iterate?
