Enterprise Content Management

April 1st Deadline for Critical Infrastructure Protection is No Joke


Electric utilities must comply as a new standard kicks in. Guarding national infrastructure, especially electric power grids, enters a new phase on April 1, 2014. That's the date by which North American electric utility companies must comply with Critical Infrastructure Protection (CIP) Version 4, a key standard for stopping sabotage, cyber threats, and intentional disruptions to electric utility operations. What's interesting is that despite the negative consequences of CIP 3 non-compliance, many utility companies are faring no better in preparing for the upcoming CIP 4 deadline. Fines of over $4 million were levied in 71 cases of CIP 3 violations, according to NERC. One reason for such poor compliance is a lack of information control. It's no secret to industry veterans that standards can require painful and detailed process steps.

Tip: Use Asset Information Management to Ease Compliance Pain

An Asset Information Management (AIM) strategy helps avoid non-compliance costs by securing critical information while continuing to provide immediate access. More importantly, malicious attacks can be thwarted – the real impetus behind these standards to begin with – since personnel and authorities can take preventive steps when they have access to accurate and complete asset information.

Tip: Learn from History to Avoid Common CIP Standard Violations

Let's look at the CIP standard with the most violations as an indicator of how an effective AIM strategy can help drive better compliance. Standard CIP-001, Sabotage Reporting, has been responsible for a quarter of all CIP violations to date (source: CIO Summit). It deals with reporting disturbances or "unusual occurrences" suspected to be sabotage. Understandably, sabotage reporting aims to expedite finding and acting on sabotage events by requiring each utility to have procedures in place for recognizing events and communicating them to appropriate parties, both internally and externally.
Operating personnel are required to have sabotage response guidelines so they can report such disturbances effectively.

Tip: Identify Essential Documents to Digitize and Control

This reveals one key to readying for compliance: identifying critical documents that traverse the information lifecycle. In this case, sabotage response guidelines could be one such document, similar to standard operating procedures. Surprisingly, in many utilities today, documents like these are likely to be paper-based. Different versions of these critical documents confuse operators, might fall into the wrong hands, and are updated less frequently. A responsible content management strategy prioritizes digitizing documents like sabotage response guidelines, standard operating procedures, and plant specifications, with a focus on locking down each stage of the key document's lifecycle:

- Initial document development, iterations, and revisions throughout the approval phase
- Secure distribution to appropriate staff, after approvals
- Ongoing, regular updates to cleared staff through any revision cycles
- Compliance with regular audits from regulatory agencies
- Archiving (per NERC mandates)

Tip: Starting a Compliance Effort Ends with Better Competitive Advantage

Cleaning up key document workflows and getting control over information is a good place to start for CIP compliance. Additionally, you should become familiar with emerging standards like PAS 55 and ISO 55000, which provide requirements and guidelines for effective asset management. And evaluate how automation technologies can remove human error as well as simplify compliance enforcement and archiving. Although the CIP compliance deadline may drive your near-term content management strategy today, my bet is that you'll find safer operations and better-informed staff will ensure that these best practices endure. Has your AIM strategy helped you prepare for CIP Version 4 compliance? Share your tips and experiences below.
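To make the lifecycle stages discussed above concrete, here is a minimal sketch of a controlled-document state machine with an audit trail. It is purely illustrative: the class, stage names, and transitions are invented for this example and are not part of any NERC standard or vendor product.

```python
from enum import Enum, auto

class Stage(Enum):
    """Lifecycle stages for a controlled document, e.g. sabotage response guidelines."""
    DRAFT = auto()        # initial development, iterations, and revisions
    APPROVED = auto()     # cleared through the approval phase
    DISTRIBUTED = auto()  # securely distributed to appropriate staff
    ARCHIVED = auto()     # retained per regulatory mandates

# Allowed transitions: an update sends an approved or distributed document
# back to DRAFT for a new revision cycle; archiving is terminal.
TRANSITIONS = {
    Stage.DRAFT: {Stage.APPROVED},
    Stage.APPROVED: {Stage.DISTRIBUTED, Stage.DRAFT},
    Stage.DISTRIBUTED: {Stage.DRAFT, Stage.ARCHIVED},
    Stage.ARCHIVED: set(),
}

class ControlledDocument:
    def __init__(self, title):
        self.title = title
        self.stage = Stage.DRAFT
        self.audit_trail = [Stage.DRAFT]  # every transition is recorded for audits

    def advance(self, target):
        if target not in TRANSITIONS[self.stage]:
            raise ValueError(f"Illegal transition {self.stage.name} -> {target.name}")
        self.stage = target
        self.audit_trail.append(target)

doc = ControlledDocument("Sabotage Response Guidelines")
doc.advance(Stage.APPROVED)
doc.advance(Stage.DISTRIBUTED)
doc.advance(Stage.DRAFT)  # a revision cycle reopens the document
```

The point of the sketch is that every stage change is both constrained (no skipping approval) and recorded, which is the property auditors look for when reviewing compliance with document-control procedures.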


Commuting for 12 Hours? Documentum Customers are Finding the Fast Lane!


Imagine driving 12 hours to reach your workplace every morning. In the pharmaceutical industry, it takes an average of 12 years to bring a new drug to market. You can bet that customers are exploring every possible shortcut to lower that statistic, just as you would if facing an exhausting commute. In a recent survey, we asked dozens of Documentum Life Sciences customers to share their tips for back roads and shortcuts as they drive inefficiencies out of their business.

One of the primary goals for Documentum customers is to respond better to changing business needs. Over 80% of surveyed customers say that by using Documentum they are more agile and can respond more quickly to changing business demands. This correlates directly with the fact that over 70% of them also cite Documentum's reduced custom coding requirements as a "strong feature" of the solution. I liken this to a built-in GPS in your car that gives you real-time traffic updates: if there is an accident on one highway, you can easily direct the GPS to another route. Just like those of you who optimize your commuting options – train, car, bus – our customers choose different ways of consuming and using Documentum software based on their needs. In fact, all of the customers surveyed (100%) stated that choice and flexibility in solution delivery options (Cloud, OnDemand, on-premises, etc.) was an important factor in their decision to choose Documentum. Additionally, 94% of customers chose Documentum because it is a proven and trusted solution in the industry, and 88% said having a single vendor that can deliver a comprehensive solution was a key factor in their decision.

When driving, abiding by the law is something we all *should* do all the time. For Life Sciences, abiding by regulations is paramount.
There is no gray area when it comes to regulatory compliance, and that's why 81% of customers ranked Documentum's uncompromised compliance as a very important factor in their decision to choose our solution. Additionally, 71% of customers ranked our deep Life Sciences industry expertise as a very important factor in their decision to work with us. We've been the Life Sciences industry leader in content management solutions for over 20 years, and it's good to see that our customers value that expertise. I hope that by sharing this customer feedback and their routes to success, you'll be inspired to find the long "commutes" in your organization and figure out ways to optimize them. What better back roads or shortcuts can you take? How can you ensure that you don't get that speeding ticket? How can better content management help you reach your destination?


Driving Bank Process Efficiency

As a loyal bank customer, I wanted to fund a new car purchase with a low-interest bank loan. I assumed the process would be fast and simple, but it turned into a saga more akin to gaining security clearance at GCHQ, the UK's intelligence and security centre. The bank had my details, signatures, and so on, and had offered me 'deals' on loans and credit cards in the past. So, just recall the account history, check identity, prove repayment viability, populate forms electronically, route them for approval, sign off, and get me the money safely in my account the same day. No. What transpired was a catalogue of errors and repetitive, disjointed processes devoid of logic, common sense, intelligent workflow, linked automation, or any thought of customer satisfaction. My hopes of efficiently re-engineered, innovative, paperless, customer-facing operations were unfounded.

But, driven by the rising costs of branches, intense competition, and capital expenditure, banks are listening to customers, actively reviewing their service culture, and driving operational transformation. A prime example is Société Générale Albania, which tackled process inefficiency, changed customer perception, improved service levels, and grew market share, stripping out cost and inefficiency whilst driving in value. How did it succeed? By involving its users, stakeholders, and owners in the innovation process and purposefully selecting 'best of need' platforms and integrated applications. The result was a world-class client-facing solutions delivery environment, created in close collaboration between the bank and the system provider Asseco (with its BPS application), underpinned by OpenText infrastructure products and services.
From intelligent data capture, digital archiving, and intelligent data formatting to linking departments, systems, and functions across the entire lifecycle, the bank created a reusable, scalable framework with phased implementation, managing secure, cost-effective, and timely loan and credit card delivery services. Account opening, loan origination, and servicing within minutes is a reality.

In the UK, new legislation has enabled current account holders to switch banks, guaranteed, within seven working days. Previous processes resulted in lost records, late payments, lost credit ratings, and incorrect interest calculations. Switching was deemed too complex and treated only as a very last resort, with account holders resigning themselves to bad service, as observed by US comedienne Rita Rudner: "They usually have two tellers in my local bank, except when it's very busy, when they have one." Today, the process provides faster, more consistent, and more reliable service levels, fostering increased competition and encouraging new entrants.

Loyalty is dead. Consumers have greater choice than ever to chase deals simply and effectively. Differentiating bank services at a time of commoditization will be key to customer retention. Ask yourself these questions. Does your bank provide innovative automated electronic and mobile services designed to make your life easier at reduced servicing cost? Or does it retain expensive branch networks and complex legacy systems, maintaining archaic, lengthy procedures and costly processes? What sort of organization would you like to service your loan?


2020 Agenda: Are You Ready?™

The pace of business has accelerated. We've seen more change in the last 20 years than in the previous 100. Change is the new constant. Organizations must be nimble to adapt to evolving market needs driven by competitive forces, disruptive technologies, and new strategies for growth. Agility is key to capitalizing on innovation and achieving success.

Business dexterity requires the seamless integration of business with management processes. High-performing companies work to align operational speed (executing quickly) with strategic vision (increasing time-to-value) and speed-to-deployment. In most enterprises, there's a speed gap: the difference between the time it takes to switch gears at the operational level to support business strategy (3-6 months) and the time it takes to implement IT solutions or infrastructure to facilitate strategic initiatives (6-10 years). This gap results in ineffectual processes that are implemented too late, after business requirements have shifted or are no longer valid. The consequence is a mismatch between application development cycles and business needs.

To align processes and achieve true business agility, a flexible process management system is required—one that automates processes while giving business users the ability to customize and configure solutions to meet their business needs. I'm pleased to announce that our new Process Suite includes cloud, mobile, and social services to maximize flexibility, deployment options, and business agility. It makes business processes accessible to a wide range of users, enabling them to define, modify, and execute processes faster, according to rapidly changing business requirements. OpenText Process Suite is a comprehensive automation and case management system that delivers more than improved efficiency, offering an optimized end-user experience and a broad spectrum of development options for both IT and business users.
Process Suite delivers sustainable performance advantage through:

- Fast time-to-value: Process Suite drives continuous process improvement across the enterprise, delivering the highest levels of agility and the shortest time-to-value for process and case management solutions.
- Flexible development: IT and LOB organizations can develop and deploy solutions to suit their specific needs—from packaged applications to assembled and configured applications, model-driven development, and code-level development.
- Extensible EIM services: Process management can be extended to include comprehensive EIM capabilities, from enterprise content management to customer engagement, information exchange, and information discovery—all in one environment.
- Broad deployment options: The platform was built from the ground up for public or private cloud-based deployment, giving organizations the option to deploy on-premises, in the cloud, or in combination as a hybrid solution.

OpenText Process Suite is a full-featured Business Process Management (BPM) solution that bundles platform, applications, and add-on technologies, including BPM Everywhere, a Process Component Library, an AppWorks Gateway, Case Management, Cordys Cloud Provisioning, Process Intelligence, and more. Outpace your competitors through aligned operations, strategic vision, and IT infrastructure. Increase agility, innovation, and time-to-value with OpenText Process Suite. Find out more.


The Weight of Information

When I get on the train each morning, I'm carrying a Windows laptop, an Apple iPad, an iPhone (personal phone), and a BlackBerry (work phone). In the course of a 50-minute train ride I'll normally end up using two or three of those devices, because each holds some information, app, or capability that the others don't. The phones have network connectivity, the iPad has a bigger screen for reading, and the laptop has all my work in progress stored locally. So I swap and change between devices as needed, using cloud services and various apps to sync between them. But wouldn't it be great to have one device that could do it all? Is that even possible?

There are several products already on the market that try to do just that: the ASUS PadFone is a smartphone that slots into a tablet format, NEC has offered a fold-out screen that doubles in size, and Microsoft Surface tablets blur the edges between tablets and notebooks. So here is a request to the Apple and Samsung R&D folk: please release an iPhone or Samsung Galaxy which docks into a tablet, which docks into a full-size keyboard. The 64-bit processor in the current iPhone 5s easily has enough power to run a tablet and a laptop. With laptop, tablet, and smartphone running off one processor, I'd have one ecosystem for apps, one place for central storage, one device to charge, and one modular device to carry.

In reality, it's unlikely that this ultimate singular device will ever exist. Lucky for me, then, that OpenText Content Suite allows me to access and edit my documents from multiple mobile devices in a secure and governable way. There's also an added benefit to lugging around a heavy rucksack with four devices and associated accessories every day: it's good exercise.


Take the Capital Projects Documentation Bull by the Horns


I recently came across this sobering stat: in an international survey of capital projects executives conducted by Accenture, about two-thirds reported missing their budgets by more than 25% on all projects. And when it comes to scheduling, only 15% in the energy sector say they completed their projects within 25% of their target dates.

If you're in the energy sector, you won't like this finding either: your markets are getting riskier. Most future investments will be made in less-developed countries. Projects will be larger and more complex. You're likely to face a shortage of experienced talent. And you'll work within a shifting landscape of regulations and policies. That's a bleak picture! However, there are best practices to help your business join the elite class of companies that do bring projects in on time and on budget. Central to all of these best practices is to "take charge" – establish strong policies to manage risk and uncertainty more effectively at the start of the project. In other words, take the bull by the horns from the very beginning.

Granted, a huge herd of bulls inhabits the capital projects field. But one that you can corral successfully is fast, easy, secure access to current information. By gaining control over documents, owner/operators and EPCs can take charge at project initiation and stay on track through completion and handover. We've seen them gain advantages like:

Streamlined management

Getting your arms around the variety and volume of engineering plans, CAD drawings, phase schedules, regulatory permits, and supplier contracts required by capital projects is a complex and growing challenge. Incomplete or out-of-date documentation and information trapped in silos can increase costs and delay projects. Successful companies provide a secure way for project managers, suppliers, and contractors to collaborate, track, and share.
Visibility

Managing your documents is the first step, but you also need to make sure that your teams can get their hands on the right documents, right away. Departmental silos and security concerns limit their ability to quickly access the right documents. A unified content repository that provides an accountable, single version of the truth enables users to do their jobs more accurately and efficiently.

Improved interactions

Proper control can also help managers, partners, contractors, and, of course, inspectors make quick and confident decisions. But because of organizational boundaries, firewalls, and a lack of agreed processes, conducting efficient reviews and providing timely approvals can be a huge challenge. To ensure that both collaborations and contractual document exchanges are well managed, a well-oiled and flexible transmittal process is necessary to accelerate productivity and minimize errors.

Compliance readiness

There's no underplaying the importance of up-to-date and accessible documentation when it comes to meeting regulatory requirements and preparing for inspections. Without accurate reports and audit trails, you're at risk of sanctions and fines, an HSSE violation, or worse. Controlling project information from Day 1 is the best defense.

OpenText has solutions and industry expertise that can provide a more complete view of the thousands of critical documents on your project – plus the ability to manage and share them securely on a flexible platform built specifically for capital projects. It's one solution for riding herd on complex projects from the very beginning, which can contribute substantially to coming in on time and on budget. Do you agree with this "take charge" philosophy of getting control of your documents at the start of the project? Let us know what you think.


Innovation at ClinTech 2014

The ClinTech conference this past week in Cambridge, Massachusetts was an excellent meeting. The focus was on innovation in clinical trials, and the event was more successful in helping me think about innovation than any other meeting I've been to in the last several years. Here are the top three questions, which we as an industry must address, that I took away from the event:

1. Where are the patients?

Typically we think of three primary points of view in the conduct of a trial: the sponsor, the CRO, and the investigator. A fourth was raised (in a presentation by Thomas Krohn of Eli Lilly) that has perhaps been lost but is probably the most important: the patient. The DIA eTMF Reference Model covers the document objects for the three primary points of view and serves current Good Clinical Practice (GCP) guidance, but perhaps we can add some things that will help patients feel they are primary contributors to the success of the trial. Surely we cannot conduct trials without them, and the reason trials are conducted is to improve patient outcomes. Patients own (and are protected around) their own information. They participate in some of the workflows (signature on informed consent, etc.) and should have the option to receive and own their own data. If patients are hard to recruit, maybe it is because there is too much friction in the process. How can we begin to eliminate that friction by including them in the collaboration model and making them feel they are a vital part of the process?

2. How can we help investigators?

There is tremendous interest in the industry in reducing paper and moving to eTMFs. However, this initiative doesn't currently consider the challenges of helping the investigator easily provide the documents needed to make the eTMF complete. The investigator portal concept has been growing but is not yet integrated with the eTMF.
One vendor predicts that investigator portal technology will be absorbed into the eTMF as a single solution within five years. What do you think?

3. How can we shorten clinical trials?

We often talk about the benefits of improving the conduct of clinical trials and how tools can help us realize those benefits. One vendor's survey results indicated that the most important goal has been the least realized: shortening the total elapsed time for the trial. My takeaway is that we need to refocus on this high-priority goal and help our customers target, measure, and achieve this objective.

In summary, I think I have learned that the best answers moving forward may not lie in refining the technologies that automate what we have done before. Maybe the best answers lie in our ability to step back and rethink the basic business objectives – this may yield a whole new, fresh view of the business and the corresponding processes required to support it.


Supporting Government Field Workers With Amazing New Technologies


Many government employees spend most of their time working in the field. Some examples of these types of workers include:

- Case workers for social services
- Construction / building inspectors
- Public health inspectors
- Investigators
- Fire code inspectors

This is just a short list of the many different types of government workers who perform their job functions outside of the office. For many of these workers, the job is difficult, time-consuming, and filled with paper forms and reports.

Picture in your mind a social services case worker working with families in need. Our worker is named Sarah, and she feels overloaded and overworked. Her caseload is increasing, and the number of workers available has not kept pace. Sarah is now asked to see more families per day than ever before. Her typical work day starts with a database application in the office that shows the list of families to visit. Most are overdue for an evaluation. Sarah pulls the top 10 most overdue and generates a report. As the report is printing, Sarah goes to the file room to get the case folders from the shelf. The case folders are placed in a box along with the reporting forms she will fill out as she meets with each family that day. During the meetings, she takes notes and checks boxes on the paper form and puts it back in the file. She also takes some pictures of the home and the children. At the end of the day, she returns to her desk, updates the database with the information captured on the paper forms, prints the pictures, and puts them in the file folder, which goes back to the file room. It's been a long day for Sarah.

Luckily for Sarah, her department has recently implemented new technologies. Let's see how she is working today. Sarah now works primarily from her mobile tablet and no longer needs to start her day in the office. The tablet is always online and connected.
She opens her case management application and sees the list of families for evaluation today. One tap opens the virtual case file, where she can see all the data and documents from past meetings. She looks at the notes and pictures and sees that the first family she will visit was asked to repair a broken and dangerous front porch rail. The picture visible on her tablet was taken by another case worker and clearly shows the broken rail that was to be repaired. As Sarah arrives, she looks at the rail and sees that it has been repaired. She takes a picture with her tablet and taps the data field to check off that the fix was completed. The picture will be stored in the virtual case file along with all the other documentation from the visit.

As she talks with the family, she taps her tablet to update data fields and types some notes as she talks. She notes that one of the children seems to have a speech issue, but she is not a child speech expert. She records a short video of the child talking and tells the parents that she will have the video evaluated by an expert to see if the child needs special speech therapy. She taps a few more fields on her tablet and then submits the online form. Sarah is now off to the next meeting.

Back at the office, Sarah's manager is alerted to a new task in her case management application. She opens the virtual case file and reviews Sarah's evaluation report. She sees that a follow-up by a speech therapist has been requested, so she clicks the follow-up button, selects the type, and launches the task. She then signs off that the evaluation is complete and that all areas have been covered. The recorded video is automatically sent to the external contracted speech expert via email. Once the expert has reviewed it, an email is received by the system and placed in the virtual case file, and a new task is generated for Sarah with the results of the evaluation.
At her next stop, Sarah opens the task and sees that speech therapy is needed. She calls the family and tells them how to schedule a visit with a speech therapist. Do you believe that this scenario is possible with current technologies? Is this the type of field worker system your organization needs? OpenText is at the leading edge of mobile field worker support and can deliver this system today.
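The essence of Sarah's scenario is a virtual case file plus automated task routing. The sketch below illustrates that pattern in a few lines; it is a hypothetical model for this blog post, with invented class and field names, not an actual OpenText API.

```python
from dataclasses import dataclass, field

@dataclass
class CaseFile:
    """The 'virtual case file': one container for all data and documents."""
    family: str
    documents: list = field(default_factory=list)  # notes, photos, videos, reports

@dataclass
class Task:
    description: str
    assignee: str
    done: bool = False

class CaseWorkflow:
    """Routes follow-up work triggered by a field evaluation."""
    def __init__(self, case):
        self.case = case
        self.tasks = []  # open tasks, e.g. manager review, expert follow-up

    def submit_evaluation(self, notes, attachments):
        # Everything captured in the field lands in the virtual case file...
        self.case.documents.extend(attachments)
        self.case.documents.append(notes)
        # ...and submitting the form alerts the manager for review and sign-off.
        return self._create_task("Review evaluation", assignee="manager")

    def request_followup(self, kind, expert):
        # e.g. route a recorded video to an external speech expert
        return self._create_task(f"{kind} review", assignee=expert)

    def _create_task(self, description, assignee):
        task = Task(description, assignee)
        self.tasks.append(task)
        return task

case = CaseFile("Family #1042")
flow = CaseWorkflow(case)
flow.submit_evaluation("Porch rail repaired; possible speech issue.",
                       ["porch_photo.jpg", "speech_sample.mp4"])
flow.request_followup("Speech therapy", expert="speech_expert@example.org")
```

The design point is that no paper changes hands: the field capture, the manager's review task, and the external expert's follow-up all hang off the same case record.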


How to Advance Your CCM Strategy: Recommendations for Enterprises

Has your company implemented a Customer Communications Management (CCM) system, or are you thinking of doing so? If so, you probably already know a lot about the benefits you can garner from CCM: higher customer satisfaction, stronger customer relationships, and even an improved lifetime value for each customer. However, making the most of your CCM initiatives means introducing a proven strategy that will move your business forward. A new white paper from InfoTrends, Improve Your Enterprise Customer Communications Strategy in Five Vital Steps, reveals exactly what goes into a successful CCM strategy. The following are some of the recommendations that emerged:

– Make CCM a hub for centralized communications. This way, you can holistically track your communications and ensure consistent branding and messaging.
– Invest in a CCM Center of Excellence. It will help you quantify and articulate all of the benefits that emerge from CCM, while also helping to speed up implementation.
– Consider template-driven technology. This will empower business users to create, manage, and fulfill communications themselves, taking some of the burden off IT.
– Think about your mobile strategies. With so many people using mobile technology today, the market is full of opportunities for companies looking to create interactive communications with their customers.
– Analyze your customer behavior. Data analytics and business intelligence will help you do so, while also assisting as you create data-driven communications, all with the end goal of influencing customer behavior and driving upsell and cross-sell opportunities based on predictive analytics and past behavior.

Over the next few blog posts, I will delve more into InfoTrends' findings, highlighting some of the other key ingredients of a successful CCM strategy. Next up: creating a more centralized approach to CCM. Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps.


Have You Driven Your Data Lately?

While driving my car to the office in snowy New England the other day, I had an "analogy epiphany". I'd been wrestling with conceptual ideas to illustrate the process of data migration and transformation. The much-used "data-driven" concept led me to the idea of "driving data": what is the vehicle, who is driving it, and what is the fuel?

Just like a car, a power-generation plant is an assembly built from a complex set of design specifications, with parts supplied by multiple vendors. From design and construction through to operation, the "fuel" powering the vehicle is data. This data must be defined, refined, re-accessed, updated, tested, and migrated. During this process, professional services consultants "drive" the car, guiding it onto the right roads and eventually handing it over to its ultimate driver: the owner/operator of the plant.

Data migration and transformation is critical during the design/build and handover and commissioning phases of energy plants, as well as during the operation and maintenance of existing plants. Inconsistent, out-of-date, or irretrievable data creates a bottleneck and a vulnerability in a system that relies on trusted data for safe and dependable decisions. It is essential that information is verified for format, content, and structure in order to deliver tangible value over the operating life of a plant, which can span 30 years or more. Industry analysts will tell you that the average loss due to poorly managed information handover can exceed 1% of total capital project expenditure. So, for a $1 billion project, that could mean an avoidable loss of $10 million or more!

Our consulting teams possess the driving skills and experience to facilitate data transformation that delivers relevant and useful content to operations and maintenance teams. They leverage industry best practices to ensure that owner-specified operations and maintenance information is organized for broad search and retrieval.
And they manage the process according to key industry and regulatory standards, including those associated with the North American Electric Reliability Corporation (NERC), Critical Infrastructure Protection (CIP), ISO 15926 for data interoperability, and health, safety, security, and environmental regulations. As we expand solutions for managing operating plant information with the Documentum Asset Operations solution, it becomes clear that one key to a successful deployment is the delivery of comprehensive and correct data for the plant. To that end, we have a Handover & Commissioning solution that provides both the transformation technology platform and the best practices to take information from multiple, disparate repositories and ensure this valuable data is ready for plant operations. So next time you hop in your car, consider not only the design of your vehicle, but also the fuel and care that lead to its long operational life … especially if you live in New England in the winter!


How Safe is Your Bank?

Traditional banks have changed. Customers visit their banks less than ever before, and even the ambiance within the bank has changed. Walk into more and more branches and you will see a marked shift from secure teller stations to open, airy, shop-like premises more reminiscent of coffee shops. Dispel the image of hooded robbers smashing their way into vaults and leaving in fast getaway cars. Today's thieves are not interested in physical cash. They are sophisticated, knowledgeable, computer-savvy, and well prepared, with a range of electronic tools at their disposal to crack codes, blow open vaults, and leave the scene with valuable data, access codes, and identities, without leaving muddy footprints or sticky fingerprints.

The anonymity, speed, complexity, and scale of bank-related crime has come as a shock to the industry: not only do institutions suffer financial loss, but more than ever they are subject to increasing reputational risk, loss of trust, and market impact. Few banks have been immune to electronic attack, and from a conspiracy theorist's viewpoint, threats of terrorist attacks destabilizing the world's financial markets have attracted huge audiences in popular fiction. The probability is far greater, however, that compounded systemic internal failures will do the job far more effectively, sparking heavy operational, market, and liquidity risk. Recent ATM outages at major banks have resulted in denial of service, unpaid mortgages and utilities, and an inability to withdraw cash or make instant payments, leaving customers disadvantaged for limited periods. But it is the spectacular rise in card cloning, fraudulent cash withdrawals, identity theft, and a general crash of confidence in financial institutions that is of most concern. The real question is not how and where your cash is stored, but where the data relating to your accounts, your investment preferences, and your attitude to risk resides, and who you ultimately trust. So is your financial data safe?
Not according to A. K. Dewdney in Scientific American, March 1989: "The only truly secure system is one that is powered off, cast in a block of concrete and sealed in a lead-lined room with armed guards – and even then I have my doubts." If you think cash is no longer the financial instrument of last resort, you may have considered crypto-currencies such as bitcoin, a peer-to-peer, zero-cost payment system and digital currency introduced as open source software in 2009. It is worth remembering that there is no central authority or bank involved; nobody owns, polices or regulates it, so if it fails there is no financial redress. After an auspicious start, with the value of the currency rising to over $640 per unit, frequent and more disturbing system problems have resulted in confidence being lost among the exchanges, causing price falls as users tried to liquidate their holdings. Again, the innovation and extended payment possibilities without using traditional banks are very attractive, but not yet matched by the technology necessary to create a safe, confident user experience. Credit and prepaid cards are becoming safer too. From early 2000, after reported frauds of over $750 million, Visa and MasterCard embarked on a program to unify security standards to protect user identity, authorize stored payment data and improve merchant security. Five years in the making, the Payment Card Industry Data Security Standard (PCI DSS) was born, and it is now an intrinsic part of any organization performing payment processing. See: this history of PCI DSS examining key security events, from Y2K to PCI DSS 3.0. Ultimately, technology is available, at a price, to protect individuals, companies and economies, using cryptography, passwords and biometrics while providing a fast, reliable, trusted customer experience.
And to ensure continued protection, regulators and special interest groups are playing their part, creating roadmaps and recommendations to tackle fraud, set standards and drive market acceptance. However, forces are continually at work to break through the ever-thickening layers of security. It pays to be prepared: understand the risks, allocate funds in the right proportion to protect systems without passing the cost directly to customers, and embrace new certified technologies at the earliest opportunity. With potential claims now estimated in the millions of dollars, how much are you prepared to pay to stay safe?
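To make the password side of that protection concrete, here is a minimal sketch of salted, slow password hashing using Python's standard library PBKDF2. The function names and iteration count are illustrative choices, not drawn from any product mentioned above:

```python
import hashlib
import hmac
import os

# Illustrative parameter only; real systems should follow current guidance
# on iteration counts for PBKDF2-SHA256.
ITERATIONS = 200_000

def hash_password(password, salt=None):
    """Derive a slow, salted hash to store instead of the password itself."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, expected):
    """Recompute and compare in constant time to avoid timing leaks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("not the password", salt, stored))              # False
```

The point of the slow derivation is that a stolen database of hashes becomes expensive to brute-force, which is exactly the kind of layered defense the standards bodies push for.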

Read More

HIMSS 2014 Recap: Financial Concerns top-of-mind for Healthcare IT Professionals

healthcare IT

When I stepped onto the floor last week at the HIMSS Annual Conference and Exhibition in Orlando, I knew what almost all of my more than 37,000 counterparts knew too—the healthcare industry is facing mounting financial pressures. So it was a pretty sure bet that, as I walked from end to end of the Orange County Convention Center, I was passing someone who works at one of the more than half of all U.S. hospitals that operate in the red. Knowing this, it didn't come as a shock to any of us in the industry when the HIMSS Annual Leadership Survey noted that healthcare leaders had concerns over the perceived impact that financial resources are having on healthcare IT implementations. Taking a step back, the past two and a half decades have brought providers and healthcare IT together in a way that continues to advance patient care. And in recent times, healthcare reform has only strengthened patient care strategies, proving that healthcare IT initiatives are paying dividends for the organizations that implement them. The question that many healthcare IT leaders are struggling with, though—as my recent blog post touched on—is whether healthcare IT is showing enough return on investment (ROI). More to the point, is it showing enough ROI to allow providers and IT leaders to continue to make the healthcare system more efficient and cost-effective through technology and, at the end of the day, improve patient outcomes as well? As you would imagine, this was a hot topic at this year's HIMSS Conference. A majority of IT leaders responding to the survey (65 percent)—albeit a relatively small portion of total attendees (298 responses vs. 35,000+ attendees)—reported that their IT budgets actually increased this year. This didn't necessarily come as a surprise to me, given the staggering statistics in recent years regarding investments in electronic medical records (EMRs) as providers transition from paper-based systems to near-paperless environments.
However, it was somewhat surprising given that healthcare organizations have typically cut back in areas like IT to cope with spiraling healthcare costs and reduced health insurance reimbursements. See more on this topic here. Regardless, healthcare IT leaders still face any number of issues and challenges with IT priorities, technology adoption, and IT security. In fact, as many of my conversations during the week made clear, organizations are struggling to find adequate financial resources for successful IT implementations. And, at this point in the game, "challenge" is probably a lackluster term to use—"barrier" is more fitting. As organizations dig deep in this new era of care delivery to cut costs and make care more affordable for patients, IT leaders will continue to face these barriers. And they will continue to rank financial viability among their top concerns, as 25 percent of respondents did this year. Healthcare cost concerns are not going anywhere anytime soon, so organizations are going to run into significant barriers when trying to implement healthcare IT for some years to come as they face a lack of financial support and a lack of staffing resources. For most, the question now shifts from simply showing ROI to finding innovative ways to leverage current IT investments and transform how the organization organizes and manages patient information to improve patient outcomes while reducing costs. Not to mention overcoming the financial constraint barrier and selling top leadership on the need itself, as IT leaders seek the financial support they need to implement the solutions they would like to have.
After all, data-driven healthcare is ultimately the only way to enable these new care delivery and reimbursement models that are evolving as a result of the Affordable Care Act. Think of it as ROI 2.0 in this post-reform healthcare era we are embarking upon. How is your IT department dealing with financial constraints in trying to implement solutions that improve patient outcomes while reducing costs? Share below in the comments box.
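For readers who want the arithmetic behind the ROI discussion, the basic calculation is simple. The figures below are purely hypothetical, not taken from the HIMSS survey:

```python
def simple_roi(gain, cost):
    """Classic ROI: net benefit relative to what was spent."""
    return (gain - cost) / cost

# A hypothetical EMR rollout: $2.0M invested, $2.6M in measurable savings
# (reduced transcription, fewer duplicate tests) over the evaluation period.
print(f"{simple_roi(2_600_000, 2_000_000):.0%}")  # 30%
```

The hard part, as the post argues, is not the formula but capturing the "gain" side: quality improvements and better outcomes rarely show up as a single clean line item.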

Read More

Improving Patient Care by Archiving Electronic Medical and Health Records

I have been talking lately about organizations making a great effort to keep records available while at the same time ensuring compliance with records-keeping policies and data privacy laws. To succeed, organizations will have to start thinking about an Information Governance strategy. Two items worth discussing with regard to making records available are Electronic Medical Records¹ (EMRs) and Electronic Health Records¹ (EHRs), the latter sometimes referred to as Personal Health Records (PHRs). Around the world, systems for EMRs and EHRs are being implemented to improve patient care, reduce health care expenses, and fundamentally change the way in which medicine is practiced. Health care providers in various countries are financially supported by government programs to get an EMR in place, and software vendors are building solutions to help them. Implementing an EMR has its challenges, but there are some great examples where OpenText helped customers succeed in that challenge. Geisinger is one of those customers. Implementing Electronic Health Record systems presents another challenge: ensuring consistency of format from one EHR to another. Don't forget it is also critical that we respect and protect a patient's sensitive information, including HIPAA-protected information, which makes migrating paper records to an online format an even bigger effort. For a country, putting an EHR in place is a lengthy process, but doable. Several countries, especially in Europe, have achieved this or are steering quickly towards it, including Denmark, France, and Norway. My biggest question mark, however, is why relatively small countries like Denmark, the Netherlands, and Belgium want EHRs as separate countries. Take the Netherlands, where driving just a couple of hours would put you in another country: an EHR that is accessible only from within the Netherlands is of little use to patients who need health care outside of it.
Insurance coverage outside of the Netherlands—and, for that matter, within the EU—isn't the problem; getting foreign doctors to understand patients' health records is. The reason is language. Not to mention trying to explain anything during a holiday abroad: not everyone speaks the language of the country where they go on holiday, nor does everyone speak several languages fluently. Likewise, a physician from France may have trouble communicating verbally with a visiting patient. With EMRs and EHRs at the European level, built on shared codes for medications, treatments, and allergies, the correct translations are possible for each country. This saves time, frustration, and, even more importantly, errors due to miscommunication. So, when making medical and health records available, let's make sure to take the most efficient route, giving time back to doctors and letting them do what they are good at: taking care of patients. Learn more about building a successful archiving strategy at www.opentext.com/archive. 1. Electronic medical records (EMRs) are the record of patient health information generated by encounters at one particular physician practice. This is the physician's own electronic record of his or her patient's medical care. Electronic health records (EHRs) are defined as patient health records that include clinical data and information from multiple sources and that are maintained outside of a single hospital or clinic. An EHR is a record of a patient's long-term and aggregate health information generated by one or more encounters in any care delivery setting. Stemming from the interoperability of multiple providers, the EHR is distinct from the software systems that directly support caregivers treating patients. Rather, the EHR connects the physicians and other caregivers. Included in this information are patient demographics, progress notes, problems, medications, medical history, immunizations, laboratory data, and radiology reports.
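The code-based translation idea above can be sketched in a few lines. The mapping below uses real WHO ATC medication codes, but the dictionary itself is a tiny illustrative stand-in for the terminology services a European EHR would actually rely on:

```python
# A shared code travels with the record; each country renders its own label.
DRUG_LABELS = {
    "N02BE01": {"en": "paracetamol", "nl": "paracetamol", "fr": "paracétamol"},
    "B01AC06": {"en": "acetylsalicylic acid", "nl": "acetylsalicylzuur", "fr": "acide acétylsalicylique"},
}

def localize(atc_code, language):
    """Render a coded medication entry in the clinician's language."""
    labels = DRUG_LABELS.get(atc_code, {})
    return labels.get(language, atc_code)  # fall back to the raw code

# A Dutch record viewed by a French physician:
print(localize("N02BE01", "fr"))  # paracétamol
```

Because the record stores the code rather than free text, nothing is lost in translation; the display layer does the language work.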

Read More

Early Adopters in Energy Foretell a “Connected Plant” Future

At a recent ARC Industry Forum, I heard a lot of talk about "The Connected Plant." I asked myself: have they just rebranded Asset Lifecycle Information Management (ALIM)? But upon reflection, the best way to think about the two is that ALIM enables the connected plant, providing the infrastructure to integrate critical asset-related content and processes in engineering, maintenance and operations. And the results are very encouraging: increased plant productivity and reduced overall risk. As we heard at the conference, early adopters are placing their bets. In the conference's keynote presentation, Andy Chatha, President of the ARC Advisory Group, declared that the era of easy oil is over. Capital projects are getting more expensive and more risky. With over $120B in capital projects invested in 2013 by Exxon, Chevron and Shell alone, I realized that these industry changes apply to many of our customers. What changed and why? First off, I believe that connectivity matters more than ever before. Today in Energy & Engineering, as in other industries, the number of disparately owned entities involved in a project is increasing. Major capital projects require such a significant investment that partners are often brought in to ease the financial burden. Other parties include specialists who collaborate on key portions of the project, like design, surveying, or compliance. While a plant may in the past have been shut off and tightly controlled by a select few, today the opposite is required to survive. It's not just about managing content, but about making key information safely accessible to those who need it. I also believe a change in focus is partly behind the Connected Plant mindset shift. The real objective of any information management effort is to achieve a business impact.
By focusing on the strategic business priority—working with others to find and deliver energy more cost-effectively—we have moved from a functional description of information management to a goal-oriented one of better collaboration and connectivity. But beyond the vocabulary change, I was also intrigued to see early adopters at the conference moving past the conceptual stage and into trials. A session from North West Redwater Partnership (NWR) discussed an $8B+ refinery project to produce high-efficiency diesel fuel and derivatives. Imagine managing information and collaborating among six engineering, procurement and construction companies. That's what's involved in their massive project. To me, this is a company evaluating its entire information asset lifecycle strategically, and investing in the right way to improve efficiencies. This includes "a centrally hosted and integrated solution that can be maintained through to operations and sustainment." They have translated a Connected Plant vision into attainable goals which, even if not met exactly, will nonetheless buoy their competitive advantage through optimized project design costs and the capture and re-use of engineering information throughout the lifecycle of the plant. While we may be in the early stages of implementation for such collaboration and information-sharing projects, I was pleased to see in NWR a real example that many of our customers can learn from. It's not immediate, and not yet spread throughout entire global organizations, but it is trial-and-learn nonetheless. Just as we saw the concept of the Smart Grid appear and, over time, move from concept to trials for utility companies, I believe the same is happening in our industry for "The Connected Plant." What do you think? Share your comments below.

Read More

Dynamics in the Oil Industry – From Bacteria to the Big Boys

oil industry

This is the story of small single-celled bacteria that lived on the earth 3 billion years ago. There were many millions, billions and trillions of them living in the oceans; they were the only living organisms in the world at the time. They lived in huge sea colonies, growing to millions of tons. As the bacteria died off, these massive colonies settled on the sea bed and were gradually covered by sediments. Over millions of years, sediments accumulated layer upon layer on top of the dead bacterial colonies, and under heavy pressure and high temperatures the colonies were cooked into liquid hydrocarbons. As time moved on, the sea receded in places, and these became our onshore oil fields; where the sea remained, they became our offshore oil fields. So forget the dinosaurs—they were never our oil-producing friends. It was those tiny single-celled bacteria that help us light our homes and fuel our automobiles. Three billion years later, and more precisely in the 21st century, the big boys now play with the enormous hydrocarbon reserves that were generated by our tiny friends. Governments that own the land or seas on top of these reserves have a keen interest in bringing the oil and gas to their people. To do this, the governments lease out the land to major oil companies that have the technology to explore, develop, produce, transport and distribute the end products. Government-sponsored geologists, geochemists and geophysicists play a major role in predicting the potential worth of discovered reserves and their likely production, measured in barrels per day (bpd). For example, the Brazilian oil field of Libra has been estimated to hold between 8 and 12 billion barrels of reserves. As per the rules of the game, governments will float a tender to lease out that land for a fixed number of years, 10+ years in most cases.
Majors such as BP, Royal Dutch Shell, and Chevron then bring in their own teams of geologists, geochemists and geophysicists to evaluate the government's predictions, and submit their proposals accordingly. Shell might believe that in 10 years it can extract only 1 billion barrels and, at the going market rate for oil, will bid a certain dollar amount. Similarly, all the oil majors will evaluate and bid based on their own predictions. You can understand why the geoscientists are so critical and considered so valuable to their companies.

The Race to Production

Once the contract is awarded to a major, of course, things don't get simpler from there. The oil ministries and the top executives of the oil majors continue inter-governmental negotiations. Time starts ticking away on the oil major's 10-year lease: every day there is no oil production, the company is losing money. Time is literally money for them. The majors will sub-contract with oil services companies to start the exploration, development, and production work, including all the EPC (engineering, procurement and construction) work. It's a maze of contracts and sub-contracts, with interests split across different sectors and organizations. However, all of this needs to be synchronized like clockwork to generate the wealth and benefits. This activity is represented by the tiny green box called Positive Cash Flow in Figure 2. This is where the oil is produced and monetary benefits are realized, by selling and distributing the various products and by-products of these hydrocarbons—all before the 10-year lease expires.

Information Management Standards are Critical

This time dependence and criticality puts enormous importance on the information handling and management strategies adopted by the industry over the years.
Within the boundaries of this strategy lie models such as the Professional Petroleum Data Management model as well as OpenText solutions including well file management for development drilling, Documentum Capital Projects for design and construction, and Documentum Asset Operations for production and operations. For companies operating in exploration and production, these standards help optimize upstream operations. Professional Petroleum Data Management (PPDM) is a not-for-profit association with members from upstream oil & gas operating companies, data and software vendors, and related service providers. In its current model, PPDM provides an extensive data model—covering data management, seismic, land rights, wells, and production—used by upstream operators to better manage the data captured and mined during the exploration and production phases.

Solutions for Upstream

We have developed a suite of solutions that address information management challenges in the upstream phase. Documentum Capital Projects helps exploration and development drilling companies and their associated EPC contractors achieve easier and better management of project deliverables (documents, drawings, reports, and transmittals) across multiple capital projects. We also help with handover and commissioning by enabling companies to receive, import, organize and transform content throughout a capital project's design/build phase for use in an operations and maintenance environment. After the project is complete, Documentum Asset Operations takes care of other standard processes, such as Management of Change (MOC) for project modifications, all within the context of the as-built asset. In upcoming blogs, I will delve deeper into these solutions.
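The bidding arithmetic described earlier can be made concrete with a deliberately naive valuation sketch. The discount rate, price and production profile below are hypothetical illustrations, not how any major actually prices a bid:

```python
def naive_bid_value(recoverable_barrels, price_per_barrel, annual_discount, lease_years):
    """Discount an even production stream over the lease term back to today."""
    per_year = recoverable_barrels / lease_years
    return sum(
        per_year * price_per_barrel / (1 + annual_discount) ** year
        for year in range(1, lease_years + 1)
    )

# 1 billion recoverable barrels over a 10-year lease at $100/bbl, 10%/yr discount.
value = naive_bid_value(1_000_000_000, 100.0, 0.10, 10)
print(f"${value / 1e9:.1f}B")  # roughly $61.4B
```

A real evaluation layers in geological uncertainty, ramp-up curves, taxes and costs, which is exactly why the geoscientists' reserve estimates carry so much weight: move the recoverable-barrels input and the whole bid moves with it.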

Read More

The Rising Tide of Smarter Collaboration, in Life Sciences and Beyond

Life Sciences collaboration

Collaboration and efficiency through smarter information management is, like the waves outside my hotel room window, unpredictable. Yet surfers float, awaiting the next big ride. Similarly, nobody can tell where the next inspiration for major new drug innovations might start. We do know that getting approval to market drug innovations is paramount for Life Sciences survival. And we know content management can create the right "swell" to buoy any business swimming in documentation. By swell, I mean cost efficiencies and smarter ways of working, which in turn can help speed time to market. Everyone benefits when effective and safe drugs get to patients as quickly as possible to drive better health outcomes.

Repositioning Products & Process

One place to develop efficiencies is in your existing product set. Search for ways to extend an innovation, or perhaps reuse it for different indications. Adding time-release chemistry could extend the life of an existing drug, as could identifying an unexpected new indication for it. Processes and workflows are also worth a fresh look.

Simplify "External" Collaboration

The process of creating, reviewing and approving regulatory submission documentation isn't new. But as we've seen in many industries looking to lower costs, collaboration is now required far beyond internal and contracted resources. Information is authored, reviewed, approved, and acted upon by various parties across multiple organizations. In the past, collaboration beyond a single organization's boundaries ran through emails, file shares, and other unsecured tools, making it discontinuous and poorly controlled. Now, sponsor organizations have solutions that provide an efficient and streamlined process for ensuring submission-ready documentation. Finally, document owners aren't paddling furiously to keep up.
Let the Experts Be Experts

The trend toward compelling user interfaces in content management can keep your experts doing what they love best (hint: not navigating complex systems). Intuitive, role-specific workspaces with tailored feature sets help workers easily perform their tasks without having to read the manual. Late nights of ensuring document readiness, or managing clinical trial paperwork, become far less tedious.

Stay Ahead of Compliance

The best practices above aid in compliance and inspection readiness as well. Should documentation be requested by regulatory authorities, the time to provide it can be minimized. New feature sets offer granular controls that restrict access to only the documentation required, for only the time it's needed. Those with a "need to know" can use a predefined view that simplifies user training and ensures quick response to inspection requests. These content management advances help keep the bite of the audit sharks away. With all of these suggestions in mind, the next new innovation may not be as elusive as that perfect wave after all. Use these four strategies to create your own groundswell of collaboration and efficiency across Life Sciences organizations, and beyond.
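The time-boxed, document-scoped access described under "Stay Ahead of Compliance" can be sketched as a small grant object. The class, document classes and dates below are hypothetical illustrations, not the API of any product named in this post:

```python
from datetime import datetime, timedelta

class InspectionGrant:
    """An auditor's access: specific document classes, for a fixed window only."""
    def __init__(self, doc_classes, starts, duration_days):
        self.doc_classes = set(doc_classes)
        self.starts = starts
        self.ends = starts + timedelta(days=duration_days)

    def allows(self, doc_class, when):
        return doc_class in self.doc_classes and self.starts <= when <= self.ends

grant = InspectionGrant({"trial-master-file"}, datetime(2014, 3, 1), duration_days=14)
print(grant.allows("trial-master-file", datetime(2014, 3, 5)))   # True: in scope, in window
print(grant.allows("batch-records", datetime(2014, 3, 5)))       # False: out of scope
print(grant.allows("trial-master-file", datetime(2014, 4, 1)))   # False: window closed
```

Expiring the grant automatically, rather than relying on someone to remember to revoke it, is what keeps the "need to know" principle enforceable after the inspection ends.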

Read More

Do Healthcare IT Investments Offer Diminishing Returns for Hospitals That Invest in Them?

healthcare IT investments

Late last year, I was presented with a fairly open-ended question about measuring return on investment (ROI) for healthcare information technology. In this particular case, the inquiry dealt with sharing and storing clinical information versus the investment hospitals make to keep up with healthcare information technology. Like I said, it was open-ended, and even that may be a stretch. To be somewhat more specific about healthcare IT investments, two additional questions (or three, depending on how you look at it) were asked of me, along with a generalized comment for input: 1) which technologies are the wisest investments, and which offer only minimal returns?; 2) how important is imaging technology in attracting patients?; and 3) some experts argue that new technology, while enticing, may offer diminishing returns for the hospitals that purchase it. Bearing in mind that these questions were primarily directed at imaging technologies, I was not completely sure that the comments I could offer would be exactly what the individual was looking for. This was especially true given that we offer a healthcare solution with an imaging component as opposed to strictly an imaging technology (such as a PACS). But the last comment stuck with me and seemed to linger in my mind: "some experts argue that new technology, while enticing, may offer diminishing returns for the hospitals that purchase it." In fact, I felt compelled to rebut it, because industry experts who argue against the use and implementation of new healthcare information technology and solutions are just flat-out wrong. In the case of OpenText, our solution—the Documentum Integrated Patient Record (IPR) solution suite—allows hospitals to do a number of things, but as directly as I can put it, it allows organizations to improve patient outcomes while reducing costs. With mounting financial pressures being placed on the industry, and more than half of all U.S. hospitals operating in the red, hospitals that don't invest in information technology and don't refocus their efforts on improving quality will continue to struggle, if not ultimately fail. Increasing the focus on quality leads to a healthier bottom line and, most importantly, it's the right thing to do for the patient, from both an outcomes perspective and a cost perspective. But more than that, evolving business and reimbursement models will require data sharing and, as a result, will lead to data-driven healthcare. There is no other way to accomplish that than with the use of technology. Hospitals and healthcare providers, as a result of the Affordable Care Act, will now be reimbursed—in a nutshell—based on quality of care and patient outcomes. This is going to make it imperative for healthcare organizations to transform how they deliver care across the continuum by investing in and creating integrated, patient-centric records to improve not only the quality of care provided but also care coordination. Furthermore, it allows them to set the foundation for population health management and disease management, both lofty goals for keeping the public healthy while keeping the cost of healthcare at an affordable level. Since clinicians often lack a complete view of patient history, diagnosis and treatment at the point of care, this transformation—led by the use of innovative technology that enhances the value of the electronic medical record (EMR)—will alter how organizations view, organize, access, manage and use patient data. In fact, since patient information is often locked in numerous systems and repositories without easy access or a way of sharing, failing to use technology to break down this fragmentation barrier leads to quality issues, as well as regulatory and privacy concerns.
Our integrated patient record solution unites fragmented patient information—regardless of source, location or format—to provide a fully integrated, patient-centric view of all essential information beyond the electronic medical record (EMR), integrated into existing clinical applications for seamless sharing. The result is enhanced productivity, improved care delivery and better cost management (i.e., a lower cost of care). The return far outweighs the investment, as existing IT investments are maximized. And in actuality, as use cases show, this technology can deliver substantial cost savings for organizations, aside from the savings realized through more efficient delivery of patient care. It allows healthcare delivery to become more automated across the continuum of care by using patient information—from all available sources—to support business processes. And the need to migrate and upgrade clinical applications and hardware is greatly reduced as manual tasks are eliminated. Simply put, investment in healthcare information technology is the smart thing to do; not only for the patient and the quality of care, but for how the entire healthcare chain can benefit from reduced operating costs—savings that can be passed on directly to the patient.
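The "unite fragmented information" idea can be sketched generically: fold records from several source systems into one patient-centric view keyed by a shared identifier. This is a hypothetical illustration of the pattern, not the actual IPR implementation:

```python
# Toy records from two hypothetical source systems, keyed by patient_id.
def build_patient_view(patient_id, *sources):
    """Collect every record for one patient across all source systems."""
    view = {"patient_id": patient_id, "documents": []}
    for source in sources:
        for record in source:
            if record["patient_id"] == patient_id:
                view["documents"].append(
                    {"system": record["system"], "type": record["type"]}
                )
    return view

emr = [{"patient_id": "p1", "system": "EMR", "type": "progress-note"}]
pacs = [{"patient_id": "p1", "system": "PACS", "type": "radiology-image"},
        {"patient_id": "p2", "system": "PACS", "type": "radiology-image"}]
view = build_patient_view("p1", emr, pacs)
print(len(view["documents"]))  # 2
```

In practice the hard problems are upstream of this loop: reliably matching patient identities across systems and normalizing formats, which is where an integration platform earns its keep.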

Read More

Dealing with High-Volume Transactional Documents. Do I really need an ECM system?

Just as you wouldn't try to put out a blazing bushfire with a squirt gun, you probably wouldn't use a stump grinder to uproot garden weeds. It's all about using the right tool for the job. Having said that, we live in a world where feature-rich products, promising to deliver a wide range of functions, are abundant, and it's always tempting to go for a "value-for-money" bigger solution to solve a smaller problem. For instance, when we talk about a document repository, we generally tend to think of a full-blown ECM system with all the bells and whistles: complex branched versioning, virtual documents, an unlimited number of attributes, evolving classification schemes, a rigorous security architecture, and what have you. Although features like these are a must for an enterprise-wide, source-of-truth ECM system, in certain cases a lightweight, high-performance, cold document repository—where you can deposit and retrieve large volumes of transactional documents—is all you need. A re-depository, if you will. It's hard to argue a case against an ECM system that can also act as an HVTO (High-Volume Transactional Output) repository but, by the same token, it's hard to find such an ECM system. A feature-rich ECM is generally built on a complex engine which, inadvertently or otherwise, carries performance penalties that make it unsuitable as a high-performance HVTO repository. So what are the features to look for in an HVTO repository? I am glad you asked. In my opinion, here are a few to consider:

Performance, Performance and Performance!

An excellent HVTO repository is one that outputs high-volume, large documents to multiple delivery channels at high speed. It may not support complex versioning schemes but, again, for high-volume transactional documents like customer statements, that may not be required.

Support for Printstreams

Printstreams are one of the major sources of high-volume transactional documents.
A good HVTO repository must support industry-standard printstreams like AFP, PCL, and Metacode, and should offer an intuitive, fast way of ingesting as well as emitting them.

Integrated Transformation Capabilities

An HVTO repository is expected to provide a built-in transformation engine where content can be ingested, indexed and reformatted as required for storage, viewing or printing.

Integration with Information Silos

In large enterprises, a purpose-built HVTO repository is almost always found side by side with one or more enterprise repositories and other storage systems. The ability to source and search data and documents from multiple systems is needed because of its, largely infamous, role as an enterprise document cache.

Ability to Reduce Storage Footprint

Better HVTO repositories significantly reduce data storage requirements by extracting and storing only one copy of common resources, such as graphics, fonts and metadata, for high-volume printstreams, and reconstituting them upon retrieval.

Fit-for-Purpose Security and Audit

Even though HVTO repositories strive for a simpler security architecture, the reality is that compliance requirements, both internal and external, call for a tighter level of security and audit on the documents stored in the repository. A good repository strikes the right balance between customer needs, regulatory compliance requirements, and keeping it all simple to implement and manage. Now that you know some of the things I take seriously when deciding which repository to use, let me tell you about the smart TV I recently bought. It has hand-gesture control, amazing sound, a 3D screen, voice commands, and the ability to stream 3D and other content from the internet.
Turns out, I do not have a big enough room where I can enjoy the amazing sound, hand gesturing is a bit annoying, with no way to bring ethernet cable to the TV and my slow wireless router and internet connection speed – streaming is not an option anyway. Guess what? I am still using my old TV to watch channel 9 news.
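The storage-footprint idea above, keeping a single copy of shared resources like fonts and logos and reassembling documents on retrieval, can be sketched as a simple content-addressed store. This is a minimal illustration of the technique, not any vendor’s API; all class and method names here are hypothetical:

```python
import hashlib

class ResourceStore:
    """Content-addressed store: each unique resource (font, logo, overlay)
    is kept once, keyed by the hash of its bytes."""

    def __init__(self):
        self._blobs = {}      # digest -> resource bytes (one copy per unique blob)
        self._documents = {}  # doc_id -> ordered list of digests

    def ingest(self, doc_id, resources):
        """Store a document as a list of references to shared resources."""
        digests = []
        for blob in resources:
            digest = hashlib.sha256(blob).hexdigest()
            self._blobs.setdefault(digest, blob)  # keep only the first copy
            digests.append(digest)
        self._documents[doc_id] = digests

    def retrieve(self, doc_id):
        """Reconstitute the document from its shared resources."""
        return [self._blobs[d] for d in self._documents[doc_id]]

    def unique_bytes(self):
        """Actual bytes held, after deduplication."""
        return sum(len(b) for b in self._blobs.values())

# Two customer statements sharing the same logo and font: the shared
# resources are stored once, yet each statement retrieves intact.
store = ResourceStore()
logo, font = b"LOGO" * 1000, b"FONT" * 1000
store.ingest("stmt-1", [logo, font, b"body of statement 1"])
store.ingest("stmt-2", [logo, font, b"body of statement 2"])
```

After ingesting both statements, `store.unique_bytes()` is well under the total bytes ingested, because the logo and font are held once, while `store.retrieve("stmt-1")` still returns the full original resource list.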

Read More

ECM as a Service in Government – a Practical Strategy


Government agencies generate an enormous volume of diverse content as a product of their work and their interactions with the public. The range of content includes traditional documents such as contracts, forms, permits, and case files, as well as CAD drawings, web content, still images, audio and video, and rich media. All of this content is vital to conducting government business, and managing it presents a number of challenges, not the least of which is that much government content is considered public record and is subject to Freedom of Information Act (FOIA) requests.

Moreover, every year a growing percentage of government business is conducted electronically, between agencies and with the public. Electronic records add to the content management challenge and will increase at a much faster pace than paper, which shows no signs of disappearing. Government agencies need enterprise content management (ECM) tools to efficiently manage both.

In addition to content volume and diversity, government agencies are under pressure to leverage the internet and mobile devices to provide citizen services more rapidly. Many government forms are now available online, but the speed and efficiency of the online channel requires more than just creating digital equivalents of paper forms. It demands processes optimized to streamline form submission and automate workflows for delivery of rapid, error-free citizen service.

Understandably, government departments often turn to a central IT function or the office of the CIO to cope with these challenges. And, in general, CIOs recognize that almost all the departments they serve need ECM as well as records management. They see the value in providing these services centrally, but often lack the experience, and the staff resources, to plan, deploy, and administer centralized ECM application services.

Is your agency in need of a centralized ECM service? Are you a government CIO under growing pressure to provide ECM as a service? How can this be done? Read more about how we help different industries here.

Read More

Fire Up Your Asset Information Management Strategy

Asset Information Management

In the middle of winter, it’s perfectly normal to look for opportunities to escape the cold, even if only for a short while. The 18th annual ARC Industry Forum in Orlando, Florida provides the perfect opportunity to thaw out and fire up your asset information management strategy. The forum’s theme, “Industry in Transition: The Information Driven Enterprise in a Connected World”, promises to spark discussions about embracing new information technologies to achieve agility and sustain a competitive edge.

We know that information drives your business. There are significant benefits when you take information that lives on paper, in people’s heads, and in disconnected silos, and unlock it so it can be acted upon to improve your business. An information-driven enterprise can respond quickly to changes in government regulations and employ analytics to make the best decisions amid rapidly changing market conditions, outperforming its competitors.

But getting to that point can seem daunting, even unachievable. It’s like being asked to compete in Olympic downhill skiing before you feel confident navigating the beginner trails. Fortunately, others have gone before you and left tracks for you to follow. The key is to partner with an organization like OpenText that has experience implementing this type of system, is familiar with industry standards and best practices, and knows how to avoid the pitfalls, helping guide you safely along the way. It’s also important to follow an implementation model with clear milestones, so you can make steady progress and realize benefits each step of the way.

Beyond having a good plan with achievable milestones, you need the right equipment to get you there. OpenText has solutions designed to grow along with your expanding needs. You can start with basic, straightforward requirements like getting control of your content and making sure your teams are able to access it. We can help you get there and quickly realize the benefits. As your confidence builds, you can take advantage of capabilities that previously seemed out of reach: adding change management processes or integrating with maintenance management and other business systems will allow you to reach new summits. We can help you unleash your plant’s potential as an information-driven enterprise.

In a forum session at the event, you will hear about a customer that faced these same challenges and started the journey to enhance safety and efficiency through asset information management. You will discover why this customer chose Documentum enterprise content management as the foundation to improve the management and accessibility of asset-related information in their hydro-electric project.

Have you begun planning your asset information management strategy? Comment below and share a tip for getting started or lessons learned from your experience.

Read More