InfoArchive

Step-by-step Guide: Integrate Market Leading Analytics Engines With InfoArchive

Analytics

Gaining further insights from your data is a must-have in today’s enterprise. Whether you call it analytics, data mining, business intelligence or big data, the task is the same: extracting insight from a massive heap of data. But what if your data has already been archived? What if it now resides in your long-term archiving platform? Will you be able to use it in all analytics scenarios? Let me demonstrate how easily it can be done if your archiving platform is OpenText™ InfoArchive (IA).

A customer recently requested a demonstration of integration with analytics/BI tools in a workshop we were running. The question was: what possibilities does InfoArchive offer for integrating with third-party analytics engines? The answer is that everything in InfoArchive is exposed to the outside world in the form of a REST API. When I say everything, I mean every action, configuration object, search screen – literally everything. So we decided to use the REST API for the analytics integration demo.

Which analytics/BI tool to pick? A quick look at the Gartner Magic Quadrant gives some hints. I’ve used Tableau with InfoArchive in the past, so let’s look at another option on the Gartner list: Qlik. OpenText™ Analytics (or its open-source companion BIRT) is my other choice – for obvious reasons. Let’s get our hands dirty now!

Qlik

Qlik Sense Desktop has a simple UI, but there are some powerful configuration options hidden behind the nice façade. To query a third-party source in Qlik, simply open the Data load editor and create a new connection. Pick the Qlik REST Connector and configure it. The connection configuration screen lets you specify the request URL, request body and all necessary header values – all you need for a quick test. Now that the connection is configured, you’ll have to tell Qlik how to process the IA REST response.
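Under the hood, the connection configured above issues an ordinary authenticated REST call. As a rough illustration of what such a call looks like when assembled in code, here is a minimal Python sketch; the endpoint path, search name and criterion layout are assumptions for illustration, not the documented InfoArchive API.

```python
import json

# Hypothetical endpoint and payload layout -- the actual InfoArchive REST
# paths and search names depend on your installation and configuration.
def build_search_request(base_url, search_name, criteria, token):
    """Assemble URL, headers, and JSON body for an illustrative IA search call."""
    url = f"{base_url}/restapi/systemdata/searches/{search_name}"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"criterions": [
        {"name": name, "value": value} for name, value in criteria.items()
    ]})
    return url, headers, body

url, headers, body = build_search_request(
    "https://ia.example.com", "TradeSearch",
    {"tradeDate": "2017-01-31"}, "s3cret")
# The resulting tuple can be handed to any HTTP client, e.g.
# requests.post(url, headers=headers, data=body)
```

The same URL, headers and body are exactly what you paste into the Qlik REST Connector configuration screen, which is why any tool with a generic REST connector can talk to IA.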
Click the “Select data” button in your connection and Qlik will connect to InfoArchive, execute the query and show you the JSON results in a tree and table browser. All you need to do is pick the column names that you want Qlik to process. Since the IA REST response columns are stored in name-value elements, we have to transpose the data. This can easily be done with 20 lines of code in the Qlik data connection:

Table3:
Generic LOAD * Resident [columns];

TradesTable:
LOAD Distinct [__KEY_rows] Resident [columns];

FOR i = 0 to NoOfTables()
  TableList:
  LOAD TableName($(i)) as Tablename AUTOGENERATE 1
  WHERE WildMatch(TableName($(i)), 'Table3.*');
NEXT i

FOR i = 1 to FieldValueCount('Tablename')
  LET vTable = FieldValue('Tablename', $(i));
  LEFT JOIN (TradesTable) LOAD * RESIDENT [$(vTable)];
  DROP TABLE [$(vTable)];
NEXT i

We’re almost done. Let’s visualize the data in a nice report now. Select “Create new sheet” on the Qlik “App overview” page and add tables and charts to present your data. Just click “Done” at the top of the screen and you’ll see the end-user view: browse the data, filter it, and all charts will dynamically update based on your selection. Job done!
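For readers outside Qlik, the name-value transpose that the load script performs can be sketched in a few lines of Python; the row keys and column names here are invented for illustration.

```python
from collections import defaultdict

def transpose_name_value(rows):
    """Pivot (row_key, column_name, value) triples into one record per row,
    mirroring what the Qlik Generic LOAD does with the IA REST columns."""
    table = defaultdict(dict)
    for row_key, name, value in rows:
        table[row_key][name] = value
    # Attach the row key under the same field name Qlik uses for joins.
    return [dict(record, __KEY_rows=key) for key, record in sorted(table.items())]

# Illustrative name-value rows, as they arrive from the REST response.
rows = [
    (1, "tradeId", "T-100"), (1, "amount", "250.00"),
    (2, "tradeId", "T-101"), (2, "amount", "975.50"),
]
trades = transpose_name_value(rows)
# trades[0] -> {'tradeId': 'T-100', 'amount': '250.00', '__KEY_rows': 1}
```

The Qlik script achieves the same pivot declaratively with Generic LOAD plus the join loop; the Python version just makes the reshaping explicit.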

Read More

Thoughts From AIIM 2017 – and the Heart of ECM

ECM

Social media has changed almost every way that we interact – even volunteering for extra work in your own company. Not three minutes after I tweeted that I was not going to attend AIIM this year, I received an email asking if I was serious about wanting to attend. It was a sly way of getting me to volunteer for three days in the booth with OpenText, but to be honest, I was glad to have the chance to do it. The AIIM conference is a great place to reconnect with industry leaders and customers to take the pulse of the ECM business.

With all due respect to those who title blogs to the contrary, the heart of ECM is still beating

In technology we cling to acronyms the way a shipwrecked sailor holds on to driftwood. ECM is no different. It has defenders and no small number of detractors who argue for its replacement. I am of the opinion, however, that we spend far more time debating what to call it than actually doing something productive with it. Fortunately there were many very productive conversations at AIIM 2017. This year, AIIM was especially important because it was one of the first opportunities to talk to free-range customers about the acquisition of Documentum by OpenText. For Documentum customers, the message is clear: your investments are safe and these are not products that will simply sit on the sidelines. To do otherwise simply does not make good business sense. Similarly, AIIM was an opportunity to give some OpenText customers a first look at what new toys are in the box for them to explore, like LEAP and InfoArchive. After attending this conference I can see no value in replacing, retiring, or even updating the acronym ECM. It is good for what it is in the right context. That said, OpenText is not just about content. Look at the home page and you will find we talk about Enterprise Information Management. It is not an either/or situation between EIM and ECM; rather, one is inclusive of the other.
In the keynote, John Mancini suggested moving away from Enterprise to Intelligent Information Management as a way to describe what we do. I like the use of Information, but I do not know anyone who would want Unintelligent Information Management, so the message may not be as clear as it seems. More semantics.

Going beyond semantics – Content Services

What does change the conversation is the idea of Content Services, and it was a frequent topic in my discussions. At some level it is really the same thing, but I do believe this is a shift in thinking. As you define your business challenges and understand how content is part of them, a services design mindset is fundamentally different from past approaches. Content services suggests a repository-agnostic, API-defined transactional model rather than the enterprise platforms driven by traditional ECM. You can obviously solve many of the same problems with both approaches, but the model suggests content services would be more nimble, responsive, and, dare I say, a less expensive option over time. The balance we must strike moving forward is how to thoughtfully migrate content workloads into this model without sacrificing past investments in content systems and the information they contain. Decomposing a problem into meaningful segments that can be solved with discrete solutions made up of common services is new in ECM. The monolithic systems of the past were driven (by customer demand, in many cases) to solve every conceivable problem in a single offering. We can eventually provide all these services in the cloud in products like LEAP, but it is important to note that this does not need to be a rip-and-replace strategy. Many existing systems can also be engines behind some of these services for those not ready to make that jump, while keeping their existing systems up to date. Content services is not just another term for cloud-delivered ECM or EFSS.
In building cloud ECM, some have confused user influence over the buying decision and adoption with ownership of the information assets themselves. The “E” in ECM seems to move from “Enterprise-managed” to “Employee-managed.” While this design focus has had obvious benefits in experience, it sacrifices a critical point: ownership of the content. Ultimately, does a user or a process own the asset? The last time I spoke at this conference was in 2011, when I pointed out even then that we needed to “appify” our existing ECM solutions. The context then was part of a mobile content management experience. This is, in fact, what products like LEAP are doing at the layer above content services. Regardless of your acronym of choice, it is an interesting time to be a part of the AIIM community and the OpenText ecosystem.

Read More

Forget on-premises: InfoArchive, Docker and Amazon AWS

InfoArchive

There are two buzzwords that we have heard in the IT world for some time now: Cloud and Containerization. For me, 2016 proved that these two topics have changed from hype to reality, even in the biggest enterprises, and a lot of customers were asking for our solutions, like OpenText™ InfoArchive, in public clouds and/or running as Docker containers. While our engineering and PS teams are doing a great job in providing these solutions, I decided to walk this route myself. Follow me on the journey if you’re interested.

I started my tests by creating a Docker Hub account. The account’s private repository will be used to store the InfoArchive Docker images and automatically deploy from there. It is very easy to create a Docker container from InfoArchive – talk to me if you want to know more. It takes just a couple of steps and you’ll have your InfoArchive Docker container image ready. What’s next? Now let’s run this image in Amazon EC2 Container Service (ECS).

Welcome to the “cloud world”

If you’re new to the Amazon world you might have difficulty understanding some of the terminology around Amazon ECS. I hope this post will help you with that.

ECS cluster

In the first step we need an ECS cluster. An ECS cluster consists of EC2 instances and services. EC2 instances are our “good old” virtual machines and represent our available compute resources. The work that you assign to the cluster is described as “services”. Our InfoArchive cluster started with 3 micro servers, each of them automatically initiated by ECS from an amzn-ami… VM image. Within a minute your cluster compute resources are running and waiting for you to assign them some work.

InfoArchive is a “classic” three-tier product: the native XML database xDB at the back end, the InfoArchive server as middleware and the InfoArchive web UI on top.
To prepare for the scalability requirements of our deployment we’ll run each of the tiers as dedicated containers, and we’ll “front-end” each tier with an EC2 load balancer. This approach also simplifies the configuration of the container instances, since each container only has to connect to the underlying load balancer (with a known, static hostname/IP) instead of trying to connect to the constantly changing IP addresses of the container instances. EC2 load balancers are set up quickly – my list contains 4 instances, since I’ve also configured a dedicated public load balancer for xDB connectivity. With this step completed, the ECS cluster, its compute resources and the cluster load balancers are prepared. Let’s put InfoArchive on the cluster now.
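To make the per-tier container idea concrete, here is a rough Python sketch that assembles ECS container definitions pointing each tier at a stable load-balancer hostname. The image names, ports, memory sizes and environment variable are illustrative assumptions, not InfoArchive defaults.

```python
def container_definition(name, image, port, lb_host):
    """Build one illustrative ECS container definition dict for a tier that
    reaches its backend tier through a load-balancer hostname."""
    return {
        "name": name,
        "image": image,
        "memory": 1024,
        "portMappings": [{"containerPort": port, "hostPort": port}],
        # Each container only needs the stable load-balancer address,
        # never the shifting IPs of sibling container instances.
        "environment": [{"name": "BACKEND_HOST", "value": lb_host}],
    }

# Hypothetical three-tier layout: xDB, IA server, IA web UI.
tiers = [
    container_definition("xdb", "myrepo/ia-xdb:4.2", 2910, "xdb-lb.internal"),
    container_definition("ia-server", "myrepo/ia-server:4.2", 8765, "xdb-lb.internal"),
    container_definition("ia-webapp", "myrepo/ia-webapp:4.2", 8080, "iaserver-lb.internal"),
]
# A boto3 ECS client could then register this, along the lines of:
# ecs.register_task_definition(family="infoarchive", containerDefinitions=tiers)
```

The point of the sketch is the indirection: every tier’s configuration names a load balancer, so containers can come and go without any reconfiguration of their peers.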

Read More

OpenText Strengthens EIM Portfolio with Completion of ECD Acquisition

In September, OpenText entered into a definitive agreement to acquire Dell EMC’s Enterprise Content Division (ECD), including Documentum. I am delighted to announce that as of today this acquisition is complete. The addition of ECD’s 25+ years of leadership in Enterprise Content Management (ECM) further strengthens the OpenText product portfolio and our commitment to delivering the most functionally complete Enterprise Information Management (EIM) platform in the market. This acquisition provides exciting opportunities for current and future OpenText customers. Existing customers will benefit from a more functionally complete EIM platform while the ECD customer base will benefit from integration into OpenText technology, as well as gaining access to the number-one EIM Cloud and OpenText SaaS applications via flexible, on-premises, cloud, or hybrid deployment options. Specifically, the addition of Dell EMC’s offerings from the Documentum, InfoArchive, and LEAP product families will help to fulfill our strategic vision of growth and leadership in all sub-segments of the EIM market. Our EIM offerings will be enriched by industry-packaged solutions and deep customer relationships across the globe. Along with product enhancements and a worldwide customer base of more than 5,600, the acquisition brings 2,000 talented ECD employees to the OpenText family. Together, we will be over 10,000 professionals strong, focused on customer success in EIM and enabling the digital world. Investing in innovation and development is a key objective at OpenText. As we continue to grow and expand into new markets in meaningful ways, I’d like to welcome ECD customers and employees to OpenText, a focused and dedicated software company that lives, breathes, and sleeps EIM software. Given the importance of the announcement, the ECM Community will be gathering together for a candid discussion of the marketplace and how the acquisition fits into the future of content management. 
Attend the roundtable session. For more information about this acquisition, read the press release.

Read More

Wrapping Up 2016

For many, this time of year is full of the hustle and bustle of the holidays, preparing for time with families, and pondering what the New Year has in store. For Team ECD it’s no different, only this year there are additional activities, excitement and anticipation about the road ahead as we prepare to join forces with OpenText. With this in mind, I wanted to take a moment to provide an update on progress to date and unwrap a little of what’s to come.

Do You Hear What I Hear?

It’s hard to believe that we just passed the 90-day mark since signing the agreement with OpenText in September. Since then, the focus has primarily been on you, our valued customers and partners. We’ve spent hundreds of hours talking with you and receiving feedback at events in the U.S., at Momentum Europe in Barcelona, and in a multitude of 1:1 meetings across the globe. As always, our goal is to be as transparent as possible with our ECD family and ecosystem, and we hope you’ve found these interactions as valuable as we have. On a tactical level, the main driver behind our “hustle and bustle” has been to ensure consistency and continuity – everything from the products to the processes to the people – because we want to make the transition as seamless as possible for you.

It’s Beginning to Look a Lot Like Closing…

There has also been tremendous effort focused on delivering the items that are needed for regulatory approvals and closing the deal. I’m pleased to report that everything seems to be moving ahead as planned. Just as we were all fortunate to participate in the close of the Dell EMC deal, the team is looking forward to celebrating this next step for ECD in the New Year. As we get closer to finalizing a close date, we will share opportunities for you to get involved and celebrate with us.

A Toast to You

It’s been an amazing 2016, with big, successful events like ECD Ready, Momentum in Las Vegas and Barcelona, and Customer.Next.
We’ve delivered more than 65 products, solutions and betas over the past 12 months, with major initiatives across the product families, including the release of 5 LEAP apps and the LEAP Platform beta; Documentum 7.3 with significant updates to the full Documentum family and vertical solutions; and the introduction of extreme archiving with InfoArchive 4.0 and Clinical Archiving 2.0. We’ve been recognized as a leader by five industry analyst firms, including our 13th time as a leader in the Gartner Magic Quadrant for Enterprise Content Management, and have won multiple product and strategy awards. And it’s all for, and because of, you. On behalf of the entire ECD team, I’d like to raise a virtual toast to you – for your continued business, loyalty and friendship. Cheers and best wishes for a happy holiday season and a joyous new year!

Read More

Long Term Archiving for “in-memory” ERP Systems? Really?

in-memory

Unlike other hype trends in IT, the implementation of in-memory databases is a trend that has really taken off. Today, leading ERP system providers push “in-memory” as the only option for their customers. Enterprise IT now accepts the idea (once perceived as crazy and/or dangerous) that it is preferable to have all their precious ERP data in RAM instead of on a “good old” SAN storage array. Has this trend made long-term archiving obsolete? Long-term archiving is still a valid (and actually must-have) requirement – even in the “in-memory” world. Compliance requirements haven’t disappeared with the rise of in-memory. You’ll still have to keep your invoicing data verifiably unchanged for some 5-10 years, depending on your legislation. You’ll have to make sure you can prove to your auditors that nobody could have tampered with your data since it was created.

When to start with data archiving

Is there a right time for archiving your data? Most probably, you’ll be “forced” to archive your historical data before you migrate to an in-memory system. Forced is an exaggerated expression, of course, but it can quickly become your truth if your other option is to purchase in-memory hardware that is 3+ times the size of your current data volume. If you were ignoring data archiving before (since the SAN DB storage cost was so low, right?) you’ll want to run it before the in-memory ERP onboarding project.

Now your ERP system is running in-memory. What’s next? For sure you’ll focus in the upcoming months on getting the best out of your new in-memory system. But sooner or later you’ll have to return to archiving. Storing the data on flash storage will not be recognized as a compliant option in case of an audit. Refresh your data archiving skills, since you’ll have to start archiving data even from the new in-memory ERP.

Ideal archive characteristics

What should be the main characteristics of the ideal target archiving platform?
Have these changed in recent years? It would be a pity to archive into a closed, siloed archive and have the data de facto locked there. You want to make sure that your historical data can be used even after archiving it – in analytics, in business warehouse scenarios, by call center operators – in the same way as your current data. The same platform should support not only your data archiving requirements but also your document archiving needs (all those outgoing invoices and reports have to be stored safely for a number of years). Your requirements for the new archiving platform shouldn’t stop there. I’d propose that you add the following to your wish list:

Open design: Vendor lock-in is not cool any more. Ask for industry-standard design patterns (like OAIS – Open Archival Information Systems), and ask for universality and support of both structured and unstructured data.

Open access: If you’re not locked in with one vendor, make sure your data isn’t locked in your archive either. It should be possible to reuse it throughout the enterprise. Hadoop analytics and a REST API design are the least you should request to be able to gain value from your historical data. JDBC would be a nice-to-have on top of these requirements – you’d be able to stick with your current reporting tool by letting it connect to the new archiving platform.

Cost predictability: One of the reasons you’ll start archiving data from in-memory ERP systems is to avoid unnecessary cost. In the days of cloud and SaaS solutions you have the right to ask for clear and predictable cost and licensing terms for your archiving platform as well. You shouldn’t accept obscure product bundles with complex licensing terms and limitations. Instead, ask for simple and predictable licensing – a good example: per terabyte of data managed in the system.

The “in-memory ERP” era didn’t put data archiving into retirement. It’s just the opposite.
Long term archiving is becoming more important today than it was before. And if you’re asking yourself what archiving platform to pick – talk to us about InfoArchive.
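The “verifiably unchanged” requirement mentioned above typically comes down to fixity checks: record a cryptographic digest at ingest and recompute it at audit time. Here is a minimal sketch; the record layout is invented, and real archiving platforms record such digests in managed, often signed, manifests.

```python
import hashlib

def fingerprint(payload: bytes) -> str:
    """SHA-256 fixity hash, recorded at ingest and re-checked at audit time."""
    return hashlib.sha256(payload).hexdigest()

# Illustrative archived record (the layout is invented for this sketch).
record = b"invoice-2016-0042;amount=1250.00;currency=EUR"
stored_hash = fingerprint(record)   # kept alongside the archived record

# Years later an auditor recomputes the digest; any change to the record,
# even a single byte, yields a different value.
assert fingerprint(record) == stored_hash
assert fingerprint(record + b".") != stored_hash
```

This is the simplest building block of tamper evidence; retention rules, WORM storage and chain-of-custody logging sit on top of it.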

Read More

Momentum Barcelona: Where a Spark Becomes a Fire

Momentum Barcelona

Momentum Barcelona 2016 – after every Momentum, I am given the unenviable task of trying to recapture each amazing iteration of this event in vivid detail: to effectively recount the keynotes, the various breakout sessions, and all of the fun and community that make Momentum such an amazing event for our customers, our partners and, yes, our employees. So, while it may be nearly impossible, let me take a moment to relive Momentum Barcelona and to celebrate the return of this great event to Europe. For those of you who were unable to join us for Momentum Europe, I’m truly sorry you weren’t able to experience the incredible atmosphere and beautiful location that is Barcelona. What an incredible city in which to host our Momentum Conference – full of history, life, vitality and, of course, never-ending fun. I can’t think of a better backdrop for an event that celebrates our customers and their successes with our technology. And, to the more than 800 registered attendees and, in particular, our customers and partners, I want to simply say thank you for sharing the experience with us and helping to make it an event that practically defies description. Before I get into the highlights of the conference, I also wanted to share an observation from one of the sessions this past week: the team from the Enterprise Content Division has long had an undeniable spark, a spark that seems to burn even brighter during important times like Momentum, and in particular Momentum Europe. You see, at events like Momentum or our Customer.NEXT roadshows, we are at our best, surrounded by the customers and partners that have made ECD great. We are with our “community.” Better yet, we are amongst friends. During these moments, this ECD community has achieved incredible success, building upon the present and always looking forward to the future.
The reason is clear: it’s because each of us – customer, partner and employee – shares a common goal, as well as the commitment to see one another succeed and to realize the incredible value that organizations can achieve working with our technology. This is the spark we experienced this week, one that has continued to burn brightly through all of the years and across every Momentum. And, as we look toward the end of a very eventful year, we’re also looking toward the amazing opportunities that lie before us to fan this flame into an even bigger blaze in the years to come. Perhaps never before has the name Momentum been more appropriate for our conference, or for our organization. This is evidenced by all of the activities and announcements that accompanied this year’s Momentum Europe. As you likely know, at ECD, we are focused on digital transformation and enabling our customers to achieve true competitive advantage in this digital age. You probably also know that much of the rest of the industry has recognized our vision and is now beginning to share our understanding of the importance of enabling transformation. ECD has continually delivered on its promise of providing a complete content strategy and set of solutions that will enable our customers to achieve their transformational goals. At Momentum Barcelona, we launched groundbreaking new solutions, including new LEAP apps and the new LEAP Platform, as well as the next release of InfoArchive, InfoArchive 4.2, which brings exciting new capabilities for our Financial Services customers (on the heels of the recently announced InfoArchive 4.1 and Clinical Archiving 2.0). We also provided a number of exciting updates to our Documentum products that make them easier than ever before to deploy, upgrade and manage in hybrid environments.
Finally, we shared updates on a few of our industry solutions, designed to meet specific vertical needs, including new features in Documentum Asset Operations 2.1 that offer compelling opportunities for the Energy and Engineering industries, and Documentum Life Sciences Suite 4.2, which now provides support for medical device documentation as well as enhanced features for pharmaceutical organizations. And this is really just the beginning.

Momentum Barcelona 2016 also featured three entertaining keynotes, starting with the opening keynote with Rohit, Muhi Majzoub from OpenText, and three customers who shared how our products are enabling their digital transformation. Our second keynote included two analysts from IDC, Roberta Bigliani and Max Claps, and focused on key trends in Digital Transformation. And the climactic closing keynote featured noted futurist Beau Lotto, who offered an exciting vision of things to come.

Other highlights of Momentum Barcelona included:

750+ Labs completed, with customers working directly with our technology
748 #MMTM16 mentions
439 Momentum app downloads
300 attendees at our Momentum Partner Summit
200 Hack-a-thon participants
200 Partner Summit attendees
146 trees planted
87 LEAP personality quizzes completed
65 hours of Product & Industry sessions
54 1:1 meetings with media and analysts
33 Life Sciences user group attendees
30 Genius Lab sessions with our Professional Services team
26 customers speaking in sessions
25 partner sponsors
9 Mo & Tim videos recorded
8 sessions dedicated to LEAP
6 partner innovation award winners
1 President’s Award winner for Customer Satisfaction

And, just so you don’t get the impression that we were all business: one surreal party. But for those of you who may already know those details, our valued customers and partners, thank you again for being with us at Momentum Barcelona. If we missed you this time, we sincerely hope to see you at one of our future events.
It’s a very exciting time for ECD and we can’t wait to share it with you. The spark has ignited, the flame is growing higher, and there is room around the fire for everyone. Congratulations and thank you, everyone, for a fantastic Momentum 2016 in Barcelona!

Read More

Fighting Fraud Through Better ECM

Fighting Fraud

Headlines these days often point to data security cases such as the breach at Yahoo!, affecting millions of users. But in the financial services industry, including insurance, it’s not just about data theft. It’s about fraud. As we continue to drive implementations of InfoArchive, there is a very positive use case emerging that I think is worth blogging about: fraud detection – in particular, the ability to search many content types to perform forensics and analysis. This can turn up insider employees siphoning funds, or external parties manipulating processes for financial gain. Either way, the ability to digitize content and leverage it to protect yourself seems an essential skill in our digitally transforming economy. How can you get started? Here are three suggestions:

1 – Capturing Content

Take a look at your current methods for capturing content as it comes into your organization. Are applications filled out online? Is there a bot on your web site asking for information? Are you using webcams and video, or online voice collaboration? Don’t forget to consider how and where smart phones or devices can be leveraged as the first point of capture. Understanding how your stakeholders naturally communicate with your enterprise may turn up new types of content formats. Assessing these for today’s use, then projecting where you expect interactions to grow – say, more voice-collected information – can solidify your requirements. This can help your architects better align recognition, extraction, and classification technologies to fit your fraud detection use case.

2 – Retaining Content

One of many reasons I love the idea of extreme archiving is that the right content is always there when you need it. Nothing is worse than detecting some type of financial fraud and finding you have no pools of content or history to investigate it and recover damages.
Regulatory compliance will require you to retain content long-term, so you might as well leverage the same effort for multiple benefits and use cases. Start by designing a content retention component into your content management strategy. Consider disaster recovery needs and compliance requirements, and the content formats you discovered in step one above.

3 – Searching Content

Perhaps one of the most important considerations for the fraud detection use case is search and query capability. It’s one thing to be smart about what content you can find and save, and quite another to search it quickly when something serious happens. (Just ask Bangladeshi bank officials, who realized only too late they had transferred billions of dollars to hackers.) Consider where your investigators may be physically located, and what types of devices they may use to query your content. Run through multiple scenarios to ensure the right combination of content format, syntax, language and other search nuances is available to aid their investigation. Fortunately, the financial services sector is already investing in fraud prevention methods. Considering OpenText as part of your fraud detection strategy can deliver a solid ROI on content management solutions, while reusing the capture, retention and search tools you need anyway for managing regulatory compliance. How are you leveraging OpenText for business benefits such as fraud detection? Share your feedback below.

Read More

5 Reasons you Should Consider Decommissioning Your Legacy ERP System

decommission legacy ERP

What attracted your attention in the headline? The “5 reasons”, or the fact that somebody would even want to decommission an ERP system? Look around – this world runs on ERP systems. Why would somebody want to decommission one? Believe it or not, there are companies actually doing it. Be it a merger/acquisition, a migration to a new platform stack, or simply a migration to another ERP system (even if only a new version of the same ERP).

The “do nothing” option

Not long after the go-live of the new ERP system, your (previously highly valued) old ERP instance will not be used at all. What now? Here are your options:

Do nothing, keep the server running somewhere in your data center and keep paying. The list of what you pay for is actually pretty long, from DC floor space down to the slice of employee time spent taking care of the “skeleton in your closet”.
Virtualize the old environment, shut it down, and hope and pray that you’ll never ever need to boot it again.
Decommission the system and move on – for example, to make your IT more agile.

I won’t go into detail about the first two options, or summarize how much it costs to do nothing, or how expensive the risk of not doing anything might be (or the virtualization minimum, for that matter).

5 reasons to start decommissioning

Let’s then look at the 5 reasons I promised in the headline. Why decommission an ERP system?

1. Save costs and resources: I admit that this will most probably never be the only reason to think of decommissioning your legacy ERP instance. The old running instance costs resources in your data center, physical and virtual server costs, OS and database license costs, and regular maintenance by your admin team – and we could go on (storage space costs, floor space, etc.). It is sometimes surprising to see how long the list of hidden cost elements can be. The sum is, in the end, nothing compared to what the implementation project cost you, but it remains an unnecessary burden in your budget. Get rid of it.

2. Keep access to your legacy ERP data even after you stop using the system in production: Have you thought about what happens if you lose access to the ERP data for some reason? If you “decom” the old production instance, you’ll have all important data tables available in your enterprise archive, and any future requirement (audit, analytics, historical reporting) can be fulfilled.

3. Run (Hadoop) analytics against your historical ERP data at a fraction of the cost of a BI/BW platform: Did you know that this was actually possible? When your ERP was in production you had to run regular data extracts into BI/BW to deliver business insights. Now that your data is in your enterprise archive you can actually run Hadoop analytics “in place”, without replicating and duplicating your data!

4. Decommissioning a legacy ERP is actually very simple: The first objection we usually hear is that “my ERP consists of 1000s of tables that nobody except the vendor understands”. Wrong! There are many companies out there that know your ERP system down to the last detail, and they share their wisdom by producing decommissioning toolkits that can extract this data. Do you want to extract financial data? No problem. Do you want a personnel extract too? Check. Custom tables? Check.

5. Doesn’t any of the above apply to you? Use the importance of your legacy ERP (remember that your company relied on it for the past few years) to support the purchasing decision for an enterprise-wide archiving platform. Or do you think it makes more sense to run dozens of purpose-built archives instead of one open, universal archiving platform?

Our conclusion

Most probably, none of the above is in itself a sole reason to start decommissioning a legacy ERP system. It is the magic equation of 1 + 2 + 3 + 4 + 5 that sums to “Let’s do it”. It is exactly this equation that persuaded a global customer to test legacy ERP decommissioning.
In their case, some of the reasons were multiplied by 10, since they were maintaining a zoo of legacy environments. Now look at the reasons again and sum up all the points. Do you still think that legacy ERP system decommissioning is not for you? Let us know in the comments below.
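Reason 3 above leans on a detail shown in the Qlik walkthrough earlier: IA REST search responses return table columns as name/value pairs, which analytics tools must transpose before use. Here is a minimal Python sketch of that client-side transpose; the response shape and the `account`/`amount` field names are simplified assumptions for illustration, not the exact InfoArchive schema.

```python
# Illustrative sketch: flatten an InfoArchive-style REST search result,
# whose columns arrive as name/value pairs, into analysis-ready records.
# The response layout below is a simplified assumption, not the real IA schema.

def transpose_rows(response):
    """Turn rows of [{'name': ..., 'value': ...}] column lists into plain dicts."""
    records = []
    for row in response.get("rows", []):
        record = {c["name"]: c["value"] for c in row.get("columns", [])}
        records.append(record)
    return records

# A tiny hand-made sample standing in for a real REST response body
sample = {
    "rows": [
        {"columns": [{"name": "account", "value": "4711"},
                     {"name": "amount", "value": "120.50"}]},
        {"columns": [{"name": "account", "value": "4712"},
                     {"name": "amount", "value": "80.00"}]},
    ]
}

records = transpose_rows(sample)
total = sum(float(r["amount"]) for r in records)
print(records[0]["account"], total)  # 4711 200.5
```

Once the rows are plain records like this, feeding them to whatever analytics engine you prefer is straightforward, which is the whole point of archiving “in place”.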


Timeless Opportunity – Is There Such a Thing in our Ever-Changing World?

Managing obsolescence

“What do you do with The Huggers? I have many of those!” “You hug them back! It’s simple, don’t fight them.” When I heard this conversation, I wasn’t very sure I was in the right meeting room. I took a chance and joined the conversation. Little did I know that I would pick up three key insights about timeless customer needs. It was a complete pleasure to host several valued InfoArchive customers in Boston for the very first InfoArchive Product Forum.

Timeless Opportunity #1: The need to manage obsolescence will never be obsolete

“Huggers hold on to their dated applications like their firstborn and won’t let go,” he told us. “They add huge overhead to IT and compliance teams, and they are in virtually every organization!”

“When you fight them, they hold on tighter,” he continued. “So instead, you just assure them their baby will be in good hands.”

As this conversation about decommissioning applications progressed, it dawned on me that in a world where technology and business are changing at an ever-increasing pace, obsolescence will always be a constant. This is a technology, people and process opportunity. On the technology side, you need a solution that can decouple data from applications, handle an unprecedented scale of data and contend with an ever-changing data model – all with a super-efficient TCO. InfoArchive is the perfect answer for this need. You still need to convince the application “huggers” to let go. Every organization, in any industry, ought to invest with some degree of scale in building a factory for managing obsolescence.

Timeless Opportunity #2: Irregular regulations that change regularly

The challenges around compliant Enterprise Information Management resonated with everyone in the room, breaking geographical and vertical boundaries. We discussed the challenges of global organizations working with inconsistent legislation across different jurisdictions.
In the US before 9/11, financial organizations’ customer account data had to be preserved for 6 or 7 years; after that event, it must now be retained for several years beyond the life of an account. In France and Germany, by contrast, account data must be destroyed and cannot be kept for too long. What’s a global bank to do? A financial services customer talked about the challenge of regulations that can have multiple interpretations. As an example, under Basel they are required to retain information through two economic cycles. How do you define an economic cycle? Everyone agreed that regulatory flux will be a constant trait of our future. Therefore, what is needed is a flexible solution that can codify regulations as business rules or configuration. This is one of the strengths of the InfoArchive platform.

Timeless Opportunity #3: Learning from each other, different as we may be

It was impressive to see the diversity of this board, with a spectrum of industries that includes Financial Services, Healthcare, Aviation, Energy & Engineering, Life Sciences, and more. It was even more gratifying to see the gears turn in everyone’s heads as we moved from one industry to another. The day reminded me of the following quote: “Get closer than ever to your customers. So close that you tell them what they need well before they realize it themselves.” – Steve Jobs

Our duty as a vendor is not just to understand the needs of our customers but to help them learn from each other and anticipate needs that they may not have today. It was personally gratifying to be part of the InfoArchive Product Forum to achieve that goal. Join me and my team at Momentum Barcelona to learn more about InfoArchive stories, share your own challenges and stories with us, and take the opportunity to connect with InfoArchive users face-to-face.
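What “codify regulations as business rules or configuration” can look like is easy to sketch in a few lines. This is a hypothetical illustration only: the jurisdictions, retention periods and rule shape below are invented placeholders, not legal guidance and not InfoArchive’s actual rule syntax.

```python
# A minimal sketch of retention regulations expressed as configuration,
# keyed per jurisdiction. All periods here are made-up placeholders.
from datetime import date

RETENTION_RULES = {
    "US": {"min_years_after_closure": 7,  "must_destroy_after": False},
    "DE": {"min_years_after_closure": 10, "must_destroy_after": True},
}

def disposal_date(jurisdiction, closed_on):
    """Earliest date the record may be disposed of under the configured rule."""
    rule = RETENTION_RULES[jurisdiction]
    years = rule["min_years_after_closure"]
    # naive year arithmetic; a real system would handle leap days,
    # litigation holds, and multi-jurisdiction conflicts
    return closed_on.replace(year=closed_on.year + years)

d = disposal_date("DE", date(2016, 3, 1))
print(d)  # 2026-03-01
```

The point of the configuration-driven shape is that when a regulation changes, you change an entry in the rule table rather than the archiving code.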


CEB TowerGroup Analysts Release new Research on Banking and ECM

ECM, Banking

Banking has had a long relationship with technology, with more than 90% of retail banking executives citing that they either have the technology already or are planning to implement or improve it. However, the pace of change has accelerated due to competitive and regulatory forces, as well as rapidly maturing technologies such as cloud, mobility and analytics. FinTech upstarts have been challenging established institutions, and are now similarly challenged as those established banks get more comfortable with digital transformation.

“Technology is transforming our business radically, across every aspect of our business. A process that has been going on for some time, but has accelerated… We actually create the prices and the information that then gets communicated. It gets processed directly, payments get made automatically, much more efficiently, much more cheaply, and without error.” – Lloyd Blankfein, Chairman & CEO, Goldman Sachs

“The financial services business has been a huge user of technology, not just recently, but for the last 50 or 60 years. We literally used to move pieces of paper around when you buy a security. The difference today is that it’s much faster; people can access things anywhere, anytime. People want 24 x 7 services. I always look at our job – we’ve got to make things better, cheaper, faster for you. The mobile device is probably the device (not just for millennials any more) for alerts, moving money, bill pay, knowledge, offers, marketing. It’s getting faster, quicker.” – Jamie Dimon, Chairman & CEO, JP Morgan Chase

Recent research published by CEB TowerGroup analysts points to a number of interesting conclusions: Enterprise Content Management (ECM) and Business Process Management (BPM) are as relevant now as ever for transactional automation and compliance. These technologies have reduced processing times and enabled and scaled access to services across time zones and convenience zones.
Paper is still a major challenge for banking institutions, many of which lack the cultural acceptance and skills to apply these technologies more broadly. Customer engagement is moving from live (in-person) interaction to digital channels, and mobility is also a major trend for account opening and servicing. Coupled with traditional ECM services, a combination of customer document delivery (customer communications management), business process (or case) management, eForms, e-signatures, mobility and cloud deployment promises to greatly improve customer experience, with conversational and highly optimized workflows. There is also a gap between the importance banking executives attach to key initiatives and their confidence in the firm’s ability to execute them. The CEB TowerGroup analyst report, Going Paperless to Become Digital, discusses this research in detail and highlights changes in the adoption of ECM solutions, particularly as they relate to paper handling and the move to digital channels. It also shares an analysis of ECM product capabilities, including those from ECD, with special attention to the needs of the banking industry.


Face-to-Face vs. Digital: Finding the Right Balance in Customer Experience is an Art

CXDay

I like shopping at Nordstrom. No, I love shopping at Nordstrom. And it recently occurred to me that part of my attraction to Nordstrom is their ability to provide a perfect balance for my shopping experience. One day, I might be in one of their stores, feeling the fabrics and breathing in the joy of a fabulous find. Another day, I might be on the move – but Nordstrom offers an online shopping experience that is practical and effective. Finding that balance between face-to-face and digital engagement is key to providing a quality customer experience in today’s marketplace. Some might argue that salespeople are no longer needed, that people are doing their research up front so there is no need to talk to someone. But is that the case with B2B marketing? I don’t think so. I think the most effective B2B marketers find the right mix of handshakes and digital experience.

BY THE NUMBERS

A Forbes survey a few years ago found that 85 percent of respondents felt that in-person business meetings build stronger, more meaningful business relationships, and 77 percent said they preferred those meetings because of the ability to read body language and facial expressions. Conversely, 92 percent acknowledged that technology-enabled meetings save time, and 88 percent agreed that they save money. The study found that a majority of business executives thought the ideal meeting/conference strategy combined both in-person and technology-enabled meetings. Face-to-face marketing and virtual communication have definite benefits: face-to-face interaction helps forge relationships that lead to long-lasting business connections, and digital experience ensures the dialogue continues on a more frequent basis.
So, before we begin any ECD marketing program we take the time to examine the underlying business objectives, outline clear goals and metrics, and proceed with a strategy that enables the best customer experience outcome. We strongly believe it requires a mix of the two.

A TIME OF CELEBRATION

So, what better time to celebrate engagement of all types than CX Day? CX Day is a global celebration of the customer experience and the thousands of customer experience professionals who make it happen. It is a perfect example of utilizing both personal and digital engagement. We will be in NYC at Customer.Next celebrating our customers and hosting a webinar to share learnings on our customers’ experience.

LET’S TALK

Has your business found the right balance between face-to-face and digital engagement? Maybe it has – today. But continuing to find that right balance tomorrow – and beyond – is key to the success of your business.


The Benefits of a Customer-Centric Culture

Customer-Centric

Customer Experience can mean different things to different people. In the marketing world, there’s a debate about whether a CMO should now be called the Chief Customer Officer, and an ongoing discussion about whether today’s organizations actually require a dedicated Chief Customer Officer at all. There are arguments both for and against, but in my experience, being customer-centric and maintaining a continual focus on improving the customer experience is a necessity for any organization looking to move forward as a digital enterprise. So, assuming that the CCO becomes the newest office in the C-Suite, what qualifications would this person need to possess? The job description could include storyteller, politician, diplomat, digital futurist, customer advocate and human duct tape, just to get started. If this varied list of skills isn’t enough, there is no shortage of others weighing in on the qualifications. And, while these qualifications are diverse and impressive, much of the CCO’s or CMO’s real work centers on the culture he or she needs to help nurture and build. It is only through a company-wide commitment that an organization can effectively meet its customers’ increasingly complex needs on the front line every day. Successful companies create an echo chamber of customer centricity, where everyone from customer service and accounts payable representatives to the C-level team understands how they influence experiences that will help customers thrive. They recognize that every employee has the opportunity to play a part, to positively influence customer experiences, when supported by a culture that allows each person to be the solution, not part of the problem or simply a bottleneck. We continue to proactively engage with our customer base, both digitally and in person. Over the coming months, we will provide a number of opportunities for our customers to interact with both their peers and various team members from ECD.
Placing the customer at the heart of what we do is something we take very seriously and focus on every day. We invite you to learn more about how you can engage with us as well as share your feedback and suggestions, by posting on this blog. In our minds, every day is CX Day, and today we want to take an extra moment to pause and say thank you. Your feedback, input and passion around the software and solutions we create is tremendous. We appreciate your trust in us. We take that commitment seriously. And, we look forward to our next chapter and to continuing this journey together.


InfoArchive – From Information to Knowledge

InfoArchive

Recently I traveled to Pleasanton to work with the InfoArchive team. I arrived late and checked into my hotel. While relaxing and flipping through the TV channels I stumbled onto an early-’90s Robert Redford movie called Sneakers (from Universal Pictures). Set in Silicon Valley, the movie focuses on NSA-sponsored technological espionage – essentially, stealing technology that can decode, monitor and listen to any kind of communication. The premise made me chuckle in an ironic way! After all, this is, what, twenty years before Snowden? Beyond the plotline, what really grabbed my attention was a line spoken by Redford’s character, which I found rather pertinent today. He talked about a war, a world war, that wasn’t about who has the most bullets, but about who controls the information. When I think about InfoArchive, I always focus on information, both content and data. The multitude of data types and the large amounts of information that can be ingested into the archiving platform take precedence. Yet the real power of InfoArchive is what you can do with this information after ingestion. To unleash this power, you need access to the information in order to use it. Despite this, we still need to fight against the corporate philosophy of information control that limits access. Nevertheless, innovative companies are developing strategies not to hide or restrict access to information in silos but to provide information as quickly as possible to business units, analysts and individuals. These companies are transforming more rapidly than their siloed competitors by allowing greater access to structured data and unstructured content. The recent release of InfoArchive 4.1 and its healthcare version – Clinical Archiving – has made accessing and leveraging information easier than ever before.

Risks

There are inherent risks to providing open access to information. However, I believe informed employees can see the difference between right and wrong.
Naturally, mistakes can happen with access to unfiltered information, and some may attempt to exploit information for personal gain. Thankfully, we have the ability to decide for ourselves how to use information, which can empower employees and provide great strategic value to the company. InfoArchive 4.1 and its healthcare version, Clinical Archiving, provide greater access to helpful information while simultaneously allowing governance controls. Currently, InfoArchive delivers retention and disposition, masking of PII, litigation hold, chain of custody, user-based access and access logs. This way, companies can maintain compliance and monitor access. So providing access to information via InfoArchive poses much less risk than providing direct access to production or legacy applications.

InfoArchive New Features

With InfoArchive 4.1, customers have more accessibility options:

Elastic Cloud Storage (ECS) – Support for Elastic Cloud Storage enables greater accessibility by intelligent devices and provides analytics-ready storage and virtually infinite scalability.

Synchronous Ingestion – With synchronous ingestion, data and records are ingested the moment they arrive, giving knowledge workers immediate access to static data and content. Wait times for ingestion cycles are gone, improving employee productivity thanks to instant data recall.

Data Export – The data export capabilities of InfoArchive are enhanced, offering greater flexibility for producing reports and content sets. Selected information can be displayed and exported in common formats, including PDF, CSV, XML, JSON or HTML.

Clinical Archiving

Clinical Archiving 2.0 provides several enhancements for greater accessibility, flexibility and control:

HIMVision – A new user interface application designed for HIM organizations. It provides a view of the patient’s archived information, with the ability to amend information within the archive.
HIMVision also includes a “Release of Information” mechanism, so that all or part of a patient’s archived information can be exported or printed.

ArchiveVision Enhancements – ArchiveVision, the clinician user interface application, offers two new perspectives that enhance the user experience: Snapshot, which offers a simple and quick overview of medications, allergies, history, etc., and Timeline, which presents the patient’s record in chronological order.

Patient Privacy – Patient records may contain sensitive information specific to an individual’s condition. Clinical Archiving 2.0 offers a means to configure and enforce the patient’s privacy preferences. For example, a patient may wish to limit which providers are allowed to view his or her psychiatric documents.

Single Item Archiving – Occasionally, a single document may need to be added to a patient’s archive. This feature allows IT to do so without waiting for the next official archiving session.

Knowledge is Power

In the modern world of business, the saying “knowledge is power” rings more true than ever. With more information at hand, hospitals can provide greater care for patients, businesses can address a client’s needs faster and customers can get self-service access to their own information. However, as Robert Redford’s character in Sneakers suggested, it isn’t about who has the most ammunition; it is what information we have and how it is used. With the help of InfoArchive and its medical sibling Clinical Archiving, businesses around the world can maintain vital information easily and offer greater protection to their clients, patients and customers, while ensuring information governance. This should help everyone breathe just a little bit easier. To learn more about InfoArchive and Clinical Archiving, please visit our InfoArchive page.
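To make the export idea above concrete, here is a small, hypothetical Python sketch that renders one selected result set in two of the formats listed (CSV and JSON). The record fields are made up for illustration and this does not reflect InfoArchive’s actual export API; it only shows what “the same selection, several output formats” means in practice.

```python
# Illustrative sketch: one selected result set, exported as CSV and JSON.
# Field names are invented examples, not real InfoArchive metadata.
import csv
import io
import json

records = [
    {"patient_id": "P-001", "document": "Discharge summary", "year": 2014},
    {"patient_id": "P-002", "document": "Lab report", "year": 2015},
]

def to_csv(rows):
    """Serialize a list of uniform dicts to CSV text, header row first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def to_json(rows):
    """Serialize the same selection as pretty-printed JSON."""
    return json.dumps(rows, indent=2)

print(to_csv(records).splitlines()[0])  # patient_id,document,year
```

Whichever format the consumer needs (a report, a downstream system, a litigation production), the selection logic stays the same and only the serializer changes.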


Delivering Innovation in the First City of Innovation

Innovation

In my opinion, we couldn’t have picked a better city than Barcelona to host the return of the Momentum Conference to Europe. Why? There are many reasons. First and foremost, Barcelona is a city that has always embraced transformation and innovation. It became the first European Capital of Innovation when it delivered on its promise to introduce new technology to foster economic growth and improve the welfare of its citizens.

The First City of Innovation

Barcelona continues to innovate for the benefit of its citizens. When I look at the innovations that the Enterprise Content Division has delivered over the past three years with LEAP, InfoArchive and our hosting services, I am inspired by what Barcelona is doing. Through its open data initiatives, the city is providing its citizens with valuable information, and doing it in an open and continuous way. At Momentum Europe we are going to showcase our innovations that focus on data and information, and show how new methods of access and availability help organizations, business units and individuals transform the way they do business. In particular, I want to extend an invitation to those who have registered and to those thinking of attending: consider attending the InfoArchive sessions and labs. InfoArchive is a solution that is helping all types of organizations move beyond the silos of data and information that stifle innovation, and cast aside the legacy applications that thwart adoption of modern, flexible, open technology. The InfoArchive sessions will provide our Momentum guests with real-life scenarios and hands-on experiences showcasing how companies are retiring legacy applications while leveraging the data and content these applications contained for strategic and customer-focused initiatives. We will provide examples of how InfoArchive not only delivers a rapid return on investment but can actually fund new initiatives.
InfoArchive at Momentum

Our InfoArchive sessions during Momentum Europe are intended to present attendees with an overview of our accomplishments and take them into the future of InfoArchive. We will give those present a glimpse into some of our largest projects, including:

- Our FinServ offerings that reduce the burden of regulatory compliance such as MiFID, surveillance and anti-money laundering
- Clinical archiving that eases HIM implementations like Epic
- InfoArchive for SAP, which reduces the cost and burden of SAP HANA and SAP migration and consolidation

In all these sessions we will show how unlocking data and content from siloed and legacy applications provides valuable information to individuals and business units in a controlled and compliant manner. I look forward to seeing you in Barcelona, the first City of Innovation, for Momentum Europe 2016. Olé, InfoArchive!


OpenText Enhances ECM Functionality and Deepens EIM Offering with Acquisition of Dell EMC’s Enterprise Content Division, including Documentum

Release 16

I am pleased to announce that we have entered into a definitive agreement to acquire Dell EMC’s Enterprise Content Division (ECD), including Documentum. This agreement marks another significant milestone in our journey to redefine enterprise software and represents one of the most important Enterprise Content Management (ECM) acquisitions in the history of the company. The agreement includes the acquisition of Dell EMC’s ECD offerings from the Documentum, InfoArchive, and LEAP product families. When combined with existing OpenText products, these offerings will enable customers to manage content across the enterprise securely and compliantly, decommission legacy apps to make information more available for complete analysis and insight, and create new ways to work across any device. In keeping with our flexible deployment options, they will be available as on-premises, cloud, or hybrid architectures. At OpenText, we are committed to providing our customers with complete automation across the entire EIM flow, from Engagement to Insight. The addition of these offerings strengthens key components of our vision—from Capture to Archive—and adds to the core OpenText product portfolio with capabilities in information life cycle management, ECM, archiving, and collaboration. Furthermore, it accelerates our reach into key verticals, including healthcare, life sciences, and public sector, while adding to our strong presence in financial services, energy, and engineering. ECD’s more than 25 years of experience in ECM both strengthens our position and expands the OpenText EIM offering with domain expertise, increased channel presence, and deep global customer relationships. The addition of ECD will make OpenText the partner of choice for organizations as they seek to harness the power of enterprise information to transform their business. I look forward to welcoming ECD customers, partners, and employees to the OpenText family when the transaction closes.
For more information, read the press release.


The Opportunity Ahead

Dell EMC agreement with OpenText

Today, we are excited to share that we have reached a definitive agreement with OpenText, a global leader in Enterprise Information Management, to combine the Dell EMC Enterprise Content Division (ECD) portfolio (including the Documentum, InfoArchive and LEAP families of products) with OpenText’s existing portfolio of products. The transaction is expected to close in 90 to 120 days. You can read about the terms of the agreement in the press release. In determining the best long-term future for ECD, we wanted to create a business with a leading position in Enterprise Information Management, so we looked for a partner that shares our vision for the transformation to digital business, our passion for the role of information in the digital world, and the breadth of capabilities to help our customers realize that vision. We also looked for a partner that shares our commitment to deliver a world-class total customer experience. And, we sought a partner that valued the industry knowledge, innovative mindset and unique skillsets of our team. I am very pleased to say that we found all of that in OpenText, an industry leader with 9,200 professionals worldwide. Today’s announcement, therefore, presents a compelling opportunity for ECD’s loyal customers and partners, as well as our talented people. Our complementary strengths will produce a leader in both ECM and EIM: an organization with the financial strength, talent base, and global go-to-market scale to serve a marquee customer base. As we work toward the close of the transaction, I assure you that we will continue to provide the world-class care our customers have come to expect. To underscore our joint commitment, OpenText and Dell EMC have also announced their intent to enter into a strategic commercial partnership to expand customer offerings and better serve customer needs.
Customers and partners can continue to realize value from their ECD investments and gain additional value from a richer portfolio of ECM and EIM solutions. Today’s news is great for all stakeholders, and we hope you are as energized as we are about the opportunity ahead.


The Importance of Data Democratization for the Digital Enterprise

Data Democratization

Democratization in business means more transparency and fluidity in the workplace. The formerly well-defined roles of the subordinate and the executive are becoming fuzzier and more malleable, especially in the age of millennials. And in no area of business are these changes more apparent than in the area of data and content – our digital information and assets. Data democratization in the digital age is an interesting and important shift that should be embraced rather than fought. In the new digital age of business, everything that was traditionally known and done in the business world has been called into question. Before the digital age, the hierarchy within a business was very pronounced, with clear levels and rules about what each participant in a business could do and have access to. The digital age has turned this former monarchical hierarchy on its head, leading to an age of data democratization in the business world.

More Data and Content Than We Could Have Imagined

Once upon a time, the data that a business collected was limited and had to be carefully mined, sought out, processed and managed. Only a select few people had access to that data, including the uppermost executives and the analysts who gathered and processed it. The rest of the employees in a company were kept largely out of the loop. However, with the rise of computers and the digital age, businesses face an entirely new working environment. Today, businesses have access to more data than they could have imagined in the past. The vast amount of information available at any given time is quite simply mind-boggling, and it demands data democratization. Because so much information is so readily available, there is also more of a need to include staff members beyond the select few who would previously have been allowed to access it.
Transparency and democratization have become more important in the business world than ever before, and many “old-school” executives may have a hard time with this concept. However, it is vital to understand the importance of data democratization for the digital enterprise so that you can learn to embrace this change rather than fight it. There is a notion that if more of your staff have access to the vast amount of data available, they will not know how to properly interpret or use the data in the context of their job function within the company.

Greater Access to Digital Information is an Advantage

Fears that your employees may misinterpret the data or apply it incorrectly can lead you to try to limit or control access to a select few. This decision, while understandable, will only lead to frustration within your organization, as people who need access to information may not be able to get it – and it runs against the principles of data democratization. What you can do instead is provide employees with gradually building levels of access to the data, based on their needs and their levels of training and competency in interpreting data. Implementing brief training programs and providing easy-to-use analytics tools to sift through and filter the available digital data will help your staff do their jobs more effectively and feel they have fair access to business information within the firm. The democratization of data in the digital age is not something to be feared or countered. Instead, it should be embraced as part of the inevitable changes that the digital age has brought to the business world. The key is to eliminate your legacy data silos, provide user-focused and flexible access to your data and content, and maintain a well-trained staff that is capable of using the available data to your business’s advantage.
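The “gradually building levels of access” approach can be illustrated with a tiny sketch. The tiers and field names below are invented for illustration; a real deployment would drive this from the archive’s or platform’s own access controls rather than application code.

```python
# Illustrative sketch of tiered data access: each tier sees a progressively
# larger subset of fields. Tier names and fields are made-up examples.

ACCESS_TIERS = {
    "basic":     {"order_id", "status"},
    "analyst":   {"order_id", "status", "amount", "region"},
    "executive": {"order_id", "status", "amount", "region", "margin"},
}

def visible_view(record, tier):
    """Return only the fields the given access tier is allowed to see."""
    allowed = ACCESS_TIERS[tier]
    return {k: v for k, v in record.items() if k in allowed}

record = {"order_id": 42, "status": "shipped", "amount": 99.0,
          "region": "EMEA", "margin": 0.31}

print(visible_view(record, "basic"))  # {'order_id': 42, 'status': 'shipped'}
```

As an employee completes training, moving them up a tier is a one-line configuration change rather than a new data pipeline, which is exactly the low-friction growth in access the paragraph above argues for.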


Asia First Impressions: Hungry to Excel

Qfiniti 16.2

I recently moved from Silicon Valley, California to Australia, in a new role supporting ECD customers across Japan and Asia Pacific. What has struck me so far is the number of high-caliber customers in this region who are leveraging enterprise content management (ECM) solutions, and their eagerness to learn and excel. Here are a few such first impressions, which may be somewhat biased as a reflection of my new-arrival status. I hope they are nonetheless helpful to other customers, since we are all likely to learn from our similarities and differences.

What’s Different – Learning & Cloud Adoption

It’s no surprise that, when viewed on a per capita basis, the university-to-population ratios of Hong Kong, Singapore and Australia-New Zealand are higher than those of the US and France (source: Shorelight Education). The willingness to try new things and absorb new ideas is palpable here. Our customers in this region are already adept at partnering with multinationals like OpenText to collaborate and learn quickly. This is something I personally appreciate, since I have always believed the way to become successful in any region and any business is for provider and customer to partner together. What’s also different here, compared to the US for example, is the pace of cloud adoption. It feels faster than in other markets. Perhaps it is because some industries here deal with less regulated content, but it seems that many enterprises are ready to migrate data to a private or public cloud. I think this is driven, in part, by the growing volume of content, as well as by compliance regulations in target markets that require retaining and managing content for longer periods of time. What’s very exciting to me is knowing that our Enterprise Content Division (ECD) is pursuing a strategy that directly aligns with these customer needs. We can tap into our global consultants and specialists to help on compliance issues, and our InfoArchive solution handles extreme volumes of data.
Our partners here in this diverse and broad region are helping tremendously to support our customers' growth.

Some Commonalities – Digital Transformation & Compliance

As I meet with local business leaders, their focus is clear: business growth, and how to adopt best practices that deliver competitive advantage (back to our observation about eagerness to learn). Digital transformation is a common topic of conversation, treated as a powerful concept that can be studied and mastered as a tool for driving growth. It's no wonder that Forrester principal analyst Cheryl McKinnon, in her report "Five Key Trends That Are Shaping How We Manage Enterprise Content," identified digital business as the first of its trends, emphasizing the need to share content across the extended enterprise. This seems to be a universal phenomenon that will fuel the world's most powerful future economies. For some in the region, business growth is occurring through expansion into America and Europe. Regulatory compliance in those markets is more complex and more deeply embedded in business systems. Through technical innovations in content management, these customers can deliver what state or national legislative bodies require while still keeping their enterprises profitable. This is a balance that many companies worldwide have mastered, with the help of our technologies and services. In India, it is the government forging ahead with Digital India to "transform India into a digitally empowered society and knowledge economy." Much of this effort engages proven concepts and best practices we use consistently with customers worldwide. Also in India, major pharmaceutical companies are looking to digital transformation to expedite compliance with drug manufacturing requirements abroad. We can partner with these Indian organizations to grow their business as they aim to supply key drugs globally.
We have addressed these challenges before, as Middle East-based manufacturers entered the US, or as American pharma targeted key European markets, all using Documentum. Forrester principal analyst Tim Sheedy, writing about the Australian tech sector in "Australian Tech Market Outlook, 2016 To 2017," notes that business units are spending more of their budgets on technology. This is consistent with Silicon Valley's ongoing drive for innovation. I have had numerous conversations across the Asia Pacific region, in the Life Sciences sector and in manufacturing, about optimizing content submissions and making ECM workflows more efficient, all through technical innovation. Like many of our customers, some enterprises I have met with view the road to digital transformation as full of unknowns. They are talking to us about how to make key decisions on what to take to the cloud and what to keep on mainframes, for example. Another common issue is how to decommission applications while ensuring the data is managed according to applicable laws and regulations. These are the moments when I realize that the long journey over oceans and land was worth it, because we have solved these kinds of problems before. Having worked with some of the world's largest banks and manufacturers makes it far easier for me to find common ground, despite the differences. Does your enterprise face any of the issues I have shared? Are you using OpenText ECD solutions to increase efficiencies or handle regulations? What's unique about your region or use case? Share your comments below.

Waving the Checkered Flag for Your Compliance Race – Leader in the Magic Quadrant for Structured Data Archiving and Application Retirement

InfoArchive

Drivers on NASCAR superspeedways race at breakneck speeds, especially once tracks are repaved and tires are precisely selected. The record speed is reportedly a whopping 212 miles per hour (341 km/h), set by Bill Elliott in 1987, up from average speeds of about 80 miles per hour (128 km/h) when races started in 1949 (Source: NASCAR.com).

At ECD, we are setting our own record this year for an enterprise content management (ECM) solution, making a quick entry into Gartner's Magic Quadrant. Our InfoArchive solution, initiated in 2014, has already moved into the Leaders quadrant in Gartner's 2016 Magic Quadrant for Structured Data Archiving and Application Retirement. For any company trying to win its own competitive race in industries like Healthcare and Financial Services, this Gartner placement makes it easy to hit the gas on technology selection and innovation. We have seen that companies must quickly leave behind expensive legacy applications, yet still drive ahead with the data and content from those systems. And globally dispersed teams and compliance leads must be able to access that content easily, both to uncover innovative market advantages and to mitigate risk.

For customers who need to store large volumes of both structured and unstructured content, there is much to learn from InfoArchive's rapid market acceleration. It has everything to do with our vision and execution, both of which were rigorously assessed and measured by Gartner for our advancement to the Leaders quadrant. From a vision perspective, it was our close engagement with customers, beginning over eight years ago through our Consulting Services, that led to productizing InfoArchive just two years ago. Real-life services use cases drove our product philosophy, which has never wavered: the solution must be lightweight and open, handle structured and unstructured data, deliver best-in-class compliance support, and continually hone its user interfaces.
For those of you with a diversity of applications, we know you will appreciate this dedication to non-proprietary systems. Rather than limiting you to distinct application migrations such as SharePoint or Oracle, InfoArchive's commitment to an open approach allows us and our partners to manage more content from more systems. Decommissioning applications is no longer constrained by whether their data is in structured or unstructured formats: both can be ingested, migrated, stored, and accessed.

But it's not just our vision that has led to our placement in Gartner's Leaders quadrant and our rapid traction with customers. Execution has been equally important. First, part of our "pit crew" for InfoArchive's design is our customer base, particularly those in our beta program. Their feedback on user personas, for example, prompted us to include both Configuration Manager and Knowledge Worker options for role-based access. It turns out that many organizations have personnel who need to research data, derive value from the greater contextual understanding that consolidating structured and unstructured data provides, and export key information. Customer satisfaction also weighed into the report's conclusions. Thank you to those who participated!

Second, we increased our own "lap speed" by accelerating the cadence of our feature releases. We now deliver InfoArchive improvements on a quarterly basis, to the delight of both existing customers and prospects awaiting a particular product capability. Our aforementioned open platform also speeds development. Most recently, we announced InfoArchive 4.0 Extreme Archiving at Momentum. Thank you to our R&D teams!

It is this combination of vision and execution that has led to our pole position in the latest Gartner report. Most importantly, our winning feature sets and solution improvements enable customers to make a strong finish on their migration plans and compliance requirements.
Are you an InfoArchive customer (or a racing fan)? Share your thoughts below.

(1): This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.