EXPORT TO PARQUET exports a table, columns from a table, or query results to files in the Parquet format. These Parquet files use Snappy compression by default, but the compression parameter also accepts GZIP, Brotli, and ZSTD.
Let’s see how these compression types compare in disk usage:
verticademos=> SELECT COUNT(*) FROM big;
COUNT
-----------
134217728
(1 row)
Snappy compression:
verticademos=> EXPORT TO PARQUET (directory = '/home/dbadmin/parq_snappy') AS SELECT * FROM big;
Rows Exported
---------------
134217728
(1 row)
verticademos=> \! du --summarize -h /home/dbadmin/parq_snappy
2.6G /home/dbadmin/parq_snappy
GZIP compression:
verticademos=> EXPORT TO PARQUET (directory = '/home/dbadmin/parq_gzip', compression='GZIP') AS SELECT * FROM big;
Rows Exported
---------------
134217728
(1 row)
verticademos=> \! du --summarize -h /home/dbadmin/parq_gzip
1.9G /home/dbadmin/parq_gzip
Brotli compression:
verticademos=> EXPORT TO PARQUET (directory = '/home/dbadmin/parq_Brotli', compression='Brotli') AS SELECT * FROM big;
Rows Exported
---------------
134217728
(1 row)
verticademos=> \! du --summarize -h /home/dbadmin/parq_Brotli
1.7G /home/dbadmin/parq_Brotli
ZSTD compression:
verticademos=> EXPORT TO PARQUET (directory = '/home/dbadmin/parq_ZSTD', compression='ZSTD') AS SELECT * FROM big;
Rows Exported
---------------
134217728
(1 row)
verticademos=> \! du --summarize -h /home/dbadmin/parq_ZSTD
1.7G /home/dbadmin/parq_ZSTD
Hint: Although Brotli and ZSTD deliver similar disk-space savings in this example, there are other factors to keep in mind when choosing a method in practice. In particular, ZSTD typically compresses and decompresses much faster than Brotli, making it the better choice for most workloads.
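The ratio-versus-speed tradeoff in the hint above can be sketched with Python's standard-library codecs (gzip, bz2, lzma) as rough stand-ins, since Snappy, Brotli, and ZSTD bindings are not in the standard library. The payload below is made up for illustration; absolute numbers will differ from the Parquet results above, but the pattern (heavier codecs buy smaller output at the cost of time) is the same.

```python
# Illustrative sketch: compare how codecs trade compression ratio for
# speed, using stdlib codecs as stand-ins for Snappy/Brotli/ZSTD.
import bz2
import gzip
import lzma
import time

# Repetitive sample payload, loosely mimicking columnar table data.
payload = b"".join(
    f"row-{i % 1000},value-{i % 50}\n".encode() for i in range(200_000)
)

codecs = {
    "gzip": lambda d: gzip.compress(d, compresslevel=6),
    "bz2": lambda d: bz2.compress(d),
    "lzma": lambda d: lzma.compress(d),
}

for name, fn in codecs.items():
    start = time.perf_counter()
    out = fn(payload)
    elapsed = time.perf_counter() - start
    ratio = len(payload) / len(out)
    print(f"{name:>5}: {len(out):>9} bytes  ratio {ratio:6.1f}x  {elapsed:.2f}s")
```

Running a quick benchmark like this against your own data is a reasonable way to pick a codec before committing to a large export.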
Many organizations that initially used enterprise content management (ECM) systems to tackle large-scale, document-intensive processes have taken a multi-phased approach to process automation. Driven by the desire to be more agile and competitive, they are now working to re-evaluate customer and employee experiences, modernize information strategies and broaden automation throughout the enterprise. And that’s shifting the spotlight to low-code application development.
Digital business automation applications have typically taken months to build and deploy, with developers responsible for translating business requirements into designs and building user experiences and back-end integrations. By eliminating lengthy cycles of development and testing, low-code applications can typically be up and running in a fraction of the time needed before. And they are easier to deploy, adapt and scale.
In addition to delivering faster time to value, low-code development plays a critical role in addressing enterprises’ ongoing challenges with content integration. That’s important, because the amount of information continues to increase—along with the number of content systems—with vast amounts of critical business content residing outside of content management systems.
So, where are organizations on their low-code journeys? Here are five findings from AIIM research that highlight what’s driving low-code development efforts, along with the top criteria for selecting low-code platforms and future investment plans.
1. Low-code application deployments are well under way
Low-code application development, as part of digital transformation initiatives, is being widely used across organizations of all sizes: 53 percent are fully engaged with low-code platforms, with 30 percent using low code for two to three years and 23 percent for at least five years.
2. Extracting context from content is a top concern
Supporting digital transformation involves a huge number of process automation requirements. And there’s a push to create applications that streamline processes to make it easier for the organization to do business. As the volume of enterprise information—much of it unstructured—continues to explode, organizations are challenged to turn content into what AIIM describes as “machine-comprehensible data.”
Seventy-six percent of organizations say the ability to turn unstructured information into structured data that’s capable of being analyzed by machines is “extremely” or “very” important to low-code development efforts.
Organizations need a deeper understanding of the context of content to support process automation. The four most important building blocks to create a modern and agile approach include:
Business content and collaboration—28 percent
Data recognition, extraction and standardization—28 percent
Content analytics and semantics—27 percent
Content integration and migration—26 percent
3. Manual approaches plague back offices
There continues to be a need for automation within key operational processes. Manual approaches and partial automation remain the norm, and most organizations say the following are either less than 50 percent automated or remain completely manual:
Supplier contracts and procurement—65 percent
Human resources—64 percent
Sales proposals and contracts—61 percent
Manufacturing and warehousing—59 percent
Customer correspondence and help desk—59 percent
4. Developer needs and deployment flexibility drive choice
Addressing the needs of IT developers and business analysts is critical. These two titles rank in the top positions when organizations are asked which users will determine low-code development success.
The most important criteria in selecting a low-code platform include:
Packaged business solutions (e.g., finance, HR contracts)—24 percent
Ease of use for “citizen developers” and “professional developers”—24 percent
Support for on-premises and cloud environments—24 percent
For organizations with more than 1,000 employees, the primary selection criteria were based on the perceived viability of the vendor itself; the top two ranking priorities were software revenue growth and software revenue size.
5. Investment plans—more than just talk
As organizations continue to focus on modernization and digital transformation, low-code platforms will play a continued role, with spending predicted to increase.
Fifty-two percent of organizations are actively looking at low-code platforms, with investments on the horizon. Specifically, nearly 20 percent plan to increase spending by 30 percent or more over current levels, and an additional 40 percent plan to increase spending by 10 percent or more.
The OpenText AppWorks advantage
OpenText™ AppWorks™ maximizes the value of information with automated business processes, better decision making and improved employee, partner and customer experiences. The low-code application development environment—available on-premises and in the cloud—enables business and technical users to rapidly build, iterate and deploy process-centric and case management applications that improve efficiency, optimize employee skills and provide business insights. Learn more about how OpenText™ makes it easy and cost effective to implement, maintain and update digital process automation applications.
Every area of business has faced its own challenges over the past 15 months—from sales and marketing, to customer support and product development. On top of all that, cyber threats have grown as criminals rushed to take advantage of remote working and the weaknesses it exposes in cyber resilience.
As we emerge from this crisis, it has become clear that things will not simply go back to the way they were before. Here are five leadership lessons from the pandemic from the OpenText leadership team.
Lesson 1: Communication is key
When supporting customers through a global pandemic, two-way communication and collaboration is vital. In those first few months, our customers were looking to OpenText for solutions as they quickly digitized processes and information and moved to remote work.
“The importance of honest, two-way communication with customers cannot be stressed enough. The move to virtual has meant a lot more time spent on Zoom or Teams, but we’re highly conscious of the quality of this communication, both inside our division and with our customers,” says James McGourlay, EVP, Customer Operations. “Implementing regular communication protocols has enabled us to be proactive in supporting our customers and provide clear, open, and timely communication – and pivot as needed – as we collaborate on their digitization efforts.”
Lesson 2: Digital events are now the norm
Events, conferences and tradeshows changed forever with the global pandemic. But the shift to digital events meant thinking through the digital experience from the customer’s perspective.
“Attendees are looking for different experiences from virtual events than were on offer at in-person conferences,” says Lou Blatt, SVP and CMO. “In the place of live sessions featuring panel discussions and PowerPoint presentations, we’re seeing much more interest in highly compelling product demonstrations and trials. We’ve created ‘click tours,’ which allow people to check out the look and feel of our products, with training embedded in the experience.”
Lesson 3: Process change, remote collaboration and automation are key to innovation during a crisis
Product innovation during a crisis requires the ability to quickly adapt to your customers’ changing priorities and pivot internal processes to better deliver the technology your customers need most.
“In the past year, our ability to accelerate the product pipeline and deliver high-value-added product innovations as new customer priorities emerge has been critical to our success—and to the success of our customers,” says Muhi Majzoub, EVP and Chief Product Officer.
Lesson 4: Cyber resilience is about training and education as much as the tech
The explosion in remote work has meant an equal explosion in new endpoints connected to corporate networks. Coupled with the rise in cyber-attacks, organizations are faced with significant challenges in making their organizations cyber resilient.
“There is an increasing need for organizations to invest in a multi-layered approach to becoming cyber resilient,” says Prentiss Donohue, EVP SMB/Consumer Sales. “Technology solutions are necessary, but it’s only a piece of the puzzle. Training and education are equally critical. And by recognizing this, SMBs – in fact, all businesses – must take responsibility for both the security and education of their people.”
Lesson 5: Rethink traditional customer engagement and communication
Meeting customers face-to-face ceased to be a possibility in March 2020, which forced a rethink of the entire enterprise sales process.
“In terms of the sales cycle, we moved to invest substantially in the earlier parts of the buying process, with the aim to replace some of the normal face-to-face interactions with a digital equivalent,” says Simon “Ted” Harrison, EVP, Enterprise Sales. “By refocusing on how our customers and prospects want us to communicate with them, we have been able to supply relevant, timely and targeted ideas on how OpenText can help them navigate the current work, social and economic challenges.”
Over the past 15 months, the global pandemic shone a spotlight on the role of data in the public sector. Data was critical not only in responding to the pandemic itself, managing lockdowns, assessing symptoms linked to the virus and ultimately rolling out vaccinations, but also in directing the public and keeping them informed.
Many global government organizations have recognized the role that data can play in making them more agile and giving them the tools to adjust services as required. The pandemic has accelerated the UK Government’s existing approach to data innovation and has likely helped inform its new National Data Strategy, which intends to remove barriers and improve collaboration across the public sector.
In addition to helping the country navigate the economic, social and healthcare shocks of 2020, the reliance on data during the pandemic has given the public a greater appetite for official information than ever before.
The pandemic has accelerated change in the UK Government’s use of data internally and externally. The future of public services will be more data-driven as a result.
How does data benefit government?
The trend towards greater use of data in the UK Government long predates the pandemic and, in many ways, departments have already made significant headway in their modernization agenda.
Thinking back to 2012, when GOV.UK was launched and replaced 2,000 websites with just one, and to 2013, when the government committed to a cloud-first policy, the UK Government had made more progress than many private sector businesses of the same scale and complexity.
And the benefits go far beyond cost savings. Fundamentally, using data to offer smarter public services could help relieve pressure on departments that are stretched thin. More digital tools could also make it easier for the public to interact with the government, helping to speed up processes in many departments. Essentially, the real gains lie in how data is processed and managed, which can help the government build a better service for its stakeholders.
And this is already happening today. During lockdown, the Office for National Statistics (ONS) used faster indicators of economic health to provide quicker analysis of GDP compared to what it normally would use. Because national GDP is often published long after the period being analyzed, the organization needed to look at ways to provide a more rapid view of the economy. This included looking at shipping movements, road traffic sensors and VAT.
Through these data sets, it was able to understand the movement of goods into and around the UK as well as exports to paint a picture of economic health. In future, this move towards real-time analysis could help provide more accurate and timely information for the UK Government in all its forms.
One pitfall is reliance on third-party data sources. These, according to the ONS, can include some bias, which it has had to work around by being transparent about where data comes from and where bias may exist. There are also opportunities to use machine learning to understand patterns of bias within datasets, which is key to making faster indicators more useful and accurate for the government.
Finding speed to value in today’s public sector
Looking to the end goal of a truly digital government in a post-pandemic world, there is still much work to be done. Top priorities include improving the consistency and interoperability of technology adoption across organizations and removing the logistical barriers created by legacy systems.
To make the case for modernization and accelerate its approach, governments need to shift their thinking. Focusing beyond cost efficiencies will allow transformation to continue to deliver new and better services. Governments now need to look at the power of information and its ability to transform services in new ways while making budgets for those services stretch further.
Data is at the heart of the digital government transformation. The transition to digital government requires careful planning and a focus on outcomes. To be successful, government CIOs must guide the digital journey, tackling both the strategic and tactical challenges with a relentless focus on using and sharing data to make government services more proactive, and its operations more efficient.
The pandemic has made the case for modernization in government even stronger. The first steps taken so far have enabled the public sector to start to achieve some significant results. The future of digital government could improve on these further – providing excellent public services that meet local and national objectives.
To learn more about how OpenText™ is helping public-sector organizations digitally transform to meet citizen needs, visit our website.
The enterprise resource planning (ERP) software market is set for rapid growth: worth approximately $39 billion in 2019, it is expected to reach $78.4 billion by 2026. Organizations of all sizes are increasingly looking to ERP to help drive their business. And yet, data quality remains a major challenge, undermining the value of these investments.
Automated data processing: The main benefit and a key challenge for ERP
The key benefits of ERP systems relate to their ability to facilitate and automate the delivery of information to employees and senior management, helping to drive business operations and improve decision making. These benefits include faster execution of business processes, centralized access to enterprise-wide data and increased collaboration around that data. However, the value of these benefits is based on the assumption that data in the ERP system can be trusted.
In a world of ubiquitous and infinitely diverse data, there is a vast amount of relevant information that must be captured, tracked, managed and analyzed—and new sources and types of potentially relevant data continue to emerge. According to a recent survey by IDG, 44 percent of transactional data in the ERP system originates outside the organization—in other words, it is created by customers, suppliers, banks, logistics providers and other external parties. Ensuring the quality of data under these circumstances is tricky. Unsurprisingly, surveys show that data accuracy and analytics are two of the top three areas where ERP systems regularly fall short of user expectations.
Poor data quality is costly
It can be counterproductive—or even downright dangerous—to have inaccurate, out-of-date or incomplete data in any system, but the impact is compounded within ERP systems. This is because decisions based on an organization’s ERP data have far-reaching consequences.
Some time ago, Thomas Redman outlined the Rule of Ten. This states that it costs 10 times as much to complete a unit of work when the input data is bad as it does when the data is perfect. In a similar vein, the 1-10-100 rule by George Labovitz and Yu Sang Chang highlights the exponential cost of bad data that’s left unaddressed.
Put simply, good decisions can’t be made on bad data. Bad data can cause a plethora of issues for organizations: stocking up too much inventory, failing to fulfill orders, misreading market dynamics and so on. Issues with data quality undermine operational performance and, ultimately, can even put your relationship with customers at risk.
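The 1-10-100 rule can be made concrete with a toy calculation. The dollar figures below are illustrative units only, not data from the sources cited: the point is that the cost of a data error grows by an order of magnitude at each later stage it is caught.

```python
# Hypothetical illustration of the 1-10-100 rule: a bad record costs
# $1 to verify at entry, $10 to cleanse later, and $100 if it is never
# fixed and causes downstream failures. Figures are illustrative.
COST_PER_RECORD = {
    "verify_at_entry": 1,
    "cleanse_later": 10,
    "fail_downstream": 100,
}

def total_cost(records: int, stage: str) -> int:
    """Cost of handling `records` erroneous records at a given stage."""
    return records * COST_PER_RECORD[stage]

# The same 1,000 bad records, caught at each stage:
for stage in COST_PER_RECORD:
    print(f"{stage:>16}: ${total_cost(1000, stage):,}")
```

Even at these toy scales, letting errors reach downstream processes is two orders of magnitude more expensive than catching them at entry, which is the economic argument for validating data as it enters the ERP system.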
Responding to the data ecosystem
There are many ways in which data enters ERP systems. It can be created in the system by users, brought in via synchronization from other business applications or integrated from external business partners. Making sure that you have the right processes and solutions in place to address data quality across all scenarios is important for overall data quality, which supports operational excellence.
In addition to processing more and more data, many organizations are now moving away from the monolithic ERP implementations of the past. Instead, they are looking to integrate their new core ERP instances with complementary enterprise applications to accelerate time to value, while adding flexibility and scalability into their digital ecosystem.
Internet of Things (IoT) and artificial intelligence (AI) deployments are also growing fast. By 2022, Gartner predicts that 65 percent of organizations will have integrated AI into their ERP systems. And AI, like many other business systems, will only ever be as good as the data available to it.
The importance of modern data integration
The challenge for all organizations is ensuring that only so-called “trusted” data enters their ERP systems. In addition to a robust integration strategy that spans both internal and external systems, best practices involve using enterprise data management capabilities that can handle data from virtually any data source. This helps both to ensure the accuracy and quality of information and to maintain that quality throughout its entire lifecycle.
Modern ERP integration solutions address the critical aspects of connecting systems, as well as managing data quality as part of integration. This can include a range of activities, from simple minimum content validations to elaborate workflows involving checks against reference and master data sets.
Applying these measures mitigates data conflicts and quality inconsistencies within the ERP and other systems. This is critical to ensure timely and secure access to consolidated, clean and accurate data.
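A minimum content validation of the kind described above can be sketched as follows. The field names, master-data set, and sample records are all hypothetical, not part of any specific ERP product's API:

```python
# Illustrative sketch of a "minimum content validation" step such as an
# integration layer might run before a record enters the ERP system.
# All field names and master-data values below are hypothetical.
REQUIRED_FIELDS = {"supplier_id", "invoice_number", "amount", "currency"}
MASTER_SUPPLIERS = {"SUP-001", "SUP-002", "SUP-003"}  # stand-in master data

def validate(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if record.get("supplier_id") not in MASTER_SUPPLIERS:
        errors.append("supplier_id not found in master data")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        errors.append("amount must be a positive number")
    return errors

good = {"supplier_id": "SUP-001", "invoice_number": "INV-42",
        "amount": 99.5, "currency": "EUR"}
bad = {"supplier_id": "SUP-999", "amount": -5}
print(validate(good))  # []
print(validate(bad))
```

In a real integration flow, records that fail checks like these would be routed to an exception queue or returned to the trading partner rather than written into the ERP system.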
It’s not just about technology
The key to success is having the right capabilities and deploying them in an optimal way. This requires modern integration and data management technologies, as well as the expertise to wield them. However, it is often a complex undertaking to build solutions that effectively address data quality across different systems, scenarios and data types in an organization’s specific business context. Without proper planning and management, taking on such an effort can quickly turn into an expensive mess.
That’s why it is essential to have the appropriate technology and skills available. It’s also vital to facilitate collaboration between technical and business stakeholders and to manage this engagement in a strategic way. To coordinate integration efforts and manage day-to-day operations systematically and efficiently, organizations can use integration managed services from providers like OpenText. Such an approach helps to deal with the complexities involved.
Whether your organization is looking to modernize its ERP system(s)—for example, by transitioning to SAP S/4HANA or adopting a cloud ERP like NetSuite—or simply wants to maximize the value of its current ERP investment, it’s important to focus on integrations and your ability to ensure the quality of data as it moves between systems. These actions play a key role in helping you to achieve your goals.
It is no longer about ensuring connectivity in the short-term for a distributed workforce. It is about a much more complex nexus of forces. Businesses must support modern work, connect to global commerce, help build sustainable communities, and engage their customers in new ways, all while keeping their information safe.
Businesses Face New and Complex Forces
These pressures can be a catalyst for sustained change. Organizations have a rare chance to reinvent themselves and secure new opportunities. We have seen how rapidly innovation can occur. Look at vaccine development—the right data, technologies, and knowledge, in the hands of the right experts, create astounding results.
For businesses, a bold pivot is equally possible. Information is their most precious resource. And today’s technologies—the cloud, AI, automation—can empower them to put power behind that information, so they can drive growth and reach their most ambitious goals.
At OpenText World Asia Pacific, happening all online May 20-21, organizations of all sizes will learn how they can use information management to thrive in a new world.
Chart Your Path
This year’s conference will explore the Information Management Journey. There is no one path to digital acceleration. Each business is unique. But we believe the ideal destination for all is the Intelligent, Secure and Connected Business.
What does this look like?
Intelligent companies unite content, customer experiences, analytics and automation to anticipate and react to challenges and opportunities.
Secure companies have multiple layers of defense to detect, defend against, investigate and remediate security threats and data loss.
Connected companies integrate critical business processes and value chains inside the organization, between organizations, and between clouds. This eliminates data silos, automates transactions, and powers analytics and reporting.
Organizations that develop these strengths reap tremendous benefits. They keep virtual workforces and valuable data secure. They build adaptive and sustainable supply chains. And they deliver personalized digital customer experiences at scale.
The Information Management Journey
Leading with an Infinite Mindset: A Conversation with Simon Sinek
I am incredibly excited that this year’s event includes a conversation with optimist, speaker and best-selling author, Simon Sinek. Simon will speak with our Executive Vice President, Global Sales, Ted Harrison, about how to lead your business in a world where disruption is constant. Drawing on his research into companies that build phenomenal cultures, Simon will describe how to identify your company’s vision, create teams that trust each other, and rethink how you react to your competitors. He will consider the differences between those with authority and those who truly lead, as well as whether leaders and visionaries are born or made. His insights will provide practical strategies to help organizations develop genuine resilience, rather than mere stability, and create lasting impacts on our world.
Keynotes: Discover the Ultimate Cloud™
In my CEO Keynote on Day One of OpenText World, I will share exciting details about our NEW initiative, Grow with OpenText. This set of programs brings together everything organizations need to embark on the Information Management Journey. Find out how you can gain an information advantage, and achieve greater scale, efficiency and insight, no matter your point of departure.
Together, we will explore OpenText Cloud Editions—The Ultimate Cloud. This is one piece of software that runs anywhere! And it has five specialized clouds, each focused on addressing a vital business need: Content Cloud for modern work, Business Network Cloud for global commerce and sustainable supply chains, Experience Cloud for modern customer engagement, Security & Protection Cloud to improve cyber resilience, and Developer Cloud to deliver information management as an API. All of it augmented and enhanced with analytics, automation and AI.
OpenText Cloud Editions: The Ultimate Cloud
During my keynote, Ted Harrison will speak with two of our customers, Tata Power and MSIG Insurance, about their own digital acceleration projects for optimizing productivity during remote work and enhancing customer-centricity.
On Day Two, the keynote from our Executive Vice President & Chief Product Officer, Muhi Majzoub, will reveal the new capabilities of OpenText Cloud Editions 21.2, our most integrated, flexible and secure software release to date! Muhi will uncover incredible updates across OpenText’s five clouds—such as new AI services in Digital Experience Cloud to support the customer journey, strengthened self-service testing features in Business Network Cloud to accelerate partner onboarding, and enhancements to Security & Protection Cloud to help investigators and examiners find evidence faster. And 21.2 is anchored by a new Content Services platform to power modern work in the cloud. There are so many new features that I cannot possibly list them all here!
Muhi will also speak with OpenText customer Huhtamaki PPL about the company’s strategies for success in using information management, automation and technologies in the cloud to increase both efficiency and insight.
This Way Up
OpenText has spent three decades earning your trust. We look forward to a future of even greater innovation, with a new battle rhythm of software updates every 90 days and incredible new capabilities to come.
I truly believe that the journey ahead for organizations is upwards and positive. And we are ready to help guide your business through its digital transformation. OpenText experts are empowering organizations to intelligently scale operations to be ready for a digital future. I look forward to showing you how.
I invite you to register now for OpenText World Asia Pacific 2021.
The ransomware attack on Colonial Pipeline was yet another wake up call for critical infrastructure and supply chains to rethink their approach for securing operations. In the past twelve months, ransomware has disrupted operations for supply chain organizations, including:
a European steel manufacturer
a US natural gas supplying facility
a US water treatment facility
a Japanese automotive manufacturer
an Australian logistics company
a South American energy-distribution company
Infrastructure and supply chains are particularly vulnerable to cyberattacks, but for different reasons. Infrastructure security investments tend to be aligned with regulatory requirements rather than “what if” scenarios. Supply chains focus on efficiency and minimizing cost, forcing security proposals to compete with other, more appealing investments. The pandemic further increased the attack surface: enterprises rushed to enable remote employee access, leaving security gaps in their wake.
Supply chains are now the preferred delivery system for malware, whether targeting key infrastructure or other organizations. With ransomware attacks expected to increase sevenfold by 2025 [1], increasing security protections within infrastructure and connected supply chains is a business imperative.
OpenText offers multiple solutions across the Detect, Protect, Respond and Recover model. While no single security tactic will give you 100% protection, these solutions foster a Defense-in-Depth approach in securing your business, operations and assets.
Endpoint Detection and Response: Beyond next-generation antivirus (NGAV) and data protection, world-class detection and response capabilities are essential, such as those found in EnCase Endpoint Security. You need detection and forensic parsing capabilities that sit at the kernel level of your endpoints, below the operating environment. This enables continuous monitoring for anomalous behavior.
Managed Detection and Response: Businesses that are stretched for resources in their Security Operations Centre now have the option of onboarding Managed Detection & Response services that keep eyes-on-glass (analyst-led monitoring) and deliver continuous, machine-automated monitoring of your systems and data sources 24x7x365. OpenText Managed Detection and Response (MDR), for example, pairs best-in-breed technologies with security personnel who have more than 15 years of experience working breach response investigations and malware analysis engagements.
Business Endpoint Detection and DNS Protection: As ransomware actors continue to target all sectors of the economy, it’s essential for the nation’s health and safety that businesses build resilience against – rather than simply defend against – cyber threats. Device and network level security like Webroot® Business Endpoint Protection and Webroot® DNS Protection are essential, but when paired with backup and recovery solutions from Carbonite they work together to undermine ransomware actors and return operations to normal quickly.
Threat Intelligence: Attacks on critical infrastructure like Colonial Pipeline and the water treatment facility in Oldsmar, Florida underscore the need for embedded threat intelligence in process control systems and other internet-connected devices that can be targeted by threat actors bent on causing maximum harm. Webroot BrightCloud® Threat Intelligence gathers and distributes telemetry data from millions of real-world endpoints, providing protection for all cloud-connected devices only minutes after the detection of a new threat.
Identity and Access Management for Third Parties: Supply chain attacks target the weakest spot in almost every operation’s security program: third-party access. Traditional identity and access management (IAM) tools are not built to secure access for decentralized, distributed user populations, providing only a fraction of the security delivered to employee populations. OpenText Identity and Access Management secures and automates access for every third-party person, system or thing connecting to enterprise on-premises and cloud systems.
Industrial IoT: Industrial IoT environments need a platform such as OpenText IoT to securely integrate operational technology (OT) that, as in the Colonial Pipeline case, was not originally designed for today’s connected ecosystems. OpenText Supply Chain Traceability can deliver identity-centric secure device management that verifies each device and its associated data stream, enabling clear, governable integration with enterprise applications and a protected, resilient IT-to-OT operation.
Used in concert, these solutions increase security and derived value as each leverages capabilities from the others. For example, BrightCloud Threat Intelligence enables:
EnCase to increase the likelihood of discovering both known and unknown threats
Identity and Access Management to dynamically re-evaluate external risk signals and take action
IoT to adapt data security and orchestration
Doing so provides OpenText customers with rapid response and remediation of threats to avoid disruption and return business operations to a trusted state quickly.
In an industry generally known for slow adoption of technology, one insurance business discovered that the COVID-19 pandemic provided the catalyst it needed to prioritize digital transformation. Liz Ellis, Senior Application Developer Lead at Grange Insurance, is driving innovation in that company’s multichannel customer documents and communications.
We met with Liz—who’s an early adopter of technology herself—to discuss the challenges and changes she’s seeing in the insurance industry today.
Before we talk about the role of technology in insurance, how did you get into technology?
My dad was a high school science teacher, and he always encouraged me in science and math. That made a big difference in my life. I was also very fortunate to attend a high school—this was back in the ‘80s—that actually had computers and offered classes in programming. One of my math teachers encouraged me to try programming. I took her advice, and I just loved it!
What do you find most interesting about the insurance industry?
You might think that insurance is boring, but the challenges that the industry faces are really quite unique. When I started working at Grange Insurance, I discovered how exciting it was. I would never have dreamed there were so many moving parts to the business.
What is one of the biggest challenges facing the industry?
Before COVID-19, the insurance industry had been very conservative in the adoption of technology, although it was slowly starting to change. But now, since COVID, it seems that the industry as a whole really gets it—that we can’t be slow adopters of technology. We really have to push and be more at the front of the pack of innovation, versus hanging back.
Grange Insurance had already been starting down that path of trying to be more innovative, sooner and faster. But COVID has definitely solidified that. There’s just no time anymore to wait and see. We want to be the one that everybody else is watching, wondering what we’re going to do next. I’m excited about that.
What innovations and trends are you starting to see?
Knowing in real time where you are in the conversation with your customer is a critical thing that companies are paying attention to. This was impossible with older systems that performed batch processing overnight, but all our new systems are real time. Now, if a customer says they want a policy, we can bind that policy and create all the policy documents instantly. We can have a real-time view of what we are doing with the customer, and then we can continue to innovate with our communications. When a customer service rep or an agent is talking with their policyholder, they know where we are as a company in that customer’s journey so that they can provide the best possible care for that customer in that moment. To me, that is exciting.
You recently presented at the OpenText Women in Technology Customer Communication Management User Community. Thinking back to the female math teacher who encouraged you to take computer programming in high school, why do you think something like this is important?
To me, the Women in Tech forum is a great opportunity for women to lift one another up and help one another to grow. I think it’s very important for women, especially in traditionally male-dominated fields, to have that opportunity to come together.
An organization’s efficiency, productivity and innovation can all stem from implementing the latest software applications in a timely and effective manner. Yet this remains a major challenge for many: research shows that only 29% of software implementations are rated a success, and one in five are seen as outright failures. This blog outlines a structured approach to successful software implementation.
Explaining software implementation
Every company uses software every day and each wants to make the most of the IT investments it has made to drive their business and better serve their customers. With an increasing focus on maximizing digital technology during the COVID-19 pandemic, successfully implementing and deploying the latest software is essential. You can define software implementation as the processes and procedures needed to take software applications and tools from planning and development to the production stage.
The process of software implementation can seem like a daunting task. Software has become more complex, so it follows that implementing it has grown more complex too.
Organizations that fail to plan and achieve effective software implementation will not gain the full value from the new system and are likely to waste a great deal of resources on an implementation that is poorly adopted and doesn’t meet their business needs. Evidence suggests that failure rates are actually rising.
How to implement a new software system
OpenText Professional Services has over 25 years of experience successfully implementing Enterprise Information Management (EIM) solutions for companies across the globe. In that time, the team has completed more than 40,000 implementations. Core to this success is a project management methodology designed specifically for software implementations. The structured approach systematically takes you through each stage of the process, from project initiation to delivery.
Initiate
In the Initiate phase, the goals, roles, budgets, timelines and governance are established and clearly documented. The key stakeholders are brought together for the project kick-off to ensure a common direction and commitment. In many instances, a steering committee will be established. From the kick-off, a Project Initiation Document (PID) is created that sets out how the project is to be managed, including resources, responsibilities and how to measure outcomes. Successful software implementations will also develop a Project Charter that establishes governance policies and procedures from the outset.
Control
The Control phase spans the entire software implementation project and is designed to ensure the project stays on track and any issues are identified and addressed as early as possible. It puts in place mechanisms for monitoring and managing all aspects of the project – risks, development issues, required actions and decisions, change requests, budgets and commercials – throughout its life. KPIs established in the Initiate phase, along with project status, are regularly reported to the steering committee.
Execute
The Execute phase is broken down into four elements:
Design: This involves a detailed analysis of features and capabilities to be implemented – including all custom code and integrations to other systems – as well as setting out system usage and the underlying process affected. For this, a Solutions Design Document is developed and an Acceptance Test Plan (ATP) is created that involves real-world use cases of how your staff will actually work with the new system.
Build: This involves developing and delivering the software to meet your specific business needs. The new system is built to the specifications set out in the design phase with milestones established for progress and delivery. The build phase should include structured flexibility to enable change to the solution and project as required. In addition, the project should establish how the migration of existing data and code will be achieved. Finally, software testing models are developed and implemented.
Acceptance Test: In this phase, the Acceptance Test Plan created in the Design phase is executed – allowing for any changes necessary due to alterations in the build. Often a readiness review is conducted to ensure the plan will meet the goals of the project. From there, acceptance testing is rolled out across the users affected. In most cases, acceptance testing will be closely aligned with wider organizational change management programs. It is important to roll out the acceptance testing early, as testers will often be users within the business. By testing early, users can dedicate their time effectively without disrupting day-to-day business needs. Testing can range from an hour or two to several weeks depending on the complexity and number of users.
System Deploy: In this final element, the finished system is moved from testing into production. The transition and live operation are closely monitored to ensure a smooth migration or upgrade, to guarantee the live system is performing to the levels expected and to confirm that user adoption is high.
Close
The final stage is to Close the project after the software has been successfully implemented. The Close stage includes the project wrap-up meeting and final status reports. Customer questionnaires and surveys are sent close to the go-live date to measure how effectively the engagement was managed. Importantly, this stage should also put in place a continuous improvement element that establishes a future roadmap for enhancement and optimization.
Selecting your software implementation model
For most large software implementation programs, a traditional, linear approach is deployed that handles the Execute phase as a series of chronological steps. In this model – often called the waterfall model – one step follows the next to reach the desired outcome (See Figure 1).
There are many benefits to this software implementation approach as it allows for a great deal of structure in each stage and the ability to ensure that each stage has been successful before moving to the next. However, this benefit of structure can also be its greatest weakness as it needs to follow the same approach each time, which can add a significant amount of time to the software development and implementation process.
Figure 1: Software implementation: The waterfall model

However, software implementation is far from a new discipline, and structured approaches have been developed that are proven to help achieve successful outcomes.
To address the challenge of large IT projects taking months or years to deliver, a range of agile development methodologies have become increasingly popular. To accommodate these newer development techniques, software implementation models have evolved to introduce a phased approach that mirrors some of the structures of agile. In this model (See figure 2), the Execute phase becomes a series of discrete activities designed for smaller and more regular delivery of some of the software components and benefits. Each of these discrete cycles is handled as a separate sprint with very lean timescales.
Figure 2: Software implementation: The phased model
Why partner with a provider for software implementation?
Taking a structured approach to software implementation will dramatically enhance your ability to successfully deliver the new system. However, it remains a challenge that requires skills and knowledge in both the software itself and project delivery designed for that software. Many organizations choose to work with a provider and, increasingly, look to the benefits of partnering with the software vendor itself.
For example, the OpenText Professional Services team guides organizations through each stage of a project lifecycle and offers industry-skilled, product-certified implementation consulting services experts across its entire solution portfolio. The team either works closely with an organization’s existing team or takes on complete project management when required.
When looking to partner with either the software vendor or another third-party provider for software implementation, ensure the provider can deliver the following capabilities:
Installation and configuration
The best software implementation providers will help you accelerate system build using specialist tools and processes that ensure faster time to value, whether implementing pre-packaged and pre-configured or bespoke solutions.
Integration and customization
When working with the software vendor, you have the extra benefit that they are uniquely positioned to understand how to customize the software to your specific needs and integrate with other enterprise applications to maximize product functionality.
Migration and digitization
Leading service providers are equipped to help you target and migrate data to the new systems and make it easier to use and understand with analytics integration.
Project management
The provider will leverage experienced Project Managers to steer projects and provide guidance on best practice software implementation methods.
Why partner with OpenText Professional Services?
Today, OpenText has the world’s largest pool of EIM experts certified on OpenText products and solutions. We have a broad global footprint, serving clients in more than 30 countries. Our network includes staff and a set of trusted strategic partners that augment our own teams around the world. We often work as global teams integrating onshore and offshore resources from our Centre of Excellence. In addition, we provide a One Team service in which our Professional Services team works together with the product, support and field groups as a single team that addresses your business needs.
With organizations continuing to push toward digital transformation, the developers and business analysts within these businesses hold the key to accelerate this evolution. Under pressure to help deliver more engaging customer experiences and optimize operations, they’re discovering that low-code development practices—which simplify or remove manual coding from software development—are proving to be the future of digital innovation.
According to Gartner, low-code as a general social and technological movement is expected to continue growing significantly. For example, low-code application platforms (LCAP) are expected to remain the largest component of the low-code development technology market through 2022, increasing nearly 30 percent from 2020 to reach $5.8 billion in 2021.
Why low-code? Why now?
COVID-19 has put more pressure on organizations to respond nimbly, yet strategically, to shifting market and customer demands. The ability to adapt and pivot has never been more important: there’s a need to digitize and take advantage of automation to improve productivity, engagement and support across the entire organization.
Low-code development platforms help developers turn the intense pressure of fast application development cycles into six key results:
Make an impact: Create applications to enable smarter, more personalized experiences for customers and employees
Move faster: Eliminate lengthy cycles of development and testing for faster time to value
Improve business processes: Close process gaps and reduce manual tasks
Modernize and digitize: Replace legacy applications and enhance existing systems with new capabilities
Optimize skills and reduce budget shortfalls: Lower the barrier to entry for non-IT employees
Fuel innovation: Deliver applications to tap into new markets, new customers and new revenue opportunities
The power to create better software faster will push the adoption of low-code platforms to critical mass in the years ahead. According to IDC, low-code and no-code developer populations will significantly expand within the next three to five years; by 2024, three quarters of the Global 2000 will be high-performance, large-scale producers of software-powered innovation.
Unlock enterprise information
Low-code development has grown from a tool for departmental applications into a strategy to support widespread digital transformation. Because enterprise information often sits across a variety of enterprise systems and content repositories, digital transformation is hindered by disconnected processes, disjointed tasks and rigid workflows that do not meet user needs and, ultimately, customer expectations.
Low-code development and applications help create new interactions infused with enterprise information to streamline business processes, improve operations and introduce automation. With the right automation platform, both the business and IT teams can collaborate to access information from disparate systems and re-engineer processes for efficiency. This allows users to interact with information through engaging applications aligned to customer and business needs.
Lean on an application development platform
The best low-code development platforms power digital transformation through enterprise applications that are engaging, smart and easy to deploy. Rapid application development with less IT involvement is made possible by low-code drag-and-drop components, reusable building blocks and accelerators that make solutions easier to build and deploy.
Low-code development platforms allow enterprises to optimize resources through collaboration and to build mobile and web applications faster to automate manual processes and accelerate time to value. Other features and benefits include the following:
Connect and orchestrate information flows across lead applications and other systems from within business applications
Modernize legacy applications without completely rearchitecting them
Take advantage of pre-built, customizable visual reports that aid decision making and eliminate the complex task of building, deploying and using an external business intelligence tool
Tap into artificial intelligence (AI) and analytics for smarter automation and rich applications
Build, iterate and deploy process-centric and case management applications that integrate with other lead applications and systems of record
Create smarter, low-code applications
OpenText™ AppWorks™ is a low-code development platform to support business process automation and dynamic case management applications as part of a digital transformation strategy. Learn more about how OpenText allows business experts to quickly build, deploy, iterate and scale business applications, tapping into existing information assets to generate more business value for the organization, stakeholders and customers.