Our journey began more than a century ago, when Thomas Edison invented the carbon filament lamp. Today, GE Lighting, a Savant company, is taking the lead in developing smart home solutions—helping customers around the world design perfectly lit spaces and smart environments to live, work, and play.
Every year, we deliver millions of products to our customers through our partnerships with online and big-box retailers. Getting our products from the shop floor to our customers’ doorsteps depends on a carefully choreographed fulfillment process.
Forecasting demand to deliver on time at GE Lighting
One of the most important stages in this process is forecasting. To fulfill orders on time, we must anticipate demand across all our markets and align our manufacturing and logistics to ensure we have the optimal quantity and assortment on hand.
We manufacture our products in partnership with dozens of contract and original equipment manufacturers. Information from our suppliers plays a vital role in the forecasting process. For example, we need to know about any manufacturing delays quickly, as they may impact fulfillment downstream.
Firefighting order management issues at GE Lighting
In the past, getting timely information from our suppliers was a tough challenge—and our main pain point was our previous order management tool.
Many of our suppliers are not EDI-enabled, so we relied on the tool to let them share key EDI documents: purchase orders (POs), invoices, and shipping updates, among others. However, the tool was extremely slow and cumbersome to use, and our suppliers regularly ran into technical problems when uploading their data.
Every week, our sourcing team spent hours on the phone firefighting these technical issues with our suppliers. This diverted them from other important tasks. As a result, there was often a significant delay in suppliers updating information on the system. Ultimately this reduced the accuracy of our forecasts.
Transforming our approach with OpenText
We looked for a new order management solution to collaborate more effectively with our dozens of trading partners. Of the four industry-leading solutions we evaluated, OpenText Active Orders stood out as the best fit for our business.
As well as covering our entire procure-to-pay process, the OpenText solution is easy to integrate with our SAP ERP business systems—a critical requirement for us. The OpenText solution is also very responsive and user-friendly. Gone are the days of painfully long loading times. Our suppliers can instantly open the Active Orders web portal and rapidly upload their information.
Collaborating seamlessly with suppliers
One of the biggest differences between our previous tool and the OpenText solution is how smoothly it runs. In the past, our sourcing team would typically field five troubleshooting calls from suppliers every week. Today, we hardly get any reports of technical issues at all.
Every month, we send and receive more than 7,000 files via Active Orders, and our suppliers can now share information with us more quickly and easily than ever. This year alone, we’ve increased on-time PO confirmations by 2%. This translates into more timely data from our suppliers, which in turn helps us forecast more accurately and fulfill millions of orders on time. In fact, Active Orders helped increase on-time fulfillment by 10% over the last 12 months, which helps us to foster greater customer satisfaction.
About the author
Rick Stalker has spent his entire 20+ year career with GE Lighting, a Savant company, after graduating from The Ohio State University with a BS in Computer Science & Engineering. He has held roles supporting various business functions as an application support project leader. His current role is the IT Business Relationship Manager for Supply Chain.
At OpenText, we are fully committed to helping organizations gain the AI advantage to reimagine work, as evidenced by our OpenText™ Aviator announcement last fall. But we won’t stop there – our AI strategy is ambitious and far-reaching because we believe we’ve only scratched the surface of how this innovative technology can elevate us to be more.
One of our goals is to deliver solutions based on Artificial General Intelligence (AGI), a form of AI that moves beyond automating calculations, as AI does today, to actually making choices in a way not unlike how humans solve problems. Still in the research stage, AGI has the ability to understand, learn, and apply knowledge across a wide range of tasks, and it can independently solve problems and adapt to new situations without the need for specific programming for each task.
Unlike so-called “narrow AI,” which is designed to perform a specific function such as voice recognition or image processing, AGI has the capacity to transfer learning from one domain to another, demonstrating a form of intelligence that is versatile and broadly applicable.
“Computers and software have been doing calculations our entire lives; computers are now doing decision support. Predictive and generative AI automate decisions, but are still very rules based,” said Mark J. Barrenechea, CEO and CTO of OpenText, last fall. “AGI will actually make the choice for you.”
As with generative AI and many of the technologies that came before it, AGI holds the promise to do great things, but it could also do harm. It's clear that issues such as the potential for bias, which exists today with generative AI, will also exist with AGI and will need to be addressed, as will regulation of the technology.
Still, hopes are high that AGI will have a positive impact on people, organizations, and the world around us.
“… it is important to recognize that AGI will also offer enormous promise to amplify human innovation and creativity. In medicine, for example, new drugs that would have eluded human scientists working alone could be more easily identified by scientists working with AGI systems,” reads a blog post from The Brookings Institution.
And there’s more AI innovation to come on the path to AGI. Traditionally, OpenText solutions have been about information governance–infusing your organization’s information with automation and management in a way that’s trusted and secure. With last fall’s release of OpenText Aviator, we’re now focused on adding data governance so you can enable AI, search, and IoT to boost productivity and efficiency while maintaining that security and trust. The next step later this year will be decision governance, which will add managing algorithms, learning-data capture and organization, and deploying micro governance in ways that remain secure and trusted.
Our solutions are continuously being updated with new capabilities, so stay tuned over the coming months to see how our AI enhancements can help you overcome your latest business challenges. And learn more about how you can take flight with OpenText Aviator today.
We're thrilled to share some fantastic enhancements that will make your experience with OpenText™ SAP® SuccessFactors® even more seamless and powerful. Here are seven highlights of what's new in Extended ECM for SAP SuccessFactors update 24.1 to keep you in the loop:
1. HR Content Aviator: Introducing a game-changing chat-based feature! HR business users can now ask conversational questions that draw on all employee records, learning, and HR documents stored in Extended ECM for SAP SuccessFactors. This AI-driven update fosters knowledge reuse, providing comprehensive summaries for a deeper understanding of HR content.
AI contract agreement summarization
2. Smart Document Bots: Enjoy a user-friendly interface for setting up document folder storage rules. Existing customers can relax – current settings seamlessly integrate into the new layout.
3. HR Document Generation Output in DOCX: Generate and save .DOCX documents, enabling internal review, modification, and approval before PDF publication. Perfect for flexible contract adjustments and letter creation.
4. DocuSign® Integration: Streamline management of your DocuSign® organization license account. This update allows grouping user accounts for enhanced license management control.
5. Status Icons for External Sharing: OpenText Core Share, when integrated with Extended ECM for SAP SuccessFactors, now includes visual tracking of document sharing statuses. This new feature enables easy identification of sharing, approvals, or inaction from candidates and internal staff, thereby reducing security risks and providing better information to HR business users.
Status icons are used to support external sharing record processes
6. Enhanced System Audit Information: Gain more detailed audit information, including saved annotations, document transformations, and redaction events. Enhance governance and records management with this essential update.
Event audit details provide further information
7. Remove Download Option: Business Admin pages now give you even greater control over the download link, so you can restrict employee record downloads for security and support reasons.
Excited about these updates? We are too! Explore more AI use cases and stay tuned for more innovations and community discussions on Extended ECM for SAP SuccessFactors. Your experience is our priority.
Extended ECM for SAP SuccessFactors Community Event May 2024
Includes peer-to-peer discussions, product feature discovery, and use case insights.
Open to live Extended ECM for SAP SuccessFactors HR customers (please contact your OpenText Sales Executive to participate).
We are thrilled to announce the next generation of Audit Assistant, our innovative machine-learning–assisted auditing of SAST results. Fortify™ now unlocks and reproduces contextual awareness and security expertise from Fortify SAST results for the first time in the history of application security testing.
Why we did it
A fundamental problem with static code analysis has always been that it requires human auditing before the results are actionable, and that auditing is one of the largest sources of non-value-added time in the process.
What’s the value?
Reduce the number of issues that need deep manual examination
Scale application security with existing resources
Maintain consistency in auditing and reporting
Increase ROI on existing Fortify products
Why Static Application Security Testing?
Static Application Security Testing (SAST) is the market of products and services that analyze an application’s source code, bytecode, or binary code for security vulnerabilities. The National Institute of Standards and Technology (NIST) notes that static analysis tools are one of the last lines of defense to eliminate software security vulnerabilities during development or after deployment.
These software security tools and services report weaknesses in source code that can lead to vulnerabilities an adversary could exploit, to the detriment of the enterprise. If source code implements weak security, the business is exposed to additional risk that is unknown without SAST. Static application security testing enables enterprises to know their risk, transform their security posture, and make informed decisions to protect the business.
Confirmation of software vulnerabilities
SAST reports categorize issues by their criticality. Issues must then be manually confirmed as exploitable, or marked as not an issue by expert application security auditors. There are many reasons underlying a determination of not an issue. While some findings are false positives, more often the finding is not relevant because of organizational policy or is not exploitable due to mitigations being in place, or the potentially vulnerable code being unreachable.
SAST tools report findings of potential vulnerabilities in an application by using different analysis methods such as taint, structure, or control flow analysis. Expert auditors are required to validate findings using details specific to their enterprise, such as the context of the application and deployment. When auditors determine a potential software security vulnerability is not an issue, the time spent on verification is non-value added time.
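To make the taint-analysis case concrete, here is a purely illustrative Python snippet of the kind of source-to-sink flow a SAST tool would report; whether the first function is actually exploitable still depends on context, such as upstream input validation or reachability, that only an auditor can supply.

```python
import sqlite3

def find_user(conn: sqlite3.Connection, username: str):
    # Taint analysis would flag this flow: 'username' (an untrusted source)
    # reaches a SQL query built by string formatting (the sink), a potential
    # SQL injection finding that an auditor must confirm or dismiss.
    query = f"SELECT id, email FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_parameterized(conn: sqlite3.Connection, username: str):
    # The parameterized form breaks the taint flow, so no finding is reported.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()
```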
These time-consuming audits have traditionally come at significant cost to the enterprise and expose a fundamental challenge with delivering secure applications; tools and technology do not immediately produce actionable intelligence. The well-recognized cybersecurity skills gap adds to the challenge of software security assurance. Skilled security professionals rightfully command high salaries, and by definition even the best individual contributors cannot be effectively scaled to the enterprise’s needs.
Static analysis tools make the impossible job of securing code possible, and a skilled auditor’s software security expertise verifies actionable findings. Even the best security teams are ultimately limited by the human experience available, which often pales in comparison to the universe of potential software flaws to which the organization is exposed.
The next evolution of secure applications comes from leveraging machine learning to make the process of securing applications under development quick and efficient. These techniques extend the reach and better scale the expertise of security professionals throughout the security development lifecycle.
Machine-learning and predictive analytics: the next generation of SAST
The actionable intelligence problem is addressed through the Fortify scan analytics platform. Fortify has been deploying and validating this approach with Fortify on Demand by OpenText™ for over five years, delivering substantial improvements to the issue auditing process.
This capability is now available to on-premises customers of Fortify Software Security Center via its Fortify Audit Assistant feature. Fortify Audit Assistant identifies the relevant, exploitable vulnerabilities specific to your organization in new static scan results. It does this by employing machine-learning classifiers in Fortify scan analytics that are trained on anonymous metadata from scan results previously audited by software security experts.
Fortify Audit Assistant transmits only anonymized metadata derived from the scan results, called anonymous issue metrics. Neither scan results nor code ever leaves the Software Security Center environment. Issues indicated by the proven Fortify Static Code Analyzer are parsed by Fortify Audit Assistant into non-sensitive attributes.
These attributes include vulnerability category, severity, and measures of code and software security vulnerability complexity—such as the number of inputs, branches, method output types, programming language, file extension, and the analyzer that found the issue. In the case of training data, the auditor's previous determination is also included. The anonymized issue metrics are sent to Fortify scan analytics to train and apply machine-learning classifiers, which identify issues with up to 98 percent accuracy.
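As a conceptual sketch of how such anonymized attributes can train a classifier, the example below uses generic scikit-learn components; the attribute names, values, and verdict labels are illustrative assumptions and do not reflect Fortify's actual metric schema or model.

```python
# Illustrative only: hypothetical anonymized issue metrics, no code, paths,
# or identifiers, just category, severity, structural measures, and the
# auditor's prior verdict as the training label.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction import DictVectorizer

training_metrics = [
    {"category": "SQL Injection", "severity": 4.0, "num_inputs": 3,
     "num_branches": 7, "language": "java", "analyzer": "dataflow"},
    {"category": "Dead Code", "severity": 1.0, "num_inputs": 0,
     "num_branches": 1, "language": "java", "analyzer": "structural"},
]
auditor_verdicts = ["exploitable", "not_an_issue"]  # prior expert determinations

# One-hot encode the categorical attributes and train on the verdicts.
vectorizer = DictVectorizer(sparse=False)
X = vectorizer.fit_transform(training_metrics)
classifier = RandomForestClassifier(n_estimators=200, random_state=0)
classifier.fit(X, auditor_verdicts)
```

In production this training happens in Fortify scan analytics over metrics contributed by many previously audited scans, not on a couple of hand-written rows.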
After processing a new static scan result, Audit Assistant adds its prediction and prediction confidence to the scan results. Based on an organization's risk tolerance and preconfigured confidence thresholds, those issues are then categorized as Exploitable, Indeterminate, or Not an Issue.
The prediction value indicates whether Fortify Audit Assistant considers the issue Exploitable, Not an Issue, or if the prediction confidence falls below the threshold, Indeterminate.
The prediction confidence represents the confidence Audit Assistant has in the accuracy of its prediction.
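The mapping from prediction and confidence to an audit category can be pictured as a simple thresholding step, sketched below; the threshold values and label strings are illustrative and, per the text above, would be configured to match an organization's risk tolerance rather than fixed defaults.

```python
# Illustrative thresholds, not Fortify defaults.
def categorize(prediction: str, confidence: float,
               exploitable_threshold: float = 0.80,
               not_an_issue_threshold: float = 0.90) -> str:
    """Map a predicted label and its confidence to an audit category,
    falling back to Indeterminate when confidence is too low."""
    if prediction == "exploitable" and confidence >= exploitable_threshold:
        return "Exploitable"
    if prediction == "not_an_issue" and confidence >= not_an_issue_threshold:
        return "Not an Issue"
    return "Indeterminate"  # left for a human auditor to review

# Hypothetical classifier output for three newly scanned issues.
for label, confidence in [("exploitable", 0.95),
                          ("not_an_issue", 0.97),
                          ("exploitable", 0.55)]:
    print(label, confidence, "->", categorize(label, confidence))
```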
Without exposing sensitive or identifying information, Fortify Audit Assistant determinations are added back to the scan results and made available for review by an organization's security auditors. Anonymized metadata remains protected in a TLS-encrypted communication channel throughout its transmission to the cloud. Transmitting only the anonymous issue metrics allows the enterprise to scale its software security assurance program and decrease its risk without allowing sensitive data outside of its normal protections. By deriving issue metrics from customer scan results, Audit Assistant empowers the enterprise to leverage the knowledge of thousands of security professionals who have collectively evaluated billions of lines of code.
Without expanding headcount or allocating additional budget, the enterprise’s application security program becomes more efficient and effective through machine learning. This paradigm shift in SAST from scarce human expertise to the limitless scalability of artificial intelligence reduces non-issue findings by up to 90 percent.
Conclusion
This paradigm shift in static analysis dramatically increases return on investment as the time and cost to audit results decrease substantially. Enterprises now have the capability to leverage the knowledge of thousands of security professionals who have collectively evaluated billions of lines of code within their own software assurance programs through Fortify Audit Assistant.
Rather than reduce the breadth of security issues by limiting what analyzers report, Fortify has created the scan analytics platform to distinguish non-issues from real issues automatically. This innovative approach utilizes Big Data analytics to scale secure software assurance to the enterprise without sacrificing scan depth or integrity.
As organizations continue to transition to a DevSecOps environment, application security must be built into their processes. Fortify Audit Assistant directly helps automate the auditing process so deliverable and deployment schedules are met. Fortify Audit Assistant uniquely reduces the repetitive, time-consuming work of issue review through the scan analytics platform.
Enterprises no longer need to accept noisy scan results, make tradeoffs between scan comprehensiveness and time-to-audit, or negatively impact product delivery dates with scan review time. Classifiers trained on anonymous issue metrics reduce the expense of software security assurance programs without the risk of identifiable data transmitted to the cloud.
By opting in to the Fortify product line's community intelligence classifiers, organizations immediately reduce their overall security workload through vulnerability prediction.
Fortify was founded in 2003 and for more than a decade has provided the industry-leading Fortify Static Code Analyzer, delivering a scientifically sound approach to secure software development that enables meaningful, practical testing of the consistency of specifications and implementations.
Learn more about Fortify's Audit Assistant AI with this white paper.
Data privacy is not a static concept. It evolves with the changing needs and expectations of consumers, businesses, and regulators. In 2023, we witnessed some major shifts in the data privacy landscape, such as the introduction of new laws and standards, the emergence of new technologies and threats, and the increased awareness and demand for strengthened data protection from the public. Even though 2024 has just begun, data privacy challenges show no signs of slowing down and organizations need to find effective ways to adapt to a changing data privacy landscape.
In this blog post, we will summarize some of the key insights from Greg Clark, Director of Product Management for Voltage Data Security™, who shared his views on the upcoming data privacy trends, best practices, and how to take privacy to the next level in several articles published during Data Privacy Week 2024.
Here are some of the main takeaways:
Data privacy is not just about compliance. Privacy 2.0 is a new paradigm that goes beyond compliance and focuses on building data trust with customers through ethical data practices. It requires organizations to adopt a holistic approach to data governance, security, and ethics, and to align their data security strategies with their business objectives and customer expectations.
Data privacy is not going to kill AI. Data will grow exponentially with AI, machine learning (ML) and generative AI. Using disparate methods to collect, process and manage data will no longer be enough. In today’s increasingly digitized world, a modern data privacy program needs to unify data discovery and protection to improve privacy and security postures. Only by doing so can you remediate risks and take privacy to the next level.
Data privacy is not only a legal obligation, but also a competitive advantage. Organizations that can demonstrate their commitment to data privacy can gain customer loyalty, improved brand reputation, and market differentiation. On the other hand, those that fail to protect their data can face reputational damage, regulatory fines, and legal liabilities.
Data privacy is not a one-size-fits-all solution. It depends on the context, purpose, and sensitivity of the data being collected, processed, and shared. Organizations need to conduct regular data audits and assessments to identify their data assets, risks, and obligations, and to implement appropriate controls and measures to safeguard their data.
Data privacy is not a static state. It is a dynamic process that requires constant monitoring, review, and improvement. Organizations need to keep up with the evolving data privacy landscape and adapt their policies and practices accordingly. They also need to educate and empower their employees and customers on data privacy rights and responsibilities.
Data privacy is not a solo effort. It is a collaborative endeavor that involves multiple stakeholders across the organization and beyond. Organizations need to establish clear roles and responsibilities for data privacy, and to foster a culture of data privacy across all levels and functions. They also need to engage with external partners, such as regulators, vendors, and customers, to ensure alignment and compliance on data privacy issues.
Here are the links to the full articles OpenText Cybersecurity was featured in for Data Privacy Week 2024:
For more information on how our OpenText™ Voltage privacy-enhancing and privacy-preserving technologies can help you take privacy to the next level in 2024, read this solution flyer and visit the following webpage.
Selecting the right financial technology partner is a decision that carries long-lasting implications for a bank's success and future resilience, no matter its size. The dynamic nature of the financial industry and the relentless pace of technological advancement make this decision even more critical, highlighting the need for a strategic approach to choosing a partner that aligns with your bank's vision and values.
Making a strategic decision on a financial technology partnership
For a bank, partnering with the right technology provider is an investment in the bank’s ability to innovate. It can also affect its ability to adapt to changing market demands and provide a secure and seamless customer experience. In this context, a strategic partnership goes beyond the functional features of a technology solution. It’s about identifying a provider that shares the same vision for success and innovation as the bank.
Key considerations for a strategic partnership
Understand your bank’s needs
Before embarking on the journey to find a technology partner, it is crucial to understand your bank’s specific needs and goals clearly. Consider the unique challenges you face, the customer experience you aim to deliver, and technology’s role in achieving these objectives. A technology partner should meet your current operational requirements and align with your long-term vision for growth and innovation.
Emphasize compatibility and alignment
Features and functionalities of a technology solution are important. However, you should also assess whether the partner's values, approach to innovation, and commitment to security align with the principles of your bank. Compatibility in these areas ensures a smooth integration of the technology into the bank's existing vision.
Evaluate technological capabilities
While technological capabilities are undoubtedly crucial, it’s about more than just the breadth of features. Look for a partner whose technology stack is complemented by a commitment to continuous innovation. The ability to adapt to emerging trends and evolving customer expectations is a characteristic of a partner ready to navigate the future alongside your bank.
Prioritize scalability, flexibility, and financial stability
A technology partner should offer solutions that can seamlessly scale with the bank's growth. They should also adapt to changing market dynamics and accommodate future technological advancements. The ability to flexibly integrate new functionalities or adjust to evolving regulatory requirements is a critical indicator of a partner committed to long-term collaboration. Financial stability is equally crucial. A financially stable partner assures that your collaborative journey will be built on a solid foundation, even during uncertain economic times.
Assess the partner’s industry reputation
Reputation speaks for itself. A technology provider’s reputation within the industry reflects its track record, reliability, and practices. Consider the provider’s standing among its peers and its history of successful partnerships. Positive client testimonials and industry recognition indicate a partner capable of delivering on promises and fostering trust.
Security and compliance as non-negotiables
Security is the top priority in the world of banking. Ensure that any potential technology partner prioritizes robust security measures and compliance with industry regulations. A commitment to data protection, adherence to privacy standards, and a proactive approach to cybersecurity are foundational elements of a trustworthy and reliable partner.
The checklist for success
As banks begin their selection process, we encourage them to leverage this checklist—an interactive tool meticulously designed to highlight key areas discussed in this blog. It enables banks to compare the features and functionalities of different financial technology providers against their overarching vision.
Input the name of the fintech provider in the second column and check the corresponding boxes based on their performance in each category. The third column ranks the overarching significance of each statement, aligning with the bank’s priorities. Upon completion, this checklist becomes a valuable tool, significantly easing the process of selecting the ideal partner for your bank.
In the ever-evolving journey of banking, the right technology partner should not just be a vendor; they should be an extension of your bank’s vision, a collaborator in innovation, and a partner in ensuring that your bank not only meets the demands of today but is prepared to thrive in the challenges of tomorrow.
Get the Financial Technology Checklist
Banks face a critical choice when selecting a strategic partner. Should they join forces with a dynamic fintech startup or a well-established technology provider with a proven track record? What features matter? This checklist helps you assess the benefits of a dependable, long-term partner and how their experience and security can enhance your strategic, long-term objectives. Get the checklist.
Digital business files have replaced many paper documents, and the volume of content is expected to soar in the coming years. Every day, I talk to organizations leveraging intelligent document processing solutions to help them cope with the digital document deluge. But even today’s automated platforms can fall behind.
Traditional machine learning lost a step
As document content and layouts change over time, systems require costly, time-consuming manual tasks that reduce efficiency and revenue. AI adds efficiency and accuracy to automated capture workflows. Unfortunately, machine learning models can also take time and resources to train and calibrate.
Machine learning accuracy drifts and degrades over time as the layouts of incoming documents change. Keeping models accurate relies on periodic updates by data scientists in a labor-intensive cycle of retraining, sometimes at the code and database level. These specialized skill sets and activities come at a considerable cost to the organization. Updates typically occur only periodically and without input from key knowledge workers.
Take the leap with continuous machine learning
There is an ongoing shift taking place from machine learning to continuous machine learning (CML). Many organizations have turned to CML to address their content classification and data extraction needs and enable intelligent document processing. With CML, models are updated on the go as they encounter new data and layouts in production. Updates occur in real time in small batches, which reduces computational time. More importantly, CML reduces the data and human resources required to retrain machine learning models.
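To make the mechanics concrete, here is a minimal sketch of continuous learning built from generic scikit-learn components; the document classes, vectorizer, and model choice are illustrative assumptions, not a description of any specific OpenText component.

```python
# Minimal continuous-learning sketch: a stateless text vectorizer plus a model
# that supports incremental updates via partial_fit on small batches.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

CLASSES = ["invoice", "purchase_order", "delivery_note"]  # illustrative doc types
vectorizer = HashingVectorizer(n_features=2**16)          # no re-fitting needed
model = SGDClassifier()

def update_on_batch(texts, labels):
    """Fold a small batch of newly reviewed documents into the live model."""
    X = vectorizer.transform(texts)
    model.partial_fit(X, labels, classes=CLASSES)

# Each small batch updates the model in place, so no full retraining cycle
# or separate data-science effort is needed when document layouts drift.
update_on_batch(["Invoice no. 1234, total due EUR 980"], ["invoice"])
update_on_batch(["PO 5678: 40 units of LED panel"], ["purchase_order"])
```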
How does OpenText leverage CML for information capture and intelligent document processing?
OpenText leverages a CML approach that offers flexibility, accuracy, and efficiency for automated information capture while minimizing or eliminating manual machine learning model retraining.
OpenText information capture products and intelligent document processing solutions solve the machine learning challenge by embedding CML. An AI approach to information capture and data extraction, continuous machine learning eliminates data staleness through an ongoing refresh as the model self-corrects and relearns. Humans in the loop ensure data accuracy as part of daily production runs – eliminating the need for week- and month-long pauses as data scientists scrub data sets to retrain models.
The OpenText approach to CML relies on methodology embedded in its Information Extraction Engine (IEE). Data and differing layouts can quickly be reinforced with just a few clicks by a knowledge worker using a human-in-the-loop UI. IEE continuously assesses human feedback to reinforce or adjust the model accordingly. IEE eliminates the need for a team of data scientists to maintain and retrain machine learning models.
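As a generic sketch of this human-in-the-loop pattern, and explicitly not the IEE implementation itself, the snippet below routes low-confidence predictions to a reviewer and immediately folds the confirmed label back into the live model; the label set, seed sample, and 0.7 confidence threshold are assumptions for illustration.

```python
# Generic human-in-the-loop sketch, not OpenText IEE.
import numpy as np
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

CLASSES = ["invoice", "purchase_order", "delivery_note"]
vectorizer = HashingVectorizer(n_features=2**16)
model = SGDClassifier(loss="log_loss")  # probabilistic output ("log" on older scikit-learn)

# Bootstrap once so the model knows the label set before live use.
model.partial_fit(vectorizer.transform(["Invoice no. 1, total 10 EUR"]),
                  ["invoice"], classes=CLASSES)

def classify_with_feedback(text, ask_reviewer, threshold=0.7):
    """Return the model's label, or a reviewer's label when confidence is low,
    and reinforce the live model with whichever label was confirmed."""
    X = vectorizer.transform([text])
    probabilities = model.predict_proba(X)[0]
    label = model.classes_[int(np.argmax(probabilities))]
    if probabilities.max() < threshold:
        label = ask_reviewer(text)      # a few clicks in a review UI
    model.partial_fit(X, [label])       # the confirmed label reinforces the model
    return label
```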
The name says it all: microlearning refers to learning in small units. Instead of working through an extensive training manual or participating in a full-day seminar, microlearning allows individuals to absorb highly compressed portions of knowledge in just a few minutes.
Microlearning has become increasingly popular in recent years. Several studies confirm that organizations appreciate the advantages of this modern form of learning and are increasingly relying on small-scale training. Plenty of published research and literature also indicates that microlearning is one of the most sustainable forms of learning.
The learning units within microlearning are called ‘learning nuggets’ and are characterized by the following:
Short content on just a single topic
A time commitment of typically between two and five minutes
Easy-to-adopt knowledge
Simulations for practical action following the learning unit
Self-paced learning anytime and anywhere
There are numerous examples of microlearning in daily life: in sports, when you want to learn a particular technique, or in music, when you want to learn how to use a guitar pedal to play a particular riff. In both cases you don't need much theory; you need some practical guidance, and then you need to practice yourself.
Learning nuggets are usually offered in the form of short videos, infographics, quizzes or text (like this blog!) and if possible, with an interactive component. The nuggets always deal with a single topic or a clearly defined part of a topic. Extensive aspects of a major topic should be broken down into several learning nuggets.
The nuggets are an excellent way to train on new content, but they are also helpful for newly hired staff, staff who were absent during a change or an update, or anyone who does not use a particular piece of software frequently. They provide visually interactive learning with some simulated practice in a digestible form that teaches what is important without putting pressure on the user through lengthy sessions.
Learning concepts for new software implementations – what’s in it for me?
When it comes to learning concepts for new software implementations, we typically find the following educational needs:
Time efficient
Pragmatic approach
Task driven
Repeatable
Self-paced
As we can see, the microlearning methodology meets exactly these needs and is a perfect fit. It also delivers a very time- and cost-effective way to get employees up to speed. Instead of providing hundreds of pages of documentation and user manuals, a well-defined set of learning nuggets is far more efficient, not only for the user but also from a development and maintenance point of view. It is also much easier for the user to digest the content and learn what's really needed for their daily job. Finally, it can deepen their knowledge, as the user can immediately practice what they have just learned.
To maximize user adoption, OpenText™ offers services to help organizations develop microlearning content using the concept described above. Customers with implementations of various OpenText products in industries such as pharmaceuticals, engineering, and government trust this type of learner experience.
The OpenText Learning Services team can also help with your adoption strategy, working closely with you to develop a tailored solution for your users to realize your business outcomes. For more information, visit Learning Services or contact us at training@opentext.com.
Data is one of the most valuable assets for any organization, but it also comes with risks and challenges. Strong data security is essential for complying with regulations, protecting customer trust, and avoiding costly breaches. However, traditional data protection techniques can introduce performance issues and complexity, and drive up your overall cloud compute and storage costs.
Snowflake Horizon, Snowflake’s built-in governance solution, addresses data security, privacy, and compliance issues, allowing customers to efficiently take action on their data and apps in a governed, secure environment. Snowflake Horizon makes it easy to integrate with security solutions like Voltage SecureData™ to enhance data protection across your entire data estate.
If you are looking for a way to conduct analytics at scale without compromising on data security, the Voltage SecureData integration with Snowflake Horizon is a great option to consider. SecureData enables you to encrypt, tokenize, or mask your data before loading it into the Snowflake Data Cloud, or as it lands. By doing so, you can reduce the risk of data breaches, comply with privacy regulations, and preserve the usability of your data for analytics and business intelligence, even in its protected form.
The Voltage SecureData integration with Snowflake Horizon is a comprehensive data protection solution that integrates seamlessly with Snowflake’s native features and capabilities, including external functions and dynamic data masking policies. You can use Voltage SecureData to protect any type of data, such as personally identifiable information, personal health information, financial data, intellectual property, or trade secrets. You can also choose from different protection methods, such as format-preserving encryption, secure stateless tokenization, or format-preserving hash, depending on your security and performance requirements.
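As a rough illustration of how such an integration is typically wired up, and not the actual Voltage SecureData deployment steps, the Python sketch below uses the Snowflake connector to attach a dynamic data masking policy that calls an assumed external function (here named VOLTAGE_DECRYPT_SSN); the account, table, column, role, and function names are hypothetical placeholders.

```python
# Sketch only: assumes an external function VOLTAGE_DECRYPT_SSN has already been
# registered in Snowflake to call a Voltage SecureData access endpoint.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
cur = conn.cursor()

# Dynamic data masking: only ANALYST_ROLE sees clear text; every other role
# keeps working with the format-preserving protected values stored in the table.
cur.execute("""
    CREATE OR REPLACE MASKING POLICY ssn_reveal AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('ANALYST_ROLE') THEN VOLTAGE_DECRYPT_SSN(val)
        ELSE val
      END
""")
cur.execute("ALTER TABLE customers MODIFY COLUMN ssn SET MASKING POLICY ssn_reveal")
```

Queries from ANALYST_ROLE then return clear-text values transparently, while all other roles continue to run analytics on the protected form.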
The Voltage SecureData integration with Snowflake Horizon offers several benefits for your organization:
High-scale analytics: Voltage SecureData enables high-scale, high-performance, and secure data analytics, data science, and data sharing in the cloud. It protects sensitive data before, during, and after it lands in the Snowflake Data Cloud, using industry-leading data-centric techniques that preserve the value and format of the data.
Enhanced data security: Voltage SecureData allows you to perform analytics and business intelligence on your protected data with role-based access enabled directly on specific data elements via SQL function calls or conducted transparently in combination with Snowflake masking policies.
Secure third-party data sharing: Import data already protected by Voltage SecureData, analyze it on other platforms and clouds without removing protection, and enable analytics to be monetized by third parties external to Snowflake.
Reduced compliance costs: Voltage SecureData helps you comply with various data privacy regulations, such as GDPR, CPRA, HIPAA, PCI-DSS, and more, saving you time and money.
Learn more about Voltage SecureData Integrations for Snowflake here.
At Manutan, we equip businesses and communities with the products and services they require to succeed. Headquartered in France, our company has three divisions, serving companies, local authorities, and tradespeople, and employs 2,200 people across 27 subsidiaries.
For more than 50 years, we’ve stayed one step ahead of market trends to shape a compelling product and service offering. Today, our goal is to deliver a consistently excellent end-to-end customer experience on every channel—which means keeping customers informed about the status of their orders every step of the way.
Optimizing customer communications
As we grew our business, we saw an opportunity to standardize our approach to customer communications across all 17 countries that we operate in. With 12 different languages and specific local rules and regulations to address, harmonizing our operations was no simple task.
At the same time, we wanted to improve our approach to communications in another domain: B2B integration. Effective collaboration with our partners is essential to orchestrate our supply chain effectively and deliver to customers on time—particularly during peak retail periods. So, we also looked for a way to engage more effectively with our 3,000-strong supplier base.
Finding a winning formula
Our relationship with OpenText began more than 15 years ago. It started with us deploying OpenText™ Exstream™ as our centralized, company-wide customer communications platform. Since then, we’ve augmented our capabilities—enabling us to orchestrate omnichannel customer experiences at scale.
As Manutan has evolved, so have the OpenText solutions we use. With around 65% of all orders now coming through our e-commerce site, OpenText is helping us to deliver tailored digital communications at scale.
Keeping customers and suppliers in the loop
Today, our customer communications platform is closely integrated with our Microsoft Dynamics AX ERP solution. We’ve created multi-language templates with 107 different themes to meet the needs of our international subsidiaries. By taking advantage of the flexibility of the OpenText solution, we were able to include a high level of customization without compromising on the quality or consistency of our content.
We're also using OpenText to help manage B2B transactions and enhance collaboration across our supply chain. Customers and employees can now access the information they need quickly and easily, from updates on purchase orders to delivery notes, credit requests and order statuses. Through OpenText™ Notifications, we provide invoices to customers in their preferred format and language. With the built-in traceability feature, we can keep track of when emails are delivered, opened, and downloaded. In the future, we plan to add more channels, such as WhatsApp and mobile push notifications.
Seeing loyalty soar
Thanks to consistent, timely communications enabled by OpenText, and the guidance of their Professional Services organization, we’re developing deeper connections with our customers. By customizing these communications to the requirements of each region, we elevate the customer experience without sacrificing efficiency.
We’ve improved our ability to communicate with trading partners, too. By enhancing data exchange with suppliers, we’re promoting transparency and traceability throughout our supply chain. That helps us plan more effectively, maximize sales opportunities during periods of peak demand and provide fast deliveries.
Working towards greater sustainability
With consistent, ultra-personalized digital communications from OpenText Exstream and OpenText Notifications, we’re cutting our use of paper dramatically. Environmental sustainability is an important part of our promise to customers, and we’re delighted that the solution has helped us to reduce our carbon footprint.
With OpenText supporting us on our business growth journey, we’re in a strong position to build on our market leadership in the years ahead.
Guest author: Michael Sarrasin is Exstream Product Owner at Manutan. Since 2010, he has deployed and maintained the solution across the company’s global subsidiaries. He resides in Gonesse, a suburb of Paris, France.