How Today’s Engineers are Embracing a Virtualized Digital First World

In today’s digital-first world, companies face a continuous challenge to ensure that mission-critical business information can be accessed anytime, anywhere. To provide that access, IT infrastructures must cater for a variety of computing platforms with varying levels of performance, mobility and graphics capability.

In an earlier blog article I discussed how the distribution of digital information has slowly become more pervasive across the manufacturing business. I highlighted how the design department of a manufacturing company has traditionally been seen as the early adopter of new technologies. Design information is typically large in file size and graphically intensive, with real-time rendering required to visualize 3D product designs. This presents a challenge when trying to view large 3D CAD models across different hardware platforms.

Over the years, as technology has advanced, more and more departments have been able to access digital information in different ways and this has introduced a number of challenges:

  • Ensuring the security of information so as to avoid unexpected data leaks
  • Providing a way to adhere to regional data sovereignty laws so that information can be retained in-country or in-region
  • Deciding whether digital information should reside on a behind-the-firewall server infrastructure that is only accessible via a VPN connection, or be hosted in a cloud-based infrastructure for greater accessibility
  • Striking a balance between application performance and IT infrastructure costs, so that applications are available 24/7 and the business is not impacted by a network outage or slow connectivity for remote users of network resources
  • Making certain that engineering users, irrespective of location, have access to design applications with no lag in performance, especially when manipulating complex 3D graphics

When I started working for one of the leading CAD/CAM software vendors in the early 1990s, all design applications were hosted on UNIX workstations, at that time a mix of Silicon Graphics, DEC, HP, Sun Microsystems and IBM machines. Fortunately my company had very good relationships with the hardware vendors and we were able to get the latest workstations for demonstration purposes. Our 3D graphics applications at that time were ideal for showing off the performance of the UNIX workstations. From a customer point of view, however, these workstations were very expensive, and unless you were the size of a company such as Ford, Boeing or Caterpillar, it was difficult to get access to sensibly specified UNIX workstations for running CAD/CAM applications.

Over the last twenty years, PC-based workstation technology has improved dramatically and some of the larger discrete manufacturers started to make the shift towards PC-based hardware; this had a knock-on effect on the UNIX workstation market. Coming from the EDI side of OpenText’s business, I know the importance of some of the more ‘mature’ technologies. Companies do not stop using EDI and other mature technologies, because they offer benefits that no other technology on the market can match, and the same can be said of UNIX workstations.

Over the years I have seen a split in the market. Automotive, aerospace and heavy industrial companies have been using Product Lifecycle Management (PLM) solutions from PTC, Dassault and Siemens, and today run these applications on highly specified PC-based platforms. In the high-tech and energy sectors, however, particularly oil and gas, there is still a very heavy dependence on UNIX-based workstations, especially in virtualized environments. But why is there a difference in UNIX versus PC usage between these industries?

Typically in the automotive, aerospace and industrial sectors, manufacturers produce complex 3D models of their final products. These 3D models are used for downstream manufacturing processes such as CNC machining or 3D printing, and for other business processes such as marketing. The products manufactured in these industries lend themselves well to multi-platform viewing tools, for example taking a customer on a virtual tour of their new car simply by using fly-through viewing technology on an Apple iPad. Manufacturers can use PLM technology to build complete virtual models of their products, and in addition to manufacturing and marketing, this digital information can be used for real-time simulations and even for through-life service and support applications. A whole ecosystem has evolved to support these PLM solutions in PC-based environments, driven in part by the customer need to access digital information about a product from any type of platform, from PCs and tablets through to smartphones.

By comparison, the high-tech industry uses Electronic Design Automation (EDA) tools to design the circuitry on its silicon chips. Running simulations is another common requirement across semiconductor manufacturers: being able to test circuit designs and ensure that chips operate within their intended design parameters. Both of these design processes require high-powered workstations to complete the work in a timely manner. The oil and gas industry also performs many different types of simulation, with analysis of seismic surveys being one of the most common. Being able to analyse seismic surveys and construct 3D models of rock formations in near real time to identify potential pockets of oil and gas can significantly speed up the overall exploration process.
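
To put some rough numbers on why, here is a deliberately simplified Python sketch. The survey dimensions are hypothetical and the data is synthetic, so this is nothing like a real vendor workflow, but it shows how quickly even a small survey becomes a serious compute and memory problem:

```python
import numpy as np

# Toy illustration only: synthetic traces and made-up survey dimensions.
# Real surveys run to terabytes and are processed with specialist
# imaging software; this simply shows how fast the numbers add up.
n_shots, n_receivers, n_samples = 200, 500, 2_500   # a small hypothetical survey
traces = np.random.randn(n_shots, n_receivers, n_samples).astype(np.float32)

print(f"Raw trace volume: {traces.nbytes / 1e9:.2f} GB")   # ~1 GB even for this toy case

# A single frequency-domain pass over every trace -- just one of many
# steps in a real imaging workflow -- already touches every sample.
spectra = np.fft.rfft(traces, axis=-1)
print(f"Spectra shape: {spectra.shape}")
```

Scale those dimensions up to a production survey and it is easy to see why this work has traditionally lived on heavily specified workstations and servers rather than on whichever laptop an engineer happens to be carrying.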

But how can remote UNIX users get uninterrupted access to networked UNIX resources in order to run such simulation processes? In another scenario, what if Shell, for example, was working with external design partners such as Halliburton on a new oil processing plant and these partners needed joint access to 3D design information? What if the design partner did not have access to UNIX workstations, let alone the design applications needed to open the 3D CAD models? Today’s design environments are truly collaborative in nature, and this is why a virtualized UNIX environment offers many benefits for companies operating in the high-tech and oil and gas sectors.

UNIX workstations have long been regarded as the design automation workhorse of these industries, which is why they are still used extensively in these sectors today. There is another reason why UNIX workstations are so popular in the oil and gas sector: this sector has traditionally retained staff for long periods, and many design staff will have been in the industry when UNIX workstations started to take over from mainframe-based environments in the late 1980s. However, energy and high-tech companies face another challenge compared with their peers in the discrete manufacturing sectors mentioned above: the flexibility that PC-based platforms have over UNIX. But there is a solution, which I will discuss in a moment.

Over the last twenty years manufacturers have globalized their operations to support their customers, entering new markets such as China and India. They would typically establish new manufacturing plants and, in some cases, regional design offices to support local customer needs. For example, in China consumers typically prefer to be driven rather than drive the car themselves, so many car manufacturers have set up remote design offices in China, requiring them to buy high-end PC-based workstations and PLM design software licenses to run on them. This is great news for the discrete manufacturer, which can scale up its design function quite easily by adding more PCs to its network infrastructure, but what about the semiconductor manufacturers and oil and gas companies that also need to diversify into new markets and globalize their operations? How can they scale up their UNIX infrastructure to support the needs of their global business?

As companies globalize their operations, they need to provide remote access to network resources such as the design applications used across the high-tech and oil and gas sectors, for example those from Cadence and Intergraph respectively. But how do you scale your UNIX-based design infrastructure without adding significant cost to your business, i.e., purchasing more UNIX workstations, and without compromising on network security? The high-tech industry has been plagued by network hacking in recent years: designs for the latest semiconductor chips are stolen from corporate networks and, before you know it, a cloned chip has been manufactured in the Far East. So if you need remote access to a UNIX-based infrastructure, how can you ensure that the connection between the remote user and the location hosting the UNIX application is secure?
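
For context, the traditional answer to that last question has been to tunnel the application’s display back to the user over SSH. The snippet below is a minimal illustration of that approach, with a hypothetical host name and application path; it is secure, but every drawing operation has to cross the network, so over a wide-area link the latency quickly makes interactive 3D work painful. That limitation is exactly what the virtualized approaches discussed below set out to remove.

```python
import subprocess

# Illustration only: the host and application path below are hypothetical.
# "ssh -X" forwards the remote application's X11 display back to the local
# machine over an encrypted SSH connection -- secure, but chatty, so
# performance degrades badly over long-distance links.
subprocess.run([
    "ssh", "-X",                       # enable X11 forwarding
    "designer@unix-host.example.com",  # hypothetical remote UNIX server
    "/opt/eda/bin/layout_editor",      # hypothetical design application
])
```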

My first experience of using a virtualized computing infrastructure was back in the late 1990s, when Sun Microsystems introduced their Java-based Sun Ray workstations. You popped your smart card into the Sun Ray, which identified you to the workstation, and you were presented with a thin client able to access UNIX applications hosted in a remote data centre. At the time I was building complex demonstration environments, and the Sun Ray offered a unique way to access information that would traditionally have required a full-blown UNIX workstation. Admittedly, the processing power of the early Sun Ray was not great for manipulating graphics, and later in its life the Sun Ray was used more for general business applications than for high-end PLM solutions. But the concept of running applications remotely on thin clients was certainly a great idea, and one which is still in wide use to this very day, especially in the high-tech and energy sectors. The technology used to run remote UNIX applications has, however, moved on considerably.

Here at OpenText, we offer a number of solutions to help companies manage, archive and access their digital information, irrespective of the type of platform they might be running on. OpenText Exceed VA TurboX (ETX) is a remote access connectivity solution that allows organizations to deploy UNIX applications virtually: the applications keep running on UNIX servers while users access them remotely through a web browser, with the same user experience as if the application were installed on their desktop, no matter the distance between them and the data centre, and all securely and centrally managed.

ETX is ideal where users need to run high-performance applications:

  • UNIX applications are accessible from anywhere in the world with no decline in performance, always secure and centrally managed
  • ETX lets people work and collaborate virtually on UNIX applications from Windows, Linux and UNIX desktops anywhere in the world
  • It removes the limitations and the complexity of traditional remote access solutions by offering the fastest connection to your business wrapped in a uniquely intuitive user experience
  • Designed for the enterprise data center, it improves the security, manageability and availability of your UNIX applications

So whether your business is in high tech, oil and gas, or another sector such as financial services where virtualised platforms can help reduce operational costs across your IT infrastructure, ETX can help you take a big step into the digital-first world. ETX allows companies to get products to market, or deliver capital projects to their customers, in a much shorter timeframe. It improves how computing resources are deployed and allows a centralized, private cloud to be established. Finally, ETX allows employees to be more productive through 24/7 global access to corporate business information, all achieved through a highly secure and regionally compliant solution.

If you would like a free trial or further information on our ETX solution, please visit our dedicated web page.

About Mark Morley

As Director, Strategic Product Marketing for Business Network, Mark leads the product marketing efforts for B2B Managed Services, drives industry and regional alignment with the overall Business Network product strategy, and looks at how new disruptive technologies will impact future supply chains. Mark also has over 23 years’ industry experience across the discrete manufacturing sector.
