Technology Agnosticism Is Within Reach With Interoperability Tools

By Wayne Citrin, founder and CTO of JNBridge  |  October 4, 2016

With the proliferation of new languages and platforms, the ability to be technology-agnostic has evolved from a “nice to have” to a “need to have” — and developers face this reality daily. They must select parts of a solution from a wide array of platforms and other technologies with the aim of developing the best or most suitable solution for the end user. But it’s not enough to be well-versed in various languages if the components themselves don’t get along (as is the case with .NET and Java). Thus, the true promise of the technology-agnostic developer can only be realized when interoperability tools are used. Otherwise, mixing and matching whatever technologies are necessary to build the best possible system becomes a time-consuming and costly task.

Consider the following scenario. A developer is working on a project that’s implemented in .NET. At some point, the developer realizes that a key piece of functionality is either already implemented in a best-of-breed Java package or the organization has already invested in a Java package that serves the purpose. Does the developer resolve the problem by throwing away the Java package and purchasing or implementing the functionality in .NET, or by throwing away everything done in .NET and re-implementing the system in Java just to make use of this piece of functionality? Hopefully, he or she would use both the new .NET application and the existing Java package, as effort and money have already been invested in each, and presumably each technology was chosen because it was most suitable for its task. This is one way in which interoperability tools can support technology agnosticism.

Avoiding Developer Fatigue With Technology Agnosticism

Technology agnosticism can also, perhaps paradoxically, help prevent developer fatigue. It allows a developer to assemble a solution mostly from familiar, proven technologies and to strictly limit the use of unfamiliar and emerging technologies to those components where the new technology's contribution is truly crucial to the success of the project.

Joe Ryan recently suggested that one should be technology-agnostic in every development project, and that not being technology-agnostic unnecessarily narrows the options when searching for the best solution to a problem. This makes sense: developers should not be constrained by particular underlying technologies when looking for the best solution. If one part of a project is best implemented in .NET and another part is best implemented in Java, go ahead and do it that way, and use an interoperability solution to make it work.

In Ryan’s article, developers choose to be technology-agnostic and build their solutions from multiple technologies because it leads to the best solution. In many other cases, however, developers have no choice; they may be forced to be technology-agnostic for either business or technical reasons.

Making it Work With Interoperability Tools

Whether by force or by choice, being technology-agnostic clearly requires the resulting mishmash of technologies to get along. Interoperability tools and solutions can help here. Following are some examples.

  • Mergers and acquisitions: Many companies have been in a situation where they've acquired a business unit or a product that was based on Java while they themselves were .NET-centric, or vice versa. While the initial impulse might be to rewrite the newly acquired software for the company's usual platform, such an effort is expensive, time-consuming, potentially error-prone, and a waste of resources. These companies were able to integrate the systems easily and efficiently using an interoperability solution.


  • Interoperability with trading partners: Even when a company has a uniform, enterprise-wide technology policy, or has otherwise successfully solved its in-house integration issues, there's still the problem of integrating with its trading partners' software, where there's clearly no guarantee that the partners' systems will be compatible with the company's own. While REST and other web-based services are one way to support this sort of interoperability, such services may have insufficient throughput or expose insufficiently detailed functionality. It may also be the case that a partner's systems are not web-enabled. In such cases, interoperability tools can easily be used to implement a business-to-business solution when one trading partner is using .NET and the other is using Java.


  • Multi-targeted APIs: A software vendor may have a product targeted to a particular platform, with APIs targeted to that platform; for example, a .NET application with .NET APIs, or a Java application with Java APIs. Interoperability solutions can be used to quickly create a new API targeted to the other platform, so that the vendor can expand its customer base to include developers who build on a platform different from the one on which the original application was implemented. For example, one company decided to take a Java-based product and add the ability to call .NET binaries from the product's programs. Interoperability tools made this happen, and the resulting new feature was well received by the company's customer community.


  • Migration scenarios: In one particularly interesting scenario, a provider of process control solutions for semiconductor manufacturers used technology agnosticism as a strategy to evolve and migrate a system over time. The company had a large in-house application that was developed in Java. After a number of years in production, the company decided that the application should be migrated to .NET. Rather than rewrite the whole thing at once, a process that could take years, the company rewrote smaller layers of the application for the new platform and used an interoperability solution to bridge between the newly rewritten layers and the legacy portions of the application. As more layers were rewritten, the cross-platform bridge was progressively moved; the process will continue until the entire application has been rewritten. During this extended process, the evolving application could be tested and kept in production, so functionality was never lost. The company estimates that it saved $1.6 million by taking this approach, so technology agnosticism can save money, too.


  • A vendor changes its technology: Sometimes, you may have happily settled on a single platform for your product, but then one of your vendors makes the fateful decision to change its platform. Do you follow suit, or do you stay with your existing technology? One company, a software provider that helps customers gain better insight into their business intelligence applications, faced exactly this problem when SAP's Business Objects, upon which the company's product was partly based, converted to Java. The company's .NET-based product was effectively disrupted by the change, but rather than completely convert the product to Java, the company kept it in .NET and reimplemented the integration with Business Objects using an interoperability tool, saving time and expense compared to reimplementing everything in Java.


  • Non-uniform technology policies: Even in the absence of acquisitions, different divisions of a company may have centered on different platforms for their IT. In such cases, engineers attempting to integrate these different technology silos may be forced to bridge the gap – in this case, between Java and .NET – using interoperability tools.


  • Customer requirements: The example at the beginning of this article is one instance where a developer may be forced to be technology-agnostic because their customer requires that part of the solution use a technology different from the one that the developer might have ordinarily chosen. A company specializing in document imaging, data capture, and payment processing experienced a similar problem when it needed to integrate its .NET-based document processing system into a customer’s existing Java-based pharmacy benefits management system. As in the initial example, the company was able to use an interoperability solution to accommodate this requirement and support a solution that incorporated both Java and .NET.


Interoperability and Streamlining Data Processes

Similarly, interoperability tools also streamline processes from a data perspective, such as when a developer is using the Java-based Hadoop component HBase and wants to extract data and display it in Excel. Hadoop can reduce large amounts of data to a meaningful answer in a short amount of time; however, without understanding the shape of the data, developers run the risk of garbage in, garbage out. Analysis itself is an iterative process that relies on investigation. Tools that aid data investigation provide a means to quickly view, sort, filter/reduce, and represent data, making it easier to find and understand patterns, trends, and relationships.

Microsoft Excel has long been the go-to off-the-shelf tool for data analysis, and it makes a ready-made front end for Hadoop. Excel can be extended using add-ins developed in Visual Studio, with the goal of making the add-in generic with respect to the column definitions and data in an HBase table. Building an Excel add-in that can view any HBase table and its column families, and that provides filtering, is relatively straightforward, but only when the HBase Java client APIs are leveraged through interoperability tools that create .NET proxies; a sketch of the underlying Java calls appears below.
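To make this concrete, here is a minimal sketch of the kind of HBase Java client calls such an add-in would drive through generated .NET proxies: opening a connection, scanning a table, and applying a simple server-side filter. The table name ("mytable"), column family ("cf"), and filter values are placeholders introduced for illustration, not names from any particular product.

```java
// Sketch: reading rows from an HBase table with the standard Java client API.
// An interoperability tool would expose these classes to a Visual Studio
// Excel add-in as .NET proxies. "mytable" and "cf" are placeholder names.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.filter.CompareFilter;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseScanSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("mytable"))) {

            // Scan the table, keeping only rows where cf:status == "error".
            Scan scan = new Scan();
            scan.setFilter(new SingleColumnValueFilter(
                    Bytes.toBytes("cf"), Bytes.toBytes("status"),
                    CompareFilter.CompareOp.EQUAL, Bytes.toBytes("error")));

            try (ResultScanner scanner = table.getScanner(scan)) {
                for (Result row : scanner) {
                    String key = Bytes.toString(row.getRow());
                    String status = Bytes.toString(
                            row.getValue(Bytes.toBytes("cf"), Bytes.toBytes("status")));
                    // In the add-in, each row would be written into a worksheet range.
                    System.out.println(key + " -> " + status);
                }
            }
        }
    }
}
```

The add-in itself would make the equivalent proxy calls from .NET; the filter and column family would be chosen by the user at run time rather than hard-coded.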

Here’s another example. The analysis of very large log files is a typical task suitable for Hadoop. When processed using Hadoop, the log files are broken into many chunks, then farmed out to a large set of processes called “mappers” and “reducers.” Hadoop is well-suited to running on large clusters of machines, particularly in the cloud.

In order to implement the functionality of a Hadoop application, the developer must write the mappers and reducers (sometimes collectively called “mapreducers”), then plug them into the Hadoop framework through a well-defined API. Because the Hadoop framework is written in Java, most mapreducer development is also done in Java. However, developers might want to incorporate functionality that already exists in libraries written for platforms other than Java, such as .NET. While it’s possible to write the mapreducers using technologies other than Java through a mechanism known as Hadoop Streaming, this isn’t ideal, since it incurs additional overhead. With interoperability tools, technology-agnostic developers can create .NET-based mapreducers by programming directly against the Hadoop API.
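To show what plugging into that API looks like, here is a minimal mapper/reducer pair, written in Java against the Hadoop MapReduce API, that counts log lines per HTTP status code. A .NET-based mapreducer created with interoperability tools would implement this same Mapper/Reducer contract through generated proxies rather than going through Hadoop Streaming. The class names and the log format (the status code assumed to be the ninth whitespace-separated field) are illustrative assumptions, and the job driver that wires the classes into the framework is omitted for brevity.

```java
// Sketch: a minimal Hadoop mapper/reducer pair that counts log lines
// per HTTP status code. The log layout is an illustrative assumption.
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class LogStatusCount {

    public static class StatusMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            String[] fields = line.toString().split("\\s+");
            if (fields.length > 8) {
                // Emit (statusCode, 1) for each log line.
                context.write(new Text(fields[8]), ONE);
            }
        }
    }

    public static class StatusReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text status, Iterable<IntWritable> counts, Context context)
                throws IOException, InterruptedException {
            int total = 0;
            for (IntWritable count : counts) {
                total += count.get();
            }
            // Emit (statusCode, totalOccurrences).
            context.write(status, new IntWritable(total));
        }
    }
}
```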

Each example shows the benefits of technology agnosticism and interoperability tools from a data perspective. If a developer were using Hadoop and, because of that, were committed to Java, he or she would lose out on the possibilities described above. But when the same problems are approached with technology agnosticism and interoperability tools, better solutions emerge.

A DeveloperTech article suggested that being technology-agnostic during the development process allows the developer to choose the best overall solution regardless of underlying technology platforms. While this is undeniable, there have also been many situations where, even when companies thought they had settled on a single technology or platform, they were forced to become technology-agnostic later on, long after the original technology decisions were made. In the scenarios presented here, both business and technical circumstances have forced companies to broaden their technology base and become technology-agnostic further down the road. Interoperability solutions can help by supporting evolving technology needs.


Wayne Citrin is the founder and CTO of JNBridge, the leading provider of interoperability tools to connect Java and .NET frameworks. A developer by trade, Wayne’s career has focused largely on Java and .NET interoperability issues since .NET’s beta days. In addition to having been a leading researcher in programming languages and compilers, Wayne has also served on the computer engineering faculty of the University of Colorado, Boulder.

