Business process management, designed to be a systematic way to improve workflows and operational practices, has long been about building collaborative connections between IT and business experts.
The big data trend opens BPM up to new data sources and to opportunities to analyze processes more deeply and simulate potential improvements. But doing so successfully takes planning and cooperation among decision-makers in IT and the business. It also requires collaboration with subject-domain experts, from user-experience designers to the technology vendors assisting with implementations.
“The intelligence that comes from capturing event data, as well as driving greater understanding and innovation through simulation, is a big data scenario,” says Nathaniel Palmer, a veteran BPM practitioner and author. “This cannot be done with a narrow lens on structured data, nor can it be limited to internal, single-company boundaries. That is why intelligent BPM processes often extend well beyond discrete transactions between suppliers or customers.”
Data Informed spoke with Palmer, a consultant who is now affiliated with the Workflow Management Coalition, about the changes occurring in the BPM field and their implications for enterprises looking to adapt or implement systems using more data sources and analytics. The interview has been condensed and edited for clarity.
Data Informed: What are the primary trends in BPM today?
Nathaniel Palmer: There are a few important market trends right now. The first phase of BPM offered one of the first real opportunities to enable the abstraction of business and application logic. This has provided significant gains in how organizations manage work and respond to change, yet the impact of new technologies, the mandate for greater transparency, and the ongoing aftershocks of globalization have collectively removed nearly any trace of predictability for businesses. Competitive advantage now comes more from the ability to read signals from many sources, and rapidly translate these into responses.
As a result, we are now moving into a phase that some people are calling “Intelligent BPM Systems,” which is distinguished primarily by the concept of a “sense and respond” capability. In this way we can see the BPM software market in terms of three phases, with each building on the last.
The first phase was about separating systems (the application logic) from the processes they support. Then phase two introduced the notion of a flexible architecture to support adaptable, goal-driven process models by maintaining the intelligence for how to access information and resources without having a rigid process.
And now phase three is building on the first two sets of capabilities while delivering visibility and feedback which shows what is going on within a process, as well as what will likely occur in the near future.
The first two phases of BPM have established a basis for enabling adaptable systems, allowing BPM adopters to respond more quickly than ever before, moving away from the command-and-control structure that defined management systems for the last 30 years.
At the end of the day BPM is not just about technology. It is, instead, mostly about the business and the people. What is indeed new, however, and at the center of the phase three opportunity, is the ability now to adapt systems continuously to match the ever-changing business environment.
The third phase of BPM also enables business process owners and managers to discover the situational changes that require adaptation. It provides a framework for continuously validating and refining an understanding of business performance drivers, and adapting business systems accordingly.
What are the implications of that kind of shift?
Palmer: With few exceptions, reporting on process events and business performance was previously done only after a process had executed, or otherwise within a separate environment disjointed from the process. This obviously made it difficult to influence the direction of a process, but that was a limitation of systems and software.
Since its inception, IT has been defined by the architecture of the relational database management system (RDBMS). The advances seen in computing, even in the evolution of Internet architecture, were essentially derivatives of the relational database. Today, we are moving to the post-relational era, perhaps more aptly named the “big data era.”
The intelligence that comes from capturing event data (signal detection), as well as driving greater understanding and innovation through simulation, is a big data scenario. This cannot be done with a narrow lens on structured data, nor can it be limited to internal, single-company boundaries. That is why intelligent BPM processes often extend well beyond discrete transactions between suppliers or customers.
This level of collaboration requires standard conventions: a mutually understood meaning for the data exchanged between stakeholders, but without rigid structure or formalization. In this way, the post-relational shift to big data has largely paralleled, and in many ways is driven by, the same conditions and requirements behind adaptability. The movement in both cases is to expand beyond the limits of relational data structures and capture the richer context that defines business events.
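To make that idea concrete, here is a minimal sketch of such a convention, assuming a hypothetical event format: stakeholders agree on a handful of core field names, while any additional context travels as free-form JSON rather than being forced into a fixed relational schema. The field names and values below are illustrative assumptions, not a published standard.

```python
import json

# Hypothetical convention: only these fields are agreed on between partners.
REQUIRED_FIELDS = {"event_id", "process", "timestamp"}


def accept_event(raw: str) -> dict:
    """Parse a JSON process event, enforcing only the shared convention.

    Any additional context (device, partner, free-text notes) is kept
    as-is instead of being dropped to fit a rigid relational schema.
    """
    event = json.loads(raw)
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        raise ValueError(f"event missing agreed fields: {sorted(missing)}")
    return event


# A partner system can attach extra context without any schema change:
incoming = json.dumps({
    "event_id": "e-1001",
    "process": "claims-intake",
    "timestamp": "2013-06-01T12:00:00Z",
    "device": "mobile",  # context a fixed relational schema never anticipated
    "notes": "customer uploaded photo via app",
})

event = accept_event(incoming)
```

The point of the sketch is that the shared meaning lives in the small agreed core, while the richer context around each business event flows through untouched.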
How are some of these things appearing in actual implementations?
Palmer: One is the movement toward built-in BPM. The first wave of BPM was largely about tying together existing applications—not just integration but having management threads across multiple application environments. That has really defined BPM for the past 10 years. However, over the last couple of years, what is developing momentum is building out entirely new applications on a BPM platform. The focus now is to provide a consistent user experience across the entire process lifecycle. That increasingly means taking advantage of mobile platforms, and using the BPM suite platforms to take applications beyond where they could go before.
Can you explain how that works?
Palmer: Before, BPM was a thread across multiple application environments. That was where we were doing most of the work: inside those applications. Now, while that continues to be a factor, more and more we are doing work outside of those application environments, on multiple devices and in third-party networks. The edge points of BPM have extended well beyond the original application boundaries. Now, when you look at the role of the cloud, it is not simply a multi-tenant story. More important is the way it enables mobile and social interaction. You realize that BPM requires all these things to work. Social or mobile access is no longer a point of differentiation. BPM, in today’s environment, is not going to work unless you extend to all those endpoints.
What new thinking or approaches does that imply for organizations implementing BPM?
Palmer: If they have been simply looking at integrating applications without regard to all the access points, that is an outdated way of thinking and is less likely to succeed. They need to look both at extending the applications to these access points and at how to take advantage of these capabilities. They really need to look at BPM through the lens of user experience. So it is not just necessary to have BPM experts or other process design experts; they also need business functionality and either user-experience experts or people with deep knowledge of what the user experience needs to be. That way they can drive how the BPM-built application should behave.
That is a pretty big shift in the skill sets being applied to BPM.
Palmer: It is. Over the last few years the big shift was to see the process architect and the process engineer as someone with a multidisciplinary programming background, with experience in event processing and SOA (service-oriented architecture) as factors of integration. But those same roles generally weren’t the people who thought much about optimizing the user experience. That is the new center of gravity for BPM.
With that insight, how should companies staff and develop BPM projects?
Palmer: From a team standpoint, it is still good to have a cross-functional team, led on the business side by somebody who has a business-function background or represents the voice of the customer, the “product owner” in the vernacular of agile Scrum. It may be the customer in the sense that, if there is work to be done by a third party, it is somebody who owns that business application vision. There also need to be the skills to realize that vision: a much more design-oriented skill set, whether that means someone specifically engaged in user-experience design or an architect who understands design factors, user interactions, and the application.
In some ways, what you are outlining actually enlarges the scope of effort. How can that larger effort be managed successfully?
Palmer: There is an enhanced agile notion where you focus on turning out results that are demonstrated in 30-day cycles, which is then extended into multiple 30-day cycles, say over 90- or 120-day periods. The advice I have always given, and if anything it has gotten more extreme, is to strive for a clear demonstration of value within 90 days. It is really all about keeping the excitement, focus, and commitment. The alternative is inevitably losing momentum, most likely fatally.
How much of this should be DIY, and how much should depend on a vendor?
Palmer: Inevitably, at this stage of where we are with BPM, the more complex the BPM, the more hands-on the vendor needs to be. Most vendors of highly complex, large-scale platforms tend to focus intently on their customers. They will dedicate resources, and that is probably a necessity. A customer needs to build large internal disciplines just to use the software, separate from actually creating usable functionality.
That has been the case traditionally, but it is changing in some ways. Among large vendors, you are now seeing more and more that take advantage of extended online user communities or put up examples for people to reference.
Another critical asset that needs to be in place is a set of reference models or architectures for whatever you are looking to do. Many vendors are providing these, so you can see examples of what you are trying to do and good examples of how to get there.