Have a Reuse Strategy for Business Process Integrations

January 29, 2012

When implementing process automation initiatives, it is important to have a reuse strategy. Why? Because process flows are a rich source of reusable services and common interfaces across a variety of use cases. The business process layer can also act as a service provider, giving other teams a common set of processing flows to invoke and integrate.

Host business process definitions and instances

  • Provide a modeling and execution environment for designing and implementing business processes
  • Implement a generic data structure for manipulating & orchestrating workflow state (a minimal sketch follows this list)
  • Provide the ability to reuse workflow patterns across business processes, e.g. enable reuse via sub-processes, process extension points, etc.
  • Provide the ability to access and orchestrate activities requiring interaction with data services, business rules, and legacy services
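To make the second point a bit more concrete, here is a minimal sketch of a product-neutral workflow-state structure. Every name and field here is an illustrative assumption rather than any engine's actual API:

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal sketch of a generic, product-neutral structure for carrying workflow
 * state between activities. Real BPM engines provide their own process
 * variables; this only illustrates keeping state manipulation behind a small,
 * reusable abstraction.
 */
public class ProcessContext {

    private final String processInstanceId;
    private String currentActivity;
    private final Map<String, Object> attributes = new HashMap<>();

    public ProcessContext(String processInstanceId) {
        this.processInstanceId = processInstanceId;
    }

    public String getProcessInstanceId() {
        return processInstanceId;
    }

    // Record which activity the instance is currently executing.
    public void moveTo(String activityName) {
        this.currentActivity = activityName;
    }

    public String getCurrentActivity() {
        return currentActivity;
    }

    // Generic accessors so sub-processes and extension points can share state
    // without depending on process-specific classes.
    public void setAttribute(String name, Object value) {
        attributes.put(name, value);
    }

    public Object getAttribute(String name) {
        return attributes.get(name);
    }
}
```

Sub-processes and extension points can then pass a context like this around instead of binding to process-specific data classes.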

Act as services consumer & provider

  • Host process orchestrations, while consuming persistence, validation, and security services
  • Abstract legacy capabilities and reduce tight coupling between internal systems
  • Publish and consume business events to reduce application-to-application coupling (a sketch of such an event contract follows this list)
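For the event point above, a small shared event contract keeps producers and consumers from knowing about each other. A hedged sketch, where all names and fields are assumptions rather than a prescribed schema:

```java
import java.time.Instant;
import java.util.Map;

/** Standardized business event envelope shared by producers and consumers. */
public final class BusinessEvent {
    private final String eventType;          // e.g. "AddressUpdated"
    private final Instant occurredAt;
    private final Map<String, String> payload;

    public BusinessEvent(String eventType, Instant occurredAt, Map<String, String> payload) {
        this.eventType = eventType;
        this.occurredAt = occurredAt;
        this.payload = payload;
    }

    public String getEventType() { return eventType; }
    public Instant getOccurredAt() { return occurredAt; }
    public Map<String, String> getPayload() { return payload; }
}

/** Producers publish through this interface; the transport (JMS, REST, etc.) stays hidden. */
interface BusinessEventPublisher {
    void publish(BusinessEvent event);
}

/** Consumers subscribe by event type rather than by knowing the producing application. */
interface BusinessEventConsumer {
    void onEvent(BusinessEvent event);
}
```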

 

Evolve a reusable asset catalog

  • Ensure technology components and APIs have domain relevance – data, events, and relationships are fundamental abstractions that need to be brought together
  • Reduce the learning curve for application developers to identify, evaluate, and integrate process definitions and services from a library of reusable assets

Pursuing BPM? Avoid Reuse Inhibitors

December 20, 2010

Implementing business process re-engineering and/or automation involves several moving parts from a technology perspective – stateful business process data, interfaces for external systems/user interfaces to interact with process information, and rules that determine several aspects of process behavior.

Here are some practices to avoid/minimize when pursuing BPM projects:

  • Building monolithic process definitions – as the number of process definitions increases, there will be opportunities to refactor and simplify orchestration logic. As projects start to build on existing processes, there will be an increasing need for modular process definitions that can be leveraged as part of larger ones.
  • Placing business rules that validate domain objects in the process orchestration graphs. This tightly couples rules to a specific process, hurting the ability to reuse domain-specific rules across process definitions. Certain classes of rules change often and need to be managed as separate artifacts.
  • Invoking vendor-specific services directly from the process definition. Over time this will result in coupling between a vendor implementation and multiple process definitions. This also applies to service capabilities hosted on legacy systems – hence the need for a service mediation layer (a mediation sketch appears at the end of this post).
  • Defining key domain objects within the scope of a particular process definition. This will result in multiple definitions of domain objects across processes. Over time, this will increase maintenance effort, data transformation, and redundant definitions across processes and service interfaces.
  • Integrating clients directly with the vendor-specific business process API. This will result in multiple clients implementing boilerplate logic – a better practice is to service-enable these processes and consolidate entry points into the BPM layer.

This isn't an exhaustive list – the intent is to highlight areas where tight coupling could occur in the business process layer.
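As referenced in the list above, here is a rough sketch of what the service mediation idea can look like. All class and method names are hypothetical; the point is that only one adapter knows about the vendor or legacy API:

```java
/**
 * Neutral interface that process definitions and other consumers bind to.
 * Nothing vendor-specific leaks through this contract.
 */
interface CustomerLookupService {
    CustomerRecord findByCustomerId(String customerId);
}

/** Simple domain-relevant data holder shared across processes and services. */
class CustomerRecord {
    final String customerId;
    final String name;

    CustomerRecord(String customerId, String name) {
        this.customerId = customerId;
        this.name = name;
    }
}

/**
 * The only place that knows about the vendor/legacy API. Swapping vendors or
 * retiring the legacy system means replacing this adapter, not every process
 * definition that performs a customer lookup.
 */
class LegacyCustomerLookupAdapter implements CustomerLookupService {

    private final HypotheticalLegacyClient legacyClient;  // stand-in for a proprietary client

    LegacyCustomerLookupAdapter(HypotheticalLegacyClient legacyClient) {
        this.legacyClient = legacyClient;
    }

    @Override
    public CustomerRecord findByCustomerId(String customerId) {
        // Translate the legacy response into the shared domain object.
        String rawRecord = legacyClient.fetchCustomer(customerId);
        return new CustomerRecord(customerId, rawRecord);
    }
}

/** Hypothetical legacy client; a real integration would use the vendor's SDK here. */
class HypotheticalLegacyClient {
    String fetchCustomer(String customerId) {
        return "customer-" + customerId;
    }
}
```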


Reusable Exception Handling for Services – Podcast Episode

November 13, 2010

New episode added to the Software Reuse Podcast Series on designing reusable exception handling for SOA efforts. It elaborates on using a simple set of components for handling exceptions, capturing relevant metadata, and determining notifications.
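The episode covers the details; purely as an illustration of the idea (component and field names below are my own assumptions, not the design discussed in the podcast), a reusable handler could capture exception metadata once and delegate the notification decision to a pluggable policy:

```java
import java.time.Instant;

/** Metadata captured for every service exception, regardless of which service failed. */
class ExceptionReport {
    final String serviceName;
    final String correlationId;
    final Instant occurredAt;
    final String message;

    ExceptionReport(String serviceName, String correlationId, String message) {
        this.serviceName = serviceName;
        this.correlationId = correlationId;
        this.occurredAt = Instant.now();
        this.message = message;
    }
}

/** Pluggable policy that decides whether, and whom, to notify for a given failure. */
interface NotificationPolicy {
    void notifyIfNeeded(ExceptionReport report);
}

/** Reusable entry point: services delegate here instead of re-implementing error handling. */
class ReusableExceptionHandler {

    private final NotificationPolicy notificationPolicy;

    ReusableExceptionHandler(NotificationPolicy notificationPolicy) {
        this.notificationPolicy = notificationPolicy;
    }

    public ExceptionReport handle(String serviceName, String correlationId, Exception e) {
        ExceptionReport report = new ExceptionReport(serviceName, correlationId, e.getMessage());
        // Capture once (log, persist, etc.), then let the policy determine notifications.
        notificationPolicy.notifyIfNeeded(report);
        return report;
    }
}
```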



Bringing Events, Processes, and Services Together

November 6, 2010

When working on implementing business process automation solutions, the team needs to bring together several related concepts: chief among them are events, process instances, and service capabilities. I like to think of them in the context of an overall enterprise architecture being aligned with individual solutions and projects. Bringing these concepts together holistically has several benefits:

  • Better alignment between BPM & SOA initiatives (wrote earlier about the risks of not doing so)
  • Consistent interfaces for event producers and consumers (e.g. clients who initiate changes to data or alter a process will be able to reuse a standardized set of messages)
  • Simpler solution architecture – aligning data models across event processing and services greatly reduces the need for data transformation, data mapping, and complex adapters

Events can be internal or external and can impact business processes in several ways. An external event could initiate a business process (creating a new process instance) or alter an existing one (moving execution out of a wait state, or terminating the instance). These events could carry data, state, or a combination of the two that impacts a business process. The business process instance might take an alternate path or spawn a new instance of another process as appropriate.

Process instances themselves can be implemented by leveraging several service capabilities. These service capabilities need to share a consistent set of domain-relevant interfaces. The process instances can then be loosely coupled with services, enabling the service implementations to evolve over time. As state changes happen in a process instance, outbound events can be generated – these are publications or notifications that are of interest to potentially multiple consumers. Service capabilities can be leveraged not just for creating and changing data but also for encapsulating business rules behind decision steps and for publishing messages to downstream systems and processes.
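A hedged sketch of the event-to-process relationship described above (the gateway interface and names are assumptions, not an actual engine API): an inbound event signals an instance waiting on it, or creates a new instance if none is waiting.

```java
/** Minimal view of the process engine needed for event handling; real engines expose far more. */
interface ProcessEngineGateway {
    String startProcess(String processDefinition, String businessKey);
    boolean signalWaitingInstance(String businessKey, String eventType);
}

/** Routes inbound business events: signal an existing instance if one is waiting, else start a new one. */
class InboundEventHandler {

    private final ProcessEngineGateway engine;

    InboundEventHandler(ProcessEngineGateway engine) {
        this.engine = engine;
    }

    public void onEvent(String eventType, String businessKey) {
        // An existing instance parked in a wait state consumes the event...
        boolean consumed = engine.signalWaitingInstance(businessKey, eventType);
        if (!consumed) {
            // ...otherwise the event initiates a brand new process instance.
            engine.startProcess(processDefinitionFor(eventType), businessKey);
        }
    }

    private String processDefinitionFor(String eventType) {
        // Mapping of events to process definitions; in practice this could be rule-driven.
        return "AddressUpdated".equals(eventType) ? "AccountMaintenance" : "GenericEventReview";
    }
}
```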


Service Enable Business Processes

October 9, 2010

Business processes hosted in a BPM container (whether running in-process or as an external engine) need to be exposed to consumers in your enterprise. Here’s where your SOA efforts can come in handy – these business processes can be exposed as services via interfaces provided in the service mediation layer. Doing so decouples the vendor specific business process interfaces from enterprise consumers and allows for additional horizontal capabilities to be hooked in (metrics capture, monitoring, authentication, authorization etc.).

So what can these services do? Here’s a very simplified diagram of how these services fit in:

These services are commonly referred to as Task Services and they encapsulate operations on business processes. Common domain-relevant schemas can be leveraged from other service efforts – facilitating consistency and reducing data transformations between the business process and services layers.
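As a loose illustration (the operation and type names are assumptions), a task service contract might look like the following, with shared schema types on the interface and the vendor process API hidden behind it:

```java
import java.util.List;

/** Shared, domain-relevant representation of a human task; reused across processes. */
class TaskSummary {
    final String taskId;
    final String processInstanceId;
    final String taskName;

    TaskSummary(String taskId, String processInstanceId, String taskName) {
        this.taskId = taskId;
        this.processInstanceId = processInstanceId;
        this.taskName = taskName;
    }
}

/**
 * Task service exposed via the service mediation layer. Consumers see these
 * operations; metrics capture, monitoring, and authentication hook in at the
 * mediation layer rather than at the BPM vendor API.
 */
interface AccountOpeningTaskService {
    String startAccountOpening(String customerId);
    List<TaskSummary> getOpenTasks(String userId);
    void completeTask(String taskId, String outcome);
    String getProcessStatus(String processInstanceId);
}
```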


Reuse Core Components in Biz Processes and Services

April 19, 2010

There are a variety of components that can encapsulate business logic – from simple algorithms and calculations to complex orchestrations. These components can be invoked from business process orchestrations as well as stateless services. These reusable components could be business rules integrated as decision services or legacy services wrapped behind a more decoupled interface. Business events such as a new account being opened or a new security getting added to a portfolio could trigger a core piece of logic – e.g. get statement preferences – that can be reused to fulfill both needs.

For instance, the diagram below shows two business processes and a stateless service invoking a common component via a request dispatcher (or a router module).

If a piece of logic that is tightly coupled to one process turns out to be applicable across business processes or service capabilities, you can refactor it into a new reusable component. This is all the more reason why it is a good idea to go through a service mediation layer when leveraging legacy services from business processes. If you decide to reuse the legacy service in a new orchestration, it will be straightforward to plug in a new consumer.
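Here is a minimal sketch of the dispatcher idea, with hypothetical names: both process orchestrations and stateless services reach the shared component through one router instead of binding to it directly.

```java
import java.util.HashMap;
import java.util.Map;

/** Shared piece of business logic, reusable from processes and stateless services alike. */
interface ReusableComponent {
    String execute(String input);
}

/**
 * Request dispatcher (router module): callers ask for a capability by name and
 * never bind directly to the component implementation.
 */
class RequestDispatcher {

    private final Map<String, ReusableComponent> components = new HashMap<>();

    public void register(String capabilityName, ReusableComponent component) {
        components.put(capabilityName, component);
    }

    public String dispatch(String capabilityName, String input) {
        ReusableComponent component = components.get(capabilityName);
        if (component == null) {
            throw new IllegalArgumentException("No component registered for " + capabilityName);
        }
        return component.execute(input);
    }

    public static void main(String[] args) {
        RequestDispatcher dispatcher = new RequestDispatcher();
        // e.g. the "get statement preferences" logic used by both processes in the example above
        dispatcher.register("getStatementPreferences", accountId -> "preferences for " + accountId);
        System.out.println(dispatcher.dispatch("getStatementPreferences", "ACCT-42"));
    }
}
```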


16 Candidate Reusable Capabilities for Business Processes

November 15, 2009

Here is a set of ideas – candidate assets, if you will – of reusable software capabilities for your business processes. Please don't take these as capabilities you have to build from scratch. Instead, view them as part of your overall BPM software infrastructure. Most of these capabilities could be provided by a single vendor, or, using an abstraction layer, you could realize them across multiple vendors. You can also prioritize this list when evaluating vendor offerings.

  1. Rule driven Exception Handler: Integrate exception handling with a business rules engine. The rules could determine how to resolve, mitigate, and handle errors. They can also specify routing instructions for errors requiring human intervention.
  2. Internationalization Support: Status messages driven by a resource bundle as opposed to a single locale. Additional locale-specific resource bundles can be added as required for your business needs.
  3. Seamless Cross Channel Movement: Enable a business process to start in one sales channel and get completed in another. The idea is to support business processes that start, pause, and get reactivated via an alternate channel. For example, a user could start ordering a book online and ask a customer service rep to finish the process; the user's in-progress process instance data gets placed in appropriate work queues in a new business process or moves to a new state on the original one.
  4. Business Process Completion API: Ability to determine % complete for any business process based on the state of the process instance (a sketch of this one follows the list). This API can provide the ability to get current status, estimate completion time (e.g. based on historical data), and identify the steps remaining for the process instance to finish.
  5. Business Activity Hold & Retry API: Facilitate pause, resume, and delegation of any business activity based on business rules. This interface can also put processing on hold for one or more process instances.
  6. Authentication: Examples here include enabling integration with an LDAP store or providing digital-certificate-based security for interfaces that instantiate the business process.
  7. Entitlements API: Enable fine-grained authorizations for business activities and business processes, including role-based entitlements for events – i.e. if a particular user role cannot initiate a business process or execute a particular activity, the software infrastructure should prevent those actions.
  8. Content Management integration: Integrate with content management system to serve targeted content based on state of business process or to augment data in a process instance.
  9. Event Integration API: Capture or derive events and handle them using one or more business processes. This could also impact existing process instances that are in-flight.
  10. Indexed Search Integration API: Ability to execute full-text search as well as categorized search using an indexed search engine against completed and in-progress business process instances. E.g. search all process instances from a particular division or with a particular customer code, or category etc.
  11. Process Dashboarding: Provide key business process metrics – report real-time status of availability, performance, usage statistics, escalations, etc. Also provide the ability to override/adjust in-flight process instances based on business scenarios.
  12. Business Process Preferences API: Manage a set of process-level preferences. E.g. default logging level for all tasks could be set to high or all notifications get delivered to additional recipients, or data fixes will get audited a different way etc.
  13. Document Integration: Attach documents using information in a process instance and attach/route content as part of the business process orchestration.
  14. SLA Adherence API: Manage service level agreement specifications associated with end to end business process as well as key business activities. Ability to define/handle, and proactively prevent SLA violations.
  15. KPI Definition and Monitoring API: Monitor business processes to capture key performance indicators per business process. E.g. number of accounts opened today, number of accounts opened after multiple validation errors, etc.
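Purely to illustrate item 4 (the types below are assumptions, not a vendor API), a completion capability could derive percent complete from an instance's finished and remaining steps:

```java
import java.util.List;

/** Read-only view of a process instance, sufficient for completion estimates. */
class ProcessInstanceView {
    final String instanceId;
    final List<String> completedSteps;
    final List<String> remainingSteps;

    ProcessInstanceView(String instanceId, List<String> completedSteps, List<String> remainingSteps) {
        this.instanceId = instanceId;
        this.completedSteps = completedSteps;
        this.remainingSteps = remainingSteps;
    }
}

/** Candidate reusable capability: completion status for any business process instance. */
class ProcessCompletionApi {

    /** Percent complete based purely on step counts; historical timing data could refine this. */
    public int percentComplete(ProcessInstanceView instance) {
        int total = instance.completedSteps.size() + instance.remainingSteps.size();
        if (total == 0) {
            return 100;  // nothing left to do means the instance is effectively complete
        }
        return (instance.completedSteps.size() * 100) / total;
    }

    public List<String> remainingSteps(ProcessInstanceView instance) {
        return instance.remainingSteps;
    }
}
```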




Building Business Processes Using Reusable Assets

November 12, 2009

You can build business process automation solutions using a set of pre-defined assets – assembling processes rapidly is one key benefit of systematic reuse. These assets will help you reduce effort, increase consistency, and most importantly save costs. The following are the different categories of reusable assets:

Event Handlers: Place processing logic for various business events in this category. The event handler itself can instantiate a brand new business process or hook into an existing one. Event handlers can be used as reusable entry points to various processes. E.g. a delinquent account event could instantiate a ReviewCustomer business process, or a document expired event could trigger a DocumentTracking business process.

(Diagram: building business processes with reusable assets)

Sub-processes: These are micro-flows reusable across multiple business processes. These assets encapsulate system steps as well as human steps such as escalation and routing. For instance, Purchase Equipment and Book Itinerary could leverage a Finance Approval process flow (note: this obviously has to be thought through carefully – the context for each business process will be different, and appropriate data has to be sent to a sub-process for it to be truly useful).

Event Generators: These will typically have logic to assemble, transform, and publish an event. The event data could initially be stored in a system and sent via nightly batch, and over time become real-time – event generators will shield business processes from such changes. These could be message generation components that publish messages at various stages in a business process. The same event could be published from several business processes – so the generation components themselves are reusable. For instance, the AddressUpdated event could be published from the Account Maintenance or the Purchase Widget business processes.

Business Services (or Task Services): These often contain orchestrations that invoke external data services and perform business logic with the help of rules. They typically map directly to a business capability and tend to be less reusable than data services, but they can be reused across business processes as well. For example, a getCustomerCreditRating service could be leveraged by the AccountOpening and AccountManagement business processes.

Data Services: These are service capabilities that encapsulate key enterprise data such as customer, product, account, and document, and provide create/read/update/delete operations as well as search, validate, and enrich functions. They can be accessed from business processes directly or within a task service.

Decision Services: These assets facilitate decision making in the process, often by invoking business rules. Some rules might be reusable if they encapsulate logic that is applicable across multiple processes. This may not always be possible and will require rules to be at the right level of granularity. E.g. a validate customer decision service could be used by the AccountOpening and DocumentManagement business processes (see the sketch after these categories).

System Adapters: These are connectivity components that provide access to legacy applications and other proprietary systems. They are used by your services layer to shield business processes from being directly coupled to external systems. These adapters could be leveraged by data services, business services, and event generators.
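To ground the decision service category with the validate-customer example, here is a hedged sketch; the interface and the rule wiring are illustrative assumptions only:

```java
/** Decision service contract: processes ask a question, rules supply the answer. */
interface CustomerValidationDecisionService {
    ValidationResult validateCustomer(String customerId);
}

/** Outcome shared by every process that needs the decision (AccountOpening, DocumentManagement, ...). */
class ValidationResult {
    final boolean valid;
    final String reason;

    ValidationResult(boolean valid, String reason) {
        this.valid = valid;
        this.reason = reason;
    }
}

/**
 * Implementation that would delegate to a rules engine. Keeping the rules
 * behind this interface is what lets multiple process definitions reuse the
 * same decision without embedding validation logic in their orchestration graphs.
 */
class RuleBackedCustomerValidation implements CustomerValidationDecisionService {

    @Override
    public ValidationResult validateCustomer(String customerId) {
        // Placeholder for a rules-engine invocation; the rule set is managed as its own artifact.
        boolean knownCustomer = customerId != null && !customerId.isEmpty();
        return new ValidationResult(knownCustomer, knownCustomer ? "ok" : "unknown customer id");
    }
}
```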




Build an Abstraction API for BPM Interaction

October 27, 2009

Introduce an abstraction API when integrating with a BPM solution. Why do I say that? Several reasons:

  • It is good software design practice to bind to an interface as opposed to an implementation, so individual applications won't be directly coupled with an external vendor solution.
  • Provides the flexibility to augment the solution using multiple vendors. Related to the above point, you can utilize one vendor for a subset of capabilities and another for a different set.
  • This API can be the ideal place to integrate your enterprise capabilities within the context of BPM solutions. Instead of making one-off or tactical modifications to a vendor solution that could be both expensive and proprietary, you can augment missing capabilities using the abstraction API. For example, if the BPM solution doesn’t support authentication based on active directory, this abstraction API can provide that capability (most likely you already have this component in your enterprise). Additionally, this is also the place to integrate horizontal capabilities such as message routing, metrics, monitoring, and error handling. Do you want your BPM solutions to report exceptions differently than other applications? In the same vein, this API can integrate with services or libraries that encrypt/hide sensitive data attributes before returning process state to a calling application. This has the potential to reduce duplication of such logic across user interfaces that integrate with related business processes.
  • Can potentially simplify complexities associated with a native API. If the native API needs a set of steps for starting process instances or getting task/work items for a particular user, this abstraction API can simplify those calls. This not only makes integration easier but reduces the opportunities for errors across development efforts. This API in essence can act as a façade.
  • The API standardizes access to BPM capabilities and reduces the possibility of competing integration mechanisms across development efforts. If one team uses the native API as-is and another builds a new one on top, you have two ways of accessing the BPM engine. This problem gets worse as additional teams start to use BPM.
  • This API could also make every business process support a core set of functions in addition to start/stop/suspend/resume calls. For instance, every business process can provide a getCurrentStatus() or reassignProcessInstance() operation that makes it easier to manage processes at runtime (a sketch of such a façade appears at the end of this post).

This API could be realized as a web service depending on the level of heterogeneity and performance requirements. This would essentially act as a service enabler for your business processes.

The above list isn’t exhaustive – are there additional ones to add to this list?
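To make the façade idea concrete, here is a condensed sketch of what such an abstraction API might expose. getCurrentStatus() and reassignProcessInstance() come from the points above; everything else is an assumption rather than a definitive design:

```java
import java.util.List;
import java.util.Map;

/**
 * Vendor-neutral façade over the BPM engine. Applications bind to this
 * interface; one adapter per vendor implements it. Horizontal concerns
 * (metrics, monitoring, error handling, sensitive-data masking) can be layered
 * into the adapter without touching callers.
 */
public interface BpmAbstractionApi {

    /** Start a process instance in one call, hiding any multi-step native setup. */
    String startProcessInstance(String processDefinition, Map<String, Object> inputData);

    /** Task/work items for a user, regardless of which underlying engine hosts them. */
    List<String> getWorkItemsForUser(String userId);

    /** Core functions every business process supports in addition to start/stop/suspend/resume. */
    String getCurrentStatus(String processInstanceId);

    void reassignProcessInstance(String processInstanceId, String newOwner);

    void suspendProcessInstance(String processInstanceId);

    void resumeProcessInstance(String processInstanceId);
}
```

An adapter per vendor would implement this interface; exposing it as a web service is one way to realize the service enablement mentioned above.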




Reuse-Friendly BPM Capabilities

October 21, 2009

I have been stressing the need for integrating various concepts for succeeding with systematic reuse. BPM is no different. In an earlier post I introduced reusable BPM integration capabilities. In this post I want to list additional capabilities that you can build as part of BPM initiatives:

  • Mature your integration function so that it helps consumers discover, integrate, and use business process assets. This includes activities, sub-processes, as well as end-to-end business processes. Provide sample source code as well as integration documentation to applications that need to leverage your processes. Do remember that developers may not understand the differences involved in interacting with a stateful business process!
  • Align closely with the information architecture logical model. Activities that invoke system steps should leverage logical data schemas and data types. If you start to reuse sub-processes within your high-level flows, this will become second nature.
  • Ensure that your BPM assets are aligned with key performance indicators (KPIs) – if your business wants to analyze the time taken between a particular set of steps, you might need to persist, report, and compute on this data. Why not use a reusable component that can persist such updates from any business process? (a sketch follows this list)
  • Ensure business process models and orchestrations are standards compliant and adhere to key BPM and SOA standards. The higher the interoperability of SOA assets, the easier it is to reuse them in your business processes.
  • Expand coverage of business processes hosted in the BPM layer – including processes that involve internal clients, external clients, data governance/compliance, etc. You will start to identify reusable assets over time. For example, two business processes might require fetching customer data, updating a customer address, or calculating customer worth to the company, and so on. Every such need is an opportunity for reuse.
  • Model and represent all relevant business processes comprehensively, including placeholders for manual steps, batch jobs, etc. Even if your existing process isn't transformed completely from manual to fully automated in real time, identify these steps explicitly. At the very least, you can use reusable components that deal with batch jobs (kick off an FTP session, or create a temporary file and copy it to a particular location, etc.).
  • Expand the number of event-driven services for your business processes. The higher the number of events, the greater the likelihood of additional consuming applications getting to reuse data and notifications.
  • Support standardized alerts associated with human workflow steps (delegation, escalation, routing, pause/resume task etc.). Your BPM infrastructure can provide configurable capabilities for these steps.
  • Create scripts that perform continuous integration including smoke tests, unit tests, performance tests, etc. If your business processes are service-enabled, you can run a battery of service requests to instantiate, update, and remove process instances.
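As a rough sketch of the KPI point above (names are hypothetical and the in-memory store is only for illustration), a single reusable component can record step completions from any business process so KPIs can be computed later:

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

/** One timestamped step update, reported identically by every business process. */
class ProcessStepUpdate {
    final String processName;
    final String instanceId;
    final String stepName;
    final Instant completedAt;

    ProcessStepUpdate(String processName, String instanceId, String stepName) {
        this.processName = processName;
        this.instanceId = instanceId;
        this.stepName = stepName;
        this.completedAt = Instant.now();
    }
}

/** Reusable KPI recorder: any process reports step completions here; reporting queries come later. */
class KpiRecorder {

    // In-memory store purely for illustration; a real component would persist to a database.
    private final List<ProcessStepUpdate> updates = new ArrayList<>();

    public void recordStepCompleted(String processName, String instanceId, String stepName) {
        updates.add(new ProcessStepUpdate(processName, instanceId, stepName));
    }

    /** Example KPI: how many times a given step completed (e.g. accounts opened today). */
    public long countStepCompletions(String processName, String stepName) {
        return updates.stream()
                .filter(u -> u.processName.equals(processName) && u.stepName.equals(stepName))
                .count();
    }
}
```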


Slide 2

  • We are completely focused on Enterprise Account Opening and supporting Account Opening – supporting complex business process orchestrations.
  • We are not methodically building a collection of reusable building blocks in the business process layer that will help us in the future: data service wrappers, PMWS wrappers, a Work in Progress (WIP) API, event processing.
  • We have built a set of components to configure events for a business process, capture metrics, and perform administrative functions.
  • Not all business process logic is in the business process layer for Enterprise Account Opening.
  • Web services are the de facto integration metaphor. Business processes are accessed on demand using request/reply.
  • Schema contracts: schemas, data types, and web service contracts are minimally aligned with the information architecture logical data model, resulting in a missed opportunity to align more deeply with the information architecture direction. There is significant data type and business object reuse across services and event handler orchestrations.
  • Several utilities exist for testing, deployment, and continuous integration.
  • Client integration happens on a per-project basis. This often results in increased pressure on development team leads and a lack of knowledge sharing across projects with respect to service integration/behavioral issues.


