Service capabilities are reused mainly from the user interface tier, the business services tier, batch processes, and real-time processes, and can be consumed from a wide variety of platforms. These capabilities can be accessed via three message exchange patterns: request/reply with a tight SLA, request/reply with a relaxed SLA, and publish/subscribe. All of these patterns drive your SOA and systematic reuse efforts. Some capabilities might always be available via only a single exchange mechanism, but over time you will increasingly offer similar capabilities across all three patterns.
Note: the illustrations below depict a data source behind the service capability. This isn't a requirement, but if you are exposing core data as entity services, or as business services that access underlying data, you will need one or more data sources for the service to be functional.
#1 Request/Reply (tight SLA – typically synchronous)
This is the most common pattern when executing on-demand data services. It is typical of interactive applications that send a request and block on the response. When doing synchronous request/reply via JMS, a temporary or physical response queue can be created. Regardless of the transport used, the idea is to get a response very quickly.
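The blocking exchange above can be sketched as follows. This is a minimal, transport-agnostic illustration using in-memory queues in place of JMS destinations; the service logic and all names are invented for the example:

```python
import queue
import threading

# In-memory stand-in for the request destination (not a real JMS queue).
request_queue = queue.Queue()

def data_service():
    """Service side: read each request, send the result to its reply queue."""
    while True:
        request = request_queue.get()
        payload, reply_queue = request["payload"], request["reply_to"]
        reply_queue.put({"result": payload.upper()})  # stand-in for a data lookup

threading.Thread(target=data_service, daemon=True).start()

def call(payload, timeout=2.0):
    """Consumer side: create a temporary reply queue and block on the response."""
    reply_queue = queue.Queue()  # analogous to a temporary response queue
    request_queue.put({"payload": payload, "reply_to": reply_queue})
    return reply_queue.get(timeout=timeout)  # tight SLA: fail fast if no reply

print(call("customer-123"))  # {'result': 'CUSTOMER-123'}
```

The timeout on the blocking receive is what enforces the tight SLA: if the service cannot respond quickly, the consumer fails fast instead of hanging.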
#2 Request/Reply (relaxed SLA – typically asynchronous)
This pattern is used when executing long-running service capabilities. The consumer sends a request and does not block on a reply. When the response is sent from the data service, the consumer uses a callback mechanism (a message listener) to process the data. This pattern also typically uses correlation identifiers to relate request and response messages. When using JMS, a physical queue is used to receive the response messages. A queue receiver drains messages from the queue and proceeds with processing.
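The non-blocking flow with correlation identifiers can be sketched like this. Again, in-memory queues and threads stand in for JMS queues and message listeners, and the names are illustrative only:

```python
import queue
import threading
import uuid

request_queue = queue.Queue()
response_queue = queue.Queue()  # stand-in for the physical response queue

def long_running_service():
    """Service side: echo the correlation ID so responses can be matched."""
    while True:
        req = request_queue.get()
        response_queue.put({"correlation_id": req["correlation_id"],
                            "result": req["payload"].upper()})

# Maps correlation ID -> callback (a real system would guard this with a lock).
callbacks = {}

def response_listener():
    """Message-listener analogue: drain the response queue, dispatch callbacks."""
    while True:
        resp = response_queue.get()
        callback = callbacks.pop(resp["correlation_id"], None)
        if callback:
            callback(resp["result"])

threading.Thread(target=long_running_service, daemon=True).start()
threading.Thread(target=response_listener, daemon=True).start()

def send_request(payload, on_response):
    """Consumer side: register a callback keyed by correlation ID, no blocking."""
    correlation_id = str(uuid.uuid4())
    callbacks[correlation_id] = on_response
    request_queue.put({"correlation_id": correlation_id, "payload": payload})
    # Returns immediately; on_response fires whenever the reply arrives.

done = threading.Event()
results = []
send_request("order-42", lambda r: (results.append(r), done.set()))
done.wait(timeout=2.0)
print(results)  # ['ORDER-42']
```

The key point is that the consumer's thread is free between `send_request` and the callback; the correlation ID is what ties the eventual response back to the original request.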
#3 Publish/Subscribe
This pattern is used by publication services that execute based on a business event or a data operation event. The service publishes standardized messages that align with your business-specific or domain-specific data model. This is very useful when multiple consumers need to be notified of updates/changes to core data. Using this model, new consumers can be added via configuration in a message broker as opposed to writing code for each integration. The service publishes to a destination (e.g. a Topic) and subscribers (consumer applications or processes) each get the appropriate publication.
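The fan-out behavior can be sketched as follows. This is a toy in-process broker, not a real messaging product; the topic and subscriber names are invented for illustration:

```python
import queue
import threading

class Topic:
    """Minimal pub/sub destination: every subscriber gets its own copy."""
    def __init__(self):
        self._subscribers = []
        self._lock = threading.Lock()

    def subscribe(self):
        """Adding a consumer is configuration, not new integration code."""
        q = queue.Queue()
        with self._lock:
            self._subscribers.append(q)
        return q

    def publish(self, message):
        with self._lock:
            for q in self._subscribers:
                q.put(message)  # each subscriber receives its own copy

customer_updates = Topic()
billing = customer_updates.subscribe()
crm = customer_updates.subscribe()

# The publication service emits a standardized, domain-aligned message once...
customer_updates.publish({"event": "customer.updated", "id": "123"})

# ...and both consumers receive it without the publisher knowing about them.
print(billing.get(timeout=1))
print(crm.get(timeout=1))
```

Note that the publisher never references its consumers; onboarding a third subscriber is one more `subscribe()` call (or, with a real broker, a configuration entry), which is exactly the decoupling this pattern buys you.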