
Wednesday, September 28, 2011

CICS event processing improved version


IBM CICS is an example of a smart producer for an event processing system: it does not do event processing inline, but instruments CICS transactions to emit events, and works in a loosely coupled mode with any event processing engine that can read the events it emits. CICS TS 4.2, released recently, has several improvements in the CICS event producing capabilities. Among these improvements are:



  • Making event emission part of the transaction, by performing the emission as part of the commit process. Note that since CICS is loosely coupled with the event processing itself, this does not become an atomic unit with the event processing; I have recently written about the relationships between transactions and events, and identified this area as one that needs to be investigated further.
  • Change management inside the event instrumentation in CICS with appropriate tools
  • Inclusion of system events in the CICS instrumentation (e.g. connection/disconnection to databases, transaction aborts, etc.).


Since the strength of a chain is typically equivalent to the strength of its weakest link, and in many cases the producer is the weakest link, the amount of work required to emit the right events at the right time is often much larger than for the rest of the system. Smart event producers like CICS make this weakest link much stronger.




Tuesday, June 29, 2010

On restful event producers


Another year has passed and yesterday was my birthday; I tend not to celebrate an event that reminds me that I am getting older, so I found on the Web a greeting card that fits my sentiments. However, this year I got many more birthday greetings than usual. Of course I could be encouraged that all of a sudden I have become popular, but the truth is that I filled in profile descriptions on some social networks - Facebook, LinkedIn, the family tree site (maybe one or two more) - and all of them send alerts to people about birthdays...

I have written about event producers such as the CICS event producer, which instruments CICS transactions and emits them as events; an event producer needs to create the event by sensing or instrumenting some software/process, and also needs to communicate the event to the event targets (event processing agents, channels, and consumers).

There is a new article on IBM developerWorks dealing with emitting events using a REST interface. The word REST has an association with resting, like Van Gogh's famous picture of noon rest, below.


But in this context REST of course means "representational state transfer". The article describes how to use the REST interface of WebSphere Business Events to emit events from any application that has access to web protocols. Enjoy!

Saturday, March 13, 2010

On events versus data

The word "data" always reminds me of the android from Star Trek: The Next Generation whose name was Data. In computing, the word data typically is very general and refers to anything that is represented on digital media; the picture of Data above is also a piece of data, like many other things. The word "event" is also a broad term, meaning something that happened.

Recently Paul Vincent wondered in his blog about the difference between event and data, as some people think that events are footnotes to data. Since, by the definitions above, event and data are obviously not really the same, I'll try to talk about the touch points between them, since those are the source of the misconceptions.

There are various touch points between events and data:

  1. Event representation contains data. An event is represented in the computing domain by an "event object" or "event message", which is usually also called an "event" for short. This representation includes information about the event: what is the event type, where it happened, when it happened, what happened, who the players were, etc. Example: the event is "enter the building"; the event's payload contains information that answers questions such as: what building? who entered? when? and maybe more. The payload of the event is data; it may be stored (see event store), or just pass through the system.
  2. A data store can store historical events. Event representations can be accumulated and stored in a data store for further usage. There are large data stores that collect weather events. Note that in order to navigate in historical events, these events may be stored in a temporal database, an area that I've dealt with in the past; if the events are spatial then they have to be stored in a spatiotemporal database.
  3. A database can be an event producer. In active databases the events were database operations: insert, modify, delete, and retrieve. In this case the fact that some data element has been updated or accessed is the "something that happens" (which may or may not reflect something that happens in reality), and the database acts as an event producer and emits events for processing by an event processing network. Note that actually every event producer contains some data that is turned into an event, for example transaction instrumentation like what IBM has done in CICS as an event producer.
  4. Derived events as database updates. An event processing application takes events from somewhere as input, does something, creates derived events, and sends them somewhere - that is all of event processing in one sentence. A derived event created in this process may go to an event consumer; the event consumer may be a DBMS or another type of consumer whose action is to update some data store.
  5. Event enrichment by data during the event processing. During event processing operations, enrichment of events is sometimes required. Let's return to the event of a person entering a building: the event processing application deals with security access control and needs to know the person's security clearance. This information is not provided with the event, which carries only the identification of the person, so there needs to be an enrichment process in which an enrichment event processing agent accesses some global store - in this case reference data - to extract the clearance value and put it inside the event for further processing.
Thus the main issue is not the "versus" issue but the various relationships between the two terms.
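To make touch points 1 and 5 concrete, here is a minimal sketch of an event object whose payload is data, together with an enrichment step that pulls reference data into it. All names (the event type, the clearance store) are illustrative assumptions, not any product's API:

```python
from dataclasses import dataclass, field

@dataclass
class EnterBuildingEvent:
    event_type: str          # what kind of event happened (touch point 1)
    occurrence_time: str     # when it happened
    building: str            # where it happened
    person_id: str           # who was involved
    payload: dict = field(default_factory=dict)  # enriched attributes live here

# Reference data (touch point 5): the clearance is NOT carried by the raw
# event; an enrichment agent looks it up and adds it to the payload.
SECURITY_CLEARANCE = {"p-17": "secret", "p-42": "public"}

def enrich(event: EnterBuildingEvent) -> EnterBuildingEvent:
    event.payload["clearance"] = SECURITY_CLEARANCE.get(event.person_id, "unknown")
    return event

e = enrich(EnterBuildingEvent("EnterBuilding", "2010-03-13T09:00", "HQ", "p-17"))
print(e.payload["clearance"])  # secret
```

The point of the sketch is only the separation: the event header says what/where/when/who, the payload is plain data, and enrichment is a lookup against reference data rather than part of the event itself.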

Saturday, June 13, 2009

On CICS event processing


Recently, IBM announced "CICS Event Processing". A full presentation explaining what it is about is available on the Web. As I have written many times in the past, the processing of events is just part of a bigger picture that includes: producing the events before the processing, routing the events to the right processing elements, and consuming the events by consumers. In some cases, devising the event processing application is the easier part of the work, and the more difficult part is connecting it to the rest of the world. Since a substantial amount of the world's transactions go through CICS, which is a rather old but still alive-and-kicking transaction processor, it makes good sense to take it as a place for instrumentation, and emit events that can be sent either to further processing or directly to a consumer or a dashboard. The event processing part of CICS performs simple and mediated event processing, e.g. filtering, transformation, enrichment, and routing. For pattern matching it sends the events to an event processing engine. I think that we'll see more producer-side event processing support, which will reduce the need to write ad-hoc adapters and make event processing more cost-effective to use. We'll also see the complementary part - the consumer side - about which I'll write at a later date.

Tuesday, May 12, 2009

On Gartner's EPN Reference Architecture


Today is a holiday (for children; no vacation for adults..) called Lag Baomer. The highlight (besides not going to school) is that last night all the children gathered around bonfires, as seen in the picture. Fun.

Recently Gartner has published a report called "A Gartner Reference Architecture for Event Processing Networks".

On the positive side, it seems that the concept of the EPN as an underlying model for event processing is catching on. Readers of this blog may realize that I am of the opinion that we need an agreed-upon conceptual and execution model for event processing (playing the same role that the relational model plays in relational databases; however, I never believed that the relational model per se is appropriate as the model behind event processing). The book I am writing now, "Event Processing in Action", concentrates on the notion of the EPN and a deep dive into the construction of EPN-based applications.

Reading Gartner's report, I found some slight differences between the way they describe an EPN and my own description. In the Gartner report they define a term called "dissemination network" that consists of event processing agents, channels, and the event flow among them, and then they define an EPN to be a dissemination network + producers + consumers. I actually could not find any compelling reason to introduce the notion of a dissemination network. According to the definition we are using, an event processing network is a directed graph that has nodes for producers, channels, EPAs, and consumers, and edges that determine the event flow among them. Another difference is that the Gartner report views event consumers and event producers as types of event processing agents. I have a slightly different opinion: I think that event producers and consumers are not really event processing agents, since an event processing agent is a software module that processes events and may generate more events. Event consumers and producers have nodes representing them in the EPN in order to make the event flow from and to them explicit; however, they are only proxies of the actual producer and consumer - for the event processing network, they are sources and sinks. The main difference is that an EPA's functionality is explicitly specified in the EPN definition, while what the producer and consumer do is a "black box". We don't want to include their functionality, since we don't want to extend the event processing language ad infinitum.
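The directed-graph definition above can be sketched as follows. This is only an illustration of the distinction just made - producers and consumers appear as sources and sinks, while only EPAs carry explicit functionality - and every name in it is made up:

```python
from collections import defaultdict

class EPN:
    """An event processing network as a directed graph of typed nodes."""
    def __init__(self):
        self.kind = {}                  # node name -> 'producer'|'epa'|'channel'|'consumer'
        self.func = {}                  # only EPAs get explicit functionality
        self.edges = defaultdict(list)  # event flow: node -> downstream nodes

    def add_node(self, name, kind, func=None):
        self.kind[name] = kind
        if func is not None:
            self.func[name] = func
        return self

    def connect(self, src, dst):
        self.edges[src].append(dst)
        return self

    def sources(self):
        """Producer proxies are the sources: nodes with no incoming edge."""
        targets = {d for ds in self.edges.values() for d in ds}
        return [n for n in self.kind if n not in targets]

epn = (EPN()
       .add_node("cics", "producer")                       # proxy: a black box to the EPN
       .add_node("filter", "epa",                          # functionality is explicit
                 func=lambda e: e if e["amount"] > 100 else None)
       .add_node("dashboard", "consumer")                  # proxy: a sink
       .connect("cics", "filter")
       .connect("filter", "dashboard"))
print(epn.sources())  # ['cics']
```

Note how the producer and consumer nodes carry no `func`: they exist only so that the event flow to and from them is explicit in the graph.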

Mentioning the EPIA book - Chapter 3 is now on the Web and can be obtained through the MEAP program. This is the last chapter in the introductory part, and it deals with principles of programming with events. Chapter 4, the first in the deep dive, will be sent to the publisher soon. It has been much more challenging to write, and deals with what information we need to store about events - I'll blog about it soon.

Saturday, April 18, 2009

On Event Processing Building Blocks

Back to work for one day in the office, with five conference calls (one with Germany, one with France, one with the UK, and two with the USA...) and then back home for the weekend. When I have free time I like to read books; the current book I am reading is "A Lion Among Men", one of the books of Gregory Maguire, who writes stories set behind the scenes of famous children's stories (in this case, the Wizard of Oz). Actually this is his third one behind the scenes of the Wizard, now taking the Lion as its main character. I have another book by the same author still waiting...

We also submitted the draft of chapter 3 of the "Event Processing in Action" book to the publisher, which will hopefully be posted on the MEAP site soon.

The approach we have taken in the book, as I have written before, is the "building block" approach: describing event processing principles, and a use case whose construction demonstrates the application, using building blocks, which are like the chemical elements. The application itself is built using "definition elements", which are like atoms (my partner in writing this book, Peter Niblett, came up with the analogy from the world of chemistry). We believe that this is the right approach to teach what event processing is - in the "deep dive" part of the book we dedicate a chapter to each of the seven major building blocks and then dive deeper into the types of event processing agents (which deserve a separate discussion). We'll also provide samples of how each building block is realized in different models.

The seven building blocks are:
  • Event type: defines the event schema
  • Event producer: the projection of the event producer onto the event processing network (note that the event producer itself is outside the scope)
  • Event consumer: same - the projection of the event consumer onto the EPN.
  • Event channel: the glue that holds the EPN together
  • Event processing agent: the brain that does the entire work; each agent performs a specific processing task.
  • Context: the semantic partition of events and agents
  • Event derivation: A building block that is possibly part of each EPA that specifies the derived event.

There are some more building blocks that are used to support these ones, but our claim is that this set of building blocks is what is needed to build an event processing application.
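As an illustration of the context building block - the semantic partitioning of events - here is a small sketch. Partitioning by a single key attribute is a simplifying assumption (contexts can also be temporal or spatial), and the attribute names are made up:

```python
from collections import defaultdict

def partition_by_context(events, key):
    """Group events into context partitions by the given key attribute,
    so that each EPA instance works only on its own semantic partition."""
    partitions = defaultdict(list)
    for e in events:
        partitions[e[key]].append(e)
    return dict(partitions)

events = [
    {"customer": "A", "amount": 10},
    {"customer": "B", "amount": 99},
    {"customer": "A", "amount": 25},
]
by_customer = partition_by_context(events, "customer")
print(len(by_customer["A"]))  # 2
```

An EPA that, say, sums amounts per customer would then run once per partition rather than over the undifferentiated event stream.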

Chapter 4, which is in the advanced phases of being written, starts the deep dive by discussing the event type building block; in one of the next posts I'll say more about it.

Saturday, February 28, 2009

On fusion confusion and infusion


This is a picture of Akko (AKA Acre), an ancient city with walls, a typical Middle Eastern market with the smells of spices, and a fishermen's harbor. I am now hosting some visitors from Germany, Rainer von Ammon and his colleagues from CITT, to discuss some collaboration topics, including a consortium for an EU project that we are establishing with other partners. Unfortunately they chose to arrive in the rainiest period we have had this year, so we could not do much sightseeing today; however, we managed to get a two-hour break in the rain to stroll around the old city of Akko.

I'll get back to the discussion about the questions I posed yesterday soon, as I would like to see if more people want to react before stating my opinion (I am in learning mode...).

Today I'll write something about information fusion and its relationship to event processing; I came across a recent survey article in ACM Computing Surveys about data fusion.

There are various kinds of fusion - data fusion, information fusion, and sensor fusion - and all of them are intended to take information from distinct sources, blend it together, and understand what has happened. A very simple example of sensor fusion is in traffic monitoring: there is a sensor that senses the speed of a car, and there is a camera that takes pictures of the car and its license plate; fusion of both can identify the fact that a certain car has violated the speed laws. This is a relatively simple case that requires some basic image processing, but it is quite easy to determine what happened. It is, of course, a very simple case; in the area of military intelligence it is much more complicated to understand what happened / is happening / is going to happen, and various techniques are being used. The Center for Multisource Information Fusion at the University at Buffalo maintains a site with a collection of bibliography about fusion issues, including tutorials and their proposals to modify the relatively old JDL model, so you can find much more information there.
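The traffic-monitoring example can be sketched as a toy fusion step: join the two sensor sources on time proximity and derive a violation event. The speed limit, the time-skew window, and the field names are all illustrative assumptions; a real fusion system would use far more sophisticated techniques:

```python
SPEED_LIMIT = 90  # km/h - an assumed threshold for this sketch

def fuse(speed_readings, camera_readings, max_skew=1.0):
    """Correlate speed-sensor and camera readings that refer to (roughly)
    the same moment, and derive 'speed violation' events from the pair."""
    violations = []
    for s in speed_readings:
        if s["speed"] <= SPEED_LIMIT:
            continue  # no violation to report
        for c in camera_readings:
            if abs(s["time"] - c["time"]) <= max_skew:
                violations.append({"plate": c["plate"], "speed": s["speed"]})
    return violations

speeds = [{"time": 10.0, "speed": 120}, {"time": 20.0, "speed": 80}]
photos = [{"time": 10.2, "plate": "123-45-678"}]
print(fuse(speeds, photos))  # [{'plate': '123-45-678', 'speed': 120}]
```

The output of this fusion step - the violation event - is exactly the kind of event that would then enter an event processing system, which is the "infusion" point made below.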

So where is the confusion? There are people who confuse event processing with other, different areas: somebody in IBM who saw an illustration of an event processing network once tried to convince me that we are re-inventing workflows, and some data management people think that event processing is just a footnote to existing query processing - everyone with a hammer looks at the rest of the world as a bunch of nails.
Likewise, there are people who confuse fusion with event processing.



So what is the infusion? The fact of the matter is that information fusion and event processing are complementary technologies. The goal of fusion is to determine what happened, i.e. to determine what event has occurred. Event processing is processing the event after somebody has determined that the event happened; it has multiple goals, and the techniques are different: fusion uses conflict resolution techniques and stochastic modeling, while event processing uses pattern matching, transformation, aggregation, etc. Thus an event can be created using fusion techniques and then processed using an event processing system - this is the infusion.

However, there are also potential synergies between these two technologies. A partnership in which fusion technology acts as a preprocessor producing events for event processing can be beneficial for certain applications; this is the most obvious synergy. Another type of synergy is that techniques used in fusion can be used in event processing and vice versa; this is an interesting direction to investigate further, along with possible real applications for it. More on this - later.

Saturday, June 28, 2008

On embedded intelligence within event processing application




In the previous post I referred to the term "intelligent event processing". One question that I asked is: is this a new term? How does it relate to the other "X event processing" terms? I am not sure whether the term "intelligent event processing" will stick around; I would say that a better way to explain it may be "embedded intelligence in event processing".


If we look at event processing architectures, there are: producers who produce events, consumers who consume the processing results, and the EPN (Event Processing Network) in the middle, which really does the processing. So where can intelligent techniques help? Here are some (real) examples:
1. In the producer - the producer has a video stream of all the cars that pass below a camera; an intelligent process (using image processing techniques) isolates the license plate number of each car and sends it for further processing (security, traffic violation, billing, etc.).
2. In the "meta-data" composition - a "pattern detection" node typically looks at pre-defined patterns and attempts to detect them at run-time. In current applications the patterns are entered by the developers or users. In some cases the patterns are a "moving target", as in fraud detection: if the patterns for fraud are discovered they are of little value, and thus, on the other side of the law, people are constantly looking for new loopholes. Intelligent techniques, such as machine learning, are therefore used to refresh the patterns that are looked for at run-time. The run-time does not change - it is the same pattern-detection mechanism, just with a different source for the patterns.
3. Intelligent nodes within the EPN - in some cases the process of deriving new events cannot be expressed as a derivation expression and needs an intelligent derivation process - e.g. a heuristic algorithm to determine traffic light policies based on traffic events.


There are many more examples - like creating predicted events, and more - but this was meant to give some flavor. Is it useful? Yes, it is useful for a variety of applications. Does every CEP application need embedded intelligence? Not really. More - later.
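Example 2 above - an unchanged run-time detection mechanism whose patterns are refreshed from outside - can be sketched like this. The "learner" is stubbed out (a real system would plug in an actual machine-learning component), and all names are illustrative:

```python
class PatternDetector:
    """The run-time mechanism: match events against the current pattern set."""
    def __init__(self, patterns):
        self.patterns = patterns      # pattern name -> predicate over an event

    def refresh(self, new_patterns):
        """Called when the (external) learner produces a new pattern set;
        the detection mechanism itself does not change."""
        self.patterns = new_patterns

    def detect(self, event):
        return [name for name, pred in self.patterns.items() if pred(event)]

# Initial, hand-authored fraud patterns:
detector = PatternDetector({"large_tx": lambda e: e["amount"] > 10_000})
print(detector.detect({"amount": 50_000, "country": "X"}))  # ['large_tx']

# The learner discovers a new loophole and refreshes the pattern set:
detector.refresh({"large_tx": lambda e: e["amount"] > 10_000,
                  "odd_country": lambda e: e["country"] == "X"})
print(detector.detect({"amount": 50_000, "country": "X"}))  # ['large_tx', 'odd_country']
```

The design point is the separation of concerns: `detect` never changes, and only the source of `patterns` is intelligent.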

Saturday, April 26, 2008

On Streams and Events

The picture above is taken from a UCLA project that deals with multimedia stream systems. While the terms "data streams" and later "event streams", which deal with continuous queries over structured data, were introduced in the last decade in the database research community (with spin-offs to products), the term "streams" has a more general and more traditional meaning, referring to multimedia streams - video, voice, news, etc. - which by nature belong to the family of unstructured data. In previous postings I discussed some of the problems around "event stream processing" and around the classification of event processing technologies. However, in this posting I would like to point out that "stream processing" in its more traditional meaning is an important complementary technology to event processing.

First, the result of stream processing can be the detection that an event has happened. An example is the detection of a vehicle's registration plate on automatic toll roads (we have one of these roads in Israel; there are other roads like this in Canada and Sweden, and maybe in more places), where the event is "vehicle with registration plate X entered the highway at entrance Y at time T". This can be further processed (after correlating it with the exit event) for billing purposes, but can also serve security and other applications. In this case, from the event processing architectural view, the "stream processing" is done in a producer application, which generates events that are processed in the event processing system.

Second, the result of an event processing system can be an input to a stream. Example: a game is presented to the players as a video stream. Decisions made by the electronic player (or by the human player) can be assisted by an event processing system. The result of the decision can be the movement of a player in a certain direction, and this is fed back into the video stream. In that case, the video stream is processed in a consumer application, which gets events as input.

Of course, a producer can also be a consumer, especially in games, which are of an iterative nature; thus an application can communicate with an event processing system on both sides.

Since much of what happens in the universe is sensed through various unstructured media, the area of creating events out of multimedia streams, and of embedding events to control the behavior of multimedia streams, will be one of the major directions for the future; we can see some of this happening already.

Sunday, January 13, 2008

How do events get into processing?


Hello - again on a business trip after a break from trips - now in Regensburg, Germany (see the picture) for the CITT conference that starts tomorrow, so I'll have time to write about it later this week. The topic of this posting relates to the event processing conceptual model and deals with the producer side.
Getting events from a producer may be the most time-consuming work in an event processing application, depending on the number of producers, the existence of adapters, and the complexity of instrumentation (if needed).
A producer can be either a sensor that reports on an event it senses, or a creator of an event, in which case the event is typically an instrumentation of something that happens within the producer.
While "event-driven" is closely associated with push mode - meaning the producer decides when and how it sends events - this is only one of the possible modes. In some cases "pull" mode is used, meaning the event processing network pulls events from the producer.
Why is pull needed? Several reasons:
(1) Not all sources are really producers - the source may be a piece of legacy software that has not been designed to produce events.
(2) Push may not be cost-effective - we have to weigh the benefit of processing events fast against the overhead incurred in transaction systems to obtain the events. This depends on the application, on the type of processing for the events, and on the tolerance of the operational system.
Pull can be done as periodic pull or on-demand pull. More about the conceptual model parts - later.
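The two pull modes can be sketched with a toy adapter. The legacy source is a stub (a real one might be a database table or a log file), and in a real periodic pull the adapter would wait between fetches rather than loop:

```python
class LegacySource:
    """Stands in for legacy software that only answers queries - it was
    never designed to produce (push) events."""
    def __init__(self):
        self._rows = [{"id": 1}, {"id": 2}, {"id": 3}]
        self._cursor = 0

    def read_new(self):
        rows, self._cursor = self._rows[self._cursor:], len(self._rows)
        return rows

class PullAdapter:
    """Turns a query-only source into an event producer for the EPN."""
    def __init__(self, source):
        self.source = source

    def pull_on_demand(self):
        """On-demand pull: fetch whatever is new right now."""
        return self.source.read_new()

    def pull_periodically(self, ticks):
        """Periodic pull: one fetch per tick (time.sleep omitted in the sketch)."""
        events = []
        for _ in range(ticks):
            events.extend(self.source.read_new())
        return events

adapter = PullAdapter(LegacySource())
print(len(adapter.pull_on_demand()))  # 3
print(adapter.pull_on_demand())       # [] - nothing new since the last pull
```

The cursor in the stub illustrates the trade-off from reason (2): each pull costs the source a query, which is why pull frequency has to be weighed against how fast events need to be processed.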