Digital Business Automation Ideas


This is an IBM Automation portal for Digital Business Automation products. To view all of your ideas submitted to IBM, create and manage groups of Ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.

Post your ideas
  1. Post an idea.

  2. Get feedback from the IBM team and other customers to refine your idea.

  3. Follow the idea through the IBM Ideas process.


Please use the following categories to raise ideas for these offerings in all environments (traditional on-premises, containers, on cloud):
  • Cloud Pak for Business Automation - including Business Automation Studio and App Designer, Business Automation Insights

  • Business Automation Workflow (BAW) - including BAW, Business Process Manager, Workstream Services, Business Performance Center, Advanced Case Management

  • Content Services - FileNet Content Manager

  • Content Services - Content Manager OnDemand

  • Content Services - Daeja Virtual Viewer

  • Content Services - Navigator

  • Content Services - Content Collector for Email, SharePoint, Files

  • Content Services - Content Collector for SAP

  • Content Services - Enterprise Records

  • Content Services - Content Manager (CM8)

  • Datacap

  • Automation Document Processing

  • Automation Decision Services (ADS)

  • Operational Decision Manager

  • Robotic Process Automation

  • Robotic Process Automation with Automation Anywhere

  • Blueworks Live

  • Business Automation Manager Open Edition

  • IBM Process Mining


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.


Status: Planned for future release
Created by: Guest
Created on: Oct 4, 2024

Support full Kafka protocol for events - key and header support missing

Our BAMOE 8.0.5 deployment on OpenShift is integrated with an external Kafka cluster.


In the processes running in our BAMOE instance, tasks shall be used to send and receive Kafka messages containing a message value and a specific message key. The key is important for ensuring that related messages end up in the same Kafka topic partition. When creating a process with a task that shall publish a Kafka message, using an intermediate throw event or an end event of the type "message", there seems to be no way to configure the message key to be used.
 

Also, the Kafka protocol includes headers, which are required to carry business context outside of the Kafka value (payload).

A detailed mapping to the CloudEvents spec can be found here: https://github.com/cloudevents/spec/blob/main/cloudevents/bindings/kafka-protocol-binding.md
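
For illustration, the plain Kafka producer client already supports both concepts; the following is a minimal sketch (topic name, key, and header are invented examples):

import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KeyedMessageExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Records that share a key are assigned to the same topic partition,
            // which keeps related messages together and in order.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders", "order-4711", "{\"status\":\"approved\"}");
            // Headers carry business context outside of the value (payload).
            record.headers().add("tenant", "acme".getBytes(StandardCharsets.UTF_8));
            producer.send(record);
        }
    }
}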

Idea priority: Urgent
  • Admin
    Karina Varela
    Nov 15, 2024

    Thanks for sharing your feedback in great detail, and for clarifying why this feature impacts your usage of the solution.

    We have assessed the possibility of delivering this requirement, and the current tentative release status of this request is:

    • Committed for Q2, as part of the 8.0.x release of the quarter.

    Since Q1's release scope was already closed, this request's status for Q1 remains tentative.

    We'll keep you updated through this same channel, and please feel free to get in touch if needed.

  • Guest
    Oct 24, 2024

    Ideas for implementing this:

    ### Kafka Producer ###

    The existing implementation for publishing Kafka messages (in org.kie.server.services.jbpm.kafka.KafkaServerProducer) takes a value Object as input.
    Since I see no way of adding additional input parameters here, the extra Kafka message key and headers to be used by the producer would presumably need to be part of the input value object itself.

    I haven't found a suitable kie/jbpm "Event"/"EventContainer"/"EventWrapper" object. Therefore, either a new such class needs to be created or a Map with predefined keys would need to be used.
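
    A minimal sketch of such a new class (the name and shape are only a proposal, not existing kie/jbpm API):

    import java.util.Collections;
    import java.util.Map;

    public class EventWrapper {
        private final Object data;                      // the actual message payload
        private final String kafkaKey;                  // optional Kafka record key
        private final Map<String, byte[]> kafkaHeaders; // optional Kafka headers

        public EventWrapper(Object data, String kafkaKey, Map<String, byte[]> kafkaHeaders) {
            this.data = data;
            this.kafkaKey = kafkaKey;
            this.kafkaHeaders = kafkaHeaders == null ? Collections.emptyMap() : kafkaHeaders;
        }

        public Object getData() { return data; }
        public String getKafkaKey() { return kafkaKey; }
        public Map<String, byte[]> getKafkaHeaders() { return kafkaHeaders; }
    }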

    The new class would probably be better placed in a more generic module than the specific kie-server-services-kafka module.
    The Map could have keys with a common prefix like "org.jbpm.process.core.event." (customizable via a system property) and the suffixes "data", "kafka.key", and "kafka.headers". Other kinds of metadata entries could also be supported, which the CloudEventWriter could then map to CloudEvent (extension) attributes.
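
    A sketch of the Map variant with the proposed prefixed keys (the prefix, the system property name, and all other identifiers are hypothetical):

    import java.util.HashMap;
    import java.util.Map;

    public final class EventMaps {
        // Proposed default prefix; the system property name is made up.
        private static final String PREFIX =
                System.getProperty("org.jbpm.event.map.prefix", "org.jbpm.process.core.event.");

        public static Map<String, Object> wrap(Object data, String kafkaKey,
                                               Map<String, byte[]> kafkaHeaders) {
            Map<String, Object> event = new HashMap<>();
            event.put(PREFIX + "data", data);
            event.put(PREFIX + "kafka.key", kafkaKey);
            event.put(PREFIX + "kafka.headers", kafkaHeaders);
            return event;
        }
    }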

    When the KafkaServerProducer.processEvent() method gets an object of that class or such a Map as its value input parameter, it extracts the value/key/headers from it and uses them for publishing the Kafka message, as sketched below.
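
    Roughly like this, assuming the payload has already been serialized by the existing writer (only ProducerRecord is real Kafka client API; EventWrapper is the hypothetical class from above):

    import java.util.Collections;
    import java.util.Map;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public final class EventRecords {
        public static ProducerRecord<String, byte[]> toRecord(String topic, Object value,
                                                              byte[] serializedData) {
            String key = null;
            Map<String, byte[]> headers = Collections.emptyMap();
            if (value instanceof EventWrapper) {
                EventWrapper wrapper = (EventWrapper) value;
                key = wrapper.getKafkaKey();
                headers = wrapper.getKafkaHeaders();
            }
            // A null key falls back to the default partitioner, preserving
            // today's behaviour for plain (unwrapped) values.
            ProducerRecord<String, byte[]> record = new ProducerRecord<>(topic, key, serializedData);
            headers.forEach((name, bytes) -> record.headers().add(name, bytes));
            return record;
        }
    }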


    ### Kafka Consumer ###

    In the existing Kafka consumer implementation (in org.kie.server.services.jbpm.kafka.KafkaServerConsumer), the idea would be that, based on some input information, the consumer could be made to pass on not only the received Kafka message value but also its key and headers.
    The corresponding "processEvent" method takes the received Kafka message and some info about the signalling node (name and output type class) as input.
    Maybe there would be a way to access the metadata of the signalling node and let some kind of "passOnKeyAndHeaders"/"passOnValueAndMetadata" flag be defined there.
    I haven't seen a straightforward way to implement this, though.
    An alternative would be to let the KafkaServerConsumer handle a special "Event"/"EventContainer"/"EventWrapper" output type class in such a way that it initializes this object with the received message value plus key and headers.
    It could be the same class also used by the producer.
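
    A sketch of that alternative (ConsumerRecord and Header are real Kafka client API; EventWrapper and the helper are hypothetical):

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.header.Header;

    public final class EventWrapping {
        // deserializedValue is what the existing (Cloud)EventReader produced
        // from the record value; key and headers are copied from the record.
        public static EventWrapper fromRecord(ConsumerRecord<String, byte[]> record,
                                              Object deserializedValue) {
            Map<String, byte[]> headers = new HashMap<>();
            for (Header header : record.headers()) {
                headers.put(header.key(), header.value());
            }
            return new EventWrapper(deserializedValue, record.key(), headers);
        }
    }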

    ---
    In general, concerning CloudEvent support:
    The current producer/consumer implementation does not expose the CloudEvent object to the process but only uses it internally in the (Cloud)EventWriter/Reader.
    I think this approach can be kept in place. The Kafka message key and headers don't necessarily fit into the CloudEvent attributes anyway. (The Kafka message key could be mapped to the CloudEvent "partitionkey" extension, but that is not a necessity.)
    Therefore, the new "Event"/"EventContainer"/"EventWrapper" object could be kept more generic, instead of being a CloudEventContainer. It could contain the Kafka message key/headers and could be extended in the future with specific CloudEvent properties.
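
    If that mapping were ever wanted, the CloudEvents Java SDK could set the "partitionkey" extension defined by the Kafka protocol binding; a sketch (id, type, and source are placeholders, and jBPM may build its CloudEvents differently):

    import java.net.URI;
    import io.cloudevents.CloudEvent;
    import io.cloudevents.core.builder.CloudEventBuilder;

    public final class PartitionKeyMapping {
        public static CloudEvent withPartitionKey(String kafkaKey, byte[] payload) {
            return CloudEventBuilder.v1()
                    .withId("42")                                // placeholder
                    .withType("org.example.process.event")       // placeholder
                    .withSource(URI.create("/process/example"))  // placeholder
                    .withData("application/json", payload)
                    .withExtension("partitionkey", kafkaKey)     // per the Kafka binding spec
                    .build();
        }
    }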