An event-driven architecture is a model that allows communication between services in a decoupled fashion. This pattern has evolved into a powerful software paradigm and covers a wide array of use cases. Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. With this industry evolution, we can now create event-driven business processes which can work seamlessly in a microservices-based environment. With the latest release of Red Hat Process Automation Manager (7.11), you can work with business processes capable of interacting with external services via events, either by emitting or consuming them.
Earlier this year, Karina Varela wrote a detailed article about the integration between Red Hat Process Automation Manager and Kafka. In this article, we will look at the integration of Red Hat Process Automation Manager with Red Hat AMQ Streams on OpenShift. Red Hat AMQ Streams is an enterprise-grade, Kubernetes-native Kafka solution.
In IT today, it can be both challenging and time-consuming for operations and development teams to become experts in many different technologies: they must know not only how to use each one, but also how to install, configure, and maintain it. Kubernetes operators help streamline these installation, configuration, and maintenance complexities. We will be using the AMQ Streams Operator and the Business Automation Operator to simplify the process.
In this article, you'll see how to deliver and test an event-driven process application on OpenShift in four steps:
- Kafka deployment on OpenShift
- Red Hat Process Automation Manager deployment on OpenShift
- Creation and deployment of the business application
- Test of the business application using events
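To give a sense of what the first step involves: the AMQ Streams Operator manages Kafka clusters that you declare as Kubernetes custom resources and apply with `oc apply -f`. The following is a minimal, hypothetical cluster definition; the cluster name, listener configuration, and sizing are illustrative and not taken from the article.

```yaml
# Hypothetical minimal Kafka cluster for the AMQ Streams Operator.
# The operator watches for resources of kind "Kafka" and provisions
# the broker and ZooKeeper pods to match this specification.
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster        # illustrative name
spec:
  kafka:
    replicas: 3           # three broker pods
    listeners:
      - name: plain
        port: 9092
        type: internal    # reachable inside the cluster only
        tls: false
    storage:
      type: ephemeral     # fine for a demo; use persistent storage in production
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
  entityOperator:
    topicOperator: {}     # lets you manage topics as KafkaTopic resources
```

Once applied, the operator handles rollout, configuration, and ongoing maintenance of the cluster, which is exactly the operational burden the operators discussed above are meant to remove.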
To learn more, check out the full blog post with a detailed step-by-step guide.