At Progress NEXT 2019, Yogesh spoke in his keynote about how we at Progress are accelerating digital innovation, and during that presentation we showed off a cool little demo of an event-driven architecture in which a baseball company updates its inventory and pricing on Sitefinity based on production-rate data from IoT devices and on demand for its baseball products. We heard some good things about the demo, and there were requests to share how we built it, so I want to do that in a series of articles.

The main backbone of the event-driven streaming architecture we showcased is Progress OpenEdge and Apache Kafka. To recap: say we have a baseball manufacturing plant where IoT devices on the production line send data about finished goods. This data is used to update inventory levels in OpenEdge via Apache Kafka and to show the data on Sitefinity. You start receiving orders from your customers on Sitefinity, and these orders, along with the inventory events from the production line, are written into Kafka by enabling CDC on the OpenEdge database. The inventory updates and product demand are then analyzed by Progress Corticon/the Spark engine, which reads the events from Kafka to determine real-time, demand-based pricing for the products. These price updates are in turn shown on Sitefinity alongside the inventory updates we already have.
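To give a feel for the demand-based pricing step, here is a minimal sketch of the kind of rule the pricing engine might apply once order and production events have been read off Kafka. In the demo this logic lives in Progress Corticon/Spark; the function below, and every name and threshold in it, is an illustrative assumption of mine, not the demo's actual rule set.

```python
def adjust_price(base_price, orders_per_hour, units_produced_per_hour):
    """Raise the price when demand outpaces production, lower it when
    inventory is building up faster than it sells.

    Hypothetical rule: scale the base price by the demand/production
    ratio, capped at +/-20% so a short burst of events cannot whipsaw
    the published price.
    """
    if units_produced_per_hour <= 0:
        # No production data yet; leave the price unchanged.
        return base_price
    demand_ratio = orders_per_hour / units_produced_per_hour
    factor = max(0.8, min(1.2, demand_ratio))
    return round(base_price * factor, 2)


# Example: 150 orders/hour against 100 units/hour produced
# pushes a $10.00 baseball up to the +20% cap, i.e. $12.00.
print(adjust_price(10.00, 150, 100))
```

In the real pipeline, a consumer would feed each Kafka event's counts into a function like this and publish the resulting price back for Sitefinity to display.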