As data needs continue to grow, an architecture you built as recently as 10 years ago doesn't always scale to meet them. The operational data store once provided significant value, but it has since become an architectural bottleneck. You've thrown hardware at it. You've optimized the software and the data model. You've cut off access to it — then restored access because its data is so valuable. Yet it still comes up at almost every system risk review meeting.
In this session, Jennifer Lim will discuss the challenges the operational data store poses and how modern technologies like Kafka, Spark, and the cloud reshape our solutions. Jennifer will share lessons Cerner IT has learned in shifting to a modern big data architecture for its operational needs, covering both transactional integration and operational reporting and analytics.