
Executive Q&A: Inside Event-Driven Architectures

What is EDA, what are the benefits and challenges, and what’s driving enterprises to adopt it? We turned to Matthew O’Riordan, CEO and technical co-founder of Ably, for answers.

Upside: What is an event-driven architecture (EDA)?

Matthew O’Riordan: EDA is a software design model that allows enterprises to act in real time on business events such as customer transactions. It is based on events, which signal to interested parties that a state change occurred in a business system. For example, in an e-commerce transaction, events include a consumer buying a product online, the retailer shipping the product, and a delivery service dropping the product on a doorstep.
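[Editor’s note: as a rough illustration of the idea, the short Python sketch below models the e-commerce events O’Riordan describes as immutable records of state changes. The field and event names are illustrative only, not a standard schema.]

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass(frozen=True)
class Event:
    # An immutable record that a state change occurred in a business system.
    event_type: str              # e.g., "order.placed", "order.shipped"
    payload: dict[str, Any]      # details of the state change
    occurred_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# The events from the e-commerce example above:
placed = Event("order.placed", {"order_id": "A123", "item": "running shoes"})
shipped = Event("order.shipped", {"order_id": "A123", "carrier": "UPS"})
delivered = Event("order.delivered", {"order_id": "A123"})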

In EDA, there are producers and consumers. Producers trigger events, which are sent as messages through an event channel to interested consumers, who process them asynchronously. Producers and consumers don’t have to wait for each other to begin the next task because producers are either loosely coupled or completely decoupled from recipients.
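[Editor’s note: a minimal sketch of that decoupling, using an in-process asyncio queue as a stand-in for the event channel. The names are illustrative; a production system would use a durable broker rather than an in-memory queue.]

import asyncio

async def producer(channel: asyncio.Queue) -> None:
    # The producer emits events and moves on; it never waits for a consumer.
    for order_id in ("A123", "A124", "A125"):
        await channel.put({"event_type": "order.placed", "order_id": order_id})
    await channel.put(None)  # sentinel: no more events

async def consumer(channel: asyncio.Queue) -> None:
    # The consumer pulls events off the channel and processes them at its own pace.
    while (event := await channel.get()) is not None:
        print("processing", event)

async def main() -> None:
    channel: asyncio.Queue = asyncio.Queue()
    await asyncio.gather(producer(channel), consumer(channel))

asyncio.run(main())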

Publish/subscribe (pub/sub), for example, is an architectural design pattern often used in EDA that provides a framework for exchanging messages between publishers (who are producers) and subscribers (who are consumers). A broker receives all the publisher’s events and routes them to the registered subscribers. A broker also records events. Consumers can access the event stream at any time and read the most recent message or process a series of messages from the last time they checked the stream.
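[Editor’s note: the toy broker below illustrates the pattern described above. It records every event in an append-only log and lets each subscriber pick up from wherever it last left off; for simplicity, subscribers pull events rather than having them pushed. The class and method names are assumptions for illustration, not a real broker API.]

from collections import defaultdict
from typing import Any

class Broker:
    def __init__(self) -> None:
        self._log: dict[str, list[Any]] = defaultdict(list)  # topic -> recorded events
        self._offsets: dict[tuple[str, str], int] = {}        # (topic, subscriber) -> next index to read

    def publish(self, topic: str, event: Any) -> None:
        # The publisher talks only to the broker and knows nothing about subscribers.
        self._log[topic].append(event)

    def subscribe(self, topic: str, subscriber: str) -> None:
        # A new subscriber starts at the beginning of the recorded stream.
        self._offsets.setdefault((topic, subscriber), 0)

    def read(self, topic: str, subscriber: str) -> list[Any]:
        # Return everything published since this subscriber last checked the stream.
        start = self._offsets.get((topic, subscriber), 0)
        events = self._log[topic][start:]
        self._offsets[(topic, subscriber)] = len(self._log[topic])
        return events

broker = Broker()
broker.subscribe("orders", "shipping-service")
broker.publish("orders", {"event_type": "order.placed", "order_id": "A123"})
broker.publish("orders", {"event_type": "order.placed", "order_id": "A124"})
print(broker.read("orders", "shipping-service"))  # both events, in order
print(broker.read("orders", "shipping-service"))  # [] until something new is published

Because each subscriber tracks its own offset against the recorded log, a consumer that falls behind or goes offline simply catches up on its next read, which is the same property behind the resilience benefits discussed below.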

What is driving enterprises to adopt EDA?

Enterprises are sitting on large volumes of valuable data that they need to use to succeed in today’s digital economy. The amount of data generated daily will reach 463 exabytes globally in three years. As part of ongoing digital transformation projects, enterprises are implementing strategies such as EDA to consume and analyze all this data to help make the right business decisions in real time. In fact, IDC predicts that by 2025, 30 percent of all data may be real-time data.

What are the benefits of an EDA approach?

EDA helps organizations process data immediately to meet today’s demands for real-time digital interactions. In EDA, the producer and consumer are decoupled, which improves scalability because it separates communication and business logic. A publisher can avoid bottlenecks and remain unaffected if subscribers go offline or consumption slows down. If a subscriber struggles to keep up with the pace of events, the event stream records them and the subscriber can retrieve them later. The publisher can continue to send notifications without throughput limitations and with high resilience to failure.

Also, by using a broker in a pub/sub design pattern, a publisher does not know its subscribers and is unaffected if the number of interested parties scales up. Publishing to the broker offers the producer the opportunity to deliver notifications to a range of consumers across different devices and platforms.

What are the biggest challenges with building an EDA, and how can enterprises overcome them?

Although enterprises can easily build an initial custom EDA to meet today’s demands for instant access, the effort becomes riskier, more complex, more costly, and more time-consuming as they start to offer a service at scale. Organizations spend months stitching together disparate systems to build real-time capabilities, which results in technical debt and further engineering investment just to deliver essential features.

Many applications rely on a sequence of messages that depend on each other. However, if these messages are lost or delivered out of order, organizations can’t deliver the seamless real-time digital experiences users expect. The engineering required to provide real-time digital experiences with data integrity is complex and forces unacceptable tradeoffs: companies end up sacrificing performance at scale to preserve data integrity.

Latency and bandwidth issues create uncertainty when designing, building, and scaling real-time features. Organizations need to minimize not only latency and bandwidth requirements but also the variance in both, so developers can build against predictable behavior. It’s also hard for organizations to design, build, and operate their own globally distributed, fault-tolerant real-time infrastructure.

To avoid the issues associated with delivering a custom EDA at scale, enterprises should offload this complexity to a third party, giving their engineering teams the ability to focus on improving their core offering.

How do enterprises know where to use EDA?

Enterprises should look to implement EDA in use cases where the user expects instant, reliable digital experiences on internet-connected devices. EDA suits a wide range of use cases, such as powering real-time functionality in chat, alerts, notifications, and IoT devices. It can also stream high volumes of data between different businesses or between components of a single system.

What skills or resources are needed to implement EDA?

As with many technology sourcing decisions, the skills required for implementation will depend on how much the team chooses to implement in-house. When building in-house, or composing a solution from OSS components, adopting EDA will require a team to invest more in infrastructure skills than may have been previously necessary.

If the use case requires high levels of availability, exactly-once delivery semantics, and low latency, a global implementation will be required. This is usually a significant step up for most teams, requiring not only more site reliability engineers (SREs) but probably some distributed systems engineering knowledge, too. For many use cases, serverless or hosted solutions can reduce or eliminate the need to add these specialist skills to the team.

[Editor’s note: Matthew O’Riordan is CEO and technical co-founder of Ably, a serverless WebSocket Platform-as-a-Service (PaaS) operating at the edge. He has been a software engineer for over 20 years, many of those as a CTO. He first started working on commercial internet projects in the mid-1990s, when Internet Explorer 3 and Netscape were still battling it out. While he enjoys coding, the challenges he faces as an entrepreneur starting and scaling businesses are what drive him. Matthew has previously started and successfully exited from two tech businesses. You can reach him on LinkedIn or Twitter.]
