Serverless with the Pipes and Filters Architecture

In a previous blog post I argued that serverless computing is not an architecture style. I did note, however, that – as with every technology – some architecture styles go along with it better than others. Microservices and event-driven architecture probably come to mind first. In this blog post, though, I would like to cover a lesser-known architecture style that seems to be a perfect match for serverless computing – the pipes and filters architecture, which emphasises modularity and reusability in processing components. This architecture is built on the concept of “pipes” that connect “filters”, enabling the transformation or processing of data as it flows through the system.

Understanding Pipes and Filters Architecture

The Pipes and Filters architecture is characterised by decomposing a system into small, self-contained processing components called filters. Each filter performs a specific task or transformation on the data it receives, promoting a highly modular and reusable design. Filters are connected using pipes, which serve as conduits for data flow, ensuring a one-way flow of information through the system. This architecture is particularly useful in scenarios such as data integration, data processing workflows, data transformation pipelines, and stream processing systems.

Key features of the Pipes and Filters architecture include:

  • Modularity and Reusability: Each filter is designed to be independent, reusable, and replaceable. This modularity allows for flexible composition of processing components.
  • Sequential Data Flow: Data passes from one filter to the next through pipes, enabling sequential processing.
  • Loose Coupling: Filters interact through well-defined data interfaces provided by the pipes, promoting loose coupling and reusability.
  • Scalability and Parallelism: Filters can be replicated or distributed to handle increased processing loads, allowing for parallel processing.
  • Flexibility and Adaptability: Filters can be added, removed, or rearranged within the pipeline to accommodate changing processing needs.
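To make the idea concrete, here is a minimal sketch of a pipes-and-filters pipeline in plain Python. Each filter is an independent, reusable function that transforms a stream of records, and the “pipe” is simply the iterator passed from one filter to the next. All names here are illustrative, not taken from any specific framework:

```python
def parse(lines):
    # Filter 1: turn raw CSV-like lines into records.
    for line in lines:
        name, value = line.split(",")
        yield {"name": name.strip(), "value": int(value)}

def validate(records):
    # Filter 2: drop records that fail a simple check.
    for record in records:
        if record["value"] >= 0:
            yield record

def enrich(records):
    # Filter 3: add a derived field.
    for record in records:
        record["doubled"] = record["value"] * 2
        yield record

# Compose the pipeline: data flows one way, filter to filter.
raw = ["a, 1", "b, -5", "c, 3"]
pipeline = enrich(validate(parse(raw)))
print(list(pipeline))
```

Note how each filter can be tested, replaced, or rearranged independently – reordering `validate` and `enrich`, or inserting a new filter between them, requires no changes to the other filters.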

Serverless Pipes and Filters

Serverless computing naturally complements the Pipes and Filters architecture through several key features. One significant aspect is the modularity and reusability inherent in both paradigms. In serverless computing, individual functions are designed to perform specific tasks and can be independently deployed, updated, or replaced without impacting the rest of the system. This mirrors the independent and reusable nature of filters in the Pipes and Filters architecture, where each filter is a standalone processing unit.

Additionally, serverless platforms inherently support sequential data flow through event-driven triggers, similar to how data flows through pipes from one filter to another. This ensures that each function or filter performs its task in sequence, enhancing the clarity and manageability of the data processing pipeline. Moreover, serverless functions communicate through well-defined event interfaces, which aligns with the loose coupling seen in the Pipes and Filters architecture. This separation allows for easier maintenance and testing, as changes in one function or filter do not directly affect others.

Finally, the scalability and parallelism provided by serverless platforms are a perfect match for the scalable and distributable nature of filters, allowing the system to handle varying loads efficiently. The flexibility and adaptability of serverless functions, which can be quickly modified or scaled, likewise resonate with the ability to add, remove, or rearrange filters within the pipeline, making it easy to adapt to changing requirements or workloads.

Function Chaining

There is one feature of serverless computing that goes particularly well with the pipes and filters architecture – function chaining. It is the practice of linking multiple serverless functions together, where the output of one function serves as the input for the next. This allows complex workflows to be built by decomposing tasks into smaller, manageable, and reusable functions that execute sequentially. It mirrors exactly how a pipes-and-filters application should be designed, and that is the main reason why serverless computing is my go-to technology whenever I use the pipes and filters architecture in my designs.
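Function chaining can be sketched locally as follows: each handler is a standalone serverless-style function, and the chain simply feeds one function’s output event into the next, just as an orchestrator or event trigger would in a real deployment. The function names are illustrative:

```python
def extract(event):
    # First link: split the raw text into words.
    return {"words": event["text"].split()}

def transform(event):
    # Second link: upper-case every word.
    return {"words": [w.upper() for w in event["words"]]}

def load(event):
    # Final link: summarise the result.
    return {"count": len(event["words"]), "words": event["words"]}

def chain(*functions):
    # Build a pipeline that mirrors how a serverless platform would
    # invoke each function with the previous function's output.
    def run(event):
        for fn in functions:
            event = fn(event)
        return event
    return run

pipeline = chain(extract, transform, load)
print(pipeline({"text": "pipes and filters"}))
```

On a real platform the `chain` helper would be replaced by the provider’s orchestration mechanism (for example a workflow service or event triggers), but the shape of the design – output of one function becoming the input of the next – stays the same.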

Summary

In this blog post, we explored the synergy between serverless computing and the pipes and filters architecture, a lesser-known but highly effective design that emphasises modularity and reusability in processing components. The pipes and filters architecture decomposes systems into self-contained filters connected by pipes, ensuring a sequential data flow. Serverless computing complements this by offering modular, independently deployable functions that support sequential processing and loose coupling through well-defined event interfaces. This combination enhances scalability, parallelism, and flexibility, making it ideal for dynamic workloads. A key feature of serverless computing, function chaining, perfectly aligns with the pipes and filters model, enabling the creation of complex, manageable workflows.

In the next blog post I will show you, with a concrete example, how serverless computing can be used in such a design.