In a farm, each data item appearing on the input stream is transformed into a data item of the output stream by applying a function f to it.

Functionally, farm f has type

'a stream -> 'b stream

provided that the function f has type

'a -> 'b.

Given the input stream :xn:...:x1:x0:, the farm farm(f,n) computes the output stream :f(xn):...:f(x1):f(x0): using n worker processes.
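As a minimal sketch of this functional semantics (purely for illustration: a finite stream is represented here as an OCaml list, and the name farm_seq is made up):

    (* Sequential semantics of the farm: the result is just the map of f
       over the stream. The worker count n only affects how items may be
       computed in parallel, not which items are produced. *)
    let farm_seq (f : 'a -> 'b) (_n : int) (stream : 'a list) : 'b list =
      List.map f stream

    (* Example: farm(square, 4) over the stream :4:...:1:0: *)
    let () =
      farm_seq (fun x -> x * x) 4 [0; 1; 2; 3; 4]
      |> List.iter (Printf.printf "%d ");      (* prints: 0 1 4 9 16 *)
      print_newline ()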

In terms of (parallel) processes, the sequence of data items appearing on the input stream of a farm is submitted to a set of worker processes. Each worker process applies the same function to the data items it receives and delivers the results onto the output stream. The resulting process network consists of an emitter process, the set of worker processes, and a collector process.

The emitter process takes care of task-to-worker scheduling, while the collector process reorders the output data items with respect to the input ordering and delivers them onto the output data stream. A sketch of this behaviour, in terms of the data items processed, is given below.
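The following is a rough, assumption-laden sketch of that process network, not a definitive implementation: it uses OCaml 5 domains (one per worker), works on a whole finite stream represented as a list, and the names farm_par and buckets are invented here.

    (* Process-network sketch: emitter -> n workers -> collector.
       Requires OCaml 5 (the Domain module). *)
    let farm_par (f : 'a -> 'b) (n : int) (input : 'a list) : 'b list =
      (* Emitter: tag each item with its input position, then deal the
         tasks round-robin to n per-worker task lists. *)
      let tagged = List.mapi (fun i x -> (i, x)) input in
      let buckets = Array.make n [] in
      List.iteri (fun i task -> buckets.(i mod n) <- task :: buckets.(i mod n)) tagged;
      (* Workers: each domain applies the same function f to every task
         it received, keeping the position tag. *)
      let workers =
        Array.map
          (fun tasks -> Domain.spawn (fun () -> List.map (fun (i, x) -> (i, f x)) tasks))
          buckets
      in
      (* Collector: gather the results of all workers and reorder them
         with respect to the input ordering. *)
      Array.to_list workers
      |> List.concat_map Domain.join
      |> List.sort (fun (i, _) (j, _) -> compare i j)
      |> List.map snd

In a real farm the collector reorders items on the fly as workers deliver them; in this sketch the whole stream is materialised, so the reordering is simply done at the end by sorting on the positions assigned by the emitter.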

