Data processing technologies perform at very different rates, which makes the total cost of performance a central question in feasibility studies for Big Data tools such as traditional databases, Hadoop, and stream processors. The reason is simple. Any storage-based technology must store data before it can be queried and processed, and the entire dataset must be re-queried whenever new data arrives. This introduces latency into the system. Adding firepower in the form of more servers helps up to a point, but inevitably a tipping point is reached where it is simply impossible to reduce the latency further. This is where stream processing comes into play.
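The contrast between re-querying a stored dataset and processing events as they arrive can be sketched in a few lines. This is a minimal illustration, not drawn from the original text: it assumes a simple running-average metric and invented helper names (`batch_average`, `StreamingAverage`).

```python
def batch_average(stored_events):
    """Batch style: re-scan the entire stored dataset on every query."""
    return sum(stored_events) / len(stored_events)

class StreamingAverage:
    """Streaming style: maintain a running aggregate, updated per event."""
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def update(self, event):
        self.total += event
        self.count += 1

    def value(self):
        return self.total / self.count

events = [10, 20, 30, 40]

# Batch: every new event forces a full re-query of all stored data (O(n) per query).
stored = []
for e in events:
    stored.append(e)
    batch_result = batch_average(stored)

# Streaming: each event costs O(1) work, and the result is always current.
avg = StreamingAverage()
for e in events:
    avg.update(e)

assert batch_result == avg.value() == 25.0
```

Both approaches arrive at the same answer; the difference is that the batch version's per-query cost grows with the size of the stored dataset, which is the latency tipping point described above.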