Month: October 2016


  • I wanted to run a job which runs 24×7 and reports if certain keywords occur more than N times in the stream. Spark Streaming looked like an ideal candidate for this task. Spark has a reduceByKeyAndWindow function, which was exactly what I was looking for (a hedged sketch follows this list). I decided to use a window length of 1 minute and…

  • I was working on a feature recently which needed a streaming job that runs 24×7 and processes 100 million rows per day. The Spark web UI is a wonderful tool for looking at how things are running internally. While debugging I noticed that the streaming jobs were getting allocated to only one machine (one common remedy is sketched after this list). Spark has…
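
For the first post, here is a minimal sketch of a windowed keyword alert built on reduceByKeyAndWindow. Only the 1-minute window comes from the excerpt; the socket source, the keyword list, the threshold N, and the 10-second batch interval are illustrative assumptions.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, Seconds, StreamingContext}

object KeywordAlert {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KeywordAlert")
    // 10-second batch interval (assumed); 1-minute window as in the post
    val ssc = new StreamingContext(conf, Seconds(10))

    // Hypothetical source: a socket stream of text lines
    val lines = ssc.socketTextStream("localhost", 9999)

    val keywords = Set("error", "timeout") // assumed keyword list
    val threshold = 100                    // assumed value of N

    val windowedCounts = lines
      .flatMap(_.split("\\s+"))
      .map(_.toLowerCase)
      .filter(keywords.contains)
      .map(word => (word, 1))
      // Sum counts per keyword over a sliding 1-minute window,
      // recomputed every 10 seconds
      .reduceByKeyAndWindow((a: Int, b: Int) => a + b, Minutes(1), Seconds(10))

    // Report keywords that exceeded the threshold in the current window
    windowedCounts
      .filter { case (_, count) => count > threshold }
      .print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```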
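
For the second post, the excerpt does not say how the single-machine allocation was resolved. One common remedy in receiver-based Spark Streaming is to create several receivers, union them, and repartition the merged stream; the sketch below assumes that approach, and the socket source, receiver count, and partition count are all hypothetical.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SpreadStreamingLoad {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SpreadStreamingLoad")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Assumed setup: several receiver-based input streams instead of one.
    // Each receiver occupies a core on some executor, so multiple receivers
    // let ingestion spread across machines rather than pile up on one.
    val numReceivers = 4
    val inputs = (1 to numReceivers).map(_ => ssc.socketTextStream("localhost", 9999))
    val merged = ssc.union(inputs)

    // Repartition so processing (not just receiving) is distributed across
    // the cluster instead of staying local to the receiver's executor.
    val balanced = merged.repartition(16)

    balanced.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```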