Apache Flume is a streaming-data framework in the Hadoop family. Server logs, Twitter feeds, and stock market price movements are typical examples of streaming data. Such data was earlier processed with conventional technologies and frameworks; since Hadoop came into existence, Apache Flume has played a key role in ingesting and processing streaming data.
This video by http://www.hadoopexam.com gives a clear picture of data ingestion before the data reaches Flume. It also explains the various components of Flume and how they work together in real-time scenarios. The middle of the video covers how Flume works internally and where it is implemented in real-world deployments. The explanation is very clear, and we hope this video gives you a good understanding of Apache Flume.
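To give a concrete feel for the components the video describes, below is a minimal sketch of a Flume agent configuration wiring together a source, a channel, and a sink. The agent name, file paths, and host names here are illustrative assumptions, not taken from the video:

```
# Hypothetical agent "agent1" with one source, one channel, one sink
agent1.sources  = src1
agent1.channels = ch1
agent1.sinks    = sink1

# Source: tail a server log (path is an assumption for illustration)
agent1.sources.src1.type = exec
agent1.sources.src1.command = tail -F /var/log/app/server.log
agent1.sources.src1.channels = ch1

# Channel: in-memory buffer between the source and the sink
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000

# Sink: deliver events into HDFS (namenode address is an assumption)
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = hdfs://namenode:8020/flume/logs
agent1.sinks.sink1.channel = ch1
```

The source produces events (here, log lines), the channel buffers them, and the sink drains the channel into HDFS; swapping the source or sink type is how Flume adapts to other streaming inputs such as Twitter feeds.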