Hadoop Apache Flume Introduction

Apache Flume is a streaming data framework in the Hadoop family. Server logs, Twitter feeds, and stock market price movements are common examples of streaming data. Such data was earlier processed with conventional technologies and frameworks; since Hadoop came into existence, Apache Flume has played a key role in ingesting and processing streaming data.

This video by http://www.hadoopexam.com gives a clear overview of data ingestion before the data reaches Flume. It also explains the various components of Flume and how they work together in real-time scenarios. The middle of the video covers how Flume works internally and where it is implemented in real-world deployments. The explanation is very clear, and we hope this video gives you a solid understanding of Apache Flume.

Apache Flume Introduction
1. Data Acquisition : Apache Flume Introduction
2. Apache Flume Components
3. POSIX and HDFS File Write
4. Flume Events
5. Interceptors, Channel Selectors, Sink Processor
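To make the components listed above concrete, here is a minimal sketch of a Flume agent configuration wiring a source, a channel, and a sink together. The agent name `agent1`, the log file path, and the HDFS output path are illustrative assumptions, not details from the video:

```properties
# Hypothetical single-node Flume agent "agent1" (names and paths are examples).
# Source: tails a server log; Channel: in-memory event buffer; Sink: writes to HDFS.
agent1.sources  = logSource
agent1.channels = memChannel
agent1.sinks    = hdfsSink

# Exec source emits each new log line as a Flume event (log path is an assumption)
agent1.sources.logSource.type = exec
agent1.sources.logSource.command = tail -F /var/log/app/server.log
agent1.sources.logSource.channels = memChannel

# Memory channel buffers events between the source and the sink
agent1.channels.memChannel.type = memory
agent1.channels.memChannel.capacity = 10000
agent1.channels.memChannel.transactionCapacity = 1000

# HDFS sink drains the channel into date-partitioned directories
agent1.sinks.hdfsSink.type = hdfs
agent1.sinks.hdfsSink.channel = memChannel
agent1.sinks.hdfsSink.hdfs.path = /flume/logs/%Y-%m-%d
agent1.sinks.hdfsSink.hdfs.fileType = DataStream
agent1.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
```

Such an agent would be started with `flume-ng agent --conf conf --conf-file agent1.conf --name agent1`; interceptors, channel selectors, and sink processors are additional properties layered onto this same source/channel/sink wiring.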

by http://www.HadoopExam.com


