
Can we use Hadoop technology to process data in a live streaming environment? For example, to process a data feed coming from a flight auto log system?

Question added by Abrar Mohd, Consulting Engineer, CISCO SYSTEMS
Date posted: 2014/02/27
By Sudheer J, Lead, Ford

Yes, Apache Flume can help in your case.

 

Apache Flume is a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of streaming data into the Hadoop Distributed File System (HDFS).

The use of Apache Flume is not restricted to log data aggregation. Flume can transport massive quantities of event data, including but not limited to network traffic data, social-media activity, email messages, and virtually any other data source; a minimal agent configuration is sketched after the list below.

Flume allows users to:

Stream data from multiple sources into Hadoop for analysis

Collect high-volume Web logs in real time

Guarantee data delivery

Scale horizontally to handle additional data volume
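
To make that concrete, here is a minimal sketch of a Flume agent configuration that tails a log file and lands the events in HDFS. The agent name (agent1), the log file path, and the HDFS path are assumptions for illustration only, not details taken from the question:

# agent1 has one source (tails the flight log), one memory channel, one HDFS sink
agent1.sources  = flightlog
agent1.channels = mem
agent1.sinks    = hdfssink

# Source: follow the log file as new lines arrive (path is an assumption)
agent1.sources.flightlog.type = exec
agent1.sources.flightlog.command = tail -F /var/log/flight/auto.log
agent1.sources.flightlog.channels = mem

# Channel: buffer events in memory between source and sink
agent1.channels.mem.type = memory
agent1.channels.mem.capacity = 10000

# Sink: roll files into HDFS under a date-partitioned directory
agent1.sinks.hdfssink.type = hdfs
agent1.sinks.hdfssink.channel = mem
agent1.sinks.hdfssink.hdfs.path = hdfs://namenode:8020/data/flightlogs/%Y-%m-%d
agent1.sinks.hdfssink.hdfs.useLocalTimeStamp = true
agent1.sinks.hdfssink.hdfs.fileType = DataStream

You would then start the agent with the flume-ng launcher, for example: flume-ng agent --conf conf --conf-file flight-agent.conf --name agent1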


Once the data is available in HDFS, you can write MapReduce jobs or Hive SQL queries on top of it to do the analysis (a small Hive sketch follows below). Hope this helps!
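
As a rough sketch of that analysis step, assuming the events Flume delivered are comma-separated lines (the table name, columns, and directory here are hypothetical, chosen only to match the config above):

-- Hypothetical external table over the directory Flume writes to
CREATE EXTERNAL TABLE flight_logs (
  event_time STRING,
  flight_id  STRING,
  sensor     STRING,
  value      DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/flightlogs';

-- Example analysis: average sensor reading per flight
SELECT flight_id, sensor, AVG(value) AS avg_value
FROM flight_logs
GROUP BY flight_id, sensor;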

By Gayasuddin Mohammed, Advocate, Practicing Law before High Court at Hyderabad

No. To my knowledge, Hadoop technology is not meant for processing data in a live streaming environment. Thanks.
