Can we use Hadoop technology to process data in a live streaming environment? For example, to process a data feed coming from a flight auto-log system?

Question added by Abrar Mohd, Consulting Engineer, CISCO SYSTEMS
Date posted: 2014/02/27
Answer by Sudheer J, Lead, Ford

Yes, Apache Flume helps in your case.

 

Apache Flume is a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of streaming data into the Hadoop Distributed File System (HDFS).

The use of Apache Flume is not restricted to log data aggregation. Flume can be used to transport massive quantities of event data, including, but not limited to, network traffic data, social-media-generated data, email messages, and pretty much any other data source.

Flume allows users to:

Stream data from multiple sources into Hadoop for analysis

Collect high-volume Web logs in real time

Guarantee data delivery

Scale horizontally to handle additional data volume
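
As an illustration, below is a minimal sketch of a Flume agent configuration that tails a log file and lands the events in HDFS. The agent name (a1), the log file path, and the NameNode address are hypothetical placeholders; the actual source type would depend on how the flight auto-log system exposes its feed.

# Hypothetical Flume agent "a1": tail a flight log file and write the events into HDFS
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: follow the (assumed) flight auto-log file as new lines are appended
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/flight/autolog.txt
a1.sources.r1.channels = c1

# Channel: in-memory buffer between the source and the sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
a1.channels.c1.transactionCapacity = 1000

# Sink: write the buffered events into HDFS as plain text streams
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/data/flight_logs
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.rollInterval = 300

The agent would then be started with "flume-ng agent --conf conf --conf-file flight-agent.conf --name a1", after which new log lines accumulate under /data/flight_logs in HDFS.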

 

 

Once the data is available in HDFS, you can write MapReduce jobs or Hive SQL queries on top of it to do the analysis. Hope this helps!
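
To illustrate the Hive part, here is a hedged sketch of what that analysis could look like. The table name, columns, and field delimiter are assumptions about the feed format, and the LOCATION matches the HDFS directory used in the Flume sketch above.

-- Hypothetical external table over the directory Flume writes to
CREATE EXTERNAL TABLE flight_logs (
  event_time STRING,
  flight_id  STRING,
  altitude   INT,
  status     STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/flight_logs';

-- Example query: number of log events recorded per flight
SELECT flight_id, COUNT(*) AS events
FROM flight_logs
GROUP BY flight_id;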

Answer by Gayasuddin Mohammed, Advocate, Practicing Law before High Court at Hyderabad

No. To my knowledge, Hadoop technology is not meant for processing data in a live streaming environment. Thanks.
