
Cloud Computing enables the future of Big Data since it allows data to be standardized. True or False? Briefly explain your answer.

Question added by Themali Silva, Advisor IT Technology Partnership, Air Canada
Posted on: 2014/04/09
by Mostafa Abdo, Senior Infrastructure and Security Architect, Devoteam

True.

Cloud computing -- a consumer/delivery model where information technology (IT) capabilities are offered as services billed based on usage -- has brought big data analysis to the masses by giving businesses access to vast amounts of computing resources on demand. As the technology continues to advance, the question for many businesses is how they can benefit from big data and how to use cloud computing to make it happen.

by Sudheer J, Lead, Ford

Yes, it's true that Cloud Computing enables the future of Big Data. Cloud Computing provides potential computing solutions for the management, discovery, access, and processing of Big Data to support intuitive decision-making.

Both technologies share intrinsic features such as distribution, parallelization, and being dispersed across space, time, and geography. Utilizing these intrinsic features helps provide Cloud Computing solutions that give Big Data the computing infrastructure capability to process and obtain unprecedented information.

 

We can leverage the Big Data stack below, which allows data to be standardized (a minimal sketch follows the list).

Foundational layer - The cloud provides the scalable persistence and compute power needed to manufacture data products.

Middle layer - Where features are extracted from data and fed into classification and prediction algorithms.

Top layer - Services and applications. This is the level at which consumers experience a data product, whether it be a music recommendation or a traffic route prediction.
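
As a rough illustration, here is a minimal Python sketch of that three-layer flow. The data, the function names (load_events, extract_features, recommend), and the trivial "most played" recommender are all hypothetical stand-ins for real cloud storage, feature pipelines, and models.

```python
from collections import Counter

# Foundational layer: scalable persistence, stubbed here as an in-memory
# list standing in for cloud object storage or a distributed file system.
EVENT_STORE = [
    {"user": "alice", "track": "song-a"},
    {"user": "alice", "track": "song-b"},
    {"user": "alice", "track": "song-a"},
]

def load_events(user):
    return [e for e in EVENT_STORE if e["user"] == user]

# Middle layer: extract features and feed them to a (trivial) predictor.
def extract_features(events):
    return Counter(e["track"] for e in events)

# Top layer: the service a consumer actually experiences.
def recommend(user):
    plays = extract_features(load_events(user))
    most_played, _ = plays.most_common(1)[0]
    return f"Because you like {most_played}, try more tracks like it."

if __name__ == "__main__":
    print(recommend("alice"))
```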

by Rateb Abu Hawieleh, Solution Sales Specialist Cloud and Datacenter, Microsoft

True

by Nassreldeen Ibrahim Eltayp Basheer, Senior Business Solution Support Engineer, MTN Sudan

True.

First of all, let us have a look at Big Data. The data are collected from different devices on the Internet, from applications, and from people.

It is definitely a huge amount of data, and it is a challenge to provide storage that accommodates all of it in a secure manner. It is also an opportunity: such a huge data set benefits research and can help with some decisions, and it motivates building technology that lets users get value from it.

The goal is to help end-user organizations align their cloud-based big-data initiatives with key strategic imperatives in a fast-changing competitive and operational environment. Deploying and managing big data in the cloud, including governance, compliance, security, resiliency, and other enterprise-grade controls, leads to a standard cloud that can deal with both public and premises-based deployments. Practical applications and use cases of cloud-based big data span diverse industries, including government, healthcare, finance, communications, retailing, and marketing services.

Virtualization is used to divide this huge data into smaller pieces depending on some criteria, as a symbol of standardization (see the partitioning sketch at the end of this answer).

This is from the data viewer's perspective.
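
To illustrate that idea, here is a minimal Python sketch of hash partitioning, one common criterion for splitting a huge data set into smaller, evenly distributed pieces. The record layout and the partition count are hypothetical.

```python
import hashlib

NUM_PARTITIONS = 4

def partition_of(key: str) -> int:
    # Stable hash so the same key always lands in the same partition.
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_PARTITIONS

records = [
    {"device_id": "sensor-1", "reading": 21.5},
    {"device_id": "sensor-2", "reading": 19.0},
    {"device_id": "phone-7", "reading": 33.2},
]

# Divide the big data set into smaller partitions by device key.
partitions = {i: [] for i in range(NUM_PARTITIONS)}
for record in records:
    partitions[partition_of(record["device_id"])].append(record)

for i, chunk in partitions.items():
    print(f"partition {i}: {chunk}")
```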

by Arindam Majumder, Enterprise Architect, Multinational Company

True.

 

It's really difficult to tackle a Big Data initiative without Cloud Computing.

 

WHY? Suppose a telecom company wants to analyze trends in its log files, an exercise requiring processing power equivalent to 100 servers, with no certainty that it will actually produce the view the company is looking for.

 

Since it is a one-time exercise, you cannot purchase such a giant infrastructure; otherwise you will have to bear a huge depreciation cost. There is no ROI.

 

What to do: build this infrastructure in a Public Cloud in on-demand mode. Run the exercise for one month. Pay only for the duration you use the infrastructure; the rest of the time, simply switch off the environment, and you spend nothing extra except the storage cost.

 

Once your exercise is done, terminate the infrastructure you created on the Public Cloud.
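
A minimal sketch of that launch-and-terminate lifecycle, assuming AWS EC2 through the boto3 library; the AMI ID, instance type, and region are placeholder assumptions, not values from the answer above.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

# Launch 100 on-demand instances for the one-off analysis.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="m5.xlarge",         # placeholder instance size
    MinCount=100,
    MaxCount=100,
)
instance_ids = [i["InstanceId"] for i in response["Instances"]]

# ... run the log-analysis exercise on the fleet ...

# Terminate everything when the exercise is done: billing stops.
ec2.terminate_instances(InstanceIds=instance_ids)
```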

 

ZERO CAPEX (Capital Expenditure)

ZERO Depreciation Cost

Only OPEX (Operating Expenditure)
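
To make the CAPEX-versus-OPEX point concrete, here is a back-of-the-envelope comparison in Python; every number (server price, hourly rate, fleet size) is a hypothetical illustration, not a quote.

```python
# Hypothetical one-month exercise on 100 servers: buy versus rent.

servers = 100
purchase_price = 5_000        # assumed cost per server (USD)
hourly_rate = 0.50            # assumed on-demand rate per server-hour (USD)
hours_used = 30 * 24          # one month, running around the clock

capex = servers * purchase_price            # owned hardware, then depreciation
opex = servers * hourly_rate * hours_used   # cloud bill that stops at terminate

print(f"CAPEX (buy): ${capex:,.0f} up front, plus depreciation")
print(f"OPEX (rent): ${opex:,.0f} total, nothing afterwards")
```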

 

Another example could come from the Banking, Oil and Gas, Retail, Manufacturing, or even Pharmaceutical sectors.

 

Moreover, by selecting the right Public Cloud, you can control the OPEX as well.

 

Hope I have answered your question.

 

If you need any further details, let me know.

 

Best Regards

by Muhammad Anzar, DevOps/DevSecOps Architect, Confidential

Absolutely True.

 

Big Data refers to technologies and initiatives that involve data that is too diverse, fast-changing, or massive for conventional technologies, skills, and infrastructure to address efficiently. Said differently, the volume, velocity, or variety of the data is too great.

But today, new technologies make it possible to realize value from Big Data. For example, retailers can track user web clicks to identify behavioral trends that improve campaigns, pricing, and stocking. Utilities can capture household energy usage levels to predict outages and to incentivize more efficient energy consumption. Governments and even Google can detect and track the emergence of disease outbreaks via social media signals. Oil and gas companies can take the output of sensors in their drilling equipment to make drilling decisions that are more efficient and safer. "Big Data" describes data sets so large and complex that they are impractical to manage with traditional software tools.

Specifically, Big Data relates to data creation, storage, retrieval and analysis that is remarkable in terms of volume, velocity, and variety:

  • Volume. A typical PC might have had 10 gigabytes of storage in 2000. Today, Facebook ingests 500 terabytes of new data every day; a Boeing 737 generates 240 terabytes of flight data during a single flight across the US; the proliferation of smartphones multiplies the data they create and consume; and sensors embedded into everyday objects will soon produce billions of new, constantly updated data feeds containing environmental, location, and other information, including video.

  • Velocity. Clickstreams and ad impressions capture user behavior at millions of events per second; high-frequency stock-trading algorithms reflect market changes within microseconds; machine-to-machine processes exchange data between billions of devices; infrastructure and sensors generate massive log data in real time; and online gaming systems support millions of concurrent users, each producing multiple inputs per second.

  • Variety. Big Data isn't just numbers, dates, and strings. It is also geospatial data, 3D data, audio and video, and unstructured text, including log files and social media. Traditional database systems were designed for smaller volumes of structured data, fewer updates, and a predictable, consistent data structure. They were also designed to operate on a single server, making increased capacity expensive and finite. As applications have evolved to serve large volumes of users, and as application development practices have become agile, the traditional relational database has become a liability for many companies rather than an enabler of their business. Big Data databases, such as MongoDB, solve these problems and give companies the means to create tremendous business value.
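
As a small illustration of how a document database absorbs that variety, here is a minimal sketch using pymongo; it assumes a MongoDB instance at localhost:27017, and the database, collection, and field names are hypothetical.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
events = client["bigdata_demo"]["events"]          # hypothetical names

# One collection accepting records of very different shapes, something a
# fixed relational schema would struggle with.
events.insert_many([
    # Structured numeric reading.
    {"type": "sensor", "device": "thermo-1", "celsius": 21.5},
    # Geospatial point (GeoJSON).
    {"type": "geo", "loc": {"type": "Point", "coordinates": [-73.97, 40.77]}},
    # Unstructured social-media text.
    {"type": "post", "text": "Traffic is terrible on I-95 this morning."},
])

# Query across the mixed shapes with the same API.
for doc in events.find({"type": "sensor"}):
    print(doc["device"], doc["celsius"])
```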
