Today, in every 60 seconds, we have:
• 10,000+ tweets.
• 100,000+ Facebook status updates.
• 10 million+ instant messages.
• 100,000+ Google searches.
• 100 million+ emails sent.
• 1,000 TB+ of data created.
• 100+ new mobile web users.
This means an enormous amount of data is generated every second, and we can only imagine how much of it is generated and managed every day. It is growing exponentially with the increasing use of devices and digitization.
The traditional database management applications developed so far cannot handle this much data, even if organizations have designed and implemented them with scale in mind.
There are also limits on how far factors like CPU, RAM and storage on a single machine can be scaled up. Traditional designs, architectures, systems and databases would not be able to support the increasing demand from companies and clients to handle such data.
Not only is the size of the data large, but the data itself is becoming complex, which can be characterized as below:
Volume: Companies are producing an enormous amount of data every day. This data needs to be stored, processed and analyzed in order to understand the market, trends, customers and their problems, along with the solutions.
Variety: Data is generated from different sources in different forms, such as videos, text, images, emails, binaries and much more, and most of it is unstructured or semi-structured. The traditional data systems we know all work on structured data, so it is quite difficult for those systems to handle the quality and quantity of data we are producing nowadays.
Velocity: With 100 million+ records, even a simple query to get a list of persons would take a long time to execute. And here we are talking about analyzing and processing data in the range of hundreds and thousands of petabytes, exabytes and more. So we have to build a system that processes data at much higher speed and with high scalability.
What is bigData?
In simple terms, a dataset whose volume, velocity, variety and complexity are beyond the ability of commonly used tools to capture, store, manage, process and analyze it can be termed bigData.
How does bigData help with the complex scenarios discussed above?
• Data distribution.
• Parallel processing.
• Fault tolerance.
• Flexibility and scalability.
A few of the widely used frameworks for implementing bigData are listed below, followed by a short sketch showing how the data distribution and parallel processing mentioned above look in practice:
• Hadoop.
• Spark.
• Presto.
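As a minimal sketch (assuming PySpark is installed; the input path and app name are hypothetical), this is roughly how Spark spreads a large text file across a cluster and counts words on all partitions in parallel:

from pyspark.sql import SparkSession

# Start a Spark session; on a real cluster this connects to the cluster manager.
spark = SparkSession.builder.appName("WordCountSketch").getOrCreate()

# Read a (potentially huge) text file; Spark splits it into partitions
# that are distributed across the worker nodes (hypothetical path).
lines = spark.sparkContext.textFile("hdfs:///data/tweets.txt")

# These transformations run in parallel on every partition, and Spark
# transparently recomputes any partition lost to a node failure.
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

# Bring only the ten most frequent words back to the driver.
for word, count in counts.takeOrdered(10, key=lambda pair: -pair[1]):
    print(word, count)

spark.stop()

The same code scales by simply adding worker nodes, which is the flexibility and scalability listed above.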
If your web or mobile application involves any of the business scenarios below, you will surely need a bigData architecture, and at LetsNurture we can help you meet such requirements:
• Face recognition and comparison applications.
• Video analysis and sensor-based applications.
• Real-time and CCTV camera based applications.
• Live GPS-based applications, like traffic updates.
• Healthcare applications, like recording heartbeats.
• Large sales volume, prediction and equity-based applications.
• eBook, music and video recording applications.