Beyond Volume, Variety and Velocity

Beyond the classic 3Vs of big data (Volume, Variety, and Velocity), there are additional Vs that IT, business, and data science teams need to be concerned with, most notably Veracity, Validity, and Volatility.

Volume

Big data implies enormous volumes of data. Data used to be created mostly by users; now it is generated by machines, networks, and human interaction on systems like social media, so the volume of data to be analyzed is massive. Yet volume is less of a problem than other Vs such as veracity.

Variety

Variety refers to the many sources and types of data, both structured and unstructured. We used to store data from sources like spreadsheets and databases. Now data arrives as emails, photos, videos, monitoring-device output, PDFs, audio, and more. This variety of unstructured data creates challenges for storing, mining, and analyzing data.

Velocity

Big data velocity deals with the pace at which data flows in from sources like business processes, machines, networks, and human interaction with social media sites, mobile devices, and so on. The flow of data is massive and continuous. This real-time data can help researchers and businesses make valuable decisions that deliver strategic competitive advantage and ROI, provided they can handle the velocity. Sampling the data, as in the sketch below, is one way to cope with volume and velocity.
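For instance, reservoir sampling keeps a fixed-size, uniformly random subset of a stream without ever storing the whole stream. This is a minimal Python sketch; the event generator and the sample size of 1,000 are made up for the example.

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of k items from a stream of unknown length."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)
        else:
            # Replace an existing item with decreasing probability k / (i + 1)
            j = random.randint(0, i)
            if j < k:
                sample[j] = item
    return sample

# Example: sample 1,000 events from a simulated high-velocity stream
events = (f"event-{n}" for n in range(1_000_000))
subset = reservoir_sample(events, 1_000)
```

Because the reservoir never grows past k items, memory stays constant no matter how fast or how long the stream runs.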

Veracity

Big data veracity refers to the biases, noise, and abnormality in data. Is the data being stored and mined actually meaningful to the problem under analysis? Veracity is the biggest challenge in data analysis when compared to concerns like volume and velocity. In scoping out your big data strategy, have your team and partners work to keep your data clean, and put processes in place, like the filter sketched below, to keep 'dirty data' from accumulating in your systems.
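One such process is a validation filter at ingestion time that rejects records failing basic sanity checks. This is a minimal Python sketch; the field names (user_id, age, source) and the rules themselves are hypothetical.

```python
def is_clean(record):
    """Hypothetical veracity checks: reject records that look noisy or abnormal."""
    if record.get("user_id") is None:                 # missing identifier
        return False
    age = record.get("age")
    if not isinstance(age, (int, float)) or not 0 <= age <= 120:  # implausible value
        return False
    if record.get("source") not in {"web", "mobile", "api"}:      # unknown origin
        return False
    return True

incoming = [
    {"user_id": 1, "age": 34, "source": "web"},
    {"user_id": None, "age": 34, "source": "web"},    # dirty: missing id
    {"user_id": 2, "age": 340, "source": "mobile"},   # dirty: abnormal age
]
clean = [r for r in incoming if is_clean(r)]  # only the first record survives
```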

Validity

Closely related to veracity is the issue of validity: is the data correct and accurate for its intended use? Clearly, valid data is key to making the right decisions. IBM's big data strategy and tools claim to help with both data veracity and validity.
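Validity is usually checked against the intended use rather than in the abstract. The hypothetical Python sketch below tests whether a record is usable for a made-up, USD-only daily sales report: the date must parse, the amount must be numeric and non-negative, and the currency must match.

```python
from datetime import datetime

def is_valid_for_reporting(record):
    """Hypothetical validity checks: is this record usable for a USD daily sales report?"""
    try:
        datetime.strptime(record["date"], "%Y-%m-%d")  # date must parse in ISO form
        amount = float(record["amount"])               # amount must be numeric
    except (KeyError, ValueError):
        return False
    return amount >= 0 and record.get("currency") == "USD"

print(is_valid_for_reporting({"date": "2024-03-01", "amount": "19.99", "currency": "USD"}))  # True
print(is_valid_for_reporting({"date": "03/01/2024", "amount": "19.99", "currency": "USD"}))  # False
```

Note that a record can pass veracity checks (no noise, plausible values) and still fail validity for a particular analysis, which is why the two are worth treating separately.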

Volatility

Big data volatility refers to how long data remains valid and how long it should be stored. In a world of real-time data, you need to determine the point at which data is no longer relevant to the current analysis.
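In practice this often takes the form of a retention policy: records older than some window are treated as stale and excluded or purged. A minimal Python sketch, assuming a hypothetical 90-day window and records that carry a timestamp field:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # hypothetical policy: data older than 90 days is stale

def still_relevant(record, now):
    """Keep only records whose timestamp falls inside the retention window."""
    return now - record["timestamp"] <= RETENTION

now = datetime.now(timezone.utc)
records = [
    {"id": 1, "timestamp": now - timedelta(days=10)},   # kept
    {"id": 2, "timestamp": now - timedelta(days=400)},  # expired
]
current = [r for r in records if still_relevant(r, now)]
```

Many datastores can enforce the same idea natively via per-record TTLs, so the cutoff does not have to be applied at query time.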

Big data clearly involves issues beyond volume, variety, and velocity, extending to concerns like veracity, validity, and volatility.

Follow the Big Data Innovation Summit on Twitter at #BIGDBN for more info.

Reference: www.insidebigdata.com