The word zettabyte (44 zettabytes is 44 trillion gigabytes), or Big Data, continues to stare players in the digital ecosystem in the face, leaving us scratching our heads! Today it has become a magic buzzword in the digital landscape. Further, it constitutes a distinguished resource and a significant variable that shapes our lives and economic orientation in multiple dimensions.
If the indicators of global data growth are not frightening, and a wake-up call demanding our urgent attention to the singularity challenge, I do not know what else can be, and ultimately will!
Okay, if that is not enough, let us bring the topic up to speed.
“Today it would take a person approximately 181 million years to download all the data from the Internet.” (Source: Physics.org).
The puzzling question of how long it would take an individual to download the content of the Internet was resolved by Physics.org as follows:
“The source used the following values: 0.55 zettabytes for all the information on the internet, and 44 Mbps as the average download speed. However, since these statistics have changed, we redid the calculation with 33 zettabytes of data and an average download speed of 46 Mbps. The result we got was around 181.3 million years!”
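The quoted figure checks out as a back-of-the-envelope calculation. A minimal sketch, using only the constants quoted above (small differences from 181.3 million come down to the exact constants the source used):

```python
# Rough check of the Physics.org download-time figure.
DATA_ZETTABYTES = 33   # total data on the internet (figure quoted above)
SPEED_MBPS = 46        # average download speed (figure quoted above)

data_bits = DATA_ZETTABYTES * 10**21 * 8   # 1 ZB = 10^21 bytes, 8 bits per byte
speed_bps = SPEED_MBPS * 10**6             # 1 Mbps = 10^6 bits per second
seconds = data_bits / speed_bps
years = seconds / (365.25 * 24 * 3600)     # seconds in an average year

print(f"{years / 1e6:.1f} million years")  # roughly 181-182 million years
```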
The revelation and the message are clear. We cannot continue to close our eyes and behave as if tomorrow's digital data disaster will never come. Oh yes, it will, and sorry for any nation that neglects this call! We urgently need a radical digital data-cleansing policy and strategy at every national operational level, in both the public and private sectors of the economy.
Unless serious attention is given to resolving the challenges of real data checks, cleansing and critical updates, the missing gaps may widen further and continue to haunt and mislead our national development strategies. Perhaps, if we get it right, those dimensions may systematically set our compass in the right direction.
If not, the gaps, amid mind-boggling quantum growth in data, may drive us into a misguided economic orientation and a destination of perpetual consumerism. They can make us lose focus on economic realities and reliability, and inflate the challenges and complexities of the national mission.
Further, they can somersault our leadership (aiding and abetting regime change) and institutional goals; disrupt education, healthcare delivery, food production and food security, and governance; and subvert our culture and sustainable development. Indeed, they can make us a sitting duck in the face of our adversaries.
Therefore, it is important to acknowledge that there are many sources of error and ambiguity in the process of generating, analysing, storing and applying data in our database systems for constructive decision-making. In particular, the complexity and growth of zettabytes, compounded by a shortage of the requisite skillsets, constitute a monumental challenge to the progress of the digital economy and sustainable national development.
How much do we really understand and appreciate the critical relevance and mastery of the software ecosystem, especially in resolving current and future challenges of data transformation and the digital economy for national development, the creation of wealth, and national security?
This invokes our inquest to explore what, indeed, data is. Informed by reliable sources: “Data is any sequence of one or more symbols given meaning by specific acts of interpretation. Data requires interpretation to become information. To translate data to information, there must be several known factors considered. The factors involved are determined by the creator of the data and the desired information.” (Wikipedia). This translates to the fact that we are confronted with the complex task of closing the data-transformation gaps in our institutional framework, to deliver assurances for the attainment of a sustainable digital economy.
Zettabytes, in this context, refer to the enormous data sets gathered from numerous sources, offering us the inevitable insight, and the mirror needed to clear our beclouded mindset on national orientation, development and security.
Data accumulation and collection are not enough. Indeed, they may not lead to the promised land unless the data is structured and distilled into information for critical decision-making assurance. This is followed by domain knowledge extracted from the distilled information, which is further subjected to multi-dimensional analytics to arrive at core intelligence.
To achieve the foregoing, constructive policy, a strategic framework and standards become a critical imperative. The central goal of deploying a data-cleansing model is to improve data assurance, quality and utility by identifying and correcting errors before the data is transferred to a target database.
However, in our case, this process is further compounded, particularly by a predominantly manual process of data creation and movement, as is currently the case in Nigeria. Manual cleansing of data is not only inadequate but dangerous and mostly misleading. That is why we require, as a matter of urgency, the building of commensurate capacities and capabilities in a specialised digital data workforce.
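To illustrate what a data-cleansing model does before records reach a target database, here is a minimal, purely illustrative sketch: it normalises fields, drops invalid rows, and removes duplicates. All field names and records are hypothetical, invented for this example, not drawn from any national database.

```python
def clean_records(records):
    """Normalise, validate and de-duplicate raw records
    before loading them into a target database."""
    seen = set()
    cleaned = []
    for rec in records:
        # Normalise whitespace and casing in the name field.
        name = rec.get("name", "").strip().title()
        age = rec.get("age")
        # Drop rows with missing names or implausible ages.
        if not name or not isinstance(age, int) or not (0 <= age <= 120):
            continue
        # Drop duplicates that only differ in formatting.
        key = (name, age)
        if key in seen:
            continue
        seen.add(key)
        cleaned.append({"name": name, "age": age})
    return cleaned

raw = [
    {"name": "  ada obi ", "age": 34},
    {"name": "Ada Obi", "age": 34},       # duplicate after normalisation
    {"name": "", "age": 41},              # missing name
    {"name": "Chinedu Eze", "age": 250},  # implausible age
]
print(clean_records(raw))  # [{'name': 'Ada Obi', 'age': 34}]
```

Even this toy example shows why manual cleansing does not scale: every rule applied here by a few lines of code would otherwise have to be applied by hand to every record.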
In 2019, there were over 4.4 billion Internet users. The world's accumulated data is projected to grow to 44 zettabytes (that is 44 trillion gigabytes)! For comparison, today it is about 4.4 zettabytes. Revenues generated worldwide by Big Data analytics (BDA) were $42 billion in 2018; by 2027, they are projected to increase to $103 billion, at a CAGR of 10.5% until then!
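The two revenue figures and the growth rate are mutually consistent, as a quick compound-growth check shows (a sketch using only the numbers quoted above):

```python
# Sanity check: $42B in 2018 compounding at a 10.5% CAGR to 2027.
revenue_2018 = 42.0   # USD billions (figure quoted above)
cagr = 0.105          # 10.5% compound annual growth rate
periods = 2027 - 2018 # 9 compounding periods

revenue_2027 = revenue_2018 * (1 + cagr) ** periods
print(f"${revenue_2027:.1f} billion")  # about $103 billion
```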
As we are aware, the domain of Big Data sets presents an unusual challenge of its own. This is because its exploration and assurance analytics are impossible without superior tools, predominantly due to the abysmally large quantity and complexity of the data. So, a variety of tools is used to analyse zettabytes: NoSQL databases, Hadoop, Spark, etc. (EMC, IDC).
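What makes tools like Hadoop and Spark scale is the map-reduce pattern: each machine processes its own slice of the data independently, and the partial results are then merged. A toy, single-machine sketch of that pattern (this is an illustration of the idea, not the actual Hadoop or Spark API):

```python
from collections import Counter
from functools import reduce

docs = ["big data needs big tools",
        "data drives the digital economy"]

# Map phase: each document independently emits its own word counts.
# In a real cluster, each of these would run on a different machine.
mapped = [Counter(doc.split()) for doc in docs]

# Reduce phase: the partial counts are merged into one global result.
totals = reduce(lambda a, b: a + b, mapped, Counter())

print(totals.most_common(2))  # [('big', 2), ('data', 2)]
```

Because the map phase never needs to see more than one slice at a time, the same pattern works whether the input is two sentences or two zettabytes spread over thousands of machines.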
Some big-time Big Data statistics for your information and actionable appreciation:
– The Big Data analytics market is set to reach $103 billion by 2023
– In 2019, the Big Data market was projected to grow by 20%
– In 2020, every person will generate 1.7 megabytes of data every second
– Internet users generate about 2.5 quintillion bytes of data each day
– As of September 2019, there were 2.45 billion active Facebook users, and they generate a lot of data
– 97.2% of organisations are investing in Big Data and AI
– Using Big Data, Netflix saves $1 billion per year on customer retention (Tech Jury)