
By Byron V. Acohido
The amount of data in the world topped an astounding 59 zettabytes in 2020, much of it pooling in data lakes.
Related: The importance of basic research
We’ve barely scratched the surface of applying artificial intelligence and advanced data analytics to the raw data accumulating in these gargantuan cloud-storage structures erected by Amazon, Microsoft and Google. But it’s coming, in the form of driverless cars, climate-restoring infrastructure and next-gen healthcare technology.
To get there, one big technical hurdle must be surmounted: a new form of agile cryptography must be established to robustly preserve privacy and security as all this raw data is put to commercial use.
I recently had the chance to discuss this with Kei Karasawa, vice president of strategy, and Fang Wu, consultant, at NTT Research, a Silicon Valley-based think tank that is in the thick of deriving the math formulas that will get us there.
They outlined why attribute-based encryption, or ABE, has emerged as the basis for the new form of agile cryptography we will need to kick digital transformation into high gear.
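The core idea behind ABE is that a ciphertext carries an access policy and a user's key carries attributes; decryption succeeds only when the attributes satisfy the policy. Here is a toy sketch of that access logic in Python. It is purely illustrative (a real ABE scheme enforces the policy mathematically with pairing-based cryptography, not a set comparison), and all names and attribute labels are hypothetical:

```python
# Toy model of attribute-based encryption's access logic.
# NOT real cryptography: in actual ABE the policy check is enforced
# by the math of the encryption scheme itself.
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Ciphertext:
    payload: str          # would be actual encrypted bytes in real ABE
    policy: frozenset     # attributes required to decrypt


@dataclass(frozen=True)
class UserKey:
    attributes: frozenset  # attributes embedded in this user's key


def encrypt(data: str, required_attributes: set) -> Ciphertext:
    """Tag data with an access policy (here: a set of required attributes)."""
    return Ciphertext(payload=data, policy=frozenset(required_attributes))


def decrypt(ct: Ciphertext, key: UserKey) -> Optional[str]:
    """Succeeds only if the key's attributes satisfy the ciphertext's policy."""
    if ct.policy <= key.attributes:
        return ct.payload
    return None


record = encrypt("patient vitals", {"clinician", "cardiology"})
doctor = UserKey(attributes=frozenset({"clinician", "cardiology", "on-call"}))
intern = UserKey(attributes=frozenset({"clinician"}))

print(decrypt(record, doctor))   # policy satisfied: data is recovered
print(decrypt(record, intern))   # policy not satisfied: returns None
```

The appeal for data lakes is visible even in this sketch: access rules travel with the data itself, rather than living in a separate access-control layer that every downstream consumer must replicate.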
For a drill down on our discussion, please give the accompanying podcast a listen. Here are the key takeaways:
Cloud exposures
Data lakes continue to swell because every human, on average, creates 1.7 megabytes of fresh data each second of every day. These are the rivulets feeding the data lakes.
A zettabyte equals one trillion gigabytes. Big data just keeps getting bigger. And we humans crunch as much of it as we can by applying machine learning and artificial intelligence to derive cool new digital services. But we’re going to need the help of quantum computers to get to the really amazing stuff, and that hardware is coming.
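As a quick unit check on the figures above, one zettabyte is 10^21 bytes, which is indeed one trillion (10^12) gigabytes, so 2020's 59 zettabytes works out to 5.9 × 10^13 gigabytes:

```python
# Unit check: 1 zettabyte = 10**21 bytes = one trillion gigabytes.
GB = 10**9   # bytes in a gigabyte
ZB = 10**21  # bytes in a zettabyte

assert ZB // GB == 10**12  # 1 ZB = 1 trillion GB

total_2020_zb = 59
print(f"59 ZB = {total_2020_zb * ZB // GB:.2e} GB")
```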
As we press ahead into our digital future, however, we’ll also need to retool the public-key infrastructure. PKI is the authentication and encryption framework …