The new National Institute of Standards and Technology Big Data Interoperability Framework has us talking a big game.
A big game about securing Big Data, that is.
In the new framework, NIST cites several ways in which securing Big Data differs from traditional security and privacy implementations.
NIST summarizes the difference like this:
"Big Data is increasingly stored on public cloud infrastructure built by employing various hardware, operating systems, and analytical software. Traditional security approaches usually addressed small-scale systems holding static data on firewalled and semi-isolated networks."
Now let's look at some helpful specifics.
NIST identifies eight major characteristics that set Big Data projects apart and make these projects a distinct security and privacy challenge.
Maybe you are a data scientist trying to get a handle on Big Data security and privacy, or perhaps you are a cybersecurity leader trying to get a handle on the same thing.
Either way, expect record amounts of Big Data to secure. According to NIST:
"Data generation is expected to double every two years to about 40,000 exabytes in 2020. It is estimated that over one-third of the data in 2020 could be valuable if analyzed.
Less than a third of data needed protection in 2010, but more than 40 percent of data will need protection in 2020."
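To get a feel for what "doubling every two years" implies, here is a minimal sketch that works backward from NIST's 40,000-exabyte 2020 figure (the function name and parameters are illustrative, not from the framework):

```python
def projected_exabytes(year, base_year=2020, base_eb=40_000):
    """Project global data volume for a given year, assuming it doubles
    every two years and reaches ~40,000 exabytes in 2020 (per NIST)."""
    return base_eb * 2 ** ((year - base_year) / 2)

# Back-project the estimate across the prior decade.
for y in range(2010, 2022, 2):
    print(f"{y}: ~{projected_exabytes(y):,.0f} EB")
```

Under that doubling assumption, the 2010 volume would have been roughly 1,250 exabytes, about 3 percent of the projected 2020 total.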
All the more reason to check out the new NIST framework for Big Data.
The framework's Big Data Security and Privacy section is also worth a look.
[RELATED: 5 Things to Know About the NIST Cybersecurity Framework]