Securiti this week announced it has extended its machine learning-based framework for managing and securing data to include an integration with the cloud service provided by Snowflake.
Designed to run natively on the Snowflake cloud, Securiti for Snowflake provides the data governance tools required to protect data, ensure privacy, and achieve compliance using a single platform. Previously, IT teams would have to either manually manage data stored in the Snowflake cloud or deploy separate data management, protection, privacy, and compliance tools, said Securiti CEO Rehan Jalil. “There is no need to do this in a piecemeal fashion,” he said.
The Securiti platform employs machine learning algorithms and other forms of AI to scan petabytes of data to identify sensitive information that could fall under regulations such as the European Union’s General Data Protection Regulation (GDPR) or the U.S. Health Insurance Portability and Accountability Act (HIPAA). Securiti then catalogs that data, structured or unstructured, in a way that allows the platform to surface People Data Graphs that identify who in the organization created that data and who is currently responsible for governing it.
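To make the classification step concrete, here is a deliberately simplified, rule-based sketch of scanning text for sensitive data. The categories and regular expressions are assumptions for illustration only; Securiti's platform uses machine learning classifiers at petabyte scale, not hand-written rules like these.

```python
import re

# Illustrative patterns for a few common categories of sensitive data.
# These regexes are assumptions for the sketch, not Securiti's models.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def classify(text: str) -> dict[str, list[str]]:
    """Return every match found in the text, grouped by category."""
    return {
        category: pattern.findall(text)
        for category, pattern in PATTERNS.items()
        if pattern.search(text)
    }

record = "Contact Jane at jane.doe@example.com or 555-867-5309; SSN 123-45-6789."
print(classify(record))
# → {'email': ['jane.doe@example.com'], 'phone': ['555-867-5309'], 'ssn': ['123-45-6789']}
```

A real classifier also has to handle unstructured formats, ambiguous matches, and languages where fixed patterns fail, which is why ML models are used in practice.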
IT teams can also apply fine-grained access controls that not only limit access but also dynamically mask sensitive data based on the policies the IT team defines. Similarly, privacy rules can be enforced based on rights, permissions, and consents.
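The dynamic masking described above can be sketched as a policy function that decides, per request, whether to return a raw or redacted value. The role names and redaction rule here are assumptions for illustration, not Securiti's policy syntax.

```python
def mask_value(value: str, role: str) -> str:
    """Return the raw value for privileged roles, a redacted form otherwise.

    The privileged roles and the keep-last-four rule are assumptions;
    a platform like Securiti applies whatever policies admins define.
    """
    if role in {"data_steward", "privacy_officer"}:
        return value
    # Redact all but the last four characters, e.g. for an account number.
    return "*" * max(len(value) - 4, 0) + value[-4:]

print(mask_value("4111111111111111", role="analyst"))       # → ************1111
print(mask_value("4111111111111111", role="data_steward"))  # → 4111111111111111
```

The point of doing this dynamically, rather than storing a masked copy, is that a single stored value can serve every role under one centrally managed policy.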
Finally, Securiti also provides a set of tools that continuously monitor whether the Snowflake platform running on the Amazon Web Services (AWS) cloud service is properly configured. While cloud platforms are generally secure, the services that run on them are frequently misconfigured in a way that makes it easy for cybercriminals to exfiltrate data via a port that has inadvertently been left open, noted Jalil.
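A configuration check of the kind Jalil describes can be sketched as a scan over firewall-style rules for anything exposed to the whole internet. The rule format below is an assumption for illustration, not AWS or Securiti syntax.

```python
# Flag rules that expose a port to the entire internet (0.0.0.0/0),
# the kind of inadvertent opening Jalil cites as an exfiltration path.
def find_open_ports(rules: list[dict]) -> list[int]:
    """Return the ports of all rules whose source is unrestricted."""
    return [rule["port"] for rule in rules if rule["source"] == "0.0.0.0/0"]

rules = [
    {"port": 443, "source": "10.0.0.0/8"},   # internal traffic only: fine
    {"port": 5432, "source": "0.0.0.0/0"},   # database open to the world
]
print(find_open_ports(rules))  # → [5432]
```

Running a check like this continuously, rather than at deployment time, is what catches configurations that drift open after the initial setup.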
Centralizing data management
The integration with Snowflake comes at a time when many organizations seek to manage data residing in multiple platforms in a more holistic fashion. Historically, data has tended to be managed within the context of the application employed to create it. However, as organizations start to view data as a business asset, there is now a more concerted effort to store data in a cloud-based data warehouse or data lake spanning multiple cloud platforms. The goal is to make that data accessible to a broad range of applications in a way that better ensures quality and maintains consistency. Most hurdles that digital business transformation initiatives encounter stem from the simple fact that much of the data stored in various applications is often conflicting or simply incorrect.
Once that data is stored in those platforms, organizations can encounter a range of governance challenges in managing data that is distributed across multiple cloud services and any number of legacy on-premises platforms. In large enterprises that lack an empowered, centralized IT organization, individual business units often fund their own initiatives, each standing up its own cloud data warehouse. Naturally, each business unit then encounters its own unique set of data management and security challenges.
Duplication of data management and security efforts wastes resources. The efforts to better secure data are sometimes led by a chief data officer, and other times by a chief information security officer (CISO). One way or another, someone has to more efficiently manage what is now petabytes of data within large enterprises. In fact, machine learning algorithms are arguably the only viable option to scan and classify data at that scale.