The cloud’s scalability, flexibility, and cost savings have federal government agencies replacing some traditional, physical data center infrastructure with cloud services. This shift was accelerated by the federal “Cloud First” mandate, which required agencies to consider available cloud solutions before building new data centers.
However, while adopting a cloud infrastructure allows agencies to increase their agility – often while saving money – it does come with a host of new security considerations. With heterogeneous IT landscapes – a mix of legacy and cloud-resident environments – and a growing IoT footprint driven by the proliferation of sensor devices, networks and IT infrastructures are becoming increasingly complex. Securing these networks requires machine learning and security platforms that can reach across multi-platform data environments.
Detecting threats and learning from previous ones will drive cybersecurity toward a proactive approach, regardless of where the data is stored. The skills and technologies needed to accomplish this are already available. For government agencies and organizations, adopting this forward-looking approach is essential, and putting the best personnel, technologies, and methods to work safeguarding data across every tier is critical.
Recently, the GovCyberHub sat down with Nasheb Ismaily, Senior Solutions Engineer with Cloudera, to discuss cybersecurity in today’s complex data environments. Nasheb is a published author with more than ten years of experience in streaming analytics initiatives and cyber intrusion detection within the public sector. A researcher in AI, IoT, big data, and machine learning, Nasheb had plenty to teach us.
Here’s what he had to say:
GovCyberHub (GCH): How do heterogeneous infrastructures, including legacy and cloud deployments, within government agencies create risk and challenges for cybersecurity?
Nasheb Ismaily: Heterogeneous infrastructures across the cloud and on-premises create challenges and risks because different platforms have different security and governance protocols and tools. You may be able to restrict datasets and workloads in one environment, but there’s no guarantee you will be able to do the same in another. Moreover, administrators will have the added responsibility of understanding the complexities and restrictions across every security and governance toolset in each environment. The burden of keeping these tools in sync at all times further increases the risk of human error and noncompliance.
GCH: What about the proliferation of connected devices and the rise of IoT? How does IoT create additional complexities when thinking about security?
Nasheb Ismaily: In the realm of IoT, we are typically dealing with many types and brands of sensors, all sending data with different protocols, frequencies, and standards. Implementing a single view of security across all these devices is difficult.
We also have to deal with the veracity and the validity of the data in streaming architectures. This means understanding if the data has been maliciously transformed in any way and if the data is coming in the correct order.
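The validity checks Nasheb describes can be sketched in code. The example below is a minimal, hypothetical illustration (not Cloudera’s implementation): each sensor message carries a per-device sequence number and an HMAC signature over its payload, so a consumer can reject records that were tampered with in transit or that arrive out of order. The shared key, message format, and field names are all assumptions for illustration.

```python
import hmac
import hashlib
import json

# Hypothetical pre-shared key; in practice each device would have its own.
SECRET = b"shared-device-key"

def sign(payload: dict) -> str:
    """Compute an HMAC-SHA256 signature over a canonical JSON payload."""
    msg = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def validate(message: dict, last_seq: dict) -> bool:
    """Reject records that were altered in transit or arrive out of order.

    `last_seq` tracks the highest sequence number seen per device.
    """
    payload, sig = message["payload"], message["sig"]
    # Veracity: signature mismatch means the data was modified.
    if not hmac.compare_digest(sign(payload), sig):
        return False
    # Validity: sequence numbers must strictly increase per device,
    # which also rejects replayed records.
    device, seq = payload["device"], payload["seq"]
    if seq <= last_seq.get(device, -1):
        return False
    last_seq[device] = seq
    return True
```

A streaming consumer would call `validate` on each incoming record and route failures to a quarantine topic for investigation rather than silently dropping them.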
GCH: How can government agencies balance the need to continue legacy systems while safeguarding and adding new technologies? What makes this challenge unique in the public sphere?
Nasheb Ismaily: Many government agencies looking to modernize will migrate specific, non-sensitive datasets and workloads to newer technologies while keeping the remaining datasets on legacy systems in the short and intermediate term. In the long term, all data and workloads are migrated to take advantage of more efficient architectures and technologies, such as Kubernetes.
The sensitivity, classification, and compliance requirements of the data provide a unique challenge to the public sector. For instance, certain sensitive datasets are not permitted to be moved to the cloud from on-premises environments. In this case, you would implement a hybrid solution where sensitive data and workloads remain on-premises, and non-sensitive workloads move to the cloud.
GCH: What is needed to provide security and also allow accessibility within hybrid/multi-platform data environments? What does triage look like in these settings?
Nasheb Ismaily: It’s essential to have a single view of metadata, governance, and security policies in any object store across any of the major cloud environments. This centralized view enables the deployment of security controls with easy data, metadata, policy, and governance migration between cloud environments.
GCH: How is it possible to do proactive cybersecurity? What are the benefits?
Nasheb Ismaily: Proactive cybersecurity involves preemptively identifying weaknesses and adding processes to identify threats before they occur. Here we can leverage machine learning to detect and visualize anomalous patterns in our data. We can then deploy the trained models back into the stream, enabling us to detect anomalies in real time as they appear.
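To make the idea of in-stream anomaly detection concrete, here is a minimal sketch of one common technique: a rolling z-score detector that flags values deviating sharply from a recent window. This is an illustrative stand-in for the machine learning models Nasheb describes, not a specific product feature; the window size and threshold are arbitrary assumptions.

```python
from collections import deque
import statistics

class StreamAnomalyDetector:
    """Flag stream values that deviate sharply from a rolling window."""

    def __init__(self, window: int = 20, threshold: float = 3.0):
        self.window = deque(maxlen=window)  # recent values only
        self.threshold = threshold          # z-score cutoff

    def score(self, value: float) -> float:
        """Return the absolute z-score of `value` against the window."""
        if len(self.window) >= 2:
            mean = statistics.fmean(self.window)
            stdev = statistics.pstdev(self.window) or 1e-9
            z = abs(value - mean) / stdev
        else:
            z = 0.0  # not enough history to judge yet
        self.window.append(value)
        return z

    def is_anomaly(self, value: float) -> bool:
        return self.score(value) > self.threshold
```

In a real deployment, a detector like this (or a trained model serving the same role) would sit inside the stream-processing pipeline, scoring each event as it arrives rather than after the fact.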
GCH: All of these trends we’ve discussed point to government networks that are getting increasingly complex. How are new technologies helping government agencies better monitor, manage and protect these complex IT environments?
Nasheb Ismaily: New technologies and management consoles are giving government IT and cybersecurity professionals a centralized view that can streamline and simplify data security, governance, and replication for any environment – transient or persistent, hybrid cloud or multi-cloud.
Giving cybersecurity professionals a single pane of glass in which to manage hybrid-cloud environments, administer cloud and on-premises resources, and maintain users and their access dramatically reduces organizational overhead, security risks, and noncompliance.
Then there is a new generation of big data and data analytics tools that enable agencies to collect, curate, enrich, and transform all types of data for immediate analysis. These tools deliver key insights to government cybersecurity professionals in real time without requiring a single line of code. They are also essential for delivering the insights and data necessary for embracing artificial intelligence and machine learning solutions that advance cybersecurity programs and identify threats.
GCH: You’re not the first cybersecurity expert that we’ve heard advocate for using AI and ML tools to secure these more complex networks. Can you describe what is meant by machine learning? And, how does machine learning help security within government agencies?
Nasheb Ismaily: Machine learning refers to algorithms that use statistics to analyze and find patterns in large amounts of data. Machine learning has many security applications, including detecting malware in encrypted traffic and finding insider threats by analyzing log data for patterns.
These machine learning models can then be deployed back into the data stream at its source to detect threats in real time. This capability is known as predictive analytics.
For example, Cloudera recently worked with a government agency to utilize its predictive analytics platform to implement a log ingestion pipeline. This pipeline leverages behavior analytics-based machine learning to profile and detect potential advanced persistent threats within the agency’s network in real time. This is essential, since advanced persistent threats remain one of the largest and most impactful security risks threatening government networks today.
To find out more about how advanced AI and ML solutions can help protect today’s more complex government networks, click here to access a complimentary copy of the whitepaper, “Delivering Government Data Context: Security and Governance in Support of Your Agency Mission.”