Posted by Mike Racine

Azure Storage Basics: How to Configure Security

Do you ever wonder what happens to all those bits and bytes you send to the cloud? Most of them end up in Azure Storage. In this episode of KnowOps, Dana shows us how to make things much safer for all those little bits and keep them from floating off into someone else’s possession.

--

Microsoft blog post on moving Storage Analytics data into Log Analytics:
https://azure.microsoft.com/en-us/blog/query-azure-storage-analytics-logs-in-azure-log-analytics/

Want to learn more about Kusto Query Language (KQL)?
https://www.auditwolf.com/blog/recon-your-azure-resources-with-kusto-query-language-kql

Show Transcript

 

So last week on Reddit, in response to my episode on locking down Key Vault, it became clear to me that some people don't know how to protect their data in Azure Storage either. Seems like a great topic to cover. Let's get to it.

Dana Epp here. Welcome to the channel that helps aspiring Azure administrators like you and me to know ops and, well, master the Microsoft Cloud. I'm glad you're here. If you haven't yet, smash the subscribe button so you can be notified when I release new videos each week.

Alright, so Azure Storage is Microsoft's cloud storage solution for modern data storage scenarios. It offers a massively scalable object store for data objects, a file system service for the cloud, a messaging store for reliable messaging, and a cost-effective NoSQL table store. A lot of Azure resources leverage Azure Storage today. From your VM disks to application queues, chances are you have sensitive data sitting in Azure Storage right now. How secure is it? Let's find out.

Let's first discuss data at rest. So, all data written to Azure Storage is encrypted using a 256-bit AES key. By default, Microsoft manages that encryption key for you. However, you are in complete control of that and can use your own customer-managed key, storing it in Key Vault and referencing it within the configuration of your storage account. Unless you have a specific need to manage your own keys, I recommend you let Microsoft manage them. They regularly rotate the keys and take care of all of this transparently for you. Why add administrative burden on yourself to manage keys for the very small gain of being in control? Besides, behind the scenes Microsoft will need to use that key to encrypt and decrypt the data inside of the storage service anyway. So if you're trying to protect yourself against Microsoft, this isn't the way to do it.
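If you do have that specific need, a minimal sketch of pointing a storage account at a customer-managed key with the Azure CLI looks something like this. The account, resource group, vault, and key names are all placeholders, and the storage account's identity also needs wrap and unwrap permissions on the key in Key Vault:

    # Point the storage account at a key you manage in Key Vault
    # (assumes the account already has a managed identity with access to the vault)
    az storage account update \
        --name mystorageacct \
        --resource-group my-rg \
        --encryption-key-source Microsoft.Keyvault \
        --encryption-key-vault https://my-vault.vault.azure.net \
        --encryption-key-name storage-cmk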

One other thing to consider, when thinking about data at rest, is Azure Storage encryption versus disk encryption. Azure Storage encryption encrypts the page blobs that back the Azure virtual machine disks. Additionally, though, all Azure VM disks may be optionally encrypted with Azure Disk Encryption. Azure Disk Encryption uses industry-standard BitLocker on Windows and dm-crypt on Linux to provide operating-system-based encryption solutions that are integrated with Azure Key Vault. Use both methods. You want the operating systems to take advantage of disk encryption for data inside the disk while keeping the raw disk blobs protected when accessed via Azure Storage. There is no real performance loss in this scenario, and it offers you additional protection. So use it.
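Enabling Azure Disk Encryption on an existing VM is a short sketch with the Azure CLI; the VM, resource group, and vault names are placeholders, and the vault has to sit in the same region as the VM:

    # The Key Vault must be allowed to hold disk encryption secrets first
    az keyvault update \
        --name my-vault \
        --enabled-for-disk-encryption true

    # Then encrypt the VM's disks with BitLocker (Windows) or dm-crypt (Linux)
    az vm encryption enable \
        --resource-group my-rg \
        --name my-vm \
        --disk-encryption-keyvault my-vault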

So let's start with the simple stuff. Enforce secure transfer. All requests to Azure Storage should be done over a secure channel using TLS. I don't even know why we still have an option these days to not enforce it. It should always be enforced. And here's why. In some of my pen tests, I've been able to do an HTTP downgrade attack because Azure Storage can't enforce the Strict Transport Security header, or HSTS. Once that session gets redirected to HTTP, I can capture the query string and extract the storage keys, and then immediately redirect them back to HTTPS. The target doesn't even know I've done it. Later on I'll use those keys and access the data directly, which gets me to my second recommendation.
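Enforcing secure transfer is a one-liner in the Azure CLI; the account and resource group names here are placeholders:

    # Reject any request that arrives over plain HTTP
    az storage account update \
        --name mystorageacct \
        --resource-group my-rg \
        --https-only true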

Don't expose access to your storage account to the entire internet. Take advantage of the firewall and virtual network security controls to reduce your attack surface and limit access to only those select VNets and public IPs that require it. This way, if an adversary was able to capture your storage keys, they wouldn't be able to use them anyway.
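A minimal sketch of locking that down with the Azure CLI, assuming placeholder names for the account, VNet, subnet, and IP range (the subnet also needs the Microsoft.Storage service endpoint enabled):

    # Deny everything by default...
    az storage account update \
        --name mystorageacct \
        --resource-group my-rg \
        --default-action Deny

    # ...then allow only the subnet and public IP range that need access
    az storage account network-rule add \
        --account-name mystorageacct \
        --resource-group my-rg \
        --vnet-name my-vnet \
        --subnet my-subnet
    az storage account network-rule add \
        --account-name mystorageacct \
        --resource-group my-rg \
        --ip-address 203.0.113.0/24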

So if you watch a lot of videos on YouTube about Azure Storage, you will see many of them showing you how to use one of the two access keys to connect to Azure Storage directly. Don't listen to them. That's bad practice, and there's a much better, safer way to do it: shared access signatures, or SAS tokens for short. Where a storage access key gives full administrative access to the storage account, a SAS token allows you to delegate restricted access to your data and apply least privilege, granting the absolute minimum access required for a specific amount of time. You can even scope it down to a specific set of IP addresses if needed. And more importantly, you can force HTTPS, eliminating the risk of those HTTP downgrade attacks, even if the storage account itself allows HTTP. Combine SAS tokens with the built-in network access controls and you have the one-two punch that can really restrict that access.
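For example, here's a sketch of generating a read-only, HTTPS-only SAS token scoped to a single container and IP range with the Azure CLI; the names, timestamp, and addresses are all placeholders:

    # Read-only access to one container, HTTPS only, from one IP range,
    # with a deliberately short expiry
    az storage container generate-sas \
        --account-name mystorageacct \
        --name mycontainer \
        --permissions r \
        --expiry 2019-07-01T13:00Z \
        --https-only \
        --ip 203.0.113.0-203.0.113.255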

One additional recommendation I have is to try to limit the SAS tokens to as short an expiration as you can get away with. I see some examples online where people say to set it for like 50 years, and that's a really bad idea. Unlike access keys, which you can rotate and manage in the Azure portal, once a SAS token is generated, you never see it again. If you don't remember which access key you generated it from, you have no way to ever revoke or rotate it.

Now, some DevOps people might complain about that recommendation, and that's okay. If the concern is not wanting to manage the regeneration of the SAS tokens all the time, then automate it. Store the SAS tokens in Key Vault, and then use Azure Automation to generate new SAS tokens for you automatically. This way you decouple the application connection string and code itself from the SAS token that's in Key Vault, and can then let Azure Automation update Key Vault with the value automatically. Use Key Vault references in your app and have the whole end-to-end plumbing taken care of for you, so you never have to worry about the secret nature of the SAS tokens being in your code, or in your configuration, in the first place.
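Here's the shape of that rotation job, sketched with the Azure CLI rather than an actual Automation runbook; the vault, secret, account, and container names are placeholders:

    # Generate a fresh short-lived SAS token...
    sas=$(az storage container generate-sas \
        --account-name mystorageacct \
        --name mycontainer \
        --permissions r \
        --expiry 2019-07-02T00:00Z \
        --https-only \
        --output tsv)

    # ...and push it into Key Vault, where the app picks it up by reference
    az keyvault secret set \
        --vault-name my-vault \
        --name storage-sas \
        --value "$sas"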

Azure Storage is always improving. One such example is the introduction of advanced threat protection for your storage data. It provides an additional layer of security intelligence that detects unusual and potentially harmful attempts to access or exploit your storage accounts. This layer of protection allows you to address threats without being a security expert or managing security monitoring systems. There is an additional charge for this, but it's something like two cents for every 10,000 transactions. Storage ATP even integrates directly into Azure Security Center, and it will email all subscription administrators when alerts are triggered.
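Turning it on is a single call. A sketch with the Azure CLI, assuming the az security atp commands available at the time of writing; the names are placeholders:

    # Enable advanced threat protection on the storage account
    az security atp storage update \
        --resource-group my-rg \
        --storage-account mystorageacct \
        --is-enabled true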

Another option to consider is to enable soft deletes under data protection. You can enable a retention policy that will allow you to recover deleted data for a short period of time, kind of like how the Recycle Bin works on your desktop. This is useful in those situations where data is accidentally deleted and you wanna recover it quickly.
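Here's what enabling blob soft delete with, say, a seven-day retention window looks like in the Azure CLI; the account name is a placeholder:

    # Keep deleted blobs recoverable for 7 days
    az storage blob service-properties delete-policy update \
        --account-name mystorageacct \
        --enable true \
        --days-retained 7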

Finally, I wanna talk about logging. I might sound like a broken record here, but it's a pet peeve that I have about Azure. So many services do not have audit logging turned on by default, which leaves cloud administrators over a barrel when we have to track problems down and understand how our IT is being accessed in the cloud. So go into diagnostic settings and enable logging. Make sure you enable the v2.0 log format so that you can benefit from the additional logging of requests using OAuth tokens, which was missing from the platform for what seems like forever.
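A sketch of turning that on for the blob service with the Azure CLI; the account name is a placeholder:

    # Log reads, writes, and deletes for the blob service in the v2.0 format,
    # keeping 90 days of history
    az storage logging update \
        --account-name mystorageacct \
        --services b \
        --log rwd \
        --retention 90 \
        --version 2.0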

Turning this logging on will then enable a new blob container in your storage account called $logs. You won't be able to see it in the Storage Explorer built into the Azure portal, but you can see it if you use the desktop version. It contains a deeply nested structure, broken down by year, month, and day, to make it easier to isolate your logs. There isn't a log viewer for all of this, unfortunately, but there is a better way to read the data anyway. Microsoft has published a PowerShell script that will convert your Storage Analytics logs into a format that can be consumed by Log Analytics. So you can ingest your storage logs into your Log Analytics workspace, and then use KQL to grok the data however you like.
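If you want to peek at the raw logs first, you can list the $logs container from the command line; the account name is a placeholder, and note the single quotes so the shell doesn't swallow the dollar sign:

    # Browse the nested year/month/day log structure
    az storage blob list \
        --account-name mystorageacct \
        --container-name '$logs' \
        --output table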

I'll leave a link to the Microsoft blog post that discusses this strategy in the show notes below. Are you new to KQL and the whole Kusto Query Language? No problem. I'll also leave a link to the episode I did on how to recon your resources with KQL. That'll help you understand how to work with Log Analytics. It doesn't take a lot of effort to better secure your data in Azure Storage.

I hope I've been able to show you that here. What do you think? Was this helpful? Let me know by hitting the like button. It really does help. And if you haven't yet, smash the subscribe button so that you can be notified as I publish new content each week. Until then, thanks for watching. Keep your cloud data safe. We'll see you in the next episode.

 

 

 
Topics: Cloud Operations (CloudOps), KnowOps
