One of the many things we offer as part of our Managed Azure service is a monthly report with advice on cost savings, security and cloud best practice. Fortunately, the native tool, Azure Advisor, provides personalised information on exactly these categories. However, we had a problem. Although Azure Advisor allows manual export of its reports from the Azure portal, it was time-consuming for our Service Desk team to rework these into the format we provide to our customers. This re-formatting includes customisation for each customer: adding a notes field for more detailed explanations, tracking remediation progress, and excluding certain categories or priorities of recommendation.
We knew some automation was needed!
Since PowerShell is now fully supported by Azure Functions and has a lot of built-in Azure functionality in the Az module, we decided this was the language to use. Practising what we preach, we leveraged Azure Platform as a Service resources to host a truly cloud-native solution. The script needed to connect to each customer's Azure Advisor, pull out any recommendations and export these to a customised Word document.
This is what we did – a fairly smart solution to our requirements.
To make the connection we used a Service Principal, which is a security identity used by apps, services and automation tools to access specific Azure resources. To ensure we connected to the correct Azure Advisor instance for each customer, we needed to know the customer's name, the ID of their Azure Active Directory tenant, the ID of their Subscription and the Application ID of the Service Principal itself. Finally, we required the Service Principal Client Secret, which is the equivalent of its password. The values of these parameters are stored in Table storage, except for the Client Secret, which is kept in a Key Vault.
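As a minimal sketch, the per-customer connection details can be assembled like this. The table, vault and column names here are illustrative assumptions rather than our real resource names, and the Azure lookups (shown as comments) assume the Az and AzTable modules:

```powershell
# A minimal sketch of gathering one customer's connection details.
# 'adv-reports-kv' and the column names are illustrative assumptions,
# not our real resource names. Requires the Az and AzTable modules.
function Get-CustomerCredential {
    param(
        [Parameter(Mandatory)] [string] $AppId,              # Service Principal application ID
        [Parameter(Mandatory)] [securestring] $ClientSecret  # secret retrieved from Key Vault
    )
    # PSCredential pairs the app ID with the client secret, ready for Connect-AzAccount
    New-Object System.Management.Automation.PSCredential ($AppId, $ClientSecret)
}

# In the real script the inputs come from Azure rather than being hard-coded:
#   $row    = Get-AzTableRow -Table $configTable -PartitionKey 'Customers' -RowKey $customerName
#   $secret = (Get-AzKeyVaultSecret -VaultName 'adv-reports-kv' -Name "$customerName-secret").SecretValue
#   $cred   = Get-CustomerCredential -AppId $row.AppId -ClientSecret $secret
```

Keeping everything except the Client Secret in Table storage means the configuration is cheap to query in bulk, while the secret itself only ever passes through Key Vault.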
The script initially connects to Azure in a user context and pulls the previously described parameters for all the customers into an array. It then disconnects and re-authenticates to Azure using the Service Principal, which holds the Reader role in the customer's subscription. Once authenticated, it calls the Azure Advisor API to refresh and retrieve the latest recommendations. Once it has filtered, sorted and exported them to an appropriately customised Word document, it loops round to start the next customer. Finally, the Word documents are emailed to our Service Desk team via a free SendGrid account and also archived to cool-tier blob storage in a Storage Account for future reference.
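The filter-and-sort step can be sketched as a small helper. The `Category` and `Impact` properties match what `Get-AzAdvisorRecommendation` returns; the surrounding Azure calls are shown as comments, and the `ExcludedCategories` column is an assumption about our configuration schema:

```powershell
# Sketch of the filter/sort step applied to Advisor recommendations.
function Select-ReportRecommendation {
    param(
        [Parameter(Mandatory)] [object[]] $Recommendations,
        [string[]] $ExcludeCategory = @()   # e.g. categories a customer has opted out of
    )
    $order = @{ High = 0; Medium = 1; Low = 2 }   # show High-impact items first
    $Recommendations |
        Where-Object { $_.Category -notin $ExcludeCategory } |
        Sort-Object { $order[$_.Impact] }
}

# Per-customer flow (assumes Az.Accounts and Az.Advisor; IDs come from the config row):
#   Connect-AzAccount -ServicePrincipal -Credential $cred -Tenant $row.TenantId
#   Invoke-AzRestMethod -Method POST -Path "/subscriptions/$($row.SubscriptionId)/providers/Microsoft.Advisor/generateRecommendations?api-version=2020-01-01"
#   $report = Select-ReportRecommendation -Recommendations (Get-AzAdvisorRecommendation) -ExcludeCategory $row.ExcludedCategories
```

The POST to `generateRecommendations` triggers Advisor's refresh, so the subsequent query picks up the latest findings rather than a stale cache.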
As mentioned, our original intent was to run the script from an Azure Function with a timer trigger. This would mean we would pay only when the script was running and there would be no need for patching or other maintenance. Unfortunately, we found during testing that the PowerShell module used to create the Word document was not able to work properly in a Function due to a .NET dependency in a piece of middleware. To overcome this hurdle without redeveloping the whole module, we currently run the script from a virtual machine using the Windows Task Scheduler. The virtual machine is, in turn, connected to the Azure Automation Start/Stop VMs solution, which powers it on for a few hours each month to allow the script to run and patching to take place. This means the Pay-As-You-Go virtual machine only costs us a few pounds per month, since it is only billed whilst powered on.
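For reference, the monthly run can be registered on the VM with `schtasks.exe`; the task name and script path below are placeholders, and building the argument list in a function keeps it easy to inspect:

```powershell
# Sketch of the monthly Task Scheduler registration on the VM.
# Task name and script path are illustrative placeholders.
function Get-MonthlyTaskArgs {
    param(
        [Parameter(Mandatory)] [string] $TaskName,
        [Parameter(Mandatory)] [string] $ScriptPath,
        [string] $StartTime = '02:00'
    )
    # /SC MONTHLY /D 1 runs the task on the first day of each month
    @('/Create', '/SC', 'MONTHLY', '/D', '1',
      '/TN', $TaskName,
      '/TR', "powershell.exe -NoProfile -File `"$ScriptPath`"",
      '/ST', $StartTime, '/F')
}

# On the VM (run elevated):
#   $taskArgs = Get-MonthlyTaskArgs -TaskName 'AdvisorReports' -ScriptPath 'C:\Scripts\Export-AdvisorReports.ps1'
#   schtasks.exe @taskArgs
```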
We have also contacted the developer of the module to make him aware of the issue, and we hope it will be resolved in a future release so that we can move to the much neater Azure Function solution.
The final part of the automation was a separate script to allow on-boarding of new customers. This pulls the name, tenant and subscription information from the customer's Low-Level Design document, creates the Service Principal and Client Secret with the appropriate Azure role, and then saves this information in the Table storage and Key Vault ready for the next time the reports are generated.
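The on-boarding steps can be sketched as below. The vault name is a placeholder, and note that the property holding the generated secret on the new Service Principal varies between Az module versions:

```powershell
# Sketch of on-boarding a new customer. Assumes Az.Resources, Az.KeyVault
# and AzTable; 'adv-reports-kv' is a placeholder vault name.
function New-CustomerRowProperties {
    param(
        [Parameter(Mandatory)] [string] $TenantId,
        [Parameter(Mandatory)] [string] $SubscriptionId,
        [Parameter(Mandatory)] [string] $AppId
    )
    # The non-secret values that go into the Table storage row
    @{ TenantId = $TenantId; SubscriptionId = $SubscriptionId; AppId = $AppId }
}

# The Azure side (illustrative):
#   $sp = New-AzADServicePrincipal -DisplayName "advisor-$customerName"
#   New-AzRoleAssignment -ApplicationId $sp.AppId -RoleDefinitionName 'Reader' -Scope "/subscriptions/$subscriptionId"
#   # NOTE: where the generated secret lives on $sp differs between Az versions
#   Set-AzKeyVaultSecret -VaultName 'adv-reports-kv' -Name "$customerName-secret" -SecretValue $spSecret
#   Add-AzTableRow -Table $configTable -PartitionKey 'Customers' -RowKey $customerName `
#       -Property (New-CustomerRowProperties -TenantId $tenantId -SubscriptionId $subscriptionId -AppId $sp.AppId)
```

Granting only the Reader role keeps the Service Principal's access to the minimum the reporting script needs.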
The end result is a fully customised report delivered to our Service Desk team, ready for them to check and annotate before passing on to our customers, which helps us align with our ISO accreditations.
End-to-end, this was about three days of work to put together, including writing the script and some infrastructure as code around the Azure resources. We estimate it saves our Service Desk manager around 12 days a year of manual exporting, filtering and formatting. It also means that our reporting is completely consistent across all our Azure customers and much less susceptible to human error.