30 Azure Developer Interview Questions & Answers
Below is a list of our Azure Developer interview questions. Click on any interview question to view our answer advice and answer examples.
1. What are the general datatypes that could be expected in a cloud environment?
a. Structured data - this is the type of data hosted in a relational database, such as MS SQL Server. This data is organized in tables comprised of rows and columns and retrieved using Structured Query Language (SQL).
b. Semi-structured data - this data does not fit into tables with rows and field names, but rather is identified by tags in key/value pairs. Common formats include XML, JSON, and YAML, and this category is also referred to as NoSQL data.
c. Unstructured data - this data has no predefined organization and is often in binary formats. Files such as images, executables, audio, or video files may be tagged, but the data itself is unstructured.
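The three categories can be illustrated with a short sketch (Python used purely for illustration; the sample values are hypothetical):

```python
import json

# Structured data: fixed rows and columns, as in a relational table
structured_rows = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "Bob"},
]

# Semi-structured data: key/value pairs identified by tags (JSON here)
semi_structured = '{"device": "sensor-1", "readings": {"temp": 21.5}}'
parsed = json.loads(semi_structured)

# Unstructured data: raw binary with no inherent organization
unstructured = bytes([0x89, 0x50, 0x4E, 0x47])  # e.g. the first bytes of a PNG file

print(parsed["readings"]["temp"])  # 21.5
```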
Written by Ryan Brown on September 13th, 2021
2. What Azure facility can you use to appropriately categorize resources? Provide a use case for doing so.
Applying tags to individual resources makes it easier to search for and identify resources that should be logically grouped together. For example, you could assign a tag named "environment", whose value can be "development", "QA", or "production". By assigning resources (VMs, for example) the appropriate environment tag, you can quickly search for those resources to ensure that they are placed in the correct resource group and possibly avoid accidental deletion of production resources. Another use case for tags is the assignment of resources to certain departments for billing purposes. In this case, you can quickly search for resources that fall under the responsibility of the Marketing department and then forward those invoices accordingly.
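Conceptually, tag-based filtering works like the sketch below (resource names and tags are hypothetical, and Python is used for illustration only; in practice you would query tags through the portal, CLI, or SDK):

```python
# Hypothetical inventory of resources with their assigned tags
resources = [
    {"name": "vm-web-01", "tags": {"environment": "production", "department": "Marketing"}},
    {"name": "vm-test-02", "tags": {"environment": "QA", "department": "Engineering"}},
    {"name": "vm-dev-03",  "tags": {"environment": "development", "department": "Marketing"}},
]

def find_by_tag(resources, key, value):
    """Return the names of resources whose tag `key` equals `value`."""
    return [r["name"] for r in resources if r["tags"].get(key) == value]

print(find_by_tag(resources, "environment", "production"))  # ['vm-web-01']
print(find_by_tag(resources, "department", "Marketing"))    # ['vm-web-01', 'vm-dev-03']
```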
3. What other desktop tool could you use to create automation scripts, particularly if you are a Windows developer?
Windows PowerShell (PS) is a command-line tool that allows the developer to automate Windows system operations and other repetitive system tasks. In order to interact with Azure resources, PowerShell version 5.0 or higher is required. You can execute $PSVersionTable.PSVersion on the command line to determine your PS version. Additionally, you'll need to install the Azure PowerShell module (Az), which contains the appropriate cmdlets to work with Azure features.
4. Describe the process for locally setting up the Azure CLI and the commands you would use to create a resource group.
It is important to know that there are a couple of options for creating and managing Azure resources. In addition to using the Azure portal, Azure provides the CLI, or command-line interface, a locally installed application that allows you to connect to your Azure subscription to create and manage your resources. You can run the CLI on a Mac, Linux, or Windows machine.
a. Install the CLI: To install the Azure CLI on a Windows machine, download the installer from https://aka.ms/installazurecliwindows and execute the MSI file. Next, verify the installation by opening a PowerShell session and executing the command 'az --version', which will display the version of the CLI that was installed.
b. Create a resource group: First log in by executing 'az login' and entering your credentials. Then execute 'az group create --name <group-name> --location <region>', supplying a unique name for the resource group you wish to create. Once this is completed, you can use the CLI to create new resources and add them to the resource group.
5. Describe the different methods for securing your Azure SQL Database.
You can implement firewall rules at two levels: the server level and the database level. At the server level, this can be adjusted through the firewall pane in the Azure portal. You can select "on/off" to allow/disallow access to Azure services, or you can specify IP address rules, which are based on specific public IP ranges. Database-level firewall rules can be configured only through IP address specification. These rules allow access to a specific database and are stored in the database itself. You can secure user access by implementing authentication, either through SQL authentication or integration with Azure Active Directory. By implementing authorization, you can further restrict a user to accessing only a specific database or specific database objects. For example, say there is a specific user in the database, User1.
-- The following statement allows User1 to read data:
ALTER ROLE db_datareader ADD MEMBER User1;
GO

-- This statement disallows User1 from selecting data from Schema.Table1:
DENY SELECT ON Schema.Table1 TO User1;
GO
6. What are resource groups and how would you use them?
Azure resource groups are logical containers to which any resource (VM, storage account, VNet, etc.) is assigned. Resource groups cannot be nested, and a resource can only be assigned to a single resource group; however, you can move a resource between resource groups. You can use them to implement organization within your enterprise as follows:
- You can group resources by type, location, or usage
- Project life cycle. For example, if your company develops an application proof-of-concept whose resources are only needed temporarily, those resources can be assigned to a dedicated resource group. After the project is over and the resources are no longer needed, the resource group can be deleted, effectively removing all the resources contained therein.
- Authorization: Resource groups are a scope for implementing role-based access control (RBAC). For example, all VNets can be contained within a particular resource group, and only network administrators (those assigned to a network admin role) would have access to that resource group.
7. What is RBAC?
RBAC stands for Role-Based Access Control and enables an Azure administrator to configure access to Azure resources by assigning individuals, groups, or applications to a particular role. Each role assignment has a scope, which can be a management group, subscription, resource group, or an individual resource. Role assignments are inherited down the scope hierarchy. For example, if an individual is assigned access at the resource group level, that user can perform the role's actions on all resources (VMs, storage, networks, etc.) within that group.
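The inheritance of a role assignment down its scope can be sketched conceptually (plain Python, not the Azure SDK; the user, role, and scope paths are hypothetical):

```python
# A scope path encodes the nesting: subscription / resource group / resource
assignments = {
    ("alice", "Contributor"): "sub1/rg-network",  # assigned at resource-group scope
}

def has_access(user, role, scope):
    """A role assignment applies to its own scope and everything nested beneath it."""
    assigned_scope = assignments.get((user, role))
    return assigned_scope is not None and scope.startswith(assigned_scope)

print(has_access("alice", "Contributor", "sub1/rg-network/vnet-main"))  # True
print(has_access("alice", "Contributor", "sub1/rg-storage/account1"))   # False
```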
8. Describe ways in which to secure data in-transit and data at rest.
Encrypting disk drives is the best way to secure data at rest. Azure provides this capability through Azure Disk Encryption. On Windows machines, encryption is based upon BitLocker technology. Linux VM disks can also be encrypted, using the Linux dm-crypt feature. Azure Storage and Azure SQL Database have encryption at rest turned on by default.
Encryption in transit is accomplished by enabling SSL/TLS protocols during data exchange. For data exchange between on-premises and cloud resources, a VPN tunnel is recommended. On premises to cloud VPN configurations can be further modified as follows:
Site-to-site VPN -- should be enabled for multiple on-premises workstations accessing Azure resources
Point-to-site VPN -- should be enabled for an individual on-premises workstation accessing Azure resources
Azure ExpressRoute -- should be used for large datasets. This requires a dedicated private connection from your on-premises WAN to the cloud infrastructure.
9. Name the top 5 security tasks you can implement prior to a production installation.
1. Enable Azure Security Center (ASC) - the ASC provides security monitoring of on- and off-premises resources, offering security recommendations across your resources. It utilizes machine learning algorithms to detect and prevent malware installations. The ASC can be enabled through the Azure portal by searching for the "Security Center" option.
2. Validate your input - It is a best practice to add validation to the HTML input boxes in your web applications. You can add client-side masking to make sure only accepted value types are submitted. When executing SQL queries upon submission, always use parameterized queries or stored procedures to avoid SQL injection attacks. Additionally, always encode any output to be presented to the user on screen.
3. Use Azure Key Vault to store your keys/secrets. You can then programmatically access the sensitive data through an API call rather than accessing the data directly from a configuration file, for example.
4. Keep your application frameworks up-to-date.
5. Continually keep track of the latest security vulnerabilities. Be aware of the Common Vulnerabilities and Exposures (CVE - https://cve.mitre.org/) list and expeditiously implement the recommended fixes.
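The parameterized-query advice in item 2 works the same way in any SQL client. Here is a minimal sketch using Python's built-in sqlite3 module (the table and values are hypothetical, chosen only to show the mechanism):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Alice')")

# Unsafe: string concatenation lets crafted input alter the query (SQL injection)
# query = "SELECT id FROM users WHERE name = '" + user_input + "'"

# Safe: the value is bound as a parameter and never interpreted as SQL
safe_rows = conn.execute(
    "SELECT id, name FROM users WHERE name = ?", ("Alice",)
).fetchall()
print(safe_rows)  # [(1, 'Alice')]

# A malicious value stays inert as a literal string instead of altering the query
attack_rows = conn.execute(
    "SELECT id FROM users WHERE name = ?", ("' OR '1'='1",)
).fetchall()
print(attack_rows)  # []
```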
10. Describe the different options for creating an Azure SQL database.
Azure SQL Database is a platform-as-a-service (PaaS) option, removing you, as a developer or DBA, from the responsibility of standing up and maintaining the infrastructure required for hosting a SQL Server instance. As such, you only need to worry about the design of your database and the different service-level options. Here are the different deployment options:
- a single database: priced and managed per database
- elastic pools: priced as a group of databases managed together and share common set of resources
- hyperscale: this option is based upon a single database, but allows for sizes greater than the 4 TB limit normally assigned to a regular Azure database
There are two purchasing options:
- per Database Transaction Unit (DTU), in which pricing is based upon compute, I/O, and storage options
  - within this option, you can purchase the Basic, Standard, and Premium service levels
- per vCore (virtual core), which allows you to purchase specific vCores according to your database workload
  - within this option, you can purchase the following service tiers:
    - general purpose: for general-purpose workloads
    - business critical: this level offers the lowest latency among the three and is for higher-performing workloads. Storage is hosted on local SSDs instead of Azure blob storage.
    - hyperscale: this level accommodates databases requiring greater than 4 TB of storage, up to 100 TB.
11. What are the options for accommodating high traffic usage scenarios with your web application?
Scaling out - You can adjust your App Service plan to add more instances, up to the limit allowed by your service plan. If you notice an increase in CPU utilization, memory occupancy, disk queue length, response times, or the number of failed requests, you should consider scaling out.
Scaling up - If you still notice an increase in any or all of the parameters above and the number of scaled-out instances is reaching the limit of your App Service pricing plan, scaling up is the next option. In this case, you can increase the memory allocation and processing power for the instances within your service plan.
In both cases, these adjustments can be made through the Azure portal by opening up your app service plan resources and adjusting the scaling out or scaling up parameters as needed.
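The scale-out versus scale-up decision described above can be sketched as a simple rule (the threshold, metric, and limits here are hypothetical, for illustration only):

```python
def scaling_action(cpu_percent, instances, max_instances):
    """Scale out while under the plan's instance limit; otherwise scale up."""
    if cpu_percent < 70:
        return "no action"
    if instances < max_instances:
        return "scale out"   # add instances within the current plan
    return "scale up"        # move to a plan with more memory/CPU per instance

print(scaling_action(50, 2, 10))   # no action
print(scaling_action(85, 2, 10))   # scale out
print(scaling_action(85, 10, 10))  # scale up
```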
12. What facility does Azure offer to minimize application downtime during deployments?
Azure deployment slots are separate environments, approximating different app service instances. With deployment slots, you are able to create separate development, testing, and production environments, each having its own distinct hostname. For example, once your application has been thoroughly tested in the testing slot, you can swap the production slot with the testing slot. This happens instantaneously, with no impact to the end user. If there is an issue with the new production version (i.e., an uncaught exception condition), you can easily and instantaneously swap the old production version back. This facility is only available in the Standard, Premium, and Isolated pricing tiers.
13. Describe the process for building an ASP.NET app in Visual Studio and deploying it to an Azure App Service.
1. Configure your VS environment to include the appropriate 'workloads'. A workload is a pre-defined set of tools in Visual Studio for building certain types of applications, using specific languages, or developing for specific platforms. So for an ASP.NET application, you will need to make sure that the ASP.NET and web development workload is installed. For Azure hosting, the Azure workload should be included as well.
2. Create and build your ASP.NET application.
3. Create and deploy an Azure App service. You will need to consider the level of service plan, the region where your app is to be hosted, and your pricing/reliability plans. You should also select the resource group to which the service will be assigned.
4. Deploy your application. Once the app is built and tested, you will select the deployment target, which in this case will be Azure. During the publishing process, you will enter the specifics of the app service created in the previous step.
14. What is the MEAN stack?
MEAN is the acronym for the MongoDB, Express.js, AngularJS, and Node.js development stack. If your expertise is in JavaScript and NoSQL databases, this would be the appropriate technology stack in which to develop your web application. This environment is supported on both Windows and Linux.
15. What steps should you consider when implementing an Azure virtual machine?
- Consider the network in which the VM will reside
Will your VMs connect to resources within the same virtual network, external resources or both? This will come into play when setting up your address spaces/subnets and security. Once created, any change would incur significant time to test and re-implement.
- Plan the VM deployment
What OS will you use? How much space should be allocated? What is the purpose of the VM (app server, database server, etc )? What ports are to be opened? What will you name the VM? This last point may sound trivial, but the better the naming convention, the easier it will be to quickly identify it and its purpose. Once created, the name is not a trivial thing to change.
- Decide the location for the VM -- You will want to place the VM(s) in a region, as close to the end users as possible, to improve performance. There may also be legal and compliance restrictions to be considered.
- Understand the pricing model -- You will need to consider two metrics: compute and storage costs. With the pay-as-you-go model, you/your company will incur charges for as long as your VM is in operation. The hourly price varies based upon VM size and operating system, with Windows machines being more expensive than Linux machines due to licensing costs. You can possibly save money by creating reserved VM instances if you need to have the VM always on and can commit to your VM being active for at least one year. In this case, you prepay for the VM. Each virtual hard disk (VHD) allocated to your VM incurs a cost for the space that is being used.
16. What options do you have as a developer, for implementing business logic in the cloud and why would you do so?
Azure app services provide the ability to implement business processes in the cloud. A use case for doing so could be to integrate two or more disparate back-office systems into one cohesive workflow. Let's say an online retailer of one product acquires a retailer of a complementary product, each with separate ordering and inventory management systems. Azure simplifies this integration by supplying two design approaches for connecting business processes:
- Design-first:
  - Microsoft Power Automate
  - Logic Apps
- Code-first:
  - WebJobs and the WebJobs SDK
  - Azure Functions
17. Describe a use case for implementing an Azure Event Hub.
When the volume of source data is expected to grow during certain peak time periods, an event pipeline should be available to hold those data packets until the consumer(s) of the packets are able to handle them. In this case, an Azure Event Hub is an appropriate service to implement.
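The buffering role of an event pipeline can be sketched conceptually (plain Python, not the Event Hubs SDK; the event counts are arbitrary):

```python
from collections import deque

pipeline = deque()  # the event pipeline buffers packets during peak load

# Producers emit a burst of events faster than the consumer can process them
for i in range(100):
    pipeline.append({"event_id": i})

# The consumer drains the buffer at its own pace, losing nothing
processed = []
while pipeline:
    processed.append(pipeline.popleft())

print(len(processed))  # 100
print(processed[0]["event_id"], processed[-1]["event_id"])  # 0 99
```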
18. Describe the process for implementing messaging with Azure Queue storage.
#1: Create a storage account - log in to the Azure portal and create a storage resource by completing the required information.
#2: Once your storage account is created, retrieve the connection string of the account, which can be found in the Portal ---> Settings ---> Access keys section of the storage account. Save this for later.
// #3: Instantiate the connection to the queue --
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudQueueClient client = account.CreateCloudQueueClient();
CloudQueue queue = client.GetQueueReference("myqueue");

// The sender programmatically creates the queue, but only if it doesn't already exist
await queue.CreateIfNotExistsAsync();

// #4: Add a message to the queue --
var message = new CloudQueueMessage("your message here");
await queue.AddMessageAsync(message);

// #5: Retrieve and delete a message from the queue --
// in your client subscriber code:
CloudQueueMessage received = await queue.GetMessageAsync();
if (received != null)
{
    // Process the message, then delete it so it is not delivered again
    await queue.DeleteMessageAsync(received);
}
19. What is an appropriate use case for an Azure Event grid?
Azure Event Grid enables you to aggregate multiple events generated from Azure resources and provides a routing mechanism from those sources to multiple destinations, following the publisher/subscriber paradigm. A use case for Event Grid could be the creation of a notification workflow when changes occur to a virtual machine. You can then use an Azure logic app to handle the event and send an email to system administrators, depending upon the specific criteria within the logic app. Event handlers can be Azure Functions, Azure Logic Apps, Azure Automation, Azure Event Hubs, and Azure Service Bus.
20. What are the different messaging types and how would you decide which design to use?
There are two main types of messages, which are implemented through Azure Service Bus. A message queue allows the temporary storage of messages from one source to one destination. A message topic is similar to a queue, except that more than one destination is allowed. In other words, each message in a topic can be delivered to multiple receivers. Additionally, a subscriber can filter messages to only those that may be relevant. The anticipated message and holding queue sizes are important considerations. If the size of a message will be greater than 64 KB but less than 256 KB and the queue size will be smaller than 80 GB, then a Service Bus queue would be appropriate. Otherwise, a storage queue should be implemented.
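The queue-versus-topic distinction can be sketched as follows (plain Python, not the Service Bus SDK; message names and subscription filters are hypothetical):

```python
# Queue: one sender, one receiver; each message is consumed exactly once
queue = ["order-1", "order-2"]
received = queue.pop(0)  # the receiver takes the message off the queue

# Topic: each subscription gets its own copy, optionally filtered
subscriptions = {
    "billing":  lambda m: True,                   # receives everything
    "shipping": lambda m: m.startswith("order"),  # filters on message content
}

def publish(message):
    """Deliver a copy of the message to every subscription whose filter accepts it."""
    return {name: message for name, accept in subscriptions.items() if accept(message)}

deliveries = publish("order-3")
print(received)            # order-1
print(sorted(deliveries))  # ['billing', 'shipping']
```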
21. What is the difference between an event and a message?
A message contains the raw data, which is meant to be consumed by another component. It is not just a reference to the data. An event implements the publisher/subscriber or pub/sub model, in which a source 'publishes' or broadcasts an event or something that has occurred at the source. Interested components or subscribers can then respond accordingly to the event. This subscription is handled by an Azure Event Grid or Azure Event Hub. It is important to know the difference between the message and event models and know which one to implement for a given use case.
22. List the different messaging mechanisms available through Azure, which allow robust communication among disparate systems.
When you have several disparate systems that need to communicate with one another in order to complete a particular workflow, there may be times when this workflow is disrupted due to packets being dropped between one or more systems. An example of this could be a system where you have the following components:
a. a web front end
b. a server-side function
c. a database
In this scenario, Azure provides message and event modules to facilitate communication between and among separate components.
23. What Azure service enables you, as a developer, to conveniently allow and manage access to your function?
The Azure API Management (APIM) service allows developers to conveniently manage access to Azure functions. This model enables external partners, for example, to make secure HTTP calls to internal functions or microservices. The steps to do so are as follows:
A. Either through the command line or Azure portal, create your http-triggered, Azure function apps.
B. Through the portal, search for the function of interest, and select it.
C. In the middle pane in the API section, find and select "API Management".
D. In the right pane, find and click on the "Create new" link, enter the appropriate details and click "Create".
E. After it has been created, click "Link API". In the next screen, click "Select" to continue.
F. In the next screen, "Create from Function App", change the "API URL suffix" to an appropriate name and click "Create".
APIM enables you to monitor the performance and apply security policies to each microservice. You can also define the output format for each. For example, you can enforce transformation of XML output to JSON.
24. What Azure mechanism could you use to implement a persistent connection between a datasource input and your Azure function?
SignalR is a set of technologies that allows two-way communication between a publishing data source and any subscribing clients. Through this technology, a persistent connection can be established between a Cosmos DB instance, for example, and an Azure function, allowing changes in the database to be monitored by the function, which can then respond accordingly.
25. Describe a webhook and how you would use it in an Azure function.
A webhook is an HTTP-triggered callback with an associated URL, which is requested when a configured event fires. A common use case for a webhook is responding to changes in a source code repository. In a DevOps environment, you can configure a webhook-enabled Azure function to respond to changes in a GitHub repository; for example, GitHub's Gollum webhook event fires whenever a wiki page is created or edited. In the Payload URL box of the GitHub UI, add the URL of the Azure function that should respond when the event is triggered.
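A webhook receiver is ultimately just an HTTP endpoint that parses the event payload. Below is a minimal sketch of the handler logic (the payload fields follow GitHub's documented Gollum event shape, but this is an illustration, not production code):

```python
import json

def handle_gollum_event(body: str) -> str:
    """Summarize a GitHub 'gollum' (wiki change) webhook payload."""
    payload = json.loads(body)
    pages = payload.get("pages", [])
    return "; ".join(f"{p['action']}: {p['title']}" for p in pages)

# A simplified example payload like GitHub would POST to your function's URL
example = json.dumps({"pages": [{"action": "edited", "title": "Home"}]})
print(handle_gollum_event(example))  # edited: Home
```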
26. What are the steps required for creating an Azure function through Visual Studio?
Though you can directly create functions in your Azure subscription through the portal, you may wish to first create and test your function locally. Visual Studio affords this opportunity and allows you to develop your code in C#.
Steps:
1. Create a new project and select the "Azure Functions" template.
2. On the next page, select the type of trigger to implement. This will create the Azure Function application, to which you add your specific function. Remember that the Azure Function app acts as a container, to which you can add functions.
3. Right-click on the newly created project, select "Add" and select "New Azure Function". Then, name the function.
4. You will next get prompted for the type of trigger to implement.
5. You can next debug your function as you would any other application.
6. If you implemented an HTTP trigger, running the application will display a command window showing the URLs of your functions.
7. With an HTTP trigger function, you would simply make a request to that URL, passing the appropriate parameters, through a browser or curl.
27. How are you able to develop and test Azure functions without using an IDE?
You can use Azure Functions Core Tools to create and test Azure functions locally. It is a command-line tool that allows you to perform the following:
-locally create the files and folders needed for function development
-execute the function on your local workstation and allow you to test and debug it
-publish your function to the Azure app service for production use
28. How are you able to persist state and use functions to create a long-running workflow?
Durable Functions allow you to chain together a series of functions, implementing a lengthy workflow and persisting values from one function to another. There are several approaches to implementing Durable Functions. In the simplest case, you can implement "function chaining", where the output of one function becomes the input of the next and the last function returns your final result:
--->input parameters ---->function 1: output1 ----> (input1)function 2: output2 ---->(input2)function 3: final result
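In code, function chaining is simply feeding each function's output to the next. A conceptual sketch of the diagram above (plain Python, not the Durable Functions framework; the functions themselves are hypothetical):

```python
def function1(x):
    return x + 1           # output1

def function2(x):
    return x * 2           # output2

def function3(x):
    return f"result: {x}"  # final result

def orchestrate(initial):
    """The orchestrator persists each intermediate value and passes it along."""
    out1 = function1(initial)
    out2 = function2(out1)
    return function3(out2)

print(orchestrate(5))  # result: 12
```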
29. What are the different types of triggers for an Azure function?
- A timer trigger
Use case - create a trigger that executes when an appointment is made and then a timer trigger as a reminder for day-of appointments
Steps: A. Create a function app (a function app is like a container to which you can add functions) and then deploy it. B. To the function app, add a function, and add a timer trigger to it from the "Select a template" list. You can then edit the trigger and specify the interval at which to execute the trigger.
- HTTP trigger
This trigger executes a function when it receives an http request. After creating and deploying your function app, create your function and select the HTTP trigger type from the template list. A "Get function URL" link is available to copy the generated link which will trigger the function.
- Storage trigger
This trigger is executed when a file is uploaded to a storage account, most commonly blob storage. A blob storage account is designed to hold large amounts of unstructured data.
30. What is an Azure Function? Provide a use case for implementing one.
An Azure Function allows developers to design and code custom functionality in a variety of languages (C#, F#, JavaScript, Python, and PowerShell Core). You can create your code locally in Visual Studio or use Azure Cloud Shell's online code editor. You would then create an Azure Function application through the Azure portal. The function is connected to inputs and outputs through "bindings", which connect a source and target to your function. It can then be executed through a number of "triggers", which can be the addition of records to a Cosmos DB or storage blob, or an HTTP request, among others. It can also be set up as a scheduled trigger by implementing a timer. This capability frees the developer from implementing any infrastructure to host the function and allows them to focus simply on developing the logic. This design would be ideal for monitoring smart devices (IoT) in the field, for example. The devices can be bound as a data source to your function, which can then monitor changes in temperature, humidity, or other data you wish to analyze.