Thursday, May 4, 2023

Performing SAML Authentication Against Azure AD in Laravel Without the ext-http Extension

Yes, you can use cURL instead of the ext-http extension in Laravel to perform SAML authentication against Azure AD. Here's how you can do it:

  1. Install the LightSaml library in your Laravel application using Composer:

composer require lightsaml/lightsaml

  2. Use PHP's cURL extension to send the SAML request to Azure AD.

Here's an example of how to use cURL to send a SAML request:

$url = 'https://login.microsoftonline.com/[tenant-id]/saml2';
$relayState = 'https://example.com/dashboard';
$id = '_' . sha1(uniqid('', true));
$issueInstant = gmdate('Y-m-d\TH:i:s\Z');
$samlRequest = '...'; // The SAML request XML

// Build the form-encoded body once so it can be reused for the Content-Length header
$postBody = http_build_query(array(
    'SAMLRequest' => base64_encode($samlRequest),
    'RelayState'  => $relayState,
));

$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_URL => $url,
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => $postBody,
    CURLOPT_HTTPHEADER => array(
        'Content-Type: application/x-www-form-urlencoded',
        'Content-Length: ' . strlen($postBody),
        'Accept-Encoding: gzip, deflate',
        'Accept-Language: en-US,en;q=0.9',
        'Connection: keep-alive',
        'Host: login.microsoftonline.com',
        'Referer: https://example.com/login',
        'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36',
    ),
    CURLOPT_RETURNTRANSFER => true,
));
$response = curl_exec($curl);
curl_close($curl);

  3. Parse the SAML response received from Azure AD using the LightSaml library.

Here's an example of how to use the LightSaml library to parse the SAML response:

// Build a deserialization context from the raw XML returned by Azure AD
$deserializationContext = new \LightSaml\Model\Context\DeserializationContext();
$deserializationContext->getDocument()->loadXML($response);

// Hydrate a SAML Response object from the DOM
$samlResponse = new \LightSaml\Model\Protocol\Response();
$samlResponse->deserialize($deserializationContext->getDocument()->firstChild, $deserializationContext);


 By following these steps, you can perform SAML authentication against Azure AD in Laravel without using the ext-http extension.

How to Include @search.score in Azure Cognitive Search Suggest Response

 You can return the search score (@search.score) along with the suggested search terms in the response from the Azure Cognitive Search Suggest API. Here's how you can do it:

  1. In the suggest query, add "@search.score" to the "$select" parameter to include the search score in the response.

For example:

https://[service name].search.windows.net/indexes/[index name]/docs/suggest?api-version=[api-version]&suggesterName=[suggester name]&search=[user input]&$select=searchText,@search.score

  2. In the suggester definition, add "@search.score" to the "sourceFields" parameter to enable scoring of the suggested search terms.

For example:

{ "name": "[suggester name]", "searchMode": "analyzingInfixMatching", "sourceFields": ["[field name 1]", "[field name 2]", "@search.score"] }


How to Create a Static Website in Azure Accessible Only on Company VPN with Custom Domain

 It is possible to have a static website in Azure with a custom domain that is fully locked down to just a company's VPN.

Here are the steps you can follow:

  1. Create a storage account and enable static website hosting.

  2. Upload your static website content to the $web container in the storage account.

  3. Create a private endpoint for the storage account.

  4. Configure the private endpoint to allow traffic only from the company's VPN.

  5. Create a CDN profile and a CDN endpoint.

  6. Configure the CDN endpoint to use the storage account as the origin.

  7. Create a CNAME record for your custom domain pointing to the Azure CDN endpoint.

  8. Configure the CDN endpoint to use HTTPS and the custom domain.

  9. Lock down the CDN endpoint to allow traffic only from the company's VPN.

By following these steps, you can have a static website with a custom domain that is fully locked down to just a company's VPN in Azure. The CDN endpoint will serve the static website content from the storage account, and access to the CDN endpoint will be restricted to only the company's VPN.
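
Steps 1 and 2 can also be scripted. Here is a minimal Python sketch using the azure-storage-blob package; the connection string and file names are placeholders, and the private endpoint, VPN, and CDN configuration from the later steps still needs to be done separately (portal, CLI, or infrastructure as code):

from azure.storage.blob import BlobServiceClient, ContentSettings, StaticWebsite

# Placeholder connection string for the storage account created in step 1
connection_string = "<storage-account-connection-string>"

service = BlobServiceClient.from_connection_string(connection_string)

# Enable static website hosting on the storage account
service.set_service_properties(
    static_website=StaticWebsite(
        enabled=True,
        index_document="index.html",
        error_document404_path="404.html",
    )
)

# Upload the site content to the special $web container (step 2)
web_container = service.get_container_client("$web")
with open("index.html", "rb") as data:
    web_container.upload_blob(
        name="index.html",
        data=data,
        overwrite=True,
        content_settings=ContentSettings(content_type="text/html"),
    )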

Tuesday, May 2, 2023

Detecting Changes in Azure Data Factory Triggers with KQL Queries

 To detect changes in Azure Data Factory (ADF) triggers using Kusto Query Language (KQL), you can use the AzureActivity table in Log Analytics. You can use the following KQL query to identify trigger changes:


AzureActivity
| where Category == "DataFactoryPipelineRun"
| where OperationName == "Microsoft.DataFactory/factories/pipelines/create"
| where ResourceProvider == "MICROSOFT.DATAFACTORY"
| where ActivityStatusValue == "Succeeded"
| where Details contains "New-AzDataFactoryPipeline"


This query looks for successful pipeline creation operations in ADF and specifically checks whether the New-AzDataFactoryPipeline command was used, indicating a new pipeline was created. You can adjust the query to filter for specific triggers or time ranges by adding additional where clauses; for trigger changes specifically, filter OperationName on the trigger write operation (Microsoft.DataFactory/factories/triggers/write) instead of the pipeline operation.

Note that if ADF auditing is not enabled, or if the logs are not sent to Log Analytics, the AzureActivity table may not contain the necessary information.
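
If you prefer to run this check from a script instead of the Log Analytics portal, here is a minimal Python sketch using the azure-monitor-query and azure-identity packages. The workspace ID is a placeholder, and the signed-in identity is assumed to have read access to the workspace:

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder Log Analytics workspace ID
workspace_id = "<log-analytics-workspace-id>"

# The KQL query shown above
query = """
AzureActivity
| where Category == "DataFactoryPipelineRun"
| where OperationName == "Microsoft.DataFactory/factories/pipelines/create"
| where ResourceProvider == "MICROSOFT.DATAFACTORY"
| where ActivityStatusValue == "Succeeded"
| where Details contains "New-AzDataFactoryPipeline"
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(workspace_id, query, timespan=timedelta(days=7))

# Print every matching activity record from the last 7 days
for table in response.tables:
    for row in table.rows:
        print(row)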

Moving Azure Functions from App Service plan to Consumption Plan: A Step-by-Step Guide for Smooth Transition

 Moving Azure Functions from an App Service plan to a Consumption plan involves a few steps, as outlined below:

  1. Create a new Function App on the Consumption plan: First, you need to create a new Function App on the Consumption plan in the Azure portal.

  2. Deploy your Functions code to the new Function App: Once you have created a new Function App on the Consumption plan, you can deploy your existing Functions code to the new app. You can do this by publishing the code from your development environment or by using tools like Azure DevOps or Visual Studio.

  3. Move the configuration to the new Function App: Once you have deployed your code, copy across the application settings, connection strings, and any other configuration from the old Function App so that your Functions behave the same on the Consumption plan. You can do this in the Azure portal under each Function App's Configuration settings.

  4. Test your Functions: After you have configured your Functions to run on the Consumption plan, you should test them to ensure that they are working correctly.

  5. Delete the old Function App: Once you have verified that your Functions are working correctly on the Consumption plan, you can delete the old Function App running on the App Service plan.

It is important to note that moving Functions from an App Service plan to a Consumption plan can impact their performance and scalability, as the Consumption plan uses a pay-as-you-go model that can result in longer cold-start times for Functions. Therefore, it is important to test and monitor your Functions carefully after moving them to the Consumption plan.

Azure Service Bus Topics vs. Queues: Which One Should Be Used and When?

Azure Service Bus provides two types of messaging entities: queues and topics. Both queues and topics can be used to implement messaging patterns like point-to-point and publish-subscribe, but they have different characteristics and are best suited for different scenarios.

Azure Service Bus Queues:

  • Queues provide a one-to-one messaging pattern, where a message is sent to a single recipient (receiver) that retrieves and processes the message from the queue.
  • The messages in the queue are processed in the order they were received.
  • Queues guarantee message delivery to a single recipient in a first-in, first-out (FIFO) order.
  • Queues can be used for load leveling, as multiple receivers can be set up to retrieve and process messages from the same queue.

Azure Service Bus Topics:

  • Topics provide a one-to-many messaging pattern, where a message is sent to a topic and all subscribers receive a copy of the message.
  • Subscribers can filter the messages they receive based on message properties.
  • Topics are used for publish-subscribe scenarios, where multiple subscribers need to receive the same message.
  • Topics provide a way to decouple the sender and the receivers, as the sender doesn't need to know who the subscribers are or how many there are.

Here's an example of when to use a queue versus a topic:

Suppose you have an e-commerce website that needs to process orders. Each order consists of multiple items, and each item needs to be processed by a different component in your system. You have two options to implement the message passing mechanism:

  1. Use a queue: In this case, you can have a single queue for all the orders, and each item in the order is sent as a separate message to the queue. Each component can retrieve and process the messages from the queue in a FIFO order, and you can use multiple instances of the component to process messages in parallel.

  2. Use a topic: In this case, you can have a topic for each component, and each item in the order is sent as a message to the corresponding topic. Each component subscribes to its own topic and receives only the messages that match its subscription filter. This way, each component processes only the messages it needs to process, and you can add or remove components without affecting the other components.

In summary, queues are best suited for point-to-point scenarios where messages need to be processed in a specific order, while topics are best suited for publish-subscribe scenarios where multiple subscribers need to receive the same message.
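
To make the difference concrete, here is a minimal Python sketch using the azure-servicebus package that sends an order item to a queue (point-to-point) and publishes the same item to a topic (publish-subscribe). The connection string and the queue, topic, and subscription names are placeholders:

from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Placeholder names - replace with your own namespace entities
connection_string = "<service-bus-connection-string>"
queue_name = "orders"
topic_name = "order-items"
subscription_name = "billing"

with ServiceBusClient.from_connection_string(connection_string) as client:
    # Point-to-point: a single receiver will process this message from the queue
    with client.get_queue_sender(queue_name=queue_name) as sender:
        sender.send_messages(ServiceBusMessage('{"orderId": 42, "item": "book"}'))

    # Publish-subscribe: every subscription on the topic gets its own copy
    with client.get_topic_sender(topic_name=topic_name) as sender:
        sender.send_messages(ServiceBusMessage('{"orderId": 42, "item": "book"}'))

    # One subscriber reading its copy from a subscription
    with client.get_subscription_receiver(
        topic_name=topic_name, subscription_name=subscription_name
    ) as receiver:
        for message in receiver.receive_messages(max_message_count=5, max_wait_time=5):
            print(str(message))
            receiver.complete_message(message)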

Running All Create SQL Scripts Hosted in Synapse Workspace Using Azure DevOps Pipeline CI/CD

As organizations adopt Azure Synapse Analytics for their data warehousing and big data analytics workloads, they need to automate their deployment and release processes to ensure consistency and reliability. Azure DevOps provides a robust and flexible platform for continuous integration and continuous deployment (CI/CD) of Synapse Analytics artifacts, including SQL scripts, notebooks, pipelines, and more.

In this blog post, we will focus on how to use Azure DevOps Pipeline to run all the Create SQL scripts that are hosted in Synapse Workspace at the end of the deployment process. We will provide step-by-step instructions, code snippets, and best practices for achieving this goal.

Step 1: Create a Synapse Workspace

To start, we need to have a Synapse Workspace that contains one or more SQL scripts that we want to run. Follow these steps to create a Synapse Workspace:

  1. Sign in to the Azure portal and navigate to the Synapse Workspace resource.
  2. Click on the "Add" button to create a new Synapse Workspace.
  3. Provide a unique name, subscription, resource group, and region for the workspace.
  4. Review and accept the default settings for the workspace, such as workspace pricing tier, workspace storage account, and workspace managed virtual network.
  5. Click on the "Review + create" button to create the workspace.
  6. Wait for the workspace to be created, which may take a few minutes.

Step 2: Create a SQL script in Synapse Workspace

Now, let's create a sample SQL script in Synapse Workspace that we will use for testing the pipeline later. Follow these steps to create a SQL script:

  1. Sign in to the Azure Synapse Studio by clicking on the "Open Synapse Studio" button on the Synapse Workspace resource page.
  2. Navigate to the "Develop" tab and click on the "New" button to create a new SQL script.
  3. Provide a name and a description for the script.
  4. Write the SQL code that creates a sample table in the workspace.
  5. Click on the "Save" button to save the script.

Step 3: Create an Azure DevOps Pipeline

Next, let's create an Azure DevOps Pipeline that will deploy the Synapse Workspace artifacts and run the SQL scripts at the end of the deployment. Follow these steps to create a pipeline:

  1. Sign in to the Azure DevOps portal and navigate to the project that contains the Synapse Workspace artifacts.
  2. Click on the "Pipelines" menu and then click on the "New pipeline" button.
  3. Select the "Azure Repos Git" option as the source of the pipeline.
  4. Select the repository that contains the Synapse Workspace artifacts.
  5. Select the "Starter pipeline" template as the starting point for the pipeline.
  6. Click on the "Save and run" button to save and run the pipeline.

Step 4: Add a PowerShell script to the pipeline

To run the SQL scripts in Synapse Workspace, we need to add a PowerShell script to the pipeline that will execute the scripts. Follow these steps to add a PowerShell script to the pipeline:

  1. Open the pipeline editor by clicking on the "Edit" button on the pipeline page.
  2. Navigate to the "Tasks" section and click on the "+" button to add a new task.
  3. Search for the "PowerShell" task and add it to the pipeline with a script that lists all the SQL scripts found in the Synapse Workspace artifacts for the specified environment. When the pipeline runs, this task prints out the scripts that will be executed.
  4. To actually run these SQL scripts, we need to add another task that will connect to the Synapse Workspace and execute these scripts. For this, we can use the Azure Synapse Analytics SQL script execution task.
  5. Here's an example YAML code for this task:

- task: SqlTask@2
  inputs:
    azureSubscription: '<Azure subscription name>'
    sqlServerName: '<Synapse workspace name>'
    databaseName: '<Database name>'
    sqlFile: '**/Create*.sql'

In this code, we are specifying the Azure subscription name, Synapse workspace name, and database name where the SQL scripts are located. We are also using the **/Create*.sql pattern to search for all SQL scripts that start with "Create" in any folder.

This task will connect to the Synapse workspace and execute all the SQL scripts found by the previous task.

With these two tasks added to our pipeline, we can now automatically run all the Create SQL scripts that are hosted in the Synapse Workspace after deploying our artifacts to the specified environment.
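
If the SQL script execution task is not available in your organization, the same step can be implemented with a script task instead. The sketch below is an assumption rather than part of the original pipeline: it uses Python and pyodbc to find every Create*.sql file in the repository checkout and run it against the Synapse SQL endpoint. The server, database, and credentials are placeholders expected to come from pipeline variables, an ODBC driver for SQL Server is assumed to be installed on the agent, and each script is assumed to be a single batch (no GO separators):

import glob
import os

import pyodbc

# Placeholders - in a pipeline these would come from variables/secrets
server = os.environ.get("SYNAPSE_SQL_ENDPOINT", "<workspace>.sql.azuresynapse.net")
database = os.environ.get("SYNAPSE_DATABASE", "<database-name>")
username = os.environ.get("SYNAPSE_USER", "<sql-user>")
password = os.environ.get("SYNAPSE_PASSWORD", "<sql-password>")

connection = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    f"Server=tcp:{server},1433;Database={database};"
    f"Uid={username};Pwd={password};Encrypt=yes;TrustServerCertificate=no;"
)
connection.autocommit = True
cursor = connection.cursor()

# Run every Create*.sql script found in the repository checkout
for script_path in sorted(glob.glob("**/Create*.sql", recursive=True)):
    print(f"Running {script_path}")
    with open(script_path, "r", encoding="utf-8") as script_file:
        cursor.execute(script_file.read())

connection.close()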

By using Azure DevOps and Azure Synapse Analytics together, we can automate the entire deployment process for our data analytics solution, including the creation of the necessary database objects. This reduces the time and effort required for manual deployment, improves consistency and accuracy, and ensures that the same process is followed for every deployment. 
