Monday, May 22, 2023

minimal api authentication JWT .NET 6

To implement JWT (JSON Web Token) authentication in a .NET 6 Minimal API, you can follow these steps:


Step 1: Create a new .NET 6 Minimal API project.


Step 2: Install the required NuGet packages:


dotnet add package Microsoft.AspNetCore.Authentication.JwtBearer
dotnet add package System.IdentityModel.Tokens.Jwt



Step 3: Configure JWT authentication in the `Program.cs` file:


using System.Text;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.IdentityModel.Tokens;

var builder = WebApplication.CreateBuilder(args);

// JWT configuration: read the signing key from configuration
var jwtSettings = builder.Configuration.GetSection("JwtSettings");
var key = Encoding.ASCII.GetBytes(jwtSettings["SecretKey"]);

var tokenValidationParameters = new TokenValidationParameters
{
    ValidateIssuerSigningKey = true,
    IssuerSigningKey = new SymmetricSecurityKey(key),
    ValidateIssuer = false,
    ValidateAudience = false
};

builder.Services.AddAuthentication(options =>
{
    options.DefaultAuthenticateScheme = JwtBearerDefaults.AuthenticationScheme;
    options.DefaultChallengeScheme = JwtBearerDefaults.AuthenticationScheme;
})
.AddJwtBearer(options =>
{
    options.TokenValidationParameters = tokenValidationParameters;
});

builder.Services.AddAuthorization();

// Registering the parameters as a singleton lets you reuse them if you
// need to validate tokens manually elsewhere in the application.
builder.Services.AddSingleton(tokenValidationParameters);

var app = builder.Build();

// The authentication/authorization middleware must run before the endpoints
app.UseAuthentication();
app.UseAuthorization();

// ... map your endpoints (Steps 5 and 6 below), then call app.Run();




Step 4: Configure the JWT secret key in the `appsettings.json` file. Use a long, random value: HMAC-SHA256 requires a key of at least 128 bits (16 bytes), and 32 bytes or more is a sensible choice:

{
  "JwtSettings": {
    "SecretKey": "your_secret_key_here"
  }
}
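
If you need to generate a suitable secret, here is a quick sketch (run it once and store the output in configuration or a secret store; the variable name is illustrative):

using System.Security.Cryptography;

// Produces a random 256-bit key, Base64-encoded
var secretKey = Convert.ToBase64String(RandomNumberGenerator.GetBytes(32));
Console.WriteLine(secretKey);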



Step 5: Protect your API endpoints. With Minimal APIs, call `RequireAuthorization()` on the endpoint, which is the equivalent of the `[Authorize]` attribute on controllers:


app.MapGet("/protected", () =>
{
    return "This is a protected endpoint.";
}).RequireAuthorization(); // Requires an authenticated caller for this endpoint
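
`RequireAuthorization()` also accepts policy names when you need finer-grained rules. A sketch (the "AdminOnly" policy and /admin route are illustrative, not part of the steps above):

// Use this overload in place of the bare AddAuthorization() call from Step 3
// (requires using System.Security.Claims; for ClaimTypes)
builder.Services.AddAuthorization(options =>
    options.AddPolicy("AdminOnly", policy => policy.RequireClaim(ClaimTypes.Role, "Admin")));

app.MapGet("/admin", () => "Admins only.").RequireAuthorization("AdminOnly");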


Step 6: Generate JWT tokens during the login process:


using System.IdentityModel.Tokens.Jwt;
using System.Security.Claims;
using System.Text;
using Microsoft.Extensions.Configuration;
using Microsoft.IdentityModel.Tokens;

app.MapPost("/login", (LoginModel model, IConfiguration configuration) =>
{
    // Validate the user credentials and generate a JWT token
    if (IsValidUser(model.Username, model.Password))
    {
        var tokenHandler = new JwtSecurityTokenHandler();
        var jwtSettings = configuration.GetSection("JwtSettings");
        var key = Encoding.ASCII.GetBytes(jwtSettings["SecretKey"]);
        var tokenDescriptor = new SecurityTokenDescriptor
        {
            Subject = new ClaimsIdentity(new[]
            {
                new Claim(ClaimTypes.Name, model.Username)
            }),
            Expires = DateTime.UtcNow.AddHours(1),
            SigningCredentials = new SigningCredentials(
                new SymmetricSecurityKey(key), SecurityAlgorithms.HmacSha256Signature)
        };
        var token = tokenHandler.CreateToken(tokenDescriptor);
        var tokenString = tokenHandler.WriteToken(token);
        return Results.Ok(new { Token = tokenString });
    }
    else
    {
        return Results.Unauthorized();
    }
});
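
The `LoginModel` type and `IsValidUser` check above are left undefined. A minimal sketch with placeholder logic only (declare these after the top-level statements in `Program.cs`):

// Hypothetical request body for the /login endpoint
record LoginModel(string Username, string Password);

// Placeholder credential check; replace with a lookup against your user store.
// Never ship hard-coded credentials.
static bool IsValidUser(string username, string password) =>
    username == "demo" && password == "demo";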



Step 7: Test the protected endpoints by including the JWT token in the `Authorization` header of the request:


GET /protected HTTP/1.1
Host: localhost:5000
Authorization: Bearer <your_token_here>
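
The same call from C# with `HttpClient`, as a sketch (the base address and token value are illustrative):

using System.Net.Http.Headers;

var token = "<your_token_here>"; // the token returned by /login
using var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

var response = await client.GetAsync("/protected");
Console.WriteLine(await response.Content.ReadAsStringAsync());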



That's it! With these steps, you have implemented JWT authentication in a .NET 6 Minimal API. Remember to customize the authentication and authorization logic according to your requirements.

Sunday, May 21, 2023

How to save a file to Azure Storage Account through App Service?

To save a file to an Azure Storage Account through an App Service, you can follow these general steps:


1. Set up an Azure Storage Account: Create a storage account in the Azure portal if you haven't already done so. Note down the storage account name and access key, as you will need them later.


2. Configure App Service settings: In the Azure portal, navigate to your App Service and go to the "Configuration" section. Add or modify the following application settings:


   - `AzureStorageAccountName`: Set this to your Azure Storage Account name.

   - `AzureStorageAccountKey`: Set this to the access key of your Azure Storage Account.

   - `AzureStorageContainerName`: Specify the name of the container within the storage account where you want to store the file.


3. Add code to your application: Depending on the programming language and framework you are using for your App Service, the code implementation may vary. Here's an example using C# and the (now legacy) Microsoft.WindowsAzure.Storage SDK:



   using Microsoft.WindowsAzure.Storage;
   using Microsoft.WindowsAzure.Storage.Auth;
   using Microsoft.WindowsAzure.Storage.Blob;
   using System.IO;

   // Retrieve the storage account credentials and container name from app settings
   var storageAccountName = System.Environment.GetEnvironmentVariable("AzureStorageAccountName");
   var storageAccountKey = System.Environment.GetEnvironmentVariable("AzureStorageAccountKey");
   var containerName = System.Environment.GetEnvironmentVariable("AzureStorageContainerName");

   // Create a CloudStorageAccount object
   var storageAccount = new CloudStorageAccount(
       new StorageCredentials(storageAccountName, storageAccountKey), true);

   // Create a CloudBlobClient object
   var blobClient = storageAccount.CreateCloudBlobClient();

   // Get a reference to the container
   var container = blobClient.GetContainerReference(containerName);

   // Create the container if it doesn't exist
   await container.CreateIfNotExistsAsync();

   // Set the permissions for the container (optional; this makes blobs publicly readable)
   await container.SetPermissionsAsync(new BlobContainerPermissions
   {
       PublicAccess = BlobContainerPublicAccessType.Blob
   });

   // Create a CloudBlockBlob object
   var blob = container.GetBlockBlobReference("filename.txt");

   // Upload the file to the blob
   using (var fileStream = File.OpenRead("path/to/file.txt"))
   {
       await blob.UploadFromStreamAsync(fileStream);
   }


   In this example, make sure to replace `"filename.txt"` with the desired name of the file in the storage account and `"path/to/file.txt"` with the actual path of the file you want to upload.


4. Deploy and test: Deploy your App Service with the updated code and test the functionality by uploading a file. The file should be saved to the specified Azure Storage Account and container.


Note: Ensure that the appropriate SDK or library is installed for your programming language and framework to interact with Azure Storage, such as `Microsoft.WindowsAzure.Storage` for C#/.NET.
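
For new projects, the same upload is considerably shorter with the successor `Azure.Storage.Blobs` package. A sketch (the `AzureStorageConnectionString` setting is an assumption, not one of the settings configured above):

   using Azure.Storage.Blobs;
   using System.IO;

   // Hypothetical app setting holding the full storage connection string
   var connectionString = System.Environment.GetEnvironmentVariable("AzureStorageConnectionString");

   var container = new BlobContainerClient(connectionString, "my-container");
   await container.CreateIfNotExistsAsync();

   // Upload the local file as a blob named "filename.txt"
   using var fileStream = File.OpenRead("path/to/file.txt");
   await container.UploadBlobAsync("filename.txt", fileStream);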

Saturday, May 20, 2023

Optimizing Azure SQL Performance: Bulk Inserts and Commit Control in JDBC

There are still some ways you can improve the performance of your inserts. Here are a few suggestions:

  1. Use Parameterized Queries: You're already using parameterized queries, which is good. It helps with query performance and prevents SQL injection. Make sure the column types in your table match the data types you're setting in the PreparedStatement.

  2. Increase Batch Size: Instead of executing the batch every 10,000 rows, you can try increasing the batch size to a larger number. This can help reduce the number of round trips to the database and improve performance. Experiment with different batch sizes to find the optimal value for your specific scenario.

  3. Use Connection Pooling: You can use HikariCP for connection pooling, which is recommended for efficient connection management. Connection pooling helps reuse existing connections instead of creating new connections for each insert, which can significantly improve performance.

  4. Disable Auto-Commit: By default, JDBC commits each individual statement execution, which can be costly for bulk inserts. You can try disabling auto-commit mode before executing the batch and commit explicitly after the batch completes. This can help reduce the overhead of committing after every single statement.

Here's an updated version of your code incorporating these suggestions:

public void saveAllJdbc(List<JdbcDetail> jdbcDetails) {
    String sql = String.format("INSERT INTO %s VALUES (?, ?, ?, ?)", "my-table");
    try (Connection connection = hikariDataSource.getConnection();
         PreparedStatement statement = connection.prepareStatement(sql)) {
        int counter = 0;
        connection.setAutoCommit(false); // Disable auto-commit
        for (JdbcDetail row : jdbcDetails) {
            statement.clearParameters();
            statement.setInt(1, row.getQuantity());
            statement.setDate(2, Date.valueOf(row.getDate()));
            statement.setFloat(3, row.getId());
            statement.setInt(4, row.getNum());
            statement.addBatch();
            if ((counter + 1) % 10000 == 0) {
                statement.executeBatch();
                statement.clearBatch();
                connection.commit(); // Explicitly commit the batch
            }
            counter++;
        }
        statement.executeBatch();       // Execute any remaining statements in the batch
        connection.commit();            // Commit the final batch
        connection.setAutoCommit(true); // Re-enable auto-commit
    } catch (Exception e) {
        e.printStackTrace();
    }
}

By increasing the batch size and disabling auto-commit, you should see improvements in the performance of your insert operations in Azure SQL. Remember to experiment with different batch sizes to find the optimal value for your specific scenario.
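
Depending on your driver and server versions, the Microsoft JDBC driver's useBulkCopyForBatchInsert connection property can also convert batched inserts into bulk-copy operations; check the driver documentation for support details. It is set on the JDBC URL, for example (server and database names are illustrative):

jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb;useBulkCopyForBatchInsert=true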

Friday, May 19, 2023

Streamlining File Archiving: How Azure Logic Apps, Office 365, and Blob Storage Solved Our Client's Document Management Challenge

In this blog post, we share a real-world problem faced by one of our clients regarding document management and how we successfully addressed it using Azure Logic Apps, Office 365, and Blob Storage. We will discuss the client's specific needs, how we conceptualized and implemented the solution, and the benefits it brought to their organization.

Client Challenge: Our client, a growing financial services company, was struggling with an inefficient and error-prone manual process for archiving important documents. They dealt with a high volume of emails containing attachments, and their team had to manually save each attachment to a local file system, leading to delays, misplaced files, and increased operational costs. They sought a streamlined and automated solution to improve their document management workflow.

Solution Design and Implementation: Understanding our client's pain points, we proposed an automated file archiving solution leveraging Azure Logic Apps, Office 365, and Blob Storage. Here is how we designed and implemented the solution:

  1. Azure Logic Apps Setup: We created an Azure Logic App to orchestrate the workflow. The Logic App acted as the central hub for connecting the different components and driving the automation.

  2. Office 365 Connector Integration: We integrated the Office 365 Outlook connector with the Logic App. This allowed us to leverage Office 365's powerful email capabilities, enabling seamless interaction with the client's mailbox.

  3. Triggering the Workflow: To initiate the workflow, we configured a trigger that monitored the client's mailbox for new emails. We customized the trigger to filter emails based on specific criteria such as subject lines, senders, or keywords related to important documents.

  4. Saving Attachments to Blob Storage: Using the Blob Storage connector within the Logic App, we seamlessly connected to the client's Azure Blob Storage account. When a new email arrived, the Logic App automatically extracted and saved the attachments directly to Blob Storage, eliminating the need for manual intervention.

  5. Archiving and Organizing Files: To ensure efficient file organization, we implemented custom logic within the Logic App. This included renaming files, adding metadata, and organizing them into appropriate folders based on the email attributes or other relevant criteria defined by the client.

Benefits and Results: By implementing this integrated solution, our client experienced significant improvements in their document management process:

  • Time and Cost Savings: The automated file archiving workflow drastically reduced manual efforts, saving the team countless hours each week. This allowed them to reallocate resources to more value-added tasks, leading to cost savings in the long run.

  • Error Reduction: Manual errors and file misplacements were virtually eliminated, as the process became standardized and automated. The risk of losing critical documents was significantly mitigated.

  • Enhanced Access and Searchability: Storing files in Blob Storage facilitated easy retrieval and improved searchability. With organized folder structures and metadata, the team could quickly locate specific documents when needed.

Thursday, May 18, 2023

azure create Linux VM with ssh and storage options

az vm create --name VMname --resource-group RGname --image UbuntuLTS --generate-ssh-keys --storage-sku Premium_LRS --os-disk-size-gb 64


Create Azure VM using terraform ?

To create an Azure virtual machine (VM) using Terraform, you need to follow these general steps:

  1. Set up Azure credentials: Before you begin, you'll need to set up your Azure credentials to authenticate Terraform with your Azure account. You can create a service principal or use other authentication methods supported by Azure.

  2. Create a Terraform configuration file: Create a file with a .tf extension (e.g., main.tf) to define your Terraform configuration. In this file, you'll specify the desired state of your Azure VM and other related resources.

Here's an example of a basic Terraform configuration file to create an Azure VM:


provider "azurerm" { features {} } resource "azurerm_resource_group" "example" { name = "my-resource-group" location = "East US" } resource "azurerm_virtual_network" "example" { name = "my-vnet" address_space = ["10.0.0.0/16"] location = azurerm_resource_group.example.location resource_group_name = azurerm_resource_group.example.name } resource "azurerm_subnet" "example" { name = "my-subnet" resource_group_name = azurerm_resource_group.example.name virtual_network_name = azurerm_virtual_network.example.name address_prefixes = ["10.0.1.0/24"] } resource "azurerm_network_interface" "example" { name = "my-nic" location = azurerm_resource_group.example.location resource_group_name = azurerm_resource_group.example.name ip_configuration { name = "my-ipconfig" subnet_id = azurerm_subnet.example.id private_ip_address_allocation = "Dynamic" } } resource "azurerm_virtual_machine" "example" { name = "my-vm" location = azurerm_resource_group.example.location resource_group_name = azurerm_resource_group.example.name network_interface_ids = [azurerm_network_interface.example.id] vm_size = "Standard_DS1_v2" storage_image_reference { publisher = "Canonical" offer = "UbuntuServer" sku = "16.04-LTS" version = "latest" } storage_os_disk { name = "my-os-disk" caching = "ReadWrite" create_option = "FromImage" managed_disk_type = "Standard_LRS" } os_profile { computer_name = "my-vm" admin_username = "adminuser" admin_password = "Password1234!" } os_profile_linux_config { disable_password_authentication = false } }
  3. Initialize and apply the Terraform configuration: Run the following commands in the directory that contains your Terraform configuration file:

terraform init
terraform apply

The terraform init command initializes the Terraform working directory and downloads the necessary provider plugins. The terraform apply command creates or updates the Azure resources defined in your configuration based on the desired state.

Note: Make sure you have Terraform and the Azure provider installed before running these commands.

This is a basic example, and you can customize it further based on your specific requirements for the VM, such as the VM size, storage options, networking configuration, and more. Refer to the Azure provider documentation on the Terraform website for more details and additional configuration options.

Remember to review and understand the changes that Terraform will make to your Azure resources before confirming the execution.
