Tuesday, May 2, 2023

What is the difference between an app setting and a connection string in an Azure Function?

In an Azure Function, app settings and connection strings both store configuration that the function needs at runtime. The main difference is the kind of information each one holds.

An app setting is used to store any kind of configuration information that is not related to a database connection, such as API keys, URLs, or any other settings that your function might need. App settings are typically key-value pairs that can be accessed by your function through the configuration object.

On the other hand, a connection string is used to store information that is required to connect to a database or other external resource, such as a storage account. Connection strings typically include information about the server name, database name, username, and password.

While both app settings and connection strings are used to store configuration information, it is important to note that connection strings are more sensitive in nature, and should be treated with extra care. For example, you might choose to store your connection strings in a key vault or use Azure Key Vault references to prevent unauthorized access to this sensitive information.
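As a quick illustration, both kinds of values surface as environment variables inside a running function. The sketch below is a JavaScript function using hypothetical setting names; note that connection strings configured in the portal are exposed with a type prefix such as CUSTOMCONNSTR_ or SQLCONNSTR_.

module.exports = async function (context, req) {
    // A plain app setting, read directly by name.
    const apiKey = process.env["MyApiKey"];

    // A connection string named "MyDatabase" configured with type "Custom".
    const dbConnection = process.env["CUSTOMCONNSTR_MyDatabase"];

    context.res = {
        body: `API key configured: ${Boolean(apiKey)}, connection string configured: ${Boolean(dbConnection)}`
    };
};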

What is Azure SignalR Service and how do you use Azure SignalR Service with Azure Functions?

Azure SignalR Service is a fully managed service that enables real-time messaging between client and server applications. It provides an easy way to add real-time functionality to web applications, such as chat, live updates, and real-time data visualization. SignalR supports multiple platforms and programming languages and provides a simple API for developers to use.


To use Azure SignalR Service with Azure Functions, you need to follow these steps:

  1. Create an Azure SignalR Service instance in the Azure portal.
  2. Create an Azure Functions app and a function in the Azure portal.
  3. Install the Azure SignalR Service bindings for your function app (for C# class library functions, the Microsoft.Azure.WebJobs.Extensions.SignalRService NuGet package; for other languages, the bindings are included in the extension bundle).
  4. Add the necessary code to your Azure Function to integrate with Azure SignalR Service.
  5. Configure the connection to Azure SignalR Service by providing the connection string in the Azure Function's configuration settings.
  6. Deploy your Azure Function to Azure and test the real-time messaging functionality with SignalR.

By using Azure SignalR Service with Azure Functions, you can build real-time web applications that provide seamless communication between client and server applications.
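For reference, here is a minimal sketch of the server side in JavaScript: a "negotiate" function that hands SignalR connection details to clients. It assumes a signalRConnectionInfo input binding named connectionInfo is defined in function.json and that the AzureSignalRConnectionString app setting holds your SignalR Service connection string.

module.exports = async function (context, req, connectionInfo) {
    // Return the SignalR Service URL and access token produced by the input binding.
    context.res = { body: connectionInfo };
};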

How do you create a function app?

 To create a function app in Azure, follow these steps:

  1. Sign in to the Azure portal.
  2. Click on the Create a Resource button.
  3. Search for "Function App" and select it from the list of results.
  4. Click on the Create button to begin the creation process.
  5. Fill in the required information, including the subscription, resource group, and function app name.
  6. Choose the operating system, either Windows or Linux, and the hosting plan, either Consumption or App Service Plan.
  7. Choose the runtime stack and version, such as Node.js, Python, .NET Core, or Java.
  8. Choose the region where you want to deploy the function app.
  9. Click on the Create button to create the function app.

What is a function app?

 A function app is a container for one or more Azure Functions that allows you to group related functions together as a logical unit for deployment, management, and sharing of resources. It provides an environment for developing, testing, and running functions, and can be scaled automatically based on demand.

How do you create an Azure Function?

 You can create an Azure Function using the following steps:

  1. Open the Azure portal and sign in to your account.
  2. Click on the Create a Resource button in the left-hand pane and search for "Function App".
  3. Click on the Function App option and then click on the Create button.
  4. Fill in the required information, including the subscription, resource group, and function app name.
  5. Choose the operating system, either Windows or Linux, and the hosting plan, either Consumption or App Service Plan.
  6. Choose the runtime stack and version, such as Node.js, Python, .NET Core, or Java.
  7. Choose the region where you want to deploy the function app.
  8. Click on the Create button to create the function app.

Once the function app is created, you can create a new function by following these steps:

  1. In the function app blade, click on the Functions option in the left-hand pane.
  2. Click on the + button to create a new function.
  3. Choose a template or create a custom function.
  4. Choose a trigger type for the function.
  5. Fill in the required information for the trigger and any input bindings.
  6. Write the function code in your preferred programming language (see the minimal example after this list).
  7. Test the function using the Test tab or the function URL.
  8. Save and publish the function.
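As a sketch of step 6, here is a minimal HTTP-triggered function written in JavaScript. The req/res binding names are the defaults from the HTTP trigger template; adapt the logic to your own scenario.

module.exports = async function (context, req) {
    // Read a "name" value from the query string or the JSON request body.
    const name = (req.query.name || (req.body && req.body.name)) || "world";

    // Return an HTTP 200 response with a greeting.
    context.res = {
        status: 200,
        body: `Hello, ${name}!`
    };
};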

What programming languages can you use to develop Azure Functions?

 Azure Functions supports several programming languages for developing functions, including:

  1. C#
  2. Java
  3. JavaScript (Node.js)
  4. Python
  5. PowerShell

Developers can choose the programming language based on their preference and experience. Each language has its own unique set of tools and libraries that can be used to build Azure Functions. Developers can also use integrated development environments (IDEs) such as Visual Studio and Visual Studio Code to build and debug Azure Functions written in these languages.

What are the different types of Azure Functions?

 Azure Functions provides several different types of triggers that can be used to invoke functions. Here are some of the common types of Azure Functions:

  1. HTTP Trigger: Invokes a function when an HTTP request is made to a specified URL.
  2. Timer Trigger: Invokes a function on a schedule.
  3. Blob Trigger: Invokes a function when a new blob is added to an Azure Storage container.
  4. Cosmos DB Trigger: Invokes a function when a new or updated document is added to an Azure Cosmos DB database.
  5. Event Grid Trigger: Invokes a function when an event is published to an Azure Event Grid topic.
  6. Event Hub Trigger: Invokes a function when a new message is added to an Azure Event Hub.
  7. Service Bus Queue Trigger: Invokes a function when a new message is added to an Azure Service Bus queue.
  8. Service Bus Topic Trigger: Invokes a function when a new message is added to an Azure Service Bus topic.

Developers can also create custom triggers for Azure Functions using the Azure Event Grid or the Azure Service Bus.
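For example, here is a minimal timer-triggered function in JavaScript. It assumes a timer binding named myTimer is defined in function.json with a CRON schedule (for instance "0 */5 * * * *" for every five minutes); the schedule value is only an illustration.

module.exports = async function (context, myTimer) {
    // Runs on the schedule defined in the timer binding's CRON expression.
    if (myTimer.isPastDue) {
        context.log("Timer is running late.");
    }
    context.log(`Timer trigger fired at ${new Date().toISOString()}`);
};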

What is an Azure Function?

 An Azure Function is a serverless computing service provided by Microsoft Azure that enables developers to build event-driven applications that can be executed without the need for provisioning and managing servers. With Azure Functions, developers can write small, single-purpose functions that respond to events such as HTTP requests, changes to data in Azure Storage or Azure Cosmos DB, or messages from Azure Service Bus or Azure Event Hubs. These functions can be written in several programming languages including C#, Java, JavaScript, Python, and PowerShell. Azure Functions scales automatically, from just a few instances up to thousands of instances, depending on the demand of the application.

Creating Custom Triggers for Azure Functions with Azure Event Hubs and Azure Service Bus

Azure Functions is a serverless compute service that allows you to run your code on-demand without having to manage infrastructure. With Azure Functions, you can build scalable, event-driven applications that can respond to changes in real-time. One way to achieve this is by creating custom triggers that respond to events from Azure Event Hubs and Azure Service Bus. In this tutorial, we'll show you how to create custom triggers for Azure Functions using these two services.
Prerequisites
Before we get started, you'll need to have the following:

1. An Azure account
2. Visual Studio Code
3. Azure Functions extension for Visual Studio Code
Creating an Azure Event Hub 
The first step is to create an Azure Event Hub. In the Azure portal, select "Create a resource" and search for "Event Hubs". Choose "Event Hubs" and follow the prompts to create a new Event Hub.
Once your Event Hub is created, you can send events to it using any compatible client library. In this tutorial, we'll use the Azure Functions extension for Visual Studio Code to create a custom trigger that responds to events from our Event Hub.
Creating an Azure Service Bus
The next step is to create an Azure Service Bus. In the Azure portal, select "Create a resource" and search for "Service Bus". Choose "Service Bus" and follow the prompts to create a new Service Bus.
Once your Service Bus is created, you can send messages to it using any compatible client library. We'll use the Azure Functions extension for Visual Studio Code to create a custom trigger that responds to messages from our Service Bus.
Creating Custom Triggers for Azure Functions
Now that our Event Hub and Service Bus are set up, we can create custom triggers for Azure Functions that respond to events and messages from these services.

To create a custom trigger for Azure Functions, you'll need to define a function that takes in the event or message as input. This function can then process the event or message and perform any necessary actions.
Custom Trigger for Azure Event Hubs
Here's an example of a custom trigger for Azure Event Hubs:
module.exports = async function(context, eventHubMessages) {
    context.log(`Event hub trigger function called for message array: ${eventHubMessages}`);

    eventHubMessages.forEach(message => {
        // Process message here
    });
};
This function takes in the eventHubMessages array as input and processes each message in the array. You can add your own processing logic to this function, such as sending notifications or updating a database.

To connect this function to your Event Hub, you'll need to add a new function to your Azure Functions app using the Event Hub trigger template. Follow the prompts to specify the Event Hub connection string and configure the function.
Custom Trigger for Azure Service Bus
Here's an example of a custom trigger for Azure Service Bus:
module.exports = async function(context, mySbMsg) {
    context.log(`Service bus trigger function called for message: ${mySbMsg}`);

    // Process message here
};
This function takes in the mySbMsg object as input and processes the message. You can add your own processing logic to this function, such as sending notifications or updating a database.
To connect this function to your Service Bus, you'll need to add a new function to your Azure Functions app using the Service Bus trigger template. Follow the prompts to specify the Service Bus connection string and configure the function.

Real-time Image Processing with Azure Functions and Azure Blob Storage

Image processing is a critical component of many applications, from social media to healthcare. However, processing large volumes of image data can be time-consuming and resource-intensive. In this tutorial, we'll show you how to use Azure Functions and Azure Blob Storage to create a real-time image processing pipeline that can handle large volumes of data with scalability and flexibility.

Prerequisites

Before we get started, you'll need to have the following:

  1. An Azure account
  2. Visual Studio Code
  3. Azure Functions extension for Visual Studio Code
  4. Azure Blob Storage extension for Visual Studio Code

Creating the Azure Functions App

The first step is to create an Azure Functions app. In Visual Studio Code, select the Azure Functions extension and choose "Create New Project". Follow the prompts to choose your programming language and runtime.

Once your project is created, you can create a new function by selecting the "Create Function" button in the Azure Functions Explorer. Choose the Blob trigger template to create a function that responds to new files added to Azure Blob Storage.

In this example, we'll create a function that recognizes objects in images using Azure Cognitive Services. We'll use the Cognitive Services extension for Visual Studio Code to connect to our Cognitive Services account.

Creating the Azure Blob Storage Account

Next, we'll create an Azure Blob Storage account to store our image data. In the Azure portal, select "Create a resource" and search for "Blob Storage". Choose "Storage account" and follow the prompts to create a new account.

Once your account is created, select "Containers" to create a new container for your image data. Choose a container name and access level, and select "Create". You can now add images to your container through the Azure portal or through your Azure Functions app.

Connecting the Azure Functions App to Azure Cognitive Services

To connect your Azure Functions app to Azure Cognitive Services, you'll need to add the Cognitive Services extension to your project. In Visual Studio Code, select the Extensions icon and search for "Azure Cognitive Services". Install the extension and reload Visual Studio Code.

Next, open your function code and add the following code to your function:

const { ComputerVisionClient } = require("@azure/cognitiveservices-computervision");
const { ApiKeyCredentials } = require("@azure/ms-rest-js");
const { BlobServiceClient } = require("@azure/storage-blob");

module.exports = async function (context, myBlob) {
    // Create the Computer Vision client from the endpoint and key app settings.
    const endpoint = process.env["ComputerVisionEndpoint"];
    const key = process.env["ComputerVisionKey"];
    const credentials = new ApiKeyCredentials({ inHeader: { "Ocp-Apim-Subscription-Key": key } });
    const client = new ComputerVisionClient(credentials, endpoint);

    // Connect to Blob Storage using the function app's storage connection string.
    const blobServiceClient = BlobServiceClient.fromConnectionString(process.env["AzureWebJobsStorage"]);
    const containerClient = blobServiceClient.getContainerClient("mycontainer");

    // The blob trigger passes the blob contents in as a Buffer.
    const buffer = myBlob;

    // Ask Computer Vision for the objects it detects in the image.
    const result = await client.analyzeImageInStream(buffer, { visualFeatures: ["Objects"] });

    // Write the detected object names back to the blob as metadata (metadata values must be strings).
    const blobName = context.bindingData.name;
    const blobClient = containerClient.getBlockBlobClient(blobName);
    const metadata = { tags: result.objects.map(obj => obj.objectProperty).join(",") };
    await blobClient.setMetadata(metadata);
};

This code creates a ComputerVisionClient for your Cognitive Services account, takes the image data delivered by the blob trigger, and connects to your Blob Storage account (via the function app's AzureWebJobsStorage connection string) so that it can update the blob's metadata. Remember to add the ComputerVisionEndpoint and ComputerVisionKey values to your function app's application settings.

The code then uses the Computer Vision API to analyze the image and extract any objects it detects. It adds these object tags to the image metadata and saves the updated metadata to Blob Storage.

Testing the Image Processing Pipeline

Now that our image processing pipeline is set up, we can test it by uploading an image to our Blob Storage container. The function should automatically trigger and process the image, adding object tags to the metadata.

To view the updated metadata, select the image in the Azure portal and choose "Properties". You should see a list of object tags extracted from the image.

Building a Serverless Web App with Azure Functions and Azure Cosmos DB

Serverless computing has revolutionized the way we build and deploy web applications. With serverless, you can focus on writing code without worrying about managing infrastructure, and pay only for the compute resources you use. In this tutorial, we'll show you how to build a serverless web app with Azure Functions and Azure Cosmos DB that provides scalable and cost-effective data storage and processing.


Prerequisites

Before we get started, you'll need to have the following:

  1. An Azure account
  2. Visual Studio Code
  3. Azure Functions extension for Visual Studio Code
  4. Azure Cosmos DB extension for Visual Studio Code

Creating the Azure Functions App

The first step is to create an Azure Functions app. In Visual Studio Code, select the Azure Functions extension and choose "Create New Project". Follow the prompts to choose your programming language and runtime.

Once your project is created, you can create a new function by selecting the "Create Function" button in the Azure Functions Explorer. Choose the HTTP trigger template to create a function that responds to HTTP requests.

In this example, we'll create a function that retrieves data from Azure Cosmos DB. We'll use the Cosmos DB extension for Visual Studio Code to connect to our database and retrieve data.

Creating the Azure Cosmos DB Account

Next, we'll create an Azure Cosmos DB account to store our data. In the Azure portal, select "Create a resource" and search for "Cosmos DB". Choose "Azure Cosmos DB" and follow the prompts to create a new account.

Once your account is created, select "Add Collection" to create a new container for your data. Choose a partition key and throughput level, and select "Create". You can now add data to your container through the Azure portal or through your Azure Functions app.


Connecting the Azure Functions App to Azure Cosmos DB

To connect your Azure Functions app to Azure Cosmos DB, you'll need to add the Cosmos DB extension to your project. In Visual Studio Code, select the Extensions icon and search for "Azure Cosmos DB". Install the extension and reload Visual Studio Code.

Next, open your function code and add the following code to your function:


const { CosmosClient } = require("@azure/cosmos");

module.exports = async function (context, req) {
    // Read the Cosmos DB endpoint and key from application settings.
    const endpoint = process.env["CosmosDBEndpoint"];
    const key = process.env["CosmosDBKey"];
    const client = new CosmosClient({ endpoint, key });

    // Point at the database and container that hold the data.
    const database = client.database("mydatabase");
    const container = database.container("mycontainer");

    // Query every item in the container.
    const querySpec = {
        query: "SELECT * FROM c"
    };
    const { resources } = await container.items.query(querySpec).fetchAll();

    // Return the results as the HTTP response body.
    context.res = {
        body: resources
    };
};

This code connects to your Azure Cosmos DB account and retrieves all data from the specified container. Replace "mydatabase" and "mycontainer" with your database and container names.

Finally, add your Azure Cosmos DB account endpoint and key to your function's Application Settings. In the Azure Functions Explorer, select your function and choose "Application Settings". Add the following settings:

CosmosDBEndpoint: Your Azure Cosmos DB account endpoint
CosmosDBKey: Your Azure Cosmos DB account key

Conclusion
In this tutorial, we learned how to build a serverless web app with Azure Functions and Azure Cosmos DB. We created an Azure Functions app and a new function that retrieves data from Azure Cosmos DB using the Cosmos DB extension for Visual Studio Code.

We also created an Azure Cosmos DB account and added a new container to store our data. Finally, we connected our Azure Functions app to Azure Cosmos DB by adding the necessary code and application settings. By using Azure Functions and Azure Cosmos DB together, you can build scalable and cost-effective web applications that handle data storage and processing without managing infrastructure.

You can extend this example to include more complex queries, data manipulation, and other functions that respond to HTTP requests or other triggers. 

 If you're new to serverless computing or Azure Functions, be sure to check out the documentation and resources available from Microsoft. With the right tools and knowledge, you can quickly build and deploy serverless web applications that are flexible, scalable, and cost-effective.

Friday, April 28, 2023

Maximizing Azure Functions: Use Cases and Limitations for Effective Serverless Computing

Azure Functions: Use Cases, Limitations, and Best Practices for Serverless Computing

Azure Functions is a powerful serverless compute service provided by Microsoft Azure that enables developers to build and run event-driven applications at scale. This service supports a wide range of use cases, such as real-time data processing, RESTful APIs, event triggers, scheduled tasks, and chatbots, making it an ideal choice for businesses looking to adopt a serverless computing model.

However, it's important to note that there are some limitations and best practices to consider when working with Azure Functions. In this article, we'll discuss some of the common use cases for Azure Functions, as well as the limitations and best practices you should be aware of.

Real-time Data Processing with Azure Functions

Azure Functions is an ideal choice for real-time data processing use cases, such as data validation, enrichment, and transformation. By leveraging Azure Functions, you can process data as it flows into your application, ensuring that it's accurate and up-to-date. Additionally, Azure Functions can integrate with other Azure services, such as Azure Blob Storage, Event Hubs, and IoT Hub, enabling you to process large volumes of data in real-time.

Building RESTful APIs with Azure Functions

Azure Functions can also be used to build RESTful APIs that can be consumed by other applications. This is particularly useful for businesses looking to expose their services to external customers or partners. By using Azure Functions to build APIs, you can reduce development time and costs, as well as improve scalability and reliability.

Event-driven Computing with Azure Functions

Another key use case for Azure Functions is event-driven computing. Azure Functions can be triggered by events in other Azure services, such as Azure Blob Storage, Event Hubs, and IoT Hub. This allows you to respond to events in real-time, such as processing a new file upload to Azure Blob Storage or handling an incoming message from an IoT device.
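As a small sketch, a JavaScript blob-triggered function receives the uploaded blob's contents as a Buffer; this assumes a blob trigger binding named myBlob whose path pattern includes a {name} token.

module.exports = async function (context, myBlob) {
    // Log the name and size of the blob that triggered the function.
    context.log(`Processing blob "${context.bindingData.name}" (${myBlob.length} bytes)`);
};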

Scheduled Tasks with Azure Functions

Azure Functions can also be used to perform scheduled tasks, such as sending email notifications or generating reports. By leveraging Azure Functions for scheduled tasks, you can automate repetitive tasks and free up time for your development team to focus on higher-value tasks.

Chatbots with Azure Functions

Azure Functions can also be used to build chatbots that can interact with users and respond to their queries. By using Azure Functions to build chatbots, you can reduce development time and costs, as well as improve scalability and reliability.

Limitations and Best Practices for Azure Functions

While Azure Functions is a powerful serverless compute service, there are some limitations and best practices to keep in mind. For example, Azure Functions are designed to be short-lived, so they may not be the best choice for long-running tasks or tasks that require a lot of resources. Additionally, Azure Functions are stateless, which means that they don't maintain any state between function invocations. This can be problematic for applications that require complex state management. To overcome these limitations, you may want to consider using Azure Durable Functions or other Azure services such as Azure Virtual Machines or Azure App Service.
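As a minimal sketch of how Durable Functions carries state across steps, here is a JavaScript orchestrator using the durable-functions package; the activity names StepOne and StepTwo are hypothetical placeholders for activity functions you would define separately.

const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    // The orchestrator's local state is preserved between activity calls,
    // unlike a plain stateless function.
    const first = yield context.df.callActivity("StepOne", "input");
    const second = yield context.df.callActivity("StepTwo", first);
    return second;
});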

Conclusion

Azure Functions is a powerful serverless compute service that supports a wide range of use cases, such as real-time data processing, RESTful APIs, event triggers, scheduled tasks, and chatbots. By leveraging Azure Functions, you can reduce development time and costs, as well as improve scalability and reliability. However, it's important to keep in mind the limitations and best practices for Azure Functions to ensure that you're using the service effectively.

Azure Function code to store an Excel file in Blob Storage

This function listens to HTTP POST requests and stores the Excel file in Blob storage under the "excel-files" container with a random GUID as the file name. Note that this function requires the Microsoft.Azure.WebJobs.Extensions.Storage NuGet package. When you make a POST request to this function with an Excel file in the request body, the function will store the file in Blob storage and return an HTTP 200 OK response with the message "Excel file stored successfully". You can then use this file in other Azure Functions or download it from Blob storage using the Azure Storage SDK or Azure portal.


using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using System.IO;
using System.Threading.Tasks;

public static class StoreExcelFunction
{
    [FunctionName("StoreExcel")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
        [Blob("excel-files/{rand-guid}.xlsx", FileAccess.Write)] Stream excelFile,
        ILogger log)
    {
        await req.Body.CopyToAsync(excelFile);
        log.LogInformation("Excel file stored successfully");

        return new OkObjectResult("Excel file stored successfully");
    }
}

Thursday, April 27, 2023

Handwritten Digit Recognition using OpenCV using Python

This code loads a pre-trained CNN model to recognize the digits, captures the video from the webcam, and analyzes each frame in real-time to recognize the digits. The code uses OpenCV to preprocess the images and extract the digits from the video frames. The recognized digits are printed on the video frames and displayed in real-time.

 

import cv2
import numpy as np
from keras.models import load_model

# Load the pre-trained CNN model
model = load_model('model.h5')

# Define the size of the image to be analyzed
IMG_SIZE = 28

# Define the function to preprocess the image
def preprocess_image(img):
    img = cv2.resize(img, (IMG_SIZE, IMG_SIZE))
    # The crops passed in are already grayscale; only convert if the image has 3 channels
    if len(img.shape) == 3:
        img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    img = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)[1]
    img = img.astype('float32') / 255.0
    img = np.reshape(img, (1, IMG_SIZE, IMG_SIZE, 1))
    return img

# Define the function to recognize the digit
def recognize_digit(img):
    img_processed = preprocess_image(img)
    # predict_classes() was removed in recent Keras versions; take the argmax of the
    # predicted class probabilities instead
    digit = np.argmax(model.predict(img_processed), axis=-1)[0]
    return digit

# Capture the video from the webcam
cap = cv2.VideoCapture(0)

while True:
    # Read a frame from the video stream
    ret, frame = cap.read()

    # Convert the frame to grayscale
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Threshold the grayscale image
    ret, thresh = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)

    # Find the contours in the thresholded image
    contours, hierarchy = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Loop through all the contours
    for contour in contours:
        # Find the bounding rectangle of the contour
        x, y, w, h = cv2.boundingRect(contour)

        # Ignore contours that are too small
        if w < 10 or h < 10:
            continue

        # Draw the rectangle around the contour
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

        # Extract the digit from the image
        digit_img = gray[y:y+h, x:x+w]

        # Recognize the digit
        digit = recognize_digit(digit_img)

        # Print the recognized digit on the frame
        cv2.putText(frame, str(digit), (x, y), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)

    # Display the video stream
    cv2.imshow('Handwritten Digit Recognition', frame)

    # Wait for a key press
    key = cv2.waitKey(1)

    # If the 'q' key is pressed, exit the loop
    if key == ord('q'):
        break

# Release the resources
cap.release()
cv2.destroyAllWindows()

 

Motion Detection using OpenCV using python

import cv2

# Set up video capture device
cap = cv2.VideoCapture(0)

# Initialize variables
previous_frame = None

while True:
    # Capture current frame
    ret, current_frame = cap.read()

    # Convert to grayscale
    current_frame_gray = cv2.cvtColor(current_frame, cv2.COLOR_BGR2GRAY)

    # Check if previous frame exists
    if previous_frame is not None:
        # Compute absolute difference between current and previous frame
        frame_diff = cv2.absdiff(current_frame_gray, previous_frame)

        # Apply thresholding to remove noise
        thresh = cv2.threshold(frame_diff, 25, 255, cv2.THRESH_BINARY)[1]

        # Find contours of objects in thresholded image
        contours, hierarchy = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

        # Draw bounding box around each contour
        for contour in contours:
            (x, y, w, h) = cv2.boundingRect(contour)
            cv2.rectangle(current_frame, (x, y), (x + w, y + h), (0, 0, 255), 2)

    # Update previous frame
    previous_frame = current_frame_gray

    # Display current frame
    cv2.imshow("Motion Detection", current_frame)

    # Exit on 'q' key press
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# Release video capture device and destroy all windows
cap.release()
cv2.destroyAllWindows()

In this code, we capture frames from the default video capture device using cv2.VideoCapture(0). We then convert the current frame to grayscale using cv2.cvtColor(), and compute the absolute difference between the current and previous frames using cv2.absdiff(). We apply thresholding to the difference image to remove noise using cv2.threshold(), and find the contours of objects in the thresholded image using cv2.findContours(). Finally, we draw bounding boxes around each contour using cv2.rectangle().

To run this code, save it in a Python file (e.g., motion_detection.py) and run it using the command python motion_detection.py in a terminal or command prompt. Make sure you have OpenCV installed before running the code.

Face Recognition with OpenCV python code

import cv2

# Load the Haar Cascade face detection classifier
face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')

# Load the trained face recognition model
recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.read('trained_model.xml')

# Set the video capture device (0 is usually the default webcam)
cap = cv2.VideoCapture(0)

while True:
    # Read a frame from the video stream
    ret, frame = cap.read()

    # Convert the frame to grayscale
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Detect faces in the grayscale frame
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)

    # Loop through each face detected
    for (x, y, w, h) in faces:
        # Crop the face region from the grayscale frame
        face_gray = gray[y:y+h, x:x+w]

        # Resize the face image to match the training image size
        face_gray = cv2.resize(face_gray, (100, 100))

        # Predict the label (person) of the face using the trained model
        label, confidence = recognizer.predict(face_gray)

        # Draw a rectangle around the face and display the predicted label
        cv2.rectangle(frame, (x, y), (x+w, y+h), (0, 255, 0), 2)
        cv2.putText(frame, f'Person {label} ({confidence:.2f})', (x, y-10), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2)

    # Display the frame
    cv2.imshow('Face Recognition', frame)

    # Exit the loop if 'q' is pressed
    if cv2.waitKey(1) == ord('q'):
        break

# Release the video capture device and close the OpenCV window
cap.release()
cv2.destroyAllWindows()


Note that this code assumes you have already trained a face recognition model and saved it to a file (in this case, trained_model.xml). If you haven't done this yet, you will need to train the model on a dataset of labeled face images before you can use it for recognition. Also note that the cv2.face module is part of the opencv-contrib-python package, which must be installed in addition to the base OpenCV package.






Object Detection with OpenCV-Python code

import cv2

# Load the pre-trained face detection classifier
face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')

# Load the image
img = cv2.imread('test.jpg')

# Convert the image to grayscale
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Detect faces in the grayscale image
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5, minSize=(30, 30))

# Draw rectangles around the detected faces
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x+w, y+h), (0, 255, 0), 2)

# Display the result
cv2.imshow('img', img)
cv2.waitKey(0)
cv2.destroyAllWindows()

In this example, the cv2.CascadeClassifier function is used to load the pre-trained Haar Cascade classifier file for face detection, and detectMultiScale is used to detect faces in the image. The scaleFactor parameter determines how much the image size is reduced at each image scale, the minNeighbors parameter controls how many neighboring detections a candidate rectangle needs before it is retained, and the minSize parameter specifies the minimum size of a face to detect. Finally, cv2.rectangle draws a rectangle around each detected face, and cv2.imshow displays the result.

Wednesday, April 26, 2023

Get column count in MySQL

 SELECT count(*)  FROM information_schema.columns WHERE table_name = 'vmdata'

Get all column names of a table in MySQL, comma separated

For that you can use the following MySQL query:


select group_concat(column_name order by ordinal_position) from information_schema.columns where table_schema = 'vops' and table_name = 'vmdata'

Thursday, April 6, 2023

How to check the final SQL query generated by Entity Framework based on the LINQ expression for MySQL database

If you want to check the final SQL query that Entity Framework generates from a LINQ expression against a MySQL database, you can follow these steps:


1. Connect to your MySQL command line.

2. Run the following command: SET GLOBAL general_log = 'ON';

3. Next, set the log file location: SET GLOBAL general_log_file = 'C://file.log';

4. Execute the method for which you want to check the SQL query.

5. Once you are done, run SET GLOBAL general_log = 'OFF';

You can now check the log file for the SQL queries generated by Entity Framework.


Wednesday, March 29, 2023

Server sent charset unknown to the client.

After installing MySQL 8, I'm trying to connect to a MySQL database from PHP.


<?php
echo "Hello World!";
?>

</br>

<?php
$link = mysqli_connect('localhost', 'root', 'pass12#####');
if (!$link) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
    die('Could not connect: ' . mysqli_error());
}
echo 'Connected successfully';
mysqli_close($link);
?>

</br>

<?php
phpinfo();
?>

</br>

But when I put in username and password I get the error message saying:


Server sent charset unknown to the client. Please, report to the developers

Solution

MySQL 8 changed the default character set to utf8mb4, but some older clients don't recognize it. When the server reports this charset to a client that doesn't understand it, the client throws this error.


Edit my.cnf and add the following settings to specify a character set the client understands.


[client]
default-character-set=utf8

[mysql]
default-character-set=utf8

[mysqld]
collation-server = utf8_unicode_ci
character-set-server = utf8


Wednesday, January 18, 2023

How to find the data directory path in MySQL from command line

 show variables like 'datadir';

MYSQL 5.6 data directory in windows

DB file location in MySQL 5.6 on Windows:

The default data directory location is C:\Program Files\MySQL\MySQL Server 5.6\data, or C:\ProgramData\MySQL on Windows 7 and Windows Server 2008. The C:\ProgramData directory is hidden by default.

Wednesday, January 4, 2023

Create an object of a class in C# dynamically from user string input

You can use the following code to create an object of a class whose name is provided by the user at run time.


Type type = Type.GetType("Employee");

object instance = Activator.CreateInstance(type);


Just for reference, here is the URL of the static Activator class, which is part of the System namespace. Note that Type.GetType expects the namespace-qualified type name (and the assembly-qualified name if the type is defined in another assembly); otherwise it returns null.


https://learn.microsoft.com/en-us/dotnet/api/system.activator?view=net-7.0

Tuesday, January 3, 2023

Python code to import an xlsx file into MySQL and validate each column and row before import

import mysql.connector
import openpyxl

# Open the .xlsx file
wb = openpyxl.load_workbook('data.xlsx')
sheet = wb.active

# Connect to the MySQL database
cnx = mysql.connector.connect(user='user', password='password', host='host', database='database')
cursor = cnx.cursor()

# Validate and import the data
for row in sheet.rows:
    # Validate each cell in the row
    if row[0].value == None:
        print("Error: Missing value in column 1")
    elif row[1].value == None:
        print("Error: Missing value in column 2")
    else:
        # If the data is valid, insert it into the database
        sql = "INSERT INTO table (column1, column2) VALUES (%s, %s)"
        val = (row[0].value, row[1].value)
        cursor.execute(sql, val)

# Commit the changes to the database
cnx.commit()

# Close the connection
cnx.close()

Monday, January 2, 2023

logout from linux ssh from windows powershell

 To exit the Bash session, type "exit" without quotes and then the Enter key.


That should exit the SSH session and get you back to the PS C:\Windows\system32 prompt.


Alternatively, you can just close the PowerShell session by clicking on the X in the upper right corner of the window.

bash: apt-get: command not found CentOS 7

Amazon Linux is CentOS-based, which in turn is Red Hat-based. Red Hat-based installs use yum, not apt-get. Something like yum search httpd should show you the available Apache packages; you likely want yum install httpd24.

how to make sure node is installed or not in window 10 ?

To see if Node is installed, open the Windows Command Prompt, PowerShell, or a similar command-line tool and type node -v. This should print the version number, so you'll see something like v0.10.35.
