Just In Time - Manufacturing Quality Inspection

Project Overview:

A Long Time Ago in a Galaxy Far, Far Away…

STAR WARS OPENING THEME MUSIC

Episode IV: The Quest for Quality

A New Inspection

In the bustling galaxy of manufacturing, precision and efficiency are the ultimate goals. Amid the quest for quality assurance, a team of engineers aboard the Millennium Falcon, equipped with a Raspberry Pi and the power of machine learning, set out to revolutionize quality inspection. Their mission: to integrate AI into a mock Manufacturing Execution System (MES) for detecting defects in real time.

--------------------------------------------------

The Project Overview

This project combines edge computing, machine learning, and web technologies to create a mock MES system capable of detecting defects in manufacturing components using AI inference. Below is the breakdown of the system:

Key Components and Setup
1. Raspberry Pi and Camera Integration:

A Raspberry Pi with a camera module captures high-resolution images of the components to be inspected. Python scripts that invoke libcamera handle image capture, ensuring consistent quality for inference.
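For reference, the capture step can be scripted in a few lines. Below is a minimal sketch that shells out to the libcamera-still command-line tool shipped with Raspberry Pi OS; the output path and resolution are placeholder values, not the project's actual configuration.

```python
# capture.py - minimal sketch of a libcamera-based still capture
# (assumes Raspberry Pi OS with the libcamera apps installed).
import subprocess
from pathlib import Path

CAPTURE_PATH = Path("static/captures/latest.jpg")  # hypothetical output location

def capture_image(width: int = 1920, height: int = 1080) -> Path:
    """Capture a single still with libcamera-still and return the file path."""
    CAPTURE_PATH.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            "libcamera-still",
            "-o", str(CAPTURE_PATH),   # output file
            "--width", str(width),
            "--height", str(height),
            "-n",                      # no preview window
            "-t", "500",               # let the sensor settle for 500 ms
        ],
        check=True,
    )
    return CAPTURE_PATH

if __name__ == "__main__":
    print(f"Saved capture to {capture_image()}")
```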

2. AI Inference with Roboflow:

A machine learning model trained with Roboflow identifies potential defects in the captured images. The Roboflow API is used to send images for inference and to receive predictions in JSON format; these predictions include bounding boxes and confidence scores for detected components.
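A hedged sketch of calling Roboflow's hosted inference endpoint with the requests library is shown below. The model ID, version, and API key are placeholders, and the project's actual model details may differ.

```python
# inference.py - sketch of a call to Roboflow's hosted inference API.
import base64
import requests

ROBOFLOW_API_KEY = "YOUR_API_KEY"                               # placeholder
MODEL_ENDPOINT = "https://detect.roboflow.com/falcon-parts/1"   # hypothetical model/version

def run_inference(image_path: str) -> dict:
    """Send an image to Roboflow and return the prediction JSON."""
    with open(image_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")

    response = requests.post(
        MODEL_ENDPOINT,
        params={"api_key": ROBOFLOW_API_KEY},
        data=encoded,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    response.raise_for_status()
    return response.json()  # contains "predictions" with boxes and confidences

if __name__ == "__main__":
    result = run_inference("static/captures/latest.jpg")
    for pred in result.get("predictions", []):
        print(pred["class"], pred["confidence"], pred["x"], pred["y"])
```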

3. Drawing Predictions

Predictions from Roboflow are drawn directly onto the images using libraries like Pillow. Annotated images provide a visual representation of the AI’s findings, ready to be displayed in the MES system.
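A minimal Pillow sketch of this annotation step might look like the following, assuming the Roboflow convention of center-based x/y coordinates plus width and height.

```python
# annotate.py - sketch of drawing Roboflow predictions with Pillow.
from PIL import Image, ImageDraw

def draw_predictions(image_path: str, predictions: list, out_path: str) -> str:
    """Draw bounding boxes and labels onto the image and save a copy."""
    image = Image.open(image_path).convert("RGB")
    draw = ImageDraw.Draw(image)

    for pred in predictions:
        # Convert center-based coordinates to corner coordinates.
        left = pred["x"] - pred["width"] / 2
        top = pred["y"] - pred["height"] / 2
        right = pred["x"] + pred["width"] / 2
        bottom = pred["y"] + pred["height"] / 2

        label = f'{pred["class"]} {pred["confidence"]:.2f}'
        draw.rectangle([left, top, right, bottom], outline="red", width=3)
        draw.text((left, max(top - 12, 0)), label, fill="red")

    image.save(out_path)
    return out_path
```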

4. Flask Backend

A Flask server facilitates communication between the Raspberry Pi, the AI inference process, and the web interface; a minimal sketch of this wiring follows the list below.

Endpoints: Handle image capture and inference requests, and return annotated images to the frontend.

Static and Templates: Serve JavaScript, CSS, and HTML files to create a seamless user experience.
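Putting the pieces together, a minimal Flask sketch of the capture, inference, and annotation flow could look like the following; the helper functions are the hypothetical sketches from the earlier sections, not a published API.

```python
# app.py - minimal sketch of the Flask backend tying the stages together.
from flask import Flask, jsonify, render_template, url_for

from capture import capture_image          # hypothetical helpers from the
from inference import run_inference        # sketches above
from annotate import draw_predictions

app = Flask(__name__)  # serves ./static and ./templates by default

@app.route("/")
def index():
    return render_template("index.html")

@app.route("/quality-check", methods=["POST"])
def quality_check():
    image_path = capture_image()
    result = run_inference(str(image_path))
    draw_predictions(
        str(image_path),
        result.get("predictions", []),
        "static/captures/annotated.jpg",
    )
    return jsonify({
        "predictions": result.get("predictions", []),
        "annotated_image": url_for("static", filename="captures/annotated.jpg"),
    })

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000, debug=True)
```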

5. Web-Based User Interface

The MES mock interface is designed with:

HTML and CSS: For a clean, responsive layout.

JavaScript: To handle user interactions, trigger inferences, and update the displayed results.

Dynamic Image Display: Displays the annotated image after each inference request.

Challenges and Solutions
1. Cross-Origin Resource Sharing (CORS) Errors

Issue: The browser blocked requests from the web frontend to the Flask server because they were served from different origins.

Solution: Configured Flask with the flask-cors library to allow requests from the frontend.
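The fix amounts to a couple of lines; the allowed origin below is a placeholder for wherever the frontend is served during development.

```python
# Sketch of the flask-cors configuration allowing the frontend to call the API.
from flask import Flask
from flask_cors import CORS

app = Flask(__name__)
# Restrict the allowed origin to the host serving the MES frontend
# (placeholder URL below); avoid a blanket "*" outside of development.
CORS(app, resources={r"/*": {"origins": "http://localhost:8000"}})
```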

2. File Path Errors

Issue: Misaligned static and template file paths for serving resources.

Solution: Verified directory structure and corrected url_for paths in the HTML and Python code.
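For context, Flask resolves static and template files relative to the application by default, so a layout like the one sketched below (folder names are the Flask defaults) lets url_for build correct paths.

```python
# Sketch of the directory layout and explicit folder configuration.
#
#   project/
#   |-- app.py
#   |-- static/      # app.js, style.css, captures/
#   `-- templates/   # index.html
from flask import Flask

app = Flask(__name__, static_folder="static", template_folder="templates")

# In templates, assets are then referenced with url_for, e.g.:
#   <link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
#   <script src="{{ url_for('static', filename='app.js') }}"></script>
```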

3. Real-Time Inference Integration

Issue: Keeping image capture, AI inference, and the frontend display in sync.

Solution: Streamlined Flask endpoints and added robust logging for debugging.
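One simple way to make those stages traceable is to wrap each one in a small timing and logging helper, as in this sketch (the helper and stage names are illustrative, not the project's exact code).

```python
# Sketch of debug logging wrapped around each stage of the quality-check flow.
import logging
import time

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
log = logging.getLogger("quality_check")

def timed(stage: str, func, *args, **kwargs):
    """Run one stage of the pipeline and log how long it took."""
    log.info("Starting %s", stage)
    start = time.perf_counter()
    result = func(*args, **kwargs)
    log.info("%s finished in %.2f s", stage, time.perf_counter() - start)
    return result

# Example usage inside an endpoint (helpers are the hypothetical sketches above):
#   image_path = timed("camera capture", capture_image)
#   result = timed("roboflow inference", run_inference, str(image_path))
```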

Future Improvements

Edge Device Inference: Running AI models directly on the Raspberry Pi to eliminate latency from external API calls.

Enhanced UI/UX: Improving the frontend with more polished visuals and an intuitive dashboard.

Scalability: Adapting the system for real-world manufacturing use cases, including defect categorization and batch processing.

Data Visualization: Adding graphical elements to track inspection statistics and trends over time.

 

May the Flow Be With You.


Mock App Overview

This mock application, built with a lightweight web frontend, demonstrates how an edge device could function on a manufacturing shop floor for quality assurance processes.

Workflow:

  1. Part Installation: The operator installs the part of interest—such as the satellite component of the Millennium Falcon.

  2. Operator Acknowledgment: A prompt appears requesting the operator to confirm that the assembly area is clear of obstructions.

  3. Initiate Quality Check: The operator clicks the “Run Quality Check” button. This action simulates the edge device capturing an image of the assembly and sending it to Roboflow for analysis.

  4. Inference Processing: Roboflow processes the image and generates a JSON file with detection results, identifying the presence or absence of the specified part.

  5. Display Results: Using OpenCV (cv2), bounding boxes and labels from the JSON file are drawn onto the original image (a small sketch follows this list). The annotated image is then displayed on the operator’s screen, providing clear validation of the part’s presence.
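For reference, a minimal OpenCV sketch of that drawing step is shown below, again assuming Roboflow's center-based box format.

```python
# Sketch of the cv2 drawing step from the workflow above.
import cv2

def annotate_with_cv2(image_path: str, predictions: list, out_path: str) -> str:
    """Draw boxes and labels on the captured image and save the result."""
    image = cv2.imread(image_path)
    for pred in predictions:
        # Convert center-based coordinates to corner coordinates.
        x1 = int(pred["x"] - pred["width"] / 2)
        y1 = int(pred["y"] - pred["height"] / 2)
        x2 = int(pred["x"] + pred["width"] / 2)
        y2 = int(pred["y"] + pred["height"] / 2)
        label = f'{pred["class"]} {pred["confidence"]:.2f}'
        cv2.rectangle(image, (x1, y1), (x2, y2), (0, 0, 255), 2)
        cv2.putText(image, label, (x1, max(y1 - 10, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    cv2.imwrite(out_path, image)
    return out_path
```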


This mock app offers a glimpse into how computer vision and machine learning can enhance manufacturing quality assurance through real-time, automated inspections.

Mock Quality Check